Calculating the Collatz conjecture with Swift

The Collatz conjecture is a mathematical conjecture named after Lothar Collatz. The basic idea is described on Wikipedia as follows:

“Take any natural number n. If n is even, divide it by 2 to get n / 2. If n is odd, multiply it by 3 and add 1 to obtain 3n + 1. Repeat the process (which has been called “Half Or Triple Plus One”, or HOTPO) indefinitely. The conjecture is that no matter what number you start with, you will always eventually reach 1.”
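The rule translates almost directly into Swift. As a quick sketch (the helper name `collatzStep` is our own, not part of the solution below), a single HOTPO step and a naive walk down to 1 look like this:

```swift
// One HOTPO step: halve if even, triple-plus-one if odd
func collatzStep(n: Int) -> Int {
    return n % 2 == 0 ? n / 2 : 3 * n + 1
}

// Walk 6 down to 1: 6 → 3 → 10 → 5 → 16 → 8 → 4 → 2 → 1
var n = 6
var steps = 0
while n != 1 {
    n = collatzStep(n)
    steps += 1
}
// steps is now 8
```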

Fundamentally, then, very simple mathematics. Let’s get started by defining the range of natural numbers to test and creating an array in which to cache sequence lengths, which improves efficiency dramatically.

// Range we test
let FROM = 2
let UNTIL = 999_999

// Save sequence lengths in array for gains in performance
let ARRAY_SIZE = 1_000_000
let stepsArray = Array(count: ARRAY_SIZE, repeatedValue: -1)
stepsArray[1] = 0

Notice the nice underscore syntax Swift allows for improving the readability of large numbers: 1_000_000 instead of 1000000.
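The separator is purely cosmetic: Swift ignores the underscores, and they may appear between any digits, including in hexadecimal and floating-point literals:

```swift
let million = 1_000_000       // identical to 1000000
let mask = 0xFF_FF_FF         // underscores work in hex literals too
let pi = 3.141_592_653        // and in floating-point literals
```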

You might wonder why stepsArray is defined as a constant instead of a variable. The ‘let’ here does not mean that the contents of the array cannot be modified; it means that the array cannot be resized at runtime. Variable Swift arrays seem to be closer to C++ vectors than to C-style arrays.

Apple recommends creating constant arrays for efficiency when resizing is not required:
“It is good practice to create immutable collections in all cases where the collection’s size does not need to change. Doing so enables the Swift compiler to optimize the performance of the collections you create.”
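A quick sketch of what a variable array allows (note that later Swift releases changed this model: a ‘let’ collection became fully immutable, so the element assignments into stepsArray in this article compile only against the beta it was written for):

```swift
var mutable = [1, 2, 3]
mutable[0] = 10        // elements of a var array can change
mutable.append(4)      // and a var array can grow, like a C++ vector
// mutable is now [10, 2, 3, 4]
```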

Next we iterate over the range.

// Loop each number in the range
for number in FROM...UNTIL
{
    var value = number
    var steps = 0

    // If previously calculated, no need to recalculate
    var notFound = stepsArray[value] < 0

    while notFound
    {
        // Actual calculation of the Collatz conjecture
        // If even, divide by 2. If odd, multiply by 3 and add one.
        value = value % 2 == 0 ? value / 2 : value * 3 + 1
        ++steps

    // Consult the cache only while value fits in the array
    // (avoids an index-out-of-range runtime error)
    notFound = value < (ARRAY_SIZE - 1) ? stepsArray[value] < 0 : true
    }

    steps += stepsArray[value]
    stepsArray[number] = steps
}
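To see the caching pay off, consider the number 3 once 2 has already been processed: the inner loop stops as soon as it lands on a value whose sequence length is known, and the cached count is simply added on. A minimal standalone sketch of the same idea (smaller array, with cache[2] assumed precomputed):

```swift
var cache = [Int](count: 100, repeatedValue: -1)
cache[1] = 0
cache[2] = 1   // 2 → 1 takes one step; assume it was computed earlier

var value = 3
var steps = 0
while cache[value] < 0 {
    value = value % 2 == 0 ? value / 2 : value * 3 + 1
    steps += 1
}
// Stops at 2 after 3 → 10 → 5 → 16 → 8 → 4 → 2, i.e. 6 steps
let total = steps + cache[value]   // 6 + 1 = 7 steps for 3
```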

The above should be quite self-explanatory. Now that everything is calculated, we just need to check which number in our range had the longest path to 1 and place it in the longestSequence tuple (starting number and step count to reach 1).

// Now find out which one had the longest sequence
var longestSequence : (Int, Int) = (0, 0)
for number in FROM...UNTIL
{
    if stepsArray[number] > longestSequence.1
    {
        // Set longest sequence tuple to hold the number and sequence length
        longestSequence = (number, stepsArray[number])
    }
}
println("DONE! Value \(longestSequence.0) has longest sequence \(longestSequence.1)")

And done! A follow-up to this article will be a performance comparison of this solution against a C++ implementation.
