The **3n + 1 problem** is deceptively simple. Consider a function f(n) and a sequence a_{i} where:

f(n) = 3n + 1     where n is odd
       n / 2      where n is even

a_{i} = n             where i = 0
        f(a_{i-1})    where i > 0

The **Collatz conjecture** states that a_{i} will become 1 for some i, regardless of what value of n is chosen initially. To this day it remains a conjecture: no proof is known, but no counterexample has been found either.

Verifying that the conjecture is true for any given n is easy: one simply computes the Collatz stopping time, that is, the number of steps i required for the sequence a_{i} to become equal to 1.

Interestingly, a substantial increase in the size of n results in only a modest increase in stopping time on average. For example, the largest stopping time for an integer under 100,000,000 is 949, and the largest stopping time for an integer under 10,000,000,000 is 1132.

Because the standard definition uses only very simple arithmetic operations, it is not difficult to write a small multi-precision arithmetic package that lets one deal with arbitrarily large values of n.

**Here** is one such implementation, along with a small test program.