Hi.
I need to perform a series of divisions with a precision of 10,000 decimal digits (or more). I believe the best way to achieve this is with a linked list, each node holding six decimal digits, so that every node covers six orders of magnitude beyond its predecessor. Thus the result of a division such as 1/9753, whose repeating period is 3250 digits long, can be represented in its entirety in the data structure, a few times over.
Can anyone suggest an algorithm that would work under these conditions? (Or point me in the direction of a similar solution already posted here?)
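To make the question concrete, here is a minimal sketch of the kind of thing I have in mind (Python used purely for illustration; the names `divide`, `digits_string`, and `period_length` are mine). It does schoolbook long division in base 10^6, with a plain list standing in for the linked-list nodes, and detects the period by watching for a repeated remainder:

```python
# Illustrative sketch only: plain Python ints and a list of base-10**6
# "limbs" stand in for the linked-list nodes described above.
BASE = 10**6  # one node = six decimal digits

def divide(numerator, denominator, n_limbs):
    """Schoolbook long division: each step yields one limb (six decimal
    digits) of the fractional part of numerator/denominator."""
    r = numerator % denominator
    limbs = []
    for _ in range(n_limbs):
        r *= BASE
        limbs.append(r // denominator)
        r %= denominator
    return limbs

def digits_string(limbs):
    """Zero-pad each limb to six digits and concatenate."""
    return "".join(f"{limb:06d}" for limb in limbs)

def period_length(denominator):
    """Length of the repeating period of 1/denominator: long division
    repeats as soon as a remainder recurs (0 means it terminates)."""
    seen, r, pos = {}, 1 % denominator, 0
    while r != 0 and r not in seen:
        seen[r] = pos
        r = (r * 10) % denominator
        pos += 1
    return 0 if r == 0 else pos - seen[r]
```

For example, `digits_string(divide(1, 7, 2))` gives the first twelve digits of 1/7, and 10,000 digits would take 1,667 limbs; I'm unsure whether a linked list buys anything over an array of limbs here, which is part of what I'm asking.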
Thanks!
