I have a while loop in my program that will run a very large number of iterations (well over 100,000,000). Sometimes this takes quite a while to process, so I'd like the program to give some sort of estimate of how long the process may take. The number of iterations, as well as the amount of processing required by the block of code inside the loop, depends entirely on choices the user makes before processing begins, so I can't just print a generic estimate. I'm not sure how to approach something like this. Is there a way for the program to estimate the total time using the time it takes to complete a small number of iterations (say 1,000 or so)?