Bytes IT Community

Long(er) Factorial

This is my code:

long fact(int n)
{
    if (n == 0)
        return 1;
    if (n > 100)
    {
        printf("\t\tERROR: %d is too large for factorial.\n", n);
        return 1;
    }
    return n * fact(n - 1);
}

But it works only for small numbers (up to 15 or so). Can someone tell
me what to change/add, so I can calculate larger factorials?

Thank You,

Kris

Apr 21 '06 #1
35 Replies


"aNt17017" wrote:
long fact(int n)
{
if (n == 0)
return(1);
if(n > 100)
{
printf("\t\tERROR: %d is too large for factorial.\n", n);
return 1;
}
return(n * fact(n-1));
}

But it works only for small numbers (up to 15 or so). Can someone tell
me what to change/add, so I can calculate larger factorials?


a) Changing the function type from long to unsigned long will give you
one more usable bit; that may be just enough to obtain one more value
from your function.

b) Check your system (compiler docs, header file "stdint.h", etc.) to
see if you have a built-in type offering a greater range than "long"
("long long" may be available)

c) Use a library like GMP (http://www.swox.com/gmp/) instead of the C
built-in data types and operators.

Apr 21 '06 #2

take the log of the number and add, since log(a*b) = log(a) + log(b).

Apr 21 '06 #3


aNt17017 wrote:
This is my code:

long fact(int n)
{
if (n == 0)
return(1);
if(n > 100)
{
printf("\t\tERROR: %d is too large for factorial.\n", n);
return 1;
}
return(n * fact(n-1));
}

But it works only for small numbers (up to 15 or so). Can someone tell
me what to change/add, so I can calculate larger factorials?

Thank You,

Kris


take the log of the numbers: say you want a*b*c; instead of multiplying, just
do log(a) + log(b) + log(c), which equals log(a*b*c). You can hold big
numbers in log scale.

Apr 21 '06 #4

In article <11*********************@v46g2000cwv.googlegroups.com>,
aNt17017 <3v******@gmail.com> wrote:
This is my code:

long fact(int n)
{
    if (n == 0)
        return 1;
    if (n > 100)
    {
        printf("\t\tERROR: %d is too large for factorial.\n", n);
        return 1;
    }
    return n * fact(n - 1);
}

But it works only for small numbers (up to 15 or so). Can someone tell
me what to change/add, so I can calculate larger factorials?


How many bits are in a 'long' in your implementation?

12 factorial is the largest that will fit in a signed 32 bit long.
13! -- needs 34 bits of signed (2's complement) long
14! -- 38 bits
15! -- 42 bits
16! -- 46 bits

19! -- 58 bits
20! -- 63 bits
21! -- 67 bits

100! -- requires at least a 526 bit signed 2's complement long
(158 decimal digits)

If you want to be able to calculate such large numbers, you will
have to write (or find) an extended precision library.
--
"law -- it's a commodity"
-- Andrew Ryan (The Globe and Mail, 2005/11/26)
Apr 21 '06 #5

ro******@ibd.nrc-cnrc.gc.ca (Walter Roberson) writes:
In article <11*********************@v46g2000cwv.googlegroups.com>,
aNt17017 <3v******@gmail.com> wrote:
This is my code:
long fact(int n)
{
if (n == 0)
return(1);
if(n > 100)
{
printf("\t\tERROR: %d is too large for factorial.\n", n);
return 1;
}
return(n * fact(n-1));
}

But it works only for small numbers (up to 15 or so). Can someone tell
me what to change/add, so I can calculate larger factorials?


How many bits are in a 'long' in your implementation?

<SNIP>
100! -- requires at least a 526 bit signed 2's complement long
(158 decimal digits)

If you want to be able to calculate such large numbers, you will
have to write (or find) an extended precision library.


Or he could perhaps change the data types to double.

--

John Devereux
Apr 21 '06 #6

In my implementation, 'long' is 32 bits.
I tried to use 'long long' or 'double' types and it doesn't work.

Apr 21 '06 #7

This is my complete setup:
1 - get data from user
2 - calculate factorial

and it still doesn't work as I wish. I don't want to use other
libraries because it should be based on a "simple" recursive formula.

void factorial(void)
{
    clearScreen();
    printf("\n\t\tCalculate the Factorial of entered numbers.\n");
    printf("\t\t===========================================\n");

    int num;
    printf("\t\tPlease enter a value in the range of 0 to 100 (q to quit): ");
    while (scanf("%d", &num) == 1)
    {
        if (num <= 0)
            printf("\t\tPlease enter a value greater than 0.\n");
        else if (num > 100)
            printf("\t\tPlease keep the value under 100.\n");
        else
        {
            printf("\t\t%d factorial = %ld\n", num, fact(num));
        }
        printf("\t\tPlease enter a value in the range of 0 to 100 (q to quit): ");
    }
    pressEnter();
}

long long fact(int n)
{
    if (n == 0)
        return 1;
    if (n > 100)
    {
        printf("\t\tERROR: %d is too large for factorial.\n", n);
        return 1;
    }
    return n * fact(n - 1);
}

Apr 21 '06 #8

"aNt17017" <3v******@gmail.com> wrote:

# But it works only for small numbers (up to 15 or so). Can someone tell
# me what to change/add, so I can calculate larger factorials?

Factorial is a restriction of the gamma function to integers:
n! = gamma(n+1).
The gamma function should be in the math library; it will efficiently
compute good approximations to factorials of doubles far outside the
range of long long ints.

--
SM Ryan http://www.rawbw.com/~wyrmwif/
God's a skeeball fanatic.
Apr 21 '06 #9

"aNt17017" <3v******@gmail.com> writes:
in my implementation 'long' = 32 bits
I tried to use 'long long' or 'double' types and it doesn't work.


You need to provide some context when you post a followup.
Read <http://cfaj.freeshell.org/google/>.

Saying "it doesn't work" really doesn't tell us anything useful. What
exactly did you try, and what happened?

Note that using double will give you imprecise results; double can
represent very large numbers, but it pays for that by not being able
to distinguish between very large numbers that differ by a small
amount.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Apr 21 '06 #10

John Devereux wrote:

ro******@ibd.nrc-cnrc.gc.ca (Walter Roberson) writes:
In article <11*********************@v46g2000cwv.googlegroups.com>,
aNt17017 <3v******@gmail.com> wrote:
This is my code:

long fact(int n)
[...]
But it works only for small numbers (up to 15 or so). Can someone tell
me what to change/add, so I can calculate larger factorials?


How many bits are in a 'long' in your implementation?

<SNIP>
100! -- requires at least a 526 bit signed 2's complement long
(158 decimal digits)

If you want to be able to calculate such large numbers, you will
have to write (or find) an extended precision library.


Or he could perhaps change the data types to double.


But only if an approximation of the factorial is permissible.

--
+-------------------------+--------------------+-----------------------------+
| Kenneth J. Brody | www.hvcomputer.com | |
| kenbrody/at\spamcop.net | www.fptech.com | #include <std_disclaimer.h> |
+-------------------------+--------------------+-----------------------------+
Don't e-mail me at: <mailto:Th*************@gmail.com>

Apr 21 '06 #11

"aNt17017" <3v******@gmail.com> writes:
This is my complete setup:
1 - get data from user
2 - calculate factorial

and it still doesn't work as I wish. I don't want to use other
libraries because it should be based on a "simple" recursive formula.
void factorial(void)
{
clearScreen();
printf("\n\t\tCalculate the Factorial of entered numbers.\n");
printf("\t\t===========================================\n");

int num;
printf("\t\tPlease enter a value in the range of 0 to 100(q to
quit): ");
while (scanf("%d",&num) == 1)
{
if(num <= 0)
printf("\t\tPlease enter a value greater than 0.\n");
else if (num > 100)
printf("\t\tPlease keep the value under 100.\n");
else
{
printf("\t\t%d factorial = %ld\n",num, fact(num));
}
printf("\t\tPlease enter a value in the range of 0 to 100(q to
quit): ");
}
pressEnter();
}
long long fact(int n)
{
if (n == 0)
return(1);
if(n > 100)
{
printf("\t\tERROR: %d is too large for factorial.\n", n);
return 1;
}
return(n * fact(n-1));
}


Again, read <http://cfaj.freeshell.org/google/>.

This is not a complete program. You don't have a main() function, and
you don't have the "#include <stdio.h>" that's required for printf and
scanf.

You say it doesn't work as you wish. What does it do? Or are we
supposed to guess?

What are clearScreen() and pressEnter()? It's obvious from their
names what they're supposed to do, but why on Earth do you want to
clear the screen before asking for input? If I'm running your
program, I might have important information on my screen.

In factorial(), you declare "int num;" after several call statements.
In C90, all declarations must precede all statements in a block. (C99
allows them to be mixed, as does C++, and some compilers allow it as
an extension, but there's no real benefit in using the feature here.)

You call fact() from factorial(). At the point of the call, the
declaration of the fact() function hasn't been seen yet. In C90, this
causes the compiler to assume that fact() returns an int; since it
actually returns long long, the result is undefined behavior. If
you're lucky, the program will crash; if you're unlucky, you may get
seemingly valid output in some cases. The simplest way to fix this is
to move the definition of fact() above the definition of factorial().
You can also provide separate declarations at the top of the program,
and give the definitions in any order you like (make sure the
declarations match the definitions).

fact() returns long long int (using unsigned long long int would give
you one additional bit), but you print its result using printf's "%ld"
format, which expects a long int. The format for printing a long long
int is "%lld" (or "%llu" for unsigned long long).

If you fix these problems, your program should be able to handle
values up to fact(20), or 2432902008176640000, which fits in 62 bits.
fact(21), or 51090942171709440000, requires 66 bits.

Some interesting numbers:

2432902008176640000 fact(20)
9223372036854775807 2**63-1
18446744073709551615 2**64-1
51090942171709440000 fact(21)

2**63-1 is likely to be the maximum value of a long long; 2**64-1 is
likely to be the maximum value of an unsigned long long.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Apr 21 '06 #12

aNt17017 wrote:
This is my code:

long fact(int n)
{
if (n == 0)
return(1);
if(n > 100)
{
printf("\t\tERROR: %d is too large for factorial.\n", n);
return 1;
}
return(n * fact(n-1));
}

But it works only for small numbers (up to 15 or so). Can someone tell
me what to change/add, so I can calculate larger factorials?

Thank You,

Kris


You can use the lcc-win32 compiler:
#include <stdio.h>
#include <bignums.h>
pBignum factorial(int n)
{
    pBignum b = 1, r = 1;

    while (n) {
        r = b*r;
        b++;
        n--;
    }
    return r;
}

int main(void)
{
    pBignum b = factorial(100);
    char buffer[4096];

    quadtoa(b, buffer);
    printf("%s\n", buffer);
}

Output:
93326215443944152681699238856266700490715968264381621468592963895217599993229915608941463976156518286253697920827223758251185210916864000000000000000000000000

lcc-win32:
http://www.cs.virginia.edu/~lcc-win32
Apr 21 '06 #13

jacob navia <ja***@jacob.remcomp.fr> writes:
aNt17017 wrote:
This is my code:
long fact(int n)
{
if (n == 0)
return(1);
if(n > 100)
{
printf("\t\tERROR: %d is too large for factorial.\n", n);
return 1;
}
return(n * fact(n-1));
}
But it works only for small numbers (up to 15 or so). Can someone
tell
me what to change/add, so I can calculate larger factorials?
Thank You,
Kris


You can use the lcc-win32 compiler:

[snip]

Only if he happens to be using a win32 system, and only if he doesn't
mind his code being portable only to win32 systems.

This is, of course, off-topic.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Apr 21 '06 #14

questions? wrote:
aNt17017 wrote:
This is my code:

long fact(int n)
{
if (n == 0)
return(1);
if(n > 100)
{
printf("\t\tERROR: %d is too large for factorial.\n", n);
return 1;
}
return(n * fact(n-1));
}

But it works only for small numbers (up to 15 or so). Can someone tell
me what to change/add, so I can calculate larger factorials?

Thank You,

Kris


take the log of the numbers: say you want a*b*c; instead of multiplying, just
do log(a) + log(b) + log(c), which equals log(a*b*c). You can hold big
numbers in log scale.


True, but you need to use a double and hope that you aren't losing
precision; holding logs requires enough space to hold all the decimal
places.
So, 8 can fit into a signed char, which has 7 usable value bits, but
log10(8) is 0.903089987 and will require a float or a double (and on
some systems the arithmetic is even done in 80-bit extended precision!).
Apr 21 '06 #15

aNt17017 wrote:

This is my code:

long fact(int n)
{
if (n == 0)
return(1);
if(n > 100)
{
printf("\t\tERROR: %d is too large for factorial.\n", n);
return 1;
}
return(n * fact(n-1));
}

But it works only for small numbers (up to 15 or so). Can someone
tell me what to change/add, so I can calculate larger factorials?


Here is one way:

/* compute factorials, extended range
on a 32 bit machine this can reach fact(15) without
unusual output formats. With the prime table shown
overflow occurs at 101.

Public domain, by C.B. Falconer. 2003-06-22
*/

#include <stdio.h>
#include <stdlib.h>
#include <limits.h>

/* 2 and 5 are handled separately
Placing 2 at the end attempts to preserve such factors
for use with the 5 factor and exponential notation
*/
static unsigned char primes[] = {3,7,11,13,17,19,23,29,31,37,
41,43,47,53,59,61,67,71,
/* add further primes here -->*/
2,5,0};
static unsigned int primect[sizeof primes]; /* = {0} */

static double fltfact = 1.0;

static
unsigned long int fact(unsigned int n, unsigned int *zeroes)
{
unsigned long val;
unsigned int i, j, k;

#define OFLOW ((ULONG_MAX / j) < val)

/* This is a crude mechanism for passing back values */
for (i = 0; i < sizeof primes; i++) primect[i] = 0;

for (i = 1, val = 1UL, *zeroes = 0; i <= n; i++) {
fltfact *= i; /* approximation */
j = i;
/* extract exponent of 10 */
while ((0 == (j % 5)) && (!(val & 1))) {
j /= 5; val /= 2;
(*zeroes)++;
}
/* Now try to avoid any overflows */
k = 0;
while (primes[k] && OFLOW) {
/* remove factors primes[k] */
while (0 == (val % primes[k]) && OFLOW) {
val /= primes[k];
++primect[k];
}
while (0 == (j % primes[k]) && OFLOW) {
j /= primes[k];
++primect[k];
}
k++;
}

/* Did we succeed in the avoidance */
if (OFLOW) {
#if DEBUG
fprintf(stderr, "Overflow at %u, %lue%u * %u\n",
i, val, *zeroes, j);
#endif
val = 0;
break;
}
val *= j;
}
return val;
} /* fact */

int main(int argc, char *argv[])
{
unsigned int x, zeroes;
unsigned long f;

if ((2 == argc) && (1 == sscanf(argv[1], "%u", &x))) {
if (!(f = fact(x, &zeroes))) {
fputs("Overflow\n", stderr);
return EXIT_FAILURE;
}

printf("Factorial(%u) == %lu", x, f);
if (zeroes) printf("e%u", zeroes);
for (x = 0; primes[x]; x++) {
if (primect[x]) {
printf(" * pow(%d,%d)", primes[x], primect[x]);
}
}
putchar('\n');
printf("or approximately %.0f.\n", fltfact);
return 0;
}
fputs("Usage: fact n\n", stderr);
return EXIT_FAILURE;
} /* main */

--
"If you want to post a followup via groups.google.com, don't use
the broken "Reply" link at the bottom of the article. Click on
"show options" at the top of the article, then click on the
"Reply" at the bottom of the article headers." - Keith Thompson
More details at: <http://cfaj.freeshell.org/google/>
Also see <http://www.safalra.com/special/googlegroupsreply/>
Apr 22 '06 #16

[snips]

On Fri, 21 Apr 2006 19:23:18 +0000, Keith Thompson wrote:
What are clearScreen() and pressEnter()? It's obvious from their
names what they're supposed to do, but why on Earth do you want to
clear the screen before asking for input? If I'm running your
program, I might have important information on my screen.


God, I hate that asinine argument. If your information is so damned
important, then a) why are you storing it on a screen, instead of backed
up to a DVD, tape, or even simply a disk file, and b) why are you running
such wildly unknown and unpredictable applications *right in the middle of
your critical data*?

I very rarely have problems with my backups containing the correct data...
because I very rarely dump the results of /dev/random into the files I
create as part of the backup. See, that data is important, so I simply do
not treat it as a scratch pad. End of problem.

If the data's important, but the user treats the data store like a scratch
pad, blame the user, not the applications, when the data goes away.

Run it on a different machine. In a different terminal. Whatever turns
your crank. Running it in the same place your "important information"
resides, when you don't know the results of running it, is just stupid.
Apr 25 '06 #17

Kelsey Bjarnason <kb********@gmail.com> writes:
[snips]
On Fri, 21 Apr 2006 19:23:18 +0000, Keith Thompson wrote:
What are clearScreen() and pressEnter()? It's obvious from their
names what they're supposed to do, but why on Earth do you want to
clear the screen before asking for input? If I'm running your
program, I might have important information on my screen.


God, I hate that asinine argument. If your information is so damned
important, then a) why are you storing it on a screen, instead of backed
up to a DVD, tape, or even simply a disk file, and b) why are you running
such wildly unknown and unpredictable applications *right in the middle of
your critical data*?

[...]

I think we're both guilty of overstatement.

Ok, so my screen is a scratchpad, and I shouldn't depend on any
information on it sticking around for long. But it's *my* scratchpad,
and if a program clears it for no good reason, I'm not going to be
pleased.

If your program needs to take control of the entire screen (say, it's
a text editor), on many systems it can use a system-specific library
that allows it to save and restore the current screen contents.

What's on my screen is probably the output of the last program (or
programs) I ran. Your own program's output isn't so important that
you need to erase *my* (possibly unimportant) data.

Unless you can think of a good reason why any program should need to
clear the screen before printing "Please enter a number"?

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Apr 25 '06 #18

Kelsey Bjarnason <kb********@gmail.com> wrote:
On Fri, 21 Apr 2006 19:23:18 +0000, Keith Thompson wrote:
What are clearScreen() and pressEnter()? It's obvious from their
names what they're supposed to do, but why on Earth do you want to
clear the screen before asking for input? If I'm running your
program, I might have important information on my screen.


God, I hate that asinine argument. If your information is so damned
important, then a) why are you storing it on a screen, instead of backed
up to a DVD, tape, or even simply a disk file,


Who's talking about storing? Suppose I've just run a command - it need
be no more complicated than ls - and now I want to enter some
information from the result of that command into the next program. Very
useful, at times. If that next command just clobbered over my directory
listing for no good reason, all filenames that I wanted to enter are now
gone.

Richard
Apr 25 '06 #19

On Tue, 25 Apr 2006 11:25:55 +0000, Richard Bos wrote:
Kelsey Bjarnason <kb********@gmail.com> wrote:
On Fri, 21 Apr 2006 19:23:18 +0000, Keith Thompson wrote:
> What are clearScreen() and pressEnter()? It's obvious from their
> names what they're supposed to do, but why on Earth do you want to
> clear the screen before asking for input? If I'm running your
> program, I might have important information on my screen.
God, I hate that asinine argument. If your information is so damned
important, then a) why are you storing it on a screen, instead of backed
up to a DVD, tape, or even simply a disk file,


Who's talking about storing?


He is. That's where his important data is: where he stored it. If he
doesn't want it messed with, he has two choices: store it somewhere else,
or don't run unpredictable things in the same place it's being stored.

There are _no_ other options if the data is important.
Suppose I've just run a command - it need
be no more complicated than ls - and now I want to enter some
information from the result of that command into the next program. Very
useful, at times. If that next command just clobbered over my directory
listing for no good reason, all filenames that I wanted to enter are now
gone.


So you just ran a program with unpredictable outputs all over your data
store, and you're complaining that your data store is gone. Do you
regularly dump /dev/random overtop of your backups? No? Why not? Could it
have something to do with it being a *stupid* idea to dump unpredictable
data overtop of data you want to keep?

Of course you'd never do that. It would be stupid. But somehow it's
magically not stupid when the data's on the screen.

Nope, sorry, it's stupid in both cases. If the data's important, you do
*not* run apps with unpredictable outputs in the middle of your store.
Apr 25 '06 #20

[snips]

On Tue, 25 Apr 2006 09:21:20 +0000, Keith Thompson wrote:
Ok, so my screen is a scratchpad, and I shouldn't depend on any
information on it sticking around for long. But it's *my* scratchpad,
and if a program clears it for no good reason, I'm not going to be
pleased.
I might not be, either... but I'm not going to be overly annoyed, either,
for the simple reason that it's a screen, not a permanent or even
semi-permanent record, so if it gets zapped, well, it was a scratch in the
first place, what the heck did I expect, especially when running an app
with unknown outputs smack bang in the middle of the scratch area?

If I did that and it, say, dumped crap into my kernel files, I'd be
_really_ annoyed; those are _not_ scratch data.

If your program needs to take control of the entire screen (say, it's a
text editor), on many systems it can use a system-specific library that
allows it to save and restore the current screen contents.
Indeed, in the ideal world, if you need to "clear the screen", do it in a
"window", even where that window is, itself, the full screen - then dump
the original contents back.

I just find the notion of getting upset that an app clears a scratch area
about as silly as complaining that it creates temp files in the temp file
dir. It's a transitory medium, and a lot of apps do unpredictable things
with it - not just clearing screens; could be simply dumping volumes of
data out fast enough to flush the scrollback buffer before you can kill
it. Didn't clear the screen, but the data's still gone. Relying on the
screen as a storage medium, then complaining when the storage gets nuked,
is just damnfoolishness.
What's on my screen is probably the output of the last program (or
programs) I ran. Your own program's output isn't so important that you
need to erase *my* (possibly unimportant) data.
My program may or may not be that important - that's for you to decide.
And you did decide it was that important, by running it in the middle of
your storage area. In fact, in doing so, you demonstrated that your
on-screen data has absolutely _zero_ value. Why? Simple: the app you're
complaining about is clearing the screen. If you _knew_ it was going to
do this, you'd have run it in another console, or saved your existing data
somewhere, etc.

Instead, the clear screen took you by surprise. This means you have no
idea what the output of the application is, whether it clears the screen,
locks up the computer, floods the scrollback buffers, etc. Doesn't matter
the actual details, the fact that it's taking you by surprise means you
simply _do not know_. If you don't know what the output is, but you're
running it in the middle of your data store, then you, yourself, are, by
that very action, defining the value of your data in the store. The value
is zero. If it were anything else, you wouldn't be running unknown
applications in the middle of it.
Unless you can think of a good reason why any program should need to
clear the screen before printing "Please enter a number"?


In the case of some silly-ass little app that does nothing more than
asking for a number, no, there's not much point. That doesn't mean that
there aren't apps where the author feels there _is_ a point in clearing
the screen. Or dumping enormous quantities of data. Or whatever.

I understand your key concept, that apps should be "well behaved", and I,
at least, generally try to achieve that end myself. On the other hand,
having been bitten once or twice by exactly this sort of thing - buffer
floods, screen clears, etc - I simply ask myself if what I've already got
matters. If it doesn't, run the app. If it does, and I run the app and
lose the data, I've got nobody to blame but myself.
Apr 25 '06 #21

Kelsey Bjarnason <kb********@gmail.com> wrote:
On Tue, 25 Apr 2006 11:25:55 +0000, Richard Bos wrote:
Kelsey Bjarnason <kb********@gmail.com> wrote:
On Fri, 21 Apr 2006 19:23:18 +0000, Keith Thompson wrote:

> What are clearScreen() and pressEnter()? It's obvious from their
> names what they're supposed to do, but why on Earth do you want to
> clear the screen before asking for input? If I'm running your
> program, I might have important information on my screen.

God, I hate that asinine argument. If your information is so damned
important, then a) why are you storing it on a screen, instead of backed
up to a DVD, tape, or even simply a disk file,
Who's talking about storing?


He is. That's where his important data is: where he stored it.


No, he didn't. He generated it for immediate use. If you want to call
that "storing", you may; by the same token, I may call you a hideous
baboon, but that doesn't mean you are one.
If he doesn't want it messed with, he has two choices: store it somewhere else,
or don't run unpredictable things in the same place it's being stored.


And that's the very point. If Turbo-C wannabe programmers weren't so
orgasmic about clearing the screen, their programs would not predictably
clobber over useful data we've just generated.

Richard
Apr 25 '06 #22

In article <pa****************************@gmail.com>,
Kelsey Bjarnason <kb********@gmail.com> wrote:
I just find the notion of getting upset that an app clears a scratch area
about as silly as complaining that it creates temp files in the temp file
dir.


[The below is inherently OT, as "temp file dir" is not a C concept.]

There are several reasons to grumble about temp files in the temp file
dir. Some of them include:

- Many programs assume that the temp file dir is /tmp which is not the
case on all systems (not even all unix-like systems)

- Many programs do not use the TEMP environment variable to determine
where temporary files should go (a common unix-ism)

- It is not uncommon for programs to use invariant filenames for
their temporary files; if those files go into a common temporary directory
then the clash of filenames interferes with multiple copies of the
program running simultaneously

- The C standard routines tmpfile() and tmpnam() do not define where
the files will live, and there is no standard way to ask; this can
lead to difficulties with routines that try to avoid resource exhaustion

- When files are put into common temporary directories, a number of
security and race conditions arise, that are at least much -reduced-
if the files do not go into a common directory

- On heavily-used multiuser systems, such as university systems,
it is not uncommon for the standard temporary directories to be
on filesystems of strictly limited size, to avoid having temporary
files use too much space. On such systems, temporary files should
go into one of the user's directories (and thus subject to the user's
quotas) or some other authorized directory [e.g., staff might have
access to a temporary directory that students do not.] Thus users
need to be able to control where temporary files go instead of
having them automatically go into "the temp file dir".

- On heavily-used multiuser systems, it might be impractical to
have "the temp file dir" be on a filesystem large enough to accommodate
the sum of all the reasonable requests, whereas even having different
temporary directories for different sets of users might make the
resource allocation practical.
None of these reasons had to do with the transient nature of temporary
files, but they are all reasons against programs putting files
in "the temp file dir" ["the" implies there is only one such directory]
without attempting to discern where the user would like the files
to be placed.
--
"No one has the right to destroy another person's belief by
demanding empirical evidence." -- Ann Landers
Apr 25 '06 #23

Kelsey Bjarnason <kb********@gmail.com> writes:
[snips]

On Tue, 25 Apr 2006 09:21:20 +0000, Keith Thompson wrote:
Ok, so my screen is a scratchpad, and I shouldn't depend on any
information on it sticking around for long. But it's *my* scratchpad,
and if a program clears it for no good reason, I'm not going to be
pleased.

[...]
What's on my screen is probably the output of the last program (or
programs) I ran. Your own program's output isn't so important that you
need to erase *my* (possibly unimportant) data.


My program may or may not be that important - that's for you to decide.
And you did decide it was that important, by running it in the middle of
your storage area. In fact, in doing so, you demonstrated that your
on-screen data has absolutely _zero_ value. Why? Simple: the app you're
complaining about is clearing the screen. If you _knew_ it was going to
do this, you'd have run it in another console, or saved your existing data
somewhere, etc.


How do you conclude that my on-screen data has zero value? Yes, if
it's at all important I should have stored it somewhere, or put it in
a separate window, or otherwise been more careful to avoid clearing
it. But if all my on-screen data had zero value, I'd run my shell in
a one-line terminal window (and I'd have room for a lot more windows
on my screen).
Instead, the clear screen took you by surprise. This means you have no
idea what the output of the application is, whether it clears the screen,
locks up the computer, floods the scrollback buffers, etc. Doesn't matter
the actual details, the fact that it's taking you by surprise means you
simply _do not know_. If you don't know what the output is, but you're
running it in the middle of your data store, then you, yourself, are, by
that very action, defining the value of your data in the store. The value
is zero. If it were anything else, you wouldn't be running unknown
applications in the middle of it.
Nonsense. By running an unknown program, I'm accepting, for the sake
of convenience, the risk that it might clear my screen. If the author
of the program was stupid enough to clear the screen for no good
reason, I'm going to be annoyed -- and I probably either won't run
that program again, or I'll modify it so it doesn't clear my screen.

Sure, it's my fault for trusting it. That doesn't excuse the author.
I understand your key concept, that apps should be "well behaved", and I,
at least, generally try to achieve that end myself.
Which is really all I'm trying to say.
On the other hand,
having been bitten once or twice by exactly this sort of thing - buffer
floods, screen clears, etc - I simply ask myself if what I've already got
matters. If it doesn't, run the app. If it does, and I run the app and
lose the data, I've got nobody to blame but myself.


I've been bitten by such things too. Since the information on my
current screen is generally not critical, it's an acceptable risk.
Since that information has *some* value, it's still annoying when it
happens.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Apr 25 '06 #24

# How do you conclude that my on-screen data has zero value? Yes, if
# it's at all important I should have stored it somewhere, or put it in

My program can conclude your disk drive has zero value and
erase it--if the program so documents its function.

--
SM Ryan http://www.rawbw.com/~wyrmwif/
What kind of convenience store do you run here?
Apr 25 '06 #25

> > How do you conclude that my on-screen data has zero value? Yes, if
it's at all important I should have stored it somewhere, or put it in


My program can conclude your disk drive has zero value and
erase it--if the program so documents its function.


(Attribution deliberately snipped.)

Certainly. If a program is intended to erase a disk drive, that's
exactly what it should do. If a program is intended to clear the
screen, such as the "clear" command in Unix or "cls" in Windows, then
it should do so.

My objection is to programs that clear the screen with no good reason.
I'm getting tired of people pretending that I've said anything more
than that.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Apr 25 '06 #26

[snips]

On Tue, 25 Apr 2006 14:34:34 +0000, Richard Bos wrote:
No, he didn't. He generated it for immediate use.


Then if it goes away, it doesn't matter, so the whole discussion is moot.
It is precisely because he doesn't *want* it to go away - he's storing it
- that the issue arises.
If he doesn't want it messed with, he has two choices: store it somewhere
else, or don't run unpredictable things in the same place it's being
stored.


And that's the very point. If Turbo-C wannabe programmers weren't so
orgasmic about clearing the screen, their programs would not predictably
clobber over useful data we've just generated.


No, that's not the point. The point is that the user who runs
unpredictable programs in the middle of his important data has nobody but
himself to blame when the data goes away. Or are you the sort who
regularly runs random number generators over all the company financial
files and then complains when the results aren't valid data? No? Why
not? Oh, right, because running unpredictable programs in the middle of
your important data is stupid. Right, good. We're in perfect agreement.
Apr 26 '06 #27

[snips]

On Tue, 25 Apr 2006 18:18:10 +0000, Keith Thompson wrote:
How do you conclude that my on-screen data has zero value?
As I explained; you defined that. By running unknown, unpredictable
programs in the middle of it. Since those programs may well nuke the
data, the fact you choose to run them there means that you, yourself, have
set the value of the data, and the value is zero. If it were non-zero, you
wouldn't be running unpredictable, unknown apps in the middle of it, or
you'd have saved it elsewhere.
Nonsense. By running an unknown program, I'm accepting, for the sake of
convenience, the risk that it might clear my screen.
Or flood the scrollback buffers, or lock the machine, etc, etc, etc, not
a single one of which is in the remotest degree consistent with keeping
the "important data" untouched.
If the author of
the program was stupid enough to clear the screen for no good reason,
I'm going to be annoyed


But a lockup, that's okay, flooding the buffers, that's okay, closing the
terminal, that's okay. Only clearing the screen isn't?

Rubbish. *Any* result which causes the loss of the data will have the
same, possibly even worse, annoyance factor. You know that, but guess
what? You run the program anyways. Obviously, you don't care about the
data, or you wouldn't be running the app in the middle of it.

You simply cannot have it both ways. Either the data matters, in which
case you *treat* it like it matters - not running unknown, unpredictable
apps in the middle of it - or the data doesn't matter. The very fact of
running such apps in the middle of it demonstrates that you don't care
about the data - thus the argument offered, thus far, at least against
screen clears, is complete tripe.

Why is it so many folks have this weird bug up their butt about screen
clears? You say "clear the screen", it's like pushing a magic button.
There's about 973 other ways to lose the data, all of which are equally
unacceptable, and every single one of which - including via screen clears
- can be trivially avoided, usually by *not* doing something so silly as
running unknown, unpredictable, untrusted apps in the middle of one's
important data stores.

The lesson to be learned here is not "don't clear the screen", but,
rather, "don't run untrusted apps in the middle of your important data".
If there's an argument against clearing the screen, I'd be happy to hear
it... but this isn't it. It focuses on one trivial detail to the
exclusion of the general premise - missing the forest for a single tree -
then blames the tree. It's patently absurd.
Apr 26 '06 #28

Kelsey Bjarnason <kb********@gmail.com> writes:
[snip]
The lesson to be learned here is not "don't clear the screen", but,
rather, "don't run untrusted apps in the middle of your important data".
If there's an argument against clearing the screen, I'd be happy to hear
it... but this isn't it. It focuses on one trivial detail to the
exclusion of the general premise - missing the forest for a single tree -
then blames the tree. It's patently absurd.


Sure, there are many other ways a program can screw things up.
Unnecessarily clearing the screen is just one of them, and not even
the most important.

I mentioned it because it came up in the context of a program that
unnecessarily cleared the screen.

I'm done with this. Bye.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Apr 26 '06 #29

Kelsey Bjarnason <kb********@gmail.com> wrote:
On Tue, 25 Apr 2006 14:34:34 +0000, Richard Bos wrote:
No, he didn't. He generated it for immediate use.


Then if it goes away, it doesn't matter, so the whole discussion is moot.


Well, that just goes to show: never trust a programmer who isn't a user
as well.

Richard
Apr 26 '06 #30

Kelsey Bjarnason wrote:
.... snip ...
The lesson to be learned here is not "don't clear the screen", but,
rather, "don't run untrusted apps in the middle of your important
data". If there's an argument against clearing the screen, I'd be
happy to hear it... but this isn't it. It focuses on one trivial
detail to the exclusion of the general premise - missing the forest
for a single tree - then blames the tree. It's patently absurd.


No, the lesson to be learned is "Do not unnecessarily annoy the
customer".

--
"If you want to post a followup via groups.google.com, don't use
the broken "Reply" link at the bottom of the article. Click on
"show options" at the top of the article, then click on the
"Reply" at the bottom of the article headers." - Keith Thompson
More details at: <http://cfaj.freeshell.org/google/>
Also see <http://www.safalra.com/special/googlegroupsreply/>
Apr 26 '06 #31

Kelsey Bjarnason wrote:
[snips]

On Tue, 25 Apr 2006 18:18:10 +0000, Keith Thompson wrote:
How do you conclude that my on-screen data has zero value?


As I explained; you defined that. By running unknown, unpredictable
programs in the middle of it. Since those programs may well nuke the
data, the fact you choose to run them there means that you, yourself, have
set the value of the data, and the value is zero. If it were non-zero, you
wouldn't be running unpredictable, unknown apps in the middle of it, or
you'd have saved it elsewhere.
Nonsense. By running an unknown program, I'm accepting, for the sake of
convenience, the risk that it might clear my screen.


Or flood the scrollback buffers, or lock the machine, etc, etc, etc, not
a single one of which is in the remotest degree consistent with keeping
the "important data" untouched.
If the author of
the program was stupid enough to clear the screen for no good reason,
I'm going to be annoyed


But a lockup, that's okay, flooding the buffers, that's okay, closing the
terminal, that's okay. Only clearing the screen isn't?

Rubbish. *Any* result which causes the loss of the data will have the
same, possibly even worse, annoyance factor. You know that, but guess
what? You run the program anyways. Obviously, you don't care about the
data, or you wouldn't be running the app in the middle of it.

You simply cannot have it both ways. Either the data matters, in which
case you *treat* it like it matters - not running unknown, unpredictable
apps in the middle of it - or the data doesn't matter. The very fact of
running such apps in the middle of it demonstrates that you don't care
about the data - thus the argument offered, thus far, at least against
screen clears, is complete tripe.

Why is it so many folks have this weird bug up their butt about screen
clears? You say "clear the screen", it's like pushing a magic button.
There's about 973 other ways to lose the data, all of which are equally
unacceptable, and every single one of which - including via screen clears
- can be trivially avoided, usually by *not* doing something so silly as
running unknown, unpredictable, untrusted apps in the middle of one's
important data stores.

The lesson to be learned here is not "don't clear the screen", but,
rather, "don't run untrusted apps in the middle of your important data".
If there's an argument against clearing the screen, I'd be happy to hear
it... but this isn't it. It focuses on one trivial detail to the
exclusion of the general premise - missing the forest for a single tree -
then blames the tree. It's patently absurd.


Unless you believe that all C apps should aspire to be "unpredictable",
you are shooting yourself in the foot, because obviously if a good
program should be predictable (which we know, as no one accepts UB
here), then an "unpredictable" program, or one that clears your screen,
must not be good.

Here's an analogy:
Suppose you are writing a test. Your teacher comes up to you and sets
fire to your notepaper. You look at him and ask why he did that. He
replies that by putting your looseleaf openly on your desk, you were
putting its value at 0, because you were in a room with an unpredictable
teacher holding a lighter. Anything important should have been stored in your
binders, and unless you're the sort of person who frequently dips their
binder in a tank of ink, you have no right to complain when you put your
quick notes at his disposal like so.
Apr 26 '06 #32

In article <pa****************************@gmail.com>,
Kelsey Bjarnason <kb********@gmail.com> wrote:
On Tue, 25 Apr 2006 18:18:10 +0000, Keith Thompson wrote:
How do you conclude that my on-screen data has zero value?

As I explained; you defined that. By running unknown, unpredictable
programs in the middle of it. Since those programs may well nuke the
data, the fact you choose to run them there means that you, yourself, have
set the value of the data, and the value is zero. If it were non-zero, you
wouldn't be running unpredictable, unknown apps in the middle of it, or
you'd have saved it elsewhere.


No. Your logic is insufficient.

Each time a program is run, even a well-known program, there is a risk
of malbehaviour, even if only due to hardware errors. Indeed, with
traditional electronic displays, there is a non-zero risk of loss of
information, such as if the power fails, or if the device blows a fuse,
or if a capacitor burns out. One could arrange to have the outputs
logged to a file (e.g., the unix "script" program), but the disk might
fill up, the filesystem might get corrupted, the drive assembly might
Halt And Catch Fire. The risk can never be totally removed. By your
logic, since these risks of loss are non-zero, by doing any computation
at all, the person has set the value of the computation to be zero,
which is [to me anyhow] clearly not realistic.

Instead, the person is not setting the information value to be zero:
the person is multiplying the probability of significant unfriendly
program behaviour (or other failure) times the cost of reproducing the
information, and deciding that the value of the information is lower
than that risk-weighted cost. (More correctly, the person is
integrating rather than multiplying, as there are multiple potential
risks that have different associated costs.) The modern -probability-
that a program will clear the display upon starting is not high --
except perhaps when dealing with programs written by novices.

As the "if you valued your data you wouldn't have run the program"
argument is not correct, the only excuse for gratitiously clearing
the user's screen is for the perversity of reminding users to
take more care in running unknown programs -- a reminder that
unknown programs could be even more deliberately malicious.
--
"law -- it's a commodity"
-- Andrew Ryan (The Globe and Mail, 2005/11/26)
Apr 26 '06 #33


In article <pa****************************@gmail.com>, Kelsey Bjarnason <kb********@gmail.com> writes:
On Tue, 25 Apr 2006 14:34:34 +0000, Richard Bos wrote:
No, he didn't. He generated it for immediate use.


Then if it goes away, it doesn't matter, so the whole discussion is moot.


What nonsense. "Importance" isn't a binary attribute of data. Some
data are more important than others at any given moment; importance
changes over time; it depends on context, cost of reproduction, and a
host of other factors; and data may have subjective value that is
independent of its objective importance (that is, its productivity in
current and future processes versus the cost of recreating it or
doing without it).

Richard's ls example is a fine one. Yes, I can run ls again, as many
times as I like; that doesn't excuse a program's removing that
information from my screen and making me run ls again, if the program
has no reason to clear the screen.

As far as I can see, your argument hinges on a false dichotomy
between "important" information which must be carefully preserved and
"unimportant" information which can be discarded with zero cost.
That's a ridiculous model.
And that's the very point. If Turbo-C wannabe programmers weren't so
orgasmic about clearing the screen, their programs would not predictably
clobber over useful data we've just generated.


No, that's not the point. The point is that the user who runs
unpredictable programs in the middle of his important data has nobody but
himself to blame when the data goes away.


No, that's not the point either. The argument does not depend on
whether the user knows or does not know that the program will clear
the screen. I can know that a program will clear the screen
unnecessarily and still deplore that it does so.

--
Michael Wojcik mi************@microfocus.com

He described a situation where a man is there to feed a dog and the dog is
there to keep the man from touching the equipment. -- Anthony F. Giombetti
Apr 26 '06 #34

I suggest you change your second if to an else if {...}, then write an if
statement around return(n*fact(n-1)); and the last thing I want to say is
change your function's return type to double

Apr 26 '06 #35

"Typhonike" <ty*******@gmail.com> writes:
I suggest you change your second if to an else if {...}, then write an if
statement around return(n*fact(n-1)); and the last thing I want to say is
change your function's return type to double


What are you talking about?

Read <http://cfaj.freeshell.org/google/>.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Apr 26 '06 #36
