As you correctly titled this thread, #define is a preprocessor directive. The preprocessor is NOT the compiler and deals only with preprocessor commands. The compiler cannot handle preprocessor directives; they are not part of the real language syntax.
When a #define is used, the preprocessor does a text substitution, replacing the symbol name with the text it is defined as. So for your #define, the code
double oneoverpi = 1 / PI;
is converted by the preprocessor to
double oneoverpi = 1 / 4.0*atan(1.0);
The compiler then compiles this using the normal operator precedence rules, so it effectively calculates (1 / 4.0)*atan(1.0), which is a quarter of atan(1.0) rather than the reciprocal of PI.
This is a known gotcha with #defines: you should always surround the body of your #define with parentheses, which avoids the problem:
#define PI (4.0*atan(1.0))

double oneoverpi = 1 / PI;

// after preprocessing
double oneoverpi = 1 / (4.0*atan(1.0)); // Calculation is correct
No "calling" is done it is entirely text substitution.
One final thought: in C++ it would certainly be considered best practice not to use a #define for this at all but to use a constant, which has the added benefit of proper type checking.
static const double PI = 4.0*atan(1.0);