Folks,
Can anyone throw some clarifying light on the following?
I have come across a column with the same name and the same data contents
defined on different tables; on some the column is defined as a FLOAT,
on others it is a REAL.
(Don't ask me why, it's inherited, legacy, and due for removal once we
have absorbed all the good bits into the data warehouse.)
[BreakDown_Hours] [real] NULL,
[BreakDown_Hours] [float] NULL,
So far so good: according to the documentation, REAL is basically a
4-byte float and equivalent to FLOAT(24).
Reading the documentation, it clearly states that the 'Precision' of a
REAL is 7.
As I recall the definition of 'precision', that means it is capable of
up to 7 decimal places.
Now it may well be I have that wrong, if so feel free to correct me.
When I started looking at the data in the columns defined as a REAL, I
found up to 9 decimal places, albeit with 2 leading zeros:
0.003305263
0.003305383
0.003305879
0.003306
0.003306305
0.003306702
0.003307051
0.003307974
0.003308263
0.003308823
0.003308901
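For what it's worth, those values do seem to survive a round trip through single precision. Here's a minimal sketch in Python of what I tried (assuming REAL is stored as an IEEE-754 single-precision float, which is my reading of the docs, and noting that leading zeros are not significant digits):

```python
import struct

def to_real(x):
    """Round-trip a Python float through IEEE-754 single precision,
    which (as I understand it) is what SQL Server stores for REAL."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

# 0.003305263 has 9 decimal places but only 7 significant digits
# (the two leading zeros don't count), so it survives the round
# trip when rounded back to 9 decimal places.
print(round(to_real(0.003305263), 9))  # → 0.003305263

# By contrast, a value with 9 significant digits does NOT survive:
print(to_real(123456789.0))            # → 123456792.0
```

So if 'precision 7' means 7 *significant* digits rather than 7 decimal places, the data I'm seeing would be consistent with the docs, but I'd welcome confirmation.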
Looking at the properties of the column in Management Studio, it states
the 'Numeric Precision' as 24.
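A possibly related observation: that 24 looks like bits of significand (as in FLOAT(24)) rather than decimal digits, and converting 24 binary digits to decimal digits gives roughly the 7 the docs quote. A rough check (my own back-of-envelope arithmetic, not from the docs):

```python
import math

# FLOAT(24) carries a 24-bit significand; the equivalent number of
# reliable decimal digits is 24 * log10(2) ≈ 7.22, i.e. about 7.
print(math.floor(24 * math.log10(2)))  # → 7
```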
I can get round all of this without too much trouble but I'd like to
understand what is going on.
Does anyone have an explanation?
TIA Tim