I must be missing something here...
The inconsistency described below is mildly annoying in a current
project. Can anyone come up with an explanation?
I've got two seemingly identical object graphs which serialize to
XML. Each has some decimal properties, for example:
class Widget
{
    public decimal SubTotal { get; set; }
    public decimal Tax { get; set; }
}
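For context, here's a minimal repro of how I'm producing the XML. (I'm assuming XmlSerializer is representative of what my project does; the Widget initializers below are just my guess at values that trigger the difference.)

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

public class Widget
{
    public decimal SubTotal { get; set; }
    public decimal Tax { get; set; }
}

class Repro
{
    static string Serialize(Widget w)
    {
        var serializer = new XmlSerializer(typeof(Widget));
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, w);
            return writer.ToString();
        }
    }

    static void Main()
    {
        // Hypothetical values: both are zero, but written with different scales.
        var widgetA = new Widget { SubTotal = 0m, Tax = 0m };
        var widgetB = new Widget { SubTotal = 0.00m, Tax = 0m };
        Console.WriteLine(Serialize(widgetA)); // contains <SubTotal>0</SubTotal>
        Console.WriteLine(Serialize(widgetB)); // contains <SubTotal>0.00</SubTotal>
    }
}
```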
When I serialize each graph, some of the decimals which are zero come
out as "0" while others come out as "0.00". Specifically:
widgetA: <SubTotal>0</SubTotal><Tax>0</Tax>
widgetB: <SubTotal>0.00</SubTotal><Tax>0</Tax>
Comparing the objects in memory, the following all evaluate to true:
widgetA.SubTotal == widgetB.SubTotal
widgetA.Tax == widgetB.Tax
widgetA.SubTotal * 1000000 == widgetB.SubTotal * 1000000
widgetA.SubTotal * 1000000m == widgetB.SubTotal * 1000000m
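For what it's worth, one thing I could try next is inspecting the raw 128-bit representation with decimal.GetBits, since two decimals can compare equal while still carrying a different scale (the count of digits after the decimal point, stored in bits 16-23 of the fourth element). A quick sketch:

```csharp
using System;

class ScaleCheck
{
    static int ScaleOf(decimal d)
    {
        // Scale lives in bits 16-23 of the flags element returned by GetBits.
        return (decimal.GetBits(d)[3] >> 16) & 0xFF;
    }

    static void Main()
    {
        decimal a = 0m;     // zero with scale 0
        decimal b = 0.00m;  // zero with scale 2
        Console.WriteLine(a == b);       // True: equality ignores scale
        Console.WriteLine(a.ToString()); // "0"
        Console.WriteLine(b.ToString()); // "0.00"
        Console.WriteLine(ScaleOf(a));   // 0
        Console.WriteLine(ScaleOf(b));   // 2
    }
}
```

If the scales differ like this, that would explain why equal values can still format differently.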
I'm trying to find some rhyme or reason to this behavior. Any ideas?
Thanks in advance!
James