"sgh" <sh*****@gmail.com> wrote in message news:3e**************************@posting.google.com...
Why doesn't this date/time pattern support milliseconds? The
DateTime.ToString("u") format specifier chops off the milliseconds of
my values.
The UniversalSortableDateTimePattern purposely backs the
'u' format specifier for formatting output. The governing standard is
ISO 8601, and that standard states that a conformant date/time may
or may not include fractional seconds.
Had 'u' formatted its output with milliseconds, it would have produced
a representation of a date/time that some ISO 8601-conformant
recipients might not understand, undercutting the interoperability
of the format.
The .NET Framework fully supports custom date/time formats
(i.e., "pictures"). There's nothing to stop you from using
"yyyy-MM-ddTHH:mm:ss.fff"
as a format specifier.
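To illustrate the difference, here is a minimal sketch (the sample date and the invariant culture are my own additions; the custom picture is the one given above):

```csharp
using System;
using System.Globalization;

class FormatDemo
{
    static void Main()
    {
        // Hypothetical sample value; note the 123 ms component.
        DateTime dt = new DateTime(2003, 5, 1, 12, 30, 45, 123);

        // The standard "u" specifier drops the milliseconds:
        Console.WriteLine(dt.ToString("u", CultureInfo.InvariantCulture));
        // prints 2003-05-01 12:30:45Z

        // A custom picture keeps them ('T' and '-' pass through as literals):
        Console.WriteLine(dt.ToString("yyyy-MM-ddTHH:mm:ss.fff",
                                      CultureInfo.InvariantCulture));
        // prints 2003-05-01T12:30:45.123
    }
}
```

Passing CultureInfo.InvariantCulture pins the ':' time separator so the output is stable regardless of the current culture.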
That makes it hard for me to map my date, time, and dateTime XML Schema types to
System.DateTime values without losing millisecond precision.
Format specifiers produce textual output; if you are mapping an
xsd:dateTime primitive type value to a CLR DateTime value, then
formatted output isn't what you need.
Anybody got a work-around for this?
I think System.Globalization is the wrong namespace to look at
for classes that go to/from the primitive XML Schema datatypes. If
you look in the System.Xml namespace, there is an XmlConvert
class that has all the functions you need for translating the
xsd:dateTime and related datatypes.
DateTime dt = XmlConvert.ToDateTime( strDateTimeAttrValueIn);
string strDateTimeAttrValueOut = XmlConvert.ToString( dt);
ToDateTime( ) and ToString( ) both preserve the optional milliseconds
and timezone qualifiers that may be present.
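A round trip through XmlConvert might look like the sketch below. The sample attribute value is hypothetical; note that on .NET 2.0 and later, the overloads taking an XmlDateTimeSerializationMode supersede the obsolete single-argument forms shown above:

```csharp
using System;
using System.Xml;

class XmlConvertDemo
{
    static void Main()
    {
        // Hypothetical xsd:dateTime value carrying milliseconds
        // and a UTC timezone qualifier.
        string strIn = "2003-05-01T12:30:45.123Z";

        // Parse and re-serialize without losing the fractional seconds.
        DateTime dt = XmlConvert.ToDateTime(strIn,
            XmlDateTimeSerializationMode.Utc);
        string strOut = XmlConvert.ToString(dt,
            XmlDateTimeSerializationMode.Utc);

        Console.WriteLine(dt.Millisecond);  // 123 -- precision preserved
        Console.WriteLine(strOut);          // fractional seconds survive
    }
}
```

By contrast, going through DateTime.ToString("u") and DateTime.Parse would silently discard the .123 along the way.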
Derek Harmon