Bytes | Software Development & Data Engineering Community

How Can I Increase the Speed of a Large Number of Date Conversions

I am a programming amateur and a Python newbie who needs to convert
about 100,000,000 strings of the form "1999-12-30" into ordinal dates
for sorting, comparison, and calculations. Though my script does a ton
of heavy calculational lifting (for which numpy and psyco are a
blessing) besides converting dates, it still seems to like to linger
in the datetime and time libraries. (Maybe there's a hot module in
there with a cute little function and an impressive set of
attributes.)

Anyway, others in this group have said that the date and time
libraries are a little on the slow side but, even if that's true, I'm
thinking that the best code I could come up with to do the conversion
looks so clunky that I'm probably running around the block three times
just to go next door. Maybe someone can suggest a faster, and perhaps
simpler, way.

Here's my code, in which I've used a sample date string instead of its
variable name for the sake of clarity. Please don't laugh loud enough
for me to hear you in Davis, California.

dateTuple = time.strptime("2005-12-19", '%Y-%m-%d')
dateTuple = dateTuple[:3]
date = datetime.date(dateTuple[0], dateTuple[1], dateTuple[2])
ratingDateOrd = date.toordinal()

P.S. Why is an amateur newbie trying to convert 100,000,000 date
strings into ordinal dates? To try to win a million dollars, of
course! In case you haven't seen it, the contest is at www.netflixprize.com.
There are currently only about 23,648 contestants on 19,164 teams from
151 different countries competing, so I figure my chances are pretty
good. ;-)

Jun 8 '07 #1
Some Other Guy <bg****@microsoft.com> wrote:
vdicarlo wrote:
>I am a programming amateur and a Python newbie who needs to convert
about 100,000,000 strings of the form "1999-12-30" into ordinal dates
for sorting, comparison, and calculations. Though my script does a ton
of heavy calculational lifting (for which numpy and psyco are a
blessing) besides converting dates, it still seems to like to linger
in the datetime and time libraries. (Maybe there's a hot module in
there with a cute little function and an impressive set of
attributes.)
...
>dateTuple = time.strptime("2005-12-19", '%Y-%m-%d')
dateTuple = dateTuple[:3]
date = datetime.date(dateTuple[0], dateTuple[1], dateTuple[2])
ratingDateOrd = date.toordinal()
There's nothing terribly wrong with that, although strptime() is overkill
if you already know the date format. You could get the date like this:
date = apply(datetime.date, map(int, "2005-12-19".split('-')))
But, more importantly... 100,000,000 individual dates would cover 274000
years! Do you really need that much?? You could just precompute a
dictionary that maps a date string to the ordinal for the last 50 years
or so. That's only 18250 entries, and can be computed in less than a second.
Lookups after that will be near instantaneous:
For that matter why not memoize the results of each conversion
(toss it in a dictionary and precede each conversion with a
check like: if this_date in datecache: return datecache[this_date]
else: ret=convert(this_date); datecache[this_date]=ret; return ret)

(If you don't believe that will help, consider that a memo-ized
implementation of a recursive Fibonacci function runs about as quickly
as an iterative approach).
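
For illustration, a minimal runnable sketch of that caching idea (the
datecache dict and to_ordinal name are assumptions, not from the post;
the conversion step is inlined rather than a separate convert() helper):

import datetime

datecache = {}  # maps "YYYY-MM-DD" strings to ordinals already computed

def to_ordinal(date_string):
    # Return the cached result if this exact string was converted before.
    if date_string in datecache:
        return datecache[date_string]
    # Otherwise convert once, remember it, and return it.
    y, m, d = map(int, date_string.split('-'))
    ordinal = datetime.date(y, m, d).toordinal()
    datecache[date_string] = ordinal
    return ordinal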
--
Jim Dennis,
Starshine: Signed, Sealed, Delivered

Jun 8 '07 #2
James T. Dennis wrote:
Some Other Guy <bg****@microsoft.com> wrote:
>vdicarlo wrote:
>>I am a programming amateur and a Python newbie who needs to convert
about 100,000,000 strings of the form "1999-12-30" into ordinal dates
for sorting, comparison, and calculations. Though my script does a ton
of heavy calculational lifting (for which numpy and psyco are a
blessing) besides converting dates, it still seems to like to linger
in the datetime and time libraries. (Maybe there's a hot module in
there with a cute little function and an impressive set of
attributes.)
...
>>dateTuple = time.strptime("2005-12-19", '%Y-%m-%d')
dateTuple = dateTuple[:3]
date = datetime.date(dateTuple[0], dateTuple[1], dateTuple[2])
ratingDateOrd = date.toordinal()
>There's nothing terribly wrong with that, although strptime() is overkill
if you already know the date format. You could get the date like this:
> date = apply(datetime.date, map(int, "2005-12-19".split('-')))
>But, more importantly... 100,000,000 individual dates would cover 274000
years! Do you really need that much?? You could just precompute a
dictionary that maps a date string to the ordinal for the last 50 years
or so. That's only 18250 entries, and can be computed in less than a second.
Lookups after that will be near instantaneous:

For that matter why not memoize the results of each conversion
(toss it in a dictionary and precede each conversion with a
check like: if this_date in datecache: return datecache[this_date]
else: ret=convert(this_date); datecache[this_date]=ret; return ret)

(If you don't believe that will help, consider that a memo-ized
implementation of a recursive Fibonacci function runs about as quickly
as an iterative approach).

Even better, do something like (not tested):

try:
    dateord = datedict[cdate]
except KeyError:
    dateord = datetime.date(*[int(x) for x in cdate.split('-')]).toordinal()
    datedict[cdate] = dateord

That way you build the cache on the fly, and there is no penalty if the
lookup key is already in the cache.
Jun 8 '07 #3
Some Other Guy wrote:
vdicarlo wrote:
>I am a programming amateur and a Python newbie who needs to convert
about 100,000,000 strings of the form "1999-12-30" into ordinal dates
for sorting, comparison, and calculations. Though my script does a ton
of heavy calculational lifting (for which numpy and psyco are a
blessing) besides converting dates, it still seems to like to linger
in the datetime and time libraries. (Maybe there's a hot module in
there with a cute little function and an impressive set of
attributes.)
...
>dateTuple = time.strptime("2005-12-19", '%Y-%m-%d')
dateTuple = dateTuple[:3]
date = datetime.date(dateTuple[0], dateTuple[1], dateTuple[2])
ratingDateOrd = date.toordinal()

There's nothing terribly wrong with that, although strptime() is overkill
if you already know the date format. You could get the date like this:

date = apply(datetime.date, map(int, "2005-12-19".split('-')))

But, more importantly... 100,000,000 individual dates would cover 274000
years! Do you really need that much?? You could just precompute a
dictionary that maps a date string to the ordinal for the last 50 years
or so. That's only 18250 entries, and can be computed in less than a second.
Lookups after that will be near instantaneous:
import datetime

days = 365*50 # about 50 years worth
dateToOrd = {} # dict. of date string to ordinal
....
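
The quoted snippet is truncated here; for illustration, a rough sketch of
how such a table might be built (the 1970 start date is an assumed value,
not from the original post):

import datetime

days = 365 * 50                    # about 50 years worth
dateToOrd = {}                     # dict. of date string to ordinal
start = datetime.date(1970, 1, 1)  # assumed starting point for the range

for offset in range(days):
    d = start + datetime.timedelta(days=offset)
    dateToOrd[d.isoformat()] = d.toordinal()  # e.g. "1970-01-01" -> 719163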

Then there's the argument of "why bother using real dates?" I mean, all
that is necessary is a mapping of date -> number for sorting. Who needs
accuracy?

for date in inp:
    y, m, d = map(int, date.split('-'))
    ordinal = (y-1990)*372 + (m-1)*31 + d-1

Depending on the allowable range of years, one could perhaps adjust the
1990 up, and get the range of date ordinals down to about 12 bits (if
one packs netflix data properly, you can get everything in memory).
With a bit of psyco, the above is pretty speedy.

- Josiah
Jun 8 '07 #4
vdicarlo <vd******@gmail.com> writes:
>I am a programming amateur and a Python newbie who needs to convert
about 100,000,000 strings of the form "1999-12-30" into ordinal dates
for sorting, comparison, and calculations. Though my script does a ton
of heavy calculational lifting (for which numpy and psyco are a
blessing) besides converting dates, it still seems to like to linger
in the datetime and time libraries. (Maybe there's a hot module in
there with a cute little function and an impressive set of
attributes.)
>Anyway, others in this group have said that the date and time
libraries are a little on the slow side but, even if that's true, I'm
thinking that the best code I could come up with to do the conversion
looks so clunky that I'm probably running around the block three times
just to go next door. Maybe someone can suggest a faster, and perhaps
simpler, way.
>Here's my code, in which I've used a sample date string instead of its
variable name for the sake of clarity. Please don't laugh loud enough
for me to hear you in Davis, California.
>dateTuple = time.strptime("2005-12-19", '%Y-%m-%d')
dateTuple = dateTuple[:3]
date = datetime.date(dateTuple[0], dateTuple[1], dateTuple[2])
ratingDateOrd = date.toordinal()
>P.S. Why is an amateur newbie trying to convert 100,000,000 date
strings into ordinal dates? To try to win a million dollars, of
course! In case you haven't seen it, the contest is at www.netflixprize.com.
There are currently only about 23,648 contestants on 19,164 teams from
151 different countries competing, so I figure my chances are pretty
good. ;-)
I can't help noticing that dates in 'yyyy-mm-dd' format are already sortable
as strings.

>>> '1999-12-30' > '1999-12-29'
True

Depending on the pattern of access, the slightly slower string comparisons
_might_ be outweighed by the conversion time saved. Worth a try.
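
For illustration (sample values made up), sorting the raw strings already
gives chronological order:

dates = ["2005-12-19", "1999-12-30", "2001-07-04"]
print(sorted(dates))  # ['1999-12-30', '2001-07-04', '2005-12-19']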

Eddie
Jun 8 '07 #5
Many thanks for the lucid and helpful suggestions. Since my date range
was only a few years, I used Some Other Guy's suggestion above, which
the forum is saying will be deleted in five days, to make a dictionary
of the whole range of dates when the script starts. It was so fast it
wasn't even worth saving in a file. Made the script a LOT faster. I
guess two thousand function calls must be faster than 200 million?
Like maybe a hundred thousand times faster?

I also benefitted from studying the other suggestions. I had actually
implemented an on the fly dictionary caching scheme for one of my
other calculations. I don't know why it didn't occur to me to do it
with the dates, except I think I must be assuming, as a newbie
Pythonista, that the less I do and the more I let the libraries do the
better off I will be.

Thanks for putting me straight. As someone I know said to me when I
told him I wanted to learn Python, "the power of Python is in the
dictionaries".

Funny how long it's taking me to learn that.

Jun 9 '07 #6

"vdicarlo" <vd******@gmail .comwrote in message
news:11******** ************@i3 8g2000prf.googl egroups.com...
| Many thanks for the lucid and helpful suggestions. Since my date range
| was only a few years, I used Some Other Guy's suggestion above, which
| the forum is saying will be deleted in five days, to make a dictionary
| of the whole range of dates when the script starts. It was so fast it
| wasn't even worth saving in a file. Made the script a LOT faster. I
| guess two thousand function calls must be faster than 200 million?
| Like maybe a hundred thousand times faster?

Any function called repeatedly with the same input is a candidate for a
lookup table. This is a fairly extreme example.

|
| I also benefitted from studying the other suggestions. I had actually
| implemented an on the fly dictionary caching scheme for one of my
| other calculations. I don't know why it didn't occur to me to do it
| with the dates, except I think I must be assuming, as a newbie
| Pythonista, that the less I do and the more I let the libraries do the
| better off I will be.
|
| Thanks for putting me straight. As someone I know said to me when I
| told him I wanted to learn Python, "the power of Python is in the
| dictionaries".
|
| Funny how long it's taking me to learn that.

Well, look at how many of us also did not quite see the now obvious answer.

Even Python's list.sort() only fairly recently gained the optional 'key'
parameter, which implements the decorate-sort-undecorate pattern and which
often obsoletes the original compare-function parameter because it saves
time by calculating (and saving) the key only once per item instead of once
per comparison.
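
For illustration, a small sketch of that difference (the sample data and
the to_ord helper are made up for the example): with key=, each date string
is converted exactly once rather than once per comparison.

import datetime

def to_ord(s):
    # Convert one "YYYY-MM-DD" string to its ordinal.
    y, m, d = map(int, s.split('-'))
    return datetime.date(y, m, d).toordinal()

dates = ["2005-12-19", "1999-12-30", "2001-07-04"]
dates.sort(key=to_ord)  # to_ord runs once per item, not once per comparison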
>>> import this
The Zen of Python, by Tim Peters
[snip]
Namespaces are one honking great idea -- let's do more of those!

I include lookup dictionaries in this admonition.

tjr

Jun 9 '07 #7
