Bytes | Software Development & Data Engineering Community
open function fail after running a day

I have created a Python script that batch-processes data
files every hour.
The script is running on Solaris, Python version 2.3.3.

t = open(filename, 'rb')
data = t.read()
# processing data...
t.close()

The script works fine on the day of execution.
It is able to process the data files every hour. However, the
processing fails one day later, i.e. once the date increments by 1.

Traceback (most recent call last):
File "./alexCopy.py", line 459, in processRequestModule
sanityTestSteps(reqId,model)
File "./alexCopy.py", line 699, in sanityTestSteps
t = open(filename, 'rb')
IOError: [Errno 24] Too many open files:

I have explicitly closed the file. Is there something else I need to
do?

Appreciate your comments

Jun 7 '07 #1
In article <11**********************@n15g2000prd.googlegroups.com>,
alexteo21 <al*******@yahoo.com> wrote:
The script works fine on the day of execution.
It is able to process the data files every hour. However, the
processing fails one day later, i.e. once the date increments by 1.

Traceback (most recent call last):
File "./alexCopy.py", line 459, in processRequestModule
sanityTestSteps(reqId,model)
File "./alexCopy.py", line 699, in sanityTestSteps
t = open(filename, 'rb')
IOError: [Errno 24] Too many open files:

I have explicitly closed the file. Is there something else I need to
do?
Sounds like the .close() isn't getting executed as you think. Try using
the logging module to log a line immediately before each open and close
so that you can ensure you're really closing all the files.
Alternatively, some other bit of code may be the guilty party. A utility
like fstat can show you who has files open.

Good luck

--
Philip
http://NikitaTheSpider.com/
Whole-site HTML validation, link checking and more
Jun 7 '07 #2
On Jun 7, 3:33 pm, alexteo21 <alexte...@yahoo.com> wrote:
I have created a script using python that will batch process data
files every hour
The script is running on Solaris. Python version 2.3.3

t = open(filename, 'rb')
data = t.read()
# processing data...
t.close()
Try the following approach:

t = open(filename, 'rb')
try:
    data = t.read()
    # processing data...
finally:
    t.close()

and see if that improves matters. If you want to add logging for a
quick check, then...

import logging

t = open(filename, 'rb')
try:
    data = t.read()
    # processing data...
except:
    logging.exception("Failed to process file %r", filename)
finally:
    t.close()

Regards,

Vinay Sajip

Jun 7 '07 #3
Try the following (Python 2.5.x):

import logging

t = open(filename, 'rb')
try:
    data = t.read()
    # processing data...
except:
    logging.exception("Failed to process %r", filename)
finally:
    t.close()

For earlier versions of Python, you will need to nest the try blocks:

import logging

t = open(filename, 'rb')
try:
    try:
        data = t.read()
        # processing data...
    except:
        logging.exception("Failed to process %r", filename)
finally:
    t.close()

Regards,
Vinay Sajip

Jun 7 '07 #5
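As a complement to the fstat suggestion earlier in the thread, the script can also watch its own descriptor count between batches. A minimal sketch, assuming a /proc/<pid>/fd directory as found on Solaris and Linux; open_fd_count is a made-up helper name:

```python
import os

def open_fd_count(pid=None):
    # Count entries under /proc/<pid>/fd; each entry is one open descriptor.
    if pid is None:
        pid = os.getpid()
    try:
        return len(os.listdir("/proc/%d/fd" % pid))
    except OSError:
        return -1  # no such pid, or /proc is laid out differently here
```

Logging this value once per hourly batch should make a leak obvious long before errno 24 (EMFILE) is reached.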
Sorry for the multiple posts. I kept getting network errors and it
looked like the posts weren't getting through.

Regards,

Vinay

Jun 8 '07 #6

This discussion thread is closed

Replies have been disabled for this discussion.

