Bytes IT Community

File Closing Problem in 2.3 and 2.4, Not in 2.5

Greetings:

Please forgive me if this is the wrong place for this post. I couldn't find a more acceptable forum. If there is one, please point me in the right direction.

I am part of a small team writing a table-driven automated testing framework for embedded software. The tables, which contain rows of keywords and data that drive the testing, are stored as plain-text "Comma-Separated Value" or .csv files. Each table can call other tables, which means that multiple files may be open at a time.

The framework includes a Parser class. The program takes the name of the top-level table as a parameter, creates an instance of the Parser and passes the table name to it. The Parser instance opens the .csv file with that name, reads each line of the file (row of the table) and takes the appropriate action. When it encounters a row referencing another table, it creates a new Parser instance and passes it the name of the new table, suspending its own operation until the new Parser instance completes.

In this way, a tree of Parser instances is created, each with a single open file object. (BTW, recursive and circular references are not allowed.) When each Parser instance comes to the end of its table, the instance is explicitly destroyed, presumably destroying any objects it holds, AND closing its open file.
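The structure described above might be sketched roughly like this (the Parser name is from the post; the run method, the "CALL" keyword, and the results list are illustrative assumptions, and modern Python syntax is used for clarity):

```python
import csv

class Parser(object):
    """Rough sketch of the table-driven Parser described above."""

    def __init__(self, table_name):
        # One open file object per Parser instance.
        self.table_file = open(table_name, "r", newline="")

    def run(self, results):
        for row in csv.reader(self.table_file):
            if not row:
                continue
            if row[0] == "CALL":
                # Suspend this parser; run the child table to completion.
                child = Parser(row[1])
                child.run(results)
            else:
                results.append(row)  # dispatch on keyword and data here
        # Note: no explicit close() here -- this is the pattern that
        # relied on object destruction to release the file handle.
```

Each invocation of a table opens one file, so repeated invocations without a deterministic close accumulate OS handles.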

Because of the nature of the framework and the testing we do, this Parser tree never gets very tall: four or five levels at the most. The same table may be invoked dozens or hundreds of times, however, with different sets of data each time.

This is where things go wrong. After about 500 table invocations, the framework starts refusing to process any more tables, responding with the following error:

[Errno 24] Too many open files: 'HostCommandDevNotReady.csv'

We can correct the behavior by explicitly closing each Parser's table file object before exiting the Parser code. This suggests that Python is failing to free some file-related internal resource when a Parser object is destroyed. The behavior occurs on Python 2.3 and 2.4 for Windows, but not on Python 2.3 for Linux, and not on the Windows version of Python 2.5.
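The explicit-close workaround can be made robust with try/finally, so the handle is released deterministically even if processing a row raises (the names here are illustrative, not the framework's actual code):

```python
class Parser(object):
    def __init__(self, table_name):
        self.table_file = open(table_name, "r")

    def run(self):
        try:
            for line in self.table_file:
                pass  # process each table row here
        finally:
            # Release the OS-level file handle deterministically,
            # instead of relying on garbage collection.
            self.table_file.close()
```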

This is why I didn't simply file a report in the bug tracker: the problem seems to have been fixed already. I did search the archive of Windows-related bugs but found no mention of this type of bug. What I want to know is:

* has anyone else encountered a problem like this,
* how was the problem corrected,
* can the fix be retro-fitted to 2.3 and 2.4?

Thanks in advance for any information you can provide.

Regards,
Barry
ba***********@psc.com
541-302-1107
________________________
We who cut mere stones must always be envisioning cathedrals.
-Quarry worker's creed
Jan 6 '07 #1
3 Replies


Carroll, Barry wrote:
What I want to know is:

* has anyone else encountered a problem like this,
* how was the problem corrected,
* can the fix be retro-fitted to 2.3 and 2.4?
From your description, I suspect an error in your code. Your description
indicates that you don't expect to have more than five files open
simultaneously. Yet, the error message "Too many open files" occurs when
you open many more files (on the order of hundreds of files).

It is very unlikely that there is a bug in Python where it would fail to
close a file when .close() is explicitly invoked on it (as your
description suggests that you do), so if you get that error message, it
can only mean that you fail to close some files.

Notice that you may have other files open, as well, and that those also
count towards the limit.

As a debugging utility, you can use Sysinternals' Process Explorer.
Make the program halt (not exit) when the exception occurs (e.g. by
having it sleep(1) in a loop), then view all open handles in Process
Explorer (check the menu if it doesn't display them initially).
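One way to implement that halt-instead-of-exit idea (hold_for_inspection and run_framework are assumed names; the sleep loop just keeps the process alive so its handles can be examined):

```python
import time
import traceback

def hold_for_inspection(seconds=None):
    """Print the pending traceback, then sleep so the process stays
    alive and its open handles can be examined in Process Explorer.
    seconds=None means sleep indefinitely."""
    traceback.print_exc()
    deadline = None if seconds is None else time.monotonic() + seconds
    while deadline is None or time.monotonic() < deadline:
        time.sleep(1)

# Wrap the framework's entry point:
# try:
#     run_framework()          # hypothetical entry point
# except OSError:              # e.g. [Errno 24] Too many open files
#     hold_for_inspection()
```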

Regards,
Martin
Jan 6 '07 #2

Martin v. Löwis wrote:
> It is very unlikely that there is a bug in Python where it would fail to
> close a file when .close() is explicitly invoked on it (as your
> description suggests that you do), so if you get that error message, it
> can only mean that you fail to close some files.
I agree with Martin: Python's code for closing files is solid.

Make certain you are really closing the files when you think you are;
I am pretty sure you are not. Look closely at the code that closes the
files. Put a print statement in the block that is supposed to close
the files (maybe even a raw_input("closing file " + afile) statement).

My guess is that you won't see the print statements trigger when you
thought they should; they may be outside "the loop" you thought they
were in.
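That instrumentation might look like this (Parser and close are assumed names; a modern print call is shown):

```python
class Parser(object):
    def __init__(self, table_name):
        self.table_name = table_name
        self.table_file = open(table_name, "r")

    def close(self):
        # Temporary debugging aid: prove this block really executes
        # once per table invocation.
        print("closing file " + self.table_name)
        self.table_file.close()
```

If the message doesn't appear once per invocation, the close code isn't on the path you think it is.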
Jan 6 '07 #3

Martin v. Löwis wrote:
> It is very unlikely that there is a bug in Python where it would fail to
> close a file when .close() is explicitly invoked on it (as your
> description suggests that you do), so if you get that error message, it
> can only mean that you fail to close some files.
I don't understand: the OP's description suggests nothing of the sort
to me. What he said was:
"""
In this way, a tree of Parser instances is created, each with a single
open file object. (BTW, recursive and circular references are not
allowed.) When each Parser instance comes to the end of its table, the
instance is explicitly destroyed, presumably destroying any objects it
holds, AND closing its open file.
"""
which I interpret as: he is doing del parser_instance, and *presuming*
(incorrectly) that attributes of parser_instance (including an open
file object) are magically whisked away instantly, instead of
later/maybe. He later says he explicitly closed the files, which fixed
what he alleges (incorrectly) to be a bug.

To the OP:
(1) The del statement doesn't "destroy" anything. It unbinds the name
from the object in the current namespace, and decrements the object's
reference count. Only if the reference count is then zero will the
janitor be called in.
(2) Check the reference count on the parser_instance just before you
del it. You could be retaining a reference somewhere.
(3) Explicitly close all non-lightweight objects like files (even
read-only ones) and sockets rather than hoping they will go away.
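Points (1) and (2) above can be demonstrated in CPython; note that sys.getrefcount reports one extra reference for its own argument, and the Parser class and registry list here are illustrative stand-ins:

```python
import sys

class Parser(object):
    pass

p = Parser()
baseline = sys.getrefcount(p)   # includes getrefcount's own argument

registry = [p]                  # a stray reference, e.g. a cache or log
assert sys.getrefcount(p) == baseline + 1

del p   # unbinds the name; the object itself survives because
        # 'registry' still refers to it, so __del__ has not run and
        # any file the instance held would still be open
assert len(registry) == 1
```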

HTH,
John

Jan 7 '07 #4
