
stderr, stdout, and errno 24

To capture output from python scripts run from a C++ app I've added the
following code at the beginning of the C++ app:

PyRun_SimpleString("import grabber");
PyRun_SimpleString("import sys");
PyRun_SimpleString("class a:\n\tdef write(self,s):\n\t\tograbber.grab(s)\n");
PyRun_SimpleString("import sys\nsys.stderr=a()\nsys.stdout=a()");

It's hard to read that way; here's what it expands to:
import grabber
import sys
class a:
    def write(self, s):
        grabber.grab(s)

grabber is a C++ extension; the grab function displays the
captured text in a Windows app. After running about 450+ scripts in a
row, I get "IOError: [Errno 24] Too many open files."

I've searched this group and the net and determined that stderr and
stdout may open files; is that correct? If so, would each run of a
script open new files related to stderr and stdout without closing
them? I'm just guessing.
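For what it's worth, here is a pure-Python sketch of the same redirection, with a hypothetical Grabber class standing in for the C++ extension (the names Grabber/chunks are illustrative, not from the post). Note that neither the redirect object nor grab() opens any files by itself:

```python
import sys

class Grabber:
    # Hypothetical stand-in for the C++ grabber extension: it just
    # collects text instead of displaying it in a Windows app.
    def __init__(self):
        self.chunks = []
    def grab(self, s):
        self.chunks.append(s)

grabber = Grabber()

class a:
    # Same shape as the class in the question: a file-like object
    # whose write() forwards everything to grabber.grab().
    def write(self, s):
        grabber.grab(s)
    def flush(self):
        pass  # some callers expect sys.stdout.flush() to exist

old_out, old_err = sys.stdout, sys.stderr
sys.stdout = sys.stderr = a()
try:
    print("captured line")   # goes to grabber, not the console
finally:
    sys.stdout, sys.stderr = old_out, old_err
```

The try/finally restore matters: it guarantees the originals come back even if the redirected code raises.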

Jul 13 '06 #1
5 Replies


On 12 Jul 2006 18:09:42 -0700 in comp.lang.python, "Wesley Henwood"
<we***********@hotmail.com> wrote:
>To capture output from python scripts run from a C++ app I've added the
following code at the beginning of the C++ app:

PyRun_SimpleString("import grabber");
PyRun_SimpleString("import sys");
PyRun_SimpleString("class a:\n\tdef write(self,s):\n\t\tograbber.grab(s)\n");
PyRun_SimpleString("import sys\nsys.stderr=a()\nsys.stdout=a()");

It's hard to read that way; here's what it expands to:
import grabber
import sys
class a:
    def write(self, s):
        grabber.grab(s)
Actually, that last line will read more like
ograbber.grab(s)
At least, if what you posted above is accurate...

It's not the question you asked, but if you want to make that easier
to read, you can do something like

PyRun_SimpleString("import grabber");
PyRun_SimpleString("import sys");

PyRun_SimpleString("class a:\n"
" def write(self,s):\n"
" grabber.grab(s)\n");

PyRun_SimpleString("import sys\n"
"sys.stderr=a()\n"
"sys.stdout=a()\n");
C++, like Python, will concatenate adjacent string literals separated
only by whitespace.
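Python applies the same rule to adjacent string literals, so the readable layout carries over if you ever build the source string in Python itself. A sketch (FakeGrabber and the other names here are illustrative, not from the original post):

```python
# Adjacent string literals are concatenated at compile time in Python,
# just as in C and C++, so the class source stays readable.
source = ("class a:\n"
          "    def write(self, s):\n"
          "        grabber.grab(s)\n")

class FakeGrabber:
    # Illustrative stand-in for the C++ grabber extension.
    def __init__(self):
        self.text = []
    def grab(self, s):
        self.text.append(s)

namespace = {"grabber": FakeGrabber()}
exec(source, namespace)           # defines class a inside `namespace`
namespace["a"]().write("hello")   # write() forwards to grabber.grab()
```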

Regards,
-=Dave

--
Change is inevitable, progress is not.
Jul 13 '06 #2

In article <11**********************@h48g2000cwc.googlegroups.com>,
Wesley Henwood <we***********@hotmail.com> wrote:
>To capture output from python scripts run from a C++ app I've added the
following code at the beginning of the C++ app:

PyRun_SimpleString("import grabber");
PyRun_SimpleString("import sys");
PyRun_SimpleString("class a:\n\tdef write(self,s):\n\t\tograbber.grab(s)\n");
PyRun_SimpleString("import sys\nsys.stderr=a()\nsys.stdout=a()");

It's hard to read that way; here's what it expands to:
import grabber
import sys
class a:
    def write(self, s):
        grabber.grab(s)

grabber is a C++ extension; the grab function displays the
captured text in a Windows app. After running about 450+ scripts in a
row, I get "IOError: [Errno 24] Too many open files."

I've searched this group and the net and determined that stderr and
stdout may open files, is that correct? If so would each running of a
script be opening new files related to stderr and stdout and not
closing them? I'm just guessing.
I'm guessing, but it sounds like perhaps you're creating an object which has
an open file handle for output for each script that's run. When the
script finishes, if that object still exists, it will keep a file
handle open and eventually you'll hit the system limit on open file
handles for one process.

It's also possible that your C++ app is the one which is failing to
close file handles created for running the scripts - there's no easy
way to tell from the information posted.

You need to examine carefully what happens to any stdin/stdout/stderr
files which are created to execute scripts and ensure that they are
all properly closed (or, in the case of Python, if you don't
explicitly close them, that any references to the files cease to exist
after the script runs). I'd personally recommend explicit closing
here.
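That advice can be sketched as a small helper (run_script and Capture are hypothetical names, not from the thread): save the real stdout/stderr, point them at the capture object for one script, and restore them in a finally block so nothing from the run lingers between scripts:

```python
import sys

class Capture:
    # File-like sink; stands in for the grabber-backed redirect class.
    def __init__(self):
        self.parts = []
    def write(self, s):
        self.parts.append(s)
    def flush(self):
        pass

def run_script(source):
    # Run one script with stdout/stderr captured, then restore the
    # originals so no redirect object outlives the run.
    sink = Capture()
    saved_out, saved_err = sys.stdout, sys.stderr
    sys.stdout = sys.stderr = sink
    try:
        exec(source, {})
    finally:
        sys.stdout, sys.stderr = saved_out, saved_err
    return "".join(sink.parts)
```

Because the sink falls out of scope when run_script returns, nothing accumulates across hundreds of runs.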


--
Jim Segrave (je*@jes-2.demon.nl)

Jul 13 '06 #3

I've checked and double-checked my code and I am closing all files
explicitly after opening them. The only possibility I can think of is
that Python opens files each time I run a script, or each time input to
stderr or stdout is redirected.

Here's a link that is perhaps related to my
problem: <http://pyfaq.infogami.com/why-doesn-t-closing-sys-stdout-stdin-stderr-really-close-it>

Here is a thread in this group; see the post by
alisonken1: <http://groups.google.com/group/comp.lang.python/browse_thread/thread/75e65baa1a51b3a6/512bba0739924917?q=too+many+open+files&rnum=20#512bba0739924917>

Jul 13 '06 #4


Wesley Henwood wrote:
I've checked and double-checked my code and I am closing all files
explicitly after opening them. The only possibility I can think of is
that Python opens files each time I run a script, or each time input to
stderr or stdout is redirected.
<snip>

The problem >I think< is that stdout and stderr are not shared with
each invocation.

Since you're calling a python interpreter, the stdout/stderr from the
C++ program is not inherited - so now every time you call the python
script, it's creating a new process.

With each new process, you're creating a new stdout/stderr handle, and
if you're calling out from C++ relatively quickly (say more than 100
times a minute), then the old stdout/stderr handles have not had a
chance to be garbage collected - hence, you get "too many open files"
errors.

The workaround would possibly be to create a Python thread, or to
create a stdout fifo and a stderr fifo and have your script redirect
these outputs through the fifo buffers that your C++ code can listen
to.

Not sure how to do either one in MS environments, so you'll have to ask
someone else how to work with them.
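On a POSIX-ish system the pipe half of that idea can be sketched in a few lines: point file descriptor 1 at a pipe for the duration of one call, then restore it and read back whatever was written. This is an assumption-laden sketch (capture_fd1 is a made-up helper), and as noted above Windows would need a different mechanism:

```python
import os
import sys

def capture_fd1(func):
    # Temporarily point fd 1 (stdout at the OS level) at a pipe, run
    # func(), then restore fd 1 and return what was written.
    sys.stdout.flush()        # push any pending output to the real fd 1
    r, w = os.pipe()
    saved = os.dup(1)         # remember the real stdout
    try:
        os.dup2(w, 1)         # fd 1 now writes into the pipe
        func()
        sys.stdout.flush()    # drain buffered output into the pipe
    finally:
        os.dup2(saved, 1)     # restore the real stdout
        os.close(saved)
        os.close(w)
    data = os.read(r, 65536)  # fine for small outputs; a real version
    os.close(r)               # would read in a loop
    return data.decode()
```

Every descriptor created here is closed before returning, so repeated calls can't leak fds.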

Jul 13 '06 #5

In article <11*********************@b28g2000cwb.googlegroups.com>,
"Wesley Henwood" <we***********@hotmail.com> wrote:
>I've checked and double checked my code and I am closing all files
explicitly after opening them.
If you're running your program under Linux, a very easy way to confirm
this is to look in the directory

/proc/<pid>/fd

where <pid> is the PID of your running program. In here you will see a
symlink to every file your program has open, the name of the link being
the file descriptor number.

To make it easier to watch, you may want to stick in a sleep of a few
seconds in-between iterations of the code that executes the Python
script.

If you see the entries piling up in this directory, that will confirm
that you're not closing those files.
Jul 15 '06 #6
