output to console and to multiple files

Hello,

I searched on Google and in this Google Group, but did not find any
solution to my problem.

I'm looking for a way to output stdout/stderr (from a subprocess or
spawn) to screen and to at least two different files.

e.g.
stdout/stderr -> screen
stdout -> log.out
stderr -> log.err

and if possible
stdout/stderr -> screen and log.txt

(that is, 3 files written from stdout/stderr, plus the screen)

Feb 14 '07 #1
I took a look around and I couldn't find anything either. I will be
keeping an eye on this thread to see if someone posts a more standard
solution. In the meantime, though, I will offer up a potential
solution. Duck typing is your friend. If you are only using the write
method of your files, it can be pretty simple to implement a fake file
object to do what you want.

import sys

class TeeFile(object):
    def __init__(self, *files):
        self.files = files
    def write(self, txt):
        for fp in self.files:
            fp.write(txt)

if __name__ == "__main__":
    outf = file("log.out", "w")
    errf = file("log.err", "w")
    allf = file("log.txt", "w")
    sys.stdout = TeeFile(sys.__stdout__, outf, allf)
    sys.stderr = TeeFile(sys.__stderr__, errf, allf)

    print "hello world this is stdout"
    print >>sys.stderr, "hello world this is stderr"

Feb 15 '07 #2
like this?

import sys

class Writers(object):

    def __init__(self, *writers):
        self.writers = writers

    def write(self, string):
        for w in self.writers:
            w.write(string)

    def flush(self):
        for w in self.writers:
            w.flush()

logfile = open('log.txt', 'w')
sys.stdout = Writers(sys.__stdout__, file('log.out', 'w'), logfile)
sys.stderr = Writers(sys.__stderr__, file('log.err', 'w'), logfile)

Feb 15 '07 #3
On Wed, 14 Feb 2007 19:28:34 -0300, na**********@gmail.com
<na**********@gmail.com> wrote:
I'm looking for a way to output stdout/stderr (from a subprocess or
spawn) to screen and to at least two different files.
Look at the tee command. If you control the subprocess, and it's written
in Python, using one of the Python recipes above would be easier and perhaps
give you more control.
But if you can't modify the subprocess, you'll have to use tee.
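For example, something along these lines might do it (a rough, untested
sketch, assuming a Unix-like system with tee on the PATH; the command list
is just a placeholder):

import sys
from subprocess import Popen, PIPE

command = ["some_command", "arg1"]   # placeholder for whatever you want to run

child = Popen(command, stdout=PIPE, stderr=PIPE)

# tee copies its stdin to the named file and to its own stdout,
# so each stream still reaches the screen as well as a log file.
out_tee = Popen(["tee", "log.out"], stdin=child.stdout)
err_tee = Popen(["tee", "log.err"], stdin=child.stderr, stdout=sys.stderr)

child.stdout.close()   # hand the pipe ends over to the tee processes
child.stderr.close()
child.wait()
out_tee.wait()
err_tee.wait()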

--
Gabriel Genellina

Feb 15 '07 #4
On Feb 14, 5:10 pm, "goodwolf" <Robert.Ka...@gmail.com> wrote:
like this?

import sys

class Writers(object):

    def __init__(self, *writers):
        self.writers = writers

    def write(self, string):
        for w in self.writers:
            w.write(string)

    def flush(self):
        for w in self.writers:
            w.flush()

logfile = open('log.txt', 'w')
sys.stdout = Writers(sys.__stdout__, file('log.out', 'w'), logfile)
sys.stderr = Writers(sys.__stderr__, file('log.err', 'w'), logfile)

I've tried methods similar to this and to what Matimus wrote. I know
it works great when using print statements.
However, I'm looking for something that will work with the output
from a subprocess, such as from spawn, os.system, os.popen, etc.

Feb 15 '07 #5
On Feb 14, 6:52 pm, "Gabriel Genellina" <gagsl...@yahoo.com.ar> wrote:
On Wed, 14 Feb 2007 19:28:34 -0300, nathan.sh...@gmail.com
<nathan.sh...@gmail.com> wrote:
I'm looking for a way to output stdout/stderr (from a subprocess or
spawn) to screen and to at least two different files.

Look at the tee command. If you control the subprocess, and it's written
in Python, using one of the Python recipes above would be easier and perhaps
give you more control.
But if you can't modify the subprocess, you'll have to use tee.

--
Gabriel Genellina
Tee, the Unix command? Or is there a tee that is Python?

Feb 15 '07 #6
On Feb 15, 7:53 am, "nathan.sh...@gmail.com" <nathan.sh...@gmail.com>
wrote:
On Feb 14, 5:10 pm, "goodwolf" <Robert.Ka...@gmail.com> wrote:
like this?
import sys

class Writers(object):

    def __init__(self, *writers):
        self.writers = writers

    def write(self, string):
        for w in self.writers:
            w.write(string)

    def flush(self):
        for w in self.writers:
            w.flush()

logfile = open('log.txt', 'w')
sys.stdout = Writers(sys.__stdout__, file('log.out', 'w'), logfile)
sys.stderr = Writers(sys.__stderr__, file('log.err', 'w'), logfile)

I've tried methods similar to this and to what Matimus wrote. I know
it works great when using print statements.
However, I'm looking for something that will work with the output
from a subprocess, such as from spawn, os.system, os.popen, etc.
I think you should be able to use my or goodwolf's solution with the
subprocess module. Something like this (untested):

class TeeFile(object):
    def __init__(self, *files):
        self.files = files
    def write(self, txt):
        for fp in self.files:
            fp.write(txt)

if __name__ == "__main__":
    import sys
    from subprocess import Popen

    command = "whatever you want to run"
    outf = file("log.out", "w")
    errf = file("log.err", "w")
    allf = file("log.txt", "w")
    Popen(
        command,
        stdout = TeeFile(sys.__stdout__, outf, allf),
        stderr = TeeFile(sys.__stderr__, errf, allf)
    )

Feb 15 '07 #7
On Feb 15, 8:51 am, "Matimus" <mccre...@gmail.com> wrote:
On Feb 15, 7:53 am, "nathan.sh...@gmail.com" <nathan.sh...@gmail.com>
wrote:
On Feb 14, 5:10 pm, "goodwolf" <Robert.Ka...@gmail.com> wrote:
like this?
import sys

class Writers(object):

    def __init__(self, *writers):
        self.writers = writers

    def write(self, string):
        for w in self.writers:
            w.write(string)

    def flush(self):
        for w in self.writers:
            w.flush()

logfile = open('log.txt', 'w')
sys.stdout = Writers(sys.__stdout__, file('log.out', 'w'), logfile)
sys.stderr = Writers(sys.__stderr__, file('log.err', 'w'), logfile)
I've tried methods similar to this and to what Matimus wrote. I know
it works great when using print statements.
However, I'm looking for something that will work with the output
from a subprocess, such as from spawn, os.system, os.popen, etc.

I think you should be able to use my or goodwolf's solution with the
subprocess module. Something like this (untested):

class TeeFile(object):
    def __init__(self, *files):
        self.files = files
    def write(self, txt):
        for fp in self.files:
            fp.write(txt)

if __name__ == "__main__":
    import sys
    from subprocess import Popen

    command = "whatever you want to run"
    outf = file("log.out", "w")
    errf = file("log.err", "w")
    allf = file("log.txt", "w")
    Popen(
        command,
        stdout = TeeFile(sys.__stdout__, outf, allf),
        stderr = TeeFile(sys.__stderr__, errf, allf)
    )

I tried this at lunch and it doesn't work. Some version of this method
may work, but Popen tries to call the 'fileno' method of the TeeFile
object (at least it did on my setup) and it isn't there. This is just
a preemptive warning before someone comes back to let me know my code
doesn't work.

Feb 15 '07 #8
On Thu, 15 Feb 2007 19:35:10 -0300, Matimus <mc******@gmail.com> wrote:
I think you should be able to use my or goodwolf's solution with the
subprocess module. Something like this (untested):

class TeeFile(object):
    def __init__(self, *files):
        self.files = files
    def write(self, txt):
        for fp in self.files:
            fp.write(txt)

I tried this at lunch and it doesn't work. Some version of this method
may work, but Popen tries to call the 'fileno' method of the TeeFile
object (at least it did on my setup) and it isn't there. This is just
a preemptive warning before someone comes back to let me know my code
doesn't work.
I don't think any Python-only solution could work. The pipe options
available for subprocess are those of the underlying OS, and the OS knows
nothing about Python file objects.
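To illustrate the point (untested sketch, with a placeholder command string):
Popen is perfectly happy with plain file objects, because they carry a real
OS-level descriptor, but then the output goes only into the files and never
reaches the screen:

from subprocess import Popen

command = "whatever you want to run"   # placeholder

outf = open("log.out", "w")
errf = open("log.err", "w")

# Works, because outf and errf have a real fileno() the OS can use,
# but nothing is echoed to the console.
p = Popen(command, shell=True, stdout=outf, stderr=errf)
p.wait()

outf.close()
errf.close()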

--
Gabriel Genellina

Feb 16 '07 #9
On Feb 15, 5:48 pm, "Gabriel Genellina" <gagsl...@yahoo.com.ar> wrote:
On Thu, 15 Feb 2007 19:35:10 -0300, Matimus <mccre...@gmail.com> wrote:
I think you should be able to use my or goodwolf's solution with the
subprocess module. Something like this (untested):
class TeeFile(object):
    def __init__(self, *files):
        self.files = files
    def write(self, txt):
        for fp in self.files:
            fp.write(txt)
I tried this at lunch and it doesn't work. Some version of this method
may work, but Popen tries to call the 'fileno' method of the TeeFile
object (at least it did on my setup) and it isn't there. This is just
a preemptive warning before someone comes back to let me know my code
doesn't work.

I don't think any Python-only solution could work. The pipe options
available for subprocess are those of the underlying OS, and the OS knows
nothing about Python file objects.

--
Gabriel Genellina
I've tried the subprocess method before without any luck.
Thanks for all your suggestions. I guess it's time to rethink what I
want to do.

Feb 16 '07 #10
On Feb 14, 11:28 pm, "nathan.sh...@gmail.com" <nathan.sh...@gmail.com>
wrote:
Hello,

I searched on Google and in this Google Group, but did not find any
solution to my problem.

I'm looking for a way to output stdout/stderr (from a subprocess or
spawn) to screen and to at least two different files.

e.g.
stdout/stderr -> screen
stdout -> log.out
stderr -> log.err

and if possible
stdout/stderr -> screen and log.txt

(that is, 3 files written from stdout/stderr, plus the screen)
I'd derive a class from file, override its write() method to send a
copy to the log, and then assign sys.stdout = newFile(sys.stdout).
Same for stderr.
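Something along these lines, perhaps (untested; written as a thin wrapper
rather than an actual file subclass, and the log file names are only
examples):

import sys

class LoggedStream(object):
    # Wraps a real stream and copies everything written to it into a log file.
    def __init__(self, stream, logname):
        self.stream = stream
        self.log = open(logname, "w")
    def write(self, text):
        self.stream.write(text)
        self.log.write(text)
    def flush(self):
        self.stream.flush()
        self.log.flush()

sys.stdout = LoggedStream(sys.stdout, "log.out")
sys.stderr = LoggedStream(sys.stderr, "log.err")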

Feb 16 '07 #11
On Fri, 16 Feb 2007 14:04:33 -0300, Bart Ogryczak <B.********@gmail.com>
wrote:
On Feb 14, 11:28 pm, "nathan.sh...@gmail.com" <nathan.sh...@gmail.com>
wrote:
>I'm looking for a way to output stdout/stderr (from a subprocess or
spawn) to screen and to at least two different files.

I'd derive a class from file, override its write() method to send a
copy to the log, and then assign sys.stdout = newFile(sys.stdout).
Same for stderr.
That's ok inside the same process, but the OP needs to use it "from a
subprocess or spawn".
You have to use something like tee, working with real file handles.

--
Gabriel Genellina

Feb 16 '07 #12
On Feb 16, 3:28 pm, "Gabriel Genellina" <gagsl...@yahoo.com.ar> wrote:
That's ok inside the same process, but the OP needs to use it "from a
subprocess or spawn".
You have to use something like tee, working with real file handles.
I'm not particularly familiar with this, but it seems to me that if
you're trying to catch stdout/stderr from a program you can call with
(say) popen2, you could just read from the returned stdout/stderr
pipe, and then write to a series of file handles (including
sys.stdout).

Or am I missing something? =)
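Something like this, maybe (untested sketch; the command string and file
names are placeholders, and it naively drains stdout before stderr):

import sys
from subprocess import Popen, PIPE

command = "whatever you want to run"   # placeholder

outf = open("log.out", "w")
errf = open("log.err", "w")
allf = open("log.txt", "w")

p = Popen(command, shell=True, stdout=PIPE, stderr=PIPE)

# Fan each captured stream out to the screen plus its log files.
# If the child fills its stderr pipe while we are still reading
# stdout, this can block -- see the follow-ups below.
for line in p.stdout:
    for f in (sys.stdout, outf, allf):
        f.write(line)
for line in p.stderr:
    for f in (sys.stderr, errf, allf):
        f.write(line)
p.wait()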

~G

Feb 16 '07 #13
On Feb 16, 4:07 pm, garri...@gmail.com wrote:
On Feb 16, 3:28 pm, "Gabriel Genellina" <gagsl...@yahoo.com.ar> wrote:
That's ok inside the same process, but the OP needs to use it "from a
subprocess or spawn".
You have to use something like tee, working with real file handles.

I'm not particularly familiar with this, but it seems to me that if
you're trying to catch stdout/stderr from a program you can call with
(say) popen2, you could just read from the returned stdout/stderr
pipe, and then write to a series of file handles (including
sys.stdout).

Or am I missing something? =)

~G
That works, but it isn't live streaming of stdout/stderr. Most of the
time, if you try to stream both, you can deadlock the process, or end
up with stdout/stderr printed in the wrong order.

Feb 16 '07 #14
On Feb 16, 11:37 pm, "nathan.sh...@gmail.com" <nathan.sh...@gmail.com>
wrote:
On Feb 16, 4:07 pm, garri...@gmail.com wrote:
On Feb 16, 3:28 pm, "Gabriel Genellina" <gagsl...@yahoo.com.ar> wrote:
That's ok inside the same process, but the OP needs to use it "from a
subprocess or spawn".
You have to use something like tee, working with real file handles.
I'm not particularly familiar with this, but it seems to me that if
you're trying to catch stdout/stderr from a program you can call with
(say) popen2, you could just read from the returned stdout/stderr
pipe, and then write to a series of file handles (including
sys.stdout).
Or am I missing something? =)
~G

That works, but it isn't live streaming of stdout/stderr. Most of the
time, if you try to stream both, you can deadlock the process, or end
up with stdout/stderr printed in the wrong order.
Every time I've looked at doing something like this (a non-blocking read
on the stdout of a subprocess) I've come back to the conclusion
that threads and queues are the only reasonable way (particularly on
Windows). There may be a better solution using select.
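
For what it's worth, a rough sketch of the thread-per-pipe approach
(untested; it skips the queue and just writes from the reader threads
directly, and the command string and log file names are placeholders):

import sys
import threading
from subprocess import Popen, PIPE

def pump(pipe, sinks):
    # Copy each line from the subprocess pipe to every sink as it arrives.
    for line in iter(pipe.readline, ""):
        for f in sinks:
            f.write(line)
            f.flush()
    pipe.close()

command = "whatever you want to run"   # placeholder

outf = open("log.out", "w")
errf = open("log.err", "w")
allf = open("log.txt", "w")   # lines from the two threads may interleave here

p = Popen(command, shell=True, stdout=PIPE, stderr=PIPE)
threads = [
    threading.Thread(target=pump, args=(p.stdout, (sys.stdout, outf, allf))),
    threading.Thread(target=pump, args=(p.stderr, (sys.stderr, errf, allf))),
]
for t in threads:
    t.start()
p.wait()
for t in threads:
    t.join()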

Fuzzyman
http://www.voidspace.org.uk/python/articles.shtml

Feb 17 '07 #15
