logging.py: multiple system users writing to same file getting permission errors.

On a Red Hat box, root, apache, and other normal users all run code
that uses the logging module to write to the same log file. Since
their umasks are set to 002 or 022, this produces permission errors.

I have fixed the issue by patching the logging code: everywhere it
opens a file for writing, I wrapped the open like this:
try:
    old_umask = os.umask(0)
    # open for write here
finally:
    os.umask(old_umask)

Is there a better way to solve this issue?
Are there any security problems with this solution other than the log
file not being protected?
Dec 6 '07 #1
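For reference, the umask trick in the question can be confined to a single
handler class instead of being patched into the logging source. The sketch
below is only an illustration: the UmaskFileHandler name is invented here,
and it relies on logging.FileHandler exposing an _open() hook, which recent
Python versions do but the 2007-era release discussed in this thread may not.

import logging
import os

class UmaskFileHandler(logging.FileHandler):
    """FileHandler that relaxes the umask only while opening its log file."""

    def __init__(self, filename, mode='a', umask=0, **kwargs):
        self._umask = umask  # umask to apply around the open() call
        logging.FileHandler.__init__(self, filename, mode, **kwargs)

    def _open(self):
        old_umask = os.umask(self._umask)  # e.g. 0 leaves the file group/world writable
        try:
            return logging.FileHandler._open(self)
        finally:
            os.umask(old_umask)  # restore the process-wide umask

# Illustrative usage; the file name is made up.
handler = UmaskFileHandler('shared.log')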
On Dec 6, 6:35 pm, evenrik <even...@gmail.com> wrote:
Is there a better way to solve this issue?
Are there any security problems with this solution other than the log
file not being protected?

Multiple processes writing to the same log file may step on each
other's toes: logging contains thread synchronisation code but no
protection against multiple processes accessing the same resource. The
best solution would be to log from all processes to a SocketHandler,
and then have a socket receiver process write the logs to file. This
effectively serialises access to the log file. An example is given in
the logging docs; see

http://docs.python.org/lib/network-logging.html

Of course, you can have the receiver process run under a uid of your
choosing which has the appropriate permissions to write to the log
file.

Regards,

Vinay Sajip
Dec 7 '07 #2
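To make the SocketHandler suggestion concrete, here is a minimal sketch of
the sending side; the logger name, host, and message are illustrative, and
DEFAULT_TCP_LOGGING_PORT is the stock constant from logging.handlers. The
receiving process is the one shown in the network-logging example linked
above.

import logging
import logging.handlers
import os

# Each process, whatever user it runs as, sends records over TCP instead of
# opening the shared log file itself.
socket_handler = logging.handlers.SocketHandler(
    'localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT)

logger = logging.getLogger('myapp')  # illustrative logger name
logger.setLevel(logging.DEBUG)
logger.addHandler(socket_handler)

logger.info('hello from pid %d', os.getpid())

A single receiver process, running as whichever uid owns the log file,
listens on that port and is then the only process that ever opens the file.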
On Dec 7, 12:46 pm, Vinay Sajip <vinay_sa...@yahoo.co.uk> wrote:
Multiple processes writing to the same log file may step on each
other's toes: logging contains thread synchronisation code but no
protection against multiple processes accessing the same resource.

Thank you for the warning about multiple processes. We decided to try
creating a DBHandler to write the logs to PostgreSQL.
Dec 10 '07 #3
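For what it is worth, a database handler along those lines can be quite
small. The sketch below is only an illustration: it assumes the psycopg2
driver and a hypothetical log_records table with created, name, level and
message columns, not the poster's actual schema or the log_test14.py example
mentioned in the next reply.

import logging
import psycopg2  # assumed driver; any DB-API module would work the same way

class DBHandler(logging.Handler):
    """Write each log record as a row in a PostgreSQL table."""

    def __init__(self, dsn):
        logging.Handler.__init__(self)
        self.conn = psycopg2.connect(dsn)

    def emit(self, record):
        try:
            cur = self.conn.cursor()
            cur.execute(
                "INSERT INTO log_records (created, name, level, message) "
                "VALUES (to_timestamp(%s), %s, %s, %s)",
                (record.created, record.name, record.levelname,
                 record.getMessage()))
            self.conn.commit()
            cur.close()
        except Exception:
            self.handleError(record)  # never let a logging failure kill the app

# Illustrative wiring; the DSN is made up.
logging.getLogger().addHandler(DBHandler('dbname=logs user=loguser'))

Because each process holds its own database connection, the database
serialises the inserts, which sidesteps the file-permission and file-sharing
problems of a single shared log file.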
On Dec 10, 8:34 pm, evenrik <even...@gmail.com> wrote:

Thank you for the warning about multiple processes. We decided to try
creating a DBHandler to write the logs to PostgreSQL.
Okay. In case you're interested - the original distribution of the
logging package (before it became part of Python) is at
http://www.red-dove.com/python_logging.html and some of the test
scripts, which are in the tarball available from that page, contain an
example database handler (in test script log_test14.py).

Best regards,

Vinay Sajip
Dec 11 '07 #4
