
Any way to not create .pyc files?

In short:

Is there any way to run Python WITHOUT trying to create .pyc files (or
.pyo) or to have Python not attempt to import the .pyc files it finds?

Reason:

We have a site-specific package installed on a network drive[1]. When
anyone with write access imports this package, the network drive gets
spammed with .pyc files.

If these .pyc files exist, they appear to cause problems when other
users' Python interpreters use them instead of the .py files. (I know,
they *should* work, but they don't). This may have something to do
with the fact that all of these users (on Windows) have the network
drive mapped to arbitrary drive letters. I don't know.

PEP 304 would have helped, but it appears to be deceased. What I
really want is a command line option.

I'm going to have to cobble together a work-around, like having
imported modules delete their own .pyc files immediately after import,
but it would be really nice to have a /good/ way of not using .pyc...
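The delete-after-import workaround could be sketched as a small helper that a module calls on itself at the bottom of its body. This assumes a modern Python, where `importlib.util.cache_from_source` maps a source file to its cached bytecode in `__pycache__` (in 2.x the .pyc simply sits next to the .py):

```python
import importlib.util
import os

def remove_own_bytecode(source_path):
    # Map the .py path to its cached .pyc location and delete it if present,
    # so repeated imports never leave bytecode behind on the shared drive.
    cached = importlib.util.cache_from_source(source_path)
    if os.path.exists(cached):
        os.remove(cached)
    return cached

# A module would call this on itself as its last statement:
#   remove_own_bytecode(__file__)
```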

(footnotes)

[1] Because it's an administrative nightmare to distribute code updates
to dozens of non-technical users

Jul 19 '05 #1
17 Replies


Lonnie Princehouse wrote:
Is there any way to run Python WITHOUT trying to create .pyc files (or
.pyo) or to have Python not attempt to import the .pyc files it finds?


You could roll your package into a zip archive and then import that. For
instance, keep your main.py out of the archive and put everything else
in. Then, at the top of main.py:

import sys
sys.path.insert(0, "network_path/package.zip")

import package
# do normal stuff with package here.

As long as you zipped up the package with all .pyc and .pyo files
removed, Python will have no choice but to compile the files every time
they are imported - unless I'm grossly mistaken Python won't put the pyc
files into the zip archive, or modify any that were there already.
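Building such an archive so no stale bytecode sneaks in can be done with the standard zipfile module. A minimal sketch (the package and archive paths are placeholders):

```python
import os
import zipfile

def build_package_zip(package_dir, zip_path):
    # Archive only source files; skip any .pyc/.pyo already on disk so the
    # zip import always compiles from the .py sources.
    package_dir = os.path.abspath(package_dir)
    parent = os.path.dirname(package_dir)
    with zipfile.ZipFile(zip_path, "w") as zf:
        for root, dirs, files in os.walk(package_dir):
            for name in files:
                if name.endswith((".pyc", ".pyo")):
                    continue
                full = os.path.join(root, name)
                zf.write(full, os.path.relpath(full, parent))
    return zip_path
```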

As far as the maintenance headache of distributing updated copies to
individual workstations, IMO this just requires a little forethought and
then it isn't a headache at all. Instead of the users starting the
application directly, they could start a starter application that checks
with the server to determine if local files need to be updated, and if
so grab them and then start the main app. This actually removes
headaches in the Windows world, where you can't drop-in updates to
programs that are currently running.

What I've done in the past adds on to the starter application idea, and
has the main application check to see if there are updates to the
starter application, and if so pull those changes down upon exit of the
main application. I just saved the file locations locally in an INI file.
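The "starter application" idea reduces to a newer-file sync. A minimal sketch, with paths and policy as assumptions; a real updater would also handle deletions, locking, and files in use on Windows:

```python
import os
import shutil

def sync_tree(server_dir, local_dir):
    # Copy every server file that is missing locally or newer than the local
    # copy; copy2 preserves timestamps, so unchanged files are skipped on the
    # next run.
    updated = []
    for root, dirs, files in os.walk(server_dir):
        rel = os.path.relpath(root, server_dir)
        dest_root = os.path.join(local_dir, rel)
        if not os.path.isdir(dest_root):
            os.makedirs(dest_root)
        for name in files:
            src = os.path.join(root, name)
            dst = os.path.join(dest_root, name)
            if not os.path.exists(dst) or os.path.getmtime(src) > os.path.getmtime(dst):
                shutil.copy2(src, dst)
                updated.append(dst)
    return updated
```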

--
Paul McNett
http://paulmcnett.com

Jul 19 '05 #2

Lonnie Princehouse wrote:
In short:

Is there any way to run Python WITHOUT trying to create .pyc files (or
.pyo) or to have Python not attempt to import the .pyc files it finds?

Reason:

We have a site-specific package installed on a network drive[1]. When
anyone with write access imports this package, the network drive gets
spammed with .pyc files.
Well, at least they have the right to create them. Wouldn't it be
easier, though, to just create all the .pyc files?
If these .pyc files exist, they appear to cause problems when other
users' Python interpreters use them instead of the .py files. (I know,
they *should* work, but they don't). This may have something to do
with the fact that all of these users (on Windows) have the network
drive mapped to arbitrary drive letters. I don't know.
The .pyc file is normally created in the same directory as the .py file.
You may be seeing problems because one user doesn't have permissions to
access a file created by another. In that case you may be able to set
the inherited permissions for Creator Owner to allow reading by all users.

You might also expect problems if one user was using python 2.2 and
another was using 2.4, since each version requires its own format for
the .pyc file, and they might conflict. Ultimately it sounds like a
permissions problem.

PEP 304 would have helped, but it appears to be deceased. What I
really want is a command line option
Not sure it's deceased (a dead parrot?) - it's on the standards track,
it hasn't been rejected, and Skip has actually provided a patch to
implement the solution.
I'm going to have to cobble together a work-around, like having
imported modules delete their own .pyc files immediately after import,
but it would be really nice to have a /good/ way of not using .pyc...
It would be *much* more sensible to find the underlying cause of the
problems and actually fix them :-)
(footnotes)

[1] Because it's an administrative nightmare to distribute code updates
to dozens of non-technical users

I hear that.

regards
Steve
--
Steve Holden +1 703 861 4237 +1 800 494 3119
Holden Web LLC http://www.holdenweb.com/
Python Web Programming http://pydish.holdenweb.com/

Jul 19 '05 #3

Steve Holden wrote:
Lonnie Princehouse wrote:
If these .pyc files exist, they appear to cause problems when other
users' Python interpreters use them instead of the .py files. (I know,
they *should* work, but they don't). This may have something to do
with the fact that all of these users (on Windows) have the network
drive mapped to arbitrary drive letters. I don't know.

That won't be the reason.
You might also expect problems if one user was using python 2.2 and
another was using 2.4, since each version requires its own format for
the .pyc file, and they might conflict. Ultimately it sounds like a
permissions problem.

It would be *much* more sensible to find the underlying cause of the
problems and actually fix them :-)


I agree with Steve, and want to add to what he's given you.

The only reason a .pyc file gets written is if the magic number inside
doesn't match the one expected by your version of Python, or if the
timestamp that is stored inside doesn't match the timestamp of the .py
file from which it was created. As far as I know, if both of these
match (and if the file is actually readable) then the .pyc file is
loaded, the .py is ignored, and no new .pyc file is written.
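The version check described here can be seen directly in the .pyc header. A sketch for a modern CPython, where the magic number is exposed as `importlib.util.MAGIC_NUMBER` (2.x used `imp.get_magic()` and a shorter header):

```python
import importlib.util
import os
import py_compile
import tempfile

def pyc_magic(pyc_path):
    # The first four bytes of any .pyc are the magic number; an interpreter
    # refuses to load bytecode whose magic differs from its own.
    with open(pyc_path, "rb") as f:
        return f.read(4)

# Compile a throwaway module and confirm its magic matches this interpreter.
src = os.path.join(tempfile.mkdtemp(), "m.py")
with open(src, "w") as f:
    f.write("x = 1\n")
pyc = py_compile.compile(src)
print(pyc_magic(pyc) == importlib.util.MAGIC_NUMBER)  # → True
```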

I suggest you pick two users who have the same version of python and
test whether your .pyc files are written again and again. If they are
not, then start looking for the "bad" user who probably has an older
version of python. If they are written over and over, then Steve is
probably correct about a permissions issue.

HTH
-Peter
Jul 19 '05 #4

Of course! ZIP imports! I think that will solve the problem nicely.

We already have a starter application that locates the codebase on the
network drive, so it wouldn't be too hard to implement the "keep a
local copy up to date" solution. But I'll try the zip idea first.
Many thanks for your helpful suggestions =)

Jul 19 '05 #5

Peter Hansen wrote:
Lonnie Princehouse wrote:
If these .pyc files exist, they appear to cause problems when other
users' Python interpreters use them instead of the .py files. (I know,
they *should* work, but they don't). This may have something to do
with the fact that all of these users (on Windows) have the network
drive mapped to arbitrary drive letters. I don't know.


That won't be the reason.


I take it back... that could be the reason. I just checked and the path
to the .py file is encoded in the .pyc. I don't know if it's actually
used in the decision whether to recompile the .pyc or not (it could just
be for use in showing tracebacks with source), but it's a possibility...
pretty easy to test too.
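The embedded path is the code object's co_filename, which can be recovered with marshal. A sketch assuming the 16-byte header of CPython 3.7+ (older versions used a shorter header):

```python
import marshal
import os
import py_compile
import tempfile

def pyc_source_path(pyc_path, header_size=16):
    # Skip the header (magic, flags, source mtime and size on CPython 3.7+)
    # and unmarshal the code object; co_filename is what tracebacks display.
    with open(pyc_path, "rb") as f:
        f.read(header_size)
        return marshal.load(f).co_filename

src = os.path.join(tempfile.mkdtemp(), "m.py")
with open(src, "w") as f:
    f.write("x = 1\n")
print(pyc_source_path(py_compile.compile(src)).endswith("m.py"))  # → True
```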

-Peter
Jul 19 '05 #6

Peter Hansen blathered [about whether a .pyc will be regenerated if the
path encoded in it changes]:
I take it back... that could be the reason. I just checked and the path
to the .py file is encoded in the .pyc. I don't know if it's actually
used in the decision whether to recompile the .pyc or not (it could just
be for use in showing tracebacks with source), but it's a possibility...
pretty easy to test too.


Well, it doesn't look like that's the case. I have a system happily
importing (without regenerating it) a .pyc file that has a
now-nonexistent path encoded in it. Even shows tracebacks properly...
(now I'm curious what the embedded path is actually used for).

-Peter
Jul 19 '05 #7

>> PEP 304 would have helped, but it appears to be deceased.
Not sure it's deceased (a dead parrot?) - it's on the standards track,
it hasn't been rejected, and Skip has actually provided a patch to
implement the solution.
It is possible that PEP 304 is really just pining for the fjords, but I
don't know. Digging through the newsgroup, it looks like Skip's patch
is for UNIX and progress has stalled in trying to get a Windows patch.
Really, it's moot, since I'm not going to recompile Python and try to
convince everyone to reinstall it. (the latter would be a herculean
labor) :P

I guess I sort of assumed that Python would ignore the .pyc files if it
didn't have read access, but it could be some sort of permissions
issue. Only a handful of users have write access. Maybe the issue
happens when Python tries to /replace/ .pyc files with newer versions,
but doesn't have permission to overwrite the old files? It doesn't
help that I'm one step removed from the users and administrators, so I
don't really have any diagnostic information besides people calling me
up to say "it's broken". Feh.

Python versioning could definitely cause a problem, but in this case
everyone is using 2.3.5 (I should have mentioned that).

It would be *much* more sensible to find the underlying cause of the
problems and actually fix them :-)


Amen!

Jul 19 '05 #8

Lonnie Princehouse wrote:
We have a site-specific package installed on a network drive[1]. When
anyone with write access imports this package, the network drive gets
spammed with .pyc files.

If these .pyc files exist, they appear to cause problems when other
users' Python interpreters use them instead of the .py files. (I know,
they *should* work, but they don't).


You didn't really describe the nature of the problem. Perhaps the whole
.pyc thing is a bit of a red herring, and the real problem lies
elsewhere? What are the actual symptoms of your problem?
Jul 19 '05 #9

> You didn't really describe the nature of the problem. Perhaps the whole
.pyc thing is a bit of a red herring, and the real problem lies
elsewhere? What are the actual symptoms of your problem?


Yes, the .pyc thing could be a red herring. I was hoping to find an
easy way to disable them to see if it mysteriously solved the problem.

I apologize in advance: This is not going to be a satisfying answer.
:(

All that I know about the problem comes second-hand from the admins who
encountered it.
They didn't save any tracebacks or error messages or even screenshots
(really helpful, I know).

At one point last week, users started reporting that they were
encountering problems running our Python application (the one that uses
the package on the network drive). The admins noticed that lots of
.pyc files had been inadvertently created when someone with write
access had run the application. The admins deleted all of the .pyc
files, and users were once again able to run the application.
I suspect this hadn't come up before because very few people have write
access, and those who do are not usually users. I don't know the
nature of the problems encountered.

I have tried to recreate a scenario wherein .pyc files cause a problem
(mostly by going through permutations of file permissions and remapping
drives), but with no luck. I have asked the admins to intentionally
recreate the situation so I can get more information, but it may take
days for that to happen. I don't have write access to the network
drive, so I can't do it myself. I'll post a follow-up when (if) I get
more information.

Jul 19 '05 #10

"Lonnie Princehouse" <fi**************@gmail.com> wrote in message
news:11**********************@g14g2000cwa.googlegroups.com...
In short:

Is there any way to run Python WITHOUT trying to create .pyc files (or
.pyo) or to have Python not attempt to import the .pyc files it finds?

Reason:

We have a site-specific package installed on a network drive[1]. When
anyone with write access imports this package, the network drive gets
spammed with .pyc files.

If these .pyc files exist, they appear to cause problems when other
users' Python interpreters use them instead of the .py files. (I know,
they *should* work, but they don't). This may have something to do
with the fact that all of these users (on Windows) have the network
drive mapped to arbitrary drive letters. I don't know.

PEP 304 would have helped, but it appears to be deceased. What I
really want is a command line option

I'm going to have to cobble together a work-around, like having
imported modules delete their own .pyc files immediately after import,
but it would be really nice to have a /good/ way of not using .pyc...

(footnotes)

[1] Because it's an administrative nightmare to distribute code updates
to dozens of non-technical users


Are you using ActivePython? ActivePython's installation updates the PATHEXT
environment variable on Windows with the .pyc and .py extensions (.pyc
first). This environment variable is used to try extensions when you run a
program from the command line without an extension.

Example:

[1] C:\ex>set pathext
PATHEXT=.com;.exe;.bat;.cmd;.pyc;.py

[2] C:\ex>echo print "old" > script.py

[3] C:\ex>python -c "import script"
old

[4] C:\ex>script
old

[5] C:\ex>echo print "new" > script.py

[6] C:\ex>script
old

[7] C:\ex>python -c "import script"
new

[8] C:\ex>script
new

Even though script.py contains new (line 5), script in line 6 runs the .pyc
generated by line 3.

To fix this problem, put the .py and .pyw extensions ahead of .pyc and .pyo in
PATHEXT.

Hope this helps,
Mark
Jul 19 '05 #11

PEP 304 would have helped, but it appears to be deceased.


Just resting...

FWIW, I reapplied it to my cvs sandbox the other day and plan to at least
generate a new patch from that. It's pretty much done, except... Once upon
a time, someone identified some problems for Windows with its multiple-root
file system. I've tried a couple times to dig it up, but have been
unsuccessful. If anyone can find it (or was the author, better yet), let me
know. At the very least I'd like to amend the PEP. Ideally, I'd like to
solve the problem and get PEP 304 going again.

Skip

Jul 19 '05 #12

On Thursday 09 June 2005 03:48 pm, Lonnie Princehouse wrote:
At one point last week, users started reporting that they were
encountering problems running our Python application (the one that uses
the package on the network drive). The admins noticed that lots of
.pyc files had been inadvertently created when someone with write
access had run the application. The admins deleted all of the .pyc
files, and users were once again able to run the application.
I suspect this hadn't come up before because very few people have write
access, and those who do are not usually users. I don't know the
nature of the problems encountered.


Well, if most users are using an older version of Python, but the user
with write access was using a new version, I can see this happening.

The user with write access would run the script, causing the pyc files
to be generated for that interpreter. Then a normal user, running an
older Python tries to load the modules. Since a .pyc file exists, it gets
used instead, but *oops* it's for a later version of the interpreter and
stuff breaks.

A better solution than getting rid of the pyc files would be to put good
ones there --- use the version of python that users are expected to be
using and generate them. If you delete the pyc files, you create an
unnecessary drag on performance and the hazard remains to mess you
up again. If the pyc files are generated, though, I *think* they will be
used and work for both the expected python and (fingers crossed) the
later version.

If the later version doesn't work, your "unusual" person with write
access ought to be smart enough to use the right version, right? It's
the usual user you should be designing for.

I hope I'm not totally off-base here --- I've had relatively little
experience with mixed versions and pyc files, so my assumptions
may be a little off, but hopefully someone will correct me if that's so.

Cheers,
Terry

--
Terry Hancock ( hancock at anansispaceworks.com )
Anansi Spaceworks http://www.anansispaceworks.com

Jul 19 '05 #13

Terry Hancock wrote:
The user with write access would run the script, causing the pyc files
to be generated for that interpreter. Then a normal user, running an
older Python tries to load the modules. Since a .pyc file exists, it gets
used instead, but *oops* it's for a later version of the interpreter and
stuff breaks.

A better solution than getting rid of the pyc files would be to put good
ones there --- use the version of python that users are expected to be
using and generate them. If you delete the pyc files, you create an
unnecessary drag on performance and the hazard remains to mess you
up again. If the pyc files are generated, though, I *think* they will be
used and work for both the expected python and (fingers crossed) the
later version.


Sorry Terry, but both assumptions are wrong. Different versions of
Python store different "magic numbers" in the .pyc files, and they will
not use a .pyc file with a wrong number. I believe you'll actually get
an error about "bad magic number" if you do manage to force a bad .pyc
to be loaded (probably by having no matching .py from which to
recompile). If the .py does exist, it will be recompiled and the newly
generated bytecode will be used, whether or not the .pyc file can be
written to cache it for next time.

Your suggestion about pre-generating the .pyc files (using compileall)
is a good one in general for this sort of setup (shared libraries),
though it really won't help if there are different versions of Python in
use. (If that's true, nothing will really help except perhaps PEP304
and the time machine. Well, having the two versions use two different
copies of the library files would help, but the OP doesn't think there
are different versions in use.)
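Pre-generating the bytecode with compileall, as suggested above, is a single call; the directory path here is a placeholder:

```python
import compileall

def precompile(tree):
    # Walk the tree and byte-compile every .py with the current interpreter;
    # equivalent to running:  python -m compileall /path/to/shared/lib
    # Returns a truthy value when everything compiled cleanly.
    return compileall.compile_dir(tree, quiet=1)
```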

-Peter
Jul 19 '05 #14

On Friday 10 June 2005 06:52 am, Peter Hansen wrote:
Sorry Terry, but both assumptions are wrong.


Yes, thanks. I realized that when I read the other replies.
Oh well. I guess I learned something anyway. ;-)

The interesting question I'd have then, is what happens
if a wrong version .pyc exists and the process does not
have permission to overwrite it? I understand that if
no permissions exist and no pyc file exists, that the bytecode
will be generated in memory, but not written. That would
seem to be the correct thing to do, but I wonder if it's what
actually happens here.

I just tried to test this, but I found something even scarier
in the process:

samwise:/project/terry> python
Python 2.3.4 (#2, May 29 2004, 03:31:27)
[GCC 3.3.3 (Debian 20040417)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import pkgs
samwise:/project/terry> su
Password:
samwise:/project/terry# chmod a-w pkgs.pyc
samwise:/project/terry# chown root:root pkgs.pyc
samwise:/project/terry# ls -l pkgs.pyc
-r--r--r-- 1 root root 1628 Jun 10 22:45 pkgs.pyc
samwise:/project/terry# exit
exit
samwise:/project/terry> python2.1
Python 2.1.3+ (#1, Feb 25 2004, 08:52:22)
[GCC 3.3.3 (Debian)] on linux2
Type "copyright", "credits" or "license" for more information.
>>> import pkgs

samwise:/project/terry> ls -l pkgs.pyc
-rw-rw-r-- 1 terry anansi 1701 Jun 10 22:46 pkgs.pyc
samwise:/project/terry>

It looks to me like Python just deleted a read-only file owned by
root in order to replace it with a new pyc file. Can somebody
explain that to me?! Isn't that supposed to be impossible?

(I can only guess that Python is running setuid root in this
situation, and taking advantage of that --- but isn't that, well,
*evil*?)

--
Terry Hancock ( hancock at anansispaceworks.com )
Anansi Spaceworks http://www.anansispaceworks.com

Jul 19 '05 #15

>>>>> Terry Hancock <ha*****@anansispaceworks.com> (TH) wrote:

TH> It looks to me like Python just deleted a read-only file owned by
TH> root in order to replace it with a new pyc file. Can somebody
TH> explain that to me?! Isn't that supposed to be impossible?


If the directory is writable, you can delete a file and write a new one
with the same name. The permissions of the file itself are of no importance
in this case.
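Piet's point is easy to demonstrate on a POSIX system (as a non-root user — root bypasses permission checks anyway):

```python
import os
import stat
import tempfile

def remove_readonly_file():
    # Deleting a file modifies its parent directory, not the file itself, so
    # the file's own permissions are irrelevant when the directory is writable.
    d = tempfile.mkdtemp()                # writable directory
    p = os.path.join(d, "locked.pyc")
    open(p, "wb").close()
    os.chmod(p, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)  # r--r--r--
    os.remove(p)                          # succeeds despite the read-only file
    return os.path.exists(p)

print(remove_readonly_file())  # → False
```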
--
Piet van Oostrum <pi**@cs.uu.nl>
URL: http://www.cs.uu.nl/~piet [PGP]
Private email: pi**@vanoostrum.org
Jul 19 '05 #16

On Saturday 11 June 2005 06:14 am, Piet van Oostrum wrote:
>> Terry Hancock <ha*****@anansispaceworks.com> (TH) wrote:

TH> It looks to me like Python just deleted a read-only file owned by
TH> root in order to replace it with a new pyc file. Can somebody
TH> explain that to me?! Isn't that supposed to be impossible?


If the directory is writable, you can delete a file and write a new one
with the same name. The permissions of the file itself are of no importance
in this case.


Yeah, I guess that must be so. I wonder why I never noticed it, though.

--
Terry Hancock ( hancock at anansispaceworks.com )
Anansi Spaceworks http://www.anansispaceworks.com

Jul 19 '05 #17

On Thu, 9 Jun 2005 18:12:35 -0500, Skip Montanaro <sk**@pobox.com> wrote:
>> PEP 304 would have helped, but it appears to be deceased.


Just resting...

FWIW, I reapplied it to my cvs sandbox the other day and plan to at least
generate a new patch from that. It's pretty much done, except... Once upon
a time, someone identified some problems for Windows with its multiple-root
file system. I've tried a couple times to dig it up, but have been
unsuccessful. If anyone can find it (or was the author, better yet), let me
know. At the very least I'd like to amend the PEP. Ideally, I'd like to
solve the problem and get PEP 304 going again.

Re multiple-root...

IMO it would be nice to have an optional way of running python so that all file system
paths would be normalized to Unix-style and platform-independent. On a windows
system, the model used by msys/MinGW could be used, where you can mount subtrees
per fstab and the "drives" wind up being /c/path for C:\path etc. And you
could potentially mount different file systems also, including /proc and /dev
synthetic things. On unix it could largely be pass-throughs, IWT, except for
mountable file system objects written in pure python which would probably need
some interfacing help.

This would also be a potential way of feeding/logging tests of software that
accesses the file system, using special mountable testing file systems.

It also lets you switch I/O sources and sinks with mounts that are external to
a particular python program being run.

Don't know how factorable all that is in python, but I would think the bulk
would be changes in os and hopefully pretty transparent elsewhere.

Regards,
Bengt Richter
Jul 19 '05 #18

This discussion thread is closed