Bytes | Developer Community
Python's "only one way to do it" philosophy isn't good?

I've just read an article, "Building Robust Systems," by Gerald Jay
Sussman. The article is here:
http://swiss.csail.mit.edu/classes/s...st-systems.pdf

In it there is a footnote which says:

"Indeed, one often hears arguments against building flexibility into an
engineered system. For example, in the philosophy of the computer
language Python it is claimed: 'There should be one -- and preferably
only one -- obvious way to do it.'[25] Science does not usually proceed
this way: In classical mechanics, for example, one can construct
equations of motion using Newtonian vectoral mechanics, or using a
Lagrangian or Hamiltonian variational formulation.[30] In the cases
where all three approaches are applicable they are equivalent, but each
has its advantages in particular contexts."

I'm not sure how reasonable this statement is, and personally I like
Python's simplicity, power, and elegance. So I'm putting it here and
hope to see some inspiring comments.

Jun 9 '07
206 posts, 7111 views
On Jul 5, 10:30 am, "Chris Mellon" <arka...@gmail.com> wrote:
I don't think anyone has suggested that. Let me be clear about *my*
position: When you need to ensure that a file has been closed by a
certain time, you need to be explicit about it. When you don't care,
just that it will be closed "soonish" then relying on normal object
lifetime calls is sufficient. This is true regardless of whether
object lifetimes are handled via refcount or via "true" garbage
collection. Relying on the specific semantics of refcounting to give
certain lifetimes is a logic error.

For example:

f = some_file() #maybe it's the file store for a database implementation
f.write('a bunch of stuff')
del f
#insert code that assumes f is closed.

This is the sort of code that I warn against writing.

f = some_file()
with f:
    f.write("a bunch of stuff")
# insert code that assumes f is closed, but correctly this time

is better.
This has raised a few questions in my mind. So, here's my newbie
question based off this.

Is this:

f = open(xyz)
f.write("wheee")
f.close()
# Assume file is closed properly.

as "safe" as your code:

f = some_file()
with f:
    f.write("a bunch of stuff")
# insert code that assumes f is closed, but correctly this time

Thanks!

G

Jul 5 '07 #201
Falcolas wrote:
On Jul 5, 10:30 am, "Chris Mellon" <arka...@gmail.com> wrote:
>>I don't think anyone has suggested that. Let me be clear about *my*
position: When you need to ensure that a file has been closed by a
certain time, you need to be explicit about it. When you don't care,
just that it will be closed "soonish" then relying on normal object
lifetime calls is sufficient. This is true regardless of whether
object lifetimes are handled via refcount or via "true" garbage
collection. Relying on the specific semantics of refcounting to give
certain lifetimes is a logic error.
We may need a guarantee that if you create a local object and
don't copy a strong reference to it to an outer scope, upon exit from
the scope, the object will be destroyed.

John Nagle
Jul 5 '07 #202
"Chris Mellon" <ar*****@gmail.com> writes:
>Some people here have been arguing that all code should use "with" to
ensure that the files are closed. But this still wouldn't solve the
problem of the large data structures being left around for an
arbitrary amount of time.
I don't think anyone has suggested that. Let me be clear about *my*
position: When you need to ensure that a file has been closed by a
certain time, you need to be explicit about it. When you don't care,
just that it will be closed "soonish" then relying on normal object
lifetime calls is sufficient. This is true regardless of whether
object lifetimes are handled via refcount or via "true" garbage
collection.
But it's *not* true at all when relying only on a "true GC"! Your
program could easily run out of file descriptors if you only have a
real garbage collector and code this way (and are opening lots of
files). This is why destructors are useless in Java -- you can't rely
on them *ever* being called. In Python, however, destructors are
quite useful due to the refcounter.
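For what it's worth, a minimal sketch of the CPython-specific behavior being described (the `Tracked` class here is hypothetical, standing in for any object that holds a scarce resource):

```python
class Tracked:
    """Stands in for an object holding a scarce resource (e.g. a file)."""
    finalized = False

    def __del__(self):
        # Under CPython this runs as soon as the refcount hits zero.
        Tracked.finalized = True

t = Tracked()
del t  # drops the only reference
print(Tracked.finalized)  # True under CPython; not guaranteed elsewhere
```

On a tracing-GC implementation the finalizer may run much later, or not at all before exit, which is exactly the point in dispute.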
Relying on the specific semantics of refcounting to give
certain lifetimes is a logic error.

For example:

f = some_file() #maybe it's the file store for a database implementation
f.write('a bunch of stuff')
del f
#insert code that assumes f is closed.
That's not a logic error if you are coding in CPython, though I agree
that in this particular case the explicit use of "with" would be
preferable due to its clarity.

|>oug
Jul 5 '07 #203
Falcolas wrote:
>f = some_file() #maybe it's the file store for a database implementation
f.write('a bunch of stuff')
del f
#insert code that assumes f is closed.

This is the sort of code that I warn against writing.

f = some_file()
with f:
    f.write("a bunch of stuff")
# insert code that assumes f is closed, but correctly this time

is better.

This has raised a few questions in my mind. So, here's my newbie
question based off this.

Is this:

f = open(xyz)
f.write("wheee")
f.close()
# Assume file is closed properly.
This will not immediately close f if f.write raises an exception, since
the stack frame holding f is kept alive by the traceback object.
as "safe" as your code:

f = some_file()
with f:
    f.write("a bunch of stuff")
# insert code that assumes f is closed, but correctly this time
The with statement is designed to be safer. It contains an implicit
try/finally that lets the file close itself in case of an exception.
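Spelled out, that implicit try/finally looks roughly like the following (a temporary file is used so the sketch is self-contained):

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.txt")

# Roughly what "with f:" does behind the scenes:
f = open(path, "w")
try:
    f.write("a bunch of stuff")
finally:
    f.close()  # runs whether or not write() raised
print(f.closed)  # True

# The with statement packages the same guarantee:
with open(path, "w") as g:
    g.write("a bunch of stuff")
print(g.closed)  # True, even if the body had raised
```

"Roughly" because the real expansion goes through the context-manager protocol (`__enter__`/`__exit__`) rather than a literal try/finally, but the cleanup guarantee is the same.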

--
Lenard Lindstrom
<le***@telus.net>
Jul 6 '07 #204
On 7/5/07, Douglas Alan <do**@alum.mit.edu> wrote:
"Chris Mellon" <ar*****@gmail.com> writes:
Some people here have been arguing that all code should use "with" to
ensure that the files are closed. But this still wouldn't solve the
problem of the large data structures being left around for an
arbitrary amount of time.
I don't think anyone has suggested that. Let me be clear about *my*
position: When you need to ensure that a file has been closed by a
certain time, you need to be explicit about it. When you don't care,
just that it will be closed "soonish" then relying on normal object
lifetime calls is sufficient. This is true regardless of whether
object lifetimes are handled via refcount or via "true" garbage
collection.

But it's *not* true at all when relying only on a "true GC"! Your
program could easily run out of file descriptors if you only have a
real garbage collector and code this way (and are opening lots of
files). This is why destructors are useless in Java -- you can't rely
on them *ever* being called. In Python, however, destructors are
quite useful due to the refcounter.
Sure, but that's part of the general refcounting vs GC argument -
refcounting gives (a certain level of) timeliness in resource
collection, GC often only runs under memory pressure. If you're saying
that we should keep refcounting because it provides better handling of
non-memory limited resources like file handles, I probably wouldn't
argue. But saying we should keep refcounting because people like to
and should write code that relies on implicit scope level object
destruction I very strongly argue against.
Relying on the specific semantics of refcounting to give
certain lifetimes is a logic error.

For example:

f = some_file() #maybe it's the file store for a database implementation
f.write('a bunch of stuff')
del f
#insert code that assumes f is closed.

That's not a logic error if you are coding in CPython, though I agree
that in this particular case the explicit use of "with" would be
preferable due to its clarity.
I stand by my statement. I feel that writing code in this manner is
like writing C code that assumes uninitialized pointers are 0 -
regardless of whether it works, it's erroneous and bad practice at
best, and actively harmful at worst.
Jul 6 '07 #205
"Chris Mellon" <ar*****@gmail.com> writes:
Sure, but that's part of the general refcounting vs GC argument -
refcounting gives (a certain level of) timeliness in resource
collection, GC often only runs under memory pressure. If you're
saying that we should keep refcounting because it provides better
handling of non-memory limited resources like file handles, I
probably wouldn't argue. But saying we should keep refcounting
because people like to and should write code that relies on implicit
scope level object destruction I very strongly argue against.
And why would you do that? People rely very heavily in C++ on when
destructors will be called, and they are in fact encouraged to do so.
They are, in fact, encouraged to do so *so* much that constructs like
"finally" and "with" have been rejected by the C++ BDFL. Instead, you
are told to use smart pointers, or what have you, to clean up your
allocated resources.
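A sketch of the closest Python analogue of that C++ idiom: a hypothetical `Resource` class whose `__exit__` plays the role that a stack object's destructor plays in C++, tying cleanup to a scope rather than to the collector.

```python
class Resource:
    """Hypothetical RAII-style wrapper: cleanup is tied to a scope, not to GC."""
    def __init__(self):
        self.open = True

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.open = False  # deterministic cleanup, like a C++ destructor
        return False       # don't swallow exceptions

with Resource() as r:
    pass  # use the resource here

print(r.open)  # False: released at scope exit
```

The difference is that in Python the scope-exit behavior must be requested explicitly with `with`, whereas in C++ it falls out of stack-object lifetime.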

I see no reason not to make Python at least as expressive a programming
language as C++.
Relying on the specific semantics of refcounting to give
certain lifetimes is a logic error.

For example:

f = some_file() #maybe it's the file store for a database implementation
f.write('a bunch of stuff')
del f
#insert code that assumes f is closed.
>That's not a logic error if you are coding in CPython, though I agree
that in this particular case the explicit use of "with" would be
preferable due to its clarity.
I stand by my statement. I feel that writing code in this manner is
like writing C code that assumes uninitialized pointers are 0 -
regardless of whether it works, it's erroneous and bad practice at
best, and actively harmful at worst.
That's a poor analogy. C doesn't guarantee that pointers will be
initialized to 0, and in fact, they typically are not. CPython, on
the other hand, guarantees that the refcounter behaves a certain
way.

There are languages other than C that guarantee that values are
initialized in certain ways. Are you going to also assert that in
those languages you should not rely on the initialization rules?

|>oug
Jul 6 '07 #206
On 7/6/07, Douglas Alan <do**@alum.mit.edu> wrote:
"Chris Mellon" <ar*****@gmail.com> writes:
Sure, but that's part of the general refcounting vs GC argument -
refcounting gives (a certain level of) timeliness in resource
collection, GC often only runs under memory pressure. If you're
saying that we should keep refcounting because it provides better
handling of non-memory limited resources like file handles, I
probably wouldn't argue. But saying we should keep refcounting
because people like to and should write code that relies on implicit
scope level object destruction I very strongly argue against.

And why would you do that? People rely very heavily in C++ on when
destructors will be called, and they are in fact encouraged to do so.
They are, in fact, encouraged to do so *so* much that constructs like
"finally" and "with" have been rejected by the C++ BDFL. Instead, you
are told to use smart pointers, or what have you, to clean up your
allocated resources.
For the record, C++ doesn't have a BDFL. And yes, I know that it's
used all the time in C++ and is heavily encouraged. However, C++ has
totally different object semantics than Python, and there's no reason
to think that we should use it because a different language with
different rules does it. For one thing, Python doesn't have the
concept of stack objects that C++ does.
I see no reason not to make Python at least as expressive a programming
language as C++.
I have an overwhelming urge to say something vulgar here. I'm going to
restrain myself and point out that this isn't a discussion about
expressiveness.
Relying on the specific semantics of refcounting to give
certain lifetimes is a logic error.

For example:

f = some_file() #maybe it's the file store for a database implementation
f.write('a bunch of stuff')
del f
#insert code that assumes f is closed.
That's not a logic error if you are coding in CPython, though I agree
that in this particular case the explicit use of "with" would be
preferable due to its clarity.
I stand by my statement. I feel that writing code in this manner is
like writing C code that assumes uninitialized pointers are 0 -
regardless of whether it works, it's erroneous and bad practice at
best, and actively harmful at worst.

That's a poor analogy. C doesn't guarantee that pointers will be
initialized to 0, and in fact, they typically are not. CPython, on
the other hand, guarantees that the refcounter behaves a certain
way.
It's a perfect analogy, because the value of an uninitialized pointer
in C is *implementation dependent*. The standard gives you no guidance
one way or the other, and an implementation is free to assign any
value it wants. Including 0, and it's not uncommon for implementations
to do so, at least in certain configurations.

The Python language reference explicitly does *not* guarantee the
behavior of the refcounter. By relying on it, you are relying on an
implementation specific, non-specified behavior. Exactly like you'd be
doing if you rely on the value of uninitialized variables in C.
There are languages other than C that guarantee that values are
initialized in certain ways. Are you going to also assert that in
those languages you should not rely on the initialization rules?
Of course not. Because they *do* guarantee and specify that. C
doesn't, and neither does Python.
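To illustrate the portable alternative: a context manager's cleanup is guaranteed by the language specification, so the following closes its resource on any conforming implementation, refcounted or not (`io.StringIO` serves as an in-memory stand-in for a real file):

```python
import io

buf = io.StringIO()  # in-memory stand-in for a file object
with buf:
    buf.write("a bunch of stuff")

# Guaranteed by the with statement itself, not by any GC strategy:
print(buf.closed)  # True
```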
Jul 9 '07 #207

