Why does python not have a mechanism for data hiding?

Hi,

first, Python is one of my favorite languages, and I'll definitely keep
developing with it. But there's one thing that I really miss:
data hiding. I know member vars are private when you prefix them with
two underscores, but I hate prefixing my vars; I'd rather add a keyword
before them.
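For illustration, here is a minimal sketch of the mechanism I mean (the class
and attribute names are just made up):

    class Account:
        def __init__(self, owner, balance):
            self.owner = owner        # ordinary public attribute
            self._balance = balance   # single underscore: "implementation detail" by convention only
            self.__audit_log = []     # double underscore: stored as _Account__audit_log

    acct = Account("lucas", 100)
    print(acct._balance)              # 100; the single underscore is a hint, nothing is enforced
    print(acct._Account__audit_log)   # []; even the mangled name is reachable, so nothing is truly hidden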

Python advertises itself as a full OOP language, but why is it missing
one of the basic principles of OOP? Will it ever be added to Python?

Thanks in advance,
Lucas
Jun 27 '08
On Jun 10, 11:21 am, "Russ P." <Russ.Paie...@gmail.com> wrote:
I took a risk in choosing Python, and I would
feel better about it if Python would move up to the next level with
more advanced features such as (optional) static typing and private
declarations. But every time I propose something like that, I get all
kinds of flak from people here who do their hacking and care little
about anyone else's needs.
Let me share my personal insight. I used Python for a mission-critical
application that needed, in effect, almost 100% uptime with superior
throughput. In other words, it was a very fine piece of art that
needed to be precise and correct. In the end, Python delivered, under
budget, ahead of schedule, and with superbly low maintenance costs
(practically zero compared to other systems written in Java and C). I
didn't have to use any of the features you mentioned, and I can't
imagine why you would need them. In fact, having them in the language
would encourage others to use them and make my software less reliable.

You may think we are all a bunch of hackers who are too stupid to
understand what you are saying, but that is your loss.

Now, let me try to explain something that perhaps the previous 166
posts may not have thoroughly explained. If I am duplicating what
everyone else has already said, then it's my own fault.

Short answer: You don't need these features in Python. You do need to
use the right tools for the right tasks.

Long answer:

Who cares what the type of an object is? Only the machine. Being able
to tell, in advance, what the type of a variable is is a premature
optimization. Tools like psyco prove that computers (really,
programmers) nowadays are smart enough to figure things out the right
way without any hints from the developer. Static typing is no longer
necessary in today's world.
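To make the duck-typing style I'm leaning on here concrete, a small sketch
(the function and the data are invented for the example): it works with any
objects that support the operations it needs, with no types declared anywhere.

    def total_length(items):
        # No type declarations: anything iterable whose elements support len() will do.
        return sum(len(item) for item in items)

    print(total_length(["abc", "de"]))       # a list of strings -> 5
    print(total_length(("abc", [1, 2, 3])))  # a tuple mixing a string and a list -> 6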

Who cares about private declarations, or interface declarations at
all? It is only a message to the developers. If you have a problem
with your users doing the right thing, that is a social problem, not a
technical one, and the solution is social, not technical. Yes, it is
work, but it is not coding---it is explaining to other living,
breathing human beings how to do a specific task, which is what you
should have been doing from the start.

When all you have is a hammer, the world seems full of nails. Think
about that. You have more tools than Python to solve these problems,
and Python will never be the panacea you wish it was. The panacea is
you, putting the right tools to use for the right tasks.
Jun 27 '08 #151
On 6 June, 03:09, "Russ P." <Russ.Paie...@gmail.com> wrote:
On Jun 5, 2:57 pm, Hrvoje Niksic <hnik...@xemacs.org> wrote:
"Russ P." <Russ.Paie...@gmail.comwrites:
By the way, my recollection is that in C++ access defaults to private
if nothing is declared explicitly. So normally the "private"
declaration is unnecessary. If it is left out, your little trick won't
work.
How about #define class struct

I never thought of that one. I wonder what the C++ gurus would say
about that.

Let me guess. They'd probably say that the access restrictions are for
your own good, and bypassing them is bound to do you more harm than
good in the long run. And they'd probably be right. Just because you
can break into a box labeled "DANGER HIGH VOLTAGE," that doesn't make
it a good idea.

This just goes to show that the whole idea of using header files as
simple text insertions is flaky to start with, and adding the
preprocessor just compounds the flakiness. Needless to say, I'm not a
big fan of C and C++.
Jun 27 '08 #152
On Jun 10, 11:58 am, Jonathan Gardner wrote:
Who cares what the type of an object is? Only the machine. Being able
to tell, in advance, what the type of a variable is is a premature
optimization. Tools like psyco prove that computers (really,
programmers) nowadays are smart enough to figure things out the right
way without any hints from the developer. Static typing is no longer
necessary in today's world.
You couldn't be more wrong. Even Guido recognizes the potential value
of static typing, which is why he is easing it into Python as an
optional feature. He recognizes, correctly, that it can detect errors
earlier and facilitate more efficient execution. But there's another,
more significant potential benefit for safety-critical and mission-
critical applications: static typing facilitates advanced static
analysis of software. To get an idea of what that is about, take a
look at

http://www.sofcheck.com

Here is an excerpt from their website:

"SofCheck’s advanced static error detection solutions find bugs in
programs before programs are run. By mathematically analyzing every
line of software, considering every possible input, and every path
through the program, SofCheck’s solutions find any and all errors that
cause a program to crash or produce an undefined result."

Me again: static analysis does not replace traditional dynamic and
unit testing, but it is far more advanced and can quickly find many
errors that might take weeks or months of dynamic testing to uncover,
or that might not be found at all with dynamic testing until the
product is in the field.

With more and more automation of safety-critical systems these days,
we need this more than ever. Your assertion that "Static typing is no
longer necessary in today's world" is just plain naive.
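As a concrete illustration of what "optional" can mean here, consider Python
3's function annotations (PEP 3107): a developer can attach type information
that external tools may check, while unannotated code keeps working exactly as
before. A minimal sketch, with invented functions, and with no claim that
anything enforces these annotations at runtime:

    def climb_rate(altitude_change_ft: float, elapsed_s: float) -> float:
        """Return climb rate in feet per minute (annotated, so a checker could verify calls)."""
        return altitude_change_ft / elapsed_s * 60.0

    def descent_rate(altitude_change_ft, elapsed_s):
        # Unannotated code remains perfectly legal; the annotations above are purely opt-in.
        return -altitude_change_ft / elapsed_s * 60.0

    print(climb_rate(500.0, 30.0))      # 1000.0
    print(climb_rate.__annotations__)   # the annotations are stored, but nothing checks them at runtime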
Who cares about private declarations, or interface declarations at
all? It is only a message to the developers. If you have a problem
with your users doing the right thing, that is a social problem, not a
technical one, and the solution is social, not technical. Yes, it is
work, but it is not coding---it is explaining to other living,
breathing human beings how to do a specific task, which is what you
should have been doing from the start.
You may be right to an extent for small or medium-sized non-critical
projects, but you are certainly not right in general. I read something
a while back about the flight software for the Boeing 777. I think it
was something like 3,000,000 lines of Ada code. Normally, for a
project of that magnitude the final integration would be expected to
take something like three months. However, the precise interface specs
and encapsulation methods in Ada allowed the integration to be
completed in just three days.

By your recommended method of social interaction, that would be one
hell of a lot of talking!

I realize that Python is not designed for such large projects, but
don't you think certain general principles can be learned anyway?
Perhaps the benefits of interface specs and encapsulation are not as
obvious for smaller projects, but certainly they are not zero.
Jun 27 '08 #153
On Jun 11, 8:11 am, "Russ P." <Russ.Paie...@gmail.com> wrote:
On Jun 10, 11:58 am, Jonathan Gardner wrote:
Who cares what the type of an object is? Only the machine. Being able
to tell, in advance, what the type of a variable is is a premature
optimization. Tools like psyco prove that computers (really,
programmers) nowadays are smart enough to figure things out the right
way without any hints from the developer. Static typing is no longer
necessary in today's world.

You couldn't be more wrong. Even Guido recognizes the potential value
of static typing, which is why he is easing it into Python as an
optional feature. He recognizes, correctly, that it can detect errors
earlier and facilitate more efficient execution. But there's another,
more significant potential benefit for safety-critical and mission-
critical applications: static typing facilitates advanced static
analysis of software.
Can you provide me with any example of Guido wanting static typing to
be optional? I haven't seen one. And why is it you keep going so
abstract in this discussion?
You may be right to an extent for small or medium-sized non-critical
projects, but you are certainly not right in general. I read something
a while back about the flight software for the Boeing 777. I think it
was something like 3,000,000 lines of Ada code. Normally, for a
project of that magnitude the final integration would be expected to
take something like three months. However, the precise interface specs
and encapsulation methods in Ada allowed the integration to be
completed in just three days.
Well, that isn't just because they used encapsulation; more likely it
was well-thought-out planning, constant system testing (which DOES
require accessing those more private methods to ensure there are no
problems throughout), and re-testing. Again, since I'm not sure how
much I trust you and your statistics any more, do you have a link to
anything discussing this?
I realize that Python is not designed for such large projects, but
don't you think certain general principles can be learned anyway?
Perhaps the benefits of interface specs and encapsulation are not as
obvious for smaller projects, but certainly they are not zero.
Python is designed to be an efficient high-level language for writing
clear, readable code at any scale. Considering the amount of use it
gets from Google, and the scope and size of many of their projects, I
find it foolish to say it is not designed for large projects. However,
I do not myself have an example of a large Python project, because I
don't program Python at work.

I think the issue here is that you want Python to behave exactly like
OO languages such as Java, but it isn't Java, and that is a good
thing.
Jun 27 '08 #154
Russ P. wrote:
On Jun 10, 11:58 am, Jonathan Gardner wrote:
(snip)
Who cares about private declarations, or interface declarations at
all? It is only a message to the developers. If you have a problem
with your users doing the right thing, that is a social problem, not a
technical one, and the solution is social, not technical. Yes, it is
work, but it is not coding---it is explaining to other living,
breathing human beings how to do a specific task, which is what you
should have been doing from the start.

You may be right to an extent for small or medium-sized non-critical
projects, but you are certainly not right in general. I read something
a while back about the flight software for the Boeing 777. I think it
was something like 3,000,000 lines of Ada code.
Obviously I can't back my claim, but you could probably have the same
feature set implemented in 10 to 20 times less code in Python. Not that
I suggest using Python here specifically, but just to remind you that
kloc is not a very exact metric: it's relative to the design, the
language and the programmer(s). The first project I worked on
(professionally) was about 100,000 locs when I took over it, and one
year later it was about 50,000 locs, with far fewer bugs and way more
features. FWIW, the bigger the project, the bigger the chances that you
could cut it in half with a good refactoring.
Normally, for a
project of that magnitude the final integration would be expected to
take something like three months. However, the precise interface specs
and encapsulation methods in Ada allowed the integration to be
completed in just three days.

By your recommended method of social interaction, that would be one
hell of a lot of talking!
Or just writing and reading.
I realize that Python is not designed for such large projects,
Clueless again. Python is pretty good for large projects. Now the point
is that it tends to make them way smaller than some other, much more
static languages. On average, you can count on something between a 5:1
and 10:1 ratio between Java (a typical and well-known reference) and
Python for the same feature set. And the larger the project, the
greater the ratio.
but
don't you think certain general principles can be learned anyway?
Do you really think you're talking to a bunch of clueless newbies? You
can bet there are quite a lot of talented *and experimented* programmers
here.
Perhaps the benefits of interface specs and encapsulation are not as
obvious for smaller projects,
Plain wrong.
but certainly they are not zero.
You still fail to get the point. Interface specifications and
encapsulation are design principles. Period. These principles are just
as well expressed with documentation and naming conventions, and the
cost is way lower.
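As a small, admittedly invented sketch of what "documentation and naming
conventions" look like in practice:

    """flightplan.py: hypothetical module whose interface is expressed by convention.

    Public interface: build_route(). Anything prefixed with an underscore is an
    implementation detail and may change without notice.
    """

    __all__ = ["build_route"]   # what `from flightplan import *` exposes, and a hint to readers

    def build_route(waypoints):
        """Return the waypoints joined into a route string (the documented, supported API)."""
        return " -> ".join(_normalize(w) for w in waypoints)

    def _normalize(waypoint):
        # Implementation detail, per the leading-underscore convention; not part of the interface.
        return waypoint.strip().upper()

    print(build_route([" ksfo", "kden ", "kjfk"]))   # KSFO -> KDEN -> KJFK

Nothing is enforced by the language here; the interface is carried entirely by
the docstring, __all__, and the naming convention.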

Russ, do yourself a favor: get out of your cargo cult for one minute and
ask yourself whether all Python users are really such a bunch of
clueless newbies and cowboy hackers. You may not have noticed, but there
are some *very* talented and *experimented* programmers here.
Jun 27 '08 #155
Russ P. wrote:
On Jun 10, 1:04 am, Bruno Desthuilliers
<bruno.42.desthuilli...@websiteburo.invalid> wrote:
If you hope to get a general agreement here in favor of a useless
keyword that doesn't bring anything to the language, then yes, I'm afraid
you're wasting your time.

Actually, what I hope to do is to "take something away" from the
language, and that is the need to clutter my identifiers with leading
underscores.

I find that I spend the vast majority of my programming time working
on the "private" aspects of my code, and I just don't want to look at
leading underscores everywhere. So I usually just leave them off and
resort to a separate user guide to specify the public interface.

I'll bet many Python programmers do the same. How many Python
programmers do you think use leading underscores on every private data
member or method, or even most of them for that matter?
First point: please s/private/implementation/g. As long as you don't
get why it's essential to make this conceptual shift, the whole discussion
is hopeless.

Second point: I've read millions of lines of (production) Python code
over the last few years, and I can assure you that everyone used this
convention. And respected it.

I'll bet not
many. (See the original post on this thread.) That means that this
particular aspect of Python is basically encouraging sloppy
programming practices.
Bullshit. Working experience proves that it JustWorks(tm).
What I don't understand is your visceral hostility to the idea of a
"priv" or "private" keyword.
Because it's at best totally useless.
If it offends you, you wouldn't need to
use it in your own code. You would be perfectly free to continue using
the leading-underscore convention (unless your employer tells you
otherwise, of course).
My employer doesn't tell me how to write code. I'm not a java-drone. My
employer employs me because he is confident in my abilities, not because
he needs some monkey to type the code.

The point is not *my* code, but the whole free Python codebase. I
definitely do not want it to start looking anything like Java. Thanks.
I get the impression that Python suits your own purposes and you
really don't care much about what purpose others might have for it.
Strangely enough, every time I read something like this, it comes from
someone who is asking for some fundamental change in a language used by
millions of people for the past 15+ years, just because they think it
would be better for their own current project.
I
am using it to develop a research prototype of a major safety-critical
system. I chose Python because it enhances my productivity and has a
clean syntax, but my prototype will eventually have to be re-written
in another language. I took a risk in choosing Python, and I would
feel better about it if Python would move up to the next level with
more advanced features such as (optional) static typing and private
declarations.
I'm sorry, but I don't see any of this as being "a move up to the next
level".
But every time I propose something like that,
fundamental change in the language for your own (perceived, and mostly
imaginary) needs, that is...
I get all
kinds of flak from people here who do their hacking and care little
about anyone else's needs.
No one needs another Java. Now what happens here is that *you* come here
explaining to everyone that they need to adapt to the way *you* think
things should be.
With a few relatively small improvements, Python could expand its
domain considerably and make major inroads into territory that is now
dominated by C++, Java, and other statically compiled languages. But
that won't happen if reactionary hackers stand in the way.
So anyone not agreeing with you, whatever his experience, reasons, etc.,
is by definition a "reactionary hacker"? Nice to know.
Side note: I've been looking at Scala, and I like what I see. It may
actually be more appropriate for my needs, but I have so much invested
in Python at this point that the switch will not be easy.
So instead of taking the time to learn the tool that would fit your needs,
you ask for fundamental changes in a language that fits millions of other
people's needs? Now let's talk about not caring about others' needs...
Jun 27 '08 #156
On Wednesday 11 June 2008 08:11:02, Russ P. wrote:
http://www.sofcheck.com

Here is an excerpt from their website:

"SofCheck’s advanced static error detection solutions find bugs in
programs before programs are run. By mathematically analyzing every
line of software, considering every possible input, and every path
through the program, SofCheck’s solutions find any and all errors that
cause a program to crash or produce an undefined result."
Don't mix commercial discourse with technical discourse; it undermines
your point. Theoretically, whether a program has bugs or not is not
computable. Static analysis of the kind they imply is just nonsense.

AFAIK, the efforts needed to do good static analysis are proven, by
experience, to be at least as time-consuming as the efforts needed to do
good unit and dynamic testing.

--
_____________

Maric Michaud
Jun 27 '08 #157
On Jun 11, 2:36 am, Paul Boddie <p...@boddie.org.uk> wrote:
Maybe, but I'd hope that some of those programmers would be at least
able to entertain what Russ has been saying rather than setting
themselves up in an argumentative position where to concede any
limitation in Python might be considered some kind of weakness that
one should be unwilling to admit.
Thanks. I sometimes get the impression that Desthuilliers thinks of
this forum as a pack of dogs, where he is the top dog and I am a
newcomer who needs to be put in his place. I just wish he would take a
chill pill and give it a rest. I am not trying to challenge his
position as top dog.

All I did was to suggest that a keyword be added to Python to
designate private data and methods without cluttering my cherished
code with those ugly leading underscores all over the place. I don't
like that clutter any more than I like all those semi-colons in other
popular languages. I was originally attracted to Python for its clean
syntax, but when I learned about the leading-underscore convention I
nearly gagged.

If Desthuilliers doesn't like my suggestion, then fine. If no other
Python programmer in the world likes it, then so be it. But do we
really need to get personal about it? Python will not be ruined if it
gets such a keyword, and Desthuilliers would be perfectly free to
continue using the leading-underscore convention if he wishes. Where
is the threat to his way of life?
Jun 27 '08 #158
On 11 Jun, 21:28, "Russ P." <Russ.Paie...@gmail.com> wrote:
All I did was to suggest that a keyword be added to Python to
designate private data and methods without cluttering my cherished
code with those ugly leading underscores all over the place. I don't
like that clutter any more than I like all those semi-colons in other
popular languages. I was originally attracted to Python for its clean
syntax, but when I learned about the leading-underscore convention I
nearly gagged.
I'm not bothered about having private instance data, but I think
there's definitely a case to be answered about the double-underscore
name-mangling convention. In the remote past, people were fairly
honest about it being something of a hack, albeit one which had mostly
satisfactory results, and unlike the private instance data argument
which can often be countered by emphasising social measures, there has
been genuine surprise about this particular method of preventing
attribute name collisions - it's an issue which can trip up even
careful programmers.
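To make that surprise concrete, here is a minimal sketch (class names
invented): each class mangles its own double-underscore attributes separately,
so a subclass that reuses the same name silently creates a second attribute
instead of overriding the first.

    class Base:
        def __init__(self):
            self.__state = "base"      # actually stored as _Base__state

        def describe(self):
            return self.__state        # always reads _Base__state

    class Derived(Base):
        def __init__(self):
            Base.__init__(self)
            self.__state = "derived"   # stored as _Derived__state, a different attribute

    d = Derived()
    print(d.describe())   # prints "base", which surprises people expecting an override
    print(vars(d))        # {'_Base__state': 'base', '_Derived__state': 'derived'}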

Note also that the double-underscore convention is listed as a Python
wart [1] and is described by Kuchling thus:

"But it's a hack and a kludge; making privacy depend on an unrelated
property such as the attribute's name is clumsy. At least this
ugliness is limited to one specific and little-used case; few Python
programmers ever bother to use this private variable feature."

In my opinion there are too many people either defending the status
quo (warts and all) or pushing the envelope in many areas that didn't
overly bother people before (various Python 3000 features).

Paul

[1] http://wiki.python.org/moin/PythonWarts
Jun 27 '08 #159
On Jun 12, 6:43 am, Dennis Lee Bieber <wlfr...@ix.netcom.com> wrote:
On Wed, 11 Jun 2008 10:10:14 +0200, Bruno Desthuilliers
<bruno.42.desthuilli...@websiteburo.invalid> declaimed the following in
comp.lang.python:
are some *very* talented and *experimented* programmers here.

Pardon, but I think you mean "experienced".

Of course, GvR may qualify as "experimented" if one considers
designing a language from scratch to be an experiment <G>
It looks like in French (as in Italian) *experimented* has the
meaning of "tried and tested on the field" when applied to a
person.

Michele Simionato
Jun 27 '08 #160
Michele Simionato <mi***************@gmail.com> writes:
On Jun 12, 6:43 am, Dennis Lee Bieber <wlfr...@ix.netcom.com> wrote:
Pardon, but I think you mean "experienced".

Of course, GvR may qualify as "experimented" if one considers
designing a language from scratch to be an experiment <G>

It looks like in French (as in Italian) *experimented* has the
meaning of "tried and tested on the field" when applied to a person.
That would, in English, be "proven" (from similar ideas: "to prove"
means "to determine the truth by a test or trial").

--
\ "I went over to the neighbor's and asked to borrow a cup of |
`\ salt. 'What are you making?' 'A salt lick.'" -- Steven Wright |
_o__) |
Ben Finney
Jun 27 '08 #161
Dennis Lee Bieber wrote:
On Wed, 11 Jun 2008 10:10:14 +0200, Bruno Desthuilliers
<br********************@websiteburo.invalid> declaimed the following in
comp.lang.python:
are some *very* talented and *experimented* programmers here.

Pardon, but I think you mean "experienced".
Indeed. Tim Golden already corrected me (in private) about my mistake.
Please pardon my French :-/
Of course, GvR may qualify as "experimented" if one considers
designing a language from scratch to be an experiment <G>
<g>++ ?-)
Jun 27 '08 #162

Quoting Dennis Lee Bieber <wl*****@ix.netcom.com>:
On Wed, 11 Jun 2008 21:54:33 -0700 (PDT), Michele Simionato
<mi***************@gmail.com> declaimed the following in
comp.lang.python:

It looks like in French (as in Italian) *experimented* has the
meaning of "tried and tested on the field" when applied to a
person.
<Spock raised eyebrow>

Fascinating
Spanish also. I translate "experimentado" as "experienced", perhaps because I
had seen it before, but I never imagined that "experimented" would be wrong.

Fascinating x2

--
Luis Zarrabeitia
Facultad de Matemática y Computación, UH
http://profesores.matcom.uh.cu/~kyrie

Jun 27 '08 #163
