Bytes | Software Development & Data Engineering Community
running functions in parallel on multiple processors

Hello.

What is the usual way to run functions in parallel on a multiple-processor
machine? Actually, I want to run a single computationally expensive function
with different parameter sets.
Running the functions in different threads doesn't seem to work, because of
the global interpreter lock.
Would it help to fork processes that each run the function with a given
parameter set? Is there any simple way for such a forked worker process to
report its result back to the controlling process?

Thanks.
Best regards,
Michael
Jul 18 '05 #1
Michael Schmitt wrote:
What is the usual way to run functions in parallel on a
multiple-processor machine? Actually, I want to run a single
computationally expensive function with different parameter sets.
Running the functions in different threads doesn't seem to work, because
of the global interpreter lock.
Would it help to fork processes that each run the function with a
given parameter set? Is there any simple way for such a forked worker
process to report its result back to the controlling process?


Forked processes could indeed perform whatever computations you
need, and then report their results by writing them to a socket
which the controlling process reads (there are many other IPC
mechanisms, but sockets are often simplest where applicable).
Alex
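[A minimal sketch of the pattern described above, in present-day Python: the parent forks one child per parameter set, and each child pickles its result and writes it back over a socketpair. The function and parameter names are illustrative, and os.fork() requires a Unix-like OS.]

```python
import os
import pickle
import socket

def expensive(x):
    # stand-in for the computationally expensive function
    return x * x

def run_parallel(param_sets):
    children = []
    for params in param_sets:
        parent_sock, child_sock = socket.socketpair()
        pid = os.fork()
        if pid == 0:                      # child process
            parent_sock.close()
            result = expensive(params)
            child_sock.sendall(pickle.dumps(result))
            child_sock.close()
            os._exit(0)                   # skip normal interpreter teardown
        child_sock.close()                # parent keeps only its own end
        children.append((pid, parent_sock))

    results = []
    for pid, sock in children:
        data = b""
        while True:                       # read until the child closes
            chunk = sock.recv(4096)
            if not chunk:
                break
            data += chunk
        sock.close()
        os.waitpid(pid, 0)                # reap the finished child
        results.append(pickle.loads(data))
    return results

print(run_parallel([1, 2, 3]))            # -> [1, 4, 9]
```

Reading until the socket returns no more data (EOF) means the parent needs no length prefix; the child closing its end marks the result as complete.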

Jul 18 '05 #2
Michael,
I may have something lying around that would be useful for you:
a module I wrote that makes forked multi-process programming
very easy, since each process accesses a shared data store
automatically. I haven't released it due to a lack of time to write
documentation, but it sounds like it may be the sort of thing you could use.
It's called remoteD, and it works like this:

import remoteD, time

SharedD = remoteD.initShare()

def child_function(Shared, arg1, arg2):
    # the first arg will be the Shared
    # dictionary-like object;
    # put shared data into the dictionary whenever you want
    Shared["myresult"] = 5

arg1, arg2 = "spam", "eggs"  # placeholders for whatever your function needs
SharedD.newProc(child_function, [arg1, arg2])

while not SharedD.has_key("myresult"):
    time.sleep(0.2)

print "The other process got " + str(SharedD["myresult"]) + " as the answer"

-------------------
stubShare objects, which are created by initShare() or newProc() (which
puts the newly created share stub as the first argument, ahead of your own,
in the argument list for your function), act like dictionaries: .has_key(),
.keys() and del all work fine. You can also lock the whole share temporarily
by calling .Lock(), and later .UnLock(), on any stubShare object.
Any Python object that can be pickled can be stored in a share.

Behind the scenes, the first call to initShare() forks a server process
that holds the shared data and accepts connections from share stub objects;
initShare() returns a stubShare object in the calling process. The server
will shut itself down after a couple of seconds without any connected
stubShares, so you don't need to clean it up explicitly. (You can also
force the server to stay alive, but that's a different topic.)
Fork is required.
By default, initShare() uses IP sockets, but you can easily tell it to use
Unix sockets, which are much faster:

SharedD = remoteD.initShare(sType=remoteD.UNIXSOCK)

The 'port' argument is overridden for use with Unix sockets, so you can
choose to name your socket yourself instead of using the default
'7450':

SharedD = remoteD.initShare(port='myfile', sType=remoteD.UNIXSOCK)

You can also use the createShareServer function and the stubShare class
directly to share data across machines.

As for scalability: I've had hundreds of child processes running and
sharing data with this (over Unix sockets), but I have no hard numbers
on whether the overhead of the stubShare objects slows things down
greatly. I will say this: avoid repeated references to the shared data.
Assigning to a local variable performs a deep copy and will be faster,
so do things like the following to avoid hitting the shared data on
every operation:

myValue = SharedD['remoteValue']
myValue += 5
# other manipulations of myValue here
# much later, when you are done:
SharedD['remoteValue'] = myValue

Anyway, I'll end up writing better documentation and doing an official
release on SourceForge later this week, but for now you can download it at:
http://www.neurokode.com/remoteD.tar
I hope this helps, feel free to bug me with questions.

~Jon Franz
NeuroKode Labs, LLC


Jul 18 '05 #3


