
download limit

Hi,

I have a multithreaded script that mainly spawns several wget
processes to download files. I would like to monitor, and possibly
limit, the bandwidth used by the pool of processes. One way to do this
is to change the number of wget instances, but that is only a workaround.

What do you recommend for doing the following in Python:
1) measure the bitrate at the scale of the whole script
2) control this bitrate, and optionally cap it

Thanks,
Mathieu
Aug 10 '08 #1


I have a multithreaded script that mainly spawns several wget
processes to download files. I would like to monitor, and possibly
limit, the bandwidth used by the pool of processes. One way to do this
is to change the number of wget instances, but that is only a workaround.

What do you recommend for doing the following in Python:
1) measure the bitrate at the scale of the whole script
2) control this bitrate, and optionally cap it
I recommend not using wget, but instead implementing the network access
directly in Python. Then you can easily measure that bitrate, and also
limit it (by having some threads sleep).
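
For illustration, here is a rough sketch of what I mean (untested, with
made-up URLs and a made-up cap of 200 KiB/s; on Python 2 you would use
urllib2 instead of urllib.request): a small limiter object is shared by
all downloader threads, so it knows the total bytes received at the
scale of the whole script, can report the current bitrate, and sleeps
any thread that gets ahead of the cap.

import threading
import time
import urllib.request  # urllib2 on Python 2


class RateLimiter:
    """Shared limiter: every downloader thread reports the bytes it received,
    and is put to sleep whenever the pool as a whole runs ahead of max_bps."""

    def __init__(self, max_bps):
        self.max_bps = max_bps
        self.lock = threading.Lock()
        self.start = time.time()
        self.total = 0

    def consume(self, nbytes):
        with self.lock:
            self.total += nbytes
            expected = self.total / float(self.max_bps)
            elapsed = time.time() - self.start
        if expected > elapsed:  # ahead of the allowed rate: sleep the caller
            time.sleep(expected - elapsed)

    def bitrate(self):
        """Current script-wide throughput in bytes per second."""
        with self.lock:
            return self.total / max(time.time() - self.start, 1e-9)


def download(url, dest, limiter, chunk_size=8192):
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        while True:
            chunk = resp.read(chunk_size)
            if not chunk:
                break
            out.write(chunk)
            limiter.consume(len(chunk))


if __name__ == "__main__":
    # Placeholder URLs and cap, purely for illustration.
    urls = ["http://example.com/a.iso", "http://example.com/b.iso"]
    limiter = RateLimiter(max_bps=200 * 1024)
    threads = [threading.Thread(target=download, args=(u, "file%d" % i, limiter))
               for i, u in enumerate(urls)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print("average bitrate: %.1f KiB/s" % (limiter.bitrate() / 1024.0))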

Once you fork out new processes, you lose, unless there is an operating
system bandwidth limitation framework available which works on groups
of processes. Solaris project objects may provide such a thing, but
apart from that, I don't think it's available in any operating system
that you might be using.

Regards,
Martin
Aug 10 '08 #2


Mathieu Prevot wrote:
Hi,

I have a multithreaded script that mainly spawns several wget
processes to download files. I would like to monitor, and possibly
limit, the bandwidth used by the pool of processes. One way to do this
is to change the number of wget instances, but that is only a workaround.

What do you recommend for doing the following in Python:
1) measure the bitrate at the scale of the whole script
2) control this bitrate, and optionally cap it
I'm not aware of a way to do that in Python itself. But under Linux you
might consider using trickled.
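
As a rough sketch of that approach (assuming the trickle client and the
trickled daemon are installed and the daemon is running; the URLs and
the 100 KB/s figure are placeholders), each wget can simply be started
through trickle from the existing script, and trickled then shares the
configured bandwidth between all trickle-managed processes:

import subprocess

urls = ["http://example.com/a.iso", "http://example.com/b.iso"]  # placeholders

procs = []
for url in urls:
    # "trickle -d 100" asks for a 100 KB/s download ceiling for this wget;
    # with trickled running, the limit is coordinated across all of them.
    procs.append(subprocess.Popen(["trickle", "-d", "100", "wget", "-q", url]))

for p in procs:
    p.wait()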

Diez
Aug 10 '08 #4
