
How do I limit my number of PHP hits per second?

Nu
I want to protect myself in case someone with a fast connection hammers my
site. It's not denial-of-service attacks, but offline downloaders (which of
course don't show they're offline downloaders in the user agent, so I can't
filter them by that). My main issue is that my site is PHP, so if they hammer
it, all the PHP files start executing and overwhelm the CPU. I'd like to be
able to, after a certain number of hits on my index.php per second, just
refuse further requests.

I can't find how to do that. Can it be done in PHP, htaccess, etc.

Any ideas?

Jan 31 '07 #1
"Nu" <no@spam.comkirjoitti
viestissä:uE********************@bgtnsc05-news.ops.worldnet.att.net...
>I want to protect myself from if someone with a fast connection hammers my
site. It's not denial of service attacks, but offline downloaders
Even I have not dealt with this specific issue, I want help by asking these
questions:

1) What info offline downloaders bring to phpinfo():


Jan 31 '07 #2
Rik
Nu <no@spam.com> wrote:
>I want to protect myself in case someone with a fast connection hammers my
>site. It's not denial-of-service attacks, but offline downloaders (which of
>course don't show they're offline downloaders in the user agent, so I can't
>filter them by that). My main issue is that my site is PHP, so if they
>hammer it, all the PHP files start executing and overwhelm the CPU. I'd
>like to be able to, after a certain number of hits on my index.php per
>second, just refuse further requests.
>
>I can't find how to do that. Can it be done in PHP, htaccess, etc.
I'd say this would have to be done at server level; anything in PHP would
still need/eat quite some resources.

May I suggest you ask this on alt.apache.configuration?
--
Rik Wasmus
Jan 31 '07 #3
Nu wrote:
>I want to protect myself in case someone with a fast connection hammers my
>site. It's not denial-of-service attacks, but offline downloaders (which of
>course don't show they're offline downloaders in the user agent, so I can't
>filter them by that). My main issue is that my site is PHP, so if they
>hammer it, all the PHP files start executing and overwhelm the CPU. I'd
>like to be able to, after a certain number of hits on my index.php per
>second, just refuse further requests.
>
>I can't find how to do that. Can it be done in PHP, htaccess, etc.
>
>Any ideas?
Can't be done. You cannot control what other people on the web
do. You can only control how you react.

Any measure you take against the dishonest folks, you also take
against the honest ones. To that end, there are services out
there that will gladly sell you protection packages for several
thousand dollars per month. And some of those might even help
track down your abusive user.

But your best bet is to just make sure you have capacity to
handle peak loads, and that overloaded systems throttle down
gracefully.

Jan 31 '07 #4
Rik wrote:
>I'd say this would have to be done at server level; anything in PHP
>would still need/eat quite some resources.
Personally, I do it by primarily serving up static HTML pages,
instead of PHP. I reserve PHP for active content and such.

You can still get hammered, but the PHP system isn't going wild.
Feb 1 '07 #5
Nu
"Sanders Kaufman" <bu***@kaufman.net> wrote in message
news:yW******************@newssvr27.news.prodigy.net...
>Nu wrote:
>>I want to protect myself in case someone with a fast connection hammers my
>>site. It's not denial-of-service attacks, but offline downloaders (which
>>of course don't show they're offline downloaders in the user agent, so I
>>can't filter them by that). My main issue is that my site is PHP, so if
>>they hammer it, all the PHP files start executing and overwhelm the CPU.
>>I'd like to be able to, after a certain number of hits on my index.php per
>>second, just refuse further requests.
>>
>>I can't find how to do that. Can it be done in PHP, htaccess, etc.
>>
>>Any ideas?
>
>Can't be done. You cannot control what other people on the web
>do. You can only control how you react.
>
>Any measure you take against the dishonest folks, you also take
>against the honest ones. To that end, there are services out
>there that will gladly sell you protection packages for several
>thousand dollars per month. And some of those might even help
>track down your abusive user.
>
>But your best bet is to just make sure you have capacity to
>handle peak loads, and that overloaded systems throttle down
>gracefully.

Actually, my site goes to index.php, and then index.php digs around in other
PHP files and MySQL. If I stop it right at index.php, I can keep my account
from overloading the CPU.


Feb 1 '07 #6
Nu

"P Pulkkinen" <pe*************************@POISTATAMA.elisanet.fi> wrote in
message news:2N****************@reader1.news.saunalahti.fi...
>"Nu" <no@spam.com> wrote in message
>news:uE********************@bgtnsc05-news.ops.worldnet.att.net...
>>I want to protect myself in case someone with a fast connection hammers my
>>site. It's not denial-of-service attacks, but offline downloaders
>
>Even though I have not dealt with this specific issue, I want to help by
>asking these questions:
>
>1) What info do offline downloaders bring to phpinfo()?

I don't understand that question.

Feb 1 '07 #7
"Nu" <no@spam.comkirjoitti
viestissä:uE********************@bgtnsc05-news.ops.worldnet.att.net...
>I want to protect myself from if someone with a fast connection hammers my
site. It's not denial of service attacks, but offline downloaders
Sorry, if I misunderstand or miss something. I understood that you mean
persons that use some batch to fetch the _output_ of your script, perhaps
automaticly on timely basis. But not _download_ it in sense of ftp/scp.

1) Can you use $_SERVER["REMOTE_ADDR"] to identify downloaders from each
other?

2) Does it really matter if they are online or offline, if the POINT is
that some people (or machines) execute your index.php or other script
_too_often_?

3) How about this scenario:
You have two database tables:
DOWNLOADS
- download_id
- filepath
- remote_ip
- timestamp
TROUBLEMAKERS
- remote_ip
- filepath

In the END of every script execution you add an entry to downloads table.
You also check, if that filepath/remote_id-combination has become bad enough
to be inserted into troublemakers table. You use some mathematics to define
characteristics of being evil downloader.

In the BEGINNING of every script, you make a database query to troublemakers
table and if current filepath/remote_id-combination is there, stop the
execution immediately.

Downside here is that mysql traffic increases, even php traffic may
decrease. If they was a way to check evil filepath/remote_id-combinations in
apache side, of course troublemakers table could be replaced with
troublemakers-file as well or a file that would be apache magik with that
data inside.
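
A minimal PHP sketch of this two-table idea; the connection details, the
"more than 3 hits in 10 seconds" rule, and a unique key on (remote_ip,
filepath) are my assumptions, not part of the scenario above:

<?php
// Sketch of the DOWNLOADS/TROUBLEMAKERS scheme described above.
$db = new mysqli('localhost', 'user', 'pass', 'mydb'); // assumed credentials
$ip   = $_SERVER['REMOTE_ADDR'];
$path = $_SERVER['SCRIPT_NAME'];

// BEGINNING of script: refuse known troublemakers immediately.
$chk = $db->prepare('SELECT 1 FROM troublemakers WHERE remote_ip = ? AND filepath = ?');
$chk->bind_param('ss', $ip, $path);
$chk->execute();
if ($chk->get_result()->num_rows > 0) {
    header('HTTP/1.1 503 Service Unavailable');
    exit;
}

// ... normal page processing here ...

// END of script: log the hit, then flag the client if it has been too busy.
$log = $db->prepare('INSERT INTO downloads (filepath, remote_ip, timestamp) VALUES (?, ?, NOW())');
$log->bind_param('ss', $path, $ip);
$log->execute();

// "Some mathematics": more than 3 hits in the last 10 seconds (assumption).
$cnt = $db->prepare('SELECT COUNT(*) AS c FROM downloads
                     WHERE remote_ip = ? AND filepath = ?
                       AND timestamp > NOW() - INTERVAL 10 SECOND');
$cnt->bind_param('ss', $ip, $path);
$cnt->execute();
if ($cnt->get_result()->fetch_assoc()['c'] > 3) {
    // INSERT IGNORE assumes a unique key on (remote_ip, filepath)
    $bad = $db->prepare('INSERT IGNORE INTO troublemakers (remote_ip, filepath) VALUES (?, ?)');
    $bad->bind_param('ss', $ip, $path);
    $bad->execute();
}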



Feb 1 '07 #8
Nu wrote:
>"Sanders Kaufman" <bu***@kaufman.net> wrote in message
>>But your best bet is to just make sure you have capacity to
>>handle peak loads, and that overloaded systems throttle down
>>gracefully.
>
>Actually, my site goes to index.php, and then index.php digs around in
>other PHP files and MySQL. If I stop it right at index.php, I can keep my
>account from overloading the CPU.
In that case - you just have to choose one or more methods among
the several (labor-intensive) ones out there.

You can exit based on IPs - but they can be spoofed. You can
exit based on other headers - but they can be spoofed, too.

This is why developers talk so much about "scalability". If
your site isn't designed to handle peak loads, and to exit
gracefully during overload - all of the other measures won't help.

That's usually an OK design flaw behind a firewall, but not out
in open water.

Feb 1 '07 #9
Nu
"Sanders Kaufman" <bu***@kaufman.net> wrote in message
news:yp*******************@newssvr27.news.prodigy.net...
>Nu wrote:
>>"Sanders Kaufman" <bu***@kaufman.net> wrote in message
>>>But your best bet is to just make sure you have capacity to
>>>handle peak loads, and that overloaded systems throttle down
>>>gracefully.
>>
>>Actually, my site goes to index.php, and then index.php digs around in
>>other PHP files and MySQL. If I stop it right at index.php, I can keep my
>>account from overloading the CPU.
>
>In that case - you just have to choose one or more methods among
>the several (labor-intensive) ones out there.
>
>You can exit based on IPs - but they can be spoofed. You can
>exit based on other headers - but they can be spoofed, too.
>
>This is why developers talk so much about "scalability". If
>your site isn't designed to handle peak loads, and to exit
>gracefully during overload - all of the other measures won't help.
>
>That's usually an OK design flaw behind a firewall, but not out
>in open water.

So how do I handle peak loads and exit gracefully during overloads?

Basically something like X hits per 10 seconds to index.php sounds simple
enough. I can't find out how to do that, though.

Feb 1 '07 #10
Nu
I am trying to limit how often index.php gets run. index.php calls lots of
other stuff. I want (even in index.php) something where, if it's run too
often within so many seconds, it just stops, and that's enough for now. It's
not about a complicated IP-tracking thing, just a simple thing.
"P Pulkkinen" <pe*************************@POISTATAMA.elisanet.fi> wrote in
message news:tc****************@reader1.news.saunalahti.fi...
>"Nu" <no@spam.com> wrote in message
>news:uE********************@bgtnsc05-news.ops.worldnet.att.net...
>>I want to protect myself in case someone with a fast connection hammers my
>>site. It's not denial-of-service attacks, but offline downloaders
>
>Sorry if I misunderstand or am missing something. I understood that you
>mean people who use some batch tool to fetch the _output_ of your script,
>perhaps automatically on a timed basis, but don't _download_ it in the
>sense of ftp/scp.
>
>1) Can you use $_SERVER["REMOTE_ADDR"] to tell downloaders apart from each
>other?
>
>2) Does it really matter whether they are online or offline, if the POINT
>is that some people (or machines) execute your index.php or other script
>_too_often_?
>
>3) How about this scenario? You have two database tables:
>DOWNLOADS
>- download_id
>- filepath
>- remote_ip
>- timestamp
>TROUBLEMAKERS
>- remote_ip
>- filepath
>
>At the END of every script execution, you add an entry to the downloads
>table. You also check whether that filepath/remote_ip combination has
>become bad enough to be inserted into the troublemakers table. You use
>some mathematics to define the characteristics of an evil downloader.
>
>At the BEGINNING of every script, you make a database query to the
>troublemakers table, and if the current filepath/remote_ip combination is
>there, stop execution immediately.
>
>The downside here is that MySQL traffic increases even as PHP traffic
>decreases. If there were a way to check evil filepath/remote_ip
>combinations on the Apache side, the troublemakers table could of course
>be replaced with a troublemakers file, or some other file Apache could
>consult with that data inside.



Feb 1 '07 #11
Nu wrote:
>"Sanders Kaufman" <bu***@kaufman.net> wrote in message
>>That's usually an OK design flaw behind a firewall, but not out
>>in open water.
>
>So how do I handle peak loads and exit gracefully during overloads?
>
>Basically something like X hits per 10 seconds to index.php sounds simple
>enough. I can't find out how to do that, though.
Now THAT is a question a coder can answer!!!
There are several approaches.

I would use a timestamp/hit-count $_SESSION[] variable to track
their usage.

Then, each session will be aware of how often its client is
hitting you - aborting the connection (but not the session!)
when they're outside of your desired frequency.

Me, personally, I wouldn't abort the connection. I'd put them
to sleep. There's a sleep() function in PHP that will let you
pause processing for a period of time. (You might want to
build a wrapper around it for your own sleepy purposes.)

This will also force bots/agents into throttling down their
requests. Since the connection isn't broken, they won't issue a
zillion connection requests. They'll just think you've got one
seriously bogged-down machine.

It might even trick them into thinking they DoS'd you - when in
fact, you DoS'd them.

You can't force people to behave any certain way on the web -
but you can trick their software!

Rule #1 of dealing with coders - don't ask *them* for the spec.
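
A minimal sketch of that $_SESSION idea; the window length, hit limit, and
sleep time are illustrative assumptions:

<?php
// Minimal sketch of the $_SESSION timestamp/hit-count throttle described above.
session_start();

$window  = 10;  // seconds per counting window (assumption)
$maxHits = 20;  // allowed hits per window (assumption)

$now = time();
if (!isset($_SESSION['win_start']) || $now - $_SESSION['win_start'] >= $window) {
    // start a fresh counting window
    $_SESSION['win_start'] = $now;
    $_SESSION['hits'] = 0;
}
$_SESSION['hits']++;

if ($_SESSION['hits'] > $maxHits) {
    // Over the limit: stall the client rather than dropping the connection,
    // so an aggressive agent slows down instead of instantly retrying.
    sleep(5);
}
// ... normal page processing continues here ...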
Feb 1 '07 #13
Nu
$_SESSION[] is pretty much dependent on cookies, right?

"Sanders Kaufman" <bu***@kaufman.net> wrote in message
news:Ek***************@newssvr17.news.prodigy.net...
>Nu wrote:
>>"Sanders Kaufman" <bu***@kaufman.net> wrote in message
>>>That's usually an OK design flaw behind a firewall, but not out
>>>in open water.
>>So how do I handle peak loads and exit gracefully during overloads?
>>
>>Basically something like X hits per 10 seconds to index.php sounds simple
>>enough. I can't find out how to do that, though.
>
>Now THAT is a question a coder can answer!!!
>There are several approaches.
>
>I would use a timestamp/hit-count $_SESSION[] variable to track
>their usage.
>
>Then, each session will be aware of how often its client is
>hitting you - aborting the connection (but not the session!)
>when they're outside of your desired frequency.
>
>Me, personally, I wouldn't abort the connection. I'd put them
>to sleep. There's a sleep() function in PHP that will let you
>pause processing for a period of time. (You might want to
>build a wrapper around it for your own sleepy purposes.)
>
>This will also force bots/agents into throttling down their
>requests. Since the connection isn't broken, they won't issue a
>zillion connection requests. They'll just think you've got one
>seriously bogged-down machine.
>
>It might even trick them into thinking they DoS'd you - when in
>fact, you DoS'd them.
>
>You can't force people to behave any certain way on the web -
>but you can trick their software!
>
>Rule #1 of dealing with coders - don't ask *them* for the spec.

Feb 1 '07 #14
Nu wrote:
>I want to protect myself in case someone with a fast connection hammers my
>site. It's not denial-of-service attacks, but offline downloaders (which
>of course don't show they're offline downloaders in the user agent, so I
>can't filter them by that). My main issue is that my site is PHP, so if
>they hammer it, all the PHP files start executing and overwhelm the CPU.
>I'd like to be able to, after a certain number of hits on my index.php per
>second, just refuse further requests.
>
>I can't find how to do that. Can it be done in PHP, htaccess, etc.
>
>Any ideas?
You can use a database for it, but it is only a partial solution.
Create a MySQL table 'requests' with these fields:
remote_addr varchar(20)
http_via varchar(100)
http_forwarded varchar(100)
http_x_forwarded_for varchar(100)
x_http_forwarded_for varchar(100)
x_forwarded_for varchar(100)
nexttime datetime

The field names correspond to the uppercase HTTP header fields, except the
last one. You cannot get all of these; only remote_addr is always available.
At the beginning of your script you must try to get these fields as
$_SERVER["REMOTE_ADDR"], $_SERVER["HTTP_VIA"], etc.
Now you must try to find a record in the table where all fields are the
same. If you find a record, you must check whether the current time is
equal to or greater than the value stored in the nexttime field.
If the current time is less than the stored time, you can show some error
message or redirect to www.microsoft.com :-)
If the current time is equal or greater, you display the requested page.

At the end of your script you must
1) update the nexttime field (store the current time plus some addition,
after which the user can access the page again) if you found a record at
the beginning of the script,

2) or create a new record if you did not find one at the beginning.
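
A condensed PHP sketch of this check, using only remote_addr and http_via
for brevity; the connection details and the 2-second lockout are
assumptions, and the nexttime update is shown inline rather than at the end
of the script:

<?php
// Sketch of the 'requests' table scheme described above.
$db = new mysqli('localhost', 'user', 'pass', 'mydb'); // assumed credentials

$ip  = $_SERVER['REMOTE_ADDR'];
$via = isset($_SERVER['HTTP_VIA']) ? $_SERVER['HTTP_VIA'] : '';

$sel = $db->prepare('SELECT nexttime FROM requests WHERE remote_addr = ? AND http_via = ?');
$sel->bind_param('ss', $ip, $via);
$sel->execute();
$row = $sel->get_result()->fetch_assoc();

if ($row !== null && time() < strtotime($row['nexttime'])) {
    // too soon: refuse (or redirect, as suggested above)
    header('HTTP/1.1 503 Service Unavailable');
    exit('Too many requests - please wait a moment.');
}

// allow the request and push nexttime forward by the lockout interval
$next = date('Y-m-d H:i:s', time() + 2); // 2-second lockout (assumption)
if ($row !== null) {
    $upd = $db->prepare('UPDATE requests SET nexttime = ? WHERE remote_addr = ? AND http_via = ?');
    $upd->bind_param('sss', $next, $ip, $via);
    $upd->execute();
} else {
    $ins = $db->prepare('INSERT INTO requests (remote_addr, http_via, nexttime) VALUES (?, ?, ?)');
    $ins->bind_param('sss', $ip, $via, $next);
    $ins->execute();
}
// ... normal page processing continues here ...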

--

Petr Vileta, Czech republic
(My server rejects all messages from Yahoo and Hotmail. Send me your mail
from another non-spammer site please.)
Feb 1 '07 #15
Nu

"Nu" <no@spam.com> wrote in message
news:uE********************@bgtnsc05-news.ops.worldnet.att.net...
>I want to protect myself in case someone with a fast connection hammers my
>site. It's not denial-of-service attacks, but offline downloaders (which
>of course don't show they're offline downloaders in the user agent, so I
>can't filter them by that). My main issue is that my site is PHP, so if
>they hammer it, all the PHP files start executing and overwhelm the CPU.
>I'd like to be able to, after a certain number of hits on my index.php per
>second, just refuse further requests.
>
>I can't find how to do that. Can it be done in PHP, htaccess, etc.
>
>Any ideas?

I've heard this called a "hit limit".
Feb 1 '07 #16
>Me, personally, I wouldn't abort the connection. I'd put them to sleep.
>There's a sleep() function in PHP that will let you pause processing
>for a period of time. (You might want to build a wrapper around it for
>your own sleepy purposes.)
I wish there was a real-life wrapper for this... (no doona jokes please) :)
Feb 1 '07 #17
Nu wrote:
>$_SESSION[] is pretty much dependent on cookies, right?
Yes - optionally a query string, but that won't work for your
purposes.

If you want to monitor the activity of your clients from one
connection to the next, you need persistent client-side data.

There's no way around that... except maybe the honor system.

Feb 1 '07 #18
Petr Vileta wrote:
>Nu wrote:
>>I want to protect myself in case someone with a fast connection hammers
>>my site. It's not denial-of-service attacks, but offline downloaders
>>(which of course don't show they're offline downloaders in the user
>>agent, so I can't filter them by that). My main issue is that my site is
>>PHP, so if they hammer it, all the PHP files start executing and overwhelm
>>the CPU. I'd like to be able to, after a certain number of hits on my
>>index.php per second, just refuse further requests.
>>
>>I can't find how to do that. Can it be done in PHP, htaccess, etc.
>>
>>Any ideas?
>You can use a database for it, but it is only a partial solution.

Tee hee. The idea was to *prevent* that kind of activity.
Feb 1 '07 #19
On Wed, 31 Jan 2007 17:37:57 -0800, Nu <no@spam.com> wrote:
>"Sanders Kaufman" <bu***@kaufman.net> wrote in message
>news:Ek***************@newssvr17.news.prodigy.net...
>>Nu wrote:
>>>"Sanders Kaufman" <bu***@kaufman.net> wrote in message
>>>>That's usually an OK design flaw behind a firewall, but not out
>>>>in open water.
>>>
>>>So how do I handle peak loads and exit gracefully during overloads?
>>>
>>>Basically something like X hits per 10 seconds to index.php sounds
>>>simple enough. I can't find out how to do that, though.
>>
>>Now THAT is a question a coder can answer!!!
>>There are several approaches.
>>
>>I would use a timestamp/hit-count $_SESSION[] variable to track
>>their usage.
>>
>>Then, each session will be aware of how often its client is
>>hitting you - aborting the connection (but not the session!)
>>when they're outside of your desired frequency.
>>
>>Me, personally, I wouldn't abort the connection. I'd put them
>>to sleep. There's a sleep() function in PHP that will let you
>>pause processing for a period of time. (You might want to
>>build a wrapper around it for your own sleepy purposes.)
>>
>>This will also force bots/agents into throttling down their
>>requests. Since the connection isn't broken, they won't issue a
>>zillion connection requests. They'll just think you've got one
>>seriously bogged-down machine.
>>
>>It might even trick them into thinking they DoS'd you - when in
>>fact, you DoS'd them.
>>
>>You can't force people to behave any certain way on the web -
>>but you can trick their software!
>>
>>Rule #1 of dealing with coders - don't ask *them* for the spec.
>
>$_SESSION[] is pretty much dependent on cookies, right?
No, sessions will work with the query string when session cookies can't be
set. Sanders Kaufman's idea seems pretty sound, in that it uses sleep. I
like it.

--
Curtis, http://dyersweb.com
Feb 1 '07 #20
Curtis wrote:
>No, sessions will work with the query string when session cookies can't
>be set. Sanders Kaufman's idea seems pretty sound, in that it uses
>sleep. I like it.
The only problem with it is if you go cookieless.

In the query-string way, you'd need the user to type in the
query string when they go back to the page in order to retain
the session data.
Feb 1 '07 #21
On Wed, 31 Jan 2007 20:48:19 -0800, Sanders Kaufman <bu***@kaufman.net>
wrote:
>Curtis wrote:
>>No, sessions will work with the query string when session cookies can't
>>be set. Sanders Kaufman's idea seems pretty sound, in that it uses
>>sleep. I like it.
>
>The only problem with it is if you go cookieless.
>
>In the query-string way, you'd need the user to type in the query string
>when they go back to the page in order to retain the session data.
Yeah, that's true, but if they're navigating within the site, PHP will (if
enabled in php.ini) append the SID to the end of links and form actions
transparently.
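
For reference, the php.ini directives involved (a sketch; note that
transparent SID rewriting can leak session IDs in shared links, so treat it
as a fallback):

; let PHP append the session ID to links and form actions automatically
session.use_trans_sid = 1
; must be off for non-cookie sessions to be accepted at all
session.use_only_cookies = 0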

--
Curtis, http://dyersweb.com
Feb 1 '07 #22
Nu
"Sanders Kaufman" <bu***@kaufman.net> wrote in message
news:xO******************@newssvr21.news.prodigy.net...
>Nu wrote:
>>$_SESSION[] is pretty much dependent on cookies, right?
>
>Yes - optionally a query string, but that won't work for your
>purposes.
>
>If you want to monitor the activity of your clients from one
>connection to the next, you need persistent client-side data.
>
>There's no way around that... except maybe the honor system.

Cookies wouldn't work on some email-grabber bot or offline downloader. And a
query string isn't how my site's software works.

I'm basically looking for something that would simply be able to say that
index.php can't be called by anyone more than, say, so many times per
second. After that, it'll do a sleep command or output a "too many
connections" page.
Feb 1 '07 #23

$currentmin = mktime(date("H"), date("i"), 0, date("m"), date("d"), date("Y"));

if (is_file($currentmin)) {
    // get the content of the file: the hit count for the current minute
    $hits = (int) file_get_contents($currentmin);

    if ($hits >= 10) { // max limit per minute
        // display some message / redirect ...
        exit('Too many requests this minute.');
    }

    // increment the value read from the file and write it back
    file_put_contents($currentmin, $hits + 1, LOCK_EX);

    // process MySQL operations ...
} else {
    // create a file $currentmin and write the content "1" to it
    file_put_contents($currentmin, "1", LOCK_EX);
}
// you will need to delete the older $currentmin files with some script



On Feb 1, 4:30 am, "Nu" <n...@spam.com> wrote:
>I want to protect myself in case someone with a fast connection hammers my
>site. It's not denial-of-service attacks, but offline downloaders (which
>of course don't show they're offline downloaders in the user agent, so I
>can't filter them by that). My main issue is that my site is PHP, so if
>they hammer it, all the PHP files start executing and overwhelm the CPU.
>I'd like to be able to, after a certain number of hits on my index.php per
>second, just refuse further requests.
>
>I can't find how to do that. Can it be done in PHP, htaccess, etc.
>
>Any ideas?

Feb 1 '07 #24
Nu

"Manish" <ye*********@gmail.com> wrote in message
news:11**********************@v33g2000cwv.googlegroups.com...
>
>$currentmin = mktime(date("H"), date("i"), 0, date("m"), date("d"), date("Y"));
>
>if (is_file($currentmin)) {
>    // get the content of the file: the hit count for the current minute
>    $hits = (int) file_get_contents($currentmin);
>
>    if ($hits >= 10) { // max limit per minute
>        // display some message / redirect ...
>        exit('Too many requests this minute.');
>    }
>
>    // increment the value read from the file and write it back
>    file_put_contents($currentmin, $hits + 1, LOCK_EX);
>
>    // process MySQL operations ...
>} else {
>    // create a file $currentmin and write the content "1" to it
>    file_put_contents($currentmin, "1", LOCK_EX);
>}
>// you will need to delete the older $currentmin files with some script

That idea will work. Thanks.
Feb 1 '07 #25
Nu wrote:
>I'm basically looking for something that would simply be able to say that
>index.php can't be called by anyone more than, say, so many times per
>second. After that, it'll do a sleep command or output a "too many
>connections" page.
See - that's where your spec is flawed. You want to act based
on a visitor's identity... without identifying visitors.


Feb 1 '07 #26
Manish wrote:
>    // create a file $currentmin and write the content "1" to it
>
>// you will need to delete the older $currentmin files with some script

The only problem with that is that *preventing* such processing
was the primary goal.

This "solution" adds a whole extra layer of file creation and
deletion to every single page.
Feb 1 '07 #27
Nu

"Sanders Kaufman" <bu***@kaufman.net> wrote in message
news:%n****************@newssvr17.news.prodigy.net...
>Nu wrote:
>>I'm basically looking for something that would simply be able to say that
>>index.php can't be called by anyone more than, say, so many times per
>>second.
>
>See - that's where your spec is flawed. You want to act based
>on a visitor's identity... without identifying visitors.

No, I want something based on a filename, not visitors.
Feb 1 '07 #28
Nu wrote:
>"Sanders Kaufman" <bu***@kaufman.net> wrote in message
>>>I'm basically looking for something that would simply be able to say
>>>that index.php can't be called by anyone more than, say, so many times
>>>per second.
>>See - that's where your spec is flawed. You want to act based
>>on a visitor's identity... without identifying visitors.
>
>No, I want something based on a filename, not visitors.
Oh - I took the "by anyone" literally.

No flame - but this stuff gets a LOT easier when you state the
problem correctly and completely. That's why it's called a
specification.
Feb 1 '07 #29
Following on from Nu's message. . .
>I am trying to limit how often index.php gets run. index.php calls lots of
>other stuff. I want (even in index.php) something where, if it's run too
>often within so many seconds, it just stops, and that's enough for now.
>It's not about a complicated IP-tracking thing, just a simple thing.
OK.[1] You simply want to say "I will only allow index.php to be run
10 (etc.) times per second."
(1) I don't know if there is an Apache way to do this, so I'll pass on
that one.
(2) If you can't stop the page running, you can stop it doing lots of
complicated work by keeping the data you need to share across all sessions
in a shared resource such as a database or a mutex file.
(3) There are various things you can do in the "do I allow another
proper execution, or fail[2]?" step (see the sketch after the footnotes):
- Every 100 runs, reset the mutex to 0 // stops errors building up and
  wedging the system.
- Check a counter of how many executions are 'in progress'.
- If OK, then add 1 to the mutex and do the real work, or else fail.
- At the end, subtract 1 from the mutex.[3]

[1] Personally I think it is better to get to the root cause of the
problem, and if you can't ask people 'not to do that', then to enforce it
/for those people/. If you're the national rail site that falls over
when there's a bit of snow and everyone wants to see what a shambles the
railways are in, then get more server power.

[2] You need to give very careful thought to how you 'fail'. Delay, 500
message, bandwidth-exceeded graphic, absolutely no output, Forbidden?,
redirect to static data in an HTML page ...

[3] Notice I have bent your spec in this scheme and looked at the number
of sessions currently running rather than an arbitrary time between
calls.
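
A rough PHP sketch of that mutex-file counter; the path, the limit of 10,
and the 503 failure response are my assumptions:

<?php
// Shared "in progress" counter kept in a mutex file, as described above.
$lockPath = '/tmp/index_inprogress'; // hypothetical path
$maxConcurrent = 10;                 // assumed limit

$fp = fopen($lockPath, 'c+');        // create if missing, don't truncate
flock($fp, LOCK_EX);                 // serialize access to the counter
$count = (int) stream_get_contents($fp);

if ($count >= $maxConcurrent) {
    flock($fp, LOCK_UN);
    fclose($fp);
    header('HTTP/1.1 503 Service Unavailable');
    exit;                            // how to 'fail' is up to you - see [2]
}

// claim a slot, then release the lock while the real work runs
ftruncate($fp, 0);
rewind($fp);
fwrite($fp, (string) ($count + 1));
flock($fp, LOCK_UN);

// ... real work here ...

// at the end, decrement the counter again
flock($fp, LOCK_EX);
rewind($fp);
$count = (int) stream_get_contents($fp);
ftruncate($fp, 0);
rewind($fp);
fwrite($fp, (string) max(0, $count - 1));
flock($fp, LOCK_UN);
fclose($fp);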
--
PETER FOX Not the same since the bolt company screwed up
pe******@eminent.demon.co.uk.not.this.bit.no.html
2 Tees Close, Witham, Essex.
Gravity beer in Essex <http://www.eminent.demon.co.uk>
Feb 1 '07 #30
Nu

"Nu" <no@spam.com> wrote in message
news:uE********************@bgtnsc05-news.ops.worldnet.att.net...
>I want to protect myself in case someone with a fast connection hammers my
>site. It's not denial-of-service attacks, but offline downloaders (which
>of course don't show they're offline downloaders in the user agent, so I
>can't filter them by that). My main issue is that my site is PHP, so if
>they hammer it, all the PHP files start executing and overwhelm the CPU.
>I'd like to be able to, after a certain number of hits on my index.php per
>second, just refuse further requests.
>
>I can't find how to do that. Can it be done in PHP, htaccess, etc.
>
>Any ideas?

It was faster to build than I thought, and I've got code running now,
throttling my site. I'll never have to worry about an offline downloader
trying to crash the server again.
Feb 1 '07 #31
Nu wrote:
>I can't find how to do that. Can it be done in PHP, htaccess, etc.
Firstly, are you on a shared host? If so, then ensuring quality of
service is really your hosting provider's job. If one site on a server is
using up a massive portion of the server's capacity (with regard to
bandwidth, CPU, memory or disk space), then this impacts *all* the sites on
that particular server, so it's their responsibility to either throttle that
site or request that its administrator purchase a more expensive hosting
package so that it can be moved onto a server with fewer other sites.

If you're on a dedicated host and don't have root access, then get root
access (change your hosting provider if need be).

If you're on a dedicated host with root access, then probably the best
option is to use Apache's mod_cband module <http://mod-cband.com/>. With
this you can add to your httpd.conf something like this:

<VirtualHost>
...
# limit speed of this vhost to 1Mbit/s, 10 requests/s, 30 open connections
CBandSpeed 1Mbps 10 30
# in addition, every remote host connecting to this vhost
# will be limited to 100kbit/s, 3 requests/s, 3 open connections
CBandRemoteSpeed 100kbps 3 3
</VirtualHost>

--
Toby A Inkster BSc (Hons) ARCS
Contact Me ~ http://tobyinkster.co.uk/contact
Geek of ~ HTML/CSS/Javascript/SQL/Perl/PHP/Python*/Apache/Linux

* = I'm getting there!
Feb 1 '07 #32
Following on from Nu's message. . .
>It was faster to build than I thought, and I've got code running now,
>throttling my site. I'll never have to worry about an offline downloader
>trying to crash the server again.

Hold on! If all you've done is throttle everyone, then offline
downloaders are effectively a DoS. They still bash away. It's like they
put 5 people in the queue for every one ordinary user.

If you consider these events rare, then perhaps that's OK, but otherwise
you're shutting out everyone. In fact the 'correct' response from a bot
is to *increase* its fire rate to get an acceptable percentage of hits,
i.e. this is as much an issue of equitable rationing as of limiting your
server usage.


--
PETER FOX Not the same since the bolt company screwed up
pe******@eminent.demon.co.uk.not.this.bit.no.html
2 Tees Close, Witham, Essex.
Gravity beer in Essex <http://www.eminent.demon.co.uk>
Feb 1 '07 #33
