Sla.ckers.org
How do we crash systems, browsers, or otherwise bring things to a halt, and how do we protect those things? 
Bandwidth Overload Through GET Requests?
Posted by: Super-Friez
Date: July 18, 2007 04:59PM

Is it possible to force a website to exceed its bandwidth limit through repeated GET requests? I mean something like using GET 500 times.

Re: Bandwidth Overload Through GET Requests?
Date: July 19, 2007 12:05AM

Yes, of course. You can, for instance, embed multiple images from the site in a heavily visited forum, appending a query string to each image URL so it won't be cached (each variant is treated as a different resource), or you can just access the files yourself repeatedly with a script, again making sure nothing is cached. But bandwidth limits are usually no smaller than 1 GB, so exceeding one would take some fairly large files and some time.
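The cache-busting trick described above boils down to making every URL unique; a minimal Python sketch (the image URL here is hypothetical):

```python
import uuid

def cache_busted(url):
    """Append a unique query parameter so caches treat each URL as distinct."""
    sep = "&" if "?" in url else "?"
    return f"{url}{sep}nocache={uuid.uuid4().hex}"

# Every generated URL differs, so a browser or proxy cache refetches each time.
urls = [cache_busted("http://example.com/big-image.jpg") for _ in range(3)]
```

Since the query string changes on every call, no two requests share a cache key, and each one pulls the full file from the origin server.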

Re: Bandwidth Overload Through GET Requests?
Posted by: Rich
Date: July 23, 2007 08:54AM

This happened to me last January. I'm not sure whether it was deliberate or just a badly written script, as I'm on a shared server and don't have full access to the logs, but some bright spark managed to use up three times my permitted monthly bandwidth in a little over two hours! They hit a large zip file on my site tens of thousands of times. Luckily, my hosts were very understanding.
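Rich doesn't give exact figures, but a back-of-envelope check with assumed numbers (a 1 MB zip and a 10 GB monthly quota, both hypothetical) shows how modest the required request rate is:

```python
zip_size_mb = 1            # hypothetical file size
quota_gb = 10              # hypothetical monthly bandwidth quota
transferred_gb = 3 * quota_gb        # "three times my permitted monthly bandwidth"

requests_needed = transferred_gb * 1024 // zip_size_mb   # total hits on the zip
seconds = 2 * 3600 + 10 * 60                             # "a little over two hours"
rate = requests_needed / seconds                         # requests per second

print(requests_needed, round(rate, 1))  # → 30720 3.9
```

Under those assumptions, "tens of thousands" of hits at roughly 4 requests per second is well within reach of a single machine.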

Re: Bandwidth Overload Through GET Requests?
Posted by: Super-Friez
Date: July 27, 2007 12:17PM

So, say I use Perl's LWP functions to send a GET request in a for() loop, over and over again, without displaying what's sent back over HTTP. Can this still overload the bandwidth?
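Whether or not the script displays the response, the body still crosses the wire, so it still counts against the site's bandwidth. A Python sketch using a throwaway local server (standing in for the target site) illustrates this:

```python
import http.server
import threading
import urllib.request

# A known 64 KB payload served locally, in place of a real site.
PAYLOAD = b"x" * 65536

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", str(len(PAYLOAD)))
        self.end_headers()
        self.wfile.write(PAYLOAD)

    def log_message(self, *args):  # silence per-request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client reads the body and throws it away without "displaying" it,
# but the bytes were transferred over the network regardless.
with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/file") as resp:
    received = len(resp.read())

server.shutdown()
print(received)  # → 65536
```

Discarding the body only saves work on the client side; the server has already spent the bandwidth by the time the response arrives.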

Re: Bandwidth Overload Through GET Requests?
Posted by: DoctorDan
Date: August 19, 2007 12:20AM

Keep in mind that the machine the Perl script is running on will be using bandwidth (and memory) as well! You would probably need many computers doing this at the same time, especially with any large site. It's illegal, too.

-Dan

Re: Bandwidth Overload Through GET Requests?
Posted by: Super-Friez
Date: August 21, 2007 10:33AM

DoctorDan Wrote:
-------------------------------------------------------
> Keep in mind that the machine the Perl script
> is running on will be using bandwidth (and
> memory) as well! You would probably need many
> computers doing this at the same time,
> especially with any large site. It's illegal, too.
>
> -Dan

Well, I knew it would use memory, but I didn't know how many computers I'd need. I also didn't know it was illegal! Good thing I didn't try it!

Re: Bandwidth Overload Through GET Requests?
Posted by: DoctorDan
Date: August 21, 2007 11:04AM

The first time I read this, I perceived it as a DoS. Now I understand what it's doing. It's more about a website on shared hosting with a bandwidth limit, right?

Got it; my post may not have been as relevant as I thought. This would work, and it would be pretty funny! I wonder how long it would take for 40GB, for example, to be downloaded, though. I also wonder whether a few computers doing this would be necessary, considering upload/download speeds.

-Dan
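How long 40 GB would take depends entirely on the line speed; a rough calculation, assuming a hypothetical 10 Mbit/s link on the slower side:

```python
total_gb = 40               # the transfer size Dan asks about
link_mbit_per_s = 10        # hypothetical speed of the slower end of the link

total_bits = total_gb * 1e9 * 8
seconds = total_bits / (link_mbit_per_s * 1e6)
hours = seconds / 3600

print(round(hours, 1))  # → 8.9
```

So a single machine on such a link would need the better part of a day, and splitting the work across a few machines would shorten it roughly in proportion.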
