Bytes IT Community

Benchmarking

Hi,

I am looking for a tool to benchmark a page.
I want to feed it an HTTP address and get back the time
it takes to download the whole page. The whole page means:
the HTML page, the external JavaScript and CSS, all images,
and, if it is a frameset, all sub-pages with their material.

In short: how long does it take until the whole
page is in the browser?

Googling for 'html benchmarking' doesn't help.
Apache's 'ab' only fetches one page.
JMeter seems too complex, and I don't want to run
a stress test against the server.
best regards
jens himmelreich
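[Editor's note: a rough sketch of such a timer can be put together with nothing but Python's standard library. This is a hypothetical script, not a polished tool: it ignores framesets, caching, and the parallel connections a real browser uses, so treat the number it prints as an upper-bound estimate.]

```python
# Rough page-load timer: fetch an HTML page plus the resources it
# references, entirely in memory, and report the total elapsed time.
# Sketch only -- no frame recursion, no CSS url() handling, no
# parallel fetches like a real browser would do.
import time
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class ResourceFinder(HTMLParser):
    """Collect src/href URLs of images, scripts, frames and stylesheets."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script", "frame", "iframe") and attrs.get("src"):
            self.resources.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.resources.append(attrs["href"])

def time_page(url):
    """Return the wall-clock seconds to fetch the page and its resources."""
    start = time.perf_counter()
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    finder = ResourceFinder()
    finder.feed(html)
    for res in finder.resources:
        # Read each resource into memory only; nothing is written to disk.
        urllib.request.urlopen(urljoin(url, res)).read()
    return time.perf_counter() - start

# Example usage (requires a reachable server):
#   print("%.3f seconds" % time_page("http://localhost/index.html"))
```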


Jul 20 '05 #1
6 Replies


in post: <news:bu************@ID-121668.news.uni-berlin.de>
"Jens Himmelreich" <je**@uni-bremen.de> said:
I am looking for a tool to benchmark a page.
I want to feed it an HTTP address and get back the time
it takes to download the whole page. The whole page means:
the HTML page, the external JavaScript and CSS, all images,
and, if it is a frameset, all sub-pages with their material.


what's wrong with the link i gave last week?

http://www.websiteoptimization.com/services/analyze/
--
brucie - i usenet nude
Jul 20 '05 #2

Hi brucie,
"brucie" <sh**@bruciesusenetshit.info> wrote:

what's wrong with the link i gave last week?

http://www.websiteoptimization.com/services/analyze/


The tool is great, thanks for the link.
But not all the pages I develop are accessible from the Internet.
I am looking for a tool that I can download.
A downloadable version of the same tool as in your link
would be great.
Jul 20 '05 #3

On Wed, 21 Jan 2004 09:49:15 +0100, Jens Himmelreich wrote:
I am looking for a tool to benchmark a page. I want to feed it an
HTTP address and get back the time it takes to download the whole
page. The whole page means: the HTML page, the external JavaScript and
CSS, all images, and, if it is a frameset, all sub-pages with their material.


wget will download a page and all supporting files if you use the
--page-requisites option. You should be able to time how long it takes to
do that.
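[Editor's note: the timing itself can be done with the shell's `time` builtin in front of the wget command, or scripted. A minimal Python wrapper might look like the following; it assumes wget is installed and on the PATH, and the directory name is just an example.]

```python
# Time an external download command end to end.
# Sketch only: assumes the command (e.g. wget) exists on the PATH.
import subprocess
import time

def timed_run(cmd):
    """Run a command to completion and return the elapsed wall-clock seconds."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

# Example usage (requires wget and network access):
#   elapsed = timed_run(["wget", "--quiet", "--page-requisites",
#                        "--directory-prefix=/tmp/pagetest",
#                        "http://example.com/"])
#   print("%.3f seconds" % elapsed)
```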

From your message headers I'm guessing you're using Windows; according to
Google there's a win32 port at http://xoomer.virgilio.it/hherold/.

- olly

--
Oliver Burnett-Hall
rot13://by**@oheargg-unyy.pb.hx
Jul 20 '05 #4


"Oliver Burnett-Hall" <by**@oheargg-unyy.ph.hx> wrote
wget will download a page and all supporting files if you use the
--page-requisites option. You should be able to time how long it takes to
do that.


Thanks for your hint!
I tried this, but there is the overhead of the disk I/O that wget performs.
I think there must be a tool that simulates the browser's load process.

best regards
jens
Jul 20 '05 #5

"Jens Himmelreich" <je**@uni-bremen.de> writes:
"Oliver Burnett-Hall" <by**@oheargg-unyy.ph.hx> wrote
wget will download a page and all supporting files if you use the
--page-requisites option. You should be able to time how long it takes to
do that.


Thanks for your hint!
I tried this, but there is the overhead of the disk I/O that wget performs.
I think there must be a tool that simulates the browser's load process.


And wouldn't a browser perform comparable I/O when putting the files
into its cache?

You could make a ramdisk and get wget to save to that, I suppose.

--
Chris
Jul 20 '05 #6


"Chris Morris" <c.********@durham.ac.uk> schrieb im Newsbeitrag
news:87************@dinopsis.dur.ac.uk...
"Jens Himmelreich" <je**@uni-bremen.de> writes:
"Oliver Burnett-Hall" <by**@oheargg-unyy.ph.hx> wrote
wget will download a page and all supporting files if you use the
--page-requisites option. You should be able to time how long it takes to do that.


Thanks for your hint!
I tried this, but there is the overhead of the disk I/O that wget performs.
I think there must be a tool that simulates the browser's load process.


And wouldn't a browser perform comparable I/O when putting the files
into its cache?


Good idea!
But there are other problems: wget with '--page-requisites' doesn't fetch all
requisites. It has trouble with image links inside CSS files.
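[Editor's note: the gap described here, resources referenced from inside stylesheets, can be papered over by scanning the downloaded CSS for url(...) references and fetching those too. A rough sketch follows; the regex is simplistic and deliberately ignores @import rules, comments, and some quoting edge cases.]

```python
# Extract url(...) references from CSS text -- the resources that
# wget --page-requisites misses. Simplistic sketch: does not handle
# @import, CSS comments, or every quoting variant.
import re
from urllib.parse import urljoin

CSS_URL = re.compile(r"url\(\s*['\"]?([^'\")]+)['\"]?\s*\)")

def css_resources(css_text, base_url):
    """Return absolute URLs of resources referenced by a stylesheet."""
    return [urljoin(base_url, ref) for ref in CSS_URL.findall(css_text)]

# Example usage:
#   css_resources("body { background: url('img/bg.png'); }",
#                 "http://example.com/css/site.css")
```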
best regards
jens himmelreich
Jul 20 '05 #7

This discussion thread is closed; replies have been disabled.