Full check of web pages

Hi!

well, here is my problem:
I’ve been asked many times to check the loading time of a complex HTML page (i.e. a page with iframes, advertisements, GIFs, JPEGs, etc.).
The idea is to check the time it takes for a user to be able to see the whole page!


and I have absolutely no idea how to do that.
I’m getting fairly experienced with the check_http plugin and other plugins I wrote myself, but they all rely on issuing a GET request and parsing the source code of the page
=> I don’t know how to retrieve all the elements the page pulls in.

Can anyone help me?
Do you know of any plugin that could do that? (I’ve just tried WebInject, but unless I’m wrong, it doesn’t do what I want.)

Thank you for your help!

Hi Loose

I feel your pain. I don’t know whether “wget”-ting the page with the -p option is what you’re after? From the wget man page:

-p
--page-requisites
This option causes Wget to download all the files that are necessary to properly display a given HTML page.
This includes such things as inlined images, sounds, and referenced stylesheets.
...etc

I figure timing that operation would be the way forward.
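
A minimal sketch of how that could be wrapped into a Nagios-style check, in the spirit of the plugins you already wrote (the URL and the WARN/CRIT thresholds are placeholders, and the wget options would need tuning for your pages):

#!/bin/sh
# Rough sketch: time a full fetch of the page plus its requisites
# (inlined images, stylesheets, etc.) and exit Nagios-style.
URL="http://www.example.com/"   # placeholder, put your page here
WARN=10   # seconds before WARNING
CRIT=30   # seconds before CRITICAL

START=$(date +%s)
# --delete-after cleans up the downloaded files once the timing is done;
# note that --timeout applies per connection, not as an overall cap.
wget --page-requisites --no-directories --delete-after \
     --timeout="$CRIT" --tries=1 --quiet "$URL"
END=$(date +%s)
ELAPSED=$((END - START))

if [ "$ELAPSED" -ge "$CRIT" ]; then
    echo "CRITICAL: full page load took ${ELAPSED}s"
    exit 2
elif [ "$ELAPSED" -ge "$WARN" ]; then
    echo "WARNING: full page load took ${ELAPSED}s"
    exit 1
fi
echo "OK: full page load took ${ELAPSED}s"
exit 0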
I always wonder, though, when I’m asked to perform this type of check on public-facing pages, how it can possibly be of any use to time something over a LAN or high-speed corporate WAN connection when Joe User might be on some low-spec ADSL connection or, god forbid, 56k dialup… Of course, there’s a --limit-rate option with wget too, but I try to avoid telling them that, as all of a sudden “1 check” becomes “1 check for 56k, 1 check for 512k, 1 check for 2Mb…” lol!
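
(For completeness, the throttled variant is just the same fetch with --limit-rate added; 56 kbit/s works out to roughly 7 kB/s, so something like:

# Same fetch as above, throttled to roughly dialup speed (~7 kB/s).
wget --page-requisites --no-directories --delete-after \
     --limit-rate=7k --quiet "http://www.example.com/"

…but you didn’t hear it from me!)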

Any road up, enough ranting… Hope that helps

/S

Great!
I didn’t even know about wget ;)

I’m going to try that.

btw: the idea is just to know whether the whole page displays or not on our high-speed WAN… if someone attempts to connect from a 56k line… sorry for him :) (these kinds of pages are not created for that kind of connection anyway ^^)

thanks again