Bytes | Software Development & Data Engineering Community

Dynamic Content in High-Traffic Sites

Hi.

This is just a disaster management question.
I am using XMLHTTP for the dynamic loading of content in a very
crucial area of my web site. Same as an IFrame, but using XMLHTTP and
a DIV. I got the core of the javascript from here:

http://www.dynamicdrive.com/dynamici...jaxcontent.htm

I noticed in the demo that sometimes the content takes a long
time to load. That is not the case with my Dev Box, but, then again,
all the hardware is right here.

I am wondering if using XMLHTTP for the dynamic loading of content
could hurt performance as the site starts to gain in popularity?
I don't want to lose customers because of content wait-time.

Is this a risk?

Thanks.

Jun 22 '07 #1
On Jun 22, 4:58 pm, pbd22 <dush...@gmail.com> wrote:
> [original question snipped]
post a url and we can help with specific advice. You have to get the
content from somewhere, but using xhr is probably not the best way to
scale up a site, no. Instead, consider using just one large page
containing many divs, each named after the content you need in the
"main" div; then, instead of xhr with multiple http requests, you just
swap the divs around. Job done, completely scalable no matter how
large the site. However, it all depends what kind of site you are
running - dynamic content etc. - which is why, when you ask a question
like this, a url is good; your site will be public one day anyway.
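(For illustration only - a bare-bones sketch of that div-swapping
idea, assuming each piece of content lives in a div whose id matches
its name; the section ids here are hypothetical:)

<script type="text/javascript">
// Hide every content div, then reveal only the one requested.
function swapTo(name) {
    var sections = ["home", "contact", "sales"]; // hypothetical ids
    for (var i = 0; i < sections.length; i++) {
        document.getElementById(sections[i]).style.display =
            (sections[i] === name) ? "block" : "none";
    }
}
</script>
<!-- e.g. <a href="#sales" onclick="swapTo('sales'); return false;">Sales</a> -->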

Jun 22 '07 #2
On Jun 22, 7:06 pm, shimmyshack <matt.fa...@gmail.com> wrote:
> [quoted original question snipped]

> post a url and we can help with specific advice. [...] Instead,
> consider using just one large page containing many divs, each named
> after the content you need in the "main" div; then, instead of xhr
> with multiple http requests, you just swap the divs around. Job done,
> completely scalable no matter how large the site. [...]
Don't you think this is inappropriate advice? Can you imagine, e.g.,
gmail loading all data in DIVs and then swapping the divs around? I
don't think this would be a good idea. Maybe if the amount of data is
really small your advice could be acceptable, but in general, loading
all the content and then displaying just what is needed is a very bad
solution, for both the server and the client. Please correct me if I
misunderstood you.

To the topic author: no, xmlhttp is not so bad to use; as a matter of
fact it has become quite popular, and if you're talking about loading
just specific elements on the page it is the best approach. If you're
loading complete pages, then stick to the traditional navigating to
the page of interest. If you look around, you'll find out that almost
all popular webmail applications use ajax, and if speed was an issue,
they certainly wouldn't accept it. Ajax actually speeds things up,
because you don't have to load a whole new page in order to display a
single new piece of information to the user.

Jun 22 '07 #3
pbd22 <du*****@gmail.com> wrote in
news:1182527910.053496.125620@n60g2000hse.googlegroups.com:
> [quoted text snipped]
just a thought... a server-side solution here:

you could have a cron job run every x minutes that grabs the data you
need from the site and writes it to a file on the server. Then your
page just 'includes' that file for instant display.

It's a technique I used to show 'live' border delays on my website's
index page... rather than have a call to the US Border website every
time my homepage is loaded, my cron job runs every 5 minutes and
writes the border delays (pre-formatted) to a file on my server... and
my index just reads that file, very quickly.

Again, it's a server-side solution, but it would undoubtedly create
less overhead and fewer traffic delays.
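(A minimal sketch of that cron-plus-cache idea - not the poster's
actual setup; the URL, the file paths, and the choice of server-side
JavaScript (Node.js) are all assumptions for illustration:)

// fetch-delays.js - run by cron; fetches the remote data once and
// caches it to a file that the homepage includes on every hit.
// Hypothetical crontab entry (every 5 minutes):
//   */5 * * * * /usr/bin/node /var/www/jobs/fetch-delays.js
var http = require("http");
var fs = require("fs");

http.get("http://example.com/border-delays", function (res) {
    var body = "";
    res.on("data", function (chunk) { body += chunk; });
    res.on("end", function () {
        // Pre-format once here so the page can include the file as-is.
        fs.writeFile("/var/www/cache/delays.html", body, function (err) {
            if (err) console.error("cache write failed:", err);
        });
    });
}).on("error", function (err) {
    console.error("fetch failed:", err);
});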

Jun 22 '07 #4
Darko wrote:
> [earlier quoted text snipped]

> Don't you think this is inappropriate advice? Can you imagine, e.g.,
> gmail loading all data in DIVs and then swapping the divs around?
> [...] in general, loading all the content and then displaying just
> what is needed is a very bad solution, for both the server and the
> client. Please correct me if I misunderstood you.
No, actually it isn't. And I am sure from shimmyshack's post history it
is evident he didn't mean load 19 megabytes' worth of markup and then
switch it via hidden and shown DIVs.

And in fact, bytefx uses this very same method.

http://www.devpro.it/bytefx/

The only problem arises when JavaScript is disabled, so you should
also make sure each DIV has a named anchor in or near it, so links (or
whatever you are clicking) can still allow you to reach that section
of information. Or just provide links to the pages in question, which
is also acceptable, degradable JavaScript.
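(A sketch of that degradable pattern, with hypothetical names - the
href targets a named anchor so the link still works with JavaScript
disabled, while swapTo() stands in for whatever swap function the
page actually uses:)

<a href="#contact" onclick="swapTo('contact'); return false;">Contact</a>
...
<div id="contact">
    <!-- The named anchor keeps this section reachable without JS. -->
    <a name="contact"></a>
    <h2>Contact Us</h2>
    ...
</div>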
> To the topic author: no, xmlhttp is not so bad to use; as a matter of
> fact it has become quite popular, and if you're talking about loading
> just specific elements on the page it is the best approach. [...]
> Ajax actually speeds things up, because you don't have to load a
> whole new page in order to display a single new piece of information
> to the user.
Your suggestion is just as bad; in my opinion, worse. You recommend
using XMLHttpRequest based on its popularity? The original poster was
asking a "disaster (management) recovery" question, so recommending a
technology simply because you think it is a popular option is not
wise.

Regardless of whether or not it is suitable or scalable.

--
-Lost
Remove the extra words to reply by e-mail. Don't e-mail me. I am
kidding. No I am not.
Jun 22 '07 #5
On Jun 22, 7:32 pm, Darko <darko.maksimo...@gmail.com> wrote:
> [earlier quoted text snipped]

> Don't you think this is inappropriate advice? [...] in general,
> loading all the content and then displaying just what is needed is a
> very bad solution, for both the server and the client.
no, not bad advice really - my caveat was that it depends on the site.
The reason ajax is such a good idea for an email site is that it is
entirely dynamic; however, gmail and others actually "preload" a
massive amount of speculative data, a huge amount of content - in case
it is needed - not as divs but as json, and "swap" it into the div
with no further set-up and tear-down http costs. (Of course, anything
kept in divs is already in the DOM, which means you should keep their
number down.)
Here are the stats for a single hit to gmail - 700KB of preloaded
data, most of which is text/html:
text/javascript: 93,471
text/html: 534,526
image/x-icon: 1,150
flash: 7,106
~headers: 17,274
image/png: 27,551
text/plain: 36,287
image/gif: 11,515
Even this is deceptive: the images are loaded using the same
technique. A single image containing 20 or so "views" is loaded
because it decreases the number of http requests needed for the
application; a large amount of text/css is then loaded to control the
position of that image, about 1KB of extra text per image, but it is
still a fantastic trade-off. The technique of preloading content
speculatively is just the same, except that it requires js
controllers, and lots of extra text/html disguised as javascript,
similar to json.

If a website is static and you had, say, 10 pages, the cost of
downloading the data would be only a few KB; you could of course
download it as json strings in javascript, but all at once,
speculatively, with a single call to a page. Also, gmail and the
others are a bad example to use in support of AJAX as a good method
for swapping content: they have huge redundancy headroom, so
scalability is not a problem for them. For this guy with one single
server, minimising http requests is a great solution and worth a few
KB, which is cacheable - unless he is running a site full of dynamic
content, a caveat I gave in my last post.
> To the topic author: no, xmlhttp is not so bad to use; as a matter of
> fact it has become quite popular, and if you're talking about loading
> just specific elements on the page it is the best approach. If you're
Popular as in fashionable; xhr is not a solution to most of the
problems it is being used for. For instance, where is the
accessibility of xhr when it relies so much on javascript? xhr should
be used only where it makes sense, and using it to deliver bits of
pages just isn't good enough when a good old-fashioned, accessible <a
href="page2.htm">page 2</a> works just the same, is cached,
accessible, and just as fast - if you have your images, js and css
sent with the correct headers.
> loading complete pages, then stick to the traditional navigating to
> the page of interest. If you look around, you'll find out that almost
> all popular webmail applications use ajax,
you will also find that with javascript turned off, these applications
degrade gracefully - a fact most people who use ajax to create their
websites ignore. These applications tend to use ajax because it makes
sense in their environment: gmail, youtube and other highly changeable
content sites must use xhr, but facebook, with less changeable
content, doesn't rely on ajax, and amazon, ebay, bebo, even flickr
make comparatively small use of ajax.
> and if speed was an issue, they certainly wouldn't accept it.
It's not speed, it's the concurrent http requests that make it a
scalability nightmare, unless it is absolutely needed - as opposed to
loading

<html><head>
<script>
var objPreloadedContent =
{
    "pages":
    [
        {
            "div_heading": "contact",
            "div_title":   "Contact Us",
            "div_content": "<p>Please use this form to contact us...",
            "div_footer":  "Please feel free to contact us in any way you wish"
        },
        {
            "div_heading": "sales",
            "div_title":   "Sales Department",
            "div_content": "<p>Here is a list of our sales team...",
            "div_footer":  "We are happy to sell sell sell.."
        }
    ]
};
</script></head><body>...

within the homepage and swapping content around as needed.
> Ajax actually speeds things up, because you don't have to load a
> whole new page in order to display a single new piece of information
> to the user.
but using ajax in this "single piece" mode means you do have to
request the data piecewise, so you get latency, header overhead, http
costs, and server cpu costs. AJAX is in general a waste of resources,
unless you have a clear need that cannot be met by a more conventional
approach.
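(For concreteness, a minimal sketch - not from the thread - of the
kind of js controller that would swap the preloaded content into a
hypothetical <div id="main">, assuming the objPreloadedContent
structure above:)

function showPage(heading) {
    var pages = objPreloadedContent.pages;
    for (var i = 0; i < pages.length; i++) {
        if (pages[i].div_heading === heading) {
            // One DOM write, zero extra http requests.
            document.getElementById("main").innerHTML =
                "<h1>" + pages[i].div_title + "</h1>" +
                pages[i].div_content +
                "<p>" + pages[i].div_footer + "</p>";
            return;
        }
    }
}
// e.g. showPage("contact"); wired to an ordinary link's onclick.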

Jun 22 '07 #6
On Jun 22, 2:01 pm, shimmyshack <matt.fa...@gmail.com> wrote:
> [quoted text snipped]
ShimmyShack -

Thanks. I think I am going to go with multiple DIVs and manipulation
of display:none. One question along those lines, though: this will
mean that the page will have a massive amount of HTML with tons (I
mean tons) of hidden <TR> elements. Is there any harm here?

Thanks again.

Jun 23 '07 #7
On Jun 23, 6:25 pm, pbd22 <dush...@gmail.com> wrote:
> [earlier quoted text snipped]
>
> Thanks. I think I am going to go with multiple DIVs and manipulation
> of display:none. [...] the page will have a massive amount of HTML
> with tons (I mean tons) of hidden <TR> elements. Is there any harm
> here?
well, I was assuming your code would be a few pages of content (one
piece of content per div), but remember that any content you store in
divs will be rendered unless the divs are hard coded in the markup
with display:none - setting display:none via CSS after page load
wouldn't be good, because the whole page would have to render before
the divs are hidden. So yes, storing a lot of hidden tables is not
good; a "ton" of TRs just shouldn't exist any more - it has been years
since css replaced the need for tables!!
I was assuming your code was modern semantic markup with css for the
display/look&feel. If you are using tables, then you could consider
storing your contents as javascript strings of pure text and creating
the table dynamically; however, in the end, the best bet might well be
to simply use good old-fashioned links until your markup is modern,
and then apply the modern techniques to it, ending up with a very
clean, easy-to-update site.
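(As a sketch of that last idea - the data and the "main" id are
hypothetical - rows are kept as plain javascript strings and the table
is built only when its section is shown, instead of hiding hundreds of
<TR> elements in the initial markup:)

var salesRows = [
    ["Alice", "alice@example.com"],
    ["Bob",   "bob@example.com"]
];

function buildTable(rows) {
    var table = document.createElement("table");
    for (var i = 0; i < rows.length; i++) {
        var tr = table.insertRow(-1);          // append a row
        for (var j = 0; j < rows[i].length; j++) {
            tr.insertCell(-1).appendChild(     // append a cell
                document.createTextNode(rows[i][j]));
        }
    }
    return table;
}

// Build on demand, e.g. when the "sales" section is first shown:
document.getElementById("main").appendChild(buildTable(salesRows));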

Jun 24 '07 #8
On Jun 24, 4:21 pm, shimmyshack <matt.fa...@gmail.com> wrote:
> [earlier quoted text snipped]
>
> I was assuming your code was modern semantic markup with css for the
> display/look&feel. [...] the best bet might well be to simply use
> good old-fashioned links until your markup is modern, and then apply
> the modern techniques to it, ending up with a very clean,
> easy-to-update site.
Hi, thanks. I have no problem trying to learn "the right way" to do
things once I know what those things are. By "modern techniques" do
you mean a page with only DIV tags and CSS - no tables at all? I am
assuming this is the way to go? Thanks again.

Jun 25 '07 #9
On Jun 25, 12:42 pm, pbd22 <dush...@gmail.com> wrote:
> [earlier quoted text snipped]
>
> By "modern techniques" do you mean a page with only DIV tags and CSS
> - no tables at all? I am assuming this is the way to go?
yeah, just the type of html/xhtml that you choose to use.
Bear in mind that it takes a bit of thinking to change, but the
learning curve is WELL worth it. Back in 2004 there was this huge
table-driven site where each page's source code printed to 7 A4 pages
of closely packed text; with css, it went down to 2 nicely formatted
pages. It's better for you, for those who use assistive devices, for
your search engine rating, and for your clients, as the rendering time
is slashed.
I recommend checking out sites like http://alistapart.com/
Check out the source code to alistapart and see no tables!
The front page looks as if it could use tables, but download firefox
and use "View -> Page Style -> No Style" to see that it is just css
styling that produces the site's look and feel. That means there are
just a couple of separate documents, included in the head section of
each html page, that dictate the entire look and feel of the website:
if you feel like a change, just change the css document and your whole
site completely changes in an instant; or offer multiple look&feels
for those who require high visibility, allow your site to zoom in,
etc... all with no changes to any of the new-style html you are going
to write.
Tools you can use include the web developer extension for firefox - to
highlight all the elements (<p>, <h1>, <h2>... <ul>) that you will
start using more often now, and to see what the bounding box for these
elements looks like and how to shimmy them around in the page using
css. You can use firebug to edit the css live, or other extensions
like that, and you're on your way.
Consider that when you use javascript for functionality in your pages,
it should not be "core" to the website; it should add to an already
working website. So code your website to work in the old-fashioned
way, then add a layer of unobtrusive javascript over the top that
hijacks the links and does the fancy stuff.
Once you start using css+(x)html you won't be worrying about
maintainability; you won't mind having 20 pages of markup per site;
you will find it easier to code a website, hijacking it afterwards,
and your work is done. The old table-based sites are so hard to
maintain once a change is needed that the work involved means you
reach around for shortcuts, dragging in content from iframes and so
on and on...
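(To make that "unobtrusive layer" concrete, a minimal sketch - the
"hijack" class name and loadIntoMain() are hypothetical stand-ins for
whatever in-page behaviour the site ends up with:)

window.onload = function () {
    var links = document.getElementsByTagName("a");
    for (var i = 0; i < links.length; i++) {
        if (links[i].className === "hijack") {
            links[i].onclick = function () {
                loadIntoMain(this.href); // fancy in-page behaviour
                return false;            // cancel normal navigation
            };
        }
    }
    // With javascript off, this never runs and every link still
    // works as a plain, accessible page load.
};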

Jun 25 '07 #10
