
Dynamic Content in High-Traffic Sites

Hi.

This is just a disaster management question.
I am using XMLHTTP for the dynamic loading of content in a very
crucial area of my web site. Same as an IFrame, but using XMLHTTP and
a DIV. I got the core of the javascript from here:

http://www.dynamicdrive.com/dynamici...jaxcontent.htm

I noticed in the demo that sometimes the content takes a long
time to load. That is not the case with my Dev Box, but, then again,
all the hardware is right here.

I am wondering if using XMLHTTP for the dynamic loading of content
could degrade performance as the site starts to gain in popularity?
I don't want to lose customers because of content wait-time.

Is this a risk?

Thanks.

Jun 22 '07 #1
On Jun 22, 4:58 pm, pbd22 <dush...@gmail.com> wrote:
> I am using XMLHTTP for the dynamic loading of content in a very
> crucial area of my web site. Same as an IFrame, but using XMLHTTP and
> a DIV. [snip]
> I am wondering if using XMLHTTP for the dynamic loading of content
> could degrade performance as the site starts to gain in popularity?
> Is this a risk?
Post a URL and we can help with specific advice. You have to get the
content from somewhere, but using XHR is probably not the best way to
scale up a site, no. Instead, consider using just one large page
containing many divs, each named after the content you need in the
"main" div; then, instead of XHR with multiple HTTP requests, you just
swap the divs around. Job done, and completely scalable no matter how
large the site. However, it all depends what kind of site you are
running - dynamic content, etc. - which is why a URL is good when you
ask a question like this; your site will be public one day anyway.
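
For what it's worth, the swap itself is only a few lines of
JavaScript - a minimal sketch, assuming the section ids below (they
are made up for illustration):

// minimal sketch of the div-swap idea: every section is already in
// the page; only one is visible at a time (section ids are made up)
var sectionIds = ['home', 'contact', 'sales'];
function showSection(id) {
  for (var i = 0; i < sectionIds.length; i++) {
    var div = document.getElementById(sectionIds[i]);
    div.style.display = (sectionIds[i] === id) ? 'block' : 'none';
  }
}
// e.g. <a href="#sales" onclick="showSection('sales'); return false;">Sales</a>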

Jun 22 '07 #2
On Jun 22, 7:06 pm, shimmyshack <matt.fa...@gmail.com> wrote:
> On Jun 22, 4:58 pm, pbd22 <dush...@gmail.com> wrote:
>> [original question snipped]
> Post a URL and we can help with specific advice. [snip] Instead,
> consider using just one large page containing many divs, each named
> after the content you need in the "main" div; then, instead of XHR
> with multiple HTTP requests, you just swap the divs around.
Don't you think this is inappropriate advice? Can you imagine, e.g.,
Gmail loading all data in DIVs and then swapping the divs around? I
don't think this would be a good idea. Maybe if the amount of data is
really small your advice could be acceptable, but in general, loading
all the content and then displaying just what is needed is a very bad
solution, for both the server and the client. Please correct me if I
misunderstood you.

To the topic author: no, XMLHTTP is not so bad to use; as a matter of
fact it has become quite popular, and if you're talking about loading
just specific elements on the page it is the best idea. If you're
loading complete pages, then stick to traditional navigation to the
page of interest. If you look around, you'll find that almost all
popular webmail applications use Ajax, and if speed were an issue,
they certainly wouldn't accept it. Ajax actually speeds things up,
because you don't have to load a whole new page in order to display
a single piece of new information to the user.
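
For reference, the partial-update pattern under discussion is only a
few lines of XHR - a minimal sketch (the URL and element id here are
hypothetical, not from the original site):

// minimal sketch: load one HTML fragment into a div
// (the fragment URL and target id are made up)
function loadFragment(url, targetId) {
  var xhr = window.XMLHttpRequest ? new XMLHttpRequest()
                                  : new ActiveXObject('Microsoft.XMLHTTP');
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById(targetId).innerHTML = xhr.responseText;
    }
  };
  xhr.open('GET', url, true);
  xhr.send(null);
}
// e.g. loadFragment('/fragments/sales.html', 'main');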

Jun 22 '07 #3
pbd22 <du*****@gmail.com> wrote in
news:1182527910.053496.125620@n60g2000hse.googlegroups.com:
> I am using XMLHTTP for the dynamic loading of content in a very
> crucial area of my web site. Same as an IFrame, but using XMLHTTP and
> a DIV. [snip]
Just a thought... a server-side solution here:

You could have a cron job run every X minutes that grabs the data you
need from the site and writes it to a file on the server. Then your
page just 'includes' that file for instant display.

It's a technique I used to show 'live' border delays on my website's
index page... rather than make a call to the US Border website every
time my homepage is loaded, my cron job runs every 5 minutes and
writes the border delays (pre-formatted) to a file on my server... and
my index just reads that file, very quickly.

Again, it's a server-side solution, but it would undoubtedly create
less overhead and fewer traffic delays.
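
A minimal sketch of that fetch-and-cache idea, written here as a
Node.js script purely for illustration (the URL, output path, and
schedule are all made up; the poster's actual setup is not specified):

// cache-border-delays.js - sketch of the cron-cache idea
// crontab entry (runs every 5 minutes, path is made up):
//   */5 * * * * node /var/www/cron/cache-border-delays.js
var http = require('http');
var fs = require('fs');

http.get('http://example.com/border-delays', function (res) {
  res.setEncoding('utf8');
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () {
    // pre-format once here, so page loads only ever read the file
    fs.writeFileSync('/var/www/html/includes/border-delays.html', body);
  });
}).on('error', function (err) {
  // on failure, keep the last good cached file; just log the error
  console.error('fetch failed:', err.message);
});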

Jun 22 '07 #4
Darko wrote:
> On Jun 22, 7:06 pm, shimmyshack <matt.fa...@gmail.com> wrote:
>> [original question and the div-swapping suggestion snipped]

> Don't you think this is inappropriate advice? Can you imagine, e.g.,
> Gmail loading all data in DIVs and then swapping the divs around? I
> don't think this would be a good idea. Maybe if the amount of data is
> really small your advice could be acceptable, but in general, loading
> all the content and then displaying just what is needed is a very bad
> solution, for both the server and the client. Please correct me if I
> misunderstood you.
No, actually it isn't. And I am sure from shimmyshack's post history it
is evident he didn't mean load 19 megabytes' worth of markup and then
switch it via hidden and shown DIVs.

And in fact, bytefx uses this very same method.

http://www.devpro.it/bytefx/

The only problem arises when JavaScript is disabled, so you should
also make sure each DIV has a named anchor in or near it, so links (or
whatever you are clicking) can still allow you to reach that section
of information. Or just provide links to the pages in question, which
is also acceptable, degradable JavaScript.
> To the topic author: no, XMLHTTP is not so bad to use; as a matter of
> fact it has become quite popular, and if you're talking about loading
> just specific elements on the page it is the best idea. [snip] Ajax
> actually speeds things up, because you don't have to load a whole new
> page in order to display a single piece of new information to the
> user.
Your suggestion is just as bad - in my opinion, worse. You recommend
using XMLHttpRequests based on their popularity? The original poster
was asking a "disaster (management) recovery" question. So recommending
he use a technology simply because you think it is a popular option is
not wise.

Regardless of whether or not it is suitable or scalable.

--
-Lost
Remove the extra words to reply by e-mail. Don't e-mail me. I am
kidding. No I am not.
Jun 22 '07 #5
On Jun 22, 7:32 pm, Darko <darko.maksimo...@gmail.com> wrote:
> Don't you think this is inappropriate advice? Can you imagine, e.g.,
> Gmail loading all data in DIVs and then swapping the divs around?
> [snip] Please correct me if I misunderstood you.
No, not bad advice really - my caveat was that it depends on the site.
The reason Ajax is such a good idea for an email site is that it's
entirely dynamic; however, Gmail and others actually "preload" a
massive amount of speculative data - a huge amount of content, in case
it is needed - not as divs but as JSON, and swap it into the div with
no further set-up and tear-down HTTP costs. (Of course, any divs are
already in the DOM, which means you should keep their number down.)
Here are the stats for a single hit to Gmail - about 700KB of
preloaded data, most of which is text/html:
text/javascript: 93,471
text/html: 534,526
image/x-icon: 1,150
flash: 7,106
~headers: 17,274
image/png: 27,551
text/plain: 36,287
image/gif: 11,515
Even this is deceptive: the images are loaded using the same
technique - a single image containing 20 or so "views" is loaded,
because it decreases the number of HTTP requests the application
needs. A large amount of text/css is then loaded to control the
positioning of that image, about 1KB of extra text per image, but it
is still a fantastic trade-off. The technique of preloading content
speculatively is just the same, except that it requires JS controllers
and lots of extra text/html disguised as JavaScript, similar to JSON.

If a website is static and you had, say, 10 pages, the cost of
downloading the data would be only a few KB; you could of course
download it as JSON strings in JavaScript, but all at once,
speculatively, with a single call to a page. Also, Gmail and the
others are a bad example to use in support of AJAX as a good method
for swapping content: they have huge redundancy headroom, so
scalability is not a problem for them. For this guy, with one single
server, minimising HTTP requests is a great solution and worth a few
KB, which is cacheable - unless he is running a site full of dynamic
content, a caveat I gave in my last post.
> To the topic author: no, XMLHTTP is not so bad to use; as a matter of
> fact it has become quite popular, and if you're talking about loading
> just specific elements on the page it is the best idea.
Popular as in fashionable; XHR is not a solution to most of the
problems it is being used for. For instance, where is the
accessibility of XHR, given it relies so much on JavaScript? XHR
should be used only where it makes sense, and using it to deliver bits
of pages just isn't good enough when a good old-fashioned, accessible
<a href="page2.htm">page 2</a> works just the same, is cached,
accessible, and just as fast - if you have your images, JS and CSS
sent with the correct headers.
> If you're loading complete pages, then stick to traditional
> navigation to the page of interest. If you look around, you'll find
> that almost all popular webmail applications use Ajax,
You will also find that, with JavaScript turned off, these
applications degrade gracefully - a fact most people who use Ajax to
create their websites ignore. These applications tend to use Ajax
because it makes sense in their environment: Gmail, YouTube and other
highly changeable content sites must use XHR, but Facebook, with less
changeable content, doesn't rely on Ajax; Amazon, eBay, Bebo, even
Flickr make comparatively small use of Ajax.
> and if speed was an issue, they certainly wouldn't accept it.
It's not speed; it's the concurrent HTTP requests that make it a
scalability nightmare, unless it is absolutely needed - as opposed to
loading

<html><head>
<script>
var objPreloadedContent =
{
  "pages":
  [
    {
      "div_heading": "contact",
      "div_title": "Contact Us",
      "div_content": "<p>Please use this form to contact us...",
      "div_footer": "Please feel free to contact us in any way you wish"
    },
    {
      "div_heading": "sales",
      "div_title": "Sales Department",
      "div_content": "<p>Here is a list of our sales team...",
      "div_footer": "We are happy to sell sell sell.."
    }
  ]
};
</script></head><body>...

within the homepage, and swapping as needed.
> Ajax actually speeds things up, because you don't have to load a
> whole new page in order to display a single piece of new information
> to the user.
But using Ajax in this "single piece" mode means you do have to
request the data piecewise, so you get latency, header overhead, HTTP
costs, server CPU costs.
AJAX is in general a waste of resources, unless you have a clear need
that cannot be met by a more conventional approach.
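
For completeness, the consumer side of the preload above is tiny - a
minimal sketch (the element ids are made up) of swapping one of those
JSON objects into the page without a further HTTP request:

// minimal sketch: swap preloaded content into the page
// (objPreloadedContent as above; the element ids are made up)
function swapPage(heading) {
  var pages = objPreloadedContent.pages;
  for (var i = 0; i < pages.length; i++) {
    if (pages[i].div_heading === heading) {
      document.getElementById('main-title').innerHTML = pages[i].div_title;
      document.getElementById('main').innerHTML = pages[i].div_content;
      document.getElementById('main-footer').innerHTML = pages[i].div_footer;
      return;
    }
  }
}
// e.g. swapPage('sales'); - no HTTP request involved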

Jun 22 '07 #6
On Jun 22, 2:01 pm, shimmyshack <matt.fa...@gmail.com> wrote:
> [long discussion of preloading content as JSON and swapping divs
> snipped]
> AJAX is in general a waste of resources, unless you have a clear need
> that cannot be met by a more conventional approach.
ShimmyShack -

Thanks. I think I am going to go with multiple DIVs and manipulation
of display:none. One question along those lines, though: this will
mean that the page will have a massive amount of HTML with tons (I
mean tons) of hidden <TR> elements. Is there any harm here?

Thanks again.

Jun 23 '07 #7
On Jun 23, 6:25 pm, pbd22 <dush...@gmail.com> wrote:
> Thanks. I think I am going to go with multiple DIVs and manipulation
> of display:none. [snip] This will mean that the page will have a
> massive amount of HTML with tons (I mean tons) of hidden <TR>
> elements. Is there any harm here?
Well, I was assuming your content would be a few pages' worth (one
piece of content per div), but remember that any content you store in
divs will be in the DOM, and will render unless the divs are
hard-coded in the markup with display:none - setting this via CSS
after page load wouldn't be good, because the whole page would have to
render before some divs are hidden. So yes, storing a lot of hidden
tables is not good; a "ton" of TRs just shouldn't exist any more - it
has been years since CSS replaced the need for tables!
I was assuming your code was modern semantic markup, with CSS for the
display/look & feel. If you are using tables, then you could consider
storing your contents as JavaScript strings of pure text and creating
the table dynamically; however, in the end, the best bet might well be
to simply use good old-fashioned links until your markup is modern,
and then apply the modern techniques to it, ending up with a very
clean, easy-to-update site.
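
A minimal sketch of the "store strings, build the table on demand"
idea (the row data and element id are made up for illustration):

// minimal sketch: keep row data as plain JS values, build the table
// only when it is needed - nothing sits hidden in the markup
// (the data and 'main' id are made up)
var salesRows = [
  ['Alice', 'alice@example.com'],
  ['Bob', 'bob@example.com']
];

function buildTable(rows, targetId) {
  var table = document.createElement('table');
  for (var i = 0; i < rows.length; i++) {
    var tr = table.insertRow(-1);
    for (var j = 0; j < rows[i].length; j++) {
      var cell = tr.insertCell(-1);
      cell.appendChild(document.createTextNode(rows[i][j]));
    }
  }
  var target = document.getElementById(targetId);
  target.innerHTML = '';          // drop any previously built table
  target.appendChild(table);
}
// e.g. buildTable(salesRows, 'main');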

Jun 24 '07 #8
On Jun 24, 4:21 pm, shimmyshack <matt.fa...@gmail.com> wrote:
> [snip] storing a lot of hidden tables is not good ... the best bet
> might well be to simply use good old-fashioned links until your
> markup is modern, and then apply the modern techniques to it, ending
> up with a very clean, easy-to-update site.
Hi, thanks. I have no problem trying to learn "the right way" to do
things once I know what those things are. By "modern techniques" do
you mean a page with only DIV tags and CSS - no tables at all? I am
assuming this is the way to go? Thanks again.

Jun 25 '07 #9
On Jun 25, 12:42 pm, pbd22 <dush...@gmail.com> wrote:
> Hi, thanks. I have no problem trying to learn "the right way" to do
> things once I know what those things are. By "modern techniques" do
> you mean a page with only DIV tags and CSS - no tables at all? I am
> assuming this is the way to go? Thanks again.
Yeah - it's just about the type of HTML/XHTML that you choose to use.
Bear in mind that it takes a bit of thinking to change, but the
learning curve is WELL worth it. Back in 2004 there was this huge
table-driven site; each page's source code printed to 7 A4 pages of
closely packed text. With CSS, it went down to 2 nicely formatted
pages. It's better for you, for those who use assistive devices, for
your search-engine rating, and for your clients, as the rendering time
is slashed.
I recommend checking out sites like http://alistapart.com/
Check out the source code to alistapart and see no tables!
The front page looks as if it could use tables, but download Firefox
and use "View -> Page Style -> No Style" to see that it is just CSS
styling that produces the site's look and feel. Which means there are
just a couple of separate documents, included in the head section of
each HTML page, that dictate the entire look and feel of the website.
If you feel like a change, just change the CSS document and your whole
site changes in an instant; or offer multiple looks and feels for
those who require high visibility, allow your site to zoom in, etc...
all with no changes to any of the new-style HTML you are going to
write.
Tools you can use include the Web Developer extension for Firefox - to
highlight all the elements (<p>, <h1>, <h2>... <ul>) that you will
start using more often now, to see what the bounding box for these
elements looks like and how to shimmy them around the page using CSS.
You can use Firebug to edit the CSS live, or other extensions like
that, and you're on your way.
Consider that when you use JavaScript for functionality in your pages,
it should not be "core" to the website; it should add to an already
working website. So code your website to work in the old-fashioned
way, and add a layer over the top of unobtrusive JavaScript that
hijacks the links and does the fancy stuff.
Once you start using CSS+(X)HTML you won't be worrying about
maintainability; you won't mind having 20 pages of markup per site;
you will find it easier to code a website, hijacking it afterwards,
and your work is done. The old table-based sites are so hard to
maintain, once a change is needed, that the work involved means you
reach around for shortcuts, dragging in content from iframes, and so
on and on...
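
A minimal sketch of that unobtrusive "hijack" layer (the class name
and the loadFragment() helper are made-up names; with JavaScript off,
the links simply navigate normally):

// minimal sketch of unobtrusive enhancement: anchors work as plain
// links, and this layer upgrades them to in-page loads when JS is on
// (the 'hijack' class and loadFragment() are made-up names - see the
// XHR sketch earlier in the thread)
window.onload = function () {
  var links = document.getElementsByTagName('a');
  for (var i = 0; i < links.length; i++) {
    if (links[i].className === 'hijack') {
      links[i].onclick = function () {
        loadFragment(this.href, 'main'); // fetch into the main div
        return false;                    // cancel normal navigation
      };
    }
  }
};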

Jun 25 '07 #10
