better to serve one big js file or several smaller ones?

Hi,

I'm curious about server load and download time if I use one big
javascript file or break it into several smaller ones. Which is better?
(Please think of this as the first time the scripts are downloaded so
that browser caching is out of the equation.)

Thanks,
Peter

Mar 22 '06 #1
pe**********@gmail.com said the following on 3/22/2006 1:45 AM:
> Hi,
>
> I'm curious about server load and download time if I use one big
> javascript file or break it into several smaller ones. Which is better?
> (Please think of this as the first time the scripts are downloaded so
> that browser caching is out of the equation.)

That depends on your aim. IE will load them faster using separate
files, and separate files allow the browser to go ahead and parse them
as they are downloaded.

--
Randy
comp.lang.javascript FAQ - http://jibbering.com/faq & newsgroup weekly
Javascript Best Practices - http://www.JavascriptToolbox.com/bestpractices/
Mar 22 '06 #2
On Wed, 22 Mar 2006 01:52:42 -0500, Randy Webb
<Hi************@aol.com> wrote:
> pe**********@gmail.com said the following on 3/22/2006 1:45 AM:
>> Hi,
>>
>> I'm curious about server load and download time if I use one big
>> javascript file or break it into several smaller ones. Which is better?
>> (Please think of this as the first time the scripts are downloaded so
>> that browser caching is out of the equation.)
>
> That depends on your aim. IE will load them faster using separate
> files, and separate files allow the browser to go ahead and parse them
> as they are downloaded.

Assuming a good connection; lots of the world doesn't have a good
connection, or is using high-latency mobile connections (where even if
the transfer speeds are high, the overhead of making a request is also
high, so small files take a long time to be requested).

One gzipped large file is your best bet.
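
As a rough back-of-the-envelope illustration of that request overhead, a
Python sketch (the numbers are assumptions, not measurements):

# Rough model: each sequential request pays one round trip before any
# data arrives, so many small files lose badly on a high-latency link
# even when the raw bandwidth is fine.
RTT_S = 0.4              # assumed round-trip time on a mobile/dial-up link
BANDWIDTH_BPS = 40000    # assumed effective throughput, bytes per second
TOTAL_JS_BYTES = 60000   # the same amount of script code either way

def fetch_time(num_files):
    """Per-request latency cost plus transfer time for the whole payload."""
    return num_files * RTT_S + TOTAL_JS_BYTES / BANDWIDTH_BPS

print("1 file  : %.1f s" % fetch_time(1))    # ~1.9 s
print("12 files: %.1f s" % fetch_time(12))   # ~6.3 s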

Jim.
Mar 22 '06 #3
Jim Ley wrote:
> [...] Randy Webb [...] wrote:
>> pe**********@gmail.com said the following on 3/22/2006 1:45 AM:
>>> I'm curious about server load and download time if I use one big
>>> javascript file or break it into several smaller ones. Which is better?
>>> (Please think of this as the first time the scripts are downloaded so
>>> that browser caching is out of the equation.)
>>
>> That depends on your aim. IE will load them faster using separate
>> files, and separate files allow the browser to go ahead and parse them
>> as they are downloaded.
>
> Assuming a good connection; lots of the world doesn't have a good
> connection, or is using high-latency mobile connections (where even if
> the transfer speeds are high, the overhead of making a request is also
> high, so small files take a long time to be requested).
>
> One gzipped large file is your best bet.

Not at all.

1. Not every UA supports gzip-compressed responses. Those which do not
   will (hopefully) be served the uncompressed version of that large
   resource, with the known drawbacks. If one relies on gzip compression,
   inevitably the resources will grow larger than usual, and the loading
   time will grow considerably if gzip compression is not supported by
   the client.

2. Ideally each resource should be less than 1160 bytes, to easily fit
   into one TCP/IP packet. It is therefore a Good Thing if resources
   are not too large. However, one has to find a healthy balance between
   the number of chunks the resource is split into and the size of each
   chunk, because too many chunks require too many HTTP requests.

3. Functionality should be split into libraries that deal exactly
   with a particular feature. That allows for easier maintenance,
   usually an overall smaller download size if only a particular feature
   from that feature set is needed, and it avoids problems for people
   with older editors (that have a size limit).
PointedEars
Mar 22 '06 #4
@sh
> One gzipped large file is your best bet.
>
> Jim.

I presume that's some sort of text compression? Does it make the file
unusable for editing without unzipping it?
Mar 22 '06 #5
On Wed, 22 Mar 2006 16:40:44 +0100, Thomas 'PointedEars' Lahn
<Po*********@web.de> wrote:
> Jim Ley wrote:
>
> 1. Not every UA supports gzip-compressed responses. Those which do not
>    will (hopefully) be served the uncompressed version of that large
>    resource, with the known drawbacks.

Of course, there are no associated drawbacks, the total size of all the
files together or one file is the same, but you've saved all the bytes
in the headers, both request and receive - gzip is also likely more
efficient on the larger single file (more identical tokens).

> If one relies on gzip compression,
> inevitably the resources will grow larger than usual,

There's no reason to conclude that.

> 2. Ideally each resource should be less than 1160 bytes, to easily fit
>    into one TCP/IP packet.

1160 bytes leaves you about 200 bytes for your js code, less if there
are large cookies etc.; that's a pointless amount of code to split your
files into.

> 3. Functionality should be split into libraries that deal exactly
>    with a particular feature.

I can't agree; delivering multiple javascript files is slow,
particularly on high-latency or slow-upload connections, such as
mobile services. You also increase the chance of one of the many
failing, which leads to more unpredictable failures. Of course, if you
live in the first world and use nothing but broadband connections, you
simply won't comprehend this, but please try and think outside your
own experiences.

> That allows for easier maintenance,

Maintenance and what is delivered to the client are separate issues;
don't confuse them. Have a good build environment - much better than
imposing your maintenance constraints on your users' experience.
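
For illustration, a minimal Python sketch of such a build step (the file
names are made up; a makefile or shell script would do the same job):

# Author many small feature files, deliver one concatenated (and
# pre-gzipped) bundle.
import gzip
from pathlib import Path

SOURCES = ["core.js", "forms.js", "menu.js"]      # maintained separately

def build_bundle(src_dir="src", out_file="htdocs/all.js"):
    parts = []
    for name in SOURCES:
        text = Path(src_dir, name).read_text(encoding="utf-8")
        parts.append("/* ---- %s ---- */\n%s\n" % (name, text))
    bundle = "".join(parts)
    Path(out_file).write_text(bundle, encoding="utf-8")
    # Pre-compress so the server can hand the .gz to clients that accept it.
    Path(out_file + ".gz").write_bytes(gzip.compress(bundle.encode("utf-8")))

if __name__ == "__main__":
    build_bundle()
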
> usually an overall smaller download size if only a particular feature
> from that feature set is needed

You should never be serving redundant code, but that's got nothing to
do with the files you deliver to the client. Stop confusing authoring
practices and consumer practices.

Jim.
Mar 22 '06 #6
On Wed, 22 Mar 2006 15:50:40 -0000, "@sh" <sp**@spam.com> wrote:
>> One gzipped large file is your best bet.
>>
>> Jim.
>
> I presume that's some sort of text compression? Does it make the file
> unusable for editing without unzipping it?

Well, you normally handle such gzipping transparently in your
webserver.
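
Roughly what the server does for you - a minimal Python sketch of the
content negotiation, with a hypothetical file name (in practice a module
such as mod_gzip or mod_deflate takes care of this):

# Serve one combined script, gzip-compressed only when the client says
# it can handle it (Accept-Encoding), so the file on disk stays editable.
import gzip
from http.server import BaseHTTPRequestHandler, HTTPServer

class ScriptHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        with open("all.js", "rb") as f:           # the one big bundled file
            body = f.read()
        accepts_gzip = "gzip" in self.headers.get("Accept-Encoding", "")
        self.send_response(200)
        self.send_header("Content-Type", "text/javascript")
        if accepts_gzip:
            body = gzip.compress(body)            # compressed on the fly
            self.send_header("Content-Encoding", "gzip")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)                    # same source file either way

HTTPServer(("", 8000), ScriptHandler).serve_forever()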

Jim.
Mar 22 '06 #7
Jim Ley wrote:
> [...] Thomas 'PointedEars' Lahn [...] wrote:
>> Jim Ley wrote:
>>
>> 1. Not every UA supports gzip-compressed responses. Those which do not
>>    will (hopefully) be served the uncompressed version of that large
>>    resource, with the known drawbacks.
>
> Of course, there are no associated drawbacks,

There are.

> the total size of all the files together or one file is the same,

No, it is not, by definition. In fact, one big library is very likely
to be smaller when gzip-compressed than the concatenation of several
gzip-compressed libraries, because there is greater redundancy in it.
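
A rough Python sketch to illustrate this (the file contents are made-up
stand-ins, not a benchmark):

# Compare gzip of one concatenated file against the sum of individually
# gzipped files. Tokens shared across files compress better in one stream.
import gzip

libs = [
    b"function addEvent(el, ev, fn) { /* ... */ }\n" * 20,
    b"function addClass(el, name) { /* ... */ }\n" * 20,
    b"function ajaxRequest(url, fn) { /* ... */ }\n" * 20,
]

one_big = len(gzip.compress(b"".join(libs)))
many_small = sum(len(gzip.compress(lib)) for lib in libs)

print("one gzipped bundle:   %d bytes" % one_big)
print("sum of gzipped files: %d bytes" % many_small)
# The single bundle usually comes out smaller.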

However, that does not help you with the fact that gzip-compressed
resources do not have universal support on the Web nowadays. Content
negotiation (which fails sometimes, but I would disregard that as a
bug of the UA) cannot mitigate the fact that if gzip-compressed
responses are /not/ supported, the resource that is to be downloaded
is considerably greater than with support for such responses.

> but you've saved all the bytes in the headers, both request and
> receive - gzip is also likely more efficient on the larger single
> file (more identical tokens)

The drawback is that it is larger than the gzipped version, no matter how
many chunks there are. Using gzip compression as an argument that large
resources are OK nowadays simply does not hold water, unless one wants
to completely disregard users of the mentioned UAs and slower connections.

>> If one relies on gzip compression,
>> inevitably the resources will grow larger than usual,
>
> There's no reason to conclude that.

Yes, there is. It is simply human nature. If you rely entirely on
something, without understanding the repercussions of using it, you do
not care about what happens if it is not supported. How some people
use, or rather misuse, client-side scripting is a clear indication of
this.

>> 2. Ideally each resource should be less than 1160 bytes, to easily fit
>>    into one TCP/IP packet.
>
> 1160 bytes leaves you about 200 bytes for your js code, [...]

How did you get that idea? I said _"ideally"_.

> [...]
>> 3. Functionality should be split into libraries that deal exactly
>>    with a particular feature.
>
> I can't agree; delivering multiple javascript files is slow,
> particularly on high-latency or slow-upload connections, such as
> mobile services. You also increase the chance of one of the many
> failing, which leads to more unpredictable failures. Of course, if you
> live in the first world and use nothing but broadband connections, you
> simply won't comprehend this, but please try and think outside your
> own experiences.

Think of a third-worlder who simply wants to access vital information,
and you are forcing him to download a, say, 100K script file as-is
(because his UA is old enough not to support gzip-compressed HTTP
responses), of which, say, only 1% is really used on the site. Then
reconsider what you just said.

>> That allows for easier maintenance,
>
> Maintenance and what is delivered to the client are separate issues;

No.

> don't confuse them. Have a good build environment - much better than
> imposing your maintenance constraints on your users' experience.

Users' experience will be greatly tarnished by waiting for the download
of code that is mostly not needed.

>> usually an overall smaller download size if only a particular feature
>> from that feature set is needed

You should never be serving redundant code, but that's got nothing to
do with the files you deliver to the client.

It has. Using one big library for everything is serving loads of
redundant code.

> Stop confusing authoring practices and consumer practices.

"Consumer practices"? You must be kidding.
PointedEars
Mar 22 '06 #8
Thomas 'PointedEars' Lahn wrote:
>> Assuming a good connection; lots of the world doesn't have a good
>> connection, or is using high-latency mobile connections (where even if
>> the transfer speeds are high, the overhead of making a request is also
>> high, so small files take a long time to be requested).
>>
>> One gzipped large file is your best bet.
>
> Not at all.
>
> 2. Ideally each resource should be less than 1160 bytes, to easily fit
>    into one TCP/IP packet. It is therefore a Good Thing if resources
>    are not too large. However, one has to find a healthy balance between
>    the number of chunks the resource is split into and the size of each
>    chunk, because too many chunks require too many HTTP requests.

I can't agree with this. The higher the request-to-response packet
ratio, the more you suffer from latency issues, a real bugger on a poor
dial-up or mobile connection.

A better argument would be to say the ideal size is the buffer size of
the client's TCP/IP stack (which tends to be at least 16K, less on
embedded devices, much more on desktop OSs). The server will send
multiple packets for one request, based on the client's advertised
buffer capacity.

> 3. Functionality should be split into libraries that deal exactly
>    with a particular feature. That allows for easier maintenance,
>    usually an overall smaller download size if only a particular feature
>    from that feature set is needed, and it avoids problems for people
>    with older editors (that have a size limit).

Agreed, but I don't think the editor argument is relevant these days!
This has nothing to do with optimising download times, more of a 'best
practice'.

--
Ian Collins.
Mar 22 '06 #9
Ian Collins wrote:
> Thomas 'PointedEars' Lahn wrote:
>>> Assuming a good connection; lots of the world doesn't have a good
>>> connection, or is using high-latency mobile connections (where even if
>>> the transfer speeds are high, the overhead of making a request is also
>>> high, so small files take a long time to be requested).
>>>
>>> One gzipped large file is your best bet.
>>
>> Not at all.
>>
>> 2. Ideally each resource should be less than 1160 bytes, to easily fit
>>    into one TCP/IP packet. It is therefore a Good Thing if resources
>>    are not too large. However, one has to find a healthy balance between
>>    the number of chunks the resource is split into and the size of each
>>    chunk, because too many chunks require too many HTTP requests.
>
> I can't agree with this. The higher the request-to-response packet
> ratio, the more you suffer from latency issues, a real bugger on a poor
> dial-up or mobile connection.

(sic!)

Please learn to quote. With Mozilla/5.0, you should disable format=flowed
for posting, then the bug should go away.

> A better argument would be to say the ideal size is the buffer size of
> the client's TCP/IP stack (which tends to be at least 16K, less on
> embedded devices, much more on desktop OSs). The server will send
> multiple packets for one request, based on the client's advertised
> buffer capacity.

The server will not send more packets than required, even if the TCP/IP
stack buffer size of the client is greater. The argument is void.

>> 3. Functionality should be split into libraries that deal exactly
>>    with a particular feature. That allows for easier maintenance,
>>    usually an overall smaller download size if only a particular feature
>>    from that feature set is needed, and it avoids problems for people
>>    with older editors (that have a size limit).
>
> Agreed, but I don't think the editor argument is relevant these days!

You know /all/ users? And it is still a fact that editing a large file
is considerably slower than editing a small file.

> This has nothing to do with optimising download times,

Yes, it does. If you split features into one script file each, you will
only have to serve the code for the needed features instead of all the
code you have ever written. The overhead of more requests is negligible
compared to the overhead from a large amount of served but ultimately
unused code.

> more of a 'best practice'.

A best common practice that has several grounds.
PointedEars
Mar 22 '06 #10
