Bytes IT Community

Too many (small) vs. too large linked script files in a document...

Just wondering if anyone has looked into this?

How should one split up a JavaScript library?
Into a lot of very specific (and small) .js files, or
a few larger files?

I'm thinking about load time here...

I have a gut feeling that it will be better to use
(a lot of) smaller, very specific files, giving you
much better granularity...

Any thoughts?
Has anyone benchmarked this?
Am I way too far down in my wine bottle?
Do you care?

;-)

--
Dag.

Jul 23 '05 #1
4 Replies


Dag Sunde wrote:
Just wondering if anyone has looked into this?

How should one split up a JavaScript library?
Into a lot of very specific (and small) .js files, or
a few larger files?

I'm thinking about load time here...

I have a gut feeling that it will be better to use
(a lot of) smaller, very specific files, giving you
much better granularity...

Any thoughts?
Has anyone benchmarked this?
Am I way too far down in my wine bottle?
Do you care?

;-)


This is so easy to test that it probably took longer to write the
question.

Loading 30 script files from a local drive took 468 ms; loading one file
that contained the combined content of all 30 took 16 ms.

Is *anyone* surprised that reading 30 references, requesting 30 files,
then opening, reading, parsing, and closing each one is slower than
reading one reference and requesting one file... you know the rest.

Anyone who has tried to transfer lots of files over a network knows it
is much, much faster to make one big file, send it, then unpack it at
the other end - ever heard of CPIO or its friend, SCPIO? Is UNIX
really that dead? Or has Linux changed the name to something presumably
more sexy but nonetheless awfully geeky?

Damn, that wine bottle is really on empty, eh? ;-p

--
Rob
Jul 23 '05 #2

"RobG" <rg***@iinet.net.auau> wrote in message
news:4I******************@news.optus.net.au...
Anyone who has tried to transfer lots of files over a network knows it
is much, much faster to make one big file, send it, then unpack it at
the other end - ever heard of CPIO or its friend, SCPIO? Is UNIX
really that dead? Or has Linux changed the name to something presumably
more sexy but nonetheless awfully geeky?


Anyone who has downloaded content from a Web site knows it's much faster
to let the browser do GET requests on several images at once than it is
to do a GET, wait for an image to download, do another GET, let that
image download, and so on.

If you put all your JavaScript in one large file, it must download
as a single, serial transfer.

By splitting it into several small files, the browser can perform up to
4 GETs (HTTP 1.0) or 2 GETs (HTTP 1.1) simultaneously.
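The per-server connection cap behaves like a small worker pool. The sketch below is a simulation, not a network test: `fakeFetch` is a made-up stand-in for a real GET, and the cap is passed in as a parameter rather than taken from any browser:

```javascript
// Download N "files" with at most maxParallel requests in flight,
// the way a browser caps simultaneous connections to one server.
async function fetchAll(files, maxParallel, fetchOne) {
  const results = [];
  let next = 0;
  async function worker() {
    while (next < files.length) {
      const i = next++; // claim the next unfetched file
      results[i] = await fetchOne(files[i]);
    }
  }
  const workers = Math.min(maxParallel, files.length);
  await Promise.all(Array.from({ length: workers }, () => worker()));
  return results;
}

// Stand-in for a GET: resolve with the file name after a short delay.
const fakeFetch = (name) =>
  new Promise((resolve) => setTimeout(() => resolve(name), 5));

fetchAll(['a.js', 'b.js', 'c.js', 'd.js'], 2, fakeFetch)
  .then((got) => console.log(got.join(' ')));
```

With a cap of 2, the third request only starts once one of the first two finishes, which is exactly the behaviour being debated here.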

Anyone who has ever FTPed several large files over the Internet will
also have seen this. For example, when I FTP FreeBSD ISOs, I get
approximately 60-80Kb/s on each of the 4 downloads. As the others finish
and I have one left, that ISO downloads at approximately 100Kb/s, not the
240-320Kb/s you might expect.

When it comes to the Internet, one large pipe is not the same as several
smaller ones.

--
Grant Wagner <gw*****@agricoreunited.com>
comp.lang.javascript FAQ - http://jibbering.com/faq
Jul 23 '05 #3

Grant Wagner wrote:
[...]
If you put all your JavaScript in one large file, it must download
as a single, serial transfer.

By splitting it into several small files, the browser can perform up to
4 GETs (HTTP 1.0) or 2 GETs (HTTP 1.1) simultaneously.

Anyone who has ever FTPed several large files over the Internet will
also have seen this. For example, when I FTP FreeBSD ISOs, I get
approximately 60-80Kb/s on each of the 4 downloads. As the others finish
and I have one left, that ISO downloads at approximately 100Kb/s, not the
240-320Kb/s you might expect.
The next time my browser tries to download four 300KB JavaScript files,
maybe I'll remember your advice.

A JS file of 300 lines is perhaps 11KB. The overhead of doing the GET
is likely more than the cost of downloading the file itself, so it is
better to download one 11KB file than, say, four of about 3KB each.
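This arithmetic can be made explicit with a toy model. Its assumptions are loud ones: a fixed round trip per request, bandwidth shared equally among concurrent connections, and no pipelining or caching; all the numbers below are made up for illustration.

```javascript
// Toy estimate: total load time for `files` files holding `totalKB`
// of script, fetched at most `maxParallel` at a time.
function loadTimeMs(totalKB, files, maxParallel, rttMs, bandwidthKBs) {
  const perFileKB = totalKB / files;
  let ms = 0;
  let remaining = files;
  while (remaining > 0) {
    const batch = Math.min(maxParallel, remaining);
    const shareKBs = bandwidthKBs / batch; // each connection gets a share
    ms += rttMs + (perFileKB / shareKBs) * 1000;
    remaining -= batch;
  }
  return ms;
}

// 12 KB of script, 50 ms round trip, 100 KB/s link, 2 connections:
console.log(loadTimeMs(12, 1, 2, 50, 100)); // one bundle: ~170 ms
console.log(loadTimeMs(12, 4, 2, 50, 100)); // four small files: ~220 ms
```

Under these numbers the single bundle wins, because each extra round trip costs more than the bytes it saves. When the connection is not bandwidth-limited the balance shifts, which is where parallel requests pay off.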

You also assume that the browser is only downloading the JS files - it
probably isn't. The other streams are probably downloading other page
content, so why use them for JS when they may be better employed
getting images or other content?
When it comes to the Internet, one large pipe is not the same as several
smaller ones.


Quite true, but it can't be claimed that lots of small files are
*always* better than one bigger file (or a smaller number of them).

I'll bet there are some frustrated network engineers lurking who could
argue this one for weeks on end...
--
Rob
Jul 23 '05 #4

JRS: In article <A7**************@news2.mts.net>, dated Thu, 16 Dec
2004 16:57:04, seen in news:comp.lang.javascript, Grant Wagner
<gw*****@agricoreunited.com> posted :

By splitting it into several small files, the browser can perform up to
4 GETs (HTTP 1.0) or 2 GETs (HTTP 1.1) simultaneously.


But be aware that this can be very annoying for users with limited
bandwidth who are trying to load a big page in the background while
doing other, perhaps interactive, work.

An author should not presume that his work is all that is of interest to
his readers.

--
John Stockton, Surrey, UK. ?@merlyn.demon.co.uk Turnpike v4.00 IE 4
<URL:http://www.jibbering.com/faq/> JL/RC: FAQ of news:comp.lang.javascript
<URL:http://www.merlyn.demon.co.uk/js-index.htm> jscr maths, dates, sources.
<URL:http://www.merlyn.demon.co.uk/> TP/BP/Delphi/jscr/&c, FAQ items, links.
Jul 23 '05 #5
