precompile javascript

I am trying to improve the performance of an embedded app that uses
javascript. Our javascript functions are quite long, and (for ease of
maintenance) quite verbose.

We already do things like inline the javascript and css to minimize load
times. Profiling indicates that the next best way to improve
performance is to reduce the time for javascript parsing.

I am looking for a php script that could pre-process the javascript to
reduce all long variable names to one or two letters, take out extra
spaces, comments, etc and otherwise reduce the javascript to the barest
minimum necessary to run.

Any suggestions?

--Yan
Oct 27 '06 #1
ASM
CptDondo wrote:
I am trying to improve the performance of an embedded app that uses
javascript. Our javascript functions are quite long, and (for ease of
maintenance) quite verbose.

We already do things like inline the javascript and css to minimize load
times. Profiling indicates that the next best way to improve
performance is to reduce the time for javascript parsing.
Bah! That gains you so little it's hardly worth the trouble
(especially compared with the weight of the images).
I am looking for a php script that could pre-process the javascript to
reduce all long variable names to one or two letters, take out extra
spaces, comments, etc and otherwise reduce the javascript to the barest
minimum necessary to run.
http://www.crockford.com/javascript/jsmin.html
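For a sense of what a minifier does, here is a deliberately naive sketch in JavaScript (JSMin itself is far more careful; this toy only strips comments and collapses whitespace with regexes, does not rename variables, and will mangle string literals that contain comment-like text, so treat it as an illustration, not a tool):

```javascript
// Toy minifier: strips comments and collapses whitespace.
// NOT safe for real code -- it ignores string and regex literals.
function minifyNaive(src) {
  return src
    .replace(/\/\*[\s\S]*?\*\//g, '') // drop /* block */ comments
    .replace(/\/\/[^\n]*/g, '')       // drop // line comments
    .replace(/\s+/g, ' ')             // collapse runs of whitespace
    .trim();
}
```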
Oct 27 '06 #2
If you are worried about the number of characters that a script takes
up, then what you need is compression and caching. You can minify all
you like, but these two techniques taken together will reduce the data
size to about 10% (regardless of the lengths of names of functions etc.),
and then the file will only be served once per cache period. Both are
standard under Apache and PHP; job done! I did exactly that for some
HUGE javascripts (script.aculo.us all in one file, moo.fx, Behaviours
etc... a total weight of 250k down to about 20k).
Inlining your code might be counter-productive: it saves one request,
but at the expense of not being able to compress and cache the js file
if the web pages can't be (if they are the result of PHP which cannot
be cached).

Oct 27 '06 #3
shimmyshack wrote:
If you are worried about the number of characters that a script takes
up, then what you need is compression and caching. You can minify all
you like, but these two techniques taken together will reduce the data
size to about 10% (regardless of the lengths of names of functions etc.),
and then the file will only be served once per cache period. Both are
standard under Apache and PHP; job done! I did exactly that for some
HUGE javascripts (script.aculo.us all in one file, moo.fx, Behaviours
etc... a total weight of 250k down to about 20k).
Inlining your code might be counter-productive: it saves one request,
but at the expense of not being able to compress and cache the js file
if the web pages can't be (if they are the result of PHP which cannot
be cached).
Actually I am trying to minimize the time it takes to parse the file.
Inlining CSS and js reduced load times from 38 seconds to 16.... From
unusable to bearable.

So now we're looking to squeeze a few more seconds out of it here and
there. If I can do that by minifying, great. If not, I'll try
something else.

The system is all driven via PHP; there is no caching anywhere. The
file system is read-only and the /tmp filesystem is ram-based, so I want
to minimize its use.

--Yan
Oct 27 '06 #4
Hmm, with regard to parsing: if you have done profiling and identified
parsing as the trouble then OK, but usually it's either producing the
html that will be displayed or the final rendering time. Are you doing
XPath stuff? If not, have you tried rewriting the code to do the job in
chunks, so that users don't have to sit waiting for the entire thing to
complete before they can interact with the app?
It all sounds like it's going to be rather frustrating for you; good
luck though. :)
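The chunking idea can be sketched like this (the function names, chunk size, and 0 ms timeout are illustrative choices, not part of any particular library):

```javascript
// Split an array of work items into fixed-size slices.
function splitIntoChunks(items, chunkSize) {
  var chunks = [];
  for (var i = 0; i < items.length; i += chunkSize) {
    chunks.push(items.slice(i, i + chunkSize));
  }
  return chunks;
}

// Process one slice per timer tick, so the browser can repaint
// and handle input between slices instead of freezing.
function processInChunks(items, chunkSize, workFn, doneFn) {
  var chunks = splitIntoChunks(items, chunkSize);
  var i = 0;
  function next() {
    if (i >= chunks.length) {
      if (doneFn) doneFn();
      return;
    }
    workFn(chunks[i++]);   // one slice of the long-running job
    setTimeout(next, 0);   // yield to the UI before the next slice
  }
  next();
}
```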

Oct 27 '06 #5
shimmyshack wrote:
Hmm, with regard to parsing: if you have done profiling and identified
parsing as the trouble then OK, but usually it's either producing the
html that will be displayed or the final rendering time. Are you doing
XPath stuff? If not, have you tried rewriting the code to do the job in
chunks, so that users don't have to sit waiting for the entire thing to
complete before they can interact with the app?
It all sounds like it's going to be rather frustrating for you; good
luck though. :)
Well, minifying the code and putting the resulting javascript into a
ram-based tmpfs reduced our maximum load time to just over 10 seconds;
that's for a php-generated form with over a hundred input fields,
prefilled from XML data files, and complex javascript to drive them all.
Most other forms display in 5 seconds or less.

So, yes, frustrating (we've spent a week getting performance acceptable)
but in the end we have a platform that, while not exactly screaming, is
very usable. Not a bad way to end the week.

:-)
Oct 27 '06 #6

CptDondo wrote:
Well, minifying the code and putting the resulting javascript into a
ram-based tmpfs reduced our maximum load time to just over 10 seconds;
that's for a php-generated form with over a hundred input fields,
prefilled from XML data files, and complex javascript to drive them all.
Do you show progress in loading to the user, to make even the 10
seconds more acceptable? Most people don't mind delays if they have a
visual indicator of how long it will take, or where it's at. Many
even look forward to the extra time to do something else (check the
phone, pick their nose, that sort of thing :-)

Kev

Oct 28 '06 #7
CptDondo <ya*@NsOeSiPnAeMr.com> wrote in
news:12*************@corp.supernews.com:
a php-generated form with over a hundred input fields,
prefilled from XML data files, and complex javascript to drive them all.
Maybe you need to re-think your site design. One page with 100 input
fields? Sheesh. It boggles the poor user's mind. Puhleeze, spread it out
over several pages, each a chunk that's manageable by a human with ordinary
intelligence. ;-)
Oct 28 '06 #8
Kevin Darling wrote:
CptDondo wrote:
Well, minifying the code and putting the resulting javascript into a
ram-based tmpfs reduced our maximum load time to just over 10 seconds;
that's for a php-generated form with over a hundred input fields,
prefilled from XML data files, and complex javascript to drive them all.

Do you show progress in loading to the user, to make even the 10
seconds more acceptable? Most people don't mind delays if they have a
visual indicator of how long it will take, or where it's at. Many
even look forward to the extra time to do something else (check the
phone, pick their nose, that sort of thing :-)

Kev
As soon as the user hits the submit button, we turn off the display, and
show a 'please wait' text... That only stays up for a few seconds, the
browser then blanks the screen, and the user sees changes. So, yes,
there's always stuff going on. (Vast improvement over the original,
when the screen went black for 38 seconds.....)

Not much opportunity for eye candy on our 320x240 monochrome LCD
display.... But my suggestion of an animated gif with growing corn was
nixed by the PHB... :-)
Oct 28 '06 #9
Jim Land wrote:
CptDondo <ya*@NsOeSiPnAeMr.com> wrote in
news:12*************@corp.supernews.com:
a php-generated form with over a hundred input fields,
prefilled from XML data files, and complex javascript to drive them all.

Maybe you need to re-think your site design. One page with 100 input
fields? Sheesh. It boggles the poor user's mind. Puhleeze, spread it out
over several pages, each a chunk that's manageable by a human with ordinary
intelligence. ;-)
It's a table of 24 rows with half a dozen items in each row. It
actually makes it easier to show the whole table than to break it up,
especially since we want to show cumulative totals for columns as the
user changes values.... (Think spreadsheet, not google search form.)
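Keeping the running totals cheap is what makes the single-page table workable; a minimal sketch of such a total (the field handling is an assumption about the form, not the actual code):

```javascript
// Sum one column's values, skipping blank or non-numeric cells,
// so the total can be refreshed from an input's onchange handler.
function columnTotal(values) {
  var total = 0;
  for (var i = 0; i < values.length; i++) {
    var n = parseFloat(values[i]);
    if (!isNaN(n)) total += n;
  }
  return total;
}
```

In the page this would be wired to each input's onchange, reading that column's input values and writing the result into the totals row.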

It's a special purpose embedded box, not a general public website.

We're pushing the envelope on what can be done in a 200 MHz, 32 MB RAM
embedded box.... But that's the fun part.

--Yan
Oct 28 '06 #10
We're pushing the envelope on what can be done in a 200 MHz, 32 MB RAM
embedded box.... But that's the fun part.

--Yan
10 seconds, nice work. I have worked with very large php-js forms (I
used the Behaviours library to do the dynamic populating). Have you
ever tried the ActiveGrid OO code? It can be quite snappy for the type
of stuff you're doing.

Oct 29 '06 #11
CptDondo <ya*@NsOeSiPnAeMr.com> wrote in news:12k758527fjpof7
@corp.supernews.com:
It's a table of 24 rows with half a dozen items in each row. It
actually makes it easier to show the whole table than to break it up,
especially since we want to show cumulative totals for columns as the
user changes values.... (Think spreadsheet, not google search form.)
So you're using a web browser, Javascript, and XML to *emulate* a
spreadsheet? Maybe another approach would be a better solution.
Oct 30 '06 #12

CptDondo wrote:
I am trying to improve the performance of an embedded app that uses
javascript. Our javascript functions are quite long, and (for ease of
maintenance) quite verbose.

We already do things like inline the javascript and css to minimize load
times. Profiling indicates that the next best way to improve
performance is to reduce the time for javascript parsing.

I am looking for a php script that could pre-process the javascript to
reduce all long variable names to one or two letters, take out extra
spaces, comments, etc and otherwise reduce the javascript to the barest
minimum necessary to run.

Any suggestions?

--Yan
Try TrickyScripter (http://trickyscripter.com).
It can strip redundant markup and even replace local variable names
with shorter ones. And it will not break your code. The effect is
usually a 20-60% size reduction.
At the moment TrickyScripter is a plugin for Dreamweaver, but I have a
command-line version (a GUI version is at the development stage).

Val Polyakh

Oct 30 '06 #13
Jim Land wrote:
CptDondo <ya*@NsOeSiPnAeMr.com> wrote in news:12k758527fjpof7
@corp.supernews.com:
It's a table of 24 rows with half a dozen items in each row. It
actually makes it easier to show the whole table than to break it up,
especially since we want to show cumulative totals for columns as the
user changes values.... (Think spreadsheet, not google search form.)

So you're using a web browser, Javascript, and XML to *emulate* a
spreadsheet? Maybe another approach would be a better solution.
Yup... It makes sense.... We want to be able to control this thing from
anywhere. Our competition is using add-on hardware, local servers,
proprietary crippleware, etc. We want to be able to use *any* browser,
anywhere. Next on the agenda is browser detection and custom PHP/js to
provide usability on cell phones, PDAs and so on.

The XML is necessary because many new SCADA packages use XML, so we can
be competitive out of the box, again without proprietary crippleware.
(Our competition is using 300-baud Motorola modems, for cripes' sake.
We have Ethernet, GSM modems, and wifi all built in.)

Oct 30 '06 #14
