Bytes IT Community

the future of applications in JavaScript?

Hi,

I'm sure many here have already noticed this, but it seems that the
development of the browser world is paralleling the development of the
computer world. However, the browser world is about 20 years behind:
we are back in the days of mainframes and dumb terminals just capable
of running a window manager, except now we have servers and browsers.
In a way it makes me feel like browser scripting is already archaic,
because this phase of computing has already passed.

As bandwidth increases, people want the client to do more, and then
even more. I have been asked to manipulate large datasets in the
browser. I don't necessarily think bandwidth is up to this yet, but it
isn't my choice. I could hand-write JavaScript to manipulate this data,
but what I really want to do the job is a SQL DBMS application running
in the browser. I can see this situation becoming more common, where
people want to cache JavaScript applications in the browser to be used
by other scripts. Perhaps in five years this will be common. Could we
end up having things like BrowserSQL and BrowserOffice stored in our
cache? I'm curious what obstacles stand in the way of these types of
objectives, and what will have to change to make this happen. Are the
types of browsers we are using now, with HTML and CSS, a dead end when
things get really advanced a few years from now?
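To make the "BrowserSQL" idea concrete, here is a minimal sketch of what a cached, script-accessible query helper might look like. Every name in it (browserSQL, select, the orders data) is invented for illustration; a real version would arrive as a cached .js file that other scripts on the page could call into.

```javascript
// A hypothetical "BrowserSQL"-style helper. The dataset would arrive as
// a cached JSON script file, and other page scripts would query it.
var browserSQL = {
  // select(rows, where, columns): a tiny SQL-like filter + projection
  select: function (rows, where, columns) {
    var out = [];
    for (var i = 0; i < rows.length; i++) {
      if (!where || where(rows[i])) {
        var rec = {};
        for (var j = 0; j < columns.length; j++) {
          rec[columns[j]] = rows[i][columns[j]];
        }
        out.push(rec);
      }
    }
    return out;
  }
};

// Stand-in for the cached dataset (in practice, a separate .js file
// served with long-lived cache headers).
var orders = [
  { id: 1, customer: "Ann", total: 120 },
  { id: 2, customer: "Bob", total: 45 },
  { id: 3, customer: "Ann", total: 80 }
];

// SELECT id, total FROM orders WHERE customer = 'Ann'
var annOrders = browserSQL.select(
  orders,
  function (r) { return r.customer === "Ann"; },
  ["id", "total"]
);
```

The projection keeps only the requested columns, so `annOrders` holds just `id` and `total` for the two matching rows.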

Peter

Nov 11 '06 #1
8 Replies


Daz

Peter Michaux wrote:
<snip>
I think this kind of thing 'evolves'. For something to be worth
killing off completely, there must be a good reason. Personally, I
can't think of anything that is the same today as it was 10 years ago.
However, most things are preserved for historical purposes, or made to
be backwards compatible. I think the bottom line is that it's not
always possible to create an application and leave it as-is. You will
always have to maintain an application in order to fix bugs, tweak the
speed and algorithms, and basically move with the times. As programming
and scripting languages evolve, so must the programmer.

Programming languages started off simple, with not many to choose from,
and now they are in abundance. I don't think there is anything to
'force' you to change; however, as things become more advanced it's
usually the programmer's 'preference' to utilize the new functionality
and features of new and improved products. Suffice it to say, it works
like that in the home, too. Not many people these days own a
Gramophone, and those that do probably do so for historical purposes.
My preference is to use CDs...

Nov 11 '06 #2

VK
I'm sure many here have already noticed this, but it seems that the
development of the browser world is paralleling the development of the
computer world. However, the browser world is about 20 years behind:
we are back in the days of mainframes and dumb terminals just capable
of running a window manager, except now we have servers and browsers.
IMHO all modern "box applications" are taking the web-application
approach: the program's development life doesn't end at the moment of
being boxed. It remains alive (unless blocked) by connecting to the
producer's site for upgrades and new releases.
In the same way, any web application can be stored on your disk
(File > Save As > "Web Page, complete") while being automatically
updated from the producer's site.
This approach doesn't require any updates to the current UAs. But it's
still merely a reproduction of the older "box application" technique.
What I find really interesting is the possibility of distributed
web applications where, say, the interface comes from a US site, script
blocks from Germany and Finland, and data processing is RMI'ed from a
Japanese database server.

I attended a meeting in San Francisco; these links may be of interest
on the subject:
<http://www.webware.com/8301-1_109-9661722-2.html>
<http://www.webware.com>
Are the
types of browsers we are using now with HTML and CSS a dead end when
things get really advanced in a few years from now?
IMHO the bottleneck (but not a dead end) of all, even the most modern,
UAs is the rendering engine. They are still, like NCSA Mosaic, slow and
lazy tools for displaying text documents with some graphics. Some day
that gap may be closed, but for that the C++ programmer has to step
aside and the Assembly programmer take his place. A browser handling
HTML, CSS, JavaScript, and SVG/VML, but with the triangle-processing
speed of the Counter-Strike engine: that would be a Web revolution,
enabling many tools that are just waiting for an appropriate bearer.
For just one example, see the A.L.I.C.E. Foundation
<http://www.alicebot.org>. Now replace (in your mind) i) Flash with
high-quality script-driven SVG, and ii) the prerecorded,
bandwidth-killing sound stream with XML-based phonetic markup.

Nov 11 '06 #3

VK wrote:
>
IMHO all modern "box applications" are taking the web-application
approach: the program's development life doesn't end at the moment of
being boxed. It remains alive (unless blocked) by connecting to the
producer's site for upgrades and new releases.
In the same way, any web application can be stored on your disk
(File > Save As > "Web Page, complete") while being automatically
updated from the producer's site.
The moment you save a web application to your hard drive, it can no
longer communicate in all the same ways, because of JavaScript's
same-origin security rules. So at that point many web applications
will break.

We can load the browser's cache with data in the form of a JavaScript
source file with JSON inside. If the data changes, we can't locally
update that cached data file; we have to download the potentially large
dataset from scratch the next time we visit the website.
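The pattern described above can be sketched as follows. This is an assumption-laden illustration, not an existing API: the variable names, the version field, and the manifest idea are all invented.

```javascript
// The cached file, e.g. data-v3.js, served with long-lived cache
// headers. The version field and shape are hypothetical.
var cachedData = {
  version: 3,
  rows: []  // stands in for a potentially large JSON dataset
};

// On each visit the page could first fetch a tiny, uncacheable
// manifest stating the current version, then decide whether the big
// cached file is still usable.
function needsRefresh(manifestVersion, data) {
  // If versions differ, the whole dataset must be re-downloaded:
  // there is no way to patch the file already sitting in the cache.
  return !data || data.version !== manifestVersion;
}
```

The point of the sketch is the limitation Peter raises: the check is cheap, but a stale answer forces a full re-download because the cache holds opaque files, not updatable data.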

What I see really interesting is the possibilities of
distributed web-applications where say the interface comes from a US
site, script blocks from Germany and Finland and data processing RMI'ed
from a Japan database server.
I can see this really cutting down traffic on the web. If everyone
using Yahoo! UI downloaded it from the same URL, then many sites could
take advantage of Yahoo! UI already being in the browser's cache. This
is possible, isn't it?
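It is: cross-site cache sharing needs nothing more than an ordinary script tag, provided both sites reference byte-for-byte the same absolute URL and the server sends long-lived cache headers. The host and path below are purely hypothetical:

```html
<!-- Site A and site B both reference the same absolute URL; whichever
     is visited second gets the file from the browser's cache rather
     than the network. The URL is an invented example. -->
<script type="text/javascript"
        src="http://shared.example.com/yui/yahoo-dom-event.js"></script>
```

The catch, as noted just below, is availability: the page now depends on that third-party host being up.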

On the other hand, the probability of a particular page working
decreases, because more than one server has to be up for success.
IMHO the bottleneck (but not a dead end) of all - even the most modern
- UA's is the rendering engine. They are still - like NCSA Mosaic -
slow and lazy tools to display text documents with some graphics. Some
day the niche can be closed: but for this C++ programmer has to step
aside and Assembly programmer take his place.
I can't imagine that a browser rendering engine needs to be
hand-written in assembly to get fast rendering speed. Do they really
write video game rendering engines in assembly? That would be painful!
I imagine the hard problem in a web page is more about flowing the
page, which is not something a video game needs to do. Can't a video
game just place everything by absolute pixel position?

-----

I suppose with faster client-server communication we could move in two
directions. One is to make the browser smarter, with longer load times
but faster interaction without server communication during the page's
life. The other is to move more toward the mainframe-terminal design,
where the browser is capable only of communicating and rendering.

Ruby on Rails is moving in the second direction. In a Rails app, all
form validation occurs server-side, using Ajax where possible to make
it snappy. This makes sense if validation is considered application
logic and the browser is only part of the view layer in a
model-view-controller architecture.
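From the browser's side, that round trip can be sketched like this. The endpoint, field names, and response shape are assumptions for illustration, not Rails' actual API; the browser only serializes the form and renders the verdict the server returns.

```javascript
// Serialize form fields into an application/x-www-form-urlencoded body,
// ready to POST (e.g. via XMLHttpRequest) to a hypothetical /validate
// endpoint where the real validation logic lives.
function buildValidationQuery(fields) {
  var parts = [];
  for (var name in fields) {
    parts.push(encodeURIComponent(name) + "=" +
               encodeURIComponent(fields[name]));
  }
  return parts.join("&");
}

// The server's verdict is authoritative; the browser just turns it
// into a message for the view layer. The response shape is invented.
function applyValidationResult(result) {
  return result.valid ? "" : result.errors.join("; ");
}
```

All the "application logic" stays on the server; the two functions above are pure presentation plumbing, which is exactly the division of labour described in the paragraph above.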

One of Google's directives is that page loads should be instant. I
think that implies the page shouldn't contain much, if any, application
logic, since the server can handle it, and loading that logic is not
necessary to achieve the instant page load. I think Google would rather
spread communication time over the life of a page than aggregate as
much as possible into the initial page load.

Peter

Nov 11 '06 #4

In article <11**********************@i42g2000cwa.googlegroups.com>,
Peter Michaux <pe**********@gmail.com> writes

<snip>
>As bandwidth increases
Meanwhile, don't forget laptops connected by mobile phone.

people want the client to do more and then even
more.
<snip>
Remind me: why were Java applets invented?

John
--
John Harris
Nov 12 '06 #5


John G Harris wrote:
In article <11**********************@i42g2000cwa.googlegroups.com>,
Peter Michaux <pe**********@gmail.com> writes

<snip>
As bandwidth increases

Meanwhile, don't forget laptops connected by mobile phone.

people want the client to do more and then even
more.
"People" here being developers; users don't care. They will be driven
by completely different requirements and needs, and ultimately they are
the ones who drive technology adoption.

I have been asked to manipulate large datasets in the browser. I
don't necessarily think bandwidth is up to this yet but it isn't my
choice.
Why are Access databases considered suitable for only, say, fewer than
5 users? Downloading large chunks of a DB for the client to manipulate
very quickly "hits the wall". A web application with potentially
thousands of concurrent users should never use large datasets on the
client (though "large" has not been defined here), because of bandwidth
and concurrency issues.

I could hand write JavaScript to manipulate this data but what
I really want to do the job is a SQL DBMS application in the browser to
manipulate the data. I can see this situation becoming more common
where people want to cache JavaScript applications in the browser to be
used by other scripts.
No, I think you want better connectivity with an RDBMS, so that it (or
at least the server application's business-logic layer) does all the
real work and the browser does (mostly) presentation.

There are many client/server models. Every time someone starts putting
more and more processing on the client, someone else turns up with a
model that puts most of it back on the server. I think the direction
is clear: keep the bulk of data and processing on the server, keep the
client for presentation and user interaction.

Of course there are exceptions to the rule, but generally it is the
best model.
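The model Rob describes can be sketched in a few lines: the client never holds the whole dataset, it only asks the server for the slice it is about to present. The endpoint and parameter names below are invented for illustration.

```javascript
// Thin-client division of labour: the server runs the query against
// the RDBMS and returns one page of results; the browser's job is
// presentation. The /orders endpoint and parameters are hypothetical.
function buildPageRequest(baseUrl, pageNumber, pageSize) {
  return baseUrl + "?page=" + pageNumber + "&size=" + pageSize;
}

// e.g. fetch page 2, 50 rows at a time, instead of the whole dataset
var url = buildPageRequest("/orders", 2, 50);
```

This keeps bandwidth per request small and sidesteps the concurrency problem: the authoritative data never leaves the server.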

Perhaps in five years this will be common. Could
we end up have things like BrowserSQL and BrowserOffice stored in our
cache?
I don't think anyone will want internet connectivity as a prerequisite
for basic office functionality. And anyone working locally will not
want the restrictions of a browser when working with documents.

Web-based office applications are an interesting novelty that might
lead to some useful products, but they will never replace local
applications that provide similar functionality.
I'm curious what obstacles stand in the way of these types of
objectives
The same ones that stand in the way of Access databases being shared
across an enterprise of thousands of users.

and what will have to change to make this happen. Are the
types of browsers we are using now with HTML and CSS a dead end when
things get really advanced in a few years from now?
No, they'll probably be even more popular. I guess we'll have to wait
five years to see whose crystal ball is best. :-)

Remind me: why were Java applets invented?
And to extend that rhetorical question: why are they, compared to
JavaScript, almost non-existent on the web?
--
Rob

Nov 13 '06 #6


Peter Michaux wrote:
<snip>

Have you considered:

(a) .NET? If and when the .NET Framework is installed on every PC in
the world (including via the Mono project), presumably that framework
has DBMS components, and I believe Microsoft's intention is that you
should be able to download .NET applications in a similar way to Java
applets. So the future is now, if you want it.

(b) Java? Again, presumably J2SE has DBMS capability, accessible
through applets.

(c) When the introduction into Mozilla of JavaScript 2 and Adobe
Tamarin (a JIT compiler for JavaScript) takes place, combined with XUL
etc., this could form yet another platform to choose from.

When you compare C#, JavaScript 2, and Java, there is increasingly
little to tell between them.

If each platform also adheres to the same or similar standards, and has
similar performance capability, then perhaps the future will be a
matter of personal taste.

Julian Turner

Nov 13 '06 #7

Peter Michaux wrote:
[snip]
a way it makes me feel a little like browser scripting is already archaic
because this phase of computing has already passed.
[/snip]

See in particular
<URL:http://www.mozilla.org/projects/tamarin/faq.html>

Julian Turner

Nov 13 '06 #8

In article <11**********************@m73g2000cwd.googlegroups.com>, RobG
<rg***@iinet.net.au> writes
>
John G Harris wrote:
<snip>
>Remind me: why were Java applets invented?

And to extend that rhetorical question: why they are, compared to
JavaScript, almost non-existent on the web?
Two main reasons:

1 Compatibility

No one could be sure your PC was running the right version(s) of the
support software. Indeed, they couldn't even be sure that the latest
version(s) could be installed on your two-year-old PC.

2 Security

People found they could deliver programs to your PC that were malicious
or criminal.

Would a future super-duper JavaScript avoid these problems?

John
--
John Harris
Nov 13 '06 #9

This discussion thread is closed.