Bytes | Software Development & Data Engineering Community

the future of applications in JavaScript?

Hi,

I'm sure many here have already noticed this, but it seems that the
development of the browser world is paralleling the development of the
computer world. However, the browser world is about 20 years behind,
like back in the days of mainframes and dumb terminals just capable of
running a window manager; now we have servers and browsers instead. In
a way it makes me feel a little as though browser scripting is already
archaic, because this phase of computing has already passed.

As bandwidth increases, people want the client to do more and then even
more. I have been asked to manipulate large datasets in the browser. I
don't necessarily think bandwidth is up to this yet, but it isn't my
choice. I could hand-write JavaScript to manipulate this data, but what
I really want to do the job is a SQL DBMS application running in the
browser. I can see this situation becoming more common, where people
want to cache JavaScript applications in the browser to be used by
other scripts. Perhaps in five years this will be common. Could we end
up having things like BrowserSQL and BrowserOffice stored in our cache?
I'm curious what obstacles stand in the way of these types of
objectives and what will have to change to make this happen. Are the
types of browsers we are using now with HTML and CSS a dead end when
things get really advanced a few years from now?
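To make the gap concrete, here is roughly the kind of hand-written
JavaScript I mean (a minimal sketch; the dataset and field names are
made up). The whole function is the moral equivalent of one line of
SQL:

```javascript
// A hypothetical dataset of order records, of the sort a server
// might push down to the browser as JSON.
var orders = [
  { customer: "acme", region: "west", total: 120 },
  { customer: "acme", region: "east", total: 80 },
  { customer: "zenith", region: "west", total: 200 }
];

// Hand-written equivalent of:
//   SELECT region, SUM(total) FROM orders GROUP BY region
function totalsByRegion(rows) {
  var sums = {};
  for (var i = 0; i < rows.length; i++) {
    var r = rows[i];
    sums[r.region] = (sums[r.region] || 0) + r.total;
  }
  return sums;
}
```

A BrowserSQL-style library would let the page ship the query instead
of the loop, which is really all I'm asking for.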

Peter

Nov 11 '06 #1
Daz

Peter Michaux wrote:
<snip>
I think these kinds of things 'evolve'. For something to be worth
killing off completely, there must be a good reason. Personally, I
can't think of anything that is the same today as it was 10 years ago;
however, most things are preserved for historical purposes, or made to
be backwards compatible. The bottom line is that it's not always
possible to create an application and leave it as-is. You will always
have to maintain an application in order to fix bugs, tweak the speed
and algorithms, and basically move with the times. As programming and
scripting languages evolve, so must the programmer.

Programming languages started off simple, with not many to choose
from, and now they are in abundance. I don't think there is anything to
'force' you to change; however, as things become more advanced it's
usually the programmer's preference to utilize the new functionality
and features of new and improved products. Suffice it to say it works
like that in the home, too. Not many people these days own a
gramophone, and those that do probably do so for historical purposes.
My preference is to use CDs...

Nov 11 '06 #2
VK
I'm sure many here have already noticed this but it seems that the
development of the browser world is paralleling the development of the
computer world. However, the browser world is about 20 years behind
like back in the days of mainframes and dumb terminals just capable of
running a window manager. Instead now we have servers and browsers.
IMHO all modern "box applications" are taking the web-application
approach: the program's development life doesn't end at the moment of
being boxed. It remains alive (unless blocked) by connecting to the
producer's site for upgrades and new releases.
In the same way, any web application can be stored on your disk (File >
Save As > Web Page, complete) while still being automatically updated
from the producer's site.
This approach doesn't require any updates to the current UAs. But it is
still merely a reproduction of the older "box application" technique.
What I find really interesting is the possibility of distributed
web applications where, say, the interface comes from a US site, script
blocks from Germany and Finland, and data processing is RMI'ed from a
Japanese database server.

I attended a meeting in San Francisco; these links may be interesting
on the subject:
<http://www.webware.com/8301-1_109-9661722-2.html>
<http://www.webware.com>
Are the
types of browsers we are using now with HTML and CSS a dead end when
things get really advanced in a few years from now?
IMHO the bottleneck (though not a dead end) of all UAs - even the most
modern - is the rendering engine. They are still - like NCSA Mosaic -
slow and lazy tools for displaying text documents with some graphics.
Some day that niche may be closed: but for this the C++ programmer has
to step aside and the assembly programmer take his place. A browser
handling HTML, CSS, JavaScript, SVG/VML - but with the
triangle-processing speed of the Counter-Strike engine: that would be
a Web revolution, allowing the use of many tools that are just waiting
for an appropriate bearer.
See for just one example the A.L.I.C.E. Foundation
<http://www.alicebot.org>. Now replace (in your mind) i) Flash with
high-quality script-driven SVG and ii) the prerecorded,
bandwidth-killing sound stream with XML-based phonetic markup.

Nov 11 '06 #3
VK wrote:
>
IMHO all modern "box applications" are taking the web-application
approach: thus the program development life doesn't end at the moment
of being boxed. It remains alive (unless blocked) by connecting to the
producer sites for upgrades and new releases.
Same way any web-application can be stored on your disk (File Save >
Web Page, complete) but being automatically updated from the producer
site.
The moment you save a web application to your hard drive, it can no
longer communicate using all the same methods, because of JavaScript's
same-origin security rules. At that point many web applications will
break.
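As a rough sketch of why: the browser compares the scheme, host and
port of the page's URL against the URL a script wants to talk to, and
a page saved to disk has a file: URL, so nothing matches any more.
(This is a simplification of the real rules.)

```javascript
// Simplified sketch of the same-origin comparison browsers apply to
// script-initiated requests. Real implementations have more rules.
function sameOrigin(a, b) {
  // Split a URL into scheme, host and port; default ports are
  // ignored here for brevity.
  function parts(url) {
    var m = url.match(/^(\w+):\/\/([^\/:]*)(?::(\d+))?/);
    return { scheme: m[1], host: m[2], port: m[3] || "" };
  }
  var pa = parts(a), pb = parts(b);
  return pa.scheme === pb.scheme &&
         pa.host === pb.host &&
         pa.port === pb.port;
}

// A page loaded from the server may talk to that server...
sameOrigin("http://example.com/app.html", "http://example.com/data"); // true
// ...but the same page saved to disk has a file: origin and may not.
sameOrigin("file:///C:/saved/app.html", "http://example.com/data");   // false
```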

We can prime the browser's cache with data by serving a JavaScript
source file with JSON inside. If the data changes, we can't locally
update that cached data file; we have to download the potentially
large dataset from scratch the next time we visit the website.
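Concretely, such a cached data file might just be a .js file that
assigns a JSON literal to a global, served with a long cache lifetime
(the file, variable and field names here are all made up):

```javascript
// dataset.js -- served once with a long cache lifetime, then reused
// from the browser cache by every page that includes it via
// <script src="dataset.js">.
var cachedDataset = {
  version: 3,
  rows: [
    { id: 1, name: "widget", price: 9.95 },
    { id: 2, name: "gadget", price: 14.5 }
  ]
};

// Any later script on the page can read the cached data directly:
function findById(dataset, id) {
  for (var i = 0; i < dataset.rows.length; i++) {
    if (dataset.rows[i].id === id) return dataset.rows[i];
  }
  return null;
}
```

The drawback is visible here: if one row changes, the whole file has
to be fetched again; script on the page has no way to patch the cached
copy.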

What I see really interesting is the possibilities of
distributed web-applications where say the interface comes from a US
site, script blocks from Germany and Finland and data processing RMI'ed
from a Japan database server.
I can see this really cutting down traffic on the web. If everyone
using Yahoo! UI downloaded it from the same URL, then many sites could
take advantage of Yahoo! UI already being in the browser's cache. This
is possible, isn't it?
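It should be, as long as every page points at the byte-for-byte
identical URL, so the browser's cache key matches. A sketch of what
the shared include would look like (the host, path and version here
are hypothetical, not a real hosted location):

```html
<!-- site-a.com and site-b.com both include the library from the same
     shared third-party URL. After a visitor loads either site, the
     other gets a cache hit instead of a fresh download. -->
<script type="text/javascript"
        src="http://ui.example-cdn.com/yui/2.2.0/yahoo-min.js"></script>
```

The flip side is the one noted just below: the page now also depends
on that third-party host being up.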

Now the probability of a particular page working is decreased because
more than one server has to be up for success.
IMHO the bottleneck (but not a dead end) of all - even the most modern
- UA's is the rendering engine. They are still - like NCSA Mosaic -
slow and lazy tools to display text documents with some graphics. Some
day the niche can be closed: but for this C++ programmer has to step
aside and Assembly programmer take his place.
I can't imagine that a browser rendering engine needs to be
hand-written in assembly to get fast rendering speed. Do they really
write video game rendering engines in assembly? That would be painful!
I imagine the problem in a web page is more one of flowing the page,
which is not something a video game needs to do. Can't a video game
just place everything by absolute pixel position?

-----

I suppose with faster client-server communication we could move in two
directions. One is make the browser smarter and have longer load times
but faster interaction times without server communication during the
page's life. The other is to move more in the direction of
mainframe-terminal design where the browser is capable of only
communicating and rendering.

Ruby on Rails is moving in the second direction. In a Rails app, all
form validation occurs server-side, using Ajax where possible to make
it snappy. This makes sense if validation is considered application
logic and the browser is only part of the view layer in a
model-view-controller architecture.
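In Rails the actual check lives server-side in the model; sketched in
JavaScript here just to keep one language in the thread (the field
names and rules are made up), the server-side half of such an Ajax
validation round trip looks something like:

```javascript
// Sketch of the server-side half of Ajax form validation: the browser
// posts a field name and value, the server runs the real check and
// returns a verdict to render next to the field.
function validateField(field, value) {
  if (field === "email") {
    return /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(value)
      ? { ok: true }
      : { ok: false, message: "is not a valid email address" };
  }
  if (field === "age") {
    var n = parseInt(value, 10);
    return !isNaN(n) && n >= 0
      ? { ok: true }
      : { ok: false, message: "must be a non-negative number" };
  }
  return { ok: true }; // fields with no rule pass by default
}
```

The Ajax request just posts the field value and renders whatever
message comes back, so the browser never holds the validation logic
itself.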

One of Google's directives is that page loads should be instant. I
think that implies the page shouldn't contain much, if any,
application logic, since the server can do this; loading that logic is
not necessary to achieve the instant page load. I think Google would
rather spread communication time over the life of a page than
aggregate as much as possible into the initial page load.

Peter

Nov 11 '06 #4
In article <11**********************@i42g2000cwa.googlegroups.com>,
Peter Michaux <pe**********@gmail.com> writes

<snip>
>As bandwidth increases
Meanwhile, don't forget laptops connected by mobile phone.

people want the client to do more and then even more.
<snip>
Are the types of browsers we are using now with HTML and CSS a dead end
when things get really advanced in a few years from now?
Remind me: why were Java applets invented?

John
--
John Harris
Nov 12 '06 #5

John G Harris wrote:
In article <11**********************@i42g2000cwa.googlegroups.com>,
Peter Michaux <pe**********@gmail.com> writes

<snip>
As bandwidth increases

Meanwhile, don't forget laptops connected by mobile phone.

people want the client to do more and then even
more.
"People" here being developers; users don't care. They will be driven
by completely different requirements and needs, and ultimately they
are the ones who drive technology adoption.

I have been asked to manipulate large datasets in the browser. I
don't necessarily think bandwidth is up to this yet but it isn't my
choice.
Why are Access databases considered only suitable for, say, fewer than
5 users? Downloading large chunks of a db for the client to manipulate
very quickly "hits the wall". A web application with potentially
thousands of concurrent users should never use large datasets on the
client (though "large" has not been defined here), because of
bandwidth and concurrency issues.

I could hand write JavaScript to manipulate this data but what
I really want to do the job is a SQL DBMS application in the browser to
manipulate the data. I can see this situation becoming more common
where people want to cache JavaScript applications in the browser to be
used by other scripts.
No, I think you want better connectivity to an RDBMS so that it (or at
least the server application's business logic layer) does all the real
work and the browser does (mostly) presentation.

There are many client/server models. Every time someone starts putting
more and more processing on the client, someone else turns up with a
model that puts most of it back on the server. I think the direction
is clear: keep the bulk of data and processing on the server, keep the
client for presentation and user interaction.

Of course there are exceptions to the rule, but generally it is the
best model.

Perhaps in five years this will be common. Could
we end up have things like BrowserSQL and BrowserOffice stored in our
cache?
I don't think anyone will want internet connectivity as a prerequisite
for basic office functionality. And anyone working locally will not
want the restrictions of a browser when working with documents.

Web-based office applications are an interesting novelty that might
lead to some useful products, but they will never replace local
applications that provide similar functionality.
I'm curious what obstacles stand in the way of these types of
objectives
The same ones that stand in the way of Access databases being shared
across an enterprise of thousands of users.

and what will have to change to make this happen. Are the
types of browsers we are using now with HTML and CSS a dead end when
things get really advanced in a few years from now?
No, they'll probably be even more popular. I guess we'll have to wait
5yrs to see whose crystal ball is best. :-)

Remind me: why were Java applets invented?
And to extend that rhetorical question: why are they, compared to
JavaScript, almost non-existent on the web?
--
Rob

Nov 13 '06 #6

Peter Michaux wrote:
<snip>

Have you considered:

(a) .NET? If and when the .NET Framework is installed on every PC in
the world (including the Mono project), presumably that framework has
DBMS components, and I believe that Microsoft's intention is that you
should be able to download .NET applications in a similar way to Java
applets. So the future is now, if you want it.

(b) Java? Again, presumably J2SE has DBMS capability, accessible
through applets.

(c) When the introduction into Mozilla of JavaScript 2 and Adobe's
Tamarin (a JIT compiler for JavaScript) takes place, combined with XUL
etc., this could form yet another platform to choose from.

When you compare C#, JavaScript 2, and Java, there is increasingly
little to tell between them.

If each platform also adheres to the same or similar standards, and has
similar performance capability, then perhaps the future will be a
matter for personal taste.

Julian Turner

Nov 13 '06 #7
Peter Michaux wrote:
[snip]
it makes me feel a little as though browser scripting is already
archaic because this phase of computing has already passed.
[/snip]

See in particular
<URL:http://www.mozilla.org/projects/tamarin/faq.html>

Julian Turner

Nov 13 '06 #8
In article <11**********************@m73g2000cwd.googlegroups.com>,
RobG <rg***@iinet.net.au> writes
>
John G Harris wrote:
<snip>
>Remind me: why were Java applets invented?

And to extend that rhetorical question: why they are, compared to
JavaScript, almost non-existent on the web?
Two main reasons:

1 Compatibility

No one could be sure your PC was running the right version(s) of the
support software. Indeed, they couldn't even be sure that the latest
version(s) could be installed on your two-year-old PC.
2 Security

People found they could deliver programs to your PC that were malicious
or criminal.
Would a future super-duper JavaScript avoid these problems?

John
--
John Harris
Nov 13 '06 #9

This thread has been closed and replies have been disabled. Please start a new discussion.
