
SQL Server to Generate XML instead of Redundant Queries... file open during write problem, or job not running...

Here's a challenge that is killing me:

I've got 2 web servers and a SQL Server and about 5,000 'users' who
stay connected to the site all day. I have a page that is supposed to
be 'real-time', so to do this, I have a 1px frame that refreshes every
15 seconds (so the other frame doesn't have to reload all the time--the
top only reloads when a new record or a changed record hits the db).
The real time data can be filtered in about 8 different ways.
Currently, I have each user querying a table that contains 1 record,
including the max ID and the most current insert/update posting date.
The browser contains a cookie with that date. When the browser receives
notification that there is some new info on the server, it refreshes
the top page and reloads the data. This is happening for all users. So,
to eliminate 5,000 users running the same query (or close variations of
it) each time a record is inserted or updated, I thought I would
generate an XML file with the current day's data.

In a dev environment this works 'ok'. I'm doing this by running an
ActiveX job on the SQL Server that calls a stored proc (FOR XML) and
writes the content to a file. Then from the web servers, I'm querying
this file for the new timestamp and then if newer than the cookie,
grabbing the XML (using the httprequest in the ASP XMLDOM) and using
XSLT to transform the data instead of parameterizing the queries.
Theoretically I love this solution. Problems happen in a LIVE
environment where either the file is being written to or the job isn't
able to run. When two records are written within the same second, the
file doesn't get written (or maybe the HTTP request for the XML is
keeping the file locked?). Anyway, this is a HUGE problem that I can't
seem to solve. Once we roll to .NET, I'm thinking of storing the
dataset in cache and updating the cache (though I still don't know how
I'll trigger that without each user checking the db).
Long winded, sorry... help?
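
(For reference, the job step boils down to roughly this C# sketch. The stored
procedure name, connection string, and file path are placeholders, and the
real job step is an ActiveX script rather than a .NET program.)

// Sketch: run the FOR XML stored procedure and dump its output to the
// snapshot file the web servers poll. All names below are hypothetical.
using System.Data;
using System.Data.SqlClient;
using System.Xml;

class SnapshotWriter
{
    static void Main()
    {
        const string connStr = "Server=SQLBOX;Database=AppDb;Integrated Security=SSPI;";

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand("usp_GetTodaysDataAsXml", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            conn.Open();

            // ExecuteXmlReader streams the FOR XML result back as an XML fragment.
            using (XmlReader reader = cmd.ExecuteXmlReader())
            using (XmlWriter writer = XmlWriter.Create(@"D:\feeds\today.xml"))
            {
                writer.WriteStartElement("snapshot");   // wrap the fragment in a root
                reader.MoveToContent();
                while (!reader.EOF)
                {
                    if (reader.NodeType == XmlNodeType.Element)
                        writer.WriteNode(reader, true); // copy the element, advance the reader
                    else
                        reader.Read();
                }
                writer.WriteEndElement();
            }
        }
    }
}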

Jul 23 '05 #1
8 Replies


Phoenix (ph*********@yahoo.com) writes:
> In a dev environment this works 'ok'. I'm doing this by running an
> ActiveX job on the SQL Server that calls a stored proc (FOR XML) and
> writes the content to a file. [rest of original post snipped]
I am not sure that I understand half of it. But it may not be an SQL Server
question. You do mention that you have a job that creates a file, so
that is indeed something that concerns Agent. But then there appears to be
some interaction with a web server. If you really think that this
is an SQL Server problem, it would help if you could be more specific
about what your problem is.

In another post you say:
> I am curious if a user is trying to access the file while it is being
> written if there would be major problems. Such as what if these
> dropdowns were being updated constantly but it is still more efficient
> to write a file than have thousands of people querying the db for the
> same thing...?


And since I read posts in reverse order, I saw this one first and
understood almost nothing at all at first. While this last question is
not really about SQL Server at all - SQL Server itself does not usually
write files - my general programming experience tells me that you should
probably create the file under a new name, and then perform a swap to
minimize the risk described above.
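
Something like this C# sketch, for instance (paths are placeholders; the point
is that the live file only ever changes in one swap, so a reader never sees a
half-written snapshot):

using System.IO;

class SnapshotPublisher
{
    // Write the new snapshot under a temporary name first, then swap it
    // into place. Paths are hypothetical.
    public static void Publish(string xml)
    {
        string tempPath = @"D:\feeds\today.tmp";
        string livePath = @"D:\feeds\today.xml";

        File.WriteAllText(tempPath, xml);            // finish writing before exposing it

        if (File.Exists(livePath))
            File.Replace(tempPath, livePath, null);  // replace the live file in one step
        else
            File.Move(tempPath, livePath);           // first run: nothing to replace yet
    }
}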

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 23 '05 #2

"Phoenix" <ph*********@yahoo.com> wrote in message
news:11*********************@z14g2000cwz.googlegro ups.com...
Here's a challenge that is killing me: <<>> In a dev environment this works 'ok'. I'm doing this by running an
ActiveX job on the SQL Server that calls a stored proc (FOR XML) and
writes the content to a file. Then from the web servers, I'm querying
this file for the new timestamp and then if newer than the cookie,
grabbing the XML (using the httprequest in the ASP XMLDOM) and using
XSLT to transform the data instead of parameterizing the queries.
Theoretically I love this solution. Problems happen in a LIVE
environment where either the file is being written to or the job isn't
able to run. When 2 records are trying to be written within the same
second, the file isn't being written (or maybe that the http requesting
the XML is keeping the file locked?)....anyway...this is a HUGE proble
that I can't seem to solve. Once we roll to .NET I think storing the
dataset in cache and updating cache (still don't know how I'll trigger
that without each user checking the db)....
Long winded, sorry...help?


With ASP.NET you have a mechanism called shadow copying that allows new
versions of files, like DLLs, to be written whilst users have them locked.
Maybe that's something to investigate.

--
Regards,
Andy O'Neill
Jul 23 '05 #3

Here is one suggestion.

Your web server could maintain an application-level cache (of the source XML
or the result HTML). For each query it runs, it checks the cache and uses the
contained data if it exists; if not, it hits the DB and caches the result. Use
the text of the query (or a derived hash) as the lookup key.

When you detect that the data has changed, just dump the cache.
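
With ASP.NET's cache, a rough sketch of that idea might look like this (the
key scheme and the GetFromDatabase call are made up for illustration):

using System;
using System.Web;
using System.Web.Caching;

// Sketch: cache each query's result under a key derived from the query text
// and evict it when the underlying data is known to have changed.
public class QueryCache
{
    public static string GetResult(string queryText)
    {
        string key = "qry:" + queryText.GetHashCode();
        string cached = (string)HttpRuntime.Cache[key];
        if (cached != null)
            return cached;                           // served from cache, no DB hit

        string fresh = GetFromDatabase(queryText);   // hypothetical data-access call
        HttpRuntime.Cache.Insert(key, fresh, null,
            DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration);
        return fresh;
    }

    // Call this when a new or changed record is detected.
    public static void Dump(string queryText)
    {
        HttpRuntime.Cache.Remove("qry:" + queryText.GetHashCode());
    }

    static string GetFromDatabase(string queryText)
    {
        // ...run the real (parameterized) query or FOR XML proc here...
        return "<data/>";
    }
}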

Mr Tea
http://mr-tea.blogspot.com
"Phoenix" <ph*********@yahoo.com> wrote in message
news:11*********************@z14g2000cwz.googlegro ups.com...
Here's a challenge that is killing me:

I've got 2 web servers and a SQL Server and about 5,000 'users' who
stay connected to the site all day. I have a page that is supposed to
be 'real-time', so to do this, I have a 1px frame that refreshes every
15 seconds (so the other frame doesn't have to reload all the time--the
top only reloads when a new record or a changed record hits the db).
The real time data can be filtered in about 8 different ways.
Currently, I have each user querying a table that contains 1 record,
including the max ID and the most current insert/update posting date.
The browser contains a cookie with that date. When the browser receives
notification that there is some new info on the server, it refreshes
the top page and reloads the data. This is happening for all users. So,
I thought to eliminate the 5,000 users running the same (or close
variations) of the same query each time a records is inserted/updated,
that I would generate an XML file with the current day's data.

In a dev environment this works 'ok'. I'm doing this by running an
ActiveX job on the SQL Server that calls a stored proc (FOR XML) and
writes the content to a file. Then from the web servers, I'm querying
this file for the new timestamp and then if newer than the cookie,
grabbing the XML (using the httprequest in the ASP XMLDOM) and using
XSLT to transform the data instead of parameterizing the queries.
Theoretically I love this solution. Problems happen in a LIVE
environment where either the file is being written to or the job isn't
able to run. When 2 records are trying to be written within the same
second, the file isn't being written (or maybe that the http requesting
the XML is keeping the file locked?)....anyway...this is a HUGE proble
that I can't seem to solve. Once we roll to .NET I think storing the
dataset in cache and updating cache (still don't know how I'll trigger
that without each user checking the db)....
Long winded, sorry...help?

Jul 23 '05 #4

So ultimately, each user would have to see if the cache holds the latest
dataset by checking the database for the latest update timestamp and
then verifying it with the cache timestamp. Right? Or is there a way to
do this otherwise? I like the idea of shadow copying. If I could write
to a file on each web server, then use the .NET cache that is triggered
to update by the change in a file, then I could compare a user's cookie
timestamp to the cache timestamp, and that would work great. The only
problem that then remains is how to get the XML file to the web servers
from the SQL Server in a 'real time' fashion. Any suggestions?
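
(The file-triggered cache piece would look roughly like this in ASP.NET -- a
sketch with a placeholder path; CacheDependency evicts the entry whenever the
file on disk is rewritten.)

using System.IO;
using System.Web;
using System.Web.Caching;

// Sketch: keep the day's XML in the ASP.NET cache and let a file dependency
// evict it whenever the snapshot file is rewritten. The path is hypothetical.
public class FeedCache
{
    const string Key      = "todayXml";
    const string FilePath = @"D:\feeds\today.xml";

    public static string GetTodayXml()
    {
        string xml = (string)HttpRuntime.Cache[Key];
        if (xml == null)
        {
            xml = File.ReadAllText(FilePath);
            // The entry is removed automatically when the file changes on disk.
            HttpRuntime.Cache.Insert(Key, xml, new CacheDependency(FilePath));
        }
        return xml;
    }
}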

Jul 23 '05 #5

You could cache (at the web server application level again) the last time the
DB was hit to check for changes, and don't hit it again until at least a few
seconds have passed.
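
A tiny sketch of that throttle (the interval and names are made up; in a web
app this would be shared state across requests):

using System;

// Sketch: remember when the DB was last polled for changes and skip any
// check that arrives within the next few seconds, so thousands of clients
// don't all hit the database at once.
public class ChangeCheckThrottle
{
    static readonly object Gate = new object();
    static readonly TimeSpan MinInterval = TimeSpan.FromSeconds(5);
    static DateTime lastCheck = DateTime.MinValue;

    // Returns true only when enough time has passed to justify another DB hit.
    public static bool ShouldCheckDatabase()
    {
        lock (Gate)
        {
            if (DateTime.UtcNow - lastCheck < MinInterval)
                return false;
            lastCheck = DateTime.UtcNow;
            return true;
        }
    }
}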

Mr Tea
http://mr-tea.blogspot.com
"Jenn Lee" <ph*********@yahoo.com> wrote in message
news:41**********@127.0.0.1...
So ultimately, each user would have to see if the cache holds the latest
dataset by checking the database for the latest update timestamp and
then verifying it with the cache timestamp. Right? Or is there a way to
do this otherwise? I like the idea of shadow copying. If I could write
to a file on each web server, then use the .NET cache that is triggered
to update by the change in a file, then I could compare a user's cookie
timestamp to the cache timestamp and that would work great. The only
problem then stands is how get the xml file to the web servers from the
SQL Server in a 'real time' fashion. Any suggestions?

*** Sent via Developersdex http://www.developersdex.com ***
Don't just participate in USENET...get rewarded for it!

Jul 23 '05 #6

"Jenn Lee" <ph*********@yahoo.com> wrote in message
news:41**********@127.0.0.1...
So ultimately, each user would have to see if the cache holds the latest
dataset by checking the database for the latest update timestamp and
then verifying it with the cache timestamp. Right? Or is there a way to
do this otherwise? I like the idea of shadow copying. If I could write
to a file on each web server, then use the .NET cache that is triggered
to update by the change in a file, then I could compare a user's cookie
timestamp to the cache timestamp and that would work great. The only
problem then stands is how get the xml file to the web servers from the
SQL Server in a 'real time' fashion. Any suggestions?


Either run a job, or have a process that runs on the web server and writes the
file, sleeps for 15 seconds, and then repeats...
I prefer the job mechanism, as a sleeping process can be messy.
Shadow copy is intended to get past the file-locking issue with DLLs in the
old ASP.
Essentially.... you just need to copy or write an xml file into the \bin
directory of the app and that's it.
The shadow copying mechanism ( used by the asp.net worker process ) takes
care of creating a temporary directory and a copy of the file which the
users actually read from and removing this as soon as no user is still using
the old one.
The simplicity of coding is why I suggested it.
I like simple.
I'd read up a bit more on shadow copying before launching into this approach
since I've not used it in this way.
There might be a gotcha lurking about.

I wonder if you could get away with running the job every minute.
15 seconds isn't real time, so I wonder if every minute would be near enough
anyhow.

--
Regards,
Andy O'Neill
Jul 23 '05 #7

I will read up on the shadow copying, thanks for the suggestion. Now if
I can only get my SQL Job to control itself if 3 people are inserting
records simultaneously (thereby calling the Job that writes the XML
file)...ugh.

As for a sleeping job, I've got clients checking back at 15 second
intervals but the clients become staggered so the most current data
needs to be available as soon as it is posted to the db. (Time sensitive
Stock market research.)

Thank you!
Jul 23 '05 #8

"Jenn Lee" <ph*********@yahoo.com> wrote in message
news:41**********@127.0.0.1...
I will read up on the shadow copying, thanks for the suggestion. Now if
I can only get my SQL Job to control itself if 3 people are inserting
records simultaneously (thereby calling the Job that writes the XML
file)...ugh.

sp_help_job can tell you if a job is already running.
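
From .NET, for example, something along these lines (job name and connection
string are placeholders, and the result-set shape is worth double-checking
against Books Online):

using System;
using System.Data;
using System.Data.SqlClient;

// Sketch: ask SQL Server Agent whether the snapshot job is already running
// before kicking it off again.
public class JobStatus
{
    public static bool IsJobRunning(string connStr, string jobName)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand("msdb.dbo.sp_help_job", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@job_name", jobName);
            cmd.Parameters.AddWithValue("@job_aspect", "JOB");   // return only the job row

            conn.Open();
            using (SqlDataReader rdr = cmd.ExecuteReader())
            {
                if (rdr.Read())
                {
                    // current_execution_status: 1 = executing, 4 = idle
                    return Convert.ToInt32(rdr["current_execution_status"]) == 1;
                }
            }
        }
        return false;
    }
}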

--
Regards,
Andy O'Neill
Jul 23 '05 #9
