Bytes IT Community

Pausing execution of an ASP Script


I am working on an ASP page that writes to several databases, ranging
from MDBs to x-base.

One of the tasks involves using an existing highest value from the DB
and incrementing it before inserting a new record.

I am using Application.Lock and .Unlock together with an application
variable to negotiate access to the DB routine to one session (user)
at a time. This is to ensure that the ID numbers are cleanly
incremented, and that no two records have the same ID. These databases
are also written to by a console user app (non-asp). The exclusive
access to the DB *generally* lasts WAY less than 1 second.
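The gate itself is nothing fancy - roughly this (WRITELOCK is just the
name I picked for the application variable; the actual DB work is
elided):

```vbscript
' Serialize the DB routine to one session at a time.
' Application.Lock only guards the check-and-set of the flag;
' it is released before the (comparatively slow) DB work starts.
Application.Lock
If Application("WRITELOCK") Then
    Application.Unlock
    ' another session holds the gate - this is where I want to
    ' pause briefly and retry
Else
    Application("WRITELOCK") = True
    Application.Unlock

    ' ... open DB, read highest ID, increment, insert new record ...

    Application.Lock
    Application("WRITELOCK") = False
    Application.Unlock
End If
```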

But because of the variable nature of things computer related, I have
error trapping in the code that detects a DB open or execute error,
and waits n period of time, and tries again - for a maximum of x
tries. (n is about 1 second, and x is 5 tries)

My problem is in pausing the script. I am a newbie at this, so be
gentle if this is a really stupid question.

You cannot use:

d = 5              ' duration in seconds
x = Timer          ' seconds since midnight
Do While Timer < (x + d)
    ' .... (what, no DoEvents?)
Loop

as it consumes vast amounts of CPU cycles.

There is no Sleep Function, I was amazed to learn.
There is no doevents, to release a dumb loop as above from hogging CPU
time.

I have had no luck in using the WScript.Sleep business....
I'm not quite sure how to reference it.

Is there a better approach to this?

Thanks,
Greg
Dr. Know
Jul 22 '05 #1
7 Replies


Dr. Know wrote:
> I am working on an ASP page that writes to several databases, ranging
> from MDBs to x-base.
>
> One of the tasks involves using an existing highest value from the DB
> and incrementing it before inserting a new record.
This is silly. Why not use an auto-increment field?
Or at least use a table with a single record to generate your new ID's (one
field: lastIssuedID). You can use pessimistic locking to prevent other users
from updating it during the nanosecond it takes for you to grab a new ID and
increment it.
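A sketch of what I mean, assuming a one-row table called Counter with a
lastIssuedID field and an already-open ADO connection cn (the names are
placeholders):

```vbscript
' adOpenKeyset = 1, adLockPessimistic = 2 (constants from adovbs.inc)
Set rs = Server.CreateObject("ADODB.Recordset")
rs.Open "SELECT lastIssuedID FROM Counter", cn, 1, 2
rs("lastIssuedID") = rs("lastIssuedID") + 1   ' provider locks the row
newID = rs("lastIssuedID")
rs.Update                                     ' lock released
rs.Close
```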
> as it consumes vast amounts of CPU cycles.
> There is no Sleep Function, I was amazed to learn.
> There is no doevents, to release a dumb loop as above from hogging CPU
> time.
You don't want to tie up a server thread doing nothing. It's a nice way to
kill your server's scalability. You have to remember: you're not writing a
desktop application.

> I have had no luck in using the WScript.Sleep business....
> I'm not quite sure how to reference it.
Can't be done in ASP.
http://blogs.msdn.com/ericlippert/ar.../07/53155.aspx

> Is there a better approach to this?

There is none in ASP. You would be better off sending a message to the
client to tell them to try again in a few seconds. But if you must do this
automatically, you can redirect to a page whose onload event uses
setTimeout() to wait a second and resubmit the request. Follow up in a
client-side coding group such as .scripting.jscript if you want to pursue
this further.
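Something along these lines - a hypothetical retry page, with the form
action and the 1-second delay purely illustrative:

```vbscript
<%
Dim nRetry
nRetry = 0
If Request("retry") <> "" Then nRetry = CLng(Request("retry"))
%>
<html>
<body onload="setTimeout('document.forms[0].submit()', 1000);">
<form method="post" action="addcustomer.asp">
  <input type="hidden" name="retry" value="<%= nRetry + 1 %>">
  <!-- re-emit the user's original fields as hidden inputs here -->
</form>
Please wait - retrying your request...
</body>
</html>
```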

Bob Barrows
--
Microsoft MVP - ASP/ASP.NET
Please reply to the newsgroup. This email account is my spam trap so I
don't check it very often. If you must reply off-line, then remove the
"NO SPAM"
Jul 22 '05 #2

Bob Barrows [MVP] said:
> Dr. Know wrote:
>> One of the tasks involves using an existing highest value from the DB
>> and incrementing it before inserting a new record.
>
> This is silly. Why not use an auto-increment field?
> Or at least use a table with a single record to generate your new ID's
> (one field: lastIssuedID). You can use pessimistic locking to prevent
> other users from updating it during the nanosecond it takes for you to
> grab a new ID and increment it.


Well, sure it's silly, but it's the way it has to be.
This is *legacy* work. Existing DB formats and tables. They cannot
be changed. The native apps would choke on it... and you have to
retrieve the field as it is updated by the native app as well as the ASP
script.

Hey, don't blame me - I didn't write it! ;-)
>> as it consumes vast amounts of CPU cycles.
>> There is no Sleep Function, I was amazed to learn.
>
> You don't want to tie up a server thread doing nothing. It's a nice way
> to kill your server's scalability. You have to remember: you're not
> writing a desktop application.


Exactly my point. I was looking for a thread friendly way to pause an
ASP script.
>> I have had no luck in using the WScript.Sleep business....
>> I'm not quite sure how to reference it.
>
> Can't be done in ASP.
> http://blogs.msdn.com/ericlippert/ar.../07/53155.aspx


That explains THAT! Rather odd omission, however, considering how
useful it might be if not misused.
>> Is there a better approach to this?
>
> There is none in ASP. You would be better off sending a message to the
> client to tell them to try again in a few seconds. But if you must do
> this automatically, you can redirect to a page whose onload event uses
> setTimeout() to wait a second and resubmit the request. Follow up in a
> client-side coding group such as .scripting.jscript if you want to
> pursue this further.


OK, I'm just going to hammer it 5 times, and then return an error
message and suggest they repost the form. Darned... I only needed a
half second delay...

Thanks,

Greg
Dr. Know
Jul 22 '05 #3

Dr. Know wrote:
> Bob Barrows [MVP] said:
>> Dr. Know wrote:
>>> One of the tasks involves using an existing highest value from the
>>> DB and incrementing it before inserting a new record.
>>
>> This is silly. Why not use an auto-increment field?
>> Or at least use a table with a single record to generate your new
>> ID's (one field: lastIssuedID). You can use pessimistic locking to
>> prevent other users from updating it during the nanosecond it takes
>> for you to grab a new ID and increment it.
>
> Well, sure it's silly, but it's the way it has to be.
> This is *legacy* work. Existing DB formats and tables. They cannot
> be changed. The native apps would choke on it... and you have to
> retrieve the field as it is updated by the native app as well as the
> ASP script.


So use a separate database to generate your ID's - don't tie up your
database, application and web server for this purpose.
> Hey, don't blame me - I didn't write it! ;-)
>>> as it consumes vast amounts of CPU cycles.
>>> There is no Sleep Function, I was amazed to learn.
>>
>> You don't want to tie up a server thread doing nothing. It's a nice
>> way to kill your server's scalability. You have to remember: you're
>> not writing a desktop application.
>
> Exactly my point. I was looking for a thread friendly way to pause an
> ASP script.


There isn't one. :-)
>>> I have had no luck in using the WScript.Sleep business....
>>> I'm not quite sure how to reference it.
>>
>> Can't be done in ASP.
>> http://blogs.msdn.com/ericlippert/ar.../07/53155.aspx
>
> That explains THAT! Rather odd omission,


"Odd"? I thought he rather clearly explained why it was a deliberate
omission ...
> however, considering how
> useful it might be if not misused.
It can't NOT be misused. This is a server, not a desktop.
>>> Is there a better approach to this?
>>
>> There is none in ASP. You would be better off sending a message to
>> the client to tell them to try again in a few seconds. But if you
>> must do this automatically, you can redirect to a page whose onload
>> event uses setTimeout() to wait a second and resubmit the request.
>> Follow up in a client-side coding group such as .scripting.jscript
>> if you want to pursue this further.


> OK, I'm just going to hammer it 5 times, and then return an error
> message and suggest they repost the form. Darned... I only needed a
> half second delay...


No, don't hammer it. (by "hammer", I'm assuming you mean that you are going
to use the CPU-intensive code to "pause" it) Don't tie up your server that
way. There are other ways to generate unique ID's. I've already mentioned
the possibility of creating a separate database containing a lastID table.
There's nothing forcing you to use your legacy database for this.

Another possibility is to redirect the page to itself, using a querystring
variable to control the number of retries. Kludgy, but ...
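A sketch of that approach (TryInsert is a placeholder for whatever
routine does the actual insert and returns True on success):

```vbscript
Dim nTry
nTry = 0
If Request.QueryString("try") <> "" Then
    nTry = CLng(Request.QueryString("try"))
End If

If Not TryInsert() Then               ' lock still held?
    If nTry < 5 Then
        ' the client round-trip itself provides a short delay
        Response.Redirect Request.ServerVariables("SCRIPT_NAME") _
            & "?try=" & (nTry + 1)
    Else
        Response.Write "Could not update the database - please try again."
    End If
End If
```

Note that if the original request was a POST, you'd have to carry the
form data along in the querystring or a Session variable, since the
redirect drops it.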

But again, I strongly discourage serializing your database access in the way
you have designed it. If you can't modify the design of your main database,
create a separate mdb file containing a table with a single field. If you
make it two fields, one of which is an autonumber field, you don't even have
to worry about pessimistic locking - just connect to the database, insert a
record, and select @@Identity. You will have your unique ID. Close the
connection and connect to your legacy database.
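In code, that's roughly this (ids.mdb and the NewID table are made-up
names; @@Identity requires Jet 4.0, and it must be read on the same
connection that did the insert):

```vbscript
Set cn = Server.CreateObject("ADODB.Connection")
cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & _
        Server.MapPath("ids.mdb")
cn.Execute "INSERT INTO NewID (dummy) VALUES ('x')"
Set rs = cn.Execute("SELECT @@IDENTITY")
newID = rs(0)                         ' your new unique ID
cn.Close
' ... now connect to the legacy database and insert using newID ...
```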

See http://www.aspfaq.com/show.asp?id=2174

Bob Barrows
--
Microsoft MVP - ASP/ASP.NET
Please reply to the newsgroup. This email account is my spam trap so I
don't check it very often. If you must reply off-line, then remove the
"NO SPAM"
Jul 22 '05 #4

Bob Barrows [MVP] said:
> So use a separate database to generate your ID's - don't tie up your
> database, application and web server for this purpose.
First, thanks for the information!
You seem quite knowledgeable on the subject - I'm a newcomer...

What I apparently failed to make clear is that the native app
increments the ID code on its own when a terminal user adds a record.
The IDs are sequential and have to be ordered, not just unique.

The ASP page also has to increment the same ID number while respecting
any changes made by the terminal app. I cannot rewrite the original
app to recognize a secondary ID database or to use auto number or
replication ID fields. Therefore some sort of sync must be maintained
between the two clients. I see no way of doing this. It barely
recognizes record locks...

The only saving grace to this is that it will see limited usage. (less
than 50 additions per day).
>> OK, I'm just going to hammer it 5 times, and then return an error
>> message and suggest they repost the form. Darned... I only needed a
>> half second delay...
>
> No, don't hammer it. (by "hammer", I'm assuming you mean that you are
> going to use the CPU-intensive code to "pause" it) Don't tie up your
> server that way. There are other ways to generate unique ID's. I've
> already mentioned the possibility of creating a separate database
> containing a lastID table. There's nothing forcing you to use your
> legacy database for this.


No, I meant that I would simply retry the command 5 times with no
(Sleep) delay before giving up, and returning an error that the record
could not be locked or appended. My perfectionist tendencies preclude
using any routine which consumes 100% CPU time... ;-)

I also don't want to simply repeat the command until it succeeds. I
was looking to allow a reasonable time limit for the other processes
to give up the record locks, then give up or proceed accordingly.
> But again, I strongly discourage serializing your database access in
> the way you have designed it. If you can't modify the design of your
> main database, create a separate mdb file containing a table with a
> single field. If you make it two fields, one of which is an autonumber
> field, you don't even have to worry about pessimistic locking - just
> connect to the database, insert a record, and select @@Identity. You
> will have your unique ID. Close the connection and connect to your
> legacy database.


Believe me - I didn't design this thing. That's why it's giving me
such fits. But the customer wants this add-on - BAD! The
pre-existing app is a relic from the dark ages of DOS. All the
databases are pre-existing. It's still being sold - complete with
outrageous licensing and maintenance fees!
Thanks,
Greg


Dr. Know
Jul 22 '05 #5

It's still not completely clear what you are trying to do. Does the other
application control the id or is it a matter of avoiding 2 updates at the
same time? Is the counter in a separate table or is it part of the table
containing other data?

If the above are true you can roll your own optimistic locking mechanism
like this (adjust the names to match your database, fill in missing code for
executes, etc):

nRows = 0
Do While nRows <> 1 'probably want a "too many tries" bailout

    strSQL = "select count from counter"
    Set rsData = dbconn.Execute(strSQL)

    nCount = CLng(rsData("count"))
    nNewID = nCount + 1

    strSQL = "update counter set count = " & nNewID & _
             " where count = " & nCount
    dbconn.Execute strSQL, nRows

Loop

You can get a delay in an ASP page by creating or purchasing a COM
component that calls the winapi Sleep function, but we have the above
code working with no complaints (and far more than 50 calls per day). The
above code is known to work with Access, Oracle, and SQL Server. You
would need to test it on other DBMSs.

--
--Mark Schupp
Head of Development
Integrity eLearning
www.ielearning.com

Jul 22 '05 #6

Mark Schupp said:
> It's still not completely clear what you are trying to do. Does the
> other application control the id or is it a matter of avoiding 2
> updates at the same time? Is the counter in a separate table or is it
> part of the table containing other data?
The other app uses the existing highest value of one field in a table
of 31 fields to determine the next ID number. They are sequential
with no gaps. The old app cannot be re-written to use autonumber
fields - even though that is essentially the field's function.

My ASP script has to do the same, while respecting any changes made by
the old app and also any record locks applied by old app.
I realize it's difficult in a forum of this type to fully blueprint
all the particulars of a DB design, but I'll try to keep it brief.
First, some background info...

Let's say the Customer is an electronic parts distributor.
The Customer has an app he has used for 10+ years - let's just call it
DOA (Dumb Old App) to avoid being sued by its greedy vendor...

All of his customer detail data, ship to's, remarks, discount info,
accounting information, invoicing, GL, etc is controlled by DOA and
its numerous free file x-base "tables". There are a total of five
x-base "tables" used for the customer account info alone. There are
no database containers used - they are all individual .DBF and .CDX
files - dozens of them.

To demonstrate how bad this stuff is, all fields are character types,
including numbers and currency, and some data is referenced by its
physical location within the field. i.e.
ID = Mid(fld,1,14)
Dept = Mid(fld,14,4)

Anyway...

The customer also has a third party add-on (let's call it EOE for
Electronic Order Entry) that writes orders placed over the internet
into the DOA system. EOE uses an automated Telnet session to write
the data into DOA's Order Entry screen by stuffing. See, I warned you
this was all One Big Kludge...

In the defense of EOE's author, he began by writing the info directly
into the tables, but the vendor of DOA added a complicated checksum
field to the order tables to discourage foreign access, primarily
because it was cutting into the sales of their own pathetic,
overpriced Web Order add-on. But I digress...

EOE will not allow an online order to be placed unless the customer
account info is already in the DOA databases.

This is where I come into the picture...

The Customer wants to allow new customer account information to be
generated by online users as well, so that customers can then place
orders within minutes of submitting their account info. I have
reservations about this, and have advised the Customer, but he wants
it anyway. I have implemented counters to control the volume of post
submissions from any given IP address and session - in case *someone*
decides to fill his accounting system with 10,000 bogus records.
So, I have written an ASP server script that collects user
information, checks it for validity, and writes it to the various
tables involved. There are 5 x-base (.DBF) tables and 3 access (.MDB)
tables involved in the two systems.

The primary customer info table has 31 fields, one of which is an ID
field - an incrementing value that must remain one and only one above
the existing high value. Each customer that is added increments the
existing highest value in this field and is appended with this number
as its ID field. There are 3 other ID fields as well, but we won't
go into that... :-|

A generic flow diagram for what I need to do is as follows:

------------------------------------------------------------------------

Get USER info from posted form
Validate info
If OK then

x = 0
do while application("WRITELOCK") and x < 5
pause(1 second)
x = x + 1
loop
if application("WRITELOCK") then
exit with cannot get exclusive ASP access timeout error
else
application.lock
application("WRITELOCK") = TRUE
x = 0
on error resume next
dbc.open connection to DBF provider
do while dbc.status <> open and x < 5
pause(1 second)
x = x + 1
dbc.open connection to DBF provider
loop
on error goto 0
if dbc.status <> open then
exit with cannot open provider error
else
x = 0
on error resume next
dbf.open command object
do while dbc.error <> OK and x < 5
pause(1 second)
x = x + 1
dbf.open command object
loop
on error goto 0
if dbf.error <> OK then
exit with cannot open command object for insert error
else
x = 0
on error resume next
dbf.execute SQL string
do while dbf.error <> OK and x < 5
pause(1 second)
x = x + 1
dbf.execute SQL string
loop
on error goto 0
if dbf.error <> OK then
exit with cannot execute command error
else
Operation was a success!
end if
end if
end if
end if
application("WRITELOCK") = FALSE
application.unlock

-----------------------------------------------------------------------

Normally, I can limit access to the DB to one ASP session at a time
with the application.lock/unlock and WRITELOCK application variable.

But if one session has a WRITELOCK at the instant another ASP session
tries to acquire the lock, it immediately fails and returns an error.
No matter that the WRITELOCK would be lifted mere milliseconds from
that instant. You can apparently either have a busy-wait loop or
return an error to the client - neither a very elegant solution.

My code attempts to provide a pause and retry method - not only for
the ASP session locks but for any recordset locks placed by DOA.

If a dbf.open.recordset denyotherwrites command, for example, is
executed against a DBF that is either file locked (exclusive) or
record locked (pessimistic or optimistic) by another application, it
immediately fails. The commandtimeout and connectiontimeout
parameters seem to have no effect on this. Again, it doesn't consider
that the locks could be lifted mere milliseconds from the first
attempt. Some of this behaviour may be in the nature of the VFP OLE
DB drivers. I'm new to all of this...

*This* is why I wanted a thread-safe Sleep function.

> If the above are true you can roll your own optimistic locking
> mechanism like this (adjust the names to match your database, fill in
> missing code for executes, etc):
This is similar to my code above, without the Sleep and using a
different mechanism to detect errors. In my mind, however, this still
constitutes a busy-wait situation that repeatedly hammers the execute
method - doesn't it?

> You can get a delay in an ASP page by creating or purchasing a COM
> component that calls the winapi Sleep function but we have the above
> code working with no complaints (and far more than 50 calls per day).
> The above code is known to work with Access, Oracle, and SQL Server.
> You would need to test it on other DBMSs.


I wanted to try wrapping the System.Threading.Thread.Sleep() or the
Win32 Sleep() API function into a multi-thread safe COM object, but I
have NO idea how to do this...

Ahh... life goes on....

Thanks,

Greg
Dr. Know
Jul 22 '05 #7

> To demonstrate how bad this stuff is, all fields are character types,
> including numbers and currency, and some data is referenced by its
> physical location within the field. i.e.
> ID = Mid(fld,1,14)
> Dept = Mid(fld,14,4)


I'd just retry it 10 or 20 times. With only 50 or so adds a day, I
don't see that as a problem.
Good luck... my job suddenly looks much better!

Jul 22 '05 #8
