(excuse the caps).
An RDBMS excels when set-based operations are used. That is, if you want
data, you retrieve a set of data using SELECT. If you want to delete data,
you delete a set of data using DELETE.
If you wish to update data, you use an UPDATE statement. It is by far the
most efficient way of performing the operation. Updating, say, 10,000 records
should take a second or so on a moderately powerful server. It can of course
take a lot longer if there are complex joins in the update statement.
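To make that concrete, here is a minimal sketch of a set-based update. The thread is about SQL Server, but the principle is the same everywhere, so this uses Python's built-in sqlite3 module; the `orders` table and the price calculation are invented for illustration:

```python
import sqlite3

# Toy table standing in for the poster's data; SQLite here, but the
# same single-statement approach applies on SQL Server or any RDBMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, price REAL)")
conn.executemany("INSERT INTO orders (price) VALUES (?)",
                 [(float(p),) for p in range(10_000)])

# One set-based statement updates all 10,000 rows in a single pass,
# rather than fetching and rewriting them one at a time.
cur = conn.execute("UPDATE orders SET price = price * 1.1")
print(cur.rowcount)  # 10000
```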
Using a cursor operation with procedural logic is akin to reading tape files
on a mainframe and processing one record at a time - it is inherently very
slow. In a client-server environment, each fetch of the next record can
involve a network round trip to the server, which, with network latency,
server latency, etc., makes for a slow process. Effectively you are treating
the data source as a big serial file...
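The row-at-a-time pattern being criticised looks roughly like this (again a sqlite3 sketch with an invented `items` table; against a networked server, every execute inside the loop would pay a round trip):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, qty INTEGER)")
conn.executemany("INSERT INTO items (qty) VALUES (?)",
                 [(q,) for q in range(100)])

# The "big serial file" anti-pattern: fetch the ids, then issue one
# UPDATE per row. On a real client-server setup, each iteration pays
# a network round trip plus server latency.
ids = [row[0] for row in conn.execute("SELECT id FROM items")]
for item_id in ids:
    conn.execute("UPDATE items SET qty = qty + 1 WHERE id = ?", (item_id,))

# The set-based equivalent: one statement, one round trip.
conn.execute("UPDATE items SET qty = qty + 1")
```

Both versions produce the same data; the difference is purely in how many trips to the server it takes to get there.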
Certainly there are some situations where the value to use in each record's
update requires a custom calculation, but where does that data come from?
If it comes from the database, then the idea is to cut out the network and
the client, and devise an SQL statement that executes the UPDATE for all
records in one statement on the server. If some of the data needed lives on
the client, then bung it in the database in some suitable table for the
duration of the update, so it is available to the calculation.
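A sketch of that "bung it in the database" idea, again with sqlite3 (the `products` table, the `adj` staging table, and the adjustment figures are all invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, price REAL)")
conn.executemany("INSERT INTO products (id, price) VALUES (?, ?)",
                 [(1, 10.0), (2, 20.0), (3, 30.0)])

# Data that only the client knows: load it into a staging table for
# the duration of the update so the server can use it in the calculation.
client_adjustments = [(1, 0.5), (3, -2.0)]   # (product id, price delta)
conn.execute("CREATE TEMP TABLE adj (id INTEGER PRIMARY KEY, delta REAL)")
conn.executemany("INSERT INTO adj VALUES (?, ?)", client_adjustments)

# Now a single server-side statement performs the whole custom update.
conn.execute("""
    UPDATE products
    SET price = price + (SELECT delta FROM adj WHERE adj.id = products.id)
    WHERE id IN (SELECT id FROM adj)
""")
```

On SQL Server the staging table would typically be a temp table or table variable, and the final statement an UPDATE with a JOIN; the shape of the solution is the same.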
There are techniques to batch the update into smaller chunks, for example by
using the TOP operator.
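On SQL Server that would be an `UPDATE TOP (n) ...` repeated until @@ROWCOUNT reaches zero. SQLite has no TOP, so this sketch uses a LIMITed id subquery to play the same role (the `todo` table and batch size are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE todo (id INTEGER PRIMARY KEY, done INTEGER)")
conn.executemany("INSERT INTO todo (done) VALUES (?)", [(0,)] * 25)

# Keep updating fixed-size chunks until nothing is left to touch.
# On SQL Server the loop body would be "UPDATE TOP (1000) todo SET ..."
# repeated while @@ROWCOUNT > 0.
BATCH = 10
batches = 0
while True:
    cur = conn.execute(
        "UPDATE todo SET done = 1 "
        "WHERE id IN (SELECT id FROM todo WHERE done = 0 LIMIT ?)",
        (BATCH,))
    if cur.rowcount == 0:
        break
    batches += 1
print(batches)  # 3 chunks: 10 + 10 + 5 rows
```

Chunking like this keeps each transaction and lock footprint small, which matters for updates of the size the original poster describes.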
As for the cause of your failure: as Val says, it is difficult to give an
answer without an error message. Is there anything in the event log? I would
suspect you are hitting a timeout of some sort. IMHO ASP.NET is not designed
for large batch operations.
- Tim
"Zeng" <Ze******@hotmail.com> wrote in message
news:%2******************@TK2MSFTNGP12.phx.gbl...
How come you didn't think it was a good idea to make updates one-by-one in
an asp.net environment? By the way, each update is a well-defined and
re-usable operation; doing it otherwise would risk code inconsistency, and I
have to do this once every two weeks. MS must have tried something like
this to stress test the framework, right? Basically just do a big loop and
do several db reads/updates each iteration. My loop failed around the
10,000th record. If all resources (memory etc.) are released and/or recycled
properly, why would it matter whether it's a big loop or not?
"Val Mazur (MVP)" <gr******@hotmail.com> wrote in message
news:OD**************@TK2MSFTNGP10.phx.gbl...
Hi,
Without an error message it is hard to say anything, but can you not make
the update in some sort of batch? I do not think it is a good idea to make
updates one-by-one, especially in an ASP.NET environment. I am pretty sure
there is a better way to do this, but it depends on what you need to achieve.
--
Val Mazur
Microsoft MVP
http://xport.mvps.org
"Zeng" <Ze******@hotmail.com> wrote in message
news:ub**************@TK2MSFTNGP10.phx.gbl...
> Hello,
>
> I'm wondering if anybody has seen this problem. I basically need to cycle
> through ~30000 db rows to update the data, I load up the id of the rows I
> need first, put them into ArrayList, close the connection, then process
> through one record at a time, so there is no nested transaction. It
> normally takes 1 hour or more; after about 45 min, the aspx page gives up
> with a server unavailable msg, but the server still goes on in the
> background for 15 min or more and then stops.
> App event in the server shows this: aspnet_wp.exe (PID: 2400) stopped
> unexpectedly
>
> Does anyone know what the problem is?
>
> Thanks!