
Inserts gradually slow down

I'm inserting 2 million+ records from a C# routine which starts out
very fast and gradually slows down. Each insert is through a stored
procedure with no transactions involved.

If I stop and restart the process it immediately speeds up and then
gradually slows down again. But closing and re-opening the connection
every 10000 records didn't help.

Stopping and restarting the process is obviously clearing up some
resource on SQL Server (or DTC??), but what? How can I clean up that
resource manually?

Jul 23 '05 #1
BTW, the tables have no indexes or constraints. Just simple tables
that are dropped and recreated each time the process is run.

Jul 23 '05 #2
(an******@gmail.com) writes:
I'm inserting 2 million+ records from a C# routine which starts out
very fast and gradually slows down. Each insert is through a stored
procedure with no transactions involved.

If I stop and restart the process it immediately speeds up and then
gradually slows down again. But closing and re-opening the connection
every 10000 records didn't help.

Stopping and restarting the process is obviously clearing up some
resource on SQL Server (or DTC??), but what? How can I clean up that
resource manually?


If I understand this correctly, every time you restart the process
you also drop the tables and recreate them. So that is the "resource"
you clear up.

One reason could be autogrow. It might be an idea to extend the database
to a reasonable size before you start loading. If you are running with
full recovery, this also includes the transaction log.
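
For example, with a made-up database name and logical file names (check
sysfiles for yours):

-- Grow data and log once, up front, instead of many small autogrows.
ALTER DATABASE MyLoadDb
MODIFY FILE (NAME = MyLoadDb_Data, SIZE = 200MB)

ALTER DATABASE MyLoadDb
MODIFY FILE (NAME = MyLoadDb_Log, SIZE = 100MB)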

You could also consider adding a clustered index that is aligned with
the data that you insert. That is, if you insert the data in foo order,
you should have a clustered index on foo.

But since 2 million rows is quite a lot, you should probably examine
more efficient methods to load them. The fastest method is bulk load,
but ADO.NET 1.1 does not have a bulk-load interface; you could,
however, run command-line BCP.

You could also build XML strings with your data and unpack these
with OPENXML in the stored procedure.
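
A minimal sketch of that (procedure, table and element names are all
made up):

CREATE PROCEDURE load_rows @xmldata ntext AS
BEGIN
    DECLARE @idoc int
    -- Parse the XML string once into an internal document handle.
    EXEC sp_xml_preparedocument @idoc OUTPUT, @xmldata

    -- Attribute-centric mapping (flag 1): <Row col1="..." col2="..."/>
    INSERT INTO some_table (col1, col2)
    SELECT col1, col2
    FROM OPENXML(@idoc, '/Rows/Row', 1)
         WITH (col1 varchar(50), col2 int)

    -- Release the handle so it does not sit around in memory.
    EXEC sp_xml_removedocument @idoc
END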

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 23 '05 #3
In my experience, nine times out of ten the reason for the problem is a
file that keeps growing by 10%. Either pre-allocate a big file, as
Erland suggests, or change the filegrowth from 10% (the default, which
by the way is a bad default) to something in the region of 20 MB or so.
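
For example (made-up logical file name):

-- Fixed 20 MB growth instead of the default 10%.
ALTER DATABASE MyLoadDb
MODIFY FILE (NAME = MyLoadDb_Data, FILEGROWTH = 20MB)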

Jul 23 '05 #4
You're right: I removed the drop-and-recreate, and it's definitely
slower when data already exists. So how would you suggest loading this
data?

The text file contains 11 different types of rows. Each type of row
goes to a separate table, so I need to read each line, determine its
type, parse it and insert it into the appropriate table.

Can BCP handle this? DTS? Or your XML idea?

Thanks

Jul 23 '05 #5
I tried adjusting the settings, both different percentages and specific
MB values, but the slowdown is the same in all cases.

It's dramatically slower to insert records into a 100,000-row table
than into an empty one, I guess.

Jul 23 '05 #6


an******@gmail.com wrote:
I tried adjusting the settings, both different percentages and specific
MB values, but the slowdown is the same in all cases.

It's dramatically slower to insert records into a 100,000-row table
than into an empty one, I guess.


So these are existing tables? Do these tables have indexes already? If
you had a clustered index, for instance, and were inserting data in
random order, there would be a lot of inefficiency and index
maintenance. The fastest way to get that much data in would be to sort
it by table, in the order you'd want the index, BCP it into tables with
no indexes, and then create your indexes.
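
Roughly like this, with made-up names, assuming the data file has been
pre-sorted on FooKey:

-- Load into a bare heap first: no index maintenance per row.
CREATE TABLE FooData (FooKey int NOT NULL, Payload varchar(100))

BULK INSERT FooData
FROM 'C:\load\foodata.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

-- Build the clustered index once, after the load.
CREATE CLUSTERED INDEX IX_FooData ON FooData (FooKey)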

Jul 23 '05 #7
(an******@gmail.com) writes:
You're right, I removed the drop and re-create and it's definitely
slower when data already exists. So how would you suggest loading this
data?
I can't really give good suggestions about data that I don't know
anything about.

The text file contains 11 different types of Rows. Each type of row
goes to a separate table, so I need to read each line, determine its
type, parse it and insert into the appropriate table.

Can BCP handle this? DTS? Or your XML idea?


If the data has a uniform format, you could load the lot into a staging
table and then distribute the data from there.
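
For example, something like this per row type (the names and the
fixed-width parsing are made up):

-- Route each row type from the staging table to its own table.
INSERT INTO TypeATable (ColA1, ColA2)
SELECT SUBSTRING(RawData, 1, 10), SUBSTRING(RawData, 11, 20)
FROM Staging
WHERE RowType = 'A'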

You could also just write new files for each table and then bulk-load
these tables.

It's possible that a Data Pump task in DTS could do all this out
of the box, but I don't know DTS.

The XML idea would require you to parse the file and build an XML
document from it. You wouldn't have to build 11 XML documents, though.
(Although that might be easier than building one big one.)

It's also possible to bulk-load from variables, but not in C# with
ADO.NET 1.1.

So how big did you make the database before you started loading? With
two million records, you should have at least 100 MB for both data and
log.

By the way, how do you call the stored procedure? You are using
CommandType.StoredProcedure, aren't you?

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 23 '05 #8
Thanks for the help. I found the fastest way to do it in a single SP:

- BULK INSERT all data into a 2-column staging table using a BCP format
file. (One column is the RowType; the rest is the data to be parsed.)
- Create an index on RowType.
- Run 11 different SELECT INTO statements based on RowType, as sketched
below.

2,000,000 rows loaded in 1.5 minutes.
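
In outline it looks like this (table names and column splits are made
up here, but the steps are the same):

-- 1. One narrow staging table for the whole file.
CREATE TABLE Staging (RowType char(2), RawData varchar(8000))

BULK INSERT Staging
FROM 'C:\load\input.txt'
WITH (FORMATFILE = 'C:\load\input.fmt')

-- 2. Index the type column so each SELECT INTO touches only its rows.
CREATE INDEX IX_Staging_RowType ON Staging (RowType)

-- 3. One SELECT INTO per row type (11 in total).
SELECT SUBSTRING(RawData, 1, 10) AS Col1,
       SUBSTRING(RawData, 11, 8) AS Col2
INTO   Type01Table
FROM   Staging
WHERE  RowType = '01'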

Thanks a lot, BCP works very well.

Jul 23 '05 #9
Hmmm! It's always great to hear when things work out well!
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 23 '05 #10
