
db design and performance for bulk insert

P: 26
I have a problem importing large amounts of data into a db in SQL Server. I have an application that collects data from an environment and writes it to separate csv files for SQL Server to bulk insert; each csv file corresponds to a table in the db. The csv files are generated on the application server (so they are remote files from SQL Server's point of view), and the import takes so long that it sometimes fills up the transaction log. After reading some articles, I changed the recovery model of the db to SIMPLE and moved the log file to another disk drive for the sake of performance. I also tried copying the csv files to the SQL Server machine manually, but the result is almost the same. What can I do to increase the performance of the bulk import?
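For reference, the statement I used to change the recovery model looks like this (the database name is a placeholder):

ALTER DATABASE MyDatabase SET RECOVERY SIMPLE;

From what I have read, BULK_LOGGED is also intended for this kind of bulk load.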

I have a stored procedure that takes the path of the remote file and the table name and runs a bulk insert. I am not sure what is going on under the hood, but for another db, with its recovery model set to FULL, the import is sometimes 10 times faster than in my db. I am stuck trying to understand the mechanism. Changing the recovery model, or regenerating the db from the existing db's create script before the bulk insert, did not give the same performance as the other db.
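Roughly, the procedure looks like this (the names are placeholders and I have trimmed it down to the essentials):

CREATE PROCEDURE dbo.usp_BulkImport
    @FilePath  nvarchar(260),
    @TableName sysname
AS
BEGIN
    DECLARE @sql nvarchar(max);
    -- the table name is a parameter, so the BULK INSERT has to be built dynamically
    SET @sql = N'BULK INSERT ' + QUOTENAME(@TableName)
             + N' FROM ''' + REPLACE(@FilePath, N'''', N'''''') + N''''
             + N' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
    EXEC (@sql);
END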

To sum up, all I need to do is import very large amounts of data from csv files into the corresponding tables of a db. The db can also be dropped and recreated before the bulk insert if that improves performance, and I do not need any backup/restore capability on it.
Dec 16 '09 #1
3 Replies


Delerna
Expert 100+
P: 1,134
Have you tried this using DTS instead?
How large is very large?
Dec 21 '09 #2

nbiswas
100+
P: 149
I don't know what version of SQL Server you are using.

If it is 2000, go ahead with DTS.

If it is a higher version (2005+), I would strongly suggest an SSIS solution.

Choose your database -> Right-click on it -> Tasks -> Import Data.

The Import/Export wizard will appear. Click Next.

From the Choose a Data Source screen, choose the type of data source you want (in this case a Flat File Source, since you are importing csv files), then specify the file path and click Next.

Next choose the destination data source (i.e. the database name), followed by the table names, and then click Finish.
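If you save the wizard output as an SSIS package instead of running it immediately, you can also execute it later from the command line (the package path here is just an example):

dtexec /FILE "C:\Packages\ImportCsv.dtsx"

That makes it easy to automate the same import.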

Hope this makes sense
Dec 21 '09 #3

P: 26
I am using SQL Server 2005. A program generates these large csv files (around a gigabyte in size), and the import should also be driven from the application through stored procs. Since the files are generated on the machine the application runs on, I shared the folder so the SQL Server machine has direct access for the bulk insert. The program calls a procedure on the server, passing the name of the file to import and the name of the table to import it into. I read some articles on bulk insert and performance optimization and tried a few of the suggestions, such as TABLOCK, which really did seem to improve performance. Is SSIS a faster approach that can be called from inside an application (a C++ application, actually), or are there other parameters I can use for further improvement?
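For reference, the statement my procedure builds now looks roughly like this (the table name and file path are placeholders):

BULK INSERT dbo.MyTable
FROM '\\appserver\share\MyTable.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    TABLOCK,            -- take a table lock for the duration of the load
    BATCHSIZE = 100000  -- commit in batches so the log can be truncated as the load runs
);

From what I have read, TABLOCK on an empty or heap target, combined with the SIMPLE or BULK_LOGGED recovery model, is what allows minimally logged inserts.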

thanks
Dec 21 '09 #4
