Inserting Records into SQL Server - is there a faster interface than ADO

I have a program that reads records from a binary file and loads them
into an MS-SQL Server database. It is using a stored proc, passing the
parameters.

I am using pywin32 to create a connection object. Once the connection
is open I simply pass the SQL-formatted commands using
cnx.Execute(sqlstring).
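
For reference, the connection and per-record call look roughly like this
(a minimal sketch; the connection string, proc name, and parameters are
placeholders, not the real ones):

    import win32com.client

    cnx = win32com.client.Dispatch("ADODB.Connection")
    cnx.Open("Provider=SQLOLEDB;Data Source=MYSERVER;"
             "Initial Catalog=MyDatabase;Integrated Security=SSPI;")

    # One round trip to the server per record -- this is the part timed below.
    sqlstring = "EXEC dbo.usp_LoadRecord @Field1='abc', @Field2=123"
    cnx.Execute(sqlstring)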

My test examples:

20,000 records using the ADO connection: 0:04:45:45

If I set up the program not to send the records to the database - so all
other variables and processes are constant and it simply skips the
cnx.Execute(sqlstring) step - then it takes only 0:00:25:78 to process
through the same number of transactions.

Obviously the times in my test are for a small sample, but I have a client that would
like to use this and has several million transactions to contend with.

So my questions are ...
Is there a "faster" method I can use to connect to the SQL server?
Or does anyone have any "optimization" tips they can offer?

Nov 11 '05 #1
ge********@hotmail.com wrote:
Is there a "faster" method I can use to connect to the SQL server ?
Or does anyone have any "optimization" tips the can offer ?


This has nothing to do with Python, but the fastest way to load a large
amount of data into an MS SQL Server database is a DTS import from a flat file.

To speed things up a bit, do not commit the transaction after each row
inserted -- commit the whole batch.
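
With the ADO connection from the original post, that might look roughly
like this (a sketch only; BATCH_SIZE and the rows iterable of
pre-formatted SQL commands are assumptions for illustration):

    BATCH_SIZE = 1000  # tune for your workload

    cnx.BeginTrans()
    for i, sqlstring in enumerate(rows):
        cnx.Execute(sqlstring)
        if (i + 1) % BATCH_SIZE == 0:
            cnx.CommitTrans()   # commit this batch
            cnx.BeginTrans()    # and start the next one
    cnx.CommitTrans()           # commit whatever is left over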

--
Jarek Zgoda
http://jpa.berlios.de/
Nov 11 '05 #2
[ge********@hotmail.com]
I have a program that reads records from a binary file and loads them
into an MS-SQL Server database. It is using a stored proc, passing the
parameters.
[snip]
So my questions are ...
Is there a "faster" method I can use to connect to the SQL server?
Or does anyone have any "optimization" tips they can offer?


Is there a reason why you need to use a stored procedure?

Do you need to process the data in some way in order to maintain
referential integrity of the database?

If the answer to both these questions is "no", then you can use the
"bcp" (Bulk CoPy) utility to transfer data into SQLServer *very* quickly.

http://msdn.microsoft.com/library/en...p_bcp_61et.asp
http://www.sql-server-performance.com/bcp.asp
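
For what it is worth, a hypothetical invocation from Python (the server,
database, table and file names below are made up; see the bcp
documentation above for the real switches):

    import subprocess

    # Bulk-load a tab-delimited text file into a staging table with bcp.
    subprocess.call([
        "bcp", "MyDatabase.dbo.Staging", "in", r"C:\data\export.txt",
        "-S", "MYSERVER",   # server
        "-T",               # trusted (Windows) authentication
        "-c",               # character-mode data file
        "-t", "\t",         # field terminator
        "-b", "10000",      # rows per committed batch
    ])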

thought-it-was-worth-mentioning-ly y'rs,

--
alan kennedy
------------------------------------------------------
email alan: http://xhaus.com/contact/alan
Nov 11 '05 #3
Alan Kennedy wrote:
[ge********@hotmail.com]
I have a program that reads records from a binary file and loads them
into an MS-SQL Server database. It is using a stored proc, passing the
parameters.

So my questions are ...
Is there a "faster" method I can use to connect to the SQL server?
Or does anyone have any "optimization" tips they can offer?


Is there a reason why you need to use a stored procedure?

Do you need to process the data in some way in order to maintain
referential integrity of the database?

If the answer to both these questions is "no", then you can use the
"bcp" (Bulk CoPy) utility to transfer data into SQLServer *very* quickly.

http://msdn.microsoft.com/library/en...p_bcp_61et.asp
http://www.sql-server-performance.com/bcp.asp

thought-it-was-worth-mentioning-ly y'rs,

If the answer to some of the earlier questions is "yes," I have
found "bcp" can be a great tool to fill up a new table of data
"on its way in." SQL can then move it to where it should really
go with nice transaction-protected SQL, proper index-building
and so on. After distributing the data, you can drop the table
of pending data.

I agree this is off-topic, but it is too close to my experience.

--Scott David Daniels
sc***********@acm.org
Nov 11 '05 #4
The utility is designed to run in the background and maintain/update a
parallel copy of a production system database. We are using the
stored procedure to do "if exists, update, else insert" processing for
each record.

The originating database is a series of keyed ISAM files. So we need
to read each record, perform some simple data conversions and then
update the SQL database. We are using Python to read the originating
database, perform the record conversion, and then post the results
back to SQL Server.

We designed our utility to run at night so that the SQL server is up to
date the next day and ready for reporting.

Thanks for your tips on BCP. I will investigate further as it looks
like it might be useful for the initial loading of the data and perhaps
for some changes to our utility program to minimize the amount of data
that needs to be read/processed.

Geoff.

Nov 22 '05 #5
> We are using the stored procedure to do "if exists, update, else insert" processing for
each record.


Consider loading the data in batches into a temporary table and then
use a single insert statement to insert new records and a single update
statement to update existing ones. This way, you are not forcing the
database to do it one row at a time, and you give it a chance to
aggressively optimize your queries and update the indexes in bulk.
You'd be surprised at the difference this can make!
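
A sketch of that pattern through the same ADO connection (the table and
column names are invented for illustration, and the batch is assumed to
be loaded into dbo.Customers_Staging already):

    cnx.BeginTrans()

    # Update rows that already exist in the target table.
    cnx.Execute("""
        UPDATE t
        SET    t.Name = s.Name, t.Balance = s.Balance
        FROM   dbo.Customers AS t
        JOIN   dbo.Customers_Staging AS s ON s.CustID = t.CustID
    """)

    # Insert rows that are not there yet.
    cnx.Execute("""
        INSERT INTO dbo.Customers (CustID, Name, Balance)
        SELECT s.CustID, s.Name, s.Balance
        FROM   dbo.Customers_Staging AS s
        WHERE  NOT EXISTS (SELECT 1 FROM dbo.Customers AS c
                           WHERE c.CustID = s.CustID)
    """)

    cnx.Execute("TRUNCATE TABLE dbo.Customers_Staging")
    cnx.CommitTrans()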

Nov 22 '05 #6
