
Importing Records to SQL Server

I have an import routine that takes a generic file (Excel, CSV, Fixed
length, PDMS BOM, XML, etc) and maps fields to a temp import table on
the server. The field mapping may include functions, so the user can
put in "Mid([DocNumber],12,3)" as a value to go into FieldX.

I've tried 3 methods of getting the data from a linked table (CSV file)
to the SQL Server, each having some drawbacks.

Method 1: DAO, poking records in one at a time.
Drawback: Extremely slow (17.5 minutes for 76,000 rows).
Drawback 2: Took a helluva lot more code and time to write than methods
2 and 3, so if it were a person I'd have thumped it by now :-).
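For reference, roughly what that DAO loop looks like; the table and
field names (tblLinkedCSV, dbo_ImportTemp, DocNumber, FieldX) are
made-up stand-ins:

' Method 1 sketch: DAO, one record at a time (all names illustrative)
Dim dbs As DAO.Database
Dim rsSrc As DAO.Recordset, rsDst As DAO.Recordset

Set dbs = CurrentDb()
Set rsSrc = dbs.OpenRecordset("tblLinkedCSV", dbOpenSnapshot)
' dbSeeChanges is needed on ODBC links with identity columns
Set rsDst = dbs.OpenRecordset("dbo_ImportTemp", dbOpenDynaset, dbSeeChanges)

Do Until rsSrc.EOF
    rsDst.AddNew
    rsDst!FieldX = Mid(rsSrc![DocNumber], 12, 3)  ' a mapped expression
    ' ...remaining field mappings go here...
    rsDst.Update
    rsSrc.MoveNext
Loop

rsSrc.Close: rsDst.Close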

Method 2: Build a query and execute it with the .Execute method.
Quick (2.5 minutes for 76,000 rows).
Drawback: No visual feedback; it looks like the PC has hung for 2.5 minutes.
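The query itself is a one-liner; dbFailOnError makes any failure raise a
trappable error (same made-up names as above):

' Method 2 sketch: a single append query via Execute
Dim dbs As DAO.Database
Set dbs = CurrentDb()
dbs.Execute "INSERT INTO dbo_ImportTemp (FieldX) " & _
            "SELECT Mid([DocNumber],12,3) FROM tblLinkedCSV", _
            dbFailOnError Or dbSeeChanges  ' dbSeeChanges if the link has an identity column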

Method 3: As 2, but with the .RunSQL method.
As quick as 2, and a progress bar to boot.
Drawback: Errors are either vague (e.g. "inserted null into field that
is not variant type" - OK, but which field?) or handled by DoCmd rather
than by my program (e.g. "x records not appended due to key violation"),
which is just as vague, with the added headache that the first my
program knows of a problem is when the user cancels - and then the error
is simply that the user cancelled, which doesn't tell the program what
was wrong with the data.
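The RunSQL variant differs only in the call (and in who handles the
errors); again with the made-up names:

' Method 3 sketch: DoCmd.RunSQL puts up Access's own progress meter,
' but errors and confirmations are handled by DoCmd, not by this code
DoCmd.RunSQL "INSERT INTO dbo_ImportTemp (FieldX) " & _
             "SELECT Mid([DocNumber],12,3) FROM tblLinkedCSV"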

Methods 1 & 2 give concise error messages, e.g. "Cannot insert null into
column xyz, insert fails", which is more helpful.

I must say I was surprised at how much faster the query method was than
DAO; I knew it would be faster, but not by that much!

Before I investigate anything else as a viable alternative to
transferring records (and manipulating them en route) I wondered if
anyone else had any ideas or (better still) experience of a better way.

One way I thought of was to build singular SQL statements rather than
the batch statement of methods 2 & 3, then execute the singular inserts
in batches, perhaps all at once or 1,000 at a time (I'm not sure how
much SQL I can poke into the server for execution in one go).
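Roughly what I have in mind, via an ADO pass-through connection; the
connection string, names and batch size are all guesses to be tuned:

' Batched singular inserts (all names illustrative)
Dim cn As Object, rsSrc As DAO.Recordset
Dim sql As String, n As Long

Set cn = CreateObject("ADODB.Connection")
cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
        "Initial Catalog=MyDb;Integrated Security=SSPI"
Set rsSrc = CurrentDb.OpenRecordset("tblLinkedCSV", dbOpenSnapshot)

Do Until rsSrc.EOF
    sql = sql & "INSERT INTO ImportTemp (FieldX) VALUES ('" & _
          Replace(Mid(rsSrc![DocNumber], 12, 3), "'", "''") & "');"
    n = n + 1
    If n Mod 1000 = 0 Then
        cn.Execute sql, , 128   ' 128 = adExecuteNoRecords
        sql = ""
        ' ...update a progress meter here for the visual feedback...
    End If
    rsSrc.MoveNext
Loop
If Len(sql) > 0 Then cn.Execute sql, , 128
rsSrc.Close: cn.Close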
--
This sig left intentionally blank
Nov 13 '05 #1
On Thu, 25 Nov 2004 20:56:28 +0000, Trevor Best <no****@besty.org.uk>
wrote:

SQL Server has Data Transformation Services available for import and
export. Read up on it in Books Online. DTS can be invoked
programmatically.
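Something along these lines should work from VBA via late binding; it
assumes the DTS runtime is present on the workstation (see below), and
the server and package names are made up:

' Run a DTS package that was saved on the server (names illustrative)
Dim pkg As Object
Set pkg = CreateObject("DTS.Package")
' 256 = DTSSQLStgFlag_UseTrustedConnection
pkg.LoadFromSQLServer "MyServer", , , 256, , , , "MyImportPackage"
pkg.Execute
pkg.UnInitialize
Set pkg = Nothing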

-Tom.

Nov 13 '05 #2
Tom van Stiphout wrote:
On Thu, 25 Nov 2004 20:56:28 +0000, Trevor Best <no****@besty.org.uk>
wrote:

SQL Server has Data Transformation Services available for import and
export. Read up on it in Books Online. DTS can be invoked
programmatically.


This will be generic, and the only app I know will definitely be on the
workstation running it is Access. Wouldn't DTS require the SQL Server
tools to be installed? That may not be the case in most circumstances.

--
This sig left intentionally blank
Nov 13 '05 #3
"Trevor Best" <no****@besty.org.uk> wrote in message
news:41**********************@news.zen.co.uk...
I have an import routine that takes a generic file (Excel, CSV, Fixed
length, PDMS BOM, XML, etc) and maps fields to a temp import table on the
server, in the field mapping there may be functions specified so the user
may put in "Mid([DocNumber],12,3)" as a value to go into FieldX.

Before I investigate anything else as a viable alternative to transferring
records (and manipulating them en route) I wondered if anyone else had any
ideas or (better still) experience of a better way.

Have you looked at BULK INSERT or DTS? If you're after raw performance these
would probably be your best options. If you need more flexibility than a
straight bulk insert, you can use Enterprise Manager to design an initial DTS
package with any transformations you need, then save it as a VB module from
where you can edit it to suit. Try a test using the DTS wizard and see what
kind of performance you get.
Nov 13 '05 #4
On Thu, 25 Nov 2004 21:53:59 +0000, Trevor Best <no****@besty.org.uk>
wrote:

I believe DTS indeed requires the tools to be installed locally. If
you can't be sure of that, this may not be a solution for you. Unless
you want to use DCOM to the server - not my first choice...

-Tom.
Nov 13 '05 #5
John Winterbottom wrote:
"Trevor Best" <no****@besty.org.uk> wrote in message
news:41**********************@news.zen.co.uk...
I have an import routine that takes a generic file (Excel, CSV, Fixed
length, PDMS BOM, XML, etc) and maps fields to a temp import table on the
server, in the field mapping there may be functions specified so the user
may put in "Mid([DocNumber,12,3)" as a value to go into FieldX.

Before I investigate anything else as a viable alternative to transferring
records (and manipulating them en route) I wondered if anyone else had any
ideas or (better still) experience of a better way.


Have you looked at BULK INSERT or DTS? If you're after raw performance these
would probably be your best options. If you need more flexinbiltiy than
straight bulk insert you can use E-M to design an initail DTS package, with
any transformations you need then save it as a VB module from where you can
edit it to suit. Try a test using the DTS wizard and see what kind of
performance you get.


I'm finding DTS difficult to use; the (CSV) file I want to import is in
my temp directory. For .txt/.csv files DTS requires a DSN and even
prompts for a username and password, though I haven't got that far yet:
setting up the DSN brings up an old 16-bit-looking Open File dialog that
won't allow me to browse to the temp directory because it's hidden
(well, "Local Settings" above it is hidden), and typing the path or
%TEMP% won't get me there either. I've never been impressed with DTS's
ability to copy one SQL Server database to another anyway, so why would
I expect success out of it for this? :-)

I think I might abandon DTS as a tool: the import started out as a
self-contained tool within the Access front end, and would now extend to
having to have the SQL Server tools installed and a DSN set up. I'd
rather keep it in-house, as it were. I don't want to have to rely on
users having other applications or settings on their machines that are
beyond my control, as that leads to my program not working because they
haven't got all the other bits. Fair enough, but in the user's eyes it's
just "my program doesn't work", so of course it's my fault and my name
gets dragged in the mud, and mud sticks no matter how much it wasn't
supposed to be on there in the first place.
Nov 13 '05 #6
"Trev@Work" <no.email@please> wrote in message
news:41***********************@news.easynet.co.uk...

I think I might abandon DTS as a tool: the import started out as a
self-contained tool within the Access front end, and would now extend to
having to have the SQL Server tools installed and a DSN set up. I'd
rather keep it in-house, as it were.

You might want to consider straight BULK INSERT then. We have an application
that periodically receives text files via FTP and inserts the data into
several tables. We also use temporary staging tables, because some data will
already exist and some won't: if the data already exists we update it, if
not we insert the new records. Everything is done on the server in a single
stored procedure. Here's a code snippet that shows how we do it. It loads
data from a temp file called temp_clients.txt into a table called
temp_clients, then inserts the data into the main clients table. We also
use dynamic SQL so that we can change the name of the text file:

/* code starts */
declare @file nvarchar(25),
        @path nvarchar(255),    -- referenced below but missing from the original declare
        @strSQL nvarchar(1000)

-- first get files from ftp server
-- getData.BAT is a batch file which copies the files to the local machine
exec master.dbo.xp_cmdshell 'c:\getData.BAT', NO_OUTPUT

set @path = 'c:\'

-- drop primary key before bulk load to speed things up
if exists
(
    select * from INFORMATION_SCHEMA.TABLE_CONSTRAINTS
    where CONSTRAINT_NAME = 'PK_temp_clients'
)
    alter table dbo.temp_clients drop constraint PK_temp_clients
if (@@ERROR <> 0) goto on_error

-- purge old records
truncate table dbo.temp_clients
if (@@ERROR <> 0) goto on_error

set @file = 'temp_clients'
set @strSQL = 'bulk insert ' + db_name() + '.dbo.' + @file
    + ' from ''' + @path + @file + '.txt'' with '
    + '(FIELDTERMINATOR = ''|'')'

exec sp_executesql @strSQL
if (@@ERROR <> 0) goto on_error

-- add primary key back
alter table dbo.temp_clients add constraint PK_temp_clients
    primary key (ClientID)
if (@@ERROR <> 0) goto on_error

-- update matching rows
update dbo.clients set
    acctNumber = t.acctNumber, lastName = t.lastName,
    firstName = t.firstName, salutation = t.salutation,
    business = t.business, address = t.address
from dbo.temp_clients as t where t.ClientID = clients.ClientID
if (@@ERROR <> 0) goto on_error

-- add new clients
insert into dbo.clients (ClientID, acctNumber, lastName,
    firstName, salutation, business, address)
select ClientID, acctNumber, lastName,
    firstName, salutation, business, address
from dbo.temp_clients t
where not exists
(
    select * from dbo.clients c
    where c.ClientID = t.ClientID
)
if (@@ERROR <> 0) goto on_error

return

on_error:
-- the on_error label is referenced above; the handler itself isn't shown here
raiserror('client data load failed', 16, 1)
/* code ends */
This has been running well for a couple of years now. The client machine
(located remotely) has a VB app that periodically exports data from
their local Jet database to text files, FTPs them to our ftp server,
then calls the procedure above. We call it asynchronously using SQL
Agent so the client can make the connection, start the procedure and
then disconnect.
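From the client side the asynchronous call is just a job start; a
minimal sketch, with a hypothetical job name and connection string:

' Kick off the server-side load and return immediately
Dim cn As Object
Set cn = CreateObject("ADODB.Connection")
cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
        "Initial Catalog=msdb;Integrated Security=SSPI"
cn.Execute "exec msdb.dbo.sp_start_job @job_name = 'LoadClients'"
cn.Close   ' sp_start_job queues the job and returns; the work runs on the server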



Nov 13 '05 #7
John Winterbottom wrote:
You might want to consider straight BULK INSERT then. <snip>


Thanks for the code, but I've already written the stored procedures: one
checks the data and generates a delta if it's a cumulative file, the
other updates the main tables using the delta to apply +/- quantities,
other updates, inserts and deletes. This all works OK, and in the blink
of an eye. I then archive off the delta that's been applied, as this may
be required by my client's client, particularly for MTOs.

The problem with bulk insert is that it will expect the source table to
be in the same format as the target table, yes? If only my clients knew
how to use their CAD systems and ISOGEN properly <g> Also, does the user
require special privileges to run extended stored procedures like
xp_cmdshell?

I might look into this if I find nothing better, but I will have to do
some field mapping either client- or server-side. At the moment my temp
holding table on the server has some required fields: these are the ones
mapped to live-table fields that are required. I made them required in
the temp table so I could weed out any errors before attempting to
update the live tables, but I suppose I could do a soft check for
required fields instead, eliminating the only error I can think of that
would occur during the transfer from import file to temp table. Having
said that, I can think of data type mismatches that may occur, but as
I've typed the previous sentence now I can't be arsed to delete it :-)

I might stick with methods 2 and 3 (the query methods) and give the user
a choice of either a nice progress bar or a meaningful error message. I
did start looking at the batch of inserts with value lists, but found
that where my import spec had something defined like Mid(field,2,1) I
would have to either evaluate it locally or substitute the equivalent
SQL Server function. I then found it took slightly longer than the query
methods, and I hadn't even put in the code to throw the SQL statements
at the server.
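The substitution itself is simple for this one case; a sketch that only
handles Mid (a real mapper would need to cover the rest of the functions
allowed in the import spec):

' Swap an Access expression for its T-SQL equivalent (illustrative only)
Function AccessToTSQL(ByVal expr As String) As String
    ' Mid(x, start, len) and SUBSTRING(x, start, len) take the same
    ' argument list, so a straight name swap works for this function
    AccessToTSQL = Replace(expr, "Mid(", "SUBSTRING(", Compare:=vbTextCompare)
End Function

' e.g. AccessToTSQL("Mid([DocNumber],12,3)") gives "SUBSTRING([DocNumber],12,3)"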
--
This sig left intentionally blank
Nov 13 '05 #8
