
DTS import for VB6

Hi, I'm a beginner with DTS. Can anybody help me, please?

Would it be possible to create a DTS import (run from VB6) that checks the data in a certain table and updates it, without truncating or deleting any other data in the table?

Say I have a .CSV file to be imported into a certain table using DTS.
The data in my .CSV file would be:

Field1, Field2, Field3
aa , 111 , bbbb
bb , 222 , xxxx
cc , 333 , yyyy
dd , 444 , zzzz

And the records in my table are:

Field1, Field2, Field3
xx , 444 , bbbb
zz , 555 , xxxx
aa , 111 , yyyy
yy , 777 , zzzz

where Field1 and Field2 together form the primary key.

Would it be possible to have a result in my table after the import like the following?

Field1, Field2, Field3
xx , 444 , bbbb
zz , 555 , xxxx
aa , 111 , bbbb ----> updated
yy , 777 , zzzz
bb , 222 , xxxx ----> inserted
cc , 333 , yyyy ----> inserted
dd , 444 , zzzz ----> inserted

I used the DTS (SQL) approach to import data from a CSV into a database table before, but unfortunately it was for importing only: the table had to be truncated first before importing the data, to avoid duplicate data (and, of course, primary key errors). Could anybody give me a tip on how I can do this? Thanks a lot...
Jan 28 '08 #1


ck9663

Option 1 (a sketch follows below):
1. Parse your file based on its delimiter.
2. Before inserting each record, check whether it exists, based on your primary key.
3. If it exists, update; if not, insert.
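As a rough illustration of steps 2 and 3, here is a minimal T-SQL sketch of a per-row upsert that your VB6 loop could call for each parsed CSV line. The procedure name, table name, and column types are assumptions based on the sample data above:

-- Hypothetical per-row upsert (option 1, steps 2 and 3).
-- dbo.MyTable and its columns mirror the sample data; names and types are assumed.
CREATE PROCEDURE dbo.usp_UpsertRow
    @Field1 VARCHAR(10),
    @Field2 INT,
    @Field3 VARCHAR(10)
AS
BEGIN
    IF EXISTS (SELECT 1 FROM dbo.MyTable
               WHERE Field1 = @Field1 AND Field2 = @Field2)
        -- Composite key already present: refresh the non-key column.
        UPDATE dbo.MyTable
           SET Field3 = @Field3
         WHERE Field1 = @Field1 AND Field2 = @Field2
    ELSE
        -- New key: insert the row.
        INSERT INTO dbo.MyTable (Field1, Field2, Field3)
        VALUES (@Field1, @Field2, @Field3)
END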

Option 2 (sketch below):
1. Upload the CSV into a staging table.
2. Use a query to update your table.
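And a minimal sketch of option 2, assuming DTS has already bulk-loaded the CSV into a staging table. dbo.MyStaging is a hypothetical name; the target table and columns follow the sample data:

-- Step 2 of option 2: reconcile the staging table with the target.

-- Refresh rows whose composite key already exists in the target.
UPDATE t
   SET t.Field3 = s.Field3
  FROM dbo.MyTable t
  JOIN dbo.MyStaging s
    ON t.Field1 = s.Field1
   AND t.Field2 = s.Field2

-- Insert rows whose composite key is not in the target yet.
INSERT INTO dbo.MyTable (Field1, Field2, Field3)
SELECT s.Field1, s.Field2, s.Field3
  FROM dbo.MyStaging s
 WHERE NOT EXISTS (SELECT 1 FROM dbo.MyTable t
                   WHERE t.Field1 = s.Field1
                     AND t.Field2 = s.Field2)

This set-based pass avoids a round trip per CSV row, which is why it tends to scale better than option 1 on large files.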

-- ck
Jan 28 '08 #2

Hi ck9663!
Thanks for your reply, but I have a few concerns to raise about it.

Based on my understanding, DTS is very effective and useful in a system where fast data importing is the main concern.

Suppose I have 100,000 lines (or more) of CSV data to import into my table. Wouldn't it be very slow if I use step 2 of your option 1 approach, that is, checking every line of CSV data against the records of the table before inserting or updating it?
I think option 2 may give a similar result, since every row of CSV data will still be checked after the upload.

Is there any other way of making the data import fast?

Thank you very much!
Jan 28 '08 #3

ck9663
define "very slow". but we are processing millions of records and it's "not slow", at least based on our definition of "not slow"...DTS is primarily designed to handle these kind of things. although others would want to parse the data, i'd say if there's no much complication to your system, just use DTS. "do not try to re-invent the wheel" ;)

-- ck
Jan 28 '08 #4

define "very slow". but we are processing millions of records and it's "not slow", at least based on our definition of "not slow"...DTS is primarily designed to handle these kind of things. although others would want to parse the data, i'd say if there's no much complication to your system, just use DTS. "do not try to re-invent the wheel" ;)

-- ck
Good morning ck! Thanks again.
"Very slow" in the sense that it checks every line of CSV data before inserting into or updating the table, rather than importing/uploading it as a whole table (which executes in a matter of one to three minutes, compared to an hour of processing time). What I'm trying to ask is: is there a way to check for redundant data and automatically execute an UPDATE during the data transformation, other than checking the CSV data line by line? Well, if there is no way, then I'd try your option 1. Thanks a lot, ck! ;)
Jan 29 '08 #5

ck9663
I don't think so. If you use DTS, it's doing a BULK INSERT... You could create a trigger on your target table, but I'm not sure (read: I don't know ;) ) whether an insert trigger will fire on a BULK INSERT...
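For what it's worth, a plain T-SQL BULK INSERT does not fire insert triggers by default; you have to opt in with the FIRE_TRIGGERS option. A minimal sketch, with a hypothetical file path and staging table name:

-- Load the CSV into a staging table; FIRE_TRIGGERS is only needed
-- if you actually want insert triggers on the table to run.
BULK INSERT dbo.MyStaging
FROM 'C:\data\import.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRE_TRIGGERS   -- without this, insert triggers are skipped
)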

If parsing line by line took you 30 minutes, and DTS took you 3 minutes to upload plus another minute to issue an UPDATE ... FROM, then why would you stress yourself creating a trigger, or a parsing module?

-- ck
Jan 29 '08 #6

Thanks a lot, ck! Now I can argue with my boss... lol
Jan 29 '08 #7
