DTS import for VB6

chawo
Hi, I'm a beginner with DTS. Can anybody help me, please?

Would it be possible to create a DTS import (run from VB6) that checks the data in a certain table and updates it, without truncating or deleting any other data in that table?

Say I have a .CSV file to be imported into a certain table using DTS.
The data in my .CSV file would be:

Field1, Field2, Field3
aa , 111 , bbbb
bb , 222 , xxxx
cc , 333 , yyyy
dd , 444 , zzzz

and the records in my table are:

Field1, Field2, Field3
xx , 444 , bbbb
zz , 555 , xxxx
aa , 111 , yyyy
yy , 777 , zzzz

where Field1 and Field2 together form the primary key.

Would it be possible to have a result in my table after importing like the following?

Field1, Field2, Field3
xx , 444 , bbbb
zz , 555 , xxxx
aa , 111 , bbbb ----> updated
yy , 777 , zzzz
bb , 222 , xxxx ----> inserted
cc , 333 , yyyy ----> inserted
dd , 444 , zzzz ----> inserted

I used the DTS (SQL) approach to import data from CSV into the database before, but unfortunately it was good only for plain imports; that is, the table had to be truncated before importing the data to avoid duplicate rows (and, of course, primary key errors). Could anybody give me a tip on how I can do this? Thanks a lot...
Jan 28 '08 #1
ck9663

Option 1:
1. Parse your file based on its delimiter.
2. Before inserting each record, check whether it exists, based on your primary key.
3. If it exists, update; if not, insert. (A sketch of the per-row statement follows below.)
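
Roughly, step 3 boils down to running something like this for each parsed row (a minimal sketch, not tested; dbo.Target and the @f1/@f2/@f3 parameters are placeholder names I made up for the columns in your example):

    -- Per-row upsert: what the client-side loop would execute for each CSV line.
    -- @f1 and @f2 are the composite primary key; @f3 is the remaining column.
    IF EXISTS (SELECT 1 FROM dbo.Target WHERE Field1 = @f1 AND Field2 = @f2)
        UPDATE dbo.Target
        SET    Field3 = @f3
        WHERE  Field1 = @f1 AND Field2 = @f2
    ELSE
        INSERT INTO dbo.Target (Field1, Field2, Field3)
        VALUES (@f1, @f2, @f3)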

Option 2:
1. Upload the CSV into a staging table.
2. Use a set-based query to update/insert into your target table. (See the sketch below.)
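
Assuming the CSV has already landed in a staging table, the query in step 2 could be an UPDATE/INSERT pair like this (a sketch only; dbo.StagingCSV and dbo.Target are made-up names):

    -- Update rows whose composite key already exists in the target.
    UPDATE t
    SET    t.Field3 = s.Field3
    FROM   dbo.Target t
    JOIN   dbo.StagingCSV s
      ON   s.Field1 = t.Field1 AND s.Field2 = t.Field2

    -- Insert rows whose composite key is new.
    INSERT INTO dbo.Target (Field1, Field2, Field3)
    SELECT s.Field1, s.Field2, s.Field3
    FROM   dbo.StagingCSV s
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.Target t
                       WHERE t.Field1 = s.Field1 AND t.Field2 = s.Field2)

Issuing the UPDATE before the INSERT avoids pointlessly re-updating the rows you just inserted.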

-- ck
Jan 28 '08 #2
chawo
Hi ck9663!
Thanks for your reply, but I have a few questions about it...

Based on my understanding, DTS is very effective and useful in a system where fast data importing is the main concern.

Suppose I have 100,000 lines (or more) of CSV data to import into my table. Wouldn't it be very slow if I used step 2 of your option 1 approach, that is, checking every line of CSV data against the records of the table before inserting or updating it?
I think option 2 may end up the same, since the check of every CSV row is applied after the data upload.

Is there any other way to make the import fast?

Thank you very much!
Jan 28 '08 #3
ck9663
define "very slow". but we are processing millions of records and it's "not slow", at least based on our definition of "not slow"...DTS is primarily designed to handle these kind of things. although others would want to parse the data, i'd say if there's no much complication to your system, just use DTS. "do not try to re-invent the wheel" ;)

-- ck
Jan 28 '08 #4
chawo
define "very slow". but we are processing millions of records and it's "not slow", at least based on our definition of "not slow"...DTS is primarily designed to handle these kind of things. although others would want to parse the data, i'd say if there's no much complication to your system, just use DTS. "do not try to re-invent the wheel" ;)

-- ck
Good morning, ck! Thanks again.
"Very slow" in the sense that it checks every line of CSV data before inserting into or updating the table, rather than importing/uploading it as a whole table (which executes in just one to three minutes, compared to an hour of processing time). What I'm trying to ask is: is there a way to check for duplicates and automatically issue an UPDATE during the data transformation, other than checking the CSV data line by line? Well, if there is no way, then I'll try your option 1. Thanks a lot, ck! ;)
Jan 29 '08 #5
ck9663
I don't think so. If you use DTS, it's doing a BULK INSERT... You could create a trigger on your target table, but I'm not sure (read: I don't know ;) ) whether an insert trigger will fire on a BULK INSERT...

If parsing line by line took you 30 minutes, and your DTS took you 3 minutes to upload and another minute to issue an UPDATE ... FROM, then why would you stress yourself creating a trigger or a parsing module?
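
For reference, the upload step could be as simple as this (a sketch only; the file path and staging table name are made up, and note that in SQL Server a BULK INSERT does not fire insert triggers unless you add the FIRE_TRIGGERS option):

    -- Load the whole CSV into the staging table in one bulk operation.
    BULK INSERT dbo.StagingCSV
    FROM 'C:\data\import.csv'
    WITH (
        FIELDTERMINATOR = ',',   -- comma-delimited fields
        ROWTERMINATOR   = '\n',  -- one record per line
        FIRSTROW        = 2      -- skip the header row
    )

After that, the UPDATE ... FROM and INSERT shown earlier in the thread finish the job.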

-- ck
Jan 29 '08 #6
chawo
Thanks a lot, ck! Now I can argue with my boss... lol
Jan 29 '08 #7
