Hello there.
I'm having several issues regarding loading data into an Oracle database. I am developing in ASP.NET with C#, and the source data is in a CSV file.
What I am looking for is the most efficient approach people here have found for loading the data within these boundaries. I don't want to load all of the CSV data, but I do need about 70% of it, maybe a little more, and the CSV file itself holds some 30 million records.
I am finding that a file with just 750,000 records takes an hour to insert into an Oracle table, so any suggestions on a quicker method would be greatly appreciated.
I have tried dynamic code and stored procedures with the same results, which was somewhat unexpected.
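For reference, if the dynamic code or stored procedure calls are going to the server one row per round trip, ODP.NET's array binding lets a single execute insert a whole batch. A minimal sketch, assuming the Oracle.ManagedDataAccess package; the connection string, table and column names are hypothetical:

```csharp
using Oracle.ManagedDataAccess.Client;

// Rows parsed from the CSV, already filtered down to the ~70% we want.
string[] col1Values = { "a", "b", "c" };
string[] col4Values = { "x", "y", "z" };

using (var conn = new OracleConnection("User Id=scott;Password=tiger;Data Source=orcl"))
using (var cmd = conn.CreateCommand())
{
    conn.Open();
    cmd.CommandText = "INSERT INTO real_table (col1, col4) VALUES (:1, :2)";
    cmd.ArrayBindCount = col1Values.Length;  // one execute inserts the whole batch
    cmd.Parameters.Add(new OracleParameter { OracleDbType = OracleDbType.Varchar2, Value = col1Values });
    cmd.Parameters.Add(new OracleParameter { OracleDbType = OracleDbType.Varchar2, Value = col4Values });
    cmd.ExecuteNonQuery();
}
```

Batches of a few thousand rows per execute, with a commit per batch, usually change the picture dramatically compared with one insert per row.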
Thank you.
Mark :)
Doing a quick search on Google, I found that Oracle has a utility called SQL*Loader, which is similar to SQL Server's BCP bulk upload or DTS for importing data from various sources. You can configure it to load the data in from the CSV, and given that Oracle wrote the tool, it's likely to be the most efficient mechanism for parsing the data into the Oracle tables...
Hi there and thank you for the reply.
Yes, I have tried Oracle's SQL*Loader and it works super fast. The problem I have is that I am using a CSV file, and while it is easy enough to cherry-pick the columns I want after parsing it, the format of the SQL*Loader CTL file requires a matching column line-up; as my table has more columns than this particular CSV file, the whole thing falls apart.
It is the like-for-like column matching in the control file versus the database table's column ordering which is causing me problems.
Using the POSITION parameter would be good, but this is a delimited CSV file, so it is not applicable.
As it currently stands, the database table columns can be simplified to:
col1, col2, col3, col4, col5, col6
But my control file may have col1, col4, col5
Hence, if I use sqlldr with my CTL file and CSV input, the wrong data is uploaded to the wrong columns, albeit very quickly!
If anybody has a way around this, then that would be helpful indeed.
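For what it's worth, the control-file field list follows the order of the fields in the data file, while the names select the target table columns, so a mapping along these lines may be worth trying. A sketch reusing the simplified col1..col6 example above (table and file names hypothetical):

```
-- load_data.ctl: data.csv holds three fields, going to col1, col4, col5
LOAD DATA
INFILE 'data.csv'
APPEND INTO TABLE real_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
  col1,   -- 1st CSV field -> table column col1
  col4,   -- 2nd CSV field -> table column col4
  col5    -- 3rd CSV field -> table column col5
)
```

An unwanted CSV field can also be described with the FILLER keyword (Oracle 8i and later), e.g. `skip1 FILLER`, so that it is parsed but never loaded.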
Thank you.
Mark
Okay, could you two-step it? Bulk-load into a dummy table that has a trigger attached to it, which then forward-copies the data into the real table?
In an ideal world it would be configurable so that you could cherry-pick your columns directly, but given that it's not, could you do something like this?
- Create a script that builds a dummy table and attaches a trigger to it to pass incoming data to the correct columns in the real table, thus allowing you to pick and choose which fields are and aren't included.
- Script the SQL*Loader import to load the data from the CSV into the dummy table, which will have the same number of columns as your CSV.
- As SQL*Loader pushes data into your dummy table, the trigger forwards it to the real table; your script can then drop the dummy table at the end of processing.
It's not what you might call "tidy", but it would make use of the loader's performance benefits and give you the flexibility you're after that it doesn't provide out of the box, and it would be relatively easy to maintain as it's a single script.
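As a rough sketch in Oracle SQL, reusing the col1..col6 example from earlier (all names hypothetical), the two-step approach might look like:

```sql
-- Staging table matching the CSV layout (three fields in this example)
CREATE TABLE stage_csv (
  col1 VARCHAR2(100),
  col4 VARCHAR2(100),
  col5 VARCHAR2(100)
);

-- Row-level trigger that forwards each loaded row into the real table
CREATE OR REPLACE TRIGGER trg_stage_forward
AFTER INSERT ON stage_csv
FOR EACH ROW
BEGIN
  INSERT INTO real_table (col1, col4, col5)
  VALUES (:NEW.col1, :NEW.col4, :NEW.col5);
END;
/

-- After sqlldr finishes, tear the staging objects down
DROP TABLE stage_csv;
```

One caveat: SQL*Loader's direct-path mode (DIRECT=TRUE) bypasses triggers, so this scheme needs a conventional-path load, which gives up some of the loader's top speed.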