Performing fast searches in a huge Oracle database

Hi All,
I am new to this forum as far as posting questions is concerned, although I have quite frequently visited this site as a guest.

My question relates to performing fast searches in a very large Oracle database (my company's database). I have two very large tables, each containing crores (tens of millions) of records. I have to match each record from one table (I'll call it the base table from here onwards) against another table, which may contain one or more (not more than 10) corresponding records. Accordingly, I need to update certain flags and fields in both tables to indicate that the record in question has been matched (found) in the base table and vice versa.

This is basically a reconciliation type of activity. For this I have written a PL/SQL procedure, which works quite fine as far as functionality is concerned. The problem is that it takes too much time to do the matching: usually I run it when leaving the office and it only completes after lunch the next day.

Also, I have an index on both tables comprising the fields involved in the matching. What I need are some speed-up techniques applicable to really big databases.

Although I don't feel it is strictly necessary, I am providing the structure of both tables:

1. Base table:
CREATE TABLE TEMP_SMD_6404
(
  SODATE            DATE,
  CODE              VARCHAR2(4),
  PCD               CHAR(2),
  ORIGIN_DATE       DATE,
  REFERENCE_NO      VARCHAR2(10),
  AMOUNT            NUMBER(15,2),
  BM_POSTDATE       DATE,
  BM_TXNCNTR        NUMBER(5),
  NARRATION         VARCHAR2(100),
  CORE_POSTDATE     DATE,
  CORE_TXNCNTR      VARCHAR2(9),
  NPHASE            CHAR(1),
  NPROD             CHAR(2),
  CORE_FLAG         CHAR(1),
  NBRANCH           VARCHAR2(4),
  SYSTEM_INDICATOR  VARCHAR2(1),
  CIRCLE            NUMBER,
  MIG_DATE          DATE,
  EXTRA_FLAG        VARCHAR2(2)
)
LOGGING
NOCACHE
NOPARALLEL;
The first six fields make up a unique index and are involved in the matching.
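
For reference, the unique index is along these lines (the index name here is made up and the column order shown is simply the order they appear in the table, which may differ from the actual definition):

-- Illustrative only: the real index name and column order may differ.
CREATE UNIQUE INDEX UX_TEMP_SMD_6404
  ON TEMP_SMD_6404 (SODATE, CODE, PCD, ORIGIN_DATE, REFERENCE_NO, AMOUNT);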


2. Table to be matched:
CREATE TABLE THMASTER_6404
(
  CODE          CHAR(4),
  PCD           CHAR(2),
  POSTDATE      NUMBER(8),
  TXNCNTR       NUMBER,
  TTYPE         CHAR(2),
  REFERENCE_NO  VARCHAR2(10),
  AMOUNT        NUMBER(15,2),
  USERID        CHAR(3),
  NARRATION     VARCHAR2(25),
  PROCESSED     CHAR(1),
  SUPID         CHAR(3),
  RECON_CNTR    NUMBER(7),
  SODATE        DATE,
  NEWFLAG       CHAR(1),
  NPROCESSED    CHAR(1),
  NRECON_CNTR   VARCHAR2(20),
  NPROD         CHAR(2),
  NPHASE        CHAR(1),
  NREFNO        VARCHAR2(10),
  NBRANCH       CHAR(4),
  SOUSERID      VARCHAR2(3)
)
LOGGING
NOCACHE
NOPARALLEL;
The first seven fields make up a unique index and are involved in the matching.
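
To give a rough idea of what the procedure is trying to achieve, the matching is conceptually equivalent to the two set-based updates below. This is only a simplified sketch: I am showing just CODE, PCD, REFERENCE_NO and AMOUNT as the join columns and treating CORE_FLAG / PROCESSED as the "matched" flags, which is not exactly how the real procedure is written.

-- Simplified sketch only: join columns and flag columns are illustrative.
-- Mark base-table rows that have at least one corresponding record:
UPDATE TEMP_SMD_6404 B
   SET B.CORE_FLAG = 'Y'                 -- assumed "matched" flag
 WHERE EXISTS (SELECT 1
                 FROM THMASTER_6404 M
                WHERE M.CODE         = B.CODE
                  AND M.PCD          = B.PCD
                  AND M.REFERENCE_NO = B.REFERENCE_NO
                  AND M.AMOUNT       = B.AMOUNT);

-- Mark rows in the other table that were found in the base table:
UPDATE THMASTER_6404 M
   SET M.PROCESSED = 'Y'                 -- assumed "matched" flag
 WHERE EXISTS (SELECT 1
                 FROM TEMP_SMD_6404 B
                WHERE B.CODE         = M.CODE
                  AND B.PCD          = M.PCD
                  AND B.REFERENCE_NO = M.REFERENCE_NO
                  AND B.AMOUNT       = M.AMOUNT);

COMMIT;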


Thanks in anticipation
Jan 1 '08 #1
amitpatel66
Hi Vikas,

Welcome to TSDN!!

Please make sure you follow the POSTING GUIDELINES whenever you post in this forum.

MODERATOR
Jan 2 '08 #2
No reply as yet!
I am starting to lose hope.
Jan 3 '08 #3
amitpatel66
No reply as yet!
I am starting to lose hope.
Could you please post the procedure that performs the UPDATE/INSERT on these tables for my reference, so that we can suggest why it is taking so long?

I am aware that the huge volume of data is one cause, but we would also like to check your procedure.
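
For example, if the procedure is shaped roughly like the nested loop below (purely an illustration, not your actual code, and the flag columns are only guesses), then the row-by-row updates and per-row commits alone would explain the run time:

-- Illustration of a typical slow row-by-row reconciliation (not the OP's code).
BEGIN
  FOR B IN (SELECT ROWID AS RID, CODE, PCD, REFERENCE_NO, AMOUNT
              FROM TEMP_SMD_6404) LOOP
    -- Flag matching records in the second table (assumed flag column).
    UPDATE THMASTER_6404 M
       SET M.PROCESSED = 'Y'
     WHERE M.CODE         = B.CODE
       AND M.PCD          = B.PCD
       AND M.REFERENCE_NO = B.REFERENCE_NO
       AND M.AMOUNT       = B.AMOUNT;

    -- Flag the base-table row if anything matched (assumed flag column).
    IF SQL%ROWCOUNT > 0 THEN
      UPDATE TEMP_SMD_6404
         SET CORE_FLAG = 'Y'
       WHERE ROWID = B.RID;
    END IF;

    COMMIT;  -- committing per row makes it even slower
  END LOOP;
END;
/

If that is the shape of it, rewriting the matching as a few set-based UPDATE or MERGE statements (or at least using BULK COLLECT / FORALL with a single commit at the end) is usually where the big gain comes from on tables of this size.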
Jan 3 '08 #4
