How do we improve query execution on a bulky database?

Hi Friends,

I am working with PostgreSQL 8.3. I have a table A that contains a number of attributes, and we have already put indexes on the Id and liveId attributes. The table also contains an insertedDate column, and one more very important thing: this table gets roughly 30 inserts every second.

When we fetch records from this table it takes too much time. Yesterday I put an index on the insertedDate column, and for the first 4-5 hours it worked very well, but as the table size grew it started taking too much time, even more than before. Therefore I dropped that index, but I still want to improve the fetching speed.
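
For reference, the index I created was just a plain b-tree on the timestamp column, roughly like this (a sketch, the exact names may differ):

create index tableA_insertedDate on tableA (insertedDate);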

If I put an index on again, which way should I do it? I am new to database handling, so please help me.

So please give me guidance. Thanks in advance.
Waiting for your early response...
Feb 26 '10 #1
rski
700 Expert 512MB
What query do you use to fetch data from the table?
Feb 26 '10 #2
RedSon
5,000 Expert 4TB
Does PostgreSQL have a query analysis tool?
Feb 26 '10 #3
Hi Rski,
Thanks for the reply.

I am using a subquery; the query is:

select * from tableA where liveid in (select id from tableB where vehicleid in (select id from tableC where registrationNumber='MH4R2423')) and inserteddate>'2010-02-20' and inserteddate<'2010-02-21';

When I put an index on inserteddate it worked quite well for 4-5 hours. But when the table size increased it took too much time, so I dropped that index.

Dear RedSon, I am new to Postgres
and I only know that if we want to check a query's performance we can run EXPLAIN ANALYZE on the user's query.

I am not a database guy, I am a developer, so I don't know too much about this. Still, I am trying to fix this issue, so if you know anything please help me.
Heartfelt thanks for your help.
Feb 27 '10 #4
rski
700 Expert 512MB
Can you show the explain plan for that query?
Feb 27 '10 #5
The explain plan of my query is as follows:

Explain analyze Select * from tableA where liveid in (select id from tableB where vehicleid in (select id from tableC where registrationNumber like '%0306')) and insertedDate>'2010-02-25' and insertedDate<'2010-02-26';
QUERY PLAN
---------------------------------------------------------------------------------------------------------
 Hash IN Join  (cost=37054.99..279253.69 rows=119 width=140) (actual time=13374.289..90126.653 rows=965 loops=1)
   Hash Cond: (tableA.liveid = tableB.id)
   ->  Bitmap Heap Scan on tableA  (cost=1998.45..243537.42 rows=87806 width=140) (actual time=11841.439..87984.638 rows=255173 loops=1)
         Recheck Cond: ((insertedDate > '2010-02-25 00:00:00+05:30'::timestamp with time zone) AND (insertedDate < '2010-02-26 00:00:00+05:30'::timestamp with time zone))
         ->  Bitmap Index Scan on tableA_insertedDate  (cost=0.00..1976.50 rows=87806 width=0) (actual time=11825.504..11825.504 rows=255180 loops=1)
               Index Cond: ((insertedDate > '2010-02-25 00:00:00+05:30'::timestamp with time zone) AND (insertedDate < '2010-02-26 00:00:00+05:30'::timestamp with time zone))
   ->  Hash  (cost=35009.64..35009.64 rows=3752 width=8) (actual time=1.070..1.070 rows=1 loops=1)
         ->  Nested Loop  (cost=190.88..35009.64 rows=3752 width=8) (actual time=1.054..1.063 rows=1 loops=1)
               ->  HashAggregate  (cost=27.23..27.24 rows=1 width=8) (actual time=0.269..0.271 rows=1 loops=1)
                     ->  Seq Scan on tableC  (cost=0.00..27.23 rows=1 width=8) (actual time=0.019..0.253 rows=1 loops=1)
                           Filter: (registrationNumber ~~ '%0360'::text)
               ->  Bitmap Heap Scan on tableB  (cost=163.65..34809.35 rows=13844 width=16) (actual time=0.771..0.773 rows=1 loops=1)
                     Recheck Cond: (tableB.vehicleid = tableC.id)
                     ->  Bitmap Index Scan on livestats_driver_vehicle_index  (cost=0.00..160.19 rows=13844 width=0) (actual time=0.147..0.147 rows=386 loops=1)
                           Index Cond: (tableB.vehicleid = tableC.id)
 Total runtime: 90143.018 ms
(16 rows)
Feb 27 '10 #6
rski
700 Expert 512MB
Did you try to replace the 'IN' clauses with simple joins?
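Something like this (just a sketch using the names from your query; if the subqueries can return duplicate ids you may need DISTINCT to keep the same semantics as IN):

select a.*
from tableA a
  join tableB b on b.id = a.liveid
  join tableC c on c.id = b.vehicleid
where c.registrationNumber = 'MH4R2423'
  and a.inserteddate > '2010-02-20'
  and a.inserteddate < '2010-02-21';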
Feb 27 '10 #7
Joins take more time... I already tried that, but it was not helpful either...
Feb 27 '10 #8
rski
700 Expert 512MB
OK, let's think.
Don't you think that you could replace

inserteddate>'2010-02-20' and inserteddate<'2010-02-21'

with

to_char(inserteddate,'YYYY-MM-DD')='2010-02-20'

After that it would be good to replace the index you have on inserteddate with a function-based index on to_char(inserteddate,'YYYY-MM-DD').
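
Note that in PostgreSQL an index expression may only use IMMUTABLE functions, and to_char() on a timestamp with time zone is not, so in practice the expression has to go through a small IMMUTABLE wrapper function first. A rough sketch (the helper name day_text is made up, and 'UTC' is just an assumption; use the zone your dates belong to):

create function day_text(ts timestamptz) returns text as
$$ select to_char(($1 at time zone 'UTC'), 'YYYY-MM-DD') $$
language sql immutable;

create index tableA_inserteddate_day on tableA (day_text(inserteddate));

-- the query then has to use exactly the same expression so the planner can use the index:
select * from tableA where day_text(inserteddate) = '2010-02-25';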
Feb 27 '10 #9
Thanks for the reply...
Buddy, I did the same thing, but in a different way.
I added a column to the table with a varchar datatype, and then populated it with the value of the inserteddate column without the time part.
Then I put an index on that column and made some modifications to the Java code as well. But when I compared the performance on my local box it was almost the same, not much difference...
But I am still looking for a way to do this....
Feb 28 '10 #10
rski
700 Expert 512MB
Can you describe the relations between the tables? I mean, can you describe the ERD for that database (or show the table definitions: do they contain any foreign keys, primary keys, ...)?
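In psql the table definitions, including their indexes and foreign keys, can be printed with the \d command, for example:

\d tableA
\d tableB
\d tableC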
Feb 28 '10 #11
