
Facing Problem while querying larger tables

Hi,
I am facing a problem when querying a large table with millions of rows. The query hangs partway through while fetching all those rows.

Can anybody suggest a query for this:
fetching the next 100 records from the database each time (which means I need to set up a cursor) until all the table rows have been read (take 1 million rows, e.g.)?

The database is DB2
Dec 3 '07 #1
2 Replies


mwasif
Moved to DB2 Forum.
Dec 4 '07 #2

docdiesel
Hi,

I am facing a problem when querying a large table with millions of rows. The query hangs partway through while fetching all those rows.

Millions of rows isn't the problem. If the query hangs partway through, there may be a (dead)lock. Or it simply needs a lot of time because of a) missing or unusable indexes or b) a bufferpool that is too small. And needing too much time often results in locks.

a) Try to create an index on the columns which are used in the WHERE clauses, join conditions etc.
b) If there's an index, do a RUNSTATS. Maybe even a REORG is necessary.
c) Increase the size of the bufferpool.
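In DB2 that could look like the following (a minimal sketch; bigtable, cust_id, the schema name and the bufferpool size are made-up for illustration, and RUNSTATS/REORG are run from the DB2 command line processor):

-- a) index on the columns used in WHERE clauses / join conditions
CREATE INDEX ix_bigtable_custid ON myschema.bigtable (cust_id);

-- b) refresh the optimizer statistics so the new index is actually used
RUNSTATS ON TABLE myschema.bigtable WITH DISTRIBUTION AND INDEXES ALL;

-- ...and, if the table is badly fragmented, a reorg as well
REORG TABLE myschema.bigtable;

-- c) give the bufferpool more pages (the size is just an example value)
ALTER BUFFERPOOL ibmdefaultbp SIZE 20000;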

Can anybody suggest a query for this:
fetching the next 100 records from the database each time (which means I need to set up a cursor) until all the table rows have been read (take 1 million rows, e.g.)?
Use "select ... from ... where ... fetch first 100 rows only". Unfortunately there's no "skip first x rows", so you have to add something like "where prim_key_row_id > pagecounter * 100".

Regards,

Bernd
Dec 5 '07 #3
