Results in multiple pages. Takes too much time

Hi,
I have a table of a million records and wrote a CGI-PERL script to
display the results based on the user input. The results might be
anywhere from 100 to 1000 per query and presently I am displaying them
as 25 results per page.

Problem: Each query is taking about 20-30 seconds.

My solution: I have tried optimizing the table and also indexing it. I
actually converted an MS Access database to a MySQL database, so it
wasn't previously indexed. Neither optimizing nor indexing gives any
improvement; I always get a timeout, i.e. it takes longer after indexing
and optimizing.

1. I was wondering if someone has a creative solution for this, i.e. a
way to reduce the time from 20-30 seconds to at least 10 seconds.

2. I have links to further pages of results beneath the first page of
results. When each of these links is clicked, it takes 20-30 seconds
again. Is there a way to reduce the time taken for the subsequent pages?
I cannot use the LIMIT option in MySQL, since I have a WHERE clause
which has to search through the whole table. I tried using views with
limits, but it takes just as much time.

Please let me know.

Thanks.

Jul 11 '06 #1
pr******@gmail.com wrote:
> Hi,
> I have a table of a million records and wrote a CGI-PERL script to
> display the results based on the user input.
Post some code; it seems to me that you may be doing something wrong,
wrong, wrong... I'm assuming you are using DBI, so just post the execute
and fetch parts of the code.
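For reference, a typical DBI execute/fetch loop looks something like this (just a sketch; the connection parameters, table, and column names here are made up, so substitute your own):

```perl
use strict;
use warnings;
use DBI;

# Hypothetical connection parameters -- replace with your own.
my $dbh = DBI->connect('DBI:mysql:database=mydb;host=localhost',
                       'user', 'password',
                       { RaiseError => 1 }) or die $DBI::errstr;

my $city = 'Boston';    # example user input from the CGI form

# Placeholders (?) let MySQL handle quoting and avoid SQL injection.
my $sth = $dbh->prepare('SELECT id, name FROM records WHERE city = ?');
$sth->execute($city);

while (my ($id, $name) = $sth->fetchrow_array) {
    print "$id: $name\n";
}
$sth->finish;
$dbh->disconnect;
```

If your code instead fetches all rows and filters them in Perl, that alone would explain the 20-30 seconds.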
> The results might be
> anywhere from 100 to 1000 per query and presently I am displaying them
> as 25 results per page.
>
> Problem: Each query is taking about 20-30 seconds.
Could be a missing index... What do the indexes look like, and what
does your SQL look like?
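One quick way to check is EXPLAIN: run it on the exact query your script issues and see whether MySQL picks an index (table and column names below are hypothetical):

```sql
-- Use the columns from your actual WHERE clause here.
EXPLAIN SELECT id, name FROM records WHERE city = 'Boston';

-- If the "key" column in the EXPLAIN output is NULL, the query is
-- scanning the whole table; add an index on the filtered column:
CREATE INDEX idx_records_city ON records (city);
```

Note that an index only helps if it matches the columns your WHERE clause filters on; an index on the wrong column changes nothing.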
>
> My solution: I have tried optimizing the table and also indexing it. I
> actually converted an MS Access database to a MySQL database, so it
> wasn't previously indexed. Neither optimizing nor indexing gives any
> improvement; I always get a timeout, i.e. it takes longer after
> indexing and optimizing.
>
> 1. I was wondering if someone has a creative solution for this, i.e. a
> way to reduce the time from 20-30 seconds to at least 10 seconds.
>
> 2. I have links to further pages of results beneath the first page of
> results. When each of these links is clicked, it takes 20-30 seconds
> again. Is there a way to reduce the time taken for the subsequent pages?
> I cannot use the LIMIT option in MySQL, since I have a WHERE clause
> which has to search through the whole table. I tried using views with
> limits, but it takes just as much time.
Sure you can: LIMIT takes an offset and a row count, and a WHERE clause
doesn't stop you from using it. Frankly I don't know how they do this in
SQL Server, where you only have TOP.

Jeff
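A paging sketch with DBI might look like the following (hypothetical schema again; the offset arithmetic is the important part, and the LIMIT values are validated integers interpolated directly, since binding placeholders inside LIMIT is unreliable on older DBD::mysql versions):

```perl
use strict;
use warnings;
use DBI;

my $per_page = 25;
my $page     = 3;                          # page number from the CGI request
$page = 1 if $page !~ /^\d+$/ || $page < 1;
my $offset   = ($page - 1) * $per_page;    # rows to skip: 50 for page 3

my $dbh = DBI->connect('DBI:mysql:database=mydb;host=localhost',
                       'user', 'password', { RaiseError => 1 });

# WHERE and LIMIT combine fine: MySQL filters first, then returns
# only the requested slice of the matching rows.
my $sql = sprintf(
    'SELECT id, name FROM records WHERE city = ? ORDER BY id LIMIT %d, %d',
    $offset, $per_page);

my $sth = $dbh->prepare($sql);
$sth->execute('Boston');

while (my $row = $sth->fetchrow_hashref) {
    print "$row->{id}: $row->{name}\n";
}
```

An ORDER BY is needed for stable paging; without it MySQL may return rows in a different order on each request, so pages can overlap or skip rows.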
Jul 12 '06 #2
