Bytes | Software Development & Data Engineering Community

Retrieving large query results in chunks

Hi,

I'm running queries with MySQL 4.0.17 that return thousands of
records. Because I need to present them in a GUI, I retrieve the
results in chunks using LIMIT, for example - get the first 100, then the
range 100-2000, and so on.

The problem is as follows: in the first chunk, MySQL uses one strategy
to fetch the results, and in the following chunks a different one.
This means that the subsequent queries might return records
that already appeared in the first query, or that some records will be
left out.

For performance reasons it is a problem to add a unique secondary
sort criterion (such as the id) to the query.

Is there a clean way to force MySQL to stay consistent with the first
(initial) query's result set?
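For context, this is roughly the paging pattern I'm using. The sketch below uses SQLite and an invented records(id, score) table standing in for my real schema - the point is that each chunk is an independent LIMIT/OFFSET query ordered only by a non-unique column, so nothing ties the chunks to one consistent ordering:

```python
import sqlite3

# invented stand-in schema: many rows share the same score value
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, score INTEGER)")
conn.executemany("INSERT INTO records (score) VALUES (?)",
                 [(i % 5,) for i in range(1000)])

def fetch_chunk(offset, count):
    # ORDER BY a non-unique column only: the relative order of tied
    # rows is unspecified, so separate chunk queries need not agree
    return conn.execute(
        "SELECT id, score FROM records ORDER BY score LIMIT ? OFFSET ?",
        (count, offset)).fetchall()

first = fetch_chunk(0, 100)    # chunk 1: rows 0-99
rest = fetch_chunk(100, 900)   # chunk 2: rows 100-999, a separate query
```

Each call is a fresh query, so the database is free to resolve the ties among equal score values differently each time.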

Thanks,
Guy
Jul 20 '05 #1
"Guy Erez" wrote:
Is there a clean way to force MySQL to relate to the first (initial)
query result set?


One way of doing it is to run the full query (without the LIMIT) and save
the result set to a table. A unique key must be saved
alongside the result set, so that multiple users' result sets can be
distinguished. Typically this is done using the session id (SID).

Then simply page through this saved set from the db.

That's how phpBB does it.
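A minimal sketch of that approach, again using SQLite and invented names (records, result_cache, sid, seq) since the original schema isn't shown. The full query runs once, its rows are frozen into a numbered cache table keyed by session id, and all paging happens against that snapshot:

```python
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, score INTEGER)")
conn.executemany("INSERT INTO records (score) VALUES (?)",
                 [(i % 5,) for i in range(300)])

# snapshot table: one frozen, numbered copy of the result set per session
conn.execute("""CREATE TABLE result_cache (
    sid TEXT, seq INTEGER, id INTEGER, score INTEGER,
    PRIMARY KEY (sid, seq))""")

def run_query(sid):
    # run the full (unlimited) query once and freeze its order as seq
    rows = conn.execute("SELECT id, score FROM records ORDER BY score").fetchall()
    conn.executemany(
        "INSERT INTO result_cache (sid, seq, id, score) VALUES (?, ?, ?, ?)",
        [(sid, seq, rid, score) for seq, (rid, score) in enumerate(rows)])

def fetch_page(sid, offset, count):
    # paging by seq is stable: the snapshot never reorders between calls
    return conn.execute(
        "SELECT id, score FROM result_cache WHERE sid = ? "
        "ORDER BY seq LIMIT ? OFFSET ?", (sid, count, offset)).fetchall()

sid = str(uuid.uuid4())
run_query(sid)
page1 = fetch_page(sid, 0, 100)
page2 = fetch_page(sid, 100, 100)
```

The cost is the extra write of the full result set up front, plus housekeeping to delete a session's cached rows when it ends, which is the trade phpBB accepts for consistent paging.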

Jul 20 '05 #2

This thread has been closed and replies have been disabled. Please start a new discussion.

