
first-prev-next-last using PHP

Hi Everyone,

I am developing a browser-based application. One of its features is a
product search: the user enters some matching criteria and the matching
product names are displayed as links. Sometimes the search returns a few
thousand product names, and I want to show only 30 items at a time in the
HTML page. I am using MySQL as the database and PHP for the programming.

Could anybody tell me how to show the results 30 items at a time, with
"first-prev-next-last" links? Where should I store the result set? How
can I handle multiple users accessing it at the same time?

Thanks
AR
Jul 17 '05 #1
8 Replies


On 2003-12-07, AR John <ar********@yahoo.com> wrote:
> Could anybody tell me how to show the results 30 items at a time, with
> "first-prev-next-last" links? Where should I store the result set? How
> can I handle multiple users accessing it at the same time?


AFAIK there is no way to store result sets across requests. So you would
have to perform the query for each page and then use LIMIT offset,
row_count to select only the rows you want to show. Having an index on the
columns you use for your selection might speed things up.
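
A minimal sketch of that approach, assuming a mysqli connection in $db and
a products table with a name column (the table, column, and the 'widget%'
criteria are all made up for illustration):

<?php
// Paging with LIMIT: fetch only the 30 rows for the requested page.
$perPage = 30;
$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;

// Total number of matching rows, needed for the "last" link.
$row    = $db->query("SELECT COUNT(*) FROM products WHERE name LIKE 'widget%'")->fetch_row();
$total  = (int) $row[0];
$pages  = max(1, (int) ceil($total / $perPage));
$page   = min($page, $pages);
$offset = ($page - 1) * $perPage;

// Fetch just this page; $offset and $perPage are integers, so they are safe to embed.
$result = $db->query("SELECT name FROM products WHERE name LIKE 'widget%' ORDER BY name LIMIT $offset, $perPage");
while ($r = $result->fetch_assoc()) {
    echo '<a href="product.php?name=' . urlencode($r['name']) . '">'
       . htmlspecialchars($r['name']) . "</a><br>\n";
}

// first-prev-next-last links.
printf('<a href="?page=1">first</a> ');
printf('<a href="?page=%d">prev</a> ', max(1, $page - 1));
printf('<a href="?page=%d">next</a> ', min($pages, $page + 1));
printf('<a href="?page=%d">last</a>', $pages);
?>

With an index on name, MySQL can satisfy both the COUNT and the page query
without scanning the whole table, though pages deep into a very large
result still get slower as the offset grows.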

--
verum ipsum factum
Jul 17 '05 #2

Tom
You may want to have a look at this tutorial:
http://www.phpfreaks.com/tutorials/73/0.php

"Tim Van Wassenhove" <eu**@pi.be> wrote in message
news:bq*************@ID-188825.news.uni-berlin.de...
> AFAIK there is no way to store result sets across requests. So you would
> have to perform the query for each page and then use LIMIT offset,
> row_count to select only the rows you want to show.

Jul 17 '05 #3

Thanks for the idea.

Tim Van Wassenhove <eu**@pi.be> wrote in message news:<bq*************@ID-188825.news.uni-berlin.de>...
> AFAIK there is no way to store result sets across requests. So you would
> have to perform the query for each page and then use LIMIT offset,
> row_count to select only the rows you want to show.

Jul 17 '05 #4

AR John wrote:

> Could anybody tell me how to show the results 30 items at a time, with
> "first-prev-next-last" links? Where should I store the result set? How
> can I handle multiple users accessing it at the same time?


Has anybody tried storing the result in a PHP session (if it is not too
large)? I suppose that would be faster and would reduce the database
server load...
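
A rough sketch of that idea, assuming a mysqli connection in $db, a
products table with a name column, the search term arriving in $_GET['q'],
and a result set small enough to keep in the session (all of those are
illustrative assumptions):

<?php
session_start();

$criteria = isset($_GET['q']) ? $_GET['q'] : '';

// Run the query only when the search criteria change; otherwise reuse the
// copy already stored in this user's session.
if (!isset($_SESSION['search_q']) || $_SESSION['search_q'] !== $criteria) {
    $names = array();
    $stmt  = $db->prepare("SELECT name FROM products WHERE name LIKE CONCAT(?, '%') ORDER BY name");
    $stmt->bind_param('s', $criteria);
    $stmt->execute();
    $stmt->bind_result($name);
    while ($stmt->fetch()) {
        $names[] = $name;
    }
    $stmt->close();

    $_SESSION['search_q']       = $criteria;
    $_SESSION['search_results'] = $names;
}

// Page through the cached array instead of hitting the database again.
$perPage     = 30;
$page        = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$pageOfNames = array_slice($_SESSION['search_results'], ($page - 1) * $perPage, $perPage);
?>

Because every visitor has their own session, concurrent users cannot see
each other's results; the trade-off is the memory and session I/O spent on
each stored result set.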

Jul 17 '05 #5

Adi Schwarz wrote:
> Has anybody tried storing the result in a PHP session (if it is not too
> large)? I suppose that would be faster and would reduce the database
> server load...


select *
from mytable
order by some_field
limit zero_based_offset, record_count;
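
For example, page 3 at 30 rows per page gives a zero-based offset of
(3 - 1) * 30 = 60, so the clause becomes:

select *
from mytable
order by some_field
limit 60, 30;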

HTH

Matt
Jul 17 '05 #6

Matty wrote:
> select *
> from mytable
> order by some_field
> limit zero_based_offset, record_count;


Of course that works, but the point is that this query is executed once
for every page of the result set: the server finds all matching rows,
sorts them, and then returns only the small slice that is actually needed,
repeating (almost) the same work for every single page. I would say it
saves database server load if that work is only done once.

-as

Jul 17 '05 #7

Adi Schwarz wrote:
> Of course that works, but the point is that this query is executed once
> for every page of the result set [...] I would say it saves database
> server load if that work is only done once.


Depends on how your code is written, whether you use persistent
connections, etc. Bear in mind that if the user is only likely to want to
see 10 records, it may be a little wasteful to fetch 2000 that they will
never see.

If you're talking about caching the actual data returned, then look at
using an application-level cache. I personally go the roll-your-own
route, but PEAR has a Pear::Cache class (or similar) that does this.
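
A roll-your-own version can be as small as serializing the result array to
a file keyed by the search criteria. A sketch, where the cache path and
lifetime are arbitrary and run_product_search() is a hypothetical stand-in
for whatever code runs the SELECT and returns the names as an array:

<?php
// Return the cached result for $criteria if the cache file is fresh,
// otherwise rebuild it.
function cached_search(mysqli $db, $criteria, $ttl = 300)
{
    $file = '/tmp/search_' . md5($criteria) . '.cache';

    if (is_file($file) && filemtime($file) > time() - $ttl) {
        return unserialize(file_get_contents($file));
    }

    // run_product_search() is a stand-in for your own query code.
    $names = run_product_search($db, $criteria);

    file_put_contents($file, serialize($names));
    return $names;
}
?>

Unlike the session approach, this cache is shared by every user who runs
the same search, so popular searches only hit the database once per cache
lifetime.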

And just how large would the entire result be? How much do you want to pull
across the network connection to the DB server, how much do you want to
serialize/deserialize to/from disk?

Better still, why not just look at caching the page output (depending on
your application), which saves executing most of your PHP and database
code at all?
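
A sketch of that output-caching idea with PHP's output buffering (the file
name and lifetime are again arbitrary):

<?php
$cacheFile = '/tmp/page_' . md5($_SERVER['REQUEST_URI']) . '.html';
$ttl       = 300; // seconds

// Serve a stored copy of the whole page while it is fresh enough.
if (is_file($cacheFile) && filemtime($cacheFile) > time() - $ttl) {
    readfile($cacheFile);
    exit;
}

// Otherwise build the page normally, capturing the output as it is printed.
ob_start();

// ... the usual PHP and database code that prints the page ...

$html = ob_get_clean();
file_put_contents($cacheFile, $html);
echo $html;
?>

Since the cache key includes the query string, each combination of search
criteria and page number gets its own cached copy; anything personalised
per user should not be cached this way.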
Jul 17 '05 #8

On 2003-12-12, Adi Schwarz <ad**********************@gmx.at> wrote:
> Of course that works, but the point is that this query is executed once
> for every page of the result set [...] I would say it saves database
> server load if that work is only done once.


You could create a temporary table, insert the matching rows into it, and
then retrieve each page of data from that table...
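
One caveat: a MySQL TEMPORARY table only lives as long as the connection
that created it, so with ordinary (non-persistent) PHP connections it is
gone by the next page request. A sketch of the same idea with an ordinary
table keyed by the PHP session id (the table, columns, 'abc123' id, and
'widget%' criteria are all illustrative):

-- Fill once per search ...
CREATE TABLE search_cache (
    session_id CHAR(32)     NOT NULL,
    name       VARCHAR(255) NOT NULL,
    INDEX (session_id, name)
);

INSERT INTO search_cache (session_id, name)
SELECT 'abc123', name
FROM products
WHERE name LIKE 'widget%';

-- ... then page through it cheaply on every request ...
SELECT name
FROM search_cache
WHERE session_id = 'abc123'
ORDER BY name
LIMIT 60, 30;

-- ... and clean up when the search changes or the session ends.
DELETE FROM search_cache WHERE session_id = 'abc123';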

--
verum ipsum factum
Jul 17 '05 #9
