Bytes IT Community

displaying thousands and thousands of records via PHP/HTML

I have currently devised a way to display up to 3,000 records via
PHP/HTML by using pagination techniques that stuff an entire DB query
resultset into a $_SESSION object to use within pagination.

Here's my question: How well will this work when we start dealing with
more than 3,000 records? 5,000? 10,000? More? The $_SESSION
object will obviously become too large to be manageable.

What do you guys advise I do to manage this in the foreseeable future?
No rush on this of course for now.

Thanx
Phil

Mar 13 '06 #1
13 Replies


Don't do it. You'll kill the webserver. It's also just an incredibly
bad way of doing it.

You should be forming queries that limit the resultset and provide the
results on demand.

With MySQL you can do something like...

SELECT foo FROM bar LIMIT x,y

... where x is the starting record and y is the number of records to grab.
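A minimal sketch of that in PHP (the `limit_clause()` helper, the page size, and the `page` parameter name are invented for illustration; the table and column names come from the example above):

```php
<?php
// Build the LIMIT clause for a 1-based page number and a fixed page size.
function limit_clause($page, $per_page = 20) {
    $page   = max(1, (int)$page);       // clamp bad input to page 1
    $offset = ($page - 1) * $per_page;  // records to skip
    return "LIMIT $offset, $per_page";
}

// e.g. page 7 of 20-per-page results:
$page = isset($_GET['page']) ? $_GET['page'] : 1;
$sql  = "SELECT foo FROM bar " . limit_clause($page);
```

Each request then fetches only one page's worth of rows instead of the whole table.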

Mar 13 '06 #2


jonathan.beckett wrote:
Don't do it. You'll kill the webserver. It's also just an incredibly
bad way of doing it.

Why? Wouldn't hitting the database hundreds of times be even worse?
You should be forming queries that limit the resultset and provide the
results on demand.

With MySQL you can do something like...

SELECT foo FROM bar LIMIT x,y

... where x and y are the start record, and number of records to grab.


Yeah but the customer demands that they be able to do this:

1) Sort all of the existing records by multiple fields
2) Search all existing records according to a fulltext field
3) Be able to view Page 127 or 40 or 3 or whatever entirely on the fly
WHILE looking at the paginated 10 - 20 records that they are able to
see upfront.

Phil

Mar 13 '06 #3

NC
comp.lang.php wrote:
jonathan.beckett wrote:
Don't do it. You'll kill the webserver. It's also just an incredibly
bad way of doing it.


Why? Wouldn't hitting the database hundreds of times be even worse?


Not in your case; your MySQL client may end up running out of memory
due to the large size of the result set...

Cheers,
NC

Mar 13 '06 #4

In article <11**********************@z34g2000cwc.googlegroups.com>,
"comp.lang.php" <ph**************@gmail.com> wrote:
jonathan.beckett wrote:
Don't do it. You'll kill the webserver. It's also just an incredibly
bad way of doing it.


Why? Wouldn't hitting the database hundreds of times be even worse?
You should be forming queries that limit the resultset and provide the
results on demand.

With MySQL you can do something like...

SELECT foo FROM bar LIMIT x,y

... where x and y are the start record, and number of records to grab.


Yeah but the customer demands that they be able to do this:

1) Sort all of the existing records by multiple fields
2) Search all existing records according to a fulltext field
3) Be able to view Page 127 or 40 or 3 or whatever entirely on the fly
WHILE looking at the paginated 10 - 20 records that they are able to
see upfront.


All of which you handle with a db query, not by saving db query
results.

select * from db order by date desc, headline limit 127,40

Will show all and sort by date field first, descending, and headline
next, ascending. And will begin at result number 127 and show 40
results.

Piece of cake. What you mean by "WHILE looking at the paginated 10-20
records" I don't know, but opening another result in a new window, in
a frame, or in a DIV isn't very hard.
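For the multi-field sorting requirement, a common pattern (sketched here with invented column names and an invented `sort` parameter) is to map user input onto a whitelist, so raw input never reaches the ORDER BY clause:

```php
<?php
// Map a user-supplied sort key onto a whitelisted ORDER BY fragment.
function order_clause($sort) {
    $allowed = array(
        'date'     => 'date DESC, headline ASC',
        'headline' => 'headline ASC',
    );
    $key = isset($allowed[$sort]) ? $sort : 'date';  // fall back to default
    return 'ORDER BY ' . $allowed[$key];
}

$sql = "SELECT * FROM db " . order_clause('headline') . " LIMIT 120, 40";
```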
--
Sandman[.net]
Mar 13 '06 #5

NC wrote:
comp.lang.php wrote:
jonathan.beckett wrote:
> Don't do it. You'll kill the webserver. It's also just an incredibly
> bad way of doing it.
Why? Wouldn't hitting the database hundreds of times be even worse?

Not in your case; your MySQL client may end up running out of memory
due to the large size of the result set... Cheers,
NC

Why would anyone need to display resultsets that large? They are
generally worthless to look at and even less useful for finding the
information you really need to see...

If the resultsets are too large you will:

Kill the server - returning all of those records
Kill the network - saturate it with unnecessary data.
Kill the DSL/Cable connection
Kill the PC - trying to download the information
Kill the Browser... Can you say - need more memory?
Depending on the number of requests for this process, 1 and 2 will most
likely bite you.

Mar 13 '06 #6


noone wrote:
NC wrote:
comp.lang.php wrote:
jonathan.beckett wrote:
> Don't do it. You'll kill the webserver. It's also just an incredibly
> bad way of doing it.

Why? Wouldn't hitting the database hundreds of times be even worse?
Not in your case; your MySQL client may end up running out of memory
due to the large size of the result set...

Cheers,
NC

Why would anyone need to display resultsets that large? They are
generally worthless to look at and even less useful for finding the
information you really need to see...

If the resultsets are too large you will:

Kill the server - returning all of those records
Kill the network - saturate it with unnecessary data.
Kill the DSL/Cable connection
Kill the PC - trying to download the information
Kill the Browser... Can you say - need more memory?

As lame as it is going to sound, the customer wants to be able to view
ANY of their existing records whenever they want, including the option
to [shudders] "View ALL students"... yes, they want to be able to view
all 3,000, 4,000, a zillion students at one time if they choose to do so.
*sigh*

Phil

Depending on the number of requests for this process, 1 and 2 will most
likely bite you.


Mar 13 '06 #7

comp.lang.php wrote:

noone wrote:
NC wrote:
> comp.lang.php wrote:
>> jonathan.beckett wrote:
>> > Don't do it. You'll kill the webserver. It's also just an incredibly
>> > bad way of doing it.
>>
>> Why? Wouldn't hitting the database hundreds of times be even worse?
> Not in your case; your MySQL client may end up running out of memory
> due to the large size of the result set...

> Cheers,
> NC

Why would anyone need to display resultsets that large? They are
generally worthless to look at and even less useful for finding the
information you really need to see...

If the resultsets are too large you will:

Kill the server - returning all of those records
Kill the network - saturate it with unnecessary data.
Kill the DSL/Cable connection
Kill the PC - trying to download the information
Kill the Browser... Can you say - need more memory?

As lame as it is going to sound, the customer wants to be able to view
ANY of their existing records whenever they want, including the option
to [shudders] "View ALL students"... yes, they want to be able to view
all 3,000, 4,000, a zillion students at one time if they choose to do so.
*sigh* Phil


Depending on the number of requests for this process, 1 and 2 will most
likely bite you.


What I have done in the past, when they want that much data (the
browser is the wrong tool for the job), is to send headers so the
browser treats the response as a CSV file (text/csv, or
application/vnd.ms-excel if you want Excel to claim it) and let them
open it up in Excel.
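The header trick might look like this sketch (the `csv_line()` helper and the column names are invented; `header()` and the MIME type are the real mechanics):

```php
<?php
// Send headers so the browser offers the response as a CSV download.
function send_csv_headers($filename) {
    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="' . $filename . '"');
}

// Turn one record into a quoted CSV line, escaping embedded quotes.
function csv_line(array $fields) {
    $quoted = array();
    foreach ($fields as $f) {
        $quoted[] = '"' . str_replace('"', '""', $f) . '"';
    }
    return implode(',', $quoted) . "\n";
}

send_csv_headers('students.csv');
echo csv_line(array('id', 'first_name', 'last_name'));  // header row
// ...then loop over the result set, echoing csv_line() per row...
```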

I have also dumped the data to a tmp file, opened the file, and read
only x lines at a time... Depending on the file size it takes longer
and longer to retrieve the data, but it worked...

Sometimes it can be a learning experience for the customer. There are
times when you need to "coach" them into seeing that it is a very bad
idea, especially since there could be a lot of personal data displayed,
which is a huge security risk.

Which school, so I know not to send my kids there...? :)

Mar 13 '06 #8


noone wrote:
comp.lang.php wrote:

noone wrote:
NC wrote:

> comp.lang.php wrote:
>> jonathan.beckett wrote:
>> > Don't do it. You'll kill the webserver. It's also just an incredibly
>> > bad way of doing it.
>>
>> Why? Wouldn't hitting the database hundreds of times be even worse?

> Not in your case; your MySQL client may end up running out of memory
> due to the large size of the result set...

> Cheers,
> NC
Why would anyone need to display resultsets that large? They are
generally worthless to look at and even less useful for finding the
information you really need to see...

If the resultsets are too large you will:

Kill the server - returning all of those records
Kill the network - saturate it with unnecessary data.
Kill the DSL/Cable connection
Kill the PC - trying to download the information
Kill the Browser... Can you say - need more memory?

As lame as it is going to sound, the customer wants to be able to view
ANY of their existing records whenever they want, including the option
to [shudders] "View ALL students"... yes, they want to be able to view
all 3,000, 4,000, a zillion students at one time if they choose to do so.
*sigh*

Phil


Depending on the number of requests for this process, 1 and 2 will most
likely bite you.


What I have done in the past, when they want that much data (the
browser is the wrong tool for the job), is to send headers so the
browser treats the response as a CSV file (text/csv, or
application/vnd.ms-excel if you want Excel to claim it) and let them
open it up in Excel.

I have also dumped the data to a tmp file, opened the file, and read
only x lines at a time... Depending on the file size it takes longer
and longer to retrieve the data, but it worked...

Sometimes it can be a learning experience for the customer. There are
times when you need to "coach" them into seeing that it is a very bad
idea, especially since there could be a lot of personal data displayed,
which is a huge security risk.

Which school, so I know not to send my kids there...? :)


A branch of the U.S. Federal Government, and sorry, a webpage is a
federally-mandated requirement in this case.

Phil

Mar 13 '06 #9

I had to make something very similar for a mailing list, ~5,000 records.
Store the query information (not the data/result set) in the session;
that means the start/count for the LIMIT clause.

What I would do is use GET requests to specify the offset:
?start=10&number=100
or just hide the start/number by doing ?page=5 and do the
offset/count calculation in the PHP, with
?start=all to show all records.
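A sketch of that parameter handling (the parameter names come from the suggestion above; the `limit_from_request()` helper and table name are invented):

```php
<?php
// Turn ?start and ?number into a LIMIT fragment; 'all' means no limit.
function limit_from_request($start, $number = 100) {
    if ($start === 'all') {
        return '';                       // no LIMIT clause at all
    }
    $start  = max(0, (int)$start);       // clamp bad input
    $number = max(1, (int)$number);
    return "LIMIT $start, $number";
}

$start  = isset($_GET['start'])  ? $_GET['start']  : 0;
$number = isset($_GET['number']) ? $_GET['number'] : 100;
$sql = trim("SELECT * FROM students " . limit_from_request($start, $number));
```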

Processing that much is going to be slow and may bog the browser down.
It may be better to send a CSV file so they can view it in Excel
instead.

If you're dealing with large result sets, I'm willing to bet PHP will
bog down (if you're storing it all in an array or something) before
the database server begins to bog down.

Mar 13 '06 #10


Richard Levasseur wrote:
I had to make something very similar for a mailing list, ~5,000 records.
Store the query information (not the data/result set) in the session;
that means the start/count for the LIMIT clause.

What I would do is use GET requests to specify the offset:
?start=10&number=100
or just hide the start/number by doing ?page=5 and do the
offset/count calculation in the PHP, with
?start=all to show all records.

Processing that much is going to be slow and may bog the browser down.
It may be better to send a CSV file so they can view it in Excel
instead.

If you're dealing with large result sets, I'm willing to bet PHP will
bog down (if you're storing it all in an array or something) before
the database server begins to bog down.


OK, it took a bit of effort and guesswork, but I was able to upgrade my
existing Pagination class methodology to allow for optionally limiting
the query by adding an additional field:

SELECT id, first_name, last_name, (SELECT count(id) FROM people) AS
pagination_total FROM people GROUP BY id, first_name, last_name,
pagination_total ORDER BY upper(last_name) ASC, upper(first_name) ASC
LIMIT 0, 10

What I have to do is look for $result[0]->pagination_total in the
resultset and, if found, bypass the $_SESSION handling altogether.
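That check could be sketched like so (the row objects here are hand-built stand-ins for whatever the DB layer returns; it assumes the subquery count is aliased to match the `pagination_total` property being checked):

```php
<?php
// If the first row carries the embedded total-row count, return it;
// otherwise signal the caller to fall back to the $_SESSION handling.
function total_from_resultset(array $result) {
    if (isset($result[0]->pagination_total)) {
        return (int)$result[0]->pagination_total;
    }
    return null;
}

// Stand-in row for illustration:
$row = new stdClass();
$row->id = 1;
$row->pagination_total = 3000;
```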

Thanx all!

Phil

Mar 13 '06 #11

NC
noone wrote:

why would anyone need to display resultsets that large


I can think of a couple of reasons, such as importing it into another
program for statistical analysis...

Cheers,
NC

Mar 14 '06 #12

comp.lang.php wrote:
I have currently devised a way to display up to 3,000 records via
PHP/HTML by using pagination techniques that stuff an entire DB query
resultset into a $_SESSION object to use within pagination.


You know, that's not very efficient. PHP always writes session variables
to disk, even when there are no changes, so your server will end up doing
a lot of unnecessary disk writes. Given that a write operation always
leads to mechanical action while a read operation can be fully cached,
I'd say you'd be better off re-running the query on each page.

Mar 14 '06 #13

It should also be noted that queries are cached by the database server,
so subsequent identical queries will be faster, much faster than
reading and writing session data to disk on every page hit.

Mar 14 '06 #14
