I have a small app that I would like some advice on. I am pulling a
rather large amount of data from a MySQL table and sometimes the
execution can be as long as 15 to 18 minutes. I would like to know if
anyone has any suggestions for a different approach that might lead to
faster results.
Here is an overview . .
Data table contains sales information for all customers and is similar
to this.
Customer#, Customer Dept#, Item, Qty, Price
I want to see a report for a specified customer, broken out and
subtotaled by department.
I am currently accomplishing this as follows.
First query = select c_dept, count(c_dept) as Cnt from 25sales
where c_nbr=$customer group by c_dept
Returns a list of all Departments for the specified customer that have
made a purchase.
I then loop through each record returned and run separate queries on
each to get a usage report for each department. Something like this.
$result = mysql_query($query, $connection);
while ($row = mysql_fetch_array($result))
{
    extract($row);
    # Here I run and display a second query that returns all sales for
    # the customer where the dept matches the current row from the
    # first query.
    # Then run and display a third query that totals each department.
}
End result looks something like this.
Dept Item Qty Price
--------------------------------
01 MR123 5 $60.00
01 MR456 2 $30.00
--------------------------------
Total for Dept $90.00
Dept Item Qty Price
--------------------------------
02 MR123 3 $45.00
02 MR456 2 $30.00
--------------------------------
Total for Dept $75.00
And so on for each dept for the particular customer.
Works great and gives me exactly what I'm looking for . . but . .
I have one customer with around 350 departments and each department
purchases between 200 and 300 different items. When extracting this
info from a database of over 1 million records, it takes a long, long
time.
Can anyone suggest a more efficient way to do this?
On Tue, 19 Apr 2005 13:30:56 -0700, rN**********@azorinc.com wrote:
> I then loop through each record returned and run separate queries on
> each to get a usage report for each department.
This is what's killing your report!
> Works great and gives me exactly what I'm looking for . . but . .
For unusual values of the word great ...
> When extracting this info from a database of over 1 million records,
> it takes a long, long time.
Yes I can quite imagine.
> Can anyone suggest a more efficient way to do this?
You can do this quite simply with one query and one pass through the data:
http://www.darrylcatchpole.net/subtotal/salessum.phps
http://www.darrylcatchpole.net/subtotal/salessum.php
http://www.darrylcatchpole.net/subtotal/mytest.sql
You should also check that you've indexed the right columns and that the
column types match in the various tables.
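For what it's worth, the one-pass "control break" idea looks roughly like this. It's sketched in Python rather than PHP, with hard-coded tuples standing in for a single `SELECT ... WHERE c_nbr = ? ORDER BY c_dept` result set; the column names are the ones from the thread, and the exact schema is assumed:

```python
# Stand-in for the rows of one sorted query:
#   SELECT c_dept, item, qty, price FROM 25sales
#   WHERE c_nbr = ? ORDER BY c_dept
rows = [
    ("01", "MR123", 5, 60.00),
    ("01", "MR456", 2, 30.00),
    ("02", "MR123", 3, 45.00),
    ("02", "MR456", 2, 30.00),
]

def report(rows):
    """One pass over sorted rows: print each detail line,
    and emit a subtotal whenever the department changes."""
    lines = []
    current_dept = None
    subtotal = 0.0
    for dept, item, qty, price in rows:
        if dept != current_dept:
            if current_dept is not None:
                lines.append(f"Total for Dept ${subtotal:.2f}")
            current_dept = dept
            subtotal = 0.0
        lines.append(f"{dept} {item} {qty} ${price:.2f}")
        subtotal += price
    if current_dept is not None:
        # flush the subtotal for the last department
        lines.append(f"Total for Dept ${subtotal:.2f}")
    return lines

for line in report(rows):
    print(line)
```

The point is that the database does one sorted scan and PHP (or whatever the client is) just watches for the dept value to change, instead of issuing two extra queries per department.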
On Tue, 19 Apr 2005 13:30:56 -0700, rN**********@azorinc.com wrote:
> Can anyone suggest a more efficient way to do this?
I think you need to use the relational database as such. Let it do the
selection work, not PHP.
You may well want to do some basic maintenance and dba work on the
database itself.
Steve
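One concrete piece of that maintenance, guessing at the schema since only column names appear in the thread: a composite index covering the WHERE and GROUP BY columns lets MySQL find one customer's rows without scanning all million records. Something like:

```sql
-- Hypothetical: table and column names taken from the original post.
ALTER TABLE `25sales` ADD INDEX idx_cust_dept (c_nbr, c_dept);
```

Run EXPLAIN on the report query before and after to confirm the index is actually used.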
Works perfectly. Cut the execution time down to about 20 seconds. Not
sure why I didn't think to do all the subtotals in code rather than
query. I guess I need to remove MS Access from my machine (and my
thought process). Thanks for the gentle kick in the right direction.
> Not sure why I didn't think to do all the subtotals in code rather
> than query.
Rather than sub-totaling in code, you might get even faster performance
if you use a UNION ALL query to retrieve the summary rows in the same
query as the detail rows. The general technique is described in my ASP
speed tricks article on this page: http://www.somacon.com/aspdocs/page2010.php
Applies equally to PHP. So for example, your query might look like
this:
select 1 as rowtype, dept, price from items
union all
select 2 as rowtype, dept, sum(price) from items group by rowtype, dept
order by dept, rowtype
The number and type of the columns in each component query must match.
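To show how the interleaving falls out, here is a sketch in Python, with hand-made tuples standing in for that UNION ALL result set already sorted by dept, rowtype. Rowtype 1 rows render as detail lines and rowtype 2 rows as the subtotal, so no accumulation is needed on the client side at all:

```python
# Stand-in for: SELECT 1 as rowtype, ... UNION ALL SELECT 2 as rowtype, ...
# ORDER BY dept, rowtype -- the summary row sorts after its detail rows.
rows = [
    (1, "01", 60.00),
    (1, "01", 30.00),
    (2, "01", 90.00),   # summary row: sum(price) for dept 01
    (1, "02", 45.00),
    (1, "02", 30.00),
    (2, "02", 75.00),   # summary row: sum(price) for dept 02
]

def render(rows):
    out = []
    for rowtype, dept, price in rows:
        if rowtype == 1:                      # detail line
            out.append(f"{dept} ${price:.2f}")
        else:                                 # subtotal line, precomputed by SQL
            out.append(f"Total for Dept ${price:.2f}")
    return out

for line in render(rows):
    print(line)
```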