
How to speed up the Group By Clause for a large 3GB database?

rahulephp

I am using a GROUP BY clause on a large table with 148 columns and 5 million rows, approximately 3 GB in size.
We need to apply the GROUP BY clause to approximately 100,000 rows at a time, without using LIMIT.
We can't use LIMIT because we need all of the entries from a category to be shown in the filters section.

We have a dedicated Linux server with 4 GB RAM, a recent configuration, and 2 processors.

I have tried many different my.cnf configuration settings to optimize MySQL's speed, but nothing works.

Here is the query that I am using to fetch the data:


SELECT e.product_id,
       e.name,
       e.description,
       e.manufacturer,
       e.imageurl,
       e.warranty,
       e.colour,
       e.collection,
       e.saleprice,
       e.price,
       e.ages,
       e.size,
       e.size_h,
       e.size_w,
       e.size_d,
       e.size_unit,
       e.wifi_ready,
       e.bundled_deals_packages,
       e.service_provider,
       e.how_many_seats,
       e.characters,
       e.publishercategory,
       e.clean_modelno

       MAX(price) as max_price,
       MIN(price) as min_price,
       count(distinct(advertiserid)) as total
FROM elec_products as e

WHERE status = 1
AND (subcategory2 = 3115)
GROUP BY clean_modelno, publishercategory
ORDER BY total DESC

I have indexes on the following columns:
  • product_id PRIMARY KEY
  • Group_by(clean_modelno, publishercategory) BTREE
  • subcategory1 BTREE
  • subcategory2 BTREE
  • subcategory3 BTREE
  • subcategory4 BTREE
  • subcategory5 BTREE
  • status BTREE


The table type is MyISAM.
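Note: MySQL will typically pick only one of those single-column indexes for this query, so the separate indexes on status and subcategory2 cannot be combined. A single composite index whose leading columns match the equality filters and whose trailing columns match the GROUP BY can let the engine read the rows already in group order. A sketch (the index name is hypothetical; the column order should be verified with EXPLAIN):

-- Hypothetical composite index: equality-filter columns first
-- (status, subcategory2), then the GROUP BY columns, so MySQL can scan
-- matching rows already in group order and avoid a temporary table.
ALTER TABLE elec_products
    ADD INDEX idx_status_subcat2_group
        (status, subcategory2, clean_modelno, publishercategory);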


The major my.cnf settings:
  • skip-locking
  • key_buffer_size = 512M
  • max_allowed_packet = 128M
  • table_open_cache = 512
  • sort_buffer_size = 128M
  • read_buffer_size = 128M
  • read_rnd_buffer_size = 128M
  • myisam_sort_buffer_size = 128M
  • thread_cache_size = 8
  • query_cache_size = 128M
  • max_heap_table_size=256M
  • tmp_table_size=256M
  • join_buffer_size = 2M
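
Note: before tuning buffers further, it may be more telling to check the query plan. A minimal diagnostic sketch (aggregate columns only, for brevity); "Using temporary; Using filesort" in the Extra column would mean the GROUP BY is not being served from an index:

-- Look at which key is chosen; "Using temporary; Using filesort" in the
-- Extra column means the GROUP BY is not being served from an index.
EXPLAIN
SELECT clean_modelno, publishercategory,
       MAX(price) AS max_price,
       MIN(price) AS min_price,
       COUNT(DISTINCT advertiserid) AS total
FROM elec_products
WHERE status = 1
  AND subcategory2 = 3115
GROUP BY clean_modelno, publishercategory;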

I can see lots of other similar price-comparison websites that have excellent page-load speed.
Please help me out with this, and let me know if I am missing anything.
Dec 31 '10 #1
code green
1,726 Expert 1GB
It may not help that this is not legal SQL, but I am not sure how much the performance is affected.
You are using aggregate functions in your SELECT for three fields,
COUNT(), MAX(), MIN(),
and a GROUP BY on clean_modelno, publishercategory.
So how does MySQL 'know' which entry of the following to return?
e.product_id,
e.name,
e.description,
e.manufacturer,
e.imageurl,
e.warranty,
e.colour,
e.collection,
e.saleprice,
e.price,
e.ages,
e.size,
e.size_h,
e.size_w,
e.size_d,
e.size_unit,
e.wifi_ready,
e.bundled_deals_packages,
e.service_provider,
e.how_many_seats,
e.characters,
e.publishercategory,
e.clean_modelno
Well, it runs an algorithm in its engine to sort this out.
The point I am trying to make is that this query would throw an error in SQL Server and other databases, but in MySQL we can get away with it.
Does this slow things down? I don't know, but it may be worth thinking about.
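
For illustration, one way to make the grouping deterministic (and portable to stricter databases) is to aggregate in a derived table and join back on one representative row per group, for example the smallest product_id. A sketch, assuming product_id uniquely identifies a row, and listing only a few of the descriptive columns:

-- Aggregate per (clean_modelno, publishercategory) in a derived table,
-- picking MIN(product_id) as the representative row, then join back to
-- fetch that row's descriptive columns deterministically.
SELECT e.product_id, e.name, e.manufacturer,
       e.publishercategory, e.clean_modelno,
       g.max_price, g.min_price, g.total
FROM (
    SELECT MIN(product_id)              AS rep_id,
           MAX(price)                   AS max_price,
           MIN(price)                   AS min_price,
           COUNT(DISTINCT advertiserid) AS total
    FROM elec_products
    WHERE status = 1
      AND subcategory2 = 3115
    GROUP BY clean_modelno, publishercategory
) AS g
JOIN elec_products AS e ON e.product_id = g.rep_id
ORDER BY g.total DESC;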
Dec 31 '10 #2
Thank you for taking part in this.

Sorry, there was a typo. You need to put a comma after e.clean_modelno (here):

e.publishercategory,
e.clean_modelno,

MAX(price) as max_price,
MIN(price) as min_price,
I am getting the expected results that I want, but it takes a huge amount of time to group 100,000 products by clean_modelno and publishercategory.


You know, code green, you can compare this website with other similar websites like Pricegrabber[dot]com or kelkoo[dot]co[dot]uk.
We need to update the database almost every day, so it is not possible to create another table for each category, especially when we have 1,500 categories.
It takes approximately 2-3 minutes to fetch data from the database when a particular category contains 100,000 rows, which would be considered a huge page-load time; visitors will close the browser and won't come back.
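
Note: since the data changes roughly once a day, a single pre-aggregated summary table (one table in total, not one per category), rebuilt after each import, could take the GROUP BY off the page-load path entirely. A rough sketch; the table name and column types here are assumptions:

-- Hypothetical summary table, keyed by category and group; rebuilt once
-- per daily import so that page requests read precomputed rows only.
CREATE TABLE product_group_summary (
    subcategory2      INT          NOT NULL,
    clean_modelno     VARCHAR(64)  NOT NULL,
    publishercategory VARCHAR(64)  NOT NULL,
    max_price         DECIMAL(10,2),
    min_price         DECIMAL(10,2),
    total             INT,
    PRIMARY KEY (subcategory2, clean_modelno, publishercategory)
);

-- Refresh after each daily update (TRUNCATE first on later refreshes).
INSERT INTO product_group_summary
SELECT subcategory2, clean_modelno, publishercategory,
       MAX(price), MIN(price), COUNT(DISTINCT advertiserid)
FROM elec_products
WHERE status = 1
GROUP BY subcategory2, clean_modelno, publishercategory;

A page for subcategory2 = 3115 would then become a simple indexed lookup against the summary table instead of grouping 100,000 rows per request.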

Here is the index structure: [screenshot not preserved]
Any recommendations will be appreciated.
Dec 31 '10 #3


Similar topics

0
by: Daniel Rossi | last post by:
Hi there, I am trying to work out the most efficient way to list multiple categories of entries; the database is quite large, about 200 meg. I would like to know if using join tables is...
2
by: Brian Huether | last post by:
I saved my website database to my home computer. I am setting up a local version of the site, and need to import the database. I have mysql and everything set up. But when I try to run the sql...
0
by: Kathy | last post by:
I have written queries against the MSysObjects table in the past to retrieve a list of the queries within a database. In Access 2000, I want to expand that capability to query a list of query...
1
by: bmt | last post by:
I intend to set up a large database with PostgreSQL: about 1.3 TB, very basic queries (but 1500 per second at peak charge), and most probably only a single one index. The overall application, which...
3
by: artillero | last post by:
I'm transferring my database to another host and it's pretty large, 100mb. Can someone please tell me how I can transfer my large database to phpmyadmin. Thank you
2
by: Steve Richter | last post by:
what is the best way to use DataGridView to view data from large database tables? Where the sql select statement might return millions of rows? string connectionString =...
2
by: MGM | last post by:
I have a bunch of fairly large databases, each going anywhere from 1,000 rows to 50,000 rows. These databases, however, need some editing. They have extra columns that need to be removed and...
13
by: justinasz | last post by:
Hi, I am currently using MS Access + VBA to build reporting applications and also do adhoc reports in the company. However, from 2008 we are planning to change the way the source database is...
1
by: laziers | last post by:
Hi, What kind of data access you will use for the project with very large database : 1. NHibernate 2. Linq 3. Sql queries + stored procedures 4. DataSets
1
by: kompallikedar | last post by:
Please tell me which is the best out of the following. I have got an application of php and postgresql and the same has to be implemented as a central server connected to various remote locations ...
