
Would MySQL handle tables with millions of records?

Hi,

I already found on the MySQL web site that some users have had good
results with tables of a few million records.

But what I want to do is store a few hundred million records in a
table: 190 million to start, and maybe a lot more later. Does anyone
already use MySQL with such a quantity of data?

I was also wondering how a simple SELECT query like the one below
would perform on such a table.

SELECT * FROM MyTable WHERE MyPrimaryKeyField = Value;

I think that with indexes this should be fast despite the fact that
the table is huge.
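For illustration (the table layout and the example key value below are
just placeholders, not my real schema), I imagine the plan for such a
lookup can be checked with EXPLAIN, to confirm the primary key is
actually used rather than a full table scan:

CREATE TABLE MyTable (
    MyPrimaryKeyField INT UNSIGNED NOT NULL,
    ABooleanField     TINYINT(1)   NOT NULL,
    -- ... other columns ...
    PRIMARY KEY (MyPrimaryKeyField)
);

-- For a primary key equality lookup, the "type" column of the plan
-- should show "const"; "ALL" would mean a scan of all 190 million rows.
EXPLAIN SELECT * FROM MyTable WHERE MyPrimaryKeyField = 12345;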

Another kind of query I'd like to run is:

SELECT Count(*) FROM MyTable WHERE ABooleanField = True;

I expect this one to take several minutes, maybe hours. Is my guess
right, or could it be a lot faster/slower?
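My guess (untested) is that without an index on ABooleanField this
COUNT(*) has no choice but to scan the whole table, while an index on
that column would at least let MySQL answer the count from the much
smaller index instead of reading full rows. Something like this (the
index name is just made up):

-- With this index, EXPLAIN for the COUNT(*) query should report
-- "Using index", i.e. the count is answered from the index alone.
CREATE INDEX idx_abooleanfield ON MyTable (ABooleanField);

EXPLAIN SELECT Count(*) FROM MyTable WHERE ABooleanField = True;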

Thanks in advance for your help,

Mathieu Pagé
Jul 20 '05 #1
Mathieu Pagé wrote:
Hi,

I already found on the MySQL web site that some users have had good
results with tables of a few million records.

But what I want to do is store a few hundred million records in a
table: 190 million to start, and maybe a lot more later. Does anyone
already use MySQL with such a quantity of data?

I was also wondering how a simple SELECT query like the one below
would perform on such a table.

SELECT * FROM MyTable WHERE MyPrimaryKeyField = Value;

I think that with indexes this should be fast despite the fact that the
table is huge.

Another kind of query I'd like to run is:

SELECT Count(*) FROM MyTable WHERE ABooleanField = True;

I expect this one to take several minutes, maybe hours. Is my guess
right, or could it be a lot faster/slower?

Thanks in advance for your help,

Mathieu Pagé


Depends on the hardware. But if the table is indexed and your fields
are blob or binary, you might achieve surprisingly decent performance.
For example, a binary-tree-based index on 190 million rows would only
involve about 28 comparisons, since 2^28 = 268,435,456.
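That estimate is easy to double-check; this is just arithmetic, nothing
specific to the table:

-- Height of a perfectly balanced binary tree over 190 million keys.
-- MySQL's real indexes are B-trees with a large fan-out, so the number
-- of index pages actually touched is typically only a few levels.
SELECT CEIL(LOG2(190000000));  -- returns 28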

Your real hit would come in rebuilding the index if the data is subject
to a lot of change.
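One common way to soften that hit (a sketch only, assuming a MyISAM
table, and the file path is just an example; DISABLE KEYS only defers
non-unique indexes) is to bulk-load in large batches with index
maintenance turned off and rebuild the indexes once at the end:

-- Stop maintaining non-unique indexes during the bulk load (MyISAM).
ALTER TABLE MyTable DISABLE KEYS;

LOAD DATA INFILE '/tmp/new_rows.csv'
INTO TABLE MyTable
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';

-- Rebuild the deferred indexes in one pass, which is usually much
-- faster than updating them row by row.
ALTER TABLE MyTable ENABLE KEYS;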
Jul 20 '05 #2
Paul Bramscher wrote:

Depends on the hardware. But if the table is indexed and your fields
are blob or binary, you might achieve surprisingly decent performance.


I meant if your fields are NOT blob or binary. Slip of the key there.
For example, a binary-tree-based index on 190 million rows would only
involve about 28 comparisons, since 2^28 = 268,435,456.

Your real hit would come in rebuilding the index if the data is subject
to a lot of change.

Jul 20 '05 #3

This thread has been closed and replies have been disabled. Please start a new discussion.

