Very Slow Select Query

Hello,

I have a table with more than 1,000,000 rows.

mysql> describe views_date;
+----------+-------------+------+-----+---------+----------------+
| Field    | Type        | Null | Key | Default | Extra          |
+----------+-------------+------+-----+---------+----------------+
| id       | bigint(20)  | NO   | PRI | NULL    | auto_increment |
| id_count | bigint(20)  | YES  |     | NULL    |                |
| date     | datetime    | YES  |     | NULL    |                |
| type     | int(3)      | YES  |     | NULL    |                |
| ip       | varchar(15) | YES  |     | NULL    |                |
| user     | bigint(20)  | YES  |     | NULL    |                |
+----------+-------------+------+-----+---------+----------------+
6 rows in set (0.00 sec)

This table is used to log which video each visitor has viewed, along
with their IP address.

Every time a visitor views a video, a new row is inserted.

Then I use the following query to check whether the visitor has already
viewed the video:

mysql> SELECT count(*) from views_date WHERE id_count = $videoid AND ip = '$ip';
+----------+
| count(*) |
+----------+
|        1 |
+----------+
1 row in set (6.19 sec)

mysql> explain SELECT count(*) from views_date WHERE id_count = 8731 AND ip = '121.97.245.124';
+----+-------------+------------+------+---------------+------+---------+------+---------+-------------+
| id | select_type | table      | type | possible_keys | key  | key_len | ref  | rows    | Extra       |
+----+-------------+------------+------+---------------+------+---------+------+---------+-------------+
|  1 | SIMPLE      | views_date | ALL  | NULL          | NULL | NULL    | NULL | 1089103 | Using where |
+----+-------------+------------+------+---------------+------+---------+------+---------+-------------+
1 row in set (0.04 sec)

Can anyone give me tips on how to optimize the table to run faster
queries?

Thanks!

Antoni

Mar 27 '07 #1
On Mar 27, 4:15 pm, "Antoni" <antonimassom...@gmail.com> wrote:
> I have a table with more than 1,000,000 rows.
> [...]
> Can anyone give me tips on how to optimize the table to run faster
> queries?
Try putting an index on ip and id_count
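
For example (a sketch; the index name is arbitrary), a single composite index covering both columns in the WHERE clause:

-- idx_idcount_ip is just an example name
ALTER TABLE views_date ADD INDEX idx_idcount_ip (id_count, ip);

Once it's built, re-running the EXPLAIN should show the index under key with a far smaller rows estimate, instead of type ALL over ~1,089,103 rows.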

Mar 27 '07 #2
How do I put an index on them? Will I lose any data?

strawberry wrote:
> [...]
> Try putting an index on ip and id_count
Mar 27 '07 #3
Antoni wrote:
> How do I put an index on them? Will I lose any data?

> strawberry wrote:
>> [...]
>> Try putting an index on ip and id_count

See the docs for the CREATE INDEX statement. And no, you will not lose any data.
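
For example, the index suggested above could be added with a CREATE INDEX statement along these lines (the name is arbitrary):

-- equivalent to ALTER TABLE views_date ADD INDEX ...
CREATE INDEX idx_idcount_ip ON views_date (id_count, ip);

Building it may take a while on a million-row table, but the existing rows are left untouched.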

And BTW, you cannot be sure that the person using that IP address has not
already viewed that particular movie. An unfortunate side-effect of DHCP,
since addresses get reassigned over time.
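
Since the table already has a user column, a check keyed on the user rather than the IP would sidestep that problem. A sketch, assuming $userid is filled in the same way as $videoid above:

-- $userid is hypothetical; an index on (id_count, user) would serve this query
SELECT count(*) FROM views_date WHERE id_count = $videoid AND user = $userid;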

--
Michael Austin
Database Consultant
Domain Registration and Linux/Windows Web Hosting Reseller
http://www.spacelots.com
Mar 31 '07 #4
