
Database Optimization

Hello group,

I have a rather general but interesting question related to PHP, and I hope
this is the appropriate place to post it.

I'm looking for a way to dramatically improve the performance of my PHP
application. The application is getting slow as it takes on more load. It
performs a very high number of queries against a database, and I believe this
is consuming most of the resources.

I'm trying to figure out how I could cache the content of the database and
avoid performing so many queries.

What I would like to do is cache the entire content of the database in memory,
so that I can access it directly from my PHP application without querying the
database, saving precious resources.

The database is quite small, 15-20 MB, and its size is constant (it does not
grow over time). 92% of the queries are SELECTs; only 8 percent are UPDATE,
DELETE, and INSERT.

So, my question is: is it possible and advisable to place 20 MB of data in
shared memory in order to avoid queries to the database? (All updates, deletes,
and inserts would be performed both in the database and in memory.)

Or would I be better off placing a copy of the database on a RAM drive?

Other info:
I have a single server that runs both the PHP application and the MySQL
database. It has 1 GB of RAM. The database receives 250 queries/sec.

Thank you in advance for your kind help.

Sep 15 '05 #1
no********@gmail.com wrote:
> [original question quoted in full; snipped]

Hi,

I am unsure whether placing the database in memory will seriously increase its
performance: you'll have to test that.

If your database spends its time scanning tables, joining, converting, and so
on, the time trimmed off could be disappointing. If the database is non-stop
reading files from disk, it could help. Hard to say. Which database do you use?

But before you go all that way, did you try some more 'old-fashioned'
optimizations?

Some ideas:
- Try to figure out which tables are scanned a lot, and place indexes on the
relevant column(s). (If you use PostgreSQL, the EXPLAIN command can help.)

- Do your DB and your code use prepared statements? They can help a lot,
especially when the queries are complex. (A small sketch follows after this
list.)

- If 50 of the 250 queries/sec are the same selects that don't change, you
could try some smart caching. For example, if a popular query fetches the
latest 20 this-or-that, with all kinds of joins on other tables, you could
schedule that query every 15 minutes and save the results in a file, then
include the file on the pages where you need it. Alternatively, you could
regenerate the file whenever you know a relevant table has changed. (Which
approach makes the most sense is up to you to decide, of course; see the
second sketch after this list.)
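To make the prepared-statement suggestion concrete, here is a minimal sketch
using the mysqli extension (assuming PHP 5); the connection details and the
users table are made up for the example:

<?php
// Minimal prepared-statement sketch with mysqli (PHP 5).
// Connection details and table/column names are hypothetical.
$db = new mysqli('localhost', 'user', 'password', 'mydb');

$stmt = $db->prepare('SELECT name, email FROM users WHERE id = ?');
$userId = 42;
$stmt->bind_param('i', $userId);   // 'i' = bind as integer
$stmt->execute();
$stmt->bind_result($name, $email);
$stmt->fetch();
$stmt->close();

echo "$name <$email>\n";
?>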
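And a rough sketch of the cache-to-file idea, again assuming PHP 5 and an
already open mysql_* connection; the file name, TTL and query are placeholders:

<?php
// Sketch: cache the result of an expensive query in a file for 15 minutes.
// File name, TTL and the query itself are placeholders.
$cacheFile = '/tmp/latest_items.cache';
$ttl = 15 * 60;

if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
    // Cache is still fresh: reuse the stored result.
    $rows = unserialize(file_get_contents($cacheFile));
} else {
    // Cache is stale or missing: run the query and store the result.
    $res  = mysql_query('SELECT id, title FROM items ORDER BY created DESC LIMIT 20');
    $rows = array();
    while ($row = mysql_fetch_assoc($res)) {
        $rows[] = $row;
    }
    file_put_contents($cacheFile, serialize($rows));
}

// $rows now holds the latest items, whether they came from the file or the DB.
?>

The same file could instead be rewritten whenever a relevant table changes,
as mentioned above.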

This kind of optimization can make a huge difference.

In general: try to figure out which queries are executed a lot, and start
there with prepared statements, indexing, and caching to a file.

Hope this helps.

Good luck!

Regards,
Erwin Moller
Sep 15 '05 #2
Thanks for your reply.

I use a MySQL database that is properly optimized. All the indexes are set
correctly and used.

Most of the requests are simple queries using a unique ID and returning only a
single result. There are almost no joins, and no complex ones.
> - If 50 of the 250 queries/sec are the same selects that don't change, you
> could try some smart caching.

Unfortunately, most of the queries are different.

I can give an example:

A user table with around 4,000 users. It is possible to consult other users'
information, so a lot of queries are made on single records.

I tested placing a few records in memory with the shm functions, and it was,
of course, blazingly fast.

But I'm wondering how the system behaves with a higher volume of data, and
what would be the best way to do this.
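For reference, a minimal test along those lines with PHP's SysV shared memory
functions might look roughly like the sketch below; the ftok() path, segment
size and variable key are arbitrary choices, and real code would need error
handling and locking:

<?php
// Minimal sketch: store one record in SysV shared memory and read it back.
// Requires the sysvshm extension; key, segment size and record are arbitrary.
$key   = ftok(__FILE__, 'a');            // derive an IPC key from this file
$shmId = shm_attach($key, 1024 * 1024);  // attach a 1 MB segment

$record = array('id' => 123, 'name' => 'Alice', 'email' => 'alice@example.com');

shm_put_var($shmId, 123, $record);   // store under the integer key 123
$copy = shm_get_var($shmId, 123);    // read it back (any PHP process can)

print_r($copy);

shm_detach($shmId);
?>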

Thanks

Sep 15 '05 #3
Testing it could be easy. Put a link on the page that people currently use,
asking them to test the effectiveness of the new code. In my experience, almost
80% of users will go in and try the new version, maybe just out of curiosity.
So my testing of new code is normally completed within a few days, with no
impact on operations.

Sep 15 '05 #4
Thanks,

But I would rather stick to profiling. It's much more precise.

Perhaps I should restate the exact question: is it possible and advisable to
load a large amount of data (20 MB) into shared memory? If so, what is a good
way to implement it?

Thanks again for your help

Sep 15 '05 #5
nospamm...@gmail.com wrote:
> Hello group,
>
> The database is quite small, 15-20 MB, and its size is constant (it
> does not grow over time). 92% of the queries are SELECTs; only 8
> percent are UPDATE, DELETE, and INSERT.

8% is still a significant amount of writes. I wonder if your database is
running into locking problems. I'm not an expert in MySQL, but I've heard that
its locking mechanism isn't that great. If a table is constantly being
modified, then queries on it could often be stalled.

Sep 15 '05 #6
I do not use any manual locking, as I do not need atomic transactions, so I
think locking shouldn't be an issue.

I believe the source of the problem is the sheer number of queries generated by
the mass of users, and that's why I'm looking at shared-memory caching.

Sep 15 '05 #7
Maybe my article at http://www.w-p.dds.nl/article/wtrframe.htm describes
something useful. The section on lazy collections, in particular, may be
interesting.

Best regards.

no********@gmail.com wrote:
> [original question quoted in full; snipped]

Sep 15 '05 #8
That's right, it is interesting. This article touches on the kind of structure
I would need for my caching solution.

As you mentioned, I want the class to control the data, whether it comes from
the database or from the cache. But I also want it to control all the updates,
inserts, and deletes, as my data will never be modified from outside the
application. The cache will simply be updated at the same time as the database.

What I would like to discuss, however, is technically how to index, retrieve,
update, and store a large amount of data in shared memory.
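One possible way to organize it (a sketch only, not something I have run at
this scale) is to key each record in the SysV segment by its numeric primary
key and guard the write path with a semaphore. The segment size, keys, and the
users table below are assumptions for illustration:

<?php
// Sketch: per-record shared-memory cache keyed by numeric ID, with a
// write-through helper guarded by a SysV semaphore.
// Requires sysvshm/sysvsem; segment size and table layout are hypothetical.
$ipcKey = ftok(__FILE__, 'u');
$shm    = shm_attach($ipcKey, 32 * 1024 * 1024);  // ~32 MB segment
$sem    = sem_get($ipcKey, 1);

function cache_get_user($shm, $id)
{
    // Returns the cached record, or FALSE if it is not in the segment yet.
    return @shm_get_var($shm, (int) $id);
}

function cache_put_user($shm, $sem, $id, $record)
{
    // Update the in-memory copy under a lock; the caller issues the
    // matching INSERT/UPDATE/DELETE against MySQL itself (write-through).
    sem_acquire($sem);
    shm_put_var($shm, (int) $id, $record);
    sem_release($sem);
}

// Read path: try the cache first, fall back to the database on a miss.
$id   = 123;
$user = cache_get_user($shm, $id);
if ($user === false) {
    $res  = mysql_query('SELECT * FROM users WHERE id = ' . (int) $id);
    $user = mysql_fetch_assoc($res);
    cache_put_user($shm, $sem, $id, $user);
}
?>

One limitation of this layout is that shm_put_var() keys are plain integers,
so records from different tables would need disjoint key ranges or separate
segments.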

Thanks for your help

Sep 15 '05 #9
NC
nospamm...@gmail.com wrote:
> I'm looking for a way to dramatically improve the performance of my PHP
> application. The application is getting slow as it takes on more load.
> It performs a very high number of queries against a database, and I
> believe this is consuming most of the resources.
> [...]
> The database is quite small, 15-20 MB, and its size is constant (it
> does not grow over time). 92% of the queries are SELECTs; only 8
> percent are UPDATE, DELETE, and INSERT.
It sounds like you could improve your performance by using query caching
and/or better indexing... Read "High Performance MySQL" by Jeremy Zawodny; it
should give you some ideas...
> So, my question is: is it possible and advisable to place 20 MB of data in
> shared memory in order to avoid queries to the database? (All updates,
> deletes, and inserts would be performed both in the database and in memory.)

Yes, it is possible. MySQL supports HEAP tables that are stored in memory. But
you still need to figure out a way to save those tables to the hard drive,
because HEAP tables disappear when the MySQL server stops or reboots.
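As a rough illustration of that idea (the table and column names are
placeholders, an open mysql_* connection is assumed, and on older MySQL
versions the syntax is TYPE=HEAP rather than ENGINE=MEMORY):

<?php
// Sketch: build an in-memory (HEAP/MEMORY) copy of a table and read from it.
// Table/column names are placeholders; an open mysql_* connection is assumed.
// Note: MEMORY tables do not support TEXT/BLOB columns.

// Create the in-memory copy (use TYPE=HEAP on older MySQL versions).
mysql_query('CREATE TABLE users_mem ENGINE=MEMORY SELECT * FROM users');

// Reads go against the in-memory copy...
$res = mysql_query('SELECT name, email FROM users_mem WHERE id = 123');

// ...while writes must hit both copies, because the MEMORY table is lost
// whenever the server restarts.
mysql_query("UPDATE users     SET email = 'new@example.com' WHERE id = 123");
mysql_query("UPDATE users_mem SET email = 'new@example.com' WHERE id = 123");
?>

Keep in mind that CREATE TABLE ... SELECT does not copy indexes, so any
indexes would have to be added to users_mem separately.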
> Or would I be better off placing a copy of the database on a RAM drive?

Again, you can do that, but you still need to make sure your database is
synchronized to a hard drive somewhere...
> Other info:
> I have a single server that runs both the PHP application and the MySQL
> database. It has 1 GB of RAM. The database receives 250 queries/sec.

That looks like a very manageable load... How many concurrent connections are
you handling?

Cheers,
NC

Sep 15 '05 #10

This thread has been closed and replies have been disabled. Please start a new discussion.
