mysqldump locking issue

I want to do a daily dump of my database. The dump itself works fine, but the problem is that while it is running I can't access my website, presumably because mysqldump locks the tables.

Is there a way to dump and still allow queries to run at the same time, without them being put on hold?

Thanks.

Aug 10 '06 #1
jmichel wrote:
I want to do a daily dump of my database. The dump itself works fine, but the problem is that while it is running I can't access my website, presumably because mysqldump locks the tables.

Is there a way to dump and still allow queries to run at the same time, without them being put on hold?

Thanks.
The short answer is no. Your best bet would be to set up replication and then back up the slave. That way, you can lock the tables on the slave while you do the backup without affecting your website visitors.
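
A rough sketch of what that could look like on the slave (the user, paths, and database name here are just placeholders, adjust them for your setup):

# pause the slave's SQL thread so the data doesn't change mid-dump
mysql -u root -p -e "STOP SLAVE SQL_THREAD;"

# dump from the slave; the master (and your website) is never locked
mysqldump -u root -p --opt mydatabase > /backups/mydatabase_$(date +%Y%m%d).sql

# resume replication; the slave catches up from the relay log
mysql -u root -p -e "START SLAVE SQL_THREAD;"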

-Steven
Aug 10 '06 #2

Steven Musumeche wrote:
The short answer is no. Your best bet would be to set up replication and then back up the slave. That way, you can lock the tables on the slave while you do the backup without affecting your website visitors.

-Steven
What if I did this:

cp -Rp /var/lib/mysql/mydatabase /var/lib/mydatabase_backup

and then ran mysqldump against the mydatabase_backup copy?

I tested it and it seems to work, but do you see any problems with doing that?

All my tables are MyISAM.
Aug 11 '06 #3
The only problem with that method is that you cannot be sure the data files aren't being modified while you are copying them. If they are, you could end up with a corrupt or inconsistent backup. This is why MyISAM tables have to be locked before you copy the database files. You can use FLUSH TABLES WITH READ LOCK before the copy and UNLOCK TABLES afterwards to accomplish this.
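
Roughly, the sequence looks like this, using the paths from your post (the client session that takes the lock has to stay open until the copy is done, because the lock is released as soon as that connection closes):

# terminal 1: open a mysql session, flush everything to disk, block writes,
# and leave the session connected
mysql> FLUSH TABLES WITH READ LOCK;

# terminal 2: copy the now-quiescent MyISAM files
cp -Rp /var/lib/mysql/mydatabase /var/lib/mydatabase_backup

# terminal 1 again: release the lock
mysql> UNLOCK TABLES;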

-Steven

jmichel wrote:
What if I did this:

cp -Rp /var/lib/mysql/mydatabase /var/lib/mydatabase_backup

and then ran mysqldump against the mydatabase_backup copy?

I tested it and it seems to work, but do you see any problems with doing that?

All my tables are MyISAM.
Aug 11 '06 #4
jmichel wrote:
cp -Rp /var/lib/mysql/mydatabase /var/lib/mydatabase_backup
This is safe to do only if the MySQL Server is shut down.

Filesystem-level commands like cp don't coordinate with MySQL's internal
locking and caching. So if there are data rows that MySQL Server hasn't
yet written to the files, you risk getting an incomplete copy of the
files. You could even get an inconsistent state of the database (e.g. a
row in a child table referencing a row in a parent table that hasn't
been written to disk yet).
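
If you really want to copy the raw files, the safe variant is a cold backup, something like this (the init script name and data directory vary between systems, so treat these paths as placeholders):

# stop the server so no files are open or half-written
/etc/init.d/mysql stop

# copy the database directory while nothing can touch it
cp -Rp /var/lib/mysql/mydatabase /var/lib/mydatabase_backup

# bring the server back up
/etc/init.d/mysql start

Of course, that means your site is down for the duration of the copy, which is exactly what the replication suggestion above avoids.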

Regards,
Bill K.
Aug 11 '06 #5

"jmichel" <jm********@infogt2000.comwrote in message
news:11**********************@b28g2000cwb.googlegr oups.com...
>
Steven Musumeche wrote:
>The short answer is no. Your best bet would be to setup replication and
then backup the slave. That way, you can lock the tables on the slave
while you do the backup and not affect your website visitors.

-Steven

What if I did this:

cp -Rp /var/lib/mysql/mydatabase /var/lib/mydatabase_backup
and then do the mysqldump on the mydatabase_backup database

I tested it, it seems to work, but do you see any problems with doing
that?

all my tables are MyISAM
4
Aug 14 '06 #6
