Bytes | Developer Community

database backup - how to?

How do I synchronize MySQL table data on my home PC with the latest data
from a remote server?

My home PC is on a very slow internet connection, so implementing
replication will cause long read locks on the remote server, which is
not desirable.

What are the tools for manual sync of tables?

I'm running FreeBSD on my home PC.

Mike

Jul 23 '05 #1
siliconmike wrote:
What are the tools for manual sync of tables ?


If the size of the database is not too big:

Use mysqldump to create a dump of the whole database. Use gzip or a
similar tool to make it smaller for the transfer. On your computer,
unzip it and read it in using the mysql command-line tool.

You can use mysqldump to dump only given tables, in case you have a very
large static table and a few small dynamic tables.
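A sketch of that workflow; the database and table names (mydb, small_table1, small_table2) are hypothetical stand-ins for your own:

```shell
# On the server: dump the whole database and compress it in one pipeline.
mysqldump -u user -p mydb | gzip > mydb.sql.gz

# Or dump only the small dynamic tables, skipping the big static one:
mysqldump -u user -p mydb small_table1 small_table2 | gzip > partial.sql.gz

# On the home PC: decompress and feed the dump to the mysql client.
gunzip < mydb.sql.gz | mysql -u user -p mydb
```

Piping straight into gzip means the uncompressed dump never has to fit on disk.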
Jul 23 '05 #2


Aggro wrote:
siliconmike wrote:
What are the tools for manual sync of tables ?


If the size of database is not too big:

Use mysqldump to create dump of the whole database.


Oh O! The tables are Biiiiiiiiiiiiiiiigggggggg!!

Jul 23 '05 #3
siliconmike wrote:
How do I synchronize MySQL table data of my home PC with latest data
from a remote server ?

My home PC is on a very slow internet connection, so implementing
replication will cause long time read lock on remote server, which is
not desirable.

What are the tools for manual sync of tables ?

I'm running FreeBSD on my home PC.

Mike


Mike,

If you can modify the programs on the server, have them write the
SQL statements that insert into or update your server database to a
flat file. Download the flat file and run it against your home
PC database.

As long as every SQL statement is terminated with a semi-colon
you can execute the contents of your flat file using:

mysql -u user_name database_name < flat_file_name

HTH

Jerry
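A minimal sketch of that setup (the file name changes.sql and the statements themselves are made up for illustration):

```shell
# On the server, the application appends every write it performs to a
# flat file, one complete statement per line, semicolon-terminated:
echo "INSERT INTO posts (id, body) VALUES (42, 'hello');" >> changes.sql
echo "UPDATE posts SET body = 'edited' WHERE id = 42;" >> changes.sql

# At home, download the file (e.g. scp user@server:changes.sql .) and
# replay it against the local database:
mysql -u user_name database_name < changes.sql
```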

Jul 23 '05 #4
<si*********@yahoo.com> wrote:
Use mysqldump to create dump of the whole database.


Oh O! The tables are Biiiiiiiiiiiiiiiigggggggg!!


even if you compress them with bzip2/gzip? (compression generally works
quite well on dumps).
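As a rough illustration (not a real dump, just repetitive INSERT-like lines generated locally), dump files compress dramatically because their text is so repetitive:

```shell
# Generate 1000 near-identical INSERT lines as a stand-in for a dump.
printf 'INSERT INTO t VALUES (1, "row data");\n%.0s' $(seq 1 1000) > dump.sql
gzip < dump.sql > dump.sql.gz
echo "original:   $(wc -c < dump.sql) bytes"
echo "compressed: $(wc -c < dump.sql.gz) bytes"
rm -f dump.sql dump.sql.gz
```

A real dump won't compress quite this well, since the row data varies, but large reductions are typical.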

Olivier

--
Olivier M. - sp**********@8304.ch - PGP: 0E84D2EA - Switzerland
Jul 23 '05 #5
Sounds good. But I would prefer another approach - putting a column
called NEEDS_BACK_UP in each remote table to keep track of whether a
record needs to be backed up.

With what you advise, a failed SQL statement in the flat file might
leave the rest of the SQL statements meaningless.

Although, I guess my scheme needs considerable changes in my code.

Or there is another scheme - an updated-timestamp column in each remote
table...
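The timestamp scheme could look something like this; the table and column names (posts, updated_at) are hypothetical, and mysqldump's --where option does the incremental selection:

```shell
# One-time change on the server: a column MySQL maintains automatically.
mysql -u user -p mydb -e "ALTER TABLE posts
  ADD COLUMN updated_at TIMESTAMP
  DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP;"

# Each sync: dump only rows touched since the last run, then ship the
# much smaller compressed file over the slow link.
mysqldump -u user -p mydb posts \
  --where="updated_at > '2005-07-22 00:00:00'" | gzip > incremental.sql.gz
```

Two caveats: older MySQL versions allow only one automatically-updated TIMESTAMP column per table, and this catches inserts and updates but not deletes.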

Mike

jerry gitomer wrote:
Mike,

If you can modify the programs on the server have them write the
SQL statements that insert or update your server database to a
flat file. Download the flat file and run it against your home
PC database.

As long as every SQL statement is terminated with a semi-colon
you can execute the contents of your flat file using:

mysql -u user_name database_name < flat_file_name

HTH

Jerry


Jul 23 '05 #6
Olivier M. wrote:
<si*********@yahoo.com> wrote:
Use mysqldump to create dump of the whole database.


Oh O! The tables are Biiiiiiiiiiiiiiiigggggggg!!


even if you compress them with bzip2/gzip? (compression generally works
quite well on dumps).


The site will initially have 2 GB tables, which will only grow..

Jul 23 '05 #7
siliconmike wrote:
In what you advise, probably a failed sql statement in a flat file
might leave the rest of the sql statements meaningless.


No, not if you use the --force parameter (if I remember the syntax
correctly), which makes it continue past errors.
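For example, applied to the flat-file replay suggested earlier (names are placeholders):

```shell
# --force (short form -f) reports each error but keeps executing the
# remaining statements instead of aborting at the first failure:
mysql --force -u user_name database_name < flat_file_name
```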
Jul 23 '05 #8


Aggro wrote:
siliconmike wrote:
In what you advise, probably a failed sql statement in a flat file
might leave the rest of the sql statements meaningless.


No, not if you use the --force parameter (if I remember the syntax
correctly), which makes it continue past errors.


What I mean is that there is no point continuing the backup if an SQL
command fails, because some of the later SQL commands might depend on
the result of the one that failed. The backup data then wouldn't be
equivalent to the original.

Jul 23 '05 #9
"" wrote:
How do I synchronize MySQL table data of my home PC with
latest data
from a remote server ?

My home PC is on a very slow internet connection, so
implementing
replication will cause long time read lock on remote server,
which is
not desirable.

What are the tools for manual sync of tables ?

I'm running FreeBSD on my home PC.

Mike


I have followed this thread...

but have a basic question..

Why do you want to 'synchronize' the db with your home pc? Since you
mention a slow connection, you are obviously only using your home pc for
backup. Then there are no stringent requirements for synching. All
you need is simple backups, which have been amply explained.

By the way, on a 2 GB DB, the best approach is mysqlhotcopy, if you
have root access. All the other approaches can be problematic:
mysqldump can fail on big DBs, and tarring the files directly fails
too, since the files may be updated in the middle of the tar.

--
Posted using the http://www.dbforumz.com interface, at author's request
Articles individually checked for conformance to usenet standards
Topic URL: http://www.dbforumz.com/mySQL-databa...ict237213.html
Visit Topic URL to contact author (reg. req'd). Report abuse: http://www.dbforumz.com/eform.php?p=824569
Jul 23 '05 #10
steve wrote:
"" wrote:
> How do I synchronize MySQL table data of my home PC with
> latest data
> from a remote server ?
>
> My home PC is on a very slow internet connection, so
> implementing
> replication will cause long time read lock on remote server,
> which is
> not desirable.
>
> What are the tools for manual sync of tables ?
>
> I'm running FreeBSD on my home PC.
>
> Mike
I have followed this thread...

but have a basic question..

Why do you want to 'synchronize' db with your home pc?


To keep the new data posted by people on the website server, so that in
case the server crashes, I have a copy.

Since you mention a slow connection, you are obviously only using your
home pc for backup. Then there are no stringent requirements for
synching. All you need is simple backups, which have been amply explained.

By the way, on a 2Gig DB, the best approach is mysqlhotcopy, if you
have root access.
Correct me if I'm wrong - mysqlhotcopy acquires read locks, and since I'm
on a slow connection, long-held read locks would prevent other threads
from reading the tables while they are being backed up.. right?

All the other approaches can be problematic: mysqldump can fail on big
DBs, and tarring the files directly fails too, since the files may be
updated in the middle of the tar.


Jul 23 '05 #11
siliconmike wrote:
Correct me- but mysqlhotcopy acquires read locks, and since I'm on a
slow connection, long time read locks would prevent other threads from
reading the tables while it is being backed up.. right?


Can you not create the copy on the server? After the copy is ready, then
transfer it to your home computer?
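A sketch of that, assuming shell access on the server (paths and names are made up); the locks are then held only for the duration of the local dump, not for the slow download:

```shell
# On the server: create the compressed dump locally, at disk speed.
mysqldump -u user -p mydb | gzip > /tmp/mydb.sql.gz

# From home: pull the finished file over the slow link; --partial lets
# an interrupted transfer resume instead of restarting from scratch.
rsync --partial user@server:/tmp/mydb.sql.gz .
```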
Jul 23 '05 #12
"" wrote:
steve wrote:
"" wrote:
> How do I synchronize MySQL table data of my home PC with
> latest data
> from a remote server ?
>
> My home PC is on a very slow internet connection, so
> implementing
> replication will cause long time read lock on remote server, > which is
> not desirable.
>
> What are the tools for manual sync of tables ?
>
> I'm running FreeBSD on my home PC.
>
> Mike


I have followed this thread...

but have a basic question..

Why do you want to 'synchronize' db with your home pc?


To keep the new data posted by people on the website server,
so that in
case the server crashes, I have a copy.

Since you
mention slow connection, you obvisouly only using your home

pc for
back. Then there are no stringent requiremetns for

synch'ing. All
you need is simple backups, which have been amply explained.

By the way, on a 2Gig DB, the best approach is mysqlhotcopy,

if you
have root access.


Correct me- but mysqlhotcopy acquires read locks, and since
I'm on a
slow connection, long time read locks would prevent other
threads from
reading the tables while it is being backed up.. right?

All the other approaches can be problematic:
mysqldump can fail on big db's, and taring the files

directly fails
too, since the files may be updating in middle of tar.


mysqlhotcopy works on the remote server itself (again assuming you
have root access). It takes care of all the locking, so that a live
database can be backed up. If you instead use tar or mysqldump, you
will notice that the db will be brought to its knees.
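For reference, a hypothetical invocation (the database name and paths are placeholders); mysqlhotcopy copies the table files directly, so it must run on the server and only works for table types stored as plain files, such as MyISAM:

```shell
# Run on the server itself; copies mydb's table files to /backup/mydb,
# holding a read lock only while the files are being copied.
mysqlhotcopy mydb /backup/mydb --user=root --password=secret
```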

Backups can be had very cheaply these days, for a few dollars a month.
There is no reason to back up over slow lines to your local PC. Just
back up over a high-speed link to a 3rd-party location. Try GNAX; you
can back up your 2 GB for $1.00 a month!
http://www.gnax.net/backup/index.php

Jul 23 '05 #13
