
Porting a MySQL DB across servers - mysqldump query, as phpMyAdmin export does not complete.

Hi there,

SUMMARY:
1) When I use phpMyAdmin to export a large database, the file created
on my local PC never contains all the tables and data. It seems to
max out at about 10 MB.

2) Will running mysqldump from the command line on a live, in-use DB
(with one table containing 350,000 rows) pose any risks or have any
significant effect on current use?

FULL DETAILS:
I am trying to port a MySQL database from a live environment to a
test one so I can build a test server to try out changes.

Both test and live are hosted on shared hosting packages.

I used phpMyAdmin to export the database, and when the file got to
about 10 MB it just seemed to stop. I then tried exporting the tables
separately, but one table in particular, which has about 350,000
rows, never completed. The text file created on my local PC never has
all the data that is in the original.

Firstly, I do not understand why this happens. Does anyone know why?

As a workaround I looked into mysqldump from the command line. I
tried it on my test server and it worked fine and quickly, but that
server does not have a table with 350,000 rows in it. So if I log
into the live environment, will it have a big impact if I run
mysqldump and create a file which (totalling the table sizes shown in
phpMyAdmin) should be about 23 MB? I do not want to create much
impact on the live data (running a little slow for 5 minutes or so is
acceptable), and I CERTAINLY do not want to risk damaging the data.
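
For reference, the sort of command I had in mind is roughly the
following (the user name and database name here are placeholders,
not my real details):

    mysqldump -u my_user -p my_live_db > my_live_db.sql

From what I have read, mysqldump may lock each table while it dumps
it, which is part of my worry about the live site.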

Any thoughts appreciated.

Kind regards

G
Jul 23 '05 #1
1 Reply


On 24 Jan 2005 03:55:41 -0800, Gr*************@lycos.co.uk (Dave
Crypto) wrote:
> Hi there,
>
> SUMMARY:
> 1) When I use phpMyAdmin to export a large database, the file created
> on my local PC never contains all the tables and data. It seems to
> max out at about 10 MB.
I don't know of a file size limit or total disk space limit in your
situation. Might that be the problem? Or the script execution time?
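If you have any access to the PHP configuration on that shared host
(often you don't), the directives that usually cut a phpMyAdmin
export short look roughly like this; the values are only illustrative:

    ; php.ini (illustrative values)
    max_execution_time = 600   ; the export stops silently once this limit is hit
    memory_limit = 128M        ; can also matter if PHP buffers the dump before sending it

phpMyAdmin also has its own $cfg['ExecTimeLimit'] setting in
config.inc.php; setting it to 0 lifts phpMyAdmin's limit, but not
PHP's.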

> 2) Will running mysqldump from the command line on a live, in-use DB
> (with one table containing 350,000 rows) pose any risks or have any
> significant effect on current use?
Nope, unless locking tables for a short period of time might be a
problem. I do it for tables of that size as well. The mysqldump
routine takes a few seconds, and the tar and zip step takes a few
seconds more. The total time it takes me to back up my database,
including compression, is less than a minute. (I was also concerned
about locking my system and now use a master/slave setup, where I
back up the slave's data.)
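
For what it's worth, my backup is essentially a one-liner along these
lines (the user, database and file names are placeholders):

    mysqldump --opt -u backup_user -p live_db | gzip > live_db_backup.sql.gz

As far as I remember, --opt includes --lock-tables, so the tables are
read-locked only while they are being dumped; for a dump of around
23 MB that should be a matter of seconds. If your big table is
InnoDB, using --single-transaction avoids the locking altogether.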

> FULL DETAILS:
> I am trying to port a MySQL database from a live environment to a
> test one so I can build a test server to try out changes.
>
> Both test and live are hosted on shared hosting packages.
>
> I used phpMyAdmin to export the database, and when the file got to
> about 10 MB it just seemed to stop. I then tried exporting the tables
> separately, but one table in particular, which has about 350,000
> rows, never completed. The text file created on my local PC never has
> all the data that is in the original.
>
> Firstly, I do not understand why this happens. Does anyone know why?
>
> As a workaround I looked into mysqldump from the command line. I
> tried it on my test server and it worked fine and quickly, but that
> server does not have a table with 350,000 rows in it. So if I log
> into the live environment, will it have a big impact if I run
> mysqldump and create a file which (totalling the table sizes shown in
> phpMyAdmin) should be about 23 MB? I do not want to create much
> impact on the live data (running a little slow for 5 minutes or so is
> acceptable), and I CERTAINLY do not want to risk damaging the data.
Is it not possible to run mysqldump from the command line on that
machine?

A workaround, if you can access your data directory, is to lock the
system with a FLUSH TABLES WITH READ LOCK; statement (I believe that
is the syntax, but I am not completely sure) and leave that MySQL
session running. Copy the data directory to your local system, then
kill the MySQL session, which removes the lock.
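
Roughly, the sequence would look like this (the data directory path
is only an example; it differs per host):

    -- session 1: take the read lock and keep this connection open
    FLUSH TABLES WITH READ LOCK;

    -- from a shell, while the lock is held, copy the data directory, e.g.
    --   cp -a /var/lib/mysql/your_db /path/to/backup/

    -- session 1 again, once the copy has finished
    UNLOCK TABLES;

Closing the session also releases the lock, so the explicit UNLOCK
TABLES is optional, but it makes the intent clearer.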
> Any thoughts appreciated.
>
> Kind regards
>
> G


Jonathan
Jul 23 '05 #2
