Bytes | Software Development & Data Engineering Community

Porting a MySQL DB across servers - mysqldump question, as phpMyAdmin export does not complete.

Hi there,

SUMMARY:
1) When I use phpMyAdmin to export a large database, the file created on
my local PC never contains all the tables and data. It seems
to max out at about 10 MB.

2) Will running mysqldump from the command line on a live, in-use DB (with
one table containing 350,000 rows) pose any risks or have any noticeable
effect on current use?

FULL DETAILS:
I am trying to port a MySQL database from a live environment to a test
one so I can build a test server to try out changes.

Both test and live are hosted on shared hosting packages.

I used phpMyAdmin to export the database and when the file got to about
10 MB it just seemed to stop. I therefore tried exporting all the
tables separately, but one table in particular that has about 350,000
rows never completed. The text file that is created on my local PC
never has all the data that is in the original.

Firstly, I do not understand why this happens. Does anyone know why?

As a workaround I looked into mysqldump from the command line. I
tried this out on my test server and it worked fine and quickly, but
then I did not have a database that has a table with 350,000 rows in
it. So if I log into the live environment, will it have a big impact
if I run mysqldump and create a file (which, according to totalling
the table sizes in phpMyAdmin, should be about 23 MB)? I do not want
to create any impact (running a little slow for 5 minutes or so is
acceptable) on the live data, and I CERTAINLY do not want to risk
damaging the data.

Any thoughts appreciated.

Kind regards

G
Jul 23 '05 #1
On 24 Jan 2005 03:55:41 -0800, Gr*************@lycos.co.uk (Dave
Crypto) wrote:
> Hi there,
>
> SUMMARY:
> 1) When I use phpMyAdmin to export a large database, the file created on
> my local PC never contains all the tables and data. It seems
> to max out at about 10 MB.

I don't know of a file size limit or total disk space limit in your
situation. Might that be the problem? Or the script execution time limit?
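If a script limit is the culprit, these are the php.ini directives that typically cap a phpMyAdmin export. The values below are purely illustrative, and on shared hosting you often cannot change them yourself:

```ini
; Illustrative php.ini settings that commonly truncate large phpMyAdmin exports.
; On shared hosting these are usually fixed by the provider.
max_execution_time = 300   ; seconds a PHP script may run before being killed
memory_limit = 128M        ; memory available to the PHP process
post_max_size = 32M        ; also relevant when *importing* a dump via phpMyAdmin
upload_max_filesize = 32M
```

If the export stops at roughly the same size every time, a memory or execution-time cap like one of these is a likely suspect.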

> 2) Will running mysqldump from the command line on a live, in-use DB (with
> one table containing 350,000 rows) pose any risks or have any noticeable
> effect on current use?

Nope, unless locking tables for a short period of time would be a
problem. I do it for tables of that size as well. The mysqldump run
takes a few seconds and the tar and gzip steps take a few seconds
more. The total time it takes me to back up my database, including
compression, is less than a minute. (I was also concerned about
locking my system, so I now use a master/slave setup and back up the
slave's data.)
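A minimal sketch of that backup routine, assuming a database named livedb and a user dbuser (both placeholders for your own names). It is shown as a dry run that only prints the pipeline; drop the echo to actually execute it:

```shell
# Placeholder names: livedb, dbuser, and the dated output filename.
# --single-transaction takes a consistent snapshot of InnoDB tables without
# holding a read lock for the whole dump; --quick streams rows one at a time
# instead of buffering whole tables in memory.
CMD="mysqldump --user=dbuser --password --single-transaction --quick livedb"
OUT="livedb-$(date +%Y%m%d).sql.gz"
echo "$CMD | gzip > $OUT"   # dry run: prints the pipeline instead of running it
```

Note that --single-transaction only avoids locking for InnoDB tables; MyISAM tables are still read-locked while they are being dumped, which at this size should be a matter of seconds.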

> FULL DETAILS:
> I am trying to port a MySQL database from a live environment to a test
> one so I can build a test server to try out changes.
>
> Both test and live are hosted on shared hosting packages.
>
> I used phpMyAdmin to export the database and when the file got to about
> 10 MB it just seemed to stop. I therefore tried exporting all the
> tables separately, but one table in particular that has about 350,000
> rows never completed. The text file that is created on my local PC
> never has all the data that is in the original.
>
> Firstly, I do not understand why this happens. Does anyone know why?
>
> As a workaround I looked into mysqldump from the command line. I
> tried this out on my test server and it worked fine and quickly, but
> then I did not have a database that has a table with 350,000 rows in
> it. So if I log into the live environment, will it have a big impact
> if I run mysqldump and create a file (which, according to totalling
> the table sizes in phpMyAdmin, should be about 23 MB)? I do not want
> to create any impact (running a little slow for 5 minutes or so is
> acceptable) on the live data, and I CERTAINLY do not want to risk
> damaging the data.

Is it not possible to issue a mysqldump command on that machine itself?

A workaround, if you can access your data directory, is to lock the
system with a FLUSH TABLES WITH READ LOCK; statement and leave that
MySQL session running. Copy the data directory to your local system,
then end the MySQL session, which releases the lock.
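That syntax is correct. The lock-and-copy sequence looks like this (a sketch; the data directory path is an example and varies by installation):

```sql
-- Run in one mysql client session and keep that session open:
FLUSH TABLES WITH READ LOCK;   -- closes open tables and blocks further writes

-- ...from a second shell, copy the MySQL data directory, e.g.
-- cp -a /var/lib/mysql /backup/   (path is an example only)...

UNLOCK TABLES;   -- release the lock explicitly rather than killing the session
```

Releasing the lock with UNLOCK TABLES is cleaner than killing the session, though closing the session also drops the lock.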
> Any thoughts appreciated.
>
> Kind regards
>
> G


Jonathan
Jul 23 '05 #2
