Hi there,
SUMMARY:
1) When I use phpMyAdmin to export a large database, the file created
on my local PC never contains all the tables and data. It seems to max
out at about 10 MB.
2) Will running mysqldump from the command line on a live, in-use DB
(with one table containing 350,000 rows) pose any risks or have a
large effect on current use?
FULL DETAILS:
I am trying to port a MySQL database from a live environment to a test
one so I can build a test server to try out changes.
Both test and live are hosted on shared hosting packages.
I used phpMyAdmin to export the database, and when the file got to
about 10 MB it just seemed to stop. I therefore tried exporting the
tables separately, but one table in particular, with about 350,000
rows, never completed. The text file created on my local PC never
contains all the data that is in the original.
Firstly, I do not understand why this happens. Does anyone know why?
As a workaround I looked into mysqldump from the command line. I tried
it on my test server and it worked fine and quickly, but that server
does not have a table with 350,000 rows in it. So if I log into the
live environment, will it have a big impact if I run mysqldump to
create a file that (totalling the table sizes shown in phpMyAdmin)
should be about 23 MB? I do not want to create any noticeable impact
on the live system (running a little slowly for 5 minutes or so is
acceptable), and I CERTAINLY do not want to risk damaging the data.
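For reference, this is the kind of invocation I have in mind (the user
and database names are placeholders, not my real details). From what I
have read, --single-transaction takes a consistent snapshot without
locking InnoDB tables, and --quick streams rows one at a time instead
of buffering whole tables in memory, so something like this is what I
was planning to try:

```shell
# Placeholder names: replace myuser / mydatabase with real values.
# --single-transaction: consistent snapshot without locking InnoDB tables
# --quick: fetch and write rows one at a time rather than buffering
mysqldump --single-transaction --quick -u myuser -p mydatabase > dump.sql
```

Does that look like a safe way to run it against a live database, or
are there other options I should know about?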
Any thoughts appreciated.
Kind regards
G