
mySQL import text file too big?

Our production database, exported as a text file, runs about 60 MB.
Compressed, that's about 9 MB.

I'm trying to import the export into another machine running FC3 and
mySQL 11.18, and it appears as though the file may be too big!
When I try to do it via command line:

mysql -u root --host=localhost printing < ./printing.txt

It eventually errors out with a "syntax error on line X" and only about
half the tables get imported, and the phpMyAdmin GUI says it has
a 2 MB limit on file imports.

This concerns me. If something were to happen to our production
database, how am I going to restore it?

I tried to do a search on mysql.com, but that's a less than helpful
site for support. A newsgroup search on the topic didn't come up with
much, so maybe I'm just bad at using keywords.

Any pointers or suggestions on this issue?
Thanks!
Liam

Jul 23 '05 #1
<ne**@celticbear.com> wrote in message
news:11**********************@z14g2000cwz.googlegroups.com...
Our production database, exported as a text file, runs about 60 MB.
Compressed, that's about 9 MB.

I'm trying to import the export into another machine running FC3 and
mySQL 11.18, and it appears as though the file may be too big!
<snip> It eventually errors out with a "syntax error on line X" ... <snip> Any pointers or suggestions on this issue?


Have you tried to manually isolate "line X", put it into a *small* text
file all by itself, and then tried to import it?
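
For instance, something like this (the line numbers are placeholders - substitute the one from your actual error message):

# Pull a small window around the reported line, say the error was at line 12345:
sed -n '12340,12350p' ./printing.txt > chunk.txt
# Inspect it, or try feeding just that chunk to mysql:
mysql -u root --host=localhost printing < ./chunk.txt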

Perhaps what you are dealing with is a "syntax error at line X"!

I know it sounds crazy but what the heck!
Thomas Bartkus
Jul 23 '05 #2
This problem appears to be more complex.

The export I do of the database is in a cron job I wrote, which includes
this line:

mysqldump --opt --databases printing > /var/backup/dbtemp/printing.txt

This creates a file about 60MB in size.

However, I just did an export in phpMyAdmin of that database, and it
created a 250MB file!

I scanned through the 60MB file, and while I have no way of verifying
the individual records, I do find commands to drop and create all
the tables the database contains, followed by the commands to insert
each record for the table.

Yet when I try to import that file, I only get a few of the tables
created.

When I tried importing the 250MB file phpMyAdmin created, it
successfully created all the tables, and as far as I can tell, all the
records. So, it's evidently not a size issue at all.

I looked back at the automated export, which I have set up to create a
database export every 30 minutes, and looking at the file sizes,
they're all around 60MB, growing gradually smaller as you go back in time. So
it always appeared to me that it was working, since they grew in size
gradually as more records entered the database. But seeing how it's
nearly a quarter of the size of the phpMyAdmin export, and failing on the
import, there's something very wrong with it.

But this is the export script. I can't see anything wrong with it:

#!/bin/bash
# Dump each database to its own text file.
mysqldump --opt --databases offers > /var/backup/dbtemp/offers.txt
mysqldump --opt --databases priceretail > /var/backup/dbtemp/priceretail.txt
mysqldump --opt --databases pricewholesale > /var/backup/dbtemp/pricewholesale.txt
mysqldump --opt --databases printing > /var/backup/dbtemp/printing.txt
mysqldump --opt --databases db_users > /var/backup/dbtemp/db_users.txt
mysqldump --opt --databases test > /var/backup/dbtemp/test.txt
DOY=`date +%Y%m%d` # DATE YYYYMMDD
TIME=`date +%H%M` # TIME HHMM
# Bundle the dumps into a timestamped archive, clear the temp dir,
# and copy the archive to the remote backup machine.
/bin/tar -czf /var/backup/db/db-$DOY-$TIME.tgz /var/backup/dbtemp/*
rm -f /var/backup/dbtemp/*
scp -B -S ssh -i /home/backup/.ssh/id_dsa /var/backup/db/db-$DOY-$TIME.tgz backup@(IP ADDRESS OF REMOTE PC):/home/backup/db/

Any suggestions?
Thanks!
Liam

Jul 23 '05 #3
<ne**@celticbear.com> wrote in message
news:11**********************@f14g2000cwb.googlegroups.com...
This problem appears to be more complex.
<snip> But this is the export script. I can't see anything wrong with it: ... <snip> Any suggestions?


I have no experience with phpMyAdmin, so I can't comment there.

And your "mySQL 11.18" version info is indecipherable. What version(s) are you
running? Are you porting data between 2 identical versions? If not -
beware of your --opt switch. A nickel says you're creating something in one
version that is not recognized by the other - IF you are porting between
different MySQLs.

Personally, I always separate backing up database structures from backing
up the data. It makes it easier to keep things straight.

I use mysqldump with -d to get database structures only. If I want to restore
or recreate a database, I start empty (no tables) and execute the script
created by mysqldump.
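
For example (a sketch, using the 'printing' database from this thread):

# -d = dump table structures only, no row data
mysqldump -d --databases printing > printing_structure.sql
# Restore onto an empty server; --databases puts CREATE DATABASE/USE in the dump
mysql -u root < printing_structure.sql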

And I don't use mysqldump at all for the data. I use SELECT * INTO OUTFILE
'sometable.dat' to back up data and a corresponding LOAD DATA
INFILE 'sometable.dat' to restore it. These files are substantially
smaller anyway, since they don't need to carry all the SQL syntax (INSERT,
VALUES and such). Just a clean tab-delimited (or comma, or what have you)
text data table that is portable.
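
In SQL terms, roughly (with 'sometable' standing in for a real table name):

-- Backup: the server writes a tab-delimited file on its own filesystem
SELECT * INTO OUTFILE '/var/backup/dbtemp/sometable.dat' FROM sometable;
-- Restore: the (empty) table must already exist
LOAD DATA INFILE '/var/backup/dbtemp/sometable.dat' INTO TABLE sometable;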

I don't know of an import file size limit and I am having difficulty
believing this can be the problem. You mention that the file size of your
phpMyAdmin dump is substantially larger than what you get from mysqldump.
*Clearly*, each is throwing something different at MySQL when you try to
import.

You *still* need to isolate "line X" (where MySQL is choking!) if you really
want to know what's shaking. MySQL is telling you where it's having a
problem. You owe it to yourself to look at that spot!
Thomas Bartkus
Jul 23 '05 #4
Thomas Bartkus wrote:
<ne**@celticbear.com> wrote in message
news:11**********************@f14g2000cwb.googlegroups.com...
This problem appears to be more complex.

I have no experience with phpMyAdmin, so I can't comment there.
And your "mySQL 11.18" version info is indecipherable. What version(s) are you running? Are you porting data between 2 identical versions? If not - beware of your --opt switch. A nickel says you're creating something in one version that is not recognized by the other - IF you are porting between different MySQLs.
Sorry, I did a "mysql -V" and wrote down the wrong info.
Source mySQL = 3.23.58
Destination mySQL = 3.23.58

OK, I guess that's not the issue.

Personally, I always separate backing up database structures from backing up the data. It makes it easier to keep things straight.

I use mysqldump with -d to get database structures only. If I want to restore or recreate a database, I start empty (no tables) and execute the script created by mysqldump.

And I don't use mysqldump at all for the data. I use SELECT * INTO OUTFILE 'sometable.dat' to back up data and a corresponding LOAD DATA
INFILE 'sometable.dat' to restore it. These files are substantially smaller anyway, since they don't need to carry all the SQL syntax (INSERT, VALUES and such). Just a clean tab-delimited (or comma, or what have you) text data table that is portable.

That's certainly an idea to consider. But how viable is that? For
example, in the one database (and we have 5 we need to back up on a
regular basis) we have over 30 tables. It would be pretty onerous to do a
query export and import for each table. Is that even something that can
be automated? I have a cron doing the mysqldump every 30 minutes and
sending the dumps to a folder that stores the last 7 days of backups (yeah,
that's a lot, but I do that in case some terrible event
makes the data invalid for up to 6 days before we realize
it). Now that I think about it, I suppose I could make a shell-executable
PHP script that can be run from cron - or even a plain shell loop, as
sketched below. Anyway....
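
(Untested sketch - I'm guessing at the plumbing; -N should strip the column-header row from the SHOW TABLES output:)

#!/bin/bash
# Dump each table of the 'printing' database to its own tab-delimited file.
for TBL in `mysql -N -u root -e 'SHOW TABLES' printing`; do
    mysql -u root -e "SELECT * INTO OUTFILE '/var/backup/dbtemp/printing.$TBL.dat' FROM $TBL" printing
done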
I don't know of an import file size limit and I am having difficulty
believing this can be the problem. You mention that the file size of your phpMyAdmin dump is substantially larger than what you get from mysqldump. *Clearly*, each is throwing something different at MySQL when you try to import.
Yeah, because I was able to successfully import a backed-up file that was 4 times
larger than the automated exports, I have ruled out size
as the issue.
Comparing the two files, the 60MB file I get from mysqldump and the
250MB file I get from phpMyAdmin, I can't see anything blatantly
different. Both have the DROP/CREATE commands for all the tables, and
from a superficial look, it appears the format of the inserts is
the same. The only thing I can think of is missing data... but even if
that's the case, since the 60MB mysqldump file has commands to create
all the tables, it SHOULD do at least THAT much.

Well, also, if there IS missing data, it doesn't make for a viable backup.

You *still* need to isolate "line X" (where MySQL is choking!) if you really want to know what's shaking. MySQL is telling you where it's having a problem. You owe it to yourself to look at that spot!
Thomas Bartkus


Yeah, new twist in that topic. It appears that the one 60MB mysqldump
file I grabbed yesterday to test with was actually incomplete -
only by about 2 MB, though. I copied it out of the backup dir
before the dump had actually finished. How unlucky is that. =/
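
To keep that from happening again, I may change the script to dump to a temporary name and only rename on success - an untested sketch, using the same paths as my script above:

#!/bin/bash
# Never leave a half-written dump under its final name.
for DB in offers priceretail pricewholesale printing db_users test; do
    if mysqldump --opt --databases $DB > /var/backup/dbtemp/$DB.txt.part; then
        mv /var/backup/dbtemp/$DB.txt.part /var/backup/dbtemp/$DB.txt
    else
        echo "mysqldump failed for $DB" >&2
    fi
done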

Anyway, I tried it again today with a completed mysqldump file and I
don't get the error. Although, it's STILL only creating some of the
tables and is a quarter the size of the phpMyAdmin export.

Well, thanks for replying. I appreciate the tips!
Liam

Jul 23 '05 #5
