Bytes | Software Development & Data Engineering Community
importing large amount of data does not work

Hi, I have researched but have not found a good solution to this
problem.

I am importing a large amount of data (over 50 MB) into a new MySQL
database that I set up. I use
mysql dbname < importfile.txt

But I keep getting timeouts and errors because the data is too large.
I know that is the cause, since if I break the import into multiple
chunks (by importing a few tables at a time), everything works.

Any advice is appreciated. Running on Linux.
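One common workaround is to split the dump into smaller pieces and import them one at a time. A minimal sketch follows; the file names are stand-ins, and it assumes each INSERT statement in the dump sits on its own line (true for typical mysqldump output, but worth checking):

```shell
# Fabricate a small stand-in "dump" so the commands run standalone;
# in practice, replace sample.sql with the real importfile.txt.
printf 'INSERT INTO t VALUES (%d);\n' $(seq 1 1000) > sample.sql

# Split by line count so no statement is cut in half (assumes one
# statement per line). 250 lines per chunk here; tune for real data.
split -l 250 sample.sql chunk_
ls chunk_*    # chunk_aa chunk_ab chunk_ac chunk_ad

# Then import each chunk in turn (requires a running MySQL server):
# for f in chunk_*; do mysql dbname < "$f"; done
```

Importing chunk by chunk also makes it easier to see which table or statement a failure occurs in.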

--
http://www.dbForumz.com/ This article was posted by author's request
Articles individually checked for conformance to usenet standards
Topic URL: http://www.dbForumz.com/mySQL-import...ict139853.html
Visit Topic URL to contact author (reg. req'd). Report abuse: http://www.dbForumz.com/eform.php?p=467833
Jul 20 '05 #1
I can import files larger than that without any problems on linux. How
does it time out and what errors are you getting?

steve <Us************@dbForumz.com> wrote in message news:<41********@news.athenanews.com>...

Jul 20 '05 #2
"Jay Donnell" wrote:
I can import files larger than that without any problems on linux.
How does it time out and what errors are you getting?


Thanks. Next time I do an import, I will report the exact error, but
generally it says something like "script timed out".
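A "script timed out" message usually comes from a web front end (for example, phpMyAdmin hitting PHP's execution-time limit) rather than from the mysql client itself; running the import from the shell sidesteps that limit entirely. If the server itself is dropping the connection on large statements, raising its limits can help. A hypothetical my.cnf fragment (the variable names are real MySQL server settings; the values are only illustrative):

```ini
[mysqld]
max_allowed_packet = 64M    ; allow large single INSERT statements
net_read_timeout   = 600    ; seconds the server waits on a slow client read
wait_timeout       = 28800  ; idle-connection timeout in seconds
```

Note that max_allowed_packet is checked on both the client and the server, so both may need raising.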

Jul 20 '05 #3


