Bytes | Software Development & Data Engineering Community

How to speed up the import of 400,000+ records

jj
It's taking forever to upload 400,000 records to the database through
Access/ODBC, and I've tried phpMyAdmin's interface, but it seems to time out
during import of a CSV file. Is there a better way to import 400,000+
records?

Jul 19 '05 #1
jj
Thanks, that gave me an idea. I'm importing all the records to my local
MySQL database and will create an extract of that which will format it like
you said. Then take that .sql file and import it onto the server. We'll
see...
"Aggro" <sp**********@yahoo.com> wrote in message
news:X9***************@read3.inet.fi...
jj wrote:
> It's taking forever to upload 400,000 records to the database through
> Access/ODBC, and I've tried phpMyAdmin's interface, but it seems to time out
> during import of a CSV file. Is there a better way to import 400,000+
> records?


If it is possible for you to get the data into SQL command form like this:

-----filename.txt--------
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
-----filename.txt--------

You can insert it into the database using the mysql console like this:

c:\mysql\bin\mysql -u username databasename < filename.txt

Jul 19 '05 #2
Don't listen to the previous guy.

Use LOAD DATA CONCURRENT INFILE

Loading of several megs of data will take only a few seconds.
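For reference, and reusing the placeholder table, column, and file names from the earlier example, the statement would look something like this (CONCURRENT allows other sessions to read a MyISAM table while the load runs; LOCAL makes the client, rather than the server, read the file):

```sql
LOAD DATA CONCURRENT LOCAL INFILE 'data.csv'
INTO TABLE tablename
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(column1, column2);
```

Run it from the mysql console (e.g. c:\mysql\bin\mysql -u username databasename); 'data.csv' is a stand-in for whatever CSV the Access export produces.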

On Fri, 19 Dec 2003 19:30:04 GMT, "jj" <jj@test.net> wrote:
Thanks that gave me an idea. I'm importing all the records to my local
MySQL database and will create an extract of that which will format it like
you said. Then take that .sql file and import it onto the server. We'll
see...
"Aggro" <sp**********@yahoo.com> wrote in message
news:X9***************@read3.inet.fi...
jj wrote:
> It's taking forever to upload 400,000 records to the database through
> Access/ODBC, and I've tried phpMyAdmin's interface, but it seems to time out
> during import of a CSV file. Is there a better way to import 400,000+
> records?


If it is possible for you to get the data into SQL command form like this:

-----filename.txt--------
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
-----filename.txt--------

You can insert it into the database using the mysql console like this:

c:\mysql\bin\mysql -u username databasename < filename.txt


Jul 19 '05 #3
jj wrote:
> It's taking forever to upload 400,000 records to the database through
> Access/ODBC, and I've tried phpMyAdmin's interface, but it seems to time out
> during import of a CSV file. Is there a better way to import 400,000+
> records?


If it is possible for you to get the data into SQL command form like this:

-----filename.txt--------
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
-----filename.txt--------

You can insert it into the database using the mysql console like this:

c:\mysql\bin\mysql -u username databasename < filename.txt
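One way to generate a file in that shape, if the CSV really is just plain comma-separated values (no quoted strings or embedded commas; that's an assumption), is to batch many rows into each INSERT so the server parses far fewer statements. A rough sketch with awk, using the same placeholder table and column names:

```shell
# Turn data.csv (two plain columns, no header) into extended INSERT
# statements of up to 1000 rows each, written to filename.sql.
awk -F, '
  NR % 1000 == 1 {
    if (NR > 1) printf ";\n"                       # terminate the previous statement
    printf "insert into tablename(column1,column2) values"
    sep = ""
  }
  { printf "%s(%s,%s)", sep, $1, $2; sep = "," }   # append one (xx,yy) tuple
  END { printf ";\n" }                             # terminate the last statement
' data.csv > filename.sql
```

The resulting filename.sql can be fed to the mysql console exactly as shown above. String columns would need quoting and escaping, which is better left to a real CSV parser or to mysqldump's --extended-insert output (which has this same shape).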

Jul 19 '05 #4
jj
From Access via MyODBC to my local MySQL server it took only a
minute or so, but through ODBC to a remote server it takes hours and then
finally dies. We have a high-speed connection. Weird...
<us******@tampabay.rr.com> wrote in message
news:av********************************@4ax.com...
Don't listen to the previous guy.

Use LOAD DATA CONCURRENT INFILE

Loading of several megs of data will take only a few seconds.

On Fri, 19 Dec 2003 19:30:04 GMT, "jj" <jj@test.net> wrote:
Thanks, that gave me an idea. I'm importing all the records to my local
MySQL database and will create an extract of that which will format it like
you said. Then take that .sql file and import it onto the server. We'll
see...
"Aggro" <sp**********@yahoo.com> wrote in message
news:X9***************@read3.inet.fi...
jj wrote:

> It's taking forever to upload 400,000 records to the database through
> Access/ODBC, and I've tried phpMyAdmin's interface, but it seems to time out
> during import of a CSV file. Is there a better way to import 400,000+
> records?

If it is possible for you to get the data into SQL command form like this:
-----filename.txt--------
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
insert into tablename(column1,column2) values(xx,yy);
-----filename.txt--------

You can insert it into the database using the mysql console like this:

c:\mysql\bin\mysql -u username databasename < filename.txt

Jul 19 '05 #5
