
Stream from FTP directly to MySQL while parsing CSV

I have some files that sit on an FTP server. These files contain data
stored in a tab-separated format. I need to download these files and
insert/update them in a MySQL database. My current basic strategy is to
do the following:

1) Log in to the FTP server using the FTP library in PHP.
2) Create a variable that acts like a file handle using Stream_Var in PEAR.
3) Use ftp_fget() to read a remote file into this variable (this is so I
don't have to write it to disk).
4) Parse the data now stored in memory using fgetcsv() (again treating
that variable as a file handle using Stream_Var). This produces an array.
5) Insert/update the data in the array using DB in PEAR.
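
Roughly, the code looks like the sketch below (simplified; the Stream_Var
wrapper registration and var:// URL are from memory, and the connection
details, table, and column names are placeholders):

<?php
require_once 'Stream/Var.php';
require_once 'DB.php';

stream_wrapper_register('var', 'Stream_Var');

$db  = DB::connect('mysql://user:pass@localhost/mydb');
$ftp = ftp_connect('ftp.example.com');
ftp_login($ftp, 'user', 'pass');

// Pull the whole remote file into a variable exposed as a stream.
$GLOBALS['buffer'] = '';
$fp = fopen('var://GLOBALS/buffer', 'w+');
ftp_fget($ftp, $fp, 'data.tsv', FTP_ASCII);
rewind($fp);

// Parse tab-separated rows and insert each one.
while (($row = fgetcsv($fp, 8192, "\t")) !== false) {
    $db->query('INSERT INTO items (sku, name, price) VALUES (?, ?, ?)', $row);
}
fclose($fp);
ftp_close($ftp);
?>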

This all seems to work, and it means I don't have to write anything to
disk. Everything is handled in memory, so no temp files are needed. The
downside is that some of these files are very large, so the program can
consume large amounts of memory. I want to see what I can do to reduce
this memory usage.

In a perfect world I wouldn't need to keep the entire file in memory. As
soon as a single line is read via FTP, I should be able to pass that line
off to the CSV parsing code, and the MySQL insert/update should take
place as each line is parsed by the CSV library. I.e. I should never have
more than a buffer's worth of data in memory at a time. The buffer would
need to hold at least an entire line, but my memory requirements would
drop significantly.

My problem is that I can't seem to figure out how to do this with the
current PHP libraries. It seems that most functions in PHP are not
designed around the idea of piping streams of information together.

The other restriction I have is that I am limited to PHP 4.3. Any ideas,
or is holding the entire file in memory the best way (other than writing
my own libraries)?

Eric
Jan 27 '06 #1
Eric Anderson wrote:
I have some files that sit on a FTP server. These files contain data
stored in a tab-separated format. I need to download these files and
insert/update them in a MySQL database. My current basic strategy is to
do the following:

1) Login to the ftp server using the FTP library in PHP
2) Create a variable that acts like a file handle using Stream_Var in PEAR.
3) Use ftp_fget() to read a remote file into this variable (this is so I
don't have to write it to disk).
4) Parse that data now stored in memory using fgetcsv() (again treating
that variable as a file handle using Stream_Var). This produces an array.
5) Insert/update the data in the array using DB in PEAR.


Is there a reason why you can't use the built-in FTP stream wrapper?

Jan 27 '06 #2
NC
Eric Anderson wrote:

I have some files that sit on a FTP server. These files contain data
stored in a tab-separated format. I need to download these files and
insert/update them in a MySQL database. My current basic strategy
is to do the following:

1) Login to the ftp server using the FTP library in PHP
2) Create a variable that acts like a file handle using Stream_Var in PEAR.
3) Use ftp_fget() to read a remote file into this variable (this is so I
don't have to write it to disk).
4) Parse that data now stored in memory using fgetcsv() (again treating
that variable as a file handle using Stream_Var). This produces an array.
5) Insert/update the data in the array using DB in PEAR.

This all seems to work and it means I don't have to write anything to
disk.


This is not necessarily a good thing. Because you want to minimize disk
usage, you are missing out on MySQL's ability to process large files
very quickly. I would suggest an alternative approach:

1. Copy the remote file to your local disk.
2. Use LOAD DATA INFILE to load the data into MySQL.
3. Delete the data file, if necessary.
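
Something along these lines (the table and column names below are just
placeholders, and LOCAL may or may not be needed depending on where the
file sits relative to the MySQL server):

<?php
$ftp = ftp_connect('ftp.example.com');
ftp_login($ftp, 'user', 'pass');
ftp_get($ftp, '/tmp/data.tsv', 'data.tsv', FTP_ASCII);   // 1. copy to local disk
ftp_close($ftp);

$link = mysql_connect('localhost', 'dbuser', 'dbpass');
mysql_select_db('mydb', $link);

// 2. bulk-load; LOAD DATA INFILE defaults to tab-separated fields
//    and newline-terminated rows, which matches the file format
mysql_query("LOAD DATA LOCAL INFILE '/tmp/data.tsv'
             INTO TABLE items (sku, name, price)", $link);

unlink('/tmp/data.tsv');                                 // 3. delete the data file
?>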

Cheers,
NC

Jan 27 '06 #3
d
"Eric Anderson" <er**@afaik.us> wrote in message
news:43**********************@titian.nntpserver.com...
I have some files that sit on an FTP server. These files contain data stored
in a tab-separated format. I need to download these files and insert/update
them in a MySQL database. My current basic strategy is to do the following:

1) Log in to the FTP server using the FTP library in PHP.
2) Create a variable that acts like a file handle using Stream_Var in
PEAR.
3) Use ftp_fget() to read a remote file into this variable (this is so I
don't have to write it to disk).
4) Parse the data now stored in memory using fgetcsv() (again treating
that variable as a file handle using Stream_Var). This produces an array.
5) Insert/update the data in the array using DB in PEAR.

This all seems to work, and it means I don't have to write anything to
disk. Everything is handled in memory, so no temp files are needed. The
downside is that some of these files are very large, so the program can
consume large amounts of memory. I want to see what I can do to reduce
this memory usage.

In a perfect world I wouldn't need to keep the entire file in memory. As soon
as a single line is read via FTP, I should be able to pass that line off to
the CSV parsing code, and the MySQL insert/update should take place as each
line is parsed by the CSV library. I.e. I should never have more than a
buffer's worth of data in memory at a time. The buffer would need to hold at
least an entire line, but my memory requirements would drop significantly.

My problem is that I can't seem to figure out how to do this with the
current PHP libraries. It seems that most functions in PHP are not designed
around the idea of piping streams of information together.

The other restriction I have is that I am limited to PHP 4.3. Any ideas,
or is holding the entire file in memory the best way (other than writing
my own libraries)?

Eric


As chung was hinting at, use the FTP wrapper by simply opening the file with
fopen() as you would a local file. You can then use fgets() to read a
single line at a time, process that line, and repeat until the file has been
read in its entirety.
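
Something like this sketch (host, credentials, and path are placeholders;
fgetcsv() would also work directly on the handle for tab-separated data):

<?php
// Only one line is held in memory at a time.
$fp = fopen('ftp://user:pass@ftp.example.com/path/data.tsv', 'r');
if (!$fp) {
    die('Could not open remote file');
}
while (($line = fgets($fp, 8192)) !== false) {
    $row = explode("\t", rtrim($line, "\r\n"));
    // ... insert/update $row in the database here ...
}
fclose($fp);
?>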

dave
Jan 27 '06 #4
d wrote:
As chung was hinting at, use the FTP wrapper by simply opening the file with
fopen() as you would a local file. You can then use fgets() to read a
single line at a time, process that line, and repeat until the file has been
read in its entirety.


I thought about that, but I am reading a large number of files from
the FTP server into the database, and my assumption is that if I use
fopen() it will log in to the FTP server every time instead of just once,
adding a good bit of overhead.

Thanks for the suggestion though. I'll keep it in mind.

Eric
Jan 27 '06 #5
NC wrote:
This is not necessarily a good thing. Because you want to minimize disk
usage, you are missing out on MySQL's ability to process large
files very quickly. I would suggest an alternative approach:

1. Copy the remote file to your local disk.
2. Use LOAD DATA INFILE to load the data into MySQL.
3. Delete the data file, if necessary.


Interesting approach. It would be fast and low on resources (although it
would require use of the filesystem, which perhaps isn't too big of a
deal). The only downside is that it is MySQL-specific. Currently this
application is database-independent, and it would be nice to keep it that
way. I'll keep it in mind.

Eric
Jan 27 '06 #6

Eric Anderson wrote:
d wrote:
As chung was hinting at, use the FTP wrapper by simply opening the file with
fopen() as you would a local file. You can then use fgets() to read a
single line at a time, process that line, and repeat until the file has been
read in its entirety.


I thought about that, but I am reading a large number of files from
the FTP server into the database, and my assumption is that if I use
fopen() it will log in to the FTP server every time instead of just once,
adding a good bit of overhead.

Thanks for the suggestion though. I'll keep it in mind.

Eric


Good point. On the other hand, the file transfer would happen concurrently
with the database inserts. If the file is large, the total time for the
entire operation would be lower.

Jan 27 '06 #7
NC
Eric Anderson wrote:
NC wrote:
This is not necessarily a good thing. Because you want to minimize
disk usage, you are missing out on MySQL's ability to process large
files very quickly. I would suggest an alternative approach:

1. Copy the remote file to your local disk.
2. Use LOAD DATA INFILE to load the data into MySQL.
3. Delete the data file, if necessary.


Interesting approach. It would be fast and low on resources (although it
would require use of the filesystem, which perhaps isn't too big of a
deal). The only downside is that it is MySQL-specific. Currently this
application is database-independent, and it would be nice to keep it that
way.


If memory serves, all SQL databases support import of text files. The
query syntax may differ, but the concept is clearly there...
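
Purely as an illustration (syntax from memory; for instance, PostgreSQL's
COPY reads tab-separated text by default, much like LOAD DATA INFILE):

<?php
// Illustration only: pick a bulk-load statement per driver; anything not
// covered falls back to row-by-row inserts.
function bulk_load_sql($driver, $file, $table)
{
    switch ($driver) {
        case 'mysql':
            return "LOAD DATA LOCAL INFILE '$file' INTO TABLE $table";
        case 'pgsql':
            return "COPY $table FROM '$file'";
        default:
            return false;
    }
}
?>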

Cheers,
NC

Jan 28 '06 #8
NC wrote:
Eric Anderson wrote:
NC wrote:
This is not necessarily a good thing. Because you want to minimize
disk usage, you are missing out on MySQL's ability to process large
files very quickly. I would suggest an alternative approach:

1. Copy the remote file to your local disk.
2. Use LOAD DATA INFILE to load the data into MySQL.
3. Delete the data file, if necessary.

Interesting approach. It would be fast and low on resources (although it
would require use of the filesystem, which perhaps isn't too big of a
deal). The only downside is that it is MySQL-specific. Currently this
application is database-independent, and it would be nice to keep it that
way.


If memory serves, all SQL databases support import of text files. The
query syntax may differ, but the concept is clearly there...


I went with a modification of this method. I got to thinking that PHP
variables are probably not designed to store large amounts of data. So
even though Stream_Var is convenient, it is probably not efficient,
because as new data is read in from the FTP server, PHP is probably
mallocing continuously, causing performance to drag. Instead, I now have
ftp_fget() write to whatever temporary file tmpfile() gives me on disk,
and then use fgetcsv() to read that back from disk and insert into the
database as before.

This change reduced the import time from about 3 hours to about 10
minutes! It still seems like the FTP library should offer a way to stream
FTP data directly into a consuming function such as fgetcsv() without
having to read the entire file into memory at once. But the tmpfile()
workaround seems to perform well, so I am happy.
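
For reference, the core of the new version looks roughly like this
(assuming $ftp and $db are already connected as before; the table and
column names are placeholders):

<?php
$tmp = tmpfile();                              // anonymous temp file, removed on fclose()
ftp_fget($ftp, $tmp, 'data.tsv', FTP_ASCII);   // stream the remote file straight to disk
rewind($tmp);                                  // back to the start before parsing

while (($row = fgetcsv($tmp, 8192, "\t")) !== false) {
    $db->query('INSERT INTO items (sku, name, price) VALUES (?, ?, ?)', $row);
}
fclose($tmp);                                  // temp file is deleted automatically
?>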

Eric
Jan 30 '06 #9
