Bytes IT Community

Stream from FTP directly to MySQL while parsing CSV

I have some files that sit on an FTP server. These files contain data
stored in a tab-separated format. I need to download these files and
insert/update them in a MySQL database. My current basic strategy is to
do the following:

1) Log in to the FTP server using the FTP library in PHP.
2) Create a variable that acts like a file handle using Stream_Var in PEAR.
3) Use ftp_fget() to read a remote file into this variable (so I
don't have to write it to disk).
4) Parse the data now stored in memory using fgetcsv() (again treating
that variable as a file handle via Stream_Var). This produces an array.
5) Insert/update the data in the array using DB in PEAR.

This all seems to work and it means I don't have to write anything to
disk. Everything is handled in memory so no temp files are needed. The
downside is that some of these files are very large, so the program can
consume large amounts of memory. I want to see what I can do to reduce
this memory usage.
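
To make that concrete, here is roughly what I have at the moment (a sketch
only; the host, credentials, file path and table layout are placeholders, and
I am going from memory on the Stream_Var var:// syntax):

<?php
// Sketch of the current approach (PHP 4.3). Host, credentials, file path and
// table/column names are placeholders; the var:// URL syntax and wrapper
// registration are assumed from the Stream_Var documentation.
require_once 'Stream/Var.php';
require_once 'DB.php';

stream_wrapper_register('var', 'Stream_Var');

// 1) Log in to the FTP server
$ftp = ftp_connect('ftp.example.com');
ftp_login($ftp, 'user', 'pass');

// 2) A variable exposed as a file handle via Stream_Var
$GLOBALS['csvdata'] = '';
$fp = fopen('var://GLOBALS/csvdata', 'w');

// 3) Read the remote file into that variable; nothing hits the disk,
//    but the whole file ends up in memory
ftp_fget($ftp, $fp, 'exports/data.txt', FTP_ASCII);
fclose($fp);
ftp_close($ftp);

// 4) Re-open the variable and parse it as tab-separated values
// 5) Insert/update each row using PEAR DB (plain insert shown)
$db = DB::connect('mysql://user:pass@localhost/mydb');
$fp = fopen('var://GLOBALS/csvdata', 'r');
while ($row = fgetcsv($fp, 8192, "\t")) {
    $db->query('INSERT INTO items (id, name, price) VALUES (?, ?, ?)', $row);
}
fclose($fp);
?>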

In a perfect world I wouldn't need to keep the entire file in memory. As
soon as a single line is read via FTP, I should be able to pass that line
off to the CSV parsing code, and the MySQL insert/update should take
place as each line is parsed by the CSV library. I.e. I should have no
more than a buffer's worth of data in memory at a time. The buffer would
need to hold at least an entire line, but my memory requirements would
drop significantly.

My problem is that I can't figure out how to do this with the current
PHP libraries. It seems that most functions in PHP are not designed
around the idea of piping streams of information together.

The other restriction is that I am limited to PHP 4.3. Any ideas, or is
holding the entire file in memory the best approach (short of writing my
own libraries)?

Eric
Jan 27 '06 #1
8 Replies


Eric Anderson wrote:
I have some files that sit on an FTP server. These files contain data
stored in a tab-separated format. I need to download these files and
insert/update them in a MySQL database. My current basic strategy is to
do the following:

1) Log in to the FTP server using the FTP library in PHP.
2) Create a variable that acts like a file handle using Stream_Var in PEAR.
3) Use ftp_fget() to read a remote file into this variable (so I
don't have to write it to disk).
4) Parse the data now stored in memory using fgetcsv() (again treating
that variable as a file handle via Stream_Var). This produces an array.
5) Insert/update the data in the array using DB in PEAR.


Is there a reason why you can't use the built-in FTP stream wrapper?

Jan 27 '06 #2

NC
Eric Anderson wrote:

I have some files that sit on an FTP server. These files contain data
stored in a tab-separated format. I need to download these files and
insert/update them in a MySQL database. My current basic strategy
is to do the following:

1) Log in to the FTP server using the FTP library in PHP.
2) Create a variable that acts like a file handle using Stream_Var in PEAR.
3) Use ftp_fget() to read a remote file into this variable (so I
don't have to write it to disk).
4) Parse the data now stored in memory using fgetcsv() (again treating
that variable as a file handle via Stream_Var). This produces an array.
5) Insert/update the data in the array using DB in PEAR.

This all seems to work and it means I don't have to write anything to
disk.


This is not necessarily a good thing. Because you want to minimize
disk usage, you are missing out on MySQL's ability to process large
files very quickly. I would suggest an alternative approach:

1. Copy the remote file to your local disk.
2. Use LOAD DATA INFILE to load the data into MySQL.
3. Delete the data file, if necessary.
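
Roughly like this (a sketch only; host, paths and table name are placeholders,
and LOAD DATA expects tab-separated fields by default):

<?php
// Sketch of the above (PHP 4.3). Host, credentials, paths and the table name
// are placeholders; LOAD DATA expects tab-separated fields by default.
$ftp = ftp_connect('ftp.example.com');
ftp_login($ftp, 'user', 'pass');

// 1. Copy the remote file to local disk
$local = tempnam('/tmp', 'csv');
ftp_get($ftp, $local, 'exports/data.txt', FTP_ASCII);
ftp_close($ftp);

// 2. Let MySQL bulk-load it (LOCAL because the file sits on the client side)
$conn = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mydb', $conn);
mysql_query("LOAD DATA LOCAL INFILE '" . addslashes($local) . "' INTO TABLE items", $conn);

// 3. Delete the data file
unlink($local);
?>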

Cheers,
NC

Jan 27 '06 #3

d
"Eric Anderson" <er**@afaik.us> wrote in message
news:43**********************@titian.nntpserver.com...
I have some files that sit on an FTP server. These files contain data stored
in a tab-separated format. I need to download these files and insert/update
them in a MySQL database. My current basic strategy is to do the following:

1) Log in to the FTP server using the FTP library in PHP.
2) Create a variable that acts like a file handle using Stream_Var in
PEAR.
3) Use ftp_fget() to read a remote file into this variable (so I
don't have to write it to disk).
4) Parse the data now stored in memory using fgetcsv() (again treating
that variable as a file handle via Stream_Var). This produces an array.
5) Insert/update the data in the array using DB in PEAR.

This all seems to work and it means I don't have to write anything to
disk. Everything is handled in memory so no temp files are needed. The
downside is that some of these files are very large, so the program can
consume large amounts of memory. I want to see what I can do to reduce
this memory usage.

In a perfect world I wouldn't need to keep the entire file in memory. As soon
as a single line is read via FTP, I should be able to pass that line off to
the CSV parsing code, and the MySQL insert/update should take place as each
line is parsed by the CSV library. I.e. I should have no more than a buffer's
worth of data in memory at a time. The buffer would need to hold at least an
entire line, but my memory requirements would drop significantly.

My problem is that I can't figure out how to do this with the current
PHP libraries. It seems that most functions in PHP are not designed
around the idea of piping streams of information together.

The other restriction is that I am limited to PHP 4.3. Any ideas, or is
holding the entire file in memory the best approach (short of writing my
own libraries)?

Eric


As chung was hinting at, use the FTP wrapper by simply opening the file with
fopen() as you would a local file. You can then use fgets() to read a
single line at a time, process that line, and repeat until the file has been
read in its entirety.
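
Something along these lines (a sketch only; it assumes allow_url_fopen is
enabled, and the credentials and table layout are placeholders). Note that
fgetcsv() will also read one line at a time straight off the stream, just
like fgets():

<?php
// Sketch of the ftp:// wrapper approach (PHP 4.3, allow_url_fopen enabled).
// URL, credentials and table layout are placeholders.
require_once 'DB.php';

$db = DB::connect('mysql://user:pass@localhost/mydb');
$fp = fopen('ftp://user:pass@ftp.example.com/exports/data.txt', 'r');

// Only one line's worth of data is held in memory at any time
while ($row = fgetcsv($fp, 8192, "\t")) {
    $db->query('INSERT INTO items (id, name, price) VALUES (?, ?, ?)', $row);
}
fclose($fp);
?>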

dave
Jan 27 '06 #4

d wrote:
As chung was hinting at, use the FTP wrapper by simply opening the file with
fopen() as you would a local file. You can then use fgets() to read a
single line at a time, process that line, and repeat until the file has been
read in its entirety.


I thought about that, but I am also reading a large number of files from
the FTP server into the database, and my assumption is that if I use
fopen() it will log in to the FTP server every time instead of just once,
adding a good bit of overhead.

Thanks for the suggestion though. I'll keep it in mind.

Eric
Jan 27 '06 #5

NC wrote:
This is not necessarily a good thing. Because you want to minimize
disk usage, you are missing out on MySQL's ability to process large
files very quickly. I would suggest an alternative approach:

1. Copy the remote file to your local disk.
2. Use LOAD DATA INFILE to load the data into MySQL.
3. Delete the data file, if necessary.


Interesting approach. It would be fast and low on resources (although it
would require using the filesystem, perhaps that isn't too big of a
deal). The only downside is that it is MySQL-specific. Currently this
application is database-independent and it would be nice to keep it that
way. I'll keep it in mind.

Eric
Jan 27 '06 #6


Eric Anderson wrote:
d wrote:
As chung was hinting at, use the FTP wrapper by simply opening the file with
fopen() as you would a local file. You can then use fgets() to read a
single line at a time, process that line, and repeat until the file has been
read in its entirety.


I thought about that, but I am also reading a large number of files from
the FTP server into the database, and my assumption is that if I use
fopen() it will log in to the FTP server every time instead of just once,
adding a good bit of overhead.

Thanks for the suggestion though. I'll keep it in mind.

Eric


Good point. On the other hand, the file transfer would happen concurrently
with the database inserts. If the file is large, the total time for the
entire operation would be lower.

Jan 27 '06 #7

NC
Eric Anderson wrote:
NC wrote:
This is not necessarily a good thing. Because you want to minimize
disk usage, you are missing out on MySQL's ability to process large
files very quickly. I would suggest an alternative approach:

1. Copy the remote file to your local disk.
2. Use LOAD DATA INFILE to load the data into MySQL.
3. Delete the data file, if necessary.


Interesting approach. It would be fast and low on resources (although it
would require using the filesystem, perhaps that isn't too big of a
deal). The only downside is that it is MySQL-specific. Currently this
application is database-independent and it would be nice to keep it that
way.


If memory serves, all SQL databases support import of text files. The
query syntax may differ, but the concept is clearly there...

Cheers,
NC

Jan 28 '06 #8

NC wrote:
Eric Anderson wrote:
NC wrote:
This is not necessarily a good thing. Because you want to minimize
disk usage, you are missing out on MySQL's ability to process large
files very quickly. I would suggest an alternative approach:

1. Copy the remote file to your local disk.
2. Use LOAD DATA INFILE to load the data into MySQL.
3. Delete the data file, if necessary.

Interesting approach. It would be fast and low on resources (although it
would require using the filesystem, perhaps that isn't too big of a
deal). The only downside is that it is MySQL-specific. Currently this
application is database-independent and it would be nice to keep it that
way.


If memory serves, all SQL databases support import of text files. The
query syntax may differ, but the concept is clearly there...


I went for a modification of this method. I got to thinking that PHP
variables are probably not designed to store large amounts of data. So
even though Stream_Var is convenient, it is probably not efficient,
because as new data comes in from the FTP server PHP is presumably
reallocating memory continuously, which really drags performance down.
So instead I have ftp_fget() write to disk, to whatever handle tmpfile()
gives me, and then use fgetcsv() to read that back in from disk and
insert into the database as before.

This change reduced the import time from about 3 hours to about 10
minutes! It still seems like the FTP library should offer a way to stream
FTP data directly into a consuming function such as fgetcsv() without
having to read the entire file into memory at once, but the tmpfile()
workaround performs well, so I am happy.
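
For reference, roughly what the working version looks like (a sketch only;
connection details, file path and table layout are placeholders):

<?php
// Sketch of the tmpfile() workaround (PHP 4.3). Connection details, file path
// and table layout are placeholders.
require_once 'DB.php';

$ftp = ftp_connect('ftp.example.com');
ftp_login($ftp, 'user', 'pass');
$db = DB::connect('mysql://user:pass@localhost/mydb');

// Spool the remote file into an anonymous temp file on disk...
$tmp = tmpfile();
ftp_fget($ftp, $tmp, 'exports/data.txt', FTP_ASCII);
ftp_close($ftp);
rewind($tmp);

// ...then read it back one line at a time
while ($row = fgetcsv($tmp, 8192, "\t")) {
    $db->query('INSERT INTO items (id, name, price) VALUES (?, ?, ?)', $row);
}
fclose($tmp); // the temp file is removed automatically when the handle closes
?>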

Eric
Jan 30 '06 #9
