Bytes | Software Development & Data Engineering Community

Stream from FTP directly to MySQL while parsing CSV

I have some files that sit on an FTP server. These files contain data
stored in a tab-separated format. I need to download these files and
insert/update their contents in a MySQL database. My current basic
strategy is the following:

1) Log in to the FTP server using PHP's FTP library.
2) Create a variable that acts like a file handle using Stream_Var from PEAR.
3) Use ftp_fget() to read a remote file into this variable (so I don't
have to write it to disk).
4) Parse the data now stored in memory using fgetcsv() (again treating
that variable as a file handle via Stream_Var). This produces an array.
5) Insert/update the data from the array using DB from PEAR.
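For reference, the five steps above might look roughly like this. The host, credentials, file name, and table are invented for illustration, and the Stream_Var registration follows my reading of the PEAR package's documentation, so treat this as a sketch rather than tested code:

```php
<?php
// Sketch of the all-in-memory pipeline (hypothetical server and table).
require_once 'Stream/Var.php';
require_once 'DB.php';

stream_wrapper_register('var', 'Stream_Var');   // expose PHP variables as streams

$GLOBALS['csvdata'] = '';
$buf = fopen('var://GLOBALS/csvdata', 'w');     // 2) variable acting as a file handle

$ftp = ftp_connect('ftp.example.com');          // 1) log in to the FTP server
ftp_login($ftp, 'user', 'secret');
ftp_fget($ftp, $buf, 'data.tsv', FTP_ASCII);    // 3) whole file lands in the variable
fclose($buf);
ftp_close($ftp);

$db = DB::connect('mysql://user:secret@localhost/mydb');
$in = fopen('var://GLOBALS/csvdata', 'r');
while (($row = fgetcsv($in, 4096, "\t")) !== false) {  // 4) parse tab-separated lines
    $db->query('REPLACE INTO items (id, name) VALUES (?, ?)', $row);  // 5) upsert
}
fclose($in);
?>
```

The memory problem is visible in step 3: the entire remote file accumulates in `$GLOBALS['csvdata']` before any row is parsed.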

This all works, and it means I don't have to write anything to disk.
Everything is handled in memory, so no temp files are needed. The
downside is that some of these files are very large, so the program can
consume large amounts of memory. I want to see what I can do to reduce
this memory usage.

In a perfect world I wouldn't need to keep the entire file in memory. As
soon as a single line is read via FTP, I should be able to pass that line
off to the CSV parsing code, and the MySQL insert/update should take
place as each line is parsed by the CSV library. I.e., I should never
have more than a buffer's worth of data in memory at a time. The buffer
would need to hold at least one entire line, but my memory requirements
would drop significantly.

My problem is that I can't figure out how to do this with the current
PHP libraries. It seems that most functions in PHP are not designed
around the idea of piping streams of information together.

The other restriction is that I am limited to PHP 4.3. Any ideas, or is
holding the entire file in memory the best approach (other than writing
my own libraries)?

Eric
Jan 27 '06 #1
Eric Anderson wrote:
I have some files that sit on a FTP server. These files contain data
stored in a tab-separated format. I need to download these files and
insert/update them in a MySQL database. <snip>


Is there a reason why you can't use the built-in FTP stream wrapper?

Jan 27 '06 #2
NC
Eric Anderson wrote:

<snip>
This all seems to work and it means I don't have to write anything to
disk.


This is not necessarily a good thing. Because you want to minimize disk
usage, you are missing out on MySQL's ability to process large files
very quickly. I would suggest an alternative approach:

1. Copy the remote file to your local disk.
2. Use LOAD DATA INFILE to load the data into MySQL.
3. Delete the data file, if necessary.
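Those three steps might be sketched like this (hypothetical host, credentials, paths, and table; PHP 4-era mysql_* functions assumed, since the thread targets PHP 4.3):

```php
<?php
// Sketch of the download-then-bulk-load approach.
$local = '/tmp/data.tsv';

$ftp = ftp_connect('ftp.example.com');      // 1. copy the remote file to local disk
ftp_login($ftp, 'user', 'secret');
ftp_get($ftp, $local, 'data.tsv', FTP_ASCII);
ftp_close($ftp);

$link = mysql_connect('localhost', 'user', 'secret');
mysql_select_db('mydb', $link);

// 2. let MySQL parse the tab-separated file itself
mysql_query("LOAD DATA LOCAL INFILE '/tmp/data.tsv'
             REPLACE INTO TABLE items
             FIELDS TERMINATED BY '\\t'
             LINES TERMINATED BY '\\n'", $link);

unlink($local);                             // 3. delete the data file
?>
```

Note that `LOCAL` requires the server (and client library) to allow local-infile, and `REPLACE INTO TABLE` handles the insert-or-update requirement at the cost of being MySQL-specific syntax.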

Cheers,
NC

Jan 27 '06 #3
d
"Eric Anderson" <er**@afaik.us> wrote in message
news:43**********************@titian.nntpserver.com...
I have some files that sit on a FTP server. These files contain data stored
in a tab-separated format. I need to download these files and insert/update
them in a MySQL database. <snip>


As chung was hinting at, use the FTP wrapper by simply opening the file with
fopen() as you would a local file. You can then use fgets() to read a single
line at a time, process that line, and repeat until the file has been read
in its entirety.
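In fact fgetcsv() accepts any readable stream, so the fgets() step can be skipped and only one line ever sits in memory. A minimal sketch, with a made-up URL and the row handling left as a comment:

```php
<?php
// Line-at-a-time processing via PHP's built-in ftp:// stream wrapper
// (hypothetical credentials and file name).
$fp = fopen('ftp://user:secret@ftp.example.com/data.tsv', 'r');
if (!$fp) {
    die('could not open remote file');
}
while (!feof($fp)) {
    $row = fgetcsv($fp, 4096, "\t");   // reads and parses one line at a time
    if ($row !== false) {
        // insert/update $row here; memory use stays at one line plus buffers
    }
}
fclose($fp);
?>
```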

dave
Jan 27 '06 #4
d wrote:
As chung was hinting at, use the FTP wrapper by simply opening the file with
fopen() as you would a local file. You can then use fgets() to read a
single line at a time, process that line, and repeat until the file has been
read in its entirety.


I thought about that, but I am also reading a large number of files from
the FTP server into the database, and my assumption is that if I use
fopen() it will log in to the FTP server every time instead of just once,
adding a good bit of overhead.

Thanks for the suggestion though. I'll keep it in mind.

Eric
Jan 27 '06 #5
NC wrote:
This is not necessarily a good thing. Because you want to minimize disk
usage, you are missing out on MySQL's ability to process large files
very quickly. I would suggest an alternative approach:

1. Copy the remote file to your local disk.
2. Use LOAD DATA INFILE to load the data into MySQL.
3. Delete the data file, if necessary.


Interesting approach. It would be fast and low on resources (although it
would require using the filesystem, but perhaps that isn't too big a
deal). The only downside is that it is MySQL-specific. Currently this
application is database-independent and it would be nice to keep it that
way. I'll keep it in mind.

Eric
Jan 27 '06 #6

Eric Anderson wrote:
d wrote:
As chung was hinting at, use the FTP wrapper by simply opening the file with
fopen() as you would a local file. <snip>

I thought about that but I am also reading a large number of files from
the FTP server into the database, and my assumption is that if I use
fopen it will log in to the FTP server every time vs just once, adding to
the overhead a good bit.


Good point. On the other hand, the file transfer would happen concurrently
with the database inserts. If the file is large, the total time for the
entire operation would be lower.

Jan 27 '06 #7
NC
Eric Anderson wrote:
<snip> The only downside is that it is MySQL specific. Currently this
application is database independent and it would be nice to keep it that
way.


If memory serves, all SQL databases support import of text files. The
query syntax may differ, but the concept is clearly there...

Cheers,
NC

Jan 28 '06 #8
NC wrote:
<snip>

If memory serves, all SQL databases support import of text files. The
query syntax may differ, but the concept is clearly there...


I went with a modification of this method. I got to thinking that PHP
variables are probably not designed to store large amounts of data, so
even though Stream_Var is convenient, it is probably not efficient: as
new data is read in from the FTP server, PHP is probably reallocating
memory continuously, causing performance to drag. So instead I have
ftp_fget() write to a temporary file on disk (from tmpfile()) and then
use fgetcsv() to read that back from disk and insert into the database
as before.
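A self-contained demo of the tmpfile()/fgetcsv() step is below; in the real script the temp file would be filled by ftp_fget() rather than fwrite(), and each row would feed an insert/update instead of printf(). The sample data and column layout are invented:

```php
<?php
// Demo of the tmpfile()+fgetcsv() workaround described above.
$tmp = tmpfile();                         // anonymous temp file, removed on fclose()
fwrite($tmp, "1\twidget\n2\tgadget\n");   // stands in for ftp_fget($ftp, $tmp, ...)
rewind($tmp);                             // the file pointer is left at EOF after writing

while (($row = fgetcsv($tmp, 4096, "\t")) !== false) {
    // in the real import, each $row feeds the insert/update query
    printf("id=%s name=%s\n", $row[0], $row[1]);
}
fclose($tmp);                             // temp file disappears here
?>
```

The rewind() call is the easy thing to forget: after ftp_fget() (or fwrite()) the handle points at the end of the file, so without it fgetcsv() returns nothing.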

This change reduced the import time from about 3 hours to about 10
minutes! It still seems like the FTP library should offer a way to
stream FTP data directly into a consuming function such as fgetcsv()
without reading the entire file into memory at once, but the tmpfile()
workaround performs well, so I am happy.

Eric
Jan 30 '06 #9
