go***********@burditt.org (Gordon Burditt) wrote in message news:<cg********@library2.airnews.net>...
>> I'm developing a site which will periodically update inventory via a
>> large batch file. I'm wondering if anyone has ideas, caveats, etc. on
>> how to update the inventory. At the moment, I'm simply uploading the
>> batch file via a web interface and in that same PHP page executing SQL
>> to populate the database with the data from the batch file. Is this
> I'd talk SQL directly to the database with whatever protocol the
> database provides to update the inventory. You could do this from
> the same machine the database is on (ftp the file in and have a
> script (possibly standalone PHP) see it and process it), or run the
> database client remotely (MySQL offers remote database connections,
> and you can use SSL if you want more security).
>
> Database transaction features may be desirable to lock out web
> access while the inventory is in the middle of updating.
>
> A "large batch file" update doesn't sound like something you want
> to do in one HTTP transaction, especially given that you probably
> want PHP's execution time limit for most of the rest of the site.
>> the best way? Should I write a shell script to do it? Should it be
>> done via the web, or should someone have to log into the box to do it?
>> In all, it takes about 30 seconds to do the import, if that makes any
>> difference.
> You haven't said much about security issues. How sensitive is it if
> someone grabs a copy of the data? How about if someone alters it
> maliciously?
>
> Gordon L. Burditt
Thanks for the replies guys. Confirms hunches I had, which helps me
sleep better ;)
Regarding the security: if someone alters it maliciously, it would
definitely be a bad thing. If I get the file to the server via SCP or
HTTPS, I assume it should be okay in transit. Correct me if I'm wrong.
The problem with connecting directly to MySQL remotely is that I've got
to make it so a non-programmer can do the inventory updating. Perhaps
stored procedures would make this okay, but the app is using MySQL 4
and, IIRC, they're only available in MySQL 5. As far as I know there's
no non-programmer-friendly way of doing this, is there?
For the moment, I'm thinking of writing a command-line PHP script which
would do the updating. The user would log in via ssh, upload the file
via scp, and run the script. At the moment the script would:
- create a temp table
- import the data into the temp table via LOAD DATA INFILE
- do a bunch of validation on the data in the temp table
- for each row, run an insert per destination table to distribute the
data across the database, making sure the inserts are atomic. (Is
there an alternative to running x inserts per row, where x is the
number of different tables the data has to go into?)
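On that last point, one set-based alternative is a single INSERT ... SELECT
per destination table, which moves every validated staging row in one
statement instead of x per-row inserts. A rough sketch of the whole
sequence, assuming MySQL 4.x with InnoDB tables so transactions work (all
table, column, and file names below are made-up placeholders):

```sql
-- Sketch only: staging schema, file path, and destination tables are
-- hypothetical; adapt to the real batch file layout.
CREATE TEMPORARY TABLE staging (
    sku   VARCHAR(32)   NOT NULL,
    descr VARCHAR(255)  NOT NULL,
    qty   INT           NOT NULL,
    price DECIMAL(10,2) NOT NULL
);

LOAD DATA LOCAL INFILE '/tmp/inventory.csv'
INTO TABLE staging
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';

-- ...validation queries against staging go here...

BEGIN;

-- One INSERT ... SELECT per destination table replaces x single-row
-- inserts per row: each statement copies all staged rows at once.
INSERT INTO products (sku, descr, price)
SELECT sku, descr, price FROM staging;

INSERT INTO stock_levels (sku, qty)
SELECT sku, qty FROM staging;

COMMIT;

DROP TABLE staging;
```

Caveats: if the batch can update existing rows rather than only add new
ones, you'd need REPLACE INTO ... SELECT or per-row UPDATEs instead; and
if the destination tables are MyISAM, BEGIN/COMMIT are no-ops, so LOCK
TABLES is the closest substitute for keeping the import atomic.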
I was thinking of doing all this via a webpage using shell_exec(), but
I need to get responses from the procedure in case something goes
wrong, and I'd run into the PHP script execution timeout issue again.
Does this sound like a solid way of doing things? Anything I'm
missing? Is there a more user-friendly way of doing it, so ye olde
non-programmer doesn't have to run a shell script?
Thanks again for the suggestions