Bytes IT Community

lo_import for bytea columns

Is there an equivalent function for bytea columns that works like
lo_import?

Alternatively, is there a way to copy from a large object to a bytea
column from SQL?

Or maybe someone has another way of attacking this problem:

I've got some Perl code that does this:

undef $/;                 # disable the input record separator: slurp mode
$data = <FHFOR89MBFILE>;  # read the entire file into one scalar
$sth = $dbh->prepare("insert into data (bigbyteacolumn) values (?)");
$sth->bind_param(1, $data, DBI::SQL_BINARY);
$sth->execute;

This has worked fine for a while with file sizes around 10MB.

However, now I have someone who wants to use this for a file that's 89MB,
and it's taking up about 500MB of memory before crashing. I'm trying to
find a less memory-hungry way of handling this, even if just as a
temporary hack for this one file. I think what's happening is that Perl
is reading in the 89MB, and then either Perl or the driver is converting
that into a fully-escaped string for transfer, and this is where the
problem is occurring.

Any ideas?

Thanks,

Jonathan Bartlett
---------------------------(end of broadcast)---------------------------
TIP 9: the planner will ignore your desire to choose an index scan if your
joining column's datatypes do not match

Nov 12 '05 #1
1 Reply


Jonathan Bartlett wrote:
However, now I have someone who wants to use this for a file that's 89MB,
and it's taking up about 500M of memory before crashing. I'm trying to
find a less-memory-consuming way of handling this, even if just for a
temporary hack for this one file. I think what's happening is that Perl
is reading in the 89M, and then I'm guessing that either Perl or the
driver is converting that into a fully-escaped string for transfer, and
this is where the problem is occurring.

Any ideas?


If you can use 7.4, then see:
http://www.postgresql.org/docs/curre...ibpq-exec.html

Specifically look at the PQexecParams() function. You could write a
simple command line program that uses PQexecParams() to insert your
large files. Or maybe persuade the maintainers of the Perl Postgres DBI
driver to support it.
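(Such a command-line program might look roughly like the sketch below. This is an untested illustration, not Joe's actual code: the table and column names are taken from the original post, the conninfo string is whatever your setup needs, and error handling is minimal. Passing the parameter in binary format means the data is sent as-is, with no escaped copy.)

```c
/* Sketch: load a file into a bytea column via PQexecParams.
 * Build with something like: cc load_bytea.c -lpq
 */
#include <stdio.h>
#include <stdlib.h>
#include <libpq-fe.h>

int main(int argc, char **argv) {
    if (argc != 3) {
        fprintf(stderr, "usage: %s conninfo file\n", argv[0]);
        return 1;
    }

    /* Read the whole file into one buffer: a single copy, no escaping. */
    FILE *fp = fopen(argv[2], "rb");
    if (!fp) { perror("fopen"); return 1; }
    fseek(fp, 0, SEEK_END);
    long size = ftell(fp);
    rewind(fp);
    char *buf = malloc(size);
    if (!buf || fread(buf, 1, size, fp) != (size_t)size) {
        fprintf(stderr, "read failed\n");
        return 1;
    }
    fclose(fp);

    PGconn *conn = PQconnectdb(argv[1]);
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "connect: %s", PQerrorMessage(conn));
        return 1;
    }

    const char *values[1]  = { buf };
    int         lengths[1] = { (int)size };
    int         formats[1] = { 1 };   /* 1 = binary parameter format */

    PGresult *res = PQexecParams(conn,
        "insert into data (bigbyteacolumn) values ($1)",
        1,      /* one parameter */
        NULL,   /* let the server infer the parameter type */
        values, lengths, formats,
        0);     /* text-format result */

    if (PQresultStatus(res) != PGRES_COMMAND_OK)
        fprintf(stderr, "insert: %s", PQerrorMessage(conn));

    PQclear(res);
    PQfinish(conn);
    free(buf);
    return 0;
}
```

The key point is the formats array: with the parameter marked binary, libpq ships the buffer over the wire unmodified instead of building a fully-escaped string literal.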

Joe
---------------------------(end of broadcast)---------------------------
TIP 8: explain analyze is your friend

Nov 12 '05 #2
