Bytes | Developer Community

Problem with CONCAT function (bug?)

I need to insert large BLOBs into a database. With the 1 MB packet
limit, sending larger amounts of data would be difficult, so I had a neat
idea. I would do an initial insert of an empty record, get the
auto-increment ID from the response, and then loop through, appending data
to the record.

My table is simple. One unsigned int auto_increment field (DataID), and
one long blob field (BinaryData).

When I loop through the data to send, I run:

UPDATE BinaryTable SET BinaryData=CONCAT(BinaryData, 'My binary data
here') WHERE DataID = 35

In the binary data I insert, I escape null characters, backslashes, and
single and double quotes. The data seems to insert fine.
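The scheme described above can be sketched as follows. This is a hedged, self-contained sketch, not the actual code from this thread: it uses Python's sqlite3 module so it runs standalone, whereas the real code talks to MySQL, where the append would be `CONCAT(BinaryData, ...)`. SQLite's `||` operator treats its operands as text, so the sketch sticks to printable payload data; the table and column names match the post, everything else is illustrative.

```python
import sqlite3

# Self-contained stand-in for the MySQL table described in the post:
# one auto-increment ID and one blob column.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE BinaryTable ("
    " DataID INTEGER PRIMARY KEY AUTOINCREMENT,"
    " BinaryData BLOB)"
)

# Step 1: initial insert of an empty record; grab the auto-increment ID.
cur = conn.execute("INSERT INTO BinaryTable (BinaryData) VALUES (x'')")
data_id = cur.lastrowid

# Step 2: loop through the payload, appending one chunk per UPDATE.
# MySQL would spell the append CONCAT(BinaryData, %s); SQLite uses ||,
# with a CAST back to BLOB since || yields text.
payload = b"q" * 4096            # stand-in payload (printable on purpose)
CHUNK = 1024                     # the post uses 4k and 400k blocks
for off in range(0, len(payload), CHUNK):
    conn.execute(
        "UPDATE BinaryTable "
        "SET BinaryData = CAST(BinaryData || ? AS BLOB) "
        "WHERE DataID = ?",
        (payload[off:off + CHUNK], data_id),
    )

stored = conn.execute(
    "SELECT BinaryData FROM BinaryTable WHERE DataID = ?", (data_id,)
).fetchone()[0]
```

Note that parameter binding (the `?` placeholders) sidesteps the manual escaping of null characters, backslashes, and quotes mentioned above; it is the chunked CONCAT-style append itself that this sketch demonstrates.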

The problem is that as I increase the amount of data in the field, CONCAT
seems to drop all but the last 416k of the data. Thus if I loop through
adding 400k blocks at a time (which I do), I am left with at most 800k of
data in the blob field.

Using 4k blocks I end up with 419k of data in the field when all is said
and done.

Please let me know if/when this will be fixed, whether there is a
workaround that might be used, or whether a better way to insert BLOB
data is known.

- Garrett Kajmowicz
gk******@tbaytel.net
Jul 20 '05 #1
> Using 4k blocks I end up with 419k of data in the field when all is said
and done.


Inserting data which is only NULL characters in 4k blocks allows me to get
a maximum stored data length of 906.3 KB.

Inserting data which is all 'q's (just the letter q) gets me a total
inserted size of 653.2 KB. This is if the file is 4890340 bytes
long. If the file is 5000000 bytes long, the total data inserted is 789.1
KB.

It seems that although the total inserted size varies, it is always under
1 MB.

This is getting more and more confusing. Help is greatly appreciated.

- Garrett Kajmowicz
Jul 20 '05 #2
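One observation on the figures above (an assumption, not something confirmed in this thread): every total lands just under 1 MB, the same ceiling as the packet limit mentioned in the original post. In MySQL, the max_allowed_packet setting also caps the result length of string functions such as CONCAT, so it may be worth inspecting:

```sql
-- Check the current limit; the ~1 MB ceiling observed above matches
-- the historical 1 MB default.
SHOW VARIABLES LIKE 'max_allowed_packet';

-- If it reports 1048576, raise it in my.cnf and restart the server:
-- [mysqld]
-- max_allowed_packet = 16M
```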
