Limit on SQLDataReader w/ Blobs

I use a filtered SELECT to populate the SQLDataReader (rdr) with
a filename and a blob (pdf). I then use File.WriteAllBytes to write
each pdf to disk.

----------------------------------------
rdr = command.ExecuteReader();

while (rdr.Read())
{
    byte[] BinaryImage = (byte[])rdr["attachment_file"];
    File.WriteAllBytes("\\" + rdr["fn"].ToString(), BinaryImage);
}
----------------------------------------

I tested the code on a small resultset and it works as expected. My
client tried it on the live database (which I can't access), and he
thinks it has some built-in upper limit on the number of rows it can
process, because it does not appear to write all of the files returned
by the SELECT. Rather than adding rows to my test database, I was
hoping someone could shed light on any upper limit this approach
might have and suggest a workaround.

Thanks

Oct 4 '07 #1
A SQL Server BLOB (the IMAGE type) can be up to 2 GB.

There are lots of things that can prevent you from reading
such a big data item in a single call.

You can try using a loop and, via rdr.GetBytes, read 100 KB or 1 MB
at a time and write the chunks to the file.
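For illustration, a minimal sketch of that chunked approach for a single row - the column ordinals (0 for fn, 1 for attachment_file) and the 100 KB chunk size are assumptions, not something taken from the original post:

----------------------------------------
// Sketch only: stream one BLOB column to disk in 100 KB chunks
// instead of materialising it as a single byte[].
byte[] buffer = new byte[100 * 1024];

using (FileStream fs = File.Create("\\" + rdr["fn"].ToString()))
{
    long offset = 0;
    long read;
    // GetBytes copies up to buffer.Length bytes, starting at 'offset'
    // within the BLOB, and returns the number of bytes actually read.
    while ((read = rdr.GetBytes(1, offset, buffer, 0, buffer.Length)) > 0)
    {
        fs.Write(buffer, 0, (int)read);
        offset += read;
    }
}
----------------------------------------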

Arne
Oct 4 '07 #2
>You can try using a loop and, via rdr.GetBytes, read 100 KB or 1 MB
>at a time and write the chunks to the file.
Additional: when handling large BLOBs, it is a good idea to specify
CommandBehavior.SequentialAccess when creating the reader; this allows it to
work more efficiently, as it doesn't try to load the entire row into
memory - just the bytes you are reading at that moment. Of course,
with the GetBytes() approach you are forced to use the slightly longer
version of writing, using a FileStream.
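For example, the behaviour is requested when the reader is created (a short sketch; command is assumed to be the SqlCommand from the earlier code):

----------------------------------------
// Sketch: ask for sequential access so rows are streamed rather than
// buffered in full. Columns must then be read in SELECT order, and the
// BLOB should be read with GetBytes rather than rdr["attachment_file"].
using (SqlDataReader rdr = command.ExecuteReader(CommandBehavior.SequentialAccess))
{
    // ... read fn first, then stream attachment_file with GetBytes ...
}
----------------------------------------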
>called IMAGE
Or varbinary(max) ;-p

Marc

Oct 4 '07 #3
Marc Gravell wrote:
>>called IMAGE

>Or varbinary(max) ;-p
Not by people of my age.

:-)

Arne
Oct 4 '07 #4
>A SQL Server BLOB (the IMAGE type) can be up to 2 GB.
>
>There are lots of things that can prevent you from reading
>such a big data item in a single call.
>
>You can try using a loop and, via rdr.GetBytes, read 100 KB or 1 MB
>at a time and write the chunks to the file.
I reviewed all the file sizes and the largest was 4 MB. There must be
a maximum SqlDataReader size, but a quick Google search did not turn up
the value.

Oct 4 '07 #5
It is conceivable that the limit is in the line:
rdr["attachment_file"];

For BLOBs, I would be using GetBytes() successively with increasing
offsets (into a re-usable buffer).

Marc

Oct 4 '07 #6
>It is conceivable that the limit is in the line:
>rdr["attachment_file"];
>
>For BLOBs, I would be using GetBytes() successively with increasing
>offsets (into a re-usable buffer).
Can you point me to an example of this approach?

Thanks
Oct 5 '07 #7
OK: http://weblogs.asp.net/cazzu/archive.../27/25568.aspx

The context here is streaming an image from the database to a browser -
however, the BLOB reading is the same.

Personally, I think the "read" step could be neater - it makes the
not-strictly-true assumption that the buffer must be filled; I'd go with:
while ((size = r.GetBytes(1, idx, buffer, 0, ChunkSize)) > 0)
{
    context.Response.OutputStream.Write(buffer, 0, (int)size);
    idx += size;
}

(and remove the "last bytes" stuff down to the end-brace, since we
have dealt with them above).

"Replace context.Respons e" with your own stream / writer...

Marc

Oct 5 '07 #8

