HttpWebRequest to save images

I am running a Windows Forms application on my PC to collect data from
websites, including images.

I have hit a problem with images and JavaScript, and I would appreciate any
help.

The current code fails with a copy error. My Internet-connected PC is not
the development machine, so I cannot debug on that PC, and my other PC has
no Internet connection, so I cannot debug in .NET as I normally would.
My Code:
private string copyWebImage(string url, string TargetFile)
{
    // add error handling
    StringBuilder sb = new StringBuilder();
    byte[] buf = new byte[8192];
    try
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);

        request.Timeout = 5000;
        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        Stream responseStream = response.GetResponseStream();
        StreamReader myTextReader = new StreamReader(responseStream);
        char[] strBuffer = new char[25];
        myTextReader.ReadBlock(strBuffer, 0, 25);
        string stringBuffer = new string(strBuffer);
        // get image name from tiff or jpg or gif or jpeg
        // insert name into database
        // store image on pc as standard name

        if (stringBuffer.IndexOf("GIF8") > -1 ||
            stringBuffer.IndexOf("JFIF") > -1)
        {
            // image found
            //Image thisImage = Image.FromStream(responseStream);
            //thisImage.Save(res
            //thisImage.
            //Stream ToStream = File.Create(TargetFile);
            //thisImage.Save(ToStream, System.Drawing.Imaging.ImageFormat.Gif);
            //BinaryReader br = new BinaryReader(responseStream);
            byte[] b;
            using (BinaryReader br = new BinaryReader(responseStream))
            {
                b = br.ReadBytes((int)responseStream.Length);
                br.Close();
            }
            FileStream fs = new FileStream(TargetFile, FileMode.Create);
            BinaryWriter w = new BinaryWriter(fs);
            w.Write(b);

            //BinaryWriter bw = new BinaryWriter(ToStream);
            //bw.Write(br.ReadBytes((int)responseStream.Length));
            //bw.Flush();
            //bw.Close();
            //ToStream.Close();
            //br.Close();
        }
        else
        {
            // this is a text page
        }
    }
    catch (WebException e)
    {
        return "image copy failed error " + e.ToString();
    }
    catch (Exception e)
    {
        return "image copy failed error " + e.ToString();
    }

    return "";
}

Sep 17 '07 #1
Logician,

I don't see why you are doing all this work to find the index of GIF8 or
JFIF. I know you are trying to read the image header to determine what type
it is, so that you can load it into an Image instance and then save that,
but why not just save the contents of the response stream to disk directly,
since that is what you end up doing anyway?
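
Something along these lines should do it (untested sketch; SaveImageToFile
is just a placeholder name, and it assumes System.Net and System.IO are
imported):

private static void SaveImageToFile(string url, string targetFile)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Timeout = 5000;

    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (Stream responseStream = response.GetResponseStream())
    using (FileStream fs = new FileStream(targetFile, FileMode.Create))
    {
        // Copy the response to the file in chunks; the response stream
        // does not support Length, so read until it is exhausted.
        byte[] buffer = new byte[8192];
        int bytesRead;
        while ((bytesRead = responseStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            fs.Write(buffer, 0, bytesRead);
        }
    }
}
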
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com

"Logician" <sa***@logicians.comwrote in message
news:11*********************@19g2000hsx.googlegrou ps.com...
[quoted original post snipped]

Sep 17 '07 #2
As Nick indicated, all you need to do is call the
WebClient.DownloadData(urlOfImage) method and save the resulting byte[]
array to disk under the name of the image. Essentially, two lines of code.
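
Roughly (untested, and localFileName here is just a placeholder):

// Download the image bytes and write them to disk
// (needs System.Net and System.IO).
byte[] imageBytes = new WebClient().DownloadData(urlOfImage);
File.WriteAllBytes(localFileName, imageBytes);
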
-- Peter
Recursion: see Recursion
site: http://www.eggheadcafe.com
unBlog: http://petesbloggerama.blogspot.com
BlogMetaFinder: http://www.blogmetafinder.com

"Logician" wrote:
[quoted original post snipped]

Sep 17 '07 #3
On Sep 17, 4:12 pm, Peter Bromberg [C# MVP]
<pbromb...@yahoo.yohohhoandabottleofrum.com> wrote:
[quoted text snipped]
It works with DownloadFile(url, imageName). Thanks for the suggestion
about using WebClient.
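
For reference, a minimal sketch of the call that worked (imageName is the
local target file name):

// Download the image straight to a local file (needs System.Net).
using (WebClient client = new WebClient())
{
    client.DownloadFile(url, imageName);
}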

I want to run this under UNIX and also to implement JavaScript link
following, which is common on many websites.

Can C# work well under UNIX? Do you have experience of using WebClient
with .js-based links?

Sep 17 '07 #4
Logician wrote:
[quoted text snipped]
I would expect both (Http)WebRequest and WebClient to work with
Mono on Unix (assuming Mono is available for your flavor of Unix).

The code will not interpret JavaScript.

Maybe you can do that by embedding a browser in your app. But that
will not work on Unix.

Arne

Oct 2 '07 #5
