
HttpWebRequest to save images

I am running a Windows Forms application on my PC to collect data from websites,
including images.

I have hit a problem with images and JavaScript, and I would appreciate any
help.

The current code fails with a copy error. My Internet-connected PC is not the
development machine, and my other PC has no Internet connection, so I cannot
debug in .NET as I normally would.
My code:
private string copyWebImage(string url, string TargetFile)
{
    // add error handling
    StringBuilder sb = new StringBuilder();
    byte[] buf = new byte[8192];
    try
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);

        request.Timeout = 5000;
        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        Stream responseStream = response.GetResponseStream();
        StreamReader myTextReader = new StreamReader(responseStream);
        char[] strBuffer = new char[25];
        myTextReader.ReadBlock(strBuffer, 0, 25);
        string stringBuffer = new string(strBuffer);
        // get image name from tiff or jpg or gif or jpeg
        // insert name into database
        // store image on pc as standard name

        if (stringBuffer.IndexOf("GIF8") > -1 ||
            stringBuffer.IndexOf("JFIF") > -1)
        {
            // image found
            //Image thisImage = Image.FromStream(responseStream);
            //thisImage.Save(res
            //thisImage.
            //Stream ToStream = File.Create(TargetFile);
            //thisImage.Save(ToStream, System.Drawing.Imaging.ImageFormat.Gif);
            //BinaryReader br = new BinaryReader(responseStream);
            byte[] b;
            using (BinaryReader br = new BinaryReader(responseStream))
            {
                b = br.ReadBytes((int)responseStream.Length);
                br.Close();
            }
            FileStream fs = new FileStream(TargetFile, FileMode.Create);
            BinaryWriter w = new BinaryWriter(fs);
            w.Write(b);

            //BinaryWriter bw = new BinaryWriter(ToStream);
            //bw.Write(br.ReadBytes((int)responseStream.Length));
            //bw.Flush();
            //bw.Close();
            //ToStream.Close();
            //br.Close();
        }
        else
        {
            // this is a text page
        }
    }
    catch (WebException e)
    {
        return "image copy failed error " + e.ToString();
    }
    catch (Exception e)
    {
        return "image copy failed error " + e.ToString();
    }

    return "";
}

Sep 17 '07 #1
Logician,

I don't see why you are doing all this work to find the index of "GIF8" or
"JFIF". I understand you are reading the image header to determine the type so
that you can load it into an Image instance and then save that, but why not
just write the response contents to disk directly, since that is what you end
up doing anyway?
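
For illustration, a rough sketch of that approach (untested here; the helper
method name and the chunked copy loop are illustrative, not taken from the
post above) might look like:

using System.IO;
using System.Net;

class ImageSaver
{
    // Writes whatever the server returns for 'url' straight into 'targetFile'.
    static void SaveUrlToFile(string url, string targetFile)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Timeout = 5000;

        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (Stream responseStream = response.GetResponseStream())
        using (FileStream fs = new FileStream(targetFile, FileMode.Create))
        {
            byte[] buffer = new byte[8192];
            int read;
            // Copy in chunks; Length is not supported on a raw network stream.
            while ((read = responseStream.Read(buffer, 0, buffer.Length)) > 0)
            {
                fs.Write(buffer, 0, read);
            }
        }
    }
}

If you still need to tell GIF from JPEG, you can sniff the first few bytes of
the saved file afterwards instead of consuming the start of the response
stream with a StreamReader.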
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com

"Logician" <sa***@logician s.comwrote in message
news:11******** *************@1 9g2000hsx.googl egroups.com...
>I am running on my PC Windows Forms to collect data from websites,
including images.

I hit a problem with images and javascript, and I would appreciate any
help.

The current code fails with a copy error. My Internet connected PC is
not the development machine, so I cannot debug on that PC and my other
PC has no Internet connection. So I cannot debug using .NET as normal.
My Code:
private string copyWebImage(st ring url, string TargetFile)
{
// add error handling
StringBuilder sb = new StringBuilder() ;
byte[] buf = new byte[8192];
try
{
HttpWebRequest request = (HttpWebRequest )WebRequest.Cre ate(url);

request.Timeout =5000;
HttpWebResponse response =
(HttpWebRespons e)request.GetRe sponse();
Stream responseStream = response.GetRes ponseStream();
StreamReader myTextReader = new StreamReader(re sponseStream);
char[] strBuffer = new char[25];
myTextReader.Re adBlock(strBuff er,0,25);
string stringBuffer = new string(strBuffe r);
//get image name from tiff or jpg or gif or jpeg
// insert name into database
// store image on pc as standard name

if (stringBuffer.I ndexOf("GIF8")>-1 ||
stringBuffer.In dexOf("JFIF")>-1)
{

// image found
//Image thisImage = Image.FromStrea m(responseStrea m);
//thisImage.Save( res
//thisImage.
//Stream ToStream = File.Create(Tar getFile);
//
thisImage.Save( ToStream,System .Drawing.Imagin g.ImageFormat.G if);
//BinaryReader br = new BinaryReader(re sponseStream);
byte[] b;
using (BinaryReader br = new BinaryReader(re sponseStream))
{
b= br.ReadBytes((i nt)responseStre am.Length);
br.Close();
}
FileStream fs= new FileStream(Targ etFile,FileMode .Create);
BinaryWriter w = new BinaryWriter(fs );
w.Write(b);

// BinaryWriter bw = new BinaryWriter(To Stream);
// bw.Write(br.Rea dBytes((int)res ponseStream.Len gth));
// bw.Flush();
// bw.Close();
//ToStream.Close( );
// br.Close();

}
else
{
// this is a text page
}
}
catch (WebException e)
{
return "image copy failed error "+e.ToStrin g();
}
catch (Exception e)
{
return "image copy failed error "+e.ToStrin g();
}

return "";
}

Sep 17 '07 #2
As Nick indicated, all you need to do is call WebClient.DownloadData(urlOfImage)
and save the resulting byte[] array to disk under the image's name. Essentially
two lines of code.
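
Something along these lines, for example (a sketch only; the method wrapper
and variable names here are placeholders, not code from the thread):

using System.IO;
using System.Net;

class ImageDownloader
{
    static void Download(string urlOfImage, string targetFile)
    {
        using (WebClient client = new WebClient())
        {
            // Fetch the raw bytes and write them out unchanged.
            byte[] data = client.DownloadData(urlOfImage);
            File.WriteAllBytes(targetFile, data);
        }
    }
}

WebClient.DownloadFile(urlOfImage, targetFile) collapses both steps into a
single call if you never need the bytes in memory.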
-- Peter
Recursion: see Recursion
site: http://www.eggheadcafe.com
unBlog: http://petesbloggerama.blogspot.com
BlogMetaFinder: http://www.blogmetafinder.com

"Logician" wrote:
I am running on my PC Windows Forms to collect data from websites,
including images.

I hit a problem with images and javascript, and I would appreciate any
help.

The current code fails with a copy error. My Internet connected PC is
not the development machine, so I cannot debug on that PC and my other
PC has no Internet connection. So I cannot debug using .NET as normal.
My Code:
private string copyWebImage(st ring url, string TargetFile)
{
// add error handling
StringBuilder sb = new StringBuilder() ;
byte[] buf = new byte[8192];
try
{
HttpWebRequest request = (HttpWebRequest )WebRequest.Cre ate(url);

request.Timeout =5000;
HttpWebResponse response =
(HttpWebRespons e)request.GetRe sponse();
Stream responseStream = response.GetRes ponseStream();
StreamReader myTextReader = new StreamReader(re sponseStream);
char[] strBuffer = new char[25];
myTextReader.Re adBlock(strBuff er,0,25);
string stringBuffer = new string(strBuffe r);
//get image name from tiff or jpg or gif or jpeg
// insert name into database
// store image on pc as standard name

if (stringBuffer.I ndexOf("GIF8")>-1 ||
stringBuffer.In dexOf("JFIF")>-1)
{

// image found
//Image thisImage = Image.FromStrea m(responseStrea m);
//thisImage.Save( res
//thisImage.
//Stream ToStream = File.Create(Tar getFile);
//
thisImage.Save( ToStream,System .Drawing.Imagin g.ImageFormat.G if);
//BinaryReader br = new BinaryReader(re sponseStream);
byte[] b;
using (BinaryReader br = new BinaryReader(re sponseStream))
{
b= br.ReadBytes((i nt)responseStre am.Length);
br.Close();
}
FileStream fs= new FileStream(Targ etFile,FileMode .Create);
BinaryWriter w = new BinaryWriter(fs );
w.Write(b);

// BinaryWriter bw = new BinaryWriter(To Stream);
// bw.Write(br.Rea dBytes((int)res ponseStream.Len gth));
// bw.Flush();
// bw.Close();
//ToStream.Close( );
// br.Close();

}
else
{
// this is a text page
}
}
catch (WebException e)
{
return "image copy failed error "+e.ToStrin g();
}
catch (Exception e)
{
return "image copy failed error "+e.ToStrin g();
}

return "";
}

Sep 17 '07 #3
It works with DownloadFile(url, imageName). Thanks for the suggestion about
using WebClient.
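
A minimal sketch of that call, for reference (deriving the local file name
from the URL is an assumption here, just one way of picking a name):

using System;
using System.IO;
using System.Net;

class Downloader
{
    static void SaveImage(string url, string targetFolder)
    {
        // Use the last segment of the URL as the local file name.
        string imageName = Path.GetFileName(new Uri(url).LocalPath);

        using (WebClient client = new WebClient())
        {
            client.DownloadFile(url, Path.Combine(targetFolder, imageName));
        }
    }
}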

I want to run this under UNIX and also implement following of JavaScript
links, which are common on many websites.

Can C# work well under UNIX? Do you have any experience using WebClient with
.js-based links?

Sep 17 '07 #4
Logician wrote:
Can C# work well under UNIX? Do you have any experience using WebClient with
.js-based links?

I would expect both (Http)WebRequest and WebClient to work with Mono on Unix
(assuming Mono is available for your flavor of Unix).

The code will not interpret JavaScript, though.

You may be able to handle that by embedding a browser in your app, but that
will not work on Unix.
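
For what it's worth, the Windows-only route would look roughly like this with
the WinForms WebBrowser control (a sketch only; the URL and the link handling
are placeholders, and DocumentCompleted can fire more than once on framed
pages):

using System;
using System.Windows.Forms;

class BrowserHostForm : Form
{
    private readonly WebBrowser browser = new WebBrowser();

    public BrowserHostForm(string url)
    {
        browser.Dock = DockStyle.Fill;
        browser.ScriptErrorsSuppressed = true;
        browser.DocumentCompleted += OnDocumentCompleted;
        Controls.Add(browser);
        Load += delegate { browser.Navigate(url); };
    }

    private void OnDocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
    {
        // By this point the page's scripts have run, so script-generated
        // links show up in the rendered DOM.
        foreach (HtmlElement link in browser.Document.Links)
        {
            Console.WriteLine(link.GetAttribute("href"));  // replace with your own handling
        }
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new BrowserHostForm("http://example.com/"));
    }
}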

Arne

Oct 2 '07 #5
