
Programmatically Generate a POST to Log Into Site and Screen Scrape

Hi,

Let's say there's a web site with simple authentication. It
asks you to type a username/password into a couple of text
boxes, and then it gives you a cookie and you're logged in
for 20 minutes or so. What I need to do is automate that.

In other words, in my code-behind, how can I generate a
POST request (with username and password data) to a
server, get the cookie it returns, and then issue a request (using
that cookie) to a secured page so I can scrape the data?

Thanks
Nov 18 '05 #1
Any ideas?
Nov 18 '05 #2
"Tony Pino" wrote:
Any ideas?

Check out System.Net.(Http)WebRequest and System.Net.(Http)WebResponse.
For simple use cases, System.Net.WebClient will work as well.
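
For example, a minimal sketch of the login POST with HttpWebRequest (the URL and the "username"/"password" field names are placeholders for whatever the target site's form actually uses, and HttpUtility.UrlEncode needs a reference to System.Web):

string loginUrl = "https://www.example.com/login.aspx";   // placeholder URL
string postData = "username=" + HttpUtility.UrlEncode("myUser") +
                  "&password=" + HttpUtility.UrlEncode("myPass");
byte[] buffer = Encoding.ASCII.GetBytes(postData);

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(loginUrl);
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = buffer.Length;
request.CookieContainer = new CookieContainer();   // any auth cookie ends up in here

// Write the form data into the request body.
using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(buffer, 0, buffer.Length);
}

// Read back the page the server returns after the login.
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string html = reader.ReadToEnd();
}

Keep the CookieContainer you attached to the request; that's where any cookie the server sets is captured for reuse on later requests.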

Cheers,
--
Joerg Jooss
jo*********@gmx.net
Nov 18 '05 #3
Thanks for the reply.

I understand (and have seen examples of) using those
classes to simply request a page (like google.com) and
store the HTML in a string object. However, I'm still a
bit confused about how to store an auth cookie so that the next
request I make will be authenticated and I can access a
private page.

Nov 18 '05 #4
Hi Tony,

HttpWebRequest and HttpWebResponse provide the containers to hold cookies
on both the sending and receiving ends, but they don't automatically
persist them, so that becomes your responsibility.

Because the cookie collections are nicely abstracted in these objects, it's
fairly easy to save and restore them. The key to making this work is to keep
a persistent object reference to the cookie collection and then reuse the
same cookie store each time.
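
One lightweight way to keep that persistent reference (a sketch; the cookieJar field name and CreateRequest helper are made up for illustration) is to share a single CookieContainer across every request your class makes, so cookies the server sets are resent on later requests automatically:

// One container shared by every request this class makes.
private CookieContainer cookieJar = new CookieContainer();

private HttpWebRequest CreateRequest(string url)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.CookieContainer = this.cookieJar;   // reuse the same cookie store each time
    return request;
}

The rest of this post keeps a CookieCollection instead and copies the cookies by hand, which also lets you inspect or update individual cookies.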

To do this, let's assume you are running the request from a form (or some
other class; "this" in the example below). You'd create a member called
Cookies:

CookieCollection Cookies;

On the request end of the connection, before the request is sent to the
server, you can check whether there's a previously saved set of cookies
and, if so, use them:

Request.CookieContainer = new CookieContainer();

if (this.Cookies != null && this.Cookies.Count > 0)
    Request.CookieContainer.Add(this.Cookies);

So, if you had previously retrieved cookies, they were stored in the
Cookies member and are now added back into the request's CookieContainer
property. CookieContainer is a collection of cookie collections; it's
meant to be able to store cookies for multiple sites. Here I only deal with
tracking a single set of cookies for a single set of requests.

On the receiving end, once the response headers have been retrieved after the
call to GetResponse(), you then use code like the following:

// *** Save the cookies on the persistent object
if (Response.Cookies.Count > 0)
    this.Cookies = Response.Cookies;

This saves the cookie collection until the next request, when it is
reassigned to the request, which sends it to the server. Note that this is
a very simplistic cookie-management approach that will only work if a
single cookie (or a single set of cookies) is set by the Web site. If
cookies are set in multiple different places on the site, you will have
to retrieve the individual cookies and store them into
the cookie collection one by one. Here's some code that demonstrates that:

if (loWebResponse.Cookies.Count > 0)
{
    if (this.Cookies == null)
    {
        this.Cookies = loWebResponse.Cookies;
    }
    else
    {
        // If we already have cookies, update the list
        foreach (Cookie oRespCookie in loWebResponse.Cookies)
        {
            bool bMatch = false;
            foreach (Cookie oReqCookie in this.Cookies)
            {
                if (oReqCookie.Name == oRespCookie.Name)
                {
                    oReqCookie.Value = oRespCookie.Value;
                    bMatch = true;
                    break;
                }
            }
            if (!bMatch)
                this.Cookies.Add(oRespCookie);
        }
    }
}

This should give you a good starting point.
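
As a usage sketch for the other half of the original question (the secured URL below is a placeholder), reattach the saved cookies to the next request and read the HTML of the protected page:

HttpWebRequest page = (HttpWebRequest)WebRequest.Create("https://www.example.com/secure/report.aspx");
page.CookieContainer = new CookieContainer();
if (this.Cookies != null && this.Cookies.Count > 0)
    page.CookieContainer.Add(this.Cookies);   // the cookies captured from the login response

using (HttpWebResponse response = (HttpWebResponse)page.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string html = reader.ReadToEnd();   // scrape the data out of this string
}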

Best regards,

Jacob Yang
Microsoft Online Partner Support
Get Secure! - www.microsoft.com/security
This posting is provided "as is" with no warranties and confers no rights.

Nov 18 '05 #5
Great, thanks!

Nov 18 '05 #6

This thread has been closed and replies have been disabled. Please start a new discussion.
