Bytes IT Community

Web Crawler via java


I am trying to write a web crawler in Java. I have most of it worked out and
can access pages, but I keep running into a cookie problem.

The long and short of it is that when I first request a page, I receive from
my test site (a commercial site) a warning page telling me that I don't
have cookies enabled on the browser. Is there a generic HTTP header field I
need to set to show that the crawler is cookie-enabled? How does a site
actually know on an initial contact that a browser isn't cookie-enabled?

I have found how to read the 'Set-Cookie' keys from the response header and
how to place those keys back into the request header, but I can't figure out
how this site knows on the initial open connection (and I don't see any
'meta' redirects either).

Any help will be appreciated!


Jul 17 '05 #1

This discussion thread is closed.