Hi,
I'm primarily a web developer and have recently moved to .NET, and I'm
slowly realising its enormous potential, both through ASP.NET itself and
through integrating all the different services.
While teaching myself .NET I decided to start building a spider to gather
a load of information from other sites (thanks for the pointers on here,
guys), and one of the spiders that seems to suit my needs is the one from
the "Spider in .NET" article on MSDN
(http://msdn.microsoft.com/msdnmag/is...T/default.aspx),
but I would now like to expand on what I have altered so far (simply
restricting the spider to URLs matching certain parameters) so that it
can cope with loss of connection and allow the process to be paused. Is
there any way of doing this? I have been reading about pausing processes
and needing extra threads and all kinds of things, so I have got a little
confused and would appreciate any advice.
Many thanks in advance.
Tim