"Kyle Mizell" <ky**@pimpinonline.comNOSPAM> wrote in message news:<qewyb.174752$Dw6.686810@attbi_s02>...
I am looking for a script that I can use to spider a website, and then pull
the images... I know how to do it for a single page, but, I would like to be
able to do this for the entire site. Any suggestions?
Thanks,
Kyle Mizell
http://www.pimpinonline.com
Do for every page what you already do for one page.
Store all links found on the first page in an array (eliminating
duplicates), then process each of those pages the same way as the first.
I think the best approach is to write a function that saves one page and
returns the links it found, then call that function for every URL collected.
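A minimal sketch of that approach in Python (the thread doesn't name a language; `crawl`, `LinkExtractor`, and the pluggable `fetch` parameter are illustrative choices, not anything Kyle specified):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href/src attribute values from one page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

def crawl(start_url, fetch=None, max_pages=100):
    """Breadth-first crawl; returns {url: body} for URLs on the same host.
    `fetch` can be swapped for a stub when testing without a network."""
    if fetch is None:
        fetch = lambda url: urlopen(url).read().decode("utf-8", "replace")
    host = urlparse(start_url).netloc
    seen, queue, pages = {start_url}, [start_url], {}
    while queue and len(pages) < max_pages:
        url = queue.pop(0)
        try:
            body = fetch(url)
        except OSError:
            continue  # skip unreachable URLs
        pages[url] = body
        parser = LinkExtractor()
        parser.feed(body)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)  # eliminate duplicates up front
                queue.append(absolute)
    return pages
```

The `seen` set is what does the duplicate elimination mentioned above; restricting to the same host keeps the spider from wandering off the site.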
While saving a page you also have to rewrite its links, because the
static filenames will be different, e.g.
me*************************************@pimpinonline.com&unset_search=true
replace with
members_php_search_sex_Male_search_kyle_pimpinonline_com_unset_search_true.HTML
and name all stored pages the same way.
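That URL-to-filename mapping could be sketched like this (the `url_to_filename` name and the exact substitution rules are my assumptions; adjust the separator and extension to taste):

```python
import re
from urllib.parse import urlparse

def url_to_filename(url):
    """Turn a dynamic URL into a safe static filename, e.g.
    'members.php?search=kyle' -> 'members_php_search_kyle.html'."""
    parsed = urlparse(url)
    raw = parsed.path.lstrip("/")
    if parsed.query:
        raw += "_" + parsed.query
    # collapse every run of non-alphanumerics into a single underscore
    name = re.sub(r"[^A-Za-z0-9]+", "_", raw).strip("_")
    return (name or "index") + ".html"
```

Applying the same function when you save a page and when you rewrite a link guarantees the two always agree.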
enjoy