I'm not really sure what you are asking.
On this search engine:
https://siteexplorer.search.yahoo.com/mysites
enter the domain URL in the search box (the one where you would normally enter keywords, before this search engine existed). Enter
blogspot.com
you will get, in the top left corner of the results page, a notification that
28 million websites were found. Each results page can list 50 of them.
I want to copy as many of these URLs as I like (the ones shown under the title of each result on a search page) into a text file: let the copying run to the end if I don't interrupt it, but also let me access the file while it is unfinished and later resume saving from the point where it stopped. Note that Yahoo Site Explorer lists only the subdomains of the domain you enter and nothing else.
So, for example, if a search returned only two websites:
1. title A
(www.1domain.com/1subdomain.html)
2. title B
(www.1domain.com/2subdomain.html)
all I would like is to have them copied into a text file:
www.1domain.com/1subdomain.html
www.1domain.com/2subdomain.html
and if there were 28 million of them, then 28 million such lines would be written. Name the file
1a.txt
Then copy again, but this time only the URLs of websites that contain Flash (.flv, .swf) movies
inside them. Save those as 1b.txt
And again, if possible, some SEO data for each webpage (e.g. number of views, rank). Save that as
1c.txt
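A script along these lines could handle the copying and resuming. This is only a sketch: the part that would fetch each Yahoo Site Explorer results page is left out (the real page URL format and HTML markup are unknown to me, so the regex below is an assumption based on the example layout above), but the extract-and-append logic with resume support is shown working on a sample page.

```python
import os
import re


def extract_urls(html):
    # Pull the URL shown in parentheses under each result title.
    # NOTE: this regex is a guess at the markup; adjust it to the
    # real page source of the search engine.
    return re.findall(r'\((www\.[^)\s]+)\)', html)


def append_urls(urls, path):
    # Append rather than overwrite, so an interrupted run keeps
    # everything saved so far and can be resumed later.
    with open(path, "a") as f:
        for url in urls:
            f.write(url + "\n")


def resume_offset(path):
    # Count lines already saved; with 50 results per page, this
    # tells us which results page to fetch next after a restart.
    if not os.path.exists(path):
        return 0
    with open(path) as f:
        return sum(1 for _ in f)


# Demo with the two-result example from above:
sample = """1. title A
(www.1domain.com/1subdomain.html)
2. title B
(www.1domain.com/2subdomain.html)"""

if os.path.exists("1a.txt"):
    os.remove("1a.txt")  # fresh file for the demo run

append_urls(extract_urls(sample), "1a.txt")
next_page = resume_offset("1a.txt") // 50 + 1
```

The same loop could feed a second pass that downloads each saved page and keeps only URLs whose HTML mentions .flv or .swf, writing those to 1b.txt.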