Bytes IT Community

How to save all search engine result URLs in a text file

P: 16
From this search engine:

https://siteexplorer.search.yahoo.com/mysites

when I search for all subdomains just by typing a domain URL in the search box, like
blogspot.com, and after I verify my Yahoo email password, I get 28 million webpages listed across thousands of search result pages. I want to save all the URLs, and then only the ones that point at .flv files, in a text file, so that some flash downloader (preferably freeware) could preview or download them selectively or all at once.
Dec 26 '06 #1
2 Replies


Expert 100+
P: 1,892
I'm not really sure what you are asking.
Dec 27 '06 #2

P: 16
I'm not really sure what you are asking.
On this search engine:

https://siteexplorer.search.yahoo.com/mysites

enter the domain URL in the search box (the one where you would normally enter keywords, before the invention of this search engine). Enter

blogspot.com

you will get, in the top left corner of the search page, a notification that there were

28 million websites found. Each search result page lists 50 of them.
I want to copy these URLs (the ones shown under the title of each found webpage on a search result page) into a text file: copy as many as I want, let the copying run to the end if I don't interrupt it, be able to access the file even while it is unfinished, and resume saving from the place it stopped. Note that Yahoo Site Explorer lists only the subdomains of the domain we enter and nothing else.

So, for example, say a search query returned only two websites:

1. title A
(www.1domain.com/1subdomain.html)

2. title B
(www.1domain.com/2subdomain.html)

All I would like is to have them copied into a text file:

www.1domain.com/1subdomain.html
www.1domain.com/2subdomain.html

and if there were 28 million of them, then 28 million such lines would be written. Name it

1a.txt

and again, copy only the URLs of the websites that contain Flash (.flv, .swf) movies inside them. Save those as 1b.txt

and again, if possible, some SEO data for each webpage, e.g. number of views and rank. Save that as

1c.txt
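The workflow described above (page through the results, append every URL to 1a.txt, keep a second file 1b.txt for the Flash URLs, and be able to resume after an interruption) can be sketched in Python. Yahoo Site Explorer's actual result-page URLs and markup are not documented here, so the `fetch_result_page` function below is a hypothetical stand-in that returns canned HTML; a real crawler would issue an HTTP request in its place and page through the live results.

```python
import os
import re

STATE_FILE = "state.txt"    # remembers the last page finished, for resuming
ALL_URLS_FILE = "1a.txt"    # every result URL
FLASH_URLS_FILE = "1b.txt"  # only URLs that point at .flv/.swf files

def fetch_result_page(page):
    # Hypothetical stand-in: returns the HTML of one result page,
    # or None when there are no more pages. A real crawler would
    # fetch the search engine's result-page URL here instead.
    pages = {
        1: '<a href="http://www.1domain.com/1subdomain.html">title A</a>'
           '<a href="http://www.1domain.com/video.flv">title B</a>',
        2: '<a href="http://www.1domain.com/2subdomain.html">title C</a>',
    }
    return pages.get(page)

def last_finished_page():
    # Read the checkpoint left by a previous, interrupted run.
    if os.path.exists(STATE_FILE):
        return int(open(STATE_FILE).read().strip() or 0)
    return 0

def save_all_urls():
    page = last_finished_page() + 1          # resume where we stopped
    while True:
        html = fetch_result_page(page)
        if html is None:
            break
        # Pull every href out of the result page.
        urls = re.findall(r'href="([^"]+)"', html)
        # Append mode keeps earlier pages' URLs when we resume.
        with open(ALL_URLS_FILE, "a") as all_f, \
             open(FLASH_URLS_FILE, "a") as flash_f:
            for url in urls:
                all_f.write(url + "\n")
                if url.lower().endswith((".flv", ".swf")):
                    flash_f.write(url + "\n")
        # Checkpoint after each completed page, so an interrupted
        # run can continue from the next page.
        with open(STATE_FILE, "w") as f:
            f.write(str(page))
        page += 1

save_all_urls()
```

Because the checkpoint is written only after a page's URLs are safely on disk, killing the script mid-run loses at most the page in progress, and both text files stay readable while the run is unfinished.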
Dec 27 '06 #3
