
form crawler

This code crawls only the URLs of a site's domain, e.g. www.example.com/orange-is-good.html, and matches a searched word against the URL alone. How do I make it also crawl the keywords, description, title and body content of pages, as well as images (via their title and alt attributes), headings (h1 to h6), anchor links and so on? A sketch of one possible extraction approach follows the example page below.


<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1" />
<title>Orange is good</title>
<meta name="Description" content="orange is good for health" />
<meta name="Keywords" content="ripe orange,fresh orange,good orange" />
</head>
<body>
<a href="http://www.example.com/orange-is-good.html"><h1>Orange is Good</h1></a><br />
Orange is the best among all the fruits in the world. Orange has been in place since 2013.
<img src="orange.jpg" title="orange site" alt="orange is a good fruit." width="200" height="200" />
</body>
</html>
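
To pull the title, meta description, meta keywords, headings, image title/alt text, anchor links and body text out of a page like the one above, one option is PHP's built-in DOMDocument and DOMXPath classes. The following is only a minimal sketch; the function name and array layout are my own, not part of the crawler below:

<?php
// Minimal sketch: extract metadata from a fetched page with DOMDocument/DOMXPath.
// $html is assumed to hold the raw page source, e.g. from file_get_contents().
function extract_page_data($html)
{
    $doc = new DOMDocument();
    // Suppress warnings caused by real-world (invalid) markup.
    @$doc->loadHTML($html);
    $xpath = new DOMXPath($doc);

    $data = array(
        'title'       => '',
        'description' => '',
        'keywords'    => '',
        'headings'    => array(),
        'images'      => array(),
        'links'       => array(),
        'body_text'   => '',
    );

    // <title>
    foreach ($xpath->query('//title') as $node) {
        $data['title'] = trim($node->textContent);
    }

    // <meta name="Description"> and <meta name="Keywords"> (name matched case-insensitively)
    foreach ($xpath->query('//meta[@name][@content]') as $meta) {
        $name = strtolower($meta->getAttribute('name'));
        if ($name === 'description') { $data['description'] = $meta->getAttribute('content'); }
        if ($name === 'keywords')    { $data['keywords']    = $meta->getAttribute('content'); }
    }

    // Headings h1 to h6
    foreach ($xpath->query('//h1|//h2|//h3|//h4|//h5|//h6') as $h) {
        $data['headings'][] = trim($h->textContent);
    }

    // Images: src, title and alt attributes
    foreach ($xpath->query('//img') as $img) {
        $data['images'][] = array(
            'src'   => $img->getAttribute('src'),
            'title' => $img->getAttribute('title'),
            'alt'   => $img->getAttribute('alt'),
        );
    }

    // Anchor links
    foreach ($xpath->query('//a[@href]') as $a) {
        $data['links'][] = $a->getAttribute('href');
    }

    // Plain body text with whitespace collapsed
    foreach ($xpath->query('//body') as $body) {
        $data['body_text'] = trim(preg_replace('/\s+/', ' ', $body->textContent));
    }

    return $data;
}
?>

The returned array can then be searched for the keyword or written to the database next to each URL.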



In the script below, replace "ADD DOMAIN NAME HERE" with the domain to crawl (www.example.com for the example page above).


<?php
// Simple crawler: collects links from a page, stores new URLs in the "pages"
// table, then picks a random stored URL to crawl on the next refresh.
// Note: the mysql_* extension is deprecated (removed in current PHP);
// mysqli or PDO should be used in new code.
session_start();

$domain = "ADD DOMAIN NAME HERE";

// Connect to MySQL and select the database (needed on every request).
$connect = mysql_connect("HOST", "USERNAME", "PASSWORD");
if (!$connect) {
    die("MySQL could not connect!");
}

$DB = mysql_select_db('DATABASE NAME');
if (!$DB) {
    die("MySQL could not select Database!");
}

if (empty($_SESSION['page'])) {
    // First pass: start from the domain's home page and reset the counter.
    $original_file = file_get_contents("http://" . $domain . "/");
    $_SESSION['i'] = 0;
} else {
    // Later passes: fetch the page chosen at the end of the previous pass.
    $PAGE = $_SESSION['page'];
    $original_file = file_get_contents($PAGE);
}

// Keep only anchor tags, then pull out every href value.
$stripped_file = strip_tags($original_file, "<a>");
preg_match_all("/<a(?:[^>]*)href=\"([^\"]*)\"(?:[^>]*)>(?:[^<]*)<\/a>/is", $stripped_file, $matches);

foreach ($matches[1] as $key => $value) {
    // If the href does not start with http:// or https://, treat it as a
    // relative link and prepend the domain.
    if (strpos($value, "http://") !== 0 && strpos($value, "https://") !== 0) {
        $New_URL = "http://" . $domain . $value;
    } else {
        $New_URL = $value;
    }

    // Escape the URL before using it in SQL (mysql_real_escape_string() would be safer).
    $New_URL = addslashes($New_URL);

    // Only insert URLs that are not already stored.
    $Check = mysql_query("SELECT * FROM pages WHERE url='$New_URL'");
    $Num = mysql_num_rows($Check);

    if ($Num == 0) {
        mysql_query("INSERT INTO pages (url) VALUES ('$New_URL')");

        $_SESSION['i']++;
        echo $_SESSION['i'] . " ";
    }
    echo mysql_error();
}

// Pick a random stored URL to crawl on the next pass.
$RandQuery = mysql_query("SELECT * FROM pages ORDER BY RAND() LIMIT 0,1");
$RandReturn = mysql_num_rows($RandQuery);
while ($row1 = mysql_fetch_assoc($RandQuery)) {
    $_SESSION['page'] = $row1['url'];
}
echo $RandReturn;
echo $_SESSION['page'];

mysql_close();

// Reload the page immediately to crawl the next URL.
// Note: output has already been sent, so this header only works if output
// buffering is enabled; a meta refresh tag is an alternative.
header("refresh: 0;");
?>
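
To have the crawler actually store each page's title, description, keywords, headings and body text rather than just its URL, the extraction sketch above could be called on $original_file right after the page is fetched (before the link loop), and the result written next to the URL. This is only a sketch and assumes the pages table has been extended with title, description, keywords, headings and body_text columns, which is not part of the original script:

<?php
// Sketch only: assumes extract_page_data() from the earlier sketch is available
// and that the pages table has title, description, keywords, headings and
// body_text columns. Intended to run right after $original_file is fetched.
$page_data = extract_page_data($original_file);

$title       = addslashes($page_data['title']);
$description = addslashes($page_data['description']);
$keywords    = addslashes($page_data['keywords']);
$headings    = addslashes(implode(" | ", $page_data['headings']));
$body_text   = addslashes($page_data['body_text']);

// URL of the page that was just fetched (the home page on the first pass).
// Note: on the very first pass the home page itself may not be in the table yet,
// so this UPDATE would affect no rows until it has been inserted.
$current_url = addslashes(empty($_SESSION['page']) ? "http://" . $domain . "/" : $_SESSION['page']);

mysql_query("UPDATE pages
             SET title='$title',
                 description='$description',
                 keywords='$keywords',
                 headings='$headings',
                 body_text='$body_text'
             WHERE url='$current_url'");
echo mysql_error();
?>

A searched word could then be matched against these columns (for example with a LIKE query) instead of against the URL alone.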
Jul 10 '13 #1
