Perl for checking broken links on a website

Hi guys!

I am new to this forum, and I am hoping to get some help improving my Perl skills.

My goal is to create a report of all the broken links on a website. From my understanding, Perl can go through a server directory, search for htm/html files, and put them in an array. From that array, I believe Perl can search for "http://" strings and emulate clicking each URL. If the URL returns "Page cannot be found", it should create a log file and append to it.

My question: what is the best method to do this, and what libraries and syntax are required?

Also, about searching for "htm" files on a server: assuming that I don't have access to the server directory (which is unlikely), how do I get this done? And if I do have access to the server, how do I get Perl to log in to the server first, before it searches for "htm" files in a specific directory? Or do I have to put the Perl .pl file in the server root directory and run it there?

I would really appreciate it if someone could help. Thank you.
Jan 24 '08 #1
eWish
Welcome to TSDN!

I would suggest that you use a module like WWW::Mechanize or LWP::UserAgent to achieve what you are wanting. Also, check out WWW::Mechanize::FAQ.
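For example, a minimal (untested) sketch of checking a single link with LWP::UserAgent might look like this; the URL is just a placeholder:

#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# Check one URL and report whether it is broken.
my $url = 'http://www.example.com/';           # placeholder URL
my $ua  = LWP::UserAgent->new(timeout => 10);

my $response = $ua->get($url);
if ($response->is_success) {
    print "OK: $url\n";
}
else {
    # status_line gives something like "404 Not Found"
    print "BROKEN: $url (", $response->status_line, ")\n";
}

WWW::Mechanize is a subclass of LWP::UserAgent, so the same idea carries over, and its links() method will pull the links out of a page for you.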

--Kevin
Jan 24 '08 #2
eWish
You might also check out W3C::LogValidator::LinkChecker. If I read it correctly, it will use your server logs to locate broken links.

--Kevin
Jan 24 '08 #3
Thanks for your feedback, Kevin! Anyone else?

I will try to post some sample code later on, for your feedback and also for someone else to learn from. Cheers!
Jan 24 '08 #4
numberwhun
Considering that Kevin pretty much covered it, I doubt anyone else will be throwing out the same links that he did.

Regards,

Jeff
Jan 24 '08 #5
Guys, I'm a bit stuck here.

I would like to know how to capture a certain word (in this case "http://") in the file content; however, I don't know what the syntax is. Could anyone help? Thanks.
# Capturing .htm or .html filenames in the root directory
my @root_files = <*.htm>;

# Searching through the directory
for ($count = 0; $count < @root_files; $count++)
{
    # Opening the files one by one
    my $FILENAME = $root_files[$count];
    open(FILEPOINTER, "< $FILENAME") or die "Can't open $FILENAME : $!";

    # Searching the file content line by line
    do
    {
        my $LINECONTENT = <FILEPOINTER>;
        # I need to capture only the http link, if there is any. The "print"
        # below is just for testing, to make sure the file content is shown
        # line by line on the command prompt.
        print $LINECONTENT;
    }
    until eof;
    close FILEPOINTER;
}
Jan 25 '08 #6
numberwhun
Ok, I think that the best thing for you to do is grab a good Perl book and start reading.

I say this because this line:

my @root_files = <*.htm>;

does not do what you are expecting. You cannot just specify "*.htm" and expect all files with .htm on the end to magically be in the array.

First, what OS are you working on? Windows or Unix/Linux? That will help tell us how you are going to get your directory listing.
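For example, one way to gather the list that works the same on both (a minimal, untested sketch):

# List the .htm/.html files in the current directory with opendir/readdir.
opendir(my $dh, '.') or die "Can't open directory: $!";
my @root_files = grep { /\.html?$/i && -f $_ } readdir($dh);
closedir($dh);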

Regards,

Jeff
Jan 25 '08 #7
Thanks for your reply, Jeff.

I'm working on Linux, and "my @root_files = <*.htm>" worked as long as the .htm files are in the same directory as the .pl file. I guess I will also need to improve the code to automatically dig deeper into the directory tree if it contains more folders (I need more time, and my Perl book is confusing; if you can give me a hint, that'll be great!).

Right now I am working on the regular expression to grab only the http link:

do
{
    my $LINECONTENT = <FILEPOINTER>;
    $_ = $LINECONTENT;
    # This is where I'm stuck at this stage. The code below prints the
    # whole line instead of just the http:// link.
    if (/a href="?"/)
    {
        print $LINECONTENT;
    }
}
until eof;
Jan 25 '08 #8
KevinADC
Can't you just use something like FrontPage to do that? I am sure there are many third-party programs you can use to report broken site links.
Jan 26 '08 #9
Ummm, yes, I guess you can use third-party software to check broken links, but you won't learn anything that way...

I have just finished a very basic script, although it needs a lot of improvement.

#!/usr/bin/perl
use LWP::UserAgent;

# Capturing .htm files
# Improvement required: what if there are multiple folders?
my @root_files = <*.htm>;

# Searching through the folder
for ($count = 0; $count < @root_files; $count++)
{
    # Opening the files one by one
    my $FILENAME = $root_files[$count];
    open(FILEPOINTER, "< $FILENAME") or die "Can't open $FILENAME : $!";

    # Searching the file content line by line
    do
    {
        my $LINECONTENT = <FILEPOINTER>;
        $_ = $LINECONTENT;
        if (/href="?"/i)
        {
            # Capturing the quoted string in the href
            # Improvement required: what if there are multiple hrefs on one line?
            my $beginstring = index($LINECONTENT, '"');
            my $endstring   = index($LINECONTENT, '"', $beginstring + 1);
            my $finalstring = substr($LINECONTENT, $beginstring + 1, $endstring - $beginstring - 1);

            my $ua = LWP::UserAgent->new;
            $ua->timeout(3);

            # For each link found, check for a 404 error and log it to a text file.
            # Improvement required: what if there is a relative URI?
            my $response = $ua->get($finalstring);
            if ($response->is_error)
            {
                # Output the link that returned the error
                open(TEXT, ">>broken_link_output.txt");
                print TEXT $finalstring . "\n";
                close TEXT;
            }
        }
    }
    until eof;
    close FILEPOINTER;
}
Jan 26 '08 #10
KevinADC
If the goal includes learning how to write Perl code, then by all means, proceed. But you did not mention that in your opening post. ;)
Jan 26 '08 #11
KevinADC
You will want to look into using File::Find to search through all the subdirectories. But then you might not learn how to recurse through directories yourself. ;)

For internal links, you might consider just checking whether the file exists and is readable, instead of trying to fetch it with get(). That should be much faster.

For links to external files, you have to use get().

There are also bound to be links to the same file more than once, so you may want to avoid re-checking files you have already seen. You can use a hash to keep track of which links have already been checked.

The "do {} until eof" construct is not necessary. You might also want to look into HTML::LinkExtractor (I think that's the name) to find the links in the HTML documents.
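Putting those suggestions together, here is a rough, untested sketch. I have used HTML::LinkExtor (a similar module that ships with HTML::Parser) for the link extraction, and the output file name is just a placeholder:

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use HTML::LinkExtor;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new(timeout => 5);
my %checked;    # remember links we have already tested

# File::Find recurses through all the subdirectories for us.
my @html_files;
find(sub { push @html_files, $File::Find::name if /\.html?$/i && -f }, '.');

open(my $log, '>>', 'broken_link_output.txt') or die "Can't open log: $!";

for my $file (@html_files) {
    my $parser = HTML::LinkExtor->new;
    $parser->parse_file($file);

    for my $link ($parser->links) {
        my ($tag, %attrs) = @$link;
        my $url = $attrs{href} or next;    # skip tags without an href
        next if $checked{$url}++;          # skip links we have already seen

        if ($url =~ m{^https?://}i) {
            # External link: we have to fetch it.
            my $response = $ua->get($url);
            print $log "$url\n" if $response->is_error;
        }
        else {
            # Internal link: just check that the file exists and is readable.
            # (Strictly, a relative link should be resolved against the
            # directory of $file, e.g. with URI->new_abs.)
            print $log "$url (missing file)\n" unless -e $url && -r _;
        }
    }
}
close $log;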
Jan 26 '08 #12
