Hello All,
I am currently trying to teach a web crawler how to identify blogs;
that is, I am trying to determine a fairly inclusive set of criteria
that will help my crawler identify them.
I have noticed that many blogs include
div class="blogsomething" (a format class conveniently containing "blog"),
XML tags,
and/or PHP code.
I do know that a CMS (content management system) is used for several
blogs. Does anyone else have any suggestions to help me determine
criteria?
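To make the first criterion concrete, here is roughly the kind of check I
have in mind (a rough Python sketch; I picked requests and BeautifulSoup
just for illustration, and the scoring weights are pure guesses on my part):

# Crude "blogginess" score; the weights are placeholders.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def blog_score(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    score = 0
    # A div whose class name contains "blog"
    if soup.find("div", class_=lambda c: c and "blog" in c.lower()):
        score += 2
    # An RSS/Atom feed link, which many blogs advertise
    if soup.find("link", attrs={"type": lambda t: t and ("rss" in t.lower() or "atom" in t.lower())}):
        score += 1
    # The word "blog" in the page title
    if soup.title and "blog" in soup.title.get_text().lower():
        score += 1
    return score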
I am aware that any criteria are subjective, especially when
considering sites such as Slashdot, which has been around longer than
blogs...
thanks,
David
Metropolis wrote: I am currently trying to teach a web crawler how to identify blogs; that is, I am trying to determine a fairly inclusive set of criteria that will help my crawler identify them.
I have noticed that many blogs include
div class="blogsomething" (a format class conveniently containing "blog")
Maybe *some* blogs contain this tag, but I'm betting most don't.
xml tags
So do lots of other websites, and I'm betting other websites have 'em more
than blogs do.
and/or php code.
How can you tell it's a PHP document? You can't see any PHP code because
what you are served up is a static HTML page. The only hint you can have is
that the file extension ends with .php, but not all PHP pages end in .php.
In any case just because it's PHP doesn't make it a blog.
I think you'll need to do a lot more than your suggestions here to determine
if it's a blog or not.
A lot of them do have date boxes on the page somewhere so you can navigate
back to previous days' postings. Things like this, and other elements that
are common to blogs, are what you should be looking for, rather than stuff like
whether a page contains XML-style tags or has a PHP file extension.
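For example, an untested sketch along these lines could count date-based
archive links, which most blog engines generate (the URL patterns here are
just common conventions I've seen, not a complete list):

# Untested sketch: count date-based archive links such as
# href="/2004/11/" or href="...?m=200411".
import re

ARCHIVE_LINK = re.compile(
    r'href="[^"]*(?:/(?:19|20)\d{2}[/-](?:0[1-9]|1[0-2])\b'
    r'|[?&]m=(?:19|20)\d{2}(?:0[1-9]|1[0-2]))',
    re.IGNORECASE,
)

def has_archive_links(html, threshold=3):
    # A page with several date-based links is probably a blog archive.
    return len(ARCHIVE_LINK.findall(html)) >= threshold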
--
Chris Hope - The Electric Toolbox - http://www.electrictoolbox.com/
Chris Hope <bl*******@electrictoolbox.com> wrote in message news:<11*************@216.128.74.129>... Metropolis wrote:
I am currently trying to teach a web crawler how to identify blogs; that is, I am trying to determine a fairly inclusive set of criteria that will help my crawler identify them.
I have noticed that many blogs include
div class="blogsomething" (a format class conveniently containing "blog")
Maybe *some* blogs contain this tag, but I'm betting most don't.
xml tags
So do lots of other websites, and I'm betting other websites have 'em more than blogs do.
and/or php code.
How can you tell it's a PHP document? You can't see any PHP code because what you are served up is a static HTML page. The only hint you can have is that the file extension ends with .php, but not all PHP pages end in .php.
In any case just because it's PHP doesn't make it a blog.
All true.
Start by thinking about how -you- identify a blog. That ain't easy,
if my attempts at explaining what a blog is to other people are any
indication.
Look for references to time and self, e.g. "yesterday, I".
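Something along these lines might be a starting point (just a sketch, and
the phrase list is my own guess at typical diary-style language):

# Just a sketch; the phrases are guesses at typical diary-style writing.
import re

SELF_TIME = re.compile(
    r"\b(?:yesterday|today|this morning|last (?:night|week|month))"
    r"\W+(?:i|we|my|our)\b",
    re.IGNORECASE,
)

def sounds_personal(text, threshold=2):
    # A couple of time-and-self phrases suggests first-person, dated writing.
    return len(SELF_TIME.findall(text)) >= threshold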
What IS a blog, anyway?
Not duck soup, or a piece of cake, this problem.