
web crawler in python or C?

Hi guys. I have to implement a topical crawler as part of my
project. What language should I implement it in,
C or Python? Python has a fast development cycle, but my concern is
speed as well. I want to strike a balance between development speed and
crawler speed. Since Python is an interpreted language it is rather
slow. The crawler, which will be working on a huge set of pages, should be
as fast as possible. One possible implementation would be to write it
partly in C and partly in Python so that I can get the best of both
worlds, but I don't know how to approach that. Can anyone guide me on
which parts should be implemented in C and which in Python?

Feb 16 '06 #1
abhinav wrote:
Hi guys. I have to implement a topical crawler as part of my
project. What language should I implement it in,
C or Python? Python has a fast development cycle, but my concern is
speed as well. I want to strike a balance between development speed and
crawler speed.
Web crawling is an inherently network-limited activity. The way to
speed up crawling is through parallel downloading; language
performance is not going to have a relevant effect. Python's threads
are constrained by the global interpreter lock, but that matters
little for network-bound downloading, and it also supports weak
(generator-based) coroutines. (Of course, standard C does not support
any kind of multithreading at all, except through platform-specific
extensions -- but these extensions are widespread.)
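
As a concrete illustration of that point, here is a minimal sketch of
parallel downloading with a thread pool from Python's standard library.
The URL list is a placeholder and error handling is deliberately simple;
it is only meant to show that the network waits, not the interpreter,
dominate the runtime.

from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def fetch(url):
    # Download one page; return (url, body), or (url, None) on a network error.
    try:
        with urlopen(url, timeout=10) as resp:
            return url, resp.read()
    except OSError:
        return url, None

urls = ["http://example.com/", "http://example.org/"]  # placeholder URLs

# Threads overlap the waits, so total time is bounded by bandwidth,
# not by per-page Python overhead.
with ThreadPoolExecutor(max_workers=8) as pool:
    for url, body in pool.map(fetch, urls):
        print(url, "failed" if body is None else "%d bytes" % len(body))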

For the problem of parsing and handling data structures for this
activity, however, Python is *FAR* superior to C in terms of
development speed.
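
A small example of that convenience, using only the standard library to
pull links out of a page (the sample markup and base URL below are made
up for illustration):

from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    # Collect absolute URLs from every <a href="..."> in a page.
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

parser = LinkExtractor("http://example.com/")
parser.feed('<a href="/about">About</a> <a href="http://example.org/">Out</a>')
print(parser.links)   # ['http://example.com/about', 'http://example.org/']

Doing the equivalent in C means hand-rolling (or pulling in) an HTML
tokenizer and managing all the string memory yourself.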
[...] Since Python is an interpreted language it is rather
slow. The crawler, which will be working on a huge set of pages, should be
as fast as possible. One possible implementation would be to write it
partly in C and partly in Python so that I can get the best of both
worlds, but I don't know how to approach that. Can anyone guide me on
which parts should be implemented in C and which in Python?


Actually, I have done exactly this myself in the past (before
Python had weak coroutines). I wrote a command-line tool in C that
pulls down a collection of URLs listed in a control file, downloading
them in a multithreaded manner, and then drove that tool from a Python
program. Over a long enough run this pegs my download bandwidth for the
majority of the runtime, which puts it within striking distance of
theoretically optimal.
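
A rough sketch of the Python side of that split is shown below; the
"fetcher" command and its flags are hypothetical stand-ins for the C
program, not the actual tool's interface.

import subprocess
import tempfile

def download_batch(urls, out_dir):
    # Write the control file the downloader reads, one URL per line.
    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as ctl:
        ctl.write("\n".join(urls))
        control_path = ctl.name
    # The C tool does the bandwidth-bound, multithreaded downloading;
    # Python only decides what to fetch and what to do with the results.
    subprocess.run(["fetcher", "--input", control_path, "--output", out_dir],
                   check=True)

download_batch(["http://example.com/", "http://example.org/"], "pages/")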

The problem is that you've picked completely the wrong newsgroup to ask
this question. Unfortunately, there is no clue to this fact from the
name of the newsgroup. This is actually a newsgroup that discusses
only the ANSI/ISO C standard as it exists, and none of the
platform-specific extensions (including sockets and multithreading).
Nor is the discussion of the development of real applications
considered on-topic here. Neither is performance considered on-topic
-- by the standard, apparently you can't even know the *relative* speed
of anything in C. comp.programming would probably have been a better
place to post this.

--
Paul Hsieh
http://www.pobox.com/~qed/
http://bstring.sf.net/

Feb 16 '06 #2


Similar topics

2
2226
by: Gomez | last post by:
Hi, Is there a way to know if a session on my web server is from an actual user or an automated crawler? Please advise. G
1
1097
by: Benjamin Lefevre | last post by:
I am currently developing a web crawler, mainly crawling mobile pages (wml, mobile xhtml) but not only (also html/xml/...), and I wonder what speed I can reach. This crawler is developed in C# using multithreading and HttpWebRequest. Currently my crawler is able to download and crawl pages at a speed of around 5 pages per second. It's running on a development machine with 512Mb RAM and a shared ADSL connection (2Mbits). Is it...
1
2510
by: Steve Ocsic | last post by:
Hi, I've coded a basic crawler whereby you enter a URL and it will then crawl it. What I would like to do now is take it one step further and do the following: 1. pick up the URLs I would like to crawl from a database and pass them to the crawler. Once the crawler has crawled a website I would then like to put a flag against it so that the URL is not processed again for a certain period of time.
3
5730
by: Bill | last post by:
Has anyone used/tested Request.Browser.Crawler ? Is it reliable, or are there false positives/negatives? Thanks!
3
4646
rhitam30111985
by: rhitam30111985 | last post by:
hi all, I am testing a web crawler on a site passed as a command line argument. It works fine until it finds a server which is down or some other error... Here is my code:
#!/usr/bin/python
import urllib
import re
import sys
def crawl(urllist, done):
3
3973
by: mh121 | last post by:
I am trying to write a web crawler (for academic research purposes) that grabs the number of links different websites/domain names have from other websites, as listed on Google (for example, to get the number of websites linking to YouTube, you could type into Google 'Link:YouTube.com' and get 11,100). I have a list of websites in a spreadsheet and would like to be able to output the number of links for each website in the sheet. When I run...
12
4303
by: disappearedng | last post by:
Hi all, I am currently planning to write my own web crawler. I know Python but not Perl, and I am interested in knowing which of these two is the better choice given the following scenario: 1) I/O issues: my biggest constraint in terms of resources will be the bandwidth bottleneck. 2) Efficiency issues: the crawlers have to be fast, robust and as "memory efficient" as possible. I am running all of my crawlers on cheap PCs with about 500...
0
2251
by: kishorealla | last post by:
Hello, I need to create a web bot/crawler/spider that would go into different web sites, collect data for us and store it in a database. The crawler needs to 'READ' the options on a website (from drop-downs, radio buttons or check-boxes) and create some input itself OR use some generic pre-defined words (that we provide it with). For example, a webpage might be structured with a text field and some drop-downs. Typically, if the user...
4
4091
by: sonich | last post by:
I need a simple web crawler. I found Ruya, but it seems it is not currently maintained. Does anybody know a good web crawler in Python, or with a Python interface?
