For a project I need to analyze web content. I should take a web page as input, decompose it into information and other objects, and then apply special processing to each object after decomposition. I am unsure which language is most suitable for this. Do you think Python is a good option, or would you recommend Perl or another language? I should mention that I have C++ programming experience and a little JavaScript.
Yes, Python is a good option. There are modules you can use, e.g. urllib for fetching pages and BeautifulSoup for parsing and decomposing them easily. For more information, have a look at web client programming. Of course, if you are more comfortable with Perl, there are similar libraries for the same task, e.g. LWP.
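To give a feel for the decomposition step, here is a minimal dependency-free sketch using Python's standard-library `html.parser` (BeautifulSoup offers a much richer API for the same job). The HTML snippet and the `LinkExtractor` class name are made up for illustration; in a real program you would feed the parser the bytes fetched with urllib instead of a hard-coded string.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href attributes from <a> tags -- one kind of 'object'
    you might decompose a page into for later special processing."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Hypothetical page content; normally this would come from
# urllib.request.urlopen(url).read().decode() instead.
page = '<html><body><a href="https://example.com">home</a> <a href="/about">about</a></body></html>'

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # each extracted link can now get its own processing
```

With BeautifulSoup the same extraction collapses to roughly `[a["href"] for a in soup.find_all("a", href=True)]`, which is why it is the usual recommendation once you are pulling out many different kinds of objects.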