Why should crawlers use Python?
What is a web crawler?
A web crawler is a program that automatically downloads web pages. It collects pages from the World Wide Web for search engines and is an important component of them. A traditional crawler starts from the URLs of one or more seed pages and extracts the URLs found on those pages; while crawling, it keeps pulling new URLs out of the current page and putting them into a queue, until certain stopping conditions of the system are met.
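In code, this loop is just a queue plus a visited set. A minimal sketch, assuming a page budget as the stopping condition and naive regex link extraction (a real crawler would use an HTML parser and obey robots.txt):

import re
import urllib.request
from collections import deque

def crawl(seed_urls, max_pages=10):
    queue = deque(seed_urls)   # URLs waiting to be fetched
    seen = set(seed_urls)      # URLs already queued, to avoid revisits
    pages = {}
    while queue and len(pages) < max_pages:   # stopping condition: page budget
        url = queue.popleft()
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode('utf-8', 'ignore')
        except Exception:
            continue   # skip pages that fail to download
        pages[url] = html
        # Naive link extraction; illustrative only
        for link in re.findall(r'href="(https?://[^"]+)"', html):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return pages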
What is the use of crawlers?
As the web page collector of a general search engine (Google, Baidu), or as a vertical search engine. Scientific research: studies of online human behavior, online community evolution, human dynamics, econometric sociology, complex networks, data mining, and other fields all require large amounts of data, and web crawlers are a powerful tool for collecting it. And then the grey area: snooping, hacking, sending spam...
The crawler is the first and easiest of a search engine's three steps (a toy sketch of the last two follows this list):
Web page collection
Index building
Query ranking
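To make those last two steps concrete, here is a toy sketch, not a production engine: it builds an inverted index (term -> page -> count) from collected pages and ranks pages for a query by total term count. The names and the scoring rule are illustrative assumptions:

import re
from collections import defaultdict

def build_index(pages):
    # pages: {url: text}; returns term -> {url: occurrence count}
    index = defaultdict(lambda: defaultdict(int))
    for url, text in pages.items():
        for term in re.findall(r'\w+', text.lower()):
            index[term][url] += 1
    return index

def query(index, terms):
    # Rank URLs by the total count of matching query terms
    scores = defaultdict(int)
    for term in terms:
        for url, count in index.get(term, {}).items():
            scores[url] += count
    return sorted(scores.items(), key=lambda item: -item[1])

pages = {'http://a.example': 'python crawler tutorial',
         'http://b.example': 'python python snakes'}
print(query(build_index(pages), ['python']))   # b.example ranks first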
What language should I use to write the crawler?
C, C++. Highly efficient and fast, suitable for a general search engine that crawls the entire web. Disadvantages: development is slow, and the code tends to be long-winded and tedious to write; see, for example, the Skynet search source code.
Scripting languages: Perl, Python, Java, Ruby. Simple and easy to learn, with good text processing that makes fine-grained extraction of web content easy; efficiency is often not high, so they suit focused crawling of a small number of websites.
C#? (It seems to be the language that people in information management prefer.)
Why did you choose Python in the end?
Cross-platform, with good support for both Linux and Windows.
Scientific computing and numerical fitting: NumPy, SciPy (see the small demo after this list)
Visualization: 2D: Matplotlib (produces very attractive plots); 3D: Mayavi2
Complex networks: NetworkX
Statistics: interface to the R language via Rpy
Interactive terminal
Rapid development of websites
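As a small taste of that ecosystem, the demo promised above (assuming NumPy and Matplotlib are installed) fits and plots a noisy line in a handful of lines, the kind of post-crawl analysis that would take far more code in C/C++:

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + np.random.normal(0, 1, 50)   # synthetic noisy data
k, b = np.polyfit(x, y, 1)                       # linear least-squares fit
plt.scatter(x, y, label='data')
plt.plot(x, k * x + b, label=f'fit: y={k:.2f}x+{b:.2f}')
plt.legend()
plt.show()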
A simple Python crawler
import urllib.parse
import urllib.request

def loadPage(url):
    """Send a request to the given url and return the HTML as text."""
    request = urllib.request.Request(url)
    html = urllib.request.urlopen(request).read()
    return html.decode('utf-8')

def writePage(html, filename):
    """Write the HTML returned by the server to a local file."""
    with open(filename, 'w', encoding='utf-8') as f:
        f.write(html)
    print('-' * 30)

def tiebaSpider(url, beginPage, endPage):
    """Tieba crawler scheduler: builds and processes the url of each page."""
    for page in range(beginPage, endPage + 1):
        pn = (page - 1) * 50              # Tieba paginates 50 posts per page
        fullurl = url + "&pn=" + str(pn)
        print(fullurl)
        filename = 'page_' + str(page) + '.html'
        html = loadPage(fullurl)          # fetch the page url, not the bare base url
        writePage(html, filename)

if __name__ == "__main__":
    kw = input('Enter the name of the tieba (forum) to crawl: ')
    beginPage = int(input('Enter the start page: '))
    endPage = int(input('Enter the end page: '))
    url = 'https://tieba.baidu.com/f?'
    key = urllib.parse.urlencode({'kw': kw})
    fullurl = url + key
    tiebaSpider(fullurl, beginPage, endPage)
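One practical caveat: many sites reject requests that carry no browser-like User-Agent. In that case the Request in loadPage is commonly built with a headers dict (the header value below is just an example):

headers = {'User-Agent': 'Mozilla/5.0'}   # example value; any common browser UA string works
request = urllib.request.Request(url, headers=headers)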