Preface
The main point of this article is how to use Python's BeautifulSoup to perform multi-layer traversal, as shown in the picture. This is just a simple exercise, not a crawl of anything hidden.
Sample code
```python
from bs4 import BeautifulSoup as bs
import requests

headers = {
    "Host": "www.jd.com",
    "User-Agent": "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.80 Safari/537.36 Core/1.47.933.400 QQBrowser/9.4.8699.400",
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8"
}

session = requests.session()

def get_url():
    # Fetch the JD.com homepage and parse the returned HTML
    response = bs(session.get('http://www.jd.com/', headers=headers).text, 'html.parser')
    # Locate the category container, then iterate over every link that opens in a new tab
    for i in response.find("p", {"class": "dd-inner"}).find_all("a", {"target": "_blank"}):
        print(i.get_text(), ':', i.get('href'))

get_url()
```
Running this code achieves our goal. Let's walk through the code.
First, we visit JD.com's homepage, then parse the returned page with BeautifulSoup. At this point we need to locate the element that holds what we want. By pressing F12 in the browser to open the developer tools, we can see the structure shown in the picture below:
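Since the live markup on JD.com's homepage changes over time, here is a small self-contained sketch of the kind of structure the inspector reveals (the tag contents are made up for illustration): a container with class "dd-inner" that `find` can locate directly.

```python
from bs4 import BeautifulSoup

# Hypothetical markup mirroring what the inspector shows: a container
# with class "dd-inner" holding the category links we are after.
html = """
<p class="dd-inner">
  <a target="_blank" href="//jiadian.jd.com">Appliances</a>
  <a target="_blank" href="//shouji.jd.com">Phones</a>
</p>
"""

soup = BeautifulSoup(html, 'html.parser')
# find() returns the first matching tag (or None if nothing matches)
container = soup.find("p", {"class": "dd-inner"})
print(container["class"])  # ['dd-inner']
```

Once the container is in hand, everything below it can be traversed without touching the rest of the page.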
Take a look at the following line:

for i in response.find("p", {"class": "dd-inner"}).find_all("a", {"target": "_blank"}):

This one line does exactly what we need. First, the find method locates the p tag with class="dd-inner", and then find_all collects every a tag under it. Finally, to print all the product categories with their corresponding links, i.get_text() returns the category name and i.get('href') returns the matching URL.
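The same find().find_all() chain and the two per-tag accessors can be exercised on a toy snippet (the category names and URLs below are invented); note how the target="_blank" filter drops links that do not match:

```python
from bs4 import BeautifulSoup

# Toy markup to demonstrate the chain: one link deliberately uses
# target="_self" so the attribute filter has something to exclude.
html = """
<p class="dd-inner">
  <a target="_blank" href="//book.jd.com">Books</a>
  <a target="_self" href="//inner.jd.com">Skip me</a>
  <a target="_blank" href="//jipiao.jd.com">Flights</a>
</p>
"""

soup = BeautifulSoup(html, 'html.parser')
# find() narrows to the container, find_all() gathers matching children
links = soup.find("p", {"class": "dd-inner"}).find_all("a", {"target": "_blank"})
for a in links:
    # get_text() yields the visible label, get('href') the link target
    print(a.get_text(), ':', a.get('href'))
```

Only the two target="_blank" links are printed; the target="_self" one is filtered out by the attribute dictionary passed to find_all.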
Summary
It's actually not difficult; the main thing is to use the right method. As a beginner I didn't, and it took me almost two days to get it done. The takeaway is that you can chain find().find_all() to perform multi-layer traversal. The above is my experience using Python to crawl JD.com's product categories and links; I hope it helps everyone learning Python.
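The chaining idea is not limited to two levels. Whenever the markup nests more deeply, each find() call narrows the scope one layer before a final find_all() gathers the results; the three-level menu below is a made-up example of that pattern:

```python
from bs4 import BeautifulSoup

# Hypothetical three-level menu: find() calls can be chained as deep
# as the markup nests before a final find_all() collects the leaves.
html = """
<div class="nav">
  <div class="menu">
    <ul class="items">
      <li><a href="/a">A</a></li>
      <li><a href="/b">B</a></li>
    </ul>
  </div>
</div>
"""

soup = BeautifulSoup(html, 'html.parser')
items = soup.find("div", {"class": "nav"}) \
            .find("div", {"class": "menu"}) \
            .find("ul", {"class": "items"}) \
            .find_all("a")
print([a.get_text() for a in items])  # ['A', 'B']
```

Each intermediate find() returns a Tag, so the next call only searches inside that subtree, which keeps the traversal both readable and precise.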
For more articles related to Python crawling JD.com’s product categories and links, please pay attention to the PHP Chinese website!