


Tutorial on using the Python-nmap network scanning and sniffing toolkit
Nmap concepts
Nmap ("Network Mapper") was originally a network scanning and sniffing toolkit for Linux.
Nmap is a port scanner: it probes computers on a network to discover open ports, determine which services are listening on them, and infer which operating system the target is running (a technique also known as fingerprinting). It is an essential tool for network administrators and is widely used to assess the security of networked systems.
Like most network security tools, nmap is favored by attackers as well as defenders, from skilled hackers to script kiddies. System administrators can use nmap to detect unauthorized servers in their environment, while attackers use it to map a target's network configuration when planning an attack.
Nmap is often confused with Nessus, a vulnerability assessment tool. Nmap can use stealthy scanning techniques to evade intrusion detection systems while trying not to disrupt the normal operation of the target system.
In The Matrix Reloaded, Trinity uses Nmap, together with the 32-bit cyclic redundancy check vulnerability in SSH1, to break into a power plant's energy management system.
Nmap features
Nmap has three basic functions: detecting whether a group of hosts is online; scanning host ports and probing the network services they provide; and inferring the operating system a host is running. Nmap can scan anything from a two-node LAN up to networks of more than 500 nodes. It also lets users customize their scanning techniques: a simple ICMP ping is enough for many needs, but Nmap can also probe UDP or TCP ports in depth, down to the operating system a host runs, and record every result in logs in a variety of formats for later analysis.
Perform a ping scan and print the hosts that responded, without any further testing (such as port scanning or operating system detection):
nmap -sP 192.168.1.0/24
List every host on the specified network without sending any packets to the targets:
nmap -sL 192.168.1.0/24
Probe the target host with a TCP SYN ping; a comma-separated list of ports can be appended directly to the flag (e.g. -PS22,23,25,80):
nmap -PS 192.168.1.234
Use UDP ping to detect the host:
nmap -PU 192.168.1.0/24
The most frequently used scan option is the SYN scan, also known as half-open scanning; it never completes a full TCP connection and executes very quickly:
nmap -sS 192.168.1.0/24
nmap installation
This article uses Ubuntu 16.04 Linux as an example and, in the end, operates nmap mainly from Python.
1. Install nmap first
sudo apt-get install nmap
2. Then install python-nmap
sudo pip install python-nmap
After installation, import nmap in a Python shell to verify that everything works:
com@pythontab:~# python
Python 2.7.12 (default, Dec 3 2016, 10:42:27)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-17)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import nmap
Operating nmap from Python
1. A simple example
Create a PortScanner instance, then scan ports 20-443 of the IP 114.114.114.114.
import nmap

nm = nmap.PortScanner()
ret = nm.scan('114.114.114.114', '20-443')
print ret
The return value has the following format (this sample output comes from a scan of port 20 on a different host):
{
    'nmap': {
        'scanstats': {'uphosts': '1', 'timestr': 'Tue Oct 25 11:30:47 2016',
                      'downhosts': '0', 'totalhosts': '1', 'elapsed': '1.11'},
        'scaninfo': {'tcp': {'services': '20', 'method': 'connect'}},
        'command_line': 'nmap -oX - -p 20 -sV 115.239.210.26'
    },
    'scan': {
        '115.239.210.26': {
            'status': {'state': 'up', 'reason': 'syn-ack'},
            'hostnames': [{'type': '', 'name': ''}],
            'vendor': {},
            'addresses': {'ipv4': '115.239.210.26'},
            'tcp': {
                20: {'product': '', 'state': 'filtered', 'version': '',
                     'name': 'ftp-data', 'conf': '3', 'extrainfo': '',
                     'reason': 'no-response', 'cpe': ''}
            }
        }
    }
}
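The nested dictionary above can be awkward to navigate by hand. As a sketch, here is one way to pull out the interesting fields; the sample data below is an assumption shaped after the result shown above, so you can run it without a live scan:

```python
# Sample data mirroring the structure of a python-nmap scan result
# (the host IP and port are assumptions taken from the example above).
sample = {
    'nmap': {
        'scanstats': {'uphosts': '1', 'downhosts': '0',
                      'totalhosts': '1', 'elapsed': '1.11'},
        'command_line': 'nmap -oX - -p 20 -sV 115.239.210.26',
    },
    'scan': {
        '115.239.210.26': {
            'status': {'state': 'up', 'reason': 'syn-ack'},
            'tcp': {20: {'state': 'filtered', 'name': 'ftp-data',
                         'reason': 'no-response'}},
        }
    }
}

def port_states(result):
    """Map each scanned host to a {port: state} dict for its TCP ports."""
    states = {}
    for host, info in result['scan'].items():
        states[host] = {port: data['state']
                        for port, data in info.get('tcp', {}).items()}
    return states

print(port_states(sample))                      # {'115.239.210.26': {20: 'filtered'}}
print(sample['nmap']['scanstats']['elapsed'])   # '1.11'
```

The same `port_states` helper works unchanged on the dictionary returned by a real `nm.scan()` call.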
2. Built-in methods:
You can also print simple summary information:
import nmap

nm = nmap.PortScanner()
nm.scan('114.114.114.114', '20-443')
print nm.scaninfo()     # {u'tcp': {'services': u'20-443', 'method': u'syn'}}
print nm.command_line() # u'nmap -oX - -p 20-443 -sV 114.114.114.114'
List the hosts that were scanned:
print nm.all_hosts()
[u'114.114.114.114']
View the detailed information for a host:
nm['114.114.114.114']
View all protocols found on the host:
nm['114.114.114.114'].all_protocols()
Check which of the host's ports were scanned over TCP:
nm['114.114.114.114']['tcp']
nm['114.114.114.114']['tcp'].keys()
Check whether a given TCP port appears in the results:
nm['114.114.114.114'].has_tcp(21)
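The lookups above combine naturally into a small helper. As a sketch, the host entry below is an assumption shaped like the per-host dictionaries shown earlier, so the function can be exercised without a live scan:

```python
# A hypothetical host entry, shaped like nm['114.114.114.114'] above.
sample_host = {
    'status': {'state': 'up'},
    'tcp': {
        21: {'state': 'open', 'name': 'ftp'},
        22: {'state': 'open', 'name': 'ssh'},
        23: {'state': 'filtered', 'name': 'telnet'},
    },
}

def open_tcp_ports(host_info):
    """Return the sorted TCP ports whose state is 'open'."""
    tcp = host_info.get('tcp', {})
    return sorted(port for port, data in tcp.items()
                  if data['state'] == 'open')

print(open_tcp_ports(sample_host))  # [21, 22]
```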
You can also pass arguments through to the underlying nmap command like this:
nm.scan(hosts='192.168.1.0/24', arguments='-n -sP -PE -PA21,23,80,3389')
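Putting the pieces together, a ping sweep can be wrapped in a small helper. This is a sketch: the subnet and argument string are taken from the example above, and running `ping_sweep()` requires both nmap and the python-nmap package to be installed; the pure `up_hosts` function works on any result dictionary:

```python
try:
    import nmap  # requires the python-nmap package
except ImportError:
    nmap = None  # the pure helper below still works without it

def up_hosts(result):
    """Return the hosts reported as 'up' in a python-nmap result dict."""
    return sorted(host for host, info in result['scan'].items()
                  if info.get('status', {}).get('state') == 'up')

def ping_sweep(subnet='192.168.1.0/24'):
    """Ping-sweep the subnet (from the example above) and return live hosts."""
    nm = nmap.PortScanner()
    result = nm.scan(hosts=subnet, arguments='-n -sP -PE -PA21,23,80,3389')
    return up_hosts(result)
```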
The above is the detailed content of Tutorial on using the Python-nmap network scanning and sniffing toolkit. For more information, please follow other related articles on the PHP Chinese website!
