


Correctly understanding and using the super() function in Python programming
When a subclass needs to call a method of its parent class, the approach before Python 2.2 was to call the method directly through the class name, that is, as an unbound class method, passing the instance self in explicitly as an argument.
class A(object):
    def say(self):
        print 'I am A'

class B(A):
    def say(self):
        print 'I am B'
        A.say(self)

b = B()
b.say()
Output
I am B
I am A
This works, but there is a problem: if the parent class is ever renamed, every one of these explicit calls must be corrected one by one, so the coupling between the subclass and the parent class is high.
So, starting with Python 2.2, the super() function was introduced to avoid this hard coding and free the subclass from caring about the parent class's name.
Using the super() function, the above code can be written as follows.
class B(A):
    def say(self):
        print 'I am B'
        super(B, self).say()
Python 3 improves on this further: super() no longer needs any arguments, so the call above can be written simply as super().say().
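For reference, here is the same example rewritten for Python 3 with the zero-argument form of super() (a minimal sketch):

class A:
    def say(self):
        print('I am A')

class B(A):
    def say(self):
        print('I am B')
        super().say()  # Python 3: no arguments needed

b = B()
b.say()  # prints 'I am B' then 'I am A'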
Things to note:
super() can only be used with new-style classes (classes that ultimately inherit from object).
Be careful with multiple inheritance: if a subclass inherits from several parent classes, super() calls the method of the first parent class (more precisely, the next class in the MRO, as explained below).
Don't mix these two ways of calling parent-class methods: either use unbound class-method calls everywhere or use super() everywhere. Mixing them can leave a method uncalled or call it multiple times (see the sketch below).
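As an illustration, here is a minimal Python 3 sketch with hypothetical classes Base, Left, Right, and Child, showing how mixing the two styles silently skips a class in a diamond hierarchy:

class Base(object):
    def greet(self):
        print("Base")

class Left(Base):
    def greet(self):
        print("Left")
        Base.greet(self)   # hard-coded, unbound-style call: bypasses the MRO

class Right(Base):
    def greet(self):
        print("Right")
        super().greet()    # cooperative super() call: follows the MRO

class Child(Left, Right):
    def greet(self):
        print("Child")
        super().greet()

Child().greet()
# Prints: Child, Left, Base -- Right.greet() is never called, because
# Left.greet() jumps straight to Base instead of following the MRO.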
BUT:
Don't think of the parent class when you think of super: super refers to the next class in the MRO! Equating super with the parent class is a mistake beginners easily make, and one I made myself back then. What super actually does is roughly equivalent to this:
def super(cls, inst):
    mro = inst.__class__.mro()
    return mro[mro.index(cls) + 1]
The two parameters, cls and inst, do two different jobs:
1. inst determines the MRO list (via its class);
2. cls locates its own index in that MRO, and super returns mro[index + 1].
These two steps are the essence of super; keep them firmly in mind!
MRO stands for Method Resolution Order; it is the order in which Python searches a class and its base classes when resolving a method.
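The MRO of any new-style class can be inspected directly with its mro() method; a minimal sketch using hypothetical classes X, Y, and Z:

class X(object): pass
class Y(X): pass
class Z(Y): pass

print(Z.mro())  # [Z, Y, X, object]: the order in which methods are looked up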
A fuller example with a diamond inheritance hierarchy:
class Root(object):
    def __init__(self):
        print("this is Root")

class B(Root):
    def __init__(self):
        print("enter B")
        # print(self)  # this will print an instance of D
        super(B, self).__init__()
        print("leave B")

class C(Root):
    def __init__(self):
        print("enter C")
        super(C, self).__init__()
        print("leave C")

class D(B, C):
    pass

d = D()
print(d.__class__.__mro__)
Output
enter B
enter C
this is Root
leave C
leave B
(<class '__main__.D'>, <class '__main__.B'>, <class '__main__.C'>, <class '__main__.Root'>, <class 'object'>)
Once you know that super has no inherent relationship with the parent class, it is not hard to see why the line after enter B is enter C rather than this is Root (if you think super means "call the parent class's method", you would take it for granted that the next line should be this is Root). The process is as follows. In B's __init__ function:
super(B, self).__init__()
First, we get self.__class__.__mro__. Note that self here is an instance of D, not of B, so the MRO is (D, B, C, Root, object). Then B is used to locate its own index in that MRO, and the next entry is taken. The class after B is clearly C, so C's __init__ is called and enter C is printed.
By the way, why is B's __init__ called at all? Because D does not define __init__, the lookup moves to the next class in the MRO that does define it, which is B.
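To check the index arithmetic concretely, the simplified model from earlier can be applied by hand to the Root/B/C/D classes above (a quick sketch):

mro = D.mro()                 # [D, B, C, Root, object]
print(mro[mro.index(B) + 1])  # <class '__main__.C'>: the class after B in D's MRO, not B's parent Root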
Once you understand what super really does, the whole flow of logic becomes quite clear.
The above is a correct understanding and usage analysis of the super() function in Python programming. For more related content, please follow the PHP Chinese website (www.php.cn)!
