The long-awaited detection classic has been followed by another wave of attention: YOLOv5. At the time of writing, YOLOv5 does not yet have a complete release, so the most important thing now is to understand YOLOv4 thoroughly, which offers many lessons for object detection and can bring large improvements in certain scenarios. Today we analyze YOLOv4 for you. In the next issue, we will practice deploying YOLOv5 to an iPhone, or run real-time detection through a camera on a terminal device!
A large number of features are claimed to improve the accuracy of convolutional neural networks (CNNs). Combinations of these features need to be tested in practice on large datasets, and the results theoretically justified. Some features work only for certain models, certain problems, or small datasets, while others, such as batch normalization and residual connections, apply to the majority of models, tasks, and datasets. The YOLOv4 paper assumes such universal features include weighted residual connections (WRC), cross-stage partial connections (CSP), cross mini-batch normalization (CmBN), self-adversarial training (SAT), and Mish activation. The paper combines new features (WRC, CSP, CmBN, SAT, Mish activation, Mosaic data augmentation, DropBlock regularization, and CIoU loss) to achieve the following result: 43.5% AP (65.7% AP50) on the MS COCO dataset at a real-time speed of about 65 FPS on a Tesla V100.
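Of the features listed above, the Mish activation is the simplest to state. Below is a minimal NumPy sketch of its formula, x * tanh(softplus(x)); the function names are my own, not from the paper:

```python
import numpy as np

def softplus(x):
    # Numerically stable log(1 + exp(x))
    return np.logaddexp(0.0, x)

def mish(x):
    # Mish activation: x * tanh(softplus(x))
    return x * np.tanh(softplus(x))

x = np.array([-2.0, 0.0, 2.0])
print(mish(x))  # smooth, non-monotonic near zero, roughly linear for large x
```

Unlike ReLU, Mish is smooth everywhere and lets small negative values pass through, which is part of why it helps some deep detectors.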
Mosaic data augmentation
Mosaic stitches four pictures into a single picture for training, which effectively increases the mini-batch size in disguise. It is an extension of CutMix, which mixes only two pictures.
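The idea can be sketched in a few lines of NumPy. This is a simplified toy version (random split point, top-left crops, no bounding-box handling), not the full YOLOv4 implementation:

```python
import numpy as np

def mosaic(imgs, out_size=64, rng=None):
    """Stitch four HxWx3 images into one out_size x out_size mosaic.

    The split point (cx, cy) is sampled randomly, as in Mosaic-style
    augmentation; each source image is cropped (here: its top-left
    corner) to fill one quadrant of the canvas.
    """
    assert len(imgs) == 4
    rng = np.random.default_rng(rng)
    cx = int(rng.integers(out_size // 4, 3 * out_size // 4))
    cy = int(rng.integers(out_size // 4, 3 * out_size // 4))
    canvas = np.zeros((out_size, out_size, 3), dtype=imgs[0].dtype)
    regions = [(slice(0, cy), slice(0, cx)),                  # top-left
               (slice(0, cy), slice(cx, out_size)),           # top-right
               (slice(cy, out_size), slice(0, cx)),           # bottom-left
               (slice(cy, out_size), slice(cx, out_size))]    # bottom-right
    for img, (rs, cs) in zip(imgs, regions):
        h, w = rs.stop - rs.start, cs.stop - cs.start
        canvas[rs, cs] = img[:h, :w]
    return canvas

# Usage: four solid-color "images" become one four-quadrant mosaic.
imgs = [np.full((64, 64, 3), i, dtype=np.uint8) for i in range(4)]
m = mosaic(imgs, out_size=64, rng=0)
print(m.shape)  # (64, 64, 3)
```

In the real pipeline the ground-truth boxes of all four images are remapped into the mosaic, so one training sample carries objects from four contexts.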
Self-Adversarial Training
For a given picture, the neural network back-propagates to the picture itself, modifying and perturbing it, and then trains on the altered picture. This reverse updating of the image is also the main mechanism behind image stylization.
Self-Adversarial Training (SAT) also represents a new data augmentation technique that operates in two forward-backward stages. In the first stage, the neural network alters the original image instead of the network weights; in this way the network executes an adversarial attack on itself, altering the image to create the deception that the desired object is not present. In the second stage, the neural network is trained in the normal way to detect the object on this modified image.
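The two stages can be illustrated on a toy logistic model with plain NumPy; this is only a sketch of the idea (an FGSM-style input perturbation followed by a weight update), not YOLOv4's actual network or training code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sat_step(w, x, y, eps=0.1, lr=0.1):
    """One SAT-style step on a toy logistic classifier.

    Stage 1: perturb the *input* along the sign of the loss gradient,
             attacking the model itself (weights frozen).
    Stage 2: update the *weights* on the perturbed input, keeping the
             original label, as in normal training.
    """
    # Stage 1: forward/backward with respect to the image
    p = sigmoid(w @ x)
    grad_x = (p - y) * w               # dL/dx for binary cross-entropy
    x_adv = x + eps * np.sign(grad_x)  # image moves toward higher loss
    # Stage 2: forward/backward with respect to the weights
    p_adv = sigmoid(w @ x_adv)
    grad_w = (p_adv - y) * x_adv       # dL/dw on the attacked image
    return w - lr * grad_w, x_adv

w = np.array([1.0, -1.0])
x = np.array([1.0, 0.0])
w_new, x_adv = sat_step(w, x, y=1.0)
print(x_adv)  # the input has been nudged to fool the model
```

The key point the sketch shows is the role reversal: stage 1 holds the weights fixed and updates the image, stage 2 holds the (attacked) image fixed and updates the weights.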
Cross mini-Batch Normalization (CmBN)
CmBN is a modified version of CBN, as shown in the figure below. Unlike CBN, it collects statistics only between the mini-batches within a single (outer) batch, never across batch boundaries.
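A toy NumPy sketch of the accumulation idea, assuming 1-D activations and ignoring the learnable scale/shift and the Taylor compensation used by the real CBN/CmBN: statistics accumulate over the mini-batches of one outer batch and are reset at its boundary.

```python
import numpy as np

def cmbn_normalize(minibatches, eps=1e-5):
    """Normalize each mini-batch with statistics accumulated over ALL
    mini-batches seen so far within the current outer batch.

    The accumulators (n, s, s2) live only for one outer batch; calling
    the function again starts fresh, which is the CmBN restriction.
    """
    outs = []
    n, s, s2 = 0, 0.0, 0.0
    for mb in minibatches:            # mini-batches of ONE outer batch
        n += mb.size
        s += mb.sum()
        s2 += (mb ** 2).sum()
        mean = s / n
        var = s2 / n - mean ** 2      # accumulated (biased) variance
        outs.append((mb - mean) / np.sqrt(var + eps))
    return outs

outs = cmbn_normalize([np.array([1.0, 2.0]), np.array([3.0, 4.0])])
print(outs[1])  # second mini-batch normalized with stats from both
```

The benefit is that small mini-batches get more reliable statistics without borrowing them from older iterations, where the weights were different.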
Modified SAM
SAM is modified from spatial-wise attention to point-wise attention, and the modified PAN replaces the shortcut addition (add) with channel concatenation (concat).
Take data augmentation as an example: although it increases training time, it gives the model better generalization and robustness. Common augmentation methods include photometric and geometric distortions as well as CutOut, MixUp, and CutMix.
The experiments show that, by combining a large number of tricks, YOLOv4 is arguably the strongest bag of tricks for object detection. The table below shows the experiments on classification networks:
CSPResNeXt-50 classifier accuracy
CSPDarknet-53 classifier accuracy
On the YOLOv4 detection network, four losses (GIoU, CIoU, DIoU, MSE), label smoothing, cosine learning-rate schedule, genetic-algorithm hyperparameter selection, Mosaic data augmentation, and other methods were compared. The following table shows the ablation results on the YOLOv4 detection network:
CSPResNeXt50-PANet-SPP, 512x512
Training with models using different pre-trained weights:
Different mini-batch size results:
Finally, a comparison of results on the COCO dataset across three different GPU series: Maxwell, Pascal, and Volta:
Most exciting of all is the comparison with other frameworks (speed versus accuracy) on the COCO dataset:
The above is the detailed content of "The whole process of deploying YOLOv5 to iPhone or terminal practice". For more information, please follow other related articles on the PHP Chinese website!