


Introduction to Gearman, a powerful tool for concurrent multi-process processing in PHP
In our work we sometimes need to publish data to several servers at once, or to handle multiple tasks at the same time. PHP's curl_multi functions can issue requests concurrently, but because of network conditions, data volume, and the varying state of each server, the response time of that approach is poor: each concurrent request also has to write logs, process data, run business logic, and wait for the result before returning, so it does not give back-office users a friendly experience.
There is another solution: use Gearman to meet the concurrency requirement. The client sends jobs to Gearman's job server, and each worker performs the curl_multi calls, data processing, logging, and so on. Meanwhile, Supervisor monitors the gearmand and worker processes. Together this gives us a parallel, multi-process, load-balanced setup.
What Gearman can do:
Asynchronous processing: image processing, order processing, batch emails/notifications, etc.
CPU- or memory-intensive processing: large-volume data processing, MapReduce jobs, log aggregation, video encoding
Distributed and parallel processing
Scheduled processing: incremental updates, data copying
Rate-limited FIFO processing
Distributed system monitoring tasks
Gearman working principle:
An application that uses Gearman usually has three parts: a client, a worker, and a job server. The client submits a job to the job server, which finds a suitable worker to run it. The worker executes the job sent by the client and returns the result to the client via the job server. Gearman provides client and worker APIs, and applications use these APIs to communicate with the Gearman job server. All communication between clients and workers goes through TCP connections.
Because jobs go through the job server, Gearman can distribute the workload across different machines.
Installation:
rpm -ivh http://dl.iuscommunity.org/pub/ius/stable/Redhat/6/x86_64/epel-release-6-5.noarch.rpm
yum install -y gearmand
Start:
gearmand -d
Install the PHP Gearman extension
I installed it with pecl. You can also download the source package and compile it yourself, but remember to install libgearman and re2c first, otherwise compiling the extension will fail.
pecl install gearman  # if this fails with a version error, try pecl install gearman-1.0.3; the default at the time of writing seems to be 1.1.2
Compiling and installing from source is also simple:
wget -c http://pecl.php.net/get/gearman-1.1.1.tgz
tar zxvf gearman-1.1.1.tgz
cd gearman-1.1.1
phpize
./configure
make && make install
echo "extension=gearman.so" >> /etc/php.ini
PHP interface functions
The Gearman extension exposes a complete set of classes, including GearmanClient, GearmanJob, GearmanTask, and GearmanWorker. For details, see the official PHP manual.
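Before the full example below, here is a minimal sketch of the simplest usage pattern (an assumption for illustration, not from the article: the function name 'reverse', the fallback helper, and the host/port are mine to adapt). GearmanClient::doNormal() submits a single job and blocks until a worker returns the result; the extension_loaded() guard keeps the script runnable even where the gearman extension is absent.

```php
<?php
// Pure helper: the reply a hypothetical "reverse" worker would compute.
function make_reply(string $workload): string
{
    return strrev($workload);
}

if (extension_loaded('gearman')) {
    $client = new GearmanClient();
    $client->addServer('127.0.0.1', 4730); // gearmand's default port

    // Blocks until a worker registered for "reverse" answers.
    echo $client->doNormal('reverse', 'Hello Gearman'), PHP_EOL;
} else {
    // No extension or server available: show the expected result locally.
    echo make_reply('Hello Gearman'), PHP_EOL;
}
```

For anything latency-sensitive you would use addTask()/runTasks() instead, as the official example below does, so several jobs can run in parallel.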
This is one of the official examples, and a good illustration of distributing tasks for concurrent processing:
<?php
$client = new GearmanClient();
$client->addServer();

// initialize the results of our 3 "query results" here
$userInfo = $friends = $posts = null;

// This sets up what gearman will callback to as tasks are returned to us.
// The $context helps us know which function is being returned so we can
// handle it correctly.
$client->setCompleteCallback(function(GearmanTask $task, $context) use (&$userInfo, &$friends, &$posts) {
    switch ($context) {
        case 'lookup_user':
            $userInfo = $task->data();
            break;
        case 'baconate':
            $friends = $task->data();
            break;
        case 'get_latest_posts_by':
            $posts = $task->data();
            break;
    }
});

// Here we queue up multiple tasks to be executed in *as much* parallelism as gearmand can give us
$client->addTask('lookup_user', 'joe@joe.com', 'lookup_user');
$client->addTask('baconate', 'joe@joe.com', 'baconate');
$client->addTask('get_latest_posts_by', 'joe@joe.com', 'get_latest_posts_by');

echo "Fetching...\n";
$start = microtime(true);
$client->runTasks();
$totaltime = number_format(microtime(true) - $start, 2);

echo "Got user info in: $totaltime seconds:\n";
var_dump($userInfo, $friends, $posts);
gearman_work.php
<?php
$worker = new GearmanWorker();
$worker->addServer();

$worker->addFunction('lookup_user', function(GearmanJob $job) {
    // normally you'd do some very safe type checking and query binding to a database here.
    // ...and we're gonna fake that.
    sleep(3);
    return 'The user requested (' . $job->workload() . ') is 7 feet tall and awesome';
});

$worker->addFunction('baconate', function(GearmanJob $job) {
    sleep(3);
    return 'The user (' . $job->workload() . ') is 1 degree away from Kevin Bacon';
});

$worker->addFunction('get_latest_posts_by', function(GearmanJob $job) {
    sleep(3);
    return 'The user (' . $job->workload() . ') has no posts, sorry!';
});

while ($worker->work());
I ran gearman_work.php in three terminals:
ryan@ryan-lamp:~$ ps aux | grep gearman* | grep -v grep
gearman   1504  0.0  0.1  60536  1264 ?      Ssl  11:06  0:00 /usr/sbin/gearmand --pid-file=/var/run/gearman/gearmand.pid --user=gearman --daemon --log-file=/var/log/gearman-job-server/gearman.log --listen=127.0.0.1
ryan      2992  0.0  0.8  43340  9036 pts/0  S+   14:05  0:00 php /var/www/gearmand_work.php
ryan      3713  0.0  0.8  43340  9036 pts/1  S+   14:05  0:00 php /var/www/gearmand_work.php
ryan      3715  0.0  0.8  43340  9036 pts/2  S+   14:05  0:00 php /var/www/gearmand_work.php
Now let's check the output of running the client script:
Fetching...
Got user info in: 3.03 seconds:
string(59) "The user requested (joe@joe.com) is 7 feet tall and awesome"
string(56) "The user (joe@joe.com) is 1 degree away from Kevin Bacon"
string(43) "The user (joe@joe.com) has no posts, sorry!"
The 3.03-second total shows that the tasks submitted by the client were distributed and executed in parallel: each worker function sleeps for 3 seconds, so running the three jobs serially would have taken about 9 seconds.
In a real production environment, we can use the Supervisor tool to make sure the gearmand and worker processes have not exited unexpectedly, and to restart them if they have.
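A Supervisor program section along these lines keeps three worker processes alive (a sketch only: the program name, paths, user, and process count are assumptions to adapt to your environment):

```ini
; /etc/supervisor/conf.d/gearman_work.conf (path is an assumption)
[program:gearman_work]
command=php /var/www/gearman_work.php
numprocs=3
process_name=%(program_name)s_%(process_num)02d
autostart=true
autorestart=true
user=www-data
stdout_logfile=/var/log/gearman_work.log
```

After writing the file, `supervisorctl reread` followed by `supervisorctl update` loads it; `autorestart=true` revives any worker that crashes.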
