Making concurrent interface requests in Golang
Go is a programming language well suited to concurrent programming, and its strengths show when implementing high-concurrency services and applications. In day-to-day development we often need to request several interfaces concurrently or process large amounts of data in parallel. This article introduces how to make concurrent interface requests in Go.
Scenarios for concurrent interface requests
In real-world development, we often need to request interfaces and collect their response data, for example:
- Fetching product data from a website.
- Pulling data from several APIs and presenting it in an aggregated view.
- Querying multiple data sources at the same time to gather data quickly.
In a single-threaded flow, each interface request must finish before the next one starts, which slows down the whole process. Making the requests concurrently, by contrast, lets us issue multiple requests at the same time and greatly improves throughput.
Concurrent processing with goroutines
A goroutine is a lightweight thread of execution managed by the Go runtime, running concurrently with the main goroutine. By running multiple goroutines at once we can request several interfaces at the same time and then merge the results after all requests complete. Using goroutines is straightforward: prefix a function call with the go keyword.
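As a quick illustration, here is a minimal sketch (not tied to any particular interface) of starting goroutines with the go keyword. Note that without synchronization, main may exit before the goroutines run; that is exactly the problem sync.WaitGroup solves in the next section.

package main

import (
	"fmt"
	"time"
)

func main() {
	for i := 1; i <= 3; i++ {
		go func(n int) {
			fmt.Println("goroutine", n, "running")
		}(i)
	}
	// Crude wait so main does not exit before the goroutines finish;
	// the next section replaces this with sync.WaitGroup.
	time.Sleep(time.Second)
}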
Controlling goroutines with WaitGroup
In practice, some goroutines take longer than others to return their results, and we need to wait for all of them before doing any follow-up processing. sync.WaitGroup lets us track the number of in-flight goroutines and block until every request has received its response.
package main

import (
	"fmt"
	"io/ioutil"
	"net/http"
	"sync"
)

var wg sync.WaitGroup // a sync.WaitGroup instance used to coordinate the goroutines

func main() {
	urls := []string{
		"https://www.baidu.com",
		"https://www.qq.com",
		"https://www.taobao.com",
		"https://www.jd.com",
		"https://www.mi.com",
	}
	// Launch one goroutine per URL.
	for _, url := range urls {
		wg.Add(1) // register one more goroutine to wait for
		go getBody(url)
	}
	wg.Wait() // block until every goroutine has called Done
}

// getBody fetches the given url and prints the response body.
func getBody(url string) {
	defer wg.Done() // deferred so Done runs even on the early error returns

	resp, err := http.Get(url) // issue the HTTP GET request
	if err != nil {
		fmt.Println(err)
		return
	}
	defer resp.Body.Close()

	body, err := ioutil.ReadAll(resp.Body)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Printf("url: %s, contents: %s\n", url, string(body))
}
In the code above, we first declare a sync.WaitGroup instance to coordinate the goroutines. In main(), we start one goroutine per entry in urls; each time we start one we call wg.Add(1), telling the WaitGroup there is one more goroutine to wait for, so the counter ends up equal to the number of URLs. The line go getBody(url) launches the goroutine that requests the URL, and wg.Done() (deferred) is called when that goroutine finishes, marking it as complete. Finally, wg.Wait() makes the main goroutine block until all of the worker goroutines have finished.
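The example above only prints each response. If the results need to be merged afterwards, as mentioned earlier, one common pattern is to send each response into a channel and collect them in the main goroutine. The following is a hedged sketch, not part of the original example; the result struct is a hypothetical type introduced just to pair a URL with its body or error.

package main

import (
	"fmt"
	"io"
	"net/http"
	"sync"
)

// result is a hypothetical struct pairing a URL with its response body or error.
type result struct {
	url  string
	body string
	err  error
}

func main() {
	urls := []string{"https://www.baidu.com", "https://www.qq.com", "https://www.taobao.com"}

	var wg sync.WaitGroup
	results := make(chan result, len(urls)) // buffered so workers never block on send

	for _, url := range urls {
		wg.Add(1)
		go func(url string) {
			defer wg.Done()
			resp, err := http.Get(url)
			if err != nil {
				results <- result{url: url, err: err}
				return
			}
			defer resp.Body.Close()
			body, err := io.ReadAll(resp.Body) // io.ReadAll is the modern replacement for ioutil.ReadAll
			results <- result{url: url, body: string(body), err: err}
		}(url)
	}

	wg.Wait()
	close(results) // safe to close: all sends have completed

	// Merge and process the collected responses.
	for r := range results {
		if r.err != nil {
			fmt.Printf("url: %s, error: %v\n", r.url, r.err)
			continue
		}
		fmt.Printf("url: %s, %d bytes\n", r.url, len(r.body))
	}
}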
Best Practices for Concurrent Requests
In practice there are a few details worth paying attention to; handling them well makes concurrent interface requests much more reliable.
1. Limiting the number of concurrent requests
When issuing requests concurrently, we should cap the degree of concurrency, especially when the number of requests is large, to avoid putting excessive pressure on the server all at once. We can set a maximum concurrency and enforce it with a buffered channel, which acts as a simple semaphore.
ch := make(chan struct{}, 5) // buffered channel of capacity 5: at most 5 goroutines run at once

for _, url := range urls {
	ch <- struct{}{} // acquire a slot; blocks while 5 goroutines are already running
	wg.Add(1)        // register one more goroutine
	go func(url string) {
		defer wg.Done()
		getBody(url)
		<-ch // release the slot, letting the next goroutine start
	}(url)
}
We declare the buffered channel with a capacity of 5, meaning at most 5 goroutines can run at the same time. As we iterate over urls, we first send an empty struct into the channel; once the channel is full, this send blocks, so no more than 5 goroutines are ever running concurrently. Each goroutine is an anonymous func(url string) that calls getBody(url), and when it finishes it receives from the channel (<-ch), freeing a slot so the next goroutine can start.
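Put together with the earlier WaitGroup example, a complete version might look like the sketch below. The limit of 3 is chosen arbitrarily here (smaller than the number of URLs) so that the cap actually takes effect; the URLs are the same sample sites used above.

package main

import (
	"fmt"
	"io"
	"net/http"
	"sync"
)

func main() {
	urls := []string{
		"https://www.baidu.com", "https://www.qq.com", "https://www.taobao.com",
		"https://www.jd.com", "https://www.mi.com",
	}

	var wg sync.WaitGroup
	sem := make(chan struct{}, 3) // semaphore: at most 3 requests in flight

	for _, url := range urls {
		sem <- struct{}{} // acquire a slot (blocks when 3 requests are in flight)
		wg.Add(1)
		go func(url string) {
			defer wg.Done()
			defer func() { <-sem }() // release the slot even if the request fails
			getBody(url)
		}(url)
	}
	wg.Wait()
}

// getBody fetches url and prints how many bytes were returned.
func getBody(url string) {
	resp, err := http.Get(url)
	if err != nil {
		fmt.Println(err)
		return
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Printf("url: %s, %d bytes\n", url, len(body))
}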
2. Avoiding blocked requests
When making concurrent requests, we need to guard against requests that block, typically because a server takes a long time to respond. context.Context solves this: we attach a timeout to the request, and if the timeout elapses, the blocked request is cancelled.
url := "https://httpstat.us/200?sleep=8000" ctx, cancel := context.WithTimeout(context.Background(), time.Millisecond*5000) // 告诉请求,5秒之后自动取消 defer cancel() req, err := http.NewRequestWithContext(ctx, "GET", url, nil) // 使用请求上下文 if err != nil { log.Fatal(err) } client := http.DefaultClient resp, err := client.Do(req) // 发起请求 if err != nil { log.Fatal(err) } if resp.StatusCode == http.StatusOK { contents, err := ioutil.ReadAll(resp.Body) if err != nil { log.Fatal(err) } fmt.Printf("%s ", contents) }
In the code above, context.WithTimeout creates a request context with a 5-second timeout. The example URL https://httpstat.us/200?sleep=8000 takes 8 seconds to respond, so it will exceed the deadline. We build the request with http.NewRequestWithContext so that it carries the context, send it with http.DefaultClient, and print the response body if the status code is 200. When the timeout is reached, the connection is shut down and client.Do returns a "context deadline exceeded" error.
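When this pattern is used inside the concurrent goroutines from earlier, log.Fatal is a poor fit because a single slow URL would terminate the whole program. Below is a hedged sketch of a per-request timeout that simply reports the error instead; the fetchWithTimeout helper name and the 5-second deadline are assumptions made for illustration.

package main

import (
	"context"
	"fmt"
	"io"
	"net/http"
	"sync"
	"time"
)

// fetchWithTimeout is a hypothetical helper: it fetches url with a per-request
// deadline and returns an error instead of aborting the whole program.
func fetchWithTimeout(url string, timeout time.Duration) (string, error) {
	ctx, cancel := context.WithTimeout(context.Background(), timeout)
	defer cancel()

	req, err := http.NewRequestWithContext(ctx, "GET", url, nil)
	if err != nil {
		return "", err
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return "", err // includes "context deadline exceeded" on timeout
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	return string(body), err
}

func main() {
	urls := []string{
		"https://httpstat.us/200?sleep=8000", // slower than the timeout, will fail
		"https://httpstat.us/200",            // should succeed
	}

	var wg sync.WaitGroup
	for _, url := range urls {
		wg.Add(1)
		go func(url string) {
			defer wg.Done()
			body, err := fetchWithTimeout(url, 5*time.Second)
			if err != nil {
				fmt.Printf("url: %s, error: %v\n", url, err)
				return
			}
			fmt.Printf("url: %s, %d bytes\n", url, len(body))
		}(url)
	}
	wg.Wait()
}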
3. Avoiding duplicate requests
When requesting interfaces, it is possible to issue the same request more than once, which wastes time and resources. We can use sync.Map in Go to make sure each URL is requested only once.
package main

import (
	"fmt"
	"io/ioutil"
	"log"
	"net/http"
	"sync"
)

func main() {
	var m = sync.Map{}
	var wg sync.WaitGroup

	url := "https://httpbin.org/get"
	wg.Add(2)
	go doGet(url, &m, &wg)
	go doGet(url, &m, &wg)
	wg.Wait()
}

func doGet(url string, m *sync.Map, wg *sync.WaitGroup) {
	defer wg.Done()

	// LoadOrStore records that the url has been requested: if the key already
	// exists, loaded is true and we skip the request; otherwise the key is
	// stored and loaded is false.
	_, loaded := m.LoadOrStore(url, true)
	if loaded {
		fmt.Printf("url %s already requested.\n", url)
		return
	}

	resp, err := http.Get(url)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	contents, err := ioutil.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s\n", contents)
}
In the code above, a sync.Map ensures each URL is requested only once. Inside doGet, m.LoadOrStore(url, true) checks whether the URL has already been requested; if it has, the goroutine returns immediately. Otherwise it issues the http.Get request and prints the response data. In either case, the deferred wg.Done() marks the goroutine as finished.
Summary
This article introduced how to make concurrent interface requests in Go: process requests concurrently with goroutines, coordinate them with sync.WaitGroup, and cap the concurrency with a buffered channel; avoid blocked requests by setting a timeout on the request context; and avoid duplicate requests with sync.Map. Together, these techniques greatly improve request throughput as well as the overall coding experience.