Limiting Concurrent Goroutines
In your code, you attempt to limit the number of concurrent goroutines. However, the current implementation doesn't work as intended. Here's an alternative approach:
Solution:
Instead of creating a goroutine for each URL, create a fixed number of workers that process URLs from a shared channel. Here's the modified code:
<code class="go">parallel := flag.Int("parallel", 10, "max parallel requests allowed")
flag.Parse()

// Workers get URLs from this channel
urls := make(chan string)

// Feed the workers with URLs
go func() {
    for _, u := range flag.Args() {
        urls <- u
    }
    // Workers will exit from range loop when channel is closed
    close(urls)
}()

var wg sync.WaitGroup
client := rest.Client{}

results := make(chan string)

// Start the specified number of workers.
for i := 0; i < *parallel; i++ {
    wg.Add(1)
    go func() {
        defer wg.Done()
        for url := range urls {
            worker(url, client, results)
        }
    }()
}

// When workers are done, close results so that main will exit.
go func() {
    wg.Wait()
    close(results)
}()

for res := range results {
    fmt.Println(res)
}</code>
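The worker function and rest.Client above come from the original question and aren't shown here. As a rough idea only, here is a minimal sketch of what such a worker could look like if you substitute the standard net/http client (in which case client would be constructed with &http.Client{} instead of rest.Client{}); the "URL: status" output is purely illustrative.
<code class="go">// Hypothetical worker: the real one comes from the question's own code.
// Requires "fmt", "io", and "net/http" in the import list.
func worker(url string, client *http.Client, results chan<- string) {
    resp, err := client.Get(url)
    if err != nil {
        results <- fmt.Sprintf("%s: %v", url, err)
        return
    }
    defer resp.Body.Close()
    // Drain the body so the underlying connection can be reused.
    io.Copy(io.Discard, resp.Body)
    results <- fmt.Sprintf("%s: %s", url, resp.Status)
}</code>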
Explanation:
Because only the fixed number of workers set by the parallel flag ever read from the shared urls channel, at most that many requests are in flight at any given time. Once the feeding goroutine closes urls, each worker drains the remaining URLs and returns, the WaitGroup releases, and results is closed so the final range loop in main can terminate.
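To see the cap in action without making real HTTP requests, here is a small, self-contained sketch of the same worker-pool shape with a simulated workload; the job count, sleep duration, and peak counter are illustrative only.
<code class="go">package main

import (
    "fmt"
    "sync"
    "sync/atomic"
    "time"
)

func main() {
    const parallel = 3 // cap on concurrent workers

    jobs := make(chan int)
    go func() {
        for i := 0; i < 10; i++ {
            jobs <- i
        }
        close(jobs)
    }()

    var active, peak int64
    var wg sync.WaitGroup
    for i := 0; i < parallel; i++ {
        wg.Add(1)
        go func() {
            defer wg.Done()
            for range jobs {
                n := atomic.AddInt64(&active, 1)
                // Track the highest number of simultaneously active workers.
                for {
                    p := atomic.LoadInt64(&peak)
                    if n <= p || atomic.CompareAndSwapInt64(&peak, p, n) {
                        break
                    }
                }
                time.Sleep(50 * time.Millisecond) // simulated work
                atomic.AddInt64(&active, -1)
            }
        }()
    }
    wg.Wait()
    // The reported peak never exceeds the parallel constant.
    fmt.Println("peak concurrent workers:", atomic.LoadInt64(&peak))
}</code>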