Efficient Read and Write of CSV Data in Go
The Go code in question reads a CSV file of roughly 10,000 records, performs a calculation on each row, and writes the original values, plus an additional score column, to a second CSV file. The process is slow, reportedly taking hours to complete. This article looks at where the CSV reading and writing operations can be made more efficient.
One key optimization is to avoid loading the entire file into memory at once. The original code calls the csv.Reader's ReadAll() method, which parses every record into one large slice before any processing starts; for large files this wastes memory and delays the first result. A streaming approach, where the file is processed one record at a time, is preferable.
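For reference, the memory-hungry pattern looks roughly like this (a sketch; the function and file-path parameter are illustrative, since the original code isn't reproduced in full):
<code class="go">// loadAll illustrates the problematic pattern: the entire file is parsed
// into one [][]string before any processing can begin.
func loadAll(path string) [][]string {
	f, err := os.Open(path)
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	records, err := csv.NewReader(f).ReadAll() // allocates a slice entry for every row
	if err != nil {
		log.Fatal(err)
	}
	return records
}</code>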
To implement streaming, we can use a goroutine to read the CSV file record by record and send each record to a channel. The main goroutine consumes the records from the channel, performs the calculation, and writes the results to the output CSV. Here is an example implementation:
<code class="go">func processCSV(rc io.Reader) (ch chan []string) { ch = make(chan []string, 10) go func() { r := csv.NewReader(rc) if _, err := r.Read(); err != nil { //read header log.Fatal(err) } defer close(ch) for { rec, err := r.Read() if err != nil { if err == io.EOF { break } log.Fatal(err) } ch <- rec } }() return }</code>
Here, processCSV() takes an io.Reader and returns a channel that emits one record at a time. The main goroutine can range over this channel, compute the score for each record, and write the result immediately, as shown in the sketch below.
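To complete the picture, here is a minimal sketch of the consuming side, assuming processCSV from the snippet above lives in the same package. The file names and the computeScore function are illustrative stand-ins, since the original calculation isn't shown:
<code class="go">package main

import (
	"encoding/csv"
	"io" // used by processCSV from the previous snippet
	"log"
	"os"
)

// computeScore is a hypothetical stand-in for the real per-record calculation.
func computeScore(rec []string) string {
	return "0.0"
}

func main() {
	in, err := os.Open("input.csv") // illustrative file name
	if err != nil {
		log.Fatal(err)
	}
	defer in.Close()

	out, err := os.Create("output.csv") // illustrative file name
	if err != nil {
		log.Fatal(err)
	}
	defer out.Close()

	w := csv.NewWriter(out) // csv.Writer buffers output internally
	for rec := range processCSV(in) {
		// Append the computed score as an extra column and write the row.
		// (Writing an output header row is omitted for brevity.)
		if err := w.Write(append(rec, computeScore(rec))); err != nil {
			log.Fatal(err)
		}
	}
	w.Flush() // flush the buffered output once, at the end
	if err := w.Error(); err != nil {
		log.Fatal(err)
	}
}</code>
Because csv.Writer buffers internally, writing row by row and calling Flush() once at the end keeps the output side streaming as well; flushing after every record would force many small writes.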
With this streaming approach, memory usage stays roughly constant regardless of file size, and reading, computing, and writing overlap instead of running as separate full passes, making the code far better suited to processing large CSV files.