Streaming Large-Scale JSON Files
Decoding a massive JSON array from a file can exhaust memory when the entire document is loaded at once. Streaming techniques avoid this by processing one element at a time.
In that scenario, calling json.Unmarshal on the whole file fails because the complete array must be read and decoded into memory before any single element can be used.
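For illustration, here is a minimal sketch of the memory-hungry approach; the file name messages.json and the Message type are assumptions made for the example:

package main

import (
	"encoding/json"
	"log"
	"os"
)

type Message struct {
	Name, Text string
}

func main() {
	// Problematic approach: the raw file contents and the fully decoded
	// slice both live in memory at the same time.
	data, err := os.ReadFile("messages.json") // hypothetical file name
	if err != nil {
		log.Fatal(err)
	}
	var msgs []Message // the whole array is materialized at once
	if err := json.Unmarshal(data, &msgs); err != nil {
		log.Fatal(err)
	}
	log.Printf("decoded %d messages", len(msgs))
}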
To avoid this, decode the JSON elements sequentially with json.Decoder. Here's an approach based on the example in the encoding/json documentation:
import ( "encoding/json" "fmt" "strings" ) func main() { const jsonStream = ` [ {"Name": "Ed", "Text": "Knock knock."}, {"Name": "Sam", "Text": "Who's there?"}, {"Name": "Ed", "Text": "Go fmt."}, {"Name": "Sam", "Text": "Go fmt who?"}, {"Name": "Ed", "Text": "Go fmt yourself!"} ] ` type Message struct { Name, Text string } dec := json.NewDecoder(strings.NewReader(jsonStream)) for dec.More() { var m Message if err := dec.Decode(&m); err != nil { fmt.Println(err) return } fmt.Printf("%v: %v\n", m.Name, m.Text) } }
Because the decoder yields one element per iteration, only the current Message is held in memory, keeping consumption low regardless of file size. The example reads from a string, but json.NewDecoder accepts any io.Reader, so a file handle can be substituted directly, as sketched below.
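A minimal sketch of the file-based variant, again assuming a file named messages.json and the same Message type:

package main

import (
	"encoding/json"
	"log"
	"os"
)

type Message struct {
	Name, Text string
}

func main() {
	// Open the file instead of wrapping a string; json.Decoder accepts any io.Reader.
	f, err := os.Open("messages.json") // hypothetical file name
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	dec := json.NewDecoder(f)

	// Consume the opening bracket of the top-level array.
	if _, err := dec.Token(); err != nil {
		log.Fatal(err)
	}

	// Decode and process one element at a time; memory use stays bounded
	// by the size of a single element.
	for dec.More() {
		var m Message
		if err := dec.Decode(&m); err != nil {
			log.Fatal(err)
		}
		log.Printf("%v: %v", m.Name, m.Text)
	}

	// Consume the closing bracket.
	if _, err := dec.Token(); err != nil {
		log.Fatal(err)
	}
}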