Decoding Large Streaming JSON Data Efficiently in Go
Decoding an enormous JSON array in a single call means holding the entire payload in memory at once, which can exhaust available memory. A streaming approach that iterates through the array's elements one at a time avoids this.
Streaming JSON Element by Element
The encoding/json package's json.Decoder supports streaming JSON data incrementally from any io.Reader. Here's an example:
import ( "encoding/json" "fmt" "log" "strings" ) func main() { const jsonStream = ` [ {"Name": "Ed", "Text": "Knock knock."}, {"Name": "Sam", "Text": "Who's there?"}, {"Name": "Ed", "Text": "Go fmt."}, {"Name": "Sam", "Text": "Go fmt who?"}, {"Name": "Ed", "Text": "Go fmt yourself!"} ]` dec := json.NewDecoder(strings.NewReader(jsonStream)) // skip the opening bracket dec.Token() // read and process each element for dec.More() { var message Message if err := dec.Decode(&message); err != nil { log.Fatal(err) } fmt.Printf("%v: %v\n", message.Name, message.Text) } // skip the closing bracket dec.Token() }
In this code, json.NewDecoder creates a decoder for the JSON stream. The two dec.Token() calls consume the opening and closing brackets of the array ([ and ]), and dec.More() reports whether there are more elements left to process.
For each element, we create a Message struct and decode the JSON data into it using dec.Decode. We then print the message's Name and Text fields.
This approach allows us to process large JSON arrays without loading the entire array into memory: only one element is decoded at a time, so memory usage stays roughly constant regardless of input size and out-of-memory errors are avoided.
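The same pattern works with any io.Reader, which is where it pays off for genuinely large data. The sketch below wraps the loop in a reusable helper; the decodeMessages function, the callback signature, and the messages.json file are illustrative assumptions, not part of the original example:

package main

import (
	"encoding/json"
	"fmt"
	"io"
	"log"
	"os"
)

type Message struct {
	Name string
	Text string
}

// decodeMessages streams a JSON array of Message values from r and calls
// handle for each one, so only a single element is held in memory at a time.
func decodeMessages(r io.Reader, handle func(Message) error) error {
	dec := json.NewDecoder(r)

	// Consume the opening [ of the array.
	if _, err := dec.Token(); err != nil {
		return err
	}

	for dec.More() {
		var m Message
		if err := dec.Decode(&m); err != nil {
			return err
		}
		if err := handle(m); err != nil {
			return err
		}
	}

	// Consume the closing ].
	_, err := dec.Token()
	return err
}

func main() {
	// messages.json is a stand-in for any large source, such as a file
	// on disk or an HTTP response body.
	f, err := os.Open("messages.json")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	err = decodeMessages(f, func(m Message) error {
		fmt.Printf("%v: %v\n", m.Name, m.Text)
		return nil
	})
	if err != nil {
		log.Fatal(err)
	}
}

Passing a callback keeps the streaming logic in one place while letting callers decide what to do with each element, whether that is printing it, inserting it into a database, or aggregating a summary.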