Streaming Decode of Massive JSON Arrays
Reading large JSON arrays from files with json.Unmarshal can be memory-intensive, because the entire document must be read and parsed into memory at once. To avoid this, we can use a streaming decoder and process the array element by element.
Example Using json.Decoder
The encoding/json package provides a streaming decoder for exactly this purpose. Here's an extended example from the package documentation:
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"strings"
)

func main() {
	const jsonStream = `
	[
		{"Name": "Ed", "Text": "Knock knock."},
		{"Name": "Sam", "Text": "Who's there?"},
		{"Name": "Ed", "Text": "Go fmt."},
		{"Name": "Sam", "Text": "Go fmt who?"},
		{"Name": "Ed", "Text": "Go fmt yourself!"}
	]
`
	type Message struct {
		Name, Text string
	}

	dec := json.NewDecoder(strings.NewReader(jsonStream))

	// read open bracket
	t, err := dec.Token()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%T: %v\n", t, t)

	// while the array contains values
	for dec.More() {
		var m Message
		// decode an array value (Message)
		err := dec.Decode(&m)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Printf("%v: %v\n", m.Name, m.Text)
	}

	// read closing bracket
	t, err = dec.Token()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%T: %v\n", t, t)
}
In this example, we create a json.Decoder (dec), consume the opening bracket with dec.Token(), and then use dec.More() in a loop to iterate through the array elements. Each Message is decoded and printed individually, so only one element is held in memory at a time rather than the entire array.