Bulk Insertion from CSV into PostgreSQL Using Go Without a For Loop
Inserting a large volume of data from a CSV file into a PostgreSQL database row by row in a for loop can be slow, because every INSERT statement incurs its own network round trip. A more efficient approach is to use the pgx library to stream the file to the server in a single bulk copy operation via PostgreSQL's COPY command.
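For contrast, here is a minimal sketch of the row-by-row approach this article avoids. The table name csv_test and its two text columns a and b are illustrative assumptions, not part of the original example; the point is that every record costs its own round trip to the server:

```go
package main

import (
	"context"
	"encoding/csv"
	"os"

	"github.com/jackc/pgx/v5"
)

func main() {
	dbconn, err := pgx.Connect(context.Background(), os.Getenv("DATABASE_URL"))
	if err != nil {
		panic(err)
	}
	defer dbconn.Close(context.Background())

	f, err := os.Open("foo.csv")
	if err != nil {
		panic(err)
	}
	defer func() { _ = f.Close() }()

	records, err := csv.NewReader(f).ReadAll()
	if err != nil {
		panic(err)
	}

	// One INSERT, and one network round trip, per CSV record.
	for _, rec := range records {
		_, err := dbconn.Exec(context.Background(),
			"INSERT INTO csv_test (a, b) VALUES ($1, $2)", rec[0], rec[1])
		if err != nil {
			panic(err)
		}
	}
}
```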
Using pgx for Bulk Insertion
The approach comes down to three steps: connect to the database, open the CSV file, and stream its contents to the server with COPY ... FROM STDIN, as the example below demonstrates.
Code Example
The following Go code demonstrates how to bulk insert data from a CSV file into a PostgreSQL database using pgx:
```go
package main

import (
	"context"
	"fmt"
	"os"

	"github.com/jackc/pgx/v5"
)

func main() {
	filename := "foo.csv"

	// Connect using the connection string in the DATABASE_URL environment variable.
	dbconn, err := pgx.Connect(context.Background(), os.Getenv("DATABASE_URL"))
	if err != nil {
		panic(err)
	}
	defer dbconn.Close(context.Background())

	// Open the CSV file so it can be streamed to the server.
	f, err := os.Open(filename)
	if err != nil {
		panic(err)
	}
	defer func() { _ = f.Close() }()

	// Stream the whole file to the server in a single COPY operation.
	res, err := dbconn.PgConn().CopyFrom(context.Background(),
		f, "COPY csv_test FROM STDIN (FORMAT csv)")
	if err != nil {
		panic(err)
	}
	fmt.Println(res.RowsAffected())
}
```
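Note that COPY does not create the target table: csv_test must already exist with columns matching the CSV layout, and if your file starts with a header row you can write COPY csv_test FROM STDIN (FORMAT csv, HEADER) so PostgreSQL skips it. As a hypothetical setup step (the column names and types here are assumptions, not part of the original example), you could create the table inside main before the CopyFrom call:

```go
// Hypothetical setup: ensure the target table exists before copying.
// Column names and types are assumptions for illustration only.
_, err = dbconn.Exec(context.Background(),
	"CREATE TABLE IF NOT EXISTS csv_test (a text, b text)")
if err != nil {
	panic(err)
}
```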
By streaming the file through COPY, you can efficiently bulk insert data from large CSV files without issuing one INSERT per row. This method is particularly useful for large data sets where loading speed matters.
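If you need to validate or transform each record in Go before loading, rather than streaming the raw file, pgx also offers a row-level API, Conn.CopyFrom, that speaks the same COPY protocol. Below is a minimal sketch, again assuming the hypothetical csv_test table with text columns a and b; the loop only builds rows in memory, while the insertion itself remains a single bulk COPY rather than one INSERT per row:

```go
package main

import (
	"context"
	"encoding/csv"
	"fmt"
	"os"

	"github.com/jackc/pgx/v5"
)

func main() {
	dbconn, err := pgx.Connect(context.Background(), os.Getenv("DATABASE_URL"))
	if err != nil {
		panic(err)
	}
	defer dbconn.Close(context.Background())

	f, err := os.Open("foo.csv")
	if err != nil {
		panic(err)
	}
	defer func() { _ = f.Close() }()

	// Parse the CSV into memory so each record can be inspected or transformed.
	records, err := csv.NewReader(f).ReadAll()
	if err != nil {
		panic(err)
	}
	rows := make([][]any, 0, len(records))
	for _, rec := range records {
		rows = append(rows, []any{rec[0], rec[1]})
	}

	// One bulk COPY for all rows, not one INSERT per row.
	copied, err := dbconn.CopyFrom(context.Background(),
		pgx.Identifier{"csv_test"}, []string{"a", "b"},
		pgx.CopyFromRows(rows))
	if err != nil {
		panic(err)
	}
	fmt.Println(copied)
}
```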