Efficient Bulk Data Insertion from CSV to PostgreSQL using Go (Without For Loops)
In Go, PostgreSQL's COPY command, exposed through the pgx driver, provides an efficient way to bulk-insert data from a CSV file into a database table. Here's how to do it without writing any for loops:
Step 1: Establish Database Connection
Establish a database connection using the pgx library's pgxpool package. The pool supplies the connection over which the COPY will run.
Step 2: Open the CSV File
Open the CSV file containing the data to be inserted. Make sure the file is accessible by your application.
Step 3: Execute COPY Command
Acquire a connection from the pool and call CopyFrom() on its underlying pgconn.PgConn to stream the CSV file into the target table. Here's a complete example:
<code class="go">package main

import (
	"context"
	"fmt"
	"os"

	"github.com/jackc/pgx/v4/pgxpool"
)

const query = "COPY csv_test FROM STDIN (FORMAT csv)"

func main() {
	dbpool, err := pgxpool.Connect(context.Background(), os.Getenv("DATABASE_URL"))
	if err != nil {
		panic(err)
	}
	defer dbpool.Close()

	f, err := os.Open("foo.csv")
	if err != nil {
		panic(err)
	}
	defer f.Close()

	// COPY needs exclusive use of a single connection, so check one
	// out of the pool and release it when the transfer finishes.
	conn, err := dbpool.Acquire(context.Background())
	if err != nil {
		panic(err)
	}
	defer conn.Release()

	res, err := conn.Conn().PgConn().CopyFrom(context.Background(), f, query)
	if err != nil {
		panic(err)
	}
	fmt.Println(res.RowsAffected())
}</code>
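CopyFrom accepts any io.Reader, not just a file, so CSV rows generated in memory can be streamed the same way. A minimal sketch of building such a reader with encoding/csv (buildCSV is a hypothetical helper; the two-column layout mirrors the csv_test table assumed by the query above):

```go
package main

import (
	"bytes"
	"encoding/csv"
	"fmt"
)

// buildCSV encodes records into an in-memory buffer. Because
// *bytes.Buffer implements io.Reader, the result can be passed
// directly to PgConn().CopyFrom in place of an os.File.
func buildCSV(records [][]string) (*bytes.Buffer, error) {
	var buf bytes.Buffer
	w := csv.NewWriter(&buf)
	// WriteAll writes every record and flushes; no explicit loop.
	if err := w.WriteAll(records); err != nil {
		return nil, err
	}
	return &buf, nil
}

func main() {
	buf, err := buildCSV([][]string{{"1", "alice"}, {"2", "bob"}})
	if err != nil {
		panic(err)
	}
	fmt.Print(buf.String()) // 1,alice\n2,bob\n
	// buf could then replace f in the example above:
	//   conn.Conn().PgConn().CopyFrom(ctx, buf, query)
}
```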
Breakdown of the Code:
- pgxpool.Connect opens a connection pool using the DATABASE_URL environment variable.
- os.Open returns an *os.File, which satisfies the io.Reader interface that CopyFrom expects.
- A single connection is taken from the pool for the transfer, because COPY needs exclusive use of the underlying protocol connection.
- CopyFrom streams the reader's contents to the server over the COPY protocol and returns a command tag whose RowsAffected() reports how many rows were inserted.
Benefits:
- No row-by-row looping in Go: the file is handed to the server in a single COPY operation.
- Far fewer network round trips than issuing individual INSERT statements, so bulk loads complete much faster.
- Memory use stays flat, because the file is streamed rather than loaded into memory.