Efficiently Improve SQL Server Insert Speed: Loading 2 Million Rows at Once
Inserting large amounts of data into a SQL Server database, especially millions of rows at a time, can be a challenge. This article explores the fastest techniques for doing so:
1. Use SqlBulkCopy for bulk loading:
SqlBulkCopy is a .NET class designed to import large amounts of data into SQL Server quickly. It bypasses row-by-row INSERT statements and streams rows to the server in batches over the same bulk load path used by the bcp utility, which greatly reduces round trips and per-statement overhead.
Implementation method:
<code class="language-csharp">using (SqlConnection connection = new SqlConnection(connString))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, null))
{
    bulkCopy.DestinationTableName = this.tableName;
    connection.Open();
    bulkCopy.WriteToServer(dataTable); // the using blocks dispose the copy and close the connection
}</code>
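The snippet above assumes a populated dataTable. A minimal sketch of building one follows; the column names and types here are illustrative assumptions and must match (or be mapped to) the destination table's schema:

```csharp
using System.Data;

// Builds the in-memory DataTable that SqlBulkCopy writes to the server.
// "Id" and "Name" are placeholder columns for illustration only.
static DataTable BuildDataTable(int rowCount)
{
    var table = new DataTable();
    table.Columns.Add("Id", typeof(int));
    table.Columns.Add("Name", typeof(string));

    for (int i = 0; i < rowCount; i++)
    {
        table.Rows.Add(i, "Row " + i);
    }
    return table;
}
```

By default SqlBulkCopy matches columns by ordinal; when the source and destination layouts differ, add explicit mappings such as bulkCopy.ColumnMappings.Add("Name", "Name") before calling WriteToServer.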
2. Batch insert using XML:
Another approach is to send the data to the server as a single XML document and shred it into rows with the OPENXML function. This allows a bulk insert in one round trip while maintaining data integrity.
Implementation method:
<code class="language-sql">DECLARE @hDoc INT;
EXEC sp_xml_preparedocument @hDoc OUTPUT, @xmlData;

INSERT INTO TableName (Col1, Col2)
SELECT Col1, Col2
FROM OPENXML(@hDoc, '/Rows/Row', 2)  -- 2 = element-centric mapping
WITH (Col1 INT, Col2 NVARCHAR(100));

EXEC sp_xml_removedocument @hDoc;</code>
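On the client side, the @xmlData parameter can be produced by serializing the rows to XML. A sketch, assuming a "/Rows/Row" element shape (the element names are placeholders and must match whatever path the server-side OPENXML query uses):

```csharp
using System.Data;
using System.IO;

// Serializes a DataTable as <Rows><Row><Col1>...</Col1>...</Row>...</Rows>,
// i.e. one element per row with element-centric column values.
static string ToXml(DataTable table)
{
    var ds = new DataSet("Rows");  // root element name
    table.TableName = "Row";       // per-row element name
    ds.Tables.Add(table);
    using var writer = new StringWriter();
    ds.WriteXml(writer);
    return writer.ToString();
}
```

The resulting string can then be passed as an NVARCHAR(MAX) or XML parameter to the stored procedure or batch that runs OPENXML.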
Memory Notes:
It is important to note that, as shown above, both methods build the entire data set in memory (a DataTable or an XML string) before sending it. Consider available memory and the size of your data set when choosing a technique; for very large loads, SqlBulkCopy can also write from a streaming IDataReader so that only one batch is buffered at a time.
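The streaming variant mentioned above can be sketched as follows; the connection strings, query, and table names are illustrative placeholders:

```csharp
using Microsoft.Data.SqlClient; // or System.Data.SqlClient

// Copies rows from a source query into a destination table without
// materializing the whole data set in memory: SqlBulkCopy reads from
// the DataReader and buffers only BatchSize rows at a time.
static void StreamCopy(string sourceConnString, string destConnString)
{
    using var source = new SqlConnection(sourceConnString);
    using var dest = new SqlConnection(destConnString);
    source.Open();
    dest.Open();

    using var cmd = new SqlCommand("SELECT Col1, Col2 FROM dbo.SourceTable", source);
    using var reader = cmd.ExecuteReader();

    using var bulkCopy = new SqlBulkCopy(dest)
    {
        DestinationTableName = "dbo.DestTable",
        BatchSize = 10000,   // rows sent per batch
        BulkCopyTimeout = 0  // no timeout for large loads
    };
    bulkCopy.WriteToServer(reader);
}
```

BatchSize also helps the DataTable approach: committing in batches keeps individual transactions, and the amount of work lost on a failure, small.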
The above is the detailed content of How Can I Quickly Insert 2 Million Rows into SQL Server?, from the PHP Chinese website.