When inserting massive amounts of data into a SQL Server database, optimizing the insertion process is crucial. This article covers the most effective ways to insert 2 million rows efficiently.
The SqlBulkCopy class is ideal for bulk inserts. It transfers data from a .NET DataTable, DataRow array, or IDataReader source directly into a SQL Server table, bypassing row-by-row INSERT statements.
To use SqlBulkCopy effectively, follow these steps:

1. Open a SqlConnection to the target database.
2. Create a SqlBulkCopy instance, optionally passing options such as SqlBulkCopyOptions.TableLock for faster loading.
3. Set DestinationTableName to the target table.
4. Call WriteToServer with your data source.

The following C# example puts these steps together:
```csharp
using (SqlConnection connection = new SqlConnection(connString))
{
    connection.Open();

    // TableLock takes a bulk-update lock for the duration of the load;
    // FireTriggers makes the server fire insert triggers on the target table.
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(
        connection,
        SqlBulkCopyOptions.TableLock | SqlBulkCopyOptions.FireTriggers,
        null))
    {
        bulkCopy.DestinationTableName = tableName;
        bulkCopy.WriteToServer(dataTable);
    }
}
```
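Two tuning knobs worth setting for a load of this size are BatchSize and BulkCopyTimeout, both standard SqlBulkCopy properties. BatchSize commits rows in chunks instead of one enormous batch, and the default timeout of 30 seconds is often too short for millions of rows. The values below are illustrative and would be added to the block above before calling WriteToServer:

```csharp
bulkCopy.BatchSize = 10000;      // commit every 10,000 rows instead of one huge batch
bulkCopy.BulkCopyTimeout = 600;  // seconds; the default of 30 is often too short for millions of rows
```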
If the dataset is stored in a text file, an alternative is to read it into a DataSet, serialize it to XML, and pass that XML to a stored procedure that shreds it server-side with OPENXML and inserts the rows in a single set-based statement. However, this approach can require significant memory, both in the application and on the server, because the entire document must be materialized at once.
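A minimal sketch of the client side of this approach follows. It assumes a stored procedure named dbo.BulkInsertFromXml (a placeholder name, not from the original article) that calls sp_xml_preparedocument and OPENXML on the server; connString and filePath are likewise assumed to be defined elsewhere:

```csharp
using System.Data;
using System.Data.SqlClient;

// Load the file into a DataSet. ReadXml assumes the file is already XML;
// a plain-text or CSV file would need to be parsed into the DataSet instead.
DataSet dataSet = new DataSet();
dataSet.ReadXml(filePath);

string xml = dataSet.GetXml(); // the whole document is held in memory here

using (SqlConnection connection = new SqlConnection(connString))
using (SqlCommand command = new SqlCommand("dbo.BulkInsertFromXml", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.Add("@xml", SqlDbType.NVarChar, -1).Value = xml; // -1 = NVARCHAR(MAX)
    connection.Open();
    command.ExecuteNonQuery();
}
```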
Please note that loading and inserting 2 million rows of data may stress your system's memory. If necessary, use a server with sufficient memory capacity, or explore alternatives such as streaming the data or inserting it incrementally in batches, as sketched below.
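Here is a minimal sketch of the incremental approach: the file is read line by line and flushed to the server every batchSize rows, so only one chunk is ever held in memory. The column layout, tab delimiter, and the connString, filePath, and tableName variables are assumptions for illustration:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

const int batchSize = 50000;

// Buffer schema must match the destination table's columns (assumed here).
DataTable buffer = new DataTable();
buffer.Columns.Add("Id", typeof(int));
buffer.Columns.Add("Name", typeof(string));

using (SqlConnection connection = new SqlConnection(connString))
{
    connection.Open();
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
    using (StreamReader reader = new StreamReader(filePath))
    {
        bulkCopy.DestinationTableName = tableName;

        string line;
        while ((line = reader.ReadLine()) != null)
        {
            string[] fields = line.Split('\t');
            buffer.Rows.Add(int.Parse(fields[0]), fields[1]);

            if (buffer.Rows.Count >= batchSize)
            {
                bulkCopy.WriteToServer(buffer); // flush the full chunk
                buffer.Clear();                 // release the buffered rows
            }
        }

        if (buffer.Rows.Count > 0)
            bulkCopy.WriteToServer(buffer);     // flush the final partial chunk
    }
}
```

This keeps peak memory proportional to batchSize rather than to the full 2 million rows, at the cost of multiple round trips to the server.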