
How Can I Optimize Bulk Insertion Performance in Entity Framework?

Patricia Arquette
Release: 2025-01-23 21:57:09


Detailed explanation of Entity Framework batch insertion performance optimization

Inserting large amounts of data into a database with Entity Framework can be challenging, especially inside a transaction scope: once the data volume grows past a few thousand records (around 4,000 in the original report), the transaction may time out before all inserts complete.

The root cause is that calling SaveChanges() after every record is extremely slow, since each call flushes tracked changes to the database individually. The following optimization strategies are recommended:

  1. Call SaveChanges() once: Instead of calling SaveChanges() after each record, accumulate changes and save them all in one go after all records have been added.

  2. Batch SaveChanges(): If saving everything in a single call is still too slow, call SaveChanges() after every fixed number of records (e.g. 100).

  3. Recreate the context periodically: When batching, dispose the current context and create a new one after each batch. This keeps the change tracker small; otherwise it accumulates every inserted entity and each SaveChanges() call gets progressively slower.

  4. Disable automatic change detection: Setting AutoDetectChangesEnabled to false stops Entity Framework from scanning all tracked entities on every Add, which is unnecessary overhead when you are only inserting new records.

Code example:

<code class="language-csharp">using (TransactionScope scope = new TransactionScope())
{
    using (var context = new MyDbContext())
    {
        context.Configuration.AutoDetectChangesEnabled = false;

        int count = 0;
        foreach (var entity in entities)
        {
            ++count;
            context.Set<MyEntity>().Add(entity); // use the explicit entity type

            if (count % 100 == 0)
            {
                context.SaveChanges();
            }
        }

        context.SaveChanges(); // save any remaining records
    }

    scope.Complete();
}</code>
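Strategy 3 (recreating the context after each batch) is not shown in the snippet above. A minimal sketch of that pattern follows; MyDbContext and MyEntity are the same placeholder names used in the snippet, and commitCount is a tunable batch size:

```csharp
using System.Data.Entity; // EF6

public static class BulkInsertHelper
{
    // Adds one entity and, at each batch boundary, saves and swaps in a
    // fresh context so the change tracker never grows unbounded.
    // Returns the context the caller should keep using.
    public static MyDbContext AddToContext(MyDbContext context,
        MyEntity entity, int count, int commitCount, bool recreateContext)
    {
        context.Set<MyEntity>().Add(entity);

        if (count % commitCount == 0)
        {
            context.SaveChanges();
            if (recreateContext)
            {
                context.Dispose();            // drop all tracked entities
                context = new MyDbContext();  // fresh, empty change tracker
                context.Configuration.AutoDetectChangesEnabled = false;
            }
        }
        return context;
    }
}
```

Note that the caller must always use the returned context (it may be a new instance), call SaveChanges() once more after the loop for the final partial batch, and dispose the last context when done.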

Performance Benchmark:

For the test with 560,000 entities, the following benchmark results were observed:

  • Commit every 1 record, no context recreation: more than 20 hours
  • Commit every 100 records, no context recreation: more than 20 minutes
  • Commit every 1,000 records, no context recreation: 242 seconds
  • Commit every 10 records, recreating the context: 241 seconds
  • Commit every 100 records, recreating the context: 164 seconds

These optimizations dramatically improve throughput and allow large data sets to be inserted within transaction timeout limits. The best batch size, and whether recreating the context pays off, depend on the entity shape and workload, so tune and test against your own data.
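Tuning can be as simple as timing a few candidate batch sizes with System.Diagnostics.Stopwatch. In this sketch, InsertAll is a hypothetical wrapper around the batched insert loop shown earlier:

```csharp
using System;
using System.Diagnostics;

// Time one insertion run per candidate batch size.
// InsertAll(entities, commitCount, recreateContext) is a placeholder
// for the batched insert loop from the code example above.
foreach (int commitCount in new[] { 10, 100, 1000 })
{
    var watch = Stopwatch.StartNew();
    InsertAll(entities, commitCount, recreateContext: true);
    watch.Stop();
    Console.WriteLine($"commitCount={commitCount}: " +
                      $"{watch.Elapsed.TotalSeconds:F0} s");
}
```

Run this against a realistic copy of the data, since timings on a near-empty test table can differ sharply from production behavior.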


source:php.cn