# C# DataSet Performance Optimization Details
## 1. Use ItemArray for batch assignment to a DataRow
When assigning values to every field of a DataRow, setting each column individually by field name is inefficient. Prefer batch assignment wherever possible, using either the ItemArray property or the Rows.Add method:
```csharp
// ds is the DataSet object
DataTable dt = ds.Tables[0];
DataRow row = dt.NewRow();
row.ItemArray = new object[] { value1, value2, …, valuen };

// ds is the DataSet object
DataTable dt = ds.Tables[0];
dt.Rows.Add(value1, value2, …, valuen);

// Avoid long runs of single-column assignments like the following:
DataTable dt = ds.Tables[0];
DataRow row = dt.NewRow();
row["col1"] = value1;
row["col2"] = value2;
…
row["coln"] = valuen;
```
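As a concrete illustration, here is a minimal, self-contained sketch of both batch-assignment forms. The table schema, column names, and values are hypothetical and exist only for this example:

```csharp
using System;
using System.Data;

class ItemArrayDemo
{
    static void Main()
    {
        // Hypothetical three-column table used only for illustration.
        var dt = new DataTable("Items");
        dt.Columns.Add("Id", typeof(int));
        dt.Columns.Add("Name", typeof(string));
        dt.Columns.Add("Quantity", typeof(decimal));

        // Batch assignment via ItemArray: one array covers every column.
        DataRow row = dt.NewRow();
        row.ItemArray = new object[] { 1, "Widget", 10m };
        dt.Rows.Add(row);

        // Batch assignment via Rows.Add: values are passed in column order.
        dt.Rows.Add(2, "Gadget", 5m);

        Console.WriteLine(dt.Rows.Count); // 2
    }
}
```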
## 2. Use DataTable parallel queries where appropriate
Combining a DataTable with PLINQ (AsEnumerable().AsParallel()) lets a query use every CPU core on the machine, which can be noticeably faster than DataTable.Select on large tables.
```csharp
// Find all entries with a quantity less than 0 (without parallel computing)
IEnumerable<DataRow> FindRows()
{
    DataTable dt = ItemDataTable;
    ……
    return dt.Select("Quantity<0");
}

// Find all entries with a quantity less than 0 (with parallel computing)
IEnumerable<DataRow> FindRows()
{
    DataTable dt = ItemDataTable;
    ……
    int index = dt.Columns.IndexOf("Quantity");
    return dt.AsEnumerable().AsParallel().Where(dr => (decimal)dr[index] < 0);
}
```
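For reference, here is a minimal runnable sketch of the parallel form, assuming a hypothetical table with a single decimal Quantity column. AsEnumerable comes from the DataTableExtensions class (System.Data.DataSetExtensions on .NET Framework; included in the shared framework on modern .NET):

```csharp
using System;
using System.Data;
using System.Linq;

class ParallelQueryDemo
{
    static void Main()
    {
        // Hypothetical table with a decimal Quantity column.
        var dt = new DataTable("Entries");
        dt.Columns.Add("Quantity", typeof(decimal));
        for (int i = -5000; i < 5000; i++)
        {
            dt.Rows.Add((decimal)i);
        }

        // AsEnumerable() exposes the rows as IEnumerable<DataRow>;
        // AsParallel() lets the Where filter run on all available cores.
        int index = dt.Columns.IndexOf("Quantity");
        DataRow[] negative = dt.AsEnumerable()
                               .AsParallel()
                               .Where(dr => (decimal)dr[index] < 0)
                               .ToArray();

        Console.WriteLine(negative.Length); // 5000
    }
}
```

For small tables the overhead of partitioning the work across threads can outweigh the benefit, so parallel queries pay off mainly on large row counts.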
## 3. Use ImportRow to merge DataTables with the same structure

When the source and destination tables share the same schema, copying rows with ImportRow avoids the per-call schema reconciliation that DataTable.Merge performs:
```csharp
// Slower: merging each source table with DataTable.Merge
DataTable[] srcTables = ... ;
foreach (DataTable src in srcTables)
{
    dest.Merge(src);
}
```
```csharp
// Faster for same-structured tables: importing each row with ImportRow
DataTable[] srcTables = ... ;
foreach (DataTable src in srcTables)
{
    foreach (DataRow row in src.Rows)
    {
        dest.ImportRow(row);
    }
}
```
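The following self-contained sketch shows the ImportRow approach end to end. The table name, columns, and helper method MakeTable are hypothetical and serve only to make the example runnable:

```csharp
using System;
using System.Data;

class ImportRowDemo
{
    static void Main()
    {
        // Two hypothetical tables that share the same schema.
        DataTable dest = MakeTable();
        DataTable src = MakeTable();
        src.Rows.Add(1, "Widget");
        src.Rows.Add(2, "Gadget");

        // Row-by-row import: each row is copied into dest,
        // preserving its values and row state.
        foreach (DataRow row in src.Rows)
        {
            dest.ImportRow(row);
        }

        Console.WriteLine(dest.Rows.Count); // 2
    }

    static DataTable MakeTable()
    {
        var dt = new DataTable("Items");
        dt.Columns.Add("Id", typeof(int));
        dt.Columns.Add("Name", typeof(string));
        return dt;
    }
}
```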
The above covers the main C# DataSet performance best practices.