1. When a query returns more than ten thousand records, iterating over the Rows takes too long: a minute or more.
Program code:
rows, err := p.conn.Query(sqlStr, params...)
if err != nil {
	log.Println("mysql query error", err.Error())
	return nil, err
}
log.Println("conn finished:", goutil.GetCurrentTime())
// close Rows when the function returns
defer rows.Close()
// get the result columns
columns, err := rows.Columns()
if err != nil {
	return nil, err
}
// scan each row into raw bytes, then build a map per record
values := make([]sql.RawBytes, len(columns))
scans := make([]interface{}, len(columns))
for i := range values {
	scans[i] = &values[i]
}
// this loop takes a minute or more with 30k records
for rows.Next() {
	if err := rows.Scan(scans...); err != nil {
		return nil, err
	}
	each := map[string]interface{}{}
	for i, col := range values {
		each[columns[i]] = string(col)
	}
	rowMaps = append(rowMaps, each)
}
// surface any error hit during iteration
if err := rows.Err(); err != nil {
	return nil, err
}
return rowMaps, nil
You can create multiple goroutines to query the data in segments, and use a channel to collect each goroutine's results for iteration; a sketch of this approach follows.
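A minimal sketch of that idea, assuming the table can be split on a numeric auto-increment id; the table name t, the columns id and name, the DSN, and the queryRange helper are all hypothetical:

package main

import (
	"database/sql"
	"fmt"
	"log"
	"sync"

	_ "github.com/go-sql-driver/mysql"
)

// queryRange fetches one id range and sends each row, as a map,
// into the shared channel. (Hypothetical helper for illustration.)
func queryRange(db *sql.DB, lo, hi int64, out chan<- map[string]interface{}, wg *sync.WaitGroup) {
	defer wg.Done()
	rows, err := db.Query("SELECT id, name FROM t WHERE id >= ? AND id < ?", lo, hi)
	if err != nil {
		log.Println("query error:", err)
		return
	}
	defer rows.Close()
	for rows.Next() {
		var id int64
		var name string
		if err := rows.Scan(&id, &name); err != nil {
			log.Println("scan error:", err)
			return
		}
		out <- map[string]interface{}{"id": id, "name": name}
	}
}

func main() {
	db, err := sql.Open("mysql", "user:pass@tcp(127.0.0.1:3306)/dbname")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	out := make(chan map[string]interface{}, 1024)
	var wg sync.WaitGroup
	// split ids 0..30000 into 3 segments, one goroutine each
	for _, seg := range [][2]int64{{0, 10000}, {10000, 20000}, {20000, 30000}} {
		wg.Add(1)
		go queryRange(db, seg[0], seg[1], out, &wg)
	}
	// close the channel once all workers finish
	go func() {
		wg.Wait()
		close(out)
	}()
	count := 0
	for range out {
		count++ // consume rows as they arrive
	}
	fmt.Println("rows received:", count)
}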
This is not a Go performance problem; other languages would behave similarly. Fetching 30,000 records in one query is an unreasonable demand. It should be solved with paging, fetching only dozens to a few hundred records per request, as in the sketch below.
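A sketch of keyset paging on an indexed id column, which stays fast even for deep pages (unlike a growing OFFSET); the table t, columns id and name, DSN, and page size are hypothetical:

package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/go-sql-driver/mysql"
)

func main() {
	db, err := sql.Open("mysql", "user:pass@tcp(127.0.0.1:3306)/dbname")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	const pageSize = 100
	lastID := int64(0) // keyset cursor: id of the last row seen
	for {
		rows, err := db.Query(
			"SELECT id, name FROM t WHERE id > ? ORDER BY id LIMIT ?",
			lastID, pageSize)
		if err != nil {
			log.Fatal(err)
		}
		n := 0
		for rows.Next() {
			var id int64
			var name string
			if err := rows.Scan(&id, &name); err != nil {
				rows.Close()
				log.Fatal(err)
			}
			lastID = id
			n++
			// process the row here
		}
		rows.Close()
		if n < pageSize {
			break // short page means we reached the end
		}
	}
	fmt.Println("done, last id:", lastID)
}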