php - How can MySQL avoid inserting duplicate data without using an index?
PHP中文网 2017-05-16 12:58:10

There are two fields in the table, namely a and b.

For example, the record is as follows:
1, 2
1, 3
2, 3
1, 3

Explanation:
1. The row (1, 3) appears twice, i.e. it is duplicate data.
2. Duplicate data is allowed to exist in the table.

Problem: The user has chosen the option not to add duplicate data to the database, and then submits the following rows:

2, 3
1, 4
1, 3

Only (1, 4) should be inserted; the other two rows already exist and should be rejected. How can this requirement be achieved?

Replies (2)
伊谢尔伦

This problem is fairly easy to solve with optimistic locking. The steps are as follows:
1. Compute an md5 hash key (hashedKey) for each (a, b) tuple, e.g. (1, 3), and load the keys of the existing records into redis on the first query;
2. When inserting a record, first try to insert the md5 hash key of the new (1, 3) tuple into redis (the optimistic-lock step; a SETNX-based sketch follows the hash function below). If the key cannot be inserted, it already exists, so the insert is refused;
3. The hash algorithm example is as follows:

function goHashed($a, $b, &$hashedKey)
{
    // Note: validate the format of $a and $b and handle possible errors here
    $salt = "Hello,world";
    $hashedKey = md5($a . $salt . $b . $salt);
}
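
Below is a minimal sketch of step 2, assuming the phpredis extension and a PDO connection; the "uniq:" key prefix, the table name t and the column names are illustrative and not part of the original answer.

function insertIfUnique(Redis $redis, PDO $pdo, $a, $b)
{
    goHashed($a, $b, $hashedKey);

    // SETNX only sets the key if it does not exist yet (the optimistic lock):
    // a false return means this tuple has already been claimed.
    if (!$redis->setnx('uniq:' . $hashedKey, 1)) {
        return false; // duplicate: refuse the insert
    }

    $stmt = $pdo->prepare('INSERT INTO t (a, b) VALUES (?, ?)');
    return $stmt->execute([$a, $b]);
}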

=== After taking a closer look at the question, it seems that it must be solved using MySQL ===

Without an index I really don't have any good ideas. However, if you can build one, create a composite unique index on (a, b); that also solves the problem of repeated inserts.
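
For reference, a sketch of that approach, assuming a table named t with columns a and b and an existing PDO connection (names are illustrative). Note the unique index can only be added once the duplicate rows already in the table have been cleaned up.

// One-time schema change: a composite unique index over (a, b).
$pdo->exec('ALTER TABLE t ADD UNIQUE KEY uniq_a_b (a, b)');

// With the index in place, duplicate tuples can be skipped at insert time.
$stmt = $pdo->prepare('INSERT IGNORE INTO t (a, b) VALUES (?, ?)');
$stmt->execute([1, 4]); // inserted
$stmt->execute([1, 3]); // silently ignored if (1, 3) already exists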

为情所困

The whole point of a unique constraint is to reject duplicate data at the database layer.

If you want to give up that restriction, duplicates can only be avoided in application code.

The most direct approach is to query before inserting: check whether a row already holds the values of the fields that must be unique, and if it does, abandon the insert, otherwise perform it. BUT this still cannot completely prevent duplicate data, because under high concurrency two requests can both pass the check before either inserts; it only greatly reduces the probability of duplicates getting in.
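
A minimal sketch of that check-then-insert flow, assuming a table t and a PDO handle (names are illustrative):

function insertUnlessExists(PDO $pdo, $a, $b)
{
    // Look for an existing row with the same (a, b) values.
    $check = $pdo->prepare('SELECT 1 FROM t WHERE a = ? AND b = ? LIMIT 1');
    $check->execute([$a, $b]);

    if ($check->fetchColumn() !== false) {
        return false; // a matching row exists: abandon the insert
    }

    // Two concurrent requests can both pass the check above before either
    // inserts, so duplicates are still possible under high concurrency.
    $insert = $pdo->prepare('INSERT INTO t (a, b) VALUES (?, ?)');
    return $insert->execute([$a, $b]);
}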

My recommendation: whenever a unique constraint can be used, let the constraint provide the guarantee. After all, it is the system's last line of defense.
