The uniq method of JavaScript arrays
Add a prototype method to the native Array object. Its purpose is to remove duplicate entries (there may be several) from the array in place. The return value is a new array containing the removed duplicate entries.
Formal description:
Input: Array (size = N)
Output:
Array1 = a subset of Array, deduplicated and order-preserving.
"Deduplicated" means that for any a, b in Array1, a != b.
"Order-preserving" means that if the index of a in Array is smaller than the index of b in Array, then the index of a in Array1 is also smaller than the index of b in Array1.
Denote Array2 = Array - Array1, also order-preserving.
realazy gives a neat solution with a very clear idea: traverse the elements in order; if the value of the current element has already been seen, append it to Array2, otherwise append it to Array1. To decide whether the current value has been seen, sequentially scan all the elements visited so far.
It is easy to see that the complexity of this algorithm is about O(N^2).
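realazy's original implementation is not reproduced here; under the description above it can be sketched roughly as follows (the method name uniq and the in-place rewrite are assumptions based on this article's spec, not realazy's actual code):

```javascript
// A sketch of the O(N^2) approach: for each element, linearly scan the
// already-accepted elements to decide whether its value was visited before.
Array.prototype.uniq = function () {
    var seen = [];      // Array1: first occurrences, order-preserving
    var removed = [];   // Array2: duplicate occurrences, order-preserving
    for (var i = 0; i < this.length; i++) {
        var visited = false;
        for (var j = 0; j < seen.length; j++) {  // O(N) membership scan
            if (seen[j] === this[i]) { visited = true; break; }
        }
        if (visited) removed.push(this[i]);
        else seen.push(this[i]);
    }
    // shrink this array in place to the deduplicated prefix
    this.length = seen.length;
    for (i = 0; i < seen.length; i++) this[i] = seen[i];
    return removed;     // the removed duplicates
};
```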
I made some slight improvements within his algorithmic framework. The key is how to decide, during the traversal, whether the value of the current element has already been seen. Provided the values of the original array are positive integers and the range (range = max value - min value) is not too large, a simple "bucket" algorithm can be used.
Prepare a boolean array b of length range, initialized to all false. For each value in the original array, if b[value] == true, the value has been seen before and is placed in Array2; otherwise it is placed in Array1 and b[value] is set to true.
This is obviously an O(N) algorithm; the cost is O(range) additional space, and the values of the original array must be positive integers.
It is not hard to generalize to arbitrary integer values: simply use value - min(Array) as the bucket index, which converts the problem back to the positive-integer case.
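The offset trick can be sketched as a standalone function before looking at the full prototype-method implementation below (the name bucketUniq is an assumption for illustration, not the article's code):

```javascript
// Bucket dedup with the value - min offset, so negative integers work too.
// Keeps the first occurrence of each value in order, rewrites arr in place,
// and returns the removed duplicates.
function bucketUniq(arr) {
    var minv = Math.min.apply(null, arr);
    var maxv = Math.max.apply(null, arr);
    var seen = new Array(maxv - minv + 1);   // one bucket per possible value
    var kept = [], removed = [];
    for (var i = 0; i < arr.length; i++) {
        var idx = arr[i] - minv;             // shift into [0, range]
        if (seen[idx]) removed.push(arr[i]);
        else { seen[idx] = true; kept.push(arr[i]); }
    }
    arr.length = kept.length;                // rewrite arr in place
    for (i = 0; i < kept.length; i++) arr[i] = kept[i];
    return removed;
}
```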
To avoid the space waste caused by an overly large range, a hashing algorithm improves on the "bucket" algorithm, specifically linear-congruential hashing with open chaining. The goal is to compress the value range, mapping it onto a small, controllable set of consecutive non-negative integers, while keeping the probability that different values map to the same key as low as possible, i.e. keeping the load across buckets as balanced as possible.
For example, here is a hash function for real-valued inputs:
key = hashFun(value) = (Math.floor(value) * 37) % bucketSize
This is still an O(N) algorithm (and O(N) is clearly a lower bound for any uniq algorithm). The advantage is that the space overhead is controllable and non-integer value domains can be handled; only a suitable hash function needs to be designed.
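As a rough illustration of such a hash function (bucketSize and the inputs here are made-up values, not from the benchmark below):

```javascript
// Compress values into bucketSize consecutive non-negative keys.
// Non-integers are fine since Math.floor truncates first. Note: negative
// values would yield negative keys in JavaScript and need extra handling.
var bucketSize = 11;
function hashFun(value) {
    return (Math.floor(value) * 37) % bucketSize;
}
console.log(hashFun(3.7));  // Math.floor(3.7)=3, 3*37=111, 111%11=1
console.log(hashFun(14));   // 14*37=518, 518%11=1 -> collides with 3.7
```

Different values can land in the same bucket (as above), which is exactly why each bucket keeps an open chain that is scanned for exact matches.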
Below is the implementation of the bucket algorithm:
Array.prototype.uniq = function () {
    var resultArr = [],   // Array1: first occurrences
        returnArr = [],   // Array2: removed duplicates
        origLen = this.length,
        resultLen;
    // find the value range of the array
    var maxv = this[0], minv = this[0];
    for (var i = 1; i < origLen; i++) {
        if (this[i] > maxv) maxv = this[i];
        else if (this[i] < minv) minv = this[i];
    }
    // one boolean "bucket" per possible value
    var blen = maxv - minv + 1;
    var b = new Array(blen);
    for (var i = 0; i < origLen; i++) {
        if (b[this[i] - minv]) {
            returnArr.push(this[i]);
        } else {
            resultArr.push(this[i]);
            b[this[i] - minv] = true;
        }
    }
    // rewrite this array in place with the deduplicated result
    resultLen = resultArr.length;
    this.length = resultLen;
    for (var i = 0; i < resultLen; i++) {
        this[i] = resultArr[i];
    }
    return returnArr;
};
Below is the implementation of the hash algorithm:
Array.prototype.uniq = function () {
    var shuffler = 37;    // linear-congruential multiplier
    var beta = 0.007;     // ratio of bucket count to array length
    var origLen = this.length;
    var bucketSize = Math.ceil(origLen * beta);
    var hashSet = new Array(bucketSize);
    var hashFun = function (value) {
        var key = (Math.floor(value) * shuffler) % bucketSize;
        return key;
    };
    // init hashSet
    for (var i = 0; i < bucketSize; i++) {
        hashSet[i] = [];
    }
    var ret = [], self = [];
    var key, value;
    var bucket, openLen;
    var everConflict;
    for (var i = 0; i < origLen; i++) {
        value = this[i];
        key = hashFun(value);
        bucket = hashSet[key];
        openLen = bucket.length;
        everConflict = false;
        // scan this bucket's open chain for an earlier occurrence of value
        for (var j = 0; j < openLen; j++) {
            if (bucket[j] === value) {
                ret.push(value);
                everConflict = true;
                break;
            }
        }
        if (!everConflict) {
            bucket.push(value);
            self.push(value);
        }
    }
    var selfLen = self.length;
    this.length = selfLen;
    for (i = 0; i < selfLen; i++) {
        this[i] = self[i];
    }
    // compute average bucket size (debug: check load balance)
    var lens = [], sum = 0;
    for (var i = 0; i < bucketSize; i++) {
        lens.push(hashSet[i].length);
        sum += hashSet[i].length;
    }
    return ret;
};
Computation time (ms), tested with k*10000 random integers in the range 0 to k*100:
k        1     2     3     4     5
realazy  240   693   1399  2301  3807
bucket   55    101   141   219   293
hash     214   411   654   844   1083
The test framework borrows from http://realazy.org/lab/uniq.html
Test environment: Firefox 2.0.0.6 / Ubuntu 7.10 / 2.66GHz P4 / 1024MB DDR
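For reference, test data of the shape described above can be generated with a helper like this (makeTestArray is a hypothetical name; the benchmark page linked above has its own harness):

```javascript
// Generate k*10000 random integers drawn from 0 .. k*100,
// matching the benchmark setup described in the text.
function makeTestArray(k) {
    var n = k * 10000, max = k * 100;
    var arr = new Array(n);
    for (var i = 0; i < n; i++) {
        arr[i] = Math.floor(Math.random() * (max + 1));
    }
    return arr;
}
```

With n much larger than the value range, the arrays are dense in duplicates, which is the favorable case for the bucket approach seen in the table.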
