When we think of sorting arrays in JavaScript, we usually imagine strings, numbers, or even objects being organized in ascending or descending order. But what happens when the elements of the array are emojis? The answer might surprise you, especially when skin-tone modifiers come into play!
In this blog, we’ll dive into how JavaScript handles sorting emojis, explore the Fitzpatrick scale that influences this behavior, and showcase some quirky examples along the way.
Let’s start with a simple code snippet:

["👍🏽", "👍🏿", "👍🏻", "👍🏾", "👍🏼"].sort();
When executed, this code produces the following output:
["??", "??", "??", "??", "??"]
At first glance, you might wonder what just happened: the emojis were sorted in a light-to-dark order. To understand why, we need to peek under the hood of JavaScript and Unicode.
Every emoji is represented by one or more Unicode code points. For example, the thumbs-up emoji 👍 is U+1F44D, and the five skin-tone modifiers occupy the range U+1F3FB to U+1F3FF.
When an emoji like 👍🏽 is displayed, it is actually a combination of two Unicode characters: the base emoji 👍 (U+1F44D) followed by a skin-tone modifier, in this case 🏽 (U+1F3FD, the medium skin-tone modifier).
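You can see this composition in action with a small sketch (the code points used here are the standard Unicode values for the thumbs-up emoji and the medium skin-tone modifier):

// Build the medium-skin-tone thumbs up from its two code points
console.log(String.fromCodePoint(0x1F44D, 0x1F3FD)); // 👍🏽

// The combined emoji occupies four UTF-16 code units (two surrogate pairs)
console.log("👍🏽".length); // 4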
By default, the .sort() method in JavaScript converts array elements to strings and compares their UTF-16 code unit values. This means the sorting order of emojis is determined entirely by their underlying Unicode encoding.
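A quick way to convince yourself of this default behavior (nothing emoji-specific is needed) is to compare strings directly and to watch what happens to numbers:

// String comparison works on UTF-16 code unit values, so the variant with the
// lower-valued modifier sorts first
console.log("👍🏻" < "👍🏿"); // true

// The same string coercion explains the classic numeric surprise
console.log([10, 2, 1].sort()); // [1, 10, 2]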
The Fitzpatrick scale is a classification system for human skin tones, ranging from Type I (lightest) to Type VI (darkest). Unicode adopted this scale to introduce skin-tone diversity in emojis. Here’s how the modifiers correspond to the Fitzpatrick scale:

🏻 U+1F3FB: Fitzpatrick Type 1-2 (light)
🏼 U+1F3FC: Fitzpatrick Type 3 (medium-light)
🏽 U+1F3FD: Fitzpatrick Type 4 (medium)
🏾 U+1F3FE: Fitzpatrick Type 5 (medium-dark)
🏿 U+1F3FF: Fitzpatrick Type 6 (dark)
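The correspondence is easy to verify in the console; this snippet simply prints each modifier alongside its code point:

// The five Fitzpatrick modifiers occupy the contiguous range U+1F3FB to U+1F3FF
const modifiers = ["🏻", "🏼", "🏽", "🏾", "🏿"];
modifiers.forEach(m =>
  console.log(m, "U+" + m.codePointAt(0).toString(16).toUpperCase())
);
// 🏻 U+1F3FB, 🏼 U+1F3FC, 🏽 U+1F3FD, 🏾 U+1F3FE, 🏿 U+1F3FF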
When sorting emojis with these modifiers, JavaScript essentially orders them based on the numeric value of the skin-tone modifiers, which corresponds to the Fitzpatrick scale.
Here’s a practical demonstration:
["??", "??", "??", "??", "??"].sort();
Let’s try a few variations and observe the results.
What happens if we include base emojis without skin-tone modifiers?
["??", "??", "??", "??", "??"]
Output:
// Array of emojis with skin-tone modifiers const emojis = ["??", "??", "??", "??", "??"]; // Sort the array const sortedEmojis = emojis.sort(); // Log the sorted array console.log(sortedEmojis); // Output: ["??", "??", "??", "??", "??"]
Here, the base emoji 👍 comes first because it is a prefix of every skin-tone variant (when two strings share a prefix, the shorter one sorts first), followed by the variants ordered by their modifier code points.
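You can check the prefix behavior directly:

// "👍" is a prefix of "👍🏻", and in string comparison the shorter string wins
console.log("👍" < "👍🏻"); // true
console.log("👍".length, "👍🏻".length); // 2 4 (UTF-16 code units, not "characters")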
What if we sort other emoji groups with modifiers, like hand gestures?
const mixedEmojis = ["👋🏽", "👋🏻", "👋", "👋🏿", "👋🏼", "👋🏾"];
console.log(mixedEmojis.sort());
Output:
["?", "??", "??", "??", "??", "??"]
Once again, the emojis are sorted based on the Fitzpatrick scale.
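If the skin-tone modifier is getting in the way, you can strip it before comparing. This is only a sketch that assumes plain base-plus-modifier sequences (no ZWJ sequences), and stripTone is a made-up helper name:

// Remove any Fitzpatrick modifier (U+1F3FB to U+1F3FF) before comparing,
// so variants of the same gesture stay together regardless of skin tone
const stripTone = s => s.replace(/[\u{1F3FB}-\u{1F3FF}]/gu, "");

const gestures = ["👋🏿", "✌", "👍🏻", "👋", "👍🏿"];
const sorted = [...gestures].sort((a, b) => {
  const x = stripTone(a), y = stripTone(b);
  return x < y ? -1 : x > y ? 1 : 0;
});
console.log(sorted); // ["✌", "👋🏿", "👋", "👍🏻", "👍🏿"]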
To understand the sorting behavior more deeply, let’s inspect the Unicode values of each emoji:
const handEmojis = ["👋🏻", "👋🏼", "👋🏽", "👋🏾", "👋🏿"];
handEmojis.forEach(e =>
  console.log(e, [...e].map(c => "U+" + c.codePointAt(0).toString(16).toUpperCase()))
);
This logs the code points for each variant: the base emoji followed by its skin-tone modifier. You’ll notice the incremental increase in the modifier values, from U+1F3FB up to U+1F3FF, matching the Fitzpatrick scale.