How do you design an algorithm? This article walks through several common algorithm paradigms. It is intended as a reference for anyone who needs it; I hope you find it helpful.
First clarify three concepts:
Algorithm: The process of solving problems step by step.
Paradigm: A mode of thinking about a problem.
Algorithmic Paradigm: A general approach to building efficient solutions to problems.
This article discusses some commonly used algorithm paradigms: divide and conquer, dynamic programming, greedy algorithms, and backtracking.
Among sorting algorithms, merge sort and quick sort have something in common: both are divide-and-conquer algorithms.
Divide and conquer is a common algorithm design technique. The idea is to decompose the problem into smaller sub-problems that resemble the original one; the sub-problems are usually solved recursively, and their solutions are combined to solve the original problem.
The logic of the divide-and-conquer method can be divided into three steps:
1. Divide: break the original problem into smaller sub-problems of the same kind.
2. Conquer: solve each sub-problem recursively, or solve it directly once it is small enough.
3. Combine: merge the solutions of the sub-problems into a solution to the original problem.
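To make these three steps concrete, here is a minimal merge sort sketch (my own illustration, not code from the original article; merge sort was mentioned above as a divide-and-conquer algorithm):

// Merge sort: divide the array, sort each half recursively (conquer),
// then merge the two sorted halves (combine).
function mergeSort(array) {
  if (array.length <= 1) {
    return array; // base case: nothing left to divide
  }
  const mid = Math.floor(array.length / 2); // divide
  const left = mergeSort(array.slice(0, mid)); // conquer
  const right = mergeSort(array.slice(mid)); // conquer
  const merged = []; // combine: merge the two sorted halves
  let i = 0;
  let j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i)).concat(right.slice(j));
}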
The following is a binary search implemented using divide-and-conquer.
function binarySearchRecursive(array, value, low, high) {
  if (low <= high) {
    // Divide: look at the middle element
    const mid = Math.floor((low + high) / 2);
    const element = array[mid];
    if (element < value) {
      // Conquer: search the right half
      return binarySearchRecursive(array, value, mid + 1, high);
    } else if (element > value) {
      // Conquer: search the left half
      return binarySearchRecursive(array, value, low, mid - 1);
    } else {
      return mid;
    }
  }
  return null; // value not found
}

export function binarySearch(array, value) {
  const sortedArray = quickSort(array); // binary search requires a sorted array
  const low = 0;
  const high = sortedArray.length - 1;
  return binarySearchRecursive(sortedArray, value, low, high);
}
Please note that the binarySearch function above is the one meant to be called by outside code, while binarySearchRecursive is where the divide-and-conquer logic actually lives.
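One more note: binarySearch calls a quickSort function that is not shown in this article; any sorting routine would work. For completeness, a simple divide-and-conquer quick sort sketch (my own addition, so the snippet above can run on its own) could look like this:

// Quick sort sketch: pick a pivot, partition the remaining elements into
// smaller and larger groups, sort each group recursively, then join them.
function quickSort(array) {
  if (array.length <= 1) {
    return array;
  }
  const [pivot, ...rest] = array;
  const smaller = rest.filter((x) => x < pivot);
  const larger = rest.filter((x) => x >= pivot);
  return quickSort(smaller).concat([pivot], quickSort(larger));
}

With that in place, binarySearch([8, 3, 5, 1], 5) returns 2 (the index of 5 in the sorted array [1, 3, 5, 8]), and null is returned when the value is not found.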
Dynamic programming is an optimization technique for solving complex problems by breaking them into smaller sub-problems. It looks a lot like divide and conquer, but where divide and conquer splits the problem into independent sub-problems and combines their solutions, dynamic programming breaks the problem into sub-problems that overlap and depend on one another.
The algorithm logic is divided into three steps:
1. Define the sub-problems.
2. Implement the recurrence that solves the sub-problems.
3. Identify and solve the base cases.
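As a quick warm-up before the main example (this snippet is my own illustration, not from the original article), computing Fibonacci numbers with a memo table shows the pattern: the sub-problem is fib(n), the recurrence is fib(n) = fib(n - 1) + fib(n - 2), and the base cases are fib(0) and fib(1).

// The cache stores already-solved sub-problems so each fib(n) is computed once.
function fib(n, cache = []) {
  if (n <= 1) {
    return n; // base cases: fib(0) = 0, fib(1) = 1
  }
  if (cache[n] === undefined) {
    cache[n] = fib(n - 1, cache) + fib(n - 2, cache); // recurrence
  }
  return cache[n];
}

console.log(fib(10)); // => 55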
This is a common interview question known as the coin change problem: given a set of coin denominations, find which coins can be used to make change for a given amount. The minimum coin change problem asks for the fewest coins needed to make that change. For example, to make change for 37, you can use one 2, one 5, one 10, and one 20.
function minCoinChange(coins, amount) {
  const cache = [];
  const makeChange = (value) => {
    if (!value) {
      return [];
    }
    // Return the cached result if this amount has already been solved
    if (cache[value]) {
      return cache[value];
    }
    let min = [];
    let newMin;
    let newAmount;
    for (let i = 0; i < coins.length; i++) {
      const coin = coins[i];
      // Sub-problem: make change for the remaining amount after using this coin
      newAmount = value - coin;
      if (newAmount >= 0) {
        newMin = makeChange(newAmount);
      }
      // Keep this combination if it uses fewer coins than the current best
      if (newAmount >= 0 &&
          (newMin.length < min.length - 1 || !min.length) &&
          (newMin.length || !newAmount)) {
        min = [coin].concat(newMin);
      }
    }
    return (cache[value] = min);
  };
  return makeChange(amount);
}
In the above code, the parameter coins represents the available denominations ([1, 2, 5, 10, 20, 50] in RMB). To avoid recomputing the same sub-problems, a cache is used. The makeChange function is implemented recursively as an inner function with access to cache.
console.log(minCoinChange([1, 2, 5, 10, 20], 37)); // => [2, 5, 10, 20]
console.log(minCoinChange([1, 3, 4], 6)); // => [3, 3]
A greedy algorithm makes the locally optimal choice at each step in the hope of reaching a globally optimal solution. Unlike dynamic programming, it does not look at the problem as a whole. Greedy algorithms tend to be simple and intuitive, but the solution they find may not be globally optimal.
The coin change problem solved with dynamic programming above can also be solved with a greedy algorithm; whether the result is optimal depends on the denominations used.
function minCoinChange(coins, amount) {
  const change = [];
  let total = 0;
  // Walk through the denominations from largest to smallest
  for (let i = coins.length - 1; i >= 0; i--) {
    const coin = coins[i];
    // Greedily take as many of this coin as will still fit
    while (total + coin <= amount) {
      change.push(coin);
      total += coin;
    }
  }
  return change;
}
As you can see, the greedy algorithm is much simpler than the dynamic programming solution. Let's run the same test cases to see the difference between the two:
console.log(minCoinChange([1, 2, 5, 10, 20], 37)); // => [20, 10, 5, 2]
console.log(minCoinChange([1, 3, 4], 6)); // => [4, 1, 1]
The greedy algorithm gives the optimal solution to the first problem, but its answer to the second one is not optimal (it should be [3, 3]).
The greedy algorithm is simpler and faster than the dynamic programming algorithm, but the solution obtained may not be the optimal solution.
The backtracking algorithm is great for building a solution step by step: it extends a partial solution incrementally and backs up (backtracks) as soon as the current path cannot lead to a valid solution.
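To give a small taste of the idea (a minimal sketch of my own, not the original author's example), here is a backtracking function that generates all permutations of an array: it extends a partial solution one element at a time and undoes each choice after exploring it.

// Build permutations of `items`; at each step try every unused element,
// recurse, then undo the choice (backtrack).
function permutations(items) {
  const result = [];
  const current = [];
  const used = new Array(items.length).fill(false);
  const backtrack = () => {
    if (current.length === items.length) {
      result.push(current.slice()); // a complete solution
      return;
    }
    for (let i = 0; i < items.length; i++) {
      if (used[i]) {
        continue;
      }
      used[i] = true; // make a choice
      current.push(items[i]);
      backtrack(); // explore further
      current.pop(); // undo the choice
      used[i] = false;
    }
  };
  backtrack();
  return result;
}

console.log(permutations([1, 2, 3])); // => 6 permutations, from [1, 2, 3] to [3, 2, 1]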
I plan to introduce backtracking with a more complex example in a separate article. I haven't decided what to write yet; perhaps a Sudoku solver. If you are interested, please follow my official account!
There is no end to learning algorithms. I hope this article helps you understand some common algorithm paradigms.