
API rate limiting with Node and Redis

青灯夜游
Release: 2020-09-19 10:13:03


Rate limiting protects and improves the availability of API-based services. If you are talking to an API and receive an HTTP 429 Too Many Requests response status code, you have been rate limited. This means you have exceeded the number of requests allowed in a given time. All you need to do is slow down, wait a moment, and try again.


Why rate limit?

When you consider limiting your own API-based services, you need to weigh the trade-offs between user experience, security, and performance.


The most common reason to rate limit is to maintain the availability of your API-based services. But there are also security benefits: an accidental or intentional surge in inbound traffic can take up valuable resources and impact availability for other users.

By controlling the rate of incoming requests, you can:

  • Ensure that services and resources are not "flooded"
  • Mitigate brute-force attacks
  • Prevent distributed denial-of-service (DDoS) attacks

How to implement rate limiting?

Rate limiting can be implemented at the client level, application level, infrastructure level, or anywhere in between. There are several ways to control inbound traffic to an API service:

  • By User: Track calls made by a user using an API key, access token, or IP address
  • By Geographic Region: For example, lower the rate limit for each geographic region during peak times of the day
  • By Server: If you have multiple servers handling different calls to the API, you might impose stricter rate limits on access to more expensive resources

You can use any of these rate limits (or even a combination).
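To make the per-user option concrete, here is a minimal sketch (not from the original article) of deriving a rate-limit key for each request in Express. The x-api-key header name is just an assumption; use whatever identifies a caller in your own API.

// Sketch only: pick the identity that requests will be counted against.
// Prefer an API key header when present (header name is hypothetical),
// otherwise fall back to the client's IP address.
function rateLimitKey(req) {
  const apiKey = req.get('x-api-key')
  return apiKey ? `key:${apiKey}` : `ip:${req.ip}`
}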


No matter how you choose to implement it, the goal of rate limiting is to establish a checkpoint that denies or passes requests to access your resources. Many programming languages and frameworks have built-in functionality or middleware to achieve this, as well as options for various rate limiting algorithms.
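For example, in Express you could reach for an existing middleware instead of writing your own. A minimal sketch using the third-party express-rate-limit package might look like the following (this package is not used in the rest of this article, and option names may differ between versions):

const rateLimit = require('express-rate-limit')

// Allow at most 10 requests per IP in any 10-second window, app-wide.
app.use(rateLimit({
  windowMs: 10 * 1000, // window length in milliseconds
  max: 10              // maximum requests per window per IP
}))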

Here's one way to make your own rate limiter using Node and Redis:

  • Create a Node app

  • Add a rate limiter using Redis

  • Test in Postman

View code examples on GitHub.

Before you begin, make sure you have Node and Redis installed on your computer.

Step 1: Set up a Node application

Set up a new Node application from the command line. Accept the default options via the CLI prompt, or add the --yes flag.

$ npm init --yes

If you accepted the default options during project setup, create a file named index.js for the entry point.

$ touch index.js

Install the Express Web framework, and then initialize the server in index.js.

const express = require('express')
const app = express()
const port = process.env.PORT || 3000

app.get('/', (req, res) => res.send('Hello World!'))
app.listen(port, () => console.log(`Example app listening at http://localhost:${port}`))

Start the server from the command line.

$ node index.js

Back in index.js, create a route that first checks the rate limit and then allows the user to access the resource if they have not exceeded the limit.

app.post('/', async (req, res) => {
  async function isOverLimit(ip) {
    // to be defined in Step 2
  }
  // check the rate limit
  let overLimit = await isOverLimit(req.ip)
  if (overLimit) {
    res.status(429).send('Too many requests - try again later')
    return
  }
  // allow access to the resource
  res.send("Accessed the precious resources!")
})


In the next step, we will define the rate limiter function isOverLimit.

Step 2: Add a rate limiter using Redis

Redis is an in-memory key-value database, so it can retrieve data very quickly. Implementing rate limiting with Redis is also very simple.

  • Store a key, such as the user's IP address
  • Increment the number of calls made from that IP
  • Expire the record after a specified period of time

The rate-limiting algorithm used here is an example of a sliding window counter. A user who submits a moderate number of calls, or spaces them out over time, will never hit the rate limit. A user who exceeds the maximum number of requests within the 10-second window must wait for the window to expire before their requests are served again.
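The same pattern can be tried by hand in redis-cli before wiring it into Node (the key below is just an example IP address):

$ redis-cli
127.0.0.1:6379> INCR 203.0.113.7
(integer) 1
127.0.0.1:6379> EXPIRE 203.0.113.7 10
(integer) 1
127.0.0.1:6379> INCR 203.0.113.7
(integer) 2

The first INCR creates the counter at 1, EXPIRE schedules the key to disappear after 10 seconds, and every further INCR inside that window raises the count that the limiter compares against its threshold.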


Install a Redis client named ioredis for Node from the command line.

$ npm install ioredis

Start the Redis server locally.

$ redis-server

Then require and initialize the Redis client in index.js.

const redis = require('ioredis')
const client = redis.createClient({
  port: process.env.REDIS_PORT || 6379,
  host: process.env.REDIS_HOST || 'localhost',
})
client.on('connect', function () {
  console.log('connected');
});
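Optionally (this is not in the original article), you can also listen for error events so that a missing or unreachable Redis server is reported instead of failing silently:

// Log connection problems, e.g. when redis-server is not running
client.on('error', function (err) {
  console.error('redis error:', err)
})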

Define the isOverLimit function that we started writing in the previous step, following the Redis pattern of keeping a counter per IP address.

async function isOverLimit(ip) {
  let res
  try {
    // Increment the counter stored under this IP (Redis creates it at 1 if missing)
    res = await client.incr(ip)
  } catch (err) {
    console.error('isOverLimit: could not increment key')
    throw err
  }
  console.log(`${ip} has value: ${res}`)
  if (res > 10) {
    // Over the limit: do not refresh the expiry, so the caller has to back off
    return true
  }
  // Under the limit: (re)set the key to expire 10 seconds after this request
  await client.expire(ip, 10)
  return false
}

This is the rate limiter.

When a user calls the API, we check Redis to see whether that user is over the limit. If so, the API immediately returns an HTTP 429 status code with the message 'Too many requests - try again later'. If the user is within the limit, we continue to the next block of code, where we can allow access to a protected resource (such as a database).

During the rate-limit check, we look up the user's record in Redis and increment its request count; if there is no record for that user yet, Redis creates one starting at 1. Finally, each record expires 10 seconds after the most recent request that was under the limit.

In the next step, we will make sure our rate limiter is working properly.

Step 3: Test in Postman

Save the changes and restart the server. We will use Postman to send POST requests to our API server, which is running locally at http://localhost:3000.


Continue sending requests in rapid succession to reach your rate limit.
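If you prefer the command line to Postman, roughly the same test can be run with curl (a sketch, assuming the server is still listening on port 3000):

$ for i in $(seq 1 15); do curl -s -o /dev/null -w "%{http_code}\n" -X POST http://localhost:3000/; done

With the limit of 10 used above, the first ten responses should print 200 and the remaining ones 429 until the 10-second window expires.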


Final Thoughts on Rate Limiting

This is a simple example of a rate limiter built with Node and Redis, and it is just the beginning. There are many strategies and tools you can use to structure and implement rate limits, and there are other enhancements that could be explored with this example, such as:

  • Let the user know how long they should wait before retrying, either in the response body or via a Retry-After header (see the sketch after this list)
  • Log requests that hit the rate limit to understand user behavior and warn of malicious attacks
  • Try using other rate limiting algorithms or other middleware
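As a minimal sketch of the first idea, the 429 branch of the route above could attach a Retry-After header based on the key's remaining time to live in Redis. This builds on the client and route from the earlier steps and is an illustration, not part of the original article:

let overLimit = await isOverLimit(req.ip)
if (overLimit) {
  // Seconds until this IP's counter expires (negative if the key or TTL is missing)
  const ttl = await client.ttl(req.ip)
  res.set('Retry-After', String(Math.max(ttl, 1)))
  res.status(429).send('Too many requests - try again later')
  return
}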

Remember that when you look at API limits, you are making trade-offs between performance, security, and user experience. Your ideal rate limiting solution will change over time, taking these factors into account.

Original English article: https://codeburst.io/api-rate-limiting-with-node-and-redis-95354259c768


