To answer: this is really a question of overall architecture. For how to design the whole system, I suggest looking up the corresponding architecture-design articles on InfoQ; pieces on the Baidu Tieba app, news apps, or the Xueqiu (Snowball) app are all worth reading. Here is a Snowball write-up: http://www.tuicool.com/articles/YNzAfmV.
High traffic does not mean high concurrency. For a reasonably performant server, this kind of read-heavy traffic is easy to handle: simple read-write separation plus front-end caching is basically enough. If you want to go further, search for flash-sale ("seckill") solutions. Put simply, 100,000 people viewing the same page at the same time is not a big deal, but 2,000 people writing comments in the same second is troublesome. Then again, what news story actually gets 2,000 comments in one second? After all, 99% of readers never comment at all, and more than half of those who do are replying to other people's comments.
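To make the read-side point concrete, here is a minimal sketch (in Python, purely illustrative; `PageCache` and `render` are hypothetical names, not from any specific framework) of the kind of short-TTL page cache that lets one rendered article serve many concurrent readers without touching the database:

```python
import time

class PageCache:
    """Hypothetical short-TTL cache for rendered article pages.

    With even a 5-second TTL, 100,000 readers hitting the same
    article in that window cause only one render / database read.
    """

    def __init__(self, ttl_seconds=5):
        self.ttl = ttl_seconds
        self.store = {}  # article_id -> (expires_at, html)

    def get(self, article_id, render):
        now = time.time()
        entry = self.store.get(article_id)
        if entry and entry[0] > now:
            return entry[1]            # cache hit: no backend work
        html = render(article_id)      # cache miss: render once, serve many
        self.store[article_id] = (now + self.ttl, html)
        return html
```

In practice the same idea is usually deployed as a CDN or reverse-proxy cache (e.g. a shared cache in front of the app servers) rather than an in-process dictionary, but the traffic math is the same: read QPS on the database collapses to roughly one query per article per TTL window.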
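For the rare write spike that does happen, a common mitigation is to accept comments into a queue and flush them to the database in batches, so 2,000 near-simultaneous submissions become a handful of bulk inserts. A minimal sketch, assuming a hypothetical `batch_insert` callback that performs one bulk INSERT (the function names here are illustrative, not a real API):

```python
import queue

# Pending comments; in production this would be a durable broker
# (e.g. a message queue), not an in-process queue.
comment_queue = queue.Queue()

def submit_comment(article_id, user_id, text):
    """Enqueue the comment and return immediately; no synchronous DB write."""
    comment_queue.put((article_id, user_id, text))

def drain(batch_insert, batch_size=100):
    """Drain up to batch_size queued comments into one bulk insert.

    A background worker would call this in a loop; batch_insert is a
    hypothetical callback performing a single multi-row INSERT.
    """
    batch = []
    while len(batch) < batch_size:
        try:
            batch.append(comment_queue.get_nowait())
        except queue.Empty:
            break
    if batch:
        batch_insert(batch)
    return len(batch)
```

The trade-off is a short delay before a comment becomes visible, which is usually acceptable for news comments and turns a spiky write load into a smooth, bounded one.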