Without adding hardware, the only ways to optimize are to improve the business logic, add indexes where appropriate (trading write speed for read speed), or add a memory cache — but with only 1 GB of RAM, even that is in short supply...
If you have a hardware budget, there are naturally a hundred ways to optimize. The crudest and cheapest is to put the database on an SSD: with roughly 8 million (800W) rows averaging 15 KB each, that is about 120 GB of data, so a 240 GB drive is almost enough.
Using PostgreSQL's GIN index is the simplest and most effective approach. For details, see: Optimization of high-concurrency, low-cardinality, multi-field, arbitrary combination queries
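As a rough sketch of what such an index might look like (the table and column names below are hypothetical; the `btree_gin` extension is assumed, since plain GIN does not index scalar types):

```sql
-- btree_gin lets GIN index ordinary scalar columns.
CREATE EXTENSION IF NOT EXISTS btree_gin;

-- Hypothetical table with several low-cardinality filter columns.
CREATE TABLE orders (
    id      bigserial PRIMARY KEY,
    status  int,
    region  int,
    channel int,
    payload text
);

-- One multi-column GIN index can serve arbitrary AND-combinations
-- of these columns, instead of one btree index per combination.
CREATE INDEX idx_orders_gin ON orders USING gin (status, region, channel);

-- Any subset of equality filters can then use the same index, e.g.:
-- SELECT * FROM orders WHERE status = 1 AND channel = 3;
-- SELECT * FROM orders WHERE region = 2;
```

The appeal for "arbitrary combination" queries is exactly that one GIN index replaces the combinatorial explosion of composite btree indexes.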
You can optimize the slow queries: indexes mainly help with the condition fields after WHERE, so try creating indexes on those.
Indexes are good for querying, but they slow down inserts, so too many indexes is probably counterproductive.
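A minimal sketch of that workflow, assuming a hypothetical `orders` table and `customer_id` filter column: check the plan first, and only index what is actually filtered on.

```sql
-- See whether the slow query falls back to a full table scan.
EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = 42;

-- Index the column used in the WHERE clause. Every extra index
-- adds write overhead, so create them sparingly.
CREATE INDEX idx_orders_customer ON orders (customer_id);
```

Re-running the EXPLAIN afterward should show an index scan instead of a sequential scan if the index is being used.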