python redis list insertion speed is too slow
typecho 2017-06-28 09:21:58
import json
import redis

pool = redis.ConnectionPool(host=host, port=port)
client = redis.StrictRedis(connection_pool=pool)

# Push the same task 300,000 times, one LPUSH command per iteration
for i in range(10000):
    for j in range(30):
        client.lpush(IDLE_TASKS, json.dumps(args))
 

The execution efficiency here is terrible: it takes dozens of seconds before the insertion completes.
Is there a more efficient way to handle this?

args is just a tuple such as (1, 2, "3").


Replies (1)
刘奇

Since I've never used the redis library myself, I can only offer a few suggestions based on the code you posted; take them or leave them:

1. I don't know where your args comes from, but it doesn't seem to change inside the loop body, so you could move the json.dumps(args) call out of the loop and run it only once:

args_dump = json.dumps(args)
for i in range(10000):
    for j in range(30):
        client.lpush(IDLE_TASKS, args_dump)

2. Since you are generating about 300,000 copies of the same data, could you build the data first and then hand it to client.lpush in batches? After all, every TCP round trip adds its own latency. See the sketch below.
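
As a rough illustration (not part of the original answer): redis-py's lpush accepts multiple values in one call, and its pipeline() batches many commands into fewer network round trips. The names host, port, IDLE_TASKS and args are assumed from the question's code.

import json
import redis

pool = redis.ConnectionPool(host=host, port=port)
client = redis.StrictRedis(connection_pool=pool)

args_dump = json.dumps(args)

# Option A: one LPUSH command carrying 30 values at a time
for i in range(10000):
    client.lpush(IDLE_TASKS, *([args_dump] * 30))

# Option B: a pipeline that sends commands to the server in large batches
pipe = client.pipeline()
for i in range(10000):
    pipe.lpush(IDLE_TASKS, *([args_dump] * 30))
    if (i + 1) % 1000 == 0:
        pipe.execute()
pipe.execute()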

3. You can use the cProfile library to find out what is actually taking the time, or try a different Redis client library (you'll have to google the details for that). A profiling sketch follows.
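
A minimal profiling sketch, not part of the original answer; push_tasks is a hypothetical wrapper around the insertion loop from the question. You can also run the whole script with: python -m cProfile -s cumtime your_script.py

import cProfile

def push_tasks():
    # Hypothetical wrapper around the question's insertion loop
    for i in range(10000):
        for j in range(30):
            client.lpush(IDLE_TASKS, json.dumps(args))

# Sort by cumulative time to see where the seconds are going
cProfile.run('push_tasks()', sort='cumtime')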
