Accelerating creation and deletion of many tasks


I am writing a program that mirrors and organizes tasks, so I often have to delete or create a large number of tasks. I am using the Python library.

I implemented batch dispatching for create_task and delete_task operations via the client.batch_api.create_batch_request API call. It works, but this approach is still too slow: I currently have to handle 45,000 tasks.
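For reference, a batch-dispatch setup like the one described might look roughly like this. This is a minimal sketch, not the poster's actual code: the action dict format follows the Asana batch API endpoint, but the exact `create_batch_request` call shape varies between python-asana versions, so treat that line as an assumption to check against your client.

```python
# Hypothetical sketch of batch-dispatched deletes via the Asana batch API.
# The action payload format ({"method": ..., "relative_path": ...}) follows
# the documented batch endpoint; the create_batch_request signature is an
# assumption and depends on your python-asana version.

def delete_actions(task_gids):
    """Build one batch-API action dict per task GID."""
    return [{"method": "delete", "relative_path": f"/tasks/{gid}"}
            for gid in task_gids]

def chunk(actions, size=10):
    """Split actions into batches of at most `size` (10 is the API maximum)."""
    return [actions[i:i + size] for i in range(0, len(actions), size)]

def dispatch_all(client, task_gids):
    """Send all deletes, one batch request per chunk of 10 actions."""
    for batch in chunk(delete_actions(task_gids)):
        # Assumed call shape -- verify against your client version.
        client.batch_api.create_batch_request({"data": {"actions": batch}})
```

The chunking is the part worth getting right: the batch endpoint rejects requests with more than 10 actions.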

So I tried using threads to make concurrent requests via the client.batch_api.create_batch_request API, but without success so far. I suspect the problem is either my threading approach or the API functions. That's why I need to clarify: can client.batch_api.create_batch_request be called in parallel by multiple threads? The API documentation says I can have 50 concurrent GET requests and 15 concurrent requests of the other methods (POST, PUT, DELETE). So can I have 15 threads making concurrent DELETE requests via the batch API?

Question 2: How can I make task creation and deletion as fast as possible? I am happy for any suggestion on how to create and delete a massive number of tasks in the fastest way.

Hi @MacForrester ,

For your first question: the limit is 15 concurrent POST/PUT/DELETE requests, and using the batch API will not get you around it.

That is because one batch of 10 items counts as 10 individual items, so a second simultaneous batch of 10 items will fail.

Also, when using batch, the response time is equal to that of the slowest item in your batch.

To get maximum performance, you should put all your actions in a queue, then start 15 threads that each pop and run one POST/PUT/DELETE at a time.
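That queue-plus-workers recipe can be sketched as follows. This is a generic pattern, not Asana-specific: `do_action` stands in for whatever single (non-batch) API call you need to make per task.

```python
import queue
import threading

NUM_WORKERS = 15  # matches the documented concurrent POST/PUT/DELETE limit

def run_actions(actions, do_action):
    """Drain `actions` with NUM_WORKERS threads.

    `do_action` is a placeholder for one individual API call (e.g. deleting
    a single task). An exception from do_action ends that worker; add
    retry/logging as your use case requires.
    """
    q = queue.Queue()
    for a in actions:
        q.put(a)

    def worker():
        while True:
            try:
                a = q.get_nowait()
            except queue.Empty:
                return  # queue drained, worker exits
            try:
                do_action(a)
            finally:
                q.task_done()

    threads = [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

Because each worker pops one item at a time, a fast action never waits on a slow one, which is exactly the advantage over batching described above.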

That said, the performance will not be wonderful, because each POST/PUT action takes between 1000 and 2500 ms to complete!

In the worst case, 15 threads × 60,000 ms ÷ 2500 ms = 360 actions per minute.
I suppose that running them at 2 a.m. can help you get better performance.

I don't think you'll even be able to reach the 1500-requests-per-minute API limit.

Whatever method you use, batch or not, threads or not, the API limit is 1500 requests per minute.

But if you could reach that limit, your best time for 45,000 tasks would be 45,000 ÷ 1500 = 30 minutes.
That's the theoretical best time you can get, and it's almost impossible to reach!
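Spelling out the arithmetic above (same numbers as in the post, just made explicit):

```python
# Worst case: each POST/PUT/DELETE takes 2500 ms, with 15 threads in parallel.
MS_PER_MINUTE = 60_000
worst_actions_per_minute = 15 * MS_PER_MINUTE // 2500  # 15 threads * 24 actions each

# Best case: bounded by the 1500-requests/minute rate limit, no batch overhead.
best_minutes_for_45000 = 45_000 / 1500
```

So the realistic throughput sits somewhere between 360 actions per minute (worst case) and the 1500/minute rate-limit ceiling.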

So, my suggestion is:

  1. Do not use batch, so that a fast action finishes quickly without waiting for a slower one to complete.
  2. Run 15 parallel threads.
  3. Run your job during non-busy hours.
  4. Be patient :smiley:

Great answers/info, @Frederic_Malenfant!

@MacForrester if you’re curious why those writes take so long, here’s some detail:


Thanks Frederic, your approach seems like a good alternative to my current implementation; I might implement it and do some benchmarking. Interestingly, though, the API engineer in the post that @Phil_Seeman quoted recommends using the batch API :thinking:

I will get back once I have some test results…



Yes, I saw that suggestion of using the batch API…

Here's what I think: when you use the batch API, authentication and security checks are done only once, so I suppose they save time and resources by doing them once for one batch of 10 queries…

But your request will still be as slow as the slowest of the 10 items in the batch, so you have to test and benchmark to see what's best for your needs!
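For that benchmarking, a tiny timing harness is enough; here is a minimal sketch. The strategy functions you pass in (batch runner vs. per-item threaded runner) are your own code, not shown here.

```python
import time

def benchmark(label, fn, *args):
    """Time one deletion/creation strategy end to end.

    `fn` is whichever runner you want to measure (batch-based or
    per-item threaded); `label` is just for the printed report.
    """
    start = time.perf_counter()
    fn(*args)
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.1f}s")
    return elapsed
```

Run it over the same fixed set of test tasks for each strategy so the comparison is apples to apples.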

I am curious: after your experiments, please come back and tell us your results!!


I will, I just have to fix the token refresh first, since some of my threads get TokenExpiredErrors and I am still trying to solve that with the Python API, as pointed out here:

Unfortunately, no Asana dev has a definitive solution for that :confused:
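One generic way to avoid TokenExpiredErrors across worker threads (this is not python-asana-specific advice, just a common pattern): cache the token behind a lock so that only one thread performs the refresh while the others wait and reuse the result. The `refresh_fn` below is a placeholder for your actual OAuth refresh call.

```python
import threading
import time

class TokenHolder:
    """Thread-safe token cache: one thread refreshes, the rest reuse it.

    `refresh_fn` is a hypothetical placeholder for your OAuth token-refresh
    call; `ttl_seconds` should be comfortably below the token's real lifetime.
    """

    def __init__(self, refresh_fn, ttl_seconds=3000):
        self._refresh_fn = refresh_fn
        self._ttl = ttl_seconds
        self._lock = threading.Lock()
        self._token = None
        self._expires_at = 0.0

    def get(self):
        # Every worker calls get() before each request; the lock ensures
        # exactly one refresh happens when the token is missing or stale.
        with self._lock:
            now = time.monotonic()
            if self._token is None or now >= self._expires_at:
                self._token = self._refresh_fn()
                self._expires_at = now + self._ttl
            return self._token
```

Refreshing slightly early (a TTL below the token's real lifetime) trades a few extra refreshes for never handing a worker a token that expires mid-request.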