Add a way to limit the number of actions an AI rule can take during testing/iteration to prevent infinite loops

There needs to be a way to limit the number of actions an AI automation can take during the testing/building phase, to prevent infinite loops such as AIs endlessly commenting to each other, an easy mistake to make while iterating and testing. I burned through 10% of a month's credits in 90 seconds yesterday, and I only deleted the rule in time because I happened to have the billing/AI usage page open in another browser window. Notice I said deleted: it wasn't possible to halt it. Anyone else experiencing this?
