How do we determine that a bot has been stuck for more than a predetermined amount of time? For example, if a bot is stuck at 25% in the Control Room, can I track it via an API and trigger its closure after sending an error email?
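The tracking logic being asked about could be sketched generically as a watchdog: record when each bot's reported progress last changed, and flag any bot whose progress has not moved within a threshold. This is a minimal Python sketch; `poll_status`, `stop_bot`, and `send_error_email` are hypothetical placeholders, not real Control Room API calls.

```python
import time

STUCK_THRESHOLD_SECS = 30 * 60  # e.g. flag a bot stuck for >30 minutes

def find_stuck_bots(statuses, last_change, now, threshold=STUCK_THRESHOLD_SECS):
    """One watchdog sweep.

    statuses:    list of (bot_id, progress_percent) pairs from polling
    last_change: dict bot_id -> (progress, timestamp of last change),
                 updated in place across sweeps
    Returns the ids of bots whose progress has not changed for longer
    than `threshold` seconds.
    """
    stuck = []
    for bot_id, progress in statuses:
        prev = last_change.get(bot_id)
        if prev is None or prev[0] != progress:
            last_change[bot_id] = (progress, now)   # progress moved: reset the timer
        elif now - prev[1] > threshold:
            stuck.append(bot_id)                    # e.g. frozen at 25%
    return stuck

# A driving loop would poll periodically and act on stuck bots, e.g.:
#
#   last_change = {}
#   while True:
#       for bot_id in find_stuck_bots(poll_status(), last_change, time.time()):
#           send_error_email(bot_id)   # hypothetical alert hook
#           stop_bot(bot_id)           # hypothetical stop hook
#           del last_change[bot_id]
#       time.sleep(60)
```

The sweep function is separated from the infinite loop so the stuck-detection logic can be tested on its own.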

@MICAH SMITH​ it would be great if you could share some suggestions.


I don't believe there's a public API for getting bot status at the moment. I think the bigger concern is why a bot is stuck and what actions are leading to that condition. Most actions in Automation 360 have a timeout (the Recorder, for example): if an object/condition isn't found to be true within 15 seconds, it automatically moves on to the next action or errors out... so I'd probably focus on why the bots are stuck and how that can be resolved.


I would like to add to this question:

We have bots that run successfully 98% of the time, and they do fail and throw failure messages when needed. All of my tests reside within a Try/Catch/Finally block that closes all browsers and sets up for the next test. However, when a test gets stuck, it doesn't throw any errors except the one saying that I forcefully closed the bot.

When I run a stress test (running a test every 20 minutes) and leave the tests going overnight, I come in in the morning to find tests that have been stuck for anywhere from 0 to 13 hours, depending on when they got stuck. This creates a huge backlog, and I have to manually stop all of the runs that queued up behind the blockage so that our business tests can run during business hours. I am completely in the dark as to why the tests hang, and I don't really have a way of checking: the only error is that I forcefully closed the bot, and when I run the test again it doesn't happen. It usually only happens once a day, during the night, because I don't catch it while sleeping. It becomes a scramble in the morning to shut every other test down so the scheduled everyday business tests can run.

 

We are pretty small right now, with only 2 Bot Runner devices, so we have to share licenses and Bot Runners. We are currently on a previous version (n-5, I believe), but we are in the process of upgrading to the latest version soon.

Does anyone know a workaround for this, or whether it is fixed in a newer version? It happens sporadically with all of the tests I have created.

Thanks, 
Ben

