I've found an effective way to filter bots is to check their headers — specifically the order and capitalization of the header names. Browsers always send headers with the same capitalization and in the same order (unless some interfering plugin is installed). Check your bot traffic, see what browser it's masquerading as, then compare against the headers that browser and version actually sends. For example, 'Content-Type' coming over as 'content-type', or the bot not sending the Accept header at all, whereas Chrome always sends it. A few small checks on the headers can easily identify a lot of the bots.
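As a rough sketch of the idea: keep a per-browser fingerprint of expected header names, then flag requests where the casing or relative order doesn't match. The EXPECTED_CHROME list below is illustrative only — capture real traffic from the browser version you care about and build your own fingerprint from it.

```python
# Illustrative fingerprint -- NOT an authoritative list of Chrome's headers.
# Build your own from captured traffic for the UA versions you see.
EXPECTED_CHROME = [
    "Host",
    "Connection",
    "User-Agent",
    "Accept",
    "Accept-Encoding",
    "Accept-Language",
]

def looks_like_fake_chrome(header_names):
    """Return True if the headers don't match the expected fingerprint.

    header_names: header names in arrival order, with their original
    capitalization preserved (so 'content-type' != 'Content-Type').
    """
    # Presence check with exact casing: Chrome always sends Accept.
    if "Accept" not in header_names:
        return True
    # Order check: the expected headers that are present must appear
    # in the same relative order as in the fingerprint.
    positions = [header_names.index(h) for h in EXPECTED_CHROME
                 if h in header_names]
    return positions != sorted(positions)

# A bot that lowercases header names (or drops Accept) gets flagged:
print(looks_like_fake_chrome(["host", "user-agent", "accept"]))   # True
print(looks_like_fake_chrome(
    ["Host", "Connection", "User-Agent", "Accept",
     "Accept-Encoding", "Accept-Language"]))                      # False
```

Note that this only works on plain HTTP/1.1 as seen by your server; if a proxy or HTTP/2 layer normalizes header names to lowercase before your code sees them, you'd need to fingerprint at that layer instead.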
Chances are they're not going to take the time to figure out your specific site's mitigation measures and just let the bot fail crawling or move on.