We are facing a constant battle against bots and spiders with our in-house ad system, striving for 100% valid impressions. To achieve this goal, I conduct experiments on a specific ad zone that is only displayed on one page of our site.
By comparing the Google Analytics page views for that page to the impression count for the ad zone, I aim to align them as closely as possible.
Our defense tactics involve using a known bot/spider list, serving ads via JavaScript, and implementing a honeypot to capture new scrapers/bots automatically.
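For context, here is a minimal sketch of how a honeypot like the one described could work on the server side. This is illustrative only, not our actual implementation: the `/trap/ad-click` path, the request shape (`url`, `ip`), and the render helpers are all assumed names. The idea is that the page contains an invisible link to the trap path that no human ever follows, so any client requesting it gets blocklisted.

```javascript
// Hedged sketch of a honeypot trap: assumed path, assumed request shape.
const HONEYPOT_PATH = '/trap/ad-click'; // linked invisibly in the page markup
const blockedIps = new Set();

function handleRequest(req) {
  if (req.url === HONEYPOT_PATH) {
    // Only bots follow the hidden link, so blocklist the client.
    blockedIps.add(req.ip);
    return { status: 403, body: '' };
  }
  if (blockedIps.has(req.ip)) {
    // Known bot: serve the page but skip counting an ad impression.
    return { status: 200, body: renderPageWithoutAds() };
  }
  return { status: 200, body: renderPageWithAds() };
}

// Placeholder render helpers (hypothetical).
function renderPageWithAds() { return '<html>...ads...</html>'; }
function renderPageWithoutAds() { return '<html>...no ads...</html>'; }
```

The key design point is that blocklisting happens automatically, with no manual bot-list maintenance for scrapers that hit the trap.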
Even with these defenses, ad delivery runs at 130-150% of page views, which tells me bots are triggering impressions without registering as page views.
To address this, I tried loading the ads only after the first mouse movement, which cut delivery to 40-60% of page views, but this approach only works for desktop users.
The size of the drop surprised me: JS is almost universally enabled, a mouse is a common input device, and I had expected most bots to simulate mouse movement, yet fulfillment rates remain low.
If you have any insights or suggestions, please share them.
EDIT WITH JS SNIPPET
var adShown = false;
document.addEventListener('mousemove', function showAd() {
  if (adShown) return;
  adShown = true;
  var leaderboard = CODE_FOR_AD; // ad markup placeholder
  var adLeaderboard = document.querySelector('.adspace-leaderboard#adspace');
  if (adLeaderboard) adLeaderboard.innerHTML = leaderboard;
  // Detach so the handler doesn't keep firing on every mouse movement.
  document.removeEventListener('mousemove', showAd);
});
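One way the desktop-only limitation might be addressed is to fire on the first of several interaction events (touch, scroll, keyboard) rather than mouse movement alone. Below is a hedged sketch, not something I have measured: the event list and the helper name are assumptions, and the `.adspace-leaderboard#adspace` selector matches the snippet above. The document/element is passed in as a parameter so the logic is testable outside a browser.

```javascript
// Sketch: load the ad on the first of several interaction events,
// so touch-only and keyboard users also trigger the load.
// `doc` is any object with addEventListener/removeEventListener/querySelector.
function loadAdOnFirstInteraction(doc, adMarkup) {
  const events = ['mousemove', 'touchstart', 'scroll', 'keydown'];
  let shown = false;
  function show() {
    if (shown) return;
    shown = true;
    const slot = doc.querySelector('.adspace-leaderboard#adspace');
    if (slot) slot.innerHTML = adMarkup;
    // Detach every listener once any one of them has fired.
    events.forEach((e) => doc.removeEventListener(e, show));
  }
  events.forEach((e) => doc.addEventListener(e, show));
}
```

In a browser this would be called as `loadAdOnFirstInteraction(document, CODE_FOR_AD)`. Whether bots simulate touch or scroll events less often than mouse movement is an open question; this only widens coverage for real users.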