Spider trap detection

Spider traps, also known as crawler traps, are a major problem when crawling the internet. Using a well-trained neural network, we developed an algorithm capable of detecting recursive and repetitive spider traps that cannot be caught by regular expressions.
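To illustrate the kind of trap involved (our actual detector is neural and is not shown here), a minimal heuristic sketch in Python flags URLs whose paths repeat segments or grow suspiciously deep, the signature of many recursive traps. The thresholds `max_repeats` and `max_depth` are illustrative assumptions, not tuned values:

```python
from urllib.parse import urlparse

def looks_like_trap(url: str, max_repeats: int = 3, max_depth: int = 12) -> bool:
    """Flag URLs whose path repeats segments or nests unreasonably deep."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    # Runaway nesting: /a/b/c/... far beyond any sensible site structure.
    if len(segments) > max_depth:
        return True
    # Recursive traps often echo the same segment again and again.
    return any(segments.count(seg) >= max_repeats for seg in set(segments))

print(looks_like_trap("https://example.com/a/b/a/b/a/b/page"))  # repeated segments
print(looks_like_trap("https://example.com/blog/2024/05/post"))  # ordinary page
```

Simple rules like this catch the obvious cases; traps that vary their URLs (session IDs, calendars, faceted filters) are exactly where a learned detector earns its keep.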

Spider trap detector

We use this detector for our Sitesearch service to avoid crawling useless pages over and over again. If you operate a web crawler and this problem plagues you too, don't hesitate to contact us or try it out yourself.

Contact us
Please feel free to contact us! Call 0(031) 50 85 36 620, send us an e-mail or go to our contact page.