Exploit Krawler is a framework that lets us collect the components served by miscellaneous exploit kits (Java applets, PDFs, ...) in order to make their analysis easier. These exploit kits are increasingly numerous on the Internet and are increasingly used to drop malware and build botnets. One problem for security researchers is reproducing the infections and accessing the whole infection chain. The goal of the Exploit Krawler framework is to address these problems at large scale. Exploit Krawler is a cluster of Selenium-instrumented browsers. The browsers run in separate virtual machines; each virtual machine is monitored to detect an intrusion through its browser.
Monitoring is implemented through the hypervisor. The hypervisor API is used to dump the memory, dump the disks, and launch actions on the virtual machine. Processes, sockets and DLLs that are added or removed during the crawl are checked. Each VM reaches the web pages through HoneyProxy, so all accesses are logged and the proxy captures the whole set of web transactions (pages, applets, executables, ...).
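The detection step boils down to diffing VM-state snapshots taken before and after a crawl. The following is a minimal sketch of that idea, assuming each snapshot is a set of identifiers (process names, socket tuples, loaded DLL paths) obtained via the hypervisor API; the function and sample names are illustrative, not the actual Exploit Krawler API.

```python
# Hypothetical snapshot diff: each snapshot is a set of identifiers
# (process names, socket tuples, DLL paths) collected via the hypervisor.

def diff_snapshots(before, after):
    """Return (added, removed): what appeared and what vanished during the crawl."""
    added = sorted(set(after) - set(before))
    removed = sorted(set(before) - set(after))
    return added, removed

# Example: a dropped executable and an injected DLL appear after a visit.
before = {"explorer.exe", "iexplore.exe", "kernel32.dll"}
after = {"explorer.exe", "iexplore.exe", "kernel32.dll",
         "dropper.exe", "inject.dll"}
added, removed = diff_snapshots(before, after)
print(added)    # ['dropper.exe', 'inject.dll']
print(removed)  # []
```

Anything in `added` after visiting a page is a strong signal that the browser was exploited and is flagged for analysis.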
The initial URL list is shared across the cluster, and every newly discovered URL is distributed through a demultiplexer. The goal is to run different browsers against the same URL with different or identical referrers in order to trigger the infection, as some exploit kits only fire for a given Referer and/or a given browser.
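The demultiplexer can be sketched as a simple fan-out: each URL is expanded into one crawl job per (browser, referrer) combination. The job structure and the browser/referrer labels below are assumptions for illustration, not the framework's real interface.

```python
import itertools

def demultiplex(urls, browsers, referrers):
    """Fan each URL out to every (browser, referrer) pair, since some
    exploit kits only trigger for a specific Referer and/or browser."""
    return [
        {"url": u, "browser": b, "referrer": r}
        for u, b, r in itertools.product(urls, browsers, referrers)
    ]

jobs = demultiplex(
    ["http://example.com/landing"],
    ["ie8-winxp", "firefox-win7"],          # hypothetical VM/browser labels
    [None, "http://www.google.com/"],       # None = direct visit, no Referer
)
print(len(jobs))  # 4 jobs: 2 browsers x 2 referrers
```

Each job would then be handed to a Selenium-driven browser in its own VM, so the same URL is probed under every condition that might be required to trigger the exploit.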