Tuts 4 You

About This File

Malicious web sites that compromise vulnerable computers are an ever-present threat on the web. The purveyors of these sites are highly motivated and quickly adapt to technologies that try to protect users from their sites. This paper studies the resulting arms race between detection and evasion from the point of view of Google’s Safe Browsing infrastructure, an operational web-malware detection system that serves hundreds of millions of users. We analyze data collected over a four year period and study the most popular practices that challenge four of the most prevalent web-malware detection systems: Virtual Machine client honeypots, Browser Emulator client honeypots, Classification based on domain reputation, and Anti-Virus engines. Our results show that none of these systems are effective in isolation. In addition to describing specific methods that malicious web sites employ to evade detection, we study trends over time to measure the prevalence of evasion at scale. Our results indicate that exploit delivery mechanisms are becoming increasingly complex and evasive.

