PLI Podcast: Andy Ellis, Akamai Technologies

Posted on 12 March 2011


This week on the Police Led Intelligence Podcast we’re speaking with Andy Ellis, Senior Director of Information Security and Chief Security Architect at Akamai Technologies, a large provider of applications and services on the Internet. Andy formerly worked on information warfare in the United States Air Force.

Download the podcast here

We’re speaking with Andy because his company’s 80,000 servers see enormous traffic from around the world, placing it in the middle of about 20% of Internet transactions. Like a shopping mall security guard, Akamai ends up seeing the entire range of human shopping behaviors, and also sees a whole lot of crime. So Andy finds himself in the fairly rare position of both seeing tons of malicious Internet traffic he can use as intelligence feeds and needing to share information about attacks against his company, his infrastructure and ultimately his customers.

When law enforcement agencies look at intelligence collection and processing, they face a mountain of data and information from a huge range of public, private and official sources. We long ago passed the inflection point between not enough data and far too much, so we’re forced to listen selectively. If we’re doing it right, we’re monitoring only those channels which support the mission we’re on – no more, no less.

[We’re going to do a separate podcast on that process in the near future.]

But this selective listening presents some challenges – in a world of infinite and growing information channels, how do we decide what information not to listen to, and how do we test to see whether the channels we’ve chosen to monitor are giving us the full picture?

Andy is in an interesting position because he has to work, in effect, both sides of the equation. As head of security for a large company and a large service provider, he needs to do everything I just described, but he also has to balance what information he allows his organization to share.

“We’re almost like an ISAC ourselves – an Information Sharing and Analysis Center,” he said, “because we’re looking at the information across a huge piece of industry. Our customers are trusting us: we have to decide how we take that data back to them. We can’t say, ‘Nick Selby got hacked yesterday,’ because you’d be upset if we told your competitors. But at the same time, we have to say, ‘Hey, there’s this attack that’s happening, and here’s what it looks like.’ You’re willing to do that as long as your name isn’t part of it. Because we’re not just protecting our own data, we’re protecting the data of our constituents.”

Some highlights of the conversation:

  • Sharing is about partnership. Who are you going to trust with the data? It’s much worse than merely revealing your vulnerabilities: by sharing too much, you’re revealing to adversaries your data collection capabilities. You’re telling them how you would detect somebody exploiting your vulnerabilities.
  • Partnerships have to be a two-way street: if you share with law enforcement and get nothing back, you’re not going to share much anymore. In most of the public-private partnerships we’ve seen, private gives something to public and public gives nothing back. This must change.
  • That said, for every law enforcement agency that does not give back, Andy has examples of agencies that do. He lauded the work of the FBI, and agreed that it should work more with local, county and state law enforcement to share the resources it has to help solve crimes that may be too small for the FBI to become involved in but are still criminal activity.
  • When overwhelmed with subpoena requests that were badly formed, untimely and otherwise inconvenient, Andy began a campaign to educate law enforcement through the FBI and other agencies, and it produced immediate results. Building partnerships and relationships with law enforcement helps in many ways that can be leveraged on behalf of his customers and his company.
  • Intelligence is a cycle, an iterative process, not set-and-forget. Look at datasets, pull out information you might not have seen before, and then automate that analysis. Look at incidents that happened that your monitoring didn’t warn you about in advance, and ask how you could have detected them. What could you have been watching that would have let you see that event coming?

Give it a listen! Download the podcast here