Much Ado About Kaspersky


We’re still being asked about this regularly, so we thought it would be a good idea to get our thoughts in one place.  Recently, it was reported in the media that the federal government has recommended terminating the use of Kaspersky anti-virus software.  The reason given is that Eugene Kaspersky is somehow on the payroll of, or otherwise in collusion with, the FSB (a successor to the KGB and Russia’s principal security service).  This was followed by a lot of pearl-clutching and at least one story on why local governments (cities, counties) are not dropping the product on command.  CI’s Mike Hamilton contributed to an opinion piece [link], arguing that the cost and disruption of dropping the product don’t map to the actual risk.  Let’s dive into that assertion and try to bring some order to this one piece of chaos.

We have heard that a classified briefing was conducted to inform a variety of organizations about the reasons behind the pronouncement.  We weren’t invited to the briefing, so keep in mind that we’re not in possession of that body of reasoning.  Still, it can safely be assumed that the federal government’s reasons are well-researched and supported, and that the policy it created for federal agencies came out of a mature risk management process.  All of that is assumed.

It’s true that it’s theoretically possible to use any endpoint agent to surveil, or at least to map, internal networks.  That’s certainly a threat, and one that would get your attention.  It would also be pretty easy to see: the result of that surveillance is meaningless if the data can’t be transmitted to a collector, and that traffic would be visible to anyone performing the kind of monitoring they should already be doing.
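To illustrate the point about visibility, here is a minimal sketch of the kind of check a network monitor could perform: flag any connection made by the endpoint agent to a destination outside its expected allow-list.  The process name, hostnames, and log format are all invented for illustration; real monitoring would work from NetFlow, firewall, or EDR telemetry.

```python
# Hypothetical sketch: spot an endpoint agent talking to an unexpected
# collector.  All names and the log format below are made up.

AGENT_PROCESS = "av_agent"                      # hypothetical agent process name
ALLOWED_HOSTS = {"updates.vendor.example"}      # destinations the agent should use

def suspicious_connections(conn_log):
    """Return log entries where the agent contacts a host not on the allow-list."""
    return [
        entry for entry in conn_log
        if entry["process"] == AGENT_PROCESS
        and entry["remote_host"] not in ALLOWED_HOSTS
    ]

log = [
    {"process": "av_agent", "remote_host": "updates.vendor.example"},
    {"process": "browser",  "remote_host": "news.example"},
    {"process": "av_agent", "remote_host": "collector.unknown.example"},
]

for entry in suspicious_connections(log):
    print(f"{entry['process']} -> {entry['remote_host']}")
```

The logic is trivial on purpose: exfiltration at any meaningful scale produces outbound traffic, and outbound traffic from a known process to an unknown destination is exactly the sort of anomaly routine monitoring surfaces.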

It’s also possible that, rather than having a weaponized endpoint agent at its disposal, a hostile government has access to the product’s source code and knows where there may be technical vulnerabilities to exploit – either to sneak something in under the radar or to achieve remote code execution against the agent.  This is the much more likely scenario, given that it would be important for the Russian government to maintain plausible deniability – so any access to the product would come through an exploit, not outright collusion.

Should local governments pay attention to this?

If there is classified information that says, "hey, this is dangerous for the federal government," it's because someone looked at the likelihood of exploitation and the potential impacts to the federal government.  As far as I know, that doesn't extend to local governments, and no one has come out and said, "Hey, we're the FBI, and we suggest that nobody use this stuff."  They did not say that; they said, "This is no longer authorized for use in the federal government."

It’s also true that the penetration of the Kaspersky product into the local government market is spotty at best, so a potential attack – one that would be easy to identify – would offer a limited return on investment.  In other words, the threat actor does not have sufficient motivation; the cost-benefit calculation is upside-down.

The message here is that unless a risk-based approach – one that assesses likelihood and impact for your own environment – suggests that this capability could be turned against local governments, there's no need to clutch your pearls and go through the enormous expense of firing your AV vendor and standing up a replacement program.  No matter how good your procurement process is, that replacement will be hugely invasive and expensive.
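The risk-based approach above can be sketched in a few lines.  The standard framing scores risk as likelihood times impact; the numbers below are invented placeholders, not an actual assessment of any scenario.

```python
# Illustrative risk scoring: risk = likelihood x impact.
# All values below are invented for the sake of the example.

def risk_score(likelihood, impact):
    """Both inputs on a 0-5 scale; a higher product means higher risk."""
    return likelihood * impact

# Hypothetical local-government comparison: a weaponized-agent scenario
# (low likelihood, moderate impact) vs. the known cost of ripping out a
# working AV deployment and standing up a replacement program.
scenario_risk = risk_score(likelihood=1, impact=3)   # -> 3
replacement_disruption = 10                          # invented cost units

if scenario_risk < replacement_disruption:
    print("Risk does not justify the expense of replacement")
```

The point is not the arithmetic but the discipline: the decision to replace a vendor should fall out of a comparison like this, not out of a headline.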

Until there's a statement which says that's a reasonable thing to do, it's not a reasonable thing to do.

The caveat is this: if you haven't already purchased something, or you’re preparing to change products or vendors anyway, then it's prudent to consider the ramifications of any vendor based in a foreign country – particularly China, Russia, or Israel, or anywhere else you know the government has essentially unfettered access to source code, systems, and any hardware that moves in and out of the country.  There is a case to be made that, out of an abundance of caution, you might want to eliminate those kinds of vendors from your deliberations.

Lastly, the situation makes a good case for the return of the Common Criteria EAL-4 certification, which addresses controls on code curation, testing, manufacturing, and a number of other processes that result in a shrink-wrapped, highly trusted product.