Banks and other financial institutions utilizing artificial intelligence may be uniquely susceptible to retaliatory Russian cyberattacks as taxing international sanctions worsen, experts warn.
Those fears, highlighted in a recent Wall Street Journal report, come as Russia’s war on Ukraine trundles into its second month and as an unprecedented barrage of international sanctions continues to chip away at the Russian economy. Global financial institutions have played an integral role in the sanctions regime from the start, blocking money flows from certain Russian banks, denying them access to international markets, and even freezing the assets of President Vladimir Putin and prominent Russian oligarchs.
However, experts fear these same institutions’ growing reliance on machine-learning models to automate more and more of their systems in the name of efficiency could come back to bite them in the ass. Andrew Burt, a former policy adviser to the head of the cyber division at the FBI, described AI vulnerabilities as “significant and very widely overlooked” at many financial institutions that have come to rely on them. “It’s a huge unaccounted-for risk,” Burt said.
So why exactly are machine learning algorithms more susceptible to attack? In general, most of the problems stem from machine learning’s need to ingest large amounts of data to refine its predictions. That reliance makes these systems particularly vulnerable to data manipulation. In the past, researchers have shown that an attacker can deliberately “poison” an algorithm’s training data to corrupt or influence whatever results it spits out.
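To see why poisoned training data is so effective, here is a toy sketch of the idea. Everything below is invented for illustration (no real bank model is a one-feature nearest-centroid classifier): an attacker slips a batch of mislabeled points into the training set, and the learned decision boundary shifts enough to tank accuracy on clean data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two well-separated 1-D feature distributions -- think "legitimate" (0)
# vs. "suspicious" (1) transactions. Purely synthetic numbers.
x0, y0 = rng.normal(0.0, 1.0, 200), np.zeros(200, dtype=int)
x1, y1 = rng.normal(4.0, 1.0, 200), np.ones(200, dtype=int)

def train_and_score(xs, ys):
    """Fit a nearest-centroid classifier, then score it on clean test data."""
    c0, c1 = xs[ys == 0].mean(), xs[ys == 1].mean()
    t0 = rng.normal(0.0, 1.0, 1000)   # clean held-out class-0 samples
    t1 = rng.normal(4.0, 1.0, 1000)   # clean held-out class-1 samples
    correct = (np.abs(t0 - c0) < np.abs(t0 - c1)).sum() \
            + (np.abs(t1 - c1) < np.abs(t1 - c0)).sum()
    return correct / 2000

clean_acc = train_and_score(np.concatenate([x0, x1]),
                            np.concatenate([y0, y1]))

# The attack: 100 crafted points deep in class-1 territory are slipped
# into the training pipeline mislabeled as class 0. They drag the class-0
# centroid toward class 1, shifting the decision boundary.
bad_x = np.full(100, 8.0)
bad_y = np.zeros(100, dtype=int)
pois_acc = train_and_score(np.concatenate([x0, x1, bad_x]),
                           np.concatenate([y0, y1, bad_y]))

print(f"clean accuracy:    {clean_acc:.3f}")
print(f"poisoned accuracy: {pois_acc:.3f}")
```

The model code itself is never touched; only the data it learns from is tampered with, which is part of why such attacks are hard to spot with traditional software-security tooling.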
While issues of racial, gender, and other biases in AI algorithms stemming from limited data have become well known, some researchers fear bad actors targeting financial institutions could deploy large volumes of biased data to attack algorithms that try to suss out market sentiment. Think Russian disinformation memes, but applied to the financial sector.
Worse still, according to a 2020 Georgetown Center for Security and Emerging Technology report, machine learning vulnerabilities can’t be patched the same way as other software, meaning any successful attack could persist much longer.
“Lying dormant in those systems are vulnerabilities that are different from the traditional flaws with which we have decades of experience,” the report reads. “These vulnerabilities are pervasive and inexpensive to exploit using tools that have proliferated widely and against which there is often little defense.”
These algorithms can also be duped in real time without large sets of data. Researchers from Tencent’s Keen Security Lab, for example, demonstrated several relatively simple techniques for fooling Tesla’s machine-learning systems back in 2019, first tricking the windshield wipers into engaging when they weren’t supposed to, then using a bright sticker on the road to convince a Tesla running Autopilot to drift into an opposing lane.
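The sticker trick is an “evasion” or adversarial-example attack: a tiny, targeted nudge to the input flips the model’s output. A minimal sketch of the same idea, using a hand-written linear scorer with made-up weights (nothing like Tesla’s actual vision stack), looks like this:

```python
import numpy as np

# Toy linear model: score = w @ x + b, prediction 1 if score > 0.
# All weights and inputs are invented for the demonstration.
w = np.array([1.5, -2.0, 0.7])
b = -0.2

def predict(x):
    return int(w @ x + b > 0)

x = np.array([0.1, 0.3, 0.2])   # benign input -- classified as 0

# FGSM-style perturbation: nudge every feature a small step in the
# direction that most increases the score. For a linear model that
# direction is just the sign of each weight.
eps = 0.15
x_adv = x + eps * np.sign(w)    # small change, flipped prediction

print(predict(x), predict(x_adv))
```

Because the attacker only needs to find a direction that moves the score across the decision boundary, the perturbation can be far smaller than anything a human reviewer would flag, which is what makes these attacks hard to defend against at inference time.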
“I haven’t seen any real abilities in terms of being able to defend against the flood of disinformation,” Montreal AI Ethics Institute Founder Abhishek Gupta told The Journal. Gupta went on to describe machine learning security as a “novel” field filled with unknowns. “When you introduce machine learning into any kind of software infrastructure, it opens up new attack surfaces, new modalities for how a system’s behavior might be corrupted.”
While those security weaknesses are cause for concern even in the best of times, government leaders including President Biden worry Russia may use cyberattacks to lash out against these institutions as sanctions take a continued toll. In a statement released earlier this week, Biden advised private companies in the U.S. to bolster their security practices, citing “evolving intelligence that the Russian Government is exploring options for potential cyberattacks.” Global banks, meanwhile, have reportedly stepped up network monitoring and drills for potential cyberattack scenarios in the weeks since the Russian military began its invasion, Reuters notes.