NewsWorld
AI-powered predictive news aggregation · © 2026 NewsWorld. All rights reserved.

Deepfake attack: 'Many people could have been cheated'

BBC World · Mar 2, 2026 · Collected from RSS

Summary

The boss of the Bombay Stock Exchange was recently targeted in what is a growing global problem.

Full Article

By Gideon Long and Ed Butler

[Image caption: Sundararaman Ramamurthy says it is impossible to know how many people saw the fake video]

At the start of this year, a video popped up on social media sites in India showing the chief executive of the Bombay Stock Exchange, Sundararaman Ramamurthy, giving investors advice on which stocks to buy. Viewers were promised handsome returns if they heeded his advice.

The only problem was, it was not Ramamurthy speaking. It was a deepfake video of him, made using artificial intelligence.

"It was in the public domain where many people could see it, and get cheated into buying or selling stocks, as if I'd recommended them," explains Ramamurthy. "When we see an incident like this, we immediately lodge a complaint. We go to Instagram and other places where it's posted to get the video taken down. And we regularly write to the market warning people not to believe in fake videos."

Ramamurthy adds: "We don't know how many people have seen this video, it's really difficult to find out, so we can't really judge if it's had a big impact or not. What we want is for it to have had no impact at all. No one should incur a loss because they believe something that is untrue."

Ramamurthy and the Bombay Stock Exchange are not alone.

"The latest data shows that over the past two years or so, we've seen an increase of almost 3,000% in the number of deepfakes being utilized," says Karim Toubba, the chief executive of US-based password security company LastPass.

Toubba himself was deepfaked in 2024. "One of our employees in Europe received an audio message and a text message from someone alleging to be me, urgently requesting some help from me," he says.

Fortunately for Toubba - and LastPass - the employee was suspicious. "The message was on WhatsApp, which for us is not a sanctioned communication channel," says Toubba. "Also, we have corporate-sanctioned mobile devices and this came in via his personal phone. So that made him think this was potentially a little murky, a little fishy."

The employee reported the incident to LastPass's cyber-security team and no harm was done.

[Image caption: It is not known how many people were affected by the attack on the boss of the Bombay Stock Exchange]

British engineering firm Arup was not so lucky. In 2024 it was hit by one of the most sophisticated deepfake attacks ever seen in the corporate world.

According to Hong Kong police, an Arup employee working there received a message purporting to come from the firm's chief financial officer (CFO), who was based in London, regarding a "confidential transaction". The employee got onto a video call with the CFO and other staff. On the basis of that call, the employee then transferred $25m (£18.5m) of Arup money to five different bank accounts, as instructed. It only later emerged that the people on the call, including the CFO, were deepfakes.

"You would never want to simply jump on a video call with someone and transfer $25m," says Stephanie Hare, a tech researcher and co-presenter of the BBC's AI Decoded TV programme. "Companies are having to take extra steps to secure these types of communications. That's the brave new world we're in now."

The rapid evolution of AI means that these videos are becoming more lifelike all the time.

"Deepfakes are becoming very, very easy to do," says Matt Lovell, co-founder and CEO of UK-based cyber-security company CloudGuard. "To generate video and audio quality of extremely accurate specifications - it takes minutes."

It is also becoming cheaper. "For, say, a simple, single individual-led attack, you're looking at $500 to $1,000 with the use of largely free tools," says Lovell. "For a more sophisticated attack, you're looking at between $5,000 and $10,000."

While deepfake videos are becoming more sophisticated, so are the tools used to thwart them. Companies can now use verification software that assesses a person's facial expressions, the way they turn their head, and even the way blood flows through their face to establish whether it really is them or a deepfake version of them.

"In your cheeks or just underneath your eyelids, we'll be looking for changes in blood flow when a person is talking or presenting," Lovell says. "That's really where we can tease out whether it's AI-generated or it's real."

[Image caption: AI is allowing cyber criminals to make deepfake videos far more easily]

But firms are in a constant battle to stay one step ahead of the fraudsters.

"It's a race, between who can deploy a technology and who can thwart that technology as quickly as possible," says LastPass's Toubba. "Luckily, there seems to be quite a bit of money flowing into this, which will only accelerate the pace with which organisations will develop technologies to detect and ultimately block these things."

At CloudGuard, Lovell is more downbeat. "Attack vectors are accelerating faster than we can accelerate defence automation and protection," he says. "Are people moving fast enough to respond to the speed the threat is developing? Absolutely not."

Hare says the proliferation of deepfake attacks means that people with the skills to combat fraudsters are in high demand. "We have a shortage of cybersecurity professionals worldwide. We need more people to get into this."

And she says companies are waking up to the threat, albeit slowly. "In the past it was not considered a priority to secure your operations in quite the same way as it is now," she points out.

"Now that we have these types of risks, with the leaders at companies, with CEOs, being deepfaked, I think company executives will be spending more time with their chief information security officers and teams than before. And that is a good thing."



Read Original at BBC World

Related Articles

BBC World · about 3 hours ago
France to boost nuclear arsenal and extend deterrence to European allies

Emmanuel Macron said eight countries could enjoy protection from France's nuclear umbrella - but that Paris would retain sole decision-making power.

BBC World · about 4 hours ago
Israel strikes Lebanon after Hezbollah rocket fire as Iran conflict widens

Lebanon's health ministry says 31 people were killed by Israeli strikes, while there are no reports of casualties in Israel.

BBC World · about 4 hours ago
Saturday Night Live criticised for 'hurtful' Tourette's sketch

The condition is "not a joke", the Tourette Action charity says, as the Baftas fallout continues.

BBC World · about 5 hours ago
At least 169 people killed in South Sudan 'surprise' attack

Peacekeepers are sheltering about 1,000 civilians near their base and providing emergency care.

BBC World · about 5 hours ago
India and Canada reset ties with 'landmark' nuclear energy deal

A host of deals, including one to supply India with uranium, is unveiled as Mark Carney meets Narendra Modi in Delhi.

BBC World · about 6 hours ago
Hegseth on Iran attacks: 'This is not Iraq, this is not endless'

US Secretary of Defense Pete Hegseth offered few details about the operation, leaving questions unanswered about the scope or duration.