Troop movements exposed by civilians: Danish investigator gives an insight into OSINT

Illustration: rvlsoft/Bigstock

A significant part of the escalating conflict in Ukraine is playing out online as information warfare, both in the spreading of disinformation and in the documentation of what is actually happening in Ukraine and Russia.

OSINT (open-source intelligence), i.e. the collection and analysis of data gathered from open sources, plays a central role in this.

In recent weeks, private OSINT investigators, journalists, and NGOs have quite accurately documented the movements of Russian troops in Ukraine.

This is mainly due to the extensive access to satellite images from NASA and ESA as well as private satellite service providers such as Maxar Technologies.

“That the civilian population has access to satellite data that was previously reserved for state intelligence services is completely new and quite crazy. In a relatively short time, methods have been developed to decode this new type of data and link it to other forms of public data,” says Aslak Ransby, an IT security consultant with an interest in OSINT.

Alongside the ESA and NASA data, images from private satellite providers such as Maxar Technologies and Capella Space have been shared tirelessly.

In mid-December, sources in the U.S. government said they had identified 55 groups, totalling 800 Russian troops, near the Ukrainian border. A private OSINT investigator had identified 48 of the 55 without access to classified material.

Aslak Ransby, IT security consultant and OSINT investigator. Illustration: Privat

The priority is debunking on social media

Social media platforms such as TikTok, YouTube, Twitter, and Telegram are full of videos and pictures that purportedly show scenes from the war. The OSINT community is trying to identify false or manipulated material.

“When things are moving fast, as is the case in Ukraine right now, it takes a long time to verify information, for example photos and videos. But it does not necessarily take long to debunk information, that is, to expose it as false or manipulated,” Aslak Ransby says.

While it can be difficult to prove that a photo actually shows what it claims to, it is easier to prove errors, for example that the metadata does not match the claim about where and when the photo was taken.
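The core of such a metadata check can be sketched in a few lines. The field names below follow common EXIF tags (DateTimeOriginal, GPS coordinates in decimal degrees), but the claim data, the tolerance, and the `debunk_by_metadata` helper are hypothetical illustrations, not any specific OSINT tool's API.

```python
from datetime import datetime

def debunk_by_metadata(claim, exif):
    """Return a list of contradictions between a post's claim and a photo's EXIF data.

    claim: dict with an ISO date plus lat/lon of the claimed location (assumed format).
    exif:  dict of already-extracted EXIF tags (assumed format).
    """
    problems = []

    # EXIF timestamps use the "YYYY:MM:DD HH:MM:SS" format.
    taken = datetime.strptime(exif["DateTimeOriginal"], "%Y:%m:%d %H:%M:%S")
    claimed = datetime.fromisoformat(claim["date"])
    if taken.date() != claimed.date():
        problems.append(f"photo taken {taken.date()}, claim says {claimed.date()}")

    # Rough location check: ~0.5 degree tolerance, an arbitrary illustrative value.
    if abs(exif["GPSLatitude"] - claim["lat"]) > 0.5 or \
       abs(exif["GPSLongitude"] - claim["lon"]) > 0.5:
        problems.append("GPS coordinates do not match the claimed location")

    return problems

# A fabricated example: a post claims the photo is from Kyiv on 24 Feb 2022,
# but the embedded metadata points to a different place and to 2014.
claim = {"date": "2022-02-24", "lat": 50.45, "lon": 30.52}
exif = {"DateTimeOriginal": "2014:07:17 14:02:11",
        "GPSLatitude": 48.14, "GPSLongitude": 38.64}

print(debunk_by_metadata(claim, exif))  # flags both the date and the location mismatch
```

In practice metadata is only a first filter: it can quickly disprove a claim, but its absence or even its agreement proves little, since EXIF data is easy to strip or forge.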

“Right now, the OSINT community is spending a lot of time debunking viral posts on the Internet. We have seen examples in which situations are portrayed as assaults on Russians but the reality is actually the opposite. This is about having technical knowledge of, for example, vehicles and equipment,” Aslak Ransby says.

He points out that the Bellingcat team has gradually built up significant expertise in identifying different types of ammunition and weapons, and that the profile @CalibreObscura has likewise gained an international reputation for exposing disinformation based on military equipment.

“It’s not enough to be able to find information. You also need technical knowledge to be able to identify errors and manipulation,” Aslak Ransby says.

A few days before the 2017 French presidential election, a huge amount of information that allegedly compromised Emmanuel Macron appeared online.

It amounted to many gigabytes of data, and there were only a few days to verify it. But a quick investigation showed that the material had been edited in a Russian-language version of Excel and contained clear signs of manipulation, which meant the documents could initially be written off as false.

Dutch OSINT expert Henk van Ess was the one who identified the Russian involvement in what became known as #MacronLeaks.

Often easy to expose

In the weeks leading up to the Russian invasion, several so-called false flags, planted on social media by pro-Russian accounts, were exposed. In some cases, the material was relatively easy to debunk, i.e. expose as false.

“The challenge is that a disinformation war, like the one we're seeing in Ukraine right now, is not about withstanding fact checks from a group of nerds on Twitter and Reddit but about convincing the population on social media. That is also why we see Russians using false evidence on social media even when it is pretty poorly made: the people they are targeting are usually not able to tell that it is a case of manipulation,” Aslak Ransby says.

Social media are fumbling

It is too early to assess how the raging disinformation war is going to unfold.

“In many ways, it seems like the Russians are following the same script as when they attacked Georgia in 2008. But we're facing what looks like a conventional war in Europe, in which cyber capabilities are likely to play a big role. We haven't been in that situation before, so many are waiting to see what will happen,” Aslak Ransby says.

He notes that the major social media platforms such as Twitter, YouTube, and TikTok do not yet seem to have found the right balance between blocking and removing misinformation without censoring OSINT sources in Ukraine.

“We have seen examples of Ukrainian sources being censored solely because they had been reported by other users on the platforms,” Aslak Ransby says.

On Thursday, a number of Twitter accounts belonging to people performing OSINT in Ukraine were suspended for sharing photos and videos of Russian tanks and helicopters near the Ukrainian border. The suspended accounts have now been reinstated after Twitter acknowledged that it was a mistake to close them, The Verge writes.

“We’ve been proactively monitoring for emerging narratives that are violative of our policies, and, in this instance, we took enforcement action on a number of accounts in error. We’re expeditiously reviewing these actions and have already proactively reinstated access to a number of affected accounts,” Twitter spokesperson Elizabeth Busby said in a statement.

Elizabeth Busby elaborated to The Verge that the suspended accounts were believed to have violated Twitter’s synthetic and manipulated media policy, which deals with the sharing of misinformation on the platform.