Israeli researchers at Ben-Gurion University have identified a vulnerability in the way social media platforms preview content for posted links. The vulnerability poses risks for brands and individuals, and could be another arrow in the quiver for organizations seeking to wage influence operations or commit ad fraud. Here we summarize the three biggest risks, and how to address them.
Summary of findings
The study's authors are Aviad Elyashar, Sagi Uziel, Abigail Paradise, and Rami Puzis of the Telekom Innovation Laboratories and the Department of Software and Information Systems Engineering at Ben-Gurion University. The vulnerability lies in the way social platforms display preview content for posted links. When a link is posted, the platform pulls a preview image and text to summarize the content for other users. Facebook and LinkedIn use the OpenGraph protocol, for example, while Twitter uses Twitter Cards. The researchers found that malicious actors can craft their sites to display one message, then change the content or use link redirects and ask the platforms to refresh the preview. Site owners can mimic an audience's interests (like a chameleon changing colors), then effect a bait-and-switch.
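Concretely, a platform's crawler builds the preview by reading `<meta property="og:...">` tags from the page's HTML. The sketch below (illustrative only, not the researchers' code) uses Python's standard-library HTML parser to show how the same URL can yield a completely different preview once the site owner swaps the tags and a refresh is requested:

```python
from html.parser import HTMLParser

class OpenGraphParser(HTMLParser):
    """Collects OpenGraph <meta property="og:..."> tags from an HTML page."""
    def __init__(self):
        super().__init__()
        self.og = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        prop = attrs.get("property", "")
        if prop.startswith("og:") and "content" in attrs:
            self.og[prop] = attrs["content"]

def extract_preview(html: str) -> dict:
    """Return the OpenGraph fields a crawler would use for the link preview."""
    parser = OpenGraphParser()
    parser.feed(html)
    return parser.og

# The same URL can serve different markup before and after a preview refresh:
before = '<head><meta property="og:title" content="Chelsea FC fan page"></head>'
after = '<head><meta property="og:title" content="Arsenal FC fan page"></head>'

print(extract_preview(before))  # {'og:title': 'Chelsea FC fan page'}
print(extract_preview(after))   # {'og:title': 'Arsenal FC fan page'}
```

Because the preview is cached from whatever the crawler saw last, nothing ties the displayed card to the content a user originally Liked.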
Below is an example from the study, in which the researchers posted links, shown from left to right on Facebook, Twitter, and LinkedIn, from a site they created purporting to support the English Premier League team Chelsea Football Club. They then changed the site content and requested a refresh; the same links now appear to support London rivals Arsenal Football Club.
|Courtesy: Aviad Elyashar, et al., The Chameleon Attack: Manipulating Content Display in Online Social Media|
This may seem rather tame, as far as social media risks go. But there are, of course, ways in which a seemingly innocuous vulnerability can be exploited at scale. Here are some risks to watch out for:
1. Reputation Damage
Bad actors are typically quite patient when it comes to social media scams. Nation states and criminal organizations use social media just as much to study a target audience as they do to exploit target individuals. This technique could be used to lure high-profile individuals or brand accounts into "Liking" content similar to their existing interests. Many people like and share links without clicking through or even reading the linked content. After building up a series of Likes, bad actors could switch the content to something far more controversial in order to shame the target. For example, a high-level banking executive could be lured into "Liking" content related to macro-economic analysis for months before the site owners flip the content to posts in defense of Jeffrey Epstein.
What to do about it: As ever, be careful what you interact with. If you don't know the site or have never visited it before, remain cautious. Brand marketers should instruct their teams or agencies to only interact with content from a list of approved sites.
2. Theft of Social Capital
Owners of dubious sites can use this technique to accrue social capital under one premise, only to switch content and effectively harvest that social capital for a new cause. This switcheroo could take on geopolitical dimensions: a site could be designed to gain clicks for a cause on the left, only to switch to the right, boosting the new content by gaming engagement algorithms. On the commercial side, the technique could be used to boost a site in order to commit ad fraud. Lastly, the site content could simply be changed to launch phishing attacks. For example, a news site that accrues social capital could switch to a catchy sweepstakes, jamming feeds to get victims to fill out forms for a vacation. Social capital, when enforced by algorithms, is real, and it can be stolen.
What to do about it: The advice for individuals is the same: be wary of what you "Like." But the social platforms will need to make some technical fixes to prevent this type of manipulation. Some, like Facebook, have it easier than others, like LinkedIn. The researchers provide clear next steps.
3. Influence Operations
Social capital can boost content. Imagine thinking that you're liking something that hews close to your political beliefs, only to discover months later that you've "Liked" content from the opposite end of the political spectrum. This technique can also be used to maintain fleets of accounts that look like real, engaged users, thus evading detection and takedown. We have seen, in our own research, Russian bot accounts flip their identities from Spanish-speaking Venezuelans who support Maduro to right-wing rural German voters supporting the AfD.
What to do about it: This falls squarely on the shoulders of the platforms to make the necessary technical changes to prevent preview refreshes.
If you'd like to learn more: I highly recommend reading the original working paper: https://arxiv.org/abs/2001.05668
If you're short on time, or want to hear from the researchers themselves, you can catch an interview with the lead author, who spoke to the Cyber Wire's Research Saturday podcast.
Tags: #Social Media Security
November 14, 2020