Algorithms can sway people when making online dating decisions



The decisions we make can be affected by AI algorithms (Image: Dimitri Otis/Getty Images)

Artificial intelligence-based algorithms can influence people to prefer one political candidate – or a would-be partner – over another, according to researchers.

“We are worried that everyone is using recommendation algorithms all the time, but there was no information on how effective those recommendation algorithms are,” says Helena Matute at the University of Deusto in Spain.


Her work with her colleague Ujué Agudo, also at the University of Deusto, was designed to investigate the issue.

The researchers carried out a series of four experiments in which participants were told they were interacting with an algorithm that would judge their personality. The ‘algorithm’ did not actually do this: it was a mock algorithm that responded in the same way regardless of the information participants gave it. After participants had answered the mock algorithm’s questions, it presented them with photos of potential partners they might date or political leaders they might vote for – although the ‘politicians’ were simply photographs of people unfamiliar to the participants.

Sometimes the researchers’ mock algorithm explicitly nudged users to choose one of the photographed individuals. It might state, for instance, that it had discovered a 90 per cent compatibility match between the user and the potential partner or politician in the photo. In other cases, the cue was implicit: the algorithm might simply show the user one particular photo more often.
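
The two nudge conditions are simple enough to sketch in code. The Python below is purely illustrative – the study does not publish an implementation, and the photo names, the scores given to non-target photos and the repeat count are assumptions used only to show the difference between an explicit compatibility score and an implicit exposure bias; the 90 per cent figure comes from the example in the article.

```python
import random

# Illustrative sketch only: not the researchers' code. Photo names, the
# 40-60 range for non-target scores and the repeat count are assumptions.

PHOTOS = [f"candidate_{i}.jpg" for i in range(8)]  # hypothetical photo set


def explicit_nudge(target_photo: str) -> dict:
    """Explicit condition: attach a fabricated compatibility score to each photo,
    with the target photo given a standout 90 per cent match."""
    return {
        photo: (90 if photo == target_photo else random.randint(40, 60))
        for photo in PHOTOS
    }


def implicit_nudge(target_photo: str, repeats: int = 3) -> list:
    """Implicit condition: no score is shown, but the target photo simply
    appears more often in the sequence presented to the participant."""
    sequence = PHOTOS + [target_photo] * repeats
    random.shuffle(sequence)
    return sequence


if __name__ == "__main__":
    target = PHOTOS[0]
    print(explicit_nudge(target))   # one photo carries a '90% match' label
    print(implicit_nudge(target))   # one photo shows up more often than the rest
```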

Study participants were drawn from Spanish-language Twitter and online survey platform Prolific. Between 218 and 441 people took part, depending on the experiment.

Individuals were asked which photographed people they preferred. They were more likely to prefer political candidates presented explicitly to them by the mock algorithm, and more likely to want to date those who were implicitly selected.

“Maybe we have the idea that algorithms are objective and neutral and efficient, and with numbers and rules,” says Agudo, regarding why we prefer explicit algorithmic recommendations for politicians. “It’s a decision where feelings aren’t involved.”

For that reason, we may be inclined to question algorithmic recommendations more when it comes to matters of the heart.

“The authors raise the really important and frightening point that artificial intelligence, big data and broad user bases give unprecedented opportunities to private corporations for refining their understanding and application of the powers of persuasion,” says Ella McPherson at the University of Cambridge.

“This study strengthens calls for platforms like Facebook and Google to be more transparent about their own algorithms,” says Steven Buckley at the University of the West of England, Bristol, in the UK. “If not to the general public, then at least to academics who can research what the algorithms we actually engage with on a daily basis are doing to us.”

Journal reference: PLoS One, DOI: 10.1371/journal.pone.0249454
