Major tech companies use ‘dark patterns’ to influence users

A new piece of bipartisan legislation in the United States aims to protect people from one of the sketchiest practices that tech companies employ to subtly influence user behavior. Known as ‘dark patterns’, this dodgy design strategy often pushes users toward giving up their privacy unwittingly and allowing a company deeper access to their personal data.

With more scrutiny than ever on the tech industry, the US legislation is based on evidence from a study by a Norwegian watchdog group, which found that companies like Facebook and Google push users toward choices that harm their privacy.

The study details how these companies create an illusion of user control over personal data while simultaneously nudging users toward choices that limit that control.

By examining a set of privacy popups rolled out in May by Facebook, Google, and Microsoft, the researchers found that the first two in particular feature ‘dark patterns’: interface-design techniques meant to manipulate users or nudge them toward privacy-intrusive options.

Dark patterns are small, subtle, yet effective ways of guiding people toward the outcome the designers prefer. In Facebook’s and Google’s privacy-settings flows, for instance, the more private options are simply disabled by default, and users not paying close attention will never know there was a choice to begin with. You are always opting out of things, never in. Enabling these options is also a considerably lengthy process.

When a user chooses a privacy-enhancing option on Facebook, such as disabling face recognition, they are shown a tailored set of consequences: “we won’t be able to use this technology if a stranger uses your photo to impersonate you,” for instance, wording meant to scare the user into keeping the feature enabled. Nothing is said about what you are opting into by doing so, such as how your likeness could be used in ad targeting or automatically matched to photos taken by others.

Disable ad targeting on Google, meanwhile, and you are warned that you will not be able to mute some ads going forward. People who do not understand the muting mechanism being referred to are left to fear the possibility: what if an ad pops up at work or during a show and I cannot mute it? So they agree to share their data.

In this way, users are punished for choosing privacy over sharing, and are only ever presented with a carefully curated set of pros and cons designed to cue them to decide in favor of sharing. “You’re in control,” the user is constantly told, even though those controls are deliberately designed to undermine what control you do have in the first place.
