The New York Times reported on how the company manipulated drivers into working longer hours for lower wages while continually chasing the next ride or task.
It vividly described how Uber and other companies use tactics developed by the video game industry to keep drivers on the road when they would prefer to call it a day, raising company revenue while lowering drivers’ per-hour earnings.
With the help of hundreds of social scientists and data scientists, Uber has reportedly "gamified" its service, using elements straight out of smartphone hits like Candy Crush and Farmville to offer drivers rewards and keep them motivated.
For example, when a driver is about to log off for the day, Uber will send them an app alert saying they’re only a few dollars away from a particular financial target. Even though drivers can decide when they want to work, Uber’s techniques are potentially problematic because they can manipulate drivers into working longer hours or in undesirable locations with no guarantee of a higher income.
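The alert described above is a simple threshold trigger: fire a message when the driver's running total sits just below a round earnings target. The following Python sketch is purely illustrative; the function names, the $50 target steps, and the $15 proximity window are assumptions, not anything Uber has disclosed.

```python
# Hypothetical sketch of a threshold-based earnings nudge.
# TARGET_STEP and NEAR are made-up parameters for illustration.

NEAR = 15.0          # how close (in dollars) counts as "almost there"
TARGET_STEP = 50.0   # round-number targets: $50, $100, $150, ...

def next_target(earnings: float) -> float:
    """Smallest round target strictly above the current earnings."""
    return (int(earnings // TARGET_STEP) + 1) * TARGET_STEP

def nudge(earnings: float):
    """Return an alert string if the driver is within NEAR dollars
    of the next target, otherwise None (no alert is sent)."""
    target = next_target(earnings)
    gap = target - earnings
    if gap <= NEAR:
        return f"You're only ${gap:.0f} away from ${target:.0f}!"
    return None  # gap too large: the message would lose its pull
```

With these assumed parameters, a driver at $187 would see "You're only $13 away from $200!", while a driver at $120 would see nothing, since a $30 gap falls outside the window.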
Radio Sputnik discussed the issue with Min Kyung Lee, a research scientist in human-computer interaction at Carnegie Mellon University's Center for Machine Learning and Health, and Daniel Kusbit, a research fellow at Carnegie Mellon University, who described it as "behavioral economics" in action and explained exactly what is going on.
"Regarding the recent article in The New York Times on the tactics to motivate drivers to work longer for Uber, it's nothing new. Behavioral economics has influenced every interface that we see online around us: Amazon is designed to make us buy more products, Facebook is designed to make us share more news, have more friends online and upload more photos. These tactics themselves are used virtually everywhere, in all the digital products around us," Min Kyung Lee told Sputnik.
Lee went more in-depth, describing the mechanics of behavioral science. This branch of science concerns the psychological, social, cognitive, and emotional factors involved in the economic decision-making of individuals and institutions, and the consequences for market prices, returns, and resource allocation.
The scientist stressed what are called "dominated alternatives": introducing a third decoy option can make a purchaser more likely to choose the option the seller secretly wants them to choose. The principle works by presenting two similar options that invite comparison, one easily identified as high quality and one that anyone can see is of poor quality, plus a third option from a slightly different category.
In the background, the website or app owner uses the interface to influence the buyer's subconscious. The interface, she said, will never overtly tell a customer to choose a particular option, but instead shows, for example, one good apple, one bad apple and one pear. The good apple clearly looks better than the bad apple, which provides a buying context: the buyer recognizes that the apple is a good one and is more likely to choose it over the pear. The same decision-making matrix is used when you shop for laptops, cosmetics and just about anything else.
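The apple-and-pear example can be made concrete. In the sketch below, options are scored on two attributes (quality and price); one option "dominates" another if it is at least as good on both and strictly better on one. The item names and scores are invented for illustration.

```python
# Illustrative sketch of a "dominated alternative" (decoy) choice set.
# All names and attribute values are made up for this example.

from dataclasses import dataclass

@dataclass(frozen=True)
class Option:
    name: str
    quality: int   # higher is better
    price: float   # lower is better

def dominates(a: Option, b: Option) -> bool:
    """True if a is at least as good as b on both attributes
    and strictly better on at least one."""
    return (a.quality >= b.quality and a.price <= b.price
            and (a.quality > b.quality or a.price < b.price))

good_apple = Option("good apple", quality=9, price=1.00)
bad_apple  = Option("bad apple",  quality=3, price=1.00)  # the decoy
pear       = Option("pear",       quality=8, price=0.90)
```

Here the good apple dominates the bad apple, handing the buyer an easy, flattering comparison, while neither apple dominates the pear: the pear is cheaper but the good apple scores higher on quality. That asymmetry is what steers attention toward the good apple.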
Speaking about the legal and ethical sides of such technological manipulation, Daniel Kusbit noted that the law is "always way behind on this type of thing." Whether this manipulation is ethical, he said, really depends on whether or not it is becoming the new norm of full-time employment.
"As we see, Uber is filling in the gaps for people who are struggling to find regular employment in the changing economy. If it is just something that is being done short-term, for earning some extra cash on the side, it does not carry the same weight it does if this is a new norm for earning a living," he told Sputnik.
In terms of research ethics, he said, it also comes down to informed consent. If a person is being manipulated in some way, they should be informed upfront of the risks and of the methods by which they will be subjected to these manipulations. This all ties back to transparency, the scientist said.
"The more transparent the service is, the more the drivers can react to it in a way that is fair. If they don't want to go to a 'hard zone,' they can go in the opposite direction if they want to be away from it," he concluded.