Uber CEO Travis Kalanick addresses a gathering at an event in New Delhi, India, on December 16, 2016.
When a technology company decides to block a person from using its service, it’s usually obvious to that person. There may be an email notifying the user that their account has been suspended. Or, you know, you try to log in and you can’t.
With Uber, it isn’t so straightforward.
The ridesharing giant created a mirror of its own app—a map view intended to give some users the impression that they could hail a ride, even showing phantom cars moving across the screen—when no car was ever going to show up.
Uber confirmed to The Atlantic the existence of the program, which was first reported by Mike Isaac of The New York Times, but emphasized in a statement that the fake version of the app was meant to protect drivers—not to mislead local investigators in states where Uber’s arrival had generated controversy, as the Times reported. The program launched internally as Greyball and was later renamed VTOS, short for Violations of Terms of Service—a reflection of Uber’s justification for creating it in the first place.
“This program denies ride requests to fraudulent users who are violating our terms of service,” an Uber spokesperson said in a statement, “whether that’s people aiming to physically harm drivers, competitors looking to disrupt our operations, or opponents who collude with officials on secret ‘stings’ meant to entrap drivers.”
One such sting involved a code enforcement inspector in Portland, Ore., named Erich England, who posed as a rider and attempted to hail an Uber as part of an operation in 2014, when Uber launched its service in Portland without seeking approval from the city. As the Times reported: “But unknown to Mr. England and other authorities, some of the digital cars they saw in the app did not represent actual vehicles. And the Uber drivers they were able to hail also quickly canceled. That was because Uber had tagged Mr. England and his colleagues—essentially Greyballing them as city officials—based on data collected from the app and in other ways. The company then served up a fake version of the app populated with ghost cars, to evade capture.”
There’s a clear question of legality here, which Uber declined to discuss on the record. The attorneys general in five states—California, Hawaii, Massachusetts, Oregon, and Texas—either declined to speak on the record about whether they were investigating Uber over the program, or did not immediately respond to a request for comment on Friday afternoon.
But there’s something else at stake here, too. Uber’s ghost app raises a pressing question for anyone who lives or works, at least part of the time, on the internet—and, in the United States, that’s nearly everyone: What does the right to refuse service look like in digital environments?
At least some of the time, as Uber proves, it’s invisible. And the implications of this invisibility are troubling.
It’s only natural that Uber would be the company to push this question forward, and not just because of its astonishing streak of recent controversies. Uber has dramatically reshaped the way people think about the integration of digital and physical worlds, and the possibilities for using digital interfaces to make something happen at the street level. Today, people fully expect to be able to touch a button on their phone and have a car pull up to the curb where they’re standing; a few short years ago, the concept seemed magical. This shift in expectation happened so quickly that what it means for other cultural norms and laws—like the right to refuse service—is still uncertain.
In brick-and-mortar environments, being refused service happens face to face—meaning, the person being turned away from a bar, for instance, knows they’re being denied entry, even if they don’t always agree with the rationale. But what happens when there’s an elaborate facade designed to keep a person from knowing that they were never allowed in?
In Uber’s view, this strategy offers an added layer of protection to drivers, guarding not only against potential harm but also against the retaliation that might follow once someone discovers they have been blocked. In principle, this makes sense—it builds in a layer of plausible deniability that de-escalates tension. As a disruptive force in an industry long dominated by taxis, Uber has met huge resistance in some markets, and the outcome has been messy, to say the least.
Such secrecy, however, also shields Uber from any public scrutiny over who gets denied its service and why. “It’s critically important for people to know they’re being refused service,” Ethan Zuckerman, the director of the Center for Civic Media at MIT, told me. “It’s what allows people to file claims for discrimination, allows them to gather evidence and demonstrate that a class of people is being excluded.”
“Greyballing police may primarily raise the concern that Uber is obstructing justice,” Zuckerman added, “but Greyballing for other reasons—a bias against Muslims, for instance—would be illegal and discriminatory, and it would be very difficult to make the case it was going on.”
The history of race-based discrimination in the United States is full of examples of this very dynamic playing out in non-digital environments, like the housing market. Even after the passage of the Fair Housing Act of 1968, real estate companies would tell black renters that they had no units available—then later rent those units to white people. President Donald Trump was sued for this practice in the 1970s, and ultimately reached an agreement with the Justice Department, but still denies having done anything wrong, as The Washington Post reported last year.
Just like prospective renters being told a neighborhood was full, “those discriminated against would simply have a poor experience with Uber and move on to another service,” Zuckerman said. “Uber likes to make the case that it can innovate because it’s largely unregulated, but it can also discriminate for the same reasons.”
In digital environments, as in physical ones, the refusal of service—and any tactics used to refuse it, however secretive—should prompt observers to ask: Who is actually being protected here? And from what?