The Black Market for Black People’s Phone Data

In a recent New York Times article, Geoffrey Starks, a commissioner at the Federal Communications Commission, made a thought-provoking point about wireless phone carriers’ rampant selling of customer data to shady third parties. “The misuse of this data is downright dangerous,” Starks wrote. “The harms fall disproportionately upon people of color.”

Starks, an African American and a lifelong Democrat, was once a staff member at the Illinois state Senate (overlapping with then-senator Barack Obama). He described a chilling arrangement in which smartphone service providers sell our location data to companies we’ve never heard of, perpetuating surveillance-based digital discrimination against Black and Brown communities.

Starks’ article is entitled “Why It’s So Easy for a Bounty Hunter to Find You.” And bounty hunters do still exist, though these days they prefer to be called “bail enforcement agents,” doing the dirty work for an unfair bail system with a proven racial bias.

Vice discovered that, for just $300 cash, a bounty hunter can locate pretty much any smartphone user who has AT&T, Sprint, or T-Mobile. (Police can also do this via sanctioned methods, but are known to abuse the privilege.)

These mobile phone tracking services are also sold to car leasing operations, loan providers, and anyone else who wants to secretly keep tabs on people. In some cases, a random stalker could access your phone’s location. ZDNet found a bug in the “try before you buy” feature of a company called LocationSmart that would give up the current location of any phone number requested, one time, for free. That free demo has since been taken offline, but it shows how cell service carriers are hawking our personal data to a network of seedy, below-the-radar tech firms with little regard for ethics or privacy safeguards.
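ZDNet didn’t publish LocationSmart’s code, so the sketch below is purely hypothetical, but the class of flaw is simple enough to illustrate in a few lines of Python: a “free demo” lookup that never verifies the requester’s identity or the subscriber’s consent will hand coordinates to whoever supplies a phone number. Every name, number, and coordinate here is invented for illustration.

    # Hypothetical sketch of the flaw class behind a "try before you buy"
    # location demo. Nothing here is LocationSmart's actual code or API.

    CARRIER_LOCATIONS = {
        # data an aggregator can pull from carriers, keyed by phone number
        "+15551234567": (37.7749, -122.4194),  # made-up subscriber
    }

    def locate_demo(phone_number: str):
        """The broken version: returns a location for ANY number supplied.

        There is no check that the requester owns the number and no check
        that the subscriber consented -- a stalker needs only the number.
        """
        return CARRIER_LOCATIONS.get(phone_number)

    def locate_with_consent(phone_number: str, verified_consent: bool):
        """The bare minimum a responsible lookup would require: an explicit,
        verified opt-in from the subscriber before releasing coordinates."""
        if not verified_consent:
            raise PermissionError("subscriber has not consented to disclosure")
        return CARRIER_LOCATIONS.get(phone_number)

    if __name__ == "__main__":
        # The demo bug in miniature: anyone who knows a number gets a fix.
        print(locate_demo("+15551234567"))  # (37.7749, -122.4194)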

Phone carriers freely admit that they sell your location data to third parties, claiming they do it “only where a customer consents to such disclosure.” None of us would knowingly consent to this; the “consent” is buried deep in those lengthy terms of service that nobody reads.

Telecom companies created location tracking as a clever side hustle to deliver us ads for things like shoes, handbags, and movie tickets. But they inadvertently created a much darker “pay to track” underground economy that preys on communities of color in ways that go far beyond just chasing people who’ve jumped bail.

You might think online advertisers only care about tracking wealthier people who buy expensive things. But there’s a lucrative cottage industry for digital blacklists meant to identify “the wrong kind of person” based on users’ locations, Google searches, or online banking habits.

People of color are more vulnerable to surveillance and data mining because they’re more likely to use a smartphone as their only computer.

A Pew Research analysis found that 24 percent of African Americans have a smartphone but no broadband at home, and 35 percent of Latinx smartphone users likewise lack home broadband. If you only ever go online through a smartphone, you generate significantly more tracking data that’s sold to third parties. More data means more ways these parties can enforce bias in employment, credit, and housing, further exacerbating the inequality these communities face on the internet.

Readers may have experienced one of the most offensive examples of this. Google searches for African American names are more likely to generate ads for “background check,” “arrest record,” or “find a person.”

Harvard professor Latanya Sweeney confirmed this in her study “Discrimination in Online Ad Delivery.” Searching thousands of names that correlate highly with particular races, she found that “a Black-identifying name was 25 percent more likely to get an ad suggestive of an arrest record” compared to a typically white name.

Sweeney even found that this happened to her own name. “A Google search for ‘Latanya Sweeney’ and ‘Latanya Lockett’ also yields ads suggestive of arrests,” she wrote.

These ads don’t just appear on Google. They follow you around the web, just like searching for a pair of shoes ensures that ads for those shoes show up on other websites you visit. “On Reuters.com,” she found, “the highest percentage of ads with ‘arrest’ in the text are ‘Darnell’ (84 percent), ‘Jermaine’ (81 percent) and ‘DeShawn’ (86 percent).”

Even if you haven’t been arrested, the appearance of your name in such an ad suggests to potential employers or creditors that you have been. Little text ads are just one part of the surveillance empire that exploits communities of color, and Facebook is probably the most careless in allowing racially biased targeting. NBC News found that Russian misinformation efforts in the 2016 election “targeted African Americans with election meddling,” and that an influence campaign sowing dissension among Black Lives Matter activists aimed to further depress and divide the African American vote.

Facebook also faces federal discrimination charges for allowing advertisers on its platform to block their ads from appearing to certain races, ethnicities, and sexual orientations. A ProPublica investigation found that housing ads on Facebook gave advertisers the option to exclude African American, Asian, and Hispanic users.

“Algorithmic decision-making is the Civil Rights issue of the 21st century,” said Cornell assistant professor Ifeoma Ajunwa. “Algorithmic systems can provide end runs against equal protection laws and may also be used to skirt laws ensuring equal opportunity on the labor market.”

The problem with artificial intelligence is that it makes assumptions that are, you know, artificial. Data-mining algorithms are mostly designed by white, affluent programmers who turn a blind eye to their own bias, a bias that is also reflected in the industry’s hiring practices. The monetization of poverty is nothing new, but our cellular phone service providers are getting even richer by predictively perpetuating poverty with discriminatory data practices.


About the Author

Joe Kukura is a San Francisco freelance writer covering the intersection of cannabis policy and social justice for The North Star and SF Weekly. His work has previously appeared in Thrillist and the Daily Dot, and you can follow him on Twitter @ExercisingDrunk.