Your life is an open book.
Unless you live off the grid, big data companies know all about you. They constantly collect information about the things you buy and the websites you visit, plus thousands of other bits of personal information gathered from public records and your social media activity.
This data is run through sophisticated algorithms that make it possible for retailers, utilities and financial institutions to predict how you will respond to various marketing offers. These computer programs can also forecast whether you will pay your bills on time, flag that you may be sick, even that a woman is pregnant.
The consumer scores generated from this "predictive analysis" determine what ads you see when you go online, the offers for products and services you get in the mail and the coupons that are sent to you via email.
"It's kind of like the Tom Cruise movie 'Minority Report,'" said Pam Dixon, executive director of the World Privacy Forum. "This predictive world is on its way and I'm not sure we're fully ready for it."
You can get your credit score, but there is no way to find out about these other consumer scores that have an ever-increasing impact on your life. Credit bureaus are highly regulated. Big data is not.
And as Dixon reminds us, "These consumer scores can become someone's destiny, whether they are right or wrong."
In its new report, "The Scoring of America," the World Privacy Forum outlines how these secret consumer scores can threaten privacy and fairness. And it calls on federal regulators to police this relatively new industry of predictive scoring.
"Not all scores are bad, but any score that's secret is a problem because it's a simple matter of fairness not to have a secret score," Dixon said. "I'm really concerned about potential discrimination lurking in scores that we don't see and don't know what goes into them."
You will never know when a predictive score was used or how it affected the offers you did or did not receive. And even if you did find out, you can't change that score.
Robert Gellman, a privacy and information policy consultant who worked on the report, said these scores can be based on thousands of factors, including race, religion, age, gender, household income, ZIP code, medical conditions and purchase history.
"There could be a discriminatory effect here—some kind of redlining that isn't visible on the surface," he said.
And that's the rub. Whether you get a discount coupon isn't a big deal, but if consumer scoring keeps you from receiving credit card offers or makes you a target for subprime loans—because of your ZIP code, household income or race—well, that's another matter.
The companies that create and market this ever-expanding treasure-trove of personal information don't see a problem or the need for any additional regulation. They say the marketing information they gather and analyze is used to offer people relevant ads and money-saving deals.
"Marketers want the most accurate information possible," said Rachel Nyswander Thomas, executive director of the Direct Marketing Association's Data-Driven Marketing Institute. "But at the end of the day, if the data is wrong and the predictive analytics is wrong, the worst thing that can happen to a consumer is that the ad or offer they get is not relevant."
Thomas stressed that the Direct Marketing Association's code of conduct does not allow any marketing that is disparaging or discriminatory in any way.
Jennifer Barrett Glasgow, chief privacy officer at Acxiom, one of the country's biggest data brokers, explained that a marketing score really isn't all that personal.
"It's a mathematical computation that puts a group of individuals into a defined audience for a marketing campaign," she said. "And these scores aren't static in the same way that credit scores are. Their lifespan may be milliseconds."
We live in a world of scores
Credit scores are based on information in your credit file. Federal law aims to prevent discrimination by prohibiting certain information from being included in those files, such as race, national origin, religion, gender, marital status and sexual orientation.
The Fair Credit Reporting Act gives you the right to check your files and correct errors. If this information was used to deny your application for credit, you must be told.
The marketing databases used to determine consumer marketing scores are virtually unregulated, so they can legally contain all sorts of sensitive information that cannot be included in your credit files—including health and medical information gleaned from Web searches, purchases and public postings.
Privacy advocates worry that this could allow discrimination, unfairness and bias in the marketplace.
The World Privacy Forum report suggests that some predictive scores are being used in place of credit scores in order to get around the restrictions of the Fair Credit Reporting Act.
"If I can peek at a credit score equivalent, not a credit score that's regulated, but one that is not regulated, and decide whether to hire you or make you an offer of credit, that may be a problem," Gellman said.
Consumer scoring is growing
The World Privacy Forum report estimates there were fewer than 25 consumer scores back in 2007. Today, there are hundreds, and forum researchers believe there are probably thousands of custom scores that are beyond their ability to confirm. A few examples cited in the report:
- Job security score: Predicts future income and capacity to pay.
- Churn score: Predicts when customers will move their business or account to another merchant.
- Brand name medicine propensity score: Predicts if you will buy generics or brand name medications.
- Fraud score: Predicts if a customer is not who they claim to be or may be up to some mischief.
Some big data brokers, such as Acxiom, Spokeo, Intelius, eBureau and ID Analytics, make it possible for you to see what's in your file—the information used to create various consumer scores—but the process isn't always easy.
For its recent report on Big Data, the National Consumer Law Center had 15 volunteers try to get their own information from four large data brokers.
"We found that there's a lot of bad data out there," said attorney Persis Yu, co-author of the report.
The reports the volunteers received about themselves were full of inaccuracies. Some of the mistakes were minor: a wrong email address or phone number. Some were significant: incorrect occupation, salary or level of education.
A call for action
Privacy advocates would like to see federal regulators establish some rules for the use of consumer scores to make sure they are not being used unfairly or to discriminate.
They believe the companies that collect this data should be required to take steps to ensure that it is accurate.
They want companies to disclose that a predictive score was used, if that score adversely impacts someone's employment, credit, insurance or any significant marketplace opportunity.
Congress, the Federal Trade Commission and the Consumer Financial Protection Bureau have all been studying the impact of the collection and use of consumer data on the marketplace.
Even if the decision is made to regulate this industry—and that is far from certain—it won't be easy to do. The World Privacy Forum estimates that there are now more than 4,000 databases collecting and analyzing every bit of information they can gather on us.
There are ways to stop some of this data collection. You can use Web browsers that don't track where you go, or you can shop with cash. You could even stop sharing all of your personal information on social media—fair game for the data collectors.
But there are so many sources of personal information beyond your control that it's really a losing battle. Your data is now a commodity, bought and sold, whether you like it or not.
—By CNBC contributor Herb Weisbaum.