Paying consumers for their data is a new permission model being tested as the value of data to AI and machine-learning models becomes better recognized.
AI algorithms typically require thousands or millions of data points to work effectively, which increases the value of data. But thanks to GDPR and similar efforts, the days of grabbing data from individuals without explicit permission are numbered. The pendulum seems to be swinging the other way, toward avenues for granting clear permission.
One model for this is being offered by Oasis Labs, founded by Dawn Song, a professor at the University of California, Berkeley, to create a secure way for patients to share their data with researchers. Participating patients get paid when their data is used; participating researchers never see the data, even when it is used to train AI, according to an account in Wired.
Government is getting into the mix too. US Senator Mark Warner (D-Virginia) has introduced a bill that would require firms to put a value on the personal data of each user. The idea is that companies should pay to use personal data.
Health research is a good place to explore these ideas, says Song, because people often agree to participate in clinical studies and get paid for it. “We can help users to maintain control of their data and at the same time to enable data to be utilized in a privacy preserving way for machine learning models,” she said.
Song and her research partner Robert Chang, a Stanford ophthalmologist, recently started a trial system called Kara, which uses a technique known as differential privacy. In this model, the data for training an AI system is pooled while remaining largely invisible to the parties involved. The medical data is encrypted, anonymized, and stored on the Oasis platform, which is blockchain-based.
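Differential privacy is a formal guarantee that what a trained model or published statistic reveals about any single participant is strictly bounded, usually by adding carefully calibrated noise. The sketch below is only a minimal illustration of the classic Laplace mechanism in Python; it is not the Kara or Oasis implementation, and the function name, value bounds, and epsilon setting are assumptions chosen for the example.

```python
import numpy as np

def dp_mean(values, lower, upper, epsilon, rng=None):
    """Return a differentially private estimate of the mean of `values`.

    Each value is clipped to [lower, upper] so one person's record can shift
    the mean by at most (upper - lower) / n; Laplace noise calibrated to that
    sensitivity and the privacy budget epsilon is then added to the average.
    """
    rng = rng or np.random.default_rng()
    clipped = np.clip(values, lower, upper)
    n = len(clipped)
    sensitivity = (upper - lower) / n  # max influence of any one record on the mean
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return clipped.mean() + noise

# Example: a private average of hypothetical patient eye-pressure readings.
readings = np.array([14.2, 16.8, 21.5, 12.9, 18.3])
print(dp_mean(readings, lower=5.0, upper=30.0, epsilon=1.0))
```

Smaller epsilon means more noise and stronger privacy; researchers see only the noisy aggregate, never the underlying patient records.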
Silicon Valley giants such as Facebook and Google have built business models on fuzzy permissions around the use of their users' data. James Zou, a professor of biomedical data science at Stanford, sees a gap between permissions such as those expressed in Sen. Warner's bill and those granted in practice.
"There is a gap between the policy community and the technical community on what exactly it means to value data," Zou said. "We're trying to inject more rigor into these policy decisions."
Gov. Newsom of California Proposes Data Dividend
California Governor Gavin Newsom gave the idea of paying consumers for their data a boost in February, when he proposed a "data dividend" for the state's residents.
People give massive amounts of their personal data to companies for free every day. Some economists, academics and activists think they should be paid for their contributions.
“California’s consumers should… be able to share in the wealth that is created from their data. And so I’ve asked my team to develop a proposal for a new data dividend for Californians, because we recognize that your data has value and it belongs to you,” said Newsom during his annual State of the State speech, as quoted by CNN and Fox News.
The idea is based on a model in Alaska, where residents receive a payment each fall as their share of the state's oil-royalty fund dividend. The payouts range from a few hundred dollars to a couple of thousand dollars per person.
It is becoming clearer that AI systems in training need data from wherever they can get it: online purchases, credit card transactions, social media posts, and shared smartphone location data, to name a few sources.
Recognizing the heightened awareness of the value of data, in June Facebook announced a market research program called Study, which aims to compensate willing Android users for information about their smartphone app use.
Users are to be paid a flat monthly rate through their connected PayPal account, according to a report in MarketWatch. Facebook would not provide an estimate of how much money users could be expected to make.
The app will collect information on what apps are installed on the phone, how much time is spent using those apps, the participant’s country, network type and device, and possibly what app features participants use. The data captured from the program, Facebook said, would help the company “learn which apps people value and how they’re used.”
Given its recent history of exploiting its users' data, Facebook would have been well advised to have an independent review board evaluate the Study program for its ethical and legal implications, suggested Pam Dixon, executive director of the World Privacy Forum. “We need more transparency from Facebook, and they have to go extra miles to prove that we can trust them because of their track record now,” she said.
Facebook said in announcing Study that it would not collect information like usernames or passwords, or photo and message content, and it would not sell Study app information to third parties or use it for targeted ads. The company said it would inform users of what information they would be sharing and how it would be used, before providing any data.
Source at AITrends