
Artificial intelligence can transform industries, but California lawmakers are worried about your privacy

Pinscreen, a Los Angeles start-up, uses real-time capture technology to make photorealistic avatars.
(Gina Ferazzi / Los Angeles Times)

The use of bots to meddle in political elections. Algorithms that learn who people are and keep them coming back to social media platforms. The rise of autonomous vehicles and drones that could displace hundreds of thousands of workers.

The “robot apocalypse” that some envisioned with the rise of artificial intelligence hasn’t arrived, but machine learning systems are becoming part of Californians’ everyday lives, tech experts told state lawmakers in Sacramento last week. As use of the technology becomes more widespread, so will the challenges for legislators, who must grapple with how and when to step in to protect people’s personal data.

“AI might be over-hyped, but it is here to stay,” said Olaf Groth, a professor at Hult International Business School. “We are indeed at the threshold, at the moment, between AI being a new development that promises significant growth in the economy, and AI also causing significant disruptions in society.”


The state Assembly hearing was the second this year to take on the issue of artificial intelligence. The committee’s chair, Assemblyman Ed Chau (D-Arcadia), said he hoped it would be one of many to open a dialogue on a force already reshaping society. Members of the Little Hoover Commission, an independent state oversight agency that reviews government operations, held a similar discussion last month.


AI is loosely defined as the theory and development of computer systems able to perform tasks that traditionally require human intelligence. Courts use artificial intelligence to determine whether defendants are fit for release on bail, scientists are developing AI systems to match patients with treatment and at least one company is using AI to rethink how local stores deliver groceries.


Representatives with tech companies Adobe and Postmates argued during Tuesday’s hearing that lawmakers should not move too quickly on developing privacy regulations.

“The best thing we can do to make AI unbiased [and efficient] is to ensure [AI systems] can get trained on the broadest sets of data that are out there,” said Dana Rao, vice president of intellectual property and litigation with Adobe.

But tech privacy experts countered, saying users want more control over how their personal information is shared. Patients, for example, may agree to share their medical data with one AI developer, but might not want that business to release their information to their employers or health insurance companies.


State lawmakers have started to tackle some of these issues. Legislation this year would require social media companies to identify bots on their platforms. Other bills would fund independent research on the addictive qualities of social media, require manufacturers to install new security features on their internet-connected devices and force police to disclose all of their tech equipment, such as facial recognition software.


Jonathan Feldman of the California Police Chiefs Assn. envisioned a scenario in which police will someday use AI to predict violent or criminal behavior, allowing officers to make better decisions on the job. But without better transparency laws, the public will have little knowledge of what these automated systems are collecting from everyday people and whether their predictions are accurate, said Matt Cagle, a technology lawyer with the ACLU of Northern California.

Samantha Corbin, a tech privacy lobbyist who recently co-launched the We Said Enough app to report sexual harassment, argued there should be more state protections against data misuse across all fields.

“The level of information that can be aggregated, the degree of intimate knowledge that will be known about individuals — this lifetime data — is in many ways unprotected by outdated existing laws regarding privacy, security and even human research,” she said.



@jazmineulloa
