(Bloomberg) —
Cities are looking to artificial intelligence to help improve their waste management, including reducing contaminants in their recycling and composting streams. But as with all new technology, there are privacy concerns.
In East Lansing, a Michigan university town where students make up over half of the 50,000 residents, items like plastic bags, Styrofoam and plastic film commonly end up in the municipal recycling stream as a result of “wish-cycling” — when well-intentioned people discard waste in the recycling bin in hopes that it can be recycled.
“People are trying to do the right thing, but education is tough to get your hands on when all the solutions and rules for recycling are hyperlocal,” Cliff Walls, environmental sustainability and resiliency manager for the city, told a roomful of city leaders and urban experts during a session on AI at the Bloomberg CityLab 2024 conference in Mexico City this week.
Broad education campaigns have proven ineffective due to the city’s transient population, with new students moving in and leaving every year. So in 2022, the city launched a pilot program to deliver more personalized messages in hopes of changing each household’s behavior.
Recycling trucks were equipped with AI-powered computers and cameras trained to identify and photograph non-recyclable items in household recycling bins left on the curb. The city then sends the bin owner a geotagged photo, with everything but the offending item blurred out, in the form of a postcard that flags the error and offers disposal tips.
Over a 24-week period with more than 5,000 postcards mailed, contamination of the recycling stream decreased by more than a fifth, according to a study on the pilot program. Postcards bearing an emotional message — via an image of a family looking out to a mountain of landfill — proved to be most effective, said Walls, with recipients contaminating 23% less than the control group.
The Canadian city of Leduc, in the province of Alberta, launched a similar pilot in 2023 to keep non-organic materials out of the municipal compost carts. That year, contamination in the city’s compost stream started at 68%, according to environmental manager Michael Hancharyk. Now it’s down to 9%.
At least one cybersecurity expert warned, however, of potential risks to residents' privacy when cities fail to adequately vet the intentions and security practices of the AI companies they work with to collect personal information, or lack the capacity to do so.
“Trash is really personal,” said Sarah Powazek, program director of public interest cybersecurity at the Center for Long-Term Cybersecurity at the University of California, Berkeley. “Trash has a lot of sensitive information that not only could lead to identity theft [through items] such as credit cards and financial records that people might throw out, but also about how people live their lives.”
That might include records revealing a person’s health, for example, or receipts that detail where people frequently visit and therefore expose information about their lifestyle. That information could get into the wrong hands if cities — and the vendors they work with — are hacked. AI companies collecting that data on the city’s behalf could also potentially sell it to third parties, who then use it to target ads.
And various agencies could potentially access that information and use it against individual residents or to monitor entire communities. Powazek, who was in the audience during the session, described a scenario in which a photo of a discarded pregnancy test captured by AI cameras might get turned over to the police and used against a woman in a state that has outlawed abortion.
Cybersecurity experts have a term for that gradual shift in the use of data and technology: mission creep. "This is often how surveillance programs start; you create a technology that is meant to, let's say, view people's trash, and it ends up going to the police," Powazek told Bloomberg CityLab. "And because their job is to prevent crime, it becomes very difficult to make the case that they shouldn't have access to it."
At the session, Walls and Hancharyk both acknowledged the privacy concerns and said they work with vendors and regulators to ensure that people's information is protected. In East Lansing, the city took steps to ensure AI companies are not selling information to third parties, according to Walls, and made sure they were transparent with residents about the initiative. There are also similar pilots in other Michigan cities, including Detroit and Grand Rapids, to study how communities of various income levels might respond to the program.
Hancharyk said the city not only conducted a media blitz to make residents aware of the program before implementation, but also had to run it by Alberta regulators to ensure that it abided by the Freedom of Information and Protection of Privacy Act, the law governing the collection, use and disclosure of people's data by public agencies.
Powazek clarified that she is not calling for cities to stop experimenting with new technologies, but wants officials to do their due diligence. “I think what cities really should do first is understand the trade-offs that they’re making, understand the new risks they’re introducing and to whom, with a particular focus on sensitive populations,” she said. “And they should really assess whether or not improvements to recycling are worth potentially revealing very sensitive, very personal information about residents.”
To contact the author of this story:
Linda Poon in Washington at lpoon12@bloomberg.net
© 2024 Bloomberg L.P.