In his latest column for Wired, "Worried About Privacy at Home? There's an AI for That" (subtitled "How edge AI will provide devices with just enough smarts to get the job done without spilling all your secrets to the mothership"), Clive Thompson, a very smart columnist who also writes for the New York Times, touched on both parts: the privacy and the AI in everything.
His feeling: "I don't need light switches that tell dad jokes. When it comes to gadgets that share my house, I'd prefer they be less smart."
What he means is that companies are building "edge AI": AI that runs on "teensy microprocessors" with just enough capability to control a coffee maker and nothing more; such an edge AI might understand only 200 words.
Why is that enough? With its deliberately limited capability, the edge AI-enabled coffee machine does not need to interact with the cloud. Cloud connectivity would give it more power, but it would also send all kinds of data to the cloud, where that data might be used to train future iterations of the coffee machine (perhaps) and could also be monetized or shared further without your permission.
Designed to handle specific applications, edge AI can be faster and can ensure privacy while helping you get the job done. As Thompson notes:
"You can't banter with it (edge AI) as you would with Alex. But who cares? 'It's a coffee maker. You're not going to have a meaningful conversation with your coffee maker...'"Thompson describes edge AI as perfect for appliances light lamps, TVs, and other devices that could benefit from voice control without needing full-on conversational capabilities. True, users would need to know the key terms to turn on and off devices or handle other variables (like turn up or down the lights and thermostats or the channels or volume, if that's how you still watch TV or listen to music). But they won't have to worry that someone is listening in on the conversations.
Edge AI won't solve all the "Age of Anxiety" issues, but it's a good way to get help from just-smart-enough AI without the devices being too smart and without us wondering how our input and data are being used.