Why should Apple enter the space? After all, Google Glass got a lot of backlash for its awkward design and from onlookers who were wary of the attached camera, which could be turned on at will. Snap’s similar camera glasses, Spectacles, introduced last year, have also fared poorly — the company just had to write off almost $40 million in unsold inventory.
But I think Apple has learned from the mistakes of its predecessors, and might not even have to include the ability to take photos or videos — just leave those tasks to a smartphone.
Instead, Apple’s smart glasses could be used as guides: providing directions as you walk around a city, or surfacing more information about the restaurant you’re looking at (Yelp reviews, for example). Or perhaps Apple glasses could be used in the enterprise, the way Google Glass has been used to help engineers work more accurately during assembly.
What if you could pull up notes on Apple’s glasses during a business meeting? Or receive and send emails and messages, likely via voice, without breaking stride? Or what about health? As Apple continues to expand its focus in the area, what if doctors could use rOS-powered glasses to better diagnose patients by comparing a symptom against an entire database of ailments in just seconds? What if I could look down at a rash on my leg and know it was poison ivy and not a reaction to something I ate?