
Eagerly awaiting Apple’s AR Glasses? You already have most of the technology in your pocket...


Mark Turner is Chair of the AR/VR Working Group for the Consumer Technology Association but has no access to, or prior knowledge of, Apple’s plans for AR hardware.

At CES 2018 I argued that the battle over AR and VR technologies was not a skirmish over which company can build the best headset in the short term but part of a much wider war between the technology giants (Apple, Microsoft, Google, Facebook, Samsung) to stake a claim in a new world of pervasive computing devices. These new devices represent the next wave in personal computing: always-on wearables with cameras and sensors to understand our environment, displays to augment our vision, and voice as their primary interface with us.

So, at CES 2019 next week and Mobile World Congress in February, look for more weapons to be deployed in this multi-year war. However, one company to watch will be silent during these shows - Apple. Even though they’ve not announced or displayed any AR headsets, Apple could ironically be furthest ahead in developing an entire AR ecosystem, and you may already own most of the technology required. Apple CEO Tim Cook has been uncharacteristically vocal about Apple’s interest in AR, so let’s look further at the clues they have laid out about what exactly their “Apple Glasses” may look like:

1) ARKit. Updated to version 2.0 in 2018, ARKit is Apple’s most obvious offering here. Even though some see ARKit as 'weird experimentation' and it is currently limited to moving a phone handset around as a ‘window’ into an immersive world, ARKit should be viewed as a teaser for Apple Glasses and a way to get developers trained in building immersive 3D apps and 3D interfaces. Two vital Apple Glasses features were announced with ARKit version 2 – persistence and multi-user support. Effectively, these features will allow two or more people wearing Apple Glasses to see and engage with the same virtual objects, to leave those virtual objects in place, and to see them again when they return to a space. These are vital features for a world which Charlie Fink describes as “painted with data”, and it’s another battleground for the tech companies – if Apple can seed some of these virtual objects, signposts and advertisements in 2019, there will be more to experience for early adopters of Apple Glasses. And if those digital objects are exclusive to the Apple ecosystem, they create a similar world of ‘sticky’ content that works to lock in consumers, just as the App Store and iTunes did before them.
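To make the persistence and multi-user point concrete, here is a minimal sketch using the ARKit 2 APIs Apple shipped in iOS 12 (ARWorldMap, getCurrentWorldMap and initialWorldMap). The class name, file location and error handling are illustrative assumptions on my part, not anything Apple has said about Glasses; the same archived map data could just as easily be sent to another device for a shared session.

```swift
import ARKit

// Sketch only: save an ARKit session's world map so virtual objects can be
// restored later (persistence) or shared with another device (multi-user).
final class WorldMapStore {
    // Hypothetical save location, for illustration.
    let mapURL = FileManager.default.urls(for: .documentDirectory,
                                          in: .userDomainMask)[0]
        .appendingPathComponent("room.worldmap")

    // Capture the current world map (anchors plus feature points) and archive it.
    func save(from session: ARSession) {
        session.getCurrentWorldMap { worldMap, _ in
            guard let worldMap = worldMap,
                  let data = try? NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                               requiringSecureCoding: true)
            else { return }
            try? data.write(to: self.mapURL)
        }
    }

    // Relocalize against a previously saved map, restoring its anchors.
    func restore(into session: ARSession) {
        guard let data = try? Data(contentsOf: mapURL),
              let worldMap = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                                     from: data)
        else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.initialWorldMap = worldMap // the persistence / multi-user entry point
        session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    }
}
```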

2) Apple Watch. The Watch demonstrates that multiple Glasses technologies are already mature and deployed, and that a small wearable can work as a companion device tethered to an iPhone. Obviously Apple Watch shows Apple’s amazing ability to shrink technology and batteries into tiny packages, which will be vital for Glasses, but it also demonstrates other key technologies such as wireless charging, making and receiving calls, moving those calls between Apple devices, accessing Siri for voice commands and sharing data and experiences between mini apps (Complications, as Apple calls them) and their big-brother apps on iPhone. You can see a direct correlation here with future Glasses, which could feature audio (for music and calls) on the temples (the arms of glasses that wrap around the ears), wireless charging on AirPower (so Glasses could be topping up their charge whenever removed), a custom but familiar user interface, a mini App Store for Glasses-enhanced apps, and microphones for Siri voice commands.
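The Watch-to-iPhone data sharing mentioned above already has shipping plumbing in WatchConnectivity. The sketch below shows the iPhone-side shape of that link; the class name and payload are hypothetical examples, and I am only speculating that a future Glasses app would use something similar.

```swift
import WatchConnectivity

// Sketch of the companion-device link a Watch app uses to exchange small
// payloads (e.g. data backing a Complication) with its paired iPhone app.
final class CompanionLink: NSObject, WCSessionDelegate {
    func activate() {
        guard WCSession.isSupported() else { return }
        let session = WCSession.default
        session.delegate = self
        session.activate()
    }

    // Push a small state update to the paired device in the background.
    func sendUpdate() {
        try? WCSession.default.updateApplicationContext(
            ["lastReading": Date().timeIntervalSince1970]
        )
    }

    // Minimal delegate conformance for the iOS side.
    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {}
    func sessionDidBecomeInactive(_ session: WCSession) {}
    func sessionDidDeactivate(_ session: WCSession) {}
}
```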

3) Cameras and Depth Sensors. AR devices require a thorough understanding of what’s around them so they can place digital objects correctly within their surroundings rather than on top of them. Apple leads the world in small but very high-quality cameras and lenses, and in the advanced software that interprets those images. The infrared camera used for Face ID is another foundational technology Apple has already deployed, one that will help Glasses scan the environment around them and detect surfaces, walls and other opportunities to insert digital content.
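That surface detection is already exposed to developers today through ARKit’s plane detection. The snippet below is a minimal sketch of it running on an iPhone; the class name and logging are my own illustrative choices, while the configuration and delegate calls are the shipping API.

```swift
import ARKit

// Sketch: ask ARKit to find flat surfaces (floors, tables, walls) where
// digital content could be anchored.
final class SurfaceScanner: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical] // floors and walls
        session.run(configuration)
    }

    // ARKit calls this as it discovers new surfaces in the environment.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            let kind = plane.alignment == .vertical ? "vertical" : "horizontal"
            print("Found \(kind) surface, roughly \(plane.extent.x) x \(plane.extent.z) m")
        }
    }
}
```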

4) Ecosystem of devices working together. As Watch and AirPods demonstrate, Apple has built foundational technologies that let its devices communicate seamlessly with each other via Bluetooth and Wi-Fi, and that move media and user experiences between devices. This is vital to enable some other key Glasses features and to make Glasses smaller and lighter:

a. Moving images and video between devices. Apple Watch has an interesting feature that allows the Watch to control and take photos on your iPhone. This requires the link between Apple devices to be able to stream camera input from one device to another, and that opens up an interesting solution to a problem that has plagued current AR headsets. Standalone AR headsets like HoloLens and Magic Leap One house all the compute (CPU and GPU) on the device. Those chips work hard, they get hot and they are heavy. If Apple can use its streaming technologies to move data from Glasses back to an iPhone or iPad and offload the compute onto those beefier CPUs and GPUs, they can very smartly build much smaller and lighter Glasses with batteries that last longer.

b. AI on iPhone. Now that iPhones are shipping with AI chips, Apple can play the same efficiency trick with computer vision – using small cameras on Glasses and streaming that data back to an iPhone for processing and interpretation. The stream of video data does not need to be huge – 10 frames per second would probably be sufficient for an AI chip on your phone to identify the faces of people you are looking at through Glasses, to interpret or translate street signs for you, or to distinguish objects you’re looking at. It’s this smart use of complementary devices and edge computing that will make AR devices less like diving masks and more like fashion items. Speaking of which...
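(A brief technical aside before we get to fashion: the snippet below is a minimal, speculative sketch of that on-device inference step, using Apple’s shipping Vision framework to find faces in a single camera frame. The function name and the idea of a frame relayed from a head-worn camera are assumptions for illustration only.)

```swift
import Vision
import CoreVideo

// Sketch: run on-device face detection on one downsampled camera frame,
// standing in for a low-frame-rate stream relayed from a head-worn camera.
func detectFaces(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectFaceRectanglesRequest { request, _ in
        let faces = request.results as? [VNFaceObservation] ?? []
        // Each observation is a normalized bounding box a Glasses UI could annotate.
        print("Detected \(faces.count) face(s)")
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```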

5) Fashion & Partnerships. Under Jony Ive, Apple has become an iconic design company, but with Apple Watch they also sensibly made the design extensible so consumers could express their individuality, and partnered with other fashion brands (Hermès, Nike). With Glasses you can expect a similarly extensible design that allows a range of unique customizations, as well as support for prescription lenses and shades.

So, with all these core technologies already deployed, what else is left for Apple to do? Why aren’t they launching Apple Glasses already? Well, the hardest part remains: display technology that can project digital text, objects and interfaces into your eyes in a way that adapts accurately to your depth of field and does not create eye strain. Even for Apple this is no mean feat. And of course, they still need to strike a balance between cost and performance such that Glasses can be made with Apple’s suitably high margins. As Apple’s stock valuation has recently been hit by flattening iPhone sales, they may be well advised to view their consumer market as this ecosystem of interconnected devices rather than as individual sales (and perhaps measure performance the way service companies do, in ARPU – Average Revenue Per User).

Apple have never been first to a market; in this case they have left Google Glass, Microsoft HoloLens, Snap and Magic Leap to do the early work of feeling out AR, but you can be assured that when they do release their Glasses they will have learned from others and avoided making heavy, dorky-looking devices. Apple Glasses version 1 will likely have limited battery life, perhaps only indoor AR visuals, and limited vision cameras, but as they move through versions 2 and 3 those teething problems will be resolved, so we can genuinely expect mass-market, fashionable 5G computers attached to our heads.
