
Wearable Technology: NOW and in the Pipeline

The idea of augmented reality (AR) has been around since the early 1960s, when a motorbike simulator used a hint of the technology. Boeing picked it up in the 1980s, foreseeing benefits for aeronautical maintenance; the concept then moved into laboratories to be developed for the military.

In the 1990s it started to take off, and since then AR has exploded.

In the ophthalmic industry, AR is used to train surgeons and assist with surgical procedures. It helps detect eye disease, measure visual acuity and visual fields, and assist people with vision loss to navigate their environment and enjoy a greater quality of life.

In the retail world, AR is being used to create memorable shopping experiences that build brand awareness, loyalty and sales.

With so much going on in this space, we thought we’d bring you a snapshot of products currently available and others that are in the pipeline.

AR overlays the real world with a virtual world drawn from media, the World Wide Web, and television… in real time. A person slips on a headset – or wears connected contact lenses – and voilà, images and data appear to sit about three metres from their face, within the perspective of their environment.


Glass Enterprise Edition 2.


For marketers, AR is a powerful tool that can be used to communicate the story behind their products to prospective and existing customers, without relying on retailers. At a time when the rate of online shopping has been accelerated out of necessity due to the pandemic, AR creates a retail experience that’s as close to a physical experience as you’re going to get. You can see the product in situ, try it on and acquire it with or without being in-store.

Ikea, for example, allows customers to position and visualise a piece of furniture in their home, via its Place app. Beauty brands allow potential customers to see how different cosmetic products look on their own faces.

The potential to promote eyewear using AR is vast with opportunities to provide lifestyle and branding messages, information about technical features and, most importantly for many, the ability for consumers to try frames on.

Strangely enough, consumers are now even prepared to pay to ‘own’ a virtual product.

Consider this example from Gucci, which released a pair of virtual sneakers designed to only be worn and shared online. The ‘digital-only’ trainers, created in collaboration with AR fashion platform Wanna, are available for purchase via the Gucci app for just US$17.99 – creating an opportunity for aspiring Gucci wearers around the world to experience the luxury brand well before they could otherwise have afforded it.

Virtual products like these provide the opportunity for premium brands to grow with high volume sales of low price products while at the same time building a pipeline of future loyal customers who may one day splash out on the real item.

Heru re:Vive.

Growing Uptake 

A recent study, conducted by Snapchat across the United States, the United Kingdom, France, and Saudi Arabia, found more than half of consumers aged 13–49 had used AR in the past, and nearly one third had used branded AR (AR initiatives established by brands specifically to showcase their products).1 Snapchatters were even more enthusiastic, being 56% more likely than non-Snapchatters to have used branded AR.1

Two thirds of those surveyed said they would use branded AR for shopping and would be likely to purchase after a branded AR experience. The numbers increased to 72% for virtual try-on.1


Everyone remembers Google Glass – the AR-enabled ‘Explorer’ eyewear that promised the world but never got off the ground, most likely due to privacy concerns and an inability to build the technology into a good-looking, comfortable frame for consumers on the street.

The withdrawal of the Explorer from the market didn’t signal an end to the company’s foray into wearable technology. Far from it. Today, Glass Enterprise Edition 2 features a voice-activated optical head-mounted display that lets wearers access apps, training videos, images annotated with instructions, quality assurance checklists and more. It performs similarly to a smartphone, but with the hands-free novelty of having the information hovering right before your eyes. Additionally, Glass Enterprise Edition 2 lets co-workers connect via video calls and enables others to see what you’re seeing through a live video stream. Unsurprisingly, these AR glasses are now used across industrial applications, including in surgery.


Recognising the opportunities available to them, in July this year, EssilorLuxottica announced the creation of a dedicated innovation centre in France which will focus on electrochromic and smart eyewear technologies.

Electrochromism is an automatic change in lens tint, powered electronically and triggered by a signal generated by the frame to activate the electrochromic dyes in the lens.

Smart eyewear is a complex product category that requires the combination of active lenses and sophisticated frames on the one hand, and electronics, sensors and software on the other, along with the optical function of a lens.

“On the back of a decade of research and development around smart eyewear, we are accelerating our investments and reinforcing our capacity to bring digital technology into eyewear, in the service of good vision. At EssilorLuxottica, we are convinced that advanced optical functions in lenses will be instrumental to advancing our mission to help people to see more, be more and live life to its fullest. We now look forward to showcasing our first innovation in the segment of active products as EssilorLuxottica in the coming months,” said Francesco Milleri and Paul du Saillant, CEO and Deputy CEO of EssilorLuxottica respectively.


The newly created Smart Eyewear Technologies Centre located in Dijon, France will bring together more than 50 “leading experts” and coordinate the relevant R&D and industrialisation sites, based in Toulouse and Créteil (France), and collaborate closely with the R&D teams based in Agordo (Italy).

The Centre, which will include the creation of a high-grade clean room, is expected to be operational by the end of 2021, with the launch of the first products in the near future.

Facebook, Apple, Microsoft, Snapchat and others are also actively pursuing the development of smart glasses. Indeed, Facebook’s Mark Zuckerberg has confirmed that the company’s first pair of smart glasses is almost here. It is believed that the company has some 6,000 people working furiously on virtual reality, AR and hardware.


Make no mistake, these glasses do not yet offer augmented reality; however, according to industry observers, they feature several concepts expected to eventually be critical for AR glasses.

Of the glasses, made in partnership with EssilorLuxottica under the Ray-Ban brand, Mr Zuckerberg said, “The glasses have their iconic form factor, and they let you do some pretty neat things. I’m excited to get these into people’s hands and to continue to make progress on the journey towards full augmented reality glasses in the future.”

By all accounts, the glasses borne from this Facebook Ray-Ban collaboration will look like normal glasses, have lenses similar to standard lenses, and connect wirelessly with phones or, in the future, motion-sensing watches.

Rather than overlaying images, it appears they will focus on audio as the immersive technology, which is considered to be a more achievable step in the short-term.


In August, United States innovators Heru Inc. announced the commercial availability of re:Vive, its wearable diagnostics and visual field application.

Compact, comfortable to wear, and weighing in at just 370g, this portable wearable removes the need for a designated dark room in which to conduct an eye examination – it can be used anywhere in the practice, making it a great solution for mobility-impaired patients and for freeing up consult rooms for other requirements.

Developed by eye care practitioners and scientists at the University of Miami’s Bascom Palmer Eye Institute, Heru’s re:Vive system is designed to test every patient, whether for screening or disease management, and especially for glaucoma.

According to the company, it supports automated suprathreshold screening and full threshold exam strategies. Threshold tests are highly correlated with the Humphrey Field Analyser (HFA) and can be completed in a shorter time. According to Heru, two studies comparing the HFA 24-2 to the Heru re:Vive 24-2 confirm the accuracy of the re:Vive Threshold test strategy. One included 47 eyes (21 healthy and 26 from patients with glaucoma and neuro-ophthalmic diseases).2 Another brought the total eyes tested to 81 (40 normal and 41 with pathology). Both found strong correlations between Heru visual field mean deviation and threshold values and those of the HFA in normal eyes (R=0.91, P<0.001) and in eyes with glaucoma and other pathologies (R=0.81, P<0.001), as well as excellent reproducibility, with ICCs of 0.95 (95% CI 0.86–0.98) and 0.80 (95% CI 0.78–0.82) in normal and pathologic eyes, respectively.3 The re:Vive Threshold strategy was statistically significantly faster than the HFA SITA Standard (4.3 vs 5.0 minutes, respectively; P<0.001), with a 15% time saving in pathologic eyes and an 8% saving in healthy eyes.

With a cloud-based web portal, eye care practitioners can access and review test results from any location. The fully encrypted and HIPAA-compliant system ensures patient privacy and confidentiality is maintained.


A fascinating product in the pipeline is Oculenz by Ocutrx, a Californian company described as a “smart-up startup”, founded in 2015 by Michael H. Freeman, an Emmy Award-winning mobile video inventor.

In 2018, Ocutrx was granted its first patent by the United States Patent and Trademark Office (USPTO) for its core AR medical application technology – the Oculenz.


“The Oculenz is the first ground-breaking technology to offer a solution for advanced central visual defects in patients with retinal disease,” said Dr Thomas Finley, M.D., a vitreoretinal surgeon and member of the Ocutrx International Medical Advisory Board (IMAB). “This innovative AR device brings a new hope to regain functional vision previously considered impossible. The impact on the individual’s quality of life and retained or regained ability to productively function at home and in the workplace will be immeasurable.”

Quite simply, Oculenz mediated reality glasses use pixel manipulation to correct for eye deficits such as macular degeneration (AMD), macular scar and myopic degeneration.

Weighing in at just 200g, the Oculenz headset has an eye-tracking camera housed in the nosepiece that provides a virtual overlay of what the wearer is looking at, in a way that corrects for their scotoma, regardless of its size.

The device is particularly useful when reading, as the wearer gets to see all the words in their entirety, in line with their natural gaze.

To prepare the Oculenz technology for an individual wearer, the patient initially wears the headpiece to map their retinal defects, alone or with the help of their eye care professional. The feed from the eye-tracking video camera is then buffered according to the retinal map and displayed over the patient’s real-world vision. The Oculenz’s buffered display shows the entire real-world video image, but none of it is rendered on the exact area of the patient’s eye defect. Instead, that content is redirected to the patient’s remaining healthy areas of the macula or near-adjacent peripheral retina.
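The redirection described above can be sketched in a few lines of code. The following is a purely illustrative toy example – the function name, the single fixed offset and the greyscale frame are our own inventions for the sake of the sketch, not Ocutrx’s proprietary processing, which would be driven by the patient’s full retinal map:

```python
import numpy as np

def remap_around_scotoma(frame, scotoma_mask, offset=(0, 40)):
    """Redirect image content that falls on the scotoma to healthy retina.

    frame        : 2D greyscale image (H x W), the live camera feed
    scotoma_mask : boolean array (H x W), True where the patient cannot see
    offset       : (dy, dx) pixel shift toward a healthy region (in practice
                   this would come from the patient's retinal map)
    """
    out = frame.copy()
    dy, dx = offset
    ys, xs = np.nonzero(scotoma_mask)           # pixels falling on the defect
    ty, tx = ys + dy, xs + dx                   # target positions on healthy retina
    valid = (ty >= 0) & (ty < frame.shape[0]) & (tx >= 0) & (tx < frame.shape[1])
    out[ty[valid], tx[valid]] = frame[ys[valid], xs[valid]]  # re-display content
    out[ys, xs] = 0                             # render nothing on the defect itself
    return out
```

In this toy version every masked pixel is shifted by the same offset; a real system would warp content around the scotoma boundary so that, as the article notes, a reader still sees all the words in line with their natural gaze.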

“The Oculenz technology works somewhat like how we all have a natural blind spot in our vision that we do not realise on a day-to-day basis,” said Dr Lars Freisberg, M.D., an international retinal specialist surgeon, licensed in the U.S., Norway and Germany, and a member of the Ocutrx IMAB. “However, you never really see this hole,” he explained, “because the brain fills it in with other gathered visual information.”

Oculenz is expected to be available this quarter (Q4, 2021).


While not AR, a robotic imaging tool being developed at Duke University, Durham, North Carolina, has researchers well and truly excited.

The new tool, which combines optical coherence tomography (OCT) with a robotic arm, can automatically track and image a patient’s eyes in less than a minute, and produce images that are as clear as traditional OCTs. The engineers and ophthalmologists believe it can automatically detect and scan a patient’s eyes for markers of different eye diseases.

To use their scanner, a patient approaches the machine and stands in front of the robotic arm. 3D cameras placed to the left and right of the robot help find the patient in space, while smaller cameras in the robotic arm search for landmarks on the eye to precisely position the scanner. The system is able to scan both the macula and cornea. It takes the tool less than 10 seconds to scan and image each eye, and the entire process is complete in less than 50 seconds.
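To give a feel for the kind of control loop such a system relies on, here is a deliberately simplified sketch – the function, gain and step budget are invented for illustration and bear no relation to Duke’s actual software – of a proportional controller that nudges a scan head toward the camera-reported pupil position until it is within a one-centimetre tolerance:

```python
def align_scanner(pupil_xyz, scanner_xyz, gain=0.5, tolerance_m=0.01, max_steps=100):
    """Move the scan head toward the pupil until within `tolerance_m` metres.

    pupil_xyz   : (x, y, z) pupil position reported by the 3D cameras, in metres
    scanner_xyz : (x, y, z) current scan-head position, in metres
    Returns the scan-head position once it is aligned closely enough to image.
    """
    pos = list(scanner_xyz)
    for _ in range(max_steps):
        error = [p - s for p, s in zip(pupil_xyz, pos)]   # offset to the pupil
        distance = sum(e * e for e in error) ** 0.5
        if distance <= tolerance_m:                       # aligned: start imaging
            return pos
        pos = [s + gain * e for s, e in zip(pos, error)]  # step toward the pupil
    raise RuntimeError("failed to align within step budget")
```

Because the loop re-reads the pupil position each iteration, the same structure lets the robot follow a patient who moves mid-scan, which is the behaviour Mr Draelos describes.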

Because the tool is easier for the patient and does not require a trained ophthalmic photographer to operate, the researchers hope their innovation will be of great use in the wider community.

“The robotic arm gives us the flexibility of handheld OCT scanners, but we don’t need to worry about any operator tremor,” said Mark Draelos, a postdoctoral fellow in the biomedical engineering department. “If a person moves, the robot moves with it. As long as the scanner is aligned to within a centimetre of where it needs to be on your pupil, the scanner can get an image that is as good as a tabletop scanner.”

Because the patient is never in physical contact with the system, this tool avoids any hygiene and infectious diseases concerns that arise with the shared chin and headrest in traditional OCT systems. Additionally, it is proving to be safe.

“The camera systems continuously track the patient and allow the robot to keep a safe distance,” said Mr Draelos. “In fact, the only time we’ve seen any unintended robot contact is if a person walks or bumps into the robot when it isn’t imaging their eye.”

The team is now refining the robot’s targeting by imaging the eyes of volunteers. Next, they hope to image patients that have actual retinal or corneal diseases to test how well their robot can capture abnormalities.

They are also working to improve the field of view for the retinal scanner, as their first iteration was able to capture key features, but multiple images would need to be spliced together to get a full view of the retina.

“While this is a solution for image collection issues, we think it will pair incredibly well with recent advances in machine learning for OCT image interpretation,” said Ryan McNabb, a research scientist in the Department of Ophthalmology at the Duke University Medical Center. “We’re really bringing OCT to the patients rather than limiting these tools to specialised clinics, and I think it will make it much easier to help a wider population of people.”


  1. forbusiness.snapchat.com/blog/branded-ar-influences-purchasing
  2. R. Kashem, Comparison of Heru Visual Field as a cloud-based artificial intelligence-powered software application downloadable on commercial augmented reality headset with Humphrey Field Analyzer SITA Standard, presented at ARVO, 2021.
  3. A. H. Goldbach et al., Visual field measurements using Heru Visual Field multi-platform application downloaded on two different commercially available augmented reality devices, p. 1.