According to media reports, Apple is expected to release its first wearable augmented reality (AR) or virtual reality (VR) device in 2022 or 2023. Most suppliers may be located in Taiwan, including TSMC, Largan, Yecheng, and Pegatron, and Apple may use its experimental plant in Taiwan to design the microdisplay. The industry expects that attractive Apple use cases will drive the take-off of the extended reality (XR) market. Neither a device announcement nor the reports about which XR technology it will use (AR, VR, or MR) have been confirmed. However, Apple has already added AR applications to the iPhone and iPad and launched the ARKit platform for developers to create AR applications. In the future, Apple may develop a wearable XR device that generates synergy with the iPhone and iPad and gradually expands AR from commercial applications to consumer applications.
According to Korean media reports, Apple announced on November 18 that it is developing an XR device that includes an OLEDoS (OLED on Silicon) display. OLEDoS is a display technology that deposits OLED material on top of pixels and driver circuits fabricated on a silicon wafer. Because it leverages semiconductor processes, it supports ultra-precise driving and packs in far more pixels: a typical display offers hundreds of pixels per inch (PPI), whereas OLEDoS can reach thousands of PPI. Since XR displays sit close to the eye, they must support high resolution, and Apple is reportedly preparing to install a high-PPI OLEDoS display.
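As a rough illustration of the PPI gap described above (the panel sizes and resolutions below are hypothetical examples, not Apple specifications), pixel density follows directly from resolution and diagonal size:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch along the panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# A phone-class 6.1" panel at 2532x1170 lands in the hundreds of PPI.
phone_ppi = ppi(2532, 1170, 6.1)   # ~457 PPI
# A hypothetical 1" OLEDoS microdisplay at 3000x3000 reaches thousands.
micro_ppi = ppi(3000, 3000, 1.0)   # ~4243 PPI
```

Shrinking the panel while keeping the pixel count is what pushes a near-eye display into the thousands of PPI.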
Conceptual image of Apple headset (picture source: Internet)
Apple also plans to use time-of-flight (ToF) sensors in its XR devices. A ToF sensor measures the distance and shape of objects and is essential to realizing virtual reality (VR) and augmented reality (AR).
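The ToF principle can be sketched in a few lines: the sensor times a light pulse's round trip and converts it to distance (a simplified direct-ToF model; real sensors also handle phase measurement, noise, and multipath):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_s: float) -> float:
    """Object distance from the round-trip time of a light pulse.

    The pulse travels out and back, so the one-way distance is half
    the total path length.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

# A round trip of ~6.67 nanoseconds corresponds to an object ~1 m away.
distance = tof_distance_m(6.67e-9)
```

Repeating this measurement per pixel yields the depth map used to place virtual objects in a real scene.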
Apple is reportedly working with Sony, LG Display, and LG Innotek on the research and development of core components. The work is said to be well underway; rather than pure technology research, the likelihood of commercialization is considered high. According to Bloomberg, Apple plans to launch the XR device in the second half of next year.
Samsung is also focusing on next-generation XR devices. Samsung Electronics has invested in DigiLens, a developer of lenses for smart glasses. Although the investment amount was not disclosed, the target is expected to be a glasses-type product whose display is built into a special lens. Samsung Electro-Mechanics also participated in the DigiLens investment.
Challenges Apple faces in manufacturing wearable XR devices
Wearable AR or VR devices comprise three functional components: display and presentation, sensing, and computation.
The industrial design of a wearable device must consider comfort and acceptability, including its weight and size. XR applications closer to the virtual end of the spectrum usually require more computing power to generate virtual objects, so their core computing performance must be higher, which leads to greater power consumption.
In addition, heat dissipation and the internal battery also constrain the design, and these restrictions apply equally to AR devices closer to the real world. The battery life of the Microsoft HoloLens 2 (566 g) is only 2-3 hours. Tethering the wearable device to external computing resources (such as a smartphone or personal computer) or to a power source is one solution, but it limits the device's mobility.
Regarding sensing, most VR devices rely on handheld controllers for precise human-computer interaction, especially in games, where motion tracking depends on an inertial measurement unit (IMU). AR devices favor freehand user interfaces such as natural voice recognition and gesture control. High-end devices such as the Microsoft HoloLens even provide machine vision and 3D depth sensing, areas in which Microsoft has excelled since it launched Kinect for the Xbox.
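To illustrate the IMU-based motion tracking mentioned above, here is a minimal single-axis sketch (naive dead reckoning; commercial trackers fuse accelerometer and magnetometer data to correct the drift this approach accumulates):

```python
from typing import Iterable

def integrate_yaw(gyro_z_dps: Iterable[float], dt_s: float,
                  yaw0_deg: float = 0.0) -> float:
    """Integrate gyroscope angular-rate samples (deg/s) into a yaw angle.

    Naive rectangular integration: each sample is assumed constant
    over one timestep. Without sensor fusion, bias errors grow over time.
    """
    yaw = yaw0_deg
    for rate in gyro_z_dps:
        yaw += rate * dt_s
    return yaw

# Ten samples at 90 deg/s, each covering 10 ms -> ~9 degrees of rotation.
angle = integrate_yaw([90.0] * 10, dt_s=0.01)
```

This is why controller tracking is comparatively easy: the IMU delivers rates directly, whereas bare-handed gesture tracking must first recover the hand's pose from camera data.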
Compared with wearable AR devices, user interfaces and displays may be easier to build on VR devices because there is less need to account for the external world or ambient light. A handheld controller is also easier to develop than a bare-handed human-machine interface: controllers can rely on an IMU, whereas gesture control and 3D depth sensing depend on advanced optics and vision algorithms, that is, machine vision.
A VR device must be shielded so the real-world environment does not affect the display. VR displays can be LTPS TFT LCDs, LTPS AMOLED displays (lower cost, more suppliers), or emerging silicon-based OLED (micro OLED) displays. Using a single 5- to 6-inch display, about the size of a phone screen, shared by both eyes is cost-effective; however, a dual-display design (separate left and right eyes) allows better interpupillary distance (IPD) adjustment and field of view (FOV).
In addition, given that users continuously watch computer-generated animation, low latency (smooth images that prevent blur) and high resolution (eliminating the screen-door effect) are the development directions for displays. The VR device's optics sit between the display and the user's eyes, so reducing their thickness (and thus the device's form factor) while maintaining display quality is challenging, even with optical designs such as the Fresnel lens.
As for AR displays, most are silicon-based microdisplays. Display technologies include liquid crystal on silicon (LCOS), digital light processing (DLP) or digital mirror device (DMD), laser beam scanning (LBS), silicon-based micro OLED, and silicon-based micro-LED (micro-LED on silicon). To resist interference from intense ambient light, an AR display must offer high brightness, above 10,000 nits (considering the loss through the waveguide, 100,000 nits is more ideal). Although they are not self-emissive, LCOS, DLP, and LBS can increase brightness by strengthening the light source (for example, with a laser).
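The 10K-nit versus 100K-nit figures above follow from simple loss arithmetic. Assuming, for illustration only, a waveguide that delivers about 10% of the source light to the eye (the efficiency value is hypothetical, not a measured figure):

```python
def brightness_at_eye_nits(source_nits: float, efficiency: float) -> float:
    """Luminance reaching the eye after optical losses in the combiner."""
    return source_nits * efficiency

# With a hypothetical 10%-efficient waveguide, a ~100K-nit source is
# needed to deliver the ~10K nits required against bright ambient light.
at_eye = brightness_at_eye_nits(100_000, 0.10)
```

The lower the combiner's efficiency, the brighter the source must be, which is why emitter brightness dominates the micro OLED versus micro-LED trade-off discussed next.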
For this reason, micro-LED may be preferred over micro OLED. In colorization and manufacturing, however, micro-LED technology is not as mature as micro OLED. Full-color (RGB) micro OLEDs can be made with WOLED technology (white OLED with RGB color filters), but there is no equally straightforward method for full-color micro-LEDs. Candidate approaches include Plessey's quantum-dot (QD) color conversion (in collaboration with Nanoco), Ostendo's Quantum Photonic Imager (QPI) RGB-stack design, and JBD's X-cube (a combination of three RGB chips).
If the Apple device is based on the video see-through (VST) approach, Apple can use mature micro OLED technology. If it is based on the optical see-through (OST) approach, it cannot avoid substantial ambient-light interference, and the brightness of micro OLED may be a limitation. Most AR devices face the same interference problem, which may be why the Microsoft HoloLens 2 chose LBS instead of micro OLED.
Designing the optical components (such as the waveguide or Fresnel lens) paired with a microdisplay is not necessarily simpler than creating the microdisplay itself. With the VST approach, Apple can use a pancake optical design to combine various microdisplays and optics. With the OST approach, it can choose a waveguide or birdbath optical design. The advantage of the waveguide design is a thinner, smaller form factor; however, waveguide optics deliver low optical efficiency for microdisplays and bring problems such as distortion, uniformity, color quality, and contrast. Diffractive optical elements (DOE), holographic optical elements (HOE), and reflective optical elements (ROE) are the main waveguide design approaches. Apple acquired Akonia Holographics in 2018 to obtain its optical expertise.