According to an example method, it is determined whether a difference between first acoustic data and second acoustic data exceeds a threshold. The first acoustic data is associated with a first client application in communication with an audio service. The second acoustic data is associated with a second client application. A first input audio signal associated with the first client application is received via the audio service. In accordance with the determination that the difference does not exceed the threshold, the second acoustic data is applied to the first input audio signal to produce a first output audio signal. In accordance with a determination that the difference exceeds the threshold, the first acoustic data is applied to the first input audio signal to produce the first output audio signal. The first output audio signal is presented to a user of a wearable head device in communication with the audio service.
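The selection logic above can be sketched in a few lines. This is a minimal illustration, not the patented implementation: the acoustic data are assumed to be per-band gain lists, the "difference" is taken as their mean absolute deviation, and applying acoustic data to a signal is modeled as element-wise gain — all of these representations are assumptions for illustration only.

```python
def select_output_signal(first_acoustic, second_acoustic, input_signal, threshold):
    """Choose which acoustic data to apply based on how much they differ.

    `first_acoustic` / `second_acoustic` are hypothetical per-band gain
    lists; the mean-absolute-deviation metric and the element-wise "filter"
    below are illustrative assumptions, not the patent's representation.
    """
    difference = sum(abs(a - b) for a, b in zip(first_acoustic, second_acoustic)) / len(first_acoustic)
    # Difference exceeds threshold: fall back to the first client's own data;
    # otherwise reuse the second client's data for the first input signal.
    chosen = first_acoustic if difference > threshold else second_acoustic
    return [s * g for s, g in zip(input_signal, chosen)]
```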
A63F 13/54 - Controlling output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
A virtual image generation system comprises a planar optical waveguide having opposing first and second faces, an in-coupling (IC) element configured for optically coupling a collimated light beam from an image projection assembly into the planar optical waveguide as an in-coupled light beam, a first orthogonal pupil expansion (OPE) element associated with the first face of the planar optical waveguide for splitting the in-coupled light beam into a first set of orthogonal light beamlets, a second orthogonal pupil expansion (OPE) element associated with the second face of the planar optical waveguide for splitting the in-coupled light beam into a second set of orthogonal light beamlets, and an exit pupil expansion (EPE) element associated with the planar optical waveguide for splitting the first and second sets of orthogonal light beamlets into an array of out-coupled light beamlets that exit the planar optical waveguide.
G02B 6/12 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings, of the optical waveguide type of the integrated circuit kind
G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
A method of fabricating an optical element includes providing a substrate, forming a castable material coupled to the substrate, and casting the castable material using a mold. The method also includes curing the castable material and removing the mold. The optical element includes a planar region and a clear aperture adjacent the planar region and characterized by an optical power.
Disclosed herein is an augmented reality (AR) system that provides information about purchasing alternatives to a user who is about to purchase an item or product (e.g., a target product) in a physical retail location. In some variations, offers to purchase the product and/or an alternative product are provided by the merchant and/or competitors via the AR system. An offer negotiation server (ONS) aggregates offer data provided by various external parties (EPs) and displays these offers to the user as the user is considering the purchase of a target product. In some variations, an AR system may be configured to facilitate the process of purchasing items at a retail location.
A head-mounted apparatus includes an eyepiece that includes a dynamic dimming assembly, and a frame mounting the eyepiece so that a user side of the eyepiece faces towards a user and a world side of the eyepiece, opposite the user side, faces away from the user. The dynamic dimming assembly selectively modulates an intensity of light transmitted parallel to an optical axis from the world side to the user side during operation. The dynamic dimming assembly includes a variable birefringence cell having multiple pixels, each having an independently variable birefringence; a first linear polarizer arranged on the user side of the variable birefringence cell, the first linear polarizer being configured to transmit light propagating parallel to the optical axis linearly polarized along a pass axis of the first linear polarizer orthogonal to the optical axis; a quarter wave plate arranged between the variable birefringence cell and the first linear polarizer, a fast axis of the quarter wave plate being arranged relative to the pass axis of the first linear polarizer to transform linearly polarized light transmitted by the first linear polarizer into circularly polarized light; and a second linear polarizer on the world side of the variable birefringence cell.
G02B 27/28 - Optical systems or apparatus not provided for by any of the groups, for polarising
G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
G02F 1/13363 - Birefringent elements, e.g. for optical compensation
G02F 1/139 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour, based on liquid crystals, e.g. single liquid crystal display cells, characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering, based on orientation effects in which the liquid crystals remain transparent
6.
EYE TRACKING BASED VIDEO TRANSMISSION AND COMPRESSION
A computer-implemented method includes receiving gaze information about an observer of a video stream; determining a video compression spatial map for the video stream based on the received gaze information and performance characteristics of a network connection with the observer; compressing the video stream according to the video compression spatial map; and sending the compressed video stream to the observer.
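The gaze-driven compression map above can be sketched as follows. This is a toy foveation model under stated assumptions: quality is full at the gaze point and falls off linearly with distance, and network performance is collapsed into a single `bandwidth_scale` factor; the function name and falloff law are illustrative, not the method claimed.

```python
import math

def compression_map(width, height, gaze_x, gaze_y, bandwidth_scale):
    """Toy foveated quality map: 1.0 at the gaze point, linearly falling
    with distance, scaled by a hypothetical network-bandwidth factor.
    Higher values mean lighter compression for that pixel."""
    max_dist = math.hypot(width, height)
    grid = []
    for y in range(height):
        row = []
        for x in range(width):
            falloff = 1.0 - math.hypot(x - gaze_x, y - gaze_y) / max_dist
            row.append(round(falloff * bandwidth_scale, 3))
        grid.append(row)
    return grid
```

A real encoder would quantise such a map into per-block quality parameters; here it simply documents the gaze-plus-network dependence described above.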
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06T 7/50 - Depth or shape recovery
G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
H04N 19/59 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving spatial sub-sampling or interpolation, e.g. alteration of picture size or resolution
Systems and methods of presenting an output audio signal to a listener located at a first location in a virtual environment are disclosed. According to embodiments of a method, an input audio signal is received. For each sound source of a plurality of sound sources in the virtual environment, a respective first intermediate audio signal corresponding to the input audio signal is determined, based on a location of the respective sound source in the virtual environment, and the respective first intermediate audio signal is associated with a first bus. For each of the sound sources of the plurality of sound sources in the virtual environment, a respective second intermediate audio signal is determined. The respective second intermediate audio signal corresponds to a reflection of the input audio signal in a surface of the virtual environment. The respective second intermediate audio signal is determined based on a location of the respective sound source, and further based on an acoustic property of the virtual environment. The respective second intermediate audio signal is associated with a second bus. The output audio signal is presented to the listener via the first bus and the second bus.
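The two-bus structure above (a direct bus per source plus a reflection bus shaped by the environment's acoustics) can be sketched as below. The 1/d distance attenuation, the single absorption coefficient, and the scalar signal model are all illustrative assumptions standing in for real spatial-audio rendering.

```python
def render_listener_signal(input_signal, sources, room_absorption):
    """Mix a direct bus and a reflection bus for a scalar input signal.

    `sources` is a hypothetical list of (direct_distance,
    reflected_path_distance) pairs; `room_absorption` in [0, 1] is the
    assumed acoustic property of the reflecting surface.
    """
    direct_bus = 0.0
    reflection_bus = 0.0
    for distance, reflected_distance in sources:
        # First intermediate signal: direct path, 1/d attenuation (assumed).
        direct_bus += input_signal / max(distance, 1.0)
        # Second intermediate signal: reflection, further scaled by absorption.
        reflection_bus += (1.0 - room_absorption) * input_signal / max(reflected_distance, 1.0)
    return direct_bus + reflection_bus
```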
G10K 15/10 - Arrangements for producing a reverberation or echo sound using time-delay networks comprising electromechanical or electro-acoustic devices
H04R 3/04 - Circuits for transducers for correcting frequency response
H04R 3/12 - Circuits for transducers for distributing signals to two or more loudspeakers
H04R 5/033 - Headphones for stereophonic communication
This disclosure describes a wearable display system configured to project light to the eye(s) of a user to display virtual (e.g., augmented reality) image content in a vision field of the user. The system can include light source(s) that output light, spatial light modulator(s) that modulate the light to provide the virtual image content, and an eyepiece configured to convey the modulated light toward the eye(s) of the user. The eyepiece can include waveguide(s) and a plurality of in-coupling optical elements arranged on or in the waveguide(s) to in-couple the modulated light received from the spatial light modulator(s) into the waveguide(s) to be guided toward the user's eye(s). The spatial light modulator(s) may be movable, and/or may include movable components, to direct different portions of the modulated light toward different ones of the in-coupling optical elements at different times.
Embodiments of this disclosure provide systems and methods for displays. In embodiments, a display system includes a frame, an eyepiece coupled to the frame, and a first adhesive bond disposed between the frame and the eyepiece. The eyepiece can include a light input region and a light output region. The first adhesive bond can be disposed along a first portion of a perimeter of the eyepiece, where the first portion of the perimeter of the eyepiece borders the light input region such that the first adhesive bond is configured to maintain a position of the light input region relative to the frame.
The invention provides a content provisioning system. A mobile device has a mobile device processor. The mobile device has a mobile device communication interface connected to the mobile device processor and to a first resource device communication interface, under the control of the mobile device processor, to receive first content transmitted by a first resource device transmitter. The mobile device also has a mobile device output device connected to the mobile device processor and, under the control of the mobile device processor, capable of providing an output that can be sensed by a user.
A method of processing an audio signal is disclosed. According to embodiments of the method, magnitude response information of a prototype filter is determined. The magnitude response information includes a plurality of gain values, at least one of which includes a first gain corresponding to a first frequency. The magnitude response information of the prototype filter is stored. The magnitude response information of the prototype filter at the first frequency is retrieved. Gains are computed for a plurality of control frequencies based on the retrieved magnitude response information of the prototype filter at the first frequency, and the computed gains are applied to the audio signal.
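A minimal sketch of the store-then-derive flow above. The prototype filter is assumed to be a dictionary from frequency to gain, and the rule for computing control-frequency gains (scaling relative to the stored gain at the first frequency) is an illustrative assumption, as is the broadband gain application at the end.

```python
def control_gains(prototype_db, first_freq, control_freqs):
    """Derive gains at control frequencies from the prototype filter's
    stored magnitude response, relative to its gain at `first_freq`.
    The proportional-scaling rule is a hypothetical choice."""
    reference = prototype_db[first_freq]  # retrieve stored response at first_freq
    return {f: prototype_db.get(f, reference) / reference for f in control_freqs}

def apply_gains(samples, gain):
    # Toy application step: one broadband gain over a time-domain signal.
    return [s * gain for s in samples]
```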
An augmented reality viewer includes components, assemblies, and executable logic to provide a user with the perception of rich augmented reality experiences, including aspects of the aquatic world.
A pupil separation system includes a first optical element disposed at a first position along an axis and including a first input surface, a first beamsplitter optically coupled to the first input surface, and a first output surface optically coupled to the first beamsplitter. The pupil separation system also includes a second optical element disposed at a second position along the axis and including a second input surface, a second beamsplitter optically coupled to the second input surface, and a second output surface optically coupled to the second beamsplitter. The pupil separation system further includes a third optical element disposed at a third position along the axis and including a third input surface, a reflective surface optically coupled to the third input surface, and a third output surface optically coupled to the reflective surface.
G06F 18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
G06F 18/22 - Matching criteria, e.g. proximity measures
A user may interact with and select positions in three-dimensional space arbitrarily through conversion of a two-dimensional positional input into a three-dimensional point in space. The system may allow a user to use one or more user input devices for pointing, annotating, or drawing on virtual objects, real objects, or empty space in reference to the location of the three-dimensional point in space within an augmented reality or mixed reality session.
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor, with detection of the device orientation or free movement in a three-dimensional [3D] space, e.g. 3D mice, six-degrees-of-freedom [6-DOF] pointing devices using gyroscopes, accelerometers or tilt sensors
G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
G06F 3/04815 - Interaction with a metaphor-based environment or object displayed three-dimensionally, e.g. changing the user viewpoint with respect to the environment or object
G06T 19/00 - Manipulating 3D models or images for computer graphics
A method is disclosed, the method comprising the steps of identifying a first real object in a mixed reality environment, the mixed reality environment having a user; identifying a second real object in the mixed reality environment; generating, in the mixed reality environment, a first virtual object corresponding to the second real object; identifying, in the mixed reality environment, a collision between the first real object and the first virtual object; determining a first attribute associated with the collision; determining, based on the first attribute, a first audio signal corresponding to the collision; and presenting to the user, via one or more speakers, the first audio signal.
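The attribute-to-audio mapping in the final steps above can be sketched as below. The specific attributes (impact speed, material label), the material-to-frequency table, and the linear amplitude rule are all hypothetical stand-ins for whatever attributes the method actually derives from a collision.

```python
def collision_audio(impact_speed, material):
    """Map a collision attribute (assumed: impact speed and a material
    label) to a toy audio descriptor (amplitude, centre frequency in Hz).
    The table and the linear amplitude law are illustrative assumptions."""
    centre_freq = {"wood": 400.0, "metal": 2000.0, "glass": 3000.0}
    amplitude = min(1.0, impact_speed / 10.0)  # clamp to full scale
    return amplitude, centre_freq.get(material, 1000.0)
```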
A method of forming a reconstructed image includes receiving an image having a plurality of lines of pixel data and forming a mask for each line. Each line is defined by a plurality of pixel groups. The method includes, for each pixel group of the plurality of pixel groups, defining a first bit if pixels in the pixel group are characterized by pixel values less than a brightness threshold and defining a second bit if pixels in the pixel group are characterized by pixel values greater than or equal to the brightness threshold. The method further includes providing pixel values for pixels in pixel groups having the second bit, storing the mask and the provided pixel values for each line in a memory, extracting the mask and the provided pixel values for each line from the memory, and forming a reconstructed image using the mask and the provided pixel values.
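The per-line mask scheme above can be sketched directly: dark groups contribute only a mask bit, while bright groups contribute a bit plus their pixel values, and reconstruction inverts the process. Bit ordering (0 for dark, 1 for bright) and zero-fill for dark groups are illustrative assumptions.

```python
def compress_line(pixels, group_size, threshold):
    """Build the mask and store values only for bright groups: bit 0 when
    every pixel in the group is below `threshold`, bit 1 otherwise."""
    mask, stored = [], []
    for i in range(0, len(pixels), group_size):
        group = pixels[i:i + group_size]
        if all(p < threshold for p in group):
            mask.append(0)           # dark group: mask bit only
        else:
            mask.append(1)           # bright group: keep its pixel values
            stored.extend(group)
    return mask, stored

def reconstruct_line(mask, stored, group_size):
    """Rebuild the line: bright groups are read back from the stored
    values in order; dark groups are filled with zeros (assumed)."""
    pixels, idx = [], 0
    for bit in mask:
        if bit:
            pixels.extend(stored[idx:idx + group_size])
            idx += group_size
        else:
            pixels.extend([0] * group_size)
    return pixels
```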
H04N 1/64 - Systems for the transmission or the recording of the colour picture signal; Details thereof, e.g. coding or decoding means therefor
H04N 19/593 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving spatial prediction techniques
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06N 5/04 - Inference or reasoning models
H04N 25/535 - Control of the integration time by using differing integration times for different sensor regions by dynamic region selection
H04N 1/41 - Bandwidth or redundancy reduction
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
17.
TEMPERATURE DEPENDENT CALIBRATION OF MOVEMENT DETECTION DEVICES
An electronics system has a board with a thermal interface having an exposed surface. A thermoelectric device is placed against the thermal interface to heat the board. Heat transfers through the board from a first region where the thermal interface is located to a second region where an electronics device is mounted. The electronics device has a temperature sensor that detects the temperature of the electronics device. The temperature of the electronics device is used to calibrate an accelerometer and a gyroscope in the electronics device. Calibration data includes a temperature and a corresponding acceleration offset and a corresponding angle offset. A field computer simultaneously senses a temperature, an acceleration and an angle from the temperature sensor, accelerometer and gyroscope and adjusts the measured data with the offset data at the same temperature. The field computer provides corrected data to a controlled system.
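The correction step described above can be sketched as below. The calibration table shape (temperature mapped to an acceleration offset and an angle offset) follows the abstract; the nearest-neighbour temperature lookup is an illustrative assumption, since a real system might interpolate between calibration points instead.

```python
def correct_reading(temp, accel, angle, calibration):
    """Subtract the offsets recorded for the nearest calibrated temperature
    from the live accelerometer and gyroscope readings.

    `calibration` maps temperature -> (accel_offset, angle_offset);
    nearest-neighbour lookup (rather than interpolation) is assumed.
    """
    nearest = min(calibration, key=lambda t: abs(t - temp))
    accel_off, angle_off = calibration[nearest]
    return accel - accel_off, angle - angle_off
```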
G01C 25/00 - Manufacturing, calibrating, cleaning or repairing instruments or devices referred to in the other groups of this subclass
B81B 7/02 - Microstructural systems containing distinct electrical or optical devices of particular importance for their function, e.g. microelectromechanical systems [MEMS]
An augmented reality (AR) system includes a wearable device comprising a display inside the wearable device and operable to display virtual content and an imaging device mounted to the wearable device. The AR system also includes a handheld device comprising handheld fiducials affixed to the handheld device and a computing apparatus configured to perform localization of the handheld device with respect to the wearable device.
Examples of the disclosure describe systems and methods for generating and displaying a virtual companion. In an example method, a first input from an environment of a user is received at a first time via a first sensor. An occurrence of an event in the environment is determined based on the first input. A second input from the user is received via a second sensor, and an emotional reaction of the user is identified based on the second input. An association is determined between the emotional reaction and the event. A view of the environment is presented at a second time later than the first time via a display. A stimulus is presented at the second time via a virtual companion displayed via the display, wherein the stimulus is determined based on the determined association between the emotional reaction and the event.
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06F 21/53 - Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems, during program execution, e.g. stack integrity, buffer overflow or preventing unwanted data erasure, by executing in a restricted environment, e.g. sandbox or secure virtual machine
A method of operating an eyepiece waveguide having a first diffractive region and a second diffractive region includes directing light from a first projector to impinge on a first incoupling grating (ICG) and directing light from a second projector to impinge on a second ICG. Light from the first projector is diffracted into a first portion of the first diffractive region, diffracted into the second diffractive region, and subsequently diffracted out of the eyepiece waveguide. Light from the second projector is diffracted into a third portion of the second diffractive region, diffracted into the first portion of the first diffractive region, and subsequently diffracted out of the eyepiece waveguide.
A method of operating an imaging system to display an image includes providing a light source having an emission area and a light-guiding optical element having an output field of view. The method also includes activating a segment of the light source, causing the light source to produce a light beam that illuminates a portion of the emission area, encoding the light beam with image data, thereby producing an encoded light beam, focusing the encoded light beam onto the light-guiding optical element, and projecting the encoded light beam from a portion of the output field of view of the light-guiding optical element that corresponds to the portion of the emission area of the light source.
G02B 30/24 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic effects, by providing first and second parallax images to each of an observer's left and right eyes, of the stereoscopic type, involving temporal multiplexing, e.g. using sequentially activated left and right shutters
G03H 1/22 - Processes or apparatus for obtaining an optical image from a hologram
G03H 1/26 - Processes or apparatus specially adapted to produce multiple holograms or to obtain images from them, e.g. processes for multicolour holography
H04N 9/31 - Projection devices for colour picture display
H04N 23/56 - Cameras or camera modules comprising electronic image sensors; Control thereof, provided with illuminating means
22.
METHOD AND SYSTEM FOR VARIABLE OPTICAL THICKNESS WAVEGUIDES FOR AUGMENTED REALITY DEVICES
An augmented reality device includes a projector, projector optics optically coupled to the projector, and a substrate structure including a substrate having an incident surface and an opposing exit surface and a first variable thickness film coupled to the incident surface. The substrate structure can also include a first combined pupil expander coupled to the first variable thickness film, a second variable thickness film coupled to the opposing exit surface, an incoupling grating coupled to the opposing exit surface, and a second combined pupil expander coupled to the opposing exit surface.
G02B 6/13 - Integrated optical circuits characterised by the manufacturing method
F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
G02B 6/12 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings, of the optical waveguide type of the integrated circuit kind
G02B 6/122 - Basic optical elements, e.g. light-guiding paths
G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
23.
METHOD OF WAKING A DEVICE USING SPOKEN VOICE COMMANDS
Disclosed herein are systems and methods for processing speech signals in mixed reality applications. A method may include receiving an audio signal; determining, via first processors, whether the audio signal comprises a voice onset event; in accordance with a determination that the audio signal comprises the voice onset event: waking a second one or more processors; determining, via the second processors, that the audio signal comprises a predetermined trigger signal; in accordance with a determination that the audio signal comprises the predetermined trigger signal: waking third processors; performing, via the third processors, automatic speech recognition based on the audio signal; and in accordance with a determination that the audio signal does not comprise the predetermined trigger signal: forgoing waking the third processors; and in accordance with a determination that the audio signal does not comprise the voice onset event: forgoing waking the second processors.
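The staged wake pipeline above (cheap onset detector gates a trigger detector, which gates full speech recognition) can be sketched as below. Modeling the three processor stages as plain callbacks, and the return labels, are illustrative assumptions; the point is only that each later, more expensive stage runs solely when the earlier one fires.

```python
def process_audio(frame, detect_onset, detect_trigger, run_asr):
    """Staged wake pipeline: later processors stay asleep unless
    earlier stages fire. Stage callbacks are hypothetical stand-ins
    for the first, second, and third processors."""
    if not detect_onset(frame):
        return "asleep"        # no voice onset: forgo waking stage two
    if not detect_trigger(frame):
        return "onset_only"    # stage two woke but found no trigger signal
    return run_asr(frame)      # stage three performs full speech recognition
```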
Augmented reality and virtual reality display systems and devices are configured for efficient use of projected light. In some aspects, a display system includes a light projection system and a head-mounted display configured to project light into an eye of the user to display virtual image content. The head-mounted display includes at least one waveguide comprising a plurality of in-coupling elements, each configured to receive, from the light projection system, light corresponding to a portion of the user's field of view and to in-couple the light into the waveguide; and a plurality of out-coupling elements configured to out-couple the light out of the waveguide to display the virtual content, wherein each of the out-coupling elements is configured to receive light from a different one of the in-coupling elements.
A foveated display for projecting an image to an eye of a viewer is provided. The foveated display includes a first projector and a dynamic eyepiece optically coupled to the first projector. The dynamic eyepiece comprises a waveguide having a variable surface profile. The foveated display also includes a second projector and a fixed depth plane eyepiece optically coupled to the second projector.
G02B 30/52 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic effects, the image being built up from image elements distributed over a 3D volume, e.g. voxels, the 3D volume being constructed from a stack or sequence of 2D planes, e.g. depth sampling systems
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
Examples of an imaging system for use with a head mounted display (HMD) are disclosed. The imaging system can include a forward-facing imaging camera and a surface of a display of the HMD can include an off-axis diffractive optical element (DOE) or hot mirror configured to reflect light to the imaging camera. The DOE or hot mirror can be segmented. The imaging system can be used for eye tracking, biometric identification, multiscopic reconstruction of the three-dimensional shape of the eye, etc.
A61B 3/10 - Apparatus for the optical examination of the eyes; Apparatus for the clinical examination of the eyes, of objective types, i.e. instruments for examining the eyes independent of the patient's perceptions or reactions
A61B 3/113 - Apparatus for the optical examination of the eyes; Apparatus for the clinical examination of the eyes, of objective types, i.e. instruments for examining the eyes independent of the patient's perceptions or reactions, for determining or recording eye movement
A61B 3/12 - Apparatus for the optical examination of the eyes; Apparatus for the clinical examination of the eyes, of objective types, i.e. instruments for examining the eyes independent of the patient's perceptions or reactions, for looking at the eye fundus, e.g. ophthalmoscopes
A61B 3/14 - Arrangements specially adapted for eye photography
A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
A61B 5/11 - Measuring movement of the whole body or parts thereof, e.g. head or hand tremor or mobility of a limb
Methods and systems for triggering presentation of virtual content based on sensor information. The display system may be an augmented reality display system configured to provide virtual content on a plurality of depth planes using different wavefront divergences. The system may monitor information detected via the sensors, and based on the monitored information, trigger access to virtual content identified in the sensor information. Virtual content can be obtained, and presented as augmented reality content via the display system. The system may monitor information detected via the sensors to identify a QR code, or a presence of a wireless beacon. The QR code or wireless beacon can trigger the display system to obtain virtual content for presentation.
A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
G06T 19/00 - Manipulating 3D models or images for computer graphics
28.
OPTICAL LAYERS TO IMPROVE PERFORMANCE OF EYEPIECES FOR USE WITH VIRTUAL AND AUGMENTED REALITY DISPLAY SYSTEMS
Improved diffractive optical elements for use in an eyepiece for an extended reality system are disclosed. The diffractive optical elements comprise a diffraction structure having a waveguide substrate, a surface grating positioned on a first side of the waveguide substrate, and one or more optical layer pairs disposed between the waveguide substrate and the surface grating. Each optical layer pair comprises a low index layer and a high index layer disposed directly on an exterior side of the low index layer.
In some embodiments, an augmented reality system includes at least one waveguide that is configured to receive and redirect light toward a user, and is further configured to allow ambient light from an environment of the user to pass therethrough toward the user. The augmented reality system also includes a first adaptive lens assembly positioned between the at least one waveguide and the environment, a second adaptive lens assembly positioned between the at least one waveguide and the user, and at least one processor operatively coupled to the first and second adaptive lens assemblies. Each lens assembly of the augmented reality system is selectively switchable between at least two different states in which the respective lens assembly is configured to impart at least two different optical powers to light passing therethrough, respectively. The at least one processor is configured to cause the first and second adaptive lens assemblies to synchronously switch between different states in a manner such that the first and second adaptive lens assemblies impart a substantially constant net optical power to ambient light from the environment passing therethrough.
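The constant-net-power constraint above reduces to a simple complement rule: whatever optical power the user-side lens assembly applies to displayed content, the world-side assembly is switched to the complementary power so that ambient light sees an unchanged total. The sketch below assumes additive thin-lens powers and a zero nominal net power, both of which are illustrative simplifications.

```python
def lens_states(virtual_power, net_power=0.0):
    """Return synchronized (user_side, world_side) lens powers so that
    ambient light passing through both sees `net_power` overall.
    Additive power combination is an assumed simplification."""
    world_side = net_power - virtual_power
    return virtual_power, world_side
```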
G02F 1/01 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
G02F 1/13 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour, based on liquid crystals, e.g. single liquid crystal display cells
G02F 1/1337 - Surface-induced orientation of the liquid crystal molecules, e.g. by alignment layers
G02F 1/29 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the position or the direction of light beams, i.e. deflection
30.
ANGULARLY SELECTIVE ATTENUATION OF LIGHT TRANSMISSION ARTIFACTS IN WEARABLE DISPLAYS
A wearable display system includes an eyepiece stack having a world side and a user side opposite the world side. During use, a user positioned on the user side views displayed images delivered by the wearable display system via the eyepiece stack, which augment the user's field of view of the user's environment. The system also includes an optical attenuator arranged on the world side of the eyepiece stack, the optical attenuator having a layer of a birefringent material having a plurality of domains, each having a principal optic axis oriented in a corresponding direction different from the direction of other domains. Each domain of the optical attenuator reduces transmission of visible light incident on the optical attenuator for a corresponding different range of angles of incidence.
G02B 27/28 - Optical systems or apparatus not provided for by any of the groups, for polarising
G02F 1/01 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics; for the control of the intensity, phase, polarisation or colour
G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
G02F 1/13363 - Birefringent elements, e.g. for optical compensation
G02F 1/1337 - Surface-induced orientation of the liquid crystal molecules, e.g. by alignment layers
G02F 1/139 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics; for the control of the intensity, phase, polarisation or colour; based on liquid crystals, e.g. single liquid crystal display cells; characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering; based on orientation effects in which the liquid crystal remains transparent
A head mounted display system configured to project a first image to an eye of a user, the head mounted display system includes at least one waveguide comprising a first major surface, a second major surface opposite the first major surface, and a first edge and a second edge between the first major surface and second major surface. The at least one waveguide also includes a first reflector disposed between the first major surface and the second major surface. The head mounted display system also includes at least one light source disposed closer to the first major surface than the second major surface and a spatial light modulator configured to form a second image and disposed closer to the first major surface than the second major surface, wherein the first reflector is configured to reflect light toward the spatial light modulator.
An apparatus including a set of three illumination sources disposed in a first plane. Each of the set of three illumination sources is disposed at a position in the first plane offset from others of the set of three illumination sources by 120 degrees measured in polar coordinates. The apparatus also includes a set of three waveguide layers disposed adjacent the set of three illumination sources. Each of the set of three waveguide layers includes an incoupling diffractive element disposed at a lateral position offset by 180 degrees from a corresponding illumination source of the set of three illumination sources.
H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
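The stated geometry of the apparatus, three sources 120 degrees apart in polar coordinates, with each waveguide's incoupling element offset 180 degrees from its corresponding source, can be checked numerically. The radius value used below is a hypothetical illustration.

```python
import math


def source_positions(radius: float, start_deg: float = 0.0):
    """Cartesian (x, y) positions of three illumination sources in a plane,
    offset from one another by 120 degrees measured in polar coordinates."""
    angles = [start_deg + 120.0 * i for i in range(3)]
    return [(radius * math.cos(math.radians(a)),
             radius * math.sin(math.radians(a))) for a in angles]


def incoupling_positions(radius: float, start_deg: float = 0.0):
    """Each waveguide's incoupling diffractive element sits at a lateral
    position offset by 180 degrees from its corresponding source."""
    return source_positions(radius, start_deg + 180.0)


# A 180-degree offset means each incoupling element is diametrically opposite
# its source, i.e. at the negated coordinates.
for (sx, sy), (ix, iy) in zip(source_positions(1.0), incoupling_positions(1.0)):
    assert abs(sx + ix) < 1e-9 and abs(sy + iy) < 1e-9
```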
33.
VIRTUAL AND AUGMENTED REALITY DISPLAY SYSTEMS WITH EMISSIVE MICRO-DISPLAYS
A wearable display system includes one or more emissive micro-displays, e.g., micro-LED displays. The micro-displays may be monochrome micro-displays or full-color micro-displays. The micro-displays may include arrays of light emitters. Light collimators may be utilized to narrow the angular emission profile of light emitted by the light emitters. Where a plurality of emissive micro-displays is utilized, the micro-displays may be positioned at different sides of an optical combiner, e.g., an X-cube prism which receives light rays from different micro-displays and outputs the light rays from the same face of the cube. The optical combiner directs the light to projection optics, which outputs the light to an eyepiece that relays the light to a user's eye. The eyepiece may output the light to the user's eye with different amounts of wavefront divergence, to place virtual content on different depth planes.
G02B 27/09 - Beam shaping, e.g. changing the cross-sectional area, not otherwise provided for
G02B 27/10 - Beam splitting or combining systems
G02B 27/14 - Beam splitting or combining systems operating by reflection only
G02B 27/18 - Optical systems or apparatus not provided for by any of the groups, for optical projection, e.g. combination of mirror, condenser and objective
G02B 27/40 - Auxiliary optical means for focusing
G02B 27/62 - Optical apparatus specially adapted for adjusting optical elements during the assembly of optical systems
G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
G09G 3/32 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, using controlled light sources, using semiconductor electroluminescent panels, e.g. using light-emitting diodes [LED]
H02N 2/02 - Electric machines in general using piezoelectric effect, electrostriction or magnetostriction, producing linear motion, e.g. actuators; Linear positioners
Apparatus and methods for displaying an image by a rotating structure are provided. The rotating structure can comprise blades of a fan. The fan can be a cooling fan for an electronics device such as an augmented reality display. In some embodiments, the rotating structure comprises light sources that emit light to generate the image. The light sources can comprise light-field emitters. In other embodiments, the rotating structure is illuminated by an external (e.g., non-rotating) light source.
G09G 3/02 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, by tracing or scanning a light beam on a screen
F04D 17/16 - Centrifugal pumps for displacing without appreciable compression
F04D 25/08 - Units comprising pumps and their driving means, the working fluid being air, e.g. for ventilation
F04D 29/00 - Component parts, details or accessories
F04D 29/42 - Casings; Connecting means for the working fluid, for radial or helico-centrifugal pumps
G02B 30/56 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images, the image being built up from image elements distributed over a 3D volume, e.g. voxels, by projecting an aerial or floating image
G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
Head-mounted augmented reality (AR) devices can track pose of a wearer's head to provide a three-dimensional virtual representation of objects in the wearer's environment. An electromagnetic (EM) tracking system can track head or body pose. A handheld user input device can include an EM emitter that generates an EM field, and the head-mounted AR device can include an EM sensor that senses the EM field. EM information from the sensor can be analyzed to determine location and/or orientation of the sensor and thereby the wearer's pose. The EM emitter and sensor may utilize time division multiplexing (TDM) or dynamic frequency tuning to operate at multiple frequencies. Voltage gain control may be implemented in the transmitter, rather than the sensor, allowing smaller and lighter weight sensor designs. The EM sensor can implement noise cancellation to reduce the level of EM interference generated by nearby audio speakers.
G01S 1/68 - Marker, boundary, call-sign or like beacons transmitting signals not carrying directional information
G01S 1/70 - Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith, using electromagnetic waves other than radio waves
G01S 5/02 - Position-fixing by co-ordinating two or more direction or position-line determinations; Position-fixing by co-ordinating two or more distance determinations, using radio waves
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
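The time-division-multiplexing scheme mentioned above, in which the EM emitter and sensor take turns operating at multiple frequencies, might be scheduled as in this sketch. The slot length and frequency values are hypothetical illustrations; the abstract does not specify them.

```python
def tdm_schedule(frequencies, slot_ms):
    """Round-robin time slots so emitter and sensor can operate at multiple
    frequencies without mutual interference (time-division multiplexing)."""
    return [(i * slot_ms, f) for i, f in enumerate(frequencies)]


def active_frequency(t_ms, frequencies, slot_ms):
    """Frequency in use at time t_ms under the repeating TDM schedule."""
    cycle = slot_ms * len(frequencies)
    return frequencies[int((t_ms % cycle) // slot_ms)]


# Hypothetical pair of operating frequencies (Hz) with 10 ms slots.
schedule = tdm_schedule([27000, 33000], 10)
```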
A method of presenting a signal to a speech processing engine is disclosed. According to an example of the method, an audio signal is received via a microphone. A portion of the audio signal is identified, and a probability is determined that the portion comprises speech directed by a user of the speech processing engine as input to the speech processing engine. In accordance with a determination that the probability exceeds a threshold, the portion of the audio signal is presented as input to the speech processing engine. In accordance with a determination that the probability does not exceed the threshold, the portion of the audio signal is not presented as input to the speech processing engine.
G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialogue
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G10L 15/14 - Speech classification or search using statistical models, e.g. Hidden Markov Models [HMMs]
G10L 15/25 - Speech recognition using non-acoustical features, using position of the lips, movement of the lips or face analysis
G10L 15/30 - Distributed recognition, e.g. in client-server systems, for mobile phones or network applications
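The thresholded gating step of the method above can be sketched as follows. The toy energy-based classifier merely stands in for the directed-speech probability model, which the abstract does not specify; the portion schema is also invented for illustration.

```python
def directed_speech_probability(portion) -> float:
    """Toy stand-in classifier: pretend higher-energy portions are more
    likely to be speech directed at the engine (a hypothetical heuristic)."""
    return min(1.0, portion.get("energy", 0.0))


def gate_for_engine(portions, threshold=0.5,
                    classify=directed_speech_probability):
    """Present to the speech processing engine only those portions whose
    probability of being user-directed input exceeds the threshold."""
    return [p for p in portions if classify(p) > threshold]


# Hypothetical portions: one likely directed at the engine, one background.
portions = [{"id": 1, "energy": 0.9}, {"id": 2, "energy": 0.2}]
kept = gate_for_engine(portions)
```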
In some embodiments, eye tracking is used on an AR or VR display system to determine if a user of the display system is blinking or otherwise cannot see. In response, current drain or power usage of a display associated with the display system may be reduced, for example, by dimming or turning off a light source associated with the display, or by configuring a graphics driver to skip a designated number of frames or reduce a refresh rate for a designated period of time.
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06F 1/3231 - Monitoring the presence, absence or movement of users
G06F 1/3234 - Power saving characterised by the action undertaken
G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
G09G 3/34 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, by control of light from an independent source
G09G 3/36 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, by control of light from an independent source, using liquid crystals
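The blink-triggered power-saving logic described above might reduce to a mapping like this sketch. The openness threshold, frame-skip count, and command schema are hypothetical values, not from the source.

```python
def display_command(eye_openness: float, blink_threshold: float = 0.2,
                    frames_to_skip: int = 3):
    """Map an eye-tracking measurement (0 = closed, 1 = fully open) to a
    power-saving action: dim the light source and skip frames while the
    user is blinking or otherwise cannot see."""
    if eye_openness < blink_threshold:
        return {"dim_light_source": True, "skip_frames": frames_to_skip}
    return {"dim_light_source": False, "skip_frames": 0}
```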
38.
SYSTEMS AND METHODS FOR CROSS-APPLICATION AUTHORING, TRANSFER, AND EVALUATION OF RIGGING CONTROL SYSTEMS FOR VIRTUAL CHARACTERS
Various examples of cross-application systems and methods for authoring, transferring, and evaluating rigging control systems for virtual characters are disclosed. Embodiments of a method include the steps or processes of creating, in a first application which implements a first rigging control protocol, a rigging control system description; writing the rigging control system description to a data file; and initiating transfer of the data file to a second application. In such embodiments, the rigging control system description may be defined according to a different second rigging control protocol. The rigging control system description may specify a rigging control input, such as a lower-order rigging element (e.g., a core skeleton for a virtual character), and at least one rule for operating on the rigging control input to produce a rigging control output, such as a higher-order skeleton or other higher-order rigging element.
Various embodiments of a user-wearable device can comprise a frame configured to mount on a user. The device can include a display attached to the frame and configured to direct virtual images to an eye of the user. The device can also include a light source configured to provide polarized light to the eye of the user such that the polarized light reflects from the eye of the user. The device can further include a light analyzer configured to determine a polarization angle rotation of the reflected light from the eye of the user such that a glucose level of the user can be determined based at least in part on the polarization angle rotation of the reflected light.
A61B 5/1455 - Measuring characteristics of blood in vivo, e.g. gas concentration or pH value, using optical sensors, e.g. spectral photometrical oximeters
A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
A61B 5/0205 - Simultaneously evaluating both cardiovascular condition and the condition of other parts of the body, e.g. heart and respiratory condition
A61B 5/024 - Measuring pulse rate or heart rate
A61B 5/08 - Measuring devices for examining the respiratory organs
A61B 5/145 - Measuring characteristics of blood in vivo, e.g. gas concentration or pH value
G06T 19/00 - Manipulating 3D models or images for computer graphics
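The glucose determination described above amounts to inverting Biot's law of optical rotation. This sketch assumes a simple linear polarimetry model; the specific-rotation constant is the textbook value for D-glucose, and the abstract does not commit to this exact formula or path length.

```python
def glucose_concentration(rotation_deg: float, path_length_dm: float,
                          specific_rotation: float = 52.7):
    """Biot's law: rotation = [a] * l * c, so c = rotation / ([a] * l).

    rotation_deg      -- measured polarization angle rotation, in degrees
    path_length_dm    -- optical path length through the sample, in decimeters
    specific_rotation -- [a] in deg * dm^-1 * (g/mL)^-1 (~52.7 for D-glucose)
    Returns concentration in g/mL.
    """
    return rotation_deg / (specific_rotation * path_length_dm)
```

In the wearable context the path length and effective specific rotation for light reflected from the eye would need calibration; the linear inversion itself is the core of the determination.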
40.
AUGMENTED REALITY SYSTEMS AND METHODS UTILIZING REFLECTIONS
A display system includes a wearable display device for displaying augmented reality content. The display device comprises a display area comprising light redirecting features that are configured to direct light to a user. The display area is at least partially transparent and is configured to provide a view of an ambient environment through the display area. The display device is configured to determine that a reflection of the user is within the user's field of view through the display area. After making this determination, augmented reality content is displayed in the display area, with the augmented reality content augmenting the user's view of the reflection. In some embodiments, the augmented reality content may overlie the user's view of the reflection, thereby allowing all or portions of the reflection to appear to be modified to provide a realistic view of the user with various modifications made to their appearance.
An optical device may include a wedge-shaped light turning element, a first surface that is parallel to a horizontal axis, a second surface opposite to the first surface that is inclined with respect to the horizontal axis by a wedge angle, and a light module including light emitters. The light module can be configured to combine light emitted by the emitters. The optical device can further include a light input surface that is between the first and the second surfaces and disposed with respect to the light module to receive light emitted from the emitters. The optical device may include an end reflector disposed on a side opposite the light input surface. Light coupled into the light turning element may be reflected by the end reflector and/or reflected from the second surface towards the first surface.
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
G02B 27/14 - Beam splitting or combining systems operating by reflection only
G02B 30/26 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images, by providing first and second parallax images to each of the left and right eyes of an observer, of the autostereoscopic type
G02B 30/52 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images, the image being built up from image elements distributed over a 3D volume, e.g. voxels, the 3D volume being constructed from a stack or sequence of 2D planes, e.g. depth sampling systems
G02F 1/137 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics; for the control of the intensity, phase, polarisation or colour; based on liquid crystals, e.g. single liquid crystal display cells; characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering
G03B 21/00 - Projectors or projection-type viewers; Accessories therefor
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
G06T 19/00 - Manipulating 3D models or images for computer graphics
G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
G09G 3/02 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, by tracing or scanning a light beam on a screen
G09G 3/24 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, using controlled light sources, using incandescent filaments
H04N 13/239 - Image signal generators using stereoscopic image cameras, using two 2D image sensors having a relative position equal to or related to the interocular distance
H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals, the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
42.
SYSTEMS AND METHODS FOR VIRTUAL AND AUGMENTED REALITY
Examples of the disclosure describe systems and methods relating to mobile computing. According to an example method, a first user location of a user of a mobile computing system is determined. A first communication device in proximity to the first user location is identified based on the first user location. A first signal is communicated to the first communication device. A first information payload based on the first user location is received from the first communication device, in response to the first communication device receiving the first signal. Video or audio data based on the first information payload is presented to the user at a first time during which the user is at the first user location.
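The example method might be exercised with an in-memory stand-in for the proximate communication devices. All device identifiers, locations, and payloads below are invented for illustration; the disclosure does not specify the discovery mechanism or payload format.

```python
# Hypothetical registry: device id -> (location it serves, payload it returns).
DEVICES = {
    "kiosk-a": ("lobby", {"video": "welcome.mp4"}),
    "kiosk-b": ("gallery", {"audio": "tour.mp3"}),
}


def nearest_device(user_location):
    """Identify a communication device in proximity to the user's location."""
    for dev, (loc, _) in DEVICES.items():
        if loc == user_location:
            return dev
    return None


def request_payload(device_id):
    """Stand-in for signalling the device and receiving its information payload,
    which would then be presented to the user as video or audio."""
    return DEVICES[device_id][1]
```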
Head mounted display systems configured to project light to an eye of a user to display augmented reality image content in a vision field of the user are disclosed. In embodiments, the system includes a frame configured to be supported on a head of the user, an image projector configured to project images into the user's eye, a camera coupled to the frame, a waveguide optically coupled to the camera, an optical coupling optical element, an out-coupling element configured to direct light emitted from the waveguide to the camera, and a first light source configured to direct light to the user's eye through the waveguide. Electronics control the camera to capture images periodically and further control the first light source to pulse in time with the camera such that light emitted by the light source has a reduced intensity when the camera is not capturing images.
An eyepiece waveguide for augmented reality applications includes a substrate and a set of incoupling diffractive optical elements coupled to the substrate. A first subset of the set of incoupling diffractive optical elements is operable to diffract light into the substrate along a first range of propagation angles and a second subset of the set of incoupling diffractive optical elements is operable to diffract light into the substrate along a second range of propagation angles. The eyepiece waveguide also includes a combined pupil expander diffractive optical element coupled to the substrate.
Examples of a light field metrology system for use with a display are disclosed. The light field metrology system may capture images of a projected light field and determine focus depths or lateral focus positions for various regions of the light field using the captured images. The determined focus depths or lateral positions may then be compared with intended focus depths or lateral positions to quantify the imperfections of the display. Based on the measured imperfections, an appropriate error correction may be performed on the light field to correct for the measured imperfections. The display can be an optical display element in a head mounted display, for example, an optical display element capable of generating multiple depth planes or a light field display.
G01B 11/14 - Measuring arrangements characterised by the use of optical techniques, for measuring distance or clearance between spaced objects or spaced apertures
G01B 11/22 - Measuring arrangements characterised by the use of optical techniques, for measuring depth
G06T 19/00 - Manipulating 3D models or images for computer graphics
G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
G09G 3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix
G09G 3/34 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, by control of light from an independent source
G09G 5/02 - Control arrangements and circuits for visual indicators common to cathode-ray tube indicators and other visual indicators, characterised by the way in which colour is displayed
H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
H04N 13/383 - Viewer tracking for gaze tracking, i.e. detecting the axis of vision of the viewer's eyes
H04N 13/395 - Volumetric displays, i.e. systems where the image is built up from image elements distributed through a volume, with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
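The compare-then-correct loop of the metrology system can be sketched per display region as follows. Treating the correction as a simple pre-compensation of commanded depths is one plausible reading of the abstract, not its stated implementation; the depth values are hypothetical.

```python
def measure_errors(measured_depths, intended_depths):
    """Per-region focus-depth error between the captured light field and the
    intended light field (positive = region focuses too far)."""
    return [m - i for m, i in zip(measured_depths, intended_depths)]


def corrected_targets(intended_depths, errors):
    """Pre-compensate: shift each region's commanded depth opposite the
    measured error so the displayed light field lands on the intended depth."""
    return [i - e for i, e in zip(intended_depths, errors)]


# Hypothetical two-region example: region 0 focuses 0.1 units too far.
errors = measure_errors([1.1, 2.0], [1.0, 2.0])
targets = corrected_targets([1.0, 2.0], errors)
```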
In some implementations, an optical device includes a one-way mirror formed by a polarization selective mirror and an absorptive polarizer. The absorptive polarizer has a transmission axis aligned with the transmission axis of the reflective polarizer. The one-way mirror may be provided on the world side of a head-mounted display system. Advantageously, the one-way mirror may reflect light from the world, which provides privacy and may improve the cosmetics of the display. In some implementations, the one-way mirror may include one or more of a depolarizer and a pair of opposing waveplates to improve alignment tolerances and reduce reflections to a viewer. In some implementations, the one-way mirror may form a compact integrated structure with a dimmer for reducing light transmitted to the viewer from the world.
A display system comprises a waveguide having light incoupling or light outcoupling optical elements formed of a metasurface. The metasurface is a multilevel (e.g., bi-level, tri-level, etc.) structure having a first level defined by spaced apart protrusions formed of a first optically transmissive material and a second optically transmissive material between the protrusions. The metasurface can also include a second level formed by the second optically transmissive material. The protrusions on the first level may be patterned by nanoimprinting the first optically transmissive material, and the second optically transmissive material may be deposited over and between the patterned protrusions. The widths of the protrusions and the spacing between the protrusions may be selected to diffract light, and a pitch of the protrusions may be 10-600 nm.
G02B 6/00 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
G02B 6/122 - Basic optical elements, e.g. light-guiding paths
G02B 6/293 - Optical coupling means having data bus means, i.e. plural waveguides interconnected and providing an inherently bidirectional system by mixing and splitting signals, with wavelength selective means
The disclosure describes an improved drop-on-demand, controlled volume technique for dispensing resist onto a substrate, which is then imprinted to create a patterned optical device suitable for use in optical applications such as augmented reality and/or mixed reality systems. The technique enables the dispensation of drops of resist at precise locations on the substrate, with precisely controlled drop volume corresponding to an imprint template having different zones associated with different total resist volumes. Controlled drop size and placement also provides for substantially less variation in residual layer thickness across the surface of the substrate after imprinting, compared to previously available techniques. The technique employs resist having a refractive index closer to that of the substrate index, reducing optical artifacts in the device. To ensure reliable dispensing of the higher index and higher viscosity resist in smaller drop sizes, the dispensing system can continuously circulate the resist.
F21V 8/00 - Use of light guides, e.g. optical fibre devices, in lighting devices or systems
G03F 7/00 - Photomechanical, e.g. photolithographic, production of textured or patterned surfaces, e.g. printing surfaces; Materials therefor, e.g. comprising photoresists; Apparatus specially adapted therefor
49.
DISPLAY SYSTEM WITH SPATIAL LIGHT MODULATOR ILLUMINATION FOR DIVIDED PUPILS
Illumination systems that separate different colors into laterally displaced beams may be used to direct different color image content into an eyepiece for displaying images in the eye. Such an eyepiece may be used, for example, for an augmented reality head mounted display. Illumination systems may be provided that utilize one or more waveguides to direct light from a light source towards a spatial light modulator. Light from the spatial light modulator may be directed towards an eyepiece. Some aspects of the invention provide for light of different colors to be outcoupled at different angles from the one or more waveguides and directed along different beam paths.
In an example method for forming a variable optical viewing optics assembly (VOA) for a head mounted display, a prepolymer is deposited onto a substrate having a first optical element for the VOA. Further, a mold is applied to the prepolymer to conform the prepolymer to a curved surface of the mold on a first side of the prepolymer and to conform the prepolymer to a surface of the substrate on a second side of the prepolymer opposite the first side. Further, the prepolymer is exposed to actinic radiation sufficient to form a solid polymer from the prepolymer, such that the solid polymer forms an ophthalmic lens having a curved surface corresponding to the curved surface of the mold, and the substrate and the ophthalmic lens form an integrated optical component. The mold is released from the solid polymer, and the VOA is assembled using the integrated optical component.
In one aspect, an optical device comprises a plurality of waveguides formed over one another and having formed thereon respective diffraction gratings, wherein the respective diffraction gratings are configured to diffract visible light incident thereon into respective waveguides, such that visible light diffracted into the respective waveguides propagates therewithin. The respective diffraction gratings are configured to diffract the visible light into the respective waveguides within respective fields of view (FOVs) with respect to layer normal directions of the respective waveguides. The respective FOVs are such that the plurality of waveguides are configured to diffract the visible light within a combined FOV that is continuous and greater than each of the respective FOVs.
G02B 6/10 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings, of the optical waveguide type
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
G02F 1/29 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics; for the control of the position or the direction of light beams, i.e. deflection
H04N 9/31 - Projection devices for colour picture display
52.
METHODS AND APPARATUSES FOR PROVIDING INPUT FOR HEAD-WORN IMAGE DISPLAY DEVICES
An apparatus for use with an image display device configured to be worn on the head of a user includes: a screen; and a processing unit configured to assign a first area of the screen to sense a finger action of the user, wherein the processing unit is configured to generate an electronic signal to cause a change in the content displayed by the display device based on the finger action of the user sensed by the assigned first area of the screen of the apparatus.
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
G06F 3/04886 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
G06T 19/00 - Manipulating 3D models or images for computer graphics
53.
IMAGING MODIFICATION, DISPLAY AND VISUALIZATION USING AUGMENTED AND VIRTUAL REALITY EYEWEAR
A display system can include a head-mounted display configured to project light to an eye of a user to display augmented reality image content to the user. The display system can include one or more user sensors configured to sense the user and can include one or more environmental sensors configured to sense surroundings of the user. The display system can also include processing electronics in communication with the display, the one or more user sensors, and the one or more environmental sensors. The processing electronics can be configured to sense a situation involving user focus, determine user intent for the situation, and alter user perception of a real or virtual object within the vision field of the user based at least in part on the user intent and/or sensed situation involving user focus. The processing electronics can be configured to at least one of enhance or de-emphasize the user perception of the real or virtual object within the vision field of the user.
A61B 17/00 - Surgical instruments, devices or methods
A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis, not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
A61B 90/50 - Supports for surgical instruments, e.g. articulated arms
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
G06T 19/00 - Manipulating 3D models or images for computer graphics
54.
HYBRID POLYMER WAVEGUIDE AND METHODS FOR MAKING THE SAME
In some embodiments, a head-mounted augmented reality display system comprises one or more hybrid waveguides configured to display images by directing modulated light containing image information into the eyes of a viewer. Each hybrid waveguide is formed of two or more layers of different materials. A first (e.g., thicker) layer is a highly optically transparent core layer, and a second (e.g., thinner) auxiliary layer includes a pattern of protrusions and indentations, e.g., to form a diffractive optical element. The pattern may be formed by imprinting. The hybrid waveguide may include additional layers, e.g., forming a plurality of alternating core layers and thinner patterned layers. Multiple waveguides may be stacked to form an integrated eyepiece, with each waveguide configured to receive and output light of a different component color.
G02B 1/04 - Optical elements characterised by the substance of which they are made; Optical coatings for optical elements made of organic materials, e.g. plastics
A viewing optics assembly for augmented reality includes a projector configured to generate image light and an eyepiece optically coupled to the projector. The eyepiece includes at least one eyepiece layer comprising a waveguide having a surface, an incoupling grating coupled to the waveguide, and an outcoupling grating coupled to the waveguide. The outcoupling grating comprises a first array of first ridges protruding from the surface of the waveguide, each of the first ridges having a first height in a direction perpendicular to the surface and a first width in a direction parallel to the surface and a plurality of second ridges, each of the plurality of second ridges protruding from a respective first ridge of the first ridges and having a second height and a second width. At least one of the first width or the second width varies as a function of position across the surface.
Disclosed are methods, systems, and articles of manufacture for managing and displaying web pages and web resources in a virtual three-dimensional (3D) space with an extended reality system. These techniques receive an input for a 3D transform of a web page or a web page panel therefor. In response to the input, a browser engine coupled to a processor of an extended reality system determines 3D transform data for the web page or the web page panel based at least in part upon the 3D transform of the web page or the web page panel, wherein the 3D transform comprises a change in 3D position, rotation, or scale of the web page or the web page panel therefor in a virtual 3D space. A universe browser engine may present contents of the web page in a virtual 3D space based at least in part upon the 3D transform data.
Systems and methods for rendering audio signals are disclosed. In some embodiments, a method may receive an input signal including a first portion and a second portion. A first processing stage comprising a first filter is applied to the first portion to generate a first filtered signal. A second processing stage comprising a second filter is applied to the first portion to generate a second filtered signal. A third processing stage comprising a third filter is applied to the second portion to generate a third filtered signal. A fourth processing stage comprising a fourth filter is applied to the second portion to generate a fourth filtered signal. A first output signal is determined based on a sum of the first filtered signal and the third filtered signal. A second output signal is determined based on a sum of the second filtered signal and the fourth filtered signal. The first output signal is presented to a first ear of a user of a virtual environment, and the second output signal is presented to a second ear of the user. The first portion of the input signal corresponds to a first location in the virtual environment, and the second portion of the input signal corresponds to a second location in the virtual environment.
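The two-source, four-filter mix described above can be sketched as follows. This is a minimal illustration assuming simple FIR filters applied by convolution; the function and parameter names are invented for the example and do not come from the disclosure.

```python
import numpy as np

def render_binaural(portion_a, portion_b, filters):
    """Mix two source portions into left/right output signals via four filters.

    filters = (f1, f2, f3, f4): FIR coefficient arrays (placeholders).
    Left  = f1 * portion_a + f3 * portion_b
    Right = f2 * portion_a + f4 * portion_b
    (per-portion filter pairs, summed per ear as in the described method)
    """
    f1, f2, f3, f4 = filters
    n = len(portion_a)  # assume equal-length portions for simplicity
    conv = lambda f, x: np.convolve(x, f)[:n]  # truncate to input length
    left = conv(f1, portion_a) + conv(f3, portion_b)
    right = conv(f2, portion_a) + conv(f4, portion_b)
    return left, right
```

With identity and zero filters the routing is easy to verify: each portion passes unchanged to one ear.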
The present disclosure relates to display systems and, more particularly, to augmented reality display systems including diffraction grating(s), and methods of fabricating same. A diffraction grating includes a plurality of different diffracting zones having a periodically repeating lateral dimension corresponding to a grating period adapted for light diffraction. The diffraction grating additionally includes a plurality of different liquid crystal layers corresponding to the different diffracting zones. The different liquid crystal layers include liquid crystal molecules that are aligned differently, such that the different diffracting zones have different optical properties associated with light diffraction.
Disclosed are dimming assemblies and display systems for reducing artifacts produced by optically-transmissive displays. A system may include a substrate upon which a plurality of electronic components are disposed. The electronic components may include a plurality of pixels, a plurality of conductors, and a plurality of circuit modules. The plurality of pixels may be arranged in a two-dimensional array, with each pixel having a two-dimensional geometry corresponding to a shape with at least one curved side. The plurality of conductors may be arranged adjacent to the plurality of pixels. The system may also include control circuitry electrically coupled to the plurality of conductors. The control circuitry may be configured to apply electrical signals to the plurality of circuit modules by way of the plurality of conductors.
G02F 1/1362 - Active matrix addressed cells
G02F 1/139 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour, based on liquid crystals, e.g. single liquid crystal display cells, characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering, based on orientation effects in which the liquid crystal remains transparent
60.
METHOD AND SYSTEM FOR PERFORMING DYNAMIC FOVEATION BASED ON EYE GAZE
A method of forming a foveated image includes (a) setting dimensions of a first region, (b) receiving an image having a first resolution, and (c) forming the foveated image including a primary quality region having the dimensions of the first region and the first resolution and a secondary quality region having a second resolution less than the first resolution. The method also includes (d) outputting the foveated image, (e) determining an eye gaze location, and (f) determining an eye gaze velocity. If the eye gaze velocity is less than a threshold velocity, the method includes decreasing the dimensions of the primary quality region and repeating (b) - (f). If the eye gaze velocity is greater than or equal to the threshold velocity, the method includes repeating (a) - (f).
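The velocity-gated loop in steps (a) through (f) can be sketched as a single update step. The shrink factor, minimum size, and reset dimensions below are illustrative assumptions, not values from the disclosure.

```python
def update_foveation(region_dims, gaze_velocity, threshold,
                     shrink=0.9, initial_dims=(512, 512)):
    """One iteration of the steady-gaze heuristic described above.

    If the eye is nearly still (velocity below the threshold), tighten the
    primary (high-quality) region; otherwise reset it to the initial size
    before repeating the full sequence (a)-(f).
    """
    if gaze_velocity < threshold:
        # gaze is stable: decrease the primary-quality region dimensions
        return tuple(max(1, int(d * shrink)) for d in region_dims)
    # gaze is moving fast: restart with the initial region dimensions
    return initial_dims
```

A renderer would call this once per frame, feeding back the returned dimensions.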
A method includes rendering an original image at a first processor, encoding the original image to provide an encoded image, and transmitting the encoded image to a second processor. The method also includes decoding the encoded image to provide a decoded image, determining an eye gaze location, splitting the decoded image into N sections based on the eye gaze location, and processing N-1 sections of the N sections to produce N-1 secondary quality sections. The method further includes processing one section of the N sections to provide one primary quality section, combining the one primary quality section and the N-1 secondary quality sections to form a foveated image, and transmitting the foveated image to a display.
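The split-and-process stage after decoding can be sketched in NumPy as follows, assuming horizontal bands around the gaze location and a simple 2x downsample-then-upsample for the N-1 secondary-quality sections; both choices are assumptions for illustration only.

```python
import numpy as np

def foveate(decoded, gaze_row, n=4):
    """Split a decoded image into n horizontal bands, keep the band containing
    the gaze at primary quality, and degrade the other n-1 bands, then
    recombine them into a foveated image of the original shape."""
    bands = np.array_split(decoded, n, axis=0)
    edges = np.cumsum([b.shape[0] for b in bands])
    primary = int(np.searchsorted(edges, gaze_row, side="right"))
    out = []
    for i, band in enumerate(bands):
        if i == primary:
            out.append(band)  # primary quality section: untouched
        else:
            # secondary quality: 2x downsample, then re-expand to band size
            low = band[::2, ::2]
            up = np.repeat(np.repeat(low, 2, axis=0), 2, axis=1)
            out.append(up[: band.shape[0], : band.shape[1]])
    return np.vstack(out)
```

The recombined image has the original dimensions, with full detail preserved only in the gaze band.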
A mixed reality virtual environment is sharable among multiple users through the use of multiple view modes that are selectable by a presenter. Multiple users with wearable display systems may wish to view a common virtual object, which may be presented in a virtual room to any suitable number of users. A presentation may be controlled by a presenter using a presenter wearable system that leads multiple participants through information associated with the virtual object. Use of different viewing modes allows individual users to see different virtual content through their wearable display systems, despite being in a shared viewing space or alternatively, to see the same virtual content in different locations within a shared space.
G09B 5/12 - Electrically-operated educational appliances with individual presentation of information to a plurality of student stations, the different stations being capable of presenting different information simultaneously
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06T 19/00 - Manipulating 3D models or images for computer graphics
H04L 12/18 - Arrangements for providing special services to substations for broadcast or conference
Systems and methods for generating a face model for a user of a head-mounted device are disclosed. The head-mounted device can include one or more eye cameras configured to image the face of the user while the user is putting the device on or taking the device off. The images obtained by the eye cameras may be analyzed using a stereoscopic vision technique, a monocular vision technique, or a combination, to generate a face model for the user. The face model can be used to generate a virtual image of at least a portion of the user's face, for example to be presented as an avatar.
A mixed reality (MR) device can allow a user to switch between input modes to allow interactions with a virtual environment via devices such as a six degrees of freedom (6DoF) handheld controller and a touchpad input device. A default input mode for interacting with virtual content may rely on the user's head pose, which may be difficult to use in selecting virtual objects that are far away in the virtual environment. Thus, the system may be configured to allow the user to use a 6DoF cursor, and a visual ray that extends from the handheld controller to the cursor, to enable precise targeting. Input via a touchpad input device (e.g., that allows three degrees of freedom movements) may also be used in conjunction with the 6DoF cursor.
G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor, with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
H04N 13/361 - Reproducing mixed stereoscopic images; Reproducing mixed stereoscopic and monoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
H04N 13/383 - Tracking of viewers for gaze tracking, i.e. with detection of the line of sight of the viewer's eyes
65.
TECHNIQUES FOR DETERMINING SETTINGS FOR A CONTENT CAPTURE DEVICE
A method includes receiving a first image captured by a content capture device, identifying a first object in the first image and determining a first update to a first setting of the content capture device. The method further includes receiving a second image captured by the content capture device, identifying a second object in the second image, and determining a second update to a second setting of the content capture device. The method further includes updating the first setting of the content capture device using the first update, receiving a third image using the updated first setting of the content capture device, updating the second setting of the content capture device using the second update, receiving a fourth image using the updated second setting of the content capture device, and stitching the third image and the fourth image together to form a composite image.
H04N 23/63 - Control of cameras or camera modules by using electronic viewfinders
H04N 23/71 - Circuitry for evaluating brightness variation
H04N 23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time
H04N 23/741 - Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
H04N 23/743 - Bracketing, i.e. taking a series of images with varying exposure conditions
H04N 23/76 - Circuitry for compensating brightness variation in the scene by influencing the image signals
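The capture/update/stitch ordering of entry 65 can be sketched as a small driver. Here `camera`, `detect`, `plan_update`, and `stitch` are hypothetical stand-ins for the device API and image-processing steps, and the choice of exposure and gain as the two settings is an assumption; none of these names come from the patent.

```python
def capture_composite(camera, detect, plan_update, stitch):
    """Capture two images, plan one update per setting from the objects
    identified in each, apply the updates, capture two more images (one per
    updated setting), and stitch the last two into a composite image."""
    img1 = camera.capture()
    first_update = plan_update(detect(img1), setting="exposure")
    img2 = camera.capture()
    second_update = plan_update(detect(img2), setting="gain")

    camera.apply(first_update)    # update the first setting
    img3 = camera.capture()       # third image, first setting updated
    camera.apply(second_update)   # update the second setting
    img4 = camera.capture()       # fourth image, second setting updated
    return stitch(img3, img4)     # composite image
```

The key point the sketch preserves is that the third and fourth captures each follow exactly one of the two setting updates.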
66.
METHOD AND SYSTEM FOR PERFORMING EYE TRACKING IN AUGMENTED REALITY DEVICES
A wearable device for projecting image light to an eye of a viewer and forming an image of virtual content in an augmented reality display is provided. The wearable device includes a projector and a stack of waveguides optically connected to the projector. The wearable device also includes an eye tracking system comprising a plurality of illumination sources, an optical element having optical power, and a set of cameras. The optical element is disposed between the plurality of illumination sources and the set of cameras. In some embodiments, the augmented reality display includes an eyepiece operable to output virtual content from an output region and a plurality of illumination sources. At least some of the plurality of illumination sources overlap with the output region.
A high-resolution image sensor suitable for use in an augmented reality (AR) system provides low-latency image analysis with low power consumption. The AR system can be compact, and may be small enough to be packaged within a wearable device such as a set of goggles or mounted on a frame resembling ordinary eyeglasses. The image sensor may receive information about a region of an imaging array associated with a movable object, selectively output imaging information for that region, and synchronously output high-resolution image frames. The region may be updated dynamically as the image sensor and/or the object moves. The image sensor may output the high-resolution image frames less frequently than the region being updated when the image sensor and/or the object moves. Such an image sensor provides a small amount of data from which object information used in rendering an AR scene can be developed.
A head-mounted display system configured to be worn over eyes of a user includes a frame configured to be worn on a head of the user. The system also includes a display disposed on the frame over the eyes of the user. The system further includes an inwardly-facing light source disposed on the frame and configured to emit light toward the eyes of the user to improve visibility of respective portions of a face and the eyes of the user through the display. Moreover, the system includes a processor configured to control a brightness of the display, an opacity of the display, and an intensity of the light emitted by the inwardly-facing light source.
G09G 3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix
G09G 5/36 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators, characterised by the display of individual graphic patterns using a bit-mapped memory
A method is disclosed, the method comprising the steps of receiving, from a first client application, first graphical data comprising a first node; receiving, from a second client application independent of the first client application, second graphical data comprising a second node; and generating a scenegraph, wherein the scenegraph describes a hierarchical relationship between the first node and the second node according to visual occlusion relative to a perspective from a display.
A display system, such as a virtual reality or augmented reality display system, can control a display to present image data including a plurality of color components, on a plurality of depth planes supported by the display. The presentation of the image data through the display can be controlled based on control information that is embedded in the image data, for example to activate or deactivate a color component and/or a depth plane. In some examples, light sources and/or spatial light modulators that relay illumination from the light sources may receive signals from a display controller to adjust a power setting to the light source or spatial light modulator based on control information embedded in an image data frame.
Systems and methods for managing multi-objective alignments in imprinting (e.g., single-sided or double-sided) are provided. An example system includes rollers for moving a template roll, a stage for holding a substrate, a dispenser for dispensing resist on the substrate, a light source for curing the resist to form an imprint on the substrate when a template of the template roll is pressed into the resist on the substrate, a first inspection system for registering a fiducial mark of the template to determine a template offset, a second inspection system for registering the imprint on the substrate to determine a wafer registration offset between a target location and an actual location of the imprint, and a controller configured to move the substrate with the resist below the template based on the template offset, and to determine an overlay bias of the imprint on the substrate based on the wafer registration offset.
G03F 9/00 - Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically
G03F 7/00 - Photomechanical, e.g. photolithographic, production of textured or patterned surfaces, e.g. printing surfaces; Materials therefor, e.g. comprising photoresists; Apparatus specially adapted therefor
72.
DISPLAY SYSTEM HAVING A PLURALITY OF LIGHT PIPES FOR A PLURALITY OF LIGHT EMITTERS
A display system includes a plurality of light pipes and a plurality of light sources configured to emit light into the light pipes. The display system also comprises a spatial light modulator configured to modulate light received from the light pipes to form images. The display system may also comprise one or more waveguides configured to receive modulated light from the spatial light modulator and to relay that light to a viewer.
AR/VR display systems limit displaying content that exceeds an accommodation-vergence mismatch (AVM) threshold, which may define a volume around the viewer. The volume may be subdivided into two or more zones, including an innermost loss-of-fusion (LoF) zone in which no content is displayed, and one or more outer AVM zones in which the displaying of content may be stopped, or clipped, under certain conditions. For example, content may be clipped if the viewer is verging within an AVM zone and if the content is displayed within the AVM zone for more than a threshold duration. A further possible condition for clipping content is that the user is verging on that content. In addition, the boundaries of the AVM zone and/or the acceptable amount of time that the content is displayed may vary depending upon the type of content being displayed, e.g., whether the content is user-locked content or in-world content.
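The zone-based clipping conditions above can be sketched as a predicate. The zone radii, dwell limit, and all names are illustrative assumptions; the disclosure only fixes the logical structure (LoF zone always hidden, AVM zone clipped after dwell while the user verges there).

```python
def should_clip(vergence_dist, content_dist, dwell_s, zones,
                dwell_limit_s, verging_on_content=True):
    """Decide whether to clip content given viewer vergence distance,
    content distance, and how long the content has dwelled in the zone.

    zones: (lof_radius, avm_radius) in meters, lof_radius < avm_radius.
    """
    lof_radius, avm_radius = zones
    if content_dist < lof_radius:
        return True  # never display inside the loss-of-fusion zone
    in_avm = content_dist < avm_radius
    verging_in_avm = vergence_dist < avm_radius
    # clip only when the content sits in the AVM zone, the user verges
    # there (on that content), and the dwell time exceeds the limit
    return in_avm and verging_in_avm and verging_on_content \
        and dwell_s > dwell_limit_s
```

Per the last sentence of the abstract, a fuller implementation would make `zones` and `dwell_limit_s` depend on the content type (user-locked vs. in-world).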
Methods and systems for depth-based foveated rendering in a display system are disclosed. The display system may be an augmented reality display system configured to provide virtual content on a plurality of depth planes using different wavefront divergence. Some embodiments include monitoring eye orientations of a user of the display system. A fixation point can be determined based on the eye orientations, the fixation point representing a three-dimensional location with respect to a field of view. Location information of virtual object(s) to present is obtained, with the location information including three-dimensional position(s) of the virtual object(s). A resolution of the virtual object(s) can be adjusted based on a proximity of the location(s) of the virtual object(s) to the fixation point. The virtual object(s) are presented by the display system according to the adjusted resolution(s).
G06T 19/00 - Manipulating 3D models or images for computer graphics
H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals, the virtual viewpoint locations being selected by the viewers or determined by tracking
H04N 13/341 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
H04N 13/383 - Tracking of viewers for gaze tracking, i.e. with detection of the line of sight of the viewer's eyes
H04N 13/395 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume, with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
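The proximity-based resolution adjustment in the depth-based foveated rendering entry above can be sketched as a distance-to-scale mapping. The linear falloff curve and its parameters are assumptions made for illustration; the description only requires resolution to decrease with 3D distance from the fixation point.

```python
def resolution_scale(obj_xyz, fixation_xyz,
                     full_res_radius=0.1, falloff=1.0, floor=0.25):
    """Map the 3D distance between a virtual object and the fixation point
    to a render-resolution scale in [floor, 1.0]."""
    d = sum((a - b) ** 2 for a, b in zip(obj_xyz, fixation_xyz)) ** 0.5
    if d <= full_res_radius:
        return 1.0  # object is at or near the fixation point: full resolution
    # linear falloff with distance, clamped to a minimum quality floor
    return max(floor, 1.0 - falloff * (d - full_res_radius))
```

A renderer would multiply each object's nominal render resolution by this scale before presenting it on its depth plane.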
A virtual, augmented, or mixed reality display system includes a display configured to display virtual, augmented, or mixed reality image data, the display including one or more optical components which introduce optical distortions or aberrations to the image data. The system also includes a display controller configured to provide the image data to the display. The display controller includes memory for storing optical distortion correction information, and one or more processing elements to at least partially correct the image data for the optical distortions or aberrations using the optical distortion correction information.
G06F 1/3203 - Power management, i.e. event-based initiation of a power-saving mode
G06F 3/14 - Digital output to display device
G06T 3/18 - Image warping, e.g. rearranging pixels individually
G06T 3/4007 - Scaling of whole images or parts of images, e.g. expanding or contracting, based on interpolation, e.g. bilinear interpolation
76.
DEPTH BASED FOVEATED RENDERING FOR DISPLAY SYSTEMS
Methods and systems for depth-based foveated rendering in a display system are disclosed. The display system may be an augmented reality display system configured to provide virtual content on a plurality of depth planes using different wavefront divergence. Some embodiments include determining a fixation point of a user's eyes. Location information associated with a first virtual object to be presented to the user via a display device is obtained. A resolution-modifying parameter of the first virtual object is obtained. A particular resolution at which to render the first virtual object is identified based on the location information and the resolution-modifying parameter of the first virtual object. The particular resolution is based on a resolution distribution specifying resolutions for corresponding distances from the fixation point. The first virtual object rendered at the identified resolution is presented to the user via the display system.
An eyepiece for an augmented reality display system. The eyepiece can include a waveguide substrate. The waveguide substrate can include an input coupler grating (ICG), an orthogonal pupil expander (OPE) grating, a spreader grating, and an exit pupil expander (EPE) grating. The ICG can couple at least one input light beam into at least a first guided light beam that propagates inside the waveguide substrate. The OPE grating can divide the first guided light beam into a plurality of parallel, spaced-apart light beams. The spreader grating can receive the light beams from the OPE grating and spread their distribution. The spreader grating can include diffractive features oriented at approximately 90° to diffractive features of the OPE grating. The EPE grating can re-direct the light beams from the OPE grating and the spreader grating such that they exit the waveguide substrate.
A wearable display device, such as an augmented reality display device, can present virtual content to the wearer for applications in a healthcare setting. The wearer may be a patient or a healthcare provider (HCP). Applications can include, but are not limited to, access, display, and modification of patient medical records and sharing patient medical records among authorized HCPs, detecting one or more anomalies in a medical environment and presenting virtual content (e.g., alerts) indicating the one or more anomalies, detecting the presence of physical objects (e.g., medical instruments or devices) in the medical environment, enabling communication with and/or remote control of a medical device in the environment, and so forth.
G16H 10/60 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
A61B 3/00 - Apparatus for optical examination of the eyes; Apparatus for clinical examination of the eyes
A61B 3/10 - Apparatus for optical examination of the eyes; Apparatus for clinical examination of the eyes of objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
A61B 3/113 - Apparatus for optical examination of the eyes; Apparatus for clinical examination of the eyes of objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement
A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
A61B 5/06 - Devices, other than using radiation, for detecting or locating foreign bodies
A61B 5/1171 - Identification of persons based on the shapes or appearances of their bodies or parts thereof
A61B 5/339 - Displays specially adapted therefor
A61B 17/00 - Surgical instruments, devices or methods
A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis, not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
A61B 90/50 - Supports for surgical instruments, e.g. articulated arms
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
G06F 40/289 - Phrasal analysis, e.g. finite state techniques or chunking
G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
G10L 15/26 - Speech-to-text systems
G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
G16H 40/67 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices, for the operation of medical equipment or devices for remote operation
A method of operating an optical system includes identifying a set of angle-dependent transmittance levels for light passing through pixels of a segmented dimmer that exhibits viewing-angle transmittance variations when a same voltage is applied to all pixels of the segmented dimmer. The method also includes determining a set of voltages to apply to pixels of the segmented dimmer. Determining the set of voltages includes using the set of angle-dependent transmittance levels. The method includes applying the set of voltages to the pixels of the segmented dimmer to achieve light transmittance through the segmented dimmer corresponding to the set of angle-dependent transmittance levels.
G09G 3/36 - Dispositions ou circuits de commande présentant un intérêt uniquement pour l'affichage utilisant des moyens de visualisation autres que les tubes à rayons cathodiques pour la présentation d'un ensemble de plusieurs caractères, p. ex. d'une page, en composant l'ensemble par combinaison d'éléments individuels disposés en matrice en commandant la lumière provenant d'une source indépendante utilisant des cristaux liquides
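The voltage-determination step described in the abstract above can be sketched as follows. This is a minimal illustration, not the patented method: the calibration table, angles, and transmittance values are hypothetical, and a real system would calibrate each pixel's angle-dependent transmittance curve empirically.

```python
# Assumed calibration: for each viewing angle (degrees), a list of
# (voltage, transmittance) pairs measured with the same voltage applied
# to all pixels of the segmented dimmer. All numbers are illustrative.
CALIBRATION = {
    0:  [(0.0, 0.90), (1.0, 0.60), (2.0, 0.30), (3.0, 0.10)],
    30: [(0.0, 0.80), (1.0, 0.45), (2.0, 0.20), (3.0, 0.05)],
}

def voltage_for(angle_deg, target_transmittance):
    """Interpolate the calibration curve for the nearest calibrated angle
    to find the voltage that yields the target transmittance."""
    curve = CALIBRATION[min(CALIBRATION, key=lambda a: abs(a - angle_deg))]
    # Transmittance decreases with voltage, so walk adjacent pairs.
    for (v0, t0), (v1, t1) in zip(curve, curve[1:]):
        if t1 <= target_transmittance <= t0:
            frac = (t0 - target_transmittance) / (t0 - t1)
            return v0 + frac * (v1 - v0)
    return curve[-1][0]  # clamp to the strongest dimming measured

def voltages_for_pixels(pixel_angles, target):
    """One voltage per pixel, compensating each pixel's viewing angle."""
    return [voltage_for(a, target) for a in pixel_angles]
```

Pixels seen at steeper angles dim more for the same voltage in this assumed calibration, so they receive a lower voltage to reach the same perceived transmittance.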
Apparatuses and methods for displaying a 3-D representation of an object are described. Apparatuses can include a rotatable structure, a motor, and multiple light field sub-displays disposed on the rotatable structure. The apparatuses can store a light field image to be displayed, the light field image providing multiple different views of the object at different viewing directions. A processor can drive the motor to rotate the rotatable structure, map the light field image to each of the light field sub-displays based in part on the rotation angle, and illuminate the light field sub-displays based in part on the mapped light field image. The apparatuses can include a display panel configured to be viewed from a fiducial viewing direction, where the display panel is curved out of a plane that is perpendicular to the fiducial viewing direction, and a plurality of light field sub-displays disposed on the display panel.
H04N 13/393 - Affichages volumétriques, c.-à-d. systèmes où l’image est réalisée à partir d’éléments répartis dans un volume le volume résultant du mouvement, p. ex. de vibration ou de rotation, d’une surface
G02B 30/27 - Systèmes ou appareils optiques pour produire des effets tridimensionnels [3D], p. ex. des effets stéréoscopiques en fournissant des première et seconde images de parallaxe à chacun des yeux gauche et droit d’un observateur du type autostéréoscopique comprenant des réseaux lenticulaires
G02B 30/54 - Systèmes ou appareils optiques pour produire des effets tridimensionnels [3D], p. ex. des effets stéréoscopiques l’image étant construite à partir d'éléments d'image répartis sur un volume 3D, p. ex. des voxels le volume 3D étant généré par une surface 2D en mouvement, p. ex. une surface 2D vibrante ou rotative
G02B 30/56 - Systèmes ou appareils optiques pour produire des effets tridimensionnels [3D], p. ex. des effets stéréoscopiques l’image étant construite à partir d'éléments d'image répartis sur un volume 3D, p. ex. des voxels en projetant une image aérienne ou flottante
G09G 3/00 - Dispositions ou circuits de commande présentant un intérêt uniquement pour l'affichage utilisant des moyens de visualisation autres que les tubes à rayons cathodiques
H04N 13/307 - Reproducteurs d’images pour visionnement sans avoir recours à des lunettes spéciales, c.-à-d. utilisant des affichages autostéréoscopiques utilisant des lentilles du type œil de mouche, p. ex. dispositions de lentilles circulaires
H04N 13/32 - Reproducteurs d’images pour visionnement sans avoir recours à des lunettes spéciales, c.-à-d. utilisant des affichages autostéréoscopiques utilisant des matrices de sources lumineuses commandées; Reproducteurs d’images pour visionnement sans avoir recours à des lunettes spéciales, c.-à-d. utilisant des affichages autostéréoscopiques utilisant des fenêtres en mouvement ou des sources lumineuses en mouvement
An augmented reality system includes a projector assembly and a set of imaging optics optically coupled to the projector assembly. The augmented reality system also includes an eyepiece optically coupled to the set of imaging optics. The eyepiece has a world side and a user side opposite the world side and includes one or more eyepiece waveguides. Each of the one or more eyepiece waveguides includes an incoupling interface and an outcoupling interface operable to output virtual content toward the user side. The augmented reality system further includes an optical notch filter disposed on the world side of the eyepiece.
A method of operating an eyepiece waveguide of an augmented reality system includes projecting virtual content using a projector assembly and diffracting the virtual content into the eyepiece waveguide via a first order diffraction. A first portion of the virtual content is clipped to produce a remaining portion of the virtual content. The method also includes propagating the remaining portion of the virtual content in the eyepiece waveguide, outcoupling the remaining portion of the virtual content out of the eyepiece waveguide, and diffracting the virtual content into the eyepiece waveguide via a second order diffraction. A second portion of the virtual content is clipped to produce a complementary portion. The method further includes propagating the complementary portion of the virtual content in the eyepiece waveguide and outcoupling the complementary portion of the virtual content out of the eyepiece waveguide.
G02B 6/00 - Guides de lumière; Détails de structure de dispositions comprenant des guides de lumière et d'autres éléments optiques, p. ex. des moyens de couplage
G02B 6/122 - Éléments optiques de base, p. ex. voies de guidage de la lumière
83.
COMPACT EXTENDED DEPTH OF FIELD LENSES FOR WEARABLE DISPLAY DEVICES
A wearable display device includes waveguide(s) that present virtual image elements as an augmentation to the real-world environment. The display device includes a first extended depth of field (EDOF) refractive lens arranged between the waveguide(s) and the user's eye(s), and a second EDOF refractive lens located outward from the waveguide(s). The first EDOF lens has a (e.g., negative) optical power to alter the depth of the virtual image elements. The second EDOF lens has a substantially equal and opposite (e.g., positive) optical power to that of the first EDOF lens, such that the depth of real-world objects is not altered along with the depth of the virtual image elements. To reduce the weight and/or size of the device, one or both EDOF lenses can be a compact lens, e.g., a Fresnel lens or a flattened periphery lens. The compact lens may be coated and/or embedded in another material to enhance its performance.
G02B 6/10 - Guides de lumière; Détails de structure de dispositions comprenant des guides de lumière et d'autres éléments optiques, p. ex. des moyens de couplage du type guide d'ondes optiques
G02B 27/00 - Systèmes ou appareils optiques non prévus dans aucun des groupes ,
84.
CUSTOMIZED POLYMER/GLASS DIFFRACTIVE WAVEGUIDE STACKS FOR AUGMENTED REALITY/MIXED REALITY APPLICATIONS
A diffractive waveguide stack includes first, second, and third diffractive waveguides for guiding light in first, second, and third visible wavelength ranges, respectively. The first diffractive waveguide includes a first material having first refractive index at a selected wavelength and a first target refractive index at a midpoint of the first visible wavelength range. The second diffractive waveguide includes a second material having a second refractive index at the selected wavelength and a second target refractive index at a midpoint of the second visible wavelength range. The third diffractive waveguide includes a third material having a third refractive index at the selected wavelength and a third target refractive index at a midpoint of the third visible wavelength range. A difference between any two of the first target refractive index, the second target refractive index, and the third target refractive index is less than 0.005 at the selected wavelength.
Methods and systems for reducing switching between depth planes of a multi-depth plane display system are disclosed. The display system may be an AR display system configured to provide virtual content on a plurality of depth planes using different amounts of wavefront divergence. The system may monitor fixation points based upon the gaze of each of the user's eyes, with each fixation point being a three-dimensional location in the user's field of view. Location information of virtual objects to be presented to the user is obtained, with each virtual object being associated with a depth plane. The depth plane on which a virtual object is to be presented may be modified based upon the fixation point of the user's eyes.
G06F 3/01 - Dispositions d'entrée ou dispositions d'entrée et de sortie combinées pour l'interaction entre l'utilisateur et le calculateur
G06F 3/04842 - Sélection des objets affichés ou des éléments de texte affichés
H04N 13/122 - Raffinement de la perception 3D des images stéréoscopiques par modification du contenu des signaux d’images, p. ex. par filtrage ou par ajout d’indices monoscopiques de profondeur
H04N 13/128 - Ajustement de la profondeur ou de la disparité
H04N 13/332 - Affichage pour le visionnement à l’aide de lunettes spéciales ou de visiocasques
H04N 13/383 - Suivi des spectateurs pour le suivi du regard, c.-à-d. avec détection de l’axe de vision des yeux du spectateur
H04N 13/395 - Affichages volumétriques, c.-à-d. systèmes où l’image est réalisée à partir d’éléments répartis dans un volume avec échantillonnage de la profondeur, c.-à-d. construction du volume à partir d’un ensemble ou d’une séquence de plans d’image 2D
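One common way to reduce depth-plane switching of the kind described in the abstract above is hysteresis: keep the current plane until the fixation depth leaves a padded version of its range. The following sketch assumes that approach; the plane boundaries and margin are illustrative, not values from the patent.

```python
# Assumed depth-plane ranges in metres of fixation depth, and an assumed
# hysteresis margin. Both are illustrative.
PLANE_RANGES = [(0.0, 1.0), (1.0, 3.0), (3.0, float("inf"))]
HYSTERESIS = 0.2  # metres

def select_plane(fixation_depth, current_plane):
    """Return the depth plane to present on, switching only when the
    fixation depth leaves the current plane's padded range."""
    near, far = PLANE_RANGES[current_plane]
    if near - HYSTERESIS <= fixation_depth <= far + HYSTERESIS:
        return current_plane  # stay put: avoids rapid back-and-forth
    for i, (n, f) in enumerate(PLANE_RANGES):
        if n <= fixation_depth < f:
            return i
    return len(PLANE_RANGES) - 1
```

A fixation depth of 1.1 m with plane 0 active stays on plane 0 (within the 0.2 m margin), while 1.5 m triggers a switch to plane 1.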
86.
WEARABLE SYSTEM WITH HEADSET AND CONTROLLER INSIDE-OUT TRACKING
Wearable systems and methods for operation thereof incorporating headset and controller inside-out tracking are disclosed. A wearable system may include a headset and a controller. The wearable system may cause fiducials of the controller to flash. The wearable system may track a pose of the controller by capturing headset images using a headset camera, identifying the fiducials in the headset images, and tracking the pose of the controller based on the identified fiducials in the headset images and based on a pose of the headset. While tracking the pose of the controller, the wearable system may capture controller images using a controller camera. The wearable system may identify two-dimensional feature points in each controller image and determine three-dimensional map points based on the two-dimensional feature points and the pose of the controller.
Techniques for operating a depth sensor are discussed. A first sequence of operation steps and a second sequence of operation steps can be stored in memory on the depth sensor to define, respectively, a first depth sensing mode of operation and a second depth sensing mode of operation. In response to a first request for depth measurement(s) according to the first depth sensing mode of operation, the depth sensor can operate in the first mode of operation by executing the first sequence of operation steps. In response to a second request for depth measurement(s) according to the second depth sensing mode of operation, and without performing an additional configuration operation, the depth sensor can operate in the second mode of operation by executing the second sequence of operation steps.
H04N 23/667 - Changement de mode de fonctionnement de la caméra, p. ex. entre les modes photo et vidéo, sport et normal ou haute et basse résolutions
G01S 17/894 - Imagerie 3D avec mesure simultanée du temps de vol sur une matrice 2D de pixels récepteurs, p. ex. caméras à temps de vol ou lidar flash
G06F 3/00 - Dispositions d'entrée pour le transfert de données destinées à être traitées sous une forme maniable par le calculateur; Dispositions de sortie pour le transfert de données de l'unité de traitement à l'unité de sortie, p. ex. dispositions d'interface
H04N 13/139 - Conversion du format, p. ex. du débit de trames ou de la taille
H04N 23/959 - Systèmes de photographie numérique, p. ex. systèmes d'imagerie par champ lumineux pour l'imagerie à grande profondeur de champ en ajustant la profondeur de champ pendant la capture de l'image, p. ex. en maximisant ou en réglant la portée en fonction des caractéristiques de la scène
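The key idea in the depth-sensor abstract above is that both operation-step sequences are stored up front, so a later request can run either mode without an intervening configuration step. A minimal sketch of that pattern follows; the class, mode names, and step names are hypothetical, not taken from the patent.

```python
class DepthSensor:
    """Toy model of a depth sensor that pre-stores one operation-step
    sequence per depth sensing mode. Step names are illustrative only."""

    def __init__(self):
        self.log = []
        # Both sequences are written to sensor memory at configuration time.
        self.sequences = {
            "dense":  ["illuminate_full", "integrate_long", "read_out"],
            "sparse": ["illuminate_spots", "integrate_short", "read_out"],
        }

    def measure(self, mode):
        """Execute the stored sequence for the requested mode.
        No per-request reconfiguration is performed."""
        steps = self.sequences[mode]
        self.log.extend(steps)
        return list(steps)
```

Alternating `measure("dense")` and `measure("sparse")` calls simply replay the corresponding stored sequences back to back.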
88.
DIFFRACTIVE OPTICAL ELEMENTS WITH MITIGATION OF REBOUNCE-INDUCED LIGHT LOSS AND RELATED SYSTEMS AND METHODS
Display devices include waveguides with in-coupling optical elements that mitigate re-bounce of in-coupled light to improve in-coupling efficiency and/or uniformity. A waveguide receives light from a light source and includes an in-coupling optical element that in-couples the received light to propagate by total internal reflection within the waveguide. The in-coupled light may undergo re-bounce, in which the light reflects off a waveguide surface and, after the reflection, strikes the in-coupling optical element. Upon striking the in-coupling optical element, the light may be partially absorbed and/or out-coupled by the optical element, thereby reducing the amount of in-coupled light propagating through the waveguide. The in-coupling optical element can be truncated or have reduced diffraction efficiency along the propagation direction to reduce the occurrence of light loss due to re-bounce of in-coupled light, resulting in less in-coupled light being prematurely out-coupled and/or absorbed during subsequent interactions with the in-coupling optical element.
F21V 8/00 - Utilisation de guides de lumière, p. ex. dispositifs à fibres optiques, dans les dispositifs ou systèmes d'éclairage
G02B 6/10 - Guides de lumièreDétails de structure de dispositions comprenant des guides de lumière et d'autres éléments optiques, p. ex. des moyens de couplage du type guide d'ondes optiques
A wearable display system includes an eyepiece stack having a world side and a user side opposite the world side, wherein during use a user positioned on the user side views displayed images delivered by the system via the eyepiece stack which augment the user's view of the user's environment. The wearable display system also includes an angularly selective film arranged on the world side of the eyepiece stack. The angularly selective film includes a polarization adjusting film arranged between a pair of linear polarizers. The linear polarizers and the polarization adjusting film significantly reduce transmission of visible light incident on the angularly selective film at large angles of incidence without significantly reducing transmission of light incident on the angularly selective film at small angles of incidence.
G02F 1/1335 - Association structurelle de cellules avec des dispositifs optiques, p. ex. des polariseurs ou des réflecteurs
G02F 1/137 - Dispositifs ou dispositions pour la commande de l'intensité, de la couleur, de la phase, de la polarisation ou de la direction de la lumière arrivant d'une source lumineuse indépendante, p. ex. commutation, ouverture de porte ou modulation; Optique non linéaire pour la commande de l'intensité, de la phase, de la polarisation ou de la couleur basés sur des cristaux liquides, p. ex. cellules d'affichage individuelles à cristaux liquides caractérisés par l'effet électro-optique ou magnéto-optique, p. ex. transition de phase induite par un champ, effet d'orientation, interaction entre milieu récepteur et matière additive ou diffusion dynamique
90.
DISPLAY SYSTEMS AND METHODS FOR DETERMINING REGISTRATION BETWEEN A DISPLAY AND A USER'S EYES
A display system may include a head-mounted display (HMD) for rendering a three-dimensional virtual object which appears to be located in an ambient environment of a user of the display. One or more eyes of the user may not be in desired positions, relative to the HMD, to receive, or register, image information outputted by the HMD and/or to view an external environment. For example, the HMD-to-eye alignment may vary for different users and/or may change over time (e.g., as the HMD is displaced). The display system may determine a relative position or alignment between the HMD and the user's eyes. Based on the relative positions, the wearable device may determine if it is properly fitted to the user, may provide feedback on the quality of the fit to the user, and/or may take actions to reduce or minimize effects of any misalignment.
G02B 27/00 - Systèmes ou appareils optiques non prévus dans aucun des groupes ,
A61B 3/11 - Appareils pour l'examen optique des yeux; Appareils pour l'examen clinique des yeux du type à mesure objective, c.-à-d. instruments pour l'examen des yeux indépendamment des perceptions ou des réactions du patient pour mesurer la distance interpupillaire ou le diamètre de la pupille
A61B 3/113 - Appareils pour l'examen optique des yeux; Appareils pour l'examen clinique des yeux du type à mesure objective, c.-à-d. instruments pour l'examen des yeux indépendamment des perceptions ou des réactions du patient pour déterminer ou enregistrer le mouvement de l'œil
G02B 30/00 - Systèmes ou appareils optiques pour produire des effets tridimensionnels [3D], p. ex. des effets stéréoscopiques
G02B 30/40 - Systèmes ou appareils optiques pour produire des effets tridimensionnels [3D], p. ex. des effets stéréoscopiques donnant à l’observateur d'une seule image bidimensionnelle [2D] une impression perceptive de profondeur
G06F 3/01 - Dispositions d'entrée ou dispositions d'entrée et de sortie combinées pour l'interaction entre l'utilisateur et le calculateur
G06F 3/0346 - Dispositifs de pointage déplacés ou positionnés par l'utilisateur; Leurs accessoires avec détection de l’orientation ou du mouvement libre du dispositif dans un espace en trois dimensions [3D], p. ex. souris 3D, dispositifs de pointage à six degrés de liberté [6-DOF] utilisant des capteurs gyroscopiques, accéléromètres ou d’inclinaison
G06F 3/04815 - Interaction s’effectuant dans un environnement basé sur des métaphores ou des objets avec un affichage tridimensionnel, p. ex. modification du point de vue de l’utilisateur par rapport à l’environnement ou l’objet
G06T 3/40 - Changement d'échelle d’images complètes ou de parties d’image, p. ex. agrandissement ou rétrécissement
G06V 10/42 - Extraction de caractéristiques globales par l’analyse du motif entier, p. ex. utilisant des transformations dans le domaine de fréquence ou d’autocorrélation
G06V 10/46 - Descripteurs pour la forme, descripteurs liés au contour ou aux points, p. ex. transformation de caractéristiques visuelles invariante à l’échelle [SIFT] ou sacs de mots [BoW]; Caractéristiques régionales saillantes
G06V 10/60 - Extraction de caractéristiques d’images ou de vidéos relative aux propriétés luminescentes, p. ex. utilisant un modèle de réflectance ou d’éclairage
G06V 40/18 - Caractéristiques de l’œil, p. ex. de l’iris
H04N 13/344 - Affichage pour le visionnement à l’aide de lunettes spéciales ou de visiocasques avec des visiocasques portant des affichages gauche et droit
H04N 13/383 - Suivi des spectateurs pour le suivi du regard, c.-à-d. avec détection de l’axe de vision des yeux du spectateur
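The registration determination in the abstract above reduces, at its simplest, to comparing an estimated eye position against the display's nominal eyebox and reporting fit quality. The sketch below assumes that simplification; the eyebox centre, radius threshold, and coordinate convention are all illustrative, not values from the patent.

```python
import math

# Assumed display-frame coordinates in millimetres: nominal eyebox centre
# and the radius within which the fit is considered good. Illustrative only.
EYEBOX_CENTER = (0.0, 0.0, 20.0)
GOOD_FIT_RADIUS = 3.0

def fit_quality(eye_position):
    """Classify HMD-to-eye alignment from an estimated eye position.

    Returns a (verdict, offset_mm) pair so the system can both act on the
    verdict and report the magnitude of misalignment to the user."""
    offset = math.dist(eye_position, EYEBOX_CENTER)
    if offset <= GOOD_FIT_RADIUS:
        return "good", offset
    return "adjust headset", offset
```

An eye tracked 5 mm lateral of the eyebox centre would yield `("adjust headset", 5.0)`, prompting the feedback step the abstract describes.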
Disclosed herein are systems and methods for displays, such as for a head wearable device. An example display can include an infrared illumination layer, the infrared illumination layer including a substrate, one or more LEDs disposed on a first surface of the substrate, and a first encapsulation layer disposed on the first surface of the substrate, where the first encapsulation layer can include a nano-patterned surface. In some examples, the nano-patterned surface can be configured to improve a visible light transmittance of the illumination layer. In one or more examples, embodiments disclosed herein may provide a robust illumination layer that can reduce the haze associated with an illumination layer.
A display system can include a head-mounted display configured to project light to an eye of a user to display virtual image content at different amounts of divergence and collimation. The display system can include an inward-facing imaging system, possibly comprising a plurality of cameras, that images the user's eye and glints thereon, and processing electronics that are in communication with the inward-facing imaging system and that are configured to obtain an estimate of a center of rotation of the user's eye using cornea data derived from the glint images. The display system may render virtual image content with a render camera positioned at the determined position of the center of rotation of said eye.
An eyepiece waveguide for an augmented reality display system may include an optically transmissive substrate, an input coupling grating (ICG) region, a multi-directional pupil expander (MPE) region, and an exit pupil expander (EPE) region. The ICG region may receive an input beam of light and couple the input beam into the substrate as a guided beam. The MPE region may include a plurality of diffractive features which exhibit periodicity along at least a first axis of periodicity and a second axis of periodicity. The MPE region may be positioned to receive the guided beam from the ICG region and to diffract it in a plurality of directions to create a plurality of diffracted beams. The EPE region may overlap the MPE region and may out couple one or more of the diffracted beams from the optically transmissive substrate as output beams.
Head-mounted display systems with power saving functionality are disclosed. The systems can include a frame configured to be supported on the head of the user. The systems can also include a head-mounted display disposed on the frame, one or more sensors, and processing electronics in communication with the display and the one or more sensors. In some implementations, the processing electronics can be configured to cause the system to reduce power of one or more components based at least in part on a determination that the frame is in a certain position (e.g., upside-down or on top of the head of the user). In some implementations, the processing electronics can be configured to cause the system to reduce power of one or more components based at least in part on a determination that the frame has been stationary for at least a threshold period of time.
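The two power-saving triggers described in the abstract above (frame position and prolonged stillness) can be sketched as a single decision function. The orientation labels and the stationary threshold below are illustrative assumptions, not values or identifiers from the patent.

```python
# Assumed threshold: how long the frame may remain stationary before power
# is reduced. Illustrative value only.
STATIONARY_THRESHOLD_S = 30.0

def should_reduce_power(orientation, stationary_seconds):
    """Return True when component power should be reduced.

    orientation        -- assumed sensor-derived label for the frame's pose
    stationary_seconds -- assumed time the frame has been motionless
    """
    # Trigger 1: frame in a position implying the display is not being worn.
    if orientation in ("upside_down", "on_top_of_head"):
        return True
    # Trigger 2: frame stationary for at least the threshold period.
    return stationary_seconds >= STATIONARY_THRESHOLD_S
```

Either trigger alone suffices, matching the abstract's two independent "in some implementations" conditions.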
Wearable systems and methods for operation thereof incorporating headset and controller localization using headset cameras and controller fiducials are disclosed. A wearable system may include a headset and a controller. The wearable system may alternate between performing headset tracking and performing controller tracking by repeatedly capturing images using a headset camera of the headset during headset tracking frames and controller tracking frames. The wearable system may cause the headset camera to capture a first exposure image having an exposure above a threshold and cause the headset camera to capture a second exposure image having an exposure below the threshold. The wearable system may determine a fiducial interval during which fiducials of the controller are to flash at a fiducial frequency and a fiducial period. The wearable system may cause the fiducials to flash during the fiducial interval in accordance with the fiducial frequency and the fiducial period.
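The fiducial-interval scheduling described in the abstract above can be sketched as computing the on-times of each flash from the interval, flash frequency, and per-flash on-period. This is an illustrative reading of those three parameters, not the patent's actual timing scheme; units (seconds, Hz) are assumed.

```python
def flash_schedule(interval_start, interval_end, frequency_hz, flash_period_s):
    """Return (on_time, off_time) pairs for fiducial flashes.

    Flashes repeat at 1/frequency_hz spacing; each flash stays on for
    flash_period_s; flashes are confined to the fiducial interval."""
    spacing = 1.0 / frequency_hz
    flashes = []
    t = interval_start
    # Emit flashes only while a full flash fits inside the interval.
    while t + flash_period_s <= interval_end:
        flashes.append((t, t + flash_period_s))
        t += spacing
    return flashes
```

For a 100 ms interval at 30 Hz with 10 ms flashes, three full flashes fit, which a headset camera could then catch in its low-exposure frames.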
A head mounted display system can process images by assessing relative motion between the head mounted display and one or more features in a user's environment. The assessment of relative motion can include determining whether the head mounted display has moved, is moving, and/or is expected to move with respect to one or more features in the environment. Additionally or alternatively, the assessment can include determining whether one or more features in the environment have moved, are moving, and/or are expected to move relative to the head mounted display. The image processing can further include determining one or more virtual image content locations in the environment that correspond to a location where renderable virtual image content appears to a user when the location appears in the display, and comparing the one or more virtual image content locations in the environment with a viewing zone.
G06T 7/246 - Analyse du mouvement utilisant des procédés basés sur les caractéristiques, p. ex. le suivi des coins ou des segments
G09G 5/00 - Dispositions ou circuits de commande de l'affichage communs à l'affichage utilisant des tubes à rayons cathodiques et à l'affichage utilisant d'autres moyens de visualisation
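The comparison step in the abstract above, checking virtual image content locations against a viewing zone, can be sketched with the zone modelled as an axis-aligned box. That box model is a simplifying assumption for illustration; the patent does not specify the zone's geometry.

```python
def in_viewing_zone(content_location, zone_min, zone_max):
    """True if a 3-D content location lies inside the axis-aligned
    viewing zone defined by its min and max corners."""
    return all(lo <= c <= hi
               for c, lo, hi in zip(content_location, zone_min, zone_max))

def visible_content(content_locations, zone_min, zone_max):
    """Filter candidate virtual image content locations down to those
    that fall within the viewing zone."""
    return [p for p in content_locations
            if in_viewing_zone(p, zone_min, zone_max)]
```

Content whose environment location drifts outside the zone, e.g. through relative motion of the headset, would be excluded from rendering by this filter.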
98.
PLENOPTIC CAMERA MEASUREMENT AND CALIBRATION OF HEAD-MOUNTED DISPLAYS
A method for measuring performance of a head-mounted display module, the method including arranging the head-mounted display module relative to a plenoptic camera assembly so that an exit pupil of the head-mounted display module coincides with a pupil of the plenoptic camera assembly; emitting light from the head-mounted display module while the head-mounted display module is arranged relative to the plenoptic camera assembly; filtering the light at the exit pupil of the head-mounted display module; acquiring, with the plenoptic camera assembly, one or more light field images projected from the head-mounted display module with the filtered light; and determining information about the performance of the head-mounted display module based on the acquired light field images.
Architectures are provided for selectively outputting light for forming images, the light having different wavelengths and being outputted with low levels of crosstalk. In some embodiments, light is incoupled into a waveguide and deflected to propagate in different directions, depending on wavelength. The incoupled light is then outcoupled by outcoupling optical elements that outcouple light based on the direction of propagation of the light. In some other embodiments, color filters are disposed between a waveguide and outcoupling elements. The color filters limit the wavelengths of light that interact with and are outcoupled by the outcoupling elements. In yet other embodiments, a different waveguide is provided for each range of wavelengths to be outputted. Incoupling optical elements selectively incouple light of the appropriate range of wavelengths into a corresponding waveguide, from which the light is outcoupled.
Examples of wearable devices that can present to a user of the display device an audible or visual representation of an audio file comprising a plurality of stem tracks that represent different audio content of the audio file are described. Systems and methods are described that determine the pose of the user; generate, based on the pose of the user, an audio mix of at least one of the plurality of stem tracks of the audio file; generate, based on the pose of the user and the audio mix, a visualization of the audio mix; communicate an audio signal representative of the audio mix to the speaker; and communicate a visual signal representative of the visualization of the audio mix to the display.
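The pose-driven audio mix described in the abstract above can be sketched by assigning each stem track a direction and letting its gain follow how closely the user's head yaw points at it. The cosine falloff, the stem directions, and the function names are all illustrative assumptions, not details from the patent.

```python
import math

def stem_gains(head_yaw_deg, stem_directions_deg):
    """Gain per stem track from head pose.

    Gain peaks at 1.0 when the head faces the stem's assumed direction
    and fades to 0.0 when facing away (cosine falloff, clamped at zero)."""
    gains = {}
    for name, direction in stem_directions_deg.items():
        delta = math.radians(head_yaw_deg - direction)
        gains[name] = max(0.0, math.cos(delta))
    return gains

def mix(stems, gains):
    """Sum equal-length stem sample buffers weighted by their gains."""
    n = len(next(iter(stems.values())))
    return [sum(gains[name] * buf[i] for name, buf in stems.items())
            for i in range(n)]
```

Facing the "vocals" stem head-on gives it full gain while a stem placed directly behind the user is silenced; the resulting gain map could also drive the visualization the abstract describes.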