Magic Leap, Inc.

United States of America

1-100 of 3,213 for Magic Leap, Inc.

Aggregations
IP Type
        Patent 3,066
        Trademark 147
Jurisdiction
        United States 2,317
        World 712
        Canada 170
        Europe 14
Date
New (last 4 weeks) 34
2025 May (MTD) 31
2025 April 16
2025 March 33
2025 February 18
IPC Class
G02B 27/01 - Head-up displays 1,431
G06T 19/00 - Manipulating 3D models or images for computer graphics 828
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer 776
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups 495
F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems 287
NICE Class
09 - Scientific and electric apparatus and instruments 134
42 - Scientific, technological and industrial services, research and design 57
41 - Education, entertainment, sporting and cultural services 40
38 - Telecommunications services 35
35 - Advertising and business services 30
Status
Pending 436
Registered / In Force 2,777

1.

AUGMENTED AND VIRTUAL REALITY DISPLAY SYSTEMS WITH CORRELATED IN-COUPLING AND OUT-COUPLING OPTICAL REGIONS FOR EFFICIENT LIGHT UTILIZATION

      
Application Number 19027789
Status Pending
Filing Date 2025-01-17
First Publication Date 2025-05-22
Owner Magic Leap, Inc. (USA)
Inventor Schowengerdt, Brian T.

Abstract

Augmented reality and virtual reality display systems and devices are configured for efficient use of projected light. In some aspects, a display system includes a light projection system and a head-mounted display configured to project light into an eye of the user to display virtual image content. The head-mounted display includes at least one waveguide comprising a plurality of in-coupling elements each configured to receive, from the light projection system, light corresponding to a portion of the user's field of view and to in-couple the light into the waveguide; and a plurality of out-coupling elements configured to out-couple the light out of the waveguide to display the virtual content, wherein each of the out-coupling elements is configured to receive light from different ones of the in-coupling elements.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

2.

EYE IMAGING WITH AN OFF-AXIS IMAGER

      
Application Number 19030083
Status Pending
Filing Date 2025-01-17
First Publication Date 2025-05-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Klug, Michael Anthony
  • Kaehler, Adrian

Abstract

Examples of an imaging system for use with a head mounted display (HMD) are disclosed. The imaging system can include a forward-facing imaging camera and a surface of a display of the HMD can include an off-axis diffractive optical element (DOE) or hot mirror configured to reflect light to the imaging camera. The DOE or hot mirror can be segmented. The imaging system can be used for eye tracking, biometric identification, multiscopic reconstruction of the three-dimensional shape of the eye, etc.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • A61B 3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
  • A61B 3/113 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
  • A61B 3/12 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
  • A61B 3/14 - Arrangements specially adapted for eye photography
  • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
  • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
  • A61B 5/117 - Identification of persons
  • A61B 5/16 - Devices for psychotechnics; Testing reaction times
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups

3.

SYSTEMS AND METHODS FOR AUGMENTED REALITY

      
Application Number 19030887
Status Pending
Filing Date 2025-01-17
First Publication Date 2025-05-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Woods, Michael Janusz
  • Rabinovich, Andrew
  • Taylor, Richard Leslie

Abstract

Methods and systems for triggering presentation of virtual content based on sensor information. The display system may be an augmented reality display system configured to provide virtual content on a plurality of depth planes using different wavefront divergences. The system may monitor information detected via the sensors, and based on the monitored information, trigger access to virtual content identified in the sensor information. Virtual content can be obtained, and presented as augmented reality content via the display system. The system may monitor information detected via the sensors to identify a QR code, or a presence of a wireless beacon. The QR code or wireless beacon can trigger the display system to obtain virtual content for presentation.
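
As a rough sketch of the trigger flow described above (a QR code or wireless beacon is detected via the sensors, matching virtual content is fetched, and the content is presented), the following Python is illustrative only; the sensors, content_service, and display objects and their methods are hypothetical placeholders rather than any Magic Leap API.

```python
def poll_triggers(sensors, content_service, display):
    """Illustrative trigger loop: fetch and present virtual content when a
    QR code or wireless beacon is detected (all APIs here are hypothetical)."""
    qr = sensors.camera.detect_qr_code()        # returns None or an object with .payload
    beacon = sensors.radio.detect_beacon()      # returns None or an object with .identifier
    if qr is not None:
        content = content_service.fetch(content_id=qr.payload)
    elif beacon is not None:
        content = content_service.fetch(content_id=beacon.identifier)
    else:
        return  # nothing detected; no content to present
    display.present_on_depth_planes(content)    # render as augmented reality content
```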

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • A63F 13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

4.

METHOD OF WAKING A DEVICE USING SPOKEN VOICE COMMANDS

      
Application Number 19026149
Status Pending
Filing Date 2025-01-16
First Publication Date 2025-05-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Roach, David Thomas
  • Jot, Jean-Marc
  • Lee, Jung-Suk

Abstract

Disclosed herein are systems and methods for processing speech signals in mixed reality applications. A method may include receiving an audio signal; determining, via first processors, whether the audio signal comprises a voice onset event; in accordance with a determination that the audio signal comprises the voice onset event: waking a second one or more processors; determining, via the second processors, that the audio signal comprises a predetermined trigger signal; in accordance with a determination that the audio signal comprises the predetermined trigger signal: waking third processors; performing, via the third processors, automatic speech recognition based on the audio signal; and in accordance with a determination that the audio signal does not comprise the predetermined trigger signal: forgoing waking the third processors; and in accordance with a determination that the audio signal does not comprise the voice onset event: forgoing waking the second processors.
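
The abstract describes a tiered wake pipeline in which progressively more capable processors are woken only after a cheaper check passes. A minimal Python sketch of that gating logic follows, assuming hypothetical detector and ASR objects; a real system would operate on streaming audio rather than a single buffer.

```python
def run_wake_pipeline(audio, onset_detector, trigger_detector, asr_engine):
    """Gate progressively more expensive processors on cheaper checks
    (a sketch of the three-tier scheme; all object APIs are hypothetical)."""
    # Tier 1: always-on, low-power voice onset detection.
    if not onset_detector.detect_onset(audio):
        return None                      # forgo waking the later tiers
    # Tier 2: wake the mid-tier processor to check for the trigger phrase.
    trigger_detector.wake()
    if not trigger_detector.contains_trigger(audio):
        return None                      # forgo waking the ASR processor
    # Tier 3: wake the heavyweight processor and run automatic speech recognition.
    asr_engine.wake()
    return asr_engine.transcribe(audio)
```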

IPC Classes  ?

  • H04R 5/04 - Circuit arrangements
  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G10L 15/08 - Speech classification or search
  • H04R 3/00 - Circuits for transducers
  • H04R 3/04 - Circuits for transducers for correcting frequency response
  • H04R 5/033 - Headphones for stereophonic communication

5.

METHODS AND SYSTEMS FOR AUGMENTED REALITY DISPLAY WITH DYNAMIC FIELD OF VIEW

      
Application Number 19028036
Status Pending
Filing Date 2025-01-17
First Publication Date 2025-05-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Chang, Chieh
  • Liu, Victor Kai
  • Bhargava, Samarth
  • Li, Ling
  • Bhagat, Sharad D.
  • Peroz, Christophe
  • Mareno, Jason Donald

Abstract

A foveated display for projecting an image to an eye of a viewer is provided. The foveated display includes a first projector and a dynamic eyepiece optically coupled to the first projector. The dynamic eyepiece comprises a waveguide having a variable surface profile. The foveated display also includes a second projector and a fixed depth plane eyepiece optically coupled to the second projector.

IPC Classes  ?

  • G02B 30/52 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels the 3D volume being constructed from a stack or sequence of 2D planes, e.g. depth sampling systems
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups

6.

OPTICAL LAYERS TO IMPROVE PERFORMANCE OF EYEPIECES FOR USE WITH VIRTUAL AND AUGMENTED REALITY DISPLAY SYSTEMS

      
Application Number 18875645
Status Pending
Filing Date 2022-06-17
First Publication Date 2025-05-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Tekolste, Robert D.
  • Singh, Vikramjit
  • Khandekar, Chinmay

Abstract

Improved diffractive optical elements for use in an eyepiece for an extended reality system are disclosed. The diffractive optical elements comprise a diffraction structure having a waveguide substrate, a surface grating positioned on a first side of the waveguide substrate, and one or more optical layer pairs disposed between the waveguide substrate and the surface grating. Each optical layer pair comprises a low index layer and a high index layer disposed directly on an exterior side of the low index layer.

IPC Classes  ?

7.

AUGMENTED REALITY DISPLAY HAVING MULTI-ELEMENT ADAPTIVE LENS FOR CHANGING DEPTH PLANES

      
Application Number 19027972
Status Pending
Filing Date 2025-01-17
First Publication Date 2025-05-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Schaefer, Jason
  • Cheng, Hui-Chuan
  • Manly, David
  • Trisnadi, Jahja I.
  • Carlisle, Clinton
  • Klug, Michael Anthony

Abstract

In some embodiments, an augmented reality system includes at least one waveguide that is configured to receive and redirect light toward a user, and is further configured to allow ambient light from an environment of the user to pass therethrough toward the user. The augmented reality system also includes a first adaptive lens assembly positioned between the at least one waveguide and the environment, a second adaptive lens assembly positioned between the at least one waveguide and the user, and at least one processor operatively coupled to the first and second adaptive lens assemblies. Each lens assembly of the augmented reality system is selectively switchable between at least two different states in which the respective lens assembly is configured to impart at least two different optical powers to light passing therethrough, respectively. The at least one processor is configured to cause the first and second adaptive lens assemblies to synchronously switch between different states in a manner such that the first and second adaptive lens assemblies impart a substantially constant net optical power to ambient light from the environment passing therethrough.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02F 1/01 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
  • G02F 1/13 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells
  • G02F 1/1337 - Surface-induced orientation of the liquid crystal molecules, e.g. by alignment layers
  • G02F 1/29 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the position or the direction of light beams, i.e. deflection

8.

ANGULARLY SELECTIVE ATTENUATION OF LIGHT TRANSMISSION ARTIFACTS IN WEARABLE DISPLAYS

      
Application Number 19029730
Status Pending
Filing Date 2025-01-17
First Publication Date 2025-05-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Messer, Kevin
  • Haddock, Joshua Naaman
  • Cheng, Hui-Chuan
  • Mathur, Vaibhav
  • Carlisle, Clinton

Abstract

A wearable display system includes an eyepiece stack having a world side and a user side opposite the world side. During use, a user positioned on the user side views displayed images delivered by the wearable display system via the eyepiece stack, which augment the user's field of view of the user's environment. The system also includes an optical attenuator arranged on the world side of the eyepiece stack, the optical attenuator having a layer of a birefringent material with a plurality of domains, each having a principal optic axis oriented in a direction different from the directions of the other domains. Each domain of the optical attenuator reduces transmission of visible light incident on the optical attenuator for a corresponding different range of angles of incidence.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 25/00 - Eyepieces; Magnifying glasses
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups, for polarising
  • G02F 1/01 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
  • G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
  • G02F 1/13363 - Birefringent elements, e.g. for optical compensation
  • G02F 1/1337 - Surface-induced orientation of the liquid crystal molecules, e.g. by alignment layers
  • G02F 1/1343 - Electrodes
  • G02F 1/139 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering based on orientation effects in which the liquid crystal remains transparent

9.

WAVEGUIDE ILLUMINATOR

      
Application Number 19030440
Status Pending
Filing Date 2025-01-17
First Publication Date 2025-05-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Curtis, Kevin Richard
  • Sissom, Bradley Jay
  • Cheng, Hui-Chuan
  • Schuck, Iii, Miller Harry
  • Bhargava, Samarth
  • Arend, Erik Heath

Abstract

A head mounted display system is configured to project a first image to an eye of a user. The head mounted display system includes at least one waveguide comprising a first major surface, a second major surface opposite the first major surface, and a first edge and a second edge between the first major surface and the second major surface. The at least one waveguide also includes a first reflector disposed between the first major surface and the second major surface. The head mounted display system also includes at least one light source disposed closer to the first major surface than the second major surface and a spatial light modulator configured to form a second image and disposed closer to the first major surface than the second major surface, wherein the first reflector is configured to reflect light toward the spatial light modulator.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

10.

ILLUMINATION LAYOUT FOR COMPACT PROJECTION SYSTEM

      
Application Number 19019310
Status Pending
Filing Date 2025-01-13
First Publication Date 2025-05-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Uhlendorf, Kristina
  • Curtis, Kevin Richard
  • Tekolste, Robert D.
  • Singh, Vikramjit

Abstract

An apparatus including a set of three illumination sources disposed in a first plane. Each of the set of three illumination sources is disposed at a position in the first plane offset from others of the set of three illumination sources by 120 degrees measured in polar coordinates. The apparatus also includes a set of three waveguide layers disposed adjacent the set of three illumination sources. Each of the set of three waveguide layers includes an incoupling diffractive element disposed at a lateral position offset by 180 degrees from a corresponding illumination source of the set of three illumination sources.
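
The layout places the three sources 120 degrees apart in polar coordinates, with each incoupling element offset 180 degrees from its source. The short sketch below only computes those positions for an assumed radius about an assumed origin; it is a geometric illustration, not the patented optical design.

```python
import math

def source_positions(radius):
    """Cartesian positions of three illumination sources spaced 120 degrees apart
    in a plane (illustrative only; the radius is an assumed layout parameter)."""
    return [(radius * math.cos(math.radians(a)), radius * math.sin(math.radians(a)))
            for a in (0.0, 120.0, 240.0)]

def incoupling_position(source_angle_deg, radius):
    """Incoupling element of the corresponding waveguide layer, laterally offset
    by 180 degrees from its illumination source."""
    a = math.radians(source_angle_deg + 180.0)
    return (radius * math.cos(a), radius * math.sin(a))
```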

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/42 - Diffraction optics
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

11.

VIRTUAL AND AUGMENTED REALITY DISPLAY SYSTEMS WITH EMISSIVE MICRO-DISPLAYS

      
Application Number 19022453
Status Pending
Filing Date 2025-01-15
First Publication Date 2025-05-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Klug, Michael Anthony
  • Poliakov, Evgeni
  • Trisnadi, Jahja I.
  • Chung, Hyunsun
  • Edwin, Lionel Ernest
  • Cohen, Howard Russell
  • Taylor, Robert Blake
  • Russell, Andrew Ian
  • Curtis, Kevin Richard
  • Carlisle, Clinton

Abstract

A wearable display system includes one or more emissive micro-displays, e.g., micro-LED displays. The micro-displays may be monochrome micro-displays or full-color micro-displays. The micro-displays may include arrays of light emitters. Light collimators may be utilized to narrow the angular emission profile of light emitted by the light emitters. Where a plurality of emissive micro-displays is utilized, the micro-displays may be positioned at different sides of an optical combiner, e.g., an X-cube prism which receives light rays from different micro-displays and outputs the light rays from the same face of the cube. The optical combiner directs the light to projection optics, which outputs the light to an eyepiece that relays the light to a user's eye. The eyepiece may output the light to the user's eye with different amounts of wavefront divergence, to place virtual content on different depth planes.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 6/26 - Optical coupling means
  • G02B 27/09 - Beam shaping, e.g. changing the cross-sectioned area, not otherwise provided for
  • G02B 27/10 - Beam splitting or combining systems
  • G02B 27/14 - Beam splitting or combining systems operating by reflection only
  • G02B 27/18 - Optical systems or apparatus not provided for by any of the groups, for optical projection, e.g. combination of mirror and condenser and objective
  • G02B 27/30 - Collimators
  • G02B 27/40 - Optical focusing aids
  • G02B 27/62 - Optical apparatus specially adapted for adjusting optical elements during the assembly of optical systems
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G09G 3/32 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix using controlled light sources using electroluminescent panels semiconductive, e.g. using light-emitting diodes [LED]
  • H02N 2/02 - Electric machines in general using piezoelectric effect, electrostriction or magnetostriction producing linear motion, e.g. actuators; Linear positioners

12.

FAN ASSEMBLY FOR DISPLAYING AN IMAGE

      
Application Number 19022842
Status Pending
Filing Date 2025-01-15
First Publication Date 2025-05-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Rohena, Guillermo Padin
  • Remsburg, Ralph
  • Kaehler, Adrian
  • Rynk, Evan Francis

Abstract

Apparatus and methods for displaying an image by a rotating structure are provided. The rotating structure can comprise blades of a fan. The fan can be a cooling fan for an electronics device such as an augmented reality display. In some embodiments, the rotating structure comprises light sources that emit light to generate the image. The light sources can comprise light-field emitters. In other embodiments, the rotating structure is illuminated by an external (e.g., non-rotating) light source.

IPC Classes  ?

  • G09G 3/02 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes by tracing or scanning a light beam on a screen
  • F04D 17/16 - Centrifugal pumps for displacing without appreciable compression
  • F04D 25/08 - Units comprising pumps and their driving means the working fluid being air, e.g. for ventilation
  • F04D 29/00 - Details, component parts, or accessories
  • F04D 29/42 - Casings; Connections for working fluid for radial or helico-centrifugal pumps
  • G02B 27/01 - Head-up displays
  • G02B 30/56 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels by projecting aerial or floating images
  • G06F 3/16 - Sound input; Sound output
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G08B 21/18 - Status alarms
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G09G 5/10 - Intensity circuits
  • H04N 5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects

13.

ELECTROMAGNETIC TRACKING WITH AUGMENTED REALITY SYSTEMS

      
Application Number 19022978
Status Pending
Filing Date 2025-01-15
First Publication Date 2025-05-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Bucknor, Brian
  • Lopez, Christopher
  • Woods, Michael Janusz
  • Aly, Aly H. M.
  • Palmer, James William
  • Rynk, Evan Francis

Abstract

Head-mounted augmented reality (AR) devices can track pose of a wearer's head to provide a three-dimensional virtual representation of objects in the wearer's environment. An electromagnetic (EM) tracking system can track head or body pose. A handheld user input device can include an EM emitter that generates an EM field, and the head-mounted AR device can include an EM sensor that senses the EM field. EM information from the sensor can be analyzed to determine location and/or orientation of the sensor and thereby the wearer's pose. The EM emitter and sensor may utilize time division multiplexing (TDM) or dynamic frequency tuning to operate at multiple frequencies. Voltage gain control may be implemented in the transmitter, rather than the sensor, allowing smaller and lighter weight sensor designs. The EM sensor can implement noise cancellation to reduce the level of EM interference generated by nearby audio speakers.
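
One of the techniques mentioned, time division multiplexing, can be pictured as a frequency-hopping schedule shared by the EM emitter and sensor. The generator below is a generic TDM sketch under assumed parameters, not the patent's implementation; real hardware would also need synchronized clocks and coil settling time.

```python
from itertools import cycle

def tdm_schedule(frequencies_hz, slot_duration_s):
    """Yield (frequency, start_time) pairs so the emitter and sensor hop through
    a fixed set of frequencies, one per time slot (illustrative sketch only)."""
    t = 0.0
    for f in cycle(frequencies_hz):
        yield f, t
        t += slot_duration_s
```

For example, tdm_schedule([27_000, 33_000, 39_000], 0.005) would cycle three assumed operating frequencies every 5 ms; the values are illustrative, not drawn from the patent.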

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G01S 1/68 - Marker, boundary, call-sign, or like beacons transmitting signals not carrying directional information
  • G01S 1/70 - Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith using electromagnetic waves other than radio waves
  • G01S 5/02 - Position-fixing by co-ordinating two or more direction or position-line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/16 - Sound input; Sound output
  • H01H 9/02 - Bases, casings, or covers

14.

DETERMINING INPUT FOR SPEECH PROCESSING ENGINE

      
Application Number 19026113
Status Pending
Filing Date 2025-01-16
First Publication Date 2025-05-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Sheeder, Anthony Robert
  • Leider, Colby Nelson

Abstract

A method of presenting a signal to a speech processing engine is disclosed. According to an example of the method, an audio signal is received via a microphone. A portion of the audio signal is identified, and a probability is determined that the portion comprises speech directed by a user of the speech processing engine as input to the speech processing engine. In accordance with a determination that the probability exceeds a threshold, the portion of the audio signal is presented as input to the speech processing engine. In accordance with a determination that the probability does not exceed the threshold, the portion of the audio signal is not presented as input to the speech processing engine.
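
The gating step in this abstract reduces to a probability threshold. A minimal sketch, assuming a hypothetical classifier and speech engine interface; the 0.5 threshold is an arbitrary example value:

```python
def gate_speech_input(audio_segment, classifier, speech_engine, threshold=0.5):
    """Present a segment to the speech processing engine only if it is likely
    to be user-directed input (all object APIs here are hypothetical)."""
    p = classifier.probability_user_directed(audio_segment)
    if p > threshold:
        return speech_engine.process(audio_segment)   # probability exceeds threshold
    return None                                       # otherwise, do not present it
```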

IPC Classes  ?

  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G10L 15/14 - Speech classification or search using statistical models, e.g. Hidden Markov Models [HMM]
  • G10L 15/25 - Speech recognition using non-acoustical features using position of the lips, movement of the lips or face analysis
  • G10L 15/30 - Distributed recognition, e.g. in client-server systems, for mobile phones or network applications

15.

CURRENT DRAIN REDUCTION IN AR/VR DISPLAY SYSTEMS

      
Application Number 19022748
Status Pending
Filing Date 2025-01-15
First Publication Date 2025-05-15
Owner Magic Leap, Inc. (USA)
Inventor Mor, Tal

Abstract

In some embodiments, eye tracking is used on an AR or VR display system to determine if a user of the display system is blinking or otherwise cannot see. In response, current drain or power usage of a display associated with the display system may be reduced, for example, by dimming or turning off a light source associated with the display, or by configuring a graphics driver to skip a designated number of frames or reduce a refresh rate for a designated period of time.
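
A simplified sketch of the power-saving behavior described above, assuming hypothetical eye_tracker, light_source, and graphics_driver interfaces; the dimming level and the number of skipped frames are arbitrary example values:

```python
def update_display_power(eye_tracker, light_source, graphics_driver, frames_to_skip=5):
    """Reduce current drain while the user cannot see the display, e.g. during a blink
    (a sketch only; the interfaces and parameter values are assumptions)."""
    if eye_tracker.is_blinking() or not eye_tracker.eye_detected():
        light_source.set_brightness(0.0)             # dim or turn off the light source
        graphics_driver.skip_frames(frames_to_skip)  # or lower the refresh rate
    else:
        light_source.set_brightness(1.0)
        graphics_driver.resume_normal_refresh()
```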

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 1/3231 - Monitoring the presence, absence or movement of users
  • G06F 1/3234 - Power saving characterised by the action undertaken
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G09G 3/34 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix by control of light from an independent source
  • G09G 3/36 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix by control of light from an independent source using liquid crystals

16.

SYSTEMS AND METHODS FOR CROSS-APPLICATION AUTHORING, TRANSFER, AND EVALUATION OF RIGGING CONTROL SYSTEMS FOR VIRTUAL CHARACTERS

      
Application Number 19022769
Status Pending
Filing Date 2025-01-15
First Publication Date 2025-05-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Wedig, Geoffrey
  • Bancroft, James Jonathan

Abstract

Various examples of cross-application systems and methods for authoring, transferring, and evaluating rigging control systems for virtual characters are disclosed. Embodiments of a method include the steps or processes of creating, in a first application which implements a first rigging control protocol, a rigging control system description; writing the rigging control system description to a data file; and initiating transfer of the data file to a second application. In such embodiments, the rigging control system description may be defined according to a different second rigging control protocol. The rigging control system description may specify a rigging control input, such as a lower-order rigging element (e.g., a core skeleton for a virtual character), and at least one rule for operating on the rigging control input to produce a rigging control output, such as a higher-order skeleton or other higher-order rigging element.
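
The transfer step amounts to writing the rigging control system description to a data file in one application and reading it back in another that may use a different rigging control protocol. The sketch below assumes JSON as the file format and a hypothetical build_rig call on the receiving application; the patent does not specify either.

```python
import json

def export_rig(rig_description: dict, path: str) -> None:
    """Write a rigging control system description (inputs, rules, outputs) to a
    data file. JSON is an assumed format for illustration."""
    with open(path, "w") as f:
        json.dump(rig_description, f, indent=2)

def import_rig(path: str, target_app):
    """Read the description in a second application; translating rules between
    rigging control protocols is omitted (target_app.build_rig is hypothetical)."""
    with open(path) as f:
        description = json.load(f)
    return target_app.build_rig(description)
```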

IPC Classes  ?

  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

17.

AUGMENTED AND VIRTUAL REALITY EYEWEAR, SYSTEMS, AND METHODS FOR DELIVERING POLARIZED LIGHT AND DETERMINING GLUCOSE LEVELS

      
Application Number 19022813
Status Pending
Filing Date 2025-01-15
First Publication Date 2025-05-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Robaina, Nastasja U.
  • Samec, Nicole Elizabeth
  • Baerenrodt, Mark

Abstract

Various embodiments of a user-wearable device can comprise a frame configured to mount on a user. The device can include a display attached to the frame and configured to direct virtual images to an eye of the user. The device can also include a light source configured to provide polarized light to the eye of the user such that the polarized light reflects from the eye of the user. The device can further include a light analyzer configured to determine a polarization angle rotation of the reflected light from the eye of the user such that a glucose level of the user can be determined based at least in part on the polarization angle rotation of the reflected light.
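
The abstract does not give the relation between the measured rotation and the glucose level; the standard polarimetric model (Biot's law) is the usual starting point, with the observed rotation proportional to the specific rotation of glucose at the operating wavelength and temperature, the optical path length, and the concentration:

```latex
% Biot's law (standard polarimetry, not taken from the patent text):
% \alpha    : measured polarization angle rotation
% [\alpha]  : specific rotation of glucose at wavelength \lambda and temperature T
% l         : optical path length through the sample
% c         : glucose concentration
\alpha = [\alpha]_{\lambda}^{T}\, l\, c
\qquad\Longrightarrow\qquad
c = \frac{\alpha}{[\alpha]_{\lambda}^{T}\, l}
```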

IPC Classes  ?

  • A61B 5/1455 - Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value using optical sensors, e.g. spectral photometrical oximeters
  • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
  • A61B 5/0205 - Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
  • A61B 5/024 - Measuring pulse rate or heart rate
  • A61B 5/08 - Measuring devices for evaluating the respiratory organs
  • A61B 5/145 - Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

18.

AUGMENTED REALITY SYSTEMS AND METHODS UTILIZING REFLECTIONS

      
Application Number 19022825
Status Pending
Filing Date 2025-01-15
First Publication Date 2025-05-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Harrises, Christopher M.
  • Samec, Nicole Elizabeth
  • Robaina, Nastasja U.
  • Baerenrodt, Mark
  • Wright, Adam Carl
  • Kaehler, Adrian

Abstract

A display system includes a wearable display device for displaying augmented reality content. The display device comprises a display area comprising light redirecting features that are configured to direct light to a user. The display area is at least partially transparent and is configured to provide a view of an ambient environment through the display area. The display device is configured to determine that a reflection of the user is within the user's field of view through the display area. After making this determination, augmented reality content is displayed in the display area with the augmented reality content augmenting the user's view of the reflection. In some embodiments, the augmented reality content may be overlaid on the user's view of the reflection, thereby allowing all or portions of the reflection to appear to be modified to provide a realistic view of the user with various modifications made to their appearance.

IPC Classes  ?

  • G06T 11/60 - Editing figures and text; Combining figures or text
  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

19.

METHODS, DEVICES, AND SYSTEMS FOR ILLUMINATING SPATIAL LIGHT MODULATORS

      
Application Number 19025019
Status Pending
Filing Date 2025-01-16
First Publication Date 2025-05-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Cheng, Hui-Chuan
  • Chung, Hyunsun
  • Trisnadi, Jahja I.
  • Carlisle, Clinton
  • Curtis, Kevin Richard
  • Oh, Chulwoo
  • Lin, Wei Chen

Abstract

An optical device may include a wedge-shaped light turning element, a first surface that is parallel to a horizontal axis, a second surface opposite to the first surface that is inclined with respect to the horizontal axis by a wedge angle, and a light module including light emitters. The light module can be configured to combine light emitted by the emitters. The optical device can further include a light input surface that is between the first and the second surfaces and disposed with respect to the light module to receive light emitted from the emitters. The optical device may include an end reflector disposed on a side opposite the light input surface. Light coupled into the light turning element may be reflected by the end reflector and/or reflected from the second surface towards the first surface.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 5/30 - Polarising elements
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/14 - Beam splitting or combining systems operating by reflection only
  • G02B 30/26 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the autostereoscopic type
  • G02B 30/52 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels the 3D volume being constructed from a stack or sequence of 2D planes, e.g. depth sampling systems
  • G02F 1/137 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering
  • G03B 21/00 - Projectors or projection-type viewers; Accessories therefor
  • G03B 21/20 - Lamp housings
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G09G 3/02 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes by tracing or scanning a light beam on a screen
  • G09G 3/24 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix using controlled light sources using incandescent filaments
  • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
  • H04N 13/239 - Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

20.

SYSTEMS AND METHODS FOR VIRTUAL AND AUGMENTED REALITY

      
Application Number 19026203
Status Pending
Filing Date 2025-01-16
First Publication Date 2025-05-15
Owner Magic Leap, Inc. (USA)
Inventor Hand, Randall E.

Abstract

Examples of the disclosure describe systems and methods relating to mobile computing. According to an example method, a first user location of a user of a mobile computing system is determined. A first communication device in proximity to the first user location is identified based on the first user location. A first signal is communicated to the first communication device. A first information payload based on the first user location is received from the first communication device, in response to the first communication device receiving the first signal. Video or audio data based on the first information payload is presented to the user at a first time during which the user is at the first user location.
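
The claimed flow is essentially a location-keyed request/response exchange. An illustrative Python sketch follows, with locator, device_registry, and presenter standing in for the system components named in the abstract; all of these names and methods are hypothetical.

```python
def present_location_content(locator, device_registry, presenter):
    """Illustrative flow: determine the user's location, signal the nearby
    communication device, and present the payload it returns."""
    location = locator.current_location()          # first user location
    device = device_registry.nearest(location)     # first communication device in proximity
    device.send_signal("request_payload")          # first signal
    payload = device.receive_payload()             # first information payload
    presenter.present(video=payload.video, audio=payload.audio)
```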

IPC Classes  ?

  • H04W 4/02 - Services making use of location information

21.

AUGMENTED REALITY DISPLAY WITH WAVEGUIDE CONFIGURED TO CAPTURE IMAGES OF EYE AND/OR ENVIRONMENT

      
Application Number 19028214
Status Pending
Filing Date 2025-01-17
First Publication Date 2025-05-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Sinay, Asif
  • Freedman, Barak
  • Klug, Michael Anthony
  • Oh, Chulwoo
  • Meitav, Nizan

Abstract

Head mounted display systems configured to project light to an eye of a user to display augmented reality image content in a vision field of the user are disclosed. In embodiments, the system includes a frame configured to be supported on a head of the user, an image projector configured to project images into the user's eye, a camera coupled to the frame, a waveguide optically coupled to the camera, an optical coupling optical element, an out-coupling element configured to direct light emitted from the waveguide to the camera, and a first light source configured to direct light to the user's eye through the waveguide. Electronics control the camera to capture images periodically and further control the first light source to pulse in time with the camera such that light emitted by the light source has a reduced intensity when the camera is not capturing images.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups, for polarising
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

22.

METHOD AND SYSTEM FOR EYEPIECE WAVEGUIDE DISPLAYS UTILIZING MULTI-DIRECTIONAL LAUNCH ARCHITECTURES

      
Application Number 19012007
Status Pending
Filing Date 2025-01-07
First Publication Date 2025-05-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Khandekar, Chinmay
  • Tekolste, Robert D.
  • Singh, Vikramjit
  • Liu, Victor Kai
  • Ong, Ryan
  • Uhlendorf, Kristina

Abstract

An eyepiece waveguide for augmented reality applications includes a substrate and a set of incoupling diffractive optical elements coupled to the substrate. A first subset of the set of incoupling diffractive optical elements is operable to diffract light into the substrate along a first range of propagation angles and a second subset of the set of incoupling diffractive optical elements is operable to diffract light into the substrate along a second range of propagation angles. The eyepiece waveguide also includes a combined pupil expander diffractive optical element coupled to the substrate.

IPC Classes  ?

23.

LIGHT FIELD DISPLAY METROLOGY

      
Application Number 19020531
Status Pending
Filing Date 2025-01-14
First Publication Date 2025-05-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Yeoh, Ivan Li Chuen
  • Edwin, Lionel Ernest
  • Miller, Samuel A.

Abstract

Examples of a light field metrology system for use with a display are disclosed. The light field metrology may capture images of a projected light field, and determine focus depths or lateral focus positions for various regions of the light field using the captured images. The determined focus depths or lateral positions may then be compared with intended focus depths or lateral positions, to quantify the imperfections of the display. Based on the measured imperfections, an appropriate error correction may be performed on the light field to correct for the measured imperfections. The display can be an optical display element in a head mounted display, for example, an optical display element capable of generating multiple depth planes or a light field display.

IPC Classes  ?

  • G01B 11/14 - Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures
  • G01B 11/22 - Measuring arrangements characterised by the use of optical techniques for measuring depth
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 3/20 - Linear translation of whole images or parts thereof, e.g. panning
  • G06T 3/40 - Scaling of whole images or parts thereof, e.g. expanding or contracting
  • G06T 3/60 - Rotation of whole images or parts thereof
  • G06T 15/20 - Perspective computation
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G09G 3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix
  • G09G 3/34 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix by control of light from an independent source
  • G09G 5/02 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
  • H04N 13/144 - Processing image signals for flicker reduction
  • H04N 13/327 - Calibration thereof
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
  • H04N 13/395 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
  • H04N 13/398 - Synchronisation thereof; Control thereof

24.

OPTICAL DEVICE WITH ONE-WAY MIRROR

      
Application Number 19016085
Status Pending
Filing Date 2025-01-10
First Publication Date 2025-05-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Oh, Chulwoo
  • Komanduri, Ravi Kumar
  • Kleinman, David
  • Mathur, Vaibhav
  • Manly, David

Abstract

In some implementations, an optical device includes a one-way mirror formed by a polarization selective mirror and an absorptive polarizer. The absorptive polarizer has a transmission axis aligned with the transmission axis of the reflective polarizer. The one-way mirror may be provided on the world side of a head-mounted display system. Advantageously, the one-way mirror may reflect light from the world, which provides privacy and may improve the cosmetics of the display. In some implementations, the one-way mirror may include one or more of a depolarizer and a pair of opposing waveplates to improve alignment tolerances and reduce reflections to a viewer. In some implementations, the one-way mirror may form a compact integrated structure with a dimmer for reducing light transmitted to the viewer from the world.

IPC Classes  ?

  • G02B 5/30 - Polarising elements
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 5/08 - Mirrors
  • G02B 27/01 - Head-up displays
  • G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors

25.

METASURFACES FOR REDIRECTING LIGHT AND METHODS FOR FABRICATING

      
Application Number 19018490
Status Pending
Filing Date 2025-01-13
First Publication Date 2025-05-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Lin, Dianmin
  • Melli, Mauro
  • St. Hilaire, Pierre
  • Peroz, Christophe
  • Poliakov, Evgeni

Abstract

A display system comprises a waveguide having light incoupling or light outcoupling optical elements formed of a metasurface. The metasurface is a multilevel (e.g., bi-level, tri-level, etc.) structure having a first level defined by spaced apart protrusions formed of a first optically transmissive material and a second optically transmissive material between the protrusions. The metasurface can also include a second level formed by the second optically transmissive material. The protrusions on the first level may be patterned by nanoimprinting the first optically transmissive material, and the second optically transmissive material may be deposited over and between the patterned protrusions. The widths of the protrusions and the spacing between the protrusions may be selected to diffract light, and a pitch of the protrusions may be 10-600 nm.

IPC Classes  ?

  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 5/18 - Diffracting gratings
  • G02B 6/00 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
  • G02B 6/122 - Basic optical elements, e.g. light-guiding paths
  • G02B 6/293 - Optical coupling means having data bus means, i.e. plural waveguides interconnected and providing an inherently bidirectional system by mixing and splitting signals with wavelength selective means
  • G02B 27/01 - Head-up displays

26.

SURFACE RELIEF WAVEGUIDES WITH HIGH REFRACTIVE INDEX RESIST

      
Application Number 19018967
Status Pending
Filing Date 2025-01-13
First Publication Date 2025-05-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Traub, Matthew C.
  • Liu, Yingnan
  • Singh, Vikramjit
  • Xu, Frank Y.
  • Tekolste, Robert D.
  • Xue, Qizhen
  • Bhargava, Samarth
  • Liu, Victor Kai
  • Born, Brandon Michael-James
  • Messer, Kevin

Abstract

The disclosure describes an improved drop-on-demand, controlled volume technique for dispensing resist onto a substrate, which is then imprinted to create a patterned optical device suitable for use in optical applications such as augmented reality and/or mixed reality systems. The technique enables the dispensation of drops of resist at precise locations on the substrate, with precisely controlled drop volume corresponding to an imprint template having different zones associated with different total resist volumes. Controlled drop size and placement also provides for substantially less variation in residual layer thickness across the surface of the substrate after imprinting, compared to previously available techniques. The technique employs resist having a refractive index closer to that of the substrate index, reducing optical artifacts in the device. To ensure reliable dispensing of the higher index and higher viscosity resist in smaller drop sizes, the dispensing system can continuously circulate the resist.

IPC Classes  ?

  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G03F 7/00 - Photomechanical, e.g. photolithographic, production of textured or patterned surfaces, e.g. printed surfaces; Materials therefor, e.g. comprising photoresists; Apparatus specially adapted therefor

27.

DISPLAY SYSTEM WITH SPATIAL LIGHT MODULATOR ILLUMINATION FOR DIVIDED PUPILS

      
Application Number 19019106
Status Pending
Filing Date 2025-01-13
First Publication Date 2025-05-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Cheng, Hui-Chuan
  • Oh, Chulwoo
  • Carlisle, Clinton
  • Klug, Michael Anthony
  • Molteni, Jr., William J.

Abstract

Illumination systems that separate different colors into laterally displaced beams may be used to direct different color image content into an eyepiece for displaying images in the eye. Such an eyepiece may be used, for example, for an augmented reality head mounted display. Illumination systems may be provided that utilize one or more waveguides to direct light from a light source towards a spatial light modulator. Light from the spatial light modulator may be directed towards an eyepiece. Some aspects of the invention provide for light of different colors to be outcoupled at different angles from the one or more waveguides and directed along different beam paths.

IPC Classes  ?

  • G02B 27/09 - Beam shaping, e.g. changing the cross-sectioned area, not otherwise provided for
  • G02B 27/01 - Head-up displays
  • G02B 27/10 - Beam splitting or combining systems
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

28.

INTEGRATED OPTICAL COMPONENTS FOR HEAD MOUNTED DISPLAY DEVICES

      
Application Number 18686632
Status Pending
Filing Date 2022-08-31
First Publication Date 2025-05-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Nguyen, Bach
  • Singh, Vikramjit
  • Komanduri, Ravi Kumar
  • Shultz, Jason Allen
  • Ong, Ryan Jason
  • Traub, Matthew
  • Xu, Frank Y.

Abstract

In an example method for forming a variable optical viewing optics assembly (VOA) for a head mounted display, a prepolymer is deposited onto a substrate having a first optical element for the VOA. Further, a mold is applied to the prepolymer to conform the prepolymer to a curved surface of the mold on a first side of the prepolymer and to conform the prepolymer to a surface of the substrate on a second side of the prepolymer opposite the first side. Further, the prepolymer is exposed to actinic radiation sufficient to form a solid polymer from the prepolymer, such that the solid polymer forms an ophthalmic lens having a curved surface corresponding to the curved surface of the mold, and the substrate and the ophthalmic lens form an integrated optical component. The mold is released from the solid polymer, and the VOA is assembled using the integrated optical component.

IPC Classes  ?

29.

STACKED WAVEGUIDES HAVING DIFFERENT DIFFRACTION GRATINGS FOR COMBINED FIELD OF VIEW

      
Application Number 18975158
Status Pending
Filing Date 2024-12-10
First Publication Date 2025-05-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Oh, Chulwoo
  • Parthiban, Vikraman

Abstract

In one aspect, an optical device comprises a plurality of waveguides formed over one another and having formed thereon respective diffraction gratings, wherein the respective diffraction gratings are configured to diffract visible light incident thereon into respective waveguides, such that visible light diffracted into the respective waveguides propagates therewithin. The respective diffraction gratings are configured to diffract the visible light into the respective waveguides within respective fields of view (FOVs) with respect to layer normal directions of the respective waveguides. The respective FOVs are such that the plurality of waveguides are configured to diffract the visible light within a combined FOV that is continuous and greater than each of the respective FOVs.

IPC Classes  ?

  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 5/18 - Diffracting gratings
  • G02B 6/10 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/01 - Head-up displays
  • G02B 27/42 - Diffraction optics
  • G02F 1/29 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the position or the direction of light beams, i.e. deflection
  • H04N 9/31 - Projection devices for colour picture display

30.

METHODS AND APPARATUSES FOR PROVIDING INPUT FOR HEAD-WORN IMAGE DISPLAY DEVICES

      
Application Number 19004867
Status Pending
Filing Date 2024-12-30
First Publication Date 2025-05-01
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Pazmino, Lorena
  • Montoya, Andrea Isabel
  • Niles, Savannah
  • Rocha, Alexander
  • Bragg, Mario Antonio
  • Goel, Parag
  • Sommers, Jeffrey Scott
  • Lundmark, David Charles

Abstract

An apparatus for use with an image display device configured to be head-worn by a user includes: a screen; and a processing unit configured to assign a first area of the screen to sense finger-action of the user; wherein the processing unit is configured to generate an electronic signal to cause a change in a content displayed by the display device based on the finger-action of the user sensed by the assigned first area of the screen of the apparatus.
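
In code terms, the apparatus routes finger actions detected inside an assigned region of its screen into content changes on the head-worn display. A minimal sketch, with the rectangle test and the change_content call as hypothetical placeholders:

```python
def handle_finger_action(event, assigned_area, display_device):
    """Forward finger actions sensed inside the assigned first area of the screen
    to the head-worn image display device (all APIs here are hypothetical)."""
    x, y = event.position
    if assigned_area.contains(x, y):
        # Generate the electronic signal that changes the displayed content.
        display_device.change_content(gesture=event.gesture)
```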

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
  • G06F 3/04886 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

31.

IMAGING MODIFICATION, DISPLAY AND VISUALIZATION USING AUGMENTED AND VIRTUAL REALITY EYEWEAR

      
Application Number 19005650
Status Pending
Filing Date 2024-12-30
First Publication Date 2025-05-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Robaina, Nastasja U.
  • Samec, Nicole Elizabeth
  • Harrises, Christopher M.
  • Abovitz, Rony
  • Baerenrodt, Mark
  • Schmidt, Brian Lloyd

Abstract

A display system can include a head-mounted display configured to project light to an eye of a user to display augmented reality image content to the user. The display system can include one or more user sensors configured to sense the user and can include one or more environmental sensors configured to sense surroundings of the user. The display system can also include processing electronics in communication with the display, the one or more user sensors, and the one or more environmental sensors. The processing electronics can be configured to sense a situation involving user focus, determine user intent for the situation, and alter user perception of a real or virtual object within the vision field of the user based at least in part on the user intent and/or sensed situation involving user focus. The processing electronics can be configured to at least one of enhance or de-emphasize the user perception of the real or virtual object within the vision field of the user.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • A61B 17/00 - Surgical instruments, devices or methods
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups, e.g. for luxation treatment or for protecting wound edges
  • A61B 90/50 - Supports for surgical instruments, e.g. articulated arms
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

32.

HYBRID POLYMER WAVEGUIDE AND METHODS FOR MAKING THE SAME

      
Application Number 18987684
Status Pending
Filing Date 2024-12-19
First Publication Date 2025-04-24
Owner Magic Leap, Inc. (USA)
Inventor
  • Peroz, Christophe
  • Chang, Chieh
  • Bhagat, Sharad D.

Abstract

In some embodiments, a head-mounted augmented reality display system comprises one or more hybrid waveguides configured to display images by directing modulated light containing image information into the eyes of a viewer. Each hybrid waveguide is formed of two or more layers of different materials. A first (e.g., thicker) layer is a highly optically transparent core layer, and a second (e.g., thinner) auxiliary layer includes a pattern of protrusions and indentations, e.g., to form a diffractive optical element. The pattern may be formed by imprinting. The hybrid waveguide may include additional layers, e.g., forming a plurality of alternating core layers and thinner patterned layers. Multiple waveguides may be stacked to form an integrated eyepiece, with each waveguide configured to receive and output light of a different component color.

IPC Classes  ?

  • G02B 1/04 - Optical elements characterised by the material of which they are made; Optical coatings for optical elements made of organic materials, e.g. plastics
  • G02B 5/18 - Diffracting gratings
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays

33.

VIEWING OPTICS ASSEMBLY FOR AUGMENTED REALITY SYSTEM

      
Application Number 18983127
Status Pending
Filing Date 2024-12-16
First Publication Date 2025-04-24
Owner Magic Leap, Inc. (USA)
Inventor
  • Tekolste, Robert D.
  • Liu, Victor K.

Abstract

A viewing optics assembly for augmented reality includes a projector configured to generate image light and an eyepiece optically coupled to the projector. The eyepiece includes at least one eyepiece layer comprising a waveguide having a surface, an incoupling grating coupled to the waveguide, and an outcoupling grating coupled to the waveguide. The outcoupling grating comprises a first array of first ridges protruding from the surface of the waveguide, each of the first ridges having a first height in a direction perpendicular to the surface and a first width in a direction parallel to the surface and a plurality of second ridges, each of the plurality of second ridges protruding from a respective first ridge of the first ridges and having a second height and a second width. At least one of the first width or the second width varies as a function of position across the surface.

IPC Classes  ?

34.

METHODS, SYSTEMS, AND COMPUTER PROGRAM PRODUCT FOR MANAGING AND DISPLAYING WEBPAGES IN A VIRTUAL THREE-DIMENSIONAL SPACE WITH A MIXED REALITY SYSTEM

      
Application Number 18988336
Status Pending
Filing Date 2024-12-19
First Publication Date 2025-04-24
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Huang, Yi
  • Mak, Genevieve

Abstract

Disclosed are methods, systems, and articles of manufacture for managing and displaying web pages and web resources in a virtual three-dimensional (3D) space with an extended reality system. These techniques receive an input for a 3D transform of a web page or a web page panel therefor. In response to the input, a browser engine coupled to a processor of an extended reality system determines 3D transform data for the web page or the web page panel based at least in part upon the 3D transform of the web page or the web page panel, wherein the 3D transform comprises a change in 3D position, rotation, or scale of the web page or the web page panel therefor in a virtual 3D space. A universe browser engine may present contents of the web page in a virtual 3D space based at least in part upon the 3D transform data.
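
As a rough illustration of the kind of 3D transform data described above (not the browser-engine API from the application; the function and parameter names are hypothetical), a change in position, rotation, and scale of a web page panel can be composed into a single 4x4 matrix that a universe browser engine might apply when placing the panel in the virtual 3D space:

```python
import numpy as np

def panel_transform(position, rotation_z_deg, scale):
    """Compose translation, rotation about Z, and uniform scale into one 4x4 matrix."""
    t = np.eye(4)
    t[:3, 3] = position                       # where the panel sits in the virtual 3D space
    c, s = np.cos(np.radians(rotation_z_deg)), np.sin(np.radians(rotation_z_deg))
    r = np.eye(4)
    r[:2, :2] = [[c, -s], [s, c]]             # rotation of the panel about the viewer's Z axis
    m = np.diag([scale, scale, scale, 1.0])   # uniform scale of the panel
    return t @ r @ m                          # applied right-to-left: scale, rotate, translate

if __name__ == "__main__":
    # Place a half-size panel 1.5 m in front of the viewer, tilted by 15 degrees.
    print(panel_transform(position=[0.0, 0.2, -1.5], rotation_z_deg=15.0, scale=0.5))
```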

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/14 - Digital output to display device
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

35.

METHODS AND SYSTEMS FOR AUDIO SIGNAL FILTERING

      
Application Number 18986468
Status Pending
Filing Date 2024-12-18
First Publication Date 2025-04-17
Owner Magic Leap, Inc. (USA)
Inventor
  • Audfray, Remi Samuel
  • Jot, Jean-Marc
  • Dicker, Samuel Charles

Abstract

Systems and methods for rendering audio signals are disclosed. In some embodiments, a method may receive an input signal including a first portion and a second portion. A first processing stage comprising a first filter is applied to the first portion to generate a first filtered signal. A second processing stage comprising a second filter is applied to the first portion to generate a second filtered signal. A third processing stage comprising a third filter is applied to the second portion to generate a third filtered signal. A fourth processing stage comprising a fourth filter is applied to the second portion to generate a fourth filtered signal. A first output signal is determined based on a sum of the first filtered signal and the third filtered signal. A second output signal is determined based on a sum of the second filtered signal and the fourth filtered signal. The first output signal is presented to a first ear of a user of a virtual environment, and the second output signal is presented to a second ear of the user. The first portion of the input signal corresponds to a first location in the virtual environment, and the second portion of the input signal corresponds to a second location in the virtual environment.
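
A minimal sketch of this four-filter topology, assuming simple FIR filters and NumPy; the taps and function names are placeholders, not the filter designs described in the application:

```python
import numpy as np

def render_binaural(portion_a, portion_b, f1, f2, f3, f4):
    """Apply the four filters and sum into left/right (first/second) output signals."""
    a1 = np.convolve(portion_a, f1, mode="same")   # first filtered signal
    a2 = np.convolve(portion_a, f2, mode="same")   # second filtered signal
    b3 = np.convolve(portion_b, f3, mode="same")   # third filtered signal
    b4 = np.convolve(portion_b, f4, mode="same")   # fourth filtered signal
    return a1 + b3, a2 + b4                        # first and second output signals

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    portion_a = rng.standard_normal(1024)          # source at the first virtual location
    portion_b = rng.standard_normal(1024)          # source at the second virtual location
    taps = np.ones(8) / 8.0                        # toy low-pass FIR taps (placeholder)
    left, right = render_binaural(portion_a, portion_b, taps, 0.5 * taps, 0.7 * taps, taps)
    print(left.shape, right.shape)
```

Each source portion contributes one filtered signal to each ear's output, which is what lets the two portions appear at different locations in the virtual environment.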

IPC Classes  ?

  • H04S 1/00 - Two-channel systems
  • H04R 5/033 - Headphones for stereophonic communication
  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control

36.

SPATIALLY VARIABLE LIQUID CRYSTAL DIFFRACTION GRATINGS

      
Application Number 18999315
Status Pending
Filing Date 2024-12-23
First Publication Date 2025-04-17
Owner Magic Leap, Inc. (USA)
Inventor Oh, Chulwoo

Abstract

The present disclosure relates to display systems and, more particularly, to augmented reality display systems including diffraction grating(s), and methods of fabricating same. A diffraction grating includes a plurality of different diffracting zones having a periodically repeating lateral dimension corresponding to a grating period adapted for light diffraction. The diffraction grating additionally includes a plurality of different liquid crystal layers corresponding to the different diffracting zones. The different liquid crystal layers include liquid crystal molecules that are aligned differently, such that the different diffracting zones have different optical properties associated with light diffraction.

IPC Classes  ?

37.

GEOMETRIES FOR MITIGATING ARTIFACTS IN SEE-THROUGH PIXEL ARRAYS

      
Application Number 19000352
Status Pending
Filing Date 2024-12-23
First Publication Date 2025-04-17
Owner Magic Leap, Inc. (USA)
Inventor
  • Russell, Andrew Ian
  • Trisnadi, Jahja I.
  • Mathur, Vaibhav
  • Manly, David
  • Johnson, Michael Robert
  • Carlisle, Clinton

Abstract

Disclosed are dimming assemblies and display systems for reducing artifacts produced by optically-transmissive displays. A system may include a substrate upon which a plurality of electronic components are disposed. The electronic components may include a plurality of pixels, a plurality of conductors, and a plurality of circuit modules. The plurality of pixels may be arranged in a two-dimensional array, with each pixel having a two-dimensional geometry corresponding to a shape with at least one curved side. The plurality of conductors may be arranged adjacent to the plurality of pixels. The system may also include control circuitry electrically coupled to the plurality of conductors. The control circuitry may be configured to apply electrical signals to the plurality of circuit modules by way of the plurality of conductors.

IPC Classes  ?

  • G02F 1/1345 - Conductors connecting electrodes to cell terminals
  • G02B 27/01 - Head-up displays
  • G02C 7/10 - Filters, e.g. for facilitating adaptation of the eyes to the dark; Sunglasses
  • G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
  • G02F 1/1343 - Electrodes
  • G02F 1/1362 - Active matrix addressed cells
  • G02F 1/139 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering based on orientation effects in which the liquid crystal remains transparent

38.

METHOD AND SYSTEM FOR PERFORMING DYNAMIC FOVEATION BASED ON EYE GAZE

      
Application Number US2024050818
Publication Number 2025/080868
Status In Force
Filing Date 2024-10-10
Publication Date 2025-04-17
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Diaz, Edward
  • Irick, Kevin Maurice

Abstract

A method of forming a foveated image includes (a) setting dimensions of a first region, (b) receiving an image having a first resolution, and (c) forming the foveated image including a primary quality region having the dimensions of the first region and the first resolution and a secondary quality region having a second resolution less than the first resolution. The method also includes (d) outputting the foveated image, (e) determining an eye gaze location, and (f) determining an eye gaze velocity. If the eye gaze velocity is less than a threshold velocity, the method includes decreasing the dimensions of the primary quality region and repeating (b) - (f). If the eye gaze velocity is greater than or equal to the threshold velocity, the method includes repeating (a) - (f).
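
A schematic sketch of the control loop described above, under assumed units (pixels, pixels per frame) and an invented `foveate` helper; the threshold, step size, and region dimensions are illustrative only:

```python
import numpy as np

def foveate(image, center, region, downscale=4):
    """Keep a `region`-sized window around `center` at full resolution; coarsen the rest."""
    low = image[::downscale, ::downscale].repeat(downscale, 0).repeat(downscale, 1)
    out = low[: image.shape[0], : image.shape[1]].copy()     # secondary quality everywhere
    h, w = region
    y0, x0 = max(center[0] - h // 2, 0), max(center[1] - w // 2, 0)
    out[y0:y0 + h, x0:x0 + w] = image[y0:y0 + h, x0:x0 + w]  # primary quality region
    return out

def foveation_loop(frames, gaze_samples, threshold=30.0, full=(200, 200),
                   step=20, minimum=(60, 60)):
    """Shrink the primary region while gaze velocity stays below `threshold`; else reset it."""
    region, prev = full, None
    for image, gaze in zip(frames, gaze_samples):
        yield foveate(image, gaze, region)
        velocity = 0.0 if prev is None else float(np.hypot(gaze[0] - prev[0], gaze[1] - prev[1]))
        prev = gaze
        if velocity < threshold:   # stable gaze: tighten the primary quality region
            region = (max(region[0] - step, minimum[0]), max(region[1] - step, minimum[1]))
        else:                      # saccade: restore the full-size primary region
            region = full

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    frames = [rng.integers(0, 255, (480, 640), dtype=np.uint8) for _ in range(3)]
    gazes = [(240, 320), (242, 321), (400, 100)]   # slow, slow, then a saccade
    for out in foveation_loop(frames, gazes):
        print(out.shape)
```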

IPC Classes  ?

  • G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

39.

METHOD AND SYSTEM FOR FORMING FOVEATED IMAGES BASED ON EYE GAZE

      
Application Number US2024050823
Publication Number 2025/080872
Status In Force
Filing Date 2024-10-10
Publication Date 2025-04-17
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Diaz, Edward
  • Irick, Kevin Maurice

Abstract

A method includes rendering an original image at a first processor, encoding the original image to provide an encoded image, and transmitting the encoded image to a second processor. The method also includes decoding the encoded image to provide a decoded image, determining an eye gaze location, splitting the decoded image into N sections based on the eye gaze location, and processing N-1 sections of the N sections to produce N-1 secondary quality sections. The method further includes processing one section of the N sections to provide one primary quality section, combining the one primary quality section and the N-1 secondary quality sections to form a foveated image, and transmitting the foveated image to a display.
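
A simplified sketch of the splitting-and-recombining step, assuming horizontal strips and NumPy downsampling for the secondary-quality sections; the strip geometry and quality levels are assumptions, not taken from the application:

```python
import numpy as np

def foveate_by_sections(decoded, gaze_y, n_sections=4, downscale=4):
    """Split into N horizontal strips; keep the gazed strip sharp, coarsen the other N-1."""
    strips = np.array_split(decoded, n_sections, axis=0)
    bounds = np.cumsum([s.shape[0] for s in strips])
    primary = min(int(np.searchsorted(bounds, gaze_y, side="right")), n_sections - 1)
    out = []
    for i, strip in enumerate(strips):
        if i == primary:
            out.append(strip)                                    # primary quality section
        else:
            low = strip[::downscale, ::downscale].repeat(downscale, 0).repeat(downscale, 1)
            out.append(low[: strip.shape[0], : strip.shape[1]])  # secondary quality section
    return np.concatenate(out, axis=0)                           # the foveated image

if __name__ == "__main__":
    decoded = np.random.default_rng(2).integers(0, 255, (480, 640), dtype=np.uint8)
    print(foveate_by_sections(decoded, gaze_y=300).shape)        # (480, 640)
```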

IPC Classes  ?

  • G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

40.

INDIVIDUAL VIEWING IN A SHARED SPACE

      
Application Number 18933172
Status Pending
Filing Date 2024-10-31
First Publication Date 2025-04-17
Owner Magic Leap, Inc. (USA)
Inventor
  • Alexander, IV, Earle M.
  • Arroyo, Pedro Luis
  • Venerin, Jean I.
  • Adams, William

Abstract

A mixed reality virtual environment is sharable among multiple users through the use of multiple view modes that are selectable by a presenter. Multiple users with wearable display systems may wish to view a common virtual object, which may be presented in a virtual room to any suitable number of users. A presentation may be controlled by a presenter using a presenter wearable system that leads multiple participants through information associated with the virtual object. Use of different viewing modes allows individual users to see different virtual content through their wearable display systems, despite being in a shared viewing space or alternatively, to see the same virtual content in different locations within a shared space.

IPC Classes  ?

  • G09B 5/12 - Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations different stations being capable of presenting different information simultaneously
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04L 12/18 - Arrangements for providing special services to substations for broadcast or conference

41.

FACE MODEL CAPTURE BY A WEARABLE DEVICE

      
Application Number 18981592
Status Pending
Filing Date 2024-12-15
First Publication Date 2025-04-10
Owner Magic Leap, Inc. (USA)
Inventor
  • Amayeh, Gholamreza
  • Kaehler, Adrian
  • Lee, Douglas

Abstract

Systems and methods for generating a face model for a user of a head-mounted device are disclosed. The head-mounted device can include one or more eye cameras configured to image the face of the user while the user is putting the device on or taking the device off. The images obtained by the eye cameras may be analyzed using a stereoscopic vision technique, a monocular vision technique, or a combination, to generate a face model for the user. The face model can be used to generate a virtual image of at least a portion of the user's face, for example to be presented as an avatar.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G06V 20/64 - Three-dimensional objects
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
  • G06V 40/19 - Sensors therefor

42.

MODES OF USER INTERACTION

      
Application Number 18984646
Status Pending
Filing Date 2024-12-17
First Publication Date 2025-04-10
Owner Magic Leap, Inc. (USA)
Inventor
  • Speelman, Daniel Stephen
  • Cano, Rodrigo
  • Gundersen, Kara Lauren
  • Hazen, Griffith Buckley
  • Pazmino, Lorena

Abstract

A mixed reality (MR) device can allow a user to switch between input modes to allow interactions with a virtual environment via devices such as a six degrees of freedom (6DoF) handheld controller and a touchpad input device. A default input mode for interacting with virtual content may rely on the user's head pose, which may be difficult to use in selecting virtual objects that are far away in the virtual environment. Thus, the system may be configured to allow the user to use a 6DoF cursor, and a visual ray that extends from the handheld controller to the cursor, to enable precise targeting. Input via a touchpad input device (e.g., that allows three degrees of freedom movements) may also be used in conjunction with the 6DoF cursor.

IPC Classes  ?

  • H04N 13/398 - Synchronisation thereof; Control thereof
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • H04N 13/361 - Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

43.

TECHNIQUES FOR DETERMINING SETTINGS FOR A CONTENT CAPTURE DEVICE

      
Application Number 18990920
Status Pending
Filing Date 2024-12-20
First Publication Date 2025-04-10
Owner Magic Leap, Inc. (USA)
Inventor
  • Smith, Brian Keith
  • Tsunaev, Ilya

Abstract

A method includes receiving a first image captured by a content capture device, identifying a first object in the first image and determining a first update to a first setting of the content capture device. The method further includes receiving a second image captured by the content capture device, identifying a second object in the second image, and determining a second update to a second setting of the content capture device. The method further includes updating the first setting of the content capture device using the first update, receiving a third image using the updated first setting of the content capture device, updating the second setting of the content capture device using the second update, receiving a fourth image using the updated second setting of the content capture device, and stitching the third image and the fourth image together to form a composite image.
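
The flow can be pictured with a toy simulation; everything here (the exposure rule, the stand-in "camera", and the side-by-side stitch) is invented for illustration, not the claimed method. Settings derived from objects found in earlier frames are applied to later captures, and the two differently exposed captures are stitched into one composite:

```python
import numpy as np

def exposure_update(frame, region):
    """Toy rule: choose a gain that pushes the region's mean brightness toward mid-gray."""
    y0, y1, x0, x1 = region
    mean = frame[y0:y1, x0:x1].mean()
    return float(np.clip(128.0 / max(mean, 1.0), 0.25, 4.0))

def capture(scene, gain):
    """Stand-in for the content capture device applying an exposure setting."""
    return np.clip(scene * gain, 0, 255).astype(np.uint8)

def stitched_composite(scene, region_a, region_b):
    first = capture(scene, 1.0)                  # first image
    second = capture(scene, 1.0)                 # second image
    gain_a = exposure_update(first, region_a)    # first update, from the first object
    gain_b = exposure_update(second, region_b)   # second update, from the second object
    third = capture(scene, gain_a)               # third image, with the updated first setting
    fourth = capture(scene, gain_b)              # fourth image, with the updated second setting
    half = scene.shape[1] // 2
    return np.hstack([third[:, :half], fourth[:, half:]])   # naive stitch into a composite

if __name__ == "__main__":
    scene = np.tile(np.linspace(20.0, 240.0, 640), (480, 1))   # dark left half, bright right half
    composite = stitched_composite(scene, (0, 480, 0, 320), (0, 480, 320, 640))
    print(composite.shape, composite.mean())
```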

IPC Classes  ?

  • H04N 23/72 - Combination of two or more compensation controls
  • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment
  • H04N 5/265 - Mixing
  • H04N 23/63 - Control of cameras or camera modules by using electronic viewfinders
  • H04N 23/71 - Circuitry for evaluating the brightness variation
  • H04N 23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time
  • H04N 23/741 - Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
  • H04N 23/743 - Bracketing, i.e. taking a series of images with varying exposure conditions
  • H04N 23/76 - Circuitry for compensating brightness variation in the scene by influencing the image signals

44.

METHOD AND SYSTEM FOR PERFORMING EYE TRACKING IN AUGMENTED REALITY DEVICES

      
Application Number 18983134
Status Pending
Filing Date 2024-12-16
First Publication Date 2025-04-10
Owner Magic Leap, Inc. (USA)
Inventor
  • Garcia, Giovanni
  • Melo, Christian
  • Farmer, Daniel
  • Shultz, Jason Allen
  • Nguyen, Bach
  • Schabacker, Charles Robert
  • Shoaee, Michael

Abstract

A wearable device for projecting image light to an eye of a viewer and forming an image of virtual content in an augmented reality display is provided. The wearable device includes a projector and a stack of waveguides optically connected to the projector. The wearable device also includes an eye tracking system comprising a plurality of illumination sources, an optical element having optical power, and a set of cameras. The optical element is disposed between the plurality of illumination sources and the set of cameras. In some embodiments, the augmented reality display includes an eyepiece operable to output virtual content from an output region and a plurality of illumination sources. At least some of the plurality of illumination sources overlap with the output region.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/01 - Head-up displays
  • H04N 23/56 - Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
  • H04N 23/90 - Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

45.

EVENT-BASED CAMERA WITH HIGH-RESOLUTION FRAME OUTPUT

      
Application Number 18987441
Status Pending
Filing Date 2024-12-19
First Publication Date 2025-04-10
Owner Magic Leap, Inc. (USA)
Inventor
  • Zahnert, Martin Georg
  • Ilic, Alexander

Abstract

A high-resolution image sensor suitable for use in an augmented reality (AR) system to provide low latency image analysis with low power consumption. The AR system can be compact, and may be small enough to be packaged within a wearable device such as a set of goggles or mounted on a frame resembling ordinary eyeglasses. The image sensor may receive information about a region of an imaging array associated with a movable object, selectively output imaging information for that region, and synchronously output high-resolution image frames. The region may be updated dynamically as the image sensor and/or the object moves. The image sensor may output the high-resolution image frames less frequently than the region is updated when the image sensor and/or the object moves. Such an image sensor provides a small amount of data from which object information used in rendering an AR scene can be developed.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/593 - Depth or shape recovery from multiple images from stereo images
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 10/20 - Image preprocessing
  • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
  • H04N 13/106 - Processing image signals
  • H04N 25/531 - Control of the integration time by controlling rolling shutters in CMOS SSIS
  • H04N 25/705 - Pixels for depth measurement, e.g. RGBZ
  • H04N 25/75 - Circuitry for providing, modifying or processing image signals from the pixel array

46.

VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS AND METHODS

      
Application Number US2024048685
Publication Number 2025/072540
Status In Force
Filing Date 2024-09-26
Publication Date 2025-04-03
Owner MAGIC LEAP, INC. (USA)
Inventor Lancelle, Marcel

Abstract

A head-mounted display system configured to be worn over eyes of a user includes a frame configured to be worn on a head of the user. The system also includes a display disposed on the frame over the eyes of the user. The system further includes an inwardly-facing light source disposed on the frame and configured to emit light toward the eyes of the user to improve visibility of respective portions of a face and the eyes of the user through the display. Moreover, the system includes a processor configured to control a brightness of the display, an opacity of the display, and an intensity of the light emitted by the inwardly-facing light source.

IPC Classes  ?

  • G02B 27/02 - Viewing or reading apparatus
  • G09G 3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix
  • G09G 5/36 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory

47.

CENTRALIZED RENDERING

      
Application Number 18980110
Status Pending
Filing Date 2024-12-13
First Publication Date 2025-04-03
Owner Magic Leap, Inc. (USA)
Inventor Babu J D, Praveen

Abstract

A method is disclosed, the method comprising the steps of receiving, from a first client application, first graphical data comprising a first node; receiving, from a second client application independent of the first client application, second graphical data comprising a second node; and generating a scenegraph, wherein the scenegraph describes a hierarchical relationship between the first node and the second node according to visual occlusion relative to a perspective from a display.
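
A toy sketch of the idea (the node fields and the back-to-front ordering rule are assumptions for illustration): graphical data from two independent client applications is merged into a single scenegraph whose ordering encodes which content occludes which, relative to the display's perspective:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    client: str
    depth: float                  # distance from the display's perspective
    children: list = field(default_factory=list)

def build_scenegraph(first_client_nodes, second_client_nodes):
    """Merge nodes from two independent clients, ordered back-to-front for occlusion."""
    root = Node("root", client="centralized-renderer", depth=float("inf"))
    for node in sorted(first_client_nodes + second_client_nodes,
                       key=lambda n: n.depth, reverse=True):
        root.children.append(node)    # farther (occluded) nodes come first
    return root

if __name__ == "__main__":
    a = [Node("chess-board", "app-A", depth=2.0)]
    b = [Node("chat-panel", "app-B", depth=1.2)]
    print([child.name for child in build_scenegraph(a, b).children])  # ['chess-board', 'chat-panel']
```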

IPC Classes  ?

48.

VIRTUAL AND AUGMENTED REALITY SYSTEMS AND METHODS USING DISPLAY SYSTEM CONTROL INFORMATION EMBEDDED IN IMAGE DATA

      
Application Number 18971685
Status Pending
Filing Date 2024-12-06
First Publication Date 2025-03-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Rodriguez, Jose Felix
  • Perez, Ricardo Martinez
  • Nourai, Reza

Abstract

A display system, such as a virtual reality or augmented reality display system, can control a display to present image data including a plurality of color components, on a plurality of depth planes supported by the display. The presentation of the image data through the display can be controlled based on control information that is embedded in the image data, for example to activate or deactivate a color component and/or a depth plane. In some examples, light sources and/or spatial light modulators that relay illumination from the light sources may receive signals from a display controller to adjust a power setting to the light source or spatial light modulator based on control information embedded in an image data frame.

IPC Classes  ?

  • H04N 23/65 - Control of camera operation in relation to power supply
  • G02B 27/01 - Head-up displays
  • G02B 27/10 - Beam splitting or combining systems
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

49.

MANAGING MULTI-OBJECTIVE ALIGNMENTS FOR IMPRINTING

      
Application Number 18975299
Status Pending
Filing Date 2024-12-10
First Publication Date 2025-03-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Sevier, Jeremy Lee
  • Sadam, Satish
  • Imhof, Joseph Michael
  • Luo, Kang
  • Wang, Kangkang
  • Patterson, Roy Matthew
  • Xue, Qizhen
  • Best, Brett William
  • Carden, Charles Scott
  • Shafran, Matthew S.
  • Miller, Michael Nevin

Abstract

Systems and methods for managing multi-objective alignments in imprinting (e.g., single-sided or double-sided) are provided. An example system includes rollers for moving a template roll, a stage for holding a substrate, a dispenser for dispensing resist on the substrate, a light source for curing the resist to form an imprint on the substrate when a template of the template roll is pressed into the resist on the substrate, a first inspection system for registering a fiducial mark of the template to determine a template offset, a second inspection system for registering the imprint on the substrate to determine a wafer registration offset between a target location and an actual location of the imprint, and a controller configured to move the substrate with the resist below the template based on the template offset and to determine an overlay bias of the imprint on the substrate based on the wafer registration offset.

IPC Classes  ?

  • G03F 9/00 - Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically
  • G03F 7/00 - Photomechanical, e.g. photolithographic, production of textured or patterned surfaces, e.g. printed surfaces; Materials therefor, e.g. comprising photoresists; Apparatus specially adapted therefor

50.

DISPLAY SYSTEM HAVING A PLURALITY OF LIGHT PIPES FOR A PLURALITY OF LIGHT EMITTERS

      
Application Number 18975714
Status Pending
Filing Date 2024-12-10
First Publication Date 2025-03-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Curtis, Kevin
  • Hall, Heidi Leising
  • St. Hilaire, Pierre
  • Tinch, David

Abstract

A display system includes a plurality of light pipes and a plurality of light sources configured to emit light into the light pipes. The display system also comprises a spatial light modulator configured to modulate light received from the light pipes to form images. The display system may also comprise one or more waveguides configured to receive modulated light from the spatial light modulator and to relay that light to a viewer.

IPC Classes  ?

  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 6/34 - Optical coupling means utilising prism or grating
  • G02B 19/00 - Condensers
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G02B 27/14 - Beam splitting or combining systems operating by reflection only

51.

DISPLAY SYSTEMS AND METHODS FOR CLIPPING CONTENT TO INCREASE VIEWING COMFORT

      
Application Number 18977386
Status Pending
Filing Date 2024-12-11
First Publication Date 2025-03-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Schwab, Brian David
  • Hand, Randall E.
  • Vlaskamp, Björn Nicolaas Servatius

Abstract

AR/VR display systems limit displaying content that exceeds an accommodation-vergence mismatch threshold, which may define a volume around the viewer. The volume may be subdivided into two or more zones, including an innermost loss-of-fusion zone (LoF) in which no content is displayed, and one or more outer AVM zones in which the displaying of content may be stopped, or clipped, under certain conditions. For example, content may be clipped if the viewer is verging within an AVM zone and if the content is displayed within the AVM zone for more than a threshold duration. A further possible condition for clipping content is that the user is verging on that content. In addition, the boundaries of the AVM zone and/or the acceptable amount of time that the content is displayed may vary depending upon the type of content being displayed, e.g., whether the content is user-locked content or in-world content.
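
A hedged sketch of the clipping decision, with invented zone radii, timing budget, and content-type rule (the real thresholds and zone geometry are not specified here):

```python
def should_display(content_distance_m, verged_on_content, seconds_displayed,
                   user_locked=False, lof_radius_m=0.2, avm_radius_m=0.45,
                   max_seconds=10.0):
    """Return True if the content may be displayed, False if it should be clipped."""
    if content_distance_m < lof_radius_m:
        return False                                 # loss-of-fusion zone: never display
    if content_distance_m < avm_radius_m:            # outer AVM zone
        budget = max_seconds / 2 if user_locked else max_seconds
        if verged_on_content and seconds_displayed > budget:
            return False                             # clip after the allowed duration
    return True

print(should_display(0.3, verged_on_content=True, seconds_displayed=12.0))   # False
print(should_display(0.3, verged_on_content=False, seconds_displayed=12.0))  # True
```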

IPC Classes  ?

52.

DEPTH BASED FOVEATED RENDERING FOR DISPLAY SYSTEMS

      
Application Number 18959036
Status Pending
Filing Date 2024-11-25
First Publication Date 2025-03-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Yeoh, Ivan Li Chuen
  • Edwin, Lionel Ernest
  • Samec, Nicole Elizabeth
  • Robaina, Nastasja U.
  • Mathur, Vaibhav
  • Dalrymple, Timothy Mark
  • Schaefer, Jason
  • Carlisle, Clinton
  • Cheng, Hui-Chuan
  • Oh, Chulwoo
  • Premysler, Philip
  • Zhang, Xiaoyang
  • Carlson, Adam C.

Abstract

Methods and systems for depth-based foveated rendering in a display system are disclosed. The display system may be an augmented reality display system configured to provide virtual content on a plurality of depth planes using different wavefront divergence. Some embodiments include monitoring eye orientations of a user of the display system. A fixation point can be determined based on the eye orientations, the fixation point representing a three-dimensional location with respect to a field of view. Location information of virtual object(s) to present is obtained, with the location information including three-dimensional position(s) of the virtual object(s). A resolution of the virtual object(s) can be adjusted based on a proximity of the location(s) of the virtual object(s) to the fixation point. The virtual object(s) are presented by the display system according to the adjusted resolution(s).
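
As a minimal sketch of resolution adjustment by proximity to the fixation point (the tier boundaries and render scales below are invented):

```python
import numpy as np

# (max distance from fixation point in meters, render-resolution scale) - illustrative tiers
RESOLUTION_TIERS = [(0.1, 1.0), (0.5, 0.5), (2.0, 0.25)]

def resolution_scale(object_position, fixation_point):
    """Full resolution near the fixation point, progressively coarser farther away."""
    distance = float(np.linalg.norm(np.subtract(object_position, fixation_point)))
    for max_distance, scale in RESOLUTION_TIERS:
        if distance <= max_distance:
            return scale
    return 0.125   # far periphery

print(resolution_scale((0.0, 0.3, -1.0), (0.0, 0.0, -1.0)))  # 0.5
```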

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • H04N 13/341 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
  • H04N 13/395 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
  • H04N 13/398 - Synchronisation thereof; Control thereof

53.

VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS AND METHODS

      
Application Number 18959252
Status Pending
Filing Date 2024-11-25
First Publication Date 2025-03-20
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Rodriguez, Jose Felix
  • Perez, Ricardo Martinez

Abstract

A virtual, augmented, or mixed reality display system includes a display configured to display virtual, augmented, or mixed reality image data, the display including one or more optical components which introduce optical distortions or aberrations to the image data. The system also includes a display controller configured to provide the image data to the display. The display controller includes memory for storing optical distortion correction information, and one or more processing elements to at least partially correct the image data for the optical distortions or aberrations using the optical distortion correction information.
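
One way to picture "stored optical distortion correction information" is a precomputed per-pixel remap table; the sketch below is a generic nearest-neighbour remap under that assumption, not the display controller's actual implementation:

```python
import numpy as np

def apply_distortion_correction(image, map_y, map_x):
    """Nearest-neighbour remap: output[y, x] = image[map_y[y, x], map_x[y, x]]."""
    h, w = image.shape[:2]
    yy = np.clip(np.rint(map_y), 0, h - 1).astype(int)
    xx = np.clip(np.rint(map_x), 0, w - 1).astype(int)
    return image[yy, xx]

if __name__ == "__main__":
    img = np.random.default_rng(3).integers(0, 255, (120, 160), dtype=np.uint8)
    ys, xs = np.mgrid[0:120, 0:160].astype(float)
    # Toy correction: pull every pixel slightly toward the image centre (pincushion-like fix).
    corrected = apply_distortion_correction(img, 60 + (ys - 60) * 0.98, 80 + (xs - 80) * 0.98)
    print(corrected.shape)
```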

IPC Classes  ?

  • G06T 5/80 - Geometric correction
  • G06F 1/3203 - Power management, i.e. event-based initiation of a power-saving mode
  • G06F 3/14 - Digital output to display device
  • G06T 3/18 - Image warping, e.g. rearranging pixels individually
  • G06T 3/4007 - Scaling of whole images or parts thereof, e.g. expanding or contracting based on interpolation, e.g. bilinear interpolation

54.

DEPTH BASED FOVEATED RENDERING FOR DISPLAY SYSTEMS

      
Application Number 18960851
Status Pending
Filing Date 2024-11-26
First Publication Date 2025-03-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Mathur, Vaibhav
  • Edwin, Lionel Ernest
  • Zhang, Xiaoyang
  • Vlaskamp, Bjorn Nicolaas Servatius

Abstract

Methods and systems for depth-based foveated rendering in a display system are disclosed. The display system may be an augmented reality display system configured to provide virtual content on a plurality of depth planes using different wavefront divergence. Some embodiments include determining a fixation point of a user's eyes. Location information associated with a first virtual object to be presented to the user via a display device is obtained. A resolution-modifying parameter of the first virtual object is obtained. A particular resolution at which to render the first virtual object is identified based on the location information and the resolution-modifying parameter of the first virtual object. The particular resolution is based on a resolution distribution specifying resolutions for corresponding distances from the fixation point. The first virtual object rendered at the identified resolution is presented to the user via the display system.
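
A sketch of a resolution distribution combined with a per-object resolution-modifying parameter; the falloff curve, radii, and clamps are invented for illustration:

```python
import numpy as np

def resolution_from_distribution(distance_m, resolution_modifier=1.0,
                                 full_res_radius_m=0.15, falloff_m=0.75):
    """Linear falloff from the fixation point, scaled by the object's modifier, clamped to [0.1, 1]."""
    base = np.clip(1.0 - (distance_m - full_res_radius_m) / falloff_m, 0.1, 1.0)
    return float(np.clip(base * resolution_modifier, 0.1, 1.0))

print(resolution_from_distribution(0.6))                           # ordinary peripheral object
print(resolution_from_distribution(0.6, resolution_modifier=2.0))  # e.g. text kept sharper
```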

IPC Classes  ?

  • G09G 5/391 - Resolution modifying circuits, e.g. variable screen formats
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 5/70 - Denoising; Smoothing

55.

EYEPIECE FOR VIRTUAL, AUGMENTED, OR MIXED REALITY SYSTEMS

      
Application Number 18970619
Status Pending
Filing Date 2024-12-05
First Publication Date 2025-03-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Klug, Michael Anthony
  • Tekolste, Robert Dale
  • Welch, William Hudson
  • Browy, Eric
  • Liu, Victor Kai
  • Bhargava, Samarth

Abstract

An eyepiece for an augmented reality display system. The eyepiece can include a waveguide substrate. The waveguide substrate can include an input coupler grating (ICG), an orthogonal pupil expander (OPE) grating, a spreader grating, and an exit pupil expander (EPE) grating. The ICG can couple at least one input light beam into at least a first guided light beam that propagates inside the waveguide substrate. The OPE grating can divide the first guided light beam into a plurality of parallel, spaced-apart light beams. The spreader grating can receive the light beams from the OPE grating and spread their distribution. The spreader grating can include diffractive features oriented at approximately 90° to diffractive features of the OPE grating. The EPE grating can re-direct the light beams from the first OPE grating and the first spreader grating such that they exit the waveguide substrate.

IPC Classes  ?

  • G02B 6/122 - Basic optical elements, e.g. light-guiding paths
  • G02B 6/02 - Optical fibres with cladding
  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

56.

MEDICAL ASSISTANT

      
Application Number 18964039
Status Pending
Filing Date 2024-11-29
First Publication Date 2025-03-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Robaina, Nastasja U.
  • Samec, Nicole Elizabeth
  • Baerenrodt, Mark
  • Harrises, Christopher M.

Abstract

A wearable display device, such as an augmented reality display device, can present virtual content to the wearer for applications in a healthcare setting. The wearer may be a patient or a healthcare provider (HCP). Applications can include, but are not limited to, access, display, and modification of patient medical records and sharing patient medical records among authorized HCPs, detecting one or more anomalies in a medical environment and presenting virtual content (e.g., alerts) indicating the one or more anomalies, detecting the presence of physical objects (e.g., medical instruments or devices) in the medical environment, enabling communication with and/or remote control of a medical device in the environment, and so forth.

IPC Classes  ?

  • G16H 10/60 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
  • A61B 3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
  • A61B 3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients perceptions or reactions
  • A61B 3/113 - Objective types, i.e. instruments for examining the eyes independent of the patients perceptions or reactions for determining or recording eye movement
  • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
  • A61B 5/06 - Devices, other than using radiation, for detecting or locating foreign bodies
  • A61B 5/1171 - Identification of persons based on the shapes or appearances of their bodies or parts thereof
  • A61B 5/339 - Displays specially adapted therefor
  • A61B 17/00 - Surgical instruments, devices or methods
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 90/50 - Supports for surgical instruments, e.g. articulated arms
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06F 16/22 - Indexing; Data structures therefor; Storage structures
  • G06F 16/23 - Updating
  • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
  • G06F 21/62 - Protecting access to data via a platform, e.g. using keys or access control rules
  • G06F 40/205 - Parsing
  • G06F 40/289 - Phrasal analysis, e.g. finite state techniques or chunking
  • G06V 20/20 - ScenesScene-specific elements in augmented reality scenes
  • G10L 15/26 - Speech to text systems
  • G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
  • G16H 40/67 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation

57.

DIMMING DEVICE ANGULAR UNIFORMITY CORRECTION

      
Application Number 18966688
Status Pending
Filing Date 2024-12-03
First Publication Date 2025-03-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Arencibia, Ricardo
  • Llaneras, Zachary Michael
  • Schaefer, Jason
  • Cohen, Howard Russell
  • Cheng, Hui-Chuan
  • Sours, Michael Alexander

Abstract

A method of operating an optical system includes identifying a set of angle dependent transmittance levels for light passing through pixels of a segmented dimmer exhibiting viewing angle transmittance variations for application of a same voltage to all pixels of the segmented dimmer. The method also includes determining a set of voltages to apply to pixels of the segmented dimmer. Determining the set of voltages includes using the set of angle dependent transmittance levels. The method includes applying the set of voltages to the pixels of the segmented dimmer to achieve light transmittance through the segmented dimmer corresponding to the set of angle dependent transmittance levels.
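
A minimal sketch of the inversion step, assuming a fabricated monotonic voltage-to-transmittance calibration: the angle-corrected target transmittance for each pixel is interpolated back to a drive voltage.

```python
import numpy as np

# Fabricated monotonic calibration for one dimmer cell: higher voltage -> darker pixel.
VOLTAGES = np.linspace(0.0, 5.0, 11)
TRANSMITTANCE = np.linspace(0.9, 0.05, 11)

def voltages_for_targets(target_transmittance_per_pixel):
    """Invert the calibration curve by interpolation to get a drive voltage per pixel."""
    targets = np.asarray(target_transmittance_per_pixel, dtype=float)
    # np.interp needs increasing x values, so feed the descending curve reversed.
    return np.interp(targets, TRANSMITTANCE[::-1], VOLTAGES[::-1])

# Angle-corrected target transmittances for three pixels -> low, intermediate, maximum voltage.
print(voltages_for_targets([0.9, 0.5, 0.05]))
```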

IPC Classes  ?

  • G09G 3/36 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix by control of light from an independent source using liquid crystals
  • G02B 27/01 - Head-up displays

58.

DISPLAY FOR THREE-DIMENSIONAL IMAGE

      
Application Number 18969386
Status Pending
Filing Date 2024-12-05
First Publication Date 2025-03-20
Owner Magic Leap, Inc. (USA)
Inventor Kaehler, Adrian

Abstract

Apparatuses and methods for displaying a 3-D representation of an object are described. Apparatuses can include a rotatable structure, motor, and multiple light field sub-displays disposed on the rotatable structure. The apparatuses can store a light field image to be displayed, the light field image providing multiple different views of the object at different viewing directions. A processor can drive the motor to rotate the rotatable structure and map the light field image to each of the light field sub-displays based in part on the rotation angle, and illuminate the light field sub-displays based in part on the mapped light field image. The apparatuses can include a display panel configured to be viewed from a fiducial viewing direction, where the display panel is curved out of a plane that is perpendicular to the fiducial viewing direction, and a plurality of light field sub-displays disposed on the display panel.

IPC Classes  ?

  • H04N 13/393 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume the volume being generated by a moving, e.g. vibrating or rotating, surface
  • G02B 30/27 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the autostereoscopic type involving lenticular arrays
  • G02B 30/54 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels the 3D volume being generated by moving a 2D surface, e.g. by vibrating or rotating the 2D surface
  • G02B 30/56 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels by projecting aerial or floating images
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • H04N 13/307 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using fly-eye lenses, e.g. arrangements of circular lenses
  • H04N 13/32 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using arrays of controllable light sources; Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using moving apertures or moving light sources
  • H04N 13/324 - Colour aspects
  • H04N 13/398 - Synchronisation thereof; Control thereof

59.

METHOD AND SYSTEM FOR DIFFRACTIVE OPTICS EYEPIECE ARCHITECTURES INCORPORATING AN OPTICAL NOTCH FILTER

      
Application Number US2023032805
Publication Number 2025/058628
Status In Force
Filing Date 2023-09-14
Publication Date 2025-03-20
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Khandekar, Chinmay
  • Singh, Vikramjit
  • Tekolste, Robert D.

Abstract

An augmented reality system includes a projector assembly and a set of imaging optics optically coupled to the projector assembly. The augmented reality system also includes an eyepiece optically coupled to the set of imaging optics. The eyepiece has a world side and a user side opposite the world side and includes one or more eyepiece waveguides. Each of the one or more eyepiece waveguides includes an incoupling interface and an outcoupling interface operable to output virtual content toward the user side. The augmented reality system further includes an optical notch filter disposed on the world side of the eyepiece.

IPC Classes  ?

60.

METHOD AND SYSTEM FOR HIGH ORDER DIFFRACTION, LARGE FIELD OF VIEW AUGMENTED REALITY EYEPIECE WAVEGUIDES

      
Application Number US2023032806
Publication Number 2025/058629
Status In Force
Filing Date 2023-09-14
Publication Date 2025-03-20
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Singh, Vikramjit
  • Khandekar, Chinmay
  • Xue, Qizhen
  • Faraji-Dana, Mohammadsadegh

Abstract

A method of operating an eyepiece waveguide of an augmented reality system includes projecting virtual content using a projector assembly and diffracting the virtual content into the eyepiece waveguide via a first order diffraction. A first portion of the virtual content is clipped to produce a remaining portion of the virtual content. The method also includes propagating the remaining portion of the virtual content in the eyepiece waveguide, outcoupling the remaining portion of the virtual content out of the eyepiece waveguide, and diffracting the virtual content into the eyepiece waveguide via a second order diffraction. A second portion of the virtual content is clipped to produce a complementary portion. The method further includes propagating the complementary portion of the virtual content in the eyepiece waveguide and outcoupling the complementary portion of the virtual content out of the eyepiece waveguide.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 27/42 - Diffraction optics
  • G02B 6/00 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
  • G02B 6/122 - Basic optical elements, e.g. light-guiding paths

61.

COMPACT EXTENDED DEPTH OF FIELD LENSES FOR WEARABLE DISPLAY DEVICES

      
Application Number US2023074212
Publication Number 2025/058645
Status In Force
Filing Date 2023-09-14
Publication Date 2025-03-20
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Gao, Chunyu
  • Singh, Vikramjit
  • Uhlendorf, Kristina
  • Schaefer, Jason
  • Arend, Erik, Heath
  • Mcdonald, Lorenzo

Abstract

A wearable display device includes waveguide(s) that present virtual image elements as an augmentation to the real-world environment. The display device includes a first extended depth of field (EDOF) refractive lens arranged between the waveguide(s) and the user's eye(s), and a second EDOF refractive lens located outward from the waveguide(s). The first EDOF lens has a (e.g., negative) optical power to alter the depth of the virtual image elements. The second EDOF lens has a substantially equal and opposite (e.g., positive) optical power to that of the first EDOF lens, such that the depth of real-world objects is not altered along with the depth of the virtual image elements. To reduce the weight and/or size of the device, one or both EDOF lenses is a compact lens, e.g., Fresnel lens or flattened periphery lens. The compact lens may be coated and/or embedded in another material to enhance its performance.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 6/10 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,

62.

CUSTOMIZED POLYMER/GLASS DIFFRACTIVE WAVEGUIDE STACKS FOR AUGMENTED REALITY/MIXED REALITY APPLICATIONS

      
Application Number 18953940
Status Pending
Filing Date 2024-11-20
First Publication Date 2025-03-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Bhagat, Sharad D.
  • Hill, Brian George
  • Peroz, Christophe
  • Chang, Chieh
  • Li, Ling

Abstract

A diffractive waveguide stack includes first, second, and third diffractive waveguides for guiding light in first, second, and third visible wavelength ranges, respectively. The first diffractive waveguide includes a first material having first refractive index at a selected wavelength and a first target refractive index at a midpoint of the first visible wavelength range. The second diffractive waveguide includes a second material having a second refractive index at the selected wavelength and a second target refractive index at a midpoint of the second visible wavelength range. The third diffractive waveguide includes a third material having a third refractive index at the selected wavelength and a third target refractive index at a midpoint of the third visible wavelength range. A difference between any two of the first target refractive index, the second target refractive index, and the third target refractive index is less than 0.005 at the selected wavelength.

IPC Classes  ?

  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 27/01 - Head-up displays

63.

MULTI-DEPTH PLANE DISPLAY SYSTEM WITH REDUCED SWITCHING BETWEEN DEPTH PLANES

      
Application Number 18955217
Status Pending
Filing Date 2024-11-21
First Publication Date 2025-03-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Samec, Nicole Elizabeth
  • Robaina, Nastasja U.
  • Harrises, Christopher M.
  • Baerenrodt, Mark

Abstract

Methods and systems for reductions in switching between depth planes of a multi-depth plane display system are disclosed. The display system may be an AR display system configured to provide virtual content on a plurality of depth planes using different wavefront divergence. The system may monitor the fixation points based upon the gaze of each of the user's eyes, with each fixation point being a three-dimensional location in the user's field of view. Location information of virtual objects to be presented to the user is obtained, with each virtual object being associated with a depth plane. The depth plane on which the virtual object is to be presented may be modified based upon the fixation point of the user's eyes.
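
One common way to reduce switching, sketched here with assumed plane depths and a hysteresis band (the actual criterion in the application may differ), is to change the active depth plane only when the fixation depth is decisively closer to another plane:

```python
DEPTH_PLANE_DIOPTERS = [0.33, 1.0, 3.0]   # far, mid, near planes (illustrative)
HYSTERESIS = 0.2                          # diopters of "stickiness" before switching

def select_depth_plane(fixation_diopters, current_plane_index):
    """Switch planes only when the fixation depth is clearly closer to another plane."""
    nearest = min(range(len(DEPTH_PLANE_DIOPTERS)),
                  key=lambda i: abs(DEPTH_PLANE_DIOPTERS[i] - fixation_diopters))
    if nearest == current_plane_index:
        return current_plane_index
    gain = (abs(DEPTH_PLANE_DIOPTERS[current_plane_index] - fixation_diopters)
            - abs(DEPTH_PLANE_DIOPTERS[nearest] - fixation_diopters))
    return nearest if gain > HYSTERESIS else current_plane_index

print(select_depth_plane(0.70, current_plane_index=0))  # stays on the far plane
print(select_depth_plane(0.95, current_plane_index=0))  # switches to the mid plane
```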

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • H04N 13/122 - Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/332 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
  • H04N 13/395 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes

64.

WEARABLE SYSTEM WITH HEADSET AND CONTROLLER INSIDE-OUT TRACKING

      
Application Number 18956664
Status Pending
Filing Date 2024-11-22
First Publication Date 2025-03-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Kasper, Dominik Michael
  • Zahnert, Martin Georg
  • Sanchez Nicuesa, Manel Quim
  • Gomez-Jordana Manas, Rafa
  • Baumli, Nathan Yuki
  • Shee, Koon Keong
  • Nienstedt, Zachary C.
  • Mount, Emily Elizabeth
  • Agarwal, Lomesh
  • Lampart, Andrea

Abstract

Wearable systems and method for operation thereof incorporating headset and controller inside-out tracking are disclosed. A wearable system may include a headset and a controller. The wearable system may cause fiducials of the controller to flash. The wearable system may track a pose of the controller by capturing headset images using a headset camera, identifying the fiducials in the headset images, and tracking the pose of the controller based on the identified fiducials in the headset images and based on a pose of the headset. While tracking the pose of the controller, the wearable system may capture controller images using a controller camera. The wearable system may identify two-dimensional feature points in each controller image and determine three-dimensional map points based on the two-dimensional feature points and the pose of the controller.
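
A sketch of the fiducial-based pose step, assuming OpenCV is available and the controller's fiducial layout is known; the numbers and names are illustrative, and the full pipeline described above (controller-camera feature points and map building) is not shown:

```python
import numpy as np
import cv2

def controller_pose_from_fiducials(fiducials_3d, detections_2d, camera_matrix):
    """Return (rvec, tvec) of the controller in the headset-camera frame via PnP."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(fiducials_3d, dtype=np.float64),   # fiducial layout on the controller
        np.asarray(detections_2d, dtype=np.float64),  # where the fiducials appear in the headset image
        camera_matrix.astype(np.float64), None)       # intrinsics; no lens distortion assumed
    if not ok:
        raise RuntimeError("PnP failed; need at least 4 well-spread fiducials")
    return rvec, tvec

if __name__ == "__main__":
    K = np.array([[600.0, 0.0, 320.0], [0.0, 600.0, 240.0], [0.0, 0.0, 1.0]])
    square = [(-0.02, -0.02, 0.0), (0.02, -0.02, 0.0), (0.02, 0.02, 0.0), (-0.02, 0.02, 0.0)]
    pixels = [(300.0, 220.0), (340.0, 220.0), (340.0, 260.0), (300.0, 260.0)]
    rvec, tvec = controller_pose_from_fiducials(square, pixels, K)
    print(tvec.ravel())   # roughly 0.6 m in front of the headset camera
```

Composing this camera-frame pose with the headset's own pose then gives the controller's pose in the world frame.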

IPC Classes  ?

  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

65.

DEPTH SENSING TECHNIQUES FOR VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS

      
Application Number 18957796
Status Pending
Filing Date 2024-11-24
First Publication Date 2025-03-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Smith, Brian Keith
  • Shee, Koon Keong
  • Link, Gregory Michael

Abstract

Techniques for operating a depth sensor are discussed. A first sequence of operation steps and a second sequence of operation steps can be stored in memory on the depth sensor to define, respectively, a first depth sensing mode of operation and a second depth sensing mode of operation. In response to a first request for depth measurement(s) according to the first depth sensing mode of operation, the depth sensor can operate in the first mode of operation by executing the first sequence of operation steps. In response to a second request for depth measurement(s) according to the second depth sensing mode of operation, and without performing an additional configuration operation, the depth sensor can operate in the second mode of operation by executing the second sequence of operation steps.
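
Schematically (no real sensor API is used and the step names are made up), the stored operation-step sequences can be thought of as data that a request simply replays, so switching modes requires no separate configuration call:

```python
MODE_SEQUENCES = {
    "high_accuracy": ["enable_emitter", "expose_long", "read_frame", "read_frame", "average"],
    "low_power":     ["enable_emitter", "expose_short", "read_frame"],
}

def run_depth_request(mode, execute_step):
    """Execute the operation steps stored for the requested depth sensing mode."""
    for step in MODE_SEQUENCES[mode]:
        execute_step(step)

# Switching from one mode to the other is just picking a different stored sequence.
run_depth_request("high_accuracy", execute_step=print)
run_depth_request("low_power", execute_step=print)
```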

IPC Classes  ?

  • H04N 23/667 - Camera operation mode switching, e.g. between still and video, sport and normal or high and low resolution modes
  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • H04N 13/139 - Format conversion, e.g. of frame-rate or size
  • H04N 13/296 - Synchronisation thereof; Control thereof
  • H04N 23/959 - Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics

66.

DIFFRACTIVE OPTICAL ELEMENTS WITH MITIGATION OF REBOUNCE-INDUCED LIGHT LOSS AND RELATED SYSTEMS AND METHODS

      
Application Number 18954311
Status Pending
Filing Date 2024-11-20
First Publication Date 2025-03-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Schmulen, Jeffrey Dean
  • Ricks, Neal Paul
  • Bhargava, Samarth
  • Messer, Kevin
  • Liu, Victor Kai
  • Dixon, Matthew Grant
  • Deng, Xiaopei
  • Menezes, Marlon Edward
  • Yang, Shuqiang
  • Singh, Vikramjit
  • Luo, Kang
  • Xu, Frank Y.

Abstract

Display devices include waveguides with in-coupling optical elements that mitigate re-bounce of in-coupled light to improve in-coupling efficiency and/or uniformity. A waveguide receives light from a light source and includes an in-coupling optical element that in-couples the received light to propagate by total internal reflection within the waveguide. The in-coupled light may undergo re-bounce, in which the light reflects off a waveguide surface and, after the reflection, strikes the in-coupling optical element. Upon striking the in-coupling optical element, the light may be partially absorbed and/or out-coupled by the optical element, thereby reducing the amount of in-coupled light propagating through the waveguide. The in-coupling optical element can be truncated or have reduced diffraction efficiency along the propagation direction to reduce the occurrence of light loss due to re-bounce of in-coupled light, resulting in less in-coupled light being prematurely out-coupled and/or absorbed during subsequent interactions with the in-coupling optical element.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 6/10 - Light guidesStructural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type
  • G02B 6/26 - Optical coupling means
  • G02B 27/10 - Beam splitting or combining systems
  • G02B 27/42 - Diffraction optics
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

67.

ATTENUATION OF LIGHT TRANSMISSION ARTIFACTS IN WEARABLE DISPLAYS

      
Application Number 18960810
Status Pending
Filing Date 2024-11-26
First Publication Date 2025-03-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Manly, David
  • Messer, Kevin
  • Mathur, Vaibhav
  • Carlisle, Clinton

Abstract

A wearable display system includes an eyepiece stack having a world side and a user side opposite the world side, wherein during use a user positioned on the user side views displayed images delivered by the system via the eyepiece stack, which augment the user's view of the user's environment. The wearable display system also includes an angularly selective film arranged on the world side of the eyepiece stack. The angularly selective film includes a polarization adjusting film arranged between a pair of linear polarizers. The linear polarizers and polarization adjusting film significantly reduce transmission of visible light incident on the angularly selective film at large angles of incidence without significantly reducing transmission of light incident on the angularly selective film at small angles of incidence.

IPC Classes  ?

  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups , for polarising
  • G02B 5/30 - Polarising elements
  • G02B 27/01 - Head-up displays
  • G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
  • G02F 1/137 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulatingNon-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering

68.

DISPLAY SYSTEMS AND METHODS FOR DETERMINING REGISTRATION BETWEEN A DISPLAY AND A USER'S EYES

      
Application Number 18962659
Status Pending
Filing Date 2024-11-27
First Publication Date 2025-03-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Edwin, Lionel Ernest
  • Nienstedt, Zachary C.
  • Yeoh, Ivan Li Chuen
  • Miller, Samuel A.
  • Xu, Yan
  • Cazamias, Jordan Alexander

Abstract

A display system may include a head-mounted display (HMD) for rendering a three-dimensional virtual object which appears to be located in an ambient environment of a user of the display. One or more eyes of the user may not be in desired positions, relative to the HMD, to receive, or register, image information outputted by the HMD and/or to view an external environment. For example, the HMD-to-eye alignment may vary for different users and/or may change over time (e.g., as the HMD is displaced). The display system may determine a relative position or alignment between the HMD and the user's eyes. Based on the relative positions, the wearable device may determine if it is properly fitted to the user, may provide feedback on the quality of the fit to the user, and/or may take actions to reduce or minimize effects of any misalignment.

IPC Classes  ?

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • A61B 3/11 - Objective types, i.e. instruments for examining the eyes independent of the patients perceptions or reactions for measuring interpupillary distance or diameter of pupils
  • A61B 3/113 - Objective types, i.e. instruments for examining the eyes independent of the patients perceptions or reactions for determining or recording eye movement
  • G02B 27/01 - Head-up displays
  • G02B 30/00 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
  • G02B 30/40 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images giving the observer of a single two-dimensional [2D] image a perception of depth
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0346 - Pointing devices displaced or positioned by the userAccessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/16 - Sound inputSound output
  • G06T 3/40 - Scaling of whole images or parts thereof, e.g. expanding or contracting
  • G06V 10/42 - Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
  • G06V 10/46 - Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]Salient regional features
  • G06V 10/60 - Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
  • G06V 40/18 - Eye characteristics, e.g. of the iris
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

69.

Device controller

      
Application Number 29942166
Grant Number D1066297
Status In Force
Filing Date 2024-05-14
First Publication Date 2025-03-11
Grant Date 2025-03-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta
  • Awad, Haney
  • Swinton, Matthew David
  • Urban, Hayes

70.

NANOPATTERN ENCAPSULATION FUNCTION, METHOD AND PROCESS IN COMBINED OPTICAL COMPONENTS

      
Application Number 18554613
Status Pending
Filing Date 2022-04-13
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Singh, Vikramjit
  • Miller, Michael Nevin
  • Anderson, T.G.
  • Xu, Frank Y.

Abstract

Disclosed herein are systems and methods for displays, such as for a head wearable device. An example display can include an infrared illumination layer, the infrared illumination layer including a substrate, one or more LEDs disposed on a first surface of the substrate, and a first encapsulation layer disposed on the first surface of the substrate, where the encapsulation layer can include a nano-patterned surface. In some examples, the nano-patterned surface can be configured to improve a visible light transmittance of the illumination layer. In one or more examples, embodiments disclosed herein may provide a robust illumination layer that can reduce the haze associated with an illumination layer.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • B82Y 20/00 - Nanooptics, e.g. quantum optics or photonic crystals
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,

71.

EYE CENTER OF ROTATION DETERMINATION WITH ONE OR MORE EYE TRACKING CAMERAS

      
Application Number 18948073
Status Pending
Filing Date 2024-11-14
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Cohen, David
  • Joseph, Elad
  • Ferens, Ron Nisim
  • Preter, Eyal
  • Bar-On, Eitan Shmuel
  • Yahav, Giora

Abstract

A display system can include a head-mounted display configured to project light to an eye of a user to display virtual image content at different amounts of divergence and collimation. The display system can include an inward-facing imaging system, possibly comprising a plurality of cameras, that images the user's eye and glints thereon, and processing electronics that are in communication with the inward-facing imaging system and that are configured to obtain an estimate of a center of rotation of the user's eye using cornea data derived from the glint images. The display system may render virtual image content with a render camera positioned at the determined position of the center of rotation of said eye.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06T 7/292 - Multi-camera tracking
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

72.

EYEPIECES FOR AUGMENTED REALITY DISPLAY SYSTEM

      
Application Number 18951308
Status Pending
Filing Date 2024-11-18
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Bhargava, Samarth
  • Liu, Victor Kai
  • Messer, Kevin

Abstract

An eyepiece waveguide for an augmented reality display system may include an optically transmissive substrate, an input coupling grating (ICG) region, a multi-directional pupil expander (MPE) region, and an exit pupil expander (EPE) region. The ICG region may receive an input beam of light and couple the input beam into the substrate as a guided beam. The MPE region may include a plurality of diffractive features which exhibit periodicity along at least a first axis of periodicity and a second axis of periodicity. The MPE region may be positioned to receive the guided beam from the ICG region and to diffract it in a plurality of directions to create a plurality of diffracted beams. The EPE region may overlap the MPE region and may out couple one or more of the diffracted beams from the optically transmissive substrate as output beams.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 27/09 - Beam shaping, e.g. changing the cross-sectioned area, not otherwise provided for
  • G06V 20/20 - ScenesScene-specific elements in augmented reality scenes

73.

HEAD-MOUNTED DISPLAY SYSTEMS WITH POWER SAVING FUNCTIONALITY

      
Application Number 18952446
Status Pending
Filing Date 2024-11-19
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Rivera Cintron, Carlos A.
  • Link, Gregory
  • Sommers, Jeffrey Scott
  • Hull, Matthew Thomas
  • Rodriguez, Jose Felix
  • Martinez Perez, Ricardo

Abstract

Head-mounted display systems with power saving functionality are disclosed. The systems can include a frame configured to be supported on the head of the user. The systems can also include a head-mounted display disposed on the frame, one or more sensors, and processing electronics in communication with the display and the one or more sensors. In some implementations, the processing electronics can be configured to cause the system to reduce power of one or more components based at least in part on a determination that the frame is in a certain position (e.g., upside-down or on top of the head of the user). In some implementations, the processing electronics can be configured to cause the system to reduce power of one or more components based at least in part on a determination that the frame has been stationary for at least a threshold period of time.
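
A hedged sketch of the power-saving decision just described: reduce power when the frame appears upside-down or on top of the head, or when it has been stationary longer than a threshold. The thresholds and sensor fields are illustrative assumptions, not values from the patent.

```python
import time
from dataclasses import dataclass


@dataclass
class ImuSample:
    gravity_z: float        # roughly +1.0 when worn upright, roughly -1.0 when upside-down
    angular_speed: float    # rad/s magnitude from the gyroscope


class PowerManager:
    def __init__(self, stationary_timeout_s: float = 30.0, motion_eps: float = 0.02) -> None:
        self.stationary_timeout_s = stationary_timeout_s
        self.motion_eps = motion_eps
        self._last_motion_time = time.monotonic()

    def should_reduce_power(self, sample: ImuSample) -> bool:
        now = time.monotonic()
        if sample.angular_speed > self.motion_eps:
            self._last_motion_time = now
        upside_down = sample.gravity_z < -0.5
        stationary_too_long = (now - self._last_motion_time) > self.stationary_timeout_s
        return upside_down or stationary_too_long
```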

IPC Classes  ?

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06F 1/3218 - Monitoring of peripheral devices of display devices
  • G06F 1/3231 - Monitoring the presence, absence or movement of users
  • G06F 1/3234 - Power saving characterised by the action undertaken

74.

WEARABLE SYSTEM WITH CONTROLLER LOCALIZATION USING HEADSET CAMERAS AND CONTROLLER FIDUCIALS

      
Application Number 18956658
Status Pending
Filing Date 2024-11-22
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Nienstedt, Zachary C.
  • Roberts, Daniel
  • Lopez, Christopher Michael
  • Bucknor, Brian Edward Oliver
  • Miller, Samuel A.
  • Baumli, Nathan Yuki
  • Kasper, Dominik Michael
  • Sanchez Nicuesa, Manel Quim
  • Lampart, Andrea
  • Gomez-Jordana Manas, Rafa
  • Zahnert, Martin Georg
  • Stan, Nikola
  • Mount, Emily Elizabeth

Abstract

Wearable systems, and methods for operation thereof, incorporating headset and controller localization using headset cameras and controller fiducials are disclosed. A wearable system may include a headset and a controller. The wearable system may alternate between performing headset tracking and performing controller tracking by repeatedly capturing images using a headset camera of the headset during headset tracking frames and controller tracking frames. The wearable system may cause the headset camera to capture a first exposure image having an exposure above a threshold and cause the headset camera to capture a second exposure image having an exposure below the threshold. The wearable system may determine a fiducial interval during which fiducials of the controller are to flash at a fiducial frequency and a fiducial period. The wearable system may cause the fiducials to flash during the fiducial interval in accordance with the fiducial frequency and the fiducial period.
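
An illustrative scheduling sketch of the alternating tracking frames described above: headset-tracking frames use a longer exposure, controller-tracking frames use a shorter exposure, and the controller fiducials are flashed only while the short exposure is open. All timing values and names are assumptions made up for the example.

```python
from dataclasses import dataclass


@dataclass
class Frame:
    start_ms: float
    exposure_ms: float
    purpose: str  # "headset" or "controller"


def build_schedule(frame_period_ms: float, n_frames: int,
                   long_exposure_ms: float, short_exposure_ms: float):
    frames, fiducial_intervals = [], []
    for i in range(n_frames):
        start = i * frame_period_ms
        if i % 2 == 0:
            frames.append(Frame(start, long_exposure_ms, "headset"))
        else:
            frames.append(Frame(start, short_exposure_ms, "controller"))
            # Flash the controller fiducials only during the short exposure window.
            fiducial_intervals.append((start, start + short_exposure_ms))
    return frames, fiducial_intervals


frames, flashes = build_schedule(frame_period_ms=16.6, n_frames=6,
                                 long_exposure_ms=8.0, short_exposure_ms=1.0)
print(flashes)  # controller fiducial flash intervals, aligned to controller frames
```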

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time

75.

AUGMENTED REALITY DISPLAY WITH FRAME MODULATION FUNCTIONALITY

      
Application Number 18828828
Status Pending
Filing Date 2024-09-09
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Smith, Brian Keith
  • Rivera Cintron, Carlos A.
  • Rodriguez, Jose Felix
  • Hull, Matthew Thomas
  • Link, Gregory Michael

Abstract

A head mounted display system can process images by assessing relative motion between the head mounted display and one or more features in a user's environment. The assessment of relative motion can include determining whether the head mounted display has moved, is moving and/or is expected to move with respect to one or more features in the environment. Additionally or alternatively, the assessment can include determining whether one or more features in the environment have moved, are moving and/or are expected to move relative to the head mounted display. The image processing can further include determining one or more virtual image content locations in the environment that correspond to a location where renderable virtual image content appears to a user when the location appears in the display and comparing the one or more virtual image content locations in the environment with a viewing zone.

IPC Classes  ?

  • H04N 23/667 - Camera operation mode switching, e.g. between still and video, sport and normal or high and low resolution modes
  • G02B 27/01 - Head-up displays
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

76.

PLENOPTIC CAMERA MEASUREMENT AND CALIBRATION OF HEAD-MOUNTED DISPLAYS

      
Application Number 18952703
Status Pending
Filing Date 2024-11-19
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor Schuck, III, Miller Harry

Abstract

A method for measuring performance of a head-mounted display module, the method including arranging the head-mounted display module relative to a plenoptic camera assembly so that an exit pupil of the head-mounted display module coincides with a pupil of the plenoptic camera assembly; emitting light from the head-mounted display module while the head-mounted display module is arranged relative to the plenoptic camera assembly; filtering the light at the exit pupil of the head-mounted display module; acquiring, with the plenoptic camera assembly, one or more light field images projected from the head-mounted display module with the filtered light; and determining information about the performance of the head-mounted display module based on the acquired light field image(s).

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 5/20 - Filters
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups , for polarising

77.

ARCHITECTURES AND METHODS FOR OUTPUTTING DIFFERENT WAVELENGTH LIGHT OUT OF WAVEGUIDES

      
Application Number 18953953
Status Pending
Filing Date 2024-11-20
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Tekolste, Robert Dale
  • Klug, Michael Anthony
  • Schowengerdt, Brian T.

Abstract

Architectures are provided for selectively outputting light for forming images, the light having different wavelengths and being outputted with low levels of crosstalk. In some embodiments, light is incoupled into a waveguide and deflected to propagate in different directions, depending on wavelength. The incoupled light is then outcoupled by outcoupling optical elements that outcouple light based on the direction of propagation of the light. In some other embodiments, color filters are between a waveguide and outcoupling elements. The color filters limit the wavelengths of light that interact with and are outcoupled by the outcoupling elements. In yet other embodiments, a different waveguide is provided for each range of wavelengths to be outputted. Incoupling optical elements selectively incouple light of the appropriate range of wavelengths into a corresponding waveguide, from which the light is outcoupled.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/42 - Diffraction optics

78.

HEAD POSE MIXING OF AUDIO FILES

      
Application Number 18954259
Status Pending
Filing Date 2024-11-20
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Mangiat, Stephen Vincent
  • Tucker, Michael Benson
  • Tajik, Anastasia Andreyevna

Abstract

Examples of wearable devices that can present to a user of the display device an audible or visual representation of an audio file comprising a plurality of stem tracks that represent different audio content of the audio file are described. Systems and methods are described that determine the pose of the user; generate, based on the pose of the user, an audio mix of at least one of the plurality of stem tracks of the audio file; generate, based on the pose of the user and the audio mix, a visualization of the audio mix; communicate an audio signal representative of the audio mix to the speaker; and communicate a visual signal representative of the visualization of the audio mix to the display.
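
A minimal sketch of pose-driven stem mixing as described in this abstract: the user's head yaw is mapped to per-stem gains, and the stems are summed into a mix. The stem layout and the cosine gain law are assumptions for illustration only, not the patent's method.

```python
import numpy as np


def mix_stems(stems: dict, stem_azimuths_deg: dict, head_yaw_deg: float) -> np.ndarray:
    """stems: name -> mono sample array; stem_azimuths_deg: name -> direction in degrees."""
    mix = None
    for name, samples in stems.items():
        # Stems the user faces are emphasized; stems behind the user are attenuated.
        delta = np.deg2rad(stem_azimuths_deg[name] - head_yaw_deg)
        gain = 0.5 * (1.0 + np.cos(delta))  # 1.0 when facing the stem, 0.0 when facing away
        contribution = gain * samples
        mix = contribution if mix is None else mix + contribution
    return mix


sr = 48000
t = np.arange(sr) / sr
stems = {"drums": np.sin(2 * np.pi * 220 * t), "vocals": np.sin(2 * np.pi * 440 * t)}
azimuths = {"drums": 0.0, "vocals": 180.0}
out = mix_stems(stems, azimuths, head_yaw_deg=30.0)  # drums dominate when facing forward
```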

IPC Classes  ?

  • H04S 7/00 - Indicating arrangementsControl arrangements, e.g. balance control
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/16 - Sound inputSound output

79.

USER HEART RATE DETECTION SYSTEM AND METHOD USING FUSION OF MULTI-SENSOR DATA

      
Application Number US2023031746
Publication Number 2025/048824
Status In Force
Filing Date 2023-08-31
Publication Date 2025-03-06
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Mashayekhi, Ghoncheh
  • Rice, Darrian
  • Nyman, Edward Jr.
  • Esposito, Jennifer Miglionico
  • Shironoshita, Emilio Patrick

Abstract

Systems and methods for fusing multiple types of sensor data to determine a heart rate of a user. An accelerometer obtains accelerometer data associated with the user over a time period, and a gyroscope obtains gyroscope data associated with the user over the time period. Also, a camera obtains a plurality of images of the user's eye over the time period. The plurality of images is analyzed to generate image data of the user's eyelid over the time period. The accelerometer data, the gyroscope data, and the image data are fused into fused sensor data, and a heart rate of the user is determined from the fused sensor data.
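
A hedged sketch of the fusion idea above: accelerometer, gyroscope, and eyelid-image signals sampled over the same window are combined into one fused trace, and the heart rate is taken from the dominant spectral peak in the cardiac band. The equal weighting and band limits are assumptions, not the patent's fusion scheme.

```python
import numpy as np


def estimate_heart_rate_bpm(accel: np.ndarray, gyro: np.ndarray,
                            eyelid: np.ndarray, fs_hz: float) -> float:
    # Normalize each modality so no single sensor dominates the fusion.
    def normalize(x):
        x = x - x.mean()
        return x / (x.std() + 1e-9)

    fused = normalize(accel) + normalize(gyro) + normalize(eyelid)

    # Dominant frequency inside a plausible cardiac band (0.7-3.0 Hz, i.e. 42-180 bpm).
    spectrum = np.abs(np.fft.rfft(fused))
    freqs = np.fft.rfftfreq(len(fused), d=1.0 / fs_hz)
    band = (freqs >= 0.7) & (freqs <= 3.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0


fs = 100.0
t = np.arange(0, 30, 1 / fs)
pulse = np.sin(2 * np.pi * 1.2 * t)            # 72 bpm component shared by all sensors
accel = pulse + 0.5 * np.random.randn(len(t))
gyro = pulse + 0.5 * np.random.randn(len(t))
eyelid = pulse + 0.5 * np.random.randn(len(t))
print(round(estimate_heart_rate_bpm(accel, gyro, eyelid, fs)))  # typically ~72
```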

IPC Classes  ?

  • A61B 5/024 - Measuring pulse rate or heart rate
  • A61B 5/0205 - Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition

80.

Head mounted audio-visual display system

      
Application Number 29963500
Grant Number D1065190
Status In Force
Filing Date 2024-09-17
First Publication Date 2025-03-04
Grant Date 2025-03-04
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Awad, Haney
  • Kaji, Masamune
  • Swinton, Robert Dainis
  • Wheeler, William
  • Swinton, Matthew David
  • Gunther, Sebastian Gonzalo Arrieta

81.

METHODS AND APPARATUS FOR WEARABLE DISPLAY DEVICE WITH VARYING GRATING PARAMETERS

      
Application Number 18940738
Status Pending
Filing Date 2024-11-07
First Publication Date 2025-02-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Curtis, Kevin Richard
  • Cheng, Hui-Chuan
  • Greco, Paul M.
  • Welch, William Hudson
  • Browy, Eric C.
  • Schuck, III, Miller Harry
  • Sissom, Bradley Jay

Abstract

A device for viewing a projected image includes an input coupling grating operable to receive light related to the projected image from a light source and an expansion grating having a first grating structure characterized by a first set of grating parameters varying in one or more dimensions. The expansion grating structure is operable to receive light from the input coupling grating and to multiply the light related to the projected image. The device also includes an output coupling grating having a second grating structure characterized by a second set of grating parameters and operable to output the multiplied light in a predetermined direction.

IPC Classes  ?

  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • B29D 11/00 - Producing optical elements, e.g. lenses or prisms
  • G02B 1/00 - Optical elements characterised by the material of which they are madeOptical coatings for optical elements
  • G02B 5/18 - Diffracting gratings
  • G02B 5/30 - Polarising elements
  • G02B 6/293 - Optical coupling means having data bus means, i.e. plural waveguides interconnected and providing an inherently bidirectional system by mixing and splitting signals with wavelength selective means
  • G02B 6/34 - Optical coupling means utilising prism or grating
  • G02B 7/00 - Mountings, adjusting means, or light-tight connections, for optical elements
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G02B 27/10 - Beam splitting or combining systems
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups , for polarising
  • G02B 27/30 - Collimators
  • G02C 5/16 - Side-members resilient or with resilient parts
  • G02C 11/00 - Non-optical adjunctsAttachment thereof
  • G06F 1/16 - Constructional details or arrangements
  • G06F 1/20 - Cooling means
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/147 - Digital output to display device using display panels
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G09G 3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix
  • H04N 9/31 - Projection devices for colour picture display
  • H05K 7/20 - Modifications to facilitate cooling, ventilating, or heating

82.

METHOD AND SYSTEM FOR DETECTING FIBER POSITION IN A FIBER SCANNING PROJECTOR

      
Application Number 18943728
Status Pending
Filing Date 2024-11-11
First Publication Date 2025-02-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Melville, Charles David
  • Watson, Mathew D.
  • Rajiv, Abhijith
  • Kuehn, Benjamin John

Abstract

A method of measuring a position of a scanning cantilever includes providing a housing including an actuation region, a position measurement region including an aperture, and an oscillation region. The method also includes providing a drive signal to an actuator disposed in the actuation region, oscillating the scanning cantilever in response to the drive signal, generating a first light beam using a first optical source, directing the first light beam toward the aperture, detecting at least a portion of the first light beam using a first photodetector, generating a second light beam using a second optical source, directing the second light beam toward the aperture, detecting at least a portion of the second light beam using a second photodetector, and determining the position of the scanning cantilever based on the detected portion of the first light beam and the detected portion of the second light beam.

IPC Classes  ?

  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G02B 26/10 - Scanning systems

83.

VIRTUAL USER INPUT CONTROLS IN A MIXED REALITY ENVIRONMENT

      
Application Number 18948996
Status Pending
Filing Date 2024-11-15
First Publication Date 2025-02-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Kaehler, Adrian
  • Croston, John Adam

Abstract

A wearable display system includes a mixed reality display for presenting a virtual image to a user, an outward-facing imaging system configured to image an environment of the user, and a hardware processor operably coupled to the mixed reality display and to the imaging system. The hardware processor is programmed to generate a virtual remote associated with a parent device, render the virtual remote and a virtual control element on the mixed reality display, determine when the user of the wearable system interacts with the virtual control element of the virtual remote, and perform certain functions in response to user interaction with the virtual control element. These functions may include causing the virtual control element to move on the mixed reality display and, when movement of the virtual control element surpasses a threshold condition, generating a focus indicator for the virtual control element.

IPC Classes  ?

  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06N 3/02 - Neural networks
  • G06N 20/00 - Machine learning
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 10/46 - Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]Salient regional features
  • G06V 20/20 - ScenesScene-specific elements in augmented reality scenes
  • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestriansBody parts, e.g. hands

84.

IMMERSIVE AUDIO PLATFORM

      
Application Number 18943633
Status Pending
Filing Date 2024-11-11
First Publication Date 2025-02-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Jot, Jean-Marc
  • Minnick, Michael
  • Pastouchenko, Dmitry
  • Simon, Michael Aaron
  • Scott, III, John Emmitt
  • Bailey, Richard St. Clair
  • Balasubramanyam, Shivakumar
  • Agadi, Harsharaj

Abstract

Disclosed herein are systems and methods for presenting audio content in mixed reality environments. A method may include receiving a first input from an application program; in response to receiving the first input, receiving, via a first service, an encoded audio stream; generating, via the first service, a decoded audio stream based on the encoded audio stream; receiving, via a second service, the decoded audio stream; receiving a second input from one or more sensors of a wearable head device; receiving, via the second service, a third input from the application program, wherein the third input corresponds to a position of one or more virtual speakers; generating, via the second service, a spatialized audio stream based on the decoded audio stream, the second input, and the third input; and presenting, via one or more speakers of the wearable head device, the spatialized audio stream.

IPC Classes  ?

  • H04S 7/00 - Indicating arrangementsControl arrangements, e.g. balance control
  • G02B 27/01 - Head-up displays

85.

VIRTUAL AND REAL OBJECT RECORDING IN MIXED REALITY DEVICE

      
Application Number 18938790
Status Pending
Filing Date 2024-11-06
First Publication Date 2025-02-20
Owner Magic Leap, Inc. (USA)
Inventor Huang, Ziqiang

Abstract

A virtual image generation system for use by an end user comprises memory, a display subsystem, an object selection device configured for receiving input from the end user and persistently selecting at least one object in response to the end user input, and a control subsystem configured for rendering a plurality of image frames of a three-dimensional scene, conveying the image frames to the display subsystem, generating audio data originating from the at least one selected object, and for storing the audio data within the memory.

IPC Classes  ?

  • G06F 3/16 - Sound inputSound output
  • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
  • A63F 13/424 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
  • A63F 13/5255 - Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
  • A63F 13/5372 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
  • A63F 13/54 - Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
  • A63F 13/65 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04R 1/10 - EarpiecesAttachments therefor
  • H04R 1/32 - Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only
  • H04R 1/40 - Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers
  • H04R 3/00 - Circuits for transducers
  • H04R 5/02 - Spatial or constructional arrangements of loudspeakers

86.

LIGHT OUTPUT SYSTEM WITH REFLECTOR AND LENS FOR HIGHLY SPATIALLY UNIFORM LIGHT OUTPUT

      
Application Number 18934988
Status Pending
Filing Date 2024-11-01
First Publication Date 2025-02-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Sissom, Bradley Jay
  • Hall, Heidi Leising
  • Curtis, Kevin

Abstract

A user may interact with and view virtual elements, such as avatars and objects, and/or real world elements in three-dimensional space in an augmented reality (AR) session. The system may allow one or more spectators to view, from a stationary or dynamic camera, a third-person view of the user's AR session. The third-person view may be synchronized with the user view, and the virtual elements of the user view may be composited onto the third-person view.

IPC Classes  ?

  • F21V 13/04 - Combinations of only two kinds of elements the elements being reflectors and refractors
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 6/12 - Light guidesStructural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type of the integrated circuit kind
  • G02B 23/06 - Telescopes, e.g. binocularsPeriscopesInstruments for viewing the inside of hollow bodiesViewfindersOptical aiming or sighting devices involving prisms or mirrors having a focusing action, e.g. parabolic mirror
  • G02B 27/01 - Head-up displays
  • G02B 30/50 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
  • H04N 13/315 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers the parallax barriers being time-variant
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

87.

CROSS REALITY SYSTEM WITH WIRELESS FINGERPRINTS

      
Application Number 18936094
Status Pending
Filing Date 2024-11-04
First Publication Date 2025-02-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Shveki, Gilboa
  • Weisbih, Ben
  • Kapota, Ofer
  • Torres, Rafael Domingos
  • Olshansky, Daniel
  • Holder, Joel David

Abstract

A cross reality system enables any of multiple devices to efficiently and accurately access previously stored maps and render virtual content specified in relation to those maps. Both stored maps and tracking maps used by portable devices may have wireless fingerprints associated with them. The portable devices may maintain wireless fingerprints based on wireless scans performed repetitively, based on one or more trigger conditions, as the devices move around the physical world. The wireless information obtained from these scans may be used to create or update wireless fingerprints associated with locations in a tracking map on the devices. One or more of these wireless fingerprints may be used when a previously stored map is to be selected based on its coverage of an area in which the portable device is operating. Maintaining wireless fingerprints in this way provides a reliable and low latency mechanism for performing map-related operations.
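
A minimal sketch of selecting a stored map by wireless fingerprint, along the lines described above: fingerprints are modeled as access-point-to-signal maps, and the stored map whose fingerprint best matches the device's current fingerprint is chosen. The similarity measure is an assumption made for the example.

```python
from typing import Dict

Fingerprint = Dict[str, float]  # BSSID -> RSSI (dBm)


def similarity(a: Fingerprint, b: Fingerprint) -> float:
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    # Reward overlap, penalize RSSI disagreement on shared access points.
    rssi_gap = sum(abs(a[ap] - b[ap]) for ap in shared) / len(shared)
    return len(shared) / (1.0 + rssi_gap)


def select_stored_map(current: Fingerprint, stored: Dict[str, Fingerprint]) -> str:
    return max(stored, key=lambda name: similarity(current, stored[name]))


stored_maps = {
    "office": {"aa:01": -40.0, "aa:02": -55.0},
    "lab": {"bb:01": -45.0, "bb:02": -60.0},
}
current_scan = {"aa:01": -42.0, "aa:02": -58.0, "cc:09": -80.0}
print(select_stored_map(current_scan, stored_maps))  # "office"
```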

IPC Classes  ?

  • H04W 24/02 - Arrangements for optimising operational condition
  • G01S 5/02 - Position-fixing by co-ordinating two or more direction or position-line determinationsPosition-fixing by co-ordinating two or more distance determinations using radio waves
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04W 4/02 - Services making use of location information
  • H04W 4/029 - Location-based management or tracking services

88.

DISPLAY SYSTEM WITH LOW-LATENCY PUPIL TRACKER

      
Application Number 18929056
Status Pending
Filing Date 2024-10-28
First Publication Date 2025-02-13
Owner Magic Leap, Inc. (USA)
Inventor Klug, Michael Anthony

Abstract

A display system aligns the location of its exit pupil with the location of a viewer's pupil by changing the location of the portion of a light source that outputs light. The light source may include an array of pixels that output light, thereby allowing an image to be displayed on the light source. The display system includes a camera that captures image(s) of the eye, and negatives of the eye image(s) are displayed by the light source. In the negative image, the dark pupil of the eye is a bright spot which, when displayed by the light source, defines the exit pupil of the display system, such that image content may be presented by modulating the light source. The location of the pupil of the eye may be tracked by capturing the images of the eye.
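
A sketch of the negative-image idea described above, under simple assumptions: the captured eye image is inverted so the dark pupil becomes the brightest region, and the centroid of that bright region indicates where the display's exit pupil should sit. The image shape and threshold are illustrative.

```python
import numpy as np


def exit_pupil_location(eye_image: np.ndarray) -> tuple:
    """eye_image: grayscale array in [0, 1]; returns (row, col) of the pupil centroid."""
    negative = 1.0 - eye_image                      # dark pupil -> bright spot
    mask = negative > 0.8 * negative.max()          # keep only the brightest region
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())


# A synthetic eye image: mid-gray sclera with a dark "pupil" blob.
img = np.full((120, 160), 0.6)
img[50:70, 80:100] = 0.05
print(exit_pupil_location(img))  # approximately (59.5, 89.5)
```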

IPC Classes  ?

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays

89.

VIRTUAL AND AUGMENTED REALITY SYSTEMS AND METHODS

      
Application Number 18930234
Status Pending
Filing Date 2024-10-29
First Publication Date 2025-02-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Rodriguez, Jose Felix
  • Perez, Ricardo Martinez

Abstract

A virtual reality (VR) and/or augmented reality (AR) display system is configured to control a display using control information that is embedded in or otherwise included with imagery data to be presented through the display. The control information can indicate depth plane(s) and/or color plane(s) to be used to present the imagery data, depth plane(s) and/or color plane(s) to be activated or inactivated, shift(s) of at least a portion of the imagery data (e.g., one or more pixels) laterally within a depth plane and/or longitudinally between depth planes, and/or other suitable controls.

IPC Classes  ?

  • G06T 7/50 - Depth or shape recovery
  • G02B 27/01 - Head-up displays
  • G06F 3/14 - Digital output to display device
  • G06T 7/579 - Depth or shape recovery from multiple images from motion
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G09G 3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix
  • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/395 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
  • H04N 13/398 - Synchronisation thereofControl thereof

90.

SYSTEMS AND METHODS FOR VIRTUAL AND AUGMENTED REALITY

      
Application Number 18932396
Status Pending
Filing Date 2024-10-30
First Publication Date 2025-02-13
Owner Magic Leap, Inc. (USA)
Inventor Browy, Eric C.

Abstract

Disclosed herein are systems and methods for distributed computing and/or networking for mixed reality systems. A method may include capturing an image via a camera of a head-wearable device. Inertial data may be captured via an inertial measurement unit of the head-wearable device. A position of the head-wearable device can be estimated based on the image and the inertial data via one or more processors of the head-wearable device. The image can be transmitted to a remote server. A neural network can be trained based on the image via the remote server. A trained neural network can be transmitted to the head-wearable device.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/14 - Digital output to display device
  • G06F 18/214 - Generating training patternsBootstrap methods, e.g. bagging or boosting
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 20/20 - ScenesScene-specific elements in augmented reality scenes
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
  • G06V 40/18 - Eye characteristics, e.g. of the iris
  • H04B 7/155 - Ground-based stations
  • H04L 67/10 - Protocols in which an application is distributed across nodes in the network

91.

SHAPED COLOR-ABSORBING REGIONS FOR WAVEGUIDES

      
Application Number 18718675
Status Pending
Filing Date 2022-12-16
First Publication Date 2025-02-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Singh, Vikramjit
  • Traub, Matthew C.
  • Menezes, Marlon Edward
  • Liu, Yingnan
  • Xu, Frank Y.

Abstract

A waveguide stack having color-selective regions on one or more waveguides. The color-selective regions are configured to absorb incident light of a first wavelength range in such a way as to reduce or prevent the incident light of the first wavelength range from coupling into a waveguide configured to transmit a light of a second wavelength range.

IPC Classes  ?

92.

SYSTEMS AND METHODS FOR END TO END SCENE RECONSTRUCTION FROM MULTIVIEW IMAGES

      
Application Number 18808906
Status Pending
Filing Date 2024-08-19
First Publication Date 2025-02-13
Owner MAGIC LEAP, INC. (USA)
Inventor Murez, Zachary Paul

Abstract

Systems and methods of generating a three-dimensional (3D) reconstruction of a scene or environment surrounding a user of a spatial computing system, such as a virtual reality, augmented reality or mixed reality system, using only multiview images and without the need for depth sensors or depth data from sensors. Features are extracted from a sequence of frames of RGB images and back-projected, using known camera intrinsics and extrinsics, into a 3D voxel volume wherein each image pixel is mapped to a ray in the voxel volume. The back-projected features are fused into the 3D voxel volume. The 3D voxel volume is passed through a 3D convolutional neural network to refine the features and regress truncated signed distance function values at each voxel of the 3D voxel volume.
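
A hedged sketch of the back-projection step described above: 2D feature maps from posed RGB frames are accumulated into a 3D voxel volume by projecting each voxel center into each image with the known intrinsics and extrinsics and averaging the sampled features. The 3D CNN and TSDF regression stage is omitted, and all parameter names are illustrative assumptions.

```python
import numpy as np


def backproject_features(feat_maps, K, world_to_cam, voxel_origin, voxel_size, grid_shape):
    """feat_maps: list of (H, W, C) arrays; world_to_cam: list of 4x4 pose matrices."""
    C = feat_maps[0].shape[2]
    volume = np.zeros((*grid_shape, C))
    counts = np.zeros(grid_shape)

    # World coordinates of every voxel center.
    idx = np.stack(np.meshgrid(*[np.arange(n) for n in grid_shape], indexing="ij"), axis=-1)
    centers = np.asarray(voxel_origin) + (idx + 0.5) * voxel_size   # shape (X, Y, Z, 3)

    for feat, pose in zip(feat_maps, world_to_cam):
        cam = centers @ pose[:3, :3].T + pose[:3, 3]                # world -> camera frame
        in_front = cam[..., 2] > 1e-6
        pix = cam @ K.T                                             # pinhole projection
        z = np.clip(pix[..., 2], 1e-6, None)
        u = np.round(pix[..., 0] / z).astype(int)
        v = np.round(pix[..., 1] / z).astype(int)
        H, W = feat.shape[:2]
        valid = in_front & (u >= 0) & (u < W) & (v >= 0) & (v < H)
        volume[valid] += feat[v[valid], u[valid]]                   # accumulate sampled features
        counts[valid] += 1

    return volume / np.clip(counts[..., None], 1, None)             # fused feature volume
```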

IPC Classes  ?

93.

VIRTUAL AND AUGMENTED REALITY SYSTEMS AND METHODS HAVING UNEQUAL NUMBERS OF COMPONENT COLOR IMAGES DISTRIBUTED ACROSS DEPTH PLANES

      
Application Number 18931455
Status Pending
Filing Date 2024-10-30
First Publication Date 2025-02-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Schowengerdt, Brian T.
  • Hua, Hong
  • Cheng, Hui-Chuan
  • Peroz, Christophe

Abstract

Images perceived to be substantially full color or multi-colored may be formed using component color images that are distributed in unequal numbers across a plurality of depth planes. The distribution of component color images across depth planes may vary based on color. In some embodiments, a display system includes a stack of waveguides that each output light of a particular color, with some colors having fewer numbers of associated waveguides than other colors. The waveguide stack may include multiple pluralities (e.g., first and second pluralities) of waveguides, each configured to produce an image by outputting light corresponding to a particular color. The total number of waveguides in the second plurality of waveguides may be less than the total number of waveguides in the first plurality of waveguides.

IPC Classes  ?

94.

SPATIAL AUDIO FOR INTERACTIVE AUDIO ENVIRONMENTS

      
Application Number 18924155
Status Pending
Filing Date 2024-10-23
First Publication Date 2025-02-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Audfray, Remi Samuel
  • Jot, Jean-Marc
  • Dicker, Samuel Charles

Abstract

Systems and methods of presenting an output audio signal to a listener located at a first location in a virtual environment are disclosed. According to embodiments of a method, an input audio signal is received. A first intermediate audio signal corresponding to the input audio signal is determined, based on a location of the sound source in the virtual environment, and the first intermediate audio signal is associated with a first bus. A second intermediate audio signal is determined. The second intermediate audio signal corresponds to a reverberation of the input audio signal in the virtual environment. The second intermediate audio signal is determined based on a location of the sound source, and further based on an acoustic property of the virtual environment. The second intermediate audio signal is associated with a second bus. The output audio signal is presented to the listener via the first and second buses.
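
A minimal sketch of the two-bus structure described above, under stated assumptions: a direct ("dry") bus carries the panned source signal based on the source position relative to the listener, and a reverb ("wet") bus carries a send scaled by a room property; the listener output is the sum of both buses. The gain laws and the stand-in reverb are illustrative, not the patent's processing.

```python
import numpy as np


def render_source(signal, source_pos, listener_pos, room_reverb_gain, fs_hz):
    offset = np.asarray(source_pos, dtype=float) - np.asarray(listener_pos, dtype=float)
    distance = np.linalg.norm(offset) + 1e-6
    azimuth = np.arctan2(offset[1], offset[0])

    # Bus 1: direct path with distance attenuation and constant-power panning.
    direct_gain = 1.0 / distance
    pan = np.sin(azimuth)                                   # -1 = full left, +1 = full right (assumed convention)
    left = direct_gain * np.sqrt((1.0 - pan) / 2.0) * signal
    right = direct_gain * np.sqrt((1.0 + pan) / 2.0) * signal

    # Bus 2: reverb send, here just an attenuated, delayed copy as a stand-in for a reverberator.
    delay = int(0.03 * fs_hz)
    wet = np.zeros_like(signal)
    wet[delay:] = room_reverb_gain * signal[:-delay]

    return np.stack([left + wet, right + wet])              # stereo output for the listener


fs = 48000
sig = np.sin(2 * np.pi * 440 * np.arange(fs) / fs)
out = render_source(sig, source_pos=(2.0, 1.0), listener_pos=(0.0, 0.0),
                    room_reverb_gain=0.3, fs_hz=fs)
```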

IPC Classes  ?

  • H04S 7/00 - Indicating arrangementsControl arrangements, e.g. balance control
  • G02B 27/01 - Head-up displays
  • G10K 15/10 - Arrangements for producing a reverberation or echo sound using time-delay networks comprising electromechanical or electro-acoustic devices
  • H04R 3/04 - Circuits for transducers for correcting frequency response
  • H04R 3/12 - Circuits for transducers for distributing signals to two or more loudspeakers
  • H04R 5/033 - Headphones for stereophonic communication
  • H04R 5/04 - Circuit arrangements
  • H04S 3/00 - Systems employing more than two channels, e.g. quadraphonic

95.

TECHNIQUE FOR CONTROLLING VIRTUAL IMAGE GENERATION SYSTEM USING EMOTIONAL STATES OF USER

      
Application Number 18927587
Status Pending
Filing Date 2024-10-25
First Publication Date 2025-02-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Sanger, George Alistair
  • Miller, Samuel A.
  • Devine, Graeme John

Abstract

A method of operating a virtual image generation system comprises allowing an end user to interact with a three-dimensional environment comprising at least one virtual object, presenting a stimulus to the end user in the context of the three-dimensional environment, sensing at least one biometric parameter of the end user in response to the presentation of the stimulus to the end user, generating biometric data for each of the sensed biometric parameter(s), determining if the end user is in at least one specific emotional state based on the biometric data for each of the sensed biometric parameter(s), and performing an action discernible to the end user to facilitate a current objective, at least partially based on whether the end user is determined to be in the specific emotional state(s).

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • A63F 13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
  • A63F 13/212 - Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
  • A63F 13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
  • A63F 13/65 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
  • A63F 13/822 - Strategy gamesRole-playing games
  • G02B 27/01 - Head-up displays
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 16/56 - Information retrievalDatabase structures thereforFile system structures therefor of still image data having vectorial format
  • G06F 16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 20/20 - ScenesScene-specific elements in augmented reality scenes
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions

96.

AREA SPECIFIC COLOR ABSORPTION IN NANOIMPRINT LITHOGRAPHY

      
Application Number 18717946
Status Pending
Filing Date 2022-12-16
First Publication Date 2025-02-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Singh, Vikramjit
  • Traub, Matthew C.
  • Xu, Frank Y.

Abstract

An eyepiece includes an optical waveguide, a transmissive input coupler at a first end of the optical waveguide, an output coupler at a second end of the optical waveguide, and a polymeric color absorbing region along a portion of the optical waveguide between the transmissive input coupler and the output coupler. The transmissive input coupler is configured to couple incident visible light to the optical waveguide, and the color-absorbing region is configured to absorb a component of the visible light as the visible light propagates through the optical waveguide.

IPC Classes  ?

  • G02B 25/00 - EyepiecesMagnifying glasses
  • G02B 1/118 - Anti-reflection coatings having sub-optical wavelength surface structures designed to provide an enhanced transmittance, e.g. moth-eye structures
  • G02B 5/22 - Absorbing filters
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,

97.

SYSTEMS AND METHODS FOR ENHANCED DEPTH DETERMINATION USING PROJECTION SPOTS

      
Application Number 18801164
Status Pending
Filing Date 2024-08-12
First Publication Date 2025-02-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Berger, Kai
  • Vohra, Hasnain Salim

Abstract

Systems and methods for enhanced depth determination using projection spots. An example method includes obtaining images of a real-world object, the images being obtained from image sensors positioned about the real-world object, and the images depicting projection spots projected onto the real-world object via projectors positioned about the real-world object. A projection spot map is accessed, the projection spot map including information indicative of real-world locations of projection spots based on locations of the projection spots in the obtained images. Location information is assigned to the projection spots based on the projection spot map. Generation of a three-dimensional representation of the real-world object is caused.

IPC Classes  ?

  • G06T 7/521 - Depth or shape recovery from laser ranging, e.g. using interferometryDepth or shape recovery from the projection of structured light
  • G06T 7/557 - Depth or shape recovery from multiple images from light fields, e.g. from plenoptic cameras
  • G06T 7/586 - Depth or shape recovery from multiple images from multiple light sources, e.g. photometric stereo
  • G06T 15/10 - Geometric effects
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

98.

VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS AND METHODS

      
Application Number 18923373
Status Pending
Filing Date 2024-10-22
First Publication Date 2025-02-06
Owner MAGIC LEAP, INC. (USA)
Inventor Taylor, Robert Blake

Abstract

A method for determining a focal point depth of a user of a three-dimensional (“3D”) display device includes tracking a first gaze path of the user. The method also includes analyzing 3D data to identify one or more virtual objects along the first gaze path of the user. The method further includes, when only one virtual object intersects the first gaze path of the user, identifying a depth of the only one virtual object as the focal point depth of the user.
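
An illustrative sketch of the focal-depth rule described above: cast the user's gaze ray against the virtual objects (modeled here as spheres, an assumption for the example), and if exactly one object is intersected, report its depth along the ray as the focal point depth.

```python
import numpy as np


def focal_point_depth(ray_origin, ray_dir, objects):
    """objects: list of (center, radius) spheres; returns depth along the gaze ray or None."""
    ray_dir = np.asarray(ray_dir, dtype=float)
    ray_dir /= np.linalg.norm(ray_dir)
    hits = []
    for center, radius in objects:
        oc = np.asarray(center, dtype=float) - np.asarray(ray_origin, dtype=float)
        t = float(np.dot(oc, ray_dir))                        # closest approach along the ray
        if t > 0 and np.linalg.norm(oc - t * ray_dir) <= radius:
            hits.append(t)
    # Only unambiguous when exactly one virtual object lies on the gaze path.
    return hits[0] if len(hits) == 1 else None


scene = [((0.0, 0.0, 2.0), 0.2), ((1.5, 0.0, 4.0), 0.2)]
print(focal_point_depth((0, 0, 0), (0, 0, 1), scene))         # 2.0
```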

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/50 - Depth or shape recovery
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

99.

Enhanced eye tracking techniques based on neural network analysis of images

      
Application Number 18598620
Grant Number 12299195
Status In Force
Filing Date 2024-03-07
First Publication Date 2025-01-30
Grant Date 2025-05-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Zheng, Hao
  • Jia, Zhiheng

Abstract

Enhanced eye-tracking techniques for augmented or virtual reality display systems. An example method includes obtaining an image of an eye of a user of a wearable system, the image depicting glints on the eye caused by respective light emitters, wherein the image is a low dynamic range (LDR) image; generating a high dynamic range (HDR) image via computation of a forward pass of a machine learning model using the image; determining location information associated with the glints as depicted in the HDR image, wherein the location information is usable to inform an eye pose of the eye.
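
A hedged sketch of the glint-localization step described above. The learned LDR-to-HDR model is treated as an assumed callable (`hdr_model` is a placeholder, not the patent's network); the code only shows how glint locations could be read off the resulting HDR image by thresholding and taking the centroid of each bright connected component.

```python
import numpy as np


def glint_locations(ldr_image: np.ndarray, hdr_model, threshold: float = 0.9):
    hdr = hdr_model(ldr_image)                                 # forward pass of the ML model
    hdr = (hdr - hdr.min()) / (hdr.max() - hdr.min() + 1e-9)   # normalize for thresholding
    mask = hdr > threshold

    # Simple flood-fill labeling of bright components (ideally one per glint).
    labels, current = np.zeros(mask.shape, dtype=int), 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        current += 1
        stack = [seed]
        while stack:
            r, c = stack.pop()
            if 0 <= r < mask.shape[0] and 0 <= c < mask.shape[1] and mask[r, c] and not labels[r, c]:
                labels[r, c] = current
                stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])

    centroids = []
    for k in range(1, current + 1):
        rows, cols = np.nonzero(labels == k)
        centroids.append((rows.mean(), cols.mean()))
    return centroids                                           # glint locations informing eye pose
```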

IPC Classes  ?

  • G06T 5/92 - Dynamic range modification of images or parts thereof based on global image properties
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

100.

HIGH ACCURACY DISPLACEMENT DEVICE

      
Application Number 18917014
Status Pending
Filing Date 2024-10-16
First Publication Date 2025-01-30
Owner Magic Leap, Inc. (USA)
Inventor
  • Donaldson, Nick
  • Yan, Changxin
  • Gupta, Ankur
  • Chauhan, Vikram

Abstract

Devices are described for high accuracy displacement of tools. In particular, embodiments provide a device for adjusting a position of a tool, such as a camera. The device includes a threaded shaft having a first end and a second end and a shaft axis extending from the first end to the second end, and a motor that actuates the threaded shaft to move in a direction of the shaft axis. In some examples, the motor is operatively coupled to the threaded shaft. The device includes a carriage coupled to the camera, and a bearing assembly coupled to the threaded shaft and the carriage. In some examples, the bearing assembly permits movement of the carriage with respect to the threaded shaft. The movement of the carriage allows the position of the camera to be adjusted.
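
A short worked example of the displacement relation implied above: a motor turning a threaded shaft advances the carriage by revolutions times thread pitch, so a fine pitch combined with fine motor steps yields high-accuracy positioning. The pitch and step counts below are illustrative assumptions, not values from the patent.

```python
def carriage_displacement_mm(motor_steps: int, steps_per_rev: int, pitch_mm: float) -> float:
    revolutions = motor_steps / steps_per_rev
    return revolutions * pitch_mm


# 50 steps of a 200-steps/rev motor driving a 0.5 mm pitch shaft:
print(carriage_displacement_mm(50, 200, 0.5))  # 0.125 mm of carriage travel
```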

IPC Classes  ?

  • H04N 23/695 - Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
  • G03B 17/56 - Accessories