Magic Leap, Inc.

United States of America


1-100 of 3,181 results for Magic Leap, Inc.
Aggregations
IP Type
        Patent 3,032
        Trademark 149
Jurisdiction
        United States 2,285
        World 712
        Canada 170
        Europe 14
Date
New (last 4 weeks) 27
2025 April (MTD) 13
2025 March 33
2025 February 18
2025 January 22
IPC Class
G02B 27/01 - Head-up displays 1,404
G06T 19/00 - Manipulating 3D models or images for computer graphics 815
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer 766
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00 - G02B 35/00 485
F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems 280
NICE Class
09 - Scientific and electric apparatus and instruments 134
42 - Scientific, technological and industrial services, research and design 58
41 - Education, entertainment, sporting and cultural services 41
38 - Telecommunications services 35
35 - Advertising and business services 30
Status
Pending 432
Registered / In Force 2,749

1.

SPATIALLY VARIABLE LIQUID CRYSTAL DIFFRACTION GRATINGS

      
Application Number 18999315
Status Pending
Filing Date 2024-12-23
First Publication Date 2025-04-17
Owner Magic Leap, Inc. (USA)
Inventor Oh, Chulwoo

Abstract

The present disclosure relates to display systems and, more particularly, to augmented reality display systems including diffraction grating(s), and methods of fabricating same. A diffraction grating includes a plurality of different diffracting zones having a periodically repeating lateral dimension corresponding to a grating period adapted for light diffraction. The diffraction grating additionally includes a plurality of different liquid crystal layers corresponding to the different diffracting zones. The different liquid crystal layers include liquid crystal molecules that are aligned differently, such that the different diffracting zones have different optical properties associated with light diffraction.


2.

METHODS AND SYSTEMS FOR AUDIO SIGNAL FILTERING

      
Application Number 18986468
Status Pending
Filing Date 2024-12-18
First Publication Date 2025-04-17
Owner Magic Leap, Inc. (USA)
Inventor
  • Audfray, Remi Samuel
  • Jot, Jean-Marc
  • Dicker, Samuel Charles

Abstract

Systems and methods for rendering audio signals are disclosed. In some embodiments, a method may receive an input signal including a first portion and a second portion. A first processing stage comprising a first filter is applied to the first portion to generate a first filtered signal. A second processing stage comprising a second filter is applied to the first portion to generate a second filtered signal. A third processing stage comprising a third filter is applied to the second portion to generate a third filtered signal. A fourth processing stage comprising a fourth filter is applied to the second portion to generate a fourth filtered signal. A first output signal is determined based on a sum of the first filtered signal and the third filtered signal. A second output signal is determined based on a sum of the second filtered signal and the fourth filtered signal. The first output signal is presented to a first ear of a user of a virtual environment, and the second output signal is presented to a second ear of the user. The first portion of the input signal corresponds to a first location in the virtual environment, and the second portion of the input signal corresponds to a second location in the virtual environment.
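
A minimal sketch of the four-filter routing in this abstract, with short FIR impulse responses standing in as hypothetical filters (a real implementation would use HRTF-style filters per virtual source location; all names here are illustrative):

```python
def fir(signal, taps):
    """Apply an FIR filter by direct convolution."""
    out = [0.0] * (len(signal) + len(taps) - 1)
    for i, x in enumerate(signal):
        for j, t in enumerate(taps):
            out[i + j] += x * t
    return out

def mix(x, y):
    """Element-wise sum, zero-padding the shorter signal."""
    n = max(len(x), len(y))
    return [(x[i] if i < len(x) else 0.0) + (y[i] if i < len(y) else 0.0)
            for i in range(n)]

def render_binaural(portion_a, portion_b, f1, f2, f3, f4):
    """Routing from the abstract: f1/f2 filter the first portion, f3/f4
    filter the second; each output is a pairwise sum sent to one ear."""
    first = fir(portion_a, f1)    # first filtered signal
    second = fir(portion_a, f2)   # second filtered signal
    third = fir(portion_b, f3)    # third filtered signal
    fourth = fir(portion_b, f4)   # fourth filtered signal
    left = mix(first, third)      # first output signal -> first ear
    right = mix(second, fourth)   # second output signal -> second ear
    return left, right
```

The two portions would be the dry signals for the two virtual locations; the filter pairs would encode each location's left/right rendering.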

IPC Classes

  • H04S 1/00 - Two-channel systems
  • H04R 5/033 - Headphones for stereophonic communication
  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control

3.

GEOMETRIES FOR MITIGATING ARTIFACTS IN SEE-THROUGH PIXEL ARRAYS

      
Application Number 19000352
Status Pending
Filing Date 2024-12-23
First Publication Date 2025-04-17
Owner Magic Leap, Inc. (USA)
Inventor
  • Russell, Andrew Ian
  • Trisnadi, Jahja I.
  • Mathur, Vaibhav
  • Manly, David
  • Johnson, Michael Robert
  • Carlisle, Clinton

Abstract

Disclosed are dimming assemblies and display systems for reducing artifacts produced by optically-transmissive displays. A system may include a substrate upon which a plurality of electronic components are disposed. The electronic components may include a plurality of pixels, a plurality of conductors, and a plurality of circuit modules. The plurality of pixels may be arranged in a two-dimensional array, with each pixel having a two-dimensional geometry corresponding to a shape with at least one curved side. The plurality of conductors may be arranged adjacent to the plurality of pixels. The system may also include control circuitry electrically coupled to the plurality of conductors. The control circuitry may be configured to apply electrical signals to the plurality of circuit modules by way of the plurality of conductors.

IPC Classes

  • G02F 1/1345 - Conductors connecting electrodes to cell terminals
  • G02B 27/01 - Head-up displays
  • G02C 7/10 - Filters, e.g. for facilitating adaptation of the eyes to the dark; Sunglasses
  • G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
  • G02F 1/1343 - Electrodes
  • G02F 1/1362 - Active matrix addressed cells
  • G02F 1/139 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering based on orientation effects in which the liquid crystal remains transparent

4.

METHOD AND SYSTEM FOR PERFORMING DYNAMIC FOVEATION BASED ON EYE GAZE

      
Application Number US2024050818
Publication Number 2025/080868
Status In Force
Filing Date 2024-10-10
Publication Date 2025-04-17
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Diaz, Edward
  • Irick, Kevin Maurice

Abstract

A method of forming a foveated image includes (a) setting dimensions of a first region, (b) receiving an image having a first resolution, and (c) forming the foveated image including a primary quality region having the dimensions of the first region and the first resolution and a secondary quality region having a second resolution less than the first resolution. The method also includes (d) outputting the foveated image, (e) determining an eye gaze location, and (f) determining an eye gaze velocity. If the eye gaze velocity is less than a threshold velocity, the method includes decreasing the dimensions of the primary quality region and repeating (b) - (f). If the eye gaze velocity is greater than or equal to the threshold velocity, the method includes repeating (a) - (f).
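
The branch at the end of the abstract can be sketched as a single control-loop step; the halving shrink factor below is an assumption for illustration:

```python
def next_region_dims(dims, gaze_velocity, threshold, initial_dims, shrink=0.5):
    """One iteration of the loop in steps (a)-(f): a gaze velocity below the
    threshold (stable fixation) shrinks the high-resolution primary region
    before repeating (b)-(f); a velocity at or above it (saccade) resets the
    region to its initial dimensions, i.e. repeats (a)-(f)."""
    width, height = dims
    if gaze_velocity < threshold:
        return (width * shrink, height * shrink)  # fixation: tighten region
    return initial_dims                           # saccade: reset region
```

In a running system this step would be invoked once per rendered frame, between forming and outputting the foveated image.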

IPC Classes

  • G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

5.

METHOD AND SYSTEM FOR FORMING FOVEATED IMAGES BASED ON EYE GAZE

      
Application Number US2024050823
Publication Number 2025/080872
Status In Force
Filing Date 2024-10-10
Publication Date 2025-04-17
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Diaz, Edward
  • Irick, Kevin Maurice

Abstract

A method includes rendering an original image at a first processor, encoding the original image to provide an encoded image, and transmitting the encoded image to a second processor. The method also includes decoding the encoded image to provide a decoded image, determining an eye gaze location, splitting the decoded image into N sections based on the eye gaze location, and processing N-1 sections of the N sections to produce N-1 secondary quality sections. The method further includes processing one section of the N sections to provide one primary quality section, combining the one primary quality section and the N-1 secondary quality sections to form a foveated image, and transmitting the foveated image to a display.
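
The split-and-downsample step can be sketched as follows; horizontal banding and drop-every-other-pixel reduction are assumptions for illustration, not the patented method:

```python
def foveate(image, gaze_row, n):
    """Split a decoded image (a list of pixel rows) into n horizontal bands
    based on the eye gaze location: the band containing gaze_row becomes the
    primary quality section, and the other n-1 bands become secondary quality
    sections at half horizontal resolution."""
    band_height = max(1, len(image) // n)
    gaze_band = min(gaze_row // band_height, n - 1)
    out = []
    for r, row in enumerate(image):
        band = min(r // band_height, n - 1)
        if band == gaze_band:
            out.append(list(row))        # primary quality section
        else:
            out.append(list(row[::2]))   # secondary quality: half width
    return out
```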

IPC Classes

  • G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

6.

INDIVIDUAL VIEWING IN A SHARED SPACE

      
Application Number 18933172
Status Pending
Filing Date 2024-10-31
First Publication Date 2025-04-17
Owner Magic Leap, Inc. (USA)
Inventor
  • Alexander IV, Earle M.
  • Arroyo, Pedro Luis
  • Venerin, Jean I.
  • Adams, William

Abstract

A mixed reality virtual environment is sharable among multiple users through the use of multiple view modes that are selectable by a presenter. Multiple users with wearable display systems may wish to view a common virtual object, which may be presented in a virtual room to any suitable number of users. A presentation may be controlled by a presenter using a presenter wearable system that leads multiple participants through information associated with the virtual object. Use of different viewing modes allows individual users to see different virtual content through their wearable display systems despite being in a shared viewing space, or alternatively to see the same virtual content in different locations within a shared space.

IPC Classes

  • G09B 5/12 - Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations different stations being capable of presenting different information simultaneously
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04L 12/18 - Arrangements for providing special services to substations for broadcast or conference

7.

FACE MODEL CAPTURE BY A WEARABLE DEVICE

      
Application Number 18981592
Status Pending
Filing Date 2024-12-15
First Publication Date 2025-04-10
Owner Magic Leap, Inc. (USA)
Inventor
  • Amayeh, Gholamreza
  • Kaehler, Adrian
  • Lee, Douglas

Abstract

Systems and methods for generating a face model for a user of a head-mounted device are disclosed. The head-mounted device can include one or more eye cameras configured to image the face of the user while the user is putting the device on or taking the device off. The images obtained by the eye cameras may be analyzed using a stereoscopic vision technique, a monocular vision technique, or a combination, to generate a face model for the user. The face model can be used to generate a virtual image of at least a portion of the user's face, for example to be presented as an avatar.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00 - G02B 35/00
  • G06V 20/64 - Three-dimensional objects
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
  • G06V 40/19 - Sensors therefor

8.

MODES OF USER INTERACTION

      
Application Number 18984646
Status Pending
Filing Date 2024-12-17
First Publication Date 2025-04-10
Owner Magic Leap, Inc. (USA)
Inventor
  • Speelman, Daniel Stephen
  • Cano, Rodrigo
  • Gundersen, Kara Lauren
  • Hazen, Griffith Buckley
  • Pazmino, Lorena

Abstract

A mixed reality (MR) device can allow a user to switch between input modes to allow interactions with a virtual environment via devices such as a six degrees of freedom (6DoF) handheld controller and a touchpad input device. A default input mode for interacting with virtual content may rely on the user's head pose, which may be difficult to use in selecting virtual objects that are far away in the virtual environment. Thus, the system may be configured to allow the user to use a 6DoF cursor, and a visual ray that extends from the handheld controller to the cursor, to enable precise targeting. Input via a touchpad input device (e.g., that allows three degrees of freedom movements) may also be used in conjunction with the 6DoF cursor.

IPC Classes

  • H04N 13/398 - Synchronisation thereof; Control thereof
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • H04N 13/361 - Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

9.

TECHNIQUES FOR DETERMINING SETTINGS FOR A CONTENT CAPTURE DEVICE

      
Application Number 18990920
Status Pending
Filing Date 2024-12-20
First Publication Date 2025-04-10
Owner Magic Leap, Inc. (USA)
Inventor
  • Smith, Brian Keith
  • Tsunaev, Ilya

Abstract

A method includes receiving a first image captured by a content capture device, identifying a first object in the first image and determining a first update to a first setting of the content capture device. The method further includes receiving a second image captured by the content capture device, identifying a second object in the second image, and determining a second update to a second setting of the content capture device. The method further includes updating the first setting of the content capture device using the first update, receiving a third image using the updated first setting of the content capture device, updating the second setting of the content capture device using the second update, receiving a fourth image using the updated second setting of the content capture device, and stitching the third image and the fourth image together to form a composite image.
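
The alternating-settings flow reduces to two small pieces: deriving a per-setting update from a detected object's region, and stitching the recaptured images. The mean-brightness-to-target rule and the trivial concatenation stitcher below are assumptions for illustration:

```python
def setting_update(region, target=128.0):
    """Derive a setting update (here an exposure gain) from the mean
    brightness of a detected object's pixel region."""
    mean = sum(region) / len(region)
    return target / mean

def stitch(img_a, img_b):
    """Compose the two recaptured images side by side into a composite
    (a real stitcher would align and blend overlapping content)."""
    return img_a + img_b
```

One update would be computed per identified object (e.g. a dark subject and a bright background), each applied before its own recapture, and the two recaptures stitched into the composite.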

IPC Classes

  • H04N 23/72 - Combination of two or more compensation controls
  • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment
  • H04N 5/265 - Mixing
  • H04N 23/63 - Control of cameras or camera modules by using electronic viewfinders
  • H04N 23/71 - Circuitry for evaluating the brightness variation
  • H04N 23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time
  • H04N 23/741 - Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
  • H04N 23/743 - Bracketing, i.e. taking a series of images with varying exposure conditions
  • H04N 23/76 - Circuitry for compensating brightness variation in the scene by influencing the image signals

10.

METHOD AND SYSTEM FOR PERFORMING EYE TRACKING IN AUGMENTED REALITY DEVICES

      
Application Number 18983134
Status Pending
Filing Date 2024-12-16
First Publication Date 2025-04-10
Owner Magic Leap, Inc. (USA)
Inventor
  • Garcia, Giovanni
  • Melo, Christian
  • Farmer, Daniel
  • Shultz, Jason Allen
  • Nguyen, Bach
  • Schabacker, Charles Robert
  • Shoaee, Michael

Abstract

A wearable device for projecting image light to an eye of a viewer and forming an image of virtual content in an augmented reality display is provided. The wearable device includes a projector and stack of waveguides optically connected to the projector. The wearable device also includes an eye tracking system comprising a plurality of illumination sources, an optical element having optical power, and a set of cameras. The optical element is disposed between the plurality of illumination sources and the set of cameras. In some embodiments, the augmented reality display includes an eyepiece operable to output virtual content from an output region and a plurality of illumination sources. At least some of the plurality of illumination sources overlap with the output region.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/01 - Head-up displays
  • H04N 23/56 - Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
  • H04N 23/90 - Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

11.

EVENT-BASED CAMERA WITH HIGH-RESOLUTION FRAME OUTPUT

      
Application Number 18987441
Status Pending
Filing Date 2024-12-19
First Publication Date 2025-04-10
Owner Magic Leap, Inc. (USA)
Inventor
  • Zahnert, Martin Georg
  • Ilic, Alexander

Abstract

Disclosed is a high-resolution image sensor suitable for use in an augmented reality (AR) system to provide low-latency image analysis with low power consumption. The AR system can be compact, and may be small enough to be packaged within a wearable device such as a set of goggles or mounted on a frame resembling ordinary eyeglasses. The image sensor may receive information about a region of an imaging array associated with a movable object, selectively output imaging information for that region, and synchronously output high-resolution image frames. The region may be updated dynamically as the image sensor and/or the object moves. The image sensor may output the high-resolution image frames less frequently than the region being updated when the image sensor and/or the object moves. Such an image sensor provides a small amount of data from which object information used in rendering an AR scene can be developed.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/593 - Depth or shape recovery from multiple images from stereo images
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 10/20 - Image preprocessing
  • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
  • H04N 13/106 - Processing image signals
  • H04N 25/531 - Control of the integration time by controlling rolling shutters in CMOS SSIS
  • H04N 25/705 - Pixels for depth measurement, e.g. RGBZ
  • H04N 25/75 - Circuitry for providing, modifying or processing image signals from the pixel array

12.

VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS AND METHODS

      
Application Number US2024048685
Publication Number 2025/072540
Status In Force
Filing Date 2024-09-26
Publication Date 2025-04-03
Owner MAGIC LEAP, INC. (USA)
Inventor Lancelle, Marcel

Abstract

A head-mounted display system configured to be worn over eyes of a user includes a frame configured to be worn on a head of the user. The system also includes a display disposed on the frame over the eyes of the user. The system further includes an inwardly-facing light source disposed on the frame and configured to emit light toward the eyes of the user to improve visibility of respective portions of a face and the eyes of the user through the display. Moreover, the system includes a processor configured to control a brightness of the display, an opacity of the display, and an intensity of the light emitted by the inwardly-facing light source.

IPC Classes

  • G02B 27/02 - Viewing or reading apparatus
  • G09G 3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix
  • G09G 5/36 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory

13.

CENTRALIZED RENDERING

      
Application Number 18980110
Status Pending
Filing Date 2024-12-13
First Publication Date 2025-04-03
Owner Magic Leap, Inc. (USA)
Inventor Babu J D, Praveen

Abstract

A method is disclosed, the method comprising the steps of receiving, from a first client application, first graphical data comprising a first node; receiving, from a second client application independent of the first client application, second graphical data comprising a second node; and generating a scenegraph, wherein the scenegraph describes a hierarchical relationship between the first node and the second node according to visual occlusion relative to a perspective from a display.
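
As a toy model of this centralized scenegraph, nodes submitted by two independent clients can be merged into one back-to-front draw order; painter's-algorithm sorting by depth stands in for the hierarchical occlusion relationship, and all names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Node:
    client: str   # which independent client application submitted the node
    name: str
    depth: float  # distance from the display's perspective (an assumption)

def build_scenegraph(first_client_nodes, second_client_nodes):
    """Merge graphical data from two independent clients into a single
    ordering in which nearer nodes occlude farther ones when drawn last."""
    return sorted(first_client_nodes + second_client_nodes,
                  key=lambda node: node.depth, reverse=True)
```

The point of centralizing is that neither client needs to know about the other's nodes; only the shared scenegraph resolves occlusion.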


14.

VIRTUAL AND AUGMENTED REALITY SYSTEMS AND METHODS USING DISPLAY SYSTEM CONTROL INFORMATION EMBEDDED IN IMAGE DATA

      
Application Number 18971685
Status Pending
Filing Date 2024-12-06
First Publication Date 2025-03-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Rodriguez, Jose Felix
  • Perez, Ricardo Martinez
  • Nourai, Reza

Abstract

A display system, such as a virtual reality or augmented reality display system, can control a display to present image data including a plurality of color components, on a plurality of depth planes supported by the display. The presentation of the image data through the display can be controlled based on control information that is embedded in the image data, for example to activate or deactivate a color component and/or a depth plane. In some examples, light sources and/or spatial light modulators that relay illumination from the light sources may receive signals from a display controller to adjust a power setting to the light source or spatial light modulator based on control information embedded in an image data frame.

IPC Classes

  • H04N 23/65 - Control of camera operation in relation to power supply
  • G02B 27/01 - Head-up displays
  • G02B 27/10 - Beam splitting or combining systems
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

15.

MANAGING MULTI-OBJECTIVE ALIGNMENTS FOR IMPRINTING

      
Application Number 18975299
Status Pending
Filing Date 2024-12-10
First Publication Date 2025-03-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Sevier, Jeremy Lee
  • Sadam, Satish
  • Imhof, Joseph Michael
  • Luo, Kang
  • Wang, Kangkang
  • Patterson, Roy Matthew
  • Xue, Qizhen
  • Best, Brett William
  • Carden, Charles Scott
  • Shafran, Matthew S.
  • Miller, Michael Nevin

Abstract

Systems and methods for managing multi-objective alignments in imprinting (e.g., single-sided or double-sided) are provided. An example system includes rollers for moving a template roll, a stage for holding a substrate, a dispenser for dispensing resist on the substrate, a light source for curing the resist to form an imprint on the substrate when a template of the template roll is pressed into the resist on the substrate, a first inspection system for registering a fiducial mark of the template to determine a template offset, a second inspection system for registering the imprint on the substrate to determine a wafer registration offset between a target location and an actual location of the imprint, and a controller configured to move the substrate with the resist below the template based on the template offset and to determine an overlay bias of the imprint on the substrate based on the wafer registration offset.

IPC Classes

  • G03F 9/00 - Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically
  • G03F 7/00 - Photomechanical, e.g. photolithographic, production of textured or patterned surfaces, e.g. printed surfaces; Materials therefor, e.g. comprising photoresists; Apparatus specially adapted therefor

16.

DISPLAY SYSTEM HAVING A PLURALITY OF LIGHT PIPES FOR A PLURALITY OF LIGHT EMITTERS

      
Application Number 18975714
Status Pending
Filing Date 2024-12-10
First Publication Date 2025-03-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Curtis, Kevin
  • Hall, Heidi Leising
  • St. Hilaire, Pierre
  • Tinch, David

Abstract

A display system includes a plurality of light pipes and a plurality of light sources configured to emit light into the light pipes. The display system also comprises a spatial light modulator configured to modulate light received from the light pipes to form images. The display system may also comprise one or more waveguides configured to receive modulated light from the spatial light modulator and to relay that light to a viewer.

IPC Classes

  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 6/34 - Optical coupling means utilising prism or grating
  • G02B 19/00 - Condensers
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00 - G02B 35/00
  • G02B 27/01 - Head-up displays
  • G02B 27/14 - Beam splitting or combining systems operating by reflection only

17.

DISPLAY SYSTEMS AND METHODS FOR CLIPPING CONTENT TO INCREASE VIEWING COMFORT

      
Application Number 18977386
Status Pending
Filing Date 2024-12-11
First Publication Date 2025-03-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Schwab, Brian David
  • Hand, Randall E.
  • Vlaskamp, Björn Nicolaas Servatius

Abstract

AR/VR display systems limit displaying content that exceeds an accommodation-vergence mismatch threshold, which may define a volume around the viewer. The volume may be subdivided into two or more zones, including an innermost loss-of-fusion zone (LoF) in which no content is displayed, and one or more outer AVM zones in which the displaying of content may be stopped, or clipped, under certain conditions. For example, content may be clipped if the viewer is verging within an AVM zone and if the content is displayed within the AVM zone for more than a threshold duration. A further possible condition for clipping content is that the user is verging on that content. In addition, the boundaries of the AVM zone and/or the acceptable amount of time that the content is displayed may vary depending upon the type of content being displayed, e.g., whether the content is user-locked content or in-world content.


18.

DEPTH BASED FOVEATED RENDERING FOR DISPLAY SYSTEMS

      
Application Number 18959036
Status Pending
Filing Date 2024-11-25
First Publication Date 2025-03-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Yeoh, Ivan Li Chuen
  • Edwin, Lionel Ernest
  • Samec, Nicole Elizabeth
  • Robaina, Nastasja U.
  • Mathur, Vaibhav
  • Dalrymple, Timothy Mark
  • Schaefer, Jason
  • Carlisle, Clinton
  • Cheng, Hui-Chuan
  • Oh, Chulwoo
  • Premysler, Philip
  • Zhang, Xiaoyang
  • Carlson, Adam C.

Abstract

Methods and systems for depth-based foveated rendering in a display system are disclosed. The display system may be an augmented reality display system configured to provide virtual content on a plurality of depth planes using different wavefront divergence. Some embodiments include monitoring eye orientations of a user of the display system. A fixation point can be determined based on the eye orientations, the fixation point representing a three-dimensional location with respect to a field of view. Location information of virtual object(s) to present is obtained, with the location information including three-dimensional position(s) of the virtual object(s). A resolution of the virtual object(s) can be adjusted based on a proximity of the location(s) of the virtual object(s) to the fixation point. The virtual object(s) are presented by the display system according to the adjusted resolution(s).

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00 - G02B 35/00
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • H04N 13/341 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
  • H04N 13/395 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
  • H04N 13/398 - Synchronisation thereof; Control thereof

19.

VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS AND METHODS

      
Application Number 18959252
Status Pending
Filing Date 2024-11-25
First Publication Date 2025-03-20
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Rodriguez, Jose Felix
  • Perez, Ricardo Martinez

Abstract

A virtual, augmented, or mixed reality display system includes a display configured to display virtual, augmented, or mixed reality image data, the display including one or more optical components which introduce optical distortions or aberrations to the image data. The system also includes a display controller configured to provide the image data to the display. The display controller includes memory for storing optical distortion correction information, and one or more processing elements to at least partially correct the image data for the optical distortions or aberrations using the optical distortion correction information.

IPC Classes

  • G06T 5/80 - Geometric correction
  • G06F 1/3203 - Power management, i.e. event-based initiation of a power-saving mode
  • G06F 3/14 - Digital output to display device
  • G06T 3/18 - Image warping, e.g. rearranging pixels individually
  • G06T 3/4007 - Scaling of whole images or parts thereof, e.g. expanding or contracting based on interpolation, e.g. bilinear interpolation

20.

DEPTH BASED FOVEATED RENDERING FOR DISPLAY SYSTEMS

      
Application Number 18960851
Status Pending
Filing Date 2024-11-26
First Publication Date 2025-03-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Mathur, Vaibhav
  • Edwin, Lionel Ernest
  • Zhang, Xiaoyang
  • Vlaskamp, Bjorn Nicolaas Servatius

Abstract

Methods and systems for depth-based foveated rendering in a display system are disclosed. The display system may be an augmented reality display system configured to provide virtual content on a plurality of depth planes using different wavefront divergence. Some embodiments include determining a fixation point of a user's eyes. Location information associated with a first virtual object to be presented to the user via a display device is obtained. A resolution-modifying parameter of the first virtual object is obtained. A particular resolution at which to render the first virtual object is identified based on the location information and the resolution-modifying parameter of the first virtual object. The particular resolution is based on a resolution distribution specifying resolutions for corresponding distances from the fixation point. The first virtual object rendered at the identified resolution is presented to the user via the display system.
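
The resolution distribution can be caricatured as a function of distance from the fixation point, scaled by the per-object resolution-modifying parameter; the inverse-linear falloff below is an assumption (only the monotone decrease with distance comes from the abstract):

```python
def resolution_at(distance, full_resolution, falloff, modifier=1.0):
    """Pick a render resolution for a virtual object from its 3D distance to
    the fixation point: full resolution at the fixation point, decreasing
    with distance, scaled by the object's resolution-modifying parameter."""
    return max(1, int(full_resolution * modifier / (1.0 + falloff * distance)))
```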

IPC Classes

  • G09G 5/391 - Resolution modifying circuits, e.g. variable screen formats
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 5/70 - Denoising; Smoothing

21.

EYEPIECE FOR VIRTUAL, AUGMENTED, OR MIXED REALITY SYSTEMS

      
Application Number 18970619
Status Pending
Filing Date 2024-12-05
First Publication Date 2025-03-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Klug, Michael Anthony
  • Tekolste, Robert Dale
  • Welch, William Hudson
  • Browy, Eric
  • Liu, Victor Kai
  • Bhargava, Samarth

Abstract

An eyepiece for an augmented reality display system. The eyepiece can include a waveguide substrate. The waveguide substrate can include an input coupler grating (ICG), an orthogonal pupil expander (OPE) grating, a spreader grating, and an exit pupil expander (EPE) grating. The ICG can couple at least one input light beam into at least a first guided light beam that propagates inside the waveguide substrate. The OPE grating can divide the first guided light beam into a plurality of parallel, spaced-apart light beams. The spreader grating can receive the light beams from the OPE grating and spread their distribution. The spreader grating can include diffractive features oriented at approximately 90° to diffractive features of the OPE grating. The EPE grating can re-direct the light beams from the OPE grating and the spreader grating such that they exit the waveguide substrate.

IPC Classes

  • G02B 6/122 - Basic optical elements, e.g. light-guiding paths
  • G02B 6/02 - Optical fibres with cladding
  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

22.

MEDICAL ASSISTANT

      
Application Number 18964039
Status Pending
Filing Date 2024-11-29
First Publication Date 2025-03-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Robaina, Nastasja U.
  • Samec, Nicole Elizabeth
  • Baerenrodt, Mark
  • Harrises, Christopher M.

Abstract

A wearable display device, such as an augmented reality display device, can present virtual content to the wearer for applications in a healthcare setting. The wearer may be a patient or a healthcare provider (HCP). Applications can include, but are not limited to, access, display, and modification of patient medical records and sharing patient medical records among authorized HCPs, detecting one or more anomalies in a medical environment and presenting virtual content (e.g., alerts) indicating the one or more anomalies, detecting the presence of physical objects (e.g., medical instruments or devices) in the medical environment, enabling communication with and/or remote control of a medical device in the environment, and so forth.

IPC Classes

  • G16H 10/60 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
  • A61B 3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
  • A61B 3/10 - Objective types, i.e. instruments for examining the eyes independent of the patient's perceptions or reactions
  • A61B 3/113 - Objective types, i.e. instruments for examining the eyes independent of the patient's perceptions or reactions for determining or recording eye movement
  • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
  • A61B 5/06 - Devices, other than using radiation, for detecting or locating foreign bodies
  • A61B 5/1171 - Identification of persons based on the shapes or appearances of their bodies or parts thereof
  • A61B 5/339 - Displays specially adapted therefor
  • A61B 17/00 - Surgical instruments, devices or methods
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 90/50 - Supports for surgical instruments, e.g. articulated arms
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06F 16/22 - Indexing; Data structures therefor; Storage structures
  • G06F 16/23 - Updating
  • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
  • G06F 21/62 - Protecting access to data via a platform, e.g. using keys or access control rules
  • G06F 40/205 - Parsing
  • G06F 40/289 - Phrasal analysis, e.g. finite state techniques or chunking
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G10L 15/26 - Speech to text systems
  • G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
  • G16H 40/67 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation

23.

DIMMING DEVICE ANGULAR UNIFORMITY CORRECTION

      
Application Number 18966688
Status Pending
Filing Date 2024-12-03
First Publication Date 2025-03-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Arencibia, Ricardo
  • Llaneras, Zachary Michael
  • Schaefer, Jason
  • Cohen, Howard Russell
  • Cheng, Hui-Chuan
  • Sours, Michael Alexander

Abstract

A method of operating an optical system includes identifying a set of angle dependent transmittance levels for light passing through pixels of a segmented dimmer exhibiting viewing angle transmittance variations for application of a same voltage to all pixels of the segmented dimmer. The method also includes determining a set of voltages to apply to pixels of the segmented dimmer. Determining the set of voltages includes using the set of angle dependent transmittance levels. The method includes applying the set of voltages to the pixels of the segmented dimmer to achieve light transmittance through the segmented dimmer corresponding to the set of angle dependent transmittance levels.
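The voltage-determination step described above can be pictured as inverting a per-angle calibration curve: given measured (voltage, transmittance) pairs for a viewing-angle bin, interpolate to find the voltage that yields a target transmittance. The calibration table, angle bins, and numbers below are illustrative assumptions, not data from the patent.

```python
# Hypothetical calibration: for each viewing-angle bin (degrees), measured
# (voltage, transmittance) pairs with transmittance decreasing in voltage.
CALIBRATION = {
    0:  [(0.0, 0.90), (1.0, 0.60), (2.0, 0.30), (3.0, 0.10)],
    30: [(0.0, 0.80), (1.0, 0.45), (2.0, 0.20), (3.0, 0.05)],
}

def voltage_for_transmittance(angle_bin: int, target_t: float) -> float:
    """Linearly interpolate the calibration curve to find the voltage that
    yields the target transmittance at the given viewing angle."""
    curve = CALIBRATION[angle_bin]
    for (v0, t0), (v1, t1) in zip(curve, curve[1:]):
        if t1 <= target_t <= t0:   # target lies inside this segment
            frac = (t0 - target_t) / (t0 - t1)
            return v0 + frac * (v1 - v0)
    raise ValueError("target transmittance outside calibrated range")
```

Applying a different voltage per pixel, looked up through each pixel's viewing-angle bin, is one way the set of voltages could compensate for angle-dependent transmittance variation.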

IPC Classes

  • G09G 3/36 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix by control of light from an independent source using liquid crystals
  • G02B 27/01 - Head-up displays

24.

DISPLAY FOR THREE-DIMENSIONAL IMAGE

      
Application Number 18969386
Status Pending
Filing Date 2024-12-05
First Publication Date 2025-03-20
Owner Magic Leap, Inc. (USA)
Inventor Kaehler, Adrian

Abstract

Apparatuses and methods for displaying a 3-D representation of an object are described. Apparatuses can include a rotatable structure, a motor, and multiple light field sub-displays disposed on the rotatable structure. The apparatuses can store a light field image to be displayed, the light field image providing multiple different views of the object at different viewing directions. A processor can drive the motor to rotate the rotatable structure, map the light field image to each of the light field sub-displays based in part on the rotation angle, and illuminate the light field sub-displays based in part on the mapped light field image. The apparatuses can include a display panel configured to be viewed from a fiducial viewing direction, where the display panel is curved out of a plane that is perpendicular to the fiducial viewing direction, and a plurality of light field sub-displays disposed on the display panel.

IPC Classes

  • H04N 13/393 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume the volume being generated by a moving, e.g. vibrating or rotating, surface
  • G02B 30/27 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the autostereoscopic type involving lenticular arrays
  • G02B 30/54 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels the 3D volume being generated by moving a 2D surface, e.g. by vibrating or rotating the 2D surface
  • G02B 30/56 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels by projecting aerial or floating images
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • H04N 13/307 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using fly-eye lenses, e.g. arrangements of circular lenses
  • H04N 13/32 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using arrays of controllable light sources; Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using moving apertures or moving light sources
  • H04N 13/324 - Colour aspects
  • H04N 13/398 - Synchronisation thereof; Control thereof

25.

METHOD AND SYSTEM FOR DIFFRACTIVE OPTICS EYEPIECE ARCHITECTURES INCORPORATING AN OPTICAL NOTCH FILTER

      
Application Number US2023032805
Publication Number 2025/058628
Status In Force
Filing Date 2023-09-14
Publication Date 2025-03-20
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Khandekar, Chinmay
  • Singh, Vikramjit
  • Tekolste, Robert D.

Abstract

An augmented reality system includes a projector assembly and a set of imaging optics optically coupled to the projector assembly. The augmented reality system also includes an eyepiece optically coupled to the set of imaging optics. The eyepiece has a world side and a user side opposite the world side and includes one or more eyepiece waveguides. Each of the one or more eyepiece waveguides includes an incoupling interface and an outcoupling interface operable to output virtual content toward the user side. The augmented reality system further includes an optical notch filter disposed on the world side of the eyepiece.

IPC Classes

26.

METHOD AND SYSTEM FOR HIGH ORDER DIFFRACTION, LARGE FIELD OF VIEW AUGMENTED REALITY EYEPIECE WAVEGUIDES

      
Application Number US2023032806
Publication Number 2025/058629
Status In Force
Filing Date 2023-09-14
Publication Date 2025-03-20
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Singh, Vikramjit
  • Khandekar, Chinmay
  • Xue, Qizhen
  • Faraji-Dana, Mohammadsadegh

Abstract

A method of operating an eyepiece waveguide of an augmented reality system includes projecting virtual content using a projector assembly and diffracting the virtual content into the eyepiece waveguide via a first order diffraction. A first portion of the virtual content is clipped to produce a remaining portion of the virtual content. The method also includes propagating the remaining portion of the virtual content in the eyepiece waveguide, outcoupling the remaining portion of the virtual content out of the eyepiece waveguide, and diffracting the virtual content into the eyepiece waveguide via a second order diffraction. A second portion of the virtual content is clipped to produce a complementary portion. The method further includes propagating the complementary portion of the virtual content in the eyepiece waveguide and outcoupling the complementary portion of the virtual content out of the eyepiece waveguide.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/42 - Diffraction optics
  • G02B 6/00 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
  • G02B 6/122 - Basic optical elements, e.g. light-guiding paths

27.

COMPACT EXTENDED DEPTH OF FIELD LENSES FOR WEARABLE DISPLAY DEVICES

      
Application Number US2023074212
Publication Number 2025/058645
Status In Force
Filing Date 2023-09-14
Publication Date 2025-03-20
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Gao, Chunyu
  • Singh, Vikramjit
  • Uhlendorf, Kristina
  • Schaefer, Jason
  • Arend, Erik, Heath
  • Mcdonald, Lorenzo

Abstract

A wearable display device includes waveguide(s) that present virtual image elements as an augmentation to the real-world environment. The display device includes a first extended depth of field (EDOF) refractive lens arranged between the waveguide(s) and the user's eye(s), and a second EDOF refractive lens located outward from the waveguide(s). The first EDOF lens has a (e.g., negative) optical power to alter the depth of the virtual image elements. The second EDOF lens has a substantially equal and opposite (e.g., positive) optical power to that of the first EDOF lens, such that the depth of real-world objects is not altered along with the depth of the virtual image elements. To reduce the weight and/or size of the device, one or both EDOF lenses is a compact lens, e.g., Fresnel lens or flattened periphery lens. The compact lens may be coated and/or embedded in another material to enhance its performance.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 6/10 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,

28.

CUSTOMIZED POLYMER/GLASS DIFFRACTIVE WAVEGUIDE STACKS FOR AUGMENTED REALITY/MIXED REALITY APPLICATIONS

      
Application Number 18953940
Status Pending
Filing Date 2024-11-20
First Publication Date 2025-03-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Bhagat, Sharad D.
  • Hill, Brian George
  • Peroz, Christophe
  • Chang, Chieh
  • Li, Ling

Abstract

A diffractive waveguide stack includes first, second, and third diffractive waveguides for guiding light in first, second, and third visible wavelength ranges, respectively. The first diffractive waveguide includes a first material having first refractive index at a selected wavelength and a first target refractive index at a midpoint of the first visible wavelength range. The second diffractive waveguide includes a second material having a second refractive index at the selected wavelength and a second target refractive index at a midpoint of the second visible wavelength range. The third diffractive waveguide includes a third material having a third refractive index at the selected wavelength and a third target refractive index at a midpoint of the third visible wavelength range. A difference between any two of the first target refractive index, the second target refractive index, and the third target refractive index is less than 0.005 at the selected wavelength.
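The closing constraint of this abstract, that any two target refractive indices differ by less than 0.005 at the selected wavelength, is easy to state as a check. The index values below are made-up examples, not materials data from the patent.

```python
# Illustrative check of the index-matching constraint: every pairwise
# difference between the three target refractive indices must be below
# the tolerance (0.005 per the abstract) at the selected wavelength.

def indices_matched(n1: float, n2: float, n3: float,
                    tol: float = 0.005) -> bool:
    """True if all pairwise index differences are below tol."""
    indices = (n1, n2, n3)
    return all(abs(a - b) < tol for a in indices for b in indices)
```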

IPC Classes

  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 27/01 - Head-up displays

29.

MULTI-DEPTH PLANE DISPLAY SYSTEM WITH REDUCED SWITCHING BETWEEN DEPTH PLANES

      
Application Number 18955217
Status Pending
Filing Date 2024-11-21
First Publication Date 2025-03-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Samec, Nicole Elizabeth
  • Robaina, Nastasja U.
  • Harrises, Christopher M.
  • Baerenrodt, Mark

Abstract

Methods and systems for reducing switching between depth planes of a multi-depth plane display system are disclosed. The display system may be an AR display system configured to provide virtual content on a plurality of depth planes using different amounts of wavefront divergence. The system may monitor fixation points based upon the gaze of each of the user's eyes, with each fixation point being a three-dimensional location in the user's field of view. Location information for virtual objects to be presented to the user is obtained, with each virtual object being associated with a depth plane. The depth plane on which a virtual object is to be presented may be modified based upon the fixation point of the user's eyes.
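One common way to reduce depth-plane switching of this kind is hysteresis: only switch when the fixation point moves past the plane boundary by a margin. The sketch below assumes two depth planes with a boundary at 1.0 m and a 0.2 m margin; both values, and the hysteresis approach itself, are illustrative assumptions rather than the patent's specific mechanism.

```python
# Minimal hysteresis sketch for reduced switching between two depth planes.
# Boundary and margin values are illustrative assumptions.

class DepthPlaneSelector:
    def __init__(self, boundary_m: float = 1.0, margin_m: float = 0.2):
        self.boundary = boundary_m
        self.margin = margin_m
        self.plane = "far"          # currently active depth plane

    def update(self, fixation_depth_m: float) -> str:
        """Switch planes only when the fixation point crosses the boundary
        by more than the margin, suppressing rapid back-and-forth."""
        if self.plane == "far" and fixation_depth_m < self.boundary - self.margin:
            self.plane = "near"
        elif self.plane == "near" and fixation_depth_m > self.boundary + self.margin:
            self.plane = "far"
        return self.plane
```

A fixation point hovering between 0.8 m and 1.2 m would leave the active plane unchanged, avoiding the visible artifacts of frequent plane switches.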

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • H04N 13/122 - Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/332 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
  • H04N 13/395 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes

30.

WEARABLE SYSTEM WITH HEADSET AND CONTROLLER INSIDE-OUT TRACKING

      
Application Number 18956664
Status Pending
Filing Date 2024-11-22
First Publication Date 2025-03-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Kasper, Dominik Michael
  • Zahnert, Martin Georg
  • Sanchez Nicuesa, Manel Quim
  • Gomez-Jordana Manas, Rafa
  • Baumli, Nathan Yuki
  • Shee, Koon Keong
  • Nienstedt, Zachary C.
  • Mount, Emily Elizabeth
  • Agarwal, Lomesh
  • Lampart, Andrea

Abstract

Wearable systems, and methods for operating them, incorporating headset and controller inside-out tracking are disclosed. A wearable system may include a headset and a controller. The wearable system may cause fiducials of the controller to flash. The wearable system may track a pose of the controller by capturing headset images using a headset camera, identifying the fiducials in the headset images, and tracking the pose of the controller based on the identified fiducials in the headset images and based on a pose of the headset. While tracking the pose of the controller, the wearable system may capture controller images using a controller camera. The wearable system may identify two-dimensional feature points in each controller image and determine three-dimensional map points based on the two-dimensional feature points and the pose of the controller.

IPC Classes

  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

31.

DEPTH SENSING TECHNIQUES FOR VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS

      
Application Number 18957796
Status Pending
Filing Date 2024-11-24
First Publication Date 2025-03-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Smith, Brian Keith
  • Shee, Koon Keong
  • Link, Gregory Michael

Abstract

Techniques for operating a depth sensor are discussed. A first sequence of operation steps and a second sequence of operation steps can be stored in memory on the depth sensor to define, respectively, a first depth sensing mode of operation and a second depth sensing mode of operation. In response to a first request for depth measurement(s) according to the first depth sensing mode of operation, the depth sensor can operate in the first mode of operation by executing the first sequence of operation steps. In response to a second request for depth measurement(s) according to the second depth sensing mode of operation, and without performing an additional configuration operation, the depth sensor can operate in the second mode of operation by executing the second sequence of operation steps.
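The core idea above, pre-storing per-mode operation-step sequences so a mode change needs no reconfiguration step, can be sketched as a simple dispatch table. The mode names and step names below are hypothetical stand-ins for hardware operations.

```python
# Sketch of stored mode sequences: each mode's operation steps are written
# to the sensor once; a measurement request then executes the stored
# sequence directly, with no configuration step between modes.

class DepthSensor:
    def __init__(self):
        self.sequences = {}     # mode name -> list of operation steps
        self.log = []           # record of executed steps (stand-in for hardware)

    def store_sequence(self, mode: str, steps: list):
        self.sequences[mode] = steps

    def measure(self, mode: str):
        """Execute the stored sequence for the requested mode."""
        for step in self.sequences[mode]:
            self.log.append((mode, step))

sensor = DepthSensor()
sensor.store_sequence("tof_short", ["emit_pulse", "gate_10ns", "read_pixels"])
sensor.store_sequence("tof_long", ["emit_pulse", "gate_50ns", "read_pixels"])
sensor.measure("tof_short")
sensor.measure("tof_long")   # mode switch with no reconfiguration
```

The point of the pattern is that the host pays the configuration cost once, at sequence-storage time, rather than on every mode switch.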

IPC Classes

  • H04N 23/667 - Camera operation mode switching, e.g. between still and video, sport and normal or high and low resolution modes
  • G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • H04N 13/139 - Format conversion, e.g. of frame-rate or size
  • H04N 13/296 - Synchronisation thereof; Control thereof
  • H04N 23/959 - Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics

32.

ATTENUATION OF LIGHT TRANSMISSION ARTIFACTS IN WEARABLE DISPLAYS

      
Application Number 18960810
Status Pending
Filing Date 2024-11-26
First Publication Date 2025-03-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Manly, David
  • Messer, Kevin
  • Mathur, Vaibhav
  • Carlisle, Clinton

Abstract

A wearable display system includes an eyepiece stack having a world side and a user side opposite the world side, wherein during use a user positioned on the user side views displayed images delivered by the system via the eyepiece stack which augment the user's view of the user's environment. The wearable display system also includes an angularly selective film arranged on the world side of the eyepiece stack. The angularly selective film includes a polarization adjusting film arranged between a pair of linear polarizers. The linear polarizers and polarization adjusting film significantly reduce transmission of visible light incident on the angularly selective film at large angles of incidence without significantly reducing transmission of light incident at small angles of incidence.

IPC Classes

  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups , for polarising
  • G02B 5/30 - Polarising elements
  • G02B 27/01 - Head-up displays
  • G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
  • G02F 1/137 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulatingNon-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering

33.

DISPLAY SYSTEMS AND METHODS FOR DETERMINING REGISTRATION BETWEEN A DISPLAY AND A USER'S EYES

      
Application Number 18962659
Status Pending
Filing Date 2024-11-27
First Publication Date 2025-03-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Edwin, Lionel Ernest
  • Nienstedt, Zachary C.
  • Yeoh, Ivan Li Chuen
  • Miller, Samuel A.
  • Xu, Yan
  • Cazamias, Jordan Alexander

Abstract

A display system may include a head-mounted display (HMD) for rendering a three-dimensional virtual object which appears to be located in an ambient environment of a user of the display. One or more eyes of the user may not be in desired positions, relative to the HMD, to receive, or register, image information outputted by the HMD and/or to view an external environment. For example, the HMD-to-eye alignment may vary for different users and/or may change over time (e.g., as the HMD is displaced). The display system may determine a relative position or alignment between the HMD and the user's eyes. Based on the relative positions, the wearable device may determine if it is properly fitted to the user, may provide feedback on the quality of the fit to the user, and/or may take actions to reduce or minimize effects of any misalignment.
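The alignment determination described above can be reduced to comparing an estimated eye-center position against the display's nominal eyebox center and reacting when the offset exceeds a tolerance. The coordinate convention, eyebox center, and 3 mm tolerance below are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of HMD-to-eye registration feedback: compute the offset of
# the estimated eye center from a nominal eyebox center and report fit.
# Nominal center and tolerance are illustrative assumptions.

NOMINAL_EYEBOX_CENTER = (0.0, 0.0, 0.0)   # mm, in display coordinates
TOLERANCE_MM = 3.0

def registration_feedback(eye_center_mm):
    """Return the misalignment vector and a simple fit verdict."""
    offset = tuple(e - n for e, n in zip(eye_center_mm, NOMINAL_EYEBOX_CENTER))
    distance = sum(c * c for c in offset) ** 0.5
    verdict = "good fit" if distance <= TOLERANCE_MM else "adjust headset"
    return offset, verdict
```

A real system could use the offset vector not just for feedback but to shift rendering (e.g. reposition the render camera) and partially compensate for the misalignment.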

IPC Classes

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • A61B 3/11 - Objective types, i.e. instruments for examining the eyes independent of the patient's perceptions or reactions for measuring interpupillary distance or diameter of pupils
  • A61B 3/113 - Objective types, i.e. instruments for examining the eyes independent of the patient's perceptions or reactions for determining or recording eye movement
  • G02B 27/01 - Head-up displays
  • G02B 30/00 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
  • G02B 30/40 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images giving the observer of a single two-dimensional [2D] image a perception of depth
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/16 - Sound inputSound output
  • G06T 3/40 - Scaling of whole images or parts thereof, e.g. expanding or contracting
  • G06V 10/42 - Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
  • G06V 10/46 - Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]Salient regional features
  • G06V 10/60 - Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
  • G06V 40/18 - Eye characteristics, e.g. of the iris
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

34.

DIFFRACTIVE OPTICAL ELEMENTS WITH MITIGATION OF REBOUNCE-INDUCED LIGHT LOSS AND RELATED SYSTEMS AND METHODS

      
Application Number 18954311
Status Pending
Filing Date 2024-11-20
First Publication Date 2025-03-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Schmulen, Jeffrey Dean
  • Ricks, Neal Paul
  • Bhargava, Samarth
  • Messer, Kevin
  • Liu, Victor Kai
  • Dixon, Matthew Grant
  • Deng, Xiaopei
  • Menezes, Marlon Edward
  • Yang, Shuqiang
  • Singh, Vikramjit
  • Luo, Kang
  • Xu, Frank Y.

Abstract

Display devices include waveguides with in-coupling optical elements that mitigate re-bounce of in-coupled light to improve in-coupling efficiency and/or uniformity. A waveguide receives light from a light source and includes an in-coupling optical element that in-couples the received light to propagate by total internal reflection within the waveguide. The in-coupled light may undergo re-bounce, in which the light reflects off a waveguide surface and, after the reflection, strikes the in-coupling optical element. Upon striking the in-coupling optical element, the light may be partially absorbed and/or out-coupled by the optical element, thereby reducing the amount of in-coupled light propagating through the waveguide. The in-coupling optical element can be truncated or have reduced diffraction efficiency along the propagation direction to reduce the occurrence of light loss due to re-bounce of in-coupled light, resulting in less in-coupled light being prematurely out-coupled and/or absorbed during subsequent interactions with the in-coupling optical element.

IPC Classes

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 6/10 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type
  • G02B 6/26 - Optical coupling means
  • G02B 27/10 - Beam splitting or combining systems
  • G02B 27/42 - Diffraction optics
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

35.

Device controller

      
Application Number 29942166
Grant Number D1066297
Status In Force
Filing Date 2024-05-14
First Publication Date 2025-03-11
Grant Date 2025-03-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta
  • Awad, Haney
  • Swinton, Matthew David
  • Urban, Hayes

36.

NANOPATTERN ENCAPSULATION FUNCTION, METHOD AND PROCESS IN COMBINED OPTICAL COMPONENTS

      
Application Number 18554613
Status Pending
Filing Date 2022-04-13
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Singh, Vikramjit
  • Miller, Michael Nevin
  • Anderson, T.G.
  • Xu, Frank Y.

Abstract

Disclosed herein are systems and methods for displays, such as for a head wearable device. An example display can include an infrared illumination layer, the infrared illumination layer including a substrate, one or more LEDs disposed on a first surface of the substrate, and a first encapsulation layer disposed on the first surface of the substrate, where the encapsulation layer can include a nano-patterned surface. In some examples, the nano-patterned surface can be configured to improve a visible light transmittance of the illumination layer. In one or more examples, embodiments disclosed herein may provide a robust illumination layer that can reduce the haze associated with an illumination layer.

IPC Classes

  • G02B 27/01 - Head-up displays
  • B82Y 20/00 - Nanooptics, e.g. quantum optics or photonic crystals
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,

37.

EYE CENTER OF ROTATION DETERMINATION WITH ONE OR MORE EYE TRACKING CAMERAS

      
Application Number 18948073
Status Pending
Filing Date 2024-11-14
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Cohen, David
  • Joseph, Elad
  • Ferens, Ron Nisim
  • Preter, Eyal
  • Bar-On, Eitan Shmuel
  • Yahav, Giora

Abstract

A display system can include a head-mounted display configured to project light to an eye of a user to display virtual image content at different amounts of divergence and collimation. The display system can include an inward-facing imaging system, possibly comprising a plurality of cameras, that images the user's eye and glints thereon, and processing electronics that are in communication with the inward-facing imaging system and that are configured to obtain an estimate of a center of rotation of the user's eye using cornea data derived from the glint images. The display system may render virtual image content with a render camera positioned at the determined position of the center of rotation of said eye.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06T 7/292 - Multi-camera tracking
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

38.

EYEPIECES FOR AUGMENTED REALITY DISPLAY SYSTEM

      
Application Number 18951308
Status Pending
Filing Date 2024-11-18
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Bhargava, Samarth
  • Liu, Victor Kai
  • Messer, Kevin

Abstract

An eyepiece waveguide for an augmented reality display system may include an optically transmissive substrate, an input coupling grating (ICG) region, a multi-directional pupil expander (MPE) region, and an exit pupil expander (EPE) region. The ICG region may receive an input beam of light and couple the input beam into the substrate as a guided beam. The MPE region may include a plurality of diffractive features which exhibit periodicity along at least a first axis of periodicity and a second axis of periodicity. The MPE region may be positioned to receive the guided beam from the ICG region and to diffract it in a plurality of directions to create a plurality of diffracted beams. The EPE region may overlap the MPE region and may out couple one or more of the diffracted beams from the optically transmissive substrate as output beams.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 27/09 - Beam shaping, e.g. changing the cross-sectioned area, not otherwise provided for
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes

39.

HEAD-MOUNTED DISPLAY SYSTEMS WITH POWER SAVING FUNCTIONALITY

      
Application Number 18952446
Status Pending
Filing Date 2024-11-19
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Rivera Cintron, Carlos A.
  • Link, Gregory
  • Sommers, Jeffrey Scott
  • Hull, Matthew Thomas
  • Rodriguez, Jose Felix
  • Martinez Perez, Ricardo

Abstract

Head-mounted display systems with power saving functionality are disclosed. The systems can include a frame configured to be supported on the head of the user. The systems can also include a head-mounted display disposed on the frame, one or more sensors, and processing electronics in communication with the display and the one or more sensors. In some implementations, the processing electronics can be configured to cause the system to reduce power of one or more components based at least in part on a determination that the frame is in a certain position (e.g., upside-down or on top of the head of the user). In some implementations, the processing electronics can be configured to cause the system to reduce power of one or more components based at least in part on a determination that the frame has been stationary for at least a threshold period of time.

IPC Classes  ?

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06F 1/3218 - Monitoring of peripheral devices of display devices
  • G06F 1/3231 - Monitoring the presence, absence or movement of users
  • G06F 1/3234 - Power saving characterised by the action undertaken

40.

WEARABLE SYSTEM WITH CONTROLLER LOCALIZATION USING HEADSET CAMERAS AND CONTROLLER FIDUCIALS

      
Application Number 18956658
Status Pending
Filing Date 2024-11-22
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Nienstedt, Zachary C.
  • Roberts, Daniel
  • Lopez, Christopher Michael
  • Bucknor, Brian Edward Oliver
  • Miller, Samuel A.
  • Baumli, Nathan Yuki
  • Kasper, Dominik Michael
  • Sanchez Nicuesa, Manel Quim
  • Lampart, Andrea
  • Gomez-Jordana Manas, Rafa
  • Zahnert, Martin Georg
  • Stan, Nikola
  • Mount, Emily Elizabeth

Abstract

Wearable systems, and methods for operation thereof, incorporating headset and controller localization using headset cameras and controller fiducials are disclosed. A wearable system may include a headset and a controller. The wearable system may alternate between performing headset tracking and performing controller tracking by repeatedly capturing images using a headset camera of the headset during headset tracking frames and controller tracking frames. The wearable system may cause the headset camera to capture a first exposure image having an exposure above a threshold and cause the headset camera to capture a second exposure image having an exposure below the threshold. The wearable system may determine a fiducial interval during which fiducials of the controller are to flash at a fiducial frequency and a fiducial period. The wearable system may cause the fiducials to flash during the fiducial interval in accordance with the fiducial frequency and the fiducial period.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time
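
The alternating capture pattern described in the abstract above — headset-tracking frames with exposure above a threshold, controller-tracking frames below it, with fiducial flashes timed to the latter — can be illustrated with a toy scheduler. This is a hypothetical sketch, not the patented method; the function name, defaults, and frame layout are assumptions:

```python
def build_frame_schedule(n_frames, controller_every=2,
                         high_exposure_ms=8.0, low_exposure_ms=0.5):
    """Alternate headset frames (long exposure, good for environment
    features) with controller frames (short exposure, isolating the
    bright fiducial LEDs); flash fiducials only on controller frames."""
    schedule = []
    for i in range(n_frames):
        is_controller = (i % controller_every == 1)
        schedule.append({
            "frame": i,
            "mode": "controller" if is_controller else "headset",
            "exposure_ms": low_exposure_ms if is_controller else high_exposure_ms,
            "flash_fiducials": is_controller,
        })
    return schedule
```

In a real system the fiducial interval and frequency would be negotiated with the controller so the LEDs are lit exactly during the short-exposure frames.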

41.

AUGMENTED REALITY DISPLAY WITH FRAME MODULATION FUNCTIONALITY

      
Application Number 18828828
Status Pending
Filing Date 2024-09-09
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Smith, Brian Keith
  • Rivera Cintron, Carlos A.
  • Rodriguez, Jose Felix
  • Hull, Matthew Thomas
  • Link, Gregory Michael

Abstract

A head mounted display system can process images by assessing relative motion between the head mounted display and one or more features in a user's environment. The assessment of relative motion can include determining whether the head mounted display has moved, is moving and/or is expected to move with respect to one or more features in the environment. Additionally or alternatively, the assessment can include determining whether one or more features in the environment have moved, are moving and/or are expected to move relative to the head mounted display. The image processing can further include determining one or more virtual image content locations in the environment that correspond to a location where renderable virtual image content appears to a user when the location appears in the display and comparing the one or more virtual image content locations in the environment with a viewing zone.

IPC Classes  ?

  • H04N 23/667 - Camera operation mode switching, e.g. between still and video, sport and normal or high and low resolution modes
  • G02B 27/01 - Head-up displays
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

42.

PLENOPTIC CAMERA MEASUREMENT AND CALIBRATION OF HEAD-MOUNTED DISPLAYS

      
Application Number 18952703
Status Pending
Filing Date 2024-11-19
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor Schuck, III, Miller Harry

Abstract

A method for measuring performance of a head-mounted display module, the method including arranging the head-mounted display module relative to a plenoptic camera assembly so that an exit pupil of the head-mounted display module coincides with a pupil of the plenoptic camera assembly; emitting light from the head-mounted display module while the head-mounted display module is arranged relative to the plenoptic camera assembly; filtering the light at the exit pupil of the head-mounted display module; acquiring, with the plenoptic camera assembly, one or more light field images projected from the head-mounted display module with the filtered light; and determining information about the performance of the head-mounted display module based on the acquired light field images.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 5/20 - Filters
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups , for polarising

43.

ARCHITECTURES AND METHODS FOR OUTPUTTING DIFFERENT WAVELENGTH LIGHT OUT OF WAVEGUIDES

      
Application Number 18953953
Status Pending
Filing Date 2024-11-20
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Tekolste, Robert Dale
  • Klug, Michael Anthony
  • Schowengerdt, Brian T.

Abstract

Architectures are provided for selectively outputting light for forming images, the light having different wavelengths and being outputted with low levels of crosstalk. In some embodiments, light is incoupled into a waveguide and deflected to propagate in different directions, depending on wavelength. The incoupled light is then outcoupled by outcoupling optical elements that outcouple light based on the direction of propagation of the light. In some other embodiments, color filters are between a waveguide and outcoupling elements. The color filters limit the wavelengths of light that interact with and are outcoupled by the outcoupling elements. In yet other embodiments, a different waveguide is provided for each range of wavelengths to be outputted. Incoupling optical elements selectively incouple light of the appropriate range of wavelengths into a corresponding waveguide, from which the light is outcoupled.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/42 - Diffraction optics

44.

HEAD POSE MIXING OF AUDIO FILES

      
Application Number 18954259
Status Pending
Filing Date 2024-11-20
First Publication Date 2025-03-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Mangiat, Stephen Vincent
  • Tucker, Michael Benson
  • Tajik, Anastasia Andreyevna

Abstract

Examples are described of wearable devices that can present to a user an audible or visual representation of an audio file comprising a plurality of stem tracks, each representing different audio content of the audio file. Systems and methods are described that determine the pose of the user; generate, based on the pose of the user, an audio mix of at least one of the plurality of stem tracks of the audio file; generate, based on the pose of the user and the audio mix, a visualization of the audio mix; communicate an audio signal representative of the audio mix to the speaker; and communicate a visual signal representative of the visualization of the audio mix to the display.

IPC Classes  ?

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/16 - Sound input; Sound output
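
As an illustration of pose-driven stem mixing (a speculative sketch, not the patented algorithm — `mix_stems`, the per-stem azimuths, and the cosine falloff are all assumptions), weights can be derived from how directly the user's head faces each stem:

```python
import numpy as np

def mix_stems(stems, stem_angles, head_yaw):
    """Blend stem tracks with weights favoring the stems the user faces.

    stems: dict name -> 1-D sample array (equal lengths).
    stem_angles: dict name -> azimuth in radians where the stem "lives".
    head_yaw: user's head yaw in radians from the pose estimate.
    """
    names = list(stems)
    # Cosine falloff: a stem straight ahead gets full weight, one
    # behind the user is silenced.
    w = np.array([max(0.0, np.cos(stem_angles[n] - head_yaw)) for n in names])
    if w.sum() > 0:
        w = w / w.sum()
    return sum(wi * stems[n] for wi, n in zip(w, names))
```

A real mixer would also smooth the weights over time to avoid audible jumps as the head turns.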

45.

USER HEART RATE DETECTION SYSTEM AND METHOD USING FUSION OF MULTI-SENSOR DATA

      
Application Number US2023031746
Publication Number 2025/048824
Status In Force
Filing Date 2023-08-31
Publication Date 2025-03-06
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Mashayekhi, Ghoncheh
  • Rice, Darrian
  • Nyman, Edward Jr.
  • Esposito, Jennifer Miglionico
  • Shironoshita, Emilio Patrick

Abstract

Systems and methods for fusing multiple types of sensor data to determine a heart rate of a user. An accelerometer obtains accelerometer data associated with the user over a time period, and a gyroscope obtains gyroscope data associated with the user over the time period. Also, a camera obtains a plurality of images of the user's eye over the time period. The plurality of images are analyzed to generate image data of the user's eyelid over the time period. The accelerometer data, the gyroscope data, and the image data are fused into fused sensor data, and a heart rate of the user is determined from the fused sensor data.

IPC Classes  ?

  • A61B 5/024 - Measuring pulse rate or heart rate
  • A61B 5/0205 - Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
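
The patent does not publish its fusion algorithm; purely as a hedged sketch, one could estimate a dominant pulse frequency per sensor channel and average the per-channel estimates. The function names and the naive averaging are assumptions:

```python
import numpy as np

def dominant_frequency_hz(signal, fs):
    """Strongest nonzero frequency component of a 1-D signal (Hz)."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[1 + np.argmax(spectrum[1:])]  # skip the DC bin

def fuse_heart_rate(accel, gyro, eyelid, fs):
    """Naive fusion: average the per-sensor heart-rate estimates (BPM)."""
    channels = (accel, gyro, eyelid)
    return float(np.mean([dominant_frequency_hz(c, fs) * 60.0 for c in channels]))

# Synthetic 72 BPM (1.2 Hz) pulse visible, at different gains, in all
# three channels, sampled at 50 Hz for 20 seconds.
fs = 50.0
t = np.arange(0, 20, 1.0 / fs)
pulse = np.sin(2 * np.pi * 1.2 * t)
bpm = fuse_heart_rate(pulse, 0.5 * pulse, 0.8 * pulse, fs)
```

A production estimator would more likely band-pass each channel, weight channels by signal quality, and track the estimate over time.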

46.

Head mounted audio-visual display system

      
Application Number 29963500
Grant Number D1065190
Status In Force
Filing Date 2024-09-17
First Publication Date 2025-03-04
Grant Date 2025-03-04
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Awad, Haney
  • Kaji, Masamune
  • Swinton, Robert Dainis
  • Wheeler, William
  • Swinton, Matthew David
  • Gunther, Sebastian Gonzalo Arrieta

47.

METHODS AND APPARATUS FOR WEARABLE DISPLAY DEVICE WITH VARYING GRATING PARAMETERS

      
Application Number 18940738
Status Pending
Filing Date 2024-11-07
First Publication Date 2025-02-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Curtis, Kevin Richard
  • Cheng, Hui-Chuan
  • Greco, Paul M.
  • Welch, William Hudson
  • Browy, Eric C.
  • Schuck, III, Miller Harry
  • Sissom, Bradley Jay

Abstract

A device for viewing a projected image includes an input coupling grating operable to receive light related to the projected image from a light source and an expansion grating having a first grating structure characterized by a first set of grating parameters varying in one or more dimensions. The expansion grating structure is operable to receive light from the input coupling grating and to multiply the light related to the projected image. The device also includes an output coupling grating having a second grating structure characterized by a second set of grating parameters and operable to output the multiplied light in a predetermined direction.

IPC Classes  ?

  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • B29D 11/00 - Producing optical elements, e.g. lenses or prisms
  • G02B 1/00 - Optical elements characterised by the material of which they are made; Optical coatings for optical elements
  • G02B 5/18 - Diffracting gratings
  • G02B 5/30 - Polarising elements
  • G02B 6/293 - Optical coupling means having data bus means, i.e. plural waveguides interconnected and providing an inherently bidirectional system by mixing and splitting signals with wavelength selective means
  • G02B 6/34 - Optical coupling means utilising prism or grating
  • G02B 7/00 - Mountings, adjusting means, or light-tight connections, for optical elements
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G02B 27/10 - Beam splitting or combining systems
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups , for polarising
  • G02B 27/30 - Collimators
  • G02C 5/16 - Side-members resilient or with resilient parts
  • G02C 11/00 - Non-optical adjuncts; Attachment thereof
  • G06F 1/16 - Constructional details or arrangements
  • G06F 1/20 - Cooling means
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/147 - Digital output to display device using display panels
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G09G 3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix
  • H04N 9/31 - Projection devices for colour picture display
  • H05K 7/20 - Modifications to facilitate cooling, ventilating, or heating

48.

METHOD AND SYSTEM FOR DETECTING FIBER POSITION IN A FIBER SCANNING PROJECTOR

      
Application Number 18943728
Status Pending
Filing Date 2024-11-11
First Publication Date 2025-02-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Melville, Charles David
  • Watson, Mathew D.
  • Rajiv, Abhijith
  • Kuehn, Benjamin John

Abstract

A method of measuring a position of a scanning cantilever includes providing a housing including an actuation region, a position measurement region including an aperture, and an oscillation region. The method also includes providing a drive signal to an actuator disposed in the actuation region, oscillating the scanning cantilever in response to the drive signal, generating a first light beam using a first optical source, directing the first light beam toward the aperture, detecting at least a portion of the first light beam using a first photodetector, generating a second light beam using a second optical source, directing the second light beam toward the aperture, detecting at least a portion of the second light beam using a second photodetector, and determining the position of the scanning cantilever based on the detected portion of the first light beam and the detected portion of the second light beam.

IPC Classes  ?

  • G01S 7/481 - Constructional features, e.g. arrangements of optical elements
  • G02B 26/10 - Scanning systems

49.

VIRTUAL USER INPUT CONTROLS IN A MIXED REALITY ENVIRONMENT

      
Application Number 18948996
Status Pending
Filing Date 2024-11-15
First Publication Date 2025-02-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Kaehler, Adrian
  • Croston, John Adam

Abstract

A wearable display system includes a mixed reality display for presenting a virtual image to a user, an outward-facing imaging system configured to image an environment of the user, and a hardware processor operably coupled to the mixed reality display and to the imaging system. The hardware processor is programmed to generate a virtual remote associated with a parent device, render the virtual remote and its virtual control element on the mixed reality display, determine when the user of the wearable system interacts with the virtual control element of the virtual remote, and perform certain functions in response to user interaction with a virtual control element of the virtual remote. These functions may include causing the virtual control element to move on the mixed reality display and, when movement of the virtual control element surpasses a threshold condition, generating a focus indicator for the virtual control element.

IPC Classes  ?

  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06N 3/02 - Neural networks
  • G06N 20/00 - Machine learning
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 10/46 - Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands

50.

IMMERSIVE AUDIO PLATFORM

      
Application Number 18943633
Status Pending
Filing Date 2024-11-11
First Publication Date 2025-02-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Jot, Jean-Marc
  • Minnick, Michael
  • Pastouchenko, Dmitry
  • Simon, Michael Aaron
  • Scott, III, John Emmitt
  • Bailey, Richard St. Clair
  • Balasubramanyam, Shivakumar
  • Agadi, Harsharaj

Abstract

Disclosed herein are systems and methods for presenting audio content in mixed reality environments. A method may include receiving a first input from an application program; in response to receiving the first input, receiving, via a first service, an encoded audio stream; generating, via the first service, a decoded audio stream based on the encoded audio stream; receiving, via a second service, the decoded audio stream; receiving a second input from one or more sensors of a wearable head device; receiving, via the second service, a third input from the application program, wherein the third input corresponds to a position of one or more virtual speakers; generating, via the second service, a spatialized audio stream based on the decoded audio stream, the second input, and the third input; presenting, via one or more speakers of the wearable head device, the spatialized audio stream.

IPC Classes  ?

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • G02B 27/01 - Head-up displays

51.

VIRTUAL AND REAL OBJECT RECORDING IN MIXED REALITY DEVICE

      
Application Number 18938790
Status Pending
Filing Date 2024-11-06
First Publication Date 2025-02-20
Owner Magic Leap, Inc. (USA)
Inventor Huang, Ziqiang

Abstract

A virtual image generation system for use by an end user comprises memory, a display subsystem, an object selection device configured for receiving input from the end user and persistently selecting at least one object in response to the end user input, and a control subsystem configured for rendering a plurality of image frames of a three-dimensional scene, conveying the image frames to the display subsystem, generating audio data originating from the at least one selected object, and for storing the audio data within the memory.

IPC Classes  ?

  • G06F 3/16 - Sound input; Sound output
  • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
  • A63F 13/424 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
  • A63F 13/5255 - Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
  • A63F 13/5372 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
  • A63F 13/54 - Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
  • A63F 13/65 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04R 1/10 - Earpieces; Attachments therefor
  • H04R 1/32 - Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only
  • H04R 1/40 - Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers
  • H04R 3/00 - Circuits for transducers
  • H04R 5/02 - Spatial or constructional arrangements of loudspeakers

52.

LIGHT OUTPUT SYSTEM WITH REFLECTOR AND LENS FOR HIGHLY SPATIALLY UNIFORM LIGHT OUTPUT

      
Application Number 18934988
Status Pending
Filing Date 2024-11-01
First Publication Date 2025-02-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Sissom, Bradley Jay
  • Hall, Heidi Leising
  • Curtis, Kevin

Abstract

A user may interact with and view virtual elements, such as avatars and objects, and/or real world elements in three-dimensional space in an augmented reality (AR) session. The system may allow one or more spectators to view, from a stationary or dynamic camera, a third person view of the user's AR session. The third person view may be synchronized with the user view, and the virtual elements of the user view may be composited onto the third person view.

IPC Classes  ?

  • F21V 13/04 - Combinations of only two kinds of elements the elements being reflectors and refractors
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 6/12 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type of the integrated circuit kind
  • G02B 23/06 - Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices involving prisms or mirrors having a focusing action, e.g. parabolic mirror
  • G02B 27/01 - Head-up displays
  • G02B 30/50 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
  • H04N 13/315 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers the parallax barriers being time-variant
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

53.

CROSS REALITY SYSTEM WITH WIRELESS FINGERPRINTS

      
Application Number 18936094
Status Pending
Filing Date 2024-11-04
First Publication Date 2025-02-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Shveki, Gilboa
  • Weisbih, Ben
  • Kapota, Ofer
  • Torres, Rafael Domingos
  • Olshansky, Daniel
  • Holder, Joel David

Abstract

A cross reality system enables any of multiple devices to efficiently and accurately access previously stored maps and render virtual content specified in relation to those maps. Both stored maps and tracking maps used by portable devices may have wireless fingerprints associated with them. The portable devices may maintain wireless fingerprints based on wireless scans performed repetitively, based on one or more trigger conditions, as the devices move around the physical world. The wireless information obtained from these scans may be used to create or update wireless fingerprints associated with locations in a tracking map on the devices. One or more of these wireless fingerprints may be used when a previously stored map is to be selected based on its coverage of an area in which the portable device is operating. Maintaining wireless fingerprints in this way provides a reliable and low latency mechanism for performing map-related operations.

IPC Classes  ?

  • H04W 24/02 - Arrangements for optimising operational condition
  • G01S 5/02 - Position-fixing by co-ordinating two or more direction or position-line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04W 4/02 - Services making use of location information
  • H04W 4/029 - Location-based management or tracking services
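
Selecting a stored map "based on its coverage of an area" suggests comparing a device's current wireless fingerprint against stored ones. A minimal sketch, with fingerprints modeled as `{BSSID: RSSI dBm}` dicts (the similarity metric and function names are assumptions, not the patented mechanism):

```python
def fingerprint_similarity(fp_a, fp_b):
    """Score two Wi-Fi fingerprints ({BSSID: RSSI in dBm}) in [0, 1]."""
    shared = fp_a.keys() & fp_b.keys()
    if not shared:
        return 0.0
    # Reward overlap in visible access points and closeness of RSSI values.
    overlap = len(shared) / max(len(fp_a), len(fp_b))
    mean_diff = sum(abs(fp_a[b] - fp_b[b]) for b in shared) / len(shared)
    closeness = 1.0 - min(1.0, mean_diff / 100.0)
    return overlap * closeness

def select_map(query_fp, stored_maps):
    """Pick the stored map whose fingerprint best matches the query."""
    return max(stored_maps, key=lambda m: fingerprint_similarity(query_fp, m["fp"]))
```

Because the comparison is a dictionary intersection over already-collected scans, it can run with low latency on the device, consistent with the abstract's goal.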

54.

DISPLAY SYSTEM WITH LOW-LATENCY PUPIL TRACKER

      
Application Number 18929056
Status Pending
Filing Date 2024-10-28
First Publication Date 2025-02-13
Owner Magic Leap, Inc. (USA)
Inventor Klug, Michael Anthony

Abstract

A display system aligns the location of its exit pupil with the location of a viewer's pupil by changing the location of the portion of a light source that outputs light. The light source may include an array of pixels that output light, thereby allowing an image to be displayed on the light source. The display system includes a camera that captures image(s) of the eye and negatives of the eye image(s) are displayed by the light source. In the negative image, the dark pupil of the eye is a bright spot which, when displayed by the light source, defines the exit pupil of the display system, such that image content may be presented by modulating the light source. The location of the pupil of the eye may be tracked by capturing the images of the eye.

IPC Classes  ?

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
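
The negative-image trick in the abstract — the dark pupil becoming the bright spot that defines the exit pupil — is simple to illustrate. A sketch assuming an 8-bit grayscale eye image; the function names are illustrative:

```python
import numpy as np

def exit_pupil_mask(eye_image):
    """Negative of an 8-bit eye image: the dark pupil becomes the
    brightest region, which the light-source array can display to
    place the display's exit pupil over the viewer's pupil."""
    return (255 - eye_image.astype(int)).astype(np.uint8)

def pupil_location(eye_image):
    """Row/column of the pupil, taken as the brightest negative pixel."""
    neg = exit_pupil_mask(eye_image)
    return np.unravel_index(np.argmax(neg), neg.shape)
```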

55.

VIRTUAL AND AUGMENTED REALITY SYSTEMS AND METHODS

      
Application Number 18930234
Status Pending
Filing Date 2024-10-29
First Publication Date 2025-02-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Rodriguez, Jose Felix
  • Perez, Ricardo Martinez

Abstract

A virtual reality (VR) and/or augmented reality (AR) display system is configured to control a display using control information that is embedded in or otherwise included with imagery data to be presented through the display. The control information can indicate depth plane(s) and/or color plane(s) to be used to present the imagery data, depth plane(s) and/or color plane(s) to be activated or inactivated, shift(s) of at least a portion of the imagery data (e.g., one or more pixels) laterally within a depth plane and/or longitudinally between depth planes, and/or other suitable controls.

IPC Classes  ?

  • G06T 7/50 - Depth or shape recovery
  • G02B 27/01 - Head-up displays
  • G06F 3/14 - Digital output to display device
  • G06T 7/579 - Depth or shape recovery from multiple images from motion
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G09G 3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix
  • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/395 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
  • H04N 13/398 - Synchronisation thereof; Control thereof

56.

SYSTEMS AND METHODS FOR VIRTUAL AND AUGMENTED REALITY

      
Application Number 18932396
Status Pending
Filing Date 2024-10-30
First Publication Date 2025-02-13
Owner Magic Leap, Inc. (USA)
Inventor Browy, Eric C.

Abstract

Disclosed herein are systems and methods for distributed computing and/or networking for mixed reality systems. A method may include capturing an image via a camera of a head-wearable device. Inertial data may be captured via an inertial measurement unit of the head-wearable device. A position of the head-wearable device can be estimated based on the image and the inertial data via one or more processors of the head-wearable device. The image can be transmitted to a remote server. A neural network can be trained based on the image via the remote server. A trained neural network can be transmitted to the head-wearable device.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/14 - Digital output to display device
  • G06F 18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
  • G06V 40/18 - Eye characteristics, e.g. of the iris
  • H04B 7/155 - Ground-based stations
  • H04L 67/10 - Protocols in which an application is distributed across nodes in the network

57.

SHAPED COLOR-ABSORBING REGIONS FOR WAVEGUIDES

      
Application Number 18718675
Status Pending
Filing Date 2022-12-16
First Publication Date 2025-02-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Singh, Vikramjit
  • Traub, Matthew C.
  • Menezes, Marlon Edward
  • Liu, Yingnan
  • Xu, Frank Y.

Abstract

A waveguide stack having color-selective regions on one or more waveguides. The color-selective regions are configured to absorb incident light of a first wavelength range in such a way as to reduce or prevent the incident light of the first wavelength range from coupling into a waveguide configured to transmit a light of a second wavelength range.

IPC Classes  ?

58.

SYSTEMS AND METHODS FOR END TO END SCENE RECONSTRUCTION FROM MULTIVIEW IMAGES

      
Application Number 18808906
Status Pending
Filing Date 2024-08-19
First Publication Date 2025-02-13
Owner MAGIC LEAP, INC. (USA)
Inventor Murez, Zachary Paul

Abstract

Systems and methods of generating a three-dimensional (3D) reconstruction of a scene or environment surrounding a user of a spatial computing system, such as a virtual reality, augmented reality or mixed reality system, using only multiview images and without the need for depth sensors or depth data from sensors. Features are extracted from a sequence of frames of RGB images and back-projected, using known camera intrinsics and extrinsics, into a 3D voxel volume, wherein each pixel is mapped to a ray in the voxel volume. The back-projected features are fused into the 3D voxel volume. The 3D voxel volume is passed through a 3D convolutional neural network to refine the features and regress truncated signed distance function values at each voxel of the 3D voxel volume.
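
The back-projection step described above can be sketched compactly: transform voxel centers into the camera frame with the extrinsics, project them with the intrinsics, and gather the per-pixel features. This is an illustrative sketch (nearest-pixel sampling, single view), not the patented implementation; all names are assumptions.

```python
import numpy as np

# Back-project per-pixel features into a voxel volume using known camera
# intrinsics and extrinsics: every voxel that projects into the image
# receives the feature of the pixel whose ray it lies on.

def backproject(features, K, cam_T_world, voxel_centers):
    """features: (H, W, C); K: (3, 3) intrinsics; cam_T_world: (4, 4)
    extrinsics; voxel_centers: (N, 3) world coords. Returns (N, C)."""
    H, W, C = features.shape
    homo = np.concatenate([voxel_centers, np.ones((len(voxel_centers), 1))], axis=1)
    cam = (cam_T_world @ homo.T).T[:, :3]             # voxels in camera frame
    valid = cam[:, 2] > 1e-6                          # keep voxels in front of camera
    pix = (K @ cam.T).T
    pix = pix[:, :2] / np.maximum(pix[:, 2:3], 1e-6)  # perspective divide
    u = np.clip(np.round(pix[:, 0]).astype(int), 0, W - 1)
    v = np.clip(np.round(pix[:, 1]).astype(int), 0, H - 1)
    return features[v, u] * valid[:, None]            # zero features behind camera

# Tiny usage example: a 2x2 image with 1-channel features, identity camera.
feats = np.arange(4, dtype=float).reshape(2, 2, 1)
K = np.eye(3)
T = np.eye(4)
voxels = np.array([[0.0, 0.0, 1.0],    # projects to pixel (0, 0)
                   [1.0, 1.0, 1.0]])   # projects to pixel (1, 1)
sampled = backproject(feats, K, T, voxels)
```

In a multiview setting, running this per frame and averaging the results is one simple way to realize the "fused into the 3D voxel volume" step before the 3D CNN.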

IPC Classes

59.

VIRTUAL AND AUGMENTED REALITY SYSTEMS AND METHODS HAVING UNEQUAL NUMBERS OF COMPONENT COLOR IMAGES DISTRIBUTED ACROSS DEPTH PLANES

      
Application Number 18931455
Status Pending
Filing Date 2024-10-30
First Publication Date 2025-02-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Schowengerdt, Brian T.
  • Hua, Hong
  • Cheng, Hui-Chuan
  • Peroz, Christophe

Abstract

Images perceived to be substantially full color or multi-colored may be formed using component color images that are distributed in unequal numbers across a plurality of depth planes. The distribution of component color images across depth planes may vary based on color. In some embodiments, a display system includes a stack of waveguides that each output light of a particular color, with some colors having fewer numbers of associated waveguides than other colors. The waveguide stack may include multiple pluralities (e.g., first and second pluralities) of waveguides, each configured to produce an image by outputting light corresponding to a particular color. The total number of waveguides in the second plurality of waveguides may be less than the total number of waveguides in the first plurality of waveguides.

IPC Classes

60.

AREA SPECIFIC COLOR ABSORPTION IN NANOIMPRINT LITHOGRAPHY

      
Application Number 18717946
Status Pending
Filing Date 2022-12-16
First Publication Date 2025-02-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Singh, Vikramjit
  • Traub, Matthew C.
  • Xu, Frank Y.

Abstract

An eyepiece includes an optical waveguide, a transmissive input coupler at a first end of the optical waveguide, an output coupler at a second end of the optical waveguide, and a polymeric color absorbing region along a portion of the optical waveguide between the transmissive input coupler and the output coupler. The transmissive input coupler is configured to couple incident visible light to the optical waveguide, and the color-absorbing region is configured to absorb a component of the visible light as the visible light propagates through the optical waveguide.

IPC Classes

  • G02B 25/00 - Eyepieces; Magnifying glasses
  • G02B 1/118 - Anti-reflection coatings having sub-optical wavelength surface structures designed to provide an enhanced transmittance, e.g. moth-eye structures
  • G02B 5/22 - Absorbing filters
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,

61.

SPATIAL AUDIO FOR INTERACTIVE AUDIO ENVIRONMENTS

      
Application Number 18924155
Status Pending
Filing Date 2024-10-23
First Publication Date 2025-02-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Audfray, Remi Samuel
  • Jot, Jean-Marc
  • Dicker, Samuel Charles

Abstract

Systems and methods of presenting an output audio signal to a listener located at a first location in a virtual environment are disclosed. According to embodiments of a method, an input audio signal is received. A first intermediate audio signal corresponding to the input audio signal is determined, based on a location of the sound source in the virtual environment, and the first intermediate audio signal is associated with a first bus. A second intermediate audio signal is determined. The second intermediate audio signal corresponds to a reverberation of the input audio signal in the virtual environment. The second intermediate audio signal is determined based on a location of the sound source, and further based on an acoustic property of the virtual environment. The second intermediate audio signal is associated with a second bus. The output audio signal is presented to the listener via the first and second buses.
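
The two-bus routing described above can be illustrated with a toy mixer: a direct bus whose gain depends on the source-listener geometry, and a reverb bus whose gain depends on a room acoustic property. The gain models and names below are illustrative assumptions, not the patented method.

```python
import math

# Minimal two-bus mixer: one source feeds a direct (first) bus attenuated by
# distance, and a reverberation (second) bus scaled by a room property.

def render(source_signal, source_pos, listener_pos, room_reverb_gain):
    """Return (direct_bus, reverb_bus) signals for one sound source."""
    dist = math.dist(source_pos, listener_pos)
    direct_gain = 1.0 / max(dist, 1.0)   # distance attenuation (first bus)
    reverb_gain = room_reverb_gain       # acoustic property of the room (second bus)
    direct_bus = [s * direct_gain for s in source_signal]
    reverb_bus = [s * reverb_gain for s in source_signal]
    return direct_bus, reverb_bus

direct, reverb = render([1.0, 0.5], (0, 0, 2), (0, 0, 0), 0.3)
output = [d + r for d, r in zip(direct, reverb)]  # listener hears both buses
```

Keeping the two contributions on separate buses is what lets the reverb path be processed once per room rather than once per source.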

IPC Classes

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • G02B 27/01 - Head-up displays
  • G10K 15/10 - Arrangements for producing a reverberation or echo sound using time-delay networks comprising electromechanical or electro-acoustic devices
  • H04R 3/04 - Circuits for transducers for correcting frequency response
  • H04R 3/12 - Circuits for transducers for distributing signals to two or more loudspeakers
  • H04R 5/033 - Headphones for stereophonic communication
  • H04R 5/04 - Circuit arrangements
  • H04S 3/00 - Systems employing more than two channels, e.g. quadraphonic

62.

TECHNIQUE FOR CONTROLLING VIRTUAL IMAGE GENERATION SYSTEM USING EMOTIONAL STATES OF USER

      
Application Number 18927587
Status Pending
Filing Date 2024-10-25
First Publication Date 2025-02-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Sanger, George Alistair
  • Miller, Samuel A.
  • Devine, Graeme John

Abstract

A method of operating a virtual image generation system comprises allowing an end user to interact with a three-dimensional environment comprising at least one virtual object, presenting a stimulus to the end user in the context of the three-dimensional environment, sensing at least one biometric parameter of the end user in response to the presentation of the stimulus to the end user, generating biometric data for each of the sensed biometric parameter(s), determining if the end user is in at least one specific emotional state based on the biometric data for the each of the sensed biometric parameter(s), and performing an action discernible to the end user to facilitate a current objective at least partially based on if it is determined that the end user is in the specific emotional state(s).

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • A63F 13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
  • A63F 13/212 - Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
  • A63F 13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
  • A63F 13/65 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
  • A63F 13/822 - Strategy games; Role-playing games
  • G02B 27/01 - Head-up displays
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 16/56 - Information retrieval; Database structures therefor; File system structures therefor of still image data having vectorial format
  • G06F 16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions

63.

SYSTEMS AND METHODS FOR ENHANCED DEPTH DETERMINATION USING PROJECTION SPOTS

      
Application Number 18801164
Status Pending
Filing Date 2024-08-12
First Publication Date 2025-02-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Berger, Kai
  • Vohra, Hasnain Salim

Abstract

Systems and methods for enhanced depth determination using projection spots. An example method includes obtaining images of a real-world object, the images being obtained from image sensors positioned about the real-world object, and the images depicting projection spots projected onto the real-world object via projectors positioned about the real-world object. A projection spot map is accessed, the projection spot map including information indicative of real-world locations of projection spots based on locations of the projection spots in the obtained images. Location information is assigned to the projection spots based on the projection spot map. Generation of a three-dimensional representation of the real-world object is caused.
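
The spot-map lookup in the abstract can be pictured as matching detected image locations against a precomputed table that ties spot positions to real-world coordinates. The function name, data shapes, and pixel tolerance below are assumptions for illustration only.

```python
# Sketch: assign real-world locations to detected projection spots by matching
# their image coordinates against a precomputed projection spot map.

def assign_locations(detected_spots, spot_map, tol=2.0):
    """detected_spots: {spot_id: (u, v)} pixel coords.
    spot_map: [((u, v), (x, y, z)), ...] image-to-world correspondences."""
    located = {}
    for spot_id, (u, v) in detected_spots.items():
        for (mu, mv), world in spot_map:
            if abs(u - mu) <= tol and abs(v - mv) <= tol:  # close enough to match
                located[spot_id] = world
                break
    return located

spot_map = [((11.0, 9.0), (0.10, 0.20, 1.50)),
            ((40.0, 40.0), (0.30, 0.10, 1.20))]
located = assign_locations({"a": (10.0, 10.0)}, spot_map)
```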

IPC Classes

  • G06T 7/521 - Depth or shape recovery from laser ranging, e.g. using interferometry; Depth or shape recovery from the projection of structured light
  • G06T 7/557 - Depth or shape recovery from multiple images from light fields, e.g. from plenoptic cameras
  • G06T 7/586 - Depth or shape recovery from multiple images from multiple light sources, e.g. photometric stereo
  • G06T 15/10 - Geometric effects
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

64.

VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS AND METHODS

      
Application Number 18923373
Status Pending
Filing Date 2024-10-22
First Publication Date 2025-02-06
Owner MAGIC LEAP, INC. (USA)
Inventor Taylor, Robert Blake

Abstract

A method for determining a focal point depth of a user of a three-dimensional (“3D”) display device includes tracking a first gaze path of the user. The method also includes analyzing 3D data to identify one or more virtual objects along the first gaze path of the user. The method further includes, when only one virtual object intersects the first gaze path of the user, identifying a depth of the only one virtual object as the focal point depth of the user.
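
The single-intersection rule above is simple to sketch: cast the gaze ray, collect intersected virtual objects, and use the object's depth only when exactly one hit occurs. The sphere-shaped objects and all names below are simplifying assumptions, not the patented geometry.

```python
# Sketch of the focal-depth rule: exactly one virtual object on the gaze ray
# -> use that object's depth; zero or multiple hits -> ambiguous (None).

def focal_depth(gaze_origin, gaze_dir, objects):
    """objects: list of (center, radius, depth) spheres.
    Assumes gaze_dir is a unit vector. Returns a depth or None."""
    hits = []
    for center, radius, depth in objects:
        oc = [c - o for c, o in zip(center, gaze_origin)]
        t = sum(a * b for a, b in zip(oc, gaze_dir))       # projection onto ray
        closest = [o + t * d for o, d in zip(gaze_origin, gaze_dir)]
        d2 = sum((c - p) ** 2 for c, p in zip(center, closest))
        if d2 <= radius ** 2:                              # ray passes through sphere
            hits.append(depth)
    return hits[0] if len(hits) == 1 else None

depth = focal_depth((0, 0, 0), (0, 0, 1.0),
                    [((0, 0, 3), 0.5, 3.0),    # on the gaze ray
                     ((5, 0, 3), 0.5, 3.0)])   # well off the ray
```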

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/50 - Depth or shape recovery
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

65.

ENHANCED EYE TRACKING TECHNIQUES BASED ON NEURAL NETWORK ANALYSIS OF IMAGES

      
Application Number 18598620
Status Pending
Filing Date 2024-03-07
First Publication Date 2025-01-30
Owner Magic Leap, Inc. (USA)
Inventor
  • Zheng, Hao
  • Jia, Zhiheng

Abstract

Enhanced eye-tracking techniques for augmented or virtual reality display systems. An example method includes obtaining an image of an eye of a user of a wearable system, the image depicting glints on the eye caused by respective light emitters, wherein the image is a low dynamic range (LDR) image; generating a high dynamic range (HDR) image via computation of a forward pass of a machine learning model using the image; determining location information associated with the glints as depicted in the HDR image, wherein the location information is usable to inform an eye pose of the eye.
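
The pipeline above (LDR image in, HDR image out, then glint localization) can be sketched with a placeholder in place of the neural network. The inverse-gamma expansion below stands in for the model's forward pass and is purely an illustrative assumption, as are the threshold and peak-picking scheme.

```python
import numpy as np

# Sketch of the eye-tracking pipeline: expand an LDR eye image to HDR with a
# (stand-in) learned model, then localize glints as bright intensity peaks.

def to_hdr(ldr):
    """Placeholder for the machine learning model's forward pass."""
    return np.power(ldr / 255.0, 2.2) * 1000.0   # expand dynamic range

def find_glints(hdr, threshold, num_glints):
    """Return (row, col) of the num_glints brightest pixels above threshold."""
    flat = np.argsort(hdr, axis=None)[::-1][:num_glints]
    coords = [tuple(np.unravel_index(i, hdr.shape)) for i in flat]
    return [c for c in coords if hdr[c] > threshold]

ldr = np.zeros((4, 4))
ldr[1, 2] = 255.0   # saturated glint in the LDR image
ldr[3, 0] = 200.0   # dimmer glint
hdr = to_hdr(ldr)
glints = find_glints(hdr, threshold=100.0, num_glints=2)
```

The motivation for the HDR step is visible even in this toy: in the LDR image both glints crowd the top of the 8-bit range, while the expanded image separates them cleanly for localization.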

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 5/92 - Dynamic range modification of images or parts thereof based on global image properties

66.

HIGH ACCURACY DISPLACEMENT DEVICE

      
Application Number 18917014
Status Pending
Filing Date 2024-10-16
First Publication Date 2025-01-30
Owner Magic Leap, Inc. (USA)
Inventor
  • Donaldson, Nick
  • Yan, Changxin
  • Gupta, Ankur
  • Chauhan, Vikram

Abstract

Devices are described for high accuracy displacement of tools. In particular, embodiments provide a device for adjusting a position of a tool. The device includes a threaded shaft having a first end and a second end and a shaft axis extending from the first end to the second end, a motor that actuates the threaded shaft to move in a direction of the shaft axis. In some examples, the motor is operatively coupled to the threaded shaft. The device includes a carriage coupled to the camera, and a bearing assembly coupled to the threaded shaft and the carriage. In some examples, the bearing assembly permits a movement of the carriage with respect to the threaded shaft. The movement of the carriage allows the position of the camera to be adjusted.

IPC Classes

  • H04N 23/695 - Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
  • G03B 17/56 - Accessories

67.

METHODS AND SYSTEMS FOR INTERPOLATION OF DISPARATE INPUTS

      
Application Number 18917761
Status Pending
Filing Date 2024-10-16
First Publication Date 2025-01-30
Owner Magic Leap, Inc. (USA)
Inventor Wedig, Geoffrey

Abstract

Systems and methods are provided for interpolation of disparate inputs. A radial basis function neural network (RBFNN) may be used to interpolate the pose of a digital character. Input parameters to the RBFNN may be separated by data type (e.g. angular vs. linear) and manipulated within the RBFNN by distance functions specific to the data type (e.g. use an angular distance function for the angular input data). A weight may be applied to each distance to compensate for input data representing different variables (e.g. clavicle vs. shoulder). The output parameters of the RBFNN may be a set of independent values, which may be combined into combination values (e.g. representing x, y, z, w angular value in SO(3) space).
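
The per-type distance idea above can be shown with a small radial basis function interpolator: each input dimension gets a distance function matched to its data type (angular vs. linear) and a per-variable weight before the Gaussian kernel. Everything below (function names, the Gaussian kernel, sigma) is an illustrative assumption, not the patented RBFNN.

```python
import math

# RBF interpolation with type-specific distance functions: angular inputs use
# wrap-around distance, linear inputs use absolute difference, and each input
# variable carries its own weight before entering the Gaussian kernel.

def angular_dist(a, b):
    d = abs(a - b) % (2 * math.pi)
    return min(d, 2 * math.pi - d)   # shortest way around the circle

def linear_dist(a, b):
    return abs(a - b)

def rbf_interpolate(query, samples, dist_fns, weights, sigma=1.0):
    """samples: list of (input_tuple, output). Blend outputs by RBF activation."""
    acts = []
    for inp, _ in samples:
        d2 = sum((w * fn(q, x)) ** 2
                 for q, x, fn, w in zip(query, inp, dist_fns, weights))
        acts.append(math.exp(-d2 / (2 * sigma ** 2)))
    total = sum(acts)
    return sum(a * out for a, (_, out) in zip(acts, samples)) / total

samples = [((0.0, 0.0), 0.0), ((math.pi, 1.0), 10.0)]
fns = (angular_dist, linear_dist)   # first input is angular, second is linear
value = rbf_interpolate((0.0, 0.0), samples, fns, weights=(1.0, 1.0))
```

The weights compensate for inputs representing different variables, echoing the clavicle-vs-shoulder example in the abstract: a joint whose motion matters more can be given a larger weight.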

IPC Classes

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06N 3/048 - Activation functions
  • G06N 3/09 - Supervised learning
  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings

68.

MIXED REALITY VIRTUAL REVERBERATION

      
Application Number 18909765
Status Pending
Filing Date 2024-10-08
First Publication Date 2025-01-30
Owner Magic Leap, Inc. (USA)
Inventor
  • Tajik, Anastasia Andreyevna
  • Jot, Jean-Marc

Abstract

A method of presenting an audio signal to a user of a mixed reality environment is disclosed, the method comprising the steps of detecting a first audio signal in the mixed reality environment, where the first audio signal is a real audio signal; identifying a virtual object intersected by the first audio signal in the mixed reality environment; identifying a listener coordinate associated with the user; determining, using the virtual object and the listener coordinate, a transfer function; applying the transfer function to the first audio signal to produce a second audio signal; and presenting, to the user, the second audio signal.

IPC Classes

69.

REVERBERATION FINGERPRINT ESTIMATION

      
Application Number 18915247
Status Pending
Filing Date 2024-10-14
First Publication Date 2025-01-30
Owner Magic Leap, Inc. (USA)
Inventor
  • Parvaix, Mathieu
  • Jot, Jean-Marc
  • Leider, Colby Nelson

Abstract

Examples of the disclosure describe systems and methods for estimating acoustic properties of an environment. In an example method, a first audio signal is received via a microphone of a wearable head device. An envelope of the first audio signal is determined, and a first reverberation time is estimated based on the envelope of the first audio signal. A difference between the first reverberation time and a second reverberation time is determined. A change in the environment is determined based on the difference between the first reverberation time and the second reverberation time. A second audio signal is presented via a speaker of a wearable head device, wherein the second audio signal is based on the second reverberation time.
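
Estimating a reverberation time from a signal envelope, as described above, is commonly done by fitting a line to the decaying log-envelope and extrapolating to -60 dB (RT60). The least-squares fit and the synthetic exponential envelope below are illustrative assumptions, not the patented estimator.

```python
# Sketch: estimate RT60 by fitting a line to the envelope in dB over time and
# extrapolating the fitted slope to a 60 dB decay.

def estimate_rt60(envelope_db, sample_rate):
    """Least-squares slope of envelope (dB) vs. time, extrapolated to -60 dB."""
    n = len(envelope_db)
    times = [i / sample_rate for i in range(n)]
    t_mean = sum(times) / n
    e_mean = sum(envelope_db) / n
    slope = (sum((t - t_mean) * (e - e_mean) for t, e in zip(times, envelope_db))
             / sum((t - t_mean) ** 2 for t in times))
    return -60.0 / slope   # seconds for the level to fall by 60 dB

# Synthetic envelope decaying at exactly -120 dB per second -> RT60 = 0.5 s.
sr = 100
env = [-120.0 * (i / sr) for i in range(50)]
rt60 = estimate_rt60(env, sr)
```

Comparing two such estimates over time, as the abstract describes, is then a simple difference test: a large change in RT60 suggests the user has moved to an acoustically different environment.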

IPC Classes

70.

METHOD AND SYSTEM FOR VARIABLE OPTICAL THICKNESS WAVEGUIDES FOR AUGMENTED REALITY DEVICES

      
Application Number 18907129
Status Pending
Filing Date 2024-10-04
First Publication Date 2025-01-23
Owner Magic Leap, Inc. (USA)
Inventor
  • Tekolste, Robert D.
  • Ong, Ryan Jason
  • Liu, Victor Kai
  • Bhargava, Samarth
  • Peroz, Christophe
  • Singh, Vikramjit
  • Menezes, Marlon Edward
  • Yang, Shuqiang
  • Xu, Frank Y.

Abstract

An augmented reality device includes a projector, projector optics optically coupled to the projector, and a substrate structure including a substrate having an incident surface and an opposing exit surface and a first variable thickness film coupled to the incident surface. The substrate structure can also include a first combined pupil expander coupled to the first variable thickness film, a second variable thickness film coupled to the opposing exit surface, an incoupling grating coupled to the opposing exit surface, and a second combined pupil expander coupled to the opposing exit surface.

IPC Classes

  • G02B 6/13 - Integrated optical circuits characterised by the manufacturing method
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 6/12 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type of the integrated circuit kind
  • G02B 6/122 - Basic optical elements, e.g. light-guiding paths
  • G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light

71.

APPARATUS FOR OPTICAL SEE-THROUGH HEAD MOUNTED DISPLAY WITH MUTUAL OCCLUSION AND OPAQUENESS CONTROL

      
Application Number 18908422
Status Pending
Filing Date 2024-10-07
First Publication Date 2025-01-23
Owner Magic Leap, Inc. (USA)
Inventor
  • Gao, Chunyu
  • Lin, Yuxiang
  • Hua, Hong

Abstract

The present invention comprises a compact optical see-through head-mounted display capable of combining a see-through image path with a virtual image path such that the opaqueness of the see-through image path can be modulated and the virtual image occludes parts of the see-through image and vice versa.

IPC Classes

  • G02B 26/00 - Optical devices or arrangements for the control of light using movable or deformable optical elements
  • G02B 5/04 - Prisms
  • G02B 13/06 - Panoramic objectives; So-called "sky lenses"
  • G02B 25/00 - Eyepieces; Magnifying glasses
  • G02B 27/01 - Head-up displays
  • G02B 27/10 - Beam splitting or combining systems
  • G02B 27/14 - Beam splitting or combining systems operating by reflection only
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups , for polarising
  • G03B 37/02 - Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with scanning movement of lens or camera
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 23/45 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
  • H04N 23/698 - Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

72.

CROSS REALITY SYSTEM WITH QUALITY INFORMATION ABOUT PERSISTENT COORDINATE FRAMES

      
Application Number 18780769
Status Pending
Filing Date 2024-07-23
First Publication Date 2025-01-16
Owner Magic Leap, Inc. (USA)
Inventor
  • Shahrokni, Ali
  • Torres, Rafael Domingos
  • Guberman Raza, Joao Lucas

Abstract

A cross reality system that provides an immersive user experience shared by multiple user devices by providing quality information about a shared map. The quality information may be specific to individual user devices rendering virtual content specified with respect to the shared map. The quality information may be provided for persistent coordinate frames (PCFs) in the map. The quality information about a PCF may indicate positional uncertainty of virtual content, specified with respect to the PCF, when rendered on the user device. The quality information may be computed as upper bounding errors by determining error statistics for one or more steps in a process of specifying position with respect to the PCF or transforming that positional expression to a coordinate frame local to the device for rendering the virtual content. Applications running on individual user devices may adjust the rendering of virtual content based on the quality information about the shared map.
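
The upper-bounding-error idea in the abstract can be illustrated by accumulating per-step error bounds along the transform chain (map to PCF to device) and letting the application adjust rendering when the total is too large. The step values, summation rule, and threshold policy below are assumptions for illustration.

```python
# Sketch: conservative positional uncertainty as the sum of per-step upper
# bounds along the transform chain, plus an example rendering policy.

def upper_bound_error(step_errors):
    """Sum of per-step error upper bounds (meters) along the transform chain."""
    return sum(step_errors)

def rendering_policy(uncertainty_m, threshold_m=0.05):
    """Example policy: fade precision-sensitive content when uncertainty is high."""
    return "render" if uncertainty_m <= threshold_m else "fade_out"

# Hypothetical chain: localization, PCF alignment, device-local transform.
chain = [0.01, 0.015, 0.02]
u = upper_bound_error(chain)
action = rendering_policy(u)
```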

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 3/40 - Scaling of whole images or parts thereof, e.g. expanding or contracting

73.

VARIABLE-PITCH COLOR EMITTING DISPLAY

      
Application Number 18900448
Status Pending
Filing Date 2024-09-27
First Publication Date 2025-01-16
Owner Magic Leap, Inc. (USA)
Inventor
  • St. Hilaire, Pierre
  • Poliakov, Evgeni
  • Jolly, Sundeep Kumar

Abstract

This disclosure relates to the use of variable-pitch light-emitting devices for display applications, including for displays in augmented reality, virtual reality, and mixed reality environments. In particular, it relates to small (e.g., micron-size) light emitting devices (e.g., micro-LEDs) of variable pitch to provide the advantages, e.g., of compactness, manufacturability, color rendition, as well as computational and power savings. Systems and methods for emitting multiple lights by multiple panels where a pitch of one panel is different than pitch(es) of other panels are disclosed. Each panel may comprise a respective array of light emitters. The multiple lights may be combined by a combiner.

IPC Classes

  • H04N 9/31 - Projection devices for colour picture display
  • G02B 27/01 - Head-up displays
  • G02B 27/10 - Beam splitting or combining systems
  • H01L 25/075 - Assemblies consisting of a plurality of individual semiconductor or other solid-state devices all the devices being of a type provided for in a single subclass of subclasses , , , , or , e.g. assemblies of rectifier diodes the devices not having separate containers the devices being of a type provided for in group

74.

NON-BLOCKING DUAL DRIVER EARPHONES

      
Application Number 18900477
Status Pending
Filing Date 2024-09-27
First Publication Date 2025-01-16
Owner Magic Leap, Inc. (USA)
Inventor
  • Schmidt, Brian Lloyd
  • Roach, David Thomas
  • Land, Michael Z.
  • Herr, Richard D.

Abstract

A head-worn sound reproduction device is provided in the form of left and right earphones, which can either be clipped to each ear or mounted on other headgear. The earphones deliver high fidelity audio to a user's eardrums from near-ear range, in a lightweight form factor that is fully “non-blocking” (allows coupling in and natural hearing of ambient sound). Each earphone has a woofer component that produces bass frequencies, and a tweeter component that produces treble frequencies. The woofer outputs the bass frequencies from a position close to the ear canal, while the tweeter outputs treble frequencies from a position that is either close to the ear canal or further away. In certain embodiments, the tweeter is significantly further from the ear canal than the woofer, leading to a more expansive perceived “sound stage”, but still with a “pure” listening experience.

IPC Classes

  • H04R 1/10 - EarpiecesAttachments therefor
  • G02C 11/00 - Non-optical adjuncts; Attachment thereof
  • G02C 11/06 - Hearing aids
  • H04M 1/05 - Supports for telephone transmitters or receivers specially adapted for use on head, throat or breast
  • H04M 1/60 - Substation equipment, e.g. for use by subscribers including speech amplifiers
  • H04R 1/02 - Casings; Cabinets; Mountings therein
  • H04R 1/26 - Spatial arrangement of separate transducers responsive to two or more frequency ranges
  • H04R 1/28 - Transducer mountings or enclosures designed for specific frequency response; Transducer enclosures modified by provision of mechanical or acoustic impedances, e.g. resonator, damping means
  • H04R 1/34 - Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by using a single transducer with sound reflecting, diffracting, directing or guiding means
  • H04R 3/14 - Cross-over networks
  • H04R 5/033 - Headphones for stereophonic communication
  • H04R 5/04 - Circuit arrangements
  • H04S 1/00 - Two-channel systems
  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control

75.

DISPLAY SYSTEM AND METHOD FOR PROVIDING VARIABLE ACCOMMODATION CUES USING MULTIPLE INTRA-PUPIL PARALLAX VIEWS FORMED BY LIGHT EMITTER ARRAYS

      
Application Number 18901851
Status Pending
Filing Date 2024-09-30
First Publication Date 2025-01-16
Owner Magic Leap, Inc. (USA)
Inventor Klug, Michael Anthony

Abstract

A display system is configured to direct a plurality of parallactically-disparate intra-pupil images into a viewer's eye. The parallactically-disparate intra-pupil images provide different parallax views of a virtual object, and impinge on the pupil from different angles. The wavefronts of light forming the images approximate a continuous divergent wavefront and provide selectable accommodation cues for the user, depending on the amount of parallax disparity between the intra-pupil images. The images may be formed by an emissive micro-display. Each pixel formed by the micro-display may be formed by one of a group of light emitters, which are at different locations such that the emitted light takes different paths to the eye to provide different amounts of parallax disparity.

IPC Classes

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 25/00 - Eyepieces; Magnifying glasses
  • G02B 27/10 - Beam splitting or combining systems
  • G02B 27/14 - Beam splitting or combining systems operating by reflection only
  • G02B 27/30 - Collimators
  • G02B 30/24 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/398 - Synchronisation thereof; Control thereof

76.

SECURE AUTHORIZATION VIA MODAL WINDOW

      
Application Number 18897926
Status Pending
Filing Date 2024-09-26
First Publication Date 2025-01-16
Owner Magic Leap, Inc. (USA)
Inventor Mak, Genevieve

Abstract

The disclosure relates to systems and methods for authorization of a user in a spatial 3D environment. The systems and methods can include receiving a request from an application executing on a mixed reality display system to authorize the user with a web service, displaying to the user an authorization window configured to accept user input associated with authorization by the web service and to prevent the application or other applications from receiving the user input, communicating the user input to the web service, receiving an access token from the web service, in which the access token is indicative of successful authorization by the web service, and communicating the access token to the application for authorization of the user. The authorization window can be a modal window displayed in an immersive mode by the mixed reality display system.

IPC Classes

  • G06F 21/36 - User authentication by graphic or iconic representation
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 21/31 - User authentication
  • G06F 21/33 - User authentication using certificates
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

77.

DISPLAY DEVICE WITH DIFFRACTION GRATING HAVING REDUCED POLARIZATION SENSITIVITY

      
Application Number 18898376
Status Pending
Filing Date 2024-09-26
First Publication Date 2025-01-16
Owner Magic Leap, Inc. (USA)
Inventor
  • Singh, Vikramjit
  • Luo, Kang
  • Deng, Xiaopei
  • Yang, Shuqiang
  • Xu, Frank Y.
  • Messer, Kevin

Abstract

Diffraction gratings provide optical elements, e.g., in a head-mountable display system, that can affect light, for example by incoupling light into a waveguide, outcoupling light out of a waveguide, and/or multiplying light propagating in a waveguide. The diffraction gratings may be configured to have reduced polarization sensitivity such that light of different polarization states, or polarized and unpolarized light, is incoupled, outcoupled, multiplied, or otherwise affected with a similar level of efficiency. The reduced polarization sensitivity may be achieved through provision of a transmissive layer and a metallic layer on one or more gratings. A diffraction grating may comprise a blazed grating or other suitable configuration.

IPC Classes

  • G02B 5/18 - Diffracting gratings
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 27/01 - Head-up displays

78.

SYSTEMS AND METHODS FOR TEMPORARILY DISABLING USER CONTROL INTERFACES DURING ATTACHMENT OF AN ELECTRONIC DEVICE

      
Application Number 18899240
Status Pending
Filing Date 2024-09-27
First Publication Date 2025-01-16
Owner Magic Leap, Inc. (USA)
Inventor
  • Pedroza, Carlos Julio Suate
  • Hendry, Todd Daniel
  • Rohena, Guillermo Padin
  • Greco, Paul M.
  • Diptee, Vinosh Christopher
  • Rynk, Evan Francis

Abstract

Systems and methods of disabling user control interfaces during attachment of a wearable electronic device to a portion of a user's clothing or accessory are disclosed. The wearable electronic device can include inertial measurement units (IMUs), optical sources, optical sensors or electromagnetic sensors. Based on the information provided by the IMUs, optical sources, optical sensors or electromagnetic sensors, an electrical processing and control system can make a determination that the electronic device is being grasped and picked up for attaching to a portion of a user's clothing or accessory or that the electronic device is in the process of being attached to a portion of a user's clothing or accessory and temporarily disable one or more user control interfaces disposed on the outside of the wearable electronic device.

IPC Classes

79.

METHOD AND SYSTEM FOR FIBER SCANNING PROJECTOR WITH ANGLED EYEPIECE

      
Application Number 18900077
Status Pending
Filing Date 2024-09-27
First Publication Date 2025-01-16
Owner Magic Leap, Inc. (USA)
Inventor
  • Schowengerdt, Brian T.
  • Watson, Mathew D.

Abstract

A wearable display system includes a fiber scanner including an optical fiber and a scanning mechanism configured to scan a tip of the optical fiber along an emission trajectory defining an optical axis. The wearable display system also includes an eyepiece positioned in front of the tip of the optical fiber and including a planar waveguide, an incoupling diffractive optical element (DOE) coupled to the planar waveguide, and an outcoupling DOE coupled to the planar waveguide. The wearable display system further includes a collimating optical element configured to receive light reflected by the incoupling DOE and collimate and reflect light toward the eyepiece.

IPC Classes

80.

AUGMENTED REALITY DISPLAY COMPRISING EYEPIECE HAVING A TRANSPARENT EMISSIVE DISPLAY

      
Application Number 18886610
Status Pending
Filing Date 2024-09-16
First Publication Date 2025-01-09
Owner Magic Leap, Inc. (USA)
Inventor
  • Edwin, Lionel Ernest
  • Yeoh, Ivan Li Chuen

Abstract

An augmented reality head-mounted display system includes an eyepiece having a transparent emissive display. The eyepiece and transparent emissive display are positioned in an optical path of a user's eye in order to transmit light into the user's eye to form images. Due to the transparent nature of the display, the user can see the outside environment through it. The transparent emissive display comprises a plurality of emitters configured to emit light into the eye of the user.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 3/00 - Simple or compound lenses
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G09G 3/3208 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix using controlled light sources; using electroluminescent panels; semiconductive, e.g. using light-emitting diodes [LED]; organic, e.g. using organic light-emitting diodes [OLED]

81.

DYNAMIC INCOUPLING GRATINGS IN IMAGING SYSTEMS

      
Application Number 18886763
Status Pending
Filing Date 2024-09-16
First Publication Date 2025-01-09
Owner Magic Leap, Inc. (USA)
Inventor
  • Trisnadi, Jahja I.
  • St. Hilaire, Pierre
  • Carlisle, Clinton

Abstract

An eyepiece for projecting an image light field to an eye of a viewer for forming an image of virtual content includes a waveguide, a light source configured to deliver a light beam to be incident on the waveguide, a controller coupled to the light source and configured to modulate an intensity of the light beam in a plurality of time slots, a dynamic input coupling grating (ICG) configured to, for each time slot, diffract a respective portion of the light beam into the waveguide at a respective total internal reflection (TIR) angle corresponding to a respective field angle, and an outcoupling diffractive optical element (DOE) configured to diffract each respective portion of the light beam out of the waveguide toward the eye at the respective field angle, thereby projecting the light field to the eye of the viewer.
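The per-slot mapping from field angle to TIR angle can be related through the first-order grating equation. A minimal Python sketch, assuming an illustrative waveguide index of 1.8, a single +1 diffraction order, and a hypothetical per-slot schedule (the function name `tir_angle` and all numeric values are assumptions for illustration, not taken from the patent):

```python
import math

def tir_angle(field_angle_deg, wavelength_nm, period_nm, n_wg=1.8):
    """First-order grating equation for the angle of the in-coupled ray
    inside the waveguide: n_wg * sin(theta) = sin(phi) + lambda / period.
    The index 1.8 and the single +1 order are illustrative assumptions."""
    s = math.sin(math.radians(field_angle_deg)) + wavelength_nm / period_nm
    if not -1.0 <= s / n_wg <= 1.0:
        raise ValueError("no propagating order for these parameters")
    return math.degrees(math.asin(s / n_wg))

# A dynamic ICG would present a different grating state in each time
# slot, pairing a modulated intensity with the field angle addressed
# in that slot.
schedule = [
    # (time slot, relative intensity, field angle in degrees)
    (0, 1.00, -10.0),
    (1, 0.75, 0.0),
    (2, 0.50, +10.0),
]
angles = [tir_angle(phi, 520, 400) for _, _, phi in schedule]
```

Each computed angle must exceed the waveguide's critical angle for the portion of the beam to remain guided by total internal reflection.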

IPC Classes

  • G02B 27/42 - Diffraction optics
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/01 - Head-up displays
  • H04N 9/31 - Projection devices for colour picture display

82.

MULTIPLE DEGREE OF FREEDOM HINGE SYSTEMS AND EYEWEAR DEVICES COMPRISING SUCH HINGE SYSTEMS

      
Application Number 18896717
Status Pending
Filing Date 2024-09-25
First Publication Date 2025-01-09
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Lopez, Alejandro
  • Duenner, Andrew C.

Abstract

A multiple degree of freedom hinge system is provided, which is particularly well adapted for eyewear, such as spatial computing headsets. In the context of such spatial computing headsets having an optics assembly supported by opposing temple arms, the hinge system provides protection against over-extension of the temple arms or extreme deflections that may otherwise arise from undesirable torsional loading of the temple arms. The hinge systems also allow the temple arms to splay outwardly to enable proper fit and enhanced user comfort.

83.

TUNABLE CYLINDRICAL LENSES AND HEAD-MOUNTED DISPLAY INCLUDING THE SAME

      
Application Number 18888395
Status Pending
Filing Date 2024-09-18
First Publication Date 2025-01-09
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Russell, Andrew Ian
  • Haddock, Joshua Naaman

Abstract

Systems include three optical elements arranged along an optical axis each having a different cylinder axis and a variable cylinder refractive power. Collectively, the three elements form a compound optical element having an overall spherical refractive power (SPH), cylinder refractive power (CYL), and cylinder axis (Axis) that can be varied according to a prescription (Rx).
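The claim that variable cylinders at different axes can collectively realize an arbitrary SPH/CYL/Axis prescription can be checked with the standard thin-lens power-vector (M, J0, J45) algebra, in which stacked cylinders add componentwise. A sketch under the assumption of ideal thin lenses in contact (the helper names are hypothetical):

```python
import math

def cyl_to_vector(cyl_power, axis_deg):
    """Power vector (M, J0, J45) of a thin cylindrical lens; M is the
    spherical-equivalent power of the cylinder."""
    a2 = 2 * math.radians(axis_deg)
    return (cyl_power / 2,
            -(cyl_power / 2) * math.cos(a2),
            -(cyl_power / 2) * math.sin(a2))

def combine(*vectors):
    """Add the power vectors of stacked thin lenses and convert the sum
    back to a (SPH, CYL, Axis) prescription in minus-cylinder form."""
    M = sum(v[0] for v in vectors)
    J0 = sum(v[1] for v in vectors)
    J45 = sum(v[2] for v in vectors)
    cyl = -2 * math.hypot(J0, J45)
    sph = M - cyl / 2
    axis = math.degrees(math.atan2(J45, J0)) / 2 % 180
    return sph, cyl, axis
```

For example, two equal cylinders crossed at 90 degrees combine to a pure sphere, which is why three independently tunable cylinder axes suffice to span the full (SPH, CYL, Axis) space.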

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/02 - Viewing or reading apparatus

84.

Lens set packaging

      
Application Number 29716372
Grant Number D1056706
Status In Force
Filing Date 2019-12-09
First Publication Date 2025-01-07
Grant Date 2025-01-07
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Hoit, Sarah
  • Palmer, James William
  • Gamez Castillejos, Daniel Marcelo
  • Palmer, Christopher G.

85.

Combined charging stand and devices

      
Application Number 29726647
Grant Number D1056835
Status In Force
Filing Date 2020-03-04
First Publication Date 2025-01-07
Grant Date 2025-01-07
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta
  • Martin, Spencer Byron

86.

VOICE ONSET DETECTION

      
Application Number 18764006
Status Pending
Filing Date 2024-07-03
First Publication Date 2025-01-02
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Lee, Jung-Suk
  • Jot, Jean-Marc

Abstract

In some embodiments, a first audio signal is received via a first microphone, and a first probability of voice activity is determined based on the first audio signal. A second audio signal is received via a second microphone, and a second probability of voice activity is determined based on the first and second audio signals. Whether a first threshold of voice activity is met is determined based on the first and second probabilities of voice activity. In accordance with a determination that a first threshold of voice activity is met, it is determined that a voice onset has occurred, and an alert is transmitted to a processor based on the determination that the voice onset has occurred. In accordance with a determination that a first threshold of voice activity is not met, it is not determined that a voice onset has occurred.
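As a rough sketch of how the two probabilities might be combined against a threshold, with an alert raised only when the threshold is newly met (the weighted average, the 0.7 threshold, and the function names are illustrative assumptions, not the patent's method):

```python
def voice_onset(p_first: float, p_both: float,
                w: float = 0.5, threshold: float = 0.7) -> bool:
    """Combine the probability from the first microphone alone with the
    probability from both microphones (assumed weighted average) and
    test whether the first threshold of voice activity is met."""
    return w * p_first + (1 - w) * p_both >= threshold

def detect_onsets(prob_pairs):
    """Return indices where a voice onset occurs, i.e. where the
    threshold transitions from unmet to met; an alert would be
    transmitted to a processor at each such index."""
    active = False
    onsets = []
    for i, (p1, p2) in enumerate(prob_pairs):
        now = voice_onset(p1, p2)
        if now and not active:
            onsets.append(i)
        active = now
    return onsets
```

Tracking the transition rather than the raw threshold keeps a sustained utterance from generating repeated alerts.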

IPC Classes

  • G10L 25/78 - Detection of presence or absence of voice signals
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 17/18 - Complex mathematical operations for evaluating statistical data
  • G10L 25/51 - Speech or voice analysis techniques not restricted to a single one of groups specially adapted for particular use for comparison or discrimination
  • H04R 3/00 - Circuits for transducers
  • H04R 3/04 - Circuits for transducers for correcting frequency response
  • H04R 5/04 - Circuit arrangements

87.

VIRTUAL LOCATION SELECTION FOR VIRTUAL CONTENT

      
Application Number 18823058
Status Pending
Filing Date 2024-09-03
First Publication Date 2024-12-26
Owner Magic Leap, Inc. (USA)
Inventor
  • Warren, Silas
  • Khan, Omar
  • Miller, Samuel A.
  • Arora, Tushar

Abstract

A method for placing content in an augmented reality system. A notification is received regarding availability of new content to display in the augmented reality system. A confirmation is received that indicates acceptance of the new content. Three-dimensional information that describes the physical environment is provided to an external computing device, enabling that device to be used for selecting an assigned location in the physical environment for the new content. Location information indicating the assigned location is received from the external computing device. Based on the location information, a display location on a display system of the augmented reality system is determined at which to display the new content, so that the new content appears to the user as an overlay at the assigned location in the physical environment. The new content is displayed on the display system at the display location.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

88.

SYSTEMS AND METHODS FOR AUGMENTED REALITY

      
Application Number 18830108
Status Pending
Filing Date 2024-09-10
First Publication Date 2024-12-26
Owner Magic Leap, Inc. (USA)
Inventor
  • Woods, Michael Janusz
  • Rabinovich, Andrew

Abstract

Systems and methods for reducing error in noisy data received from a high-frequency sensor by fusing it with data received from a low-frequency sensor. The methods collect a first set of dynamic inputs from the high-frequency sensor, collect a correction input point from the low-frequency sensor, and adjust the propagation path of a second set of dynamic inputs from the high-frequency sensor based on the correction input point, either by full translation to the correction input point or by a dampened approach toward it.
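The two adjustment modes described in the abstract — full translation versus a dampened approach toward the correction point — can be sketched as follows, assuming 2D points and an illustrative geometric damping factor (all names and values are hypothetical):

```python
def fuse(high_freq_points, correction, alpha=0.2, full_translation=False):
    """Adjust the propagation path of high-frequency samples using a
    low-frequency correction input point.

    full_translation=True shifts the whole path so it starts at the
    correction point; otherwise each subsequent sample is pulled toward
    the correction point, the initial offset decaying by the assumed
    factor (1 - alpha) per sample. Points are (x, y) tuples.
    """
    cx, cy = correction
    x0, y0 = high_freq_points[0]
    if full_translation:
        dx, dy = cx - x0, cy - y0       # shift whole path at once
        return [(x + dx, y + dy) for x, y in high_freq_points]
    out = []
    ox, oy = x0 - cx, y0 - cy           # initial offset to remove
    for i, (x, y) in enumerate(high_freq_points):
        decay = (1 - alpha) ** i        # remaining fraction of the offset
        out.append((x - ox * (1 - decay), y - oy * (1 - decay)))
    return out
```

The dampened mode trades immediate accuracy for smoothness: the path converges to the corrected trajectory without the visible jump that a full translation would produce.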

IPC Classes

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 3/0483 - Interaction with page-structured environments, e.g. book metaphor
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06T 3/18 - Image warping, e.g. rearranging pixels individually
  • G06T 7/277 - Analysis of motion involving stochastic approaches, e.g. using Kalman filters
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

89.

INLINE IN-COUPLING OPTICAL ELEMENTS

      
Application Number 18822705
Status Pending
Filing Date 2024-09-03
First Publication Date 2024-12-26
Owner Magic Leap, Inc. (USA)
Inventor
  • Khorasaninejad, Mohammadreza
  • Liu, Victor Kai
  • Lin, Dianmin
  • Peroz, Christophe
  • St. Hilaire, Pierre

Abstract

A display system includes a waveguide assembly having a plurality of waveguides, each waveguide associated with an in-coupling optical element configured to in-couple light into the associated waveguide. A projector outputs light from one or more spatially-separated pupils, and at least one of the pupils outputs light of two different ranges of wavelengths. The in-coupling optical elements for two or more waveguides are inline, e.g. vertically aligned, with each other so that the in-coupling optical elements are in the path of light of the two different ranges of wavelengths. The in-coupling optical element of a first waveguide selectively in-couples light of one range of wavelengths into the waveguide, while the in-coupling optical element of a second waveguide selectively in-couples light of another range of wavelengths. Absorptive color filters are provided forward of an in-coupling optical element to limit the propagation of undesired wavelengths of light to that in-coupling optical element.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 6/10 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type
  • G02B 6/26 - Optical coupling means

90.

TOOL BRIDGE

      
Application Number 18823505
Status Pending
Filing Date 2024-09-03
First Publication Date 2024-12-26
Owner Magic Leap, Inc. (USA)
Inventor
  • Bailey, Richard St. Clair
  • Fong, Chun-Ip
  • Bridgewater, Erle Robert

Abstract

Disclosed herein are systems and methods for sharing and synchronizing virtual content. A method may include receiving, from a host application via a wearable device comprising a transmissive display, a first data package comprising first data; identifying virtual content based on the first data; presenting a view of the virtual content via the transmissive display; receiving, via the wearable device, first user input directed at the virtual content; generating second data based on the first data and the first user input; sending, to the host application via the wearable device, a second data package comprising the second data, wherein the host application is configured to execute via one or more processors of a computer system remote to the wearable device and in communication with the wearable device.

IPC Classes

  • G06F 30/12 - Geometric CAD characterised by design entry means specially adapted for CAD, e.g. graphical user interfaces [GUI] specially adapted for CAD
  • G02B 27/01 - Head-up displays
  • G06F 111/18 - Details relating to CAD techniques using virtual or augmented reality
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

91.

VOICE PROCESSING FOR MIXED REALITY

      
Application Number 18700175
Status Pending
Filing Date 2022-10-13
First Publication Date 2024-12-19
Owner Magic Leap, Inc. (USA)
Inventor
  • Audfray, Remi Samuel
  • Hertensteiner, Mark Brandon

Abstract

This disclosure is related to systems and methods for rendering audio for a mixed reality environment. Methods according to embodiments of this disclosure include receiving an input audio signal, via a wearable device in communication with a mixed reality environment, the input audio signal corresponding to a sound source originating from a real environment. In some embodiments, the system can determine one or more acoustic properties associated with the mixed reality environment. In some embodiments, the system can determine a signal modification parameter based on the one or more acoustic properties associated with the mixed reality environment. In some embodiments, the system can apply the signal modification parameter to the input audio signal to determine a second audio signal. The system can present the second audio signal to the user.

92.

VIRTUAL AND AUGMENTED REALITY SYSTEMS AND METHODS HAVING IMPROVED DIFFRACTIVE GRATING STRUCTURES

      
Application Number 18814248
Status Pending
Filing Date 2024-08-23
First Publication Date 2024-12-19
Owner Magic Leap, Inc. (USA)
Inventor
  • Tekolste, Robert D.
  • Klug, Michael A.
  • Greco, Paul M.
  • Schowengerdt, Brian T.

Abstract

Disclosed is an improved diffraction structure for 3D display systems. The improved diffraction structure includes an intermediate layer (underlayer) that resides between a waveguide substrate and a top grating surface. The top grating surface comprises a first material having a first refractive index value, the underlayer comprises a second material having a second refractive index value, and the substrate comprises a third material having a third refractive index value.

IPC Classes

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 5/18 - Diffracting gratings
  • G02B 27/42 - Diffraction optics
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

93.

DISPLAY SYSTEMS AND METHODS FOR DETERMINING VERTICAL ALIGNMENT BETWEEN LEFT AND RIGHT DISPLAYS AND A USER'S EYES

      
Application Number 18818343
Status Pending
Filing Date 2024-08-28
First Publication Date 2024-12-19
Owner Magic Leap, Inc. (USA)
Inventor Vlaskamp, Bjorn Nicolaas Servatius

Abstract

A wearable device may include a head-mounted display (HMD) for rendering a three-dimensional (3D) virtual object which appears to be located in an ambient environment of a user of the display. The relative positions of the HMD and one or more eyes of the user may not be in desired positions to receive image information outputted by the HMD. For example, the HMD-to-eye vertical alignment may be different between the left and right eyes. The wearable device may determine if the HMD is level on the user's head and may then provide the user with a left-eye alignment marker and a right-eye alignment marker. Based on user feedback, the wearable device may determine if there is any left-right vertical misalignment and may take actions to reduce or minimize the effects of any misalignment.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 27/01 - Head-up displays
  • G09G 5/38 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory with means for controlling the display position
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

94.

PAIRING WITH COMPANION DEVICE

      
Application Number 18819573
Status Pending
Filing Date 2024-08-29
First Publication Date 2024-12-19
Owner Magic Leap, Inc. (USA)
Inventor
  • Singh, Nitin
  • Kaehler, Adrian

Abstract

This disclosure describes techniques for device authentication and/or pairing. A display system can comprise a head mountable display, computer memory, and processor(s). In response to receiving a request to authenticate a connection between the display system and a companion device (e.g., controller or other computer device), first data may be determined, the first data based at least partly on biometric data associated with a user. The first data may be sent to an authentication device configured to compare the first data to second data received from the companion device, the second data based at least partly on the biometric data. Based at least partly on a correspondence between the first and second data, the authentication device can send a confirmation to the display system to permit communication between the display system and companion device.
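One way such a correspondence check could work is for both devices to derive a keyed digest from the shared biometric data and a session nonce, with the authentication device comparing the two digests in constant time. A sketch under those assumptions (the HMAC construction and all names are illustrative, not the patent's scheme):

```python
import hashlib
import hmac
import secrets

def derive(biometric: bytes, nonce: bytes) -> bytes:
    """Derive comparison data from shared biometric input (assumed
    construction: HMAC over a session nonce, keyed by a hash of the
    biometric data)."""
    key = hashlib.sha256(biometric).digest()
    return hmac.new(key, nonce, hashlib.sha256).digest()

def authenticate(first_data: bytes, second_data: bytes) -> bool:
    """Authentication device: permit communication only if the display
    system's data corresponds to the companion device's data."""
    return hmac.compare_digest(first_data, second_data)

nonce = secrets.token_bytes(16)                  # shared session nonce
display = derive(b"iris-template-user-42", nonce)    # from display system
companion = derive(b"iris-template-user-42", nonce)  # from companion device
```

Because only the derived digests travel to the authentication device, the raw biometric data never needs to leave either device.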

IPC Classes

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 6/34 - Optical coupling means utilising prism or grating
  • G02B 27/01 - Head-up displays
  • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
  • G06F 21/44 - Program or device authentication
  • G06V 20/80 - Recognising image objects characterised by unique random patterns
  • H04B 1/3827 - Portable transceivers
  • H04L 9/40 - Network security protocols
  • H04M 1/60 - Substation equipment, e.g. for use by subscribers including speech amplifiers
  • H04W 4/80 - Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
  • H04W 8/00 - Network data management
  • H04W 12/065 - Continuous authentication
  • H04W 12/069 - Authentication using certificates or pre-shared keys
  • H04W 12/50 - Secure pairing of devices
  • H04W 12/77 - Graphical identity

95.

METHODS AND APPARATUSES FOR CORNER DETECTION

      
Application Number 18820433
Status Pending
Filing Date 2024-08-30
First Publication Date 2024-12-19
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Tobler, Christoph
  • Langmann, Benjamin

Abstract

An apparatus configured for head-worn by a user, includes: a screen configured to present graphics for the user; a camera system configured to view an environment in which the user is located; and a processing unit coupled to the camera system, the processing unit configured to: obtain a feature detection response for a first image, divide the feature detection response into a plurality of patches having a first patch and a second patch, determine a first maximum value in the first patch of the feature detection response, and identify a first set of one or more features for a first region of the first image based on a first criterion that relates to the determined first maximum value.
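A minimal sketch of the patch-wise criterion, assuming a NumPy response map and an illustrative keep-if-above-half-the-patch-maximum rule (the patch size, the 0.5 ratio, and the function name are assumptions, not the patent's criterion):

```python
import numpy as np

def patch_features(response, patch=4, ratio=0.5):
    """Divide a feature-detection response map into patches, determine
    each patch's maximum value, and identify features whose response
    meets a criterion relative to that maximum."""
    h, w = response.shape
    keep = []
    for r0 in range(0, h, patch):
        for c0 in range(0, w, patch):
            block = response[r0:r0 + patch, c0:c0 + patch]
            m = block.max()
            if m <= 0:
                continue  # no response in this patch
            rs, cs = np.nonzero(block >= ratio * m)
            keep.extend((r0 + r, c0 + c) for r, c in zip(rs, cs))
    return keep
```

Because the criterion is relative to each patch's own maximum, weakly textured regions still contribute features instead of being drowned out by a single strong corner elsewhere in the image.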

IPC Classes

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06T 11/00 - 2D [Two Dimensional] image generation
  • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
  • G06V 10/46 - Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
  • G06V 10/771 - Feature selection, e.g. selecting representative features from a multi-dimensional feature space
  • G06V 20/10 - Terrestrial scenes
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes

96.

Surface relief waveguides with high refractive index resist

      
Application Number 18729437
Grant Number 12222537
Status In Force
Filing Date 2023-01-20
First Publication Date 2024-12-19
Grant Date 2025-02-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Traub, Matthew C
  • Liu, Yingnan
  • Singh, Vikramjit
  • Xu, Frank Y.
  • Tekolste, Robert D.
  • Xue, Qizhen
  • Bhargava, Samarth
  • Liu, Victor Kai
  • Born, Brandon Michael-James
  • Messer, Kevin

Abstract

The disclosure describes an improved drop-on-demand, controlled volume technique for dispensing resist onto a substrate, which is then imprinted to create a patterned optical device suitable for use in optical applications such as augmented reality and/or mixed reality systems. The technique enables the dispensation of drops of resist at precise locations on the substrate, with precisely controlled drop volume corresponding to an imprint template having different zones associated with different total resist volumes. Controlled drop size and placement also provides for substantially less variation in residual layer thickness across the surface of the substrate after imprinting, compared to previously available techniques. The technique employs resist having a refractive index closer to that of the substrate index, reducing optical artifacts in the device. To ensure reliable dispensing of the higher index and higher viscosity resist in smaller drop sizes, the dispensing system can continuously circulate the resist.

IPC Classes

  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G03F 7/00 - Photomechanical, e.g. photolithographic, production of textured or patterned surfaces, e.g. printed surfaces; Materials therefor, e.g. comprising photoresists; Apparatus specially adapted therefor

97.

METHODS AND SYSTEMS FOR GENERATING VIRTUAL CONTENT DISPLAY WITH A VIRTUAL OR AUGMENTED REALITY APPARATUS

      
Application Number 18802429
Status Pending
Filing Date 2024-08-13
First Publication Date 2024-12-19
Owner Magic Leap, Inc. (USA)
Inventor
  • Tekolste, Robert D.
  • Klug, Michael

Abstract

Several unique configurations for interferometric recording of volumetric phase diffractive elements with relatively high angle diffraction for use in waveguides are disclosed. Separate layer EPE and OPE structures produced by various methods may be integrated in side-by-side or overlaid constructs, and multiple such EPE and OPE structures may be combined or multiplexed to exhibit EPE/OPE functionality in a single, spatially-coincident layer. Multiplexed structures reduce the total number of layers of materials within a stack of eyepiece optics, each of which may be responsible for displaying a given focal depth range of a volumetric image. Volumetric phase type diffractive elements are used to offer properties including spectral bandwidth selectivity that may enable registered multi-color diffracted fields, angular multiplexing capability to facilitate tiling and field-of-view expansion without crosstalk, and all-optical, relatively simple prototyping compared to other diffractive element forms, enabling rapid design iteration.

IPC Classes

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 5/18 - Diffracting gratings
  • G02B 27/01 - Head-up displays
  • G02B 30/24 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
  • G02B 30/26 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the autostereoscopic type
  • G02F 1/1334 - Constructional arrangements based on polymer-dispersed liquid crystals, e.g. microencapsulated liquid crystals
  • G03H 1/04 - Processes or apparatus for producing holograms

98.

METHOD OF FABRICATING DISPLAY DEVICE HAVING PATTERNED LITHIUM-BASED TRANSITION METAL OXIDE

      
Application Number 18818146
Status Pending
Filing Date 2024-08-28
First Publication Date 2024-12-19
Owner Magic Leap, Inc. (USA)
Inventor
  • Melli, Mauro
  • Peroz, Christophe
  • West, Melanie Maputol

Abstract

The present disclosure generally relates to display systems, and more particularly to augmented reality display systems and methods of fabricating the same. A method of fabricating a display device includes providing a substrate comprising a lithium (Li)-based oxide and forming an etch mask pattern exposing regions of the substrate. The method additionally includes plasma etching the exposed regions of the substrate using a gas mixture comprising CHF3 to form a diffractive optical element, wherein the diffractive optical element comprises Li-based oxide features configured to diffract visible light incident thereon.

IPC Classes

  • G02B 5/18 - Diffracting gratings
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 27/01 - Head-up displays

99.

SYSTEMS AND METHODS FOR VIRTUAL AND AUGMENTED REALITY

      
Application Number 18820103
Status Pending
Filing Date 2024-08-29
First Publication Date 2024-12-19
Owner MAGIC LEAP, INC. (USA)
Inventor Berkebile, Robert David

Abstract

An apparatus for providing a virtual or augmented reality experience, includes: a screen, wherein the screen is at least partially transparent for allowing a user of the apparatus to view an object in an environment surrounding the user; a surface detector configured to detect a surface of the object; an object identifier configured to obtain an orientation and/or an elevation of the surface of the object, and to make an identification for the object based on the orientation and/or the elevation of the surface of the object; and a graphic generator configured to generate an identifier indicating the identification for the object for display by the screen, wherein the screen is configured to display the identifier.

IPC Classes

  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • G02B 27/01 - Head-up displays
  • G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/04842 - Selection of displayed objects or displayed text elements
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
  • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersectionsConnectivity analysis, e.g. of connected components
  • G06V 20/10 - Terrestrial scenes
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/18 - Eye characteristics, e.g. of the iris

100.

COMPUTATIONALLY EFFICIENT METHOD FOR COMPUTING A COMPOSITE REPRESENTATION OF A 3D ENVIRONMENT

      
Application Number 18804661
Status Pending
Filing Date 2024-08-14
First Publication Date 2024-12-12
Owner Magic Leap, Inc. (USA)
Inventor
  • Zhou, Lipu
  • Steinbruecker, Frank Thomas
  • Swaminathan, Ashwin
  • Ju, Hui
  • Koppel, Daniel Esteban
  • Zampogiannis, Konstantinos
  • Mehta, Pooja Piyush
  • Balakumar, Vinayram

Abstract

Methods and apparatus for providing a representation of an environment, for example in an XR system or any suitable computer vision and robotics application. A representation of an environment may include one or more planar features. The representation may be provided by jointly optimizing the plane parameters of the planar features and the sensor poses at which the planar features are observed. The joint optimization may be based on a reduced matrix and a reduced residual vector in lieu of the Jacobian matrix and the original residual vector.
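In the common Gauss-Newton formulation, a reduced matrix and a reduced residual vector arise naturally from the normal equations, which replace the tall Jacobian system with a small square one whose size depends only on the number of parameters, not the number of observations. A generic sketch of that idea (not the patent's specific plane/pose reduction):

```python
import numpy as np

def reduced_step(J, r):
    """One Gauss-Newton update computed from the reduced system.

    Instead of factoring the (possibly very tall) Jacobian J against the
    residual r directly, solve the much smaller normal equations
        (J^T J) delta = -J^T r,
    where J^T J plays the role of the reduced matrix and J^T r the
    reduced residual vector.
    """
    H = J.T @ J   # reduced matrix: n x n, independent of observation count
    g = J.T @ r   # reduced residual vector
    return np.linalg.solve(H, -g)
```

For a linear least-squares problem the single step already reaches the optimum, e.g. fitting y = a*x + b to three points: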

IPC Classes

  • G06T 7/00 - Image analysis
  • G06F 17/16 - Matrix or vector computation
  • G06F 17/17 - Function evaluation by approximation methods, e.g. interpolation or extrapolation, smoothing or least mean square method
  • G06T 7/70 - Determining position or orientation of objects or cameras