Cuesta Technology Holdings, LLC

United States of America


1-13 of 13 results for Cuesta Technology Holdings, LLC
Aggregations
IP Type
  • Patent (12)
  • Trademark (1)
Date
  • 2024 (1)
  • 2022 (1)
  • Before 2020 (11)
IPC Class
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints (9)
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer (6)
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells (5)
  • A63F 13/2145 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens (5)
  • A63F 13/26 - Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth (5)
Status
  • Pending (1)
  • Registered / In Force (12)

1.

Systems and methods for extensions to alternative control of touch-based devices

      
Application Number 18825948
Status Pending
Filing Date 2024-09-05
First Publication Date 2024-12-26
Owner Cuesta Technology Holdings, LLC (USA)
Inventor
  • Flagg, Matthew
  • Barrett, Jeremy
  • Wills, Scott
  • Durkin, Sean
  • Valloppillil, Vinod
  • Mallick, Satya

Abstract

A device includes one or more input devices; one or more processors communicatively coupled to the one or more input devices; and memory storing instructions that, when executed, cause the one or more processors to receive command inputs from the one or more input devices, process the received command inputs to determine translated inputs, and perform operations based on the determined translated inputs. The translated inputs are between different input modes.
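
The abstract describes translating command inputs received in one mode into inputs of another mode before acting on them. Below is a minimal, hypothetical Python sketch of that idea; the mode names, commands, and translation table are illustrative assumptions, not the claimed implementation.

    # Hypothetical translation table between input modes; every entry is an
    # illustrative assumption, not a claimed mapping.
    MODE_TRANSLATIONS = {
        ("voice", "go back"):      ("touch", "tap",  {"x": 40, "y": 40}),
        ("gesture", "swipe_left"): ("touch", "drag", {"dx": -300, "dy": 0}),
        ("remote", "KEY_RIGHT"):   ("touch", "drag", {"dx": 300, "dy": 0}),
    }

    def translate_input(mode, command):
        # Look up a command received in one mode and return its touch-style
        # equivalent, or None if no translation exists.
        return MODE_TRANSLATIONS.get((mode, command))

    def perform(translated):
        # Stand-in for performing the operation on the device.
        target_mode, action, params = translated
        print(f"perform {action} ({target_mode}) with {params}")

    for mode, command in [("voice", "go back"), ("gesture", "swipe_left")]:
        result = translate_input(mode, command)
        if result is not None:
            perform(result)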

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

2.

Systems and methods for extensions to alternative control of touch-based devices

      
Application Number 17589975
Grant Number 12099658
Status In Force
Filing Date 2022-02-01
First Publication Date 2022-10-20
Grant Date 2024-09-24
Owner Cuesta Technology Holdings, LLC (USA)
Inventor
  • Flagg, Matthew
  • Barrett, Jeremy
  • Wills, Scott
  • Durkin, Sean
  • Valloppillil, Vinod
  • Mallick, Satya

Abstract

Systems and methods of multi-modal control of a touch-based device include receiving multi-modal control inputs from one or more of voice commands, a game controller, a handheld remote, and physical gestures detected by a sensor; converting the multi-modal control inputs into corresponding translated inputs which correspond to physical inputs recognizable by the touch-based device; and providing the corresponding translated inputs to the touch-based device for control thereof, wherein the translated inputs are utilized by the touch-based device as corresponding physical inputs to control underlying applications executed on the touch-based device which expect the corresponding physical inputs.
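
As a rough illustration of the receive, convert, and provide pipeline in this abstract, the Python sketch below normalises inputs from several modalities into synthetic touch events. The TouchEvent type, the conversion rules, and the provide stub are assumptions made only to keep the example self-contained; they are not the patented method.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TouchEvent:
        kind: str   # "tap" or "swipe"
        x: int
        y: int

    def convert(source: str, payload: dict) -> Optional[TouchEvent]:
        # Convert a multi-modal control input into a touch event the
        # underlying application already understands (rules are assumptions).
        if source == "voice" and payload.get("utterance") == "select":
            return TouchEvent("tap", payload.get("x", 0), payload.get("y", 0))
        if source == "gamepad" and payload.get("button") == "A":
            return TouchEvent("tap", 960, 540)        # assumed screen centre
        if source == "gesture" and payload.get("name") == "swipe_right":
            return TouchEvent("swipe", 200, 540)
        return None

    def provide(event: TouchEvent) -> None:
        # Stand-in for injecting the synthetic touch into the device.
        print(f"inject {event.kind} at ({event.x}, {event.y})")

    for source, payload in [("voice", {"utterance": "select", "x": 100, "y": 200}),
                            ("gamepad", {"button": "A"})]:
        event = convert(source, payload)
        if event is not None:
            provide(event)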

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • A63F 13/2145 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
  • A63F 13/26 - Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
  • A63F 13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/042 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
  • G06F 3/16 - Sound input; Sound output
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

3.

Interactive imaging systems and methods for motion control by users

      
Application Number 16531219
Grant Number 11006103
Status In Force
Filing Date 2019-08-05
First Publication Date 2019-11-21
Grant Date 2021-05-11
Owner Cuesta Technology Holdings, LLC (USA)
Inventor
  • Flagg, Matthew
  • Roberts, Greg

Abstract

In various embodiments, the present invention provides a system and associated methods of calibration and use for an interactive imaging environment based on the optimization of parameters used in various segmentation algorithm techniques. These methods address the challenge of automatically calibrating an interactive imaging system so that it is capable of aligning human body motion, or the like, to a visual display. As such, the present invention provides a system and method of automatically and rapidly aligning the motion of an object to a visual display.
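
The calibration idea here, optimising segmentation parameters so that body motion lines up with the display, can be pictured with the small Python sketch below: a grid search over a single background-subtraction threshold scored against a reference mask. The one-parameter model and the IoU score are simplifying assumptions, not the actual algorithm.

    import numpy as np

    def segment(frame, background, threshold):
        # Binary foreground mask from background subtraction; the single
        # threshold is the only "parameter" being calibrated in this sketch.
        return np.abs(frame.astype(float) - background.astype(float)) > threshold

    def iou(mask, reference):
        # Intersection-over-union between a candidate mask and a reference.
        union = np.logical_or(mask, reference).sum()
        return np.logical_and(mask, reference).sum() / union if union else 0.0

    def calibrate(frame, background, reference, thresholds=range(5, 100, 5)):
        # Pick the threshold whose mask best matches the reference mask.
        return max(thresholds, key=lambda t: iou(segment(frame, background, t), reference))

    # Toy data: a bright square on a dark background with a known reference mask.
    background = np.zeros((64, 64), dtype=np.uint8)
    frame = background.copy()
    frame[20:40, 20:40] = 200
    reference = frame > 100
    print("best threshold:", calibrate(frame, background, reference))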

IPC Classes

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • H04N 17/00 - Diagnosis, testing or measuring for television systems or their details
  • G06T 7/11 - Region-based segmentation
  • G06T 7/194 - Segmentation; Edge detection involving foreground-background segmentation
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
  • H04N 5/33 - Transforming infrared radiation

4.

Systems and methods for extensions to alternative control of touch-based devices

      
Application Number 16132541
Grant Number 11237638
Status In Force
Filing Date 2018-09-17
First Publication Date 2019-01-24
Grant Date 2022-02-01
Owner Cuesta Technology Holdings, LLC (USA)
Inventor
  • Flagg, Matthew
  • Barrett, Jeremy
  • Wills, Scott
  • Durkin, Sean
  • Valloppillil, Vinod
  • Mallick, Satya

Abstract

Systems and methods of multi-modal control of a touch-based device include receiving multi-modal control inputs from one or more of voice commands, a game controller, a handheld remote, and physical gestures detected by a sensor; converting the multi-modal control inputs into corresponding translated inputs which correspond to physical inputs recognizable by the touch-based device; and providing the corresponding translated inputs to the touch-based device for control thereof, wherein the translated inputs are utilized by the touch-based device as corresponding physical inputs to control underlying applications executed on the touch-based device which expect the corresponding physical inputs.

IPC Classes

  • G06F 3/16 - Sound input; Sound output
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
  • G06F 3/042 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • A63F 13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • A63F 13/26 - Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • A63F 13/2145 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens

5.

Interactive imaging systems and methods for motion control by users

      
Application Number 16047090
Grant Number 10375384
Status In Force
Filing Date 2018-07-27
First Publication Date 2018-11-29
Grant Date 2019-08-06
Owner Cuesta Technology Holdings, LLC (USA)
Inventor
  • Flagg, Matthew
  • Roberts, Greg

Abstract

In various embodiments, the present invention provides a system and associated methods of calibration and use for an interactive imaging environment based on the optimization of parameters used in various segmentation algorithm techniques. These methods address the challenge of automatically calibrating an interactive imaging system so that it is capable of aligning human body motion, or the like, to a visual display. As such, the present invention provides a system and method of automatically and rapidly aligning the motion of an object to a visual display.

IPC Classes

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • H04N 17/00 - Diagnosis, testing or measuring for television systems or their details
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
  • H04N 5/33 - Transforming infrared radiation
  • G06T 7/11 - Region-based segmentation
  • G06T 7/194 - Segmentation; Edge detection involving foreground-background segmentation

6.

Multi-modal input control of touch-based devices

      
Application Number 15602691
Grant Number 10108271
Status In Force
Filing Date 2017-05-23
First Publication Date 2017-09-07
Grant Date 2018-10-23
Owner Cuesta Technology Holdings, LLC (USA)
Inventor
  • Flagg, Matthew
  • Barrett, Jeremy
  • Wills, Scott
  • Durkin, Sean
  • Valloppillil, Vinod
  • Mallick, Satya

Abstract

Systems and methods configured to facilitate multi-modal user inputs in lieu of physical input for a processing device configured to execute an application include obtaining non-physical input for the processing device and the application, wherein the physical input comprises one or more of touch-based input and tilt input; processing the non-physical input to convert it into appropriate physical input commands for the application; and providing the physical input commands to the processing device.
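
One way to picture "non-physical input converted into physical input commands" is the small Python sketch below, which maps a tracked hand position to a synthetic tilt command. The coordinate ranges, the roll/pitch mapping, and the send_tilt stub are assumptions for illustration only.

    def hand_to_tilt(hand_x, hand_y, frame_w=640, frame_h=480):
        # Map a hand position in camera coordinates to roll/pitch in degrees,
        # as though the device itself had been tilted; ranges are assumptions.
        roll = ((hand_x / frame_w) - 0.5) * 60.0    # roughly -30..+30 degrees
        pitch = ((hand_y / frame_h) - 0.5) * 60.0
        return {"roll": round(roll, 1), "pitch": round(pitch, 1)}

    def send_tilt(command):
        # Stand-in for feeding the synthetic tilt command to the application.
        print("tilt command:", command)

    send_tilt(hand_to_tilt(480, 120))   # hand to the right of and above centre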

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06F 3/16 - Sound input; Sound output
  • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/042 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
  • A63F 13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
  • A63F 13/26 - Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
  • A63F 13/2145 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells

7.

Interactive imaging systems and methods for motion control by users

      
Application Number 15603018
Grant Number 10038896
Status In Force
Filing Date 2017-05-23
First Publication Date 2017-09-07
Grant Date 2018-07-31
Owner Cuesta Technology Holdings, LLC (USA)
Inventor
  • Flagg, Matthew
  • Roberts, Greg

Abstract

In various embodiments, the present invention provides a system and associated methods of calibration and use for an interactive imaging environment based on the optimization of parameters used in various segmentation algorithm techniques. These methods address the challenge of automatically calibrating an interactive imaging system so that it is capable of aligning human body motion, or the like, to a visual display. As such, the present invention provides a system and method of automatically and rapidly aligning the motion of an object to a visual display.

IPC Classes

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • H04N 17/00 - Diagnosis, testing or measuring for television systems or their details
  • G06T 7/11 - Region-based segmentation
  • G06T 7/194 - Segmentation; Edge detection involving foreground-background segmentation
  • G06F 3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • H04N 5/33 - Transforming infrared radiation

8.

Interactive imaging systems and methods for motion control by users

      
Application Number 15097748
Grant Number 09674514
Status In Force
Filing Date 2016-04-13
First Publication Date 2016-08-04
Grant Date 2017-06-06
Owner CUESTA TECHNOLOGY HOLDINGS, LLC (USA)
Inventor
  • Flagg, Matthew
  • Roberts, Greg

Abstract

In various embodiments, the present invention provides a system and associated methods of calibration and use for an interactive imaging environment based on the optimization of parameters used in various segmentation algorithm techniques. These methods address the challenge of automatically calibrating an interactive imaging system so that it is capable of aligning human body motion, or the like, to a visual display. As such, the present invention provides a system and method of automatically and rapidly aligning the motion of an object to a visual display.

IPC Classes

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • H04N 17/00 - Diagnosis, testing or measuring for television systems or their details
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
  • H04N 5/33 - Transforming infrared radiation
  • G06T 7/11 - Region-based segmentation
  • G06T 7/194 - Segmentation; Edge detection involving foreground-background segmentation

9.

Systems and methods for extensions to alternative control of touch-based devices

      
Application Number 14923888
Grant Number 09671874
Status In Force
Filing Date 2015-10-27
First Publication Date 2016-02-25
Grant Date 2017-06-06
Owner CUESTA TECHNOLOGY HOLDINGS, LLC (USA)
Inventor
  • Flagg, Matthew
  • Barrett, Jeremy
  • Wills, Scott
  • Durkin, Sean
  • Valloppillil, Vinod

Abstract

Systems and methods configured to facilitate multi-modal user inputs in lieu of physical input for a processing device configured to execute an application include obtaining non-physical input for the processing device and the application, wherein the physical input comprises one or more of touch-based input and tilt input; processing the non-physical input to convert it into appropriate physical input commands for the application; and providing the physical input commands to the processing device.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • A63F 13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
  • G06F 3/16 - Sound input; Sound output
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
  • G06F 3/042 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • A63F 13/26 - Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • A63F 13/2145 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
  • A63F 13/67 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use

10.

Systems and methods for alternative control of touch-based devices

      
Application Number 14075742
Grant Number 09658695
Status In Force
Filing Date 2013-11-08
First Publication Date 2014-05-08
Grant Date 2017-05-23
Owner CUESTA TECHNOLOGY HOLDINGS, LLC (USA)
Inventor
  • Flagg, Matthew
  • Barrett, Jeremy
  • Wills, Scott
  • Durkin, Sean
  • Valloppillil, Vinod

Abstract

A computer-implemented method, a system, and software include providing output from a touch-based device to an external display; detecting gestures from a user located away from and not physically touching the touch-based device; and translating the detected gestures into appropriate commands for the touch-based device. The systems and methods provide alternative control of touch-based devices such as mobile devices. The systems and methods can include a mobile device coupled to an external display device and controlled via user gestures monitored by a collocated sensor. Accordingly, the systems and methods allow users to operate applications ("apps") on the mobile device, displayed on the external display device, without touching the mobile device, using gestures monitored by the collocated sensor. This enables the wide variety of rich apps to be operated in a new manner.
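
The abstract's arrangement, a collocated sensor watching the user while the mobile device's output appears on an external display, suggests a coordinate-mapping step like the hypothetical Python sketch below; the sensor and screen resolutions and the gesture names are assumptions, not values from the patent.

    SENSOR_RES = (512, 424)     # assumed depth-sensor frame size
    DEVICE_RES = (1080, 1920)   # assumed phone screen size (portrait)

    def sensor_to_device(sx, sy):
        # Scale a point seen by the collocated sensor into device-screen space.
        return (int(sx / SENSOR_RES[0] * DEVICE_RES[0]),
                int(sy / SENSOR_RES[1] * DEVICE_RES[1]))

    def gesture_to_command(name, sx, sy):
        # Translate a detected gesture into a touch command for the device;
        # the gesture names and their meanings are assumptions.
        x, y = sensor_to_device(sx, sy)
        if name == "push":          # a forward push acts as a tap
            return {"type": "tap", "x": x, "y": y}
        if name == "wave_left":     # a wave acts as a horizontal swipe
            return {"type": "swipe", "x": x, "y": y, "dx": -400, "dy": 0}
        return {"type": "none"}

    print(gesture_to_command("push", 256, 212))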

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
  • A63F 13/26 - Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
  • A63F 13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
  • A63F 13/2145 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens

11.

System and method for enabling meaningful interaction with video based characters and objects

      
Application Number 13243486
Grant Number 08538153
Status In Force
Filing Date 2011-09-23
First Publication Date 2012-02-09
Grant Date 2013-09-17
Owner CUESTA TECHNOLOGY HOLDINGS, LLC (USA)
Inventor
  • Flagg, Matthew
  • Roberts, Greg

Abstract

The present disclosure provides a system and method for enabling meaningful body-to-body interaction with virtual video-based characters or objects in an interactive imaging environment including: capturing a corpus of video-based interaction data, processing the captured video using a segmentation process that corresponds to the capture setup in order to generate binary video data, labeling the corpus by assigning a description to clips of silhouette video, processing the labeled corpus of silhouette motion data to extract horizontal and vertical projection histograms for each frame of silhouette data, and estimating the motion state automatically from each frame of segmentation data using the processed model. Virtual characters or objects are represented using video captured from video-based motion, thereby creating the illusion of real characters or objects in an interactive imaging experience.
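
The horizontal and vertical projection histograms mentioned in this abstract can be sketched in a few lines of Python, shown below with a toy nearest-neighbour state estimate; the exemplar labels and the distance-based classifier are assumptions added only to make the example runnable.

    import numpy as np

    def projection_features(silhouette):
        # Concatenate normalised row and column sums of a binary silhouette:
        # the horizontal and vertical projection histograms for one frame.
        h = silhouette.sum(axis=1).astype(float)
        v = silhouette.sum(axis=0).astype(float)
        feats = np.concatenate([h, v])
        total = feats.sum()
        return feats / total if total else feats

    def estimate_state(frame, exemplars):
        # Nearest labelled exemplar in feature space (an assumption made to
        # keep the sketch self-contained).
        f = projection_features(frame)
        return min(exemplars, key=lambda label: np.linalg.norm(f - exemplars[label]))

    # Toy silhouettes: a tall blob ("standing") and a wide blob ("crouching").
    standing = np.zeros((32, 32), dtype=np.uint8)
    standing[4:28, 14:18] = 1
    crouching = np.zeros((32, 32), dtype=np.uint8)
    crouching[20:28, 8:24] = 1
    exemplars = {"standing": projection_features(standing),
                 "crouching": projection_features(crouching)}
    print(estimate_state(standing, exemplars))   # -> standing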

IPC Classes

  • G06K 9/34 - Segmentation of touching or overlapping patterns in the image field
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06K 9/36 - Image preprocessing, i.e. processing the image information without deciding about the identity of the image

12.

System and associated methods of calibration and use for an interactive imaging environment

      
Application Number 13243071
Grant Number 08867835
Status In Force
Filing Date 2011-09-23
First Publication Date 2012-01-19
Grant Date 2014-10-21
Owner CUESTA TECHNOLOGY HOLDINGS, LLC (USA)
Inventor
  • Flagg, Matthew
  • Roberts, Greg

Abstract

In various embodiments, the present invention provides a system and associated methods of calibration and use for an interactive imaging environment based on the optimization of parameters used in various segmentation algorithm techniques. These methods address the challenge of automatically calibrating an interactive imaging system so that it is capable of aligning human body motion, or the like, to a visual display. As such, the present invention provides a system and method of automatically and rapidly aligning the motion of an object to a visual display.

IPC Classes

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06T 7/00 - Image analysis
  • G06K 9/34 - Segmentation of touching or overlapping patterns in the image field

13.

PLAYMOTION

      
Serial Number 78660473
Status Registered
Filing Date 2005-06-29
Registration Date 2006-06-20
Owner CUESTA TECHNOLOGY HOLDINGS, LLC
NICE Classes

  • 09 - Scientific and electric apparatus and instruments

Goods & Services

Software for processing images, graphics, and/or text