|
Found results for: patents
1.
|
PROTOCOL SIMULATION IN A VIRTUALIZED ROBOTIC LAB ENVIRONMENT
Application Number |
18930141 |
Status |
Pending |
Filing Date |
2024-10-29 |
First Publication Date |
2025-02-13 |
Owner |
Artificial, Inc. (USA)
|
Inventor |
- Washington, Jeff
- Budd, Geoffrey J.
- Singh, Nikhita
- Sganga, Jake
- Honda, Alexander Li
|
Abstract
A lab system identifies a set of steps associated with a protocol for a lab meant to be performed by a robot within the lab using equipment and reagents. The lab system renders, within a user interface, a virtual representation of the lab, a virtual robot, and virtual equipment and reagents. Responsive to operating in a first mode, the lab system simulates the identified set of steps to identify virtual positions of the virtual robot within the lab as the virtual robot performs the steps and modifies the virtual representation of the lab to mirror the identified positions of the virtual robot in real-time. Responsive to operating in a second mode, the lab system identifies positions of the robot within the lab as the robot performs the identified set of steps and modifies the virtual representation of the lab to mirror the identified positions of the robot in real-time.
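The dual-mode behavior described in the abstract (simulate a protocol to drive a virtual robot, or mirror a physical robot's reported positions into the same virtual lab view) can be illustrated with a minimal, hypothetical Python sketch; every class, field, and step name below is an assumption for illustration, not the patented implementation.

```python
# Hypothetical sketch only; names and data structures are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class VirtualLab:
    """Toy stand-in for the rendered virtual representation of the lab."""
    robot_position: tuple = (0.0, 0.0, 0.0)

    def update_robot(self, position):
        # A real system would re-render the virtual robot in the user interface here.
        self.robot_position = position
        print(f"virtual robot now at {position}")


def run_protocol(steps, lab, mode, read_real_position=None):
    """Drive the virtual lab in one of two modes.

    mode == "simulate": derive each pose from the protocol step itself.
    mode == "mirror":   poll the physical robot via read_real_position().
    """
    for step in steps:
        if mode == "simulate":
            position = step["target_position"]   # simulated pose for this step
        else:
            position = read_real_position()      # pose reported by the real robot
        lab.update_robot(position)


if __name__ == "__main__":
    protocol = [
        {"name": "pick up tip", "target_position": (0.1, 0.2, 0.0)},
        {"name": "aspirate reagent", "target_position": (0.4, 0.2, 0.1)},
    ]
    run_protocol(protocol, VirtualLab(), mode="simulate")
```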
IPC Classes
- B25J 9/16 - Programme controls
- B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J 19/02 - Sensing devices
- G01N 35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N 1/00 - G01N 33/00; Handling materials therefor
- G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F 3/04842 - Selection of displayed objects or displayed text elements
- G06F 40/10 - Text processing
- G06F 40/40 - Processing or translation of natural language
- G06N 20/00 - Machine learning
- G06T 7/50 - Depth or shape recovery
- G06T 7/70 - Determining position or orientation of objects or cameras
- G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
|
2.
|
TRANSLATION AND AUTOMATION OF PROTOCOLS IN A ROBOTIC LAB
Application Number |
18926826 |
Status |
Pending |
Filing Date |
2024-10-25 |
First Publication Date |
2025-02-06 |
Owner |
Artificial, Inc. (USA)
|
Inventor |
- Washington, Jeff
- Budd, Geoffrey J.
- Singh, Nikhita
- Sganga, Jake
- Honda, Alexander Li
|
Abstract
A lab system configures robots to perform protocols in labs. The lab automation system receives, via a user interface, an instruction from a user to perform a protocol within a lab. The instruction may comprise text, and the lab may comprise a robot configured to perform the protocol. The lab system converts, using a machine learned model, the text into steps and, for each step, identifies one or more of an operation, lab equipment, and reagent associated with the step. In response to detecting an ambiguity/error associated with the step, the lab system notifies the user via the user interface of the ambiguity/error. The lab system may receive one or more indications from the user that resolve the ambiguity/error and update the associated steps. For each step, the lab system configures the robot to perform an identified operation, interact with identified lab equipment, and/or access/use an identified reagent.
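The translation pipeline the abstract outlines (free text to structured steps, ambiguity detection, user resolution, then robot configuration) can be sketched hypothetically as follows; a trivial keyword matcher stands in for the machine-learned model, and all names and lookup tables are assumptions, not the patented method.

```python
# Hypothetical sketch only; a keyword matcher stands in for the ML model.
KNOWN_EQUIPMENT = {"centrifuge", "thermocycler", "plate reader"}
KNOWN_REAGENTS = {"lysis buffer", "enzyme mix"}


def parse_protocol(text):
    """Split free-text instructions into structured candidate steps."""
    steps = []
    for line in filter(None, (l.strip() for l in text.splitlines())):
        lower = line.lower()
        step = {
            "text": line,
            "operation": lower.split()[0],
            "equipment": next((e for e in KNOWN_EQUIPMENT if e in lower), None),
            "reagent": next((r for r in KNOWN_REAGENTS if r in lower), None),
            "ambiguities": [],
        }
        if step["equipment"] is None:
            step["ambiguities"].append("no equipment recognized")
        steps.append(step)
    return steps


def resolve_ambiguities(steps, ask_user):
    """Surface each ambiguity to the user and apply the returned resolution."""
    for step in steps:
        for issue in step["ambiguities"]:
            step["equipment"] = ask_user(step["text"], issue)
        step["ambiguities"].clear()
    return steps


if __name__ == "__main__":
    text = "Spin the sample in the centrifuge\nIncubate the plate with enzyme mix"
    steps = resolve_ambiguities(parse_protocol(text), ask_user=lambda s, i: "thermocycler")
    for step in steps:
        print(step)   # each resolved step would then be used to configure the robot
```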
IPC Classes
- B25J 9/16 - Programme controls
- B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J 19/02 - Sensing devices
- G01N 35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N 1/00 - G01N 33/00; Handling materials therefor
- G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F 3/04842 - Selection of displayed objects or displayed text elements
- G06F 40/10 - Text processing
- G06F 40/40 - Processing or translation of natural language
- G06N 20/00 - Machine learning
- G06T 7/50 - Depth or shape recovery
- G06T 7/70 - Determining position or orientation of objects or cameras
- G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
|
3.
|
Adapting robotic protocols between labs
Application Number |
18403484 |
Grant Number |
12246455 |
Status |
In Force |
Filing Date |
2024-01-03 |
First Publication Date |
2024-06-13 |
Grant Date |
2025-03-11 |
Owner |
Artificial, Inc. (USA)
|
Inventor |
- Washington, Jeff
- Budd, Geoffrey J.
- Singh, Nikhita
- Sganga, Jake
- Honda, Alexander Li
|
Abstract
A lab system accesses a first protocol for performance by a first robot in a first lab. The first protocol includes a set of steps, each associated with an operation, reagent, and equipment. For each of one or more steps, the lab system modifies the step by: (1) identifying one or more replacement operations that achieve an equivalent or substantially similar result as a performance of the operation, (2) identifying replacement equipment that operates substantially similarly to the equipment, and/or (3) identifying one or more replacement reagents that, when substituted for the reagent, do not substantially affect the performance of the step. The lab system generates a modified protocol by replacing one or more of the set of steps with the modified steps. The lab system selects a second lab including a second robot and configures the second robot to perform the modified protocol in the second lab.
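The step-substitution idea in the abstract (swap operations, equipment, and reagents for equivalents available in a second lab) can be illustrated with a minimal, hypothetical sketch; the hard-coded equivalence tables and lab inventories below are assumptions standing in for whatever substitution logic the real system uses.

```python
# Hypothetical sketch only; equivalence tables and inventories are illustrative.
EQUIVALENT_EQUIPMENT = {"centrifuge-A": ["centrifuge-B", "centrifuge-C"]}
EQUIVALENT_REAGENTS = {"buffer-X": ["buffer-Y"]}


def adapt_step(step, available_equipment, available_reagents):
    """Return a copy of a step with substitutions valid for the target lab."""
    adapted = dict(step)
    if step["equipment"] not in available_equipment:
        options = EQUIVALENT_EQUIPMENT.get(step["equipment"], [])
        adapted["equipment"] = next((e for e in options if e in available_equipment), None)
    if step["reagent"] not in available_reagents:
        options = EQUIVALENT_REAGENTS.get(step["reagent"], [])
        adapted["reagent"] = next((r for r in options if r in available_reagents), None)
    return adapted


def adapt_protocol(steps, target_lab):
    """Adapt every step, failing loudly if any step has no valid substitution."""
    adapted = [adapt_step(s, target_lab["equipment"], target_lab["reagents"]) for s in steps]
    if any(s["equipment"] is None or s["reagent"] is None for s in adapted):
        raise ValueError("some steps have no equivalent in the target lab")
    return adapted


if __name__ == "__main__":
    protocol = [{"operation": "spin", "equipment": "centrifuge-A", "reagent": "buffer-X"}]
    second_lab = {"equipment": {"centrifuge-B"}, "reagents": {"buffer-Y"}}
    print(adapt_protocol(protocol, second_lab))  # substituted steps for the second robot
```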
IPC Classes
- B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J 9/16 - Programme controls
- B25J 19/02 - Sensing devices
- G01N 35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N 1/00 - G01N 33/00; Handling materials therefor
- G01N 35/04 - Details of the conveyor system
- G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F 3/04842 - Selection of displayed objects or displayed text elements
- G06F 40/10 - Text processing
- G06F 40/40 - Processing or translation of natural language
- G06N 20/00 - Machine learning
- G06T 7/50 - Depth or shape recovery
- G06T 7/70 - Determining position or orientation of objects or cameras
- G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
- G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
|
4.
|
Protocol simulation in a virtualized robotic lab environment
Application Number |
18416745 |
Grant Number |
12162161 |
Status |
In Force |
Filing Date |
2024-01-18 |
First Publication Date |
2024-06-06 |
Grant Date |
2024-12-10 |
Owner |
Artificial, Inc. (USA)
|
Inventor |
- Washington, Jeff
- Budd, Geoffrey J.
- Singh, Nikhita
- Sganga, Jake
- Honda, Alexander Li
|
Abstract
A lab system identifies a set of steps associated with a protocol for a lab meant to be performed by a robot within the lab using equipment and reagents. The lab system renders, within a user interface, a virtual representation of the lab, a virtual robot, and virtual equipment and reagents. Responsive to operating in a first mode, the lab system simulates the identified set of steps to identify virtual positions of the virtual robot within the lab as the virtual robot performs the steps and modifies the virtual representation of the lab to mirror the identified positions of the virtual robot in real-time. Responsive to operating in a second mode, the lab system identifies positions of the robot within the lab as the robot performs the identified set of steps and modifies the virtual representation of the lab to mirror the identified positions of the robot in real-time.
IPC Classes
- B25J 9/16 - Programme controls
- B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J 19/02 - Sensing devices
- G01N 35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N 1/00 - G01N 33/00; Handling materials therefor
- G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F 3/04842 - Selection of displayed objects or displayed text elements
- G06F 40/10 - Text processing
- G06F 40/40 - Processing or translation of natural language
- G06N 20/00 - Machine learning
- G06T 7/50 - Depth or shape recovery
- G06T 7/70 - Determining position or orientation of objects or cameras
- G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
|
5.
|
AUTOMATED LABORATORY SCHEDULING BASED ON USER-DRAWN WORKFLOW
Application Number |
US2023034516 |
Publication Number |
2024/118149 |
Status |
In Force |
Filing Date |
2023-10-05 |
Publication Date |
2024-06-06 |
Owner |
ARTIFICIAL, INC. (USA)
|
Inventor |
- Rose, Erik
- Leedom, Benjamin C.
- Fournie, Jon
- Juhasz, James
- Sander, Slawomir
|
Abstract
A lab system configures robots to perform protocols in labs. The lab system generates an interface including a representation of lab systems within a lab associated with a schedule of tasks. Each task is associated with a pre-scheduled assay. The lab system receives, from a user via the interface, a workflow path through the lab for an assay. The workflow path is associated with an ordered subset of the lab systems used in the performance of the assay. The lab system converts the received workflow into a set of lab system tasks required to perform the assay. Each of the subset of lab systems is associated with a subset of lab system tasks. The lab system modifies the schedule of tasks to include the set of lab system tasks by optimizing a combination of the set of lab system tasks and the tasks associated with pre-scheduled assays.
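The workflow-to-schedule conversion described in the abstract can be sketched hypothetically as below; a greedy earliest-free-slot rule stands in for the optimization the abstract refers to, and every instrument name, duration, and data structure is an assumption for illustration.

```python
# Hypothetical sketch only; a greedy rule stands in for schedule optimization.
from collections import defaultdict


def expand_workflow(path, task_durations):
    """Turn an ordered, user-drawn path of lab systems into (system, duration) tasks."""
    return [(system, task_durations[system]) for system in path]


def merge_into_schedule(existing, new_tasks):
    """Append each new task at the earliest time its instrument is free.

    `existing` maps instrument -> list of (start, end) busy intervals.
    """
    schedule = defaultdict(list, {k: list(v) for k, v in existing.items()})
    earliest = 0
    for system, duration in new_tasks:
        busy_until = max((end for _, end in schedule[system]), default=0)
        start = max(earliest, busy_until)
        schedule[system].append((start, start + duration))
        earliest = start + duration   # the next step cannot begin before this one ends
    return dict(schedule)


if __name__ == "__main__":
    existing = {"liquid-handler": [(0, 15)], "plate-reader": []}
    path = ["liquid-handler", "incubator", "plate-reader"]
    durations = {"liquid-handler": 5, "incubator": 30, "plate-reader": 10}
    print(merge_into_schedule(existing, expand_workflow(path, durations)))
```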
IPC Classes
- G06Q 10/0633 - Workflow analysis
- G06Q 10/0631 - Resource planning, allocation, distributing or scheduling for enterprises or organisations
- G06Q 10/10 - Office automation; Time management
|
6.
|
AUTOMATED LABORATORY SCHEDULING BASED ON USER-DRAWN WORKFLOW
Application Number |
18469747 |
Status |
Pending |
Filing Date |
2023-09-19 |
First Publication Date |
2024-05-30 |
Owner |
Artificial, Inc. (USA)
|
Inventor |
- Rose, Erik
- Leedom, Benjamin C.
- Fournie, Jon
|
Abstract
A lab system configures robots to perform protocols in labs. The lab system generates an interface including a representation of lab systems within a lab associated with a schedule of tasks. Each task is associated with a pre-scheduled assay. The lab system receives, from a user via the interface, a workflow path through the lab for an assay. The workflow path is associated with an ordered subset of the lab systems used in the performance of the assay. The lab system converts the received workflow into a set of lab system tasks required to perform the assay. Each of the subset of lab systems is associated with a subset of lab system tasks. The lab system modifies the schedule of tasks to include the set of lab system tasks by optimizing a combination of the set of lab system tasks and the tasks associated with pre-scheduled assays.
IPC Classes
- G01N 35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N 1/00 - G01N 33/00; Handling materials therefor
|
7.
|
AUTOMATED LABORATORY WORKFLOW RECOVERY BASED ON MANUAL DATA ENTRY
Application Number |
18469749 |
Status |
Pending |
Filing Date |
2023-09-19 |
First Publication Date |
2024-05-30 |
Owner |
Artificial, Inc. (USA)
|
Inventor |
- Leedom, Benjamin C.
- Juhasz, James
- Sander, Slawomir A.
|
Abstract
A lab system configures robots to perform protocols in labs. The lab system generates an interface including a representation of lab systems within a lab associated with a schedule of tasks. Each task is associated with a pre-scheduled assay. The lab system receives, from a user via the interface, a workflow path through the lab for an assay. The workflow path is associated with an ordered subset of the lab systems used in the performance of the assay. The lab system converts the received workflow into a set of lab system tasks required to perform the assay. Each of the subset of lab systems is associated with a subset of lab system tasks. The lab system modifies the schedule of tasks to include the set of lab system tasks by optimizing a combination of the set of lab system tasks and the tasks associated with pre-scheduled assays.
IPC Classes
- G06Q 10/0631 - Resource planning, allocation, distributing or scheduling for enterprises or organisations
- G06Q 10/0633 - Workflow analysis
|
8.
|
Predictive instruction text with virtual lab representation highlighting
Application Number |
17392119 |
Grant Number |
11958198 |
Status |
In Force |
Filing Date |
2021-08-02 |
First Publication Date |
2022-02-10 |
Grant Date |
2024-04-16 |
Owner |
Artificial, Inc. (USA)
|
Inventor |
- Washington, Jeff
- Budd, Geoffrey J.
- Singh, Nikhita
- Sganga, Jake
- Honda, Alexander Li
|
Abstract
A lab automation system receives an instruction from a user to perform a protocol within a lab via an interface including a graphical representation of the lab. The lab includes a robot and a set of equipment rendered within the graphical representation of the lab. The lab automation system identifies an ambiguous term of the instruction and pieces of equipment corresponding to the ambiguous term and modifies the interface to include a predictive text interface element listing the pieces of equipment. Upon a mouseover of a listed piece of equipment within the predictive text interface element, the lab automation system modifies the graphical representation of the lab to highlight the listed piece of equipment corresponding to the mouseover. Upon a selection of the listed piece of equipment within the predictive text interface element, the lab automation system modifies the instruction to include the listed piece of equipment.
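The interaction described in the abstract (ambiguous term, candidate equipment list, mouseover highlighting, and instruction rewriting on selection) can be sketched in a hypothetical, UI-free form; the callbacks and equipment names below are assumptions standing in for the predictive-text element and graphical highlighting of the actual interface.

```python
# Hypothetical, UI-free sketch only; names and callbacks are illustrative.
LAB_EQUIPMENT = ["centrifuge 1", "centrifuge 2", "plate reader"]


def candidates_for(ambiguous_term):
    """Pieces of equipment whose names contain the ambiguous term."""
    return [e for e in LAB_EQUIPMENT if ambiguous_term.lower() in e.lower()]


def on_mouseover(equipment, highlight):
    # A real interface would highlight the equipment in the graphical lab view.
    highlight(equipment)


def on_select(instruction, ambiguous_term, equipment):
    """Rewrite the instruction so it names the selected piece of equipment."""
    return instruction.replace(ambiguous_term, equipment, 1)


if __name__ == "__main__":
    instruction = "Move the plate to the centrifuge"
    options = candidates_for("centrifuge")                 # ['centrifuge 1', 'centrifuge 2']
    on_mouseover(options[0], highlight=lambda e: print(f"highlighting {e}"))
    print(on_select(instruction, "centrifuge", options[0]))
```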
IPC Classes
- G01N 35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N 1/00 - G01N 33/00; Handling materials therefor
- B25J 9/16 - Programme controls
- B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J 19/02 - Sensing devices
- G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F 3/04842 - Selection of displayed objects or displayed text elements
- G06F 40/10 - Text processing
- G06F 40/40 - Processing or translation of natural language
- G06N 20/00 - Machine learning
- G06T 7/50 - Depth or shape recovery
- G06T 7/70 - Determining position or orientation of objects or cameras
- G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
|
9.
|
TRANSLATION AND AUTOMATION OF PROTOCOLS IN A ROBOTIC LAB
Application Number |
US2021044226 |
Publication Number |
2022/031621 |
Status |
In Force |
Filing Date |
2021-08-02 |
Publication Date |
2022-02-10 |
Owner |
ARTIFICIAL, INC. (USA)
|
Inventor |
- Washington, Jeff
- Budd, Geoffrey J.
- Singh, Nikhita
- Sganga, Jake
- Honda, Alexander Li
|
Abstract
A lab system configures robots to perform protocols in labs. The lab automation system receives, via a user interface, an instruction from a user to perform a protocol within a lab. The instruction may comprise text, and the lab may comprise a robot configured to perform the protocol. The lab system converts, using a machine learned model, the text into steps and, for each step, identifies one or more of an operation, lab equipment, and reagent associated with the step. In response to detecting an ambiguity/error associated with the step, the lab system notifies the user via the user interface of the ambiguity/error. The lab system may receive one or more indications from the user that resolve the ambiguity/error and update the associated steps. For each step, the lab system configures the robot to perform an identified operation, interact with identified lab equipment, and/or access/use an identified reagent.
IPC Classes
- G01N 35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N 1/00 - G01N 33/00; Handling materials therefor
|
10.
|
Translation and automation of protocols in a robotic lab
Application Number |
17392096 |
Grant Number |
12179367 |
Status |
In Force |
Filing Date |
2021-08-02 |
First Publication Date |
2022-02-10 |
Grant Date |
2024-12-31 |
Owner |
Artificial, Inc. (USA)
|
Inventor |
- Washington, Jeff
- Budd, Geoffrey J.
- Singh, Nikhita
- Sganga, Jake
- Honda, Alexander Li
|
Abstract
A lab system configures robots to perform protocols in labs. The lab automation system receives, via a user interface, an instruction from a user to perform a protocol within a lab. The instruction may comprise text, and the lab may comprise a robot configured to perform the protocol. The lab system converts, using a machine learned model, the text into steps and, for each step, identifies one or more of an operation, lab equipment, and reagent associated with the step. In response to detecting an ambiguity/error associated with the step, the lab system notifies the user via the user interface of the ambiguity/error. The lab system may receive one or more indications from the user that resolve the ambiguity/error and update the associated steps. For each step, the lab system configures the robot to perform an identified operation, interact with identified lab equipment, and/or access/use an identified reagent.
IPC Classes
- B25J 9/16 - Programme controls
- B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J 19/02 - Sensing devices
- G01N 35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N 1/00 - G01N 33/00; Handling materials therefor
- G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F 3/04842 - Selection of displayed objects or displayed text elements
- G06F 40/10 - Text processing
- G06F 40/40 - Processing or translation of natural language
- G06N 20/00 - Machine learning
- G06T 7/50 - Depth or shape recovery
- G06T 7/70 - Determining position or orientation of objects or cameras
- G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
|
11.
|
Protocol simulation in a virtualized robotic lab environment
Application Number |
17392105 |
Grant Number |
11919174 |
Status |
In Force |
Filing Date |
2021-08-02 |
First Publication Date |
2022-02-10 |
Grant Date |
2024-03-05 |
Owner |
Artificial, Inc. (USA)
|
Inventor |
- Washington, Jeff
- Budd, Geoffrey J.
- Singh, Nikhita
- Sganga, Jake
- Honda, Alexander Li
|
Abstract
A lab system identifies a set of steps associated with a protocol for a lab meant to be performed by a robot within the lab using equipment and reagents. The lab system renders, within a user interface, a virtual representation of the lab, a virtual robot, and virtual equipment and reagents. Responsive to operating in a first mode, the lab system simulates the identified set of steps to identify virtual positions of the virtual robot within the lab as the virtual robot performs the steps and modifies the virtual representation of the lab to mirror the identified positions of the virtual robot in real-time. Responsive to operating in a second mode, the lab system identifies positions of the robot within the lab as the robot performs the identified set of steps and modifies the virtual representation of the lab to mirror the identified positions of the robot in real-time.
IPC Classes
- G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
- B25J 9/16 - Programme controls
- B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J 19/02 - Sensing devices
- G01N 35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N 1/00 - G01N 33/00; Handling materials therefor
- G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F 3/04842 - Selection of displayed objects or displayed text elements
- G06F 40/10 - Text processing
- G06F 40/40 - Processing or translation of natural language
- G06N 20/00 - Machine learning
- G06T 7/50 - Depth or shape recovery
- G06T 7/70 - Determining position or orientation of objects or cameras
- G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
|
12.
|
Adapting robotic protocols between labs
Application Number |
17392113 |
Grant Number |
11897144 |
Status |
In Force |
Filing Date |
2021-08-02 |
First Publication Date |
2022-02-10 |
Grant Date |
2024-02-13 |
Owner |
Artificial, Inc. (USA)
|
Inventor |
- Washington, Jeff
- Budd, Geoffrey J.
- Singh, Nikhita
- Sganga, Jake
- Honda, Alexander Li
|
Abstract
A lab system accesses a first protocol for performance by a first robot in a first lab. The first protocol includes a set of steps, each associated with an operation, reagent, and equipment. For each of one or more steps, the lab system modifies the step by: (1) identifying one or more replacement operations that achieve an equivalent or substantially similar result as a performance of the operation, (2) identifying replacement equipment that operates substantially similarly to the equipment, and/or (3) identifying one or more replacement reagents that, when substituted for the reagent, do not substantially affect the performance of the step. The lab system generates a modified protocol by replacing one or more of the set of steps with the modified steps. The lab system selects a second lab including a second robot and configures the second robot to perform the modified protocol in the second lab.
IPC Classes
- B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J 19/02 - Sensing devices
- B25J 9/16 - Programme controls
- G01N 35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N 1/00 - G01N 33/00; Handling materials therefor
- G01N 35/04 - Details of the conveyor system
- G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F 40/10 - Text processing
- G06F 40/40 - Processing or translation of natural language
- G06N 20/00 - Machine learning
- G06T 7/50 - Depth or shape recovery
- G06T 7/70 - Determining position or orientation of objects or cameras
- G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
- G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06F 3/04842 - Selection of displayed objects or displayed text elements
|
13.
|
Robotics calibration in a lab environment
Application Number |
17392125 |
Grant Number |
11999066 |
Status |
In Force |
Filing Date |
2021-08-02 |
First Publication Date |
2022-02-10 |
Grant Date |
2024-06-04 |
Owner |
Artificial, Inc. (USA)
|
Inventor |
- Washington, Jeff
- Budd, Geoffrey J.
- Singh, Nikhita
- Sganga, Jake
- Honda, Alexander Li
|
Abstract
A lab system calibrates robots and cameras within a lab. The lab system accesses, via a camera within a lab, an image of a robot arm, which comprises a visible tag located on an exterior. The lab system determines a position of the robot arm using position sensors located within the robot arm and determines a location of the camera relative to the robot arm based on the determined position and the location of the tag. The lab system calibrates the camera using the determined location of the camera relative to the robot arm. After calibrating the camera, the lab system accesses, via the camera, a second image of equipment in the lab that comprises a second visible tag on an exterior. The lab system determines, based on a location of the second visible tag within the accessed second image, a location of the equipment relative to the robot arm.
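The tag-based calibration the abstract describes (use the arm's own position sensors to know where its tag is, observe the same tag with the camera, chain the two to place the camera and then any tagged equipment in the robot frame) can be illustrated with a hypothetical sketch using 4x4 homogeneous transforms; the frame names and numeric poses are assumptions, not the patented procedure.

```python
# Hypothetical sketch only; frame names and example poses are illustrative.
import numpy as np


def transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T


def calibrate_camera(robot_T_tag, camera_T_tag):
    """Camera pose in the robot frame: robot_T_camera = robot_T_tag @ inv(camera_T_tag)."""
    return robot_T_tag @ np.linalg.inv(camera_T_tag)


def locate_equipment(robot_T_camera, camera_T_equipment_tag):
    """Pose of a tagged piece of equipment in the robot frame."""
    return robot_T_camera @ camera_T_equipment_tag


if __name__ == "__main__":
    eye = np.eye(3)
    robot_T_tag = transform(eye, [0.30, 0.00, 0.20])     # arm tag pose from joint sensors
    camera_T_tag = transform(eye, [0.00, 0.10, 0.80])    # same tag as seen by the camera
    robot_T_camera = calibrate_camera(robot_T_tag, camera_T_tag)

    camera_T_rack = transform(eye, [0.05, -0.20, 0.60])  # second tag on a piece of equipment
    print(locate_equipment(robot_T_camera, camera_T_rack)[:3, 3])  # rack position in robot frame
```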
IPC Classes
- B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J 9/16 - Programme controls
- B25J 19/02 - Sensing devices
- G01N 35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N 1/00 - G01N 33/00; Handling materials therefor
- G01N 35/04 - Details of the conveyor system
- G06F 3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F 3/04842 - Selection of displayed objects or displayed text elements
- G06F 40/10 - Text processing
- G06F 40/40 - Processing or translation of natural language
- G06N 20/00 - Machine learning
- G06T 7/50 - Depth or shape recovery
- G06T 7/70 - Determining position or orientation of objects or cameras
- G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
- G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
|