An autonomous robot drive assembly includes a force sensing assembly: a force sensing handlebar mounted in an orientation that allows a user to manipulate the robot. The user manipulates the handlebar by applying force to move it from a neutral position. That manipulation causes instructions to be determined for operating the robot, and based on the manipulation, a drive assembly of the robot moves the robot in accordance with those instructions.
An autonomous robot drive assembly includes a plurality of drive units. The plurality of drive units may allow for movement and control of the autonomous robot. Each of the plurality of drive units is configured to be oriented independently of the other drive units. Each drive unit may include a plurality of independently operable driven wheels. Each drive unit may further include a drive unit coupling that allows the drive unit to rotate independently of other portions of the autonomous robot. The drive unit coupling may not be driven and may be configured to rotate freely.
A mechanical drive unit for a robot may be controlled by receiving, from a force sensor, an input message characterizing a physical force exerted on the force sensor in a first direction. A physical force input vector quantifying the physical force in two or more dimensions may be determined based on the input message. A force output vector, which aggregates the physical force input vector and a second force input vector and quantifies a force to apply to move the robot in a second direction, may be determined at least in part by applying a force multiplier to the physical force input vector. An indication of the force output vector may be transmitted to the mechanical drive unit via a communication interface. The robot may then be moved via the mechanical drive unit in the second direction based on the force output vector.
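The force aggregation described above can be sketched in a few lines. This is a minimal illustration, not the patented implementation; the function name, the vector shapes, and the example values are all assumptions.

```python
import numpy as np

def force_output_vector(physical_input, second_input, multiplier):
    """Aggregate the user's physical force input (scaled by a force
    multiplier) with a second force input to produce the force output
    vector commanded to the drive unit. Names are illustrative."""
    physical = np.asarray(physical_input, dtype=float)
    second = np.asarray(second_input, dtype=float)
    return multiplier * physical + second

# A 5 N push along +x, amplified 2x, combined with a small corrective
# force along +y from another source.
out = force_output_vector([5.0, 0.0], [0.0, 1.0], multiplier=2.0)
```

The multiplier lets a light touch on the handlebar translate into a larger assistive force from the drive unit.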
An autonomous robotic cart includes a chassis, sensors coupled with the chassis, visible light cameras, and a handlebar unit coupled with the chassis. The handlebar unit includes a handlebar and a force sensor configured to detect a translational force and a rotational force exerted on the handlebar. The autonomous robotic cart also includes a holonomic and omnidirectional mechanical drive unit coupled with the chassis. The autonomous robotic cart is configured to autonomously navigate a physical environment to execute one or more navigation goals determined based on communication with a remote computing system configured to manage a fleet of robots including the autonomous robotic cart. The cart is also configured to move translationally and rotationally in a direction corresponding to an output force vector determined based on sensor data.
An omnidirectional mechanical drive unit in a robot may be controlled by a processor. An input message characterizing a physical force exerted on a force sensor in a first direction may be received. A physical force input vector quantifying the physical force in two or more dimensions may be determined based on the input message. Upon determining that a triggering condition for navigational feedback is satisfied, a haptic force input vector for providing haptic navigational feedback via the omnidirectional mechanical drive unit may be determined. A force output vector aggregating the physical force input vector and the haptic force input vector may be determined. The force output vector may quantify a force to apply to move the robot in a second direction. An indication of the force output vector may be transmitted to the omnidirectional mechanical drive unit, and the robot may be moved based on the force output vector.
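The conditional aggregation of the physical and haptic force vectors might look like the following sketch. The function name, the boolean trigger, and the example vectors are assumptions for illustration only.

```python
import numpy as np

def resultant_force(physical_input, haptic_input, feedback_active):
    """Combine the user's physical force input with a haptic feedback
    force vector when the triggering condition is satisfied."""
    total = np.asarray(physical_input, dtype=float)
    if feedback_active:
        # The haptic vector nudges the commanded motion, e.g. to steer
        # the user away from an obstacle the robot has detected.
        total = total + np.asarray(haptic_input, dtype=float)
    return total

# Pushing forward while feedback steers slightly to one side:
v = resultant_force([3.0, 0.0], [0.0, -1.0], feedback_active=True)
```

Because the feedback force is delivered through the same drive unit that moves the robot, the user feels it as resistance or pull on the handlebar.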
A computing system may be configured as a fleet controller for autonomous mobile robots operating within a physical environment. The system may include a communication interface receiving sensor data from the robots, including image data captured by visible light cameras located on the robots; an environment mapper determining a global scene graph representing the environment and identifying navigable regions of the environment; a workflow coordinator determining a workflow including tasks to be performed within the environment by one or more of the robots in cooperation with a human; and a route planner configured to determine routing information for the one or more robots, including a nominal route from a source location to a destination location. The robots may be configured to autonomously navigate the environment to execute the tasks based on the routing information.
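The component decomposition named in the abstract (environment mapper, workflow coordinator, route planner) could be organized along these lines. All class and field names here are hypothetical; the abstract does not specify an API.

```python
from dataclasses import dataclass, field

@dataclass
class Route:
    """A nominal route between two named locations."""
    source: str
    destination: str
    waypoints: list = field(default_factory=list)

@dataclass
class FleetController:
    """Illustrative grouping of the fleet controller's components."""
    scene_graph: dict = field(default_factory=dict)   # environment mapper output
    workflows: list = field(default_factory=list)     # workflow coordinator output

    def plan_route(self, source, destination):
        # Route planner stub: returns a nominal route; a real planner
        # would search the scene graph's navigable regions.
        return Route(source, destination)

r = FleetController().plan_route("dock", "aisle-7")
```

The point of the sketch is the separation of concerns: mapping, task coordination, and routing are distinct components that the robots consume through routing information.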
This application describes systems, devices, computer readable media, and methods for the function and operation of robotic carts. A robotic cart may include a base component configured to receive a payload, a battery unit, and a mobility apparatus. The robotic cart may include a handlebar unit coupled with the base component. The handlebar unit may include a sensor unit configured to transmit a hand detection message when the handlebar unit is grasped by one or more hands and to transmit a force direction message indicating a two-dimensional direction associated with a directional force applied by one or more hands. The robotic cart may be configured to map the area around it and to autonomously move along a path to perform a task.
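The two message types the handlebar's sensor unit transmits could be handled with a small dispatcher like the one below. The message shapes and field names are assumptions, not a documented protocol.

```python
def handle_handlebar_message(msg, state):
    """Update cart state from a handlebar sensor message.

    Two hypothetical message types, mirroring the description:
    - hand_detection: whether the handlebar is currently grasped
    - force_direction: 2-D direction of the applied directional force
    """
    if msg["type"] == "hand_detection":
        state["hands_on"] = msg["grasped"]
    elif msg["type"] == "force_direction":
        state["direction"] = (msg["dx"], msg["dy"])
    return state

state = handle_handlebar_message({"type": "hand_detection", "grasped": True}, {})
state = handle_handlebar_message({"type": "force_direction", "dx": 1.0, "dy": 0.0}, state)
```

Separating hand detection from force direction lets the cart ignore force readings unless a hand is actually on the handlebar.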
One or more simulated capture paths through a physical environment may be determined for a robot based on an environment navigation model of the physical environment. A plurality of simulated object parameter values may be determined for an object type. Simulated sensor data for a plurality of simulated instances of the object type may be determined based on the one or more simulated capture paths, the environment navigation model, and the simulated object parameter values. An object recognition model may then be trained to recognize an object corresponding with the object type based on the simulated sensor data.
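The data-generation loop described above amounts to sampling object parameters and pairing each sample with a capture path. This is a bare sketch under assumed names; the actual rendering of sensor data from the environment model is elided.

```python
import random

def simulated_training_samples(capture_paths, param_ranges, n):
    """Generate n simulated object instances by sampling parameter
    values uniformly within given ranges and pairing each instance
    with a simulated capture path (rendering step omitted)."""
    samples = []
    for _ in range(n):
        params = {name: random.uniform(lo, hi)
                  for name, (lo, hi) in param_ranges.items()}
        samples.append({"path": random.choice(capture_paths),
                        "params": params})
    return samples

data = simulated_training_samples(
    capture_paths=["aisle_sweep", "loop"],
    param_ranges={"scale": (0.8, 1.2), "yaw": (0.0, 6.28)},
    n=100,
)
```

Varying parameters such as scale and orientation across instances is what lets a recognition model trained on simulated data generalize to real observations of the object type.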
G05D 1/02 - Control of position or course in two dimensions
G06V 10/774 - Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
14.
PLACE ENROLLMENT IN A ROBOTIC CART COORDINATION SYSTEM
An initial environment navigation model for a physical environment may be determined based on sensor data collected from a mobile enrollment device. The sensor data may include data collected from a first one or more cameras at the mobile enrollment device. The initial environment navigation model may be sent to a robot via a communication interface. The robot may be instructed to autonomously navigate the physical environment based on the initial environment navigation model and additional sensor data collected by the robot. An updated environment navigation model for the physical environment may be determined based on the initial environment navigation model and the additional sensor data.
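One simple way to picture the model update is fusing the enrollment-device map with the robot's own observations, for example as a weighted blend of occupancy estimates. The fusion rule and all names below are illustrative assumptions; the abstract does not specify how the models are combined.

```python
import numpy as np

def refine_model(initial_occupancy, robot_occupancy, robot_weight=0.6):
    """Blend an initial occupancy grid built from the mobile enrollment
    device's cameras with occupancy estimated from the robot's own
    sensor data. A weighted average is one plausible fusion rule."""
    return (1.0 - robot_weight) * initial_occupancy + robot_weight * robot_occupancy

initial = np.array([[0.0, 1.0],
                    [0.5, 0.0]])   # from the enrollment walkthrough
observed = np.array([[0.0, 0.0],
                     [1.0, 0.0]])  # from the robot's later traversal
updated = refine_model(initial, observed)
```

Weighting the robot's data more heavily reflects that it is newer and collected from the robot's own sensor viewpoint.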
G05D 1/02 - Control of position or course in two dimensions
B60L 3/00 - Electric devices on electrically-propelled vehicles for safety purposes; Monitoring operating variables, e.g. speed, deceleration or energy consumption
B62D 37/00 - Stabilising vehicle bodies without controlling suspension arrangements
B62D 51/00 - Motor vehicles characterised by the driver not being seated
09 - Scientific and electric apparatus and instruments
42 - Scientific, technological and industrial services, research and design
Goods & Services
Downloadable software allowing interaction between humans and mobile interactive robots, logistics and fulfillment mobile robots, automated human collaborative robots, and fleets of mobile robots; downloadable software for controlling, managing, and regulating mobile interactive robots, logistics and fulfillment mobile robots, automated human collaborative robots, and fleets of mobile robots; downloadable no-code tools for crafting mobile interactive robot behavior. Non-downloadable software allowing interaction between humans and mobile interactive robots, logistics and fulfillment mobile robots, automated human collaborative robots, and fleets of mobile robots (term considered too vague by the International Bureau - Rule 13 (2) (b) of the Regulations); non-downloadable software for controlling, managing, and regulating mobile interactive robots, logistics and fulfillment mobile robots, automated human collaborative robots, and fleets of mobile robots (term considered too vague by the International Bureau - Rule 13 (2) (b) of the Regulations); non-downloadable no-code tools for crafting mobile interactive robot behavior (term considered too vague by the International Bureau - Rule 13 (2) (b) of the Regulations); maintenance and updates of no-code tools for allowing interaction between humans and mobile interactive robots, logistics and fulfillment mobile robots, automated human collaborative robots, and fleets of mobile robots; maintenance and updates of no-code tools for crafting mobile interactive robot behavior.
09 - Scientific and electric apparatus and instruments
37 - Construction and mining; installation and repair services
42 - Scientific, technological and industrial services, research and design
Goods & Services
Mobile interactive robots; logistics and fulfillment mobile robots; automated human collaborative robots; panels for controlling, managing, and regulating mobile interactive robots, logistics and fulfillment mobile robots, and automated human collaborative robots; downloadable computer software for controlling, managing, and regulating mobile interactive robots, logistics and fulfillment mobile robots, and automated human collaborative robots. Maintenance, servicing, and repair of mobile interactive robots, logistics and fulfillment mobile robots, and automated human collaborative robots. Installation and maintenance of computer software for mobile interactive robots, logistics and fulfillment mobile robots, and automated human collaborative robots; providing non-downloadable software for controlling, managing, and regulating mobile interactive robots, logistics and fulfillment mobile robots, and automated human collaborative robots.
37 - Construction and mining; installation and repair services
42 - Scientific, technological and industrial services, research and design
Goods & Services
Maintenance, servicing, and repair of mobile interactive robots including mobile interactive carts, logistics and fulfillment mobile robotic devices, and automated human collaborative robots. Installation and maintenance of computer software for mobile interactive robots including mobile interactive carts, logistics and fulfillment mobile robotic devices, and automated human collaborative robots.
37 - Construction and mining; installation and repair services
42 - Scientific, technological and industrial services, research and design
Goods & Services
Maintenance and updates of no-code tools for allowing interaction between humans and mobile interactive robots, logistics and fulfillment mobile robots, automated human collaborative robots, and fleets of mobile robots; maintenance and updates of no-code tools for crafting mobile interactive robot behavior. Providing online, non-downloadable software for allowing interaction between humans and mobile interactive robots, logistics and fulfillment mobile robots, automated human collaborative robots, and fleets of mobile robots; providing online, non-downloadable software for controlling, managing, and regulating mobile interactive robots, logistics and fulfillment mobile robots, automated human collaborative robots, and fleets of mobile robots; providing online, non-downloadable no-code tools for crafting mobile interactive robot behavior.
37 - Construction and mining; installation and repair services
42 - Scientific, technological and industrial services, research and design
Goods & Services
Maintenance, servicing, and repair of mobile interactive robots, logistics and fulfillment mobile robots, and automated human collaborative robots. Installation and maintenance of computer software for mobile interactive robots, logistics and fulfillment mobile robots, and automated human collaborative robots; providing online, non-downloadable software for controlling, managing, and regulating mobile interactive robots, logistics and fulfillment mobile robots, and automated human collaborative robots.
A cleaning robot may determine a three-dimensional model of a physical environment based on data collected from one or more sensors. The cleaning robot may then identify a surface within the physical environment to clean. Having identified that surface, the robot may autonomously navigate to a location proximate to the surface, position an ultraviolet light source in proximity to the surface, and activate the ultraviolet light source for a period of time.
A model of a physical environment may be determined based at least in part on sensor data collected by one or more sensors at a robot. The model may include a plurality of constraints and a plurality of data values. A trajectory through the physical environment may be determined for an ultraviolet end effector coupled with the robot to clean one or more surfaces in the physical environment. The ultraviolet end effector may include one or more ultraviolet light sources. The ultraviolet end effector may be moved along the trajectory.
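How long the end effector must dwell over each surface patch follows from the standard relation that delivered UV dose equals irradiance multiplied by exposure time. The function below is an illustrative calculation only; the abstract does not state specific dose targets or irradiance values.

```python
def dwell_time_s(target_dose_mj_per_cm2, irradiance_mw_per_cm2):
    """Exposure time (seconds) for a UV source to deliver a target dose,
    using dose (mJ/cm^2) = irradiance (mW/cm^2) x time (s)."""
    return target_dose_mj_per_cm2 / irradiance_mw_per_cm2

# e.g. a 30 mJ/cm^2 target at 5 mW/cm^2 of surface irradiance:
t = dwell_time_s(30.0, 5.0)
```

A trajectory planner would use such per-patch dwell times to set the speed at which the end effector sweeps along each surface.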
A47L 11/40 - Parts or details of machines not provided for in groups , or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G05D 1/02 - Control of position or course in two dimensions
A robot may identify a human located proximate to the robot in a physical environment based on sensor data captured from one or more sensors on the robot. A trajectory of the human through space may be predicted. When the predicted trajectory of the human intersects with a current path of the robot, an updated path to a destination location in the environment may be determined so as to avoid a collision between the robot and the human along the predicted trajectory. The robot may then move along the determined path.
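The prediction-and-check step described above can be sketched with a constant-velocity extrapolation and a minimum-clearance test. Constant velocity is a simple baseline, not necessarily the predictor the abstract has in mind, and all names and thresholds are assumptions.

```python
import numpy as np

def predict_human_path(position, velocity, horizon_s, dt=0.1):
    """Extrapolate a human's future positions assuming constant velocity."""
    steps = int(horizon_s / dt)
    p = np.asarray(position, dtype=float)
    v = np.asarray(velocity, dtype=float)
    return [p + v * dt * (i + 1) for i in range(steps)]

def paths_intersect(robot_path, human_path, clearance_m=0.5):
    """True if robot and human come within the clearance distance at
    any common timestep, i.e. the predicted trajectory intersects the
    robot's current path."""
    return any(np.linalg.norm(r - h) < clearance_m
               for r, h in zip(robot_path, human_path))

# Human walking along +x toward a robot holding position at (1, 0):
human = predict_human_path([0.0, 0.0], [1.0, 0.0], horizon_s=1.0)
robot = predict_human_path([1.0, 0.0], [0.0, 0.0], horizon_s=1.0)
conflict = paths_intersect(robot, human)
```

When such a conflict is detected, the planner would search for an updated path to the destination that keeps the clearance to every predicted human position.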
G01C 21/00 - Navigation; Navigational instruments not provided for in groups
A47L 11/40 - Parts or details of machines not provided for in groups , or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G05D 1/249 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons from positioning sensors located off-board the vehicle, e.g. from cameras
G05D 1/617 - Safety or protection, e.g. defining protection zones around obstacles or avoiding hazards
G05D 1/689 - Pointing payloads towards fixed or moving targets
A cleaning robot may determine a three-dimensional model of a physical environment based on data collected from one or more sensors. The cleaning robot may then identify a surface within the physical environment to clean. Having identified that surface, the robot may autonomously navigate to a location proximate to the surface, position an ultraviolet light source in proximity to the surface, and activate the ultraviolet light source for a period of time.
A61L 2/24 - Apparatus using programmed or automatic operation
A47L 11/40 - Parts or details of machines not provided for in groups , or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
A robot may identify a human located proximate to the robot in a physical environment based on sensor data captured from one or more sensors on the robot. A trajectory of the human through space may be predicted. When the predicted trajectory of the human intersects with a current path of the robot, an updated path to a destination location in the environment may be determined so as to avoid a collision between the robot and the human along the predicted trajectory. The robot may then move along the determined path.
42 - Scientific, technological and industrial services, research and design
Goods & Services
Providing on-line non-downloadable software using artificial intelligence for recognition and tracking of objects, humans, faces, gestures, and motion, namely a cognitive engine for enabling an autonomous robot operating in a dynamic environment; computer software design and development services for use with humanoid robots with artificial intelligence, namely robots including humanoid or anthropomorphic behaviors, appearances, or characteristics; and computer software design and development services for use with humanoid interactive robots with artificial intelligence for use in security, safety, inspection, tactical, hospitality, education, or entertainment applications