Provided is an autonomous mobile robotic device that may carry, transport, purchase, and deliver one or more items in a work environment. The robotic device may include a container within which the one or more items may be placed. Once its tasks are complete, the robotic device may autonomously navigate to a predetermined location.
Some aspects include a method for operating a wheeled device, including: capturing, by a primary sensor coupled to the wheeled device, primary sensor data indicative of a plurality of radial distances to objects; transforming, by a processor of the wheeled device, the plurality of radial distances from a perspective of the primary sensor to a perspective of the wheeled device; generating, by the processor, a partial map of visible areas in real-time at a first position of the wheeled device based on the primary sensor data and some secondary sensor data, wherein: the partial map is a bird's eye view; and the processor iteratively completes a full map of the environment based on new sensor data captured by sensors as the wheeled device performs work within the environment and new areas become visible to the sensors; and executing, by the wheeled device, a movement path to a second position.
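The frame transformation described above, converting radial distances from the sensor's perspective to the wheeled device's perspective, can be sketched as follows. This is a minimal illustration, not the patented method itself; the sensor offset, mounting yaw, and function names are assumptions.

```python
import math

# Hypothetical sketch: transform radial (polar) distance readings from the
# sensor's frame into the wheeled device's frame, assuming the sensor is
# mounted at a known (x, y) offset and yaw relative to the device center.
def sensor_to_device_frame(ranges, angles, sensor_offset=(0.10, 0.0), sensor_yaw=0.0):
    """Convert (range, bearing) pairs measured by the sensor into (x, y)
    points expressed in the device's coordinate frame."""
    ox, oy = sensor_offset
    points = []
    for r, a in zip(ranges, angles):
        # Point in the sensor frame (polar -> Cartesian).
        sx = r * math.cos(a)
        sy = r * math.sin(a)
        # Rotate by the sensor's mounting yaw, then translate by its offset.
        dx = ox + sx * math.cos(sensor_yaw) - sy * math.sin(sensor_yaw)
        dy = oy + sx * math.sin(sensor_yaw) + sy * math.cos(sensor_yaw)
        points.append((dx, dy))
    return points
```

Points produced this way at each device position could then be rasterized into the bird's-eye-view partial map the abstract describes.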
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G01C 21/12 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning
3.
STATIONARY SERVICE APPLIANCE FOR A POLY FUNCTIONAL ROAMING DEVICE
A method for autonomously servicing a first cleaning component of a battery-operated mobile device, including: inferring, with a processor of the mobile device, a value of at least one environmental characteristic based on sensor data captured by a sensor disposed on the mobile device; actuating, with a controller of the mobile device, a first actuator interacting with the first cleaning component to at least one of: turn on, turn off, reverse direction, and increase or decrease in speed such that the first cleaning component engages or disengages based on the value of at least one environmental characteristic or at least one user input received by an application of a smartphone paired with the mobile device; and dispensing, by a maintenance station, water from a clean water container of the maintenance station for washing the first cleaning component when the mobile device is docked at the maintenance station.
G05D 1/648 - Performing a task within a working area or space, e.g. cleaning
A47L 9/00 - Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
A47L 9/28 - Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
A47L 11/30 - Floor-scrubbing machines characterised by means for taking-up dirty liquid by suction
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G05D 1/223 - Command input arrangements on the remote controller, e.g. joysticks or touch screens
G05D 1/2285 - Command input arrangements located on-board unmanned vehicles using voice or gesture commands
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
4.
Efficient Coverage Planning of Mobile Robotic Devices
A method for covering a work environment by a robot, including: obtaining sensor data indicative of operational hazards; generating a map based on data obtained from sensors of the robot; determining an object type of an operational hazard based on extracted features and a database of various object types and their features; generating a coverage plan for areas of the work environment; executing the coverage plan by the robot; capturing debris sensor data indicative of at least presence and absence of debris in locations within the work environment; determining areas of the work environment with a high presence of debris and a low presence of debris, wherein the map is updated to distinguish the areas with the high presence of debris and the areas with the low presence of debris.
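The debris-mapping step above, updating the map to distinguish high- and low-debris areas from debris-sensor readings, can be sketched roughly as below. The grid-cell representation, counting scheme, and threshold are illustrative assumptions, not details from the patent.

```python
# Illustrative sketch (names and threshold are assumptions): accumulate
# debris-sensor detections per map cell and label each observed cell as a
# high- or low-debris area.
def update_debris_map(debris_counts, readings, threshold=3):
    """debris_counts: dict mapping (x, y) grid cell -> detections so far.
    readings: iterable of ((x, y), debris_present) pairs from the sensor.
    Returns a dict mapping each observed cell to 'high' or 'low'."""
    for cell, present in readings:
        debris_counts[cell] = debris_counts.get(cell, 0) + (1 if present else 0)
    return {cell: ('high' if n >= threshold else 'low')
            for cell, n in debris_counts.items()}
```

A coverage planner could then prioritize or repeat passes over cells labeled 'high'.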
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
5.
A METHOD FOR A ROBOTIC DEVICE TO POLYMORPH, ADAPT, AND ACTUATE IN REAL TIME TO RESPOND TO A PERCEIVED STIMULI BASED ON A PROBABILISTIC PREDICTION OF AN OUTCOME GIVEN A CERTAIN RESPONSE
Some aspects include a method for operating an autonomous robot, including: capturing, with a first sensor disposed on the robot, data of an environment of the robot; generating, with the processor, a map of the environment based on at least the data of the environment; localizing, with the processor, the robot within the environment; capturing, with a second sensor disposed on the robot, data of a floor surface; determining, with the processor, a floor type of areas of the environment based on the data of the floor surface; and determining, with the processor, settings of the robot based on at least the floor type of the floor surface, wherein the settings comprise at least an elevation of each of at least one component of the robot from the floor surface.
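The floor-type-dependent settings described above, including component elevation, can be pictured as a simple lookup. The table values and floor-type names here are invented for illustration; the patent does not specify them.

```python
# Hypothetical sketch: choose component elevation (and other settings) from
# the detected floor type; all table values are illustrative assumptions.
FLOOR_SETTINGS = {
    "hardwood": {"brush_elevation_mm": 2, "suction": "low"},
    "carpet":   {"brush_elevation_mm": 0, "suction": "high"},
    "tile":     {"brush_elevation_mm": 2, "suction": "medium"},
}

def settings_for_floor(floor_type, default="hardwood"):
    """Return robot settings for the detected floor type, falling back to a
    default when the type is unrecognized."""
    return FLOOR_SETTINGS.get(floor_type, FLOOR_SETTINGS[default])
```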
Some aspects provide a method for instructing operation of a robotic floor-cleaning device based on the position of the robotic floor-cleaning device within a two-dimensional map of the workspace. A two-dimensional map of a workspace is generated using inputs from sensors positioned on a robotic floor-cleaning device to represent the multi-dimensional workspace of the robotic floor-cleaning device. The two-dimensional map is provided to a user on a user interface. A user may adjust the boundaries of the two-dimensional map through the user interface and select settings for map areas to control device operation in various areas of the workspace.
Some aspects provide a media storing instructions that when executed by a processor of a robot effectuates operations including: capturing first data indicative of a position of the robot relative to objects within the workspace and second data indicative of movement of the robot; generating or updating a map of the workspace based on at least one of: at least a part of the first data and at least a part of the second data; segmenting the map into a plurality of zones; transmitting the map to an application of a communication device; receiving an updated map; generating a movement path based on the map or the updated map; and actuating the robot to traverse the movement path.
A method for cleaning a workspace, including: autonomously moving, with a mechanism of a robot, a cloth of a mopping assembly of the robot upwards and downwards relative to a work surface of the workspace. The cloth is disengaged from the work surface when the cloth is moved upwards relative to the work surface such that the cloth is not in contact with the work surface. The cloth is engaged with the work surface when the cloth is moved downwards relative to the work surface such that the cloth is in contact with the work surface. The mechanism moves the cloth of the mopping assembly upwards and downwards relative to the work surface based on input provided by at least one sensor of the robot.
A47L 7/00 - Suction cleaners adapted for additional purposes; Tables with suction openings for cleaning purposes; Containers for cleaning articles by suction; Suction cleaners adapted to cleaning of brushes; Suction cleaners adapted to taking-up liquids
A47L 5/00 - Structural features of suction cleaners
A47L 9/00 - Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
Provided is a bumper apparatus of a robot, including: a bumper elastically coupled with a chassis of the robot; at least one sensor configured to sense object impacts with the bumper; and at least one elastic element coupled to or interfacing with the chassis and coupled to or interfacing with the bumper; wherein: the at least one elastic element facilitates movement of the bumper relative to the chassis upon impact with an object and disengagement from the object after impact; the at least one elastic element facilitates a return of the bumper to a neutral position upon disengaging from the object after impact; and the bumper covers at least a front side of the chassis.
B62D 24/04 - Vehicle body mounted on resilient suspension for movement relative to the vehicle frame
A47L 9/00 - Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
B62D 27/04 - Connections between superstructure sub-units resilient
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
10.
METHOD OF LIGHTWEIGHT SIMULTANEOUS LOCALIZATION AND MAPPING PERFORMED ON A REAL-TIME COMPUTING AND BATTERY OPERATED WHEELED DEVICE
Some aspects include a method for operating a wheeled device, including: capturing, by a primary sensor coupled to the wheeled device, primary sensor data indicative of a plurality of radial distances to objects; transforming, by a processor of the wheeled device, the plurality of radial distances from a perspective of the primary sensor to a perspective of the wheeled device; generating, by the processor, a partial map of visible areas in real-time at a first position of the wheeled device based on the primary sensor data and some secondary sensor data, wherein: the partial map is a bird's eye view; and the processor iteratively completes a full map of the environment based on new sensor data captured by sensors as the wheeled device performs work within the environment and new areas become visible to the sensors; and executing, by the wheeled device, a movement path to a second position.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G01C 21/12 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning
Provided is a device, comprising: a smart gym equipment, comprising: one or more sensors; one or more actuators; one or more electric magnets; a processor; and a tangible, non-transitory, machine-readable medium storing instructions that when executed by the processor effectuates operations comprising: adjusting resistance in continuous amounts during weight-lifting training in relation to a pull distance, wherein a change of the weight value is proportional to the pull distance; and the weight value is adjusted by an adjustment in an electrical current flowing through a wire in the smart gym equipment, thereby adjusting a strength of a magnetic field; wherein: the processor determines a value for the electrical current; the adjustment in the electrical current is based on at least one sensed datum; and the device receives data from, and transmits data to, an application of a communication device paired with the device.
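The current-to-resistance relationship above can be sketched as a simple linear controller. All constants (base force, force gradient per meter of pull, newtons per amp) are invented for illustration, assuming the resistance force is proportional to coil current.

```python
# Hypothetical sketch: the resistance (weight value) is set by choosing a
# coil current, with force taken as proportional to current and the target
# force rising linearly with pull distance, as described above.
def coil_current_for_pull(pull_distance_m, base_force_n=50.0,
                          force_per_meter=100.0, newtons_per_amp=25.0):
    """Return the coil current (A) producing the target resistance force at
    the given pull distance. All constants are illustrative assumptions."""
    target_force = base_force_n + force_per_meter * pull_distance_m
    return target_force / newtons_per_amp
```

In practice the processor would recompute this current continuously from sensed pull distance, which matches the abstract's "continuous amounts" of adjustment.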
A method for operating a robot, including: capturing images of a workspace; capturing data indicative of movement of the robot; capturing LIDAR data as the robot moves within the workspace; generating a map of the workspace based on the LIDAR data; actuating the robot to drive; discriminating between an object on a floor surface along a path of the robot and the floor surface based on the captured images; actuating the robot to drive until determining all areas of the workspace are discovered and included in the map; and executing a cleaning function.
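One common way to discriminate an object on the floor from the floor itself, consistent with (but not necessarily identical to) the method above, is to compare estimated heights against a flat-floor model. The height-map representation and tolerance are assumptions for illustration.

```python
# Illustrative sketch: flag cells of a per-pixel height estimate that rise
# above the floor plane by more than a tolerance as belonging to an object
# rather than the floor surface.
def object_mask(height_map, floor_height=0.0, tolerance_m=0.02):
    """height_map: 2D list of estimated heights (m) per image cell.
    Returns a same-shape mask: True where an object rises above the floor."""
    return [[h > floor_height + tolerance_m for h in row] for row in height_map]
```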
G05D 1/249 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons from positioning sensors located off-board the vehicle, e.g. from cameras
13.
Method and system for collaborative construction of a map
Methods and systems for constructing a map of an environment. One or more sensory devices installed on an autonomous vehicle take readings within a field of view of the sensory device. As the vehicle moves within the environment, the sensory device continuously takes readings within new fields of view. At the same time, sensory devices installed on other autonomous vehicles operating within the same environment and/or fixed devices monitoring the environment take readings within their respective fields of view. The readings recorded by a processor of each autonomous vehicle may be shared with all other processors of autonomous vehicles operating within the same environment with whom a data transfer channel is established. Processors combine overlapping readings to construct continuously growing segments of the map. Combined readings are taken by the same sensory device or by different sensory devices and are taken at the same time or at different times.
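The combining of overlapping readings from multiple vehicles can be sketched as below. Keying cells on shared world coordinates and resolving overlaps by majority vote is a simplifying assumption standing in for the document's overlap-matching step.

```python
# Illustrative sketch: merge map segments from multiple vehicles by keying
# cells on shared world coordinates; overlapping observations are combined
# by simple vote counting (a stand-in for full overlap registration).
def merge_segments(segments):
    """segments: list of dicts mapping (x, y) cell -> 1 (occupied) or 0 (free).
    Returns a combined map where each cell takes the majority observation."""
    votes = {}
    for segment in segments:
        for cell, occ in segment.items():
            free, occupied = votes.get(cell, (0, 0))
            votes[cell] = (free + (occ == 0), occupied + (occ == 1))
    return {cell: int(occupied >= free) for cell, (free, occupied) in votes.items()}
```

Segments received over the data transfer channel from other vehicles would simply be appended to the list before merging, so the combined map grows as new fields of view arrive.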
Some aspects include a schedule development method for a robotic floor-cleaning device that recognizes patterns in user input to automatically devise a work schedule.
G06F 17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
Provided is a navigation system for a leader vehicle leading follower vehicles. The leader vehicle is configured to transmit real-time movement data to follower vehicles. The follower vehicles each include a signal receiver for receiving the data from the leader vehicle and a processor configured to determine a set of active maneuvering instructions for the respective follower vehicle based on at least a portion of the data received from the leader vehicle and actuate the respective follower vehicle to execute the set of active maneuvering instructions.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B60W 10/04 - Conjoint control of vehicle sub-units of different type or different function including control of propulsion units
B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
B60W 30/165 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle automatically following the path of a preceding lead vehicle, e.g. "electronic tow-bar"
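The leader-follower maneuvering described above can be sketched as each follower steering toward the leader's last reported pose while regulating its gap. The pose format, gains, and speed limits are assumptions, not details from the patent.

```python
import math

# Minimal sketch, assuming the leader broadcasts its pose (x, y, heading);
# each follower steers toward the leader's last reported position while
# holding a target following distance. Gains and limits are illustrative.
def follower_maneuver(follower_pose, leader_pose, target_gap=1.0):
    """follower_pose / leader_pose: (x, y, heading_radians).
    Returns (steering_angle, speed_command) for the follower."""
    fx, fy, fh = follower_pose
    lx, ly, _ = leader_pose
    bearing = math.atan2(ly - fy, lx - fx)
    # Steering: smallest signed angle from follower heading to the bearing.
    steer = math.atan2(math.sin(bearing - fh), math.cos(bearing - fh))
    gap = math.hypot(lx - fx, ly - fy)
    # Slow down when closer than the target gap, speed up when farther.
    speed = max(0.0, min(2.0, gap - target_gap))
    return steer, speed
```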
16.
Method for efficient operation of mobile robotic devices
A method for autonomously planning work duties of a robot within an environment of the robot, including: obtaining, with a processor of the robot, first data indicative of a presence or an absence of at least one human within the environment at a particular time; actuating, with the processor, the robot to execute work duties based on the first data, wherein: the robot executes the work duties when the first data indicates the absence of the at least one human from the environment; and the work duties comprise cleaning at least a portion of the environment; capturing, with at least one sensor of a plurality of sensors disposed on the robot, second data while the robot executes the work duties; and altering, with the processor, a navigational route of the robot or the work duties based on the second data.
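The presence-based gating of work duties above can be reduced to a small decision rule. Requiring several consecutive absent samples before starting is an illustrative debouncing assumption, not a detail from the patent.

```python
# Hypothetical sketch: start work duties only after the environment has been
# observed empty of humans for N consecutive samples (debouncing assumption).
def should_start_work(presence_samples, required_absent_samples=3):
    """presence_samples: most-recent-first booleans (True = human detected).
    Returns True when the robot should begin executing its work duties."""
    recent = presence_samples[:required_absent_samples]
    return (len(recent) == required_absent_samples
            and not any(recent))
```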
A method for pairing a robotic device with an application of a communication device, including: receiving, with the application, login information of a user; logging, with the application, into a user account using the login information; receiving, with the application, a password for a wireless network for connecting the robotic device to the wireless network; initiating, with the robotic device, pairing of the robotic device with the application upon the user pressing a button on a user interface of the robotic device; displaying, with the application, a map and a status of the robotic device; and transmitting, with the application, each of: an adjustment to the map; an instruction to perform a function; a cleaning setting; scheduling information; a cleaning intensity or a cleaning frequency setting; and an area within the map which the robotic device is to avoid to the robotic device.
H04W 76/11 - Allocation or use of connection identifiers
G06K 19/06 - Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
Some aspects provide a system including a mobile robot and a recharging station. The mobile robot aligns with the recharging station based on at least a first signal received by a first signal receiver of the mobile robot and the mobile robot is actuated to adjust a direction of movement of the mobile robot based on at least the first signal received by the first signal receiver of the mobile robot.
B25J 19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
A memory storing program code that when executed by a processor of a robot effectuates operations, including: detecting, with a sensor of a plurality of sensors disposed on the robot, an object in a line of sight of the sensor; adjusting, with the processor of the robot, a current path of the robot to detour around or avoid the object; generating, with the processor of the robot, a planar representation of a workspace of the robot based on data collected by at least some sensors of the plurality of sensors; and wherein an application of a communication device paired with the robot is configured to display the planar representation.
Provided is a robotic cleaning device, including a chassis, an oscillating mechanism, and a cleaning component, wherein the oscillating mechanism causes the cleaning component to oscillate in at least one plane parallel to a driving surface of the robotic cleaning device.
A robotic cleaner executing operations such as capturing data indicative of locations of objects in a workspace through which the robot moves; generating or updating a map of at least a part of the workspace based on at least the data; and navigating based on the map or an updated map of the workspace. The robotic cleaner may include a side brush with a main body with at least one attachment point and at least one bundle of bristles attached to the at least one attachment point of the main body.
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
22.
System and Method for Establishing Virtual Boundaries for Robotic Devices
A method for centrally aligning a robot with an electronic device, including: transmitting, with at least one transmitter, a first signal; receiving, with a first receiver and a second receiver, the first signal; detecting, with a controller coupled to the first receiver and the second receiver, the robot is centrally aligned with the electronic device when the first receiver and the second receiver simultaneously receive the first signal, wherein a virtual line passing through a center of the robot and a center of the electronic device is aligned with a midpoint between the first receiver and the second receiver; and executing, with the robot, a particular movement type when the robot is aligned with the electronic device.
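The two-receiver alignment test above, centrally aligned exactly when both receivers see the signal in the same window, lends itself to a simple state rule. The action names are assumptions for illustration.

```python
# Illustrative sketch: the robot is centrally aligned with the electronic
# device when both receivers detect its signal in the same sensing window;
# otherwise it rotates toward whichever side saw the signal.
def alignment_action(left_rx, right_rx):
    """left_rx / right_rx: True if that receiver saw the signal this window.
    Returns the movement type the robot should execute (names assumed)."""
    if left_rx and right_rx:
        return "drive_straight"   # centrally aligned: approach the device
    if left_rx:
        return "rotate_left"      # signal only on the left: turn toward it
    if right_rx:
        return "rotate_right"
    return "search"               # no signal: keep searching
```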
A removable dustbin for a robotic vacuum that is wholly separable from all electronic parts thereof including a motor unit such that the dustbin, when separated from the electronic parts, may be safely immersed in water for quick and easy cleaning. The dustbin design further facilitates easy access to the motor for convenient servicing and repair.
A47L 9/14 - Bags or the like; Attachment of, or closures for, bags
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
24.
METHODS FOR FINDING THE PERIMETER OF A PLACE USING OBSERVED COORDINATES
Provided is a system including a robot and an application of a communication device. The robot includes a medium storing instructions that when executed by a processor of the robot effectuate operations including: obtaining first data indicative of a relative position of the robot in a workspace; actuating the robot to drive within the workspace to form a map including mapped perimeters that correspond with physical perimeters of the workspace while obtaining second data indicative of movement of the robot; and forming the map of the workspace based on at least some of the first data, wherein the map of the workspace expands as new first data are obtained, until all perimeters of the workspace are included in the map. The application is configured to display information, such as the map, and receive user input.
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
G01C 21/20 - Instruments for performing navigational calculations
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
G05D 1/228 - Command input arrangements located on-board unmanned vehicles
G05D 1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
G05D 1/247 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
G05D 1/249 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons from positioning sensors located off-board the vehicle, e.g. from cameras
G05D 1/648 - Performing a task within a working area or space, e.g. cleaning
G06T 3/14 - Transformations for image registration, e.g. adjusting or mapping for alignment of images
Provided is a robotic refuse container system, including: a first robotic refuse container, including: a chassis; a set of wheels; a rechargeable battery; a processor; a refuse container; a plurality of sensors; and a medium storing instructions that when executed by the processor effectuates operations including: collecting sensor data; determining a movement path of the first robotic refuse container from a first location to a second location; and pairing the first robotic refuse container with an application of a communication device; and the application of the communication device, configured to: receive at least one input designating at least a schedule, an instruction to navigate the first robotic refuse container to a particular location, and a second movement path of the first robotic refuse container; and display a status; wherein the first robotic refuse container remains parked at the first location until receiving an instruction to execute a particular action.
Included is a method for adjusting window shade settings of a window shade. At least one sensor captured environmental data of surroundings. A processor actuates at least one window shading setting to be applied to the window shade based on at least one of the environmental data, window shade setting preferences of a user, and at least one input received by an application of a communication device paired with the processor.
A control system that coordinates and manages the execution of tasks by one or more robotic devices within an environment. The control system transmits and receives information to and from one or more robotic devices using a wireless communication channel. The one or more robotic devices may execute one or more actions based on the information received and may transmit information to the control system using a wireless communication channel.
G05B 19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
28.
System and method for guiding heading of a mobile robotic device
A robotic device including a medium storing instructions that when executed by a processor effectuates operations including: capturing images and sensor data of an environment as the robotic device drives back and forth in straight lines; generating or updating a map of the environment based on at least one of the one or more images and the sensor data; recognizing one or more rooms in the map based on at least one of the one or more images and the sensor data; determining at least one of a position and an orientation of the robotic device relative to the environment based on at least one of the one or more images and the sensor data; and actuating the robotic device to adjust a heading of the robotic device based on the at least one of the position and the orientation of the robotic device relative to the environment.
Provided is a robotic device, including: a chassis; a set of wheels; a control system; a battery; one or more sensors; a processor; a tangible, non-transitory, machine readable medium storing instructions that when executed by the processor effectuates operations including: capturing, with the one or more sensors, data of an environment of the robotic device and data indicative of movement of the robotic device; generating or updating, with the processor, a map of the environment based on at least a portion of the captured data; and generating or updating, with the processor, a movement path of the robotic device.
A method for identifying a doorway, including: capturing, with a sensor disposed on a robot, sensor data of an environment of the robot as the robot drives along a movement path; identifying, with a processor of the robot, at least one feature from the sensor data indicative of a doorway; identifying, with the processor, the doorway at a location within the environment upon detecting the at least one feature in the sensor data; and generating, with the processor, a map of the environment based on at least the sensor data.
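One feature commonly indicative of a doorway in range data is a contiguous run of readings noticeably deeper than the surrounding wall. The sketch below assumes that specific feature; the patent's feature set may differ, and the parameters are invented.

```python
# Minimal sketch, assuming a doorway appears as a contiguous run of readings
# noticeably deeper than the surrounding wall in a wall-facing range scan.
def find_doorway(ranges, wall_depth, min_width=3, depth_margin=0.5):
    """ranges: list of distances along a wall-facing scan.
    Returns (start_index, end_index) of the first doorway-like gap, or None."""
    start = None
    for i, r in enumerate(ranges):
        if r > wall_depth + depth_margin:
            if start is None:
                start = i          # gap begins
        else:
            if start is not None and i - start >= min_width:
                return (start, i - 1)
            start = None           # gap too narrow; discard
    if start is not None and len(ranges) - start >= min_width:
        return (start, len(ranges) - 1)
    return None
```

Detected gaps could then be marked as doorway locations in the map the processor generates.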
Provided is a robot, including: a chassis; a set of wheels; a plurality of sensors; a camera; a processor; a memory storing instructions that when executed by the processor effectuate operations including: capturing, with the camera, spatial data of surroundings of the robot; generating, with the processor, a spatial model of the surroundings based on at least the spatial data of the surroundings; generating, with the processor, a movement path based on the spatial model of the surroundings; and inferring, with the processor, a location of the robot.
Included is a method for operating Internet of Things (IoT) smart devices within an environment, including: connecting at least one IoT smart device with an application executed on a smartphone, wherein the IoT smart devices comprise at least a robotic cleaning device and a docking station of the robotic cleaning device; generating a map of an environment with the robotic cleaning device; displaying the map with the application; and receiving user inputs with the application, wherein the user inputs specify at least: a command to turn on or turn off a first IoT smart device; a command for the robotic cleaning device to clean the environment; and a command for the robotic cleaning device to clean a particular room within the environment.
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
G05D 1/224 - Output arrangements on the remote controller, e.g. displays, haptics or speakers
G05D 1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
A method for operating a mopping robot, including: storing, with a liquid reservoir disposed on the robot, liquid; releasing, with an electronically-controlled liquid release mechanism disposed on the robot, liquid from the liquid reservoir onto a work surface; receiving, with an application of a communication device, user input designating at least a schedule for mopping and a quantity of liquid to release during mopping of an area; determining, with a processor of the robot, a schedule for mopping at least one area based on at least one of first sensor data or the user input provided to the application; and determining, with the processor of the robot, a quantity of liquid to release during mopping of the at least one area based on at least one of second sensor data or the user input provided to the application.
A47L 9/04 - Nozzles with driven brushes or agitators
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
Provided is a first robot including: a machine readable medium storing instructions that when executed by the processor of the first robot effectuates operations including: executing, with the processor of the first robot, a task; and transmitting, with the processor of the first robot, a signal to a processor of a second robot during execution of the task when its power supply level falls below a predetermined threshold; and the second robot including: a machine readable medium storing instructions that when executed by the processor of the second robot effectuates operations including: executing, with the processor of the second robot, the remainder of the task upon receiving the signal transmitted from the processor of the first robot; and wherein the first robot navigates to a charging station when its power supply level falls below the predetermined threshold and wherein the first robot and the second robot provide the same services.
B62D 33/063 - Drivers' cabs movable from one position into at least one other position, e.g. tiltable, pivotable about a vertical axis, displaceable from one side of the vehicle to the other
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
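The low-battery handoff rule described above amounts to a per-step decision. The threshold value and action names below are assumptions chosen for illustration.

```python
# Hypothetical sketch of the handoff rule: once its battery falls below a
# threshold, the first robot signals the second robot to take over the
# remainder of the task and heads to the charging station itself.
def handoff_step(battery_level, threshold=0.15):
    """battery_level: fraction of charge remaining, 0.0-1.0.
    Returns the actions the first robot should take this control step."""
    if battery_level < threshold:
        return ["signal_second_robot", "navigate_to_charging_station"]
    return ["continue_task"]
```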
A robot that recognizes a user by identifying elements in a captured image that match elements in at least one previously captured image of the user, identifying biometric data that matches previously captured biometric data of the user, or identifying voice data that matches previously captured voice data of the user. Upon recognition of the user, the user is authorized to use the robot, control the robot, modify settings of the robot, add or delete or modify access of users to the robot, actuate the robot, program the robot, and assign a task to the robot.
G06V 10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; Image or video pattern matching; Proximity measures in feature spaces using context analysis; Selection of dictionaries
A method for operating a robot, including: capturing images of a workspace; capturing data indicative of movement of the robot; capturing LIDAR data as the robot moves within the workspace; generating a map of the workspace based on the LIDAR data; actuating the robot to drive; discriminating between an object on a floor surface along a path of the robot and the floor surface based on the captured images; actuating the robot to drive until determining all areas of the workspace are discovered and included in the map; and executing a cleaning function.
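The stopping condition in the abstract above — driving "until determining all areas of the workspace are discovered and included in the map" — can be illustrated with a frontier check on an occupancy grid. This is a minimal sketch under assumed conventions (0 = free, 1 = obstacle, -1 = unknown), not the patented method: exploration is complete when no free cell borders an unknown cell.

```python
# Hypothetical occupancy-grid convention: 0 = free, 1 = obstacle, -1 = unknown.
def has_frontier(grid):
    """Return True if any free cell is adjacent to an unknown cell."""
    rows, cols = len(grid), len(grid[0])
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 0:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == -1:
                    return True  # free space still borders unexplored space
    return False

def coverage_complete(grid):
    """All reachable areas are discovered once no frontier remains."""
    return not has_frontier(grid)
```

In this framing the robot keeps driving toward frontier cells and stops mapping once `coverage_complete` holds.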
G05D 1/249 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons from positioning sensors located off-board the vehicle, e.g. from cameras
37.
Simultaneous collaboration, localization, and mapping
Provided is a medium storing instructions that when executed by a processor of a first wheeled device effectuates operations including: capturing sensor readings of an environment; finding a position of the first wheeled device within a map of the environment based on at least some of the sensor readings; and generating a new map of the environment when the processor is unable to load the previously generated map or cannot find the position of the first wheeled device within the previously generated map; wherein: the map is previously generated with the processor of the first wheeled device or a processor of a second wheeled device; the map is loaded into a memory of the first wheeled device at a beginning of each work session; and the processor of the first wheeled device iteratively tracks the position of the first wheeled device while performing a task.
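The fallback logic in the abstract above — reuse the previously generated map unless it fails to load or localization within it fails — can be shown as a short control-flow sketch. The callables are hypothetical stand-ins for the device's subsystems, not a disclosed implementation:

```python
def start_session(load_map, localize, build_new_map):
    """Reuse the stored map when possible; otherwise start a fresh map."""
    map_ = load_map()           # may return None if the stored map is unusable
    pose = localize(map_) if map_ is not None else None
    if map_ is None or pose is None:
        map_ = build_new_map()  # mapping restarts from scratch
        pose = (0.0, 0.0)       # the device defines the new map's origin at itself
    return map_, pose
```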
Provided is a tangible, non-transitory, machine readable medium storing instructions that when executed by the processor effectuates operations including: capturing visual readings to objects within an environment; capturing readings of wheel rotation; capturing readings of a driving surface; capturing distances to obstacles; determining displacement of the robotic device in two dimensions based on sensor readings of the driving surface; estimating, with the processor, a corrected position of the robotic device to replace a last known position of the robotic device; determining a most feasible element in an ensemble based on the visual readings; and determining a most feasible position of the robotic device as the corrected position based on the most feasible element in the ensemble and the visual readings.
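The "most feasible element in an ensemble" step above can be sketched as scoring each ensemble member (a candidate pose with predicted readings) against the visual readings and taking the highest-scoring member as the corrected position. The Gaussian-style scoring function below is an illustrative assumption, not the patented method:

```python
import math

def score(candidate, observed, sigma=1.0):
    """Likelihood-style score of a candidate's predicted readings vs. observed."""
    err = sum((p - o) ** 2 for p, o in zip(candidate["predicted"], observed))
    return math.exp(-err / (2 * sigma ** 2))

def most_feasible(ensemble, observed):
    """The ensemble member whose predictions best explain the visual readings."""
    return max(ensemble, key=lambda m: score(m, observed))

# Hypothetical two-member ensemble of candidate poses.
ensemble = [
    {"pose": (1.0, 2.0), "predicted": [3.0, 4.0]},
    {"pose": (1.1, 2.1), "predicted": [3.2, 3.9]},
]
best = most_feasible(ensemble, observed=[3.19, 3.92])
```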
Provided is a robotic refuse container, including a chassis; a set of wheels coupled to the chassis; a refuse container coupled to the chassis for collecting refuse; a rechargeable battery; a plurality of sensors; a processor; and a tangible, non-transitory, machine-readable medium storing instructions that when executed by the processor effectuates operations including: detecting, with at least one sensor disposed on the robotic refuse container, an amount of refuse within the refuse container; instructing, with the processor, the robotic refuse container to navigate to a location to empty the refuse upon detecting the amount of refuse exceeds a predetermined refuse amount in the refuse container; and emptying, by a separate device, the refuse within the refuse container at the location.
A system for collaboration between a first robot and a second robot, including: an application of a communication device configured to receive at least one input designating an instruction for the first robot to execute a first task and an instruction for the second robot to execute a second task after the first robot completes the first task; the first robot, including a medium storing instructions that when executed by a processor of the first robot effectuates operations including: actuating the first robot to execute the first task; and actuating the first robot to dock at a charging station upon completion of the first task; and the second robot including a medium storing instructions that when executed by a processor of the second robot effectuates operations including actuating the second robot to execute the second task upon receiving a signal indicating the first task is complete by the first robot.
G05B 13/02 - Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
41.
Method and apparatus for overexposing images captured by drones
Some aspects include a method for operating an autonomous robot, including: capturing, with a first sensor disposed on the robot, data of an environment of the robot; generating, with the processor, a map of the environment based on at least the data of the environment; localizing, with the processor, the robot within the environment; capturing, with a second sensor disposed on the robot, data of a floor surface; determining, with the processor, a floor type of areas of the environment based on the data of the floor surface; and determining, with the processor, settings of the robot based on at least the floor type of the floor surface, wherein the settings comprise at least an elevation of each of at least one component of the robot from the floor surface.
An autonomous mobile robotic device that may carry and transport one or more items within an environment. The robotic device may comprise a container within which the one or more items may be stored. The robotic device may pick up and deliver the one or more items to one or more locations. The robotic device may be provided with scheduling information for task execution or pickup and delivery of one or more items. Once tasks are complete, the robotic device may autonomously navigate to a storage location.
Provided is smart gym equipment, including a frame; a plurality of sensors; at least one actuator; a plurality of weights; a processor; and a tangible, non-transitory, machine-readable medium storing instructions that when executed by the processor effectuates operations including: obtaining, with the processor, sensor data captured by at least some of the plurality of sensors; receiving, with the processor, input data; determining, with the processor, at least one equipment setting of the smart gym equipment based on at least some of the input data and at least one relationship relating the at least some of the input data to the at least one equipment setting; and instructing, with the processor, the at least one actuator to automatically implement the at least one equipment setting.
A system of two robots, including: a first robot, including: a plurality of sensors; a control system; and a medium storing instructions that when executed by the control system of the first robot effectuates operations including: generating or updating a grid map of an environment; and transmitting a message to a control system of a second robot; and the second robot, including: a plurality of sensors; a control system; and a medium storing instructions that when executed by the control system of the second robot effectuates operations including: generating or updating a grid map of the environment independent from the grid map generated by the control system of the first robot; and actuating the second robot to begin performing coverage of the environment upon receiving the message from the control system of the first robot.
G05B 13/02 - Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
A method of sending scheduling information to a robotic device is provided, comprising: generating, with an application of a communication device, at least one scheduling command, the application comprising a graphical user interface; transmitting, with the application, the at least one scheduling command to a router, wherein the router is configured to transmit the at least one scheduling command to a cloud service; receiving, with a processor of a robotic device, the at least one scheduling command from the cloud service; generating or modifying, with the processor of the robotic device, scheduling information of the robotic device based on the at least one scheduling command; and suggesting, with the processor of the robotic device, a schedule of the robotic device based on previously received operational instructions and times of executing the instructions by the robotic device.
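The final step above — suggesting a schedule from previously received operational instructions and their execution times — can be sketched as taking the most common start hour per weekday from the command history. This is purely illustrative; the abstract does not disclose the suggestion algorithm:

```python
from collections import Counter

def suggest_schedule(history):
    """history: list of (weekday, hour) pairs from past cleaning runs."""
    by_day = {}
    for weekday, hour in history:
        by_day.setdefault(weekday, []).append(hour)
    # Suggest the modal start hour observed for each weekday.
    return {day: Counter(hours).most_common(1)[0][0]
            for day, hours in by_day.items()}

# Hypothetical history: user usually starts Monday cleanings at 9:00.
history = [("mon", 9), ("mon", 9), ("mon", 14), ("tue", 18)]
```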
Provided is a robotic device for transporting and delivering at least one item, including: a processor; a chassis including a set of wheels; a motor for driving the set of wheels; a control system module for controlling the movement of the robotic device; a set of sensors; a screen with a graphical user interface; at least one compartment for storing the at least one item for transportation and delivery; and a door that provides access to the at least one item stored in the at least one compartment.
A method for operating a robot, including: capturing, with the at least one sensor, data of an environment and data indicative of movement of the robot; generating or updating, with the processor, a map of the environment based on at least a portion of the captured data; inferring, with the processor, a current location of the robot; and actuating, with the processor, the robot to execute a first task; wherein an application of a communication device is used by a user to schedule the first task.
A method for covering a work environment by a robot, including: obtaining sensor data indicative of operational hazards; generating a map based on data obtained from sensors of the robot; determining an object type of an operational hazard based on extracted features and a database of various object types and their features; generating a coverage plan for areas of the work environment; executing the coverage plan by the robot; capturing debris sensor data indicative of at least presence and absence of debris in locations within the work environment; determining areas of the work environment with a high presence of debris and a low presence of debris, wherein the map is updated to distinguish the areas with the high presence of debris and the areas with the low presence of debris.
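The debris-distinguishing step above can be sketched as counting debris detections per grid cell over a session and thresholding the counts into high- and low-presence areas. The grid representation and threshold are assumptions for illustration, not the disclosed method:

```python
def debris_map(detections, shape, threshold=3):
    """detections: list of (row, col) cells where the debris sensor fired."""
    counts = [[0] * shape[1] for _ in range(shape[0])]
    for r, c in detections:
        counts[r][c] += 1
    # Label each cell so the map distinguishes high- from low-debris areas.
    return [["high" if n >= threshold else "low" for n in row]
            for row in counts]
```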
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Some aspects include a method for operating a cleaning robot, including: capturing LIDAR data; generating a first iteration of a map of the environment in real time; capturing sensor data from different positions within the environment; capturing movement data indicative of movement of the cleaning robot; aligning and integrating newly captured LIDAR data with previously captured LIDAR data at overlapping points; generating additional iterations of the map based on the newly captured LIDAR data and at least some of the newly captured sensor data; localizing the cleaning robot; planning a path of the cleaning robot; and actuating the cleaning robot to drive along a trajectory that follows along the planned path by providing pulses to one or more electric motors of wheels of the cleaning robot.
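The alignment step above — integrating newly captured LIDAR data with previously captured data at overlapping points — can be illustrated with a toy registration step: given point correspondences between the two scans, estimate the translation that registers the new scan to the old one as the mean point offset. A real system would also solve for rotation (e.g. with ICP); this sketch assumes correspondences are already known:

```python
def estimate_translation(prev_pts, new_pts):
    """prev_pts[i] and new_pts[i] are corresponding (x, y) points."""
    n = len(prev_pts)
    dx = sum(p[0] - q[0] for p, q in zip(prev_pts, new_pts)) / n
    dy = sum(p[1] - q[1] for p, q in zip(prev_pts, new_pts)) / n
    return dx, dy

def align(new_pts, translation):
    """Shift the new scan into the previous scan's frame."""
    dx, dy = translation
    return [(x + dx, y + dy) for x, y in new_pts]
```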
A method for operating an autonomous wheeled device, including: obtaining first data indicative of objects within an environment; generating, with a processor of the autonomous wheeled device, a map of the environment using the first data; transmitting, with the processor, first information to an application of a smartphone; proposing, with the application, a suggested schedule for the autonomous wheeled device; receiving, with the processor, second information from the application of the smartphone; and actuating, with the processor, the autonomous wheeled device to operate according to the new schedule or the adjustment to the existing schedule and the suggested schedule, wherein the processor only actuates the autonomous wheeled device to operate according to the suggested schedule after the application receives approval of the suggested schedule.
Provided is a method including obtaining a map of an environment of a robot; maneuvering the robot to a first location and orientation that positions the robot to sense a part of the working environment at a second location of the working environment; sensing, while the robot is at the first location, the part of the physical layout of the working environment at the second location; updating the map of the physical layout of the working environment; determining at least a part of a route plan of the robot through the working environment; and maneuvering the robot along the at least the part of the route plan.
G01C 22/02 - Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers or using pedometers by conversion into electric waveforms and subsequent integration, e.g. using tachometer generator
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
52.
System and method for establishing virtual boundaries for robotic devices
Methods for utilizing virtual boundaries with robotic devices are presented including: positioning a boundary component having a receiver pair to receive a first robotic device signal substantially simultaneously by each receiver of the receiver pair from a robotic device only when the robotic device is positioned along a virtual boundary; operating the robotic device to move automatically within an area co-located with the virtual boundary; transmitting the first robotic device signal by the robotic device; and receiving the first robotic device signal by the receiver pair thereby indicating that the robotic device is positioned along the virtual boundary.
Provided is a method for a robotic device to autonomously overcome obstructions hindering the operational capacity of the robotic device. When a robotic device encounters an obstruction, the robotic device may enact one of a number of predetermined responses to overcome the obstruction without requiring the intervention of an outside entity to assist the robotic device with overcoming the obstruction.
A method for cleaning a workspace, including: autonomously moving, with a mechanism of a robot, a cloth of a mopping assembly of the robot upwards and downwards relative to a work surface of the workspace. The cloth is disengaged from the work surface when the cloth is moved upwards relative to the work surface such that the cloth is not in contact with the work surface. The cloth is engaged with the work surface when the cloth is moved downwards relative to the work surface such that the cloth is in contact with the work surface. The mechanism moves the cloth of the mopping assembly upwards and downwards relative to the work surface based on input provided by at least one sensor of the robot.
A47L 7/00 - Suction cleaners adapted for additional purposes; Tables with suction openings for cleaning purposes; Containers for cleaning articles by suction; Suction cleaners adapted to cleaning of brushes; Suction cleaners adapted to taking-up liquids
A47L 5/00 - Structural features of suction cleaners
A47L 9/00 - Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
55.
Stationary service appliance for a poly functional roaming device
A method for autonomously servicing a first cleaning component of a battery-operated mobile device, including: inferring, with a processor of the mobile device, a value of at least one environmental characteristic based on sensor data captured by a sensor disposed on the mobile device; actuating, with a controller of the mobile device, a first actuator interacting with the first cleaning component to at least one of: turn on, turn off, reverse direction, and increase or decrease in speed such that the first cleaning component engages or disengages based on the value of at least one environmental characteristic or at least one user input received by an application of a smartphone paired with the mobile device; and dispensing, by a maintenance station, water from a clean water container of the maintenance station for washing the first cleaning component when the mobile device is docked at the maintenance station.
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
A47L 9/00 - Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
A47L 9/28 - Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
A47L 11/30 - Floor-scrubbing machines characterised by means for taking-up dirty liquid by suction
G05D 1/223 - Command input arrangements on the remote controller, e.g. joysticks or touch screens
G05D 1/2285 - Command input arrangements located on-board unmanned vehicles using voice or gesture commands
G05D 1/648 - Performing a task within a working area or space, e.g. cleaning
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G05D 105/10 - Specific applications of the controlled vehicles for cleaning, vacuuming or polishing
Provided is a system including a robot and an application of a communication device. The robot includes a medium storing instructions that when executed by a processor of the robot effectuate operations including: obtaining first data indicative of a relative position of the robot in a workspace; actuating the robot to drive within the workspace to form a map including mapped perimeters that correspond with physical perimeters of the workspace while obtaining second data indicative of movement of the robot; and forming the map of the workspace based on at least some of the first data, wherein the map of the workspace expands as new first data are obtained, until all perimeters of the workspace are included in the map. The application is configured to display information, such as the map, and receive user input.
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
G01C 21/20 - Instruments for performing navigational calculations
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G05D 1/228 - Command input arrangements located on-board unmanned vehicles
G05D 1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
G05D 1/247 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
G05D 1/249 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons from positioning sensors located off-board the vehicle, e.g. from cameras
G05D 1/648 - Performing a task within a working area or space, e.g. cleaning
G06T 3/14 - Transformations for image registration, e.g. adjusting or mapping for alignment of images
Included is a method for a first robotic device to collaborate with a second robotic device, including: actuating, with a processor of the first robotic device, the first robotic device to execute a first part of a cleaning task; transmitting, with the processor of the first robotic device, information to a processor of the second robotic device upon completion of the first part of the cleaning task; receiving, with the processor of the second robotic device, the information transmitted from the processor of the first robotic device; and actuating, with the processor of the second robotic device, the second robotic device to execute a second part of the cleaning task upon receiving the transmitted information; wherein the first robotic device and the second robotic device are surface cleaning robotic devices with a functionality comprising at least one of vacuuming and mopping.
H04L 67/125 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
Provided is a robot, including: a plurality of sensors; a processor; a tangible, non-transitory, machine readable medium storing instructions that when executed by the processor effectuates operations including: capturing, with an image sensor, images of a workspace as the robot moves within the workspace; identifying, with the processor, at least one characteristic of at least one object captured in the images of the workspace; determining, with the processor, an object type of the at least one object based on characteristics of different types of objects stored in an object dictionary, wherein possible object types comprise a type of clothing, a cord, a type of pet bodily waste, and a shoe; and instructing, with the processor, the robot to execute at least one action based on the object type of the at least one object.
A robotic cleaner executing operations such as capturing data indicative of locations of objects in a workspace through which the robot moves; generating or updating a map of at least a part of the workspace based on at least the data; and navigating based on the map or an updated map of the workspace. The robotic cleaner may include a side brush with a main body with at least one attachment point and at least one bundle of bristles attached to the at least one attachment point of the main body.
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
A robot for transporting items, including: a chassis; a cavity within which items are stored for transportation; a set of wheels coupled to the chassis; a control system to actuate movement of the set of wheels; a power supply; at least one sensor; a processor electronically coupled to the control system and the at least one sensor; and a tangible, non-transitory, machine readable medium storing instructions that when executed by the processor effectuates operations including: capturing, with the at least one sensor, data of an environment and data indicative of movement of the robot; generating or updating, with the processor, a map of the environment based on at least a portion of the captured data; inferring, with the processor, a current location of the robot; and actuating, with the processor, the robot to execute a transportation task.
A robot configured to perceive a model of an environment, including: a chassis; a set of wheels; a plurality of sensors; a processor; and memory storing instructions that when executed by the processor effectuates operations including: capturing a plurality of data while the robot moves within the environment; perceiving the model of the environment based on at least a portion of the plurality of data, the model being a top view of the environment; storing the model of the environment in a memory accessible to the processor; and transmitting the model of the environment and a status of the robot to an application of a smartphone previously paired with the robot.
G06V 10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; Image or video pattern matching; Proximity measures in feature spaces using context analysis; Selection of dictionaries
A method for perceiving a model of an environment, including: capturing a plurality of data while the robot moves within the environment, wherein: the plurality of data comprises at least a first data and a second data captured by a first sensor of a first sensor type and a second sensor of a second sensor type, respectively; the first sensor type is an imaging sensor; the second sensor type captures movement data; an active source of illumination is positioned adjacent to the imaging sensor such that reflections of illumination light illuminating a path of the robot fall within a field of view of the imaging sensor; perceiving the model of the environment based on at least a portion of the plurality of data; storing the model of the environment in a memory; and transmitting the model of the environment to an application of a smartphone.
G06V 10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; Image or video pattern matching; Proximity measures in feature spaces using context analysis; Selection of dictionaries
Provided is a robot including a media storing instructions that when executed by the processor of the robot effectuates operations including: obtaining sensor data indicative of operational hazards within a work environment; generating a map of the work environment based on data obtained from at least some sensors of the robot; identifying at least one room in the map; determining an object type of an operational hazard based on extracted features of the operational hazard and a database of various object types and their features; updating the map to include the object type of the operational hazard at a location in which the operational hazard was encountered by the robot; generating a coverage plan for areas of the work environment; and executing the coverage plan by the robot.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
A removable mop attachment module, including a frame; a reservoir positioned within the frame; at least one drainage aperture positioned at a bottom of the reservoir; at least one breathing aperture positioned on the reservoir; and a pressure actuated valve positioned at least partially on an inner surface of the reservoir, covering the at least one breathing aperture.
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
A distance estimation system in which a laser light emitter, two image sensors, and an image processor are positioned on a baseplate such that the fields of view of the image sensors overlap and contain the projections of an emitted collimated laser beam within a predetermined range of distances. The image sensors simultaneously capture images of the laser beam projections. The images are superimposed and displacement of the laser beam projection from a first image taken by a first image sensor to a second image taken by a second image sensor is extracted by the image processor. The displacement is compared to a preconfigured table relating displacement distances with distances from the baseplate to projection surfaces to find an estimated distance of the baseplate from the projection surface at the time that the images were captured.
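The table lookup described above can be sketched as follows: the pixel displacement of the laser spot between the two images indexes a preconfigured calibration table, interpolating between entries. The table values below are invented for illustration; larger displacements correspond to nearer surfaces, as in stereo disparity:

```python
# Hypothetical calibration: (displacement_px, distance_cm), sorted by displacement.
CALIBRATION = [(10, 200.0), (20, 100.0), (40, 50.0), (80, 25.0)]

def distance_from_displacement(disp):
    """Linearly interpolate the calibration table; clamp outside its range."""
    xs = [d for d, _ in CALIBRATION]
    ys = [r for _, r in CALIBRATION]
    if disp <= xs[0]:
        return ys[0]
    if disp >= xs[-1]:
        return ys[-1]
    for i in range(1, len(xs)):
        if disp <= xs[i]:
            frac = (disp - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + frac * (ys[i] - ys[i - 1])
```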
A method for localizing an electronic device, including: capturing data of surroundings of the electronic device with at least one sensor of the electronic device; and inferring a location of the electronic device based on at least some of the data of the surroundings, wherein inferring the location of the electronic device includes: determining a probability of the electronic device being located at different possible locations within the surroundings based on the at least some of the data of the surroundings; and inferring the location of the electronic device based on the probability of the electronic device being located at different possible locations within the surroundings.
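The probabilistic inference above can be illustrated with a minimal histogram-filter step: each candidate location receives a probability proportional to how well the sensed data matches that location's expected data, and the location with maximum probability is inferred. The matching model below is an illustrative assumption, not the disclosed method:

```python
def localize(candidates, observation):
    """candidates: {location: expected_reading}; observation: sensed reading."""
    # Weight each location by closeness of expected vs. observed reading.
    weights = {loc: 1.0 / (1.0 + abs(expected - observation))
               for loc, expected in candidates.items()}
    total = sum(weights.values())
    probs = {loc: w / total for loc, w in weights.items()}  # normalize to 1
    best = max(probs, key=probs.get)                        # most probable location
    return best, probs
```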
A robot including a main brush; a peripheral brush; a first actuator; a first sensor; processors; and memory storing instructions that when executed by the processors effectuate operations. The operations include determining a first location of the robot in a working environment; obtaining first data from the first sensor or another sensor indicative of a value of an environmental characteristic of the first location; adjusting a first operational parameter of the first actuator based on the sensed first data; and forming or updating a debris map of the working environment based on data output by the first sensor or the another sensor configured to collect data indicative of an existence of debris on a floor of the working environment over at least one cleaning session.
A47L 11/40 - Parts or details of machines not provided for in groups, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
A robot including a chassis; a set of wheels coupled to the chassis; a range finding system coupled to the robot; a plurality of sensors; a processor; and a tangible, non-transitory, machine-readable medium storing instructions that when executed by the processor effectuates operations including: obtaining, with the processor, distances to obstacles measured by the range finding system as the robot moves relative to the obstacles; and determining, with the processor, a position of the obstacle based at least partially on the distance measurements, including: identifying, with the processor, at least one position of the range finding system when encountering the obstacle; and determining, with the processor, the position of the obstacle based at least partially on the at least one position of the range finding system when encountering the obstacle.
Provided is a mobile robotic device, including at least: a plurality of sensors; a processor; and a tangible, non-transitory, machine readable medium storing instructions that when executed by the processor effectuates operations comprising: selecting, by the processor, one or more actions to navigate through a workspace, wherein each action transitions the mobile robotic device from a current state to a next state; actuating, by the processor, the mobile robotic device to execute the selected one or more actions; detecting, by the processor, whether a collision is incurred by the mobile robotic device for each action executed; and, assigning, by the processor, each collision to a location within a map of the workspace wherein the location corresponds to where the respective collision occurred.
A method for determining at least one action of a robot, including capturing, with an image sensor disposed on the robot, images of objects within an environment of the robot as the robot moves within the environment; identifying, with a processor of the robot, at least a first object based on the captured images; and actuating, with the processor, the robot to execute at least one action based on the first object identified, wherein the at least one action comprises at least generating a virtual boundary and avoiding crossing the virtual boundary.
Provided is an autonomous wheeled device. A first sensor obtains first data indicative of distances to objects within an environment of the autonomous wheeled device and a second sensor obtains second data indicative of movement of the autonomous wheeled device. A processor generates at least a portion of a map of the environment using at least one of the first data and the second data and a first path of the autonomous wheeled device. The processor transmits first information to an application of a communication device paired with the autonomous wheeled device and receives second information from the application.
A method for pairing a robotic device with an application of a communication device, including the application receiving an indication to pair the robotic device with the application; the application receiving a password for the first Wi-Fi network; the robotic device enabling pairing of the robotic device with the application upon the user pressing at least one button on a user interface of the robotic device; the application displaying a map of an environment of the robotic device and a status of the robotic device and receiving mapping, cleaning, and scheduling information.
H04W 76/11 - Allocation or use of connection identifiers
H04L 29/06 - Communication control; Communication processing characterised by a protocol
G06K 19/06 - Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
A robot including a medium storing instructions that when executed by a processor of the robot effectuate operations including: capturing images of a workspace as the robot moves within the workspace; identifying at least one characteristic of an object captured in the images of the workspace; determining an object type of the object based on an object dictionary of different types of objects, wherein the different object types comprise at least a cord, clothing garments, a shoe, earphones, and pet bodily waste; and instructing the robot to execute at least one action based on the object type of the object, wherein the at least one action comprises avoiding the object or cleaning around the object.
A47L 11/40 - Parts or details of machines not provided for in groups , or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
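The object-dictionary step in the abstract above reduces to a mapping from recognized object type to an action. A minimal sketch, in which the type names and the fallback policy are illustrative assumptions:

```python
# Hypothetical object dictionary: recognized type -> action.
OBJECT_ACTIONS = {
    "cord": "avoid",
    "clothing_garment": "avoid",
    "shoe": "avoid",
    "earphones": "avoid",
    "pet_bodily_waste": "avoid",
}

def action_for(object_type):
    """Choose the robot's action for a detected object type; types not in
    the dictionary are cleaned around rather than driven over."""
    return OBJECT_ACTIONS.get(object_type, "clean_around")
```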
Provided is a medium storing instructions that when executed by one or more processors of a robot effectuate operations including: obtaining, with a processor, first data indicative of a position of the robot in a workspace; actuating, with the processor, the robot to drive within the workspace to form a map including mapped perimeters that correspond with physical perimeters of the workspace while obtaining, with the processor, second data indicative of displacement of the robot as the robot drives within the workspace; and forming, with the processor, the map of the workspace based on at least some of the first data; wherein: the map of the workspace expands as new first data of the workspace are obtained with the processor; and the robot is paired with an application of a communication device.
A47L 11/40 - Parts or details of machines not provided for in groups , or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
G01C 21/20 - Instruments for performing navigational calculations
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G05D 1/02 - Control of position or course in two dimensions
G06T 3/00 - Geometric image transformations in the plane of the image
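The expanding-map behavior described in the abstract above — new observations merged into the stored map as the robot drives — can be sketched with sets of grid cells, a stand-in for whatever map representation the device actually uses:

```python
def expand_map(known_cells, new_observations):
    """Merge newly observed perimeter cells ((x, y) grid coordinates)
    into the growing map; previously known cells are preserved."""
    return set(known_cells) | set(new_observations)
```

Calling this once per sensor sweep grows the map monotonically as new areas of the workspace become visible.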
An autonomous mobile robotic refuse container device that transports itself from a storage location to a refuse collection location and back to the storage location after collection of the refuse. When it is time for refuse collection, the robotic device autonomously navigates from the refuse container storage location to the refuse collection location. Once the refuse within the container has been collected, the robotic device autonomously navigates back to the refuse container storage location.
Included is a surface cleaning service system including: one or more robotic surface cleaning devices, each including: a chassis; a set of wheels; one or more motors to drive the wheels; one or more processors; one or more sensors; and a network interface card, wherein the one or more processors of each of the one or more robotic surface cleaning devices determine respective usage data. A control system or the one or more processors of each of the one or more robotic surface cleaning devices is configured to associate each usage data with a particular corresponding robotic surface cleaning device of the one or more robotic surface cleaning devices.
G05D 1/02 - Control of position or course in two dimensions
A47L 9/00 - Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
A47L 11/40 - Parts or details of machines not provided for in groups , or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
A47L 9/28 - Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
A robotic cleaner executing operations such as capturing data indicative of locations of objects in a workspace through which the robot moves; generating or updating a map of at least a part of the workspace based on at least the data; and navigating based on the map or an updated map of the workspace. The robotic cleaner may include a side brush with a main body with at least one attachment point and at least one bundle of bristles attached to the at least one attachment point of the main body, wherein the bristles are between 50 and 90 millimeters in length and positioned at 5 to 30 degrees with respect to a horizontal plane.
A47L 11/40 - Parts or details of machines not provided for in groups , or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
Provided is an autonomous coverage robot including: a chassis; a set of wheels; a plurality of sensors; and a mopping assembly including: a fluid reservoir for storing a cleaning fluid; a cloth for receiving the cleaning fluid, wherein the cloth is oriented toward a work surface; a means to move at least the cloth of the mopping assembly up and down in a plane perpendicular to the work surface, wherein the means to move at least the cloth of the mopping assembly up and down is controlled automatically based on input provided by at least one of the plurality of sensors; and a means to move at least a portion of the mopping assembly back and forth in a plane parallel to the work surface.
A47L 7/00 - Suction cleaners adapted for additional purposes; Tables with suction openings for cleaning purposes; Containers for cleaning articles by suction; Suction cleaners adapted to cleaning of brushes; Suction cleaners adapted to taking-up liquids
A47L 9/00 - Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
A47L 5/00 - Structural features of suction cleaners
Provided is an electronic razor, including: a frame; one or more razor blades detachable from the frame; a razor blade motor to drive the one or more razor blades; one or more sensors; a processor; and a suctioning mechanism positioned below the one or more razor blades, including: a suction fan; a suction fan motor to drive the suction fan; and a hair collection compartment.
B26B 19/02 - Clippers or shavers operating with a plurality of cutting edges, e.g. hair clippers, dry shavers of the reciprocating-cutter type
B26B 19/20 - Clippers or shavers operating with a plurality of cutting edges, e.g. hair clippers, dry shavers with provision for shearing hair of preselected or variable length
B26B 21/38 - Safety razors with one or more blades arranged transversely to the handle with provision for reciprocating the blade by means other than rollers
80.
Robotic floor cleaning device with controlled liquid release mechanism
A robotic floor cleaning device that features a controlled liquid-releasing mechanism. A rotatable cylinder with at least one aperture for storing a limited quantity of liquid is connected to a non-propelling wheel of the robotic floor cleaning device. A passage runs below the cylinder, between the cylinder and a drainage mechanism. The cylinder is within or adjacent to a liquid reservoir. Each time an aperture is exposed to the liquid within the reservoir, it fills with liquid. As the wheel turns, the connected cylinder rotates until the aperture is adjacent to the passage. The liquid in the aperture then flows through the passage and enters the drainage mechanism, which disperses the liquid onto the working surface. The release of liquid is halted when the connected wheel stops turning.
A47L 11/12 - Floor surfacing or polishing machines motor-driven with reciprocating or oscillating tools
A47L 11/40 - Parts or details of machines not provided for in groups , or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
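The passive metering described in the abstract above ties liquid release to wheel rotation: each revolution exposes the aperture to the reservoir once, so the volume dispensed per meter travelled follows from geometry alone. A sketch of that arithmetic; all figures are illustrative, not taken from the patent:

```python
import math

def liquid_per_meter(aperture_volume_ml, wheel_diameter_m, apertures=1):
    """Volume of liquid released per meter travelled: each aperture fills
    once per wheel revolution, and revolutions per meter follow from the
    wheel circumference."""
    circumference_m = math.pi * wheel_diameter_m
    return aperture_volume_ml * apertures / circumference_m
```

With an assumed 0.08 m wheel and a 0.5 ml aperture, this works out to roughly 2 ml per meter travelled.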
Provided is a system including at least two robots. A first robot includes a chassis, a set of wheels, a wheel suspension, sensors, a processor, and a machine-readable medium for storing instructions. A camera of the first robot captures images of an environment from which the processor generates or updates a map of the environment and determines a location of items within the environment. The processor extracts features of the environment from the images and determines a location of the first robot. The processor transmits information to a processor of a second robot and determines an action of the first robot and the second robot. A smart phone application is paired with at least the first robot and is configured to receive at least one user input specifying an instruction for at least the first robot and at least one user preference.
Some aspects provide a method for instructing operation of a robotic floor-cleaning device based on the position of the robotic floor-cleaning device within a two-dimensional map of the workspace. A two-dimensional map of a workspace is generated using inputs from sensors positioned on a robotic floor-cleaning device to represent the multi-dimensional workspace of the robotic floor-cleaning device. The two-dimensional map is provided to a user on a user interface. A user may adjust the boundaries of the two-dimensional map through the user interface and select settings for map areas to control device operation in various areas of the workspace.
Provided is a method including: capturing, with at least one sensor of a robot, first data indicative of the position of the robot in relation to objects within the workspace and second data indicative of movement of the robot; recognizing, with a processor of the robot, a first area of the workspace based on observing at least one of: a first part of the first data and a first part of the second data; generating, with the processor of the robot, at least part of a map of the workspace based on at least one of: the first part of the first data and the first part of the second data; generating, with the processor of the robot, a first movement path covering at least part of the first recognized area; actuating, with the processor of the robot, the robot to move along the first movement path.
G06T 7/55 - Depth or shape recovery from multiple images
G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
G06T 7/30 - Determination of transform parameters for the alignment of images, i.e. image registration
G06T 7/521 - Depth or shape recovery from laser ranging, e.g. using interferometry; Depth or shape recovery from the projection of structured light
G06T 3/00 - Geometric image transformations in the plane of the image
Some embodiments include a robot, including: a chassis; a set of wheels coupled to the chassis; at least one encoder coupled to a wheel with a resolution of at least one count for every ninety degrees of rotation of the wheel; a trailing arm suspension coupled to each drive wheel for overcoming surface transitions and obstacles, wherein a first suspension arm is positioned on a right side of a right drive wheel and a second suspension arm is positioned on a left side of a left drive wheel; a roller brush; a collection bin; a fan with multiple blades for creating a negative pressure resulting in suction of dust and debris; a network card for wireless communication with at least one of: a computing device, a charging station, and another robot; a plurality of sensors; a processor; and a medium storing instructions that when executed by the processor effectuate robotic operations.
G05D 1/02 - Control of position or course in two dimensions
A47L 9/00 - Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
A47L 5/22 - Structural features of suction cleaners with power-driven air-pumps or air-compressors, e.g. driven by motor vehicle engine vacuum with rotary fans
A47L 9/04 - Nozzles with driven brushes or agitators
A47L 9/28 - Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
A method for centrally aligning a robot with an electronic device, including: transmitting, with at least one transmitter, a first signal; receiving, with a first receiver and a second receiver, the first signal; detecting, with a controller coupled to the first receiver and the second receiver, that the robot is centrally aligned with the electronic device when the first receiver and the second receiver simultaneously receive the first signal, wherein a virtual line passing through a center of the robot and a center of the electronic device is aligned with a midpoint between the first receiver and the second receiver; and executing, with the robot, a particular movement type when the robot is aligned with the electronic device.
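The alignment test in the method above reduces to a simultaneity check on the two receivers. A minimal sketch; the timestamps and tolerance window are illustrative assumptions, not part of the claimed method:

```python
def centrally_aligned(t_left, t_right, tolerance_s=0.01):
    """The robot is centrally aligned when both receivers report the
    transmitter's signal within the same short window; a missing
    reception on either side means no alignment."""
    if t_left is None or t_right is None:
        return False
    return abs(t_left - t_right) <= tolerance_s
```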
Provided is a robot, including: a chassis; a set of wheels coupled to the chassis; at least one motor for driving the set of wheels; at least one motor controller; a range finding system coupled to the robot; a plurality of sensors; a processor; and a tangible, non-transitory, machine-readable medium storing instructions that when executed by the processor effectuate operations including: obtaining, with the processor, distances to obstacles measured by the range finding system as the robot moves relative to the obstacles; monitoring, with the processor, the distance measurements; identifying, with the processor, outlier distance measurements in otherwise steadily fitting distance measurements; determining, with the processor, a depth of an obstacle based on the distance measurements; and determining, with the processor, a position of the obstacle based on the distance measurements.
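One simple way to flag "outlier distance measurements in otherwise steadily fitting distance measurements" is to compare each reading against the median of its neighbors. The window size and threshold below are illustrative assumptions, not the claimed algorithm:

```python
import statistics

def outlier_indices(distances, window=5, threshold=0.3):
    """Flag readings that deviate from the median of their surrounding
    window by more than `threshold` -- a stand-in for spotting outliers
    in an otherwise steady stream of range measurements."""
    flagged = []
    for i, d in enumerate(distances):
        lo = max(0, i - window // 2)
        hi = min(len(distances), i + window // 2 + 1)
        neighbors = distances[lo:i] + distances[i + 1:hi]
        if neighbors and abs(d - statistics.median(neighbors)) > threshold:
            flagged.append(i)
    return flagged
```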
Provided is an autonomous versatile robotic chassis, including: a chassis; a set of wheels coupled to the chassis; one or more motors to drive the set of wheels; at least one storage compartment within which one or more items for delivery are placed during transportation; a processor; one or more sensors; a camera; and a tangible, non-transitory, machine readable medium storing instructions that when executed by the processor effectuate operations including: generating, with the processor, a map of an environment; localizing, with the processor, the robotic chassis; receiving, with the processor, a request for delivery of the one or more items to a first location; generating, with the processor, a movement path to the first location from a current location; and instructing, with the processor, the robotic chassis to transport the one or more items to the first location by navigating along the movement path.
B60P 3/20 - Vehicles adapted to transport, to carry or to comprise special loads or objects for transporting refrigerated goods
B60N 3/10 - Arrangements or adaptations of other passenger fittings, not otherwise provided for of receptacles for food or beverages, e.g. refrigerated
G06N 7/01 - Probabilistic graphical models, e.g. probabilistic networks
Provided is a machine-readable medium storing instructions that when executed by a processor effectuate operations including: receiving, with an application executed by a communication device, a first set of inputs including user data; generating, with the application, a three-dimensional model of the user based on the user data; receiving, with the application, a second set of inputs including a type of clothing garment; generating, with the application, a first set of clothing garments including clothing garments from a database of clothing garments that are the same type of clothing garment; generating, with the application, a second set of clothing garments from the first set of clothing garments based on the user data and one or more relationships between clothing attributes and human attributes; and presenting, with the application, the clothing garments from the second set of clothing garments virtually fitted on the three-dimensional model of the user.
Provided is a medium storing instructions that when executed by one or more processors effectuate operations including: obtaining a stream of spatial data indicative of a robot's position in a workspace; obtaining a stream of movement data indicative of the robot's displacement in the workspace; navigating along a path of the robot in the workspace based on the stream of spatial data; while navigating, mapping at least part of the workspace based on the stream of spatial data to form or update a spatial map in memory; wherein the spatial map expands as new areas of the workspace are covered by the robot and spatial data of the new areas of the workspace are obtained and used by the one or more processors to update the spatial map; and wherein the spatial map of the workspace is segmented into two or more zones.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G05D 1/02 - Control of position or course in two dimensions
G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
G06T 7/30 - Determination of transform parameters for the alignment of images, i.e. image registration
A47L 11/40 - Parts or details of machines not provided for in groups , or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
G01C 21/20 - Instruments for performing navigational calculations
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
G06T 7/62 - Analysis of geometric attributes of area, perimeter, diameter or volume
H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
G06T 7/521 - Depth or shape recovery from laser ranging, e.g. using interferometry; Depth or shape recovery from the projection of structured light
G06T 3/00 - Geometric image transformations in the plane of the image
A fleet of delivery robots, each including: a chassis; a storage compartment within which items are stored for transportation; a set of wheels coupled to the chassis; at least one sensor; a processor electronically coupled to the control system and the at least one sensor; and a tangible, non-transitory, machine readable medium storing instructions that when executed by the processor effectuate operations including: capturing, with the at least one sensor, data of an environment and data indicative of movement of the respective delivery robot; generating or updating, with the processor, a first map of the environment based on at least a portion of the captured data; inferring, with the processor, a current location of the respective delivery robot; and actuating, with the processor, the respective delivery robot to execute a delivery task including transportation of at least one item from a first location to a second location.
B62D 33/063 - Drivers' cabs movable from one position into at least one other position, e.g. tiltable, pivotable about a vertical axis, displaceable from one side of the vehicle to the other
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G05D 1/02 - Control of position or course in two dimensions
G06T 7/55 - Depth or shape recovery from multiple images
Provided is a robotic device, including: a chassis; a set of wheels; a control system; a battery; one or more sensors; a processor; and a tangible, non-transitory, machine readable medium storing instructions that when executed by the processor effectuate operations including: capturing, with the one or more sensors, data of an environment of the robotic device and data indicative of movement of the robotic device; generating or updating, with the processor, a map of the environment based on at least a portion of the captured data; and generating or updating, with the processor, a movement path of the robotic device based on at least the map of the environment.
Provided is a tangible, non-transitory, machine readable medium storing instructions that when executed by a processor of a robotic device effectuate operations including: receiving, by the processor, a sequence of one or more commands; executing, by the robotic device, the sequence of one or more commands; saving, by the processor, the sequence of one or more commands in memory after a predetermined amount of time from receiving a most recent one or more commands; and re-executing, by the robotic device, the saved sequence of one or more commands.
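The save-after-a-quiet-period behavior described above can be modeled as a recorder that commits the pending command sequence once no new command has arrived for a fixed interval. The idle interval and injectable clock below are illustrative assumptions:

```python
import time

class CommandRecorder:
    """Record a sequence of commands and save it once no new command has
    arrived for `idle_seconds`; the saved sequence can then be replayed.
    A simplified model of the claimed save-after-quiet-period behavior."""
    def __init__(self, idle_seconds=2.0, clock=time.monotonic):
        self.idle_seconds = idle_seconds
        self.clock = clock
        self.pending = []
        self.last_received = None
        self.saved = None

    def receive(self, command):
        """Append a command and note when it arrived."""
        self.pending.append(command)
        self.last_received = self.clock()

    def maybe_save(self):
        """Commit the pending sequence if the quiet period has elapsed;
        return whatever sequence is currently saved."""
        if (self.pending and self.last_received is not None
                and self.clock() - self.last_received >= self.idle_seconds):
            self.saved = list(self.pending)
            self.pending = []
        return self.saved
```

The saved list can then be fed back to the drive system to re-execute the sequence.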
A mop module of a robot, including: a liquid reservoir for storing liquid; and an electronically-controlled liquid release mechanism; wherein: the electronically-controlled liquid release mechanism releases liquid from the liquid reservoir onto a work surface of the robot for mopping; a schedule for mopping at least one area is determined by a processor of the robot or based on user input provided to an application of a communication device paired with the robot; and a quantity of liquid released while the robot mops the at least one area is determined by the processor of the robot or based on user input provided to the application of the communication device paired with the robot.
A47L 11/40 - Parts or details of machines not provided for in groups , or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
Provided is a bumper apparatus of a robot, including: a bumper elastically coupled with a chassis of the robot; and at least one elastic element coupled to or interfacing with both the chassis and the bumper; wherein: the at least one elastic element facilitates movement of the bumper relative to the chassis upon impact with an object and disengagement from the object after impact; the at least one elastic element facilitates a return of the bumper to a neutral position upon disengaging from the object after impact; and the bumper covers at least a portion of the chassis.
B62D 24/04 - Vehicle body mounted on resilient suspension for movement relative to the vehicle frame
A47L 9/00 - Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
B62D 27/04 - Connections between superstructure sub-units resilient
A47L 11/40 - Parts or details of machines not provided for in groups , or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers or levers
G05D 1/02 - Control of position or course in two dimensions
95.
Method for robotic devices to identify doorways using machine learning
A method for identifying a doorway, including receiving, with a processor of an automated mobile device, sensor data of an environment of the automated mobile device from one or more sensors coupled with the processor, wherein the sensor data is indicative of distances to objects within the environment; identifying, with the processor, a doorway in the environment based on the sensor data; marking, with the processor, the doorway in an indoor map of the environment; and instructing, with the processor, the automated mobile device to execute one or more actions upon identifying the doorway, wherein the one or more actions comprise finishing a first task in a first work area before crossing the identified doorway into a second work area to perform a second task.
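One simplified way to identify a doorway from distance data while tracking a wall is to look for a sufficiently wide run of readings that jump well past the expected wall distance. The thresholds below are illustrative assumptions, not the claimed machine-learning method:

```python
def find_doorways(scan, wall_distance, gap_threshold=0.5, min_width=3):
    """Return (start, end) index pairs of runs of at least `min_width`
    consecutive readings that exceed the expected wall distance by more
    than `gap_threshold` -- candidate doorway openings in the scan."""
    doorways, start = [], None
    for i, d in enumerate(scan):
        if d - wall_distance > gap_threshold:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_width:
                doorways.append((start, i - 1))
            start = None
    if start is not None and len(scan) - start >= min_width:
        doorways.append((start, len(scan) - 1))
    return doorways
```

Candidate openings can then be filtered by physical width before being marked in the indoor map.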
Provided is a tangible, non-transitory, machine readable medium storing instructions that when executed by one or more processors of a robotic device effectuate operations including capturing, with a camera of the robotic device, spatial data of surroundings of the robotic device; generating, with the one or more processors of the robotic device, a movement path based on the spatial data of the surroundings; capturing, with at least one sensor of the robotic device, at least one measurement relative to the surroundings of the robotic device; obtaining, with the one or more processors of the robotic device, the at least one measurement; and inferring, with the one or more processors of the robotic device, a location of the robotic device based on the at least one measurement.
A robotic device, including a tangible, non-transitory, machine readable medium storing instructions that when executed by a processor effectuate operations including: capturing, with a camera of the robotic device, one or more images of an environment of the robotic device; capturing, with a plurality of sensors, sensor data of the environment; generating or updating, with the processor, a map of the environment; identifying, with the processor, one or more rooms in the map; receiving, with the processor, one or more multidimensional arrays including at least one parameter that is used to identify a feature included in the one or more images; determining, with the processor, a position and orientation of the robotic device relative to the feature; and transmitting, with the processor, a signal to the processor of the controller to adjust a heading of the robotic device.
A retractable cable assembly for use with an electrical charger, power adapter, or other power supply. A cable wound on a spool disposed within a housing may be extracted by manually pulling on the cable or by pressing a release switch until the desired length of cable is drawn. As the cable is drawn, an engaged locking mechanism keeps the cable in place during and after extraction, until retraction of the cable is desired. A retraction actuator disengages the locking mechanism, thereby freeing the cable and immediately retracting it within the housing.
Provided is a tangible, non-transitory, machine readable medium storing instructions that when executed by at least one image processor effectuate operations including: capturing, with a first image sensor, a first image of at least two light points projected on a surface by at least one laser light emitter; extracting, with the at least one image processor, a first distance between the at least two light points in the first image in a first direction; and estimating, with the at least one image processor, a first distance to the surface on which the at least two light points are projected based on at least the first distance between the at least two light points and a predetermined relationship relating a distance between at least two light points in the first direction and a distance to the surface on which the at least two light points are projected.
G01B 11/14 - Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures
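For two parallel laser beams separated by a fixed baseline, a pinhole-camera model gives the "predetermined relationship" above a simple form: the image-plane spacing of the projected points shrinks in inverse proportion to the distance to the surface, so d = f·b / s. A sketch under that assumption; the focal length and baseline are illustrative calibration constants, not values from the patent:

```python
def distance_from_spacing(pixel_spacing, focal_length_px, baseline_m):
    """Estimate distance to a surface from the image-plane spacing (in
    pixels) of two parallel laser points a fixed baseline apart:
    d = focal_length * baseline / spacing (pinhole-camera model)."""
    if pixel_spacing <= 0:
        raise ValueError("pixel spacing must be positive")
    return focal_length_px * baseline_m / pixel_spacing
```

In practice the relationship would be calibrated empirically rather than derived from nominal optics.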