Various embodiments relate generally to computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include, at least, detecting an optical sight to align with an associated agricultural object, tracking the agricultural objects relative to the optical sight, predicting a parameter to track in association with the agricultural object, and activating an emitter to apply an action based on the parameter.
Various embodiments relate generally to computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include identifying an agricultural object as being associated with a subset of agricultural objects, identifying an action corresponding to the subset of agricultural objects, selecting an emitter with which to perform the action associated with the agricultural object, and configuring an agricultural projectile delivery system to activate the emitter to propel an agricultural projectile to intercept the agricultural object.
44 - Medical, veterinary, hygienic and cosmetic services; agriculture, horticulture and forestry services
Goods & Services
Agricultural machines, namely, autonomous robotic weeders, thinners, spreaders; Agricultural machinery and attachments, namely, fertilizer spreaders; Agricultural machines, namely, robotic platforms utilizing artificial intelligence to provide precision mapping and application of fertilizers, insecticides, fungicides, pesticides, and organic treatments to agricultural crops; Horticulture consulting services for treatment and maintenance of crops; Agronomic consulting services in the field of crop yields, pest control, plant nutrient management
44 - Medical, veterinary, hygienic and cosmetic services; agriculture, horticulture and forestry services
Goods & Services
Horticulture consulting services for treatment and maintenance of crops; Agronomic consulting services in the field of crop yields, pest control, plant nutrient management
09 - Scientific and electric apparatus and instruments
42 - Scientific, technological and industrial services, research and design
Goods & Services
Downloadable mobile applications for use in the field of agriculture; Downloadable mobile applications for control and monitoring of autonomous robotic weeders, thinners, spreaders; Downloadable mobile applications for precision mapping and application of fertilizers, insecticides, fungicides, pesticides, and organic treatments to agricultural crops; Downloadable computer programs using artificial intelligence (AI) for use in the field of agriculture; Downloadable computer programs using artificial intelligence (AI) for control and monitoring of autonomous robotic weeders, thinners, spreaders; Downloadable computer programs using artificial intelligence (AI) for precision mapping and application of fertilizers, insecticides, fungicides, pesticides, and organic treatments to agricultural crops; Providing temporary use of on-line non-downloadable software for use in the field of agriculture; Providing temporary use of on-line non-downloadable software for control and monitoring of autonomous robotic weeders, thinners, spreaders; Providing temporary use of on-line non-downloadable software for precision mapping and application of fertilizers, insecticides, fungicides, pesticides, and organic treatments to agricultural crops; Providing on-line non-downloadable software using artificial intelligence (AI) for control and monitoring of autonomous robotic weeders, thinners, spreaders; Providing on-line non-downloadable software using artificial intelligence (AI) for precision mapping and application of fertilizers, insecticides, fungicides, pesticides, and organic treatments to agricultural crops; Providing on-line non-downloadable software using artificial intelligence (AI) for use in the field of agriculture
7.
CONFIGURABLE AND MODULAR LASER COMPONENT FOR TARGETING SMALL OBJECTS
An agricultural treatment system can include a laser light source, a redirection component, an exit component, and safety component(s). The laser light source can emit a laser beam within the system in a first direction. The redirection component can be a mirror that can receive the laser beam and redirect it in a second direction toward an agricultural target object outside the system. The exit component can transition the laser beam from within to outside the system and toward the agricultural target object. Safety component(s) can facilitate maximizing the strength of the laser beam at or proximate the agricultural target object while also reducing the strength of the laser beam at locations away from the agricultural target object. Safety component(s) can include a converging lens and a diverging lens, which can be adjusted to locate a focal point of the laser beam at or proximate the agricultural target object.
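As a purely illustrative aside (standard Gaussian thin-lens optics, not language from the abstract above, and assuming the converging lens sits in front of the diverging lens), the way the lens pair relocates the beam's focal point can be summarized as follows:

\[
\frac{1}{f_{\mathrm{eff}}} = \frac{1}{f_c} + \frac{1}{f_d} - \frac{d}{f_c f_d},
\qquad
\mathrm{BFD} = f_{\mathrm{eff}}\left(1 - \frac{d}{f_c}\right),
\]

where \(f_c > 0\) is the converging lens focal length, \(f_d < 0\) the diverging lens focal length, \(d\) the lens separation, and BFD the back focal distance from the second lens to the focal point. Adjusting \(d\) moves the focal point to lie at or near the agricultural target object, so the beam is most concentrated there and diverges, and therefore weakens, at locations beyond it.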
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may determine a first real-world geo-spatial location of the treatment system. The system can receive captured images depicting real-world agricultural objects of a geographic scene. The system can associate captured images with the determined geo-spatial location of the treatment system. The treatment system can identify, from a group of mapped and indexed images, images having a second real-word geo-spatial location that is proximate with the first real-world geo-spatial location. The treatment system can compare at least a portion of the identified images with at least a portion of the captured images. The treatment system can determine a target object and emit a fluid projectile at the target object using a treatment device.
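A minimal sketch of the geo-spatial matching step described above, under assumed data shapes (the class, field names, and 5 m radius are illustrative, not the patented implementation): previously mapped and indexed images are filtered to those whose stored location is proximate to the treatment system's current location, so they can then be compared with the freshly captured frames.

```python
# Illustrative sketch only: select indexed images near the system's current
# geo-spatial location before comparing them with newly captured images.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class IndexedImage:
    image_id: str
    lat: float   # degrees
    lon: float   # degrees

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000.0
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * r * asin(sqrt(a))

def proximate_images(index, system_lat, system_lon, radius_m=5.0):
    """Return indexed images whose stored location lies within radius_m of the
    treatment system's current geo-spatial location."""
    return [img for img in index
            if haversine_m(img.lat, img.lon, system_lat, system_lon) <= radius_m]

# Example: find mapped images near the current position.
index = [IndexedImage("row3_plant17", 36.7750, -119.4180),
         IndexedImage("row9_plant02", 36.7812, -119.4001)]
nearby = proximate_images(index, 36.77501, -119.41802)
```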
A system provides for receiving data representing a policy with which to implement one or more actions in association with one or more subsets of agricultural objects. The system identifies a subset of payloads to provide actions based on data representing the policy for one or more subsets of agricultural objects. The system determines whether a payload is sufficient to implement the one or more actions. The system transmits data representing executable instructions to implement the one or more actions.
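The payload-sufficiency check described above might look like the following hedged sketch (dose values, payload names, and data shapes are assumptions for illustration): the policy maps object subsets to actions, the required volume per payload is tallied from the expected object counts, and instructions are only transmitted when the onboard payloads cover that requirement.

```python
# Minimal sketch: estimate payload volume needed by the policy and check
# whether the available payloads are sufficient before issuing instructions.
def required_volume_ml(policy, object_counts):
    """Sum per-payload volume needed to act on every object subset in the policy."""
    totals = {}
    for subset, action in policy.items():
        count = object_counts.get(subset, 0)
        payload = action["payload"]
        totals[payload] = totals.get(payload, 0.0) + count * action["dose_ml"]
    return totals

def payloads_sufficient(required, available):
    """True only if every required payload is stocked in at least the needed volume."""
    return all(available.get(p, 0.0) >= v for p, v in required.items())

policy = {"weed":    {"payload": "herbicide_a", "dose_ml": 1.2},
          "blossom": {"payload": "thinner_b",   "dose_ml": 0.4}}
counts = {"weed": 900, "blossom": 2500}
required = required_volume_ml(policy, counts)
ok = payloads_sufficient(required, {"herbicide_a": 2000.0, "thinner_b": 800.0})
# Only transmit the executable treatment instructions when `ok` is True.
```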
Various embodiments of apparatuses, methods, systems, and computer program products described herein are directed to a treatment system having a first group of in line emitters, a second group of in line emitters, and one or more payload sources, configured to identify a first group of one or more agricultural objects, select a first emitter from the first group of in line emitters to apply a first liquid-based treatment to the identified first group of one or more agricultural objects, identify a second group of one or more agricultural objects, select a second emitter from the second group of in line emitters to apply a second liquid-based treatment to the identified second group of one or more agricultural objects, simultaneously apply the first liquid-based treatment towards the first group of one or more agricultural objects and apply the second liquid-based treatment towards the second group of one or more agricultural objects.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G05D 1/249 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons from positioning sensors located off-board the vehicle, e.g. from cameras
G05D 1/646 - Following a predefined trajectory, e.g. a line marked on the floor or a flight path
G06F 18/243 - Classification techniques relating to the number of classes
A method and system having instructions to perform actions include obtaining images of an agricultural environment including a first image comprising at least one background portion and one or more regions of interest, implementing a machine learning (ML) algorithm on a portion of the first image including a portion of the background portion and the one or more regions of interest, detecting a plurality of objects associated with a plurality of real-world objects in the agricultural environment in at least one region of interest in the one or more regions of interest of the first image including detecting a first object and detecting a second object, implementing a second algorithm on the portion of the first image comprising the first object to detect one or more divided features of the first object, tracking a feature of the one or more divided features across subsequent images as the moving platform traverses the agricultural environment, tracking the second object, and selecting a target action configured to target the second object and applying the target action via the target mechanism to the second object.
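A structural sketch of the cascaded flow above, with stand-in callables (coarse_detector, feature_detector, tracker) that are assumptions rather than a published API: a first ML pass finds objects within a region of interest, a second pass divides one detected object into finer features, one feature is tracked across subsequent frames, and the other detection is handed off as the target of the action.

```python
# Hedged sketch of the two-stage detect/divide/track flow; the detector and
# tracker callables are placeholders supplied by the caller.
def detect_and_track(frames, roi, coarse_detector, feature_detector, tracker):
    first = frames[0]
    objects = coarse_detector(first, roi)        # e.g. crop plant and weed boxes
    if not objects:
        return None
    plant, weed = objects[0], objects[-1]        # first and second detections
    features = feature_detector(first, plant)    # divided features of object 1
    track = [features[0]]
    for frame in frames[1:]:                     # follow one feature as the
        track.append(tracker(frame, track[-1]))  # platform traverses the field
    return {"tracked_feature": track, "treat_target": weed}
```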
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G05D 1/249 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons from positioning sensors located off-board the vehicle, e.g. from cameras
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural treatment system and method of operation. The agricultural treatment system uses a spraying apparatus for spraying fluid at agricultural objects. The spraying apparatus is configured with a spraying head assembly that includes a moveable spraying head with one or more spraying tips, and a first motor assembly having a first motor interconnected with the spraying head assembly, wherein the one or more spraying tips pivot and rotate about an axis via a rotation of the first motor to emit a first fluid of one or more fluids at the agricultural objects.
A01C 23/00 - Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
A01C 23/04 - Distributing under pressure; Distributing mud; Adaptation of watering systems for fertilising-liquids
A01M 7/00 - Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
A01M 21/04 - Apparatus for destruction by steam, chemicals, burning, or electricity
B05B 12/12 - Arrangements for controlling delivery; Arrangements for controlling the spray area; responsive to condition of liquid or other fluent material discharged, of ambient medium or of target; responsive to conditions of ambient medium or target, e.g. humidity, temperature
B05B 13/04 - Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work; the spray heads being moved during operation
B23K 26/08 - Devices involving relative movement between laser beam and workpiece
G05D 1/249 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons from positioning sensors located off-board the vehicle, e.g. from cameras
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural treatment system and method of operation. The agricultural treatment system may obtain imagery of emitted fluid projectiles at intended target locations. The system may identify positional parameters of a spraying head and/or motors used to maneuver the spraying head to emit the fluid projectile. The system may generate a calibration or lookup table based on a three-dimensional coordinate of the intended target location and of the positional parameters of the spraying head. The system may then use the lookup table to perform subsequent spray operations.
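An illustrative sketch of the calibration/lookup idea described above (the class, coordinate convention, and nearest-neighbor lookup are assumptions, not the patented routine): each observed test spray contributes a 3D target coordinate and the motor positions that produced it, and later spray requests reuse the stored parameters of the closest calibrated point.

```python
# Minimal sketch: build a calibration table and answer new targets with the
# parameters of the nearest recorded observation.
import math

class SprayCalibration:
    def __init__(self):
        self.table = []   # entries: ((x, y, z), (pan_deg, tilt_deg))

    def add_observation(self, target_xyz, pan_deg, tilt_deg):
        self.table.append((tuple(target_xyz), (pan_deg, tilt_deg)))

    def lookup(self, target_xyz):
        """Return the motor parameters recorded for the closest calibrated target."""
        return min(self.table, key=lambda e: math.dist(e[0], target_xyz))[1]

cal = SprayCalibration()
cal.add_observation((0.10, 0.50, 1.20), pan_deg=4.0, tilt_deg=-12.5)
cal.add_observation((0.35, 0.48, 1.22), pan_deg=9.5, tilt_deg=-12.1)
pan, tilt = cal.lookup((0.12, 0.49, 1.21))   # reuse calibration for a new target
```

In practice an interpolation over several neighbors would smooth the table, but the nearest-neighbor version keeps the sketch short.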
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural treatment system and method of operation. The agricultural treatment system may obtain with one or more image sensors at a first time period, a first set of images each comprising a plurality of pixels depicting a ground area and a first target agricultural object positioned in the ground area. The system may emit a first fluid projectile of a first fluid at the first target agricultural object. The system may obtain with the one or more image sensors at a second time period, a second set of images each comprising a plurality of pixels depicting the ground area and the agricultural object. The system may compare the first image with the second image to determine a change in pixels between at least a first image of the first set of images and at least a second image of the second set of images. And the system may, based on the determined change in pixels as between the first and second images, identify a first group of pixels that represent a first spray object.
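The pixel-change comparison above can be illustrated with a small frame-differencing sketch (grayscale numpy frames and the threshold value are assumptions): pixels whose intensity changed enough between the pre-spray and post-spray captures form the candidate "spray object".

```python
# Minimal sketch: difference the before/after images and keep the pixels whose
# change exceeds a threshold as the detected spray region.
import numpy as np

def spray_pixels(before, after, threshold=25):
    """Return a boolean mask of pixels that changed enough to indicate spray."""
    diff = np.abs(after.astype(np.int16) - before.astype(np.int16))
    return diff > threshold

before = np.zeros((4, 4), dtype=np.uint8)
after = before.copy()
after[1:3, 1:3] = 80                      # sprayed area changes appearance
mask = spray_pixels(before, after)        # True where the spray landed
coverage = mask.mean()                    # fraction of the frame that changed
```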
Various embodiments of apparatuses, methods, systems, and computer program products described herein are directed to a treatment system. An example apparatus includes a treatment apparatus having a first fluid source, one or more channels for delivering a first fluid of the first fluid source, and a treatment unit, the treatment unit having one or more fluid regulators, including a first fluid regulator configured to regulate a flow of the first fluid received by the treatment unit, a movable treatment head unit having one or more spraying tips, a first channel fluidly coupling the one or more spraying tips with the first fluid regulator, wherein the first fluid regulator controls a volume of fluid emitted from the one or more spraying tips, and one or more controllers disposed in the treatment unit configured to control the one or more fluid regulators.
A01M 7/00 - Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
A01C 23/00 - Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
A01C 23/04 - Distributing under pressure; Distributing mud; Adaptation of watering systems for fertilising-liquids
A01M 9/00 - Special adaptations or arrangements of powder-spraying apparatus for purposes covered by this subclass
17.
SYSTEMS AND METHODS FOR SURVEYING AN AGRICULTURAL ENVIRONMENT ON A MOVING PLATFORM
Various embodiments relate generally to computer vision and automation to autonomously survey an agricultural environment. In some examples, a system can receive sensor data of a geographic boundary from one or more sensors onboard the agricultural observation system, detect a plurality of external dynamic objects, a plurality of external static objects, one or more agricultural objects, or a combination thereof from the sensor data, determine a position of a component of the agricultural observation system relative to the plurality of external dynamic objects, the plurality of external static objects, the one or more agricultural objects, or a combination thereof, and generate a local map, in real time, via one or more processing units.
A method implemented by a treatment system disposed on a vehicle, the treatment system having one or more processors, a storage, and a treatment mechanism, includes capturing a first image of a region of an agricultural environment, detecting, by implementing a first machine learning (ML) algorithm on a first portion of the first image, a presence of at least a portion of a first object in the first image, determining whether the first object detected is a treatment candidate, determining, upon determining that the first object is a treatment candidate, a first three-dimensional (3D) location of at least a portion of the first object in the agricultural environment, and applying a treatment to at least the portion of the first object by activating the treatment mechanism to interact with the first object.
Various embodiments of methods, systems, and computer program products described herein are directed to a treatment system. An example method includes identifying a type of a first target agricultural object, based on the identified type of the first target agricultural object, determining a first spray profile for treating the first target agricultural object, sending instructions of a first treatment parameter to a treatment unit, and based on the first spray profile, activating the treatment unit to emit a first fluid at the first target agricultural object, wherein a first fluid regulator is moved to an open position for an amount of time such that the first fluid is emitted, via a first spraying tip, as a first fluid projectile at the first target agricultural object for a first spray duration.
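A hedged sketch of the spray-profile step above (the profile table, doses, and flow rate are made-up illustration values): the identified object type selects a spray profile, and the time the regulator stays open follows from the profile's dose and the tip's flow rate.

```python
# Minimal sketch: map object type -> spray profile, then derive how long the
# fluid regulator must stay open to deliver that dose at the tip's flow rate.
SPRAY_PROFILES = {
    "broadleaf_weed": {"dose_ml": 1.5, "pressure_kpa": 300},
    "seedling_crop":  {"dose_ml": 0.3, "pressure_kpa": 150},
}

def spray_duration_s(object_type, tip_flow_ml_per_s):
    profile = SPRAY_PROFILES[object_type]
    return profile["dose_ml"] / tip_flow_ml_per_s

# e.g. a 5 ml/s tip keeps the regulator open 0.3 s for a broadleaf weed
duration = spray_duration_s("broadleaf_weed", tip_flow_ml_per_s=5.0)
```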
A01M 7/00 - Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
A01C 23/00 - Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
A01C 23/04 - Distributing under pressure; Distributing mud; Adaptation of watering systems for fertilising-liquids
A01M 9/00 - Special adaptations or arrangements of powder-spraying apparatus for purposes covered by this subclass
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may receive captured sensor data of an agricultural environment by one or more image capture devices, generate a first image from the captured sensor data, detect a first real-world agricultural object from the first image, determine a first pose of the first real-world agricultural object, identify the detected first real-world agricultural object as a new agricultural object or a preidentified agricultural object, and determine a treatment policy associated with the first real-world agricultural object.
A01C 23/00 - Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
A01M 9/00 - Special adaptations or arrangements of powder-spraying apparatus for purposes covered by this subclass
G05B 19/4155 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by programme execution, i.e. part programme or machine function execution, e.g. selection of a programme
G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
G06F 18/243 - Classification techniques relating to the number of classes
21.
Multiaction treatment of plants in an agricultural environment
A method includes traversing, by the treatment system, along a path in an agricultural environment, receiving, by the treatment system, one or more sensor readings comprising one or more agricultural objects, identifying one or more objects of interest from the one or more agricultural objects by analyzing the one or more sensor readings, determining a first target object of the one or more objects of interest for treatment, selecting a treatment policy to treat the first target object, and activating the treatment mechanism to treat the first target object with the treatment policy.
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may determine a first real-world geo-spatial location of the treatment system. The system can receive captured images depicting real-world agricultural objects of a geographic scene. The system can associate captured images with the determined geo-spatial location of the treatment system. The treatment system can identify, from a group of mapped and indexed images, images having a second real-word geo-spatial location that is proximate with the first real-world geo-spatial location. The treatment system can compare at least a portion of the identified images with at least a portion of the captured images. The treatment system can determine a target object and emit a fluid projectile at the target object using a treatment device.
A computer-implemented method of sensor input processing, implemented by an agricultural platform comprising a processor and a sensor, includes capturing, using the sensor, sensor images of a vicinity of a target object of a time interval during which a treatment is applied to the target object; processing the sensor images using one or more machine learning (ML) algorithms wherein at least one ML algorithm uses an ML model trained to detect a presence of a treatment action in the vicinity of the target object; and providing, selectively based on a result of detecting the presence of the treatment action in the vicinity of the target object, an outcome of the processing for further processing.
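The selective-forwarding idea above might be sketched as follows (the verification model, confidence threshold, and callback are assumptions): a model trained to recognize a treatment action is run over the frames captured while the treatment was applied, and the result is only passed downstream when the action is actually observed near the target.

```python
# Hedged sketch: gate further processing on whether a treatment action was
# visible in the vicinity of the target during the treatment interval.
def verify_and_forward(frames, action_model, downstream, min_confidence=0.8):
    """action_model(frame) -> confidence that a spray/treatment is visible."""
    best = max(action_model(f) for f in frames)
    detected = best >= min_confidence
    if detected:
        downstream({"treatment_observed": True, "confidence": best})
    return detected
```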
A method of processing agricultural images includes comparing object detections performed by multiple image processing schemes to determine a set of ground truth images from which at least one machine learning (ML) models used by at least one ML algorithm included in the multiple image processing schemes is trained, wherein the multiple image processing schemes include two or more of (a) an image processing scheme that includes a cascade of multiple ML algorithms; (b) an image processing scheme that includes image annotation based on user feedback; or (c) an image processing scheme that includes a cascade of an ML algorithm or a computer vision (CV) algorithm and a user feedback.
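One way to picture the cross-scheme comparison above is an agreement filter (box format, IoU threshold, and the two-scheme setup are assumptions, not the claimed method): a detection only becomes a ground-truth candidate when two independent processing schemes, say an ML cascade and a human-reviewed annotation pass, place overlapping boxes on it.

```python
# Illustrative sketch: keep detections on which two schemes agree (by IoU) as
# ground-truth candidates for training the ML models.
def iou(a, b):
    """Boxes as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def agreed_detections(scheme_a_boxes, scheme_b_boxes, min_iou=0.5):
    """Detections both schemes agree on become ground-truth candidates."""
    return [a for a in scheme_a_boxes
            if any(iou(a, b) >= min_iou for b in scheme_b_boxes)]
```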
A computer-implemented method of sensor input processing, implemented by an agricultural platform comprising a processor and a sensor, includes capturing, using the sensor, sensor images of a vicinity of a target object of a time interval during which a treatment is applied to the target object; processing the sensor images using one or more machine learning (ML) algorithms wherein at least one ML algorithm uses an ML model trained to detect a presence of a treatment action in the vicinity of the target object; and providing, selectively based on a result of detecting the presence of the treatment action in the vicinity of the target object, an outcome of the processing for further processing.
A computer-implemented method includes performing, using a processor onboard a vehicle, a machine learning (ML) processing on sensor input from sensors onboard an agricultural vehicle, identifying, according to a rule, a subset of data resulting from the ML processing and generating and displaying, in real-time, the subset of data to a user interface, thereby enabling a user interaction with the subset of data.
A01B 69/04 - Special adaptations of automatic tractor steering, e.g. electric system for contour ploughing
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G06V 10/77 - Processing image or video features in feature spaces; Arrangements for image or video recognition or understanding using pattern recognition or machine learning using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
27.
EVALUATION OF INFERENCES FROM MULTIPLE MODELS TRAINED ON SIMILAR SENSOR INPUTS
A computer-implemented method of sensor input processing, implemented by an agricultural platform comprising a processor and a sensor includes receiving sensor input from the sensor; processing the sensor input by multiple machine learning (ML) algorithms, each using a corresponding ML model for generating labels for objects identified in the sensor input; combining labels generated by each ML algorithm to generate a super-imposed labeled sensor input frame; comparing outputs of the ML algorithms to determine similarities or differences; and using results of the comparing for improving an operational characteristic of the sensor input processing.
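A minimal sketch of the label superimposition and comparison described above (the per-object vote structure is an assumption for illustration): labels from several ML models are overlaid on one frame, and objects on which the models disagree are flagged, since those are the cases most useful for improving the processing.

```python
# Hedged sketch: superimpose labels from multiple models and surface the
# objects where the models' outputs differ.
from collections import Counter

def superimpose(label_sets):
    """label_sets: {model_name: {object_id: label}} -> per-object label votes."""
    votes = {}
    for model, labels in label_sets.items():
        for obj_id, label in labels.items():
            votes.setdefault(obj_id, Counter())[label] += 1
    return votes

def disagreements(votes):
    """Object ids on which the models did not produce a single unanimous label."""
    return [obj for obj, counts in votes.items() if len(counts) > 1]

votes = superimpose({"model_a": {1: "weed", 2: "crop"},
                     "model_b": {1: "weed", 2: "weed"}})
conflicts = disagreements(votes)     # -> [2]; candidates for review or retraining
```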
An agricultural treatment system can include one or more motor assemblies, linkage assemblies coupled to the motor assemblies, and a treatment head assembly coupled to the linkage assemblies. Each motor assembly can rotate about a motor assembly rotational axis and each linkage assembly can move in a direction in response to motor assembly rotation about a motor assembly rotational axis. Portions of the treatment head assembly can rotate about one or more treatment head assembly rotational axes in response to linkage assembly movement. The treatment head assembly can include a treatment head, a fluid port that delivers a fluid into the treatment head, spraying tip(s) that spray the fluid onto an agricultural target object, and a fluid regulator that delivers the fluid from the fluid port to the spraying tip(s). The fluid regulator can be coupled to and can rotate with the treatment head, which can be movable.
Various embodiments relate generally to computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include, receiving data representing a policy specifying a type of action for an agricultural object, selecting an emitter with which to perform a type of action for the agricultural object as one of one or more classified subsets, and configuring the agricultural projectile delivery system to activate an emitter to propel an agricultural projectile to intercept the agricultural object.
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The system uses a pump in combination with a fluid regulator to obtain and disperse fluid from one or more fluid tanks. The fluid regulator includes a fluid input port, a fluid output port and a fluid control valve. A first tube is fluidly coupled from the fluid pump to the fluid input port. The system includes a moveable treatment head having one or more spraying tips, including a first spraying tip. A second tube is fluidly coupled from the fluid output port to the moveable treatment head. The system also includes a controller configured to cause the fluid regulator to open and close the fluid control valve to release an amount of the fluid.
A01M 7/00 - Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
A01C 23/04 - Distributing under pressure; Distributing mud; Adaptation of watering systems for fertilising-liquids
A01M 9/00 - Special adaptations or arrangements of powder-spraying apparatus for purposes covered by this subclass
A01C 23/00 - Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. Fluid may be provided through a first pressurized fluid line from a fluid source within an agricultural treatment system to a fluid port of a treatment head assembly of the agricultural treatment system. The fluid may be delivered from the fluid port to one or more spraying tips when a fluid regulator is actuated while in a desired position.
Various examples of apparatuses, methods, systems, and computer program products described herein are directed to agricultural observation and treatment systems and their method of operation. The agricultural observation and treatment systems can be configured to obtain one or more sensor readings of a region of an agricultural environment. The systems can implement one or more machine learning (ML) algorithms, determine one or more parameters for use with the one or more ML algorithms, detect a real-world target from the one or more sensor readings using the one or more ML algorithms, and apply a treatment to the target by selectively activating a treatment mechanism configured to interact with the target.
A method performed by a treatment system disposed on a moving platform, the treatment system having one or more processors, a storage and a treatment mechanism, comprising: receiving one or more images of an environment in which the moving platform is operating; identifying, in real-time, a pose of the moving platform using sensor inputs; identifying one or more target objects by processing the one or more images using a machine learning (ML) algorithm; and controlling the treatment mechanism to treat the one or more target objects by orienting the treatment mechanism towards the one or more target objects at least partially based on the pose.
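A simplified sketch of using the pose to orient the treatment mechanism (planar pose, radians, and the frame conventions are assumptions, not the system's controller): the target's world position is expressed in the platform's frame, from which pan and tilt angles for the mechanism follow.

```python
# Hedged sketch: turn a real-time pose and a world-frame target into pan/tilt
# angles that point the treatment mechanism at the target.
import math

def aim_from_pose(pose, target_world, head_height=1.0):
    """pose = (x, y, heading); target_world = (x, y) on the ground plane."""
    px, py, heading = pose
    dx, dy = target_world[0] - px, target_world[1] - py
    # rotate the world-frame offset into the vehicle/treatment-head frame
    fx = math.cos(-heading) * dx - math.sin(-heading) * dy
    fy = math.sin(-heading) * dx + math.cos(-heading) * dy
    pan = math.atan2(fy, fx)                              # left/right of the vehicle axis
    tilt = -math.atan2(head_height, math.hypot(fx, fy))   # down toward the ground target
    return pan, tilt

pan, tilt = aim_from_pose((10.0, 5.0, math.radians(90)), (10.5, 7.0))
```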
A method includes receiving, by the treatment system, during operation in an agricultural environment, one or more images comprising one or more agricultural objects in the agricultural environment, identifying, in real-time, one or more objects of interest from the one or more agricultural objects by analyzing the one or more images, wherein the analyzing results in a first object being identified as belonging to one or more target objects and a second object being identified as not belonging to the one or more target objects, logging one or more results of the identification of each of the one or more objects of interest and a corresponding treatment decision; and activating the treatment mechanism to treat the one or more target objects.
A method performed by a treatment system disposed on a moving platform, the treatment system having one or more processors, a storage and a treatment mechanism, comprising: receiving one or more images of an environment in which the moving platform is operating; identifying, in real-time, a pose of the moving platform using sensor inputs; identifying one or more target objects by processing the one or more images using a machine learning (ML) algorithm; and controlling the treatment mechanism to treat the one or more target objects by orienting the treatment mechanism towards the one or more target objects at least partially based on the pose.
A method includes receiving, by the treatment system, during operation in an agricultural environment, one or more images comprising one or more agricultural objects in the agricultural environment, identifying, in real-time, one or more objects of interest from the one or more agricultural objects by analyzing the one or more images, wherein the analyzing results in a first object being identified as belonging to one or more target objects and a second object being identified as not belonging to the one or more target objects, logging one or more results of the identification of each of the one or more objects of interest and a corresponding treatment decision; and activating the treatment mechanism to treat the one or more target objects.
Various embodiments relate generally to computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include, at least, detecting an optical sight to align with an associated agricultural object, tracking the agricultural objects relative to the optical sight, predicting a parameter to track in association with the agricultural object, and activating an emitter to apply an action based on the parameter.
A method includes obtaining, by the treatment system configured to implement a machine learning (ML) algorithm, one or more images of a region of an agricultural environment near the treatment system, wherein the one or more images are captured from the region of a real-world where agricultural target objects are expected to be present, determining one or more parameters for use with the ML algorithm, wherein at least one of the one or more parameters is based on one or more ML models related to identification of an agricultural object, determining a real-world target in the one or more images using the ML algorithm, wherein the ML algorithm is at least partly implemented using the one or more processors of the treatment system, and applying a treatment to the target by selectively activating the treatment mechanism based on a result of the determining the target.
A method includes receiving, by the treatment system, during operation in an agricultural environment, one or more images comprising one or more agricultural objects in the agricultural environment, identifying, in real-time, one or more objects of interest from the one or more agricultural objects by analyzing the one or more images, wherein the analyzing results in a first object being identified as belonging to one or more target objects and a second object being identified as not belonging to the one or more target objects, logging one or more results of the identification of each of the one or more objects of interest and a corresponding treatment decision; and activating the treatment mechanism to treat the one or more target objects.
Computer vision and autonomous identification of objects are used for the application of a treatment to an object. Robotics and mobility technologies navigate a delivery system which is configured to identify and apply an agricultural treatment to an identified agricultural object. The delivery system identifies a subset of payloads to provide one or more actions based on data representing a policy for one or more subsets of agricultural objects. The delivery system causes one or more cartridges to be charged based on the subset of payloads.
Various embodiments relate generally to computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include, receiving data representing a policy specifying a type of action for an agricultural object, selecting an emitter with which to perform a type of action for the agricultural object as one of one or more classified subsets, and configuring the agricultural projectile delivery system to activate an emitter to propel an agricultural projectile to intercept the agricultural object.
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system uses a treatment unit for spraying fluid at agricultural objects. The treatment unit is configured with a treatment head assembly that includes a moveable treatment head with one or more spraying tips. First and second motor assemblies are operated by the treatment unit to control the movement of the treatment head. The first motor assembly includes a first motor rotatable about a first rotational axis. A first linkage assembly is connected to the first motor and the treatment head assembly. The first linkage assembly is rotatable by the first motor. A second linkage assembly is rotatable by a second motor of the second motor assembly.
A01C 23/00 - Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
A01C 23/04 - Distributing under pressure; Distributing mud; Adaptation of watering systems for fertilising-liquids
A01M 7/00 - Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
A01M 21/04 - Apparatus for destruction by steam, chemicals, burning, or electricity
B05B 12/12 - Arrangements for controlling delivery; Arrangements for controlling the spray area; responsive to condition of liquid or other fluent material discharged, of ambient medium or of target; responsive to conditions of ambient medium or target, e.g. humidity, temperature
B05B 13/04 - Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work; the spray heads being moved during operation
B23K 26/08 - Devices involving relative movement between laser beam and workpiece
B23K 26/38 - Removing material by boring or cutting
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The system uses a pump in combination with a fluid regulator to obtain and disperse fluid from one or more fluid tanks. The fluid regulator includes a fluid input port, a fluid output port and a fluid control valve. A first tube is fluidly coupled from the fluid pump to the fluid input port. The system includes a moveable treatment head having one or more spraying tips, including a first spraying tip. A second tube is fluidly coupled from the fluid output port to the moveable treatment head. The system also includes a controller configured to cause the fluid regulator to open and close the fluid control valve to release an amount of the fluid.
A01M 7/00 - Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
A01C 23/04 - Distributing under pressure; Distributing mud; Adaptation of watering systems for fertilising-liquids
A01M 9/00 - Special adaptations or arrangements of powder-spraying apparatus for purposes covered by this subclass
A01C 23/00 - Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
46.
Precision detection and control of vegetation with real time pose estimation
A method includes receiving sensor inputs including one or more images comprising one or more agricultural objects; continuously performing a pose estimation of the treatment system based on sensor inputs that are time synchronized and fused; identifying the one or more agricultural objects as target objects; tracking the one or more agricultural objects identified by the analyzing; controlling an orientation of the treatment mechanism according to the pose estimation for targeting the one or more agricultural objects; and activating the treatment mechanism to treat the one or more agricultural objects according to the orientation.
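The continuous, time-synchronized pose estimation above can be pictured with a textbook complementary filter (the filter, rates, and sample format are illustrative assumptions, not the system's actual estimator): a high-rate gyro-integrated heading is blended with an occasional absolute heading so the treatment mechanism can always be oriented from an up-to-date pose.

```python
# Hedged sketch: fuse time-ordered gyro samples with occasional absolute
# heading fixes (e.g. GNSS course) into one drift-corrected heading.
def fuse_heading(samples, alpha=0.98):
    """samples: list of (dt_s, gyro_rad_s, abs_heading_rad or None), time ordered."""
    heading = 0.0
    for dt, gyro, abs_heading in samples:
        heading += gyro * dt                      # dead-reckoned update
        if abs_heading is not None:               # correct drift when a fix arrives
            heading = alpha * heading + (1 - alpha) * abs_heading
    return heading

heading = fuse_heading([(0.01, 0.2, None)] * 100 + [(0.01, 0.2, 0.21)])
```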
Various embodiments relate generally to computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include, receiving data representing a policy specifying a type of action for an agricultural object, selecting an emitter with which to perform a type of action for the agricultural object as one of one or more classified subsets, and configuring the agricultural projectile delivery system to activate an emitter to propel an agricultural projectile to intercept the agricultural object.
A method includes obtaining, by the treatment system configured to implement a machine learning (ML) algorithm, one or more images of a region of an agricultural environment near the treatment system, wherein the one or more images are captured from the region of a real-world where agricultural target objects are expected to be present, determining one or more parameters for use with the ML algorithm, wherein at least one of the one or more parameters is based on one or more ML models related to identification of an agricultural object, determining a real-world target in the one or more images using the ML algorithm, wherein the ML algorithm is at least partly implemented using the one or more processors of the treatment system, and applying a treatment to the target by selectively activating the treatment mechanism based on a result of the determining the target.
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may determine a first real-world geo-spatial location of the treatment system. The system can receive captured images depicting real-world agricultural objects of a geographic scene. The system can associate captured images with the determined geo-spatial location of the treatment system. The treatment system can identify, from a group of mapped and indexed images, images having a second real-word geo-spatial location that is proximate with the first real-world geo-spatial location. The treatment system can compare at least a portion of the identified images with at least a portion of the captured images. The treatment system can determine a target object and emit a fluid projectile at the target object using a treatment device.
A01C 23/00 - Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
A01M 9/00 - Special adaptations or arrangements of powder-spraying apparatus for purposes covered by this subclass
G05B 19/4155 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by programme execution, i.e. part programme or machine function execution, e.g. selection of a programme
G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
G06F 18/243 - Classification techniques relating to the number of classes
A01C 23/04 - Distributing under pressure; Distributing mud; Adaptation of watering systems for fertilising-liquids
A01M 21/04 - Apparatus for destruction by steam, chemicals, burning, or electricity
A01C 23/02 - Special arrangements for delivering the liquid directly into the soil
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The system uses a pump in combination with a fluid regulator to obtain and disperse fluid from one or more fluid tanks. The fluid regulator includes a fluid input port, a fluid output port and a fluid control valve. A first tube is fluidly coupled from the fluid pump to the fluid input port. The system includes a moveable treatment head having one or more spraying tips, including a first spraying tip. A second tube is fluidly coupled from the fluid output port to the moveable treatment head. The system also includes a controller configured to cause the fluid regulator to open and close the fluid control valve to release an amount of the fluid.
A01M 7/00 - Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
A01C 23/04 - Distributing under pressure; Distributing mud; Adaptation of watering systems for fertilising-liquids
A01M 9/00 - Special adaptations or arrangements of powder-spraying apparatus for purposes covered by this subclass
A01C 23/00 - Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
51.
Autonomous detection and treatment of agricultural objects via precision treatment delivery system
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system determines a vehicle pose of a vehicle as the vehicle moves along a path. The system identifies a first target agricultural object for treatment. Based on the determined vehicle pose, the system positions a treatment head of a first treatment unit such that a first fluid projectile may be emitted by the first treatment unit at the identified first target agricultural object. The system then causes an emitter to emit a fluid from the treatment head at the first target agricultural object.
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may determine a first real-world geo-spatial location of the treatment system. The system can receive captured images depicting real-world agricultural objects of a geographic scene. The system can associate captured images with the determined geo-spatial location of the treatment system. The treatment system can identify, from a group of mapped and indexed images, images having a second real-word geo-spatial location that is proximate with the first real-world geo-spatial location. The treatment system can compare at least a portion of the identified images with at least a portion of the captured images. The treatment system can determine a target object and emit a fluid projectile at the target object using a treatment device.
A01C 23/00 - Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
A01C 23/04 - Distributing under pressure; Distributing mud; Adaptation of watering systems for fertilising-liquids
G05B 19/4155 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by programme execution, i.e. part programme or machine function execution, e.g. selection of a programme
54.
AUTONOMOUS AGRICULTURAL TREATMENT SYSTEM USING MAP BASED TARGETING OF AGRICULTURAL OBJECTS
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may determine a first real-world geo-spatial location of the treatment system. The system can receive captured images depicting real-world agricultural objects of a geographic scene. The system can associate captured images with the determined geo-spatial location of the treatment system. The treatment system can identify, from a group of mapped and indexed images, images having a second real-word geo-spatial location that is proximate with the first real-world geo-spatial location. The treatment system can compare at least a portion of the identified images with at least a portion of the captured images. The treatment system can determine a target object and emit a fluid projectile at the target object using a treatment device.
A01N 65/00 - Biocides, pest repellants or attractants, or plant growth regulators containing material from algae, lichens, bryophyta, multi-cellular fungi or plants, or extracts thereof
C05G 3/60 - Biocides or preservatives, e.g. disinfectants, pesticides or herbicides; Pest repellants or attractants
G01N 21/25 - Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
G01N 21/31 - Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
G06F 17/14 - Fourier, Walsh or analogous domain transformations
G06T 1/20 - Processor architectures; Processor configuration, e.g. pipelining
55.
Autonomous agricultural treatment system using map based targeting of agricultural objects
Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may determine a first real-world geo-spatial location of the treatment system. The system can receive captured images depicting real-world agricultural objects of a geographic scene. The system can associate captured images with the determined geo-spatial location of the treatment system. The treatment system can identify, from a group of mapped and indexed images, images having a second real-word geo-spatial location that is proximate with the first real-world geo-spatial location. The treatment system can compare at least a portion of the identified images with at least a portion of the captured images. The treatment system can determine a target object and emit a fluid projectile at the target object using a treatment device.
G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
A01M 7/00 - Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
A01G 7/06 - Treatment of growing trees or plants, e.g. for preventing decay of wood, for tingeing flowers or wood, for prolonging the life of plants
G06K 9/62 - Methods or arrangements for recognition using electronic means
A01C 23/04 - Distributing under pressure; Distributing mud; Adaptation of watering systems for fertilising-liquids
A01M 21/04 - Apparatus for destruction by steam, chemicals, burning, or electricity
A01C 23/00 - Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
A01C 23/02 - Special arrangements for delivering the liquid directly into the soil
56.
Targeting agricultural objects to apply units of treatment autonomously
Various embodiments relate generally to computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include, at least, detecting an optical sight to align with an associated agricultural object, tracking the agricultural objects relative to the optical sight, predicting a parameter to track in association with the agricultural object, and activating an emitter to apply an action based on the parameter.
Various embodiments relate generally to computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include receiving data representing a policy specifying a type of action for an agricultural object, selecting an emitter with which to perform a type of action for the agricultural object as one of one or more classified subsets, and configuring the agricultural projectile delivery system to activate an emitter to propel an agricultural projectile to intercept the agricultural object.
Systems and methods for computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include identifying a subset of payloads to provide one or more actions based on data representing a policy for one or more subsets of agricultural objects, causing one or more cartridges to be charged based on the subset of payloads, and implementing the one or more cartridges at an agricultural projectile delivery system.
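The payload-selection step described above, totalling what each cartridge must be charged with from a policy over classified subsets of agricultural objects, can be sketched as follows. The policy contents, payload names, and doses are invented for illustration and are not values from the patent.

```python
# Hypothetical sketch: plan cartridge charges from a treatment policy.
from collections import defaultdict
from typing import Dict, Tuple

def plan_cartridges(policy: Dict[str, Tuple[str, float]],
                    detected_counts: Dict[str, int]) -> Dict[str, float]:
    """Return millilitres of each payload needed to cover the detected objects."""
    required: Dict[str, float] = defaultdict(float)
    for object_class, count in detected_counts.items():
        if object_class in policy:
            payload, dose_ml = policy[object_class]
            required[payload] += dose_ml * count
    return dict(required)

# Example policy: weeds get a herbicide dose, blossoms get a nutrient dose.
policy = {"broadleaf_weed": ("herbicide_a", 0.8),
          "blossom": ("nutrient_mix", 0.3)}
counts = {"broadleaf_weed": 140, "blossom": 2200}
print(plan_cartridges(policy, counts))   # e.g. {'herbicide_a': 112.0, 'nutrient_mix': 660.0}
```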
Various embodiments relate generally to computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include calculating an amount of light originating from a light source sensed at a location in which an agricultural object is interposed between the light source and an image capture device, causing emission of an obscurant, and directing the obscurant to a region interposed between the light source and the agricultural object.
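The obscurant decision in the preceding abstract can be illustrated with a small sketch: estimate how much of the backlit region around a detected object is saturated by the light source and, if that exceeds a limit, trigger an obscurant toward the region between the light source and the object. The saturation level and coverage threshold are assumed values, not figures from the patent.

```python
# Hypothetical sketch: decide whether backlight warrants emitting an obscurant.
from typing import List

def backlight_fraction(region_pixels: List[int], saturation_level: int = 240) -> float:
    """Fraction of pixels in the region at or above the saturation level (0-255)."""
    if not region_pixels:
        return 0.0
    return sum(p >= saturation_level for p in region_pixels) / len(region_pixels)

def needs_obscurant(region_pixels: List[int], max_fraction: float = 0.35) -> bool:
    """True when enough of the region is washed out to warrant an obscurant."""
    return backlight_fraction(region_pixels) > max_fraction

# Example: intensities sampled along the line between the light source and the object.
strip = [250, 255, 248, 120, 255, 251, 90, 247, 253, 255]
print(backlight_fraction(strip), needs_obscurant(strip))
```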
B05B 12/12 - Arrangements for controlling delivery; Arrangements for controlling the spray area responsive to condition of liquid or other fluent material discharged, of ambient medium or of target; responsive to conditions of ambient medium or target, e.g. humidity, temperature
H04N 5/232 - Devices for controlling television cameras, e.g. remote control
B05B 13/00 - Machines or plants for applying liquids or other fluent materials to surfaces of objects or other work by spraying, not covered by groups
B05B 17/06 - Apparatus for spraying or atomising liquids or other fluent materials, not covered by any other group of this subclass operating with special methods using ultrasonic vibrations
A01M 7/00 - Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
A01C 23/00 - Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
60.
Pixel projectile delivery system to replicate an image on a surface using pixel projectiles
Various embodiments relate generally to computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include receiving data representing an image, detecting a portion of the image associated with a unit of spatial translation relative to a reference, identifying a subset of pixels to be formed on a surface, and causing emission of a subset of pixel projectiles directed to impact a portion of the surface to form a replica of a portion of the image.
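The mapping from image pixels to pixel-projectile targets described above can be sketched under assumed geometry: as the delivery system translates by one unit along the surface, select the image column corresponding to that translation and convert each dark pixel into a surface impact point. The grid spacing and darkness threshold are illustrative assumptions.

```python
# Hypothetical sketch: turn one image column into pixel-projectile targets.
from typing import List, Tuple

def column_targets(image: List[List[int]], column: int,
                   spacing_cm: float = 2.0, dark_threshold: int = 128) -> List[Tuple[float, float]]:
    """Return (x_cm, y_cm) surface impact points for dark pixels in one image column."""
    targets = []
    for row, pixel_row in enumerate(image):
        if pixel_row[column] < dark_threshold:   # "ink" pixel to be replicated
            targets.append((column * spacing_cm, row * spacing_cm))
    return targets

# Example 4x4 bitmap (0 = dark, 255 = light); replicate column 1 of the image.
bitmap = [[255,   0, 255, 255],
          [255,   0,   0, 255],
          [255, 255,   0, 255],
          [255,   0, 255, 255]]
print(column_targets(bitmap, column=1))   # impact points for this unit of translation
```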
Various embodiments relate generally to computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include receiving data representing actions to be performed relative to a subset of agricultural objects, positioning an emitter of an agricultural projectile delivery system adjacent to an agricultural object, identifying a corresponding action to be performed in association with the agricultural object, selecting an emitter to perform the action, and causing the emitter to emit an agricultural projectile to intercept the agricultural object.
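The per-object flow shared by this abstract and the related entries, look up the action assigned to an object's subset, pick an emitter loaded with a matching payload, and issue an emission command, can be sketched as below. The emitter inventory, action names, and command format are hypothetical.

```python
# Hypothetical sketch: select an emitter per object and build emission commands.
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class Emitter:
    emitter_id: str
    payload: str   # what this emitter's cartridge is charged with

def select_emitter(action: str, emitters: List[Emitter]) -> Optional[Emitter]:
    """Choose the first emitter whose payload matches the requested action."""
    return next((e for e in emitters if e.payload == action), None)

def emission_commands(objects: List[Dict], actions_by_subset: Dict[str, str],
                      emitters: List[Emitter]) -> List[Dict]:
    commands = []
    for obj in objects:
        action = actions_by_subset.get(obj["subset"])
        emitter = select_emitter(action, emitters) if action else None
        if emitter:
            commands.append({"object_id": obj["id"], "emitter": emitter.emitter_id,
                             "action": action, "position": obj["position"]})
    return commands

# Example: two detected objects, two emitters charged with different payloads.
emitters = [Emitter("e0", "herbicide_a"), Emitter("e1", "nutrient_mix")]
objects = [{"id": "w17", "subset": "broadleaf_weed", "position": (1.2, 0.4)},
           {"id": "b03", "subset": "blossom", "position": (1.5, 0.9)}]
actions = {"broadleaf_weed": "herbicide_a", "blossom": "nutrient_mix"}
print(emission_commands(objects, actions, emitters))
```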
A01B 69/00 - Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
A01G 25/09 - Watering arrangements making use of movable installations on wheels or the like
A01M 7/00 - Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
B05B 7/24 - Spraying apparatus for discharge of liquids or other fluent materials from two or more sources, e.g. of liquid and air, of powder and gas with means, e.g. a container, for supplying liquid or other fluent material to a discharge device
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
62.
Calibration of systems to deliver agricultural projectiles
Various embodiments relate generally to computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include identifying an emitter of an agricultural projectile delivery system to calibrate a trajectory of an agricultural projectile to intercept a target, predicting a projectile impact site relative to a reference of alignment, determining one or more calibration parameters to align the projectile impact site and the target, and adjusting the trajectory based on the one or more calibration parameters.
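The calibration step can be illustrated with a minimal sketch under assumed geometry: compare where a test projectile landed against the aimed target in image coordinates, convert the pixel error into angular corrections via an assumed small-angle scale, and fold the corrections into the emitter's aim offsets. The pixels-per-degree figure is an illustrative assumption.

```python
# Hypothetical sketch: derive and apply a pan/tilt calibration correction.
from typing import Tuple

def calibration_offsets(target_px: Tuple[float, float], impact_px: Tuple[float, float],
                        px_per_degree: float = 40.0) -> Tuple[float, float]:
    """Return (pan_deg, tilt_deg) corrections that move the impact onto the target."""
    dx = target_px[0] - impact_px[0]
    dy = target_px[1] - impact_px[1]
    return dx / px_per_degree, dy / px_per_degree

def apply_calibration(aim_deg: Tuple[float, float],
                      correction_deg: Tuple[float, float]) -> Tuple[float, float]:
    """Adjust the stored aim parameters by the computed correction."""
    return aim_deg[0] + correction_deg[0], aim_deg[1] + correction_deg[1]

# Example: the test shot landed 12 px right and 6 px low of the target.
correction = calibration_offsets(target_px=(320.0, 240.0), impact_px=(332.0, 246.0))
print(correction)                              # (-0.3, -0.15) degrees
print(apply_calibration((10.0, -2.0), correction))
```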
A01B 79/02 - Methods for working soil combined with other agricultural processing, e.g. fertilising, planting
A01M 7/00 - Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
B05B 9/06 - Spraying apparatus for discharge of liquid or other fluent material without essentially mixing with gas or vapour characterised by means for supplying liquid or other fluent material with pressurised or compressible container; Spraying apparatus for discharge of liquid or other fluent material without essentially mixing with gas or vapour characterised by means for supplying liquid or other fluent material with pump, the delivery being related to the movement of a vehicle, e.g. the pump being driven by a vehicle wheel
63.
Managing stages of growth of a crop with micro-precision via an agricultural treatment delivery system
Various embodiments relate generally to computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include identifying an agricultural object being associated with a subset of agricultural objects, identifying an action corresponding to the subset of agricultural objects, selecting an emitter with which to perform the action associated with the agricultural object, and configuring an agricultural projectile delivery system to activate the emitter to propel an agricultural projectile to intercept the agricultural object.