A health indicator system for a sensor pod includes one or more sensors associated with the sensor pod, a sensor pod housing, an internal sensor status monitoring system configured to determine a status of at least one sensor of the one or more sensors, and a plurality of visual indicators located on at least one exterior surface of the sensor pod housing, the plurality of visual indicators configured to display a predetermined configuration based on the determined status of each of the one or more sensors. The plurality of visual indicators is located on the sensor pod housing at a level such that, when an external operator standing on a ground surface approaches the vehicle, the plurality of visual indicators is within the external operator's field of view.
B60Q 1/50 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
B60Q 1/00 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
B60Q 5/00 - Arrangement or adaptation of acoustic signal devices
G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time
2.
SYSTEM AND METHODS FOR AV MAPS USING SATELLITE IMAGES
Disclosed are methods, systems, and non-transitory computer readable memory for AV maps using satellite images. For instance, a method may include: obtaining at least one satellite image; annotating the at least one satellite image; converting the annotated satellite image into a trajectory and a sparse map; sensing an environment and updating the sparse map and the trajectory; and executing the trajectory through the environment.
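The recited pipeline (annotate a satellite image, convert it into a trajectory and a sparse map, then refine with live sensing) can be sketched as follows. This is purely illustrative: all names, data shapes, and the toy annotation are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class SparseMap:
    lane_lines: list = field(default_factory=list)   # annotated lane geometry
    landmarks: list = field(default_factory=list)    # sparse localization features

def annotate(satellite_image):
    # Stand-in for human or model annotation of drivable lanes on the image.
    return {"image": satellite_image, "lanes": [(0.0, 0.0), (0.0, 50.0)]}

def convert(annotated):
    # Derive a nominal trajectory and a sparse map from the annotated image.
    trajectory = list(annotated["lanes"])
    sparse_map = SparseMap(lane_lines=[annotated["lanes"]])
    return trajectory, sparse_map

def update_with_sensing(trajectory, sparse_map, perceived_obstacles):
    # Toy runtime update: drop trajectory points that collide with sensed obstacles.
    updated = [p for p in trajectory if p not in perceived_obstacles]
    return updated, sparse_map
```

In practice the conversion and update steps would involve georeferencing and perception models; the sketch only fixes the data flow between the claimed steps.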
A brake system for an autonomous tractor-trailer includes a first brake valve configured to allow pressurized air to flow to first tractor brakes and trailer brakes, a second brake valve configured to allow pressurized air to flow to second tractor brakes, and a third brake valve configured to allow pressurized air to flow to the trailer brakes, wherein the third brake valve is configured to be actuated before the first brake valve and the second brake valve.
A method of braking a tractor-trailer includes sensing, autonomously, a braking event, the braking event being indicative of a requirement for braking the tractor-trailer and actuating a trailer brake valve of a trailer of the tractor-trailer to apply trailer brakes prior to actuating a tractor brake valve of a tractor of the tractor-trailer to apply tractor brakes and to apply trailer brakes. Actuating the trailer brake valve of the trailer of the tractor-trailer occurs in response to the sensing of the braking event.
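The claimed ordering (trailer brake valve actuated before the tractor brake valve, in response to a sensed braking event) can be sketched as a minimal controller. Class and valve names are illustrative assumptions.

```python
import itertools

class BrakeController:
    """Illustrative controller enforcing trailer-before-tractor actuation."""

    def __init__(self):
        self._tick = itertools.count()
        self.log = []  # (actuation order, valve name) records

    def _actuate(self, valve):
        self.log.append((next(self._tick), valve))

    def on_braking_event(self):
        # Trailer brakes first, so the trailer does not push the tractor
        # (the stated motivation is assumed, not taken from the abstract),
        # then the tractor brakes.
        self._actuate("trailer_brake_valve")
        self._actuate("tractor_brake_valve")
```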
B60T 7/20 - Brake-action initiating means for automatic initiation; Brake-action initiating means for initiation not subject to will of driver or passenger specially adapted for trailers, e.g. in case of uncoupling of trailer
Systems and methods for controlling a vehicle are provided. The system may comprise a vehicle and one or more sensors, coupled to the vehicle, configured to generate one or more data points pertaining to one or more of an environment of the vehicle and one or more system component measurements of the vehicle. The system may comprise one or more actuation controls configured to enable the vehicle to perform one or more driving actions and a remote station system configured to receive the one or more data points generated by the one or more sensors, generate one or more driving actions, and transmit the one or more driving actions to the vehicle. The system may comprise a switch configured to switch command of the vehicle between automatic trajectory control and remote station system control.
Systems, including remote station systems, and methods for remotely controlling a vehicle are provided. The remote station system may comprise a transmitter configured to receive one or more data points generated by one or more sensors coupled to a vehicle and, from the vehicle, a trajectory of the vehicle. The remote station system may comprise a display configured to display the one or more data points generated by the one or more sensors and display the trajectory of the vehicle. The remote station system may comprise one or more remote actuation controls configured to generate one or more driving actions. The one or more driving actions may correlate to one or more actuator commands configured to cause the vehicle to perform the one or more driving actions. The transmitter may be configured to transmit the one or more driving actions to the vehicle.
Systems and methods for controlling a vehicle are provided. The system may comprise a vehicle, one or more sensors, coupled to the vehicle, configured to generate one or more data points, one or more actuation controls configured to enable the vehicle to perform one or more driving actions, and an automatic trajectory control system, comprising a processor, configured to perform automatic trajectory control. The automatic trajectory control system may be configured to receive one or more data points generated by the one or more sensors, automatically generate an automatic trajectory command, generate one or more driving actions, and cause the vehicle, via the one or more actuation controls, to perform the one or more driving actions in accordance with the automatic trajectory command. The system may comprise a switch configured to switch command of the vehicle between automatic trajectory control and remote station system control.
Systems and methods for controlling a vehicle are provided. The system may comprise a vehicle, one or more sensors configured to generate one or more data points, one or more actuation controls configured to enable the vehicle to perform one or more driving actions and a controller, configured to automatically generate an automatic trajectory command, generate, based on the one or more automatic trajectory plot points, one or more driving actions, determine whether a remote trajectory command is present for a predetermined timeframe, determine whether the remote trajectory command is different from the automatic trajectory command when the remote trajectory command is present for the predetermined timeframe, and cause the vehicle, via the one or more actuation controls, to perform the one or more driving actions during the predetermined timeframe in accordance with the remote trajectory command when the remote trajectory command is different from the automatic trajectory command.
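The arbitration described above (the remote command takes effect only when it has been present for the predetermined timeframe and differs from the automatic command) reduces to a small selection rule. A minimal sketch, with all names and the command representation assumed:

```python
def select_command(auto_cmd, remote_cmd, remote_present_for, min_timeframe):
    """Choose which trajectory command drives the actuators (illustrative).

    A remote command overrides the automatic one only when it has been
    present for at least `min_timeframe` AND differs from the automatic
    command; otherwise the automatic command stands.
    """
    if (remote_cmd is not None
            and remote_present_for >= min_timeframe
            and remote_cmd != auto_cmd):
        return remote_cmd
    return auto_cmd
```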
Systems and methods for controlling a vehicle are provided. The system may comprise a vehicle, one or more sensors, one or more actuation controls configured to enable the vehicle to perform one or more driving actions, and a controller, comprising a processor, configured to receive one or more trajectory commands, automatically generate an automatic trajectory command, and generate one or more driving actions. The one or more driving actions may correlate to one or more actuator commands configured to cause the vehicle to be positioned in accordance with the one or more trajectory plot points of the automatic trajectory command. The system may comprise a remote station system comprising one or more remote actuation controls configured to generate the one or more driving actions of the remote trajectory command. The remote station system may be configured to generate one or more driving actions.
This disclosure provides systems and methods for controlling a vehicle in response to an abnormal condition. The method may include generating, by a main computing system, a nominal motion plan and a fallback motion plan for each predetermined interval from a location of the vehicle. The method may also include sending, by the main computing system, the nominal motion plan and the fallback motion plan for each predetermined interval to a redundant ACE system. The method may include detecting, by the redundant ACE system, an abnormal condition of the vehicle, and in response to the abnormal condition, controlling the vehicle, by the redundant ACE system, to perform a predetermined vehicle action comprising navigating the vehicle to a safe stop according to the last received fallback motion plan.
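The redundant-system behavior described above (continuously receive nominal and fallback plans; on an abnormal condition, execute the last received fallback plan) can be sketched as follows. Class and method names are illustrative assumptions.

```python
class RedundantACE:
    """Sketch of the redundant actuator control engine's fallback behavior."""

    def __init__(self):
        self.last_fallback = None   # most recent fallback motion plan
        self.executed = None        # plan executed after an abnormal condition

    def receive_plans(self, nominal_plan, fallback_plan):
        # Called by the main computing system each predetermined interval;
        # only the latest fallback plan needs to be retained.
        self.last_fallback = fallback_plan

    def on_abnormal_condition(self):
        # Navigate the vehicle to a safe stop per the last received fallback plan.
        self.executed = self.last_fallback
        return self.executed
```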
Systems and methods for controlling a vehicle are provided. The method may comprise generating one or more data points from one or more sensors coupled to a vehicle and performing remote station system control of the vehicle using a remote station system. The performing the remote station system control of the vehicle may comprise, using the remote station system, receiving the one or more data points generated by the one or more sensors and generating a remote trajectory command, and generating, based on the one or more trajectory plot points, one or more driving actions. The method may comprise transmitting the remote trajectory command to the vehicle and performing a fallback function. Performing the fallback function may comprise determining whether command of the vehicle should fall back to one or more secondary control modes and switching, using a switch, control of the vehicle to the one or more secondary control modes.
Systems and methods for controlling a vehicle are provided. The system may comprise a vehicle, one or more sensors configured to generate one or more data points, one or more actuation controls configured to enable the vehicle to perform one or more driving actions, and an automatic trajectory control system, comprising a processor, configured to perform automatic trajectory control. In the performing the automatic trajectory control, the automatic trajectory control system may be configured to transmit an automatic trajectory command to a remote station system. The system may comprise the remote station system. The remote station system may be configured to receive the one or more data points generated by the one or more sensors and generate one or more driving actions. The one or more driving actions may correlate to one or more actuator commands configured to cause the vehicle to perform the one or more driving actions.
Systems (e.g., remote station systems) and methods for remotely controlling a vehicle are provided. The remote station system may comprise a transmitter configured to receive one or more data points and a processor configured to identify one or more objects within a field of view of the vehicle, using the one or more data points, and generate a signal to magnify a section of the field of view of the vehicle containing the one or more objects. The remote station system may comprise a display configured to display the one or more data points generated by the one or more sensors and display the one or more objects in a magnified state. The remote station system may comprise one or more remote actuation controls configured to generate one or more driving actions. The one or more driving actions may correlate to one or more actuator commands.
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
15.
SYSTEMS AND METHODS FOR CONTROLLING A VEHICLE USING A REDUNDANT ACTUATOR CONTROL ENGINE SYSTEM
This disclosure provides systems and methods for controlling a vehicle in response to an abnormal condition. The method may include generating, by a main computing system, a nominal motion plan and a fallback motion plan for each predetermined interval from a location of the vehicle. The method may also include sending, by the main computing system, the nominal motion plan and the fallback motion plan for each predetermined interval to a redundant ACE system. The method may include detecting, by the redundant ACE system, an abnormal condition of the vehicle, and in response to the abnormal condition, controlling the vehicle, by the redundant ACE system, to perform a predetermined vehicle action comprising navigating the vehicle to a safe stop according to the last received fallback motion plan.
Systems and methods for controlling a vehicle are provided. The system may comprise a vehicle and one or more sensors, coupled to the vehicle, configured to generate one or more data points pertaining to one or more of an environment of the vehicle and one or more system component measurements of the vehicle. The system may comprise one or more actuation controls configured to enable the vehicle to perform one or more driving actions and a remote station system configured to receive the one or more data points generated by the one or more sensors, generate one or more driving actions, and transmit the one or more driving actions to the vehicle. The system may comprise a switch configured to switch command of the vehicle between automatic trajectory control and remote station system control.
Systems and methods for performing map localization for use with autonomous vehicles (AVs) and updating map localization in an autonomous driving environment are provided. The method may comprise, using a localizer, receiving one or more sensor inputs from at least one of: a perception module; a local pose module; and a prior map; and, using the sensor inputs, generating, using the localizer, a filtered estimate of a morphing of the prior map to produce a morphed prior map. The morphed prior map may match a perceived reality around a current local pose.
A roof mounted sensor pod assembly for a vehicle. The roof mounted sensor pod assembly having a frame configured to attach to the vehicle, a connecting assembly located on the frame, and a sensor pod configured to couple to the frame with the connecting assembly. The connecting assembly is configured to support the weight of the sensor pod during installation of the sensor pod and prior to securing of the sensor pod to the frame. A vehicle includes a roof mounted sensor pod assembly.
A roof mounted sensor pod assembly for a vehicle having a sensor pod configured to couple to the vehicle and a connecting assembly. The connecting assembly having a vehicle portion integral with the vehicle and a sensor pod portion attached to the sensor pod and removably coupled to the vehicle portion. The connecting assembly is configured to allow the sensor pod to be removably coupled to the vehicle. A vehicle includes a roof mounted sensor pod assembly.
A roof mounted sensor pod assembly for a vehicle. The roof mounted sensor pod assembly having a bracket comprising a first distal side surface and a second distal side surface, the first distal side surface configured to attach to the vehicle, a connecting assembly located at the second distal side surface of the bracket, and a sensor pod configured to couple to the second distal side surface of the bracket, with the connecting assembly. The connecting assembly is configured to support the weight of the sensor pod during installation of the sensor pod and prior to securing of the sensor pod to the bracket. A vehicle includes a roof mounted sensor pod assembly.
A method of installing a sensor pod on a roof of a vehicle includes providing a connecting assembly having a vehicle portion configured to couple to the vehicle and a sensor pod portion attached to the sensor pod, installing the vehicle portion of the connecting assembly on the vehicle, coupling the sensor pod portion of the connecting assembly to the vehicle portion, after coupling the sensor pod portion to the vehicle portion, supporting the weight of the sensor pod with the connecting assembly, and securing the sensor pod to the vehicle portion of the connecting assembly. The connecting assembly supports the weight of the sensor pod before and after securing the sensor pod to the vehicle portion. A method of uninstalling a sensor pod is also provided.
Systems and methods are provided for providing redundant pulse-width modulation (PWM) throttle control. The system includes a manual throttle controller configured to generate a manual PWM throttle control signal, and an automated throttle control system. The automated throttle control system includes a plurality of automated throttle controllers, each of which is configured to independently control a throttle of a vehicle, and each including a processor configured to generate and output an automated PWM throttle control signal, a first double pole double throw (DPDT) relay that, when engaged, is configured to receive and output the manual PWM throttle control signal, and a second DPDT relay configured to receive and output the automated PWM throttle control signal to the engine, when the second DPDT relay is engaged, and receive and output the manual PWM throttle control signal to the engine, when the second DPDT relay is disengaged.
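The second DPDT relay behaves as a two-way signal selector: engaged, the engine sees the automated PWM signal; disengaged, it falls back to the manual signal. A minimal sketch of that selection logic (function name and signal representation are assumptions; the electrical relay itself is of course not software):

```python
def throttle_output(relay2_engaged, automated_pwm, manual_pwm):
    """Model the second DPDT relay as a selector between PWM sources.

    relay2_engaged -> engine receives the automated PWM duty cycle;
    otherwise the manual PWM duty cycle passes through (fail-safe path).
    """
    return automated_pwm if relay2_engaged else manual_pwm
```

The design point the abstract relies on is that the disengaged state routes the manual signal, so losing the automated controller's ability to hold the relay engaged automatically restores manual control.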
F02D 41/00 - Electrical control of supply of combustible mixture or its constituents
F02D 9/02 - Controlling engines by throttling air or fuel-and-air induction conduits or exhaust conduits concerning induction conduits
F02D 11/10 - Arrangements for, or adaptations to, non-automatic engine control initiation means, e.g. operator initiated characterised by non-mechanical control linkages, e.g. fluid control linkages or by control linkages with power drive or assistance of the electric type
F02D 41/20 - Output circuits, e.g. for controlling currents in command coils
F02D 41/22 - Safety or indicating devices for abnormal conditions
23.
SYSTEMS AND METHODS FOR DETECTING AND LABELING A COLLIDABILITY OF ONE OR MORE OBSTACLES ALONG TRAJECTORIES OF AUTONOMOUS VEHICLES
Systems and methods for detecting and labeling a collidability of obstacles within a vehicle environment are provided. The method may comprise generating one or more data points from one or more sensors coupled to a vehicle. The method may comprise, using a processor, detecting one or more obstacles within a LiDAR point cloud, generating a patch for each of the one or more detected obstacles, projecting the LiDAR point cloud into an image, performing a factor query on the image for each of the one or more detected obstacles, for each of the one or more detected obstacles, based on the factor query, determining a label for the obstacle, and, for each of the one or more detected obstacles, labeling the obstacle with the label. The label may indicate whether each of the one or more detected obstacles is collidable and not non-collidable.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G06V 20/70 - Labelling scene content, e.g. deriving syntactic or semantic representations
24.
SYSTEMS AND METHODS FOR PLANNING A TRAJECTORY OF AN AUTONOMOUS VEHICLE BASED ON ONE OR MORE OBSTACLES
Systems and methods for planning a trajectory of a vehicle based on one or more obstacles are provided. The method may comprise generating one or more data points from one or more sensors coupled to a vehicle, and, using a processor, detecting one or more obstacles within a LiDAR point cloud, generating a patch for each of the one or more obstacles, projecting the LiDAR point cloud into an image, wherein each patch represents a region of the image for each of the one or more obstacles, performing a factor query on the image for each of the one or more obstacles, for each of the one or more obstacles, based on the factor query, determining a label for the obstacle and labeling the obstacle with the label, and planning a trajectory of the vehicle. The label may indicate a collidability of the obstacle.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
G06T 3/40 - Scaling of whole images or parts thereof, e.g. expanding or contracting
G06T 7/90 - Determination of colour characteristics
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G06V 20/70 - Labelling scene content, e.g. deriving syntactic or semantic representations
25.
SYSTEMS AND METHODS FOR GENERATING A HEATMAP CORRESPONDING TO AN ENVIRONMENT OF A VEHICLE
Systems and methods for detecting a portion of an environment of a vehicle are provided. The method may comprise generating, using one or more sensors coupled to a vehicle, environment data from an environment of the vehicle, wherein the environmental data comprises one or more of the following: ground LiDAR data from the environment; camera data from the environment; and path data corresponding to a change in position of one or more other vehicles within the environment. The method may comprise inputting the environmental data into a machine learning model trained to generate a heatmap, and, using a processor, based on the environmental data, determining a portion of the environment, wherein the portion of the environment comprises an area having a likelihood, greater than a minimum threshold, of being adjacent to one or more pavement markings, and generating the heatmap, wherein the heatmap corresponds to the portion of the environment.
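The thresholding step above (keep only the portion of the environment whose likelihood of being adjacent to a pavement marking exceeds a minimum) can be sketched on a grid of model scores. The grid representation and names are assumptions; the trained model that produces the scores is outside this sketch.

```python
def heatmap_portion(likelihoods, min_threshold):
    """Return the heatmap cells exceeding the minimum likelihood.

    `likelihoods` maps (row, col) grid cells to scores in [0, 1], as a
    machine learning model might emit per cell; only cells more likely
    than `min_threshold` to be adjacent to a pavement marking survive.
    """
    return {cell: p for cell, p in likelihoods.items() if p > min_threshold}
```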
Exemplary embodiments include methods and systems to adjust a driving path in a planner-map, including: identifying a driving surface; determining a path on the driving surface; identifying one or more objects relative to the driving surface using one or more sensors; detecting one or more occlusion areas from sensor data from the one or more sensors; tracking one or more tracks of one or more objects within the one or more occlusion areas; adding uncertainty to the one or more tracks within the one or more occlusion areas; and adjusting the path on the driving surface based on the uncertainty of the one or more tracks within the one or more occlusion areas.
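The "adding uncertainty to tracks within occlusion areas" step can be sketched concretely. Representations are assumptions: each track carries a 2-D position and a scalar uncertainty, and occlusion areas are axis-aligned boxes (xmin, ymin, xmax, ymax).

```python
def grow_track_uncertainty(tracks, occlusion_areas, growth=0.5):
    """Inflate positional uncertainty for tracks inside occluded regions.

    The planner can then keep a wider berth around occluded tracks when
    adjusting the path; `growth` is an illustrative inflation constant.
    """
    def occluded(pos):
        x, y = pos
        return any(xmin <= x <= xmax and ymin <= y <= ymax
                   for xmin, ymin, xmax, ymax in occlusion_areas)

    for track in tracks:
        if occluded(track["position"]):
            track["uncertainty"] += growth
    return tracks
```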
G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Systems and methods for detecting and labeling one or more obstacles within a vehicle environment are provided. The method may comprise generating one or more data points from one or more sensors coupled to a vehicle and, using a processor, detecting one or more obstacles within a LiDAR point cloud, generating a patch for each of the one or more detected obstacles, projecting the LiDAR point cloud into an image, performing a color query on the image for each obstacle, performing a shape query on the image for each of the one or more detected obstacles, for each of the one or more detected obstacles, determining a label for the obstacle based on one or more of the color query and the shape query and labeling the obstacle with the label. The label may indicate whether each of the one or more detected obstacles is a piece of vegetation and not a pedestrian.
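The color-query-plus-shape-query labeling above can be sketched with toy heuristics. The heuristics themselves (mostly-green patches with a squat aspect ratio are vegetation; tall, narrow patches are pedestrians) are assumptions for illustration, not the disclosed queries, and `patch_pixels` is a list of (r, g, b) tuples from the image patch.

```python
def label_obstacle(patch_pixels, height_to_width):
    """Combine a color query and a shape query into a label (illustrative)."""
    # Color query: fraction of green-dominant pixels in the patch.
    green = sum(1 for r, g, b in patch_pixels if g > r and g > b)
    mostly_green = green / len(patch_pixels) > 0.5
    # Shape query: pedestrians tend to be taller than wide.
    tall_and_narrow = height_to_width > 1.5
    if mostly_green and not tall_and_narrow:
        return "vegetation"
    return "pedestrian" if tall_and_narrow else "unknown"
```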
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
Systems and methods for detecting and identifying vegetation within a vehicle environment are provided. The method may comprise generating one or more data points from one or more sensors coupled to a vehicle. The one or more data points may comprise a Light Detection and Ranging (LiDAR) point cloud generated by a LiDAR sensor and an image captured by a camera. The method may further comprise, using a processor, detecting one or more obstacles within the LiDAR point cloud, generating a patch for each of the one or more obstacles, projecting the LiDAR point cloud into the image, wherein each patch represents a region of the image for each of the one or more obstacles, performing a color query on the image for each of the one or more obstacles, determining a label for the obstacle based on the color query, and labeling the obstacle with the label.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
G06V 10/56 - Extraction of image or video features relating to colour
Exemplary embodiments described herein include systems and methods to remove tracks from a tracker when the tracks are within an occluded region, including defining a map of a driving area; defining an occlusion area within the map; detecting an object using sensor data from one or more sensors; determining that the object entered the occlusion area; creating an estimated object location for the object within the occlusion area; and dropping the estimated object location for the object when the occlusion area is cleared.
Exemplary embodiments include systems and methods to maintain tracking of an object that passes into an occluded area, including defining a map of a driving area; defining one or more occlusion areas within the map; detecting an object using sensor data from one or more sensors; creating an object track for the object detected using sensor data; determining that the object track entered one of the one or more occlusion areas; and maintaining the object track while the object track remains in the one or more occlusion areas.
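The two occlusion behaviors above (maintain a track while it sits inside a known occlusion area; drop it when its last position is not occluded) can be sketched together in one tracker. Representations are assumptions: occlusion areas are axis-aligned boxes, and each frame reports a position or `None` for every known track.

```python
class OcclusionAwareTracker:
    """Sketch: coast tracks through occlusion instead of deleting them."""

    def __init__(self, occlusion_areas):
        self.occlusion_areas = occlusion_areas  # (xmin, ymin, xmax, ymax) boxes
        self.tracks = {}                        # track_id -> last known position

    def _in_occlusion(self, pos):
        x, y = pos
        return any(xmin <= x <= xmax and ymin <= y <= ymax
                   for xmin, ymin, xmax, ymax in self.occlusion_areas)

    def update(self, detections):
        """`detections` maps track_id -> position, or None when missed."""
        for tid, pos in detections.items():
            if pos is not None:
                self.tracks[tid] = pos                # normal measurement update
            elif tid in self.tracks and self._in_occlusion(self.tracks[tid]):
                pass                                  # occluded: maintain the track
            else:
                self.tracks.pop(tid, None)            # visible but missed: drop it
```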
An autonomous vehicle includes a sensor pod having a mirror, a connecting assembly extending between the sensor pod and the autonomous vehicle, and a user interface provided on the sensor pod. The user interface is configured to provide two-way communication between a local user and a remote user. A method for two-way communication includes providing a vehicle with a sensor pod. The method includes initiating two-way communication between a local user and a remote user via the sensor pod.
B60R 1/00 - Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
B60L 15/38 - Control or regulation of multiple-unit electrically-propelled vehicles with automatic control
B60R 11/00 - Arrangements for holding or mounting articles, not otherwise provided for
H04W 4/48 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for in-vehicle communication
H04W 4/38 - Services specially adapted for particular environments, situations or purposes for collecting sensor information
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
33.
SYSTEMS AND METHODS FOR GENERATING A HEATMAP CORRESPONDING TO AN ENVIRONMENT OF A VEHICLE
Systems and methods for detecting a portion of an environment of a vehicle are provided. The method may comprise generating, using one or more sensors coupled to a vehicle, environment data from an environment of the vehicle, wherein the environmental data comprises one or more of the following: ground LiDAR data from the environment; camera data from the environment; and path data corresponding to a change in position of one or more other vehicles within the environment. The method may comprise inputting the environmental data into a machine learning model trained to generate a heatmap, and, using a processor, based on the environmental data, determining a portion of the environment, wherein the portion of the environment comprises an area having a likelihood, greater than a minimum threshold, of being adjacent to one or more pavement markings, and generating the heatmap, wherein the heatmap corresponds to the portion of the environment.
Systems and methods for planning a trajectory of a vehicle based on one or more obstacles are provided. The method may comprise generating one or more data points from one or more sensors coupled to a vehicle, and, using a processor, detecting one or more obstacles within a LiDAR point cloud, generating a patch for each of the one or more obstacles, projecting the LiDAR point cloud into an image, wherein each patch represents a region of the image for each of the one or more obstacles, performing a factor query on the image for each of the one or more obstacles, for each of the one or more obstacles, based on the factor query, determining a label for the obstacle and labeling the obstacle with the label, and planning a trajectory of the vehicle. The label may indicate a collidability of the obstacle.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G01S 17/93 - Lidar systems, specially adapted for specific applications for anti-collision purposes
35.
METHOD OF TWO-WAY COMMUNICATION WITH A SENSOR POD WITH USER INTERFACE
A method for two-way communication includes providing a vehicle with a sensor pod. The method includes initiating two-way communication between a local user and a remote user via the sensor pod.
An autonomous vehicle includes a sensor pod having a mirror, a connecting assembly extending between the sensor pod and the autonomous vehicle, and a user interface provided on the sensor pod. The user interface is configured to provide two-way communication between a local user and a remote user.
This disclosure provides methods and systems for dynamically creating a trajectory for navigating a vehicle. The method may include receiving sensor data from at least one sensor of the autonomous vehicle, the sensor data representative of a driving surface in a field of view of the autonomous vehicle; segmenting a portion of the driving surface in the field of view of the autonomous vehicle by determining a nominal path based at least in part on the sensor data; assigning a plurality of nodes to at least a portion of the nominal path; associating the plurality of nodes assigned to the nominal path with a line to generate at least one segmentation polyline; determining an updated nominal path by fitting each segmentation polyline to the nominal path; generating a trajectory based on the updated nominal path; and navigating the autonomous vehicle according to the trajectory.
This disclosure provides methods and systems for dynamically creating a trajectory for navigating a vehicle. The method may include receiving sensor data from at least one sensor of the autonomous vehicle, the sensor data representative of a driving surface in a field of view of the autonomous vehicle; segmenting a portion of the driving surface in the field of view of the autonomous vehicle by determining a nominal path based at least in part on the sensor data; assigning a plurality of nodes to at least a portion of the nominal path; associating the plurality of nodes assigned to the nominal path with a line to generate at least one segmentation polyline; determining an updated nominal path by fitting each segmentation polyline to the nominal path; generating a trajectory based on the updated nominal path; and navigating the autonomous vehicle according to the trajectory.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G06V 10/26 - Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
G06V 10/34 - Smoothing or thinning of the pattern; Morphological operations; Skeletonisation
G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
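The node-assignment and polyline steps in the trajectory abstracts above can be sketched minimally; the node count and helper names are hypothetical:

```python
def assign_nodes(path, n_nodes):
    """Pick n_nodes evenly spaced points along a nominal path (list of (x, y))."""
    step = (len(path) - 1) / (n_nodes - 1)
    return [path[round(i * step)] for i in range(n_nodes)]

def polyline_segments(nodes):
    """Join consecutive nodes with lines, forming a segmentation polyline."""
    return list(zip(nodes, nodes[1:]))
```

Fitting each resulting segment back to the path would then yield the updated nominal path.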
Systems and methods are provided for downsampling an image to a plurality of image resolutions. The method comprises capturing, using a camera, an image depicting an environment within view of the camera, identifying a first section of the image depicting an area of the environment spaced within a first distance range from the camera, identifying a second section of the image depicting an area of the environment spaced within a second distance range from the camera, identifying a third section of the image depicting an area of the environment spaced within a third distance range from the camera, and downsampling the first section of the image to a first image resolution, generating a first processed image, the second section of the image to a second image resolution, generating a second processed image, and the third section of the image to a third image resolution, generating a third processed image.
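One minimal interpretation of the three-section downsampling above, with hypothetical band boundaries and stride factors (the abstract does not specify how sections map to rows or which resolutions are used):

```python
def downsample(section, factor):
    """Naive stride-based downsampling of a 2-D list of pixel values."""
    return [row[::factor] for row in section[::factor]]

def process_image(image, band_ends, factors):
    """Split an image into horizontal bands (e.g. near/mid/far) and downsample
    each band with its own stride factor.

    band_ends: row index where each band ends; factors: one stride per band."""
    processed, start = [], 0
    for end, factor in zip(band_ends, factors):
        processed.append(downsample(image[start:end], factor))
        start = end
    return processed
```

Each element of the returned list corresponds to one processed image at its own resolution.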
Systems and methods for detecting and measuring object velocity are provided. The method comprises, using a spinning Light Detection and Ranging (LiDAR) system, generating a first LiDAR point cloud of an environment surrounding a vehicle, using a scanning LiDAR system, generating a second LiDAR point cloud of the environment surrounding the vehicle, and, using a processor, identifying points within the first LiDAR point cloud coinciding with a position of an object and points within the second LiDAR point cloud coinciding with a position of the object, determining whether the points within the second LiDAR point cloud line up with the points within the first LiDAR point cloud, and, when the points within the second LiDAR point cloud do not line up with the points within the first LiDAR point cloud, determining that the object is moving.
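The line-up check between the two point clouds in the abstract above might be sketched as a nearest-neighbour distance test; the tolerance value is an assumption, and the disclosure does not specify the comparison metric:

```python
def is_moving(points_spin, points_scan, tol=0.2):
    """Return True when scanning-LiDAR points fail to line up with the
    spinning-LiDAR points (nearest-neighbour distance above tol metres)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    for p in points_scan:
        if min(dist(p, q) for q in points_spin) > tol:
            return True
    return False
```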
Systems and methods are provided for downsampling an image to a plurality of image resolutions. The method comprises capturing, using a camera, an image depicting an environment within view of the camera, identifying a first section of the image depicting an area of the environment spaced within a first distance range from the camera, identifying a second section of the image depicting an area of the environment spaced within a second distance range from the camera, identifying a third section of the image depicting an area of the environment spaced within a third distance range from the camera, and downsampling the first section of the image to a first image resolution, generating a first processed image, the second section of the image to a second image resolution, generating a second processed image, and the third section of the image to a third image resolution, generating a third processed image.
G06V 10/26 - Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
A warning indicator system for an autonomous vehicle includes a sensor pod, a connecting assembly, and a warning indicator. The sensor pod has a sensor and a sensor pod housing. The connecting assembly is configured to couple the sensor pod to the autonomous vehicle. The warning indicator is associated with the sensor pod and the connecting assembly. The warning indicator is configured to alert other vehicles and persons to a stopped condition of the autonomous vehicle. An autonomous vehicle includes the warning indicator.
B60Q 1/50 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
B60Q 1/26 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
43.
SYSTEMS AND METHODS FOR GENERATING A TRAINING SET FOR A NEURAL NETWORK CONFIGURED TO GENERATE CANDIDATE TRAJECTORIES FOR AN AUTONOMOUS VEHICLE
This disclosure provides methods and systems for generating a training set for a neural network configured to generate candidate trajectories for an autonomous vehicle, comprising: receiving a set of sensor data representative of one or more portions of a plurality of objects in the environment of the autonomous vehicle; for each object, calculating a representative box enclosing the object, the representative box having portions comprising corners, edges, and planes; for each representative box, calculating at least one vector into the representative box from a position on the autonomous vehicle; for each vector, calculating a first and second corner position of the representative box, an edge of the representative box, and a plane of the representative box; determining the highest confidence corners, edge, and plane of each representative box based on calculation from the at least one vector; and generating a training set including the highest confidence corners, edge, and plane of each representative box.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
44.
SYSTEMS AND METHODS FOR DETECTING AND TRACKING OBJECTS IN AN ENVIRONMENT OF AN AUTONOMOUS VEHICLE
This disclosure provides methods and systems for detecting and tracking objects in an environment of an autonomous vehicle. The method may include: receiving sensor data from at least one sensor of the autonomous vehicle, the sensor data representative of one or more portions of an object in the environment of the autonomous vehicle; determining a highest confidence portion of the object, wherein the highest confidence portion of the object comprises a portion of the object that is observed and estimated with the highest accuracy and confidence; determining features of the highest confidence portion of the object; training a machine learning model based at least in part on the features of the highest confidence portion of the object and an error metric that measures the difference between the highest confidence portion of the object and a corresponding portion of the object in a ground truth; and detecting or tracking one or more objects in the environment of the autonomous vehicle using the trained machine learning model.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
This disclosure provides systems and methods for path planning by a planner of an autonomous vehicle. The method may include receiving perception data by the planner from a perception module, wherein the perception data comprises tracking or predicted object data associated with objects or obstacles in an environment of the autonomous vehicle, and wherein the tracking or predicted object data are determined based on high recall detection data and high precision detection data, generating by the planner a trajectory for controlling the autonomous vehicle based on the perception data received from the perception module, and transmitting to a controller of the autonomous vehicle the trajectory such that the autonomous vehicle is navigated by the controller to a destination.
This disclosure provides methods and systems for detecting and tracking objects in an environment of an autonomous vehicle. The method may include: receiving sensor data from at least one sensor of the autonomous vehicle, the sensor data representative of one or more portions of an object in the environment of the autonomous vehicle; determining a highest confidence portion of the object, wherein the highest confidence portion of the object comprises a portion of the object that is observed and estimated with the highest accuracy and confidence; determining features of the highest confidence portion of the object; training a machine learning model based at least in part on the features of the highest confidence portion of the object and an error metric that measures the difference between the highest confidence portion of the object and a corresponding portion of the object in a ground truth; and detecting or tracking one or more objects in the environment of the autonomous vehicle using the trained machine learning model.
G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
A method for remotely stopping a vehicle configured to operate autonomously or semi-autonomously. The method includes providing an electromechanical remote stop system, applying a control signal to the vehicle with the electromechanical remote stop system, and observing an errant behavior of the vehicle. The method includes actuating the electromechanical remote stop system after observing the errant behavior, ceasing applying the control signal based on actuating the electromechanical remote stop system, and applying a first braking force due to ceasing application of the control signal. The method includes applying a second braking force due to ceasing application of the control signal. Actuating the electromechanical remote stop system, and thus applying the first braking force and the second braking force, occurs at a location exterior to the vehicle.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B60W 10/06 - Conjoint control of vehicle sub-units of different type or different function including control of propulsion units including control of combustion engines
B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
A remote stop system for a vehicle configured to be operated autonomously or semi-autonomously. The remote stop system includes a remote stop controller configured to be installed on the vehicle and a remote stop actuator configured to send a wireless control signal to the remote stop controller. The remote stop actuator has a first state, in which the remote stop actuator continuously transmits the wireless control signal to the remote stop controller, and a second state, in which the remote stop actuator does not transmit the wireless control signal to the remote stop controller. In the second state, an engine of the vehicle is automatically stopped simultaneously with applying a braking force to the vehicle. A vehicle system includes the vehicle and the remote stop system.
B60T 7/08 - Brake-action initiating means for personal initiation hand-actuated
B60T 15/42 - Other control devices or valves characterised by definite functions with a quick braking action, i.e. with accelerating valves actuated by brake-pipe pressure variation
B60T 17/22 - Devices for monitoring or checking brake systems; Signal devices
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
A method for remotely stopping a vehicle configured to operate autonomously or semi-autonomously. The method includes providing a remote stop system, applying a control signal to the vehicle with the remote stop system, observing an errant behavior of the vehicle, and actuating the remote stop system after observing the errant behavior. The method includes ceasing applying the control signal based on actuating the remote stop system. The method includes stopping the vehicle due to actuating the remote stop system. Actuating the remote stop system, and thus, causing the vehicle to stop, occurs at a location exterior to the vehicle.
This disclosure provides systems and methods for controlling a vehicle. The method comprises receiving data from a set of sensors, wherein the data represents objects or obstacles in an environment of the autonomous vehicle; identifying objects or obstacles from the received data; determining multiple sets of attributes of the objects or obstacles, wherein each set of attributes of the objects or obstacles are determined based on data received by an individual sensor; determining a candidate trajectory for the autonomous vehicle based on the multiple sets of attributes of the objects or obstacles; and controlling the autonomous vehicle according to the candidate trajectory.
This disclosure provides systems and methods for controlling a vehicle. The method comprises receiving data from a set of sensors, wherein the data represents objects or obstacles in an environment of the autonomous vehicle; determining attributes of each of the objects or obstacles based on the received data from the set of sensors; integrating the attributes of the each of the objects or obstacles; identifying objects or obstacles based on the integrated attributes; determining a candidate trajectory for the autonomous vehicle to avoid the objects or obstacles; and controlling the autonomous vehicle according to the candidate trajectory.
This disclosure provides systems and methods for detecting and tracking objects or obstacles in an environment of an autonomous vehicle. The method may include receiving data from a set of sensors, wherein the data represents objects or obstacles in an environment of the autonomous vehicle; and using a processor: generating high precision detection data based on the received data; identifying, from the high precision detection data, a set of objects that are classifiable by at least one known classifier; generating high recall detection data based on the received data; identifying from the high recall detection data a set of obstacles; and performing an operation on the high precision detection data of the objects and the high recall detection data of the obstacles, based on a status of the autonomous vehicle or based on one or more characteristics of the objects or the obstacles.
This disclosure provides systems and methods for path planning by a planner of an autonomous vehicle. The method may include receiving, by the planner, perception data from a perception module, wherein the perception data comprises tracking or predicted object data associated with objects or obstacles in an environment of the autonomous vehicle, and wherein the tracking or predicted object data are determined based on high recall detection data and high precision detection data, generating by the planner a trajectory for controlling the autonomous vehicle based on the perception data received from the perception module, and transmitting to a controller of the autonomous vehicle the trajectory such that the autonomous vehicle is navigated by the controller to a destination.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
A remote stop system for a vehicle configured to be operated autonomously or semi-autonomously. The remote stop system includes a mechanical remote stop system configured to apply a first braking force and an electrical remote stop system configured to apply a second braking force. A remote stop actuator is configured to send a wireless control signal to the vehicle. The remote stop actuator has two states: a first state, in which the remote stop actuator transmits the wireless control signal to the mechanical remote stop system and the electrical remote stop system, and a second state, in which the remote stop actuator does not transmit the wireless control signal to the mechanical remote stop system and the electrical remote stop system. In the second state, the first braking force and the second braking force are applied simultaneously. A vehicle system includes the vehicle and the remote stop system.
B60T 7/16 - Brake-action initiating means for automatic initiation; Brake-action initiating means for initiation not subject to will of driver or passenger operated by remote control, i.e. initiating means not mounted on vehicle
B60T 8/18 - Arrangements for adjusting wheel-braking force to meet varying vehicular or ground-surface conditions, e.g. limiting or varying distribution of braking force responsive to vehicle weight or load, e.g. load distribution
55.
SYSTEMS AND METHODS FOR CONTROLLING A VEHICLE USING HIGH PRECISION AND HIGH RECALL DETECTION
This disclosure provides systems and methods for controlling a vehicle based on a combination of high precision detection and high recall detection. The disclosed systems and methods can efficiently generate trajectories by reducing duplicate detection or duplicate calculation of objects or obstacles of common and known object types and objects or obstacles without class identification.
G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
G06V 10/80 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
56.
SYSTEMS AND METHODS FOR CONTROLLING A VEHICLE BY DETECTING AND TRACKING OBJECTS THROUGH ASSOCIATED DETECTIONS
This disclosure provides methods and systems for dynamically detecting and tracking objects in an environment of an autonomous vehicle. In some embodiments, the method comprises: receiving image data from sensors of the autonomous vehicle, the image data comprising a plurality of images representative of objects in a field of view of the autonomous vehicle; detecting the objects in the plurality of images through an object detector; generating image embeddings for the objects detected in the plurality of images; determining similarity scores of the image embeddings of the objects that are detected in images received from two or more different sensors; identifying the objects that are detected in the images received from the two or more different sensors as a candidate object for tracking, if the objects have a similarity score of the image embeddings equal to or greater than a threshold value; and initializing a track for the candidate object.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G06V 10/762 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
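The embedding-similarity gate in the abstract above can be illustrated with cosine similarity (one plausible similarity measure; the disclosure does not specify which score is used, and the threshold is an invented value):

```python
def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b)

def candidates_for_tracking(embeddings_a, embeddings_b, threshold=0.9):
    """Pair detections from two sensors whose embedding similarity meets or
    exceeds the threshold, marking each pair as a track candidate."""
    return [(i, j)
            for i, a in enumerate(embeddings_a)
            for j, b in enumerate(embeddings_b)
            if cosine_similarity(a, b) >= threshold]
```

A track would then be initialized for each candidate pair.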
57.
SYSTEMS AND METHODS FOR DETECTING WATER ALONG DRIVING ROUTES
Systems and methods for determining a presence of water along a surface from Light Detection and Ranging (LiDAR) point clouds are provided. The method comprises generating, using a LiDAR system coupled to a vehicle, at least one point cloud, wherein the LiDAR system comprises a processor, and, using the processor, identifying and isolating a ground plane within the at least one point cloud, identifying one or more objects within the at least one point cloud, determining whether a reflection of the one or more objects is present in the at least one point cloud below the ground plane, and, when the reflection is present, determining that water is present along a surface.
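The reflection test above can be sketched by mirroring object points about the ground plane and looking for matching returns; the tolerance and helper names are hypothetical:

```python
def water_present(object_points, cloud, ground_z=0.0, tol=0.1):
    """True when every object point has a mirror image below the ground plane,
    i.e. a reflection consistent with standing water."""
    def mirrored(p):
        x, y, z = p
        return (x, y, 2 * ground_z - z)
    def near(a, b):
        return all(abs(x - y) <= tol for x, y in zip(a, b))
    return all(any(near(mirrored(p), q) for q in cloud) for p in object_points)
```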
Systems and methods for determining a degree of wetness of a road surface from Light Detection and Ranging (LiDAR) point clouds are provided. The method comprises generating, using a LiDAR system, at least one point cloud, wherein the LiDAR system comprises a processor, and, using the processor, identifying and isolating one or more road surface points within a point cloud of the at least one point cloud, wherein the one or more road surface points indicate a road surface portion within an environment of the point cloud, analyzing the one or more road surface points to determine a number of the one or more road surface points that are zero intensity returns, and, based on the number of zero intensity returns, determining a degree of wetness of the road surface.
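The zero-intensity fraction described above might be computed as (a minimal sketch; the mapping from this fraction to a wetness degree is not specified in the abstract):

```python
def wetness_degree(road_points):
    """Fraction of road-surface LiDAR returns with zero intensity, used as a
    proxy for how wet the road is. road_points: (x, y, z, intensity) tuples."""
    zero_returns = sum(1 for p in road_points if p[3] == 0)
    return zero_returns / len(road_points)
```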
Systems and methods for identifying water on a road surface via ultrasonic return intensity analysis are provided. The method comprises generating, using a processor, a road surface mapping of a road surface of an environment of a vehicle, transmitting, using one or more ultrasonic transducers, one or more ultrasonic waves in a direction of the road surface, receiving, using the one or more ultrasonic transducers, one or more returned ultrasonic waves, wherein the one or more returned ultrasonic waves return to the one or more ultrasonic transducers after reflecting off the road surface, determining an intensity of the one or more returned ultrasonic waves, and, when the intensity is below a threshold intensity, determining that a location at which the returned ultrasonic waves reflected off the road surface has water present.
Systems and methods of calibrating sensors for an autonomous vehicle. A method includes detecting a first vehicle feature of the autonomous vehicle from one or more first sensors, detecting a second vehicle feature of the autonomous vehicle from one or more second sensors, and calibrating the one or more second sensors with the one or more first sensors based on the first vehicle feature and the second vehicle feature. A system includes one or more first sensors, one or more second sensors, and one or more controllers. The one or more controllers detect a first vehicle feature from the one or more first sensors, detect a second vehicle feature from the one or more second sensors, and calibrate the one or more second sensors with the one or more first sensors based on the first vehicle feature and the second vehicle feature.
Systems and methods for detecting a presence of water along a surface and adjusting a speed of a vehicle are provided. The method comprises generating one or more data points from one or more sensors coupled to a vehicle, and, using a processor, determining, based on the one or more data points, whether water is present along a surface, when water is present along the surface, determining a speed adjustment of the vehicle based on the water present along the surface, and generating a control signal configured to cause the vehicle to adjust a speed of the vehicle according to the speed adjustment.
B60K 26/02 - Arrangement or mounting of propulsion-unit control devices in vehicles of initiating means or elements
F02D 11/10 - Arrangements for, or adaptations to, non-automatic engine control initiation means, e.g. operator initiated characterised by non-mechanical control linkages, e.g. fluid control linkages or by control linkages with power drive or assistance of the electric type
F02N 11/08 - Circuits specially adapted for starting of engines
63.
SYSTEMS AND METHODS FOR PROVIDING REDUNDANT PULSE-WIDTH MODULATION (PWM) THROTTLE CONTROL
Systems and methods are provided for providing redundant pulse-width modulation (PWM) throttle control. The system includes a manual throttle controller configured to generate a manual PWM throttle control signal, and an automated throttle control system. The automated throttle control system includes a plurality of automated throttle controllers, each of which is configured to independently control a throttle of a vehicle, and each of which includes a processor configured to generate and output an automated PWM throttle control signal, a first double pole double throw (DPDT) relay that, when engaged, is configured to receive and output the manual PWM throttle control signal, and a second DPDT relay configured to receive and output the automated PWM throttle control signal to an engine when the second DPDT relay is engaged, and to receive and output the manual PWM throttle control signal to the engine when the second DPDT relay is disengaged.
B60K 26/02 - Arrangement or mounting of propulsion-unit control devices in vehicles of initiating means or elements
F02D 11/10 - Arrangements for, or adaptations to, non-automatic engine control initiation means, e.g. operator initiated characterised by non-mechanical control linkages, e.g. fluid control linkages or by control linkages with power drive or assistance of the electric type
F02D 29/00 - Controlling engines, such controlling being peculiar to the devices driven thereby, the devices being other than parts or accessories essential to engine operation, e.g. controlling of engines by signals external thereto
F02N 11/08 - Circuits specially adapted for starting of engines
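The second relay's signal selection described in the abstract above reduces to a simple multiplexer. This sketch models only that selection logic, not the electrical hardware; the function and parameter names are invented:

```python
def engine_pwm(second_relay_engaged, manual_pwm, automated_pwm):
    """The second DPDT relay routes the automated PWM signal to the engine
    when engaged and falls back to the manual PWM signal when disengaged."""
    return automated_pwm if second_relay_engaged else manual_pwm
```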
64.
SYSTEMS AND METHODS FOR PROVIDING REDUNDANT PULSE-WIDTH MODULATION (PWM) THROTTLE CONTROL
Systems and methods are provided for providing redundant pulse-width modulation (PWM) throttle control. The system includes a manual throttle controller configured to generate a manual PWM throttle control signal, and an automated throttle control system. The automated throttle control system includes a plurality of automated throttle controllers, each of which is configured to independently control a throttle of a vehicle, and each of which includes a processor configured to generate and output an automated PWM throttle control signal, a first double pole double throw (DPDT) relay that, when engaged, is configured to receive and output the manual PWM throttle control signal, and a second DPDT relay configured to receive and output the automated PWM throttle control signal to an engine when the second DPDT relay is engaged, and to receive and output the manual PWM throttle control signal to the engine when the second DPDT relay is disengaged.
F02D 41/00 - Electrical control of supply of combustible mixture or its constituents
F02D 9/02 - Controlling engines by throttling air or fuel-and-air induction conduits or exhaust conduits concerning induction conduits
F02D 11/10 - Arrangements for, or adaptations to, non-automatic engine control initiation means, e.g. operator initiated characterised by non-mechanical control linkages, e.g. fluid control linkages or by control linkages with power drive or assistance of the electric type
F02D 41/20 - Output circuits, e.g. for controlling currents in command coils
F02D 41/22 - Safety or indicating devices for abnormal conditions
65.
METHOD OF INSTALLING AND UNINSTALLING A WARNING INDICATOR DEVICE ON A VEHICLE
A method of installing a warning indicator device on a vehicle. The method includes mounting the warning indicator device on the vehicle, attaching one or more removable fastening mechanisms to the warning indicator device and the vehicle, and applying a force with the one or more removable fastening mechanisms to secure the warning indicator device to the vehicle. A method of uninstalling a warning indicator device from a vehicle. The method includes loosening one or more removable fastening mechanisms such that a force on the warning indicator device from the one or more removable fastening mechanisms is removed, detaching the one or more removable fastening mechanisms from the warning indicator device, and removing the warning indicator device from the vehicle.
B60Q 1/26 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
B60Q 1/00 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
B60Q 1/38 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating change of drive direction using immovably-mounted light sources, e.g. fixed flashing lamps
B60Q 1/46 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for giving flashing caution signals during drive, other than signalling change of direction, e.g. flashing the headlights
66.
SYSTEMS AND METHODS OF WARNING APPROACHING VEHICLES OF AN AUTONOMOUS VEHICLE
Systems and methods of warning approaching vehicles of an autonomous vehicle. A method of warning approaching vehicles of an autonomous vehicle includes receiving one or more sensor signals from one or more vehicle sensors on the autonomous vehicle, determining a stopped condition of the vehicle based on the one or more sensor signals, and activating one or more warning indicators on a warning indicator device based on the stopped condition. The warning indicator device is mounted on the autonomous vehicle.
B60Q 1/50 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B60R 1/20 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
68.
SYSTEMS AND METHODS FOR CONTROLLING A VEHICLE BY TELEOPERATION BASED ON A SPEED LIMITER
This disclosure provides systems and methods for controlling a vehicle by teleoperation based on a speed limiter. The method may include: receiving, at the autonomous vehicle, a teleoperation input from a teleoperation system, wherein the teleoperation input comprises a throttle control input for remotely controlling a speed of the autonomous vehicle; determining the speed of the autonomous vehicle; determining whether the speed of the autonomous vehicle has reached a threshold speed below a speed limit; and upon determining that the speed of the autonomous vehicle has reached the threshold speed, reducing an effect of the throttle control input from the teleoperation system such that an acceleration rate of the autonomous vehicle is reduced.
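The speed-limiting behavior described above can be sketched as a throttle-scaling function. This is a minimal illustration only; the linear taper, the threshold ratio, and all names are assumptions for clarity and are not taken from the patent disclosure.

```python
def limited_throttle(teleop_throttle: float, speed: float,
                     speed_limit: float, threshold_ratio: float = 0.9) -> float:
    """Scale down a remote throttle command as speed approaches the limit.

    Below the threshold speed the teleoperator has full throttle authority;
    between the threshold and the limit, authority tapers linearly to zero.
    """
    threshold = threshold_ratio * speed_limit
    if speed < threshold:
        return teleop_throttle          # full authority below the threshold
    if speed >= speed_limit:
        return 0.0                      # no further acceleration at the limit
    # linearly taper throttle authority between threshold and limit
    fraction = (speed_limit - speed) / (speed_limit - threshold)
    return teleop_throttle * fraction
```

For example, with a 100 km/h limit and a 90% threshold, a full throttle command at 95 km/h would be reduced to half effect.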
A health indicator system for a sensor pod includes one or more sensors associated with the sensor pod, a sensor pod housing, an internal sensor status monitoring system configured to determine a status of at least one sensor of the one or more sensors, and a plurality of visual indicators located on at least one exterior surface of the sensor pod housing, the plurality of visual indicators configured to display a predetermined configuration based on the determined status of each of the one or more sensors. The plurality of visual indicators are located on the sensor pod housing at a level such that when an external operator standing on a ground surface approaches the vehicle, the plurality of visual indicators is within the external operator's field of view.
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60R 1/20 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
A method of regulating a temperature of a sensor pod on an autonomous vehicle includes determining a temperature of the sensor pod, determining an amount by which a vent should be opened based upon an operational status of the autonomous vehicle, an internal temperature of the sensor pod, and an ambient temperature of the environment, opening the vent when the temperature is above a predetermined threshold temperature, and closing the vent when the temperature is below the predetermined threshold temperature.
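The vent-control logic described in the abstract above can be sketched as a simple controller that maps temperatures to a vent opening fraction. The proportional-opening rule, the 10 °C full-open span, and the function name are illustrative assumptions, not details from the patent.

```python
def vent_opening(internal_temp: float, ambient_temp: float,
                 threshold: float, max_open: float = 1.0) -> float:
    """Return a vent opening fraction in [0, max_open].

    Keeps the vent closed below the threshold temperature, and avoids
    opening when venting could not shed heat (ambient hotter than the pod).
    """
    if internal_temp <= threshold:
        return 0.0                       # retain heat below the threshold
    if ambient_temp >= internal_temp:
        return 0.0                       # venting into warmer air sheds no heat
    # open proportionally to how far the pod is above the threshold
    excess = internal_temp - threshold
    return min(max_open, excess / 10.0)  # assumed 10 degree span to fully open
```

With a 40 °C threshold, a pod at 42 °C in 20 °C air would open the vent to 20%, and a pod at 50 °C would open it fully.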
A vented sensor pod system includes a sensor pod housing, one or more sensors located within the sensor pod housing, and one or more vents coupled to the sensor pod housing, the one or more vents configured to selectively release or retain heat within the sensor pod housing.
A dense sensor pod assembly for a vehicle. The dense sensor pod assembly includes a sensor pod housing, a connecting assembly, and a scanning lidar. The sensor pod housing includes one or more sensors located within the sensor pod housing. The connecting assembly couples the sensor pod housing to the vehicle. The scanning lidar is located within the connecting assembly. The scanning lidar, the connecting assembly, and the sensor pod housing form a dense sensor pod configuration.
A health indicator system for a sensor pod includes one or more sensors associated with the sensor pod, a sensor pod housing, an internal sensor status monitoring system configured to determine a status of at least one sensor of the one or more sensors, and a plurality of visual indicators located on at least one exterior surface of the sensor pod housing, the plurality of visual indicators configured to display a predetermined configuration based on the determined status of each of the one or more sensors. The plurality of visual indicators are located on the sensor pod housing at a level such that when an external operator standing on a ground surface approaches the vehicle, the plurality of visual indicators is within the external operator's field of view.
B60Q 1/50 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
B60Q 1/00 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
B60Q 5/00 - Arrangement or adaptation of acoustic signal devices
G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time
75.
SYSTEMS AND METHODS FOR CONTROLLING A VEHICLE BY TELEOPERATION BASED ON MAP CREATION
This disclosure provides systems and methods for controlling a vehicle by teleoperation based on map creation. The method may include: receiving, at the autonomous vehicle, a teleoperation input from a teleoperation system through a communication link, wherein the teleoperation input comprises a modification to at least a portion of an existing trajectory of the autonomous vehicle; generating updated map data based on the received teleoperation input from the teleoperation system, wherein the updated map data comprises the modification to at least the portion of the existing trajectory; determining a modified trajectory or a new trajectory for the autonomous vehicle based at least in part on the updated map data; and controlling the autonomous vehicle according to the modified trajectory or the new trajectory.
Systems and methods for deploying warning devices for a vehicle. The vehicle can be an autonomous vehicle. A method of deploying warning devices for a vehicle includes deploying one or more warning devices from the vehicle, the one or more warning devices being coupled to a deployment vehicle. Deploying the one or more warning devices includes controlling the deployment vehicle. A system for deploying warning devices for a vehicle includes a deployment vehicle and one or more warning devices coupled to the deployment vehicle. The system includes a controller configured to deploy the one or more warning devices from the vehicle; to deploy the one or more warning devices, the controller is configured to control the deployment vehicle.
B60Q 1/50 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
B60Q 1/46 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for giving flashing caution signals during drive, other than signalling change of direction, e.g. flashing the headlights
B60Q 1/52 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking for indicating emergencies
B60Q 7/00 - Arrangement or adaptation of portable emergency signal devices on vehicles
77.
SYSTEMS AND METHODS FOR CONTROLLING A VEHICLE BY TELEOPERATION BASED ON MAP CREATION
This disclosure provides systems and methods for controlling a vehicle by teleoperation based on map creation. The method may include: receiving sensor data from one or more sensors on the autonomous vehicle through a communication link, wherein the sensor data comprises environmental data associated with a physical environment of the autonomous vehicle and operation data comprising an existing trajectory of the autonomous vehicle; transforming the sensor data into visualization data configured to represent the physical environment of the autonomous vehicle on a visualization interface; generating updated map data comprising a modification to at least a portion of the existing trajectory; and transmitting to the autonomous vehicle a teleoperation input comprising the updated map data.
This disclosure provides systems and methods for controlling a vehicle by teleoperation based on map creation. The method may include: receiving, at the autonomous vehicle, a teleoperation input from a teleoperation system through a communication link, wherein the teleoperation input comprises a modification to at least a portion of an existing trajectory of the autonomous vehicle; and using a processor: generating updated map data based on the received teleoperation input from the teleoperation system, wherein the updated map data comprises the modification to at least the portion of the existing trajectory; determining a modified trajectory or a new trajectory for the autonomous vehicle based at least in part on the updated map data and by incorporating sensor data from one or more sensors on the autonomous vehicle, wherein the sensor data comprises environmental data associated with a physical environment of the autonomous vehicle; and controlling the autonomous vehicle according to the modified trajectory or the new trajectory.
A method for warning approaching vehicles of an autonomous vehicle includes sensing a stopped condition of the autonomous vehicle, activating a warning indicator of the autonomous vehicle based on the stopped condition, sensing a start condition of the autonomous vehicle, and deactivating the warning indicator of the autonomous vehicle based on the start condition. The warning indicator includes a plurality of LEDs or an LED panel coupled to or formed within a sensor pod coupled to the autonomous vehicle.
B60Q 1/26 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
B60Q 1/50 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
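The stop/start warning behavior in the abstract above amounts to a small state machine. The sketch below illustrates one possible reduction of "stopped condition" and "start condition" to a speed threshold; the threshold value and class name are assumptions, not from the patent.

```python
class WarningIndicator:
    """Activates on a sensed stopped condition, deactivates on a start condition."""

    def __init__(self) -> None:
        self.active = False

    def update(self, speed: float, stop_threshold: float = 0.1) -> bool:
        """Update indicator state from the latest sensed vehicle speed."""
        if speed <= stop_threshold:
            self.active = True    # stopped condition: activate the indicator
        else:
            self.active = False   # start condition: deactivate the indicator
        return self.active
```

In practice the stopped condition would be fused from several sensor signals rather than a single speed reading, but the activation logic is the same.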
A warning indicator system for an autonomous vehicle includes a sensor pod, a connecting assembly, and a warning indicator. The sensor pod has a sensor and a sensor pod housing. The connecting assembly is configured to couple the sensor pod to the autonomous vehicle. The warning indicator is associated with the sensor pod and the connecting assembly. The warning indicator is configured to alert other vehicles and persons to a stopped condition of the autonomous vehicle. An autonomous vehicle includes the warning indicator.
B60Q 1/50 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
B60Q 1/26 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
81.
System and method for identifying a vehicle subject to an emergency alert and dispatching of signals
Systems and methods are provided for identifying a vehicle subject to an emergency alert. The system comprises one or more autonomous vehicles, each autonomous vehicle comprising a vehicle detection and identification system configured to analyze one or more vehicles within a surrounding environment, and a wireless emergency alert system. The wireless emergency alert system may be configured to receive or generate an emergency alert, wherein the emergency alert includes a geographic region associated with the emergency alert and one or more identifiable markers of a wanted vehicle, determine one or more autonomous vehicles to receive the emergency alert, and relay the emergency alert to the one or more autonomous vehicles.
Systems and methods are provided for identifying a vehicle subject to an emergency alert. The method includes receiving an emergency alert, wherein the emergency alert includes a geographic region associated with the emergency alert and one or more identifiable markers of a wanted vehicle, determining whether the autonomous vehicle is within the geographic region associated with the emergency alert, and detecting, using one or more detection mechanisms coupled to the autonomous vehicle, a detected vehicle within an environment of the autonomous vehicle. The method further includes, when the autonomous vehicle is within the geographic region associated with the emergency alert, determining, for each identifiable marker, whether the detected vehicle matches the identifiable marker, and when the detected vehicle matches the one or more identifiable markers, generating, using a processor coupled to the autonomous vehicle, a signal indicating that the detected vehicle is the wanted vehicle.
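The matching steps in the method above can be sketched as a region check followed by a marker-by-marker comparison. The bounding-box region representation, the dictionary of markers, and all names below are illustrative assumptions rather than details of the disclosed system.

```python
from dataclasses import dataclass


@dataclass
class EmergencyAlert:
    region: tuple   # assumed bounding box: (min_lat, min_lon, max_lat, max_lon)
    markers: dict   # identifiable markers, e.g. {"color": "red", "plate": "ABC123"}


def in_region(lat: float, lon: float, region: tuple) -> bool:
    """Check whether a position falls inside the alert's geographic region."""
    min_lat, min_lon, max_lat, max_lon = region
    return min_lat <= lat <= max_lat and min_lon <= lon <= max_lon


def is_wanted(detected_vehicle: dict, av_position: tuple,
              alert: EmergencyAlert) -> bool:
    """Signal a match only when the AV is in-region and every marker matches."""
    if not in_region(*av_position, alert.region):
        return False
    return all(detected_vehicle.get(k) == v for k, v in alert.markers.items())
```

Requiring every marker to match keeps false positives low; a real system would likely score partial matches instead of demanding exact equality.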
A method of installing a sensor pod on a vehicle includes aligning a sensor pod arm with a bracket attached to the vehicle, lowering the sensor pod arm onto the bracket, supporting the weight of the sensor pod on a support axle between the bracket and the sensor pod arm before rigidly coupling the sensor pod arm to the bracket, rotating the sensor pod arm into alignment with the bracket, and securing the sensor pod arm to the bracket. A method of uninstalling the sensor pod from a vehicle is also disclosed.
B60R 1/06 - Rear-view mirror arrangements mounted on vehicle exterior
B60R 1/12 - Mirror assemblies combined with other articles, e.g. clocks
B60R 11/02 - Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
84.
SYSTEMS AND METHODS FOR LIDAR ATMOSPHERIC FILTERING BACKGROUND
Systems and methods are provided for filtering atmospheric conditions from LiDAR point clouds. The method includes generating, using a LiDAR system, at least one point cloud, wherein the LiDAR system includes a processor. The method further includes, using the processor, identifying and isolating one or more ground points within a point cloud of the at least one point cloud, wherein the one or more ground points indicate a ground portion within an environment of the point cloud, filtering out the ground portion from the point cloud, generating an initial processed point cloud, identifying and isolating one or more atmospheric condition points within the initial processed point cloud, wherein the one or more atmospheric condition points indicate one or more atmospheric conditions within an environment of the processed point cloud, and filtering out the atmospheric condition points from the initial processed point cloud, generating a final processed point cloud.
B60R 1/06 - Rear-view mirror arrangements mounted on vehicle exterior
B60R 1/076 - Rear-view mirror arrangements mounted on vehicle exterior yieldable to excessive external force and provided with an indexed use position
B60R 1/078 - Rear-view mirror arrangements mounted on vehicle exterior easily removable; Rear-view mirror arrangements mounted on vehicle exterior mounted for bodily outward movement, e.g. when towing
B60R 1/12 - Mirror assemblies combined with other articles, e.g. clocks
B60R 11/00 - Arrangements for holding or mounting articles, not otherwise provided for
B62D 65/16 - Joining sub-units or components to, or positioning sub-units or components with respect to, body shell or other sub-units or components the sub-units or components being exterior fittings, e.g. bumpers, lights, wipers
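The two-stage filtering pipeline of entry 84 (ground removal, then atmospheric-point removal) can be sketched over a list of points. Representing points as (x, y, z, intensity) tuples, using a height cutoff for ground, and using low return intensity as the atmospheric-condition cue are all assumptions for illustration; the patent does not specify these criteria.

```python
def filter_point_cloud(points, ground_z: float = 0.3,
                       min_intensity: float = 0.15):
    """Two-stage LiDAR filter: drop ground points, then atmospheric returns.

    points: iterable of (x, y, z, intensity) tuples.
    Stage 1 removes points at or below an assumed ground height; stage 2
    removes low-intensity returns, which fog or rain droplets tend to produce.
    """
    # stage 1: identify and filter out the ground portion
    initial_processed = [p for p in points if p[2] > ground_z]
    # stage 2: filter out atmospheric-condition points from the result
    return [p for p in initial_processed if p[3] >= min_intensity]
```

A production filter would estimate the ground plane from the data rather than use a fixed height, but the sequential structure matches the abstract's initial and final processed point clouds.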
A universal bracket for connecting a sensor pod and a vehicle. The universal bracket includes a first end having a surface for connecting to the vehicle, a second end for connecting to the sensor pod, three fixation points extending perpendicular to and through the surface for preventing lateral movement, vertical movement, and forward movement of the universal bracket with respect to the vehicle, the three fixation points further preventing rotational movement of the universal bracket with respect to the vehicle, and at least one port extending from the first end to the second end, the at least one port configured to allow passage of one or more conduits extending from the vehicle to the sensor pod. A connecting assembly includes the universal bracket.
B60R 11/00 - Arrangements for holding or mounting articles, not otherwise provided for
B62D 65/16 - Joining sub-units or components to, or positioning sub-units or components with respect to, body shell or other sub-units or components the sub-units or components being exterior fittings, e.g. bumpers, lights, wipers
B60R 1/06 - Rear-view mirror arrangements mounted on vehicle exterior
B60R 1/076 - Rear-view mirror arrangements mounted on vehicle exterior yieldable to excessive external force and provided with an indexed use position
B60R 1/078 - Rear-view mirror arrangements mounted on vehicle exterior easily removable; Rear-view mirror arrangements mounted on vehicle exterior mounted for bodily outward movement, e.g. when towing
B60R 1/12 - Mirror assemblies combined with other articles, e.g. clocks
A quick swap sensor pod for a truck includes an arm having a protrusion with a lower surface, a pin receiving opening extending through the protrusion to the lower surface, the pin receiving opening having a depth and being aligned vertically, and a conduit connector within the arm for coupling a conduit to the quick swap sensor pod. The arm is configured to rotate about an axis of the pin receiving opening between an initial position and a final position. In both the initial position and the final position, the depth of the pin receiving opening is configured to counteract a moment created by a weight of the quick swap sensor pod and the lower surface is configured to support the weight of the quick swap sensor pod.
An apparatus for reducing damage and debris in a sensor pod collision. The apparatus includes a bracket configured to couple a sensor pod to a vehicle, the bracket having a post, a sensor pod arm rotatable about the post, a fastener for securing the sensor pod arm to the post, and a frangible fixation point spaced apart from the post, the frangible fixation point configured to break apart at a predetermined force.
A connecting assembly for connecting sensors in a sensor pod to a vehicle. The connecting assembly has a conduit connector located on a housing of the sensor pod, a conduit configured to connect with the conduit connector and extending from the conduit connector to the vehicle, and a conduit connector point located at a connection between the conduit connector and the conduit. The conduit connector point has a first shear strength when the conduit is in tension and the conduit has a second shear strength when the conduit is in tension. The first shear strength is less than the second shear strength.
B60R 16/02 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric
B60R 1/06 - Rear-view mirror arrangements mounted on vehicle exterior
B60R 1/12 - Mirror assemblies combined with other articles, e.g. clocks
93.
SENSOR ASSEMBLY WITH LIDAR FOR AUTONOMOUS VEHICLES
A sensor assembly for autonomous vehicles includes a side mirror assembly configured to mount to a vehicle. The side mirror assembly includes a first camera having a field of view in a direction opposite a direction of forward travel of the vehicle; a second camera having a field of view in the direction of forward travel of the vehicle; and a third camera having a field of view in a direction substantially perpendicular to the direction of forward travel of the vehicle. The first camera, the second camera, and the third camera are oriented to provide, in combination with a fourth camera configured to be mounted on a roof of the vehicle, an uninterrupted camera field of view from the direction of forward travel of the vehicle to a direction opposite the direction of forward travel of the vehicle.
B60R 11/04 - Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
B60R 1/06 - Rear-view mirror arrangements mounted on vehicle exterior
B60R 1/12 - Mirror assemblies combined with other articles, e.g. clocks
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G03B 37/04 - Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
A sensor assembly for autonomous vehicles includes a side mirror assembly configured to mount to a vehicle. The side mirror assembly includes a first camera having a field of view in a direction opposite a direction of forward travel of the vehicle; a second camera having a field of view in the direction of forward travel of the vehicle; and a third camera having a field of view in a direction substantially perpendicular to the direction of forward travel of the vehicle. The first camera, the second camera, and the third camera are oriented to provide, in combination with a fourth camera configured to be mounted on a roof of the vehicle, an uninterrupted camera field of view from the direction of forward travel of the vehicle to a direction opposite the direction of forward travel of the vehicle.
B60R 11/04 - Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
B60R 1/06 - Rear-view mirror arrangements mounted on vehicle exterior
B60R 1/12 - Mirror assemblies combined with other articles, e.g. clocks
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G03B 37/04 - Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
A sensor assembly for autonomous vehicles includes a side mirror assembly configured to mount to a vehicle. The side mirror assembly includes a first camera having a field of view in a direction opposite a direction of forward travel of the vehicle; a second camera having a field of view in the direction of forward travel of the vehicle; and a third camera having a field of view in a direction substantially perpendicular to the direction of forward travel of the vehicle. The first camera, the second camera, and the third camera are oriented to provide, in combination with a fourth camera configured to be mounted on a roof of the vehicle, an uninterrupted camera field of view from the direction of forward travel of the vehicle to a direction opposite the direction of forward travel of the vehicle.
B60R 11/04 - Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
B60R 1/06 - Rear-view mirror arrangements mounted on vehicle exterior
B60R 1/12 - Mirror assemblies combined with other articles, e.g. clocks
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G03B 37/04 - Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
A sensor assembly for autonomous vehicles includes a side mirror assembly configured to mount to a vehicle. The side mirror assembly includes a first camera having a field of view in a direction opposite a direction of forward travel of the vehicle; a second camera having a field of view in the direction of forward travel of the vehicle; and a third camera having a field of view in a direction substantially perpendicular to the direction of forward travel of the vehicle. The first camera, the second camera, and the third camera are oriented to provide, in combination with a fourth camera configured to be mounted on a roof of the vehicle, an uninterrupted camera field of view from the direction of forward travel of the vehicle to a direction opposite the direction of forward travel of the vehicle.
B60R 11/04 - Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
B60R 1/06 - Rear-view mirror arrangements mounted on vehicle exterior
B60R 1/12 - Mirror assemblies combined with other articles, e.g. clocks
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G03B 37/04 - Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
A sensor assembly for autonomous vehicles includes a side mirror assembly configured to mount to a vehicle. The side mirror assembly includes a first camera having a field of view in a direction opposite a direction of forward travel of the vehicle; a second camera having a field of view in the direction of forward travel of the vehicle; and a third camera having a field of view in a direction substantially perpendicular to the direction of forward travel of the vehicle. The first camera, the second camera, and the third camera are oriented to provide, in combination with a fourth camera configured to be mounted on a roof of the vehicle, an uninterrupted camera field of view from the direction of forward travel of the vehicle to a direction opposite the direction of forward travel of the vehicle.
B60R 11/04 - Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
B60R 1/06 - Rear-view mirror arrangements mounted on vehicle exterior
B60R 1/12 - Mirror assemblies combined with other articles, e.g. clocks
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G03B 37/04 - Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
A sensor assembly for autonomous vehicles includes a side mirror assembly configured to mount to a vehicle. The side mirror assembly includes a first camera having a field of view in a direction opposite a direction of forward travel of the vehicle; a second camera having a field of view in the direction of forward travel of the vehicle; and a third camera having a field of view in a direction substantially perpendicular to the direction of forward travel of the vehicle. The first camera, the second camera, and the third camera are oriented to provide, in combination with a fourth camera configured to be mounted on a roof of the vehicle, an uninterrupted camera field of view from the direction of forward travel of the vehicle to a direction opposite the direction of forward travel of the vehicle.
B60R 11/04 - Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
B60R 1/06 - Rear-view mirror arrangements mounted on vehicle exterior
B60R 1/12 - Mirror assemblies combined with other articles, e.g. clocks
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
G01S 13/931 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G03B 37/04 - Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
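The abstract above claims that the three mirror-mounted cameras, together with a roof-mounted fourth camera, yield an uninterrupted field of view sweeping from the forward direction to the rearward direction. A minimal sketch of that coverage argument, using illustrative FOV angles that are assumptions and not values from the abstract, checks whether a set of angular intervals covers the full front-to-rear sweep without a gap:

```python
# Sketch: checking that overlapping camera fields of view form an
# uninterrupted sweep from forward (0 deg) to rearward (180 deg).
# All FOV angles below are hypothetical, chosen only for illustration.

def covers_sweep(intervals, start=0.0, end=180.0):
    """Return True if the union of (lo, hi) angular intervals covers [start, end]."""
    reach = start
    for lo, hi in sorted(intervals):
        if lo > reach:          # a gap opens before this interval begins
            return False
        reach = max(reach, hi)  # extend continuous coverage
        if reach >= end:
            return True
    return reach >= end

# Hypothetical horizontal FOVs, in degrees from the forward direction:
cameras = [
    (0, 70),     # second camera: forward-facing, on the mirror assembly
    (60, 120),   # third camera: roughly perpendicular to travel
    (100, 180),  # first camera: rearward-facing, on the mirror assembly
    (150, 180),  # fourth camera: roof-mounted, overlapping the rear view
]

print(covers_sweep(cameras))  # True: no angular gap from front to rear
```

With these assumed angles each camera's interval begins before the previous one ends, so coverage is continuous; removing the perpendicular camera would leave a gap between 70 and 100 degrees and the check would fail.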