Techniques are described herein for generating and performing backup trajectories on an autonomous vehicle, based on failing to receive a driving trajectory from a primary trajectory generation system. A trajectory execution system may be configured to periodically receive driving trajectories from the trajectory generation system and implement the trajectories via the drive system of the autonomous vehicle. The trajectory generation system described herein also may generate and validate future safety trajectories in order to determine whether the trajectory execution system can generate valid safety trajectories in response to a trajectory loss event. The future safety trajectories generated by the trajectory generation system may be based on a resettable trajectory loss timer of the trajectory execution system and may apply a similar safety algorithm to the future vehicle state and location based on the primary driving trajectory determined by the trajectory generation system.
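The resettable trajectory-loss timer described above can be sketched as a simple watchdog: a fresh primary trajectory resets the timer, and expiry triggers fallback to the most recently validated safety trajectory. This is a minimal illustration only; the class, field names, and timeout value are hypothetical, not from the source.

```python
class TrajectoryWatchdog:
    """Sketch of a resettable trajectory-loss timer (names hypothetical)."""

    def __init__(self, timeout_s=0.5):
        self.timeout_s = timeout_s
        self.last_receipt = None
        self.driving_trajectory = None
        self.safety_trajectory = None

    def on_trajectory(self, trajectory, safety_trajectory, now):
        # A newly received primary trajectory resets the loss timer and
        # stores the validated safety trajectory as the fallback.
        self.driving_trajectory = trajectory
        self.safety_trajectory = safety_trajectory
        self.last_receipt = now

    def active_trajectory(self, now):
        # On a trajectory-loss event (timer expired), fall back to the
        # safety trajectory; otherwise continue the primary trajectory.
        if self.last_receipt is None or now - self.last_receipt > self.timeout_s:
            return self.safety_trajectory
        return self.driving_trajectory


wd = TrajectoryWatchdog(timeout_s=0.5)
wd.on_trajectory("drive-ahead", "stop-in-lane", now=10.0)
normal = wd.active_trajectory(now=10.2)    # timer not yet expired
fallback = wd.active_trajectory(now=11.0)  # loss event: timer expired
```

In practice the execution system would run such a check at every control tick, so a missed deadline switches the drive system onto the safety trajectory within one cycle.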
A trajectory associated with a vehicle may be determined based on determining that an object in the vehicle's environment is associated with a behavior of interest which may diverge from a nominal behavior otherwise associated with the object. Based on the divergence from anticipated nominal behavior, a region of the environment may be identified and used to determine a trajectory for control. The determined trajectory may be used to control a vehicle and/or inform the driving behavior of a vehicle.
B60W 30/095 - Predicting travel path or likelihood of collision
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, related to ambient conditions
B60W 50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
3.
DETERMINING VEHICLE TRAJECTORIES BASED ON OBJECT BEHAVIORS
A trajectory associated with a vehicle may be determined based on determining that an object in the vehicle's environment is associated with a behavior of interest which may diverge from a nominal behavior otherwise associated with the object. Based on the divergence from anticipated nominal behavior, a region of the environment may be identified and used to determine a trajectory for control. The determined trajectory may be used to control a vehicle and/or inform the driving behavior of a vehicle.
Techniques are described for intelligent vehicle pull-over in response to detecting abnormal conditions. The techniques may include determining a presence of a condition necessitating that a vehicle cease from operating on a road surface and determining a constraint associated with the vehicle ceasing from operating on the road surface. In some examples, the constraint may be a time constraint, a geographical constraint, and/or the like. The techniques may also include sending, to a planner component of the vehicle, an indication that the vehicle is to cease from operating on the road surface in accordance with the constraint. Additionally, based at least in part on a determination that the planner component acknowledged the indication in accordance with the constraint, the techniques may include refraining from causing the vehicle to stop on the road surface in a manner that impedes a flow of traffic.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
Techniques comprise determining, based at least in part on a global positioning system associated with a user device, a user location and causing, based on the user location, the user device to display an indication of a vehicle location relative to the user location. Vehicle information comprising a direction between the user and the vehicle is received via a direct wireless interface, based at least in part on the user being at or less than a first threshold distance from the vehicle. Based at least in part on the vehicle information, the user device is caused to display an additional indication indicative of the vehicle location and a direction to follow to reach the vehicle. Further, based at least in part on the user being at or less than a second threshold distance from the vehicle, opening of a door of the vehicle is described.
G01C 21/34 - Route searching; Route guidance
G01C 21/36 - Input/output arrangements for on-board computers
H04L 9/32 - Arrangements for secret or secure communications; Network security protocols, including means for verifying the identity or authority of a user of the system
B60W 40/08 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, related to drivers or passengers
Techniques are described herein for generating and performing backup trajectories on an autonomous vehicle, based on failing to receive a driving trajectory from a primary trajectory generation system. A trajectory execution system may be configured to periodically receive driving trajectories from the trajectory generation system and implement the trajectories via the drive system of the autonomous vehicle. The trajectory generation system described herein also may generate and validate future safety trajectories in order to determine whether the trajectory execution system can generate valid safety trajectories in response to a trajectory loss event. The future safety trajectories generated by the trajectory generation system may be based on a resettable trajectory loss timer of the trajectory execution system and may apply a similar safety algorithm to the future vehicle state and location based on the primary driving trajectory determined by the trajectory generation system.
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, related to ambient conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
Techniques for determining and aggregating prediction error are described herein. A prediction may be determined for an object within a region based on first sensor data. A prediction error may be determined based on a difference between the prediction and second sensor data. The prediction error may be classified by one or more error types and aggregated by a same error type. The aggregated prediction error may be used to train a machine learned prediction model.
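The classify-then-aggregate step above can be illustrated with a minimal sketch: each prediction error carries a type label, and errors of the same type are pooled into an average suitable for training feedback. The error-type names and the averaging choice are illustrative assumptions, not from the source.

```python
def aggregate_prediction_errors(errors):
    """Group per-object prediction errors by error type and average them.

    `errors` is a list of (error_type, magnitude) pairs; the type names
    below ("position", "heading") are hypothetical examples.
    """
    buckets = {}
    for error_type, magnitude in errors:
        # Pool errors of the same type together, as described.
        buckets.setdefault(error_type, []).append(magnitude)
    # One aggregate value per error type (here, the mean magnitude).
    return {t: sum(v) / len(v) for t, v in buckets.items()}


errors = [("position", 0.4), ("position", 0.6), ("heading", 0.1)]
aggregated = aggregate_prediction_errors(errors)
```

The aggregated per-type values could then serve as training signals or weights for a machine-learned prediction model, as the abstract describes.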
Techniques are described herein for determining, based at least in part on a global positioning system associated with a user device, a user location and causing, based on the user location, the user device to display an indication of a vehicle location relative to the user location. Furthermore, receiving, via a direct wireless interface and based at least in part on the user being at or less than a first threshold distance from the vehicle, vehicle information is described, wherein the vehicle information comprises a direction between the user and the vehicle. It is further described to cause, based at least in part on the vehicle information, the user device to display an additional indication, wherein the additional indication is indicative of the vehicle location relative to the user location and a direction to follow for the user to reach the vehicle. Further causing, based at least in part on the user being at or less than a second threshold distance from the vehicle, opening of a door of the vehicle is described.
G01S 19/14 - Receivers specially adapted for specific applications
B60R 25/24 - Means to switch the anti-theft system on or off by electronic identifiers containing a code not memorised by the user
E05F 15/77 - Power-operated mechanisms for wings with automatic actuation using wireless control
G01S 13/02 - Systems using the reflection of radio waves, e.g. primary radar systems; Analogous systems
G01S 13/46 - Indirect determination of position data
Collision avoidance and error determination for a component of an autonomous vehicle comprise receiving a first trajectory that a vehicle is predicted to follow (e.g., a trajectory to return the vehicle to an intended trajectory), based on an offset between the vehicle and a second trajectory associated with the vehicle, such as a reference trajectory. The first trajectory predicts a first movement characteristic (e.g., a position) of the vehicle at a point in time. A second movement characteristic is received, representing an actual movement characteristic of the vehicle at that point in time. A first error between the first and second movement characteristics is determined. Based at least in part on the first error, performance of a model for generating trajectories that a vehicle is predicted to follow is validated.
B60W 30/09 - Taking automatic action to avoid collision, e.g. braking or steering
B60W 30/095 - Predicting travel path or likelihood of collision
B60W 50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
B60W 50/02 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, for ensuring safety in case of a drive control system failure, e.g. by diagnosing or mitigating a malfunction
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
11.
Sensor fusion for detecting false-positive observations
Techniques for identifying false-positive sensor observations include using sensor observations from different sensor modalities. The techniques may include receiving first sensor data (e.g., radar data) and second sensor data (e.g., lidar data) generated by different types of sensors of a vehicle that is operating in an environment. The techniques may also include determining an absence of an observation in the first sensor data at a location in the environment where an observation is indicated in the second sensor data. The techniques may also include receiving an indication that a retroreflective surface is present in the environment. Based at least in part on at least one of the retroreflective surface being present in the environment or the absence of a radar observation at the location in the environment where a lidar observation is indicated, the techniques may include determining that the observation in the second sensor data is a false-positive observation.
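The cross-modality check described above reduces to a small decision rule: a lidar observation without a corroborating radar return, or one made near a known retroreflective surface, is flagged as a likely false positive. The following sketch encodes that rule directly; the function and argument names are illustrative, not from the source.

```python
def is_false_positive(lidar_hit, radar_hit, retroreflective_nearby):
    """Flag a lidar observation as a likely false positive (sketch).

    Per the described heuristic, the flag is raised based at least in
    part on the absence of a radar observation at the same location or
    the presence of a retroreflective surface in the environment.
    """
    if not lidar_hit:
        # Nothing observed in the second modality; nothing to flag.
        return False
    return (not radar_hit) or retroreflective_nearby


# Lidar return with no radar return near a retroreflector: suspect.
flagged = is_false_positive(lidar_hit=True, radar_hit=False, retroreflective_nearby=True)
# Both modalities agree and no retroreflector: keep the observation.
kept = is_false_positive(lidar_hit=True, radar_hit=True, retroreflective_nearby=False)
```

A real perception stack would apply such a rule per spatial cell after associating observations across sensor modalities, rather than to single boolean flags.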
Techniques for determining whether a machine-learned model has improved or regressed in response to an update, as well as verifying whether those improvements or regressions impact overall vehicle safety, are described herein. The techniques may include running a simulation in which a simulated vehicle traverses a simulated environment. During the simulation, sensor data associated with the simulated environment may be input to the machine-learned model, which is configured for use in the real vehicle. As such, perception data outputs associated with objects detected in the simulated environment may be received from the machine-learned model during the simulation, and one or more error models may be generated based on the outputs and a ground truth. If the error model indicates that an error meets or exceeds a threshold error, the machine-learned model may be updated to reduce the error below the threshold.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G06F 30/27 - Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
Techniques for collecting sensor data from an environment in which a vehicle operates are discussed herein. The sensor data may be collected from sensors mounted on a sensor platform which is mounted to the vehicle. The vehicle may be operated in a non-autonomous mode. The sensor data may be used to train a machine learned model for use by a purpose-built autonomous vehicle operating in the environment. Sensors mounted on the sensor platform may correspond to sensors which are built into the purpose-built autonomous vehicle.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
14.
Conversion of driving logs to simulations for autonomous vehicle testing and validation
Techniques are described herein for converting driving logs to driving simulations and for analysis and improvement of said conversion. Log data may be received and an object may be identified. It may be determined that a plurality of models does not contain a sufficient model for representing the object in a driving simulation, and a machine-generated model may be generated for representing the object in the driving simulation. A driving simulation may then be generated including the model.
Techniques for determining labels for potentially small, moving, and non-impeding objects in an environment are disclosed. Unlabeled lidar segments may be evaluated to determine whether segments are associated with objects located in a drivable region and to determine one or more characteristics of such segments. This determination includes comparing segment characteristics to various criteria that may be associated with small, dynamic objects, such as object size, motion, occlusion, and/or solidity. Based on this evaluation and whether the segment is in a drivable region, the system determines whether to label the segment as a small, dynamic, non-impeding object.
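The criteria comparison above lends itself to a rule-based sketch: a lidar segment in a drivable region receives the label only when its size, motion, and solidity all satisfy the thresholds. All threshold values and field names below are hypothetical stand-ins for the criteria the source leaves unspecified.

```python
def label_segment(seg):
    """Rule-based sketch of the described labeling criteria.

    Returns a label for segments in a drivable region matching the
    small / dynamic / solid criteria; thresholds are illustrative.
    """
    if not seg["in_drivable_region"]:
        # Only segments located in a drivable region are candidates.
        return None
    small = seg["length_m"] < 0.5 and seg["height_m"] < 0.5
    dynamic = seg["speed_mps"] > 0.2
    solid = seg["solidity"] > 0.6
    if small and dynamic and solid:
        return "small_dynamic_non_impeding"
    return None


# A hypothetical small piece of wind-blown debris on the roadway.
segment = {
    "in_drivable_region": True, "length_m": 0.3, "height_m": 0.2,
    "speed_mps": 1.1, "solidity": 0.8,
}
label = label_segment(segment)
```

A production labeler would also fold in the occlusion characteristic the abstract mentions; it is omitted here to keep the sketch minimal.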
Systems and techniques for a cooling system are discussed herein. The cooling system may have a cooling block having a first side and a second side opposite the first side. A first computing component may be positioned proximate the first side of the cooling block and a second computing component may be positioned proximate the second side of the cooling block. The cooling block may include a fluid pathway through which a cooling fluid passes; as the cooling fluid passes through the fluid pathway, it can thermally transfer heat simultaneously from both the first computing component and the second computing component. The cooling system and the computing components can be installed inside a vehicle as its vehicle controller.
H01M 10/6556 - Solid parts with flow channel passages or pipes for heat exchange
H01M 10/656 - Means for temperature control structurally associated with the cells, characterised by the type of heat-exchange fluid
17.
Object classification with out-of-distribution detection
An out-of-distribution (OOD) detector may be provided to efficiently identify OOD inputs to trained machine learning models, to prevent passing misleading predictions from the trained models to downstream systems. During training of a machine learning model including a deep neural network (e.g., an object detection and/or object classifier model based on sensor data), an out-of-distribution detector may be constructed by modeling the density of the intermediate features of the deep neural network (e.g., using a Gaussian mixture model). In some examples, the intermediate features may be within a finetuned model based on a pretrained foundation model, and the machine learning model may be finetuned with regularization toward the original weights of the pretrained model. During inference, the machine learning model may output predictions, such as multilabel object detection and/or classification outputs, and the out-of-distribution detector may be used to efficiently generate confidence values associated with the outputs of the model.
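The density-modeling idea above can be sketched in miniature: fit a density to intermediate features seen during training, then score new features against it, with low density indicating an out-of-distribution input. For brevity a single 1-D Gaussian stands in for the Gaussian mixture over high-dimensional intermediate features that the abstract describes; all numbers are illustrative.

```python
import math

def fit_gaussian(features):
    """Fit a 1-D Gaussian to training-time intermediate feature values.

    A single Gaussian is a stand-in here for the Gaussian mixture model
    described; values are illustrative only.
    """
    mean = sum(features) / len(features)
    var = sum((x - mean) ** 2 for x in features) / len(features)
    return mean, var

def log_density(x, mean, var):
    # Log of the Gaussian probability density at x.
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)


# Intermediate feature values observed during training (hypothetical).
train_features = [0.9, 1.0, 1.1, 1.0, 0.95, 1.05]
mean, var = fit_gaussian(train_features)

in_dist = log_density(1.0, mean, var)   # near the training density
out_dist = log_density(5.0, mean, var)  # far from the training density
```

Thresholding such a density score yields the confidence value attached to each model output, so downstream systems can discount predictions on out-of-distribution inputs.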
B60W 50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G06F 18/2415 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on a likelihood ratio or a false-positive rate versus a false-negative rate
Radar sensor alignment validation may ensure a relative position and/or orientation of two or more radar sensors. Radar sensor alignment validation may include determining whether radar data used to determine a sensor alignment is usable for validation and/or determining whether the sensor alignment itself is accurate. This may include iteratively altering a sensor alignment by changing at least one sensor's relative pose, re-determining altered radar data using the altered pose, and determining an updated sensor alignment for the altered radar data. This process may be iterated and metrics may be determined for multiple updated sensor alignments determined this way. These metrics may be used to determine suitability of the underlying radar data for validation and/or an accuracy of the sensor alignment. A validated radar sensor alignment may be used as part of controlling an autonomous vehicle.
Techniques for determining a phase delay of a received light signal are discussed herein. A vehicle may emit a continuous light signal from a ToF device. The ToF device may include a sensor receiver configured to receive the signal after the signal reflects off surface(s) in the environment. To determine the distance between the ToF device and the surface(s), the vehicle may sample the received signal at two or more offsets and identify regions of overlap between the offset signals and the received signal. The vehicle can determine area measurements of the overlapping regions and use such area measurements to determine normalized amplitudes for each of the overlapping regions and, based on comparing such values, determine linear equation(s) corresponding to the phase delay. The vehicle can evaluate such linear equations to determine the phase delay of the received light signal. The vehicle can be controlled based on the phase delay.
Techniques for improving operational decisions of an autonomous vehicle are discussed herein. In some cases, a system may generate a lane change score based on one or more lateral metrics and one or more longitudinal metrics associated with the lane change trajectory. The system may then determine if the lane change operation is safe based on the lane change score.
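The scoring step above can be sketched as a weighted combination of the two metric families, followed by a threshold test. The individual metric names, the weighted-average combination, and the threshold are all assumptions; the source states only that lateral and longitudinal metrics feed the score.

```python
def lane_change_score(lateral_metrics, longitudinal_metrics, weights=(0.5, 0.5)):
    """Combine lateral and longitudinal metrics into one score (sketch).

    Each metrics dict maps a hypothetical metric name to a value in
    [0, 1], where higher means more favorable for the lane change.
    """
    lat = sum(lateral_metrics.values()) / len(lateral_metrics)
    lon = sum(longitudinal_metrics.values()) / len(longitudinal_metrics)
    return weights[0] * lat + weights[1] * lon

def is_safe(score, threshold=0.7):
    # The lane change is deemed safe when the score clears the threshold.
    return score >= threshold


score = lane_change_score(
    {"clearance": 0.9, "lateral_accel": 0.8},
    {"gap_time": 0.9, "closing_speed": 0.6},
)
safe = is_safe(score)
```

Keeping the two metric families separate until the final combination makes it easy to re-weight lateral versus longitudinal risk without retuning the individual metrics.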
A vehicle trajectory for transitioning out of a deceleration trajectory may be determined based on predicted vehicle state(s) and/or environmental condition(s). A determination about whether to transition into the determined trajectory may be based on whether the condition that triggered the deceleration trajectory still exists, whether the determined trajectory meets safety criteria, and/or whether an alternative deceleration trajectory is warranted. The determination may be used to determine which trajectory to control the vehicle and/or to inform the vehicle's driving behavior.
B60T 8/58 - Arrangements for adjusting wheel braking force to vehicle or ground conditions, e.g. by limiting or varying braking force according to a speed condition, e.g. acceleration or deceleration, according to a speed condition and another condition, or according to a plurality of speed conditions
B60T 8/171 - Detecting parameters used in the regulation; Measuring values used in the regulation
B60T 8/172 - Determining control parameters used in the regulation, e.g. by calculations involving measured or detected parameters
Techniques for generating and/or utilizing parking zones for pickup and/or drop off scenarios are described herein. A vehicle (such as an autonomous vehicle) may receive a request to transport a passenger from a starting location to a destination location. The request may include or otherwise be associated with a parking zone proximate the starting location within which the vehicle is to pick up the passenger. The vehicle may navigate to the parking zone and identify the one or more parking spaces within the parking zone. The vehicle may determine a subset of the parking spaces based on parameters associated with the parking zone (e.g., filter out non-conforming parking spaces). The vehicle can generate candidate actions (e.g., trajectories) to the subset of parking spaces, determine a control trajectory that leads to one of the parking spaces, and control the vehicle based on the control trajectory.
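The filter-then-select flow above can be sketched as two steps: drop non-conforming spaces, then choose the best remaining candidate. "Conforming" is reduced here to occupancy and a maximum walking distance for the rider; both parameters and the field names are hypothetical stand-ins for the zone parameters the source mentions.

```python
def choose_parking_space(spaces, max_walk_m=30.0):
    """Filter non-conforming parking spaces, then pick a candidate (sketch).

    Each space is a dict with hypothetical fields: an id, an occupancy
    flag, and the walking distance to the passenger in metres.
    """
    # Filter out non-conforming spaces, as described.
    conforming = [
        s for s in spaces
        if not s["occupied"] and s["walk_m"] <= max_walk_m
    ]
    if not conforming:
        return None
    # Select the candidate minimising the passenger's walk.
    return min(conforming, key=lambda s: s["walk_m"])["id"]


spaces = [
    {"id": "A", "occupied": True,  "walk_m": 5.0},
    {"id": "B", "occupied": False, "walk_m": 12.0},
    {"id": "C", "occupied": False, "walk_m": 40.0},  # beyond the zone limit
]
best = choose_parking_space(spaces)
```

The selected space would then seed the candidate trajectories from which the vehicle's control trajectory is chosen.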
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/06 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, related to ambient conditions, related to road conditions
B60W 30/06 - Automatic manoeuvring for parking
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G08G 1/14 - Traffic control systems for road vehicles indicating individual free spaces in parking areas
B60W 50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
Techniques for generating and/or utilizing parking zones for pickup and/or drop off scenarios are described herein. A vehicle (such as an autonomous vehicle) may receive a request to transport a passenger from a starting location to a destination location. The request may include or otherwise be associated with a parking zone proximate the starting location within which the vehicle is to pick up the passenger. The vehicle may navigate to the parking zone and identify the one or more parking spaces within the parking zone. The vehicle may determine a subset of the parking spaces based on parameters associated with the parking zone (e.g., filter out non-conforming parking spaces). The vehicle can generate candidate actions (e.g., trajectories) to the subset of parking spaces, determine a control trajectory that leads to one of the parking spaces, and control the vehicle based on the control trajectory.
Techniques for facilitating ride requests using parking zone(s) are described herein. A ridesharing application may receive a request to transport a potential passenger from a starting location to a destination location. Based on the request, the ridesharing application may determine and/or display a recommended pickup and/or drop off location to a user interface of a user device. The potential passenger may accept or reject the recommended location(s). Based on rejecting the recommended location(s), the ridesharing application may retrieve one or more parking zones and cause such parking zones to be displayed via the user interface. The passenger may select the parking zone within which the passenger would like to be picked up or dropped off. Based on the passenger selecting a parking zone and/or confirming the ride request, the rideshare application may send the ride request to a vehicle in a fleet of vehicles.
A vehicle trajectory for transitioning out of a deceleration trajectory may be determined based on predicted vehicle state(s) and/or environmental condition(s). A determination about whether to transition into the determined trajectory may be based on whether the condition that triggered the deceleration trajectory still exists, whether the determined trajectory meets safety criteria, and/or whether an alternative deceleration trajectory is warranted. The determination may be used to determine which trajectory to control the vehicle and/or to inform the vehicle's driving behavior.
B60W 30/08 - Predicting or avoiding probable or impending collision
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, related to ambient conditions
B60W 40/12 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, related to parameters of the vehicle itself
27.
Fan optimization based on ambient and radiator outlet temperatures
Techniques for cooling a portion of a vehicle are discussed herein. A thermal management component may receive an ambient temperature proximate the vehicle. The vehicle may also receive a first coolant outlet temperature from a first radiator and a second coolant outlet temperature from a second radiator. In some examples, the first radiator may be located at a first end of the vehicle while the second radiator can be located at a second (and opposite) end of the vehicle. The thermal management component may determine a first cooling parameter for a first radiator fan proximate the first radiator and a second cooling parameter for a second radiator fan proximate the second radiator. Upon determining the first and/or second cooling parameter(s), the thermal management component may operate the first fan according to the first cooling parameter and the second fan according to the second cooling parameter.
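Determining a per-radiator cooling parameter from the ambient and coolant outlet temperatures can be sketched as a small proportional controller. The control law, gains, and setpoints below are purely illustrative assumptions; the source does not specify how the thermal management component maps temperatures to fan commands.

```python
def fan_duty(ambient_c, coolant_out_c, target_c=60.0):
    """Map radiator-outlet temperature to a fan duty cycle in [0, 1] (sketch).

    Proportional control on the coolant's excess over a target outlet
    temperature, scaled down when the ambient air is cool enough to do
    more of the work passively. All constants are hypothetical.
    """
    excess = max(0.0, coolant_out_c - target_c)
    # Cooler ambient air removes heat with less airflow.
    ambient_factor = min(1.0, max(0.2, ambient_c / 40.0))
    return min(1.0, 0.05 * excess * ambient_factor)


# First radiator (hotter coolant outlet) versus second radiator.
front = fan_duty(ambient_c=30.0, coolant_out_c=75.0)
rear = fan_duty(ambient_c=30.0, coolant_out_c=62.0)
```

Computing an independent duty cycle per radiator matches the abstract's point that the two fans at opposite ends of the vehicle may operate according to different cooling parameters.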
Auto-tuning covariances associated with a set of noise models for a variety of sensor modalities and/or perception components, such that the covariances are leveled respective to one another, may include whitening the covariances and/or error models and determining scalars to apply to the covariances. Determining these scalars may comprise using the residuals that result from generating the set of noise models (e.g., such as may be determined as part of least squares estimation) along with the hat matrix of the process model to determine the scalars. The covariances may iteratively be updated until the scalar adjustments converge or until another end condition is met.
G01S 5/02 - Position-fixing by co-ordinating two or more direction or position-line determinations; Position-fixing by co-ordinating two or more distance determinations, using radio waves
G01S 5/20 - Position of source determined by a plurality of spaced direction-finders
29.
Machine-learned model architecture for predicting future object state
Predicting a future state, such as a future position and/or orientation (i.e., pose), of an object may comprise classifying, by a first machine-learned model, a lane the object may occupy and classifying, by a second machine-learned model, a target pose the object may occupy. A third machine-learned model may determine an offset from the target pose that may be used to determine a predicted (future) pose of the object by applying the offset to the target pose.
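The three-stage architecture above can be sketched as a cascade of callables: a lane classifier, a target-pose classifier conditioned on the lane, and an offset regressor whose output is applied to the target pose. The toy stand-in models and all numeric values below are hypothetical; only the cascade structure comes from the source.

```python
def predict_future_pose(lane_model, pose_model, offset_model, features):
    """Cascade sketch of the three-model architecture described.

    Each argument is a stand-in for a machine-learned model: a lane
    classifier, a target-pose classifier, and an offset regressor.
    """
    # Stage 1: classify the lane the object may occupy.
    lane = lane_model(features)
    # Stage 2: classify a target pose, conditioned on that lane.
    target_x, target_y = pose_model(features, lane)
    # Stage 3: regress an offset and apply it to the target pose.
    dx, dy = offset_model(features, (target_x, target_y))
    return lane, (target_x + dx, target_y + dy)


# Toy stand-ins for the three machine-learned models (hypothetical).
lane_model = lambda f: "left" if f["lateral_v"] < 0 else "right"
pose_model = lambda f, lane: (10.0, -3.5) if lane == "left" else (10.0, 3.5)
offset_model = lambda f, pose: (0.5, -0.2)

lane, pose = predict_future_pose(
    lane_model, pose_model, offset_model, {"lateral_v": -0.8}
)
```

Splitting the prediction into classification stages plus a small regression offset lets each model solve an easier sub-problem than regressing the full future pose directly.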
Detecting, tracking, and/or predicting a future state of an object may include encoding spatiotemporal data associated with sensors that generate sensor data used to detect, track, and predict future state data of the object. A first machine-learned model may generate an embedding for a subset of sensor data associated with a ray emanating from a sensor and the embedding may be encoded with spatial characteristics of the sensor that received the subset and a time at which the subset was received. This and other spatiotemporally-encoded embeddings may be used by a second machine-learned model to determine an aggregated embedding that may be used for object detection, tracking, and/or prediction.
G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
Techniques are provided comprising receiving first sensor data associated with a first vehicle pose and second sensor data associated with a second vehicle pose. A user input is received via a user interface indicating a position associated with a feature represented in the first sensor data and the second sensor data. Based at least in part on the position associated with the feature, the first vehicle pose, and the second vehicle pose, an alignment between the first sensor data and the second sensor data is determined. Map data is determined based at least in part on the first sensor data, the second sensor data, and the alignment.
Techniques are described for generating simulations to evaluate a performance of an object controller. The controller may be configured to control one or more functionalities of a simulated object (e.g., a “smart object”) in a simulation, such as make the smart object operate like a human in a given scenario. A simulation system may generate one or more simulations based on log data captured by a vehicle operating in a physical environment. The simulation system may cause the object controller to control at least one smart object in the simulation, the smart object being representative of an actual object detected in the physical environment. The simulation system may compare a smart object trajectory to an actual object trajectory of the object to determine a performance metric associated with the object controller. Based on a determination that the performance metric satisfies a threshold, the simulation system may validate the object controller.
B60W 50/02 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, for ensuring safety in case of a drive control system failure, e.g. by diagnosing or mitigating a malfunction
B60W 50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
G06F 30/20 - Design optimisation, verification or simulation of the object designed
G06F 119/02 - Reliability analysis or reliability optimisation; Failure analysis, e.g. worst-case scenario performance, failure mode and effects analysis [FMEA]
34.
Sensor anomaly detection using machine-learned attention over spatiotemporally encoded sensor data
Detecting a sensor anomaly may include encoding spatiotemporal data associated with one or more sensors. A first machine-learned model may generate an embedding for a subset of sensor data associated with a ray emanating from a sensor, and the embedding may be encoded with spatial characteristics of the sensor that received the subset and a time at which the subset was received. This and other spatiotemporally-encoded embeddings may be used by a second machine-learned model to determine an aggregated embedding that may be used to detect a sensor anomaly.
G06V 10/40 - Extraction de caractéristiques d’images ou de vidéos
B60W 50/02 - Détails des systèmes d'aide à la conduite des véhicules routiers qui ne sont pas liés à la commande d'un sous-ensemble particulier pour préserver la sécurité en cas de défaillance du système d'aide à la conduite, p. ex. en diagnostiquant ou en palliant à un dysfonctionnement
G05D 1/43 - Commande de la position ou du cap par référence à un système à deux dimensions
G06V 10/70 - Dispositions pour la reconnaissance ou la compréhension d’images ou de vidéos utilisant la reconnaissance de formes ou l’apprentissage automatique
G06V 20/56 - Contexte ou environnement de l’image à l’extérieur d’un véhicule à partir de capteurs embarqués
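The aggregation-and-detection stage described above can be illustrated with a toy example. The publication describes learned models for both stages; this sketch substitutes mean pooling and a distance-to-reference score, both of which are explicit simplifying assumptions.

```python
# Illustrative sketch only: aggregate per-ray embeddings and flag an
# anomaly when the aggregate drifts from a nominal reference embedding.

def aggregate_embeddings(embeddings):
    """Mean-pool a list of equal-length embedding vectors."""
    n = len(embeddings)
    return [sum(vals) / n for vals in zip(*embeddings)]

def anomaly_score(aggregate, reference):
    """Squared L2 distance between the aggregate and a nominal reference."""
    return sum((a - r) ** 2 for a, r in zip(aggregate, reference))

def is_anomalous(embeddings, reference, threshold=1.0):
    return anomaly_score(aggregate_embeddings(embeddings), reference) > threshold
```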
Systems and techniques for determining deceleration controls to use in controlling a vehicle are described. A deceleration distribution system may receive a deceleration input from a vehicle control system or component and determine a rate of change of torque associated with adjusting a current vehicle torque to a torque associated with the deceleration input. The deceleration distribution system may then determine one or more deceleration components capable of achieving the determined rate of change and generate deceleration controls for such components to slow and/or stop the vehicle based on the received deceleration input.
B60L 7/00 - Systèmes de freins électrodynamiques pour véhicules, en général
B60L 7/26 - Systèmes de freins électrodynamiques pour véhicules, en général avec freinage additionnel mécanique ou électromagnétique commandant l'effet de freinage
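The rate-of-change computation and component selection described above can be sketched as follows. The component model (a maximum torque slew rate per component) and all names are hypothetical illustrations, not the publication's implementation.

```python
# Minimal sketch, assuming each deceleration component is characterized
# by a maximum torque slew rate (Nm/s) it can deliver.

def required_torque_rate(current_torque, target_torque, time_window):
    """Rate of change (Nm/s) needed to reach the target torque in the window."""
    return (target_torque - current_torque) / time_window

def select_components(components, rate_needed):
    """Pick components whose slew-rate capability covers the needed rate."""
    return [name for name, max_rate in components.items()
            if max_rate >= abs(rate_needed)]
```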
Techniques for simulating environments are discussed herein. For example, a simulated autonomous vehicle may have simulated sensors to generate simulated sensor data based on log data. Using the log data, a simulated environment for testing a simulated vehicle may be generated. The simulated environment may represent the environment in which the real-world vehicle was operating. The techniques may further include generating simulated sensor data by the simulated sensors and sending the simulated sensor data to an autonomous vehicle controller where a trajectory of the simulated autonomous vehicle may be determined based on the simulated sensor data. In this way, an autonomous vehicle controller need not rely on the position of the vehicle in the original log data, and instead accurate simulated sensor data can be generated to determine accurate occlusions, amongst other things.
B60W 60/00 - Systèmes d’aide à la conduite spécialement adaptés aux véhicules routiers autonomes
B60W 50/06 - Détails des systèmes d'aide à la conduite des véhicules routiers qui ne sont pas liés à la commande d'un sous-ensemble particulier pour améliorer la réponse dynamique du système d'aide à la conduite, p. ex. pour améliorer la vitesse de régulation, ou éviter le dépassement de la consigne ou l'instabilité
G05B 17/02 - Systèmes impliquant l'usage de modèles ou de simulateurs desdits systèmes électriques
A vehicle trajectory may be evaluated based on a relative probability distribution associated with a vehicle and an object in the vehicle environment and/or based on velocities associated with the vehicle and/or the object. The relative probability distribution may be determined based on a vehicle trajectory and/or a predicted object trajectory. A collision probability and/or a collision severity may be determined based on the relative probability distribution and/or the velocities. The collision probability and/or severity may be used to determine whether the vehicle trajectory is safe and/or to inform the driving behavior of the vehicle.
G06N 7/01 - Modèles graphiques probabilistes, p. ex. réseaux probabilistes
G08G 1/056 - Détection du mouvement du trafic pour le comptage ou la commande avec des dispositions pour distinguer la direction de circulation
G08G 1/0967 - Systèmes impliquant la transmission d'informations pour les grands axes de circulation, p. ex. conditions météorologiques, limites de vitesse
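A greatly simplified, assumption-laden sketch of the evaluation above: a discrete relative position distribution yields a collision probability, closing speed serves as a severity proxy, and their product gates the trajectory. The constants and the product-of-factors risk rule are illustrative choices, not the publication's method.

```python
# Toy sketch: probability mass at small predicted separations counts as
# collision probability; closing speed stands in for collision severity.

def collision_probability(separation_pmf, contact_distance=0.5):
    """Sum probability mass where predicted separation implies contact."""
    return sum(p for sep, p in separation_pmf if sep <= contact_distance)

def collision_severity(vehicle_speed, object_speed):
    """Use closing speed (m/s) as a simple severity proxy."""
    return abs(vehicle_speed - object_speed)

def trajectory_is_safe(separation_pmf, vehicle_speed, object_speed, risk_limit=0.5):
    risk = collision_probability(separation_pmf) * collision_severity(
        vehicle_speed, object_speed)
    return risk < risk_limit
```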
A computer-implemented method is provided. The method comprises receiving, from an autonomous vehicle, sensor data, wherein the sensor data is indicative of an environment in which the vehicle is currently located or was previously located. The method further comprises receiving or determining additional data associated with at least one of: the sensor data, the vehicle, or the environment, wherein the additional data is different from the sensor data. The method further comprises displaying, on a display, an output comprising a representation of the sensor data. The method further comprises determining, based at least in part on the additional data, a reliability metric indicative of how reliably the sensor data represents the environment in which the vehicle is located at a current time. The method further comprises causing the output on the display to be based at least in part on the reliability metric.
G06T 7/50 - Récupération de la profondeur ou de la forme
G06T 7/70 - Détermination de la position ou de l'orientation des objets ou des caméras
G06T 11/20 - Traçage à partir d'éléments de base, p. ex. de lignes ou de cercles
G06T 17/00 - Modélisation tridimensionnelle [3D] pour infographie
G06V 20/58 - Reconnaissance d’objets en mouvement ou d’obstacles, p. ex. véhicules ou piétons; Reconnaissance des objets de la circulation, p. ex. signalisation routière, feux de signalisation ou routes
39.
Systems and methods for identifying road segments and determining vehicle performance metrics
Techniques are disclosed for determining a safety metric or other performance metric associated with a vehicle. The techniques may comprise receiving a driving simulation scenario associated with simulating a vehicle. Based at least in part on map data and the driving simulation scenario, a road segment of the map data may be determined. A simulation of the vehicle may be initiated in accordance with the driving simulation scenario, the simulation including the road segment. A safety metric associated with autonomous operation of the simulated vehicle in a region comprising the road segment may be determined based at least in part on the simulation.
A system of vibration-reducing covers for electrical components is described. The system includes composite covers for electronic components of drive units to reduce vibrations and associated noise during operation of a vehicle. The cover is formed of a composite to dampen oscillations occurring as a result of vibrations from the drivetrain of the vehicle. The cover also includes a thermal pathway for transferring energy as heat from an interior of a compartment containing the electronic components to the environment outside the cover. Additionally, the cover can include an electromagnetic interference and/or radio frequency interference shield to prevent interference with other electronic components on and/or near the vehicle during operation.
This application relates to techniques for improving sensor cleaning techniques. The techniques described herein utilize heat generated by multiple heat generating components of a vehicle, which is normally wasted. The techniques collect the waste heat to heat fluid stored in a reservoir. The heated fluid is further dispensed to heat and/or clean sensors. In some examples, a vehicle implementing the techniques includes a computing system to control an operation of the vehicle; a sensor to sense an environment of the vehicle; and a sensor cleaning system to clean the sensor with a fluid. In implementations, the sensor cleaning system includes a heat exchanger to apply heat generated by a heat generating component to heat fluid in a reservoir to generate heated fluid, and a dispenser fluidly coupled to the reservoir and configured to dispense the heated fluid toward the sensor to heat or clean the sensor.
B60S 1/54 - Nettoyage des pare-brise, fenêtres ou dispositifs optiques utilisant un gaz, p. ex. air chaud
B60S 1/56 - Nettoyage des pare-brise, fenêtres ou dispositifs optiques spécialement adaptés pour nettoyer d'autres parties ou dispositifs que les fenêtres avant ou les pare-brise
42.
Tree search integrated dynamic vehicle controller profile for vehicle path planning
A tree search may comprise a cost function that may be used to determine, from multiple controller profiles, a controller profile to implement a transition between vehicle actions. A controller profile may limit the controls output by a controller sufficient to track from a first vehicle action to a second vehicle action. The controls output by the controller may result in a trajectory achieved by the vehicle, and the controller profile may limit those controls such that the trajectory is bounded by a maximum jerk and/or maximum acceleration.
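The jerk and acceleration limiting described above can be sketched as a simple clamp over a sequence of acceleration commands. The function and parameter names are assumptions for illustration; the publication's controller profile may be realized quite differently.

```python
# Hedged sketch of a controller profile: each commanded acceleration is
# clamped to a maximum magnitude, and its change from the previous
# command is clamped to max_jerk * dt.

def apply_profile(accel_commands, max_accel, max_jerk, dt):
    """Clamp a sequence of acceleration commands to profile limits."""
    limited, prev = [], 0.0
    for a in accel_commands:
        # Limit acceleration magnitude.
        a = max(-max_accel, min(max_accel, a))
        # Limit jerk (change in acceleration per time step).
        max_step = max_jerk * dt
        a = max(prev - max_step, min(prev + max_step, a))
        limited.append(a)
        prev = a
    return limited
```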
A transformer-based machine-learned model may use relative positions between token embeddings to more accurately predict outputs. Example outputs may include an object detection, sensor data segmentation, object state prediction, and/or the like. Efficiently computing relative position-aware attention scores of the transformer-based machine-learned model may comprise utilizing fast memory available in specialized hardware according to the memory loading and flushing techniques described herein.
G05B 13/02 - Systèmes de commande adaptatifs, c.-à-d. systèmes se réglant eux-mêmes automatiquement pour obtenir un rendement optimal suivant un critère prédéterminé électriques
B60W 50/00 - Détails des systèmes d'aide à la conduite des véhicules routiers qui ne sont pas liés à la commande d'un sous-ensemble particulier
B60W 60/00 - Systèmes d’aide à la conduite spécialement adaptés aux véhicules routiers autonomes
44.
Sensor cleaning assembly with rotating sensor window
A sensor assembly includes one or more sensors configured to collect data from an environment. A sensor window is disposed in a field of view of the sensor, such that the sensor senses the environment through the sensor window to generate sensor data. The sensor window is rotated relative to the sensor to cause obstructions on the sensor window, like water or debris, to disperse.
Techniques for object detection based on sensor data are discussed herein. In some cases, the techniques described herein include determining that a first sensor data cluster and a second sensor data cluster are associated with the same object if the two clusters satisfy a set of conditions. For example, a condition may be defined based on at least one of: (i) whether the second cluster is within an azimuth range associated with the first cluster, (ii) whether a height difference associated with the two clusters falls below a threshold, (iii) whether an aspect ratio associated with a combination of the two clusters is acceptable (e.g., satisfies at least one of one or more predefined aspect ratio conditions, falls within one of one or more predefined aspect ratio ranges, and/or the like), and/or (iv) whether cluster tracks associated with the two clusters jointly move across time.
B60W 30/09 - Entreprenant une action automatiquement pour éviter la collision, p. ex. en freinant ou tournant
B60W 50/14 - Moyens d'information du conducteur, pour l'avertir ou provoquer son intervention
B60W 60/00 - Systèmes d’aide à la conduite spécialement adaptés aux véhicules routiers autonomes
G01S 13/89 - Radar ou systèmes analogues, spécialement adaptés pour des applications spécifiques pour la cartographie ou la représentation
G01S 17/89 - Systèmes lidar, spécialement adaptés pour des applications spécifiques pour la cartographie ou l'imagerie
G06V 20/58 - Reconnaissance d’objets en mouvement ou d’obstacles, p. ex. véhicules ou piétons; Reconnaissance des objets de la circulation, p. ex. signalisation routière, feux de signalisation ou routes
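The conjunctive merge conditions in the cluster-association abstract above can be sketched directly. The cluster fields, thresholds, and the specific geometric tests are assumptions for illustration.

```python
# Illustrative-only sketch: merge two sensor-data clusters when an
# azimuth condition, a height-difference condition, and an aspect-ratio
# condition on the combined extent all hold.

def should_merge(c1, c2, height_thresh=0.5, max_aspect=5.0):
    # (i) second cluster's azimuth lies within the first cluster's range
    lo, hi = c1["azimuth_range"]
    in_azimuth = lo <= c2["azimuth"] <= hi
    # (ii) height difference falls below a threshold
    close_height = abs(c1["height"] - c2["height"]) < height_thresh
    # (iii) aspect ratio of the combined extent is acceptable
    width = max(c1["x_max"], c2["x_max"]) - min(c1["x_min"], c2["x_min"])
    depth = max(c1["y_max"], c2["y_max"]) - min(c1["y_min"], c2["y_min"])
    ok_aspect = width / max(depth, 1e-6) <= max_aspect
    return in_azimuth and close_height and ok_aspect
```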
46.
Polarization apparatus and method for stray light reduction in a lidar
The lidar systems described herein may reduce stray light interference. A system may comprise a laser system which, upon receiving a signal at a first time, generates an illumination light. A polarizing beam splitter (PBS) may receive the illumination light from the laser system along a first light path and the illumination light may exit from the PBS along a second light path. A quarter wave plate (QWP) may receive the illumination light exiting from the PBS along the second light path and may pass the illumination light to an object from which the illumination light is reflected. A detector arranged proximate the PBS may receive the reflected light from the PBS along a third light path and may detect the reflected light at a second time, and a processor may determine a distance to the object based at least in part on the illumination light and the reflected light.
G02B 26/08 - Dispositifs ou dispositions optiques pour la commande de la lumière utilisant des éléments optiques mobiles ou déformables pour commander la direction de la lumière
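The range computation implied by the abstract (distance from the first emit time and second detection time) is the standard time-of-flight relation. The constant and names below are ordinary physics, not taken from the publication.

```python
# Sketch: one-way distance from round-trip lidar time of flight.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_to_object(emit_time_s, detect_time_s):
    """Round-trip time of flight -> one-way distance in meters."""
    time_of_flight = detect_time_s - emit_time_s
    return SPEED_OF_LIGHT * time_of_flight / 2.0
```

A 1 µs round trip corresponds to roughly 150 m of range.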
Techniques are described herein for inferring a state of a traffic direction feature. The technique comprises using two detection models, wherein one detection model is trained to classify a state of the traffic direction feature based at least in part on imaging data, and wherein the second detection model is trained to classify whether traffic is being allowed to travel along a route, based at least in part on sensor data indicating a movement status of an object. By comparing the outputs of the two detection models, the state of the traffic direction feature may be inferred.
G06V 10/764 - Dispositions pour la reconnaissance ou la compréhension d’images ou de vidéos utilisant la reconnaissance de formes ou l’apprentissage automatique utilisant la classification, p. ex. des objets vidéo
G06V 20/58 - Reconnaissance d’objets en mouvement ou d’obstacles, p. ex. véhicules ou piétonsReconnaissance des objets de la circulation, p. ex. signalisation routière, feux de signalisation ou routes
B60W 60/00 - Systèmes d’aide à la conduite spécialement adaptés aux véhicules routiers autonomes
Techniques for segmenting sensor data are discussed herein. Data can be represented in individual levels in a multi-resolution voxel space. A first level can correspond to a first region of an environment and a second level can correspond to a second region of an environment that is a subset of the first region. In some examples, the levels can comprise a same number of voxels, such that the first level covers a large, low-resolution region, while the second level covers a smaller, higher-resolution region, though more levels are contemplated. Operations may include analyzing sensor data represented in the voxel space from a perspective, such as a top-down perspective. From this perspective, techniques may generate masks that represent objects in the voxel space. Additionally, techniques may generate segmentation data to verify and/or generate the masks, or otherwise cluster the sensor data.
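The multi-resolution property described above (each level holds the same number of voxels while covering a smaller region, so resolution increases) can be made concrete with a small calculation. The halving-per-level layout is a stated assumption; the publication's hierarchy may differ.

```python
# Sketch under stated assumptions: each level covers half the extent of
# the level above with the same voxel count, doubling resolution per level.

def voxel_size(base_extent_m, voxels_per_axis, level):
    """Edge length (m) of one voxel at a given level of the hierarchy."""
    extent = base_extent_m / (2 ** level)  # covered region shrinks per level
    return extent / voxels_per_axis       # voxel count stays constant
```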
Techniques for generating a tree structure based on multiple machine-learned trajectories are described herein. A planning component (“ML system”) within a vehicle may receive and encode various types of sensor and/or vehicle data. The ML system can provide the encoded data as input to multiple machine-learning models (“ML models”), each of which may be trained to output a unique candidate trajectory for the vehicle to follow. In some examples, each ML model may be trained to output a unique type of learned trajectory that causes the vehicle to perform a certain type of action. Using the learned candidate trajectories, the ML system may generate a tree structure that includes some or all of the candidate trajectories. The vehicle may determine a control trajectory based on the generation and traversal of the tree structure using a tree search algorithm, and may follow the control trajectory within the environment.
B60W 60/00 - Systèmes d’aide à la conduite spécialement adaptés aux véhicules routiers autonomes
B60W 40/06 - Calcul ou estimation des paramètres de fonctionnement pour les systèmes d'aide à la conduite de véhicules routiers qui ne sont pas liés à la commande d'un sous-ensemble particulier liés aux conditions ambiantes liés à l'état de la route
Techniques for mitigating risks of a vehicle when turn signal(s) become inoperable are described herein. A vehicle may have turn signal indicators configured to indicate a navigational intent of the vehicle to proximate object(s). In some examples, turn signal indicators may become inoperable due to a failure of the turn signal indicator and/or a failure of the turn signal controller. Based on determining that the turn signal indicator(s) is inoperable, the vehicle may determine a position of the turn signal indicator(s) on the vehicle (e.g., front right, back right, front left, back left, right side, etc.). The vehicle may use the position data to determine an action for the vehicle to perform. As such, the vehicle may be controlled based on the action.
B60W 50/02 - Détails des systèmes d'aide à la conduite des véhicules routiers qui ne sont pas liés à la commande d'un sous-ensemble particulier pour préserver la sécurité en cas de défaillance du système d'aide à la conduite, p. ex. en diagnostiquant ou en palliant à un dysfonctionnement
B60W 60/00 - Systèmes d’aide à la conduite spécialement adaptés aux véhicules routiers autonomes
51.
Synthetic generation of simulation scenarios and probability-based simulation evaluation
Techniques are discussed herein for generating and evaluating driving simulations based on synthetic scenarios. Simulated objects may be controlled based on parameters determining the attributes and behaviors of the objects, and scenarios may be synthetically modified by changing the parameters for a simulated object. For a driving scenario with a synthetic simulated object, a simulation system may analyze driving log data to determine a probability or prevalence associated with the driving scenario. In various examples, the simulation system may determine marginal density estimates for individual attributes of the simulated object, as well as a cumulative distribution function modeling the dependence between the attributes. The joint probability distribution determined for the synthetic simulated object can be used for evaluating the efficacy of the simulation and the performance of the simulated vehicle controllers.
G06F 7/48 - Méthodes ou dispositions pour effectuer des calculs en utilisant exclusivement une représentation numérique codée, p. ex. en utilisant une représentation binaire, ternaire, décimale utilisant des dispositifs n'établissant pas de contact, p. ex. tube, dispositif à l'état solide; Méthodes ou dispositions pour effectuer des calculs en utilisant exclusivement une représentation numérique codée, p. ex. en utilisant une représentation binaire, ternaire, décimale utilisant des dispositifs non spécifiés
G06F 30/20 - Optimisation, vérification ou simulation de l’objet conçu
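The probability estimation above can be illustrated in a greatly simplified form: estimate a marginal probability for each simulated-object attribute from log data, then combine them. The publication models dependence between attributes (e.g., via a cumulative distribution function); this toy version assumes independence, which is an explicit simplification, and all names are hypothetical.

```python
# Toy sketch: per-attribute marginal probability from logged values,
# combined under an (assumed) independence model.

def marginal_probability(log_values, value, tolerance):
    """Fraction of logged values within a tolerance of the queried value."""
    hits = sum(1 for v in log_values if abs(v - value) <= tolerance)
    return hits / len(log_values)

def joint_probability(marginals):
    """Product of marginals (independence assumption, unlike the described CDF model)."""
    p = 1.0
    for m in marginals:
        p *= m
    return p
```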
Techniques comprise receiving sensor data comprising first sensor data collected by a first lidar sensor and second sensor data collected by a second lidar sensor of a vehicle. An object is detected based at least in part on the first sensor data and the second sensor data. A difference between the first sensor data associated with the object and the second sensor data associated with the object is determined, the difference corresponding with blooming of the first sensor data or the second sensor data. The vehicle is controlled based at least in part on the difference between the first sensor data and the second sensor data.
Techniques for determining a presence of an object, especially an object such as animal or debris, in a path of a vehicle, are discussed herein. For example, sensors of various modalities, which may include multispectral sensors, may capture data representing an environment the vehicle is traversing. In examples, one or more trained machine learned (ML) models, operating on a vehicle computing system, may detect and/or classify objects in the environment, based on input data of one or more modalities or spectral bands. The ML models may be pre-trained using training data including real sensor data, synthetic data, and/or augmented data, along with auto-generated annotations. In some examples, hyperspectral data may be used to identify materials associated with detected objects. A confidence score associated with the detection of the object may also be computed. The vehicle may be controlled based on detection of the object and its classification.
G06V 20/58 - Reconnaissance d’objets en mouvement ou d’obstacles, p. ex. véhicules ou piétonsReconnaissance des objets de la circulation, p. ex. signalisation routière, feux de signalisation ou routes
G06V 10/764 - Dispositions pour la reconnaissance ou la compréhension d’images ou de vidéos utilisant la reconnaissance de formes ou l’apprentissage automatique utilisant la classification, p. ex. des objets vidéo
G06V 10/774 - Génération d'ensembles de motifs de formation; Traitement des caractéristiques d’images ou de vidéos dans les espaces de caractéristiques; Dispositions pour la reconnaissance ou la compréhension d’images ou de vidéos utilisant la reconnaissance de formes ou l’apprentissage automatique utilisant l’intégration et la réduction de données, p. ex. analyse en composantes principales [PCA] ou analyse en composantes indépendantes [ICA] ou cartes auto-organisatrices [SOM]; Séparation aveugle de source; méthodes de Bootstrap, p. ex. "bagging” ou “boosting”
G06V 20/70 - Étiquetage du contenu de scène, p. ex. en tirant des représentations syntaxiques ou sémantiques
Techniques for determining adversarial cost(s) representing an object in an environment are described herein. For example, a vehicle may generate candidate action(s) (or trajectories) and input such candidate actions into a tree structure. The vehicle can determine a control trajectory for the vehicle to follow based on determining which of the candidate actions to follow at each layer in the tree structure. The vehicle can determine which candidate action to follow based on determining adversarial cost(s). To generate the adversarial cost(s), the vehicle can detect an object and determine associated object trajectories. Based on a likelihood that the object will deviate from a predicted object trajectory, the vehicle can determine one or more sub-factors that may be combined to generate the adversarial cost(s). The vehicle can be controlled based on the adversarial cost(s).
Techniques are described herein for determining whether a traffic light signal is flashing. The technique comprises collating data representing a time-ordered sequence of classifications indicative of a likelihood that a traffic light signal is active at respective times, thereby creating collated data. Output data is generated, based on the collated data, using a convolutional neural network (CNN) arranged to provide an indication of a likelihood that a traffic light signal is flashing based on an input time-ordered sequence of classifications. It is then determined whether the traffic light signal is flashing based on the output data.
G06V 10/764 - Dispositions pour la reconnaissance ou la compréhension d’images ou de vidéos utilisant la reconnaissance de formes ou l’apprentissage automatique utilisant la classification, p. ex. des objets vidéo
B60W 60/00 - Systèmes d’aide à la conduite spécialement adaptés aux véhicules routiers autonomes
G06V 10/62 - Extraction de caractéristiques d’images ou de vidéos relative à une dimension temporelle, p. ex. extraction de caractéristiques axées sur le temps; Suivi de modèle
G06V 10/82 - Dispositions pour la reconnaissance ou la compréhension d’images ou de vidéos utilisant la reconnaissance de formes ou l’apprentissage automatique utilisant les réseaux neuronaux
G06V 20/58 - Reconnaissance d’objets en mouvement ou d’obstacles, p. ex. véhicules ou piétons; Reconnaissance des objets de la circulation, p. ex. signalisation routière, feux de signalisation ou routes
57.
Detecting vehicle impacts based on interior microphones
Techniques for detecting impacts to a vehicle body are described herein. A vehicle may receive first audio data (associated with a sound) from a microphone located inside the vehicle. Further, the vehicle may receive second audio data (associated with the sound) from a microphone that is external to the vehicle. The vehicle can compare, for example, the volumes of the first and second audio data to determine a difference. The vehicle can compare the difference to a threshold to determine whether the sound corresponds to an impact to the vehicle body. For example, if the difference is below the threshold (e.g., the volume levels of the first and second audio data are similar), the vehicle can classify the sound as corresponding to an impact to the vehicle. The vehicle can be controlled based on the sound being classified as an impact to the vehicle body.
B60R 21/0136 - Circuits électriques pour déclencher le fonctionnement des dispositions de sécurité en cas d'accident, ou d'accident imminent, de véhicule comportant des moyens pour détecter les collisions, les collisions imminentes ou un renversement réagissant à un contact effectif avec un obstacle
H04R 1/40 - Dispositions pour obtenir la fréquence désirée ou les caractéristiques directionnelles pour obtenir la caractéristique directionnelle désirée uniquement en combinant plusieurs transducteurs identiques
B60R 21/013 - Circuits électriques pour déclencher le fonctionnement des dispositions de sécurité en cas d'accident, ou d'accident imminent, de véhicule comportant des moyens pour détecter les collisions, les collisions imminentes ou un renversement
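The interior/exterior volume comparison described above can be sketched as follows. Using RMS amplitude as the volume metric and a fixed difference threshold are assumptions for illustration, not the publication's implementation.

```python
# Hypothetical sketch: classify a sound as a body impact when interior
# and exterior microphones register similar RMS volumes.

def rms_volume(samples):
    """Root-mean-square amplitude of a sample buffer."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def is_impact(interior_samples, exterior_samples, threshold=0.2):
    """Impact if the interior/exterior volume difference is below threshold."""
    diff = abs(rms_volume(interior_samples) - rms_volume(exterior_samples))
    return diff < threshold
```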
This disclosure is directed to techniques for storing vehicle data. For instance, system(s) may receive first vehicle data generated by first vehicles operating in a first geographic area. The system(s) may then store the first vehicle data in a first memory. Additionally, the system(s) may determine a type of vehicle data to request from other system(s). The system(s) may then send the other system(s) a request for the type of vehicle data. Based on sending the request, the systems may receive second vehicle data that includes the type of vehicle data, where the second vehicle data is generated by second vehicles operating in a second geographic area. The system(s) may then store the second vehicle data in a second memory. The first memory may be a first type of memory and the second memory may be a second type of memory that is different than the first type of memory.
Techniques for determining whether a reflected lidar pulse has been subject to multipath reflection effects are disclosed. An initially emitted lidar pulse is generated having a property that varies across the pulse (either spatially in cross-section and/or temporally). Detected reflected pulses are analyzed to determine whether they have similar or different properties. If the properties of both pulses are similar, the reflected pulse was likely not affected by multipath reflection. If the properties are different, the reflected pulse likely was affected by multipath reflection. Pulses having a high likelihood of multipath reflections may be discarded (or disregarded) for subsequent processing.
G01S 17/32 - Systèmes déterminant les données relatives à la position d'une cible pour mesurer la distance uniquement utilisant la transmission d'ondes continues, soit modulées en amplitude, en fréquence ou en phase, soit non modulées
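The similar-versus-different property test above can be sketched by comparing normalized property profiles of the emitted and received pulses. Representing the varying property as a list of intensity samples, and the mismatch metric and tolerance, are all assumptions for illustration.

```python
# Hypothetical sketch: keep a lidar return only if its property profile
# matches the emitted pulse's profile (up to overall scale).

def profile_mismatch(emitted, received):
    """Mean absolute difference between peak-normalized property profiles."""
    e_max, r_max = max(emitted), max(received)
    diffs = [abs(e / e_max - r / r_max) for e, r in zip(emitted, received)]
    return sum(diffs) / len(diffs)

def keep_return(emitted, received, tolerance=0.1):
    """Discard returns whose profile diverges (likely multipath)."""
    return profile_mismatch(emitted, received) <= tolerance
```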
Techniques are described herein for determining a state of a vehicle and obtaining first and second audio data for playback by the vehicle. Respective audio priorities are determined for the first and second audio data, wherein at least the audio priority associated with the first audio data is determined based on vehicle data. Based on the audio priorities, a relative volume of the first audio data to the second audio data is determined, and the vehicle is caused to play the first audio data at the relative volume to the second audio data.
B60K 35/26 - Dispositions de sortie, c.-à-d. du véhicule à l'utilisateur, associées aux fonctions du véhicule ou spécialement adaptées à celles-ci utilisant une sortie acoustique
B60K 35/29 - Instruments caractérisés par la manière dont les informations sont traitées, p. ex. présentant des informations sur plusieurs dispositifs d’affichage ou hiérarchisant les informations en fonction des conditions de conduite
B60K 35/28 - Dispositions de sortie, c.-à-d. du véhicule à l'utilisateur, associées aux fonctions du véhicule ou spécialement adaptées à celles-ci caractérisées par le type d’informations de sortie, p. ex. divertissement vidéo ou informations sur la dynamique du véhicule; Dispositions de sortie, c.-à-d. du véhicule à l'utilisateur, associées aux fonctions du véhicule ou spécialement adaptées à celles-ci caractérisées par la finalité des informations de sortie, p. ex. pour attirer l'attention du conducteur
B60R 11/02 - Autres aménagements pour tenir ou monter des objets pour postes radio, de télévision, téléphones, ou objets similaires; Disposition de leur commande
61.
Machine-learned scenario data difficulty metric for reduced computational complexity
Simulation for testing and/or validating autonomous vehicle functions may comprise sampling a set of scenario data to determine a subset of the scenario data for simulating operation of the autonomous vehicle. Determining to include a first scenario in the subset may be based at least in part on one or more difficulty metrics determined by a machine-learned model for the first scenario.
A pre-trained machine-learned model, pre-generated clusters determined from embeddings generated by the machine-learned model, and/or difficulty metric(s) determined from simulation and associated with the clusters may be transmitted to and used on a vehicle. The machine-learned model may use sensor data to generate an embedding or a difficulty metric characterizing a current scenario encountered by the vehicle and the vehicle may alter operation of the vehicle based on the difficulty metric or difficulty metric(s) for the cluster associated with the embedding.
A vehicle comprises an object detection system, comprising: (A) a transmitter, (B) a receiver, and (C) at least one processor configured to: (i) based on a characteristic of the environment in which the vehicle is currently located, determine an operating parameter of at least one of the transmitter and the receiver, (ii) cause the transmitter to output electromagnetic radiation with an intensity associated with the operating parameter of the transmitter, (iii) receive, from the receiver, a signal indicative of the electromagnetic radiation detected by the receiver, wherein the signal is based on the operating parameter of the receiver, (iv) determine that the transmission of the electromagnetic radiation between the transmitter and receiver is at least partially obstructed by an object based at least in part on the signal, and (v) control movement of a first and second door based at least in part on the electromagnetic radiation being obstructed.
B60J 5/04 - Doors arranged at the vehicle sides
E05F 15/75 - Power-operated mechanisms for wings with automatic actuation responsive to the movement or presence of persons or objects responsive to the weight or other physical contact of a person or object
G01V 3/08 - Electric or magnetic prospecting or detecting; Measuring characteristics of the earth's magnetic field, e.g. declination or deviation, operating by means of magnetic or electric fields produced or modified by objects or geological structures or by the detecting devices
Systems and techniques for performing a thermal shock test are discussed herein. The thermal shock test system includes a first thermal shock unit associated with a first electronic component, a second thermal shock unit associated with a second electronic component, a first temperature regulator that maintains a first fluid at a first temperature, and a second temperature regulator that maintains a second fluid at a second temperature. The thermal shock test system may be configurable between a first configuration and a second configuration. The first configuration includes the first thermal shock unit being associated with the first fluid and the second thermal shock unit being associated with the second fluid. The second configuration includes the second thermal shock unit being associated with the first fluid and the first thermal shock unit being associated with the second fluid. The thermal shock test system may transition between the first configuration and the second configuration within a time period that causes the first electronic component and the second electronic component to experience a thermal shock. The first electronic component and the second electronic component may be evaluated for failure after the thermal shock.
G01F 1/684 - Structural arrangements; Mounting of elements, e.g. relative to fluid flow
G01F 15/00 - Details of apparatus of the groups, or accessories therefor, insofar as such accessories or details are not adapted to those particular types of apparatus, e.g. for remote indication
G01F 15/02 - Compensating or correcting for variations in pressure, specific gravity or temperature
09 - Scientific and electrical apparatus and instruments
38 - Telecommunications services
Goods and services
Downloadable software in the nature of a mobile application
for facilitating live communication between passengers and
human support staff during autonomous vehicle rides;
downloadable software for providing real-time ride support
and customer service during autonomous vehicle transport,
featuring live human interaction; downloadable software for
answering passenger inquiries and providing assistance
during autonomous vehicle rides; downloadable software in
the nature of a mobile application for managing ride-related
questions, providing travel information, and enhancing the
passenger experience in autonomous vehicles; downloadable
software for use in connection with autonomous vehicles for
providing human-assisted support during transport using
blockchain technology; downloadable software in the nature
of a mobile application for providing information about
autonomous vehicle rides; downloadable software in the
nature of a mobile application for managing and monitoring
autonomous vehicle rides; downloadable software for
providing real-time ride status updates and travel
information to passengers of autonomous vehicles;
downloadable software for facilitating communication between
passengers and an autonomous vehicle; downloadable software
for answering passenger questions about autonomous vehicle
ride progress, safety features, and in-vehicle amenities;
downloadable software in the nature of a mobile application
for providing personalized assistance and information to
passengers during autonomous vehicle rides; downloadable
software for providing virtual concierge services for
passengers of autonomous vehicles; downloadable software for
providing virtual concierge services in the form of
passenger in-vehicle experience and ambiance; downloadable
software for enabling voice-activated interaction between
autonomous vehicle passengers and onboard systems and remote
operators; downloadable software for delivering
entertainment and informational content to passengers of
autonomous vehicles.
Providing access to telecommunication networks for real-time
communication between passengers and human ride assistants
in autonomous vehicles; telecommunication services for
enabling real-time passenger support during autonomous
vehicle rides; transmission of data and voice between
passengers and human operators during autonomous
transportation services; providing access to
telecommunication networks for communication between
autonomous vehicle passengers and autonomous vehicle service
providers; providing telecommunication access services for
real-time communication between passengers and an autonomous
vehicle assistant; providing electronic transmission of
messages and data between passengers and autonomous vehicle
systems and customer service providers; providing
telecommunications connections to a global computer network
for enabling communication between autonomous vehicle
passengers; electronic transmission of data and messages
between autonomous vehicles, autonomous vehicle fleet and
operation systems, and mobile devices of passengers;
telecommunication services, namely, providing electronic
message alerts via the internet notifying passengers of ride
status, route updates, and in-vehicle features;
telecommunication services, namely, providing electronic
message alerts via the internet notifying passengers of
in-vehicle controls, communications for emergency vehicles,
lost items, changing vehicle features, and passenger in-ride
commands and demands.
Techniques for determining a planned trajectory usable to control a vehicle in an environment are discussed herein. A computing device can receive multiple planned trajectories generated by different models, and determine to use one of the planned trajectories to control the vehicle at a future time. The models can represent machine learned models that are independently trained using different training data and one of the models may leverage human driving data during training. The techniques can also include determining a bias value to cause the vehicle to utilize a planned trajectory from a set of available planned trajectories.
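A minimal sketch of the selection step described above, assuming (hypothetically) that each model's planned trajectory carries a cost and that the bias value is subtracted from the cost of a preferred source before choosing:

```python
def select_trajectory(candidates: dict, bias: dict) -> str:
    """Pick the planned trajectory with the lowest biased cost.

    candidates maps a model name to the cost its trajectory was scored with;
    bias maps a model name to a value subtracted from that cost, nudging the
    selector toward a preferred source (e.g. a model trained on human
    driving data). Names and the cost convention are illustrative.
    """
    return min(candidates, key=lambda m: candidates[m] - bias.get(m, 0.0))

# Without bias the rule-based model's trajectory wins; a bias value flips
# the choice toward the model that leveraged human driving data.
costs = {"learned": 1.2, "rule_based": 1.0}
print(select_trajectory(costs, {}))                 # rule_based
print(select_trajectory(costs, {"learned": 0.5}))   # learned
```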
Techniques for supervised execution of software applications on a vehicle computing device. In some cases, an example method may include receiving a first request to operate a vehicle in a first operational mode. The method may further include determining that the first operational mode is associated with a first set of software applications comprising a first software application. The method may further include based at least in part on receiving the first request, initiating execution of the first software application. The method may further include, based at least in part on receiving the first request and determining that a second software application is outside the first set, at least one of stopping execution of the second software application or refraining from initiating execution of the second software application.
Techniques for determining a rear end collision probability for a vehicle are discussed herein. The rear end collision probability can be determined based on data associated with the vehicle and an object proximate the vehicle, probability distribution data, and a vehicle maneuver value. The probability distribution data, which can be received, may represent a reaction time of the object and a maneuver value of the object. The rear end collision probability can be utilized to control the vehicle.
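The combination of object data, probability distributions, and maneuver values can be sketched as a Monte Carlo estimate. The kinematic model, the specific distributions for reaction time and deceleration, and all parameter values below are illustrative assumptions standing in for the received probability distribution data:

```python
import random

def rear_end_probability(gap_m: float, v_follow: float, v_lead: float,
                         lead_decel: float, samples: int = 20000,
                         seed: int = 0) -> float:
    """Estimate the probability that a following object rear-ends the
    vehicle when the vehicle (the lead) brakes at lead_decel [m/s^2]."""
    rng = random.Random(seed)
    lead_stop = v_lead ** 2 / (2 * lead_decel)   # lead stopping distance [m]
    hits = 0
    for _ in range(samples):
        reaction = max(rng.gauss(1.0, 0.3), 0.0)  # assumed reaction-time dist [s]
        decel = rng.uniform(3.0, 8.0)             # assumed maneuver value [m/s^2]
        # Distance the follower covers before stopping.
        follow_travel = v_follow * reaction + v_follow ** 2 / (2 * decel)
        if follow_travel > gap_m + lead_stop:
            hits += 1
    return hits / samples

p_close = rear_end_probability(gap_m=5.0, v_follow=15.0, v_lead=15.0, lead_decel=6.0)
p_far = rear_end_probability(gap_m=40.0, v_follow=15.0, v_lead=15.0, lead_decel=6.0)
```

As expected, the estimated probability falls as the following gap grows; the resulting value could then feed the vehicle's control decision.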
Techniques for determining a planned trajectory usable to control a vehicle in an environment are discussed herein. A computing device can receive multiple planned trajectories generated by different models, and determine to use one of the planned trajectories to control the vehicle at a future time. The models can represent machine learned models that are independently trained using different training data and one of the models may leverage human driving data during training. The techniques can also include determining a bias value to cause the vehicle to utilize a planned trajectory from a set of available planned trajectories.
G05D 101/15 - Details of software or hardware architectures used for position control using artificial intelligence [AI] techniques using machine learning, e.g. neural networks
72.
SUPERVISED EXECUTION OF SOFTWARE APPLICATIONS ON VEHICLE COMPUTING DEVICES
Techniques for supervised execution of software applications on a vehicle computing device. In some cases, an example method may include receiving a first request to operate a vehicle in a first operational mode. The method may further include determining that the first operational mode is associated with a first set of software applications comprising a first software application. The method may further include based at least in part on receiving the first request, initiating execution of the first software application. The method may further include, based at least in part on receiving the first request and determining that a second software application is outside the first set, at least one of stopping execution of the second software application or refraining from initiating execution of the second software application.
Techniques are discussed herein for executing simulations with parameterized object controllers to control smart agents, to evaluate the agent realism of the smart agents and to validate the efficacy of the simulations. A simulation system may analyze log data captured by a vehicle operating in a physical environment, to determine observable and non-observable behavior characteristics of the agents in the environment. The simulation system may execute simulations using object controllers to control simulated objects (e.g., “smart agents”) based on parameters associated with scenarios, object types, and/or scenario locations. Log data associated with smart agent behaviors may be aggregated and compared to the behavior characteristics of agents in physical environments, to determine metrics for agent realism and simulation efficacy. Based on such metrics, simulation results may be validated and/or object controller parameters may be modified.
B60W 60/00 - Driving assistance systems specially adapted for autonomous road vehicles
G06F 18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
Techniques for fault injection testing are described herein. The techniques may include receiving, at a computing device, an indication to simulate a fault associated with a component of a vehicle. The computing device may be coupled to the component via a control line that includes a relay or switch component that, when activated, causes the component to simulate the fault. The computing device may also receive data indicative of a vehicle response, such as a measured trajectory of the vehicle in response to the fault, whether a backup system of the vehicle performed correctly in response to the fault, whether the vehicle responded to the fault within a threshold period of time, and the like. The computing device may determine a difference between the vehicle response and an intended response of the vehicle relative to a threshold difference.
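The final comparison step can be sketched as a small check. The point-wise deviation metric, the function and parameter names, and the pass/fail rule are illustrative assumptions, not the described implementation:

```python
def response_within_tolerance(measured_traj, intended_traj,
                              max_deviation_m: float,
                              response_time_s: float,
                              time_budget_s: float) -> bool:
    """Evaluate an injected-fault response: the measured trajectory must
    stay within max_deviation_m of the intended trajectory, and the backup
    system must have reacted within the allowed time budget."""
    deviation = max(abs(m - i) for m, i in zip(measured_traj, intended_traj))
    return deviation <= max_deviation_m and response_time_s <= time_budget_s

# Lateral offsets [m] sampled along the trajectory after the fault is
# simulated via the relay on the control line.
ok = response_within_tolerance([0.0, 0.1, 0.3], [0.0, 0.0, 0.2],
                               max_deviation_m=0.5,
                               response_time_s=0.4, time_budget_s=1.0)
print(ok)   # True
```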
B60W 30/09 - Taking automatic action to avoid collision, e.g. by braking or steering
B60W 60/00 - Driving assistance systems specially adapted for autonomous road vehicles
G07C 5/08 - Registering or indicating performance data other than driving, operating, stopping or waiting time, with or without registering driving, operating, stopping or waiting times
G09B 9/042 - Simulators for teaching or training for teaching the driving of vehicles or other means of transport for teaching the driving of land vehicles with simulation in a real vehicle
75.
Detecting and resolving disparities between map data and environments perceived by sensor systems
Techniques are discussed herein for detecting and resolving disparities between sensor data perceived by sensor-based systems operating in an environment and corresponding map data of the environment. Sensor data may be captured by a vehicle or other sensor system operating in an environment, including representations of objects at various locations in the environment. The object representations may be used to determine disparities between the sensor data and map data associated with the same locations. Such disparities may be caused by, for example, physical changes in the environment, map data changes, and/or localization errors of the sensor system. The techniques discussed herein further include analyzing the map data and sensor data to determine causes associated with the disparities, and resolving the disparities by updating the map data and/or transmitting updated sensor configuration data to sensor systems in the environment.
Techniques are described herein for determining correction coefficients for steering wheel angles according to a steering component of a vehicle. The techniques comprise redefining the problem of determining the correction in terms of the turn angle and crab angle of the vehicle, both of which directly depend on the leading and trailing steering angles, taking advantage of techniques for estimating the turn angle from non-steering parameters/measurements. By redefining the problem, at least one of a scale or bias term for the correction may be calculated for use in a linear model for estimating the crab angle of the vehicle.
B60W 50/06 - Details of driving assistance systems for road vehicles not related to the control of a particular sub-unit for improving the dynamic response of the driving assistance system, e.g. improving the speed of regulation or avoiding overshoot or instability
B60W 50/02 - Details of driving assistance systems for road vehicles not related to the control of a particular sub-unit for ensuring safety in the event of a failure of the driving assistance system, e.g. by diagnosing or mitigating a malfunction
77.
Time simulation management of real world sensor frame data
Techniques for utilizing adjusted timestamps and updated hardware internal timers to drive sensor data (e.g., camera and/or video frames) are discussed herein. Timestamps can be adjusted to be adjusted timestamps based on simulation times and adjustment times. The adjustment times can include delays associated with frames being driven out of buffers. The adjustment times can further include delays associated with frames propagating through simulation devices. Internal timers can be updated to be updated timers based on numbers of timesteps between signals of primary clocks.
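The two adjustments described above reduce to simple arithmetic; a minimal sketch, in which the nanosecond units and function names are assumptions for illustration:

```python
def adjusted_timestamp(simulation_time_ns: int, buffer_delay_ns: int,
                       propagation_delay_ns: int) -> int:
    """Adjusted frame timestamp: the simulation time plus the adjustment
    times, i.e. the delay while the frame sat in the output buffer and the
    delay while it propagated through the simulation devices."""
    return simulation_time_ns + buffer_delay_ns + propagation_delay_ns

def updated_timer(timer_ticks: int, timesteps_between_primary_signals: int) -> int:
    """Updated hardware-internal timer: advanced by the number of timesteps
    counted between consecutive signals of the primary clock."""
    return timer_ticks + timesteps_between_primary_signals

# A frame generated at simulation time 1 ms, delayed 2.5 us in the buffer
# and 750 ns in the simulation device chain.
print(adjusted_timestamp(1_000_000, 2_500, 750))   # 1003250
print(updated_timer(10, 4))                        # 14
```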
B60W 60/00 - Driving assistance systems specially adapted for autonomous road vehicles
B60W 50/04 - Details of driving assistance systems for road vehicles not related to the control of a particular sub-unit for monitoring the operation of the driving assistance system
Techniques are discussed for generating and optimizing trajectories for controlling autonomous vehicles in performing on-route and off-route actions within a driving environment. A planning component of an autonomous vehicle can receive or generate time-discretized (or temporal) trajectories for the autonomous vehicle to traverse an environment. Trajectories can be optimized, for example, based on the lateral and longitudinal dynamics of the vehicle, using loss functions and/or costs. In some examples, the temporal optimization of a trajectory may include resampling a previous trajectory based on the differences in the time sequences of the temporal trajectories, to ensure temporal consistency of trajectories across planning cycles. Constraints also may be applied during temporal optimization in some examples, to control or restrict driving maneuvers that are not supported by the autonomous vehicle.
Techniques for detecting and labeling construction zones in an environment are disclosed. Two-dimensional images may be evaluated to identify pixels that may be associated with a construction zone and labeled accordingly. Labels for corresponding pixels in a separate image or non-construction zone labels for the same pixels may be compared to the construction zone pixels and an output image pixel label may be determined based on one or more criteria. An output image can be provided for vehicle control and for other operations, such as top-down segmentation and trajectory determination. Output images and related data may also be used to train a model to perform construction zone detection and labeling.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
B60W 40/04 - Estimation or calculation of operating parameters for driving assistance systems for road vehicles not related to the control of a particular sub-unit related to ambient conditions related to traffic conditions
B60W 60/00 - Driving assistance systems specially adapted for autonomous road vehicles
G06V 10/25 - Determination of a region of interest [ROI] or a volume of interest [VOI]
G06V 10/26 - Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering techniques; Detection of occlusion
G06V 10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; Image or video pattern matching; Proximity measures in feature spaces using context analysis; Selection of dictionaries
G06V 10/80 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
80.
Parallel processing filter for detecting object relevance to vehicle operation planning
A relevance filter may determine to use additional computing resources to generate predicted trajectory(ies) for a subset of object(s) detected by a vehicle that depend on a candidate action for controlling the vehicle. The relevance filter may determine the subset by a memory storage and parallel computing technique that includes determining a set of vehicle states associated with a path for controlling the vehicle; determining a set of object states associated with a predicted trajectory of an object; determining a set of interaction scores based on the set of vehicle states and the set of object states; determining, based on the set of interaction scores, a trajectory importance score associated with the predicted trajectory; determining, based on the trajectory importance score, an object importance score associated with the object; and controlling the vehicle based at least in part on the object importance score and/or a new predicted trajectory for the object.
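The chain of reductions in this abstract (pairwise interaction scores, a per-trajectory importance score, a per-object importance score) can be sketched as follows. The inverse-distance interaction score, the max reductions, and the radius are illustrative stand-ins for the described scoring, and the sketch is sequential rather than parallel:

```python
import math

def trajectory_importance(vehicle_states, object_states, radius: float = 5.0) -> float:
    """Reduce the set of pairwise interaction scores between vehicle states
    and object states to one importance score for a predicted trajectory."""
    scores = [max(0.0, 1.0 - math.hypot(vx - ox, vy - oy) / radius)
              for (vx, vy) in vehicle_states
              for (ox, oy) in object_states]
    return max(scores)

def object_importance(vehicle_states, trajectories, radius: float = 5.0) -> float:
    """Object importance = best importance over its predicted trajectories.
    Objects scoring near zero can be filtered from further prediction."""
    return max(trajectory_importance(vehicle_states, t, radius)
               for t in trajectories)

path = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]   # vehicle states along a path
near = object_importance(path, [[(2.0, 1.0), (3.0, 1.0)]])
far = object_importance(path, [[(50.0, 50.0)]])
```

In this sketch the nearby object scores high (`near` is 0.8) while the distant one scores zero and would be filtered out.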
Techniques are discussed herein for determining optimal driving trajectories for autonomous vehicles in complex multi-agent driving environments. A baseline trajectory may be perturbed and parameterized into a vector of vehicle states associated with different segments (or portions) of the trajectory. Such a vector may be modified to ensure the resultant perturbed trajectory is kinodynamically feasible. The vectorized perturbed trajectory may be input, together with a representation of the current driving environment and additional agents, into a prediction model trained to output a predicted future driving scene. The predicted future driving scene, including predicted future states for the vehicle and predicted trajectories for the additional agents in the environment, may be evaluated to determine costs associated with each perturbed trajectory. Based on the determined costs, the optimization algorithm may determine subsequent perturbations and/or the optimal trajectory for controlling the vehicle in the driving environment.
A vehicle safety system of an autonomous vehicle may determine predicted velocity vectors for a potential collision, and use the velocity vectors to determine a trajectory for the autonomous vehicle to traverse the environment. The vehicle safety system may analyze sensor data to determine a likelihood of a potential collision with a dynamic object in the environment. Predicted velocity vectors may be determined for the autonomous vehicle and the dynamic object at a time and/or location associated with the potential collision. The predicted velocity vectors may be used to determine a point of impact, relative angle, and/or relative velocity between the autonomous vehicle and dynamic object at the potential collision. The likelihood of the potential collision and the predicted velocity vectors may be used to determine a trajectory for the autonomous vehicle, which may include following a current trajectory or transitioning to one or more contingent trajectories.
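The geometry derived from the predicted velocity vectors can be sketched directly; the function name and return convention are assumptions for illustration:

```python
import math

def impact_kinematics(v_vehicle, v_object):
    """From two predicted 2-D velocity vectors at the potential collision,
    return (relative speed, relative heading angle in radians)."""
    rvx = v_object[0] - v_vehicle[0]
    rvy = v_object[1] - v_vehicle[1]
    relative_speed = math.hypot(rvx, rvy)
    dot = v_vehicle[0] * v_object[0] + v_vehicle[1] * v_object[1]
    norm = math.hypot(*v_vehicle) * math.hypot(*v_object)
    # Angle between the two headings, guarded against rounding and
    # zero-magnitude vectors.
    angle = math.acos(max(-1.0, min(1.0, dot / norm))) if norm else 0.0
    return relative_speed, angle

# Head-on geometry: the closing speed is the sum of the two speeds and
# the headings are pi radians apart.
speed, angle = impact_kinematics((10.0, 0.0), (-10.0, 0.0))
```

The resulting relative speed and angle could then weight the decision between following the current trajectory and switching to a contingent one.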
This disclosure describes procedures, as well as methods, systems and computer-readable media, for compressing sensor data such as video data. In many use cases, such as for fully or semi-autonomous vehicles (AVs), a sensor moves through an environment multiple times, capturing data. As a result, the same part of the environment may be captured several times, and compression can be aided by the fact that, in some environments, many of the structures and objects stay the same over time.
G06V 10/62 - Extraction of image or video features relative to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
G06V 10/74 - Image or video pattern matching; Proximity measures in feature spaces
G06V 20/56 - Context or environment of the image exterior to a vehicle from on-board sensors
84.
Systems and methods for ingesting data based on predefined driving sequences
Techniques associated with ingesting data based on a catalog are discussed herein. In some examples, log data associated with a vehicle in an environment can be received. The log data can include at least one of location data, state data, or prediction data. A sequence of data can be identified as corresponding to a driving sequence based on a set of rules. An identification of the driving sequence involving the vehicle in the environment can be associated with the sequence of data in a database. An inquiry for retrieving the sequence of data or information associated with the driving sequence can be received. In response to the inquiry, the sequence of data or information associated with the driving sequence can be returned.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
B60W 60/00 - Driving assistance systems specially adapted for autonomous road vehicles
G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
A light sensor and apparatus capable of simultaneously determining depth and image data from a target in an environment are discussed herein. The apparatus includes an array of photodetector elements comprising a first portion that detects frequency modulated continuous wave (FMCW) signals from the reflected light signals and one or more additional portions that detect visible light signals from the environment.
Techniques for evaluating and controlling vehicles based on the reliability of their software components are described herein. In some cases, a method may include receiving an exposure duration associated with a first scenario and a first operational domain; determining, based at least in part on the exposure duration and a failure rate associated with a first software component of a vehicle, a first fault rate for the first software component; determining a first evaluation measure for the first software component with respect to the first scenario, the first operational domain, and a first energy level, based at least in part on: (i) the first fault rate, and (ii) a second exposure measure associated with the first scenario, first operational domain, and the first energy level; and controlling the vehicle based at least in part on the first evaluation measure.
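One plausible reading of how the quantities in this method compose (the multiplicative composition, the names, the units, and the risk-budget threshold are all assumptions, not the claimed formula):

```python
def evaluation_measure(exposure_duration_h: float,
                       failure_rate_per_h: float,
                       exposure_measure: float) -> float:
    """Fault rate = the software component's failure rate scaled by how long
    the scenario/operational domain exposes the component; the evaluation
    measure then weights that fault rate by the exposure measure associated
    with the scenario, operational domain, and energy level."""
    fault_rate = failure_rate_per_h * exposure_duration_h
    return fault_rate * exposure_measure

def allow_operation(measure: float, budget: float) -> bool:
    # Control decision: keep operating in this domain only while the
    # evaluation measure stays within an acceptable risk budget (assumed).
    return measure <= budget

m = evaluation_measure(exposure_duration_h=100.0,
                       failure_rate_per_h=1e-6,
                       exposure_measure=0.25)
```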
B60W 60/00 - Driving assistance systems specially adapted for autonomous road vehicles
B60W 50/02 - Details of driving assistance systems for road vehicles not related to the control of a particular sub-unit for ensuring safety in the event of a failure of the driving assistance system, e.g. by diagnosing or mitigating a malfunction
A method comprising obtaining weather data relating to an area of an environment, wherein the weather data is associated with a first precipitation rate in the area and is associated with a first time; determining, based at least in part on the weather data, a second precipitation rate at a second time different from the first time and wherein the weather data does not contain the second precipitation rate; determining, based at least in part on the second precipitation rate, a road surface value associated with an amount of precipitation on a surface of a road in the area; and controlling an autonomous vehicle based at least in part on the road surface value.
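The two determinations in this method can be sketched as follows. The linear interpolation between observed rates and the constant-drainage accumulation model (and its mm/hour units) are illustrative assumptions, not the claimed computation:

```python
def interpolated_rate(t0: float, r0: float, t1: float, r1: float, t: float) -> float:
    """Estimate a precipitation rate at a time t (hours) that the weather
    data does not contain, from two observed (time, rate) samples."""
    if t1 == t0:
        return r0
    w = (t - t0) / (t1 - t0)
    return r0 + w * (r1 - r0)

def road_surface_value(rates_mm_per_h, dt_h: float,
                       drainage_mm_per_h: float = 2.0) -> float:
    """Accumulated water depth [mm] on the road surface: precipitation adds
    water at each step, a constant drainage rate removes it, and the depth
    never goes negative."""
    depth = 0.0
    for r in rates_mm_per_h:
        depth = max(0.0, depth + (r - drainage_mm_per_h) * dt_h)
    return depth

r = interpolated_rate(t0=0.0, r0=4.0, t1=2.0, r1=0.0, t=1.0)   # 2.0 mm/h
depth = road_surface_value([4.0, r, 0.0], dt_h=1.0)
```

The resulting road surface value could then gate speed or maneuver limits when controlling the autonomous vehicle.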
Particulate matter, such as fog, snow, rain, steam, vehicle exhaust, debris (plastic bags), etc. may cause one or more sensor types to generate false positive solid surface detections. In particular, various depth measurements may be impeded by particulate matter. Identifying false positive return(s) may comprise clustering lidar points, determining differences in range indicated by two different lidar devices having lidar points in the cluster, determining first differences that are more negative than a negative difference threshold and second differences that are more positive than a positive difference threshold, determining a first portion of lidar data in the cluster associated with the first differences and the second differences is associated with particulate matter or debris, and controlling a vehicle based at least in part on suppressing the first portion of the lidar data or indicating that the first portion of the lidar data is associated with particulate matter or debris.
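The two-sided difference test described above can be sketched on a single cluster. The threshold values and the return convention are illustrative assumptions; clustering itself is omitted:

```python
def particulate_indices(ranges_a, ranges_b,
                        neg_thresh: float = -0.5,
                        pos_thresh: float = 0.5):
    """Flag points in a clustered region as particulate matter or debris
    when two lidar devices disagree in both directions: some range
    differences more negative than the negative threshold AND some more
    positive than the positive threshold. A solid surface would make the
    two sensors disagree consistently, not in both directions."""
    diffs = [a - b for a, b in zip(ranges_a, ranges_b)]
    neg = [i for i, d in enumerate(diffs) if d < neg_thresh]
    pos = [i for i, d in enumerate(diffs) if d > pos_thresh]
    if neg and pos:
        return sorted(neg + pos)   # portion to suppress or tag as particulate
    return []

# Fog-like cluster: the two sensors see "through" it inconsistently.
fog = particulate_indices([10.0, 9.0, 12.0], [11.0, 9.1, 10.8])
# Solid surface: small, consistent disagreement, nothing flagged.
solid = particulate_indices([10.0, 10.0], [10.05, 9.95])
```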
G01S 17/931 - Lidar systems, specially adapted for specific applications for preventing collisions of land vehicles
B60W 50/02 - Details of driving assistance systems for road vehicles not related to the control of a particular sub-unit for ensuring safety in the event of a failure of the driving assistance system, e.g. by diagnosing or mitigating a malfunction
B60W 50/029 - Adapting to failures or working around them with alternative solutions, e.g. avoiding the use of failed parts
G01S 7/48 - Details of systems corresponding to the groups, of systems according to the group
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
A vehicle computing system may implement techniques to determine whether two objects in an environment are related as an articulated object. The techniques may include applying heuristics and algorithms to object representations (e.g., bounding boxes) to determine whether two objects are related as a single object with two portions that articulate relative to each other. The techniques may include predicting future states of the articulated object in the environment. One or more model(s) may be used to determine presence of the articulated object and/or predict motion of the articulated object in the future. Based on the presence and/or motion of the articulated object, the vehicle computing system may control operation of the vehicle.
G06V 20/56 - Context or environment of the image exterior to a vehicle from on-board sensors
G06V 20/17 - Terrestrial scenes transmitted by planes or drones
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Techniques for determining drivable area(s), parking location(s), or other incident areas in an environment are discussed herein. The drivable area(s), parking location(s), and/or other incident areas can be determined by a machine learned model. Training of the machine learned model can be based on sensor data and map data. The sensor data and the map data can be utilized to determine a representation (e.g., a top-down representation) of an environment. The representation can include at least road marking and velocity information associated with a dynamic object in the environment. The sensor data can be utilized to determine the dynamic object. The machine learned model can generate outputs including probabilities that elements of the outputs represent a drivable area, non-drivable area, a parking location, and/or an incident area. The outputs can be utilized to generate a trajectory. The trajectory can be utilized to control a vehicle to traverse the environment.
B60W 30/08 - Predicting or preventing probable or imminent collision
B60W 40/02 - Estimation or calculation of operating parameters for driving assistance systems for road vehicles not related to the control of a particular sub-unit related to ambient conditions
G06V 20/56 - Context or environment of the image exterior to a vehicle from on-board sensors
G06N 3/008 - Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating animals or humans in their appearance or behaviour
92.
OPERATING AN AUTONOMOUS VEHICLE BASED ON ROAD SURFACE CONDITION
A method comprising obtaining weather data relating to an area of an environment, wherein the weather data is associated with a first precipitation rate in the area and is associated with a first time; determining, based at least in part on the weather data, a second precipitation rate at a second time different from the first time and wherein the weather data does not contain the second precipitation rate; determining, based at least in part on the second precipitation rate, a road surface value associated with an amount of precipitation on a surface of a road in the area; and controlling an autonomous vehicle based at least in part on the road surface value.
B60W 60/00 - Driving assistance systems specially adapted for autonomous road vehicles
B60W 40/06 - Estimation or calculation of operating parameters for driving assistance systems for road vehicles not related to the control of a particular sub-unit related to ambient conditions related to the condition of the road
B60W 50/00 - Details of driving assistance systems for road vehicles not related to the control of a particular sub-unit
Techniques for determining drivable area(s), parking location(s), or other incident areas in an environment are discussed herein. The drivable area(s), parking location(s), and/or other incident areas can be determined by a machine learned model. Training of the machine learned model can be based on sensor data and map data. The sensor data and the map data can be utilized to determine a representation (e.g., a top-down representation) of an environment. The representation can include at least road marking and velocity information associated with a dynamic object in the environment. The sensor data can be utilized to determine the dynamic object. The machine learned model can generate outputs including probabilities that elements of the outputs represent a drivable area, non-drivable area, a parking location, and/or an incident area. The outputs can be utilized to generate a trajectory. The trajectory can be utilized to control a vehicle to traverse the environment.
G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Techniques are described for modifying a factor associated with an invalid point cloud registration (e.g., alignment) between two data sets, to improve a map generation process. A computing system receives sensor data from a vehicle traversing an environment. The computing system aligns discrete data sets, such as to generate a representation of the environment based on the discrete data sets. In some examples, the computing system can apply an optimization algorithm to the alignments, to optimize the representation of the environment. The computing system identifies an invalid (non-ideal) alignment between two data sets and modifies a factor associated with the invalid alignment, introducing a repulsive force. The computing system performs an optimization with the modified factor, which causes a location associated with at least one of the two data sets to be different from a previously measured location. The computing system generates a map based on the optimization.
G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
G06V 10/74 - Image or video pattern matching; Proximity measures in feature spaces
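The repulsive-force modification described above can be illustrated on a toy one-dimensional pose graph. In this Python sketch (an assumed simplification; real systems optimize full poses with dedicated solvers), valid alignments act as attractive quadratic factors, while an invalid registration's factor is replaced by a one-sided repulsive term that pushes the two data sets apart until a minimum separation is restored:

```python
def optimize(x, good, repulsive, lr=0.05, iters=2000):
    """Gradient descent over 1-D node positions x.

    good:      (i, j, offset) factors asserting x[j] - x[i] == offset
    repulsive: (i, j, min_sep) factors replacing invalid registrations,
               active only while the two nodes are closer than min_sep
    """
    x = list(x)
    for _ in range(iters):
        g = [0.0] * len(x)
        for i, j, d in good:
            r = (x[j] - x[i]) - d   # residual of a valid alignment
            g[j] += 2 * r
            g[i] -= 2 * r
        for i, j, dmin in repulsive:
            gap = dmin - (x[j] - x[i])
            if gap > 0:             # push apart only while too close
                g[j] -= 2 * gap
                g[i] += 2 * gap
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Two valid chain alignments, plus a repulsive factor standing in for an
# invalid registration that had pulled nodes 0 and 2 too close together.
x = optimize([0.0, 0.4, 0.5],
             good=[(0, 1, 1.0), (1, 2, 1.0)],
             repulsive=[(0, 2, 1.8)])
```

After optimization the nodes settle near the chain's consistent spacing, so the location associated with the repelled data set ends up different from its previously measured location, as the abstract describes.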
96.
Computational framework for automatic high assurance system design
Techniques for efficiently generating systems are disclosed herein. Abstract system definitions can define the various functions of a system and/or the fitness requirements of the system. Components capable of implementing aspects of the abstract system can be identified, and systems configured to perform the functions of the abstract system can be determined. The fitness of these systems can be evaluated to determine whether they meet the fitness requirements of the abstract system. If not, an algorithm is used to iteratively adjust the configurations and components of the systems until the fitness criteria are met.
B60W 50/02 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, for ensuring safety in case of a drive control system failure, e.g. by diagnosing or mitigating a malfunction
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, related to ambient conditions
B60W 50/029 - Adapting to failures or working around them with alternative solutions, e.g. by avoiding the use of failed parts
G06F 30/20 - Design optimisation, verification or simulation
G06N 3/126 - Evolutionary algorithms, e.g. genetic algorithms or genetic programming
G06N 7/01 - Probabilistic graphical models, e.g. probabilistic networks
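One hedged reading of the iterative loop above, as a Python sketch with invented component names, latencies, and a single latency-bound fitness requirement (the abstract does not specify the adjustment algorithm; a greedy swap is used here for brevity):

```python
COMPONENTS = {
    "perception": [("lidar_v1", 40), ("lidar_v2", 25)],  # (name, latency ms)
    "planning":   [("planner_a", 50), ("planner_b", 30)],
}
MAX_LATENCY_MS = 60  # fitness requirement from the abstract system definition

def fitness(system):
    """Total latency of the concrete system; lower is fitter."""
    return sum(latency for _, latency in system.values())

def synthesize():
    """Iteratively swap components until the fitness requirement is met."""
    system = {fn: cands[0] for fn, cands in COMPONENTS.items()}
    while fitness(system) > MAX_LATENCY_MS:
        # Pick the function whose best candidate improves fitness the most.
        fn, best = min(
            ((fn, min(cands, key=lambda c: c[1]))
             for fn, cands in COMPONENTS.items()),
            key=lambda pair: pair[1][1] - system[pair[0]][1],
        )
        if best[1] >= system[fn][1]:
            raise ValueError("no configuration meets the fitness requirement")
        system[fn] = best
    return system
```

Each loop iteration mirrors the abstract's evaluate-then-adjust cycle: the concrete system is scored against the fitness requirement, and a component is replaced when the requirement is not yet met.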
Systems and techniques for determining whether to associate predicted trajectories of objects detected in an environment with candidate vehicle trajectories are described. A vehicle traversing an environment may have several candidate actions that may be used to control the vehicle. Predicted object trajectories may be excluded from further processing based on a relevancy score determination. Various factors may be evaluated to determine the relevancy score associated with a predicted object trajectory and candidate vehicle action pair, indicating the impact of the predicted object trajectory on the candidate vehicle action. A resultant trajectory may be determined using only the relevant predicted object trajectories.
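A minimal sketch of such a relevancy filter in Python, with closest approach between time-aligned paths standing in (as an assumption) for the abstract's unspecified scoring factors:

```python
def closest_approach(path_a, path_b):
    """Minimum distance between time-aligned points of two (x, y) paths."""
    return min(
        ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
        for (ax, ay), (bx, by) in zip(path_a, path_b)
    )

def relevant_pairs(object_paths, vehicle_actions, max_range=5.0):
    """Keep only pairs that come within max_range; score = 1/(1+distance)."""
    kept = []
    for oi, opath in enumerate(object_paths):
        for ai, apath in enumerate(vehicle_actions):
            d = closest_approach(opath, apath)
            if d <= max_range:
                kept.append((oi, ai, 1.0 / (1.0 + d)))
    return kept

ego = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
near = [(0.0, 1.0), (1.0, 1.0), (2.0, 1.0)]  # passes 1 m away: relevant
far = [(0.0, 9.0), (1.0, 9.0), (2.0, 9.0)]   # stays 9 m away: excluded
pairs = relevant_pairs([near, far], [ego])
```

Only the surviving pairs would be carried forward into evaluation of the candidate actions, which is the pruning benefit the abstract describes.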
Techniques for integrating sensor data into a scene or map based on statistical data of captured environmental data are discussed herein. The data may be stored as a multi-resolution voxel space and the techniques may comprise first applying a pre-alignment or localization technique prior to fully integrating the sensor data.
G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
G06T 3/20 - Linear translation of whole images or parts thereof, e.g. panning
G06T 3/60 - Rotation of whole images or parts thereof
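A simplified sketch of the two-stage idea above in Python, using 2-D points, per-voxel point counts in place of richer statistics, and a brute-force integer-shift search as the pre-alignment (all simplifications are assumptions, not details from the abstract):

```python
def voxelize(points, size):
    """Count points per voxel cell at the given resolution."""
    cells = {}
    for x, y in points:
        key = (int(x // size), int(y // size))
        cells[key] = cells.get(key, 0) + 1
    return cells

def pre_align(map_pts, scan_pts, coarse=4.0, search=2):
    """Best (dx, dy) shift maximising coarse-voxel overlap with the map."""
    map_cells = voxelize(map_pts, coarse)
    def overlap(dx, dy):
        shifted = voxelize([(x + dx, y + dy) for x, y in scan_pts], coarse)
        return sum(min(n, map_cells.get(k, 0)) for k, n in shifted.items())
    shifts = [(dx * coarse, dy * coarse)
              for dx in range(-search, search + 1)
              for dy in range(-search, search + 1)]
    return max(shifts, key=lambda s: overlap(*s))

def integrate(map_fine, scan_pts, shift, fine=1.0):
    """Fully integrate the pre-aligned scan at the finer resolution."""
    dx, dy = shift
    for x, y in scan_pts:
        key = (int((x + dx) // fine), int((y + dy) // fine))
        map_fine[key] = map_fine.get(key, 0) + 1
    return map_fine

map_pts = [(0.5, 0.5), (1.5, 0.5)]
scan_pts = [(4.5, 0.5), (5.5, 0.5)]  # same structure, one coarse cell over
shift = pre_align(map_pts, scan_pts)
fused = integrate({}, scan_pts, shift)
```

The coarse and fine cell sizes play the role of two levels of the multi-resolution voxel space: the cheap coarse level localizes the scan, and only then are its points committed to the fine level.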
99.
VEHICLE LOCALIZATION AND MAPPING BASED ON SENSOR DATA RELIABILITY DETERMINATIONS
Techniques are described for determining whether a global navigation satellite system (GNSS) measurement generated by a sensor associated with a vehicle is reliable. In some cases, an example system determines whether a first GNSS measurement generated by a first sensor and associated with a first time is reliable based on at least one of: (i) one or more GNSS measurements generated by the first sensor and associated with one or more times before and/or after the first time, or (ii) one or more GNSS measurements generated by a second sensor and associated with the first time. For example, the system may determine whether a first GNSS measurement generated by a first sensor and associated with a first time is reliable based on whether a ratio of a sum of incremental distances associated with a GNSS measurement sequence including the first GNSS measurement over an odometry-based distance falls within a threshold range.
G01C 21/28 - Navigation; Navigational instruments not provided for in groups, specially adapted for navigation in a road network, with correlation of data from several navigational instruments
G01C 21/00 - Navigation; Navigational instruments not provided for in groups
G01S 19/39 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Navigation Satellite System] or GALILEO
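The ratio test in the example above translates almost directly into code; in this Python sketch only the threshold bounds and the sample fixes are assumed values:

```python
def gnss_path_length(fixes):
    """Sum of incremental distances between consecutive (x, y) GNSS fixes."""
    return sum(
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in zip(fixes, fixes[1:])
    )

def gnss_reliable(fixes, odometry_distance, lo=0.8, hi=1.25):
    """Reliable when GNSS path length over odometry distance is in range."""
    ratio = gnss_path_length(fixes) / odometry_distance
    return lo <= ratio <= hi

steady = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]  # 2 m of GNSS travel
jumpy = [(0.0, 0.0), (6.0, 0.0), (2.0, 0.0)]   # multipath-style jump
```

A jumpy sequence inflates the summed incremental distance relative to odometry, so its ratio falls outside the threshold range and the measurement is flagged unreliable.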
100.
VEHICLE LOCALIZATION AND MAPPING BASED ON SENSOR DATA RELIABILITY DETERMINATIONS
Techniques are described for determining whether a global navigation satellite system (GNSS) measurement generated by a sensor associated with a vehicle is reliable. In some cases, an example system determines whether a first GNSS measurement generated by a first sensor and associated with a first time is reliable based on at least one of: (i) one or more GNSS measurements generated by the first sensor and associated with one or more times before and/or after the first time, or (ii) one or more GNSS measurements generated by a second sensor and associated with the first time. For example, the system may determine whether a first GNSS measurement generated by a first sensor and associated with a first time is reliable based on whether a ratio of a sum of incremental distances associated with a GNSS measurement sequence including the first GNSS measurement over an odometry-based distance falls within a threshold range.
G01S 19/39 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Navigation Satellite System] or GALILEO
G05D 1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
G05D 1/248 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons, generated by satellites, e.g. GPS