Disclosed herein are methods and apparatus for controlling autonomous vehicles utilizing maps that include visibility information. A map is stored at a computing device associated with a vehicle. The vehicle is configured to operate in an autonomous mode that supports a plurality of driving behaviors. The map includes information about a plurality of roads, a plurality of features, and visibility information for at least a first feature in the plurality of features. The computing device queries the map for visibility information for the first feature at a first position. The computing device, in response to querying the map, receives the visibility information for the first feature at the first position. The computing device selects a driving behavior for the vehicle based on the visibility information. The computing device controls the vehicle in accordance with the selected driving behavior.
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. using automatic pilots
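The query-then-select flow in this abstract can be sketched in a few lines. Everything below (the map representation, the feature and position names, and the 0.5 visibility threshold) is an illustrative assumption, not the patented implementation:

```python
# Hypothetical sketch: selecting a driving behavior from queried visibility.
# The map contents, feature names, and threshold are illustrative assumptions.

def query_visibility(vis_map, feature, position):
    """Return visibility (0.0-1.0) for a feature as seen from a position."""
    return vis_map.get((feature, position), 0.0)

def select_behavior(visibility, threshold=0.5):
    """Pick a cautious behavior when the feature is poorly visible."""
    return "proceed" if visibility >= threshold else "slow_and_creep"

# Example: a stop sign that is barely visible from the vehicle's position.
vis_map = {("stop_sign_17", "node_A"): 0.2, ("stop_sign_17", "node_B"): 0.9}
behavior = select_behavior(query_visibility(vis_map, "stop_sign_17", "node_A"))
```

The two-behavior choice stands in for the claimed "plurality of driving behaviors"; a real planner would select among many more.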
Example systems and methods allow for reporting and sharing of information reports relating to driving conditions within a fleet of autonomous vehicles. One example method includes receiving information reports relating to driving conditions from a plurality of autonomous vehicles within a fleet of autonomous vehicles. The method may also include receiving sensor data from a plurality of autonomous vehicles within the fleet of autonomous vehicles. The method may further include validating some of the information reports based at least in part on the sensor data. The method may additionally include combining validated information reports into a driving information map. The method may also include periodically filtering the driving information map to remove outdated information reports. The method may further include providing portions of the driving information map to autonomous vehicles within the fleet of autonomous vehicles.
B60W 30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
B60W 50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
G08G 1/00 - Traffic control systems for road vehicles
G08G 1/01 - Detecting movement of traffic to be counted or controlled
G08G 1/0967 - Systems involving transmission of highway information, e.g. weather conditions, speed limits
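The validate-combine-filter pipeline described in this abstract can be sketched as follows. The data shapes, the corroboration rule, and the one-hour staleness window are illustrative assumptions, not the patented method:

```python
# Illustrative sketch of the fleet-report pipeline: validate reports against
# sensor data, merge validated ones into a map, and filter out stale entries.

def validate(report, sensor_observations):
    """Keep a report only if some sensor observation corroborates it."""
    return report["condition"] in sensor_observations.get(report["location"], set())

def filter_outdated(driving_map, now, max_age_s=3600):
    """Drop reports older than max_age_s seconds (periodic filtering step)."""
    return [r for r in driving_map if now - r["timestamp"] <= max_age_s]

now = 10_000.0
reports = [
    {"location": "I-280N", "condition": "ice", "timestamp": now - 100},
    {"location": "US-101", "condition": "fog", "timestamp": now - 7200},
]
sensors = {"I-280N": {"ice"}, "US-101": {"fog"}}
driving_map = [r for r in reports if validate(r, sensors)]  # combine step
fresh = filter_outdated(driving_map, now)                   # filtering step
```

Both reports pass validation here, but only the recent one survives the periodic filter; the surviving portion is what would be served back to the fleet.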
3.
Methods and Systems for Estimating Rain Rate via Vehicle Imaging Radar
Example embodiments relate to techniques for using vehicle imaging radar to estimate rain rate and other weather conditions. A computing device may receive radar data from a radar unit coupled to a vehicle. The radar data can represent the vehicle's environment. The computing device may use the radar data to determine a radar representation that indicates backscatter power and estimate, using a rain rate model, a rain rate for the environment based on the radar representation. The computing device may then control the vehicle based on the rain rate. In some examples, the computing device may provide the rain rate estimation and an indication of its current location to other vehicles to enable the vehicles to adjust routes based on the rain rate estimation.
G01S 13/95 - Radar or analogous systems specially adapted for specific applications for meteorological use
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions
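The abstract does not disclose the form of its rain rate model. As a stand-in, this sketch inverts the classic Marshall-Palmer Z-R relation (Z = 200 · R^1.6), a standard way to map radar backscatter in dBZ to a rain rate in mm/h; the actual patented model may differ:

```python
# Assumed stand-in model: invert the Marshall-Palmer Z-R relation to turn
# radar reflectivity (dBZ) into a rain rate estimate (mm/h).

def rain_rate_mm_per_h(reflectivity_dbz, a=200.0, b=1.6):
    z_linear = 10.0 ** (reflectivity_dbz / 10.0)  # dBZ -> linear reflectivity Z
    return (z_linear / a) ** (1.0 / b)            # R = (Z/a)^(1/b)

light = rain_rate_mm_per_h(23.0)  # roughly 1 mm/h: light rain
heavy = rain_rate_mm_per_h(45.0)  # tens of mm/h: heavy rain
```

A downstream controller could then gate vehicle behavior (speed, following distance, route sharing) on the estimated rate.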
Example embodiments relate to radar reflection filtering using a vehicle sensor system. A computing device may detect a first object in radar data from a radar unit coupled to a vehicle and, responsive to determining that information corresponding to the first object is unavailable from other vehicle sensors, use the radar data to determine a position and a velocity for the first object relative to the radar unit. The computing device may also detect a second object aligned with a vector extending between the radar unit and the first object. Based on a geometric relationship between the vehicle, the first object, and the second object, the computing device may determine that the first object is a self-reflection of the vehicle caused at least in part by the second object and control the vehicle based on this determination.
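The geometric relationship in this abstract can be illustrated with a minimal check: if a second object sits on (nearly) the same bearing as the first detection and the first detection's range is about twice the second object's range, the "first object" is consistent with the radar seeing a reflection of the ego vehicle bounced off the second object. The tolerances and the range-doubling rule below are illustrative assumptions:

```python
# Hypothetical self-reflection test: bearing alignment plus an apparent range
# of roughly twice the reflector's range. Thresholds are assumed values.
import math

def is_self_reflection(first_pos, second_pos, angle_tol=0.05, range_tol=0.1):
    r1 = math.hypot(*first_pos)   # range to the suspect detection
    r2 = math.hypot(*second_pos)  # range to the candidate reflector
    # Angular alignment: both detections lie along (nearly) the same bearing.
    aligned = abs(math.atan2(first_pos[1], first_pos[0])
                  - math.atan2(second_pos[1], second_pos[0])) < angle_tol
    # A round trip off the reflector doubles the apparent range.
    doubled = abs(r1 - 2.0 * r2) < range_tol * r1
    return aligned and doubled

ghost = is_self_reflection((20.0, 0.2), (10.0, 0.1))  # consistent with a reflection
real = is_self_reflection((20.0, 0.2), (15.0, 0.1))   # range ratio rules it out
```

The abstract's additional cue (no corroboration from other vehicle sensors) would gate whether this check is even run.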
The present disclosure relates to systems and devices having a rotatable mirror assembly. An example system includes a housing and a rotatable mirror assembly. The rotatable mirror assembly includes a plurality of reflective surfaces, a shaft defining a rotational axis, and a mirror body coupling the plurality of reflective surfaces to the shaft. The mirror body includes a plurality of flexible support members. The rotatable mirror assembly also includes a coupling bracket configured to removably couple the rotatable mirror assembly to the housing. The system also includes a transmitter configured to emit emission light into an environment of the system after interacting with at least one reflective surface of the plurality of reflective surfaces. The system additionally includes a receiver configured to detect return light from the environment after interacting with the at least one reflective surface of the plurality of reflective surfaces.
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G01D 5/14 - Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member into another variable where the form or nature of the sensing member does not constrain the means of conversion; Transducers not specially adapted for a specific variable using electric or magnetic means influencing the value of a current or voltage
G01S 17/08 - Systems determining position data of a target for measuring distance only
6.
SYNTHESIZING THREE-DIMENSIONAL VISUALIZATIONS FROM PERSPECTIVES OF ONBOARD SENSORS OF AUTONOMOUS VEHICLES
Aspects of the disclosure provide for generating a visualization of a three-dimensional (3D) world view from the perspective of a camera of a vehicle. For example, images of a scene captured by a camera of the vehicle and 3D content for the scene may be received. A virtual camera model for the camera of the vehicle may be identified. A set of matrices may be generated using the virtual camera model. The set of matrices may be applied to the 3D content to create a 3D world view. The visualization may be generated using the 3D world view as an overlay with the image, and the visualization provides a real-world image from the perspective of the camera of the vehicle with one or more graphical overlays of the 3D content.
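The matrix pipeline above can be illustrated with the standard pinhole projection: a virtual camera model maps 3D scene content into pixel coordinates so it can be drawn as an overlay on the camera image. The intrinsic values below are assumed, and the camera frame is used directly (identity extrinsics), which is a simplification of the set of matrices the disclosure describes:

```python
# Hedged sketch: pinhole projection of 3D content into the virtual camera's
# image plane. Intrinsics (fx, fy, cx, cy) are assumed example values.

def project(point_3d, fx=1000.0, fy=1000.0, cx=640.0, cy=360.0):
    """Project a camera-frame 3D point (x, y, z) to pixel coordinates."""
    x, y, z = point_3d
    if z <= 0:
        return None  # behind the virtual camera; nothing to overlay
    return (fx * x / z + cx, fy * y / z + cy)

# A 3D content point 10 m ahead and 1 m to the right lands right of center
# in a 1280x720 image whose principal point is (640, 360).
u, v = project((1.0, 0.0, 10.0))
```

A full implementation would compose extrinsic and intrinsic matrices from the virtual camera model and rasterize the projected content over the real image.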
Example embodiments relate to techniques for enabling one or more systems of a vehicle (e.g., an autonomous vehicle) to request remote assistance to help the vehicle navigate in an environment. A computing device may be configured to receive a request for assistance from a vehicle. The request may include an image frame representative of a portion of an environment. The computing device may also be configured to initiate display of a graphical user interface to visually represent the image frame. Further, the computing device may determine a bounding region for the image frame. The bounding region may be associated with one or more objects in the image frame. Additionally, the computing device may be configured to receive, via the GUI, an input that includes an object identifier, and associate the object identifier with each of the one or more objects in the bounding region.
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. using automatic pilots
G06T 7/70 - Determining position or orientation of objects or cameras
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
H04W 4/40 - Services specially adapted for particular environments, situations, or purposes, for vehicles, e.g. vehicle-to-pedestrian communication
Example embodiments relate to lane adjustment techniques for slow lead agents. A vehicle computing system may use sensor data depicting the surrounding environment to detect when another vehicle is traveling in front of the vehicle at a speed that is less than a threshold minimum speed. If the other vehicle fails to increase speed above the minimum speed, the computing system may determine whether to change lanes to avoid the other vehicle based on speed data for other lanes. In some implementations, the computing system assigns penalties to lane segments surrounding the vehicle based on speed data for the different lane segments. For instance, the path finding system for the vehicle can use penalties and speed data to determine efficient routes that safely circumvent slow agents.
B60W 30/16 - Control of distance between vehicles, e.g. keeping a distance to the preceding vehicle
B60W 40/04 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to ambient conditions related to traffic conditions
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
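The penalty idea in the slow-lead-agent abstract can be sketched simply: each lane segment receives a penalty that grows as its observed speed falls below the desired speed, and path finding prefers the lowest-penalty lane. The linear penalty formula and the speed values are illustrative assumptions:

```python
# Hypothetical lane-penalty sketch: penalize lane segments by their speed
# deficit relative to a desired speed; the formula is an assumed example.

def segment_penalty(observed_speed, desired_speed):
    """Zero penalty at or above the desired speed; grows linearly below it."""
    return max(0.0, desired_speed - observed_speed)

def best_lane(lane_speeds, desired_speed):
    """Pick the lane whose segment carries the smallest penalty."""
    return min(lane_speeds,
               key=lambda lane: segment_penalty(lane_speeds[lane], desired_speed))

lanes = {"current": 8.0, "left": 24.0, "right": 15.0}  # observed speeds, m/s
choice = best_lane(lanes, desired_speed=25.0)
```

In a real planner these penalties would be one term in a route cost that also accounts for safety and lane-change feasibility.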
9.
Systems and Devices for Strain Relief for Magnetic Cores and Assemblies
An example device includes a mounting structure including a first material having a first coefficient of thermal expansion (CTE). The mounting structure includes a center portion and an outer portion. The device further includes a magnetic core for an electrical component that is coupled to the outer portion of the mounting structure. The magnetic core includes a second material having a second CTE. The magnetic core is split into a plurality of sections separated by spaces extending from the center portion to an outer edge of the outer portion. Each of the plurality of sections is separately coupled to the mounting structure, and each of the plurality of sections is connected to the electrical component.
H01F 27/26 - Fastening parts of the core together; Fastening or mounting the core on a casing or support
H01F 27/30 - Fastening or clamping coils, windings, or parts thereof together; Fastening or mounting coils or windings on the core, in a casing, or on another support
The described aspects and implementations enable privacy-respecting detection, separation, and localization of sounds in vehicle environments. The techniques include obtaining, using audio detector(s) of a vehicle, a sound recording that includes a plurality of elemental sounds (ESs) in a driving environment of the vehicle, and processing, using a sound separation model, the sound recording to separate individual ESs of the plurality of ESs. The techniques further include identifying the content of individual ESs and causing a driving path of the vehicle to be modified in view of the identified content of the individual ESs. Further techniques include rendering speech imperceptible by redacting temporal portions of the speech, using sound recognition models to identify and discard recordings of speech, and driving at speeds that exceed the threshold speeds at which speech becomes imperceptible due to noise masking.
G10L 15/20 - Speech recognition techniques specially adapted for robustness against environmental disturbances, e.g. in noisy surroundings or for recognition of stress-induced speech
G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialogue
G10L 25/51 - Speech or voice analysis techniques not restricted to a single one of the groups, specially adapted for particular use, for comparison or discrimination
11.
Augmenting point cloud data with artificial return points
Aspects and implementations of the present disclosure relate to augmenting point cloud data with artificial return points. An example method includes: receiving point cloud data comprising a plurality of return points, each return point being representative of a reflecting region that reflects a beam emitted by a sensing system, and generating a plurality of artificial return points based on presence or absence of return points along radial paths of beams emitted from the sensing system.
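The augmentation described above can be sketched as follows: for each emitted beam with no return, synthesize an artificial point along the beam's radial path, marking traversed empty space. Placing the artificial point at an assumed maximum range is one possible convention, chosen here for illustration; the disclosure covers generating points based on presence or absence of returns more generally:

```python
# Illustrative sketch: augment a 2D point cloud with artificial return points
# on beams that produced no real return. Max-range placement is an assumption.
import math

def augment(beam_azimuths, returns, max_range=100.0):
    """returns maps azimuth (rad) -> range (m) for beams that got a return.

    Yields (x, y, artificial) tuples: real points where a return exists,
    artificial points at max_range along beams with no return.
    """
    points = []
    for az in beam_azimuths:
        if az in returns:
            r, artificial = returns[az], False
        else:
            r, artificial = max_range, True
        points.append((r * math.cos(az), r * math.sin(az), artificial))
    return points

beams = [0.0, 0.1, 0.2]
cloud = augment(beams, returns={0.1: 30.0})  # only the middle beam returned
```

Downstream consumers (e.g. free-space estimation) can then distinguish "no obstacle out to max range" from "never measured".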
Aspects and implementations are related to systems and techniques enabling predictions of a motion change in a moving vehicle, predictions of an onset of a motion of an idling vehicle, and classification of vehicles based, at least in part, on vibrometry data obtained using light detection and ranging devices.
G08G 1/052 - Detecting movement of traffic to be counted or controlled with provisions for determining speed or overspeed
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. using automatic pilots
One example LIDAR device comprises a substrate and a waveguide disposed on the substrate. A first section of the waveguide extends lengthwise on the substrate in a first direction. A second section of the waveguide extends lengthwise on the substrate in a second direction different than the first direction. A third section of the waveguide extends lengthwise on the substrate in a third direction different than the second direction. The second section extends lengthwise between the first section and the third section. The LIDAR device also comprises a light emitter configured to emit light. The waveguide is configured to guide the light inside the first section toward the second section, inside the second section toward the third section, and inside the third section away from the second section.
G02B 6/125 - Bends, branchings, or intersections
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G01S 17/10 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
G01S 17/42 - Simultaneous measurement of distance and other co-ordinates
G02B 6/122 - Basic optical elements, e.g. light-guiding paths
G02B 6/42 - Coupling of light guides with opto-electronic elements
The present disclosure relates to devices, systems, and methods relating to configurable silicon photomultiplier (SiPM) devices. An example device includes a substrate and a plurality of single photon avalanche diodes (SPADs) coupled to the substrate. The device also includes a plurality of outputs coupled to the substrate and a plurality of electrical components coupled to the substrate. The plurality of electrical components are configured to selectively connect the plurality of SPADs to the plurality of outputs by selecting which output of the plurality of outputs is connected to each SPAD of the plurality of SPADs and to thereby define a plurality of SiPMs in the device such that each SiPM of the plurality of SiPMs comprises a respective set of one or more SPADs connected to a respective output of the plurality of outputs.
G01T 1/24 - Measuring radiation intensity with semiconductor detectors
G01S 17/90 - Lidar systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques
G01T 1/20 - Measuring radiation intensity with scintillation detectors
G01T 1/208 - Circuits specially adapted for scintillation detectors, e.g. for the photo-multiplier section
G01T 1/29 - Measurement performed on radiation beams, e.g. position or section of the beam; Measurement of spatial distribution of radiation
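The selective-connection scheme in the SiPM abstract amounts to a routing table: assigning each SPAD to one output implicitly defines one SiPM per output as the set of SPADs routed to it. The table contents below are illustrative; in hardware the routing would be done by the on-substrate electrical components rather than in software:

```python
# Sketch of configurable SiPM definition: route each SPAD to an output, then
# read back the implied SiPMs (one per output). Routing values are examples.

def define_sipms(spad_to_output):
    """Group SPADs by the output they are connected to; each group is a SiPM."""
    sipms = {}
    for spad, out in spad_to_output.items():
        sipms.setdefault(out, set()).add(spad)
    return sipms

# Five SPADs routed to two outputs define two SiPMs of sizes 2 and 3.
routing = {0: "out_a", 1: "out_a", 2: "out_b", 3: "out_b", 4: "out_b"}
sipms = define_sipms(routing)
```

Reconfiguring the device is then just rewriting the routing table, which is the flexibility the abstract claims.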
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for estimating a 3-D pose of an object of interest from image and point cloud data. In one aspect, a method includes obtaining an image of an environment; obtaining a point cloud of a three-dimensional region of the environment; generating a fused representation of the image and the point cloud; and processing the fused representation using a pose estimation neural network and in accordance with current values of a plurality of pose estimation network parameters to generate a pose estimation network output that specifies, for each of multiple keypoints, a respective estimated position in the three-dimensional region of the environment.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
16.
IDENTIFICATION OF REAL AND IMAGE SIGN DETECTIONS IN DRIVING APPLICATIONS
The described aspects and implementations enable efficient identification of real and image signs in autonomous vehicle (AV) applications. In one implementation, disclosed are a method and a system to perform the method that includes obtaining, using a sensing system of the AV, a combined image that includes a camera image and depth information for a region of an environment of the AV, classifying a first sign in the combined image as an image-true sign, performing a spatial validation of the first sign, which includes evaluation of a spatial relationship of the first sign and one or more objects in the region of the environment of the AV, and identifying, based on the performed spatial validation, the first sign as a real sign.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
Methods and devices for detecting traffic signals and their associated states are disclosed. In one embodiment, an example method includes scanning a target area using one or more sensors of a vehicle to obtain target area information. The vehicle may be configured to operate in an autonomous mode, and the target area may be a type of area where traffic signals are typically located. The method may also include detecting a traffic signal in the target area information, determining a location of the traffic signal, and determining a state of the traffic signal. Also, a confidence in the traffic signal may be determined. For example, the location of the traffic signal may be compared to known locations of traffic signals. Based on the state of the traffic signal and the confidence in the traffic signal, the vehicle may be controlled in the autonomous mode.
G06V 20/56 - Context or environment of the image exterior to a vehicle from on-board sensors
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
18.
Model for Excluding Vehicle from Sensor Field Of View
The technology relates to developing a highly accurate understanding of a vehicle's sensor fields of view in relation to the vehicle itself. A training phase is employed to gather sensor data in various situations and scenarios, and a modeling phase takes such information and identifies self-returns and other signals that should either be excluded from analysis during real-time driving or accounted for to avoid false positives. The result is a sensor field of view model for a particular vehicle, which can be extended to other similar makes and models of that vehicle. This approach enables a vehicle to determine whether sensor data is of the vehicle itself or of something else. As a result, the detailed modeling allows the on-board computing system to make driving decisions and take other actions based on accurate sensor information.
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. using automatic pilots
B60R 11/00 - Other arrangements for holding or mounting articles
B60R 11/04 - Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
B60W 10/04 - Conjoint control of vehicle sub-units of different type or different function, including control of propulsion units
B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function, including control of steering systems
H04Q 9/00 - Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation a desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
19.
OBJECT TRACKING ACROSS A WIDE RANGE OF DISTANCES FOR DRIVING APPLICATIONS
The described aspects and implementations enable efficient and seamless tracking of objects in vehicle environments using different sensing modalities across a wide range of distances. A perception system of a vehicle deploys an object tracking pipeline with a plurality of models that include a camera model trained to perform, using camera images, object tracking at distances exceeding a lidar sensing range, a lidar model trained to perform, using lidar images, object tracking at distances within the lidar sensing range, and a camera-lidar model trained to transfer, using the camera images and the lidar images, object tracking from the camera model to the lidar model.
G01S 17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
G05D 1/02 - Control of position or course in two dimensions
G06V 10/25 - Determination of a region of interest [ROI] or a volume of interest [VOI]
G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes, or intersections; Connectivity analysis, e.g. of connected components
G06V 20/56 - Context or environment of the image exterior to a vehicle from on-board sensors
Aspects of the disclosure relate to adjusting a virtual camera's orientation when a vehicle is making a turn. One or more computing devices may receive the vehicle's original heading prior to making the turn and the vehicle's current heading. Based on these headings, the one or more computing devices may determine the angle of the turn the vehicle is performing, determine a camera rotation angle, and adjust the virtual camera's orientation relative to the vehicle to an updated orientation by rotating the virtual camera by the camera rotation angle. The one or more computing devices may then generate a video corresponding to the virtual camera's updated orientation, and the video may be displayed on the display.
H04N 23/60 - Control of cameras or camera modules
B60R 1/27 - Real-time viewing arrangements for drivers or passengers using optical image-capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles, for viewing an area outside the vehicle, e.g. the exterior of the vehicle, with a predetermined field of view providing a panoramic view, e.g. using omnidirectional cameras
B60R 1/28 - Real-time viewing arrangements for drivers or passengers using optical image-capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles, for viewing an area outside the vehicle, e.g. the exterior of the vehicle, with an adjustable field of view
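The heading-based computation in the virtual-camera abstract can be sketched as a signed angle difference wrapped to (-180, 180] degrees, with the camera rotated by some function of that turn angle. The 0.5 damping factor below is an assumed smoothing choice for illustration, not taken from the disclosure:

```python
# Hedged sketch: turn angle from original vs. current heading, then a camera
# rotation angle derived from it. Damping factor is an assumed example.

def turn_angle(original_heading_deg, current_heading_deg):
    """Signed heading change, wrapped so 350 -> 20 degrees reads as +30."""
    return (current_heading_deg - original_heading_deg + 180.0) % 360.0 - 180.0

def camera_rotation(original_heading_deg, current_heading_deg, damping=0.5):
    """Rotate the virtual camera by a damped fraction of the turn angle."""
    return damping * turn_angle(original_heading_deg, current_heading_deg)

rot = camera_rotation(350.0, 20.0)  # a 30-degree right turn across north
```

The wrap step matters: naive subtraction across the 0/360 boundary would report a -330 degree turn and swing the camera the wrong way.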
Aspects and implementations of the present disclosure relate, generally, to optimization of autonomous vehicle (AV) technology and, more specifically, to a transfer hub for autonomous trucking operations. In one example, the disclosed techniques include obtaining, at a first time, one or more first images of an outside environment of an AV and shutting down a data processing system of the AV. The techniques further include initiating a starting-up of the data processing system of the AV and obtaining, at a second time, one or more second images of the outside environment of the AV. The techniques further include determining, based on a comparison of the one or more first images to the one or more second images, that the AV has not moved between the first time and the second time and completing the starting-up of the data processing system of the AV.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60S 5/02 - Supplying vehicles with fuel; General disposition of plant in filling stations
B60W 30/02 - Control of vehicle driving stability
B60W 30/06 - Automatic manoeuvring for parking
B67D 7/04 - Apparatus or devices for transferring liquids from bulk storage containers or reservoirs into vehicles or into portable containers, e.g. for retail sale purposes, for transferring fuels, lubricants, or mixtures thereof
22.
Predicting a Parking or Pullover Spot Vacancy for an Autonomous Vehicle Pickup
The technology involves pickups performed by autonomous vehicles. In particular, it includes identifying one or more potential pullover locations adjacent to an area of interest that an autonomous vehicle is approaching. The vehicle detects that a given one of the potential pullover locations is occupied by another vehicle and determines whether the other vehicle will be vacating the given pullover location within a selected amount of time. Upon determining that the other vehicle will be vacating the given pullover location within that timeframe, the vehicle determines whether to wait for the other vehicle to vacate it. A driving system of the vehicle then either performs a first action in order to wait for the other vehicle to vacate the given pullover location or, when it is determined not to wait, performs a second action that is different from the first action.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G08G 1/01 - Detecting movement of traffic to be counted or controlled
G08G 1/14 - Traffic control systems for road vehicles indicating individual free spaces in parking areas
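The wait-or-not decision in the pullover abstract can be sketched as a simple cost comparison: wait only when the predicted vacancy arrives within the selected time window and waiting is cheaper for the rider than falling back to an alternative spot. The particular inputs and rule below are illustrative assumptions:

```python
# Hypothetical pullover decision: wait for an occupied spot only if it frees
# up within the allowed window and beats the fallback. Rule is an assumption.

def should_wait(predicted_vacancy_s, wait_threshold_s, extra_walk_s_if_skip):
    """Wait if the spot frees up soon enough and skipping costs the rider more."""
    return (predicted_vacancy_s <= wait_threshold_s
            and predicted_vacancy_s < extra_walk_s_if_skip)

wait = should_wait(predicted_vacancy_s=20.0, wait_threshold_s=45.0,
                   extra_walk_s_if_skip=90.0)
skip = should_wait(predicted_vacancy_s=60.0, wait_threshold_s=45.0,
                   extra_walk_s_if_skip=90.0)
```

The two outcomes correspond to the abstract's "first action" (wait in place or loop) and "second action" (proceed to another location).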
23.
Systems, Apparatus, and Methods for Retrieving Image Data of Image Frames
At least one processor may be configured to receive a first image frame of a sequence of image frames from an image capture device and select a first portion of the first image frame. The at least one processor may also be configured to obtain alignment information and determine a first portion and a second portion of a second image frame based on the alignment information. Further, the at least one processor may be configured to determine a bounding region within the second image frame and fetch image data corresponding to the bounding region of the second image frame from memory. In some examples, the first image frame may comprise a base image and the second image frame may comprise an alternative image frame. Further, the first image frame may comprise any one of the image frames of the sequence of image frames.
G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
G06T 7/55 - Depth or shape recovery from multiple images
G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes, or intersections; Connectivity analysis, e.g. of connected components
24.
User Interface Techniques for Recommending Remote Assistance Actions
Example embodiments relate to user interface techniques for recommending remote assistance actions. A remote computing device may display a representation of the forward path for an autonomous vehicle based on sensor data received from the vehicle. The computing device may augment the representation of the forward path to further depict one or more proposed trajectories available for the autonomous vehicle to perform. Each proposed trajectory conveys one or more maneuvers positioned relative to road boundaries in the forward path. The computing device may receive a selection of a proposed trajectory from the one or more proposed trajectories available for the autonomous vehicle to perform and provide navigation instructions to the vehicle based on the proposed trajectory.
Aspects of the present disclosure relate to a vehicle for maneuvering an occupant of the vehicle to a destination autonomously as well as providing information about the vehicle and the vehicle's environment for display to the occupant. The information includes a representation of a scene depicting an external environment of the vehicle. The representation of the scene includes a visual representation of the vehicle and a visual representation of objects in an external environment of the vehicle.
B60K 35/28 - Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor, characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; Output arrangements characterised by the purpose of the output information, e.g. for attracting the attention of the driver
B60K 35/81 - Arrangements for controlling instruments for controlling displays
G01C 21/36 - Input/output arrangements for on-board computers
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G08G 1/015 - Detecting movement of traffic to be counted or controlled with provision for distinguishing between different types of vehicles, e.g. between motor cars and cycles
G08G 1/04 - Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
G08G 1/0962 - Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
09 - Scientific and electric apparatus and instruments
12 - Vehicles; apparatus for locomotion by land, air, or water; parts of vehicles
39 - Transport, packaging, and storage services; travel arrangement
42 - Scientific, technological, and industrial services, research and design
Goods and Services
Business consultancy; business advice; business training; business management; business planning; business administration; business support services; advertising services; transportation services, namely, operations management relating to vehicles; transportation services, namely, tracking, locating, and monitoring of vehicles for commercial purposes; transport in the nature of transportation logistics services, namely, arranging the transportation of goods for others; freight logistics management; monitoring, managing, and tracking of transportation of persons and delivery of goods and packages in transit, for business purposes; providing a website featuring information for business management of transportation logistics services, namely, providing information about planning, coordinating, and tracking the transportation of goods, freight, people, and conveyances; providing a website featuring information for transportation logistics management services, namely, about planning, coordinating, and tracking the transportation of people, and planning and scheduling shipments for users of transportation services; business data analysis; fleet management services in the nature of tracking, locating, and monitoring of fleet vehicles as well as vehicle fuel management, parking management, remote assistance management, and demand management, depot management, all for commercial purposes; advertising services; promoting the goods and services of others; arranging and providing discount programs that enable customers to obtain discounts on goods and services; administration of a customer loyalty program which provides discounted ride-hail rides; providing a website featuring information about discount and rewards programs; arranging and conducting incentive rewards programs to incentivize engagement with a ride-hailing platform; charitable services, namely, organizing and conducting volunteer programs and community service projects. 
Computer software; downloadable software; recorded software; computer hardware; computers; sensors; software for sensor operation; downloadable and recorded software for detecting and issuing notifications regarding vehicle maintenance needs; downloadable and recorded software for facilitating and assisting with vehicle maintenance remotely; downloadable and recorded software for navigating, driving, and directing a vehicle to receive fuelling and servicing; downloadable software for arranging, engaging, scheduling, managing, obtaining, booking, and coordinating travel, transportation, transportation services, ride-hailing, deliveries, and delivery services; downloadable software for the scheduling and dispatch of motorized vehicles; downloadable software for monitoring, managing, and tracking delivery of goods; downloadable software for requesting and ordering delivery services; downloadable software for planning, scheduling, controlling, monitoring, and providing information on transportation of assets and goods; downloadable software for tracking and providing information concerning pick-up and delivery of assets and goods in transit; downloadable software for accessing and providing online grocery and retail store services; downloadable software for providing and managing delivery of consumer goods, food, and groceries; downloadable real-time map software for tracking vehicles, trips, and deliveries; downloadable software for displaying transit routes; downloadable software for providing information on transportation and delivery services; downloadable software featuring information about food, grocery, and consumer products; downloadable software for users to administer, access, monitor, and manage loyalty programs and rewards; downloadable software for earning, tracking, and redeeming loyalty rewards, points, and discounts; downloadable software for issuing, setting up, distributing, and redeeming promotions, coupons, discounts, deals, vouchers, rebates, 
rewards, incentives, and special offers to customers; computer hardware and recorded software for the autonomous driving of motor vehicles; downloadable software in the nature of vehicle operating system software; downloadable software for autonomous vehicle operation, navigation, steering, calibration, and management; downloadable software for vehicle fleet management, namely, tracking fleet vehicles for commercial purposes; downloadable computer software for use as an application programming interface (API); downloadable and recorded software for vehicle fleet management and demand forecasting, vehicle charging, vehicle fuel monitoring, vehicle maintenance, vehicle depot management, vehicle parking management, and remote assistance with vehicles; recorded software and computer hardware for vehicle fleet launching, coordination, calibrating, direction, scheduling, booking, dispatching, and management; computer hardware and recorded software for use with vehicle cameras; recorded software for artificial intelligence (AI), machine learning, and deep learning; recorded software for artificial intelligence (AI), machine learning, and deep learning for data processing and contextual prediction, personalization, and predictive analytics; downloadable computer programs and downloadable software for artificial intelligence (AI), machine learning, and deep learning for use in connection with operating autonomous vehicles, systems, devices, and machinery; recorded software and computer hardware for use in connection with and for operating autonomous vehicles, systems, devices, and machinery; computer hardware and recorded software for operating vehicle cameras; downloadable computer software for use in operating and calibrating lidar; computer hardware for operating autonomous vehicles; navigational instruments for vehicles; laser object detectors; laser device for sensing objects; laser device for sensing outdoor terrain; audio detectors; laser device for sensing distance 
to objects in the nature of laser rangefinders; electric sensors for determining position, velocity, direction, and acceleration; perimeter sensors in the nature of sensors that measure the presence of objects in the environment and the speed, trajectory, and heading of objects; environmental sensors for measuring the presence of objects in the environment and the speed of objects; vehicle sensors, namely, environmental sensors for measuring the presence of objects in the environment and the speed of objects; sensors for determining position, velocity, direction, and acceleration; vehicle safety and detection equipment and hardware; safety and driving assistant systems comprised of sensors for determining position, velocity, direction, and acceleration of land vehicles; cameras; cameras for use with vehicles; downloadable data sets in the field of machine perception and autonomous driving technology.
Vehicles; self-driving transport vehicles; trucks; freight vehicles, namely, trucks and vans; shared transit vehicles; freight vehicles in the nature of land vehicles; autonomous land vehicles and structural parts thereof.
Transportation services; car rental services; truck rental services; rental of autonomous vehicles; truck transport; car transport; travel by land vehicles; transport of persons; transport of goods; delivery of goods; delivery by road; transportation and delivery services by autonomous vehicles; freight and cargo services; freight transportation; supply chain logistics and reverse logistics services, namely, storage, transportation, and delivery of goods for others; providing autonomous vehicle booking services; travel arrangement, namely, arranging time-based ride-hailing; transportation services, namely, coordinating the pickup and dropoff of passengers at designated or directed locations; transportation management services for others, namely, planning, coordinating, and tracking the transportation of people and conveyances; providing transportation information; providing a website featuring information regarding autonomous car transportation and delivery services and scheduling transportation services; providing a website featuring information in the field of transportation; providing a website featuring information regarding transportation services and bookings for transportation services; providing a website featuring information regarding delivery services and bookings for delivery services; providing a website featuring information about transportation management services in the nature of transportation of goods, namely, about planning, coordinating, and tracking the transportation of conveyances; charitable services, namely, transportation and delivery services by road. 
Software as a service (SaaS) services; platform as a service (PaaS) services; online non-downloadable software; technical support services; software as a service (SaaS) services featuring software for sensor operation; platform as a service (PaaS) services featuring software for sensor operation; online non-downloadable software for detecting and issuing notifications regarding vehicle maintenance needs; online non-downloadable software for facilitating and assisting with vehicle maintenance remotely; online non-downloadable software for navigating, driving, and directing a vehicle to receive fuelling and servicing; online non-downloadable software for sensor operation; online non-downloadable software for arranging, engaging, scheduling, managing, obtaining, booking, and coordinating travel, transportation, transportation services, ride-hailing, deliveries, and delivery services; online non-downloadable software for tracking, locating, and monitoring vehicles; online non-downloadable software for coordinating the transport and delivery of goods; online non-downloadable software for arranging, procuring, scheduling, engaging, coordinating, managing, and booking transportation and deliveries; online non-downloadable software for providing and managing delivery services; online non-downloadable software for providing and managing delivery of consumer goods, food, and groceries; online non-downloadable software for accessing and viewing transit information, schedules, routes, and prices; providing temporary use of online non-downloadable real-time map software for tracking vehicles and deliveries; providing a website featuring online non-downloadable software that enables users to request transportation; providing temporary use of online non-downloadable computer software for identifying trip delays and vehicle location; providing temporary use of online non-downloadable software for accessing transportation services, bookings for transportation services and
dispatching motorized vehicles; online non-downloadable software for issuing, setting up, distributing, redeeming, and accessing promotions, coupons, discounts, deals, vouchers, rebates, rewards, incentives, and special offers; online non-downloadable software for vehicle fleet management and demand forecasting, vehicle charging, vehicle fuel monitoring, vehicle maintenance, vehicle depot management, vehicle parking management, and remote assistance with vehicles; software as a service (SaaS) services featuring online non-downloadable computer software for use as an application programming interface (API); online non-downloadable software for vehicle coordination, navigation, calibrating, direction, and management for use with vehicle on-board computers; software as a service (SaaS) services featuring software for vehicle coordination, navigation, calibrating, direction, and management for use with vehicle on-board computers; online non-downloadable software for analyzing transportation and deliveries; electronic monitoring and reporting of transportation data using computers or sensors; online non-downloadable software for the autonomous driving of motor vehicles; online non-downloadable software for autonomous vehicle navigation, steering, calibration, and management; online non-downloadable software for visualization, manipulation, and integration of digital graphics and images; online non-downloadable software for artificial intelligence, machine learning, and deep learning; online non-downloadable software for use in operating and calibrating lidar; online non-downloadable software used for data analytics in the field of transportation; online non-downloadable software used for data analytics in the field of transportation fleet management; online non-downloadable open source software for use in data management; land and road surveying; surveying services and data collection and analysis in connection therewith; mapping services; online non-downloadable 
software for accessing location, GPS, and motion sensor data for safety and emergency response purposes; online non-downloadable software for emergency assistance; online non-downloadable software for safety and incident detection; providing data sets in the field of machine perception and autonomous driving technology; providing information about autonomous-vehicle and machine-perception research via a website; research, design, and development in the field of artificial intelligence; research, design, and development in the field of autonomous technology; research, design, and development in the field of perception and motion prediction; research, design, and development of computer hardware and software for use with vehicle on-board computers for monitoring and controlling motor vehicle operation; installation, updating, and maintenance of computer hardware and software for use with vehicle on-board computers for monitoring and controlling motor vehicle operation; research, design, and development of computer hardware and software for vehicle coordination, navigation, calibrating, direction, and management; research, design, and development of sensors and structural parts thereof; research, design, and development of lasers for sensing objects and distance to objects, lasers for sensing indoor and outdoor terrain, lasers for measuring purposes, laser measuring systems, lidar (light detection and ranging apparatus), and laser equipment; advanced product research, design, and development in the field of artificial intelligence in connection with autonomous vehicles; design and development of computer hardware and software; research, design, and development of vehicle software; technological, scientific and research services in the field of robotics, self-driving car and autonomous vehicle technology; providing virtual computer systems and environments through cloud computing for the purpose of training self-driving cars, autonomous vehicles and robots; virtual
testing of self-driving cars, autonomous vehicles and robots using computer simulations; creation, development, programming and implementation of simulation software in the field of self-driving cars, autonomous vehicles and robots; creation of simulation programs for autonomous vehicles.
27.
System and method of providing recommendations to users of vehicles
A system and method are arranged to provide recommendations to a user of a vehicle. In one aspect, the vehicle navigates in an autonomous mode, and the recommendations are based on the location of the vehicle and on output from sensors directed to the environment surrounding the vehicle. In further aspects, both current and previous sensor data are used to make the recommendations, as well as data based on the sensors of other vehicles.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B60R 1/00 - Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
B60T 7/22 - Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger, initiated by contact of vehicle, e.g. bumper, with an external object, e.g. another vehicle
B60T 8/00 - Arrangements for adjusting wheel-braking force to meet varying vehicular or ground-surface conditions, e.g. limiting or varying distribution of braking force
B60T 8/17 - Using electrical or electronic regulation means to control braking
B60T 8/88 - Arrangements for adjusting wheel-braking force to meet varying vehicular or ground-surface conditions, e.g. limiting or varying distribution of braking force, responsive to a speed condition, e.g. acceleration or deceleration, with means responsive to malfunction, i.e. means for detecting and indicating faulty operation of the speed condition sensing means
B60T 17/22 - Devices for monitoring or checking brake systems; Signal devices
B60W 30/08 - Predicting or avoiding probable or impending collision
B60W 40/06 - Calculation or estimation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, related to ambient conditions, related to road conditions
B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
G01C 21/36 - Input/output arrangements for on-board computers
G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
G01S 17/931 - Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
G06Q 10/02 - Reservations, e.g. for tickets, services or events
G06Q 30/02 - Marketing; Price estimation or determination; Fundraising
G06Q 30/0207 - Discounts or incentives, e.g. coupons or rebates
G06T 7/223 - Motion analysis using block-matching
G06T 7/231 - Motion analysis using block-matching using full search
G06T 7/521 - Depth or shape recovery from laser ranging, e.g. using interferometry; Depth or shape recovery from the projection of structured light
G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G07C 9/00 - Individual registration on entry or exit
B60W 30/186 - Excessive wear or deterioration of friction elements, e.g. clutches
B60W 50/029 - Adapting to failures or working around failures with alternative solutions, e.g. avoiding the use of failed parts
B62D 6/00 - Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
Aspects of the disclosure relate to controlling a vehicle. For instance, using a camera, a first camera image including a first object may be captured, and a first bounding box for the first object and a distance to the first object may be identified. A second camera image including a second object may be captured, and a second bounding box for the second object and a distance to the second object may be identified. Whether the first object is the second object may be determined using a plurality of models: a first model to compare visual similarity of the two bounding boxes, a second model to compare a three-dimensional location based on the distance to the first object with a three-dimensional location based on the distance to the second object, and a third model to compare results from the first and second models. The vehicle may be controlled in an autonomous driving mode based on a result of the third model.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
G01S 13/931 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 17/931 - Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
G06F 18/22 - Matching criteria, e.g. proximity measures
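The three-model comparison in the abstract above can be sketched as follows. The concrete similarity measures are stand-ins labeled as such (a deployed system would use learned appearance models); only the structure — two models whose results a third model combines — follows the abstract:

```python
import math

def visual_similarity(box_a, box_b):
    """First model (stand-in): compare bounding-box area and aspect ratio.
    A real system would compare learned appearance features."""
    wa, ha = box_a["w"], box_a["h"]
    wb, hb = box_b["w"], box_b["h"]
    area_ratio = min(wa * ha, wb * hb) / max(wa * ha, wb * hb)
    aspect_ratio = min(wa / ha, wb / hb) / max(wa / ha, wb / hb)
    return 0.5 * (area_ratio + aspect_ratio)  # in (0, 1]

def location_similarity(p_a, p_b, scale=5.0):
    """Second model: compare the 3D locations recovered from the two
    camera distances; nearby points score close to 1."""
    return math.exp(-math.dist(p_a, p_b) / scale)

def same_object(box_a, p_a, box_b, p_b, threshold=0.6):
    """Third model (stand-in): combine both scores and threshold."""
    score = 0.5 * visual_similarity(box_a, box_b) + \
            0.5 * location_similarity(p_a, p_b)
    return score >= threshold

# Same-sized boxes at nearly the same 3D point: likely the same object.
match = same_object({"w": 2, "h": 1}, (0.0, 0.0, 10.0),
                    {"w": 2, "h": 1}, (0.5, 0.0, 10.0))
# Different shape, far apart: likely different objects.
no_match = same_object({"w": 2, "h": 1}, (0.0, 0.0, 10.0),
                       {"w": 1, "h": 3}, (30.0, 0.0, 40.0))
```

The `threshold` and `scale` values are illustrative tuning parameters.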
29.
Synchronized Spinning LIDAR and Rolling Shutter Camera System
One example system comprises a LIDAR sensor that rotates about an axis to scan an environment of the LIDAR sensor. The system also comprises one or more cameras that detect external light originating from one or more external light sources. The one or more cameras together provide a plurality of rows of sensing elements. The rows of sensing elements are aligned with the axis of rotation of the LIDAR sensor. The system also comprises a controller that operates the one or more cameras to obtain a sequence of image pixel rows. A first image pixel row in the sequence is indicative of external light detected by a first row of sensing elements during a first exposure time period. A second image pixel row in the sequence is indicative of external light detected by a second row of sensing elements during a second exposure time period.
H04N 25/531 - Commande du temps d'intégration en commandant des obturateurs déroulants dans un capteur SSIS CMOS
G01S 17/42 - Mesure simultanée de la distance et d'autres coordonnées
G01S 17/86 - Combinaisons de systèmes lidar avec des systèmes autres que lidar, radar ou sonar, p. ex. avec des goniomètres
G01S 17/89 - Systèmes lidar, spécialement adaptés pour des applications spécifiques pour la cartographie ou l'imagerie
G06T 7/521 - Récupération de la profondeur ou de la forme à partir de la télémétrie laser, p. ex. par interférométrieRécupération de la profondeur ou de la forme à partir de la projection de lumière structurée
G06T 7/90 - Détermination de caractéristiques de couleur
H04N 23/45 - Caméras ou modules de caméras comprenant des capteurs d'images électroniquesLeur commande pour générer des signaux d'image à partir de plusieurs capteurs d'image de type différent ou fonctionnant dans des modes différents, p. ex. avec un capteur CMOS pour les images en mouvement en combinaison avec un dispositif à couplage de charge [CCD] pour les images fixes
H04N 23/698 - Commande des caméras ou des modules de caméras pour obtenir un champ de vision élargi, p. ex. pour la capture d'images panoramiques
H04N 23/71 - Circuits d'évaluation de la variation de luminosité
H04N 23/73 - Circuits de compensation de la variation de luminosité dans la scène en influençant le temps d'exposition
H04N 23/90 - Agencement de caméras ou de modules de caméras, p. ex. de plusieurs caméras dans des studios de télévision ou des stades de sport
H04N 25/40 - Extraction de données de pixels provenant d'un capteur d'images en agissant sur les circuits de balayage, p. ex. en modifiant le nombre de pixels ayant été échantillonnés ou à échantillonner
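The row-by-row synchronization in the LIDAR/rolling-shutter entry above amounts to scheduling each row's exposure around the moment the rotating beam sweeps past that row's azimuth. A minimal sketch, assuming a constant rotation rate, at least two rows, and a camera FOV given as start/end azimuths (all parameters illustrative):

```python
def schedule_row_exposures(num_rows, fov_start_deg, fov_end_deg,
                           lidar_period_s, exposure_s):
    """For each row of sensing elements (aligned with the LIDAR's rotation
    axis), compute an exposure start time so the middle of the row's
    exposure coincides with the moment the rotating LIDAR beam crosses
    that row's azimuth. Assumes num_rows >= 2 and constant rotation."""
    deg_per_s = 360.0 / lidar_period_s
    starts = []
    for row in range(num_rows):
        # Azimuth of this row within the camera field of view.
        az = fov_start_deg + (fov_end_deg - fov_start_deg) * row / (num_rows - 1)
        crossing_t = az / deg_per_s  # when the beam points at this azimuth
        starts.append(crossing_t - exposure_s / 2.0)
    return starts

# 10 Hz LIDAR, 4 rows spanning a 36-degree FOV, 1 ms exposures.
starts = schedule_row_exposures(4, 0.0, 36.0, 0.1, 0.001)
```

Successive start times are offset by a fixed amount, which is exactly the staggered readout a rolling shutter provides.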
30.
QUALITY SCORING FOR PULLOVERS FOR AUTONOMOUS VEHICLES
Aspects of the disclosure relate to evaluating quality of a predetermined pullover location for an autonomous vehicle. For instance, a plurality of inputs for the predetermined pullover location may be received. The plurality of inputs may each include a value representative of a characteristic of the predetermined pullover location. The plurality of inputs may be combined to determine a pullover quality value for the predetermined pullover location. The pullover quality value may be provided to a vehicle in order to enable the vehicle to select a pullover location for the vehicle.
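The combination of per-characteristic inputs into a single pullover quality value can be sketched as a weighted average. The characteristic names, weights, and normalization to [0, 1] are illustrative assumptions, not the patented formula:

```python
def pullover_quality(inputs, weights):
    """Combine per-characteristic values (each normalized to [0, 1]) into
    one quality value via a weighted average over the supplied inputs."""
    total_w = sum(weights[k] for k in inputs)
    return sum(inputs[k] * weights[k] for k in inputs) / total_w

weights = {"distance_to_destination": 0.4, "lane_clearance": 0.3,
           "visibility": 0.2, "curb_access": 0.1}
candidates = {
    "spot_a": {"distance_to_destination": 0.9, "lane_clearance": 0.8,
               "visibility": 0.7, "curb_access": 1.0},
    "spot_b": {"distance_to_destination": 0.5, "lane_clearance": 0.9,
               "visibility": 0.9, "curb_access": 0.2},
}
# The vehicle selects the predetermined pullover location scoring highest.
best = max(candidates, key=lambda s: pullover_quality(candidates[s], weights))
```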
Example systems and methods enable an autonomous vehicle to request assistance from a remote operator when the vehicle's confidence in operation is low. One example method includes operating an autonomous vehicle in a first autonomous mode. The method may also include identifying a situation where a level of confidence of an autonomous operation in the first autonomous mode is below a threshold level. The method may further include sending a request for assistance to a remote assistor, the request including sensor data representative of a portion of an environment of the autonomous vehicle. The method may additionally include receiving a response from the remote assistor, the response indicating a second autonomous mode of operation. The method may also include causing the autonomous vehicle to operate in the second autonomous mode of operation in accordance with the response from the remote assistor.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B60W 50/02 - Details of road vehicle drive control systems not related to the control of a particular sub-unit, for ensuring safety in case of a failure of the drive control system, e.g. by diagnosing or mitigating a malfunction
H04L 67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
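The confidence-threshold flow in the entry above — operate in a first mode, request assistance when confidence drops, adopt the mode the remote assistor returns — can be sketched as follows. The threshold value, mode names, and request/response shapes are illustrative assumptions:

```python
CONFIDENCE_THRESHOLD = 0.7  # illustrative threshold

def maybe_request_assistance(confidence, sensor_snapshot, remote_assistor):
    """Stay in the first autonomous mode while confidence is adequate;
    otherwise send the remote assistor a request carrying sensor data and
    switch to the second mode indicated by the response."""
    if confidence >= CONFIDENCE_THRESHOLD:
        return "first_mode"
    response = remote_assistor({"sensor_data": sensor_snapshot})
    return response["mode"]

def stub_assistor(request):
    """Stand-in remote assistor that always recommends a cautious mode."""
    return {"mode": "cautious_mode"}

mode = maybe_request_assistance(0.4, {"lidar": "..."}, stub_assistor)
```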
One example method of testing an electrical device comprises transmitting a data pattern to a memory device of the electrical device by a controller of the electrical device to provide a written data pattern to the memory device, wherein the data pattern replicates a resonant frequency of at least a portion of the electrical device, reading the written data pattern from the memory device with the controller, and comparing the written data pattern to the data pattern.
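The write/read/compare test described above can be sketched in Python. The way the pattern "replicates a resonant frequency" is modeled here as a word sequence that toggles between all-zeros and all-ones at roughly that frequency, assuming one word per bus-clock cycle; the memory device is a minimal stand-in, and all parameters are illustrative:

```python
def resonant_pattern(num_words, bus_rate_hz, resonant_hz, word_bits=16):
    """Build a word sequence alternating between all-zeros and all-ones at
    approximately the given resonant frequency (one word per bus cycle)."""
    words_per_half_cycle = max(1, round(bus_rate_hz / (2 * resonant_hz)))
    ones = (1 << word_bits) - 1
    return [ones if (i // words_per_half_cycle) % 2 else 0
            for i in range(num_words)]

class MemoryDevice:
    """Minimal stand-in for the memory device under test."""
    def __init__(self, size):
        self.cells = [0] * size
    def write(self, pattern):
        self.cells[:len(pattern)] = pattern
    def read(self, n):
        return self.cells[:n]

# Controller writes the pattern, reads it back, and compares.
pattern = resonant_pattern(num_words=8, bus_rate_hz=100e6, resonant_hz=12.5e6)
mem = MemoryDevice(8)
mem.write(pattern)
passed = mem.read(8) == pattern
```

A mismatch between the written and read-back pattern would indicate a fault excited by driving the device near its resonance.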
Example embodiments relate to real-time health monitoring for automotive radars. A computing device may receive radar data from multiple radar units that have partially overlapping fields of view and detect a target object located such that the radar units both capture measurements of the target object. The computing device may determine a power level representing the target object for radar data from each radar unit, adjust these power levels, and determine a power difference between them. When the power difference exceeds a threshold power difference, the computing device may perform a calibration process to decrease the power difference below the threshold power difference or alert the vehicle, including onboard algorithms, to the reduced performance of the radar.
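The power-difference check at the core of the radar health monitoring described above can be sketched as follows, assuming the two units' power levels for the shared target have already been adjusted; the 3 dB threshold and the status labels are illustrative assumptions:

```python
def check_radar_health(power_a_db, power_b_db, threshold_db=3.0):
    """Compare the adjusted power levels two overlapping radar units
    report for the same target; flag when the difference exceeds the
    threshold so calibration or an alert can follow."""
    diff = abs(power_a_db - power_b_db)
    if diff <= threshold_db:
        return {"status": "healthy", "difference_db": diff}
    return {"status": "calibrate_or_alert", "difference_db": diff}

# A 7.5 dB discrepancy between the two units exceeds the threshold.
result = check_radar_health(-40.0, -47.5)
```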
An example method includes receiving, from one or more sensors associated with an autonomous vehicle, sensor data associated with a target object in an environment of the vehicle during a first environmental condition, where at least one sensor of the one or more sensors is configurable to be associated with one of a plurality of operating field of view volumes. The method also includes, based on the sensor data, determining at least one parameter associated with the target object. The method also includes determining a degradation in the at least one parameter between the sensor data and past sensor data, where the past sensor data is associated with the target object in the environment during a second environmental condition different from the first. The method further includes, based on the degradation, adjusting the operating field of view volume of the at least one sensor to a different one of the plurality of operating field of view volumes.
G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
G01S 17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
G01S 17/931 - Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
G01W 1/06 - Instruments for indicating weather conditions by measuring two or more variables, e.g. humidity, pressure, temperature, cloud cover or wind speed, giving a combined indication of weather conditions
G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
G06V 10/88 - Image or video recognition using optical means, e.g. reference filters, holographic masks, frequency domain filters or spatial domain filters
G06V 20/56 - Context or environment of the image exterior to a vehicle from on-board sensors
G08G 1/01 - Detecting movement of traffic to be counted or controlled
G08G 1/04 - Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
G08G 1/048 - Detecting movement of traffic to be counted or controlled with provision for compensation of environmental or other conditions, e.g. snow, vehicle stopped at detector
H04N 23/69 - Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
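The degradation-driven adjustment described in the entry above can be sketched as stepping through a discrete set of operating field of view volumes. The volume labels, the detection-range parameter, and the 25% degradation limit are illustrative assumptions:

```python
# Discrete operating field-of-view volumes, widest to narrowest
# (illustrative labels).
FOV_VOLUMES = ["long_range", "mid_range", "short_range"]

def adjust_fov(current_volume, parameter_now, parameter_past,
               degradation_limit=0.25):
    """If a target-object parameter (e.g. effective detection range) has
    degraded by more than the limit relative to the earlier environmental
    condition, step the sensor down to the next narrower volume."""
    degradation = (parameter_past - parameter_now) / parameter_past
    if degradation <= degradation_limit:
        return current_volume
    idx = FOV_VOLUMES.index(current_volume)
    return FOV_VOLUMES[min(idx + 1, len(FOV_VOLUMES) - 1)]
```

For example, a drop from a 100 m to a 60 m effective range (40% degradation) would move a sensor from the widest volume to the next narrower one.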
37.
Methods and Systems for Dithering Active Sensor Pulse Emissions
One example device comprises a plurality of emitters including at least a first emitter and a second emitter. The first emitter emits light that illuminates a first portion of a field-of-view (FOV) of the device. The second emitter emits light that illuminates a second portion of the FOV. The device also comprises a controller that obtains a scan of the FOV. The controller causes each emitter of the plurality of emitters to emit a respective light pulse during an emission time period associated with the scan. The controller causes the first emitter to emit a first-emitter light pulse at a first-emitter time offset from a start time of the emission time period. The controller causes the second emitter to emit a second-emitter light pulse at a second-emitter time offset from the start time of the emission time period.
G01S 17/10 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
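The per-emitter time offsets described in the entry above can be sketched as a dither schedule drawn within the emission time period. The uniform pseudo-random draw and the seed are illustrative choices; the patent only requires that each emitter fire at its own offset from the period's start time:

```python
import random

def dithered_offsets(num_emitters, emission_period_s, seed=0):
    """Assign each emitter a pseudo-random time offset (a dither) from
    the start of the emission time period, so emitters do not all fire
    simultaneously within a scan. Seeded for reproducibility."""
    rng = random.Random(seed)
    return [rng.uniform(0.0, emission_period_s) for _ in range(num_emitters)]

offsets = dithered_offsets(num_emitters=4, emission_period_s=1e-6)
# Each emitter i fires its light pulse at start_time + offsets[i].
```

Staggering emissions this way helps disambiguate returns, since echoes from different emitters arrive with distinct timing signatures.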
38.
DETERMINING AND RESPONDING TO AN INTERNAL STATUS OF A VEHICLE
Aspects of the disclosure relate to determining and responding to an internal state of a self-driving vehicle. For instance, an image of an interior of the vehicle captured by a camera mounted in the vehicle is received. The image is processed in order to identify one or more visible markers at predetermined locations within the vehicle. The internal state of the vehicle is determined based on the identified one or more visible markers. A responsive action is identified using the determined internal state, and the vehicle is controlled in order to perform the responsive action.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B60K 28/04 - Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions, responsive to conditions relating to the driver, responsive to presence or absence of the driver, e.g. to weight or lack thereof
B60N 2/00 - Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
B60Q 9/00 - Arrangement or adaptation of signal devices not provided for in one of the main groups
G01C 21/36 - Input/output arrangements for on-board computers
Aspects of the disclosure provide a method of generating and following planned trajectories for an autonomous vehicle. For instance, a baseline for a planned trajectory that the autonomous vehicle can use to follow a route to a destination may be determined. A stopping point corresponding to a traffic control that will cause the autonomous vehicle to come to a stop using the baseline may be determined. Sensor data identifying objects and their locations may be received. A plurality of constraints may be generated based on the sensor data. A planned trajectory may be generated using the baseline, the stopping point, and the plurality of constraints, wherein constraints beyond the stopping point are ignored.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
B60W 30/08 - Predicting or avoiding probable or impending collision
G01C 21/36 - Input/output arrangements for on-board computers
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
40.
Automated Generation of Radar Interference Reduction Training Data for Autonomous Vehicle Systems
Example embodiments relate to methods and systems for automated generation of radar interference reduction training data for autonomous vehicles. In an example, a computing device causes a radar unit to transmit radar signals in an environment of a vehicle. The computing device may include a model trained based on a labeled interferer dataset that represents interferer signals generated by an emitter located remote from the vehicle. The interferer signals are based on one or more radar signal parameter models. The computing device may use the model to determine whether received electromagnetic energy corresponds to transmitted radar signals or an interferer signal. Based on determining that the electromagnetic energy corresponds to the transmitted radar signals, the computing device may generate a representation of the environment of the vehicle using the electromagnetic energy.
Aspects of the disclosure relate to methods for controlling a vehicle having an autonomous driving mode. For instance, sensor data may be received from one or more sensors of the perception system of the vehicle, the sensor data identifying characteristics of an object perceived by the perception system. When it is determined that the object is no longer being perceived by the one or more sensors of the perception system, predicted characteristics for the object may be generated based on one or more of the identified characteristics. The predicted characteristics of the object may be used to control the vehicle in the autonomous driving mode such that the vehicle is able to respond to the object when it is determined that the object is no longer being perceived by the one or more sensors of the perception system.
One example method involves rotating a housing of a light detection and ranging (LIDAR) device about a first axis. The housing includes a first optical window and a second optical window. The method also involves transmitting a first plurality of light pulses through the first optical window to obtain a first scan of a field-of-view (FOV) of the LIDAR device. The method also involves transmitting a second plurality of light pulses through the second optical window to obtain a second scan of the FOV. The method also involves identifying, based on the first scan and the second scan, a portion of the FOV that is at least partially occluded by an occlusion.
The technology involves identifying suitable pickup and drop-off locations based on detected pedestrian walking paths. Mapped areas have specific physical configurations, which may suggest places to pick up or drop off a rider (or a delivery). A walking path heatmap can be generated based on historical and/or real-time pedestrian-related information, which can be obtained by autonomous vehicles driving in areas of interest. Incorporating heatmap information into the evaluation, the system identifies locations for optimized pickup or drop-off in accordance with where pedestrians would likely go. One aspect involves classifying different objects, for instance identifying one or more objects as people who may be walking versus riding a bicycle. Once classified, information about the paths is used to obtain the heatmap associated with the walking paths.
A control system for an autonomous vehicle is configured to pick up a passenger at a pickup location. The autonomous vehicle includes a self-driving system and one or more computing devices in communication with the self-driving system. The one or more computing devices are configured to receive a trip request including a pickup location and a destination location, the trip request being associated with a client device; cause the self-driving system to navigate the autonomous vehicle to the pickup location; send a prompt to the client device to collect semantic information; determine a specified location based on the semantic information received in response to the prompt; identify one or more semantic markers for the specified location from the semantic information; and send a message to the client device identifying the specified location using the one or more semantic markers.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G06Q 10/02 - Reservations, e.g. for tickets, services or events
G06Q 50/40 - Business processes related to the transport industry
G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
G06V 20/56 - Context or environment of the image exterior to a vehicle from on-board sensors
G08G 1/123 - Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles
45.
END-TO-END TRAINABLE ADVANCED DRIVER-ASSISTANCE SYSTEMS
The disclosed systems and techniques are directed to computationally efficient detection and tracking of objects in driving environments. The techniques include generating a set of auto-labeled training data using a first autonomous vehicle (AV) system having multiple sensor modalities. The auto-labeled training data includes a first set of non-lidar data associated with the first AV system, and one or more target predictions for the first set of non-lidar data, said predictions generated based at least on lidar data associated with the first AV system. The techniques further include training, by the processing device and using the auto-labeled training data, an end-to-end perception model of a second AV system lacking a lidar sensor to predict presence of one or more objects in a driving environment of the second AV system.
G06V 10/774 - Generation of training sets; Processing of image or video features in feature spaces; Arrangements for image or video recognition or understanding using pattern recognition or machine learning using data integration or data reduction, e.g. principal component analysis [PCA], independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation; Bootstrap methods, e.g. bagging or boosting
G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for determining a task schedule for generating prediction data for different agents. In one aspect, a method comprises receiving data that characterizes an environment in a vicinity of a vehicle at a current time step, the environment comprising a plurality of agents; receiving data that identifies high-priority agents for which respective data characterizing the agents must be generated at the current time step; identifying available computing resources at the current time step; processing the data that characterizes the environment using a complexity scoring model to determine a respective complexity score for each of the high-priority agents; and determining a schedule for the current time step that allocates the generation of the data characterizing the high-priority agents across the available computing resources based on the complexity scores.
G06F 9/48 - Program initiating; Program switching, e.g. by interrupt
B60K 31/00 - Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining a speed at a particular velocity selected by the vehicle operator
G06F 9/38 - Concurrent instruction execution, e.g. pipeline or look-ahead
G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU]
G06N 5/00 - Computing arrangements using knowledge-based models
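The scheduling step in the abstract above can be sketched as follows, assuming a greedy least-loaded (longest-processing-time) policy; the abstract only says the schedule allocates prediction work across available resources based on complexity scores, so the policy and the score values here are assumptions.

```python
import heapq

def make_schedule(scores, num_resources):
    """Greedy least-loaded assignment: the most complex high-priority
    agents are placed first, each on the resource whose total assigned
    complexity is currently smallest."""
    heap = [(0.0, r) for r in range(num_resources)]  # (load, resource id)
    heapq.heapify(heap)
    assignment = {}
    for agent in sorted(scores, key=scores.get, reverse=True):
        load, res = heapq.heappop(heap)
        assignment[agent] = res
        heapq.heappush(heap, (load + scores[agent], res))
    return assignment

# Complexity scores for three high-priority agents (assumed values).
scores = {"cyclist": 3.0, "truck": 1.0, "pedestrian": 2.0}
plan = make_schedule(scores, num_resources=2)
```

With two resources, the heaviest agent gets a resource to itself while the two lighter agents share the other, balancing total load.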
The subject matter of this specification relates to a light detection and ranging (LiDAR) system. In at least one implementation, the LiDAR system comprises a first signal source; a second signal source; a combiner to generate a hybrid transmission signal from signals generated by the first signal source and the second signal source; a first photodetector to measure a first component of a reflection signal related to range of a target; and a second photodetector to measure a second component of the reflection signal related to velocity of the target, wherein the system is configured to derive the range and velocity of the target from the first component and the second component, respectively.
G01S 17/26 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein the transmitted pulses use a frequency-modulated or phase-modulated carrier wave, e.g. for pulse compression of received signals
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G01S 17/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
G01S 17/931 - Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
48.
Methods for Localizing Light Detection and Ranging (Lidar) Calibration Targets
Example embodiments relate to methods for localizing light detection and ranging (lidar) calibration targets. An example method includes generating a point cloud of a region based on data from a light detection and ranging (lidar) device. The point cloud may include points representing at least a portion of a calibration target. The method also includes determining a presumed location of the calibration target. Further, the method includes identifying, within the point cloud, a location of a first edge of the calibration target. In addition, the method includes performing a comparison between the identified location of the first edge of the calibration target and a hypothetical location of the first edge of the calibration target within the point cloud if the calibration target were positioned at the presumed location. Still further, the method includes revising the presumed location of the calibration target based on at least the comparison.
B60W 50/06 - Details of driver assistance systems for road vehicles not related to the control of a particular sub-unit, for improving the dynamic response of the driving assistance system, e.g. improving the speed of regulation or avoiding overshoot or instability
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
G08G 1/00 - Traffic control systems for road vehicles
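The comparison-and-revision step in the abstract above can be sketched as a single update: the hypothetical edge location is where the first edge would sit if the target were at the presumed location, and the residual against the identified edge shifts the estimate. The 2-D geometry, the `edge_offset`, and the damping `gain` are illustrative assumptions.

```python
import numpy as np

def revise_presumed_location(presumed, edge_offset, identified_edge, gain=1.0):
    """One revision step: compare the edge location identified in the
    point cloud against the hypothetical edge location implied by the
    presumed target location, then shift the presumed location by the
    residual (scaled by a damping gain)."""
    hypothetical_edge = presumed + edge_offset
    residual = identified_edge - hypothetical_edge
    return presumed + gain * residual

presumed = np.array([10.0, 5.0])      # presumed target location (m)
edge_offset = np.array([0.5, 0.0])    # first edge relative to the target
identified = np.array([10.8, 5.1])    # edge located in the point cloud
revised = revise_presumed_location(presumed, edge_offset, identified)
```

In practice such an update could be iterated, or run over several edges, until the residual falls below a tolerance.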
49.
Independently Actuated Wheel Sets for Large Autonomous Self-Driving Vehicles
The technology relates to fine maneuver control of large autonomous vehicles that employ multiple sets of independently actuated wheels. The control is able to optimize the turning radius, effectively negotiate curves and turns, and clear static objects of varying heights. Each wheel or wheel set is configured to adjust individually via control of an on-board computer system. Received sensor data and a physical model of the vehicle can be used for route planning and for selecting maneuver operations in accordance with the additional degrees of freedom provided by the independently actuated wheels. This can include making turns, moving into or out of parking spaces, and driving along narrow or congested roads, through construction zones, past loading docks, etc. A given maneuver may include maintaining a minimum threshold distance from a neighboring vehicle or other object.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B62D 7/14 - Steering linkage; Stub axles or their mountings for individually-pivoted wheels, e.g. on king-pins, with the pivotal axes situated in more than one plane transverse to the longitudinal centre line of the vehicle, e.g. all-wheel steering
B62D 7/15 - Steering linkage; Stub axles or their mountings for individually-pivoted wheels, e.g. on king-pins, with the pivotal axes situated in more than one plane transverse to the longitudinal centre line of the vehicle, e.g. all-wheel steering, characterised by means varying the ratio between the steering angles of the steered wheels
50.
REAL-TIME LANE CHANGE SELECTION FOR AUTONOMOUS VEHICLES
Aspects of the disclosure relate to routing an autonomous vehicle. For instance, the vehicle may be maneuvered along a route in a first lane using map information identifying a first plurality of nodes representing locations within the first lane and a second plurality of nodes representing locations within a second lane different from the first lane. While maneuvering, when the vehicle should make a lane change may be determined by assessing a cost of connecting a first node of the first plurality of nodes with a second node of a second plurality of nodes. The assessment may be used to make the lane change from the first lane to the second lane.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B60W 40/02 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, related to ambient conditions
G01C 21/34 - Route searching; Route guidance
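The node-connection cost assessment in the abstract above might look like the following sketch. The cost function (longitudinal distance plus a penalized lateral term, with nodes behind the vehicle excluded) and the node coordinates are assumptions, not the disclosed cost.

```python
import math

def connection_cost(node_a, node_b, lateral_weight=2.0):
    """Hypothetical cost of connecting a node in the current lane to a
    node in the target lane: longitudinal distance plus a weighted
    lateral displacement; connections to nodes behind us are forbidden."""
    dx = node_b[0] - node_a[0]   # longitudinal separation
    dy = node_b[1] - node_a[1]   # lateral separation
    if dx <= 0:
        return math.inf
    return dx + lateral_weight * abs(dy)

def best_lane_change(lane1_nodes, lane2_nodes):
    """Assess every first-lane/second-lane node pair and return the
    cheapest connection as (node_a, node_b, cost)."""
    return min(
        ((a, b, connection_cost(a, b)) for a in lane1_nodes for b in lane2_nodes),
        key=lambda t: t[2],
    )

lane1 = [(0.0, 0.0), (10.0, 0.0)]     # (x, y) nodes in the current lane
lane2 = [(5.0, 3.5), (20.0, 3.5)]     # nodes in the adjacent lane
a, b, cost = best_lane_change(lane1, lane2)
```

The cheapest pair then determines where along the route the lane change is made.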
51.
Methods and Systems for Automatic Problematic Maneuver Detection and Adapted Motion Planning
Example embodiments relate to methods and systems for automatic problematic maneuver detection and adapted motion planning. A computing device may obtain a route for navigation by a vehicle and a set of vehicle parameters corresponding to the vehicle. Each vehicle parameter can represent a physical attribute of the vehicle. The computing device may generate a virtual vehicle that represents the vehicle based on the set of vehicle parameters and perform a simulation that involves the virtual vehicle navigating the route. Based on the results of the simulation, the computing device may provide the original route or a modified route to the vehicle for the vehicle to subsequently navigate to its destination. In some cases, the simulation may further factor additional conditions, such as potential weather and traffic conditions that are likely to occur during the time when the vehicle plans on navigating the route.
The present disclosure relates to limitation of noise on light detectors using an aperture. One example embodiment includes a system. The system includes a lens disposed relative to a scene and configured to focus light from the scene onto a focal plane. The system also includes an aperture defined within an opaque material disposed at the focal plane of the lens. The aperture has a cross-sectional area. In addition, the system includes an array of light detectors disposed on a side of the focal plane opposite the lens and configured to intercept and detect diverging light focused by the lens and transmitted through the aperture. A cross-sectional area of the array of light detectors that intercepts the diverging light is greater than the cross-sectional area of the aperture.
G01S 7/4863 - Detector arrays, e.g. charge-transfer gates
G01S 17/10 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
G01S 17/42 - Simultaneous measurement of distance and other co-ordinates
53.
Calibration and Localization of a Light Detection and Ranging (Lidar) Device Using a Previously Calibrated and Localized Lidar Device
Example embodiments relate to calibration and localization of a light detection and ranging (lidar) device using a previously calibrated and localized lidar device. An example embodiment includes a method. The method includes receiving, by a computing device associated with a second vehicle, a first point cloud captured by a first lidar device of a first vehicle. The first point cloud includes points representing the second vehicle. The method also includes receiving, by the computing device, pose information indicative of a pose of the first vehicle. In addition, the method includes capturing, using a second lidar device of the second vehicle, a second point cloud. Further, the method includes receiving, by the computing device, a third point cloud representing the first vehicle. Yet further, the method includes calibrating and localizing, by the computing device, the second lidar device.
G01S 17/10 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
G01S 17/42 - Simultaneous measurement of distance and other co-ordinates
G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
G01S 17/87 - Combinations of systems using electromagnetic waves other than radio waves
G01S 17/931 - Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
Methods, systems, and apparatus for generation and use of surfel maps to plan for occlusions. One of the methods includes receiving a previously-generated surfel map depicting an area in which a vehicle is located, the surfel map comprising a plurality of surfels, each surfel corresponding to a respective different location in the area in which a vehicle is located; receiving, from one or more sensors, sensor data representing the area in which the vehicle is located; determining, based on the sensor data, that the area in which a vehicle is located includes a dynamic object having a changed shape relative to its representation in the surfel map; and generating an updated path for the vehicle to travel that avoids an occlusion by the changed shape of the dynamic object of a line of sight of one or more sensors to an area of interest.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Aspects of the disclosure provide for the generation of a driving difficulty heat map for autonomous vehicles. For instance, log data generated by a vehicle being driven in a manual driving mode for a segment of a route may be input into a disengage model in order to generate an output identifying a likelihood of a vehicle driving in an autonomous driving mode requiring a disengage from the autonomous driving mode along the segment of the route. The log data may have been collected within a geographic area. A grid for the geographic area may be generated. The grid may include a plurality of cells. The output is assigned to one of the plurality of cells. The plurality of cells and assigned output may be used to generate a driving difficulty heat map for the geographic area.
G01C 5/02 - Measuring height; Measuring distances transverse to the line of sight; Levelling between separated points; Surveyors' levels involving automatic stabilisation of the line of sight
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G01C 21/00 - Navigation; Navigational instruments not provided for in other groups
G07C 5/02 - Registering or indicating driving, working, idle, or waiting time only
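The grid-and-assignment step in the abstract above can be sketched as follows. Per-cell averaging of the disengage likelihoods is an assumption (the abstract only says outputs are assigned to cells), as are the cell size and coordinates.

```python
from collections import defaultdict

def build_heat_map(outputs, cell_size):
    """Assign each disengage-likelihood output (x, y, likelihood) to its
    grid cell over the geographic area, then average the likelihoods in
    each cell to produce the driving difficulty value for that cell."""
    cells = defaultdict(list)
    for x, y, likelihood in outputs:
        cell = (int(x // cell_size), int(y // cell_size))
        cells[cell].append(likelihood)
    return {cell: sum(v) / len(v) for cell, v in cells.items()}

# Disengage-model outputs for three route segments (assumed values).
outputs = [(12.0, 7.0, 0.8), (14.0, 9.0, 0.6), (55.0, 7.0, 0.1)]
heat_map = build_heat_map(outputs, cell_size=50.0)
```

Cells with higher averaged likelihoods then mark the more difficult parts of the geographic area.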
56.
PULL-OVER LOCATION SELECTION USING MACHINE LEARNING
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for selecting a pull-over location using machine learning. One of the methods includes obtaining data specifying a target pull-over location for an autonomous vehicle travelling on a roadway. A plurality of candidate pull-over locations in a vicinity of the target pull-over location are identified. For each candidate pull-over location, an input that includes features of the candidate pull-over location is processed using a machine learning model to generate a respective likelihood score representing a predicted likelihood that the candidate pull-over location is an optimal location. The features of the candidate pull-over location include one or more features that compare the candidate pull-over location to the target pull-over location. Using the respective likelihood scores, one of the candidate pull-over locations is selected as an actual pull-over location for the autonomous vehicle.
B60W 40/06 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, related to ambient conditions, related to road conditions
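A minimal sketch of the selection step in the abstract above, with a stand-in callable in place of the trained machine learning model and a single comparison feature (distance to the target location) as assumptions:

```python
def select_pull_over(target, candidates, model):
    """Score each candidate pull-over location with a likelihood model
    on features comparing it to the target, then return the candidate
    with the highest likelihood score."""
    def features(candidate):
        return {"distance_to_target": abs(candidate - target)}
    return max(candidates, key=lambda c: model(features(c)))

# Stand-in for the trained model: candidates nearer the target location
# receive higher likelihood scores (an assumption for illustration).
def toy_model(f):
    return 1.0 / (1.0 + f["distance_to_target"])

chosen = select_pull_over(target=100.0, candidates=[80.0, 97.0, 115.0], model=toy_model)
```

A real model would use richer features (curb availability, obstructions, walking distance), but the argmax-over-scores structure is the same.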
Aspects of the technology relate to assisting a passenger in an autonomous vehicle without a driver. For instance, after a door of the vehicle is opened, a predetermined period of time may be waited by processors of computing devices of the vehicle. After waiting the predetermined period of time and when the vehicle's door remains open, a set of instructions for closing the vehicle's door may be played by the processors through a speaker of the vehicle. Once the door of the vehicle is closed, an announcement may be played by the processors through the speaker requesting that the passenger press a first button to initiate a ride to a destination. In response to the first button being pressed, the ride to the destination may be initiated by the processors by maneuvering the vehicle autonomously to the destination.
B60K 35/00 - Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
B60K 35/26 - Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor, using acoustic output
B60K 35/28 - Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor, characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information, or by the purpose of the output information, e.g. for attracting the attention of the driver
The present disclosure relates to limitation of noise on light detectors using an aperture. One example implementation includes a system. The system includes a lens disposed relative to a scene. The lens focuses light from the scene. The system also includes an aperture defined within an opaque material. The system also includes a waveguide having a first side that receives light focused by the lens and transmitted through the aperture. The waveguide guides the received light toward a second side of the waveguide opposite to the first side. The waveguide has a third side extending between the first side and the second side. The system also includes an array of light detectors that intercepts and detects light propagating out of the third side of the waveguide.
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
G01S 17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
G02B 6/08 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. coupling means, formed by bundles of fibres, the relative position of the fibres being the same at both ends, e.g. for transporting images, with the fibre bundle in the form of a plate
59.
RESOURCE ALLOCATION FOR AN AUTONOMOUS VEHICLE TRANSPORTATION SERVICE
Aspects of the disclosure relate to generating a model to assess maximum numbers of concurrent trips for an autonomous vehicle transportation service. For instance, historical trip data, including when requests for assistance were made, response times for those requests for assistance, and a number of available resources when each of the requests for assistance were made may be received. In addition, a number of concurrent trips, or trips that overlap in time, occurring when each of the requests for assistance were made may be received. The model may be trained using the historical trip data and the numbers of concurrent trips. The model may be configured to provide a maximum number of concurrent trips given a period of time, a number of available resources, and a response time requirement.
The technology relates to a system for clearing a sensor cover. The system may comprise a wiper, which includes a wiper support and a wiper blade, and a sensor cover. The wiper blade may be configured to clear the sensor cover of debris, and the sensor cover may be configured to house one or more sensors. A wiper motor may rotate the wiper and a sensor motor may rotate the sensor cover. The wiper blade may comprise a first edge attached to the wiper support and a second edge configured to be in contact with the sensor cover. The wiper blade may extend in a corkscrew shape around the wiper support. The wiper motor may be configured to rotate the wiper in a first direction and the sensor motor may be configured to rotate the sensor cover in a second direction opposite the first direction.
B60S 1/62 - Other vehicle fittings for cleaning
B08B 1/14 - Wipes; Absorbent members, e.g. swabs or sponges
B08B 3/02 - Cleaning by the force of jets or sprays
B60S 1/04 - Wipers or the like, e.g. scrapers
B60S 1/34 - Wiper arms; Mountings therefor
B60S 1/56 - Cleaning windscreens, windows or optical devices specially adapted for cleaning other parts or devices than front windows or windscreens
G02B 27/00 - Optical systems or apparatus not provided for by any of the other groups
61.
Control calibration timing to avoid memory write blackout period
One example system for preventing data loss during memory blackout events comprises a memory device, a sensor, and a controller operably coupled to the memory device and the sensor. The controller is configured to perform one or more operations that coordinate at least one memory blackout event of the memory device and at least one data transmission of the sensor.
Aspects of the technology involve controlling a vehicle configured to operate in an autonomous driving mode. This includes receiving a set of environmental inputs including temperature information from different temperature sources, receiving initial steering information from a steering system of the vehicle, and obtaining an initial rack position command by a motion control module of the vehicle. The system determines, based on the environmental inputs, the initial steering information and an initial rack position, a likelihood that a steering actuator of the steering system of the vehicle is blocked or likely to become blocked. The system determines whether a threshold excitation amount has been applied to the steering system within a selected amount of time or a selected driving distance. When the threshold amount of excitation is not met, an excitation profile is applied to the steering system in order to modify the initial rack position.
B62D 5/04 - Power-assisted or power-driven steering electrical, e.g. using an electric servo-motor connected to, or forming part of, the steering gear
63.
Enhanced depth of focus cameras using variable apertures and pixel binning
Example embodiments relate to enhanced depth of focus cameras using variable apertures and pixel binning. An example embodiment includes a device. The device includes an image sensor. The image sensor includes an array of light-sensitive pixels and a readout circuit. The device also includes a variable aperture. Additionally, the device includes a controller that is configured to cause: the variable aperture to adjust to a first aperture size when a high-light condition is present, the variable aperture to adjust to a second aperture size when a low-light condition is present, the readout circuit to perform a first level of pixel binning when the high-light condition is present, and the readout circuit to perform a second level of pixel binning when the low-light condition is present. The second aperture size is larger than the first aperture size. The second level of pixel binning is greater than the first level of pixel binning.
H04N 5/347 - Extracting pixel data from an image sensor by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled, by combining or binning pixels in the SSIS
G01S 17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
G02B 27/00 - Optical systems or apparatus not provided for in any of the groups
H04N 5/378 - Readout circuits, e.g. correlated double sampling [CDS] circuits, output amplifiers or A/D converters
H04N 25/46 - Extracting pixel data from an image sensor by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled, by combining or binning pixels
H04N 25/75 - Circuitry for providing, modifying or processing image signals from the pixel array
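The coupled aperture/binning decision described in the abstract above can be sketched as follows. This is a minimal illustration with invented aperture sizes and binning levels, not the disclosed implementation:

```python
def select_capture_settings(is_low_light: bool):
    """Pick an aperture size and pixel-binning level from the light condition.

    Values below are hypothetical; the disclosure only requires that the
    low-light aperture be larger and the low-light binning level greater.
    """
    small_aperture, large_aperture = 2.0, 8.0   # mm, illustrative only
    low_binning, high_binning = 1, 4            # 1 = no binning, 4 = 4x4 bins
    if is_low_light:
        # Low light: open the aperture wider and bin more pixels together,
        # trading spatial resolution for sensitivity.
        return large_aperture, high_binning
    # High light: smaller aperture (greater depth of focus), less binning.
    return small_aperture, low_binning
```
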
Example embodiments relate to reducing auto-exposure latency. An example embodiment includes a method of reducing auto-exposure latency. The method includes determining, by a processor, a first setting of an exposure parameter for a first frame to be captured by an image sensor. The first setting of the exposure parameter is determined based at least in part on characteristics of a previous frame captured by the image sensor. The first setting of the exposure parameter is determined during a first frame period associated with capturing the first frame. The method also includes initiating, by the processor, a first frame exposure operation based on the first setting of the exposure parameter. During the first frame exposure operation, the image sensor captures the first frame during the first frame period.
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing rare example mining in driving log data. In one aspect, a method includes obtaining a sensor input; processing the sensor input using an encoder neural network to generate one or more feature vectors for the sensor input; processing each of the one or more feature vectors using a density estimation model to generate a density score for the feature vector; and generating a rareness score for each of the one or more feature vectors from the density score. For example, the rareness score can represent a degree to which a classification of an object depicted in the sensor input is rare relative to other objects. As another example, the rareness score can represent a degree to which a predicted behavior of an agent depicted in the sensor input is rare relative to other objects.
G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads
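The density-to-rareness mapping described in the abstract above can be sketched as follows. The negative log-density used here is one common choice and is an assumption for illustration, not necessarily the mapping used in the claims:

```python
import math

def rareness_score(density: float, eps: float = 1e-12) -> float:
    """Turn a density score from a fitted density estimation model into a
    rareness score: low estimated density under the model -> high rareness.
    The eps floor avoids log(0) for inputs far outside the training data.
    """
    return -math.log(max(density, eps))
```

Inputs judged common by the density model (density near 1) score near zero, while inputs in low-density regions of feature space score high and can be surfaced for mining.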
An example method includes receiving point cloud information about a field of view of a lidar system. The point cloud information includes spatiotemporal and amplitude information about return light received. The method also includes determining, based on the point cloud information, a set of bright light returns from at least one highly reflective object. The bright light returns include return light having an amplitude above a photon threshold and a corresponding bright light return range. The method yet further includes determining, based on the point cloud information, a set of crosstalk returns. The crosstalk returns include return light having a corresponding crosstalk return range. The method includes adjusting, based on a normalized number of crosstalk returns, at least one of: a cleaning system, an operating mode of a lidar system, or an operating mode of a vehicle.
B60S 1/56 - Cleaning windscreens, windows or optical devices specially adapted for cleaning other parts or devices than front windows or windscreens
G01S 17/931 - Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
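The adjustment step in the abstract above keys off a normalized number of crosstalk returns. A minimal sketch, with invented thresholds and action names (the disclosure does not specify these values):

```python
def crosstalk_response(crosstalk_returns: int, total_returns: int) -> str:
    """Normalize the crosstalk-return count by total returns and escalate
    the response as the fraction grows. Thresholds are illustrative."""
    frac = crosstalk_returns / max(total_returns, 1)
    if frac < 0.01:
        return "no action"
    if frac < 0.05:
        # Moderate crosstalk may indicate a dirty aperture window.
        return "run cleaning system"
    # Heavy crosstalk: change how the lidar (or vehicle) operates.
    return "switch lidar operating mode"
```
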
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing rare example mining in driving log data. In one aspect, a method includes maintaining a plurality of density estimation models that each correspond to a different rareness type with respect to historical sensor inputs in a driving log generated by sensors on-board a vehicle; receiving a query that references a sensor input; generating, from the sensor input, a corresponding density estimation model input for each of the plurality of density estimation models; processing, using each of the plurality of density estimation models, the corresponding density estimation model input to generate a corresponding density score; generating, for the sensor input, and from the density scores, a rareness score associated with each different rareness type; and providing the rareness scores in response to receiving the query.
G06N 3/042 - Knowledge-based neural networks; Logical representations of neural networks
G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
An example system includes a light detection and ranging (LIDAR) device that scans a field-of-view defined by a pointing direction of the LIDAR device. The system also includes an actuator that adjusts the pointing direction of the LIDAR device. The system also includes a communication interface that receives timing information from an external system. The system also includes a controller that causes the actuator to adjust the pointing direction of the LIDAR device based on at least the received timing information.
Aspects of the technology relate to exception handling for a vehicle. For instance, a current trajectory for the vehicle and sensor data corresponding to one or more objects may be received. Based on the received sensor data, projected trajectories of the one or more objects may be determined. Potential collisions with the one or more objects may be determined based on the projected trajectories and the current trajectory. One of the potential collisions that is earliest in time may be identified. Based on the one of the potential collisions, a safety-time-horizon (STH) may be identified. When a runtime exception occurs, the vehicle waits no longer than the STH for the runtime exception to resolve before performing a precautionary maneuver to avoid a collision.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
The technology relates to determining general weather conditions affecting the roadway around a vehicle, and how such conditions may impact driving and route planning for the vehicle when operating in an autonomous mode. For instance, the on-board sensor system may detect whether the road is generally icy as opposed to a small ice patch on a specific portion of the road surface. The system may also evaluate specific driving actions taken by the vehicle and/or other nearby vehicles. Based on such information, the vehicle's control system is able to use the resultant information to select an appropriate braking level or braking strategy. As a result, the system can detect and respond to different levels of adverse weather conditions. The on-board computer system may share road condition information with nearby vehicles and with remote assistance, so that it may be employed with broader fleet planning operations.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B60W 10/04 - Conjoint control of vehicle sub-units of different type or different function including control of propulsion units
B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
B60W 10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
B60W 30/02 - Control of vehicle driving stability
G01S 13/95 - Radar or analogous systems specially adapted for specific applications for meteorological use
G01S 15/88 - Sonar systems specially adapted for specific applications
G01S 17/95 - Lidar systems specially adapted for specific applications for meteorological use
G01W 1/02 - Instruments for indicating weather conditions by measuring two or more variables, e.g. humidity, pressure, temperature, cloud cover or wind speed
73.
Methods and Systems for Detecting Adverse Road Conditions using Radar
Example embodiments relate to techniques for detecting adverse road conditions using radar. A computing device may generate a first radar representation that represents a field of view for a radar unit coupled to a vehicle during clear weather conditions, and store the first radar representation in memory. The computing device may receive radar data from the radar unit during navigation of the vehicle on a road and determine a second radar representation based on the radar data. The computing device may also perform a comparison between the first radar representation and the second radar representation and determine a road condition for the road based on the comparison. The road condition may represent a quantity of precipitation located on the road, and the computing device may provide control instructions to the vehicle based on the road condition for the road.
G01S 7/41 - Details of systems according to the groups, using analysis of the echo signal for target characterisation; Target signature; Target cross-section
B60W 30/09 - Taking automatic action to avoid collision, e.g. braking or steering
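The baseline-versus-live comparison described in the abstract above can be sketched as follows, treating each radar representation as a flat grid of values. The difference measure, thresholds, and labels are invented for illustration:

```python
def road_condition(baseline: list[float], live: list[float]) -> str:
    """Compare a clear-weather baseline radar grid against a live grid and
    map the mean absolute residual to a coarse road-condition label.
    Thresholds below are hypothetical, not taken from the disclosure."""
    diff = sum(abs(b, ) if False else abs(b - l) for b, l in zip(baseline, live)) / len(baseline)
    if diff < 0.1:
        return "dry"
    return "wet" if diff < 0.5 else "heavy precipitation"
```

A larger residual between the stored clear-weather representation and the live one suggests more precipitation on the road surface, which the vehicle's control instructions can then account for.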
Example implementations may relate to sun-aware vehicle routing. In particular, a computing system of a vehicle may determine an expected position of the sun relative to a geographic area. Based on the expected position, the computing system may make a determination that travel of the vehicle through certain location(s) within the geographic area is expected to result in the sun being proximate to an object within a field of view of the vehicle's image capture device. Responsively, the computing system may generate a route for the vehicle in the geographic area based at least on the route avoiding travel of the vehicle through these certain location(s), and may then operate the vehicle to travel in accordance with the generated route. Ultimately, this may help reduce or prevent situations where quality of image(s) degrades due to sunlight, which may allow for use of these image(s) as basis for operating the vehicle.
In one example, a method is provided that includes receiving lidar data obtained by a lidar device. The lidar data includes a plurality of data points indicative of locations of reflections from an environment of the vehicle. The method includes receiving images of portions of the environment captured by a camera at different times. The method also includes determining locations in the images that correspond to a data point of the plurality of data points. Additionally, the method includes determining feature descriptors for the locations of the images and comparing the feature descriptors to determine that sensor data associated with at least one of the lidar device, the camera, or a pose sensor is accurate or inaccurate.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G06F 18/213 - Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
G06V 10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; Image or video pattern matching; Proximity measures in feature spaces using context analysis; Selection of dictionaries
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for automatically designating traffic scenarios as safety-relevant traffic conflicts between agents in a driving environment. One of the methods includes receiving data representing a traffic scenario involving two agents; computing a safety-relevant metric for a first plurality of time points of the traffic scenario; computing a surprise metric for a second plurality of time points of the traffic scenario; determining that the surprise metric satisfies a surprise threshold within a threshold time window of the safety-relevant metric satisfying a safety-relevant threshold; and in response, designating the traffic scenario as a safety-relevant traffic conflict.
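The designation rule described in the abstract above can be sketched as follows. Metric series, thresholds, and the window value are hypothetical; the sketch only shows the "surprise crosses its threshold within a time window of the safety metric crossing its own" logic:

```python
def is_conflict(safety, surprise, safety_thr, surprise_thr, window):
    """Designate a traffic scenario as a safety-relevant conflict.

    safety, surprise: lists of (time, value) samples for one scenario.
    Returns True when some surprise-threshold crossing falls within
    `window` time units of some safety-threshold crossing.
    """
    safety_times = [t for t, v in safety if v >= safety_thr]
    surprise_times = [t for t, v in surprise if v >= surprise_thr]
    return any(abs(ts - tp) <= window
               for ts in safety_times for tp in surprise_times)
```
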
Aspects of the disclosure relate to providing transportation services with autonomous vehicles. For instance, a first route to a first destination may be determined. The first route may have a first cost. Weather information for the first destination may be received. A characteristic is determined based on the weather information. A second destination having the characteristic may be selected. The second destination may be different from the first destination. A second route to the second destination may be determined. The second route may have a second cost. The first cost may be compared to the second cost, and the vehicle may use the comparison to set one of the first destination or the second destination as the current destination for the vehicle, causing the vehicle to control itself in an autonomous driving mode to the current destination.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G01C 21/34 - Route searching; Route guidance
G01C 21/36 - Input/output arrangements for on-board computers
G08G 1/133 - Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles, within the vehicle
78.
Using Driver Assistance to Detect and Address Aberrant Driver Behavior
The technology relates to identifying and addressing aberrant driver behavior. Various driving operations may be evaluated over different time scales and driving distances. The system can detect driving errors and suboptimal maneuvering, which are evaluated by an onboard driver assistance system and compared against a model of expected driver behavior. The result of this comparison can be used to alert the driver or take immediate corrective driving action. It may also be used for real-time or offline training or sensor calibration purposes. The behavior model may be driver-specific, or may be a nominal driver model based on aggregated information from many drivers. These approaches can be employed with drivers of passenger vehicles, busses, cargo trucks and other vehicles.
B60W 50/06 - Details of driving assistance systems for road vehicles not related to the control of a particular sub-unit for improving the dynamic response of the driving assistance system, e.g. improving the speed of regulation, avoiding overshoot or instability
B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G01C 21/30 - Map- or contour-matching
The present disclosure relates to systems, vehicles, and methods relating to imaging and object detection using polarization-based detection of infrared light. An example system includes at least one infrared detector configured to detect infrared light corresponding to a target object within a field of view. The infrared light includes at least one of a first polarization or a second polarization. The system also includes a controller configured to carry out operations. The operations include receiving, from the at least one infrared detector, information indicative of infrared light corresponding to the target object. The operations also include determining, based on the received information, a polarization ratio corresponding to the target object. The polarization ratio comprises a first polarization intensity divided by a second polarization intensity. The operations also include determining, based on the polarization ratio, that the infrared light corresponding to the target object comprises direct light or reflected light.
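The polarization-ratio test described in the abstract above can be sketched as follows. The decision band around a ratio of 1.0 is an assumption for illustration; the disclosure does not state a threshold:

```python
def classify_light(first_intensity: float, second_intensity: float) -> str:
    """Classify detected infrared light as direct or reflected from the
    ratio of two polarization intensities. The band below is hypothetical."""
    ratio = first_intensity / second_intensity
    # Reflection off a surface tends to skew the polarization balance, so a
    # ratio far from unity suggests reflected rather than direct light.
    return "direct" if 0.8 <= ratio <= 1.25 else "reflected"
```
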
Aspects of the disclosure relate generally to generating and providing route options for an autonomous vehicle. For example, a user may identify a destination, and in response the vehicle's computer may provide routing options to the user. The routing options may be based on typical navigating considerations such as the total travel time, travel distance, fuel economy, etc. Each routing option may include not only an estimated total time, but also information regarding whether and which portions of the route may be maneuvered under the control of the vehicle alone (fully autonomous), a combination of the vehicle and the driver (semiautonomous), or the driver alone. The time of the longest stretch of driving associated with the autonomous mode as well as map information indicating portions of the routes associated with the type of maneuvering control may also be provided.
G01C 21/00 - Navigation; Navigational instruments not provided for in the groups
G01C 21/34 - Route searching; Route guidance
G01C 21/36 - Input/output arrangements for on-board computers
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
H04L 67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
81.
Methods and Systems to Determine a Strategy for a Drop Process Associated with a Light Detection and Ranging (LIDAR) Device
Example implementations may relate to determining a strategy for a drop process associated with a light detection and ranging (LIDAR) device. In particular, the LIDAR device could emit light pulses and detect return light pulses, and could generate a set of data points representative of the detected return light pulses. The drop process could involve a computing system discarding data point(s) of the set and/or preventing emission of light pulse(s) by the LIDAR device. Accordingly, the computing system could detect a trigger to engage in the drop process, and may responsively (i) use information associated with the environment around the vehicle, operation of the vehicle, and/or operation of the LIDAR device as a basis to determine the strategy for the drop process, and (ii) engage in the drop process in accordance with the determined strategy.
G01S 17/931 - Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 7/4861 - Circuits for detection, sampling, integration or read-out
G01S 17/10 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
G01S 17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
Aspects of the disclosure provide a method of facilitating communications from an autonomous vehicle to a user. For instance, a method may include, while attempting to pick up the user and prior to the user entering a vehicle, inputting a current location of the vehicle and map information into a model in order to identify a type of communication action for communicating a location of the vehicle to the user; enabling a first communication based on the type of the communication action; determining whether the user has responded to the first communication from received sensor data; and enabling a second communication based on the determination of whether the user has responded to the first communication.
B60Q 5/00 - Arrangement or adaptation of acoustic signal devices
B60Q 1/26 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor, the devices being primarily intended to indicate the outline of the vehicle, or parts thereof, or to give signals to other traffic
G08G 1/00 - Traffic control systems for road vehicles
Aspects of the present disclosure relate to a system having a memory, a plurality of self-driving systems for controlling a vehicle, and one or more processors. The processors are configured to receive at least one fallback task in association with a request for a primary task and at least one trigger of each fallback task. Each trigger is a set of conditions that, when satisfied, indicate when a vehicle requires attention for proper operation. The processors are also configured to send instructions to the self-driving systems to execute the primary task and receive status updates from the self-driving systems. The processors are configured to determine that a set of conditions of a trigger is satisfied based on the status updates and send further instructions based on the associated fallback task to the self-driving systems.
G08G 1/00 - Traffic control systems for road vehicles
B60W 50/00 - Details of driving assistance systems for road vehicles not related to the control of a particular sub-unit
B60W 50/029 - Adapting to failures or working around failures with alternative solutions, e.g. avoiding the use of failed parts
G05D 1/223 - Command input arrangements on remote control devices, e.g. joysticks or touch screens
G05D 1/227 - Transfer of control between remote control and on-board control; Transfer of control between a plurality of remote control arrangements
G06F 9/46 - Multiprogramming arrangements
G06F 9/48 - Program initiating; Program switching, e.g. by interrupt
Aspects of the disclosure relate to controlling a vehicle in an autonomous driving mode where the vehicle has a drive-by-wire braking system. For instance, while the vehicle is being controlled in the autonomous driving mode, a signal corresponding to input at a brake pedal of the drive-by-wire braking system may be received. An amount of braking may be determined based on the received signal. The amount of braking may be used to determine a trajectory for the vehicle to follow. The vehicle may be controlled in the autonomous driving mode using the trajectory.
Example embodiments relate to radar image video compression techniques using per-pixel Doppler measurements, which can involve initially receiving radar data from a radar unit to generate a radar representation that represents surfaces in the environment. Based on Doppler scores in the radar representation, a range rate can be determined for each pixel that indicates a radial direction motion for a surface represented by the pixel. The range rates and backscatter values can then be used to estimate a radar representation prediction for subsequent radar data received from the radar unit, which enables a generation of a compressed radar data file that represents the difference between the radar representation prediction and the actual representation determined for the subsequent radar data. The compressed radar data file can be stored in memory, transmitted to other devices, and decompressed and used to train models via machine learning.
G01S 13/89 - Radar or analogous systems specially adapted for specific applications for mapping or imaging
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G01S 7/41 - Details of systems according to the groups, using analysis of the echo signal for target characterisation; Target signature; Target cross-section
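The prediction-and-residual idea in the radar compression abstract above can be sketched in one dimension: shift each pixel's backscatter by the range-bin motion its Doppler range rate predicts, then store only the difference from the actual next frame. Real radar frames and the actual codec details are more involved; this is an illustrative sketch:

```python
def predict_next(backscatter: list[float], range_rate: list[float],
                 dt: float) -> list[float]:
    """Estimate the next frame by moving each pixel's backscatter along its
    per-pixel range rate (expressed here in range bins per second)."""
    n = len(backscatter)
    pred = [0.0] * n
    for i, (b, rr) in enumerate(zip(backscatter, range_rate)):
        j = i + round(rr * dt)          # predicted new range bin
        if 0 <= j < n:                  # drop energy predicted out of frame
            pred[j] += b
    return pred

def residual(actual: list[float], predicted: list[float]) -> list[float]:
    """The compressed file stores this difference instead of the full frame;
    when motion is well predicted, the residual is mostly zeros and
    compresses well."""
    return [a - p for a, p in zip(actual, predicted)]
```
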
Aspects of the disclosure relate to repositioning a rooftop sensor of an autonomous vehicle when needed to reduce the overall height of the autonomous vehicle. For instance, while an autonomous vehicle is being controlled in an autonomous driving mode, a low clearance zone may be identified. An activation location may be determined based on the low clearance zone and a current speed of the autonomous vehicle. Once the activation location is reached by the autonomous vehicle, a motor may be caused to reposition the rooftop sensor. In addition, in some instances, after the autonomous vehicle has passed the low clearance zone, the motor may be caused to reposition the rooftop sensor again.
B60W 10/30 - Conjoint control of vehicle sub-units of different type or different function including control of auxiliary equipment, e.g. air-conditioning compressors or oil pumps
B60R 11/00 - Arrangements for holding or mounting articles, not otherwise provided for
B60R 11/04 - Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
G01S 13/931 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
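The activation-location step in the rooftop-sensor abstract above amounts to triggering the motor early enough that repositioning completes before the vehicle reaches the low-clearance zone. A back-of-envelope sketch, with a made-up actuation time and safety margin:

```python
def activation_distance(speed_mps: float, actuation_s: float = 3.0,
                        margin_m: float = 5.0) -> float:
    """Distance before the low-clearance zone, in meters, at which to
    trigger the motor: faster vehicles must start repositioning earlier.
    actuation_s and margin_m are hypothetical parameters."""
    return speed_mps * actuation_s + margin_m
```
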
A system and method include scanning a light detection and ranging (LIDAR) device through a range of orientations corresponding to a scanning zone while emitting light pulses from the LIDAR device. The method also includes receiving returning light pulses corresponding to the light pulses emitted from the LIDAR device and determining initial point cloud data based on time delays between emitting the light pulses and receiving the corresponding returning light pulses and the orientations of the LIDAR device. The initial point cloud data has an initial angular resolution. The method includes identifying, based on the initial point cloud data, a reflective feature in the scanning zone and determining an enhancement region and an enhanced angular resolution for a subsequent scan to provide a higher spatial resolution in at least a portion of subsequent point cloud data from the subsequent scan corresponding to the reflective feature.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G01S 7/48 - Details of systems according to the groups
G01S 7/4865 - Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
G01S 17/10 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
G01S 17/42 - Simultaneous measurement of distance and other co-ordinates
G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
G01S 17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
G01S 17/931 - Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
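The enhancement-region step in the adaptive-resolution lidar abstract above can be sketched as follows: once a reflective feature is found in the initial point cloud, widen the rescan window slightly around its angular extent. The padding value is an assumption for illustration:

```python
def enhancement_region(feature_angles: list[float], pad_deg: float = 1.0):
    """Given the scan angles (degrees) at which a reflective feature was
    detected, return the (min_angle, max_angle) window to rescan at an
    enhanced angular resolution. pad_deg is a hypothetical margin."""
    return min(feature_angles) - pad_deg, max(feature_angles) + pad_deg
```
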
Aspects of the disclosure provide for the selection of a route for a vehicle having an autonomous driving mode. For instance, an initial location of the vehicle may be identified. This location may be used to determine a set of possible routes to a destination location. A cost for each route of the set is determined by inputting time of day information, map information, and details of that route into one or more models in order to determine whether the vehicle is likely to be stranded along that route, and assessing the cost based at least in part on that determination. One of the routes of the set of possible routes may be selected based on any determined costs. The vehicle may be controlled in the autonomous driving mode using the selected route.
G01C 21/34 - Route searching; Route guidance
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
Example embodiments relate to methods and systems for automated generation of radar interference reduction training data for autonomous vehicles. In an example, a computing device causes a radar unit to transmit radar signals in an environment of a vehicle. The computing device may include a model trained based on a labeled interferer dataset that represents interferer signals generated by an emitter located remote from the vehicle. The interferer signals are based on one or more radar signal parameter models. The computing device may use the model to determine whether received electromagnetic energy corresponds to transmitted radar signals or an interferer signal. Based on determining that the electromagnetic energy corresponds to the transmitted radar signals, the computing device may generate a representation of the environment of the vehicle using the electromagnetic energy.
42 - Scientific, technological and industrial services, research and design
Goods and Services
Research and development into autonomous vehicles; research, design, and development of computer hardware and software for use with autonomous vehicle on-board computers for monitoring and controlling motor vehicle operation; research, design, and development of computer hardware and software for autonomous vehicle coordination, navigation, calibrating, direction, and management; research, design, and development of sensors and structural parts thereof; software as a service (SaaS) services featuring computer software for use as an application programming interface (API) for use in connection with autonomous vehicles; advanced product research, design, and development in the field of artificial intelligence in connection with autonomous vehicles
93.
Methods and Systems for Modifying Power Consumption by an Autonomy System
Example embodiments relate to techniques for modifying power consumption of an autonomy system. For instance, a vehicle autonomy system may use sensor data from vehicle sensors to determine information about the surrounding environment and estimate one or more conditions expected for a threshold duration during subsequent navigation of the path by the vehicle. The autonomy system can then adjust operation of one or more of its components (sensors, compute cores, actuators) based on the one or more conditions expected for the threshold duration and power consumption data corresponding to the components. The vehicle can then be controlled based on subsequent sensor data obtained after adjusting operation of the components of the autonomy system, thereby increasing the efficiency of the autonomy system in accordance with the vehicle's surrounding environment.
B60W 50/04 - Details of driving assistance systems for road vehicles not related to the control of a particular sub-unit, for monitoring the operation of the driving assistance system
B60W 40/02 - Estimation or calculation of operating parameters for road vehicle driving assistance systems not related to the control of a particular sub-unit, related to ambient conditions
B60W 60/00 - Driving assistance systems specially adapted for autonomous road vehicles
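The component-adjustment step described in the abstract above can be pictured as a simple mode-selection routine. The sketch below is purely illustrative and not from the publication: the `Component` class, the `select_modes` function, and the condition-to-requirement table are all assumed names and assumed behavior.

```python
# Hypothetical sketch of the power-adjustment step: pick a lower-power mode for
# each component unless an expected condition requires its full capability.
from dataclasses import dataclass


@dataclass
class Component:
    name: str
    modes: dict  # mode name -> power draw in watts


# Illustrative mapping from an expected condition to the components that must
# stay at full capability for the threshold duration.
REQUIREMENTS = {
    "night": {"lidar"},
    "highway": {"radar", "lidar"},
}


def select_modes(components, expected_conditions):
    """Pick, per component, the cheapest mode that the expected conditions allow."""
    required = set()
    for condition in expected_conditions:
        required |= REQUIREMENTS.get(condition, set())
    selection = {}
    for comp in components:
        if comp.name in required:
            # Required components keep their highest-power (full-capability) mode.
            selection[comp.name] = max(comp.modes, key=comp.modes.get)
        else:
            # Everything else drops to its lowest-power mode.
            selection[comp.name] = min(comp.modes, key=comp.modes.get)
    return selection


sensors = [
    Component("lidar", {"full": 60.0, "reduced": 25.0}),
    Component("radar", {"full": 15.0, "reduced": 8.0}),
    Component("camera", {"full": 10.0, "reduced": 4.0}),
]
print(select_modes(sensors, ["night"]))
# lidar stays at full power; radar and camera drop to their reduced modes
```

In a real autonomy stack the selection would of course weigh many more factors (latency budgets, sensor redundancy, safety margins); the point here is only the shape of the decision: expected conditions plus per-component power data in, operating modes out.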
94.
Methods and Systems for Detecting and Mitigating Automotive Radar Interference
Example embodiments relate to techniques for detecting and mitigating automotive radar interference. Electromagnetic signals propagating in the environment can be received by a radar unit that limits the received signals to a particular angle of arrival and, via its reception antennas, to a particular polarization. Filters can be applied to the signals to remove portions that fall outside an expected time range and an expected frequency range, both of which depend on the radar signal transmission parameters used by the radar unit. In addition, a model representing an expected digital representation of the electromagnetic signal can be used to remove portions of the signals indicative of spikes and plateaus associated with signal interference. A computing device can then generate, from the remaining portions of the signals, an environment representation that indicates positions of surfaces relative to the vehicle.
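The two filtering stages in the abstract above (time/frequency gating, then model-based spike removal) can be sketched over discrete samples. This is a minimal illustration under assumed names and thresholds, not the publication's signal processing.

```python
# Minimal sketch of the two filtering stages, assuming each received sample is
# tagged with an arrival time `t`, a frequency `f`, and an amplitude `amp`.


def gate(samples, t_range, f_range):
    """Keep only samples inside the expected time and frequency windows,
    which depend on the radar unit's own transmission parameters."""
    (t0, t1), (f0, f1) = t_range, f_range
    return [s for s in samples if t0 <= s["t"] <= t1 and f0 <= s["f"] <= f1]


def remove_spikes(samples, max_amplitude):
    """Drop samples whose amplitude exceeds what the expected-signal model
    allows -- a stand-in for removing interference spikes and plateaus."""
    return [s for s in samples if s["amp"] <= max_amplitude]


samples = [
    {"t": 1.0, "f": 77.1, "amp": 0.4},   # expected return
    {"t": 9.0, "f": 77.1, "amp": 0.4},   # outside the expected time window
    {"t": 1.2, "f": 24.0, "amp": 0.4},   # outside the expected frequency band
    {"t": 1.1, "f": 77.2, "amp": 5.0},   # interference spike
]
clean = remove_spikes(gate(samples, (0.0, 2.0), (77.0, 77.5)), max_amplitude=1.0)
print(len(clean))  # 1: only the expected return survives both filters
```

The remaining samples would then feed the environment-representation step, which the sketch does not attempt to model.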
Example embodiments relate to light detection and ranging (lidar) devices having vertical-cavity surface-emitting laser (VCSEL) emitters. An example lidar device includes an array of individually addressable VCSELs configured to emit light pulses into an environment surrounding the lidar device. The lidar device also includes a firing circuit configured to selectively fire the individually addressable VCSELs in the array. In addition, the lidar device includes a controller configured to control the firing circuit using a control signal. Further, the lidar device includes a plurality of detectors. Each detector in the plurality of detectors is configured to detect reflections of light pulses that are emitted by one or more individually addressable VCSELs in the array and reflected by one or more objects in the environment surrounding the lidar device.
G01S 17/931 - Lidar systems specially adapted for specific applications, for anti-collision purposes of land vehicles
G01S 7/481 - Constructional features, e.g. arrangements of optical elements
H01S 5/183 - Surface-emitting [SE] lasers, e.g. having both horizontal and vertical cavities, having only vertical cavities, e.g. vertical-cavity surface-emitting lasers [VCSEL]
H01S 5/42 - Arrays of surface-emitting lasers
Aspects of the disclosure provide for evaluation of a planned trajectory for an autonomous vehicle. For instance, a predicted trajectory may be received for each of a plurality of objects. The planned trajectory may identify locations and the times at which the vehicle will be at those locations. For each of the plurality of objects, a grid including a plurality of cells may be generated. Occupancy of each grid for each of the plurality of objects may be determined based on the predicted trajectories. A cell of each grid that will be occupied by the vehicle at a location and time of the planned trajectory may be identified. The planned trajectory may then be evaluated based on whether any identified cell is occupied by any of the plurality of objects at that time.
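The grid-based check described in the abstract above reduces to marking (cell, time) pairs along each predicted trajectory and testing the planned trajectory's own (cell, time) pairs against them. The toy version below uses assumed function names and a uniform square grid; it is a sketch, not the disclosed system.

```python
# Toy occupancy-grid conflict check: one occupancy set per object, keyed by
# (cell_x, cell_y, time); a conflict exists if the vehicle and any object
# occupy the same cell at the same time step.


def build_occupancy(predicted_trajectory, cell_size=1.0):
    """Map a predicted trajectory [(x, y, t), ...] to occupied (cell, time) keys."""
    return {(int(x // cell_size), int(y // cell_size), t)
            for x, y, t in predicted_trajectory}


def trajectory_conflicts(planned, object_trajectories, cell_size=1.0):
    """Return True if any cell the vehicle occupies at a given time is also
    occupied by some object's predicted trajectory at that time."""
    grids = [build_occupancy(traj, cell_size) for traj in object_trajectories]
    for x, y, t in planned:
        cell = (int(x // cell_size), int(y // cell_size), t)
        if any(cell in grid for grid in grids):
            return True
    return False


planned = [(0.5, 0.5, 0), (1.5, 0.5, 1), (2.5, 0.5, 2)]
crossing_object = [(2.5, 0.5, 2)]  # occupies the vehicle's cell at t=2
print(trajectory_conflicts(planned, [crossing_object]))  # True
```

A production planner would score rather than binarize the outcome (e.g. probabilistic occupancy per cell), but the cell-and-time indexing is the core of the technique.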
A system is provided that includes an image sensor coupled to a vehicle, and control circuitry configured to perform operations including receiving, from the image sensor, an input stream comprising high dynamic range (HDR) image data associated with an environment of the vehicle, and processing the input stream at the vehicle by applying a global tone mapping, followed by offline image processing that can include applying a local tone mapping to the globally tone-mapped images of the same input stream.
G06T 5/92 - Dynamic range modification of images or parts thereof based on global image properties
G06T 3/4007 - Scaling of whole images or parts thereof, e.g. expanding or contracting, based on interpolation, e.g. bilinear interpolation
G06T 5/94 - Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
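The two-stage pipeline in the entry above (a global tone curve on-vehicle, a local adjustment offline) can be illustrated on a 1-D strip of intensities. The specific curves below are common textbook choices assumed for the sketch, not those of the publication.

```python
# Hedged sketch of global-then-local tone mapping on a 1-D pixel strip.


def global_tone_map(pixels, gamma=2.2):
    """Compress HDR intensities in [0, inf) into [0, 1) with one fixed curve
    (a Reinhard-style p/(p+1) compression followed by gamma correction)."""
    return [(p / (p + 1.0)) ** (1.0 / gamma) for p in pixels]


def local_tone_map(pixels, window=3):
    """Rescale each pixel by its local neighborhood mean (1-D for brevity),
    the kind of per-region adjustment a local tone mapper performs."""
    out = []
    for i, p in enumerate(pixels):
        lo, hi = max(0, i - window // 2), min(len(pixels), i + window // 2 + 1)
        mean = sum(pixels[lo:hi]) / (hi - lo)
        out.append(min(1.0, p / (2.0 * mean)) if mean > 0 else p)
    return out


hdr = [0.01, 0.05, 20.0, 0.05, 0.01]  # one very bright pixel in a dark scene
ldr = local_tone_map(global_tone_map(hdr))
print(all(0.0 <= p <= 1.0 for p in ldr))  # True: output fits a display range
```

The split mirrors the entry's design point: the cheap global curve runs in real time on the vehicle, while the more expensive neighborhood-dependent pass can run offline on the stored stream.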
98.
Modifying the behavior of an autonomous vehicle using context based parameter switching
A vehicle configured to operate in an autonomous mode may operate a sensor to determine an environment of the vehicle. The sensor may be configured to obtain sensor data of a sensed portion of the environment, where the sensed portion is defined by at least one sensor parameter. Based on the environment of the vehicle, the vehicle may select at least one parameter value for the at least one sensor parameter such that the sensed portion of the environment corresponds to a region of interest. The vehicle may then operate the sensor, using the selected at least one parameter value, to obtain sensor data of the region of interest, and control the vehicle in the autonomous mode based on the sensor data of the region of interest.
G05D 1/00 - Control of position, course, altitude, or attitude of land, water, air, or space vehicles, e.g. using automatic pilots
B60W 30/00 - Purposes of road vehicle driving assistance systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
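The context-based parameter switching in entry 98 amounts to a lookup from the detected environment to a sensor-parameter value whose sensed portion covers the region of interest. The sketch below assumes a single parameter (an azimuth scan sector) and an illustrative environment table; none of these names come from the publication.

```python
# Illustrative context-to-parameter switching: choose the sensor's azimuth
# sweep (the sensor parameter) based on the detected driving environment.

SCAN_SECTORS = {
    "intersection": (-90, 90),  # wide sweep to cover cross traffic, degrees
    "highway": (-15, 15),       # narrow forward-looking sweep
}


def select_scan_sector(environment, default=(-45, 45)):
    """Return the azimuth range to use for the detected environment."""
    return SCAN_SECTORS.get(environment, default)


def covers(sector, region_of_interest):
    """Check that every bearing in the region of interest falls in the sector."""
    lo, hi = sector
    return all(lo <= angle <= hi for angle in region_of_interest)


sector = select_scan_sector("intersection")
print(covers(sector, [-80, 0, 80]))  # True: the wide sweep spans cross traffic
```

Other parameters named implicitly by the abstract (frame rate, range, resolution) would switch by the same pattern: environment in, parameter value out, sensed portion re-checked against the region of interest.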
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for selecting simulators for evaluating control software for autonomous vehicles. In one aspect, a method comprises: receiving data specifying a driving scenario in an environment; receiving an actual value of a low-level statistic measuring a corresponding property of the driving scenario; generating simulations of the driving scenario using a simulator; determining, for each simulation, a respective predicted value of the low-level statistic that measures the corresponding property of the simulation; determining, from the respective predicted values for the simulations, a likelihood assigned to the actual value of the low-level statistic by the simulations; and determining, from the likelihood, a low-level metric for the simulator and for the driving scenario that measures the realism of the simulator with respect to the corresponding property of the driving scenario.
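The likelihood-based realism metric described above can be sketched with the simplest possible distribution fit: treat the simulations' predicted values of the low-level statistic as samples from a Gaussian and score the log-likelihood of the real-world value under that fit. The Gaussian choice and the function name are assumptions for illustration; the publication does not specify the distribution.

```python
# Minimal sketch of a likelihood-based simulator realism metric: fit a normal
# distribution to the statistic across simulations, then score how well it
# explains the value actually observed in the real driving scenario.
import math


def realism_metric(predicted_values, actual_value):
    """Higher (less negative) log-likelihood means the simulator's distribution
    of the low-level statistic better explains the observed value."""
    n = len(predicted_values)
    mean = sum(predicted_values) / n
    var = sum((v - mean) ** 2 for v in predicted_values) / n
    var = max(var, 1e-9)  # guard against a degenerate zero-variance fit
    return -0.5 * (math.log(2 * math.pi * var) + (actual_value - mean) ** 2 / var)


sims = [9.8, 10.1, 10.0, 9.9, 10.2]  # predicted values, e.g. mean gap in metres
print(realism_metric(sims, 10.0) > realism_metric(sims, 15.0))  # True
```

Computed per statistic and per scenario, such low-level metrics let the metrics be compared across candidate simulators, which is the selection step the claim describes.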
Aspects of the disclosure relate to testing situational awareness of a test driver tasked with monitoring the driving of a vehicle operating in an autonomous driving mode. For instance, a signal that indicates that the test driver may be distracted may be identified. Based on the signal, that a question can be asked of the test driver may be determined. A plurality of factors relating to a driving context for the vehicle may be identified. Based on the determination, a question may be generated based on the plurality of factors. The question may be provided to the test driver. Input may be received from the test driver providing an answer to the question.
G09B 7/02 - Electrically-operated teaching apparatus or devices working with questions and answers, of the type wherein the student is expected to construct an answer to the question which is presented, or wherein the machine gives an answer to the question presented by the student