A light detection and ranging (LiDAR) sensor is described herein. The LiDAR sensor can comprise a fiber optic ending, a laser assembly, and one or more processors. The fiber optic ending can comprise a fiber optic cable terminated by a reflector. The laser assembly can emit a chirp signal to detect an object in an environment. A portion of the chirp signal can be diverted to the fiber optic ending. The one or more processors construct a profile of the chirp signal based on the diverted portion of the chirp signal. The one or more processors determine a best fit curve based on the profile of the chirp signal and one or more parameters associated with the best fit curve. A frequency offset between an emitted chirp signal and a returned chirp signal can be computed based on the best fit curve and the one or more parameters. Based on the frequency offset, the one or more processors can determine a range of the object.
G01S 17/34 - Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated using transmission of continuous, frequency-modulated waves while heterodyning the received signal, or a signal derived therefrom, with a locally-generated signal related to the contemporaneously transmitted signal
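The entry above derives object range from the frequency offset between the emitted and returned chirp. A minimal numpy sketch of that arithmetic follows, assuming a linear (sawtooth) chirp whose slope is recovered by fitting the profile sampled from the fiber-optic reference path; the function names, sweep parameters, and fitting choice are illustrative and not taken from the disclosure.

```python
import numpy as np

C = 3.0e8  # speed of light (m/s)

def chirp_slope_from_profile(t, f):
    """Fit a first-order (linear) curve to the sampled chirp profile and
    return its slope in Hz/s.  The profile is assumed to come from the
    portion of the chirp diverted to the fiber-optic reference path."""
    slope, _intercept = np.polyfit(t, f, 1)
    return slope

def range_from_frequency_offset(f_offset_hz, slope_hz_per_s):
    """Round-trip delay tau = f_offset / slope, so range R = c * tau / 2."""
    tau = f_offset_hz / slope_hz_per_s
    return C * tau / 2.0

if __name__ == "__main__":
    # Synthetic 1 ms up-chirp sweeping 1.5 GHz (slope = 1.5e12 Hz/s).
    t = np.linspace(0.0, 1e-3, 1000)
    f = 77e9 + 1.5e12 * t
    slope = chirp_slope_from_profile(t, f)
    # A measured beat (offset) frequency of 1 MHz then maps to ~100 m.
    print(round(range_from_frequency_offset(1.0e6, slope), 1), "m")
```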
Provided herein is a power distribution system comprising a main power bus, sub-buses coupled to the main power bus, and a controller. The sub-buses provide power to electrical components of a vehicle. Each of the sub-buses includes an electrically programmable fuse in series with a relay. The controller is configured to detect a fault in a sub-bus of the sub-buses, determine a fault type associated with the fault, and in response to determining the fault type, generate a command to cause the relay to change a relay state.
H02H 7/00 - Emergency protective circuit arrangements specially adapted for specific types of electric machines or apparatus or for sectionalised protection of cable or line systems, and effecting automatic switching in the event of an undesired change from normal working conditions
B60R 16/03 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric for supply of electrical power to vehicle subsystems
G01K 3/00 - Thermometers giving results other than momentary value of temperature
G01R 31/00 - Arrangements for testing electric properties; Arrangements for locating electric faults; Arrangements for electrical testing characterised by what is being tested not provided for elsewhere
H02H 1/00 - Details of emergency protective circuit arrangements
H02H 7/22 - Emergency protective circuit arrangements specially adapted for specific types of electric machines or apparatus or for sectionalised protection of cable or line systems, and effecting automatic switching in the event of an undesired change from normal working conditions for distribution gear, e.g. bus-bar systems; Emergency protective circuit arrangements specially adapted for specific types of electric machines or apparatus or for sectionalised protection of cable or line systems, and effecting automatic switching in the event of an undesired change from normal working conditions for switching devices
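As a rough illustration of the controller logic described in the power distribution entry above, the sketch below classifies a sub-bus fault from current and voltage readings and maps the fault type to a relay command. The thresholds, fault taxonomy, and dataclass names are assumptions for the example, not values from the filing.

```python
from dataclasses import dataclass
from enum import Enum, auto

class FaultType(Enum):
    NONE = auto()
    OVERCURRENT = auto()
    SHORT_CIRCUIT = auto()

@dataclass
class SubBusReading:
    name: str
    current_a: float
    voltage_v: float

def classify_fault(r: SubBusReading,
                   overcurrent_limit_a: float = 30.0,
                   short_voltage_v: float = 1.0) -> FaultType:
    # A collapsed bus voltage with high current is treated as a short;
    # high current alone is treated as a sustained overcurrent.
    if r.voltage_v < short_voltage_v and r.current_a > overcurrent_limit_a:
        return FaultType.SHORT_CIRCUIT
    if r.current_a > overcurrent_limit_a:
        return FaultType.OVERCURRENT
    return FaultType.NONE

def relay_command(fault: FaultType) -> str:
    # Open the relay for hard faults; leave it closed otherwise (the
    # e-fuse in series is assumed to handle transient overcurrents).
    return "OPEN" if fault is FaultType.SHORT_CIRCUIT else "CLOSED"

if __name__ == "__main__":
    reading = SubBusReading("hvac_sub_bus", current_a=45.0, voltage_v=0.4)
    fault = classify_fault(reading)
    print(fault.name, "->", relay_command(fault))
```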
A system trains a model to infer an intent of an entity. The system includes one or more sensors to obtain frames of data, one or more processors, and a memory storing instructions that, when executed by the one or more processors, cause the system to perform steps. A first step includes determining, in each frame of the frames, one or more bounding regions, each of the bounding regions enclosing an entity. A second step includes identifying a common entity, the common entity being present in bounding regions corresponding to a plurality of the frames. A third step includes associating the common entity across the frames. A fourth step includes training a model to infer an intent of the common entity based on data outside of the bounding regions.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
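The abstract above associates a common entity across frames and then trains on data outside its bounding regions. Below is a small illustrative sketch of two of those steps, using greedy IoU matching for the cross-frame association and a binary mask that keeps only the context outside the boxes; the matching rule and threshold are assumptions, not the claimed method.

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    return inter / (area_a + area_b - inter + 1e-9)

def associate(prev_boxes, curr_boxes, threshold=0.3):
    """Greedily match each current box to the previous box with the highest
    IoU, which is one simple way to track a common entity across frames."""
    matches = {}
    for j, cb in enumerate(curr_boxes):
        scores = [iou(pb, cb) for pb in prev_boxes]
        if scores and max(scores) >= threshold:
            matches[j] = int(np.argmax(scores))
    return matches

def context_mask(image_shape, boxes):
    """Mask that is 1 everywhere *outside* the bounding regions; the abstract
    trains the intent model on this surrounding context."""
    mask = np.ones(image_shape[:2], dtype=np.uint8)
    for x1, y1, x2, y2 in boxes:
        mask[int(y1):int(y2), int(x1):int(x2)] = 0
    return mask

if __name__ == "__main__":
    prev = [(100, 100, 150, 200)]
    curr = [(105, 102, 158, 205), (400, 50, 440, 120)]
    print(associate(prev, curr))          # {0: 0} -> same entity tracked
    print(context_mask((480, 640), curr).mean())
```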
Described herein are systems, methods, and non-transitory computer readable media for generating fused sensor data through metadata association. First sensor data captured by a first vehicle sensor and second sensor data captured by a second vehicle sensor are associated with first metadata and second metadata, respectively, to obtain labeled first sensor data and labeled second sensor data. A frame synchronization is performed between the first sensor data and the second sensor data to obtain a set of synchronized frames, where each synchronized frame includes a portion of the first sensor data and a corresponding portion of the second sensor data. For each frame in the set of synchronized frames, a metadata association algorithm is executed on the labeled first sensor data and the labeled second sensor data to generate fused sensor data that identifies associations between the first metadata and the second metadata.
Described herein are systems, methods, and non-transitory computer readable media for triggering a sensor operation of a second sensor (e.g., a camera) based on a predicted time of alignment with a first sensor (e.g., a LiDAR), where operation of the second sensor is simulated to determine the predicted time of alignment. In this manner, the sensor data captured by the two sensors is ensured to be substantially synchronized with respect to the physical environment being sensed. This sensor data synchronization based on predicted alignment of the sensors solves the technical problem of lack of sensor coordination and sensor data synchronization that would otherwise result from the latency associated with communication between sensors and a centralized controller and/or between sensors themselves.
G06N 3/126 - Evolutionary algorithms, e.g. genetic algorithms or genetic programming
G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G05B 19/042 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
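A minimal sketch of the prediction step described in the entry above, assuming a spinning LiDAR whose rotation rate is estimated from received sweep timestamps: the camera trigger is scheduled for the next instant the LiDAR is predicted to cross the camera's boresight, minus an assumed trigger latency. All parameters are illustrative.

```python
def predict_alignment_time(last_sweep_start_s, rotation_period_s,
                           camera_azimuth_deg, now_s):
    """Predict the next instant at which a spinning LiDAR will point at the
    camera's boresight, assuming a constant rotation rate estimated from
    previously received sweep timestamps."""
    fraction = camera_azimuth_deg / 360.0
    t = last_sweep_start_s + fraction * rotation_period_s
    while t <= now_s:                      # roll forward to the next pass
        t += rotation_period_s
    return t

def schedule_camera_trigger(align_time_s, trigger_latency_s):
    """Issue the trigger early enough to absorb the communication and exposure
    latency that the abstract identifies as the core problem."""
    return align_time_s - trigger_latency_s

if __name__ == "__main__":
    # 10 Hz LiDAR (0.1 s per revolution), camera mounted at 90 degrees.
    align = predict_alignment_time(last_sweep_start_s=12.000,
                                   rotation_period_s=0.100,
                                   camera_azimuth_deg=90.0,
                                   now_s=12.030)
    print(round(align, 3), round(schedule_camera_trigger(align, 0.005), 3))
```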
6.
AUTOMATED VEHICLE SAFETY RESPONSE METHODS AND CORRESPONDING VEHICLE SAFETY SYSTEMS WITH SERIALIZED COMPUTING ARCHITECTURES
Described herein are systems, methods, and non-transitory computer-readable media for implementing automated vehicle safety response measures to ensure continued safe automated vehicle operation for a limited period of time after a vehicle component or vehicle system that supports an automated vehicle driving function fails. When a critical vehicle component/system such as a vehicle computing platform fails, the vehicle is likely no longer capable of performing calculations required to safely operate and navigate the vehicle in an autonomous manner, or at a minimum, is no longer able to ensure the accuracy of such calculations. In such a scenario, the automated vehicle safety response measures disclosed herein can ensure, despite failure of the vehicle component/system, continued safe automated operation of the vehicle for a limited period of time in order to bring the vehicle to a safe stop.
H04L 67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
B60W 50/02 - Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
G07C 5/02 - Registering or indicating driving, working, idle, or waiting time only
B60W 30/09 - Taking automatic action to avoid collision, e.g. braking and steering
B60W 50/029 - Adapting to failures or work around with other constraints, e.g. circumvention by avoiding use of failed parts
A computing component is implemented as part of a vehicle architecture and configured to process point cloud data. The computing component comprises one or more programmable logics that, when executed, cause the computing component to generate one or more transformations to align scans of point cloud data, translate the aligned scans of the point cloud data to a universal coordinate system, and generate addresses and offsets to store the aligned scans of the point cloud data.
A sensor enclosure comprises a cover and a structure. The structure can be encased by the cover. The structure comprises a frame, a ring, and one or more anchoring posts. The frame can be configured to mount one or more sensors. The ring, disposed peripherally to the frame, can be operatively coupled to the cover. The ring can include a drainage ring plate that drains rainwater accumulated on the cover away from the sensor enclosure. The one or more anchoring posts, disposed underneath the frame and the ring, can be used to anchor the sensor enclosure to a vehicle.
Provided herein is a system and method for cooling a sensor enclosure of a vehicle. The system comprises one or more sensors configured to determine a speed of the vehicle, an internal temperature of an enclosure, and an external temperature. The system comprises an enclosure to house the one or more sensors. The system comprises a fan disposed at a base of the enclosure. The system comprises a controller configured to regulate a rotation speed of the fan based on the speed of the vehicle, the internal temperature of the enclosure, the external temperature, or the difference between the internal temperature of the enclosure and the external temperature. The controller operates the fan at the regulated rotation speed.
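One possible reading of the fan-regulation logic above, expressed as a small control function: the internal/external temperature difference drives the fan up, while vehicle speed (ram air already moving through the enclosure) scales it back. The scaling constants are invented for the example and are not from the disclosure.

```python
def fan_speed_rpm(vehicle_speed_mps, t_internal_c, t_external_c,
                  min_rpm=0, max_rpm=4000):
    """Regulate fan speed from vehicle speed and the internal/external
    temperature difference: fast driving already forces airflow through the
    enclosure, so the fan contribution is scaled down, while a large positive
    temperature delta scales it up."""
    delta_t = max(0.0, t_internal_c - t_external_c)
    thermal_demand = min(1.0, delta_t / 20.0)            # saturate at 20 C delta
    ram_air_relief = min(1.0, vehicle_speed_mps / 25.0)  # ~90 km/h
    duty = thermal_demand * (1.0 - 0.7 * ram_air_relief)
    return int(min_rpm + duty * (max_rpm - min_rpm))

if __name__ == "__main__":
    print(fan_speed_rpm(vehicle_speed_mps=2.0, t_internal_c=55, t_external_c=30))
    print(fan_speed_rpm(vehicle_speed_mps=30.0, t_internal_c=55, t_external_c=30))
```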
Described herein are systems, methods, and non-transitory computer readable media for performing an alignment between a first vehicle sensor and a second vehicle sensor. Two-dimensional (2D) data indicative of a scene within an environment being traversed by a vehicle is captured by the first vehicle sensor such as a camera or a collection of multiple cameras within a sensor assembly. A three-dimensional (3D) representation of the scene is constructed using the 2D data. 3D point cloud data also indicative of the scene is captured by the second vehicle sensor, which may be a LiDAR. A 3D point cloud representation of the scene is constructed based on the 3D point cloud data. A rigid transformation is determined between the 3D representation of the scene and the 3D point cloud representation of the scene and the alignment between the sensors is performed based at least in part on the determined rigid transformation.
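The rigid transformation between the camera-derived 3D reconstruction and the LiDAR point cloud can be estimated, once correspondences are available, with the standard SVD-based Kabsch/Umeyama method. The sketch below shows that step only; it assumes corresponding point sets are already given, which the actual alignment pipeline would have to produce.

```python
import numpy as np

def rigid_transform(source, target):
    """Estimate the rotation R and translation t that best map `source` onto
    `target` (both N x 3 arrays of corresponding points) in the least-squares
    sense, using the SVD-based Kabsch/Umeyama method."""
    src_c = source - source.mean(axis=0)
    tgt_c = target - target.mean(axis=0)
    H = src_c.T @ tgt_c
    U, _S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = target.mean(axis=0) - R @ source.mean(axis=0)
    return R, t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src = rng.normal(size=(100, 3))
    angle = np.radians(30.0)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                       [np.sin(angle),  np.cos(angle), 0],
                       [0, 0, 1]])
    tgt = src @ R_true.T + np.array([1.0, -2.0, 0.5])
    R, t = rigid_transform(src, tgt)
    print(np.allclose(R, R_true, atol=1e-6), np.round(t, 3))
```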
Described herein are systems, methods, and non-transitory computer-readable media for isolating commercial components from a harsh vehicle operating environment to increase the longevity of such components and to decrease their failure rate. Also described herein are systems, methods, and non-transitory computer-readable media for monitoring the operational health status of vehicle components for failure, and upon detecting failure of a component, initiating a processing task reassignment and fault recovery process. In this manner, processing load handled by the component prior to failure can be offloaded to one or more other vehicle components while a fault recovery process is initiated for the component. When the failed component is operational again, the vehicle may revert back to the task assignment in place prior to the component failure, may continue with the current task assignment, or may transition to another different task reassignment.
A vehicle generates a city-scale map. The vehicle includes one or more Lidar sensors configured to obtain point clouds at different positions, orientations, and times, one or more processors, and a memory storing instructions that, when executed by the one or more processors, cause the system to perform registering, in pairs, a subset of the point clouds based on respective surface normals of each of the point clouds; determining loop closures based on the registered subset of point clouds; determining a position and an orientation of each of the subset of the point clouds based on constraints associated with the determined loop closures; and generating a map based on the determined position and the orientation of each of the subset of the point clouds.
An apparatus on a vehicle comprises one or more sensors, one or more nozzles that output fluid to clean the respective one or more sensors, and a compressor that generates fluid such as compressed air. The compressor is in fluid communication with the one or more nozzles. The apparatus further comprises one or more processors, and a memory storing instructions that, when executed by the one or more processors, cause the system to determine a current velocity of the vehicle and control an operation of the compressor based on the current velocity of the vehicle.
Provided herein is a power distribution system comprising a feedback circuit including a transistor in series with a relay, the feedback circuit regulating a main power path including a main power supply connected in series with an electric power converter. The power distribution system further comprises OR-ing controllers that regulate the main power path and a backup power path including a low-voltage battery. The power distribution system further comprises terminals through which power from the main power path or the backup power path is transmitted to respective components corresponding to channels. The power distribution system further includes a microcontroller that acquires data in each of the channels and controls operations associated with each of the channels based on the acquired data.
Provided herein is a system on a vehicle, the system comprising one or more sensors, one or more processors, and a memory storing instructions that, when executed by the one or more processors, cause the system to perform: receiving one or more ride requests for ridesharing from one or more users; receiving respective preferences from each of the one or more users; selecting a ride request of a user from the one or more ride requests based on the respective preferences; notifying the user of the selection of the ride request; sending at least one of images or videos of the interior of the vehicle to the user; in response to the sending, determining whether the user confirms the ride request; and in response to determining that the user confirms the ride request, selecting a route to the user and driving, according to the route, to the user.
Provided herein is a headlamp assembly comprising a housing that encloses: a sensor that acquires data associated with a surrounding environment; a light source that illuminates a field of view comprising a portion of the surrounding environment; and one or more processors that analyze the acquired data and determine a direction, field of view, power, or an intensity of the illumination of the portion based on the analyzed data.
B60Q 1/14 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights having dimming means
B60Q 1/00 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
H05B 47/125 - Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
Described herein is a sensor device. The sensor device comprises a housing and a printed circuit board encased by the housing. The printed circuit board comprises an image sensor that captures image data, an image sensor processor that processes the image data, a serializer that converts one or more data channels associated with the image data into a single data channel, and one or more exposed surfaces. The one or more exposed surfaces dissipate heat generated by the image sensor, the image sensor processor, and the serializer from the printed circuit board to the housing.
H05K 1/18 - Printed circuits structurally associated with non-printed electric components
H05K 7/14 - Mounting supporting structure in casing or on frame or rack
H04N 23/52 - Elements optimising image sensor operation, e.g. for electromagnetic interference [EMI] protection or temperature control by heat transfer or cooling elements
09 - Scientific and electric apparatus and instruments
12 - Land, air and water vehicles; parts of land vehicles
39 - Transport, packaging, storage and travel services
42 - Scientific, technological and industrial services, research and design
Goods & Services
Computer hardware and downloadable computer software for operating self-driving vehicles; Computer hardware and recorded computer software for operating self-driving vehicles; Computer hardware; computer hardware for communicating audio, video and data between computers via a global computer network, wide-area computer networks, and peer-to-peer computer network; Downloadable computer software for vehicle navigation; Recorded computer software for vehicle navigation; Downloadable computer software for operating an autonomous vehicle; Recorded computer software for operating an autonomous vehicle; Downloadable computer software for vehicle fleet management; Recorded computer software for vehicle fleet management; Downloadable computer software for scheduling and booking vehicles for passenger transport; Recorded computer software for scheduling and booking vehicles for passenger transport; Downloadable computer software for managing autonomous vehicles; Recorded computer software for managing autonomous vehicles; navigation apparatus for vehicles; navigational instruments, namely, GPS navigation devices and satellite-aided navigation systems; Electronic steering apparatus for vehicles, namely, simulators for the steering and controlling of vehicles; radar apparatus; laser device for sensing distance to objects; laser object detectors for use on vehicles; lidar, namely, light detection and ranging apparatus; vehicle infrared, acceleration, proximity, and velocity sensors; Electric sensors for determining position, velocity, direction, and acceleration; downloadable mobile applications for coordinating transportation services; downloadable mobile applications for booking vehicles for passenger transport; cameras for use with vehicles; Downloadable mobile applications for booking taxis

Land vehicles; automobiles; autonomous cars; autonomous land vehicles; driverless cars

Transportation of passengers and freight by trucks and autonomous vehicles; Transportation services, namely, making reservations and bookings for transportation; transportation of passengers by vehicle; transportation services, namely, providing travel by autonomous vehicles; transportation of passengers by land vehicle; transportation services, namely, pickup and drop-off of passengers at designated or directed locations; providing taxi booking services via mobile applications

Providing online non-downloadable software services for transportation services, namely, software for coordinating, booking, and dispatching autonomous vehicles for transportation purposes; research and development into autonomous vehicles; research, design, and development of computer hardware and software for use with vehicle on-board computers for monitoring and controlling motor vehicle operation; installation, updating, and maintenance of computer software for use with vehicle on-board computers for monitoring and controlling motor vehicle operation
09 - Scientific and electric apparatus and instruments
39 - Transport, packaging, storage and travel services
42 - Scientific, technological and industrial services, research and design
Goods & Services
Computer hardware; computer hardware for communicating data between computers via a global computer network, wide-area computer networks, or peer-to-peer computer networks; Downloadable computer software for vehicle navigation; Recorded computer software for vehicle navigation; Downloadable computer software for operating an autonomous vehicle; Recorded computer software for operating an autonomous vehicle; Downloadable computer software for vehicle fleet management; Recorded computer software for vehicle fleet management; Downloadable computer software for coordinating, scheduling, booking, and dispatching vehicles; Recorded computer software for coordinating, scheduling, booking, and dispatching vehicles; Downloadable computer software for managing autonomous vehicles; Recorded computer software for managing autonomous vehicles; downloadable mobile applications for coordinating transportation services; downloadable mobile applications for coordinating, scheduling, booking, and dispatching vehicles; Downloadable computer software and hardware for vehicle fleet launching, coordination, and management; Recorded computer software and hardware for vehicle fleet launching, coordination, and management

Transportation services, namely, providing services by vehicles; transportation services, namely, making reservations and bookings for transportation; Vehicle sharing services, namely, providing temporary use of vehicles; transportation services, namely, providing travel by autonomous vehicles; transportation reservation services; providing a website featuring information regarding autonomous vehicle transportation services; transportation services, namely, coordinating the pickup and drop-off at designated or directed locations

Providing online non-downloadable software services for transportation services, namely, software for coordinating, booking, and dispatching autonomous vehicles for transportation purposes; research and development into autonomous vehicles; research, design, and development of computer hardware and software for use with vehicle on-board computers; research, design, and development of computer hardware and software for vehicle coordination, navigation, and management; installation, updating, and maintenance of computer software for use with vehicle on-board computers; research, design, and development in the field of artificial intelligence
Provided herein is a system and method for a sensor system on a vehicle. The sensor system comprises sensors connected with one another in a daisy chain communication network. The sensor system further comprises a controller connected to at least one of the sensors. The controller is configured to operate the vehicle based on data from the sensors and to operate the daisy chain communication network.
H04L 67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
Described herein are systems, methods, and computer readable media for dynamically determining a language variant to use for vehicle output to a vehicle occupant based on the vehicle's location. A geographic region may include multiple sub-regions, each of which may be associated with a respective one or more language variants. As an example, a geographic region may be a state or province, and each sub-region may have one or more dialects that are spoken by individuals in that sub-region. In some cases, a particular dialect may be predominant in a given sub-region. As a vehicle traverses a travel path, it may determine its current location, which geographic sub-region includes that location, and which language variant (e.g., dialect) is predominant there. That language variant may then be selected for in-vehicle communication with a vehicle occupant. The vehicle location determination may be made at or near where the occupant entered the vehicle.
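A toy sketch of the location-to-dialect lookup described above, assuming rectangular sub-region bounds and a default variant when no sub-region matches; real geo-fencing and the dialect data themselves would obviously be richer.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SubRegion:
    name: str
    # Axis-aligned bounds (lat_min, lat_max, lon_min, lon_max); a real system
    # would use proper geo-fencing polygons.
    bounds: Tuple[float, float, float, float]
    predominant_variant: str

def select_language_variant(lat: float, lon: float,
                            sub_regions: List[SubRegion],
                            default: str = "standard") -> str:
    """Pick the language variant (e.g. dialect) that is predominant in the
    sub-region containing the vehicle's current location."""
    for region in sub_regions:
        lat_min, lat_max, lon_min, lon_max = region.bounds
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
            return region.predominant_variant
    return default

if __name__ == "__main__":
    regions = [
        SubRegion("coastal", (30.0, 31.0, -98.0, -97.0), "dialect_a"),
        SubRegion("inland", (31.0, 32.0, -98.0, -97.0), "dialect_b"),
    ]
    # Location determined at or near where the occupant entered the vehicle.
    print(select_language_variant(30.4, -97.6, regions))
```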
Provided herein is a system and method for determining whether a sensor is calibrated and error handling of an uncalibrated sensor. The system comprises a sensor system comprising a sensor and an analysis engine configured to determine whether the sensor is uncalibrated. The system further comprises an error handling system configured to perform an error handling in response to the sensor system determining that the sensor is uncalibrated. The method comprises determining, by a sensor system, whether the sensor is uncalibrated, and performing, by an error handling system, an error handling in response to the sensor system determining that the sensor is uncalibrated.
H04L 67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
24.
Correcting or expanding an existing high-definition map
A computing system includes one or more processors and a memory storing instructions that, when executed by the one or more processors, cause the system to perform operations. The operations include determining that a portion of an existing map is to be updated; obtaining a point cloud acquired by one or more Lidar sensors corresponding to a location of the portion; converting the portion into an equivalent point cloud; performing a point cloud registration based on the equivalent point cloud and the point cloud; and updating the existing map based on the point cloud registration.
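A compact, brute-force sketch of the registration step described above: the map portion, converted to an "equivalent point cloud", is aligned to the newly acquired LiDAR cloud with a basic point-to-point ICP loop (nearest neighbours plus a Kabsch solve per iteration). This is a generic stand-in for whatever registration the system actually uses.

```python
import numpy as np

def nearest_neighbors(src, tgt):
    """Brute-force nearest neighbour in the target cloud for every source point."""
    d2 = ((src[:, None, :] - tgt[None, :, :]) ** 2).sum(axis=2)
    return tgt[np.argmin(d2, axis=1)]

def best_fit_transform(src, tgt):
    """Least-squares rigid transform (Kabsch) between corresponding point sets."""
    sc, tc = src.mean(axis=0), tgt.mean(axis=0)
    H = (src - sc).T @ (tgt - tc)
    U, _S, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, tc - R @ sc

def icp(src, tgt, iterations=20):
    """Register src (the map portion converted to an equivalent point cloud)
    to tgt (the newly acquired LiDAR cloud) and return the accumulated R, t."""
    R_total, t_total = np.eye(3), np.zeros(3)
    moved = src.copy()
    for _ in range(iterations):
        matched = nearest_neighbors(moved, tgt)
        R, t = best_fit_transform(moved, matched)
        moved = moved @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    map_portion = rng.uniform(-5, 5, size=(200, 3))
    lidar_cloud = map_portion + np.array([0.2, -0.1, 0.05])   # small drift to correct
    R, t = icp(map_portion, lidar_cloud)
    print(np.round(t, 3))   # should approximately recover the small offset
```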
A computing system is implemented as part of a vehicle architecture. The computing system includes a computing component, a first computing node that includes a power distribution system, a second computing node that includes input/output (I/O) interfaces to connect to devices, actuators, or sensors, and a third computing node. The computing component further comprises one or more processors and instructions or logic that, when executed by the one or more processors, cause the computing component to perform: transmitting commands to the second computing node, the commands associated with initial processing of data received at the second computing node; receiving initially processed data from the second computing node; and performing further processing on the initially processed data.
B60W 50/04 - Monitoring the functioning of the control system
G05B 13/02 - Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time
B60W 50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
26.
EFFICIENT RETRIEVAL OF SENSOR DATA WHILE ENSURING ATOMICITY
A computing device performs initial processing of sensor data. The computing device performs obtaining sensor data, writing the sensor data to first addresses of a dynamically allocated buffer associated with the computing device, encoding the sensor data, writing the encoded sensor data to second addresses of the dynamically allocated buffer, in response to completing the writing of the encoded sensor data, indicating that the writing of the encoded sensor data has been completed, receiving, from a computing resource, a polling request to read the encoded sensor data, transmitting, to the computing resource, a status that the writing of the encoded sensor data to the second addresses has been completed, reading, to a memory of the computing resource, the encoded sensor data, receiving, from the computing resource, a second status that the encoded sensor data has been read, and removing, from the dynamically allocated buffer, the encoded sensor data.
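The write/poll/read/acknowledge hand-off described above can be illustrated with a small in-memory buffer in which a slot becomes visible to the reader only after its encoded write is flagged complete, and is reclaimed only after the reader acknowledges. The slot layout, the stand-in encoder, and the method names are assumptions for the sketch, not the claimed data structure.

```python
from dataclasses import dataclass, field

@dataclass
class Slot:
    raw: bytes = b""
    encoded: bytes = b""
    write_complete: bool = False   # set only after the encoded write finishes

@dataclass
class SensorBuffer:
    """Dynamically allocated buffer shared between the capture device and a
    polling compute resource.  Readers only see slots whose encoded write has
    been flagged complete, which is what gives the hand-off its atomicity."""
    slots: dict = field(default_factory=dict)
    next_id: int = 0

    def write(self, raw: bytes) -> int:
        slot_id = self.next_id
        self.next_id += 1
        slot = Slot(raw=raw)
        self.slots[slot_id] = slot
        slot.encoded = raw[::-1]          # stand-in for the real encoder
        slot.write_complete = True        # published last, after encoding
        return slot_id

    def poll(self, slot_id: int) -> bool:
        slot = self.slots.get(slot_id)
        return bool(slot and slot.write_complete)

    def read(self, slot_id: int) -> bytes:
        if not self.poll(slot_id):
            raise RuntimeError("encoded data not ready")
        return self.slots[slot_id].encoded

    def ack(self, slot_id: int) -> None:
        # The second status from the compute resource: data has been read,
        # so the slot can be reclaimed from the buffer.
        self.slots.pop(slot_id, None)

if __name__ == "__main__":
    buf = SensorBuffer()
    sid = buf.write(b"frame-0")
    if buf.poll(sid):
        data = buf.read(sid)
        buf.ack(sid)
    print(data, sid in buf.slots)
```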
Described herein are systems, methods, and non-transitory computer-readable media for self-detection of a fault condition by a vehicle component, generation of a device health code that includes multiple tiers of information relating to the fault condition experienced by the vehicle component, and broadcasting of the device health code to one or more other vehicle components via one or more vehicle communication networks. A recommended vehicle response measure indicated by a reaction code in the device health code can then be taken or alternate vehicle response may be selected and initiated based on an evaluation of current vehicle operational data.
B60R 16/023 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric for transmission of signals between vehicle parts or subsystems
A computing device performs initial processing of sensor data. The computing device includes one or more processors and instructions or logic that, when executed by the one or more processors, cause the computing device to perform obtaining sensor data, encoding the sensor data, writing the encoded sensor data to a dynamically allocated buffer, and logging a status of the written encoded sensor data at a static location of the dynamically allocated buffer. The status includes any one or more of memory addresses at which frames of the sensor data begin in the dynamically allocated buffer, valid bit fields corresponding to the frames, and sizes of each of data segments within the frames. The instructions further cause the computing device to perform, in response to receiving a polling request from a computing resource, transmitting the logged status to the computing resource.
H04N 19/30 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability
H04N 19/423 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation characterised by memory arrangements
H04N 19/436 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation using parallelised computational arrangements
H04N 19/625 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding using discrete cosine transform [DCT]
A computing device triggers a sensor operation. The computing device includes one or more processors and instructions or logic that, when executed by the one or more processors, implements computing functions. The computing device performs receiving timestamps from a sensor, simulating an operation of the sensor, the simulation including predicting orientations of the sensor at different times based on the received timestamps, comparing a latest timestamp of the computing device to a latest timestamp of the sensor, and based on the comparison, triggering a second sensor to perform an operation.
Described herein are systems, methods, and non-transitory computer-readable media for self-detection of a fault condition by a vehicle component, generation of a device health code that includes multiple tiers of information relating to the fault condition experienced by the vehicle component, and broadcasting of the device health code to one or more other vehicle components via one or more vehicle communication networks. A recommended vehicle response measure indicated by a reaction code in the device health code can then be taken or alternate vehicle response may be selected and initiated based on an evaluation of current vehicle operational data.
G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time
Described herein are systems, methods, and non-transitory computer-readable media for self-detection of fault conditions experienced by vehicle components, generation of device health codes indicative of the fault conditions, and broadcasting of the device health codes over mixed vehicle communication networks. The device health codes can be parsed to identify fault information, and the fault information can be assessed along with current vehicle operational data to determine a recommended vehicle response measure to one or more fault conditions experienced by one or more vehicle components.
G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time
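A hypothetical sketch of a tiered device health code and the "recommended or alternate response" choice described in the entries above. The 32-bit field layout, reaction-code table, and severity rule are invented for illustration; the filings do not specify this encoding.

```python
from dataclasses import dataclass

# Illustrative 32-bit layout: [component:8][fault:8][severity:8][reaction:8].
# The field names and widths are assumptions, not the encoding from the filings.

@dataclass
class DeviceHealthCode:
    component_id: int
    fault_id: int
    severity: int
    reaction_code: int

    def pack(self) -> int:
        return ((self.component_id << 24) | (self.fault_id << 16)
                | (self.severity << 8) | self.reaction_code)

    @staticmethod
    def parse(word: int) -> "DeviceHealthCode":
        return DeviceHealthCode((word >> 24) & 0xFF, (word >> 16) & 0xFF,
                                (word >> 8) & 0xFF, word & 0xFF)

RECOMMENDED_RESPONSES = {
    0x01: "continue, log only",
    0x02: "degrade: reduce speed",
    0x03: "minimal-risk stop",
}

def choose_response(code: DeviceHealthCode, vehicle_speed_mps: float) -> str:
    """Take the recommended reaction unless current vehicle data argues for a
    stronger one, mirroring the 'recommended or alternate' choice above."""
    recommended = RECOMMENDED_RESPONSES.get(code.reaction_code, "continue, log only")
    if code.severity >= 0xC0 and vehicle_speed_mps > 20.0:
        return RECOMMENDED_RESPONSES[0x03]
    return recommended

if __name__ == "__main__":
    broadcast = DeviceHealthCode(0x12, 0x07, 0xD0, 0x02).pack()
    parsed = DeviceHealthCode.parse(broadcast)
    print(hex(broadcast), choose_response(parsed, vehicle_speed_mps=25.0))
```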
Provided herein is a system comprising: one or more processors; and a memory storing instructions that, when executed by the one or more processors, cause the system to perform: identifying, in a map, one or more entities that change over time; predicting an amount of change of the identified one or more entities over time; and updating the map based on the predicted amount of change of the identified one or more entities over time.
A computer-implemented method and a system for training a computer-based autonomous driving model used for an autonomous driving operation by an autonomous vehicle are described. The method includes: creating time-dependent three-dimensional (3D) traffic environment data using at least one of real traffic element data and simulated traffic element data; creating simulated time-dependent 3D traffic environmental data by applying a time-dependent 3D generative adversarial network (GAN) model to the created time-dependent 3D traffic environment data; and training a computer-based autonomous driving model using the simulated time-dependent 3D traffic environmental data.
Provided herein is a system and method of a vehicle that detects a signal and reacts to the signal. The system comprises one or more sensors; one or more processors; and a memory storing instructions that, when executed by the one or more processors, cause the system to perform: detecting a signal from a source; determining an intended action of the vehicle based on the detected signal; sending, to the source, a response signal indicative of the intended action; determining whether the source has sent a response to the response signal; and in response to determining that the source has sent a response to the response signal, taking the intended action based on the response to the response signal.
G05D 1/02 - Control of position or course in two dimensions
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B60Q 1/34 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating change of drive direction
35.
System and method for determining vehicle navigation in response to broken or uncalibrated sensors
Provided herein is a system and method for providing vehicle navigation in response to broken or uncalibrated vehicle sensors. The system comprises a sensor to capture data, one or more processors, and a memory storing instructions that cause the system to determine whether the sensor is broken or uncalibrated based on the data, and to limit a range of motion of the vehicle when the sensor is determined to be broken or uncalibrated. Limiting the range of motion of the vehicle includes limiting the number of driving options available to the vehicle when the sensor is broken.
G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G05D 1/02 - Control of position or course in two dimensions
A method comprises obtaining smart seat sensor data, the smart seat sensor data being detected by a tactile-sensitive surface material of a seat of an autonomous vehicle in response to a user interacting with the tactile-sensitive surface material. Other sensor data is obtained from one or more other sensors disposed within the autonomous vehicle. The smart seat sensor data and the other sensor data are integrated. A behavior of the user is estimated based on the integrated data, and the autonomous vehicle is controlled based on the estimated behavior of the user.
B60W 10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
B60W 10/06 - Conjoint control of vehicle sub-units of different type or different function including control of propulsion units including control of combustion engines
B60W 40/08 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to drivers or passengers
37.
Computerized detection of unsafe driving scenarios
Systems, methods, and non-transitory computer-readable media configured to obtain one or more series of successive sensor data frames during a navigation of a vehicle. Disengagement data is obtained. The disengagement data indicates whether a vehicle is in autonomous mode. A training dataset with which to train a machine learning model is determined based on the one or more series of successive sensor data frames and the disengagement data. The training dataset includes a subset of the one or more series of successive sensor data frames and a subset of the disengagement data, the machine learning model being trained to identify unsafe driving conditions.
G05B 13/02 - Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
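One simple way to realize the dataset construction described above: treat frames inside a lookback window ending at a disengagement (autonomous-to-manual transition) as positive examples of unsafe scenarios, and everything else as negatives. The window length and labeling rule are assumptions, not the claimed procedure.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    timestamp_s: float
    autonomous: bool        # disengagement data: was autonomous mode active?
    sensor_blob: bytes = b""

def label_unsafe_windows(frames: List[Frame], lookback_s: float = 5.0):
    """Label every frame inside a lookback window that ends at a disengagement
    (autonomous -> manual transition) as a positive example of an unsafe
    scenario; everything else becomes a negative example."""
    disengage_times = [
        cur.timestamp_s
        for prev, cur in zip(frames, frames[1:])
        if prev.autonomous and not cur.autonomous
    ]
    dataset = []
    for f in frames:
        positive = any(0.0 <= t - f.timestamp_s <= lookback_s
                       for t in disengage_times)
        dataset.append((f, int(positive)))
    return dataset

if __name__ == "__main__":
    frames = [Frame(t, autonomous=(t < 8.0)) for t in range(12)]
    labels = [y for _, y in label_unsafe_windows(frames)]
    print(labels)   # frames shortly before the disengagement are positives
```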
38.
Contextualization and refinement of simultaneous localization and mapping
A computing system includes one or more processors and a memory storing instructions that, when executed by the one or more processors, cause the system to perform operations. The operations include obtaining sensor data from a sensor of a vehicle, the sensor data including point cloud frames at different positions, orientations, and times, the sensor data used to generate a map, determining a position and an orientation of the sensor corresponding to a capture of each of the point cloud frames according to a simultaneous localization and mapping (SLAM) algorithm, and depicting, on an interface, a graphical illustration of the determined positions at which the point cloud frames were captured.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
39.
COMPUTERIZED DETECTION OF UNSAFE DRIVING SCENARIOS
Systems, methods, and non-transitory computer-readable media configured to obtain one or more series of successive sensor data frames during a navigation of a vehicle. Disengagement data is obtained. The disengagement data indicates whether a vehicle is in autonomous mode. A training dataset with which to train a machine learning model is determined based on the one or more series of successive sensor data frames and the disengagement data. The training dataset includes a subset of the one or more series of successive sensor data frames and a subset of the disengagement data, the machine learning model being trained to identify unsafe driving conditions.
B60W 30/095 - Predicting travel path or likelihood of collision
G05B 13/02 - Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
A computing system includes one or more processors and a memory storing instructions that, when executed by the one or more processors, cause the system to perform operations. The operations include obtaining sensor data from a sensor of a vehicle, the sensor data including point cloud frames at different positions, orientations, and times, the sensor data used to generate a map, determining a position and an orientation of the sensor corresponding to a capture of each of the point cloud frames according to a simultaneous localization and mapping (SLAM) algorithm, and depicting, on an interface, a graphical illustration of the determined positions at which the point cloud frames were captured.
An apparatus includes a processing node of a distributed computing platform. The processing node communicates with other processing nodes over one or more networks. The processing node may receive frames of point clouds at a processing node of a distributed computing platform, determine a subset of the frames as key frames based at least in part on distances travelled between captures of the respective frames, and allocate tasks of processing the key frames to processing subnodes based at least in part on estimated processing demands of the key frames and processing capabilities of each of the processing subnodes.
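A small sketch of the two allocation ideas above: key frames are chosen by distance travelled between captures, and their processing is assigned greedily to the subnode with the most remaining capacity. The distance threshold, demand model, and greedy policy are illustrative assumptions, not the claimed allocation method.

```python
import numpy as np

def select_key_frames(positions, min_distance_m=2.0):
    """Keep a frame as a key frame only if the platform has travelled at least
    min_distance_m since the last key frame."""
    key_indices = [0]
    last = np.asarray(positions[0], dtype=float)
    for i, p in enumerate(positions[1:], start=1):
        p = np.asarray(p, dtype=float)
        if np.linalg.norm(p - last) >= min_distance_m:
            key_indices.append(i)
            last = p
    return key_indices

def allocate(key_frames, demands, capacities):
    """Greedy allocation of key-frame processing tasks to subnodes: heaviest
    task first, onto the subnode with the most remaining capacity."""
    remaining = list(capacities)
    assignment = {}
    for frame in sorted(key_frames, key=lambda f: demands[f], reverse=True):
        node = max(range(len(remaining)), key=lambda n: remaining[n])
        assignment[frame] = node
        remaining[node] -= demands[frame]
    return assignment

if __name__ == "__main__":
    track = [(0, 0), (0.5, 0), (2.5, 0), (3.0, 0), (5.5, 0)]
    keys = select_key_frames(track)                      # [0, 2, 4]
    demand = {k: 10 * (k + 1) for k in keys}
    print(keys, allocate(keys, demand, capacities=[40, 60]))
```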
A computing system includes one or more processors and a memory storing instructions that, when executed by the one or more processors, cause the system to perform operations. The operations include determining that a portion of an existing map is to be updated; obtaining a point cloud acquired by one or more Lidar sensors corresponding to a location of the portion; converting the portion into an equivalent point cloud; performing a point cloud registration based on the equivalent point cloud and the point cloud; and updating the existing map based on the point cloud registration.
Provided herein is a system and method that acquires data and determines a driving action based on the data. The system comprises a sensor, one or more processors, and a memory storing instructions that, when executed by the one or more processors, cause the system to perform: determining data of interest comprising an object, feature, or region of interest; determining whether a sharpness of the data of interest exceeds a threshold; in response to determining that the sharpness does not exceed the threshold, operating the sensor to increase the sharpness of the data of interest until the sharpness exceeds the threshold; in response to the sharpness exceeding the threshold, determining a driving action of a vehicle based on the data of interest; and performing the driving action.
B60K 31/00 - Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator
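A toy version of the sharpen-then-decide loop above, using a Tenengrad-style (mean squared gradient) focus measure over the region of interest and a fake camera whose blur decreases as the "focus" is adjusted. The metric, threshold, and adjustment model are stand-ins for whatever the sensor actually exposes.

```python
import numpy as np

def sharpness(patch):
    """Tenengrad-style focus measure: mean squared gradient magnitude over the
    region of interest (an illustrative metric, not the claimed one)."""
    gy, gx = np.gradient(patch.astype(float))
    return float(np.mean(gx ** 2 + gy ** 2))

def refocus_until_sharp(capture, adjust_sensor, roi, threshold=700.0, max_steps=10):
    """Keep adjusting the sensor until the region of interest is sharp enough
    to base a driving decision on, as the abstract describes."""
    for step in range(max_steps):
        patch = capture()[roi]
        if sharpness(patch) > threshold:
            return True, step
        adjust_sensor(step)
    return False, max_steps

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    blur_level = [4]                         # mutable state for the fake camera

    def capture():
        img = np.zeros((64, 64))
        img[:, 32:] = 255.0                  # a hard vertical edge in the scene
        k = blur_level[0]
        if k > 1:                            # crude box blur to mimic defocus
            kernel = np.ones(k) / k
            img = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 1, img)
        return img + rng.normal(0.0, 1.0, img.shape)

    def adjust_sensor(step):
        blur_level[0] = max(1, blur_level[0] - 1)   # each step improves focus

    print(refocus_until_sharp(capture, adjust_sensor, roi=np.s_[16:48, 16:48]))
```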
A computing system includes one or more processors and a memory storing instructions that, when executed by the one or more processors, cause the system to perform operations. The operations include determining that a portion of an existing map is to be updated; obtaining a point cloud acquired by one or more Lidar sensors corresponding to a location of the portion; converting the portion into an equivalent point cloud; performing a point cloud registration based on the equivalent point cloud and the point cloud; and updating the existing map based on the point cloud registration.
09 - Scientific and electric apparatus and instruments
12 - Land, air and water vehicles; parts of land vehicles
39 - Transport, packaging, storage and travel services
42 - Scientific, technological and industrial services, research and design
Goods & Services
Computer hardware; computer hardware for communicating data between computers via a global computer network, wide-area computer networks, or peer-to-peer computer networks; Downloadable computer software for vehicle navigation; Recorded computer software for vehicle navigation; Downloadable computer software for operating an autonomous vehicle; Recorded computer software for operating an autonomous vehicle; Downloadable computer software for vehicle fleet management; Recorded computer software for vehicle fleet management; Downloadable computer software for coordinating, scheduling, booking, and dispatching vehicles; Recorded computer software for coordinating, scheduling, booking, and dispatching vehicles; Downloadable computer software for managing autonomous vehicles; Recorded computer software for managing autonomous vehicles; downloadable mobile applications for coordinating transportation services; downloadable mobile applications for coordinating, scheduling, booking, and dispatching vehicles; Downloadable computer software and hardware for vehicle fleet launching, coordination, and management; Recorded computer software and hardware for vehicle fleet launching, coordination, and management

Land vehicles; automobiles; trucks; freight land vehicles; mass transit land vehicles; autonomous cars; autonomous land vehicles; driverless land vehicles; electric land vehicles; freight train containers; plastic parts for vehicles, namely, automotive exterior and interior plastic extruded decorative and protective trim; metal parts for vehicles, namely, automotive exterior and interior metal decorative and protective trim

Transportation services, namely, providing services by vehicles; transportation services, namely, making reservations and bookings for transportation; Vehicle sharing services, namely, providing temporary use of vehicles; transportation services, namely, providing travel by autonomous vehicles; transportation reservation services; providing a website featuring information regarding autonomous vehicle transportation services; transportation services, namely, coordinating the pickup and drop-off at designated or directed locations

Providing online non-downloadable software services for transportation services, namely, software for coordinating, booking, and dispatching autonomous vehicles for transportation purposes; research and development into autonomous vehicles; research, design, and development of computer hardware and software for use with vehicle on-board computers; research, design, and development of computer hardware and software for vehicle coordination, navigation, and management; installation, updating, and maintenance of computer software for use with vehicle on-board computers; research, design, and development in the field of artificial intelligence
An adaptive filter system and a method for controlling the adaptive filter system are described herein. The system can include one or more filters to attenuate incoming light. The one or more filters can be moved by one or more actuators. The method can capture image data from an imaging device through the one or more filters. Information can be determined from the captured image data. The one or more filters can be moved to a position for capturing image data based on the information.
A vehicle generates a city-scale map. The vehicle includes one or more Lidar sensors configured to obtain point clouds at different positions, orientations, and times, one or more processors, and a memory storing instructions that, when executed by the one or more processors, cause the system to perform registering, in pairs, a subset of the point clouds based on respective surface normals of each of the point clouds; determining loop closures based on the registered subset of point clouds; determining a position and an orientation of each of the subset of the point clouds based on constraints associated with the determined loop closures; and generating a map based on the determined position and the orientation of each of the subset of the point clouds.
Provided herein is a system and method implemented on a vehicle. The system comprises one or more sensors, one or more processors, and a memory storing instructions that, when executed by the one or more processors, cause the system to perform: obtaining data from the one or more sensors; comparing the obtained data from the one or more sensors with reference data; determining whether one or more characteristics of the obtained data deviate from corresponding characteristics of the reference data by more than a respective threshold; in response to determining that one or more characteristics of the obtained data deviate from corresponding characteristics of the reference data by more than a respective threshold, determining an action of the vehicle based on amounts of the one or more deviations; and performing the determined action.
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G05D 1/02 - Control of position or course in two dimensions
B60Q 1/50 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
B60Q 1/54 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking for indicating speed
49.
Automated vehicle safety response methods and corresponding vehicle safety systems with serial-parallel computing architectures
Described herein are systems, methods, and non-transitory computer-readable media for implementing automated vehicle safety response measures to ensure continued safe automated vehicle operation for a limited period of time after a vehicle component or vehicle system that supports an automated vehicle driving function fails. When a critical vehicle component/system such as a vehicle computing platform fails, the vehicle is likely no longer capable of performing calculations required to safely operate and navigate the vehicle in an autonomous manner, or at a minimum, is no longer able to ensure the accuracy of such calculations. In such a scenario, the automated vehicle safety response measures disclosed herein can ensure—despite failure of the vehicle component/system—continued safe automated operation of the vehicle for a limited period of time in order to bring the vehicle to a safe stop.
A vehicle generates a city-scale map. The vehicle includes one or more Lidar sensors configured to obtain point clouds at different positions, orientations, and times,
one or more processors, and a memory storing instructions that, when executed by the one or more processors, cause the system to perform registering, in pairs, a subset of the point clouds based on respective surface normals of each of the point clouds; determining loop closures based on the registered subset of point clouds; determining a position and an orientation of each of the subset of the point clouds based on constraints associated with the determined loop closures; and generating a map based on the determined position and the orientation of each of the subset of the point clouds.
Described herein are apparatuses and methods for selectively controlling the application of a fluid to a sensor enclosure such as a camera housing to protect the housing from overheating. An apparatus that includes a protective shield and a conduit such as tubing for supplying a fluid is described. The protective shield is provided so as to protect an exterior surface of the camera housing from heat caused by sun exposure. The tubing includes an inlet for supplying a fluid such as water or air, can extend through or around an exterior of the camera housing, and includes an outlet with one or more nozzles for ejecting the fluid into a space between the protective shield and the camera housing. Sensor data is received from various vehicle sensors to assess the temperature of the housing, the velocity of the vehicle, and so forth to determine when the fluid should be supplied.
Described herein are systems, methods, and non-transitory computer-readable media for implementing automated vehicle safety response measures to ensure continued safe automated vehicle operation for a limited period of time after a vehicle component or vehicle system that supports an automated vehicle driving function fails. When a critical vehicle component/system such as a vehicle computing platform fails, the vehicle is likely no longer capable of performing calculations required to safely operate and navigate the vehicle in an autonomous manner, or at a minimum, is no longer able to ensure the accuracy of such calculations. In such a scenario, the automated vehicle safety response measures disclosed herein can ensure—despite failure of the vehicle component/system—continued safe automated operation of the vehicle for a limited period of time in order to bring the vehicle to a safe stop.
B60W 50/02 - Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
G07C 5/02 - Registering or indicating driving, working, idle, or waiting time only
H04L 67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
53.
Systems and methods for implementing a tracking camera system onboard an autonomous vehicle
Systems, methods, and non-transitory computer-readable media are provided for implementing a tracking camera system onboard an autonomous vehicle. Coordinate data of an object can be received. The tracking camera system actuates, based on the coordinate data, to a position such that the object is in view of the tracking camera system. Vehicle operation data of the autonomous vehicle can be received. The position of the tracking camera system can be adjusted, based on the vehicle operation data, such that the object remains in view of the tracking camera system while the autonomous vehicle is in motion. A focus of the tracking camera system can be adjusted to bring the object in focus. The tracking camera system captures image data corresponding to the object.
Described herein are systems, methods, and non-transitory computer readable media for using 3D point cloud data such as that captured by a LiDAR as ground truth data for training an instance segmentation deep learning model. 3D point cloud data captured by a LiDAR can be projected on a 2D image captured by a camera and provided as input to a 2D instance segmentation model. 2D sparse instance segmentation masks may be generated from the 2D image with the projected 3D data points. These 2D sparse masks can be used to propagate loss during training of the model. Generation and use of the 2D image data with the projected 3D data points as well as the 2D sparse instance segmentation masks for training the instance segmentation model obviates the need to generate and use actual instance segmentation data for training, thereby providing an improved technique for training an instance segmentation model.
G06V 10/50 - Extraction of image or video features by performing operations within image blocks; Extraction of image or video features by using histograms, e.g. histogram of oriented gradients [HoG]; Extraction of image or video features by summing image-intensity values; Projection analysis
G01S 17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
G06K 9/62 - Methods or arrangements for recognition using electronic means
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
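The projection and sparse-mask steps described in the entry above reduce to standard pinhole-camera math. The sketch below projects LiDAR points into the image with assumed intrinsic/extrinsic matrices and rasterises them into a per-pixel instance mask whose empty pixels would contribute no loss; the matrix values and array names are illustrative.

```python
import numpy as np

def project_points(points_lidar, T_cam_from_lidar, K):
    """Project 3-D LiDAR points into pixel coordinates using a camera
    extrinsic (4x4) and intrinsic (3x3) matrix."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    cam = (T_cam_from_lidar @ pts_h.T).T[:, :3]
    in_front = cam[:, 2] > 0.1               # drop points behind the camera
    cam = cam[in_front]
    uv = (K @ cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    return uv, in_front

def sparse_instance_mask(uv, instance_ids, image_shape):
    """Rasterise the projected points into a sparse per-pixel instance mask;
    pixels with no LiDAR return stay 0 and contribute no training loss."""
    mask = np.zeros(image_shape, dtype=np.int32)
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    keep = (u >= 0) & (u < image_shape[1]) & (v >= 0) & (v < image_shape[0])
    mask[v[keep], u[keep]] = instance_ids[keep]
    return mask

if __name__ == "__main__":
    K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
    T = np.eye(4)                             # camera frame == LiDAR frame here
    pts = np.array([[0.5, 0.2, 10.0], [-1.0, 0.0, 8.0], [0.0, -0.5, -2.0]])
    ids = np.array([1, 2, 3])
    uv, in_front = project_points(pts, T, K)
    mask = sparse_instance_mask(uv, ids[in_front], (480, 640))
    print(uv.astype(int), int((mask > 0).sum()))
```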
55.
System and method for selectively generating electricity
Provided herein is a system and method for heat exchange of a vehicle. The system comprises an enclosure disposed on the vehicle. The enclosure comprises a vent at a base of the enclosure. The enclosure houses one or more sensors. The enclosure comprises a fan disposed at the base of the enclosure. The heat exchange system comprises a deflector disposed on the vehicle outside the enclosure and configured to direct an airflow into the vent of the enclosure. The heat exchange system comprises a motor configured to generate electricity from the airflow and selectively supply electricity to operate the fan. The heat exchange system comprises a controller configured to adjust the deflector and regulate an amount of electricity supplied from the motor to the fan.
F03D 9/25 - Wind motors characterised by the driven apparatus the apparatus being an electrical generator
B60H 1/00 - Heating, cooling or ventilating devices
B60R 16/03 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric for supply of electrical power to vehicle subsystems
56.
Autonomous vehicle navigation using coalescing constraints for static map data
Systems, methods, and non-transitory computer readable media are provided for obtaining a slice of static map data comprising a plurality of blocks, each block comprising a plurality of cells, each cell having a cell value indicating a probability that an object is present in the cell; loading the slice into a cache memory of a parallel processor; arranging the static map data in the cache memory in contiguous memory spaces assigned to a group of workers of the parallel processor that have coalescing constraints; loading a frame of dynamic map data into the cache memory; obtaining a plurality of scan match candidates each representing a possible position and attitude of the vehicle; processing, in the parallel processor, the static and dynamic map data and the candidates to generate results each representing a candidate and score; and selecting the candidate having the highest score as a vehicle position.
G01C 21/32 - Structuring or formatting of map data
G05D 1/02 - Control of position or course in two dimensions
G06V 10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; Image or video pattern matching; Proximity measures in feature spaces using context analysis; Selection of dictionaries
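The scan-matching step described in the entry above can be pictured with the following NumPy sketch; it scores each candidate pose by summing the static-grid occupancy probabilities at the cells hit by the transformed dynamic scan and keeps the highest score. The function names and the CPU loop are illustrative; the memory-coalescing layout for the parallel workers is a GPU detail not shown here.

```python
import numpy as np

def score_candidate(static_grid, scan_xy, pose):
    """Score one pose candidate: sum of occupancy probabilities at the cells
    hit by the dynamic scan after transforming it by the candidate pose.

    static_grid: 2D array of per-cell occupancy probabilities.
    scan_xy: Nx2 scan points in the vehicle frame, already in cell units.
    pose: (x, y, heading) candidate position and attitude.
    """
    x, y, th = pose
    c, s = np.cos(th), np.sin(th)
    pts = scan_xy @ np.array([[c, -s], [s, c]]).T + np.array([x, y])
    cells = np.round(pts).astype(int)
    h, w = static_grid.shape
    ok = (cells[:, 0] >= 0) & (cells[:, 0] < w) & (cells[:, 1] >= 0) & (cells[:, 1] < h)
    return static_grid[cells[ok, 1], cells[ok, 0]].sum()

def best_candidate(static_grid, scan_xy, candidates):
    """Evaluate every candidate (done in parallel on the real hardware) and
    keep the highest-scoring one as the vehicle position estimate."""
    scores = [score_candidate(static_grid, scan_xy, p) for p in candidates]
    return candidates[int(np.argmax(scores))]
```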
57.
Autonomous vehicle navigation using coalescing constraints for static map data
Systems, methods, and non-transitory computer readable media are provided for obtaining a slice of static map data comprising a plurality of blocks, each block comprising a plurality of cells, each cell having a cell value indicating a probability that an object is present in the cell; loading the slice into a cache memory of a parallel processor; arranging the static map data in the cache memory in contiguous memory spaces assigned to a group of workers of the parallel processor that have coalescing constraints; loading a frame of dynamic map data into the cache memory; obtaining a plurality of scan match candidates each representing a possible position and attitude of the vehicle; processing, in the parallel processor, the static and dynamic map data and the candidates to generate results each representing a candidate and score; and selecting the candidate having the highest score as a vehicle position.
A system included in an autonomous-driving vehicle and a computer-implemented method performed in the vehicle are described. The system performs: receiving a request to meet a person at a location; driving the vehicle to the location; identifying the person at the location; and providing an instruction for the person to interact with the vehicle.
G06Q 50/40 - Business processes related to the transportation industry
B60Q 1/50 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
G06F 21/35 - User authentication involving the use of external additional devices, e.g. dongles or smart cards communicating wirelessly
Described herein are systems, methods, and non-transitory computer-readable media for isolating commercial components from a harsh vehicle operating environment to increase the longevity of such components and to decrease their failure rate. Also described herein are systems, methods, and non-transitory computer-readable media for monitoring the operational health status of vehicle components for failure, and upon detecting failure of a component, initiating a processing task reassignment and fault recovery process. In this manner, processing load handled by the component prior to failure can be offloaded to one or more other vehicle components while a fault recovery process is initiated for the component. When the failed component is operational again, the vehicle may revert to the task assignment in place prior to the component failure, may continue with the current task assignment, or may transition to a different task assignment.
Described herein are systems, methods, and computer readable media for performing data conversion on sensor data to obtain modified sensor data that is formatted/structured appropriately for downstream processes that rely on the sensor data as input. The sensor data can include point cloud data captured by a LiDAR, for example. A grid structure and corresponding grid characteristics can be determined and the sensor data can be converted to grid-based sensor data by associating the grid structure and its characteristics with the sensor data. Generating the grid-based sensor data can include reformatting the point cloud data to superimpose the grid structure and its grid characteristics onto the point cloud data. Various downstream processing that cannot feasibly be performed on the raw sensor data can then be performed efficiently on the modified grid-based sensor data by virtue of the grid structure imbuing the sensor data with spatial proximity information.
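A minimal sketch of the grid conversion described above, assuming a simple 2D binning by a chosen cell size (the abstract does not fix the grid characteristics); each point is assigned to the cell containing it so that spatial neighbours can later be looked up by cell index rather than by scanning raw points.

```python
import numpy as np
from collections import defaultdict

def to_grid(points_xyz, cell_size=0.5):
    """Superimpose a 2D grid on a point cloud: map each point to the (i, j)
    cell containing its x, y coordinates, so downstream code can look up
    spatial neighbours by cell index instead of scanning raw points."""
    cells = np.floor(points_xyz[:, :2] / cell_size).astype(int)
    grid = defaultdict(list)
    for idx, (i, j) in enumerate(map(tuple, cells)):
        grid[(i, j)].append(idx)            # indices of the points in this cell
    return grid

# Neighbouring points of a query location are the points stored in the query
# cell and its adjacent cells.
points = np.random.rand(1000, 3) * 20.0
grid = to_grid(points, cell_size=1.0)
neighbours = grid.get((5, 5), [])
```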
A sensor enclosure comprising a domed cover and a base. The base can be encased by the domed cover. The base comprises an inner frame, an outer frame, one or more wipers, and a powertrain. The inner frame can provide surfaces for one or more sensors. The outer frame, disposed underneath the inner frame, includes a slewing ring. The slewing ring comprises an inner ring to which the domed cover is attached and an outer ring attached to the outer frame. The one or more wipers extend vertically from the outer frame, each wiper having a first end attached to the outer frame and a second end attached to a support ring, and each wiper making contact with the domed cover. The powertrain, disposed within the outer frame, is configured to rotate the inner ring and the domed cover attached to the inner ring.
Described herein are systems, methods, and non-transitory computer readable media for memory address encoding of multi-dimensional data in a manner that optimizes the storage and access of such data in linear data storage. The multi-dimensional data may be spatial-temporal data that includes two or more spatial dimensions and a time dimension. An improved memory architecture is provided that includes an address encoder that takes a multi-dimensional coordinate as input and produces a linear physical memory address. The address encoder encodes the multi-dimensional data such that two multi-dimensional coordinates close to one another in multi-dimensional space are likely to be stored in close proximity to one another in linear data storage. In this manner, the number of main memory accesses, and thus, overall memory access latency is reduced, particularly in connection with real-world applications in which the respective probabilities of moving along any given dimension are very close.
G06F 12/0811 - Multiuser, multiprocessor or multiprocessing cache systems with multilevel cache hierarchies
G06F 12/0846 - Cache with multiple tag or data arrays being simultaneously accessible
G06F 12/1045 - Address translation using associative or pseudo-associative address translation means, e.g. translation look-aside buffer [TLB] associated with a data cache
G06F 12/0831 - Cache consistency protocols using a bus scheme, e.g. with bus monitoring or watching means
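One common way to realize the property described above, namely that coordinates close in multi-dimensional space map to nearby linear addresses, is Morton (Z-order) bit interleaving; the abstract does not name the encoding, so the following Python sketch is only an illustrative stand-in.

```python
def interleave_bits(coords, bits_per_dim=10):
    """Encode a multi-dimensional coordinate as one linear address by
    interleaving the bits of its components (Morton / Z-order encoding).
    Coordinates that differ only in their low-order bits map to nearby
    addresses, which keeps spatial neighbours close in linear storage."""
    address = 0
    ndims = len(coords)
    for bit in range(bits_per_dim):
        for dim, value in enumerate(coords):
            address |= ((value >> bit) & 1) << (bit * ndims + dim)
    return address

# Example: neighbouring (x, y, t) coordinates produce nearby addresses.
a = interleave_bits((12, 7, 3))
b = interleave_bits((13, 7, 3))
```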
Described herein is a memory architecture that is configured to dynamically determine an address encoding to use to encode multi-dimensional data such as multi-coordinate data in a manner that provides a coordinate bias corresponding to a current memory access pattern. The address encoding may be dynamically generated in response to receiving a memory access request or may be selected from a set of preconfigured address encodings. The dynamically generated or selected address encoding may apply an interleaving technique to bit representations of coordinate values to obtain an encoded memory address. The interleaving technique may interleave a greater number of bits from the bit representation corresponding to the coordinate direction in which a coordinate bias is desired than from bit representations corresponding to other coordinate directions.
G06F 12/0811 - Multiuser, multiprocessor or multiprocessing cache systems with multilevel cache hierarchies
G06F 12/0846 - Cache with multiple tag or data arrays being simultaneously accessible
G06F 12/1045 - Address translation using associative or pseudo-associative address translation means, e.g. translation look-aside buffer [TLB] associated with a data cache
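A minimal sketch of the biased variant described above: per interleaving round, more bits are drawn from the favoured dimension than from the others, so accesses that move along that dimension touch nearby addresses. The weighting scheme and function name are assumptions rather than details from the abstract, and each weight must be at least one.

```python
def biased_interleave(coords, weights, bits_per_dim=10):
    """Interleave coordinate bits while taking weights[d] consecutive bits
    from dimension d per round, giving that dimension a coordinate bias:
    small moves along a heavily weighted axis change only low-order address
    bits, so accesses along that axis stay in nearby memory.
    All weights must be >= 1 so the loop terminates."""
    address, out_pos = 0, 0
    cursors = [0] * len(coords)               # next source bit per dimension
    while any(c < bits_per_dim for c in cursors):
        for dim, value in enumerate(coords):
            for _ in range(weights[dim]):
                if cursors[dim] < bits_per_dim:
                    address |= ((value >> cursors[dim]) & 1) << out_pos
                    cursors[dim] += 1
                    out_pos += 1
    return address

# Example: favour the x axis (2 bits per round) over y and t (1 bit each).
addr = biased_interleave((12, 7, 3), weights=[2, 1, 1])
```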
Provided herein is a power distribution system comprising a feedback circuit including a transistor in series with a relay, the feedback circuit regulating a main power path including a main power supply connected in series with an electric power converter. The power distribution system further comprises OR-ing controllers that regulate the main power path and a backup power path including a low-voltage battery. The power distribution system further comprises terminals through which power from the main power path or the backup power path is transmitted to respective components corresponding to channels. The power distribution system further includes a microcontroller that acquires data in each of the channels and controls operations associated with each of the channels based on the acquired data.
Provided herein is a power distribution system comprising a main power bus, sub-buses coupled to the main power bus, and a controller. The sub-buses provide power to electrical components of a vehicle. Each of the sub-buses includes an electrically programmable fuse in series with a relay. The controller is configured to detect a fault in a sub-bus of the sub-buses, determine a fault type associated with the fault, and in response to determining the fault type, generate a command to cause the relay to change a relay state.
H02H 7/00 - Emergency protective circuit arrangements specially adapted for specific types of electric machines or apparatus or for sectionalised protection of cable or line systems, and effecting automatic switching in the event of an undesired change from normal working conditions
H02H 7/22 - Emergency protective circuit arrangements specially adapted for specific types of electric machines or apparatus or for sectionalised protection of cable or line systems, and effecting automatic switching in the event of an undesired change from normal working conditions for distribution gear, e.g. bus-bar systems; Emergency protective circuit arrangements specially adapted for specific types of electric machines or apparatus or for sectionalised protection of cable or line systems, and effecting automatic switching in the event of an undesired change from normal working conditions for switching devices
H02H 1/00 - Details of emergency protective circuit arrangements
G01K 3/00 - Thermometers giving results other than momentary value of temperature
G01R 31/00 - Arrangements for testing electric properties; Arrangements for locating electric faults; Arrangements for electrical testing characterised by what is being tested not provided for elsewhere
B60R 16/03 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric for supply of electrical power to vehicle subsystems
Systems and methods of linearizing a signal of a light detection and ranging (LiDAR) sensor are described herein. A system receives a portion of a non-linear chirp signal. The portion of the non-linear chirp signal is sampled at a sampling frequency to generate data points corresponding to the portion of the non-linear chirp signal. A profile of the non-linear chirp signal is generated based on the data points. The non-linear chirp signal is linearized based on the profile of the non-linear chirp signal.
G01S 17/34 - Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated using transmission of continuous, frequency-modulated waves while heterodyning the received signal, or a signal derived therefrom, with a locally-generated signal related to the contemporaneously transmitted signal
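One way to picture the linearization described above is to invert the measured frequency profile and resample the chirp onto a linear sweep; the following NumPy sketch assumes the profile is monotonically increasing and uses illustrative signal parameters, since the abstract does not specify the linearization method.

```python
import numpy as np

def linearize_chirp(times, freq_profile, signal):
    """Resample a chirp so its frequency sweep becomes linear in time.

    times: sample instants of the diverted chirp portion.
    freq_profile: instantaneous frequency measured at those instants
                  (the profile built from the sampled data points).
    signal: the sampled chirp itself.
    Assumes the measured frequency profile is monotonically increasing.
    """
    # Target linear sweep between the measured start and end frequencies.
    f_lin = np.linspace(freq_profile[0], freq_profile[-1], len(times))
    # For each target frequency, find the time at which the real chirp
    # actually reached it (inverse of the measured profile).
    warped_times = np.interp(f_lin, freq_profile, times)
    # Resample the signal at those warped instants; the result behaves as if
    # it had been produced by a linear sweep.
    return np.interp(warped_times, times, signal)

t = np.linspace(0.0, 1e-3, 1000)
profile = 1e4 + 3e8 * t**1.2            # a slightly non-linear measured sweep
raw = np.sin(2 * np.pi * np.cumsum(profile) * (t[1] - t[0]))
linear = linearize_chirp(t, profile, raw)
```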
Described herein is a sensor device. The sensor device comprises a housing and a printed circuit board encased by the housing. The printed circuit board comprises an image sensor that captures image data, an image sensor processor that processes the image data, a serializer that converts one or more data channels associated with the image data into a single data channel, and one or more exposed surfaces. The one or more exposed surfaces dissipate heat generated by the image sensor, the image sensor processor, and the serializer from the printed circuit board to the housing.
A system trains a model to infer an intent of an entity. The model includes one or more sensors to obtain frames of data, one or more processors, and a memory storing instructions that, when executed by the one or more processors, cause the system to perform steps. A first step includes determining, in each frame of the frames, one or more bounding regions, each of the bounding regions enclosing an entity. A second step includes identifying a common entity, the common entity being present in bounding regions corresponding to a plurality of the frames. A third step includes associating the common entity across the frames. A fourth step includes training a model to infer an intent of the common entity based on data outside of the bounding regions.
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestriansRecognition of traffic objects, e.g. traffic signs, traffic lights or roads
Improved calibration of a vehicle sensor based on static objects detected within an environment being traversed by the vehicle is disclosed. A first sensor such as a LiDAR can be calibrated to a global coordinate system via a second pre-calibrated sensor such as a GPS IMU. A static object present in the environment, such as signage, is detected. A type of the detected object is determined from static map data. Point cloud data representative of the static object is captured by the first sensor and a first transformation matrix for performing a transformation from a local coordinate system of the first sensor to a local coordinate system of the second sensor is iteratively redetermined until a desired calibration accuracy is achieved. Transformation to the global coordinate system is then achieved via application of the first transformation matrix followed by a second known transformation matrix.
Improved calibration of a vehicle sensor based on static objects detected within an environment being traversed by the vehicle is disclosed. A first sensor such as a LiDAR can be calibrated to a global coordinate system via a second pre-calibrated sensor such as a GPS IMU. Static objects present in the environment, such as signage, are detected. Point cloud data representative of the static objects is captured by the first sensor and a first transformation matrix for performing a transformation from a local coordinate system of the first sensor to a local coordinate system of the second sensor is iteratively redetermined until a desired calibration accuracy is achieved. Transformation to the global coordinate system is then achieved via application of the first transformation matrix followed by application of a second known transformation matrix to transition from the local coordinate system of the second pre-calibrated sensor to the global coordinate system.
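The chaining of the two transformations described above can be illustrated with 4x4 homogeneous matrices; the sketch below assumes the first (LiDAR-to-IMU) matrix is the one the calibration loop iteratively refines and the second (IMU-to-global) matrix is known, with placeholder numeric values.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def lidar_points_to_global(points_xyz, T_lidar_to_imu, T_imu_to_global):
    """Apply the first (estimated) transform, then the second (known) one."""
    pts = np.hstack([points_xyz, np.ones((len(points_xyz), 1))])
    return (T_imu_to_global @ T_lidar_to_imu @ pts.T).T[:, :3]

# The first matrix is what the calibration loop refines until points of the
# detected static object land where the map says they should be.
T1 = make_transform(np.eye(3), np.array([1.2, 0.0, 1.6]))     # LiDAR -> IMU (estimated)
T2 = make_transform(np.eye(3), np.array([100.0, 50.0, 0.0]))  # IMU -> global (known)
world = lidar_points_to_global(np.random.rand(5, 3), T1, T2)
```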
A light detection and ranging (LiDAR) sensor is described herein. The LiDAR sensor can comprise a fiber optic ending, a laser assembly, and one or more processors. The fiber optic ending can comprise a fiber optic cable terminated by a reflector. The laser assembly can emit a chirp signal to detect an object in an environment. A portion of the chirp signal can be diverted to the fiber optic ending. The one or more processors construct a profile of the chirp signal based on the diverted portion of the chirp signal. The one or more processors determine a best fit curve based on the profile of the chirp signal and one or more parameters associated with the best fit curve. A frequency offset between an emitted chirp signal and a returned chirp signal can be computed based on the best fit curve and the one or more parameters. Based on the frequency offset, the one or more processors can determine a range of the object.
G01S 17/34 - Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated using transmission of continuous, frequency-modulated waves while heterodyning the received signal, or a signal derived therefrom, with a locally-generated signal related to the contemporaneously transmitted signal
Described herein is a sensor assembly and a method of operating the sensor assembly. In various embodiments, the sensor assembly can comprise a base component, a light detection and ranging (LiDAR) sensor, a transparent cylinder, a motor component, and a controller. The LiDAR sensor can be mounted on a support platform disposed centrally on the base component. The transparent cylinder can be disposed peripherally to the LiDAR sensor and can provide a field of view (FOV) for the LiDAR sensor. The transparent cylinder can be rotated independently of the base component. The motor component can be disposed on the base component, adjacent to the support platform. The motor component can be coupled to the transparent cylinder through a gearset and configured to rotate the transparent cylinder. The controller can be configured to obtain sensor data from on-board vehicle sensors. The controller can determine a level of obscurement on the transparent cylinder based on the sensor data. The controller can determine that the level of obscurement exceeds a threshold level of obscurement. The controller can transmit an actuation signal to the motor component to cause a rotation of the transparent cylinder at a rotational speed. The rotation of the transparent cylinder can disperse obscurements away from the transparent cylinder.
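A minimal sketch of the threshold check that triggers rotation of the transparent cylinder, as described above; the speed mapping, threshold value, and function name are illustrative assumptions rather than details from the abstract.

```python
def cylinder_command(obscurement_level, threshold=0.3,
                     base_rpm=60.0, max_rpm=300.0):
    """Return the rotational speed command for the transparent cylinder.

    Below the threshold the cylinder stays still; above it, the commanded
    speed grows with the level of obscurement so heavier soiling is shed
    faster. The mapping is illustrative only.
    """
    if obscurement_level <= threshold:
        return 0.0
    excess = (obscurement_level - threshold) / (1.0 - threshold)
    return min(max_rpm, base_rpm + excess * (max_rpm - base_rpm))

rpm = cylinder_command(0.55)   # above threshold, so a nonzero speed is commanded
```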
Provided herein is a headlamp assembly comprising a housing that encloses: a sensor that acquires data associated with a surrounding environment; a light source that illuminates a field of view comprising a portion of the surrounding environment; and one or more processors that analyze the acquired data and determine a direction, field of view, power, or an intensity of the illumination of the portion based on the analyzed data.
B60Q 1/14 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights having dimming means
B60Q 1/00 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
H05B 47/125 - Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
Described herein is a sensor device. The sensor device includes a housing with a front plate and a back plate. The front plate includes mounting holes and the back plate includes second mounting holes and alignment holes. The sensor device includes a printed circuit board encased by the housing, wherein the printed circuit board comprises an image sensor, an image sensor processor, and a serializer.
A vehicle generates a city-scale map. The vehicle includes one or more Lidar sensors configured to obtain point clouds at different positions, orientations, and times, one or more processors, and a memory storing instructions that, when executed by the one or more processors, cause the system to perform registering, in pairs, a subset of the point clouds based on respective surface normals of each of the point clouds; determining loop closures based on the registered subset of point clouds; determining a position and an orientation of each of the subset of the point clouds based on constraints associated with the determined loop closures; and generating a map based on the determined position and the orientation of each of the subset of the point clouds.
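For the pairwise registration and pose chaining described above, a minimal sketch follows; it uses a plain Kabsch least-squares alignment with assumed known correspondences (the surface-normal weighting and loop-closure optimization mentioned in the abstract are omitted), so it should be read as a structural outline rather than the actual method.

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst points
    with known one-to-one correspondences (Kabsch algorithm)."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, dst_c - R @ src_c

def accumulate_poses(relative_transforms):
    """Chain pairwise (R, t) estimates into an absolute pose per cloud.
    Loop-closure constraints would afterwards correct the drift of this chain."""
    poses = [(np.eye(3), np.zeros(3))]
    for R_rel, t_rel in relative_transforms:
        R_prev, t_prev = poses[-1]
        poses.append((R_prev @ R_rel, R_prev @ t_rel + t_prev))
    return poses

# Toy check: aligning a cloud against a translated copy recovers the offset.
src = np.random.rand(200, 3)
R_est, t_est = rigid_align(src, src + np.array([1.0, 2.0, 0.5]))
```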
Provided herein is a system and method that acquires data and determines a driving action based on the data. The system comprises a processor configured to acquire data of nonuniform resolution over a field of view of a sensor, and a controller configured to determine a driving action of a vehicle based on the data, and perform the driving action.
G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
G06V 10/147 - Details of sensors, e.g. sensor lenses
G06V 10/22 - Image preprocessing by selection of a specific region containing or referencing a patternLocating or processing of specific regions to guide the detection or recognition
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G01S 17/89 - Lidar systems, specially adapted for specific applications for mapping or imaging
G01S 17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
77.
Instance segmentation using sensor data having different dimensionalities
Described herein are systems, methods, and non-transitory computer readable media for using 3D point cloud data, such as that captured by a LiDAR, as ground truth data for training an instance segmentation deep learning model. 3D point cloud data captured by a LiDAR can be projected onto a 2D image captured by a camera and provided as input to a 2D instance segmentation model. 2D sparse instance segmentation masks may be generated from the 2D image with the projected 3D data points. These 2D sparse masks can be used to propagate loss during training of the model. Using the 2D image data with the projected 3D data points, together with the 2D sparse instance segmentation masks, to train the instance segmentation model obviates the need to generate and use actual instance segmentation data for training, thereby providing an improved technique for training an instance segmentation model.
Described herein are systems, methods, and non-transitory computer readable media for generating fused sensor data through metadata association. First sensor data captured by a first vehicle sensor and second sensor data captured by a second vehicle sensor are associated with first metadata and second metadata, respectively, to obtain labeled first sensor data and labeled second sensor data. A frame synchronization is performed between the first sensor data and the second sensor data to obtain a set of synchronized frames, where each synchronized frame includes a portion of the first sensor data and a corresponding portion of the second sensor data. For each frame in the set of synchronized frames, a metadata association algorithm is executed on the labeled first sensor data and the labeled second sensor data to generate fused sensor data that identifies associations between the first metadata and the second metadata.
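A minimal sketch of the frame synchronization and per-frame metadata association described above, assuming each frame carries a timestamp and a list of labels with object ids; the field names, time tolerance, and id-based matching are illustrative assumptions, not the actual association algorithm.

```python
import bisect

def synchronize_frames(first_frames, second_frames, max_dt=0.05):
    """Pair each frame of the first sensor with the closest-in-time frame of
    the second sensor, keeping only pairs within max_dt seconds.

    Each frame is a dict with a 'stamp' (seconds) and a 'labels' list of
    metadata entries; second_frames is assumed sorted by time.
    """
    second_stamps = [f["stamp"] for f in second_frames]
    pairs = []
    for f1 in first_frames:
        i = bisect.bisect_left(second_stamps, f1["stamp"])
        best = min(
            (j for j in (i - 1, i) if 0 <= j < len(second_frames)),
            key=lambda j: abs(second_stamps[j] - f1["stamp"]),
            default=None,
        )
        if best is not None and abs(second_stamps[best] - f1["stamp"]) <= max_dt:
            pairs.append((f1, second_frames[best]))
    return pairs

def associate_metadata(pairs):
    """For each synchronized pair, record which first-sensor labels and
    second-sensor labels refer to the same object id (a stand-in for the
    metadata association algorithm)."""
    fused = []
    for f1, f2 in pairs:
        ids1 = {m["object_id"]: m for m in f1["labels"]}
        ids2 = {m["object_id"]: m for m in f2["labels"]}
        fused.append({oid: (ids1[oid], ids2[oid]) for oid in ids1.keys() & ids2.keys()})
    return fused
```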
A system comprises a sensor system comprising a sensor and an analysis engine configured to determine whether the sensor is uncalibrated. The system further comprises an error handling system configured to determine whether to perform a recalibration in response to the sensor system determining that the sensor is uncalibrated. The error handling system further comprises a recalibration engine configured to perform a recalibration.
An apparatus on a vehicle comprises one or more sensors, one or more nozzles that output fluid to clean the respective one or more sensors, and a compressor that generates fluid such as compressed air. The compressor is in fluid communication with the one or more nozzles. The apparatus further comprises one or more processors, and a memory storing instructions that, when executed by the one or more processors, cause the system to predict a trajectory of the vehicle and control an operation of the compressor based on the predicted trajectory of the vehicle.
An apparatus on a vehicle comprises one or more sensors, one or more nozzles that output fluid to clean the respective one or more sensors, and a compressor that generates fluid such as compressed air. The compressor is in fluid communication with the one or more nozzles. The apparatus further comprises one or more processors, and a memory storing instructions that, when executed by the one or more processors, cause the system to determine a current velocity of the vehicle and control an operation of the compressor based on the current velocity of the vehicle.
Described herein are systems, methods, and computer readable media for capturing sensor data relating to an enclosed compartment of a vehicle (e.g., a cargo area of the vehicle) via one or more vehicle sensors; analyzing the sensor data to determine whether it is indicative of a living being present in the enclosed compartment; performing an object detection analysis on at least a portion of the sensor data to determine a type of living being detected; and initiating one or more automated vehicle response measures based on the type of living being.
B60W 40/08 - Estimation or calculation of driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit related to drivers or passengers
B60W 50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
B60W 50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
G06V 20/59 - Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
Described herein are systems, methods, and computer readable media for performing data conversion on sensor data to obtain modified sensor data that is formatted/structured appropriately for downstream processes that rely on the sensor data as input. The sensor data can include point cloud data captured by a LiDAR, for example. A grid structure and corresponding grid characteristics can be determined and the sensor data can be converted to grid-based sensor data by associating the grid structure and its characteristics with the sensor data. Generating the grid-based sensor data can include reformatting the point cloud data to superimpose the grid structure and its grid characteristics onto the point cloud data. Various downstream processing that cannot feasibly be performed on the raw sensor data can then be performed efficiently on the modified grid-based sensor data by virtue of the grid structure imbuing the sensor data with spatial proximity information.
Described herein are systems, methods, and computer readable media for performing data conversion on sensor data to obtain modified sensor data that is formatted/structured appropriately for downstream processes that rely on the sensor data as input. The sensor data can include point cloud data captured by a LiDAR, for example. A grid structure and corresponding grid characteristics can be determined and the sensor data can be converted to grid-based sensor data by associating the grid structure and its characteristics with the sensor data. Generating the grid-based sensor data can include reformatting the point cloud data to superimpose the grid structure and its grid characteristics onto the point cloud data. Various downstream processing that cannot feasibly be performed on the raw sensor data can then be performed efficiently on the modified grid-based sensor data by virtue of the grid structure imbuing the sensor data with spatial proximity information.
An apparatus on a vehicle comprises one or more sensors, one or more nozzles that output fluid to clean the respective one or more sensors, and a compressor that generates fluid such as compressed air. The compressor is in fluid communication with the one or more nozzles. The apparatus further comprises one or more processors, and a memory storing instructions that, when executed by the one or more processors, cause the system to determine information of an acoustic emission from the compressor and to counteract the acoustic emission based on the determined information.
G10K 11/178 - Methods or devices for protecting against, or for damping, noise or other acoustic waves in general using interference effectsMasking sound by electro-acoustically regenerating the original acoustic waves in anti-phase
Provided herein is a system and method for fleet coordination in a vehicle. The system comprises one or more sensors, one or more processors, and a memory storing instructions that, when executed by the one or more processors, cause the system to perform: capturing current data associated with the vehicle, planning a route of the vehicle based on the captured current data, navigating the vehicle in accordance with the planned route, detecting an instant position of the vehicle while navigating the vehicle, and coordinating a movement of another vehicle with the vehicle based on the detected instant position of the vehicle.
G05D 1/02 - Control of position or course in two dimensions
G08G 1/00 - Traffic control systems for road vehicles
H04W 4/46 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for vehicle-to-vehicle communication [V2V]
B60W 30/16 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
Provided herein is a system that comprises one or more sensors that capture data, one or more processors, and a memory storing instructions that, when executed by the one or more processors, causes the system to perform functions that include identifying one or more locations within a distance of the vehicle, capturing current data of the identified one or more locations, determining one or more changes that exceed respective threshold amounts between the current data and historical data of the identified one or more locations, and updating the historical data of the identified one or more locations based on the determined one or more changes.
Described herein are systems, methods, and computer readable media for capturing image data of one or more regions of a vehicle (e.g., a cargo area of an autonomous vehicle) at various particular times and assessing the image data to determine whether a past vehicle occupant has left behind one or more belongings of value in the vehicle. If it is determined that a former vehicle occupant has left behind an article of value, an audible message may be outputted from a speaker of the vehicle to inform the former occupant of the presence of the article in the vehicle or a notification may be sent to a mobile device of the former occupant. The audible message may be outputted, for example, while the former occupant is beyond a predetermined distance from the vehicle, but still within range of hearing the message.
G06V 20/59 - Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
G08B 21/24 - Reminder alarms, e.g. anti-loss alarms
G08B 5/36 - Visible signalling systems, e.g. personal calling systems, remote indication of seats occupied using electric transmission; Visible signalling systems, e.g. personal calling systems, remote indication of seats occupied using electromagnetic transmission using visible light sources
B60R 11/04 - Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
B60Q 5/00 - Arrangement or adaptation of acoustic signal devices
B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
G06Q 20/10 - Payment architectures specially adapted for electronic funds transfer [EFT] systems; Payment architectures specially adapted for home banking systems
B60Q 3/20 - Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors for lighting specific fittings of passenger or driving compartments; Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors mounted on specific fittings of passenger or driving compartments
89.
Systems and methods for cooling components of a vehicle
Systems and methods are provided for cooling air in a vehicle. The system includes a chassis, inside an interior area of a vehicle, with one or more openings that are configured to allow air to enter the chassis to cool a heat generating component in the vehicle and an exhaust duct that directs the air away from the chassis after the air has contacted at least a portion of the heat generating component. The system includes a fan that acts to propel the air through the one or more openings and through the exhaust duct.
B60H 1/00 - Heating, cooling or ventilating devices
B62D 21/17 - Understructures, i.e. chassis frame on which a vehicle body may be mounted forming fluid or electrical conduit means or having other means to accommodate the transmission of a force or signal
H05K 7/20 - Modifications to facilitate cooling, ventilating, or heating
Provided herein is a system of a vehicle that comprises one or more sensors, one or more processors, and memory storing instructions that, when executed by the one or more processors, causes the system to perform: selecting a trajectory along a route of the vehicle; predicting a trajectory of another object along the route; adjusting the selected trajectory based on a predicted change, in response to adjusting the selected trajectory, to the predicted trajectory of the another object, the predicted change to the predicted trajectory of the another object being stored in a model; determining an actual change, in response to adjusting the selected trajectory, to a trajectory of the another object, in response to an interaction between the vehicle and the another object; updating the model based on the determined actual change to the trajectory of the another object; and selecting a future trajectory based on the updated model.
Systems and methods are provided for cooling air in a vehicle. The system includes a chassis, inside an interior area of a vehicle, with one or more openings that are configured to allow air to enter the chassis to cool a heat generating component in the vehicle and an exhaust duct that directs the air away from the chassis after the air has contacted at least a portion of the heat generating component. The system includes a fan that acts to propel the air through the one or more openings and through the exhaust duct.
Systems and methods are provided for cooling vehicle components. The system includes one or more heat generating components in a vehicle and a coolant flow path connected to the one or more heat generating components. The system includes a coolant pump configured to circulate coolant through the coolant flow path and a reversing mechanism configured to reverse a direction of circulation of the coolant.
B60H 1/00 - Heating, cooling or ventilating devices
B60R 16/00 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
B60R 16/037 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric for occupant comfort
B60R 16/08 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for fluid
93.
System and method for communicating vehicle actions
Provided herein is a system and method of a vehicle that communicates an intended action of the vehicle. The system comprises one or more sensors; one or more processors; and a memory storing instructions that, when executed by the one or more processors, causes the system to perform: capturing, from the one or more sensors, data of another vehicle or of a road condition; determining an intended action of the vehicle based on the captured data; simulating the intended action of the vehicle on a map; communicating, within the vehicle, the intended action of the vehicle; and navigating the vehicle based on the intended action of the vehicle.
G01C 21/36 - Input/output arrangements for on-board computers
B60Q 5/00 - Arrangement or adaptation of acoustic signal devices
H04W 4/46 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for vehicle-to-vehicle communication [V2V]
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestriansRecognition of traffic objects, e.g. traffic signs, traffic lights or roads
94.
System and method for localization of traffic signs
Provided herein is a system and method of a vehicle. The system comprises one or more sensors, processors, maps, and a memory storing instructions that, when executed by the one or more processors, causes the system to perform: monitoring a location of the vehicle while driving; detecting a sign while the vehicle is driving; capturing, frame-by-frame, data of the sign until the sign disappears from a field of view of the sensor; synchronizing each frame of the data with the location of the vehicle; determining a location of the sign based on the frame-by-frame data; in response to determining, at a frame immediately before the sign disappears from the field of view of the sensor, that the vehicle is driving towards the sign, uploading the detected sign and the location of the sign onto the one or more maps; and implementing a driving action based on the sign.
G01C 21/32 - Structuring or formatting of map data
G08G 1/0967 - Systems involving transmission of highway information, e.g. weather, speed limits
H04W 4/46 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for vehicle-to-vehicle communication [V2V]
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestriansRecognition of traffic objects, e.g. traffic signs, traffic lights or roads
Described herein are aerodynamically enhanced sensor housings. An aerodynamically enhanced sensor housing has an asymmetrical lateral cross-section that includes a first portion having a substantially spherical curvature and a second portion having a non-spherical curvature. The second portion having the non-spherical curvature may be elongated in relation to the first portion. An aerodynamically enhanced housing can also include one or more indentations formed in an exterior surface thereof to further enhance drag reducing characteristics of the housing. In addition, air flow characteristics around the sensor housing during vehicle operation can be assessed and a drag reduction protocol can be generated and implemented to further enhance the drag reducing characteristics of the sensor housing.
Described herein are systems, methods, and computer readable media for dynamically determining a language variant to use for vehicle output to a vehicle occupant based on the vehicle's location. A geographic region may include multiple sub-regions, each of which may be associated with a respective one or more language variants. As an example, a geographic region may be a state or province, and each sub-region may have one or more dialects that are spoken by individuals in that sub-region. In some cases, a particular dialect may be predominant in a given sub-region. As a vehicle traverses a travel path, it may determine its current location, which geographic sub-region includes that location, and which language variant (e.g., dialect) is predominant there. That language variant may then be selected for in-vehicle communication with a vehicle occupant. The vehicle location determination may be made at or near where the occupant entered the vehicle.
A system of a first vehicle includes sensors and a processor that performs generating of an initial trajectory along a travel route of the first vehicle, acquiring trajectories of second vehicles along the route, adjusting the initial trajectory based on the acquired trajectories of the second vehicles, and navigating the first vehicle based on the adjusted initial trajectory.
Described herein are sensor assembly cleaning systems and apparatuses that are adapted to rotate a transparent surface of a sensor assembly independently of a housing of the sensor assembly in order to disperse water, moisture, debris, or the like from the surface. The transparent surface may be a glass window that provides a camera of the sensor assembly with a field-of-view of an external environment. Sensor data captured from various on-board vehicle sensors such as moisture data, image data, vehicle velocity data, or the like can be evaluated against various criteria to determine when and for how long to rotate the transparent surface. Sensor data can be evaluated over a period of time to identify patterns or trends relating to one or more vehicle parameters. An activation schedule for initiating and ceasing rotation of the transparent surface can be determined based on such patterns/trends.
B60R 1/00 - Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
H04N 17/00 - Diagnosis, testing or measuring for television systems or their details
B60S 1/56 - Cleaning windscreens, windows, or optical devices specially adapted for cleaning other parts or devices than front windows or windscreens
G08B 13/196 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
Provided herein is a system comprising: one or more processors; and a memory storing instructions that, when executed by the one or more processors, causes the system to perform: obtaining a previous pose of a vehicle; acquiring one or more previous readings corresponding to one or more wheel encoders during the previous pose; acquiring one or more readings corresponding to one or more wheel encoders acquired after the previous pose; and adjusting the previous pose based on the one or more readings to obtain a current pose.
G01C 21/28 - Navigation; Navigational instruments not provided for in groups specially adapted for navigation in a road network with correlation of data from several navigational instruments
G07C 5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle, or waiting time
G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
G01C 22/00 - Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers or using pedometers
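The wheel-encoder pose adjustment described above can be pictured with a standard differential-drive dead-reckoning update; the tick resolution and wheel base below are illustrative constants, not values from the abstract.

```python
import math

def advance_pose(pose, left_ticks, right_ticks,
                 ticks_per_meter=5000.0, wheel_base=1.6):
    """Dead-reckon a new (x, y, heading) pose from wheel-encoder increments
    measured since the previous pose (differential-drive model; the constants
    are illustrative)."""
    x, y, th = pose
    d_left = left_ticks / ticks_per_meter
    d_right = right_ticks / ticks_per_meter
    d = 0.5 * (d_left + d_right)             # distance travelled by the centre
    d_th = (d_right - d_left) / wheel_base   # change of heading
    x += d * math.cos(th + 0.5 * d_th)
    y += d * math.sin(th + 0.5 * d_th)
    return (x, y, th + d_th)

# Starting from the previous pose, encoder readings taken after that pose
# yield the adjusted (current) pose.
pose = (0.0, 0.0, 0.0)
pose = advance_pose(pose, left_ticks=480, right_ticks=520)
```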
Described herein are systems, methods, and non-transitory computer readable media for triggering a sensor operation of a second sensor (e.g., a camera) based on a predicted time of alignment with a first sensor (e.g., a LiDAR), where operation of the second sensor is simulated to determine the predicted time of alignment. In this manner, the sensor data captured by the two sensors is ensured to be substantially synchronized with respect to the physical environment being sensed. This sensor data synchronization based on predicted alignment of the sensors solves the technical problem of lack of sensor coordination and sensor data synchronization that would otherwise result from the latency associated with communication between sensors and a centralized controller and/or between sensors themselves.
G06N 3/126 - Evolutionary algorithms, e.g. genetic algorithms or genetic programming
G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles
G05B 19/042 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
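A minimal sketch of predicting the alignment time for a spinning LiDAR and a fixed camera boresight, assuming a constant spin rate and a fixed trigger latency; the abstract describes simulating the second sensor's operation but does not specify this model, so the parameters and function name are assumptions.

```python
def predict_trigger_time(now, lidar_azimuth_deg, spin_rate_dps,
                         camera_azimuth_deg, trigger_latency=0.002):
    """Predict when the spinning LiDAR will sweep past the camera boresight
    and return the time at which the camera trigger must be issued so the
    exposure happens at alignment (constant-spin-rate simulation; the
    latency value is illustrative)."""
    gap_deg = (camera_azimuth_deg - lidar_azimuth_deg) % 360.0
    time_to_alignment = gap_deg / spin_rate_dps
    return now + time_to_alignment - trigger_latency

# LiDAR at 30 deg spinning at 3600 deg/s (10 Hz); camera boresight at 90 deg.
fire_at = predict_trigger_time(now=100.0, lidar_azimuth_deg=30.0,
                               spin_rate_dps=3600.0, camera_azimuth_deg=90.0)
```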