
How sensor innovation in ADAS can save lives on the road


Traffic safety is a huge challenge: more than 1.1 million people die in road traffic accidents every year, and roughly 20 to 50 million more are injured. One of the main causes of these accidents is driver error. Automakers and government regulators have long searched for ways to improve safety, and in recent years advanced driver assistance systems (ADAS) have made significant progress in reducing road injuries. In this article, we explore the role of ADAS in improving road safety and the sensor technologies that are crucial to achieving it.

The Evolution and Importance of ADAS

Since the anti-lock braking system (ABS) first appeared in the 1970s, the use of ADAS technology in passenger cars has grown steadily, and safety has improved accordingly. According to the National Safety Council (NSC), ADAS has the potential to prevent approximately 62% of traffic fatalities in the United States alone, saving more than 20,000 lives annually. In recent years, ADAS functions such as automatic emergency braking (AEB) and forward collision warning (FCW) have become increasingly common, with over a quarter of vehicles equipped with these features to help drivers avoid accidents and, ultimately, save lives.

ADAS requires multiple technologies working together. A perception suite acts as the "eyes" of the system, detecting the vehicle's surroundings and supplying data to the system's "brain," which uses that data to decide how the vehicle should assist the driver. For example, when a vehicle is detected ahead and the driver has not applied the brakes, AEB brakes automatically, stopping the car in time to avoid a rear-end collision.

The ADAS perception suite is built around a vision system with an automotive-grade camera, at whose core sits a high-performance image sensor that captures video of the vehicle's surroundings to detect vehicles, pedestrians, traffic signs, and more. These images can also be displayed to assist the driver at low speeds and while parking. Cameras are typically paired with depth-sensing systems such as millimeter-wave radar, LiDAR, or ultrasonic sensors, which add depth information to the camera's two-dimensional image, increase redundancy, and remove ambiguity from object distance measurements.

For automakers and their tier-one system suppliers, implementing ADAS can be challenging: the processing power available to handle all the data generated by multiple sensors is limited, and the sensors themselves have performance limits. Automotive requirements demand extremely high reliability from every component, covering not just the hardware but also the associated software algorithms, which therefore need extensive testing to ensure safety. The system must also perform consistently in the harshest lighting and weather conditions, withstand extreme temperatures, and operate reliably over the vehicle's entire service life.

Key Sensor Technologies in ADAS

Now let's look in detail at some of the key sensor technologies used in ADAS, including image sensors, LiDAR, and ultrasonic sensors. Each sensor provides a specific type of data; software algorithms process these streams and combine them to build an accurate, comprehensive understanding of the environment. This process is called sensor fusion: the redundancy of multiple sensing modalities improves the accuracy and reliability of perception algorithms, enabling higher-confidence decisions and therefore greater safety.
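To make the fusion-plus-decision flow concrete, here is a minimal, hypothetical Python sketch of how a fused camera detection and radar range reading might drive the AEB behavior described above via time to collision. The type, function, and threshold names are illustrative assumptions, not taken from any production system or from onsemi.

```python
from dataclasses import dataclass

@dataclass
class FusedTrack:
    """One object track after camera/radar fusion (illustrative)."""
    camera_confirmed: bool    # camera classified the object as a vehicle
    range_m: float            # distance from radar/LiDAR, in meters
    closing_speed_mps: float  # positive = we are approaching the object

FCW_TTC_S = 2.5  # warn the driver (assumed threshold, tuned per vehicle)
AEB_TTC_S = 1.2  # brake automatically (assumed threshold)

def assess(track: FusedTrack, driver_braking: bool) -> str:
    """Return the assistance action for one fused track."""
    if not track.camera_confirmed or track.closing_speed_mps <= 0:
        return "none"  # no confirmed vehicle, or it is pulling away
    ttc = track.range_m / track.closing_speed_mps  # time to collision, s
    if ttc < AEB_TTC_S and not driver_braking:
        return "brake"  # AEB: the driver has not reacted in time
    if ttc < FCW_TTC_S:
        return "warn"   # FCW: alert the driver first
    return "none"

# Example: vehicle 20 m ahead, closing at 18 m/s, driver not braking
print(assess(FusedTrack(True, 20.0, 18.0), driver_braking=False))  # "brake"
```

Real systems use far richer state, such as acceleration, predicted paths, and driver reaction models, but the core idea of converting fused range and speed into a time budget is the same.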
The complexity of these multi-sensor suites can grow quickly, and the algorithms demand ever more processing power. At the same time, the sensors themselves are becoming more sophisticated, allowing some processing to run locally at the sensor rather than on a central ADAS processor.

Automotive Image Sensors

Image sensors are the "eyes" of the vehicle, and arguably the most important sensor type in any ADAS-equipped car. The image data they provide enables a wide range of ADAS functions: "machine vision" driving assistance such as automatic emergency braking, forward collision warning, and lane departure warning; "human view" features such as 360-degree surround-view cameras for parking assistance and camera monitor systems for electronic rearview mirrors; and driver monitoring systems that detect distracted or fatigued drivers and issue alerts to prevent accidents.

onsemi offers a variety of image sensors, including the Hyperlux family, which delivers excellent image quality at low power. The Hyperlux pixel architecture includes an innovative super-exposure imaging scheme that captures high-dynamic-range (HDR) frames with LED flicker mitigation (LFM), overcoming the misreads caused by the pulsed flicker of LED headlights, taillights, and LED traffic signs. Hyperlux image sensors are designed for challenging automotive scenes, capturing a dynamic range of up to 150 decibels (dB), for example when emerging from beneath an elevated bridge into direct sunlight. Cameras built on Hyperlux image sensors handle such extremes far better than the human eye and operate normally even at light levels well below 1 lux.

onsemi's Hyperlux image sensors include the 8-megapixel AR0823AT and the 3-megapixel AR0341AT. These digital CMOS image sensors use Hyperlux 2.1 µm super-exposure single-photodiode pixel technology, which offers excellent low-light performance and captures a wide dynamic range across both bright and dark areas of the same frame. Because a super-exposure pixel achieves sufficient dynamic range within a single frame, it enables a worry-free exposure scheme that effectively eliminates the need for automatic exposure adjustment when lighting changes abruptly, such as when driving out of a tunnel or parking garage on a sunny day.
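For intuition about what 150 dB of dynamic range means, here is a small, purely illustrative Python calculation (not vendor code) converting the decibel figure into a linear brightness ratio, using the 20·log10 convention common for image-sensor dynamic range. The 120 dB comparison figure is an assumption for illustration only.

```python
import math

def dr_db_to_ratio(db: float) -> float:
    """Convert image-sensor dynamic range in dB to a linear ratio,
    using the usual convention DR_dB = 20 * log10(brightest / darkest)."""
    return 10 ** (db / 20)

print(f"150 dB ~= {dr_db_to_ratio(150):.2e} : 1")  # ~3.16e7 : 1

# A hypothetical sensor limited to ~120 dB per exposure would need
# bracketed exposures stitched together to span the same scene --
# exactly what a single-frame super-exposure pixel avoids.
print(f"120 dB ~= {dr_db_to_ratio(120):.2e} : 1")  # ~1.00e6 : 1
```

In other words, the brightest and darkest details the sensor can capture in one frame differ by a factor of roughly thirty million.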
Depth Sensors (LiDAR)

Accurately measuring the distance between an object and a sensor is called depth sensing. Depth information removes ambiguity from the scene, which is crucial for many ADAS functions and for reaching higher-level ADAS and fully autonomous driving. Several technologies can provide depth, but where depth performance matters most, light detection and ranging (LiDAR) is the best choice. LiDAR offers high depth and angular resolution, and because it actively illuminates the scene by pairing a near-infrared (NIR) laser with a sensitive detector, it operates under all ambient light conditions and suits both short-range and long-range applications. Although low-cost millimeter-wave radar sensors are more common in today's automotive applications, they lack LiDAR's angular resolution and cannot provide the high-resolution 3D point-cloud view of the environment that higher-level autonomy, beyond basic ADAS, requires.

The most common LiDAR architecture uses the time-of-flight (ToF) method, which computes distance directly by emitting a short infrared light pulse and measuring how long the signal takes to reflect off an object and return to the sensor. A LiDAR sensor repeats this measurement while scanning the light across its field of view to capture the entire scene.

The ARRAYRDM-0112A20 silicon photomultiplier (SiPM) array from onsemi is a single-photon-sensitive sensor with 12 channels in a single-chip array and high photon detection efficiency (PDE) at near-infrared wavelengths such as 905 nm, used to detect the returning pulses. This SiPM array has been integrated into a LiDAR fitted to the world's first passenger cars to offer true "eyes-off" autonomous driving, a capability beyond basic driving assistance in which the driver no longer has to watch the road. Without the depth sensing LiDAR provides, this level of autonomy has yet to be implemented reliably in consumer vehicles.

Ultrasonic Sensors

Another distance-measurement technology is ultrasonic sensing, in which the sensor emits sound waves above the range of human hearing and then detects the echo that bounces back, again measuring distance by time of flight. Ultrasonic sensors are well suited to short-range obstacle detection and low-speed maneuvering applications such as parking assistance. One advantage of ultrasonic sensors is that sound travels far more slowly than light, so a reflected sound wave typically takes milliseconds to return to the sensor, versus nanoseconds for light. Ultrasonic sensing therefore needs far less processing performance, which lowers system cost.

One example of an ultrasonic sensor is the onsemi NCV75215 park distance measurement ASSP. While the vehicle is parking, the device measures the distance to obstacles by time of flight using a piezoelectric ultrasonic transducer. It can detect objects from 0.25 to 4.5 meters away and offers high sensitivity and low noise.
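Both LiDAR and ultrasonic ranging reduce to the same round-trip formula, distance = (wave speed × echo time) / 2. The short Python sketch below (plain illustrative arithmetic, with an assumed room-temperature speed of sound) shows why LiDAR electronics must resolve nanoseconds while ultrasonic echoes arrive over milliseconds.

```python
C_LIGHT = 299_792_458.0  # speed of light, m/s (approx., in air)
C_SOUND = 343.0          # speed of sound in air at ~20 degC, m/s (assumed)

def distance_m(round_trip_s: float, wave_speed: float) -> float:
    """Time-of-flight ranging: the pulse travels out and back,
    so the one-way distance is half the round trip."""
    return wave_speed * round_trip_s / 2

def round_trip_s(distance: float, wave_speed: float) -> float:
    """Echo time for a target at the given one-way distance."""
    return 2 * distance / wave_speed

# A target 4.5 m away (the NCV75215's upper detection limit):
print(f"ultrasonic echo: {round_trip_s(4.5, C_SOUND) * 1e3:.1f} ms")  # ~26.2 ms
print(f"LiDAR echo:      {round_trip_s(4.5, C_LIGHT) * 1e9:.1f} ns")  # ~30.0 ns
```

That roughly million-fold gap in timing is why ultrasonic front ends can be simple and inexpensive, while ToF LiDAR relies on single-photon-sensitive detectors such as SiPM arrays and very fast timing electronics.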
Conclusion

onsemi has played an important role in developing the sensor technology that ADAS requires. onsemi invented the dual-conversion-gain pixel technology and HDR (high dynamic range) modes now adopted by many sensors across the industry, and pioneered the innovative super-exposure design that lets a sensor deliver excellent low-light performance while capturing HDR scenes through a single photodiode without saturating. Thanks to this market and technology leadership, most of the ADAS image sensors on the road today were developed by onsemi. These innovations have enabled onsemi to supply high-performance sensors for automotive applications for the past two decades, making a significant contribution to ADAS and its impact on vehicle safety.

The automotive industry continues to invest heavily in ADAS and to pursue fully autonomous vehicles, moving beyond the basic driving assistance defined by SAE (Levels 1 and 2) toward true autonomous capability (SAE Levels 3, 4, and 5). Reducing road injuries is one of the main driving forces behind this trend, and onsemi's sensor technology will play a crucial role in this automotive safety revolution.

This is reported by Top Components, a leading supplier of electronic components in the semiconductor industry.


They are committed to providing customers around the world with the most essential, obsolete, licensed, and hard-to-find parts.


Media Relations


Name: John Chen


Email: salesdept@topcomponents.ru