STMicroelectronics (France) has announced the third generation of an SoC processor for the CAN-connectable Mobileye driver assistance system. The vision-based driver assistance system developed by Mobileye (Israel) provides a redundant interface to the CAN in-vehicle networks.
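To illustrate how such a system might report its status over CAN, here is a minimal sketch that packs a lane-departure status into a classic 8-byte CAN data field. The arbitration ID, signal layout, and scaling are purely hypothetical; the actual Mobileye/OEM message definitions are not given in the article.

```python
import struct

# Assumed arbitration ID, for illustration only.
LDW_MSG_ID = 0x700

def pack_ldw_payload(lane_offset_cm: int, warning_active: bool) -> bytes:
    """Pack a hypothetical lane-departure status into a CAN data field.

    lane_offset_cm: signed lateral offset from the lane centre, in cm.
    warning_active: True when the LDW alert should fire.
    """
    flags = 0x01 if warning_active else 0x00
    # <h = little-endian signed 16-bit, B = unsigned byte, 5x = pad to 8 bytes
    return struct.pack("<hB5x", lane_offset_cm, flags)

payload = pack_ldw_payload(-42, True)
assert len(payload) == 8  # a classic CAN data field holds at most 8 bytes
```

A receiving node would unpack the same layout with `struct.unpack("<hB5x", payload)`.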
THE EYEQ3 AND EYEQ3-LITE will be the first members of a third-generation family from the co-operation between ST and Mobileye, which began in 2005. The EyeQ1, the first generation of the processor, is now in production at several car makers and offers market-leading functions to drastically reduce the number of accidents, including lane-departure warning (LDW), adaptive-headlight control (AHC), traffic-sign recognition (TSR), collision avoidance via radar-camera fusion, and forward collision warning (FCW).
For the award-winning second generation, ST and Mobileye increased the single-chip processing power of the EyeQ2 six-fold over the EyeQ1, enabling the vision processor to take the active-safety concept to a new level, in recognition that pedestrians are the most vulnerable road users. In the USA alone, NHTSA (National Highway Traffic Safety Administration) reports that 4000 fatalities and 60000 injuries from road accidents involve vehicles colliding with either pedestrians or cyclists. Therefore, in addition to integrating extra Mobileye radar-vision fusion-based functions on top of those offered in the previous generation, the EyeQ2 features pedestrian detection and an option for the car manufacturer to enable fully automatic emergency braking, particularly for use in urban environments. The EyeQ2 has recently entered production in the new Volvo S60 sedan-car series, with additional launches of EyeQ2-based systems expected in the coming months from several carmakers.
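The automatic-emergency-braking decision mentioned above typically rests on a time-to-collision (TTC) estimate. The following sketch shows the basic arithmetic; the 1.5 s threshold is an illustrative assumption, as real AEB calibrations are OEM-specific and not stated in the article.

```python
def time_to_collision_s(range_m: float, closing_speed_mps: float) -> float:
    """Return time-to-collision in seconds; inf if the gap is not closing."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return range_m / closing_speed_mps

# Hypothetical calibration: request braking below 1.5 s TTC.
AEB_TTC_THRESHOLD_S = 1.5

def should_brake(range_m: float, closing_speed_mps: float) -> bool:
    return time_to_collision_s(range_m, closing_speed_mps) < AEB_TTC_THRESHOLD_S
```

For example, a pedestrian 12 m ahead with a 10 m/s closing speed gives a TTC of 1.2 s, which would trigger braking under this assumed threshold.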
For the EyeQ3, the two companies are co-developing an evolution of the processor architecture for next-generation advanced active-safety products. Once again, the silicon will be six times more powerful than its predecessor, meeting demands for higher resolution, which allows objects to be distinguished even better than before, as well as the growing drive for extra functionality. Design of the EyeQ3 is already underway, and stress-test qualification according to AEC-Q100 is planned within 2013. Mobileye and ST have already gained a couple of design wins for the EyeQ3 from global OEMs, for a full range of functions. Production is expected to start in 2014.
“To sustain the rapid growth of driving-assistance systems, and camera-based aids in particular, the market needs stronger computing platforms, and at lower cost, that are capable of handling the increasing demands for customer functions going from alert to mitigation to full braking,” said Prof. Amnon Shashua, the Sachs professor of computer science at the Hebrew University of Jerusalem and Mobileye’s co-founder and chairman. “The EyeQ3 will be the critical engine for this market as it evolves toward accident-free and autonomous driving.”
“The detection capabilities of the EyeQ3, even in difficult environmental conditions, allow for both notification and for crash mitigation, helping get consumers a step closer to eliminating accidents from our roads,” said Marco Monti, Group Vice President and General Manager, ST’s Automotive Electronics Division. “By combining ST’s automotive design and manufacturing expertise with Mobileye’s strength in video-based driver-assistance systems, we are providing an optimal and automotive-market-proven solution for future innovative and cost-competitive car-safety applications.”
The significantly increased processing power of the EyeQ3 will enable it to run an unprecedented bundle of front-camera and surround-camera based features simultaneously.
The EyeQ3 will accept multiple camera inputs from surround-view systems, which will also incorporate the full scope of functions to create a safety 'cocoon' around the vehicle. The EyeQ3 processor will feature four multi-threaded MIPS32 cores, coupled with four cores of the new generation of Mobileye’s innovative Vector Microcode Processors (VMP), which together offer the optimal balance of control and data processing in an architecture tailored for vision processing. In addition to the EyeQ3, a 'lite' version of the processor is planned; it will share a subset of the EyeQ3’s core architecture, supporting a reduced bundle of functions.
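The control/data split described above can be sketched as a simple dispatch pattern: control code partitions each frame into tiles and hands them to vector workers. The four-worker count matches the article's four VMP cores, but everything else here, including the stand-in kernel, is a simplified assumption rather than the EyeQ3 design.

```python
import queue
import threading

NUM_VECTOR_CORES = 4  # matches the four VMP cores named in the article

def vector_kernel(tile):
    # Stand-in for a vector microcode kernel (e.g. a filter pass);
    # here it simply sums the tile's pixel values.
    return sum(tile)

def run_pipeline(frames):
    """Control side splits frames into tiles; vector workers process them."""
    work = queue.Queue()
    results = []
    lock = threading.Lock()

    def vector_worker():
        while True:
            tile = work.get()
            if tile is None:  # sentinel: no more work
                return
            out = vector_kernel(tile)
            with lock:
                results.append(out)

    workers = [threading.Thread(target=vector_worker)
               for _ in range(NUM_VECTOR_CORES)]
    for w in workers:
        w.start()
    # Control cores would enqueue tiles like this.
    for frame in frames:
        for tile in frame:
            work.put(tile)
    for _ in workers:
        work.put(None)
    for w in workers:
        w.join()
    return results
```

The design choice being illustrated is that scalar control flow (scheduling, sequencing) and data-parallel number crunching run on different kinds of cores, which is the balance the MIPS32/VMP pairing is described as providing.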