“Cameras and modules and systems…oh my!”

By Andrew Burt

Welcome to the first post of EYE ON SENSORS, the image sensor blog of Toshiba America Electronic Components (TAEC). Our goal for this new blog is to make it an informative and entertaining read that you’ll want to visit regularly.

As VP of the Image Sensor Business Unit at TAEC, where I’ve spent the past 16 years of my career, I’m quite passionate about this topic – not only because it’s my job to be, but because there are so many exciting developments taking place in this technology. We are seeing image sensors used not only in cameras, phones and tablets, but increasingly in some newer areas – medical imaging, security/surveillance applications, and the automotive arena.

The use of camera sensors in the automotive space is a natural evolution of the technology, as the level of electronic content in our cars has continued to increase at a rapid pace. About 83 million car cameras are expected to be sold in 2020, a 5x increase from 2012, according to IHS Automotive.

Presenting at TU-Automotive, June 2015

I had the opportunity recently to share Toshiba’s perspective regarding sensor use in vehicles when I spoke at the TU-Automotive Conference, held in early June near Detroit. Focused on three key buzzwords surrounding emerging vehicle technology – telematics, autonomy and mobility – the event was fascinating. Virtually every major carmaker and leading automotive supplier was represented among the speakers and panelists. We’ll look at some key findings from the conference in future posts.

CMOS image sensors in cars have come a long way since the first vehicles with rear-view cameras were introduced, letting you make sure you weren’t going to run over your kid’s bicycle or hit the neighbor’s dog when backing out of your driveway. The need for more camera sensors in cars is being driven by the rise of Advanced Driver Assistance Systems (ADAS), whose growth, in turn, has been spurred by the creation of new standards as governments continue to push for greater road safety. With ADAS still in early development, hardware and algorithms are in flux, creating significant opportunity for Toshiba – a silicon-to-system provider with partnerships that help meet varying requirements throughout the sensor ecosystem (see Figure 1).


In automotive applications, current image sensors can detect objects about 250 feet (roughly 75 meters) away, but detection of people and objects at distances of more than 600 feet (roughly 185 meters) will be required going forward, particularly when we factor in self-driving, or autonomous, cars. Various viewing systems – such as surround-view (top-view) systems and side-view parking-assist systems – are expected to become widely employed, in addition to the rear-view systems already in common use.

The visual in Figure 2 shows the resolution trend and new applications for cameras in cars. As you can see, new vehicles may integrate as many as nine camera sensors to facilitate a variety of functions. Some are employed for viewing systems used directly by the driver, such as blind-spot mitigation or sighting pedestrians or hazards in the street, while multi-camera synching enables sensing systems that aid the driver in such actions as entering/leaving parking spaces and staying in or changing lanes. Another area is the brave new world of monitoring driver alertness and attentiveness.


You’ll notice that automotive sensors do not follow the trend toward very high resolution of 20MP or more seen in consumer electronics, such as smartphones. Whereas smartphone users demand ever-higher-resolution cameras when making new purchases (a trend we’ll discuss further in a future post), automotive sensors need larger pixels to cope with the extreme range of lighting conditions – from dark tunnels to oncoming headlights – that they must handle to function properly, which is why we expect the standard resolution to remain at 1.3MP for the foreseeable future. For other key automotive-sensor metrics, see the table below.
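To see why resolution and pixel size pull against each other, consider the arithmetic for a fixed optical format. The sketch below assumes a hypothetical 1/3-inch sensor (about 4.8 mm of active width) and illustrative resolutions of 1.3MP (1280×960) and roughly 20MP (5168×3876) – these numbers are for illustration only and don’t describe any specific Toshiba part:

```python
# Illustrative: pixel pitch and relative light-gathering area for a fixed
# optical format. Sensor dimensions and resolutions are assumptions made
# for the sake of the arithmetic, not specs of any real product.

def pixel_pitch_um(sensor_width_mm, h_pixels):
    """Pixel pitch in microns, assuming square pixels spanning the full width."""
    return sensor_width_mm / h_pixels * 1000.0

SENSOR_WIDTH_MM = 4.8  # assumed 1/3-inch optical format

auto_pitch = pixel_pitch_um(SENSOR_WIDTH_MM, 1280)   # 1.3MP (1280x960)
phone_pitch = pixel_pitch_um(SENSOR_WIDTH_MM, 5168)  # ~20MP (5168x3876)

print(f"1.3MP pitch: {auto_pitch:.2f} um")   # ~3.75 um
print(f"20MP  pitch: {phone_pitch:.2f} um")  # ~0.93 um

# Light-gathering area per pixel scales with the square of the pitch:
print(f"per-pixel area advantage: {(auto_pitch / phone_pitch) ** 2:.0f}x")
```

Roughly 16 times more light per pixel at 1.3MP than at 20MP on the same die – which is the kind of headroom a camera needs when a headlight and an unlit pedestrian share the same frame.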


The quality metrics are associated with automotive safety specifications. Automotive Safety Integrity Level (ASIL) is a risk classification scheme defined by ISO 26262 – the Functional Safety for Road Vehicles standard. AEC-Q100 refers to the stress test qualification for ICs established by the U.S.-based Automotive Electronics Council, which sets qualification standards for all electronics components supplied to the automotive industry.
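For readers curious how an ASIL is actually assigned: ISO 26262 rates each hazard on Severity (S1–S3), probability of Exposure (E1–E4) and Controllability (C1–C3), then maps the combination to QM (no ASIL) or ASIL A through D via a lookup table. The sketch below uses a sum-based shortcut that reproduces that table; treat it as an illustration of the classification logic, not as a substitute for the standard itself:

```python
# Illustrative sketch of ISO 26262 ASIL determination from a hazard's
# Severity (S1-S3), Exposure (E1-E4) and Controllability (C1-C3) ratings.
# The standard defines this as a lookup table; the sum-based shortcut
# below matches it but is a simplification for illustration only.

def asil(severity: int, exposure: int, controllability: int) -> str:
    """Map S/E/C ratings to an ASIL letter, or 'QM' (no ASIL required)."""
    total = severity + exposure + controllability  # ranges from 3 to 10
    return {10: "D", 9: "C", 8: "B", 7: "A"}.get(total, "QM")

# Worst case: life-threatening (S3), highly probable (E4), hard to control (C3)
print(asil(3, 4, 3))  # -> D
# A severe but rarer, more controllable hazard lands at a lower level
print(asil(3, 2, 2))  # -> A
```

The higher the resulting ASIL, the more rigorous the development process and safety mechanisms required of the component – which is why camera sensors feeding ADAS functions carry ASIL targets at all.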

Moving forward, in addition to reducing sensor costs, the industry will continue adapting and optimizing the technology to accommodate autonomous driving. A Fortune article from a few months ago underscores how snow and ice pose challenges that still need to be overcome before self-driving or driver-assisted cars are viable in all climates. Even under clear skies, however, a vehicle with the latest ADAS capabilities definitely takes some getting used to. (More on my own recent experience in this regard in the next post.)

I hope you’ve enjoyed this glimpse into the state and future of automotive sensors, and that you’ll share this link with your friends and colleagues interested in the world of image sensors. To subscribe, please click on the link at right – we’ll let you know when new posts are up, and we’ll look forward to receiving your feedback in the comments section below.

