A car with a (point of) view

By Andrew Burt


Autonomous cars. Few technology-related topics these days kindle such heated reactions – although the new Apple Pencil is certainly getting its share of attention. Reactions to driverless cars run the gamut from excited anticipation to dystopian alarm. It’s not so much fear of the machines taking over that raises our antennae as the fact that we simply like to be in charge. We all want to feel that we’re in control of our cars, rather than being shuttled about by an unseen chauffeur.

But how much are we really in control?

Chris Gerdes, director of the Center for Automotive Research at Stanford (CARS), told TU-Automotive Conference attendees that 94% of the automotive accidents that occur annually (over 5 million in the U.S. alone) are preventable because they’re caused by human error – whether from sleep deprivation, intoxication, texting, or simply poor reflexes. Intrinsically, we know that “someone” is at fault when an accident occurs (not us, of course!), so this statistic makes self-driving cars seem like the ideal solution. Are they? The answer isn’t neat and tidy.

Cars that can drive themselves bring to the table a number of challenges that have nothing to do with engineering or technical specifications. There are legal and ethical implications to consider. For example, from a legal standpoint, if an accident does happen (because nothing is 100% fool-proof), who is responsible? The owner? The carmaker? Component and/or software providers? In cases where bad roads or hazards were involved, does the state or local government bear the burden? Government policies need to be developed that take autonomous vehicles into account, and the funding of future infrastructure projects will no doubt be affected. And soon: many automakers and suppliers have publicly stated plans to put self-driving cars on the road by 2020.

CMOS image sensors – the eyes of a camera – play an important and growing role in enabling autonomous driving. Vehicles may contain as many as 10 cameras when the age of self-driving cars arrives, and Strategy Analytics estimates that global shipments of automotive cameras will more than triple, reaching 102 million units by 2020 (see chart below). With all these sensors comes an increased need for security, and solutions are being developed to minimize the potential for, and impact of, another critical factor: hacking.

2020 Automotive Camera Demand. (PRNewsFoto/Strategy Analytics)

For most of us to make the decision to invest in a driverless car, we’ll need to be confident it’s been made as impervious to cyber-penetration as possible. In a recent IEEE Spectrum article, a principal scientist with Security Innovation claimed to have found an inexpensive and fairly simple way to hack the costly light detection and ranging (lidar) systems used by most self-driving cars to identify obstacles. And in a widely reported experiment, hackers hired by Wired magazine took control of a connected SUV, which ultimately ended up in a ditch. The driver was not injured, but what if some electronics DIY types, tired of building drones, were to try their hand at hacking projects?

This brings us to the question of ethics. Google “ethics” and “self-driving car,” and you get a host of results – with good reason. While an image sensor-driven camera and CPU are basically analogous to the coupling of the human eye and the brain, they don’t possess the same decision-making skills as humans. Debate therefore abounds as to how to program cars to make split-second decisions.

A great deal of research is being done in this area. As detailed recently in MIT Technology Review, Stanford’s CARS is working on programming vehicles to make real-world ethical decisions. Chris Gerdes affirms that carmakers are highly aware of this issue, and that they have programmers actively working on the design of control algorithms that take these challenges into account.

For example, can a machine that’s not supposed to break laws do so to save a life – e.g., cross over a double-yellow line to avoid hitting a bicyclist or pedestrian? What if the scenario is more complex, and no matter the option, someone will be injured, or worse? Do the “needs of the many outweigh the needs of the few,” as Mr. Spock would say?
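One way researchers frame such decisions is as cost (or penalty) minimization over candidate maneuvers, with traffic-law violations assigned a finite cost and collisions an effectively unbounded one. The toy sketch below – with entirely hypothetical weights and maneuver options – shows how a planner built this way can choose to cross the double-yellow line. It is an illustration of the concept, not any carmaker’s actual algorithm.

```python
# Illustrative sketch only: a toy cost-based planner in which traffic-law
# violations carry a finite penalty while collisions are near-infinitely
# costly, so crossing a double-yellow line can "win" when it avoids a
# pedestrian. All weights and maneuver options are hypothetical.

COLLISION_COST = 1e9        # treat any collision as effectively unacceptable
LAW_VIOLATION_COST = 100.0  # finite penalty for breaking a traffic rule

def trajectory_cost(collides, breaks_law, comfort_penalty):
    """Total cost of one candidate maneuver."""
    cost = comfort_penalty
    if collides:
        cost += COLLISION_COST
    if breaks_law:
        cost += LAW_VIOLATION_COST
    return cost

candidates = {
    "stay in lane":             trajectory_cost(collides=True,  breaks_law=False, comfort_penalty=0.0),
    "brake hard":               trajectory_cost(collides=True,  breaks_law=False, comfort_penalty=50.0),
    "cross double-yellow line": trajectory_cost(collides=False, breaks_law=True,  comfort_penalty=20.0),
}

best = min(candidates, key=candidates.get)
print(best)  # -> "cross double-yellow line"
```

In this framing, the ethical debate becomes a debate about how those weights are chosen – and who gets to choose them.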

We’d like to know what you think. Please feel free to share in the comments section your views on driverless cars and their legal, ethical and other implications for the automotive industry and our world at large.

Are you smarter than a sensor?

By Andrew Burt


Welcome back, or, if you’re checking out our blog for the first time – welcome! This post is the second of an initial series of three looking at the automotive space; rest assured that future installments will explore other facets of the image sensor business, as well.

As I mentioned in our inaugural post, I attended and spoke at the TU-Automotive Conference in June. Speakers representing companies throughout the automotive ecosystem, and beyond, addressed a host of topics ranging from data crunching and monetization to security and infrastructure challenges. In addition, speakers debated the ethical and moral issues surrounding autonomous cars, which we’ll touch on in a future post.

The folks at TU-Automotive have encapsulated some of the conference highlights – you can read their Day 1 and Day 2 summaries of the event. I’m going to cherry-pick a couple of the comments highlighted and add a few observations of my own.

Our focus, of course, is the role played by image sensors. Capturing and processing images is a critical aspect of advanced driver assistance systems (ADAS), and bringing advanced vision system technology to fruition is key for autonomous vehicles. As several speakers noted, however, the public is leery of autonomy, and consumer expectations need to be set appropriately. As Jude Hurin with the Nevada Dept. of Motor Vehicles emphasized, autonomy needs to be introduced to the driving public as not just another automotive innovation, but a major milestone – a tipping point in our relationship with our cars.

Facilitating acceptance will rely on optimizing the consumer experience. David Miller of Covisint cited Gartner’s prediction that, by 2020, more than 26 billion connected “things” will be installed. This, of course, includes connected cars – 150 million of which will be on the road by 2020, according to IHS Automotive (see Figure 1). What we want is ease of connection, and new cars will integrate wireless connectivity (cellular, WiFi, Bluetooth, etc.) with ADAS and autonomous driving.

Figure 1

Of course, all this connectedness heightens the need for security – witness the widely reported vulnerabilities discovered in some types of vehicles. A large-scale hacker attack could cause serious pileups and confusion. Joe Fabbre of Green Hills Software stated the need for a security architecture that identifies critical system components and separates them from untrusted code, with strict access control enforced.
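To make Fabbre’s separation idea concrete, here is a minimal, hypothetical sketch of a gateway that forwards messages between software partitions and vehicle subsystems only when a static policy allows it. The partition and subsystem names are invented for illustration and don’t reflect Green Hills’ products or any real vehicle architecture.

```python
# A minimal, hypothetical sketch of separating critical components from
# untrusted code: a gateway only forwards messages when a statically
# defined policy says the sending partition may reach the target
# subsystem. All names below are illustrative.

ACCESS_POLICY = {
    "adas_controller": {"brakes", "steering", "instrument_cluster"},
    "infotainment":    {"instrument_cluster"},  # untrusted code: display only
    "telematics_unit": {"instrument_cluster"},
}

def forward_message(source_partition, target_subsystem, payload):
    """Deliver payload only if the policy permits this source->target pair."""
    allowed = ACCESS_POLICY.get(source_partition, set())
    if target_subsystem not in allowed:
        raise PermissionError(
            f"{source_partition} may not send to {target_subsystem}")
    # ... hand the payload to the real bus driver here ...
    return f"delivered {payload!r} to {target_subsystem}"

print(forward_message("adas_controller", "brakes", b"\x01"))
# forward_message("infotainment", "brakes", b"\x01")  # raises PermissionError
```

The point of such a design is that even fully compromised infotainment code has no path to the brakes, because the policy check sits outside the untrusted partition.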

These speaker comments are representative of some key themes that emerged from the conference:

  1. Connected car disruption is happening NOW
  2. Ecosystem and business models are expanding exponentially
  3. Data management is a key challenge: process, predict and – most importantly – protect data
  4. Drivers want an integrated experience – navigation, entertainment, news/weather
  5. As noted earlier, sensors – imaging and others – are a critical connectivity component enabling ADAS and autonomy

I’d like to end with an anecdote that ties back to the title of this post: my own recent experience piloting a rental vehicle equipped with the latest ADAS technologies. I was attending Toshiba’s annual sales conference and had reserved a rental car at the airport in order to pick up my colleagues flying in from Japan and take them to our hotel. It was a brand new SUV, and, as one tends to do when driving a rental, I acclimated to the vehicle “on the fly.” I soon learned that, today, this is a whole new ball game.

I pulled onto the freeway and signaled to change lanes. As I began to move over so that I could pass another vehicle, I felt a sudden tapping on my left leg – firm enough that I was quite startled and thought perhaps my muscles were rebelling against sitting too long. A few minutes later, when I moved to change lanes again, I felt the same sensation on my right leg. With my colleagues chatting amongst themselves in Japanese behind me, I was left to my own thoughts, and I briefly wondered if I was having some type of neurological episode. Of course, I soon realized that the SUV’s side-mirror image sensors were triggering the in-cabin alert system, sending me “are you sure you want to do this?” signals.

It was definitely an eye-opening experience – made even more so when I commenced the process of pulling the massive vehicle into what looked like a too-small parking spot. Up popped the heads-up display across the lower half of the windshield to guide me in, like landing a fighter jet on an aircraft carrier. Being familiar with these technologies in the abstract is very different from using them first-hand. It gave me valuable insight into the consumer experience – and reminded me that from now on, I should take a few minutes to look over the manual when I get a fully loaded rental car!
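For the technically curious, the logic I ran into is easy to caricature. Below is a toy sketch – hypothetical function and signal names, nothing from a real vehicle – of a blind-spot alert that fires only when the driver signals into an occupied lane.

```python
# Toy sketch of the behavior described above: haptic alerts fire only when
# a lane change is signaled while the side-mirror camera reports a vehicle
# in the adjacent lane. Function and input names are hypothetical.

def should_alert(turn_signal_on, vehicle_in_blind_spot):
    """Return True when the driver signals into an occupied lane."""
    return turn_signal_on and vehicle_in_blind_spot

# Cruising with no signal: no alert, even with a car in the blind spot.
assert should_alert(turn_signal_on=False, vehicle_in_blind_spot=True) is False
# Signaling into an occupied lane: tap the driver's leg.
assert should_alert(turn_signal_on=True, vehicle_in_blind_spot=True) is True
```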

See you next time – and if you’re not already following Image Sensors World, be sure to check out this important industry blog.

“Cameras and modules and systems…oh my!”

By Andrew Burt

Welcome to the first post of EYE ON SENSORS, the image sensor blog of Toshiba America Electronic Components (TAEC). Our goal for this new blog is to make it an informative and entertaining read that you’ll want to visit regularly.

As VP of the Image Sensor Business Unit at TAEC, where I’ve spent the past 16 years of my career, the topic is one about which I’m quite passionate – not only because it’s my job to be, but because there are so many exciting developments taking place in this technology. We are seeing image sensor use growing not only in cameras, phones and tablets, but also increasingly in newer areas – medical imaging, security/surveillance applications, and the automotive arena.

The use of camera sensors in the automotive space is a natural evolution of the technology, as the level of electronic content in our cars has continued to increase at a rapid pace. About 83 million car cameras are expected to be sold in 2020, a 5x increase from 2012, according to IHS Automotive.

Presenting at TU-Automotive, June 2015

I had the opportunity recently to share Toshiba’s perspective regarding sensor use in vehicles when I spoke at the TU-Automotive Conference, held in early June near Detroit. Focused on three key buzzwords surrounding emerging vehicle technology – telematics, autonomy and mobility – the event was fascinating. Virtually every major carmaker and leading automotive supplier was represented among the speakers and panelists. We’ll look at some key findings from the conference in future posts.

CMOS image sensors in cars have come a long way since the first wave of vehicles featuring rear-view cameras was introduced, allowing you to ensure you weren’t going to run over your kid’s bicycle or hit the neighbor’s dog when backing out of your driveway. The need for more camera sensors in cars is being driven by the rise of Advanced Driver Assistance Systems (ADAS), whose growth, in turn, has been spurred by the creation of new standards as governments continue to push for greater road safety. With ADAS still in early development, hardware and algorithms are in flux, creating significant opportunity for Toshiba – a silicon-to-system provider with partnerships that help meet varying requirements throughout the sensor ecosystem (see Figure 1).

Figure 1

In automotive applications, current image sensors can detect objects about 250 feet away, but going forward – particularly once we factor in self-driving, or autonomous, cars – they will need to detect people and objects at distances of more than 600 feet. Various viewing systems, such as surround-view (top-view) systems and side-view parking-assist systems, are expected to become widely employed, in addition to the rear-view systems already in broad use.
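A rough, back-of-the-envelope calculation shows why that range requirement is so demanding. Assuming (purely for illustration) a 1.3MP sensor 1280 pixels across behind a 50-degree horizontal field of view, a half-meter-wide pedestrian shrinks from roughly nine pixels at 250 feet to under four at 600 feet:

```python
# Back-of-the-envelope only: how many pixels a 0.5 m-wide pedestrian
# subtends at 250 ft vs. 600 ft, assuming a 1.3MP sensor (1280 px across)
# behind a 50-degree horizontal field of view. All parameters are
# illustrative assumptions, not a specific product's numbers.
import math

IMAGE_WIDTH_PX = 1280
HFOV_DEG = 50.0
TARGET_WIDTH_M = 0.5  # rough width of a pedestrian

# Effective focal length in pixels for a simple pinhole model.
focal_px = IMAGE_WIDTH_PX / (2 * math.tan(math.radians(HFOV_DEG / 2)))

for label, feet in [("today (~250 ft)", 250), ("needed (~600 ft)", 600)]:
    meters = feet * 0.3048
    px = TARGET_WIDTH_M * focal_px / meters
    print(f"{label}: ~{px:.1f} px across")
# today (~250 ft): ~9.0 px across
# needed (~600 ft): ~3.8 px across
```

That handful of pixels is all the detection algorithms have to work with, which is why longer range pushes designers toward narrower fields of view, higher resolutions, or both.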

The visual in Figure 2 shows the resolution trend and new applications for cameras in cars. As you can see, new vehicles may integrate as many as nine camera sensors to facilitate a variety of functions. Some are employed for viewing systems used directly by the driver, such as blind-spot mitigation or sighting pedestrians or hazards in the street, while multi-camera synching enables sensing systems that aid the driver in such actions as entering/leaving parking spaces and staying in or changing lanes. Another area is the brave new world of monitoring driver alertness and attentiveness.

Figure 2

You’ll notice that automotive sensors do not adhere to the trend toward very high resolutions of 20MP or more seen in consumer electronics, such as smartphones. Whereas smartphone users demand ever-higher-quality cameras when making new purchases (a trend we’ll discuss further in a future post), automotive sensors need bigger pixels to cope with the extreme range of light sources they must handle – oncoming headlights, tunnel transitions, direct sunlight – which is why we expect the standard resolution to remain at 1.3MP for the foreseeable future. A quick comparison below illustrates the trade-off; for other key automotive-sensor metrics, see the table that follows.
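Here is the arithmetic behind that trade-off, using representative published pixel pitches as assumptions: roughly 1.1 µm for a 20MP-class smartphone sensor versus roughly 3.75 µm for a 1.3MP automotive sensor. Light collected per pixel scales with pixel area:

```python
# Illustrative arithmetic: why fewer, bigger pixels help in cars. The pixel
# pitches below are representative assumptions, not specific products:
# ~1.1 um for a 20MP-class smartphone sensor vs. ~3.75 um for a 1.3MP
# automotive sensor. Light gathered per pixel scales with pixel area.
phone_pitch_um = 1.1
auto_pitch_um = 3.75

area_ratio = (auto_pitch_um / phone_pitch_um) ** 2
print(f"each automotive pixel collects ~{area_ratio:.0f}x more light")
# -> each automotive pixel collects ~12x more light
```

An order of magnitude more light per pixel buys the dynamic range and low-light sensitivity that matter far more on the road than megapixel counts.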

Table: Key automotive image-sensor metrics

The quality metrics are associated with automotive safety specifications. Automotive Safety Integrity Level (ASIL) is a risk classification scheme defined by ISO 26262 – the Functional Safety for Road Vehicles standard. AEC-Q100 refers to the stress test qualification for ICs established by the U.S.-based Automotive Electronics Council, which sets qualification standards for all electronics components supplied to the automotive industry.

Moving forward, in addition to reducing sensor costs, the industry will continue to adapt and optimize the technology to accommodate autonomous driving. This Fortune article from a few months ago underscores how snow and ice pose challenges that still need to be overcome for self-driving or driver-assisted cars to be viable in all climates. Even under clear skies, however, a vehicle with the latest ADAS capabilities definitely takes some getting used to. (More on my own recent experience in this regard in the next post.)

I hope you’ve enjoyed this glimpse into the state and future of automotive sensors, and that you’ll share this link with your friends and colleagues interested in the world of image sensors. To subscribe, please click on the link at right – we’ll let you know when new posts are up, and we’ll look forward to receiving your feedback in the comments section below.