[Image: An aerial view of an urban highway at night, with imagery showing the range of detection for autonomous cars.]
The easiest way to understand autonomous driving is to compare it with how we drive today.

March 1, 2022

There’s a lot to be excited about in the world of autonomous vehicles. There are also many challenges ahead, including misconceptions to clear up and technology hurdles to overcome before autonomous vehicles go mainstream. Ralf Klädtke, TE vice president and chief technology officer for Transportation Solutions, is keenly aware of the challenges and the trends impacting the automotive industry as it moves toward full autonomy. In this first discussion in a series of three interviews, Ralf discusses the dynamics driving this evolution forward, the current obstacles, and the general perspective toward autonomous vehicles.

 

Read – and listen to – an interview with Ralf Klädtke.



1

What is the current state of engineering vehicle autonomy?

I think when we think back a couple of years, every one of us was dreaming of sitting in a robotaxi, telling an artificial intelligence, I want to go to the next theater, and everything would happen automatically. Lean back, enjoy the movie, close your eyes. And in hindsight, I must say, this has taken much longer than ever expected. A lot of development has taken place; a lot of energy and time has gone into that. But I think issues like weather, snow, and mixed traffic have been much stronger than most of us thought. And therefore, the whole progress is safety oriented and taking longer than we expected.

 

 

2

What are key barriers to reaching level-five automation? 

I think people talk a lot about levels, so here is a brief explanation of what the levels are. Level zero is you yourself driving the vehicle. When we go to level one, the short abbreviation says feet off, which means the car is able to steer when you drift across the lane, so it keeps you between the lanes. And if a car is in front of you, it starts decelerating, braking by itself, then accelerating again to the speed that you have set. So that's level one. When we talk about level two, it's kind of hands off. It's a partial automation, which means the vehicle can operate the steering, as well as the acceleration and deceleration, in specific cases. And then at level three, we talk a different language, because up to level two the driver stays liable. As of level three, the liability enters into the car, into the software, into the intelligence of the car. And here level three means eyes off. So the driver does not need to observe the drive but must be ready to take control when an alert comes up: bad weather or an unsafe situation, please take over. And when we go to level four, then you don't need a driver; that's really mind off. The driver is not needed. The vehicle can operate, but under limited driving conditions. And at level five, really no driver is needed. Full automation in all driving conditions.
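To keep the taxonomy straight, here is a minimal sketch in Python (purely illustrative, not TE code; the enum names and the liability helper are shorthand of our own) that maps the levels Ralf describes to their informal mnemonics and to who holds liability:

```python
# Minimal sketch of the SAE levels as described above; names and helper are illustrative only.
from enum import IntEnum

class SaeLevel(IntEnum):
    L0_NO_AUTOMATION = 0    # the driver does everything
    L1_FEET_OFF = 1         # lane keeping plus adaptive braking/acceleration to a set speed
    L2_HANDS_OFF = 2        # partial automation: steering and speed control in specific cases
    L3_EYES_OFF = 3         # conditional automation: driver must still take over when alerted
    L4_MIND_OFF = 4         # no driver needed, but only under limited driving conditions
    L5_FULL_AUTOMATION = 5  # no driver needed, in all driving conditions

def driver_is_liable(level: SaeLevel) -> bool:
    """Up to level two the driver stays liable; from level three liability shifts to the vehicle."""
    return level <= SaeLevel.L2_HANDS_OFF

print(driver_is_liable(SaeLevel.L2_HANDS_OFF))  # True
print(driver_is_liable(SaeLevel.L3_EYES_OFF))   # False
```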

 

 

3

What can engineers do to address the barriers to developing autonomous vehicles? 

I think there are many more barriers than we ever thought there would be. Just trying to count them, I think up to four barriers come to my mind. First, we have technical barriers. We see them when we are in undefined scenarios, especially in cities, when kids are playing on the streets, bicyclists and pedestrians are crossing the streets. Mixed traffic as well: just imagine a car being operated in Paris. Five lanes, a lot of cars going in and out. Very undefined scenarios that are hard to train an artificial intelligence for. So, the technical barriers in mixed traffic are hard to overcome. The other barrier that I see is liability. As I said before, when we go from level two to level three, the liability is with the carmaker. And that's a hard thing to have. If you are in unsafe situations in different weather conditions, then which of the carmakers takes over the liability for any accident? As we all know, in the United States for example, the fines in front of a court can be very high.

 

So that's a risk to take, and that's a barrier to going to higher levels of autonomy. Another barrier I can see is a human barrier. The point is readiness for autonomous driving: I think still about 60% of people do not trust autonomous driving. Therefore, there's a human barrier, whether people are open to switching to autonomous mode or not. And then the last barrier, not to be neglected, is cost. People are not ready to pay a huge extra on a passenger car just for enjoying and relaxing. And if you go to higher levels of autonomy, we talk about higher computing requirements, about a lot more sensors, sensor fusion, artificial intelligence, and that is not free of charge. So, there's a certain barrier around what kind of autonomy people are ready to pay for. When we talk safety, there's willingness to pay for safety in passenger cars, but for the relaxing autonomy, that's still a barrier to be talked about.

 

 

4

How do weather conditions affect autonomous vehicle design? 

I think it's a kind of journey. Nowadays we can say that if we have this perfect California sunny weather and nice markings on the roads, then it's easy for autonomous cars to follow the road and have good visibility. The cameras will see far ahead. There will be enough time for the computer to react. The latency times are not critical in this case. But when the weather turns to heavy snowfall, it's not to be underestimated. The human brain, a well-trained driver, can drive in conditions of heavy snowfall where there is no line visible on the road anymore, so it's just snow everywhere. And still an experienced driver will reduce the speed to a certain limit and drive as safely as possible. These conditions, heavy rainfall and heavy snowfall, are hard to overcome. And the set of sensors we operate here is key to that. I think we are progressing continuously. With a camera, you have good visibility, but when you have snowfall, the camera gets to its limits. Then the radar is a bit stronger, and it's about sensor fusion. I think that is a process. And I think the carmakers at the moment are trying to find a way forward that is safety oriented, so that we don't take too big steps under all these weather conditions and we stay on the safe side.
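The weather trade-off Ralf describes can be pictured as a simple weighting problem. The sketch below is purely illustrative (the condition names and weights are assumptions, not any carmaker's actual values); it only shows the idea of trusting the camera less in heavy snow or rain and leaning more on radar before fusing the detections:

```python
# Illustrative only: the per-condition sensor weights are made up to show the idea of
# down-weighting the camera in heavy snow or rain and relying more on radar.
CONDITION_WEIGHTS = {
    "clear":      {"camera": 1.0, "radar": 0.8, "lidar": 0.9},
    "heavy_rain": {"camera": 0.4, "radar": 0.9, "lidar": 0.5},
    "heavy_snow": {"camera": 0.2, "radar": 0.8, "lidar": 0.3},
}

def fused_confidence(condition: str, detections: dict) -> float:
    """Weight each sensor's detection confidence by the current weather condition."""
    weights = CONDITION_WEIGHTS[condition]
    total_weight = sum(weights[sensor] for sensor in detections)
    weighted_sum = sum(weights[sensor] * conf for sensor, conf in detections.items())
    return weighted_sum / total_weight

# The same raw detections yield a lower, more cautious fused confidence in heavy snow.
print(fused_confidence("clear",      {"camera": 0.9, "radar": 0.7}))
print(fused_confidence("heavy_snow", {"camera": 0.9, "radar": 0.7}))
```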

 

 

5

What is the difference between the autonomy value proposition and the autonomy journey?

I think when we look at passenger cars, what I can see right now is that there is a big trend toward safety, because going to fully autonomous driving, with relaxing and watching a movie, is something that carries risks. But when we talk about level two plus for passenger cars, we talk about automated driving. And that is something that is turning out very strong. We have growth rates of more than 20% per year on this topic because sensors are being used in order to have emergency braking. When you get out of a parking lot, sensors look left and right: is there any cross traffic coming in? So, they look in areas where you usually cannot look as a driver. And this is the point with level two plus: you have additional safety and still the driver stays liable, which means it is a big asset for people, and the amount of money you have to pay for it is not too much as an extra. So here you have a big gain in safety and in driving comfort, and the plus on cost is not too high. When you then move to commercial trucks, it's a different story. For commercial trucks, level two plus is additional safety if you drive and operate in the city. That's okay, but the real commercial gain, the 45% savings, comes when you go to full autonomy. And therefore, for commercial trucks, it's more interesting to go to the point of full autonomy, to be driverless. And that is the point that is driven now by these autonomous freight networks, like in the US. We already see that from Phoenix to Orlando, Florida: in the US, an autonomous freight network has been established so that trucks on the highway, in a very controlled environment and in good weather, can be operated at level five, fully autonomous. And with that you get commercial savings. Therefore it pays off, because the total savings of 45% is really the motivation to invest in many more sensors, more processing power, more computing power, in order to compensate for that.

 

 

6

Do you foresee a day when infrastructure is specifically designed for autonomous cars, trucks, and public transit networks?

Actually, I was hoping for that a little bit, because there's still the challenge of mixed traffic. So, when you have mixed traffic, just imagine a reckless driver. If in mixed traffic a reckless driver comes along and crosses the path of your autonomous truck or your autonomous car, the autonomous car will go into emergency braking. When you are in New Delhi, with a lot of cars going in and out everywhere, using the horn in order to communicate with each other, an autonomous car will be the slowest in the traffic. It will brake. It will be on the safe side. And that is an environment where it's hard to move as an autonomous car. Therefore, thinking about how to roll out autonomous transportation, I can imagine, looking to the future, megacities with 20 million people living there. It's hard to find a parking place in a megacity with 20 million people. Why not take the inner circle of a megacity and say, this is autonomy? So, we start operating autonomous passenger cars, mobility as a service, all of those operating nicely in the inner circle. You don't need a parking place anymore. You call the car. And as long as you have no mixed traffic, only autonomous cars, they will flow nicely and safely in the inner circle of a city. That will work nicely. And the other point is the highway pilots; I think that's the next thing we will see. On the highways, trucks and passenger cars operate in one direction, even in mixed traffic. I think the scenarios, meanwhile, are trained well enough that a good level of safety is feasible in, I would say, okay weather. I am not talking about snowfall or heavy rain; that is more tricky.

 

 

7

Which technologies are necessary to make vehicle autonomy possible?

Well, I think the easiest way to understand autonomous driving is to just compare it with how we drive today. How do we drive today? We have two eyes, two visual sensors. We have a brain with IQ, more or less, depending on the driver. We have lighting, we have a wiper, and that provides us 24/7 all-weather driving capability. If you translate that into autonomous driving, instead of the two visual sensors we have cameras, we might have radar, we might have a LIDAR laser scanner. We have artificial intelligence with an IQ, more or less as well, depending on the producer. And then we have some digital lighting, and that should provide us even better capabilities. But the point here is that the sensors are very different. A camera is quite good when you have good weather conditions. You can read signs, you can read any kind of information on the road. But when it comes to radar, radar is good at movement detection and at looking through rain and fog. LIDAR, again, just has laser points. It provides a three-dimensional object; it shows you the range and the object itself. But when you get to 200 meters, then you see just a few dots. So now the key is using a combination of sensors. In order to get to 100% object recognition, it's good to combine certain sensors. You start with light and the camera, and then the radar provides the movement detection.

 

Which means you see, at a longer distance, something that you might not see at night; you'll see a movement. It can be just a plastic bag crossing the street. And then the radar doesn't know: is it a plastic bag? What is it? Or is it a deer, or is it someone on a bicycle? You need light plus the camera to identify what is crossing there. And then similarly, you can use the LIDAR in certain situations, but only to a certain range. So, the key is now using different sensors in order to get to 100% probability of identifying what kind of object is in front of you, and whether you are driving safely, whether you can avoid any collision. And all this under different weather conditions. So, every sensor has pros and cons. If you have, for example, the sun very low in front of you and the range of the camera is reduced very strongly, then you see an autonomous car bouncing left and right between the lanes, because the range looking forward is getting too short. So, it's really a process of getting into that, and it takes a lot of miles and training for the artificial intelligence to operate a car the way you are used to.
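As a toy illustration of the plastic-bag example above (all class names, thresholds, and data structures here are hypothetical, not a real perception stack): radar alone can only report that something is moving at some range, and it is the camera classification that lets the planner decide between driving on and braking.

```python
# Hypothetical sketch: radar supplies range and movement, the camera supplies a class label,
# and only the combination lets the planner pick a sensible reaction.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RadarTrack:
    range_m: float     # radar is strong at range and movement detection
    speed_mps: float   # ...but it cannot tell a plastic bag from a deer

@dataclass
class CameraDetection:
    label: str         # e.g. "plastic_bag", "deer", "cyclist"
    confidence: float

def plan_reaction(radar: RadarTrack, camera: Optional[CameraDetection]) -> str:
    """Choose a maneuver from fused sensor data; without a usable camera view, stay conservative."""
    if camera is None or camera.confidence < 0.5:
        return "slow_down_and_monitor"   # something is moving, but we cannot say what
    if camera.label == "plastic_bag":
        return "continue"                # harmless object, no braking needed
    return "emergency_brake" if radar.range_m < 50 else "slow_down"

print(plan_reaction(RadarTrack(range_m=40, speed_mps=1.2), None))
print(plan_reaction(RadarTrack(range_m=40, speed_mps=1.2), CameraDetection("plastic_bag", 0.9)))
```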

 

 

Did you enjoy this interview? Read the source article.

Slower – but safer – journey toward autonomous vehicles

In the race to achieve vehicle autonomy, it is not important which type – passenger cars, fleet trucks, or robotaxis – wins. What matters is that we as an industry take a holistic approach to developing autonomy that addresses sustainability and roadway safety with zero fatalities.

Achieving level-5 autonomy in cars means addressing challenges not initially envisioned. Learn about these.