Learning to drive is not easy. Especially if you’re a car.

The van in front was following the road rules a bit too meticulously for his liking. It would stop the moment a signal turned red, instead of trying to squeeze through at the last moment. It would wait till the countdown had completely finished before it started moving again. Never once would it go above the speed-limit.

And this perfect behaviour was slowing down not just the van itself, but all the other traffic behind it as well.

He pulled up his car next to the van, planning to tell the driver what he thought of him. But when he looked in through the window, he was in for a shock.

The van didn’t have a driver.


When you’re starting from scratch, learning to drive is not easy.

The hardest part is learning how to see. You have all these pixels coming at you from cameras, and from various other sensors on your body. But how do you put them together? How do you make out what they’re saying? How do you actually see what the various objects in front of you are, instead of being left with colour after unrelated colour?

Usually, you have a model in your head. You know there’s bound to be a road and a pavement and traffic and a sky somewhere — you just need to find out where exactly they all are. You match the pixels to different patterns, and find out which pattern fits best. “This place”, you say, “is probably the road — and that place is probably another car.”

You’re not completely sure, of course; there’s always the chance that you’ve got it wrong. But as you keep driving, you’ll become more accurate. You’ll learn to recognise objects better.
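In fact, you can picture that pattern-matching as a little program. Here’s a toy sketch in Python: everything in it, from the colour table to the numbers, is made up for illustration, and real cars use trained neural networks rather than hand-written lookups.

```python
# Toy "model in your head": the average colour you expect each object
# to have. Real perception systems learn such patterns from data.
PATTERNS = {
    "road": (90, 90, 95),     # greyish
    "sky": (160, 200, 240),   # light blue
    "car": (200, 30, 30),     # a red car, say
}

def classify_patch(patch_colour):
    """Guess what a patch of pixels is, and how sure we are."""
    def similarity(a, b):
        # Smaller colour distance -> higher similarity score.
        return 1.0 / (1.0 + sum((x - y) ** 2 for x, y in zip(a, b)))

    scores = {label: similarity(patch_colour, colour)
              for label, colour in PATTERNS.items()}
    best = max(scores, key=scores.get)
    confidence = scores[best] / sum(scores.values())
    return best, confidence

label, confidence = classify_patch((120, 120, 130))
print(label, round(confidence, 2))  # "road 0.78": probably, but not certainly
```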

That’s good, because recognising objects wrong can be a disaster. About a year and a half ago, in June 2016, a Tesla Autopilot was helping its driver steer along the highway. It wasn’t a fully self-driving car: the human was still doing the driving, and the Autopilot was only there to help with minor adjustments, like making sure the car kept straight on the road.

But then a heavy, white-painted tractor-trailer came crossing the road. Seen against the bright sky, it fooled the Autopilot: the way it was painted, it looked not like a truck but like empty space. Naturally, the Autopilot tried to drive the car into that empty space. And, just as naturally, it crashed into the truck.

Neither the driver nor the car survived.

Humans may find it strange that a huge lorry could be mistaken for an empty road. But that’s because it was an optical illusion, and optical illusions work differently for humans and for self-driving cars. What one finds perfectly obvious, the other may not see at all.


Last August, the car company Ford ran an experiment on self-driving cars. More specifically, they were trying to find out how ordinary drivers and other people would react when these strange new creatures started coming out onto the road.

People are used to having a driver to communicate with. That’s who they signal to while crossing the road, wave at to hail a bus or taxi, communicate with while driving, or even yell at when they get annoyed.

Self-driving cars, on the other hand, drive all on their own. The driver’s gone — and with the driver go the eye contact and other subtle gestures: things that people don’t consciously notice, and yet are all so accustomed to. For things to work out, we’ll need something to replace them. And that’s exactly what Ford was trying to make.

The experiment was conducted jointly with the Virginia Tech Transportation Institute. It featured a “light-bar” on the windscreen, in the place where a driver’s eyes would normally be. That light-bar would signal what the car was doing. A slow, white pulse? Then it’s okay to overtake. Rapid blinking? Watch out — it’s accelerating from a stop!
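If such a standard “language” ever emerges, it would amount to a shared lookup table that every car and pedestrian agrees on. Here’s a minimal sketch: the two signals are the ones from the experiment above, while the state names and the fallback are my own invention.

```python
# The light-bar "language" as a simple lookup table. The two entries
# come from the Ford experiment; everything else here is hypothetical.
LIGHT_SIGNALS = {
    "yielding": "slow white pulse",    # okay to overtake
    "setting_off": "rapid blinking",   # watch out: accelerating from a stop
}

def signal_for(state):
    """Return the light pattern for the car's current state."""
    # Fallback for states without an agreed signal (made up here).
    return LIGHT_SIGNALS.get(state, "steady light")

print(signal_for("yielding"))     # slow white pulse
print(signal_for("setting_off"))  # rapid blinking
```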

Ford plans to keep working on these light-signals, fine-tuning them by observing how people respond. They aim to work with other companies and create a standard light-bar “language” to be used by all. That way, everyone will know how to communicate when cars get advanced enough to safely drive around on their own.

Of course, cars haven’t got that advanced just yet. That’s why Ford’s “self-driving” van wasn’t self-driving at all. It’s just that nobody noticed the driver, who was dressed up as a seat.


After you’ve learned how to see — or at least, picked up enough skills to get by — then comes the next step: deciding what to do.

That’s a bit easier, because you don’t have to think much. There are usually rules that you can simply follow.

The main rule is to keep on the road. That means: don’t drive off the side; keep adjusting the steering to stay on track. Which is pretty easy to do, once you know where the road is.
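That constant adjusting is, at heart, a feedback loop. Here’s a toy version with a single made-up number for how far you’ve drifted from the lane’s centre; real cars use far more sophisticated controllers, but the idea of steering against the error is the same.

```python
# Toy lane-keeping: steer in proportion to how far off-centre you are.
def steering_correction(offset, gain=0.5):
    """Positive offset = drifted right, so steer left (negative), and vice versa."""
    return -gain * offset

position = 1.0  # start one metre to the right of the lane centre (made up)
for step in range(5):
    position += steering_correction(position)
    print(f"step {step}: {position:+.2f} m from centre")
# The drift shrinks each step: +0.50, +0.25, +0.12, +0.06, +0.03
```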

It’s not enough just to be on the road, however. You also need to be on the correct side of the road. You have to adjust your speed according to the road you’re on, going fast on highways and slowing down for speed-bumps. You need to identify road-signs in case there are special instructions to follow — although that’s also straightforward if you’ve recognised what the signs are.

Then, you have to deal with the traffic. You should know how to make way for other vehicles, how to match their speed, and how to avoid banging into them. Traffic is tricky, because it doesn’t always move the same way. You’ve got to learn to predict what the vehicles are going to do next.
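The simplest prediction of all is to assume each vehicle keeps doing exactly what it’s doing now. This sketch extrapolates a made-up neighbour’s position one second ahead; real systems build on this with much richer models.

```python
# Bare-bones prediction: assume constant speed and heading.
def predict_position(x, y, vx, vy, dt=1.0):
    """Extrapolate a vehicle's position dt seconds into the future."""
    return x + vx * dt, y + vy * dt

# A hypothetical car 10 m ahead doing 14 m/s, drifting 0.5 m/s into our lane:
print(predict_position(10.0, 0.0, 14.0, 0.5))  # -> (24.0, 0.5)
```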

Some rules are simple, like “slow down if the car just in front of you slows down”. Or “when switching lanes, you can cross the line if it’s a dashed line, but not if it’s a solid one.”

Other rules are more complicated. Some roads have one-way crossing lines, where you can cross from one lane to another but not the other way round. Those lanes are marked with a dashed line and a solid line running side by side.

In this situation, you can cross a dashed line if there’s a solid line just beyond it, but you can’t cross if the dashed line comes after the solid one. Cars on the dashed side of the marking can cross over; cars on the solid side cannot.
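Written as a program, the rule only depends on the order of the lines as you’d meet them. The tuple representation below is invented for this sketch.

```python
# Lane-crossing check: the marking is given as the lines in the order
# you would cross them, nearest first.
def may_cross(marking):
    """You may cross only if the line nearest you is dashed."""
    return marking[0] == "dashed"

print(may_cross(("dashed", "solid")))  # True: dashed first, solid beyond
print(may_cross(("solid", "dashed")))  # False: the solid line blocks you
print(may_cross(("dashed",)))          # True: a plain dashed line
print(may_cross(("solid",)))           # False: a plain solid line
```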

Rules like that, though complex, can still be followed. But there are times when you’re not at all sure what to do. Those are the situations you’ve never encountered before.

One day, a Google car was driving round a bend, when along came a duck, pursued by a lady, who was sitting in a wheelchair and carrying a broom. Luckily, the car had the sense to slow down, which is what you must do, too. The car had been told what to do in every situation its programmers had thought of — but nobody ever told it what to do if it came across a lady in a wheelchair chasing a duck with a broom!


While Ford works to ease the transition from driven cars to self-driving ones, the Chinese tech company Baidu is taking a different approach. They don’t try to make their vehicles fit in. Instead, they make them stand out. So when you see one of those vehicles, you immediately know it’s self-driving. You can react accordingly, ready for the different sort of driving it’ll do compared to a human driver.

Baidu is proposing special lanes for self-driving cars: lanes where everyone will follow stricter rules, at least until self-driving cars are advanced enough to navigate the messier world of human driving.

Those cars will be ready for some roads earlier than others. Maruti Suzuki chairman R.C. Bhargava has been quoted as saying self-driving cars won’t work in India. Drivers in India are known for not following the official rules, instead making up new ones as they go along. I know someone who got pulled up by the traffic police because he drove the correct way round a ring-road, instead of cutting across like everyone else.

Instead of making self-driving vehicles work everywhere, Baidu will first work on getting self-driving to work in some places — for example, buses that follow a certain fixed route. That way, the vehicles will know the place well and have a better idea of what to expect. And they get to practise, making their self-driving more and more accurate.


Once you can decide what to do, there’s just one thing left: knowing where to go. At the moment, it’s pretty easy. A GPS does all the hard work; you just have to follow directions.

In the future, however, things could get more complex. You’ll probably start talking to other cars and finding out where they are going, so you can all coordinate the traffic better. Lots of you could line up in a procession, each riding the slipstream of the one in front.

You’ll start to talk to humans, too. You’ll begin to learn their habits: when they’re going to get impatient, where they may make an unplanned stop on the way. And then, there’s the nice part.

As you learn to adapt to this strange, human, world, the humans will also start adapting to you.


If humans get used to self-driving cars, they’ll also start figuring out how to hack them.

The CIA, it seems, has a project working on just that. If they can somehow get access to a self-driving car through the Internet, they can program it to do things. For example, they could program a car full of terrorists (or people they think are terrorists) to turn and crash into a wall, in what would be a perfect stealth murder.

But such plans don’t have to be so hi-tech. Far from needing computers and advanced hacking techniques, all you may need is a bit of salt.

That’s what artist James Bridle used for his ‘Autonomous Trap 001’. It was basically two concentric circles of salt on the road: the outer one dashed, and the inner one solid.

Remember what the lane-crossing rules said?

You can cross a dashed line if there’s a solid line just beyond it, but you can’t cross if the dashed line comes after the solid one.

Cars will be able to cross the dashed and solid lines to get into the circle — but they won’t be able to cross the solid and dashed lines to get back out!

They’ll be trapped inside the circle and, unless someone comes to save them, there they’ll remain — waiting and waiting, or driving round and round, for ever.


Have something to say? At Snipette, we encourage questions, comments, corrections and clarifications — even if they are something that can be easily Googled!