A fleet of six self-driving Ford Mondeos is navigating the streets of Oxford at all hours and in all weathers to test the abilities of driverless cars as part of a new trial.
Technology firm Oxbotica, spun out of an Oxford University project, retrofitted the vehicles, which are making a nine-mile round trip within the city several times a day until March 2021.
A dozen cameras, three Lidar sensors and two radar sensors are used to put the car at ‘level 4 autonomy’, meaning it can handle almost all situations itself.
A person needs to be in the driving seat by law, but they won’t be touching the steering wheel or pedals; the driverless car will be ‘taking them for a ride’.
The Oxford trial is part of the UK government-backed £12.3 million Endeavour project, set up to try deploying a fleet of self-driving cars in several cities.
Over the next five months the six adapted vehicles will drive themselves between Parkway station and the main railway station in Oxford – a nine-mile round trip.
They will run the route several times a day ‘from morning commutes to school runs, in a range of weather conditions’ according to the team behind the trial.
Oxfordshire County Council, a partner in Project Endeavour, has been involved in the trial to test the cars in the city.
WHAT ARE THE SIX LEVELS OF SELF-DRIVING AUTOMATION?
Level Zero – The full-time performance by the human driver of all aspects of the dynamic driving task, even when enhanced by warning or intervention systems.
Level One – A small amount of control is accomplished by the system such as adaptive braking if a car gets too close.
Level Two – The system can control the speed and direction of the car allowing the driver to take their hands off temporarily, but they have to monitor the road at all times and be ready to take over.
Level Three – The driver does not have to monitor the system at all times in some specific cases, such as on motorways, but must be ready to resume control if the system requests it.
Level Four – The system can cope with all situations automatically within a defined use case, but it may not be able to cope with all weather or road conditions. The system will rely on high-definition mapping.
Level Five – Full automation. System can cope with all weather, traffic and lighting conditions. It can go anywhere, at any time in any conditions.
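The taxonomy above boils down to one practical question at each level: does the person in the driving seat still have to watch the road? A minimal sketch of that lookup (the wording and names here are illustrative, not SAE's official text):

```python
# Illustrative summary of the six driving-automation levels described above.
# Descriptions are paraphrased; this is not an official SAE definition.
LEVELS = {
    0: "Human performs all driving; system gives warnings at most",
    1: "System assists with one task, e.g. adaptive braking",
    2: "System steers and controls speed; driver must monitor constantly",
    3: "Driver may look away in limited cases but must resume on request",
    4: "System handles all situations within a defined area and conditions",
    5: "Full automation anywhere, at any time, in any conditions",
}

def driver_must_monitor(level: int) -> bool:
    """From Level 3 upwards the driver no longer has to watch the road
    constantly (though at Level 3 they must still be ready to take over)."""
    return level <= 2

print(driver_must_monitor(2))  # True
print(driver_must_monitor(4))  # False
```

The Oxford cars sit at Level 4: the safety driver is legally required to be present, but under this scheme they are not required to monitor constantly.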
Tesla’s Model 3 Sedan – one of the world’s most advanced road-legal cars with autonomous elements – currently operates at Level Two autonomy. It is equipped for Level Three autonomy, which may be introduced in a future software update
Innovation boss Laura Peacock told Autocar that the city had been at the forefront of autonomous mobility for the past four years.
‘The progress that has been made in the Connected Autonomous Vehicle (CAV) ecosystem is huge, moving from simulation, trials in isolated environments and now to the first live on-road public trials in Oxford.’
After the Oxford trial ends, the vehicles will be deployed in Greenwich, London, and in a northern city that Endeavour officials have not yet confirmed.
‘The challenge is that every place is different,’ Oxbotica senior VP and Endeavour programme director Graeme Smith told The Times.
Every location they test the driverless vehicles in adds to the data available for the self-driving system – and lets Oxbotica see how the software needs altering for different conditions.
He said doing the trials in different locations also lets them get a better idea of how to get councils and communities onboard with the idea of self-driving cars.
Project Endeavour was launched in 2019 as a consortium of autonomous vehicle and smart technology companies, with funding coming in part from the Centre for Connected and Autonomous Vehicles, a Government body.
SELF-DRIVING CARS ‘SEE’ USING LIDAR, CAMERAS AND RADAR
Self-driving cars often use a combination of normal two-dimensional cameras and depth-sensing ‘LiDAR’ units to recognise the world around them.
However, others make use of visible light cameras that capture imagery of the roads and streets.
They are trained with a wealth of information and vast databases of hundreds of thousands of clips which are processed using artificial intelligence to accurately identify people, signs and hazards.
In LiDAR (light detection and ranging) scanning – which is used by Waymo – one or more lasers send out short pulses, which bounce back when they hit an obstacle.
These sensors constantly scan the surrounding areas looking for information, acting as the ‘eyes’ of the car.
While the units supply depth information, their low resolution makes it hard to detect small, faraway objects without help from a normal camera linked to them in real time.
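The ranging principle described above is simply timing: multiply a pulse's round-trip time by the speed of light and halve it, because the light travels out to the obstacle and back. A minimal sketch of the calculation (illustrative only, not any manufacturer's actual code):

```python
# Sketch of LiDAR time-of-flight ranging: a laser pulse goes out,
# bounces off an obstacle, and returns; the delay gives the distance.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def range_from_pulse(round_trip_seconds: float) -> float:
    """Distance in metres to the obstacle that reflected the pulse."""
    # The pulse covers the distance twice (out and back), so halve it.
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after 200 nanoseconds means an obstacle ~30 m away.
print(round(range_from_pulse(200e-9), 1))  # 30.0
```

The tiny time scales involved (nanoseconds for obstacles tens of metres away) are why LiDAR units need very fast, precise electronics.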
In November last year Apple revealed details of its driverless car system that uses lasers to detect pedestrians and cyclists from a distance.
The Apple researchers said they were able to get ‘highly encouraging results’ in spotting pedestrians and cyclists with just LiDAR data.
They also wrote they were able to beat other approaches for detecting three-dimensional objects that use only LiDAR.
Other self-driving cars generally rely on a combination of cameras, sensors and lasers.
An example is Volvo’s self-driving cars, which rely on around 28 cameras, sensors and lasers.
A network of computers processes this information which, together with GPS, generates a real-time map of moving and stationary objects in the environment.
Twelve ultrasonic sensors around the car are used to identify objects close to the vehicle and support autonomous drive at low speeds.
A wave radar and camera placed on the windscreen read traffic signs and the road’s curvature, and can detect objects on the road such as other road users.
Four radars behind the front and rear bumpers also locate objects.
Two long-range radars on the bumper are used to detect fast-moving vehicles approaching from far behind, which is useful on motorways.
Four cameras – two on the wing mirrors, one on the grille and one on the rear bumper – monitor objects in close proximity to the vehicle and lane markings.
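Merging all of those sensor feeds into one real-time map means deciding when two detections are actually the same object seen by different sensors. A crude sketch of that idea (all sensor names and thresholds here are hypothetical, not Volvo's actual software):

```python
# Illustrative sketch of multi-sensor map building: detections from
# different sensors are grouped into objects when they roughly agree
# on direction and distance. Names and numbers are made up.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str          # e.g. "front_radar", "windscreen_camera"
    bearing_deg: float   # direction relative to the car's heading
    distance_m: float    # estimated range to the object

def build_environment_map(detections):
    """Group detections into objects: two readings within 5 degrees
    and 2 metres of each other are treated as the same object."""
    objects = []
    for d in detections:
        for obj in objects:
            if (abs(obj["bearing_deg"] - d.bearing_deg) < 5
                    and abs(obj["distance_m"] - d.distance_m) < 2):
                obj["sensors"].append(d.sensor)
                break
        else:
            objects.append({"bearing_deg": d.bearing_deg,
                            "distance_m": d.distance_m,
                            "sensors": [d.sensor]})
    return objects

detections = [
    Detection("front_radar", 10.0, 40.0),
    Detection("windscreen_camera", 11.0, 39.5),  # same car, two sensors
    Detection("rear_radar", 180.0, 25.0),        # vehicle behind
]
print(len(build_environment_map(detections)))  # 2
```

Seeing the same object through two different sensor types is the point of the redundancy: a camera confirms what the radar ranged, and vice versa.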