Peering through the gap in the catch fence, the members of the Fly Eagle team held their breath. A whine emerged from the distance, an engine noise that grew louder as anticipation built for the Dallara Super Formula car to round the corner and scream down the pit straight. In the blink of an eye, the car whipped past its spectators, crossed the timing beam and turned the corner at the end of the straight. The Fly Eagle team whooped; they’d set their best time yet.
Abu Dhabi’s Yas Marina Circuit is no stranger to racing cars. It hosts events throughout the year and has been home to the season finale of the Formula One World Championship since 2009. But the track has never seen racing like this before. It’s not the speed or the car that makes it special: It’s the drivers.
The Fly Eagle car is driven entirely by artificial intelligence. There is no driver in the cockpit at all.
Yet, the drama, the speed, the precision, the passion — all remain.
The Abu Dhabi Autonomous Racing League (A2RL) is the first of its kind in the region, shaping the future of motorsport as we know it. Eight university teams were invited to take part in the “challenge,” going head-to-head for a prize fund of U.S.$2.25 million.
Image: Motorsport has long been a testing ground for innovations that later make their way into road cars.

Each team races using identical Super Formula SF23 cars, the fastest open-wheel race cars after those used in Formula One, capable of reaching a maximum speed of 300 km/h. They’re also manufactured using sustainable bio-composite materials, an important factor we’ll get into later.
Each car is equipped with seven cameras, four radar sensors and three lidar units to navigate its way around the track, with the only difference between the teams lying in how they use their coding skills, algorithms and machine learning techniques to teach the cars to drive.
“Just because it’s a machine, doesn’t mean there aren’t human elements in it,” said Tom McCarthy. He’s executive director of ASPIRE, the “technology transition arm” of Abu Dhabi’s Technology Research Council. “Remember that it’s people doing the programming here.”
HOW DOES IT WORK?
The AI needs to be able to turn into corners at the right moment, know when to brake, accelerate, change gear and recognize its surroundings at all times. To get the most out of the car, it needs information on how hot the tires and the brakes are, what the wind is doing in each turn, how much grip the tires have left — all the information a human driver gets from sensors and intuits from experience.
You’d think that the fastest way around the track would be to train the AI on an “ideal lap” set by an actual racing driver, an expert, and then have the car follow that data to the letter. And indeed, there is training data for the algorithms, but every 50 milliseconds, the AI decides whether to listen to that training data or the live data it receives from its sensors. Sometimes, when it relies on its own inputs, the car shaves time off its previous best lap. Sometimes, it turns too soon and smacks into the wall.
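In code, that 50-millisecond arbitration might look something like the sketch below. Everything here is invented for illustration — the target structure, the confidence threshold, the function names — and the teams’ real software is far more sophisticated, but it captures the choice the AI makes on every tick: trust the pre-recorded lap, or trust its own sensors.

```python
# Hypothetical sketch of the per-tick decision described above.
from dataclasses import dataclass

@dataclass
class Target:
    speed: float     # target speed, m/s
    steering: float  # target steering angle, radians

def choose_target(reference: Target, live: Target,
                  sensor_confidence: float, threshold: float = 0.8) -> Target:
    """Follow the live sensor-derived target only when the sensors are
    confident; otherwise fall back to the pre-recorded reference lap."""
    return live if sensor_confidence >= threshold else reference

def control_loop(reference_lap, live_estimates, confidences):
    """One decision per 50 ms tick: pick a target for each tick."""
    return [choose_target(ref, live, conf)
            for ref, live, conf in zip(reference_lap, live_estimates, confidences)]
```

Shaving time off a lap — or hitting the wall — comes down to how often the live branch wins and how good those live estimates are.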
Lakmal Seneviratne is director of the Khalifa University (KU) Center for Robotics and Autonomous Systems. With Majid Khonji, who leads the research activities in the KU Autonomous Vehicle Laboratory, the university entered the A2RL event with team Fly Eagle, a collaboration with Beijing Institute of Technology. They spoke to KUST Review in the team garage on qualifying day.
“The optimal trajectory is pre-computed,” Khonji explained. “The code is then based on the information you get about your location on the track, and you try to accurately follow that path.”
“In a simulator, your car would run perfectly using this method,” Seneviratne added. “And do 10,000km perfectly. But in real life, errors creep in. If not corrected, these errors build up and the car goes wrong.”
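Seneviratne’s point about accumulating errors can be shown with a toy model. The drift and gain numbers below are made up; the idea is simply that a small lateral disturbance on every step grows without bound unless feedback steers the car back toward the pre-computed line.

```python
def follow_path(n_steps: int, drift: float = 0.02,
                gain: float = 0.5, correct: bool = True) -> float:
    """Toy lateral-error model: each step the car drifts sideways a little.
    With feedback, it steers back toward the pre-computed line and the error
    settles near zero; without it, the errors simply pile up."""
    error = 0.0
    for _ in range(n_steps):
        error += drift            # small disturbance every step
        if correct:
            error -= gain * error # proportional correction toward the line
    return error
```

Uncorrected, the error after 1,000 steps is the full accumulated drift; with correction it settles at a small steady-state offset — the simulator-perfect run versus the real world.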
CAPTION: AI generated, KUST Review IMAGE: Anas Albounni, KUST Review

When asked whether the team was correcting these errors or the AI was handling it, both Khonji and Seneviratne were quick to jump in: “The system is doing it. We set it up, but the system is doing all the learning, all the work.”
There’s plenty of run-off area at Yas Marina Circuit, but the barriers around the track are unforgiving, and there were many times during the practice runs that cars ran afoul of the track limits. Sustainable manufacturing came in handy as front wings were replaced regularly. And thankfully, the organizers had plenty of spare wings.
“We had some good runs but some technical hiccups, of course,” Seneviratne said on qualifying day. Race events are rarely without hiccups for any team, no matter the category, but for Fly Eagle, the biggest issue was GPS signal around the racetrack: Their car was struggling to communicate with the satellite positioning system localizing it around the circuit.
“We get a very high-quality 3D map of the track and then the car has lidar sensors which it uses to localize itself on this map,” Seneviratne explained. “The teams that are doing well here are using that technique successfully, and that’s what we’ll do next time too.”
“To give an analogy, imagine it’s a Formula One race and you’ve blindfolded the driver,” Khonji added. “That’s what our car is experiencing without the GPS.”
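Lidar localization of the kind Seneviratne describes can be sketched in miniature: score candidate positions by how many lidar returns land where the pre-built map says walls are, then pick the best-scoring pose. The toy map, poses and scan below are all invented — real systems match dense 3D point clouds against high-quality maps — but the principle is the same.

```python
import math

# Invented toy occupancy map: wall cells at x = 4, y = 0..2.
WALLS = {(4, 0), (4, 1), (4, 2)}

def endpoint(pose, bearing, rng):
    """Grid cell where a lidar ray from `pose` (x, y, heading) ends
    after travelling `rng` meters at the given relative bearing."""
    x, y, heading = pose
    return (round(x + rng * math.cos(heading + bearing)),
            round(y + rng * math.sin(heading + bearing)))

def score(pose, scan):
    """Count how many returns in `scan` (bearing, range pairs) hit mapped walls."""
    return sum(endpoint(pose, b, r) in WALLS for b, r in scan)

def localize(candidates, scan):
    """Pick the candidate pose whose predicted ray endpoints best match the map."""
    return max(candidates, key=lambda p: score(p, scan))
```

A scan taken from the car’s true position lines up with the map almost everywhere; a scan evaluated at the wrong pose puts rays through walls or into empty space, and the score drops — no GPS required.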
Elite racing drivers practice each track before they arrive by putting in lap after lap on a simulator. It’s common to hear them say they could drive a circuit with their eyes closed. Seneviratne laughed when KUST Review put this to him:
“In a straight line, sure, you could probably do it with your eyes closed, but corners, no way.”
This statement could not have been timed better: This is the point where attention was drawn from the garage back to the racetrack as the Kinetiz team car turned for Turn 12 too early and struck the barrier. Unfortunately for Kinetiz, Turn 12 is directly visible from the support pitlane where the teams were hosted for the event. The car was recovered, and a new front wing quickly supplied.
WHAT’S THE POINT?
Motorsport is often referred to as the “cradle of innovation”: Many innovations that found their way onto our roads originated in different motorsport categories. Disc brakes won the 1953 24 Hours of Le Mans for Jaguar and two years later debuted on Citroen road cars.
Carbon fiber was first used in Formula One in the 1980s to reduce weight and can now be found on high-performance road cars. Push-to-start buttons cut start-up times for racing drivers in the pit lane; few modern cars lack them now. Anti-lock braking systems originated on the Ferguson P99 racecar in 1961, the kinetic energy recovery system first tested in Formula One in 2008 paved the way for hybrid vehicles, and many modern suspension designs trace their roots to NASCAR or Formula One.

Even rear-view mirrors were first found in motorsport. At the first Indianapolis 500, driver Ray Harroun attached a mirror to his car so he could keep track of the cars behind him. Within a few years, mirrors were appearing on production cars as well.
ASPIRE says that by stress-testing autonomous technology on the racetrack, it’s easier to identify key challenges and areas for improvement and address them rapidly:
“We believe there is potential in autonomous robotics and AI to combine these with the average driver to bring about greater safety on our roads,” said ASPIRE’s McCarthy. “We thought the best way to do it is to demonstrate its capability in the most extreme conditions you can, in the fastest, most well-designed race car in the world.”
Stress-test may be the operative word for the event. A race car lapping the circuit at speed with no driver but a computer was seriously impressive, but a full lap with no incidents was a rarity.
During qualifying runs, many of the teams struggled to set a lap. The cars seemed to randomly swerve, spin or turn into the barriers. Sometimes, they even pulled off to the run-off area and simply stopped.
Seneviratne explained the random stopping was the AI making a prudent safety choice: When it wasn’t sure what to do, rather than risk anything, it just came to a halt.
Fly Eagle, however, was not one of the teams that made it into the final.
“We’re on a learning curve but we’re really happy with what we’ve done,” Seneviratne told KUST Review. “For us, it was more about establishing a platform to go onto the next stage. This was the first time we’ve competed in any racing event. High speed is new for us.”
LIGHTS OUT
Four teams lined up for the final, hosted in front of a capacity crowd. Even this didn’t go to plan: The leading car spun, the second car passed by without incident, but then the race officials displayed a yellow flag to the competitors. Racing rules dictate no passing under a yellow flag, but in practice this means no overtaking of moving cars; a spun car sitting on the track can be driven around.
Humans get this. The computers did not. The algorithms knew they weren’t allowed to pass, so they didn’t. They stopped on track.
The safety feature is perfect for incidents on a real-life road, but it’s not so impressive for a racing event if all the cars grind to a halt.
After a restart, the eight-lap race was completed. For reference, Formula One drivers do a lap in about 90 seconds. They’d complete eight laps in 12 minutes or so. The A2RL cars took 16 minutes.
They weren’t far off once they got going, but these lap times were slower than the teams had achieved during their practice sessions earlier in the week. Having reached the final, the teams may have collectively opted for a little more caution.
All race teams watch nervously as their cars compete – few must be as nervous as those watching a computer.

In the end, the inaugural event was won by the team from Technical University of Munich as its car correctly turned the hairpin on the last lap, while the lead car misjudged its entry. It was a clean move and was just as dramatic for a driverless car as it would have been for human drivers.
The gap between human and robot persists for now, but if these events keep happening, and teams keep pushing the boundaries of what AI can do, things may change very quickly.
A2RL plans to be back in 2025.