Artificial Intelligence (AI) is on everyone’s agenda. Not a day goes by without some AI announcement making the headlines.
- Beating the world champion go player (read more here);
- Achieving super-human ability at the game of Pong (read more here);
- Learning to paint (read more here);
- A $1B AI college at MIT (read more here);
- AI agent reads the news (read more here).
Yet there are those who think a lot of this is hype; that AI is just the latest buzzword used to describe something relatively mundane, namely automation software. This is not true. To prove the point, I want to share our own experience of AI with you, by providing some insight into exactly what impact AI has had on the world’s biggest biometric rollout.
Before I begin, let’s just get some definitions out of the way. First, I’m talking about narrow AI, applied to a specific task, not conscious robots enslaving humanity! Second, for us at Aurora-AI, this means Deep Neural Networks learning highly complex tasks through analysis of mass data, not traditional Machine Learning through some engineered statistical model of a known dataspace.
Heathrow recently announced a £50M project to implement the world’s biggest biometric roll-out (read more here). They are taking an unprecedented step to fully automate all departing passenger journeys through the airport, end-to-end. This is going to make the airport experience faster, more pleasurable, hassle-free and more secure. Aurora’s AI-based face recognition is supporting this grand endeavour (read more here).
But why now? Biometrics has been promising this for two decades. The simple answer is that AI has been applied to face recognition and the effect is that error rates have all but disappeared.
To use face recognition at every point of a passenger’s journey, it’s got to be ultra-accurate, ultra-robust and extremely easy to use. It’s got to work in any lighting environment (step up, Aurora Infrared Face Recognition), through legacy cameras in self-service kiosks (web-cam-enabled face-rec), on passengers’ mobile phones (face-rec optimised for selfies) and against ten-year-old passport photos (face-rec tuned for non-contemporary images). In short, it’s got to work everywhere, for real, not just when it’s optimised for some standardised laboratory test.
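Under the hood, deep-learning face recognition of this kind generally works by mapping each face image to an embedding vector and comparing embeddings with a similarity score. The sketch below illustrates that matching step in generic terms only; it is not Aurora’s actual pipeline, and the embedding values and the acceptance threshold are entirely hypothetical.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings standing in for neural-network outputs (hypothetical values).
passport_photo = [0.12, 0.80, 0.55, 0.20]
live_capture   = [0.10, 0.78, 0.60, 0.18]
impostor       = [0.90, 0.05, 0.10, 0.70]

THRESHOLD = 0.9  # hypothetical acceptance threshold

print(cosine_similarity(passport_photo, live_capture) > THRESHOLD)  # same person
print(cosine_similarity(passport_photo, impostor) > THRESHOLD)      # different person
```

The point of the embedding approach is that the same comparison works regardless of the capture medium; the per-medium tuning mentioned above happens in how the network that produces the embeddings is trained.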
Aurora has just celebrated its 20th anniversary! We’ve been supplying face recognition for 20 years and we’ve seen all the problems. We have also seen the revolution that AI has brought to biometrics. Back in 2014 we achieved the number one position in the very difficult Labeled Faces in the Wild challenge set by the University of Massachusetts. Our winning entry had an impressive (at the time) 93.24% accuracy. This was pre-AI.
In 2015 we introduced AI and slashed error rates so dramatically that we were, for a time, convinced we had a bug in our testing software. For Aurora’s team of Machine Learning PhDs, who had spent seven years developing our world-class face recognition algorithms, this was a profound moment. We ran the Deep Learning algorithm on the training data using our seahorse neural network (our AI naming convention is based on sea creatures) and equalled the accuracy of our best pre-AI engine, Wash (still waiting for another series of Firefly), on the second try. One of the team suggested we “leave it running for another hour”. That removed another quarter of the outstanding errors! Over the next five weeks we reduced the error rate by a staggering two thirds, simply by increasing the size of the training set, extending training time and exploring new neural network architectures. Compared to the Machine Learning approach we’d used previously, we achieved six years’ worth of improvement in five weeks. This is the graph I presented to the Board later that month.
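The arithmetic behind those reductions can be checked in a few lines. The 93.24% accuracy figure comes from the text above; treating it as the starting error rate for both reductions is an assumption on my part, since the text does not state exactly which baseline each reduction applies to.

```python
# Pre-AI LFW accuracy from the text (93.24%).
baseline_accuracy = 0.9324
baseline_error = 1 - baseline_accuracy  # ~6.76% error rate

# "Removed another quarter of the outstanding errors" after one more hour.
after_extra_hour = baseline_error * (1 - 0.25)

# "Reduced the error rate by a staggering two thirds" over five weeks.
after_five_weeks = baseline_error * (1 - 2 / 3)

print(f"Baseline error rate:   {baseline_error:.2%}")
print(f"After the extra hour:  {after_extra_hour:.2%}")
print(f"After five weeks:      {after_five_weeks:.2%}")
```

Seen as error rates rather than accuracies, the jump from roughly 6.8% to roughly 2.3% is what made the team suspect a testing bug.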
I had to add two more columns by the time the Board Meeting came around (lobster beat prawn). We then applied Deep Learning to our face detection, feature detection and quality assessment algorithms. All three produced order-of-magnitude improvements in weeks. The speed, accuracy and general user experience of the complete system were transformed. Some months later, Heathrow upgraded the live face recognition system it uses to process all domestic passengers to Aurora-AI’s latest AI-based engine (stingray-v3), gaining instant reductions in transaction time, improved security and happier passengers.
We now produce a new face recognition engine twice a month, rather than twice a year, each with a step-change in accuracy. We’ve got face-rec for mobile phones, face-rec for infrared, face-rec for passports and face-rec for web-cams. Each is optimised for that specific image acquisition medium and each gets better on a monthly basis. Last month our accuracy rate was 99.6%. This month we release Orca, with 99.7% accuracy. Next month, well, we’re going to need a bigger test set.
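A move from 99.6% to 99.7% sounds marginal, but at this end of the scale the meaningful measure is how much of the *remaining* error is eliminated. A quick way to see it, using the two accuracy figures above:

```python
def relative_error_reduction(acc_old, acc_new):
    """Fraction of the remaining errors eliminated between two engines."""
    err_old = 1 - acc_old
    err_new = 1 - acc_new
    return (err_old - err_new) / err_old

# Figures from the text: last month's engine vs this month's Orca release.
reduction = relative_error_reduction(0.996, 0.997)
print(f"{reduction:.0%} of the remaining errors eliminated")
```

Going from a 0.4% error rate to a 0.3% error rate removes a quarter of the errors that were still being made, which is why the test set keeps needing to grow.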
This means that in the time between Heathrow Airport announcing the World’s Biggest Biometric Rollout and now, the technology has already got significantly better. By summer 2019, when the project goes live, I wonder if there’ll be any error left – that’s the impact of AI.
Hype, it is not.
Come and find out that it’s not hype at the Passenger Terminal Expo, 26th–28th March at ExCeL London. Visit us at stand 1090, or book an appointment through our contact page.