This information has been prepared by IG, a trading name of IG Markets Limited. In addition to the disclaimer below, the material on this page does not contain a record of our trading prices, or an offer of, or solicitation for, a transaction in any financial instrument. IG accepts no responsibility for any use that may be made of these comments and for any consequences that result. No representation or warranty is given as to the accuracy or completeness of this information. Consequently any person acting on it does so entirely at their own risk. Any research provided does not have regard to the specific investment objectives, financial situation and needs of any specific person who may receive it. It has not been prepared in accordance with legal requirements designed to promote the independence of investment research and as such is considered to be a marketing communication. Although we are not specifically constrained from dealing ahead of our recommendations we do not seek to take advantage of them before they are provided to our clients.
Predicting human and animal behaviour is not always easy for people, let alone technology.
There are innumerable variables at work. Physicists would long ago have been coining it at the races if they did not need to simplify their classical predictive model by eliminating so many variables that the horses became spherical.
Humans have a lifetime to learn from others' behaviour. With the considerable advances in processing power, computers can now also learn in an empirical way. With big data they can also learn by example – so-called 'deep learning'.
Newer vehicles already incorporate this technology, but with autonomous vehicles just around the corner, they will need to interact with pedestrians who may not even know their own minds.
Self-driving in context
‘People in different places behave very differently, such as London versus Vienna or Shanghai’, says Maya Pindeus, CEO of Humanising Autonomy (HA).
Fusing deep learning AI models with psychological and behavioural models helps to predict people's behaviour on the street, Pindeus explained to IG's Jeremy Naylor. Adding data on the context and culture of where the car is driving hones the accuracy of the behavioural predictions.
HA is a London-based startup that has built a context-specific tool for predicting human intentions. As things stand, self-driving vehicles have difficulty determining what pedestrians are planning to do. Are they planning to cross the road? Run or walk, straight or at an angle? Are they looking at their phone and don't even see the car? This is a particular problem in the dense environments of European cities.
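The fusion idea described above can be sketched in a few lines. The following Python snippet is purely illustrative and is not HA's actual model: it assumes a vision system supplies per-pedestrian cues (attention, orientation, speed), which a simple behavioural rule set combines with an invented city-specific prior to score crossing intent. All cue names, city priors, and weights are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class PedestrianCues:
    looking_at_phone: bool   # attention cue from a hypothetical vision model
    facing_road: bool        # body-orientation cue
    speed_mps: float         # estimated walking speed in metres per second

# Illustrative base rates of crossing intent by city (invented numbers,
# standing in for the cultural/context data the article describes).
CITY_PRIOR = {"london": 0.30, "vienna": 0.20, "shanghai": 0.40}

def crossing_probability(cues: PedestrianCues, city: str) -> float:
    """Combine behavioural cues with a city-level context prior."""
    p = CITY_PRIOR.get(city.lower(), 0.25)
    if cues.facing_road:
        p += 0.25            # oriented toward the road suggests intent
    if cues.looking_at_phone:
        p -= 0.10            # distracted pedestrians signal intent less clearly
    if cues.speed_mps > 1.5:
        p += 0.15            # hurrying suggests intent to cross
    return min(max(p, 0.0), 1.0)
```

A hurried pedestrian facing the road in London, for example, would score `crossing_probability(PedestrianCues(False, True, 1.8), "london")`, well above the same cues in Vienna – the same behaviour reads differently in different cities, which is the point Pindeus makes.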
Tesla and Google driving interest
Elon Musk has promised a completely automated Tesla by the end of this year. Tesla aims to use cameras to handle object detection, including pedestrians – an approach that is reactive rather than predictive. Google has driven the most autonomous testing miles, but where, Pindeus asks: in remote parts of Silicon Valley, or in cities, where pedestrian behaviour is more of an issue?
The HA software can be built into the software stack of autonomous vehicles so that a car can predict what a person might do on the street. The company has been working with Daimler on developing the software for autonomous and semi-autonomous vehicles.
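To make the "built into the software stack" idea concrete, here is a minimal hypothetical sketch of how a licensed intent predictor might be wired between a vehicle's perception and planning layers. Every name here (`IntentPredictor`, `Planner`, `StubPredictor`, the threshold value) is invented for illustration; HA's real integration interface is not described in the article.

```python
from typing import Protocol

class IntentPredictor(Protocol):
    """Interface the vehicle stack would program against."""
    def predict(self, pedestrian_id: int) -> float:
        """Return a probability in [0, 1] that the pedestrian enters the road."""
        ...

class Planner:
    """Toy planning layer that consumes the predictor's risk scores."""
    def __init__(self, predictor: IntentPredictor, brake_threshold: float = 0.5):
        self.predictor = predictor
        self.brake_threshold = brake_threshold

    def should_slow_down(self, pedestrian_ids: list[int]) -> bool:
        # Slow the vehicle if any tracked pedestrian is likely to cross.
        return any(self.predictor.predict(p) >= self.brake_threshold
                   for p in pedestrian_ids)

class StubPredictor:
    """Stand-in for the licensed prediction software."""
    def __init__(self, scores: dict[int, float]):
        self.scores = scores

    def predict(self, pedestrian_id: int) -> float:
        return self.scores.get(pedestrian_id, 0.0)
```

Defining the predictor as an interface rather than a concrete class reflects how a third-party module like HA's could be swapped into different manufacturers' stacks without the planning code changing.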
HA uses the software-as-a-service (SaaS) business model: the software will be licensed on a per-vehicle basis.