The big picture – There is an advice gap and some people think the answer is robo advice
What's the problem? – Just like the exams, robo advice relies on algorithms, but these can disadvantage those who are less well off
What can be done? – Looks at ways to get real people involved in the advice process.
The recent fiasco with exam results in the UK highlights the risks and dangers of using algorithms to solve complex problems. The main criticism of the official exam regulator Ofqual was that it used an algorithm that unfairly disadvantaged students from less privileged backgrounds.
This touched a raw nerve with me because I have always wondered whether robo financial advice disadvantages those with small amounts of financial wealth. I have also argued that it is lazy thinking to believe that robo advice is the best way to close the advice gap.
So, while at first sight there may appear to be no connection between using algorithms to determine exam results and using algorithms to deliver financial advice, on closer inspection there may be some valuable lessons to be learnt.
The people who fall into the advice gap have pension pots which are too big to take as a cash lump sum but not big enough to justify more complex pension drawdown advice. This normally translates into pension pots of more than £30,000 but less than £100,000.
I have a lot of sympathy for advisers who say they cannot provide affordable and profitable advice for those with modest pension pots because of the high cost of regulation. However, I do not believe the answer is robo advice, even though there is still an expectation that technology will come to the rescue of those who have fallen into the advice gap.
I believe that if more people are pushed down the robo advice route there is a serious risk that those with modest-sized pension pots could be disadvantaged compared to those with larger pension pots, who will have access to personal advice.
It can be argued that robo advice is better than no advice, but this misses the point that a large number of people are being treated as second-class citizens when it comes to financial advice.
Financial advice is complex because not only are there many technical aspects, e.g. pension rules and investment options, there are also many emotional and behavioural factors, e.g. overconfidence and myopic behaviour. Robo advice can deal with the technical aspects satisfactorily, but it is difficult for it to take account of the emotional and behavioural factors.
In practical terms this might mean that the flexibility and choice that is widely available to those who take personal advice is denied to those who get their advice from an algorithm.
For example, the advice algorithm may work on the basis that any essential income is secured by way of an annuity. But it does not follow that just because someone cannot afford to take risk with a proportion of their income they should be forced into buying a lifetime annuity at the current low rates. There are other options and solutions which should be considered.
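To make the point concrete, here is a minimal, entirely hypothetical sketch of such a rigid rule. The function name, figures and thresholds are all illustrative assumptions, not any real provider's logic; the point is that the rule has exactly one answer to an income shortfall, where a human adviser would weigh several options.

```python
# Hypothetical sketch of a rigid robo-advice rule: any shortfall in
# essential income is automatically routed into an annuity purchase,
# regardless of annuity rates or alternatives such as phased drawdown.
def robo_recommendation(essential_income, guaranteed_income, pot):
    """Return a recommendation under the rigid rule described above.

    All names and figures are illustrative, not a real algorithm.
    """
    shortfall = max(0.0, essential_income - guaranteed_income)
    if shortfall > 0:
        # The algorithm's only answer to a shortfall: secure it with
        # a lifetime annuity, whatever current rates happen to be.
        return {"action": "buy_annuity", "income_to_secure": shortfall}
    return {"action": "drawdown", "pot_to_invest": pot}

# Someone with £12,000 of essential spending and £9,000 of state
# pension is pushed into annuitising the £3,000 gap, no questions asked.
print(robo_recommendation(12000.0, 9000.0, 60000.0))
```

The inflexibility is the point: a person with a larger pot and a human adviser could discuss deferral, phased drawdown or partial annuitisation, while the algorithm above offers a single route.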
Algorithms for marking exams and giving financial advice are clearly two different things but they both need to take account of answers which require a degree of subjective interpretation. As the exam fiasco has shown, if the algorithm is not properly programmed it can disadvantage people from less affluent backgrounds.