The problem with algorithms: Why you shouldn’t leave IR35 to a computer
We’ve recently seen what can happen when A-level results are left up to algorithms. So why would you leave your IR35 compliance to the same fate?
The recent A-level debacle has been all over the news, but if you’ve not been following, the summary is that the Office of Qualifications and Examinations Regulation (Ofqual) used an algorithm to help determine the A-level results of students who could not sit their exams. Said algorithm ended up disproportionately penalising students from poorer backgrounds, causing extreme controversy and an eventual U-turn from the government.
However, we’re not here to look at A-levels; we’re more interested in the algorithm. It seems we’re putting a lot of faith in computers at the moment, perhaps based on the idea that they are unbiased. Of course, an algorithm is only as impartial as the person who programmed it and the people who input the data, which is how Ofqual’s algorithm ended up being branded “unlawful”.
We’ve seen this a lot in the world of IR35. There are plenty of organisations offering cheap online tools to provide “easy” IR35 audits. Even HMRC has a Check Employment Status for Tax (CEST) tool that allows people taking on freelancers to check their employment status. But how accurate is a computer going to be with something as complex as IR35?
The problem with algorithms
While Ofqual likely did its best to provide an accurate and effective algorithm for determining A-level results, the flaw may have been trusting computers too much. Tom Haines, a lecturer in machine learning at the University of Bath, pointed out that not enough attention was paid to how the algorithm actually made decisions.
“A few hundred years ago, people put up a bridge and just hoped it worked. We don’t do that anymore, we check, we validate. The same has to be true for algorithms,” he said. “We need to realise that these algorithms are man-made artefacts, and if we don’t look for problems there will be consequences.”
This is especially true when it comes to IR35. Working out whether someone sits inside or outside IR35, with regard to both the legislation and decades of legal precedent, involves a lot of different factors, some of which are overlooked by these tools and some of which are too complex for simple yes/no answers. When you also consider that the questions these online tools ask are often interpreted very differently by the people answering them, it’s easy to see why HMRC would argue that these types of tools do not count as taking reasonable care when undertaking a review.
Algorithms have proved they deserve a place in the world: we’ve seen situations where they provide businesses and professionals with valuable real-time information that helps inform decisions when the data being fed in is black and white. IR35, however, is far from black and white, and with the information so unique to each contract, algorithms currently aren’t suitable for the task at hand.
There is also a word of warning from the case law established in this area. The fundamental principle established by case law is that a mechanistic approach to decision making is wrong. This is summarised in the case of Hall v Lorimer (1992): “The object of the exercise is to paint a picture from the accumulation of detail. The overall effect can only be appreciated by standing back from the detailed picture which has been painted, by viewing it from a distance and by making an informed, considered, qualitative appreciation of the whole.
It is a matter of evaluation of the overall effect of the detail, which is not necessarily the same as the sum total of the individual details. Not all details are of equal weight or importance in any given situation. The details may also vary in importance from one situation to another. The process involves painting a picture in each individual case.”
The impact of getting it wrong
In the case of A-levels, the impact of an erroneous algorithm has been clear. Not only were thousands of deserving students denied university places, but the government then had to reverse its decision in a U-turn that was both humiliating and costly. When it comes to IR35, the cost of errors can be just as serious.
Just recently, Talksport presenter Paul Hawksbee was found to have been a “disguised employee” under IR35 and is now liable for an eye-watering £140,000 in unpaid taxes. With this liability moving to the end hirer from April 2021, businesses face the same exposure. Liabilities (before any interest or penalties) could be around £10,000 per contractor per year (based on average contract rates), which can soon mount up to a substantial financial risk. When considering how best to mitigate this risk, it would be dangerous to pin your strategy on an untested method that goes against the established case law.
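To put that £10,000 figure into perspective, here is a minimal back-of-the-envelope sketch. The day rate and billable days are our own illustrative assumptions rather than figures from HMRC or from any real engagement, and the sketch counts only employer’s National Insurance; a real assessment would also need to consider income tax, apprenticeship levy, interest and penalties.

```python
# Illustrative only: a rough order-of-magnitude estimate of the annual
# exposure an end hirer might face per contractor if a contract is wrongly
# assessed as outside IR35. All inputs below are hypothetical assumptions.

DAY_RATE = 400               # assumed average contract day rate (GBP)
DAYS_PER_YEAR = 220          # assumed billable days in a year
EMPLOYERS_NI_RATE = 0.138    # employer's National Insurance rate (2020/21)

annual_fees = DAY_RATE * DAYS_PER_YEAR            # gross fees paid to the PSC
employers_ni = annual_fees * EMPLOYERS_NI_RATE    # NI the end hirer could owe

print(f"Annual fees: £{annual_fees:,.0f}")
print(f"Potential employer's NI exposure: £{employers_ni:,.0f}")
# -> roughly £12,000 a year, the same ballpark as the £10,000 figure above
```

Even on these deliberately conservative assumptions, the exposure per contractor per year sits in five figures, before any interest or penalties are added.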
How we can help
So, what’s the answer? Rather than relying on algorithms, something as nuanced as IR35 needs experienced professionals working with you to prevent you getting caught out, something HMRC recently made clear in guidance published online. Remember, hiring organisations need to ensure they meet the reasonable care threshold when assessing the IR35 status of any PSCs providing services to them. HMRC’s guidance on reasonable care makes it clear that “seeking the advice of a qualified, professional advisor” is a factor demonstrating that reasonable care has been taken.
Our team of specially trained legal professionals, regulated by the Solicitors Regulation Authority, has more than 20 years of IR35 experience and will work with you to make sure the IR35 status of the contractors you engage is properly assessed on a contract-by-contract basis, helping you prepare for April 2021 and implement the right systems and processes so that this becomes BAU for you from then on. Contact us now to find out how we can help.