The UK government is backing away from proposals to remove individuals’ rights to challenge decisions made about them by artificial intelligence following an early analysis of its consultation process.
In its response to the consultation "Data: A new direction", which set out proposals for changing UK data protection law following the nation's departure from the European Union, the government will look at the "efficacy of safeguards" around automated decision-making about people, rather than at removing those safeguards, Harry Lee, deputy director for data protection and data rights at the Department for Digital, Culture, Media and Sport, told a conference yesterday.
In September 2021, the government published a consultation that suggested it could water down individuals’ rights to challenge decisions made about them by AI.
The DCMS paper said that the need “to provide human review [of AI decisions] may, in future, not be practicable or proportionate.”
In the UK's current implementation of the EU's General Data Protection Regulation (via the Data Protection Act 2018), people have the right not to be subject to decisions based solely on automated processing that produce legal or similarly significant effects.
The government's consultation followed a recommendation from the much-derided Taskforce on Innovation, Growth and Regulatory Reform (TIGRR) that Article 22 of the UK GDPR – which sets out data subjects' rights to challenge automated decisions – should be removed, and was greeted with outrage from campaign groups.
The consultation is now closed. Speaking at the Westminster eForum policy conference "Data protection and the future of data regulation in the UK", Lee said there had been a lot of concern about the removal of Article 22 from the current law.
“Our focus is really to look at how to improve the efficacy of safeguards with respect to automated decision making in light of many respondents flagging shortcomings in its design. That, rather than removal, is a more fruitful area of work for us over the coming months,” Lee said.
The government is also proposing changes to the governance of the data protection regulator, the Information Commissioner’s Office, in its shake-up of data protection law.
Look out for data watchdog independence
Last year, the outgoing Information Commissioner warned about the office’s future independence in light of the proposals, saying the ICO needed to “be able to hold government to account” with a governance model that “preserves its independence.”
Also speaking at the Westminster eForum conference yesterday, Louise Byers, director of planning, risk and governance at the ICO, offered specific tests for the future independence of the office.
She said the ICO had, in its response, backed the move to a board model with chair and chief executive roles. However, the CEO should be appointed by the statutory board, she said, not by the government as the proposals suggest. She also warned against the government signing off codes of practice.
"We've highlighted in our response our concerns around the proposal for the Secretary of State to sign off codes of practice and complex guidance, and also for the Secretary of State to make the appointment of the CEO role to the board, which is something we believe should be a role for the new statutory board," she said.
“We believe it’s vital that whatever reforms are put in place, they maintain both the practicalities and perception of the independence of the ICO to allow us to continue to hold government and public sector to account,” Byers added.
She said the independence of the ICO was vital to the EU adequacy decision, which currently allows personal data to flow freely between the UK and the EU, and to similar decisions that might allow data sharing with other jurisdictions.
The DCMS’s Lee said the ICO’s role as an independent regulator was “important to creating and maintaining trust” with the public.
“I can understand why, in some instances, there are concerns about the impact [of the proposals] on the ICO’s independence and we’re going to look at those closely,” he said.
Lee offered no specific assurances that the government would not get involved in choosing the ICO’s CEO or signing off codes of practice and guidance. ®