The U.K. government has launched a consultation on plans to shake up the national data protection regime, as it looks at how to diverge from European Union rules following Brexit.

It’s also a year since the U.K. published a national data strategy in which it said it wanted pandemic levels of data sharing to become Britain’s new normal.

The Department for Digital, Culture, Media and Sport (DCMS) has today trailed an incoming reform of the Information Commissioner’s Office, saying it wants to broaden the ICO’s remit to “champion sectors and organisations that are using personal data in new, innovative and responsible ways to benefit people’s lives”, and promising “simplified” rules to encourage the use of data for research that “benefits people’s lives”, such as in the field of healthcare.

It also wants a new structure for the regulator, including the creation of an independent board and chief executive for the ICO, to mirror the governance structures of other regulators such as the Competition and Markets Authority, the Financial Conduct Authority and Ofcom.

In addition, it said the data reform consultation will consider how the new regime can help mitigate the risks around algorithmic bias, something the EU is already moving to legislate on, having set out a risk-based proposal for regulating applications of AI back in April.

Which means the U.K. risks being left behind if it only concerns itself with a narrow focus on “bias mitigation”, rather than considering the wider sweep of how AI is intersecting with and affecting its citizens’ lives.

In a press release announcing the consultation, DCMS highlights a machine learning partnership involving Moorfields Eye Hospital and the University College London Institute of Ophthalmology, which kicked off back in 2016, as an example of the sort of beneficial data sharing it wants to encourage. Last year the researchers reported that their AI had been able to predict the development of wet age-related macular degeneration more accurately than clinicians.

The partnership also involved (Google-owned) DeepMind, and now Google Health, although the government’s PR makes no mention of the tech giant’s involvement. It’s an interesting omission, given that DeepMind’s name is also attached to a notorious U.K. patient data-sharing scandal, which saw another London-based NHS Trust (the Royal Free) sanctioned by the ICO in 2017 for improperly sharing patient data with the Google-owned company during the development phase of a clinician support app (one that Google is now in the process of discontinuing).

DCMS may be keen to avoid spelling out that its stated goal for the data reforms, aka to “remove unnecessary barriers to responsible data use”, could end up making it easier for commercial entities like Google to get their hands on U.K. citizens’ medical records.

The sizeable public backlash over the most recent government attempt to requisition NHS users’ medical records for vaguely defined “research” purposes (aka the “General Practice Data for Planning and Research”, or GPDPR, scheme) suggests that a government-enabled big-health-data free-for-all may not prove so popular with U.K. voters.

“The government’s data reforms will provide clarity around the rules for the use of personal data for research purposes, laying the groundwork for more scientific and medical breakthroughs,” is how DCMS’ PR skirts the sensitive topic of health data sharing.

Elsewhere there’s talk of “reinforc[ing] the responsibility of organisations to keep personal information safe, while empowering them to grow and innovate”, so that sounds like a yes to data security; but what about individual privacy and control over what happens to your information?

The government appears to be saying that will depend on other goals: principally, economic interests attached to the U.K.’s ability to conduct data-driven research or strike trade deals with countries that don’t share the U.K.’s (current) high standards of data protection.

There are some purely populist flourishes here too, with DCMS couching its ambition as a data regime “based on common sense, not box ticking”, and flagging up plans to beef up penalties for nuisance calls and text messages. Because, sure, who doesn’t like the sound of a crackdown on spam?

Except spam texts and nuisance calls are a pretty quaint concern to zero in on in an era of apps and data-driven, democracy-disrupting mass surveillance, which was something the outgoing information commissioner raised as a major issue of concern during her tenure at the ICO.

The same populist anti-spam messaging has already been deployed by ministers to attack the requirement to obtain web users’ consent for dropping tracking cookies, which the digital minister Oliver Dowden recently suggested he wants to do away with for all but “high risk” purposes.

Having a system of rights wrapping people’s data that gives them a say over (and a stake in) how it can be used appears to be being reframed in the government’s messaging as irresponsible, or even unpatriotic, with DCMS pushing the notion that such rights stand in the way of more important economic or highly generalized “social” goals.

Not that it has provided any evidence for that, or indeed that the U.K.’s existing data protection regime got in the way of (the really rather substantial) data sharing during COVID-19… Meanwhile, harmful uses of people’s information are being boiled down in DCMS’ messaging to the narrowest possible definition, of spam that is visible to an individual, never mind how that person got targeted with the nuisance calls/spam texts in the first place.

The government is taking its customary “cake and eat it” approach to spinning its reform plan: claiming it will “protect” people’s data while simultaneously trumpeting the importance of making it really easy for people’s information to be handed off to anyone who wants it, so long as they can claim to be doing some kind of “innovation”, and larding its PR with canned quotes calling the plan “bold” and “ambitious”.

So while DCMS’ announcement says the reform will “maintain” the U.K.’s (currently) world-leading data protection standards, it immediately rows back, saying the new regime will (merely) “build on” a few broad-brush “key elements” of the current rules (specifically, it says it will retain “principles around data processing, people’s data rights and mechanisms for supervision and enforcement”).

Clearly the devil will be in the detail of the proposals, which are due to be published tomorrow morning. (Update: The consultation document is now on DCMS’ website and can be found here; the consultation runs until November 19.) So expect more analysis to debunk the spin soon.

But in one specific trailed change, DCMS says it wants to move away from a “one-size-fits-all” approach to data protection compliance, and to “allow organisations to demonstrate compliance in ways more appropriate to their circumstances, while still protecting citizens’ personal data to a high standard”.

That suggests that smaller data-mining operations (DCMS’ PR uses the example of a hairdresser’s, but plenty of startups employ fewer staff than the average barber’s shop) may be able to expect a pass to ignore those “high standards” in the future.

Which suggests the U.K.’s “high standards” may, under Dowden’s watch, end up looking rather more like Swiss cheese…

Data protection is a “how to, not a don’t do”…

The man who is likely to become the U.K.’s next information commissioner, New Zealand’s privacy commissioner John Edwards, was taking questions from a parliamentary committee earlier today, as MPs considered whether to endorse his appointment to the role.

If he’s confirmed in the job, Edwards will be responsible for enforcing whatever new data regime the government cooks up.

Under questioning, he rejected the notion that the U.K.’s current data protection regime presents a barrier to data sharing, arguing that laws like the GDPR should instead be seen as a “how to” and an “enabler” of innovation.

“I would disagree with the dichotomy that you presented [about privacy vs data-sharing],” he told the committee chair. “I don’t believe that policymakers and businesses and governments are faced with a choice of share or keep faith with data protection. Data protection laws and privacy laws would not be necessary if it wasn’t necessary to share information. These are two sides of the same coin.

“The UK DPA [Data Protection Act] and UK GDPR, they are a ‘how to’, not a ‘don’t do’. And I think the UK and many jurisdictions have really learned that lesson through the COVID-19 crisis. It has been absolutely necessary to have good quality information available, minute by minute. And to move across different organisations, where it needs to go, without friction. And there are times when data protection laws and privacy laws introduce friction, and I think what you’ve seen in the UK is that when it needs to, things can happen quickly.”

He also suggested that a lot of economic gains could be achieved for the U.K. with some minor tweaks to the current rules, rather than a more radical reboot being required. (Though clearly setting the rules won’t be up to him; his job will be enforcing whatever new regime is decided.)

“If we can, in the administration of a law which at the moment looks quite like the UK GDPR, and which gives considerable latitude for different regulatory approaches, if I can turn that dial just a few points, that can make the difference of billions of pounds to the UK economy and thousands of jobs; so we don’t need to be throwing out the statute book and starting again, there is plenty of scope to be making improvements under the current regime,” he told MPs. “Let alone when we start with a fresh sheet of paper, if that’s what the government chooses to do.”

TechCrunch asked another Edwards (no relation), Newcastle University’s Lilian Edwards, professor of law, innovation and society, for her thoughts on the government’s direction of travel, as indicated by DCMS’ pre-proposal-publication spin, and she expressed similar concerns about the logic driving the government’s argument that it needs to rip up the existing standards.

“The whole point of data protection is to balance fundamental rights with the free flow of data. Economic concerns have never been ignored, and the current scheme, which we’ve had in essence since 1998, has struck a good balance. The good things we did with data during COVID-19 were done entirely lawfully, and without any great difficulty under the existing rules, so that isn’t a reason to change them,” she told us.

She also took issue with the plan to reshape the ICO “as a quango whose primary job is to ‘drive economic growth’”, pointing out that DCMS’ PR fails to include any mention of privacy or fundamental rights, and arguing that “creating an entirely new regulator isn’t likely to do much for the ‘public trust’ that’s seen as declining in almost every poll.”

She also suggested the government is glossing over the real economic damage that would hit the U.K. if the EU decides its “reformed” standards are no longer essentially equivalent to the bloc’s: “[It’s] hard to see much concern for adequacy here; which will, for sure, be reviewed, to our detriment, prejudicing 43% of our trade for a few low-value trade deals and some hopeful selling off of NHS data (again, likely to take a wrecking ball to trust, judging by the GPDPR scandal).”

She described the goal of regulating algorithmic bias as “laudable”, but also flagged the risk of the U.K. falling behind other jurisdictions that are taking a more comprehensive look at how to regulate artificial intelligence.

Per DCMS’ press release, the government appears to be planning for an existing advisory body, called the Centre for Data Ethics and Innovation (CDEI), to have a key role in supporting its policymaking in this area, saying the body will focus on “enabling trustworthy use of data and AI in the real world”. However, it has still not appointed a new CDEI chair to replace Roger Taylor, with only an interim chair appointment (and some new advisers) announced today.

“The world has moved on since CDEI’s work in this area,” argued Edwards. “We know now that regulating the harmful effects of AI has to be considered in the round with other regulatory tools, not just data protection. The proposed EU AI Regulation is not without flaws but goes far further than data protection in mandating better quality training sets, and more transparent systems to be built from scratch. If the UK is serious about regulating it needs to look at the global models being floated, but right now it looks like its main concerns are insular, short-sighted and populist.”

Patient data privacy advocacy group MedConfidential, which has regularly locked horns with the government over its approach to data protection, also queried DCMS’ continued attachment to the CDEI for shaping policymaking in such a crucial area, pointing to last year’s biased algorithmic exam grading scandal, which happened under Taylor’s watch.

(NB: Taylor was also the Ofqual chair, and his resignation from that post in December cited a “difficult summer”, even as his departure from the CDEI leaves an awkward hole now…)

“The culture and leadership of CDEI led to the A-Levels algorithm; why should anyone in government have any confidence in what they say next?” said MedConfidential’s Sam Smith.