Author: Gary Hibberd

Date: 1st June 2020

 

Since the announcement on the 23rd of March 2020 that Britain would go into lockdown, the UK Government has been desperately looking for a way to ease restrictions without putting public safety at risk. This was never going to be an easy task, and while some of their decisions have been difficult to implement, I like to believe they are acting in our best interests.

This includes the idea of a contact-tracing app, which notifies its users should they come into contact with someone who has been diagnosed with COVID19. The idea is that the exposed person would then self-isolate for the next 14 days.
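For readers unfamiliar with how such an app works in broad terms, the sketch below illustrates the general principle of Bluetooth-based contact tracing: each phone keeps a log of anonymous identifiers it has recently seen nearby, and checks that log against a list of identifiers belonging to users who later report a diagnosis. This is a deliberately simplified, hypothetical sketch, not the NHSX app’s actual design; every name and parameter in it is illustrative.

```python
# A minimal, illustrative sketch of Bluetooth-based contact tracing.
# This is NOT the NHSX app's design; names and logic are hypothetical.
from datetime import datetime, timedelta

SELF_ISOLATION_PERIOD = timedelta(days=14)


class ContactLog:
    """Anonymous identifiers this phone has 'heard' nearby, with timestamps."""

    def __init__(self):
        self.sightings = {}  # identifier -> time last seen nearby

    def record(self, identifier, seen_at):
        self.sightings[identifier] = seen_at

    def should_self_isolate(self, diagnosed_identifiers, now):
        """True if a recently-seen identifier belongs to someone who has
        since been diagnosed, i.e. the user should be advised to isolate."""
        cutoff = now - SELF_ISOLATION_PERIOD
        return any(
            ident in diagnosed_identifiers and seen >= cutoff
            for ident, seen in self.sightings.items()
        )


# Example: the phone logged identifier "a1b2" yesterday; "a1b2" later
# appears in the published list of diagnosed users' identifiers.
log = ContactLog()
log.record("a1b2", datetime(2020, 5, 27))
print(log.should_self_isolate({"a1b2"}, datetime(2020, 5, 28)))  # True
```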

I am not here to say that we shouldn’t use technology to help in the fight against COVID19 and to improve our chances of reducing mortality rates. But I am here to say that I believe we need to think carefully about how technology is used, and how our safety is being protected, both now and into the future.

I know that some will argue that the right to life trumps the right to privacy, and I would agree. Under the Human Rights Act, the right to life (Article 2) is known as an ‘absolute right’, whereas the right to privacy (Article 8) is a ‘fundamental right’.

 

Security Concerns

I am not here to argue which ‘right’ is more important. I fully respect and understand the need for this COVID19 app, and would encourage people to follow the UK Government’s advice.

But remembering that lockdown was announced on the 23rd of March and the app went live on the 5th of May, my cyber senses begin tingling at the thought of an app developed (as far as I am aware) in just over one month. Even if development had begun at the very start of the outbreak (the 31st of December 2019), developing, testing and launching an app in just five months would still be pretty impressive.

To my knowledge, there are 16 tracing apps on the Apple and Google stores, and with the UK Government avoiding Apple and Google’s contact-tracing API to build its own app, I am left wondering: which app should I use? What data is being collected? Where is that data being stored?
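Why does avoiding that API matter? In broad terms, the difference lies in where the matching happens. The following is a deliberately simplified, hypothetical contrast, not the real NHSX or Apple/Google protocols: in a centralised design the phone uploads its contact log to a server, which can therefore accumulate a picture of who met whom, whereas in the decentralised Apple/Google approach the server only publishes the keys of diagnosed users and each phone does the matching locally.

```python
# Hypothetical contrast between centralised and decentralised matching.
# Neither function reflects the real NHSX or Apple/Google protocols.

central_store = []  # stand-in for a server-side database of uploaded logs


def centralised_match(contact_log, diagnosed_ids):
    """Phone uploads its whole contact log; the server stores it (slowly
    building up a picture of who met whom) and performs the matching."""
    central_store.append(contact_log)
    return [ident for ident in contact_log if ident in diagnosed_ids]


def decentralised_match(contact_log, published_diagnosis_keys):
    """The server only publishes keys of diagnosed users; the phone checks
    its own log locally and the log never leaves the device."""
    return [ident for ident in contact_log if ident in published_diagnosis_keys]
```

In the centralised model the server ends up holding exactly the kind of ‘social graph’ that the open letter quoted below warns about; in the decentralised model it never sees it.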

Of course, we know that cyber criminals are currently very active, and they will undoubtedly begin using fake apps, phishing campaigns and social engineering techniques to target the fearful and vulnerable. This isn’t speculation: Action Fraud is reporting a 400% increase in scams, all using COVID19 as a lure.

 

Privacy Concerns

Of course, this leads us to concerns around privacy, and I am not alone in having them. 177 cybersecurity experts wrote to the government expressing their concerns about the data collected by the app, and asking for reassurances that it is not then used for other purposes, such as mass surveillance. The open letter to the government states:

“It is vital that, when we come out of the current crisis, we have not created a tool that enables data collection on the population, or on targeted sections of society, for surveillance. Such invasive information can include the ‘social graph’ of who someone has physically met over a period of time. With access to the social graph, a bad actor (state, private sector, or hacker) could spy on citizens’ real-world activities,” 

This fear of ‘mission creep’ is shared by many of the people in the industry I speak to. All of them recognise the important role this app has to play, but are nonetheless concerned about what might happen down the line.

In response to these concerns, the UK Government has stated that “all of the data analysed and stored in the fight against Coronavirus will be deleted when it’s no longer required.”

Of course, that sounds great, but wait just a second: what does ‘no longer required’ mean? NHSX, the body responsible for the development of the app, had always maintained that data collected by the app would be deleted if a user chose to remove it. But a spokesperson recently confirmed that some data would be retained for research purposes. Clearly, the data is still required.

This sounds suspiciously like mission-creep to me.

 

Conclusion

On the 28th of May, the app went live in the UK.

When announcing the new app, the Secretary of State for Health and Social Care, Matt Hancock stated: “This new system will help us keep this virus under control while carefully and safely lifting the lockdown nationally.”

The National Cyber Security Centre (NCSC) has been involved in the development of the app and stated that “Privacy and security have been paramount throughout the app’s development”.

Even the Information Commissioner’s Office (ICO) has provided data protection and privacy advice to organisations developing such technologies. Let’s just hope that NHSX read it(!)

Finally, it is important to state again that I believe we must all do what we can to reduce risk to the community, and that we should listen to and follow the Government’s advice. But I would also say that we need to be vigilant about what we are being asked to sign up to.

Our security and privacy concerns are real. They need to be addressed, or we cannot be surprised if we discover, sometime in the future, that our fundamental right to privacy has been eroded by our desire for the absolute right to life.
