“Ethics is at the root of privacy and is the future of data protection”

“It’s hard to separate data protection by design from data ethics by design”

“Companies must ask themselves questions that identify the risks they are creating for others and mitigate those risks. There is every reason to include ethical considerations as part of that process.”

Quotes from the UK Information Commissioner, Elizabeth Denham’s speech at the TechUK Data Ethics Summit on 13 December 2017.

See the full speech at https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2017/12/techuk-data-ethics-summit/

or below:

Elizabeth Denham’s speech at the TechUK Data Ethics Summit on 13 December 2017.

Thanks for inviting me here today. I’m pleased that the issues of data protection are taking such a front and centre role in discussions about ethics and innovation.

I thought I’d start with a story. Are you sitting comfortably? Then I’ll begin.

Once upon a time there was a little girl called Ada. She had a quick brain and a huge imagination. The daughter of a poet and a gifted mathematician, Ada studied hard and became quite something in the world of computers.

Unremarkable perhaps.

But the twist in this tale is that Ada’s pioneering work took place nearly 200 years ago.

At a time when electricity was “new”, steam trains were an unconventional form of travel and the sticky postage stamp was a revolution in communication.

When I address conferences I often remark on how technology has changed beyond all recognition in the space of a generation – the 20 years since the Data Protection Act, the law my office regulates, was forged.

It’s easy to forget the origins of this revolution go way, way back.

Ada Lovelace may well be known to you – as the daughter of romantic poet Lord Byron, or a visionary with a passion for flying or for creating the world’s first machine algorithm.

But here’s what sets Ada apart and why I mention her here today. Ada looked beyond what was immediately possible. She saw Charles Babbage’s Analytical Engine – the first ever general purpose computer – as more than just a number cruncher.

She saw how numbers could represent other things – letters, musical notes, symbols – and how the machine could manipulate them according to rules.

She developed a vision of computer capability, a mind-set that she called “poetical science”. It led her to ask questions and examine how individuals and society relate to technology as a collaborative tool.

Ada saw the future. And now it’s our job to make some predictions of our own.

The future

What will technology look like in the future? What will it look like in another 200 years? Yuval Harari has some interesting thoughts on that subject in his latest book, Homo Deus.

How artificial intelligence will ultimately outsmart us all and reduce our role as humans to bystanders.

We’re not quite there yet, but the world already seems a pretty futuristic place. The Transport Minister has indicated the first autonomous cars could be on sale in just three years.

Law enforcement agencies use biometric software to scan faces in CCTV footage and security firms use it to collect demographic data on crowds.

Businesses are changing too, using AI technology to improve customer service and streamline their operations.

Almost every day I read news stories about AI’s capabilities and effects. You’ll all have read about Facebook’s controversial new algorithm that can judge whether an individual’s posts may indicate thoughts of suicide.

And I recently read about computers that could, one day, assess your body mass index from a photo before offering you health insurance.

It makes me wonder – will our story have a happy ending? That’s why we’re here today.

And why am I here today? What role does the Information Commissioner’s Office play in this space?

The law

Many issues relating to data ethics involve personal data. And when it comes to personal data, that’s my office’s domain.

It may be useful for me to set out our regulatory role here. First off, we are a statutory regulator independent of government.

We are responsible for ensuring that personal data is handled in line with the law – specifically the Data Protection Act 1998. We educate and advise, comment on and raise awareness on issues related to data protection. When we need to, we can take enforcement action.

Our duties are wide and comprehensive; we are not merely a complaints-based regulator. But when you strip it all back, my office is here to ensure fairness, transparency and accountability in the use of personal data on behalf of people in the UK.

These are principles of data protection, but they apply to some of the fundamental ethical questions we are discussing here as well.

These principles in the law are fit for purpose. They have stood the test of time and are technology-neutral; those who argue we need a new legal framework miss the mark.

I accept that the Data Protection Act is not perfect and that it has struggled to keep pace with technological advances, including AI. The 1995 directive and the Data Protection Act have not affected the evolution of the internet or prevented surveillance from becoming the prevailing business model; the law has its limitations.

But there is a new law in town. The General Data Protection Regulation.

And this is a significant step up in the law. It was drafted by legislators here in the UK and throughout Europe for the very purpose of trying to tackle opaque decision-making by machines.

The GDPR significantly enhances people’s ability to challenge decisions made by machines. It provides for a measure of algorithmic transparency.

It provides for human intervention in decisions that have legal or similar effects.

This is not a new game played by different rules. The rules remain the same – fairness, transparency, accountability – and my office is well placed to regulate them.

The idea that data protection, embodied in legislation, does not work in a big data context is wrong.

Investigation into use of data analytics for political purposes

You’ll know of our investigation into the use of data analytics for political purposes. We’re looking at whether personal information was analysed to micro-target people as part of a political campaign and have been particularly focussed on the EU Referendum.

The overall goal of this work is to give the public insight into the vast sources of data and personal information used in the political arena.

I doubt very much that the majority of people understand the practices behind the scenes – data brokers, parties, campaigns, social media platforms – let alone the potential impact on their privacy.

It is still too soon for me to speculate on the outcome of our investigation.

But I will say this. Whether or not we find practices that contravened the law – and this is where I have jurisdiction – there are significant ethical questions here.

Ethical questions about truthfulness, fairness, respect, bias and maintenance of public trust in our political campaigns and referendums and perhaps even our democracy.

Even if it’s transparent, even if it’s legal, is it the right thing to do?

Ethics is at the root of privacy and is the future of data protection. In my view, this is the way forward. There must be a convergence.

For those of you who are interested, a fuller update on our investigation will be published on the ICO website this afternoon.


So I have the law to back me up. But, as I say, laws, regulation and guidance must keep pace with advancing technologies like AI and machine learning.

It’s important to create an environment that supports innovation without compromising individuals’ privacy rights.

As I’ve mentioned, on 25 May 2018 a new chapter begins when the GDPR takes effect. This is a much-needed modernisation of the law which gives us the right tools to tackle the challenges ahead.

The GDPR does not specifically reference data ethics, but it is clear that its considerable focus on new technologies – particularly profiling and automated decision making – reflects the concerns of legislators about the personal and societal effect of powerful data-processing technology.

It also embeds the concept of data protection by design – an essential tool in minimising privacy risks and building trust – and Data Protection Impact Assessments, which will be compulsory in some high risk circumstances and, in some cases will have to be assessed and approved by my office.

The new law minimises the chances of acting in haste, repenting at leisure. The work has to be done up front.

But these tools need not be restricted to data protection. It’s hard to separate data protection by design from data ethics by design.

Companies must ask themselves questions that identify the risks they are creating for others and mitigate those risks. There is every reason to include ethical considerations as part of that process.

The most innovative companies will go further and use these tools as a springboard to think of ways they can integrate their data protection and ethical assessments.

That just makes common sense. And it speaks again to convergence.

We’ve offered practical advice on applying GDPR compliant impact assessments in the specific context of big data analytics. It forms part of our paper on Big data, artificial intelligence and machine learning.

It addresses the broader societal implications of AI and says that “embedding privacy and data protection into big data analytics enables not only societal benefits such as dignity, personality and community but also organisational benefits like creativity, innovation and trust. In short, it enables big data to do all the good things it can do.”

There is a lot of good it can do.

The worlds of data protection and data ethics are not sitting in separate universes. But there are broader questions beyond the law. We are all struggling to define the gaps and work out how the outstanding questions can be addressed.

Although I would like to think my office is sagacious in this space, we do need to have a broader conversation across many sectors and society.

There are other key players, reports and initiatives contributing to a go-forward approach for the UK – and many of them are in the room today. The Royal Society and British Academy, Wendy Hall’s report to government on the AI industry, the Alan Turing Institute, the Nuffield Foundation, and key studies by parliamentarians.

Last month the Government announced its intention to create a new body concerned with data ethics. Matt Hancock has already spoken about it this morning.

The Centre for Data Ethics and Innovation can complement the role of the ICO and other regulators by promoting the consideration of ethical issues. We recognise it can be a positive enabler and encourager of innovation particularly around AI and machine learning.

The Centre for Data Ethics and Innovation

So how do I see the new Centre shaping up?

I’d like to see it facilitating meaningful public consultation on matters that, ultimately, impact on people and their privacy. These consultations will help define the public and societal benefit in use of data and ensure it benefits communities and not just a few individuals.

I’d like to see it focus on futurology. Stepping out of the here and now and scanning the horizon for the next big data ethics challenge.

We would like the centre, or a hub of bodies linked to it, to work with regulators to provide overarching ethical principles for AI and machine learning.

We recognise general principles will have specific applications across sectors.

AI applications for automated vehicles could have very different implications from applications in criminal justice or the intelligence services, for example.

That’s quite a wish list!

But while I’m talking about it, the Centre could also support and encourage codes of conduct and standards.

For example, support the development of a code of conduct for ethics committees in companies. What does good look like?

It is critically important that the new body takes time early on to properly assess its role and how it can fill the gaps that exist. It should not take on a regulatory role which would only complicate the landscape.

We look forward to working with the new Centre and sharing our expertise – especially around the impact of ubiquitous data collection and technologies like artificial intelligence.

And we’ll continue to co-ordinate our work with other independent regulators in the data ethics space.

In my view there’s no dichotomy between ethics and innovation. But ethical considerations should dictate the direction of travel.

The UK has always been a leader in data protection – it’s one of the things that attracted me to this job – and the UK is a leader in the digital economy.

This will continue if we can embrace the law, and think about its principles as we continue to innovate.

We’re in a race to the top with economies like Japan, Singapore and France that are focussed on AI and digital economies. They know – we know – how important it is to get ethical issues right when it comes to AI.


In closing, allow me to look again to the past.

Ada said: “Understand well as I may, my comprehension can only be an infinitesimal fraction of all I want to understand.”

There is so much more for us all to understand. But I do know this: The UK is uniquely placed to be a leader in this space and to ensure that the principles of data protection and data ethics are firmly embedded in a future framework.

Thank you.

Information security is a precondition for doing business

In Dansk Standard Vækst+Kvalitet #11 I offer reflections on the importance of information security management, on how the new EU GDPR underlines the need for business accountability – a “license to operate” – and on how customer trust is becoming a new competitive advantage.

Please also find the article here: DS Vækst og Kvalitet No 11 November 2017 p32-33

The magazine is a supplement to Børsen on November 21 2017.


Data Ethics – The new competitive advantage

Globalization, leveraging the speed of information exchange across the world, creates myriad opportunities for any kind of business. It transforms the way we live our lives, both in our local communities and in our global presence, spanning professional, personal and private contexts. We are part of this transformation whether we like it or not, which is why enabling personal privacy in a transforming world is important. The opportunities for businesses and governmental authorities to leverage new and emerging digital technologies are great, and as a business you need to play along – if not, your business will soon be out of the game.

However, these opportunities, however great and noble their purposes, all have a dark flipside that attracts people with bad intent: people conducting their digital business on the dark internet, where privacy has been a cornerstone. It is a digital world where personal and internet identities are clearly separated, anonymous presences rule and transparency is nonexistent – an ideal world for criminal activity, with very low risk of individuals being caught.

The new EU GDPR is designed to protect individuals’ privacy in the “legal” world, to help businesses at global scale simplify the exchange of personal data, and to support businesses in leveraging emerging technological opportunities. The strategy is to raise the bar for businesses: handle personal data with care and demonstrate accountability in terms of lawfulness, fairness and transparency. Businesses and organizations are required to implement documented, appropriate technical and organizational measures based on data privacy impact assessments of their processing of personal information.

Well, most of the basic privacy protection requirements have been around for a long time, but the transformative change of the world, with opportunities and threats developing at exponential scale, calls for new business and organizational habits: practicing ethical behavior when handling personal information.

Even the positive intent of the EU GDPR to protect individuals’ privacy has a dark flipside. The new penalty regime for breaches – up to €20 million or 4% of global turnover – creates a great business opportunity in the world of the dark internet. It is easy to imagine hackers with bad intent being motivated to collect personal data and then blackmail the data controller for an amount less than the penalty, in exchange for not disclosing the breach to the data authority. And even then the data controller is in deep trouble, as they are already obligated to report the breach.

Your business’s ethics capabilities will need to evolve and become part of your marketing strategy for your business to stay competitive. As the EU GDPR rolls out, your ability to demonstrate trustworthiness through proper information and data processing becomes key to any business. We will see a new wave of privacy awareness among people – the end users of smartphones and digital services – spreading at the speed of light and placing high demands on ethical behavior by businesses.

Data ethics is key to doing business at the speed of trust, and data ethics becomes the new competitive advantage – and it will remain so until data ethics has become the norm of business practice.

So let’s get started training the new data ethics habits – be visionary and ambitious about data ethics on behalf of your business, beyond EU GDPR compliance!

“License to Operate”

With the new EU GDPR requirements on information security related to personal data, compliance with the Regulation has become a make-or-break matter for any organization within the EU, and for any organization outside the EU processing EU citizens’ personal information. As such, EU GDPR compliance is a “License to Operate”. Your organization is required to provide documented evidence of compliance on request from your national data protection authority – in Denmark, “Datatilsynet”.

Regardless of the type and size of your organization, Schledermann Consult can help you get organized and ensure documented compliance with the EU GDPR and other information security, legal or regulatory requirements.

Let us help you understand your “appropriate technical and organizational measures” and get organized for your EU GDPR “License to Operate”.

My Professional Reputation

“With dedication, high ambition and endurance, Steen works strategically toward ambitious goals”

Steen gives the group its sense of common purpose. He is professional and persistent, sees every task through to the finishing touches, and always keeps the customer in focus. He leads by example, is devoted, and has a great desire to achieve goals regardless of obstacles and opposition.

Steen shows great enthusiasm, is devoted to his work and thinks strategically. He is likable and has an infectious enthusiasm. He is good at quietly inspiring others; he gives colleagues space and time, and is welcoming and attentive.

Steen thinks strategically and has a natural ability to explain a client’s complex internal structures and values. He has a very high level of ambition and works toward high goals. He has managed to move a small, flat organization to one with the structure and position to grow.

Steen’s passion shines through in his communication. He is an inspiration when he talks about new work and ways of thinking.

Steen produces results when it matters.

This abstract is based on the responses of 11 colleagues, customers and managers who submitted anonymous answers on Thursday 20 June 2013.

This analysis is made by Peter Vikstrøm and Per Frykman – VIKSTROEM +45 26 21 13 67