Welcome to June’s newsletter
This month sees the GDPR’s first anniversary, and as the Data Protection Commission reflects on the past 12 months, the headline statistics are as follows:
– 6,624 complaints were received
– 5,818 valid data security breaches were notified
– Over 48,000 contacts were received through the DPC’s Information and Assessment Unit
– 54 investigations were opened – 35 of these are non-cross-border investigations and 19 are cross-border investigations into multinational technology companies and their compliance with the GDPR.
– 1,206 Data Protection Officer notifications were received.
– Staffing numbers increased from 85 at the end of 2017 to 137 in May 2019.
For our thoughts on the GDPR, see our thought leadership section.
1. Technical News
Biometric Data and explicit consent:
As technology takes ever greater strides, organisations and businesses are harnessing its capabilities to help manage their contact with customers, including using it as a means of identification and verification. While there are undoubtedly significant benefits in using new technologies, organisations need to be aware of the potential challenges when choosing and using any systems involving biometric data.
A deputy commissioner of the Information Commissioner’s Office (ICO)
has warned that organisations need to obtain explicit consent for the
use of biometric data.
A complaint from Big Brother Watch about HMRC’s Voice ID service revealed that callers were not given further information or advised that they did not have to sign up to the service. There was no clear option for callers who did not wish to register. In short, HMRC did not have adequate consent from its customers, and the ICO has issued an enforcement notice requiring HMRC to delete the data it continues to hold without consent.
In the notice, the Information Commissioner says that HMRC appears to have given ‘little or no consideration to the data protection principles when rolling out the Voice ID service’.
The ICO highlights the scale of the data collection – seven million voice records – and that HMRC collected it in circumstances where there was a significant imbalance of power between the organisation and its customers. HMRC did not explain to customers how they could decline to participate in the Voice ID system, nor that they would suffer no detrimental impact if they declined.
The case raises significant data governance and accountability issues that require monitoring.
Any organisation planning to use new and innovative technologies that involve personal data, including biometric data, needs to think about these key points:
1) Under the GDPR, controllers are required to complete a DPIA where their processing is ‘likely to result in a high risk to the rights and freedoms of natural persons’, such as the (large-scale) use of biometric data. A DPIA is a process which should also ensure that responsible controllers incorporate ‘data protection by design and by default’ principles into their projects. Data protection by design and default is a key concept at the heart of GDPR compliance.
2) When you’ve done your DPIA, make sure you act upon the risks identified and demonstrate you have taken them into account. Use it to inform your work.
3) Accountability is one of the data protection principles of the GDPR – it makes you responsible for
complying with the GDPR and says that you must be able to demonstrate your compliance by putting
appropriate technical and organisational measures in place.
4) If you are planning to rely on consent as a legal basis, then remember that biometric data is classed as special category data under the GDPR and any consent obtained must be explicit. The benefits of the technology cannot override the need to meet this legal obligation.
The GDPR’s security principle requires you to put in place appropriate technical and organisational
measures to ensure you process personal data securely.
Article 32 of the GDPR provides further considerations for the security of your processing. This includes
specifying encryption as an example of an appropriate technical measure, depending on the risks involved and the specific circumstances of your processing. The ICO has seen numerous incidents of personal data
being subject to unauthorised or unlawful processing, loss, damage or destruction. In many cases, the
damage and distress caused by these incidents may have been reduced or even avoided had the personal data been encrypted.
It is also the case that encryption solutions are widely available and can be deployed at relatively low cost. Where data is lost or destroyed and was not encrypted, regulatory action may be pursued, depending on the context of each incident.
2. International news
Ireland: Facebook and WhatsApp – investigations into passwords being stored as plain text
The DPC has notified us that an investigation into Facebook’s storing of passwords in plain text has been initiated, which concerns fundamental issues under the GDPR.
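Storing passwords in plain text is avoidable with standard tooling. As a minimal sketch (not Facebook’s actual remediation), the snippet below shows how passwords can be stored as salted, slow hashes using only Python’s standard library, so that a leaked credentials file does not expose the passwords themselves:

```python
import hashlib
import hmac
import secrets

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest); only this pair is stored, never the password."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

The deliberately slow key-derivation function (here PBKDF2 with 200,000 iterations) makes brute-forcing a leaked hash file expensive, which plain-text storage cannot.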
It also issued the following statement about WhatsApp: “The Data
Protection Commission (DPC) has been informed (Monday evening
13 May 2019) by WhatsApp Ireland of a serious security
vulnerability on the WhatsApp platform. The DPC understands that
the vulnerability may have enabled a malicious actor to install
unauthorised software and gain access to personal data on devices
which have WhatsApp installed”.
Denmark: €160,000 fine for not deleting personal data in time
Following an inspection by the Danish Data Protection Agency in October 2018, the taxi company Taxa 4×35 has been reported by the Agency to the police, and the Agency has recommended a fine of €160,000 for violation of the GDPR.
In most jurisdictions, the Data Protection Authority can issue fines on its own, but in Denmark a police report must be issued and the fine will be determined by the Danish courts.
The conclusions are interesting, as they expose several instructive mistakes made by Taxa.
Article 5 of the EU General Data Protection Regulation outlines the processing requirements for personal
data. A recent fine imposed by the Danish DPA gives some guidance on how these Article 5 principles could be enforced going forward. In its ruling, the Danish DPA found that Taxa had violated Article 5 of the GDPR in three ways: purpose limitation, data minimization and storage limitation (retention).
Article 5(1)(b) requires that data be collected for a legitimate purpose and not be further processed in a
manner that is incompatible with that purpose. Taxa violated this principle when it transformed the phone numbers of customers into “anonymous” account numbers. Taxa admitted that the phone number was not necessary; only an account number to be associated with taxi ride data was needed. Taxa did not treat the phone number as personal data and apparently had no intention of using the phone number to contact or personally identify the individual customer. Instead, it intended it to be an anonymous way to track data to meet a business purpose. The Danish DPA clearly found that personal data must be processed in
compliance with the GDPR, regardless of how the company intends to treat the data.
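The lesson generalises: deriving a deterministic account ID from a phone number (for example by hashing it) is pseudonymisation, not anonymisation, because the mapping can be reproduced. A hypothetical sketch in Python illustrates why – the pepper value and truncation length here are illustrative assumptions, not Taxa’s actual scheme:

```python
import hashlib

# Hypothetical pepper; in a real system this would be a managed secret.
PEPPER = b"org-internal-secret"

def account_id(phone: str) -> str:
    """Derive a deterministic account ID from a phone number."""
    return hashlib.sha256(PEPPER + phone.encode()).hexdigest()[:12]

# The mapping is reproducible: the same phone number always yields the
# same ID, so whoever holds the pepper can re-identify a customer by
# hashing candidate numbers. Ride data keyed to this ID therefore
# remains pseudonymised personal data, not anonymous data.
assert account_id("+45 12 34 56 78") == account_id("+45 12 34 56 78")
assert account_id("+45 12 34 56 78") != account_id("+45 87 65 43 21")
```

True anonymisation would require severing the link entirely, for example by assigning random IDs and destroying the lookup table.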
Article 5(1)(c) requires personal data be adequate, relevant and limited to what is necessary in relation to
the purposes for which it is processed. Taxa argued that it had met minimization requirements by removing the names associated with the phone numbers and that its systems were not capable of transferring the
anonymous data about the taxi ride from a phone number to a unique ID. The Danish DPA did not care that the computer systems made it difficult to create new account numbers and stated, in no uncertain terms,
that costs associated with migrating personal data to a new anonymous data structure do not justify
continued use of the phone number beyond the retention policy.
Article 5(1)(e) requires that personal data be kept in a form that permits the identification of a data subject for no longer than is necessary for the purposes for which the personal data is processed. Taxa had a
retention policy in place that stated data collected during a taxi ride is only necessary for two years.
However, at the end of the two years, Taxa only deleted the name associated with the ride but kept all the taxi-ride data relating to the ride (date, GPS coordinates of starting and ending location, distance, payment) and associated with the customer’s phone number for an additional three years.
Retention schedules are only effective if they are followed. Privacy professionals need to ensure that the retention timetable is no longer than necessary and that, once the time has expired, all personal data is removed.
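To illustrate the difference that tripped up Taxa, a retention job should delete the whole record once the period expires, not just the name column. A minimal sketch using an in-memory SQLite table – the schema and the two-year period are assumptions based on the case description:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE rides (
    phone TEXT, name TEXT, ride_date TEXT,
    start_gps TEXT, end_gps TEXT, distance_km REAL)""")

now = datetime.now(timezone.utc)
old = (now - timedelta(days=3 * 365)).isoformat()   # past retention
new = (now - timedelta(days=30)).isoformat()        # within retention
conn.executemany(
    "INSERT INTO rides VALUES (?, ?, ?, ?, ?, ?)",
    [("+4511111111", "Old Rider", old, "55.67,12.56", "55.70,12.55", 8.2),
     ("+4522222222", "New Rider", new, "55.66,12.58", "55.68,12.57", 3.1)])

# Delete the entire record once the two-year period has passed --
# not merely the name column, as Taxa did.
cutoff = (now - timedelta(days=2 * 365)).isoformat()
conn.execute("DELETE FROM rides WHERE ride_date < ?", (cutoff,))
remaining = conn.execute("SELECT name FROM rides").fetchall()
assert remaining == [("New Rider",)]
```

Because the phone number, GPS coordinates and timestamps together identify the customer, leaving any of them behind would still breach the storage limitation principle.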
Norway: €170,000 fine for having one file saved in the wrong location – or rather, for not knowing where the personal data being processed is stored at all
The Municipality of Bergen has been fined €170,000 by the Norwegian DPA, Datatilsynet.
The breach came to the DPA’s attention through a report from a student at a public school administered by the Municipality of Bergen, who found a file containing login credentials for 35,000 students and employees in a publicly accessible storage area.
Not knowing this means they cannot apply appropriate measures to protect the data, and they are therefore in breach of both Art. 5(1)(f) and Art. 32 GDPR.
Do you know where the personal data you are processing is stored?
Datatilsynet found that the municipality’s lack of appropriate measures to protect the personal data in its computer file systems constituted violations of both Art. 5(1)(f) and Art. 32 GDPR.
The fact that the security breach encompasses the personal data of over 35,000 individuals, and that the majority of these are children, were considered to be aggravating factors.
The Norwegian decision highlights the need to perform a personal data inventory. The Municipality of Bergen has conducted a number of projects relating to information security and access management. However, there is no point in investing in security measures and access management until one has full control of where personal data resides within its data sources.
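As a toy illustration of that kind of inventory work, the sketch below walks a directory tree and flags files whose contents look like stored credentials. The regular expression is a deliberately crude assumption; real data discovery tooling is far more sophisticated:

```python
import os
import re
import tempfile

# Assumed pattern: lines like "user:password" or "password = ...".
CRED_RE = re.compile(r"(password\s*[:=])|(^[\w.@-]+:\S+$)", re.I | re.M)

def find_credential_files(root: str) -> list[str]:
    """Return paths of files whose contents look like stored credentials."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as fh:
                    if CRED_RE.search(fh.read()):
                        hits.append(path)
            except OSError:
                pass  # unreadable file; a real tool would log this
    return hits

# Demo on a throwaway directory standing in for a "public storage area".
with tempfile.TemporaryDirectory() as root:
    with open(os.path.join(root, "export.txt"), "w") as fh:
        fh.write("student42:hunter2\n")
    with open(os.path.join(root, "notes.txt"), "w") as fh:
        fh.write("minutes from the meeting\n")
    flagged = find_credential_files(root)
    assert len(flagged) == 1 and flagged[0].endswith("export.txt")
```

Even a crude sweep like this would have surfaced the Bergen file; the harder organisational problem is running such inventories continuously and acting on the findings.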
26 Pembroke Street Upper Dublin 2
+353 1 636 3165
3003 Euro Business Park Little Island