COVID-19, Data Protection law and Privacy… Or the needs of the many vs the needs of the one.

When you have no right to privacy, Data Protection law governs an organisation's respect for your information. It should not be Data Utility vs Privacy, but Data Protection and Data Utility.

The terms data protection and privacy are often used interchangeably. Recently I have seen a high number of articles about "COVID-19 symptom tracking apps", and in the privacy community they all ask the same question: "is the loss of our privacy worth it?" It is tempting to look at it this way, and as Data Protection legislation is based on fundamental human rights legislation and principles, these are indeed worthy questions.

It is always a societal balancing act, when considering the needs of the many vs the few, or the one.

It is clear that in times of emergency, civil liberties can sometimes be suspended for "the greater good", and the current "lockdown", with Police powers to enforce it, is a clear example of this. We all accept that it is for our protection in the "greater good", but no one wants to wake from a subsided pandemic into a "new normal" of a surveillance-led Police state, an Orwellian Big Brother out of 1984. Power, once shifted and obtained, is not easily set aside by Governments. It only takes a look at the US "Patriot Act", passed for a limited time in response to the 9/11 terrorist attacks, which has been consistently renewed by successive governments ever since. I'm reminded of Star Wars, where the Empire began with an unscrupulous Chancellor using emergency powers granted during a time of war to create an oppressive, power-hungry regime. Yes, fantasy, but if our fiction is a mirror to our society, it is clear these are concerns that we all share.

As soon as I see technological surveillance being normalised, such as workplaces monitoring employee attention on remote web conference calls, or CCTV and drones being used to keep individuals in their place, I naturally recoil, and the libertarian in me objects to the direction our new surveillance society is taking.

But where does the law stand?  “Privacy” is not really mentioned in our current law, and instead we use the term “Data Protection”.  Often I see these terms used incorrectly as synonyms.  Privacy is not the same as Data Protection law. Why?  

The answer lies in this simple statement:

Data Protection law still applies, even when you have zero right to Privacy.

Let's take it back a bit. Privacy, and your human right to it, is closer to a synonym for "secrecy": your right to control whether your personal data is disclosed or not. Clearly we cannot live in a society where everything is secret. We have to transact our personal information in order to contract with businesses, lead productive social lives, contribute to society, pay our taxes and so on. This means our right to privacy changes depending on where we go and what we do. Clearly we have a large amount of privacy surrounding sexual practices in our own home. We have much less expectation of privacy on a busy high street, acting in our business capacity at work, holding public office, or using our persona to cultivate celebrity status. Privacy is changeable and varied, and it depends on where we are and the context in which we act. It is also not consent based in the majority of cases, as we cannot say "no" to having our data collected for tax or another legal obligation, or deny law enforcement access where they have a genuine need to investigate crime. In the majority of cases, we may have little or no right to privacy, and no real choice over the collection of our data.

This is actually why Data Protection law is so important, as it sets rules and principles for when our privacy should be respected or when it should not.  

It does this by requiring organisations to have a legal basis for holding personal data, which defines the balance of power between the individual and the organisation. Clearly, if a law requires a company to hold the data, the individual has little right to privacy; but if the organisation relies on something like consent, the individual holds much more power. Either way, the organisation still needs to comply with its data protection legal obligations. Most importantly, where we do not have a right to privacy, Data Protection law sets out responsibilities for those that hold our data. Data Protection law goes far beyond the scope of privacy, defining the safeguards for the proper use of the data by those that hold it. Let's consider some key parts of Data Protection law, summarised as follows:

  • Use Justification (legal basis)
  • Transparency (privacy notices)
  • Collection Limitation (minimum necessary to the purpose)
  • Use Limitation (only used for the purposes notified)
  • Accuracy (ensuring data is accurate and kept up to date)
  • Storage Limitation (held no longer than necessary)
  • Security (appropriate technical and organisational controls)
  • Individual Participation (allowing individuals rights such as access to a copy, rectification etc)
  • Transfer limitation (kept in countries/organisations with appropriate safeguards)
  • Accountability (appropriate documentation kept to demonstrate compliance)
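As a rough illustration of how two of these principles translate into practice, here is a minimal Python sketch of Collection Limitation (holding only the fields needed for the purpose) and Storage Limitation (purging data once it is no longer necessary). The record structure and the 28-day retention period are hypothetical, chosen purely for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical records: each holds only the minimum fields needed
# for the stated purpose (Collection Limitation).
records = [
    {"name": "A. Smith", "purpose": "symptom-tracking", "collected": datetime(2020, 3, 1)},
    {"name": "B. Jones", "purpose": "symptom-tracking", "collected": datetime(2020, 5, 20)},
]

RETENTION = timedelta(days=28)  # assumed retention period, for illustration only

def purge_expired(records, now):
    """Storage Limitation: keep data no longer than necessary."""
    return [r for r in records if now - r["collected"] <= RETENTION]

kept = purge_expired(records, now=datetime(2020, 6, 1))
# Only the record collected within the last 28 days survives the purge.
```

In a real system the retention period would be set per purpose and documented, which is exactly what the Accountability principle requires.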

There is much more to Data Protection law than just these obligations (controlling third parties, privacy impact assessments, privacy by design, and so on), but I would argue these have less to do with privacy itself and more to do with practical information governance and data management: examining the flows of the data and ensuring appropriate controls and safeguards have been designed in, so that people's data is treated with respect and only the minimum is used where strictly necessary.

So let's return to our examples of these "COVID-19 symptom tracking apps". Clearly I can think of public interest reasons where we need to sacrifice individual privacy for the greater good and wider public health. However, trust is key. We must all be careful to ensure that any of these solutions is carefully planned out in accordance with the principles above, with properly conducted Privacy Impact Assessments giving rise to controls that protect our personal data, and by extension, us. The data should be used only where strictly necessary, with the minimum data collected, appropriate safeguards to minimise risks to the individual, deletion when no longer strictly necessary, and no use for purposes other than those originally identified and specified at the time of collection. This is a far greater challenge than a simple "yes/no" to the invasion of privacy, but reasoned justification and practical data management measures will win the day, providing great societal benefit and protecting the individual simultaneously.

It is not therefore Privacy vs Benefit, but Data Protection and Benefit. A positive-sum, win-win solution that benefits everyone, both individually and society as a whole.

Ralph T O’Brien, Principal,

Belgium: DPO conflict of interest results in a fine

Two years on, and finally a fine pertaining directly to the role of the DPO… hurray! What a great celebration for GDPR and each of us who have the privilege to be a Data Protection Officer.

Avoidance of a conflict of interest for the DPO is super important in any organisation, because the role requires that he or she stand in the shoes of the data subject, which can potentially conflict with how the organisation views risk.

If we take this from a privacy risk angle, what is privacy risk? It is the risk of harm to the rights and freedoms of an individual (or "natural person", as per GDPR). You can think of the DPO as similar to a consumer advocate within an organisation, except that the DPO ensures the organisation fulfils its obligation to be a fit custodian of personal data and that the rights of the data subject are met.

A conflict of interest can occur when looking at risk. Every privacy risk maps to another organisational risk; e.g. missing encryption on laptops is a security risk, and that security risk is the cause of a privacy risk.

When you as DPO need to decide on risk appetite, you need to do this in the shoes of the data subject first. It is not practical to ask all data subjects whether they find a given risk acceptable; most would not understand what you are talking about. As a CISO or CRO, you will be looking at risk from the view of the organisation's risk appetite. These two views can create conflict in the role of the DPO, hence a conflict of interest.

This is why the recent ruling in Belgium is so important, two years after GDPR came into force.

GDPR is very Personal

This is a personal post. Not because it is the anniversary of GDPR and I am feeling emotional, but because, to me, GDPR is very personal, and I hope you don't mind.

Personal Letter

When was the last time you sent a letter? I do not mean a letter to the tax office or the employment agency, or an invoice to a customer. A real one.

When I was younger, writing and sending a letter was the most joyful thing. I knew I could write whatever I wanted, my girlfriend could read it, and I could get letters from her. Everything was private, everything was between the two of us, and it was so emotional, so special, so joyful.

Now, instead of letters we send emails and Facebook messages; we WhatsApp, we Viber, we Hangout, we Telegram.

I was almost confident that my letters were not opened; I had this trust, because I could mark the envelope in ways that would let the receiver notice whether it had been opened or not.

Yes, it was an analog process, and now we are living in the digital era. Things have changed.

To illustrate how communication has changed, I would like to use a nice analogy of borrowing a book from a library, given by Peter Krantz when he was CIO of the Swedish National Library. He presents it like this: as a user, you just borrow the book, and everything stays between the library and you. But when you read a digital book, there are many different stakeholders involved, as shown in the picture, and I have no control over the use of this information.

But don't get me wrong, I think we don't even need GDPR and we can fully trust companies and states.

Whoever complains about privacy or talks about human rights is just one of a bunch of crazy people living in a delusional world.


When I was younger, we were told conspiracy theories about a secret ECHELON program that collects and stores all digital communication. It sounded crazy. Why should there be an organization like this? Why should they collect all this data?

But now we know it is a fact. We know because we have the evidence provided by Edward Snowden. We know because there are whistleblowers. Governments are not hiding it anymore; they just do it to protect us from terrorists! Companies are not hiding it either. They accept that they collect data, take our consent (as if we had another option) and use it.

As a result, governments and public bodies create, collect, store and hold a lot of critical information about us. It is not only our name, street address and registration number. It is our health data; it is video of us recorded by security officers; it is our photos and voice recordings when we enter and leave countries at the airport. It is our photos (and maybe even our voice) when we stop at a stoplight; it is our fingerprints when we enter (some) countries or get a new passport or driving licence.

Governments ask us to trust them; private companies ask us to trust them. As I said, I trust them all! When I read "personalized content", I just understand: "Oh boy, great, they are recording and following us for my safety and security".


“The information we collect includes unique identifiers, browser type and settings, device type and settings, operating system, mobile network information including carrier name and phone number, and application version number. We also collect information about the interaction of your apps, browsers, and devices with our services, including IP address, crash reports, system activity, and the date, time, and referrer URL of your requests”

“We will share personal information outside of Google if we have a good-faith belief that access, use, preservation, or disclosure of the information is reasonably necessary to:”

Let me explain what this means in practice for TrustCorp:

They collect personal information like your name, email address, telephone number or credit card. They can collect our phone number, which is not their core business; they can collect the identifier of my phone (again, I did not buy the phone from them); they can see all my calls, their dates, durations and types, and nothing is said about whether they also record my calls or not (maybe, who knows?). They can collect information about the websites I visited, what worked and what crashed, all the location information they can derive from any sensors and wireless access points around me, and information about my local storage. Storage is actually interesting: it is like IKEA knowing how much empty space I still have in my wardrobe and receiving regular updates about it, isn't it?

If you think this is too much information, no it is not! If they think they might need more information from me, they will notify me (if it is something notifiable) and ask for my explicit consent, meaning that I have to accept their terms and conditions or I cannot check my emails, check maps, and so on. I will not have the option "No, I reject, but I still use the service as before". I cannot reject and keep using the old version either! I will have one and only one great option: Consent!

Not only do these companies have our data, they can also share all of it with

 “companies, organizations or individuals outside of TrustCorp if we have a good-faith belief”

They can share it with individuals. Who might these individuals be? And what if TrustCorp has "a good-faith belief"? What is "good", and whose "faith" is it?

TrustMEtoo CORP

Now my second example is from another corporation, TrustMEtoo. This is the one that usually tries to improve its privacy image and deflect the questions. Let me explain how this company was used by Trump before the election: Parscale uploaded the names, email addresses, and phone numbers of known Trump supporters into the TrustMEtoo advertising platform.

Next, Parscale used TrustMEtoo's "Custom Audiences from Customer Lists" to match these real people with their virtual TrustMEtoo profiles. With TrustMEtoo's "Audience Targeting Options" feature, ads can be targeted to people based on their TrustMEtoo activity, ethnic affinity, or "location and demographics like age, gender and interests. You can even target your ad to people based on what they do off of TrustMEtoo."

Parscale then expanded Trump’s pool of targeted TrustMEtoo users using “Lookalike Audiences”, a powerful data tool that automatically found other people on TrustMEtoo with “common qualities” that “look like” known Trump supporters. Finally, Parscale used TrustMEtoo’s “Brand Lift” survey capabilities to measure the success of the ads.

The data was then shared with trustable organizations and individuals to create their own databases, and then there was Project Alamo, where data on 220 million Americans was stored, with approximately 4,000 to 5,000 individual data points each.

What was I saying: we should trust corporations and the Trump government, right?

Muslim Registry

Do you remember when there was a time when people were scared that Trump would register Muslims in the USA? Honestly, why were we scared that Trump was going to register Muslims in the USA?

Honestly, do you really think he is going to register them one by one? As of today, I guess we all know that he is not as stupid as he was presented to us by our "objective" media. He already has the registry. One of my friends shared in her Facebook post how Muslims were able to receive specific letters from churches, presented as an innovative way of reaching the church (it is from 2016).

I am not against any religious practice, but are we all really OK with any organization or company getting that detailed a list?

Shall we trust companies? OH Yes!

Trust Governments

I think not only companies but all governments are trustable. Let me give you a first example, from a state:

“According to a half dozen current and former employees, who spoke on the condition of anonymity, leaked Procera documents and internal communications, Turk Telekom requested not just a feed of subscribers’ usernames and passwords for unencrypted websites, but also their IP addresses, what sites they’d visited and when.” Forbes, October 2016

This excerpt is from an old Forbes news story, in which the Turkish state's technology company asked a Canadian company to give access to "usernames and passwords for unencrypted websites, but also their IP addresses, what sites they'd visited and when". We only heard about this because the company had a Swedish branch, with Swedish employees and a Swedish CEO; they protested, and the CEO resigned. What if they had not? What if there are companies that do not care about these issues but just profit from them?

Private Fridays and Privacy of Health Data

It is not only about when we use a service. With every device we add to our lives, corporations are so trustable that they now dare to tell us to be careful what we say next to their voice-activated devices. You don't need to worry about private talks, private moments or Friday nights with your partner anymore! Your dear friend Alexa will take care of it!

If you want a private moment and do not want them to hear and see you, go to the storage room! Wait a moment, maybe we already put a camera there too!

Now we have COVID-19, and some countries are making it mandatory for people to provide input to specific apps and undergo regular health screening with specific tools and cameras. They claim it will ONLY be used for COVID-19. Let me give you a great example, an ongoing discussion about the PKU blood registry in a very democratic and open country: Sweden.

PKU is a genetic disease, and parents are asked to donate their kids' blood for PKU clinical and health research. The majority of people, for good-Samaritan reasons, donated their kids' blood.

You know what happened? In 2003, after the assassination of Foreign Minister Anna Lindh, police were able to identify the perpetrator by means of blood samples from the PKU database, despite protests by the health service. When identifying Swedish citizens after the 2004 tsunami disaster, the Biobank Act was temporarily amended by a parliamentary decision that allowed the International Identification Commission to use the samples.

Imagine: they ask your permission for your newborn baby's blood to be taken for research to cure diseases, and you decide to donate. Now it can be used by the police and an international commission!

Imagine that you are "that kid", your blood registered in a database without your own consent. What if the government decides to open these databases not only to the police but also to insurance providers, to find your preconditions and genetic diseases, which can then be shared by companies as "trustable" as those I described above?

Privacy is Creativity

Imagine the world we are entering: we are recorded and registered from our birth, from our blood to our every move, by different companies, states and governments, whom we are supposed to trust and who can share this information.

There are tons of sociological studies on privacy, cameras and so on. They conclude that, under surveillance, behaviour becomes conformist, shaped into what is accepted and expected by the power elite. We can basically conclude that creativity, freedom and resistance to power, in fact humanity itself, dies!

Think for a moment: what if Hitler had had all this surveillance technology we have now?

What could we have done to protect ourselves against all the political and technological power he would have had?

Privacy of Personal Data

I hope my examples above show why and how data privacy is very important for citizens.
But I have to make a distinct difference here. When government representatives talk about privacy and security, they ONLY refer to the privacy and security of government files. The problem with that context is that these governments do not care about the privacy of citizen data; they do not care, basically, about the privacy of "mydata".

How much can we trust that Google, Samsung, Microsoft and other private companies will protect our data? What gives them the right to collect so much information about every one of us and every device we own?

I am not here to draw a negative picture, but we must face reality and define the problems properly, without falling into an ideological trap, so that we can come up with suggestions. Because whenever people raise their voices, these organizations create an environment in which privacy advocates are portrayed as a bunch of radical people who do not get the new world!

But just think about it: how many of you know how your data is collected and treated?

You are being told that it is anonymized, right? But at which level? I am sharing my anonymized pictures with you! Which one is stored in the database?

Encryption, right? I know a company that promised so; please search for Ashley Madison. Their whole database was leaked, the data was easily seen, people lost their reputations, and several people committed suicide. People trusted a company regulated by state rules, but in the end their privacy was compromised!
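On the "which level?" question: simply hashing an identifier is pseudonymization, not anonymization, because a small identifier space can be brute-forced back to the original value. Here is a minimal Python sketch; the 4-digit PIN is an illustrative stand-in for any low-entropy identifier such as a phone number or national ID:

```python
import hashlib

def pseudonymize(value: str) -> str:
    """Replace an identifier with its SHA-256 hash (pseudonymization only)."""
    return hashlib.sha256(value.encode()).hexdigest()

stored = pseudonymize("1234")  # what the "anonymized" database actually holds

def reidentify(target_hash: str) -> str:
    """An attacker who knows the format can try every 4-digit value."""
    for n in range(10_000):
        candidate = f"{n:04d}"
        if pseudonymize(candidate) == target_hash:
            return candidate
    return ""

recovered = reidentify(stored)  # brute force recovers the original value
```

The database no longer contains the raw value, yet anyone who knows the input format can recover it, which is why data protection law still treats such data as personal data.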

OK Serdar, the DrZero, you have complained a lot; is there any solution? Yes.

There are many solutions, but I want to keep that discussion for future posts. First, we should see that these issues concern everybody, each of us!

All these "trustable" companies and governments are pushing us into a corner by saying (as Google CEO Eric Schmidt once did):

"Trust us, we are the good guys!" and

"If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place."

Edward Snowden has an answer to them:

"Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say."

I think I have right to flip the sentence here:

Hey, dear companies and dear states: if you do not want to share how you store our data, encrypt our data, and process our data, then you have something to hide!

We put our trust in states, and they are not able to protect our data! We put our trust in governments, and they do nothing to protect individuals! We put our trust in private companies, but they are being hacked, and they actually abuse their power.

Google, Microsoft, Samsung, Facebook, all governments, human rights activists and citizens need to understand more broadly that by ignoring personal privacy in the data era, we not only allow invasions of our personal space but open the door to future abuses of power.

We have all the tools and the technology available to address all these problems. I just want my right to privacy, and to have private communication as I had with physical letters years ago! Is that too much?

On the Second Anniversary of the GDPR: Mobile App Descriptions

With today being the second anniversary of the GDPR, below is an article I wrote regarding mobile apps and privacy, particularly with respect to the U.S. COPPA statute (Children’s Online Privacy Protection Act). I reviewed over 10,400 mobile apps in the Google and Apple stores while working at a Washington, DC, law firm and it was an enriching experience. So here are my suggestions and tips. Enjoy!


Mobile App Descriptions: Observations and Tips

James J. Casey, Jr., Esq., CPP


One of the most dynamic aspects of the smartphone revolution has been the introduction of mobile apps that are downloaded for use in smartphones and tablets. The Apple App and Google Play Stores are the primary players and have been since their establishment nearly 12 years ago. Phone manufacturers who attempted to create their own native app ecosystems, such as BlackBerry, generally failed. Of course, many know that BlackBerry phones now have the Android OS, which means access to the Google Play Store.

With the long-overdue focus on privacy (20 years late, in the estimation of the author), mobile apps are under increasing scrutiny. This is particularly true where apps may be directed towards children, collecting their personal data in the process. I have been fortunate to have reviewed over 10,400 apps in the Apple App and Google Play Stores, and have some observations and tips to share with you. These observations and tips are exactly that; they are not legal advice. You are therefore recommended to seek legal advice from your general counsel or outside attorney.


  1. I have seen sloppily written app descriptions, complete with spelling and grammatical errors. App descriptions should be precisely written and explicitly clear as to what the app(s) is / are designed for, what age rating is appropriate, and what audience(s) the app(s) is / are directed towards. The app store content ratings should accurately reflect the targeted age groups and content descriptions. This is especially true for children as well as for mature subject matter and topics involving violence, crude humor, blood / gore / death, stronger sexual themes and content, and partial nudity. If an app is part of the Google Play Store "Designed for Families" (DFF) program, then it should be marked as such on the app page.
  2. Make sure there is consistent alignment between the descriptive words and pictures / media / images / screenshots in the app description. This is especially true when it comes to the ages of “directed persons” / target audience and where the content involves mature or potentially controversial subject matter. If an app is directed towards children (under age 13) and older individuals (a “mixed” app), then the descriptions should clearly state that. It is critical that app descriptions for children be crystal clear.
  3. There is a fine line between too much and too little information on the app page. Some apps have too much extraneous information and not enough important detail. App descriptions are truly art + science.
  4. It is also important to recognize that there are global differences in the use of child-specific images. In some regions and cultures, such images may be perfectly appropriate for older audiences (not directed at children), while in other regions and cultures those images are not used except in apps directed towards children. Be aware of this global dimension.
  5. Be clear about the financial dimensions of apps (as applicable). Will they require purchases once the app is downloaded? If so, what is the cost and how often? Is there a subscription option? Consumers do not want to be surprised by these costs, and reading the reviews of apps in both stores is quite informative: it illustrates the "surprise" of these additional costs.
  6. If ads will pop up while an app is used, alert the consumer to this fact. This is another area where consumers would rather not be surprised.
  7. Ensure that your company privacy policy and other associated terms and conditions / terms of service are current and in compliance with the requisite statutes and regulations (such as the EU GDPR, U.S. COPPA, and the State of California CCPA).


App descriptions in the Apple App and Google Play Stores serve two important purposes: to entice people to download and use the app, and to comply with the relevant country or jurisdiction statutes and regulations. Apps require the same concise writing and dedication to detail that many other areas of technology and law require. From what I have seen in reviewing app pages, the biggest issues are sloppy writing (including missing substance) and inconsistent messaging, especially between words and images / media. It is better to identify child-directed apps on the app page than to have a governmental authority begin to question and analyze apps to ensure that the privacy interests of children are being protected.

We are entering a heightened age where the protection of personal data is much more important than previously desired or expected. It is better to adopt privacy-protecting practices now than to react to legislation and regulation later.

I may be reached at if you have any questions or comments.

Happy Birthday 2 years on with GDPR!

In celebration of GDPR's second anniversary, I thought I would repost some blog posts from June 2018. However, when looking, I realised there were only a few, and the theme was strongly about how our personal data is public in Sweden and the use of utgivningsbevis to keep this status quo. So I ended up writing an additional blog post, realising that I'm still really unhappy about the Swedish status quo on this.

GDPR has brought progress in ensuring that we, data subjects, have rights over our personal data, but sadly what I posted 2 years ago is still acutely relevant today in 2020.

The fact is that in Sweden our personal data is made public and we have no say! After all, public is public: it is impossible to restrict processing when this is the case, as acknowledged in privacy laws, not just in the EU. Data brokers scrape this data from public sources, do some intelligent profiling and sell it on to businesses; e.g. where you live will determine how you are profiled and to whom you will be sold.

Someone tried to argue with me once that a street name (missing house no.) was not personal data. The fact is that the street where you live says quite a lot about who you are. It gives an indication on your wealth, if you’re young, with kids, or elderly and if you’re likely to have a garden, 1 or 2 cars, etc. Your street name is directly or indirectly linked to you as an individual. The street name could be enough that you receive cold calls either by phone or someone knocking on your door to sell you double-glazing.

In the UK, for example, you are hidden by default. The difference in Sweden is that the clash between laws pertaining to 'freedom of the press' and 'a right to a private life' still stands today, and it is the former that wins.

I read somewhere that there are hundreds, maybe thousands, of complaints from Swedish data subjects about the lack of control and rights (as per GDPR) they have over their personal data. This is positive! People are aware of their rights and are asking why this is happening. I can't find the article now, so I would appreciate it if anyone can dig it up. The question is whether this will change. Can it change?

The e-Privacy Regulation has provisions to protect against unsolicited calls, with protection by default: as in the UK, the resident needs to opt in to be included in a public directory.

Protection against spam: this proposal bans unsolicited electronic communications by emails, SMS and automated calling machines. Depending on national law people will either be protected by default or be able to use a do-not-call list to not receive marketing phone calls. Marketing callers will need to display their phone number or use a special pre-fix that indicates a marketing call.

How it works in Sweden today is that every business needs to have its own 'do not call' list; what is proposed in the e-Privacy Regulation seems to be a national list, which is an improvement but still does not solve the root of the problem. I do not want my data to be public unless I have specifically consented to this or have made it public myself.

Happy GDPR Day!

On the two-year anniversary of the EU’s GDPR I thought it would be timely to post an excerpt from the 2nd edition of my Cybersecurity Law, Standards and Regulations book published earlier this year.

The European Union (EU) General Data Protection Regulation (GDPR) was approved by the EU parliament on April 14, 2016 and became effective May 25, 2018. The GDPR replaces the EU Data Protection Directive and is designed to:

• Standardize disparate data privacy laws throughout Europe.
• Protect EU citizen privacy.
• Harmonize EU data protection and privacy safeguards.
• Encourage compliance through meaningful fines and sanctions.
• Put EU citizens back in charge of their personal data.

GDPR applies to organizations located within the EU as well as organizations located outside of the EU if they offer goods or services to, or monitor the behavior of, EU data subjects. GDPR applies to all companies processing and holding the personal data of data subjects residing in the European Union, regardless of the company’s location. Figure 3-1 provides a model of how GDPR is designed.

Figure 3-1. EU GDPR Model.

The GDPR differs from the EU Data Protection Directive in the following ways:
Directive vs. Regulation – GDPR carries more clout and removes the discretionary language that comes with a directive. The GDPR applies to all member states of the EU and removes data protection inconsistencies of member states.
Jurisdiction Expansion – The coverage of GDPR is expanded past European boundaries and extends compliance to any organization that houses or processes EU citizen information regardless of location.
Citizen Consent and Rights – Organizations can no longer use ambiguous terminology or confusing legalese to define or secure consent. Organizations must clearly define the terms of consent and how data will be used in plain language. Citizens also have the right to access (right to access) and receive (data portability) their own data as well as have it erased (right to be forgotten) on demand.
Privacy Safeguards – Privacy protection is now a legal requirement that must be designed into systems and processes to meet the requirements of GDPR.
Enforcement – The GDPR is similarly enforced through courts, with penal and administrative sanctions in addition to civil remedies. What has changed is the amount of the fines a court can levy for a violation. Fines can go as high as EUR 20 million or four percent of an organization’s annual worldwide turnover, whichever is higher.
Breach Notifications – Under GDPR it is no longer necessary to submit breach notifications to each local privacy authority. A Data Protection Officer (DPO), which is a mandatory appointment for many organizations, would make the notification to a single, relevant supervisory authority.

2019 is the year when GDPR enforcement ramped up. I believe that for every data breach experienced here in the US, a parallel GDPR enforcement action will be launched in cases where EU citizens are impacted. Table 3-9 provides a summary of some of the initial fines levied under GDPR.

Table 3-9. Largest GDPR Fines

The companies fined above are just the beginning: the UK data protection authority, the Information Commissioner’s Office, announced in July 2019 its intent to fine British Airways and Marriott International $228 million and $124 million respectively for violating the GDPR (Davies, 2019).

TIP: Create a GDPR impact statement based on four percent of your organization’s annual turnover as well as EUR 20 million (converted to your local currency) to determine total fine exposure; the higher of the two figures applies.
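The TIP’s arithmetic can be sketched in a few lines. This is an illustration only, not legal advice: the `gdpr_fine_exposure` helper and the turnover figures below are hypothetical, and it simply applies the Article 83(5) ceiling of EUR 20 million or four percent of annual worldwide turnover, whichever is higher.

```python
def gdpr_fine_exposure(annual_turnover_eur: float) -> float:
    """Upper bound of a GDPR administrative fine:
    EUR 20 million or 4% of annual worldwide turnover, whichever is higher."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# An organisation with EUR 1.2 billion turnover: the 4% figure dominates.
print(gdpr_fine_exposure(1_200_000_000))  # 48000000.0

# A smaller organisation with EUR 100 million turnover: the EUR 20 million floor applies.
print(gdpr_fine_exposure(100_000_000))    # 20000000.0
```

Note that the crossover point is EUR 500 million in turnover; below that, the fixed EUR 20 million ceiling is the larger exposure.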

GDPR compliance still requires work worldwide. A report by Thomson Reuters, released approximately one year to the day after GDPR took effect, states that:

• More companies are failing to meet global data privacy regulations.
• Many companies have found GDPR compliance more difficult than expected.
• Half of companies are at risk of falling further behind.
• An increasing number of companies have now been subject to enforcement actions.
• Companies are becoming less open and pro-active with consumers.
• Board and C-suite concern and engagement on data privacy issues is falling.
• GDPR is now consuming a greater proportion of data privacy budgets (Thomson Reuters, 2019).

Keep regular tabs on this site for the most current information on GDPR.

Accountability. Implications for a Controller using CCTV.

But what is a controller, I hear you ask?! Once again we return to the “purpose and means (essential elements) of processing”. Not trying to get boring about it, but this is where the magic happens! We have some interesting and challenging situations to consider. We need to always come back to who the real controller of the camera is. Not just who put the camera up, but why? For what purpose? Who benefits? And who controls how it is used?

We also need to consider the types of data being processed. For cameras, it’s images and sound, probably not a lot more. This data is central to our security and it is realistic to expect it will be held for a period of time.   

Cameras in communal areas of apartment blocks, cameras on the street, cameras in areas that are semi-public – they all pose challenges that are not easily explained by the GDPR. Public cameras are also on the increase. Police forces are protecting us as a community with strategically placed cameras. It seems that no matter how far we roam, we are never too far away from a CCTV camera. The central question for all of us is “who is the controller?”.

So does the right of the controller to use this camera to “prevent” or “solve” crime override your right to data integrity? The European Data Protection Board suggests a particular methodology for private persons to follow. The controller should have tried other methods and determined that a camera is the necessary solution. From there, they need to ensure that they are applying the minimisation principle. Video surveillance to “prevent accidents” is not proportional, and individuals should not be monitored in places where they don’t expect to be monitored, such as changing rooms or saunas.

The household or domestic exemption in the GDPR is strictly interpreted, and is getting stricter following recent guidelines. These days, if we buy a camera for our home, we must be prepared to take responsibility for it. This means (among other things) being really clear about the purpose of the camera, positioning it correctly, and putting up a sign letting people know that camera surveillance is in place.