The Data Diva
In January 2021, I had the honour of being interviewed by Debbie Reynolds. We chatted about the GDPR and its slow enforcement by Data Protection Authorities; the positions of the French CNIL, the UK ICO, and the Irish DPC, the One-Stop-Shop for US mega-corporations such as Google and Facebook; the future of international data flows since the invalidation of Privacy Shield by the ECJ in Schrems II; the requirement to ensure the same level of data protection when personal data is transferred outside the EU/EEA; and the question of UK adequacy after Brexit. We also discussed securing data exchanges, encryption, and the evergreen good advice: data minimisation.
Transcript of the interview:
Tara Taubman-Bassirian 01:03
Hello, thank you for having me on your show. It’s nice to see you and chat with you. We are both very active, I think, in educating and discussing the subject of data protection. I’ve been involved with this subject for more than a decade, since I was a French corporate lawyer; I then moved to Germany and the UK, and it’s here that I specialised in privacy and data protection while doing my LLM. Originally, I was expecting to work on child online safety, until, during my LLM, I realised that governments were sometimes using child protection to quiet free speech. So I got involved with the reform of the Data Protection Directive, which then became the GDPR. I guess your audience knows the difference between a directive and the GDPR, which is a regulation that applies directly in all EU nations. The legislation was quite an ambitious programme. It could not be perfect because, among the 28 member states, some have civil law backgrounds and others are common law, like the UK, so there has been tension between the different legal traditions. I still believe it’s among the best; I shouldn’t say the best, and it still needs to be perfected, but the big fines make everybody wake up. Everybody started to think: well, GDPR data protection is actually important. I try to explain that we gained a lot with the digital world. I remember when, as lawyers, we would dictate and secretaries would type everything, and if something was wrong, they had to start all over again. The Internet has made lawyers’ work much, much easier; everyone’s work is much easier. We do a search on Google, which brings up so many results in just one click, and all for free. Well, free, maybe not; we just don’t pay straight away. So the GDPR, with its fines, wakes people up. The authorities, in my view, have been a bit too slow to enforce.
But then came COVID and confinement, which also changed a lot of the situation: working from home is different. I’ve been following the cases of cyberattacks, and I guess, Debbie, you’ve seen the same; they’re going through the ceiling. Working from home is less secure, obviously, so it needs training, and I’m not sure every employee gets the training they need. We had in the UK a very interesting case, the Morrisons case, which went up to the Supreme Court, about the vicarious liability of an employer for his employee. In this case, the supermarket Morrisons hadn’t done anything wrong, apart from having an employment dispute with an employee, whom they sacked. The employee was working in the accounting department and had been asked to forward the whole HR dataset to the outside auditors, which he did; meanwhile, he copied it onto a USB stick. And after he was sacked, the whole HR dataset was published online. That caused a major problem for all the employees whose data was exposed, and there was a class action. The Court of Appeal said, basically, that Morrisons should have had cyber insurance to cover these sorts of damages. In the Supreme Court, Morrisons had a very good lawyer; not that the class action lawyer was not as good, but Morrisons’ lawyer did a very good job of convincing the court that the company hadn’t done anything wrong, and that the ex-employee was acting as a data controller, so he alone was responsible. I’m guessing lots of employers listening would think: well, why would Morrisons be liable? They hadn’t done anything wrong; it’s solely the liability of the ex-employee. But then how would the other employees get any compensation for the damage to their data? So that’s an issue. And then there was a case that I barely mentioned; I saw it today on LinkedIn, and I don’t have much detail about it.
But it’s a German court that said not every data breach gives rise to compensation. In the UK, we have the Lloyd class action against Google, which got the green light to go forward; the court said the loss of control and the psychological distress could be compensated. We still have to wait and see where it all goes. But the problem with a data breach is that the consequences are not immediate. The data is stolen, sold on black markets, or just thrown around everywhere, and sometimes, many years later, someone’s ID or financial details are misused.
Debbie Reynolds 07:32
Yeah, I think the problem that we just can’t get around is that a lot of times, the harm happens before the law catches up. If the enforcement or the remediation doesn’t happen soon enough, once the milk is spilled out of the carton, it’s almost impossible to get it back. So that’s why it’s really important that data be protected the best way possible, that individuals have more control over their data, and that it not become spilled milk. I think it’s really interesting that you were a lawyer in France, lived in Germany, and now live in the UK. So you have the full experience, right, from different countries and the different ways they look at things. I’d love your thoughts: me being an American and you being in Europe, with so many people around the world working in Data Privacy, I sometimes hear people say there’s a difference between the way American privacy folks think about privacy as opposed to European people. Have you had any thoughts about that?
Tara Taubman-Bassirian 08:48
Yes, privacy is very much connected to the cultural understanding of how we perceive our intimacy. Americans, for example, don’t like people touching them, while Mediterraneans are all touchy-feely; it’s different. We had a discussion on Facebook not long ago about the difference between privacy and data protection. For me, privacy is a fundamental human right; it’s more of a philosophical concept. Data protection is more about the data that is collected and needs to be protected, and the way we are allowed, or not allowed, to collect it. Someone posted a picture of a toilet with glass walls. Some people think privacy is secrecy: I have nothing to hide, so why would I care about privacy? And I like to tell them: whenever you go to the loo, you close the door. It’s just somewhere you would like to be in peace. Though, going back to the loo: actually, French royalty famously used to do these things in public, and they would show it as a way of saying, well, I’m healthy, there is no barrier, so no problem going to the loo in front of everyone. So things have changed. The understanding and the way we perceive privacy change, and they also change from one culture to another. I’ve had people in Sweden telling me: we go naked in a sauna, and we don’t care about privacy. And I reply: but you choose to go to the sauna, you choose who you’re sitting with, and if you’re not happy with the people there, if you don’t like them, you just walk away and go to another sauna. Your privacy online is different: you have absolutely no control over what’s being done with your data. And this is where privacy gets very close to data protection. A few years ago, I had a chat with an IT person about threats and data protection. While we were chatting, I Googled her, and I found so much private stuff about her online.
And I just sent her a link, and she felt like I was invading her privacy. I said, I just Googled your name, put it in quotation marks, and saw everything that comes up about you. And this was someone who works in IT; it was four years ago, and I’m hoping people understand it a bit more today. But Google does the job of a puzzle: in one click, the different pieces are put next to each other, showing a whole picture that you have no control over. It could be your own boss looking, or it could be someone else out to destroy your reputation.
Debbie Reynolds 10:56
Absolutely. Right, exactly. I think for people who aren’t online or don’t have a good, full presence online, it is, unfortunately, an opportunity for others to fill it in with bad things or things that aren’t true. So it’s very important, I think, for people to really step out and establish their identity, so people can differentiate fact from fiction. I would love to get your thoughts on the Schrems II decision and the invalidation of Privacy Shield between the EU and the US. What are your thoughts about it?
Tara Taubman-Bassirian 12:43
It’s a case that has been ongoing for a few years. First, the previous adequacy decision, the Safe Harbor agreement, was invalidated by the European Court of Justice. Nothing changed, so Maximilian Schrems, who is behind this case and who founded the NGO noyb (None of Your Business), continued his battle. In July 2020, the European Court of Justice again said that, despite the European Commission having given the green light after its reviews of the Privacy Shield, it should be invalidated. It’s not that the US does intelligence-gathering and Europeans don’t; every country today has intelligence services, and they try to grab information. The problem with the US, as the European Court of Justice saw it, is that there is no sufficient judicial review. There is a sort of discrimination against EU citizens, who don’t have a right to redress. Until things change, FISA Section 702, Executive Order 12333, and other intelligence rules allow the US to intercept data in the normal transfer of data from the EU to the US. In principle, standard contractual clauses and binding corporate rules are still valid, so a lot of people jumped up and said: oh, very well, let’s continue as we did before. However, the court said that additional measures should be taken to make sure that the data exported from the EU and imported into the US is kept safe, that access is not given to the US government, and that EU citizens get a right of redress. Personally, I don’t see many ways this could be done. Data can be encrypted, with the encryption key remaining within the EU; clauses could be added to the standard contractual clauses that are being modernised by the Commission. But it’s still not there. Some data protection authorities, like the German ones, have been very strict, essentially saying: no more transfer of data from the EU to the US.
The French are still thinking about it. The message is basically: be careful what you’re doing, or we can stop the transfers. I know I’m not repeating exactly what they said, but that’s what comes out of it. The UK, meanwhile, is concentrating on Brexit: will it get an adequacy decision or not? But that’s another issue. What’s interesting with the Schrems II decision is that it basically says: we, Europe, want strong protection for data, as if there were a kind of RFID tag, sorry, that follows the data wherever it goes. The country of import should give the same level of protection to this data, which is quite strong. Hopefully, it’s a good message that we pass. We had an interesting case in France, if I may mention it. The Health Data Hub is a cloud platform that stores French medical data, especially since COVID. It was brought before the Conseil d’État, the highest administrative court in France, to consider whether this data should be put in the hands of Microsoft. And that is confirmation that what the ECJ sees as a transfer of data is not only the physical transfer of data from the EU to the US: because all these US laws apply to US companies, even if their servers are based in Europe, the US government can still intercept the data, right?
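One of the supplementary measures mentioned above, encrypting data before export while the key stays with the EU-based exporter, can be sketched in a toy way. This is only an illustration of the principle: the one-time pad below is not production cryptography, and a real deployment would use a vetted library and proper key management.

```python
import secrets

# Toy illustration of "the data travels encrypted, the key stays in the EU".
# A one-time pad (XOR with a random key as long as the message) is used here
# purely for the sake of a self-contained, dependency-free example.

def encrypt_for_export(plaintext: bytes) -> tuple[bytes, bytes]:
    """Return (ciphertext, key); only the ciphertext leaves the exporter."""
    key = secrets.token_bytes(len(plaintext))          # key never leaves the EU
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def decrypt_with_local_key(ciphertext: bytes, key: bytes) -> bytes:
    """Only the key holder (the EU exporter) can recover the plaintext."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))
```

The importer (or anyone intercepting the transfer) holds only the ciphertext; without the key, which remains with the exporter, the data is unintelligible.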
Debbie Reynolds 17:46
Tara Taubman-Bassirian 17:47
This is why the German data protection authorities’ position is: map your data, check where it goes, and if you’re using US companies, rethink it. That’s a major step.
Debbie Reynolds 18:04
And I’ve been watching this play out for over 20 years. Safe Harbor was put in place and then invalidated, and then Privacy Shield was invalidated; I don’t know what a third try between the US and the EU would look like. Obviously, there’s a lot of trade between the countries, so there’s probably more of an economic imperative than a privacy one. But I think we just have fundamental differences in how we think about privacy. In my view, the US is commerce over privacy, and the EU is: we want to do commerce, but you have to respect privacy. It’s sort of an impasse, I feel. And the redress issue is a very serious one, because in the Privacy Shield they tried to have a method of redress within the Department of Commerce. But there are two big issues, in my view. One is that the Department of Commerce only covers certain types of companies, so it’s not almost all industries, as the GDPR is. And then also the redress: because the Department of Commerce is limited in the companies it oversees, if, for example, I have a problem with a US company while I’m here in the US, I don’t have the level of redress that the Europeans are asking for. So there really has to be a total rethink of how that mechanism could be created in a way that makes it powerful. But then, I think, that will cause a lot of people in the US to say: well, why can’t I have that too? We don’t have that right now, in the way you in Europe are thinking about redress. I would love to ask: you had written something a while back about consent fatigue, and I think this is a big issue. A lot of people are signing up for services from these companies and constantly getting bombarded with cookie notices and things like that, and some people just click through.
So give me your thoughts about consent.
Tara Taubman-Bassirian 20:43
Debbie Reynolds 23:18
Yeah, well, a lot of tech companies, because of the laws about cookies, have been eliminating cookies; they’re out there collecting data in other ways. But the law hasn’t caught up with that yet. So I think by the time these cookie cases get settled, very few companies will actually use cookies, and they will be on to whatever the next evolution in technology may be. I would love to talk with you about the right to be forgotten. This is something you’ve talked and written about a bit. For companies, it is one of the most difficult things in the GDPR to comply with, because the right to be forgotten means they have to figure out what data they have about individuals, how they remove it from the organization, and how they communicate that to the individual data subject. What are your thoughts about that?
Tara Taubman-Bassirian 24:18
On the question of knowing which data is held, and where: I think this is something very important. The GDPR is based on the accountability principle. It asks organizations to sit down and think about which data they hold, to do data mapping, to know where the data goes, who accesses it, and for how long they retain it. These are among the seven principles of data protection under the GDPR. My favourite principle is data minimization: if you don’t hold the data, you don’t have the burden of the data. If you don’t need it, delete it.
Debbie Reynolds 24:56
I agree. I agree with you on that.
Tara Taubman-Bassirian 24:58
It’s as simple as that, and it’s actually good for you: you don’t pay for storing data you don’t need. Another principle is data accuracy. What’s the point of holding data that is out of date? Your marketing is all wrong, everything is wrong, because the data is obsolete. So delete it; you don’t need it. I usually like to advise a three-level storage of data. First level: active data, with more access, including internet access. Second level: data you have to keep because you might need it, for refund reasons, tax reasons, employment reasons; you have to keep it for a certain number of years, so you have to sit down and decide your data retention periods, depending on the type of data. But limit access to this second-level storage. Third level: archiving, with no internet access and very limited access for any third parties, because sometimes, for historical reasons, you need the data. I’ve been asked: what about all this historical data of people who became famous, or were famous and visited a hotel or a museum? We would want to keep that forever. Well, you can keep it; just don’t put it online. Keep it on an external hard drive that is not connected to the Internet. You’re allowed to do that; it’s not a problem. What people have to understand is that an Internet connection, globalization, means easy access to data and easy misuse of data. But if the data is encrypted on your hard drive and not connected to the Internet, it’s safe; you can keep it as long as you want. There’s no problem with that. That’s what is good about the GDPR: it has caused organizations to sit down and think about what they’ve got, how much they are holding, who is accessing it, and how they keep it secure. It’s all common sense.
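The three-level approach above can be sketched as a simple retention policy. This is only an illustrative sketch: the tier names and retention periods are made up for the example, and real retention periods depend on the applicable legal obligations (tax, employment, and so on).

```python
from datetime import date, timedelta

# Hypothetical tiers, following the three-level idea:
#   active     - online, broad access, short retention
#   restricted - kept for legal reasons (e.g. tax), limited access
#   archive    - offline, historical, no internet exposure
TIERS = {
    "active": {"online": True, "retention_days": 365},
    "restricted": {"online": False, "retention_days": 365 * 6},
    "archive": {"online": False, "retention_days": None},  # keep offline indefinitely
}

def disposition(tier: str, created: date, today: date) -> str:
    """Decide what to do with a record: keep, keep offline, or delete."""
    policy = TIERS[tier]
    if policy["retention_days"] is None:
        return "keep-offline"               # historical archive, never online
    expiry = created + timedelta(days=policy["retention_days"])
    return "delete" if today >= expiry else "keep"
```

The point of the sketch is that deletion ("data minimization") is the default outcome once a retention period expires, and that the archive tier is kept but never exposed online.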
Debbie Reynolds 27:03
Yeah. I think, though, that with the Internet age, to use Google as an example: when Google, or Microsoft, first started giving people free email addresses, the size of your email box was relatively small; over the years, they made the boxes bigger. So, as a result, people delete less stuff because they have more room. In some ways, I feel like the Internet age has made people digital packrats: they keep so much stuff, because instead of books in a library, you can now have everything on a hard drive, and people are less apt to delete things. Also, I’ve worked a lot with corporations and lawyers, and sometimes people want to keep data just because they think they may need it, even though they don’t have a real reason to keep it. And that just piles up year after year. Especially if it’s connected to the Internet, all that old data may sit on old servers, maybe on something more vulnerable than their active systems, and hackers like that; it’s just a gateway for them to get into your organization. It also creates more risk for the company, because if you have a breach, or someone’s looking at some old stuff, you don’t even know what’s there, so now you have to go through and figure out whose data is on it. It’s just a mess. I think data retention is a big problem, where people just keep way too much stuff.
Tara Taubman-Bassirian 28:54
Indeed, I call it the Google hype. Once Google started to collect everything, and it was known that Google saves all data, storing and collecting became much, much cheaper. So people thought: let’s collect, and then we will see what we do with it. But over-collecting data causes serious security issues. Data breaches cause harm, unfortunately, and cases like Morrisons will happen again. Employees are not as devoted to their employer as they used to be; there is a much bigger turnover. Young people just change jobs, and if they’re not happy, they may want to take advantage, and it’s so easy: grab a USB stick and put the data online. So yes, data minimization is very important. And the right to be forgotten: before the GDPR, there was a right to be forgotten from the case law of the European Court of Justice, and that was more delinking than actually deleting. Sometimes you do need the Internet, and the whole world, to forget about something silly that you did one day in your life, because employers search online; everyone’s searching online. Young people today grow up with their baby pictures online; even the embryo is already online before they are born. There was a very interesting interview with the ex-Google CEO, who suggested that young people should be able to change their identity at the age of 21, because they’ve already got so many silly things behind them. But that’s not feasible.
Debbie Reynolds 30:56
We have to give people a chance to redo their life. Everyone makes mistakes. The problem with the Internet is that you just need someone with a camera next to you to make a mistake unforgettable.
Debbie Reynolds 31:12
Tara Taubman-Bassirian 31:13
We’ve had lots of heated debates this week on social media about France, which wants to pass a law forbidding anyone to record police officers at work and post the footage on social media. In my interpretation, under the GDPR we should already not be posting anyone’s pictures online without their consent. And this is very important: I really ask everyone to reconsider before posting online, and to ask the person if they’re okay with it. There are people in special circumstances. There are people in the middle of a divorce who don’t want to reveal where they are and what they’re doing. There are protected witnesses who don’t want people to know where they are, and their faces would identify them. Facial recognition software, which anyone can use, can identify them, and worse than identifying, it can produce false identifications.
Debbie Reynolds 32:13
Tara Taubman-Bassirian 32:15
So you might be identified as a criminal when you are not a criminal at all, but go and try to prove it wasn’t you; everyone is just attacking you. Take a police officer video posted on social media: it will be one extract of something that happened. You don’t know how it started, and you don’t know how it finished, right? I am not at all defending police violence. I’m just saying that the judiciary is the institution for judging, and it’s not the job of social media to judge the actions of a police officer. People can post by blurring the faces, but nothing should ever be posted with identifying pictures without asking the person. That’s very important. People have to start to understand that it has consequences. And I’ve seen it the other way around, too: a discussion with the police in my area, who gave the name of a young offender, an 18-year-old suspected of having killed someone with a knife.
Debbie Reynolds 33:27
It might be true.
Tara Taubman-Bassirian 33:29
But I want a judge to tell me what he did, and whether what he did was a crime. It’s not social media’s job. And once the local police put that on Facebook, it means the whole family is shamed. The siblings are ashamed; they go to school the next day and hear: your brother killed someone, we don’t want to talk to you anymore. Like with COVID: oh, I don’t talk to you because you’ve got a fever.
Debbie Reynolds 33:51
And then they aren’t able to talk to anyone. Exactly, exactly.
Tara Taubman-Bassirian 33:58
These are things that change our relationships because of social media. We get so many good things from it. I’m a great believer in social media; I promote e-learning, and I have a charity that promotes learning. I have nothing against the Internet or anything digital. I love technology; I’m a gadget lover. But we have to accept that in the era of social media and Internet interconnection, relationships have changed. We are not just broadcasting to our small network; when we broadcast, we broadcast to the whole world, the widest audience ever. And we broadcast it forever.
Debbie Reynolds 34:44
Tara Taubman-Bassirian 34:44
That’s the big catalyst of the Internet: it is changing all the scales.
Debbie Reynolds 34:50
Tara Taubman-Bassirian 34:51
We have to, on the other hand, be more careful with what we’re doing and what we’re saying. It’s so easy to just push a button without thinking.
Debbie Reynolds 35:00
Absolutely, absolutely. I’m like you: I love technology. I was really excited when the Internet came into existence, because I grew up without the Internet, obviously, and I’m a gadget lover as well. But I think we also have to be cognizant of the harms that can happen. You can’t be so excited about the innovation without understanding that there can be harm, so we have to really educate around that. The GDPR has been, as we know, very influential around the world; we see laws come out that take bits and pieces of the GDPR and incorporate them. So I think there’s going to be more regulation around Data Privacy, not less, in the future.
Tara Taubman-Bassirian 35:51
We now even have in China a piece of legislation similar to the GDPR coming up. So even China.
Debbie Reynolds 35:59
Yeah, even China. Right, exactly. Well, it was a pleasure to have you on, and thank you again, thanks so much.
Tara Taubman-Bassirian 36:08