LearnerPrivacy.org
GDPR - The Flaws and Foibles of Well-Intentioned Privacy Policy #004
In this episode, we look at the EU General Data Protection Regulation (GDPR). Europe has a long history of thinking more deeply and formally about privacy than the US. We start by looking at the 1995 Data Protection Directive, which laid down a great set of principles long before the widespread use of the Internet or cloud. Then around 2000, as a reaction to US world-scale applications like Google, Facebook, and Twitter, the EU developed the Safe Harbor principles, which US companies fought and eventually overturned in 2015. From the ashes of Safe Harbor, the Privacy Shield agreement emerged in 2016. The GDPR went into effect in 2018 with the intention of "getting it right". It turns out that unless a university wants to use GDPR to define a relationship with their educational technology vendors, it is pretty much ignored. Some cultures and some schools make good use of GDPR and others simply ignore it. This podcast is more about the foibles and flaws of GDPR. In an upcoming podcast we will flip and look at the regulation and where it is used and applied well. Music: Peacefully by E'S Jammy Jams
[inaudible]
Speaker 2: Welcome to the LearnerPrivacy.org podcast, episode four. My name is Charles Severance, and I'm your host for the podcast. This podcast is about GDPR, the European privacy law, and I'm going to have to do at least two podcasts about this. This first podcast is going to be a sort of critical historical perspective on where things went wrong. It's kind of clickbait, if you will, but there's a lot of good in it as well. So I'm going to end up with two podcasts: one is this critical take, and then we'll do more of a straight-up understanding of what GDPR is, what it intends to be, and how great it would actually be if it were applied well. So GDPR is the General Data Protection Regulation. It became EU law in 2016 and went into effect in 2018. It was preceded by the International Safe Harbor Privacy Principles and later Privacy Shield, which came along around 2000 and were adjusted in 2016, and before that by the EU Data Protection Directive from 1995, which is a very, very long time ago. And that's actually where we're going to start. If you think about it, in 1995 most personal records were still on paper or in mainframe computers. The web really only became popular in 1994. And the directive was focused on what governments were supposed to think about privacy with respect to their citizens. But they're good principles. Notice: subjects should know when their data is being collected, why it's being collected, and what the purpose is. They should be asked if it's okay. You ought to take that collected data and protect it; don't lose it, don't let it get out into the wild. If data is collected on you, you should be informed; they should disclose that they're pulling this data. People should be able to see what kind of data is held about them and check whether it's accurate.
I mean, this is sort of a set of human rights about privacy with respect to government. These were great concepts, but they weren't really particularly binding. The United States was part of this discussion, and they did apply these principles to some degree to government-held data on individuals. But in 1995 there wasn't a concept of a cloud, there were no cloud vendors, and the internet was really very young. So there are a lot of really good ideas in this 1995 directive, but it doesn't apply to our current time. As the cloud emerged and the internet became a lot more significant around 2000, there was a big clash between the United States view of what software, hardware, and cloud vendors were supposed to do and the European Union's expectation of privacy. And so the EU came up with some principles, mostly about what it meant when US companies would begin to serve European customers. It wasn't particularly about learning data; we're talking about 2000, and Gmail didn't even exist until 2004. So this Safe Harbor came out pretty early on, and it really anticipated the emergence of things like Google and Twitter and Facebook; it was there before they showed up. But when these world-scale vendors did show up, they looked at this as, "We're just having our way and doing whatever we want in the United States, and no one seems to care. We're having a good time sharing a lot of cat pictures. And here we come into this European culture that's thinking about privacy." And so there was kind of a clash of the Titans. Actually, it wasn't really a clash of the Titans; it was the Titans stomping on the European Union. Google, Twitter, and Facebook just sued the heck out of it.
To the point where this whole Safe Harbor thing was overturned in 2015. When that fell apart, they came up with a replacement called Privacy Shield. I think the idea was to bring in the US State Department, so the European Union was dealing with a government, the United States government, the State Department in particular, and they negotiated a deal that said, here's how US companies are supposed to behave. And it got twisted and changed. I was researching during this time, and I found a bunch of web pages about it, but it was pretty weak. If you look at the history in the Internet Archive, these pages would change from time to time. The legal documents would say "the actual law is in this web page," and then that web page just got edited. So I would say the companies ran roughshod over Privacy Shield. I think, between the time the previous framework was overturned and GDPR, they knew GDPR was coming, and Privacy Shield was just something to do for a while. But it was pretty leaky; in 2016 it was really pretty leaky. So here's this European culture that had a pretty good cultural commitment to privacy in general, and yet 20 years later, in 2015, I went to visit a Spanish university and asked them, how do you deal with the fact that one of your schools is using a US learning management system? And they showed me. You go to the system, and the first time you log in, when you're first admitted to the university, there's a popup. And the popup says, okay, we are using a US company, all your data is stored in the US, and you have no choice but to say okay. That means that this is fine, because Google, Twitter, and Facebook had just beaten the EU into submission. The EU was on defense all the time.
The US learning management system companies, even as late as 2016, could just wave their hands, put up a popup, and pretend that the EU was pretty much the same as the US. The GDPR, which became law in 2016, was in a sense the EU's view of "we're going to win this time." We're going to actually build a law, we're going to learn from all of our mistakes, and we're going to bake in all those cultural privacies and norms that we have. And so there are some really wonderful principles in it, like lawfulness, fairness, transparency, and purpose limitation: you can't just pull data you don't need. And now we're talking not about governments but about companies. Don't take any more than you need, make sure it's accurate, let people see it, let people fix it. And interestingly, there really are echoes of the 1995 directive: limit how long you keep it, make sure you keep it confidential, and make sure that we can talk to you about the data you have on us. So I think it was really designed to finally win in a legal sense against the US-based world-scale companies like Google, because the EU had lost so much so often. And a cynical view of it was, maybe this time they would lose to Google, Facebook, and Twitter, but at least they would make a bunch of money and it would hurt Google, Facebook, and Twitter financially. So instead of the law just being overturned, the companies would get fined. The severe fine is up to 20 million euros or 4% of the company's worldwide revenue, whichever is higher; for a less severe offense, it's up to 10 million euros or 2% of revenue. So you can see this thing as aimed at companies with a lot of money, and if they really want to just say "bugger off," they can pay their 4% tax to operate in the EU. All that said, the document is a good document.
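To make that fine structure concrete, here's a small Python sketch of the two fine tiers as just described (the function name is mine, and the revenue figures in the example are made up):

```python
def gdpr_max_fine(worldwide_revenue_eur: float, severe: bool = True) -> float:
    """Upper bound on a GDPR administrative fine.

    For severe infringements the cap is 20 million euros or 4% of
    worldwide annual revenue, whichever is higher; for less severe
    infringements it is 10 million euros or 2%.
    """
    if severe:
        return max(20_000_000, 0.04 * worldwide_revenue_eur)
    return max(10_000_000, 0.02 * worldwide_revenue_eur)


# A hypothetical giant with 150 billion euros of worldwide revenue
# hits the 4% tier, not the 20-million-euro floor:
print(gdpr_max_fine(150_000_000_000))          # 4% of revenue
# A small firm is still capped by the flat floor:
print(gdpr_max_fine(1_000_000, severe=False))  # 10 million euros
```

Note the "whichever is higher" structure: the flat floor only matters for companies whose revenue is small relative to it, which is part of why the law reads as aimed at companies with a lot of money.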
So look at Articles 5 through 7, which you can read online: the principles and consent; Article 17, the right to be forgotten; Article 21, the right to object to kinds of processing, like machine learning and artificial intelligence; the responsibility of the controller, which is the organization that's initially collecting and storing the data; and the processor, which is sort of a second party to this. It's not a difficult read. You can read the GDPR and it makes sense, and in an upcoming podcast we're going to go through these points one by one. Okay, the one point that I'll bring into this shorter, upfront historical version is the concept of controller and processor. The GDPR defines a data controller as the initial collector: the phone company, or Airbnb, or the university that you're attending. This is the entity that you knowingly hand your data over to. Now, they may use Amazon, or Google, or some other cloud service in the back to do that, and those are the data processors. The data processors don't have quite the same responsibility, because the end users can't go to the data processor. You can't go to Amazon and say, "Hey, Amazon, what data does Airbnb have about me?" You have to go to Airbnb, and then Airbnb will log into their Amazon servers and give you that data. So in that simple example, Airbnb is your data controller and Amazon is the data processor. Now, the data processor has a responsibility to the controller so that the controller can do their job. But as long as Amazon gives Airbnb access to the database and Airbnb can read it, then Amazon has fulfilled most of its responsibilities, as long as it doesn't lose the data, keeps it secure, et cetera.
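The flow just described, where the subject asks the controller and the controller queries its processor, can be sketched in a few lines of Python. This is a toy model of the relationship, not anyone's real API; the class and method names (DataProcessor, DataController, subject_access_request) are mine:

```python
class DataProcessor:
    """Stores data on behalf of a controller (think: a cloud host).
    It answers to the controller, never directly to the data subject."""

    def __init__(self):
        self._records = {}  # keyed by (controller name, subject)

    def store(self, controller_name, subject, data):
        self._records.setdefault((controller_name, subject), []).append(data)

    def export_for_controller(self, controller_name, subject):
        return list(self._records.get((controller_name, subject), []))


class DataController:
    """The entity the subject knowingly hands data to (think: Airbnb).
    Subject access requests come here, and the controller pulls the
    data back out of its processor to answer them."""

    def __init__(self, name, processor):
        self.name = name
        self.processor = processor

    def collect(self, subject, data):
        self.processor.store(self.name, subject, data)

    def subject_access_request(self, subject):
        # The subject cannot query the processor directly;
        # the controller does it on their behalf.
        return self.processor.export_for_controller(self.name, subject)


# Illustrative usage: Alice asks Airbnb, never Amazon.
cloud = DataProcessor()
airbnb = DataController("Airbnb", cloud)
airbnb.collect("alice", {"booking": "Paris"})
```

The point of the sketch is the shape of the arrows: the processor exposes nothing to the subject, which is exactly why the lighter processor role is attractive to vendors.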
Now, in my research for this, I found I really liked the Zillow privacy policy. Zillow, I don't think, is international like Airbnb and Google, but Zillow implements the right to be forgotten: you scroll to the bottom and there are these buttons, like, you can be forgotten, just click right here and we will forget you. Now, what's my reaction when I see that? I now have control over the Zillow data, and I have a button to get rid of it all. And my initial reaction was: don't press the button. I want Zillow to know something about me. I want Zillow to show me the things I want to see. I want Amazon to show me the things I want to see. So even if Amazon gave me a button to forget everything, I wouldn't push it. But if you don't give me the button, I'm like, what are you doing back there? Right? So I think that in the long run, if we start thinking about this, this is a good thing. The right to be forgotten doesn't mean that people want to be forgotten; if they do, then we probably should forget them. But okay, GDPR controller and processor: in teaching and learning and learning management systems, it's a little vague, right? A university is clearly a controller. What if they outsource to a US-based learning management system, cloud hosted, and that US-based learning management system uses Amazon? Well, it's pretty clear that the university is a controller and Amazon is the processor. And what if there's an LTI tool, also US-based, and that LTI tool also uses Amazon? Amazon is clearly the processor in all of this picture that I just drew. But the question is, if this LTI tool says, "Hey, log in through my front door so I can send you email messages about sales items when sales happen," they actually are becoming more of a controller, to the extent that the LMS or the learning tool develops a relationship outside of the university.
If it's simply doing the university's bidding, sticking to its knitting and doing its job at the behest of the university, then the LMS is more of a processor, which is the simpler of the roles. But if it starts to try to do lifelong learning, cradle to grave: you go to one university and then another, and it records the data; you go from grade school to university to community college, and it records all this data and builds a learning profile. Well, it just became a controller at that point. So I think the one thing GDPR has got us thinking about is this notion of being careful: there are trip wires that all of a sudden give you a lot more responsibility as a learning system. Pearson, for example, would almost certainly not just be a processor, even though they wish they were. They would probably be a controller, because they're creating a long-term relationship in terms of a learning profile. The other sort of fun, ironic, clickbaity thing is the concept of GDPR prosecution. You can't have one person sue under GDPR; the government has to sue. And so they decide which violations are worth pursuing, right? There are crimes that happen all the time and don't get prosecuted. For example, if you give your credit card to a Nigerian prince and they swindle you out of some money, you can go to your local police department and they'll acknowledge it was a crime, but they're also not going to send a bunch of people to Nigeria to find the person and get your money back. Right? So it's not a question of whether there was a crime or not; it's a question of whether it's going to be investigated and prosecuted. And if you think about GDPR, its whole purpose was to go after the big companies, Google, Facebook, all the things the EU had failed to get control over. Things like learning management systems just fly under the radar.
And so even in the post-GDPR world, learning management systems aren't the reason GDPR was made. There's a GDPR enforcement tracker, and it's kind of funny: go search for the word "university," go search for the word "learning," go search for "learning management system," go search for "LMS." You'll find zero things about LMSs. You'll find zero things about Canvas, Blackboard, Sakai, or Moodle. Nothing. You will find one university that got in trouble because a student protested and the university essentially doxxed them, in that the university released their private data, and they got in a lot of trouble for that, but it wasn't even in a learning context. So that's the one case I could find about universities. The tracker also shows you results by country, and England and Spain just culturally seem to have a lot more violations than other countries. Then take a look at the kinds of issues that merit investigation and prosecution, and you'll find that somebody in a party store had a CCTV camera pointed out the front door and could see cars going by on the film, which is gathering more data than you need, and there's a 500-euro fine for that. I'm like, okay, all this LMS data is going to the United States and no one cares, but we're going to bust some dude in his little party store. Probably because the prosecutor got bad service at the party store: "Look, man, I'm the local GDPR prosecutor. You can either refund my money on this bad piece of pizza, or I see that camera behind there, it's pointed at the street, and you're going to get a 500-euro fine. It's up to you, buddy." So it just seems almost like a joke, right, the things that they prosecute for? Just go through the tracker. I think it's a great law, I love reading it, but in its application, Google and those folks don't get hit by it. It's early days, but it's not all that great.
I know about one university, I won't mention their name or even their country, but they were doing some due diligence on GDPR. They asked their US LMS vendor, "You've got GDPR going on? You're all set?" And the vendor said, "Our lawyers say we're good." And the university said, "Great, that's enough for us. Let's buy your product." There is no auditor. There's nobody that hunts this down. If this university pretends that GDPR is not a problem, then it is not a problem, right? The key is that the university is who has to decide to comply with the law. So if we look at the past 25 years of privacy efforts, starting in 1995, it's full of good intentions and full of great words. I mean, the first document in 1995, I just love the words of it, right? But it couldn't be more leaky. There are schools and universities that actually care, and what they get to do is have a wonderful conversation. And so all the good news that we'll talk about is how activist universities in countries like France or Germany or Ireland look at GDPR and say: this is the beginning of the conversation with our vendors. We want this privacy. We don't have to be forced into it. We don't think of this as an annoyance. This is the beginning of a conversation. So I'm happy, because I'm starting to see conversations in the circles that I play in that say, let's look at GDPR as, how about we just figure out how to do a great job of this? Not just, what did our lawyers tell us the popup had to say, but wait a sec, there's a lot of good here. And the other thing that I think is optimistic about the GDPR is the extent to which it influences the California Consumer Privacy Act of 2018.
And I'm noticing in the circles that I'm cruising around in that LMS vendors are a little more afraid of, or respectful of, the California Consumer Privacy Act than they are of GDPR. They think they can get away with some kind of whitewashing of GDPR and move on. Whereas the California Consumer Privacy Act, think about it: in this battle between the European Union and the Silicon Valley giants, Europe is kind of far away from Silicon Valley, and so Silicon Valley just stomps on Europe. But California knows about Silicon Valley. California understands the US companies. California is a pretty progressive place, and they're pretty close. And so I think we're going to find in time that the California law will be far more effective, ultimately, even though it's going to take a while.
Speaker 3: The California law definitely
Speaker 2: builds on GDPR from a philosophical point of view. And all the work in Europe really has laid tremendous groundwork for privacy going forward. If you use it wisely, it's a great law, but it just doesn't seem to be used wisely across the board, and people aren't scared of it, so they don't really comply if they don't want to comply. So we're going to talk about this more in upcoming podcasts: GDPR, what it is, how it works, how it can be used well, and the California Consumer Privacy Act and how that's a yet better step in the right direction, with more appropriate teeth. So we'll see you then. Cheers.
Speaker 1: Thank you for listening to this episode of the LearnerPrivacy.org podcast.