Last week, I stumbled onto a blog post from Clever announcing its new “Clever Badges” for students in Grades K-2. Designed to make it easier for younger students to access the edtech apps they use in the classroom, the badge replaces passwords with a laminated QR code that a student can flash at the camera on their machine to login.
In its own words,
“With Clever Badges, students simply flash their Badge and get access to all of their personalized learning applications. Our goal is to take something known as cumbersome and inefficient and make it fast, easy, and even fun! And best of all, Clever Badges significantly raises the bar for security and privacy in most classrooms.”
As I watched the video they’ve made to introduce this feature and read through the rest of the post that touted this as a win for student privacy and security, I was struck by how much more education needs to be done on those topics in edtech. In 2016, it shouldn’t be possible for technology companies of any sort to confuse accessibility and convenience with security and privacy, but here we are.
Once upon a time, the engineers who built websites realized they needed a way to prove that people were who they said they were when they attempted to log in. To authenticate a user, websites began asking for a username paired with an authentication factor; when correctly combined, those credentials granted the user access.
Alongside a username, you (the user) would have to enter one of the following:
Something you have, i.e. a possession factor like a hardware key or QR code badge.
Something you know, i.e. a knowledge factor like a password, PIN, or challenge question answer.
Something you are, i.e. an inherence factor like a fingerprint or an iris that can be scanned.
More often than not, passwords were the most practical authentication method for the average web user. Thirty or forty years later, however, web users know passwords as a huge pain to manage, and a problem that everyone is struggling to solve. The best passwords for securing our computers are absolutely unusable by human standards, and the best passwords for humans are weak and crackable within very short amounts of time by dictionary attacks and password crackers like John the Ripper, Hashcat, and many others. Over the past few years, deficiencies in password storage and password security have led many consumers and enterprises to adopt password managers (like 1Password) and biometric authentication like Touch ID and fingerprint scanners. Those same weaknesses have also led to increased adoption of multi-factor authentication. In many cases, security experts are now striving to build systems where passwords are not used or transmitted at all in the verification process, and they consider these password-less systems to be incredibly strong. (We’ll get into the numbers behind strength and entropy in another post, if necessary.)
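To make “strength” a little more concrete: a randomly generated password’s theoretical entropy is its length times log2 of the alphabet it is drawn from. A minimal sketch in Python, with hypothetical password schemes for illustration (nothing here describes any particular product):

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Theoretical entropy of a randomly generated password:
    length * log2(alphabet_size) bits."""
    return length * math.log2(alphabet_size)

# An 8-character all-lowercase password: roughly 37-38 bits,
# well within reach of tools like Hashcat against fast hashes.
weak = entropy_bits(26, 8)

# A 16-character password drawn from ~95 printable ASCII characters
# (the kind a password manager generates): roughly 105 bits.
strong = entropy_bits(95, 16)

print(f"8-char lowercase: {weak:.1f} bits")
print(f"16-char printable ASCII: {strong:.1f} bits")
```

Note that the formula only holds for randomly generated passwords; human-chosen passwords have far less effective entropy, which is exactly why dictionary attacks work so well.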
In the case of its badges, Clever is attempting to replace the password (something you know) with a QR code (something you have). For its intended audience of 5-7 year olds, this makes sense: kids struggle with typing and credentials in classroom settings no matter how old they are, and the struggle is twice as tough for children who haven’t yet figured out what letters are or don’t yet have a strong grasp on reading. This daily struggle is a huge burden on teachers, who in response have come up with numerous insecure workarounds. More often than not, teachers turn to writing passwords down in huge lists, using weak password formulas, or encouraging password reuse among students— all things that Clever positively reinforces by referring to them as “creative” in its launch materials.
On their own, Clever Badges are solving a real problem for kids, an important problem that eats up minutes of valuable classroom time every day. Any tool that gives kids access to technology that is otherwise prohibitively difficult for them to use is a great thing. But is this tool “significantly raising the bar for security and privacy” the way its makers claim?
Security + Privacy in the Classroom
Converting a “something you know” to a “something you have” isn’t necessarily a security improvement— when you rely on something you HAVE as an authentication mechanism, you must protect and secure that something to prevent unauthorized access to whatever it unlocks. With badges, removing the password doesn’t increase security: it just foists security onto another authentication mechanism that has different security challenges.
In the video on Clever Badges, the basic security requirements of protecting the new “something you have” are very visibly not being met in a real classroom setting. The student badges in question hang from a hook on the doorframe, accessible to anyone who can reach them. Without proper privacy and security practices in place for storage, it is easy for students to get at their badges, mix them up, or swap them with one another to log into each other’s accounts. Additionally, badges could easily be reproduced by photograph or copier, and then used to log in without a teacher’s knowledge. This failure to appropriately store and secure the somethings you have and the somethings you know presents a privacy issue in any classroom where badges are used— especially if a teacher isn’t on top of classroom management, hasn’t stored the badges out of the kids’ reach, or is unaware these issues exist at all. It is even more unbelievable that the launch video itself models an insecure practice, showing badges clearly labeled with names that identify who they belong to, all kept in the same place together in the open.
If we know anything about kids, we know that they’re smart— and kids are smart enough to know how to swap badges and complete each other’s work. The overall behavior is not new, but in the age of technology, the devices and apps in use in the classroom make it easier than ever to get away with. The only thing keeping kids from getting away with swapping their badges— “I hate math, and you like it. Will you trade badges so I do your science and you do my math?”— is a vigilant teacher who knows exactly what her students should be working on in lab time. If we know anything about teachers, we know that they’re faced with growing class sizes and inadequate classroom support. Leaving important security and privacy practices up to teachers to figure out themselves is not okay, especially since educators are almost never given any kind of formal security and privacy education in their professional training.
A win for access, not privacy or security
In the case of Clever Badges, there is a gain in accessibility for kids— but there are no security benefits in converting a weak “something you know” into a physical “something you have” that isn’t being secured in the classroom. The tradeoffs in this conversion from “know” to “have” are many, and once they are accounted for, the gains and losses in security and privacy largely cancel each other out. As much as I respect the work that Clever is doing in schools, this new product is not a security upgrade or a major privacy coup for schools. Clever Badges are a usability feature that gives greater access to young kids who are building literacy and computing skills.
As a professional with a vested interest in passwords, security education, and student privacy, I encourage Clever to revisit the claims it is making about the privacy and security of its latest product. There are myriad technical questions that the badges bring up, but few to no answers are readily available.
- What threats are badges designed to protect against?
- How does revocation work if a badge is lost, stolen, or damaged?
- What data is inside of the QR code?
- How many bits of entropy exist in the QR code?
- How are the QR codes generated? And is generation done in a way that would decrease the likelihood of collision?
- What kind of security testing and auditing, if any, have the badges gone through?
- Furthermore, these badges are teaching lifetime security skills to students. What happens when students are given a hardware token as a second factor for authentication in the future, especially if they were not taught to secure something they have?
In the spirit of transparency — on which strong security and privacy rely — Clever must answer the technical questions raised by this new product offering. To do so effectively, it must publish a white paper with a high-level overview of the badges’ architecture, and share the usability research it conducted while developing them. To back up its privacy claims, Clever should publish a privacy impact assessment that gives external experts working on these issues the ability to investigate those claims. For the educators and students who rely on Clever’s technology, it should publish best practices for educators regarding the privacy and security of badges. Without those things, there is no basis upon which schools can evaluate Clever’s claims or trust Clever’s product with young children’s data, especially given the unsubstantiated claims made in its marketing materials and the insecure practices it has modeled for educators everywhere.
Disclosure: I’ve known Dan, Tyler and Raf since Clever’s APIs were in stealth mode in 2012, and have had a discussion with Dan Carroll, Alex Smolen and Ben Adida about my concerns regarding the claims they make about badges. While Clever chose to record the call, I chose not to and am not sharing anything from the conversation that took place last week.
Thanks to Bill C., Rachel, Ryan, Bill F., Viv, Jeff and Julie for reading through my earlier drafts of this post and providing invaluable feedback. If you’re an educator reading this post, please take a look at this crash course on privacy + security for your classroom.
3 thoughts on “Privacy, accessibility and student data security: An Analysis of Clever Badges”
Absolutely right, but of course passwords aren’t safe either if students want to share the burden of work. And it’s harder to share a QR code without some level of tech or physical access.
I absolutely agree with your assessment that it is not doing what they say it is.
Surely it would be smarter to have a username and then use the QR as the password replacement, rather than just “show the card to log in”? The kids must surely be capable of typing their name – my 5yo can read and type nearly anything!
I commend your commitment to improving security education – something we need more of. I’d like to offer an opinion from the UK – I work in the edtech space, specifically on identity services. I’ve visited a lot of classrooms to understand how users (students and teachers) authenticate to devices and services. For younger students (under 8yrs), I see three practices almost universally: 1) the password is the same as the username, 2) passwords are simple and all identical, 3) simple per-user passwords are written on a noticeboard.
These young student users are accessing low stakes services – simple curriculum resources (maths, english, science), perhaps email (walled garden), maybe collaboration tools (Google Apps and Office 365). Traditional IT services in schools have all been on-prem, with no access over the internet. This has given an element of physical security to the services – you have to be on-site to authenticate. With online services, the perceived threat is not from within the classroom or even the school, but from remote, external actors brute forcing entry points.
I think that the traditional on-prem approach of IT in schools, plus the perceived low value of the services being used has led to the use of weak passwords. No vendor wants to force schools to change this – usability vs security.
So let’s consider the threats. I would suggest that Clever Badges do mitigate attack from remote, external actors (assuming strong implementation) but do nothing to solve the local, in-class attack. Therefore I would consider it a small step forward.
Not trying to excuse any misinformation, just looking to shed some light on the problems today. Clever should be really clear on the benefit Badges give, rather than using the blanket phrase “Clever Badges significantly raises the bar for security and privacy in most classrooms”.
Honestly, what is most alarming here is that we cannot have a constructive conversation without having to guess what the intentions were… we have absolutely no idea what threats Clever is trying to defend against– something that would be unconscionable from a company working on single sign-on tools and authentication methods. If we had a better idea of how much entropy exists within the codes, and whether that is a plausible defense against off-premises attackers, we could get somewhere. But without a threat model or any kind of documentation, here we are, totally in the dark. And security by obscurity is not security at all.