I tend to get excited about things. One of the key problems I have when writing – blogs, articles; books will probably be even worse – is that my writing tends to wander to whichever dog has a puffy tail at the moment, and I sometimes look back wishing each piece had been tighter and more single-minded.
Take my post last week. Right now I’m excited about Firefox security UI, and about how to do a better job with the way we give users information. This is a good thing for me to be excited about, since it pays my bills. But I want to engender conversation about it, and to build context around my thoughts on the matter, and meandering isn’t necessarily the best way to do that.
So. This is the first of two posts I will write in the next week or so about this stuff. The goal is to outline:
- The way things are, and why we need to change them
- My thoughts on where we need to be looking to go
This is the first. What are we, as browser builders, doing for the user today when it comes to security UI?
The most iconic (ha!) of browser security indicators is the little padlock icon that appears on “safe” websites. It can be in the address bar…
… or the status bar…
… and is one of the few really universal indicators (please ignore the monkey.)
It, and its predecessor, Netscape’s key icon, try to tie a complicated statement about web site encryption to a concrete metaphor, a lock and key. The good thing about this indicator is that it has some history of user education behind it, and it’s a relatively easy concept to understand.
The padlock has a lot of problems, though. First of all, it is misleading; it doesn’t mean “safe” at all. The padlock appears when a website presents a valid SSL certificate, issued by a company that your browser thinks is trustworthy. But the bar for getting one of these can be as low as $10, and the validation the companies do varies from excellent to non-existent. Even back in 2005, there were over 400 phishing attacks using SSL. So clearly, the padlock is not equivalent to safety.
Moreover, as with some of the other cues I discuss below, the padlock has no anti-padlock equivalent. That is, if the padlock is meant to signal safety, then the possibility for danger is indicated by… nothing. Users are expected to notice the absence of an indicator. There is a wealth of data to back up the claim that users are very bad at using the absence of something as a behaviour modifier.
Finally, the padlock’s positioning is pretty weak from a usability point of view. Putting it in the address bar was a step in the right direction when it comes to associating the cue more strongly with the page you are viewing, but it is still a small, peripheral indicator that is only helpful to those who know, and remember, to check regularly. An intrusive indicator could be worse, if it caused people to disable it completely, but if your cue is invisible to most users, it might as well not be there at all.
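To make concrete what the padlock does – and doesn’t – assert: it boils down to “this site presented a certificate that chains to a trusted root, and the certificate’s name matches the hostname.” Here is a toy sketch of the name-matching half; the `names_match` function is hypothetical and handles only exact names and a simple leftmost wildcard, not the full rules real browsers implement:

```python
# Toy sketch of the hostname-matching half of the padlock check.
# Handles only exact labels and a simple leftmost wildcard, to show
# what the padlock does -- and does not -- actually assert.

def names_match(cert_name, hostname):
    cert_parts = cert_name.lower().split(".")
    host_parts = hostname.lower().split(".")
    if len(cert_parts) != len(host_parts):
        return False
    for cert_label, host_label in zip(cert_parts, host_parts):
        # "*" matches any single label; everything else must be exact.
        if cert_label != "*" and cert_label != host_label:
            return False
    return True

# The certificate says who the site is, not whether it is honest:
assert names_match("*.example.com", "www.example.com")
assert not names_match("*.example.com", "www.evil.com")
# A phisher with a valid cert for their own domain still gets a padlock:
assert names_match("paypa1-secure.com", "paypa1-secure.com")
```

Note what is missing from the check: nothing in it says anything about the site’s honesty. A phisher with a $10 certificate for their own domain passes it just as cleanly as your bank does.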
Address Bar Decorations
More recently, a lot of attention has been paid to the various ways to use the address bar as an indicator. The flagship example of this is IE7’s “green bar”.
The idea with the green bar is to call out sites which have gone through the extra trouble to obtain an “Extended Validation” certificate for their site. This standard is still being drafted but MS went ahead and included support for this cue anyhow, since standards bodies rarely line up with product release schedules. Mozilla, in a similar vein, turns the address bar yellow to supplement the padlock icon on encrypted sites.
I was in New York a couple of weeks ago, with other browser vendors and certificate authorities, and let me tell you, there is a lot of interest in whether or not we’ll start shipping a green bar. A consistent user experience around these things is important, so they’re not wrong to want to know where we stand.
But the green bar has most of the same problems that the padlock did. Like the padlock, it is misconstrued to mean safety when it oughtn’t. “Green means go” has become the press spin on the green bar, but just like the padlock, it’s making a statement about encryption and identity of the website, not a statement about whether they have honest business practices or protect your personal information.
Like the padlock, it expects users to notice its absence on sites without EV certs, leaving the address bar white in those cases. In fact, Microsoft turns the address bar yellow on “suspicious” web sites, in delicious contrast to Firefox turning it yellow on encrypted sites.
Finally, affirmative address bar decorations are spoofable. In a now rather infamous study, Microsoft Research found that the green bar actually made users more susceptible to a particular kind of phishing attack called a picture-in-picture attack. A clever attacker can use various doctored images to make it look like there’s an IE window within the real IE window, recreating the toolbars and menu items and, oh yes, creating a green address bar in the process. Users trained to look only for the green bar are easily fooled by this deception, in part because the real IE window isn’t offering any counter-indicating cues. Just the absence of the green bar.
When I say that the padlock isn’t about safety, I mean it. We can’t really be in the business, as browser vendors, of telling users whether a site is absolutely safe or not. What we can do is arm them with the information to make their own decisions. Not “AES encrypted with a 256 bit key,” which is so impenetrable to most users as to constitute a total lack of information, but something. In fact, we already do this. If you click on a padlock icon or right-click on a page, you can get to the page’s security information. It looks like this (at least, on a Mac):
I don’t have much to say here except that we can do better. I don’t mean it as a knock on the developers of this code – I know it’s intended to be for advanced users, and that if you dig into it, it really does provide complete information about the certs you’re working with.
But it’s a missed opportunity. Page info, particularly the security tab, should be giving you information to help you make security judgements. Have you been to this site before? Do you have a stored password for this site? My mom might or might not ever check for that information, but on the off chance that she did, I am certain it would be more helpful to her than “(RC4 128 bit)”.
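As a sketch of the kind of page info I’m asking for – the function name, phrasing, and inputs here are entirely hypothetical – a summary could be assembled from facts the browser already tracks:

```python
# Hypothetical sketch: a page-info summary built from facts the
# browser already has (history, saved passwords, connection state),
# phrased for people rather than cipher suites.

def security_summary(visit_count, has_saved_password, is_encrypted):
    lines = []
    if visit_count > 0:
        lines.append(f"You have visited this site {visit_count} times before.")
    else:
        lines.append("You have never visited this site before.")
    if has_saved_password:
        lines.append("You have a saved password for this site.")
    lines.append("Your connection to this site is encrypted."
                 if is_encrypted else
                 "Your connection to this site is not encrypted.")
    return lines

# A first visit to an encrypted site reads very differently from a
# hundredth visit to your bank -- which is exactly the point.
summary = security_summary(visit_count=0, has_saved_password=False,
                           is_encrypted=True)
```

“You have never visited this site before” is a judgement my mom can act on; “(RC4 128 bit)” is not.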
In more recent versions of most browsers, things have gotten a little better. Most major browsers now ship with support for anti-phishing protection that looks something like this:
Anti-phishing has a lot of things going for it, as a piece of security UI. It’s attention-getting. It’s very difficult to spoof, since it crosses between content and chrome in a visible way (although presumably an attacker wouldn’t want to spoof a warning message anyhow). Even the iconography is better because, unlike the padlock’s misleading signal, the red-circle sign means DO NOT ENTER in the real world, and means the same thing here. It has some minor bugs, but is mostly a good UI.
The big objection raised against it is that blacklist UIs are only as good as the blacklist, and it guarantees you’re always behind the times. We’re pretty happy with it nonetheless, but it’s obviously not a complete solution.
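The mechanics behind that objection are simple enough to sketch – this assumes a locally cached set of known-bad hostnames, and the entries are invented for illustration (real systems like Google’s Safe Browsing distribute hashed URL prefixes and update them continually):

```python
# Minimal sketch of blacklist-based anti-phishing: check a URL's
# hostname against a locally cached list of known-bad sites.
from urllib.parse import urlsplit

BLACKLIST = {
    # Hypothetical entries for illustration only.
    "phish.example.net",
    "login-verify.example.org",
}

def is_suspected_phish(url):
    # Normalize to the hostname and look it up in the cached list.
    host = urlsplit(url).hostname or ""
    return host.lower() in BLACKLIST

assert is_suspected_phish("http://phish.example.net/account")
assert not is_suspected_phish("https://www.mozilla.org/")
```

The objection falls straight out of the sketch: a phishing site registered five minutes ago isn’t in `BLACKLIST` yet, so the check waves it through.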
So. That’s where we’re at. A lot of our cues miss the mark in some important ways. We need cues that resist spoofing, that are clearer about the kind of information they provide, and that provide it in ways that are meaningful. I have ideas here, so do lots of other smart people, and those ideas will no doubt evolve heavily in the coming months.
That’s part 2.