06
May 08

About Larry

Blue Larry

I’ve been meaning to write a post like this for a while, and maybe I still will, but in the meantime Deb has done a great job of introducing the world to Larry.  Her writing is enviably clearer than my own, so you should go check it out right now.

I bet she’d love it if you gave her some digg love, too.

[Killing comments on this one to reduce forking/repetition – take ’em to digg or Deb]


16
Apr 08

Security UI in Firefox 3plus1

We’ve made a lot of changes (and more importantly, a lot of positive progress) in security UI for Firefox 3.

We have built-in malware protection now, and better phishing protection.  We have a password manager that intelligently lets you see whether your login was successful before saving, instead of interrupting the page load.  We have gotten rid of several security dialogs that taught users to click OK automatically, unseeingly.  We have OCSP on by default.  We have a consistent place in the UI now where users can get information about the site they are visiting, including detailed secondary information about their history with the site; all of which are first steps in a long road towards equipping users with more sophisticated tools for browsing online, by taking advantage of habits they already have, and things we already know.  All the people who worked on this stuff know who they are, and I want to thank them, because it sure as hell wasn’t all me.

With Firefox 3 in full down-hunker for final release (and with conference silly season upon us) though, I’ve started to get serious about thinking through what comes next.

Here’s my initial list of the 3 things I care most about; what have I missed?

1. Key Continuity Management

Key continuity management is the name for an approach to SSL certificates that focuses more on “is this the same site I saw last time?” instead of “is this site presenting a cert from a trusted third party?”  Those approaches don’t have to be mutually exclusive, and shouldn’t in our case, but supporting some version of this would let us deal more intelligently with crypto environments that don’t use CA-issued certificates.

The exception mechanism in Firefox 3 is a very weak version of KCM, in that security exceptions, once manually added, do have “KCM-ish” properties (future visits are undisturbed, changes are detected).  But without the whole process being transparent to users, we miss the biggest advantage to this approach.

Why I care: KCM lets us eliminate the most-benign and most-frequently-occurring SSL error in Firefox 3.  Self-signed certs aren’t intrinsically dangerous, even if they do lack any identification information whatsoever.  The problem is that case-by-case, we don’t have a way to know if a given self-signed cert represents an attack in progress.  The probability of that event is low, but the risk is high, so we get in the way.  That’s not optimal, though.  When the risk is negligible, we should get out of the way, and save our warnings for the times when they can be most effective.
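
To make the mechanics concrete, here’s a minimal Python sketch of the fingerprint-pinning flavour of KCM: remember the certificate you saw on the first visit, stay quiet while it stays the same, and only raise a flag when it changes.  The file name and host below are made up, and this is nothing like Firefox’s actual exception store; it’s just the shape of the idea.

```python
# Minimal sketch of fingerprint pinning: remember the certificate seen on the
# first visit, and only complain when it changes later.  Purely illustrative;
# this is not how Firefox stores its security exceptions.
import hashlib, json, socket, ssl
from pathlib import Path

PIN_FILE = Path("seen_certs.json")       # hypothetical local pin store

def cert_fingerprint(host, port=443):
    """Fetch the server certificate (trusted or not) and hash it."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE      # we want the raw cert, CA-signed or not
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
    return hashlib.sha256(der).hexdigest()

def check_continuity(host):
    pins = json.loads(PIN_FILE.read_text()) if PIN_FILE.exists() else {}
    fingerprint = cert_fingerprint(host)
    if host not in pins:
        pins[host] = fingerprint         # first visit: remember it quietly
        PIN_FILE.write_text(json.dumps(pins))
        return "first visit, nothing to compare against"
    if pins[host] == fingerprint:
        return "same key as last time, carry on"
    return "key changed since last visit -- this is the moment to warn"

print(check_continuity("intranet.example"))      # hypothetical self-signed host
```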

2. Secure Remote Passwords

Secure Remote Password protocol is a mechanism (have some math!) for allowing a username/password-style exchange to happen, without an actual password going out along the wire. Rob Sayre already has a patch.  That patch makes the technology available, but putting together a UI for it that resists spoofing (and is attractive enough that sites want to participate) will be interesting.
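
For the “have some math” crowd, here’s a toy Python walkthrough of the SRP idea (in the spirit of SRP-6a, not a faithful or secure implementation, and not Rob’s patch): the server stores only a salted verifier, and the login exchange lets both sides derive the same secret without the password ever crossing the wire.  The group parameters and “random” values are deliberately tiny placeholders.

```python
# Toy walkthrough of the SRP idea (in the spirit of SRP-6a), showing that the
# password never crosses the wire.  All parameters here are deliberately tiny
# placeholder values; a real deployment uses a large safe prime and random
# ephemeral secrets.
import hashlib

N = 23   # toy safe prime (real SRP groups are 1024 bits and up)
g = 5    # generator
k = 3    # multiplier parameter (SRP-6a derives this as H(N, g))

def H(*args):
    data = b"|".join(str(a).encode() for a in args)
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

# --- Registration: the server stores a salted verifier, never the password ---
salt = 1234
password = "hunter2"
x = H(salt, password)
verifier = pow(g, x, N)          # server keeps (salt, verifier)

# --- Login: only A and B (plus key-confirmation proofs) go over the wire ---
a, b = 6, 15                     # ephemeral secrets; random in real life
A = pow(g, a, N)                                 # client -> server
B = (k * verifier + pow(g, b, N)) % N            # server -> client
u = H(A, B)                                      # scrambling parameter

# Client knows the password and recomputes x; server only knows the verifier.
S_client = pow((B - k * pow(g, x, N)) % N, a + u * x, N)
S_server = pow(A * pow(verifier, u, N) % N, b, N)

assert S_client == S_server      # both ends derive the same session secret
print("shared secret:", S_client)
```

Nothing in that exchange contains the password itself, which is the property the post cares about.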

Why I care: SRP is not the solution to phishing, but it does make it harder to make use of stolen credentials, and that’s already a big deal.  It also has the happy side effect of authenticating the site to you while it’s authenticating you to the site.  I wouldn’t want this useful technology to get stuck in the chicken-egg quagmire of “you implement it first.”

3. Private Browsing Mode

This is the idea of a mode for Firefox which would protect the user’s privacy more aggressively, and erase any trace of having been in that mode after the fact.  Ehsan Akhgari has done a bunch of work here, and in fact has a working patch.  While his version hooks into all the various places we might store personal data, I’ve also wondered about a mode where we just spawn a new profile on the spot (possibly with saved passwords intact) and then delete it once finished.
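
As a rough illustration of the throwaway-profile variant (and only that; Ehsan’s patch works very differently, hooking into the individual data stores), something like this Python sketch captures the flow.  The -no-remote and -profile flags are real Firefox command-line options; assuming a firefox binary on the PATH is the only other ingredient.

```python
# Rough sketch of the throwaway-profile idea: make a fresh profile, run the
# browser against it, and erase it afterwards.  -no-remote and -profile are
# real Firefox command-line options; the rest is illustrative glue, and it
# assumes a firefox binary on the PATH.
import shutil, subprocess, tempfile

profile_dir = tempfile.mkdtemp(prefix="ff-private-")
try:
    # One of the open questions in the post: optionally seed the temporary
    # profile with saved passwords here.  By default it starts completely clean.
    subprocess.run(["firefox", "-no-remote", "-profile", profile_dir])
finally:
    shutil.rmtree(profile_dir, ignore_errors=True)   # erase every trace
```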

Why I care: Aside from awkward teenagers (and wandering fiancés), there are a lot of places in the world where the sites you choose to visit can be used as a weapon against you.  Private browsing mode is not some panacea for governmental oppression, but as the user’s agent, I think it is legitimately within our scope (and morally within our responsibility) to put users in control of their information.  We began this thinking with the “Clear Private Data” entry in the tools menu, but I think we can do better.

(And also…)

Outside of these 3, there are a couple things that I know will get some of my attention, but involve more work to understand before I can talk intelligently about how to solve them.

The first is for me to get a better understanding of user certificates. In North America (outside of the military, at least) client certificates are not a regular matter of course for most users, but in other parts of the world, they are becoming downright commonplace.  As I understand it, Belgium and Denmark already issue certs to their citizenry for government interaction, and I think Britain is considering its options as well.  We’ve fixed some bugs in that UI in Firefox 3, but I think it’s still a second-class UI in terms of the attention it has gotten, and making it awesome would probably help a lot of users in the countries that use them.  If you have experience and feedback here, I would welcome it.

The second is banging on the drum about our mixed content detection.  We have some very old bugs in the area, and mixed content has the ability to break all of our assumptions about secure connections.  I think it’s just a matter of getting the right people interested in the problem, so it may be that the best way for me to solve this is with bottles of single malt.  Whatever it takes.  If you can help here, name your price.
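
For anyone new to the problem, “mixed content” just means an https:// page pulling some of its subresources over plain http://, which quietly undoes the guarantees of the secure connection.  A back-of-the-envelope Python sketch of the detection (real detection has to happen in the browser’s content loader, not by scraping markup after the fact) looks something like this:

```python
# Back-of-the-envelope illustration of the problem: flag subresources that an
# https:// page pulls in over plain http://.  A browser does this inside its
# content loader, not by scraping markup, so this only shows the shape.
from html.parser import HTMLParser

# Which attribute carries the subresource URL for the tags we care about.
SUBRESOURCE_TAGS = {"script": "src", "img": "src", "iframe": "src",
                    "embed": "src", "link": "href"}

class MixedContentFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        wanted = SUBRESOURCE_TAGS.get(tag)
        if not wanted:
            return
        for name, value in attrs:
            if name == wanted and value and value.startswith("http://"):
                self.insecure.append((tag, value))

# Pretend this markup arrived over an https:// connection.
finder = MixedContentFinder()
finder.feed('<script src="http://cdn.example.com/lib.js"></script>')
print(finder.insecure)    # [('script', 'http://cdn.example.com/lib.js')]
```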

Obviously I’ve left out all the tactical fixup work on the UI we already have.  We all know that those things will need to happen, to be re-evaluated and evolved.  I wanted to get these bigger-topic thoughts out early, so that people like you can start thinking about whether they are interesting and relevant to the things you care about, and shouting angrily if they aren’t.


17
Mar 08

Should Malware Warnings have a Clickthrough?

In the latest nightly builds of FF3, and in the upcoming Beta 5, we let users choose to ignore our phishing warning, and click through to the site, just like they could in Firefox 2:

Ignore this Warning

But that same spot is empty in the malware case (unless you install my magic extension).  Should it be?  It’s a harder question than it seems at first blush.



23
Jan 08

Being Green, easiness of

As of today’s nightly Firefox build, we’ve turned on EV support and activated the Verisign EV root for testing purposes.  What this means is that when you go to sites that have Verisign-issued EV certificates like, say, British Airways, the site-identity button (shall we call it Larry? Yes. Let’s.) will pick up the name of the site owner, all green-like.

I rather suspect this might startle a few of you.

Larry on British Airways

I’ve talked a lot about identity and security in Firefox 3, but some of the actual changes were easy to ignore if you weren’t looking for them.  The site button has been around for a while, with Larry telling you what he knows about a site, but you could choose not to click on him, not to get that information.  A while ago, I mentioned a way to get the EV behaviour ahead of schedule, if you wanted to test, but now those steps are no longer necessary.

So things are going to feel a little weird for a few days.  There are about 4000 EV sites these days (the AOTA has a pretty long list) so you will probably hit a few, and it will probably feel weird.  By all means, open bugs.  The whole reason we’re doing this is to get more sunlight on the code, because it’s required weird custom builds and secret handshakes for too long.

The story goes that when London first introduced street signs, there was significant protest.  They were gaudy, the argument went, and anyhow the locals already knew where they were going.  Many streets in London still don’t have them.  I’m excited about getting feedback into the UI to help users know better who they’re dealing with online, help them orient themselves, and rebuild some of the cues that we all take for granted in the real world.  But like the London signposts, I suspect it’ll take some getting used to.  Especially on Proto. Where it currently looks, as Shaver so eloquently puts it, like the South end of a North-facing horse.


10
Jan 08

Standardizing UI, and other Crazy Ideas

Decision making, by nerovivo

Standards make the web go ’round.  I hope it doesn’t come as too much of a surprise that Mozilla cares a lot about standards, or that a significant percentage of the community, myself included, participate in active standards groups, be they W3C, WHATWG, industry consortia, or other.

They are often, to be honest, a slog.  Anything important enough to be standardized is important enough to attract a variety of interests and motivations, and being in the middle of multiple, divergent forces can be just as fun as it sounds.  They are usually noble slogs, though.  An open web needs a set of lingua francas. As it matures, people invent new creoles to express new ideas, and so our standards need to constantly evolve and add that new wealth to the growing lexicon of awesome.

A little while ago though, the W3C decided to try something sort of odd.  They formed up a working group to look at standardizing security UI.

Standardizing. UI.

To anyone who has designed a user interface, that sort of feels like standardizing art. Not that we are quite so full of hubris as to imagine ourselves Caravaggios, but UI design is a complex interplay of functionality, ergonomics, and subjective experience.  There are general principles, sure, but it’s a very different beast from, say, CSS2 margin properties, where everyone can at least agree that there ought to be a single correct result, even if they disagree about what that result should be or how to obtain it.

Nevertheless, boldly forth they have gone and established the Web Security Context working group with a pretty broad charter. Capturing current best practice is certainly fair game, but it is equally permissible for the group to try to move the state of the art forward.  We’re active members, as are Opera and Konqueror (though not Apple or MS), but like most standards bodies, the group includes folks from academia, from other companies, and from various interested groups as well.

This workgroup has put out its First Public Working Draft (FPWD), which means I have two things to ask you, or maybe ask of you.  In marketing, I believe they call this the Call to Action, so if you were looking for it, here it is!

The first thing I would ask, if you are at all interested, is that you read it and remark upon it.  The group needs public comment, and you fabulous people are ably placed to provide it.

This first draft was kept deliberately inclusive, to make sure that the majority of recommendation proposals got public airings. So if your main criticism is just “too much,” that is unsurprising, but still welcome, feedback.

The second thing is harder.

We participate in this group for all the reasons mentioned above, and I personally take that participation seriously.  Even on the sketchy topic of standardized UI, I think there’s potential. A document which all browsers conform to as a baseline guide, which says things like “Don’t let javascript arbitrarily resize windows, because it lets this spoofing attack happen,” is a valuable one.  At Mozilla, we talk about things like making the mobile web a better place, for example. One thing we can do right up front in that world is spare this new generation of browser implementors (and their users!) from rediscovering our mistakes the hard way.  This standard could help do that.

But this draft is also defining new UIs, new interactions, new metaphors for online browsing.  The academics in the group have offered to gather usability data on several proposed recommendations, but at a fundamental level, I have asked the group a couple times whether it’s right to use a standard to do this kind of work at all.  I think several of the proposed requirements sound like interesting, probably fruitful UI experiments.  But that’s not the same as “Standards-compliant user agents MUST …”

My second question is this: as members of the Mozilla community, is this an effort that you want me (or people like me) participating in, and helping drive to final publication?

I’m still engaged on the calls and the mailing list – I still see good things coming out of the group, and I have my own opinions about how to best contribute.  But as an employee of Mozilla, I feel an obligation to steward my own resources responsibly, and to expend them on things that the community finds valuable, so it’s important for me to hear how people feel about the value of this work.

Opinions? Suggestions? Funny anecdotes?


04
Nov 07

Sleepy & Happy (WTB: 5 dwarves)

sleeping polar bear

I want you to know that I’m sleeping again.

It’s not that I wasn’t before, I was.  But when you break the internet, you take on certain moral obligations vis-à-vis its restoration.  We landed bug 401575 today which gives our users a chance to override security warnings if they think they know what they’re doing.  There are people who will dislike this version just as much as the other people who disliked the first thing that landed, but that’s okay, because no one said we were finished yet.  Just like no one said we were finished last time.

I’d like to see us continuing to do better with giving users useful options when they run into a security problem.  Things that keep them away from the whatever button, whenever possible.  If we can redirect our users’ energies, judo-style, in directions that protect them from harm instead of stubbornly stopping them in their tracks, I think we can keep them safe, and happy, at the same time.  That’s why we’re still working on bugs like 402210 to help give users safe ways out, and bugs like 402207 to let us make safe choices for normal users without making power users cry.

These things, though, all of them: they are the birth pangs of something pretty amazing.

While I’ve been working on my stuff, everyone else has been working on theirs.  And I don’t know about my stuff, but their stuff is good.  We’re getting very very close to getting it all out to you; to knock on, and sniff, and generally assess, like a honeydew melon of awesomeness.  It’s really hard for me to go back to Firefox 2 now, and that’s not a knock against it – I still think it’s the best browser out there, but this new stuff?  Get ready for it.

Location bar auto-complete for example, like Jamaican blue mountain coffee, will change your world if you let it.  The new bookmarking system is an amazing platform for extension authors, and I’m pretty keen to see what happens there, but even the bits we ship in our own UI are changing the way I browse.  And the performance gains across the product are palpable.

When the beta comes out the door, if you’re brave enough to try it, don’t look for fireworks.  Our first, biggest job is to help you get to the web sites you want, so we’re not going to go to great lengths to jump up and down and grab your attention away.  But in a hundred subtle ways, things will just be nicer.

And we’re not done yet.

Postscript

I really should have just let the post end there, it was sort of a dramatic finish, but this needs saying:

I used the analogy “birth pangs” up there because it was what good analogies are: a way of situating facts or events which may be unfamiliar to readers within a context that is somehow more so.  “Honeydew melon of awesomeness” was maybe less apt, but nevertheless. Recently Tyla (and, in all fairness, Mike too) went through actual birth pangs.  The kind where you have an extra human at the end.  As analogies go, I’m not sure I do understand that context all that well.  Firefox 3 is going to be pretty awesome, but let me tell you, Claire is stiff competition for any would-be miracle.  Congratulations guys.  I promise never to mention my own sleep schedule  again.


11
Oct 07

TODO: Break Internet

So there’s this thing at Mozilla where we try not to break the internet.  Call us wacky, but it seems like a bad play.  And so Rob Sayre is right to be a little miffed when it looks like we’ve done exactly that.  Sayre is often right, in fact, it’s his thing that he does.

Backstory
The web has this technology called SSL that lets you do two important things:

  1. Know who you’re talking to (because companies exist which verify this information, we’ve been over this)
  2. Talk to them in an encrypted, validated way so that no one can eavesdrop or tamper with the message
  3. Show a little padlock on your browser window

As I said, only two of them are important.

Because SSL makes these relatively useful promises, it is sort of a popular technology.  Because it’s generally important to get security things *precisely right* though, and because humans are people, there’s a lot of broken SSL out there too.

What’s “broken”?  Sometimes it means using the identification for one site on another site (because it’s cheaper/easier/faster than getting a second one).  Sometimes it means using it after it has expired.  Sometimes “broken” isn’t actually broken at all, it’s just that the site is using SSL with identification they wrote themselves, so that they’re getting promise 2 (encrypted, validated), but not promise 1 (knowing who you’re talking to).
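
If you want to poke at those failure modes yourself, here’s a small Python sketch that uses the standard ssl module as a stand-in for the browser’s verification and reports which kind of “broken” it hit.  The hostnames at the bottom are placeholders, and the string matching is tied to OpenSSL’s error messages, so treat it as illustration rather than tooling.

```python
# Small sketch of telling the failure modes apart, with Python's ssl module
# standing in for the browser's certificate verification.  The hostnames are
# placeholders, and the string matching follows OpenSSL's error messages.
import socket, ssl

def classify(host, port=443):
    ctx = ssl.create_default_context()
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return "certificate checks out"
    except ssl.SSLCertVerificationError as err:
        reason = (err.verify_message or str(err)).lower()
        if "expired" in reason:
            return "expired certificate"
        if "self signed" in reason or "self-signed" in reason:
            return "self-signed: encrypted, but nobody vouching for the identity"
        if "hostname mismatch" in reason or "doesn't match" in reason:
            return "certificate was issued for a different site"
        return "some other verification failure: " + reason
    except OSError as err:
        return "couldn't connect at all: " + str(err)

# Hypothetical hosts, named for the failure each would exhibit:
for site in ("expired.example", "selfsigned.example", "wronghost.example"):
    print(site, "->", classify(site))
```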

In the past, most browsers did a very dumb thing here:

FF2 Domain Mismatch Error

This dialog, in the hands of normal people, feels like it basically amounts to:

Snotweasel omegaforce warning

Why change such a fun and exciting system, I hear you ask?  The real problem here is that once in a while, when this kind of dialog appears, it actually might represent an actual attack.  Most of the time it’s site administrator laziness, but it’s hard to tell, and it could be a real problem; it could mean that someone has hacked your internet connection (or more likely totally controls it because you connected in some public WiFi spot like a coffee shop) and is redirecting you from your bank’s web site to their own.  When that happens, the fact that we’ve taught everyone to click OK blindly is a really bad thing, because we need you to stop and ask yourself what’s going on.

That’s a lot of backstory, if it was new to you, take a break here.  Have a cookie.

The State of Things
In Firefox 3, one of the things a lot of people were really pushing on was that we dump these dialogs, and we have.   Rob has a screenshot of what the current code does, and in case you missed it the first time, here’s another link.

Before we start talking about changing it, I want to give the crypto dudes, and in particular Kai Engert from Red Hat, a shout-out here, because (believe it or not) I think this is actually a good first step, and was a lot of work to get implemented.

So now instead of a little, cryptic dialog box with an OK button, there’s a big, cryptic error page with no OK button.   Hmm.

Firefox 3 Control Panel

People are seeing that error page, and making a couple really important points:

  1. Everything needs to be less cryptic.  Human readable would be a good start.  Bug 398718
  2. There needs to be a way to get past it so that it’s not a dead-end. (There is, of course. There’s the Add Exception dialog added in bug 387480, which people generally seem to like, but it’s buried in the bowels of advanced prefs, so bugs like 399275 argue for making it much more directly accessible).
  3. You’re (excuse me) batshit fucking loco.

Security and ease of use are not intrinsically a tradeoff. Indeed, a lot of the time, good security comes from a better understanding of how people naturally work.  But there are times, and this feels like one of them, where doing the safer thing for users means annoying them more, and annoying them less means failing to honour our obligation to keep them safe.  Boo.

Walking and Chewing Gum

The thing is, we don’t get to just throw up our hands and say “well, better safe than sorry,” nor do I think we get to say “Too annoying, let’s revert.”  That slider has middle positions, where annoyance and safety are in better balance; let’s get there.

Fixing the text is important.  It needs to speak in human terms about why this is a problem, and about what you can do to fix it.  I do think, though, that we need to consider giving people a path from the error page to the override UI.  I can already hear the furious head-smashing of anyone who understands PKI and has read the relevant literature.   Click-throughs beget bad security habits, which is why I think it should still be a multi-step process that hammers home the fact that you’re doing something aggressive.   But full-stop blocking our users is something that’s contentious even for known malware sites; here it feels like too much.

IE7 does this.  I think they win big points for human readability there – even though they still have a click-through.  I don’t know how much the red shield scares users off, maybe it does, but one-click override still turns my stomach a little.  What I’d like to see from us is an action like that, but which, rather than automatically extending trust, simply shortcuts you to the exception adding dialog.  The argument will be made that it’s just a longer click-through, I understand that, but my feeling is that it’s long enough, and scary enough, to get more of users’ attention.  My feeling is also that we might have to eat that possibility anyhow, because if we make it sufficiently annoying for users to browse the web, they really will decide it’s a Firefox problem, since other browsers let them through.  At that point we not only fail our users on the security front, we also go back to the bad old days of “only works on IE.”

Why Don’t You Just…

I love it when people have alternate suggestions, but some of the frequently recurring ones have pretty big problems.  I’ll call out a few here to save re-treading (unless I’m getting them wrong, in which case we should totally retread, since they’re often held up as much simpler than this other thing we’re doing).

“Why don’t you just let the connections through quietly, and just remove any indicators of security, like the padlock, yellow address bar, verified identity, etc?”   The argument here being that rather than blocking the load, why not serve the content, but not let users think it’s a secure site?  Compelling, no?

Approaches like this have the really unpleasant side effect of subverting whatever good security practices our users have developed.  Banks tell their customers to go to the website via a saved bookmark, rather than clicking on links in email or other web pages.  That’s a good practice.  Some even tell users to look for the “https” in the URL.  In the case where you’re being attacked, where the cert presented is a forgery (since only the legit site can present the real one), all of these habits will tell you you’re safe.  The URL says https, and you clicked on the same bookmark you always click on to get to your bank.  This would be a gift-wrapped present for attackers.

“Why don’t you treat self-signed certs, which legitimate sites use when they want encryption but not identity, differently from actual breakages?”

The thing is that self-signed is no more or less trustworthy than, say, a domain-mismatched cert.  Likewise for the argument about treating a self-signed cert differently from one that is signed, but by an unknown signer.  I did open bug 398721 about the idea of using “Key Continuity Management” as a way to mitigate the hurt in the self-signed case while still getting the basics right, but in any event that wouldn’t make it in for Firefox 3.

Closing
To my friends and family using Firefox: don’t panic.  None of this is happening in the currently released browser, and you’re not going to see this debate enacting itself on a desktop near you anytime soon.  We are extremely cautious about changing the experience in released products after shipping.  This is happening purely among those running the up-to-the-minute versions under active development.

It will get better.  Bug 398718 (my fingers have already learned how to type that one automatically) will land, and the error pages will be things that make sense, and explain your options.  Bug 399275 will morph into a general discussion of what kind of path we want to create to add exceptions, or if it doesn’t, I’ll create a new one which does.  We’re not going to ship a browser you can’t use.  Even on sites that are doing it wrong, we put the choice in your hands, because it’s your browser.  And we like you very much.


11
Sep 07

Mozilla24

FoxKeh!  On the world!

I don’t normally blog about my work travel here, because what are you gonna do, come with me?  This one’s different though.

I’m flying out to SFO tomorrow morning (oh AC757, we’ve really gotten to know each other, haven’t we?) in anticipation of Mozilla24, a 24-hour all-Mozilla, all-the-time conference at which I will be speaking amongst a group of shockingly more awesome people.  I will be talking about security UI, natch, and I would love to see all your smiling faces (though I’ll forgive the folks who saw the OSCON version for having their laptops open).

One of the many cool things about Mozilla24 is that it’s global – California, Tokyo, Thailand, and Paris, sure, but also online – so that if you are interested in the open web, and the directions we can take it, or if you’re just getting your feet wet, you can get involved.

Go sign up!  Why not get into the thick of it?  I’ll wait here.

PS – The blog photo here, Foxkeh, and indeed the whole Mozilla24 shebang, comes from Mozilla Japan.  They’re trying to make the rest of us look bad, bringing their A game.  Their A++++ OMG WOULD DO BUSINESS AGAIN WOW game.