At Our Most Excellent

Jono recently wrote a blog post about Firefox updates, and Atul wrote a follow up. They are two of the brightest usability thinkers I know. When they talk about users, I listen. I listen, even though some of the things they say sound confused to me, and some are plain wrong. And I listen because if people as bright and in tune with Mozilla as they are think these things, I bet others do, too.

When I read (and re-read) the posts, I see 3 main points:

  1. The constant interruption of updates is toxic to the usability of any piece of software, especially one as important as your web browser.
  2. Our reasons for frequent updates were arbitrary, and based on the wrong priorities.
  3. We take our users for granted.

To be honest, if it weren’t for the third point, I wouldn’t be writing this. Anytime you do something that impacts lots of people, especially lots of impassioned and vocal people, you’re gonna get criticism. Listening to that is essential, but fanning the flames can consume all your energy and even still some people won’t be convinced. The third point, though, made by two people who know and love Mozilla even if they haven’t been close to the release process, isn’t something I want to leave sitting. I understand how it can fit a narrative, but it’s just not true.

Since I’m writing anyhow, though, let’s take them in order.

Interruptions Suck

Yes. They do. One criticism that I think we should openly accept is that the move to regular releases was irritating. The first releases on the new schedule had noisy prompts (both ours and the operating systems’). They broke extensions. Motives aside, our early execution was lacking and we heard about it. Plenty.

Today our updates are quiet. Add-ons have been compatible by default since Firefox 10 back in January. But that was a mountain of work that would have been much nicer to have in hand up front. As Jono says, hindsight is 20/20, but we should have done better with foresight there.

Motivations

It was hard for me to read the misapprehension of motives in these posts. Hard because I think Mozilla’s earned more credit than that, and hard because it means I haven’t done a good job articulating them.

Let me be clear here because I’m one of the guys who actually sits in these conversations: when we get together to talk about a change like this, concepts like “gotta chase the other guys” are nowhere in the conversation. When we get together and draw on whiteboards, and pound on the table, and push each other to be better, it is for one unifying purpose: to do right by our users and the web.

I wrote about this a while back, but it bears repeating. We can’t afford to wait a year between releases ever again; we can’t afford to wait 6 months. Think how much the web changes in a year, how different your experience is. Firefox 4 was 14 months in the making. A Firefox that updates once every 14 months is not moving at the speed of the web; we can’t go back there. Every Firefox release contains security, compatibility, technology and usability improvements; they should not sit on the shelf.

There’s nothing inviolate about a 6-week cycle, but it’s not arbitrary either. It is motivated directly by our earnest belief that it is the best way for us to serve our users, and the web at large.

And so the hardest thing for me to read was the suggestion that…

We Take Our Users For Granted

Nonsense. I don’t know how else to say it. In a very literal way, it just doesn’t make sense for a non-profit organization devoted to user choice and empowerment on the web to take users for granted. The impact of these changes on our users was a topic of daily conversation (and indeed, clearly, remains one).

To watch a Mozilla conversation unfold, in newsgroups or in blogs, in bugzilla or in a pub, is an inspiring thing because of how passionately everyone, on every side of an issue, is speaking in terms of the people of the web and how we can do right by them. We are at our most excellent then.

There’s beauty in the fact that this is another of those conversations. It is not lost on me, nor on Jono and Atul, I’d wager. They are Mozillians. And I believe they care deeply about Firefox users. I hope they realize how much the rest of us do, too.

SSL Information Wants to be Free

Recent events have really thrown light onto something I’ve been feeling for a while now: we need better public information about the state of the secure internet.  We need to be able to answer questions like:

  • What proportion of CA-signed certs are using MD5 signatures?
  • What key lengths are being used, with which algorithms?
  • Who is issuing which kinds of certificates?

So I decided to go get some of that information, so that I could give it to all of you wonderful people.
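Once you have a corpus of certificates in hand, answering questions like those is mostly a matter of pulling out the relevant fields and counting. A minimal sketch in Python — the records below are made up for illustration, standing in for what you’d actually extract from each cert with a parser like OpenSSL:

```python
from collections import Counter

# Hypothetical records from a cert crawl; in practice each dict would be
# built by parsing a real certificate's signature algorithm, key, and issuer.
certs = [
    {"sig_alg": "md5WithRSAEncryption",  "key_bits": 1024, "issuer": "CA-1"},
    {"sig_alg": "sha1WithRSAEncryption", "key_bits": 2048, "issuer": "CA-2"},
    {"sig_alg": "sha1WithRSAEncryption", "key_bits": 1024, "issuer": "CA-1"},
]

# Tally signature algorithms and issuers across the corpus.
by_sig = Counter(c["sig_alg"] for c in certs)
by_issuer = Counter(c["issuer"] for c in certs)

# Proportion of certs still carrying MD5 signatures.
md5_share = by_sig["md5WithRSAEncryption"] / len(certs)
```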


New in Firefox 3.1: Linkified View Source

Look what Curtis just did:

Linky!

Curtis Bartley is the newest member of the Firefox front end team and, to get his feet wet, he made the world a better place by fixing a very old bug. And its 7 duplicate bugs.

Specifically, he set it up so that resources which are referenced in source are now clickable links.  Want to know what that external javascript does?  Click the link, and it will be loaded in the source viewer.  Likewise CSS.  Maybe you clicked “View Source” only to discover you were looking at a frame set, and actually wanted the source for a frame – that works too.

And yes, back and forward keyboard shortcuts work. And yes, both relative and absolute links work. And yes, you can have this in a tab instead of a separate window, either by sticking view-source: on to the front of your URLs (see?), or by finding one of the addons that does it for you.

Way to go Curtis, keep ’em coming!

SSL Question Corner

From time to time, in the blogosphere or mailing lists, I will get questions about various security decisions we make in Firefox.  Here’s one that has been popular lately:

Q: I think you are dumb.

It is worded in a variety of ways, of course, but that’s the basic thrust.  A longer version might read:

Q: Why has Firefox started treating self-signed SSL certificates as untrustworthy?  I just want encryption, I don’t care that the cert hasn’t been signed by a certificate authority, and anyhow I don’t want to pay hundreds of dollars just to secure my communications.

There are a couple of implicit assumptions we should dispense with up front, before tackling the meat of the question, to wit:

  1. “Why has Firefox started treating…”  Firefox has been treating self-signed certificates as disconcerting for quite some time.  In Firefox 2, you would get a giant dialog box popping up asking what to do with them.  It was farcically easy to dismiss since just hitting OK would proceed to the site, and since the default was a temporary pass, not a permanent one, you saw the dialog frequently, making it even easier to ignore.  Firefox 3 has absolutely changed that flow — more on that later — but there is nothing new here.
  2. “ … I don’t want to pay hundreds of dollars …” Several CAs accepted by all major browsers sell certificates for less than $20/yr, and StartSSL, in the Firefox 3 root store, offers them for free.

Those concerns are red herrings; the real concern is in the middle:  “Why treat self-signed SSL as untrustworthy?  I just want encryption.”  Let’s explore this.

First of all, this isn’t quite right.  You never *just* want encryption, you want encryption to a particular system.  The whole reason for having encryption is that you don’t want various ill-doers doing ill with your data, so clearly you want encryption that isn’t going to those people.

“So fine, I want encryption to a particular system,” you say, “but I don’t need a CA to prove that my friend’s webmail is trustworthy.  CAs don’t even do that anyhow.  I trust him, Firefox should get out of my way.”

Yes, absolutely – the browser is your agent, and if you trust your friend’s webmail, you should be able to tell Firefox to do so as well.  But how do you know that’s who you’re talking to?

Permit me 3 short digressions…

Digression the First: Ettercap, webmitm, and friends

What if I told you that there were a group of programs out there that made it trivial, brain-dead simple, to intercept your web traffic, log it, and then pass it through without you ever noticing?  These “Man in the Middle” attacks used to be the stuff of scary security fiction, but now they are point-and-click.

If one of these is running on your network (you know, like the packet sniffers you’re protecting against with encryption in the first place) it will poison your network so that all requests go through them.  It will then transparently fetch and pass off any regular web pages without you noticing (after logging anything juicy, of course).  If you request an SSL page, it will generate its own certificate whose human readable details match the real site, same organization name, same domain name, everything, and use that to masquerade as the site in question.  The only difference is, it will be self-signed, since the tool obviously can’t get a CA signature.
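To make the sleight of hand concrete, here’s a toy sketch — the certificate records are invented, and real certs carry far more fields — of why the only visible tell is the missing CA signature:

```python
# Hypothetical certificate records, reduced to the fields a user might eyeball.
real_cert = {
    "subject": {"CN": "webmail.example.com", "O": "Example Webmail"},
    "issuer":  {"CN": "Some Trusted CA", "O": "Some Trusted CA Inc"},
}

# A MITM tool copies the human-readable subject verbatim...
forged_cert = {
    "subject": {"CN": "webmail.example.com", "O": "Example Webmail"},
    # ...but it can't obtain a CA signature, so it signs with its own key,
    # leaving issuer == subject:
    "issuer":  {"CN": "webmail.example.com", "O": "Example Webmail"},
}

def looks_self_signed(cert: dict) -> bool:
    # A self-signed cert's issuer and subject name the same entity.
    return cert["issuer"] == cert["subject"]
```

Every field a casual glance would check matches the real site; only the issuer gives the forgery away.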

Digression the Second: Drive-By Router Reconfig

Do you use one of those home cable-dsl-router/wifi-access-point thingies?  For the last couple years, security folks have gotten giggles out of finding ways to break them, and the number one thing they do is rewrite your network configuration so that your connections go to computers of their choosing.  If your router is subverted in this way, the only hint you might have is that your secure sites have all become self-signed.

Digression the Third: Kaminsky Breaks the Internet

This week I’m at the Black Hat security conference in Vegas, where it is a virtual certainty that Dan Kaminsky is going to outline an attack that lets any site on the internet pretend to be any other site on the internet.  I can pretend to be paypal.com.  You can pretend to be bankofamerica.com.  If your ISP doesn’t fix all of their servers, one aforementioned doer-of-ill can trick them into sending all of their customers to forgeries of the actual sites they seek.  They don’t even have to be on the same network anymore.  This is substantially easier than packet sniffing. The only thing that will tell you whether the sites you are visiting are real is the existence of a trusted certificate, which only the legitimate site can have.

Back to the Plot

The question isn’t whether you trust your buddy’s webmail – of course you do, your buddy’s a good guy – the question is whether that’s even his server at all.  With a CA-signed cert, we trust that it is – CAs are required to maintain third party audits of their issuing criteria, and Mozilla requires verification of domain ownership to be one of them.

With a self-signed certificate, we don’t know whether to trust it or not.  It’s not that these certificates are implicitly evil, it’s that they are implicitly untrusted – no one has vouched for them, so we ask the user.  There is language in the dialogs that talks about how legitimate banks and other public web sites shouldn’t use them, because it is in precisely those cases that we want novice users to feel some trepidation, and exercise some caution. There is a real possibility there, hopefully slim, that they are being attacked, and there is no other way for us to know.

On the other hand – if you visit a server which does have a legitimate need for a self-signed certificate, Firefox basically asks you to say “I know you don’t trust this certificate, but I do.”  You add an exception, and assuming you make it permanent, Firefox will begin trusting that specific cert to identify that specific site.  What’s more, you’ll now get the same protection as a CA signed cert – if you are attacked and someone tries to insert themselves between you and your webmail, the warning will come up again.

I don’t think the approach in Firefox 3 is perfect; I’m not sure any of us do. I have filed bugs, and talked about things I think we could do to continue to enhance our users’ security while at the same time reducing unnecessary annoyances.  You’ll notice that Firefox 3 has fewer “Warning: you are submitting a search to a search engine” dialog boxes than Firefox 2 did, and it’s because of precisely this desire.

I welcome people who want to make constructive progress towards a safer internet and a happier browsing experience. That’s what motivated this change, it’s what motivates everything we do with the browser, really.  So it sure would be nice if we didn’t start from the assumption that changes are motivated by greed, malice, or stupidity.

Security Screencast(s)

As Alix mentions, I recently put together a quick screencast of some of the new security features in Firefox 3. Of course, beltzner promptly scooped me with his own inimitable screencast, and what with the launch, it’s only now that I’m getting around to posting mine.

What’s interesting to me, though, is the difference between what I originally recorded, and what Alix published. I recorded the raw screencast using Jing, which is a simple, free screencasting tool for Mac and Windows. It caps you at 5 minutes, and records as flash, but it’s super easy to use, and screencast.com will host the resultant video for you. You can see what I recorded here:

http://content.screencast.com/bootstrap.swf

But then I handed it off to Alix and David and Rainer, and they turned my 5 minutes of low production values into 2 minutes of edited, titled video, with helpful visuals! See if you notice the difference…


Firefox 3: Security from Mozilla Firefox on Vimeo.

As promised in my last post, I’ll soon be posting yet another video, this time an hour long talk I gave at FIRST. And then, I think, no more blatant self-promotion for a couple weeks, eh?

Have you installed Firefox 3 yet?

Hello Vancouver! Briefly!

A quick note, to any Vancouverites that may be interested, that I will be in town on Wednesday to speak at the FIRST 2008 conference. The title of the talk is “The Most Important Thing – How Mozilla Does Security, and What You Can Steal.” If you’re attending the conference, I hope I’ll see you there. Once the conference is over, I’ll post my slides and a video of a presentation dry-run, in case anyone is interested.

I had a lot of help from several people, most notably Shaver, in putting this presentation together; my goal is to keep adapting it and ideally get other people giving it as well. Security is something that the Mozilla project has a lot of experience with, and a lot to be proud of. It is important to our mission that we share that expertise. Even when what we’re saying isn’t new (“have unit tests”), the fact that we have achieved the success we have lets us be a proof point for people trying to make change in their own projects (“Mozilla didn’t think code review was too time-intensive.”)

I may not be an official member of the evangelism team, but I will do whatever I can to encourage more people in our community to take their knowledge outbound. We are doing crazy awesome stuff here (how many IT people, on the planet, have dealt with what Justin’s team has?) and we should consider it an obligation to spread that knowledge around. Heck, that’s actually sort of what my talk is about.

Mal-what? Firefox 3 vs. Bad People

A lot of the things I write here are for geeks.  That’s unsurprising, given my own wonkish leanings, but I appreciate that it makes me a tough guy to love, much less read, at times.  Sorry about that, and thanks for sticking with me.

With Firefox 3 on the cusp of the precipice of the knife’s edge of release, though, I wanted to stop pretending that everyone reads the same articles I do and talk about one of the many, really concrete things we’re doing to keep our users, like you, safe.  There will be graphs.


Security UI in Firefox 3plus1

We’ve made a lot of changes (and more importantly, a lot of positive progress) in security UI for Firefox 3.

We have built-in malware protection now, and better phishing protection.  We have a password manager that intelligently lets you see whether your login was successful before saving, instead of interrupting the page load.  We have gotten rid of several security dialogs that taught users to click OK automatically, unseeingly.  We have OCSP on by default.  We have a consistent place in the UI now where users can get information about the site they are visiting, including detailed secondary information about their history with the site.  All of these are first steps on a long road towards equipping users with more sophisticated tools for browsing online, by taking advantage of habits they already have, and things we already know.  All the people who worked on this stuff know who they are, and I want to thank them, because it sure as hell wasn’t all me.

With Firefox 3 in full down-hunker for final release (and with conference silly season upon us) though, I’ve started to get serious about thinking through what comes next.

Here’s my initial list of the 3 things I care most about, what have I missed?

1. Key Continuity Management

Key continuity management is the name for an approach to SSL certificates that focuses more on “is this the same site I saw last time?” instead of “is this site presenting a cert from a trusted third party?”  Those approaches don’t have to be mutually exclusive, and shouldn’t in our case, but supporting some version of this would let us deal more intelligently with crypto environments that don’t use CA-issued certificates.

The exception mechanism in Firefox 3 is a very weak version of KCM, in that security exceptions, once manually added, do have “KCM-ish” properties (future visits are undisturbed, changes are detected).  But without the whole process being transparent to users, we miss the biggest advantage of this approach.

Why I care: KCM lets us eliminate the most-benign and most-frequently-occurring SSL error in Firefox 3.  Self-signed certs aren’t intrinsically dangerous, even if they do lack any identification information whatsoever.  The problem is that case-by-case, we don’t have a way to know if a given self-signed cert represents an attack in progress.  The probability of that event is low, but the risk is high, so we get in the way.  That’s not optimal, though.  When the risk is negligible, we should get out of the way, and save our warnings for the times when they can be most effective.
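The “same site as last time?” check at the heart of KCM fits in a few lines. This is a toy trust-on-first-use store, nothing like Firefox’s actual machinery — the function name and return values are invented for illustration:

```python
import hashlib

# Hypothetical TOFU ("trust on first use") store: one fingerprint per host.
pinned = {}

def check_continuity(host, der_cert_bytes):
    """Classify this visit: 'first-visit', 'match', or 'MISMATCH'."""
    fp = hashlib.sha256(der_cert_bytes).hexdigest()
    if host not in pinned:
        pinned[host] = fp          # remember the cert we saw first
        return "first-visit"
    # Same cert as before: quietly proceed.  Different cert: time to warn.
    return "match" if pinned[host] == fp else "MISMATCH"
```

The interesting design question is what the browser does with each answer: for a transparent KCM, “first-visit” and “match” would be silent, and only “MISMATCH” would get in the user’s way.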

2. Secure Remote Passwords

Secure Remote Password protocol is a mechanism (have some math!) for allowing a username/password-style exchange to happen, without an actual password going out along the wire. Rob Sayre already has a patch.  That patch makes the technology available, but putting together a UI for it that resists spoofing (and is attractive enough that sites want to participate) will be interesting.

Why I care: SRP is not the solution to phishing, but it does make it harder to make use of stolen credentials, and that’s already a big deal.  It also has the happy side effect of authenticating the site to you while it’s authenticating you to the site.  I wouldn’t want this useful technology to get stuck in the chicken-egg quagmire of “you implement it first.”
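For the curious, the core of that math is small enough to sketch. This is a stripped-down rendering of the SRP-6a exchange — the prime is the 1024-bit group from RFC 5054, the hash is simplified, and all the safeguards a real implementation needs (parameter checks, proper padding and encoding, session-key confirmation) are omitted:

```python
import hashlib
import secrets

# 1024-bit group from RFC 5054; real deployments should prefer larger groups.
N = int(
    "EEAF0AB9ADB38DD69C33F80AFA8FC5E86072618775FF3C0B9EA2314C9C256576"
    "D674DF7496EA81D3383B4813D692C6E0E0D5D8E250B98BE48E495C1D6089DAD1"
    "5DC7D7B46154D6B6CE8EF4AD69B15D4982559B297BCF1885C529F566660E57EC"
    "68EDBC3C05726CC02FD4CBF4976EAA9AFD5138FE8376435B9FC61D2FC0EB06E3", 16)
g = 2

def H(*args):
    # Simplified hash-to-integer; real SRP specifies exact encodings.
    h = hashlib.sha256()
    for a in args:
        h.update(str(a).encode())
    return int(h.hexdigest(), 16)

k = H(N, g)

# Registration: the server stores only (salt, verifier) -- never the password.
password = "correct horse"
salt = secrets.randbits(64)
x = H(salt, password)
v = pow(g, x, N)

# Login: client sends A, server replies with (salt, B).
# The password itself never crosses the wire.
a = secrets.randbits(256)
A = pow(g, a, N)
b = secrets.randbits(256)
B = (k * v + pow(g, b, N)) % N
u = H(A, B)

# Both sides independently arrive at the same secret, g^(ab + bux) mod N.
S_client = pow(B - k * pow(g, x, N), a + u * x, N)
S_server = pow(A * pow(v, u, N), b, N)
```

An eavesdropper sees only A, B, and the salt; without the password (or the server’s verifier database) neither value lets them compute the shared secret. And because the server can only complete the exchange if it holds the right verifier, the client learns it’s talking to the real server, too — the side effect mentioned above.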

3. Private Browsing Mode

This is the idea of a mode for Firefox that would protect the user’s privacy more aggressively, and erase any trace of having been in that mode after the fact.  Ehsan Akhgari has done a bunch of work here, and in fact has a working patch.  While his version hooks into all the various places we might store personal data, I’ve also wondered about a mode where we just spawn a new profile on the spot (possibly with saved passwords intact) and then delete it once finished.

Why I care: Aside from awkward teenagers (and wandering fiancés), there are a lot of places in the world where the sites you choose to visit can be used as a weapon against you.  Private browsing mode is not some panacea for governmental oppression, but as the user’s agent, I think it is legitimately within our scope (and morally within our responsibility) to put users in control of their information.  We began this thinking with the “Clear Private Data” entry in the tools menu, but I think we can do better.
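The spawn-a-profile-and-delete-it idea can be sketched with ordinary temp directories. A hypothetical illustration — this is the shape of the idea, not anything like the real patch:

```python
import os
import shutil
import tempfile

# A hypothetical throwaway-profile session: create a fresh profile
# directory for the private session, and wipe it when the session ends.
class PrivateSession:
    def __enter__(self):
        self.profile_dir = tempfile.mkdtemp(prefix="private-profile-")
        return self.profile_dir

    def __exit__(self, *exc):
        # Erase every trace of the session: history, cache, cookies, all of it.
        shutil.rmtree(self.profile_dir)

with PrivateSession() as profile:
    # Anything the browser writes during the session lands in the
    # throwaway profile...
    with open(os.path.join(profile, "history.db"), "w") as f:
        f.write("visited: example.com")
# ...and once we're here, the profile directory -- and the history -- are gone.
```

The appeal of this approach is that you don’t have to find every place personal data might leak into the profile; the whole profile is disposable by construction.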

(And also…)

Outside of these 3, there are a couple things that I know will get some of my attention, but involve more work to understand before I can talk intelligently about how to solve them.

The first is for me to get a better understanding of user certificates. In North America (outside of the military, at least) client certificates are not a regular matter of course for most users, but in other parts of the world, they are becoming downright commonplace.  As I understand it, Belgium and Denmark already issue certs to their citizenry for government interaction, and I think Britain is considering its options as well.  We’ve fixed some bugs in that UI in Firefox 3, but I think it’s still a second-class UI in terms of the attention it has gotten, and making it awesome would probably help a lot of users in the countries that use them.  If you have experience and feedback here, I would welcome it.

The second is banging on the drum about our mixed content detection.  We have some very old bugs in the area, and mixed content has the ability to break all of our assumptions about secure connections.  I think it’s just a matter of getting the right people interested in the problem, so it may be that the best way for me to solve this is with bottles of single malt.  Whatever it takes.  If you can help here, name your price.
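The core check is easy to state, even if doing it correctly in the browser is not: an https page must not pull active subresources over plain http. A toy sketch (the tag/attribute table and the sample markup are invented for illustration, and a real check has to cover far more than static HTML):

```python
from html.parser import HTMLParser

# A toy mixed-content scan: flag http:// subresources referenced
# from a page that was itself served over https.
class MixedContentFinder(HTMLParser):
    RISKY = {"script": "src", "img": "src", "link": "href", "iframe": "src"}

    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        wanted = self.RISKY.get(tag)
        for name, value in attrs:
            if name == wanted and value and value.startswith("http://"):
                self.findings.append((tag, value))

page = ('<script src="http://evil.example/a.js"></script>'
        '<img src="https://ok.example/b.png">')
finder = MixedContentFinder()
finder.feed(page)
# finder.findings now holds the insecure script reference.
```

The hard part, and the reason those old bugs are old, is everything this sketch ignores: scripts that inject resources after load, plugins, CSS imports, redirects — all the ways a single insecure load can quietly undermine the lock icon.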

Obviously I’ve left out all the tactical fixup work on the UI we already have.  We all know that those things will need to happen, to be re-evaluated and evolved.  I wanted to get these bigger-topic thoughts out early, so that people like you can start thinking about whether they are interesting and relevant to the things you care about, and shouting angrily if they aren’t.