05
Mar 09

Deep Packet Inspection Considered Harmful?

I was recently asked, in the context of the ongoing Phorm debacle, to join other interested parties in meeting with members of the UK government to discuss deep packet inspection technologies and their impact on the web.  I’m still organizing my thoughts on the subject – I’ve put some here, but I’d love to know where else you think I should look to ensure I have considered the relevant angles.

Brief Background

Phorm‘s technology hooks in at the ISP level, watches and logs user traffic, and uses it to assemble a comprehensive profile for targeting advertising. While an opt-out mechanism was provided, many users have complained that there was no notice, or that it was insufficiently clear what was going on. NebuAd, another company with a similar product, has apparently used its position at the ISP level to not only observe, but also to inject content into the pages before they reached the user.  It’s hard to get unbiased information here, but this is what I understand of the situation.

Thoughts

1.  Deep packet inspection, in the general case, is a neutral technology. Some technologies are malicious by design (virus code, for instance), but I think DPI has as many positive uses as negative. DPI can let an ISP make better quality of service decisions, and can be done with the full knowledge and support of its users. I don’t think DPI, as a technology, should be treated as insidious.

2. Using deep packet inspection to assemble comprehensive browsing profiles of users without explicit opt-in is substantially more questionable. My browsing history and habits are things I consider private in aggregate, even though any single visit is obviously visible to the site I’m browsing.

It’s possible that I will choose to allow this surveillance in exchange for other things I value, but it must be a deliberate exchange. I would want to have that choice in an explicit way, and not to be opted in by default, even for aggregate data. Moreover, given the complexity of this technology, I would want a great deal of care to go into the quality of the explanation.  Explaining this well to non-technical users might be so difficult as to be impossible, which is why it’s so important that it be opt-in.

3. Using deep packet inspection in conjunction with software that modifies the resultant pages to include, for instance, extra advertising content, is profoundly offensive and undermines the web. The content provider and the user have a reasonable expectation that no one else is modifying the content, and a typical user should not be expected to understand the mechanics of the web sufficiently to be able to anticipate such modifications.

Solutions

As a browser maker, we do some things to help our users here, but we can’t solve the problem. https resists this kind of surveillance and tampering well, but requires sites to provide 100% of their content over SSL. Technologies like signed http content would prevent tampering, if not surveillance, but once again assume that sites (and browsers!) will support the technology. Ad blockers can turn off injected ads, and tools like NoScript can de-fang injected javascript, but fundamentally, http content is not tamper-proof, and no plaintext protocol is eavesdropping-proof.
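To make the “signed content” idea concrete, here is a minimal sketch of the principle, and only the principle: it is not a description of any actual proposal, the keys and page body are invented, and a real deployment would still need a safe way to get the site’s public key to the browser. The point is just that a signature made by the publisher survives the trip even when the bytes do not.

```python
# Illustration only: detached signature over a page body, verified by the browser.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Publisher side: sign the page before serving it.
site_key = Ed25519PrivateKey.generate()
page = b"<html><body>original content</body></html>"
signature = site_key.sign(page)

# Somewhere in the middle, an ISP-level box rewrites the page...
tampered = page.replace(b"original content",
                        b"original content<div>injected ad</div>")

# Browser side: verify the body against the site's known public key.
# (In practice the browser would need that public key some other way.)
public_key = site_key.public_key()
for body in (page, tampered):
    try:
        public_key.verify(signature, body)
        print("signature checks out; render the page")
    except InvalidSignature:
        print("content was modified in transit; warn instead of rendering")
```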

People trust their ISPs with a huge amount of very personal data. It’s fine to say that customers should vote with their feet if their ISP is breaking that trust, but in many areas, the list of available ISPs is small, and so the need for prudence on the part of ISPs is large.

That’s what I’m thinking so far; what am I missing?


21
Jan 09

SSL Information Wants to be Free

Recent events have really thrown light onto something I’ve been feeling for a while now: we need better public information about the state of the secure internet.  We need to be able to answer questions like:

  • What proportion of CA-signed certs are using MD5 signatures?
  • What key lengths are being used, with which algorithms?
  • Who is issuing which kinds of certificates?

So I decided to go get some of that information, so that I could give it to all of you wonderful people.
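For a single site, the probing involved is not much code. Here is a rough sketch of the idea (an illustration, not the actual survey code behind this post), using Python’s standard ssl module plus the third-party cryptography package; the hostname is just an example.

```python
# Sketch only: probe one host and report issuer, signature hash, and key size.
import socket
import ssl

from cryptography import x509

def inspect_cert(host, port=443, timeout=10):
    """Fetch the certificate `host` presents and summarize a few facts about it."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False          # we only want to look at the cert,
    ctx.verify_mode = ssl.CERT_NONE     # even if it wouldn't normally verify
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
    cert = x509.load_der_x509_certificate(der)
    return {
        "issuer": cert.issuer.rfc4514_string(),
        "signature_hash": cert.signature_hash_algorithm.name,  # e.g. "md5", "sha1"
        "key_bits": cert.public_key().key_size,
    }

if __name__ == "__main__":
    print(inspect_cert("www.mozilla.org"))
```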

Continue reading →


08
Dec 08

Firefox Malware?

A crappy thing happened last week – someone wrote some malware that infects Firefox. We obviously don’t like that very much at all, but I wanted to at least make it clear what is and isn’t happening, since there’s some confusion out there.

What is going on?

Basically for as long as there has been software, there have been nasty people out there who get you to download and install software which turns out to have hidden cargo.  Security folks use names like “virus,” “trojan,” “worm,” and “malware” to describe different types, but the point is that if a person can be tricked into running nasty programs, they can do nasty things.

In this case, rather than wiping your hard drive or turning all your icons upside down, this particular jerk has decided to mess with your Firefox. Once you run the program, it hooks into your Firefox and watches for you to visit certain sites, at which point it will steal your username and password.

How Can I Tell If I Have It?

You can open up your Firefox addons manager (Tools->Add-ons) and go to the “Plugins” section.  If you have a plugin called “Basic Example Plugin for Mozilla” you should disable it.


Original credit to TrustDefender Labs’ blog post on the subject

Does This Mean that Firefox is Insecure?

No, and here’s why:

  • This particular malware targets our program, but once you have malicious software running on your system, it can just as easily attack other programs, or harm your computer in other ways.
  • This isn’t contracted by just browsing around the web with Firefox 3. In fact, the Malware Protection features in Firefox 3 are designed specifically to prevent sites from being able to attack your computer.

The people getting infected here either downloaded enticing files with the malware hiding inside (which is why Firefox 3 hands downloads off to your computer’s virus scanner once they finish) or, as some sites are reporting, were already infected in the past and had their computers forced to download this file as well.

Typical Firefox 3 users who avoid downloading software they don’t trust are unlikely to ever see this, and even the sites reporting it describe its incidence as “rare”.

What’s this I hear about GreaseMonkey?

There are some mentions of greasemonkey in a couple of the early reports based on some analysis of the code used by this malware, but I want to be clear that the (legitimate, and awesome) Greasemonkey Addon is not involved in this malware in any way. It is not involved in the installation or execution of the attack.

As always, the best defense is vigilance.  Use a browser with a solid security record and modern anti-malware defenses built in, and be very careful about downloading and running programs you find online.  If a bad guy is able to get you to run a program on your machine, they will be able to do bad things, so we’ll keep trying to stop them, and you should keep doing your part too.

More details are also available on the official Mozilla security blog.


11
Nov 08

New in Firefox 3.1: Linkified View Source

Look what Curtis just did:

Linky!

Curtis Bartley is the newest member of the Firefox front end team and, to get his feet wet, he made the world a better place by fixing a very old bug. And its 7 duplicate bugs.

Specifically, he set it up so that resources referenced in the source are now clickable links.  Want to know what that external javascript does?  Click the link, and it will be loaded in the source viewer.  Likewise CSS.  Maybe you clicked “View Source” only to discover you were looking at a frame set, and actually wanted the source for a frame – that works too.

And yes, back and forward keyboard shortcuts work. And yes, both relative and absolute links work. And yes, you can have this in a tab instead of a separate window, either by sticking view-source: on to the front of your URLs (see?), or by finding one of the addons that does it for you.

Way to go Curtis, keep ’em coming!


06
Nov 08

SSL Error Pages in Firefox 3.1

If you’re using Firefox 3.1 nightlies or the upcoming Firefox 3.1 beta 2, you might notice some changes in the way we handle SSL errors. I landed them last week, and since it’s a topic that readers of this blog have historically wanted to talk about, I thought I would highlight some of the changes here. Continue reading →


21
Oct 08

SSL Infoquickie (with Bonus Firefox Pro-Tip!)

There is less public information out there about SSL certificate usage than one might like to see. Netcraft has a for-pay report with some interesting figures, and occasionally makes some of that data public, and I’ve blogged about other sources in the past, but in general, it’s pretty sparse. I keep meaning to do something coordinated about that; I have some ideas, but they keep getting back-burnered.

So it came to pass that when someone idly remarked that it would be nice to know what percentage of certs on the top sites were valid, I pounced upon it as a way to quickly release some pent-up info-gathering angst.

It’s profoundly unscientific, but so was the question. Are the Alexa top 500 sites even a good reflection of the most popular SSL sites? Not really. I think it will bias the data towards higher counts of untrusted certs (since the admins aren’t expecting them to be used) and towards lower overall cert counts (since many of those sites won’t answer SSL hails, whereas presumably a list of the top 500 SSL sites all would). Is blindly connecting to their main page on port 443 the best way to harvest their certs? Probably not; lots of them use secure.frobber.tld constructions, so that will also bias the data lower. Let’s just agree that it’s a sort of fun number to have as an order-of-magnitude style signpost.
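For the curious, the blind probe described above amounts to something like this sketch (my own illustration, not the script that produced the numbers below): attempt a fully verified handshake first, then retry without verification to tell an untrusted cert apart from no SSL answer at all.

```python
# Sketch only: classify one host the way the survey above does.
import socket
import ssl

def classify(host, port=443, timeout=10):
    """Return 'valid', 'untrusted', or 'no-answer' for host:port."""
    try:
        ctx = ssl.create_default_context()       # verifies chain and hostname
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return "valid"
    except ssl.SSLError:
        pass                                     # answered, but the cert didn't verify
    except OSError:
        return "no-answer"                       # refused, timed out, unreachable, ...

    try:
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE          # accept anything, just to confirm SSL is spoken
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return "untrusted"               # self-signed, bad chain, expired, etc.
    except OSError:
        return "no-answer"

counts = {}
for site in ("www.example.com", "example.org"):  # stand-ins for the Alexa list
    result = classify(site)
    counts[result] = counts.get(result, 0) + 1
print(counts)
```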

Of the 500 top sites on Alexa, October 15, 2008:

  • 217 responded to an SSL query on port 443
  • 199 of those replies used valid certs chaining to trusted roots
  • The other 18 were a mix of self-signed, bad chains (likely from trusted roots, though I didn’t investigate), and expired certs.

If you prefer pretty pictures:

SSL Certificate Stats

Any conclusions you want to draw from this data will be only as good as the aforementioned biases within it, but don’t say I never do anything for you in a feeble attempt to vent my own info-lust urges.

Bonus Firefox Pro-Tip: If you are on Firefox 3.1 Nightlies or the upcoming Firefox 3.1 Beta 2, you now have the ability to turn off link-visited colouring.  David Baron recently landed a fix for bug 147777 that adds a new about:config preference to control the behaviour, layout.css.visited_links_enabled.

“Great!” I hear you all saying, “We’ve been hoping for a way to turn off an occasionally useful feature!”

And who hasn’t, really? But the thing of it is that colouring links can give away information to tricky sites about where you’ve been. It’s up to you whether you think that privacy/functionality trade-off is worth making, and the bug is still open while more universal solutions are contemplated, but in the meantime, you have the choice.


05
Aug 08

SSL Question Corner

From time to time, in the blogosphere or mailing lists, I will get questions about various security decisions we make in Firefox.  Here’s one that has been popular lately:

Q: I think you are dumb.

It is worded in a variety of ways, of course, but that’s the basic thrust.  A longer version might read:

Q: Why has Firefox started treating self-signed SSL certificates as untrustworthy?  I just want encryption, I don’t care that the cert hasn’t been signed by a certificate authority, and anyhow I don’t want to pay hundreds of dollars just to secure my communications.

There are a couple of implicit assumptions we should dispense with up front, before tackling the meat of the question, to wit:

  1. “Why has Firefox started treating…”  Firefox has been treating self-signed certificates as disconcerting for quite some time.  In Firefox 2, you would get a giant dialog box popping up asking what to do with them.  It was farcically easy to dismiss since just hitting OK would proceed to the site, and since the default was a temporary pass, not a permanent one, you saw the dialog frequently, making it even easier to ignore.  Firefox 3 has absolutely changed that flow — more on that later — but there is nothing new here.
  2. “ … I don’t want to pay hundreds of dollars …” Several CAs accepted by all major browsers sell certificates for less than $20/yr, and StartSSL, in the Firefox 3 root store, offers them for free.

Those concerns are red herrings; the real concern is in the middle:  “Why treat self-signed SSL as untrustworthy?  I just want encryption.”  Let’s explore this.

First of all, this isn’t quite right.  You never *just* want encryption; you want encryption to a particular system.  The whole reason for having encryption is that you don’t want various ill-doers doing ill with your data, so clearly you want encryption that isn’t going to those people.

“So fine, I want encryption to a particular system,” you say, “but I don’t need a CA to prove that my friend’s webmail is trustworthy.  CAs don’t even do that anyhow.  I trust him, Firefox should get out of my way.”

Yes, absolutely – the browser is your agent, and if you trust your friend’s webmail, you should be able to tell Firefox to do so as well.  But how do you know that’s who you’re talking to?

Permit me 3 short digressions…

Digression the First: Ettercap, webmitm, and friends

What if I told you that there was a group of programs out there that made it trivial, brain-dead simple, to intercept your web traffic, log it, and then pass it through without you ever noticing?  These “Man in the Middle” attacks used to be the stuff of scary security fiction, but now they are point-and-click.

If one of these is running on your network (you know, like the packet sniffers you’re protecting against with encryption in the first place), it will poison your network so that all requests go through it.  It will then transparently fetch and pass off any regular web pages without you noticing (after logging anything juicy, of course).  If you request an SSL page, it will generate its own certificate whose human-readable details match the real site, same organization name, same domain name, everything, and use that to masquerade as the site in question.  The only difference is, it will be self-signed, since the tool obviously can’t get a CA signature.
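To make that concrete, here is a short sketch (my own illustration, not code from any of those tools) of how little work it takes to mint a certificate whose human-readable fields match a real site; the subject details below are just an example. The one thing the forger cannot produce is a signature from a trusted CA.

```python
# Illustration only: a self-signed certificate with whatever subject you like.
import datetime

from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name([
    x509.NameAttribute(NameOID.COMMON_NAME, "www.paypal.com"),
    x509.NameAttribute(NameOID.ORGANIZATION_NAME, "PayPal, Inc."),
])
cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)                       # issuer == subject: self-signed
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
    .sign(key, hashes.SHA256())              # signed by itself, not by any CA
)
print(cert.subject.rfc4514_string())         # the readable fields look just like the real thing
```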

Digression the Second: Drive-By Router Reconfig

Do you use one of those home cable-dsl-router/wifi-access-point thingies?  For the last couple of years, security folks have gotten giggles out of finding ways to break them, and the number one thing those attacks do is rewrite your network configuration so that your connections go to computers of the attacker’s choosing.  If your router is subverted in this way, the only hint you might have is that your secure sites have all become self-signed.

Digression the Third: Kaminsky Breaks the Internet

This week I’m at the Black Hat security conference in Vegas, where it is a virtual certainty that Dan Kaminsky is going to outline an attack that lets any site on the internet pretend to be any other site on the internet.  I can pretend to be paypal.com.  You can pretend to be bankofamerica.com.  If your ISP doesn’t fix all of their servers, one aforementioned doer-of-ill can trick them into sending all of their customers to forgeries of the actual sites they seek.  They don’t even have to be on the same network anymore.  This is substantially easier than packet sniffing. The only thing that will tell you whether the sites you are visiting are real is the existence of a trusted certificate, which only the legitimate site can have.

Back to the Plot

The question isn’t whether you trust your buddy’s webmail – of course you do, your buddy’s a good guy – the question is whether that’s even his server at all.  With a CA-signed cert, we trust that it is – CAs are required to maintain third-party audits of their issuing criteria, and Mozilla requires verification of domain ownership to be among those criteria.

With a self-signed certificate, we don’t know whether to trust it or not.  It’s not that these certificates are implicitly evil; it’s that they are implicitly untrusted – no one has vouched for them, so we ask the user.  There is language in the dialogs that talks about how legitimate banks and other public web sites shouldn’t use them, because it is in precisely those cases that we want novice users to feel some trepidation, and exercise some caution. There is a real possibility, hopefully slim, that they are being attacked, and there is no other way for us to know.

On the other hand, if you visit a server which does have a legitimate need for a self-signed certificate, Firefox basically asks you to say “I know you don’t trust this certificate, but I do.”  You add an exception, and assuming you make it permanent, Firefox will begin trusting that specific cert to identify that specific site.  What’s more, you’ll now get the same protection as a CA-signed cert – if you are attacked and someone tries to insert themselves between you and your webmail, the warning will come up again.
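For what it’s worth, the mechanics of that exception amount to remembering which certificate you decided to trust for which site. The toy sketch below is not how Firefox actually stores overrides, and the hostname is invented; it just shows why a changed certificate is worth warning about again.

```python
# Toy model of a "permanent exception": remember a cert fingerprint per host.
import hashlib
import socket
import ssl

def cert_fingerprint(host, port=443, timeout=10):
    """SHA-256 fingerprint of whatever certificate host:port presents."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE      # trust is decided by the fingerprint below
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return hashlib.sha256(tls.getpeercert(binary_form=True)).hexdigest()

exceptions = {}

def add_exception(host):
    """The user inspects the cert once and decides to trust it for this host."""
    exceptions[host] = cert_fingerprint(host)

def check(host):
    """Later visits: a changed fingerprint means someone may be in the middle."""
    if exceptions.get(host) != cert_fingerprint(host):
        raise RuntimeError("certificate for %s changed -- warn the user again" % host)

# add_exception("webmail.example.net")   # first visit
# check("webmail.example.net")           # every visit after that
```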

I don’t think the approach in Firefox 3 is perfect; I’m not sure any of us do. I have filed bugs, and talked about things I think we could do to continue to enhance our users’ security while at the same time reducing unnecessary annoyances.  You’ll notice that Firefox 3 has fewer “Warning: you are submitting a search to a search engine” dialog boxes than Firefox 2 did, and it’s because of precisely this desire.

I welcome people who want to make constructive progress towards a safer internet and a happier browsing experience. That’s what motivated this change; it’s what motivates everything we do with the browser, really.  So it sure would be nice if we didn’t start from the assumption that changes are motivated by greed, malice, or stupidity.


02
Jul 08

The Most Important Thing

Microphone by hiddedevries on flickr

… or How Mozilla Does Security and What You Can Steal

As promised last week, I have now put my presentation slides for my talk at FIRST2008 online.  I’ve also put up a video I recorded of a dry-run through the slides, in case you want to experience the talk, and not just read it.

Slides (CC-BY-SA):

Video (CC-BY-SA):

Thanks again to Mike Shaver for helping me put these slides together, and to all the people who reviewed them ahead of time.  I really enjoyed this talk, and hope to give it again – as I’ve said many times before, we have learned a lot of lessons the hard way; we should be sharing that experience broadly, since we’re one of the few organizations that can.

I would love any edits or suggestions for the slides themselves, or my presentation of them.  I’ll also accept offers of exciting cash and prizes to give this talk at your campus/company/private island.