Book Review 3 Pack

I used to write these massive once-a-year posts, reviewing every book I’d finished in the course of a year.  It’s becoming clear to me that I had too much time on my hands, because the thought of doing that again, even with the meager 30 or so books I read last year, is just far too daunting.  I like John’s approach of just shooting out a couple of reviews at a time, whenever the mood strikes him.  In that vein, then, here are three.

Born on a Blue Day – Daniel Tammet

Daniel Tammet has got to be nearly unique in the world: he has an autism spectrum disorder (Asperger’s Syndrome), he is a mathematical savant and a synesthete, and through all of it he is able to write clear and moving prose about his own mental life.

His writing has a style you quickly recognize as characteristic of his autism – he is fixated on details, spending more time talking about the texture of a carpet than about the features of the people in his life – but for all that, it is a really touching account of the difficulties he’s had, and the ways he’s found to cope with them.

If you have any interest in how the mind of an autistic person works, to say nothing of an autistic savant, this book is fascinating.  The chapter where he meets Kim Peek is particularly memorable.  Highly Recommended.

The Ghost Map – Steven Johnson

This book isn’t, I think, what most people expect it to be.  I kept hearing about it from design folks, because the titular map is a sort of object lesson in Tuftian information design.  The thing is, the book is mostly not about the map, or about information design at all.

The book is far more concerned with tracing the early days of what we now call epidemiology, and that’s not a bad thing at all.  For me, in fact, that’s more interesting.  I found the middle tended to drag a bit (yes, I understand he was groundbreaking; yes, it was very brave to go back into a cholera zone…), but for all that I found it a quick read, and one that nicely underscored a reality that is ever more true today: Science, without an ability to communicate it compellingly, is impotent.  Recommended.

The Wonga Coup – Adam Roberts

The Wonga Coup is the story of a bunch of wealthy British and South African guys who decide it would be fun to take over Equatorial Guinea.  Fun and lucrative, since oil deposits had been discovered off the coast.  Fun and plausible, since the country’s current leader is somewhere between despot and lunatic, and won’t be fiercely defended by a population most of whom are too hungry to fight anyhow.  What makes the story interesting is that it’s non-fiction.  It all actually happened, just a few years ago.

This book was a drive-by for me – just picked up in a bookstore with no particular advance recommendation – but it really is interesting to read about the machinations of a real-world movie plot, and the things that end up making or breaking such a campaign.  I found the writing pretty slow-moving at times, but I think that just says that Roberts is a better researcher than writer, because the details are scrupulously documented.  I don’t think I can give it a blanket recommendation, but if this is an area you’re already passionate about, it’s certainly an important piece of modern mercenary history.

[Addendum: I’m not sure if I should tag these posts for inclusion on planet.mozilla.  They aren’t work-related, but I know a lot of people in the community are readers.  What do people think, stay or go?]

State of the Malware Nation

It’s a couple of weeks old, I know, but for anyone who hasn’t seen it, Google’s Online Security Blog has linked to a draft article produced by some of their malware researchers about the trends they’ve observed in malware hosting and distribution.  Aside from a troubling preoccupation with CDF graphs, it’s a really interesting look at the way malware networks spread through the internet.

I found this snippet interesting:

We also examined the network location of the malware distribution servers and the landing sites linking to them. Figure 8 shows that the malware distribution sites are concentrated in a limited number of /8 prefixes. About 70% of the malware distribution sites have IP addresses within 58.* — 61.* and 209.* — 221.* network ranges.

Our results show that all the malware distribution sites’ IP addresses fall into only 500 ASes. Figure 9 shows the cumulative fraction of these sites across the 500 ASes hosting them (sorted in descending order by the number of sites in each AS).  The graph further shows the highly nonuniform concentration of the malware distribution sites – 95% of these sites map to only 210 ASes.
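Since they’re so fond of CDFs, it’s worth spelling out that the “cumulative fraction” in Figure 9 is just a running sum over the per-AS site counts.  A toy sketch in JavaScript, with made-up numbers rather than anything from the actual paper:

// cdf.js – toy data only, NOT the paper's numbers
// Hypothetical count of malware sites per AS, sorted descending (as in Figure 9).
var sitesPerAS = [900, 450, 300, 150, 100, 60, 25, 15];
var total = sitesPerAS.reduce(function (a, b) { return a + b; }, 0);
var running = 0;
var cdf = sitesPerAS.map(function (count) {
  running += count;
  return running / total; // fraction of all sites hosted by the top ASes so far
});
console.log(cdf); // climbs quickly toward 1 – most sites live in a handful of ASes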

But I think this is the big takeaway:

Malware Landing Site Distribution

Because malware is being distributed via ad networks more and more, it’s no longer safe to assume that you’ll be okay if you just avoid the seedy parts of the net.  And because in a lot of cases it no longer requires user interaction, the old-school “don’t run executables from random websites” best practice might not be enough either.  To stay on top of things, you’re going to want to run a browser that is as hardened as we can make it, and that also incorporates active checking of known malware sites.
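If you’re on a Firefox 3 nightly or beta and want to confirm that active checking is switched on, a minimal user.js sketch looks like this – the pref names are the Safe Browsing ones as I understand them today, so treat them as an assumption and verify in about:config:

// user.js – sketch only; Safe Browsing pref names assumed, check about:config
user_pref("browser.safebrowsing.enabled", true);         // phishing ("web forgery") checks
user_pref("browser.safebrowsing.malware.enabled", true); // known malware site checks

Both should already default to true; the point is just that the protection is preference-controlled like everything else.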

And lookit, the Firefox 3 beta is right over here.

Being Green, easiness of

As of today’s nightly Firefox build, we’ve turned on EV support and activated the Verisign EV root for testing purposes.  What this means is that when you go to sites that have Verisign-issued EV certificates – like, say, British Airways – the site-identity button (shall we call it Larry? Yes. Let’s.) will pick up the name of the site owner, all green-like.

I rather suspect this might startle a few of you.

Larry on British Airways

I’ve talked a lot about identity and security in Firefox 3, but some of the actual changes were easy to ignore if you weren’t looking for them.  The site button has been around for a while, with Larry telling you what he knows about a site, but you could choose not to click on him, not to get that information.  A while ago, I mentioned a way to get the EV behaviour ahead of schedule, if you wanted to test, but now those steps are no longer necessary.

So things are going to feel a little weird for a few days.  There are about 4000 EV sites these days (the AOTA has a pretty long list), so you will probably hit a few.  By all means, open bugs.  The whole reason we’re doing this is to get more sunlight on the code, because it’s required weird custom builds and secret handshakes for too long.

The story goes that when London first introduced street signs, there was significant protest.  They were gaudy, the argument went, and anyhow the locals already knew where they were going.  Many streets in London still don’t have them.  I’m excited about getting feedback into the UI to help users know better who they’re dealing with online, help them orient themselves, and rebuild some of the cues that we all take for granted in the real world.  But like the London signposts, I suspect it’ll take some getting used to.  Especially on Proto, where it currently looks, as Shaver so eloquently puts it, like the south end of a north-facing horse.

Standardizing UI, and other Crazy Ideas

Decision making, by nerovivo

Standards make the web go ’round.  I hope it doesn’t come as too much of a surprise that Mozilla cares a lot about standards, or that a significant percentage of the community, myself included, participates in active standards groups, be they W3C, WHATWG, industry consortia, or other.

They are often, to be honest, a slog.  Anything important enough to be standardized is important enough to attract a variety of interests and motivations, and being in the middle of multiple, divergent forces can be just as fun as it sounds.  They are usually noble slogs, though.  An open web needs a set of linguae francae. As it matures, people invent new creoles to express new ideas, and so our standards need to constantly evolve and add that new wealth to the growing lexicon of awesome.

A little while ago, though, the W3C decided to try something sort of odd.  They formed a working group to look at standardizing security UI.

Standardizing. UI.

To anyone who has designed a user interface, that sort of feels like standardizing art. Not that we are quite so full of hubris as to imagine ourselves Caravaggios, but UI design is a complex interplay of functionality, ergonomics, and subjective experience.  There are general principles, sure, but it’s a very different beast from, say, CSS2 margin properties, where everyone can at least agree that there ought to be a single correct result, even if they disagree about what that result should be or how to obtain it.

Nevertheless, boldly forth they have gone and established the Web Security Context working group with a pretty broad charter. Capturing current best practice is certainly fair game, but it is equally permissible for the group to try to move the state of the art forward.  We’re active members, as are Opera and Konqueror (though not Apple or MS), but like most standards bodies, the group includes folks from academia, from other companies, and from various interested groups as well.

This workgroup has put out its First Public Working Draft (FPWD), which means I have two things to ask you, or maybe ask of you.  In marketing, I believe they call this the Call to Action, so if you were looking for it, here it is!

The first thing I would ask, if you are at all interested, is that you read it and remark upon it.  The group needs public comment, and you fabulous people are ably placed to provide it.

This first draft was kept deliberately inclusive, to make sure that the majority of recommendation proposals got public airings. So if your main criticism is just “too much,” that is unsurprising, but still welcome, feedback.

The second thing is harder.

We participate in this group for all the reasons mentioned above, and I personally take that participation seriously.  Even on the sketchy topic of standardized UI, I think there’s potential. A document which all browsers conform to as a baseline guide, which says things like “Don’t let JavaScript arbitrarily resize windows, because it lets this spoofing attack happen,” is a valuable one.  At Mozilla, we talk about things like making the mobile web a better place, for example. One thing we can do right up front in that world is spare this new generation of browser implementors (and their users!) from rediscovering our mistakes the hard way.  This standard could help do that.
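To make the window-resizing example concrete: Firefox already has a preference in this family that an implementation could simply turn on by default.  A user.js sketch – the pref name here is my recollection of the real one, not something quoted from the draft, so double-check it in about:config:

// user.js – sketch; pref name assumed from memory, not from the draft spec
user_pref("dom.disable_window_move_resize", true); // stop scripts moving or resizing windows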

But this draft is also defining new UIs, new interactions, new metaphors for online browsing.  The academics in the group have offered to gather usability data on several proposed recommendations, but at a fundamental level, I have asked the group a couple times whether it’s right to use a standard to do this kind of work at all.  I think several of the proposed requirements sound like interesting, probably fruitful UI experiments.  But that’s not the same as “Standards-compliant user agents MUST …”

My second question is this: as members of the Mozilla community, is this an effort that you want me (or people like me) participating in, and helping drive to final publication?

I’m still engaged on the calls and the mailing list – I still see good things coming out of the group, and I have my own opinions about how to best contribute.  But as an employee of Mozilla, I feel an obligation to steward my own resources responsibly, and to expend them on things that the community finds valuable, so it’s important for me to hear how people feel about the value of this work.

Opinions? Suggestions? Funny anecdotes?

What happens when your job is also a hobby?

I took a vacation day yesterday, since I had a bunch of appointments piling up, and figured it would be best to just blitz.  In the evening, I was sort of fiddling around, and built this:

PDB v1

It’s probably only interesting to people who find performance monitoring interesting, but I like having it around, even in its very rough condition.  I would love to include the Talos graphs in there, since Talos data is a lot more relevant than the old-school tests, particularly around pageload.  Nevertheless, it beats clicking a hundred different links off the tinderbox waterfall, and it was a fun excuse to play with a tiny bit of jQuery too.

Johnath’s Performance Dashboard – Trunk
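For the curious, the jQuery in question amounts to very little – something in the flavour of the sketch below, with placeholder URLs standing in for the real tinderbox graph links, and no claim that this is the dashboard’s actual source:

// dashboard sketch – placeholder URLs, not the real graph endpoints
var graphs = [
  "http://example.com/graph?test=Ts", // startup time
  "http://example.com/graph?test=Tp"  // pageload
];
$(function () {
  $.each(graphs, function (i, url) {
    $("<img>").attr("src", url).appendTo("#dashboard");
  });
});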

[PS – NSID Day 12 – pretty damned shaggy.  Itch might be subsiding though!]

NSID

Johnath on NSID Day 7

A couple of years ago, when I still worked for IBM, there came a point – about a week into December – when I realised that I had no more user lab sessions and no more customer travel: no particular reason to keep myself presentable.  This was an opportunity not to be ignored.

I tend to shave pretty regularly, and I think people tend to prefer it that way, for the most part.  I do too, really.  But sometimes you need a chance to stretch your follicles and see what you’d look like if only.  And so, NSID was born:

No Shaving In December

I have been delinquent in not introducing the concept sooner, but in truth, the first NSID was not a full month long anyhow, and we keepers of the faith welcome late arrivals in any case.  Don’t view it as a contest, or a strict discipline; view it as an opportunity.

If you have to shave early because of some social function – so be it – consider resuming your hobo look afterwards if there’s still time.  If you have to shave it because it itches like an unholy FIRE, that’s okay.  NSID is not about judgement.  It’s about self-actualization, which, unless I am sorely mistaken, and I’m not, is right at the tippy-top of the god damned pyramid.  It’s the gift you give yourself.

Know too that you are not alone.  I am here.  Robcee is here.  Beltzner and bhearsum and claire are here too.  Shaver defied the destiny of his very name to join our motley crew, and mconnor is a member by default.

We have a flickr pool.  You know what to do.

Security Tidbits

How am I going to find a blog pic that talks about 'security' and 'donuts'? Oh, that was easy.

Tidbits, mind you, not Timbits.  Every time I’m dealing with non-Canadians in Canada, and they refer to “donut holes” when they clearly mean “Timbits,” I have a moment where I feel sort of embarrassed for them. Like they just said they were going to nip up the old gorn and scumbles for some hennylummers. Like they are hopelessly antiquated.  And then I remember that “Timbit”, like “Kleenex”, “Xerox” and “100% Beef,” is just a corporatism, and truly it is I who should feel ashamed. And I do. On with the show.

SSL Error Pages

Yes, again.  But just a quickie.  When I land bug 402207 later today, it will slightly change the way adding a security override works.  You’ll still have the option to add an exception when you visit a site with unverified security, but whereas until now the dialog that popped up would auto-fetch the certificate for you, it will now pre-populate the URL but make you fetch the certificate yourself.

This isn’t just a stupid attempt to annoy users more; it’s an attempt to make it easier to understand what’s going on.  The behaviour of our exception adding is now controlled by a preference named:

browser.ssl_override_behavior

With three values:

  • 0 = Don’t pre-populate the site URL or pre-fetch the certificate
  • 1 = Pre-populate the URL, but don’t pre-fetch the certificate (New default)
  • 2 = Pre-populate and pre-fetch (Old default)

Doing this means that the dialog has less text when users first see it, meaning users might be more inclined to actually read it.  It also doesn’t have an obvious one-click path: the user needs to fetch the certificate (at which point the problems will show up) and then add the exception.

Users who want to fast-track the process because they know what they’re doing can just switch that to “2”, and users (or possibly IT departments deploying Firefox internally) might also choose to set it to 0 to compel more user interaction before trust is given to an unverified site.
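If you want to flip it yourself, about:config works as always, or a line in user.js.  A minimal sketch – the pref name is the real one above; the value is whichever behaviour you want:

// user.js – restore the old auto-fetch behaviour (2), or use 0 for maximum caution
user_pref("browser.ssl_override_behavior", 2);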

EV Support

For all the talk about Larry and EV certificates, people might be wondering when they’ll start seeing them.  In a funny sort of way, they’re already there – all the code to DO stuff is there, but we don’t yet have any authorities “blessed” as being EV issuers.  So that code is idle at the moment.

Kai has now finished up bug 404592 though, which means testers on nightlies can turn on EV trust by setting an environment variable.  To see EV treatment on your (post-beta1) nightly, just run with:

NSS_EV_TEST_HACK=USE_PKIX

I won’t go into detail about how to set environment variables, because this only matters in the very short term anyhow, but for those who are fluent in this underworld machination, doing so will prematurely bless the Verisign EV root.  This doesn’t mean anything about Mozilla and Verisign and what certs will be trusted in Firefox 3; it’s purely a testing contrivance.  Live sites with Verisign EV certs include PayPal and eBay. Once we have at least one EV root in the trusted list, this hack won’t be necessary, and Larry will truly be free to roam.

[Update: It took one minute – sixty terran seconds – for Google to index this blog and give me sole possession of the googlerank for ‘hennylummers.’  Spooky.]

It’s On.

Firefox Racer

As announced Very Early In The Morning (EST) today, Firefox 3 Beta 1 is now live.

There is some appropriately scary text there about not downloading it unless you are a developer or a tester, and that’s good text to have, because we wouldn’t want people treating this like a final release.  But it’s pretty awesome, and if you don’t mind living a little bit on the edge, you should check it out.

There are a ton of changes, and as I’ve said here before, a lot of them are subtle.  I want very much to point out a bunch of them, but I also don’t, because I want to know what unprimed minds think of it.  I’ll leave it up to you – if you want to see a (non-exhaustive) list of the kinds of changes we’ve made, you can check the release notes.  If you don’t, skip straight to the announcement and grab a copy.

Once you’re on the beta, you’ll get updates as new betas come out, just like you do with Firefox 2 when we release security and stability updates.  Running the betas and letting us know what you think is a great way to help the project, even if you’ve never tried programming.  You’re a human and a web user; that’s as much expertise as we need.

Self-documenting

I know I’m weird, but I’ve always really liked the way roads combine with badly maintained trucks to create emergent topographical self-documentation.  Pictures are easier:

self documenting road

Notice the dark spots?  That particular stretch of road always drives the point home for me – every time the trucks in front of me hit a bump or dip in the road, it shakes some grease loose from their chassis and darkens the road a little bit.  Like ants finding efficient routes, it has always just sort of made me happy.

[Note: The embedded Google map got very, very broken in RSS, so I’ve replaced it with a static graphic.  Still, I suspect the RSS damage is done.]