ongoing by Tim Bray

Hydrofoiling 10 May 2021, 12:00 pm

I was standing on a floating dock on a Pacific-ocean inlet, a place where it’s obvious why motorboats are such an environmental disaster. Fortunately lots of other people have noticed too, and it looks increasingly likely that more people will be able to enjoy Messing About In Boats without feeling like they’re making Greta Thunberg justifiably angry at them.

Hm, this piece got kind of long. Spoiler: I’m here mostly to talk about electric hydrofoil boats from Candela, in Sweden, which look like a genuinely new thing in the world. But I do spend a lot of words setting up (separately) the electric-boat and hydrofoil propositions, which you can skip over if you’d like.

Background

Last year I wrote about personal decarbonization, noting that our family works reasonably hard at reducing our carbon load, the exception being that we still have a power boat that burns an absurd amount of fossil fuel for each km of ocean we cross.

Electric boats

Separately, back in 2019 I wrote a little survey of electric-boat options; my conclusion was that it shouldn’t be too long before they’re attractive for the “tugboat” flavor of pleasure boats. Which is perfectly OK unless you want to cover some distance and get there fast.

There’s been quite a bit of action in the electric-boat space. Here are some companies: VisionMarine Technologies (formerly the Canadian Electric Boat Company), Rand Boats, Greenline Yachts, and Zin Boats; and then here are Electrek’s electric-boats articles. If you like boats at all there’ll be pictures and videos in that collection that you’ll enjoy.

A lot of these are interesting, but (near as I can tell) none of them come close to combining the comfort, capacity, speed, and range that my family’s current Jeanneau NC 795 does.

But in the wake of Tesla, enough people are getting interested in this problem that it’s a space to watch.

That float

Here’s a picture of the float (and the boat). I was sitting down there on a lawn-chair enjoying the view, good company, and an adult beverage.

Floating dock on Howe Sound

What I noticed was that every time a motorboat went by, the wake really shook us up on the float. That float is a big honking chunk of wood, something like ten by twenty feet, and those powerboat wakes toss it around like a toy.

Which is to say, moving a boat of any size along at any speed throws a whole lot of water around. My impression is that most of the energy released by burning fossil fuel goes into pushing water sideways rather than fiberglass forward.

Let me quote from that Electric Boats piece: “There are two classes of recreational motorboat: Those that go fast by planing, and those that cruise at hull speed, which is much slower and smoother.”

Both of those processes involve pushing a lot of water sideways. And it turns out that my assertion is just wrong because there’s a third approach: the Hydrofoil. It’s not exactly a new idea either, the first patent was filed in 1869, and while they’re not common, you do see them around. In fact, here’s a picture of one I took in Hong Kong Harbour, of the fast ferry that takes high-rollers over to the casinos in Macau.

1994 picture of the fast hydrofoil Hong Kong-Macau ferry

If it looks a little odd that’s because I took it in 1994, with a film camera.

Hydrofoil sailing boats are a thing, including the AC75 design that was used in the most recent America’s Cup, probably the world’s highest-profile sailing race. These things are fast, coming close to 100 km/h. I strongly recommend this video report.

Then there’s the Vestas Sailrocket 2, which currently holds the world record for the highest sustained sailing speed over 500m: 65.45 knots, which is 121.21 km/h. There’s a nice write-up in Wired.

Also check out the Persico 69F, something like the AC75 only anyone can buy one. Here’s a story in Sailing World that has a lot of flavor.

Hydrofoil + electrons = Candela

I guess it’s pretty obvious where I’ve been heading: What about an electric hydrofoil motorboat? It turns out that there’s an outfit called Candela which is building exactly such a thing near Stockholm; their first boat is called the C-7. I reached out and got into an email dialog with Alexander Sifvert, their Chief Revenue Officer. It’s remarkably easy to strike up such a conversation when you open with “I’m a boater and an environmentalist and well-off.”

The idea is that you make an extremely light hull out of carbon fibre, give it an electric drive-train, and put it on software-controlled hydrofoils that lift the hull out of the water and adjust continuously to waves and currents, to give you a smooth, silent, fast ride. They claim the foils experience only a quarter of the water drag of a conventional hull going fast; typical cruising speed is 20 kt, or 37 km/h. So with a 40 kWh battery pack like the one in a BMW i3, they get 90+ km of range at cruising speed.
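Those numbers hang together. Here’s a quick back-of-envelope check in Go; the battery, range, and speed figures are the ones above, while the per-km energy and power-draw numbers are my own derivations, not Candela specs.

    package main

    import "fmt"

    func main() {
        const (
            batteryKWh = 40.0 // BMW i3-style pack
            rangeKm    = 90.0 // claimed range at cruise
            cruiseKmh  = 37.0 // 20 kt
        )
        perKm := batteryKWh / rangeKm // energy used per km at cruise
        powerKW := perKm * cruiseKmh  // implied average power draw
        fmt.Printf("%.2f kWh/km, %.1f kW at cruise\n", perKm, powerKW)
        // Prints roughly: 0.44 kWh/km, 16.4 kW at cruise.
        // Modest numbers, which is the whole point of foiling.
    }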

Obviously there’s vastly less wake, and once you’re out at sea the operating cost is rounding error compared to what you pay to fill up the average motorboat’s huge fuel tank. Also, experience from the electric-car world suggests that the service costs on a setup like this should be a lot less over the years.

All this on top of the fact that your carbon load is slashed dramatically, more if your local electricity is green but still a lot. I also appreciate the low wake and silence — at our cabin, overpowered marine engines going too fast too close to shore are a real quality-of-life issue.

The Candela website is super-ultra-glossy and not terribly information-dense. Soak up some of the eye-candy then jump to the FAQ.

But there are terrific videos. The one that most spoke to me was this, of a C-7 and a conventional motorboat running side by side in rough water, which we experience a lot of up here in the Pacific Northwest. Here we have a famous powerboat racer (yes, there are such things) taking a C-7 out for a spin. And here we have a C-7 cruising along beside one of those insanely-fast hydrofoil sailboats from Persico.

I’m seriously wondering if this is the future of recreational motorboating.

Would I get one?

Well, it’s a good thing the C-7 doesn’t use much fuel, because it’s really freaking expensive, like three times the price of my nice little Jeanneau. Which should not be surprising in a product that is built out of carbon fibre, by hand, “serially” they say, in a nice neighborhood near Stockholm.

In any case, I wouldn’t buy the C-7 model because it’s built for joyriding and sunbathing, and we use our boat for commuting to the cottage and as Tim’s office. Mr Sifvert agreed, suggested they might have something more suitable in the pipeline and wondered if I’d like to sign an NDA (I declined, no point hearing about something I can’t buy or write about right now). Separately, I note that they’re also working on a municipal-ferry product.

I’ll take a close look when the product Mr Sifvert hinted at comes out. But at that price point, I’ll need to try it out — I gather Candela is happy to give you a sea trial if you’re willing to visit Stockholm. Which actually doesn’t sound terrible.

Independent testimony would be nice too. Which is a problem; as I noted in my Jeanneau review (while explaining why I posted it), boats are a small market with a tight-knit community and truly impartial product evaluations by experts are hard to come by.

Futures

The core ideas behind the Candela C-7 aren’t that complicated: Electric powertrain, battery, hydrofoils, software trim control. Because of electric cars there’s a lot of battery and electrical-engine expertise about. Hydrofoils aren’t a new technology either. So getting the software right is maybe the crucial secret sauce?

Thing is, as a citizen of the planet, I’d like to see the roads full of electric cars — this is definitely going to happen — and the waterways full of boats that look like what Candela is building. Which won’t happen at the current price point, but I’m not convinced that can’t be driven down to what the boating world sees as a “mass-market” level. I’ve no way to know whether that will happen. I sure hope so.

Multimodality 5 May 2021, 12:00 pm

My Wednesday consisted mostly of running around and moving things. I used five transport modes and now I can’t not think about environmental impact and practicality and urban futures. Hop on board the car-share, boat, electric car, bus, and bike, and come along for the ride.

Background

What happened was, on Wednesday morning our boat was at the boatyard for a minor repair. They’d squeezed me in on the condition that I show up at 8AM to sail it away — they’re pretty well maxed out this time of year when everyone wants to get back on the water. Simultaneously, following on my recent bike accident, my bike was stuck in the repair shop — also maxed out in spring — because the replacement pedals they’d ordered were defective and the nearest available pair were in an indie bike shop in a distant suburb.

So, here’s what I did.

  1. Got up early and took a car-share which was just outside my house down to Granville Island, where the boatyard is.

  2. Putt-putted the boat over to my spot at the marina, hung out there while I cleaned up the boat, put it into office mode, and did a bit of work.

  3. Walked twenty minutes to the nearest car-share and took that home.

  4. Took the electric car 32km from central Vancouver to Cap’s South Shore Cycle in Delta, picked up the necessary bike part, came back to Vancouver to drop it off at the local bike repair that was working on my bike, then went home for lunch.

  5. When the shop texted me that they were done, took a handy local bus eight stops or so starting four blocks from my house and getting off around the corner from the bike shop.

  6. Biked home!

Now let’s consider the experience and the economics.

Car Share

There’s been a lot of churn in this market over the years, with Zipcar and Car2Go and so on; it may or may not be a thing where you are. Here in Vancouver it’s alive and well under the Evo brand. You pick up the one that’s nearest and drop it off at your destination. There’s a mobile app, obviously.

An Evo car-share

They’re all Toyota Prii and thus pretty energy-efficient. Also, since a significant part of an automobile’s carbon load comes from manufacturing, when you provide more trips with fewer cars, that should be a win.

Having said that, the carbon-load impact story is mixed. The reason is that they’re so damn convenient that they end up replacing, not solo car trips, but public-transit and bike trips. I can testify to that; when I was working downtown at Amazon, mornings when the weather was crappy or I wasn’t feeling 100%, I regularly yielded to the temptation to car-share rather than bike or take the train. On the other hand, we have friends who don’t own a car but probably would were it not for car-share availability.

Anyhow, it’s beastly complicated. Does car sharing reduce greenhouse gas emissions? Assessing the modal shift and lifetime shift rebound effects from a life cycle perspective dives deep on this. Big questions: What kind of trips is the car-share displacing, and do the vehicles wear out faster than individually-owned cars? The image below is from that paper.

Transport modes emission factors

Emission factors for three car-share scenarios. The Y axis is equivalent grams of CO2 per passenger-kilometre travelled.

My big gripe with the study is that in my experience, the kind of car offered by car-shares is quite a bit more efficient than the average privately-owned car on the road. I’m also puzzled by the low carbon cost assigned to manufacturing; from previous reading I had thought it was closer to 50%. No matter how you cut it, though, it’s not simple.

Boating

OMG forgive me, Mother Earth, for I have sinned. When we’re trundling along at 20+ knots to the cabin with weekend supplies, we get 1 km/litre, maybe 1.1. (It’s only 30km.) On the other hand, there’s this.

False Creek from a small boat

Shot while heading over to the boatyard a couple of days ago. It’s awfully nice to be out in a boat.

I console myself with the fact that this is the only petroleum-burning object in our possession; even our house heating and cooking is electric now. But still. There is hope; innovators all over the world are trying to figure out how to make boating less egregiously wasteful. For example, consider the lovely Candela boats, which combine modern battery technology with hydrofoils to get all that fiberglass up out of the water. They’re not the only ones; other manufacturers, mostly in Europe, are trying to get the right combination of range, performance, and carrying capacity. I sincerely hope to be able to buy such a thing in the boating years that remain to me.

But in my actual boat in the year 2021, it’s not good.

Driving (e-car)

I took another car-share Prius home (a long walk to get it, this time) and had time for a coffee before I fired up the electric car to go get bike parts.

Jaguar I-Pace in the rain

I confess to having been all excited about this trip; it’s been a year or more since I’ve had the Jag out on the highway. Traffic was light and I may have driven a little too fast; actually cried out with joy as I vectored into the approaches to one of the Annacis Island bridges. The contrast to the friendly-but-frumpy little Prius is stark.

Let’s look at another graph, from Which form of transport has the smallest carbon footprint?.

carbon footprint for various travel modes

I encourage visiting the piece, because the graphic is interactive; you can add lots more travel modes.

Where I live the electricity is very green, but even with dirty coal-based power, battery cars are still way more carbon-efficient than the gas-driven flavor simply because turning fossil fuels into forward motion is so beastly inefficient. Still, considering manufacturing carbon cost, a single human in a heavy metal box is never going to be a really great choice, environmentally.

But wait: This is exactly the kind of errand cars exist for. There’s no good argument for decent public-transit service between residential Vancouver and a small store in a strip mall in a remote suburb. It’s too far to bike. The advantages of car-share, as we’ve seen, aren’t that overwhelming.

I once read a piece of analysis — sorry, can’t remember where — that suggested a future where lots of people still have cars, but that they are rarely used, mostly just sit there. Until, for example, you have to head off to fetch something from a distant suburb. That sounds plausible to me, partly because it describes our situation: We regularly get plaintive complaints from the Jag’s remote-control app saying that since we haven’t gone near it in days, it’s going into deep-sleep mode.

Busing

I generally don’t mind taking public transit, if only for the people-watching, except when there’s rush-hour compression.

Unfortunately, the carbon economics depend really a lot on how full your buses and trains are. So much so that fossil-fuel shills have written deeply dishonest studies (which I’m not gonna link to) arguing that cars are more planet-friendly than buses unless the buses are really full all the time. In fact, in most places they’re full enough that the carbon arithmetic comes out in transit’s favor.

And there’s another subtle point: A successful public-transit system has to run some trains and buses at suboptimal loads because otherwise people won’t be able to depend on it to get around and will just go ahead and buy a car and then start driving everywhere.

And having said that, Covid is definitely not helping; check out this picture I took on the bus today.

On Vancouver’s #3 bus, May 5, 2021

Vancouver’s #3 Main Street bus on a Wednesday afternoon in May 2021.

Biking

Anyhow, I finally, weeks after my accident, found myself back on my wonderful e-bike, heading home.

Trek e-bike at Vancouver’s False Creek

So far on this journey there’s been a whole lot of “It’s complicated.” No longer. Bikes’ carbon load is vanishingly small compared to any other way of traveling further than you want to walk. Plus they’re fun. Plus riding them is good for you. E-bikes hugely expand the range of situations where biking is an option for a non-young non-athletic person.

It’s not complicated. We need more bikes generally and more e-bikes specifically on the road, which means your city needs to invest in bike-lane infrastructure to make people safe and comfortable on two wheels.

Amazon Q1 2021 2 May 2021, 12:00 pm

Amazon announced their financial results for the January to March quarter last Thursday. I was reading them when an email popped up asking if I wanted to talk about them on CNBC Squawk, which I did. In preparation, I re-read the report and pulled together a few talking points; here they are.

Top line

Amazon’s gross revenue increased 41% year-over-year, to $108.5B. To use a technical term: HOLY SHIT! This sort of growth on that sort of base is mind-boggling. Granted, shoppers suffering from Covid coop-up played a substantial role. But still, at some point you have to run into the Law of Big Numbers.

Branching out

But maybe not. One way to increase revenue is to enter more markets, and does Amazon ever do that. The quarterly summary mentioned online pharmacy, NFL merch, “Amazon One” payment tech, “Amazon Business” global procurement system with 5M customers and a $25B top-line, Prime Video’s partnership with the New York Yankees’ media empire, wireless earbuds, and video doorbells. Is there any business sector Amazon is not charging into?

Which brings us to…

AWS

Revenue moved up to a $54B annual run rate, the highlight being 30%-ish growth with 30%-ish operating margin, generating 47% of Amazon’s overall profit. I’d use another technical term but you get the idea.

AWS benefits less from Covid than the rest of the company, I guess; but at this revenue level, both those 30% numbers are pretty astonishing. People wonder why Andy Jassy got the CEO job, but these numbers are all you need to know.

There’s lots more room for growth, too. AWS is like $54B/year these days and my guess is that’s probably around a third of global public-cloud spend. Global Enterprise IT spend is a much, much, bigger number, so there’s no risk of the Cloudistas hitting the limits of their addressable market any time soon.

On the other hand, as I’ve said before, there’s a major potential headwind here. Anyone who’s spending say $1M/month with AWS (that’s $12M/year) has to be thinking that, at that 30%-ish operating margin, they’re sending $3.6M/year straight to Amazon’s bottom line, and that’s not a very comfortable feeling if Amazon is competing with you, which it probably is, and if it isn’t, apparently will be next year.

If I were an Amazon shareholder (I’m not) or executive, I’d seriously look at doing that AWS spin-off while they can do it the way they want, rather than the way a Washington DC that seems unusually flush with antitrust energy might dictate while pointing a gun at them.

Climate emergency

Moved up the 100%-renewable-energy target from 2030 to 2025, yay! Journalists: keep a close eye on them for evidence of dodgy carbon accounting, which is not exactly rare in the business community. If they manage to pull this off, that’s a titanic achievement and I hope they share lots of details, because Amazon is a reasonably typical enterprise, just bigger; plenty to learn here.

The other interesting thing is the Climate Pledge, which now has over 100 signatories including a whole bunch of famous names. This may be significant; when Amazon announced the Pledge, I was too polite to say that it looked like it’d been hastily cooked up and there weren’t many, as in any, other recognizable names attached. The Pledge’s zero-carbon date is 2040, which is way too late, but still, let’s hope this turns into a good thing for the world.

People issues

In Jeff’s goodbye letter he said they were adding top-level corporate goals, and they have. The new text: Amazon strives to be Earth’s Most Customer-Centric Company, Earth’s Best Employer, and Earth’s Safest Place to Work. The last two of those three didn’t use to be there. And I place a whole lot of weight on the last one, about safety. It didn’t get that much public traction, but I thought the most damaging Amazon reportage of the last couple of years was the series of stories about elevated injury rates at Amazon warehouses. Credit is due to Will Evans and the people at RevealNews.

I really sincerely hope that Amazon can bend this particular curve in a better direction.

But while that’s important, there’s a way more important “people” issue. When I got on CNBC, the host (in a segment omitted from the excerpt above) asked me about the unionization story and Amazon announcing a general warehouse-worker raise and so on. He asked “So, are they doing enough?”

No, obviously.

The developed world’s egregious inequality curve, which has been getting monotonically worse since the Seventies, is not going to be addressed by another buck an hour for powerless entry-level workers. Anyhow, it’s not an Amazon problem; Amazon is a symptom. And if you have to take a disempowered entry-level job, there are way worse places to go than Amazon.

I don’t claim to understand American Politics, but I do note that both Bernie Sanders and Josh Hawley are mad at big tech in general and Amazon in particular. But it seems obvious to me that, going forward, a central issue for business leaders — maybe the central issue — will be dealing with political pressure to redress society’s imbalances.

Once again, if I were in leadership, I’d be working on getting out in front on this stuff while I still have the reins in my hands.

Long Links 1 May 2021, 12:00 pm

Welcome once again to Long Links, a monthly curation of long-form pieces that pleased and educated me and that being semi-retired gives me time to enjoy; offered in the hope that one or two might enrich the lives of busier people.

It’s simple (and accurate) to say that Xi Jinping’s regime is barbaric, obscurantist, and brutal, and that the people of China are badly mis-governed. That given, the place is interesting; I for one am continuously astonished that they manage to hold it all together. Maria Repnikova on How China Tells its Story dives deep on this; on the very-grey line between what can and can’t be said, what Chinese propagandists think it is they’re trying to do, and (especially) the differences between the Chinese and Russian approaches to storytelling.

Scale Was the God That Failed is by Josh Marshall, founder and editor of Talking Points Memo, a rare example of an all-digital publication that grew out of a blog and has become a sustainable business. It’s not actually long-form measured by number of words, but contains more intellectual meat on the subject of Internet Publishing than you can find in a dozen typical discourses on the subject. Read this if you want to learn why Internet advertising is so awful, why your local paper went out of business, and why it’s not being replaced by high-profile Web publishers, who these days are mostly in the news for yet another round of journalist layoffs. Here are a couple of sound-bites. [Historically, newspapers and TVs and radio stations enjoyed local monopolies, thus…] “almost all of the elements of good, newspaper journalism—big newsrooms paying middle-class salaries and giving reporters the time to get the story right—were made possible by those monopolies.” Also: “The chronic oversupply of publications chasing a fixed number of ad dollars has required publishers to continually charge less for ads that demand more of readers.” Anyhow, investment-driven big-journo launches have pretty well all failed. Marshall offers several examples of things that have worked, including his own publication. But there’s still a crisis in journalism, which is really bad for democracy.

The Healing Power of Javascript, by well-known writer, photographer, and techie Craig Mod, is probably only for geeks. It’s about the way that coding can be a morale-booster, particularly in depressive plague times. I found it heart-warming. Consider this: “With static sites, we’ve come full circle, like exhausted poets who have travelled the world trying every form of poetry and realizing that the haiku is enough to see most of us through our tragedies.”

Regular readers will know that I’m an audiophile; I used to really enjoy advising people on buying and setting up good-sounding stereo gear, but that seems to happen less these days. Still, judging by the intense traffic at my local record store, there are a lot of people listening to vinyl. I’m not a fanatic, listen mostly to digital music, but still find the experience of listening to vinyl entrancing. I really liked How to Buy the Best Record Player and Stereo System for Any Budget in Pitchfork. I haven’t heard a lot of the stuff they recommend, but here’s the advice I’ve often given to non-obsessive people who want good sound at a fair price: Decide how much you want to spend. Go buy speakers from PSB, amplification from NAD, and a record player from Rega that fit in your budget; they all have large product lines with entries at all the sane price-points. You’ll be happy. (But don’t buy a Rega cartridge.) Having said that, Marc Hogan, who wrote the Pitchfork piece, probably has fresher data.

Moving from Pitchfork to discogs.com, here are the 50 Most Popular Live Albums of All Time and The 200 Best Albums of the 2010s. I like the methodology; Discogs has as much claim to know what music people like as any organization in the world. As an audiophile and music lover, I have a soft spot for live recordings; a band on stage has more adrenaline in its veins than any studio can generate, and there’s less production interference between the music and you. Of the 50 mentioned here, I own 15. As for the Music Of The Teens, I have only seven. Because my taste these days has become less mainstream. Because what, in this century, are “albums”, anyhow? And because I’m an out-of-touch old fart.

Here’s another one that’s probably geek-only: The TeX tuneup of 2021, by Donald Knuth. TeX, these days, is pretty well only used for scientific publishing, and in just a few science neighborhoods. But it’s wonderful that a piece of software that’s so old and, to be honest, so old-fashioned, is still lovingly maintained. Knuth thinks this may be the last refresh.

This is only 3:54 long and not really a video, just panning and zooming around a photo of the moon. But, wow. The photo is taken with a Leica APO-Telyt-R 2.8/400mm, of which apparently only 390 were ever made, starting back in the Nineties. As I write, you can get one on eBay for $13,999. Might even be worth it.

Speaking of fine photographs captured with unconventional technology, take a tour through Interview: David 'Dee' Delgado’s love letter to New York City, shot on 4x5 film. What beautiful colors. To quote Delgado, self-described as a Puerto Rican independent photographer based in New York City: “This whole project was shot on a Toyo 45A which is a 4x5 field camera. It’s not a light camera. It’s a heavy camera. Lugging that camera along with the film, film holders, and a dark cloth is not an easy task. Shooting 4x5 is a lot slower, especially when you are using a field camera. It slows things down and lets you connect…” It sure connected with me.

Now let’s enter dangerous territory, with The Sexual Identity That Emerged on TikTok. On the surface, it’s about what might or might not be a segment of the gender/sexual spectrum: “super-straight”. On the other hand, it might be ignorant hatefulness. Whatever you may think on that issue, I found Conor Friedersdorf’s exploration of the conversation fascinating. As if discussing the issues weren’t difficult enough, he widens his scope to a broader survey of Internet conversation, and stays humane in the face of inhumanity. Anyhow, educational and thought-provoking. Even if you’re outraged.

The Ever Given is still stuck (legally, now) in the Suez Canal, but it’s not news any more. I think it still has stories to tell, though, and offer in evidence Gargantuanisation by John Lanchester in the LRB. [That’s a link to a Google search because the LRB URL is flaky and doesn’t seem to support direct linking.] Lanchester is a very good writer and has first-person experience with problems in the Canal. He has plenty to teach about what the Ever Given means and what trends it exemplifies. The story starts with a question: Have you ever thought about what ship might have brought whatever’s in the package that Amazon just delivered to you across whatever ocean needed crossing? Does anyone? This is a large and pretty well invisible segment of the economy. It’ll be less invisible if you read this.

These days I hear less talk about God from politicians, even in America, where such talk used to be not only commonplace but nearly compulsory. It’s not hard to figure out why; Americans are less religious. 538 does the numbers in It’s Not Just Young White Liberals Who Are Leaving Religion. I’m not sure why this is surprising in a world where supernatural events are not observed and prayers aren’t answered. But it changes lots of societal dynamics and needs to be talked over.

From The Health Care Blog: America’s Health and The 2016 Election: An Unexpected Connection. The piece isn’t that long but I include it because of the graph at the top, entitled “Vitality and the Vote”. It is astonishingly information-dense, the kind of thing that features in the books of Edward Tufte. Looking at it, I observe the following:

  1. The graph measures, not vote distribution by geography, but its first derivative, vote movement.

  2. There is more geography in the South and Midwest than the West and Northeast.

  3. Midwestern Americans are remarkably less healthy than those to their east and west, with the Southerners in between.

  4. [The headline.] Healthier people swung progressive, unhealthy ones to Trump. Obviously there’s a correlation here with age.

  5. I wish there was a mouse-over so I could ask about anomalously outlying counties.

The accompanying text is competent and useful, but wow, that graph.

I don’t know much about Alex Steffen, but his Climate-Crisis coverage has impressed me. The Last Hurrah, from his Substack, asks what a climate radical (like me) should think of Joe Biden’s progress thus far. Despite the obvious fact that Biden’s program is woefully inadequate in the face of onrushing disaster, Steffen finds grounds for optimism. Which is something that we can all use some of these days.

Now, here’s a high-impact story. Microsoft is going to change the default font in Windows from the current and reasonably-OK Calibri. Many people, as in billions, are going to spend a lot of time looking at screens-full of this stuff, so the future of humanity is significantly affected. Thanks to Scott Hanselman for posting a small sample. My response: We can rule out Seaford for its absurd ’ (right single quote). Skeena [swoosh] looks [swoosh] like it’s sponsored by Nike. Something in Grandview makes the kerning hurt. Bierstadt wins for me because the chars snuggle up together with a little more flow.

Finally, you want long-form? Dive into How the Pentagon Started Taking U.F.O.s Seriously, which is a monster. And never not interesting. And no, you probably don’t have to take UFOs seriously even if you enjoy reading a few thousand words about those who do, which you probably will — enjoy reading I mean, not taking seriously.

Twilights 28 Apr 2021, 12:00 pm

I went out for a walk well into twilight time, put the camera in see-in-the-dark mode, fitted a fast friendly lens, and pointed it at pretty things.

Old East Vancouver house at twilight

This is in Eastern Vancouver (“East Van” everyone says), around where Riley Park becomes Mount Pleasant. A lot of the houses are old and, what with Vancouver’s real-estate craziness, probably doomed. Sadly, they won’t be missed very much I suspect.

Light display in an East Vancouver garden

It’s awfully nice of people to put up lights simply for the enjoyment of light.

Modern cameras, what can I say. Just set the ISO at 12800, free the lens to find whatever aperture keeps the shutter speed around 1/40th of a second or faster, and you effortlessly get all these moody shots that would have been simply impossible for any previous generation of photographers.

Streetlights and cherry blossoms in East Vancouver twilight

Those awful old sodium-yellow streetlights are too much with us. Modern lights have less-awful colors. The clash with the cherry blossoms may not be beautiful but is interesting.

Lights amid tulips in East Vancouver twilight

We always have tulips this time of year, but something about 2021 has driven East Van gardeners into tulip frenzy, perhaps as a relief from Covid-lockdown pain?

East Vancouver alley at twilight

Ah, that sodium shade. The alley looks kinda scary but was actually perfectly welcoming.

Algorithm Agility? 24 Apr 2021, 12:00 pm

What happened was, I was fooling around with zero-knowledge proof ideas and needed to post public keys on the Internet in textual form. I picked ed25519 keys (elliptic-curve, also known as EdDSA) so I asked the Internet “How do you turn ed25519 keys into short text strings?” The answer took quite a bit of work to find and, after I posted it, provoked a discussion about whether I was doing the right thing. So today’s question is: Should these things be encoded with the traditional PKIX/PEM serialization, or should developers just blast the key-bits into base64 and ship that?

Previously

Old-school key wrapping

Traditionally, as described in the blog piece linked above, the public key, which might be a nontrivial data structure, is serialized into a byte blob which includes not just the key bits but metadata concerning which algorithm applies, bit lengths, and hash functions.

When I say “old-school” I mean really old, because the technologies involved in the process (ASN.1, PKIX, PEM) date back to the Eighties. They’re complicated, crufty, hard to understand, and not otherwise used in any modern applications I’ve ever heard of.

Having said all that, with a couple of days of digging and then help from YCombinator commentators, the Go and Java code linked above is short and reasonably straightforward and pretty fast, judging from my unit testing, which round-trips a thousand keys to text and back in a tiny fraction of a second.

Since the key serialization includes metadata, this buys you “Algorithm Agility”, meaning that if the flavor of key you’re using (or its supporting hash or whatever) became compromised and untrustworthy, you can change flavors and the code will still work. Which sounds like a valuable thing.
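Here’s a minimal Go sketch (my own illustration, not anyone’s shipping code) of what that metadata buys you: one parse call, then dispatch on whatever algorithm the serialization declares.

    package main

    import (
        "crypto/ecdsa"
        "crypto/ed25519"
        "crypto/rsa"
        "crypto/x509"
        "fmt"
    )

    // keyKind reports which algorithm a PKIX-serialized public key carries;
    // the metadata in the serialization is what makes the dispatch possible.
    func keyKind(der []byte) (string, error) {
        key, err := x509.ParsePKIXPublicKey(der)
        if err != nil {
            return "", err
        }
        switch key.(type) {
        case ed25519.PublicKey:
            return "ed25519", nil
        case *rsa.PublicKey:
            return "RSA", nil
        case *ecdsa.PublicKey:
            return "ECDSA", nil
        default:
            return "", fmt.Errorf("unsupported key type %T", key)
        }
    }

    func main() {
        pub, _, _ := ed25519.GenerateKey(nil) // nil reader means crypto/rand
        der, _ := x509.MarshalPKIXPublicKey(pub)
        kind, _ := keyKind(der)
        fmt.Println(kind) // prints "ed25519"
    }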

There is, after all, the prospect of quantum computing which, assuming anyone can ever get the hardware to do anything useful, could crack lots of modern crypto, notably including ed25519. I know very smart people who are betting on quantum being right around the corner, and others, equally smart, who think it’ll never work. Or that if it does, it won’t scale.

The simpler way

Multiple commentators pointed out that ed25519 keys and signatures aren’t data structures, just byte arrays. Further, that there are no options concerning bit length or hash algorithm or anything else. Thus, arguably, all the apparatus in the section just above adds no value. In fact, by introducing all the PKIX-related libraries, you increase the attack surface and arguably damage your security profile.
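By way of contrast, here’s a minimal Go sketch of that raw approach; there’s no PKIX anywhere, and a length check is about the only validation available.

    package main

    import (
        "crypto/ed25519"
        "crypto/rand"
        "encoding/base64"
        "fmt"
    )

    func main() {
        pub, _, err := ed25519.GenerateKey(rand.Reader)
        if err != nil {
            panic(err)
        }
        // An ed25519 public key is just 32 bytes; base64 them and ship.
        text := base64.StdEncoding.EncodeToString(pub)
        fmt.Println(len(text), text) // 44 characters

        // Reading it back: decode and check the length, and that's all
        // the checking there is to do.
        raw, err := base64.StdEncoding.DecodeString(text)
        if err != nil || len(raw) != ed25519.PublicKeySize {
            panic("not an ed25519 public key")
        }
        pub2 := ed25519.PublicKey(raw)
        fmt.Println(pub.Equal(pub2)) // true
    }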

Furthermore, they argue, ed25519 is not likely to fail fast; if attacks start creeping up on it, there’ll be plenty of time to upgrade the software. I can testify that I learned of multiple in-flight projects that are going EdDSA-and-nothing-else. And, to muddy the waters, another that’s invented its own serialization with “a 2-3 byte prefix to future proof things.”

Existence proof

I’m working on a zero-knowledge proof where there are two or more different public posts with different nonces, the same public key, and signatures. The private key is discarded after the nonces are signed and the posts are generated, and keypairs aren’t allowed to be re-used. In this particular case it’s really hard to imagine a scenario where I’d feel a need to switch algorithms.

Conclusions?

The question mark is because none of these are all that conclusive.

  1. Algorithm agility is known to work, happens every time anyone sets up an HTTPS connection. It solves real problems.

    [Update: Maybe not. From Thomas Ptacek: “Algorithm agility is bad… the essentially nonexistent vulnerabilities algorithm agility has mitigated over TLS’s lifetime…”]

  2. Whether or not we think it’s reasonable for people to build non-agile software that’s hardwired to a particular algorithm in general or EdDSA in particular, people are doing it.

  3. I think it might be beneficial for someone to write a very short three-page RFC saying that, for those people, just do the simplest-possible Base64-ification of the bytes. It’d be a basis for interoperability. This would have the potential to spiral into a multi-year IETF bikeshed nightmare, though.

  4. There might be a case for building a somewhat less convoluted and crufty agility architecture for current and future public-key-based applications. This might be based on COSE? This would definitely be a multi-year IETF slog, but I dunno, it does seem wrong that to get agility we have to import forty-year-old technologies that few understand and fewer like.

The current implementation

It’s sort of the worst of both worlds. Since it uses the PKIX voodoo, it has algorithm agility in principle, but in practice the code refuses to process any key that’s not ed25519. There’s an argument that, to be consistent, I should either go to brute-force base64 or wire in real algorithm agility.

Having said that, if you do need to do the PKIX dance with ed25519, those code snippets are probably useful because they’re simple and (I think) minimal.

And another thing. If I’m going to post something on the Internet with the purpose of having someone else consume it, I think it should be in a format that is described by an IETF RFC or W3C Rec or other stable open specification. I really believe that pretty strongly. So for now I’ll leave it the way it is.

How to Interchange Ed25519 Keys 19 Apr 2021, 12:00 pm

Herewith pointers to Java 15 and Go code that converts Ed25519 public keys back and forth between short text strings and key objects you can use to verify signatures. The code isn’t big or complicated, but it took me quite a bit of work and time to figure out, and led down surprisingly dusty and ancient pathways. Posted to help others who need to do this, and perhaps to provide mild entertainment.
[Update 04/23: “agwa” over at YCombinator showed how to simplify the Go with x509.MarshalPKIXPublicKey and x509.ParsePKIXPublicKey.]

They call modern crypto “public-key” because the keys are public: people can post them on the Internet so other people’s code can use them, for example to verify signatures.

How, exactly, I wondered, would you go about doing that? The good news is that, in Go and Java 15 at least, there is good core library support for doing the necessary incantations, with no need to take any external dependencies.

If you don’t care about the history and weirdness, here’s the Go code and here’s the Java.

Now, on to the back story. But first…

Should I even do this?

“Don’t write your own crypto!” they say. And I’ve never been tempted. But I wonder if there’s a variant form that says “Don’t even use the professionally-crafted crypto libraries and fool around with keypairs unless you know what you’re doing!”

Because as I stumbled through the undergrowth figuring this stuff out in the usual way, via Google and StackOverflow, I did not convince myself that the path I was following was the right one. I wondered if there was a gnostic body of asymmetric-crypto lore such that, had I ever studied it, I’d know the right way to do what I was trying to do. And since I hadn’t studied it, I should just leave this stuff to the people who had. I’m not being ironic or sarcastic here, I really do wonder that.

Let’s give it a try

I’m building bits and pieces of software related to my @bluesky Identity scheme. I want a person to be able to post a “Zero-Knowledge Proof” which includes a public key and a signature so other people can check the signature — go check that link for details. I started exploring using the Go programming language, which has a nice generic crypto package and then a boatload of other packages for different key flavors, with names like aes and rsa and ed25519 that even a crypto peasant like me recognizes.

Ed25519

This is the new-ish hotness in public-key tech. It’s fast and very safe and produces small keys and signatures. It’s also blessedly free of necessary knobs to twist, for example you don’t have to think about hash functions. It’s not quantum-safe but for this particular application that’s a complete non-issue. I’ve come to like it a lot.

Anyhow, a bit of poking around the usual parts of the Internet suggested that what a good little Gopher wants is crypto/ed25519. For a while I was wondering if I might need to support other algorithms too; let’s push that on the stack for now.
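For the record, the package’s API is pleasantly small. Here’s a minimal sketch of the whole generate/sign/verify cycle, all standard library (the message is an arbitrary example):

    package main

    import (
        "crypto/ed25519"
        "crypto/rand"
        "fmt"
    )

    func main() {
        // No knobs: no key sizes to pick, no hash functions to configure.
        pub, priv, err := ed25519.GenerateKey(rand.Reader)
        if err != nil {
            panic(err)
        }
        msg := []byte("hello, ongoing")
        sig := ed25519.Sign(priv, msg)             // 64-byte signature
        fmt.Println(ed25519.Verify(pub, msg, sig)) // true
    }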

Base64 and PEM and ASN.1, oh my!

I remembered that these things are usually in base64, wrapped in lines that look like -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY-----. A bit of Googling reminds me that this is PEM format, developed starting in 1985 as “Privacy-Enhanced Mail”. Suddenly I feel young! OK then, let’s respect the past.

Go has a pem package to deal with these things, works about as you’d expect. And what, I wondered, might I find inside?
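Roughly like so; the example key here is the Ed25519 one from RFC8410, and what pem.Decode hands back is the DER-encoded ASN.1 that the next few sections dig into.

    package main

    import (
        "encoding/pem"
        "fmt"
    )

    // The Ed25519 example public key from RFC8410.
    const pubPEM = "-----BEGIN PUBLIC KEY-----\n" +
        "MCowBQYDK2VwAyEAGb9ECWmEzf6FQbrBZ9w7lshQhqowtrbLDFw4rXAxZuE=\n" +
        "-----END PUBLIC KEY-----\n"

    func main() {
        block, _ := pem.Decode([]byte(pubPEM))
        if block == nil {
            panic("no PEM block found")
        }
        // block.Type is "PUBLIC KEY"; block.Bytes is DER-encoded ASN.1.
        fmt.Println(block.Type, len(block.Bytes), "bytes of ASN.1")
    }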

Abstract Syntax Notation One

Everyone just says ASN.1. It’s even older than PEM, dating from 1984. I have a bit of a relationship with ASN.1, where by “a bit of a relationship” I mean that back in the day I hated it intensely. It was in the era before Open Source, meaning that if you wanted to process ASN.1-encoded data, you had to buy an expensive, slow, buggy, parser with a hostile API. I guess I’m disrespecting the past. And it doesn’t help that I have a conflict of interest; when I was in the cabal that cooked up XML, ASN.1 was clearly seen as Part Of The Problem.

Be that as it may, when you have a PEM you have ASN.1 and if you’re a Gopher you have a perfectly-sane asn1 package that’s fast and free. Yay!

Five Oh Nine

It turns out you can’t parse or generate ASN.1 without having a schema. For public keys (and for certs and lots of other things), the schemas come from X.509 which is I guess relatively modern, having been launched in 1988. I know little about it, but have observed that among security geeks, discussion of X.509 (later standardized in the IETF as PKIX) is often accompanied by head-shaking and sad faces.

Key flavors

OK, I’ve de-PEM-ified the data and fed the results to the ASN.1 reader, and now I have what seems to me like a simple and essential question: What flavor of key does this represent? Maybe because I want to toss anything that isn’t ed25519. Maybe because I want to dispatch to the right crypto package. Maybe because I just want to freaking know what this bag of bits claims to be, a question the Internet should have an answer for.

Ladies, gentlemen, and others, when you ask the Internet “How do you determine what kind of public key this is?” you come up empty. Or at least I did. Eventually I stumbled across Signing JWTs with Go's crypto/ed25519 by Blain Smith, to whom I owe a debt of thanks, because it doesn’t assume you know what’s going on, it just shows you step-by-step how to unpack a PEM of an ASN.1 of an ed25519 public key.

It turns out that what you need to do is dig into that ASN.1 data and pull out an “Object Identifier”. At which point my face brightened up because do I ever like self-describing data. So I typed “ASN.1 Object Identifier” into Google and, well, unbrightening set in.

We must go deeper

At which point I wrote a little Go program whose inputs were a random RSA-key PEM I found somewhere and the ed25519 example from Blain Smith’s blog. I extracted an Object Identifier from each and discovered that Object Identifiers are arrays of numbers; for the RSA key, 1.2.840.113549.1.1.1, and for the elliptic-curve key, 1.3.101.112.
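A reconstruction of that little program, with the RFC8410 example key standing in for the ones I actually used; the shape of the outer structure is the trick I learned from Blain Smith’s post.

    package main

    import (
        "crypto/x509/pkix"
        "encoding/asn1"
        "encoding/pem"
        "fmt"
    )

    // The outer PKIX structure: an algorithm identifier, then the key bits.
    type subjectPublicKeyInfo struct {
        Algorithm pkix.AlgorithmIdentifier
        PublicKey asn1.BitString
    }

    // The Ed25519 example public key from RFC8410 again.
    const pubPEM = "-----BEGIN PUBLIC KEY-----\n" +
        "MCowBQYDK2VwAyEAGb9ECWmEzf6FQbrBZ9w7lshQhqowtrbLDFw4rXAxZuE=\n" +
        "-----END PUBLIC KEY-----\n"

    func main() {
        block, _ := pem.Decode([]byte(pubPEM))
        var spki subjectPublicKeyInfo
        if _, err := asn1.Unmarshal(block.Bytes, &spki); err != nil {
            panic(err)
        }
        fmt.Println(spki.Algorithm.Algorithm) // 1.3.101.112, i.e. ed25519
    }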

So I googled those strings, and “1.3.101.112” led me to Appendix A of RFC8420, which has a nice simple definition and a note that the real normative reference is RFC8410, whose Section 3 discusses the question but never actually spells out the dotted-number strings encoded in the ASN.1.

“Oh,” I thought, “there must be a helpful registry of these values!” There sort of is, which I found by pasting the string values into Google: the informal but reasonably complete OID Repository. Which, frankly, doesn’t look like an interface you want to bet your future on. But it did confirm that 1.2.840.113549.1.1.1 means RSA and 1.3.101.112 means ed25519.

So I guess I could write code based on those findings. But first, I decided to write this. Because the journey was not exactly confidence-inspiring. After all, public-key cryptography and its infrastructure (usually abbreviated “PKI”) is fucking foundational to fucking Internet Security, by which these days I mean banking and payments and privacy and generally truth.

And I’m left hoping the trail written in fairy dust on cobwebs that I just finished following is a best practice.

Then I woke up

At this point I talked to a friend who is crypto-savvy and asked “Would it be OK to require just ed25519, hardwire that into the protocol and refuse to consider anything else?” They said: “Yep, because first of all, these are short-lived signatures and second, if ed25519 fails, it’ll fail slowly and there’ll be time to migrate to something else.” Which considerably simplified the problem.

And by this time I understood that the conventional way to interchange these things is as the base64 encoding of ASN.1 serializations of PKIX-specified data structures. It would have been nice if one of my initial searches had turned up a page saying just that. And it turns out that there are libraries to do these things, and that they’re built into modern programming languages, so you don’t have to take dependencies.

G, meet J

So what I did was write two little libraries, one each in Go and Java, to translate public keys back and forth between native program objects and base64, either wrapped in the BEGIN/END cruft or not.

First let’s go from in-program data objects to base64 strings. In Go you do like so:

  1. Use the x509 package’s MarshalPKIXPublicKey method to turn the key into the ASN.1-serialized bytes.

  2. Base64 the bytes. You’re done!
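In code, those two steps come to about this (a sketch, not necessarily identical to the code linked above):

    package main

    import (
        "crypto/ed25519"
        "crypto/rand"
        "crypto/x509"
        "encoding/base64"
        "fmt"
    )

    func main() {
        pub, _, err := ed25519.GenerateKey(rand.Reader)
        if err != nil {
            panic(err)
        }
        der, err := x509.MarshalPKIXPublicKey(pub) // step 1: ASN.1 bytes
        if err != nil {
            panic(err)
        }
        fmt.Println(base64.StdEncoding.EncodeToString(der)) // step 2: done
    }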

In Java it’s like this:

  1. Use the getEncoded method of PublicKey to get the ASN.1-serialization byte sequence.

  2. Base64 the bytes, you’re done.

By the way, the base64 is only sixty characters long, gotta love that ed25519.

The next task is to read that textually-encoded public key into its native form as a programming-language object that you can use to verify signatures. In Go:

  1. De-base64 the string into bytes.

  2. Use the x509 package’s ParsePKIXPublicKey method to do the ASN.1 voodoo and extract the public key.

  3. The key comes back as an interface{} so you have to do some typecasting to get the ed25519 key, but then you can just return it.
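Here’s a sketch of those three steps as one self-contained function (the function name is mine), exercised with the RFC8410 example key:

    package main

    import (
        "crypto/ed25519"
        "crypto/x509"
        "encoding/base64"
        "fmt"
    )

    func parsePublicKey(text string) (ed25519.PublicKey, error) {
        der, err := base64.StdEncoding.DecodeString(text) // step 1: de-base64
        if err != nil {
            return nil, err
        }
        key, err := x509.ParsePKIXPublicKey(der) // step 2: the ASN.1 voodoo
        if err != nil {
            return nil, err
        }
        pub, ok := key.(ed25519.PublicKey) // step 3: typecast
        if !ok {
            return nil, fmt.Errorf("not an ed25519 key: %T", key)
        }
        return pub, nil
    }

    func main() {
        pub, err := parsePublicKey(
            "MCowBQYDK2VwAyEAGb9ECWmEzf6FQbrBZ9w7lshQhqowtrbLDFw4rXAxZuE=")
        if err != nil {
            panic(err)
        }
        fmt.Println(len(pub)) // 32
    }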

For Java:

  1. Discard the BEGIN and END crap if it’s there.

  2. Decode the remaining Base64 to yield a stream of bytes, which is the ASN.1 serialization of the data structure.

  3. Now you need an ed25519-specialized KeyFactory instance, for which there’s a getInstance method.

  4. Now you make a new X509EncodedKeySpec by feeding the ASN.1 serialized bytes to its constructor.

  5. Now your KeyFactory can generate a public key if you feed the X509 thing to its generatePublic method. You’re done!

Other languages?

It might be of service to the community for someone else to bash equivalent tools out in Python and Rust and JS and whatever else, which would be a good thing because public keys are a good thing and ed25519 is a good thing.

References

Useful blogs & articles:

  1. Java EdDSA (Ed25519 / Ed448) Example

  2. How to Read PEM File to Get Public and Private Keys

  3. Export & Import PEM files in Go

  4. Signing JWTs with Go’s crypto/ed25519

  5. Ed25519 in JDK 15, Parse public key from byte array and verify. I think this one is a hot mess, but it had high Google Juice, so consider this pointer cautionary. I ended up not having to do any of this stuff.

  6. ASN.1 key structures in DER and PEM

RFCs:

  1. 8419: Use of Edwards-Curve Digital Signature Algorithm (EdDSA) Signatures in the Cryptographic Message Syntax (CMS)

  2. 8032: Edwards-Curve Digital Signature Algorithm (EdDSA)

  3. 7468: Textual Encodings of PKIX, PKCS, and CMS Structures

  4. 8410: Algorithm Identifiers for Ed25519, Ed448, X25519, and X448 for Use in the Internet X.509 Public Key Infrastructure

  5. 8420: Using the Edwards-Curve Digital Signature Algorithm (EdDSA) in the Internet Key Exchange Protocol Version 2 (IKEv2)

Spring Flowers, 2021 11 Apr 2021, 12:00 pm

48 hours ago I got my first Covid-19 vaccine dose, and today I took the camera for a stroll, hunting spring flowers. What a long strange trip it’s been.

Timothy Bray was vaccinated April 9th, 2021

Should I be concerned that the drugstore guy didn’t bother to sign? By the way, CHADOX1-S RECOMBINANT is better known as AstraZeneca.

Vaccinated how?

They’re currently working their way through really old people and other targeted groups like teachers and some industrial workers with the Pfizer and Moderna vaccines. There seem to be a fair number of AZ doses arriving, and those aren’t recommended for people under 55. So those of us in the 55-65 bracket can sign up at pharmacy branches; I did a couple of weeks back and got an SMS Friday morning.

I felt really wiped out and sore the day after, and just a bit sore today; nothing a bit of ibuprofen can’t handle. Apparently in Canada we’re on a multi-month delay between shots, so it’s not clear when I can go see my Mom, who got her first Pfizer dose on March 19th.

April 2021

It’s been a cold blustery spring but that doesn’t seem to bother the botanicals, especially the fruit trees, some of whom have already peaked.

Flowering fruit tree in Vancouver’s Riley Park neighborhood

Spot the clothesline.

Nobody I know closely has been struck down by Covid, but people I love are suffering from one ailment or another as I write because that’s how life is. I live in a country with seasons, which means we are all subject to morale-boosting sensory stimuli at this time of year. Grab hold of them! We can all use all the help we can get.

Tiny flowers, April in Vancouver

From here on in, the pictures are courtesy of the strong-willed 40-year-old Pentax 100/F2.8.

What comes next, as the vaccinated proportion of the population grows monotonically (but asymptotically), and then the wave of second doses washes up behind the first?

I guess I’m talking about people in the privileged parts of the world, since it looks like the spread of vaccinations will be measurably, irrefutably, deeply racist and joined at the hip with the world’s egregiously awful class structure.

I just want to go to a rock concert.

These rhodos are ready to burst.

Rhododendron blossoms about to open

I’m kind of jaded about daffodils but this one was so pretty against the backdrop that I couldn’t resist.

Daffodil with something pink behind

I hope everyone reading this has something to look forward to so that getting out of bed Monday morning is more than just a chore. Failing that, if you’re in the Northern Hemisphere anyhow, there are incoming flowers. Tell them I said hello.

Mixed flowers in April in Vancouver

The Sacred “Back” Button 10 Apr 2021, 12:00 pm

Younger readers will find it hard to conceive of a time in which every application screen didn’t have a way to “Go Back”. This universal affordance was there, a new thing, in the first Web browser that anyone saw, and pretty soon after that, more or less everything had it. It’s a crucial part of the user experience and, unfortunately, a lot of popular software is doing it imperfectly. Let’s demand perfection.

Why it matters

Nobody anywhere is smart enough to build an application that won’t, in some situations, confuse its users. The Back option removes fear and makes people more willing to explore features, because they know they can always back out. It was one of the reasons why the nascent browsers were so much better than the Visual Basic, X11, and character-based interface dinosaurs that then stomped the earth.

Thus I was delighted, at the advent of Android, that the early phones had physical “back” buttons.

Early Android “G1” phone

The Android “G1 Developer Phone”, from 2008. Reproduced from my Android Diary series, which I flatter myself was influential at the time.

I got so excited I wrote a whole blog piece about it.

Nowadays Android phones don’t have the button, but do offer a universal “Back” gesture and, as an Android developer, you don’t have to do anything special to get sane, user-friendly behavior. I notice that when I use iOS apps, they always provide a back arrow somewhere up in the top left corner; don’t know if that costs developers extra work.

Imperfections

The most important reason I’m listing these problems is to offer a general message: When you’re designing your UX, think hard about the Back affordance! I have seen intelligent people driven to tears when they get stuck somewhere and can’t back out.

People using your software generally have a well-developed expectation of what Back should do at any point in time, and any time you don’t meet that expectation you’ve committed a grievous sin, one you should remedy right now.

Problem: The Android Back Stack

Since we started with mobile, let’s talk Android. The “Activities” that make up Android apps naturally form a Back Stack as you follow links from one to the other. It turns out that Android makes it possible to compose and manipulate your stack. One example would be Twitter, which occasionally mails me about people I follow having tweeted something that it thinks might interest me, when I haven’t been on for a while. It’s successful enough that I haven’t stopped it.

When I click on the link, it leaps straight to the tweet in question. But when I hit Back, I don’t return to my email because Twitter has interposed multiple layers of itself on the stack. So it takes several hops to get back to where I followed the link from.

This is whiny, attention-starved behavior. I’m not saying that Back should always 100% revert to the state immediately before the forward step; I’ve seen stack manipulation be useful. But this isn’t, it’s just pathetic.

Problem: SPA foolery

When, in my browser, I click on something and end up looking at something else, then get tired of looking at it and go Back, I should return to where I started. Yes, I know there’s a convention, when an image pops up, that it’ll go away if you hit ESC. And there’s nothing wrong with that. But Back should work too. The Twitter Web interface does the right thing here when I open up a picture. ESC works, but so does Back.

Feedly doesn’t get this right; if you’re in a feed and click a post, it pops to the front and hitting ESC is the only way to make it go away; Back takes you to a previous feed! (Grrrr.) Also zap2it, where I go for TV listings, has the same behavior; Back takes you right out of the listing.

(Zap2it demonstrates another problem. I have my local listings in a mobile browser bookmark on one of my Android screens, which opens up Chrome with the listings. Except the tab is somehow magically different: if I flip away from Chrome then return to it, the tab is gone. Hmm.)

Problem: Chrome new-tab links

When I’m in Chrome and follow a link that opens a new tab, Back just doesn’t work. If I close the tab, for example with Command-W on Mac, it hops back to the source link. In what universe is this considered a good UX? To make things worse, there are many things that make Chrome forget the connection between the tab in question and its source link, for example taking a look at a third tab. Blecch.

Fortunately, Safari is saner. Well, mostly…

Problem: Safari new-tab links

I’m in Safari and I follow a link from my Gmail tab to wherever and it ends up in a new tab. Then, when I hit Back, there I am back in Gmail, as a side-effect closing the freshly-opened tab. Which is sane, rational, unsurprising behavior. (It’s exactly the same effect you get by closing the tab on Chrome, which is why Chrome should use Back to achieve that effect.)

Except, did I mention that the browser forgets? Safari’s back-linkage memory is encoded in letters of dust inscribed on last month’s cobwebs. More or less any interaction with the browser and it’s No More “Back” For You, Kid.

Especially irritating are pages that intercept scroll-down requests with a bunch of deranged JavaScript fuckery (that I darkly suspect is aimed at optimizing ad exposures) and that my browser interprets as substantive enough to mean “No More Back For You”. I swear I worry about farting too loudly because that might give Safari an excuse to forget, out of sheer prissiness.

Please let ’em back out

Screwing with the path backward through your product is inhumane, stupid, and unnecessary. Don’t be the one that gets in people’s way when they just want to step back.

Long Links 1 Apr 2021, 12:00 pm

Welcome to the monthly “Long Links” post for March 2021, in which I take advantage of my lightly-employed status to curate a list of pointers to good long-form stuff that I have time to savor but you probably don’t, but which you might enjoy one or two of. This month there’s lots of video, a heavier focus on music, and some talk about my former employer.

What with everything else happening in the world, people outside of Australia may not have noticed that it had a nasty sex scandal recently. My sympathy to the victims, and my admiration to Australian of the Year Grace Tame, whose full National Press Club address is searing and heart-wrenching. I don’t know much else about what Ms Tame has done, but I’d award her the honor for the speech alone, which a lot of people need to listen to. Probably including you; I know I did.

"The ocean takes care of that for us", by Fiona Beaty, thinks elegantly and eloquently about the relationship between oceanfront humans and the ocean. Indigenous nations saw the ocean as a self-sustaining larder, and it could be that again. Assuming we can learn to act like adults in our relationship to the planet we live on.

I’m a music lover and an audio geek, and like most such people, have long lamented the brutal dynamic-range compression applied to popular music with the goal of making sure that it’s never not as loud as the other songs it’s being sequenced with on the car radio. The New Standard That Killed the Loudness War points out that the music-streaming landscape, a financial wasteland for musicians mind you, is at least friendlier to accuracy in audio. I especially love it when a live band gets into a vamp and the singer says “take it down, now”, and they drift down then surge back. No reason pop recordings shouldn’t use that technique; it’s basic to classical music. Now they can.

What Key is Hey Joe In? (YouTube). By watching this I learned that Hendrix didn’t actually write Hey Joe. It’s not 100% clear who actually did write it, and it’s also unclear what key it’s in. Adam Neely has an unreasonable amount of fun exploring this, and there isn’t a simple answer. The most useful thing you can say is that it’s designed to sound good on a conventionally-tuned guitar. If you’re not literate in music theory you’ll miss some of the finer points, but you might still enjoy this.

One of the pointers I followed out of this video was to a massive blog piece by Ethan Hein entitled Blues tonality, which I’m going to say covers the subject exhaustively but also entertainingly, with lots of cool embedded videos to reinforce his musical points. Some of which aren’t music that any sane person would think of as Blues.

And still more music! My streaming service offered up a number by Emancipator and I found myself thinking “Damn, that’s beautiful, who is it?” Behind the name “Emancipator” is Douglas Appling, a producer and DJ who decided to be a musician too, and am I ever glad he did. The music is mostly pretty smooth and you might be forgiven for thinking “nice chill lightweight stuff” but I think there’s a lot of there there. The DJ influence is pretty plain, but to me, this sounds more like classical music than anything else; carefully composed and sequenced with a lot of attention to highlighting the timbres of the instruments. Mr Appling has put together a band to take the music on the road and I think it’d be a fun show. Here’s a YouTube: Emancipator Ensemble, live in 2018.

I hate to end the musical segment here on a downer, but remember how I mentioned that the streaming landscape is a place where musicians go to starve? Islands in the Stream dives at length into the troubled and massively dysfunctional relationship between music and the music business. This picture has been rendered still darker, of course, by Covid, which has taken musicians off the road, the last place where they can sometimes make a decent buck for offering decent music. At some point a truly enlightened government will introduce a minimum wage for musicians, which means that the price you pay to stream will probably have to go up; sorry not sorry.

Let’s move over to politics. Like most people, I read FiveThirtyEight for the poll-wrangling and stats, but sometimes they unleash a smart writer in an interesting direction. The smart writer in this case is Perry Bacon, Jr; in The Ideas That Are Reshaping The Democratic Party And America, he itemizes the current progressive consensus in clear and even-handed language. This being 538, there are of course numbers and graphs, and a profusion of links to source data, making this what I would call a scholarly work. Most important phrase, I think: “many of these views are evidence-based — rooted in a lot of data, history and research.” The piece is the first of a two-part series. Next up is Why Attacking ‘Cancel Culture’ And ‘Woke’ People Is Becoming The GOP’s New Political Strategy. In case you hadn’t noticed, the American right, so far this year, has largely abandoned discussing actual policy issues and has retreated into an extended howl of outrage about how “woke” people are trampling free speech via “cancel culture”. Since this is coming from a faction that enjoys being led by Donald Trump, it’s too much to expect integrity or intellectual rigor in their arguments. But from an analytical point of view, who cares? What matters is whether or not the stratagem will work. The evidence on that is, well, mixed.

These days, a lot of politics coverage seems to involve my former employer Amazon. How Amazon’s Anti-Union Consultants Are Trying to Crush the Labor Movement is not trying to convince you of anything, it is simply a tour through America’s anti-unionization establishment and the tools Amazon has been deploying nationwide and in Alabama. They’re spending really a lot of money. What on Earth Is Amazon Doing? is a well-written survey of the company’s late-March social-media offensive, kicking sand in legislators’ faces and pooh-poohing the peeing-in-bottles stories. Amazon is a well-run company but nobody would call this a well-run PR exercise. Is this a well-thought-out eight-dimensional chess move, or did leadership just briefly lose its shit?

The most important Amazon-related piece, I thought, was A Shopper’s Heaven by Charlie Jarvis in Real Life Magazine, which I’ve not previously encountered. It’s building on the same territory that I did in Just Too Efficient (by a wide margin the most radical thing I’ve ever published) — at some point, the relentless pursuit of convenience and efficiency becomes toxic, and we are way way past that point.

OK, enough about Amazon. But let’s beat up on the tech business some more, just for fun this time, with How to Become an Intellectual in Silicon Valley, an exquisitely pissy deconstruction of Bay Aryan thought leaders. Yes, it is indeed mean-spirited, but seriously, those people brought it on themselves.

I recommend John Scalzi’s Teaching “The Classics”, which wonders out loud why high-school students still have Hawthorne and Fitzgerald inflicted on them. There is one faction who feels that those Books By Dead White Guys are essential in crafting a well-rounded human, and others who argue that it’s time to walk away from those monuments to overwriting built on foundations most well-educated people now find morally repugnant. Scalzi finds fresh and entertaining things to say on the subject.

Let’s try to end on a high note. There was a news story about Wikipedia noticing that their content is mined and used by multiple for-profit concerns, using access methods (I’m not going to dignify them with the term “APIs”) that are not designed for purpose. Following on this, they had the idea of building decent APIs to make it convenient, reliable, and efficient to harvest Wikipedia data, and charging for their use, thus generating a revenue stream for long-term support of Wikipedia’s work. This is both promising and perilous — fuck with the Wikipedia editorial community’s loathing for most online business models at your peril. Anyhow, Wikimedia Enterprise/Essay is the best insider’s look at the idea that I’ve run across. [Disclosure: I’ve had a couple of conversations with these people because I’d really like to help.]

And finally, a tribute to one of my personal favorite online nonprofits: The internet is splitting apart. The Internet Archive wants to save it all forever. Just read it.

Topfew+Amdahl.next 31 Mar 2021, 12:00 pm

I’m in fast-follow mode here, with more Topfew reportage. Previous chapters (reverse chrono order) here, here, and here. Fortunately I’m not going to need 3500 words this time, but you probably need to have read the most recent chapter for this to make sense. Tl;dr: It’s a whole lot faster now, mostly due to work from Simon Fell. My feeling now is that the code is up against the limits and I’d be surprised if any implementation were noticeably faster. Not saying it won’t happen, just that I’d be surprised. With a retake on the Amdahl’s-law graphics that will please concurrency geeks.

What we did

I got the first PR from Simon remarkably soon after posting Topfew and Amdahl. All the commits are here, but to summarize: Simon knows the Go API landscape better than I do and also spotted lots of opportunities I’d missed to avoid allocating or extending memory. I spotted one place where we could eliminate a few million map[] updates.

Side-trip: map[] details

(This is a little bit complicated but will entertain Gophers.)

The code keeps the counts in a map[string]*uint64. Because the value is a pointer to the count, you really only need to update the map when you find a new key; otherwise you just look up the value and say something like

countP, exists := counts[key]
if exists {
  *countP++
} else {
  // new key: update the map
}

It’d be reasonable to wonder whether it wouldn’t be simpler to just say:

*countP++
counts[key] = countP // usually a no-op

Except for, Topfew keeps its data in []byte to avoid creating millions of short-lived strings. But unfortunately, Go doesn’t let you key a map with []byte, so when you reference the map you say things like counts[string(keybytes)]. That turns out to be efficient because of this code, which may at first glance appear an egregious hack but is actually a fine piece of pragmatic engineering: Recognizing when a map is being keyed by a stringified byte slice, the compiled code dodges creating the string.

But of course if you’re updating the map, it has to create a string so it can retain something immutable to do hash collision resolution on.

For all those reasons, the if exists code above runs faster than updating the map every time, even when almost all those updates logically amount to no-ops.
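
To make the side-trip concrete, here’s a minimal self-contained sketch of the pattern (not Topfew’s actual code, and the names are made up) showing where the string does and doesn’t get created:

package main

import "fmt"

// count bumps the occurrence count for key. The counts[string(key)]
// lookup is the case the compiler optimizes: no string is created
// unless the key turns out to be new and the map must be updated.
func count(counts map[string]*uint64, key []byte) {
  if countP, exists := counts[string(key)]; exists {
    *countP++
    return
  }
  one := uint64(1)
  counts[string(key)] = &one // this conversion really allocates
}

func main() {
  counts := make(map[string]*uint64)
  for _, key := range [][]byte{[]byte("/a"), []byte("/b"), []byte("/a")} {
    count(counts, key)
  }
  fmt.Println(*counts["/a"], *counts["/b"]) // prints: 2 1
}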

Back to concurrency

Bearing in mind that Topfew works by splitting up the input file into segments and farming the occurrence-counting work out to per-segment threads, here’s what we got:

  1. A big reduction in segment-worker CPU by creating fewer strings and pre-allocating slices.

  2. Consequent on this, a big reduction in garbage creation, which matters because garbage collection is necessarily single-threaded to some degree and thus on the “critical path” in Amdahl’s-law terms.

  3. Modest speedups in the (single-threaded, thus important) occurrence-counting code.

But Simon was just getting warmed up. Topfew used to interleave filtering and field manipulation in the segment workers with a batched mutexed call into the single-threaded counter. He changed that so the counting and ranking is done in the segment workers and when they’re all finished they send their output through a channel to an admirably simple-minded merge routine.
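
In outline, the new shape looks something like this hedged sketch of the idea (not Simon’s actual code):

package main

import "fmt"

// Each segment worker builds its own counts, then ships the whole
// table through a channel to a single simple-minded merger; no
// mutex traffic on the hot path. countSegment stands in for the
// real per-segment scanning code.
func countAll(segments int, countSegment func(int) map[string]uint64) map[string]uint64 {
  results := make(chan map[string]uint64)
  for i := 0; i < segments; i++ {
    go func(seg int) { results <- countSegment(seg) }(i)
  }
  merged := make(map[string]uint64)
  for i := 0; i < segments; i++ {
    for key, n := range <-results {
      merged[key] += n
    }
  }
  return merged
}

func main() {
  fake := func(seg int) map[string]uint64 {
    return map[string]uint64{"/a": uint64(seg + 1)}
  }
  fmt.Println(countAll(3, fake)) // map[/a:6]
}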

Anyhow, here’s a re-take of the Go-vs-Rust typical-times readout from last time:

Go (03/27): 11.01s user 2.18s system  668% cpu 1.973 total
      Rust: 10.85s user 1.42s system 1143% cpu 1.073 total
Go (03/31):  7.39s user 1.54s system 1245% cpu 0.717 total

(Simon’s now a committer on the project.)

Amdahl’s Law in pictures

This is my favorite part.

First, the graph that shows how much parallelism we can get by dividing the file into more and more segments. Which turns out to be pretty good in the case of this particular task, until the parallel work starts jamming up behind whatever proportion is single-threaded.

You are rarely going to see a concurrency graph that’s this linear.

CPU usage as a function of the number of cores requested

Then, the graph that shows how much execution speeds up as we deal the work out to more and more segments.

Elapsed time as a function of the number of cores requested

Which turns out to be a lot up front, then a bit more, then none at all, per Amdahl’s Law.

Go thoughts

I like Go for a bunch of reasons, but the most important is that it’s a small simple language that’s easy to write and easy to read. So it bothers me a bit that to squeeze out these pedal-to-the-metal results, Simon and I had to fight against the language a bit and use occasionally non-intuitive techniques.

On the one hand, it’d be great if the language squeezed the best performance out of slightly more naive implementations. (Which by the way Java is pretty good at, after all these years.) On the other hand, it’s not that often that you really need to get the pedal this close to the metal. But command-line utilities applied to Big Data… well, they need to be fast.

Topfew and Amdahl 27 Mar 2021, 12:00 pm

On and off this past year, I’ve been fooling around with a program called Topfew (GitHub link), blogging about it in Topfew fun and More Topfew Fun. I’ve just finished adding a few nifty features and making it much faster; I’m here today first to say what’s new, and then to think out loud about concurrent data processing, Go vs Rust, and Amdahl’s Law, of which I have a really nice graphical representation. Apologies because this is kind of long, but I suspect that most people who are interested in either are interested in both.

Reminder

What Topfew does is replace the sort | uniq -c | sort -rn | head pipeline that you use to do things like find the most popular API call or API caller by ploughing through a logfile.

When last we spoke…

I had a Topfew implementation in the Go language running fine, then Dirkjan Ochtman implemented it in Rust, and his code ran several times faster than mine, which annoyed me. So I did a bunch more optimizations and claimed to have caught up, but I was wrong, for which I apologize to Dirkjan — I hadn’t pulled the most recent version of his code.

One of the big reasons Dirkjan’s version was faster was that he read the input file in parallel in segments, which is a good way to get data processed faster on modern multi-core processors. Assuming, of course, that your I/O path has good concurrency support, which it might or might not.

[Correction: Thomas Jung writes to tell me he implemented the parallel processing in rust_rs. He wrote an interesting blog piece about it, also comparing Xeon and ARM hardware.]

So I finally got around to implementing that and sure enough, runtimes are way down. But on reasonable benchmarks, the Rust version is still faster. How much? Well, that depends. In any case both are pleasingly fast. I’ll get into the benchmarking details later, and the interesting question of why the Rust runs faster, and whether the difference is practically meaningful. But first…

Is parallel I/O any use?

Processing the file in parallel gets the job done really a lot faster. But I wasn’t convinced that it was even useful. Because on the many occasions when I’ve slogged away trying to extract useful truth from big honkin’ log files, I almost always have to start with a pipeline full of grep and sed calls to zero in on the records I care about before I can start computing the high occurrence counts.

So, supposing I want to look in my Apache logfile to find out which files are being fetched most often by the popular robots, I’d use something like this:

egrep 'googlebot|bingbot|Twitterbot' access_log | \
    awk ' {print $7}' | sort | uniq -c | sort -rn | head

Or, now that I have Topfew:

egrep 'googlebot|bingbot|Twitterbot' access_log | tf -f 7

Which is faster than the sort chain but there’s no chance to parallelize processing standard input. Then the lightbulb went on…

If -f 1 stands in for awk ' { print $1}' and distributes that work out for parallel processing, why shouldn’t I have -g for grep and -v for grep -v and -s for sed?

Topfew by example

To find the IP address that most commonly hits your web site, given an Apache logfile named access_log:

tf -f 1 access_log

Do the same, but exclude high-traffic bots. The -v option has the effect of grep -v.

tf -f 1 -v googlebot -v bingbot (omitting access_log)

The opposite; what files are the bots fetching? As you have probably guessed, -g is like grep.

tf -f 7 -g 'googlebot|bingbot|Twitterbot'

Most popular IP addresses from May 2020.

tf -f 1 -g '\[../May/2020'

Let’s rank the hours of the day by how much request traffic they get.

tf -f 4 -s "\\[[^:]*:" "" -s ':.*$' '' -n 24

So Topfew distributes all that filtering and stream-editing out and runs it in parallel, since it’s all independent, and then pumps it over (mutexed, heavily buffered) to the (necessarily) single thread that does the top-few counting. All of the above run dramatically faster than their shell-pipeline equivalents. And they weren’t exactly rocket science to build; Go has a perfectly decent regexp library that even has a regexp.ReplaceAll call that does the sed stuff for you.
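
Here, for illustration, is roughly what that per-record work amounts to; a sketch with a made-up log line, not the actual Topfew internals:

package main

import (
  "fmt"
  "regexp"
)

func main() {
  // -g compiles to a regexp the record must match (-v is the same
  // with the sense inverted); -s is a match/replacement pair
  // applied via regexp.ReplaceAll.
  grep := regexp.MustCompile(`googlebot|bingbot|Twitterbot`)
  sed := regexp.MustCompile(`\[[^:]*:`)

  record := []byte(`1.2.3.4 - - [27/Mar/2021:04:05:01 -0700] "GET /x HTTP/1.1" 200 0 "-" "googlebot"`)
  if grep.Match(record) {
    fmt.Printf("%s\n", sed.ReplaceAll(record, []byte("")))
  }
}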

I found that getting the regular expressions right was tricky, so Topfew also has a --sample option that prints out what amounts to a debug stream showing which records it’s accepting and rejecting, and how the keys are being stream-edited.

Almost ready for prime time

This is now a useful tool, for me anyhow. It’s replaced the shell pipeline that I use to see what’s popular in the blog this week. The version on github right now is pretty well-tested and seems to work fine; if you spend much time doing what at AWS we used to call log-diving, you might want to grab it off GitHub.

In the near future I’m going to use GoReleaser so it’ll be easier to pick up from whatever your usual tool depot is. And until then, I reserve the right to change option names and so on.

On the other hand, Dirkjan may be motivated to expand his Rust version, which would probably be faster. But, as I’m about to argue, the speedup may not be meaningful in production.

Open questions

There are plenty.

  1. Why is Rust faster than Go?

  2. How do you measure performance, anyhow…

  3. … and how do you profile Go?

  4. Shouldn’t you use mmap?

  5. What does Gene Amdahl think about concurrency, and does Topfew agree?

  6. Didn’t you do all this work a dozen years ago?

Which I’ll take out of order.

How to measure performance?

I’m using a 3.2GB file containing 13.3 million lines of Apache logfile, half from 2007 and half from 2020. The 2020 content is interesting because it includes the logs from around my Amazon rage-quit post, which was fetched more than everything else put together for several weeks in a row; so the data is usefully non-uniform.

The thing that makes benchmarking difficult is that this kind of thing is obviously I/O-limited. And after you’ve run the benchmark a few times, the data’s migrated into memory via filesystem caching. My Mac has 32G of RAM so this happens pretty quick.

So what I did was just embrace this by doing a few setup runs before I started measuring anything, until the runtimes stabilized and presumably little to no disk I/O is involved. This means that my results will not replicate your experience when you point Topfew at your own huge logfile which it actually has to read off disk. But the technique does allow me to focus in on, and optimize, the actual compute.

How do you profile Go?

Go comes with a built-in profiler called “pprof”. You may have noticed that the previous sentence does not contain a link, because the current state of pprof documentation is miserable. The overwhelming googlejuice favorite is Profiling Go Programs from the Golang blog in 2011. It tells you lots of useful things, but the first thing you notice is that the pprof output you see in 2021 looks nothing like what that blog describes.

You have to instrument your code to write profile data, which is easy and seems to cause shockingly little runtime slowdown. Then you can get it to provide a call graph either as a PDF file or in-browser via its own built-in HTTP server. I actually prefer the PDF because the Web presentation has hair-trigger pan/zoom response to the point that I have trouble navigating to the part of the graph I want to look at.
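
The instrumentation itself is only a few lines; a minimal sketch using the standard runtime/pprof package looks like this:

package main

import (
  "os"
  "runtime/pprof"
)

func main() {
  // Write CPU-profile data for later analysis with "go tool pprof".
  f, err := os.Create("cpu.prof")
  if err != nil {
    panic(err)
  }
  defer f.Close()

  if err := pprof.StartCPUProfile(f); err != nil {
    panic(err)
  }
  defer pprof.StopCPUProfile()

  // ... the actual work goes here ...
}

(The PDF and Web renderings mentioned above come from pprof’s -pdf and -http flags respectively; the PDF path needs Graphviz installed.)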

While I’m having trouble figuring out what some of the numbers mean, I think the output is saying something that’s useful; you can be the judge a little further in.

Why is Rust faster?

Let’s start by looking at the simplest possible case, scanning the whole log to figure out which URL was retrieved the most. The required argument is the same on both sides: -f 7. Here is output from typical runs of the current Topfew and Dirkjan’s Rust code.

  Go: 11.01s user 2.18s system  668% cpu 1.973 total
Rust: 10.85s user 1.42s system 1143% cpu 1.073 total

The two things that stick out are that Rust is getting better concurrency and using less system time. This Mac has eight two-thread cores, so neither implementation is maxing it out. Let’s use pprof to see what’s happening inside the Go code. BTW if someone wants to look at my pprof output and explain how I’m woefully misusing it, ping me and I’ll send it over.

The profiling run’s numbers: 11.97s user 2.76s system 634% cpu 2.324 total; like I said, profiling Go seems to be pretty cheap. Anyhow, that’s 14.73 seconds of compute between user and system. The PDF of the code graph is too huge to put inline, but here it is if you want a look. I’ll excerpt screenshots. First, here’s one from near the top:

Top of the Go profile output

So, over half the time is in ReadBytes (Go’s equivalent of ReadLine); if you follow that call-chain down, at the bottom is syscall, which consumes 55.36%. I’m not sure if these numbers are elapsed time or compute time and I’m having trouble finding help in the docs.

Moving down to the middle of the call graph:

Near the middle of the Go profile output

It’s complicated, but I think the message is that Go is putting quite a bit of work into memory management and garbage collection. Which isn’t surprising, since this task is a garbage fountain, reading millions of records and keeping hardly any of that data around.

The amount of actual garbage-collection time isn’t that big, but I also wonder how single-threaded it is, because as we’ll see below, that matters a lot.

Finally, down near the bottom of the graph:

Near the bottom of the Go profile output

The meaning of this is not obvious to me, but the file-reading threads use the Lock() and Unlock() calls from Go’s sync.Mutex to mediate access to the occurrence-counting thread. So what are those 2.02s and 1.32s numbers down at the bottom of a “cpu” graph? Is the implementation spending three and a half seconds implementing mutex?

You may notice that I haven’t mentioned application code. That code, for pulling out the seventh field and tracking the top-occurring keys, seems to contribute less than 2% of the total time reported.

My guesses

Clearly, I need to do more work on making better use of pprof. But based on my initial research, I am left with the suspicions that Rust buffers I/O better (less system time), enjoys the benefits of forcing memory management onto the user, and (maybe) has a more efficient wait/signal primitive. No smoking pistols here.

I’m reminded of an internal argument at AWS involving a bunch of Principal Engineers about which language to use for something, and a Really Smart Person who I respect a lot said “Eh, if you can afford GC latency use Go, and if you can’t, use Rust.”

Shouldn’t you use mmap?

Don’t think so. I tried it on a few different systems and mmap was not noticeably faster than just reading the file. Given the dictum that “disk is the new tape”, I bet modern filesystems are really super-optimized at sweeping sequentially through files, which is what Topfew by definition has to do.

What does Gene Amdahl think about concurrency, and does Topfew agree?

Amdahl’s Law says that for every computing task, some parts can be parallelized and some can’t. So the amount of speedup you can get by cranking the concurrency is limited by that. Suppose that 50% of your job has to be single-threaded: Then even with infinite concurrency, you can never even double the overall speed.
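
In symbols: if a fraction p of the job can be parallelized across n workers, the overall speedup is 1 / ((1 − p) + p/n). A few lines of Go make the 50% example concrete:

package main

import "fmt"

// speedup implements Amdahl’s Law: p is the parallelizable
// fraction of the job, n the number of workers.
func speedup(p, n float64) float64 {
  return 1 / ((1 - p) + p/n)
}

func main() {
  for _, n := range []float64{2, 4, 16, 1024} {
    fmt.Printf("p=0.5, n=%4.0f: %.3fx\n", n, speedup(0.5, n))
  }
  // Prints 1.333x, 1.600x, 1.882x, 1.998x; with half the job
  // single-threaded, no army of workers ever reaches 2x.
}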

For Topfew, my measurements suggest that the single-threaded part — finding top occurrence counts — is fairly small compared to the task of reading and filtering the data. Here’s a graph of a simple Topfew run with a bunch of different concurrency fan-outs.

Graph of Topfew performance vs core count

Which says: Concurrency helps, but only up to a point. The graph stops at eight because that’s where the runtime stopped decreasing.

Let’s really dig into Amdahl’s Law. We need to increase the compute load. We’ll run that query that focuses on the popular bots. First of all I did it the old-fashioned way:

egrep 'googlebot|bingbot|Twitterbot' test/data/big | bin/tf -f 7
    90.82s user 1.16s system 99% cpu 1:32.48 total

Interestingly, that regexp turns out to be pretty hard work. There was 1:32.48 elapsed, and the egrep user CPU time was 1:31. So the Topfew time vanished in the static. Note that we only used 99% of one CPU. Now let’s parallelize, step by step.

Reported CPU usage vs. number of cores

Look at that! As I increase the number of file segments to be scanned in parallel, the reported CPU usage goes up linearly until you get to about eight (reported: 774% CPU), then starts to fall off gently until it maxes out at about 13½ effective CPUs. Two questions: Why does it start to fall off, and what does the total elapsed time for the job look like?

Elapsed time as a function of number of cores

Paging Dr. Amdahl, please! This is crystal-clear. You can tie up most of the CPUs the box you’re running on has, but eventually your runtime is hard-limited by the part of the problem that’s single-threaded. The reason this example works so well is that the grep-for-bots throws away about 98.5% of the lines in the file, so the top-occurrences counter is doing almost no meaningful work, compared to the heavy lifting by the regexp appliers.

That also explains why the effective-CPU-usage never gets up much past 13; the threads can regexp through the file segments in parallel, but eventually there’ll be more and more waiting for the single-threaded part of the system to catch up.

And exactly what is the single-threaded part of the system? Well, my own payload code that counts occurrences. But Go brings along a pretty considerable runtime that helps most notably with garbage collection but also with I/O and other stuff. Inevitably, some proportion of it is going to have to be single-threaded. I wonder how much of the single-threaded part is application code and how much is Go runtime?

I fantasize a runtime dashboard that has pie charts for each of the 16 CPUs showing how much of their time is going into regexp bashing, how much into occurrence counting, how much into Go runtime, and how much into operating-system support. One can dream.

Update: More evidence

Since writing this, I’ve added a significant optimization. In the (very common) case where there’s a single field being used for top-few counting, I don’t copy any bytes, I just use a sub-slice of the “record” slice. Also, Simon Fell figured out a way to do one less string creation for regexp filtering. Both of these are in the parallelizable part of the program, and neither made a damn bit of difference on elapsed times. At this point, the single-threaded code, be it in Topfew or in the Go runtime, seems to be the critical path.

How many cores should you use?

It turns out that in Go there’s this API called runtime.NumCPU() that returns how many processors Go thinks it’s running on; it returns 16 on my Mac. So by default, Topfew divides the file into that many segments. Which, if you look at the bottom graph above, is suboptimal. It doesn’t worsen the elapsed time, but it does burn a bunch of extra CPU to no useful effect. Topfew has a -w (or --width) option to let you specify how many file segments to process concurrently; maybe you can do better?
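
For the curious, the default fan-out amounts to something like this sketch (hypothetical code; a real implementation also has to nudge segment boundaries to the nearest line break):

package main

import (
  "fmt"
  "runtime"
)

// segments divides fileSize bytes into width byte-ranges to scan
// concurrently, defaulting width to however many processors Go
// thinks it has.
func segments(fileSize int64, width int) [][2]int64 {
  if width <= 0 {
    width = runtime.NumCPU()
  }
  chunk := fileSize / int64(width)
  ranges := make([][2]int64, width)
  for i := 0; i < width; i++ {
    start := int64(i) * chunk
    end := start + chunk
    if i == width-1 {
      end = fileSize // last segment absorbs the remainder
    }
    ranges[i] = [2]int64{start, end}
  }
  return ranges
}

func main() {
  fmt.Println(segments(1<<20, 0)) // one range per CPU on this machine
}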

I think the best answer is going to depend, not just on how many CPUs you have, but on what kind of CPUs they are, and (maybe more important) what kind of storage you’re reading, how many paths it has into memory, how well its controller interleaves requests, and so on. Not to mention RAM caching strategy and other things I’m not smart enough to know about.

Didn’t you do all this work a dozen years ago?

Well, kind of. Back in the day when I was working for Sun and we were trying to sell the T-series SPARC computers, which weren’t that fast but had good memory controllers and loads of CPU threads, I did a whole bunch of research and blogging on concurrent data processing; see Concur.next and The Wide Finder Project. Just now I glanced back at those (wow, that was a lot of work!) and to some extent this article revisits that territory. Which is OK by me.

Next?

Well, there are a few obvious features you could add to Topfew, for example custom field separators. But at this point I’m more interested in concurrency and Amdahl’s law and so on. I’ve almost inevitably missed a few important things in this fragment and I’m fairly confident the community will correct me.

Looking forward to that.

xmlwf -k 24 Mar 2021, 12:00 pm

What happened was, I needed a small improvement to Expat, probably the most widely-used XML parsing engine on the planet, so I coded it up and sent off a PR and it’s now in release 2.3.0. There’s nothing terribly interesting about the problem or the solution, but it certainly made me think about coding and tooling and so on. (Warning: Of zero interest to anyone who isn’t a professional programmer.)

Back story

As I mentioned last month, I took a little programming job, partly as a favor to a friend: writing a parser to transmute a huge number of antique IBM GML files into XML. It wasn’t terribly hard but there was quite a bit of input variation so I couldn’t be confident unless I checked that every single output file was proper XML (“well-formed”, we XML geeks say).

Fortunately there’s an Expat-based command-line tool called xmlwf that can scan XML files for errors and produce useful human-readable complaints, and it operates at obscene speed. So what I wanted to do was run my parser over a few hundred GML files and then say, essentially, xmlwf * in the output directory.

Which didn’t work because, until very recently, xmlwf would just stop when it encountered the first non-well-formed file. So I added a -k option (“k” for “keep going”) so it could run over a thousand or so files and helpfully complain about the two that were broken.

Lessons from the PR

Most important, I hadn’t realized how great the programming environment is inside Amazon. It’s all git, but there’s no need for branches or PR’s. You make your changes, you commit, you use the tooling to launch a code review, you argue, you make more changes, you (probably) commit --amend (unless you think multiple commits are more instructive for some reason), and this repeats until everyone’s happy and you push into the CI/CD vortex.

Obviously other people might be working on the same stuff so you might have to do a git pull --rebase and there might be pain sorting out the results but that’s what they pay us for. (Right?)

Anyhow, you end up with a nice clean commit sequence in your codebase history and nobody ever has to think about branches or PR’s. (Obviously some larger tasks require branches but you’d be amazed how much you can live without them.)

Finding: Pull requests

Now that I’m out in the real world, it’s How Things Are Done. For good reasons. Doesn’t mean I have to like them. As evidence, I offer How to Rebase a Pull Request. Ewwww.

Finding: Coding tools

The last time I edited actual C code, nobody’d ever heard of JetBrains and “VS Code” would have sounded like a mainframe thing. I found the back corner of my brain where those memories lived, shook it vigorously, and Emacs fell out. The thing I’m now using to type the text you’re now reading. Oh, yeah; that was then.

C code in Emacs in 2021

It’s 2021. No, really.

It worked fine. I mean, no autocomplete, but there was syntax coloring and indentation and whole cubic centimeters (probably) of brain cells woke up and remembered C. Dear reader, back in the day I wrote hundreds and hundreds of thousands of lines of the stuff, and I guess it doesn’t go away. In fact, the number of syntax errors was pretty well zero because the fingers just did the right thing.

Finding: The Mac as open-source platform

It’s not that great. Expat maintainer Sebastian Pipping quite properly drop-kicked my PR because it had coding-standards violations and a memory leak, revealed by the Travis CI setup. I lazily tried to avoid learning Travis and, with Sebastian’s help, figured out the shell incantations to run the CI. Only on the Mac they only sort of worked, and in particular Clang failed to spot the memory leak.

The best way to deal with this is probably to learn enough Docker (Docker Compose, probably) to make a fake Linux environment. I was well along the path to doing that when I realized I had a real Linux environment, namely tbray.org, the server sending you the HTML you are now reading.

(Except for it’s a Debian box that couldn’t do the clang-format coding-standards test but that’s OK, my Mac could manage after I used homebrew to install coreutils and moreutils and gnu-sed and various other handsome ecosystem fragments.)

I mean, I got it to go. But if I do it again, I’ll definitely wrestle Docker to the ground first. Which is irritating; this stuff should Just Work on a Mac. Without having a Homebrew dance party.

C

Well, yeah. We shouldn’t diss it too much, basically every useful online service you interact with is running on it. But after my -k option was added, clang found a memory leak in xmlwf. Which I tracked down and yeah, it was real, but it had also been there before my changes. And it wouldn’t be a problem in normal circumstances, until it suddenly was, and then you’d be unhappy. Which is why, in the fullness of time, most C should be replaced by Go (if you can tolerate garbage-collection latency) and Rust (if you can’t). Won’t happen in my lifetime.

Anyhow

Thanks to Sebastian, who was polite in the face of my repeated out-of-practice cluelessness. And hey, if you need to syntax-check huge numbers of XML files, your life just got a little easier.

Three Million Meters on e-Wheels 16 Mar 2021, 12:00 pm

This is just another round of cheerleading for e-bikes, provoked by my odometer clicking over to three thousand km. Granted, not amazing for twenty months of commuting, but not nothing. For anyone in an even marginally urban situation in reasonable health, if you don’t have one of these, you’re really missing a trick. For earlier raving about this vehicle, see here.

E-Bike by False Creek

March flowers by False Creek.

E-Bike odometer reads 2999

More interesting than 3000, and prime.

Capital cost: Noticeable but much less than anything with a motor.
Fuel cost: Damn close to zero.
Parking cost: Free.
Health cost: Negative.
Carbon loading: Trivial.
Mind-clearing ability: High.
Cargo capacity: Remarkable.
Getting you the hell out of the house during Covid: Beyond price.

There must be some gripes?

Oh yeah, I had a flat. So I bought a new tube and slipped it in and absolutely could not get the big thick fucking tire back on the fucking rim. I had to take it to a bike shop, where I discovered that my wrestling with it had ruined the new inner tube — ten bucks shot to hell — and this skinny little bike-shop woman slipped it on in no time.

Oh, and I ran out of power once and just about gave myself a coronary pumping this klunker up a not-too-steep hill.

These things are rough on chains; I’ve replaced it once and it’s getting ratty again. This is unsurprising, since the bike is so heavy and a low gear plus the e-boost pulls awfully hard, especially if you insist on going fast, which I do.

There’s no place to stash a latte if you pick one up on the way to work.

You’re stretching

Obviously. It’s fast, it’s smooth, it’s fun, it’s green, and it’s on balance cheap. I can’t imagine who wouldn’t want one.

Long Links 1 Mar 2021, 12:00 pm

Welcome to the monthly tour of long-form excellence that I, due to being semi-retired, have the time to read. You probably don’t have that kind of time but one or two of these might brighten your day anyhow.

Katie Mack, an Astrophysics professor, is one of our best science writers; her book The End of Everything is definitely on my to-read list. The American Institute of Physics has a long interview with her which I found interesting as a sort of mini-autobiography of a life in science. Touches on issues of cosmology, communication, and diversity.

Let's Not Dumb Down the History of Computer Science is the transcript of a talk by Donald Knuth, probably the most famous living Computer Scientist. He is a wise man. For the vast majority of people who don’t care in the slightest about the History of Computer Science, there’s still interest here in the consideration of how we ought to communicate about technology, and is it ever OK to do so without diving into the meat of the matter, the details of the problems that practitioners study and ideally solve?

JWZ offers They Live and the secret history of the Mozilla logo. Who is JWZ, you ask? One of the people most responsible for turning the World Wide Web from a tool for science publishing into a giant global engine for culture and business. If you read this you will learn about some colorful and too-little-known corners of geek culture. In particular, anyone who was involved with technology back then will probably find this fascinating.

The geography of cities is three-dimensional, extending far above and below their surface. It is geological and architectural and legal and financial. Covenants, Easements & Wayleaves: The Hidden Urban Interfaces Which Shape London Part 1 studies the subject, diving (literally) deep, building its story around the London Underground.

Now, this is of special interest to anyone who reads books. I became aware of Patricia Lockwood a couple of years ago, mostly due to her engaging, hyper-intense, frequently-off-color Twitter account; previously, her primary vocation was poetry. Separately, I have signed up for news from the London Review of Books, which is really excellent. Last November the LRB suggested I look at something called Malfunctioning Sex Robot by Ms Lockwood, which turned out to be an essay on the collected novels of John Updike. I read a couple of those novels in my younger years — mostly about horny suburban New Englanders if memory serves — but don’t actually care enough, in normal circumstances, to look at Updike lit-crit. But oh my goodness, Lockwood’s piece riveted my attention end-to-end, full of sentences that I thought should be displayed on museum walls and Times Square billboards. A remarkable piece of writing.

Now we have, also from Lockwood in the LRB, I hate Nadia beyond reason, which is about The Lying Life of Adults, a collection from Elena Ferrante, best known for the Neapolitan Quartet, an astonishing extended novel about women and men (mostly women) navigating the second half of the 20th Century, spiraling out from the wrong part of Naples. These are almost unbearably intense and I found myself so deep in their emotional grip that I stopped reading about three-quarters of the way through out of sheer rage at one of the protagonists, who was about to do something I thought was stupid and damaging. Anyhow, turning Lockwood loose on this predictably results in fireworks and, I’d think, significantly increases the likelihood that you’re going to end up reading either more Lockwood or more Ferrante. A sample:

So yeah, to call the Neapolitan Quartet ‘a rich portrait of a friendship’ seems insane, or like something a pod person would say. Lila is a demon of inducement, the cattle prod that drives the mild herd forward, Lenù the definition of homeostasis. The epigraph is from Faust, which I guess according to this formula is a story about two dudes hanging out: only one of them is completely red, because he is the Devil. Like that legend, it begins in a location so specific it can only be referred to as a crossroads, and then moves into the macrocosmos. It is the picture of a person standing on a single point, and inside the long deep dive of a soul into the universe. Of course, it is also a rich portrait of a friendship.

I’m going to have to go back and finish the Quartet now.

So in 1974, there was this movie called Phantom of the Paradise. It follows the Phantom of the Opera canon pretty closely: Disfigured composer, hopeless love, villainous impresario, bloody revenge. Except for The Paradise is a rock nightclub and the whole thing is drenched in rock-n-roll culture. It’s ridiculous. I loved it. Writing in Pitchfork, Phantom of the Paradise Perfectly Captures the Sinister Side of the Music Industry by Nathan Smith puzzlingly adopts the strategy of taking the movie seriously. If you’re old enough to have seen it, you’ll probably like this. For the vast majority of you who aren’t, you might want to watch it, because it’s fun.

Here’s a link to a thing that’s long: The Titles of the PhD Dissertations Defended at the Dzerzhinsky Higher School of the KGB in 1980. Pretty sure I can’t add much to that title.

I’ve loved the music of British bass wizard Jah Wobble for many years but you never seem to read much about him. Bandcamp has a feature, though: Mapping Jah Wobble’s Interdimensional Dub. Worth reading, if only because of the many links to excellent music. Trivia: His real name is John Wardle; “Jah Wobble” comes from an attempt by Sid Vicious, in a typical drunken stupor, to pronounce it. Item: Subcode, a song on Radioaxiom, a dub outing by Wobble and fellow cosmic-bassist Bill Laswell, has the phattest bassline ever recorded by anybody. Finally, here he is live, with Sinéad O'Connor. You need a subwoofer.

It’s been obvious to any thinking person as long as I’ve been an adult that sexual minorities, starting with gay people, are just being who they are. Thus the slogan, from the earliest days of the LGBTQ struggle “Born this way”. Which would suggest a genetic basis. Except for, as The new genomics of sexuality moves us beyond ‘born this way’ discusses, there doesn’t seem to be a “gay gene” or in fact any straightforwardly discoverable genetic basis. Which shouldn’t change anything at the societal level, and is further evidence that what we are is more than what our DNA says; a finding that’s been pretty obvious since the sequencing of the human genome. And yeah, gender is a lot more fluid than even us progressives thought last millennium. This has wider implications for the consideration of experiences that run in regions and families like education and poverty; the piece introduces the term “postgenomics” which I suspect will get lots of traction.

In Amazon’s Great Labor Awakening, the NYT goes deep on the current landscape. Maybe we’re looking at an inflection point? It’s not obvious, but it’s worth close attention.

Hockey Has a Gigantic-Goalie Problem is by Ken Dryden and is a good, fun, read, but you probably need to have played or watched some hockey. I got a bit annoyed by Ken not mentioning the fact that when he himself was probably the world’s best goalie, the fact that he’s a really tall dude was part of the reason. Still, good stuff.

If you care about history, you should want to read books by people who were there to watch it. Which gets very difficult as the history you care about grows more ancient. Still, History Books » Primary Sources helpfully curates a bunch of contemporary narratives of key episodes of history. Shockingly, they left out Xenophon’s Anabasis, which I liked so much I blogged about it seventeen years ago.

What’s the opposite of history? Sci-fi, of course, and if you’re looking for some of that, the Guardian offers The best recent science fiction and fantasy – review roundup. I’ve read none of these! Must fix that.

Sorry to end on a down note, but there is Very Bad Stuff happening in India, mostly hiding behind the many other unfolding global disasters such as the climate emergency, Covid, China’s mass inhumanity, and the rise of the alt-right. No wait, this is a rise-of-the-alt-right story. The political faction currently ruling India is behaving frighteningly like the Nazis in the 1930s; at times it feels like they’re working paragraph by paragraph from the same playbook. Nobody can say we weren’t warned; the new news here is that Big Tech, seduced by the immense potential of the billion-strong Indian market, seems to be playing along with the ethnofascists: India Targets Climate Activists With the Help of Big Tech. And it’s not just climate activists either.

Until next month!

Meet þ 22 Feb 2021, 12:00 pm

Months into the cold wet Pacific-Northwest Dark Season and our cat, a charming 4-year-old calico, has been bored and fretful. The obvious solution: Get her a kitten! Easier said than done, let me tell ya. But it’s done. The little feline fluffball’s name is Thorn, spelt “þ”. More on that below.

This announcement has been delayed because obviously one must have pictures and little þ is a challenge to photograph. But, finally…

þ the kitten

We’ve wanted a kitten for a long time. When we acquired our current cat, a one-year-old rescue who’d already had kittens, they discouraged us from adopting two at a time: “She’s fierce, and mean to other cats” they said. “If you must get another, wait a couple of years then get a male kitten so she can dominate it.”

But in these plague times, kittens are hard to come by. Lauren haunted the SPCA sites of everywhere within four hours’ drive and came up empty, empty, empty. We were seriously considering dropping thousands for a purebred — we had a Bengal once and she worked out great, but we’d really rather rescue.

Anyhow, a week ago Sunday Lauren ran across this little guy on Kijiji, which is a Canadian Craigslist kind of thing. His owner was quite concerned about the quality of home and invited applicants to write about themselves. We sent a picture of the current cat saying “This will be his big sister” and that seemed to do the trick.

Anna was her name; she had a big apartment overlooking False Creek and a badly broken ankle in a huge cast, with more surgery scheduled. And the kitten needed his next vet visit for shots and so on. So I could see why she had to let him go. Thanks, Anna!

þ the kitten

That name

By tradition, our cats have had typographical names: Bodoni, Marlowe, Rune, and the current calico is Tilde, spelt “~”. We’ve enjoyed the one-character-ness, and were searching for another (“umlaut” was considered) and then came across þ, the letter Thorn, which was common in lots of old northern European languages notably including Old and Middle English, and survives in Icelandic. It’s “th” basically; usually (but not always) the voiceless flavor as in “thirst”, not voiced as in “other”. Arabic, by the way, has two completely separate letters for these two sounds.

Since this is a proper noun it’d be more orthographically correct to use capital Thorn, “Þ”, but we improperly prefer the lower-case þ.

How it’s going

þ’s eleven weeks old as we write, which means tiny, skinny, and silly; all ears and fluff and bounce. He’s so absurdly light that ~, who’s actually a pretty small cat, seems huge, ponderous to pick up.

The pictures here are deceiving because they omit to mention his pencil-thin legs, bulgy belly, and rat-like tail. Which is OK because he’ll grow out of those; his paws are already big so I’m encouraging ~ to establish dominance now before he’s twice her size.

Fortunately, they get along fine. The mission — addressing ~’s Seasonal Affective Disorder — has been accomplished. It took a couple of days and a careful, gradual introduction, but now they play lots every day. He’s got no fear and ambushes ~ with a vigorous spring-and-pounce (he has to pounce up to reach her) and doesn’t seem to mind when she slaps him around for it. Occasionally she’ll pounce back; she’s so much heavier that you can hear the air whooshing out of the kitten when she lands. It doesn’t discourage him.

No kitten was ever smart, but þ’s head seems a little less empty than average. He’ll regularly surprise ~ by sneaking around behind something where she’s not looking. You want a smart cat, get a moggie, which þ definitely is.

Those photos

This is about the blackest animal I’ve ever seen, any species; not a single white hair, nose to tail. Our house lighting, outside the kitchen, tends to the soft, and the furnishings towards dark colors. And of course he never stands still for the camera.

But this evening there he was stretched out on the stereo, to be precise on the Benchmark USB DAC, which I leave on and is thus pleasantly warm.

þ the kitten relaxing on a DAC

Thank goodness for modern cameras that do well at ISO3200, for lenses with image stabilization, for the immensely data-rich Fujifilm RAW files, and for Lightroom’s ability to add light gracefully.

By the way, þ is not a digital-audio exclusivist, here he is shortly after discovering that the thing on the record player was going round and round, plotting how to get inside and kill it.

þ the kitten and a record player

Life is a little more interesting and more cheerful around the house. Every little bit helps, this winter.

Sea Island 21 Feb 2021, 12:00 pm

Not the most original name, granted. It’s wedged into the middle of Greater Vancouver’s western oceanfront and is mostly occupied by our airport and its apparatuses. But there are a couple of decent parks, and on a greyish February day they yielded fresh air, smiles, and a harvest of photographs. This particular season of this particular year, we’ll take what we can get.

McDonald Park

It’s wedged in between the airport and the north branch of the mighty Fraser River, whose existence is a big part of the reason Vancouver exists. It’s low-key and old-school.

Pay phone (!) in McDonald Park on Sea Island, Vancouver

Lauren picked it up and listened but there was no dial tone.

Walking along a riverbank smells and feels different than the oceanfront. It wasn’t much of a day but the gloom was relieved by the joy of the many off-leash dogs getting blissfully filthy in the mud and sand.

Reflecting puddle in McDonald Park on Sea Island, Vancouver

I thought the most interesting part was the semi-artificial marsh, deliberately planted and encouraged in an effort to compensate for one or another of the many losses in salmon habitat following on infrastructural improvement.

Marsh sign in McDonald Park on Sea Island, Vancouver
Cattails in McDonald Park on Sea Island, Vancouver

These pictures reinforce an argument I’ve made here before and will make again: The desirability of going for a photowalk with a modern cameraphone — they’re all excellent — and a difficult, opinionated, prime lens, in this case my trusty Samyang 135mm f/2. Neither can take any of the pictures that the other can.

Iona

It’s a Regional Park (whatever that means) stuffed in behind Vancouver’s main water treatment plant, mostly distinguished by nice views out over the Strait of Georgia (that’s the water between Vancouver and Vancouver Island) (no, Vancouver isn’t on Vancouver Island, deal with it) and the South Iona Jetty, a stone string stretching 4km into the sea; you can walk or bike out and back, which I recommend but we didn’t do today.

People walking on South Iona Jetty, Vancouver

A popular spot on February 21st, 2021.

But for me the main attraction is the views out over the strait to the islands on the other side. Today the tide was at a level that maximized the extent of the tidal flats.

Tidal flats at Iona Park, Vancouver

Behind the flats the sea-grass reminds me irresistibly of the coat of Highland cattle.

Dry sea-grass in Iona Park, Vancouver

Some vegetation flourishes in the intertidal zone; I’m sure there’s a branch of botany that understands how plant metabolisms can survive salt water, and maybe there’s something in there we could all learn from.

Tidal flat vegetation at Iona Park, Vancouver.

Let’s put on the long lens and peer across the ocean at the islands.

Looking across the Strait of Georgia from Iona Park, Vancouver

When we drove home, since Sea Island is where the airport is, all of a sudden we were on the road home from the airport, which we’ve taken so, so often over the years but not for a long time, and it felt spooky. Can’t imagine when I’ll fly again.

In these dark days, get the hell outside and soak up some air and light, already. You’ll thank yourself. Take a camera.

Recent Code 14 Feb 2021, 12:00 pm

I’ve been programming and remembering how much fun it is. With nostalgia for Old Data and thoughts on my work at AWS.

Old-School

What happened was, I was talking to a friend who’s in the middle of a big project; they said “Would you be interested in bashing out a quick parser in Java?”
“Parser for what?” I asked.
“GML”.
I just about sprayed coffee on my computer. “You have got to be kidding. There hasn’t been any GML since the days of mainframes.”
“Exactly. They’re migrating the documents off the mainframe.”
“What documents?”
“High-value deeply-structured stuff. They need to turn it into simple XML, we’ll enrich it later.”

I’m semiretired and suddenly realized I hadn’t done any actual code for many months, so I named a price and they took it. It’s old-school stuff; I mean really old-school; GML is actually a basket of macros for the IBM mainframe Script typesetting system, which I used to write my undergrad CS assignments back in the freaking Seventies.

IBM GML documentation

Old-school it may be, but I’m learning cool new Java things like postmodern switch because IntelliJ keeps sneering at my Java 8 idioms and suggesting the new shiny. And I’d forgotten, really, how nice carving executable abstractions into shape feels. Also parsers are particularly fun.

And here’s a snicker. I realized that the parser needed a config file. So… JSON? YAML? XML? Except that so far, my program had exactly zero dependencies, a single little jar file and away you go; we don’ need no steenkin’ CLASSPATH. But wait… I’d just written a GML parser. So the config file is in GML, yay!

But Should I Code?

Seriously, it’s reasonable to ask that question at this stage of my career. It’s a conversation that arose at both of my last two jobs, Amazon and Google. Should your most senior engineers, the ones with decades of experience and coding triumphs and tragedies under their belts, actually invest their time in grinding out semicolons and unit tests? Or do you get more leverage out of them with mentoring, designing systems and reviewing others’ designs, code reviews, and being the bridge between businesspeople and geeks?

There’s no consensus on the subject that I’m aware of. There are people I deeply respect technically who really believe that coding is a lousy use of their time. But then anyone who’s been in this biz for long has met Architecture Astronauts who can make a hell of a design chart in Omnigraffle but are regularly really wrong.

I’m personally in the senior-engineers-should-code faction and when I was asked to evaluate someone, would always pull up the internal equivalent of the GitHub history page. I wouldn’t expect to see a lot of action there, but I’d get serious misgivings if I saw none. On the other hand, I freely admitted prejudice on the grounds that I personally can’t not code.

Except that I hadn’t for a while. Now I realize how much I missed it.

Recently…

Now that I’m in talking-about-code mode, I want to mention my most recent excursion when I was at AWS. Coding there is terrific, with very decent dependency-management and code-review tools. And, most important, there’s a good chance that your code will end up being used by hundreds of thousands of customers, or processing millions of requests per second, or both. Those things will turn any geek’s crank.

I didn’t code a lot there. One little package in Snowmobile. Some bits and pieces in Step Functions; I love that service.

But I was fiddling with code in EventBridge from the month I joined (December 2014) to the last days before my exit. In particular, the stuff linked to from Events and Event Patterns in EventBridge. Words can hardly describe the intensity and fun involved in building this thing, and the thrill as customers piled on board and the flow of events through the code became mind-boggling.

The software lets you write rules, then present events and have it tell you which rules each event matched. Simple enough, logically. It has an unusual but good performance curve and an API people seem to like. I can’t go into how it works; there’s a patent filing you could track down, but that’s written in patent-ese, not English, so it wouldn’t help.
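
I can sketch the externally visible behavior, though, in the style of the public docs (this example is mine, not anything internal). An event is a JSON object:

    {
      "source": "aws.ec2",
      "detail-type": "EC2 Instance State-change Notification",
      "detail": { "state": "running" }
    }

A rule is a JSON pattern that names fields and lists acceptable values; it matches any event in which every named field appears with one of those values. So this rule matches the event above:

    {
      "source": ["aws.ec2"],
      "detail": { "state": ["running", "stopped"] }
    }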

Other teams started picking it up and suddenly I had collaborators. There was this one feature request that I was convinced was flatly impossible until this guy I’d never heard of just made it work. He and I were chief co-authors from that point for the next several years. I miss him.

My last couple of years at AWS I was, in between all my other work, regularly chiselling away at this code. It wasn’t the best part of my job, but I liked it. At one point it became clear that AWS was serious about upping its open-source game. So I floated a proposal that we open-source my baby. That ball was still in play when I left but I’m not holding my breath. I still had lots of what-to-do-next ideas and working on it would be a great semiretirement hobby.

Advice

If you used to like to code but don’t do it any more, I suggest you see if you still do.

Jassy Talking Points 5 Feb 2021, 12:00 pm

Following on Tuesday’s big succession announcement at Amazon, I was apparently the only human who’d been in a room with Andy Jassy more than once in recent years and was willing to talk to media. By about the fifth conversation, my talking points were stubs because the points wore off, leaving a well-polished gleam. So I might as well share them directly. If you’ve read any of the other articles this may sound familiar.

I should be clear about two things. First, I’m not exactly close to Andy Jassy — I’ve only ever been in the room with him at decision meetings and annual planning reviews; maybe a dozen times in my 5½ years there. Second, I’m not going to tell any secrets.

I’ll use Q&A format. Every one of these is one I got from one or more journalists.

Why Andy?

In 2006, AWS didn’t exist. In 2021 its annual run rate is $50B. What else do you need to know?

The obvious choices for CEO were Andy Jassy and Jeff Wilke. I have no inside information, but I suspect the decision was made a few months back, and that it explains Wilke’s exit.

I think it’s the right call. Andy, in my opinion, is an outstanding executive. AWS was the best-managed place I worked in my 40-year career, including places where I was the CEO.

He’ll be less aggressive than Jeff, right?

I got this question a few times and it surprised me, because I don’t think so at all.

I remember one annual planning meeting where I was a senior member of a (large) AWS product group; we presented our six-pager. The usual sort of talk ensued, Andy’s team challenging us on this technical or that business issue. As I’d come to expect in document-driven discussions, the quality was excellent. Eventually Andy spoke up. He had a few things to say but this is what I remember: “Are you guys thinking big enough? Could you go faster? If we gave you double the resources, what could you do?”

That’s just anecdote. But here’s a number. It’s Amazon Web Services, right? How many services, then? Somewhere around 200 at this point and the rate of new-service announcements isn’t slowing down. There’s actually a faction among customers and analysts who argue that there are too many; that it’s hard for customers to understand and choose. Except for, under Andy’s leadership the strategy has been simple: Identify every IT problem an organization can have and offer a service that solves it. Are you really going to disagree in the face of that zero-to-$50B trajectory? Aggressive enough?

What’s Jeff’s legacy?

Jeff founded Amazon in 1994 and now it’s arguably the world’s most powerful company. Investors think it’s worth $1.7T and it employs over a million people. Whether you love or loathe capitalism, you have to be impressed at those numbers.

On the other hand, as recently as ten years ago the Big-Tech companies were admired and their leaders hero-worshipped. Today, a substantial proportion of the population is disaffected with Big Tech in general and Amazon in particular. Even people who buy a lot of stuff from Amazon tell me they feel bad about it. Many feel raw fear. Jeff Bezos is definitely part of that problem.

So his legacy has to include both the corporate success and the sectoral disaffection.

Isn’t it cool that Jeff is free to explore space?

Spare me. I’m as space-crazed as the next geeky nerd but in the near-to-medium future we need to be paying close attention to this planet’s problems if we want our children to have a place to live. In terms of economics and health and the environment, space is a distraction. I was super happy to see Jeff’s letter feature the Climate Pledge prominently.

By the way, anyone noticed Jeff’s ex-wife is kicking his butt in the (difficult) practice of giving money away constructively? I hope he catches up.

What’s Andy like?

OK, this is boring, but: customer-obsessed.

Data point: Before the first Andy meeting where I’d helped write the six-pager and was expected to make the case, I got advice: “Make sure anything you say is backed by customer data.” Fortunately we’d been running a popular beta on the new service under discussion, so I had loads of anecdotes from household-name customers. So when a question came up I could say things like “Well, CustomerA says the big upside is X but CustomerB says we need to beef up Y.” The advice was good: My recollection is that the decisions went the way we’d wanted them to.

Data point: It’s the middle of the night and some AWS service is having a nasty outage; maybe a big rainstorm took out a bridge with three telcos’ fibre on it. There’ll be a late-night communication chain on what we can say to customers and when we can say it. Andy will be on that chain and the service team’s representative (back then, sometimes me) better provide regular updates.

I could offer more if I could tell secrets. But customer focus is genuinely the first thing that comes to mind.

Now here’s something to watch out for. AWS claims a million-plus customers and, while I don’t know the numbers, it’d be reasonable to think that a significant chunk of the revenue comes from a smaller number of big customers. Thus, anyone who’s doing significant business has a few AWS people they’ve gotten to know pretty well and whose job is to understand their problems.

Amazon has over 150 million. Not just customers I mean, but Prime members. So I think Andy’s customer-obsession degree of difficulty is about to go way up.

Will Andy change the Amazon culture?

I doubt it. In my opinion, the vision of Jeff as singular day-to-day supergenius mastermind is just wrong. Nobody can do that at Amazon’s scale. The real achievement is building and sustaining highly effective culture and process. Around hiring and promotion, around product management, around reporting and decision-making. You can’t grow and innovate at that scale unless you can delegate strategy and tactics to lightly-supervised groups with high confidence that they’ll be right a lot.

From the point of view of Amazon’s leadership and its investors, its culture is working just fine. Why change anything? But…

Is Amazon going to become more humane?

I doubt it. There just isn’t a way to employ a million people and sell to a hundred million others and offer a human touch.

So I don’t expect Amazon to stop bullying partners or having high injury rates at warehouses or ferociously resisting unionization.

Unless, of course, they are forced to do these things by a combination of legislation, regulation, and litigation. Which leads to:

What are the big challenges facing Amazon?

Now that Andy’s been promoted, he’s got a new responsibility: Testifying to Congress. It’s not going to be fun. And for my money, that’s going to be the biggest change to the landscape that Amazon plays on.

Quoting from the Leadership Principles: “As we do new things, we accept that we may be misunderstood for long periods of time.” Is it just me or does that sound like a symptom of extreme arrogance? “We’re so much smarter than everyone else that normal people can’t even understand why what we’re doing is smart.”

Now, this might be plausible when you’re a scrappy Seattle startup doing things nobody had ever previously thought of. When you’re the world’s most visible and most powerful company, you neither can nor should want to be misunderstood at scale. Maybe edit that LP a bit?

I expect Amazon to experience severe friction on multiple legal fronts. First, anti-monopoly. Note that 2020’s upsurge of antitrust litigation was not only bipartisan but in some cases Republican-led.

Regular readers here know that I’m not neutral at all about this: I enthusiastically support aggressive anti-monopoly action against not just Amazon and not just the Big-tech titans and not just in the USA, but across the economy and across the globe. I wouldn’t start with Amazon, if I were running the show, but I’d get there pretty quick.

(I’m pretty sure promoting Andy is a smart move and one he’s earned. But I’m unhappy because it decreases the likelihood that Amazon will spin off AWS voluntarily, which I think would be unambiguously good.)

A second legal front is probably going to fall out of the continuing uproar among sellers making their living on the Amazon platform. Elizabeth Warren argued powerfully in March 2019 that certain large tech businesses need to be designated as “platform utilities” and strictly regulated, most obviously by forbidding companies from both operating a marketplace and selling on it.

Finally, of course, we can expect the labor landscape to change. America lags the rest of the rich world shamefully in the imbalance of power and wealth between Capital and Labor; shifting this balance has to be pretty high on any progressive agenda.

How this goes depends on the politics playing out in Washington and then the next couple of election cycles, but my perception is that the Overton Window has moved and that the Covid interregnum may mark the high point of the fifty years of regressive politics kicked off by the Reagan-Thatcher neoliberal consensus of the Seventies. Definitely watch this space.

Hey Andy, here’s some sincere advice: Get to know Congresswoman Jayapal soonest, and give a careful listen to what she says. She’s really smart and you might agree on more than you suspect.

Andy Jassy is a terrific executive and I respect him a whole lot. It’ll be fascinating to watch Amazon navigate this new landscape.

Decarbonization 19 Jan 2020, 12:00 pm

We’re trying to decarbonize our family as much as we can. We’re not kidding ourselves that this will move any global-warming needles. But sharing the story might, a little bit. [Updated mid-2021 with a bit of progress: No more gas vehicles, heat pumps in production.]

Those who worry a lot about the climate emergency, and who wonder what they might do about it, are advised to have a look at To fix Climate Change, stop being a techie and start being a human by Paul Johnston (until recently a co-worker). It’s coldly realistic, noting that the Climate Emergency is not your fault and that you can’t fix it; only radical large-scale global political action can. His recommendation is that you find an organization promoting such change that suits your fancy (there are lots, Paul provides a helpful list) and join it.

Global GHG per capita, 2017

Such intensity of change is possible. It happened in the middle of the Twentieth century, when the world faced the threat of global Fascism: governments unceremoniously fixed wages, told businesses what they could and couldn’t do, and sent millions of young men off to die.

It’s not just possible, it’s inevitable that this level of effort will happen again, once the threat level becomes impossible to ignore. We will doubtless have to conscript en masse to fight floods and fires and plagues; that, however, is better than attacking positions defended by machine guns. The important thing is that it happen sooner rather than later.

Thus evangelism is probably the most important human activity just now, which is why people like Greta Thunberg are today the most important people in the world.

A modest proposal: Decarb

“Decarbonize” has four syllables and “decarbonization” six. I propose we replace both the noun and the verb with “decarb” which has only two. Mind you, it’s used in the cannabis community as shorthand for decarboxylation, but I bet the world’s stoners would be happy to donate their two syllables to the cause of saving the planet. Anyhow, I’m gonna start using decarb if only because of the typing it saves.

So, why personal decarb, then?

Well, it feels good. And — more important — it sends a message. At some point, if everybody knows somebody who’s decarbing it will help bring the earth’s defenders’ message home, making it real and immediate, not just another titillation in their Facebook feed.

There’s a corollary. Decarbing is, by and large, not terribly unpleasant. The things we need to give up are I think not strongly linked to happiness, and some decarb life choices turn out to be pretty pleasing.

Caveats and cautions

It’s important that I acknowledge that I’m a hyperoverentitled well-off healthy straight white male. Decarbing is gonna be easier for me than it is for most, because life is easier for me. I’m willing to throw money at decarb problems before that becomes economically sensible because I have the money to throw; but I hope (and, on the evidence, believe) that pretty well every one of these directions will become increasingly economically plausible across a wider spectrum of incomes and lifestyles.

Because of my privileged position and because the target is moving, I’m not going to talk much about the costs of the decarb steps. And I’m not even sure it’d be helpful; these things depend a lot on where in the world you live and when you start moving forward.

Now for a survey of decarb opportunities.

Decarb: Get gas out of your car

Obviously fossil fuels are a big part of the problem, and automobiles make it worse because they are so appallingly inefficient at turning the available joules into kilometers of travel — most manage less than 35%. On the other side of the coin, electric vehicles are a win because they don’t care what time of day or night you charge them up, so they’re a good match for renewable energy sources.

Family progress report: Good.

For good news check out my Jaguar diary; I smile a little bit every time I cruise past a gas station. I’m here to tell you that automotive decarb isn’t only righteous, it’s fun.

Jaguar I-Pace

Decarb: Get yourself out of cars

Last time I checked, the carbon load presented by a car is only half fuel, more or less; the rest is manufacturing. So we need to build fewer cars which means finding other ways to get places. Public transit and micromobility are the obvious alternatives.

They’re good alternatives too, if you can manage them. If you haven’t tried a modern e-bike yet you really owe that to yourself; it’s a life-changer. I think e-bikes are better alternatives than scooters along almost every axis: Safety, comfort, speed, weather-imperviousness.

And transit is fine too, but there are lots of places where it’s really not a credible option.

Now, there are times when you need to use a car. But not necessarily to own one. It isn’t rocket science: If we share cars then we’ll manufacture fewer. There are taxis and car-shares like Car2Go and friends. (Uber et al are just taxi companies with apps; and money-losing ones at that. Their big advantage is that your ride is partly paid for by venture-capital investors who are going to lose their money, so it’s not sustainable.) The car-share picture varies from place to place: Here in Vancouver we have Evo and Modo.

Family progress report: Pretty good.

Since I got my e-bike I’ve put three thousand km on it and I remain entirely convinced this is the future for a whole lot of people. I just don’t want to drive to work any more and resent it when family logistics force me to.

Trek E-Bike

My wife works from home and my son takes a bus or a skateboard to college. My daughter still gets driven to school occasionally in some combinations of lousy weather and an extra-heavy load, but uses her bike and the bus and will do so increasingly as the years go by.

I used to take the train quite a bit but the combination of Covid and leaving Amazon means it’s only rarely a good alternative to bicycling. Hmm. We use car-share a lot for going to concerts and so on because they avoid the parking hassle, and when we need a van to schlep stuff around for a couple of hours.

Now, due to the hyperoverentitledness noted above, we have the good fortune to live in a central part of the city and all these commute options are a half-hour or less. This part of decarb is way harder in the suburbs and that’s where many just have to live for purely economic reasons.

Decarb: Get fossil fuels out of the house

Houses run on some combination of electricity and fossil fuels, mostly natural gas but some heating oil. The biggest win in this space is what Amory Lovins memorably christened negawatts — energy saved just by eliminating wastage. In practical terms, this means insulating your house so it’s easier to heat and/or cool, and switching out incandescent lights for modern LEDs.

Ubiquitous LEDs are a new thing but they’re not the only new thing. For heating and cooling, heat pumps are increasingly attractive. Their performance advantage over traditional furnaces is complicated. Wikipedia says the proper measure is coefficient of performance (COP), the ratio of useful heat moved to work input. Obviously a traditional furnace can’t do better than 1.0; a heat pump’s COP is 3 or 4 at a moderate temperature like 10°C and falls as the outdoor temperature drops, approaching 1.0 in hard sub-zero cold. So it’s an easy win in a place like Vancouver but a tougher call if you’re in Minnesota or Saskatchewan.
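
Concretely, with rough numbers (the 0.95 figure for a good condensing gas furnace is my assumption, not from the sources above):

    Heat pump at COP 3:        1 kWh of electricity in  ->  3.0  kWh of heat into the house
    Gas furnace at ~95% eff.:  1 kWh of gas burned      ->  0.95 kWh of heat into the house

Per unit of input energy the heat pump delivers roughly three times the heat, as long as the cold doesn’t drag the COP down.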

Heat pump technology also works for your hot-water tank.

Another new-ish tech, popular in parts of Europe for some time now, is induction cooking. It’s more efficient than a traditional electric cooktop, which is in turn more efficient than gas. For those of you who abandoned electric for gas years ago because it was more responsive, think again: Induction reacts just as fast as gas and boils water way faster.

Note that all these technologies are electric, so your decarb advantage depends on how clean your local power is. Here in the Pacific Northwest where it’s mostly hydroelectric, it’s a no-brainer. But even if you’ve got relatively dirty power the efficiency advantage of the latest tech might put you ahead on the carbon count.

Also bear in mind that your local electricity will likely be getting cleaner. If you track large-scale energy economics, you know the cost of renewables, even without subsidies, even with the need for storage, has fallen to the point where it not only makes sense to use them for new demand; in some jurisdictions it makes sense to shut down high-carbon generating infrastructure in favor of newer, better alternatives.

Family progress report: Good, but…

We had a twenty-year-old gas range which was starting to scare us from time to time, for example when the broiler came on with a boom that rattled the windows. So early this year we retired it in favor of a GE Café range, model CCHS900P2MS1. It makes the old gas stove feel like cave-man technology. Quieter, just as fast, insanely easier to clean, and safer.

On the other hand, it requires cookware with significant ferrous content, which yours maybe doesn’t have; we had to replace a few of ours. And one thing that really hurts: It doesn’t work with a wok so we stir-fry a whole lot less.

GE Café CCHS900P2MS1 Induction Range

Modern induction range with a traditional cast-iron frying pan, making pancakes.

Now, as for those negawatts: We live in a wooden Arts-and-Crafts style house built in 1912 and have successively updated the insulation and doors and windows here and there over the years to the point where it’s a whole lot more efficient than it was. Late last year we went after the last poorly-insulated corner and found ourselves spending several thousand dollars on asbestos remediation.

Also in early 2020 we installed a Mitsubishi heat pump and a heat-pump-based hot-water tank. This turns out (in 2021 in Western Canada) to be quite a bit more expensive than heating the house with gas was. Also, modern furnaces (compared to the decades-old boxes they replaced) want to pump more air and pump it gently. There is one bedroom that we just couldn’t get properly ducted without a major house rebuild and it’s kind of cold now.

Another downside is that the tech isn’t fully debugged. The thermostat that came with the thing is really primitive and we haven’t been able to make a more modern one work with it.

Having said that, when the furnace is on, it’s a whole lot quieter and gentler than what it replaced.

One consequence is that we could turn off the natural gas coming into the house, which makes us feel good on the decarb front. And then there’s the earthquake issue: where we live we’re overdue for The Big One and when it comes, if your house doesn’t fall down on you, another bad outcome is a natural-gas leak blowing you to hell.

Decarb: Fly less

Yeah, the carbon load from air travel is bad. The first few electric airplanes are starting to appear, but no-one believes in electric long-haul flights any time soon.

Family progress report: Bad, used to be worse.

For decades, as the public face of various technologies, I was an egregious sinner, always jetting off to this conference or that customer, always with the platinum frequent-flyer card, often upgraded to biz class.

Since I parted ways with Google in 2014 and retreated into back-room engineering then retirement, things have been better. But not perfect; we have family in New Zealand and Saskatchewan, and take to the air at least annually. I haven’t any notion what proportion of air-flight carbon has business upstream, what proportion vacations, and what proportion love. I hope the planet can afford the latter and a few vacations.

Of course, Covid slaughtered biz travel and we don’t know yet how much will come back. I can’t imagine that anyone thinks it’ll go 100% back to the way it was, thank goodness.

Decarb: Eat less meat

The numbers for agriculture’s share of global carbon load are all over the place; I see figures from 9% to twice that. The UN Food and Agriculture Organization says global livestock alone accounts for 14.5% of all anthropogenic GHG emissions. So if the human population backed away from eating dead animals, it’d move the needle.

Family progress report: Not that great but improving.

We eat more vegetarian food from year to year, especially now that my son cooks once or twice a week and generally declines to include meat in what he makes. I make progress but slower, not so much because meat is tasty and nutritionally useful but because it’s easy; less chopping and other prep required. In a culture that’s chronically starved for time, this is going to be a heavy lift.

Also we’ve cut way back on beef, which is the worst single part of the puzzle. I think it’s probably perfectly OK for a human to enjoy a steak or a burger; say, once per month.

Decarb: Random sins

You probably shouldn’t have a motorboat (which we do) and if you do you should use it sparingly (which we do).

You probably shouldn’t burn wood in a fireplace (which we do) and if you do you should use it sparingly (which we do).

[To be updated].
