Week 32, the noise diary

Hello! The appointed hour is almost at hand. I’m sending this newsletter Saturday night, because on Sunday morning at 11 a.m., Pacific Time, I’ll begin my live reading of the novella Train Dreams by Denis Johnson. It will take a few hours, and it will be a nice thing, perhaps, to play in the background while you do some Sunday-ish activity like sip a beverage or clean the house or build a railroad bridge over a gorge.

You can tune in here.

It might even be underway at the moment you’re reading this!!

This dispatch is slightly odd-shaped, by Year of the Meteor standards. It’s about just one topic, and full to the brim with images, which is why I have omitted the usual art interludes. It concludes with a preview of some work I’ve put towards a potential video game project.

There will be gifs.

But first, some technical foundation:

One of the things that people often find themselves wanting to do is realistically render an imaginary 3D scene. This is the task at the core of Toy Story and Fortnite and the Navy’s latest flight simulator alike.

The stockpile of programming techniques used to produce these images is vast, almost gnostic—tricks to estimate how light might behave in an imaginary 3D scene, refined continuously since the earliest days of computers. For decades, this branch of programming has become ever more ingenious and inscrutable; beneath the skin of its gorgeous, real-time renderings lies a monstrous patchwork of shortcuts and illusions.

There is, however, a simpler way to realistically render an imaginary 3D scene:

Just follow the light!

You can “trace” each ray, each simulated photon, as it emanates from a light source, bounces between objects in the scene, and finally strikes a pixel in your display. If you render the scene this way, the task becomes… weirdly easy? There’s math, but it’s elegant math, and you can fit it into a few hundred lines of code. There are no tricks required…
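If you want to see the shape of that elegant math, here is a toy sketch in JavaScript: a scene of invented spheres, a ray, a few diffuse bounces. Every sphere, color, and constant below is made up for illustration; a real tracer adds cameras, materials, and much more.

```javascript
// Toy path tracing sketch: follow one ray as it bounces around.
// Scene, vector helpers, and material values are all invented.

const add = (a, b) => a.map((v, i) => v + b[i]);
const sub = (a, b) => a.map((v, i) => v - b[i]);
const mul = (a, s) => a.map(v => v * s);
const dot = (a, b) => a.reduce((s, v, i) => s + v * b[i], 0);
const norm = a => mul(a, 1 / Math.sqrt(dot(a, a)));

// A scene of spheres: center, radius, surface color, emitted light.
const scene = [
  { c: [0, -100.5, -1], r: 100, color: [0.8, 0.8, 0.8], emit: [0, 0, 0] }, // floor
  { c: [0, 0, -1],      r: 0.5, color: [0.9, 0.3, 0.3], emit: [0, 0, 0] }, // red ball
  { c: [0, 5, -1],      r: 2,   color: [0, 0, 0],       emit: [4, 4, 4] }, // a light
];

// Find the nearest sphere this ray strikes, if any.
function hit(origin, dir) {
  let best = null;
  for (const s of scene) {
    const oc = sub(origin, s.c);
    const b = dot(oc, dir);
    const disc = b * b - (dot(oc, oc) - s.r * s.r);
    if (disc < 0) continue;
    const t = -b - Math.sqrt(disc);
    if (t > 1e-4 && (!best || t < best.t)) best = { t, s };
  }
  return best;
}

// Follow one ray through a few bounces, accumulating color.
function trace(origin, dir, depth = 0) {
  if (depth > 4) return [0, 0, 0];
  const h = hit(origin, dir);
  if (!h) return [0.1, 0.1, 0.1]; // dim sky
  const p = add(origin, mul(dir, h.t));
  const n = norm(sub(p, h.s.c));
  // Diffuse bounce: a random direction in the hemisphere above n.
  let d;
  do {
    d = [Math.random() * 2 - 1, Math.random() * 2 - 1, Math.random() * 2 - 1];
  } while (dot(d, d) > 1);
  if (dot(d, n) < 0) d = mul(d, -1);
  const bounced = trace(p, norm(d), depth + 1);
  return add(h.s.emit, h.s.color.map((c, i) => c * bounced[i]));
}
```

Call `trace` once per pixel (or many times, as we'll see) and you have the whole engine, more or less.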

…and there are consequently no shortcuts. For every pixel you want to paint, you have to trace a LOT of rays as they travel through the scene, to account for all the weird ways light can be bounced or blocked, refracted or colored. There are half-empty glasses of red wine lurking out there!! Thus, path tracing tends to require a lot of computation. Like… quite a lot.
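Here's the arithmetic of that burden, in miniature. This sketch fakes a traced ray with a weighted coin flip (an invented stand-in, not real tracing) to show why one ray per pixel gives you pure noise, while hundreds slowly converge on the true brightness:

```javascript
// Why one ray per pixel looks noisy: each traced ray is a random
// sample, and a pixel's true brightness is the average over many.
// This fakes a "ray" with a random hit-the-light / miss outcome.

const TRUE_BRIGHTNESS = 0.3; // imagine 30% of paths reach the light

function shootRay() {
  // Each ray either finds the light (1) or doesn't (0).
  return Math.random() < TRUE_BRIGHTNESS ? 1 : 0;
}

function renderPixel(raysPerPixel) {
  let total = 0;
  for (let i = 0; i < raysPerPixel; i++) total += shootRay();
  return total / raysPerPixel;
}
```

With one ray, every pixel comes out pure black or pure white: maximum noise. With hundreds, the estimate settles toward the true value, and the error shrinks only as one over the square root of the ray count, which is why the hours pile up.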

For years, this was prohibitive. The technique was too slow for video games, which demand a fresh frame every 16 milliseconds or so, and too slow even for movie production. This has finally changed, but only recently. After dreaming and arguing about it for decades, Pixar and its peers have all switched to path tracing renderers. (Pixar’s research archive offers a really beautifully written report about the practical use of path tracing in 3D animation. I have to say, for a technical PDF, I found it surprisingly… entertaining?!)

Let’s make the burden of path tracing more palpable.

I downloaded a simple path tracer and rendered one of the included example scenes on my MacBook, shooting 128 rays for every pixel. THIS TOOK HOURS:

Was it worth it?

Then, I changed the settings such that each pixel shot just one measly ray. Here’s the result, finished in five minutes:

The cheap version

In the path-tracing world, this kind of sketchy, incomplete, “noisy” rendering has historically been the subject of glumness and consternation. I mean, I understand why; the objective of the technique has generally been photorealism—luxe light and shadow.

But. What if that second image, the noisy one, is actually… better?

What if it’s gorgeous??

Alongside all that, an aesthetic fascination:

For months, I’ve been preoccupied with noise, every different flavor of it. To get you in the same headspace, I want to share some samples from my little folder of saved images—my noise diary. (If you’re viewing this email in narrow confines, any or all of these selections would benefit from the Open Image in New Tab treatment.)

Malwin Béla Hürkey designed this perfectly grainy poster:

Poster by Hürkey

Martian Press seems to produce most of its work on a Risograph, which you know I love. The Riso is a noise machine—and notice how the meaning of “noise” might flex and grow here, to mean not just “random fuzz” but “all sorts of stochastic, technically ‘imperfect’ artifacts in a production process”:


Here’s a frame from the opening credits for The Birds:

The Birds!

Look at the tips of those feathers!

The Birds!

You encounter a lot of noisy textures in space imagery. Here’s a literally scintillating plot from Takahiko Matsubara based on data from the Sloan Digital Sky Survey; the dots here aren’t truly random noise, of course—they’re galaxies!—but/and the image somehow has that sharp grain that I’m after:

Just the whole dang universe

And look at this clip from the European Space Agency’s Rosetta lander: one of the very first documents I deposited in the noise diary. Yes, I know, the clip is noteworthy overwhelmingly for being recorded ON THE SURFACE OF A COMET, but gosh, just look at the texture of it:

Comet dust

One last sample from the noise diary—a crucial one.

There seem to be vanishingly few people out there who are (a) using path tracing because they (b) love the noise. Every technical article, every lump of code, wants to help you eliminate the noise, rather than art-direct it.

One beautiful exception is Seth Thompson’s project Soft Pixels, an “experimental image series of 3D renderings with intentionally low fidelity.” Intentionally low fidelity. Seth gets it. How in a million years could you argue that this—

A glorious spray

—is a bug, rather than a feature? That glorious spray of pixels!

Now, I’ll give you that promised preview.

This is a visual mechanism powered by path tracing, guided (aspirationally) by the noise diary. It will be the “skin” of the speculative project which is… not quite a video game? It’s a new kind of digital book, I think. Maybe I’ll just call it a video game, though?? I’m trying to get better at like, ~identifying slots~ for things.

In fact, the project is hardly anything yet; there’s an interactive text engine, a skeletal story, and now this: an aesthetic HELLO WORLD.

Pentwater 1

This path tracer shoots just one measly ray per pixel, and there ain’t that many pixels to start with. It’s a path tracer built for sparkly noise. It runs smoothly in real time—in JavaScript, if you can believe it!

(When you decide other people’s problems aren’t your problems, all sorts of things become possible. There might be a shadow of Nintendo’s “lateral thinking with withered technology” lurking here.)

The simulated physics of path tracing gives you all sorts of lovely optical effects “for free.” Here’s my test landscape behind a refractive glass cylinder:
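That bending is Snell's law, which fits in a few lines. Here is a generic sketch of the formula, not code lifted from my tracer; the names and the index-of-refraction value are just illustrative:

```javascript
// Refraction "for free": Snell's law in vector form.

const dot3 = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];

// Bend an incoming unit direction `d` crossing a surface with unit
// normal `n`, where `eta` is the ratio of refractive indices
// (about 1/1.5 entering glass from air). Returns null when the ray
// can't escape: total internal reflection.
function refract(d, n, eta) {
  const cosI = -dot3(d, n);
  const sin2T = eta * eta * (1 - cosI * cosI);
  if (sin2T > 1) return null; // total internal reflection
  const cosT = Math.sqrt(1 - sin2T);
  return d.map((v, i) => eta * v + (eta * cosI - cosT) * n[i]);
}
```

Feed the bent direction back into the tracer and the glass just… works.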

Pentwater 2

You can see that I’m using chunky pixels with a very limited palette, even a bit of dithering. I love how the fidelity of the path tracing plays against the constrained, crushed, “ugly” qualities of these other techniques.
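For the curious, here's what ordered dithering looks like in miniature: a classic 4x4 Bayer matrix snapping smooth values onto a tiny palette. A generic sketch of the technique, not the project's actual pipeline:

```javascript
// Ordered (Bayer) dithering: trade smooth gradients for exactly
// the kind of patterned, "imperfect" grain described above.

// A 4x4 Bayer threshold matrix, normalized to 0..1.
const BAYER = [
  [ 0,  8,  2, 10],
  [12,  4, 14,  6],
  [ 3, 11,  1,  9],
  [15,  7, 13,  5],
].map(row => row.map(v => (v + 0.5) / 16));

// Snap a grayscale value (0..1) at pixel (x, y) to a small palette.
function dither(value, x, y, levels = 4) {
  const t = BAYER[y % 4][x % 4];
  const scaled = value * (levels - 1) + (t - 0.5);
  return Math.min(levels - 1, Math.max(0, Math.round(scaled))) / (levels - 1);
}
```

A flat 50% gray becomes a checkered mix of the two nearest palette levels: crunchy, deliberate noise.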

Look at the edge of the pillars here as they move in and out of focus, boiling up into this absolute spray:

Pentwater 3

And listen. If you’re taking the time to trace the paths of all these rays… why not sling some of them around in circles? Here’s a light-warping black hole:
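The trick, sketched generically: instead of flying straight, a ray near the hole is marched in small steps, its direction nudged inward at each one. This is toy physics, not general relativity, and every constant below is invented for illustration:

```javascript
// Slinging rays around in circles: march a 2D ray in small steps,
// bending its direction toward the "black hole" as it goes.

const HOLE = { pos: [0, 0], strength: 0.5, horizon: 0.2 };

// Returns "captured" if the ray crosses the horizon, otherwise the
// ray's final direction after all the steps.
function marchRay(pos, dir, steps = 2000, dt = 0.01) {
  let [px, py] = pos;
  let [dx, dy] = dir;
  for (let i = 0; i < steps; i++) {
    const rx = HOLE.pos[0] - px, ry = HOLE.pos[1] - py;
    const r2 = rx * rx + ry * ry;
    if (r2 < HOLE.horizon * HOLE.horizon) return "captured";
    // Nudge the direction toward the hole, stronger when closer.
    const pull = (HOLE.strength * dt) / r2;
    const r = Math.sqrt(r2);
    dx += (rx / r) * pull;
    dy += (ry / r) * pull;
    const len = Math.sqrt(dx * dx + dy * dy);
    dx /= len; dy /= len; // keep light moving at constant speed
    px += dx * dt; py += dy * dt;
  }
  return [dx, dy];
}
```

A ray aimed at the hole falls in; a ray passing at a distance is gently deflected, which is what smears the scene into those warped arcs.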

Pentwater 4

It was just last week that I got this basic mechanism working, and it has seized me with the same sense of possibility that I know from cooking up a killer opening line or a delicious twist in a story.

Oh… oh yes.

I want to see where this goes.

Other people will, too.


August 2019, Oakland

This website uses the typeface Albertus Nova, an update by Toshi Omagari of Berthold Wolpe’s classic, and GT America, designed by Noël Leu with Seb McLauchlan.

Have a question? Spot a typo? Your story didn’t arrive? Email

hony soyt qui mal pence