Laboratory Residency: building creative coding tools

I was quite tired after last year, where I worked on monthly projects and pushed hard to publish something every month. I also knew that this year would be a bit different due to my personal life, so I had to plan accordingly. I decided to aim for working on one personal project every day, but without any tight deadlines; this could be reading a research paper, exploring some technology if I had time, or playing a bit of music.

This, combined with publishing a weekly studio diary of music sketches, was easy enough that I knew I could do it, but also kept me accountable.

But still, I missed longer, more focused projects, so I was very happy to be invited to the residency, where I could spend four weeks of focused work on whatever I wanted.

I spent the whole August 2018 in Spokane, Washington at Laboratory, working on open source creative coding tools.

The project

I had a very rough idea of what I wanted to do when I arrived. A few years ago I was exploring 3D growth, and I wanted to continue that idea by building my own tools for creating organic 3D shapes.

The first week was full of learning in the most frustrating way — trying and failing over and over again. I still didn’t know exactly what I wanted to do, and was exploring different techniques. I finally settled on using SDFs (signed distance fields) after a short research detour into meshing isosurfaces.

After I settled on a technique, I quickly realised that what I really needed was a language that would let me create SDF models without touching GLSL.

I also wanted to be able to run them on the CPU or GPU, and mesh them with the simplest possible API.

One thing that helped me a lot was starting by imagining and sketching how the API could work, and then building it to fit my needs. I would create a file, sketch out a few different approaches with comments, and then try to implement the one that looked the most promising. I kept dates in those files as well; it’s interesting to read them back after some time. Here’s where I realised that a hiccup-like language could work:

## 2018-08-11

Thinking of using hiccup-like syntax for building geometries:

```js
const geom = [
  { r: 0.02 },
  [sdSphere, { r: 0.1, p: [0.0, 0.2, 0.3] }],
  [sdCube, { p: [0.0, 0.2, 0.3], s: [1.0, 1.0, 1.0] }]
];
```

And for larger datasets:

```js
const geom = {
  data: somePositions,
  op: [opUnionRound, { r: 0.02 }],
  mapper: [mapProp, { prop: "p" }, sdSphere, { r: 0.01 }]
};
```

One of the important things for me to solve with this system was the ability to use large datasets. Generating GLSL from trees like that is fairly straightforward, but there are limits on the size of a GLSL shader and on the number of instructions that can be executed on the GPU, and I was hitting them often.
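To illustrate the straightforward case, here is a minimal sketch of walking such a tree and emitting a GLSL expression. All of the names and call shapes here are hypothetical, chosen just for this sketch (node names are strings to keep it self-contained); this is not the actual library’s compiler:

```javascript
// Hypothetical sketch: compile a hiccup-like SDF tree to a GLSL expression.
// Assumed node shape: [name, props]; the root's props carry the rounded-union
// radius. None of these names are the real library API.

const vec3 = ([x, y, z]) => `vec3(${x}, ${y}, ${z})`;

function compileNode([fn, props]) {
  if (fn === "sdSphere") {
    return `sdSphere(p - ${vec3(props.p)}, ${props.r})`;
  }
  if (fn === "sdCube") {
    return `sdCube(p - ${vec3(props.p)}, ${vec3(props.s)})`;
  }
  throw new Error(`unknown node: ${fn}`);
}

// Reduce the children pairwise with a rounded union.
function compileTree([props, ...children]) {
  return children
    .map(compileNode)
    .reduce((acc, d) => `opUnionRound(${acc}, ${d}, ${props.r})`);
}

const glsl = compileTree([
  { r: 0.02 },
  ["sdSphere", { r: 0.1, p: [0.0, 0.2, 0.3] }],
  ["sdCube", { p: [0.0, 0.2, 0.3], s: [1.0, 1.0, 1.0] }],
]);
```

The trouble with this approach alone is that every data point becomes another inlined `opUnionRound` call, which is exactly how the shader size limits get hit.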

The solution at the time was a withDataset instruction, later renamed to map, which from the user’s standpoint works like MapReduce — one function maps over all the data, and a second one reduces it into a single value. Behind the scenes I turn this into a floating-point data texture uploaded to the GPU, and then step over it in the shader. This radically improved the number of data points I can use with this technique, although it obviously has its own limitations.
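The map/reduce idea can be sketched on the CPU side in a few lines. This is an illustration only, with made-up names; the real version generates a shader that steps over the data texture instead of looping in JavaScript:

```javascript
// Hypothetical CPU-side sketch of the map/reduce evaluation: map every data
// point through an SDF, then reduce the distances with a rounded union.

const length = ([x, y, z]) => Math.sqrt(x * x + y * y + z * z);
const sub = (a, b) => a.map((v, i) => v - b[i]);

// Sphere of radius r centred at c, sampled at point p.
const sdSphere = (p, c, r) => length(sub(p, c)) - r;

// Polynomial smooth minimum (rounded union) of two distances.
function opUnionRound(a, b, r) {
  const h = Math.max(r - Math.abs(a - b), 0) / r;
  return Math.min(a, b) - (h * h * h * r) / 6;
}

// map: one distance per data point; reduce: fold into a single field value.
function evalField(p, positions, radius, r) {
  return positions
    .map((c) => sdSphere(p, c, radius))
    .reduce((acc, d) => opUnionRound(acc, d, r));
}

const positions = [[0, 0, 0], [0.5, 0, 0]];
const d = evalField([0, 0, 0], positions, 0.1, 0.02);
// d is negative: the sample point sits inside the first sphere.
```

Because the reduction is a plain fold, the shader only needs one accumulator while stepping over the texture, instead of one inlined expression per data point.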


It was really nice to have enough time to tackle tangentially related problems that I was hitting while developing the main project. For example, I didn’t want to wait for requesting/processing large datasets, so I wrote persistent-memo to memoize results of functions whose input parameters and function body don’t change, and persist them across reloads (either in localStorage or on the filesystem).
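The core idea can be sketched as keying a cache on both the arguments and the function’s own source, so editing the function invalidates old entries. This is a sketch only — the names are made up and a plain object stands in for localStorage or the filesystem, not the actual package API:

```javascript
// Sketch of function-body-aware memoization. fn.toString() keys the cache on
// the function's source, so changing the body busts stale entries.

const store = {}; // swap for localStorage / a JSON file to persist over reloads

function persistentMemo(fn) {
  return (...args) => {
    const key = fn.toString() + JSON.stringify(args);
    if (!(key in store)) {
      store[key] = fn(...args);
    }
    return store[key];
  };
}

let calls = 0;
const slowSquare = persistentMemo((x) => {
  calls += 1; // counts how often the underlying function actually runs
  return x * x;
});

slowSquare(4); // computed
slowSquare(4); // served from the store, fn not re-run
```

With the store backed by persistent storage, the second run of a sketch skips the expensive fetch or computation entirely, as long as neither the arguments nor the function body changed.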

Another issue I was hitting was jumping between different sketches, and running multiple local server instances.

I solved this by creating sketchbook-cli, which provides an automatic gallery of screenshots on top of a folder of flat JavaScript sketch files. I could see all the sketches I’d explored at the same time, which sped up my workflow a lot, especially since some of the SDF geometries are expensive to render.


I learned some important lessons as well. First of all, a month of residency sounds like a long time, and I was even planning smaller “downtime” projects, but I never got to them: I barely had time to finish the main project.

Next time, I would probably aim for either a fully research- and learning-focused residency, or one where I already know almost everything and just produce artwork. I tried to combine both, and ended up only building the tools, without ever actually making anything bigger with them.


All of the tools I developed during the residency are now open source:

I also got a chance to play a few music gigs in the US, where I presented new material that I’d been working on alongside the studio diary project.

I’d definitely like to go for another residency. As usual, working on one thing brings up ten new ideas, and my notebook only keeps getting bigger. There’s something really nice about being able to set everything else aside and allow myself to work around the clock while exploring and learning.

This is surely not sustainable long-term, but for a month or two it lets me create larger chunks of work without feeling burned out, or as if I were avoiding my responsibilities.

Szymon Kaliski © 2018