With Inkbase we explored what it would mean to combine hand-drawn ink with spreadsheet-like reactivity, all in an end-user-programmable form.
To learn more about the research, head over here for a long write-up covering everything from the philosophy behind hand-drawn marks to the intricacies of the programming model.
Spatial messages as a way to communicate between embodied objects. Most programming in Inkbase was done by sketching shapes on the canvas and making them react to each other by asking questions like "what's to the right of me?" or "is there anything inside this region?". For example, in the demo below, the pink "wires" are not what makes the logic gates talk to each other; they are there purely as user feedback. Instead, each gate queries for whatever is to its left (its inputs) and sends the calculated value to everything to its right.
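To make this concrete, here is a minimal sketch of query-based spatial messaging, written in TypeScript rather than Inkbase's own language. Everything in it is an illustrative assumption, not Inkbase's actual API: the names (`InkObject`, `isLeftOf`, `stepAndGate`) and the crude rectangle geometry are stand-ins for Inkbase's much richer spatial predicates over hand-drawn ink.

```typescript
// Hypothetical sketch: objects occupy regions on a canvas, and "wiring"
// is replaced by spatial queries like "what is to the left of me?".

type Box = { x: number; y: number; w: number; h: number };

interface InkObject {
  id: string;
  box: Box;
  value: boolean;
}

// True when `a` lies entirely to the left of `b` with some vertical
// overlap. A deliberately simple stand-in for richer spatial predicates.
function isLeftOf(a: Box, b: Box): boolean {
  const horizontally = a.x + a.w <= b.x;
  const vertically = a.y < b.y + b.h && b.y < a.y + a.h;
  return horizontally && vertically;
}

// An AND gate: query the canvas for inputs on its left, combine them,
// and push the result to every object on its right. No explicit edges.
function stepAndGate(gate: InkObject, canvas: InkObject[]): void {
  const inputs = canvas.filter(o => o !== gate && isLeftOf(o.box, gate.box));
  gate.value = inputs.length > 0 && inputs.every(o => o.value);
  const outputs = canvas.filter(o => o !== gate && isLeftOf(gate.box, o.box));
  for (const out of outputs) out.value = gate.value;
}

// Usage: two inputs feeding a gate, with one indicator to its right.
const a: InkObject = { id: "a", box: { x: 0, y: 0, w: 10, h: 10 }, value: true };
const b: InkObject = { id: "b", box: { x: 0, y: 12, w: 10, h: 10 }, value: true };
const gate: InkObject = { id: "and", box: { x: 20, y: 4, w: 10, h: 10 }, value: false };
const lamp: InkObject = { id: "lamp", box: { x: 40, y: 4, w: 10, h: 10 }, value: false };

stepAndGate(gate, [a, b, gate, lamp]);
console.log(lamp.value); // true: both inputs are on, with no wires involved
```

The point of the sketch is that the connection topology falls out of the geometry: drag a shape to a new position and it is "rewired" on the next query, with no explicit edges to maintain.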
We hit this earlier in Inkbase, where interacting with a single item was nicely visualized with inspector panes, but interacting with groups of objects was not. We tried to solve this explicitly in Crosscut, but the solutions were far from ideal. I also hit it in my research at Glide, where displaying lists of values is the main thing you do.