- Live-Programming is great, but somehow hard to apply to "production systems", why?
- most Live-Programming demos are about sliders in the code editor, adjusting variables live, as the output in the split screen changes
- anecdotally, when working on "production systems" I rarely play with constants - maybe only when polishing some UI, but even then it often feels like working at the wrong level of abstraction
- most of the time, I try different code paths, which often means toggling multiple lines on/off across multiple files; "wiggling" the structure, not a single value
- this sometimes feels close to trying to have multiple git branches visible and editable at the same time (which is related to Handling History in Software)
- when Live-Programming a generative system, I see only the final result, so I still have to play computer in my head (Simulator) - the difference is that I get live feedback on my actions - which is great, but for more complex things I want more visibility into the sub-parts of the system
- which raises the question of how to get that visibility - some ideas:
- explicitly through some form of tracing/probing - like explored for example in the Live Coding Livebook research, and many other places
- by designing the programming system so that "chunking" is part of what you do in that system, and chunking then serves not only as a means of composition but also gives you visibility into the system, for example:
- 2024-09-02 Future Of Coding
- 2024-02-21 Live-Programming