My physical space is full of subtle cues. Books I read or bought most recently are lying out. Papers are lying in stacks on my desk, roughly arranged by their relationships.
If I need to fix a door, I'll be reminded each time I see it. Digital task lists live in a dedicated app. I have no natural cause to look at that app regularly, so I need to establish a new habit of explicitly reviewing my task list.
— https://twitter.com/andy_matuschak/status/1202663208059691009 ↗
If I mark up a physical book then later flip through to see my margin notes, I'll always see them in the context of the surrounding text. By contrast, digital annotation listings usually display only the text I highlighted, removed from context.
— https://twitter.com/andy_matuschak/status/1202663208059691009 ↗
If I read an old digital note, I get the unnerving sense that it's part of some "whole" that I can't see at all, no matter how much hypertext is involved. Working with physical notes, I'd shuffle notes around to make sense of the structure. There isn't a digital equivalent.
— https://twitter.com/andy_matuschak/status/1202663256143187968 ↗
There's no "peripheral vision" in computing: this is related to ideas of Embodied Cognition, Epistemic Actions, spatial thinking, and the Dynamic Medium.