Software interfaces undervalue peripheral vision! (a thread)
My physical space is full of subtle cues. Books I've read or bought recently are lying out. Papers sit in stacks on my desk, roughly arranged by their relationships.
Peripheral vision spontaneously prompts action.
If I need to fix a door, I'm reminded each time I see it. Digital task lists live in a dedicated app. I have no natural cause to look at that app regularly, so I have to establish a new habit of explicitly reviewing my task list.
Peripheral vision emphasizes the concrete.
Unread digital books and papers live in some folder or app, invisible until I decide that “it’s reading time.” But that confuses cause and effect.
If I leave books lying on my coffee table, I’ll naturally notice them at receptive moments. I'll read a book if I feel an actual, concrete interest in it. By contrast, the motivation to read a digital book comes from abstract interest in the habit of reading.
Hi Andy, I keep coming back to this thread of yours. Could you suggest some good references on the spatiality of information and how we embed knowledge into our surroundings? It doesn't even have to be in the context of HCI or anything digital.