One common criticism of automated "Skinner-box"-like teaching machines (e.g. in Watters's book) is that they're fascistic, inhumane, etc. In the context of K-12, that's definitely true, but I think the stronger criticism is that they don't really *work*, even on their own terms!
That is, even if a student mechanically does exactly what the Skinner box (or Khan Academy exercises) asks them to do, the resulting understanding is usually brittle, shallow, and short-lived. *Also* the experience is often awful, but that seems unimportant if it doesn't work!
It's funny—when I was working on K12 edu, what really bothered me about teaching machines was the fascistic, anti-creative bit.
Now that I'm working on expert learning, I have a different perspective: if such a machine truly worked, I'd *love* to use one for topics I care about.
The voluntarism makes all the difference for me. In the context of a coercive learning environment, the *affect* of the teaching machines really bothers me; but if I'm just trying to efficiently learn topics I need for projects that matter to me, then sure—whatever works best!
Questions I'd like to understand better:
Intelligent tutoring systems seem to produce more flexible, durable understanding. Is this true? If so, what differences from a Skinner-style machine make it so?
I can't get my head around Direct Instruction. The Follow Through studies are hard to argue with, but it sure seems like a teaching machine to me. Does it produce more flexible, durable understanding than Skinner-style machines do? If so, why? The teacher's human involvement, even if scripted?
Am I actually just wrong about such classic "computer aided instruction"-type systems? My conclusion's based mostly on interactions with and small-scale studies of students using Khan Academy. I'm wary of a lot of the empirical work here.
Another twist on the K12-vs-adult-learning context switch: maybe the reason these teaching machines don't seem to work in K12 *is* the coercion? That is, to learn things, you must earnestly think about them, and a coerced CAI user will not. But maybe a voluntary one would?
I suspect even most eager students would struggle to build strong understanding from these rote teaching machines. The emotional connection is just too flat. Readwise sends me highlights of things I cared about, and after a few weeks my eye just skids off them.
See also the discussion of emotional connection in the "mnemonic video" section of this essay, on how MOOCs struggle to leverage the emotional range of video while also supporting detail: numinous.productions/ttft/#mnemonic
There are some relevant anecdotes about teachers recreating their own KA videos, because the face/voice of someone you trust matters (we've found this especially true in math at sora)
The teacher can do it well, or poorly. But my answer: the DI scripts are incredible works of engineering. They explain the outcome of Follow Through.
Thought of you as I wrote this! :)
I reckon children need human interaction with their teachers in a way that, say, Navy IT trainees do not (in the same way that children thrive better in foster homes than in orphanages). I dunno, maybe we can hack this student-teacher connection w/ A.I.-style tutorbears. 🙂