Discussion about this post

Francis Spufford:

I agree with you about novels. (I would say that, wouldn't I?) But I'm not convinced, either, by Amodei's idea that AIs could do the work of science, or at least the genuinely pathfinding, breakthrough-making part of science. Doing pattern-recognition on a more-than-human scale, yes; systematically exploring vast possibility spaces we couldn't get into before, yes. But those are both ways in which AI can tease out unnoticed properties of data we've already got, things we already know (but don't know we know, etc). I like Henry Farrell's characterisation of AI as a social technology of knowledge, a way of conveniently indexing and making available a digest of what's already thought. (Errors included.) What that won't do, since it is a matter of probabilistic links between words and other symbols, is give us things that haven't been thought before. (Though it may show us things *implicit* in existing thinking that had been hiding in plain sight.)

To me, the danger here is not one of AGI taking over the world, or any of that shit, but of AI hopelessly disrupting the models of apprenticeship and human effort without which you can't get to the genuinely new stuff. Without the effortful mastery of the state of the art as it presently exists in any domain, you can't put yourself in a position to do the genuinely original next thing. Why would you spend years laboriously becoming mediocre, as a necessary stage on the way to becoming good, if AI will do it for you, frictionlessly? I think widespread adoption of AI is likely to be a recipe for human deskilling, and therefore for cultural stagnation.

