Plato's Camera, again.
Apr. 27th, 2015 10:39 pm
So I'm going to complain a little about Paul M. Churchland's 2012 book, just because I'm such a big fan of his.
I roared through the first two chapters, but that's because I have the unnatural advantage of having read and reread many of the author's earlier works over the years. The book has five chapters, and I believe the first is shorter than the others: I mention this so you'll know what I mean when I say I rocketed through a very long chapter 2 and got bogged down in chapter 3.
What happens structurally is this: through the end of chapter 2, he has been discussing backpropagation learning (a biologically unnatural, but artificially very natural, learning algorithm) and feedforward neural networks that interpret static phenomena. In chapter 3, he introduces two new things at once: the biologically more natural learning method that might be called Hebbian, or something like it, and also how recurrent neural networks can perceive and execute processes that unfold in time. As I started reading chapter 3, I kept worrying about which one he was writing about at any given moment, because it was getting hard to keep track.
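For anyone who hasn't read the earlier books, the contrast between the two learning rules is easy to miss. Here's a minimal sketch (mine, not the book's) of the single-weight-layer core of each: the Hebbian rule is purely local, strengthening a connection whenever input and output are active together, while the backprop-style delta rule needs an externally supplied error signal.

```python
def hebbian_update(w, x, y, eta=0.1):
    """Hebbian rule: strengthen each weight when its input and the
    neuron's output are active together. Purely local -- no teacher."""
    return [wi + eta * xi * y for wi, xi in zip(w, x)]

def delta_update(w, x, y, target, eta=0.1):
    """Delta rule (the single-layer core of backpropagation):
    adjust each weight in proportion to the error (target - y),
    which requires knowing the correct answer in advance."""
    error = target - y
    return [wi + eta * error * xi for wi, xi in zip(w, x)]

w = [0.0, 0.0]   # initial weights
x = [1.0, 0.5]   # presynaptic (input) activity
y = 1.0          # postsynaptic (output) activity

print(hebbian_update(w, x, y))                # weights grow with co-activity
print(delta_update(w, x, y, target=1.0))      # zero error, so no change
```

The asymmetry is the point: the Hebbian update changes the weights whenever there's co-activity, answer or no answer, which is why it's the biologically plausible one.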
You've heard of damning with faint praise? I praise PMC with faint criticism.