> Everything, Gordon, finally depends on breaking the presumptive intuition
> that consciousness is an irreducible something in the universe and seeing it
> as a system property instead.

SWM

Not everything, by any stretch, Stuart. The distinction you are unwilling to make is between functional properties and first-order properties. Computation is a second-order property. When you conflate "functionalities" with physical processes, you are holding Searle's view. You don't see that because you think Searle's view boils down to a nonprocess-based view of what consciousness is. And you come to this erroneous conclusion because you seem to think that computers are the only game in town. It is misleading to insist that even Searle regards organic brains as computers. ANYTHING can be given a computational description.

Three different conclusions are thus drawn by the camps, and they are:

1. Computation can't get at what makes one process mental and another nonmental. This is consistently adopted by you and other strong AIers when saying that ascriptions of consciousness form a sort of continuum.

2. But, really, things are either conscious or not, whether or not there is a continuum of qualities some conscious systems may entertain vis-a-vis other, more limited ones. (Searle)

3. Computation isn't even a candidate, because second-order properties are not candidates. (Searle)

Searle is distinguishing the processes of computation from brute physical processes. So you can't go from his denial that computation is intrinsically causal to the assumption that it comes down to Searle's intuition of consciousness as basic. Only those processes which bottom out in brute causality can be candidates.

And then you and Hacker turn around and say such Wittgensteinian things as that it may be a category mistake to say of a brain that it is conscious. And then the onus is put on Searle to show how brains do it. And if he can't, it is alleged that Searle went wrong somewhere.
On the contrary, we just haven't gotten there yet. What would be wrong, by our lights, is to insist that Searle's approach of seeking NCCs is misguided due to a conceptual mistake. But his biological naturalism is supposed to get us to see that only those mired in conceptual dualism to begin with would think anything wrong with the simple claim that brains cause consciousness somehow. This is why we think it a rearguard effort to label Searle a dualist: it is because you are one, and it would help you to win the argument if everybody else had to be also. We don't think so and are unmoved by bad arguments to the contrary.

The crux is your unwillingness to acknowledge how computers work (second-order properties of crunching bit strings) and your unwillingness to consider that one doesn't need an argument for what is in plain sight of all who don't refuse to see.

I don't suppose that will move you. Why? Because you are flaming, or really believe that it makes no sense, a la Hacker, to hypothesize that brains cause consciousness given the first-order properties of brains. Keep in mind that as long as you conflate second- and first-order properties, you are trying to sound as naturalistic as Searle is. But to the extent you have a problem with Searle, it boils down to some sort of incomprehension on your part as to how any first-order properties can cause consciousness. And that is conceptual dualism, which is endemic to strong AI to begin with.

Again, you will remain unmoved. Why? A good argument? How, if you entertain Searle's position with one side of your mouth while dismissing his claim with the other? What have I missed? Am I supposed to recant my belief in a distinction between functional and first-order properties? But the brain DOES cause consciousness. What else can?

Prediction: the distinction will be routinely glossed from an epistemic view. But an ignoratio elenchi is not the solution.

Cheers,
Budd

=========================================
Need Something?
Check here: http://ludwig.squarespace.com/wittrslinks/