HOP over FOR, HOT theory


Peter Carruthers



Following a short introduction, this chapter begins by contrasting two different forms of higher-order perception (HOP) theory of phenomenal consciousness - inner sense theory versus a dispositionalist kind of higher-order thought (HOT) theory - and by giving a brief statement of the superiority of the latter. Thereafter the chapter considers arguments in support of HOP theories in general. It develops two parallel objections against both first-order representationalist (FOR) theories and actualist forms of HOT theory. First, neither can give an adequate account of the distinctive features of our recognitional concepts of experience. And second, neither can explain why there are some states of the relevant kinds that are phenomenal and some that aren’t. The chapter shows briefly how HOP theories succeed with the former task. And it then responds (successfully) to the challenge that HOP theories face the latter charge too. In the end, then, the dispositionalist HOT version of HOP theory emerges as the overall winner: only it can provide us with a reductive explanation of phenomenal consciousness which is both successful in itself and plausible on other grounds.



1          Introduction


I should begin by explaining the bad joke that forms my title. (It is bad because it does need some explanation, unfortunately.) On the one hand, I shall be arguing in this chapter for the superiority of higher-order perception (HOP) theories over both first-order representationalist (FOR) and actualist higher-order thought (HOT) theories. (That is, I shall be arguing that HOP theories win out over both FOR theory and actualist HOT theory.) But on the other hand, I shall be arguing that the theory on which we should all converge (the theory that we should all hop over for) is actually a dispositionalist form of HOT theory (a form of HOT theory that, when combined with consumer semantics, can also count as a kind of HOP theory, as we shall see).

            The topic of this chapter is phenomenal consciousness: the sort of conscious state that it is like something to have, or that has feel, or subjective phenomenology. More specifically, this chapter is about whether (and how) phenomenal consciousness can be reductively explained, hence integrating it with our understanding of the rest of the natural world. I shan’t here pause to distinguish phenomenal consciousness from other forms of state-consciousness (specifically, from various forms of access-consciousness), nor from a variety of kinds of creature-consciousness; for these distinctions have been adequately drawn elsewhere, and should by now be familiar (Rosenthal, 1986; Block, 1995; Lycan, 1996; Carruthers, 2000, ch. 1). Nor shall I pause to consider ‘mysterian’ arguments that phenomenal consciousness lies beyond the scope of reductive explanation (McGinn, 1991; Chalmers, 1996; Levine, 2000). And accounts that attempt to explain phenomenal consciousness directly in terms of neurology or brain-function (e.g. Crick and Koch, 1990) are similarly excluded from discussion. (For direct critiques of both mysterian and neurological approaches to consciousness, see Carruthers, 2000, chs. 2-4.) Somewhat more narrowly, then, this chapter is concerned with attempts to provide a reductive explanation of phenomenal consciousness in terms of some combination of intentional (or representational) content and causal (or functional) role.

            Representationalist theories of phenomenal consciousness can be divided into two broad categories, each of which then admits of several further sub-divisions. On the one hand there are first-order theories of the sort defended, in different ways, by Kirk (1994), Dretske (1995) and Tye (1995, 2000). (For discussion of a number of other variants on this first-order theme, see Carruthers, 2000, ch. 5.) Such theories reduce phenomenal consciousness to a certain sort of intentional content (analog or fine-grained, perhaps; or maybe non-conceptual - these differences won’t concern us here) figuring in a distinctive place in the causal architecture of cognition (perhaps as the output of our perceptual faculties, poised to have an impact on conceptual thought and behavior control). And then on the other hand there are a variety of higher-order theories that reduce phenomenal consciousness to some sort of higher-order awareness of such first-order analog / non-conceptual intentional states.

            Here is one way of carving up the different forms of higher-order representational accounts of phenomenal consciousness. The basic contrast is between theories that claim that the higher-order states in question are themselves perceptual or quasi-perceptual, on the one hand, and those that claim that they are conceptualized thoughts, on the other. Higher-order perception (HOP) theories propose a reduction of phenomenal consciousness to analog / non-conceptual intentional content which is itself the target of (higher-order) analog / non-conceptual intentional contents (Armstrong, 1968; Lycan, 1996; Carruthers, 2000). Actualist higher-order thought (HOT) theory, on the other hand, reduces phenomenal consciousness to analog / non-conceptual contents which are the actual target, at the time, of a higher-order belief or thought. Or otherwise put, actualist HOT theory reduces phenomenal consciousness to analog / non-conceptual contents of which the subject is conceptually aware (Rosenthal, 1986, 1993, 1997).

            One somewhat surprising thesis to be advanced in the present chapter is that both FOR theories and actualist HOT theories (which superficially look very different from one another) turn out to be subject to quite similar kinds of difficulty. In point of fact, essentially the same arguments that can be used to defeat the one can also be used to defeat the other. Which then leaves HOP theory as the only representationalist account left standing. But HOP theory, too, admits of a pair of sub-varieties, one of which turns out to be, at the same time, a (dispositionalist) form of HOT theory. This is where we begin, in section 2. But the range of different representationalist alternatives can be seen laid out in figure 1.


Figure 1 about here


2          Two kinds of HOP theory


The form of higher-order perception (HOP) theory that will be familiar to most people is so-called ‘inner sense’ theory, generally credited to John Locke (1690). It was reintroduced in our era by Armstrong (1968), and has been defended more recently by Lycan (1987, 1996). On this account, we not only have a set of first-order senses charged with generating analog / non-conceptual representations of our environments and the states of our own bodies, but we also have a faculty of inner sense, which scans the outputs of those first-order senses and generates higher-order analog / non-conceptual representations of (some of) them in turn. And while terminology differs, it would seem that it is these higher-order representations that are responsible for the feel of our phenomenally conscious states.[1] That is to say, our first-order perceptual states get to be phenomenally conscious by virtue of being targeted by higher-order perceptions, produced by the operations of our faculty of inner sense.

            The contrasting, less familiar, form of HOP theory is a dispositionalist version of HOT theory (Carruthers, 2000), although it might equally be called a ‘dual-content theory’. On this account, some of our first-order perceptual states acquire, at the same time, a higher-order analog / non-conceptual content by virtue of their availability to a faculty of higher-order thought (HOT), combined with the truth of some or other version of consumer semantics - either teleosemantics, or functional / conceptual role semantics.[2] (It is because it proposes a set of higher-order analog - or ‘experiential’ - states, which represent the existence and content of our first-order perceptual states, that the theory deserves the title of ‘higher-order perception’ theory, despite the absence of any postulated organs of higher-order perception.)

There is no faculty of ‘inner sense’ on this account; and it is one and the same set of states that have both first-order and higher-order analog / non-conceptual contents. Rather, a set of first-order perceptual states is made available to a variety of down-stream ‘consumer systems’ (Millikan, 1984), some concerned with first-order conceptualization and planning in relation to the perceived environment, but another of which is concerned to generate higher-order thoughts, including thoughts about those first-order perceptual states themselves. And it is by virtue of their availability to the latter consumer system that the perceptual states in question acquire a dual content. Besides being first-order analog / non-conceptual representations of redness, smoothness, and so on, they are now also second-order analog / non-conceptual representations of seeming-redness, experienced-smoothness, and so forth; hence acquiring a dimension of subjectivity. And it is this dimension that constitutes those states as phenomenally conscious, on this account.

            How can we adjudicate between these two very different versions of HOP theory? There are a pair of significant problems with inner sense theory. One is that it is very hard to see any evolutionary reason for the development of an organ of inner sense. Yet such a faculty would be by no means computationally trivial. Since it would be costly to build and maintain, we need a good story about the adaptive benefits that it would confer on us in return. But in fact there are no such stories on offer. All of the various proposed functions of inner sense turn out, either not to require inner sense at all, or to presuppose a faculty for higher-order thought (HOT), or both (Carruthers, 2000). In contrast, it isn’t difficult for dispositionalist HOT theory to explain why a HOT faculty should have evolved, nor why it should have access to perceptual contents. Here the standard stories about the adaptive benefits of sophisticated social - and perhaps ‘Machiavellian’ - thinking will surely suffice (Byrne and Whiten, 1988, 1998; Carruthers, 2000).

            The other main problem with inner sense theory is that it ought to be possible for such a sense-organ to malfunction, just as our other senses sometimes do (Sturgeon, 2000). I can be confronted with a surface that is actually red, but which (due to unusual lighting conditions, or whatever) I perceive as orange. So, too, then, it ought to be possible for me to be undergoing an experience with the first-order analog / non-conceptual content red while my inner-sense faculty is producing the higher-order analog / non-conceptual content seems orange or experienced orange. In such circumstances I would be disposed to make the first-order recognitional judgment, ‘It is red’ (spontaneously, without inferring that the surface is red from background knowledge or beliefs about my circumstances), while at the same time being inclined to say that my experience of the object seems orange to me. Yet nothing like this ever seems to occur.[3]

            In contrast once again, no parallel difficulty arises for dispositionalist HOT theory. For it is one and the same state that has both first-order and higher-order analog / non-conceptual content, on this account. (So there can be no question of the higher-order content existing in the absence of the first-order one.) And the higher-order content is entirely parasitic upon the first-order one, being produced from it by virtue of the latter’s availability to a faculty of higher-order thought. There therefore seems to be no possibility that these contents could ever ‘get out of line’ with one another. On the contrary, the higher-order analog / non-conceptual state will always be a seeming of whatever first-order analog / non-conceptual content is in question.

There are difficulties for inner sense theory that don’t arise for dispositionalist HOT theory, then. Are there any comparable costs that attend the dispositionalist HOT version of HOP theory? Two are sometimes alleged; but neither seems to me very real or significant. It is sometimes said in support of inner sense theory that this approach makes it more likely that phenomenal consciousness will be widespread in the animal kingdom (Lycan, 1996). Whereas it is rightly said that dispositionalist HOT theory will restrict such consciousness to creatures capable of higher-order thought (humans, and perhaps also the other great apes). But this alleged advantage is spurious in the absence of some account of the evolutionary function of inner sense, which might then warrant its widespread distribution. And our temptation to ascribe phenomenal consciousness quite widely amongst non-human animals is easily explained as a mere by-product of our imaginative abilities (Carruthers, 1999, 2000), and/or by our failure to be sufficiently clear about what really carries the explanatory burden when we explain other people’s behavior by attributing phenomenally conscious states to them (Carruthers, 2004).

            The other ‘cost’ of preferring dispositionalist HOT theory to inner sense theory is that we are then required to embrace some form of consumer semantics, and must give up on any pure causal-covariance, or informational, mere input-side semantics. But this strikes me as no cost at all, since I maintain that all right-thinking persons should embrace consumer semantics as at least one determinant of intentional content, quite apart from any considerations to do with phenomenal consciousness (Botterill and Carruthers, 1999).

            I conclude, then, that once the contrast is clearly seen between inner sense theory and dispositionalist HOT / dual-content versions of higher-order perception (HOP) accounts of phenomenal consciousness, the latter should emerge as the winner overall. For there are powerful arguments against inner sense theory, while there exist no significant arguments against dispositionalist HOT theory (which aren’t just arguments against the higher-order character of the account, which both approaches share, of course).

            This result is important, since many people are inclined to reject HOP accounts of phenomenal consciousness too easily. In fact, they see the weaknesses in inner sense theory without realizing that there is an alternative form of HOP theory (dispositionalist HOT theory plus consumer semantics) which isn’t really subject to those problems. The remainder of this chapter will now argue in support of HOP approaches in general, as against both first-order representationalist (FOR) and actualist HOT accounts. Combining those arguments with the points made briefly in the present section will then amount to an overall argument in support of a dispositionalist HOT form of HOP theory.



3          Explaining higher-order recognitional judgments


There is something of a consensus building amongst philosophers opposed to ‘mysterian’ approaches to phenomenal consciousness. It is that the right way to undermine the various thought experiments (zombies, inverted experiences, and such-like) that are supposed to show that phenomenal properties don’t supervene logically on physical, functional, or intentional facts, is to appeal to our possession of a set of purely recognitional concepts of experience (Loar, 1990, 1997; Papineau, 1993, 2002; Sturgeon, 1994, 2000; Tye, 1995, 2000; Carruthers, 2000).

            The idea is that we either have, or can form, recognitional concepts for our phenomenally conscious experiences that lack any conceptual connections with other concepts of ours, whether physical, functional, or intentional. I can, as it were, just recognize a given type of experience as this each time it occurs, where my concept this lacks any conceptual connections with any other concepts of mine – even the concept experience. My possession of the concept this can consist in nothing more nor less than a capacity to recognize a given type of phenomenal state as and when it occurs.[4]

            Given that I possess such purely recognitional concepts of experience, then it is easy to explain how the philosophical thought experiments become possible. I can think, without conceptual incoherence or contradiction, ‘This type of state [an experience as of red] might have occurred in me, or might normally occur in others, in the absence of any of its actual causes and effects; so on any view of intentional content that sees content as tied to normal causes (i.e. to information carried) and/or to normal effects (i.e. to teleological or inferential role), this type of state might occur without representing redness.’ Equally, I can think, ‘This type of state [an experience] might not have been, or might not be in others, an experience at all. Rather it might have been / might be in others a state of some quite different sort, occupying a different position within the causal architecture of cognition.’ Even more radically, I can think, ‘There might have been a being (a zombie) who had all of my physical, functional, and intentional properties, but who lacked this and this and that – indeed, who lacked any of these states.’

            Now, from the fact that we have concepts of phenomenally conscious states that lack any conceptual connections with physical, functional, or intentional concepts, it of course doesn’t follow that the properties that our purely recognitional concepts pick out aren’t physical, functional, or intentional ones. So we can explain the philosophical thought experiments while claiming that phenomenal consciousness is reductively explicable in physical, functional, or intentional terms. Indeed, it increasingly looks to me, and to others, that any would-be naturalizer of phenomenal consciousness needs to buy into the existence of purely recognitional concepts of experience.

            Higher-order perception (HOP) theorists of phenomenal consciousness are well placed to explain the existence of purely recognitional concepts of experience. We can say the following. Just as our first-order analog perceptual contents can ground purely recognitional concepts for secondary qualities in our environments (and bodies), so our higher-order analog perceptual contents can ground purely recognitional concepts for our first-order experiences themselves. The first-order perceptual contents analog-green, analog-smooth and so on can serve to ground the recognitional concepts green, smooth, and so forth. Similarly, then, the higher-order perceptual contents analog-experienced-green and analog-experienced-smooth can serve to ground the purely recognitional concepts of experience, this state and that state. And such concepts are grounded in (higher-order) awareness of their objects, just as our recognitional concepts green and smooth are grounded in (first-order) awareness of the relevant secondary properties.

            Neither first-order (FOR) theories of the sort defended by Dretske (1995) and Tye (1995, 2000), nor actualist higher-order thought (HOT) theories of the kind proposed by Rosenthal (1993, 1997) can give an adequate account of our possession of purely recognitional concepts of experience, however. Or so I shall briefly argue. (For further elaboration of some of these arguments, especially in relation to FOR theories, see Carruthers, 2003.)

            According to FOR theories, phenomenal consciousness consists in a distinctive kind of content (analog or non-conceptual) figuring in a distinctive position in cognition (poised to have an impact upon thought and decision making, say). Such contents are appropriate to ground first-order recognitional applications of concepts of secondary qualities, such as green, smooth, and so on. But what basis can they provide for higher-order recognition of those first-order experiences themselves? The perceptual content analog-green can ground a recognitional application of the concept green. But how could such a content ground a recognitional application of the concept this [experience of green]? It isn’t the right kind of content to ground an application of a higher-order recognitional concept. For if such concepts are to be applied recognitionally, then that means that they must be associated with some analog or non-conceptual presentation of the properties to which they apply. And that means, surely, a higher-order analog content or HOP.

            One option for a FOR theorist here would be to say, as does Dretske (1995), that the higher-order concept applies to experience indirectly, via recognition of the property that the experience is an experience of. On such an account the putative recognitional concept this [experience as of green] is really a concept of the form, my experience of this [green]. But this then means that the concept is not, after all, a purely recognitional one. On the contrary, it is definitionally tied to the concept experience, and also to the presence of greenness. And then we can no longer explain the seeming coherence of the thoughts, ‘This [experience as of green] might not have been an experience, and might not have been of this [green].’

            Another option for a FOR theorist would be to defend a form of brute-causal account, as Loar (1990) seems tempted to do. On this view the higher-order recognitional concept this [experience as of green] wouldn’t have the quasi-descriptive content assumed by Dretske. Rather, applications of it would be caused by the presence of the appropriate kind of experience [as of green] without the mediation of any mental state, and more specifically, without the mediation of any higher-order perceptual state. But this view gets the phenomenology of higher-order recognitional judgment quite wrong. When I judge recognitionally, ‘Here is this type of experience again’, I do so on the basis of awareness of that which my judgment concerns – a given type of experience. I do not, as it were, judge blindly, as the brute-causal account would have it.

            Finally, a FOR theorist might allow that we do have higher-order perceptions (HOPs) to ground our recognitional concepts of experience, while denying that this is what constitutes those experiences as phenomenally conscious ones. (Tye, 1995, sometimes seems tempted to adopt a position of this sort.) On the contrary, it might be said, all first-order analog perceptual contents are phenomenally conscious, but only some of these are targeted by higher-order perceptual contents in such a way as to ground purely-recognitional concepts of experience.

One problem with this proposal, however, is that it requires us to accept the existence of phenomenally conscious states that are inaccessible to their subjects. That is, it requires us to believe that there can be phenomenally conscious states of which subjects cannot be aware. For as I shall note briefly in section 4 below, there is now robust evidence for the existence of perceptual systems whose outputs are inaccessible to consciousness, in the sense of being unavailable to higher-order awareness or verbal report. And it is one thing to claim that there can be phenomenally conscious states that we happen not to be aware of through other demands on our attention (some have wanted to describe the ‘absent minded driver’ type of example in these terms), but it is quite another thing to claim that there are phenomenally conscious states in us that we cannot be aware of, or that we are blind to. This would be very hard to accept.

            Another difficulty with the proposal is that it appears to conflate two distinct forms of subjectivity. Any first-order perceptual state will be, in a sense, subjective. That is, it will present a subjective take on the organism’s environment, presenting that environment in one way rather than another, depending on the organism’s perspective and its discriminatory abilities. Thus any form of perception will involve a kind of subjectivity in the way that the world is presented to the organism. But phenomenal consciousness surely involves a much richer form of subjectivity than this. It involves, not just a distinctive way in which the world is presented to us in perception, but also a distinctive way that our perceptual states themselves are presented to us. It isn’t just the world that seems a certain way to us, but our experiences of the world, too, appear to us in a certain way, and have a distinctive feel or phenomenology. And this requires the presence of some form of higher-order awareness, which would be lacking in the first-order representational (FOR) proposal made above.

            FOR theories face severe difficulties in accounting for our possession of purely-recognitional concepts of experience, then. Similar difficulties arise for actualist higher-order thought (HOT) theory, of the sort defended by Rosenthal (1993, 1997). On this account, an experience gets to be phenomenally conscious by virtue of the subject’s conceptual awareness of the occurrence of that experience – that is to say, provided that the experience causes (and causes immediately, or non-inferentially) a higher-order thought to the effect that such an experience is taking place. But in the absence of any higher-order perceptual contents to ground such higher-order thoughts, this approach provides just another version of the ‘brute-causal’ account discussed above, and suffers from the same difficulties.

            Specifically, actualist HOT theory cannot account for the way in which our higher-order thoughts about our experiences appear to be grounded in some sort of non-conceptual awareness of those experiences. Nor, in consequence, can it explain how purely recognitional (higher-order) concepts of experience are possible in a way that preserves their similarity to (first-order) recognitional concepts of color. Just as my judgments of ‘green’ are grounded in perceptual awareness of greenness (guided in their application by the content analog-green), so too my judgments of ‘this state’ are grounded in awareness of the state in question, which requires that they should be guided by a higher-order perceptual content such as analog-experienced-green.

            According to actualist HOT theory, there is a sense in which my recognitional judgments of experience are made blindly.[5] I find myself making higher-order judgments about the occurrence of experience, but without those judgments being grounded in any other awareness of those experiences themselves. It is rather as if I found myself making judgments of color (e.g. ‘Red here again’) in the absence of any perceptual awareness of color. But higher-order judgment doesn’t appear to be like that at all. When I think, ‘Here is that experience again’, I think as I do because I am aware of the experience in question. I can reflect on the appropriateness of my judgment, given the properties of the experience, for example. This requires the presence of higher-order perceptions of that experience – namely, HOPs.



4          Why some states are phenomenal and some aren’t


One difficulty for both first-order (FOR) theories and actualist higher-order thought (HOT) theories of phenomenal consciousness, then, is that neither can account adequately for the existence of purely recognitional judgments of experience. Another, equally powerful, objection is that neither can explain why some perceptual states are phenomenal and some aren’t. That is to say, neither can give an adequate account of that in virtue of which some analog / non-conceptual states have the properties distinctive of phenomenal consciousness and some don’t. But in order to make this point, I first need to say just a little about the conscious / non-conscious distinction as it applies to perceptual states.

            The evidence for non-conscious perceptual states in all sensory modalities is now quite robust (Baars, 1997; Weiskrantz, 1997). Here let me concentrate on the case of vision. Armstrong (1968) uses the example of absent-minded driving to make the point. Most of us at some time have had the rather unnerving experience of ‘coming to’ after having been driving on ‘automatic pilot’ while our attention was directed elsewhere – perhaps day-dreaming or engaged in intense conversation with a passenger. We were apparently not consciously aware of any of the route we had recently taken, nor of any of the obstacles we avoided on the way. Yet we must surely have been seeing, or we would have crashed the car. Others have used the example of blindsight (Weiskrantz, 1986; Carruthers, 1996). This is a condition in which subjects have had a portion of their primary visual cortex destroyed, and apparently become blind in a region of their visual field as a result. But it has now been known for some time that if subjects are asked to guess at the properties of their ‘blind’ field (e.g. at whether it contains a horizontal or vertical grating, or whether it contains an ‘X’ or an ‘O’), they prove remarkably accurate. Subjects can also reach out and grasp objects in their ‘blind’ field with something like 80% or more of normal accuracy, and can catch a ball thrown from their ‘blind’ side, all without conscious awareness. (See Weiskrantz, 1997, for details and discussion.)

More recently, an even more powerful case for the existence of non-conscious visual experience has been generated by the two visual systems theory proposed and defended by Milner and Goodale (1995). (See figure 2.) They review a wide variety of kinds of neurological and neuro-psychological evidence for the substantial independence of two distinct visual systems, instantiated in the temporal and parietal lobes respectively. They conclude that the parietal lobes provide a set of specialized semi-independent modules for the on-line visual control of action; whereas the temporal lobes are primarily concerned with more off-line functions such as visual learning, object recognition, and action-planning in relation to the perceived environment. And only the experiences generated by the temporal-lobe system are phenomenally conscious, on their account.[6]


Insert figure 2 about here


To get the flavor of Milner and Goodale’s hypothesis, consider just one strand from the wealth of evidence they provide. (For more extensive philosophical discussion, see Carruthers, 2000; Clark, 2002.) This is a neurological syndrome called visual form agnosia, which results from damage localized to both temporal lobes, leaving primary visual cortex and the parietal lobes intact. (Visual form agnosia is normally caused by carbon monoxide poisoning, for reasons that are little understood.) Such patients cannot recognize objects or shapes, and may be capable of little conscious visual experience; but their sensorimotor abilities remain largely intact.

One particular patient (D.F.) has now been examined in considerable detail. While D.F. is severely agnosic, she is not completely lacking in conscious visual experience. Her capacities to perceive colors and textures are almost completely preserved. (Why just these sub-modules in her temporal cortex should have been spared isn’t known.) As a result, she can sometimes guess the identity of a presented object – recognizing a banana, say, from its yellow color and the distinctive texture of its surface. But she is unable to perceive the shape of the banana (whether straight or curved); nor its orientation (upright or horizontal; pointing towards her or across). Yet many of her sensorimotor abilities are close to normal – she would be able to reach out and grasp the banana, orienting her hand and wrist appropriately for its position and orientation, and using a normal and appropriate finger grip.

Under experimental conditions it turns out that although D.F. is at chance in identifying the orientation of a broad line or letter-box, she is almost normal when posting a letter through a similarly-shaped slot oriented at random angles. In the same way, although she is at chance when trying to discriminate between rectangular blocks of very different sizes, her reaching and grasping behaviors when asked to pick up such a block are virtually indistinguishable from those of normal controls. It is very hard to make sense of these data without supposing that the sensorimotor perceptual system is functionally and anatomically distinct from the object-recognition / conscious system.

There is a powerful case, then, for thinking that there are non-conscious as well as conscious visual percepts. While the perceptions that ground your thoughts when you plan in relation to the perceived environment (‘I’ll pick up that one’) may be conscious, and while you will continue to enjoy conscious perceptions of what you are doing while you act, the perceptual states that actually guide the details of your movements when you reach out and grab the object will not be conscious ones, if Milner and Goodale are correct.

But what implications does this have for phenomenal consciousness, as opposed to access consciousness (Block, 1995)? Must these non-conscious percepts also be lacking in phenomenal properties? Most people think so. While it may be possible to get oneself to believe that the perceptions of the absent-minded car driver can remain phenomenally conscious (perhaps lying outside of the focus of attention, or being instantly forgotten), it is very hard to believe that either blindsight percepts or D.F.’s sensorimotor perceptual states might be phenomenally conscious ones. For these perceptions are ones to which the subjects of those states are blind, and of which they cannot be aware. And the question, then, is: what makes the relevant difference? What is it about a conscious perception that renders it phenomenal, that a blindsight perceptual state would correspondingly lack? Higher-order perception (HOP) theorists are united in thinking that the relevant difference consists in the presence of a higher-order perceptual content in the first case that is absent in the second, in virtue of the presence of which a phenomenally conscious state is a state of which the subject is perceptually aware.

First-order (FOR) theories, by contrast, face considerable difficulties on this point. Unless the proponents of such theories choose to respond by denying the data, or by insisting that even blindsight and sensorimotor percepts are actually phenomenally conscious ones, then there is really only one viable option remaining. This is to appeal to the functional differences between percepts of the different kinds in explaining why one set is phenomenally conscious while the others aren’t. For notice that the percepts constructed by the temporal-lobe system are available to conceptual thought and planning, but not to guide detailed movement on-line; whereas the reverse is true of the percepts produced by the parietal system. It is therefore open to a FOR theorist to say that it is availability to conceptual thought that constitutes an otherwise non-conscious perceptual state as phenomenally conscious (Kirk, 1994; Tye, 1995).

If what were being proposed were a brute identity claim, then such a position might be acceptable (or as acceptable as such claims ever are, if what we really seek is an explanation).[7] But proponents of FOR theories are supposed to be in the business of reductively explaining phenomenal consciousness. And it is left entirely obscure why the presence of conceptual thought and/or planning should make such a difference. Why should a perceptual state with the content analog-green, for example, remain unconscious if it is available just to guide movement, but become phenomenally conscious if used to inform conceptualized thoughts (such as, ‘That one is green’ or ‘I will pick up the green one’)? Granted, there is a big difference between thinking and acting. But what reason is there for believing that this difference can explain the difference between phenomenality and its lack?

Actualist higher-order thought (HOT) theory faces essentially the same difficulty. In explaining why sensorimotor percepts aren’t phenomenally conscious, a HOT theorist can point out that while such percepts guide movement, they aren’t available to higher-order thought and judgment. In contrast, the percepts produced by the temporal-lobe visual system are available to conceptual thought and reasoning in general, and to higher-order thought in particular. And the claim can then be made that those perceptual states produced by the temporal-lobe system that are actually targeted by a higher-order thought about themselves are the phenomenally conscious ones.

But why should the presence of a higher-order belief about the existence of a first-order perceptual state render that state phenomenally conscious? Why should higher-order access consciousness generate phenomenal consciousness? The first-order state remains the same, just as it was in the absence of the higher-order thought. (Or if it changes, this will be merely via a shifting of perceptual similarity-spaces, of the sort that is often caused by concept-application - as when I can make the aspect of a duck-rabbit figure alter by applying different concepts to it - not a change from the absence of subjective feel to its presence.) And the higher-order thought in question will characteristically not be a conscious one. It seems that actualist HOT theorists have no option but to advance a brute identity claim, saying that to be a phenomenally conscious state just is to be a perceptual state targeted by a HOT. But this is to give up on attempting a reductive explanation of the phenomena.[8]



5          Does HOP theory face the same objection?


I have argued then (in section 3) that both first-order (FOR) theories and actualist HOT theories face essentially the same problem in explaining how we can have purely recognitional concepts of experience; whereas higher-order perception (HOP) theories are well placed to provide such an explanation. And I have now argued (in section 4) that neither FOR theories nor actualist HOT theory can give an adequate account of the conscious / non-conscious distinction, explaining why some perceptual states are phenomenally conscious while others aren’t. But how well do HOP theories perform in this latter respect? For ease of presentation, I shall now switch to framing the discussion in terms of the form of HOP theory that I actually endorse, namely dispositionalist HOT theory. On this account, phenomenal consciousness consists in the dual perceptual content (both first-order and higher-order) possessed by those perceptual states that are made available to HOT (given the truth of some or other form of consumer semantics).

            Initially, the explanation of the difference between the states produced by the parietal and temporal-lobe visual systems is straightforward. The outputs of the sensorimotor system are first-order analog contents that are used merely to guide movement; and as such they aren’t phenomenally conscious. The outputs of the temporal-lobe system, in contrast, are available to a variety of down-stream consumer systems (such as action-planning), included in which is a faculty for higher-order thought (HOT). And it is by virtue of their availability to this HOT faculty that the first-order analog states that are produced by the temporal-lobe system come to acquire, at the same time, higher-order analog contents (given the truth of consumer semantics). And it is by virtue of having such dual content that the perceptual states in question are phenomenally conscious.[9]

            Here is how an objection might go, however (Byrne, 2001). Even if we don’t have any real examples, it surely could happen that dual-content perceptual states might occur without being accessible to their subjects (e.g. without being available to conscious thought and/or without being reportable in speech). For example, perhaps there could be a separate HOT faculty that monitors the outputs of the sensorimotor visual system for some reason, rendering those states, too, as dual-content ones. Then if I want to say that such states wouldn’t really be phenomenally conscious ones, don’t I have to appeal to functional-role considerations, just as FOR theory did above? Don’t I have to say that phenomenally conscious states are dual-content perceptual states that are reportable in speech, or something of the sort? And then can’t I, too, be charged with postulating a brute identity here, and giving up on reductive explanation?

            An initial reply is that it is extremely unlikely that there should actually be such dual contents that aren’t available to us. This is because higher-order thought doesn’t come cheap. So far as we know, a capacity for it has evolved just once in the history of life on earth, somewhere in the great ape / hominid lineage (perhaps only with the appearance of Homo – see Povinelli, 2000, for a skeptical look at the mentalizing abilities of chimps). The idea that there might be a capacity for HOT attached to the outputs of the sensorimotor system, or embedded someplace within our color-discrimination module or whatever, is unlikely in the extreme. So I don’t think that there are any real examples of non-access-conscious dual-content analog states (in the way that there are lots of real examples of non-conscious first-order perceptual states).

            I concede, however, that it is logically possible that there could be dual-content events that aren’t (in a sense) conscious. But here it is important to emphasize the distinction between phenomenal consciousness and various forms of access consciousness. I bite the bullet, and commit myself to the view that it is logically possible for there to be phenomenally conscious events (analog perceptual states with dual content, hence perceptual states with a subjective dimension) that aren’t access-conscious (that aren’t available for reporting in speech or to figure in decision-making). And I think that intuitions to the contrary are easily explained away. I certainly don’t see why one should define phenomenal consciousness in such a way as to entail access consciousness.

            This is where the fact that there are no real examples of dual-content perceptual states that aren’t also access-conscious becomes important. For within our experience and to the best of our belief these two properties are always co-instantiated. It might be natural for us, then, to assume that the two are somehow essentially connected with one another – especially since imagination, when conscious and reflectively guided, always deploys states that are access-conscious. It is hard for us to imagine a phenomenally conscious state that isn’t access-conscious. But that may just be because any image that we reflectively form is de facto access-conscious, given the way in which our cognitive system is actually structured.

            What matters is not what we can or can’t imagine, but what we can or can’t explain. And my contention is that dispositionalist HOT theory can reductively explain the distinctive features of phenomenal consciousness. In particular, by virtue of their dual analog content, perceptual states that are available to HOT will take on a subjective dimension. They will be both world-representing (or body-representing, in the case of pain and touch) and experience-representing at the same time. In such cases it isn’t just the world that is presented in a certain way to us, but our own experience of that world will also be presented in a certain way to us. And by virtue of such higher-order presentings, we can form purely recognitional concepts targeted on those very experiential states.

            This isn’t the place to set out and explain in detail the way in which dispositionalist HOT theory / dual-content theory can provide a successful reductive explanation of the various distinctive features of phenomenal consciousness. (See Carruthers, 2000.) But in addition to explaining how phenomenally conscious states possess a subjective dimension, this approach can also of course explain how such states possess properties that are available to introspective recognition, and how they can ground purely recognitional concepts, as we saw in section 3 above. We can therefore explain why, to anyone employing such concepts, the so-called ‘explanatory gap’ will seem to be unbridgeable. For such a person will always be able to combine, without incoherence, any proposed theory (including dual-content theory itself) with the thought, ‘But someone might satisfy the conditions of the theory without possessing this kind of state’ (thereby deploying their purely-recognitional concept this). Our account can also explain (in common with other representationalist approaches, it should be said) how phenomenally conscious properties have a ‘fineness of grain’ that gives them a richness well beyond our powers of description and categorization. And it can be shown that people will then be strongly inclined to think of phenomenally conscious states as possessing intrinsic - that is, non-relational and non-intentional - properties; that they will be inclined to think of these properties as ineffable and private; and that they will be inclined to think that they have incorrigible, or at least privileged, knowledge of them.

Although there hasn’t here been the space to develop these points in any detail, it should nevertheless be plain that it is the dual-content aspect of the theory, rather than wider features of functional role (availability to planning and to speech, for example), that does the work in these explanations. This seems to me adequate motivation for the claim that phenomenal consciousness is constituted by dual-content perceptual states, wherever they might occur. To the best of our knowledge such states are also always actually accessible to the reasoning processes and reporting systems of their subjects. But there is nothing in my account of phenomenal consciousness as such that logically requires it.



6          Conclusion


I have argued that amongst higher-order perception (HOP) theories of phenomenal consciousness, dispositionalist HOT theory / dual-content theory is preferable to inner sense theory. I have also argued that HOP theories are preferable to both first-order (FOR) theories and to actualist HOT theory. Neither of the latter can give an adequate account of purely recognitional concepts of experience, nor of the distinction between conscious and non-conscious perceptual states; whereas HOP theories are well placed in both these respects. In the end, then, a dual-content theorist is what everyone ought to be.[10]




Armstrong, D. (1968). A Materialist Theory of the Mind. London: Routledge.

Baars, B. (1997). In the Theatre of Consciousness. Oxford: Oxford University Press.

Block, N. (1986). Advertisement for a semantics for psychology. Midwest Studies in Philosophy, 10, 615-678.

Block, N. (1995). A confusion about the function of consciousness. Behavioral and Brain Sciences, 18, 227-247.

Block, N. and Stalnaker, R. (1999). Conceptual analysis, dualism and the explanatory gap. The Philosophical Review, 108, 1-46.

Botterill, G. and Carruthers, P. (1999). The Philosophy of Psychology. Cambridge: Cambridge University Press.

Byrne, A. (2001). Review of Phenomenal Consciousness by Peter Carruthers. Mind, 110, 440-442.

Byrne, R. and Whiten, A. (Eds.) (1988). Machiavellian Intelligence. Oxford: Oxford University Press.

Byrne, R. and Whiten, A. (Eds.) (1998). Machiavellian Intelligence II. Cambridge: Cambridge University Press.

Carruthers, P. (1996). Language, Thought and Consciousness. Cambridge: Cambridge University Press.

Carruthers, P. (1999). Sympathy and subjectivity. Australasian Journal of Philosophy, 77, 465-482.

Carruthers, P. (2000). Phenomenal Consciousness: A naturalistic theory. Cambridge: Cambridge University Press.

Carruthers, P. (2003). Phenomenal concepts and higher-order experiences. Philosophy and Phenomenological Research, 66.

Carruthers, P. (2004). Why the question of animal consciousness might not matter very much. In B. Baars and S. Franklin (Eds.), Models and Mechanisms of Consciousness. Oxford: Oxford University Press.

Carruthers, P. (forthcoming). Reductive explanation and the ‘explanatory gap’.

Chalmers, D. (1996). The Conscious Mind. Oxford: Oxford University Press.

Clark, A. (2002). Visual experience and motor action: are the bonds too tight? Philosophical Review, 110, 495-520.

Crick, F. and Koch, C. (1990). Towards a neurobiological theory of consciousness. Seminars in the Neurosciences, 2, 263-275.

Dretske, F. (1995). Naturalizing the Mind. Cambridge, MA: MIT Press.

Fodor, J. (1998). There are no recognitional concepts, not even RED. In his In Critical Condition. Cambridge, MA: MIT Press.

Kirk, R. (1994). Raw Feels. Oxford: Oxford University Press.

Levine, J. (2000). Purple Haze. Cambridge, MA: MIT Press.

Loar, B. (1981). Mind and Meaning. Cambridge: Cambridge University Press.

Loar, B. (1990). Phenomenal states. In J. Tomberlin (Ed.), Philosophical Perspectives, 4. Northridge, Calif.: Ridgeview.

Loar, B. (1997). Phenomenal states. In N. Block, O. Flanagan and G. Güzeldere (Eds.), The Nature of Consciousness. Cambridge, MA: MIT Press.

Locke, J. (1690). An Essay Concerning Human Understanding. Many editions now available.

Lycan, W. (1987). Consciousness. Cambridge, MA: MIT Press.

Lycan, W. (1996). Consciousness and Experience. Cambridge, MA: MIT Press.

McGinn, C. (1989). Mental Content. Oxford: Blackwell.

McGinn, C. (1991). The Problem of Consciousness. Oxford: Blackwell.

Millikan, R. (1984). Language, Thought, and Other Biological Categories. Cambridge, MA: MIT Press.

Millikan, R. (1989). Biosemantics. Journal of Philosophy, 86, 281-297.

Milner, D. and Goodale, M. (1995). The Visual Brain in Action. Oxford: Oxford University Press.

Papineau, D. (1987). Reality and Representation. Oxford: Blackwell.

Papineau, D. (1993). Philosophical Naturalism. Oxford: Blackwell.

Papineau, D. (2002). Thinking about Consciousness. Oxford: Oxford University Press.

Peacocke, C. (1992). A Study of Concepts. Cambridge, MA: MIT Press.

Povinelli, D. (2000). Folk Physics for Apes. Oxford: Oxford University Press.

Rosenthal, D. (1986). Two concepts of consciousness. Philosophical Studies, 49, 329-359.

Rosenthal, D. (1993). Thinking that one thinks. In M. Davies and G. Humphreys (Eds.), Consciousness. Oxford: Blackwell.

Rosenthal, D. (1997). A theory of consciousness. In N. Block, O. Flanagan and G. Güzeldere (Eds.), The Nature of Consciousness. Cambridge, MA: MIT Press.

Sturgeon, S. (1994). The epistemic view of subjectivity. Journal of Philosophy, 91, 221-235.

Sturgeon, S. (2000). Matters of Mind. London: Routledge.

Tye, M. (1995). Ten Problems of Consciousness. Cambridge, MA: MIT Press.

Tye, M. (2000). Consciousness, Color and Content. Cambridge, MA: MIT Press.

Weiskrantz, L. (1986). Blindsight. Oxford: Oxford University Press.

Weiskrantz, L. (2000). Consciousness Lost and Found. Oxford: Oxford University Press.

Wittgenstein, L. (1953). Philosophical Investigations. Oxford: Blackwell.


Figure 1: Representationalist theories of consciousness

Representationalist theories of phenomenal consciousness
    First-order theories (FOR)
        (Dretske, Tye)
    Higher-order theories (HOR)
        Higher-order perception theories (HOP)
            Inner sense theory (Armstrong, Lycan)
            Dispositionalist HOT theory / dual-content theory (Carruthers)
        Higher-order thought theories (HOT)
            Dispositionalist HOT theory / dual-content theory (Carruthers)
            Actualist HOT theory (Rosenthal)


Figure 2: The dual visual systems hypothesis

[1] Lycan (1996) describes first-order perceptual states as possessing qualia, irrespective of their targeting by higher-order perception; and the terminology of ‘qualia’ is normally reserved for states that are phenomenally conscious. But I think that what he has in mind is just that first-order perceptual states represent fine-grained colors, textures and so forth; and that those states only acquire a dimension of subjective feel (hence becoming phenomenally conscious) when they are higher-order perceived. At any rate this is what I shall assume in what follows. (Inner sense theory seems to me devoid of interest otherwise.)

[2] For teleosemantics, see Millikan, 1984, 1989; Papineau, 1987, 1993. For functional or inferential role semantics, see Loar, 1981; Block, 1986; McGinn, 1989; Peacocke, 1992.

[3] Another variant on this theme is that according to inner sense theory it ought to be possible for me to undergo a higher-order perception with the analog / non-conceptual content seems orange while I am undergoing no relevant first-order perceptual state at all. (Just as, in the case of hallucination, my first-order senses can sometimes produce a state with the analog / non-conceptual content red, while there is nothing colored in my environment at all.) In such circumstances I would be inclined to make the first-order spontaneous judgment that I see nothing colored, while at the same time saying that I have an experience that seems orange to me. This combination of judgments seems barely coherent. Note, too, that similar problems can arise for actualist HOT theory; see Levine, 2000.

[4] Proponents of the existence of such concepts are then committed, of course, to rejecting the (quite different) arguments put forward by Wittgenstein (1953) and Fodor (1998) against the very possibility of purely recognitional concepts. Fortunately, neither set of arguments is at all compelling, though I shan’t attempt to demonstrate this here.

[5] By this I don’t mean that my higher-order judgments are non-conscious. For this isn’t problematic. It is granted on all hands that the higher-order representations that render our first-order percepts conscious aren’t themselves conscious ones, in general. Rather, I mean that for actualist HOT theory, higher-order judgments of experience aren’t grounded in awareness of their objects; which debars them from counting as genuinely recognitional.

[6] Note that this isn’t the old and familiar distinction between what and where visual systems, but is rather a successor to it. For the temporal-lobe system is supposed to have access both to property information and to spatial information. Instead, it is a distinction between a combined what-where system located in the temporal lobes and a how-to or action-guiding system located in the parietal lobes. And note, too, that the two-visual-systems hypothesis has the resources to explain the blindsight data.

[7] Block and Stalnaker (1999) argue that identities aren’t the kinds of facts that admit of further explanation. Consider the identity of water and H2O, for example. If someone asks, ‘Why is water H2O?’, it looks like we can only reply (vacuously), ‘Because it is’. You can’t explain the identity of water and H2O. Rather, identity facts are brute ones (not further explicable). Now, it is true that identities can’t be explained as such. But it is also true that, if the identity is to count as a successful reduction of the higher-level property involved, then it must be possible to deploy features of the property that figures on the reducing-side of the identity-claim in such a way as to explain the features distinctive of the property on the other side (the reduced property). Consider the identity of water and H2O again. Don’t we think that it must be possible to deploy facts about H2O, as such, in order to explain the distinctive properties of water – why it is colorless and odorless; why it is liquid at room temperature; why it boils at 100° Centigrade; and so forth? Likewise, then, with phenomenal consciousness. A postulated identity, here, can only be acceptable if we can deploy the properties involved in such a way as to explain some of the distinctive features of phenomenality.

[8] For discussion of the demands placed on successful reductive explanation in general, and as applied to reductive explanations of phenomenal consciousness in particular, see Carruthers (forthcoming).

[9] Am I here holding dispositionalist HOT theory to a standard less demanding than that just imposed upon actualist HOT and FOR theories? No, because the dual-content idea can reductively explain the various features of phenomenal consciousness, particularly the latter’s subjective aspect and the way in which it can ground purely recognitional concepts of experience.

[10] Thanks to Zoltan Dienes, Rocco Gennaro and Bill Lycan for comments on an earlier draft of this chapter.