
Thought in Action


Today's post is by Barbara Gail Montero.

I’m a philosophy professor at the City University of New York (with a rather unusual background: prior to studying philosophy, I worked as a professional ballet dancer for a number of years). My book, Thought in Action: Expertise and the Conscious Mind (Oxford University Press), challenges the widely held view that, once you are good at something, thinking about your action as you’re doing it hampers your skill.



In it, I argue that experts think in action—consciously, not merely unconsciously—and that, when they are thinking about the right things, this in no way diminishes their prowess.

One of my goals in the book is to dispel various mythical accounts of experts who proceed without any understanding of what guides their actions. Those chicken sexers that philosophers are fond of citing, who can’t explain how they make their judgments—they don’t exist. Coleridge’s “Kubla Khan,” which supposedly came to him fully formed in a dream, actually took nearly ten years to write. Kekulé’s well-known story that in 1862 the idea of the ring structure of benzene came to him in a flash after he dreamed of a snake biting its tail is contradicted by his own lesser-known written account that the theory was formed in 1858. Such stories, I argue, are attractive but misleading.

I also critically analyze research (in both philosophy and psychology) that extrapolates from everyday skills to draw conclusions about expert performance. I argue that experts’ extended analytical training, as well as the relatively higher stakes involved in expert action, makes quotidian tasks (such as everyday driving) different enough from expert-level actions (such as professional race-car driving) that extrapolation from the former to the latter is unwarranted. Extended deliberate training, I argue, enables experts to engage their self-reflective capacities while performing, without any detrimental effect; it allows them to think and do at the same time.

Why are so many attracted to the precept that thinking hinders high-level performance, even though it is wrong? One reason I address in the book is that expert actions often appear effortless; yet appearances can be deceptive. Another is that poor performance may frequently coincide with acting in a deliberate rather than automatic way, but it could be that the poor performance causes one to step back and think, rather than the other way around. Moreover, as I also point out, the idea that expertise is effortless may be popular for the same reason that diet books telling you that you can eat as much as you want, as long as you avoid one somewhat arbitrary category of food, are popular: neither view is correct, but both are easy to follow.

Support for my position is multipronged. I rely on case studies for in-depth explorations of specific examples of expert action, such as a head nurse’s description of working in the emergency room, a tennis player’s account of competing in a grand slam, or a chess player’s experience of playing speed chess. I examine and draw inferences from empirical research that identifies psychological factors experts themselves see as conducive to optimal performance in high-pressure situations. And I consider what we should expect would be true about expert action given what we know about experts and the way they train. What can we infer, for example, about expert performance given that experts practice in a thoughtful, analytic manner? How does the often-obsessive drive to improve affect the state of an expert’s mind when she is exercising her skills? And what follows from the need to occasionally take risks in performance?

The starting assumption of many researchers in this area seems to be that when experts are performing at their best, self-directed attention is harmful rather than helpful. Studies are then devised to test this preconception. Yet such studies sometimes seem better designed to substantiate this contention than to question it. If nothing else, I’ll consider my book a success if it creates a fissure in this approach.

For further discussion of the book, see the OUP blog.
