This week's post is from Vladimir Krstic (a philosopher at the United Arab Emirates University) on his recently published paper Lying by Asserting What You Believe is True: a Case of Transparent Delusion (Review of Philosophy and Psychology).
Imagine that I tell you that I believe that I am Hitler but that I am not Hitler since he is dead and I am not. You would probably think that you did not hear me correctly. But, if — upon your request for clarification — I tell you specifically that I believe that I am Hitler but that this is not possible and that, thus, my belief is obviously false, you might think that I am toying with you. Many people, following Moore, think that these claims are absurd.

However, this impression is misleading. One can confidently believe that p and ascribe this belief to oneself, while judging that not-p. This is what happens to transparently delusional people. They suffer from a delusional belief; they correctly ascribe this belief to themselves, but they also judge that the belief is false. There are also cases of non-pathological yet transparently irrational beliefs that people correctly ascribe to themselves while judging that the opposite proposition is true.
In this paper, I argue that analysing this condition generates many important insights regarding the nature of lying and sincerity. Most obviously, it suggests that people can lie by asserting propositions they confidently and consciously believe are true, so long as they concurrently judge them to be false. More importantly, it suggests that the judgement about whether p and the self-ascribed belief about whether p can come apart, and that judging that p does not entail believing that p. These hypotheses allow us to better understand many phenomena that seem paradoxical, such as cases of self-deception that involve a tension between the person’s professed belief and her behaviour.
Two ‘faults’ in one’s thought-evaluation processes explain transparent irrationality. Briefly, people evaluate their thoughts in two simple steps. In the first step, they purge the thought of all contextual or narrative associations (e.g., emotional importance) — the so-called thought-decontextualisation. Once the thought is decontextualised, the person can effectively run it through different domains of reality to test it for consistency (i.e., to see how probable it is considering various aspects of the world) — the so-called reality test (the second step). A thought that passes both steps is marked as correct.
Transparently irrational thoughts are caused and maintained by an error in one or both of these steps; as a result, the thought passes the test when it should not. A step-1 error occurs when the thought is not properly decontextualised, so that certain subjective associations interfere with its reality-testing. Thus, some parents cannot believe that their child is abusing drugs even while looking at the child’s bong. A step-2 error occurs when the thought is not rendered through the relevant domain of reality, and so it is never put together with the judgement to the contrary in a way that would result in the thought’s revision. I consciously believe that I am Hitler and judge that I am not Hitler because, due to a step-2 failure, I cannot realise that my judgement is a reason to revise or abandon my belief.