The Faith in Qualia
The biases and cognitive missteps that occur when someone defends a belief in God are similar to those of someone who believes in qualia. If you question God-believers, they will justify their belief by saying, "I just know it to be so," or they may refer to some deeply spiritual experience in their past. Likewise, if you ask consciousness-believers why they believe in qualia, i.e., the felt sense of "knowing that blue is blue," they will emphatically say, "I just know it to be so."
Vouching for the authenticity of conscious experiences is a memory-retrieval act, similar to vouching for the authenticity of God. The God-believer professes a belief by recalling a placeholder for that belief. Those placeholders are constructed from memories that are about as reliable as witness testimony in trials, which is hazy at best. A witness recalling the scene of a crime will often insist, "It's like it was yesterday." But psychology studies keep showing that our brains readily fill in the blanks to create a convincing portrait of a memory. Likewise, when someone summons their understanding of the color blue, they retrieve a similar placeholder: "The last time I inspected the qualia of blue, it was something special." Repeat this enough times, and you'll have conviction.
We might have as much free will as someone doing improv
Our minds retroactively ascribe a deliberate will, a guiding hand, to our actions, when really they were as unscripted as an improv scene.
We should be able to prove the materiality of consciousness soon thanks to AI
We'll find out soon enough whether Daniel Dennett was right that there is nothing special about consciousness. As we climb the artificial intelligence ladder, we should start to see basic forms of consciousness in our machines. We get a funny feeling of consciousness when we watch Deep Blue defeat Kasparov at chess, or when we imagine the grid computing behind Siri; we feel that some intelligence is at work. But so far, our sense of a "ghost in the machine" isn't the same as watching a squirrel pause and scan its surroundings. We feel that there is some kind of consciousness in the squirrel, albeit a primitive one, even if it isn't ours. Or take an even simpler creature, a starfish. When we poke it and it curls up, we sense some kind of consciousness: it felt something, as if the reaction were its way of saying "ouch." So if we're indeed making progress toward a generalized artificial intelligence, we should eventually be able to poke at our machines and sense a similar reaction.