Monday, May 9, 2011
About That "Empathy Test" Of Yours
There's been some PKD activity on the Intertoobz lately, and I really need to write up a news roundup article, but not today; I want to share something cool a student hipped me to in Do Androids Dream of Electric Sheep?. Yes, it's that time of the semester again, when I read literally about a hundred essays on DADOES?. Most are pretty simplistic and usually go over one of the topics I suggest during our class lectures. Last week, a student approached me and said she wanted to write about the Voight-Kampff test in the novel; specifically, she found it ironic that the questions themselves are not the least bit empathetic. I was stunned by the power of her discovery, and a little amazed that I had missed it.
Indeed, if you read the test questions Rick asks Rachel for instance, it's clear that there is nothing empathetic about the test itself. Rick describes a situation to Rachel: "In a magazine you come across a full-page color picture of a nude girl" (49). This is when Rachel gets the great line about testing to see if she's an android or lesbian - a line so great it even made it into the movie. But then it gets weird; Deckard continues: "Your husband likes the picture."
Uh, Rick... she doesn't have a husband. The questions are impersonal and in many cases seem totally unrelated to the expressions of empathy someone might make in their day-to-day life. So this test and its administrators, which seek to detect empathy, make no attempt to tailor the questions to the subject. Martin Luther King Jr. wrote in his "Letter from Birmingham Jail" that the ends are preexistent in the means. And here I can't help but think that the apathetic, clinical attitude of the test and its administrators creates a situation in which the androids' apathetic outlook has spread like a virus to the very people charged with eliminating this apathy.
This is yet another layer of irony in a deeply ironic book. Look, all of the characters transcend their identity in one way or another: the androids are empathetic, the humans are apathetic, the supposedly chickenheaded John Isidore shows an appropriate reverence for life; Deckard, who keeps talking about how much he wants a real animal, can't take proper care of the animals once he acquires them.
It reminded me of this interview I did way back in the day with Lethem, where he said:
"On the other hand “Do Androids…” has “Blade Runner” attached to it. I think in terms of the role that Dick has taken in terms of the popular imagination it’s an important connection… I reread [each of the four novels] carefully and “DADoES” is the one book I’ve been underrating. It struck me as totally controlled and emotionally precise..."
I'm, once again, appreciating the care with which Dick wrote Do Androids Dream of Electric Sheep?. I have a new theory: perhaps DADOES marks the transition in Dick's writing career from the second-draft masterpieces of the early 60s to the multiple drafts and Herculean efforts he put into his later books. Flow My Tears, The Policeman Said went through how many drafts and revisions?
Anyway, all that effort pays off brilliantly in DADOES?, a book that continues to open up to deeper and deeper analysis and interpretation. Even after teaching the novel to thousands of students, I'm still amazed by what I find inside.
(Thanks for the insight, Lisa Casale!)
10 comments:
Fascinating reading. I've often thought of this scene as a comment on what happens in the emotional system of the reader of a text who watches people go through experiences and possibly empathizes, or is offended, or whatever. Rickels goes into the Jungian method of testing affect which was apparently a big influence on the empathy test. I hadn't considered the fact that the questions are hypotheticals to be a problem since the test is measuring the human ability to empathize with a fictional situation happening to unreal strangers, something that I expect would be difficult for a computer to model. But to look at this as a moment that opens into a dystopian reading suggesting that flaws in the test are what leads to the problems being dramatized--brilliant. Thanks for posting this gem of student work.
Lisa is quite right. And I think there is yet a lot more to be discovered in Do Androids Dream Of Electric Sheep? by CAREFUL readers, including the fact that the title is an interrogatory. But does the novel give a satisfactory answer to the question of the title?
I always found it interesting how the test was set up. It detects androids not by their answers, but by the quickness of their reactions, which they can neither control nor even perceive. While I agree with most of what you wrote, I still think some of the questions involve empathy. For example, the question about the boiled dog (I think it was in both the novel and Blade Runner) would get a completely different reaction than one about, say, fried chicken.
I don't quite understand what you mean about the test not testing empathy. Why does it matter that Rachel doesn't have a husband? Empathy is the act of putting yourself in someone else's shoes, and not being able to relate to a fictional scenario would itself be a failure of empathy.
There's definitely something to the fact that the people giving the test are very unempathetic in the way they treat the replicants, but I'm not convinced that the tests themselves don't test empathy.
Giospurs,
It's not that the test fails to test for empathy; rather the problem here is that the empathy test, used to determine who should live and who should be retired, is administered without empathy.
I see your point that any empathy test should ask about an unfamiliar situation since those questions would necessarily force the respondent into the mind of another.
But there is a latent assumption being made by the testers: lack of empathy is one of the things they look for as an indication a subject may either be an android or a human suffering from flattening of affect.
The very way the test is designed indicates an inability among the test designers to empathize with a test-taker. Thus, the test makers fail their own test.
I thought that the questions that aren't set in specifically empathetic situations were designed to set the machine and register a reference point, and to develop a pace and rapport with the subject. Then the administrator gets to the "empathetic questions" at appropriate moments.
I would draw your attention in this debate (which is fascinating) to a point in Carrere's biography of Dick, in which he elucidates several interesting points regarding empathy as an interrogative category for determining someone's human-ness. In his discussion of the Voight-Kampff machine, specifically regarding the influence of the Turing test, Carrere writes that "Turing would surely have found this additional criterion [empathy] laughable...he would have pointed out that plenty of humans are incapable of charity and that in theory nothing prevented someone from programming a machine to carry out behaviours that one would normally attribute to charitable feeling." (Carrere, 133)
I find this illuminating with regard to the Voight-Kampff test: the test, I would argue, is not designed to test empathy, which, after all, can be feigned. It is designed to test the ability to understand the test. In short, whatever the actual reaction is remains more or less moot; it is the fact of the reaction which is indicative, on a phenomenological level. After all, Deckard lives in fear of confusing a human and an android, which suggests that much of the result of the test is intuitive. It depends on Deckard's decision, upon his reaction to the fact of the reaction. A brain-damaged human might fail the test, and a particularly well-designed automaton might pass it, but it is the response to the test itself, not to the individual questions, which is the most important element of the Voight-Kampff machine.
I would also add, as an afterthought, that such a reciprocal system formed between the interviewer and the interviewee forms a closed system, which of course, as everyone here knows, the second law of thermodynamics tells us must inevitably tend toward entropy; hence Deckard's perpetual fear of an erroneous diagnosis of an android. It is also a typically PhilDickian model for ascertaining someone's veracity as a human being: something bound by its very parameters to decay and decompose.
The test is really a MEMORY test. Recalling the memory - OR NOT - is the test. Is it a real memory or was it implanted? Do you even remember?
THAT IS THE EMPATHY TEST.
Can you pass the Dick Voigt Kampff test?
(--) IS THAT THE FIRST QUESTION?
(--) ARE YOU A SENTIENT BEING?
(--) CAN YOU GIVE ME YOUR NAME?
(--) DOES MONEY GROW ON TREES?
(--) HOW DO YOU FIND A NEEDLE IN A HAYSTACK?
(--) WHAT IS THE SOUND OF ONE FINGER SNAPPING?
(--) WHAT DO YOU THINK OF THE TEST SO FAR?
(--) CAN YOU AFFORD TIME FOR A SMILE?
(--) WHAT MAKES YOU LAUGH AT YOURSELF?
(--) CAN YOU MAKE YOURSELF FEEL HAPPY?
Did you pass?
It seems people miss the point about the boiled dog: androids can simulate empathy. The boiled-dog incident in the question was a play in a theater, not a real event. Thus, no empathic reaction should be necessary. But humans are not rational, so they react anyway.