Weird Computer-Generated Quiz Produces Customer Service Fail

Jay Stanley,
Senior Policy Analyst,
ACLU Speech, Privacy, and Technology Project
June 22, 2012

I lost my credit card yesterday and had a very telling experience on the phone with American Express trying to get it replaced. After I gave them various pieces of information, the customer service agent said they would ship me a new card to the billing address on file. Just when I thought I was done, she then read something to the effect of, “For security purposes, I am going to ask you a question. The information this question is based on is not connected to your account, but was obtained from third-party information services.”

She then asked, “Which of the following companies have you been associated with?” and named four companies, none of which I had ever heard of in my life, much less been “associated” with. I chose option five, “none of these companies.” The woman paused to wait for her computer and then said, “It says that answer is wrong.” At my request she re-read me the list. I still didn’t recognize any of these supposed companies. Then she told me that because the system said my answer was wrong, she couldn’t issue me a new card until I went through some other rigamarole involving calling them back from a certain phone number listed in their records.

I tried arguing with her about this very strange turn in the conversation, pointing out that they were already shipping the card to the billing address, and that I had verified other information, and wasn’t that enough? She said it wouldn’t allow her to issue me a card because I had failed the test. I told her the information the questions were based on was incorrect, but that made no difference.

Clearly what was going on was that she had been delivered a computer-generated quiz about me derived from information obtained from a third-party data broker such as ChoicePoint or LexisNexis. (The New York Times recently published a profile of one such company, Acxiom.)

Maybe others are familiar with this procedure but it was certainly new to me. I thought the experience was interesting for several reasons:

  • American Express does not appear to be concerned about freaking out their customers by making it obvious to them just how much information unrelated to the provision of their service the company is collecting.
  • One possibility is that the computer-generated quiz was using a formal corporate or parent-company name, or a name that had since been changed, which kept me from recognizing it. But at least as likely is that the information was just wrong. The information collected by these companies has been found to be highly inaccurate; in a study by PrivacyActivism, two out of 11 test subjects were reported to be corporate directors of companies that they had never heard of (hmmm…).
  • The possibility that I was being denied a service (at least, temporarily) and inconvenienced due to erroneous information held by a third party is an ominous indicator. It was a minor matter for me, but still—I never asked a bunch of companies to go out and start building a file of (quasi-accurate) information about me. I have no business relationship with those companies or leverage over them. The mindless bureaucratic “test” I was given was just a small peek into a whole secret world of data collection that is going on behind the scenes—and a reminder that that world can have consequences for our lives. If reliance on their information becomes more widespread and more serious, those consequences will increase.
  • Finally, note that in terms of my dealings with American Express, there was little difference between computer agent and human agent. I had earlier been bounced from the automated voicemail tree because it couldn’t handle my situation (I didn’t know my card number). Yet the fact that I was interfacing with a human being made zero difference in my treatment; I was still effectively trapped in a computerized decision tree. This employee had no discretion to dispose of my case outside the parameters of what her computer allowed. Her computer was not a tool that extended her brain; her brain was merely a translator between me and the computer algorithm, which was very much in charge. No doubt the quiz given to me had also been prepared by a computer, and the information on which it was based compiled by a computer at Acxiom, ChoicePoint, or a competitor. This question of the boundaries between the human and the computer is an interesting and potentially consequential one, which recently came up in the context of Google antitrust issues.

In the end, the problem got resolved. While at first the agent said she could not ask me a new set of questions, eventually her computer seemed to relent, and after once again robotically reading the little speech about information “obtained from third-party information services,” she presented me with a list of street addresses (no city or state) and asked me to pick the one where I had once lived. I dimly recognized one of the addresses as an apartment I inhabited for nine months in grad school in the 1990s, and—I passed the test! American Express’s computer was happy at last.