Yesterday I wrote about receiving a robo-call that was so slickly done that for several minutes I didn’t even realize I was talking to a computer. I mulled over whether the use of such technology raises any serious privacy or other ethical issues. Not really, I decided—at least none that I could see.
I’ve been thinking further about the implications of such technology, however, and I do think it may spark certain power struggles between individuals and institutions. As I said yesterday, if bots can be used, they will be used. But what’s the definition of “can”? There will be not only technological and practical limitations to the use of verbal bots, but also limits to what we humans will tolerate.
Think about the “attention economics” of the situation. The fact that companies can initiate and sustain a conversation with us at much, much lower cost (because they don’t have to hire a human to do it) will drive wider and wider use of the technology. When something suddenly gets cheap and easy, it usually gets over-used; we also see that with drones, cell-phone location tracking, and other technologies. And with spam: when the marginal cost of a communication falls to nearly zero, it is predictable that it will begin flooding the tragic commons of the public’s attention.
Attention economics is based on the insight that our attention is a limited commodity—in many situations, especially those involving computers, it is the single most limited thing. Robots’ new abilities to hold a conversation will open up new opportunities for companies and others to try to sap some of that precious attention.
And what is the purpose of conversation? Ultimately, it’s to either convey or extract information. Here, the first will mean just another avenue for advertising (as in the robo-call I received). The second will mean privacy-invading prying for marketing or other purposes. It’s already annoying when companies direct their cashiers to exploit your presence at the register to try to pry extraneous marketing information, such as your zip code, out of you. Without doubt companies will push people’s limits to see what else they can get people to give up, especially exploiting situations where they have a captive audience.
And of course, companies will not just bug us for attention; they will also exploit the limits on that attention to their own advantage. That is the central problem with contract law as applied to click-through licenses and the like: they are based on an implicit assumption that our attention is an unlimited commodity. They exploit the difference between that legal fiction and the actual, real-world limits on that commodity (the fact that nobody has time to read these agreements) to gain advantage over consumers.
I think what we’re really going to need is a bot of our own to handle all this. We’re going to need a defense mechanism against both the demands on our attention and the demands for information about us, whether they come in the form of old-fashioned telephone calls or the myriad other arenas where the technology may be used. Maybe the market will produce such bots and we'll enter an arms race of dueling, protective shells of bots working to extract information from others while ensuring that not a moment of their masters' time is wasted. After all, if you’re calling a company with a problem, why should you have to answer some customer service robo-agent’s basic questions, when you could easily have a bot do that?
While we’re at it, why should the two bots (yours and the company’s) even speak English to each other when they could exchange much of the relevant data in the flash of an eye in XML?
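To make the idea concrete, here is one way such a bot-to-bot exchange might look, sketched in Python using the standard library’s XML tools. The element names and fields are invented for illustration; the point is only that structured data can replace several minutes of synthesized small talk.

```python
import xml.etree.ElementTree as ET

def build_request(account_id, issue):
    """Your bot's side: serialize a customer-service request as XML."""
    root = ET.Element("service_request")       # hypothetical schema
    ET.SubElement(root, "account").text = account_id
    ET.SubElement(root, "issue").text = issue
    return ET.tostring(root, encoding="unicode")

def parse_request(xml_text):
    """The company bot's side: extract all fields in one pass."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

msg = build_request("A-1234", "package not delivered")
print(parse_request(msg))
# The whole "conversation" completes in a single round trip.
```

The exchange that would cost a human caller several minutes of answering basic questions reduces to one message and one parse.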
Of course, now we’re back in the thick of privacy issues. You’re going to want to make sure that your bot knows what to reveal about you, and when, and to whom, and what not to. We may be faced with the choice of delegating those decisions to an agent, or constantly having to spend some of our precious attention making sure our privacy is protected. Attempts to automate privacy analysis have not been successful so far, but perhaps improved AI will change that.
With attention an ever-more-precious commodity in a bot-filled world, power struggles may take place not just between individuals and large faceless institutions, but also between individuals. People may start having bots answer their phone calls, asking a few introductory questions and patching through only those calls where the owner is likely to want to chat. Everyone can have their own robo-receptionist screening their calls (perhaps based not only on content but also on voice recognition, just like a real secretary). But people may find it annoying to have to deal with other people’s bots. Perhaps they will even start having their own bot initiate calls. Then it will be like two rival, egotistic CEOs or heads of state, neither wanting to be the first to get on the phone.
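The robo-receptionist’s screening logic could be almost trivially simple. Here is a minimal sketch; the caller whitelist, the blocked topics, and the idea of matching callers by voice recognition are all placeholders for whatever rules an owner might actually configure.

```python
# Hypothetical screening rules, standing in for voice recognition and
# whatever preferences the bot's owner has configured.
KNOWN_CALLERS = {"mom", "alice", "bob"}
BLOCKED_TOPICS = {"survey", "special offer"}

def screen_call(caller, stated_topic):
    """Return True to patch the call through, False to take a message."""
    if caller in KNOWN_CALLERS:
        return True                      # friends and family always get through
    if stated_topic.lower() in BLOCKED_TOPICS:
        return False                     # attention-sappers stop at the bot
    return False                         # unknown caller, unknown purpose

print(screen_call("mom", "dinner"))           # True
print(screen_call("telemarketer", "survey"))  # False
```

Of course, the interesting (and contentious) part is not this logic but who writes the rules, and how much of the owner’s life the bot must know to apply them.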
The other day, shortly after receiving the robo-call, I got another call, this one from a delivery company seeking to confirm my address after their driver couldn’t find it. This was definitely a human, and in the course of our short conversation we got sidetracked once or twice and shared a laugh about the vagaries of newbie deliverymen. I appreciated the warmth and humor of the lady on the other end of the line as never before.
I said above that conveying or extracting information is the ultimate purpose of a conversation. Of course among humans there are other purposes as well, such as giving comfort to a friend, or just enjoying their voice and their presence without regard to what is actually discussed. Or just enjoying a moment of humor with a stranger.
While bots will no doubt have their uses, any attempts to replicate these dimensions of human life will be repugnant. And their efficiencies will bring certain losses—the increasing use of conversational robots may make real human interactions rarer, but it will also make them even more valued.
I don’t know how much of all this is just a flight of fancy, but that’s certainly one way things could go, and I bet we will see at least certain elements of these dynamics.