Monday, October 28, 2013

Your Robot Friend Loves You So Much



 The message came through on our answering machine that evening, as my son and I sat downstairs.  It was an announcement from my big guy's high school, telling us about an event that would be occurring the next day.

The message wasn't live, but it also wasn't recorded.  It was a droning, artificially-generated voice, speaking the words that had been typed into it by some administrator.  Or perhaps it was just a mass-call system reciting an email in that peculiar, lilting automated monotone.

As invitations go, it was peculiarly uninviting, a bit like getting a message from the Borg asking you to join them for an evening of live music, coffee, and assimilation into the Collective.

Because there are things, frankly, that machines just can't do, no matter how much more efficient they might be.

Like, for example, this bizarre oddment I encountered the other day.  I'm fascinated by bots and AI, and in my noodling about on the net looking for new bot-stuff, I found...well, this.  It's called "empathynow," and it's a fledgling web service based on a Loebner Prize-winning chatbot.  The business model here is a simple one.  You need encouragement?  You have no-one to listen or give you that little boost?  Well, you can just send "Chip" a text.

Sad?  Just text Chip.  Chip will text you right back with a word of encouragement.  Need affirmation?  Just text Chip.  Chip will affirm you.  Trying to work on losing a few extra pounds and need someone to hold you accountable?  Chip will keep asking you how you're doing, in a nonjudgmental way.

All for a small monthly fee.

The human desire behind it...to ensure that no-one feels alone and uncared for...is an admirable one.   But I can't escape the feeling that as business models go, this one is desperately, horribly sad.

Are we so isolated and hungry for something to affirm us that we'd turn to an unfeeling automaton to simulate empathy?

I wish that I could answer that question in the negative.



