Dirty chat robots
Just as seeing is a misnomer when it comes to machine vision, so the other human senses (hearing, smell, taste, and touch) don't have exact replicas in the world of robotics. Where a person hears with their ears, a robot uses a microphone to convert sounds into electrical signals that can be digitally processed. It's relatively straightforward to sample a sound signal, analyze the frequencies it contains (for example, using a mathematical descrambling trick called a Fourier transform), and compare the frequency "fingerprint" with a list of stored patterns. If the frequencies in your signal match the pattern of a human scream, it's a scream you're hearing—even if you're a robot and a scream means nothing to you.
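To make the idea concrete, here is a minimal sketch of that fingerprint-matching approach in Python, assuming NumPy is available. The sample rate, the synthetic "scream" signal, and the stored patterns are all invented for illustration; a real system would use far richer features than three peak frequencies.

```python
import numpy as np

SAMPLE_RATE = 8000  # samples per second (hypothetical)

def fingerprint(signal, n_peaks=3):
    """Return the n strongest frequencies (in Hz) found in a signal."""
    spectrum = np.abs(np.fft.rfft(signal))       # the Fourier transform step
    freqs = np.fft.rfftfreq(len(signal), d=1 / SAMPLE_RATE)
    strongest = np.argsort(spectrum)[-n_peaks:]  # indices of the biggest peaks
    return sorted(freqs[strongest])

def classify(signal, patterns):
    """Match a signal's fingerprint to the closest stored pattern."""
    fp = fingerprint(signal)
    return min(patterns, key=lambda name: sum(
        abs(a - b) for a, b in zip(fp, patterns[name])))

# One second of a fake "scream": strong 1000 Hz and 1500 Hz components.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
scream = np.sin(2 * np.pi * 1000 * t) + np.sin(2 * np.pi * 1500 * t)

# Invented fingerprints for two sound categories.
patterns = {"scream": [500.0, 1000.0, 1500.0], "hum": [50.0, 100.0, 150.0]}
print(classify(scream, patterns))
```

The point of the sketch is the pipeline, not the numbers: sample, transform to frequencies, compare against stored fingerprints. The robot never "hears" anything; it just finds the nearest match.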
What would it take to make a general-purpose robot similar to a human?
We experience the world through our five senses, but what about robots? Humans are seeing machines: estimates vary wildly, but there's general agreement that about 25–60 percent of our cerebral cortex is devoted to processing images from our eyes and building them into a 3D visual model of the world.
Now you might think machine vision is really quite simple: all you need to do to give a robot eyes is to glue a couple of digital cameras to its head. But capturing images is only the easy part; making sense of them is where the real work begins.
In psychology (the science of human behavior) and in robotics, these things are called perception (sensing), cognition (thinking), and action (moving). For example, robot welding arms in factories are mostly about action (though they may have sensors), while robot vacuum cleaners are mostly about perception and action and have no cognition to speak of.
As we'll see in a moment, there's been a long and lively debate over whether robots really need cognition, but most engineers would agree that a machine needs both perception and action to qualify as a robot.
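The perception–action split can be sketched as a simple control loop. The toy below, in the spirit of a robot vacuum cleaner, senses and acts but has no cognition to speak of; the one-dimensional "room," the sensor, and the behavior are all invented for illustration.

```python
# A toy sense-act loop: perception and action, no cognition.
# The world is a one-dimensional row of floor tiles ending in a wall.

def perceive(world, position):
    """Sensing: is there an obstacle directly ahead?"""
    return world[position + 1] == "wall"

def act(obstacle_ahead, position):
    """Acting: back up at a wall, otherwise roll forward."""
    return position - 1 if obstacle_ahead else position + 1

world = ["floor", "floor", "floor", "wall"]
position = 0
for _ in range(5):  # five ticks of the sense-act loop
    position = act(perceive(world, position), position)
print(position)
```

Each tick, the robot senses, then moves: no model of the room, no plan, just a reflex. Adding cognition would mean putting a "think" step between `perceive` and `act`, which is exactly what the debate mentioned above is about.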