Cognitive chips and candy bars

Halloween combines the fanciful with the traditional. Today I’ll keep the fanciful angle but shove aside the traditional in favor of the futuristic.

This will involve looking at a gray area between the living and the non-living. No, I’m not talking about ghostly notions from a haunted house, where spirits inhabit a shadowy gap between this world and the next. I’m talking about computers. Computers may actually become capable of thinking someday, at which point they’ll have their own esoteric gap to inhabit.

IBM recently unveiled what it calls a “cognitive” chip, dubbed the “SyNAPSE” chip. The parallels to the human brain are not coincidental here; they are explicit and emphasized. The chip has 5.4 billion transistors, and it’s a step toward “holistic computing intelligence” in the field of “neurosynaptic” chips.

In the computer field, it doesn’t get more authoritative than IBM. So it’s possible that the age of the electronic brain might indeed be upon us.

If that day comes, hopefully the technology will let scientists aim the cyber-brainpower at important tasks, such as curing cancer and other ailments.

But I have no idea how scientists do such things, so all I can do is contemplate the more immediate realm of human interaction and how we might regard the gig.

Since, heretofore, anything that thinks has been considered alive, will a thinking computer be regarded as a life form? Until the real arguments come along, it will make a dandy topic for a term paper.

All sorts of fancy words will swirl around such discussions: sentience, calculation, awareness, consciousness, and so on.

In the span of just a few decades, we’ve seen computers go from office business machines to personal companions in the form of smartphones.

For example, when I had a restaurant lunch last week I noticed that of the nine patrons, eight had smartphones for their dining companions. If you had told me, back in my corporate power-lunch days, that people would be lunching with computers instead of with other people, I would have told you to cut back on your noontime martinis.

But as deep as they’ve bored into the modern psyche, smartphones are still in their Pac-Man days compared to what we can probably expect from tomorrow’s thinking computers. Outside of smartphones, I’ll bet that all sorts of new things will come along, computer-driven things that will seem obvious once people start using them, but which we can’t envision now.

But it’s easy to envision computers that eventually program themselves, at least within certain parameters, and in doing so develop their own particular personalities. Those personalities would result from each machine’s adaptation to the unique circumstances it encounters.

If one such device were, say, the sole companion for an infirm person, and that person interacted with, communicated with, and related to the device’s distinct personality, then you might be able to offer a few arguments that the device is “alive” in some sense of the word. You could argue the point based on the objective qualities of the device, but also based on the subjective view of the human who derives companionship from interacting with it.

Still holding that thought, have you ever walked down the halls of a retirement home, hearing a succession of blaring TVs as you pass each room? That’s mighty old technology.

In such settings, which will be increasingly common in graying societies, it’s not farfetched to imagine devices that can converse to some extent; place audio or video calls to relatives; read aloud news, weather, magazines, and books; show movies or TV shows; play checkers; play poker; play music; monitor basic health parameters; dispense medications; maintain a listening watch for such words as “help!” and summon human attendants when necessary.

With the possible exception of holding conversations, all the other duties I listed already seem within the realm of today’s technology.

It’s easy to get enthusiastic about futuristic notions. But there have been false starts. In the 1970s I learned, as a student, the rudiments of several programming languages. I remember that Artificial Intelligence was all the rage back then. Some very knowledgeable people thought that within a few years computers would upend the world with impressive displays of intelligence. Those few years have stretched into a few decades, and we’re still waiting, so those early AI days look a bit innocent in retrospect.

Still, a trend is a trend, and this one appears to be accelerating. I, for one, won’t be surprised if computers can think someday, but I don’t know if this will constitute being alive. Fortunately, I don’t have to solve such riddles. I’m just a spectator.

In the meantime, I’ll do what I do best: I’ll keep eating my Halloween Snickers bars every year. This will keep me occupied until the issue, once it becomes an issue, is solved the true American way: with litigation. And then perhaps we’ll see the recognition of a new type of life form, one based on the consciousness of thinking computers.

Ed Stephens Jr. | Special to the Saipan Tribune
Visit Ed Stephens Jr. at EdStephensJr.com. His column runs every Friday.
