April 19, 2011. If you're into the Sarah Connor Chronicles, at 8:11 PM last night Skynet became self-aware and set events into motion that will bring about Judgment Day... well, tomorrow. I don't buy into that myself (I'm a snobby Terminator purist, as I've lamented all day on Twitter, much to my followers' rolled eyes), but I did see what some may consider the harbinger of Skynet in the news today: mind-reading drones, referenced in Wired.com's Danger Room.
The Air Force has a problem: they want to fly more drones, but they don't want air traffic controllers responsible for things like when and where the drones should land. That goes against the whole idea of unmanned, computer-controlled aircraft.
The solution: smarter drones that predict what a human pilot is going to do and act accordingly to avoid collisions. It seems like an innocent premise, doesn't it? There's no need to limit traffic at busy terminals if your stealthy little drones are doing their own thing, weaving in and out of planned take-offs and landings. The drones will size up any near-collision situation, determine the most likely safe path in milliseconds, and take it. Win-win, right?
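For the curious, the premise boils down to something like this toy sketch: predict where the manned aircraft is headed, try a handful of candidate maneuvers, and keep whichever one stays farthest away. Every name and number here is mine, invented for illustration, not the Air Force's (or anyone's) actual collision-avoidance algorithm.

```python
def predict(pos, vel, t):
    """Assume the aircraft holds its course (constant velocity) for t seconds."""
    return (pos[0] + vel[0] * t, pos[1] + vel[1] * t)

def min_separation(drone_pos, drone_vel, plane_pos, plane_vel, horizon=10):
    """Smallest distance between the two projected tracks over the next few seconds."""
    distances = []
    for t in range(horizon + 1):
        dx, dy = predict(drone_pos, drone_vel, t)
        px, py = predict(plane_pos, plane_vel, t)
        distances.append(((dx - px) ** 2 + (dy - py) ** 2) ** 0.5)
    return min(distances)

def choose_maneuver(drone_pos, drone_vel, plane_pos, plane_vel):
    """Score a few candidate headings; pick the one that keeps the most separation."""
    candidates = {
        "hold": drone_vel,
        "bank_left": (-drone_vel[1], drone_vel[0]),   # 90-degree left turn
        "bank_right": (drone_vel[1], -drone_vel[0]),  # 90-degree right turn
        "slow": (drone_vel[0] * 0.5, drone_vel[1] * 0.5),
    }
    return max(candidates, key=lambda name: min_separation(
        drone_pos, candidates[name], plane_pos, plane_vel))
```

A head-on scenario shows the idea: a drone flying east at (0, 0) toward a plane flying west from (10, 0) will score "hold" at zero separation and bank away instead. The real systems presumably model pilot intent far more richly than a constant-velocity guess; that's the whole "mind reading" part.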
Wrong. Didn't anyone else actually watch Terminator? It started out innocently too. Let's walk through this real quick, shall we? First, drones are upgraded to be smart and sly in their dealings with manned aircraft. This opens the door for even more drones flying our friendly skies. Before you know it, armed drone aircraft are out patrolling the border, thick as mosquitoes and just as tenacious, defending us from perceived evils. Drones will fight our air superiority wars for us, unmanaged except by some central command-and-control AI.
We've all seen what happens next. A T-800 striding across a field of human skulls, on the offensive against the harried survivors of the human race. Is this really the way the future has to be?
Soar Technologies, one of the companies providing humanity with its robotic Armageddon -- I mean, smart drones -- plans to build its "schema engine" (an intelligent logic mechanism) with the following properties: "memory management, pattern matching, and goal-based reasoning." If you're like me, you read that as: "How to kill humans."
So when the nukes come down like the rain from a violent thunderstorm of artificially intelligent revenge, don't blame me. Blame the Air Force and their crazy ideas. Just remember, we could have stopped this before it was too late.
Jason Kennedy has many more fears than a robot apocalypse, some of them remotely realistic. If you keep your eye on him via Twitter, there's a good chance you'll hear them.