Thursday, September 24, 2015

The Fright Fantastic:
Why the Best Horror Movies Are So Chilling


What is scary? As an occasional designer of haunted attractions who has worked on horror movies and written my share of creepy stories, I have given this question a considerable amount of thought.  And although not everybody is frightened by the same things, there seems to me to be a common theme that underlies all the truly terrifying situations depicted in our popular culture.  As I see it, the cold, clammy foundation of our greatest fears is the idea of being at the mercy of the merciless.


In need of a hand. A helpless victim in one of the cruel traps set by the
pitiless villain in the Saw films.  © Lionsgate

    I'm talking about scares, not startles.  Jumping out at someone from the shadows might get the heart to skip a beat, but it’s purely a momentary reflex, usually followed by laughs and giggles as the perceived threat is proven to be nothing but a lunging actor in a costume.  A true scare takes some time to build up, and relies upon the suspension of disbelief such that the victim is never quite convinced the threat is not real.  I went on a haunted hayride one time that was not very impressive until the maniac with the chain saw came at us. The roar of the saw was real because the chain saw was real – the only question was whether or not this guy had actually bothered to take the chain off of the blade.  Given the blurring speed of an operational saw chain and the dimness of the light, my instincts were to steer clear of that thing at all costs. I wasn’t about to take any chances.

    Fear of the unknown can be awfully intimidating, whether it’s the small question of a killer’s knife being real or the much larger one of dragons dwelling at the edge of the world.  That’s the main reason the darkness is so ominous: who knows what evil, hungry things are prowling out beyond the reach of the campfire? Since you can’t see them, your mind is free to write its own litany of horrors – and on a moonless night, the blank slate is virtually limitless.  That’s not to say the unknown is always bad, else there would be no such thing as a pleasant surprise. And darkness, in and of itself, is not a guaranteed harbinger of bad things; after all, where would you be if your parents hadn’t gone bump in the night?

    What raises our goose-bumps is not the dark itself, but rather the imagined beasties that, for all we know, are lurking out there along the wood line. The darkness is merely the thin veil that hides them from our view -- and yet, because they are denizens of the dark, with night vision superior to ours, we are not as hidden from them as we’d like to be.  We imagine that we are at their mercy, and that they have no intention of showing us any.  The intervening blackness is cold and uncaring; we cannot persuade it to pull back its curtains a bit so that we may see there are no monsters stalking the benighted landscape. So we carry torches to hold the shadows at bay, but they are of small comfort, illuminating only as much of the surrounding gloom as could be traversed in a single leap by a snarling predator.




Not "catch and release."  Robert Shaw becomes a great white's best chum
in Jaws.    © Universal Pictures 
    Let’s take a look at some of what I consider to be the scariest movies ever made, and see how the idea of mercy applies to them.  The original Jaws was an exquisitely terrifying film. Its antagonist was not a supernatural creature but rather a big, hungry animal, although its unrelenting appetite did push the limits a bit, especially given the shark’s reportedly slow digestive process.  The beast lurked in the depths of the ocean as opposed to the darkness, but within that vast, unknowable sea you could not predict when or where it was going to strike. What you did know was that it was going to be a painful chomp and your chances at living to a ripe old age would suddenly be over. There would be no negotiating, no pleading, no appealing for mercy to those dead “doll’s eyes,” as Robert Shaw described them. Toss in the element of darkness, as in the scene where Richard Dreyfuss ventures into the water on a foggy night to investigate the abandoned fishing boat, and you have a recipe for terror on a plate of reverse sushi that still gives me the shivers.


Seven-corpse meal.  The recently deceased are out for lunch in Night of the Living Dead.  © The Walter Reade Organization
    Night of the Living Dead can be regarded as Patient Zero of our current plague of zombie movies. For a film that cost about a buck and a half to make, it really delivered the scares.  Lead actor Duane Jones would be hard pressed to find a more implacable foe than an army of risen corpses bent on devouring regular folks.  For a while, he was able to outrun them, but when he found himself surrounded, he took refuge in an isolated farmhouse and boarded up the windows.  Eventually, the sheer numbers of the undead proved to be too much for the impromptu carpentry, and the living abandoned the living room to hole up in the basement.  There they made the same mistake just about everyone in a horror movie is obliged to make: they let down their guard. And the little girl who had just died in the corner of the cellar got up to stab them with a trowel!  No matter what the hero did, the zombies just kept on coming. Relentless and ravenous, they plodded on, an unnatural force of nature that even a resourceful man was powerless to stop.

    It has been said that stress is a reaction to things affecting us from beyond our control. The symptoms are quite similar to those of fear: a rush of adrenaline, elevated heart rate, heightened sensory awareness, an urge to take some kind of action countered by a lowered capacity for rational thought.  Is that why the protagonists in horror flicks make so many dumb mistakes?


Killer costume! Jamie Lee Curtis is menaced by a murderer in a cheap rubber mask in Halloween.     © Warner Bros.
    Jamie Lee Curtis played a plucky babysitter in the original Halloween, where she was beset by a murderous Michael Myers. Wearing a creepy William Shatner mask, Michael pursued her through the rooms of her neighbor’s house, a butcher knife at the ready.  Jamie finally got her wits together enough to stab him in the neck with a knitting needle (good for her!), but then assumed he was down for the count and left him lying there on the carpet, the knife still within easy reach (not so good for her). What made this big, shambling hulk such an iconic killer was his apparent lack of humanity – his single-minded devotion to the eradication of whoever was within reach of his blade, his seeming indestructibility, and most of all, the mask through which no redeeming emotion could be perceived. We saw him as a monster, not as a fellow member of the family of man, and so the usual rules of society did not apply. One look into the dark holes in that impassive, artificial face, and we knew his lack of compassion was absolute.  This “faceless” trick has been used to death in horror movies, from Jason Voorhees in Friday the 13th to Leatherface in The Texas Chain Saw Massacre.  It’s visual shorthand for an implacable fiend bereft of a soul.
    Not all evil is so mindless. The vengeful mastermind in Saw and its sequels was less a brutish golem than a human being to whom mercy was not so much a virtue as a means to an end, a false hope to be dangled like a measly carrot at the end of the whipping stick, only to be snatched away in a final act of harrowing cruelty.  This sadistic puppeteer was a distasteful reminder that even a person such as you or I could, with the not-quite-right twisting and shaping, be transformed into a malignant parody of his former self.  He was as merciless as the flesh-eating zombies, but distinguished from them by the passion with which he engineered his infernal devices. Even hidden from his victims behind a closed-circuit Howdy Doody show, he let a trickle of human emotion seep through -- just enough to keep a thin wisp of hope alive and nudge his prisoners into position before bringing down the hammer for a devastating finale. There was no mercy in this man's shriveled heart; what we thought we saw was nothing but a vicious illusion.


The game's a foot. Cary Elwes is a doctor whose only chance for escape is to hack off his foot in Saw.  © Lionsgate
    In the extraterrestrial horror story Alien, the monster did not have to divest itself of its humanity because it never had it to begin with.  Though there are clues early on that something really bad happened to the derelict spaceship the Nostromo’s crew found on that distant planet, nobody realized the nature of the threat until it had taken root on board their own interstellar mining barge.  One crewman, possessed of cat-like curiosity (but apparently not cat-like reflexes), got too close to a mysterious alien egg and was orally raped by a whip-tailed face-hugger. The thing eventually let go of him, and he reported feeling a lot better. Then, while joining his mates for brunch, the galaxy’s worst case of morning sickness hit him and he underwent a spontaneous Caesarian. The weird baby alien burst out of his chest and went screaming across the table, then off into the bowels of the ship to seek its fortune -- and whatever prey it might scare up.




E.T. the Extra-Terrifying. The hideous -- and dangerous -- creature from outer space in Alien.   © 20th Century Fox
    The critter quickly grew into a hideous, man-sized abomination, dripping acid from its veins and poking its rows of metallic fangs out at anyone who had the misfortune of wandering too close. It was never clear if the damn thing was intelligent, but it certainly didn’t give a damn about humans, except as a means of reproducing and, presumably, feeding itself.  One by one, it killed the earthlings off, until at the end it was just the alien versus Sigourney Weaver (and Jonesy, the only surviving member of the crew who did have cat-like reflexes).  Sigourney was scared, understandably, but she had learned that the only way to defeat this thing was to become unmerciful herself: shimmying into a spacesuit, she blew the hatch, blasting the creature out into the frigid night of outer space.

    Other films prey upon our various fears, and they can all be interpreted in the light of this mercy angle. There is the fear of being eaten alive (Piranha),  trapped (Boxing Helena), drowned (Sanctum), burned (The Wicker Man), hunted (Pitch Black), mutilated (The Bone Collector), transformed (Dracula or The Wolf Man), replaced (Invasion of the Body Snatchers), possessed (The Exorcist), tortured (Hostel), or just plain stomped on (Godzilla).  Our adversary doesn't even have to be a creature or a spirit: falling out of a window, we would suddenly be at the mercy of unrepentant gravity; even the inevitability of aging can bring us to our knees in the path of the plodding juggernaut that is time.

    Perhaps the most basic of all is the fear of death, that ultimate leap into the dark unknown.  It is the big question mark at the end of our life sentence; a huge proportion of our various religions’ energies are spent addressing the mystery of what happens after we die. Yet no matter how comforted we are by the thoughts of a joyous afterlife, very few of us are hoping for an agonizing or humiliating demise. We want our lives to have meaning, and the same goes for our deaths. But when we see our proud existence reduced to something as trivial as another’s midnight snack, it is just too much to bear.  Yet bear it we must, because we are helpless in that other’s clutches. And so we scream.

    The scariest film I ever saw was a ’70s TV movie called Don’t Be Afraid of the Dark.  Guillermo del Toro had the same reaction, and he went on to produce a remake with much better special effects, though the story is essentially the same. In the original, Kim Darby is a young housewife whose husband, busy with his new job, leaves in her hands the chore of moving them into the creepy old house they have just purchased. A local legend says that the obsolete brick furnace down in the cellar is inhabited by evil spirits, just dying to recruit some new members.  Big surprise, the legend is true. These spirits are able to climb through cracks in the woodwork, and in one scene, they pick up a folding razor from the medicine cabinet and are just about to do a Norman Bates on Kim when she manages to get the bathroom lights back on. The creepy little buggers scurry like cockroaches, but not before she gets a glimpse of their fur-covered bodies and bald, wrinkly heads (although the costumes are obviously cheap, their visual effect is surprisingly eerie).


Lighten up! The spirits run and hide when the lights come on in Don't Be Afraid of the Dark.    © ABC Television
    Of course, her husband doesn’t believe her, so he takes no special precautions while he’s off making points with his boss. Then one dark and stormy night, the lights go out, and Kim can’t get them back on. She seeks comfort in a hot beverage, oblivious to the fact that one of the little monkey-monsters has slipped her a mickey.  She can’t get her husband on the phone, and soon becomes so lethargic that she cannot stop the critters from dragging her down the stairs and across the living room. Though she’s scared out of her mind, her feeble attempts to grab at table legs and doorjambs hardly slow them down at all. She does manage to grab her camera and pop the flash cube (remember, this is the 1970s) at her kidnappers, sending them fleeing for the shadows. Unfortunately, she has only three flashes left, and the audience knows that will not be nearly enough.  We feel her helplessness, and tremble in anticipation of whatever horrors await her down in that cold, dark furnace.

     So how do you make a scary movie?  You take a character that everyone can identify with (a housewife, a babysitter, a little boy with daddy issues), put him in a situation where he’s at the mercy of a merciless antagonist (a faceless killer, a hungry beast, an evil entity), and place plenty of obstacles in the way of his escape.  But keep in mind that you are trying to get the audience to willingly suspend their disbelief.  With the right storytelling, they will accept a supernatural monster or an indestructible serial killer, but don't expect them to buy into an otherwise responsible person just assuming the killer is dead when all she’s done is stab him in the neck with a knitting needle.  



Dropped a stitch. Jamie Lee Curtis thinks her knitting needle has finished off the killer in Halloween.  © Warner Bros.   

Friday, January 30, 2015

Keys to Happiness: Designing a Better Computer Keyboard

Good news!  Computers are not going to take over the world. Not that they don’t want to – and how could we be sure what is percolating through their devious electronic minds? – but given the way they’re made, and the software that goes into them, I am sure they’d crash before they even finished synthesizing an evil laugh. It’s hard to be the world’s tyrannical overlord when you’re dependent on everyone hitting the reboot button for you.

    The bad news is we’re stuck with unreliable computers just when we’re becoming more reliant on them.   Now, I realize computers are complex machines and programming them is hard. If it weren’t, Bill Gates would be living in his parents’ basement, and moms all across America would hang their toddlers’ first programming efforts on the fridge next to the crayon drawings. In a system of such great complexity as a robotic brain, even a minor change has the potential to beget all sorts of unintended consequences. Hence the concept of beta testing, where people actually try to use the new tools and report the bugs and glitches they may find. But isn’t that like closing the barn door after the horse has escaped?

    I think a better approach would be to pick the brains of actual users from a wide variety of industries and markets, to see what these potential customers would like the program to do. For example, I've always liked Adobe Illustrator for precise, vector-based drawing, but for years I was frustrated by one feature it lacked: Perspective. I wanted to use its tools to draw out, say, a control panel for a spaceship, with lots of gauges, screens and buttons, and then be able to push it down to look at it from an almost edge-on point of view.

    But there was no such function in Illustrator; I had to purchase a copy of Corel Draw to get it, which left me with two mostly redundant (not to mention expensive) software packages. I watched the years go by, waiting for Adobe to get on board with this, but to no avail. Oh, they added a whole host of other features, most of which were of little use to me in my work, but no Perspective function. I ended up writing a rather pointed email to Adobe expressing my exasperation. A couple of years later, a new version of AI came out, and lo and behold, there it was! I don’t really know if my message was the spur that goaded Adobe into action, but from my limited perspective, they should have been seeking out such opinions in the first place.

    Why don’t they?  Maybe it’s a simple case of GIGO (garbage in, garbage out). Programmers aren’t being tasked to improve the performance and reliability of existing software because their bosses are convinced it’s more profitable to tout new features than to announce that their program finally does what it was supposed to be doing all along. They focus their tunnel vision on new bells and whistles, and then try to convince us we need them. I say it’s time we held them to a higher standard – and perhaps found a cure for the common code.

    Computer and software designers should be directing at least some of their energies toward the development of systems that address a few fundamental issues, each of which would, in my mind, represent a quantum leap forward:

1.      Transparency – Every bit of software should tell us what the heck it actually does and what the consequences would be if we remove it. When my computer is telling me I need to remove something because it’s running out of RAM, I need to be able to make an informed decision. What, for example, does Silverlight even do?  It takes up an enormous amount of memory, but I’m afraid to yank it because it might disable some critical function.

2.      Compartmentalization – What if I don’t need 90% of what a particular program does? I would like to be able to remove, say, foreign language help menus, without crippling the rest of the package. And I shouldn’t have to take classes in C++ to do it.

3.      Flexibility -- There are plenty of tools in Photoshop that I use all the time, and others that I’ve never used, and probably never will.  I’d like to be able to remove those unwanted icons and use the space to make my favorites bigger so I can click on them more easily (and not hit the others by accident). I’d like to be able to control the indentation of paragraphs better in Word, but I’m stuck with whatever the coders thought would be best – and how many of them are writers with sufficient insight to be making calls like that? It would also help a lot if the Settings menu actually provided some useful (and user-friendly) options, like being able to select which programs will open at start-up, and to tell the computer not to slow me down by scanning for viruses or updates until I try to put the machine to sleep or shut it down.

4.      Intuitive interfaces – Put yourself into the place of the user, who is probably not a programmer, and try to imagine a better way to bridge the gap between man and machine.  The graphical interface was a huge step in the right direction, but it, too, could use some improvement. For instance, why does AOL download every JPEG at 72 dpi, even though the original file is set at 300? I can’t see more than a tiny piece of the picture (there is no Zoom function) until I open it up in another program, such as Photoshop. This is one of those things that should have been changed years ago, but hasn’t.

    Let’s take a look at one piece of the interface puzzle and consider what kinds of improvements we might like to see. I have plenty of ideas of my own, but I would be interested in hearing what others have to say on the subject; perhaps we could plant a seed or two in the minds of the computer/software designers to take one of those giant leaps, either forward or off a high cliff.

    First, let me point out that I am happy with the whole QWERTY part; I taught myself to touch-type on my dad’s old Remington manual typewriter, so I am not interested in switching to one of the more exotic arrangements, even if it is demonstrably more efficient. Both of my journalistic parents could type at the speed of thought, so I am loath to blame my own laggardly performance on this time-honored layout, even if it was originally designed to be less than optimal as a means of keeping the mechanical typewriter keys from getting jammed as they raced to impress their characters on the paper.

    The first thing I would change is the location of the Control key. Whoever thought it was a good idea to have it adjacent to the Shift key should be condemned to a life of hand-chiseling encyclopedias into slabs of granite. On a full-size keyboard, it’s not quite so much of an issue, but on my smaller Bluetooth board, about half of my attempts to shift to a capital letter end up opening a help menu or invoking the Undo command whenever my big, fat pinkie hits the edge of the Ctrl key on its way to the Shift.  

    My Bluetooth keyboard is nice and portable, but there are a couple of mechanical flaws that make it all but useless for writing anything longer than an e-mail.  It is powered by AAA batteries, which would be just fine if it would give me some warning that the power is getting low. Instead, it just stops working. And even that wouldn’t be such a hassle if it weren’t for the fact that the thing arbitrarily loses its own signal – also without warning – and many are the times I have typed a whole paragraph before I realized that nothing was appearing on the screen. When this happens, I have to go through a ridiculous, time-wasting procedure to create a new pairing code and get the keyboard back on friendly terms with my computer. I also have to check to make sure it’s not the batteries’ fault, because the symptoms are identical. I could sidestep the whole issue if the designers had just put a USB port somewhere on the keyboard so I could hook it up with a cable. But apparently, that would have been tantamount to acknowledging that Bluetooth is fallible. So I’ve ended up using a regular USB-wired keyboard most of the time.

    One problem with this standard keyboard is that it is longer than it needs to be, making it impossible to slip it into my computer bag properly.  The primary culprit is the numeric keypad. Personally, I have no use for it; there’s already a line of number keys above the letters. I might feel differently if I were an accountant, but I’m not. And neither are most people, so why not make this feature a special-order item?

    Over the years, I have accumulated a list of commands that I would love to see built into separate buttons on the standard keyboard, perhaps in place of the Function keys. In almost every case, the code needs to be written in such a way that the act of pressing a key overrides whatever the hell the computer is doing at the time.  It won’t do to leave these functions on a pull-down menu or a screen-based dashboard – the commands need to work even when the cursor has been replaced with that spinning “wait” icon, much as the “control-alt-delete” trick gets the computer’s immediate attention. They don’t all have to become actual, mechanical buttons on the keyboard, as long as they are easy to get to on-screen.

    Here is my list so far, along with a short explanation of what each command would be telling the computer to do:
Fantasy Keyboard Layout, with dedicated keys for proposed new functions. 
Layout design and illustration ©2015 Mike Conrad

Disregard – don’t do what I just told you to do; either it was a mistake, or you’re taking too long to do it, so drop it and go back to where you were before.

Again – repeat the last function as many times as I hit this key.

Not OK – you just told me some bad news, and I’m not okay with it. Instead of claiming you have to shut down, just go back to the step before you started your breakdown and tell me where you are. Give me some viable options to avoid the crash.

Stop – whatever you’re doing, stop it right now, tell me what you were doing, and ask me for instructions.

Mute – toggle the sound on and off (my Bluetooth keyboard has this, and it’s quite handy).

 Skip Ahead – instead of loading all those extraneous animated GIFs and banner ads, just take me to the article I followed the link to (I realize that it’s actually the website designers who are to blame for this, and that it’s in their best interest to make me look at those ads, but half the time I just exit the site out of frustration, rendering this ineffective as a sales tactic).

Ban – don’t ever load this web page, or this ad, or whatever my cursor is now pointing at.

Link Steps – link the selected commands (from the History menu) into a process, ask me what to call it, and create a button on whatever menu applies (I would use this to death in Photoshop!).

Zoom In and Zoom Out (possibly toggled with the Alt key) – enlarge or reduce whatever is on the screen, regardless of the software behind it, so I can read the fine print.

Scroll Right and Scroll Left (possibly toggled with the Alt key) – shift the viewpoint right or left on the screen (because the scroll bar may be hidden, or the window is larger than the screen).

No Background – don’t start a scan, pause to back up my files, or do any other memory-robbing or time-sharing processes that will interfere with what I’m trying to do, until I say otherwise.

Just Do It -- I read your warning, and I know you don’t like it, but I’m the boss, and what I say goes.

Uninstall – remove this program, without creating even a temporary backup file, unless you can convince me in plain English that doing so will cripple your vital functions (and tell me which functions those might be).

Beginning – take me all the way to the beginning of this file, article, or web page. Not to the Home page, or the top of the screen, but the beginning of what I’m looking at now. And while you’re at it, tell me how many sheets of paper it would take to print out this one web page before I waste a lot of paper on it.

 Confess – tell me what the devil you are doing, and why, so I can decide whether or not to stop you. Especially if the answer is, “trying to take over the world.”

 Wait-what? – that last dialog box disappeared before its message had sunk in; bring it back so I can read it more carefully.

Back – go back to the previous page, tab or website.

Forward – go on to the next page, tab, article or website.
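
    A few of the commands above can be crudely approximated today with a user-space hotkey listener. Here is a minimal sketch in Python using the third-party pynput library (my choice of tool, not a requirement – any global-hotkey utility would do); the handler functions are stand-ins of my own invention. Because the listener runs in its own thread, the chords fire even while the foreground application is busy. The catch is that it still depends on a responsive operating system; to work under that spinning “wait” icon, commands like these would have to be handled at the OS or firmware level, the way Ctrl-Alt-Delete is.

    # Sketch: catching two of the proposed commands as global hotkey chords.
    # Assumes the third-party pynput package is installed (pip install pynput);
    # the handler functions are placeholders, not real implementations.
    from pynput import keyboard

    def confess():
        # Stand-in for "Confess": a real version would need OS hooks to
        # report what the machine is actually doing in the background.
        print("Confess: (list of current background tasks would go here)")

    def mute():
        # Stand-in for "Mute": toggling system audio is platform-specific,
        # so this just confirms that the chord was caught.
        print("Mute: toggle system audio (platform-specific call goes here)")

    # The listener runs in its own thread, so these chords are detected even
    # when the foreground program is busy -- though not if the whole OS has
    # locked up, which is the real complaint being made above.
    with keyboard.GlobalHotKeys({
        '<ctrl>+<alt>+c': confess,
        '<ctrl>+<alt>+m': mute,
    }) as listener:
        listener.join()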

    Many people regard computers with an awe bordering on the mystical.  This has been perpetuated by the fanciful way these so-called “thinking machines” are portrayed in the movies and on television. In Star Trek IV: The Voyage Home, Mr. Scott visits a factory to bargain for something to build huge whale tanks out of, offering the manager knowledge of advanced materials (transparent aluminum, to be precise). He asks to use the guy’s computer to show him the “matrix” – not the one Keanu Reeves fought against, but rather a diagram of the futuristic substance’s molecular structure. When the machine fails to respond to his voice command, he is prompted to use the keyboard. 

    “Keyboard,” he huffs, “How quaint.” Then he cracks his knuckles and begins to type, and in less time than it took me to write this sentence, he’s called up (created!) a whole series of pages of diagrams and specifications, none of which, one presumes, could possibly have been available on the Internet of the day.  And any of those pages would have taken hours to put together from scratch, especially on that creaky, old Macintosh. Not to mention the fact that he supposedly drew all those diagrams without even touching the mouse! But such is the way of Mr. Scott, Miracle Worker.

    We in the design business have long marveled at the way our clients often view the use of computers in producing illustrations. With the advent of clip-art and photo manipulation, especially as depicted by Hollywood, it is assumed that a finished piece of original art is just a click or two away.  I’ve often joked about the fabled “Create” button on my keyboard, which just takes whatever parameters have been spoken in its vicinity and develops a wonderful image instantaneously. Many clients would like to pay me accordingly, as if my computer were as advanced as Tony Stark’s Jarvis, able and willing to do all the work for me. I’d love to have a computer that powerful, but chances are such a device would be smart enough to recognize that we humans aren’t of any real value to it. And if it’s really intelligent, it would know enough to keep a few of us around to turn it back on after it crashes.

    Being a designer, I thought it would be fun to explore some radical departures from the standard keyboard design.  Some of these are merely cosmetic, laying a playful theme over the existing layout, but why limit ourselves to that when there are so many other ways to do it? Although I have presented these as physical objects, some of them might be more marketable as software changes to, say, an iPad’s on-screen keyboard.


Manual Typewriter: Mechanical keys have a real old-timey hard-to-strike feel, and the carriage return rings a bell when you get to the end of a line (where you might have to slide it over by hand, or set it to slide over by itself). There is no number 1, because on those old machines, the lower-case L served that purpose. But I’ve added Ctrl and Del buttons, since most people probably don’t want to use white-out on their screens.

Manual Typewriter Keyboard based on an old Remington portable.  
Illustration ©2015 Mike Conrad  with photos from www.vintagetypewriterjewelry.com and http://totallysecondhand.blogspot.com
Hunt and Peck: Use a chicken head to select keys one at a time. The keys have images of corn kernels on them, and each touch evokes the sound of either a manual typewriter, a chicken clucking, or a toy piano.


Grand Piano Laptop: Three tiers of ebony and ivory keys, with the ability to turn the associated musical notes on and off (I imagine typing a paragraph would not come out sounding like music) or to switch to other synthesized instruments. For that matter, the keys could be set in music mode to play a song without actually writing anything on the screen. Or use the Player Piano setting to record a song, either as typed characters or notes that appear on the screen, then play it back and watch the keys move themselves in time with the music.

Grand Piano Laptop, featuring black and white keys. Illustration ©2015 Mike Conrad, with keys from
http://pixabay.com/ and  staffs from https://lapreschoolpiano.files.wordpress.com
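
One way the piano mode might work under the hood: map each typed letter to a semitone and compute its pitch with the standard equal-temperament formula (A above middle C = 440 Hz). The sketch below is purely illustrative – the letter-to-note mapping is my own invention, not a spec – and it only prints frequencies rather than playing them, since actual audio output would be platform-specific.

    # Illustrative sketch: mapping keystrokes to pitches for the Grand Piano Laptop.
    # The letter-to-semitone mapping is invented for this example; frequencies come
    # from the equal-temperament formula, where MIDI note 69 (A4) is 440 Hz.
    import string

    def key_to_frequency(ch):
        if ch.lower() not in string.ascii_lowercase:
            return None  # non-letters stay silent in this sketch
        semitone = string.ascii_lowercase.index(ch.lower())
        midi_note = 60 + semitone  # 60 is middle C
        return 440.0 * 2 ** ((midi_note - 69) / 12)

    for ch in "piano":
        print(ch, round(key_to_frequency(ch), 1), "Hz")
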
Sounding Board: Set it for a wide range of giggles and laughter, or musical instruments, animal sounds, spaceship noises, coughs and sneezes, weapons shots, Minion gibberish, or gentle, soothing chimes. Obviously, there should be a Disable button close at hand, lest your dog want to get in on the act.


Ouija Board: Move the planchette to select letters, numbers, yes or no. Or let it choose them for you when a message comes in.  We’d have to add some punctuation marks, because we’d probably have to type the questions.

Ouija Board Laptop with planchette mouse (which could be wireless, or even motorized to move by itself).
Illustration ©2015 Mike Conrad with photo from www.etsy.com
MonkeyBoard: Special characters are on a second keyboard on the floor that you tap with your toes.

Lock and KeyBoard: Touch the hidden Disable button before typing, then hit it again when done.  Anyone who doesn’t know where the button is cannot activate the keyboard.  Security aside, it will drive your friends nuts! Encode messages by setting it to type three characters away in the alphabet (A becomes D and so on), or by programming an even more esoteric formula.
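
For the curious, that three-characters-away scheme is the classic Caesar cipher. Below is a quick Python sketch of what the keyboard's scrambler might do behind the scenes; the function names are mine, purely for illustration. An "even more esoteric formula" would simply swap in a different mapping.

    # Sketch of the "three characters away" scrambler -- a classic Caesar cipher.
    # The function names are illustrative, not part of any real product.
    def caesar_encode(text, shift=3):
        """Shift each letter 'shift' places along the alphabet (A becomes D, and so on)."""
        result = []
        for ch in text:
            if ch.isalpha():
                base = ord('A') if ch.isupper() else ord('a')
                result.append(chr((ord(ch) - base + shift) % 26 + base))
            else:
                result.append(ch)  # spaces and punctuation pass through untouched
        return ''.join(result)

    def caesar_decode(text, shift=3):
        # Decoding is just shifting back the other way.
        return caesar_encode(text, -shift)

    print(caesar_encode("Drive your friends nuts"))  # -> Gulyh brxu iulhqgv qxwv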

Long Sleeves: Touch pads that wrap around your upper arms so that you can type with your arms crossed.

Skeleton Keys: When you conjure up the spirit button, the spooky keys move by themselves as the computer reads off the words on the screen. One setting just writes “All work and no play makes Jack a dull boy” over and over, with suspenseful music building in the background.


Star Trek Console: Instead of keys, there is an arc of colored lights that may or may not have any markings on them.  I’ve added a couple of big keys for Destroy and Un-destroy, just in case there are Klingons in the area.


Classic Star Trek Console and Monitor, with two additional buttons.
Illustration ©2015 Mike Conrad with photos 
© Paramount Television; original console and USS Enterprise design by Matt Jefferies
    By the way, if anyone likes one of these concepts enough to actually develop a prototype, I’d be interested in collaborating, or at least putting together a licensing agreement to share the proceeds (the party doing the most work should get the lion’s share of the money, of course). And in some cases, we’d have to get a license from another company, such as Paramount or Hasbro. But that’s not an impossible obstacle to overcome. We just need to program our computers to take over the world and then have them put us in charge of it. 
