Transhuman Thrillseeking


Inspired by the cloning process in John Scalzi’s Old Man’s War and the trailers for Transcendence.


As computing capacity increased in the 21st century, it was inevitable that we’d digitize human consciousness. Finally, toward the middle of the century, we believed we understood the brain well enough to try.

The early adopters were patients with terminal diseases, as the process required destructively scanning the brain: slicing it thinly post-mortem to make sure every cell was accounted for. Even if an in-depth scan of a living brain had been possible, few were comfortable with the fact that it would only make a copy, rather than a true transfer of consciousness.

They all came out wrong. At best they were sociopathic; at worst they quickly degenerated into inhuman caricatures of their former selves. Something essential had been lost in the conversion. They were carefully boxed, with firewalls and programming strictures put in place to keep them from getting unfettered access to the internet. We'd all seen the movies about the dangers of insane AI.

Eventually, nanotech reached a point where we thought we could try again. Maybe the problem had been the post-mortem scan; nanotech could scan in place, destroying cells to image them while the rest of the brain still lived.

This worked much better, but the digital minds were still crippled. With enough data, we determined that the best transfers were the slowest, and the ones where the patient remained conscious. The programmers had built in failsafes against massive mental failure: as the nanobots imaged the brain, they began simulating the cells they consumed so the rest of the mind wouldn't shut down when it got no responses. The slower the process, the more context the simulation had. What we'd been missing was the way the brain changes in response to active stimuli: emotions.

The perfected process is slow, but reliable. A patient’s brain is colonized by nanites in thousands of places. They record the cells near them in all relevant contexts and only replace the cells when they’re certain they can simulate all proper responses. From the brain’s point of view, signals passed to the new cybernetic clusters are no different from those passed to the previous living cells. The nanites then begin observing other nearby cells before expanding again. The individual seamlessly transforms from wetware to hardware, gradually becoming more and more accustomed to thinking with a brain that’s increasingly cybernetic. Consciousness is fully preserved, and, given the increased efficiency of the silver matter over gray matter, new capabilities slowly come online. Individuals benefit from increased cognitive function, the ability to install downloaded knowledge directly, and access to augmented reality and networking. By the time the brain has been fully replaced and the individual is ready to transcend the material form, he or she has become more than human.

The catch is the context required: over and over, the individual must experience the full range of relevant emotions. A distressing number of patients die before becoming fully digitized, chasing the emotional highs, some of them quite dangerous, needed to convince the nanites to expand. But nothing ventured, nothing gained.


Make a list of emotions relevant to the intended game. This can be a simple list or a complicated one; the longer the list, the more permissive the GM should be about whether each emotion has been met.

When the player believes his or her character has experienced one of these emotions, and the GM agrees, check it off. Once all have been checked off, the nanites expand and the character gains new mental capabilities. Make a list of options for players to pick from for each upgrade: mental attribute improvements or bonuses to specific cognitive tasks, the ability to add new downloaded skills, and AR and networking features. Essentially, experiencing the full array of emotions becomes a player-directed XP track for cyberware upgrades.
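One way to bookkeep this at the table is a simple checklist that only pays out an upgrade once every emotion is checked, then resets. A minimal sketch in Python (the emotion names and upgrade strings below are illustrative, not from any published system):

```python
class EmotionTrack:
    """Player-directed XP track: check off emotions, cash in for upgrades."""

    def __init__(self, emotions):
        # Emotions the GM deems relevant to the game; all start unchecked.
        self.checked = {emotion: False for emotion in emotions}
        self.upgrades = []  # Upgrades gained from completed tracks.

    def check(self, emotion):
        """GM agrees the character has genuinely experienced this emotion."""
        self.checked[emotion] = True

    def ready(self):
        """The nanites expand only once every emotion has been checked."""
        return all(self.checked.values())

    def upgrade(self, choice):
        """Spend a completed track on a player-chosen upgrade, then reset."""
        if not self.ready():
            raise ValueError("not all emotions experienced yet")
        self.upgrades.append(choice)
        self.checked = {emotion: False for emotion in self.checked}
```

For example, `EmotionTrack(["joy", "fear", "grief"])` would require all three before `upgrade("Downloaded skill: Linguistics")` succeeds, after which the track clears for the next expansion.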

Optionally, players can choose to force the upgrade. Put a -1 next to every emotion that wasn't checked off (cumulative with any penalties that emotion accrued from previous forced upgrades). The penalty applies to all future rolls relevant to that emotion (particularly social rolls), and if it reaches -3 the character gains a mental illness reflecting a reduced capacity to use that emotion correctly.
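The forcing rule above can be sketched the same way: every unchecked emotion takes a cumulative -1, and crossing -3 flags a new affliction. A minimal sketch, assuming penalties are stored as zero-or-negative integers per emotion:

```python
MENTAL_ILLNESS_THRESHOLD = -3  # At -3, the character gains a related illness.

def force_upgrade(checked, penalties):
    """Force the nanites to expand despite unchecked emotions.

    checked: {emotion: bool} - the current track state.
    penalties: {emotion: int} - accumulated penalties from prior forcings.
    Returns (new_penalties, afflictions), where afflictions lists emotions
    that just crossed the mental-illness threshold this forcing.
    """
    new_penalties = dict(penalties)
    afflictions = []
    for emotion, experienced in checked.items():
        if not experienced:
            # Cumulative -1 for every emotion skipped this time around.
            new_penalties[emotion] = new_penalties.get(emotion, 0) - 1
            if new_penalties[emotion] == MENTAL_ILLNESS_THRESHOLD:
                afflictions.append(emotion)
    return new_penalties, afflictions
```

Forcing three upgrades in a row while "grief" stays unchecked would drive its penalty to -3 and report it as a new affliction on the third forcing.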

The GM should indicate how many upgrades are required before the brain becomes fully digitized and can be backed up for a digital afterlife. At that point there is no further benefit to pursuing the emotions, except that doing so may have become a habit.

Consciousness Twinning


(Originally posted August 2009)

I had an interesting idea for a weird-sciencey explanation for respawning in a sci-fi video game context:

Late in the 21st century, we figured out transportation. You know, like Star Trek: the state of all your atoms and molecules and stuff is determined and replicated somewhere else. The method for doing it wound up being quite elegant, if you're a big-brained string theorist type. But here's the interesting part: the first guy they tried it on raised the obvious objection: "don't I just die here while a copy of me is made somewhere else?" He made them see if they could create the copy somewhere else without destroying his current body. And damned if they couldn't. That's where it got weird.

You’ve heard of quantum entanglement? The little quarks bouncing up and down in one place and affecting their brother quarks across time and space with no regard for the speed of light? It turns out consciousness is like that. Something about your brain state, when it’s copied exactly, results in you basically being in two places at once. The first guy had to be put in sensory deprivation to deal with it, but he had two bodies and was aware of them at the same time. You have to be a special kind of person to be able to handle that much sensory overload, though, and nobody’s figured out how to effectively use two bodies, yet.

Anyway, the persistence-of-consciousness issues aside, they tried a standard transport: kill a guy here and build him there. That worked out less well. The guy showed up at his destination with nasty gaps in memory and personality shifts. Turns out, without his consciousness holding the wave state open (or whatever) for even a moment, his brain pathways collapsed just a bit when they recreated him. They figured if they tried to clone someone out of cold storage that way, he might just wake up a vegetable. Religious folks rejoiced that there was something special about sapience, even if it was just the weird quantum wave form generated by the flow of electrons through your nerves.

There was a solution, for the wealthy or the special: a brain in a jar. You selectively clone someone’s brain, drop it in a nutrient bath, and go about your business. If the guy dies or needs to be transported, there’s still a brain in a jar in a lab somewhere holding open those consciousness pathways, seeing everything the guy saw up to his moment of death, creating a stable platform to resurrect him on. Plus, if you stick a couple of electrodes in the jar brain, you have a completely secure way of communicating with agents in the field: deliver the information in a locked-down facility and twin it over to the live dude.

And that’s how the elite agents operate. They have a backup brain in a jar somewhere. They can receive orders deep in enemy territory, be transported willy-nilly wherever there’s resources to do so, and even be recreated with full memories after the moment of “death.” I hear it’s an awesome insurance package… if you trust your boss to own a working copy of your brain.