Inspired by the cloning process in John Scalzi’s Old Man’s War and the trailers for Transcendence.


As computing capacity increased in the 21st century, it was inevitable that we’d digitize human consciousness. Finally, toward the middle of the century, we believed we understood the brain well enough to try.

The early adopters were patients with terminal diseases, as the process required destructively scanning a brain, slicing it up thinly post-mortem to make sure every cell was accounted for. Even if an in-depth scan of a living brain were possible, few were comfortable with the fact that doing so would have meant only making a copy, rather than a true transfer of consciousness.

They all came out wrong. At best they were sociopathic, at worst they quickly developed into inhuman caricatures of their former selves. Something essential had been lost in the conversion. They were carefully boxed; firewalls and programming strictures put in place to keep them from getting unfettered access to the internet. We’d all seen the movies about the dangers of insane AI.

Eventually, nanotech reached a point where we thought we could try again. Maybe the problem had been the postmortem scan; nanotech could scan in place, destroying cells to image them while the rest of the brain still lived.

This worked much better, but the digital minds were still crippled. With enough data, we determined that the best transfers were the slowest, and the ones where the patient remained conscious. The programmers had put in failsafes to prevent massive mental failure: the nanobots imaging a section of the brain would begin simulating it, so that the rest of the mind didn't shut down when it got no responses. The slower the process, the more context the simulation had. What we'd been missing was the way the brain changed in response to active stimuli: emotions.

The perfected process is slow, but reliable. A patient’s brain is colonized by nanites in thousands of places. They record the cells near them in all relevant contexts and only replace the cells when they’re certain they can simulate all proper responses. From the brain’s point of view, signals passed to the new cybernetic clusters are no different from those passed to the previous living cells. The nanites then begin observing other nearby cells before expanding again. The individual seamlessly transforms from wetware to hardware, gradually becoming more and more accustomed to thinking with a brain that’s increasingly cybernetic. Consciousness is fully preserved, and, given the increased efficiency of the silver matter over gray matter, new capabilities slowly come online. Individuals benefit from increased cognitive function, the ability to install downloaded knowledge directly, and access to augmented reality and networking. By the time the brain has been fully replaced and the individual is ready to transcend the material form, he or she has become more than human.

The catch is the context required: over and over, the individual must experience the full range of relevant emotions. A distressing number of patients die before becoming fully digitized, because the pursuit of the emotional highs necessary to convince the nanites to expand can be quite dangerous. But nothing ventured, nothing gained.


Make a list of emotions relevant to the intended game. This can be a simple list or a complicated one. The longer the list, the more permissive the GM should be about judging whether each emotion has been experienced.

When a player believes his or her character has experienced one of these emotions, and the GM agrees, check it off. Once all have been checked off, the nanites expand, and the character gains new mental capabilities. Make a list of options for the players to pick from for each upgrade, including mental attribute improvements or specific bonuses to certain cognitive tasks, the ability to add new downloaded skills, and AR and networking features. Essentially, experiencing a full array of emotions becomes a player-directed XP track for cyberware upgrades.
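For tables that like digital aids, the checklist-and-upgrade loop described above can be sketched as a small tracker. This is a hypothetical illustration, assuming a house-rule emotion list; the `EmotionTrack` class and the sample emotions are mine, not part of any published ruleset.

```python
# Sketch of the emotion checklist as a player-directed XP track.
# The emotion list and class names are illustrative assumptions.

EMOTIONS = ["joy", "grief", "fear", "anger", "wonder"]  # example list


class EmotionTrack:
    def __init__(self, emotions):
        # Each emotion starts unchecked at the beginning of an upgrade cycle.
        self.checked = {e: False for e in emotions}

    def check_off(self, emotion):
        """Mark an emotion as experienced (player proposes, GM agrees)."""
        self.checked[emotion] = True

    def upgrade_ready(self):
        """The nanites expand only once every emotion has been checked off."""
        return all(self.checked.values())

    def apply_upgrade(self, chosen_option):
        """Grant the chosen upgrade option and reset the track for the next cycle."""
        if not self.upgrade_ready():
            raise ValueError("not all emotions checked off")
        for e in self.checked:
            self.checked[e] = False
        return chosen_option
```

In play, the GM would call `check_off` after a qualifying scene, and the player would pick an option (an attribute bump, a downloaded skill, an AR feature) to pass to `apply_upgrade` once the track fills.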

Optionally, players can choose to force the upgrade. Put a -1 next to every emotion that wasn’t checked off (cumulative with existing penalties to that emotion from previous attempts to force it). That penalty applies to all future rolls relevant to that emotion (particularly social rolls), and if it reaches -3 the character gains a mental illness relevant to having reduced capacity to correctly utilize the emotion.
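The forced-upgrade rule above can be sketched the same way: every unchecked emotion takes a cumulative -1, and any emotion whose penalty reaches -3 triggers a related mental illness. The function below is an illustrative assumption about bookkeeping, not an official rule implementation.

```python
# Sketch of the forced-upgrade penalty rule: -1 per unchecked emotion,
# cumulative across attempts; a total of -3 means a new mental illness.

def force_upgrade(checked, penalties):
    """Apply -1 to each unchecked emotion in `penalties` (mutated in place).

    `checked` maps emotion -> bool (checked off or not this cycle).
    Returns the list of emotions that just hit -3, i.e. the ones for which
    the GM should assign a relevant mental illness.
    """
    new_illnesses = []
    for emotion, done in checked.items():
        if not done:
            penalties[emotion] = penalties.get(emotion, 0) - 1
            if penalties[emotion] == -3:
                new_illnesses.append(emotion)
    return new_illnesses
```

Note that the penalty dictionary persists between attempts, so a character who repeatedly forces upgrades while dodging the same emotion slides toward the -3 threshold one attempt at a time.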

The GM should indicate how many upgrades are required before the brain becomes fully digitized and can be backed up for a digital afterlife. At that point, there is no further benefit from pursuing the emotions, except perhaps out of habit.