Memory vs. storage

By Michael Harris

Recently, one of the entrepreneurship programs hosted by the Massachusetts Institute of Technology produced the beginnings of a website called "Eterni.me," which inspired much chatter, online and off.

Eterni.me's creators bill their project as a way to be remembered forever in "a living proof of you." Tens of thousands of would-be immortals have already signed up.

By amassing your social media data and then funneling it through an artificial intelligence - an avatar of you - the service will let your descendants carry on "as if they were talking to you in person," the website claims. Happy thought, indeed, for the bereaved.

Eterni.me joins the ranks of an expanding e-death industry that includes Deathswitch (which, if you like, will release your passwords and take-to-the-grave secrets upon your untimely demise) and Ifidie (which allows you to record a farewell message that will be posted, post-mortem, to your Facebook wall).

Social media's soft insistence on perma-connectivity has, inevitably, burst through to the great hereafter.

My first concern with Eterni.me and its ilk is not that they fly in the face of what psychologists might consider healthy bereavement. Nor that they jangle our appreciation of the authentic human interface (that is, our appreciation of living people rather than chatbots and avatars).

My major concern is that Eterni.me and the rest are rewriting our definition of memory itself. Such programs lure us into thinking of human memory as comparable to the perfect storage of a computer's recall system.

We have, of course, always wanted to hold onto more of the world than our feeble minds can grasp. We see that nascent urge in the supercharged mnemonic devices ("memory palaces") that the ancient Romans conceived of, and again in the cabinets of curiosities that so entranced Renaissance scholars. In our own lives, we see it in every snapshot we take on an iPhone, and in the growing use of lifelogging apps like Saga and Timehop, which promise our preservation in the silicon beyond (in Timehop's vaguely morbid words: "a time capsule of you").

But real memories, as experienced by humans, are nothing like the so-called memories of computers. Scientists now agree with the writer Jorge Luis Borges, who said, "Every time we remember something, after the first time, we're not remembering the event, but the first memory of the event. Then the experience of the second memory and so on."

Borges got that basically right. Through a brain process called "reconsolidation," every retrieval of a given memory actually changes it. As one expert, Nelson Cowan, told me: "We edit the past in light of what we know now. But we remain utterly unaware that we've changed it."

Memory, then, is a lived and morphing thing - not a static filing cabinet. The promise of something like Eterni.me is that we can circumvent our minds' failings. But to circumvent them is to miss the point of human memory entirely.

A Talmudic maxim holds, "We don't see things as they are, we see things as we are." And I think we can expand that to say, "We remember things as we are." To deny that fallible quality is to deny a part of what makes us human.

Perhaps, given such gloriously tech-injected recall abilities, we will now have to cultivate forgetting, and cultivate the art of human memory - the magic of painting one's own past, building a story of what came before.

Albert Einstein said we should never memorize anything that we could look up. And that sounds like good and practical advice.

But what would Einstein say today, when everything - even our ancestors - can be looked up? Should we bother to memorize poetry, or names, or historical facts? What utility would there be in the hazy results?

Fifty years from now, if you can recite "The Epic of Gilgamesh," are you a wizard or a dinosaur? And, 50 years from now, if you can remember your long-gone grandparents without an avatar prompter ... well, would you even bother trying? Or would you, instead, reminisce in the perfect and passive recall that the algorithm provides?

Perhaps we should side instead with philosopher Lewis Mumford, who insisted that "information retrieving," however expedient, is simply no substitute for the possession of knowledge accrued through personal and direct labor.

A first step, anyhow, would be to ban the word "memory" from computer parlance. What computers are so fantastic at is "recall" and "storage." "Memory" is something else, and something we ought to hold onto.