2009 is probably not very high on anyone’s list of Totally Awesome Years. Our society has been set on a track toward painful socioeconomic changes, and there has been a worrisome deepening of geopolitical rifts. But as the halfwits in the media class clucked and squabbled amongst themselves, they missed the biggest story of all: in 2009 the long-term prospects for the human experiment became considerably brighter. In fact, we have just lived through a banner year for the human species, because this was the year that we learned that leaving our mother planet to live elsewhere is a tangible possibility.
In recent months, several lines of scientific investigation converged and the result seems to be that humankind has gained the ability to prospect for water on other worlds. Most significantly, the LCROSS lunar impactor shot straight into a crater at 1.5 miles per second. It was literally a bombshell, but its impact on history will be no less great—the colossal smash sent giant chunks of ice from our moon flying into space.
A few weeks before, an analysis of the light bouncing off the moon had indicated that lunar dirt contains trace amounts of water all across its surface. Extracting this water would be more difficult than mining the plentiful ice in the craters, but it could be done.
By the beginning of this year, the Phoenix Mars Lander had already detected—in fact, stepped on—ice near the Martian north pole. But now a camera orbiting Mars has snapped pictures of 99% pure ice much closer to the equator, in a region with an environment far more hospitable to humans and our technology than the poles have.
Had this year’s additional evidence that Mars was once covered in oceans emerged a short while ago, the green movement would again have interpreted it as an ominous warning of what was in store for Earth. Instead, in the context of this year’s water discoveries, Mars has become a friendlier place. We now know that we could survive there using today’s technology, if it were important enough to do so.
In the media, and our culture generally, a dearth of imagination has prevented the long-term implications of all this from being noticed, and it’s terribly disheartening to see. The water discoveries should have been celebrated, if not with fanfare, at least with rapturous conversation around every dinner table in the world. “Have you heard? If a global catastrophe makes Earth uninhabitable, there’s a place we can go!” But unless you follow space news, you probably weren’t even aware that these discoveries had fundamentally changed the calculus of our society’s future and even the destiny of our species.
Lunar ice means that large-scale colonization of the moon is now possible decades earlier than it would have been had the moon been barren. That saved time could make all the difference in a pinch. Imagine that a few decades after a robust lunar colony is established, an asteroid, epidemic, or nuclear war ravages our home planet. We will have lunar water (and the fact that we knew about it) to thank for the preservation of human culture and knowledge.
Importantly, this increased access to the moon has put it within reach of privately funded excursions. It is now likely that individual and corporate homesteaders could establish and defend lunar property rights before governments mobilize to prevent them, promoting the cause of liberty and spreading Karl Popper’s Open Society into the cosmos.
The transition from an Earth-bound civilization to a space-faring one will start with the moon, making Mars colonization an incremental step rather than a giant leap. Relatively soon after a lunar colony is established, the skills and technologies developed to extract lunar water for drinking, breathing, and fuel-making will be further developed to do the same tasks on Mars. People will become experienced with the four-day journey between Earth and the moon, which will greatly simplify the logistics of moving personnel and cargo to the Red Planet.
Another important discovery was made in 2009 that made Mars more accessible: it became half as far away. Ion engines have already proved themselves on deep space probe missions, but this year a redesigned engine called VASIMR was announced that greatly extends the thrust and efficiency of ion drives. With these new engines, set to be tested in 2010 aboard the International Space Station, Mars could conceivably be reached in only 39 days—almost exactly half the time it took Columbus to reach the New World.
Stephen Hawking is right when he says that if we don’t colonize other worlds in the near future our species will become extinct sooner rather than later. This year’s discoveries of water on the moon and Mars are a profound gift—one that might extend the longevity of the human species by an order of magnitude. Despite all the headaches and misadventures this year, 2009 may not have been a washout after all.
The Washington Post has an intriguing piece about a book dealing with gardens (of all things) and digitality. The author, Robert Harrison, argues that gardens immerse us in place and time, and that digital devices do not. The article jumps all over the place, talking about mobile communication, cultural anthropology, and evolution, but it makes several important points.
To start, attending to digital devices is said to preclude being present:
“You know you have crossed the river into Cyberland when the guy coming your way has his head buried in the hand-held screen. He will knock into you unless you get out of his way, and don’t expect an apology. It’s as if you aren’t there. Maybe you’re not.”
I’m very interested in language like this, because it’s a metaphor in the process of becoming a literalism. Today, saying that you’re not there because you’re looking at a device is metaphorical, but I think the meaning of ‘being there’ is going to shift to wherever your attention is engaged, regardless of your geographical location. “I’ll be right there!” he said as he plugged his brain into the internet. Moments later he was standing in the garden…
The article quotes a study claiming that the average adult spends 8.5 hours a day visually engaged with a screen. Out of every 24-hour day, 8.5 hours of screen and 8 hours of sleep—the Screen Age really does deserve its own delineation. It’s a significant and unique period in human history.
And just as sleepers do, people looking at screens can resemble dead people (or, more accurately, un-dead people), perhaps disturbingly so:
…We have become digital zombies.
But I think the resemblance is entirely superficial. Sure, if you only go by appearances, an army of screen-starers is a frightening sight to imagine. But scratch the surface and you realize that screen-staring is a far cry from zombism. The social spaces we are constructing while we stare, the vast data stores we are integrating—these activities remind me of life. Teeming life. Our bodies may be sedentary, our eyes fixed on a single glowing rectangle, but what is going on is indisputably amazing. On the microscopic level there are billions of electrical fluctuations per moment, both in our brains and our machines, and they are actively correlating and adapting to each other. Patterns of thought are encoded in a vast network of micro-actions and reactions that span the planet. And what is it like for you when you stare at a computer or phone screen? You juggle complex, abstract symbolic information at speeds never before achieved by human brains, and you’re also inputting—emitting—hundreds of symbols with the precise motor skills of your fingers. You are recognizing pictures and signs, searching for things, finding them, figuring stuff out, adjusting your self-image, and nurturing your dreams. There is no loss of dignity or life in this. But I admit that we all look like zombies while we do it, and I suppose that is pretty weird.
The article goes on to quote author Katherine Hayles, who says she thinks humans are in a state of symbiosis with their computers:
“If every computer were to crash tomorrow, it would be catastrophic,” she says. “Millions or billions of people would die. That’s the condition of being a symbiont.”
Let that sink in. At any moment a catastrophic event could fry our entire digital infrastructure in one fell swoop. Our civilization teeters on a house of cards as high as Mount Everest! To me this is the only reason the Screen Age should be frightening, but it’s very frightening indeed.
Turning now to sensation, Hayles mentions that touch and smell are suppressed by bipedalism:
“You could say when humans started to walk upright, we lost touch with the natural world. We lost an olfactory sense of the world, but obviously bipedalism paid big dividends.”
Note that bipedalism is associated with a loss of tactility, but it has also been credited with enabling more complex manual dexterity. Maybe there is a general principle here: ambient tactile awareness is inversely correlated with prehension.
After a brief ensuing discussion of dualism and the advent of location-based services, we’re back to the gardens:
The difficulty, Harrison argues, is that we are losing something profoundly human, the capacity to connect deeply to our environments… “For the gardens to become fully visible in space, they require a temporal horizon that the age makes less and less room for.”
I like the point about a garden’s time horizon. But it’s used to complain about the discomfort of our rushed lifestyle, which I would argue is separable from communication technology. The heads-buried-in-screens thing doesn’t really affect whether we have time for gardens.
An interesting footnote offered by Harrison is that the Czech playwright Karel Capek, who invented the word ‘robot,’ was a gardener.
Finally, this is the photo that accompanies the article:
…captioned, “Fingers on the political pulse.” The article is about looking and being present, but the picture is about hands, heartbeat, and hapticity.