NYC

It was 1978. I was new to New York. A rich acquaintance had invited me to a housewarming party, and, as my cabdriver wound his way down increasingly potholed and dingy streets, I began wondering whether he'd got the address right. Finally he stopped at the doorway of a gloomy, unwelcoming industrial building. Two winos were crumpled on the steps, oblivious. There was no other sign of life in the whole street.

"I think you may have made a mistake", I ventured.

But he hadn't. My friend's voice called "Top Floor!" when I rang the bell, and I thought - knowing her sense of humour - "Oh this is going to be some kind of joke!" I was all ready to laugh. The elevator creaked and clanked slowly upwards, and I stepped out - into a multi-million dollar palace. The contrast with the rest of the building and the street outside couldn't have been starker.

I just didn't understand. Why would anyone spend so much money building a place like that in a neighbourhood like this? Later I got into conversation with the hostess. "Do you like it here?" I asked. "It's the best place I've ever lived", she replied. "But I mean, you know, is it an interesting neighbourhood?" "Oh, the neighbourhood? Well that's outside!" she laughed.

The incident stuck in my mind. How could you live so blind to your surroundings? How could you not think of "where I live" as including at least some of the space outside your four walls, some of the bits you couldn't lock up behind you? I felt this was something particular to New York: I called it "The Small Here". I realised that, like most Europeans, I was used to living in a bigger Here.

I noticed that this very local attitude to space in New York paralleled a similarly limited attitude to time. Everything was exciting, fast, current, and temporary. Enormous buildings came and went, careers rose and crashed in weeks. You rarely got the feeling that anyone had the time to think two years ahead, let alone ten or a hundred. Everyone seemed to be passing through. It was undeniably lively, but the downside was that it seemed selfish, irresponsible and randomly dangerous. I came to think of this as "The Short Now", and this suggested the possibility of its opposite - "The Long Now".

"Now" is never just a moment. The Long Now is the recognition that the precise moment you're in grows out of the past and is a seed for the future. The longer your sense of Now, the more past and future it includes. It's ironic that, at a time when humankind is at a peak of its technical powers, able to create huge global changes that will echo down the centuries, most of our social systems seem geared to increasingly short nows. Huge industries feel pressure to plan for the bottom line and the next shareholders meeting. Politicians feel forced to perform for the next election or opinion poll. The media attract bigger audiences by spurring instant and heated reactions to human interest stories while overlooking longer-term issues - the real human interest.

Meanwhile, we struggle to negotiate our way through an atmosphere of utopian promises and dystopian threats, a minefield studded with pots of treasure. We face a future where almost anything could happen. Will we be crippled by global warming, weapons proliferation and species depletion, or liberated by space travel, world government and molecule-sized computers? We don't even want to start thinking about it. This is our peculiar form of selfishness, a studied disregard of the future. Our astonishing success as a technical civilisation has led us to complacency, to expect that things will probably just keep getting better.

But there is no reason to believe this. We might be living in the last gilded bubble of a great civilisation about to collapse into a new Dark Age, which, given our hugely amplified and widespread destructive powers, could be very dark indeed.

If we want to contribute to some sort of tenable future, we have to reach a frame of mind where it comes to seem unacceptable - gauche, uncivilised - to act in disregard of our descendants. Such changes of social outlook are quite possible - it wasn't so long ago, for example, that we accepted slavery, an idea which most of us now find repellent. We did not regard slaves as fellow-humans, and thus placed them outside the circle of our empathy. This changed as we began to realise - perhaps partly through the glory of their music - that they were real people, and that it was no longer acceptable that we should cripple their lives just so that ours could be freer. It just stopped feeling right.

The same type of change happened when we stopped employing kids to work in mines, or when we began to accept that women had voices too. Today we view as fellow-humans many whom our grandparents may have regarded as savages, and even feel some compulsion to share their difficulties - aid donations by individuals to others they will never meet continue to increase. These extensions of our understanding of who qualifies for our empathy indicate that culturally, economically and emotionally we live in an increasingly Big Here, unable to lock a door behind us and pretend the rest of the world is just "outside".

We don't yet, however, live in The Long Now. Our empathy doesn't extend far forward in time. We need now to start thinking of our great-grandchildren, and their great-grandchildren, as other fellow-humans who are going to live in a real world which we are incessantly, though only semi-consciously, building. But can we accept that our actions and decisions have distant consequences, and yet still dare to do anything? It was an act of complete faith to believe, in the days of slavery, that a way of life which had been materially very successful could be abandoned and replaced by another, as yet unimagined, but somehow it happened. We need to make a similar act of imagination now.

Since this act of imagination concerns our relationship to time, a Millennium is a good moment to articulate it. Can we grasp this sense of ourselves as existing in time, part of the beautiful continuum of life? Can we become inspired by the prospect of contributing to the future? Can we shame ourselves into thinking that we really do owe those who follow us some sort of consideration, just as the people of the nineteenth century shamed themselves out of slavery? Can we extend our empathy to the lives beyond ours?

I think we can. Humans are capable of a unique trick: creating realities by first imagining them, by experiencing them in their minds. When Martin Luther King said "I have a dream", he was inviting others to dream it with him. Once a dream becomes shared in that way, current reality gets measured against it and then modified towards it. As soon as we sense the possibility of a more desirable world, we begin behaving differently, as though that world is starting to come into existence, as though, in our minds at least, we're already there. The dream becomes an invisible force which pulls us forward. By this process it starts to come true. The act of imagining something makes it real.

This imaginative process can be seeded and nurtured by artists and designers, for, since the beginning of the 20th century, artists have been moving away from an idea of art as something finished, perfect, definitive and unchanging towards a view of artworks as processes or the seeds for processes - things that exist and change in time, things that are never finished. Sometimes this is quite explicit - as in Walter de Maria's "The Lightning Field", a huge grid of metal poles designed to attract lightning. Many musical compositions don't have one form, but change unrepeatingly over time - many of my own pieces and Jem Finer's Artangel installation "Longplayer" are like this. Artworks in general are increasingly regarded as seeds - seeds for processes that need a viewer's (or a whole culture's) active mind in which to develop. Increasingly working with time, culture-makers see themselves as people who start things, not finish them.

And what is possible in art becomes thinkable in life. We become our new selves first in simulacrum, through style and fashion and art, our deliberate immersions in virtual worlds. Through them we sense what it would be like to be another kind of person with other kinds of values. We rehearse new feelings and sensitivities. We imagine other ways of thinking about our world and its future.

Danny Hillis's Clock of the Long Now is a project designed to achieve such a result. It is, on the face of it, far-fetched to think that one could make a clock which will survive and work for the next 10,000 years. But the act of even trying is valuable: it puts time and the future on the agenda and encourages thinking about them. As Stewart Brand, a colleague in The Long Now Foundation, says:

“Such a clock, if sufficiently impressive and well engineered, would embody deep time for people. It should be charismatic to visit, interesting to think about, and famous enough to become iconic in the public discourse. Ideally, it would do for thinking about time what the photographs of Earth from space have done for thinking about the environment. Such icons reframe the way people think.”

The 20th Century yielded its share of icons, icons like Muhammad Ali and Madonna that inspired our attempts at self-actualisation and self-reinvention. It produced icons of our careless and misdirected power - the mushroom cloud, Auschwitz - and of our capacity for compassion - Live Aid, the Red Cross.

In this, the 21st century, we may need icons more than ever before. Our conversation about time and the future must necessarily be global, so it needs to be inspired and consolidated by images that can transcend language and geography. As artists and culture-makers begin making time, change and continuity their subject-matter, they will legitimise and make emotionally attractive a new and important conversation.


Prozac blues.

No matter how corrupt, greedy, and heartless our government, our corporations, our media, and our religious & charitable institutions may become, the music will still be wonderful.

If I should ever die, God forbid, let this be my epitaph:

THE ONLY PROOF HE NEEDED
FOR THE EXISTENCE OF GOD
WAS MUSIC

***

Back to music. It makes practically everybody fonder of life than he or she would be without it. Even military bands, although I am a pacifist, always cheer me up. And I really like Strauss and Mozart and all that, but the priceless gift that African Americans gave the whole world when they were still in slavery was a gift so great that it is now almost the only reason many foreigners still like us at least a little bit. That specific remedy for the worldwide epidemic of depression is a gift called the blues. All pop music today—jazz, swing, be-bop, Elvis Presley, the Beatles, the Stones, rock-and-roll, hip-hop, and on and on—is derived from the blues.
 
A gift to the world? One of the best rhythm-and-blues combos I ever heard was three guys and a girl from Finland playing in a club in Krakow, Poland.
 
The wonderful writer Albert Murray, who is a jazz historian and a friend of mine among other things, told me that during the era of slavery in this country—an atrocity from which we can never fully recover—the suicide rate per capita among slave owners was much higher than the suicide rate among slaves.
 
Murray says he thinks this was because slaves had a way of dealing with depression, which their white owners did not: They could shoo away Old Man Suicide by playing and singing the Blues. He says something else which also sounds right to me. He says the blues can't drive depression clear out of a house, but can drive it into the corners of any room where it's being played. 

[Cover: A Man Without a Country]

«Ogilvy on advertising», the prequel.

After spending a few weeks getting a solid grounding in opinion research, Ogilvy accompanied Gallup to Hollywood. They pitched their services to the head of RKO studios, pointing out the competitive advantages of measuring the popularity of movie stars, pretesting audience acceptance of movie ideas and titles, and forecasting trends. RKO awarded them a twelve-month contract, and other studios soon followed suit, noting that David Selznick «took to ordering surveys the way other people order groceries.» Ogilvy admired Gallup immensely and gained a deep respect for the value of opinion research as a predictive tool in everything from marketing to politics. He found his time in Hollywood both entertaining and instructive and hobnobbed with some of the most famous movie stars of the day, almost all of whom he considered «repulsive egotists.» As a result of his audience research, Ogilvy discovered that certain marquee names had a negative effect on a picture's earnings, and he assembled a classified list he called «box office poison» that prematurely ended many a career. «There is no great trick to doing research,» Ogilvy later observed. «The problem is to get people to use it—particularly when the research reveals that you have been making mistakes.» Most people, he found, had «a tendency to use research as a drunkard uses a lamppost—for support, not for illumination.»

(…)

Stephenson had sent Fleming there in 1942 and had been impressed with how well he had come through the course, recalling that he was «top of his section,» though he lacked the killer's instinct, and had hesitated—a fatal error—during an exercise in which he was expected to «shoot a man in cold blood.» While the camp schooled secret agents, spies, and guerrilla fighters who went on to carry out BSC missions in enemy-occupied Europe and Asia, most of the people sent on the course with Ogilvy had been recruited to do intelligence or propaganda work, had backgrounds in journalism and foreign relations, and knew little or nothing about spycraft beyond the jobs they were doing at their typewriters. At Camp X, Ogilvy and his fellow trainees donned army fatigues designed to help maintain the facility's cover as a regular army base, and attended lectures on the new high technology of espionage, from the use of codes and ciphers to listening devices, and observed awe-inspiring demonstrations of silent killing and underwater demolitions. They also received some limited practice in how to use a handgun and shoot quickly and accurately without hesitation. «I was taught the tricks of the trade,» recalled Ogilvy. «How do you follow people without arousing their suspicion? Walk in front of them; if you also push a pram this will disarm their suspicions still further. I was taught to use a revolver, to blow up bridges and power lines with plastic, to cripple police dogs by grabbing their front legs and tearing their chests apart, and to kill a man with my bare hands.»

Fully expecting to be parachuted behind enemy lines, he was a little let down when Stephenson assigned him to desk duty.

Google Street View, word edition.

This is a story about love and death in the golden land, and begins with the country. The San Bernardino Valley lies only an hour east of Los Angeles by the San Bernardino Freeway but is in certain ways an alien place: not the coastal California of the subtropical twilights and the soft westerlies off the Pacific but a harsher California, haunted by the Mojave just beyond the mountains, devastated by the hot dry Santa Ana wind that comes down through the passes at 100 miles an hour and whines through the eucalyptus windbreaks and works on the nerves. October is the bad month for the wind, the month when breathing is difficult and the hills blaze up spontaneously. There has been no rain since April. Every voice seems a scream. It is the season of suicide and divorce and prickly dread, wherever the wind blows. The Mormons settled this ominous country, and then they abandoned it, but by the time they left the first orange tree had been planted and for the next hundred years the San Bernardino Valley would draw a kind of people who imagined they might live among the talismanic fruit and prosper in the dry air, people who brought with them Midwestern ways of building and cooking and praying and who tried to graft those ways upon the land. The graft took in curious ways. This is the California where it is possible to live and die without ever eating an artichoke, without ever meeting a Catholic or a Jew. This is the California where it is easy to Dial-A-Devotion, but hard to buy a book. This is the country in which a belief in the literal interpretation of Genesis has slipped imperceptibly into a belief in the literal interpretation of Double Indemnity, the country of the teased hair and the Capris and the girls for whom all life’s promise comes down to a waltz-length white wedding dress and the birth of a Kimberly or a Sherry or a Debbi and a Tijuana divorce and a return to hairdressers’ school. “We were just crazy kids,” they say without regret, and look to the future. The future always looks good in the golden land, because no one remembers the past. Here is where the hot wind blows and the old ways do not seem relevant, where the divorce rate is double the national average and where one person in every thirty-eight lives in a trailer. Here is the last stop for all those who come from somewhere else, for all those who drifted away from the cold and the past and the old ways. Here is where they are trying to find a new life style, trying to find it in the only places they know to look: the movies and the newspapers.


Lies: About 1,160,000,000 results (0.51 seconds)

Frankly, the overwhelming majority of academics have ignored the data explosion caused by the digital age. The world’s most famous sex researchers stick with the tried and true. They ask a few hundred subjects about their desires; they don’t ask sites like PornHub for their data. The world’s most famous linguists analyze individual texts; they largely ignore the patterns revealed in billions of books. The methodologies taught to graduate students in psychology, political science, and sociology have been, for the most part, untouched by the digital revolution. The broad, mostly unexplored terrain opened by the data explosion has been left to a small number of forward-thinking professors, rebellious grad students, and hobbyists. That will change.

(...)

Everybody lies. People lie about how many drinks they had on the way home. They lie about how often they go to the gym, how much those new shoes cost, whether they read that book. They call in sick when they’re not. They say they’ll be in touch when they won’t. They say it’s not about you when it is. They say they love you when they don’t. They say they’re happy while in the dumps. They say they like women when they really like men. People lie to friends. They lie to bosses. They lie to kids. They lie to parents. They lie to doctors. They lie to husbands. They lie to wives. They lie to themselves. And they damn sure lie to surveys. Here’s my brief survey for you:

Have you ever cheated in an exam?

Have you ever fantasised about killing someone?

Were you tempted to lie?

Many people underreport embarrassing behaviours and thoughts on surveys. They want to look good, even though most surveys are anonymous. This is called social desirability bias. 

An important paper in 1950 provided powerful evidence of how surveys can fall victim to such bias. Researchers collected data, from official sources, on the residents of Denver: what percentage of them voted, gave to charity, and owned a library card. They then surveyed the residents to see if the percentages would match. The results were, at the time, shocking. What the residents reported to the surveys was very different from the data the researchers had gathered. Even though nobody gave their names, people, in large numbers, exaggerated their voter registration status, voting behaviour, and charitable giving.

Has anything changed in 65 years? In the age of the internet, not owning a library card is no longer embarrassing. But, while what’s embarrassing or desirable may have changed, people’s tendency to deceive pollsters remains strong. A recent survey asked University of Maryland graduates various questions about their college experience. The answers were compared with official records. People consistently gave wrong information, in ways that made them look good. Fewer than 2% reported that they graduated with lower than a 2.5 GPA (grade point average). In reality, about 11% did. And 44% said they had donated to the university in the past year. In reality, about 28% did.

Then there’s that odd habit we sometimes have of lying to ourselves. Lying to oneself may explain why so many people say they are above average. How big is this problem? More than 40% of one company’s engineers said they are in the top 5%. More than 90% of college professors say they do above-average work. One-quarter of high school seniors think they are in the top 1% in their ability to get along with other people. If you are deluding yourself, you can’t be honest in a survey.

The more impersonal the conditions, the more honest people will be. For eliciting truthful answers, internet surveys are better than phone surveys, which are better than in-person surveys. People will admit more if they are alone than if others are in the room with them. However, on sensitive topics, every survey method will elicit substantial misreporting. People have no incentive to tell surveys the truth.

How, therefore, can we learn what our fellow humans are really thinking and doing? Big data. Certain online sources get people to admit things they would not admit anywhere else. They serve as a digital truth serum. Think of Google searches. Remember the conditions that make people more honest. Online? Check. Alone? Check. No person administering a survey? Check.

The power in Google data is that people tell the giant search engine things they might not tell anyone else. Google was invented so that people could learn about the world, not so researchers could learn about people, but it turns out the trails we leave as we seek knowledge on the internet are tremendously revealing.

I have spent the past four years analysing anonymous Google data. The revelations have kept coming. Mental illness, human sexuality, abortion, religion, health. Not exactly small topics, and this dataset, which didn’t exist a couple of decades ago, offered surprising new perspectives on all of them. I am now convinced that Google searches are the most important dataset ever collected on the human psyche.


The new needs friends.

«In many ways, the work of a critic is easy,» Ego says. «We risk very little, yet enjoy a position over those who offer up their work and their selves to our judgment. We thrive on negative criticism, which is fun to write and to read. But the bitter truth we critics must face is that, in the grand scheme of things, the average piece of junk is probably more meaningful than our criticism designating it so. But there are times when a critic truly risks something, and that is in the discovery and defense of the new. The world is often unkind to new talent, new creations. The new needs friends.»


Animal rights vs. biblical literalism and unbridled science.

Another fundamental error of Christianity is that it has in an unnatural fashion sundered mankind from the animal world to which it essentially belongs and now considers mankind alone as of any account, regarding the animals as no more than things. This error is a consequence of creation out of nothing, after which the Creator, in the first and second chapters of Genesis, takes all the animals just as if they were things, and without so much as the recommendation of kind treatment which even a dog-seller usually adds when he parts with his dogs, hands them over to man for man to rule, that is to do with them what he likes; subsequently, in the second chapter, the Creator goes on to appoint him the first professor of zoology by commissioning him to give the animals the names they shall thenceforth bear, which is once more only a symbol of their total dependence on him, i.e. their total lack of rights.

It can truly be said: Men are the devils of the earth, and the animals are the tormented souls. This is the consequence of that installation scene in the Garden of Eden. For the mob can be controlled only by force or by religion, and here Christianity leaves us shamefully in the lurch. I heard from a reliable source that a Protestant pastor, requested by an animal protection society to preach a sermon against cruelty to animals, replied that with the best will in the world he was unable to do so, because he could find no support in his religion. The man was honest, and he was right.

When I was studying at Göttingen, Blumenbach spoke to us very seriously about the horrors of vivisection and told us what a cruel and terrible thing it was; wherefore it should be resorted to only very seldom and for very important experiments which would bring immediate benefit, and even then it must be carried out as publicly as possible so that the cruel sacrifice on the altar of science should be of the maximum possible usefulness. Nowadays, on the contrary, every little medicine-man thinks he has the right to torment animals in the cruellest fashion in his torture chamber so as to decide problems whose answers have for long stood written in books into which he is too lazy and ignorant to stick his nose. – Special mention should be made of an abomination committed by Baron Ernst von Bibra at Nürnberg and, with incomprehensible naïveté, tanquam re bene gesta [as if the thing were done well], narrated by him to the public in his Vergleichende Untersuchungen über das Gehirn des Menschen und der Wirbelthiere: he deliberately let two rabbits starve to death! – in order to undertake the totally idle and useless experiment of seeing whether starvation produces a proportional change in the chemical composition of the brain! For the ends of science – n'est-ce pas? Have these gentlemen of the scalpel and crucible no notion at all then that they are first and foremost men, and chemists only secondly? How can you sleep soundly knowing you have harmless animals under lock and key in order to starve them slowly to death? Don't you wake up screaming in the night?

It is obviously high time that the Jewish conception of nature, at any rate in regard to animals, should come to an end in Europe, and that the eternal being which, as it lives in us, also lives in every animal should be recognized as such, and as such treated with care and consideration. One must be blind, deaf and dumb, or completely chloroformed by the foetor judaicus, not to see that the animal is in essence absolutely the same thing that we are, and that the difference lies merely in the accident, the intellect, and not in the substance, which is the will.

The greatest benefit conferred by the railways is that they spare millions of draught-horses their miserable existence.


Arthur Schopenhauer (1788-1860) The Horrors and Absurdities of Religion

The remembered present.

No genuine stereo perception is possible if one has lost an eye or an ear. But as Dr. Jorgensen observed, a remarkable degree of adjustment or adaptation can occur, and this depends on a variety of factors. One of these is the increased ability to make judgments using one eye or ear, a heightened use of monocular or monaural cues. Monocular cues include perspective, occlusion, and motion parallax (the shifting appearance of the visual world as we move through it), and monaural cues are perhaps analogous to these, though there are also special mechanisms peculiar to hearing. The diffusion of sound with distance can be perceived monaurally as well as binaurally, and the shape of the external ear, the pinna, provides valuable cues about both the direction and the asymmetries of sound reaching it.

If one has lost stereoscopy or stereophony, one must, in effect, recalibrate one's environment, one's spatial world—and movement here is especially important, even relatively small but very informative movements of the head. Edward O. Wilson describes in his autobiography, Naturalist, how he lost an eye in childhood but nonetheless is able to judge distances and depths with great accuracy. When I met him I was struck by a curious nodding of the head, and took this to be a habit or a tic. But he said it was nothing of the sort—it was a strategy designed to give his remaining eye alternating perspectives (such as normally the two eyes would receive), and this, he felt, combined with his memories of true stereopsis, could give him a sort of simulacrum of stereo vision. He said that he adopted these head movements after observing similar movements in animals (like birds and reptiles, for instance) whose visual fields have very little overlap. Dr. Jorgensen did not mention any comparable head movements in himself—they would not be too popular in a concert hall—but such movements might well help one construct a richer, more diverse soundscape.

There are other cues that stem from the complex nature of sounds and the vicissitudes of sound waves as they bounce off objects and surfaces around one. Such reverberation can provide an enormous amount of information even to a single ear, and this, as Daniel Levitin has remarked, has an essential role in communicating emotion and pleasure. It is for this reason that acoustical engineering is a major science and art. If a concert hall or lecture hall is badly designed, sounds may be "killed," voices and music seem "dead." Through centuries of experience, the builders of churches and auditoriums have become remarkably adept at making their buildings sing.

Dr. Jorgensen says that he believes his good ear is "better than should be expected from a seventy-year-old." One's ear, one's cochlea, cannot improve as one gets older, but as Jacob L. clearly demonstrated, the brain itself can improve its ability to make use of whatever auditory information it has. This is the power of cerebral plasticity. Whether or not "hearing fibres may have crossed in the corpus callosum" to the other ear, as Jorgensen suggests, is questionable—but there most assuredly have been significant changes in his brain as he has adapted to life with one ear. New connections must have been made, new areas recruited (and a sufficiently subtle brain-imaging technique might be able to demonstrate such changes). It seems probable, too—for vision and hearing normally complement each other and tend to compensate for each other if one is impaired—that Dr. Jorgensen, consciously or unconsciously, is using vision and visual data to map the position of instruments in the orchestra and the dimensions, spaciousness, and contours of the concert hall, as a way of reinforcing a sense of auditory space. 

Perception is never purely in the present - it has to draw on experience of the past; this is why Gerald M. Edelman speaks of "the remembered present." We all have detailed memories of how things have previously looked and sounded, and these memories are recalled and admixed with every new perception. Such perceptions must be especially powerful in a strongly musical person, a habitual concertgoer like Dr. Jorgensen, and imagery is surely recruited to complement one's perceptions, especially if perceptual input is limited. "Every act of perception," Edelman writes, "is to some degree an act of creation, and every act of memory is to some degree an act of imagination." In this way the brain's experience and knowledge are called upon, as well as its adaptability and resilience. What is remarkable in Dr. Jorgensen's case, at least, is that, after such a severe loss, with no possibility of function being restored in the ordinary sense, there has nonetheless been a significant reconstruction of function, so that much of what seemed irretrievably lost is now available to him again. Though it took some months, he has, against all expectation, been able to recover in large measure what was most important to him: the richness, the resonance, and the emotional power of music.


The truth about reality.

Reality is everything that exists. That sounds straightforward, doesn’t it? Actually, it isn’t. There are various problems. What about dinosaurs, which once existed but exist no longer? What about stars, which are so far away that, by the time their light reaches us and we can see them, they may have fizzled out?

We’ll come to dinosaurs and stars in a moment. But in any case, how do we know things exist, even in the present? Well, our five senses – sight, smell, touch, hearing and taste – do a pretty good job of convincing us that many things are real: rocks and camels, newly mown grass and freshly ground coffee, sandpaper and velvet, waterfalls and doorbells, sugar and salt. But are we only going to call something ‘real’ if we can detect it directly with one of our five senses?

What about a distant galaxy, too far away to be seen with the naked eye? What about a bacterium, too small to be seen without a powerful microscope? Must we say that these do not exist because we can’t see them? No. Obviously we can enhance our senses through the use of special instruments: telescopes for the galaxy, microscopes for bacteria. Because we understand telescopes and microscopes, and how they work, we can use them to extend the reach of our senses – in this case, the sense of sight – and what they enable us to see convinces us that galaxies and bacteria exist.

How about radio waves? Do they exist? Our eyes can’t detect them, nor can our ears, but again special instruments – television sets, for example – convert them into signals that we can see and hear. So, although we can’t see or hear radio waves, we know they are a part of reality. As with telescopes and microscopes, we understand how radios and televisions work. So they help our senses to build a picture of what exists: the real world – reality. Radio telescopes (and X-ray telescopes) show us stars and galaxies through what seem like different eyes: another way to expand our view of reality.

Back to those dinosaurs. How do we know that they once roamed the Earth? We have never seen them or heard them or had to run away from them. Alas, we don’t have a time machine to show them to us directly. But here we have a different kind of aid to our senses: we have fossils, and we can see them with the naked eye. Fossils don’t run and jump but, because we understand how fossils are formed, they can tell us something of what happened millions of years ago. We understand how water, with minerals dissolved in it, seeps into corpses buried in layers of mud and rock. We understand how the minerals crystallize out of the water and replace the materials of the corpse, atom by atom, leaving some trace of the original animal’s form imprinted on the stone. So, although we can’t see dinosaurs directly with our senses, we can work out that they must have existed, using indirect evidence that still ultimately reaches us through our senses: we see and touch the stony traces of ancient life.

In a different sense, a telescope can work like a kind of time machine. What we see when we look at anything is actually light, and light takes time to travel. Even when you look at a friend’s face you are seeing them in the past, because the light from their face takes a tiny fraction of a second to travel to your eye. Sound travels much more slowly, which is why you see a firework burst in the sky noticeably earlier than you hear the bang. When you watch a man chopping down a tree in the distance, there is an odd delay in the sound of his axe hitting the tree.

Light travels so fast that we normally assume anything we see happens at the instant we see it. But stars are another matter. Even the sun is eight light-minutes away. If the sun blew up, this catastrophic event wouldn’t become a part of our reality until eight minutes later. And that would be the end of us! As for the next nearest star, Proxima Centauri, if you look at it in 2012, what you are seeing is happening in 2008. Galaxies are huge collections of stars. We are in one galaxy called the Milky Way. When you look at the Milky Way’s next-door neighbour, the Andromeda galaxy, your telescope is a time machine taking you back two and a half million years. There’s a cluster of five galaxies called Stephan’s Quintet, which we see through the Hubble telescope spectacularly colliding with each other. But we see them colliding 280 million years ago. If there are aliens in one of those colliding galaxies with a telescope powerful enough to see us, what they are seeing on Earth, at this very moment, here and now, is the early ancestors of the dinosaurs.
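The look-back arithmetic behind these examples is easy to check for yourself: the delay is simply distance divided by the speed of light. Here is a minimal sketch of my own (not part of Dawkins's text), using rounded, commonly quoted distance figures:

```python
# Back-of-the-envelope look-back times: how far in the past is the light we see?
# Distances are rounded approximations: ~1 AU to the sun, ~4.24 light-years to
# Proxima Centauri, ~2.5 million light-years to the Andromeda galaxy.

C_KM_PER_S = 299_792                       # speed of light, km/s
SECONDS_PER_YEAR = 365.25 * 24 * 3600
KM_PER_LY = C_KM_PER_S * SECONDS_PER_YEAR  # kilometres in one light-year

distances_km = {
    "Sun": 149.6e6,                        # ~1 astronomical unit
    "Proxima Centauri": 4.24 * KM_PER_LY,  # nearest star
    "Andromeda galaxy": 2.5e6 * KM_PER_LY, # nearest large galaxy
}

for name, km in distances_km.items():
    seconds = km / C_KM_PER_S              # travel time = distance / speed
    years = seconds / SECONDS_PER_YEAR
    if years < 1:
        print(f"{name}: light takes ~{seconds / 60:.1f} minutes to reach us")
    else:
        print(f"{name}: we see it as it was ~{years:,.0f} years ago")
```

Run as written, it reproduces the figures in the passage: about eight light-minutes to the sun, roughly four years to Proxima Centauri, and some two and a half million years to Andromeda.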

Are there really aliens in outer space? We’ve never seen or heard them. Are they a part of reality? Nobody knows; but we do know what kind of things could one day tell us if they are. If ever we got near to an alien, our sense organs could tell us about it. Perhaps somebody will one day invent a telescope powerful enough to detect life on other planets from here. Or perhaps our radio telescopes will pick up messages that could only have come from an alien intelligence. For reality doesn’t just consist of the things we already know about: it also includes things that exist but that we don’t know about yet and won’t know about until some future time, perhaps when we have built better instruments to assist our five senses.

Atoms have always existed, but it was only rather recently that we became sure of their existence, and it is likely that our descendants will know about many more things that, for now, we do not. That is the wonder and the joy of science: it goes on and on uncovering new things. This doesn’t mean we should believe just anything that anybody might dream up: there are a million things we can imagine but which are highly unlikely to be real – fairies and hobgoblins, leprechauns and hippogriffs. We should always be open-minded, but the only good reason to believe that something exists is if there is real evidence that it does.


On the conclusion of species.

Professor Jerison has studied the evolution of the human brain in depth, concentrating above all on the supposed advantages of a highly developed brain relative to body size and weight. Yet neurologists maintain that depression is a mental illness which affects only beings that have the capacity to reflect on themselves and to think about their past and their future. In other words, only humans can get depressed, because they have big brains. In that sense, might we be suffering from an excess of brain?

According to the famous British psychiatrist Tim Crow, we must look for the origin of schizophrenia in the evolution of larger brains. Crow maintains that schizophrenia is the price Homo sapiens pays for the capacity for language. The idea is certainly suggestive, but for Jerison the interesting aspect is the enlargement of the brain itself, which represents a new evolutionary leap, of a magnitude comparable to the one that occurred 200 million years ago, when, in the passage from reptiles to mammals, animals came to need hearing and smell as well as sight.

«In the evolution of all the hominid lineages, from Australopithecus onwards, at a certain point there arose the need to possess a more precise "map" of the territory they occupied, which now consisted of a few square kilometres, and not just a few square metres, as in the case of the reptile.» This need for information lies at the origin of the enlargement of the human brain. According to Jerison, it must have happened when the human brain was about the size of a chimpanzee's, or a little larger, and the evolution towards language came from the need to know, and to recognise, a more extensive territory. And depression comes from knowledge, not from size. For, as Jerison remarks, not without a certain irony, «when we know the world better we also know ourselves better, and when you know yourself, it is quite likely that you won't like yourself so much.» Thus, for him, schizophrenia, depression and even bipolar disorders could have their origin in this better knowledge of ourselves. It is a highly complex matter, influenced by multiple factors, among them the knowledge that we will die - the renunciation of immortality that came with the change in our system of reproduction. Awareness of death has been a great driving force in evolution. With the exception of man, no animal possesses it. So, as you can see, where intelligence is concerned, it is not all advantages.


Eduard Punset (2008) Per què som com som