Friday, April 26, 2024

We Still Don’t Fully Understand Time

6 MINUTE READ
IDEAS
Rees is Astronomer Royal, former President of the Royal Society, Fellow (and former Master) of Trinity College, Cambridge, and Emeritus Professor of Cosmology and Astrophysics at the University of Cambridge. His latest book is If Science Is to Save Us.

In our everyday lives, time is a precious commodity. We can gain or lose it. We can save, spend or waste it. If our crimes are revealed, we risk having to do time.

To scientists, time is something we can measure. Clocks have, over the centuries, been the high-tech artifacts of their era—the water clock, the pendulum clock, Harrison’s chronometer, and so forth up to the incredible precision of atomic clocks—marvels of modern technology, albeit without the evident aesthetic quality of more traditional timepieces. (Though engineering friends tell me that, viewed through a microscope, there’s beauty in the intricacies of a silicon chip.)

Before there was a reliable calendar—or any records, or artifacts that could be reliably dated—the past was a ‘fog’. But this didn’t stop efforts to impose fancifully precise chronologies. The most precise of all was worked out by James Ussher, Archbishop of Armagh, according to which the world began at 6 pm on Saturday, 22 October, 4004 B.C. Right up until 1910, bibles published by Oxford University Press displayed Ussher's chronology alongside the text.

Even in the 17th century, Ussher's estimates ran into problems. Jesuit missionaries returned from China, telling of detailed historical records dating back to dynasties before 2350 B.C.—the proclaimed date for Noah's Flood. Many were sceptical that the entire history of Earth's mountains, rivers and fauna could be squeezed into 6,000 years. Sir Isaac Newton, who in his old age had abandoned science and was obsessed with completing his own 'Chronology of Ancient Kingdoms', did not contest Ussher's dating of human origins, but conjectured that the six 'days' of Genesis could each have been a prolonged era.

In the 19th century, Darwin's genius was to recognise how "natural selection of favoured variations" could have transformed primordial life into the amazing varieties of creatures, now mainly extinct, that have crawled, swum or flown on Earth. But this emergence—a higgledy-piggledy process, proceeding without any guiding hand—is inherently very slow. Darwin guessed that evolution required not just millions but hundreds of millions of years. He was mindful of supporting evidence from geology. He estimated, by an argument that was actually flawed (and which he cut from later editions of his book), that to carve out the Weald of Kent took 300 million years. If he had seen the Grand Canyon he could have made a more convincing estimate.

Precise radioactive dating now tells us that the Sun and its planets condensed 4.55 billion years ago from interstellar gas in the Milky Way—itself a galaxy that, along with billions of others, is part of a still vaster cosmos that emerged from a fiery "beginning" about 13.8 billion years ago. Ever richer data from giant telescopes has allowed cosmologists to develop a credible scenario of our expanding universe. The timechart can be confidently extrapolated back to an era when everything was squeezed as dense as an atomic nucleus. At that time the universe had been expanding for only a millisecond. But that first millisecond—when crucial features of the universe were laid down—is still mysterious and speculative; the densities and temperatures were far higher than can be achieved in a lab, and so we lose our foothold in experimentally tested physics.

And what happened ‘before the beginning’? On this fundamental question, we cannot do much better than St Augustine in the 5th century. He sidestepped the issue by arguing that time itself was created with the universe. Some modern cosmologists say that time closes up on itself, and that the question is like asking what happens if you go north from the North Pole. The 'genesis event' remains in some ways as mysterious to us as it was to St Augustine.

So cosmic history, we now believe, extends over billions of years. Our time-horizons have hugely extended back into the past. But our concept of the future has stretched even more. To our 17th-century forebears, history was nearing its close. Sir Thomas Browne wrote: "The World itself seems in the wane. A greater part of Time is spun than is to come."

But that hardly seems credible to an astronomer—indeed, we are probably still nearer the beginning than the end. Our Sun is less than halfway through its life; it will shine for another 6 billion years before its nuclear fuel runs out. It will then flare up, engulfing the inner planets. And the expanding universe will continue—perhaps for ever—destined to become ever colder, ever emptier. To quote Woody Allen, eternity is very long, especially towards the end.

The traditional view, even among those who accept Darwinian evolution, is that we humans are necessarily the culmination of the evolutionary tree. But in the perspective of a vastly prolonged cosmic future, it’s more reasonable to conjecture that we haven’t even reached the half-way stage in the progressive emergence of complexity in the cosmos. Any creatures who witness the death of the sun (having long before then developed the technology to escape to a safe distance) may be as different from us as we are from slime mould.

But even in the immensely concertinaed timescape that modern cosmology reveals, extending billions of years into the future as well as the past, this century is special. It’s the first in the 45 million centuries of Earth’s history when one species, ours, can determine the entire planet’s fate. We’ve entered what’s sometimes termed the ‘Anthropocene’. The collective ‘footprint’ of humans on the Earth is heavier than ever; today’s decisions on environment and energy, empowered by our scientific knowledge, resonate centuries ahead and will determine the fate of the entire biosphere, and how future generations live.

Despite our awareness of the aeons lying ahead, our planning horizons have shrunk because our lives are changing so fast. The political focus is on the urgent and immediate, and the next election. Medieval cathedrals took a century or more to complete. There are few efforts by public or private sectors to plan more than two or three decades ahead—or to build structures that will, as the cathedrals have done, offer inspiration for a millennium.

Even more crucial is the possibility that humans will acquire the capability to redesign or ‘enhance’ themselves via genetic modification; or to deploy ‘cyborg’ techniques that enable them to implant in their progeny the advantages of electronic computers. This evolution via ‘secular intelligent design’ could operate faster than Darwinian selection.

Perhaps our remote descendants will have a much-enhanced lifespan; they might even become near-immortal. Such entities, whose mental powers and attitudes are beyond our grasp—perhaps even beyond our imagination—would surely not feel ‘prisoners of time’, as we mortals do. Would they, like us, ‘spend and save’ it as a scarce resource? Or would an over-abundance lead to ennui? Only time can tell.




Academic Freedom Is More Important Now Than Ever

7 MINUTE READ
IDEAS
Roth is president of Wesleyan University. His most recent book is The Student: A Short History.

At least since the 1800s, colleges and universities in the United States have emphasized their civic missions. American college students weren’t just supposed to get better at exams and recitations; they were supposed to develop character traits that would make them better citizens. In the last fifty years, whether you attended a large public university or a small private college, chances are your school’s mission statement included language that emphasized the institution’s contribution to the public good. So why today is there a chorus of critics urging higher education leaders to cultivate neutrality, to cede the public sphere to others?

I suppose that in these days of social polarization and hyper-partisanship, some see campus life as a retreat from the bruising realities of political life. You can pursue theater or biology, religion or economics, without worrying too much what the person sitting next to you thinks of the political issue of the day. And if your institution has no political commitments, so the thinking seems to go, you may feel more inclined to form your own, or just not to have any at all. Whether one weds this to a monastic view of higher education or to a skills-based vocational one, one can feel that university life provides a respite from the take-a-stand demands of the political. While there is certainly freedom in that, it should be remembered that we only have that freedom because of guarantees established by generations of political struggle.

One of those fundamental guarantees is the ability to choose one’s field of study, to conduct research without political intervention, and to openly discuss the results of one’s work. In his forthcoming book Academic Freedom: From Professional Norm to First Amendment Right, David Rabban argues convincingly that academic freedom is a distinctive First Amendment right, one which protects teachers and researchers while enabling society as a whole to benefit from the production and dissemination of knowledge. The American Association of University Professors sketched out this foundational professional norm in 1915. Scholars are free to explore issues and to debate them; they should be able to take positions that might turn out to be very unpopular in the broader political realm. Rabban shows how these protections have been upheld in a series of court decisions over the last hundred years. Whether teaching the Bible, a contemporary video, or evolutionary biology, professors should not have to worry that political pressures will force them away from a path of inquiry or a mode of expression.

Today, though, such worries abound. Libraries are banning books at alarming rates, faculty are being disciplined for their political views, and student rights to protest are being curtailed beyond appropriate “time and place” guidelines. Issues around the right to speak one’s mind are front and center on college campuses, thrust there in part because of protests around the war in Gaza and the spectacle of congressional hearings about antisemitism. Ivy League institutions have attracted the most attention for outbursts of Jew hatred even as scores of civilian hostages from Israel are subject to torture, rape, and brutal isolation. But rejecting the vicious tactics of the war in Gaza doesn’t make one a supporter of terrorist violence; starving Palestinians in Gaza won’t secure freedom for the hostages. In these dire conditions, it’s no wonder that colleges are faced with legitimate protests as well as traditional expressions of prejudice and hate. On campuses across the country, Islamophobia and antisemitic harassment can destroy the conditions for learning, but mass arrests of peaceful protestors and the censorship of speakers for their political views only undermine academic freedom in the long run.

In recent years, some progressives have expressed doubts about free speech, describing it as a neo-liberal disguise of existing power relations. Now, however, as those aligned with pro-Palestinian movements are being censured or arrested, the old liberal approach to freedom of expression seems more attractive to many protestors. Historians Amna Khalid and Jeffrey Snyder have been writing about these issues, and recently they gave a powerful presentation to a packed house at Wesleyan. They began with a critical appraisal of Florida’s various attacks on academic freedom, which on my campus was preaching to the choir. But the speakers went on to discuss how many of the efforts that fall under the popular rubric of inclusion are also aimed at suppressing speech, ostensibly to protect so-called vulnerable communities. Protecting students against speech in the name of harm reduction is almost always a mistake, they argued. Here, the debate got interesting. And this was their point: debates are only interesting when people are free to disagree, listen to opposing views, and change their minds. But they also seemed to agree that in some cases intimidation could be intense enough to warrant restrictions on expression. I call this a “safe enough space” approach to speech, but in the discussion following their talk it was clear that they thought my pragmatist position left too much room for unwarranted “safetyism.” There was a real conversation; students and faculty were fully engaged.

And that, of course, is when learning happens: when we are engaged in deep listening and in trying to think for ourselves in the company of others. This is what I argue in The Student: A Short History: being a student—at whatever age—means being open to others in ways that allow one to expand one’s thinking, to enhance one’s capacities for appreciation, for empathy and for civic participation. That participation energizes a virtuous circle because it’s by engaging with others that one multiplies possibilities for learning (and then for further engagement).

That virtuous circle depends on freedoms in a political context under direct threat from the populist authoritarian movement led by Donald Trump. When Trump attacks his enemies, when he talks about them as thugs and vermin or proposes his own national university to replace the elites so despised by his base, he is telegraphing his intentions to remake higher education in the image of his MAGA movement. Many academics seem to shrug their shoulders, either saying that “other politicians aren’t so great either,” or suggesting (as elites often have when faced with growing fascist threats) that he doesn’t really mean what he says. This is a grave mistake, as we’ve seen before in history.

Neutrality, whether based on principle, apathy or cynicism, today feeds collaboration. The attack on democracy, the attack on the rule of law, will also sweep away the freedoms that higher education has won over the last 100 years.

We can fight back. Between now and November 5th, many of our students, faculty, staff and alumni will be practicing freedom by participating in the electoral process. They will work on behalf of candidates and in regard to issues bearing on the future of academic freedom, free speech, and the possibilities for full engagement with others. This is challenging work. In the noise of contemporary politics, it’s hard to practice authentic listening; in the glare of the media’s campaign cameras, it’s hard to see things from someone else’s point of view.

But that is our task. If we are to strengthen our democracy and the educational institutions that depend on it, we must learn to practice freedom better. We must learn to be better students. Our future depends on it.