
A limit to collapse?

Nobel laureate Subrahmanyan Chandrasekhar believed that scientists do their most innovative work in their 20s and 30s
First Published: Thu, Jan 31 2013. 06 08 PM IST
An artist’s impression of a giant black hole. Photo: AFP
Here in the rolling hills of Chittoor district, some spunky, charming kids and I have been looking up at the stars every chance we get. The night sky is a thing of beauty and I hope they learn to love it. Especially, how each innocently twinkling speck is really a vast ball of fiercely burning gases.
But fires eventually die out. So there must come a time when a star uses up all its gas. What happens then? Astronomers have long known that a dying star collapses on itself. A small one, like our Sun, shrinks until it is a dense globe called a white dwarf. Many such creatures dot our sky.
But what about larger stars? Plenty of what we know about them, we learnt from the great astronomer Subrahmanyan Chandrasekhar. After studying the way stars collapse, he reached a profound conclusion in 1930. Stars larger than about 1.4 times the Sun’s mass, he suggested, won’t stay as white dwarfs when they collapse. Instead, they will get steadily smaller and denser. Some, he predicted, will eventually shrink into small regions that are so dense, with such powerful gravitational pulls, that nothing can escape them. Nothing, meaning not even light.
You know: black holes.
Two things about Chandrasekhar have always fascinated me.
First, the elegant reasoning that led to that number 1.4. Start with this: if a star is a ball of gases, what keeps it so? Why doesn’t it collapse on itself, like a pricked balloon? Well, what happens in stars is a nuclear reaction called fusion. This is what burns gas, emitting heat and light. Fusion creates pressure in the star that balances gravity, and this is why it doesn’t collapse on itself.
But when the fuel—the gas—runs out, there is no more fusion, no way the star can build pressure and resist gravity. This is why the star shrinks, compressing its own remaining matter, even squeezing out the space between its atoms. (Think of compressing a balloon filled with little balls).
Still, there’s a limit to such compression, because eventually the atoms are forced so close together that their electrons resist being squeezed any further—a quantum effect physicists call degeneracy pressure. (Electrons simply refuse to be crowded into the same state.) And when the star reaches that state—inward-pulling gravity due to its mass, balanced against the outward push of its electrons—it becomes a white dwarf.
But Chandrasekhar showed that there is only so much compression the resistance of electrons can withstand. (Think of compressing that balloon so hard that you suddenly start flattening the balls). If gravity exceeds that power, the star will shrink beyond the white dwarf stage. This happens with more massive stars, which have more material to fall in on themselves and exert pressure. Conceptually, it’s not a difficult calculation to show that in a star heavier than about 1.4 times the Sun’s mass—the Chandrasekhar limit—the crush of gravity overcomes the electrons’ resistance.
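For the curious, that 1.4 figure can actually be reproduced from a handful of physical constants, using the standard closed-form expression for the limiting mass that Chandrasekhar's analysis leads to. Here is a minimal sketch in Python; the variable names are my own, and the constant omega (about 2.018) comes from solving the star's structure equations, which this sketch simply takes as given:

```python
import math

# Physical constants (SI units)
HBAR  = 1.0546e-34   # reduced Planck constant, J s
C     = 2.9979e8     # speed of light, m/s
G     = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
M_H   = 1.6735e-27   # mass of a hydrogen atom, kg
M_SUN = 1.989e30     # mass of the Sun, kg

# Average atomic mass per electron: about 2 for a typical white dwarf
MU_E = 2.0
# Dimensionless constant from the stellar structure equations
OMEGA = 2.018

# Chandrasekhar's limiting mass, in kg
m_ch = (OMEGA * math.sqrt(3 * math.pi) / 2) * (HBAR * C / G) ** 1.5 / (MU_E * M_H) ** 2

m_ch_solar = m_ch / M_SUN
print(f"Chandrasekhar limit: about {m_ch_solar:.2f} solar masses")  # roughly 1.4
```

Plug in the numbers and out comes roughly 1.4 solar masses—a cosmic threshold built entirely out of constants of nature.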
It’s such stars that go on shrinking, possibly all the way into black holes.
Second, after Chandrasekhar propounded his theory, he got into a famous argument with the astronomer Sir Arthur Eddington at a meeting in London of the Royal Astronomical Society (RAS). Now Eddington had been something of a hero to Chandrasekhar. As a student, he had won a prize in a physics contest at Madras University: Eddington’s book on star structure. But at the RAS meeting, Eddington tore into Chandrasekhar’s star collapse theory: “I think there should be a law of nature to prevent a star from behaving in this absurd way.”
Meaning, purely because Eddington could not comprehend the idea of a black hole, he believed nature should not allow it. Not the best way for science to advance. And, of course, today we know black holes certainly do exist.
Eddington and Chandrasekhar argued for some years. Chandrasekhar moved on to other fields, returning to black holes only much later (winning the 1983 Nobel). But he always believed that it was his argument with Eddington that drove him from his first speciality into other fields of research, and that this was why he remained productive into his 70s. Most scientists do their most innovative work in their 20s and 30s. After that, they rarely add significantly to science; rarely, Chandrasekhar argued, do they refine their own minds.
He thought it was because scientists develop an arrogance towards nature. Their success makes them believe that they have a special, somehow “right” view of science. For example, why should Eddington believe there must be a law of nature to prevent stars from turning into black holes? How could he possibly know? How could he assume, regardless of his own great work in astronomy, that he knew exactly what the laws of nature should or should not allow?
“Nature has shown over and over again,” Chandrasekhar wrote, “that the kinds of truth which underlie nature transcend the most powerful minds.” Here in Chittoor district, that’s an important, if humbling, thought.
Once a computer scientist, Dilip D’Souza now lives in Mumbai and writes for his dinners. A Matter of Numbers will explore the joy of mathematics, with occasional forays into other sciences. To read Dilip D’Souza’s previous columns, go to www.livemint.com/dilipdsouza-