Why we should thank Einstein for our smartphone cameras

Learn more about how the photoelectric effect has shaped technologies such as burglar alarms, solar panels and the camera in your smartphone.

The engineer picked up a camera flash gun, aimed it at the tiny circuit board computer on the desk, and fired. For a fraction of a second, light flooded the room. Everyone blinked – and saw that the computer had crashed.

The Raspberry Pi team had just confirmed that their product, a budget computer sold to schools and electronics enthusiasts, hated having its picture taken. At least when you took the picture with a big xenon flash lamp.

Raspberry Pi 2
The Raspberry Pi 2 (Model B). Evan-Amos, Public domain, via Wikimedia Commons

“We all had fun crashing it,” recalls Eben Upton, founder of Raspberry Pi. They had realised that a chip on the computer was susceptible to the photoelectric effect – when light triggers the release of electrons, and thus an electrical current. A kind of reverse “light switch”, if you like.

Upton and his colleagues had not anticipated this problem. It was discovered by a Raspberry Pi 2 user less than a week after the device went on sale in early 2015. In subsequent versions of the computer, the troublesome chip featured a black coating thick enough to soak up incoming light.

More than a century earlier, Albert Einstein had described the photoelectric effect in a ground-breaking paper – one of four seminal papers he published in 1905 while working as a clerk in the Swiss patent office. Later, in 1921, he received the Nobel Prize in Physics for it.

The photoelectric effect has gone on to shape all kinds of technologies – from burglar alarms to solar panels and the camera in your smartphone.

‘Weird phenomena’

Albert Einstein lecturing in Vienna
Albert Einstein lecturing in Vienna, 1921. Photo: Ferdinand Schmutzer. Public domain via Wikimedia Commons

To understand it better, consider the question that gripped Einstein back in 1905: what is light made of?

At the time, many scientists theorised that light existed purely as a wave, which some suggested travelled across the universe in an intangible “light-bearing ether”. But to Einstein, this idea seemed ridiculous – “like Father Christmas”, says Steve Gimbel at Gettysburg College in the US.

Scientists including Heinrich Hertz had already demonstrated versions of the photoelectric effect by using light to generate tiny sparks, or to electrically charge pieces of gold leaf, causing them to repel each other.

“There were certain weird, unexplained phenomena where light could create electricity and that just blew people’s minds – that seemed to make no sense,” says Gimbel.

Illustration of electromagnetic spectrum
In the electromagnetic spectrum, visible light sits between infrared and ultraviolet radiation. Credit: BBC

The weirdest thing was that the intensity of light didn’t affect the energy of the electrons produced whereas the frequency, or colour, of the light did. This was mind-boggling. More light should mean more energy, right?

Well, Einstein realised that if light wasn’t just made up of waves but also of discrete packets, or particles, travelling in waves – which later came to be known as photons – then the energy of those individual particles could explain this.

“When a single photon hits an electron, it [the electron] gets excited,” explains Paul Davies, at the University of York. So long as that photon lands with enough energy, then the photoelectric effect occurs – and the electron is freed from the material.

Think of it like throwing tiny sticks of dynamite into an open barrel of cannonballs. The little explosions won’t be enough to knock out a cannonball, no matter how many times you fling one in. But if you use stronger dynamite, with more energy, that will make the cannonballs fly.

The energy value of a photon is directly related to the colour of visible light – photons in blue light travel on shorter waves and have more energy than those in red light, for instance. That’s why Hertz found that especially energetic ultraviolet light would produce stronger sparks during one of his experiments.
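The arithmetic behind this is simple. The sketch below (a rough illustration, not from the article – the wavelengths are typical values and the 2.3 eV work function is an indicative figure, roughly that of sodium) computes the energy of a single photon from its wavelength and checks whether it clears the threshold needed to free an electron:

```python
# Photon energy E = h*c / wavelength. The photoelectric effect occurs
# only if E exceeds the material's work function, no matter how intense
# (i.e. how many photons per second) the light is.
h = 6.626e-34    # Planck constant, in joule-seconds
c = 2.998e8      # speed of light, in metres per second
eV = 1.602e-19   # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    """Energy of a single photon of the given wavelength, in electronvolts."""
    return h * c / (wavelength_nm * 1e-9) / eV

work_function = 2.3  # illustrative threshold in eV, roughly that of sodium

for colour, wavelength in [("red", 700), ("green", 550),
                           ("blue", 450), ("ultraviolet", 250)]:
    energy = photon_energy_ev(wavelength)
    verdict = "frees an electron" if energy > work_function else "no effect"
    print(f"{colour:11s} {wavelength:4d} nm  {energy:.2f} eV  -> {verdict}")
```

With these numbers, red and green photons fall short of the threshold while blue and ultraviolet photons clear it – which is the puzzle Einstein resolved: a dim blue lamp ejects electrons when a blinding red one cannot.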

Foghorns and photovoltaic cells

Gimbel stresses that Einstein didn’t come up with this theory out of nowhere. He drew not only on work by Hertz and others, but also on physicist Max Planck’s theory of “quanta” – the idea that radiation, including light, consists of discrete packets of energy. Planck received his own Nobel Prize in Physics for this idea, in 1918, but in 1905 it was still controversial.

Nobel Laureates Walther Nernst, Albert Einstein, Max Planck, Robert A. Millikan and Max von Laue
From left to right: Nobel Prize laureates Walther Nernst, Albert Einstein, Max Planck, Robert A. Millikan and Max von Laue at a dinner given by von Laue in Berlin, 11 November 1931. Nationaal Archief, Photographer unknown, Public domain via Wikimedia Commons

“Einstein had this revolutionary mind where he was willing to consider other approaches,” says Gimbel. “He took seriously this idea that light could be quantised.”

Einstein’s work was divisive – especially his special theory of relativity. Some members of the Nobel physics committee at the time hesitated to award him a prize and, when they did, they chose to award it for his work on the photoelectric effect, rather than relativity.

Scientists have long debated whether this was the best choice but there is little doubt that harnessing the photoelectric effect has changed the way our world works, since so many technologies rely on it.

Motion sensors in burglar alarm systems, for example, emit a beam of infrared light. When this beam is interrupted by an intruder, the light received by the sensor changes, altering the electrical current – and that sets off the alarm.

Illustration of camera, solar panels and x-rays
Technologies like photo and image sensors rely on the photoelectric effect. Credit: Getty Images

Finish lines at races held in the Olympic Games have used photoelectric cells to detect exactly when runners cross. Such technology has allowed ships to sense fog, and automatically switch on foghorns. It has also enabled cars to turn on their windscreen wipers automatically when it rains.

Strictly speaking, the photoelectric effect refers to a phenomenon in which electrons escape a material – but Davies says it is closely related to the photovoltaic effect, in which the movement of electrons drives an electrical current through adjacent materials.

That’s what solar cells in solar panels do when they turn sunlight into electricity, contributing clean, renewable energy to electricity grids and tackling climate change.

Silicon sensors

Another popular application of the photoelectric effect is in camera sensors, the light-sensitive part of a digital camera that captures images. Nearly all use CMOS technology, which was fine-tuned at Nasa in the 1990s for use in space, but came to be installed on billions of smartphones. “The CMOS image sensor was the perfect device, let’s say, for that. It turned out to be the killer application,” says engineer Eric Fossum, who worked on the project.

Silicon is the key material used in CMOS sensors and Fossum, now at Dartmouth College, notes that the photoelectric effect in silicon is triggered by many colours of light.

“It doesn’t matter whether it’s green light, red light, or blue light – a photon will liberate exactly one electron. We’re kind of lucky that way.” This really helps when you want to capture a subject’s colour in full detail.

CMOS sensor
Most smartphone cameras use CMOS sensors. Credit: Getty Images/BBC

Now, Fossum and colleagues are working on image sensors sensitive to the smallest imaginable amount of light – a single photon. These devices, also known as photon-counters, are already used for laboratory experiments but they could also revolutionise digital imaging technologies, for example by improving image quality in medical CT scanners, and exposing patients to less radiation. The potential applications don’t stop there. “We’ll have the capability to practically see in the dark with this new technology,” says Fossum.

Another scientist working on devices that harness the photoelectric effect is Dimitra Georgiadou at the University of Southampton. She and her colleagues are developing technologies that can detect light and process information about it without having to send data to a central computer system for analysis. “This reduces significantly the amount of energy it needs,” says Georgiadou.

This might help researchers develop highly advanced bionic eyes and give sight to blind people by enabling the design of smaller, easier to implant, and more energy-efficient devices. It could also enable self-driving cars to make faster decisions about when to brake for safety reasons.

Lunar glow

The light-sensing technology Georgiadou is focused on does not rely on silicon but rather organic, carbon-containing, materials – these can be tuned to respond only to specific colours of light, and also printed on flexible substrates.

Such technology could turn up in wearable, low-power light sensors able to track the heart rate and blood oxygen levels of premature babies, for example, by shining small amounts of light through their skin and into their veins.

Since Einstein wrote down his theory on the photoelectric effect in 1905, we’ve certainly come up with a lot of fun things to do with it. But there’s more. Understanding this incredible interaction of light and matter has revealed curious details about the way the universe works.

In the 1960s, some of the earliest moon landers took pictures of the lunar horizon and noticed something strange: a weird glow, almost like a gently fading sunset. Except that the moon doesn’t have an atmosphere like Earth’s, and it’s the scattering of light by particles in our atmosphere that creates sunrises and sunsets as the planet turns on its axis.

Where was this lunar glow coming from? It turned out that light from the sun was striking dust on the moon’s surface and, through the photoelectric effect, giving it a positive electrical charge.

These little dust particles thus repelled each other, periodically levitating above the lunar surface. As they did so, they caught the light of the recently set sun – and created that magical glow.

By Chris Baraniuk, BBC World Service. This content was created as a co-production between Nobel Prize Outreach and the BBC.

Published May 2026

To cite this section
MLA style: Why we should thank Einstein for our smartphone cameras. NobelPrize.org. Nobel Prize Outreach 2026. Sun. 17 May 2026. <https://www.nobelprize.org/stories/photoelectric-effect/>

MARY BRUNKOW

Nobel Prize in Physiology or Medicine 2025

A chance discovery made around 80 years ago paved the way for Mary Brunkow’s painstaking work, which contributed to redefining how the immune system functions. From investigating an obscure gene dismissed as ‘junk’ by other scientists, to choosing to opt out of academia and conduct her research at a biotech startup, Brunkow has embraced the unexpected throughout her career.

Mary Brunkow with the family’s first dog, Maggie. Credit: Mary Brunkow
Mary Brunkow and her brother with their newest sibling. Credit: Mary Brunkow
Mary Brunkow with her high school cross country team. Credit: Mary Brunkow

Mary Brunkow was born in 1961 in Portland, Oregon, the middle child of three. In high school, she enjoyed basketball and ran for the track and cross-country teams, but was, by her own admission, pretty bad at both. She was, however, a good student with a love of maths and science, especially genetics.

Brunkow longed to go to university in a big city and, after graduation, started on the pre-med track at the University of Washington. She was inspired by her paternal grandfather who was a doctor and surgeon, and had a strong desire to help people. However, after taking a genetics class in her final year at university and getting her first taste of lab work, she realised research was where her future lay. “The minute I walked into that research laboratory, it was just an environment that really struck me,” she said.

“I like the idea of something really concrete and coming up with a discovery that proves something.”

Mary Brunkow

The lively and collaborative atmosphere of the lab stirred Brunkow to continue her studies at Princeton University as one of the first graduate students to join its department of molecular biology. She made the brave decision to focus on a seemingly obscure gene dismissed as ‘junk’ by other scientists for her PhD, before carrying out postdoctoral research at the University of Toronto.

Toronto, post-doc lab, 1991. Credit: Mary Brunkow
PhD lab at Princeton. Credit: Mary Brunkow

While her fellow postdocs stayed in academia, Brunkow wanted to do something with a more direct application to human health. It was an exciting time in science. The Human Genome Project – an ambitious international research effort aimed at deciphering the chemical makeup of the entire human genetic code – was running at full speed and new genomics-related technologies were taking off.

Her postdoctoral research over, she decided to take another unexpected path, joining a young biotechnology company in a Seattle suburb that developed pharmaceuticals for autoimmune diseases. According to Brunkow, the competition and self-promotion she had seen in academia was absent. “Everybody was working towards the same goal of finding new drug targets that would eventually be hopefully developed into drugs,” she said. There, Brunkow learned of a mouse genetics programme set up after the Second World War to understand how radiation might affect human health and alter genes.

Illustration: Brunkow and Ramsdell find the scurfy mutation
Brunkow and Ramsdell find the scurfy mutation. The scurfy mutation causes a mutiny in the immune system. Brunkow and Ramsdell succeeded in narrowing down the area of the mutation and locating it in the Foxp3 gene, which turned out to be decisive in the development of regulatory T cells. © The Nobel Committee for Physiology or Medicine. Ill. Mattias Karlén

A chance discovery of a spontaneous genetic mutation in mice made almost 80 years ago turned out to be the foundation for Brunkow and her colleague Fred Ramsdell’s Nobel Prize-worthy work. The mutation, known as scurfy, affects only male mice, which show several hallmarks of severe autoimmune disease, with the immune system attacking almost every organ in the body. Yet the gene that caused the serious symptoms was unknown. The duo realised that if they could understand the molecular mechanism underlying the mice’s disease, they could gain decisive insights into how autoimmune diseases arise.

Today, it is possible to map a mouse’s entire genome and find a mutated gene in a few days, but in the 1990s it was like looking for a needle in a huge haystack. Mapping had shown that the scurfy mutation must be somewhere in the middle of the X chromosome, so Brunkow and Ramsdell painstakingly mapped that area of the X chromosome, eventually narrowing their search and identifying 20 possible target genes. “It was really a molecular slog to get to that exact mutation, because it was just a very small genetic alteration that results in quite a profound change in the immune system,” Brunkow said.

She bred her own colony of scurfy mice from the same line, taking over and adapting a disused janitor’s closet to provide space to do so. By comparing the genes in healthy mice and scurfy mice they discovered a minor change in a gene that would later be named Foxp3.

Illustration of transcription factor Foxp3 bound to DNA
Illustration of Foxp3 bound to DNA. Credit: JUAN GAERTNER/SCIENCE PHOTO LIBRARY via Getty Images

During their work, Brunkow and Ramsdell had begun to suspect that a rare autoimmune disease, IPEX, which is also linked to the X chromosome, might be the human variant of the scurfy mice’s disease. In 2001, they revealed that mutations in the Foxp3 gene cause both the human disease called IPEX and the scurfy mice’s ill health. 

In 2025, Brunkow and Ramsdell shared the Nobel Prize in Physiology or Medicine with Japanese immunologist Shimon Sakaguchi. Their discoveries launched the field of peripheral tolerance, spurring the development of medical treatments for cancer and autoimmune diseases. 

Embracing the unexpected and following the path less travelled has yielded Brunkow a fascinating and inspiring career, combining academic research, biotech startups, consulting, science communication and programme management.

“Discoveries come from places where you never imagined, so you have to keep an open mind and keep your eyes open to different pathways, as you’re journeying through a topic of interest, but also an open mind in terms of career path and where you go in life.”

Mary Brunkow

“I would say I’m a pretty good example of … one who has kind of changed directions and not stayed on one single track. Sometimes doors open off to the side a little bit. There’s nothing wrong with exploring different options.”

Mary Brunkow with her family in Ireland, 2023. Credit: Mary Brunkow
Mary E. Brunkow lecturing
Mary Brunkow delivering her Nobel Prize lecture at Karolinska Institutet, 7 December 2025. © Nobel Prize Outreach. Photo: Nanaka Adachi


To cite this section
MLA style: MARY BRUNKOW. NobelPrize.org. Nobel Prize Outreach 2026. Sun. 17 May 2026. <https://www.nobelprize.org/stories/women-who-changed-science/mary-brunkow/>

Exploring life’s smallest details

Learn more about some of the Nobel Prize-awarded discoveries in the fields of cells, nerves and genes.

Cells, nerves and genes are fundamental to all living organisms. They control life’s processes and development – and they are often where illness and suffering originate. Knowledge about cells, nerves and genes is built on centuries of exploring life’s smallest details. New instruments and methods have helped to give us pictures of previously unknown worlds.

In 1931–1933, Karolinska Institutet (KI) awarded the Nobel Prize in Physiology or Medicine for discoveries in the fields of cells, nerves and genes. Otto Warburg (1931) measured the exchange of energy in cells. Charles Sherrington and Edgar Adrian (1932) studied impulses in the nervous system. Thomas Hunt Morgan (1933) defined modern gene theory by mapping the chromosomes of the fruit fly (Drosophila melanogaster). These laureates all defined and measured processes directly and in real time on the cell, nerve and chromosome level.

Here, you can learn more about their and some of their successors’ Nobel Prize-awarded work.

The content is taken from the exhibition Cells, nerves, genes, a collaboration between the Nobel Prize Museum and Medical History and Heritage at the Karolinska Institutet University Library.

Cells

British biochemist Peter Mitchell first published his ideas on the proton motive force in Nature in 1961. The equations here suggest the mitochondrial network. Cellular level art, paint on silk, digitised.

Licence: Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) https://creativecommons.org/licenses/by-nc/4.0/. Credit: Mitchell’s equation I. Odra Noel. Source: Wellcome Collection.

The release of energy

Illustration from a lecture
Image taken from Otto Warburg’s Nobel Prize lecture.

Otto Warburg was awarded the Nobel Prize for his biochemical research into how energy is released in the chemical processes of living cells. In the respiration of cells, nutrients are broken down while oxygen is consumed and energy released. Warburg measured oxygen consumption in the cells and studied the effect of enzymes on these processes, developing ground-breaking experiments and apparatus for this purpose. This made him the most copied biochemist in the first half of the 1900s.


The citric acid cycle

Hans Krebs was awarded the Nobel Prize for his medical discovery of the citric acid cycle, the central cellular metabolic process that breaks down nutrients and releases chemical energy in the form of ATP (adenosine triphosphate). The ATP molecule transfers and provides energy used by cells to power a variety of processes. The citric acid cycle also produces the components for building other molecules. Krebs began his career as a colleague of Otto Warburg in Berlin. After the Nazi take-over in Germany, he moved to the UK and became a professor at Oxford University. 

Illustration
Image taken from Hans Krebs’ Nobel Prize lecture.

Helped explain photosynthesis

Illustration
Image taken from Peter Mitchell’s Nobel Prize lecture.

Peter Mitchell explained how ATP molecules transfer energy in cells. This discovery helped to explain fundamental processes such as cell respiration, photosynthesis and bacterial metabolism. His chemiosmotic theory demonstrated that the fundamental mechanism is a process that takes place at the subatomic level in the cell. It begins with a flow of electrons and hydrogen ions through the membranes of the cell’s mitochondria, known as the “powerhouses of the cell.” The process is controlled by the interaction of several enzymes.


Cell respiration

Hugo Theorell showed how oxidative enzymes affect cell respiration. His research laid the foundation for the modern understanding of how cells produce energy from oxygen and nutrients. Theorell made his discoveries in 1935 at Otto Warburg’s laboratory in Berlin. When the Nobel Institute for Medicine was founded at KI, Theorell was offered a position in Sweden to return to. 

Illustration
Image taken from Hugo Theorell’s Nobel Prize lecture.

Chemical signals at cellular level

Illustration
Image taken from Ulf von Euler’s Nobel Prize lecture.

Unlike most prominent Swedish physiologists, who focused on the electric signals of the nervous system, Ulf von Euler researched chemical signals at the cellular level. In 1940, he found that noradrenaline is a signal substance in the nervous system, and he was awarded the medicine prize in 1970 for his discovery. Von Euler was appointed professor of physiology at KI in 1937, where he collaborated mainly with Hugo Theorell’s department of biochemistry at the Medical Nobel Institute.


Prostaglandins

Sune Bergström, professor of chemistry at KI, worked with Bengt Samuelsson and Jan Sjövall to explore prostaglandins – hormone-like substances made from fatty acids. They affect blood pressure, menstruation and inflammatory processes in the body. As pharmaceuticals, they can induce labour and can also be used for medical abortion. Prostaglandins have been vital to the development of medications in a variety of fields, such as the treatment of blood clots, inflammations and allergies.

Illustration
Image taken from Bengt Samuelsson’s Nobel Prize lecture.

A molecular machinery

Illustration
Image taken from William Kaelin’s Nobel Prize lecture.

William Kaelin, Peter Ratcliffe and Gregg Semenza discovered how cells can detect and adapt to the oxygen supply. In the 1990s, they identified a molecular machinery that regulates the activity of genes in response to variations in oxygen levels. The discovery may lead to new treatments for anaemia, cancer and other conditions. 

Cells timeline

  1. The term cell

    The term cell was coined by Robert Hooke. One of the illustrations in his work Micrographia from 1665 shows a thin slice of cork. It seems to consist of smaller units encased by walls. They reminded Hooke of the cells inhabited by monks and nuns. This later became a fundamental concept in physiology. But the physiological importance of cells was unknown to Hooke. 

  2. Living cells documented

    Antonie van Leeuwenhoek was the first to document living cells. In 1674, he was able to observe the red blood cells of a fish. Leeuwenhoek was a Dutch draper and shop owner in Delft and made magnifying lenses of superior quality to scrutinise fabric quality. From 1670, he directed his lenses on the world around him and discovered what no one had seen before. This includes living bacteria, which he first noted in 1683. Like Hooke, he did not know what he was seeing. 

  3. Cells are structural units

    In 1838, the botanist Jakob Schleiden published the theory that cells are structural units shared by all plants. The same year, he and the physiologist Theodor Schwann defined their cellular theory: all forms of life consist of cells, the fundamental structural unit of all organisms. 

  4. Cell division

Oscar Hertwig was a German anatomist and embryologist with a keen interest in cells and cell nuclei. His research on fertilisation, reproduction and heredity has been vital to our knowledge in these fields. In 1876, he was the first to describe meiosis, the type of cell division that produces sperm or egg cells. Later, he also described the role of nuclein (nucleic acid) in fertilisation and the transmission of hereditary characteristics.

  5. The release of energy

    Otto Warburg was awarded the Nobel Prize for his biochemical research into how energy is released in the chemical processes of living cells.

  6. The citric acid cycle

    Hans Krebs was awarded the Nobel Prize for his medical discovery of the citric acid cycle, the central cellular metabolic process that breaks down nutrients and releases chemical energy in the form of ATP (adenosine triphosphate).

  7. Helped explain photosynthesis

    Peter Mitchell explained how ATP molecules transfer energy in cells. This discovery helped to explain fundamental processes such as cell respiration, photosynthesis and bacterial metabolism. 

  8. Cell respiration

    Hugo Theorell showed how oxidative enzymes affect cell respiration. His research laid the foundation for the modern understanding of how cells produce energy from oxygen and nutrients. 

  9. Chemical signals at cellular level

    In 1940, Ulf von Euler found that noradrenaline is a signal substance in the nervous system, and was awarded the medicine prize in 1970 for his discovery.  

  10. Prostaglandins

    The Nobel Prize in Physiology or Medicine 1982 was awarded jointly to Sune K. Bergström, Bengt I. Samuelsson and John R. Vane “for their discoveries concerning prostaglandins and related biologically active substances.” Prostaglandins have been vital to the development of medications in a variety of fields, such as the treatment of blood clots, inflammations and allergies. 

  11. A molecular machinery

    William Kaelin, Peter Ratcliffe and Gregg Semenza were awarded the medicine prize in 2019 for their work on how cells can detect and adapt to the oxygen supply.


Nerves

In the early 20th century, the Spanish scientist Santiago Ramón y Cajal, known as ‘Cajal’, drew the first observed neurons as black shapes. The artist paints them here full of colour.

Licence: Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) https://creativecommons.org/licenses/by-nc/4.0/. Credit: Cajal Neurons. Odra Noel. Source: Wellcome Collection.

Structure of the nervous system

In the 1870s, Camillo Golgi came up with a method to stain cells, which made it possible to see cell details under a microscope. Both Golgi and Santiago Ramón y Cajal used and refined the method when mapping the anatomy and structure of the nervous system. Cajal contributed to developing the neuron theory: that the brain and nervous system consist of discrete elements, nerve cells or neurons, which are linked through contacts, later named synapses.

Illustration
Image taken from Camillo Golgi’s Nobel Prize lecture.

Studies on reflexes

Illustration
Image taken from Edgar Adrian’s Nobel Prize lecture.

In the late 1800s, Charles Sherrington contributed to the theory that the nervous system consists of nerve cells, neurons, and that impulses are transmitted through contacts, synapses, between neurons. In experiments, he studied reflexes by charting the links between muscular movements and electric impulses to the nerves. The ability to examine how the nervous system works took a huge leap in 1925, when Edgar Adrian succeeded in measuring electric impulses in an individual nerve. He found that the individual impulses always have the same strength. Stronger stimulation does not make the impulses stronger; instead, they are transmitted more frequently and through more nerve fibres.


Electric signals

John Eccles, Alan Hodgkin and Andrew Huxley delved deeper into the nervous system around 1950, to study how electric signals are transmitted. Hodgkin and Huxley found that a basic mechanism consists of sodium ions and potassium ions migrating in opposite directions through the cell membrane, giving rise to an electrical potential. John Eccles showed that synapses can be either excitatory or inhibitory.

Illustration
Image taken from John Eccles’ Nobel Prize lecture.

Synapses and dopamine

Illustration
Image taken from Paul Greengard’s Nobel Prize lecture.

Arvid Carlsson, Paul Greengard and Eric R. Kandel examined how signals are chemically transmitted by various substances across synapses, the contact points between cells in the nervous system. Kandel also showed how these processes shape memories at the synapse level. Carlsson’s research on the crucial importance of dopamine to normal movement in organisms led to the introduction of a substance called L-dopa, which the body transforms into dopamine, as a treatment for Parkinson’s disease.


Physiology of the eye

Ragnar Granit, Haldan Keffer Hartline and George Wald made ground-breaking discoveries about vision and the physiology of the eye. Ragnar Granit studied the electrical impulses from the retina cells. He could prove that the cone cells, which enable us to see colours, are of different kinds and respond to light within three spectrum zones. Granit began his career in Helsinki, but transferred to KI in 1946, where he was a professor of neurophysiology at the Medical Nobel Institute. He studied under Charles Sherrington at Oxford.

Illustration
Image taken from George Wald’s Nobel Prize lecture.

Divided tasks in the brain

© Nobel Prize Outreach. Photo: Jan Eve Olsson

Roger Sperry showed that tasks are divided between the brain’s two halves – the left half handling abstract and analytical thinking, and the right processing spatial cognition and complex sounds. David Hubel and Torsten Wiesel recorded electrical activity from the nerve cells in the visual cortex. They discovered that different nerve cells react to specific visual stimuli, such as lines at a certain angle, movement or direction. Torsten Wiesel studied medicine at KI, where he also began his research before moving to the USA. 


The brain’s coordinate system

In recent decades, neurophysiological research has studied how information is formed in the brain. As early as the 1970s, John O’Keefe measured how a rat’s spatial position activated certain cells in its brain – in other words, the brain cells created a sort of map. May-Britt Moser and Edvard Moser later discovered cell types that form a coordinate system for navigation. When a rat passed the points of a hexagonal grid in space, the corresponding nerve cells in the brain’s coordinate system were activated.

Illustration
Image taken from Edvard Moser’s Nobel Prize lecture.

Nerves timeline

  1. Described the human nervous system

    In 1664, Thomas Willis presented what was until then the most complete and correct description in text and images of the human nervous system. In his work Cerebri anatome, he drew up an informed overview of the central and autonomous nervous systems’ anatomy. He also distinctly classified the cranial nerves, including the “Willis nerve”, identified the sympathetic system and defined the brain as the organ of thinking. This was the context in which the word “neuro” was first used in medicine, but it did not get its current meaning until the 19th century. 

  2. Studies on skulls

    Gustaf Retzius made meticulous microscopic studies of the structure of the brain and nervous system. He was highly skilled in preparing and dyeing the fragile nerve cells, and was also a superb draughtsman who worked with the very best printers. Retzius was a professor of histology at KI in 1878–1890. He also studied skeletons and skulls to determine their age, sex and any diseases. He believed he could see from the bones what nationality and “race” they belonged to.

  3. Structure of the nervous system

    The Nobel Prize in Physiology or Medicine 1906 was awarded jointly to Camillo Golgi and Santiago Ramón y Cajal “in recognition of their work on the structure of the nervous system.”

  4. Studies on reflexes

    Charles Sherrington studied reflexes by charting the links between muscular movements and electric impulses to the nerves. Edgar Adrian succeeded in measuring electric impulses in an individual nerve. They were jointly awarded the Nobel Prize in Physiology or Medicine 1932.

  5. Electric signals

    John Eccles, Alan Hodgkin and Andrew Huxley delved deeper into the nervous system to study how electric signals are transmitted. They were awarded the medicine prize in 1963.

  6. Physiology of the eye

    Ragnar Granit, Haldan Keffer Hartline and George Wald made ground-breaking discoveries about vision and the physiology of the eye. They were awarded the Nobel Prize in Physiology or Medicine 1967.

  7. Divided tasks in the brain

    The Nobel Prize in Physiology or Medicine 1981 was divided, one half awarded to Roger W. Sperry “for his discoveries concerning the functional specialization of the cerebral hemispheres,” the other half jointly to David H. Hubel and Torsten N. Wiesel “for their discoveries concerning information processing in the visual system.”

  8. Synapses and dopamine

    Arvid Carlsson, Paul Greengard and Eric R. Kandel, awarded the medicine prize in 2000, examined how signals are chemically transmitted by various substances between synapses, the contact points between cells in the nervous system. 

  9. The brain's coordinate system

    The Nobel Prize in Physiology or Medicine 2014 was divided, one half awarded to John O’Keefe, the other half jointly to May-Britt Moser and Edvard I. Moser “for their discoveries of cells that constitute a positioning system in the brain.”


Genes

Model of a DNA double helix according to the correct dimensions of the natural molecule.

Licence: Attribution 4.0 International (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/. Credit: Peter Artymiuk. Source: Wellcome Collection.

Mechanisms of heredity

Illustration
The image depicts a genetic cross diagram that illustrates how red eyes and white eyes are inherited. Photo: Hagströmerbiblioteket

The fruit fly Drosophila melanogaster became the model organism for Thomas Hunt Morgan’s research. This fly thrives in the laboratory and reproduces prolifically, with new generations in just a couple of weeks. Its comparatively few and large chromosomes are directly visible in a microscope. Morgan studied flies with genetic changes, mutations, such as an extra leg, no antennae, or different wing shapes. He examined how these traits were inherited, and how they are linked to chromosomes. This laid the foundation for a new understanding of the mechanisms of heredity. 


The structure of the DNA molecule

A ground-breaking step in genetic research was made in 1953, when Francis Crick and James Watson managed to determine the structure of the DNA molecule. This molecule consists of a long line of building blocks that form a double helix. Its building blocks are of four kinds, and their sequence serves as a code for genetic information. 

This discovery was enabled by research by Rosalind Franklin, who died in 1958. The 1962 medicine prize was awarded to Crick, Watson, and Maurice Wilkins, a colleague of Rosalind Franklin. 

Illustration
The double helix is depicted at the bottom left of the page. This copy of Deoxyribose Nucleic Acid was signed by James D. Watson during one of his visits to Karolinska Institutet. Photo: Hagströmerbiblioteket

“Gene scissors”

Illustration
Image taken from Jennifer Doudna’s Nobel Prize lecture.

The CRISPR-Cas9 method, also known as “gene scissors,” makes it possible to modify genetic contents. It was developed by Emmanuelle Charpentier and Jennifer Doudna and is a considerably more powerful and precise instrument than previous methods for changing the genetic contents of organisms. This opens up new potential in research and the development of pharmaceuticals and organisms for food production. 

Genes timeline

  1. How traits are inherited

    To study how traits are inherited, the biologist and Augustinian friar Gregor Mendel cross-fertilised pea plants with different traits, such as green or yellow peas, and studied several generations of offspring. He concluded that traits were passed on in pairs, one from each parent. The characteristics of the offspring are determined by whether these traits are dominant or recessive.

  2. Detected chromosomes

    In 1878, the biologist Walther Flemming dyed cells from newt larvae and could then detect blue formations, chromosomes, inside the cells under a microscope. He could also observe processes in the cell nucleus, including the behaviour of chromosomes in both living and dead cells. Flemming was the first to describe in detail how chromosomes behaved during normal cell division. It was he who came up with a name for the process: mitosis.

  3. Coined the term “chromosome”

    Wilhelm von Waldeyer, a professor of anatomy in Berlin, coined the term “chromosome” in 1889, to describe the already-observed parts of the cell that particularly absorbed pigments when dyed. He was one of the earliest and most enthusiastic advocates of Cajal’s “neuron theory” on the structure of the nervous system.   

  4. Studies on DNA

    At the end of the 19th century, the Uppsala professor Olof Hammarsten studied ‘nucleic acid’ (DNA and RNA), a fundamental constituent of the cell. The preparation methods of the time smashed the DNA molecules to smithereens when they were examined. Scientists did not realise how big the molecule was, and didn’t believe it could carry the genetic information. In the 1930s, Olof Hammarsten’s nephew Einar Hammarsten, a professor of chemistry at KI, succeeded in proving that DNA is a “macro molecule.” Under the microscope, he managed to unwind fibres that were up to 50 cm long. 

  5. First modern genetic theories

    The geneticist Vilhelm Johannsen in Copenhagen belonged to the generation of scientists who collectively, but independently of each other, rediscovered Gregor Mendel’s research on heredity from the 1860s. Mendel’s genetics gained wide recognition from around 1900. Mendel formulated the first modern genetic theories, and his observations on dominant and recessive genes are still valid.

  6. Mechanisms of heredity

    Thomas Hunt Morgan studied fruit flies with genetic changes, mutations, such as an extra leg, no antennae, or different wing shapes. He examined how these traits were inherited, and how they are linked to chromosomes. He was awarded the Nobel Prize in Physiology or Medicine 1933.

  7. Measured the level of DNA and RNA in individual cells

    In 1936, Torbjörn Caspersson embarked on a lifelong research project at KI into nucleic acid with biophysical methods, and particularly optical ones. He managed in the first year to measure the level of DNA and RNA in individual cells using a new technique for ultraviolet microscopy. In this way, he could prove that nucleic acid was involved in reproduction, since the content varied at different stages of cell division.  
    Caspersson became a professor of cellular research and genetics at the KI in 1944, and was later appointed head of a department at the Medical Nobel Institute. 

  8. DNA contains the cell's genetic information

    Oswald Avery showed in 1944 that DNA contains the cell’s genetic information. Until then, the dominant assumption was that something as complicated as genes must be packaged in a correspondingly complex macromolecule, most likely a protein. DNA has a comparatively simple chemical structure and was therefore discounted. Although Avery could demonstrate that genetic information was carried by DNA, it was still unclear how the molecule encoded that information.

  9. The structure of the DNA molecule

    A ground-breaking step in genetic research was made in 1953, when Francis Crick and James Watson managed to determine the structure of the DNA molecule. This discovery was enabled by research by Rosalind Franklin, who died in 1958. The 1962 medicine prize was awarded to Crick, Watson, and Maurice Wilkins, a colleague of Rosalind Franklin. 

  10. “Gene scissors”

    The CRISPR-Cas9 method, also known as “gene scissors”, makes it possible to modify genetic contents. Emmanuelle Charpentier and Jennifer Doudna were awarded the medicine prize in 2020.

The microscope that medicine laureate Ralph Steinman used daily.

© Nobel Prize Outreach. Photo: Alexander Mahmoud

To cite this section
MLA style: Exploring life’s smallest details. NobelPrize.org. Nobel Prize Outreach 2026. Sun. 17 May 2026. <https://www.nobelprize.org/stories/cells-nerves-genes/>

John Martinis: “It’s a lifelong learning process”

Eleonora Svanberg, a PhD student in mathematics at the University of Oxford, sits down with 2025 physics laureate John Martinis for a conversation about curiosity, discovery and the future of quantum technology. Martinis reflects on the choices that led him to exploring quantum mechanical behaviour and discusses his groundbreaking work – conducted with co-laureates Michel Devoret and John Clarke – demonstrating quantum phenomena on a macroscopic scale.

To cite this section
MLA style: John Martinis: “It’s a lifelong learning process”. NobelPrize.org. Nobel Prize Outreach 2026. Sun. 17 May 2026. <https://www.nobelprize.org/stories/students-meet-nobel-prize-laureate/john-martinis-its-a-lifelong-learning-process/>

Daron Acemoglu: “Always be open to talking to people who are ideologically opposite to you”

Daron Acemoglu, economic sciences laureate 2024, was joined by nine students from all over the world for a conversation on the topic of being a scientist. Acemoglu gave his best advice for finding great collaborators, spoke about the future of AI and shared how he has overcome obstacles during his scientific career. 

To cite this section
MLA style: Daron Acemoglu: “Always be open to talking to people who are ideologically opposite to you”. NobelPrize.org. Nobel Prize Outreach 2026. Sun. 17 May 2026. <https://www.nobelprize.org/stories/students-meet-nobel-prize-laureate/daron-acemoglu-always-be-open-to-talking-to-people-who-are-ideologically-opposite-to-you/>

Revealing secrets on a tiny scale

The small matter of what everything is made from has long fascinated humankind. Microscopes have had a huge impact on our understanding of everything from the composition of materials to the building blocks of life. The work of a number of Nobel Prize laureates has allowed microscopes to evolve.

A fascinating microbial world

Randy Schekman working on his Annual Science Fair project during his senior year of high school in the spring of 1966. Courtesy of Randy Schekman.

Peering into the plastic eyepiece of his toy microscope to take a closer look at a drop of pond scum, Randy Schekman was transported into a fascinating microbial world. Aged 12, he was determined to save money from odd jobs to buy his first professional microscope. But he couldn’t reach his goal as his mother kept borrowing money from his piggy bank.

Randy W. Schekman - Microscope
Randy Schekman donated this microscope to the Nobel Prize Museum in 2013. Nobel Prize Museum

“One Saturday I became so upset that, after mowing a neighbour’s lawn, I bicycled to the police station and announced to the desk officer that I wanted to run away from home because my mother took my money and I couldn’t use it to buy a microscope,” he said.

The desperate measures paid off: his parents picked him up from the police station and bought the microscope on their way home. It remained a “treasured possession” throughout Schekman’s school years.

Making the invisible, visible 

The microscope’s ability to be an extension of the human eye has enabled scientists to make strides in biology and medicine since its invention in the 16th century. Around one century ago, a scientific breakthrough by Richard Zsigmondy led to the development of the ultramicroscope.

But it was not until Frits Zernike took a fresh look at how refraction gives rise to the image in a microscope that scientists could distinguish transparent specimens such as bacteria and cells without having to cleverly illuminate or stain them.

When light rays pass through transparent materials, such as biological specimens, there is a change in the phase of the light waves – the position of the wave crests in relation to one another – compared to an unimpeded light ray. Our eyes can’t detect this but these phase changes contain important information that can be used to visualise the material they have passed through.
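The size of such a phase change is easy to estimate. As a minimal sketch (all numerical values here are assumed for illustration, not taken from the article), the extra phase picked up by light crossing a transparent specimen, relative to light passing through the surrounding medium, is 2π(n_specimen − n_medium)d/λ:

```python
import math

def phase_shift(n_specimen, n_medium, thickness_m, wavelength_m):
    """Extra optical phase (radians) acquired by light crossing a
    transparent specimen, relative to light in the surrounding medium."""
    return 2 * math.pi * (n_specimen - n_medium) * thickness_m / wavelength_m

# Illustrative (assumed) values: a cell of refractive index ~1.37
# in water (1.33), 5 micrometres thick, green light of 550 nm.
dphi = phase_shift(1.37, 1.33, 5e-6, 550e-9)
print(f"phase shift: {dphi:.2f} rad")  # a sizeable fraction of a full cycle
```

A shift of a couple of radians is invisible to the eye, which responds only to intensity; Zernike's trick was to convert exactly this hidden phase information into contrast.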

Frits Zernike in the laboratory
Frits Zernike looking into a microscope, 1953. Courtesy of University of Groningen. Photo: Stevens & Magielsen.
Phase contrast
Psilocybe mexicana cheilocystidia and spores 400x with phase contrast illumination from San Agustin Loxicha, Oaxaca, Mexico. Alan Rockefeller, CC BY-SA 3.0 , via Wikimedia Commons

Zernike worked out a way to make this phase change visible, and built the phase contrast microscope in the 1930s. His microscope was capable of enhancing the contrast of unstained, transparent specimens to reveal their inner workings in richer detail, revolutionising biological and medical research.

Despite being told by a major microscope manufacturer that his invention had “little practical value”, Zernike’s work had a massive impact that continues to be felt.

Phase contrast microscopes have enabled strides in biological and medical research to be made, especially in the fields of histology – the study of tissues and organs in the body – and cancer research. They also allow commercial products such as oils, drugs and textiles to be scrutinised.

Seeing life at an atomic level 

Ernst Ruska, too, had to overcome criticism, starting at an early age. Yet Ruska’s conviction and curiosity paid off when, as a young student in 1933, he developed the first electron microscope. By using a magnetic coil as a lens, he harnessed electron beams – instead of light – to obtain images of extremely small objects. The resulting images had a resolution greater than that of light microscopes, which was a breakthrough for science.

His invention allowed the smallest objects, such as cell organelles, viruses, or atoms, to be examined.

Ruska helped commercialise his invention, so that mass-produced electron microscopes rapidly found applications within many areas of science.

Rohrer and Binnig with a STM
Heinrich Rohrer (left) and Gerd Binnig (right) of IBM’s Zurich Research Laboratory, shown here with a first-generation scanning tunneling microscope (STM), an invention for which they were awarded the Nobel Prize in Physics 1986. Image courtesy of IBM Research – Zurich.

The development of the electron microscope opened up a previously hidden world. Ruska was recognised with the Nobel Prize in Physics 1986, which he shared with Gerd Binnig and Heinrich Rohrer who also turned to electrons to design a new microscope.

Binnig and Rohrer’s scanning tunnelling microscope (STM) used a single atom-sized stylus to create a topographical map of a surface. They exploited the tunnelling effect, in which a current flows between the stylus and the surface only if they are close enough together, to visualise the surface’s peaks and troughs at the atomic level. Thanks to their advances, crystal surfaces, DNA molecules and viruses could be visualised, opening up new vistas to life around us.
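The extreme distance sensitivity that makes this work comes from the exponential decay of the tunnelling current with the tip–surface gap, roughly I ∝ exp(−2κd). A rough sketch of the numbers (the 4.5 eV work function is an assumed typical value for a metal, not a figure from the article):

```python
import math

# Physical constants (SI)
HBAR = 1.054_571_817e-34    # reduced Planck constant, J*s
M_E  = 9.109_383_7015e-31   # electron mass, kg
EV   = 1.602_176_634e-19    # electron volt, J

def decay_constant(work_function_eV):
    """Inverse decay length kappa (1/m) of the tunnelling wavefunction
    for an electron facing a barrier of the given height."""
    return math.sqrt(2 * M_E * work_function_eV * EV) / HBAR

kappa = decay_constant(4.5)  # assumed typical metal work function

# I ~ exp(-2*kappa*d), so moving the tip away by 0.1 nm (one angstrom)
# changes the current by a factor exp(2*kappa*delta_d).
ratio = math.exp(2 * kappa * 0.1e-9)
print(f"kappa ~ {kappa * 1e-9:.1f} per nm; current changes ~{ratio:.0f}x per 0.1 nm")
```

That order-of-magnitude current change per ångström is what lets the STM resolve individual atomic "peaks and troughs" on a surface.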

Bringing optical microscopy into the nanodimension 

Electron microscopes are incredibly powerful, but they cannot be used to image living cells because the electrons destroy the samples.

Eric Betzig - Microscope and component (1)
Eric Betzig’s microscope and component, donated to the Nobel Prize Museum. Photo: Nobel Prize Museum

For a long time, an alternative – optical microscopy – was held back because scientists presumed that optical microscopes could never achieve a resolution better than half the wavelength of light, and could therefore not be used to visualise tiny objects such as viruses and protein molecules in cells.

Eric Betzig, Stefan Hell and William Moerner, awarded the Nobel Prize in Chemistry 2014, used fluorescence to bypass this limit. Their work allowed the optical microscope to peer into the nanoworld. Nanoscopy – the ability to see past the optical limit of 200-300nm – revolutionised the field of cell biology. Theoretically, there is no longer any object too small to be studied.

Human kidney cell, Gated-STED microscopy
Human kidney cell, Gated-STED microscopy. Alison Dun, ESRIC (Edinburgh Super-Resolution Imaging Consortium). Source: Wellcome Collection.

Hell’s technique used fluorescent molecules to image nano-sized parts of a cell. From this he developed the stimulated emission depletion (STED) microscope, which he used at the turn of the millennium to image an E. coli bacterium at a resolution never before achieved with an optical microscope.

Betzig and Moerner laid the foundation for a second method: single-molecule microscopy. It relies on the ability to turn the fluorescence of individual molecules on and off. Scientists image the same area multiple times, letting just a few interspersed molecules glow each time. Superimposing these images yields a dense super-image resolved at the nanoscale.

Nanoscopy is, among other things, being used to better understand the function of living nerve cells and neuronal circuits. Scientists are working on allowing continuous observation, so they could “film” the workings of “molecular machines” such as proteins, which could be used, for example, to design new drugs.

Bolstering biochemistry 

The Nobel Prize in Chemistry 2017 recognised discoveries that contributed to the development of cryo-electron microscopy, which has enabled a new era of biochemistry.

Illustration of the Zika virus
Visualisation of the Zika virus, enabled by cryo-electron miroscopy. © Johan Jarnestad/The Royal Swedish Academy of Sciences

Electron microscopes were only useful for imaging dead objects, because the powerful electron beam destroys biological material. But thanks to the work of Richard Henderson, Joachim Frank and Jacques Dubochet, researchers can now freeze biomolecules mid-movement and portray them at atomic resolution.

Cryo-electron microscopy builds on Henderson and Frank’s research using electron microscopes to visualise biological samples and Dubochet’s method of cooling the water around molecules so rapidly they can be imaged in their ‘natural’ form. Today the technique is being successfully applied to multiple scientific disciplines to reveal molecular insights that would otherwise not be achievable.

In medical research, it is used, for example, to observe the 3D outer shell structures of viruses, uncover the atomic structures of proteins associated with neurodegenerative diseases, and to determine the structure and interactions of proteins that play a significant role in cancer. Better understanding of such mechanisms could lead to the development of more effective treatments.

Richard Henderson and Carsten Sasche
Richard Henderson with his colleague Carsten Sasche in the lab. Credit: MRC Laboratory of Molecular Biology

Focusing on the future 

Numerous Nobel Prize laureates have used microscopes to make beneficial breakthroughs for humankind. The founder of modern bacteriology, Robert Koch, reportedly penned a letter to microscope manufacturer Carl Zeiss that read, “a large part of my success I owe to your excellent microscopes,” having discovered the bacilli that caused tuberculosis and cholera, and receiving the Nobel Prize in Physiology or Medicine in 1905.

Microscopes will doubtless help future laureates to experience their eureka moments and brilliant breakthroughs. They are more than just instruments; they are gateways to tiny worlds that continue to enable revolutionary discoveries.

 

Microscopes and the Nobel Prize

  1. Richard Zsigmondy has an idea that leads to the ultramicroscope, making it possible to study objects smaller than the wavelength of light.

     

  2. Frits Zernike comes up with the phase contrast principle leading to the invention of the phase-contrast microscope that allows the study of colourless and transparent biological materials without having to illuminate or stain them.

  3. Ernst Ruska develops an electron microscope that can produce a higher resolution image than an optical microscope. The ability to use electrons in microscopy greatly expands the borders of exploration after the invention is commercialised.

  4. Joachim Frank starts developing a new image processing method for electron microscope images by analysing and merging fuzzy 2D images to make sharp, 3D images possible.

  5. Gerd Binnig and Heinrich Rohrer invent the scanning tunnelling microscope that gives 3D images of objects down to the atomic level.

  6. Jacques Dubochet adds water to electron microscopy. He cooled water so rapidly that it solidified in its liquid form around a biological sample, allowing the biomolecules to retain their natural shape even in a vacuum.

  7. William Moerner is first to measure the light absorption of a single molecule.

  8. Richard Henderson uses an electron microscope to generate a 3D image of a protein at atomic resolution, proving the technology’s potential.

  9. Stefan Hell comes up with a method using fluorescent molecules to track processes in living cells. This leads to the STED microscope, which he uses in 2000 to image an E. coli bacterium at a resolution never before achieved.

  10. Eric Betzig and William Moerner create a method in which fluorescence in individual molecules is steered by light, making it possible to track processes occurring inside living cells.

  11. Eric Betzig uses single-molecule microscopy for the first time.

  12. Atomic resolution in electron microscopy is reached, allowing researchers to routinely produce 3D structures of biomolecules.

To cite this section
MLA style: Revealing secrets on a tiny scale. NobelPrize.org. Nobel Prize Outreach 2026. Sun. 17 May 2026. <https://www.nobelprize.org/stories/microscopes/>

How microwaves let us tune into one another – and the universe

Learn more about how microwaves revolutionised our ways of communicating and helped us understand the origins of the universe.

The Pope is watching. This had better work. Such thoughts perhaps ran through the mind of Guglielmo Marconi in 1932 as he set up a special antenna in the Vatican gardens while His Holiness Pope Pius XI looked on.

This antenna was part of a new radio link connecting the Vatican to the Pope’s summer residence, Castel Gandolfo. And not just any radio link.

Image of Marconi and the Pope
Guglielmo Marconi made a telephone that used microwaves to link the Vatican to Pope Pius XI’s summer home in the 1930s. Credit: DeAgostini/Getty Images/BBC

This one used microwaves – radio waves with extra-high frequencies. Marconi also set up a portable microwave communications system attached to a car, connecting the travelling Pope to the Vatican. Some have claimed this was the first mobile phone, albeit a very large one.

Twenty-three years earlier, in 1909, Marconi had shared the Nobel Prize in Physics with Ferdinand Braun for his contributions to wireless telegraphy. The radio age was in full flow. But when Marconi turned to microwaves, he was embracing a part of the radio spectrum with very special properties.

Microwaves can carry vast quantities of information. They can also cook food, or scramble an enemy’s electronics. Microwaves have even helped reveal the very origins of the universe.

Pip-pip-pip

Long before Marconi built a microwave telephone for the Pope, someone else had already experimented with similar frequencies.

In the late 19th Century, a brilliant Indian scientist called Jagadish Chandra Bose – now, sadly, largely forgotten – developed some of the earliest microwave technology.

Jagadish Chandra Bose
Indian scientist Jagadish Chandra Bose was a pioneer in microwave technology. Credit: SSPL/Getty Images/BBC

This included the very first equipment for generating millimetre waves – the waves used by 5G devices today. In 1895, Bose demonstrated that millimetre waves could ring a bell, and even fire a gun remotely.

Marconi arguably gained some of his stardom thanks to Bose.

On 12 December 1901, using a non-microwave frequency, the Italian inventor carried out the first transatlantic radio transmission. Sitting in a hut atop a Newfoundland cliff, he listened to a swirling deluge of noise in his earphone for many hours – until he heard what he had been waiting for.

Pip-pip-pip.

The Morse code for the letter S. Frantically, he passed the earphone to his colleague and asked, “Can you hear anything?” He could.

It was an incredible feat. Those radio waves had travelled more than 2,000 miles from the south of England, across open water. At the time, his record for a long-distance radio transmission was just 80 miles.

In the years since, some have questioned whether the transmission really happened as Marconi described.

However, recent investigations show that it was theoretically possible, even with his early radio equipment.

Among that equipment was a device called a coherer, a simple radio signal detector. And while the records are somewhat murky, it appears this coherer was designed by none other than Bose.

“He came up with fascinating instruments,” says Bose’s biographer, Sudipto Das.

But Bose was, perhaps, too far ahead of his time. For one thing, there were few useful applications for microwaves during the early 1900s that weren’t already feasible with lower frequency radio waves. Bose shifted his focus away from physics to his greater interest, plant physiology, and “almost sank into oblivion,” says Das.

Magnetron popcorn

However, World War Two made microwaves important again. Radar allowed militaries to detect enemy aircraft by bouncing radio signals off them. And a microwave device called the cavity magnetron, developed in Britain in 1940, turned out to be one of the most powerful and effective radar technologies around.

Small enough to install on aircraft, its fantastic range and precision gave Allied countries an important advantage that helped them win the war.

It was also a microwave-emitting magnetron that inspired Raytheon engineer Percy Spencer to invent microwave ovens in 1945. A peanut bar in his pocket began to melt when he walked past magnetrons in a laboratory. And when he later held up a packet of popcorn, the popcorn popped and “exploded all over the lab,” a Reader’s Digest article later recalled.

Magnetrons
Magnetrons helped the Allies win WWII and later led to the invention of the microwave oven. Credit: SSPL/Getty Images/BBC

This happened because, at certain frequencies, microwaves excite molecules inside food, making them vibrate at the same frequency. The ensuing friction heats things up.

For microwave ovens, the frequency of choice is 2.4 gigahertz (GHz) – the same frequency used by many wi-fi routers. However, routers emit microwaves at much lower power levels than microwave ovens – that’s why you can’t make popcorn just by surfing the web.

Choosing the right frequency for cooking is really important, says Caroline Ross at the Massachusetts Institute of Technology. Microwaves at 2.4 GHz penetrate well inside food, and this frequency also allows for even absorption of the radiation by food molecules.

“When you go higher, like tens of gigahertz, the penetration depth is pretty small, so it gets blocked by almost anything – even water in the air,” she explains.

Microwaves are special partly because of that ability, at certain frequencies, to interact with matter. OK, reheating your dinner leftovers might not seem very dramatic – but what about using microwaves to induce noises in people’s heads?

Havana syndrome

Military personnel who worked near the large microwave radar installations built during World War II later recalled that they could sense the radar operating. “It was possible to hear the repetition rate of the radar when we were standing close to the antenna horn,” one witness wrote in the 1950s.

James Lin, professor emeritus at the University of Illinois Chicago, heard such stories and attempted to reproduce the effect in his lab during the 1970s.

“I used myself as a guinea pig, basically,” he recalls, as he describes how he set up a microwave antenna and pointed it directly at his head.

Lin has suggested that the microwaves induced pressure waves inside his head, which he sensed as sound. To avoid cooking his brain, he kept the power levels low. “I could hear the pulse,” he says. “The fact I’m still alive… I guess it wasn’t too bad.”

This became known as the microwave auditory effect and it could help to explain a spate of mysterious illnesses reported by American diplomats around the globe, most famously in Havana, Cuba.

Graphic image of building and car
In 2016, staff at the US Embassy in Cuba reported unusual, debilitating symptoms – later called ‘Havana Syndrome.’ Credit: AFP via Getty Images/BBC

Victims of so-called Havana Syndrome report experiencing strange grating noises, a feeling of pressure building up in their ears, dizziness, nausea and memory loss. Was an enemy directing a microwave beam at these people? While some have dismissed this hypothesis, Lin says it remains the most plausible explanation for the auditory symptoms.

Microwave weapons do exist, though those discussed in public tend to target machines rather than people. The US military has missiles that can zap enemy electronics with microwaves, for instance. Microwaves can even bring down drones.

In contrast, Lin has developed ways of using microwaves to heal – for example, to treat muscle diseases and irregular heartbeats.

For the latter, he says it is possible to insert a tiny microwave-emitting device into the heart, via a catheter, in order to destroy abnormal heart tissue. This technique, now widely used, is less invasive than open heart surgery, he points out: “Just deliver a pulse at high power, a microwave, to burn the tissue.”

The universe talking

But microwaves don’t just save lives. They have also helped to reveal the origins of the universe. In the early 1960s, radio astronomers Arno Penzias and Robert Woodrow Wilson attempted to use a large, horn-shaped antenna in the US state of New Jersey as a radio telescope. But they kept picking up an irritating hiss or static.

At one point, they thought this was caused by pigeon droppings in the antenna, so they shooed the birds away and cleaned up the mess. However, the birds were not to blame. What Penzias and Wilson were hearing was the sound of the universe itself.

“It’s a snapshot of early time,” says Sean McGee, at the University of Birmingham. Penzias and Wilson had discovered what we now call cosmic microwave background radiation – a signature left over from the Big Bang, when the universe exploded into being roughly 13.8 billion years ago. Penzias and Wilson were awarded half of the 1978 Nobel Prize in Physics for their work.

Cosmic microwave background
The European Space Agency shared this image of the cosmic microwave background, captured by its Planck satellite. Credit: ESA/Planck Collection/BBC

The residual radiation they detected is present throughout the cosmos. A small proportion of the snowy static on analogue television screens is attributable to it. In other words, until digital television took over, people would pick up remnants of the Big Bang in their living rooms.

Satellites eventually helped astronomers to map the cosmic microwave background, recording its fluctuations as slight differences in temperature. Those fluctuations appear to have influenced where galaxies formed as the universe expanded.

“We’re all the result of quantum fluctuations in the very early universe that then seeded galaxies,” says McGee.

Today, people use microwaves for any international call that is connected by satellite – a great leap from the car-mounted equipment Marconi installed for the Pope in the 1930s.

It’s rather fitting that many people use microwaves to talk to one another day in, day out – since that is also how the universe has talked to us, helping to confirm our understanding of the greatest story of all time. The story of how everything began.

By Chris Baraniuk, BBC World Service. This content was created as a co-production between Nobel Prize Outreach and the BBC.

Published March 2026

To cite this section
MLA style: How microwaves let us tune into one another – and the universe. NobelPrize.org. Nobel Prize Outreach 2026. Sun. 17 May 2026. <https://www.nobelprize.org/stories/microwaves/>

The clock that starts ticking when something dies

A help in solving crimes and a revealer of history’s secrets: radiocarbon dating is one of the keys that unlocks our world.

Willard F. Libby
Willard Libby. Photo: University of Chicago

There would be heaps of the stuff in sewage. Willard Libby was sure of it.

It was the mid-1940s, and the US chemist’s goal was to find a radioactive form of carbon, carbon-14, in nature. He had realised that, if it were there, it would leave a slowly decaying trace in dead plants and animals – so finding out how much was left in their remains would reveal when they died.

But Libby had to prove that carbon-14 existed in the wild in concentrations that matched his estimates. Other scientists had only ever detected carbon-14 after synthesising it in a lab.

Libby reasoned that living things would deposit it in their excrement, which is why he turned to sewage. Sewage produced by the people of Baltimore, to be precise. And he found what he was looking for.

Libby didn’t know it then but the idea that you could use radioactive carbon – radiocarbon – to date things would have all kinds of diverse applications.

Since the mid-20th Century, radiocarbon dating has confirmed the age of countless ancient artefacts, helped solve missing person cases, and put ivory traffickers in jail. It has even enabled scientists to understand the intricacies of Earth’s climate. Radiocarbon dating is one of the keys that unlocks our world.

But how does carbon-14 come into existence in the first place? Libby understood that it was being produced constantly by cosmic rays striking nitrogen atoms in the Earth’s atmosphere and changing their structure. The resulting carbon-14 atom quickly combines with oxygen to make radioactive carbon dioxide (CO2).

Back on the ground, plants absorb some of that radioactive CO2 in the air as they grow, as do the animals – including humans – that eat them. While a plant or animal is alive, it keeps replenishing its internal store of carbon-14 but, when it dies, that process stops. Because radiocarbon decays at a known rate, measuring how much is left in organic material will tell you the material’s age. A clock that starts ticking the moment something dies.
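The arithmetic behind that clock is a single exponential-decay formula, sketched below in Python. This example is not from the article; it simply inverts the decay law using carbon-14’s accepted half-life of about 5,730 years:

```python
import math

HALF_LIFE_YEARS = 5_730  # accepted half-life of carbon-14

def radiocarbon_age(fraction_remaining: float) -> float:
    """Estimate the years since death from the fraction of the
    original carbon-14 still present, via exponential decay."""
    if not 0 < fraction_remaining <= 1:
        raise ValueError("fraction_remaining must be in (0, 1]")
    # Solve N/N0 = 0.5 ** (t / half_life) for t
    return -HALF_LIFE_YEARS * math.log(fraction_remaining) / math.log(2)

print(round(radiocarbon_age(0.5)))   # half the carbon-14 left -> 5730 years
```

A sample retaining only about 0.2% of its carbon-14 works out to roughly 51,000 years – close to the practical limit of the technique.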

‘Putting things in order’

Once Libby confirmed there was carbon-14 in the methane gas from Baltimore sewers, he went on to detect radiocarbon in many different things, allowing him to prove how old they were – from the linen wrappings of the Dead Sea Scrolls to a piece of a ship found in the tomb of Sesostris III, an Egyptian king who lived nearly 4,000 years ago.

“This is a problem where you won’t tell anybody what you’re doing. It’s too crazy,” Libby later said. “You can’t tell anybody cosmic rays can write down human history. You can’t tell them that. No way. So we kept it secret.”

But once he had proved it worked, he let the world know. And, in 1960, Libby was awarded the Nobel Prize in Chemistry.

His technique works on organic material that is up to 50,000 years old. Older than that, and there is too little carbon-14 left. Carbon-14’s gradual decay is what makes radiocarbon dating possible – but that also means you can only go back so far.

Montage showing Dead Sea Scroll fragments
Dead Sea Scroll fragments were among the first items dated using carbon-14. Credit: Getty Images/BBC

Radiocarbon dating is now central to our understanding of history.

“In terms of putting things in order, in terms of being able to compare between different regions in particular, and understand that pace of change, it has been really important,” explains Rachel Wood, who works in one of the world’s most distinguished radiocarbon dating labs, the Oxford Radiocarbon Accelerator Unit.

She and her colleagues date materials including human bones, charcoal, shells, seeds, hair, cotton, parchment and ceramics, but also stranger substances. “We do the odd really unusual thing, like fossilised bat urine,” she says.

The lab uses a device called an accelerator mass spectrometer to directly quantify the carbon-14 atoms in a sample – unlike Libby, who was only able to measure the radiation emitted and thereby infer how much carbon-14 a sample contained. The accelerator can also date tiny samples, in some cases a single milligram, whereas Libby needed much more material.

Dating skeletons

Removing carbon-containing contaminants can take weeks, but once done the accelerator readily spits out a sample’s estimated age.

“It’s really exciting to be able to see the results immediately.”

Rachel Wood, Oxford Radiocarbon Accelerator Unit.

Radiocarbon dating has settled some long-standing arguments. Take the human skeleton discovered by theologian and geologist William Buckland in Wales in 1823. Buckland insisted it was no more than 2,000 years old, and for more than a century, no-one could prove he was wrong. Radiocarbon dating eventually showed it was actually between 33,000 and 34,000 years old – the oldest known buried human remains in the UK.

More recent human remains have also revealed their secrets thanks to this technology.

In 1975, a 13-year-old girl called Laura Ann O’Malley was reported missing in New York. Remains found in a California riverbed in the 1990s were thought to have originated in a historic grave until radiocarbon dating earlier this year showed they belonged to someone born between 1964 and 1967, who most likely died between 1977 and 1984. This fitted the timeline of O’Malley’s disappearance, and DNA analysis confirmed the remains were hers.

Forensic analyses often rely on the “bomb pulse” method of radiocarbon dating, which is possible due to the hundreds of atmospheric nuclear weapons tests that occurred during the 1950s and 1960s.

The blasts sent vast quantities of additional carbon-14 into the air, but these artificially high levels have been falling ever since. And so, by comparing carbon-14 measurements with that downward-sloping curve, it is possible to date materials from the mid-20th Century onwards very precisely – to within a year or so, in some cases.
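In code, the comparison against that downward-sloping curve is essentially a curve lookup. The calibration points below are invented for illustration – real analyses use published atmospheric radiocarbon records – but the interpolation step works the same way:

```python
# Hypothetical, illustrative calibration points (year -> atmospheric
# carbon-14 level relative to a pre-bomb baseline of 1.0).
CURVE = [(1965, 1.80), (1975, 1.35), (1985, 1.22), (1995, 1.12),
         (2005, 1.06), (2015, 1.02)]

def bomb_pulse_year(level: float) -> float:
    """Place a measured carbon-14 level on the declining side of the
    bomb pulse by linear interpolation between calibration points."""
    for (y0, l0), (y1, l1) in zip(CURVE, CURVE[1:]):
        if l1 <= level <= l0:
            # Interpolate within the bracketing segment of the curve
            return y0 + (l0 - level) / (l0 - l1) * (y1 - y0)
    raise ValueError("level outside the declining part of the curve")

print(round(bomb_pulse_year(1.35)))  # -> 1975
```

Because the curve falls steeply, a small difference in measured carbon-14 pins the date down tightly – the source of the method’s year-scale precision.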

“I don’t know of any other technique that comes close to that,” says Sam Wasser at the University of Washington. “It’s extraordinarily useful.”

Wasser has analysed radiocarbon dating results from ivory samples as part of efforts to crack down on the illegal wildlife trade. The data can show whether the elephants died before or after the 1989 ban on ivory sales, whatever traffickers may claim.

Man with ivory
Thanks to carbon-14 dating, scientists can determine if ivory was poached illegally.

‘Smoking gun’

One man jailed on this evidence is Edouodji Emile N’Bouke, convicted in Togo in 2014. While DNA tests uncovered the geographic origin of the ivory he trafficked, radiocarbon dating showed exactly when the elephants were poached. These two strands of evidence were “the smoking gun critical to bringing N’Bouke to justice,” the US State Department later said.

The same techniques have exposed artworks as forgeries. Take the painting of a village scene that one forger claimed was made in 1866. Radiocarbon dating confirmed that it had actually been painted, and artificially aged, during the 1980s.

Radiocarbon dating has also shed light on climate change by helping scientists understand the effect of fossil fuel emissions on Earth’s climate. Studies of glaciers and ancient ecosystems, for example, are made much more accurate thanks to radiocarbon dating technology.

This research has informed reports by the Intergovernmental Panel on Climate Change (IPCC), which in 2007 was awarded the Nobel Peace Prize – along with former US Vice President Al Gore – for its work disseminating information about climate change.

“It’s also very useful for people who want to use climate models to predict what the climate may potentially be like in the future,” says Tim Heaton at the University of Leeds. Scientists can use radiocarbon records to establish how Earth’s climate changed over time, and check climate models against those results, validating the models’ accuracy.

Carbon-14 dilution

But another clock is ticking. Fossil fuels contain copious quantities of carbon but no carbon-14 – the organisms that became coal, natural gas and oil died so long ago that the carbon-14 they once contained has long since decayed. That means fossil fuel emissions are diluting the carbon-14 in Earth’s atmosphere today, which has a direct effect on how much radiocarbon ends up in living things.

Heather Graven at Imperial College London says that, in the worst-case scenario of extremely high emissions during the next century or so, the accuracy of radiocarbon dating could crumble.

“Something that’s freshly produced will have the same [radiocarbon] composition as something that’s maybe 2,000 years old,” she says. Radiocarbon dating wouldn’t be able to tell the two apart.

Rachel Wood argues that these problems won’t arise any time soon, but Paula Reimer, professor emeritus at Queen’s University Belfast, thinks fossil fuel emissions do “put a damper” on radiocarbon dating and ultimately threaten its accuracy.

She spent many years working to heighten the precision of radiocarbon dating, by making painstaking measurements of the radiocarbon found in tree rings, for example, to reveal variations in atmospheric levels of carbon-14 across millennia.

Extremely precise curves of radiocarbon levels are now available dating back 14,000 years or so. But fossil fuel emissions may eventually bring this era of incredible precision to an end.

By Chris Baraniuk, BBC World Service. This content was created as a co-production between Nobel Prize Outreach and the BBC. 

Published February 2026

To cite this section
MLA style: The clock that starts ticking when something dies. NobelPrize.org. Nobel Prize Outreach 2026. Sun. 17 May 2026. <https://www.nobelprize.org/stories/carbon-14/>

Five documentaries inspired by the Nobel Peace Prize

The Nobel Prize, National Geographic Documentary Films and Academy Award-winning filmmaker Orlando von Einsiedel have collaborated on a 5-part short documentary series, celebrating the ongoing impact and influence of Nobel Peace Prize laureates around the world. Watch the films here.

In 1998, Makur Diet lost his leg to a bullet in war-damaged South Sudan. Despairing for his future, Makur was close to committing suicide, until he was given a prosthetic leg. Makur realised that he now had a chance to do something good in the world and decided to devote his life to helping other amputees. At an International Committee of the Red Cross (ICRC) centre in South Sudan, Makur makes prosthetic legs and gives hope to other people who have lost a limb. He is helping to rebuild his country, one leg at a time.

Makur works at an International Committee of the Red Cross centre in South Sudan. The ICRC has received the Nobel Peace Prize three times: in 1917, 1944 and 1963.

Still human

In the chaos of the world’s largest refugee camp, Kamal Hussein is a beacon of hope. From his small ramshackle hut, and armed only with a microphone, he has taken it upon himself to try and reunite the thousands of Rohingya families who have been torn apart by violence and ethnic cleansing in Myanmar. However, in finding lost family members and bringing them back together, he is not just helping them. He is also finding peace for himself.

Kamal’s work is funded by the United Nations High Commissioner for Refugees, recipient of the Nobel Peace Prize in 1954 and 1981.

Lost and found

In an area of Iraq destroyed by ISIS, Hana Khider leads an all-female team of Yazidi deminers in their attempts to clear the land of mines. Their job involves painstakingly searching for booby traps in bombed-out buildings and fields, where one wrong move means certain death. Even though the devastation caused by ISIS is still evident and local people are suffering, they are trying to put the past behind them and remain hopeful.

Hana works for the Mines Advisory Group, an organisation that is part of the International Campaign to Ban Landmines, a coalition awarded the Nobel Peace Prize in 1997.

Into the fire

How would natural habitats develop without human interference? In this documentary we follow an international team of scientists and explorers on an extraordinary mission in Mozambique to reach a forest that no human has set foot in. The team aims to collect data from the forest to help our understanding of how climate change is affecting our planet. But the forest sits atop a mountain, and to reach it, the team must first climb a sheer 100m wall of rock.

The scientists’ work is based on research conducted by the Intergovernmental Panel on Climate Change, recipient of the Nobel Peace Prize in 2007.

The lost forest

Here you get to follow two South African musicians: Tsepo Pooe, who grew up in Soweto township, and Lize Schaap, who grew up in wealthy Pretoria. They are both part of a unique orchestra, the Miagi Orchestra. Inspired by the legacy of 1993 Nobel Peace Prize laureate Nelson Mandela, the orchestra aims to help the nation overcome decades of violence, conflict and division through the power of music.

An unfinished symphony

To cite this section
MLA style: Five documentaries inspired by the Nobel Peace Prize. NobelPrize.org. Nobel Prize Outreach 2026. Sun. 17 May 2026. <https://www.nobelprize.org/stories/peace-prize-documentaries/>

How X-rays and crystals revealed the true nature of things

The story behind the 100-year-old discovery that continues to render Nobel Prizes.

Alone in the Martian desert, a robot was looking for answers. In 2012, Nasa’s Curiosity rover scooped up a small pile of sand, ingested it, and blasted it with X-rays. The intrepid bot was going to find out what that sand was made of, which could in turn reveal whether water had once been present on Mars – any liquid water in this dusty, red plain was long gone.

Lawrence Bragg and Sir William Bragg
Lawrence Bragg and Sir William Bragg. Smithsonian Institution Archives.

Nearly a century earlier, in 1915, William and Lawrence Bragg – a father and son team – had been awarded the Nobel Prize in Physics for their work on X-ray crystallography, a technique that makes it possible to determine the atomic and molecular structures of crystals by studying how X-rays diffract, or deviate, when they interact with them.

Many materials, from tiny proteins to metals, can form crystals, and X-ray crystallography became the gold standard for revealing how various forms of matter are put together.

Back on Earth, Michael Velbel at Michigan State University was waiting eagerly for data from Curiosity on Mars. This was the first time X-ray crystallography had ever been done on another planet.

“I was shadowing the mission all along,” recalls Velbel.

NASA's Curiosity rover
NASA’s Curiosity rover brought X-ray crystallography to another planet for the first time. Credit: Getty Images/BBC

Curiosity’s analyses revealed details of the water content of minerals on Mars, which have given credence to – though not proved – the hypothesis that the planet had large bodies of water billions of years ago.

“We can finally get a grasp on that,” says Velbel.

Knowing what things are made of lets us do some amazing things. Analyses of atomic and molecular structures have helped scientists design drugs, unravel the secrets of DNA, and even make better batteries.

You can tell X-ray crystallography is important because it has played a role in numerous Nobel Prizes – some put the tally at more than two dozen. And yet few people know just how awesome a technique it is.

Regular patterns

“A lot of people call me ‘Chrystal the Crystallographer’ – or ‘C-squared’” jokes Chrystal Starbird at the University of North Carolina at Chapel Hill. She remembers the first time she used X-ray crystallography to determine a molecular structure. “I was looking at something no-one else had looked at before. I was like, ‘Wow, how cool is that!’”

If you’ve ever seen one of those ball-and-stick models of chemical substances, you’ll know what X-ray crystallographers are working towards. They want to find out what atoms are in a material and exactly how they are bonded together.

When Starbird does this kind of analysis, one key early step in the process is taking, say, a protein and working out how to grow crystals of it on a very small scale. Just as water forms ice crystals when it freezes, proteins can form very tiny crystals under certain conditions.

Those crystals are then harvested with tiny hair-like loops – which can be a very tricky procedure – and placed on an X-ray diffractometer.

The crystals are necessary because when you shine X-rays on their ordered structure, you get a regular diffraction pattern – precise markings specific to the chemical nature of the crystal in question.
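That regularity is captured by Bragg’s law, n·λ = 2d·sin θ, which relates the X-ray wavelength and diffraction angle to the spacing d between atomic planes. A minimal sketch, with illustrative numbers (copper K-alpha X-rays, commonly used in crystallography, have a wavelength of about 0.154 nm):

```python
import math

def plane_spacing(wavelength_nm: float, theta_deg: float, order: int = 1) -> float:
    """Solve Bragg's law, n * lambda = 2 * d * sin(theta), for the
    spacing d (in nm) between atomic planes in the crystal."""
    return order * wavelength_nm / (2 * math.sin(math.radians(theta_deg)))

# Copper K-alpha X-rays diffracting at 19.1 degrees:
print(round(plane_spacing(0.154, 19.1), 3))  # -> 0.235 (nm)
```

In a real experiment the wavelength is known and the angles are measured, so each spot in the diffraction pattern yields one inter-plane spacing – the raw material for reconstructing the structure.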

Illustration of x-rays and crystals
When X-rays pass through crystals they produce distinctive diffraction patterns. Credit: BBC

However, proteins are much more complicated than water molecules so conditions must be just right for them to crystallise. Starbird may have to try hundreds of different approaches – using different chemicals, temperatures or humidity levels – before it works.

“I’m somebody who’s OK with delayed gratification,” she jokes.

Mapping insulin

Dorothy Crowfoot Hodgkin
Dorothy Crowfoot Hodgkin.  Photo from the Nobel Foundation archive.

One scientist who would probably have related to that was Dorothy Crowfoot Hodgkin. She spent 34 years using X-ray crystallography to work out the structure of insulin, beginning in the 1930s. Insulin is a hormone that helps control blood sugar levels, but people with type 1 diabetes are unable to produce it.

In Hodgkin’s case, obtaining the insulin crystals wasn’t especially difficult. But because insulin contains no fewer than 788 atoms, it took a long time for her to map the entire structure using early X-ray crystallographic methods. Her achievement made it much easier to mass-produce insulin for the treatment of diabetes.

By the time Hodgkin had finally finished, in 1969, she had already been awarded the 1964 Nobel Prize in Chemistry for her X-ray crystallographic studies. She had also determined the structures of penicillin – an important antibiotic – and vitamin B12.

She died in 1994. At a memorial service the following year, Max Perutz – who also received a joint Nobel Prize in Chemistry for crystallographic work – said, “Her X-ray cameras bared the intrinsic beauty beneath the rough surface of things.” He praised both her kindness and her “iron will” to succeed.

“She was a terrific inspiration.”

Elspeth Garman at the University of Oxford, who knew Dorothy Crowfoot Hodgkin.

Garman describes an X-ray diffraction pattern as “an incredibly complicated reflection.”

X-rays directed at a crystal structure interact with the electrons orbiting atoms inside that structure and diffract, leaving a detectable trace on (in Hodgkin’s day) X-ray photographic film nearby.

The result is a pattern, which you can painstakingly convert into a topographical map of the structure, or a three-dimensional model.

Women excelling

Garman notes that many women have excelled at X-ray crystallography. She credits the Braggs, in part, for this. “They had the most amazing academic tree of women that they encouraged and took on as graduate students when people in other fields didn’t,” she says.

Rosalind Franklin
Rosalind Franklin with microscope in 1955. MRC Laboratory of Molecular Biology, CC BY-SA 4.0, https://creativecommons.org/licenses/by/4.0/, via Wikimedia Commons.

Besides Hodgkin, there was also Rosalind Franklin, whose X-ray diffraction images of DNA were among the many findings she made while trying to work out its structure. Some of her findings influenced Francis Crick, James Watson and Maurice Wilkins in their own endeavours on the subject, for which they were awarded the 1962 Nobel Prize in Physiology or Medicine. Many argue Franklin never received sufficient credit.

X-ray crystallography has also been involved in more recent Nobel Prize-awarded work – including the 2020 Nobel Prize in Chemistry for genome-editing technology, which has roots in crystallographic studies of RNA.

One hugely important application of X-ray crystallography is in drug discovery. It has helped scientists find drugs for sickle-cell disease and even certain cancers, for example.

Rob van Montfort, group leader in the Centre for Cancer Drug Discovery at the UK’s Institute of Cancer Research, says crystallography can reveal which compounds could block or control key proteins in the body, and thus treat a disease.

“X-ray crystallography… provides pictures showing how, exactly, the compound binds to the molecule,” he explains.

Seeing inside batteries

Recent technological developments have allowed increasingly complex crystallography studies, Garman says.

At Diamond Light Source, a science facility in the UK, staff use X-ray beamlines to check the medicinal potential of compounds at high speed, by analysing potential binding-sites on a given protein. “Overnight, you could look at 200,” says Garman. “It’s absolutely staggering.”

Mobile phone charging
Credit: shisheng ling via Getty Images

Researchers have also used this approach to study battery materials – a key technology for the transition away from fossil fuels. Phil Chater, crystallography science group leader at Diamond Light Source, says X-ray crystallography reveals how the materials inside batteries can degrade over time.

Lithium-ion batteries work by allowing lithium ions to travel between layers of material – that’s how they charge up and discharge energy.

“Maintaining that [layer] structure is very important for the prolonged life of these batteries,” Chater says.

But crystallography allows you to see, sometimes, how layers are changing, affecting the ability of the ions to move in and out, he adds. Scientists can then search for ways of overcoming the problem.

A close look at comet ice?

X-ray crystallography has clearly made waves in many fields. But there’s an elephant in the room, says Garman.

A rival technique called cryo-electron microscopy (cryo-EM) is now enabling scientists to derive the structure of certain molecules in a completely different way – by firing beams of electrons at them. Some molecules have traditionally been too small for cryo-EM devices to see, though solutions are emerging on that front.

There’s also artificial intelligence (AI). If AI can accurately predict molecular structures, there may be less need to use X-ray crystallography for this task. But Starbird cautions that there are many structures AI doesn’t predict well.

“I think people have a misconception that crystallography might be done soon, because we have AI – we’re not even close to that,” she says.

The Braggs, presumably, would be glad to hear this. And X-ray crystallography devices may go on even more exciting adventures in the future. Velbel suggests sending one to a distant comet orbiting our sun.

“Wow. I’d want to see what comet ice looks like,” he says, explaining that we might find exciting mixtures of unusual compounds if we could study it up-close. “I think it would be fascinating.”

By Chris Baraniuk, BBC World Service. This content was created as a co-production between Nobel Prize Outreach and the BBC.

Published February 2026

To cite this section
MLA style: How X-rays and crystals revealed the true nature of things. NobelPrize.org. Nobel Prize Outreach 2026. Sun. 17 May 2026. <https://www.nobelprize.org/stories/x-rays-and-crystals/>