Prof. Kardar, could you please start by telling us about your educational background and what inspired your interest in statistical physics?
I was born and raised in Tehran, Iran, where I attended a Don Bosco school for 12 years through high school. Growing up, I enjoyed mathematics and science and fully expected to study science or engineering at a university in Iran. Somewhat unexpectedly, my father encouraged me to study abroad, which eventually led me to Cambridge University in 1976.
At Cambridge, I studied Natural Sciences, which involved classes in mathematics, chemistry, and materials science, as well as physics, which was my true passion. After Cambridge, I moved to MIT for graduate studies in 1979. Choosing MIT was a risk, as I had been admitted without guaranteed financial support. Luckily, Professor Nihat Berker had joined the MIT faculty that same year and was seeking students. While I was first inclined toward general relativity or high-energy theory, meeting Nihat redirected my interests toward statistical physics. My graduate education was significantly enriched by courses from visiting professors: Amnon Aharony on phase transitions and the renormalization group, and Henri Orland on disordered systems.
After earning my PhD, I spent three years as a Junior Fellow at Harvard, which granted me considerable freedom to explore new directions. I feel that this freedom, along with the mentorship of David Nelson and collaborations with colleagues such as Yacov Kantor, Yi-Cheng Zhang, and Ramamurti Shankar, made this period exceptionally productive for me.
One of your early and widely recognized contributions is the Kardar-Parisi-Zhang (KPZ) equation. Could you tell us about how this work came about and what you think has contributed to its broad influence across different fields?
The KPZ work came about during my fellowship at Harvard. I usually spent the summer months at Brookhaven National Laboratory, where Per Bak would gather a group of theorists around him. One summer I had the good fortune to connect with Yi-Cheng Zhang, who had joined Brookhaven soon after completing his PhD with Giorgio Parisi in Rome. Through this connection we started investigating an equation, possibly describing the dynamics of growing interfaces, whose implications were not entirely clear to us.
Returning to Harvard and reflecting further, I recognized an important connection between this equation and a problem I had started with David Nelson on interfaces subject to disorder. Over email with Zhang and Parisi, this work was completed and published in PRL in 1986.
In the nearly forty years since, the KPZ equation has become central to our understanding of fluctuations in systems that evolve over time but not necessarily in equilibrium. Its wide applicability comes from being the simplest equation that captures how local, noisy, and asymmetric influences shape complex behavior, without relying on the details of any specific system. Much like the ubiquitous diffusion equation, the KPZ equation naturally appears whenever fluctuations build up and organize, in everything from traffic flow and biological growth to quantum spins. The Cole-Hopf mapping to the directed polymer problem has allowed mathematicians and physicists to derive exact results for the KPZ universality class.
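In its conventional form, the equation describes the evolution of the height h(x, t) of a growing interface:

\[
\partial_t h = \nu \nabla^2 h + \frac{\lambda}{2} (\nabla h)^2 + \eta(\mathbf{x}, t),
\]

where the term proportional to \nu describes surface relaxation, the nonlinearity proportional to \lambda is the lowest-order term capturing asymmetric lateral growth, and \eta is uncorrelated noise.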
Your research has covered a diverse range of topics. Could you elaborate on this breadth of interest, and share how your students and collaborators contributed to shaping this diverse trajectory?
Since joining the MIT faculty in 1986, I’ve been fortunate to explore many scientific questions, thanks largely to the creativity and energy of my students, postdocs, and collaborators. Early on at MIT, following my postdoctoral interests, we studied the KPZ equation and other dynamic scaling phenomena. This line of research was carried out with my first group of students: Ernesto Medina, Terry Hwa, Deniz Ertaş, and Raissa D'Souza. In parallel, we studied variants of directed polymers and other random systems. This exploration of new directions in quenched randomness involved other talented students and postdocs, including Leon Balents, Larry Saul, Barbara Drossel, Carmen Miguel, Mohammad Kohandel, and later on Sherry Chu.
Another major theme was the statistical mechanics of polymers and polymer networks, mostly with my long-time collaborator Yacov Kantor. Students and collaborators such as Maya Paczuski and Kay Wiese contributed at the beginning. Later on, this line of research expanded naturally into biological polymers, addressing topics like mixed-charge polyampholytes, polymer translocation through pores, and protein knots, and involved Jeffrey Chuang, Andrea Zoia, Alberto Rosso, Michael Slutsky, Ralf Metzler, Paul Dommersnes, and Peter Virnau. Leonid Mirny's joining the MIT faculty was particularly important for this research; incidentally, we have been co-teaching a class on statistical physics in biology for the last twenty years.
There were some brief incursions into neuroscience, addressing statistical physics-inspired questions about vision and synapses, pursued with Mehdi Yahyanejad, Ha Youn Lee, Rava da Silveira, and Christoph Haselwandter.
Another productive and long-running research pursuit has been fluctuation-induced forces. This was initiated with Hao Li and Ramin Golestanian and developed later with Robert Jaffe. With Roya Zandi, Thorsten Emig, Jamal Rahi, Mohammad Maghrebi, and Vlad Golyk, we explored various consequences of confining thermal, quantum, or even non-equilibrium fluctuations, both in soft-matter settings such as membranes and superfluids, and in electrodynamic cases such as Casimir forces and radiative heat transfer.
Starting in the 2000s, I was fortunate to collaborate with Arup Chakraborty, initially at Berkeley and now at MIT. Our joint work on immunology allowed me to learn from Arup and from our shared students and postdocs, including Andrej Kosmrlj, Tom Butler, John Barton, Shenshen Wang, and Andriy Goychuk. Given its relevance to public health and vaccination, this area has recently gained considerable interest.
Another growing area concerns active matter, where energy consumption at microscopic scales leads to unexpected macroscopic behaviors. Through interactions with Alex Solon, Yariv Kafri, and Julien Tailleur (now a colleague at MIT), I am currently interested in this topic. This overview has inevitably grown long, and I must have omitted many contributions and collaborations, for which I apologize.
Having recently received the Boltzmann Medal, could you reflect on what this honor means to you personally and its significance within the field of statistical physics?
The Boltzmann Medal recognizes significant achievements in statistical physics. I’m deeply honored by this recognition, as it highlights what I believe is an exceptionally important but sometimes under-appreciated area of science.
Statistical physics seeks to understand how complex phenomena emerge from interactions among many components—atoms, molecules, cells, people, or stars. It aims to extract collective behaviors from microscopic details often too complicated to track. This gives statistical physics extraordinary reach across topics: materials science, biology, neuroscience, economics, ecology, and social sciences.
My research path illustrates this breadth, addressing polymers, immune systems, biological growth, and even road networks. Despite this broad relevance, statistical mechanics is not always sufficiently appreciated, even within physics itself, where it is overshadowed by trendier topics like particle physics or cosmology. Yet statistical physics addresses questions of both fundamental and practical nature. PhDs trained in this field make important impacts not only within physics but also across university departments and interdisciplinary research centers.
One of my goals in teaching and textbook writing has been to communicate the reach and versatility of statistical physics. We need to highlight this better to inspire greater student interest as well as deeper public appreciation. Receiving the Boltzmann Medal is thus a welcome occasion to further advocate for statistical physics, emphasizing its scientific place within physics and its potential to address diverse questions outside it.
Looking forward, which areas do you believe hold the most promise for future research in statistical physics?
There will certainly be new fundamental discoveries within statistical physics, as well as new arenas of application that may not be easy to predict.
I find the following directions promising. How a closed system comes to thermal equilibrium has been a central question of the field. This topic has found new relevance in connection with closed quantum systems and the Eigenstate Thermalization Hypothesis, i.e., the question of how a single quantum eigenstate can manifest properties traditionally associated with thermal ensembles. Understanding entanglement in this context provides insights into how quantum information equilibrates, with relevance all the way from quantum computation to the fate of black holes.
Another intriguing frontier has arisen due to rapid advances in artificial intelligence and large language models. Statistical physics, the language of collective phenomena and emergence, is ideal for exploring how something like intelligence spontaneously emerges from the relatively simple elements and connections underlying computational networks.
Closer to my current interests, there are interesting issues related to biological evolution in complex, dynamically changing fitness landscapes. Statistical physics tools, such as energy landscapes and stochastic dynamics, can hopefully provide insights into adaptation, speciation, and ecological interactions.
These examples highlight the remarkable breadth and continued vitality of statistical physics, as it constantly reinvents itself and broadens its relevance across scientific disciplines.