We’ve spent the better part of 30 years teaching some of the brightest young people on the planet to code. Computer science departments have swelled. Boot camps have proliferated. The message was clear and consistent: software is eating the world, so you might as well pick up a fork. Now the machine is learning to hold its own fork, and the whole premise is in question. This is not cause for despair. It may be exactly the correction humanity needs.
I say this as someone who trained as a computational physicist and has worked with AI and machine learning for decades. I’ve watched from both sides as the technology industry absorbed an unusual proportion of the world’s technical talent into software roles, much of the work technically sophisticated but intellectually narrow. The threat AI poses to jobs is well-documented territory: large language models can already draft legal documents, write marketing copy, and generate functional code. What has been less explored is AI’s potential to take over the stubbornly hard problems of software production itself, ultimately redirecting human talent toward problems that actually matter.
This is an idea that will trouble a significant portion of the tech industry. Software engineering as a mass profession may turn out to be a historical parenthesis: a bracketed era of intellectual labor that opened with the invention of high-level programming languages and will close with the emergence of AI systems capable of creating software.
The history of programming is the history of abstraction. We started with machine code, raw binary instructions fed directly to the hardware. That gave way to assembly language, then to high-level languages like Fortran and C, and later to increasingly expressive languages like Python and JavaScript, which let programmers think in terms further removed from the machine. With each step up the abstraction ladder, the previous level became something only experts needed to touch. No one writes machine code by hand anymore unless they’re working on a very specific embedded-systems problem. The same is true of assembly. These languages didn’t disappear, but the number of people needed to work at that level shrank to a fraction of what it once was.

AI represents the next rung on that ladder, perhaps the last and most important one. If you can describe what you need in natural language and have an AI system generate the functional code, traditional programming becomes the new assembly: it will still exist for specialized cases, but it will no longer be a mass profession. The entire middle layer of the abstraction stack gets compressed into conversation.
This holds even in quantum computing, where programming is already a specialist’s field. Tools such as Qiskit and Cirq currently require expert knowledge to design quantum circuits, but AI-assisted circuit generation is advancing rapidly. Within a few years, researchers may be able to describe the quantum algorithms they need in plain language and have AI systems generate, optimize, and error-correct the circuits. Programming quantum computers may never become a mass-market skill; the field may skip the bracket entirely, going straight from specialist craft to automation.
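To make the current skill barrier concrete, here is a minimal sketch of what circuit-level quantum programming involves under the hood: simulating a two-qubit Bell-state circuit (a Hadamard gate followed by a CNOT) by hand in plain Python. Frameworks like Qiskit and Cirq hide this linear algebra behind a circuit API; the toy simulator below is purely illustrative, not how those libraries are implemented.

```python
import math

# Two-qubit state vector over the computational basis |00>, |01>, |10>, |11>.
# The system starts in |00>.
state = [1.0, 0.0, 0.0, 0.0]

def apply_hadamard_q0(state):
    """Apply a Hadamard gate to qubit 0 (the left bit in |q0 q1>)."""
    h = 1 / math.sqrt(2)
    new = [0.0] * 4
    for low in range(2):  # loop over the state of qubit 1
        a, b = state[low], state[2 + low]   # amplitudes with q0 = 0 and q0 = 1
        new[low] = h * (a + b)
        new[2 + low] = h * (a - b)
    return new

def apply_cnot(state):
    """CNOT with qubit 0 as control and qubit 1 as target: swap |10> <-> |11>."""
    new = list(state)
    new[2], new[3] = state[3], state[2]
    return new

state = apply_cnot(apply_hadamard_q0(state))
# The result is the Bell state (|00> + |11>)/sqrt(2): measuring either qubit
# instantly fixes the other, the entanglement that quantum algorithms exploit.
probs = [abs(a) ** 2 for a in state]
print([round(p, 3) for p in probs])  # -> [0.5, 0.0, 0.0, 0.5]
```

Even this four-amplitude example demands comfort with complex vector spaces, and real devices add noise, connectivity constraints, and error correction on top, which is exactly the knowledge gap AI-assisted circuit generation aims to bridge.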
So where should human talent go instead? Increasingly, the answer points downward, toward physical foundations, and outward, toward science’s hardest unsolved problems.
Any advance in AI capability ultimately rests on physical hardware: chips fabricated at ever-finer nanometer scales, cooling systems that manage enormous heat loads, and entirely new computing architectures that break with the von Neumann model. TSMC, Intel, Samsung, and a growing ecosystem of quantum hardware companies such as IBM, Google Quantum AI, IonQ, Quantinuum, and PsiQuantum all face the same constraint: they need people who understand physics, materials science, and engineering at a fundamental level.
And quantum computing is just one of several emerging paradigms that demand this expertise. Photonic computing, which uses light rather than electrons to process information, promises significant reductions in energy consumption and higher bandwidth for certain workloads; companies like Lightmatter and Xanadu are building processors that leverage photons for both classical AI acceleration and quantum computation. Neuromorphic computing, which mimics the architecture of biological neural networks in silicon, is a further departure from traditional chip design. Intel’s Loihi and IBM’s NorthPole are early examples of hardware that processes information in a fundamentally different way than conventional processors, requiring engineers who understand neuroscience, analog circuit design, and novel materials on top of traditional semiconductor physics. Each of these paradigms requires deep knowledge of the underlying physics, and none of it can be conjured by prompting a language model.
Quantum computing remains the clearest example of the shift. Building fault-tolerant quantum computers spans condensed matter physics, quantum error correction theory, cryogenics, photonics, and materials science. You cannot solve qubit decoherence by training a model on Stack Overflow answers. I’ve been following this industry for years through Quantum Zeitgeist, and the most consistent theme from hardware teams is that the bottleneck isn’t money or ideas. It’s people. There simply aren’t enough properly trained physicists and engineers to fill the roles these companies need.
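To see why decoherence is a physics problem rather than a software problem, consider a toy model in which a qubit’s coherence decays exponentially with time, as exp(-t / T2). The T2 time and gate duration below are illustrative round numbers I’ve assumed for the sketch, not the specifications of any real device.

```python
import math

# Toy dephasing model: surviving coherence after time t is exp(-t / T2).
# Both constants are assumed round numbers for illustration only.
T2_us = 100.0         # assumed dephasing time, in microseconds
gate_time_us = 0.05   # assumed duration of one gate, in microseconds

def coherence_after(n_gates):
    """Fraction of coherence surviving a sequence of n_gates gates."""
    return math.exp(-n_gates * gate_time_us / T2_us)

for n in (100, 1000, 10000):
    print(n, round(coherence_after(n), 3))
# 100 gates keep ~95% of the coherence; 10,000 gates keep under 1%.
```

The sketch makes the hardware teams’ point: errors compound multiplicatively with circuit depth, so deep algorithms demand better materials and quantum error correction, neither of which can be prompted into existence.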
The shift extends beyond computing. Consider what becomes possible when AI handles the cognitive heavy lifting that millions of talented people currently spend on software.
NASA’s Artemis program is working to return humans to the moon, something that hasn’t been accomplished since 1972. The fact that humans went to the moon with a slide rule and then didn’t return for 50 years, developing increasingly sophisticated apps for food delivery, is one of the more damning indictments of how we have allocated our collective intelligence. SpaceX is developing Starship with the express goal of making Mars missions possible. After decades as a punch line, fusion energy has attracted more than $9.8 billion in private investment, with companies such as Commonwealth Fusion Systems, TAE Technologies and Helion Energy pursuing reactor designs that require expertise in plasma physics and superconducting magnet design. Even in biology, DeepMind’s AlphaFold showed that AI can predict protein structures, but translating those predictions into actual treatments still requires human researchers who understand cell biology and clinical realities.

These are not problems that yield to better prompting strategies. They need scientists and engineers who can work at the boundaries of known physics.
If this thesis is correct, the implications for education are substantial. For two decades, universities have been under tremendous pressure to produce job-ready software developers. Adjacent fields such as physics, mathematics, and electrical engineering saw enrollment stagnate as students chased the economic promise of Silicon Valley.
That calculus is now changing. Once AI can generate competent code, the premium on a computer science degree focused primarily on software development will quickly diminish. But understanding quantum mechanics, thermodynamics, electromagnetism, and advanced mathematics will only grow in importance. These fields underpin physical systems that cannot be built with AI alone, and they cultivate the kinds of thinking most resistant to automation: comfort with ambiguity, deep mathematical reasoning, and the ability to model systems from first principles.
My advice to anyone reading this, whether you’re a student, a mid-career professional, or someone watching the AI wave with growing trepidation, is to start learning about new computing paradigms now. Quantum computing is not a distant curiosity; it is a fast-growing industry in dire need of people who understand the science. Photonic and neuromorphic computing are following a similar trajectory, creating demand for physicists and engineers who can think beyond transistors. You don’t need a PhD to get started. IBM offers free access to quantum hardware through the cloud, Intel has made its neuromorphic development tools publicly available, and open-source photonic simulation frameworks are emerging from companies like Xanadu. You can take courses online from MIT, Stanford, and others. The skills gap is real, but the barrier to entry is not access. It is a willingness to engage seriously with genuinely difficult subjects, physics and mathematics, that cannot be shortcut by asking a chatbot. The difficulty is what makes it worthwhile.
Physics departments, which have struggled for years to attract students, may find new relevance. Materials science, long considered an unglamorous field, is likely to see surging interest as demand grows for new substrates, better qubits, photonic interconnects, and neuromorphic architectures. The people who succeed in the next era will be those who go deep rather than broad, investing in understanding the fundamental workings of nature rather than the latest JavaScript framework.
Conversations about AI and employment are dominated by anxiety, and rightly so. People will lose their livelihoods. Industries will be disrupted. But alongside that necessary concern, there is a story that deserves more attention. For decades, we devoted most of our brightest technical minds to the relatively narrow problem of making software. We built extraordinary digital infrastructure, but we left some of our most important challenges in science and engineering understaffed and underfunded. We stopped going to the moon. We delayed fusion by decades. We left quantum computing a mere curiosity while the people who could have advanced it were busy optimizing ad-click algorithms.
AI is now solving the software problem. It won’t be perfect, and it won’t happen overnight, but the direction is clear. The question is no longer whether a machine can do the job; it is what humans choose to do instead. The most hopeful answer is that, after a long and lucrative detour, we finally get back to the hard stuff. Physics. Hardware. Grand engineering challenges. Fundamental questions about the nature of reality that we’ve been too busy writing code to pursue properly.

The software bracket is closing. What happens next may surprise us. But only if we have the courage to go deep.
