The Higgs mechanism is responsible for (most of) baryonic mass

You know that irritating factoid that you are aware is wrong, but appears again and again in the news, in magazines and in blogs? The one that is said by people who should know better and you wish they would stop disseminating it? This blog post is about such a factoid. Here is the factoid

The Higgs mechanism shouldn’t be referred to as the origin of mass, as the Higgs mechanism is not responsible for baryonic masses. It only gives masses to the leptons, while most mass in our bodies comes from baryons. These attain their masses from an entirely different source, the non-perturbative effects of the strong force…..

(this isn’t a direct quote from anyone, but most physicists will have heard something of this sort)

Now, this statement is partly right, which is why so many people say it. But ‘partly’ is only around 40% right, and physics isn’t a first-past-the-post electoral system. The problem is that it is also partly wrong, and here ‘partly’ means around 60% wrong. Quantitatively, to the best of our current knowledge the Higgs mechanism is responsible for about 60% of the baryonic mass of the proton and neutron.

Why? As we all learned from the books that Mr. Weinberg kindly donated to the kindergarten, the Higgs field couples to elementary fermions. When the Higgs acquires a vacuum expectation value, this turns the Yukawa coupling of the Higgs to fermions into a mass term for the fermions. The mass of the fermion is proportional to the strength of its coupling to the Higgs – and so the electron mass is set by the electron Yukawa coupling and the muon mass is proportional to the muon Yukawa coupling.
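Schematically, in textbook notation (a sketch only, with $v \approx 246$ GeV the Higgs vacuum expectation value):

$$\mathcal{L} \supset -\, y_f\, \bar{\psi}_f H \psi_f \;\;\longrightarrow\;\; -\,\frac{y_f v}{\sqrt{2}}\, \bar{\psi}_f \psi_f\,, \qquad m_f = \frac{y_f v}{\sqrt{2}}\,,$$

once the Higgs field is replaced by its vacuum expectation value $\langle H \rangle = v/\sqrt{2}$. A Yukawa coupling of order one then gives a mass of order $v/\sqrt{2} \approx 174$ GeV.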

This also holds for the quarks – and so the top quark, which has a Yukawa coupling of unity to the Higgs field, turns into the mahoosive ‘who ate all the pies?’ fat bastard of the Standard Model, only slightly lighter than a gold nucleus.

What about the up and down quarks? These size-zero models have elementary masses, induced by the Higgs field, of only a few MeV. This is vastly smaller than the top, and in particular much smaller than the proton and neutron masses of a bit over 900 MeV. It follows that, when we say the proton is made of two ups and one down quark, the elementary Higgs-induced masses of these quarks don’t contribute much to the overall total. This leads to the refrain that I am ranting against, “The Higgs mechanism doesn’t contribute to the proton mass…..”
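To put rough numbers on ‘don’t contribute much’ (taking current-quark masses of roughly $m_u \approx 2$ MeV and $m_d \approx 5$ MeV):

$$2 m_u + m_d \approx 2(2\ \text{MeV}) + 5\ \text{MeV} \approx 9\ \text{MeV}\,, \qquad \frac{9\ \text{MeV}}{938\ \text{MeV}} \approx 1\%\,.$$

The bare, Higgs-induced quark masses supply only around a percent of the proton’s mass.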

Where does the proton mass come from? The proton is a strongly interacting kludge of quarks, antiquarks and gluons held together by the complicated mess that is the strong force. Mass is energy and energy is mass, and the mass of the proton arises from the energy associated with this strong-force mess.

What sets this energy? The scale where the strong force becomes strong – and here the Higgs returns to the game. Within the dynamics of the Standard Model, how the strong force behaves with energy depends on the number of particles charged under it – couplings run with energy, and at lower energies the strong force becomes stronger and stronger.

Now, the form of this running is not set from on high. It is dynamically determined by the number, nature and mass of the particles that interact under the strong force. Enter Higgs – because the effect of the Higgs mechanism is to modify this running. It does so because it gives large masses to the top, bottom and charm quarks and so, at energies below these masses, decouples them from the running of the strong coupling constant.

Calculating the effect of this is a straightforward exercise in graduate quantum field theory. The result is that, without the Higgs mechanism, the strong force would become strong at lower energies than it does in nature – around 100 MeV rather than the observed 250 MeV. This also implies that in this alternative world without the Higgs mechanism, the bound states of the strong force – such as the proton and neutron – would have proportionately lower masses than they do in our world with the Higgs mechanism.
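As a rough illustration, here is a one-loop sketch of that exercise in Python. The inputs (alpha_s at the Z mass and the quark masses used as thresholds) are assumed round numbers, and a crude one-loop treatment will not reproduce the 100 MeV and 250 MeV quoted above – that needs higher-order running and careful threshold matching – but the direction and rough size of the effect come out.

```python
import math

# Illustrative inputs (assumed round numbers, not a precision calculation)
ALPHA_S_MZ = 0.118                              # strong coupling at the Z mass
MZ = 91.19                                      # GeV
M_TOP, M_BOTTOM, M_CHARM = 173.0, 4.18, 1.27    # GeV, decoupling thresholds

def run(alpha, mu_from, mu_to, nf):
    """One-loop running of alpha_s from mu_from to mu_to with nf active flavours."""
    b0 = (33 - 2 * nf) / (12 * math.pi)
    return alpha / (1 + 2 * b0 * alpha * math.log(mu_to / mu_from))

def lam(alpha, mu, nf):
    """One-loop Lambda_QCD: the scale at which alpha_s formally diverges."""
    b0 = (33 - 2 * nf) / (12 * math.pi)
    return mu * math.exp(-1.0 / (2 * b0 * alpha))

# Both worlds share the same coupling above the top mass, where all six quarks
# are active whether or not the Higgs gives them masses.
alpha_top = run(ALPHA_S_MZ, MZ, M_TOP, nf=5)

# With the Higgs: heavy quarks drop out of the running below their masses.
a = run(alpha_top, M_TOP, M_BOTTOM, nf=5)
a = run(a, M_BOTTOM, M_CHARM, nf=4)
print("Lambda, heavy quarks decoupled: %.0f MeV" % (1000 * lam(a, M_CHARM, nf=3)))

# Without the Higgs: all six quarks stay light and never decouple.
print("Lambda, nothing decouples:      %.0f MeV" % (1000 * lam(alpha_top, M_TOP, nf=6)))
```

Removing the heavy-quark thresholds pushes the scale at which the strong coupling blows up substantially downwards, which is exactly the Lambda_QCD effect described above.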

The upshot – the Higgs mechanism is responsible for the majority of the baryonic mass. This doesn’t occur at tree level – but it is true radiatively, through the role of the Higgs in decoupling the heavy quarks, thereby modifying the running of the strong force and increasing the value of Lambda_QCD.

Grant Applications: Then and Now

Writing grants is a bane of academic life. Recently I came across an old astronomy grant application, submitted almost exactly (to the day) two hundred and fifty years ago.

The proposal was to observe the transit of Venus across the sun, from three locations (Hudson’s Bay, the North Cape and Tahiti) to measure the sun’s parallax and thereby the distance to the sun.

It begins

To the King’s Most Excellent Majesty. The Memorial of the President, Council and Fellows of the Royal Society of London for improving Natural Knowledge Humbly sheweth –

That the passage of the planet Venus over the Disc of the Sun, which will happen on the 3rd of June in the year 1769, is a Phaenomenon that must, if the same be accurately observed in proper places, contribute greatly to the improvement of Astronomy on which Navigation so much depends….

Reading it with a modern eye, one sees:

  1. Gratuitous sucking-up to the person likely to review the grant
  2. Making a case for science of fundamental importance
  3. Supported by references to its industrial impact
  4. Astronomers asking for government money to travel to exotic locations 

Plus ça change, then…. The grant was awarded.

I learnt about this through reading Patrick O’Brian’s biography of Joseph Banks. O’Brian is probably my all-time favourite writer. To quote Lucy Eyre, there are two types of people in the world: Patrick O’Brian fans, and people who haven’t read him yet. Read him. (If Nelson’s navy isn’t your cup of tea, then read Testimonies, his hauntingly beautiful love story set in the Welsh valleys).

How to Count

One potato, two potato, three potato, four!
Five potato, six potato, seven potato, more!

Counting is one of the oldest skills we learn, and most adults cannot consciously remember being unable to count. We learn to count at a young age, and it is hard to recall what we ever found hard about it. To find out how many of something there are, you just enumerate them. For example, every ten years a census is taken of the population. This is used to give a clear account of how many people live in a particular area at a particular time. In a literal sense, you just count heads.

Counting in physics seems easy as well. If I want to count how many atoms are in the room, it may take longer than the age of the universe to actually do it, but the principle is the same. I just count, and count, and count….and after a long, long time I would have the answer.

This post is about one of the ways counting can actually be rather subtle. In physics this goes under the exotic name of `infrared safety’, but the idea is much more general, and also applies to many areas outside physics (as we shall see).

To describe it, here’s a picture. Bang! This is the (slightly processed) result of a collision of two protons at the Large Hadron Collider (copyright CERN’s ATLAS experiment)

At first sight this picture may appear a bit strange. What is going on? What is happening? The relevant part is the two red bunches of particles. These bunches of particles are called `jets’, and each represents a large grouping of particles all heading in approximately the same direction. In colliders such as the LHC, these `jets’ are one of the common manifestations of the strong force that binds quarks and gluons together. If you want to understand the physics of the Large Hadron Collider, and whether any new physics may be lurking there, it is important to understand these jets.

In practice, given the number of collisions at the LHC, it is clear that computer algorithms are necessary to work with jets and to decide what is and what isn’t a ‘jet’ in any one particle collision. Nature doesn’t, by itself, produce particles with a label on them in Times New Roman stating that this set of particles belongs to jet 1 and this other set belongs to jet 2. This requires a set of rules – for example, take an energetic particle and draw a cone around it of a certain angle, and say that, by definition, every particle within this cone is part of the jet.
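As a toy illustration of such a rule, here is a deliberately naive cone-style grouping in Python. The particles and the cuts are made up, and this is not what the experiments actually run (they use infrared-safe algorithms such as anti-kT); it is only the ‘draw a cone around an energetic particle’ idea in code.

```python
import math

def delta_r(p1, p2):
    """Angular separation in the (eta, phi) plane; particles are (pt, eta, phi)."""
    deta = p1[1] - p2[1]
    dphi = (p1[2] - p2[2] + math.pi) % (2 * math.pi) - math.pi
    return math.hypot(deta, dphi)

def naive_cone_jets(particles, cone_radius=0.4, seed_pt=20.0):
    """Group particles into cones around the hardest remaining seeds."""
    remaining = sorted(particles, key=lambda p: p[0], reverse=True)
    jets = []
    while remaining and remaining[0][0] > seed_pt:
        seed = remaining[0]
        jet = [p for p in remaining if delta_r(seed, p) < cone_radius]
        jets.append(jet)
        remaining = [p for p in remaining if p not in jet]
    return jets

# Made-up event: two hard clusters plus some soft debris.
event = [(100.0, 0.10, 0.20), (80.0, 0.15, 0.25), (60.0, -1.00, 2.90),
         (1.0, -1.10, 3.00), (0.5, 2.00, -1.00)]
for i, jet in enumerate(naive_cone_jets(event), 1):
    print("jet %d: %d particles, total pt %.1f GeV" % (i, len(jet), sum(p[0] for p in jet)))
```

Naive seeded cones of this sort turn out to suffer from exactly the problem discussed next: the jets you find can change when an arbitrarily soft particle is added, which is why real algorithms are designed to be infrared safe.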

However, once you have decided what a jet is, then it may seem quite simple to ask: how many particles are there in a jet? After all, we can surely just count them. We look at the jet, we look at all the particles within it, and we count them up. Simple! But this is where the problem lies. To see the problem, consider one of the most well-known particles, the photon, the electromagnetic force carrier. Photons come with many different energies. The most energetic are called gamma rays; moving down in energy, we have X-rays, ultraviolet light, visible light, infrared, microwaves and radio waves. The less energetic a photon is, the more sensitive the apparatus needed to detect it. Equipment that detects gamma rays misses X-rays. X-ray detectors miss infrared light. Infrared detectors miss microwaves. In any jet, however defined, there are photons going down to infinitesimally small energies.

So in the end, this means that the question ‘How many particles are in a jet?’ is, by itself, actually meaningless. Insofar as there is an answer, it is infinite – by going down to infinitely small photon energies, you can make the number of particles infinitely large. To make the question sensible – and physically meaningful – we have to instead ask ‘How many particles are there that have at least a minimal energy?’ The imposition of this threshold is essential for the question to make sense in the first place.
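A toy sketch of why the threshold matters (the 1/E soft-photon spectrum and the normalisation here are pure assumptions for illustration, not real jet data):

```python
import math

E_MAX = 100.0   # GeV, roughly the energy scale of the jet (assumed)
RATE = 5.0      # expected photons per unit of log-energy (made-up normalisation)

def expected_photons(e_min):
    """Expected photon count above threshold e_min for a 1/E emission spectrum."""
    return RATE * math.log(E_MAX / e_min)

for e_min in (1.0, 1e-3, 1e-6, 1e-9):
    print("threshold %7.0e GeV -> about %5.1f photons" % (e_min, expected_photons(e_min)))
```

With a threshold the count is finite and well defined; push the threshold towards zero and the answer never settles down, it just keeps growing.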

This is the simplest example of ‘infrared safety’ – which, roughly, says that the results of physical observables shouldn’t be sensitive to the addition (or subtraction) of extremely low-energy particles into the system. In a more subtle way, beyond simply counting particles, this requirement of ‘infrared safety’ is also crucial to a good definition of ‘what is a jet’. If your classification of where the jets are in a big smackeroo of a collision at LHC energies could change with the addition of a tiddly little microwave photon, your classification algorithm is a bad one. Indeed, infrared safety is one of the key defining properties of modern jet algorithms, such as the anti-kT algorithm.

I introduced this idea of infrared safety through a counting problem – how many particles are there? However, it has broader implications as the underlying effect holds whenever people try to count without putting a lower threshold in. As such, it is the source of many of the pseudo-statistics that permeate public life. A good example of this is counting hate crimes.

Let us switch now from the Large Hadron Collider to closing time at a large city-centre pub on a Saturday night. Alfie says something to Bob – who doesn’t like it, and responds by punching Alfie in the face. Rather unpleasant – and a clear case of assault. But this is also something that has happened for as long as young men have gone out drinking at the weekend, and unless you were personally involved you would not think twice about it.

However, call this a hate crime, and you probably react quite differently. What is needed to turn that drunken right hook into a hate crime? This is a legal question with a legal answer – so going to the Crown Prosecution Service, we find

‘Any criminal offence which is perceived by the victim or any other person, to be motivated by hostility or prejudice, based on a person’s disability or perceived disability; race or perceived race; or religion or perceived religion; or sexual orientation or perceived sexual orientation or transgender identity or perceived transgender identity.’

What does this mean? It means that if anyone, anywhere – present or not, for whatever reason and however unjustified – decides that a crime is a hate crime, then it is a hate crime (I have deliberately set up the description so that there is no doubt that a crime is involved). In the case of no crime, the police record it as a hate incident.

Whether you like it or not (and you shouldn’t), this is the legal definition of a `hate crime’. It is also a terrible definition. Why? It has no threshold. The perception doesn’t have to be correct. It doesn’t have to be justified. It doesn’t even have to be reasonable. Any crime committed in the UK can be converted by you, dear reader, into a hate crime. You simply have to report to the police that you believe this crime was committed due to hostility based on race or gender or religion or sexuality – and the beauty of the definition is that your belief explicitly does not require any evidence, reason or logic to support it.

In addition to having no threshold, this legal definition of `hate crime’ also violates the rules of natural justice – under which if A alleges something against B, evidence has to be adduced to support the accusation, the accused can offer a defence, and a neutral judge, jury or panel decides on the truth of the allegations.

Nonetheless, there are lots of people who want to count `hate crimes’, and the statistic seems to have some political weight, despite the fact it is close to meaningless.

We can now make the connection back to jet algorithms. In both cases, the counting problem arises when there is no threshold of `seriousness’ – whether `seriousness’ means a photon energy, a burden of proof, or the gravity of an allegation. Sensible counting always requires a threshold. If you try to count something with either no threshold, or a threshold set at such a low level that it constantly fluctuates, you get nonsense. You measure something – but something that is entirely unrelated to what you are trying to measure. This is the case for the pseudo-statistic of ‘numbers of hate crimes’ – which most likely really measures something like social media shares on how to report hate crimes.

(Of course, there are various way this statistic could be converted into a more meaningful one. For example, it could restrict to cases where there is evidence, meeting the legal burden of proof, that a crime was actually motivated by hate.)

There is another topical area – sexual harassment – which is also prone to a similar counting problem. What fraction of people have been sexually harassed? How does this compare to ten years ago? Fifty years ago? If defined properly, such a question is interesting and tells us something useful about society. However, as a counting question its meaningfulness again depends on the use of a consistent threshold and there are many circulating definitions of ‘harassment’ that do not have a lower threshold.

For example, the UK Citizens Advice Bureau says that

Harassment is when someone behaves in a way which offends you

Such a broad and open definition is easy to interpret in a way through which every adult could, if they wish, declare themselves to have experienced sexual harassment.

In contrast, the use of more robust thresholds – for example that of sexual assault, which requires physical contact as a necessary ingredient – would give something which can be meaningfully counted and does result in a usable statistic.

What is the upshot? Counting is subtle, in physics and in life. When you are counting something large and discrete – like potatoes or murders – it is hard to go wrong. However, when you are counting something on a sliding scale that reaches down to infinitesimals, you need a threshold – and if you don’t have one, the number you produce is unlikely to mean anything.

The Logarithmic Nature of Scientific Contribution

The 2017 Physics Nobel Prize was awarded – to no-one’s surprise – for the discovery of gravitational waves. Also unsurprising were the awardees – Rainer Weiss, Kip Thorne and Barry Barish. There had been some debate about whether Barish would be included or not – was his too much of a managerial contribution rather than a scientific one? – but it was certainly no shock to see his name there.

What this award has led to, however, is a large degree of grumbling about how the Nobel Prizes are not fit for purpose, how they ignore the collaborative nature of science, and in particular how they should now be awarded for collaborations, such as LIGO or ATLAS or CMS.

This critique is nine parts wrong and one part right. The part it has right is that science is a collaborative endeavour – as is every other human activity. Anyone who thinks their accomplishments are due only to their innate brilliance should stop admiring their reflection in the mirror and go visit Mum and Dad.

What it has deeply wrong, though, is that some people contribute vastly more than others – inside and outside collaborations – and it is fair to recognise that. Microsoft was and is a collaborative activity – but Bill Gates was not just one of many collaborators.

LIGO could easily not exist today. The idea looked crazy at first – measure distances that are a tiny fraction of nuclear distances using a kilometre-scale baseline?? Gravitational waves could easily have remained undiscovered – and the Nobel Prize recognises that they were discovered not by a wave of a wand but because certain people built and pushed this project at a time when it looked like a fantasy, and turned it into something real. There is a world of difference between visualising and establishing a project when most people think you are nuts, and joining, five years before it attains discovery sensitivity, an established collaboration with a clear planned program of incremental technical improvements.

This also reflects one of the important social purposes of the Nobel Prize – to reward risk-taking in science. As billion dollar valuations are to techie dropouts with a garage, a computer and an idea, so Nobel Prizes are a potential reward for scientists who pursue unpopular ideas that will probably fail. Most science is safe science, and safe science is all well and good, but the best science will not happen when safe science is all that is pursued.

Scientific contributions depend on luck, talent and hard work (a bit like life). Geniuses who smell out the most exciting area and then master it by hard work always prosper. For the rest of us, it helps to be in the right area at the right time.

Whatever feeds into someone’s contributions, the result is best measured on a logarithmic scale. Compared to an average theoretical physicist, Ed Witten’s contributions are not a factor of a few larger, but orders of magnitude larger. Prizes are one of the ways such outsize contributions are recognised.

Congratulations to the three new Nobel Laureates – and as for the other members of the collaboration, they have had the privilege of a first look through a telescope every bit as revolutionary as the one Galileo pointed at Jupiter. Few scientists could wish for anything better.

Why blog?

The most direct motivation for this blog arose on June 23rd last year. Together with around half the electorate, I supported Leave in the referendum. I like democracy – governance of, by and for the people – and my prudential judgement was (and remains) that the European Union’s clear anti-democratic streak meant that it was right for Britain to leave.

I’m an academic. For me, one of the most dispiriting aspects of the last year has been the response of many – not all – academics to the Brexit vote. This response has involved a troubling degree of epistemic certainty coupled with a rather open contempt for half of the population. With this, arguments that I had thought dead for a century have been filled with new life – in particular, the old case used against the extension of the franchise first to working-class men and then to women, the argument that some folks are too ignorant to be allowed a say in high politics.

Britain has around 130 university vice-chancellors. Of these, I am not aware of a single one who supported Brexit (compared to half the population and 10-15% of academics). This illustrates one of the principal current problems with universities – a culture encouraging everyone to think the same (full disclaimer – I am a member of Heterodox Academy which aims to combat this problem).

Now, do universities not have diversity committees? Do they not have lofty mission statements about how they ensure inclusive environments? In principle, yes. Indeed, one of the most genuine pleasures of physics is the real diversity of people I have collaborated with, from so many different countries and cultures.

However, the practical meaning of ‘diversity’ in a university in 2017 tends rather to mean homogeneity around a certain kind of centre-left politics resembling the Hillary Clinton coalition. On this account, ‘diversity’ refers to people who look different, have sex differently, but think and vote the same.

This also comes with a well-defined sense of the deplorables. I give one example, from Britain’s best-known science communicator acting in his professional capacity.

Now, you can agree or disagree on the level of regulatory standards on imports, and most of us don’t have an informed opinion. But, to call someone a penis is abuse rather than disagreement, and is only socially acceptable because, somehow, it is different if you are talking about a Tory.

I care passionately about science (OK, physics, really – not bugs and beetles) and I care passionately about universities. Both, at their best, are among the civilisational crown jewels. I worry about their current trajectory.

The intellectual homogeneity of universities – and even more dangerously, the authoritarian social pressure to conform to Received Opinion – is damaging in two ways.

First, it turns excellent people away. Science is hard, and sometimes really hard. It is also universal, belonging to all cultures. It both wants and needs the best minds from *all* – and all means all – backgrounds. A scientific or university culture fails when it is oriented around how clever and enlightened People Like Us are, and how stupid and primitive Tories/Republicans/the DUP/Brexiteers/Fox News viewers/…. are. Such a culture is not intellectually sharp; it is smug and self-centred, and it will fail to attract the best and liveliest minds.

Second, as a practical matter science relies on public trust and money. It is an enormous privilege to be paid to do science for a living (anyone doubting this should recall how 99.9% of the population have lived for 99.9% of history), and this privilege is funded with other people’s taxes, distributed via our elected representatives. Science is funded, in part, because it is trusted to provide truth about the world. How long will this trust last when so many scientists are so openly contemptuous of those with different political views to their own? On political hot-button topics, how many people trust the neutrality of communities whose members are all on one political side? In tight financial settlements with many deserving claims (and as a country we still spend fifty billion quid a year more than comes in), do you help or hinder your reputation for impartiality by calling people nobs?

This is the motivation for blogging. I hope not to always be so sombre. I intend to blog about physics (it’s fun!) but also about university culture, academic life, marathon running, poetry, and whatever else interests me.

It is true that anyone in universities with their eyes open can see dark storm clouds rolling in from across the Atlantic – but it is also true that storms often precede the most brilliant bright blue sunshine.