A highlight of the recent European Physical Society conference in Vienna was one of the first results to use data from Run 2. The conference note, from the ATLAS collaboration, provided the first detailed analysis of the number of particles produced in collisions from the LHC's higher-intensity, higher-energy proton beams.
For any research group, having your results singled out as a conference highlight is a moment of huge satisfaction and pride. But what does it take to get from raw experimental data to conference highlight, especially if you only get your data six weeks before the conference?
It’s all about planning.
Jan Kretzschmar (Liverpool and ATLAS) is Co-convenor of the ATLAS Standard Model working group. Almost a year ago, whilst the LHC was still in the middle of the Long Shutdown, he helped to assemble a team with the specific goal of analysing this first data set from Run 2. A special 'task force' was formed, involving the convenors of several other ATLAS groups, with two dedicated coordinators appointed.
“Right from the start, we were aiming for EPS,” explains Jan. “Previous experiments meant that we understood how multiplicity (the number of particles produced in a single collision) increases with energy, but only in the lower energy ranges. With the higher energy of the LHC, we didn’t have any theories to predict what happens.”
Although there is a definite logic to the idea that the higher the energy, the more debris is produced in a collision (think of two cars colliding, and the fragments of metal, glass and plastic that fly out in all directions as a result of the energy of the impact), establishing the relationship experimentally and precisely was another matter. Some phenomenological models make predictions about multiplicity, but no-one knew how accurate they would be.
The team comprised 40 people with a mix of experience and enthusiasm - approximately half the team were more senior physicists bringing relevant previous experience and knowledge, and the rest were PhD students and younger postdocs new to this sort of analysis and ready to learn. As everyone expected the analysis to deliver a high-profile, early result in Run 2, many people were happy to join the effort.
The team started with Monte Carlo simulations of what they thought the Run 2 data might look like, testing their analysis tools.
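To give a flavour of this preparation step, here is a minimal sketch of a toy Monte Carlo for charged-particle multiplicity. The negative binomial distribution is a common phenomenological shape for minimum-bias multiplicity distributions, but the parameters below are invented for illustration and are not those used by the ATLAS team:

```python
import math
import random

def nbd_sample(k, mean, rng):
    """Draw a toy charged-particle multiplicity from a negative binomial
    distribution (a common phenomenological shape for minimum-bias events).
    The parameters here are illustrative, not fitted values."""
    p = k / (k + mean)  # per-trial "success" probability
    # A negative binomial with integer k is a sum of k geometric draws
    # (number of failures before the first success, via inverse transform).
    return sum(
        int(math.log(1.0 - rng.random()) / math.log(1.0 - p))
        for _ in range(k)
    )

rng = random.Random(42)
pseudo_data = [nbd_sample(2, 6.0, rng) for _ in range(20_000)]
print(sum(pseudo_data) / len(pseudo_data))  # close to the chosen mean of 6
```

Generating pseudo-data like this before the real collisions arrive lets an analysis team exercise every step of their pipeline, so that only the inputs change when the detector data lands.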
“We knew what techniques we wanted to use, and we spent more than six months refining them,” says Jan.
With a fixed deadline to present the results at EPS in July, every piece of data was going to count. The slight delay to the restart of the LHC, and the knock-on delay to the first 13 TeV collisions, undoubtedly caused some anxious moments. But the team received their raw data, and with just six weeks to go before the conference, the pace of the project picked up, with data analysis taking place almost around the clock.
The raw ATLAS data relevant to this analysis is a series of dots marking points on the trajectory of electrically charged particles produced in the collisions at the centre of the detector. The first step in the analysis process is reconstruction - a sophisticated 'join the dots' exercise to establish the tracks of each particle. This identifies relevant events which are then analysed further and finally compared against the models.
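As a cartoon of that 'join the dots' step, the sketch below groups detector hits into straight-track candidates. The hit format, tolerance and three-layer requirement are all invented for illustration; real ATLAS reconstruction uses far more sophisticated pattern recognition and track fitting:

```python
from collections import defaultdict

def reconstruct_tracks(hits, tolerance=0.01):
    """Group detector hits into track candidates by angle.
    `hits` is a list of (radius, phi) pairs; hits from the same straight
    track from the collision point share (roughly) the same angle phi.
    This is a toy illustration, not the real ATLAS algorithm."""
    candidates = defaultdict(list)
    for r, phi in hits:
        key = round(phi / tolerance)  # bucket hits with similar angles
        candidates[key].append((r, phi))
    # Keep only candidates with hits in at least three detector layers
    return [h for h in candidates.values() if len(h) >= 3]

hits = ([(r, 0.50) for r in (1, 2, 3, 4)]    # four hits along one track
        + [(r, 1.20) for r in (1, 2, 3)]     # three hits along another
        + [(2, 2.70)])                       # an isolated noise hit
print(len(reconstruct_tracks(hits)))  # 2 track candidates; noise rejected
```

Counting the surviving candidates per event, across millions of events, is essentially how a multiplicity measurement is built up from those dots.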
The months of preparation paid off, but nevertheless, analysis of the data was an iterative process. The team met officially twice a week but there were lots of phone calls and interactions in between meetings; the analyses were reviewed and refined many times.
As with any ATLAS analysis, the process was followed by members of the Collaboration's Editorial Board, both to keep an eye on the analysis and to make sure that the subsequent research paper accurately described the process that the team had followed and clearly conveyed the physics result.
Emily Nurse (UCL and ATLAS) wrote up the conference note. “As the analysis followed a similar process to one that we used for a result with a lower centre of mass energy, I was able to write up the skeleton of the paper before we had any data – covering the background to the analysis and the techniques we used, but obviously not the results or the discussion.”
Having sent the note to the Editorial Board for comment before the data arrived, Emily was ready to manage the results when they started to arrive.
“I work 2½ days a week so I have to be really well-organised,” she explains, “and I relied on my colleagues to meet their deadlines so that I could meet mine. It worked pretty well.”
The graph compares the observed results (black triangles) with the Monte Carlo simulations (coloured lines)
For any team planning to present results, there’s always pressure in the run-up to a conference. “It’s very challenging but a lot of fun,” says Emily. “It’s great to be part of a team that pulls together, but the time required to do the analysis sometimes makes it feel like you do nothing but work! It’s been a while since I’ve worked into the evening to get a paper finished but getting the results written up is important and I was happy to make my contribution to the team.”
Every draft paper or conference note from ATLAS follows the same process before publication; it is circulated to all 3000 members of the collaboration for comment and then there is an open meeting where any collaboration members can attend in person or join by video conference to clarify anything in the paper. All of this needed to be fitted in before the charged particle multiplicity note could be presented at EPS.
So what did the data show? Well, the graph compares the observed results (the black triangles) with the Monte Carlo simulations (the coloured lines). It shows the average number of charged particle tracks in the central region of the ATLAS detector, which clearly increases as a function of the collision energy. While the general trend is predicted by most models, some simulations do better than others.
“The results reflect the time and effort taken by the theorists to tune the simulation models,” says Jan. “The next step is to feed our results back into the models to make them even more accurate to help future analyses.”
You can read the full conference note here and a full paper will follow in a month or so.
Whilst many physicists are focused on ever bigger and more powerful accelerators, a team from CERN is focusing its attention on an accelerator that is just 2m long.
Rob Edgecock (STFC) leads an EU-funded network looking at applications for accelerator technology. It includes the CERN project which aims to produce a miniature radiofrequency quadrupole (RFQ), a component found at the start of all high energy proton accelerators.
“One of the goals of the network is to look at how we can use accelerator technology to produce radioisotopes for cancer therapy,” explains Rob.
The new mini RFQ has been designed to be the perfect injector for the next generation of high frequency, compact linear accelerators used for delivering proton therapy to cancer patients. Using experience gained in designing and building Linac4, the new linac that will become operational in the CERN accelerator complex in 2020, the CERN team’s challenge for the mini linac project was to double the operating frequency of the RFQ and shorten its length.
In fact, the CERN team believes that their mini RFQ has applications beyond hadron therapy; it could be used within hospitals to produce radioisotopes for medical imaging, eliminating the current problem of transporting radioactive materials from the production site to the patient, and for advanced radiotherapy.
Miniaturising the technology presented some interesting technical challenges but the construction of the first two of the four 50cm long modules has been successfully completed at CERN and the team hopes to test them within the next few months. What they have learnt will feed back into CERN’s core accelerator development programme.
But of course, there are still many steps before the new mini RFQ is available to help patients.
“Building the RFQ modules is a good first step,” says Rob, “but clearly a complete working accelerator is required to determine what the actual performance will be and compare it with the cost, size and performance of the existing commercially available radioisotope production machines.”
And then it’s a question of finding a commercial partner to manufacture and sell the machine.
“If everything goes to plan, it should be possible to have them working for radioisotope production in a hospital within 4-5 years,” says Rob.
Two UK artists collaborating under the name Semiconductor are this year’s recipients of the Collide@CERN Ars Electronica Award.
In their art works, Ruth Jarman and Joe Gerhardt explore the material nature of our world, and how we experience it, through the lens of science and technology, questioning how science and technology mediate our experiences.
During their two-month residency at CERN, they plan to create a digital artwork elaborating on the nature of the world and our perception of it, including consideration of how scientific instruments and particle physics discoveries influence our perception of nature.
CERN’s art programme gives world-class artists the time and space to reflect, research and renew their artistic practice and career by encountering the multi-dimensional world of particle physics in carefully curated encounters with CERN scientists.
Find out what it’s like working with one of CERN’s artists in residence in UKNFC 47.
Google map showing CERN
Virtual visitors can now explore many CERN sites directly from Google Maps via Google Street View.
Above ground you can click around the Meyrin campus and get a feel for daily life as well as visiting CERN’s first experiment, the Synchrocyclotron.
But the real excitement takes place below ground and you can visit the LHC tunnel and all four LHC experiments through a dedicated CERN part of Google Street View.