research

postdoc

IceCube and P-ONE experiments, MSU.
What happens when you combine a philosopher and a computer scientist? You get a particle physicist! Understanding particle physics helps us understand the universe and its fundamental constituents. Every day, I aim to answer these questions:
  • Why is there more matter than antimatter in the universe?
  • What is the nature of dark matter?
  • Why do neutrinos oscillate in flavor?
My work complements the particle physics done at lower energies, like at the Large Hadron Collider (LHC) at CERN or at neutrino accelerator experiments at Fermilab and J-PARC. By trying to answer the same questions as these lower-energy experiments while using similar analysis techniques, we can more meaningfully combine our results and ensure that we are not missing any exciting new physics. To do this, I design, build, and test new detectors and their communications systems, and I analyze existing data at ultra-high energies. I have therefore become an expert in hardware, software, and data analysis. Let's take a peek into what I do on a daily basis.
Hardware
I run the Northern Testing System (NTS) used for the IceCube Upgrade. This facility tests Upgrade communication systems, from cables to communication software, before their deployment at the South Pole. I supervise five undergraduates who work daily in the laboratory. I also assist with and oversee the Final Acceptance Testing (FAT) of the Upgrade mDOM modules at MSU.
Technical software
I have picked up a bit of a reputation as a machine learning guru. As the trigger coordinator for the P-ONE experiment, I am the lead designer of the on-shore P-ONE high-level trigger software. I also oversee a number of trigger-chain developments for noise, lower-level physics triggers, and even bioluminescent organisms! I lead the array performance studies at MSU for this experiment, which investigate using a transformer or Graph Neural Network (GNN) architecture at higher levels of the trigger chain for the full array. In this role, I supervise a few undergraduate students who get to work with and learn about machine learning applied in real-time data-taking pipelines.

Additionally, I am the reconstruction lead for the tau flux measurement on IceCube for the low-energy double pulse analysis. Using a two-layer weighting and selection system (RNN-DynamicEdge and Transformer), I am overseeing, together with a Ph.D. student and a postdoc, the development of a new reconstruction method to identify tau neutrino interactions in the IceCube detector. This is a challenging task, as taus are rarely produced and decay mostly hadronically, making their signatures difficult to capture. Lastly, I am involved in the software reconstruction of long-lived particles decaying to leptons in the IceCube experiment, using Convolutional Neural Networks to identify gaps in muon tracks that might suggest a dark matter particle interaction.
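To give a flavor of what a trigger decision looks like at its simplest, here is a toy multiplicity (coincidence) trigger: fire whenever enough hits land inside a sliding time window. This is an illustrative sketch, not actual P-ONE code; the function name and parameters are invented for the example. Real trigger chains layer ML models on top of primitives like this.

```python
def coincidence_trigger(hit_times, window_ns=100.0, threshold=4):
    """Return the start times of windows containing >= threshold hits.

    hit_times: unsorted list of hit timestamps in nanoseconds.
    A two-pointer sweep keeps the window span at most window_ns wide.
    """
    hits = sorted(hit_times)
    triggers = []
    start = 0
    for end in range(len(hits)):
        # Shrink the window from the left until it spans <= window_ns.
        while hits[end] - hits[start] > window_ns:
            start += 1
        if end - start + 1 >= threshold:
            triggers.append(hits[start])
    return triggers

# Four hits within 100 ns fire the trigger; an isolated late hit does not.
print(coincidence_trigger([0.0, 10.0, 20.0, 30.0, 5000.0]))  # [0.0]
```

Keeping the primitive this cheap matters: it runs on every stream of hits in real time, and only windows that pass it are handed to the heavier ML-based stages.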
Analyses

The theoretical particles that I look for are called "long-lived" particles (LLPs for short). I am one of three contributing analysis members of the search for LLPs in IceCube. These LLPs are produced via neutrino interactions in the ice above the detector, leave a "track gap" while they travel, and decay back into Standard Model (SM) leptons that are visible to our detector. The signature we look for is this "track gap", hence the name "Track Gap Analysis".
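The core idea of the signature can be sketched in a few lines: bin the light deposited along a reconstructed track and look for a long run of quiet bins. This is a hypothetical toy, not analysis code (a CNN plays this role on real events), and the names and thresholds are invented for illustration.

```python
def longest_gap(profile, quiet_threshold=0.0):
    """Length of the longest consecutive run of quiet bins in a
    binned light-deposition profile along a track."""
    best = run = 0
    for charge in profile:
        run = run + 1 if charge <= quiet_threshold else 0
        best = max(best, run)
    return best

def has_track_gap(profile, min_gap_bins=3):
    """Flag a candidate LLP: light, then a dark stretch, then light again."""
    return longest_gap(profile) >= min_gap_bins

# Light - four dark bins - light again: an LLP-like gap.
print(has_track_gap([5, 4, 0, 0, 0, 0, 3, 2]))  # True
```

The hard part in practice is distinguishing a genuine gap from detector geometry effects (strings are sparse, so even ordinary muon tracks have dark regions), which is why a learned classifier replaces the fixed threshold.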

The second analysis I am designing is the Double Pulse analysis in IceCube. This searches 10+ years of data for module pulse signatures that have two distinct peaks. These double peaks arise from secondary leptonic decays of taus. Taus are rarely produced in IceCube, and they decay mostly hadronically; these two aspects make tau signatures very difficult to capture. It is therefore imperative that reconstruction tools and analyses are designed with high precision to find these hidden double pulses in our data. The analysis is ongoing and expected to be completed in August 2026.
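At its simplest, a double-pulse tag asks: does a single module's digitized waveform contain exactly two resolvable peaks? The sketch below is purely illustrative (the real analysis uses ML-based reconstruction); the function names, threshold, and separation cut are invented for the example.

```python
def count_peaks(waveform, threshold=1.0, min_separation=3):
    """Count well-separated local maxima above threshold in a
    digitized waveform (list of sample amplitudes)."""
    peaks = []
    for i in range(1, len(waveform) - 1):
        if (waveform[i] > threshold
                and waveform[i] >= waveform[i - 1]
                and waveform[i] > waveform[i + 1]):
            # Require peaks to be resolvable, not one broad bump.
            if not peaks or i - peaks[-1] >= min_separation:
                peaks.append(i)
    return len(peaks)

def is_double_pulse(waveform):
    """Tau-like signature: exactly two distinct peaks in one module."""
    return count_peaks(waveform) == 2

# First peak from the tau-producing interaction, second from the tau decay.
print(is_double_pulse([0, 2, 5, 2, 0, 0, 1, 4, 1, 0]))  # True
```

The physics difficulty is that at low energies the two vertices sit close together in time, so the second peak rides on the tail of the first, motivating the high-precision reconstruction described above.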

phd

The theoretical particles that I look for are called "long-lived" particles (LLPs for short). These are particles that decay far out in the ATLAS detector into Standard Model particles, producing a unique displaced signature of that Standard Model particle. I looked particularly at displaced tau objects in ATLAS, and my ATLAS authorship qualification work was developing model-independent displaced tau triggers (quite a tricky challenge).

developments

ATLAS trigger:
  • run 3 tau/egamma TDAQ signature expert
  • run 3 displaced single and ditau trigger
analyses:
  • run 3 search of LLPs decaying to taus (ongoing)
  • run 2 search of LLPs decaying to taus (ongoing)
awards:
  • SUPA Saltire Grant (2021)
  • Mount Holyoke College Leadership Scholarship (2015)
funding bodies:
  • ERC
  • SUPA Saltire Grant while on placement at CERN (2021)