Since their discovery in the late 1800s, electrons have been a constant source of study for scientists. Their properties and behavior have been studied and harnessed to produce some of the greatest inventions of the past century, including electron microscopes and particle accelerators. However, one fundamental question about their behavior remains: how do electrons move inside atoms and molecules? Electron motion within atoms has proved difficult to study because of the incredibly short timescale on which it occurs (the attosecond timescale, or 10⁻¹⁸ seconds). One method of capturing electron motion is to use very short laser pulses to take a series of snapshots of the system. This requires laser pulses shorter than the duration of the dynamics we want to observe (much like using a short camera flash to capture a sharp image of a fast-moving object). Such measurements have only become possible in the past decade, with the advent of ultrashort (less than 100 as) laser pulses made feasible by a process called high-harmonic generation (HHG). However, these ultrashort pulses are difficult to produce and characterize experimentally, so theoretical and computational methods are widely used in the field of attoscience. These methods are not without their limitations: modeling the correlated behavior of electrons is computationally demanding, so the calculations are typically performed on High-Performance Computing (HPC) facilities. In this seminar I will present recent results obtained with the R-Matrix with Time-dependence (RMT) method on national HPC resources, first to treat high-harmonic generation in two-color laser fields, and then to apply the attosecond pulses generated during the HHG process to the measurement of ionization delays.