“I think we’ve made a new record for us. 53 terabytes transferred in 24hrs, at 100% efficiency,” reported Sean Crosby to AARNet’s eResearch team. Crosby is a research computing scientist working at the ARC Centre of Excellence for Particle Physics at the Terascale (CoEPP) at the University of Melbourne. He is referring to a recent elephant data flow over the AARNet network between the University of Melbourne and a research network-connected site located in Germany.
Data processing for the ATLAS Experiment
This huge data flow forms part of the CoEPP’s activities as an ATLAS experiment Tier 2 site for the Worldwide Large Hadron Collider Computing Grid (WLCG). The CoEPP is one of the 170+ grid-connected computing centres in 42 countries worldwide that provide the linked-up computing and storage facilities required for analysing the ~30 petabytes (30 million gigabytes) of data CERN’s Large Hadron Collider (LHC) produces annually.
Helping scientists further our understanding of the Universe
Physicists are using the LHC to recreate the conditions of the Universe just after the ‘Big Bang’. They are searching for new discoveries in the head-on collisions of protons of extraordinarily high energy to further our understanding of energy and matter. Following the discovery of the Higgs boson in 2012, data from the ATLAS experiment allows in-depth investigation of the boson’s properties and the origin of mass.
The reported 100% efficiency of this big data transfer between Australia and Germany, clocked at nearly 5 gigabits per second sustained over 24 hours, demonstrates the reliability and scalability of the AARNet network in meeting the needs of data-intensive research on demand.
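The "nearly 5 gigabits per second" figure follows directly from the reported volume and duration. A minimal back-of-envelope sketch (a hypothetical helper, assuming decimal units where 1 TB = 10^12 bytes):

```python
def sustained_gbps(terabytes: float, hours: float) -> float:
    """Average throughput in gigabits per second for a bulk transfer."""
    bits = terabytes * 1e12 * 8   # total bits moved (decimal TB assumed)
    seconds = hours * 3600        # transfer window in seconds
    return bits / seconds / 1e9   # convert bits/s to Gb/s

# 53 TB in 24 hours works out to roughly 4.9 Gb/s sustained
print(round(sustained_gbps(53, 24), 1))
```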
Jan 30, 2018