AARNet’s eResearch Advisor, Alex Reid, reports:
This annual event (held this year in Brisbane 19-22 October) brings together those who support and develop e-Infrastructure for researchers across Australia. eResearch simply means doing research using ICT tools and services, and is now the predominant mode for undertaking research.
Here are some of the highlights of this year’s conference:
National e-Infrastructure Review Findings
Philip Clark, who chaired the Panel appointed by the Education Department to conduct a Research Infrastructure Review to “advise the Australian Government on future research infrastructure provision and requirements to support research”, presented an overview of the principal findings and recommendations. The Government is currently reviewing the report.
Among the findings were that the investment to date has produced a good foundation, but it is not comprehensive, sufficient or cohesive. He enunciated 11 key principles to be adopted in any future Research Infrastructure programme, including that infrastructure requires both physical and human elements, that it should move from ad hoc to sustained funding, and that it should be collaborative (as proven by NCRIS). The Report also recommends setting up an Australian National Infrastructure Fund.
This was all wonderful music to the ears of the research infrastructure fraternity, and indeed was looked on with envy by some of the international delegates. Of course, we do not yet have the government’s reaction to this Report.
Data Movement Workshop
AARNet’s Data Program Manager Brett Rosolen convened this collection of papers on how people in different disciplines have successfully moved big data. Entitled “Data movement journey by community champions who have ‘been there, done that’”, it included presentations by Chris Phillips, CSIRO, on radioastronomy data; Jon Smillie, NCI, on climate change data; Stephen Dart, VicNode, on using Aspera to ingest 225TB of data via a variety of strategies; David Wilde on use of the Science DMZ; and Markus Buchhorn (deputising for Richard Northam) on the RDS Moving Big Data Flagship project. It closed with a panel discussion. A very useful stream, with lots of advice.
Professor Sarah Kenderdine, UNSW – Deep Mapping to Data Sculpting: Inside Omnispatial Archives.
This was a tour de force showcasing some of the exciting visualisations Sarah’s team has been working on, especially of culturally important sites in China and India. She works closely with a number of museums worldwide, and a visit to a current exhibition at Museum Victoria is strongly recommended.
Professor Peter Hunter, Bioengineering Institute, NZ – Computational Physiology and the VPH/Physiome Project.
Peter described the work his team is undertaking to build up comprehensive models of the physiology of all 12 human organ systems, involving some 600 different models. He explained some of the complexity of this process, and the need for standards so work can be shared worldwide. VPH stands for Virtual Physiological Human. One example of related work is BabyX – “a machine that can laugh and cry, learn and dream, and can express its inner responses by facial expressions”.
Professor Anne Trefethen, Oxford University CIO, and Dr Tony Hey, shortly to become Chief Data Scientist, Science and Technology Facilities Council, UK – The Data Deluge and the Fourth Paradigm: eScience Ten Years On
Given as two separate talks, these described what has been done in the UK and USA regarding e-Infrastructure over the past 10+ years. They had plenty of examples of projects that are advancing the art in this space, e.g. the 1 million digital objects in the Bodleian Library’s cultural heritage centre – but the challenge is how to manage this new knowledge. On the other hand (especially in the UK), little of a cohesive nature has happened over the past 10 years, following the end in 2005 of the eScience Project that Tony had headed up. Tony pointed out that the UK mainstreamed eScience too early and as a result lost focus; he concluded by asserting that Data Scientist will be the hottest job of the 21st century, combining the roles of data engineer, data analyst, and data steward.
Science Gateways Workshop
Science Gateways is the term used (especially in the USA, and now increasingly in Australia) for what we have previously called “portals” – software systems that provide integrated access to data, analysis tools, and modelling.
There is considerable interest in pooling approaches to building Science Gateways globally, and the session on this was opened by Nancy Wilkins-Diehr, Associate Director of the San Diego Supercomputer Center at University of California, San Diego (by video conference). She described a number of successful Science Gateways in the US, as well as details of a bid being made to the US National Science Foundation (NSF) for funding to establish a Science Gateway Community Institute.
Subsequent speakers in the Workshop described some notable Australian Science Gateways, or Virtual Laboratories (VLs), some built through NeCTAR projects.