It was now time for the TERN facilities to begin delivering products for researchers, governments and the community as the network moved from an establishment phase to an operational footing, TERN Director Professor Tim Clancy said as he introduced the work of the three-day annual TERN symposium.
He commended the facilities for the staggering amount of work they had put into building the network and its infrastructure. Nevertheless, TERN had ‘only two years to get runs on the board’ to demonstrate its worth to society, thereby producing a compelling case for longer-term investment. The national strategic road map for research infrastructure outlined an ambitious – but necessary – set of infrastructure capabilities that were intended to provide a foundation for research into the structure and function of terrestrial ecosystems over the next decade. Tim said he wanted to see this vision delivered through:
- an increase in the number of site networks, supersites and long-term ecological research sites;
- better observing systems for monitoring soil condition;
- standardised measurement and analysis;
- better monitoring of freshwater systems and biodiversity;
- better sensor technologies and networks;
- models that could predict likely environmental futures far more accurately than was possible with current technology and networks;
- better understanding of the action and function of microbes through the use of metagenomics (analysing genetic material from microbes directly from environmental samples);
- data being openly shared and appropriately managed.
If TERN could build this capacity, the scientists in the network would be able to confidently confirm trends and show emerging changes in Australia’s natural environments. TERN would be a source of authoritative advice on the best ways to grapple with the rapid changes in ecosystems in the face of a growing population and continuing, perhaps escalating, climate change. These capabilities would also underpin TERN’s contribution to national policies such as state-of-environment reporting, the development of the National Plan for Environmental Information and the putative set of national environmental accounts (a means of integrating information on environmental trends with economic frameworks to calculate sustainability outcomes), as well as similar state-level programs.
He said the network could not focus on natural ecosystems alone but needed to deal with managed ecosystems as well. The high-level science outcomes being delivered by facilities could make a significant contribution to policy and management, so that decision makers, and the Australian community, would recognise the value of the capabilities being developed.
‘There is a need for high-quality ecosystem science – to underpin the conservation of our unique biodiversity, and the sustenance of our natural ecosystems, including the restoration of degraded systems, to secure our agricultural production, and to help us adapt to changing climates and mitigate their effect on us,’ Tim said.
‘These issues are central to people’s quality of life, to our livelihoods, and indeed to life itself, so while the ambitions of TERN are lofty, our contribution shouldn’t be undervalued.’
People attending the symposium get down to two days of working and networking (Photo courtesy of Angela Gackle)
Several themes emerged over the course of the symposium, as facilities discussed progress to date, problems overcome and still to be faced, and visions for the future and possible paths there.
One of the most persistent was the problem of sustaining long-term research, and enduring access to research data, in the face of short-term funding and political cycles. Facilities returned to this conundrum, discussing the difficulty of storing data in such a way that it could continue to be useful and accessible in the event of losing funding in mid-2014, and in the face of ongoing developments in computing and other technology. Most facilities talked about their efforts to build durability into their data collection, data repositories, data accessibility and hosting arrangements.
Another challenge that the facilities had begun to tackle was how to structure and store their data, or the data of others, so that it was seamlessly available to data users not just during this funded period, but long into the future. This was not just a question of technology, but also an artefact of the different data-collection traditions of disciplines and institutions, and an entrenched tradition in the sciences of protecting one’s own data.
A third challenge was how to enable users to generalise from the specific data TERN facilities were collecting and making available. This was made more complex by TERN’s charter to make data freely available across disciplines, and required careful planning and implementation on various fronts to ensure different types of data could be mixed and matched according to the user’s needs. Included in this was the development of sophisticated modelling capacity that would permit multi-scale and cross-discipline integration.
A related challenge was to facilitate greater use of the data resource that TERN was building. The data and information cost many millions of dollars to collect, and for their true worth to be realised the data repositories needed to attract other data collections as well as people to use them. The TERN data licensing framework, which would be rolled out in the next few months, would provide the basis for data to be shared freely across institutions, agencies and disciplines.
‘Partly this is an outreach activity, helping others to understand why what we’re doing is important, how it is useful to them,’ Tim said.
‘And the data has to be used. This means data has to be comprehensive and comprehensible, interdisciplinary, easy to discover, with frictionless access and sharing. It has to give us a robust culture of inter-disciplinary synthesis. If it can do that, and I think we can, it will give us leverage on further investment.’
Published in TERN e-Newsletter April 2012