| Time (AWST) | Presenter | Title | Domain | Institution | Abstract |
|---|---|---|---|---|---|
| 08:30-09:00 | Conference Registration | ||||
| 09:00-09:15 | Mark Stickells | Opening and Intro to Pawsey | CEO | Pawsey Supercomputing Research Centre | |
| 09:15-09:30 | Dr. Yathu Sivarajah | Pawsey Visualisation Services | Visualisation | Pawsey Supercomputing Research Centre | |
| 09:30-09:55 | Owen Kaluza | Visualising Earth and Climate Systems at ACCESS-NRI | Climate | ANU | Climate and other Earth systems models generate vast amounts of output data - using 3D graphics and GPU-accelerated rendering can assist in analysing these huge datasets and more effectively bring data to life in animations for communication and outreach. At ACCESS-NRI - Australia's Climate Simulator, our outputs include producing and releasing high-quality visualisations of some of these datasets. As part of this process we also aim to produce tools and examples to share with the climate modelling community where possible. This talk will highlight some of this visualisation work as well as the open-source workflow we've used to produce it, and the accompanying tools and examples on our GitHub repos, including ACCESS-Vis: a Python library released to assist in producing 3D visualisations and animations of Earth systems data without leaving the familiar Jupyter Notebook environment. |
| 09:55-10:15 | Kevin Liu | 3D visualisation of direct numerical simulation of bushfire analogue | Climate | Monash University | Bushfires are an ever-increasing threat to human settlements in Australia and across the world as the effects of climate change worsen. Through the simulation of a flow scenario analogous to a real-world bushfire, a physics-based machine learning model can be developed to enable real-time prediction of the dynamics of such events. Some visualisations of the simulation data will be presented as part of this talk, showing that despite the strong restrictions on what can be achieved via computer simulations, a system that is highly chaotic and non-linear can qualitatively resemble a real-world bushfire in many regards, serving as validation for the use of simulations as a surrogate for real-world data. The visualisations to be showcased are both 2D and 3D, and will also feature an interactive component. |
| 10:15-10:35 | Dr. Maja Divjak | Blender Render: Using the Flamenco Render Farm at Pawsey Supercomputing Research Centre | Visualisation | Peter MacCallum Cancer Research Centre | Biomedical animation is used at the Peter MacCallum Cancer Centre to explain complex cancer biology and treatments to both patients and scientific audiences. We are currently creating a suite of three animations explaining the role of epigenetics in cancer. The first of these is about DNA methylation, which can control how accessible DNA is to molecular switches, turning the DNA on or off. For this animation, we have moved to open-source Blender software, and a series of bespoke animation tools have been designed by Structural Biologist Brady Johnston. For our previous award-winning productions, we were privileged to access the Pawsey Nebula system, enabling fast rendering in Autodesk Maya software. For this project, Pawsey have implemented Flamenco render farm capability in Blender, which enables Blender render jobs to scale across all of Nebula, considerably reducing the time they take to complete. The one drawback to this capability is that DNA and protein simulations must be 'baked' in Blender; that is, the dynamic data is included or 'baked' into the file, so that it doesn't have to be calculated during the render process. Blender rendering at Pawsey using Nebula and the Flamenco render farm has made a significant difference to our animation production timeline. |
| 10:35-11:00 | Morning Tea / Demo | ||||
| 11:00-11:25 | Jesse Helliwell | Accelerating Blender Workflows on Nebula using Flamenco | Visualisation | Pawsey Supercomputing Research Centre | Researchers using Blender to visualise data may face very long render times on consumer hardware for complex scenes, and even a single Nebula node with multiple GPUs may not be enough. Until now, gaining more performance required a Setonix allocation and the technical skills to split a job manually using the job scheduler and custom scripts. The Pawsey Visualisation team is happy to introduce the Flamenco render farm job scheduling system on Nebula. This system allows Blender users to submit jobs directly from the Blender GUI to be split across all free Nebula nodes. This presentation will take a researcher-provided sample task, show the performance scaling that is possible, and demonstrate the steps necessary to enable this new approach. |
| 11:25-11:50 | Dirk Slawinski | Immersive Data Insights (IDI) - formerly Ocean Data Explorer | Coasts & Ocean Research | CSIRO | The ocean is a complex, ever-changing system, governed by interactions between physical, chemical, and biological processes across vast spatial and temporal scales. Yet much of our understanding remains locked in static graphs and siloed datasets. CSIRO's Dynamic Ocean Explorer addresses an early stage in the process of understanding and manipulating this information: data exploration. The tool provides a multisensory experience, giving researchers the opportunity to interactively filter and cut through dynamic volumes, selecting and removing informational dimensions to focus on anomalies and patterns worthy of further investigation. The system gives researchers the ability to create visualisations in real time to support interdisciplinary collaboration and communication with non-expert audiences. To test the approach, we have prepared a slice of time and space taken from the BRAN 2020 ocean model dataset, converting it into a purpose-built microformat tuned for real-time interaction with volumetric data, including animation over time. The demo will show gestural interaction techniques and affordances that we designed and tested with oceanographers, focusing on ease of use, controllability and precision. They include a cutting plane that can be moved and positioned with six degrees of freedom, a precise data point readout activated by extending an index finger, and tools for selecting informational dimensions and colour ramps. We plan to demonstrate our mixed reality experience, though our system also supports a web3D implementation we have built for use without headsets. We are also using Ocean Explorer as a human-computer interaction testbed to open more sensory possibilities, including haptics and sonification. |
| 11:50-12:15 | Dr. Andrew Woods | Large Scale Photogrammetric Reconstruction for Shipwreck Visualisation | Photogrammetry | Curtin University | Photogrammetric 3D reconstruction is a commonly used tool to create visually realistic digital 3D models of physical objects - however it is highly computationally intensive and can take a considerable amount of time to process. The team of Andrew Woods, Ash Doshi and Daniel Adams, based at the Curtin University HIVE, is developing a photogrammetry processing pipeline to run on the Pawsey Supercomputing Centre's systems that will significantly speed up photogrammetry processing. Our primary application is shipwreck sites - such as the wrecks of HMAS Sydney (II), HSK Kormoran and HMAS AE1 - however the code could be applied to many application areas. |
| 12:15-13:30 | Lunch / Demo | ||||
| 13:30-13:55 | Dr. Michael Roach | Data vis for metagenomes | Bioinformatics | Flinders University | Visualising large microbiome datasets presents many unique challenges. Individual microbiomes from humans, animals, or the environment can consist of hundreds or thousands of different microbial species, and many of these are poorly understood or characterised. Current human microbiome studies can consist of hundreds or even tens of thousands of samples, and the microbiomes are often highly unique to individuals leading to considerable sparsity in the datasets. Further complicating visualisation is the hierarchical nature of microbiome taxonomic and functional annotations. Most popular visualisations for microbiome studies are a necessary compromise. For instance, alpha and beta diversity metrics offer a simplistic representation of samples but usually lack the necessary depth for meaningful biological insights. Relative abundance visualisations provide a detailed representation of the samples but can be too complex and cluttered to visually interpret. While there is no one-size-fits-all for visualising these complex datasets, there are a range of options depending on your biological question. |
| 13:55-14:20 | Dr. Tim Dykes | Invited Talk - ExaDigiT: An Open Framework for Developing Digital Twins of Supercomputers | Visualisation | HPE | Digital twins are an excellent tool to model, visualise, and simulate complex systems, in order to understand and optimise their operation. Digital twins can provide insight into complex system behaviour, supporting the operator in monitoring the system and diagnosing potential faults, as well as providing a mechanism to simulate what-if scenarios and underpin the design of future systems. The ExaDigiT project has for the past two years focused on building a digital twin framework for liquid-cooled supercomputers, with input from a variety of supercomputing sites, academic institutions, and industry partners. This talk will give a brief overview of the ExaDigiT project and showcase the work-in-progress visual analytics component, which allows visualisation of supercomputing systems with various types of information overlay, including live data streams from the physical supercomputer as well as simulated components from the digital twin, based on Unreal Engine and NVIDIA Omniverse with augmented and virtual reality support. |
| 14:20-14:55 | Setonix Tour / Vis Lab Tour | ||||
| 14:55-15:20 | Afternoon Tea / Demo | ||||
| 15:20-15:45 | Ali Zamani | Workshop Opening | Visualisation | Pawsey Supercomputing Research Centre | |
| 15:45-17:00 | Workshop/Networking | ||||