Global award for Cambridge engineer’s research on accelerating visualisation and learning from Big Data

Postgraduate research by Cambridge engineer Robert Sales has won the Best Student Paper Award from the world’s largest aerospace technical society.

Engineer Robert Sales has won the Best Student Paper Award from the American Institute of Aeronautics and Astronautics (AIAA), the world’s largest aerospace technical society.

His new neural-network approach compresses and visualises the massive amounts of data generated by computer simulations, significantly reducing the storage needed compared with traditional methods.

Robert travelled to the United States to present his research and demonstrate how it could enable interactive exploration of large-scale simulations.

He explained: “This work builds on recent advances in neural compression, allowing researchers to view and analyse simulation data with far fewer constraints than conventional methods.”

The award-winning paper, “Compression and Ray-March Rendering using Implicit Neural Representations of Data Values and Domain Extent”, presents a framework for compressing and rendering large, complex datasets such as those used in aerospace, climate, and medical simulations.

Robert Sales at the Whittle Laboratory. Credit: Suzanne Donovan

High-resolution simulations can generate gigabytes of data in minutes, which is often impossible to store in full. Traditional compression methods can degrade the accuracy of visualisations, especially for unstructured data. Robert’s approach preserves fidelity while dramatically reducing storage, producing clearer, more accurate visualisations.

Robert, based at the Department of Engineering’s Whittle Laboratory, explained: “Using neural representations, we can compress unstructured simulation data by up to 200 times without losing the quality needed for visualisation. Unlike traditional compression, which can destroy important features, our method focuses on the areas of interest and produces clean images from the compressed data.”

His approach uses a dual-network system that encodes both the data values and the spatial domain in a single, fully implicit representation. 
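
In outline, an implicit neural representation stores a field as the weights of a small coordinate network that can be queried at arbitrary points, rather than as values on a stored mesh. The sketch below is an illustrative stand-in for the dual-network idea, not the architecture from the award-winning paper: the make_mlp helper, the network sizes and the toy training data are all assumptions made for the example, and a real pipeline would fit the two networks to a simulation's own node values and domain geometry.

```python
# Minimal sketch (not the published architecture) of the dual-network idea:
# one coordinate MLP predicts field values, a second predicts whether a point
# lies inside the simulation domain, so the data and its extent are both
# stored implicitly as network weights.
import torch
import torch.nn as nn


def make_mlp(in_dim, out_dim, width=64, depth=4):
    """Small fully connected network mapping coordinates to outputs."""
    layers = [nn.Linear(in_dim, width), nn.ReLU()]
    for _ in range(depth - 1):
        layers += [nn.Linear(width, width), nn.ReLU()]
    layers.append(nn.Linear(width, out_dim))
    return nn.Sequential(*layers)


# Hypothetical networks: field_net stores the data values, domain_net the extent.
field_net = make_mlp(in_dim=3, out_dim=1)    # (x, y, z) -> scalar field value
domain_net = make_mlp(in_dim=3, out_dim=1)   # (x, y, z) -> inside/outside logit

# Toy training data; a real case would use the simulation's unstructured nodes.
coords = torch.rand(1024, 3)                          # sample points in [0, 1]^3
values = torch.sin(coords.sum(dim=1, keepdim=True))   # placeholder field values
inside = torch.ones(1024, 1)                          # placeholder domain labels

opt = torch.optim.Adam(
    list(field_net.parameters()) + list(domain_net.parameters()), lr=1e-3
)
for step in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(field_net(coords), values) \
         + nn.functional.binary_cross_entropy_with_logits(domain_net(coords), inside)
    loss.backward()
    opt.step()
```

Once trained, only the network weights need to be kept: a ray-marching renderer can sample any point in space for its value, using the domain network to decide which samples lie inside the geometry.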

Robert added: “I’m always looking for ways to apply machine learning and AI to accelerate research and development, particularly in the net-zero aviation and power sectors.”

Robert is a PhD student at the EPSRC Centre for Doctoral Training in Future Propulsion and Power at the University of Cambridge and a member of Wolfson College. He previously earned an MEng in Aeronautical Engineering at Durham University and a Master of Research at Cambridge.

He thanked his supervisor Graham Pullan, Professor of Computational Aerothermal Design, for his continued academic support.

Professor Pullan said: “It’s fantastic that Robert has received this award. His approach allows engineers and scientists to compress and visualise simulation results that we would otherwise have lost due to lack of storage space.”

The paper won the 2026 AIAA SciTech MVCE Best Student Paper Award, which was presented at the SciTech Forum in Orlando.
