Public Service Review: UK Science and Technology - Issue 4
PROFILE: Computational science and the future
15 December 2011
The rapid advance in computer power over the last 20-30 years, together with the wide accessibility of computers, has led to an equally rapid growth in the use of computational methods across science and engineering. Computational modelling is now as important as physical experiment in understanding natural phenomena or designing a new manufacturing process. With theoretical analysis limited by the need for simplified solutions, and experiments restricted by cost and often by physical risk, computer models have become essential for solving problems in the virtual world of the computer. Just like a computer game, this virtual world can simulate 'what if' scenarios, or test responses to input parameters for optimisation. Sophisticated computer graphics can then reveal the inner workings of a problem in minute, colourful detail.
A research centre for computational science and engineering
Since the early 1960s, the UK has pioneered the use of computational techniques to address real-world problems. The finite element (FE) methods used for structural analysis and the computational fluid dynamics (CFD) techniques applied to problems of heat and mass transfer both originated in the UK. The Centre for Numerical Modelling and Process Analysis (CNMPA) at the University of Greenwich continues this proud tradition. The CNMPA was founded in 1986, in what was then Thames Polytechnic. Its mission is to assist local industry, promote a safer environment and improve living and working conditions. Although the CNMPA also engages in experimental work, its focus is the development of software and computational models that have direct use in industry, the environment and society in general.
The centre is multidisciplinary, with more than 60 members drawn from engineering, mathematics, physics and computer science. Its brief is wide, ranging from esoteric materials research into what happens to crystalline alloys as they solidify in microgravity, to studies linked to the restoration of the Cutty Sark tea clipper.
Work in the field of fire modelling and the evacuation of people from public buildings, hospitals, ships or aircraft links a physical science subject (the spread of smoke, heat or toxic gases) to social science through crowd behaviour. The value of the centre's work in advancing computational methods for the benefit of society has been recognised in a series of awards and accolades. These include the Queen's Anniversary Prize in 2002 for its life-saving evacuation software EXODUS; the top prize at the London Knowledge Transfer Awards 2008, for its use of computer models to determine how to dismantle and reassemble the Cutty Sark during restoration; and the Times Higher Education Award 2009 for 'Outstanding Engineering Research Team of the Year'. The CNMPA has worldwide links and enjoys an international reputation, as shown by the latest Research Assessment Exercise, in which the centre's work was classified as 40% 'world leading' and 30% 'internationally excellent'.
Recent successes: the link with space research
The strength of the centre lies in its links to industry. In a pan-European materials project called IMPRESS, the CNMPA provided the modelling component in a partnership of 40 research and industrial organisations developing a new range of material products based on 'intermetallic' alloys.1 These are metallic alloys that in some respects behave like ceramics. Applications include aero-engine turbine blades that can operate at very high temperatures, yet weigh only half as much as nickel superalloys, and nickel-aluminium metal powders that can replace platinum as a catalyst in vehicle exhausts, in the chemical industry, or in hydrogen fuel cells. The project was coordinated by the European Space Agency (ESA), not only because weight saving is very important in anything sent into orbit, but also because much of the fundamental research into the properties of these materials was carried out in weightless conditions.
Staying with materials science, the CNMPA is one of very few UK participants in the ESA ELIPS materials research programme, which investigates the effects of microgravity on the behaviour of solidifying alloy melts. The absence of gravity opens possibilities for the development of completely new materials with unusual properties, and computational modelling is an indispensable tool in this development. Microgravity experiments are planned for the International Space Station (ISS), but also on sounding rockets and on parabolic flights in a specially commissioned Airbus A300 (known as the Zero-G). Computational models direct experiments, predict the likely behaviour of solidifying alloys, including their crystalline structure, and explain phenomena that are otherwise hidden by the overwhelming influence of gravity.
In the absence of gravity, samples can be levitated in a weak magnetic or electrostatic field. Then, because there is no contact with a restraining container to initiate solidification, they can be cooled to well below their normal solidification temperature yet still remain liquid. When solidification is finally triggered it is very rapid; alloy atoms have no time to align themselves into a regular crystal lattice, and an amorphous, glass-like structure results. A new class of metals is created: bulk metallic glasses (BMGs). These can be much stronger than normal alloys and have higher elasticity. The BMG field is very much in its infancy, but applications already abound, ranging from golf clubs to bio-implants that closely mimic the mechanical properties of bone. Space research will expand the range and volume of these attractive materials.
Another important class of ISS experiments concerns the measurement of the physical properties of highly reactive alloys of titanium, zirconium and other metals in the liquid state, using electromagnetic levitation. This can be carried out easily in space, but it would be desirable, and far less expensive, to perform similar experiments in terrestrial laboratories. The levitating field must then be stronger, to support the weight of the sample.
Levitation can be achieved with a combination of alternating and static electromagnetic fields; several kilograms of material can be levitated this way. One unexpected effect of the external field, predicted using computer models, is to distort the microstructure of a solidifying alloy. A rather obscure thermoelectric effect (eg the electrical sensation felt on accidentally touching a tooth filling with a fork) is responsible for this transformation. Since microstructure is linked to mechanical and electrical properties, the application of external magnetic fields may in future be used to tailor certain materials for specific characteristics. An example of this transformation is shown in the figure, where an 'equiaxed' crystal is transformed by a strong static magnetic field aligned with the Z axis.
Computational methods and the future
Computational methods are now universally used. Capability has increased dramatically, as has the number of users and developers. Accessibility is also now near universal: computations that needed a supercomputer only a decade ago can now be carried out on a humble PC. However, calculations such as the alloy solidification example shown above are also becoming more demanding as users ask for ever more realism. Computer runtimes for the most complex problems are still measured in weeks, or even months, on massively parallel computers. Advances in computer architecture, such as the recent use of graphics processing units (GPUs) for scientific computations, promise even faster and larger calculations in the future.
But does the veracity of the simulations also increase at the same rate? The answer cannot be an unqualified yes: however powerful the computer, and however careful the modeller, what is being addressed is a model of reality, not reality itself.
A model contains assumptions and approximations that need to be validated against physical experiments; otherwise, one can easily be misled by the colourful graphics a simulation produces.