To examine the first argument: most of the building blocks of modern modelling had indeed been around for a long time by the 1970s. Mathematical modelling, including many techniques of analysis, optimisation and simulation, had advanced strongly since the early work of operational researchers in the Second World War. The basic mathematics of fluid flow was certainly in place; for example, the work of St Venant, still fundamental to hydraulic modelling today, was carried out in the 19th century.
However, the state of computing, and the laborious ways of running computer models, was unrecognisable compared with today. Models were run in batch on mainframe computers, with even simple models needing very long run times because the performance of those machines was far below that of a modern PC. Such computers were usually stretched to the limit by their primary purpose of billing, and modellers were encouraged to keep their models, with their unpredictable run times and heavy processor loads, on timesharing bureaux. Program and data input was moving away from punched cards to teletypes, “glass teletypes” and dumb terminals, while output was almost exclusively numeric, on lineprinted continuous stationery or thermal paper rolls, except for the engineering use of large and expensive pen-driven plotters. With very few general-purpose modelling packages available, and none specifically for hydraulic modelling, models were usually written from scratch in a scientific language such as FORTRAN. There was almost no commercial use of application-independent databases or GIS for easier data input, and no spreadsheets or word processing for ease of output. Perhaps the biggest gap of all: the PC had not yet appeared as a commercial proposition.
But at Xerox’s Palo Alto Research Center (Xerox PARC), more than thirty years ago, a group of developers was working on what was then called a WIMP interface (Windows, Icons, Mice, Pull-down Menus), demonstrating it by the mid-seventies on the in-house-designed Alto personal computer with its bit-mapped screen. The Xerox breakthroughs were destined to see commercial reality first at Apple and then at Microsoft, shaping personal computing to the present day.
Since that time many other innovations have further changed computing: databases, GIS, desktop computing with a menu-driven rather than command-driven operating system, and a variety of results and reporting packages to link to. The final driving force has been the enabler of all of this, Moore’s Law. In 1965, when integrated circuit technology was only a few years old and the most complex chips held just tens of components, Gordon Moore of Fairchild Semiconductor forecast that the number of components per chip would double every year over the next ten years; in 1975 he revised the rate to a doubling roughly every two years. This amazingly prescient statement has held true not just for Moore’s ten-year timescale, but right up to today’s Pentium 4 chips with their 42 million transistors. Some even say the industry works specifically to meet Moore’s Law.
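The doubling arithmetic is easy to check for oneself. The sketch below is not from the article: it uses the Intel 4004 of 1971, with roughly 2,300 transistors, as an assumed baseline, and projects forward at a doubling every two years to the Pentium 4’s launch year.

```python
def moores_law_projection(base_count, base_year, target_year, doubling_period=2.0):
    """Project a transistor count assuming one doubling every `doubling_period` years."""
    doublings = (target_year - base_year) / doubling_period
    return base_count * 2 ** doublings

# Intel 4004 (1971): ~2,300 transistors. Projecting to 2000, the Pentium 4's
# launch year, with a two-year doubling period gives roughly 53 million --
# the same order of magnitude as the Pentium 4's actual 42 million.
projected = moores_law_projection(2300, 1971, 2000)
print(f"{projected:,.0f}")
```

That the naive projection lands within a factor of two of the real chip, across three decades, is exactly why Moore’s forecast is called prescient.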
So have the modellers done nothing in this 25-year period of spectacular progress? Not a bit of it. Model developers can be credited with two strands of significant development. The first is the sheer volume of software required to create the models, and the detailed mathematics behind them, accurate enough to exploit in full the power of today’s computers. The second is the creation of easy-to-use packages, rather than requiring all users to code their models from scratch, transforming the economics of modelling from an exotic domain of the few to an everyday tool of the engineering project.
In terms of mathematics, the progress of Wallingford Software’s Collection Systems software is an indicator of the breakthroughs in hydraulic modelling generally. For example, in 1992 SPIDA was the first implementation of the full-solution St Venant equations. This meant that for the first time engineers could represent the sewerage system on the computer as it exists in reality, with no need for risky approximations and assumptions. Complex ancillary structures, bifurcations, flow-splitting devices and full backwater effects could all be represented. Other mathematical work succeeded in improving the speed and stability of the simulation engine, ensuring that results were reliable and consistent.
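For reference, the one-dimensional St Venant equations that a full-solution engine integrates can be stated as follows. This is a standard textbook form, not taken from the article or from SPIDA’s documentation, and the notation is ours:

```latex
% 1-D St Venant (shallow water) equations in conservation form.
% A = flow area, Q = discharge, h = water depth, g = gravity,
% S_0 = bed slope, S_f = friction slope, q = lateral inflow per unit length.
\begin{align}
  \frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} &= q
    && \text{(continuity)} \\
  \frac{\partial Q}{\partial t}
    + \frac{\partial}{\partial x}\!\left(\frac{Q^{2}}{A}\right)
    + gA\,\frac{\partial h}{\partial x}
    &= gA\,\bigl(S_0 - S_f\bigr)
    && \text{(momentum)}
\end{align}
```

A “full solution” retains every term of the momentum equation; the “risky approximations” the article mentions correspond to dropping the inertial terms (the kinematic- and diffusive-wave simplifications), which cannot represent backwater effects.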
But at least as important as the development of the maths over the period has been the development in usability. For example, in 1982 Mainframe WASSP was launched as the first commercially available dynamic simulation sewer modelling package. This breakthrough moved modelling from one-off, purpose-built in-house models to package software, greatly reducing the cost and effort of building hydraulic models. A breakthrough of equal moment was the launch of HydroWorks, whose Windows interface opened up modelling to all engineers. Further developments in ease of use have linked models to various inputs, including GIS, and outputs such as word processors and spreadsheets. Much progress continues to be made in validating, and even auto-correcting, input data. The best packages also exploit sophisticated graphical outputs, improving the understanding of modelling results to a remarkable degree. The size of models that packages can accommodate continues to grow – Wallingford packages have moved from a 300-manhole limit to 100,000 manholes over twenty years – with considerable benefits in modelling accuracy.
So, with no sign of a slowdown in Moore’s curve (forecast to continue through this decade) and a high level of accuracy now achieved in core models, where does modelling go next? In application, modelling will extend far more into operations, including real-time operational control. At the same time, the best modelling packages will provide ever greater ease of use, to reduce build times, improve the cost/benefit balance of modelling, and continue the march of the model towards being as much a part of every engineer’s toolkit as was once that forerunner of the PC, the slide rule.