
Evaluating Modeling Software - The Hidden Features that Determine the True Cost/Benefit


Courtesy of Innovyze

There is general agreement that the cost/benefit balance of hydraulic modeling is now heavily in its favor.  The massive advances in both speed of build and ease of use over recent years make modeling the best approach to a far greater range of issues than was the case even a few years ago.  However, as many users discover when they try to select modeling software, comparative evaluations of software packages are difficult.

There are several reasons for this, which are perhaps inevitable given the breadth and depth of function within modern modeling software, including:

  • there are very few people with the equally detailed knowledge and experience of several products that is necessary for true comparison
  • it is almost impossible to devise an evaluation harness, informal or formal, that does not contain a built-in bias towards one product or another
  • geography also plays a part in evaluation, with some products looking good for a specific country or region but not being as applicable elsewhere
  • the usual binary tick or cross approach to a feature list for software selection is an oversimplification of how well each product meets each criterion – scales of 1 to 5 are more appropriate
  • the weightings that should be applied to the values in a tick list vary from customer to customer, and even from project to project for a single customer according to their specific needs, and identifying these weightings requires extensive experience of the use of modeling
  • product comparison is a lengthy and therefore expensive process, so any published results are likely to have a commercial agenda behind them rather than being a truly independent approach
  • one of the most important attributes of any software product is its usability, and this is the most subjective and therefore the hardest to quantify.

The detailed technical functionality of different packages has been extensively covered in various publications.  Taken together, these provide a fair view of the engineering functionality of products - whether or not they use the St Venant equations, and if so, how they solve them, whether the modeling is dynamic, uses variable timesteps, can represent Real Time Controls, and similar important functionality.  Clearly any modeling package must meet a certain level of calculation accuracy to produce valid results on which capital and operational decisions can be soundly based.  In addition, depending on the complexity of the system it is required to model, it must have a certain breadth of features. But producing accurate results through a fast and stable simulation engine is only part of the story for hydraulic modeling packages, a necessary but not sufficient condition.

A number of other features that lie outside this core technical/mathematical functionality are equally important, but are frequently not to the fore in the evaluation process.  Hard to measure in many cases, they dictate the usability, productivity, and ultimately the cost/benefit balance of modeling.  In the building and continuous development of InfoWorks, there is extensive focus on developing these areas and, because they are often missing from the literature, it is important that they are highlighted in the software selection process.

Productivity features

The productivity of the modeler or modeling team is the key determinant of the cost of modeling: good modeling software contains a range of productivity aids that can dramatically lower the effort of building, calibrating, running and maintaining a model.

Productivity features can be categorized under two headings – technical features, and interface features.

Technical Features

The best modeling software undertakes a number of the simple but time-consuming tasks required during the model build process that other software products leave to the user - data checking and cleaning, for example.  This activity is sometimes thought to require human intervention, but in fact once the rules have been established the checking involves merely examining each number or attribute against a checklist and flagging anomalies – a task that software can undertake far better, and at a much lower cost, than people.  Equally, if required, the software can apply inference rules where errors are identified, based on interpolation between valid data points, and autocorrect the data.  Both these processes must take place under full user control, but the drudgery and elapsed time of data checking can be greatly reduced.
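
As a minimal sketch of the principle – illustrative only, not taken from any particular product – the following Python fragment flags values that are missing or fall outside a rule-defined range and infers replacements by interpolating between the nearest valid points.  The attribute, limits and values are invented for the example.

    # Illustrative only: flag out-of-range or missing values, then repair the
    # flagged entries by linear interpolation between the nearest valid points.

    def flag_anomalies(values, low, high):
        """Return indices of values that are missing or outside [low, high]."""
        return [i for i, v in enumerate(values) if v is None or not (low <= v <= high)]

    def infer_by_interpolation(values, flagged):
        """Replace flagged entries by interpolating between neighbouring valid points."""
        repaired = list(values)
        valid = [i for i in range(len(values)) if i not in flagged]
        for i in flagged:
            before = max((j for j in valid if j < i), default=None)
            after = min((j for j in valid if j > i), default=None)
            if before is not None and after is not None:
                t = (i - before) / (after - before)
                repaired[i] = values[before] + t * (values[after] - values[before])
        return repaired

    ground_levels = [52.1, 52.0, None, 51.7, 151.6, 51.5]     # metres; two suspect entries
    suspect = flag_anomalies(ground_levels, low=0.0, high=120.0)
    print(suspect)                                            # -> [2, 4]
    print(infer_by_interpolation(ground_levels, suspect))     # repaired values: 51.85 and 51.6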

The same principle applies to connectivity data.  Because modeling data is often imported from GIS, and GIS is not particularly concerned with connectivity, connectivity errors are common in the first pass of the model build.  Checking this automatically is possible in a network modeling system that has the concept of connectivity built-in, providing another major productivity aid in a vital validation area.
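
A minimal sketch of such a check – again illustrative rather than drawn from any product – might traverse the pipe graph from the outfalls and flag any nodes that cannot be reached; the node identifiers below are hypothetical.

    # Illustrative only: report any nodes that cannot be reached from an outfall,
    # a common first-pass connectivity check on a network imported from GIS.
    from collections import defaultdict, deque

    def unconnected_nodes(pipes, outfalls):
        """pipes: iterable of (upstream_node, downstream_node); outfalls: set of node ids."""
        neighbours = defaultdict(set)
        nodes = set()
        for up, down in pipes:
            neighbours[up].add(down)
            neighbours[down].add(up)      # undirected traversal is enough for reachability
            nodes.update((up, down))
        reached, queue = set(outfalls), deque(outfalls)
        while queue:
            n = queue.popleft()
            for m in neighbours[n]:
                if m not in reached:
                    reached.add(m)
                    queue.append(m)
        return nodes - reached

    pipes = [("MH1", "MH2"), ("MH2", "OUT1"), ("MH7", "MH8")]   # MH7-MH8 is an orphaned branch
    print(unconnected_nodes(pipes, outfalls={"OUT1"}))          # -> {'MH7', 'MH8'}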

Assigning roughness coefficients can also be automated.  Although some roughness factors are selected to directly reflect a specific and unusual pipe condition, most factors relate solely to properties of the pipe, such as age and material.  The software can allocate the majority of roughness factors automatically from its knowledge of the pipe attributes, and the few unusual instances can be specifically entered.  Calibration of a model is typically a major component of the build effort, and automating parts of it can contribute significantly to reducing build cost.
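
The rule-based allocation could be as simple as a lookup keyed on material and age band, as in the illustrative sketch below; the coefficient values shown are placeholders, not recommended design figures.

    # Illustrative only: assign a default roughness from pipe material and age,
    # leaving unusual cases for manual entry. The coefficients are placeholders.

    ROUGHNESS_RULES = {                   # material -> list of (max_age_years, Manning's n)
        "concrete": [(20, 0.012), (50, 0.014), (999, 0.016)],
        "pvc":      [(999, 0.010)],
        "clay":     [(30, 0.013), (999, 0.015)],
    }

    def default_roughness(material, age_years):
        for max_age, n in ROUGHNESS_RULES.get(material.lower(), []):
            if age_years <= max_age:
                return n
        return None                       # unknown material: flag for manual review

    print(default_roughness("Concrete", 35))   # -> 0.014
    print(default_roughness("brick", 80))      # -> None, needs manual review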

Finally on technical features, the run time and stability of the model are determinants of efficiency – in many cases run time is dead time for the modeling staff, and unstable models that fail during runs multiply this dead time and create a hidden addition to the total cost equation.  Speed and stability are vital attributes to include in software selection criteria.

Interface features

The issue of interfaces can be addressed as three separate aspects that are all key to high productivity: the user interface, multi-user operation, and interfacing to other software.

Most modeling software now has a Windows-style GUI as the user interface.  But for maximum productivity, it is vital that the interface is effective, intuitive for Windows users, and, critically, consistent across the entire product.  Where a product is modular, and perhaps those modules come from different sources, standardization of the interface is not easy to achieve.  If the interface is not simple and self-explanatory, the use of models will remain the sole province of specialist modelers rather than becoming standard across an entire engineering department.

The second aspect is support for multi-user operation – another prerequisite of broadening the use of modeling across a department to leverage the build costs.  Ideally, the software will allow individual users the scope to own and use their own models and data sets, and to view models developed by others under security control, but will impose central control over the storage of, and access to, each dataset.  An appropriate mix of accessibility and control is essential to any modeling software planned for long-term use within a department and beyond.

The import and export of files of various types is essential both to efficient model build, and to the effective reporting of model results.  In terms of model build, the most obvious links required are to GIS systems.  Modern modeling software can operate in one of two ways – either having links to a single specified GIS only, or links to all the major GIS systems.

In addition to importing network data from a GIS, it is also important to import terrain data and time-varying data, including survey data and rainfall data.  Both these come in a variety of formats, both open and proprietary.  The ability of software to manage the import of a wide variety of formats, rather than requiring extensive system knowledge and time-consuming manipulation by the user, affects both the model build and the analysis effort required.

In terms of outputs, the first need is for data to be written back to the GIS.  Next, for the effective distribution of results, interfaces to the most effective reporting software are required.  Close links to MS Office are now essential for reporting and disseminating model results.  With web publishing now a reality in many companies, the export of model results directly as XML is very useful.  The essential feature is the flexibility to meet most common export requirements.

Management features

Management features are often the last attributes to be placed on a selection checklist, but in fact are among the most important if modeling is to be an ongoing core activity, reliably supporting decision making within the engineering department.  The pressure of delivering working models and results frequently tempts modelers to skip the proven good practices that should underpin their work, so it makes sense to have these activities managed by the software.

The first management requirement is data management.  Every data set should have its creation and every subsequent use and change tracked – who did what, and when.  Data properly managed by modeling software should be securely stored centrally, downloaded to the client PC of the user as required, all changes there logged, and then stored again centrally in its updated form, with audit logging, at the end of the session.  If there are multiple versions of data sets, then strict version control must also be applied and logged for scrutiny when required.
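
In outline, the audit record itself need only capture who changed what, when, and from which value to which, as in the hypothetical sketch below; a real modeling database would hold this centrally and enforce it automatically.

    # Illustrative only: an append-only audit record of who changed what, and when.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class AuditEntry:
        user: str
        obj: str                # e.g. a pipe or node identifier
        attribute: str
        old_value: object
        new_value: object
        timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    audit_log = []

    def record_change(user, obj, attribute, old_value, new_value):
        entry = AuditEntry(user, obj, attribute, old_value, new_value)
        audit_log.append(entry)           # entries are only ever appended, never edited
        return entry

    record_change("jsmith", "pipe_1041", "diameter_mm", 300, 375)
    print(audit_log[0].user, audit_log[0].attribute, audit_log[0].new_value)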

Data flagging is another essential feature of good data management.  Many users want to annotate their data with so-called “metadata” that flags, for each data item, the source, the date, and perhaps the reliability of the data.  The ability to store this metadata, as data and text, is essential to the intelligent use of data.
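
A sketch of how such flags might sit alongside the data is shown below; the flag codes and notes are invented purely to show the idea of per-attribute provenance.

    # Illustrative only: each attribute carries a flag recording its source and reliability.
    pipe_1041 = {
        "diameter_mm":  {"value": 375,   "flag": "AS", "note": "asset survey"},
        "invert_level": {"value": 51.85, "flag": "IN", "note": "interpolated between neighbours"},
        "roughness_n":  {"value": 0.014, "flag": "DF", "note": "default from material/age rule"},
    }

    # Provenance can then be queried directly, e.g. list the attributes that were inferred:
    inferred = [name for name, item in pipe_1041.items() if item["flag"] == "IN"]
    print(inferred)                       # -> ['invert_level']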

Model management requires the same disciplines to be applied to models as to data.  Version control is frequently ignored by modelers as they make changes, so good modeling software will log and store all versions for audit as required.  Equally, just as with data, who did what and when should be recorded, and this is best undertaken by a good audit trail function within the software.

The iterative process of modeling places undue emphasis on the current versions of data and the model, and users have every incentive to ignore the history that went before.  Good practice demands that full audit and version control are maintained, as every experienced user has learned the hard way, and software does this best.

Cost of modeling

A final criterion in selecting software is the crucial issue of cost.  If two or more software modeling solutions meet all the technical, productivity and management features that the buyer deems necessary, the question arises – what are the comparative costs?

A common mistake at this stage is to fail to list the full cost of modeling.  The most visible cost is the purchase price of the modeling software, and it is easy to assume either that the costs beyond that initial outlay follow the pattern of the purchase prices – that less expensive software has lower costs overall – or that all other costs are identical across all products, so that the only difference is in purchase price.  Both these assumptions are wrong.

The fact is that if the more expensive software has more productivity and management features than cheaper software, it will be less expensive to use.  It is hard to generalize about the annual costs of running a model, but they may well amount to three to five times the purchase cost of the software.  There are of course benefits of using the model, benefits which by definition outweigh all the total costs if the procurement is justified.  But if the selection decision is to be cost based, after all benefits of using the models have been assessed, it is vital that full costs are taken into account.

These full costs are the sum of:

Software costs

  • The purchase price of the software, for all modules required at the time of purchase and into the future
  • The purchase price of new releases and upgrades – some manufacturers charge, and some do not
  • The purchase price of any third party software that is essential to the running of the modeling software
  • The annual software maintenance charge, usually a percentage of purchase price.

Training costs

  • The costs of staff training, noting that effective use of the model requires an appropriate amount of training across a number of staff.

Support costs

  • The annual cost of software support, which may be bundled into the maintenance charge or may incur an extra charge.  Vendors also vary in whether they charge for assistance with a specific model, for example help with run problems.

Staff costs

  • The full payroll costs of the staff that build and run the models.  These costs, possibly a proportion of the full costs of a number of staff rather than a single dedicated staff member, are likely to be large in comparison with software costs, and therefore to be the dominant cost.

All these costs differ according to the software, but for two reasons there is a general rule that more productivity and management features in software lead to lower staff costs.  First, automated tasks need less staff resource to undertake.  Second, if tasks are automated and interfaces are simple to use, these simpler elements of model building and operation can be undertaken by less skilled and lower-paid staff.  If staff costs are indeed the dominant element of true costs, then the software that provides the most effective productivity support will prove the best buy, other things being equal.
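
The arithmetic below is purely illustrative, with invented figures, but shows how a cheaper licence can still carry the higher five-year total once annual staff effort is included.

    # Illustrative arithmetic only, with invented figures: the cheaper licence can still
    # carry the higher total cost once annual staff effort is included.
    def five_year_cost(licence, maintenance_rate, annual_staff_cost, years=5):
        return licence + years * (licence * maintenance_rate + annual_staff_cost)

    package_a = five_year_cost(licence=20_000, maintenance_rate=0.20,
                               annual_staff_cost=90_000)   # fewer productivity features
    package_b = five_year_cost(licence=40_000, maintenance_rate=0.20,
                               annual_staff_cost=60_000)   # more automation, less staff time
    print(package_a, package_b)                            # -> 490000.0 380000.0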

Selecting software

In the light of all the above, the reader can draw up a good checklist for software selection.  However, there is a chicken-and-egg problem: filling in such a framework accurately requires detailed knowledge of the software, but the user does not really know how a product performs until after purchase.

The solution adopted by many companies is to formally benchmark products with live prototypes in a pre-purchase phase, within an evaluation framework that is truly representative of the planned future use of the model.  If successful, the work is not wasted. If the model does not work out well, an expensive and long-term mistake is avoided.  The selection and implementation of modeling software for ongoing decision support throughout an engineering department and beyond is a decision that will have an impact for at least five years into the future.  It is worth putting in the effort to get it right.
