Software engineering research has proposed many tools, methods, and techniques to improve the software development process. It has long been recognised that the linear “waterfall” model of requirements -> design -> implementation -> testing -> evaluation simply does not work in practice for very complex systems. The most successful models closely integrate the evaluators and the developers by taking a more incremental approach. This incremental approach seeks to implement a scaled-down version of the system (its skeleton) as soon as possible, and then build the required functionality onto that skeleton in increments. Such an overall software engineering approach will be applied in CryoClim.
Approach for managing user requirements
Understanding the user needs for CryoClim is critical to the success of the operational system. However, the process of gathering user needs and specifying system requirements is not straightforward. Particular problems faced by the analyst are:
- Addressing complex organisational situations with many stakeholders.
- Users and designers thinking along traditional lines, reflecting the current system and processes, rather than being innovative.
- Users not knowing in advance what they want from the future system.
- Rapid development cycles, reducing the time available for user needs analysis.
- Representing user requirements in an appropriate form.
In order to address these problems, an iterative process will be adopted. This involves the following steps:
- Perform an initial user survey to identify user needs at a general level. This is based upon the results of Phase 1 of the project.
- Produce a simulation of the system user interface based upon an understanding of the technical capabilities of the system and the identified general user needs. A first version was made in Phase 1 of the project.
- Demonstrate the simulation to an extended user group to obtain feedback and to generate more specific needs. These are then specified as system requirements within the CryoClim system specification documentation.
- Produce a prototype system (with partial functionality) and demonstrate it to a panel of users to obtain feedback. This step is repeated with several versions of the prototype so that the prototype system gradually reflects user requirements more closely.
Approach for user evaluation
User evaluation will provide on-going inputs to the CryoClim development team, and report on the evaluation of the system for each version of the system. The evaluation will be based on two principal activities:
- Demonstrator review: The key users will monitor the development of the system and give feedback to improve it, based on a Flash demonstrator and a prototype web portal. The Flash demonstrator is meant to show most of the functionality envisioned for the system ("the web service vision"), while the prototype web portal is built on the real backend system and shows the functionality developed so far in the project. First versions of both demonstrators were developed in Phase 1 of the project.
- User panel: User representatives will test and provide comments on the developing system prototype. The panel will provide important input regarding overall user functionality. The various users will have access to the system and will be able to send various data requests to it. Feedback from the users will be an important resource for necessary improvements to the whole system.
Essentially, users are concerned with the output that they receive from the system: a useful product of appropriate accuracy, resolution and timeliness. However, their response to the system will also depend on their background knowledge of how CryoClim generates products, which helps to set their expectations of the system at an appropriate level. In general, an understanding of the product generation process and the system architecture will help users to relate to and use the system appropriately.
Approach for ICT evaluation
The Information and Communication Technology (ICT) system validation is based upon functional requirements (what the system will do) and non-functional requirements (qualities such as performance, security, interface operation, etc.). Use cases have been defined to ensure that all these requirements are met.
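The idea of tying use cases back to the requirements they exercise can be sketched in a few lines. This is an illustrative example only, not the actual CryoClim specification: the requirement identifiers and use-case names below are hypothetical.

```python
# Sketch: check that every specified requirement (functional and
# non-functional) is covered by at least one defined use case.
# All identifiers are hypothetical, for illustration only.

requirements = {"FR-01", "FR-02", "NFR-PERF-01", "NFR-SEC-01"}

use_cases = {
    "Request snow product": {"FR-01", "NFR-PERF-01"},
    "Browse product archive": {"FR-02"},
    "Authenticated data download": {"FR-02", "NFR-SEC-01"},
}

# Union of all requirements touched by some use case.
covered = set().union(*use_cases.values())
uncovered = requirements - covered

if uncovered:
    print("Requirements lacking a use case:", sorted(uncovered))
else:
    print("All requirements are exercised by at least one use case.")
```

A coverage check of this kind makes gaps in the use-case set visible before validation begins.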
The ICT system validation includes the following general activities:
- Check that the design outputs of each phase of the system development life cycle meet all of the specified requirements for that phase, looking for consistency, completeness, and correctness of the system and its supporting documentation.
- Confirm, by examination and objective evidence, that the system conforms to the specified user requirements and that no failures occur; identify and log any problems detected.
- Confirm that the whole system works as specified.
Validation has to make sure that the system is robust with respect to imported data. The validation also has to make sure that the algorithms are robust and reliable with respect to natural variability. In addition, the output of each module has to be tested for quality. For example, one module may geocode satellite data. The accuracy of the geocoding process is validated by comparing known locations on the satellite image with ground control points, whose exact geographic location is known. Similarly, the output of a glacier or ice type classification module may be compared with field data.
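The geocoding check described above can be sketched as a simple error computation against ground control points (GCPs). The coordinates below are hypothetical metric easting/northing pairs, used purely for illustration.

```python
# Sketch of geocoding accuracy validation: compare the image-derived
# position of each ground control point with its known geographic
# location and report the root-mean-square error (RMSE).
import math

# ((known_easting, known_northing), (geocoded_easting, geocoded_northing))
# All values are hypothetical, in metres.
gcps = [
    ((431200.0, 6723400.0), (431208.0, 6723394.0)),
    ((448750.0, 6731020.0), (448745.0, 6731027.0)),
    ((460310.0, 6719880.0), (460316.0, 6719874.0)),
]

sq_errors = [
    (kx - gx) ** 2 + (ky - gy) ** 2
    for (kx, ky), (gx, gy) in gcps
]
rmse = math.sqrt(sum(sq_errors) / len(sq_errors))
print(f"Geocoding RMSE: {rmse:.1f} m")
```

In practice the RMSE would be compared against the geolocation accuracy requirement for the product in question.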
Similar tasks are encountered in the validation of the total system. Key components, such as data storage and interactive processing, are validated at the component level; the communication between the various system components, the data exchange, and the functionality of the user interface are among the other aspects to be examined. A continuous integration approach allows errors and other inconsistencies to be detected as early as possible.
Other aspects of system validation are as follows:
- Testing with a realistic dataset: The technical context for evaluation and validation must be specified so that tests are performed in a reliable way. As a part of the validation process the database can be populated to simulate a database as it will be after several years of operation.
- Overall performance and load testing: A typical hardware and software configuration must be specified, with a realistic load on the system, to give a true indication of system performance. In addition, through use of a specific load-testing tool, The Grinder, simulations of various user loads can be run.
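The load-testing idea can be illustrated with a minimal sketch. Note that The Grinder itself is a Java-based tool; the Python snippet below only illustrates the principle of simulating several concurrent users, and `handle_request` is a dummy stand-in for a real request to the portal.

```python
# Minimal illustration of simulated user load: spawn N "users", each
# issuing a batch of requests against a stand-in request handler, and
# report throughput. In a real test the handler would call the portal.
import threading
import time

def handle_request(payload):
    # Dummy stand-in for a real data request to the web portal.
    time.sleep(0.001)
    return f"ok:{payload}"

def simulated_user(user_id, n_requests, results):
    completed = 0
    for i in range(n_requests):
        if handle_request(f"user{user_id}-req{i}").startswith("ok"):
            completed += 1
    results[user_id] = completed

n_users, n_requests = 8, 20
results = {}
threads = [
    threading.Thread(target=simulated_user, args=(u, n_requests, results))
    for u in range(n_users)
]
start = time.perf_counter()
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start

total = sum(results.values())
print(f"{total} requests completed in {elapsed:.2f} s "
      f"({total / elapsed:.0f} req/s)")
```

Varying `n_users` and `n_requests` is the essence of a load profile: the same scenario run at increasing concurrency until response times degrade.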
A service validation protocol was developed in Phase 1 of the project, which for ICT covers the portal.
Approach for algorithm validation
Algorithm and product validation includes:
- Testing that the algorithm actually works sufficiently well under all expected natural variability.
- Testing that the cryospheric variables retrieved from the satellite data and other data, delivered as products in the operational environment, actually meet the specified quality criteria.
Algorithm validation involves collecting observation data, e.g. high-resolution satellite images and in situ measurements, to compare with the satellite observations used in the production algorithms. The results of these comparisons feed improvements to the current algorithms. This validation takes place prior to service provision.
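The comparison against in situ data can be sketched as a bias/RMSE computation checked against quality criteria. The variable, the numbers, and the thresholds below are all hypothetical, chosen only to illustrate the shape of such a check.

```python
# Sketch of upfront algorithm validation: compare a retrieved cryospheric
# variable (here, a hypothetical fractional snow cover) against matched
# in situ observations, then check bias and RMSE against hypothetical
# quality criteria.
import math

# (satellite-retrieved value, in situ value) pairs -- illustrative numbers.
pairs = [(0.82, 0.80), (0.45, 0.50), (0.10, 0.08), (0.93, 0.90), (0.60, 0.66)]

diffs = [ret - obs for ret, obs in pairs]
bias = sum(diffs) / len(diffs)
rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))

MAX_ABS_BIAS = 0.05   # hypothetical quality criterion
MAX_RMSE = 0.10       # hypothetical quality criterion

print(f"bias={bias:+.3f}, rmse={rmse:.3f}")
print("quality criteria met"
      if abs(bias) <= MAX_ABS_BIAS and rmse <= MAX_RMSE
      else "quality criteria NOT met")
```

The same pattern applies to any retrieved variable: match retrievals to reference observations, compute the agreed error statistics, and compare them with the thresholds in the product quality specification.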
Running product validation takes place while the service is in operation. It may consist of a general quality check of all products produced, while sample products are analysed further to check that they meet the full quality specification.
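The two-tier scheme of a cheap general check on everything plus a fuller check on a sample can be sketched as follows. The product records and the check rules are hypothetical, intended only to show the structure of the scheme.

```python
# Sketch of running product validation: every produced product gets a
# quick general check, while a random sample is subjected to a stricter
# check against the (hypothetical) full quality specification.
import random

random.seed(42)  # reproducible illustration

# Hypothetical product metadata records.
products = [
    {"id": f"prod-{i}",
     "coverage": random.uniform(0.7, 1.0),
     "missing_fraction": random.uniform(0.0, 0.1)}
    for i in range(50)
]

def general_check(p):
    # Quick sanity check applied to every product.
    return p["coverage"] > 0.0 and p["missing_fraction"] < 0.5

def full_check(p):
    # Stricter check standing in for the full quality specification.
    return p["coverage"] >= 0.75 and p["missing_fraction"] <= 0.05

passed_general = [p for p in products if general_check(p)]
sample = random.sample(passed_general, k=5)
sample_failures = [p["id"] for p in sample if not full_check(p)]

print(f"{len(passed_general)}/{len(products)} passed the general check; "
      f"sampled {len(sample)}, failures: {sample_failures or 'none'}")
```

Failures flagged on the sampled products would trigger a deeper investigation of the corresponding production run.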
Both upfront and running validation results will be made fully available to the users through the web service. A product validation protocol was developed in Phase 1 of the project.