DSSAT Archives

DSSAT - Crop Models and Applications

DSSAT@LISTSERV.UGA.EDU

Subject:
From:     "Dr. Gerrit Hoogenboom" <[log in to unmask]>
Reply-To: DSSAT - Crop Models and Applications <[log in to unmask]>
Date:     Thu, 30 Nov 1995 12:25:31 -0500
Content-Type: text/plain
Parts/Attachments: text/plain (722 lines)
The following CAMASE electronic newsletter has several items
related to modeling that may interest you.
 
Gerrit
 
 
>Date:     Thu, 30 Nov 1995 16:48:19 +0000
>From:     "Ing. M.C. Plentinger, AB-DLO" <[log in to unmask]>
>Reply-To: Quantitative Methods of Research on Agricultural Systems and
>          the Environment <[log in to unmask]>
>Subject:  CAMASE_NEWS Extra edition, November 1995
>To:       Multiple recipients of list CAMASE-L <[log in to unmask]>
>
>    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
>
>                             N E W S L E T T E R
>
>                                     O F
>
>                        A G R O - E C O S Y S T E M S
>
>                             M O D E L L I N G
>
>    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
>    Published by AB-DLO                     November 1995, Extra edition
>    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
>
>    \==================================================================\
>    \                                                                  \
>    \                            CONTENTS                              \
>    \                                                                  \
>    \                           GUIDELINES                             \
>    \                                                                  \
>    \                           EVALUATION                             \
>    \                                                                  \
>    \                           Definitions                            \
>    \                           Guidelines                             \
>    \                           References                             \
>    \                                                                  \
>    \               SENSITIVITY AND UNCERTAINTY ANALYSIS               \
>    \                                                                  \
>    \                           Definitions                            \
>    \                           Guidelines                             \
>    \                           References                             \
>    \                                                                  \
>    \                           CALIBRATION                            \
>    \                                                                  \
>    \                           Definitions                            \
>    \                           Guidelines                             \
>    \                           References                             \
>    \==================================================================\
>
>    ====================================================================
>    EDITORIAL I
>
>    CAMASE is a concerted action that would not have come into existence
>    without the generous support of the European Commission's RTD
>    programme. This is greatly appreciated by us and, we are sure, by
>    our readers.
>                                                 Frits Penning de Vries,
>                                                        Marja Plentinger
>    ====================================================================
>
>    ====================================================================
>    EDITORIAL II
>
>    Systems analysis and simulation are commonly used tools of
>    researchers. Yet many of us learned to use them on our own, by
>    trial and error. In the process we tumbled into many pitfalls,
>    sometimes without even realizing it. It was suggested that CAMASE
>    make an effort to produce guidelines for modelling and distribute
>    them widely. We have pooled our limited wisdom in this matter, and
>    propose three short sets of guidelines on these pages. We hope they
>    will make a difference.
>    No doubt our guidelines can be improved further! May we have
>    your comments?
>    ====================================================================
>
>                                 *   *   *
>
>                                 GUIDELINES
>
>    The need for guidelines for modelling has been expressed several
>    times, particularly by those outside the mainstream of developments.
>    In the CAMASE project, we have developed a first draft of guidelines
>    for 'validation', 'sensitivity and uncertainty analysis' and
>    'calibration'. These are presented below, preceded by some relevant
>    definitions. To provide readers with more details and access to
>    examples, we added references to some of the most relevant
>    scientific papers. We very much welcome all responses to further
>    improve the guidelines and the set of most relevant papers. A next
>    step in upgrading the quality of model building and model use should
>    be a manual with more explicit guidelines, procedures, tools and
>    examples. Ongoing projects on software quality (see next
>    CAMASE_NEWS) aim at developing such manuals.
>
>    We acknowledge the input of Dr.Ir. A.K. Bregt, Dr. Ph. Debaeke,
>    Dr.ir. W.A.H. Rossing, Ir. M.J. van der Velden, Ir. G.W.J. van de
>    Ven, and Dr.ir. A.L.M. van Wijk.
>
>    Frits Penning de Vries,
>    Michiel Jansen, and
>    Klaas Metselaar.
>
>
>                                 *   *   *
>
>                                 EVALUATION
>
>    * Definitions
>
>    - Evaluation
>    The broadest term to describe the action of judging the adequacy of
>    a model. Evaluation includes checking internal consistency and
>    units used in a computer program, comparison of model output with
>    an independent data set of real-world observations, uncertainty
>    analysis, and judgement of utility. The term 'test' is sometimes
>    used with the same meaning.
>
>    - Validation
>    The term will be used here in its most common utilitarian sense of
>    establishing the usefulness and relevance of a model for a
>    predefined purpose. It is a recurrent activity during model
>    development. Models always have a limited range of validity, and it
>    is necessary to specify clearly what that range is. In the case of
>    predictive models, a major part of the validation consists of an
>    assessment of prediction accuracy.
>
>    - Verification
>    This technical term designates the inspection of the internal
>    consistency of the model and its software implementation. Some
>    important elements are: analysis of dimensions and units, on-line
>    checks on mass conservation, detection of violation of natural
>    ranges of parameters and variables. Verification also comprises
>    inspection of qualitative behaviour of the model and its
>    implementation, for instance a check whether the response of one
>    model output to changing values of one parameter conforms to
>    theoretical insights.
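>
>    As a minimal illustration of such on-line checks, consider the
>    Python sketch below, which adds natural-range and mass-conservation
>    assertions to a toy daily water balance; the function and variable
>    names are hypothetical, not taken from any particular model.
>
>      def water_balance_step(storage, rain, evap, drainage):
>          """One daily step of a toy soil water balance (all in mm)."""
>          new_storage = storage + rain - evap - drainage
>          # Natural-range check: storage can never become negative.
>          assert new_storage >= 0.0, "storage outside natural range"
>          # Mass-conservation check: the change in storage must equal
>          # the net flow (in a real model the terms come from separate
>          # process routines, which makes this check non-trivial).
>          balance = (new_storage - storage) - (rain - evap - drainage)
>          assert abs(balance) < 1e-9, "mass balance violated"
>          return new_storage
>
>      print(water_balance_step(100.0, rain=5.0, evap=3.0, drainage=1.0))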
>
>    - Calibration and validation data
>    Sets of data used to calibrate and to validate a model, respectively.
>
>    - Cross-validation
>    A procedure for calibrating and validating a model with a limited
>    number of representative data sets. It consists of repeated
>    subdivision of all the data into calibration and validation data,
>    followed by corresponding calibration and validation. The average
>    of the observed prediction errors over the subdivisions provides an
>    estimate of the prediction error in an entirely new situation.
>    There are several variants of cross-validation. In the most popular
>    one, called leave-one-out validation, each independent data set
>    gets the role of validation data exactly once, on which occasion
>    the complementary set serves as the calibration set.
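>
>    A minimal Python sketch of leave-one-out cross-validation; the
>    straight-line model and its least-squares calibration are
>    placeholders for any model/calibration pair.
>
>      data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.8), (5, 10.1)]
>
>      def calibrate(pairs):
>          # Least-squares fit of y = a + b*x to the calibration data.
>          n = len(pairs)
>          mx = sum(x for x, _ in pairs) / n
>          my = sum(y for _, y in pairs) / n
>          b = (sum((x - mx) * (y - my) for x, y in pairs)
>               / sum((x - mx) ** 2 for x, _ in pairs))
>          return my - b * mx, b              # intercept a, slope b
>
>      errors = []
>      for i, (x, y) in enumerate(data):
>          rest = data[:i] + data[i + 1:]     # complementary calibration set
>          a, b = calibrate(rest)             # held-out pair is validated
>          errors.append((a + b * x - y) ** 2)
>      print("estimated prediction MSE:", sum(errors) / len(errors))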
>
>
>    * Guidelines
>
>    # Make explicit for what purpose the model is being validated, and
>    check whether this is compatible with the objectives for which
>    the model was developed.
>
>    # Make explicit in the description of the model which processes or
>    natural resources are limiting the behaviour of the model.
>
>    # It is meaningless to simply state that a model is valid. After a
>    successful validation, a model is shown to be of practical use for a
>    specific purpose over a specific range. A discussion of acceptable
>    error size, with due regard to the specific purpose, should be
>    included. Large errors might make the model of little practical
>    value as a predictor, though it might still have instructive
>    value. Validation of absolute values of key variables is best.
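>
>    A minimal sketch of reporting error size against an explicit,
>    purpose-specific acceptance threshold; all numbers below are
>    illustrative only.
>
>      import math
>
>      observed  = [5.2, 6.1, 4.8, 7.0, 5.9]   # e.g. measured yield, t/ha
>      simulated = [5.0, 6.5, 4.5, 6.8, 6.3]   # model output, same cases
>
>      n = len(observed)
>      bias = sum(s - o for s, o in zip(simulated, observed)) / n
>      rmse = math.sqrt(sum((s - o) ** 2
>                           for s, o in zip(simulated, observed)) / n)
>
>      threshold = 0.5    # acceptable RMSE for this purpose (assumed)
>      print(f"bias={bias:+.2f}, RMSE={rmse:.2f}, "
>            f"acceptable={rmse <= threshold}")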
>
>    # Model evaluation should start with verification of the model and
>    its software implementation.
>
>    # In model evaluation, every model output should be subject to
>    validation. If the model is to be used in predictions, such as
>    scenario studies, the validation of the model is more efficiently
>    focused on issues of interest, such as differences between
>    scenarios, or the resulting ranking of alternatives, e.g. the
>    ranking of varieties by predicted yield.
>
>    # The validation data should be representative of the situations
>    in which the model is to be used: Swedish data, for instance, may
>    be unsuitable to validate a model to be used in Spain. The
>    validation set should, if possible, cover the range of situations
>    encountered in predictions.
>
>    # Although prediction accuracy will benefit from representative
>    calibration data, representativity of the calibration data is not
>    required for soundness of the validation.
>
>    # The calibration data and the validation data should be different.
>    In studies where a large number of validations are executed, and
>    calibration and validation sets are taken arbitrarily from the
>    available sets, there is a chance that a calibration set and a
>    validation set are identical.
>
>    # Validation should be repeatable by colleague scientists. This
>    means that all crucial validation data (in a broad sense,
>    comprising input, output, model structure) should be well
>    documented and accessible. Validation data sets should be of high
>    quality.
>
>    # When dealing with complex models, divide and rule:
>    a. If the subject of a model is too large for regular validation
>       (e.g. an entire region), the model is to be subdivided into
>       components that are validated separately. Provide a logical
>       argument that the aggregate model is consistent, and do
>       not miss crucial interactions among the components.
>    b. If the subject of the model takes too long for regular
>       validation (e.g. long-term changes in soil structure and
>       organic matter), validation should be undertaken for shorter
>       periods, and indirect evidence (time series from different
>       environments) collected.
>
>
>    * References
>
>    Addiscott, T., J. Smith & N. Bradbury, 1995. Critical evaluation
>       of models and their parameters. Journal of Environmental Quality
>       24: 803-807
>       803-807
>    Colson, J., D. Wallach, A. Bouniols, J.B. Denis & J.W. Jones, 1995.
>       Mean squared error of yield prediction by SOYGRO. Agronomy
>       Journal 87: 397-402
>    Debaeke, Ph., K. Loague & R.E. Green, 1991. Statistical and
>       graphical methods for evaluating solute transport models:
>       overview and application. J. Contaminant Hydrology 7: 51-73
>    Hamilton, M.A., 1991. Model validation: an annotated bibliography.
>       Commun. Statist. Theory Meth. 20(7): 2207-2266
>    Koning, G.H.J. de, M.J.W. Jansen, C.A. van Diepen & F.W.T.
>       Penning de Vries, 1993. Crop growth simulation and statistical
>       validation for regional yield forecasting across the European
>       Community. CABO-TT Simulation Reports 31.
>    Penning de Vries, F.W.T., 1977. Evaluation of simulation models in
>       agriculture and biology: conclusions of a workshop. Agricultural
>       Systems 2: 99-107
>    Power, M., 1993. The predictive validation of ecological and
>       environmental models. Ecological Modelling 68: 33-50
>    Rosenberg, N.J., M.S. McKenney, W.E. Easterling & K.M. Lemon, 1992.
>       Validation of EPIC model simulations of crop responses to current
>       climate and CO2 conditions: comparisons with census, expert
>       judgement and experimental plot data. Agric. Met. 59: 35-51
>    Scholten, H., 1994. Blueprint of a supramodel for quality assurance
>       of the simulation modelling process. Full paper submitted to
>       European Simulation Symposium, Istanbul, Turkey, October 9-12, 1994
>    Scholten, H. & M.W.M. van der Tol, 1994. Towards a metrics for
>       simulation model validation. In: Grasman, J. & G. van Straten
>       (Eds.). Predictability and nonlinear modelling in natural
>       sciences and economics. Proceedings of the 75th Anniversary
>       Conference of WAU, April 5-7, 1993, Wageningen, The Netherlands.
>       Kluwer Publishers, Dordrecht. 398-410
>
>
>                                 *   *   *
>
>                   SENSITIVITY AND UNCERTAINTY ANALYSIS
>
>    * Definitions
>
>    - Input
>    All parameters, initial values, tabulated functions, and driving
>    variables in the model. For some analyses, tabulated functions may
>    have to be parameterized.
>
>    - Uncertainty
>    In this context: imperfect knowledge regarding aspects of a model.
>    Uncertainty regarding model variables is usually specified by a
>    probability distribution or by a sample of measured values (an
>    empirical probability distribution); sometimes it is specified by a
>    set of possible values. We adhere to the probabilistic concept of
>    uncertainty, and we use variances as a measure of uncertainty.
>
>    - Sources of uncertainty
>    Uncertainty exists at the level of the inputs and outputs of the
>    model.
>    Uncertainty at the level of model formulation also exists. In these
>    guidelines, however, we will assume that the model is deterministic,
>    and that uncertainties are solely introduced via the inputs. Input
>    uncertainty is caused by natural variation (e.g. weather, soil or
>    genetic variation) as well as by imperfection of data. Although the
>    causes of uncertainties may differ, their effect is the same, namely
>    uncertainty about the model outputs. It is up to the modeller
>    whether or not to incorporate natural variation in the model; the
>    choice depends also on the spatial or temporal scale at which the
>    model is used.
>    The input uncertainty of different parameters may contain
>    correlations caused by biological or physical mechanisms, e.g.
>    correlation between development rate before and after flowering,
>    or between the weather on two consecutive days. Correlation can also
>    be caused by the nature of the data analyzed to estimate parameters,
>    e.g. correlation between estimates of intercept and slope of a
>    regression line.
>
>    - Sensitivity analysis
>    Definitions vary. In most studies, sensitivity analysis is the study
>    of model properties through changes in the input variables (not
>    necessarily of realistic size) and the analysis of their effect on
>    model outputs. The questions addressed are for instance:
>    # whether or not some output is affected at all by some input
>    # continuity, differentiability, monotonic increase or decrease of
>      the model's response to input variation
>    Most of the variation of outputs is generally caused by a small
>    number of inputs.
>
>    - Uncertainty analysis
>    Definitions vary. In most studies, uncertainty analysis is the study
>    of output uncertainty as a function of a careful inventory of the
>    different sources of uncertainty present in the model. The questions
>    addressed are for instance:
>    # What is the prediction uncertainty due to all uncertainties in
>      model inputs? (Total uncertainty, often expressed as variance)
>    # How do inputs (singly or in groups) contribute to prediction
>      uncertainty?
>
>    - One-at-a-time sensitivity analysis
>    An analysis of responses to variation of one input at a time,
>    while the other inputs are kept at nominal values. One-at-a-time
>    graphs, in which a model response is plotted against the studied
>    input as it varies in small steps over some range, can be
>    informative and may reveal discontinuities.
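>
>    A minimal one-at-a-time sweep over a toy two-input model in
>    Python; each input varies in small steps over its range while the
>    other is held at its nominal value.
>
>      def model(a, b):                   # stand-in for a real model
>          return a * a + 3.0 * b
>
>      nominal = {"a": 2.0, "b": 1.0}
>      ranges = {"a": (1.0, 3.0), "b": (0.0, 2.0)}
>      for name, (lo, hi) in ranges.items():
>          for i in range(5):             # small steps over the range
>              x = lo + (hi - lo) * i / 4
>              inputs = dict(nominal, **{name: x})
>              print(f"{name}={x:4.2f} -> output={model(**inputs):5.2f}")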
>
>    - Factorial sensitivity analysis
>    Analysis where inputs are varied according to a so-called factorial
>    design. In the most common factorial design, called a two-level
>    design, each input has two levels: low and high. A full two-level
>    factorial design for n inputs requires 2^n model runs. If this
>    number is prohibitive, one may apply a fractional factorial design,
>    in which only a fraction of the input combinations is realized.
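>
>    A minimal full two-level factorial design, enumerating all 2^n
>    low/high combinations (here n = 3, so 8 runs) for a stand-in
>    model.
>
>      from itertools import product
>
>      levels = {"a": (1.0, 3.0), "b": (0.0, 2.0), "c": (0.5, 1.5)}
>
>      def model(a, b, c):                # stand-in for a real model
>          return a * b + c
>
>      for combo in product(*levels.values()):
>          inputs = dict(zip(levels, combo))
>          print(inputs, "->", model(**inputs))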
>
>    - Local sensitivity analysis (or differential sensitivity analysis)
>    An analysis of responses to very small variations around some
>    setting of the input, e.g. nominal values.
>
>    - Logical sensitivity analysis
>    The effort to establish, by theoretical study of the model or by
>    inspection of results of sensitivity or uncertainty analysis,
>    whether the model is sensitive at all to changes in an input.
>
>    - Elicitation
>    A formal procedure to translate expert knowledge regarding input
>    uncertainty into probability distributions.
>
>
>    * Guidelines
>
>    # All parameters should be accessible for uncertainty and
>    sensitivity analysis. The source code of a model should not
>    contain unexplained numerical values.
>
>    # Perform sensitivity analysis for verification of the model and
>    its implementation. Repeated running of the software over a broad
>    range of circumstances already constitutes a non-trivial test. Then
>    check whether the qualitative behaviour of responses conforms to
>    theoretical expectations.
>
>    # A logical sensitivity analysis can help to detect inputs to which
>    an output is entirely insensitive (factor screening). These sleeping
>    inputs might be ignored in subsequent analyses. However, be aware
>    that the sensitivity to an input may depend on the values of other
>    inputs.
>
>    # Apply factorial sensitivity analysis if you are interested in the
>    interaction between inputs. This is important when the response to
>    an input depends on the setting of other inputs.
>
>    # Use one-at-a-time sensitivity analysis to detect irregularities
>    in the response, e.g. discontinuities, that may preclude specific
>    model-calibration techniques.
>
>    # For research papers on models and validation studies, an
>    uncertainty analysis is highly recommended.
>
>    # The establishment of input uncertainty constitutes the most
>    elaborate and most critical stage of uncertainty analysis.
>    Literature and experiments constitute the most natural source of
>    information. Expert knowledge is another source. Be aware that
>    experts in agro-ecology are not automatically experts in
>    probability; formal elicitation procedures may be helpful.
>
>    # Data providing information about input uncertainty often pertain
>    to separate submodels. Information about correlation in uncertain
>    inputs can be quite valuable, since such information may greatly
>    reduce output uncertainty.
>
>    # Artificially generated weather data are often practical to use.
>    Weather generators are also models and need to be validated.
>
>    # If possible, perform uncertainty analysis for all variables
>    simultaneously. For large models, the analysis may have to be
>    performed for submodels separately.
>
>    # Simple random sampling from the input uncertainty distribution is
>    a good starting point, but Latin hypercube sampling may be advisable
>    for efficiency. Both methods can incorporate correlations; simple
>    random sampling is conceptually simple and theoretically well
>    developed. A sketch of both schemes follows below.
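>
>    A minimal sketch of both sampling schemes for one uniform input;
>    the Latin hypercube draws one point per equal-probability stratum
>    and shuffles them, and the quadratic model is a stand-in.
>
>      import random
>
>      def simple_random(n):
>          return [random.random() for _ in range(n)]
>
>      def latin_hypercube(n):
>          # One point per stratum [i/n, (i+1)/n), in random order.
>          pts = [(i + random.random()) / n for i in range(n)]
>          random.shuffle(pts)
>          return pts
>
>      def model(u):                      # stand-in for a real model
>          return 10.0 * u ** 2
>
>      for sampler in (simple_random, latin_hypercube):
>          ys = [model(u) for u in sampler(200)]
>          mean = sum(ys) / len(ys)
>          var = sum((y - mean) ** 2 for y in ys) / (len(ys) - 1)
>          print(f"{sampler.__name__}: mean={mean:.2f}, var={var:.2f}")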
>
>    # When comparing alternative scenarios, calculate the relevant
>    contrasts with the same values of the input sample. This provides
>    the most efficient estimates of the scenario effects.
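>
>    A minimal sketch of evaluating two scenarios on the same input
>    sample, so that the contrast is estimated from paired differences;
>    the model and the size of the scenario effect are illustrative.
>
>      import random
>
>      def model(u, scenario_effect):     # stand-in for a real model
>          return 10.0 * u + scenario_effect
>
>      random.seed(1)
>      sample = [random.random() for _ in range(100)]
>      # Same input values u for both scenarios: a paired comparison.
>      diffs = [model(u, 1.0) - model(u, 0.0) for u in sample]
>      print("estimated scenario effect:", sum(diffs) / len(diffs))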
>
>    # Uncertainty analysis may be used (and regarded) as partial
>    validation: the total uncertainty about crucial model outputs
>    should be acceptable for the current application. Validation
>    through uncertainty analysis is only partial, because structural
>    uncertainty in the model is hardly ever described as 'input'
>    uncertainty.
>
>    # Large uncertainty contributions of individual inputs or groups of
>    inputs to model output indicate that it is worthwhile to know more
>    about these (groups of) inputs, whereas it is pointless to gain new
>    information about other inputs. Thus, uncertainty analysis provides
>    information to support decisions on research priorities.
>
>    # By the same token, uncertainty analysis provides support in the
>    selection of calibration parameters.
>
>    # Compare the estimated model uncertainty with the current empirical
>    uncertainty. Differences may be due to structural errors in the
>    model, or to errors in the presumed input uncertainty distribution,
>    such as omission of uncertain inputs, absence of correlations
>    between inputs, or erroneous specification of distributions.
>
>
>    * References
>
>    - General
>    Beck, M.B., 1987. Water quality modeling: a review of the analysis
>      of uncertainty. Water Resources Research 23, 1987: 1393-1442
>    Bouman, B.A.M., 1994. A framework to deal with uncertainty in soil
>      and management parameters in crop yield simulation; a case study
>      for rice. Agricultural Systems 46: 1-17
>    Hamby, D.M., 1994. A review of techniques for parameter sensitivity
>      analysis of environmental models. Environmental Monitoring and
>      Assessment 32: 135-154
>    Janssen, P.H.M., 1994. Assessing sensitivities and uncertainties in
>      models: a critical evaluation. In: Grasman, J. & G. van Straten
>      (Eds.). Predictability and Nonlinear Modelling in Natural Sciences
>      and Economics. Kluwer, Dordrecht. 344-361
>    Kleijnen, J.P.C. & W. van Groenendaal, 1992. Simulation: a
>      statistical perspective. Wiley.
>    Kremer, J.N., 1983. Ecological implications of parameter uncertainty
>      in stochastic simulation. Ecological Modelling 18: 187-207
>
>    - Technical aspects
>    Bouman, B.A.M. & M.J.W. Jansen, 1993. RIGAUS, Random Input Generator
>      for the Analysis of Uncertainty in Simulation. Simulation Report
>      CABO-TT, no. 34. AB-DLO. 26 pp + appendices.
>    Haness, S.J., L.A. Roberts, J.J. Warwick & W.G. Cale, 1991. Testing
>      the utility of first order uncertainty analysis. Ecol. Modell. 58:
>      1-23
>    Iman, R.L. & W.J. Conover, 1982. A distribution-free approach to
>      inducing rank correlation among input variables. Commun. Statist.
>      - Simula. Computa. 11(3): 311-334
>    Iman, R.L. & J.C. Helton, 1988. An investigation of uncertainty and
>      sensitivity analysis techniques for computer models. Risk Analysis
>      8: 71-90
>    Jansen, M.J.W., W.A.H. Rossing & R.A. Daamen, 1993. Monte Carlo
>      estimation of uncertainty contributions from several independent
>      multivariate sources. Conference predictability and nonlinear
>      modeling, Wageningen, April 1993.
>    Janssen, P.H.M., P.S.C. Heuberger & R. Sanders, 1993. UNCSAM 1.1: a
>      software package for sensitivity and uncertainty analysis. RIVM.
>    Lenthe, J. van, 1993. A blueprint of ELI: A new method for eliciting
>      subjective probability distributions. Behavior Research Methods,
>      Instruments & Computers 25(40): 425-433
>    McKay, M.D., R.J. Beckman & W.J. Conover, 1979. A comparison of
>      three methods for selecting values of input variables in the
>      analysis of output from a computer code. Technometrics 21: 239-245
>
>    - Weather generators
>    Geng, S., F.W.T. Penning de Vries & I. Supit, 1985. Analysis and
>      simulation of weather variables. Part II: Temperature and solar
>      radiation. Simulation report CABO-TT 5.
>    Racsko, P., L. Szeidl & M. Semenov, 1991. A serial approach to local
>      stochastic weather models. Ecological Modelling 57: 27-41
>
>    - Applications
>    Aggarwal, P.K., 1995. Uncertainties in crop, soil and weather inputs
>      used in growth models - implications for simulated outputs and
>      their applications. Agricultural Systems 48(3): 361-384
>    Blower, S.M. & H. Dowlatabadi, 1994. Sensitivity and uncertainty
>      analysis of complex models of disease transmission: an HIV model
>      as an example. Internat. Statist. Review 62(2): 229-243
>    Rossing, W.A.H., R.A. Daamen & M.J.W. Jansen, 1994. Uncertainty
>      analysis applied to supervised control of aphids and brown rust
>      in winter wheat. Part 1. Quantification of uncertainty in
>      cost-benefit calculations. Agricultural Systems 44: 419-448
>    Rossing, W.A.H., R.A. Daamen & M.J.W. Jansen, 1994. Uncertainty
>      analysis applied to supervised control of aphids and brown rust
>      in winter wheat. Part 2. Relative importance of different
>      components of uncertainty. Agricultural Systems 44: 449-460
>    Voet, H. van der & G.M.J. Mohren, 1994. An uncertainty analysis of
>      the process-based growth model FORGRO. Forest Ecology and
>      Management 69: 157-166
>
>
>                                 *   *   *
>
>                                CALIBRATION
>
>    * Definitions
>
>    - Calibration
>    The adjustment of some parameters such that model behaviour matches
>    a set of real-world data; it is a restricted form of parametrization
>    of models.
>
>    - Calibration criterion
>    A function of the parameter values and the calibration data that
>    provides a measure of the compatibility of the parameter values
>    with the data.
>
>    - Point calibration
>    A calibration that results in a single optimal parameter vector.
>    Many individual parameter vectors are often compatible with the
>    available calibration data, so that the point calibration may be
>    non-robust.
>
>    - Set calibration
>    A calibration that results in a set of parameter vectors compatible
>    with the calibration data.
>
>    - Distribution calibration
>    A calibration that results in a probability distribution of
>    parameter vectors compatible with the calibration data.
>
>    - Robust calibration
>    A calibration leading to results that are rather insensitive to
>    minor changes in the calibration data.
>
>
>    * Guidelines
>
>    # Ensure that the calibration method will never result in physically
>    impossible parameter vectors.
>
>    # Non-sensitive parameters are a major cause of non-robustness.
>    Sometimes such parameters are given a fixed typical value. Be aware
>    that the calibration results are conditional on the values of these
>    fixed parameters.
>
>    # Many calibration methods yield local optima of the criterion:
>    small changes from such an optimum give worse values of the
>    criterion, but further away better values may be realized. It is
>    advisable to apply such methods repeatedly with different starting
>    points, as in the sketch below.
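>
>    A minimal multi-start sketch: a local optimiser is restarted from
>    several random points and the best result is kept. The criterion
>    below is a toy function with two local optima; the local search
>    uses scipy.optimize.minimize.
>
>      import random
>      from scipy.optimize import minimize
>
>      def criterion(p):                  # toy criterion, two optima
>          x = p[0]
>          return (x * x - 4.0) ** 2 + 0.5 * x
>
>      random.seed(0)
>      best = None
>      for _ in range(10):                # repeated random restarts
>          x0 = [random.uniform(-5.0, 5.0)]
>          result = minimize(criterion, x0, method="Nelder-Mead")
>          if best is None or result.fun < best.fun:
>              best = result
>      print("best parameter:", best.x[0], "criterion:", best.fun)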
>
>    # Set calibration and distribution calibration may be advisable in
>    order to circumvent the problems with point calibration. These
>    methods, however, are less well developed, and are computationally
>    intensive.
>
>    # Regarding the calibration method to be chosen: use results from
>    a one-at-a-time parameter sensitivity analysis to check whether the
>    implicitly defined relations between state variables and parameters
>    are continuous or discontinuous, and linear or nonlinear. If the
>    model response is smooth, the model can be linearized, and fast
>    optimization procedures using locally linear approximation are
>    possible. If it is discontinuous, more robust calibration procedures
>    should be used.
>
>    # In the proposed calibration procedures, parameter probability
>    distributions, based on literature reviews or on well-documented
>    expert knowledge, are assumed to be available.
>
>    # Parameter choice is best based on a ranking of the model
>    parameters as to their contribution to output uncertainty.
>
>    # If the model is not embedded in a parameter estimating procedure,
>    calibration can be executed as follows: use sensitivity analysis to
>    analyse relations between state variables. Determine independent
>    subsystems, and calibrate the individual subsystems, taking care
>    that once a subsystem is calibrated, that subsystem is not modified
>    in subsequent calibration steps. Calibrate a single parameter from
>    each independent subsystem. This calibration method yields a point
>    estimate.
>
>    # If the model is embedded in an optimization procedure, calibration
>    can be executed as follows: choose parameters on the basis of their
>    contribution to the output uncertainty.
>
>    # Use a parameter estimation procedure in which parameter sets are
>    generated according to the distributions and correlations between
>    parameters established in the uncertainty analysis.
>
>    # Estimate the parameters simultaneously.
>
>    # The uncertainty of the parameters after calibration can be derived
>    under the following conditions: the model is correct, and the
>    non-calibrated parameters have a negligible effect on the output
>    uncertainty. To investigate the effect of non-calibrated parameters,
>    one should execute an uncertainty analysis.
>
>    # If a model for the measurement errors is available, and the
>    calibration criterion is based on it, one may execute a set or
>    distribution calibration. Both calibrations allow one to quantify
>    the total uncertainty about crucial model outputs after calibration.
>    This uncertainty should be acceptable for the application.
>
>    # If the above methods are not possible, calibration becomes a work
>    of art, which can yield good predictions, but provides no assessment
>    of prediction uncertainty.
>
>
>    * References
>
>    Aldenberg, T., J.H. Janse & P.R.G. Kramer, 1995. Fitting the
>      dynamical model PCLake to a multi-lake survey through Bayesian
>      statistics. Ecological Modelling 78: 83-99
>    Beven, K.& A. Binley, 1992. The future of distributed models: model
>      calibration and uncertainty prediction. Hydrological Processes 6:
>      279-298
>    Janssen, P.H.M. & P.S.C. Heuberger, 1995. Calibration of
>      process-orientated models. Ecological Modelling (to be published).
>    Keesman, K. & G. van Straten, 1989. Identification and prediction
>      propagation of uncertainty in models with bounded noise. Int. J.
>      control 49: 2259-2269
>    Klepper, O. & D.I. Rouse, 1991. A procedure to reduce parameter
>      uncertainty for complex models by comparison with real system
>      output illustrated on a potato growth model. Agricultural Systems
>      36: 375-395
>    Molen, D.T. van der & J. Pintér, 1993. Environmental model
>      calibration under different specifications: an application to the
>      model SED. Ecological Modelling 68: 1-19
>    Scholten, H. & M.W.M. van der Tol, 1994. SMOES: a Simulation Model
>      for the Oosterschelde EcoSystem. Part II: calibration and
>      validation. Hydrobiologia 282/283: 453-474
>    Straten, G. van & K.J. Keesman, 1991. Uncertainty propagation and
>      speculation in projective forecasts of environmental change - a
>      lake eutrophication example. J. of Forecasting 10: 163-190
>
>    ====================================================================
>    CAMASE: A CONCERTED ACTION FOR THE DEVELOPMENT AND TESTING OF
>    QUANTITATIVE METHODS FOR RESEARCH ON AGRICULTURAL SYSTEMS AND THE
>    ENVIRONMENT.
>
>    CAMASE is financially supported by the European Community Specific
>    Programme for Research, Technological Development and Demonstration
>    in the Field of Agriculture and Agro-industry, including Fisheries.
>
>    The objectives of CAMASE are to advance quantitative
>    research on agricultural systems and their environment in the
>    EU-countries, by improving systems research in participating
>    institutes through exchange and standardization of concepts,
>    approaches, knowledge, computer programs and data.
>    CAMASE relates to a small network of research groups, and
>    a broad group of scientists receiving information. The network
>    consists of scientists from five groups in Europe: Denmark
>    (Royal Veterinary and Agricultural University, Copenhagen),
>    France (Institut National de la Recherche Agronomique,
>    Toulouse), Spain (Cordoba University, Cordoba), Scotland
>    (Institute of Ecology and Resource Management, Edinburgh) and
>    The Netherlands (AB-DLO, TPE-WAU and SC-DLO, Wageningen).
>
>    With CAMASE_NEWS, we aim to improve communication among
>    scientists working in agro-ecosystem modelling and interested
>    in better access to appropriate models, data, and related
>    tools and instruction materials. CAMASE-core groups and others can
>    contribute spontaneously or will be invited to contribute.
>    Responsibility for the opinions expressed rests with the
>    authors.
>
>    CAMASE_NEWS will appear four times per year. Please submit
>    news items for CAMASE_NEWS and requests for new subscriptions
>    to:
>
>    F.W.T. Penning de Vries/M.C. Plentinger
>    DLO Research Institute for Agrobiology and Soil Fertility (AB-DLO)
>    P.O.Box 14
>    6700 AA  WAGENINGEN
>    The Netherlands
>    Telephone: +31.317.475961
>    Telefax: +31.317.423110
>    Internet: [log in to unmask]
>
>    After an e-mail request for subscription, you will receive a
>    form to give your address, which is necessary for postal
>    mailings.
>
>    ====================================================================
>
 
 =======================================================================
Gerrit Hoogenboom
Associate Professor
Department of Biological and Agricultural Engineering
The University of Georgia
Griffin, Georgia 30223-1797, USA
 
Phone:  770-228-7216
FAX:    770-228-7218
E-Mail: [log in to unmask]
 =======================================================================
