Saturday, November 15, 2008
Friday, November 7, 2008
UN ECOSOC, Resolutions, 15/08/2008
ECOSOC-UN Resolutions and Decisions, August 2008
United Nations E/2008/INF/2/Add.1
Economic and Social Council
Resolutions and decisions adopted by the Economic and Social Council at its substantive session of 2008
(30 June to 25 July 2008)
Note: This document reproduces, for information, the provisional texts of the resolutions and decisions adopted by the Council at its substantive session of 2008. The final texts will be published in Official Records of the Economic and Social Council, 2008.
Resolution 2008/32
Report of the Committee of Experts on Public Administration on its seventh session
The Economic and Social Council,
Recalling its resolutions 2002/40 of 19 December 2002, 2003/60 of 25 July 2003, 2005/3 of 31 March 2005, 2005/55 of 21 October 2005, 2006/47 of 28 July 2006 and 2007/38 of 4 October 2007, all on public administration and development,
Recalling also General Assembly resolutions 50/225 of 19 April 1996, 56/213 of 21 December 2001, 57/277 of 20 December 2002, 58/231 of 23 December 2003, 59/55 of 2 December 2004 and 60/34 of 30 November 2005, all on public administration and development,
Recalling further paragraph 11 of General Assembly resolution 60/1 of 16 September 2005,
Taking note with appreciation of the pioneering work of the United Nations Programme in Public Administration, Finance and Development in supporting Member States in administrative reform, public institution-building, training of civil servants and post-conflict reconstruction of public administrations over the sixty years since its establishment in 1948,1
Recognizing that, although the conditions and context of development and governance have changed, the priorities of public administration, including capacity-building for development and ownership of national development, remain cross-cutting issues of critical importance for the achievement of the internationally agreed development goals, including the Millennium Development Goals,
1. Takes note of the conclusions on the theme of capacity-building for development contained in the report of the Committee of Experts on Public Administration on its seventh session;2
2. Encourages Member States to continue strengthening their capacity to make better use of the various aid modalities3 and to enhance the understanding and use of capacity-building as a judicious combination of institutional development and human resources development,4 through which individuals, organizations, States and society as a whole develop and maintain their ability to manage public affairs well, by means of, inter alia, promoting public participation in governance and development processes,5 harnessing the potential of information and communications technologies to foster people-centred development, effectively combining decentralization and centralization policies, and building regional and national partnerships with public administration institutions to provide the necessary training;6
3. Stresses that capacity-building is important and necessary in administrative restructuring, civil service reform, human resources development and public administration training, improvement of public sector performance, financial management, public-private interaction, social development, infrastructure development and environmental protection, the legal and regulatory capacity of government, and the management and implementation of development programmes;7
4. Invites Member States to continue monitoring progress towards the internationally agreed development goals, including the Millennium Development Goals, and to compile a repertory of good administrative policies applied in support of those goals, including the required capacity, institutional development aspects and strategic visions for a modern public administration; and underlines that the United Nations system, in particular the Department of Economic and Social Affairs of the United Nations Secretariat and other relevant United Nations bodies, should support those efforts and encourage the sharing of best practices and lessons learned;
5. Underlines that capacity-building in public administration is of the utmost importance in all economies in transition, in achieving the internationally agreed development goals, including the Millennium Development Goals, in post-conflict rehabilitation and reconstruction, and in disaster and crisis management and preparedness; that capacity-building processes in those areas share several important characteristics and experiences concerning the interaction of the societal, systemic, organizational and individual levels of action; and that Member States should share those experiences more systematically and comprehensively;
6. Stresses that, in capacity-building for post-conflict recovery and reconstruction, continuity of public administration and public services, public sector coherence and a multi-stakeholder approach are important requirements, and that, in capacity-building for post-disaster and post-crisis situations, the United Nations system, in particular the Department of Economic and Social Affairs and other United Nations bodies, should support efforts to identify and share lessons learned and best practices;
7. Requests the Secretariat to increase its support for capacity-building,8 including in the public sector, by ensuring that available resources are adequate and that existing resource levels are maintained;
8. Also requests the Secretariat to maintain its focus on the United Nations Public Service Awards, the United Nations Online Network in Public Administration and Finance, the Network of Innovators, the World Public Sector Report and the Global Forum on Reinventing Government, and further requests the Secretariat to continue playing its effective role as facilitator of the implementation of the action lines of the Tunis Agenda for the Information Society;9
9. Takes note of the final stage of the work of the Committee of Experts on basic United Nations terminology on governance and public administration, following the revision of the proposed definitions;
10. Also takes note of the contribution of the Committee of Experts to the theme of the 2008 annual ministerial review: implementing the internationally agreed goals and commitments in regard to sustainable development.
44th plenary meeting, 25 July 2008; published 15 August 2008.
Notes
1 See General Assembly resolution 246 (III).
2 Official Records of the Economic and Social Council, 2008, Supplement No. 24 (E/2008/44).
3 See General Assembly resolution 59/250, para. 30.
4 See E/1997/86.
5 See resolution 2005/3, para. 4.
6 See Official Records of the Economic and Social Council, 2003, Supplement No. 44 (E/2003/44).
7 See A/50/525-E/1995/122.
8 See General Assembly resolution 60/1, para. 22 (f).
9 See A/60/687.
Sunday, October 19, 2008
Guidance notes for uncertainty
I note the following:
Uncertainty must be bounded, all the more so if it really is growing and is a natural by-product of complexity. To that end, these notes deserve consideration. They have several advantages: they are used in a field that expresses the Triple Helix paradigm better than any other; they apply to a topic as sensitive as climate change; they carry the endorsement of the UN; and they make explicit the close relationship between the social sciences and the exact sciences. To be reviewed (and so they are not forgotten... even though they are already three years old), here they are.
INTERGOVERNMENTAL PANEL ON CLIMATE CHANGE
WMO UNEP
July 2005
Guidance Notes for Lead Authors of the IPCC Fourth Assessment Report on Addressing Uncertainties
The following notes are intended to assist Lead Authors (LAs) of the Fourth Assessment Report (AR4) to deal
with uncertainties consistently. They address approaches to developing expert judgments, evaluating uncertainties, and communicating uncertainty and confidence in findings that arise in the context of the assessment process. Where alternative approaches are used in the relevant literature, those should be used but where possible related to the approaches given here. Further background material and more detailed coverage of these issues are available in the guidance paper on uncertainties developed for the Third Assessment Report [1] and the report of an IPCC Workshop on Uncertainty and Risk [2].
The working group reports will assess material from different disciplines and will cover a diversity of approaches to uncertainty, reflecting differences in the underlying literature. In particular, the nature of information, indicators and analyses used in the natural sciences is quite different from that used in the social sciences. WG I focuses on the former, WG III on the latter, and WG II covers both. The purpose of this guidance note is to define common approaches and language that can be used broadly across all three working groups. Each working group may need to supplement these notes with more specific guidance on particular issues consistent with the common approach given here.
Plan to treat issues of uncertainty and confidence
1. Consider approaches to uncertainty in your chapter at an early stage. Prioritize issues for analysis. Identify key policy relevant findings as they emerge and give greater attention to assessing uncertainties and confidence in those. Avoid trivializing statements just to increase their confidence.
2. Determine the areas in your chapter where a range of views may need to be described, and those where LAs may need to form a collective view on uncertainty or confidence. Agree on a carefully moderated (chaired) and balanced process for doing this.
Review the information available
3. Consider all plausible sources of uncertainty using a systematic typology of uncertainty such as the simple one shown in Table 1. Many studies have shown that structural uncertainty, as defined in Table 1, tends to be underestimated by experts [3]. Consider previous estimates of ranges, distributions, or other measures of uncertainty and the extent to which they cover all plausible sources of uncertainty.
Table 1. A simple typology of uncertainties

Unpredictability
- Indicative sources: projections of human behaviour not easily amenable to prediction (e.g. evolution of political systems); chaotic components of complex systems.
- Typical approaches or considerations: use of scenarios spanning a plausible range, clearly stating assumptions, limits considered, and subjective judgments; ranges from ensembles of model runs.

Structural uncertainty
- Indicative sources: inadequate models; incomplete or competing conceptual frameworks; lack of agreement on model structure; ambiguous system boundaries or definitions; significant processes or relationships wrongly specified or not considered.
- Typical approaches or considerations: specify assumptions and system definitions clearly; compare models with observations for a range of conditions; assess maturity of the underlying science and the degree to which understanding is based on fundamental concepts tested in other areas.

Value uncertainty
- Indicative sources: missing, inaccurate or non-representative data; inappropriate spatial or temporal resolution; poorly known or changing model parameters.
- Typical approaches or considerations: analysis of statistical properties of sets of values (observations, model ensemble results, etc.); bootstrap and hierarchical statistical tests; comparison of models with observations.
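Table 1 names bootstrap tests among the typical approaches to value uncertainty. As a minimal sketch of the idea (the data and helper function below are invented for illustration, not taken from the guidance note), one can resample a small set of observations with replacement to obtain a percentile range for an estimated quantity:

```python
import random

def bootstrap_percentiles(values, n_boot=2000, lo=5, hi=95, seed=0):
    """Bootstrap the sample mean: resample with replacement, then
    report the lo-th and hi-th percentiles of the resampled means
    as an uncertainty range (nearest-rank percentiles)."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    n = len(values)
    means = []
    for _ in range(n_boot):
        resample = [values[rng.randrange(n)] for _ in range(n)]
        means.append(sum(resample) / n)
    means.sort()
    return means[int(n_boot * lo / 100)], means[int(n_boot * hi / 100)]

# Hypothetical repeated observations of some quantity
obs = [1.2, 0.9, 1.4, 1.1, 1.3, 0.8, 1.0, 1.5, 1.2, 1.1]
low, high = bootstrap_percentiles(obs)
```

The same resampling logic extends to statistics other than the mean; the hierarchical tests also named in Table 1 require model-specific structure and are not shown.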
4. Assess issues of risk where supported by published work. Where probabilistic approaches are available, consider ranges of outcomes and their associated likelihoods with attention to outcomes of potential high consequence. An alternative approach is to provide information for decisions that would be robust in the sense of avoiding adverse outcomes for a wide range of future possibilities [4]. (Note that the term “risk” has several different usages. If used it should be defined in context.)
Make expert judgments
5. Be prepared to make expert judgments and explain those by providing a traceable account of the steps used
to arrive at estimates of uncertainty or confidence for key findings – e.g. an agreed hierarchy of information, standards of evidence applied, approaches to combining or reconciling multiple lines of evidence, and explanation of critical factors.
6. Be aware of a tendency for a group to converge on an expressed view and become overconfident in it [3]. Views and estimates can also become anchored on previous versions or values to a greater extent than is justified. Recognize when individual views are adjusting as a result of group interactions and allow adequate time for such changes in viewpoint to be reviewed.
Use the appropriate level of precision to describe findings
7. Assess the current level of understanding on key issues and precede statements on confidence or uncertainty
with a general summary of the corresponding state of knowledge. Table 2 below provides a consistent language for this.
8. Develop clear statements for key findings that are quantitative and give explicit time frames as far as possible. Define carefully the corresponding variables or outcomes, their context, and any conditional assumptions. Where scenarios are used, explain the range of assumptions and how they affect the outcome. Then consider the most appropriate way to describe the relevant uncertainties or level of confidence by going as far down the hierarchy given below as you feel appropriate (from expressions of less to more confidence and less to more probabilistic approaches) [5]:
A. Direction of change is ambiguous or the issue assessed is not amenable to prediction: Describe the governing factors, key indicators, and relationships. If a trend could be either positive or negative, explain the pre-conditions or evidence for each.
B. An expected trend or direction can be identified (increase, decrease, no significant change): Explain the basis for this and the extent to which opposite changes would not be expected. Include changes that have a reasonable likelihood even where they are not certain. If you describe a collective level of confidence in words, use the language options in Table 2 or 3.
C. An order of magnitude can be given for the degree of change (i.e. sign and magnitude to within a factor of 10): Explain the basis for estimates given and indicate assumptions made. The order of magnitude should not change for reasonable ranges in such assumptions. If you describe a collective level of confidence in words, use the language options in Table 2 or 3.
D. A range can be given for the change in a variable as upper and lower bounds, or as the 5th and 95th percentiles, based on objective analysis or expert judgment: Explain the basis for the range given, noting factors that determine the outer bounds. If you cannot be confident in the range, use a less precise approach. If you describe a collective level of confidence or likelihood of an outcome in words, use the language options in Tables 3 or 4.
E. A likelihood or probability of occurrence can be determined for an event or for representative outcomes, e.g. based on multiple observations, model ensemble runs, or expert judgment: State any assumptions made and estimate the role of structural uncertainties. Describe likelihoods using the calibrated language given in Table 4 or present them quantitatively.
F. A probability distribution can be determined for changes in a continuous variable either objectively or through use of a formal quantitative survey of expert views: Present the PDF graphically and/or provide the 5th and 95th percentiles of the distribution. Explain the methodology used to produce the PDF, any assumptions made, and estimate the role of structural uncertainties.
Communicate carefully, using calibrated language
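Items D and F in the hierarchy above summarize a distribution by its 5th and 95th percentiles. A minimal sketch, using an invented model ensemble of projected changes (both the values and the helper function are illustrative assumptions, not part of the guidance):

```python
def percentile(sorted_vals, p):
    """Linear-interpolation percentile of an already-sorted list."""
    k = (len(sorted_vals) - 1) * p / 100.0
    f = int(k)
    c = min(f + 1, len(sorted_vals) - 1)
    return sorted_vals[f] + (sorted_vals[c] - sorted_vals[f]) * (k - f)

# Hypothetical ensemble of projected changes in some variable
ensemble = sorted([1.8, 2.4, 2.1, 3.0, 2.7, 1.5, 2.2, 2.9, 2.0, 2.6])
p5 = percentile(ensemble, 5)    # lower bound of the reported range
p95 = percentile(ensemble, 95)  # upper bound of the reported range
```

With ten members, such outer percentiles lean heavily on the extreme values, which is one reason item D asks authors to explain the factors determining the outer bounds.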
9. Be aware that the way in which a statement is framed will have an effect on how it is interpreted [6]. (A 10% chance of dying is interpreted more negatively than a 90% chance of surviving.) Use neutral language, avoid value laden statements, consider redundant statements to ensure balance (e.g. chances of dying and of surviving), and express different but comparable risks in a consistent way.
10. To avoid the uncertainty perceived by the reader being different from that intended, use language that minimizes possible misinterpretation and ambiguity. Note that terms such as “virtually certain”, “probable”, or “likely”, can engage the reader effectively, but may be interpreted very differently by different people unless some calibration scale is provided [7].
11. Three forms of language are given in Tables 2, 3 and 4 to describe different aspects of confidence and uncertainty and to provide consistency across the AR4.
12. Table 2 considers both the amount of evidence available in support of findings and the degree of consensus among experts on its interpretation. The terms defined here are intended to be used in a relative sense to summarize judgments of the scientific understanding relevant to an issue, or to express uncertainty in a finding where there is no basis for making more quantitative statements. A finer scale for describing either the amount of evidence (columns) or degree of consensus (rows) may be introduced where appropriate; however, if a mid-range category is used, authors should avoid over-using it as a ‘safe’ option that communicates little information to the reader. Where the level of confidence is ‘high agreement, much evidence’, or where otherwise appropriate, describe uncertainties using Table 3 or 4.
Table 2. Qualitatively defined levels of understanding
(rows: level of agreement or consensus, increasing upward; columns: amount of evidence, i.e. theory, observations, models, increasing to the right)

High agreement, limited evidence | … | High agreement, much evidence
…                                | … | …
Low agreement, limited evidence  | … | Low agreement, much evidence
13. A level of confidence, as defined in Table 3, can be used to characterize uncertainty that is based on expert judgment as to the correctness of a model, an analysis or a statement. The last two terms in this scale should be reserved for areas of major concern that need to be considered from a risk or opportunity perspective, and the reason for their use should be carefully explained.
Table 3. Quantitatively calibrated levels of confidence

Terminology          | Degree of confidence in being correct
Very high confidence | At least 9 out of 10 chance of being correct
High confidence      | About 8 out of 10 chance
Medium confidence    | About 5 out of 10 chance
Low confidence       | About 2 out of 10 chance
Very low confidence  | Less than 1 out of 10 chance
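Once a collective chance of being correct has been agreed, Table 3's wording can be applied mechanically. The function below is an illustrative assumption: it turns the table's indicative values into sharp thresholds, which the guidance itself does not prescribe:

```python
def confidence_term(p_correct):
    """Map a judged chance of being correct (0..1) to Table 3
    terminology. Thresholds are approximate: the table gives
    indicative values, not sharp cut-offs."""
    if p_correct >= 0.9:
        return "very high confidence"
    if p_correct >= 0.8:
        return "high confidence"
    if p_correct >= 0.5:
        return "medium confidence"
    if p_correct >= 0.2:
        return "low confidence"
    return "very low confidence"
```

Per paragraph 13, the last two terms should be reserved for areas of major concern, so a low output from such a mapping still calls for an explicit explanation rather than silent reporting.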
14. Likelihood, as defined in Table 4, refers to a probabilistic assessment of some well defined outcome having occurred or occurring in the future. The categories defined in this table should be considered as having ‘fuzzy’ boundaries. Use other probability ranges where more appropriate but do not then use the terminology in Table 4. Likelihood may be based on quantitative analysis or an elicitation of expert views. The central range of this scale should not be used to express a lack of knowledge – see paragraph 12 and Table 2 for that situation. There is evidence that readers may adjust their interpretation of this likelihood language according to the magnitude of perceived potential consequences [8].
Table 4. Likelihood scale

Terminology            | Likelihood of the occurrence/outcome
Virtually certain      | > 99% probability of occurrence
Very likely            | > 90% probability
Likely                 | > 66% probability
About as likely as not | 33 to 66% probability
Unlikely               | < 33% probability
Very unlikely          | < 10% probability
Exceptionally unlikely | < 1% probability
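As with Table 3, the likelihood scale can be sketched in code, for instance to label the fraction of ensemble members exceeding a threshold. The sharp cut-offs below are an assumption made for illustration only; paragraph 14 stresses that the category boundaries are 'fuzzy':

```python
def likelihood_term(p):
    """Map a probability of occurrence (0..1) to the Table 4
    likelihood scale, applying the boundaries as sharp thresholds."""
    if p > 0.99:
        return "virtually certain"
    if p > 0.90:
        return "very likely"
    if p > 0.66:
        return "likely"
    if p >= 0.33:
        return "about as likely as not"
    if p >= 0.10:
        return "unlikely"
    if p >= 0.01:
        return "very unlikely"
    return "exceptionally unlikely"

# e.g. fraction of hypothetical ensemble members exceeding a threshold
frac = sum(1 for x in [2.1, 2.4, 1.8, 2.6, 2.9] if x > 2.0) / 5
term = likelihood_term(frac)  # 0.8 -> "likely"
```

The mapping partitions [0, 1] even though Table 4's ranges nest (anything "very unlikely" is also "unlikely"); authors choosing between overlapping terms would apply the judgment the table leaves open.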
15. Consider the use of tabular, diagrammatic or graphical approaches to show the primary sources of
uncertainties in key findings, the range of outcomes, and the factors and relationships determining levels of
confidence.
References

1. Moss, R., and S. Schneider. 2000. Uncertainties, in Guidance Papers on the Cross Cutting Issues of the Third Assessment Report of the IPCC, edited by R. Pachauri, T. Taniguchi, and K. Tanaka. Intergovernmental Panel on Climate Change (IPCC), Geneva.
2. Manning, M.R., M. Petit, D. Easterling, J. Murphy, A. Patwardhan, H-H. Rogner, R. Swart, and G. Yohe (eds). 2004. IPCC Workshop on Describing Scientific Uncertainties in Climate Change to Support Analysis of Risk and of Options: Workshop Report. Intergovernmental Panel on Climate Change (IPCC), Geneva.
3. Morgan, M.G., and M. Henrion. 1990. Uncertainty: A Guide to Dealing with Uncertainty in Quantitative Risk and Policy Analysis. Cambridge University Press, Cambridge, UK. (See particularly chapter 6, “Human judgment about and with uncertainty”.)
4. Lempert, R.J., S.W. Popper, and S.C. Bankes. 2003. Shaping the Next One Hundred Years: New Methods for Quantitative Long-Term Policy Analysis. RAND Corporation; and Lempert, R.J., and M.E. Schlesinger. 2000. Robust strategies for abating climate change. Climatic Change 45, 387-401.
5. Kandlikar, M., J. Risbey, and S. Dessai. 2005. Representing and communicating deep uncertainty in climate change assessments. Comptes Rendus Geoscience 337, 443-451. (NB: aspects of the hierarchy proposed above have been adapted from Kandlikar et al.; however, other aspects of the approach proposed by those authors differ from those given here.)
6. Kahneman, D., and A. Tversky. 1979. Prospect theory: an analysis of decision under risk. Econometrica 47, 263-291.
7. (e.g.) Morgan, M.G. 1998. Uncertainty analysis in risk assessment. Human and Ecological Risk Assessment 4, 25-; and Wallsten, T.S., D.V. Budescu, A. Rapoport, R. Zwick, and B. Forsyth. 1986. Measuring the vague meanings of probability terms. Journal of Experimental Psychology: General 115, 348-365.
8. Patt, A.G., and D. Schrag. 2003. Using specific language to describe risk and probability. Climatic Change 61, 17-30; and Patt, A.G., and S. Dessai. 2004. Communicating uncertainty: lessons learned and suggestions for climate change assessment. Comptes Rendus Geoscience 337, 425-441.
Wednesday, October 1, 2008