UNEP-SETAC LIFE CYCLE INITIATIVE
LIFE CYCLE INVENTORY (LCI) PROGRAMME
TASK FORCE 3: METHODOLOGICAL CONSISTENCY
Inventory methods in LCA:
towards consistency and
improvement
--Final Report
Date: June 2007
Sven Lundie, Andreas Ciroth and Gjalt Huppes
Foreword
Lead authors of this Task Force document are:
• Sven Lundie, School of Civil and Environmental Engineering / Centre for
Water and Waste Technology at the University of New South Wales, Sydney,
Australia;
• Andreas Ciroth, GreenDeltaTC GmbH, Berlin, Germany; and
• Gjalt Huppes, Institute of Environmental Sciences (CML), Leiden
University, Leiden, the Netherlands.
We would like to thank reviewers "on the way", particularly Chris Foster
(EuGeos), Bo Weidema (2.-0 LCA consultants) and Roland Hischier (Empa).
The authors would like to thank several referees of the final draft report who
provided very helpful comments. Addressing their comments increased the
quality of the final report significantly. The referees of the final draft document
were:
• Patrick Hofstetter and Nils Jungbluth (Section 2.1);
• Helias Udo de Haes (Section 2.2);
• Reinout Heijungs, Anders Schmidt and Mark Huijbregts (Section 2.3); and
• Masanobu Ishikawa (Section 3).
Executive summary
1 Summary and conclusions on selected methodological issues in LCI
1.1 Prospective and descriptive analysis. Modelling changes in LCA
A discussion of prospective and descriptive analysis leads, for LCAs, directly
to the discussion of attributional and change-oriented modelling. For this
reason, a scheme of recommended application should not deal with
prospective and descriptive analysis as such, but directly with the question of
attributional versus change-oriented modelling.
It was possible to develop a scheme in this sense. The scheme poses three,
rather straightforward, questions:
1. Is decision support embodied in the goal and scope of the analysis?
2. Is a change in the “status quo” embodied in any comparison being
studied?
3. Can that change be modelled with a net benefit?
The first two questions have, implicitly in most cases, been discussed in
previous literature. The third question is newly introduced here.
The questions are of a general nature. They aim at representing a consensus
among the whole LCA community, and to structure a more detailed
discussion and more elaborated guidelines. They will need to be discussed
and tested, while questions 2 and 3 will need to be detailed further. For
example, when should one assume that the status quo does not change?
How can one assess “costs and benefits” of modelling the change? What can
be modelled rather easily, and what seems excessive?
These questions have not been tackled in sufficient detail in previous
literature in a way that enables LCA practitioners to decide upon a suitable
change modelling method in a rational manner. They call for a “change
analysis” as a step in every LCA that aims at decision support, and for a
detailed “method cost benefit analysis”. The latter would best be undertaken
at a more generic, non-case specific level, with input from specific cases.
Neither of these forms of analysis yet exists; there are, however, several
threads that could be used as starting points. For example, the literature on
advantages and disadvantages of attributional modelling in comparison to
change-oriented modelling is rather broad (Ekvall et al., Weidema 2003,
Frischknecht 1998; see also Chapter 3). Several authors have presented
tools applicable for a change analysis (e.g. Weidema 2003), there is rich
literature and knowledge outside the LCA field, in statistics and advanced
modelling, decision theory, in game theory, and most specifically in the field of
prospective analysis.
There is not yet, however, a consistent “framework” that integrates both types
of assessment and modelling, change-oriented and attributional, in a
consistent manner. The application scheme described here aims to be, in this
long-ongoing discussion, a first step towards a consensus on modelling
change in LCA. Given how deeply the modelling of change affects LCA
results, and also the conclusions drawn from an LCA, such a consensus is
badly needed.
1.2 Multi-functionality and allocation in LCA
Based on the review of publications addressing methodological issues and
case studies it seems that the approach for dealing with multifunctional
processes suggested in the ISO framework (ISO 14044, 2006) is not
frequently followed in the practical application of LCA; ISO recommends, in
order of preference, 1) avoidance of allocation by subdividing unit processes
or expanding the system boundaries, 2) allocation based on underlying
physical relationships, and 3) allocation that reflects other relationships
(e.g. economic, energy or mass allocation).
In the majority of the reviewed case studies some sort of allocation
procedures are applied. However, the levels of detail and justification
provided for decisions about system boundary expansion or allocation are
inconsistent and incomplete in most published reports. The first two steps of
the ISO hierarchy have been less commonly applied than the third. The
methodological choice of dealing with multi-functional processes is generally
handled on a case-by-case basis. No generic procedure for multi-functional
processes in co-production, combined waste processing and recycling has
been defined yet.
There is general agreement that the system expansion approach is a very
attractive way to theoretically avoid the difficult problem of allocation
altogether. In that sense, system expansion simplifies modelling because it
limits the assumptions that the modeller needs to make. However, system
boundary expansion is only applicable for consequential, not for attributional
LCAs.
But broadening the system boundaries makes the process of data collection
much more extensive. System expansion inflates the system under study due
to the widespread occurrence of multi-functional processes. System boundary
expansion generally introduces new multi-functional processes; some sort of
allocation is often still needed in order to collect the necessary background
data. Hence, in practice, allocation can very seldom be totally avoided even
by system expansion. Furthermore, system boundary expansion is equivalent
to redefining the functional unit.
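The substitution logic behind system boundary expansion can be sketched in a few lines. The combined heat and power (CHP) setting and all numbers below are purely illustrative, not taken from the report: the co-product is assumed to displace an equivalent product elsewhere, and the displaced product's burden is credited to the system under study.

```python
# Sketch of system expansion by substitution ("avoided burden"):
# the co-product is assumed to displace an equivalent product elsewhere,
# and the displaced product's burden is credited to the system studied.
# All numbers are hypothetical.

def expanded_burden(process_burden, coproduct_amount, displaced_burden_per_unit):
    """Burden of the main function after crediting the co-product.

    process_burden: total burden of the multi-functional process
    coproduct_amount: amount of co-product delivered
    displaced_burden_per_unit: burden of the product it is assumed to displace
    """
    credit = coproduct_amount * displaced_burden_per_unit
    return process_burden - credit

# CHP example: heat is the function studied, electricity the co-product.
# Credit the electricity with the burden of the grid power it displaces.
net = expanded_burden(process_burden=900.0,          # kg CO2-eq
                      coproduct_amount=500.0,        # kWh electricity
                      displaced_burden_per_unit=0.6) # kg CO2-eq per kWh
print(net)  # 600.0
```

Note how the result depends entirely on what is assumed to be displaced; this is exactly the unspecified-reference problem discussed later in the report.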
In practice all types of allocation are applied, i.e. physico-chemical, economic,
mass and energy allocation. Economic allocation is most commonly used in
situations where there is co-production; it seems to be the preferred approach
and is perceived to be the best avenue to capture the downstream recycling
activities. However, no generic procedure for multi-functional processes in co-production, combined waste processing and recycling has been defined yet.
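The arithmetic of economic allocation in a co-production situation is simple to illustrate. The following sketch partitions a process burden by revenue share; the dairy-style co-products, prices and burden figures are hypothetical:

```python
# Illustrative sketch: economic allocation of a multi-functional process.
# All figures are hypothetical; a real study would take amounts, prices
# and burdens from the inventory and from market data.

def economic_allocation(outputs, total_burden):
    """Split a process burden over co-products by revenue share.

    outputs: dict mapping co-product name to (amount, price_per_unit)
    total_burden: the unallocated burden of the process (e.g. kg CO2-eq)
    Returns a dict mapping each co-product to its allocated burden.
    """
    revenue = {p: amount * price for p, (amount, price) in outputs.items()}
    total_revenue = sum(revenue.values())
    return {p: total_burden * r / total_revenue for p, r in revenue.items()}

# One process yields milk and a meat co-product.
shares = economic_allocation(
    {"milk": (1000, 0.30), "meat": (50, 2.00)},  # (units, price per unit)
    total_burden=400.0,                           # kg CO2-eq for the process
)
# revenues: milk 300, meat 100 -> milk carries 75% of the burden
print(shares)  # {'milk': 300.0, 'meat': 100.0}
```

The sketch also makes the fragility visible: a change in either price shifts the allocation, which is the price-fluctuation objection raised in the recommendations below.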
Based on the literature review the following recommendations can be made:
• Link methodological choices closely to the Goal and Scope Definition: It
seems to be a recurring theme that methodological choice needs to fit
closely with the goal of the study where the intentions of the study are
outlined. In the Goal and Scope Definition questions are answered, such
as why is the study commissioned, for what purpose, who is the target
audience etc. These issues are very likely to have a direct impact on
methodological choices. Hence, a closer link of the methodological
choices in multi-functional situations to Goal and Scope Definition can be
recommended, particularly in consequential LCAs. The justification of
choices should be explicit and transparent. Standard guidance on how to
describe and justify system boundary expansion and allocation decisions
in published reports might help to make LCA studies with multi-functional
processes more robust and transparent.
• Rethink the ISO preference order of allocation procedures: As the
suggested ISO preference order does not seem to be applied in practice,
and in view of the practical difficulties of both system boundary expansion
and various types of allocation methods, it might be worthwhile to consider
moving system expansion from Step 1b to Step 3 in ISO 14044 in order to
put system expansion on the same level as the use of economic and other
causalities. Furthermore, economic relationships seem to be at least as
important as physical relationships in practice. Some authors recommend
economic allocation as a baseline method for most detailed LCA
applications, because it seems the only generally applicable method.
However, this goes against ISO 14044, and allocation on this basis is still
susceptible to various uncertainties, such as (locally) fluctuating prices,
demand, inflation, tariffs, industry subsidies etc. In either case, physico-chemical
allocation seems to be the preferred approach if sufficient
information is available.
• Develop industry-specific allocation procedures: It could be assumed that
no generic procedure for all multi-functional processes in co-production,
combined waste processing and recycling is definable. Hence, more effort
needs to be invested in developing allocation procedures appropriate to
specific industry sectors; if possible, physico-chemical ones.
1.3 Input data quality, data validation, uncertainty in LCA
Identifying consistencies is perhaps especially difficult in the field of data
quality and uncertainty. The papers analysed agree on only two things:
firstly, there is broad criticism of inconsistent nomenclature and of the
different uses of important terms such as uncertainty, and of a "general
infancy" of the methodology (interestingly, this statement can be found in
papers from 1996 through to 2005); secondly, there is consensus that
uncertainty assessment should be applied broadly, and that this is not yet the
case. These general statements still hold, although the situation has improved in
recent years. Data quality assessment for datasets is indeed applied in
commonly used LCI databases, and both Monte Carlo simulation and a
"pedigree matrix" approach, which quantifies qualitative assessment
information, have been applied with considerable success.
This text identifies six stages in the conduct of an LCA:
(1) specification of the goal and scope of the analysis;
(2) input data specification and collection;
(3) calculation of the LCA study;
(4) obtaining the result of the study as output;
(5) interpretation, and perception of the result by the audience and decision
makers;
(6) decision / action taken or initiated by the decision maker.
Based on these stages, the text suggests a top-down approach, starting from
effects in the real world and from the general characteristics of a good
decision. As a consequence, the analysis of how to provide good decision
support by an "improved" LCA should not stop at the model result stage
(stage 4), but should consider how the result is perceived, and how decision
makers react when perceiving it.
For the question of whether to address uncertainty or not, the text provides a
quite general answer: Uncertainty must be addressed if it is relevant for the
decision at stake, and this is the case if the uncertainty is high, or if it is
relatively higher in one alternative than in the other, or if the magnitude of the
uncertainty is of a similar order to the magnitude of the differences between
compared systems.
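The "similar order of magnitude" criterion can be made concrete with a small Monte Carlo sketch. The lognormal parameters below are hypothetical; the point is that when the distributions of two compared systems overlap, the ranking itself becomes uncertain:

```python
# Sketch: Monte Carlo comparison of two alternatives whose inventory
# results are uncertain. Lognormal medians and geometric standard
# deviations (GSD) are hypothetical.
import random
import math

random.seed(42)

def sample_result(median, gsd, n=10000):
    """Draw n lognormal samples given a median and a geometric std dev."""
    mu, sigma = math.log(median), math.log(gsd)
    return [random.lognormvariate(mu, sigma) for _ in range(n)]

a = sample_result(median=100.0, gsd=1.2)  # alternative A, kg CO2-eq
b = sample_result(median=110.0, gsd=1.3)  # alternative B, kg CO2-eq

# Fraction of runs in which A really is the lower-impact alternative:
p_a_better = sum(ai < bi for ai, bi in zip(a, b)) / len(a)
print(f"A better than B in {p_a_better:.0%} of runs")
```

With these parameters A wins in only roughly six runs out of ten, so the 10% difference in medians would be a weak basis for a decision — exactly the situation in which the text says uncertainty must be addressed.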
Verification and validation are, or should be, prime concerns for any modeller.
The verification process checks whether the model calculates its results in a
technically correct manner, while validation is concerned with whether the
model actually models what it should. Validation is barely used for LCAs
today; one reason being that it is difficult to apply for life cycle impacts. This
has the somewhat surprising effect that the specific result of an LCA is of
minor importance compared to the selected approach, and compared to
agreement being reached among stakeholders. Seeking possible “entry
points” for a validation into an LCA product model would be well worthwhile,
and would turn Life Cycle Assessment modelling into a more scientific
approach.
Data quality indicator lists are often comparable between different authors.
Yet there seems to be far less consensus about their definition, and even less
about their application. How to deal with trade-offs between different
indicators is rarely discussed. Practical guidance would be of value, both on
selection and practical use. From the different lists and concepts, the
“pedigree matrix” seems especially attractive; it has the appeal of combining
human judgement and hard facts into quantitative values in a clear and
transparent way.
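The combination of judgement and hard facts can be sketched in a few lines. The following is a simplified illustration of the pedigree idea only: qualitative scores are mapped to uncertainty factors and combined into one geometric standard deviation. The factor table is invented for the example and is not the published ecoinvent table.

```python
# Simplified sketch of the pedigree idea: qualitative scores (1 = best,
# 5 = worst) on a few data quality indicators are mapped to uncertainty
# factors, which are combined into one geometric standard deviation (GSD).
# The factor table below is illustrative, not the published one.
import math

FACTORS = {  # uncertainty factor per score, per indicator (hypothetical)
    "reliability":  {1: 1.00, 2: 1.05, 3: 1.10, 4: 1.20, 5: 1.50},
    "completeness": {1: 1.00, 2: 1.02, 3: 1.05, 4: 1.10, 5: 1.20},
    "temporal":     {1: 1.00, 2: 1.03, 3: 1.10, 4: 1.20, 5: 1.50},
    "geographical": {1: 1.00, 2: 1.01, 3: 1.02, 4: 1.05, 5: 1.10},
}

def combined_gsd(scores, basic_gsd=1.05):
    """Combine pedigree scores with a basic uncertainty into one GSD.

    The squared logs of the individual factors (lognormal variances)
    are added, then the sum is exponentiated back to a GSD.
    """
    var = math.log(basic_gsd) ** 2
    for indicator, score in scores.items():
        var += math.log(FACTORS[indicator][score]) ** 2
    return math.exp(math.sqrt(var))

gsd = combined_gsd({"reliability": 3, "completeness": 2,
                    "temporal": 4, "geographical": 1})
print(round(gsd, 3))  # 1.237
```

The resulting GSD can feed directly into a Monte Carlo simulation, which is what makes the pedigree approach attractive as a bridge from qualitative judgement to quantitative uncertainty.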
For many of the methods considered, this paper does not provide
recommendations. Quite often, the conclusion is that further work is required.
This is not highly satisfactory, and might appear to be a common reflex in
scientific papers. However, following on from the proposal of the six stages in
an LCA application, and of a top-down approach that starts where uncertainty
and data quality really matter (at the point of considering the effects on the
decision to be supported by the LCA), it is astonishing how little indeed has
been done.
The overall picture of data quality, uncertainty, validation and verification
provided in this text is new. It is hoped that it will serve to identify consensus
and recommended application procedures, and thus provide practical
guidance, leading towards consistency and improvement, even in the field of
data quality and uncertainty.
2 Summary and conclusions on advancing life cycle modelling
The limitations of simple ISO LCA for decision support are substantial. The
LCI part is a static model without any dynamics incorporated. Behavioural
mechanisms, including market mechanisms, are absent. Processes refer to
the past instead of the future. Spatial differentiation is mainly lacking.
However, by being so simple, LCA has the advantage of being operational.
The problems of consistency relate to the current limitations, of which many
are keenly aware and which we would dearly like to overcome. There is a
tendency to use the quite limited static LCI model to indicate dynamics. It
would be a great improvement if either the static nature of LCA were
acknowledged, with simple and clean comparative static analysis, or if a
(daring!) choice of dynamic modelling as the norm were made.
One discussion in this vein is centred around the issue of rebound effects. In
many situations there are clear indirect effects which, as rebounds, can
qualify the normal LCA outcomes - both negatively, as with high efficiency
light bulbs leading to new energy intensive applications, and positively, as
with IT services reducing travelling. These mechanisms are linked
haphazardly now, either in a comparative static or a loosely dynamic
framework. They should rather be part of a more systematic approach to
deepened forms of life cycle analysis, in the first instance still of a
comparative, static type but which could, in due time, be linked to dynamic
modelling when relevant mechanisms and appropriate data have been
developed.
Remaining within the realm of comparative, static analysis does not
necessarily mean that we should stick to current LCI. More mechanisms may
be added in static models as well, market models being an important
example. For all such variants, clarity about what is being compared is
essential. When several technology systems may produce the same function,
these can be compared on an equal footing. In contrast, the emerging trend
to make implicit comparisons with an unspecified reference situation, by
assuming substitution to take place relative to it, is a major cause of
inconsistency. If an LCA involves comparison
with a current situation, that situation should be specified on an equal footing
with the other alternatives under study.
The term ‘substitution’, used in the context of allocation, suggests an
economic mechanism - normally based on market mechanisms and
especially on elasticities of supply and demand. These may add one layer of
realism to the analysis, and also one layer of complexity. Considering market
reactions is clearly highly relevant to improving the realism of any assessment
of the consequences of choices. Doing this systematically is therefore a
requirement, firstly finding comparative static solutions, with dynamic analysis
coming “later”, if at all.
If these market mechanisms are incorporated in an LCA, they should be used
explicitly and systematically. Claiming that “substitution” is being carried out,
failing to analyse it thoroughly, and then modelling this not-quite-substitution
only partially, creates substantial inconsistency at present. In short: consistency in LCI
can be much improved. This can be done either by specifying better the
purely technology-based simple LCA, or by developing a broader comparative
static framework involving main market mechanisms. Such options for
deepening life cycle based analysis are probably feasible now,
computationally as well as conceptually, but have not yet been developed
empirically. It will not be possible to go all the way to computable general
equilibrium (CGE) models, because the data requirements and the
computational power needed are too great if technological detail is to be
realised. Partial equilibrium modelling is
the best target at this time, with choices about how “partial” being essential for
the outcomes and for interpreting the outcomes.
Closer to home, LCI/LCA can be much improved if the nature of current
modelling is clarified, not only in terms of what comparative static analysis is
about but also in terms of specifying the questions asked and linking the
answers to the questions. For more strategic technology questions, for
instance relating to new energy sources and transformation routes, the time
horizon of decisions is up to decades. Persevering in the use of data that
describes existing processes for such analysis then increasingly becomes the
wrong approach, linking to the past instead of to the relevant future. As the
future is not fully determined, technology scenarios then become important,
specifying consistent sets of future technologies as background for other
technology choices investigated. If wind power, clean coal and solar energy
emerge as dominant electricity technologies, low energy light bulbs, with
notable environmental burdens in their production and end of life, become
less attractive.
Moving to dynamic analysis at the level of detail required in technology-specific
LCI is currently not feasible. Some dynamic elements are present in
macro-level energy modelling, roughly linked to major technologies, as
applied in general equilibrium models (GEM, also referred to as CGE:
computable general equilibrium models). These models have an equilibrium
part with market mechanisms, and a time dependent part in which
technologies develop due to investment in new types, or other dynamic
mechanisms. Though not specifiable at sufficient detail for the purpose of
comparing different technology alternatives that could deliver a functional unit,
they may play a role in background process specification for LCI, as separate
but linked models. This may become more relevant if these general
equilibrium models are themselves developed to embody more technological
detail. Currently they represent the economy mostly at a 20-30 sector level of
detail. Input-output databases with more sectoral detail are being developed,
moving towards the level of around one hundred sectors, with even up to 500
sectors. The link to specific technologies, as required in LCA, then becomes
much more meaningful.
The detailed IO tables with broad environmental extensions (EIOA) that are
emerging can be linked to current LCI in two different ways. One way is to
use them to solve some of the data problems in LCI, incorporating
background data based on such IO tables in a tiered hybrid analysis. This
analysis is mathematically fully equivalent to current LCI, as matrix inversion.
However, a whole new domain of life cycle analysis can be developed, not
linked to a functional unit of arbitrary size but to full totals in society. The
system analysed in technological detail is fitted into the sectoral framework
with total demand for the function specified in the context of total demand in
society. This analysis has the big advantage that the link to sustainability
aims, which are not at the level of product systems but at the level of society,
can be made directly. This integrated hybrid analysis (IHA) makes the link
from the micro to the macro level of analysis. If the analysis were next
extended to market mechanisms, as partial equilibrium analysis, the
specification in the integrated hybrid analysis could function as background
for the choice of which partial markets to model: the most relevant ones.
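The "matrix inversion" view of LCI referred to above can be sketched concretely: the technology matrix A links unit processes, the demand vector f specifies the functional unit, solving A s = f gives the scaling vector s, and the intervention matrix B yields the inventory g = B s. The two-process system and its numbers are purely illustrative.

```python
# Sketch of the matrix view of LCI: solve A s = f for the scaling vector s,
# then compute the inventory g = B s. A tiny 2x2 system is solved by
# Cramer's rule to keep the sketch dependency-free; the numbers are
# hypothetical.

def solve_2x2(A, f):
    """Solve a 2x2 linear system A s = f by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    s0 = (f[0] * A[1][1] - A[0][1] * f[1]) / det
    s1 = (A[0][0] * f[1] - f[0] * A[1][0]) / det
    return [s0, s1]

# Process 1 makes 1 unit of product and consumes 2 kWh of electricity;
# process 2 makes 1 kWh of electricity.
A = [[1.0, 0.0],   # product: made by process 1
     [-2.0, 1.0]]  # electricity: consumed by process 1, made by process 2
f = [10.0, 0.0]    # functional unit: 10 units of product, no net electricity

s = solve_2x2(A, f)             # how often each unit process runs
B = [[0.5, 0.3]]                # kg CO2-eq per unit operation of each process
g = [sum(bij * sj for bij, sj in zip(row, s)) for row in B]
print(s, g)  # [10.0, 20.0] [11.0]
```

In a tiered hybrid analysis the same computation applies, with IO-derived background sectors simply appearing as additional rows and columns of A and B; that is the sense in which the hybrid analysis is "mathematically fully equivalent" to current LCI.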
Table of contents
Executive summary
1 Introduction
2 Selected methodological issues in Life Cycle Inventory Analysis
2.1 Prospective and descriptive analysis. Modelling changes in LCA
2.1.1 Motivation
2.1.2 Descriptive analyses, scenarios, and prospective analyses in general literature
2.1.3 Life Cycle Assessment specific: Descriptive analysis, prospective analysis, scenarios and change in LCA models
2.1.4 Modelling changes in Life Cycle Assessments
2.1.5 Towards a recommended application scheme
2.1.6 Questions in the scheme
2.1.7 A recommended application scheme
2.1.8 Conclusions
2.2 Multi-functionality and allocation in LCA
2.2.1 Categorisation of multi-functional unit processes
2.2.2 International Organization for Standardization
2.2.3 System boundary expansion
2.2.4 Allocation
2.2.5 Case studies and guidelines – a literature overview
2.2.6 Structured approach for dealing with multi-functional unit processes
2.3 Data quality, validation, uncertainty in LCA
2.3.1 Introduction
2.3.2 Uncertainty
2.3.3 Data quality
2.3.4 Verification and validation
2.3.5 Concluding remarks
3 Advancing life cycle modelling
3.1 Introduction
3.2 Advancing life cycle modelling in LCA
3.3 Rebound mechanisms and modelling challenges
3.3.1 Rebound mechanisms
3.3.2 From rebound mechanisms to modelling challenges
3.4 Process selection and results in LC inventory analysis
3.4.1 Process selection
3.4.2 The nature of results
3.5 Time in sustainability modelling: main options surveyed
3.5.1 Steady state equilibrium models
3.5.2 Non-steady state models for LC inventory analysis?
3.5.3 Non-steady state static equilibrium models
3.5.4 Non-steady state dynamic models
3.6 Hybrid modelling for LCI
3.6.1 Modelling principles
3.6.2 Tiered Hybrid LCA
3.6.3 Integrated Hybrid Analysis
3.7 Mathematical structure of LCA models
3.8 Conclusions on advances in Life Cycle Inventory modelling
4 Summary and conclusions on methodological consistency
4.1 Summary and conclusions on selected methodological issues in LCI
4.1.1 Prospective and descriptive analysis. Modelling changes in LCA
4.1.2 Multi-functionality and allocation in LCA
4.1.3 Input data quality, data validation, uncertainty in LCA
4.2 Summary and conclusions on advancing life cycle modelling
5 References
List of Figures
Figure 1: Relevance of different future research methods in relation to the applications of LCA (Weidema 1998, Pesonen et al. 2000)
Figure 2: Complete, substantial and marginal change (Azapagic and Clift 1999), taken from (Ekvall 1999), modified
Figure 3: Attributional LCA as ‘slice of a pie’, and change-oriented (or consequential) LCA as a change of the original system (Weidema 2003, p. 15)
Figure 4: Undue sophistication raises the overall error in a model (Ciroth 2004; based on SRU: Umweltgutachten 1974, Stuttgart 1974, p. 208, modified)
Figure 5: Application scheme as guidance towards attributional and change-oriented LCA modelling; for further explanations see text
Figure 6: Accounting for co-products through system expansion
Figure 7: Schematic diagram for describing system expansion and delimitation of joint production
Figure 8: Decision flow diagram for identifying and handling multi-functionality situations (Guinée et al. 2004)
Figure 9: “Input data leading to output data by being fed through the model” (van den Berg et al. 1999, p. 4)
Figure 10: The LCA model results together with their perceived quality influence the choices inspired by the model; and these, in turn, are the practical effects of the LCA model (van den Berg et al. 1999, p. 4)
Figure 11: Six stages from scope to the effects of a decision supported by LCA
Figure 12: Verification and validation for an LCA case study (Ciroth 2002, modified)
Figure 13: Scheme for the analysis of ‘data inaccuracy’ in LCI (Huijbregts et al. 2001, p. 130; screenshot from the original source)
Figure 14: Overview of the internal review and data quality control within the ecoinvent project (Frischknecht and Jungbluth 2003, p. 54)
Figure 15: Patchwork of expertise in the evaluation of the quality of an LCA dataset (Ciroth et al. 2006, modified)
Figure 16: Example of a plausibility calculation, for a wood co-combustion process in coal power plants with reference years of 2000, 2010, 2020 and 2030 (see also Ciroth et al. 2006)
Figure 17: Constant technology systems specified as a steady state time slice (“snapshot”) in time
List of Tables
Table 1: Features of science, political analysis and life cycle assessment in comparison
Table 2: Comparison of proposed data quality indicators for Life Cycle Assessment from various references (Ciroth and Srocka 2005; modified)
Table 3: Mechanisms missed in simple LCAs and options for linking them in or to expanded LCAs
Table 4: Time in sustainability modelling
Table 5: Four main options for functional units in LCA and EIOA combinations
TF3 Methodological consistency
1 Introduction
The aim of this report is to clarify a number of methodological issues in LCI
modelling, as part of LCA. Some of these issues have been on the agenda for
a long time, without coming to generally accepted solutions, and often not
even to agreement on alternative approaches. Examples are the nature of
LCA in terms of prospective and descriptive analysis; multifunctionality and
allocation; system boundary principles; data selection and data categories;
and the validity and reliability of LCI results and, based on these, of LCA results.
Several of these subjects have a long history of debate. Therefore, it can
hardly be expected that the final solution to these issues can be framed in this
document. What is possible however is to survey positions in these fields and
see if a direction for solution can be indicated, possibly relative to specific
applications.
Several of these subjects relate to the specific nature of LCA as a relatively
simple method for decision support, kept simple to make it operational also for
'small' users. LCA is unique in this respect in that even small firms and
consultants can make LCAs for supporting their choices. Other approaches to
sustainability analysis are either qualitative principles, as concepts, or are
based on complex methods and models, to be maintained by public research
institutes and difficult to interpret. So, the LCA method can be simple and
practicable, but at a cost. It seems that most discussion is centred around the
fact that the limitations inherent in simple modelling are becoming felt as
undue restrictions. We are not content with descriptive attributional LCA when
what we want is a view on what will happen, that is a prospective, dynamic
view. Might LCA, the simple method and modelling technique as standardised
by ISO and SETAC, evolve into a more realistic but more complex dynamic
model? It seems that clarifying such approaches for more complex modelling
is a useful exercise. Firstly, we might become content with the simple LCA in
many applications, accepting the limitations and accepting the divergence of
methods as unavoidable arbitrariness due to simplification. Secondly, we may
better understand the limitations of the current mostly simple LCA, by setting
it against other types of modelling, with advantages and disadvantages. Next,
some of these limitations may not be a necessity of nature but a matter of
further development, leading to 'New LCA', deepened in mechanisms and
broadened in content and applications. And finally, we may of course see the
grand vistas of more complex and more realistic types of modelling, coming
closer with better computational options and better data bases. This would go
beyond LCA, unless the meaning of LCA would be broadened, around its
central meaning of covering the life cycle of function systems.
Page 1
TF3 Methodological consistency
So, the report before you has two main parts:
• Part 1, i.e. Chapter 2, is about current LCA and how it may be advanced on
a number of methods issues. While for the multifunctionality subject
convergence seems quite impossible and clarification the best achievable, an
investigation of prospective and descriptive analysis in LCA yields a rather
straightforward application scheme of recommended practice. Looking into
the connected issues of data quality, uncertainty and validation reveals the
current lack of validation approaches, which in turn deprives data quality and
uncertainty treatment of most of their empirical relevance.
• Part 2, i.e. Chapter 3, is about how sustainability analysis might advance,
not within the current limitations of ISO LCA, but by evolving new modes of
analysis which incorporate mechanisms not now present in LCA. Some
problems with acrimonious debate in LCA might be resolved at this more
complex modelling level. For example, by incorporating some economic
market mechanisms into LCA, yielding a very different LCA from the current
simple one, the discussions on substitution can link to the established domain
of market analysis. Also, the current development of cost-benefit analysis and
dynamic equilibrium models, which are becoming more applicable in the
domain of LCA, should not be opposed but integrated into improved
sustainability analysis. Part 2 should function as a framework for improving
sustainability analysis: as LCA, as broadened and deepened LCA, as 'New
LCA' or 'beyond LCA'.
However, the starting point for this modelling-oriented part remains the LCA
we know, in all its shades. The result is not a set of new models, but a
framework in which such possible models can be placed. The authors want to
emphasise that the content of this report is evolving rapidly in this dynamic
field; hence, what is presented here can only be considered a 'snapshot in
time'. The contents of this document should also be seen in the context of the
work of the other Task Forces of the Life Cycle Inventory programme.
2 Selected methodological issues in Life Cycle Inventory Analysis
2.1 Prospective and descriptive analysis. Modelling changes in LCA
Corresponding author: Andreas Ciroth, GreenDeltaTC, Berlin
2.1.1 Motivation
Life Cycle Assessment is a technique for decision support. There is
consensus that a Life Cycle Inventory model should, in principle, closely
reflect the decision situation. Decisions often change current situations.
However, LCA models often reflect a current state, even in cases where the
decision under study might address a different situation.
Despite some effort1, there is little consensus about how LCA modelling
should be done if the status quo changes. In fact, it is often not even
investigated whether the status quo will change as a result of the decision
that is to be supported. Changes of the status quo are likely if future options
are modelled, or if the consequences of the decision reach so far that they
potentially change the current state. It is of course impossible to know exactly
what these changes will be, but it is often desirable to use modelling to
assess what they are likely to be, or to understand the range of possibilities.
For considering changes in LCA, a veritable zoo of approaches and
terminologies exists. An easy, and common, choice is to ignore possible
changes in the model. This may lead to a model that no longer reflects the
decision situation and that, in consequence, does not provide good decision
support.
The aim of this chapter is to shed light, and as far as possible to provide
guidance, on a sound use of approaches for modelling changes in Life Cycle
Assessments, drawing also from experiences gained in other fields of
science.
First, a literature review will explore approaches and insights in Life Cycle
Assessment and in policy analysis. The review material will be analysed for
common elements, overlaps and inconsistencies, with the aim to propose an
application scheme for modelling changes in LCAs. The developed
application scheme is drafted in a flow chart.
1 E.g. (Curran et al. 2001), a workshop on electricity modelling, and a Danish LCA methodology and
consensus creation project, supervised by the Danish EPA 1997-2001; see (Weidema 2003).
2.1.2 Descriptive analyses, scenarios, and prospective analyses in
general literature
This section will provide a brief literature review of how descriptive analyses,
scenarios, and prospective analyses are understood in general (modelling)
literature, outside of the LCA world. The next section will then concentrate on
LCA specific references.
A descriptive analysis is a careful description of a situation, or of “how
something is”. The extent of the analysis is, or should be, determined by the
goal and scope of the description exercise. The analysis may include the
calculation of indicators, and the production of graphics and tables for the
presentation and communication of findings. Descriptions are one of the core
elements of science, and of day-to-day reasoning as well, defined as a
“discourse intended to give a mental image of something experienced“2.
Statistics and decision theory both provide a long track record of approaches,
insights, and applications for descriptive analyses. Descriptive statistics has
developed numerous approaches and indicators for describing various
aspects of random data (states, and processes that take place), with the
general aim of providing good decision support (e.g. Sachs 1992, pp. 11). In
decision theory, descriptive analysis is one of the main branches (e.g. Bell et
al. 1988), with the aim of describing “how real people actually behave”3. It is
contrasted with normative analysis (“how ideally decisions should be made”)
and with prescriptive analysis (“how real people could behave more
advantageously”; Raiffa 2002, Kahneman and Tversky 1988).
While the scope of descriptive analysis seems rather broad, there is one
major limitation. The focus clearly lies on the description, and one often tries
to keep interpretation and guidance separate from this rather fact-based
description. Evidently, this reduces the possible objects of study: one can, for
example, barely imagine a descriptive analysis of something that will happen
in the future.
There are numerous examples of descriptive analyses, both consulting
studies and scientific projects. Two studies may serve as examples. Prywes
et al. provide a description of the labour market in the US (Prywes 2000);
Greg and Greg study the remnants of people from various cultures and times
as found in Dakota in the US (Greg and Greg 1988). The latter study is
interesting because it shows that descriptive analyses may consider time
aspects, but that they are, in general, limited to the past and present, and,
again in general, to “material” that can be observed by the one who conducts
the analysis.
Prospective analyses deal with (future) prospects. They often start from
descriptive analyses, or build upon them. It is common sense, and widely
accepted, that the future is unknown and that future events bear uncertainty.
2 Merriam-Webster, website, http://www.m-w.com/dictionary/description, site accessed 21 April 2006.
3 All quotes from Raiffa 2002 (p. 10).
A common measure to cope with this uncertainty is the design and analysis of
scenarios that model possible, or at least interesting, future states, or even
sequences in time.
Several “schools” of prospective analysis may be distinguished.
In France, Godet and colleagues developed a theoretical framework and
toolbox for prospective analysis (e.g. Godet 1979, 1994, 2001 and 2006).
They define the term prospective as anticipation for elucidating today’s
actions4. Basically, they propose scenarios for modelling possible futures,
and a set of tools for exploring the all-too-large field of future possibilities and
for reducing the uncertainty in scenarios. To this toolbox belong common
tools like the Delphi method, but also other tools and techniques, specifically
developed or adapted, like “morphological analysis”5, “abaque de Regner”,
“Smic Prob Expert”, and others (Godet 2006, pp. 60).
Morphological analysis is a method for constructing scenarios in a systematic
way. For a given model, different model parameters are changed in a certain
manner and confronted with hypotheses about the future. The result is, first, a
matrix with hypotheses as columns and parameters as rows, each matrix cell
indicating the parameter changes for the specific hypothesis. In a second
step, impossible or unrealistic hypothesis/parameter pairs are excluded, thus
reducing the “morphological space”.
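The mechanics of this two-step procedure can be sketched in a few lines of code; the parameters, value sets and exclusion rule below are invented for illustration and are not taken from Godet's toolbox:

```python
from itertools import product

# Hypothetical model parameters and the values each may take
# (all names and values are invented for illustration).
parameters = {
    "fuel_price": ["low", "high"],
    "demand": ["stable", "growing"],
    "technology": ["current", "advanced"],
}

# Exclusion rule: hypothesis/parameter pairs judged unrealistic.
def is_realistic(combo):
    # Assume, for illustration, that advanced technology is
    # implausible while fuel prices stay low.
    if combo["technology"] == "advanced" and combo["fuel_price"] == "low":
        return False
    return True

# Full morphological space: every combination of parameter values.
space = [dict(zip(parameters, values))
         for values in product(*parameters.values())]

# Reduced morphological space after excluding unrealistic combinations.
scenarios = [c for c in space if is_realistic(c)]
print(len(space), len(scenarios))  # 8 6
```

The full space grows multiplicatively with each added parameter, which is why the pruning step is essential in practice.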
Abaque de Regner is a method of expert consultation by using coloured
cards, the colours representing an ordinal scale6.
“Smic Prob Expert” aims to rank scenarios according to their probability of
occurrence. Basically, selected experts are questioned, in a survey, about
their opinion regarding each scenario’s probability, on a scale from 1 to 5.
Experts are also asked about conditional probabilities. A subsequent analysis
modifies and corrects these “raw” data according to basic rules of probability
reasoning, ensuring, e.g., that the axioms of probability are not violated. Once
a consistent data basis is obtained, one can calculate a hierarchy of scenario
probabilities.
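In a much simplified form, the correction and ranking step can be pictured as forcing the raw expert scores onto a valid probability distribution; the proportional scaling used below is an invented stand-in for the actual Smic Prob Expert procedure, and the scenario names and scores are made up:

```python
# Hypothetical raw expert scores (scale 1-5) for four scenarios.
raw_scores = {"S1": 4, "S2": 2, "S3": 5, "S4": 1}

# Force the scores onto a coherent probability distribution:
# probabilities must be non-negative and sum to 1 (axioms of probability).
total = sum(raw_scores.values())
probs = {s: score / total for s, score in raw_scores.items()}

# Rank scenarios by their (corrected) probability of occurrence.
ranking = sorted(probs, key=probs.get, reverse=True)
print(ranking)  # ['S3', 'S1', 'S2', 'S4']
```

The real procedure additionally reconciles the stated conditional probabilities with the marginal ones, which this sketch omits.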
All these approaches are thoroughly structured and described in detail with
application guidelines available in a broad literature. Recently, a set of free
software tools has been issued for fostering the application of the toolbox7.
A more pragmatic approach has its origin in the US. The RAND corporation
set milestones in prognosis technique, proposing very early the use of
scenarios and Delphi techniques (e.g. Dalkey et al. 1969, Kahn and Wiener
1967). Most important, perhaps, was the inclusion of expert judgement in
decision support (Cooke 1991).
4 In the French original: “anticipation pour éclairer l’action” (Godet 2006, p. 9).
5 Not to be confused with morphological analysis as used in linguistics.
6 “Abaque“ is French for abacus.
7 Available via http://www.3ie.org/lipsor/
A more intuitive, yet goal-oriented, approach is proposed by Schwartz
(Schwartz 1996). The question “what impending decisions keep you awake at
night” is the starting point for building scenarios. Schwartz tries to identify
archetypes of plots, plots that have occurred at all times in human history, and
to identify their driving forces. The approach is thus to look for possible
archetypes for the decisions at stake and to identify the driving forces behind
them. These forces are ranked by relevance/importance and by uncertainty,
and the paths towards future states are analysed, e.g. by using a SWOT
analysis8.
2.1.3 Life Cycle Assessment specific: Descriptive analysis, prospective
analysis, scenarios and change in LCA models
2.1.3.1 Descriptive analysis
As descriptive analyses are essential for probably any analysis that deals with
real world phenomena, they are also relevant in the field of LCA. Few LCA
authors mention them explicitly, however. Guinée and colleagues constitute
an exception:
“before you can formulate a question, you first have to determine your
system, the alternatives etc. For this you need a descriptive analysis”
(Guinée et al. 1999, p. 6).
Guinée et al. emphasize, from an LCA perspective, that descriptive analysis
alone is not sufficient for decision support (Guinée et al. 1999, p. 6).
2.1.3.2 Prospective analysis
A literature review about prospective analysis in LCA can be kept short. Life
Cycle Assessments are, at present, almost always stationary models, or at
least models that neglect time. Although time aspects play a role in various
aspects of a product’s life cycle and its environmental impacts, e.g. in the
emissions of landfills (Finnveden and Nielsen 1999), in the product lifetime,
and in the atmospheric lifetime of CO2 as set in climate change models, very
few exceptions from this general rule exist (Ciroth et al. 2005).
Weidema understands prospective LCAs “as an assessment of the
consequences of a potential product substitution” (Weidema 2000). He
remarks that prospective LCAs “may well include very different processes
compared to a study with a static, retrospective perspective“. Reasons are (i)
forecasting techniques in prospective analyses give a better image of future
consequences, (ii) the scale of the product substitution is taken into account,
and, eventually, a technology different from the current technology share is
considered; finally, (iii) the market may change, and “a prospective LCA
8 SWOT: Strengths, Weaknesses, Opportunities, Threats; a SWOT analysis is a qualitative method for
decision support, widely used in strategic management.
seeks to determine which specific product substitutions will actually take
place and to what extent”.
In this sense, a prospective LCA is an(y!) comparative LCA that is performed
for decision support.
Decision support is a very common application case of LCA, probably the
most usual. However, explicit applications of prospective LCA studies are
scarce. Examples are Spielmann, with LCAs on train systems (Spielmann
2005), and Dannemand et al., with a study on future wind power systems
(Dannemand et al. 2001). Both investigate systems situated about 20 to 30
years in the future. Spielmann terms his LCA a “prospective LCA” and uses
scenarios for modelling future transport systems.
2.1.3.3 Scenarios
Scenarios have received some attention in LCA. Until 2000, there was a
SETAC Working Group dedicated to “Scenario Development in LCA”
(Pesonen et al. 2000). The Working Group defines scenarios, for LCA
applications, as follows:
applications, as follows:
“A scenario in LCA studies is a description of a possible future situation
relevant for specific LCA applications, based on specific assumptions
about the future, and (when relevant) also including the presentation of
the development from the present to the future” (Pesonen et al., p. 23).
This definition is consistent with “standard” scenario literature; it is not
modified to better fit the LCA context. This is interesting because time has so
far received little attention in LCAs (see above), and also because the term
scenario is, in LCA studies, quite often used to refer to a specific set of LCA
model parameter settings, without necessarily any time aspect included. For
example, it is quite popular to term the basic settings of the LCA model in a
study the “base scenario”, even if the LCA model is a static one.
The group briefly discusses other methods of “future research”9 and
proposes an application portfolio for them, drawing from a publication by
Weidema (Weidema 1998). The portfolio has two dimensions, time (historical,
now, 5 years, long term) and application area (specific or generic10) (see
Figure 1).
9 Namely: extrapolation methods; exploratory methods; normative methods; participatory modelling. It is
not discussed that these are not necessarily used for coping with time aspects.
10 Note that the term “specific” addresses, here and in the figure, the solution space or “area” of the
individual LCA.
Figure 1: Relevance of different future research methods in relation to
the applications of LCA (Weidema 1998, Pesonen et al. 2000)
According to this figure, scenarios are proposed for considering long term
effects.
The working group distinguishes two main scenario types. What-if scenarios
give quantitative comparisons of the options compared, and are used for
relatively easy, well-known, less complex cases with a short to medium time
frame. Cornerstone scenarios, on the other hand, typically investigate very
different options, with the aim of exploring the field of study rather than
obtaining quantitative, comparable figures. They are used for complex, less
well-known problems, often located in the long term, and will often form the
basis for more detailed follow-up research by means of what-if scenarios
(Pesonen et al. 2000, pp. 26).
2.1.4 Modelling changes in Life Cycle Assessments
While the literature about LCA and prospective analysis offers some clarity,
the discussion about the modelling of changes in LCAs offers diversity and
obfuscation. This is not surprising, because LCAs are often modelled as
static and often aim at decision support. A static model can hardly express
future changes, and decisions clearly relate to change. So, a static model
might “overlook” the changes that follow after the decision is taken, but it can
in principle consider the induced change nonetheless11.
11 E.g. by defining the possible states the decision will lead to; evaluating each state will give
guidelines for the decision. It might not be necessary to know the exact time when these system states
will be reached (e.g. Kheir 1996, Ljung 1994).
The literature on change modelling in LCA offers many different
terminologies, and some different concepts. To arrive at a relatively simple
structure, the following distinguishes all relevant approaches according to the
way in which they model change.
Change-oriented LCAs model changes; attributional LCAs attribute a portion
of a large system to the LCA and disregard whether the large system
changes or not.
Following this distinction, prospective LCA (Weidema, Spielmann),
consequential LCA (Ekvall), change-oriented LCA (Weidema), and
effect-oriented LCA (Ekvall 1997) are termed change-oriented LCAs in the
following, while descriptive (CML), retrospective (Weidema) and attributional
LCAs (ISO) are termed attributional LCAs.
Attributional LCA is the classical LCA approach. Starting from a functional
unit, a product system is built that covers the whole life cycle of the product or
service, ‘from cradle to grave’; and this system is assessed. The whole
product system depends on the chosen quantity for the functional unit in a
linear manner. Changes over time horizons are typically disregarded (SETAC
1993; UNEP 1996). Impacts of the product under study are attributed to the
functional unit of the product. One may visualise the attributional LCA system
as one slice of the whole product pie, where the functional unit defines how
large this slice is. The underlying assumption is a ceteris paribus assumption:
“The choice of the functional unit of the product alternative investigated
should not influence other activities on the planet” (Heijungs et al.
1992, p. 12; Frischknecht 1998, p. 47).
As an example, the impacts of the electricity grid may be analysed for one
year (this represents the whole pie), averaged to a UCTE mix for this specific
year, and attributed to one kWh of electricity (which represents the slice). Or,
in more colloquial words: “If I look at the world as it is running now, what does
car driving contribute to environmental problems?” (Guinée et al. 1999, p. 5).
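The “slice of a pie” logic amounts to a simple linear scaling, which can be sketched as follows (all figures are invented and serve only to illustrate the attribution step):

```python
# Hypothetical annual totals for an electricity grid (the whole 'pie').
annual_output_kwh = 500e9  # total electricity delivered in the year
annual_co2_kg = 2.0e11     # total CO2 emitted by the grid mix

# Attributional LCA: the functional unit (1 kWh) receives a
# proportional slice of the yearly system, assuming strict linearity.
functional_unit_kwh = 1.0
co2_per_fu = annual_co2_kg * functional_unit_kwh / annual_output_kwh
print(co2_per_fu)  # 0.4  (kg CO2 per kWh)
```

Because the whole product system depends linearly on the functional unit, doubling the functional unit simply doubles the attributed burden, ceteris paribus.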
Change-oriented LCA seeks to analyse the changes induced by the decision.
More specifically, it aims at describing the environmentally relevant physical
flows to and from a life cycle and its subsystems (Ekvall and Weidema 2004),
which are influenced by the decision. Possible decisions are whether to invest
in one specific product, or whether to change a production process in a
specific manner. Changes that need to be considered may include technology
switches, changes in market share, and learning curves (e.g. IEA 2000, for
energy production).
For example, additional electricity demand may be satisfied by a small power
station, by energy savings, and/or by a technically more advanced power
station, rather than by a power station representing an ‘average’ power plant.
Thus an analysis of market behaviour, e.g. by using partial or general
equilibrium models (Ibenholt 2002), may be necessary. Some authors simplify
this analysis to the choice of a ‘marginal technology’ for marginal production
and an ‘incremental technology’ for substantial changes in production volume
(Azapagic and Clift 1999; Ekvall 1999). In the end, the causal effects of the
decision need to be evaluated, either implicitly, or explicitly by use of a causal
model (Huppes 2001).
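The contrast between average (attributional) and marginal (change-oriented) figures can be made concrete with a small sketch; the mix shares, emission factors and the assumption that gas-fired power is the marginal technology are all invented for illustration:

```python
# Hypothetical grid mix: technology -> (share of production, kg CO2 per kWh).
mix = {
    "coal": (0.40, 0.95),
    "gas": (0.35, 0.45),
    "hydro": (0.25, 0.01),
}

# Average (attributional) intensity: production-weighted mean of the mix.
average = sum(share * factor for share, factor in mix.values())

# Marginal (change-oriented) intensity: assume extra demand is met
# by the marginal technology, here taken to be gas-fired power.
marginal = mix["gas"][1]

print(round(average, 4), marginal)  # average ≈ 0.54, marginal 0.45
```

The two numbers can differ substantially, which is exactly why the choice between attributional and change-oriented modelling can determine the outcome of a comparison.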
Figure 2 shows the differences between marginal, incremental12 and average
modelling in an abstract way. Figure 3 shows the attributional LCA as a ‘slice
of a pie’ of the existing system, whereas in the change-oriented LCA the
existing system is changed, driven by the definition of the functional unit.
Figure 2: Complete, substantial and marginal change (Azapagic and Clift
1999), taken from (Ekvall 1999), modified
12 In this terminology, incremental modelling relates to substantial change. Although this convention
may be slightly confusing to readers who are familiar with the notion of incremental change as change
in very small steps, distinct from radical change involving large-scale reconfiguration of a socio-technical
system, it is preserved here to avoid adding further to the plethora of terminology already used by
workers in this area.
Figure 3: attributional LCA as ‘slice of a pie’, and change-oriented (or
consequential) LCA as a change of the original system (Weidema 2003,
p. 15)
2.1.5 Towards a recommended application scheme
With the aim of guiding practitioners towards change-oriented or attributional
LCA, depending on which modelling type seems preferable in their current
application case, a “recommended application scheme” is proposed in the
following. Note that the scheme does not help with further modelling choices,
such as allocation methods or system boundary settings. Note also that the
scheme assumes that an LCA is the method of choice, so the recommended
environmental analysis method is not discussed either.
The scheme consists of a number of consecutive questions; answering the
questions leads to the recommended modelling method13.
It recommends an application based on the following rationale:
An application is preferable if it reflects reality better.
In practical applications, the effort of conducting the method will also play a
role; however, the effort seems not to influence whether change-oriented or
attributional LCA is selected. Thus, for pragmatic reasons, effort is not
considered in the scheme (!).
2.1.6 Questions in the scheme
The following basic questions are asked in the scheme.
A) Is product-related decision support the goal and scope of the study?
Example: What changes in environmental impact will be caused by
introducing a technology that reduces the loss of dairy products in
households?
13 Some authors in the LCA field call this approach a “decision tree“; however, to avoid confusion with
statistics, where a decision tree is a special graph that shows decision options with their chance of
realisation, or their utility function value, the general term scheme is used here instead.
Or is the goal and scope rather an analysis, a study of, e.g., different
properties of the product, without the intention to decide anything about the
product system?
Example: What are the total environmental impacts of dairy products?
If this question is answered with no (no product-related decision support),
then attributional analysis is recommended14.
B) Will a change that is induced by the decision change the “overall status
quo” considerably? If this question is denied, then change-oriented modelling
is not necessary. Since this question is asked from the perspective of the
case study, it can be rephrased as:
Do the case study results change considerably if the change induced by the
decision is taken into account?
C) Can the induced change be modelled in a rather correct manner, with an
effort that does not outweigh the gained insight? Again, if this question is
denied, then attributional modelling is recommended14.
Figure 4: Undue sophistication raises the overall error in a model: errors due
to ignoring or simplifying reality fall with increasing model sophistication and
complexity, while errors due to misconceiving reality (sampling errors,
mis-specifications of the model, others) rise with it ((Ciroth 2004), based on
SRU: Umweltgutachten 1974, Stuttgart 1974, p. 208, modified)
The last question is a typical question of model sophistication. It is not
desirable to implement every detail one perceives of the “real world” in a
model, but to build a model as an image of reality that best fits the goal and
scope. Figure 4 illustrates this with the modelling error.
2.1.7 A recommended application scheme
The starting point in the application scheme (see Figure 5) is the LCA
method. Answers to three simple questions A, B and C lead to either
attributional or change-oriented modelling.
14 This recommendation assumes attributional analysis as the default analysis, following in this the
broader application of attributional LCA; it acknowledges that at times attributional LCA may be easier,
and, further, that change-oriented modelling without change remains somewhat pointless. However,
this default setting can be questioned (see Weidema 2003).
[Flow chart: starting from “LCA method to be applied”, question A) “Goal &
scope: decision support?” leads, if no, to attributional LCA; if yes, to question
B) “Will the status quo change?”, which leads, if no, to attributional LCA; if
yes, to question C) “Can the change be modelled with net benefit?”, which
leads, if no, to attributional LCA and, if yes, to change-oriented LCA.]
Figure 5: Application scheme as guidance towards attributional and
change-oriented LCA modelling. For further explanations see text.
These questions shall be answered by the practitioner who conducts the
study, and they shall be checked by a peer review panel if the specific study
has one.
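Since the scheme reduces to three consecutive yes/no questions, it can be rendered directly as a small function; the sketch below merely encodes the flow chart of Figure 5 and invents nothing beyond the two answer labels:

```python
def recommended_lca_type(decision_support: bool,
                         status_quo_changes: bool,
                         change_modellable_with_net_benefit: bool) -> str:
    """Apply questions A, B and C of the application scheme."""
    # A) No product-related decision support -> attributional LCA.
    if not decision_support:
        return "attributional LCA"
    # B) The decision does not change the status quo considerably.
    if not status_quo_changes:
        return "attributional LCA"
    # C) The change cannot be modelled with net benefit.
    if not change_modellable_with_net_benefit:
        return "attributional LCA"
    # All three questions answered with yes.
    return "change-oriented LCA"

print(recommended_lca_type(True, True, True))   # change-oriented LCA
print(recommended_lca_type(True, True, False))  # attributional LCA
```

Answering the three questions honestly, and recording the answers, is what the scheme asks of the practitioner; the code only makes the branching explicit.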
2.1.8 Conclusions
The discussion of prospective and descriptive analysis leads, for LCAs,
instantly to the discussion of attributional and change-oriented modelling. For
this reason, the scheme does not deal with prospective and descriptive
analysis but “directly” with the question of attributional and change-oriented
modelling. To this end, the scheme poses three rather straightforward
questions. The first two questions have, implicitly in most cases, been
discussed in previous literature. The third question, whether the change can
be modelled with net benefit, is newly introduced here.
The questions are of a general nature. They aim to represent a consensus
across the whole LCA community, and to structure a more detailed
discussion and more elaborate guidelines. They will need to be discussed
and tested, and questions B and C (will the status quo change, and can the
change be modelled with net benefit) will need to be detailed in themselves.
For example, when should one assume that the status quo does not change?
How can one assess the “costs and benefits” of modelling the change? What
can be modelled rather easily? And what involves a high and risky level of
sophistication? These questions have not been tackled in previous literature
in sufficient detail for LCA practitioners to decide upon the most suitable
modelling method in a rational manner. They call for a “change analysis” as a
step in every LCA that aims at decision support, and for a detailed “method
cost-benefit analysis”. The latter would best be undertaken at a more generic,
non-case-specific level, with input from specific cases.
Neither of these forms of analysis exists yet; there are, however, several
threads that could be used as starting points. For example, the literature on
advantages and disadvantages of attributional modelling in comparison to
change-oriented modelling is rather broad (Ekvall et al., Weidema 2003,
Frischknecht 1998; see also Chapter 3). Several authors have presented
tools applicable for a change analysis (e.g. Weidema 2003), and there is also
rich literature and knowledge outside the LCA field, in statistics and advanced
modelling, decision theory, and game theory, and, most specifically, in the
field of prospective analysis.
There is not yet, however, a consistent framework that integrates both types
of assessment and modelling, change-oriented and attributional, in a
consistent manner. The application scheme aims to be, in this long-ongoing
discussion, a first step towards a consensus on modelling change in LCA.
Given how deeply the modelling of change affects LCA results, and the
conclusions drawn from an LCA, such a consensus is urgently needed.
2.2 Multi-functionality and allocation in LCA
Corresponding author: Sven Lundie, School of Civil and Environmental Engineering at the
University of New South Wales, Sydney, Australia
The multi-functionality problem has been identified as a significant
methodological problem. The general situation is that most processes that
constitute part of a product system are multi-functional: they 1) produce more
than one product (co-production), 2) treat two or more waste inputs
(combined waste treatment), 3) treat one waste input and produce one
valuable output (open- or closed-loop recycling), or 4) serve three or more
valuable functions from both input and output (Heijungs and Suh, 2002). In
such cases, the materials and energy flows, as well as the associated
environmental releases, shall be allocated to the different products according
to clearly stated procedures (ISO 14044, 4.3.4).
Several approaches for dealing with multi-functionality have been developed,
i.e. through system boundary expansion, and by partitioning, using
physico-chemical, economic, mass and energy approaches. However,
Guinée et al. (2004) point out that the multi-functionality problem is an
artefact of wishing to isolate one function out of many, whether jointly
produced, jointly processed as waste, or recycled. There is no single “true”
solution to the multi-functionality problem. Therefore, the procedure
addressing multi-functionality should allow the most reasonable comparison
of product systems. When creating single-functional-unit systems, the
method used may easily determine the outcome of comparisons.
In this section an attempt is made to identify and categorise multi-functional
processes systematically (see Section 2.2.1), to describe both
methodologies, i.e. system boundary expansion and allocation, in detail (see
Sections 2.2.3 and 2.2.4), and to critically review current practice in LCA
(see Section 2.2.5). Recommendations are given for a structured approach to
dealing with multi-functional processes, and advantages and disadvantages
of the approaches are discussed in the context of the ISO framework (ISO
14044; see Section 2.2.6).
2.2.1 Categorisation of multi-functional unit processes
Multi-functional processes can occur in three different contexts. Guinée et al.
(2004, p. 24) have defined these multi-functional processes and functional
flows in the context of economic allocation:
• “Functional flow: any of the flows of a unit process that constitute its goal,
viz. the product outflows (including services) of a production process and
the waste inflows of a waste treatment process.
• Multi-functional process: a unit process yielding more than one functional
flow, i.e. co-production, combined waste processing and recycling:
o Co-production: a multi-functional process having more than one
functional outflow and no functional inflow.15
o Combined waste processing: a multi-functional process having no
functional outflow and more than one functional inflow at a physical
level (see Guinée et al., 2004 for examples).
o Recycling: a multi-functional process having one or more functional
outflows and one or more functional inflows (including cases of
combined waste processing and co-production simultaneously; see
Guinée et al., 2004 for examples)”.
This classification of multi-functional processes is used in the following
sections.
2.2.2 International Organization for Standardization
The guidance provided by the International Organization for Standardization
(ISO) recognizes the variety of approaches which can be applied when dealing
with multi-functional processes. ISO suggests a generic step-wise framework
in LCA (ISO 14044, 2006). The following three steps are required:
Step 1: Wherever possible allocation should be avoided by
1) dividing the unit process to be allocated into two or more sub-processes
and collecting the input and output data related to these sub-processes, or
2) expanding the product system to include additional functions related to the
co-products, taking into account the requirements of Section 2.2.1.
Step 2: Where allocation cannot be avoided, the inputs and outputs of the
system should be partitioned between its different products and functions in a
way that reflects the underlying physical relationships between them; i.e. the
allocation should reflect the way in which the inputs and outputs are changed
by quantitative changes in the products or functions delivered by the system.
Step 3: Where physical relationships alone cannot be established or used as
the basis for allocation, the inputs should be allocated between the products
and functions in a way that reflects other relationships between them. For
example, input and output data might be allocated between co-products in
relation to the economic value of the products.
15
Heijungs and Suh (2002) further differentiate unit processes whose functions are causally
coupled (joint production) or deliberately coupled (combined production).
Formally, Step 1 is not part of the allocation procedure. Step 1.1 is relevant
if – for example – processes which actually are independent have been
lumped together in a data set. This is not an allocation problem, but one that
can be identified through a more in-depth analysis. For such processes, which
are in fact single-output processes, allocation is not relevant. The second part
of Step 1, Step 1.2, is the
expansion of the product system, i.e. system boundary expansion, to include
the additional functions related to the co-products, combined waste
processing or recycling (see Section 2.2.3 for more details). In some cases,
the analysis may refer to combined processes where the outputs can be
varied independently, in marginal or incremental changes.
Step 2 includes relationships that are not necessarily causal, including
physical properties of products such as mass, molar flows, energy contents or
volume (often as a proxy for the more volatile economic value). In physical
causality, the physical inputs into a process combined with the process
conditions cause the outputs. Example: Grain seeds and fertiliser, together
with soil and climate conditions, cause the grain and the straw as products.
There is a clear logical requirement on the direction of time: Outputs follow
inputs, therefore outputs can never cause inputs. So co-products can never
cause the resources required in their production in a physical sense. Ekvall
and Finnveden (2001) argue that in some cases, this allocation may coincide
with allocation based on causal relationship, but where it does not, it will not
provide reliable information.
Where physical relationships alone cannot be established, Step 3 is to be
applied: inputs may be allocated in a way that reflects 'other relationships'.
For this final allocation method, only allocation based on economic value is
given as an example by ISO. Here the reference made to causality is a
difficult one, as in economic activities two very different causalities are
involved: natural-science-based causality and social causality. In a
social sense, the causality may easily flow “the wrong way” – in contrast to
the physical flow. In economic activities, inputs are caused by the outputs: it
is the value of the products which creates the incentives for setting up the
facility and acquiring the inputs. The causality does not really go against time,
as it is the expected proceeds which drive firms to invest in and run the process.
Hence, the operators of a grain producer adapt the inputs and the conditions
in such a way that they produce this additional output, within the constraints
governed by physical causality. Step 3 speaks of “other relations” than
“physical relationships”. For most economic activities, these relations would
be as guided by the considerations of the process operators. Processes are
run because of the value they create, which constitutes their economic
causality. Partitioning according to the share of each product in the total value
created then distributes this causal factor over its constituent parts. Though
the partitioning of course does not imply that each co-product causes its part
of the process, the reasoning as a whole is based on causality, economic
causality.
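The economic partitioning that ISO names in Step 3 can be sketched numerically. The following is a minimal illustration, not the authors' method; the grain/straw quantities, prices and emission figure are hypothetical:

```python
def economic_allocation(outputs, prices):
    """Return allocation factors: each co-product's share of total revenue."""
    revenue = {p: outputs[p] * prices[p] for p in outputs}
    total = sum(revenue.values())
    return {p: r / total for p, r in revenue.items()}

# Hypothetical joint production: one process yields grain and straw.
outputs = {"grain": 8.0, "straw": 4.0}    # tonnes (assumed)
prices = {"grain": 150.0, "straw": 30.0}  # currency units per tonne (assumed)

factors = economic_allocation(outputs, prices)
shared_emission = 500.0  # kg CO2 from the joint process (assumed)
allocated = {p: f * shared_emission for p, f in factors.items()}
# The factors sum to one, so the allocated parts recombine to the
# unallocated total, as the ISO procedure requires.
```

Because the grain carries the bulk of the revenue (1200 of 1320 units here), it also carries the bulk of the shared emission; changing the basis from value to mass would shift that split, which is why the choice of basis matters.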
In the case of reuse and recycling ISO 14041 (now 14044) acknowledges the
fact that 'several allocation procedures are applicable. Changes in the
inherent properties of materials shall be taken into account' (ISO 14041,
Chapter 6.5.4). A distinction is drawn between closed-loop and open-loop recycling.
The allocation procedures for the shared unit processes in open-loop
recycling shall take into account physical properties, economic value and/or
the number of subsequent uses of the recycled material.
2.2.3 System boundary expansion
The idea that allocation can be avoided by system expansion was first put
forward by Tillman et al (1991) and Vigon et al (1993) with respect to waste
incineration, and more generally by Heintz & Baisnee (1992).
Figure 6: Accounting for co-products through system expansion16
The concept of system expansion has its origin in the need to ensure that all the
studied systems yield comparable product outputs. Where a co-product does not
appear in similar quantity in all systems under study, comparability can be
maintained by expanding the system with the necessary amount of the
co-products. The processes to include when making such system expansions must
be those processes actually affected by an increase or decrease in output of the
16
The two original systems to the left are producing product A either without by-products (system 1) or
with the by-product B. System expansion (illustrated in the systems to the right) is performed with the
following rationale: If system 2 substitutes system 1, more B will be produced for the same quantity of A.
This additional amount of B will substitute another existing production of B, which must then be added
to system 1 to take this effect into account. Here, the difficult task is to identify which existing production
of B will be substituted. If system 2 is substituted by system 1, less B will be produced, thus requiring a
new substitute production to be added to system 1. Here, the difficult task is to identify which production
of B will be the substitute.
by-product from the systems under study (see Figure 6), as described in ISO TR
14049, section 6.4. (with reference from ISO 14044)17.
Weidema & Norris (2005) consider that system boundary expansion is the
only approach that avoids allocation according to ISO 14041 (now 14044),
Clause 4.3.4.
Figure 7: Schematic diagram for describing system expansion and
delimitation of joint production
The currently most detailed procedural guideline for system expansion is
Weidema (2003 and 2004), from which Figure 7 has been drawn. Weidema
(2003, pp. 12 and 28) notes that system boundary expansion is always
possible for consequential LCAs. This approach is applicable for both
production and waste management, i.e. combined waste processing and
recycling. In the case of attributional LCAs economic allocation shall be
applied for co-products.
17
“The supplementary processes to be added to the systems must be those that would actually be
involved when switching between the analysed systems. To identify this, it is necessary to know: 1)
whether the production volume of the studied product systems fluctuate in time (in which case different
sub-markets with their technologies may be relevant), or the production volume is constant (in which
case the base-load marginal is applicable), 2) for each sub-market independently, whether a specific
unit process is affected directly (in which case this unit process is applicable), or the inputs are
delivered through an open market, in which case it is also necessary to know: whether any of the
processes or technologies supplying the market are constrained (in which case they are not applicable,
since their output will not change in spite of changes in demand), which of the unconstrained
suppliers/technologies has the highest or lowest production costs and consequently is the marginal
supplier/technology when the demand for the supplementary product is generally decreasing or
increasing, respectively.”
Weidema (2003 and 2004) explains system expansion in relation to joint
production as being the answer to the question: 'How will the production
volume and exchanges of the processes in the system be affected by a
change in demand for the co-product that is used in the life cycle study?'
Weidema summarizes the answer to this question in three rules:
1) The co-producing process shall be fully ascribed (100%) to the
determining co-product for this process (product A; see Figure 7).18
2) Under the condition that the dependent co-products are fully utilised, i.e.
that they do not partly go to waste treatment, product A shall be credited
for the processes that are displaced by the dependent co-products. The
intermediate treatment shall be ascribed to product A. If there are
differences between a dependent co-product and the product it displaces,
and if these differences cause any changes in the further life cycles in
which the dependent co-product is used, these changes shall likewise be
ascribed to product A (see Figure 7).19
3) When a dependent co-product is not utilised fully (i.e. when part of it must
be regarded as a waste), the intermediate treatment shall be ascribed to
the product in which the dependent co-product is used (product B), while
product B is credited for the avoided waste treatment of the dependent
co-product (see Figure 7).20
The procedure as outlined above is based on the simplified assumption that a
change in demand for a dependent co-product does not affect the production
volume of the co-producing process. Weidema suggests that when this
assumption is regarded as too simplified (especially as it may change over
time, depending on location, and depending on the scale of change),
separate scenarios should be applied for each co-product that may be
expected to be determining.
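Rule 2 above can be sketched as simple burden bookkeeping. This is an illustrative reading of the rule, not Weidema's implementation, and all burden figures are hypothetical:

```python
def expanded_burden(co_process, intermediate, displaced_per_unit, amount_b):
    """Net burden ascribed to determining product A under rule 2.

    co_process        -- burden of the co-producing process (ascribed 100% to A)
    intermediate      -- burden of intermediate treatment of dependent co-product B
    displaced_per_unit-- burden per unit of the production that B displaces
    amount_b          -- amount of B arising (assumed fully utilised)
    """
    return co_process + intermediate - displaced_per_unit * amount_b

# Hypothetical figures: the co-producing process emits 100 kg CO2, the
# intermediate treatment of B emits 5 kg, each of the 10 units of B
# displaces production elsewhere emitting 8 kg per unit.
burden_a = expanded_burden(100.0, 5.0, 8.0, 10.0)  # 100 + 5 - 80 = 25.0
```

Product B itself carries none of the co-producing system or the intermediate treatment; if B were not fully utilised, rule 3 would shift the intermediate treatment (and the waste-treatment credit) to product B instead.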
18
This follows logically from product A per definition being the co-product, which causes the changes in
production volume of the co-producing process.
19
This rule follows from the fact that – under the stated condition – both the volume of intermediate
treatment and the amount of product which can be replaced are determined by the amount of dependent
co-product available, which again is determined by the change in production volume in the co-producing
process, which is finally determined by the change in demand for product A. It follows from this rule that
product B is ascribed neither any part of the co-producing system, nor any part of the intermediate
treatment. When studying a change in demand for product B, this product shall be ascribed the change
at the supplier most sensitive to a change in demand, i.e. the same process, which is displaced by a
change in demand for product A (but see also rule no. 3). If the condition stated in rule no. 2 (that the
co-product is fully utilised in other processes) is not fulfilled, rule no. 3 applies.
20
This follows from the volume of the intermediate treatment (and the displacement of waste treatment)
in this situation being determined by how much is utilised in the receiving system, and not by how much
is produced in the co-producing process. Another way of saying this is that in this situation, process I
(the intermediate treatment) is that supplier to process B, which is most sensitive to a change in
demand for product B.
It should be noted that the above procedures refer to joint production, where
the relative output volume of the co-products is fixed, while for combined
production with independently variable output volumes, allocation can be
avoided simply by modelling directly the consequences of a change in the
output of the co-product of interest without change in the output of the other
co-products. For combined production, a physical parameter can generally be
identified, which – in a given situation – is the limiting parameter for the
co-production. It is the contribution of the co-product of interest to this
parameter which determines the consequences of the studied change. This is
Step 2 in the ISO procedure, also known as allocation according to physical
causalities (Guinée et al, 2002). System expansion is accordingly considered
by Weidema & Norris (2005) and Weidema (2003) to be a “unified theory” for
allocation: allocation according to the determining (causal) parameter can be
treated as a special case of system expansion, in which the limiting parameter
for the combined production is seen as the determining co-product and the
non-limiting parameters as the dependent co-products, giving the same result
as the simpler procedure of allocation according to the determining
parameter.
2.2.4 Allocation
2.2.4.1 General allocation principles
ISO 14044 (2006) indicates that allocation procedures should approximate
fundamental input-output relationships and characteristics of inventory
analysis. The principles may apply to multi-products, internal energy
allocation, services (e.g. transport, waste treatment) and to recycling.
According to ISO 14044 processes shall be identified that are shared with
other product systems and several generic principles shall be applied. Guinée
et al (2002) add further, more specific principles. The full set of principles is
the following:
• the sum of the allocated inputs and outputs of a unit process shall equal
the unallocated inputs and outputs of the unit process;
• allocation should be applied at the multi-functional unit process level only,
because this is the most detailed disaggregated analysis of the system,
and should be applied consistently across all multi-functional processes;
• the definition of the multi-function problem does not distinguish between
different types of multi-functionality, i.e. co-production (either combined
or joint production), combined waste processing, re-use and recycling.
Hence, in each case the same allocation principles should apply;
• results should be unaffected by the sequence of application of the
allocation procedure (cf. Sen, 1970);
• a sensitivity analysis should be carried out if several allocation
procedures seem applicable; and
• the procedure should be documented and justified.
However, if allocation is applied to solve multi-functional problem(s),
allocation shall be applied consistently throughout the entire LCA.
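The first principle above (allocated flows must sum to the unallocated flows) amounts to requiring that the allocation factors sum to one, whatever basis produced them. A minimal check, with hypothetical factors for a two-product process:

```python
def partition(flow, factors):
    """Split an unallocated flow over co-products by allocation factors."""
    if abs(sum(factors.values()) - 1.0) > 1e-9:
        raise ValueError("allocation factors must sum to 1")
    return {product: flow * f for product, f in factors.items()}

# Hypothetical mass-based factors for a dairy-style two-product process.
shares = partition(flow=120.0, factors={"milk": 0.85, "cream": 0.15})
# The allocated parts recombine to the original 120.0 units,
# satisfying the first principle regardless of the allocation basis chosen.
```

The same guard applies whether the factors come from mass, energy, economic value or a physico-chemical matrix; only the factor values change, not the balance requirement.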
2.2.4.2 Allocation procedures
The allocation procedures can be roughly differentiated into three types, i.e.
allocation based on physical properties, physico-chemical allocation and
economic allocation:
Allocation based on physical properties
Mass, molar flows, energy contents or volume are physical properties which
are used to allocate the inputs and outputs of the product / service under
study. However, this allocation may, but most likely does not, reflect the
causal relationship. Guinée et al (2002) go even further: they discredit this
approach for a lack of justification (as already argued in Huppes and
Schneider, 1994), since there is no causality involved; e.g. the mass of
outputs cannot cause inputs by physical causation.
As it is easily applicable, this type of allocation may be used in attributional,
non-comparative LCAs, where in some situations it may serve as a proxy for
allocation based on mass or energy can be used as a proxy for allocation
based on economic value (Guinée et al 2004).
Physico-chemical allocation
Feitz et al (2007) systematically developed an industry-specific
physico-chemical allocation matrix for the dairy industry that closely reflects
the causalities in the dairy manufacturing industry. It may be seen as an
optimisation procedure for producing a changed output within the production
function's constraints and the variation in production technologies in the
system. In this sense, it uses physico-chemical relations, as part of production
functions. This approach has been developed at a sector level, linking in to
new developments in LCA.
One way of solving the physico-chemical allocation problem is to generate a
product or industry specific physico-chemical allocation matrix that reflects the
actual allocation of resources based on whole-of-plant information. However, this
allocation matrix is the product of an extensive process of subtraction /
substitution to determine average resource use and emissions for individual
products from numerous multi-product manufacturing plants (see Feitz et al,
2007 for details). The iterative subtraction / substitution procedure has to be
adopted to avoid the need for economic, energy or mass allocation and to
obtain a more realistic measure of resource use per product. The procedure
usually involves using initial literature values and estimates from numerous
production sites for resource efficiency per product, normalising the resource
efficiency figures for all products to a reference product, and producing a
matrix of resource efficiency ‘coefficients’ (or physico-chemical allocation
factors). The coefficients may then be optimised in an iterative manner for all
products using surveyed process data from numerous plants given the
constraints of the number of products, mass of different products and total
resource use for each plant. The coefficients could be further refined by using
an approach similar to the RAS method, used for optimizing input coefficients
in input-output tables (see Stone, 1963; Bacharach, 1970; Parikh, 1979 and
van der Linden and Dietzenbacher, 2000 for further details).
The percent allocation is determined by multiplying the annual production of a
product by its unique coefficient (or allocation factor, A/F) and then dividing
by the sum, over all products, of each product's annual production multiplied
by its specific A/F. The determined percentage allocation is then multiplied by
the input or output flow of interest. Feitz
et al (2007) recommend that such physico-chemical allocation matrices may
be developed for other industrial sectors which have similar production
processes; for example, agriculture (e.g. the meat industry); construction (e.g.
sand, gravel and other construction materials); mining (e.g. gold and lead)
and petrochemical industries (e.g. automotive fuels).
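The percent-allocation rule just described can be sketched as follows. The products, production volumes and coefficients are hypothetical and are not taken from Feitz et al (2007):

```python
def percent_allocation(production, coefficients):
    """Allocation share per product: production_i * AF_i / sum_j(production_j * AF_j)."""
    weighted = {p: production[p] * coefficients[p] for p in production}
    total = sum(weighted.values())
    return {p: w / total for p, w in weighted.items()}

# Hypothetical dairy plant: annual production (tonnes) and physico-chemical
# coefficients normalised to a reference product (milk = 1.0).
production = {"milk": 10000.0, "cheese": 1000.0, "powder": 500.0}
coefficients = {"milk": 1.0, "cheese": 6.0, "powder": 9.0}

shares = percent_allocation(production, coefficients)
water_use = 2.0e6  # plant-level input flow of interest, in litres (assumed)
allocated = {p: s * water_use for p, s in shares.items()}
```

The coefficients encode the resource intensity per unit of each product, so a low-volume but resource-intensive product (here, powder) receives a larger share than its tonnage alone would suggest.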
Economic allocation
A systematic approach to allocation has been suggested by Guinée et al
(2002). The authors recommend economic allocation as a baseline method
for most detailed LCA applications, because it seems the only generally
applicable method (Guinée et al 2004). This avoids the problem that
differences between alternatives are caused by different allocation methods
applied instead of being due to the underlying reality. This position may seem
to go against the ISO 14041 (now 14044) recommendation that allocation
should preferentially be done on the basis of physical relationships. In
exceptional cases, the physical relations may be relevant and then may
precede economic allocation, as in the cadmium emissions from waste
incineration originating from nickel-cadmium batteries. These cases seem
restricted to situations where the processing of the input is the function and
the physical causality would go in the right direction. That is the case in waste
management only. No other instances of allocation based on physical
causality have been found yet.
Figure 8: Decision flow diagram for identifying and handling multi-functionality situations (Guinée et al, 2004)
More recently, a decision support diagram has been developed by Guinée et
al (2004, see also Figure 8) for coping with allocation and multi-functionality:
1) determine functional flows for each process of the system under study
(see step 1 in Figure 8);
2) determine multi-functional processes (see step 2 in Figure 8); and
3) classify the type of allocation into co-production, combined waste
processing and open-loop recycling (see step 3, cases A-C in Figure 8).
Physico-chemical allocation was explicitly not considered in Guinée et al
(2004) due to the lack of data. Nevertheless, in all cases Guinée et al (2004)
recommend first allocating on a physico-chemical basis (if sufficient
information is available) and then allocating the remaining flows on an
economic basis. In addition, it is recommended to perform sensitivity analyses.
2.2.5 Case studies and guidelines – a literature overview
Many papers addressing methodological issues and case studies have been
published which address multi-functional problems. Here a selection of
these case studies is described and discussed, covering food production,
chemical production, energy generation and refining, and building materials.
2.2.5.1 General guidelines
Curran (2006) reviewed the progress to develop generic guidelines for
allocation in LCA. The picture for recommendations is very heterogeneous
(see Curran, 2006, pp 11 for details):
• US EPA (1993) 'states that no allocation is always applicable however,
the guide endorses mass basis.'
• Greenhouse Gases, Regulated Emissions, and Energy Use in
Transportation (GREET) Model (Wang, 1999) ' follows a process
where co-product value is measured by energy units.'
• National Renewable Energy Laboratory (NREL, 2004) 'follows the ISO
hierarchy, but recommends economic basis.'
• Guinée et al (2002) 'advises economic allocation for all detailed LCAs'
(except for special cases in waste processing; see Section 2.2.4.2).
Guinée et al (2004) developed a decision tree dealing with economic
allocation for co-production, combined waste processing and open-loop
recycling.
• EcoInvent (Frischknecht and Jungbluth, 2004) 'avoids using system
expansion and allows for choice of basis.'
• eLCieTM (Sylvatica, 2004) 'recognises the ISO standard and the need
to allow flexibility.'
2.2.5.2 Co-production
Food production
Ayers et al (2006) reviewed LCA studies in the fishery industry with a
particular focus on methodological issues related to multi-functional
processes. In this industry sector four key allocation problems occur, i.e. at
the fishery, processing, feed production and on-farm stages:
• At fishery stage three approaches were taken to deal with multi-functional
processes:
1) economic allocation is applied by Ziegler et al (2003) and Mungkung
(2005). Ziegler et al argue that economic allocation is more socially
relevant in this type of study because it is the economic value of the cod
that is the driving force for the fishery. System expansion has not been
possible because there are no fisheries where only the by-catch species
are caught; hence, results would have been less transparent.
2) mass allocation is applied by Eyjólfsdóttir et al (2003) and Ellingsen &
Aanondsen (2006); and
3) system boundary expansion is applied by Thrane (2004), although it has
been complex.
• At the processing stage, Ziegler et al (2003) and Hospido et al (2006) use
economic allocation, while Thrane (2004) resorts to system boundary
expansion.
• At the feed production and on-farm stages, only economic allocation is
applied, by Papatrypphon et al (2004 & 2003) and Mungkung (2005).
Dairy industry
Cederberg & Stadig (2003) compare different methods of handling
co-products when dividing the environmental burden of the milk production
system between milk and the co-products meat and surplus calves: initially,
economic allocation between milk and meat was applied. Allocating the
co-products meat and surplus calves was then avoided by expanding the milk
system. The authors show that economic allocation between milk and beef
favours the product beef, but when system expansion is performed, the
environmental benefits of milk production due to its co-products of surplus
calves and meat become obvious. Milk and beef production systems are
closely connected. Changes in milk production systems will cause alterations
in beef production systems.
A different approach is taken by Feitz et al (2007) who systematically
developed an industry specific physico-chemical allocation procedure for
dairy products based on extensive surveys and site-visits (see Section
2.2.4.2).
Forestry
Werner et al (2006) encounter allocation problems in up-stream processes of
wood products. The authors tackle the problems in various ways depending
on the process in the value chain:
• Forestry processes are allocated to industrial wood and roundwood,
independently of the time of harvest, based on volume or relative share of
proceeds.
• Transports are allocated to all the products generated during processing
of the log on a mass-basis or based on the relative share of proceeds.
• Production processes are subdivided into trimming, debarking, conversion,
sorting, and mechanical processing and are allocated based on mass or
relative share of proceeds.
• End-of-life scenarios (recycling and incineration) are modeled according to
various allocation procedures such as Cut-off, VCS, the Op-Cost MEA,
and the SC-PA.
Hischier et al (2005) give in their paper an overview of how wood and
packaging material production is inventoried in ecoinvent. A revenue-based
co-product allocation approach is used for the different outputs.
Chemical industry & refinery
Kim and Overcash (2000) explore three ways to define an industrial
manufacturing process for ammonia production, i.e. a macroscopic, a
microscopic, and a quasi-microscopic approach. The macroscopic approach
does not subdivide any of the sub-processes within a plant; the microscopic
approach fully separates all sub-processes and minimizes the need for
allocation, but it does not completely avoid allocation for background data.
The quasi-microscopic approach allows for joint sub-processes that cannot
be technically separated.
Kim and Dale (2002) apply the system expansion approach to the production
of ethanol from corn, covering both dry and wet milling. The system
expansion approach is used to avoid the allocation procedure in the
foreground system of ethanol production from corn grain, while traditional
allocation is applied in the up-stream processes such as ammonia production
and petroleum fuels.
Azapagic et al (2000, 1999, 1998 & 1996) explore the use of linear
programming to find opportunities for system improvement in the production
of 5 boron products. Linear programming is applied to co-product allocation, if
environmental releases occur that are caused by the exchange of systems.
Detailed data on the sub-processes in the system is needed for this
approach.
Silva and Kulay (2003) use allocation criteria of energy and mass for the
production of sulfuric acid and manufacture of single superphosphate in an
LCA on wet and thermal routes for phosphate fertilizers manufacture.
For petroleum refining, Wang et al (2004) investigate 5 types of allocation to 4
types of fuel, i.e. gasoline, diesel, LPG & naphtha: energy-content-based
allocation at the refinery level; energy-content-based allocation at the refinery
level with rule-of-thumb adjustments; mass-based allocation at the process
level; energy-content-based allocation at the process level; and
market-value-based allocation at the refinery level.
Combined heat and power
Frischknecht (2000) argues that allocation cannot be defined at the process
level, but must be done at the system level. Companies can perform
allocation to optimize the economic and/or environmental performance.
System enlargement is viewed as a special case of an allocation factor.
Fossil fuel chain
Guinee & Heijungs (2006) compare an average Dutch passenger car running
on petrol versus bio-ethanol and diesel versus biodiesel. The focus is on
analysing the influence of allocation methods on the overall environmental
impact results (cf. Bernesson et al 2004). Three different allocation scenarios
for fossil fuel chains are carried out: 1) economic allocation; 2) physical
allocation (mass or energy content; economic allocation is applied if a
physical parameter cannot be determined); and 3) Ecoinvent default allocation
(physical-causal relationships: common physical parameters (mass or heating
value) and/or the economic proceeds of the valuable outputs of the
multi-output process).
The total results (at the level of environmental impacts) differ only modestly,
i.e. by a factor of 1–1.5, although at the process level allocation factors may
differ significantly (by a factor of up to almost 250).
2.2.5.3 Combined waste processing and recycling
So far, building materials and wood-based products have received most of
the attention in this area (Curran, 2006).
Ekvall & Weidema (2004) argue that price elasticity should ideally be
identified for each individual case of open-loop recycling. Unfortunately,
however, this is not likely to be feasible, as there is apparently a large
uncertainty in the price elasticity of supply and demand. Instead, default
values for the price elasticities are used, as presented by Palmer et al (1997)
and summarized by Ekvall (2000). The authors find that open-loop recycling
has a negligible effect on the LCI results.
Recycled material from the system investigated can replace material of the
same type, i.e. virgin material or recycled material from other systems. It can
also replace completely different types of material or no material at all (Ekvall
& Finnveden 2001). Recycling of material into the system investigated might
affect different waste management processes. It might also affect several
other systems, in which the recycled material could have been used,
replacing another and unknown material. Therefore, two-tiered simplifications
are required to make the methodology operational: 1st line simplification (i.e.
'assume competition only with virgin and recycled material of the same type')
and 2nd line simplification (i.e. assume supply and demand to be (not) equally
affected; Ekvall and Weidema, 2004).
Borg et al (2001) posit that economic value is the appropriate indicator for
remaining quality of recycled building materials. Werner and Richter (2000)
apply economic allocation to open-loop recycling of aluminum for the
production of window frames. The authors propose to allocate environmental
burdens only to those products which are the 'aim' or the 'intended output' of
a process. Therefore, release data are allocated entirely to the window.
Jungmeier et al (2002a &b) address the treatment of allocation in wood-based
products. They conclude that different allocation factors, e.g. mass or
economic value, are allowable in the same LCA. It is suggested to follow the
allocation schemes that they consider most practical: forestry: mass or
volume; sawmill: mass or volume and proceeds; wood industry: mass &
proceeds.
Page 28
TF3 Methodological consistency
Bez et al (1998) investigate the disposal of waste in sanitary landfills. The
model approach shows an operationalized concept for allocation of the
environmental effects caused by the landfill process depending on the special
input components, i.e. the elementary composition of single waste fractions.
The incineration of organic waste solvents in hazardous waste incinerators is
analysed by Seyler et al (2005). For the multi-input allocation a model is
developed that takes into account the physico-chemical properties of waste
solvents such as elementary composition and net calorific value.
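The kind of multi-input allocation described for the incinerator can be sketched numerically: substance-specific emissions follow each waste's content of the causing element, while energy-related burdens follow its share of net calorific value. The waste names and all figures below are invented for illustration, not taken from Seyler et al (2005).

```python
# Hypothetical multi-input allocation for a shared waste incinerator:
# HCl emissions are allocated by chlorine input, CO2 by energy share.
# All numbers are invented for illustration.

wastes = {
    # mass in kg, chlorine content in kg/kg, net calorific value in MJ/kg
    "solvent_A": {"mass": 100.0, "cl_content": 0.05, "ncv": 25.0},
    "solvent_B": {"mass": 300.0, "cl_content": 0.01, "ncv": 15.0},
}

total_cl = sum(w["mass"] * w["cl_content"] for w in wastes.values())
total_energy = sum(w["mass"] * w["ncv"] for w in wastes.values())

def allocate(total_hcl_emission, total_co2_emission):
    """Split the HCl emission by chlorine input and CO2 by energy share."""
    result = {}
    for name, w in wastes.items():
        cl_share = w["mass"] * w["cl_content"] / total_cl
        e_share = w["mass"] * w["ncv"] / total_energy
        result[name] = {"HCl": total_hcl_emission * cl_share,
                        "CO2": total_co2_emission * e_share}
    return result

shares = allocate(total_hcl_emission=8.0, total_co2_emission=900.0)
```

With these figures, solvent A carries 5 of the 8 kg HCl (it supplies 5 of the 8 kg chlorine) but only about a third of the CO2, showing how different physico-chemical keys apply to different emissions of the same process.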
2.2.5.4 Discussion and conclusions based on the literature review
The reviewed case studies render a very 'mixed' picture of how LCA
practitioners deal with multi-functional processes in co-production,
combined waste processing and recycling.
In the fishery industry, economic allocation is the most widely used
approach, while system expansion and allocation according to physical
causality have not been applied in most cases (Ayers et al, 2006).
Allocation procedures have not been avoided by subdividing or expanding the
system, i.e. step 1 of the ISO procedure.
Allocation based on gross energy content is proposed, as it provides a more
accurate reflection of the flow of matter and energy in this production system
(Ayers et al, 2006).
LCA studies in the dairy industry also offer multifaceted approaches:
Cederberg and Stadig (2003) conclude that in prospective LCA studies,
system expansion should be performed to obtain adequate information on the
environmental consequences of manipulating production systems that are
interlinked with each other, while Nielsen et al (2003) state that economic
or mass-based allocation has been used most frequently in agricultural LCAs.
Feitz et al (2007) apply physico-chemical allocation.
Werner et al (2006) are of the opinion that several co-product allocation
procedures are applicable for different life cycle steps within the same LCA.
The allocation procedure selected for each step widely depends on the
decision-maker's mental models on the material and market characteristics,
the purpose of the study and on the specific planning and decision structure
assumed for the processes to be allocated.
In the case of ammonia production Kim and Dale (2002) argue that 'the
choice of the allocation procedure depends on the goal of the study: system
expansion approach can evaluate effects of changes in the foreground
system, but is a data-intensive process; mass basis allocation method is
easily applicable and can identify the key sub-processes, but is unable to
determine effects of key process parameter changes.'
Economic allocation seems to be the preferred approach and is perceived to
be the best avenue to capture the down-stream recycling activities (Curran,
2006; Guinee et al, 2004). In contrast, Werner et al (2006) argue that no
generic allocation procedure is definable that would adequately depict the
material and market characteristics of all materials available as well as the
specific decision logic for each of their life cycle steps.
In the majority of the reviewed case studies some sort of allocation
procedure is applied. However, the level of detail and justification
provided for system boundary expansion and allocation decisions is
inconsistent and incomplete in most published reports (Ayers et al, 2006).
The first two steps of the ISO hierarchy have been difficult to apply:
neither sub-dividing the systems studied nor allocation according to a
causal physical relationship was possible (Ayers et al, 2006). Frischknecht
(2000) suggests moving system expansion from Step 1 to Step 3 in ISO 14041
(now 14044) in order to apply system expansion in a way similar to the use
of economic and other causalities. Also, economic relationships seem to be
at least as important as physical relationships (Ekvall and Weidema, 2004).
Based on this (limited) review it seems that the three-step framework in ISO
14044 (2006, see also Section 2.2.2) is not frequently followed in the
practical application of LCA. The methodological choice for dealing with
multi-functional processes might be made on a case-by-case basis. A
recurring theme is that this choice needs to match closely with the goal of
the study, where the intentions of the study are outlined. In the Goal and
Scope Definition, questions are answered such as: why is the study
commissioned, for what purpose, who is the target audience, etc. These
issues may have a direct impact on methodological choices. Given the large
variety of LCA studies, it seems unlikely that a 'one-size-fits-all'
approach can be developed.
2.2.6 Structured approach for dealing with multi-functional unit
processes
In this section, overall recommendations are given for dealing with
multi-functional unit processes. Furthermore, it is addressed how to
identify and classify multi-functional processes. The two main approaches,
i.e. system boundary expansion and allocation, are described, and their
advantages and disadvantages are discussed. As these approaches follow
quite different logics, they are addressed separately.
The rationale for the structured approach is to reflect the process system
"in reality" as reliably as possible. However, no recommendation is made as
to which procedure is superior, i.e. system boundary expansion or allocation
methods. The effort involved in following the decision guidance may vary
significantly from case to case and with the allocation or system boundary
expansion method chosen.
2.2.6.1 General recommendations
Methodological choices are needs-driven. The information needs, such as
decision support, are outlined in the Goal and Scope Definition of an LCA.
An example of an information need could be: system expansion should be
performed to obtain adequate information on the environmental consequences
of manipulating production systems that are interlinked with each other. It
may therefore be helpful to differentiate between 'what if' and 'what was'
scenarios ('what if' = consequential LCA; 'what was' = attributional LCA).
Also, a closer link of the methodological choices in multi-functional
situations to the Goal and Scope Definition might be recommendable,
particularly in consequential LCAs (Ekvall and Weidema, 2004). Werner et al
(2006) call this a 'functionalistic conception of LCA', meaning that the
initially formulated goal is crucially inherent in many modelling decisions
that affect the final result. The justification of choices should be
explicit and transparent.
A standard set of requirements for how to describe and justify allocation
decisions in published reports might help to make LCA studies with
multi-functional processes more robust and transparent (Ayers et al, 2006).
Key issues for system boundary expansion and allocation might be, after
Curran (2006): data availability, modelling 'reality', impact on decision
making, etc.
Systematic sensitivity analysis as part of the interpretation phase remains
necessary (Guinee and Heijungs, 2006), particularly to illustrate the impact of
different allocation procedures or system boundary expansion on the results
of the study (Ayers et al, 2006).
Given the specific methodological considerations and prevailing practice of
conducting LCA studies, two issues might be addressed:
1) It could be assumed that no generic procedure for multi-functional
processes in co-production, combined waste processing and recycling is
definable (Werner et al, 2006). Hence, more effort needs to be invested in
developing allocation procedures appropriate to specific industry sectors
(Ayers et al, 2006; Feitz et al, 2007).
2) Frischknecht's idea of 'relocating' system boundary expansion from Step 1
to Step 3 in ISO 14041 could be reconsidered (Frischknecht, 2000), in order
to put boundary expansion on a par with 'other relationships'.
2.2.6.2 Categorising multi-functional processes
First, multi-functional processes need to be categorised systematically. A
structured approach for categorising multi-functional processes and further
partitioning them is outlined below:
• Determination of functional flows
At the beginning of this procedure the functional flows of each process under
study must be identified. The functional flows may be products/services
manufactured and/or waste to be treated (Guinée et al, 2004).
• Determination of multi-functional processes
Those processes of the system have to be identified which have more than
one functional flow, of which at least one is not entirely required by the
product system. If there are processes with more than one functional flow
which do not fully remain within the product system, a method for allocation
or system boundary expansion must be applied (go to the next step); if this
is not the case, i.e. there is only one functional flow or closed-loop
recycling, no multi-functional situation is given and hence no further
procedure is needed (Guinée et al, 2004).
• Further partitioning of inputs and outputs
For identified multi-functional processes it is recommended to collect
further, more detailed information in order to obtain two or more
mono-functional sub-processes (if possible). By sub-dividing multi-functional
processes into mono-functional processes, both allocation and system
boundary expansion may be avoided. However, a further procedure has to be
applied if it is not possible to disaggregate all multi-functional processes
into mono-functional processes.
There are then two principal approaches for dealing with this
multi-functional situation, i.e. system boundary expansion or allocation
methods. These two approaches are described separately.
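The categorisation steps above can be sketched in a few lines: a process counts as multi-functional here if it has more than one functional flow and at least one of those flows leaves the product system. The process names and flows below are invented for illustration.

```python
# Minimal sketch of the categorisation step: a process is multi-functional
# if it has more than one functional flow (product/service or waste to be
# treated) of which at least one is not fully used within the system studied.
# Process names and flows are invented for illustration.

def is_multifunctional(process, internal_flows):
    """True if the process has >1 functional flow and at least one leaves
    the product system under study."""
    flows = process["functional_flows"]
    leaving = [f for f in flows if f not in internal_flows]
    return len(flows) > 1 and len(leaving) > 0

refinery = {"name": "refinery",
            "functional_flows": ["diesel", "naphtha", "heavy_fuel_oil"]}
boiler = {"name": "boiler", "functional_flows": ["steam"]}

internal = {"diesel"}  # flows fully consumed inside the system studied
```

Under these assumptions the refinery is flagged as multi-functional (naphtha and heavy fuel oil leave the system), while the boiler, with a single functional flow, is not.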
2.2.6.3 System boundary expansion
Weidema (2003, 2004) and Weidema and Norris (2005) propose a three-step
procedure. A brief summary is given here (for more details see Section 2.2.3):
• Firstly, ascribe the co-producing process fully to the determining
co-product of this process;
• Secondly, the co-producing process is credited for the processes that are
displaced by the dependent co-product(s), and the intermediate treatments
are ascribed to the co-producing process. The condition of this step is
that the dependent co-products are fully utilised, i.e. that they do not
partly go to waste treatment; and
• Thirdly, the intermediate treatment shall be ascribed to the product in
which the dependent co-product is used (product B), while product B is
credited for the avoided waste treatment of the dependent co-product. The
condition here is that the dependent co-product is not fully utilised (i.e.
part of it must be regarded as a waste).
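For the fully-utilised case (steps one and two), the bookkeeping is simple arithmetic. The CO2 figures below are invented purely to illustrate the credit/debit logic, not taken from any of the cited studies.

```python
# Illustrative arithmetic for the substitution scheme (steps 1 and 2),
# with invented kg-CO2 figures: the co-producing process is ascribed fully
# to the determining product, then credited for the displaced production of
# the dependent co-product and debited with the intermediate treatment.

burden_coproducing_process = 100.0   # fully ascribed to determining product
burden_displaced_production = 30.0   # avoided by the dependent co-product
burden_intermediate_treatment = 5.0  # e.g. drying/transport of co-product

# Step 2 applies because the dependent co-product is fully utilised:
net_burden_determining_product = (burden_coproducing_process
                                  + burden_intermediate_treatment
                                  - burden_displaced_production)

# The dependent co-product itself then carries no burden in this scheme.
burden_dependent_coproduct = 0.0
```

The determining product ends up with 100 + 5 − 30 = 75 kg CO2; the dependent co-product is burden-free because its use displaces equivalent production elsewhere.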
System boundary expansion is only applicable for consequential, not for
attributional LCAs (Weidema, 2003).
There is general agreement that the system expansion approach is a very
attractive way to theoretically avoid the difficult problem of allocation
altogether. In that sense, system expansion simplifies modelling, as it
limits the assumptions that the modeller needs to make. It also allows the
modeller to assign credits for avoided environmental burdens associated
with product displacement.
Guinée et al (2002, Section 3.9) point out that step 1b of the ISO procedure
(see Section 2.2.2) is equal to redefining the functional unit and the system
boundaries.
System boundary expansion generally introduces new multi-functional
processes (Guinée et al, 2002); some sort of allocation is still needed in order
to collect the necessary background data. Hence, allocation cannot be totally
avoided even in a system expansion approach. The majority of the case
studies apply some sort of allocation (Curran, 2006).
Broadening the system boundaries makes the process of data collection
much more complicated; not only are more data needed, but appropriate data
are needed (Curran, 2006). Given the widespread and increasing occurrence
of multi-functional processes, system expansion will inflate the system under
study to an extent that may not be acceptable for the LCA practitioner, as
most processes in the world would become part of the product system
analysed. This practical problem may hold for conventional, process-based
type LCAs. Data accessibility, time, and effort become significant and bring
the practicality of applying system expansion into question. But the problem of
a substantial growing number of processes through system boundary
expansion may to some extent be solved by applying hybrid modelling for
LCI. In principle, all sectors would be involved in expansions. Depending on
assumptions, and especially the definition of the functional unit, a few
products or all final products may be involved in the comparison (Guinée et al,
2002).
Larger systems run the risk of being less transparent, in that there is
more detail on how data were arrived at than can be conveniently conveyed
(Curran, 2006).
“This procedure may often constitute an artificial solution to the
multi-functionality problem, if the functions taken for expansion and
subtraction are known in reality not to be the relevant ones … as such
imaginary solutions may introduce large and unknown uncertainties in
outcomes” (see Guinée et al, 2002).
2.2.6.4 Allocation
If allocation is applied to solve multi-functional problem(s), allocation shall be
applied consistently throughout the entire LCA (Guinee et al, 2002).
Three different cases can be differentiated and handled differently (see
Guinée et al (2004) for details), i.e. co-production (covering both goods
and services), combined waste processing, and open- and closed-loop
recycling.
In all cases, Guinée et al (2004) recommend first allocating on a
physico-chemical basis (if sufficient information is available) and then
allocating the remaining flows on an economic basis.
At the end of the analysis, a sensitivity analysis shall be carried out if
several allocation procedures seem applicable (Guinee et al, 2002).
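Such a sensitivity check can be as simple as computing the allocation factors under each candidate basis and comparing the allocated results. The sketch below uses a joint dairy-style process with two co-products; all figures are invented for illustration.

```python
# Sensitivity sketch: allocate the emissions of one joint process to its
# two co-products by mass and, alternatively, by revenue, and compare.
# Product names and all figures are invented for illustration.

outputs = {
    "cream":     {"mass_kg": 10.0, "revenue": 40.0},
    "skim_milk": {"mass_kg": 90.0, "revenue": 60.0},
}
process_emission = 50.0  # kg CO2 for the joint process

def allocation_factors(key):
    """Normalised allocation factors for the chosen basis ('mass_kg' or
    'revenue'); factors always sum to 1."""
    total = sum(o[key] for o in outputs.values())
    return {name: o[key] / total for name, o in outputs.items()}

results = {}
for basis in ("mass_kg", "revenue"):
    factors = allocation_factors(basis)
    results[basis] = {n: process_emission * f for n, f in factors.items()}
```

With these figures, cream carries 5 kg CO2 under mass allocation but 20 kg under economic allocation, a factor-of-four difference that would clearly have to be reported in the interpretation phase.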
If allocation methods are applied, the allocation procedures as described
below may be considered as an order of preference (as suggested by Guinee
et al, 2002):
Physico-chemical allocation
Wherever allocation cannot be avoided, an attempt should be made to
partition the inputs and outputs of the system between its different products or
functions in such a way that it reflects the underlying physical relationships
between them. The physico-chemical relationships shall reflect how the
inputs and outputs are changed within the multi-functional process in order to
deliver the product or function (ISO 14044, 2006). However, allocation based
on physical relationships (step 2 of the ISO procedure) is part of modelling if
these relationships are specified (Guinée et al, 2002; see Section 2.2.4).
Physico-chemical allocation seems to be the preferred approach, if allocation
procedures are being applied (Guinée et al, 2004; Feitz et al, 2007).
However, this complex type of allocation requires extensive data and is labour
intensive. Very detailed data of one production site or – even better – of
several sites have to be collected and further processed in an iterative
manner (see Section 2.2.4). Feitz et al (2007) have estimated that the results
based on physico-chemical allocation differ significantly from any other type
of allocation. Feitz et al (2007) recommend that such physico-chemical
allocation matrices may be developed for other industrial sectors which do
have similar production processes; for example, agriculture (e.g. the meat
industry); construction (e.g. sand, gravel and other construction materials);
mining (e.g. gold and lead) and petrochemical industries (e.g. automotive
fuels).
However, real physical causalities have only been found in waste management
(Guinée et al, 2004).
Economic allocation
Economic allocation is based on the same principles as used in managerial
cost accounting; see Huppes (1992) for an explanation of the similarity in
underlying principles. There are several names for roughly the same methods,
with the Gross Sales Value method being the most widely used. There is some
similarity with the ISO steps, in that first the processes involved in the
firm are dissected into parts which are not joint or combined at all (as in
the compression and storage of Cl2 in the joint production of caustic soda
and chlorine). Allocation only applies to the remaining part of the
production process.
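The two-stage logic just described, first assign burdens of dedicated sub-processes directly, then allocate only the truly joint remainder by revenue share, can be sketched as follows. All emission and revenue figures are invented for illustration.

```python
# Gross Sales Value allocation with prior dissection: burdens of
# sub-processes serving a single product (e.g. Cl2 compression and storage)
# are assigned directly; only the joint remainder is allocated by revenue.
# All numbers are invented for illustration.

joint_emission = 100.0                    # kg CO2 of the joint process
dedicated_emission = {"chlorine": 10.0}   # e.g. Cl2 compression and storage

revenue = {"caustic_soda": 70.0, "chlorine": 30.0}
total_revenue = sum(revenue.values())

# Allocate the joint part by revenue share, then add dedicated burdens.
allocated = {p: joint_emission * r / total_revenue
             for p, r in revenue.items()}
for product, emission in dedicated_emission.items():
    allocated[product] += emission
```

Here caustic soda receives 70 kg CO2 of the joint process, while chlorine receives 30 kg plus the 10 kg of its dedicated compression and storage step.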
The application of economic allocation should be straightforward, if relevant
economic information is available (see Guinée et al (2004) for details).
Guinée et al (2004) recommend economic allocation as a baseline method for
most detailed LCA applications, because it seems the only generally
applicable method. This avoids the problem that differences between
alternatives are caused by the different allocation methods applied instead
of being due to the underlying reality.
This position may seem to go against the ISO 14041 (now 14044)
recommendation that allocation should preferentially be done on the basis of
physical relationships. In exceptional cases, the physical relations may be
relevant and then may precede economic allocation, as in the cadmium
emissions from waste incineration originating from nickel-cadmium batteries.
These cases seem restricted to situations where the processing of the input is
the function and the physical causality would go in the right direction. That is
the case in waste management only. No other instance of allocation based on
physical causality has been found yet.
However, economic allocation is susceptible to many types of uncertainty,
such as (locally) fluctuating prices, demand, inflation, tariffs and industry
subsidies etc. (see Guinée et al, 2002 and Feitz et al, 2007). Therefore,
economic (or market based) allocation is often viewed to be too volatile to be
practical (Curran, 2006). Such problems seem very similar to other data
problems, as with emissions: within one installation, these vary during the
day, week and seasons, are different for similar installations, are
different between regions, and show very strong dynamic tendencies during
the lifetime of installations.
There should be methodological clarity on which value or cost concept to
base the allocation procedure. If private cost and value concepts are applied,
prices should be market prices, including taxes, levies and subsidies.
There seem to be good reasons to do so, as such all-in prices reflect the
incentives for firms.
Allocation based on mass, energy content, molar flows and similar
relationships
Allocation based on mass, energy content and molar flows has been applied
due to its simplicity. These types of allocation may be considered as a
crude proxy for economic allocation (Guinée et al, 2004). However, Guinée et
al (2002) go even further: they discredit this approach for a lack of
justification (as already in Huppes and Schneider 1994), as there is no
causality involved; e.g. the mass of outputs cannot cause inputs by physical
causation.
As it is easily applicable, this type of allocation may be used in
attributional, non-comparative LCAs, where in some situations it may serve
as a proxy for economic allocation (see Weidema 2003).
2.3 Data quality, validation, uncertainty in LCA
Corresponding author: Andreas Ciroth, GreenDeltaTC, Berlin
2.3.1 Introduction
This section deals with the quality of an LCA and its "components", and
with measures to describe and assess that quality. The aim of the text is
not to provide a complete overview of the state of the art in this field,
nor to present an overall framework for data quality management in LCA; the
available time and space are simply not sufficient. Instead, the reader is
pointed to overview papers, e.g. by Heijungs, and to the work of the former
LCA data quality working group.
The aim of this text is rather to analyse existing approaches, and to
examine how far consensus and consistency exist among them. This shall lead
to recommendations.
While this text has its origin in the SETAC TF3, it was in the end the work
of the author, with major contributions from the co-authors of this paper.
Several reviewers, both from within and outside the original Task Force
group, provided very helpful comments.
Identifying consistencies is perhaps especially difficult in the data
quality and uncertainty field. In fact, many of the analysed papers agree on
only two things: first, there is broad criticism of inconsistent
nomenclature and the different uses of important terms such as uncertainty,
and of a general infancy of the methodology (interestingly, this statement
can be found in papers from 1996 to 2005); second, there is consensus that
uncertainty assessment should be applied broadly, and that this is not yet
the case.
However, the situation has improved in recent years. Data quality
assessment for datasets is indeed applied in commonly used LCI databases,
and Monte Carlo simulation and a pedigree approach that quantifies
qualitative assessment information have seen broad application.
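The combination of the two techniques just mentioned can be sketched briefly: pedigree-style data quality scores are typically translated into geometric standard deviations of lognormal distributions, which are then propagated by Monte Carlo simulation. The flows, medians and spreads below are invented for illustration.

```python
import math
import random
import statistics

# Monte Carlo sketch of uncertainty propagation through a trivial inventory:
# two emission flows with lognormal uncertainty, whose geometric standard
# deviations (gsd) might be derived from pedigree-style data quality scores.
# All numbers are invented for illustration.

random.seed(42)

flows = [
    {"median": 2.0, "gsd": 1.2},  # kg CO2, well-documented flow
    {"median": 5.0, "gsd": 2.0},  # kg CO2, poorly documented flow
]

def draw(flow):
    """One lognormal draw: ln(x) ~ Normal(ln(median), ln(gsd))."""
    return random.lognormvariate(math.log(flow["median"]),
                                 math.log(flow["gsd"]))

# Propagate by summing independent draws of both flows.
samples = sorted(sum(draw(f) for f in flows) for _ in range(10_000))
median = statistics.median(samples)
p2_5, p97_5 = samples[250], samples[9750]  # rough 95 % interval
```

The resulting percentile interval communicates the spread of the total score instead of a single point value, which is exactly what broad uncertainty assessment asks for.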
2.3.1.1 How verification, validation, uncertainty, and data quality – and a good decision – are related: an LCA study from cradle to grave
This chapter will treat data quality, validation and uncertainty: topics
that are closely related, but also different. And while they are all
relevant for the methodology and practical conduct of Life Cycle
Assessments (LCAs), they refer to different aspects of an LCA. This has
consequences for their recommended application.
In order to be able to determine a recommended application of data quality,
validation and uncertainty, it is thus of value to consider how each of
these terms relates to an LCA study. This analysis will further indicate
that other terms need to be discussed as well, among them verification and
“a good decision”. Verification will be distinguished from validation, and
clarifying what “a good decision” is will enable us to more clearly define
an overall goal for the application of LCA.
Let’s start at a very basic level: an LCA can be seen as a model with input
and output data. In that sense, input data lead to output data “by being
fed through the model” (van den Berg et al. 1999, p. 4; see Figure 9).
Figure 9: “Input data leading to output data by being fed through the
model” (van den Berg et al. 1999, p. 4)
This basic picture neglects several relevant aspects, among them the
application context. This is also recognised by van den Berg and colleagues,
who propose the picture as a starting point. In an application context, output
or results of the LCA will lead, together with a level of confidence “assigned”
to them, to a decision or choice, and these, in turn, will have effects in reality
(van den Berg et al. 1999, p. 6; “results of LCA” is a synonym for output data,
here)21.
21 Van den Berg and colleagues aim to provide a quality assessment framework for LCA that is not
limited to input data but covers also impact assessment data, the quality of the overall model, and
“circumstantial evidence from a broader quality perspective” (p. 2). In order to do so, they adapt the
NUSAP scheme developed by Funtowicz and Ravetz (1990) to LCA, but find the “generic” scheme not
applicable because they think LCAs are too complex, on the one side, and the NUSAP scheme too
complicated to apply and rather for more elaborated decisions than LCAs should support, on the other
side. In their scheme, the overall quality is determined, in principle, by spread, validity, and “pedigree”;
spread and validity concern both LCA model and data employed in the model, and pedigree procedural
aspects. Spread, e.g., being addressed, inter alia, by reproducibility of computation (for model) and by
uncertainty and variability (for data), validity, e.g., for data by representativity, and procedural aspects
covering whether quality assurance procedures, such as sensitivity analyses, have been performed.
Unfortunately, here is no room to discuss this interesting source more in detail; two final remarks shall
suffice instead:
First, van den Berg et al. make a fundamental distinction between model and data. They state that
models, for LCAs, describe the logical and computational structure of LCA (p. 4). They do not define
data (and also do not define but rather describe their understanding of what a model is), but state with
reference to Figure 9 that “results are […] determined entirely by the combination of input data and
model”, and, further, that they “treat model parameters here as data” (p. 4). I follow their distinction in
this paper.
Second, van den Berg does not distinguish between “result” and “output data”, treating both as
synonyms (“[…] in order to generate the output data, i.e. the results”). In that sense, “Results of LCA” in
Figure 10 is synonymous with calculation results, or “output data”.
Figure 10: The LCA model results together with their perceived quality
influence the choices inspired by the model; and these, in turn, are the
practical effects of the LCA model (van den Berg et al. 1999, p. 4)
However, this is still not the full picture. In addition, there will be
someone who listens to the LCA results and draws conclusions from the study
that have effects in real life. Let’s call this person the decision maker.
Then, there is also the goal and scope of the study. Quite often, the
decision maker influences the goal and scope of the study, e.g. as
commissioner of an LCA study. Naturally, there are often several decision
makers in an LCA study, often with a complicated relation amongst them
(commissioner; environmental department of a company; top management of a
company; industry association involved in a study). These multiple decision
makers are, for the sake of simplicity, not considered here.
Taking all this into account, we have the following stages in the conduct of an
LCA (Figure 11):
(1) specification of the scope of the analysis;
(2) input data specification and collection;
(3) calculation of the LCA study;
(4) obtaining the result of the study as output;
(5) interpretation, and perception of the result by the audience, decision
makers;
(6) decision / action taken or initiated by the decision maker.
In the first stage, the scope of the analysis shall influence the input
data considered in the LCA as well as the LCA calculation. Output data or
results follow directly from the calculation (4), without further influence
by goal and scope22. The result is then perceived and interpreted by one or
several decision makers, who then decide, drawing also on other information
sources. Only the latter decisions have an effect in reality. Please note,
however, that the decision ‘do nothing / do not change anything’ is an
effect as well23.
Data quality aspects are not shown at all in the figure. Also, the “chain of
analysis” shows only one feedback loop, from decision maker to goal and
scope. In practical applications, there might be additional ones, e.g. from
output to input data (when one discovers the impact of poor input data on the
calculation result one should go for better input data), or even from result to
goal and scope. LCA is not for nothing known as an iterative procedure.
Thus, although this figure cannot claim to provide the “final full picture”, it
contains the complete chain of analysis, from scope and “cognition interest” to
the effects of the decision. One could say that it contains an LCA study from
cradle (cognition interest) to grave (decision and effects). Note that
stages 2, 3 and 4 are very similar to the points input data, LCA model, and
LCA results proposed by van den Berg et al.; thus the figure can be seen as
an extension of their starting point.
This chain will provide structure for the following analysis, which will link data
quality, validation, and uncertainty to the different stages.
[Figure: six boxes in sequence, (1) LCA scope, (2) input data, (3)
calculation, (4) result, (5) perception / interpretation, (6) decision /
action, with the decision maker(s) behind stages 5 and 6 and the decision
leading to effects]
Figure 11: Six stages from scope to the effects of a decision supported
by LCA
As an intermediate conclusion, one can say that:
• Six stages in conducting a Life Cycle Assessment can be identified, from
the specification of goal and scope, over the LCA in a narrow sense with
input data and model calculation, to the perception and interpretation of
the result by a decision maker. Finally, a decision of the decision maker
leads to effects “in the real world”, where “do nothing” is also a decision
that has effects.
22 Following van den Berg, result and output data are synonyms; one might argue, however, that results
of a study are also effects following decisions taken later, when the calculated data from the study is
perceived.
23 See e.g. Watzlawick, Weakland, Fish (1974) for a discussion.
• Linking LCA into the overall decision context seems to deserve further
analysis.
• A top-down consideration of LCA methodology, starting from effects in the
real world and from characteristics of a good decision, seems promising for
discerning recommended from other approaches, especially in the context of
data quality and uncertainty. However, little research exists in this field.
• A consequence of the latter point is that analysis of how to provide good
decision support by an “improved” LCA should not stop at the model result
stage (nr. 4), but should consider how the result is perceived, and how
decision makers react when perceiving the result.
2.3.1.2 Framing of terms
2.3.1.2.1 “A good decision”
In the end, LCA is about decision support. In this sense, all the different
modelling choices that need to be made when performing an LCA, all the
different stages in the analysis as shown in Figure 11, shall support ‘a
good decision’. And what is a good decision?
Dietz provides six criteria:
“What constitutes a good decision about the environment? […] Six
criteria for evaluating environmental decisions are suggested: human
and environmental well-being, competence about facts and values,
fairness in process and outcome, a reliance on human strengths rather
than weaknesses, the opportunity to learn and efficiency.” (Dietz 2003)
These criteria are not yet directly connected to LCA, although there are
obvious links from human and environmental well-being to the impact
categories and endpoints addressed in LCAs, and from competence about facts
to the quantitative and qualitative figures provided by any LCA study. Yet,
these links need to be explored in more detail. For this paper, these
criteria rather provide a glimpse of what in the end could be relevant for
discerning between recommended and other approaches.
Recent work on LCA and decision analysis (e.g. Seppälä (1999), Lundie
(1999)) has concentrated more on the technical aspects of decision theory.
Possible consequences of uncertain results for “a good decision” are also
interesting, recognising that decisions under uncertainty are a common
field of analysis in decision theory, where it has been found that people
often cannot cope rationally with uncertain situations (Kahneman and
Tversky (1986); von Neumann and Morgenstern (1944)). One of the few
examples in LCA is Lenzen (2006). He proposes statistical testing for
addressing uncertainty in decisions, especially for impact assessment (and
external costing as well).
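In the spirit of such statistical testing (though not reproducing Lenzen's specific method), one might test whether two uncertain LCA scores actually differ before declaring a winner. The sketch below simulates two score samples with invented means and spreads and applies Welch's t statistic.

```python
import math
import random
import statistics

# Sketch of statistical testing of two uncertain LCA results: before
# declaring alternative A "better" than B, test whether their Monte Carlo
# score samples actually differ. Means and spreads are invented.

random.seed(1)
scores_a = [random.gauss(100.0, 15.0) for _ in range(1000)]
scores_b = [random.gauss(105.0, 15.0) for _ in range(1000)]

def welch_t(x, y):
    """Welch's t statistic for two independent samples."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    vx, vy = statistics.variance(x), statistics.variance(y)
    return (mx - my) / math.sqrt(vx / len(x) + vy / len(y))

t = welch_t(scores_a, scores_b)
# |t| well above ~2 suggests the difference is not explained by noise alone.
distinguishable = abs(t) > 2.0
```

If the samples were not statistically distinguishable, a conclusion such as "A performs better than B" would rest on noise rather than on the model.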
2.3.1.2.2 Verification and validation
Validation is frequently joined with verification (e.g., Rothenberg 1999);
however, both have a specific meaning.
Validation, you may recall, is
“the process of ascertaining that the model mimics the real system by
comparing the behaviour of the model to that of the real system in
which the system can be observed and altering the model to improve
its ability to represent the real system” (Biles 1996)
In other words, you check whether a built model is correct by comparing it to
the reality you attempted to model.
Verification, on the other hand, is the process of determining that a model
implementation accurately represents the developer's conceptual description
of the model and the solution to the model (e.g. [AIAA 1998]). Verification can
therefore be performed within a model, e.g. by checking whether the
calculation is mathematically correct, while validation requires checking the
model against reality, and against the goal and scope of the model, and cannot
be performed only within the model (Figure 12).
In software development, one might check whether the software calculates
correct results (verification) and whether users understand input and output of
the software (validation), and also whether the software works when it is
integrated in existing larger IT environments (validation, as well).
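The distinction can be sketched in code. The following toy example (all numbers and names are invented for illustration) verifies a trivial inventory calculation by re-computing it independently within the model, while the validation step remains a placeholder, since the empirical observation it would need is normally unavailable in LCA:

```python
# Verification vs. validation, sketched for a toy inventory calculation.
# All numbers and names are illustrative, not from any real study.

def co2_per_functional_unit(fuel_kg, co2_per_kg_fuel):
    """The 'model implementation' whose correctness we verify."""
    return fuel_kg * co2_per_kg_fuel

# --- Verification: check the implementation against the conceptual model,
# entirely inside the model (no real-world data needed).
expected = 0.0
for _ in range(4):                      # conceptual model: 4 equal process steps
    expected += 2.5 * 3.1               # 2.5 kg fuel each, 3.1 kg CO2 per kg fuel
computed = co2_per_functional_unit(4 * 2.5, 3.1)
assert abs(computed - expected) < 1e-9  # verification passes

# --- Validation: would compare the model result against measurements of the
# real system. For LCA such measurements are normally unavailable, so the
# comparison below is only a placeholder for what validation would need.
measured_total_co2 = None               # no empirical observation exists
if measured_total_co2 is not None:
    assert abs(computed - measured_total_co2) / measured_total_co2 < 0.1

print(computed)
```

The verification assertion runs entirely inside the model; the validation branch can never execute without data from outside it, which is exactly the situation described above.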
For LCAs, one might of course check calculation results and apply other
verification techniques. One can, however, hardly validate LCAs, because this
would require, e.g., comparing the impact assessment results of the LCA to
those of the product under study. Potential environmental impacts, as they
are, according to ISO 14040, the outcomes of the impact assessment of an LCA,
seem to exclude this already by definition (Ciroth and Becker 2006). One of
the tasks of the critical review is to check the conduct and result of an LCA
study against goal and scope; thus, the critical review has a validation task.
This, however, depends to a large degree on expert judgement.
Figure 12: Verification and validation for an LCA case study (Ciroth
2002, modified)
This situation is displayed in Figure 12. Reality gives reason to build an LCA
model, with a specific goal and scope. The LCA is then conducted, usually in
several stages, providing a result. Verification checks within the model, while
validation checks model and result against goal and scope (and reality, in a
broader sense).
2.3.1.2.3 Uncertainty and variability
The third issue to be discussed is uncertainty, in relation to variability. For
LCAs, uncertainty may be distinguished into three stages (Ciroth 2001; Ciroth
et al. 2004; Heijungs and Huijbregts 2004). These fit to the stages of the LCA
decision process, in Figure 11:
a) Uncertainty in input data (stage 2, Figure 11);
b) Uncertainty propagation and processing (stage 3);
c) Uncertainty in the calculation result and in the interpretation (stages 4
and 5).
Variability is often distinguished from uncertainty. Variability means changes
in real data, e.g. changes of temperature over the day or over the year. This
variability can, but need not, be reflected in measurement data. If these
changes are reflected, the measurement data will have a spread that is a sign
of good data quality (because the real values have this spread as well).
Non-variable changes, in contrast, add spread to measurement data due to
random measurement errors or other effects, which is then an indicator of
bad data quality.
Whether changes in data are attributed to variability or uncertainty often
depends on the definition and the “parameter resolution” of the measurement
procedure. For instance, emission flows in residual water of a landfill will
change with the actual weather (temperature and rain). They will therefore
change over several days. If, for a larger sample, a landfill is measured at
different times, and if weather conditions are not addressed in the
measurement procedure (and not taken into account when analysing the
data), then different emission flow values will add to the spread of data in the
larger sample that is not further "explained" by the measurement procedure,
and will most likely be treated as uncertainty in data.
2.3.1.2.4 Data quality
Data quality is a very fundamental and general term, defined by ISO as
"characteristic of data that bears on their ability to satisfy stated
requirements" (ISO 14040). For an LCA, a data set (or datum) might not meet
requirements if it is, e.g., too old, from a completely unrepresentative location
or region, and so on. The relevance of the data set for the study will influence
the thresholds for these criteria.
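Such a requirements check is easy to make operational. The sketch below (hypothetical fields and thresholds) flags a data set that is too old or from an unrepresentative region:

```python
# Checking a data set against stated data quality requirements -- a sketch
# with hypothetical fields and thresholds, not a standardised procedure.

def meets_requirements(dataset, reference_year, allowed_regions,
                       max_age_years=10):
    """Return a list of violated requirements (empty list = acceptable)."""
    problems = []
    if reference_year - dataset["year"] > max_age_years:
        problems.append("data too old")
    if dataset["region"] not in allowed_regions:
        problems.append("unrepresentative region")
    return problems

# A dataset that fails both criteria for a hypothetical 2007 study scoped to
# German / European electricity.
dataset = {"name": "electricity mix", "year": 1994, "region": "JP"}
issues = meets_requirements(dataset, reference_year=2007,
                            allowed_regions={"DE", "EU"})
print(issues)
```

The `max_age_years` threshold would be tightened or relaxed depending on the relevance of the data set for the study, as noted above.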
Data quality is frequently specified in data quality indicators (DQIs). Both
verification and validation check data quality (as one among other aspects of
an LCA model). Uncertainty in quantitative data may be one data quality
indicator.
Transparency assures that data and modelling choices are accessible for
verification and validation; in this sense, transparency enables more refined
data quality assessment.
Data quality applies to any of the LCA study 'stages' presented in Figure 11,
relating them to the goal and scope (which specifies, more or less concretely,
the requirements).
2.3.1.3 Why care?
The question of data quality in Life Cycle Assessment is difficult for several
reasons. Crucial among those reasons seems the fact that there is no
empirical validation of LCA results today.
Thus, distinct from many other questions, like:
• the best way to build a ship to win the America's Cup;
• the best way to run a marketing campaign for a shampoo for middle-aged
male businessmen in Germany;
• whether margarine or butter keeps your cholesterol lower,
a question like "Is diesel from North Sea oil or diesel from rapeseed more
environment-friendly?" is a one-way street. An LCA practitioner can and will
use the best available process data and impact models, but the final result
provided in the study he or she conducts will not be checked by information
from empirical measurements coming back from the end of the road.
The desire to conduct a sound study will in this situation put much emphasis
on selecting the "right" approach (in setting up the LCA model), where "right"
means: conforming to the standard and to conventions. Applying a surprising,
novel approach will raise concerns.
In this situation, providing uncertainties together with quantitative data may
appear unnecessary, a complication of a model already complicated enough.
To gain acceptance, an LCA model will most likely need approval by
experts (the peer review committee and perhaps other stakeholders
involved). Providing uncertainty information in this case seems rather
inefficient.
However, it seems strange to assume that uncertainty in LCA is not
relevant without having looked into it, and without validation that could
otherwise point to ill-specified modelling and analysis. Therefore, in principle,
an uncertainty assessment is essential if the uncertainty in a comparative
assertion is biased, e.g. if one compares highly uncertain with relatively
certain data.
Table 1 shows that LCA application is not always in line with an ideal,
empirically based, scientific approach, mainly due to a lack of empirical
testing and a resulting lack of an adequate consideration of uncertainty. This
seems a drawback of the method.
Table 1: Features of science, policy analysis and life cycle
assessment in comparison

Feature of science (Morgan and Henrion 1990, p. 22) | Features of policy analysis (Morgan and Henrion 1990, p. 22) | Features of Life Cycle Assessments (LCAs)
Empirical testing | Testing often impractical | Empirical testing only for single elements of LCAs, if any (e.g. impact assessment models may have been tested empirically); no empirical tests for the entire LCA model
Full documentation | Documentation typically inadequate | Full documentation (desired)
Reporting of uncertainty | Uncertainty usually incomplete or missing | Uncertainty usually incomplete or missing; in part a "gut-feeling" estimation of uncertainty
Peer review | Review not standard and in some cases arduous | Review is standard for comparative assertions; review for single datasets under development
Open debate | Debate hindered by the above problems | Open debate (sometimes hindered by confidentiality and data issues)
Conclusions:
• The overall rationale for uncertainty in Life Cycle Assessment in decision
support is: address uncertainty if it is relevant for the decision at stake.
• Uncertainty is relevant if it is high, or if it is relatively higher in one
alternative than in the other. This question of the relative importance of
uncertainty has barely been addressed so far.
• Validation is barely used for LCAs today. This has the somewhat
surprising effect that the specific result of the LCA is of minor importance
compared to the selected approach and to agreement among
stakeholders. Seeking possible entry points for a validation would be of
merit, and would turn Life Cycle Assessment modelling into a more
scientific approach.
2.3.2 Uncertainty
2.3.2.1 Input uncertainty
Input uncertainty comprises the stages 1 and 2 in Figure 11.
How to get information about the uncertainty of model parameters, and how
to model these uncertainties for the LCA model, are questions on the input
side. Heijungs and Huijbregts (2004) distinguish between parameter variation,
sampling methods and analytical methods.
Parameter variation is done either by varying only one parameter at a time, or
by building scenarios that bundle several parameter variations in a consistent
way.
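The one-at-a-time variant can be sketched as follows; the model, the parameter names and the ±10 % variation range are purely illustrative:

```python
# One-at-a-time (OAT) parameter variation, sketched for a toy LCA result.
# Model, parameter names and values are illustrative only.

def lca_result(params):
    """Toy model: a CO2 result as a function of a few inventory parameters."""
    return (params["electricity_kwh"] * params["co2_per_kwh"]
            + params["transport_tkm"] * params["co2_per_tkm"])

baseline = {
    "electricity_kwh": 40.0, "co2_per_kwh": 0.5,
    "transport_tkm": 120.0,  "co2_per_tkm": 0.06,
}

# Vary each parameter by +/-10 % while holding the others at baseline.
effects = {}
for name, value in baseline.items():
    results = []
    for factor in (0.9, 1.1):
        varied = dict(baseline, **{name: value * factor})
        results.append(lca_result(varied))
    effects[name] = max(results) - min(results)

# Rank parameters by the spread they induce in the result.
for name, spread in sorted(effects.items(), key=lambda kv: -kv[1]):
    print(f"{name:16s} result spread: {spread:.2f}")
```

Scenario analysis would instead change several of these parameters together in a consistent way, e.g. a "green electricity" scenario varying both the mix and the emission factor.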
Sampling methods: Statistical sampling methods seem most appropriate for
collecting uncertainty information; however, they are rarely described in the
LCA field. An exception is Cascade (2003). In principle, sampling methods
that are in use in other scientific fields (Cochran 1977), e.g. for waste data
collection and analysis (Argus 2003), seem applicable also for LCAs.
Analytical methods comprise, according to Heijungs and Huijbregts, an
estimation of the parameters of probability distributions. Since this is also
done in most statistical sampling methods (one will rarely feed every single
collected datum into the model), they may be seen as a sub-form of sampling
methods.
Weidema et al. (2003) propose a strategy for reducing (input) uncertainty in
the data collection. The strategy consists, very basically, in reducing the
uncertainty where it has (or is assumed to have) the highest impact, if
possible; if that is not possible, one should try to reduce the uncertainty at
other places with a lower impact. Weidema et al. propose a hierarchy of
uncertainty impacts: first the process, then the process technology, then
market boundaries, and so on.
Conclusions:
• A data collection strategy can direct scarce data collection resources
towards weak points and hot spots in a given LCA system and thus helps
to improve data quality in an efficient manner;
• Existing proposals for a data collection strategy are rather straightforward,
and could be followed without running into major methodical problems;
• There is little guidance on how to estimate input uncertainties for LCAs;
this is a pity because further steps in uncertainty assessment build upon
the uncertainty in input data;
• Uncertainty is relevant if it is high, especially if it is relatively higher in one
alternative than in the other. This question of the relative importance of
uncertainty has barely been addressed so far;
• Statistical sampling methods seem worthwhile pursuing; they are widely in
use in other fields but are, due to high resource demands, restricted to
selected, important / sensitive data sets.
2.3.2.2 Uncertainty Propagation
Uncertainty propagation is relevant for stage 3 in Figure 11 [24]. While input
uncertainty "just propagates" through an LCA, it is important to specify this
propagation effect in order to be able to quantify the uncertainty in the output
of the LCA, i.e. in the result. To do so, a number of approaches exist, Monte
Carlo simulation and approximation formulas being the most prominent.
Monte Carlo Simulation
A Monte Carlo simulation varies the input data of a calculation according to a
given probability distribution, runs the calculation for each input value, and
stores the outcome / output data of the calculation. This procedure must be
repeated often enough (typically several thousand times) in order to achieve a
smooth output probability distribution (Vose 1996).
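In code, the procedure reduces to a simple loop. The sketch below (toy two-parameter model, assumed normal input distributions) propagates input uncertainty with Python's standard library:

```python
import random
import statistics

random.seed(42)  # reproducible illustration

def lca_result(fuel_kg, co2_per_kg):
    """Toy model: the deterministic LCA calculation to be wrapped."""
    return fuel_kg * co2_per_kg

# Input probability distributions (assumed, for illustration only):
# fuel use ~ Normal(10, 0.5), emission factor ~ Normal(3.1, 0.2).
N = 10_000
outcomes = []
for _ in range(N):
    fuel = random.gauss(10.0, 0.5)
    factor = random.gauss(3.1, 0.2)
    outcomes.append(lca_result(fuel, factor))

# Summary of the resulting output probability distribution.
mean = statistics.mean(outcomes)
stdev = statistics.stdev(outcomes)
print(f"mean = {mean:.2f}, stdev = {stdev:.2f}")
```

The full list of outcomes can also be plotted as a histogram, which is the common way of presenting Monte Carlo results discussed in section 2.3.2.3.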
Other approaches
Other approaches for calculating or processing the uncertainty within the LCA
model, which have been used for LCAs so far, include interval calculation
(LeTéno 1999), fuzzy logic approaches (e.g. Pohl and Ros 1996; Ros 1998),
Gaussian error propagation formulas (Heijungs 1996), and higher-order error
propagation formulas (Ciroth 2001; Ciroth et al. 2004).
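As an illustration of the Gaussian (first-order) error propagation idea, the following sketch applies the formula var(y) ≈ Σ (∂f/∂x_i)² · var(x_i) to a toy model of the same kind, estimating the partial derivatives by finite differences; all input values are assumed:

```python
import math

def lca_result(fuel_kg, co2_per_kg):
    """Toy deterministic LCA calculation (illustrative only)."""
    return fuel_kg * co2_per_kg

# Means and standard deviations of the inputs (assumed values).
means = (10.0, 3.1)
sigmas = (0.5, 0.2)

# First-order Gaussian error propagation:
#   var(y) ~= sum_i (df/dx_i)^2 * var(x_i),
# with partial derivatives estimated by central finite differences.
h = 1e-6
var_y = 0.0
for i, (m, s) in enumerate(zip(means, sigmas)):
    hi = list(means); hi[i] += h
    lo = list(means); lo[i] -= h
    dfdx = (lca_result(*hi) - lca_result(*lo)) / (2 * h)
    var_y += dfdx ** 2 * s ** 2

print(f"propagated stdev = {math.sqrt(var_y):.3f}")
```

In contrast to Monte Carlo simulation, this yields only an estimate of the standard deviation in a single calculation, not the full output distribution, which is exactly the trade-off listed in the conclusions below.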
Combinations
In some cases, different approaches for assessing uncertainty propagation
can be combined. E.g., approximation formulas and Monte Carlo simulation
may be combined, and used alternately at different parts of the calculation; in
this case, approximation formulas provide estimates for probability distribution
parameters that are used in the simulation, and the statistical moments (like
mean and standard deviation) from the simulation's result are used as input
for the approximation formulas (Ciroth 2001; Ciroth et al. 2004).

[24] Strictly speaking, uncertainty propagation will happen also at other stages, e.g. in
stages 4 to 6, but this aspect has not been dealt with in LCA so far and will be work for the future.
Conclusions:
• The different approaches have the following advantages, disadvantages,
and further characteristics when it comes to practical application.
Monte Carlo simulation (Ciroth 2003):
+ is able to give a good estimate of the uncertainty, in most cases
+ easy to apply for non-looped systems (no complicated mathematics
or terminology involved)
- high demands on time and computer resources
- problems with loops in the simulated system (it may happen that
loops, such as recycling loops in a product system, do not converge
any more; in these cases the produced uncertainty is extremely
high)
O more information in the outcome (the exact probability distribution
of the analysed parameter instead of a single uncertainty
parameter) / more information needed to start (the input probability
distribution needs to be specified)

Approximation formulas (Ciroth 2003):
+ give a result in one single calculation: low demands on time and
computer resources
+ better able to deal with loops in the product system
O can in many cases provide an estimate as good as those of Monte
Carlo simulations
O calculate only an estimate for the standard deviation and do not give
the exact probability distribution / do not need a specified probability
distribution to start
- provide only an approximation of the uncertainty
- before any application, the approximation formulas need to be
formulated and implemented in the LCA calculation software

Other approaches:
O frequently, the attempt is not to produce an estimate for the
uncertainty as a probability distribution, or a moment of a probability
distribution, but rather a 'proxy' that indicates the uncertainty,
possibly in an easy-to-understand way (e.g., in fuzzy logic, the
uncertainty may be fuzzified as 'high', 'medium' and 'low' (Bandemer
and Gottwald 1993))
- special terminology, e.g. in fuzzy logic applications
- before any application, specific formulas need to be formulated and
implemented in the LCA calculation software
• Monte Carlo simulation will often be the method of choice in practical
cases; however, the decision which approach to apply will depend on the
specific case, taking into account the advantages and disadvantages
described above.
2.3.2.3 Output Uncertainty
Output uncertainty includes, for one, the presentation of the
uncertainty that comes with the calculated results from the LCA model (stage 4
in Figure 11); it further includes methods to consider the uncertainty in a
sound manner in the decision-making process.
For the graphical presentation of output uncertainty, not much has been done
so far, for LCAs:
“In combination with parameter variation, one often sees the
consecutive presentation of tables and/or graphs for the different sets
of parameters or scenarios”
(Heijungs and Huijbregts 2004).
Monte Carlo simulation results are commonly depicted as histograms or
box-whisker plots, known from statistics; box-whisker plots commonly include an
indication of confidence intervals.
In statistics, a large number of approaches have been developed with the aim
of distinguishing options or scenarios of different uncertainties (e.g. Backhaus
1994):
- Analyses of variance (ANOVA) in various flavours;
- t-tests and other tests.
All of these have not yet been fully explored for the LCA field, and one is
tempted to suggest that there is a certain reluctance in the LCA field to
apply quantitative statistics.
In a discussion on possibilities and drawbacks of a Monte Carlo Simulation,
Heijungs and Kleijn state that
“One may even produce test statistics for […] difference with another
product alternative (the t test) […] A problem is, of course, that more
statistics means more pages of output, and that interpretation should
provide a help rather than a bunch of pages filled with statistical
information.” (Heijungs and Kleijn 2001).
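To make the t-test suggestion concrete, the following sketch computes Welch's t statistic for two hypothetical Monte Carlo output samples using only the standard library (the samples and their distributions are invented):

```python
import math
import random
import statistics

random.seed(1)  # reproducible illustration

# Hypothetical Monte Carlo output samples for two product alternatives.
alt_a = [random.gauss(27.0, 2.0) for _ in range(1000)]
alt_b = [random.gauss(28.5, 2.5) for _ in range(1000)]

# Welch's t statistic (two samples, unequal variances). With ~1000 runs per
# alternative, |t| well above ~2 indicates a difference that is unlikely to
# be explained by simulation noise alone.
ma, mb = statistics.mean(alt_a), statistics.mean(alt_b)
va, vb = statistics.variance(alt_a), statistics.variance(alt_b)
t = (ma - mb) / math.sqrt(va / len(alt_a) + vb / len(alt_b))

print(f"mean A = {ma:.2f}, mean B = {mb:.2f}, Welch t = {t:.1f}")
```

A single statistic of this kind is exactly the sort of condensed interpretation aid that spares decision makers 'a bunch of pages' of raw statistical output.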
Of course, generating meaningful, sound aggregates and conclusions from
statistics is an issue that generally needs statistical expertise, as does the
design and conduct of any statistical analysis; decision makers should not be
confronted with 'a bunch of pages'. This seems, however, an issue that other
fields of science and application have solved before.
Conclusions:
• There seems today a certain reluctance in the LCA field towards the
application of statistics. This reluctance is well motivated: lack of reliable
information about input data uncertainty, and lack of attempts at an
empirical validation of LCA results, make statistical analysis today
somewhat meaningless and even inefficient.
• However, the lack of empirical validation weakens the scientific validity of
LCAs. Thus a search for means to validate LCAs, besides pure expert
judgement, seems of prime relevance; validation will, in turn, put more
importance on statistical analysis and uncertainty assessment, and thus
have an indirect impact on input uncertainty specification and on the
interpretation of output uncertainties.
• A knowledge transfer from multivariate statistics and test theory to the
LCA field seems of value.
2.3.2.4 General Approaches covering different types of uncertainty
Huijbregts et al. (2001) propose a scheme for analysing uncertainty (they
speak of data inaccuracy) in the inventory; the scheme basically tries to
identify the parameters relevant for the uncertainty (in the result), and to
perform a detailed Monte Carlo analysis only for these. In a bit more detail,
the scheme consists of specifying model input parameters and their
uncertainty; performing a sensitivity analysis to identify potentially important
parameters; and then, in two steps, estimating the detailed uncertainty
probability distributions of those parameters and performing a Monte Carlo
simulation only for those. Any parameter that contributes heavily, in the
Monte Carlo simulation, to uncertainty in the result should be replaced by
more reliable, less uncertain parameter values (Figure 13). Note that
Huijbregts et al. use the pair 'inaccuracy' and 'lack of data' instead of the
usual pair inaccuracy and imprecision / uncertainty (Morgan and Henrion
1990; Bevington 1992), and subsume imprecision under inaccuracy: "Data
inaccuracy may be caused by imprecise measurement methods […]"
(Huijbregts et al. 2001, p. 130).
Figure 13: Scheme for the analysis of ‘data inaccuracy’ in LCI
(Huijbregts et al. 2001, p. 130; screenshot from the original source)
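The screening logic of such a scheme can be sketched as follows; this is a loose paraphrase of the cited two-step idea, not the authors' actual procedure, and the model, the thresholds and the distributions are invented:

```python
import random
import statistics

random.seed(7)  # reproducible illustration

def model(p):
    """Toy inventory model (illustrative parameters and coefficients)."""
    return p["a"] * 5.0 + p["b"] * 0.1 + p["c"] * 0.01

baseline = {"a": 2.0, "b": 3.0, "c": 4.0}
sigma = {"a": 0.2, "b": 0.3, "c": 0.4}

# Step 1: cheap screening -- vary each parameter by one standard deviation
# and keep only those whose effect on the result exceeds a threshold.
base_result = model(baseline)
important = [
    name for name in baseline
    if abs(model(dict(baseline, **{name: baseline[name] + sigma[name]}))
           - base_result) > 0.05
]

# Step 2: detailed Monte Carlo only for the screened parameters;
# the unimportant ones stay fixed at their baseline values.
outcomes = []
for _ in range(5000):
    p = dict(baseline)
    for name in important:
        p[name] = random.gauss(baseline[name], sigma[name])
    outcomes.append(model(p))

print(f"important: {sorted(important)}, "
      f"result stdev: {statistics.stdev(outcomes):.3f}")
```

The efficiency gain comes from running the expensive simulation over a handful of parameters instead of all of them; the circularity warning below applies here, since the screening itself rests on guessed input uncertainties.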
Conclusions:
• Although this example is prominent and was broadly discussed, it is also
an example of the existing diversity in terminology. Harmonisation of terms
seems of value.
• Developing an efficient uncertainty analysis, as is the goal of this example,
is necessary to make uncertainty analysis more common.
• This is also an example of how strongly uncertainty analysis in LCA
needs validation. Otherwise, uncertainty analysis might run into circular
reasoning, e.g. if the uncertainty distribution is guessed and then the
results of the simulation (performed with the guessed distributions) are
interpreted, potentially refining the distribution again.
• There is, as stated above, little guidance so far on how to perform
validation for LCAs.
2.3.3 Data quality
Many indicators or measures of data quality have been proposed; the ISO
14040 series and the former SETAC working group on data quality are
important examples. However, one gets the impression that examples of
practical application of the more elaborate data quality indicators are not
very common.
2.3.3.1 Single Criteria or indicators
Two prominent examples of single data quality indicators will be presented
in more detail; Table 2 gives an overview of some of the many different
proposals and references.
The former SETAC working group on data quality defined several criteria for
an assessment of data quality (Braam et al. 2001):
• statistical representativeness of data
• age of data
• data collection method
• quantitative analysis of flows
• which processes are taken into account
• aggregation level for flows
• mass balance
• geographical representativeness
• temporal representativeness
• technological representativeness
• functional unit definition
• allocation rules
• uncertainty intervals specified
The ISO 14040 series states requirements for data quality, to be checked
during peer review:
• For all unit processes, the following general information shall be
recorded:
- the reference unit in relation to which the environmental
exchanges are calculated,
- what the data set includes (the beginning and the end of the unit
process, its function, and whether shut-down/start-up conditions
and emergency situations are included),
- geographical representativeness,
- the applied technology/the technological level,
- data relevant for the allocation of the environmental exchanges
among co-products,
- the period during which data has been collected,
- how data have been collected and how representative they are,
and the significance of possible exclusions and assumptions,
- the source of the data,
- the validation procedure used [25].
• Account shall be taken of the electricity generating mix, the combustion
efficiencies for the various fuel types, the conversion efficiencies of the
generating facilities and the transmission and distribution losses.
Assumptions used on the source of fuels and mix of electricity shall be
clearly stated and justified.
• Missing values and non-detectable data shall be reported as the best
estimate possible, e.g. based on unit processes employing similar
technology.
• If data does not meet the initial data requirements, this shall be stated.
CML (2001, pp 35) provides a long list of criteria for data quality, and many
other authors provide lists as well.
The ecoinvent database applies an interesting scheme for uncertainty
assessment of flows, which incorporates different data quality aspects. A
pedigree matrix is applied in order to assess geographical, technical and
temporal differences in data. Assessment results from the matrix are then
turned into quantitative uncertainty figures for the amounts of flows. In doing
so, a probability distribution is assumed for the flows.
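The principle of turning pedigree scores into quantitative uncertainty can be sketched as follows. Note that the score-to-factor mapping below is invented for illustration and does not reproduce the actual ecoinvent factor tables; only the general scheme (squared logarithms of the factors added as variances of an assumed lognormal distribution) follows the description above:

```python
import math

# Hypothetical mapping from pedigree scores (1 = very good ... 5 = bad) to
# multiplicative uncertainty factors. The real ecoinvent tables differ per
# indicator; this single mapping is illustrative only.
factor_for_score = {1: 1.00, 2: 1.05, 3: 1.10, 4: 1.20, 5: 1.50}

def geometric_sd(scores, basic_factor=1.05):
    """Combine pedigree scores into a geometric standard deviation,
    assuming a lognormal distribution for the flow amount: the squared
    logarithms of the individual factors are added as variances."""
    var_log = math.log(basic_factor) ** 2
    for score in scores:
        var_log += math.log(factor_for_score[score]) ** 2
    return math.exp(math.sqrt(var_log))

# Scores for reliability, completeness, temporal, geographical and
# technological correlation of one hypothetical data set:
gsd = geometric_sd([2, 1, 3, 2, 4])
print(f"geometric standard deviation: {gsd:.3f}")
```

The attraction of the scheme is visible here: a qualitative expert judgement per indicator is turned into one quantitative spread parameter for the flow.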
This example shows how closely uncertainty and data quality are related.
In the end, any data quality aspect can result in changes in the quantitative
figures that are the output of the LCA, or of elements of the LCA such as
flows for processes.
The following table summarises DQI proposals from the main literature
references (Ciroth and Srocka 2005). It distinguishes the "application level" of
a DQI: some indicators are meant to be used on the level of aggregated
process systems, while others address, e.g., material flows [26]. An "x" in the
table means that the indicator is fully acknowledged in the reference, while
"(x)" means that it is implicitly addressed.
The table shows a diversity of the proposed indicators and measures, yet it
also shows consensus. Many indicators are addressed in almost every
reference (time, region and technological representativity; consistency;
completeness). The recommended application level, though, is often different.
Consulting the individual references shows that the application
"recommendations" are often rather vague (e.g., from one source:
"representativeness, as a qualitative assessment of the degree to
which the dataset reflects the true population of interest (i.e.
geographical, time period and technology coverage)").

[25] Note that the ISO 14040 series defines validation not in the way commonly used in
modelling science, understanding it rather as verification; see section 2.3.4.
[26] An example: the "for all unit processes" in the above example from the ISO 14040
series is entered in the table at the level of processes.
Table 2: Comparison of proposed data quality indicators for Life Cycle
Assessment from various references (Ciroth and Srocka 2005; modified)
[Table: the rows are the data quality indicators — data source / data collection; completeness; statistical representativity of data; time-related representativity; regional representativity; technology-related representativity; consistency; reproducibility; level of aggregation — each assessed at the levels general, aggregated system, process and flow. The columns are the references [ISO 14040] / [ISO 14044], [Björklund 2002], [Braam et al. 2001], [Buchgeister et al. 2003], [ecoinvent 2004], [Guinée et al. 2001], [Schuurmanns 2003], [van den Berg et al. 1999], [Weidema Wesnaes 1996] and [Weidema 2004]. Matrix of "x" / "(x)" cell entries not reproduced here.]
The application of DQIs is, though, not always a self-explanatory exercise,
and needs guidance. It thus seems justified to state that practitioners do not
get much support in the application of DQIs today. Even worse, how trade-offs
between different indicators should be handled is rarely discussed. This
is further outlined in the next section.
Conclusions:
• There seems to be consensus, in the LCA literature, about many of the
data quality indicators to be used in the context of Life Cycle
Assessment.
• There seems to be far less consensus about the application of data
quality indicators. Practical guidance would be of value.
• How to deal with trade-offs between different indicators is rarely
discussed. Again, practical guidance would be of value.
2.3.3.2 Overall data quality assessment
Overall data quality means the assessment of “the” data quality of one
element of an LCA study as a combination of different indicators. One
element can be any element in an LCA study, so, e.g., either the whole study,
or a unit process, or a material flow, or the like.
While there is abundant material on single data quality indicators, literature on
a complete, overall assessment of data quality is rather scarce.
The above section concluded that trade-offs between different indicators are
rarely discussed, and that practical guidance in this matter would be of value.
The trade-off issue is complicated in practice because indicators are quite
often not independent. As an example, the former SETAC data quality
working group proposed 'age of data', which relates to 'temporal
representativeness' but also to 'technological representativeness'. Indicators
seem to represent different 'endpoints' where a data quality assessment
could start, rather than independent assessment criteria. An overall
quality assessment thus cannot simply combine all the different indicator
results, but will need to consider their dependencies.
Application is also hampered because the proposed indicator lists are
generally open lists. There is some guidance as to which indicators must be
considered, but less on "shall" and nice-to-have criteria for application
contexts more specific than "comparative assertions" and "other". This may
be justified by the early application stage, but it makes a comparison of
different studies difficult.
One of the few examples of a concise and closed list of indicators is the
well-known pedigree matrix by Weidema and Wesnæs (1996), inspired by the
NUSAP pedigree matrix concept by Funtowicz and Ravetz (1990).
The matrix consists of five data quality indicators (reliability of the source;
completeness; temporal correlation; geographical correlation; further
technological correlation), which are each evaluated, for a data set, on a
scale from 1 (very good) to 5 (bad). For each indicator, more specific
realisations are described (e.g., "data from area under study" produces a 1 in
geographical correlation). The authors deliberately refrain from aggregating
the numbers.
This matrix concept has been applied and modified by several authors.
Huijbregts et al. (2001) use only three of the data quality indicators, omitting
completeness and reliability because, in their view, these two may be better
addressed by quantitative assessments. Ciroth et al. (2002, p. 296) propose
to rephrase 'technological correlation' and 'geographical correlation' as
technological differences and geographical differences, because "to state a
difference, two data points are sufficient; to state a correlation, several data
points are required."
As with every pedigree matrix, the valuation is subject to change if another
person applies the matrix, and results may also change if the same person
applies the matrix at different times. Some expertise and time may also be
needed in order to make a good valuation. For LCAs, this has not yet been
analysed enough to draw conclusions.
In practice, a peer review performed by referees, either in parallel to a study
or after the initial study has finished, will (also) deal with an overall
assessment of the data quality of a study. This goes into the area of human
decision making and evaluation in LCA. Considerable work has been done in
this area (Seppälä 1997; Hertwich et al. 2001; Volkwein et al. 1996).
However, one seems far from understanding in detail the strengths and
weaknesses of human decision makers as peer reviewers in LCA. The next
paragraph will come back to this issue; the point merits more complete
exploration in another paper.
Conclusions:
• More guidance is needed for the application of data quality indicators.
• Guidance should be given on the selection of DQIs (must have – nice
to have – not necessary, for specific application cases) as well as on
the application of each selected indicator.
• Guidance should include rules for an assessment / evaluation of each
indicator as well as rules for interpreting the result, as a stand-alone
value and in combination with other indicator results.
• Guidance should include how to deal with qualitative information.
• The pedigree matrix concept seems an applicable, attractive concept;
it has the charm of combining human judgement and hard facts into
quantitative figures in a clear and transparent way.
• Considerable work has been done in the area of human decision
making and evaluation in LCA, yet one seems far from understanding
in detail the strengths and weaknesses of human decision makers as
peer reviewers.
Page 56
TF3 Methodological consistency
2.3.4 Verification and validation
Verification and validation are prime concerns for any modeller. The
verification process checks whether the model calculates its results in a
technically correct manner, while validation is concerned with whether the
model models what it should (see section 2.3.1.2.1).
Uncertainty and data quality are clearly lower-ranking concerns; it even
seems inefficient to care much about model uncertainties if the validation is
not specified. Validation and verification, and the empirical tests used in the
validation context, are key aspects of science27 (Ciroth and Becker 2006).
Despite this, verification and validation have not received much attention in
LCAs so far, though a number of different approaches exist; there is not yet a
consistent “framework”. Expert judgement seems most important for
validation. With the task to check the conduct and result of an LCA study
against its goal and scope, the peer review process for LCAs is a validation
task; it takes place, however, at the level of expert groups.
Standards in the ISO 14040 series mention “validation”, yet, somewhat
surprisingly, they restrict validation to data validity, specifically for the
inventory and for unit processes. For the overall LCA study, a check of the
unit processes is more a verification than a validation, see also Figure 12
(footnote 28). This seems comparable to building a boat from many
components, where some have been tested before use, but without the
possibility to test whether the complete boat will float; the only option is to
ask experts whether they think, on the basis of their expertise, that this boat
will float according to its specifications. There are enough examples in the
history of technical inventions where this approach has failed, i.e. where a
test of the newly developed product made experts revise their judgement29.
27
“3. What may be called the method of science consists in learning from our mistakes systematically:
first, by taking risks, by daring to make mistakes--that is, by boldly proposing new theories; and
secondly, by searching systematically for the mistakes we have made -- that is, by the critical
discussion and the critical examination of our theories.
4. Among the most important arguments that are used in this critical discussion are arguments from
experimental tests.” (Popper 1996, p. 94; 17 theses regarding scientific knowledge)
28
ISO 14041 (1998) states, in section 6.4.2 (Validation of data): “a check on data validity shall be
conducted during the process of data collection. Validation may involve establishing, for example, mass
balances […]”.
Very similarly, ISO 14044, section 4.3.3.2 (Validation of data): “A check on data validity shall be
conducted during the process of data collection […] validation may involve establishing, for example,
mass balances […]”. And section 5.2 (Additional requirements for third party reports), item d), life
cycle inventory analysis, lists: “5) validation of data: i) data quality assessment ii) treatment of missing
data.” Specification of the treatment of missing data is important and often overlooked, yet it is not a
validation of the whole LCA study in the sense of “the process of ascertaining that the model mimics
the real system” (though it is for the single unit process).
29
See also Popper’s statement above. Samuel Pierpont Langley, secretary of the Smithsonian Institution,
claimed shortly before the Wright brothers’ success to have constructed a technical apparatus that
would enable men to fly; he let his assistant, Charles Matthews Manly, fly this machine. This “proof of
concept” failed: the machine did not travel further than several metres, injuring the assistant.
As for verification, or validation on the unit process level, many different
procedures and tests exist. Many of them are applied today, e.g.:
• check of material balances;
• check of the overall mass balance;
• check of the energy balance;
• check of release factors;
• calculation with different software packages and with different algorithms.
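The first two checks can be sketched as a simple automated test: compare the total mass entering and leaving a unit process, and flag the dataset for expert review when the gap exceeds a tolerance. Flow names, amounts and the 5% tolerance below are purely illustrative.

```python
def mass_balance_gap(inputs_kg, outputs_kg, rel_tol=0.05):
    """Relative gap between total mass in and out; a gap above
    rel_tol flags the dataset for closer expert review."""
    total_in = sum(inputs_kg.values())
    total_out = sum(outputs_kg.values())
    gap = abs(total_in - total_out) / total_in
    return gap, gap > rel_tol

# Invented flows for a co-combustion unit process, per unit of output.
inputs = {"hard coal": 0.35, "wood chips": 0.05, "combustion air": 0.90}
outputs = {"flue gas": 1.17, "ashes": 0.04, "REA residue": 0.02}

gap, flagged = mass_balance_gap(inputs, outputs)
```

Here about 5% of the input mass is unaccounted for, so the dataset would be flagged; whether the gap is a documentation error or a missing flow is then a question for the reviewer, not the script.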
On the unit process level, elaborate procedures exist for verification,
validation and data quality assurance. As an example, the ecoinvent project
and database employed the following scheme (Figure 14).
Figure 14: Overview of the internal review and data quality control
within the ecoinvent project (Frischknecht and Jungbluth 2003, p. 54)
Note the separation of the reviewing person and the person who is
responsible for the creation and documentation of the data set. This
separation fosters an independent evaluation of the quality, which is highly
desirable.
In an ideal case, there are several independent referees judging the quality of
a dataset or any other element of an LCA study, up to an assessment of the
quality of the overall study. These experts shall, again ideally, provide
complementary expertise for the task (technical knowledge about the
process; methodological knowledge; practical knowledge about the plant, and
so on). Figure 15 tries to visualise this: general, technical and LCA-specific
aspects of a dataset are to be covered by the review panel; a technical
expert, an LCA expert and a general expert each take their “share” of these
aspects; as a result, all the aspects that come along with the dataset are covered.
Figure 15: Patchwork of expertise in the evaluation of the quality of an
LCA dataset (Ciroth et al. 2006, modified). [The figure shows the general,
technical and LCA-specific aspects of a process dataset being covered
jointly by a technical expert, an LCA expert and an additional expert.]
Information in LCAs is often quantitative. Quantitative, automated procedures
for verification are thus a logical option. They may be used on the level of
LCA databases or of single unit process data sets, or in the impact
assessment, and serve for the identification of hot spots that deserve further
attention by human expertise.
The relevance of quantitative verification procedures has rarely been
addressed so far; for example, they are barely used in a systematic manner
in the conduct of peer reviews. Their use in practical studies is rarely
mentioned, although many practitioners may use a variety of verification
procedures, based on their personal experience and IT environment.
Quantitative procedures seem a helpful, promising element in a sound
verification strategy for LCAs, to be used in synergistic addition to human
expertise.
Quantitative plausibility checks in this sense need not be complicated. As an
example, Figure 16 shows a check of the relative changes in prospective
datasets for wood co-combustion in a coal power plant, for datasets with
reference years 2000, 2010, 2020 and 2030. In each case, the relative
difference to the year 2000 is calculated. Processes are taken from the
ProBas database of the German EPA30. According to the data provider (not
the German EPA!), these processes have passed a review stage.
The calculations show, inter alia:
• a drastic increase of NOx emissions in 2010 compared to 2000;
• a reduction of these NOx emissions in the following years;
• a considerable decrease of dust emissions;
• an increase in the production of ashes by roughly 50% from 2000 to 2010;
• a reduction of many gaseous emissions by about 70% in 2030 (e.g. CO,
HF, N2O).
30
www.probas.umweltbundesamt.de
These results are not wrong or right per se, but some of them (e.g. the
increase of NOx emissions by roughly 90% from 2000 to 2010) simply raise
questions that should be answered in a subsequent scrutiny by human
experts.
Relative difference to the reference year 2000:

Exchange       Unit   2010       2020       2030
CH4            kg     -14.65%    -17.79%    -24.21%
CO             kg     -14.65%    -17.79%    -69.68%
HCl            kg     -14.65%    -17.79%    -62.10%
HF             kg     -14.65%    -17.79%    -69.68%
N2O            kg     -14.65%    -17.79%    -72.93%
NMVOC          kg     -57.32%    -58.90%    -62.10%
NOx            kg      92.05%     84.96%     13.69%
SO2            kg     -14.65%    -17.79%    -62.10%
Dust           kg     -29.48%    -32.08%    -35.49%
Steel          kg       0.00%      0.00%      0.00%
Water (mat.)   kg      -6.59%     -6.59%      0.00%
Cement         kg       0.00%      0.00%      0.00%
Electricity    TJ       0.00%      0.00%      0.00%
Ashes          kg      53.78%     48.11%     36.55%
REA-waste      kg     -14.65%    -17.79%    -20.63%

Figure 16: Example of a plausibility calculation, for a wood co-combustion
process in coal power plants with reference years 2000, 2010, 2020 and
2030; relative differences to the reference year 2000 (see also Ciroth et al.
2006)
Plausibility calculations can thus provide background material for experts
who have the task to judge the quality of the calculation in a review
procedure.
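A plausibility check of this kind needs only a few lines: compute the relative changes against the reference year and flag every exchange/year pair above a threshold for human review. The values below are taken from the Figure 16 example; the 50% threshold is an arbitrary, illustrative choice.

```python
# Relative changes versus the reference year 2000, from the Figure 16
# example (a selection of exchanges only).
changes = {
    "NOx":   {2010: 0.9205, 2020: 0.8496, 2030: 0.1369},
    "Dust":  {2010: -0.2948, 2020: -0.3208, 2030: -0.3549},
    "Ashes": {2010: 0.5378, 2020: 0.4811, 2030: 0.3655},
}

def flag_hot_spots(changes, threshold=0.5):
    """Return exchange/year pairs whose relative change exceeds the
    threshold; these are handed to human reviewers rather than being
    judged right or wrong automatically."""
    return [(name, year, diff)
            for name, by_year in changes.items()
            for year, diff in sorted(by_year.items())
            if abs(diff) > threshold]

hot_spots = flag_hot_spots(changes)
# Flags the NOx increases in 2010 and 2020 and the ash increase in 2010.
```

The script only points at conspicuous values; deciding whether a 90% NOx increase is a modelling assumption or a data error remains an expert task.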
Conclusions:
• Verification and validation are prime concerns for any modeller. The
verification process checks whether the model calculates results in a
technically correct manner, while validation is concerned with whether
the model models what it should.
• For LCAs, validation is today restricted to expert judgement and to the
validation of unit process datasets. A consistent, overall framework for
validation is lacking. This is a flaw, and it makes efforts at uncertainty
specification in the LCA model difficult, even ineffective.
• For the verification and validation of unit process datasets and of
inventories, quantitative procedures are helpful. They support human
expertise by pointing to “hot spots” in a data quality assessment and in
an overall review procedure. Although different experts and institutions
may have their in-house methods, plausibility calculations in this sense
are not yet applied in a generally accepted, systematic and routine
manner.
2.3.5 Concluding remarks
2.3.5.1 Impact Assessment
This section has concentrated on the inventory part of LCA. However, many
of the problems, and also some of the solutions, appear in LCIA as well. For
example, characterisation models face uncertainty in their input data,
uncertainty propagation in calculating characterisation factors and,
consequently, uncertainty in the calculated factors; some practitioners refuse
to address toxicity categories in their LCA case studies due to the
uncertainty and, more generally, the low data quality they assign to these
categories.
The relation between impact assessment and data quality has rarely been
investigated, and would well deserve a detailed analysis and treatment.
2.3.5.2 Towards Consensus
For many of the addressed topics, this paper has failed to provide
recommendations. However, it may have succeeded in determining where
work is required and where it is not, and in providing an overall picture of
data quality, uncertainty, validation and verification, which might, in turn,
serve to identify consensus and recommended application procedures, and
thus provide practical guidance.
Such recommendations should be discussed and derived not by the author
but, rather, by an international group or “task force”, e.g. within UNEP-
SETAC, ideally combining expertise on uncertainty management on a
worldwide level.
3 Advancing life cycle modelling
Corresponding author: Gjalt Huppes, CML, Leiden University
3.1 Introduction
All choices regarding technologies have environmental consequences, be
they purchasing choices, investment choices, strategy choices or policies
regarding processes and products. Sustainability decision support requires
knowledge about the environmental consequences of alternatives and
options. Clearly, the differences in technologies resulting from choices are an
essential part of the analysis. LCA started out as a simple, primarily
technology oriented type of modelling. Processes are defined as fixed ratios
between inputs and outputs. One output flow is chosen as a reference, the
product or function delivering flow. A volume for that flow is set as the
functional unit (FU). Other inputs and outputs of this central functional unit
delivering process are linked to other economic processes. The volumes of
these processes are adapted according to the amount required. All further
processes are linked using the same method of quantification. If a process
has multiple outputs, of which some are not required for the product, these
are removed by some sort of allocation procedure, by substitution or by
partitioning. Looped systems, like electricity production using steel that is
itself produced using electricity, are solved either by going through the loops
often enough to approach the ultimately stable value sought or, the system
being a set of linear equations, by matrix inversion; see the final section of
this chapter for details. The result is a system in which all internal relations,
as flows, have been removed: each amount produced by one process is
used to the full extent by one or more other processes. The resulting system
no longer has links to other economic activities, except for the functional
flow. All other flows are inputs from the environment and outputs to the
environment. This set of quantified environmental interventions is the basis
for the life cycle impact assessment. By making such a model for other
alternatives and options, each with its final reference flow quantified relative
to the same functional unit, systems can be compared as to their relative
environmental scores. This basic approach to modelling in LCA is generally
adhered to, with a few exceptions. The main exception is that, with a few
adaptations making the system not fully determined, it can be solved in
multiple ways, allowing for optimisation, for example minimising the
environmental impact or, if costs are added, minimising cost.
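The two solution routes mentioned above, going through the loops repeatedly and solving the set of linear equations by matrix inversion, can be sketched for a toy two-process system; all coefficients are invented for illustration and do not describe any real technology.

```python
import numpy as np

# Technology matrix A: columns are processes (electricity, steel),
# rows are economic flows; outputs positive, inputs negative.
A = np.array([[ 1.0,  -0.3],   # electricity [kWh]
              [-0.02,  1.0]])  # steel [kg]

# Intervention matrix B: environmental flows per unit process (here CO2).
B = np.array([[0.5, 1.8]])     # CO2 [kg]

# Functional unit: deliver 100 kWh of electricity, no net steel output.
f = np.array([100.0, 0.0])

# Route 1: solve the linear system A s = f directly (matrix inversion).
s = np.linalg.solve(A, f)

# Route 2: "going through the loops often enough" - fixed-point
# iteration s <- f + (I - A) s, which converges to the same scaling.
s_iter = np.zeros(2)
for _ in range(100):
    s_iter = f + (np.eye(2) - A) @ s_iter

# Inventory: total environmental interventions of the whole system.
g = B @ s
```

Both routes give the same scaling vector, because each pass through the electricity–steel loop adds an ever smaller correction; the inventory g is then the basis for impact assessment.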
Before entering into the details of how to interpret this LCA modelling
structure, a short note on terminology is required. There is a basic distinction
between dynamic and static models. In dynamic models, the value of
variables at time t determines the variables at time t + 1. The development of
these variables of concern is thus endogenously determined. Static models
do not have such a time specification. An intermediate position is where a
time path is specified
exogenously, based on other knowledge which is not part of the model. Such
modelling may be named quasi-dynamic. This is a usual procedure in cost-
benefit analysis (CBA), where investments and proceeds are specified for
each year of the operation of the project analysed. In LCA, there are neither
dynamic models nor quasi-dynamic models. With static models, situations
can be analysed based on different inputs into the model, leading to
differences in the output variables. For a given functional unit, different
technologies can be specified as inputs, leading to different environmental
impacts (and costs, and other effects in broadened LCA). There are no
dynamics involved; these are separate pictures of specific technology systems. In
LCA for decision support, the situations refer to potential future states, as
alternative options. The current situation may be one of the alternatives.
Several options may be compared, as a comparative static analysis. As in
many decision situations the reference is the current situation, one may
specify all alternative options relative to the current situation. Then the current
situation functions as a reference situation. The effect of a choice may then
be specified as a difference analysis. For making a difference analysis, two
situations have to be modelled. By subtracting the old situation from the new
one, the effect of the change can be indicated. As the new situation will often
be an improvement compared to the old situation, avoided burdens can
come up. The current habit in LCA of specifying avoided burdens when only
one alternative is specified may easily lead to confusion, as the reference
situation is only implicitly indicated. The tendency to do so comes from the
rightful wish to indicate the effects of choices as a dynamic process, using a
dynamic analysis.
To have clear methods, this implicit dynamics in a static model is to be
avoided. Either the current situation is compared with a possible new
situation, with a difference analysis as one way to make the comparison, or a
dynamic model is to be used for the analysis. As dynamic models are not
available in LCA now, the comparative static option is the only one available
in current LCA. If in a static model substitution is used as a solution to the
multifunctionality problem, there always is an implicit comparison to some
unspecified reference situation.
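The difference analysis described here can be sketched as follows: both situations are modelled explicitly and then subtracted, so that avoided burdens appear as negative entries against an explicitly stated reference. Flow names and numbers are invented for illustration.

```python
# Inventory results of two explicitly modelled situations (per
# functional unit); the reference situation is stated, not implicit.
current = {"CO2 [kg]": 120.0, "NOx [kg]": 0.40, "waste [kg]": 3.0}
option  = {"CO2 [kg]": 95.0,  "NOx [kg]": 0.55, "waste [kg]": 1.2}

def difference(new, old):
    """Inventory of the change: new minus old, flow by flow.
    Negative values are avoided burdens of the change."""
    return {flow: new[flow] - old[flow] for flow in new}

delta = difference(option, current)
# CO2 and waste go down (avoided burdens) while NOx goes up; the
# trade-off is visible only because both situations were modelled.
```

This is comparative statics, not dynamics: nothing in the subtraction says how or when the change happens, only what the two steady states differ by.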
Another distinction is between equilibrium models and non-equilibrium
models. In equilibrium models there are opposing mechanisms which at a
certain point do not lead to further adjustments. A most common example is a
market model where there is an equilibrium price and quantity, based on a
supply function and a demand function constituting the opposing forces. In
LCA, such opposing forces are not present. The demand for a functional unit
is met automatically by the supply of all activities required, not on the basis
of economic supply mechanisms. Nor is demand specified as a function with
quantity depending on price. In the system specification, for all economic
flows the amounts produced and the amounts used are equalised, and they
are in that sense ‘in equilibrium’. We will mostly avoid that term; where used,
it has the loose meaning of equalised.
Next, there is the steady state model. Such a model depicts the situation after
all adjustments as modelled in the system have been made, and assumes
that the system will converge to an equilibrium, i.e. is stable in the end. LCA,
being a static model, cannot depict such an equilibrium steady state.
However, we may think of the LCA outcomes as resulting from a dynamic
model in which all processes have adjusted to the assumed constant
demand. In that sense, the “equilibrium” is also a “steady state”. This unusual
state of modelling in LCA has to be kept in mind when using the terms
equilibrium and steady state as interpretative terminology.
So, how may we interpret this basic structure of LC inventory analysis?
Firstly, all processes in the system are mutually equalised; in that limited
sense the LCI model is an ‘equilibrium model’. Next, the technologies of all
processes are constant: they do not develop in time, and they assume a
given capacity use (or: working point) of processes. Also, all capital
requirements are met by the necessary investment processes, taking into
account the lifetime of the capital goods externally. So the system is not only
in equilibrium but also depicts a constant flow per functional unit, in that
sense indicating a steady state. This constant nature seems one of the most
general characteristics of LC inventory analysis and hence of LCA. Though
the static outcome gives only one specific, mutually equalised set of
processes, it can be most useful for decision support. It depicts, for each
alternative or (sub)variant investigated, the relative environmental effect that
would result if the technology set investigated were to function long enough
to approach a steady state.
There are no complicating dynamics, no behavioural adaptations and
autonomous developments, no changing technologies, no spatial
developments, which would all make both the analysis and the interpretation
a much more complex affair. The systems as specified give the pure link
between the function realised and the given set of technologies, with the
environmental consequences linked to that function for each alternative set
of technologies. Reasoning from a hypothetical (not-modelled) steady state
equilibrium, one slice in time, as a snapshot, is enough to know the system,
see Figure 17 in Section 3.5.1 below. Time is outside of the model; the
dynamic version is never specified.
Of course, this strength is also the weakness of LCA in environmental
analysis for sustainability-oriented decision support. When considering
setting up a production line for a new version of a product, for example, the LCA will
show the comparative environmental effects relative to other options from a
steady state point of view. However, the most obvious economic mechanisms
are not taken into account: bringing a new product on the market will reduce
prices and volumes of similar products. If less expensive, it will lead to more
income being available for other spending. And most products do not function
stand-alone but in combination with other products. Buying a barbecue
implies a shift in food purchases towards barbecue food. Getting fast internet
access at home allows for home working. All these shifts in real life have
environmental consequences which ideally would be reflected in sustainability
oriented decision making, and hence in the modelling for decision support.
Before indicating other modelling options to deepen the modelling structure
of LCA, some remarks first on further limitations of LCA which are less
fundamental but no less important. In the practice of LCA for decision
support, the processes involved in a product/function system are specified
based on the specification of technologies as available from the past, except
for some specific new technologies considered. Of course, we know that
past performance will not continue into the future. Some technologies are
declining, others are relatively stable and others are increasing in market share,
while in the course of time new technologies will be employed. The fixation
on old processes is nurtured by the lack of data on new processes, and by
fear of arbitrariness in choosing the technologies supposed to become
relevant for the time horizon involved in the choice. In the analysis of
apartment buildings, the relevant technologies for waste processing after
demolition may better be specified based on future, not yet existing,
technologies than on old technologies that are now already hardly
compatible with recycling laws. One approach is using relatively modern
processes: the typical modern process, the modal-modern process
(Heijungs et al 1992) or, slightly newer in many cases, the marginal process
(Weidema 2000, Weidema 2002), being the
technology expanded with growing demand. The step to using expected
future processes is logical for decisions with long term implications, involving
future technology scenarios, see the results of the SETAC Working Group on
Scenario development in LCA (Pesonen et al 2000)31. To avoid arbitrariness
as much as possible, a systematic approach to process selection is
schematised in section 3.4.
Other elements in modelling as mentioned above are more fundamental.
Supply and demand mechanisms, including substitution mechanisms, are
modelled for many products and are clearly relevant for sustainability
decision support. The income elasticity of demand, also called the
propensity to consume, is highly relevant if products with very different
market prices per unit of function are compared, as with using a bicycle or a
tram for city transport. Such models may also be static equilibrium models,
but they incorporate a behavioural mechanism lacking in LCA. Taking the
outcomes of the market analysis, the processes involved may be fixed and
analysed as a multifunction LCA; see Ekvall (2002) in this vein. Focussing
on substitution only, one may indicate a chain of substitution mechanisms
and then, having fixed these substitutions, further model the system as the
usual LCA steady state model; see Weidema (2000 & 2002)32.
Yet other models may be more realistic in reckoning with the supply of
production factors, indicating the shifts involved there as a consequence of
using more of one product. Such models usually are dynamic, simulating changes
31
The study defines a scenario as ‘a description of a possible future situation relevant for specific LCA
applications, based on specific assumptions about the future, and (when relevant) also including the
presentation of the development from the present to the future’.
32
Weidema uses substitution at the same time for solving the multifunctionality problem, see the section
on allocation in chapter 2.
in developments due to changes in policy choices and technology decisions.
They have been developed in the realm of energy policy and increasingly
involve other environmental aspects, and increasingly are able to reckon with
specific technologies and products, overlapping with the domain of LCA.
Examples are the MATTER model of the International Energy Agency (IEA)
(Loulou and Lavigne 1996 and Gielen et al 2000), and the E3ME (see:
http://www.camecon.com/e3me/e3me_model.htm) and GEM-E3 (see:
http://www.gem-e3.net) models developed for the EU. These are dynamic
equilibrium models. Next to these we have the Cost-Benefit Analysis (CBA)
type of models, which are used to assess the economic and environmental
consequences of decisions, as do LCA and LCC. Mostly they are a
specification of yearly events resulting from project implementation; see
Mishan (1971) and Dasgupta and Pearce (1972). The dynamics involved
are usually specified externally, exogenously, “by hand” or by a series of
partial equilibrium models, while the dynamic equilibrium models have
endogenised a number of mechanisms. What all these models have in
common with LCA is that they specify the functioning of a process system
in relation to a possible decision to be made, and that they indicate the
environmental interventions of the process system as a basis for impact
assessment (though often called differently). All these models may cover the
full life cycle of the systems involved and hence, in a way, may be seen as
Life Cycle Inventory models, taking the concept of life cycle beyond a
narrower ISO-LCA interpretation.
We might approach modelling in LC inventory analysis as a sequence of
choices on modelling options, specifying main model characteristics and
ultimately defining one specific modelling option as most relevant to the
decision at hand. This way one may pick out the right type of modelling
technique. It should be kept in mind, however, that LC inventory analysis is
not just a matter of modelling principles. The ideal model from an
epistemological point of view may be out of reach of practitioners now, but
several directions for more advanced modelling are feasible. If such new and
better methods can be made practical in due time, they should be on the
agenda, not as current alternatives to good practice, but as candidates for
future good practice. LC inventory analysis and LCA will never be finished in
any final sense, and in some situations alternatives to LCA may be more
appropriate, as may be the case with dynamic economic-environmental
modelling for broader energy technology choices. However, at the moment,
guiding rules leading to well-specified alternative modelling options seem
one step too far. Real options are very limited, even if possible further
developments in LC inventory analysis and related modelling are taken into
account. Incorporating real-life mechanisms, if effectively possible in a
balanced and systematic way, would lead to superior modelling in terms of
validity; see chapter 2. Therefore, this detour on modelling options may for
now serve more to gain perspective on LC inventory analysis, both on its
current limitations and on its possible further development, than to
effectively guide choices for practitioners now.
This perspective is important, as methodological inconsistencies in current
LCA relate to the wish to overcome the limitations of current LCA, especially
those related to its lacking behavioural mechanisms and lacking explicit
dynamics. Such elements one would rightly like to cover when assessing the
consequences of choices, but they do not currently fit into the essentially
steady-state nature of LCA modelling. The central question of this chapter is:
how can we deal with what we know about reality, and with what we can
model about reality, in a consistent way? We will make some steps beyond
the static models and hence beyond the comparative static nature of current
life cycle analysis.
We will first go into the seemingly contradictory nature of current LCI
modelling, as being static on the one hand and indicating changes on the
other hand, if used in a comparison to current practice. Next, we may gauge
from the applied research field where the most pressing limitations of LCA are
felt, now brought together under the heading of ‘rebound’ mechanisms, in
section 3.3. Next we will survey options for a more deliberate choice of
processes to build the LCA system from, in section 3.4. In section 3.5 we look
into advanced modelling options, taking the time aspect as the prime
distinction between modelling types. Going into more detail, in section 3.6, we
survey options for hybrid models combining more physical process
specifications with monetary ones. Such approaches may become part of
improved practice at short notice. Finally, in section 3.7, some basics of
modelling in LCA will be surveyed from a mathematical point of view. In each
section, recommendations for current practice and recommendations for
method and tools development will be given. Some conclusions on
consistency in LCI modelling form the final part of the chapter.
3.2 Advancing life cycle modelling in LCA
LCI in LCA for decision support tells us something about the future, even if it
uses process data from the past for comparative static modelling of possible
future states. The underlying assumption is that changes in processes,
which of course will take place but are hard to predict, will leave the overall
structure of the process system more or less intact. Though LCI processes
may depict the past in a technical sense, when used for decision support
they indicate future states. Though making LCIs is evidently ‘modelling’,
there is not much reflection on the nature of LCI modelling in a general
sense, with exceptions like Azapagic (1996), Guinée et al (2002, Part 3) and
Ekvall and Weidema (2002). This is an unsatisfactory state of affairs, as LCI
modelling is different in nature from most other modelling approaches, also
from other models used for sustainability analysis. Generally, LC inventory
analysis is set up as a system of linear equations, with each equation
representing one process in terms of its fixed input and output relations, be
they based on averages, incremental changes or marginal changes. Coal
mining, for example, is one process in the process tree for electricity
production, while electricity production appears at several spots in the LCI of
frozen whale meat consumption. In LC inventory analysis, all these
processes are linked to the amounts required or implied ‘in the next process’.
Differences in time between these processes are disregarded, though of
course we know that coal mining precedes electricity production for cooling of
meat, but not the electricity production for lighting the coal mines. If one
were to choose a time moment or time period for fulfilling the function specified in the
LC inventory analysis, all other processes could be specified – roughly – in
relation to that time period. For instance, the coal for the making of the steel
for the truck factory for the truck which transports the refrigerator to the
consumer’s home would have been mined between 10 and 15 years ago. The
recycling of the steel for the truck would come 10 years after the eating of the
meat. A first remark: such a time specified LCI model is not a historical model
in the sense of representing the actual historical data as recorded time series.
It would be an analytical model, constructed on the basis of process
specifications based on certain modelling assumptions. For example, they are
representative of a certain period and region, for example, current general
grid electricity production in Japan for the refrigerator being used in Japan, for
cooling the meat till the time of consumption comes. The processes involved
are single function processes, through some form of allocation; they are not
real processes as can be seen in reality.
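The linear set-up described above can be sketched as a small worked example. The matrices and coefficients below are invented for illustration (a two-process coal-electricity system), not data from this report:

```python
# Minimal LCI sketch of the linear set-up described above: each column of
# the technology matrix A is one process (outputs positive, inputs
# negative); B holds the environmental interventions. All numbers invented.
import numpy as np

# processes: [coal mining, electricity production]
# flows:     [coal (kg),   electricity (kWh)]
A = np.array([
    [1.0, -0.4],   # coal: produced by process 1, consumed by process 2
    [0.0,  1.0],   # electricity: produced by process 2
])
B = np.array([
    [0.05, 0.9],   # kg CO2 emitted per unit of each process operated
])

f = np.array([0.0, 100.0])  # functional unit: 100 kWh of electricity

s = np.linalg.solve(A, f)   # scaling vector: how much each process runs
g = B @ s                   # inventory result: total emissions

print(s)  # process scalings: [40. 100.]
print(g)  # kg CO2 for the functional unit: [92.]
```

Note that the solve has no notion of when each process runs; the abstraction from time discussed above is built into the formulation itself.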
So, though the time sequence of processes is known in principle, and might even
be specified, the LCI model abstracts from it and uses a set of process
specifications which does not reflect actual historical processes as they have
developed and will develop through time. If LCA practitioners were
historians, they would act differently. LCA practitioners are not historians,
however: they make their models for decision support. This first observation on
the nature of modelling in LC inventory analysis has a direct relation to the
discussion on prospective and retrospective LCA.
Firstly, LCI is not a historical descriptive analysis, even if one were to specify
the process relations in time. Secondly, there are no known examples of
historical analysis in LCA where the LC inventory analysis reflects the actual
historical sequence of events. All practitioners make their models as an aid
for decision making, that is, as an aid to envisage a future situation as related
to the decision at hand. Hence, all practitioners use prospective LCAs, even
when specifying most processes in their LCI model on the basis of existing
(that is, past) processes. Historians of technology and economy do not make
LCIs and LCAs but describe how technologies develop and explain these
developments. There are, in principle, no alternatives involved. In special
instances some historians analyse how a specific choice for a specific
technology has worked out, comparing that choice to a counterfactual: what
would have happened if another technology had been chosen. It
seems highly improbable that historians would make that analysis in the form
of LC inventory modelling. Collapsing historical processes into one timeless
steady state loses the historical part of the processes involved. However,
making such a steady-state analysis for several points in time may give good
historical insight into technology development.
The related concepts of attributional and consequential LCA may receive a
clearer meaning in a modelling context. In LCA for decision support, the
LCI model is indicative of the consequences of the choice at hand; that, at
least, is the intention. In that sense it is a consequential LCA. Even if constructed
as an attributional LCI model, it still is consequential in its application in decision
support. See chapter 2 on prospective and descriptive analysis for a
broader treatment of this subject.
Conclusions
• LCA for decision support is, in its intentions, always consequential,
prospective, etc., summarised as change-oriented, as the choices to be
supported always regard alternative options for the future.
• Change-oriented life cycle analysis depicts a series of mutually independent
alternatives, to compare them as indicative of possible future developments
due to the choice at hand.
• LCI modelling is now mostly comparative static analysis, using steady-state
models.
• Historical LCAs have never been made, but several steady-state-type LCAs
might be used to gain historical insight into technology development over
longer periods, like 'the LCA history of electricity production, 1850 to 2000'.
Recommended practice
• When specifying technology options for fulfilling a function, the time frame
should be clearly specified in the goal and scope definition, leading to
process selections that do not necessarily refer to the same period, as with
end-of-life processes for sturdy buildings.
• Mechanisms relevant for the choice at hand but not included in the LCI
model should be indicated in the goal and scope analysis and discussed in
the interpretation part of the LCA.
Recommended developments
• Similar to the databases on current processes now being constructed for LCI
purposes, data sets on future states of the main background
technologies should be constructed, as future scenarios, based on broad
sets of decisions and developments in one direction or another.
3.3 Rebound mechanisms and modelling challenges
Let us start the discussion of modelling methods around LCA with the example
of rebound effects, now generally discussed; see Hertwich (2005, pp. 4680-1)
and further references there. The concept came up in energy analysis,
where technical improvements have been more than compensated for by
behavioural reactions, as when the effect of more fuel-efficient cars was
compensated by a shift to heavier cars. The rebound may cover specific
mechanisms related to the subject analysed, as in the example of the increased
number and hours of use of light bulbs following the introduction of electricity-saving
bulbs. It also covers adapted behaviour due to technical options, like reduced
mobility due to better IT systems. It also covers the more general income
effects that arise because environmentally superior products for a function
create cheaper options. The old example is city transport by car or bike, where
the bike is environmentally (and possibly in other respects) superior, but costs
just a fraction of what the car does for the same distance. When shifting to
bicycle transport, will this income be spent on violin classes or on trips to the
other side of the world? Is this shift symmetric? Taking such income effects
into account in an LCA way, however done, will surely be a main factor
determining the outcome of the car-bicycle comparison.
3.3.1 Rebound mechanisms
Experience has shown that options which are in principle environmentally friendly
have sometimes worked out perversely. Energy-efficient lighting might
have contributed to larger electricity use through an increased number of
lighting points and an increased use time per lighting point. Such indirect or
secondary effects, which have in common that they are not part of the model first
used, have been named rebound effects. Could they be incorporated
systematically? Surely this would imply substantial changes in LCI modelling.
Several "rebound" mechanisms may be distinguished, with the first four
stemming from Greening, Greene and Difiglio (2000) as cited by Hertwich
(2005).
1. The price effect, which leads to a higher consumption volume of the same
product at lower prices due to technological progress, of which
environmental progress is usually a part;
2. The income effect, which leads to higher consumption volumes of other
products due to the lower prices of the environmentally attractive but
cheaper products studied;
3. Secondary effects on other technologies (for example as technology or
knowledge externalities), leading to cheaper production and higher
production volumes elsewhere;
4. Secondary effects through supply and demand mechanisms for products
involved in the chain, as with lower energy prices.
One may add mechanisms of a more complex nature, like the following:
5. Technically linked activities, like buying recyclable glass products and
driving substantial distances to a collection point by car.
6. Complex cultural processes, when specific product shifts can induce
larger shifts elsewhere.
The US regulations on reduced gasoline use of passenger cars,
resulting in less attractive cars, may have led to the market breakthrough
of SUVs, also outside the US.
7. Macro-economic consequences of technology-volume changes.
The shifts in energy use that would follow from the large-scale introduction
of fuel cells in cars and private households would have large
consequences for the overall economic structure, and thereby large
environmental consequences.
Nobody will deny the relevance of such mechanisms in deciding on the
environmental consequences of choices on products and their technologies.
The interesting fact to note is that such relevant mechanisms are mostly
ignored in LCAs, while on the other hand it is clear that they can never all be
incorporated. This understandably gives an uneasy feeling to LCA
practitioners, and the more so to users who are to trust the outcomes. Not
answering these very real questions may be an option but will consign LCA to
oblivion. Real decision makers want to know about real consequences. So
how to deal with the request for realism in the simple structure which LCI, and
LCA, constitutes in modelling terms? Should LC inventory analysis broaden
and deepen, incorporating what is missing in terms of sustainability aspects
and real-life mechanisms? Or should LCA remain the simple technique for
indicating consequences of technology changes only, as it now mainly does?
There is good reason for the latter, as it is the only broadly operational model
for environmental decision support on technologies, also for SMEs and NGOs.
Of course, what it misses should then be specified and maybe analysed
additionally. LCA, at least in its simple forms, would then remain "as it is",
with only its limitations better clarified. However, for a technique for
decision support this is hardly a tenable position: LCA should be improved,
with simple LCA possibly remaining as one of the simpler options with
restricted but easy applicability. For simple applications, such as guiding
consumers in their choices on existing products, there may be good reason to
stick to this type of LC inventory analysis and LCA. However, for product
design and for technology development, this surely is not the case. There,
either LCA is to be improved and expanded, in a basic modelling sense, or
other models will take over, as has been the case in the environmental analysis
of energy-related technologies. There, other models, like the MARKAL-based
models developed with the International Energy Agency (IEA),
are used instead of LCA; see Gielen et al. (2000) for an application.
Environmental analysis of, for example, fuel cell cars can be done using such
models, with some clear advantages, and disadvantages, as compared to LCA.
Next to LCA, such energy & environment models are being actively developed
as well. In the EU, the European Commission has been developing models like
the E3ME (see web ref.) and GEM-E3 (see web ref.) models, which still have
limited but increasing options for environmental analysis incorporating specific
technologies and products.
3.3.2 From rebound mechanisms to modelling challenges
The rebound mechanisms mentioned above are not a coherent set. They have
been grouped only because of what was missing in simple LCA models.
Some rebound mechanisms are close to LCA, as when specifying the activity
complex of home IT and reduced commuting using a multifunctional
functional unit. This seems the exception, however. All other mechanisms
mentioned require the modelling of mechanisms other than technology
relations. Expanding the mechanisms incorporated in LCI modelling would
require cultural models to indicate cultural mechanisms; market models to
specify volume changes and substitution effects; some further economic
modelling of income relations to indicate income effects; and macro-economic
models to specify effects through that domain. When analysing the
sustainability effects of choices related to technologies, the technology
specification clearly plays a central role, covered in detail by specific technology
models, like engineering models, and at a systems level by models
like LCA. Virtually all production technologies function in markets, with
economic mechanisms (partly culturally based) substantially determining their
use and development. All market mechanisms are ultimately driven by final
demand, as related to the values and preferences of private and also public
consumers. So a first set of further mechanisms relates to market
mechanisms. Market mechanisms are part of broader economic mechanisms,
like macro-economic mechanisms related to savings, investments,
employment and growth. Influencing such parameters clearly may have
substantial sustainability consequences, even if technology-related choices
do not have a prime influence.
Next, markets function in a broader economic setting, with cultural and
regulatory mechanisms on the one hand driving market developments but on
the other also setting boundaries to such developments. Larger structural
developments in the global economy, as around the globalisation of many
markets, not only form a background for specifying the consequences of
technology-related choices, but are also part of this development. The choice
of local products as against global products is a basic choice diminishing or
increasing such globalisation tendencies.
It is clear that a model of everything is not feasible. On the other hand, just
leaving out main relevant mechanisms is not a reasonable option either. It seems
essential for adequate sustainability decision support to have some insight into
what is incorporated in LCA; into what mechanisms might be incorporated in
LCA, singly or as a group; and into what is relevant for sustainability analysis of
choices but will remain outside the modelling options of LCA. Table 3
summarises some main mechanisms not present in many simple LCAs, all of
which may be highly relevant for sustainability analysis of choices.
Ordering the field beyond the sketchy contents of this table would give a better
grip on modelling options.
There may be two categories of problems. One is a problem in which both
LCI/LCA and the external methodologies are established, but the interface
between them cannot be suitably designed. An example of this first category is
the combination of a sophisticated CGE (Computable General Equilibrium)
model with the LCA of new versions of a specific brand product, where the
resolution of even the most sophisticated CGE model is not high enough to
link to the specific LCA questions and the central technologies involved. The
other is a problem in which the LCI/LCA models and the external models have
close interaction but we do not have proper models covering the relevant
mechanism. An example is the case where LCA-based eco-labelling influences
consumers' choices; the choice of inventory model, such as industry-average
data or company-specific data, will be crucial for the modelling of consumer
behaviour.
Both problems become relevant only if the LCA part and "the other" part can
be established, including the appropriate databases as are then required.
Table 3 Mechanisms missed in simple LCAs and options for linking
them in or to expanded LCAs.

Price effects (market mechanisms of supply and demand)
Options for incorporating: A few only, as larger numbers are incomputable; substantial data requirements, never covering the full LCA. Includes substitution.
Requirements on LCI analysis: Not steady state any more; a very complex multifunctionality problem. Or: external to LCA, specified by hand.

Substitution mechanisms
Options for incorporating: Limited part of the market mechanism; a partial mechanism may lead to unrealistic outcomes; not applicable to the whole product system.
Requirements on LCI analysis: Hybrid LCA can manage this effect. Knowledge of, or assumptions on, elasticities required.

Income effects
Options for incorporating: Spending behaviour as based on the income elasticity of demand (propensity to consume).
Requirements on LCI analysis: FU not with an arbitrary unit but as the actual total volume. Comparisons not on the same functional unit but, for example, on the same level of economic activity.

Factor substitution
Options for incorporating: Production factor models in IOA; may be integrated with LCA as integrated hybrid analysis.
Requirements on LCI analysis: External to the base LCA, integrated by hand. Probably covered under 'process selection', see the section below.

Technological and knowledge externalities
Options for incorporating: Extremely difficult to quantify, as involving social dynamics; innovation diffusion is a hot issue, but empirically not well developed.
Requirements on LCI analysis: Diffuse mechanism throughout society. Technological externalities get us very far away from the functional-unit type of LCA.

Technology dynamics
Options for incorporating: Descriptively available in case studies; no general theory; learning-curve theory for treating new, currently too expensive, technologies.
Requirements on LCI analysis: Not compatible with steady-state modelling; comparison based on an FU not generally possible.

Linked technology development
Options for incorporating: Mobile phone technology in combination with GPS/Galileo allows many new technologies, as for effective car-sharing systems, freight traffic control, and theft prevention of larger items.
Requirements on LCI analysis: Possibly incorporated in a way similar to the learning curve in technology dynamics.

System dynamics
Options for incorporating: A mixture of the main relevant mechanisms, changing the nature of the system. Game-theory based; gets at the real mechanisms of decision making and long-term effects.
Requirements on LCI analysis: Possibly, first a system dynamics analysis, then freeze the outcome and analyse it with ordinary LCA?

Macro-economic mechanisms
Options for incorporating: Employment effects (as the type of jobs involved), savings and investments, effects on economic growth.
Requirements on LCI analysis: Relating micro-level choices to macro-level economic development is a disputed area.

Consumption theory, what drives choices?
Options for incorporating: Less work; more leisure; less travel; a less material view of life. Industry does it, in eco-efficiency analysis.
Requirements on LCI analysis: Broader production and consumption analysis, as IOA-linked scenarios.

Cultural mechanisms: co-development
Options for incorporating: What goes together, with advantage? Cooker, gas and food to cook, of course, but also more complex relations: a simple (environmentally friendly) house, but with a second house in the countryside.
Requirements on LCI analysis: Items combined in consumption as delivering the functional unit. No serious conceptual problems, but a lack of insight into what people value and appreciate.

Cultural & regulatory mechanisms: counter-development
Options for incorporating: The SUV as a way out of the restrictions on larger cars due to per-company reduced fuel consumption rules.
Requirements on LCI analysis: A complex functional unit, as fleet-level scenarios? Predictions extremely difficult, but effects possibly quite dominant.

Structural developments
Options for incorporating: Relation to globalisation; structure of the labour force; capital intensity; effects on developing countries, as through soft trade barriers.
Requirements on LCI analysis: Well beyond current LCI analysis options.

Et cetera?
In other domains, relevant tools of analysis have been developed which, under
certain conditions, might be transposed to the LCA domain. In the field of
economics, for example, there is a way to evaluate similar problems by a
specific variant of comparative static analysis, in which one evaluates the
influence of a parameter on the equilibrium outcome by taking the partial
derivative, with respect to that parameter, of the equation system that
describes the equilibrium. If such an economic description of reality is available
(and then we can really speak of an equilibrium), LCI/LCA may tell us the
environmental consequences. The point is that the main difficulty is not the
steady-state assumption of LCI/LCA but the lack of availability of such
economic models for use in such a deepened type of LCI modelling.
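The comparative static evaluation described here can be sketched numerically. The following is a minimal illustration under invented assumptions (a linear supply-demand market with a cost parameter t and a hypothetical emission factor), not a model from this report:

```python
# Comparative statics: effect of a cost parameter t on the market
# equilibrium, and the environmental consequence via an emission factor.
# All functions and numbers are illustrative assumptions, not from the report.

def equilibrium(a, b, c, d, t):
    """Linear demand q = a - b*p, linear supply q = c + d*(p - t).
    Returns the equilibrium price and quantity."""
    p = (a - c + d * t) / (b + d)
    q = a - b * p
    return p, q

def dq_dt(a, b, c, d):
    """Partial derivative of the equilibrium quantity with respect to t,
    obtained by differentiating the closed-form solution."""
    return -b * d / (b + d)

a, b, c, d, t = 100.0, 2.0, 10.0, 3.0, 5.0
e = 0.4  # hypothetical emission per unit of product (kg CO2-eq)

p_star, q_star = equilibrium(a, b, c, d, t)
analytic = dq_dt(a, b, c, d)

# check the derivative against a finite-difference approximation
h = 1e-6
_, q_plus = equilibrium(a, b, c, d, t + h)
numeric = (q_plus - q_star) / h

print(p_star, q_star)     # equilibrium price and quantity: 21.0, 58.0
print(analytic, numeric)  # both approximately -1.2
print(e * analytic)       # marginal emission change per unit of t
```

The economics does the volume work; the LCI-style emission factor then translates the equilibrium shift into an environmental consequence, which is the division of labour the paragraph above suggests.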
Whatever choices are made regarding the further development of LCA variants,
their place in the modelling domain should be clear. Especially when assessing
the consistency subject, clarity in a technical modelling sense and clarity in
terms of modelling strategy are both essential. The strategy level is the subject
of the next section.
Conclusions
• Rebound mechanisms indicate specific socio-economic feedback
mechanisms often ignored in LCI modelling.
• Rebound mechanisms are not a specific class of modelling mechanisms;
they survey what is encountered in practice.
• A systematic approach to dealing with mechanisms seems required, to avoid
arbitrary, case-specific approaches that prove how good or bad an alternative
is in that particular case.
Recommended practice
• In cases where effects may be relevant for the choice but are not modelled
within the realm of the LCA, this should be stated in the goal and scope
definition, while the interpretation should contain an explicit, at least
qualitative, treatment of such rebound mechanisms, considering the
mechanisms listed in Table 3 above.
Recommended developments
• A change in the modelling set-up of LCA should be approached
systematically: either starting from changes in the analytic modelling structure
of LC inventory analysis, shifting from steady state to other types of
modelling; or by looking at other models for decision support which already
incorporate some of these mechanisms and could possibly be adapted for
better LCA-type decision support; or by indicating which modelling approaches
might be complementary to LCA, covering the mechanisms not covered in
(that) LCA.
• Develop criteria to help decide whether a model reflects reality well
enough given the capacities and resources of the decision makers
involved, where the goal and scope definition defines what is "well enough"
in a practical application.
3.4 Process selection and results in LC inventory analysis
The well-known adage about models, garbage in, garbage out, holds for
any model, however nicely its structure may have been set up. The problem is
more subtle, though. The input data for LC inventory analysis are not garbage,
but they have serious restrictions. Such restrictions are not absolute but relate to
the questions one wants to answer with the modelling outcomes. In this section
we will first indicate the goal of LCA studies, restricting LCA to analysis for
decision support, and see how the process selection in LC inventory analysis
relates to these general goals. Next we will go into more detail on the
complex of marginal and average. The nature of data selection and the nature
of modelling in LC inventory analysis have quite strong consequences for what
can be said about the reliability of results in LCI; that is the subject of the last
part of this section 3.4.
3.4.1 Process selection
Choices regard the future; in that sense, LCA for decision support always
refers to the future. The comparison between two future options for
technologies may be made based on the older process data available, under the
assumption that improvements in these technology systems will be similar.
This may be a reasonable approach in many cases, especially if the time
horizon is not long. This practical approach should then be recognised, but as
one option and maybe not the best one. When selecting processes for the system
model, the scope of the choice determines which processes are relevant. For
decision support, these processes should reflect the future situation as
influenced by the choice at hand. So, if we model the future consequences of
having a certain function, ideally we would use the processes and their
specifications as they will function in that future time period. However, in a
practical sense, there are no empirical process data available for any future
situation. At best we may have models of the future functioning of
technologies. So, for supplying empirical content, data on current processes
may be used, but only as a proxy for the future functioning of such technologies.
There are then a number of further choices to be made, assuming that broad
data are available. Current processes may differ substantially in the age of the
technologies used. Some current technologies are several decades old, like
some aeroplanes and iron foundries. In databases we may see the modal
process, typical for the current process mix, or an average over several
technologies. We can be sure that old and currently modal processes will not
be built any more, so they are less relevant for the future than the more
modern processes that are already functioning and are still being installed.
Pilot installations and large-scale experimental installations are probably closer
to future technologies, but the details of their economic functioning and
environmental performance are usually not reliably available. So a modern
process as currently still built (and, if more types are being built, the average or
modal one of these modern processes) may be a good choice to include in
the systems model as representing the future. The typical modern process
has been named the modal-modern process (Heijungs et al 1992). With
perhaps some more emphasis on still newer processes, such processes have
been named the marginal process by Weidema (2000 and 2002): the technology
which expands with growing demand. To avoid the several
conflicting connotations of marginal (see footnote 33), these processes may be
described as "the processes affected by marginal changes in demand", see
Weidema (2003).
33 Marginal may have a quite conflicting meaning, as being the inferior technology. Also, the
definition of marginal in economics depends on the time horizon chosen and the time characteristics of
the technologies involved. The long-term marginal for a given technology equals the short-term
average; see Baumol (1972).
Figure 3.2: Linking process data into the systems model in line with
scope and modelling choices

[Figure: a scheme linking process-level data (temporal and regional
representativeness of a technology x: old, modal, modern and experimental
processes by time of installation; regions 1 to 3; technical scenarios for future
technologies in 2010, 2030 and 2050) to the scope definition (time of system
functioning, e.g. "typical 2010 for building construction, 2060 for building
demolition"; place of system functioning, with the functional unit delivered in
region 1 and processes from other regions involved indirectly, e.g. through
regionalised E-IOA), to the modelling choices implicit in LCA (linear relations,
homogeneous to degree 1, hence steady state and an arbitrary FU), and to the
system specification (foreground processes, as for the FU and main unit
processes; background processes "fitting" the FU process).]
When data availability is less ideal, the question arises whether choosing the best
data per process is also the best for the system, in terms of comparability of the
outcomes for different alternatives. If for alternative 1, for example solar cells
of a new generation, good models of future functioning are available, including
the centrally relevant technologies, while for alternative 2 existing
powder-coal-based electricity production is used, the comparison will not be
balanced. In principle, the same temporal representativeness should be
chosen for the comparison. For the future situation involved, say 15 to 20 years
from now, near zero emission coal (NZEC) might be the relevant
alternative to compare with. This balanced treatment of time frames is not yet
common practice in LCA. Also, practically, no data sets on the future, such as
detailed technology scenarios, are currently available for use in LC inventory
analysis.
In current LCA practice, it is not even current modern process data that are
used; often only average data from already older sources are used,
making it difficult to go for the modern version, which is more representative
of future functioning than an average process set. Also in IO databases
like CEDA 3.0 for the US and CEDA EU25 for the EU, the sectors are given as
averages based on historical data. In such situations, modelling may be based
on average technologies as much as possible, with the most relevant deviations,
where we are quite sure that other technologies will become relevant,
analysed in a sensitivity analysis. This quite usual situation in simple LCAs for
decision support, that average historical data are being used, has a curious
consequence: the difference between consequential LCA and 'historical',
'descriptive' or 'attributional' LCA practically vanishes. The past performance
of processes is then assumed to be indicative of future performance. See
chapter 2 above for a more detailed discussion.
3.4.2 The nature of results
There are several reasons why LCI modelling typically does not predict future
states. They all relate to the data being used in modelling and to the nature of
the modelling itself. If major mechanisms in society are not included in the LCI
model, the consequence of course is that the model cannot predict
developments as they will take place due to some technology or product choice.
Predictive models reflect mechanisms in reality as well as possible.
Secondly, steady-state models assume and indicate a hypothetical
equilibrium situation with quite strong, and of course unrealistic, ceteris
paribus assumptions: no other technologies will change; no market
adaptations other than supply-demand matching for the FU will take place.
Such a hypothetical equilibrium is not a prediction of reality; it can never be
observed, neither in past nor in future reality. Thirdly, the data put into the model
determine the nature of what comes out of the model. We will now go into more
detail on the question of what the nature of input data may be and what this
implies for the nature of results. The first subject here relates to the choice of
data. The second subject is how validity and reliability, as in precision and
accuracy, can be established.
Marginal processes, marginal process data, marginal analysis
Several terminological problems surround data selection in LCA, with some
quite different subjects going under the name of marginal. We just
encountered the 'marginal process', as the process most relevant in indicating
the effects of choices, being the one reacting to a change in demand. How a
process is described is another question. Ultimately, it is a description
in terms of input and output coefficients. These, however, may represent the
marginal functioning of the process, its average functioning in practice, its
intended functioning, or its optimal functioning from an economic perspective,
etc. To complicate matters further, the marginal process functioning can be
specified under different assumptions on what is kept constant. If the amount
of the capital good involved in production is kept constant, the marginal
process data reflect short-term changes induced by changes in production
volume, without a change in capital goods. If, however, the amount of capital
goods can be adjusted to increased demand, the marginal takes these into
account as well, leading to totally different outcomes. This latter situation
seems the most relevant for LCI modelling. This long-term marginal change
happens to be equal to the average long-term functioning of the future
process.
To further complicate affairs, one may look upon LCA as answering questions
of marginal or incremental change, that is: what would happen if one unit of
product (representing the functional unit), or a certain realistic amount, were
added? This is a marginal or incremental analysis. In new LCA involving
non-linearities, this difference becomes of paramount importance! It is the main
reason for developing such non-linear models. A sustainability analysis
indicating the shifts in land use resulting from the large-scale introduction of
bioenergy crops will indicate a totally different score for the first units of
energy crops added now as compared to the last unit required for giving us
20% bioenergy in 2050. The tensions created by this huge agricultural
demand may well lead to severe disruption of available nature areas, and to a
further diminishing share of nature in global net primary production, approaching
a nearly total appropriation of nature.
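The contrast between the first and the last unit can be illustrated with a toy non-linear impact function. The function and numbers below are invented assumptions, purely to show why marginal and average results diverge once linearity is dropped:

```python
# Illustrative only: a hypothetical convex impact function for land use,
# showing why marginal analysis diverges from average analysis once the
# model is non-linear. All numbers are invented for the sketch.

def impact(q):
    """Total land-use impact of producing q units of bioenergy crops.
    Convex: each extra unit displaces scarcer, more valuable nature."""
    return 1.0 * q + 0.002 * q ** 2

def marginal(q, h=1e-6):
    """Impact of one extra unit at production level q (numerical derivative)."""
    return (impact(q + h) - impact(q)) / h

first_unit = marginal(0.0)      # the first units added now
last_unit = marginal(1000.0)    # the last unit of a large-scale programme
average = impact(1000.0) / 1000.0

print(first_unit)  # approximately 1.0
print(last_unit)   # approximately 5.0
print(average)     # 3.0
```

In a linear LCI all three values would coincide, which is why the distinction does not surface in conventional LCA.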
Validity, reliability, precision and accuracy
Validity relates to the appropriateness of the model. Reliability can be seen in
terms of precision and accuracy: what is the spread in the results of several
measurements, and how well can results be reproduced? These customary
concepts relate to measuring and predictive modelling. In LC inventory
analysis, there is no prediction of actual states to be reached; there are
descriptions of steady-state situations, as what-if scenarios. These alternatives
or scenarios are to be based on assumptions similar enough that they can be
compared, as being indicative of an unknown "real" prediction. This
is often the characteristic of scenarios, apart from predictive
scenarios. The judgement on such scenarios is first on internal consistency: can they exist? Secondly, a check is possible on whether they are realistic: is there a reasonable backcasting route? As the LCI scenarios depict steady states and not a certain future state "in the year 2050", even this backcasting criterion is not readily applicable, but only grosso modo. It is clear that the usual empirical modelling concepts of validity and reliability of model outcomes are not directly applicable to such idealised LCI scenarios. Validity may be defined as the scope of the LCA being in line with the decision to be supported, but this is a weak link: why not use other models? All statistical analysis of the reliability of LCA results is based on how input data affect the results, not on assumptions which link results to real life. There is only a relation between the spread in the data of process inputs and the spread this causes in final LCI and LCA results. This of course is useful, but should not be confused with reliability in a predictive sense. Accuracy as a statistical concept related to a true value is inapplicable in this sense. Precision, as the same outcome in several measurements, might be defined in special cases where several modelling approaches apply, for example if an incomplete process LCA approach and a rough hybrid LCA approach are used on the same alternatives.
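The relation between the spread in process input data and the spread this causes in LCI results can be illustrated with a small Monte Carlo sketch. The two-process system, its coefficients and the 10% relative standard deviation are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented two-process system: A is the technology matrix (columns are
# processes), b holds emissions per unit of process operation.
A_mean = np.array([[1.0, -0.2],   # process 1 output; 0.2 consumed per unit of process 2
                   [0.0,  1.0]])  # process 2 output
b_mean = np.array([2.0, 5.0])     # kg CO2-eq per unit of each process
f = np.array([0.0, 1.0])          # functional unit: one unit from process 2

results = []
for _ in range(5000):
    # Perturb all coefficients with a 10 % relative standard deviation.
    A = A_mean * rng.normal(1.0, 0.1, A_mean.shape)
    b = b_mean * rng.normal(1.0, 0.1, b_mean.shape)
    s = np.linalg.solve(A, f)     # process scaling vector
    results.append(b @ s)         # total inventory result

results = np.array(results)
print(f"mean = {results.mean():.2f}, sd = {results.std():.2f}")
```

The spread reported here reflects only how input uncertainty propagates through the model; as argued above, it says nothing about how close the model outcome is to real-life consequences.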
Next, LCA being modelling for decision support, the scientific approach to statistical analysis may not be the most relevant one. LCA functions in a context where decision makers are actively creating the future, reckoning with relevant mechanisms, some of which are reflected in LCA. There is a vast body of literature on this subject, with as its central kernel that it is the assumptions relevant for decision makers which are to be reflected in the decision making process. Though often called by names suggesting wide gaps in philosophical backgrounds, this aptness for decision makers seems a quite straightforward criterion. Though advertised as post-normal science, the seminal contribution to the field by Funtowicz and Ravetz (1990) remains close to normal science, but focuses on application in a (political) decision making context.
Conclusions
• There is undue focus in LCA on past processes, as LCA for decision support is to give insight into the future consequences of choices.
• The nature of process selection and modelling is different from empirical modelling.
• Quality assessment is different from that in measurement and predictive modelling, see chapter 2.
Recommended practice
• When selecting processes for building the LCI model, explicit reference to the time frame of the question is to be made.
• As all LCA for decision support regards the future, processes representing that future are to be preferred to processes representative of the past.
They may be modern processes already existing, or being implemented or developed.
• For long term decisions future technology scenarios may be more relevant.
• With processes of different time frames in the analysis, incomparability of alternatives may result. This inconsistency is a serious flaw in LCI modelling, which should be specified and be the subject of sensitivity analysis.
• Short-term marginal process data (indicating a change in inputs and outputs at a constant amount of capital goods) are the relevant data only in the special case of short-term optimisation. This should be indicated in the goal and scope of the LCA study, and is currently not in the domain of LCA.
• In the interpretation part of the study it should be indicated how the problem of lacking data on the future has been resolved in the LCI model.
• Quality analysis of LCA results in terms of validity, reliability, accuracy and precision is different from quality analysis in the domain of measurement and predictive modelling, as LCA steady-state models do not give empirical predictions.
Recommended developments
• Next to databases on current processes, data on modern processes, being implemented now, are an option to be preferred.
• For decisions with a long time horizon, technology scenarios on main processes are to be developed for use in LCA.
• Methods are to be developed in LCA indicating how to deal with long term developments. This may refer to current product systems where the process delivering the functional unit covers a long period, as with cars lasting over 15 years and building construction lasting up to several centuries (supposedly, for example, with environmentally friendly 'solids'). Also, many research and development decisions refer to technologies which may be implemented at a substantial scale only after decades, like many solar cell technologies.
3.5 Time in sustainability modelling: main options surveyed
The primary point of view in this survey is how time is incorporated in models for sustainability decision support. Other aspects, like the treatment of spatial detail, are also important.
Table 4 Time in sustainability modelling

TIME IN MODELLING | EXAMPLES
1. No specification of variables in time 1: STEADY STATE EQUILIBRIUM MODELS | LCI model; related optimisation models; EIOA models
2. No specification of variables in time 2: STATIC EQUILIBRIUM MODELS | market equilibrium model; related optimisation models
3. Specification of variables in time outside the model, as time series, exogenously: QUASI-DYNAMIC MODELS | some Cost-Benefit Analysis (CBA) studies34
4. Time dependent specification of at least some main variables, endogenously, past period determines next period: DYNAMIC EQUILIBRIUM MODELS (ANALYTIC) | dynamic EIOA models; GEM-E3 models; related optimisation models
5. Specification of all (relevant) variables endogenously in the model, predictive models tested against historical data sets: DYNAMIC MODELS (EMPIRICAL) | several macro-economic models
The first four modelling types are analytical models, giving insight into some but not all mechanisms relevant to the decision situation. Only dynamic empirical models, number 5, may pretend to cover the full relevant reality. They hardly exist at the technological detail required in the domains of application of LCA of products and technologies. So, when modelling expected effects of choices, there remain four basic approaches for incorporating more mechanisms in LCA. One is to analyse mechanisms externally and incorporate them in steady-state LCA through the choice of relevant processes and their specification. The next option is to incorporate mechanisms as equilibrium models, as is usual in market analysis: a change in volume leads to a change in price at a new equilibrium. This option would be interesting. However, the technicalities involved in modelling restrict this option to limited systems, as partial equilibrium analysis, involving a limited number of processes only. For larger LCA systems, computational power and data on markets are both lacking. A first step towards incorporating time is doing so "by hand". In Cost-Benefit Analysis, for example, time series are often specified externally, possibly involving partial models. Though time-specified, such models are not dynamic in the sense of being time dependent, with the state of one period determining the next state in time. Dynamic models, as time dependent models, have endogenised the mechanisms involved. As with
34 By discounting, the time specification collapses to a single point in time. The partial equilibrium models used in CBA could then as well be classified under STEADY STATE EQUILIBRIUM MODELS. When the impact assessment and evaluation are set up in a time-specified way, and LCI results are not discounted, the time specification may have a serious role. For example, the analysis of biofuels based on wood disregards the delay of carbon capture in the tree, which takes place after the wood has been used for energy purposes. If there is great urgency in reducing climate forcing, this time aspect, the growing time of the trees involved, may be highly relevant.
equilibrium models, the technicalities involved in modelling make it impossible to build such models at the level of detail now customary in LCA. Such dynamic models may play a role, however, in background scenario modelling: if they lead to a stable equilibrium, they may be used to specify steady-state background scenarios. We will now treat the options in some more detail.
3.5.1 Steady state equilibrium models
We may go one step further in typifying the nature of modelling in LCI, placing it in the perspective of other options for modelling. Again, the treatment of time is the starting point. Because the model disregards the specific time sequence of activities, the interpretation of the outcome of an LCI (let alone an LCIA) is not straightforward. It is a construct which can never be "seen" in reality and which cannot even exist in reality. A thought experiment can bring some clarity. Imagine that, with the technologies as specified, we were to make a historical construct, placing every process in a time frame each time its functioning is required. Recycling of refrigerator scrap steel would then not go back to the steel production for the refrigerator as used, but would go to the steel making for a future refrigerator, used for meat consumption at a later point in time. By adding enough units of meat consumption in consecutive periods of time, each process in the system will occur in each period of time in exactly the amount required for one unit of the function analysed in that period, see Figure 17. This situation would result in due time if all technologies were to remain as specified and all consecutive time units were present at this imagined future moment in time. The usual name for such a model is a steady-state model. The steady state reflects the technical relations as specified for each process in its input-output characteristics. In LCA this steady state is based on a static model of processes, which do not have time as a variable, so the steady state does not result from the model; rather, the equalised system is assumed to represent the steady state. Dynamic equilibrium models, like the energy-environment models developed for the EU, may also specify a steady state, of course only if they converge to a stable state.
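If a dynamic model converges, its fixed point coincides with the steady-state solution that LC inventory analysis computes directly. A minimal sketch, with an invented two-product input coefficient matrix, shows both views side by side:

```python
import numpy as np

# Invented input coefficient matrix: entry (i, j) is the amount of
# product i needed per unit of product j produced.
A = np.array([[0.1, 0.3],
              [0.2, 0.1]])
f = np.array([1.0, 0.0])  # final demand (the functional unit)

# Dynamic view: production in period t+1 serves final demand plus the
# intermediate demand generated by period t's production.
x = np.zeros(2)
for _ in range(100):
    x = f + A @ x

# Steady-state view: solve (I - A) x = f directly, as in LCI.
x_steady = np.linalg.solve(np.eye(2) - A, f)
print(x, x_steady)  # the iteration has converged to the steady state
```

The iteration converges here because the coefficients are small enough (spectral radius below one); the steady-state model simply assumes this equalised situation instead of deriving it.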
Figure 17 Constant technology systems specified as a steady state time slice ("snapshot") in time.
[Figure: a sequence of FU processes repeated along the time axis, collapsed into a single steady-state time slice around one functional unit (FU).]
The linear relations used to specify processes in LCA and the steady-state set-up of the modelling have another nice advantage: it does not matter how large the functional unit is chosen. The size is arbitrary; there are no economies or diseconomies of scale, at least not in the LC inventory system thus modelled. If the functional unit is chosen twice as large, all processes inflate by the same factor of two35. In terms of interpretation, this means that the outcome for one unit does not differ from that for the next unit. With the process parameters set, the discussion on marginal, incremental and average scores becomes senseless in the context of linear steady-state LC inventory analysis: in the model these are all the same. If the linear homogeneous model were based on short-term marginal/incremental process data, the domain of application would of course be limited; for larger changes the non-linear support models would have to be used again to specify a new set of linear relations. So there is a very relevant discussion on marginal, incremental and average in the specification of processes, see below on the choice of processes. Other wordings may convey these aspects better, avoiding the confusion around "marginal", which is used in at least three different contexts.
35 In technical terms this means that the model is homogeneous of degree one, also named linearly homogeneous. In economics, non-linear models having these characteristics are commonly used, like models based on the Cobb-Douglas production function, using exponential equations. In the LCA context, linear relations form the basis for linear-homogeneous models.
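The scale invariance described above (linear homogeneity of degree one) is easy to verify numerically; all coefficients below are invented:

```python
import numpy as np

A = np.array([[1.0, -0.5],
              [-0.1, 1.0]])       # invented technology matrix
b = np.array([3.0, 1.0])          # invented emission coefficients
f = np.array([1.0, 0.0])          # functional unit

s1 = np.linalg.solve(A, f)        # process scalings for one unit
s2 = np.linalg.solve(A, 2 * f)    # ... for a functional unit twice as large

print(np.allclose(s2, 2 * s1))        # True: every process scales by two
print(np.allclose(b @ s2, 2 * (b @ s1)))  # True: so does the result
```

Doubling the functional unit doubles every process scaling and every inventory result, so "marginal", "incremental" and "average" coincide inside the model.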
Of course, non-linearities may be incorporated in LCA without yet moving to dynamic models. Two main classes of non-linear models may be linked to LCA: technology models, as in engineering models, and market models indicating the quantitative role of processes, including substitution mechanisms, as linked to different alternatives. This step seems close enough to LCA to make. One consequence would be that the arbitrary choice of the functional unit would no longer be possible; real amounts as expected would have to be used. We then move to the subject of equilibrium models and optimisation models.
Equilibrium models and optimisation models
There is a class of equilibrium models which does not indicate a steady state. A much applied example is the market equilibrium model, which indicates how prices and quantities react to a disturbance in conditions. Such models may be used in environmental analysis, but then referring to very partial systems only, due to computational and data problems when trying to model larger sets of markets. Such partial models may well be used to exogenously specify the role of specific processes in LC inventory analysis.
Optimisation models (see Introduction) have a much more relevant role to play. A first distinction is between short-term optimisation and steady-state optimisation. Short-term optimisation is sensible with other types of technology models and with non-steady-state types of models. Also for environmental analysis this is a highly relevant class of models. Seminal work on this is in Azapagic (1996) and Azapagic and Clift (1998), and in a number of case studies based on engineering models. For steady-state optimisation its use seems similarly attractive, but the requirements and interpretational consequences have not been worked out.
The conditions under which optimisation models are most relevant for LC inventory analysis; the methods to apply to create degrees of freedom to be filled with a goal function; and the interpretation of outcomes all deserve further attention.
3.5.2 Non-steady state models for LC inventory analysis?
The disadvantage of steady-state fixed-technology LCI modelling is very real. All decisions will have real consequences, including real environmental consequences, which mostly are not reflected in such models. The discussion on rebound effects is an example of how consumers may react to choosing one product instead of another. The mechanisms involved are not present in steady-state LCI models and cannot be incorporated in such models. The results of such mechanisms may be incorporated to some extent, however, as in the choice of the functional unit, the choice of processes included in the system, and the way they are specified. Similarly, for producers, we all know that if demand for a certain product goes up, prices will be affected and all producers will adapt production volumes, by changing capacity use in the short run and adapting the capacity of installations in the
long run. Such real mechanisms, which form the core of economic behaviour, cannot be incorporated in the steady-state LCI model. As with consumer reactions, such mechanisms may be dealt with exogenously, in specifying the linear relations. Process choice and specification may reflect the behavioural choices of all process operators, but in a frozen form, e.g. reflecting their choice of operating point in terms of capacity use. In reality, the choice for a new technology will lead to a learning curve, where the central process is optimised and adjoining processes are adjusted to the new one. This learning process is an essential ingredient in modelling technology development. The dynamics may be simplified, as in using standard learning curves, but even such simplified curves have one characteristic which does not fit into steady-state modelling: they require a time specification of all activities. For steady-state modelling, only the frozen relation at a certain point on the curve can be taken into account.
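A standard learning curve can be written as cost(n) = cost(1) · n^(log2 PR), where PR is the progress ratio, the cost multiplier for each doubling of cumulative production. The sketch below, with invented numbers, shows why such a curve needs a cumulative-output (or time) index and therefore does not fit a steady-state model:

```python
import math

def learning_curve(cost_first_unit: float, progress_ratio: float, n: float) -> float:
    """Unit cost after a cumulative production of n units.
    Each doubling of cumulative output multiplies cost by progress_ratio."""
    return cost_first_unit * n ** math.log2(progress_ratio)

# Invented example: cost 100 per unit at the start, 80 % progress ratio.
for n in (1, 2, 4, 8):
    print(n, round(learning_curve(100.0, 0.8, n), 1))
# n = 1 -> 100.0, n = 2 -> 80.0, n = 4 -> 64.0, n = 8 -> 51.2
```

A steady-state model can only freeze one point on this curve, e.g. the cost and input-output coefficients at the cumulative output expected when the decision takes effect.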
One specific example, indicated in section 3.4.1 above, is when processes with different time backgrounds are used in the specification of a product system, as when we use current processes for current industrial waste treatment and future processes for the end-of-life waste treatment of long lasting products and installations in the same LCA, thus specifying these processes in a time perspective, but still using a "frozen" model. The outcome of such a time-mixed model would be closer to what we would like to produce for decision support. Currently, the model used is the steady-state equilibrium model, with a time twist built into it. It might be worthwhile to specify such situations in a quasi-dynamic way, specifying the full system in time around a functional unit also specified in time ("the use of a house from 2010 to 2060"). Such a model would be more complex, for example because all background processes, like electricity production cradle-to-gate, would evolve in time as well. Having done so, one option is to indeed move the life cycle analysis to this quasi-dynamic framework. This would lead to options for improved environmental modelling and evaluation as well: it clearly is more relevant to reduce climate changing emissions now, to avert possible disasters, than to do so by recycling materials 70 years from now. The other option is to specify systems in time, but then collapse them in time to allow for the usual steady-state life cycle analysis.
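The two options for collapsing a time-specified system can be illustrated with a small sketch: either sum the emission profile directly (the usual steady-state treatment) or weight early emissions more heavily. The profile and the discount rate are invented, and discounting emissions is itself a contested value choice:

```python
# Invented emission profile of a long-lived product system:
# (year offset, kg CO2-eq), e.g. production now, maintenance later,
# a recycling credit 70 years from now.
profile = [(0, 900.0), (25, 50.0), (70, -200.0)]

# Option 1: collapse in time by plain summation (steady-state treatment).
total = sum(e for _, e in profile)

# Option 2: keep the time specification and weight early emissions more,
# here with an (invented) 3 % annual discount rate on emissions.
rate = 0.03
discounted = sum(e / (1 + rate) ** t for t, e in profile)

print(round(total, 1), round(discounted, 1))
```

With such a weighting, the recycling credit 70 years from now counts far less than emission reductions today, which matches the urgency argument in the text.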
When going for real-life mechanisms in LCA, there are two basic roads to incorporating such mechanisms in the analysis, rather than just the frozen steady-state effects resulting from them. One is to move to comparative static equilibrium modelling, incorporating the mechanisms but only in terms of the resulting equilibria. This has the clear advantage that alternatives can easily be generated using the same mechanisms incorporated in the LCA model, see section 3.5.3. The other option is to specify the effects of mechanisms in time. This can be done externally, specifying the time path in the model, as in a quasi-dynamic model, or by incorporating the mechanism in the model, making it a time dependent dynamic model. In time dependent models, the situation in t1 determines the situation
in a later t2. Main examples of quasi-dynamic models in sustainability decision making are in the application of cost-benefit analysis, which is obligatory in the US in many public decisions regarding the environment. Dynamic models for sustainability analysis have been developed, for example, in the energy domain; see the E3ME models, the three 'E's standing for energy, environment and economy (see E3ME, 2006), and similarly the later GEM-E3 models (GEM-E3, 2006).
Before jumping into dynamic models, let us first review other forms of non-dynamic equilibrium modelling. Steady-state models may be seen as a specific form of comparative static equilibrium models. Market models in economics are a main example of such equilibrium models and may be quite relevant in sustainability decision making.
3.5.3 Non-steady state static equilibrium models
Steady-state life cycle inventory modelling specifies how things would go if the same technical relations were to hold indefinitely. There are other ways of modelling which may also specify an equilibrium situation, but not based on this long-term assumption. The best known example is the supply and demand relations on the market for some product. If additional demand is created, supply adjusts to this demand through the price mechanism, which leads to adjusted demand, etc., in a number of adjusting steps. These dynamic steps are not modelled in comparative static equilibrium modelling; only the equilibrium is specified. The one-market situation is the simplest one. However, markets are connected, as increased supply of some product implies increased demand for the intermediate products required in its production. So the additional demand will lead to price changes upstream. This will affect the behaviour of current purchasers, who will decrease the amount purchased and will substitute part of their demand with other products; e.g., when propylene is required for the product investigated, some other producers will shift to PVC as their construction material. It is clear that for the analysis of the environmental consequences of choosing one type of product instead of another, such mechanisms are of utmost importance. So why don't we shift to these market based mechanisms in LC inventory analysis, using given technologies as a reference, as in current LC inventory analysis? We would if we could, but we can't. There are two basic reasons why this option is not yet open:
• One is the computational requirements in linking many markets simultaneously. In current LC inventory analysis, a process is specified in terms of its functioning at some working point. In market mechanism based equilibrium modelling, the amounts of inputs and outputs can be varied to some extent independently, based on the production function. Given prices for inputs and outputs, the producer will adjust his behaviour. For each process, hence, a production function
and a goal function are specified. The simultaneous equations resulting for the system as a whole are, even for the simplest production functions, beyond the scope of mainframe computers if the number of processes is large enough. The number of processes involved in current LCA is much smaller than in this type of equilibrium analysis, as in LCA side flows are cut off, as through allocation, while in general equilibrium modelling almost all processes will enter the system. Even if through some aggregation procedure the number of processes is reduced to "normal" product system sizes of around one thousand (ecoinvent contains around 2500 processes, and CEDA 3.0 and CEDA EU25 specify around 500 sectors), the set of equations cannot be solved.
• The second reason is data requirements. Current LCI data reflect a certain capacity use or working point, with input-output coefficients fixed at that point. Shifting to market based analysis requires data on production functions, which generally are not available. One specific problem in data requirements is the quite common mechanism of substitution. While simple markets can be typified through elasticities of supply and demand, substitution involves cross-elasticities, where the supply of one product by producer 1 influences the demand for other products not directly related to the production chain of producer 1. Realistic modelling requires a deep insight into the technologies and markets involved, as shifting to different suppliers and different products usually involves initial adjustment costs. Empirical modelling is limited in this field and, to our knowledge, not present in relation to the modelling of environmental interventions.
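The single-market adjustment described earlier in this section can be sketched as a linear supply and demand model solved directly for its equilibrium, as comparative static modelling does; all coefficients are invented:

```python
# Invented linear one-market model:
#   demand(p) = d0 - d1 * p,  supply(p) = s0 + s1 * p
d0, d1 = 100.0, 2.0
s0, s1 = 10.0, 1.0

def equilibrium(extra_demand: float = 0.0):
    """Price and quantity where supply meets (possibly shifted) demand."""
    p = (d0 + extra_demand - s0) / (d1 + s1)
    q = s0 + s1 * p
    return p, q

p0, q0 = equilibrium()
p1, q1 = equilibrium(extra_demand=9.0)  # a new product system adds demand
print(p0, q0)   # base equilibrium
print(p1, q1)   # price rises; current purchasers are partly crowded out
```

Note that the quantity rises by less than the 9 added units: the price increase crowds out part of the existing demand, exactly the mechanism a fixed-coefficient LCI model cannot represent.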
The solutions chosen in economic modelling go in two directions. One option is to aggregate main parts of the economy into larger unspecified sum-processes, which are difficult to link to environmental interventions. This is one variant of applied general equilibrium modelling. The other option is to cut off the system and go for partial equilibrium modelling.
It would be interesting to investigate these options of comparative static modelling as possible extensions of LC inventory analysis. Some interesting examples have already been developed in the realm of dynamic modelling (see Section 3.5.4 for more details).
3.5.4 Non-steady state dynamic models
If models are to specify the actually expected consequences of a decision, they should reflect the real mechanisms. They should therefore be dynamic, as causes and effects are always intertwined in time. Predicting the future in enough technological detail to specify environmental interventions and their impacts is indeed a main challenge. Generally, sophisticated extrapolation is an option for short-term prediction. For the longer term, models with specific
empirical relations are to be preferred. However, we cannot hope to effectively cope with all relevant variables and their mutual relations in an adequate way. In terms of detailed modelling, the future is open to a substantial extent. So, dynamic models do not pretend to predict the future, but may depict developments reckoning with a few main mechanisms.
3.5.4.1 Dynamic input-output models
The simplest dynamic input-output based modelling type is one where a number of technologies are available, with the economically better one replacing the other equivalent technologies based on an investment function. An example is the DIMITRI model (Idenburg and Wilting, 2004), which indicates the environmental consequences of introducing a new technology. Similarly, food consumption can be analysed (Duchin 2005) and the environmental advantages of international trade can be studied (Unger and Ekvall, 2003). A time path of technology mixes is depicted. Its environmental interpretation is not relative to a functional unit in this case, but relates to total demand in society. The development of emissions in time may be specified. Using this model for sustainability decision support on technologies is not possible directly, as the basis for comparison must first be established. For example, a link to steady-state LCA can be made quite straightforwardly: the technology mix for a certain year can be used as a background data set for a system which is further defined in terms of usual LCA processes. We are then back to usual steady-state hybrid LCA, see Suh (2003), and use the dynamic model for the generation of background data only. If such models were to develop towards specifying future technologies in an encompassing way, this would be very interesting. Some may doubt, however, whether the required insight into technology development is there. Surely, more knowledge is required than for an investment function indicating the dissemination of a new and economically superior but exogenously defined technology. In special cases this approach may already be interesting as an LCI type of analysis.
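The mechanism of an economically superior technology gradually replacing an incumbent can be sketched with an invented logistic-style market-share rule (this is an illustration only, not the actual DIMITRI investment function):

```python
# Two technologies with invented unit costs and emission factors.
cost = {"old": 1.00, "new": 0.80}    # the new technology is 20 % cheaper
emis = {"old": 1.00, "new": 0.40}    # kg CO2-eq per unit output

share_new = 0.01      # the new technology starts at a 1 % market share
demand = 100.0        # constant total demand per period
adoption_rate = 3.0   # invented sensitivity of investment to cost advantage

for year in range(10):
    mix_emissions = demand * (share_new * emis["new"]
                              + (1 - share_new) * emis["old"])
    print(year, round(share_new, 3), round(mix_emissions, 1))
    # Replacement investment flows to the cheaper technology (logistic form):
    advantage = (cost["old"] - cost["new"]) / cost["old"]
    share_new += adoption_rate * advantage * share_new * (1 - share_new)
```

The output is a time path of technology mixes and total emissions relating to total demand, not a result per functional unit, which is exactly why such a model cannot directly replace an LCA comparison.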
3.5.4.2 Dynamic general equilibrium models
The main focus in environment related dynamic modelling is on energy technologies, fuelled by concerns about climate change and the supply of fossil fuels. These concerns have led to a series of Computable General Equilibrium models (CGE, also Computational General Equilibrium models, and also named AGE, Applied General Equilibrium models), most of which are dynamic or quasi-dynamic and a few comparative static; see Bergman and Henrekson (2003) for a survey. A main example of such a model has been developed for the European Commission, the E3ME model, which has been extended into a global version as E3MG and is regularly updated (version 4.0 is available now, see the E3ME website: http://www.camecon.com/e3me/). The general set-up of such CGE models is that foreground processes are modelled specifically, though not as specifically as in LC inventory analysis, while
the background data are in the form of input-output tables to which the foreground processes link, as in integrated hybrid analysis and hybrid LCA. Similar to the DIMITRI model, the input-output part is not constant but itself dynamic as well, based on technology development or assumed technology development scenarios. These models may cover not only the energy part but also specify resource use and emissions, especially as related to energy use. For a given decision where LCA is now invoked, it might also be possible to use such a CGE model. In principle, the technologies considered in the choice can be specified in this framework and the consequences for the environment derived by comparing the paths as predicted by the model. However, in practice the number of foreground equations is only a few dozen, still covering quite aggregate units like a number of demand functions, macro-economic relations and base technologies for energy production. The background data, based on input-output data, are also quite aggregate, typically involving around 30 sectors. It will hence not be possible to derive a detailed insight into the consequences of more specific technology choices now. However, the expansion of these models is ongoing, and especially for decisions involving larger entities, as in energy production, metals production or choices of mode of transport, such models may already be more adequate than LCAs for getting the overall view. Also, the level of detail in background processes may increase substantially, as more detailed EIO tables become available. Their interpretation in terms of functional units remains quite impossible, however. The time paths depicted either have a cut-off, or they may never converge. With a cut-off, for example 2025, the sum of all environmental interventions might be treated as in LCA impact assessment, but not covering long-term effects. If paths do not converge, a cut-off in time has to be made to avoid large but irrelevant effects. Discounting of effects could be an option.
Partial equilibrium models have not been developed in a way which is interesting for direct application in environmental analysis. In market analysis, they may be as detailed in terms of processes as LCA models are now, but covering a few processes only. It is at this level that substitution processes can be described in the detail which would be required for use in LCA. Effectively, such models are lacking empirically.
Conclusions
• Current LCI modelling gives results in terms of steady-state scenarios which reflect the technologies as specified. This state of affairs is by no means a natural state to be in forever.
• Going to time-specified models opens up options for increased realism, also in the impact assessment, at the cost of increased model complexity and data requirements.
• Modelling market mechanisms within LCA, as in partial equilibrium modelling, would add realism to LCA, but is not possible at the moment, due to the system complexity required in representing the real world and the immense data requirements.
• Modelling substitution is one of the most complicated parts of the economic modelling of markets, with extreme data demands which may be met only incidentally.
• Modelling of substitution at the level of detail of LCA process specification is not present in relation to environmental modelling and is hardly available at all. Efforts to model substitution in LCA are not relevant before more simple market relations have been incorporated in LCA.
• Dynamic models might be used for background modelling in LCA, if they can be integrated in the form of input-output tables. This option deserves attention for more future-oriented LCAs.
• Dynamic economic-environmental models as developed for energy analysis may replace steady-state LCAs in situations where decisions involve larger units in society. Their structure and interpretation are fundamentally different from functional unit linked LCA.
• Dynamic LCA for detailed technology decisions seems an impossibility given the extreme modelling complexity and data requirements.
Recommended practice
• There is no recommended practice yet regarding the use of models other than steady-state models for decision support in situations where steady-state LCI models are now applied, but see below on hybrid LCA and integrated hybrid analysis.
• …
Recommended developments
• The conditions under which optimisation models are most relevant; the methods to apply for having degrees of freedom to be filled with a goal function; and the interpretation of outcomes all deserve further attention.
3.6 Hybrid modelling for LCI
3.6.1 Modelling principles
The modelling structure of Input-Output Analysis with environmental extensions (EIOA) is very similar to that of LC inventory analysis, so LC inventory analysis and IOA can be combined in one system (see Heijungs and Suh, 2002 for the mathematical analysis, and a broad practice is emerging), especially if they cover the same sets of environmental interventions. There are two essential differences between the two modelling approaches. One is the level of aggregation of process specifications in IO tables. These are aggregate processes, like 'meat production', variously referred to as activities, sectors or industries. We mainly use the term industry here, but cannot set this as a standard. Depending on the country, the number of industries for which input-output data are produced may range
Page 91
TF3 Methodological consistency
between 15 and around 500, the last for the US and Japan. In Europe, the
number of industries into which uniform data are aggregated are around 60.
There is a new standard classification being introduced in the EU and the US
(NACE and NAICS respectively, in an updated version), and with some delay
quite probably in the UN, which distinguishes between over 600 industries.
This may be the ultimate standard for hybrid analysis in the next decade.
Process databases, such as the EcoInvent database already cover over 2500
processes, with many specialised databases adding thousands to these. So,
compared to process-based LC inventory analysis, EIOA may be more
aggregate, depending on the industry and available process details probably
between one and three orders of magnitude. The second main difference is
how the links between processes, as input and output flows, are specified. In
LCA there is various ways of description, ranging from '’number of t-shirt’ to
‘number of phone calls’ to ‘MJ of electricity’. In the world of IOA there are
standardised nomenclatures for products. One industry produces several
products and sells these in different ratios to different industries. These
specified flows, however, are condensed to their monetary value when
making the Input-Output table. This step involves the “making of homogenous
industries”. Making sectors homogenous means that parts have been cut off
by partitioning or subtraction/substitution, while, these parts have been added
to the sectors they belong to most, following procedures similar to those in
allocation in LCA. As LC inventory analysis is liberal in its dimension of
connecting flows, the combination with flows in monetary terms poses no
problems.
Having the same mathematical structure, LC inventory analysis and EIOA can
be combined in a hybrid type of analysis, next to using either of them
separately. In the combination there are two basic options. One is to
strengthen LC inventory analysis by using EIOA data as an approximation, for
fast studies and especially for missing flows. In this way, arbitrary cut-offs can
be avoided and equal levels of completeness can be specified even for
technically quite different alternatives. We refer to this option as tiered hybrid
LCA, or hybrid LCA for short in the following. The other option is to enlarge
the domain of application of EIOA by adding more technology-specific parts
related to the questions at hand. Sector specifications, even at the 600-sector
level of detail, will remain coarse averages over very different processes. By
adding detailed LCI-type technology specifications, a data structure results
which may be used for the same domains in sustainability decision support
as (hybrid) LCA, but with some interesting differences. We refer to this
second option as Integrated or Embedded Hybrid Analysis.
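The combination can be pictured as one block coefficient matrix: a process block, an IO block, and cut-off blocks linking the two, solved in a single step. The sketch below is a minimal numerical illustration rather than the formulation of any particular author; all coefficients are invented, and only an upstream cut-off block is filled in.

```python
import numpy as np

# Minimal sketch of a combined (integrated hybrid) coefficient matrix.
A_process = np.array([[0.0, 0.1],      # process-to-process inputs,
                      [0.2, 0.0]])     # per unit of process output
A_io = np.array([[0.1, 0.3],           # sector-to-sector inputs,
                 [0.2, 0.1]])          # per EUR of sector output
C_up = np.array([[0.05, 0.00],         # EUR of sector output needed
                 [0.00, 0.02]])        # per unit of process output
C_down = np.zeros((2, 2))              # no process outputs into sectors here

A = np.block([[A_process, C_down],
              [C_up,      A_io]])
f = np.array([1.0, 0.0, 0.0, 0.0])     # final demand: one unit from process 1

# One solution step covers both the process part and the IO part.
x = np.linalg.solve(np.eye(4) - A, f)
```

The scaling vector x then contains physical process outputs in its first block and monetary sector outputs in its second, so the IO background is traced automatically behind the foreground processes.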
The focus of applications is the same for both types of analysis: supporting
choices on technologies from the environmental part of sustainability
considerations, possibly combined with economic and social aspects.
Technology is a broad concept here, including consumption technologies like
cooking food and driving a diesel car, and choices may relate to specific
technologies but also to strategies and policies having an influence on such
technologies. So the analysis is in principle a comparative one, stating the
environmental characteristics of each of the alternatives, or variants of them,
under scrutiny. In LCA an equal functional unit is the basis of comparison, in
process LCA usually specified in physical functional terms, like ‘driving one
car kilometre’, with different types of cars or fuels to be compared. In EIOA,
the functional unit would tend to be specified in terms of an equal amount of
spending, for example on car driving, leading to different amounts driven, as
different car systems have different prices. The monetarily defined functional
unit can easily be applied in process LCA, while the physically defined
functional unit can be used in EIOA, with simple price-volume
transformations.
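Such a price-volume transformation is no more than a multiplication or division by a price. A minimal sketch, with all prices invented for illustration:

```python
# Price-volume transformation between physically and monetarily defined
# functional units (all prices are invented, illustrative values).
price = {"diesel car": 0.40, "electric car": 0.55}  # EUR per vehicle-km

# EIOA-style functional unit: equal spending on car driving leads to
# different distances driven, as the car systems have different prices.
spending = 1000.0                                   # EUR
km_driven = {car: spending / p for car, p in price.items()}

# Process-LCA-style functional unit: equal distance driven leads to
# different levels of spending.
distance = 1000.0                                   # vehicle-km
cost = {car: distance * p for car, p in price.items()}
print(km_driven)
print(cost)
```

The two directions of the transformation correspond to the two rows of functional unit options discussed below.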
The more fundamental option, however, is that totals for society can be
specified, as resulting from total consumption. Especially if a global model is
available, like GTAP (see ref: GTAP), a change in volumes of consumption or
a change in technologies can be specified against this full total in society. So,
in the combination of LC inventory analysis and EIOA, four main options for
defining the functional unit can be discerned; see Table 5. Of course, it is
always possible to step down from a full-size, monetarily defined system
specification to a product specification of the corresponding totals, to a certain
amount of spending on the product involved, or to a certain amount of the
product involved in physical or functional terms. Surely, the outcomes will
differ.
Table 5: Four main options for functional units in LCA and EIOA
combinations

Products described in terms of product characteristics:
• FU1 (unit size, arbitrary): 1000 bus-km of local bus transport in the EU;
or: 20,000 bus passenger-km
• FU3 (full amounts, not arbitrary): total volume of local bus transport in the
EU, 15 billion bus-km; or: 30 billion bus passenger-km

Products described in terms of monetary value:
• FU2 (unit size, arbitrary): 1000 € of expenditure on local bus transport
• FU4 (full amounts, not arbitrary): total volume of final expenditure on local
bus transport in the EU
3.6.2 Tiered Hybrid LCA
Process descriptions in process-based LCA are usually incomplete in terms
of the flows specified, both the directly visible flows of materials and energy
required for a product and overheads such as capital goods, research and
development, marketing, administration, etc. Filling in such data is a costly
affair, the main cost of making LCAs. As a consequence, studies with an
extensive budget, being more complete, will show worse results,
environmentally speaking, than simple studies. The amount of underreporting
can be estimated using the in-principle full coverage of EIOA.
However, only a few databases for EIOA are available. Detailed IO tables
with environmental extensions have been pioneered in Japan by Moriguchi
and in the US by the research group at Carnegie Mellon, with adaptations for
the use of American data in LCA by Suh, and adaptations to the European
situation in the EIPRO study for the European Commission (Tukker 2004,
Huppes 2006). Even the roughly 500 x 500 tables for Japan, the US and the
derived EU25 are extremely coarse compared to the detail encountered in
process LCA. There is a global EIOA system available, GTAP, with around 60
sectors and 60 regions, but a very limited number of environmental
interventions. European EIOA data are also available at a higher level of
aggregation, as NAMEAs per country, with around 60 sectors, slightly
different from the GTAP sectors. The NAMEA data differ from the EIPRO data
and the GTAP data. However, as the totals involved will by and large
represent the real world, on average the EIOA scores may already be seen
as reasonable, while their data quality may still be much improved. One
strategy for making LCAs is to make them in a hybrid way. Processes specific
to the product system studied, and processes well described in available
process databases, are filled in in detail. Normal business administration can
track down the costs of purchases, the proceeds of sales, and the resulting
value added as factor incomes. The part of costs not covered in the LCA can
usually be established at the level of the firm, including an indication of the
type of activities and purchases involved. These missing flows can then be
linked to the most relevant sectors. When developing an LCA study, after
establishing the basic process structure, all other processes can be estimated
roughly using EIO tables. Where the input-output based flows seem important
at the overall system level, these data can be replaced by more detailed
process data, thus systematically improving the overall quality of the study
(see Suh and Huppes 2002).
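As a minimal sketch of this tiered procedure, the fragment below adds EIO-based estimates for purchases not covered by process data to a process-based result. Every number is invented: the per-euro multipliers stand in for upstream-cumulative emission intensities read from an environmentally extended IO table, and the cost items for the uncovered purchases identified in the firm's accounts.

```python
# Hypothetical upstream-cumulative CO2 multipliers per EIO sector
# (kg CO2 per EUR of final purchase; invented values).
eio_multiplier = {"business services": 0.2,
                  "machinery": 0.9,
                  "chemicals": 1.4}

# Process-based inventory for the flows the LCA actually covers (invented).
process_co2 = 12.0  # kg CO2 per functional unit, foreground system

# Purchases not covered by process data, taken from the firm's cost accounts
# and assigned to the most relevant sectors (EUR per functional unit).
missing_purchases = {"business services": 3.0,
                     "machinery": 1.5,
                     "chemicals": 0.5}

# Tiered hybrid estimate: process result plus EIO estimates of missing flows.
eio_co2 = sum(eio_multiplier[s] * spend for s, spend in missing_purchases.items())
total_co2 = process_co2 + eio_co2
print(round(eio_co2, 2), round(total_co2, 2))
```

Where an EIO-based term turns out to dominate the total, that flow is a candidate for replacement by detailed process data in the next iteration of the study.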
This hybrid LCA approach seems a most sensible addition to process-based
LCA. Current LCI-oriented databases form a reasonable start for hybrid LCA.
Improved data, especially the linked global IO systems now starting to be
used, could substantially improve the quality of the input-output based part.
3.6.3 Integrated Hybrid Analysis
Integrated Hybrid Analysis (IHA) depicts total volumes of final demand. In a
regionalised global model, this would mean the final demand in all regions,
with the regions linked in terms of import and export flows. By depicting total
demand, the corresponding sets of environmental interventions specify the
total of anthropogenic interventions as well. The strength of Integrated
Hybrid Analysis is that it can form a bridge between sustainability
requirements at a macro level and the specific activities in production and
consumption.
Conclusions
• Linking process LCA and environmentally extended input-output analysis
seems a most promising area for improving sustainability modelling for
technology-oriented decision support.
• There are two main lines of development: either linking EIOA to LCA, as
hybrid LCA, or linking LCA-type process descriptions to EIOA, as
Integrated Hybrid Analysis.
• In such hybrid analysis, the arbitrary functional unit can be replaced by
totals in society. This opens up perspectives for dealing with non-linearities,
both in the modelling of the inventory part and in the environmental part of
the analysis, as in the land-use effects of bio-energy in relation to its scale.
Recommended practice
• For hybrid LCA, the addition of missing flows using EIO tables should
become practice, based on the provisional databases already available.
• For integrated hybrid analysis, consumption analysis is well established.
Clarity is required on a number of issues, such as the use of producer
versus consumer prices, the way capital goods have been included, and
how international links have been established, especially in relation to
resource use.
Recommended developments
• Input-output databases with broad environmental extensions need to be
developed for all regions of the world.
• More detailed tables are required than are now available in NAMEAs and
GTAP at the 60-sector level. Even the US and Japanese level of around
500 sectors is coarse when linking to LCA-type process specifications.
• Methods for linking process LCA to EIOA have been developed in a
mathematical sense. Guidelines for ‘how to do it’ need further
development.
• Linking Integrated Hybrid Analysis to non-linear environmental
mechanisms seems highly interesting, for better assessment of technology
developments and for more rational sustainability policy development.
3.7 Mathematical structure of LCA models
The previous sections have described many important issues in LC inventory
analysis, concentrating on making the right model (steady-state, prospective,
etc.) and getting the right data (choice of technology, marginal processes,
etc.). A final aspect of LC inventory analysis is of course to combine the data
in a computational structure to produce LCI results. This is not a trivial matter,
although it is a somewhat forgotten aspect of most texts on LCA. The ISO
14044 International Standard devotes no more than three sentences to this
subject (see its Section 4.4.4.3), while the ISO 14049 Technical Report does
not mention the computational procedure at all.
Within the general context of the present report, where functional
relationships are still very open, it is difficult to present an explicit and
operational mathematical framework. Therefore, we will in many cases
introduce a more restricted form, e.g., by assuming that all functional
relationships are expressed as a linear homogeneous system of equations.
In general, the computational problem in LC inventory analysis is one of
matching the volumes of economic inputs required and outputs produced over
all processes involved. Only if the inventory relations were set up as market
relations would the computational problem expand to the level of market
clearing, matching supply and demand in an active equilibrium process. In
current LCI, the functional unit/reference flow specifies a “demand” for a
certain product; hence the “market clearing” condition states
that this product is to be produced by a production process. This process is
connected to upstream processes by other products, like materials, energy
and services, and to downstream processes by waste products. Thus, a
demand for upstream products induces an automatic supply of these
products, involving several production processes, but no market mechanisms,
and hence without any economic significance. Likewise, the supply of
downstream wastes induces a demand of waste treatment services, involving
several waste treatment processes. These processes, in turn, are also
connected to upstream and downstream processes, possibly ad infinitum.
In economic equilibrium models, market clearing conditions that are imposed
on a non-linear production function give rise to complicated non-linear sets of
equations. In input-output analysis, the production functions are assumed to
be linear. This facilitates the solution of the system of equations to a large
extent, because a system of linear equations can be solved by a
straightforward application of matrix algebra, for instance using the inverse
matrix. Likewise in LCA, systems of linear equations can be formed and
expressed in matrix terms, at least when the underlying assumption of a
linear homogeneous representation of technologies is made. Instead of
simultaneous solution with a matrix inverse, a layer-by-layer computation may
be used as well. The equivalence between the two approaches is apparent
from the fact that the inverse of a matrix may be expressed as an infinite sum
of matrix powers. By stressing the analogy between IOA and LCA, it
becomes possible to use IOA for LCA-type applications, as integrated
hybrid analysis (IHA), or to use IOA in addition to standard process-based
LCA, in the form of hybrid LCA.
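The equivalence of the simultaneous solution and the layer-by-layer computation can be checked numerically. In the sketch below, all matrix entries are invented for illustration: A is an IO-style coefficient matrix (inputs per unit of output, with column sums below one so that the power series converges) and B holds a single emission per unit of output.

```python
import numpy as np

# Invented coefficient matrix A and emission row B.
A = np.array([[0.1, 0.3, 0.0],
              [0.2, 0.0, 0.4],
              [0.0, 0.1, 0.1]])
B = np.array([[2.0, 0.5, 1.0]])        # e.g. kg CO2 per unit of output
f = np.array([1.0, 0.0, 0.0])          # functional unit: one unit from sector 1

# Simultaneous solution, using the inverse of (I - A).
x_inverse = np.linalg.solve(np.eye(3) - A, f)

# Layer-by-layer computation: x = f + A f + A^2 f + ...
x_layers = np.zeros(3)
layer = f.copy()
for _ in range(200):                   # truncating the infinite sum
    x_layers += layer
    layer = A @ layer

assert np.allclose(x_inverse, x_layers)  # the two routes agree
inventory = B @ x_inverse                # total emission per functional unit
```

Each term of the truncated series corresponds to one “layer” of upstream suppliers, which is exactly the layer-by-layer reading of the life cycle.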
An underlying assumption of the full market clearing is that of a stable
equilibrium, which is akin to a steady state. In such a steady-state, demand
and supply will match, and all processes will operate in some optimal way.
The assumptions to be made are strong, in an economic sense, like
disregarding net investments, disregarding technology development and
changes in sector structure. In a shorter time perspective, there may be
tensions between supply and demand due to time constraints in the
adjustment process, involving sunk costs. LCA is sometimes used for short-term
optimisation purposes. In that case, computational procedures taken
from operations research may be better than the steady-state matrix
approach from IOA. For instance, linear programming models may be applied
to LCA as well. It should be borne in mind that the choice between
computational procedures is related to the purpose and the overall model and
data set-up of the LCA. For example, if short-term optimisation of a production
process is required, the usual way of specifying technologies in LCA, as fixed
input-output ratios over the full process, may be less adequate than, for
example, short-term marginal relations.
When a dynamic (or otherwise non-steady-state) model is used for purposes
other than equilibrium analysis and optimisation, time lags between supply
and demand, and stock dynamics, have to be part of the model as well.
Dynamic input-output analysis provides one example of a linear model where
this has been achieved. At the level of detail required in sustainability analysis
for technology choices, the more general nonlinear case will be difficult to
formulate and solve.
3.8 Conclusions on advances in Life Cycle Inventory modelling
Consistency can be looked upon as the internal consistency of one specific
method or, more generally, as being consistent with broader knowledge in the
field: external consistency. An internally consistent model (like the Ptolemaic
view of cosmology) can be totally inconsistent with what we know about
reality, though in the long run such a discrepancy cannot persist. Ultimately, it
is consistency with well-founded, more general knowledge which counts.
However, internally inconsistent models will hardly contribute to knowledge,
so internal consistency is a derived criterion of external consistency as well.
In this concluding section we concentrate on external consistency, while Part
2 of this report is about internal consistency. The central question is: can we
deal in a consistent way with what we know and can model about reality, for
sustainability decision support on technology choices?
The first main conclusion we can draw is that better insight into the position
of current LCI modelling in relation to other modelling options is very useful.
The seeming discrepancy between steady-state modelling on the one hand
and the desire to know the future consequences of choices on the other is not
a discrepancy at all. Simplified models, like steady-state models, may give
very relevant answers about the future, though of course only partial ones,
within the confines of the simplifications chosen.
Recognising this fact may then lead to improved practice in LC inventory
analysis for prospective purposes. Especially for decisions with a longer time
horizon, the use of data representing future technological relations, instead of
old and discarded ones, can improve the consistency of LCI with real life.
This option is totally different from changing the modelling structure, as when
incorporating dynamic mechanisms.
The second main conclusion is that the rebound mechanisms coming up in
discussions in energy analysis and LCA are a diverse group. Incorporating
such mechanisms seems possible in a more systematic way, choosing
between remaining within the realm of steady-state LC inventory analysis,
and then accepting its limitations, or going outside these boundaries, and
then choosing clear modelling options: other types of equilibrium modelling,
as in partial market analysis; quasi-dynamic modelling, as in cost-benefit
analysis; or dynamic modelling, as in dynamic input-output modelling and
energy & environment applied general equilibrium modelling.
The third main conclusion is that LCA, deepened and broadened, remains the
main modelling technique for detailed systems analysis for sustainability
decision support. No other models link to detailed technology specifications.
Within the realm of steady state LCI modelling, we have the option to
represent technologies on the basis of past, current or assumed future
specification. This highly relevant subject should not be confused with
dynamic LCA. As far as mechanisms are concerned, we can stick to
technology specifications in terms of fixed input-output coefficients.
Incorporating other mechanisms, like substitution, as behavioural
mechanisms is not possible in LC inventory analysis in a systematic way, nor
is it possible as part of economic equilibrium analysis, apart from very limited
partial analyses. Acknowledging this state of affairs may simplify discussions
substantially. Whether this state of affairs should be accepted is of course a
different matter; clearly, it should not. What should be accepted is that simple
steady-state models cannot, by nature, handle dynamics in any other way
than as steady-state scenarios. For better decision support we would have to
leave the realm of steady-state LCA, and then be clear about the modelling
set-up chosen to accommodate a more dynamic way of modelling.
The most promising extensions to LCI modelling are based on the
mathematically similar environmentally extended input-output models (the
static, not the dynamic, version). The first option is using such easily
available, though much to be improved, data for missing flows in process
LCA. This may make the LCA both cheaper and faster, and can help produce
LCIs with equal completeness. This last point is of clear importance, as more
incomplete LCAs now enjoy a double premium: they are cheaper to make
and show better environmental performance. The second link is the other
way around: linking specific technologies, at the level of process LCA
specification, into the EIOA framework of total expenditures in society. This
opens the option of better linking sustainability analysis to the non-linear
environmental mechanisms that are dominant in many domains. Land use,
climate change and links to biodiversity are all based on very non-linear
processes. In making this shift, two new elements come up in sustainability
analysis:
• the functional unit can be generalised to expenditure levels on certain
groups of functions;
• the sustainability analysis can shift from a product evaluation to a
technology evaluation.
Finally, LC inventory analysis and LCA, even if deepened and broadened,
will always give a partial view, as any other model does. There is no model of
everything. It is of central importance that LCA guidelines be extended with
rules on the specification of missing mechanisms, including a first analysis of
how important these omissions might be.
4 Summary and conclusions on methodological consistency
4.1 Summary and conclusions on selected methodological issues in LCI
4.1.1 Prospective and descriptive analysis. Modelling changes in LCA
A discussion of prospective and descriptive analysis leads, for LCAs, instantly
to the discussion of attributional and change-oriented modelling. For this
reason, a scheme of recommended application should not deal with
prospective and descriptive analysis but “directly” with the question of
attributional and change-oriented modelling.
It was possible to develop a scheme in this sense. The scheme poses three,
rather straightforward, questions:
• Is decision support embodied in the goal and scope of the analysis?
• Is a change in the “status quo” embodied in any comparison being
studied?
• Can that change be modelled with a net benefit?
The first two questions have, implicitly in most cases, been discussed in
previous literature. The third question is newly introduced here.
The questions are of a general nature. They aim to represent a consensus
among the whole LCA community, and to structure a more detailed
discussion and more elaborate guidelines. They will need to be discussed
and tested, while questions 2 and 3 will need to be detailed further. For
example, when should one assume that the status quo does not change?
How can one assess “costs and benefits” of modelling the change? What can
be modelled rather easily, and what seems excessive?
These questions have not been tackled in sufficient detail in previous
literature in a way that enables LCA practitioners to decide upon a suitable
change modelling method in a rational manner. They call for a “change
analysis” as a step in every LCA that aims at decision support, and for a
detailed “method cost-benefit analysis”. The latter would best be
undertaken at a more generic, non-case-specific level, with input from specific
cases.
Neither of these forms of analysis yet exists; there are, however, several
threads that could be used as starting points. For example, the literature on
the advantages and disadvantages of attributional modelling in comparison to
change-oriented modelling is rather broad (Ekvall et al., Weidema 2003,
Frischknecht 1998; see also Chapter 3). Several authors have presented
tools applicable to a change analysis (e.g. Weidema 2003), and there is a rich
literature and knowledge outside the LCA field, in statistics and advanced
modelling, decision theory, game theory, and most specifically in the field of
prospective analysis.
There is not yet, however, a “framework” that integrates both types of
assessment and modelling, change-oriented and attributional, in a consistent
manner. The application scheme described here aims to be, in this long-running
discussion, a first step towards a consensus on modelling change in
LCA. Given how deeply the modelling of change affects LCA results, and the
conclusions drawn from an LCA, such a consensus is highly needed.
4.1.2 Multi-functionality and allocation in LCA
Based on the review of publications addressing methodological issues and
case studies, it seems that the approach for dealing with multi-functional
processes suggested in the ISO framework (ISO 14044, 2006) is not
frequently followed in the practical application of LCA. ISO recommends, in
order of preference: 1) avoidance of allocation by subdividing unit processes
or expanding the system boundaries; 2) allocation based on underlying
physical relationships; and then 3) allocation that reflects other relationships
(e.g. economic, energy or mass allocation).
In the majority of the reviewed case studies, some sort of allocation
procedure is applied. However, the levels of detail and justification provided
for decisions about system boundary expansion or allocation are inconsistent
and incomplete in most published reports. The first two steps of the ISO
hierarchy have been less commonly applied than the third. The
methodological choice for dealing with multi-functional processes is generally
handled on a case-by-case basis. No generic procedure for multi-functional
processes in co-production, combined waste processing and recycling has
been defined yet.
There is general agreement that the system expansion approach is a very
attractive way to theoretically avoid the difficult problem of allocation
altogether. In that sense, system expansion simplifies modelling because it
limits the assumptions that the modeller needs to make. However, system
boundary expansion is only applicable for consequential, not for attributional
LCAs.
But broadening the system boundaries makes the process of data collection
much more extensive. System expansion inflates the system under study due
to the widespread occurrence of multi-functional processes. System boundary
expansion generally introduces new multi-functional processes; some sort of
allocation is often still needed in order to collect the necessary background
data. Hence, in practice, allocation can very seldom be totally avoided even
by system expansion. Furthermore, system boundary expansion is equivalent
to redefining the functional unit.
In practice all types of allocation are applied, i.e. physico-chemical,
economic, mass and energy allocation. Economic allocation is most
commonly used in situations where there is co-production; it seems to be the
preferred approach and is perceived as the best avenue to capture
downstream recycling activities. However, no generic procedure for multi-functional
processes in co-production, combined waste processing and
recycling has been defined yet.
Based on the literature review the following recommendations can be made:
• Link methodological choices closely to the Goal and Scope Definition: it is
a recurring theme that the methodological choice needs to fit closely with
the goal of the study, where the intentions of the study are outlined. The
Goal and Scope Definition answers questions such as why the study is
commissioned, for what purpose, and who the target audience is. These
issues are very likely to have a direct impact on methodological choices.
Hence, a closer link of the methodological choices in multi-functional
situations to the Goal and Scope Definition can be recommended,
particularly in consequential LCAs. The justification of choices should be
explicit and transparent. Standard guidance on how to describe and justify
system boundary expansion and allocation decisions in published reports
might help to make LCA studies with multi-functional processes more
robust and transparent.
• Rethink the ISO preference order of allocation procedures: as the
suggested ISO preference order does not seem to be applied in practice,
and in view of the practical difficulties of both system boundary expansion
and the various types of allocation methods, it might be worthwhile to
consider moving system expansion from Step 1b to Step 3 in ISO 14044,
in order to put system expansion on the same level as the use of economic
and other causalities. Furthermore, economic relationships seem to be at
least as important as physical relationships in practice. Some authors
recommend economic allocation as a baseline method for most detailed
LCA applications, because it seems the only generally applicable method.
However, this goes against ISO 14044, and allocation on this basis is still
susceptible to various uncertainties, such as (locally) fluctuating prices,
demand, inflation, tariffs, industry subsidies etc. In either case, physico-chemical
allocation seems to be the preferred approach if sufficient
information is available.
• Develop industry-specific allocation procedures: it can be assumed that
no generic procedure for all multi-functional processes in co-production,
combined waste processing and recycling is definable. Hence, more effort
needs to be invested in developing allocation procedures appropriate to
specific industry sectors; if possible, physico-chemical ones.
4.1.3 Input data quality, data validation, uncertainty in LCA
Identifying consistencies is perhaps especially difficult in the data quality and
uncertainty field. Many of the papers analysed agree on only two things.
Firstly, there is broad criticism of inconsistent nomenclature and the differing
uses of important terms such as uncertainty, and of a “general infancy” of the
methodology (interestingly, this statement can be found in papers from 1996
to 2005). Secondly, there is consensus that uncertainty assessment should
be applied broadly, and that this is not yet the case. These general
statements still hold, although the situation has improved in recent years.
Data quality assessment for datasets is now applied in commonly used LCI
databases, and both Monte Carlo simulation and a “pedigree matrix”
approach that quantifies qualitative assessment information have seen broad
and successful application.
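As a minimal sketch of such a Monte Carlo step, the fragment below propagates lognormal input uncertainty (the distribution commonly assumed for LCI data, for instance when derived via the pedigree approach) through a toy one-flow inventory. All geometric means and geometric standard deviations are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy inventory: CO2 from electricity use, with invented lognormal
# uncertainty on both the amount and the emission factor.
n = 10_000
electricity = rng.lognormal(mean=np.log(5.0), sigma=np.log(1.2), size=n)  # kWh/FU
co2_per_kwh = rng.lognormal(mean=np.log(0.6), sigma=np.log(1.1), size=n)  # kg/kWh

co2 = electricity * co2_per_kwh          # kg CO2 per functional unit, per draw
low, high = np.percentile(co2, [2.5, 97.5])
print(f"median {np.median(co2):.2f} kg, 95% interval [{low:.2f}, {high:.2f}]")
```

Reporting such an interval alongside the point result is what the broad-application consensus above asks for; the same sampling loop extends directly to full inventories.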
This text identifies six stages in the conduct of an LCA:
(1) specification of the goal and scope of the analysis;
(2) input data specification and collection;
(3) calculation of the LCA study;
(4) obtaining the result of the study as output;
(5) interpretation, and perception of the result by the audience and decision
makers;
(6) decision / action taken or initiated by the decision maker.
Based on these stages, the text suggests a top-down approach, starting from
effects in the real world and from the general characteristics of a good
decision. As a consequence, the analysis of how to provide good decision
support by an “improved” LCA should not stop at the model result stage (no.
4), but should consider how the result is perceived, and how decision makers
react when perceiving it.
For the question of whether to address uncertainty or not, the text provides a
quite general answer: Uncertainty must be addressed if it is relevant for the
decision at stake, and this is the case if the uncertainty is high, or if it is
relatively higher in one alternative than in the other, or if the magnitude of the
uncertainty is of a similar order to the magnitude of the differences between
compared systems.
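This rule of thumb can be operationalised with a simple Monte Carlo comparison: sample both alternatives from their (here assumed lognormal) uncertainty distributions and check how often the ranking holds. All numbers below are hypothetical:

```python
import math
import random

random.seed(42)

def sample_lognormal(median, gsd, n=10000):
    """Draw n lognormal samples with the given median and geometric SD."""
    mu, sigma = math.log(median), math.log(gsd)
    return [random.lognormvariate(mu, sigma) for _ in range(n)]

# Hypothetical climate scores of two product systems (kg CO2-eq per unit).
a = sample_lognormal(median=10.0, gsd=1.3)
b = sample_lognormal(median=11.0, gsd=1.3)

# Fraction of runs in which alternative A scores lower than B; values
# near 0.5 mean the difference is not robust against the uncertainty.
p_a_better = sum(x < y for x, y in zip(a, b)) / len(a)
```

With these assumed values the medians differ by 10% while the geometric standard deviation is 1.3, so A wins in only about 60% of the runs: exactly the situation where, by the criterion above, uncertainty must be addressed.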
Verification and validation are, or should be, prime concerns for any modeller.
The verification process checks whether the model calculates its results in a
technically correct manner, while validation is concerned with whether the
model actually models what it should. Validation is barely used for LCAs
today; one reason being that it is difficult to apply for life cycle impacts. This
has the somewhat surprising effect that the specific result of an LCA is of
minor importance compared to the selected approach, and compared to
agreement being reached among stakeholders. Seeking possible “entry
points” for validation in an LCA product model would be well worthwhile,
and would turn Life Cycle Assessment modelling into a more scientific
approach.
Data quality indicator lists are often comparable between different authors.
Yet there seems to be far less consensus about their definition, and even
less about their application. How to deal with trade-offs between different
indicators is rarely discussed. Practical guidance would be of value, both on
selection and on practical use. From the different lists and concepts, the
“pedigree matrix” seems especially attractive; it has the appeal of combining
human judgement and hard facts into quantitative values in a clear and
transparent way.
For many of the methods considered, this paper does not provide
recommendations. Quite often, the conclusion is that further work is required.
This is not highly satisfactory, and might appear to be a common reflex in
scientific papers. However, following on from the proposal of the six stages in
an LCA application, and of a top-down approach that starts where uncertainty
and data quality really matter (at the point of considering the effects on the
decision to be supported by the LCA), it is astonishing how little indeed has
been done.
The overall picture of data quality, uncertainty, validation and verification
provided in this text is new. It is hoped that it will serve to identify consensus
and recommended application procedures, and thus provide practical
guidance, leading towards consistency and improvement, even in the field of
data quality and uncertainty.
4.2 Summary and conclusions on advancing life cycle modelling
The limitations of simple ISO LCA for decision support are substantial. The
LCI part is a static model without any dynamics incorporated. Behavioural
mechanisms, including market mechanisms, are absent. Processes refer to
the past instead of the future. Spatial differentiation is mainly lacking.
However, by being so simple, LCA has the advantage of being operational.
The problems of consistency relate to these current limitations, of which
many practitioners are keenly aware and which we would dearly like to
overcome. There is a tendency to use the quite limited static LCI model to
indicate dynamics. It would be a great improvement if either the static nature
of LCA were acknowledged, with simple and clean comparative static
analysis, or a daring choice were made for dynamic modelling as the norm.
One discussion in this vein is centred around the issue of rebound effects. In
many situations there are clear indirect effects which, as rebounds, can
qualify the normal LCA outcomes - both negatively, as with high efficiency
light bulbs leading to new energy intensive applications, and positively, as
with IT services reducing travelling. These mechanisms are linked
haphazardly now, either in a comparative static or a loosely dynamic
framework. They should rather be part of a more systematic approach to
deepened forms of life cycle analysis, in the first instance still of a
comparative, static type but which could, in due time, be linked to dynamic
modelling when relevant mechanisms and appropriate data have been
developed.
Remaining within the realm of comparative, static analysis does not
necessarily mean that we should stick to current LCI. More mechanisms may
be added in static models as well, market models being an important
example. For all such variants, clarity about what is being compared is
essential. When several technology systems may produce the same function,
these can be compared on an equal footing. In contrast, the emerging trend
to make implicit comparisons with an unspecified reference situation, by
assuming substitution to take place relative to it, is a major cause of
inconsistency. If an LCA involves comparison with a current situation, that
situation should be specified on an equal footing with the other alternatives
under study.
The term ‘substitution’, as used in the context of allocation, suggests an
economic mechanism, normally based on market mechanisms and
especially on elasticities of supply and demand. These may add a layer of
realism to the analysis, and also a layer of complexity. Considering market
reactions is clearly highly relevant to improving the realism of any
assessment of the consequences of choices. Doing this systematically is
therefore a requirement: first finding comparative static solutions, with
dynamic analysis coming “later”, if at all.
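A minimal comparative-static sketch of such a market mechanism: when demand for a product changes, the supply and demand elasticities determine how much of that change is actually met by changed production, the remainder being absorbed by other consumers reacting to the price change. Both the formula and the elasticity values are illustrative assumptions, not recommendations:

```python
def production_response(delta_demand, supply_elasticity, demand_elasticity):
    """Comparative-static estimate of how much of a marginal demand
    change is met by changed production rather than by price-induced
    changes in other consumption. Demand elasticity is negative by
    convention; with a perfectly elastic supply the share approaches 1."""
    share = supply_elasticity / (supply_elasticity - demand_elasticity)
    return share * delta_demand

# Hypothetical elasticities for a material market:
extra_production = production_response(delta_demand=100.0,  # tonnes/year
                                       supply_elasticity=0.8,
                                       demand_elasticity=-0.4)
# share = 0.8 / 1.2, so roughly two thirds of the demand change is
# met by extra production; the rest is displaced consumption.
```

Even this one-equation model makes explicit what an unanalysed “substitution” assumption leaves implicit: the substituted amount is generally smaller than the demand change, and by how much depends on the market.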
If these market mechanisms are incorporated in an LCA, they should be used
explicitly and systematically. Claiming that “substitution” is being carried out,
while failing to analyse it thoroughly and then applying this not-quite-
substitution only partially, creates substantial inconsistency at present. In
short: consistency in LCI can be much improved. This can be done either by
specifying better the
purely technology-based simple LCA, or by developing a broader comparative
static framework involving main market mechanisms. Such options for
deepening life cycle based analysis are probably feasible now,
computationally as well as conceptually, but have not yet developed
empirically. It will not be possible to go all the way to computable general
equilibrium (CGE) models, because the data requirements and
computational power needed are too great if technological detail is to be
realised. Partial equilibrium modelling is
the best target at this time, with choices about how “partial” being essential for
the outcomes and for interpreting the outcomes.
Closer to home, LCI/LCA can be much improved if the nature of current
modelling is clarified, not only in terms of what comparative static analysis is
about but also in terms of specifying the questions asked and linking the
answers to the questions. For more strategic technology questions, for
instance relating to new energy sources and transformation routes, the time
horizon of decisions is up to decades. Persevering in the use of data that
describes existing processes for such analysis then increasingly becomes the
wrong approach, linking to the past instead of to the relevant future. As the
future is not fully determined, technology scenarios then become important,
specifying consistent sets of future technologies as background for other
technology choices investigated. If wind power, clean coal and solar energy
emerge as dominant electricity technologies, low energy light bulbs, with
notable environmental burdens in their production and end of life, become
less attractive.
Moving to dynamic analysis at the level of detail required in technology-
specific LCI is currently not feasible. Some dynamic elements are present in
macro-level energy modelling, roughly linked to major technologies, as
applied in general equilibrium models (GEM, also referred to as CGE:
computable general equilibrium models). These models have an equilibrium
part with market mechanisms, and a time dependent part in which
technologies develop due to investment in new types, or other dynamic
mechanisms. Though not specifiable at sufficient detail for the purpose of
comparing different technology alternatives that could deliver a functional unit,
they may play a role in background process specification for LCI, as separate
but linked models. This may become more relevant if these general
equilibrium models are themselves developed to embody more technological
detail. Currently they represent the economy mostly at a 20-30 sector level of
detail. Input-output databases with more sectoral detail are being developed,
moving towards the level of around one hundred sectors, and in some cases
even up to 500. The link to specific technologies, as required in LCA, then
becomes much more meaningful.
The detailed IO tables with broad environmental extensions (EIOA) that are
emerging can be linked to current LCI in two different ways. One way is to
use them to solve some of the data problems in LCI, incorporating
background data based on such IO tables in a tiered hybrid analysis. This
analysis is mathematically fully equivalent to current LCI, as matrix inversion.
However, a whole new domain of life cycle analysis can be developed, not
linked to a functional unit of arbitrary size but to full totals in society. The
system analysed in technological detail is fitted into the sectoral framework
with total demand for the function specified in the context of total demand in
society. This analysis has the big advantage that the link to sustainability
aims, which are not at the level of product systems but at the level of society,
can be made directly. This integrated hybrid analysis (IHA) makes the link
from the micro to the macro level of analysis. If the analysis were next
extended to market mechanisms, as partial equilibrium analysis, the
specification in the integrated hybrid analysis could function as background
for choosing which partial markets to model: the most relevant ones.
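The statement that tiered hybrid analysis is mathematically equivalent to current LCI can be made concrete with the standard matrix formulation of LCI: the scaling vector s solves As = f, and the inventory is g = Bs; an IO-based background simply adds rows and columns to A. All coefficients below are invented for illustration:

```python
import numpy as np

# Technology matrix A: columns are unit processes, rows are products.
# Process 1 makes 1 unit of electricity and uses 0.02 unit of fuel;
# process 2 makes 1 unit of fuel and uses 0.1 unit of electricity.
A = np.array([[1.0, -0.02],
              [-0.1, 1.0]])

# Intervention matrix B: CO2 emitted per unit of each process.
B = np.array([[0.5, 3.0]])

# Final demand f: one unit of electricity (the functional unit).
f = np.array([1.0, 0.0])

s = np.linalg.solve(A, f)   # scaling factors for each process
g = B @ s                   # life cycle inventory result
```

The same solve handles a tiered hybrid system once the IO background sectors are appended to A and B, which is why the two analyses are computationally one and the same.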
5 References
ABS (2001) Australian Bureau of Statistics – Input Output Tables 1996-1997, ABS
publication 5209.0, Canberra
AIAA "Guide for the Verification and Validation of Computational Fluid Dynamics
Simulations", AIAA Standards Series, AIAA (American Institute for
Aeronautics and Astronautics), 1998.
An International Workshop on Electricity Data for Life Cycle Inventories, Introduction
and Overview, (2001), http://www.sylvatica.com/Issues_paper.pdf, site
accessed 30 April 2005.
Analysis of Asset Allocation, financial dictionary & glossary, http://www.assetanalysis.com/glossary/ site accessed 30 September 2004.
Argus (Arbeitsgruppe Umweltstatistik GmbH) (2002): Untersuchung des Berliner
Restabfalls aus der Systemabfuhr; Durchführung von Sieb- und
Sortieranalysen; Repräsentative Beprobung von Siebfraktionen für die
anschließenden Bestimmung chemisch-physikalischer Parameter, im Auftrag
der BSR.
Asher, H.B.: Causal Modelling, Sage Publishing, Newbury Park 1989.
ASQ (American Society of Quality): Web-glossary, http://www.asq.org/info/glossary,
website accessed 4 October 2004.
Azapagic A., Clift R.: Allocation of environmental burdens in multiple-function
systems. J Cleaner Prod 7(2):101-119 (1999).
Azapagic A., Environmental system analysis: The application of linear programming
to Life Cycle Assessment, Ph.D. dissertation, University of Surrey, 1996.
Azapagic A., R. Clift. Linear programming as a tool in Life Cycle Assessment, Int. J.
LCA 3(6) (1998) 305-316
Azapagic, A.; Clift, R. Int. J. LCA 1999, 4(6), 357-369
Azapagic A, Clift R (2000): Allocation of Environmental Burdens in Co-Product
Systems: Product-Related Burdens (Part 2). Int J LCA 5(1) 31–36
Bacharach M (1970): Biproportional Matrices and Input-Output Change, Cambridge
University Press: Cambridge
Backhaus K., Erichson B., Plinke W., Weiber R.: Multivariate Analysemethoden,
Springer, Berlin 1994.
Bandemer, H., Gottwald, S.: Fuzzy Methoden; Akademie Verlag Berlin 1993.
Basson, L., Petrie, J.G.: An Integrated Approach for the Management of Uncertainty
in Decision Making Supported by LCA-Based Environmental Performance
Information, iEMS conference 2004, proceedings, 14-17 June 2004,
Osnabruck, Germany.
Baumol, W.J. Economic theory and operations analysis. London: Prentice Hall, 1972
Beaufort, A. S. H. de, R. Bretz, R. Hischier, M. Huijbregts, P. Jean, T. Tanner, and
G. van Hoof (Eds.): Code of Life-Cycle Inventory Practice. SETAC Press,
Pensacola (USA) / Brussels (Belgium). 2003.
Bell, D. E., Raiffa, H., Tversky, A.: Decision Making. Descriptive, normative, and
prescriptive interactions. Cambridge University Press, 1988.
Bergman, Lars and Magnus Henrekson (2003) CGE Modelling of Environmental
Policy and Resource Management. Triest. Downloadable from:
http://www.ictp.trieste.it/~eee/workshops/smr1533/Bergman%20%20Handbook-1.doc
Bernesson S, Nilsson D, Hansson P-A (2004): A limited LCA comparing large- and
small-scale production of rape methyl ester (RME) under Swedish conditions.
Biomass and Bioenergy 26, 545–559
Bevington R., Robinson D.K.: Data Reduction and Error Analysis for the Physical
Sciences, WCB/McGrawHill, Boston 1992.
Bez, J., Heyde, M., Goldhan, G. (1998) Waste Treatment in Product Specific Life
Cycle Inventories - An Approach of Material-Related Modelling. In: Int. J. LCA
3 (2) 100 – 105 (1998)
Borg M, Paulsen J, Trinius W (2001): Proposal of a Method for Allocation in
Building-Related Environmental LCA Based on Economic Parameters. Int J LCA 6 (4)
219–230
Braam, J., Tanner, T.M., Askham, C., Hendriks, N., Maurice, B., Mälkki, H., Vold, M.,
Wessman, H., de Beaufort, A.S.H.: SETAC-Europe LCA Working Group
'Data Availability and Data Quality', Energy, Transport and Waste Models,
Availability and Quality of Energy, Transport and Waste Models and Data, 6
LCA (3) 135-139 (2001)
Bretz et al., Int J LCA
Cederberg, C, Stadig, M (2003): System Expansion and Allocation in Life Cycle
Assessment of Milk and Beef Production. Int J LCA 8 (6) 350–356
Ciroth and Srocka 2005: Prozessorientierte Basisdaten für
Umweltmanagementsysteme, Arbeitspaket 2.1 im Auftrag des
Umweltbundesamtes: Zusammenstellen von Reviewansätzen, 2005
(unveröffentlicht).
Ciroth, A., Becker H.: Validation – The Missing Link in Life Cycle Assessment.
Towards pragmatic LCAs, 11 LCA (5) 295-297 (2006)
Ciroth, A., Fleischer, G., Steinbach, J.: Uncertainty Calculation in Life Cycle
Assessments - A Combined Model of Simulation and Approximation, Int J
LCA 9 (4) 216 – 226 (2004).
Ciroth, A., Hagelüken, M., Sonnemann, G.W., Castells, F., Fleischer, G.:
Geographical and Technological Differences in Life Cycle Inventories Shown
by the Use of Process Models for Waste Incinerators. Part I: Technological
and Geographical Differences, Int J LCA 7 (5) 2002 295- 300 (2002).
Ciroth, A., James, K., Trescher, Ch.: A Survey of Current LCC Studies, chapter 6 in
Hunkeler, D., Rebitzer, G., Lichtenvort, K. (edts.): Environmental Life Cycle
Costing, pp. 90-108, submitted to SETAC Publications as the Result of the
Life Cycle Costing Working Group of SETAC Europe, December 19, 2005.
Ciroth, A., Schmitz, St., Köhn, M.: Experiences from the ProBas review procedure,
poster, Setac Annual Meeting, Den Haag 2006.
Ciroth, A.: Case Studies in Life Cycle Assessments – their role, misunderstandings,
improvements. Demonstrated on a case study for plastics waste disposal.
Platform presentation, 10th Setac case study symposium, Barcelona, 2
December 2002.
Ciroth, A.: Fehlerrechnung in Ökobilanzen, Dissertation TU Berlin 2001.
Ciroth, A.: Uncertainties in Life Cycle Assessments, Editorial, Int J LCA 9 (3) 141 –
142 (2004)
Ciroth, A.: Uncertainty calculation for LCI data: Reasons for, against, and an efficient
and flexible approach for doing it, proceedings, International Workshop on
Quality of LCI Data, 20 – 21 October 2003, Forschungszentrum Karlsruhe.
CML 2001 Guinée, J.B. (ed.): Life Cycle Assessment, an operational guide to the
ISO standards, part 2a, guide; final report, May 2001.
Cochran, W.G.: Sampling Techniques, Wiley Series in Probability and Statistics,
1977.
Cooke, R.: Experts in Uncertainty, Oxford University Press, Oxford, 1991.
COST 530 Minutes of the meetings of the Working Group 3 “Data base” of COST
Action 530.
Curran, M. A. (2006) Co-Product and Input Allocation Approaches for Creating Life
Cycle Inventory Data: A Literature Review. In: International Journal of LCA,
OnlineFirst 1-14
Curran MA, Mann M, Norris G. Report on the International Workshop on Electricity
Data for Life Cycle Inventories. Cincinnati, Ohio 45268 USA, October 23 - 25,
2001.
Curran, M. A., M. Mann, and G. Norris: Report on the International Workshop on
Electricity Data for Life Cycle Inventories. National Risk Management
Research Laboratory Office of Research and Development / U.S.
Environmental Protection Agency, Washington D.C. (USA). 2001.
Dalkey, N., Brown, B., Cochran, S.: The Delphi Method, III: Use of Self Ratings to
Improve Group Estimates, RM-6115-PR, Rand Corporation, Santa Monica,
1969.
Dannemand, A.P.; Bjerregaard, E., Schleisner, S.: Driving factors for environmental
sound design and recycling of future wind power systems. European wind
energy conference and exhibition, Copenhagen (DK), 2-6 July 2001.
Dasgupta P, AK Sen, SA Marglin (1972) Guidelines for project evaluation. United
Nations, New York
Deardorff's Glossary of International Economics,
www-personal.umich.edu/~alandear/glossary/p.html, site accessed 30 April
2005.
Dietz, Th.: What is a Good Decision? Criteria for Environmental Decision Making,
Human Ecology Review, Vol. 10, No. 1, 2003, 33-39.
Duchin, F. 2005. Sustainable Consumption of Food: A Framework for Analyzing
Scenarios about Changes in Diets. Journal of Industrial Ecology, 9(1-2) pp.
99 - 114
EcoInvent See http://www.ecoinvent.ch/
Ekvall T: Limitations of Consequential LCA, InLCA-LCM 2002,
http://www.lcacenter.org/lca-lcm/index.html, May 2002, site accessed 30
September 2004.
Ekvall, T., Ciroth, A., Hofstetter, P., Norris, G.: Evaluation of attributional and
consequential life cycle assessment, in preparation.
Ekvall, T., Weidema, B.: System Boundaries and Input Data in Consequential Life
Cycle Inventory Analysis, 9 LCA (3) 161-171 (2004).
Ekvall, T., Cleaner production tools: LCA and beyond. Journal of Cleaner Production,
2002. 10(5): pp403-406.
Ekvall, T.; Finnveden, G. J. Cleaner Prod. 2001, 9, 197-208.
Ekvall T (2000): A Market-Based Approach to Allocation at Open-Loop Recycling.
Resources, Conservation and Recycling 29, 91–109
Ekvall, T.: System expansion and allocation in life cycle assessment. Göteborg:
Department of Technical Environmental Planning, Chalmers University of
Technology. (AFR Report 245) (1999).
Ekvall, T., Tillmann, A.-M.: Open-Loop Recycling: Criteria for Allocation Procedures,
2 LCA (3) 155-162 (1997).
Ellingsen H, Aanondsen SA (2006): Environmental Impacts of Wild Caught Cod and
Farmed Salmon – A Comparison with Chicken. Int J LCA 1 (1) 60–65
Eyjólfsdóttir HR, Jónsdóttir H, Yngvadóttir E, Skúladóttir B (2003): Environmental
effects of fish on the consumers dish – Life cycle assessment of Icelandic
frozen cod products. Icelandic Fisheries Laboratory Report 06-03,
Technological Institute of Iceland
EN ISO 14041 (1998): Environmental Management - Life Cycle Assessment - Goal
and Scope Definition and Inventory Analysis (ISO/EN 14041:1998). European
Committee for Standardization (CEN), Brussels.
Feitz, A., Lundie, S., Dennien, G., Morain, M., Jones, M (2007) Generation of an
Industry-specific Physico-chemical Allocation Matrix - Application in the Dairy
Industry and Implications for Systems Analysis. In: 12 IJLCA (2) 2007, pp.
109-117
Finnveden, G., Nielsen, P.H.: Long-Term Emissions from Landfills Should Not be
Disregarded, Letters to the editor, Int. J. LCA 4 (3) 125 – 126 (1999).
Frischknecht R., Life Cycle Inventory Analysis for Decision-Making; Scope-dependent
Inventory System Models and Context-specific Joint Product
Allocation, PhD. Thesis Nr. 12599, Swiss Federal Institute of Technology
(ETH), Zurich 1998.
Frischknecht R (2000): Allocation in Life Cycle Inventory Analysis for Joint
Production. Int J LCA 5 (2) 85–95
Frischknecht, R., Jungbluth, N. (Editors): Overview and Methodology Data v1.01
(2003), ecoinvent report No. 1, Dübendorf, December 2003. [ecoinvent 2003]
Frischknecht, R., N. Jungbluth, H.-J. Althaus, G. Doka, R. Dones, R. Hischier, S.
Hellweg, T. Nemecek, G. Rebitzer, and M. Spielmann: Overview and
Methodology. CD-ROM Final report ecoinvent 2000 No. 1, Swiss Centre for
Life Cycle Inventories, Dübendorf (Switzerland), 2004.
Funtowicz S., and Ravetz J.R.: Uncertainty and Quality in Science for Policy, Kluwer,
Dordrecht, 1990.
Gem-E3 See: http://www.gem-e3.net/
Gielen, DJ, Bos AJM, Feber de MAPC, Gerlagh T (2000) Biomass for greenhouse
gas emission reduction task 8 : optimal emission reduction strategies for
Western Europe. ECN report ECN-C—00-001, ECN, Petten, Netherlands
Godet, M., Durance, P.: Prospective Stratégique, Problèmes et méthodes, Cahiers
de Lipsor, Librairie des Arts et Métiers, Paris 2006.
Godet, M.: From anticipation to action, UNESCO Publishing, Paris 1994.
Godet, M.: Manuel de Prospective stratégique, tome 1 : Une indiscipline
intellectuelle, tome 2 : L'art et la méthode, Dunod, Paris 2001.
Godet, M.: The Crisis in Forecasting and the Emergence of the "Prospective"
Approach, Pergamon, 1979.
Greening, L, D Green, C Difiglio (2000) Energy efficiency and consumption – The
rebound effect – a survey. Energy Policy 2000 28 pp389-401
Gregg J.B., Gregg P.S.: Dry bones, Dakota Territory reflected: an illustrated
descriptive analysis of the health and well being of previous people and
cultures as is mirrored in their remnants, Univ. of South Dakota, 1988.
Guinée JB, Huppes G, Heijungs R (2004): Economic Allocation: Examples and
Derived Decision Tree. Int J LCA 9, 23-33
Guinée, J. (ed.): Danish-Dutch Workshop on LCA methods, held on 16-17
September 1999 at CML, Leiden; Final report 26-10-’99.
Guinée, J. B.; Gorree, M.; Heijungs, R.; Huppes, G.; Kleijn, R.; de Koning; A., van
Oers, L.; Wegener Sleeswijk, A.; Suh, W.; Udo de Haes, H. Handbook on Life
Cycle Assessment. Operational Guide to the ISO Standards, Kluwer
Academic Publishers: Dordrecht, 2002.
Guinée, J.B.; Huppes, G.; Heijungs, R. Int. J. LCA 2004, 9(1), 23-33.
Heijungs, R. and S. Suh. 2006. Reformulation of matrix-based LCI: from product
balance to process balance. Journal of Cleaner Production 14(1): 47-51.
Heijungs, R., S. Suh, and R. Kleijn. 2005. Numerical approaches to life cycle
interpretation. The case of the Ecoinvent'96 database. International Journal
of Life-Cycle Assessment 10(2): 103-112.
Heijungs, R., Huijbregts, M.A.J.: A Review of Approaches to Treat Uncertainty in
LCA, iEMS conference 2004, proceedings, 14-17 June 2004, Osnabruck,
Germany.
Heijungs, R., Suh, S. (2002) The computational structure of life cycle assessment.
Kluwer Academic Publishers (ISBN 1-4020-0672-1), Dordrecht, 2002, xii+241
pp.
Heijungs, R., Kleijn R.: Numerical Approaches Towards Life Cycle Interpretation, Int
J LCA 6 (3) 2001.
Heijungs, R.; Frischknecht, R Int. J. LCA 1998, 3(6), 321-332
Heijungs R: Identification of key issues for further investigation in improving the
reliability of life-cycle assessments. J Cleaner Prod 4 (3–4) 159-166 (1996).
Heijungs, R., J.B. Guinée, G. Huppes, R.M. Lankreijer, H.A. Udo de Haes, A.
Wegener Sleeswijk, A.M.M. Ansems,P.G. Eggels, R. van Duin, H.P. de
Goede (1992) Environmental Life Cycle Assessment of products. Guide and
Backgrounds. NOH reports 9266 & 9267, Leiden: CML, 96pp + 130pp. Guide
ISBN: 90-5191-064-9; Backgrounds ISBN: 90-5191-064-9
Heintz B, Baisnée P-F. (1992). System boundaries. Pp 35-52 in SETAC-Europe:
Life-cycle assessment. Brussels: Society for Environmental Chemistry and
Toxicology. (Report from a workshop in Leiden, 1991.12.02-03).
Henrickson, C., A Horvath and S Joshi (1998) “Economic Input-Output Models for
Environmental Life-Cycle Assessment.” Environmental Sci. & Tech., 32:
pp184-191.
Hertwich, E., Hammit, J.: A Decision-Analytic Framework for Impact Assessment,
part 1 6 LCA (1) 5-12 (2001), part 2 6 LCA (5) 265-272 (2001)
Hertwich, EG (2005) Life Cycle Approaches to Sustainable Consumption: A Critical
Review. In: ES&T vol.39, no.13 pp4674-84
Hischier, R., Althaus, H.-J., Werner, F. (2005) Developments in Wood and
Packaging Materials Life Cycle Inventories in ecoinvent. In: Int J LCA 10 (1)
50 – 58 (2005)
Hischier, R., A.-S. Carlsson: Comparison EcoSpold and Sirii/SPINE. Presentation at
the International Workshop on Quality of LCI Data. UNEP-SETAC Life Cycle
Initiative / Forschungszentrum Karlsruhe (FZK), Karlsruhe (Germany). 2003.
Hospido A, Vazquez ME, Cuevas A, Feijoo G, Moreira MT (2006): Environmental
assessment of canned tuna manufacture with a life cycle perspective.
Resources, Conservation and Recycling 47, 56–72
Huijbregts, M.A.J., Norris, G., Bretz, R., Ciroth, A., Maurice, B., von Bahr, B.,
Weidema, B., de Beaufort, A.S.H.: Framework for Modelling Data Uncertainty
in Life Cycle Inventories, Int J LCA 6 (3) 2001 127-131.
Huijbregts, M.A.J.: Uncertainty and variability in environmental life-cycle assessment,
academisch proefschrift, University of Amsterdam, 2001.
Hunt RG, Boguski TK, Weitz K, Sharma A (1998): Case studies examining LCA
streamlining techniques. Int J LCA 3 (1) 36–42
Huppes G, Schneider F (1994): Procedures European Workshop Allocation LCA,
Centre of Environmental Science, Leiden University, 24-25 February 1994,
CML: Leiden
Huppes, G.: Questions, models and data in LCA; note for the international workshop
on electricity data for life cycle inventories,
http://sylvatica.com/data,%20models%20&%20questions.pdf, site accessed
30 September 2004
Huppes G (1993) Macro-environmental policy: Principles and design. Elsevier:
Amsterdam
Huppes, G. Allocating impacts of multiple economic processes in LCA. In: Life Cycle
Assessment, s.n. Society of Environmental Toxicology and Chemistry
(SETAC) Brussels: SETAC, 20pp, 1992
Ibenholt K: Materials flow analysis and economic modelling, in: Ayres R.U., Ayres
L.W., editors. Handbook of Industrial Ecology, Edward Elgar, Cheltenham
2002, pp 177-184.
Idenburg AM and C Wilting (2004) DIMITRI: A model study policy issues in relation
to economy, technology and environment. In: JCJM van den Bergh and MA
Janssen Eds (2004) Economics of industrial ecology. Materials, structural
change and spatial scales. Cambridge (MA): MIT-Press, pp 223-255.
IEA (International Energy Agency): Experience Curves for Energy Technology
Policy, IEA, Paris, 2000.
ISO 14040 International Standardization Organisation (ISO): Environmental
management – Life cycle assessment – Principles and framework. ISO
14040. Geneva, 1997.
ISO 14041 (1998): Environmental management – Life cycle assessment – Goal and
scope definition and life cycle inventory analysis. International Organization
for Standardization
ISO 14041 Environmental management – Life cycle assessment – Goal and scope
definition and life cycle inventory analysis. International Organization for
Standardization, Geneva, 1998.
ISO 14042 International Standardization Organisation (ISO): Environmental
management – Life cycle assessment – Life Cycle Impact Assessment.
Geneva, 2000.
ISO 14043 International Standardization Organisation (ISO): Environmental
management – Life cycle assessment – Life Cycle Interpretation. ISO 14043.
Geneva (Switzerland), 2000.
ISO 14044 International Standardization Organisation (ISO): Environmental
management – Life cycle assessment – Life Cycle Interpretation. ISO 14044.
Geneva (Switzerland), 2006.
ISO 14048 International Standardization Organisation (ISO): Environmental
management – Life cycle assessment – LCA data documentation format.
ISO/TS 14’048. Geneva (Switzerland), 2002.
Joshi, S., (2000). Product Environmental Life-Cycle Assessment Using Input-Output
Techniques. Journal of Industrial Ecology, 3 (2-3), 95-120.
Jungmeier, G.; Werner, F.; Jarnehammar, A.; Hohenthal, C.; Richter, K. Int. J. LCA
2002a, 7(5), 290-294
Jungmeier, G.; Werner, F.; Jarnehammar, A.; Hohenthal, C.; Richter, K. Int. J. LCA
2002b, 7(6), 369-375
Kahn, H., Wiener, A. J.: The year 2000. MacMillan, London 1967.
Kahneman, D., Tversky, A.: Risk and Rationality: Can Normative and Descriptive
Analysis Be Reconciled? Working Paper RR-4, Institute for Philosophy and
Public Policy, University of Maryland, 1987.
Kahnemann, D., Tversky, A.: Rational Choice and the Framing of Decisions, Journal
of Business, 1986, vol. 59, no. 4, pt2. (1986)
Kheir, N.A. (ed.): Systems Modelling and Computer Simulation, 2nd ed., Dekker, New
York 1996.
Kim S, Overcash M (2000): Allocation Procedure in Multi-Output Process: An
Illustration of ISO 14041. Int J LCA 5 (4) 221–228
Kim S, Dale B (2002): Allocation Procedures in EtOH Production from Corn Grain.
Int J LCA 7 (4) 237–243
Kim, S.; Overcash, M.R. Int. J. LCA 2000, 5(4), 221-228
Lave, L. B., E. Cobas-Flores, et al. (1995). Generalizing Life-Cycle Analysis Using
Input-Output Analysis to Estimate Economy-Wide Discharges. Environmental
Science and Technology 29(9): pp420-426
Le Téno JF: Visual Data Analysis and Decision Support Models for
Non-Deterministic LCA. Int J LCA 4 (1) 41–47 (1999).
Leontief, W.W. (1970) Environmental repercussions and the economic structure: an
input-output approach. Review of Economics and Statistics 50, 262-271.
Linden, JA van der and Dietzenbacher E (2000): The determinants of structural
change in the European Union: A new application of RAS, Environ Planning
A 32, 2205-2229
Ljung, L., Glad, T.: Modelling of dynamic systems, Prentice Hall, Englewood Cliffs
1994.
Loper, E.: Decision trees, web course,
http://www.cis.upenn.edu/~edloper/slides/cse391_dtree.pdf, site accessed 7
May 2005.
Loulou, R, Lavigne D (1996) MARKAL model with Elastic Demands: Application to
Greenhouse Gas Emission Control. In: C. Carraro, A. Haurie (eds.):
Operations research and environmental management. Dordrecht: Kluwer
Academic Publishers, pp 201-220
Lundie, S.: Ökobilanzierung und Entscheidungstheorie, Springer, Berlin 1999.
Matthews, H.C., Small, M.J. (2001) Extending the boundaries of life-cycle
assessment through environmental economic input-output model. Journal of
Industrial Ecology 4(3), 7-10.
Merriam-Webster's Online Dictionary, http://www.britannica.com/dictionary, site
accessed 30 September 2004.
Merriam-Webster 2006 http://www.webster.com/cgi-bin/dictionary?va=substitution
Miller, R. E. and P. D. Blair (1985). Input-Output Analysis: Foundations and
Extensions. Englewood Cliffs, New Jersey, Prentice-Hall
Minutes of the International Workshop on Quality of LCI Data. UNEP-SETAC Life
Cycle Initiative / Forschungszentrum Karlsruhe (FZK), Karlsruhe (Germany).
2003.
Mongelli, I., G. Huppes and S. Suh. 2005. A structural comparison of two
approaches to life cycle inventory using the MIET and the ETH databases.
International Journal of Life-Cycle Assessment 10(5): 317-324.
Morgan M.G., Henrion M.: Uncertainty – A guide to dealing with uncertainty in
quantitative risk and policy analysis. Cambridge University Press, NY 1990.
Mungkung R (2005): Shrimp aquaculture in Thailand: Application of life cycle
assessment to support sustainable development. PhD. thesis. Center for
Environmental Strategy, School of Engineering, University of Surrey, England
Nielsen AM, Nielsen PH, Jensen JD, Andersen M, Weidema BP (2003):
Identification of processes affected by a marginal change in demand for food
products – Two examples on Danish pigs and cheese. Life Cycle
Assessment in the agrifood sector. Proceedings from the 4th International
Conference Dias Report 61, 127–134
Notten, P., Petrie, J.G.: Enhanced Presentation and Analysis of Uncertain LCA
Results with Principal Component Analysis, iEMSs Conference 2004,
proceedings, 14-17 June 2004, Osnabrück, Germany.
NPARC (NASA Glenn Research Center, Arnold Engineering Development Center),
CFD Verification and Validation website,
http://www.grc.nasa.gov/WWW/wind/valid/tutorial/glossary.html, site
accessed 2 October 2004.
NREL (National Renewable Energy Laboratory) (2004): U.S. Database Project
Development Guidelines. Prepared by Athena Sustainable Materials Institute,
NREL/SR-33806
Palmer K, Sigman H, Walls M (1997): The Cost of Reducing Municipal Solid Waste.
J Environ Econom Manage 33, 128–150
Papatryphon E, Petit J, Van der Werf H, Kaushik S. (2003): Life Cycle Assessment
of trout farming in France: a farm level approach. Life Cycle Assessment in
the agrifood sector. Proceedings from the 4th International Conference Dias
Report 61, 71–77
Papatryphon E, Petit J, Kaushik S, Van der Werf H (2004): Environmental impact
assessment of salmonid feeds using Life Cycle Assessment (LCA). Ambio 33
(6) 316–323
Parikh A (1979) Forecasts of input-output matrices using the R.A.S. Method, Rev
Econ Stat 61, 477-481
Pesonen H-L, Ekvall T, Fleischer G, Huppes G, Jahn C, Klos Z S, Rebitzer G,
Sonnemann G W, Tintinelli A, Weidema B P, Wenzel H. (2000) Framework
for Scenario Development in LCA. International Journal of Life Cycle
Assessment 5(1):21-30.
PMS 2004 www.pms.ac.uk/pms/school/glossary.php, Peninsula Medical School,
website accessed 4 October 2004.
Pohl Chr., Ros M. (1996): Sind Ökobilanzen zu präzise? In: Intelligente Methoden
zur Verarbeitung von Umweltinformation. 2. Bremer KI Pfingstworkshop
(Tagungsband), metropolis, Marburg, pp. 121–136.
Popper, K.: The Myth of the Framework, Routledge, 1996.
Prywes, R.W.: The United States Labor Force: A Descriptive Analysis, Quorum
Books, 2000.
R. Heijungs and S. Suh (2002) The computational structure of life cycle assessment.
Dordrecht: Springer
Raiffa, H., Richardson, J., Metcalfe, D.: Negotiation analysis: the science and art of
collaborative decision making, The Belknap Press of Harvard University
Press, Cambridge, London 2002.
Raynolds M, Fraser R, Checkel D (2000a): The relative mass-energy economic
(RMEE) method for system boundary selection, Part I: A means to
systematically and quantitatively select LCA boundaries. Int J LCA 5 (1) 37–46
Raynolds M, Fraser R, Checkel D (2000b): The relative mass-energy economic
(RMEE) method for system boundary selection. Part II: Selecting the
boundary cut-off parameter (ZRMEE) and its relationship to overall
uncertainty. Int J LCA 5 (2) 96–104
Rebitzer G, Ekvall T, Frischknecht R, Hunkeler D, Norris G, Rydberg T, Schmidt W-P,
Suh S, Weidema BP, Pennington DW (2004) Life Cycle Assessment. Part I:
Framework, goal and scope definition, inventory analysis and application.
Environment International 30, 701-720
Roš, M. Unsicherheit und Fuzziness in ökologischen Bewertungen. Orientierung zu
einer robusten Praxis der Ökobilanzierung. PhD thesis, ETH Zürich, Zürich
1998.
Rothenberg, J.: A Discussion of Data Quality for Verification, Validation, and
Certification of Data to be used in Modelling. RAND, Santa Monica (1999).
Sachs, L.: Angewandte Statistik, Springer, Heidelberg New York 1992.
Schwartz, P.: The Art of the Long View, Doubleday, New York 1996.
Seebregts, Ad J., Gary A. Goldstein, Koen Smekens (2001) Energy/Environmental
Modelling with the MARKAL Family of Models. Petten: ECN, Publication RX-01-039
Sen, AK (1970) Collective choice and social welfare. Holden Day, San Francisco
Seppälä J., (1999): Decision analysis as a tool for life cycle impact assessment. In:
Klöpffer W., Hutzinger O. (eds) LCA Documents 4, Ecomed publishers,
Landsberg.
SETAC (Society of Environmental Toxicology and Chemistry): Guidelines for
Life-Cycle Assessment – A Code of Practice. Report of the SETAC Workshop,
Sesimbra, Portugal. SETAC Press, Pensacola, Florida, 1993.
Seyler, C., Hellweg, S., Monteil, M., Hungerbühler, K. (2005) Life Cycle Inventory for
Use of Waste Solvent as Fuel Substitute in the Cement Industry – A Multi-Input
Allocation Model. Int J LCA 10 (2) 120–130
Silva, G. A., Kulay, A. (2003) Application of Life Cycle Assessment to the LCA Case
Studies Single Superphosphate Production. Int J LCA 8 (4) 209–214
Simon, H.A.: Spurious Correlation: A Causal Interpretation, in: Blalock, H.M. (ed.):
Causal Models in the Social Sciences, Aldine Publishing Company, New York
1985.
Spielmann, M.: Prospective Life Cycle Assessment for Transport Systems,
dissertation, Swiss Federal Institute of Technology, Zurich 2005.
Steen, B., R. Carlson, and G. Löfgren: SPINE - a relational database structure for
Life Cycle Assessments. Swedish Environmental Research Institute /
Chalmers University of Technology / Chalmers Industriteknik, Göteborg
(Sweden). 1995.
Stone R (Ed) (1963): Input-Output Relationships, 1954-1966, A Programme for
Growth, Volume 3, Chapman and Hall: London
Stone, R. A. (1963). Input-Output Accounts and National Accounts. Paris,
Organization for European Economic Cooperation.
Suh S, Huppes G (2002) Missing Inventory Estimation Tool Using Extended
Input-Output Analysis. Int J LCA 7 (3) 134-140
Suh S., Lenzen M., Treloar G.J., Hondo H., Horvath A., Huppes G., Jolliet O., Klann
U., Krewitt W., Moriguchi Y., Munksgaard J. & Norris G. System Boundary
Selection in Life-Cycle Inventories Using Hybrid Approaches. Environmental
Science & Technology 38 657-664 (2004)
Suh, S (2003): Input-Output and Hybrid Life Cycle Assessment. Int J LCA 8, 257
Suh, S. (2004b). Comprehensive Environmental Data Archive (CEDA) 3.0. User’s
guide. Institute of Environmental Sciences (CML), Leiden University, Leiden,
the Netherlands.
Suh, S. 2005. Developing sectoral environmental database for input-output analysis:
Comprehensive environmental data archive of the U.S. Economic Systems
Research 17(4): 449-469.
Suh, S. and G. Huppes. 2005. Methods in Life Cycle Inventory (LCI) of a product.
Journal of Cleaner Production 13(7): 687-697.
Suh, S. Ed. (2006) Handbook of input-output economics in industrial ecology.
Dordrecht: Springer, in press.
Sylvatica (2004): A Practical Method for Life Cycle Review of Products and Services
for Manufacturers and Purchasers: Final report. Prepared for the International
Design Center for the Environment in connection with their eLCieTM web tool,
available online <www.idce.org>, 44 pp
Thrane M (2004a): Environmental impacts from Danish fish products – Hot spots and
environmental policies. PhD Thesis. Ålborg University, Ålborg, Denmark
Tillman A-M, Baumann H, Eriksson E, Rydberg T. (1991). Life cycle analysis of
packaging materials. Calculation of environmental load. Göteborg: Chalmers
Industriteknik.
Tukker, A., G. Huppes, L. van Oers, R. Heijungs (2006) Environmentally Extended
Input-Output Tables and Models for Europe. Report for DG JRC, IPTS, under
Tender No J02/29/2004, in press.
Tukker, A., G. Huppes, L. van Oers, S. Suh, A. de Koning, R. Heijungs, J. Guinée,
B. Jansen, M. van Holderbeke, Th. Geerken, P. Nielsen (2005)
Environmental impacts of products. Sevilla: IPTS
Udo de Haes, H., Sonnemann, G. (Life Cycle Initiative): Task Forces: Overview and
Terms of Reference, Draft Final (2003).
Udo de Haes, H.A., R. Heijungs, S. Suh, G. Huppes, (2004) Three strategies to
overcome the limitations of life-cycle assessment. Journal of Industrial
Ecology 8(3) 19 – 32.
UNEP IE – Industry and Environment: Life Cycle Assessment: What it is and How to
do it? Technical Report, United Nations Environment Programme, Paris
1996.
Unger, T. and T. Ekvall (2003). "Benefits from increased cooperation and energy
trade under CO2 commitments – the Nordic case." Climate Policy 3(3): 279-294.
US EPA (1993): Life Cycle Assessment: Inventory Guidelines and Principles. Office
of Research and Development, EPA/600/R-92/245
van den Berg, N., Huppes, G., Lindeijer, E., van der Ven, B.L., Wrisberg, M.N.:
Quality Assessment for LCA, CML report 152, 1999.
Vigon B W, Tolle D A, Cornaby B W, Latham H C, Harrison C L, Boguski T L, Hunt
R G, Sellers J D. (1993). Life cycle assessment: Inventory guidelines and
principles. Washington D.C. & Cincinnati: United States Environmental
Protection Agency, Office of Research and Development. (EPA/600/R-92/245).
Vogtländer J, Brezet H, Hendricks C (2001a): The Virtual Eco-Costs '99: A Single
LCA-Based Indicator for Sustainability and the Eco-Costs/Value Ratio (EVR)
Model for Economic Allocation. Int J LCA 6 (3) 157–166
Vogtländer J, Brezet H, Hendricks C (2001b): Allocation in Recycling Systems – An
Integrated Model for the Analyses of Environmental Impact and Market
Value. Int J LCA 6 (6) 344–355
Volkwein S., Gihr R., Klöpffer W., (1996): The valuation step within LCA. Part 2: A
formalized method of prioritization by expert panels. International Journal of
LCA 1 (4): 182-192.
Von Neumann, J., Morgenstern, O.: Theory of Games and Economic Behavior,
Princeton University Press, Princeton, 1944.
Vose D.: Quantitative Risk Analysis: A Guide to Monte Carlo Simulation Modelling.
John Wiley & Sons, Chichester, New York, Brisbane, Toronto, Singapore
1996.
Wang M, Lee H, Molburg J (2004): Allocation of Energy Use in Petroleum Refineries
to Petroleum Products. Int J LCA 9 (1) 34–44
Wang M (1999): The Greenhouse Gases, Regulated Emissions, and Energy Use in
Transportation (GREET) Model, version 1.5,
<http://www.transportation.anl.gov/pdfs/TA/264.pdf>, Center for
Transportation Research, Argonne National Laboratory, Argonne, Illinois
W.E. Biles, Discrete-Event Systems, in: Khair, N.A.: Systems Modelling and
Computer Simulation, Marcel Dekker, New York 1996, p. 220
Washida, T. 2004. Economy-wide model of rebound effect for environmental
efficiency. In: Hubacek K., A. Inaba and S. Stagl (eds.) 2004. Proceedings of the
International Workshop on driving forces of and barriers to sustainable
consumption. University of Leeds, UK, March 5-6 2004.
Webster's 1913 Dictionary. http://www.webster-dictionary.org/, site accessed 30
September 2004.
Weidema B et al. (2003) Procedural guideline for collection, treatment, and quality
documentation of LCA data, CASCADE project report, version 5, January 22,
2003.
Weidema B P (2000) Avoiding co-product allocation in a simplified hypothetical
refinery. Section 3.9.3.2, pp. 36-41 in Part 2b: Operational annex of J B
Guinée, M Gorrée, R Heijungs, G Huppes, R Kleijn, A de Koning, L van Oers,
A W Sleeswijk, S Suh, H A Udo de Haes, H de Bruijn, R van Duin, M A J
Huijbregts, E Lindeijer, A A H Roorda, B L van der Ven, B P Weidema (2002):
LCA – An operational guide to the ISO standards. Dordrecht: Kluwer/Springer
Weidema B P, Norris G A. (2005). Avoiding co-product allocation in the metals
sector. Pp. 81-87 in A Dubreuil: "Life Cycle Assessment and Metals: Issues
and research directions." Pensacola: SETAC. (Proceedings of the
International Workshop on Life Cycle Assessment and Metals, Montreal,
Canada, 2002.04.15-17). http://www.lca-net.com/files/icmm.pdf
Weidema B.P., Wesnæs M.S.: Data quality management for life cycle inventories: an
example of using data quality indicators. J Cleaner Prod 4 (3-4) 167-174 (1996)
Weidema, B P (2004) Geographical, technological and temporal delimitation in LCA.
UMIP 2003 method. København: Miljøstyrelsen. (Environmental News 74).
http://www.mst.dk/udgiv/Publications/2004/87-7614-305-8/pdf/87-7614-306-6.PDF
Weidema, B P. (2003). Market information in life cycle assessment. Copenhagen:
Danish Environmental Protection Agency. (Miljøstyrelsen, Environmental
Project no. 863). http://www.mst.dk/udgiv/publications/2003/87-7972-991-6/pdf/87-7972-992-4.pdf
Weidema, B. P., N. Frees, and A.-M. Nielsen: Marginal Production Technologies for
Life Cycle Inventories. Int J LCA 4(1):48-56, 1999.
Weidema, B. P.: New developments in the methodology for life cycle assessment.
Presentation for the 3rd International Conference on Ecobalance, Tsukuba
1998.11.25-27.
Weidema, B. P.: The SPOLD file format '99. Society for Promotion of Life-cycle
Assessment Development (SPOLD), Kopenhagen (Denmark). 1999.
Download from http://www.spold.org/publ/SPOLD99.zip.
Weidema, B., Frees, N., Petersen, E.H., Olgard, H.: Reducing Uncertainty in LCI,
Developing a Data Collection Strategy, Environmental Project No. 862 2003,
Danish Environmental Protection Agency, 2003.
Weidema, B., Masoni, M., Cappellaro, F., Carlson, R., Notten, P., Pålsson, A., Patyk,
A., Regalini, E., Sacchetto, F., Scalbi, S.: Procedural guideline for collection,
treatment, and quality documentation of LCA data; Task 2.3 of the Cascade
project on Standards or Modelling LCA data, EU Contract No. G7RT-CT-2001-05045.
Weidema, B.P., A.M. Nielsen, K. Christiansen, G. Norris, P. Notten, S. Suh, and J.
Madsen, 2005. Prioritisation within the integrated product policy. 2.-0 LCA
Consultants for Danish EPA, Copenhagen, Denmark
Weidema, B.P.: LCA developments for promoting sustainability. Invited keynote
lecture for 2nd National Conference on LCA, Melbourne, 2000.02.23-24.
Werner, F., Althaus, H.-J., Richter, K., Scholz, R.: Post-Consumer Waste Wood in
Attributive Product LCA – Context specific evaluation of allocation procedures
in a functionalistic conception of LCA. Int J LCA 2006 (OnlineFirst): 1–13
Werner F, Richter K (2000): Economic Allocation and Value-Corrected Substitution.
Int J LCA 5 (4) 189–190
Werner F, Richter K (2000): Economic Allocation in LCA: A Case Study about
Aluminum Window Frames. Int J LCA 5 (2) 79–83
WordNet, A lexical database for the English language, http://wordnet.princeton.edu/,
site accessed 2 October 2004.
Ziegler F, Nilsson P, Mattsson B, Walther Y (2003): Life Cycle Assessment of frozen
cod fillets including fishery-specific environmental impacts. Int J LCA 8 (1)
39–47