J.W. Goethe University, Frankfurt am Main
Lecture: Decision Theory (Entscheidungstheorie), Summer Semester 2005
Matthias Blonski
Worksheet 1:
[For this worksheet, unlike later ones, I do not expect solutions from you, but rather a deepening of the lecture material through reading the texts — ideally by working through them actively or, better still, by discussing them together, e.g. in small groups. We will not, however, return to this material in the lecture.]
Review question
You have yourself tested for a rare disease (only 1 in 1,000,000 people contracts it) with a test whose predictive accuracy is 99.9% (i.e. very good). What is the a posteriori probability that you have the disease, given that the test comes back positive?
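The review question is an application of Bayes' rule. A minimal sketch of the computation, assuming the stated 99.9% accuracy applies symmetrically to both sensitivity (true-positive rate) and specificity (true-negative rate), which the question leaves open:

```python
# Bayes' rule for a rare-disease test.
# Assumption (not stated in the question): the 99.9% accuracy is both
# the sensitivity P(positive | sick) and the specificity P(negative | healthy).

prior = 1 / 1_000_000      # P(sick): 1 in 1,000,000
sensitivity = 0.999        # P(positive | sick)
specificity = 0.999        # P(negative | healthy)

# P(positive) via the law of total probability
p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)

# Posterior P(sick | positive test)
posterior = sensitivity * prior / p_positive

print(f"P(sick | positive test) = {posterior:.6f}")
```

The posterior comes out at roughly 0.001, i.e. about 0.1%: because the disease is so rare, false positives from the healthy population vastly outnumber true positives, so even a "very good" test leaves the probability of actually being sick tiny.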
Reading assignment: Schelling, T., "Choice and Consequence" (1984), Chapters 3-4, Harvard University Press
Here, the concept of instrumental rationality used in the remainder of the lecture is to be critically questioned and problematized further. Alongside other concepts of rationality, such as "expressive rationality", there is the phenomenon of multiple preferences or identities, which arises not only in the conceptualization of household and firm preferences but also in individuals. As reading on this topic we use a classic of the literature, Schelling (1984), Chapters 3 & 4 (mainly Chapter 3). Read carefully enough that you can answer the following questions from the text for yourself.
1. Are New Year's resolutions necessary? p. 57
2. Do households, firms, and individuals have preferences? p. 59
3. Why self-discipline (Odysseus and the Sirens)? p. 63
4. Thought experiments (pain, euthanasia)? p. 66
5. Classification of "unwanted cravings" by time profile, duration, warning time, physical characteristics, damage function, damage time profile, awareness, etc. p. 69
6. Order of magnitude: How much would a smoker who wants to quit, but cannot manage it without help, be willing to pay to be freed from the addiction? Aggregate this for an economy over one generation... p. 73
7. Tactics of self-control p. 76
8. On the legal problem of contracting with oneself p. 98
Schelling, T., "Choice and Consequence" (1984), Chapters 3-4, Harvard University Press
-> library, or as a copy from Ms. Habel (Kettenhofweg 139)
Reading assignment: Economist
Read:
Cause for conCERN? (The Economist, October 28th 2000)
See below.
This article describes an as-yet unstructured decision problem.
1. Put yourself in the role of a consultant. Structure the decision problem by applying the structuring elements discussed in the lecture (on the basis of the information given in this article).
2. For each specification of the structuring elements you choose, defend your choice against the objection that a different structuring choice could lead to a different decision outcome.
Europe’s largest physics laboratory is at a crossroads, in more ways than one
IT IS the decision from hell. CERN, a giant high-energy-physics laboratory straddling the
Franco-Swiss border near Geneva, had planned to shut its biggest particle accelerator,
the Large Electron-Positron (LEP) collider, at the beginning of October. Construction of an
even better machine, the Large Hadron Collider (LHC), would then begin in the same
27km-long, circular tunnel used by LEP. But just as LEP seemed ready to make a
graceful exit, nature decided to tease its physicists with some oh-so-tantalising hints of a
new fundamental particle.
In the past few months, experiments at LEP have detected signs of what has become the
rainbow’s end of high-energy physics, the Higgs boson. This is the last particle missing
from the list of those predicted by the “Standard Model” of the universe. Finding the
Higgs could close a major chapter of physics. It could also open up a whole new tome,
forcing physicists to move beyond the Standard Model into a weird world of half-suspected particles and multiple dimensions (see our article on October 7th). Either way,
proof of the Higgs’s existence would be the scoop of the decade. But gathering the data
needed to confirm that scoop takes time.
An existential crisis
Time, unfortunately, is what CERN lacks. The horns of its dilemma are clear enough.
Keep taking measurements and maybe make a stunning breakthrough—but delay an
engineering project which is scheduled with a Swiss precision that will cost much to
undo. Or stick with the original plan, close LEP, and run the risk that another high-energy-physics laboratory will nab the Higgs first. Assuming, of course, that it is the
Higgs that is out there, and not a cruel statistical fluke.
Faced with this choice, the answer is obvious. The physicist keeps measuring—what
could be more important than catching the Higgs? The manager sticks to the original
plans—the alternative is a logistical nightmare that could lead to spiralling costs for the
LHC. And CERN’s backers, governments not always known for their generosity towards
fundamental research, lose their patience. This is not an idle threat. A giant hole in the
desert in Texas bears witness to a previous occasion when politicians pulled the plug on
an over-expensive particle accelerator.
Given such a tough call, CERN’s director-general, Luciano Maiani, decided to offer the
laboratory a month to make some more measurements—a period that is still just within
the margins of the original schedule for the LHC. This is probably not enough to corner
the Higgs, but it gives CERN’s scientists and engineers a chance to figure out a plan B,
which should be announced at the beginning of November. Rumours are that the Higgs-hunters will have it their way, postponing the LHC. But it will be a nail-biting finish for
LEP.
Stressful as this may be for the researchers collecting the data, CERN has an even
greater challenge to face: plotting the future of the laboratory beyond the LHC. In the
midst of the current drama, that may seem a problem that is aeons away. The LHC will
be completed in 2005 if—a big if now—all goes according to schedule, and the new
machine will have a useful lifetime of about a decade. Yet such is the scale of high-energy physics these days—1,000 scientists and engineers are involved in building the
LHC, at a cost of over SFr3 billion (about $1.5 billion) for the hardware alone—that it is
necessary to start planning a generation in advance.
There are many fundamental questions to answer. What sort of machine should CERN
build next? And for what sort of physics? But the most fundamental question of all may
well be, why continue? The standard responses about pushing back the limits of
mankind’s ignorance and probing the mysteries of matter are worthy, but not necessarily
worth billions of dollars to taxpayers and the politicians who represent them—especially
when compared with other projects in more practical areas of science.
Beyond the science, CERN has in the past had some powerful political arguments to
justify its existence. Born in 1954, it has been a flagship of European co-operation from
the beginning, being funded by a host of European states and staffed by exemplary
multinational research teams. It was also a symbol of hope during the cold war, a place
where scientists from both sides of the iron curtain could work together on peaceful
projects. With the rise of the EU and the fall of the Berlin wall, however, the visionary
role of CERN in Europe has waned.
But CERN also has another raison d’être, often overlooked even by its own scientists. The
facility is an important generator of new technologies. Everything from vacuum pumps
and radiation detectors to silicon chips and advanced software are pushed to the limits of
the possible for CERN experiments, with frequent spin-off benefits for the rest of society.
The most famous example is surely the World Wide Web, invented at CERN by Tim
Berners-Lee to help cope with the sharing of high-energy-physics data. Although some
CERN scientists initially balked at the use of this tool for base commercial purposes, the
laboratory now happily basks in the glow of the web’s success. The only catch is that,
although invented in Europe, the web was exploited far more quickly in America. Officials
at CERN are easily irked by mention of this, and point out that had the organisation tried
to cash in on the technology by making it proprietary, it might not have been adopted so
widely.
In retrospect at least, giving the web away free was tactically smart. But that misses the
strategic point. If CERN had had a thriving technology park around it, and had embraced
an entrepreneurial culture that encouraged its young scientists to go out and
commercialise their ideas, Europe as a whole might have profited more from the web.
CERN could, in other words, have played a role similar to Stanford University in
California, which catalysed much of the early development of Silicon Valley.
Alas, radical change of the sort this would require is not CERN’s forte. As a multinational
organisation, it has to balance the interests of its member states. This puts a damper on
some ambitions. For instance, a thriving technology park at CERN might seem to favour
France and Switzerland over other partners, so industrial R&D contracts such as those for
building the LHC have been carefully farmed out to partner countries in true eurocratic
fashion, that is in rough proportion to the dues they pay to CERN, rather than according
to their actual industrial talents. To be fair, though, CERN has recently been making a
vigorous effort to shed its ivory-tower image. Last year, a director for technology transfer
was appointed for the first time. An industrial technology liaison office has been opened
and CERN has even begun filing patents—a phenomenon practically unheard of two years
ago. But is this all too little, too late?
Gearing up for the Grid
Perhaps not. For CERN is in a position to lead Europe, and perhaps the world, into the
Internet’s next bold phase, called the Grid. This involves harnessing the huge computing
resources that the Internet links together in order to solve problems beyond the scope of
any single supercomputer. The technologies that need to be developed for the Grid,
especially so-called middleware that combines software and hardware in order to make
different computers work together in a seamless way, promise to be even more
revolutionary than the web, and could lead to a whole new dot.com (or dot.grid?) boom.
CERN has the dream application for the Grid: the analysis of the phenomenal amounts of
data that the LHC detectors will generate. Packets of protons moving in opposite
directions round the LHC will collide 40m times a second. Even after judicious filtering,
the amount of data coming out of a single detector will rival the entire traffic on the
global telephone network. The strategy for dealing with this requires “fanning” the data
out to national hubs using high-speed networks, then further fanning them to regional
centres, and finally analysing them on legions of computers located in individual
university laboratories.
Already, CERN has achieved some significant results. The organisation has, for example,
built a “virtual” exabyte memory device (an exabyte is a billion billion bytes) by linking
computers spread across many sites. An exabyte is about half the amount of unique
electronic data that is currently produced worldwide in a year, and several hundred times
more than the LHC will create annually. On top of that, in September, CERN
demonstrated a network that could handle 10 gigabits of data a second while linking
computers from four different manufacturers (Compaq, IBM, SGI and Sun), an important
step towards the sort of high-speed communications the Grid will need.
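The magnitudes in the paragraph above can be sanity-checked with back-of-the-envelope arithmetic; the sketch below uses only the ratios stated in the article (taking "several hundred" as an assumed ~500 — these are not official CERN figures):

```python
# Rough check of the article's storage magnitudes.
# Assumption: "several hundred" is read as ~500; only ratios from the
# article's own text are used, no external CERN data.

exabyte = 10**18                      # "a billion billion bytes"

# "several hundred times more than the LHC will create annually":
# dividing one exabyte by ~500 gives the implied annual LHC data volume.
implied_lhc_per_year = exabyte / 500  # in bytes

petabyte = 10**15
print(f"Implied LHC output: about {implied_lhc_per_year / petabyte:.0f} petabytes per year")
```

So the article's figures imply the LHC producing on the order of a few petabytes of filtered data per year — large, but hundreds of times smaller than the virtual exabyte store CERN had already assembled.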
The question that CERN’s management is mulling over, however, is exactly how big
CERN’s Grid ambitions should be. There is little doubt that the Grid is going to stimulate
a wide variety of science, not just high-energy physics. For example, four of the 15
initiatives announced in July as priorities for Britain’s state-funded research councils are
linked to the Grid. These include applications in biology, where it can be used for
manipulating information from the Human Genome Project, and in meteorology, for
climate modelling.
Turning CERN into a leading centre for Grid development in a wider scientific arena could
be an inspired move. It could also be a risky one. Losing focus on its core competence,
fundamental high-energy physics, could leave the laboratory vulnerable to cuts. After all,
the critics will argue, if CERN is doing something so useful, surely it can get industrial
sponsorship. It may also horrify many CERN scientists to think of their laboratory turning
into a jazzy high-tech incubator for next-generation e-commerce. But it is not necessarily
a fate worse than death. And as recent events have shown, it is always good to have a
plan B.