Program links under construction!

klausUr1SS12

klausUr1SS12sol

klausUr2SS12

Latest News

Check this out: an IDE for R, free download for academia (>500 MB):

http://www.revolutionanalytics.com/products/enterprise-productivity.php

Maps for everything in R here:

http://blog.revolution-computing.com/2009/10/geographic-maps-in-r.html

 

CIP-III is free every Tuesday 16:00-20:00! Can we move there from the 8-10 slot?

I have never forgotten this article and come back to it now and then:

http://www.nytimes.com/2008/07/31/opinion/31kristof.html?em

 

New EX2 for Monday, 14th April

The NYT on Bayes' theorem:

http://www.nytimes.com/2011/08/07/books/review/the-theory-that-would-not-die-by-sharon-bertsch-mcgrayne-book-review.html?pagewanted=all

Specifically Bayes’s theorem states (trumpets sound here) that the posterior probability of a hypothesis is equal to the product of (a) the prior probability of the hypothesis and (b) the conditional probability of the evidence given the hypothesis, divided by (c) the probability of the new evidence.
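In symbols (our notation, not part of the quoted review), with hypothesis H and evidence E:

$$P(H \mid E) = \frac{P(H)\, P(E \mid H)}{P(E)}$$

Here P(H) is (a) the prior, P(E | H) is (b) the conditional probability of the evidence given the hypothesis, and P(E) is (c) the probability of the new evidence.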

Consider a concrete example. Assume that you’re presented with three coins, two of them fair and the other a counterfeit that always lands heads. If you randomly pick one of the three coins, the probability that it’s the counterfeit is 1 in 3. This is the prior probability of the hypothesis that the coin is counterfeit. Now after picking the coin, you flip it three times and observe that it lands heads each time. Seeing this new evidence that your chosen coin has landed heads three times in a row, you want to know the revised posterior probability that it is the counterfeit. The answer to this question, found using Bayes’s theorem (calculation mercifully omitted), is 4 in 5. You thus revise your probability estimate of the coin’s being counterfeit upward from 1 in 3 to 4 in 5.
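The mercifully omitted calculation is a few lines of R (a minimal sketch; the variable names are ours):

# Posterior probability that the chosen coin is the counterfeit,
# given that it landed heads three times in a row.
prior <- c(counterfeit = 1/3, fair = 2/3)       # prior over coin type
lik   <- c(counterfeit = 1,   fair = (1/2)^3)   # P(three heads | coin type)
post  <- prior * lik / sum(prior * lik)         # Bayes' theorem
post["counterfeit"]                             # 0.8, i.e. 4 in 5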

A serious problem arises, however, when you apply Bayes’s theorem to real life: …… 

where: CIP-III (law library, 2nd staircase, down to the 2nd basement)

when: Mon 10:15-11:45, Tue 8:30-10:00 (changed!)

start: April 2, 2012

what: http://www.r-project.org/

 

Course description

This course is about simulation methods in statistics. It is divided into two parts:

MCMC (Markov chain Monte Carlo) methods and bootstrapping

About the first part:

Simulation methods use random numbers, so we will start out with random number generation.  
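As a first taste, here is a minimal sketch of a linear congruential generator in R (we use the Park-Miller "minimal standard" parameters; this is for illustration only, since R's own default generator is the Mersenne Twister):

# Linear congruential generator: state <- (a * state) mod m, scaled to (0, 1).
lcg <- function(n, seed = 1) {
  a <- 16807; m <- 2^31 - 1   # Park-Miller "minimal standard" parameters
  x <- numeric(n)
  state <- seed
  for (i in 1:n) {
    state <- (a * state) %% m
    x[i] <- state / m
  }
  x
}
lcg(5)     # five pseudo-random numbers in (0, 1)
runif(5)   # R's built-in uniform generator, for comparison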

MCMC is rooted in Bayesian statistics (remember Bayes' rule?), though it is applied in sampling-based (frequentist) statistics as well.

We will give an introduction to Bayesian statistics in order to understand MCMC. Though many results from classical statistics can be obtained as special cases of Bayesian statistics, it will become apparent that this radically different theory can also be applied to inference problems where sampling-based methods fail, for example when the sample size is too small, there are too many parameters, or the computations would be too complicated.
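As a preview of the first part, here is a minimal sketch of a random-walk Metropolis sampler in R, targeting a standard normal density (the function name and tuning choices are ours, purely for illustration):

# Random-walk Metropolis: propose a local move, accept it with
# probability min(1, target(proposal) / target(current)).
metropolis <- function(n, target = dnorm, step = 1) {
  x <- numeric(n)                            # chain starts at 0
  for (i in 2:n) {
    prop <- x[i - 1] + rnorm(1, sd = step)   # symmetric proposal
    if (runif(1) < target(prop) / target(x[i - 1]))
      x[i] <- prop                           # accept the move
    else
      x[i] <- x[i - 1]                       # reject: stay put
  }
  x
}
draws <- metropolis(10000)
mean(draws); sd(draws)   # should be close to 0 and 1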

This course will be computer-intensive; computations are an indispensable part. We use computations as a heuristic tool to understand and illustrate the theory, as well as to illustrate practical applications of the methods we learn. We use the languages R and WinBUGS. No previous knowledge of programming in these languages is required. There will be weekly programming exercises, and you will soon be programming fluently in R.

Evaluation: The final grade is computed from weekly programming exercises and a written final exam.

Prerequisites are a course in basic probability (conditional probabilities, Markov chains) and a course in basic statistics.

The course will consist of 25% bootstrapping and 75% MCMC. If time permits, we will also have a chapter on the EM algorithm.
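For a first impression of the bootstrap part, a minimal sketch in R (toy data and names are ours): resample the data with replacement and recompute the statistic to estimate its sampling variability.

# Nonparametric bootstrap of the standard error of the mean.
set.seed(1)
x <- rnorm(30, mean = 5)                 # toy data
boot_means <- replicate(2000, mean(sample(x, replace = TRUE)))
sd(boot_means)                           # bootstrap standard error
sd(x) / sqrt(length(x))                  # classical estimate, for comparison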

There will be more details on this outline in the coming days. For questions or suggestions, write me a message.

 

http://www.r-project.org/