jackman.stanford.edu/blog

Jenkins resignation, betting markets reaction

Wednesday November 23, 2011

Filed under: Australian Politics — jackman @ 2:48 pm

At 9am AEST Centrebet had 3.18:1.34 (ALP:Coalition). That is now at 3.40:1.30, indicating that someone (the Centrebet odds maker, election punters, or both) thinks this is bad news for Labor.

The logic might be that Jenkins’ resignation destabilizes the fragile state of play on the House floor, or that something is afoot inside the Labor caucus, either of which might have the consequence of bringing on an election…?

No change at the other houses that I follow closely (sportsbet, sportingbet).

And just quietly, the ALP at 3.40 with Centrebet against the Coalition at 1.40 with sportingbet is getting close to free money. Do ya’ best, and all that.
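For the back-of-the-envelope crowd, here is the arithmetic behind that “close to free money” remark, using only the prices quoted above (a sketch, not betting advice):

```python
# Rough arbitrage check on the quoted decimal odds (illustrative only).
alp_odds = 3.40        # ALP, Centrebet
coalition_odds = 1.40  # Coalition, sportingbet

# Implied probabilities are 1/odds. If they summed to less than 1, a stake
# split across the two books would lock in a profit regardless of outcome.
total_implied = 1 / alp_odds + 1 / coalition_odds
print(round(total_implied, 3))      # ~1.008: just over 1, so close but no cigar

# Staking each side in proportion to its implied probability equalizes the
# payout; the guaranteed return per dollar staked is 1 / total_implied.
print(round(1 / total_implied, 3))  # ~0.992, i.e. a guaranteed ~0.8% loss
```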


Persi Diaconis shows us math and magic

Filed under: statistics — jackman @ 10:40 am

This should be great fun…billed as a “public lecture”. Persi also has a co-authored book on the topic.



Variational methods for Bayesian inference (David Blei seminar)

Monday November 21, 2011

Filed under: statistics — jackman @ 11:15 am

Slow week here at Stanford, but I hope to make this talk tomorrow over at Stats (or I could just ask Justin Grimmer for his take…)

Announcement (pdf)

Title: Online variational methods for scalable posterior inference

Abstract:
Probabilistic topic modeling provides a suite of tools for analyzing large collections of documents. Topic modeling algorithms can uncover the underlying themes of a collection and decompose its documents according to those themes. We can use topic models to explore the thematic structure of a corpus and to solve a variety of prediction problems about documents.

At the center of a topic model is a hierarchical mixed-membership model, where each document exhibits a shared set of mixture components with individual (per-document) proportions. Our goal is to condition on the observed words of a collection and estimate the posterior distribution of the shared components and per-document proportions. When analyzing modern corpora, this amounts to posterior inference with billions of latent variables.

How can we cope with such data? In this talk, I will describe online variational inference for approximating posterior distributions in hierarchical models. In traditional variational inference, we posit a simple family of distributions over the latent variables and try to find the member of that family that is close to the posterior of interest. In online variational inference, we use stochastic optimization to find the closest member of the family, where we obtain noisy estimates of the appropriate gradient by repeatedly subsampling from the data. This approach (along with some information geometric considerations) leads to a scalable variational inference algorithm for massive data sets.

I will demonstrate the algorithm with probabilistic topic models fitted to millions of articles. I will then describe two variants, one for mixed-membership community detection in massive social networks and one for Bayesian nonparametric mixed-membership models. I will show how online variational inference can be generalized to many kinds of hierarchical models. Finally, I will highlight several open questions and outstanding issues.

This is joint work with Francis Bach, Matt Hoffman, John Paisley, and Chong Wang.

About this Speaker: David Blei is an associate professor of Computer Science at Princeton University. His interests include graphical models, approximate posterior inference, and applications to text analysis and other domains.
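For anyone wanting a feel for the “noisy gradients from subsampling” idea before the talk, here is a toy sketch of my own (not Blei’s code, and a conjugate Gaussian model rather than a topic model, so the natural-gradient step has a simple closed form; all settings below are made up for illustration):

```python
# Toy sketch of stochastic ("online") variational inference: update a global
# variational parameter with noisy steps computed from random minibatches.
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: x_i ~ N(mu, sigma^2) with sigma known; prior mu ~ N(0, 10^2).
N, mu_true, sigma = 100_000, 2.5, 1.0
x = rng.normal(mu_true, sigma, size=N)
prior_nat = np.array([0.0, 1.0 / 10.0**2])    # [precision * mean, precision]

# Variational posterior over mu is Gaussian; track its natural parameters.
lam = prior_nat.copy()

batch_size, kappa, delay = 100, 0.7, 1.0
for t in range(1, 501):
    batch = rng.choice(x, size=batch_size, replace=False)
    # "Intermediate" parameter: pretend the minibatch, rescaled by N/|B|,
    # is the whole data set (the conjugate closed-form update).
    scale = N / batch_size
    lam_hat = prior_nat + scale * np.array([batch.sum() / sigma**2,
                                            batch_size / sigma**2])
    rho = (t + delay) ** (-kappa)              # Robbins-Monro step size
    lam = (1.0 - rho) * lam + rho * lam_hat    # noisy natural-gradient step

post_mean = lam[0] / lam[1]
post_sd = lam[1] ** -0.5
print(post_mean, post_sd)   # close to the exact conjugate posterior
```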


balanced budget amendment, House roll call

Friday November 18, 2011

Filed under: politics — jackman @ 2:12 pm

A few party defections. The 4 Republicans voting Nay are a long way from the estimated cutpoint, and all in the “doesn’t rein in spending enough”, Paul Ryan camp.

A bit more noise among the Dems, as is the way these things go (i.e., votes reveal more about the distribution of types in the out-party than in the in-party). No Democrat predicted to vote Yea actually voted Nay.

The picture below (click on the thumbnail) shows the fitted item-characteristic curve for this roll call, given ideal points estimated from other 112th House roll calls.
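For readers new to these plots: the curve is an item-characteristic curve of the usual two-parameter (ideal point) kind. A quick sketch of what such a curve looks like, with made-up item parameters rather than the fitted values for this vote:

```python
# Sketch of a two-parameter item-characteristic curve for a roll call.
# The discrimination/difficulty values below are hypothetical.
import numpy as np

def icc(ideal_point, discrimination, difficulty):
    """P(Yea) as a function of a legislator's ideal point."""
    return 1.0 / (1.0 + np.exp(-(discrimination * ideal_point - difficulty)))

beta, alpha = 4.0, 1.0      # hypothetical discrimination and difficulty
cutpoint = alpha / beta     # ideal point at which P(Yea) = 0.5
for x in np.linspace(-2, 2, 9):
    print(f"x = {x:+.1f}  P(Yea) = {icc(x, beta, alpha):.2f}")
print("estimated cutpoint:", cutpoint)
```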


Lo, I am become death

Monday November 14, 2011

Filed under: general — jackman @ 10:11 am

Zombie Gunship on the iPad:

[screenshot of Zombie Gunship on the iPad]


78% vote by mail in Santa Clara County

Wednesday November 9, 2011

Filed under: politics — jackman @ 11:33 am

From the Santa Clara County Registrar of Voters: in the low-intensity, off-off-year election yesterday, turnout (as a proportion of RVs) was around 30%. 78% of ballots cast were by mail.

Quite the headache for polling, campaigning, etc., when (a) there is not a lot of voting (!), but (b) the bulk of it happens over a 4-week pre-election window.
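Putting the two (rounded) figures together, a quick back-of-the-envelope:

```python
# Quick arithmetic on the rounded figures above (approximate only).
turnout = 0.30          # ballots cast as a share of registered voters
by_mail_share = 0.78    # share of those ballots cast by mail

print(f"{turnout * by_mail_share:.0%} of RVs voted by mail")          # ~23%
print(f"{turnout * (1 - by_mail_share):.0%} of RVs voted in person")  # ~7%
```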


Newspoll the market mover

Tuesday November 8, 2011

Filed under: Australian Politics — jackman @ 2:03 pm

A tick-up for Labor in the last Newspoll (+3pp, primary vote) and a corresponding bump in the betting markets.


Bayes class at IPSA Summer School (Sao Paulo)

Thursday November 3, 2011

Filed under: general — jackman @ 2:03 pm

“Summer school”, in the Southern hemisphere sense of “summer”. Hooray.

I’m offering a “Practical Introduction to Bayesian Statistical Modeling” in Sao Paulo, Feb 6-10, 2012. Further details here.


Apple now speaks Australian?

Wednesday November 2, 2011

Filed under: Apple,computing — jackman @ 4:33 pm

Too funny. Apple just released iOS 5.0.1 beta to developers. One of the listed improvements and bug fixes is

improves voice recognition for Australian users using dictation

