Eleven million? Or 22 million? A new Yale/MIT study estimates the illegal alien population in the US somewhere in the range of 16.5 to 29.1 million (for us statisticians, that’s 22.8 ± 6.3 million). That’s a margin of error larger than the entire population of Los Angeles (3.99 million). Worse yet, this estimate suggests that the Census Bureau’s annual American Community Survey figure of 11 million is a serious low-ball. The Center for Immigration Studies is in the low-ball camp, but their argumentum ab auctoritate seems a bit shrill, unwilling even to admit the possibility of systematic bias in previous estimates.
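The ± arithmetic is nothing more than converting an interval to its midpoint and half-width; a quick Python sketch, using the study’s bounds as quoted above:

```python
# Convert the study's interval estimate (16.5 to 29.1 million)
# into a point estimate with a symmetric margin of error.
low, high = 16.5, 29.1           # reported bounds, in millions
midpoint = (low + high) / 2      # 22.8 million
margin = (high - low) / 2        # 6.3 million
print(f"{midpoint:.1f} ± {margin:.1f} million")
```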
Not that counting is as easy as it appears. I regularly open my basic statistics classes with an audience-participation version of the classic Bouba-Kiki experiment, and collect response data by having two or more student volunteers count hands. Invariably, the student counts are not all the same. The confusion provides a “teaching moment” illustrating that the simplest measurement method is prone to variation.
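To see how easily honest counters disagree, here’s a toy Python simulation (not the classroom exercise itself; the number of hands and the miss and double-count rates are invented) in which five volunteers tally the same room:

```python
import random

random.seed(1)
true_hands = 40                      # actual number of raised hands (made up)

def count_hands(p_miss=0.05, p_double=0.03):
    """One volunteer's tally: each hand may be overlooked or counted twice."""
    total = 0
    for _ in range(true_hands):
        if random.random() < p_miss:
            continue                 # hand overlooked
        total += 2 if random.random() < p_double else 1
    return total

tallies = [count_hands() for _ in range(5)]
print(tallies)                       # five volunteers, five (likely different) answers
```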
Don’t believe me? If you’re a Windows computer user, download the freebie version of Wildlife Counts, and see how well you can count a static population of animals in a short time.
Wonderful article here about the Mosteller and Wallace analysis of the twelve Federalist Papers of disputed authorship–was it Madison or Hamilton who wrote them?–with a nice, easy-to-understand explanation of the Bayesian methodology they used.
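For flavor, here’s a toy Bayesian calculation in Python in the spirit of their approach–the word rates below are invented for illustration, not taken from Mosteller and Wallace. Hamilton used the marker word “upon” far more often than Madison, so a disputed passage with no “upon”s at all shifts the odds toward Madison:

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """Probability of observing k events when the expected rate is lam."""
    return lam**k * exp(-lam) / factorial(k)

# Illustrative (not Mosteller-Wallace's) rates of "upon" per 1000 words:
rate_hamilton, rate_madison = 3.0, 0.2
observed = 0                    # "upon" count in a 1000-word disputed passage (made up)

prior_odds = 1.0                # even prior: Hamilton vs. Madison
likelihood_ratio = poisson_pmf(observed, rate_hamilton) / poisson_pmf(observed, rate_madison)
posterior_odds = prior_odds * likelihood_ratio
# Odds well below 1 favor Madison.
print(f"posterior odds (Hamilton:Madison) = {posterior_odds:.4f}")
```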
Yesterday I was given pause by an account of an event in our College of Business. It seems a lecturer, explaining a concept that required either averaging or the area under a curve, wrote an integral on the board by way of illustration. This was NOT a demonstration of technique, nor an explanation of how to perform calculations required in the course–just background. One student, however–correctly recalling that calculus was not a prerequisite–took umbrage; he wrote a letter of complaint to the Dean! Holy hellfire sh!t! Just last week I spent 10 minutes explaining to my calculus-averse biostatistics students how the standard normal table was constructed (integration does not conquer all). I had no idea I was skating so close to the edge. Probably because I’m an idiot or a lunatic.
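For the curious, here’s roughly the idea I show them: a standard normal table entry is just a numerically integrated area under the bell curve. A Python sketch using the trapezoidal rule:

```python
from math import exp, pi, sqrt

def phi(x):
    """Standard normal density."""
    return exp(-x * x / 2) / sqrt(2 * pi)

def table_entry(z, steps=10000):
    """P(0 < Z < z) by the trapezoidal rule -- how a table entry is built."""
    h = z / steps
    area = (phi(0) + phi(z)) / 2
    for i in range(1, steps):
        area += phi(i * h)
    return area * h

print(round(table_entry(1.96), 4))   # the classic entry for z = 1.96
```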
That question gets asked dozens of times every semester in my statistics classes; it’s pretty clear that most of my students have no sense of scale or proportion about numbers.
But now I have Dr. Rhett Allain’s short answer in his Dot Physics Measurement and Uncertainty Smackdown, wherein he refers to the (extremely) long answer in John Denker’s excellent Uncertainty as Applied to Measurement and Calculation. Why we’re not teaching this in our service courses for science majors, I have no idea. The Monte Carlo approach described by Allain is a simple application of what statisticians call “bootstrapping,” so perhaps I will start.
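Here’s a minimal Python sketch of that Monte Carlo idea (the measurements and their uncertainties are invented): resample each measured quantity from its error distribution, push the samples through the calculation, and read the uncertainty of the result off the spread of the outputs:

```python
import random

random.seed(0)

# Invented measurements: a length and a time, each with a standard error.
length_m, length_sd = 2.00, 0.02     # 2.00 ± 0.02 m
time_s, time_sd = 0.80, 0.05         # 0.80 ± 0.05 s

# Resample each measurement and recompute the derived quantity (speed).
speeds = [random.gauss(length_m, length_sd) / random.gauss(time_s, time_sd)
          for _ in range(100_000)]

mean = sum(speeds) / len(speeds)
sd = (sum((v - mean) ** 2 for v in speeds) / (len(speeds) - 1)) ** 0.5
print(f"speed = {mean:.2f} ± {sd:.2f} m/s")
```

No partial derivatives, no propagation-of-error formulas: the spread of the simulated results is the uncertainty.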