Fairy Tales and Labour Force Surveys
This is a true story.
I made porridge for my children this morning, as I usually do. When my son first tried it, it was too hot. When I finally got around to eating it, after making the kids’ lunches, it was too cold. But when my daughter tried it, it was just right (and no, she doesn’t have golden locks, but a recipe tip: peanut butter tastes great mixed in with porridge).
At about the same time, the media report on the labour force survey came on the 7AM news. And it was just about the same story.
In June, the increase in employment had been too hot (rising by 93,000). In July, it was too cold (falling by 9,000). But the August figures were just about right, rising by 36,000. This was almost exactly in the middle of the range that I predicted it would be after last month’s report.
Just like with fairy tales, I find this story comes around again and again with Statscan’s Labour Force Survey. I haven’t done any back analysis, but from my casual observation, there often seems to be a three-month cycle with these labour force figures – too hot, too cold, then about right – that repeats itself.
What I find odd is that so much media attention is paid, so many comments made and probably markets moved over figures that clearly have a lot of noise in them. There’s way more hyperventilating expended over this than exerted by my children over the temperature of their porridge.
What is almost always completely ignored in the media reports is how statistically unreliable the monthly labour force figures are. For instance, the standard error (S.E. in the third column of major tables in the detailed report) for employment at a national level is 28,000. This means that there is only 68% confidence that the “real value” of an increase of 36,000 is in the range of +8,000 to +64,000. At a more reliable confidence interval of 95% (the familiar “19 times out of 20” reported for public opinion polls), the range is -20,000 to +92,000, as is explained in the data quality section of the report and in a bit more detail in the Guide. There’s still a 5% chance that the real value is outside this wide range.
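The arithmetic above can be checked in a few lines. This is just a sketch of the standard estimate-plus-or-minus-a-multiple-of-the-standard-error calculation, using the figures quoted in the paragraph (the ±1 S.E. range for roughly 68% confidence, ±2 S.E. for roughly 95%); it is not Statscan's own methodology, just the textbook approximation.

```python
def confidence_interval(estimate, se, multiplier):
    """Return the (low, high) range: estimate +/- multiplier * standard error."""
    return estimate - multiplier * se, estimate + multiplier * se

estimate = 36_000   # reported August change in employment
se = 28_000         # published standard error, national employment level

low68, high68 = confidence_interval(estimate, se, 1)   # +/- 1 S.E., ~68% confidence
low95, high95 = confidence_interval(estimate, se, 2)   # +/- 2 S.E., ~95% confidence

print(f"68% range: {low68:+,} to {high68:+,}")   # +8,000 to +64,000
print(f"95% range: {low95:+,} to {high95:+,}")   # -20,000 to +92,000
```

Note that the 95% range comfortably includes zero and even a modest decline, which is exactly why a single month's headline number deserves far less excitement than it gets.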
And those are the confidence intervals for the unadjusted figures. I’m sure that the seasonal adjustment process adds more variability. And I expect that the unique sampling method of the LFS, with its panel data and possible reporting errors, may add even more noise and variability to the data.
After obliging with some quotes, I explained these reliability problems to a well-known and intelligent CBC reporter when he called for some commentary, but after expressing some surprise about this he didn’t show any interest. Of course, no news is bad news for the media.
I’m not exactly sure when it happened, but I stopped paying any serious attention to the monthly labour force figures about ten years ago when there seemed to be far too much back-and-forth variability in the monthly numbers they reported. I’ve never been part of the labour force survey sample, but an economist friend of mine was when I lived up in the Yukon (where it is hard to escape being sampled), and he came out of the experience quite concerned about the reliability of the data.
Households are kept in the sample for six months; only the first interview is detailed, and the information for all members of the household is subsequently reported by one member of the household by telephone. This six-month rotation doesn’t necessarily explain what appears to be a three-month hot, cold, just right cycle of accuracy, but perhaps there is some other adjustment they make that accounts for that.
Don’t get me wrong. I still use the annual labour force figures, and I think the quarterly or three-month moving average figures are still fairly reliable. But I wish Statscan would fix the problems that seem to exist with the monthly labour force figures. They report that they use some of these modified sample methods in order to reduce response burden, but I suspect that a lot of this was really done to reduce costs. We now know that similar cost cutting in the 1990s to the sample size of the survey of employment, earnings and hours resulted in major problems for the reliability of that survey at the provincial level, and they’ve made some recent changes to improve those figures.
Is it expecting far too much to hope (especially in light of looming federal budget cuts and this government’s attitude towards accurate information) that Statistics Canada will try to fix some of the problems with the Labour Force Survey, considering how much attention is paid to it?
I’m sure there are many others much more informed about the internal workings of this survey who could comment more knowledgeably about it. Is there anybody at the Stats barn (or elsewhere) with something to add about this?