More Statscan Censorship?
Once again, there seems to be a heavier hand in censoring or editing Statistics Canada’s releases. This morning The Daily reported that:
“Spending on research and development in the higher education sector amounted to $9.6 billion (current dollars) in the fiscal year 2006/2007.”
but there was no word on whether this was an increase or decrease from the previous period, a comparison that Statscan releases almost always include.
I checked and found that the summary in The Daily is word-for-word identical to the highlights in the Science Statistics publication (as it is for many publications), except that it excludes the words “up 1.1% from 2005/6”.
As Jim Stanford pointed out earlier this week, the lede is the most important part of the story. The Statscan Daily report took out the context and information that might have led reporters onto a potentially negative story about this issue.
This 1.1% increase is less than the rate of inflation; in constant dollar terms, R&D funding in the higher education sector actually declined by 1.2%. In fact, the Science Statistics publication shows that 2006/7 was the first time in a decade that research and development spending in the higher education sector in Canada (HERD) actually declined in constant dollar terms (see table 1.1, page 9 in the Science Statistics document). The average rate of growth in constant dollar terms had been 8.8% a year since 1996/7.
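For readers who want to check the nominal-to-real conversion themselves, here is a minimal sketch in Python. It simply backs out the price increase implied by the two growth rates quoted above; the roughly 2.3% result is an inference from those figures, not a number taken from the Statscan release.

    # Back out the price increase implied by the nominal and real growth rates cited above.
    nominal_growth = 0.011   # +1.1% in current dollars (The Daily / Science Statistics)
    real_growth = -0.012     # -1.2% in constant dollars (Science Statistics, table 1.1)

    # real index = nominal index / price index, so price index = nominal / real
    implied_price_increase = (1 + nominal_growth) / (1 + real_growth) - 1
    print(f"Implied price increase: {implied_price_increase:.1%}")  # about 2.3%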
Why did it decline?
The stats tables show that it is almost entirely because of lower funding from the federal government, which amounted to a 4.4% cut in constant dollar terms. The federal funding cut of $102 million in real dollar terms was responsible for almost all of the total $105 million decline. In actual current dollars, the federal cut amounted to $55 million, in a year that the federal government registered a $13 billion surplus.
Why is this relevant?
The year 2006/7 was the first year that the Harper government was in office. Investment in research and development is essential to increase our economy’s productivity, which hasn’t increased since the start of 2006 (and has grown at a dismal rate since 2000).
Canada has some of the most generous tax incentives for private R&D in the world, yet it has one of the lowest rates of investment in R&D among OECD countries, thanks to low rates of both government and business investment in R&D, according to Industry Canada’s Science and Technology Data tables. Canada’s investment in higher education R&D had recently been relatively good, but it looks like the current federal government may soon rectify that.
The Harper government is laying off federal scientists and forcing departments to slash their R&D budgets. It is deregulating food safety inspection and transferring or selling off federal labs to the private sector, intent on further commercialization and privatization. It eliminated the national science advisor and has instead appointed Preston Manning, among others, to help advise on science issues. This approach to science recently earned the Harper government scathing criticism in an editorial in Nature, one of the most respected science publications in the world.
Was the removal of these four words (which effectively killed news interest in this story) the result of overactive self-censorship within Statistics Canada, or the consequence of the long reach of Harper’s communications tsars? I suspect we’ll never know, but either way it is a concern. The threat of budget cuts can be very effective in keeping departments and agencies in line.
This development is disturbing on two counts: the drop in funding for R&D, which matters for the health of our economy, and what appears to be a diminishing level of objectivity in Statistics Canada’s reporting.
At one point, Statistics Canada was ranked the top statistical agency in the world by The Economist magazine. They still produce a tremendous amount of excellent material and have many superb staff, but with this type of political spin-doctoring, I suspect they wouldn’t even make it onto the podium right now.
You need to look at R&D investment from a holistic perspective rather than just input-based or spending metrics. More important is how effectively and efficiently our R&D investment dollars are producing tangible results. The fact that we are spending more doesn’t necessarily mean Canadian R&D is more productive or efficient. In fact, in 2006 Canada was #6 in OECD spending on R&D, while in 2007 Canada was #14 in terms of output, according to the Conference Board of Canada’s annual report card on Canada, which means Canada ranks 15th out of 17 OECD countries in terms of performance. Canada is also #1 in spending on public research in the world, yet we have continued to produce disappointing results in terms of innovation over the past decade.
The bottom line is that what Canada is doing in terms of innovation isn’t working, and things need to change. Frankly, thinking that because we are spending more we are producing more or being more innovative is a myopic view of our economic performance as a country.
Okay then, Mr. Graham, how do we measure this holistic perspective of R&D?
I would have thought that in this case a simple metric would actually be: more is better. Sure, from a diffusion perspective there is something to be said about spending. But regardless, you are basically saying that somehow less spending is better. That just does not compute.
I would also like to point out that the nature of the relationships and linkages between the corporate sector and the higher education sector is paramount to the causality with regard to measurable outputs.
The asymmetrical distribution of this relationship across various OECD countries would lead one to believe that the measures put out by the Conference Board, at least when looking at R&D expenditure for higher education, are not something that I would say is comparable.
For example, the USA has a higher education research base that is so corporately linked that it is hard to tell where the boundary of the private sector firm starts and ends. This, of course, is what the neo-cons want more of: full control over public R&D dollars, and the decisions made, handed over to the private sector. And that is precisely what you, Mr. Graham, whether you know it or not, are suggesting in your comment.
That should be: the Conference Board measures are not comparable for higher-education-based expenditure versus outputs, given the differences in corporate governance over these public dollars. More corporate arm’s length translates, I am sure, into less measurable output.
Hi Ian:
I don’t think any reasonable economist would say that increased spending automatically leads to greater outputs, and this is of course even more the case for R&D and productivity.
Of course we need to do things smarter and more effectively, and that applies to the process of research and development as well as its application.
But I maintain, as I stated in the post, that investment in R&D is essential to increase our productivity. I don’t think that cutting federal funding for R&D to universities and colleges will help to boost our productivity.
In any case, my post was about apparent self-censorship of information released by Statistics Canada. This type of spin-doctoring or covering up of research and information also won’t help develop a more dynamic and innovative society and economy.
Hi Toby,
Regarding the censorship at Stats Canada, this isn’t something that I can intelligently comment on. If the format of the press release has changed from previous years as you suggest, then yes, this is an anomaly. The bigger and more important issue, in my opinion, is WHY and HOW spending has changed.
To gauge the impact of R&D spending you really need to understand where the dollars are going and what results are being produced; this isn’t something that can be gleaned from merely looking at the top line or spending. One metric on how we are doing as a country is licensing revenue from patents in public research, and my understanding is that last year in Canada we spent $10B on public research and generated $12M – $15M in licensing revenue from those patents. If any of your readers care to comment intelligently on this metric, it may be the basis for some interesting discussion.
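For a sense of scale, here is a rough back-of-the-envelope calculation on those figures; the $10B and $12M–$15M numbers are the ones cited above and reflect my understanding, not official data.

    # Rough scale check on the figures cited above (a commenter's understanding, not official data).
    public_research_spending = 10e9              # roughly $10 billion
    licensing_low, licensing_high = 12e6, 15e6   # roughly $12M to $15M in patent licensing revenue

    low_ratio = licensing_low / public_research_spending
    high_ratio = licensing_high / public_research_spending
    print(f"Licensing revenue per research dollar: {low_ratio:.2%} to {high_ratio:.2%}")
    # Works out to roughly 0.12% to 0.15% of spending.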
I agree that spending on R&D is important and would suggest that how much you spend is less important than how effectively resources are utilized and what you produce. You can spend less and produce more. The sad part is that most of the metrics in Canada, like the one quoted in the original blog post, are input-based metrics rather than results-based metrics.
The bottom line is that Canada is lagging behind (and losing ground to) the rest of the world in the knowledge economy and innovation; we really need to start making some significant changes to our R&D policy to remain competitive.
Again, Ian, you use a metric that is highly skewed towards a market-oriented measurement.
If you believe the measure you just put up, $10B in investment against $12M-$15M in outputs, is somehow representative, then yes, we do have a problem. However, I find it a fairly long stretch to see how those two figures are somehow indicative of some input/output model. I am sure whatever investments were made must have produced something more in terms of output than the figures you mention. It goes back to measurability and reporting.
Not all of these notions are easily reported and quantified, or so succinctly and neatly put into a package.
Public R&D and its output should not be so rigidly measured in terms of dollars. I do not have the solutions, but I can say this: your methodology needs work before it can be meaningfully employed.
It is not an easy area to take on, but I would say this: one should avoid large aggregated measures. Something more meaningful might be to measure by industry and by type of R&D.
Regarding Ian Graham’s initial comment: Two points…
Point 1, stats:
I’m not sure how the Conference Board does its calculations. I went to their site, and only the report card for 2008 (without explanations, which are not due until September) is available.
I did go to the main source of the Conference Board, the OECD at http://lysander.sourceoecd.org/vl=13693526/cl=12/nw=1/rpsv/figures_2007/en/page17.htm
(OECD in figures 2007, Science and Technology (I)).
Looking at the 17 of the 30 countries that the Conference Board uses as comparisons, and given that this is the most recent info I can find from the OECD, the figures don’t agree with what you report the Conference Board gives.
In spending as a percentage of GDP (2005), Canada is 10th overall in R&D of the 17 and well below the median. In government expenditures planned for 2006, Canada is 15th of 17 as a percentage of GDP.
So the figures that the Board uses (that you quote) seem strange at least.
Point 2: you fail to address exactly how decreasing real dollars to R&D will improve outcomes. Your point of how funds are used versus amount of funds invested is well taken, but then how does reducing input funding solve the inefficiencies of the process? It seems more logical to first fix the process that is giving poor performance (if poor performance is actually occurring), and then, given our poor government spending in R&D, increase funding. One might even argue that the low level of funding in R&D is actually causing some of the inefficiencies in performance (not giving enough funding for projects to succeed, just enough to exist).
Finally, the original intent of this blog was to address possible censorship at Statscan (self-imposed or otherwise). I’ve seen some examples of this myself recently and concur with the feelings of the author. Hopefully, if there is any overt government interference with the work of Statscan, that information will be revealed by civil servants whose allegiance to the public they serve overcomes any fears they have of retaliation.
Cheers
On your last point, Homer, I would love to sit down and have a long blog conversation about how the bias in the winds that blow through the towers at Statistics Canada affects the numbers.
I am not sure how one goes about proving such things, but it seems that even suggesting it can get you fired. Apparently it goes against the oath that all Statcan employees take upon accepting their position. I am not sure how change can ever be made against those high within the towers. It is there that ruling with a statistical fist is initiated, and I am sure the senior staff, who are quite well paid, discuss the nature of policy regarding striving for objectiveness and test for such goals. A majority of the high-profile releases that make it out to the public are sent up to the top floor and edited for release. It is something of an editorial process for the senior staff to make the stats dance to whatever song they feel will ensure their masters are pleased with their work. A spin here and a spin there, and knowledge of how the media reports, and presto, the masters are pleased. It would be difficult to prove, but it is something that slowly degrades the credibility of independence. It is this slow death by a thousand cuts that is important. It is the arm’s length that disappears; it is the critical voice in the numbers that is muffled. It is the report card that suddenly becomes all A’s.
A statistical agency must be separate from the rulers in reporting the outcomes of policy. It must also help in providing direction for policymakers. It is a difficult balance, but one that must be maintained and delicately nurtured. It is not something that should be spun and beaten to please the masters. It is an important vehicle for a well-functioning democracy.
So how does one prove such interference? It will not be proven statistically; it will be fought out in the subjectivity of the qualitative, and in the hearts and minds of individuals who experience life and the economy of the times.
It is this critically important role that Statcan has been trying to fulfil for many years. It has had its successes, but it has also had its failings. That is why the Chief Statistician has access to the Statistics Act. It is quite a powerful tool, and can alter your reality when effectively used. It has consequences, though, when it is used in a manner that does not promote democratic notions.
And that, as stated, is using it in a fashion that prompts commentary such as we write here, diminishing its credibility and effectiveness as public confidence in the measures and numbers falls to the ground.
It is this that we have been blogging about. It is the process of checks and balances in the new information age. It is about democracy and the ability of a worker to make statements and not fear reprisal. However, with some recent rulings regarding blogs, it seems the courts are starting to back employers and their right to fire employees for statements made on a simple blog site.
I know for a fact that one of the commentators on this site was threatened with dismissal for blogging here about Statcan and the anti-worker bias that has suddenly crept into its reporting.
Saul T.
Perhaps it would help if Statscan were made a parliamentary agency instead of a government one.