7.5 Examining data sources

Data sources, whether official, unofficial or 'big data', are not neutral objects and the original data have been collected for a purpose.
This purpose informs how the data is defined, collected and compiled.
The Government Statistical Service, for example, claims that statistics are produced without any political influence, to ensure that they are a trusted and authoritative source of information. However, this claim is disputed, given the manipulation by successive governments of the way that statistics are calculated.
7.5.1 Political pressure on published statistics
It must be remembered that statistics are collected and published with a particular aim in mind. They are not 'objective facts' (see Section 1.7.5.1). They may be instruments of policy-making, such as those produced by central or local government. Or they may be collected or manipulated to give weight to the cases of pressure groups, such as Shelter, which uses information to lobby MPs about homelessness.
Central government exerts considerable pressure on the production of social and economic statistics, especially when they relate closely to party policy. Inflation, employment, government borrowing, inequality and crime rates are all areas where the way they are measured has been manipulated for political ends. The Coalition Government (2010–15), for example, redefined the way it referred to the national debt: not as a figure in absolute terms, which was rising despite policies aimed at reducing it, but as a percentage of gross national income, a ratio that can hold steady or fall even while the absolute figure grows, provided national income grows as well. This made the situation look better.
The Conservative Government of the 1980s ensured that several changes were made to the way in which statistics relating to employment, inflation, inequality and crime rates were compiled and published. The method of counting unemployment has been changed repeatedly over the last forty years to minimise the 'official' number of people out of work.
Similarly, moves were made to recalculate the rate of inflation, with suggestions that the real rate should exclude mortgage repayments and the poll tax [council tax]. Furthermore, as the party of 'law and order', the Conservative Government tended to recognise only the 'official' crime statistics, based on data provided by police forces, although these statistics greatly under-represent the extent of crime as revealed by numerous surveys, both official and unofficial, such as the Crime Survey for England and Wales (see Section 7.4.1.2.5).
In 2012, for example, the Scottish Police Federation passed a resolution calling for a review of Scottish crime statistics, after hearing claims that they had been manipulated in such a way as to conceal from the public the true extent of crime in Scotland (Hakes, 2012). Figures had been altered by counting several crimes that had taken place at the same time as a single crime. The crimes subsumed into a single incident in this way included murder, attempted murder and robbery.
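To make the effect of such a counting rule concrete, the sketch below uses invented data; neither the offences nor the groupings reflect actual Scottish recording practice. It simply shows how the same set of offences yields very different headline totals depending on whether each offence is counted separately or several simultaneous offences are subsumed into one incident.

```python
# Hypothetical illustration only: invented offences and incident groupings.
# The point is that the headline total depends on the counting convention.

offences = [
    {"incident_id": 1, "type": "murder"},
    {"incident_id": 1, "type": "attempted murder"},
    {"incident_id": 1, "type": "robbery"},
    {"incident_id": 2, "type": "common assault"},
    {"incident_id": 3, "type": "robbery"},
    {"incident_id": 3, "type": "theft"},
]

# Counting every recorded offence separately.
offence_total = len(offences)

# Counting each incident (several offences at the same time and place) once.
incident_total = len({o["incident_id"] for o in offences})

print(f"Offence-based total:  {offence_total}")   # 6
print(f"Incident-based total: {incident_total}")  # 3
```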
More recently, the HeraldScotland (Hutcheon, 2015) reported that the Scottish Government was proposing to absolve itself of responsibility for recorded crime statistics amid further claims that the figures are routinely manipulated by police. In recent years, the SNP Government has claimed that recorded crime is at its lowest level since the 1970s. The single police force produces the data, which is then verified by the government. However, the UK Statistics Authority (UKSA) refused to rubber-stamp the Scottish Government's crime statistics for 2014, noting that the statistics did not comply with several elements of the Code of Practice (see Section 7.4.1.2.1). It added that the Scottish Government should seek 'strong levels of assurance' about the quality of the figures but warned that 'the Scottish Government lacks sufficient evidence to be able to provide such appropriate reassurance.' According to the newspaper:
The Scottish Government stated: "We are considering changes to the way in which crime data is published ... Police Scotland would publish recorded crime data quarterly, rather than the SG publishing annually. The Scottish Government would discontinue the publication of the other bulletins [and would publish] analytical topic reports."
Hutcheon (2015) claimed that police sources said the recorded total was deliberately kept low: crimes are left as 'incidents' by officers and kept off the books, and officers use their 'discretionary powers', such as verbal warnings, to prevent incidents being upgraded to crimes. In other cases, it has been alleged that victims of crime decline to pursue complaints after being told by officers that they will have to give evidence in court. The violent crime figures can also be manipulated by recording serious assault as common assault, a lesser category. Hundreds of thousands of crimes are also recorded as less serious 'offences'.
Furthermore, statistics may well not be published, or may be delayed, if they are politically embarrassing. For example, statistics on low-income families were published annually until 1979. Then the Department of Social Security began to publish the figures less regularly, and by 1991 the latest figures available were for 1988. This reluctance to provide up-to-date information on poverty coincided with research showing that in Britain the poor got poorer throughout the 1980s (Bradshaw, 1990; EC Commission, 1991; Oppenheim, 1990; Townsend, 1991). A similar reluctance surfaced in 2015, when Work and Pensions Secretary Iain Duncan Smith announced a new way of measuring child poverty. Hitherto, a child was defined as being in poverty when living in a household with an income below 60% of the UK's average. Rather than dealing with the problem of poverty, the government wanted to redefine the problem by changing the deeply embarrassing measure, which showed that the number of children living in relative poverty had remained unchanged at 2.3 million between 2011 and 2014 (BBC, 2015a).
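As a purely illustrative sketch of how such a relative measure works, the code below uses made-up household incomes and treats 'average' as the median (an assumption; the text above does not specify which average is used). The point is simply that the headline count of children in poverty is a function of where the threshold is drawn, which is why redefining the measure is politically attractive.

```python
# Purely illustrative: invented household incomes and child counts.
# 'Average' is assumed here to be the median.
from statistics import median

households = [
    # (weekly_income, number_of_children)
    (250, 2), (310, 1), (400, 3), (520, 0), (610, 2), (780, 1), (950, 0),
]

# Relative poverty line: 60% of the average household income.
threshold = 0.6 * median(income for income, _ in households)

# Children living in households below the line.
children_in_poverty = sum(kids for income, kids in households if income < threshold)

print(f"Poverty line: {threshold:.2f} per week")          # 312.00
print(f"Children below the line: {children_in_poverty}")  # 3
```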
The report of the National Foundation for Educational Research (a government-funded research body) for the National Audit Office on the dangerous and unhealthy state of Britain's schools was blocked for five months by two senior civil servants. The report detailed accidents to teachers and pupils caused by the neglect of buildings that had become hazardous. Both Sir John Caines (Permanent Secretary at the Department of Education) and Sir Terry Heiser (Permanent Secretary at the Department of the Environment) halted publication by refusing to accept its findings. The report was embarrassing for the government at a time when education was high on the political agenda because it essentially blamed the Department of Education for inadequately financing the maintenance of schools.
Similarly, the Registrar General's Decennial Supplement on Occupational Mortality for 1979–83 was published in 1986 with minimal statistics and no commentary (overturning a tradition dating from 1850). This concealed the continued widening of differences in death rates between social classes, which would otherwise have embarrassed the Conservative Government of the time.
Social Trends used to be a reliable evaluation of changes in the United Kingdom. However, despite its widespread use and the large range of surveys it draws on, Social Trends must not in any way be treated as a neutral or 'objective' publication. Indeed, it provides a good example of political manipulation and presentation of statistical data. The original purpose of Social Trends, as set out in its first editorial in 1970, was to make readily accessible to Parliament, the media and the population at large, statistics that could be used to measure both social and economic progress.
The first editions were very much about people, but following cuts in the Government Statistical Service in the early 1980s Social Trends became limited to work needed for government. Although its readership still goes far beyond Whitehall, there are now subtle differences in presentation. Rather than revealing the problems and plight of specific groups as it used to, Social Trends now concentrates on broad social changes across a wide range of topics.
For example, Social Trends (1990) openly reproduced the Conservative Government's philosophy by suggesting that on average the population of Britain was getting richer, enjoying more education and longer holidays, and living in better-equipped houses. There was little detailed analysis by income group, social class, family type or region. This effectively concealed major divisions that arose in Britain in the 1980s, such as the North–South divide (Fothergill and Vincent, 1985; Smith, 1989) and the growing gap between rich and poor (Lilley, 1990; Beckford, 2011). Similarly, the first page of the section on education picked out only the figures that showed how things were getting better. The chart showing Britain bottom of the international ratings for the education of 16 to 18-year-olds, and the tables of teacher vacancies, were not highlighted. Nor were there any regional breakdowns of expenditure per pupil. The appalling difficulties faced in some areas were simply covered up by omission. These statistics did exist, but they had to be unearthed from CSO publications such as Regional Trends.
The point is that such regional variations used to be published in the most accessible of government statistical publications, but Social Trends has slowly been manipulated to reproduce government ideology.
In the United States, Peter Schiff (2013) reported how the government manipulated economic statistics. For example, in 2013 the Bureau of Economic Analysis (BEA) announced new methods of calculating Gross Domestic Product (GDP) that would immediately make the economy 'bigger' than it used to be.
The changes focus heavily on how money spent on research and development and the production of "intangible" assets like movies, music, and television programs will be accounted for. Declaring such expenditures to be "investments" will immediately increase U.S. GDP by about three percent. Such an upgrade would immediately increase the theoretic size of the U.S. economy and may well lead to the perception of faster growth. In reality these smoke and mirror alterations are no different from changes made to the inflation and unemployment yardsticks that for years have convinced Americans that the economy is better than it actually is. (Schiff, 2013)
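A toy illustration of the accounting point Schiff is making, using invented figures and a deliberately simplified expenditure picture: spending that is treated as an input cost nets out of GDP, whereas the same spending reclassified as 'investment' adds to it, so the measured economy grows without any change in underlying activity.

```python
# Toy numbers only; real national accounts are far more complex.
# Before reclassification, R&D spending is treated as an intermediate cost
# and does not add to GDP; counted as investment, it does.

final_expenditure = 15_000        # invented: consumption + investment + government + net exports
research_and_development = 450    # invented: spending previously treated as an input cost

gdp_old = final_expenditure
gdp_new = final_expenditure + research_and_development

print(f"GDP before reclassification: {gdp_old}")
print(f"GDP after reclassification:  {gdp_new}")
print(f"Apparent increase: {100 * (gdp_new / gdp_old - 1):.1f}%")  # 3.0%
```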
7.5.2 Selective reporting
Manipulation of the way statistics are computed differs from the selective reporting of statistics by government. For example, the Labour government in 2002 claimed that street crime was decreasing, using data from the ten police force areas with the worst robbery records, which showed that there were 70 fewer mugging victims each day in England and Wales in a specific month than at the same time in the previous year. The conclusion about a decreasing crime rate was criticised because the data came from a selective group of places over a selective period. The government's claim was further criticised because it failed to provide a context for the statistics or to publish the background data that would allow checks on the conclusions.
In 2011, the Secretary of State for Education, Michael Gove, argued that controversial school reforms were necessary because Britain had dropped down the OECD's Programme for International Student Assessment (PISA) rankings, supposedly from '4th to 16th place in science; from 7th to 25th place in literacy; and from 8th to 28th in maths' between 2000 and 2009 (Eaton, 2012). However, the chair of the UK Statistics Authority, Andrew Dilnot, pointed out that this use of the figures ignored an important caveat in the 2009 PISA report, which said that it was only possible to compare the 2006 and 2009 data because the PISA 2000 and PISA 2003 samples did not meet the PISA response-rate standards. Eaton (2012) continued:
This isn't the only recent instance of the coalition playing fast and loose with statistics. David Cameron is fond of boasting that "one million" new private sector jobs have been created since the coalition came to power, but, as I've noted before, what he doesn't mention is that 196,000 of these were simply reclassified from the public sector.
Indeed, selective reporting under the Conservative government goes back further than that. Social Trends, which started in 1970, was designed, according to its first editor, to inform Parliament and the public about social and economic progress (Nissel, 1990). In the Thatcherite era, the reporting in Social Trends shifted from a clear appraisal of the economy and society to highlighting the developments that aligned with government policy. In 1990 there was very little detailed analysis by social class, income group or family type.
A closer look at a particular section, that on education, illustrates many of these points. The first page, intended to show the main features of the section, picks out those where everything is getting better. Our own perception tells us something different. It might have given a fairer picture to refer to the excellent new table which shows the United Kingdom bottom of the international class for 16 to 18 year-olds in full-time education or training, or one of the other new tables showing teacher vacancies in schools by subjects. (Nissel, 1990)
It is important, then, to remember that, as with any other data, statistics, official or otherwise, do not speak for themselves: they have to be interpreted. Any set of figures can be interpreted in different ways.
A study by the National Children's Home (1991) showed that one poor family in five was going hungry, and the report regarded this as scandalous. On the other hand, Ann Widdecombe, a Conservative government minister at the time, welcomed the report, saying that it showed that four out of five had not gone hungry!
Another related issue is the lack of reporting of big data enquiries. Often the analysis of such data is seen as business-sensitive and retained for private corporate consumption. In other cases, such as the Facebook enquiry (see Section 7.4.3), the resultant hassle for the company hardly encourages further publication of any research it undertakes.