Yaron Matras

Professor of Linguistics, The University of Manchester

When researchers try to initiate change in policy, they are likely to run up against the complexity of procedural constraints imposed on the civil service – that’s one of the lessons I learned at a meeting with officials from the Office for National Statistics (ONS) last week. The other lesson is that Modern Linguists need to consolidate their vision of how their academic enquiry can benefit a linguistically diverse society if they are to make a credible case for trying to influence policy; and in order to do that, they need a clear vision of the kind of society in which they want to live.

I have been encouraged by the blog editor to state the obvious, which is that the views expressed in the following piece are my own.

The Census question: Setting things in motion

During my presentation at a workshop in Cambridge last February, I raised the issue of the England & Wales 2011 Census question ‘What is your main language?’. I mentioned the ambiguity of the wording, a point that I have been making since the census results were released back in 2013. In the discussion that followed, a member of the audience identified himself as an official of the Statistics Authority and asked whether I had made formal representations to the ONS. According to him, now was the time to do so, since the questions for the 2021 Census were being finalised. He said his piece, and was quick to leave the room before we broke for coffee, not to be seen again.

I was left to discuss the matter with a number of colleagues, and together we contemplated whether the official’s intervention was meant as a direct invitation or prompt to give advice, and indeed whether we needed such a prompt or whether it would be appropriate to act even without one. We recalled Habermas’s distinction between Experts and Intellectuals: the former voice opinions only when consulted; the latter speak up when they feel they have something to say that is of concern to society.

So in early March 2018, a group of us – including Wendy Ayres-Bennett of Cambridge, Thomas Bak of Edinburgh, and Mark Sebba of Lancaster – wrote to the UK Statistics Authority and asked it to consider revisions to the question on language. Thomas had already appeared before a group of officials in Scotland to advise on similar issues. Mark had written extensively about the consultation process around the language question on the 2011 Census. And Wendy had various contacts in Whitehall and Westminster with whom she routinely raises issues of language policy.

In our letter we explained that the question ‘What is your main language?’, which had been used for the first time in the 2011 Census for England & Wales, had proven very helpful in collecting data on the country’s language diversity that had never been collected before, and should therefore in principle remain part of the census questionnaire. However, it required some improvement, for two main reasons:

First, there was proven ambiguity around the attribute ‘main’. Many respondents were uncertain whether this referred to the language in which they were most proficient, or to the one that they used most frequently, or perhaps to one that was preferred for other reasons (for example, in Manchester, one person each put down ‘Manx’ and ‘Cornish’ as their ‘main language’ on the 2011 Census form). Respondent feedback collected by the Office for National Statistics had in fact confirmed this, and as a result the ONS had already classified the question as imposing a ‘medium burden’ on respondents. In its topic report on the language question from 2016, the ONS noted that “the relatively high demand for online help for this question indicated that some online respondents had difficulty interpreting the question”, and also that “some non-UK born respondents were uncertain whether the question was asking about the language they first learnt or the language they most frequently spoke” (ONS Census Transformation Programme. The 2021 Census Assessment of initial user requirements on content for England and Wales. Language topic report. May 2016, p. 16).

Second, the wording did not allow respondents to list multiple languages. In that way it obscured the reality of multilingualism in many of the country’s households.

We provided the ONS with documentation on both issues. Our request to reconsider the question was supported by various stakeholders, among them Manchester City Council and several parliamentarians.

Seeking alternatives

We also pointed out that other countries with an English-speaking majority have a more fine-tuned question on language in their census: New Zealand asks in which languages respondents can have a conversation about a lot of everyday things. Canada asks about proficiency in English or French, and then asks respondents to list any other language used in the home and any additional language used on a regular basis, and to state which language was learnt first. South Africa asks which two languages are spoken most often in the household. The US, Australia and Ireland ask respondents whether they speak a language other than English in the home. Some countries provide a list of pre-selected languages representing those that are most widely spoken, which saves many respondents the time and effort of writing down language names and facilitates the coding of data at the evaluation stage. In Scotland, the 2011 Census didn’t ask about ‘main language’ but instead asked respondents whether they use a language other than English in the home (though only one language could be listed).

In its Transformation Programme report from 2016, the ONS provides an explanation for the question used in 2011:

“New questions on ‘main languages used’ and ‘English language proficiency’ were included in the 2011 Census. Data from these questions have been used to identify people for whom English is not their main language and to identify areas where a particular language is in use. This information helps councils and other organisations plan support strategies and monitor the impact of policies. Data are also used for targeting the delivery of services such as language support, translation, and study programmes at a local level to promote integration and cohesion within communities, to help eliminate discrimination, and to ensure that people are treated fairly”. (ONS Census Transformation Programme. The 2021 Census Assessment of initial user requirements on content for England and Wales. Language topic report. May 2016, p 6)

The ONS also notes that it will “balance consideration of user need, respondent burden and space constraints when reviewing requirements for additional response options in the main languages question”. (ibid., p. 25)

After insisting that we should have the opportunity to put our case to officials in person, we were invited to meet with a small group of people who had direct responsibility for drafting this part of the census questionnaire. We explained that we appreciated that the census has limited space and resource capacity to include an entire set of questions on the topic of language, but that we believed that a rather simple amendment might add considerable value: Instead of the 2011 Census question pair

  18. What is your main language?
  19. (If the answer to 18 is not English) How well do you speak English?

the 2021 Census might ask

  18. Which languages do you use in the home?
  19. If English is not your first language, how well do you speak English?

A possible alternative to 18 might be ‘Which languages do you use regularly?’, but that might re-define the question from one about population composition to one about acquired skills (Canada and New Zealand ask this question).

Limited prospects for change

We presented our case very briefly to the officials, who thanked us for our input and expressed what we felt was genuine appreciation of our arguments. They also explained their reluctance to implement changes, and I was able to get a feel for what the ONS means when it refers to “balancing considerations”. First, they said they had no evidence from users that the question was inadequate. Next, they said there was no space in the census questionnaire to accommodate amendments. They also suggested that processing multiple responses to the question would be costly. A further argument against change was the need to ensure comparability of the data across censuses. The officials went on to suggest that obtaining accurate data on language diversity was not a priority for the census’s principal stakeholders, who are interested primarily in respondents’ level of proficiency in English; question 18 on ‘main language’, they said, was merely a stepping-stone to question 19 on proficiency in English. Changing the variables would also require time to test the new questions and to assess the impact of the resulting data on users, and the ONS’s timetable foresees completion of a White Paper outlining the 2021 Census by October 2018, which would make it quite impossible to run such trials.

Instead, there were two practical suggestions: the first was to improve the guidance notes for respondents in order to help remove the ambiguity of the question; the second, in the longer term, was to draw on other surveys in order to collate data on language diversity.

It should be noted that the ONS officials used the term ‘evidence from users’ strictly in connection with the responses that the ONS solicited from a pre-defined list of institutions. That means that our research, and the statements from stakeholders, including parliamentarians and a major city council, do not constitute, in their view, ‘user evidence’ for the purpose of the exercise; nor, apparently, does the survey of individual respondents (as opposed to institutional ‘users’) carried out by the ONS itself. There is thus a circularity in the statement, as the officials in fact admitted.

Space on the questionnaire is of course a factor, but our suggestion as presented above foresees adding just 12 characters (including spaces) to the question. Certainly, the option of listing multiple languages would require an additional coding effort, but one would think that it would be good value for money. (I asked whether the ONS might produce an estimate of the additional cost, so that we could then see whether it might be covered, in part or in full, by our AHRC-funded OWRI award; but our hosts dismissed that suggestion with a smile.)

Ensuring comparability with previous datasets is of course important. If the question were to be amended as we propose, we would be getting more data, not less. Correlations with other responses would help evaluators to pinpoint which changes reflect actual increases in speaker numbers, and which reflect greater clarity in the question. For some four million immigrants who have settled in the UK since 2011, the question of comparability with the previous census is irrelevant anyway.

The suggestion that stakeholders are not actually interested in language diversity is, on the other hand, somewhat difficult to reconcile with the other arguments: if both space and data comparability across censuses are paramount considerations, and users have no interest in language diversity, then why was the question on language introduced in the first place in 2011? The ONS’s own statement about the purpose of question 18, cited above from its Transformation Programme report, mentions explicitly stakeholders’ wish “to identify areas where a particular language is in use” and further to target “the delivery of services such as language support, translation, and study programmes” as well as “to help eliminate discrimination, and to ensure that people are treated fairly”. In the Scottish 2011 Census, the question on English proficiency (question 17) in fact precedes the question ‘Do you use a language other than English in the home?’ (question 18), and so the so-called “stepping-stone” is in fact reversed.

Which brings us to the more practical suggestions. If we were to contribute to amending the guidelines for respondents, then that would seem to offer an opportunity for both sides in the conversation to tick boxes: The ONS would benefit from the input of specialist researchers, who would now become partners in the process. The researchers would be in a position to compose a remarkably convincing impact case study for REF 2021: helping ONS ‘change practice’ in connection with guidance notes on the census questionnaire would have a potential impact, in effect, on the entire population.

Tempting as this may be, it is not yet clear how many respondents actually make use of the guidelines, nor is it obvious that we could help find a reasonable explanation of the concept of ‘main language’. Unless we are able to capture respondents’ actual language practice, there is a risk that information on ‘main language’ would simply be used to categorise persons who have low proficiency in English, as demonstrated by the government’s recent Integrated Communities Strategy Green Paper from March 2018 (pp. 35–36), which correlates English proficiency with ethnicity.

The offer to draw on a range of other datasets (which the ONS is now empowered to do, apparently, by the Digital Economy Act 2017) is interesting, but of course it comes nowhere near in scope to the prospect of improving data capture through the census. At present, the only other national tool that asks about language is the School Census. Here too, there is under-reporting of languages, and no opportunity to list multiple languages. Moreover, the category ‘first language’ used there does not necessarily overlap with ‘main language’ in the census. Crucially, it only surveys school pupils. Nonetheless, the general idea of triangulating datasets on languages is well in line with our current efforts in Manchester to compile data from a variety of sources, something that we are piloting as part of the Multilingual Manchester Data Tool. If we could gain the ONS as a partner for such an endeavour on a national scale, that would indeed be a meaningful achievement: it could significantly change the way we compile and assess data on languages, which in turn might have a far-reaching impact on the way local authorities, public service providers, and the public in general appreciate the country’s language diversity.

A vision for a linguistically diverse society

One of the key achievements of the AHRC’s OWRI scheme has been to put public engagement and policy impact firmly on the agenda of funded research initiatives in Modern Languages. But what does policy impact mean, and how can it be achieved? Researchers can try to influence civil servants, but civil servants are committed to “balancing considerations”. That balance will inevitably favour inertia over a change of direction. This is managed by controlling variables such as what constitutes ‘evidence’ and ‘user feedback’, and by choosing not to deviate from a pre-set timetable. Ostensibly, the most promising avenue for researchers working with civil servants is to assume the role of invited experts, whose input proceeds on the terms set by existing procedures. Without massive political pressure, the officials will not change course, even if the value of improving accuracy may seem obvious to us. This leaves researchers with the option of lobbying politicians. Apart from the dilemma of whether we wish to cross the line from engagement to advocacy, approaching politicians requires a consolidated narrative as to how the changes we propose will benefit society as a whole, as well as, on the tactical side, the politicians themselves.