The national literacy and numeracy strategies

The following comments were sent to the principal author of the report, Lorna Earl. Her response follows together with my response to her. We both agree that these issues are important and deserve wide discussion.

The first annual report of these strategies has now been published and can be downloaded from www.education.gov.uk/.

In a press release DfEE minister Estelle Morris says "We have always believed that they (the strategies) would be a big part of our drive to raise standards in schools and it is a great tribute to all the teachers who have been teaching the literacy hour and the daily maths lesson that they have received international recognition".

Those in England who have witnessed the controversy surrounding these strategies and the manner of their introduction may wonder whether this is simply another ministerial spin on the somewhat less glowing details in the report itself. In fact, in this case, Morris's remarks do seem to reflect rather accurately the contents of this report. Written by a team from the University of Toronto, the report generally endorses the strategies and praises the government, and specifically the head of the Standards and Effectiveness Unit, for their expertise and the fact that they have 'high credibility among educators' (P. 38).

The report's authors state that they have not carried out an evaluation 'in the typical sense' but acted instead as 'critical friends'. It isn't entirely clear what this means, but since the report offers judgements about the success of the strategies it seems reasonable to judge it by the usual standards of intellectual coherence and adherence to good evidence. Sadly, it fails on both counts.

The study team visited schools, LEAs and others, and conducted interviews involving some 200 people. They do not state how they made their selections, nor do they attempt to provide detailed numerical summaries of opinions etc.; rather we are provided with the team's summary views together with selected quotations. It seems that there has been no attempt to expose the team's findings to peer review or debate prior to publication, and there is no mention of the pilot studies which were set up prior to the introduction of the NLNS. Brief reference is made to critics of the strategies and associated policies such as targets, but few references are given and the treatment is cursory. In this commentary I want to draw attention to some of the problems, as I see them, with this report.

The pilot studies preceding the introduction of the strategies attempted to test out the efficacy of different approaches in a sample of schools together with a 'control' sample. Before the pilot for the literacy strategy ended, and before its results could have any effect, the strategy itself was introduced, and a similar thing occurred with the numeracy strategy. Such a cavalier attitude towards evidence based policy implementation would seem to be highly relevant to any evaluation of the strategies themselves, but no mention of this is to be found in the Toronto report.

One of the concerns of many educators in England has been the 'side effects' of concentrating resources on literacy and numeracy (the report itself doesn't attempt to define these terms and, for example, often equates 'numeracy' with 'mathematics'). These side effects include the 'demotion' of other subjects (music, humanities etc.) and the stress induced in teachers, parents and children by the testing regime and the performance targets that have been given to each school. There is barely a mention of, let alone reflection on, such concerns in the report. In a report that attempts to be comprehensive this is a serious omission.

The report in general seems curiously decontextualised. Those who are familiar with the scene in England are only too aware of the long-standing controversy over the role of OFSTED, the widespread demoralisation of teachers and the disputes about performance indicators and league tables. Yet there is barely a mention of this background: it is as if the NLNS had been implemented in a consensual atmosphere rather than one where deep (and often justifiable) suspicion exists of government motives. Those who have lived with recent government initiatives could be forgiven perhaps for viewing this report as a convenient 'whitewash'.

On the question of 'targets' and accountability generally, the report seems to be particularly confused. It suggests that 'accountability is increasingly focussed on identifying weaknesses in schools' but fails to note that the real debate is about how precisely to define those weaknesses. The current performance indicators are widely acknowledged not to do so, and OFSTED inspections are also widely criticised. Benchmarks and school targets have also been criticised for being inadequate and unrealistic. These issues are crucial, but the Toronto report barely touches upon them and certainly does not present a coherent account of their relevance. The report talks of national tests as providing 'reference points for considering pupil attainments' (P. 12) and refers to the Key Stage 2 targets for level 4. There is no mention of the problems associated with interpreting the year-on-year changes in test scores, a problem which has bedevilled all large-scale repeat assessment schemes and of which the authors are surely well aware. With different tests every year there can be no foolproof way of measuring absolute trends. The report does recognise that (apparent) increases in test scores could be due to things other than the NLNS, but fails to mention that, in the absence of a valid evaluation study, there is really no way that causal links to the NLNS can be made. The report (P. 36) uses phrases such as 'it is likely', 'we think' and 'may well be the result' to describe what is observed, but this is merely playing with words and has little meaningful content.

In short, this report is a great disappointment. It lacks intellectual cohesion, fails properly to situate the NLNS within current cultural and political realities and altogether portrays an unrealistically rosy picture of what is actually happening. One might well be tempted to speculate that with critical friends such as these, what the government really requires are some serious critics!

Harvey Goldstein, July 2000

A response to Harvey Goldstein's comments

Harvey Goldstein has provided the OISE/UT (Ontario Institute for Studies in Education/University of Toronto) team with a number of comments based on the recently published report Watching and Learning. We are happy to respond to these comments. As academics trying to bring clarity and rationality to the investigation of educational reform (always an emotional topic), we welcome and encourage critiques of our work. We have tried to address the issues raised by Goldstein and hope that if he, or others with an interest in the implementation of NLNS in England, have evidence that is contrary to our own, they will make it available to us so that we can include it in our future analyses.

As with any evaluation, it is important to consider the OISE/UT external evaluation of the National Literacy and Numeracy Strategies in context. OISE/UT was commissioned by DfEE to undertake a 3-year study of the implementation of NLNS and to provide ongoing feedback based on our findings right from the early phases. This is not the only evaluation of NLNS. HMI (as part of OFSTED) is following two samples of 300 schools, one for literacy and one for numeracy, observing and assessing teaching in Literacy Hours and daily mathematics lessons as well as tracking pupil test scores.

The recently published report, Watching and Learning, is the first annual report from the OISE/UT team. It is based on data collection between November 1998 and December 1999. A single year is certainly not adequate to answer all of the questions associated with such a massive undertaking as NLNS. Our time in the first year was, by necessity, dedicated largely to becoming familiar with the Strategies, developing a conceptual framework for the evaluation (as described in the report) and devising systematic data collection procedures. The literature reviews that frame our conceptual framework have been published by DfEE and are available from them.

During this time, we concentrated almost exclusively on investigating NLNS as policy levers, in relation to findings and experiences in other places (drawn from the literature reviews). As you can see from the description of the methodology, we spent very little time in schools (certainly not enough to draw any defensible conclusions about how the Strategies were being implemented). Instead, we focused on establishing the intents of NLNS as they were envisioned by the national leaders and investigating what was actually happening in development and training. We interviewed all of the Regional Directors, shadowed them in their work, attended planning meetings, shadowed several HMIs, observed training sessions and interviewed a range of stakeholders including QCA, teachers' unions, BSA, TTA, and researchers in higher education who are investigating NLNS in other studies. In addition we reviewed a mountain of documents and resources. The report is intentionally written in an accessible style because it is intended for a wide audience of practitioners and policy-makers. Nevertheless, we have been quite systematic in the data collection and analysis process.

In the second year, we are focusing on NLNS as local challenges. We are conducting surveys in a randomly selected sample of schools and spending most of our time in England doing school visits to a small number of schools that we have selected to give us insights into the experiences of headteachers and teachers with NLNS. These data ought to provide us with evidence about the extent to which teachers and heads are motivated to implement NLNS, have acquired or are able to acquire the capacity to implement NLNS and are operating in situations that support and are likely to sustain NLNS. We will, of course, continue to monitor the policy dimensions of NLNS as well.

It is important to bear in mind that NLNS are the main targets of our inquiry. We are certainly aware of the history, developments and debates that surround NLNS (and many other educational initiatives in England) and the likelihood that they will influence how teachers, headteachers and LEA personnel respond to NLNS. Educators in England have experienced a relentless barrage of change and considerable negativism from many quarters. We have built both the national and the international context into our framework and will be collecting data to investigate how they affect implementation. However, we need to separate the context from NLNS, at least in the initial stages. Our work is focused first on NLNS. It would be foolish to concentrate on the context at this stage and derail our primary purpose of examining the implementation of NLNS. At this stage, we have no evidence from teachers and headteachers about how the government context is or is not affecting them. If they are negatively disposed to NLNS, we need to ascertain their views, independent of the context. If it turns out that NLNS, in their own right, are viewed as valuable by teachers and headteachers, we will try to establish whether NLNS can override the past history or whether the context drags teachers down, regardless of their response to NLNS. It is not until the third year that we will be in a position to bring our evidence about all of the pieces together and make inferences about relationships.

Goldstein suggests that we have overstepped the bounds of evidence and made judgements about the success of the Strategies. On re-reading, I think he might find that we do not, at any point, offer such judgements. (It is interesting to note that the BBC reported only the positive statements from the report and the Daily Telegraph only the negative. It's hard to believe that they were based on the same report.) We certainly have described what we see to be strengths of NLNS as viewed through the lenses of the international literature and supported by data from our interviews. For example, we heard routinely about people's confidence in and esteem for the National Directors. Representatives of very different stakeholder groups viewed them as fair, knowledgeable and trustworthy. We have noted that NLNS do have the potential, as Strategies, to enhance literacy and numeracy but that it was too early (since we do not yet have data) for us to comment about the actual implementation and use of the Strategies. The National Assessment results suggest that there have been improvements. We point out that, if these are valid estimates of literacy and numeracy, we believe that the gains are likely due to changes in motivation and an intensification of effort (e.g., more systematic use of existing strategies, more time), not altered practices. In our interim report, we identify the seduction of numbers (page 3 of Watching and Learning) as an issue that we will continue to investigate and we identify assessment literacy as an important dimension of using data for decision-making (page 19). One of the foci in 2000 and 2001 is the availability, interpretation and use of data in schools and LEAs.

Goldstein's comment about the "side-effects" on other subjects of concentrating resources on literacy and mathematics is an important one. Although we have heard such comments, 1999 pupil test data showed an increase in science as well. So far, most of the people we have talked to in schools suggest that they are able to maintain a balanced curriculum, although they tell us that it takes considerable effort and commitment. It may also be that fundamental literacy and numeracy are so important that they warrant sacrificing something else. Nevertheless, within our investigation of NLNS, we have included a "value for money" dimension that includes consideration, albeit crude, of the impact of NLNS on other subjects. The results remain to be seen.

We look forward to continued conversations and challenges about our work, and about the implementation of NLNS in England. Only by close study, systematic data collection and careful consideration of the evidence are we able to move beyond the exigencies of personal conviction and sort out the complexities of human behaviours.

Lorna Earl

A response to Lorna Earl's comments:

I have one or two specific comments on Lorna Earl's response to my earlier critique, and then some more general remarks.

Lorna misunderstands my point about side effects when she says that '1999 pupil test data showed an increase in science as well'. Potential side effects are to do with the complete experience of schooling, and this response indicates a rather limited viewpoint. In my earlier comments I stressed that the Toronto team were in no position to reach conclusions about the causal effects of the NLNS, yet persisted in doing so through the use of vague conditional statements. Lorna continues to do this when she says 'if these (test scores) are valid estimates of literacy and numeracy, we believe that the gains are likely (to be) due to changes in motivation and an intensification of effort'; that is, they believe that NLNS has actually resulted in literacy gains. The team provide no good evidence for such a belief, and this underlines my concern that they have indeed overstepped the bounds of evidence.

Lorna has also not responded to the point I made about coming to conclusions about improving 'standards' on the basis of observing increasing test scores over time. This is a very basic and rather widely misunderstood point and it is a pity that the Toronto report perpetuates such a misunderstanding.

General

Lorna is at pains to stress that they wish to examine the implementation of NLNS separate from the political and historical context. I still find such a position very difficult to understand. The context will inevitably affect the implementation of the programme, its efficacy, the views that people have of it and the way in which they respond to an external team appointed by DfEE to examine it. The evaluation team do in fact position the NLNS within the current framework of targets, accepting these targets as embodying 'explicit standards for pupil outcomes'. The debatable issue is whether the targets really do this, yet this possibility is just ignored.

In practice, the Toronto team have taken account of the context, but they have done so implicitly by accepting a particular set of official views of what is happening. For example, the report says of targets: 'They can be used to clarify, integrate and raise expectations and are most useful when they become part and parcel of school improvement plans'. In reality there is considerable doubt about how far this has been or can be achieved within the current educational context, yet the team nowhere question this assumption. Such a naïve approach can also be dangerous, especially where it relates to resource allocation and where it permeates the thinking of those concerned with school improvement. The general approach of the team is to take too many things for granted, and that, in my view, does not constitute a sound evaluation strategy.

Lorna does talk of what they plan to do in the next two years and her openness to other views is welcome, as is their decision to make various presentations to audiences in England in September and October 2000. I would expect the feedback they obtain to inform and improve future reports.

Harvey Goldstein, July 28, 2000