How can 'teaching to the test' and other unintended consequences be avoided?
Despite good intentions, national and international assessments can easily become 'high stakes' for jurisdictions, and this can distort the measures used, either by encouraging teachers to 'teach to the test' or by exerting an undue influence over the content of the curriculum. It is, of course, axiomatic that any valid assessment should largely comprise items which reflect what children would typically be expected to be able to do at a particular age; however, no assessment is ever designed to do more than sample these learning domains, and there are many aspects of learning which cannot easily be assessed. The iPIPS assessments are designed to provide accurate measurements against a small number of domains which have proven over the years to have strong predictive value for children's later attainment.
The assessments are, therefore, designed to produce significant information about the skills and dispositions which children acquire, but they are not definitive in terms of the curriculum which should be taught. This is an important distinction.
In order to avoid the assessments becoming 'high stakes' for schools or teachers (for example through the use of 'league tables' or performance management procedures), the reports will not include any data identifiable to an individual school or teacher. This should minimise the risk of schools (or pre-schools) 'teaching to the test' in order to obtain better results, whether in the initial baseline assessment or in the follow-up assessment. The iPIPS team will work with individual jurisdictions to ensure that this position is safeguarded. Additional measures may also be included in the assessment to act as a mechanism for preserving the integrity of the survey.
At school level, therefore, there will be an 'expectation of honesty' based on the fact that schools and teachers are not being held to account. In addition, at least part of the data will be collected by independent researchers. The inclusion of new and unknown items in the follow-up survey would allow schools to be dropped from the analysis where there was evidence of inappropriate administration procedures.