1. Clarify the aims of the service and the evaluation 

Box 3. Factors to be considered when estimating the time needed for impact to be seen

  1. The need to communicate changes to staff, and gain their support to make changes.
  2. Providing resources and materials to help staff implement the changes in care.
  3. Whether the new care pathways/processes are being implemented in different ways across the local health and care providers, therefore adding complexity.
  4. Whether the new care models are being implemented across all providers in the area, and the impact on contractual agreements.
  5. Allowing time to recruit/enrol sufficient patients to actually receive care via the new pathways/processes.
  6. Allowing time for a large enough sample of patients to have been actively managed in the new care model.
  7. Allowing time for patients whose health has changed to demonstrate improved outcomes and show an overall statistically significant benefit.

2. Decide on the number of people needed to demonstrate an effect
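
A statistical power calculation is the usual way of deciding how many people are needed: given an assumed size of effect, and conventional levels of statistical significance and power, it indicates how large the intervention and control groups need to be. The sketch below is a minimal illustration of such a calculation; the effect size, significance level and power used are illustrative assumptions only.

    # Illustrative power calculation for a two-group comparison.
    # The effect size, alpha and power below are assumptions chosen for
    # illustration, not values recommended by this guide.
    from statsmodels.stats.power import TTestIndPower

    n_per_group = TTestIndPower().solve_power(
        effect_size=0.2,   # assumed standardised (Cohen's d) effect size
        alpha=0.05,        # two-sided significance level
        power=0.8,         # desired probability of detecting the effect
        ratio=1.0,         # equal-sized intervention and control groups
    )
    print(f"Approximate number needed per group: {n_per_group:.0f}")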


3. Ensure permission is granted to access person-level datasets

Figure 1: The use of pseudonymous data to record person-level activity 

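
One common way of handling person-level data of this kind is to replace direct identifiers, such as the NHS number, with a pseudonym generated by a keyed hash, so that records for the same person can still be linked across datasets without revealing their identity. The sketch below illustrates that general idea only; it is not a description of any particular information governance arrangement, and the key handling shown is an assumption.

    # Minimal sketch of pseudonymisation using a keyed hash (HMAC-SHA256).
    # The key handling shown here is illustrative only; in practice the key
    # would be held securely by the data controller or a trusted third party.
    import hashlib
    import hmac

    SECRET_KEY = b"replace-with-a-securely-held-key"  # assumption for illustration

    def pseudonymise(identifier: str) -> str:
        """Return a stable pseudonym for an identifier such as an NHS number."""
        return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

    # The same identifier always maps to the same pseudonym, allowing
    # person-level linkage without exposing the original identifier.
    print(pseudonymise("0000000000"))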


4. Ensure there are data on who received the new service, and some information about the service received

Box 4. Example of service level information about patients receiving a particular intervention

Identifiers that might be collected:

  • NHS number (if available)
  • Date of birth
  • Sex
  • Post code
  • First name
  • Last name

Service details:

  • Start date
  • End date (if available)
  • Description of service including eligibility criteria for entry (if applied), referral routes etc.
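
One way to hold this information is as a single table with one row per patient per episode of the service. The sketch below shows a possible layout; the column names and example values are illustrative assumptions that mirror the fields listed in Box 4.

    # Illustrative layout for service-level data: one row per patient per episode.
    # Column names and values are assumptions that mirror the fields in Box 4.
    import pandas as pd

    service_records = pd.DataFrame([{
        "nhs_number": "0000000000",     # if available
        "date_of_birth": "1940-01-01",
        "sex": "F",
        "postcode": "AB1 2CD",
        "first_name": "Jane",
        "last_name": "Doe",
        "service_start_date": "2014-04-01",
        "service_end_date": None,       # may be missing for ongoing care
        "service_description": "Community support service; GP referral",
    }])
    print(service_records.dtypes)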


5. Identify the potential control population

Before starting the matching process, the pool of potential controls must be identified. There are a number of choices for controls:

Note that some of the differences between the intervention and control groups at baseline can have less of an impact if a ‘difference-in-difference’ method is used when analysing results. Further details are given in Box 6. 


6. Create longitudinal patient-level histories of service use

When creating histories of patients’ health and care activity over time, a typical approach is to use at least two years of data before the intervention start date, and to follow up a year or so after the intervention period – although shorter timescales can be feasible.

The hospital datasets that might be used include admissions and attendances for inpatient, outpatient and A&E activity. People will typically have one or more of these care events over a period of time, so the records of individual events need to be combined into a person’s care history. There are a number of ways to do this, but the simplest is to build up a dataset with one row per person in the potential control pool and the intervention group (this might be, for example, all over-65s in one area, if the intervention group are all over 65).
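
As a minimal illustration of this step, the sketch below collapses an event-level hospital dataset into one row per person, counting events of each type in the two years before an index date. The file and column names are assumptions.

    # Sketch of collapsing event-level hospital records into one row per person.
    # The file name and columns (person_id, event_date, event_type) are assumptions.
    import pandas as pd

    events = pd.read_csv("hospital_events.csv", parse_dates=["event_date"])

    index_date = pd.Timestamp("2014-04-01")  # illustrative intervention start date
    prior = events[(events["event_date"] < index_date) &
                   (events["event_date"] >= index_date - pd.DateOffset(years=2))]

    # One row per person, with one column per event type (e.g. inpatient
    # admissions, outpatient attendances, A&E attendances) in the prior two years.
    history = (prior.groupby(["person_id", "event_type"])
                    .size()
                    .unstack(fill_value=0))
    print(history.head())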

Illustrating an individual's health and social care history over a three-year period (Figure 2) can help bring the data to life, and is a useful way of gaining an understanding of the frequency and patterns of contacts with health and care providers for individuals in the study. 

Figure 2. Diagrammatic illustration of an individual’s health and social care history over a three-year period (Bardsley and others, 2011)


In the example in Figure 2, in the first year the patient had four outpatient attendances and three hospital admissions, as well as some GP visits. A social care assessment was carried out towards the end of the year, but this appears not to have resulted in any service being provided. In the following year, two social care assessments were carried out and a low-intensity package of home care was put in place. Using the information from years one and two, we can then predict likely care usage in year three. In this example, our model predicted an increase in the intensity of the home care package or a care home admission. However, this was not observed: in the third year several unplanned hospital admissions occurred, as well as two social care assessments, but social care services did not continue past March of year three.


7.  Identify matched controls

The actual process of matching is something that should probably be undertaken by a specialist analyst. However, the steps involved are outlined here, and a simplified sketch of one possible approach is given after Box 5.

Box 5. Examples of variables used in matching

  • Age
  • Gender
  • Index of Multiple Deprivation of local area
  • Risk of admission score
  • Presence of specific diseases
    • e.g. cancer, coronary heart disease, diabetes, chronic obstructive pulmonary disease, dementia
    • number of long-term diseases
  • Prior hospital activity (1 month, 6 months, 3 years)
    • inpatient admissions
    • emergency admissions
    • A&E attendances
    • outpatient activity
    • bed days

Figure 3. Example of how intervention and control groups are matched on a series of variables indicating the presence of prior disease (Georghiou and Steventon, 2014)

 

Figure 4. Standardised differences across the variables used for matching, before and after matching
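
The standardised difference compares the mean of a matching variable in the two groups, scaled by the pooled standard deviation; values close to zero after matching indicate good balance. A minimal sketch, reusing the assumed DataFrames from the matching sketch above:

    # Standardised difference for one variable: difference in group means
    # divided by the pooled standard deviation.
    import numpy as np

    def standardised_difference(treated, controls, variable):
        t, c = treated[variable], controls[variable]
        pooled_sd = np.sqrt((t.var() + c.var()) / 2)
        return (t.mean() - c.mean()) / pooled_sd

    for var in ["age", "risk_score", "prior_emergency_admissions"]:
        print(var, round(standardised_difference(treated, matched_controls, var), 3))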


8. Monitor outcome variables for those receiving the new service and matched controls

Having identified the individuals receiving the new service and controls, the next step is to monitor the outcome variables (as defined in step 1) over time post-intervention.  
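
In practice this usually means counting the outcome of interest, such as emergency admissions, per person per month, aligned to each person’s intervention (or equivalent) start date, for both groups. A minimal sketch under assumed file and column names:

    # Count emergency admissions per person per month, relative to each person's
    # index date, for the intervention group and matched controls.
    # File and column names are assumptions.
    import pandas as pd

    admissions = pd.read_csv("emergency_admissions.csv", parse_dates=["admission_date"])
    cohort = pd.read_csv("cohort.csv", parse_dates=["index_date"])  # person_id, group, index_date

    merged = admissions.merge(cohort, on="person_id")
    merged["months_from_index"] = (
        (merged["admission_date"].dt.year - merged["index_date"].dt.year) * 12
        + (merged["admission_date"].dt.month - merged["index_date"].dt.month)
    )

    # Average admissions per person in each month relative to the index date.
    event_counts = merged.groupby(["group", "months_from_index"]).size()
    group_sizes = cohort.groupby("group").size()
    rates = event_counts.div(group_sizes, level="group")
    print(rates.unstack("group").head())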

Figure 5. Comparing the number of emergency admissions pre- and post-intervention and for a matched control group – regression to the mean (Steventon and others, 2011)

By way of example, Figure 5 shows the results of a study we carried out evaluating a scheme to support older people who had been in hospital (Steventon and others, 2011). The intervention group showed a sharp increase in admissions before the start of the intervention, as would be expected given that candidates for the service were people who were already in hospital. Following the intervention, the number of future emergency admissions fell away dramatically – but could this have been a result of the intervention, or alternatively regression to the mean (see Box 1)? To answer that question we used the retrospective matched control method to compare the level of activity pre- and post-intervention with a control group.

The controls were selected in part because they showed an almost identical rise in emergency admissions in the months leading up to dates equivalent to the intervention start dates. In this respect they appeared to closely match the intervention group, and they were also matched on a range of other factors.

The ‘future’ emergency admissions of this control group appeared to decline even faster than those of the intervention group, suggesting that the fall in emergency admissions in the intervention group was indeed largely regression to the mean (Figure 5). 


9. Undertake summative analysis

Box 6. Example of difference-in-difference results table on secondary care utilisation

The table below gives an example of how a results table may look for a difference-in-difference analysis.

A difference-in-difference approach compares the change in outcome within the intervention group to the change in the outcome within the control group, over two time points.

The change in emergency admissions for the intervention group, comparing rates before (a) and after (b) the intervention, was a 0.35 absolute reduction in emergency admissions (a-b=c).

In the control group, the change over the same periods before (d) and after (e) the intervention was a 0.58 absolute reduction in emergency admissions (d-e=f).

The difference-in-difference is the difference between these two changes over the same two time points (c-f=g), which in this example is -0.23: the intervention group’s admissions fell by 0.23 less than the control group’s. So, after taking into account the change in the control group over the same time points, emergency admissions effectively increased by 0.23 in the intervention group.
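
The same arithmetic can be written out directly. In the sketch below, the individual before and after rates are illustrative assumptions, chosen only so that the differences match those quoted above (0.35, 0.58 and 0.23):

    # Worked difference-in-difference calculation. The individual before/after
    # rates are illustrative assumptions; only the differences come from the
    # example above.
    a, b = 1.20, 0.85   # intervention group: admissions before (a) and after (b)
    d, e = 1.18, 0.60   # control group: admissions before (d) and after (e)

    c = a - b           # change in the intervention group: 0.35 reduction
    f = d - e           # change in the control group: 0.58 reduction
    g = c - f           # difference-in-difference: -0.23

    print(round(c, 2), round(f, 2), round(g, 2))   # 0.35 0.58 -0.23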


10. Continuously monitor