PHSO – the facts and evidence

We have published an updated position statement on the Parliamentary and Health Service Ombudsman (PHSO). To develop this, we looked at the available evidence about PHSO's performance. This article sets out what we found, both about what the evidence tells us, and about the strengths and weaknesses of different sources of evidence.

1) PHSO's Service Charter

In 2016, PHSO developed and launched a Service Charter setting out what it aimed to deliver for complainants. (Note: while we generally refer to 'patients', PHSO generally refers to 'complainants', both to reflect that some complaints are not lodged by the patient, for instance when the patient has died, and because it also deals with some non-healthcare services; each term will be used as appropriate in this article.) Since then, it has issued quarterly figures showing how people who have used its service rate its delivery.

The Service Charter contains the following commitments:

1. We will explain our role and what we can and cannot do
2. We will explain how we handle complaints and what information we need from you
3. We will direct you to someone who can help with your complaint if we are unable to, where possible
4. We will keep you regularly updated on our progress with your complaint
5. We will listen to you to make sure we understand your complaint
6. We will explain the specific concerns we will be looking into
7. We will explain how we will do our work
8. We will gather all the information we need, including from you and the organisation you have complained about, before we make our decision
9. We will share facts with you, and discuss with you what we are seeing
10. We will evaluate the information we have gathered and make an impartial decision on your complaint
11. We will explain our decision and recommendations, and how we reached them
12. We will treat you with courtesy and respect
13. We will give you a final decision on your complaint as soon as we can
14. We will make sure our service is easily accessible to you and give you support and help if you need it.

Evidence of complainants' views is gathered by telephone interviews, conducted by an independent research company operating under the Market Research Society's Code of Conduct. Approximately 600 complainants are surveyed for each report, and the data is published in a quarterly 'dashboard' report on the PHSO website, giving a percentage score for complainants who were satisfied that each commitment had been met.

Although the publication of this data offers a good deal of transparency – indeed, no other UK ombudsman publishes anything comparable – it is not presented anywhere on the PHSO website (that we could find) in a form that allows comparison of the data over time. Accordingly, we had to scrape the data manually from the quarterly releases and compile it into a spreadsheet for analysis; the data for complainant responses is reproduced in Table 1. While PHSO deserves credit for transparency, it might do well to present the data in a way that can be more readily analysed and interrogated.

Table 1: Complainant feedback (%) against the PHSO Service Charter, as published on the PHSO website. Domains correspond to the list of commitments given above, as numbered in the left-hand column.

As this is PHSO's data, it might be asked whether weight should be placed on it. We see no reason why not: the methodology by which it is gathered appears in principle to be sound; the Service Charter's domains broadly capture the important aspects of complainants' experiences, including those that patients have told us have been problematic; and given that the data contains both good news and unflattering results for PHSO, we conclude that it has been gathered and published honestly. With those things in mind, it is by far the strongest evidence base for assessing PHSO's performance since the data was first collected in the third quarter of 2016-17.
PHSO is also investigating how to develop a measurement of satisfaction for commitment 10 ('We will evaluate the information we have gathered and make an impartial decision on your complaint') that will produce a score independent of whether the complaint was upheld or not. At present, no complainant data exists for this commitment.

The above data shows a complex picture, with some measures trending roughly level over the three years, some up and some down; there is also considerable disparity between different measures. To draw out some conclusions, we considered the measures with the lowest average satisfaction over the three years, as well as the measures that had trended downwards and those that had trended upwards over the period. To identify upward and downward trends, we averaged the scores for the first three quarters and for the last three quarters, and calculated the difference.

The five domains highlighted in Figure 1 all returned average scores of less than 70%. They were:

- We will gather all the information we need, including from you and the organisation you have complained about, before we make our decision
- We will share facts with you, and discuss with you what we are seeing
- We will explain our decision and recommendations, and how we reached them
- We will give you a final decision on your complaint as soon as we can
- We will make sure our service is easily accessible to you and give you support and help if you need it.

Two domains trended essentially flat over the three years, and two showed clear deterioration; all four are shown in Figure 2. The flat domains were treating complainants with courtesy and respect (0.0% change) and explaining how complaints are handled and what information is needed (0.3% worse over the last three quarters compared to the first). The two domains with marked deterioration were explaining decisions and recommendations (down 8%), and giving a final decision as soon as possible (down 4.3%).
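The trend calculation described above is simple enough to sketch. The figures below are hypothetical, for illustration only – they are not PHSO's actual quarterly scores:

```python
# A minimal sketch of the trend measure: the mean of the last three
# quarters' scores minus the mean of the first three quarters' scores.
# The example data is invented, not drawn from the PHSO dashboard.

def trend(scores):
    """Return the change (in percentage points) between the average of
    the first three and the last three quarterly scores."""
    first = sum(scores[:3]) / 3
    last = sum(scores[-3:]) / 3
    return round(last - first, 1)

# Illustrative series of twelve quarterly satisfaction scores (%):
example = [62, 63, 61, 64, 65, 66, 67, 68, 69, 70, 69, 71]
print(trend(example))  # prints 8.0: a positive value indicates improvement
```

A positive result indicates an improving domain, a negative one a deteriorating domain, and values near zero a flat trend.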
It is notable that these two domains are also among those where complainants report the weakest performance over the period as a whole. They were among the areas of greatest concern to patients, and caused the most upset, during PHSO's period of poor performance, so while we appreciate that the transformation programme still has further to run, it is particularly disappointing to see a further decline in performance on these commitments.

Five domains, shown in Figure 3, showed good improvements in the service, with rises of six percentage points or more from the first three quarters of the period to the last three:

- Making sure the service is easily accessible (6.0%)
- Keeping complainants regularly updated on progress (6.7%)
- Explaining the specific concerns that will be looked into (6.7%)
- Explaining how PHSO will do its work (7.3%)
- Sharing facts with complainants, and discussing what PHSO is seeing (7.7%).

Two of the lowest-scoring domains across the whole period also count among the most improved (sharing facts, and making the service accessible).

2) Data from our helpline

As well as evidence from PHSO, we considered whether evidence from our helpline could tell us about the extent of change (if any) at PHSO. Qualitative information from the helpline remains a rich and valuable source of evidence about people's experiences with PHSO. When people report problems to us, their cases continue to show much the same characteristics as we noted in our earlier reports. However, these cases can still date from the period before PHSO began its transformation programme, or from the programme's very early stages. The continued occurrence of cases of this sort over the early years of the transformation period would be expected to some extent, even if the programme had gone extremely well. What would be more illuminating would be trends that shed light on possible changes in PHSO's performance over time.
Such trends would have to be found in quantitative data. We therefore looked at the number of helpline cases relating to PHSO over recent years. Here, we are able to go back to 2016, when we began using our current database. This provides a useful overlap with the period when we were issuing our reports critical of PHSO, effectively giving us a baseline in the form of the earliest year's data.

However, we need to attach some caveats to the data in Table 2. We have changed our processes since 2016 in two key respects. Firstly, we now count multiple contacts about the same matter as one 'case', whereas previously they were counted as individual cases. Secondly, we introduced a new and more robust classification system for cases at the start of 2018. Accordingly, we are highly confident about the quality of the data for 2018 and 2019, while the earlier data is broadly comparable, subject to those two considerations.

Drawing any conclusions from these data is extremely difficult. Above all, they represent a small number of cases relative to PHSO's work overall (5,658 investigations in 2018-19). To the extent that there is a pattern, the figures appear to show the number of PHSO-related cases remaining broadly flat over the period as a proportion of our overall helpline traffic, with a spike in 2018. It is not possible to draw any useful conclusions either way about PHSO's performance from this data.

3) Data from Trustpilot

Thirdly, we considered whether data from the consumer ratings website Trustpilot might provide useful insight into PHSO's performance. A small number of people have left reviews of PHSO on Trustpilot, the first in 2017 and continuing into 2020. PHSO attracts a 1.5-star rating (out of a possible five stars, five stars being described as 'excellent' and one star 'bad') from 103 reviews at the time of writing, drawn from ratings in the following proportions:

- Excellent – 2%
- Great – 0%
- Average – 1%
- Poor – 1%
- Bad – 96%.
On a qualitative level, the reasons for the reviewers' negative ratings are largely in line with those that we have previously identified and PHSO has acknowledged. But the methodological problems with using this data to evaluate PHSO's performance overall are immediately obvious. The very small sample of cases amounts to well under 1% of total enquiries to PHSO. Even in a well-performing organisation, one might expect this level of dissatisfaction among so small a fraction of service users, so fewer than 100 bad reviews on Trustpilot does not provide meaningful evidence. The nature of the reviews also means that it is not always possible to identify when the cases date from, so no change over time can be identified.

It is noticeable that Trustpilot hosts similarly negative review pages for many other ombudsman services. Similarly low scores are given, also by modest numbers of users, to the Legal (1.5 stars), Financial (1.5), Housing (1.5), Local Government (1.5) and Property Ombudsmen (2). There appears to be a wider pattern of Trustpilot attracting negative comment from unhappy users of ombudsman services, overwhelmingly those whose complaints have not been upheld (whether rightly or wrongly). Extrapolating wider conclusions about the performance of any of these organisations from this very limited data is impossible.

Conclusions

Although there is quite a lot of information available about PHSO's performance, not all of it is useful. From the useful data, a somewhat equivocal picture emerges. PHSO's performance certainly does not seem to have deteriorated, which is arguably quite impressive in itself given the extent of the organisational change it has undergone during this period. Its own Service Charter data shows a mix of modest improvement, overall flatlining and slight deterioration, depending on the measure in question. The measures where there has been progress outnumber those where things have got worse.
But, just as there has been no drastic decline in PHSO's performance, there has not been a step-change improvement either. As PHSO's changes start to be more widely felt, we would hope to see clear evidence that patients' and complainants' experiences of its service are indeed improving as a result. We will also expect to see redoubled efforts to improve any areas that the evidence shows remain weak.