Sources of Data for Monitoring Expert Performance - Dr Mark Burgin
26/11/20. Dr. Mark Burgin BM BCh (oxon) MRCGP considers what sources are available for monitoring of experts and how to maximise the benefit from that data.
Techniques for managing the performance of professionals are mainly about providing data in an accessible way. There is no one-size-fits-all approach, and multiple sources of data should be considered. The data should be presented neutrally and peer-referenced, rather than on a range from good to bad. The professional can then see whether they are an outlier and consider the reasons for the difference. This is called Closing the Loop and is probably the best way of managing professional performance, as it does not involve a manager.
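The peer-referenced presentation described above can be sketched in a few lines. This is a minimal illustration, not any system actually used by MedCo or MROs: the metric (mean prognosis length in months), the sample figures and the two-standard-deviation threshold are all assumptions for the example.

```python
from statistics import mean, stdev

def peer_reference(expert_value, peer_values, z_threshold=2.0):
    """Express one expert's metric relative to the peer group.

    Returns a neutral, peer-referenced summary: the peer mean, the
    expert's distance from it in standard deviations (z-score), and
    an outlier flag -- distance from peers, not 'good' vs 'bad'.
    """
    mu = mean(peer_values)
    sigma = stdev(peer_values)
    z = (expert_value - mu) / sigma if sigma else 0.0
    return {"peer_mean": round(mu, 2),
            "z_score": round(z, 2),
            "outlier": abs(z) > z_threshold}

# Illustrative figures: mean prognosis length (months) per expert
peers = [6.1, 5.8, 6.4, 6.0, 5.9, 6.2, 6.3, 5.7]
print(peer_reference(9.5, peers))  # flagged as an outlier
print(peer_reference(6.1, peers))  # within the peer range
```

Presenting the result as a distance from the peer mean, rather than a pass/fail grade, leaves it to the professional to consider why the difference exists.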
Highly performing professionals are typically fairly average on most measures, even though their overall performance is outside the range. This is because achieving greatness means getting the little things right. Highly performing professionals are often outliers only when they are under strain, or for short periods while adjusting to outside changes. Poorly performing professionals, by contrast, will often be outliers on the same measures while achieving only average overall performance. These apparent inconsistencies can cause problems for inexperienced managers who look only at variance figures.
Trying to rely on incomplete or single measures encourages gaming of the system by poorly performing professionals. Highly performing professionals will resist such a system because they feel that maximising performance on one measure will damage overall performance. This demotivates both groups. Using multiple linked measures and looking at composite performance is more effective, but there are limits to using data to understand performance. Many differences in performance involve relationship issues, which regulators or quasi-regulators may try to ignore.
Big data
MedCo has a database which includes some useful data about performance. The major providers of report writing software hold extensive data which has largely remained untapped. Individual agencies have access to many reports but do not have the resources to investigate their data. MedCo could provide individual experts and MROs with peer-referenced reports about performance at minimal cost. There is a strong argument for MedCo to share its data in this way to improve performance, or to allow someone else to use the data to provide such reports. Examples of useful measures are prognosis length, prognosis variation and the number of injuries recorded in each report.
The providers of report writing software could provide extensive anonymised reports for individual experts or agencies. This could include peer-referenced data, as the databases they hold are large enough to make it difficult to identify individuals. As the Data Protection Act 2018 applies fully to providers of report writing software, it would not be possible to provide MROs with an individual expert’s data. Examples of available data include recommendations and referrals, the amount of free text used, and the use of flags such as ‘severe’ pain.
Most ‘audits’ are very basic: they check that the personal details match the instruction letter and that the report contains prognoses. This type of initial scan is useful for picking up reports where the expert has made a mistake. Some errors can still get through, such as where the expert has forgotten to remove a templated phrase, but it is generally fairly good at spotting obvious errors. Data from this type of audit is good at picking up poor secretarial skills and experts who do not proofread their reports. Although performance on a single report is rarely informative, where an expert’s reports are often inconsistent with their instructions it may be material to find out whether the expert or the instruction was correct.
Auditing Expert Reports
Clinical audit of expert reports involves two different processes: the first is adherence to CPR35 and the second is the quality of the information. Although some experts are poor at both or good at both, performance can be divergent. An expert who is good clinically may not be as confident in the legal aspects, and vice versa. Although clinical audit can provide data on far more areas than big data approaches, it is relatively expensive. It is important to target audits to maximise the pickup of problems and to have robust policies for managing the results. Without an efficient system to manage both poor and high performance, this approach can be expensive and ineffective.
Random screening of around 1% of expert reports minimises costs while still protecting the courts from large numbers of unsafe reports. The audit results need to be provided to the experts whether they have passed or failed, as this allows them to learn and improve. No-one writes perfect reports every time, and many experts welcome neutral feedback even if they do not agree with all the points. Identifying early those experts who have not allocated sufficient time, have knowledge gaps or have lost motivation is important. The financial cost of even one failed case can run into hundreds of thousands of pounds; imagine having to withdraw hundreds of reports after a court finding against an expert.
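The 1% random screening step is simple to automate. The sketch below is illustrative only: the report identifiers are invented, and the minimum-of-one rule (so that small caseloads still get some screening) is an assumption rather than any stated MRO policy.

```python
import random

def screening_sample(report_ids, rate=0.01, minimum=1, seed=None):
    """Select a random fraction of reports for clinical audit.

    A 1% rate keeps audit costs low while still offering the courts
    some protection; `minimum` ensures a small caseload is never
    skipped entirely. A fixed `seed` makes the draw reproducible.
    """
    rng = random.Random(seed)
    k = max(minimum, round(len(report_ids) * rate))
    return sorted(rng.sample(report_ids, k))

# Illustrative caseload of 2,500 reports
reports = [f"R{n:05d}" for n in range(1, 2501)]
batch = screening_sample(reports, rate=0.01, seed=42)
print(len(batch))  # 25 reports selected for audit
```

Because the sample is random rather than targeted, it gives an unbiased baseline against which targeted audits can later be compared.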
Diagnosing an expert’s difficulties using clinical audit can be straightforward, for instance when they have not followed CPR35. Lack of legal knowledge is perhaps less common since legal training became mandatory, at least for MedCo cases. Many experts have simple but less obvious reasons for poor performance; they may have a medical illness or pressing work commitments. Experts can be aware of their difficulties yet breach CPR35, for instance by not recording their limited experience. Some problems can only be diagnosed by repeated audits, for instance extensive copy and paste in sections such as the examination. Part 35 requests are a further source of relevant data.
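Copy-and-paste across reports is exactly the kind of problem that only shows up when several of an expert's reports are compared. A minimal sketch of one way to do this, using simple pairwise text similarity: the report identifiers, examination texts and the 0.9 similarity threshold are all assumptions for illustration, not a standard audit tool.

```python
from difflib import SequenceMatcher
from itertools import combinations

def flag_copy_paste(sections, threshold=0.9):
    """Flag pairs of report sections that are near-identical.

    `sections` maps report id -> examination text. Pairs whose
    similarity ratio meets `threshold` suggest a templated
    copy-and-paste rather than a fresh examination record.
    """
    flags = []
    for (id_a, txt_a), (id_b, txt_b) in combinations(sections.items(), 2):
        ratio = SequenceMatcher(None, txt_a, txt_b).ratio()
        if ratio >= threshold:
            flags.append((id_a, id_b, round(ratio, 2)))
    return flags

# Illustrative examination sections from three reports
exams = {
    "R001": "Full range of neck movement. Tenderness over trapezius.",
    "R002": "Full range of neck movement. Tenderness over trapezius.",
    "R003": "Reduced neck rotation to the left with muscle spasm.",
}
print(flag_copy_paste(exams))  # → [('R001', 'R002', 1.0)]
```

A flagged pair is only a prompt for human review: two genuinely similar examinations can legitimately produce similar text.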
Interviewing experts
The types of information that MROs can obtain by asking for written statements, or through telephone, video or face-to-face discussions, are much greater than by other means. It is good practice for every agency to provide feedback to experts annually so that they are aware of their standing. As this is mandatory for appraisal, experts often try to obtain detailed information by asking the agency. Even with 360-degree colleague feedback, the expert may not be aware of the context of any complaints. If the expert has not had a chance to engage with the concerns, it is unlikely that they will be able to learn effectively from the feedback.
The expert’s explanation for their behaviour can help the MRO decide whether they are dealing with high or poor performance. An expert might have received much criticism for following CPR35 because ‘other experts do not’. Refusing to amend a report where the amendment would contradict what was said at the time of the examination, and so breach the statement of truth, is good practice. Experts might be surprised to learn that amending a report in these circumstances could lead to a prison sentence. The outlying expert who refuses may have an important lesson to teach the MRO and other experts.
It is, however, difficult to interview an expert unless someone at the agency has a good working relationship with them. This means more than sending emails; they must have spoken on more than one occasion for trust to build up. The best agencies arrange face-to-face contact, such as at medico-legal conferences; the worst have no-one who knows the expert. The importance of having a named contact goes further than having someone to speak to when things go wrong. If a senior manager has to become involved, having a named contact ask the initial questions can be less threatening and improve the fidelity of the answers.
Conclusions
Information is the key to managing both poor and high performance; a manager without information cannot manage. All three sources of data discussed above are essential, and each source on its own is insufficient. All MROs should have robust systems to ensure that they analyse big data, target audits where they can do most good, and build up the relationships needed to interview their experts. Having an open mind when investigating an outlying expert is essential, so as to learn why the variation has occurred rather than simply trying to change performance. Without good information it is impossible to give the good feedback needed to improve performance.
There is such a large focus on individual performance that the importance of systems is underestimated, for example the failure to have a trusted contact at the agency or a lack of feedback. In general, where more than 10% of experts have the same problem, the cause lies with the system rather than the expert. Examples of systemic problems in PI reports are inadequate accident details to determine the force involved, and failure to address inconsistencies or to offer a range of opinion. In each case lawyers will almost never ask for clarification, so experts feel that they do not need to alter their behaviour.
Trying to diagnose an illness from a few symptoms is likely to cause harm to the patient, increase costs and delay the correct treatment. The same is true for medical expert reports, so it is essential that a proper procedure is in place to investigate performance. It is unhelpful to have a kangaroo court that can only decide that the expert is failing, but worse still is no feedback of any sort. Clinical audit is the most effective and least risky way to identify problems with expert report writing and to improve performance. The right systems need to be in place to make proper use of the data that this methodology provides.
Doctor Mark Burgin, BM BCh (oxon) MRCGP is on the General Practitioner Specialist Register.
Dr. Burgin can be contacted on 0845 331 3304 and via the website drmarkburgin.co.uk
Image ©iStockphoto.com/prizela_ning