Stuff that occurs to me

All of my 'how to' posts are tagged here. The most popular posts are about blocking and private accounts on Twitter, as well as the science communication jobs list. None of the science or medical information I might post to this blog should be taken as medical advice (I'm not medically trained).

Think of this blog as a sort of nursery for my half-baked ideas, hence 'stuff that occurs to me'.

Contact: @JoBrodie Email: jo DOT brodie AT gmail DOT com

Science in London: The 2018/19 scientific society talks in London blog post

Showing posts with label Errordiary. Show all posts

Wednesday, 20 November 2013

Diabetes and language used in healthcare and research

NB. Likely a few more links to be added in, I am working on a slightly uncooperative wifi network at the moment

I took part in a work-related Twitter chat last night, about avoiding errors in the self-management of diabetes.

The project I work on, CHI+MED, is looking at ways of making interactive medical devices safer, but to do this we don't just study the devices themselves but also the people who use them and the systems that the machines are used in - basically it's a "sociotechnical" model sort of thing.

People know that they make errors in using machines. Sometimes the system helps them to prevent this - for example, I've made good use of the delete key while typing this post; someone had the good sense to add one into keyboard design. Sometimes people develop their own cunning plans to prevent errors. These are 'resilience strategies' (strategies that make them resilient to error) that are either generated by the person themselves or picked up from colleagues - they're rarely 'in the instruction manual' and they're not part of any official training.

But they can be really useful - both to other people who are using that medical device and to researchers who want to find out the strategies people employ to prevent mishaps.

And that's what the chat was about - what sorts of errors might people with diabetes make (particularly people with Type 1 diabetes, who are regularly monitoring their blood glucose levels and adjusting doses of injected insulin), and what tricks have they developed to try and avoid making them?

One interesting thing that came up was the language used by error researchers and how this might conflict with that used by people with diabetes or diabetes researchers. Dom (a colleague on CHI+MED who was co-hosting the Tweetchat with @OurDiabetes) uses terms like slip, mistake and violation which have precise meanings in the context of human factors and ergonomics research.

One of the people participating in the chat felt that the word violation was a bit of a strong term - it certainly carries negative connotations. Suzette Woodward has a helpful post explaining some of the examples of violations (eg of policies) in a healthcare setting: Working to rule?

Language used in different disciplines often has the potential to offend, or even just misfire, when heard by other people out of context.

I remember, when working in a GPs' surgery 10 years ago, reading that "the patient denied having any chest pains" and being amused at the implication that the doctor knew full well that the patient was having chest pains but that the patient wasn't having any of it. That's not what it means of course, it just seemed a strange way to say "the patient reported that he was not experiencing any chest pains" but "deny" carries other meanings to those not immersed in this use of language.

Similarly there are terms used in healthcare research looking at situations where medication is just not taken. It might be forgotten, lost (stolen?), unusable (damaged) and so not used. Equally it might be intentionally not used.

The various terms I came across that meant "not taking his or her medication" were non-compliance, non-adherence and non-concordance. All mean more or less the same thing but non-compliant sounds a bit more "naughty diabetic*" and "non-concordance" suggests a certain disagreement between patient and doctor.

*I do of course mean "naughty person with diabetes" ;-)

Further reading





Friday, 20 September 2013

Healthcare professionals: research project / focus group on error, blame and resilience in London

Summary
There's a focus group next Thursday in London for healthcare professionals to talk about the issues of error, blame and their resilience strategies to avoid errors. An example might be using a post-it note to flag up a reminder.

By their very nature people's 'resilience strategies' aren't official or found in manuals - we're collecting this sort of 'hidden' information to use in improving the design of medical devices.

Making devices more resilient to error can make them safer for patients. 

Medical Professionals (e.g. nurses, doctors, paramedics and emergency care practitioners)
Thursday 26th September 2013
5.30-6.00pm – registration, food and refreshments
6.00-8.00pm – focus group



I work on the CHI+MED project which is about making interactive medical devices (such as cancer drug pumps and blood glucose meters for people with diabetes) safer and more resistant to error. One of the offshoots of the project is Errordiary* which collects examples of everyday error, and they're running a focus group next Thursday 26 September 2013 to find out about error, and its prevention, in healthcare.

In most situations we pretty much accept that everyone makes mistakes. If I turn up to the tube station and pull out my keys instead of my rail card I've made a mistake. But no-one dies. There are no calls to retrain me and no-one's blaming me beyond some annoyed tuts from passengers behind me. The press is unlikely to scour my Facebook page for pictures of me drunk and incompetently trying to get through barriers with the wrong card, and I'm unlikely to lose my job.

We do tend to be a bit more blame-y towards people who work in healthcare when something goes wrong - these are highly trained individuals who are rarely 'permitted' to be human and make mistakes. When we ask for them to be sacked for incompetence we may be holding them to an impossible (unreasonably high) standard. When we ask for them to be retrained we may be wasting everyone's time if 'lack of training' had nothing to do with the fault in the first place.

No amount of training can really prevent me from grabbing the wrong thing but there are things that can be done to make it less likely.

Most people are right-handed so it's useful to have the control point on the right-hand side (that's building resilience into the system), and it's sensible for me to keep my card in my right pocket. I can also keep my keys in my bag, so they're separate from my rail card!

On a chemotherapy pump it's easy enough to type in a wrong number: 52 instead of 5.2. Perhaps the visual display of the decimal point could be clearer. Or the decimal point key on the keypad could be in a better position. Perhaps the drug library installed on the pump could flag up that the drug dose is much higher than expected.

Hopefully the user will notice anyway and re-enter the correct figure - but what can be done to increase the chances that any error is spotted?
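As a rough sketch of the drug-library idea mentioned above (this is entirely made up for illustration - the drug name and dose limits are invented, and real infusion pumps are far more sophisticated), a "dose plausibility" check might look something like this:

```python
# Toy sketch of a drug library dose check - drug names and
# limits are invented for illustration only.
DRUG_LIBRARY = {
    # drug: (typical_min, typical_max) dose in mg per hour
    "exampledrug": (1.0, 10.0),
}

def check_dose(drug, dose):
    """Return a warning string if the entered dose falls outside
    the expected range for this drug, otherwise None."""
    low, high = DRUG_LIBRARY[drug]
    if dose < low or dose > high:
        return (f"Warning: {dose} mg/h is outside the usual range "
                f"({low}-{high} mg/h) for {drug}. Please confirm.")
    return None

print(check_dose("exampledrug", 5.2))  # within range, no warning
print(check_dose("exampledrug", 52))   # 5.2 mistyped as 52 triggers a warning
```

The point isn't the code itself but the design principle: the system "knows" what a sensible dose looks like, so a slip like a misplaced decimal point gets a second chance to be noticed.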

Finding out more about the errors that people make, and how they avoid or recover from them, is a big part of tackling them and improving the safe use of devices.

Here's the blurb that I've pinched from my colleague Dom Furniss' post on the Errordiary focus group:
We are organising focus groups to find out more about what you think of human error, blame culture and resilience to error. We’re interested in mistakes – why we make them, how often we make them and what happens when we make them in trivial and serious contexts. For example:
  • How often do you make errors? All the time, never or somewhere in between?
  • What do you think about errors? Are they sometimes funny? What about when they happen at work?
  • What do you think about fatal errors reported in the news? What do you think should happen to people after they’ve made a serious error?
  • Should we share errors more? What are the pros and cons of this? What are the challenges?
  • What can we do to prevent errors happening in the future?
If you're a healthcare professional and free next Thursday please help, or tell a friend - thank you :)

*Errordiary
"To err is human…
To understand why we err and to try to reduce our erring is human too!

Errordiary is about sharing errors so people can think about human error in a new way. We already know that the same psychological principles lie behind everyday errors and those errors of a more serious nature. Whether they are funny, frustrating or fatal depends on the context."
From About Errordiary
Wiseman, S., Gould, S., Furniss, D., & Cox, A. (2012). Errordiary: Support for teaching human error. Paper presented at the Contextualised Curriculum Workshop at CHI 2012, Austin, Texas, May 2012.

You can see the latest tweets, tagged with #Errordiary (about errors) or #rsdiary (resilience strategies diary) too.

Further reading - preprints available to download as PDFs from the links given
Furniss, D., Back, J., & Blandford, A. (2012). Cognitive resilience: Can we use Twitter to make strategies more tangible? Proceedings of European Conference on Cognitive Ergonomics (ECCE 2012), 96–99. New York: ACM.

Lee, P. T., Thompson, F., & Thimbleby, H. (2012). Analysis of infusion pump error logs and their significance for health care. British Journal of Nursing (Intravenous Supplement), 21(8), S12-S20.

Furniss, D., Blandford, A., & Mayer, A. (2011). Unremarkable errors: Low-level disturbances in infusion pump use. Proceedings of the 25th BCS Conference on Human Computer Interaction (HCI-2011), 197–204.







Thursday, 1 October 2009

Is there a way to categorise harm (eg from woo remedies)?

How do we categorise harm? 

(i) If I eat some bits of a yew or some iffy fungus I'll be very unwell - this is harm arising because the substance itself is toxic (clearly dose has some impact).

(ii) If I drink a lot of grapefruit or orange juice while taking certain statin drugs I'll reduce the rate at which my body clears the statin from my system, this might cause a problematic increase in the drug - this is harm arising from the grapefruit interacting with the enzyme that's meant to be clearing statins from the body (interactions).

(iii) If I buy some dodgy herbal pills from the internet they might contain prescription-only medicines that I don't know about. The real medicine could have been withdrawn from sale, or could interact with other prescribed meds that I might be taking or something else - this is harm arising from insufficient information and also a bit of (i) and (ii).

(iv) If I have a potentially serious health problem but choose to take treatment from an unconventional healer then by delaying getting appropriate treatment I may become very ill - this is harm arising from failure to act to preserve health.

There are probably other nuanced versions of these - I'm wondering if there's a recognised typology of harm, in the same way that you can have a Type I or Type II error in statistics.

If not, can we make some up ourselves?
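Just to show what such a made-up typology might look like written down (these category names are entirely my own, not a recognised standard), here's a little sketch in Python:

```python
from enum import Enum

class Harm(Enum):
    """A home-made typology of harm from remedies - not an official classification."""
    DIRECT_TOXICITY = 1    # (i) the substance itself is toxic (dose-dependent)
    INTERACTION = 2        # (ii) interferes with another drug's handling in the body
    HIDDEN_INGREDIENT = 3  # (iii) undeclared or withdrawn medicines in the product
    DELAYED_TREATMENT = 4  # (iv) failure to act to preserve health in time

# e.g. the grapefruit juice + statins example would be classed as:
print(Harm.INTERACTION)
```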

Edit 31 August 2014
Just read an interesting post from Edzard Ernst looking more closely at the link between cardiac patients who are taking herbal remedies and their adherence to their prescribed medication. It looks like there may be a link (perhaps not surprising, though possibly not studied in depth before) and it seems that there's a correlation between taking herbal remedies and not taking prescribed medication appropriately. This could be dangerous and relates to (iv) in my imaginary taxonomy above.

A hitherto unknown risk of herbal medicine usage (31 August 2014) Edzard Ernst's blog

----

A project I work on (CHI+MED - making medical devices safer) looks at many aspects of medical safety, including human factors and systems thinking in handling medical errors. Specifically this involves designing systems and devices so that unavoidable user errors are more noticeable, so that people can recover from them.

The older 'blame culture' that's been prevalent in many healthcare systems has taken the view that error means someone's done their job wrongly, and the response has been to retrain them. If you've ever poured orange juice in your tea or forgotten your umbrella you can see immediately that this isn't a helpful view to take. Human error is pervasive (hence inevitable) and only rarely will training (or worse, sacking and getting in new people) fix it. Much better to learn from error and bolster systems to protect against it.

To a certain extent Google does this every time you mistype something and it says "did you mean?", and spellcheckers do something similar for Word documents. In both cases the system has a design function that acknowledges the possibility of mistyping and offers an alternative or solution. Similarly most keyboards have a delete key to let you undo, and even pencils have an eraser on the end of them.
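The "did you mean?" trick can be sketched in a couple of lines with Python's standard difflib module (a toy version, nothing like Google's actual system - the word list here is just made up):

```python
import difflib

# A tiny made-up dictionary of known words
known_words = ["insulin", "infusion", "keyboard", "umbrella"]

# The system anticipates typos and suggests the closest known word
typo = "insluin"
suggestions = difflib.get_close_matches(typo, known_words, n=1)
print(suggestions)  # ['insulin']
```

Same principle as the delete key or the pencil eraser: the design assumes the error will happen and builds in a route to recovery.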

We've found a really nice way of talking about error that doesn't involve blame - the dumb things we do every day tend to be quite funny and no-one really seems to mind poking fun at themselves for doing something silly. And lo and behold, the cognitive processes involved in making these everyday errors are pretty much identical to those often involved in medical error - so we can learn from them too - have a look at the #errordiary hashtag and the Errordiary website which explains more.