The potential and pitfalls of AI.


I wrote a piece for the Australian College of Nursing’s (ACN) quarterly publication. Cite as: DeSouza, R. (Summer 2019/20 edition). The potential and pitfalls of AI. The Hive (Australian College of Nursing), 28(10-11).

Many thanks to Gemma Lea Saravanos for the photo.

The biggest opportunity that Artificial Intelligence (AI) presents is not the elimination of errors or the streamlining of workloads but, paradoxically, a return to caring in health. As machines become better at diagnosis and other aspects of care, health professionals will no longer need to be brilliant, and the need for emotional intelligence will become more pressing.

In his book Deep Medicine, cardiologist Eric Topol recounts how he grew up with osteochondritis dissecans (OCD), a disabling chronic condition. At 62, a knee replacement surgery went badly wrong, followed by an intense physical therapy protocol that caused devastating pain and distress, leaving him screaming in agony. Topol tried everything to get relief; his orthopaedic surgeon advised him to take antidepressants. Luckily, his wife found a book called Arthrofibrosis, which explained why he was suffering: a rare inflammatory complication affecting 2-3% of people after a knee replacement. His surgeon could only offer him further surgery, but a physiotherapist with experience of working with people with OCD offered a gentler approach that helped him recover. AI could have helped him by creating a bespoke protocol that took into account the history his doctor did not. The problems of health care won’t be fixed by technology alone, but the paradox is that AI could help animate care, in contrast to the robotic health professionals Topol had to deal with in his quest for recovery.

The three D’s

Nursing practice is being radically transformed by new ways of knowing, including Artificial Intelligence (AI), algorithms, big data, genomics and more, bringing moral and clinical implications (Peirce et al., 2019). On one hand, these developments have massive benefits for people, but they also raise important ethical questions for nurses, whose remit is to care for patients (Peirce et al., 2019). To stay aligned with their values and remain patient centred, nurses need to understand the implications of what Topol calls the three D’s: the digitisation of human beings, as technological developments such as sensors and sequencing digitally transform health care; the democratisation of medicine, as patients’ knowledge of themselves becomes their own possession rather than the health system’s; and lastly, deep learning, which involves pattern recognition and machine learning.

Data is fundamental to AI

Massive amounts of data are being collected - from apps, wearable devices, medical-grade devices, electronic health records, high-resolution images and whole genome sequences. Increased computing capability enables the effective analysis and interpretation of these data, and therefore the making of predictions.

Artificial Intelligence (AI) encompasses a range of technologies that work on data to make predictions from patterns. Alan Turing, who is regarded as the founding father of AI, defined it as the science of making computers intelligent; in health, AI uses algorithms and software to help computers analyse data (Loh, 2018).

Applications of AI
Data are transforming health in two key ways:

Enhancing patient care - from improving decision-making and making diagnosis more effective and accurate to recommending treatment.
Systemising onerous tasks to make systems more efficient for health care professionals and administrators.

Applications are emerging, including automated diagnosis from medical imaging (Liu et al., 2019), surgical robots (Hodson, 2019), and attempts to predict intensive care unit (ICU) mortality and 30-day psychiatric readmission from unstructured clinical and psychiatric notes (Chen, Szolovits, & Ghassemi, 2019). Systems trained on millions of examples are diagnosing skin cancer, detecting heart rhythm abnormalities, interpreting medical scans and pathology slides, diagnosing diseases, and predicting suicide using pattern recognition.

These systems overcome the disadvantages of being human - for example, being tired or distracted. And from a knowledge translation point of view, rather than waiting decades for knowledge to trickle down from research into practice, steps could be automated and more personalised (Chen et al., 2019).

AI can also be used to better serve populations who are marginalised. We know that not everyone is included in the gold standard of evidence, the randomised trial; trials are therefore not representative of entire populations, so therapies and treatments may not be tailored to marginalised groups (Chen et al., 2019; Perez, 2019).

Potential for algorithmic bias in health
However, the large annotated data sets on which machine learning models are trained aren’t necessarily inclusive. Image classification through deep neural networks, for example, may be trained on ImageNet, which has 14 million labelled images. Natural language processing requires algorithms trained on data sets scraped from websites, usually annotated by graduate students or via crowdsourcing, which unintentionally produces data that embeds gender, ethnic and cultural biases (Zou & Schiebinger, 2018).

This is partly because the workforce that designs, codes, engineers and programs AI may not come from diverse backgrounds, and the future workforce is also a concern, as gender and ethnic minorities are poorly represented in schools and universities (Dillon & Collett, 2019).

Zou and Schiebinger (2018) cite three examples where AI applications systematically discriminate against specific populations: gender biases in the way Google Translate converts Spanish-language items into English; software in Nikon cameras that alerts photographers when their subject is blinking, which identifies “Asians” as always blinking; and word embedding, an algorithm for processing and analysing natural-language data, which identifies European American names as “pleasant” and African American ones as “unpleasant”.

Other similar contexts include crime and policing technologies and financial-sector technologies (Eubanks, 2018; Noble, 2018; O’Neill, 2016); see also Buolamwini and Gebru (2018). But how does one counter these biases? As Kate Crawford (2016) points out, “algorithmic flaws aren’t easily discoverable: How would a woman know to apply for a job she never saw advertised? How might a black community learn that it were being overpoliced by software?”

Individual clinicians can also make systematically biased decisions, but they temper these with clinical judgement, reflection, past experience and evidence.

Digital literacies for an ageing workforce
We have a crisis in health care, and in nursing. Technocratic business models, with change imposed from above, are contributing to “callous indifference” (Francis, 2013). Calls to reinstate empathy and compassion in health care, and to ensure care is patient-centred, reflect that these qualities are too often absent from care.

In the meantime, we have had Royal Commissions into aged care, disability and mental health. For AI to be useful, it’s important that nurses understand how technology is going to change practice. Nurses already experience high demands and complexity in their work, so technological innovations driven from the top down risk alienating them and burning them out further (Jedwab et al., 2019). We will also have to develop new models of care that are patient centred, and co-designing these innovations with diverse populations will become increasingly important.

Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. In S. A. Friedler & C. Wilson (Eds.), Proceedings of the 1st Conference on Fairness, Accountability and Transparency (pp. 77–91).
Chen, I. Y., Szolovits, P., & Ghassemi, M. (2019). Can AI Help Reduce Disparities in General Medical and Mental Health Care? AMA Journal of Ethics, 21(2), E167–E179.
Crawford, K. (2016, June 25). Opinion: Artificial intelligence’s white guy problem. The New York Times.
Dillon, S., & Collett, C. (2019). AI and gender: Four proposals for future research.
Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor.
Hodson, R. (2019). Digital health. Nature, 573(7775), S97.
Jedwab, R. M., Chalmers, C., Dobroff, N., & Redley, B. (2019). Measuring nursing benefits of an electronic medical record system: A scoping review. Collegian, 26(5), 562–582.
Liu, X., Faes, L., Kale, A. U., Wagner, S. K., Fu, D. J., Bruynseels, A., … Denniston, A. K. (2019). A comparison of deep learning performance against health-care professionals in detecting diseases from medical imaging: a systematic review and meta-analysis. The Lancet Digital Health, 1(6), e271–e297.
Loh, E. (2018). Medicine and the rise of the robots: a qualitative review of recent advances of artificial intelligence in health. BMJ Leader, 2(2), 59–63.
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism.
O’Neill, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. New York, NY: Crown Publishing Group.
Peirce, A. G., Elie, S., George, A., Gold, M., O’Hara, K., & Rose-Facey, W. (2019). Knowledge development, technology and questions of nursing ethics. Nursing Ethics, 969733019840752.
Perez, C. C. (2019). Invisible women: Exposing data bias in a world designed for men.
Zou, J., & Schiebinger, L. (2018). AI can be sexist and racist — it’s time to make it fair. Nature, 559(7714), 324–326.


Advice to a student nurse

My response to a student nurse who was haunted by questions about becoming a nurse. Published in Kai Tiaki: Nursing New Zealand 13.1 (Feb 2007): p4(1).

I was pleased to see [x]’s letter, Questions haunt nursing student, in the December/January 2006/2007 issue of Kai Tiaki Nursing New Zealand (p4). The questions she has reflected on indicate she is going to be an amazing nurse.

I believe nursing is both an art and a science, and our biggest tools are our heart and who we are as human beings. I was moved by her letter and thought I’d share my thoughts. The questions she posed were important because the minute we stop asking them, we risk losing what makes us compassionate and caring human beings.

Let me try to give my responses to some of the questions Lisa raised–I’ve been reflecting on them my whole career and continue to do so.

1) Can a nurse “care” too much?

Yes, when we use caring for others as a way of ignoring our own “issues”. No, when we are fully present in the moment when we are with a client.

2) Don’t patients deserve everything I can give them?

They deserve the best of your skills, compassion and knowledge. Sometimes we can’t give everything because of what is happening in our own lives, but we can do our best and remember we are part of a team, and collaborate and develop synergy with others, so we are resourced and can give our best.

3) How do I protect myself and still engage on a deeper level with the patient?

I think we have to look after our energy and maintain a balance in our personal lives, so we can do our work well. We also need healthy boundaries so we can have therapeutic communication.

4) How do I avoid burnout?

Pace yourself, get your needs met outside work, have good colleagues and friends, find mentors who have walked the same road to support you. I’ve had breaks from nursing so I could replenish myself.

5) Why can’t I push practice boundaries, when I see there could be room for adjustment or improvement?

I think you can and should, but always find allies and justification for doing something. Sometimes you have to be a squeaky wheel.

6) Isn’t it okay to feel emotionally connected to the patient?

Yes, it is okay to feel emotionally connected to the patient, but we also have to remember that this is a job and our feelings need transmutation into the ones we live with daily.

7) Don’t I need to continually ask questions, if nursing is to change, or will that just get me fired?

Yes, you do have to ask questions but it is a risky business. Things don’t change if we don’t have pioneers and change makers.

8) Finally, am I just being a laughable year-one student with hopes and dreams, and in need of a reality check?

No, your wisdom and promise are shining through already and we want more people like you. Kia Kaha!

The ‘small’ things count in caring

Editorial published in Kai Tiaki: Nursing New Zealand 8.10 (Nov 2002): p28(1).

KAI TIAKI Nursing New Zealand has recently carried narratives written by nurses discussing their experiences as recipients of health care, e.g. “My Journey of Pain” by Glenis McCallum (July 2002, p16). These experiences gave the nurses the opportunity to re-examine their practice and to reclaim their empathy.

Similarly, a personal experience provided the impetus to write this brief piece. I recently had the opportunity to re-evaluate my own beliefs about nursing and the importance of communication and caring when I witnessed my sister receiving care in a hospital maternity setting. What came across was the importance of the “small” things–the caring and the communication, and the importance of compassion and empathy. The sweetness of the person who opened the door to the unit and said “welcome to our world”. The rudeness, almost surliness, of the nurses who forgot to introduce themselves or tell us what was happening.

Rightly, there is much focus on nursing as a profession, yet is it possible that in this debate we have forgotten the small things that really matter to our clients - the things that make people feel safe and cared for?

This personal and professional interest was further piqued by two workshops held in Auckland recently that focused on maternal mental health issues. Both highlighted the important role nurses have to play when caring for women experiencing childbirth.

In the first workshop, organised by the education and support group, Trauma and Birth Stress (TABS), 170 consumers and health professionals gathered to explore post-traumatic stress disorder (PTSD) after childbirth. The group TABS was formed by women who had all experienced stressful and traumatic pregnancies or births that had negatively affected their lives for months or even years after the experience. One of TABS’s aims is to educate health professionals on the distinctions between PTSD and post-natal depression so the chance of misdiagnosis is lessened and correct treatment is started quickly.

Speakers at the workshop included an international nursing researcher from the United States, Cheryl Beck. A number of New Zealand women have shared their stories of PTSD with Beck and have found telling their stories and having someone understand and believe them has been very therapeutic. Other speakers included TABS member Phillida Bunkle and Auckland University of Technology midwifery lecturer Nimisha Waller, who spoke on how midwives can assist mothers with PTSD.

In my role at UNITEC Institute of Technology, I organised the second workshop, which also featured Beck. Entitled “Teetering on the edge: Postpartum depression–assessment and best practice”, the workshop attracted around 100 nurses, midwives, GPs and consumers. A professor in the School of Nursing at the University of Connecticut, Beck has for many years focused her efforts on developing a research programme on postpartum depression. Using both qualitative and quantitative research methods, she has extensively researched this devastating mood disorder that affects many new mothers. Based on the findings from her series of qualitative studies, she has developed the postpartum depression screening scale (PDSS). Currently Beck’s research is focused on PTSD after childbirth and she presented her work to date. In September, there were 27 participants in the study, 18 from New Zealand and the rest from the United States.

The themes of her presentation were a reminder of the dramatic negative consequences of occurrences we as health professionals deal with frequently. Emergency situations arise and we all do our job, often without a second thought as to the future impact of our actions (or inactions) on the woman and her family.

Beck also spoke at the TABS workshop. The response to both workshops was really positive. Workshops such as these, where the long-term impacts of the health care experience are discussed, can act as a reminder for anyone working with women at and around the time of childbirth to critically view their practice and that of their colleagues. Themes that feature in the research are caring, communication and competence - the very things that were absent in my recent experience of the health system. Women in the study felt they were not shown caring, communication from health providers was poor, and they perceived their care as incompetent.

Through her research, Beck poses the question so many mothers ask: “Was it too much to ask to care for me?” As health professionals, we need to ask ourselves every day “how can I care for the needs of this client?”, because nursing is not just a profession, it is a caring profession.

* For further information on TABS