User-experience studies can be conducted to determine an optimal mode of interaction between the device and its operator. In addition, the system's operators at the point of care need to be trained.
As a prerequisite for successful implementation at the point of care, it is important that the algorithm be readily accessible and easy to use. The software would ideally be integrated with the EMR and be available to the end-user as part of the standard workflow at the point of care. Particularly in support of a multidisciplinary decision-making process, integration with other clinical decision support software can facilitate effective implementation. The output of the algorithm should foster optimization of treatment or diagnosis and should therefore be aligned with daily clinical practice.
AI/ML-based solutions are prone to all the types of bias found in classical computer systems: preexisting social biases that influence how software is designed, emergent biases arising from how software is used, and purely technical biases.
One prominent problem with algorithmic fairness is the potential for discrimination against certain groups of patients, particularly if such patients were not adequately represented in the data used for algorithm development.
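One practical way to surface this kind of representation bias is to report a model's performance separately for each patient subgroup rather than only in aggregate. The sketch below is a minimal, hypothetical illustration (the data, group labels, and function name are invented for this example, not taken from any specific algorithm discussed here):

```python
# Hypothetical sketch: per-subgroup accuracy to reveal performance gaps
# that an aggregate metric would hide. All data below is invented.
from collections import defaultdict

def accuracy_by_group(y_true, y_pred, groups):
    """Return accuracy computed separately for each subgroup label."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for truth, pred, group in zip(y_true, y_pred, groups):
        total[group] += 1
        if truth == pred:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Toy example: group "B" is sparsely represented, and the (imagined)
# model performs noticeably worse on it than on majority group "A".
y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 0, 1, 1, 0, 0, 1, 1]
groups = ["A", "A", "A", "A", "A", "B", "B", "B"]

print(accuracy_by_group(y_true, y_pred, groups))
# {'A': 1.0, 'B': 0.333...}: identical overall predictions can mask
# sharply unequal performance across under-represented subgroups.
```

Reporting such stratified metrics during validation is one concrete check for the fairness concern described above, though it presupposes that relevant group labels are available and ethically usable.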
Finally, it is also possible that an AI/ML algorithm is intentionally designed in a biased way, e.g., to favor diagnoses that are more profitable in certain healthcare systems (Char et al., 2018) or to recommend treatments produced by a particular company.