The quest to simulate human intelligence in machines has introduced numerous industries to the power of artificial intelligence (AI). In healthcare and pharmaceutical settings, the past few years have witnessed the advancement and adoption of AI platforms, most notably chat and voice solutions, which improve patient/physician engagement, streamline access to resources and promote patient self-efficacy throughout the care journey. Other integral AI-powered resources include electronic health records (EHRs), wearable devices such as Fitbit, symptom checkers and more.
This remodeling has triggered high demand (and even an expectation) for chat and voice solutions beyond patient and provider engagements. Today, 69% of consumers prefer conversational AI for the quick communication it offers. Over the next three years, a projected 70% of consumers will replace visits to the dealer, store and bank with voice assistants, and in healthcare, 51.9% of consumers are interested in using voice assistants. 2020's unprecedented convergence of political, economic and regulatory forces has triggered widespread and progressive conversations around tech adoption. The Covid-19 pandemic and subsequent recession have increased the need for innovative, cost-effective solutions that still maintain the security and privacy of health data.
Prior to the pandemic, the stringent privacy requirements of HIPAA, including the Privacy Rule that took effect in 2003, severely limited the exchange of health data to protect patients' personal information. The easing of these regulations in response to Covid, compounded by growing patient and physician demand for virtual care and engagement, has cleared the path for chat and voice solutions, which have now demonstrated their ability to support patients in the provider and payer settings.
Unfortunately, pharma has yet to see any similar leeway from the FDA, which has intensified the years-long debate: do strict regulations hinder the creation of tomorrow's solutions?
From clinical research through FDA post-market safety monitoring, pharma companies are mandated to report all adverse events (AEs), with strict fines and penalties if they fail to do so. Serious AEs must be reported within 24 hours, and alert reports must be submitted within 15 calendar days. These requirements have limited pharma's adoption of AI and reduced chatbots to button-driven experiences.
Pharmacovigilance is expansive and requires broad communication with patients and healthcare providers (HCPs). Many pharma companies have teams of people monitoring the various points of communication to detect, confirm and escalate AEs. This is an expensive undertaking, and one that faces mounting challenges as the rest of healthcare transitions to virtual solutions.
As the Covid crisis has evolved and reliance on chat and voice tools remains firmly entrenched, key players have entered the market to deliver solutions that use the power of AI and elevate pharma to meet patients and HCPs in this digital space, while adhering to FDA regulations. With advances in machine learning and stronger natural language processing algorithms, pharma can now address at least one part of these AE reporting requirements: detection. Enabled as a monitoring capability within chat and voice experiences, AE detection modules are an initial step for pharma to enable immersive conversational experiences for patients and HCPs alike.
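To make the idea concrete, an AE detection module of this kind might begin with a simple keyword screen that flags candidate chat messages for human pharmacovigilance review. The sketch below is a minimal, hypothetical illustration: the `AE_TERMS` list, the `AEFlag` structure and the `detect_adverse_event` function are all assumptions for this example, not a validated medical vocabulary or a vendor's actual API. A production system would draw on a curated terminology such as MedDRA and a trained NLP classifier rather than a hand-written term list.

```python
# Minimal sketch of an adverse-event (AE) screening pass over chatbot
# messages. Flagged messages would be routed to a human pharmacovigilance
# queue for confirmation and escalation; nothing here replaces that review.
import re
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical seed terms. A real deployment would use a curated medical
# dictionary (e.g., MedDRA preferred terms) plus a trained classifier.
AE_TERMS = [
    "rash", "nausea", "dizzy", "dizziness", "vomiting",
    "headache", "swelling", "shortness of breath",
]

@dataclass
class AEFlag:
    """A message flagged for human review, with the terms that matched."""
    message: str
    matched_terms: List[str]

def detect_adverse_event(message: str) -> Optional[AEFlag]:
    """Return an AEFlag if the message mentions any seed term, else None."""
    text = message.lower()
    # Word boundaries keep "dizzy" from matching inside unrelated words.
    hits = [t for t in AE_TERMS
            if re.search(r"\b" + re.escape(t) + r"\b", text)]
    return AEFlag(message, hits) if hits else None

# Example: screening an incoming patient message.
flag = detect_adverse_event("I started the new dose and now I feel dizzy.")
```

Keeping detection (automated, high-recall) separate from confirmation and escalation (human-led) mirrors the division of labor described above: the module only narrows the stream of conversations that pharmacovigilance teams must monitor.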
While acceptance of this movement and adoption of chat and voice solutions have been gradual, the climate is becoming more progressive and favorable toward change from every angle. Innovative entities outside of the pharma realm can support these companies by introducing and augmenting chat and voice services that maintain the integrity of the pharmacovigilance process while adjusting to the major technology shifts accelerated by the changes of 2020.
About the author
Elise Whitaker is VP of Customer Success, Practice Lead, Pharmaceuticals & Life Sciences, at Orbita. She would like to thank Hayden Meinero for his contributions in putting together this article.