Mindstrong: An Application of AI in Wellness
Mindstrong is using AI and "digital phenotyping" to help diagnose mental health conditions
Following a decade of rapid growth in products ranging from step and heart-rate monitors to sleep and mood trackers, the wellness technology sector is now seeing an emerging wave of technologies that incorporate artificial intelligence (AI).
Underlying the growth of wellness technology is the belief that wellness is, as Dr. Bill Hettler, co-founder of the National Wellness Institute, describes it, a “conscious, self-directed and evolving process of achieving full potential.” This view of wellness aligns with the modern do-it-yourself culture, which has given rise to many now-familiar devices, including wearables such as the Fitbit and Apple Watch that can use data analytics to predict an individual’s risk of diabetes. Cognitive wellness is also receiving increasing attention, in light of historical data showing suicide to be a leading cause of death in the U.S. and antidepressants to be the third most commonly taken prescription drug among Americans. A host of new technologies has moved to address these growing concerns, including relative newcomer AI startup Mindstrong Health (“Mindstrong” or the “Company”).
Mindstrong’s technology tracks users’ behaviors and interactions on their smartphones as a means of early detection of depression and other mental health conditions. Through a process called “digital phenotyping,” the app monitors keystrokes, taps, swipes, and scrolls, then analyzes them using machine learning to find correlations with mental health. For instance, the app flags possible memory problems by tracking typing speed, the number of errors made, the frequency of character deletion, the use of punctuation, and the speed of switching to other apps. The results are shared with patients and their respective medical providers. Currently, users can access the application only through referral by a healthcare provider, who monitors the patient for indicators of mental health conditions. For Mindstrong, AI has served as a powerful product-development tool for mapping aggregated, continuous user data into identified patterns of behavior. Furthermore, studies conducted by the company to date suggest that in some instances the app can predict how a person will feel a week into the future based on neurocognitive markers.
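Mindstrong's actual models and data schema are proprietary, but the general idea of turning raw interaction events into behavioral features can be sketched in a few lines. Everything below is an illustrative assumption — the event fields (`timestamp`, `is_delete`) and the two features (typing speed, deletion ratio) are stand-ins loosely analogous to the signals described above, not the company's real pipeline.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class KeyEvent:
    """One keyboard interaction. Fields are hypothetical, not Mindstrong's schema."""
    timestamp: float   # seconds since the start of the typing session
    is_delete: bool    # True if the user deleted a character


def typing_features(events: List[KeyEvent]) -> Dict[str, float]:
    """Aggregate one session of key events into simple behavioral features:
    keystrokes per second (typing speed) and the fraction of events that were
    deletions (a rough proxy for error/correction frequency)."""
    if len(events) < 2:
        return {"keys_per_sec": 0.0, "delete_ratio": 0.0}
    duration = events[-1].timestamp - events[0].timestamp
    deletes = sum(1 for e in events if e.is_delete)
    return {
        "keys_per_sec": len(events) / duration if duration > 0 else 0.0,
        "delete_ratio": deletes / len(events),
    }


# Example session: 10 keystrokes over 4.5 seconds, 2 of them deletions.
session = [KeyEvent(timestamp=t * 0.5, is_delete=t in (3, 7)) for t in range(10)]
print(typing_features(session))
```

In a real system, per-session feature vectors like these would be collected continuously and fed to a trained model that correlates their drift over time with clinical outcomes; the feature extraction shown here is only the first, simplest step.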
Mindstrong’s management team is focused on building a robust data set to further support its methods of applying AI in product development. The company has five years of clinical study data to validate its technology, and in March it started running tests with patients and their doctors. The company is initially focused on working with seriously ill individuals who are at risk of relapse into depression, schizophrenia, and substance abuse. To fuel further growth, specifically toward covering more conditions and developing its predictive capabilities, the company continues to run tests and to publish peer-reviewed research; to date, it has only tested its app in controlled clinical settings and trials. Additionally, the company is focused on signing on customers, including approximately 15 counties in California that have committed to spending ~$60 million over the next four years to implement Mindstrong and other applications in their healthcare systems.
Looking forward, there are a number of ways that management can continue to incorporate AI into Mindstrong’s product development. The company may consider using more than one modality (e.g., voice as well as gestures) to expand its data set and provide more accurate diagnoses. Additionally, the company may consider expanding from pure diagnosis to more preventative treatment measures. For instance, Mindstrong could offer other AI-based technologies such as a “virtual therapist” able to process and analyze emotional cues derived from facial expressions and tone of voice. Finally, as the company continues to expand and grow, management should enforce strong privacy protections to address concerns stemming from the collection of massive quantities of continuous data that point toward specific mental health diagnoses.
As we consider the implications of AI in wellness and in the case of Mindstrong, one question that comes to mind is whether this specific implementation of AI will be effective as a preventative measure in the treatment of mental health conditions. Which aspects of the treatment of mental conditions can be aided by AI, and which aspects cannot? Is the use of AI in this manner able to create true behavioral change (on the part of the doctors, patients, and other parties)?
Nationalwellness.org. (2018). The Six Dimensions of Wellness – National Wellness Institute. [online] Available at: https://www.nationalwellness.org/page/six_dimensions [Accessed 13 Nov. 2018].
Hobbs, A. (2018). Fitbit and Apple Watch can help predict diabetes, says report | Internet of Business. [online] Internet of Business. Available at: https://internetofbusiness.com/deepheart-fitbit-apple-watch-predict-diabetes-risk/ [Accessed 13 Nov. 2018].
Medium. (2018). The Emerging Artificial Intelligence Wellness Landscape: Opportunities and Areas of Ethical Debate. [online] Available at: https://medium.com/@lkcyber/the-emerging-artificial-intelligence-wellness-landscape-802caf9638de [Accessed 13 Nov. 2018].
CNet.com. (2018). Mindstrong Health app offers early detection for depression based on smartphone interactions. [online] Available at: https://download.cnet.com/blog/download-blog/mindstrong-health-app-offers-early-detection-for-depression-based-on-smartphone-interactions/ [Accessed 13 Nov. 2018].
STAT. (2018). Mindstrong’s mood-predicting app is shadowed by questions over evidence – STAT. [online] Available at: https://www.statnews.com/2018/10/04/mindstrong-questions-over-evidence/ [Accessed 13 Nov. 2018].
Metz, R. (2018). The smartphone app that can tell you’re depressed before you know it yourself. [online] MIT Technology Review. Available at: https://www.technologyreview.com/s/612266/the-smartphone-app-that-can-tell-youre-depressed-before-you-know-it-yourself/ [Accessed 13 Nov. 2018].
Student comments on Mindstrong: An Application of AI in Wellness
Very interesting post! I’m surprised to hear that Mindstrong is able to identify people’s mental health conditions through their keystrokes, taps, swipes, and scrolls. Do people with, for instance, depression really interact with their phones in similar ways? People start off with such different smartphone usage habits, so I’m guessing the app tracks how usage changes. Additionally, it will be interesting to see how this technology adapts as smartphone usage (the baseline) changes. I would also be curious to learn how knowing you have this app running on your phone alters how people interact with their smartphones.
This concept is very interesting and potentially very useful. Wondering, though, if people would be concerned about the “big brother”-like model that this confers (monitoring all keystrokes, taps, swipes, and scrolls). Additionally, people would have to opt in to this type of monitoring, which I can see being difficult to convince adopters of – particularly patients with schizophrenia. That said, anything that can help prevent suicide or identify high-risk people is worth pursuing.
This is a fascinating application of machine learning. I think this is important for two reasons. 1) There is so much stigma attached to asking for help that this is a way for someone to get help when they are incapable of asking for it themselves. 2) Considering how many objective and diagnostic tests we have for other illnesses while mental health is typically diagnosed through self-reported or more subjective diagnostic tools, I could see this being a useful tool for a more objective type of diagnosis. It could be a great way to sense the beginning of a crisis and respond accordingly, and I could even see this connected to crisis hotlines or peer support groups. I do worry about data privacy concerns. Do we think there are any downsides to this? On the one hand, it’s incredibly important. On the other hand, if this data gets hacked or stolen I could see massive and horrifying openings for abuse.
Fascinating topic! I think there is so much that AI could do to help humanity, and this is a great example. In particular, where AI helps is early pattern recognition, alerting humans to dig deeper where needed. While I think the use of AI in detecting disease can be helpful, the risk of not passing the same information through human interpretation is also high (in the case of false negatives – i.e., there was a disease but the machine did not catch it). Where I think machines could really add value is in detecting mental health issues that are not quite diseases, e.g., a deterioration in mental health that is worth noting and correcting. Currently this part of mental health is largely ignored, because there are just not enough resources to look at everyone in their daily lives. AI can scan this vast amount of information and recognize patterns much faster, provide recommendations for individuals to consider, and, if warranted, send them to doctors for further evaluation. Would love to hear your thoughts on how AI can help the “quality of life” part of mental health!