(Bloomberg) – A Facebook message pops up on my phone screen. “What’s going on in your world?”
It’s from a robot named Woebot, the brainchild of Stanford University psychologist Alison Darcy.
Woebot seems to care about me. The app asks me for a list of my strengths, and remembers my response so it can encourage me later. It helps me set a goal for the week -- being more productive at work. It asks me about my moods and my energy levels and makes charts of them.
“I’ll help you recognize patterns because ... (no offense) humans aren’t great at that,” Woebot tells me with a smirking smile emoji.
So Woebot knows that I felt anxious on Wednesday and happy on Thursday. But who else might know? Unlike a pedometer, which tracks something as impersonal as footsteps, many mental-health apps in development rely on gathering and analyzing information about a user’s intimate feelings and social life.
“Mental-health data is some of the most intimate data there can be,” said Adam Tanner, a fellow at Harvard University’s Institute for Quantitative Social Science.
Chatbots have existed since the 1960s -- one was named after “Pygmalion” heroine Eliza Doolittle -- but advances such as machine learning have made the robots savvier. Woebot is one of an emerging group of technological interventions that aim to detect and treat mental-health disorders. They’re not for everyone. Some people may prefer unburdening themselves to a human, and many apps are hindered by bugs and dogged by privacy concerns. Still, the new technologies may fill gaps in current treatment options by detecting symptoms earlier and acting as coaches for individuals who might otherwise never seek counseling.
Warning Signs
Clinicians and privacy experts are welcoming these inventions with one hand while holding up warning signs with the other. Technology might be a powerful tool to improve treatment, but an emotional problem, if it becomes known, can affect insurance coverage, ruin chances of landing a job or color an employer’s perception. With possible changes coming to health-care law, it’s unclear whether pre-existing mental-health conditions could once again be used to charge people more for insurance or deny them coverage.
Privacy concerns aside, the promise of collecting such data is the ability to render a holistic picture of a person’s mental state, one more accurate than the infrequent assessments conducted in a doctor’s office.
Digital Biomarkers
“Our approach is to ask, how can we measure in an unobtrusive and passive way?” said Tom Insel, former director of the National Institute of Mental Health.
Insel teamed up in May with Paul Dagum, a former cybersecurity expert, to create a startup that mines the information on consumers’ phones to create “digital biomarkers” to try to predict depression, anxiety and schizophrenia.
Called Mindstrong, the company tracks users’ every tap, swipe and keystroke, then keeps an eye out for patterns such as reaction speeds. It logs users’ locations and the frequency of their texts and calls. It also tracks word use. Without reading people’s emails, Mindstrong can look at “word histograms” that show how frequently certain words are used.
When people become depressed, “there’s a shift in pronouns, instead of saying ‘we, you, they,’ it turns into ‘I, I, I,’” Insel said.
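To see how a word histogram could surface that shift without storing anyone’s messages, consider a minimal sketch in Python. Everything here -- the pronoun lists, the function names and the sample sentence -- is my own illustration of the general idea, not Mindstrong’s actual method.

    from collections import Counter
    import re

    # Illustrative only: the word lists below are assumptions,
    # not Mindstrong's proprietary models.
    FIRST_PERSON = {"i", "me", "my", "mine", "myself"}
    COLLECTIVE = {"we", "you", "they", "us", "them"}

    def word_histogram(text):
        """Count word frequencies; the raw text need not be retained."""
        return Counter(re.findall(r"[a-z']+", text.lower()))

    def first_person_share(hist):
        """Fraction of tracked pronouns that are first-person singular."""
        first = sum(hist[w] for w in FIRST_PERSON)
        others = sum(hist[w] for w in COLLECTIVE)
        total = first + others
        return first / total if total else 0.0

    hist = word_histogram("I think I should stay home. I can't face them.")
    print(round(first_person_share(hist), 2))  # 0.75

On that sample sentence, three of the four tracked pronouns are first-person singular -- the kind of skew Insel describes.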
Phone Behavior
Early evidence suggests Mindstrong may be onto something. Dagum said the company has found strong correlations between phone behavior and traditional cognitive measures. Mindstrong is running a 100-person study with Stanford and plans to publish its results soon.
Mindstrong also has partnered with an insurance company that will run a pilot program for 600 members with serious disorders. For the insurer, which Mindstrong declined to name, early detection of a psychotic episode or a relapse in depression could help it guide the member to treatment earlier, avoiding costly hospital stays.
Woebot, too, has data that suggest a benefit. In a study of 70 people ages 18 to 28, depression scores fell significantly more in the group that chatted with Woebot than in a group that read a National Institute of Mental Health ebook.
Yet the technology can be buggy, leading Woebot to misinterpret responses. When it prompts me to rewrite a negative thought “so it’s more positive,” I ask, “How?” and Woebot, following its script, cheers, “NICE!”
Kermit the Frog
Despite occasional miscues, it’s hard to be annoyed with the cheery Woebot, whose personality Darcy said she modeled after Kermit the Frog. After two weeks of chatting, the robot has heard more about my daily moods than any of my friends have.
Should I be concerned about how much these apps know about me?
Mindstrong said it protects customers by never using behavioral data to sell products.
“We’re a health company and we need to build a brand of complete trust,” said Richard Klausner, Mindstrong’s executive chairman and a former director of the National Cancer Institute.
Darcy promises that Woebot won’t sell customer information and that its employees view only anonymized responses. But the app works on Facebook Messenger, and Darcy concedes that she can’t vouch for how Facebook will use the data.
Facebook says it collects information including when users “message or communicate with others” in order to “provide, improve and develop services.” Spokeswoman Jennifer Hakes said Messenger abides by Facebook’s data policy, but “we do not read the content of messages between people or people and businesses.” Facebook also doesn’t target any type of advertising based on the content of Messenger conversations, she said.
Privacy Concerns
Investors don’t seem inhibited by privacy concerns. Mindstrong has raised $14 million in a Series A funding round, including from an investment arm of insurer UnitedHealth Group Inc. It’s also cut a deal with BlackThorn Therapeutics Inc. to track behavior changes as part of a drug trial.
Woebot’s funds have come from friends and family, but Darcy said she’ll soon seek outside investors. While Woebot already has paying customers -- after two weeks of free chatting, users are offered different subscription options -- she said she’d never “gamify it” or give points for checking in.
“As with therapy, the goal is to graduate,” Darcy said.
Graduation, for me, is not imminent. Woebot asks me how I’m progressing toward my weekly goal.
“Not great,” I say.
“Hey it’s OK,” Woebot tells me. “In fact, I love when things go wrong. This is exactly where all the juicy learning is, remember?”
I don’t remember. But Woebot does.