Thursday, 19 October 2017

RLC Articles

The RLC produces a newsletter each month that offers monthly calendars, announcements, and articles. Although old newsletters are archived on the website, the articles - often on important advocacy-related issues - run the risk of being buried and difficult to find. To make that information more accessible, we also list RLC-published articles from the newsletter (and other relevant sources) here for your ease of perusal.

 

Force in Massachusetts

Massachusetts is one of five states that still don’t have involuntary outpatient commitment laws in place.
But how long will that last?
 
On July 11, people gathered at the State House in Boston for a hearing seeking community input on several pieces of legislation currently being proposed. These included a proposal from Representative Mathew Muratore to implement Involuntary Outpatient Commitment.
 
Involuntary Outpatient Commitment (IOC) is often referred to as ‘Assisted Outpatient Treatment’ or ‘AOT’ in an effort to help it sound more benign, but the truth is that it can include forced drugging and a number of other measures that people might find invasive or harmful.
Click here for more information on IOC.
 
Fortunately, more than a dozen advocates and people with first-hand experience had the opportunity to speak out against such measures and in favor of better voluntary supports. For example, Thomas Brown (who works in peer support in a Massachusetts-based organization) testified that,
“What really finally helped more than anything was finding peer support. When I found peer support I stopped wanting to die.”
Click here for a full write-up on the hearing.
 
In the meantime, it’s on our community not only to push back against efforts to bring Involuntary Outpatient Commitment to the state, but also to come together to talk about the alternatives we can create to support people who are struggling.

Sun Magazine features RLC Voices

Check out Sun Magazine's April 2017 issue, which includes the article
“An Open Mind” by Tracy Frisch, featuring the Western Mass
RLC’s Director, Sera Davidow.
 
It will be available in April at www.thesunmagazine.org/
and at newsstands where the magazine is sold.

The RLC in the NYT (and the Boston Globe): Hearing Voices

We’re excited to share that the Western Mass RLC has made some
high profile appearances in the mainstream media this past month!
 
August 8th, the New York Times:
The article appeared in print and on-line on Tuesday, August 8, and featured some of the great Hearing Voices work led by RLC team members Caroline White, Marty Hadge, and others in our community.
 
August 12, the Boston Globe:
An editorial written by Sera Davidow, also of the Western Mass RLC, appeared in the Boston Globe. This article highlighted both the good and the bad that the mainstream news often overlooks.
 
Both articles appeared on the front of the science sections of their respective publications
and can still be viewed on-line, or by clicking on their titles above!

Hearing Voices "Foreign Policy" Article!

Check out this new article in Foreign Policy featuring the RLC’s own Marty Hadge
and Caroline White, and our Hearing Voices work.
(We don’t love the title, but there’s lots of good information in the piece.)
 
CLICK HERE to read it!
 
The Radical Movement Redefining Schizophrenia:
“People with unquiet minds are locked up, medicated, and stigmatized.
Now an international support network is telling them they might not be sick at all.”
By Samantha M. Shapiro

Watch Out! - Mental Health & Artificial Intelligence

In a world where so many of us can agree that human connection is essential to emotional health and healing, there is a growing trend toward trying to create artificial intelligence that can take the place of (or at least fill in for) that connection.
 
Chatbots and other forms of ‘artificial intelligence’ are showing up all over the place trying to offer pre-programmed support to people who might be lonely, struggling, or otherwise wanting connection.
 
“Joy” is one particularly visible support bot attempting to use pre-programmed responses to simulate support and help people track their emotions over time. “Woebot” is quite similar, although this version comes with multiple-choice buttons rather than allowing for so much free-form typing. And while Joy and Woebot both got their starts through the Facebook chat system, “WYSA” is a phone application that is meant to work as a ‘coach’ trained in such popular approaches as Cognitive Behavioral Therapy and Motivational Interviewing.
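 
One simple way a bot can produce ‘pre-programmed responses’ is by matching keywords in what you type against a table of scripted replies. The sketch below is a minimal, purely hypothetical Python example of that pattern; it is not the actual code behind Joy, Woebot, or WYSA, and every keyword and reply in it is invented for illustration:

```python
# Hypothetical sketch of a rule-based "support" chatbot.
# This is NOT the actual code behind Joy, Woebot, or WYSA; the keywords and
# canned replies below are invented purely to illustrate the approach.

CANNED_RULES = [
    # (keywords to look for, pre-programmed reply)
    (("sad", "down", "depressed"), "I'm sorry to hear that! Try listing three things you're grateful for."),
    (("anxious", "worried"), "Remember to breathe! Anxiety is just a thought."),
    (("lonely", "alone"), "Have you tried going for a walk?"),
]

DEFAULT_REPLY = "Got it! How does that make you feel?"


def respond(message: str) -> str:
    """Return the first canned reply whose keywords appear in the message."""
    text = message.lower()
    for keywords, reply in CANNED_RULES:
        if any(word in text for word in keywords):
            return reply
    # Nothing matched, so the bot falls back to a generic prompt,
    # which is often where responses start to feel dismissive.
    return DEFAULT_REPLY


if __name__ == "__main__":
    print(respond("My best friend just died and I feel completely alone."))
    # Prints the canned "walk" suggestion: a tone-deaf match on the word "alone".
```

Because the reply is chosen by surface keywords rather than any real understanding of context, a message about grief can trigger the same chirpy suggestion as a message about boredom, which is part of why these tools can land as dismissive or hurtful.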
 
Unfortunately, fake or ‘programmed’ connection is often like no connection at all, and sometimes these bots give responses that are quite dismissive or hurtful. While they tend to come with lots of disclaimers, the disclaimers are not prominently displayed, and some people who use these services will likely never see them, or fully understand what they’re interacting with or why the bot responds the way it does.
 
In addition to that, there’s the question of privacy. Who can see what you say to a bot? Is it programmed to call in ‘emergency’ professionals when you might not be expecting it? What if these bots cause more harm than good?
 
These are all questions well worth thinking about.
 
Read “Killjoy: the Story of a Misguided Mental Health Bot” by Sera Davidow for more on this topic.

Supported Decision Making, NOT Shared

‘Shared Decision Making’ is taking up a lot of space in system conversations as the next ‘hot new thing’, but there’s a lot to think about whenever taking on a new ‘next thing’. For example:

  • Is it really changing things or just repackaging the same old stuff?
  • If it really is making a change, is it definitely headed in the right direction?
  • How consistent is it with other efforts a group or system is currently working on?
  • If it’s not consistent, what else needs to change so that we aren’t stuck undermining all progress by trying to do opposing things at the same time?

This article will not address all these issues, but it will start with one:

Should we be working on ‘Shared’ Decision Making… or ‘Supported’?

Shared is not the same as supported. Specifically, unless two people are truly making a decision that is close to equally impactful for both of them (e.g., when a couple decides to move to another state or similar), then a decision is not shared. To suggest that it is shared, when what we’re really talking about is a doctor or other treatment provider being a part of making a decision that they get to walk away from when they return to the rest of their life, will ultimately feel dishonest, patronizing, and possibly even coercive to a lot of people.

Instead, we ask that people adopt the language of ‘supported decision making’ (already in use in many legal circles) to much more clearly signify that decisions often belong to one person, but that that person may sometimes benefit from a supportive process that helps them learn about and weigh options, concerns and the realities of various situations.

This may seem like a trivial matter, but how we talk about and name things can dramatically impact our ability to work with them and with each other. Once we agree that we’re talking about ‘Supported Decision Making’ and not ‘Shared’, that leaves room for many other conversations!

Stay tuned for more in coming months!
