Big White Wall / togetherall evidence: “Evaluative Review”

Big White Wall (edit: known since 2020 as “togetherall”) has been marketing its service using this graphic:


It sounded too good to be true, so I had a look to see if I could find out more.

Their website states, “A copy of the independent review is available on request”. When I asked for a copy (May 2014), they replied that “there is some potentially commercially sensitive data in this review, so we’re not able to share it directly.” I continued asking and even had a meeting with one of their psychiatrists, explaining how important it is that they are transparent about their evidence, especially since they are marketing to CCGs and competing for NHS funding – still nothing.

I accidentally discovered today (Jan 2015!) that the report is online over here, linked from a post on the Mental Health Innovation Network forum. (I have also mirrored it here in case it disappears.)

Now the 95% claim makes sense. The report explains that this is the percentage of users who “reported one or more improvements in mental wellbeing”. There are problems with this kind of evaluation, the most obvious of which is the absence of any comparison group. This means we don’t know what percentage of non-users “feel better” too, so it is difficult to attribute any improvement to Big White Wall.

Additionally, the “one or more” obscures important detail provided in the report. As the tables below show, the improvement rates are often very much lower than 95%.



2 thoughts on “Big White Wall / togetherall evidence: “Evaluative Review””

  1. Hi Andy,
    I feel exactly the same about those numbers from the Big White Wall.
    I’m glad I’ve found someone else who doesn’t drink the Kool-Aid.

    I’m an entrepreneur interested in the e-Mental health field. I was initially captivated by their product, but the more I look at it, the more I find it disappointing.

    That “independent review” is a good example. You highlighted some issues already, but let me add some more.

    It’s based on a survey carried out in January 2008 and a review a year later in January 2009. In January 2008 they only had 598 users. Of those 598 users, only 38.3% actually responded.

    So the review is effectively based on only 229 users.

    At the end of their review in January 2009 they had 3000 users, which means in practice they were basing their conclusions only on 7.6% of their active user base.

    Considering they now have somewhere around 30K users in the UK, how honest is it to keep using those initial 229 users’ self-reports to claim a 95% improvement rate?
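    The arithmetic above is easy to check directly (the user counts and response rate are the ones quoted in this comment; the rounding is mine):

```python
# Figures quoted from the review discussion above.
registered_jan_2008 = 598   # users when the survey was sent (Jan 2008)
response_rate = 0.383       # 38.3% of them responded

respondents = round(registered_jan_2008 * response_rate)
print(respondents)          # 229 users actually surveyed

users_jan_2009 = 3000       # user base at the time of the review (Jan 2009)
share_of_base = respondents / users_jan_2009
print(f"{share_of_base:.1%}")  # 7.6% of the then-active user base
```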

    Also, the review is only based on those 38.3% of users that decided to self-report. But of course, this is very likely to be a biased sample.

    Unsatisfied users are unlikely to stick with the service for a whole year.

    So we are automatically (and I’d say cunningly) excluding from the survey all the users who tried it and gave up. We are also excluding all the users who tried it, stuck with it, but were so unimpressed that they didn’t bother filling in the survey.
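    A toy simulation shows how strong this effect can be. All the numbers below (the 50% true improvement rate and the two response probabilities) are invented for illustration, not taken from the review; the point is only that when improved users are more likely to respond, the rate among respondents badly overstates the true rate:

```python
import random

random.seed(42)

# Hypothetical population: invented numbers, for illustration only.
n_users = 1000
true_improvement_rate = 0.50   # assume half of all users genuinely improve

users = []
for _ in range(n_users):
    improved = random.random() < true_improvement_rate
    # Assumption: improved users are far more likely to still be using the
    # service a year later and to answer the follow-up survey.
    responds = random.random() < (0.60 if improved else 0.10)
    users.append((improved, responds))

respondents = [u for u in users if u[1]]
observed = sum(1 for u in respondents if u[0]) / len(respondents)
actual = sum(1 for u in users if u[0]) / n_users

print(f"true improvement rate:  {actual:.0%}")
print(f"rate among respondents: {observed:.0%}")  # much higher than the true rate
```

    With these assumed response probabilities, roughly 86% of respondents report improvement even though only about half of all users actually improved.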

    As a final criticism, is self-report of well-being in mental health a good enough metric?

    It reminds me of a study on the effectiveness of subliminal self-esteem-boosting audiotapes. Subjects were tested for self-esteem before and after. The tapes had no effect whatsoever, but the subjects who expected them to work at the outset still reported a perceived improvement.

    Here is the study:

    In this interview from Dec 2014, the CEO of BWW claims they currently have nine studies underway. Let’s hope they’ll have better data to publish soon.

    I’d like to add something more about the BWW: the technical quality of their products is disgraceful.

    Their website is full of bugs and built with obsolete technologies, and their smartphone apps are even worse: just look at the low ratings in the Apple App Store and Google Play Store and read the customer complaints – actually, let me quote them: “Great service, shockingly bad app”, “Useless”, “Takes forever to log on and throws u off!”, “Disappointed”, “Buggy as hell”.

    I’m not surprised. If you look at their “team”, they don’t even have a CTO, or anyone else on the board with a technical background.

    I think BWW could really be a great service, but right now it’s just badly implemented, and I wish the NHS were spending its money on better ways to address mental health.

