the_watcher (kiwifarms.net, joined Jun 28, 2018)
No problem. A glance at the minimum experience of 1+ year in SEO indicates that they are looking for some inexperienced, cheap college kid, so really nothing special.

Ah, fair enough. I saw "marketing manager" and "SEO", which I took to be Senior Executive Officer. My bad.
[This is my speculation]
It's hard to say what the "end" goal may be. They get access to an individual's health profile (provided by users of the service) and to conversational data that can be used to train an AI to detect individuals who may exhibit signs of disorders like anxiety, depression, etc.
There seems to be a LOT of money in this company based on how much they're spending on customer acquisition and growth. And when something seems "too good to be true" and you don't know what they are selling, then you're the product. It could even be that the $60/week fee is just a way to appear like a more "legit" company selling services rather than data. (There are additional reasons for charging, such as the sunk cost fallacy encouraging users to use the service more often since they "already paid for it".)
I would speculate that the data generated from this service could be used in a wide range of ways, from tracking mental disorders to training AI algorithms that can detect subtle signs of depression in individuals. As you said, someone showing signs of depression is highly vulnerable, and (IMO) targeted ads like this could convince someone they have a more serious issue and that they can/should talk to a therapist. I think I heard CVS was involved here somehow, so in the future maybe they could then prescribe some medications.
I'm not saying this will happen, but as a "worst case scenario" type of thing, this data could be used in very malicious ways to expand the drug industry. BetterHelp is trying to switch things from a scenario in which you have to find a therapist for help to one in which the therapy service finds you, when you see an ad.
No matter what though, there is no doubt in my mind that the data is being used for something right now. It's just a question of how deep this rabbit hole goes.
This may be just a hunch, but I think they are selling the data packages themselves to companies for targeted marketing. Basically what Facebook does, just in the health industry.
What could be more intimate than a 'wire-tapped' therapy session? There is a lot of valuable, personalized data to be gained. The AI angle would be a further purpose for the data: the algorithm could target everything for you. This may be a long shot, but the Chinese use such metrics for a social/citizen score. Could you be a future offender? Maybe increase surveillance on you, for example.