Depression

Delly

Trakanon Raider
2,995
634
During this time we were supposed to have a staff of 13; I had between 6 and 7 depending on the month. So while covering call-ins and shifts we just didn't have coverage for, I was still doing my job of running both shelters' operations, which is hard to do when you're checking in on sleeping youth (eyes on) every 15 minutes while they are in their rooms because of state licensing guidelines.
The best change in my career was going from a role that depended on working with/covering others to just having my own workload. I do my work and go home. Sometimes it is mind-numbing, stressful work, but at least I know once the work is done I can escape. No mandating or on-call shenanigans. Burnout is far too common in the industry, and it's a sinking ship in so many places.
 

Tarrant

<Prior Amod>
15,782
9,193
The best change in my career was going from a role that depended on working with/covering others to just having my own workload. I do my work and go home. Sometimes it is mind-numbing, stressful work, but at least I know once the work is done I can escape. No mandating or on-call shenanigans. Burnout is far too common in the industry, and it's a sinking ship in so many places.

I absolutely cannot wait to go to work, do my job, go home, and not have to worry or think about it again until the next time I clock in.
 

Falstaff

Ahn'Qiraj Raider
8,396
3,321
Yeah, it had gotten pretty out of control. My boss always told me he was worried about my burnout, but I just put my head down and powered through because what other option did I have? I wasn't getting help from him. I think the beginning of the end was when I did bring it up during one of my supervision meetings with him and he said, "You think it's bad now? You shoulda been here 6 months before you got here." I let him know that wasn't the flex he thought it was, and he didn't appreciate it.

I lived almost an hour away from the shelters, and our crisis on-call was supposed to rotate between 4 people, each taking a week. On-call only paid an extra $100 per 7 days you were on it, so I lost money once gas was factored in, even more so when the 4-person rotation turned into me taking it for 3 weeks and then two case managers sharing it for a week. I was sleeping in my car multiple times a week, and my wife was convinced I was going to die falling asleep at the wheel. I built up almost 400 hours of PTO in 2022 due to the high workload.

During this time we were supposed to have a staff of 13; I had between 6 and 7 depending on the month. So while covering call-ins and shifts we just didn't have coverage for, I was still doing my job of running both shelters' operations, which is hard to do when you're checking in on sleeping youth (eyes on) every 15 minutes while they are in their rooms because of state licensing guidelines.

I'll never work in public housing again, and I doubt I'll ever work for a nonprofit again either, unless it's one I get off the ground myself at some point in the future. My new job starts in a week: it's 40 hours a week, Monday through Friday, 9am-5pm, I supervise no one, and I still get to work with at-risk communities and populations. I'm really looking forward to it, and mentally I'm in a much better place than I've been in a long time. I didn't realize how much of a toll my job had taken on me, and I'm glad to be done with it. I do miss the kids though; I hope they do well moving forward.
This is why I left social work 10 years ago. I was in my late 20s and already saw all of this and knew the worst was yet to come. I don't know how people do it. If I want to change the world, there are plenty of volunteer opportunities. But I'm still happy for you and glad to see you coming out the other side, for your mental well-being if nothing else.
 

Tarrant

<Prior Amod>
15,782
9,193
This is why I left social work 10 years ago. I was in my late 20s and already saw all of this and knew the worst was yet to come. I don't know how people do it. If I want to change the world, there are plenty of volunteer opportunities. But I'm still happy for you and glad to see you coming out the other side, for your mental well-being if nothing else.
I still love the work, but I doubt I’ll work a position again that has no set boundaries on work/life balance.
 

TheAdlerian

Potato del Grande
<Banned>
83
36
ChatGPT is going to be far better than 99% of currently practicing shrinks at using the DSM. Same thing versus regular doctors too.
I'm a therapist and have been one for decades.

I don't know how an AI can work with patients effectively, because most people will say what appears to be true, but after you get to know them it's really something else.

For instance, when you first meet someone they will give you the technical details of their problem, such as, I'm depressed, I can't sleep, I'm having trouble focusing, etc. That info fits in very well with DSM diagnosis, so it's easy in the beginning.

However, most people don't have a "mental illness"; they have some trauma that caused them to develop ideas that produce depression, anxiety, etc. So, after the patient gets to know you, they will tell you their parents took them for granted, they had few friends, they grew up around bullies, their dad left and they can't put a name to how they feel about that, and so on. Thus, the problem is NOT in the DSM; rather, it's due to the effect of life events and is an existential issue.

If you are using AI, you can probably only get technical diagnostic data, not information about dad neglecting you or the effects of a single mom on your life. That's because those issues are understood by humans but do not have much research behind them. That, or the research is fake woke material.

I have dealt with a lot of prostitutes because I do a lot of treatment of criminals/drug addicts. The research data on prostitutes is fake, because it's "woke" and concludes that prostitutes are OKAY. Meanwhile, I have gotten the real story about prostitutes and that is, they hate men, they are frequently lesbians, they need drugs to numb themselves to have sex, they are sociopathic tricksters, etc. My bet is that if you tell an AI you're a prostitute it will probably conclude that you are a well adjusted "sex worker" and that you don't have any diagnosable problem when in reality you have the worst issues in psychology.

Psychology is VERY bourgeois, which means there are a lot of "middle class" lies in it to hide the real sources of suffering, which tend to be family issues and marginalized people like the homeless, prostitutes, and addicts, whom the bourgeois don't want to think about. So, AI is not going to be programmed to know about these people and issues.
 

moonarchia

The Scientific Shitlord
23,406
42,522
I'm a therapist and have been one for decades.

I don't know how an AI can work with patients effectively, because most people will say what appears to be true, but after you get to know them it's really something else.

For instance, when you first meet someone they will give you the technical details of their problem, such as, I'm depressed, I can't sleep, I'm having trouble focusing, etc. That info fits in very well with DSM diagnosis, so it's easy in the beginning.

However, most people don't have a "mental illness"; they have some trauma that caused them to develop ideas that produce depression, anxiety, etc. So, after the patient gets to know you, they will tell you their parents took them for granted, they had few friends, they grew up around bullies, their dad left and they can't put a name to how they feel about that, and so on. Thus, the problem is NOT in the DSM; rather, it's due to the effect of life events and is an existential issue.

If you are using AI, you can probably only get technical diagnostic data, not information about dad neglecting you or the effects of a single mom on your life. That's because those issues are understood by humans but do not have much research behind them. That, or the research is fake woke material.

I have dealt with a lot of prostitutes because I do a lot of treatment of criminals/drug addicts. The research data on prostitutes is fake, because it's "woke" and concludes that prostitutes are OKAY. Meanwhile, I have gotten the real story about prostitutes and that is, they hate men, they are frequently lesbians, they need drugs to numb themselves to have sex, they are sociopathic tricksters, etc. My bet is that if you tell an AI you're a prostitute it will probably conclude that you are a well adjusted "sex worker" and that you don't have any diagnosable problem when in reality you have the worst issues in psychology.

Psychology is VERY bourgeois, which means there are a lot of "middle class" lies in it to hide the real sources of suffering, which tend to be family issues and marginalized people like the homeless, prostitutes, and addicts, whom the bourgeois don't want to think about. So, AI is not going to be programmed to know about these people and issues.
What you are describing is a GIGO (garbage in, garbage out) problem, not an AI issue. If people are willing to be honest with the machine, they will get the correct diagnosis from the DSM. Many, if not most, of us already do the same thing with medical symptoms to have a starting point when talking to a doctor, or to verify what we are being told. You could simply program the AI to assume it is being lied to, so that it asks probing questions or makes you clarify and verify what you are feeling.
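For what it's worth, "program the AI to assume it is being lied to" is, with current chat models, mostly a matter of instructions rather than exotic engineering. Here is a rough sketch of what that could look like, assuming the OpenAI Python client; the model name, prompt wording, and the intake_session helper are placeholders I made up, not anything clinically validated:

# Rough sketch: a chat loop whose system prompt tells the model to probe
# surface answers instead of taking them at face value.
# Assumes the OpenAI Python client; model name and prompt text are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are an intake assistant. Assume first answers are incomplete or "
    "guarded. Before suggesting any DSM-style label, ask follow-up questions "
    "about life events, relationships, and anything the person avoids or "
    "contradicts. Never present a diagnosis as definitive; recommend a "
    "licensed clinician for anything serious."
)

def intake_session() -> None:
    # Running message history so the model can notice contradictions over time.
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    while True:
        user_text = input("you> ").strip()
        if user_text.lower() in {"quit", "exit"}:
            break
        messages.append({"role": "user", "content": user_text})
        reply = client.chat.completions.create(
            model="gpt-4o",  # placeholder model name
            messages=messages,
        )
        answer = reply.choices[0].message.content
        messages.append({"role": "assistant", "content": answer})
        print(f"ai> {answer}")

if __name__ == "__main__":
    intake_session()

Whether that actually gets past "nothing is wrong" is a fair question, but the probing behavior itself is a paragraph of instructions, not an unsolved programming problem.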
 

Hatorade

A nice asshole.
8,450
7,201
The business I started has picked up; I had my best week ever so far, cash-wise, but I got home and my brain did its thing. Apparently I am only happy when I spend money on others, and making money has the opposite effect.
 

TheAdlerian

Potato del Grande
<Banned>
83
36
What you are describing is a GIGO (garbage in, garbage out) problem, not an AI issue. If people are willing to be honest with the machine, they will get the correct diagnosis from the DSM. Many, if not most, of us already do the same thing with medical symptoms to have a starting point when talking to a doctor, or to verify what we are being told. You could simply program the AI to assume it is being lied to, so that it asks probing questions or makes you clarify and verify what you are feeling.
That's moronic.

Do you have any life experience?

Have you ever been around a woman before?

A woman will say "Nothing is wrong" when something is wrong, but you can instantly tell from her tone and body language.

Everyone knows this... but you.
 

moonarchia

The Scientific Shitlord
23,406
42,522
That's moronic.

Do you have any life experience?

Have you ever been around a woman before?

A woman will say "Nothing is wrong" when something is wrong, but you can instantly tell from her tone and body language.

Everyone knows this... but you.
Again, a GIGO issue, not an AI issue. If you lie to a tool, that's on you, not the tool.
 

Oblio

Utah
<Gold Donor>
11,714
25,614
That's moronic.

Do you have any life experience?

Have you ever been around a woman before?

A woman will say "Nothing is wrong" when something is wrong, but you can instantly tell from her tone and body language.

Everyone knows this... but you.
This is a grownup thread, so I will just say that I am sorry to learn about your disability.
 

TheAdlerian

Potato del Grande
<Banned>
83
36
Again, a GIGO issue, not an AI issue. If you lie to a tool, that's on you, not the tool.
Moronic.

You are a naive person who thinks science fiction is real.

You can't program a computer to observe something that humans can't even always observe. Then, even when you do observe something correctly, if you're dealing with a passive-aggressive person they will deny that what you observed is correct.

There is no computer language in use that can program a computer to read tone and body language, and no way to program it to detect lies at the passive-aggressive level. AI are currently robots, not actual AI, due to the limitations of programming languages. So there's no way that any modern program could be robotically programmed to catch people lying and get them to admit what's on their mind.

As I've said in my original reply, you sound like you lack life experience and don't know how people work. Now, it's clear you don't know how computers work, lol.
 

ToeMissile

Pronouns: zie/zhem/zer
<Gold Donor>
3,161
2,050
Moronic.

You are a naive person who thinks science fiction is real.

You can't program a computer to observe something that humans can't even always observe. Then, even when you do observe something correctly, if you're dealing with a passive-aggressive person they will deny that what you observed is correct.

There is no computer language in use that can program a computer to read tone and body language, and no way to program it to detect lies at the passive-aggressive level. AI are currently robots, not actual AI, due to the limitations of programming languages. So there's no way that any modern program could be robotically programmed to catch people lying and get them to admit what's on their mind.

As I've said in my original reply, you sound like you lack life experience and don't know how people work. Now, it's clear you don't know how computers work, lol.
I don't disagree with your initial point that AI (especially in its current form/abilities) isn't a replacement for a human therapist and won't be anywhere close for a long time, if ever. However, using it as a tool to assist seems very much within the realm of near-term possibility.

That aside, you're being a dick.
 

moonarchia

The Scientific Shitlord
23,406
42,522
Moronic.

You are a naive person who thinks science fiction is real.

You can't program a computer to observe something that humans can't even always observe. Then, even when you do observe something correctly, if you're dealing with a passive-aggressive person they will deny that what you observed is correct.

There is no computer language in use that can program a computer to read tone and body language, and no way to program it to detect lies at the passive-aggressive level. AI are currently robots, not actual AI, due to the limitations of programming languages. So there's no way that any modern program could be robotically programmed to catch people lying and get them to admit what's on their mind.

As I've said in my original reply, you sound like you lack life experience and don't know how people work. Now, it's clear you don't know how computers work, lol.
So yes, you are being moronic here. Let's go back to what you made some sort of childish attempt to dunk on initially:

ChatGPT is going to be far better than 99% of currently practicing shrinks at using the DSM. Same thing versus regular doctors too.

Now, based on your response, I am not sure you actually understand what the DSM is and how it is used. The DSM is the big old book of currently defined illnesses and how you diagnose someone based on their symptoms. Guess what, genius? My initial post is still absolutely correct. ChatGPT will be far better than 99% of shrinks at using it, not least because shrinks were/are the infection vector for wokeism, meaning they are inherently biased much more often than not. If you, or I, or anyone in the world is able to honestly describe our symptoms, ChatGPT is going to spit out a much more accurate list of possible issues than a shrink, and in seconds. That is going to be an amazing asset for people who want a second opinion or want to get more information. Believe it or not, there is an entire generation or two that is still fucking abysmal at "googling" things on their own. An AI interface is going to help a lot of people.
 

Tarrant

<Prior Amod>
15,782
9,193
I'm still struggling, and I've had some pretty dark thoughts as a result. No self-harm, but as an example, my wife asked me yesterday to give her an example of what filters in and out.

Me: I struggle knowing that my job as a father is to protect my kids and prepare them for the world and what it has to offer, both good and bad. I work at preparing them for life as adults while showing them love and support. I'd do anything for all my kids, and I'll do anything to keep them happy.

Her: Right…

Me: So knowing all of that, it destroys me to know that at some point, when I die, I'll be one of the biggest sources of pain in their lives, and it'll be a trauma they will probably carry with them for years.

Her: …….well fuck

Now obviously death is a part of life and there's dick you can do about it; I know this logically. But shit like that fucks with my head a lot. Or at least it has the last few months. I started talking to someone, but it doesn't really help.
 