I have suffered from severe depression for the past few years, and while I can certainly tell this isn't something that will just "go away on its own," I am sick of the oft-repeated claim that "depression never really goes away - it can get better, but it stays with you for life."
Obviously, I can't speak for everyone, but for my part I have had some very serious life events (an abusive family, having to leave uni, constant bullying) that have contributed in no small part to my depression. Even when I'm out of my current situation for a day, visiting a friend or something, I often feel perfectly normal. Personally, I think that if I could fix my problems, like my career, and get back on my feet, I would never be depressed again (clinically so, not just feeling sad).
So why do people constantly peddle the idea that "depression never entirely goes away"? It's pessimistic and unhelpful. Given that a lot of people who are depressed go on to take their own lives because they feel hopeless, this seems like a harmful message to send out.
Is there actually any medical basis for this, or is it a load of hot air?