Why are students obsessed with getting a degree when universities don't teach you any proper skills other than essay writing?
University is important for specific careers (dentistry, medicine, etc.), but unless you want a job that requires a degree, you're better off doing almost anything else rather than getting into debt. Most employers don't look more favourably on someone just because they have a piece of paper. It's good that schools and colleges want you to do well in life, but they shouldn't focus all their efforts on university as if it's the be-all and end-all.
There are more cost-effective options:
Starting a business
Doing a degree apprenticeship
Getting a job and working your way up
There will always be jobs that require a degree, but apart from those, university is just a money-making scheme.