In the UK, most children are educated in schools run by the state, and most patients are treated by a state-run healthcare system (the NHS). Both are paid for through taxation, so we all contribute to the cost.
Do you think there is any merit in the idea of providing education and healthcare through private companies that are motivated by profit-making, or should we continue to provide these services as we currently do?
I believe that we should continue to provide these services as we currently do, at least until the NHS has been improved, so that we have a healthcare system that benefits everyone and is fairer for society.
(Does the beginning of my answer make sense, and am I on the right track?)