I have never argued that they were not useful; all I am saying is that they can be introduced on a need-to-know basis.
A quick Google search on matrices brings up this simple tutorial:
http://algebra.nipissingu.ca/tutorials/matrices.html
It is not particularly hard, and it can be covered in passing with practical examples during a single lecture, at the point where it becomes relevant. This is how my university did things, and it benefits the student far more, because they can see how the maths links to computing.
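To give a sense of the "practical example in passing" I mean, here is a minimal sketch of matrix-vector multiplication written from scratch (the function name and the scaling example are my own, chosen for illustration):

```python
def mat_vec(m, v):
    """Multiply a 2x2 matrix m by a 2-element vector v."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

# Scale the point (1, 2) by 3 along x and 2 along y -- the kind of
# transform a student meets immediately in graphics or games work.
scale = [[3, 0],
         [0, 2]]
print(mat_vec(scale, [1, 2]))  # [3, 4]
```

That is the whole mechanic; a lecturer can show it in ten minutes alongside a graphics transform, which is exactly the point about teaching it when it is needed.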
There is maths in a computing science degree, in fields like AI, data mining and so on. When you write searching algorithms you do use maths, for example, but the level required is not on the same level as in a subject like physics or the other sciences. That is why universities like Nottingham do not require A-level maths; they just teach you what you need to know as you go along. By its very nature, computing science is a very practical subject. I studied in Scotland, and we had people transfer from places like Edinburgh and St Andrews because their course was too theoretical, leaving them with poor programming skills.
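As an illustration of the kind of maths that does turn up in searching algorithms, here is a sketch (my own example) of binary search, where the only maths you really need is that halving the range each step means at most about log2(n) comparisons:

```python
import math

def binary_search(items, target):
    """Return (index, comparisons) for target in a sorted list, or (-1, comparisons)."""
    lo, hi, comparisons = 0, len(items) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2       # halve the search range each iteration
        comparisons += 1
        if items[mid] == target:
            return mid, comparisons
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

data = list(range(1024))
index, comparisons = binary_search(data, 777)
# 1024 items, so no more than ceil(log2(1024)) + 1 = 11 comparisons.
assert comparisons <= math.ceil(math.log2(len(data))) + 1
```

That logarithm is about the extent of it; you can teach it in the same lecture as the algorithm itself, which is the need-to-know approach in practice.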
Finally, given the sheer number of unemployed graduates out there struggling to find work, "getting a job after your degree" is extremely important. If I interviewed a graduate who didn't know how the object-oriented paradigm worked, or the model-view-controller (MVC) software pattern, I would not hire them. Programmers and developers make a lot of money and are in massive demand, due to a shortage of people with programming skills.
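For concreteness, this is roughly the level of MVC understanding I would expect a graduate to be able to sketch on a whiteboard (the class names here are hypothetical, a toy counter rather than any real framework):

```python
# Toy model-view-controller sketch: the model holds state, the view renders
# it, and the controller mediates between input and the other two parts.

class CounterModel:
    def __init__(self):
        self.count = 0

class CounterView:
    def render(self, model):
        # The view only reads the model; it never mutates it.
        return f"Count: {model.count}"

class CounterController:
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def increment(self):
        self.model.count += 1          # controller updates the model...
        return self.view.render(self.model)  # ...and asks the view to render

controller = CounterController(CounterModel(), CounterView())
print(controller.increment())  # Count: 1
```

Nothing fancy, but a candidate who cannot explain why the view and model are kept apart has not absorbed the basics that employers look for.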
Any serious IT professional would try to become a competent programmer; you can't do anything useful in computing without being one.