From what I know you're right: heat is dissipated due to collisions of the moving charge carriers (the conduction electrons) with the atoms of the wire, or more precisely with the ions of its lattice.
Two things can happen to the dissipated heat: either it is carried away by the surroundings, or it raises the wire's temperature $t_w$. As long as the difference between $t_w$ and the surrounding temperature $t_s$ is small, only part of the heat is given off to the surroundings, and the rest goes into increasing the wire's temperature. Once the difference $\Delta t = t_w - t_s$ has grown large enough, the power given off to the surroundings equals the power dissipated in the wire, and a thermal steady state is reached; from this point on the temperature of the wire remains constant.
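To make that balance explicit, here is a minimal sketch assuming Newton's law of cooling, with a constant resistance $R$, a heat-transfer coefficient $h$ and a wire surface area $A$ (my notation, not fixed in the answer above):

$$
P_{\text{dissipated}} = \frac{V^2}{R},
\qquad
P_{\text{lost}} = hA\,(t_w - t_s),
$$
$$
\text{steady state:}\quad \frac{V^2}{R} = hA\,(t_w - t_s)
\;\Longrightarrow\;
t_w = t_s + \frac{V^2}{R\,hA}.
$$

The steady temperature rises with the dissipated power, exactly as described above.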
If you now increase the voltage, more heat is dissipated. At first this extra heat cannot all be carried away by the surroundings, because the temperature difference is still the old one, too small for the larger power; the excess stays in the wire and causes $t_w$ to rise. Once $t_w$ has increased by the appropriate amount, a new steady state is reached, and the wire settles at a new, higher temperature $t_w$.
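As a rough numerical illustration of those two steady states (all parameter values are made up for illustration, and $R$ is still taken as constant for simplicity):

```python
# Minimal numerical sketch of the steady state described above, assuming
# Newton's law of cooling and a constant resistance. All values below are
# hypothetical; h, A and R are not taken from the answer.

R = 10.0    # wire resistance, ohms (taken constant in this simple case)
hA = 0.05   # heat-transfer coefficient times surface area, W/K (hypothetical)
t_s = 20.0  # temperature of the surroundings, deg C

def steady_temperature(V):
    """Solve V^2 / R = hA * (t_w - t_s) for the steady wire temperature t_w."""
    return t_s + V**2 / (R * hA)

# A higher voltage dissipates more power, so the wire settles hotter:
for V in (5.0, 10.0):
    print(f"V = {V:4.1f} V  ->  t_w = {steady_temperature(V):6.1f} deg C")
```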
The increase of resistance with temperature does not change this explanation qualitatively, only quantitatively. A full quantitative treatment, however, would be a bit complicated.
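Here is a sketch of why the quantitative case gets complicated, again under the hypothetical cooling model above: the resistance itself grows with temperature, so the steady-state balance becomes a nonlinear equation in $t_w$.

```python
# Sketch of the "complicated" quantitative case: the resistance grows with
# temperature, R(t) = R0 * (1 + alpha * (t - t0)), so the steady-state
# balance V^2 / R(t_w) = hA * (t_w - t_s) is nonlinear in t_w. It is solved
# here by fixed-point iteration. All parameter values are hypothetical
# (alpha ~ 0.004 1/K is roughly the coefficient for copper).

R0, t0 = 10.0, 20.0  # resistance R0 at reference temperature t0 (deg C)
alpha = 0.004        # temperature coefficient of resistance, 1/K
hA = 0.05            # heat-transfer coefficient times area, W/K
t_s = 20.0           # temperature of the surroundings, deg C

def R(t):
    """Resistance at wire temperature t (linear model)."""
    return R0 * (1.0 + alpha * (t - t0))

def steady_temperature(V, iters=200):
    """Iterate t_w <- t_s + V^2 / (R(t_w) * hA) until it settles."""
    t_w = t_s
    for _ in range(iters):
        t_w = t_s + V**2 / (R(t_w) * hA)
    return t_w

V = 10.0
t_w = steady_temperature(V)
print(f"t_w = {t_w:.1f} deg C, R(t_w) = {R(t_w):.2f} ohm, I = {V / R(t_w):.3f} A")
```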
I'm not sure if I've managed to make it clearer, but I hope so.
As to whether it complies with $V = IR$: yes, it does, but as I said, the mathematical treatment would be complicated. If you can state your doubts about its compliance with $V = IR$ more precisely, maybe I can be of more help.
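For completeness, one way to see the compliance under the same hypothetical cooling model: Ohm's law holds at every steady state, but with the resistance evaluated at the current wire temperature,

$$
V = I\,R(t_w), \qquad R(t_w) = R_0\bigl[1 + \alpha\,(t_w - t_0)\bigr],
\qquad \frac{V^2}{R(t_w)} = hA\,(t_w - t_s).
$$

So $V = IR$ is satisfied at each operating point; what breaks is only the proportionality between $I$ and $V$, because $t_w$, and with it $R$, shifts as $V$ changes.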