They (variance and standard deviation) carry the same information, since one is just the square / square root of the other. The standard deviation is the more natural way to describe the width of the distribution because it is on the same scale as the mean and the data, so you can say things like
roughly 95% of a normal distribution lies in the interval [mu - 2*sig, mu + 2*sig]
for mean mu and standard deviation sig. You also use the standard deviation to compute a z value, by scaling the difference from the mean: z = (x - mu) / sig.
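To make that concrete, here is a minimal sketch in Python (using numpy, with made-up sample data, so the numbers are purely illustrative) showing the ~95% coverage of the 2-sigma interval and the z-score scaling:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=2.0, size=100_000)  # hypothetical sample: mu = 10, sig = 2

mu, sig = x.mean(), x.std()

# Fraction of the sample inside [mu - 2*sig, mu + 2*sig]; roughly 0.95 for normal data
within_2sig = np.mean((x > mu - 2 * sig) & (x < mu + 2 * sig))
print(f"fraction within 2 sigma: {within_2sig:.3f}")

# z-score: difference from the mean, scaled by the standard deviation
z = (x - mu) / sig
print(f"z-scores: mean {z.mean():.3f}, standard deviation {z.std():.3f}")
```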
The variance is easier to work with when you derive theory, build recursive algorithms, or generalise to more than one variable (where it becomes the covariance matrix). It is also what appears directly in the normal density: the exponent is a quadratic a*(x - mu)^2 with "a" coefficient a = -1/(2*sig^2), so the variance is (up to negation, inversion, and a factor of 2) the curvature parameter of the log-density.
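As one concrete instance of the "recursive algorithms" point, here is a sketch of a one-pass (online) mean/variance update in the spirit of Welford's algorithm; note that it is the variance (via a running sum of squared deviations) that has a clean recursive update, while the standard deviation is only taken at the end:

```python
def online_mean_variance(xs):
    """One-pass (recursive) mean and sample variance, in the spirit of Welford's algorithm."""
    n = 0
    mean = 0.0
    m2 = 0.0  # running sum of squared deviations from the current mean
    for x in xs:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)  # uses the updated mean
    variance = m2 / (n - 1) if n > 1 else float("nan")
    return mean, variance

mean, var = online_mean_variance([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
print(mean, var, var ** 0.5)  # the standard deviation is recovered only at the very end
```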
So they're basically the same. Standard deviation is easier to interpret statistically and relate to the data. Variance is more convenient for mathematical analysis.