Hi,
I'm a mature student distance learning Physics A level.
I have a question about whether or not to include the uncertainty in the wavelength of a laser in the total uncertainty for an experiment.
It's to measure the distance between tracks on a CD.
The final equation I'm working with is: d = wavelength / sin(angle)
Where d is the distance between the tracks on the CD.
I've got an uncertainty for the angle (which my tutor was happy with), but I included an uncertainty value for the wavelength of the laser (+/- 5nm).
Since the final equation is wavelength / sin(angle), I think the % uncertainty in the wavelength of the laser should be added to the % uncertainty from the angle, but my tutor disagrees.
They've said: 'The scatter in the measurements of (the length used to determine the angle) subsumes most of the others. Does the wavelength vary in real time? Even if it did, it would be subsumed in the scatter in B, and this is not a measurement that has an error in it - it is a physical property that is being measured.'
We're not directly measuring the wavelength, nor comparing it to a measurement of, say, 30 cm (which would render the +/- 5 nm utterly insignificant), so I think it should be included.
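To show what I mean, here's a quick sketch of the propagation I have in mind. The numbers are made up for illustration (a typical red laser pointer at 650 nm +/- 5 nm and a guessed angle of 22 degrees +/- 1 degree - neither value is from my actual experiment); the point is just that for d = wavelength / sin(angle), the fractional uncertainties add:

```python
import math

# Hypothetical values, purely for illustration:
wavelength = 650e-9            # metres (650 nm)
u_wavelength = 5e-9            # metres (+/- 5 nm, the quoted laser tolerance)
angle = math.radians(22)       # diffraction angle
u_angle = math.radians(1)      # +/- 1 degree, converted to radians

# Fractional uncertainty in the wavelength: u(lambda) / lambda
frac_wavelength = u_wavelength / wavelength

# Fractional uncertainty in sin(angle):
# u(sin t) / sin t = cos(t) * u(t) / sin(t) = u(t) / tan(t)
frac_sin = u_angle / math.tan(angle)

# d = wavelength / sin(angle), so the fractional uncertainties add
frac_d = frac_wavelength + frac_sin

d = wavelength / math.sin(angle)
print(f"d = {d * 1e6:.2f} um +/- {frac_d * 100:.1f}%")
```

With these made-up numbers the wavelength contributes about 0.8% against roughly 4% from the angle, so it's smaller but (to my mind) not obviously negligible.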
What do you guys think? Have I misunderstood something?
*Absolutely not having a go at my tutor here - she's great - but I'd like a second opinion without arguing with her and potentially coming across as arrogant.*
Thank you for any help you can offer.