Can someone please explain the difference? I thought % error was ((maximum error × number of times the instrument is used) / measured value) × 100, and % uncertainty was (absolute uncertainty / measured value) × 100. If that's correct, then the problem must be that I don't understand the difference between maximum error and absolute uncertainty.
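For what it's worth, here's a quick sketch of the two formulas exactly as I've written them, just to make them concrete; the function names and example numbers (a 0.5 mm maximum error on an instrument used 3 times to measure 150 mm) are mine, not from any textbook:

```python
def percent_error(max_error, times_used, measured_value):
    # % error = ((maximum error x number of times used) / measured value) x 100
    return (max_error * times_used) / measured_value * 100

def percent_uncertainty(abs_uncertainty, measured_value):
    # % uncertainty = (absolute uncertainty / measured value) x 100
    return abs_uncertainty / measured_value * 100

# Example: 0.5 mm maximum error, instrument used 3 times, 150 mm measured
print(percent_error(0.5, 3, 150.0))
print(percent_uncertainty(0.5, 150.0))
```

With these numbers the two formulas only agree when the instrument is used once, which is exactly why I'm confused about how maximum error relates to absolute uncertainty.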