Hello...
How do you calculate the percentage error of a reaction using the maximum error (e.g. ±0.5 °C for a thermometer)?
I hope I made sense...
Usually % error is calculated as follows:

% error = (Δt / T) × 100

where Δt represents the uncertainty in temperature and T the measured value.
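As a quick sketch of that formula in Python (the values here are made up purely for illustration):

```python
# Hypothetical example: a single reading of 25.0 °C on a thermometer
# with a ±0.5 °C maximum error.
uncertainty = 0.5   # Δt, the maximum error of the instrument (°C)
reading = 25.0      # T, the measured value (°C)

percent_error = uncertainty / reading * 100
print(f"% error = {percent_error:.1f}%")  # % error = 2.0%
```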
However, it will depend on which type of reaction you are carrying out and how you take your measurements.
If you are measuring an initial temperature and then a final temperature in order to find the temperature change, then since you recorded two values, each with a maximum error of ±0.5 °C, the uncertainties add when you take the difference:

Δt = 2 × 0.5 = 1.0 °C

and T in the formula above becomes the temperature change itself rather than a single reading.
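Putting it together, here is a minimal Python sketch for the temperature-change case (the readings are made up purely for illustration):

```python
# Hypothetical example: two readings, each subject to the thermometer's
# ±0.5 °C maximum error, so the uncertainties add when taking the difference.
max_error = 0.5        # maximum error per reading (°C)
initial_temp = 21.0    # made-up initial reading (°C)
final_temp = 29.0      # made-up final reading (°C)

delta_t = 2 * max_error                  # Δt = 2 × 0.5 = 1.0 °C
temp_change = final_temp - initial_temp  # T = 8.0 °C

percent_error = delta_t / temp_change * 100
print(f"% error = {percent_error:.1f}%")  # % error = 12.5%
```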