Once upon a time, the accuracy and precision of tuning were achieved entirely by ear. I think we can all agree on this.
The accuracy was determined by the accuracy of the tuning forks and how that was transferred to master reed sets. The precision, or repeatability, of a tuning was down to the skill and ear of the tuner.
When I first started in this game, I tried different meters, looking for accuracy and repeatability in their readings. I also looked at the discrimination of the displayed values: if I was trying to tune to, say, +/- 5 cents, I needed to be able to measure to at least 1/5th of the tolerance spread, preferably 1/10th of the spread, or 1 cent in this case.
So, accuracy is how closely the actual value achieves the nominal value, assuming the meter shows no error for the note in question. If the meter says A = 440 Hz, it is 440 Hz, not 441 or 442 Hz.
Precision = how close the tuning technique, my skill and the repeatability of the meter allow me to get to the shown A = 440 Hz over, say, 20 different occasions.
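To put the cents figures discussed here into Hz terms, here is a small sketch (my addition in Python, not part of any tuning workflow described above) using the standard equal-temperament definition of a cent, 1200 cents per octave, so f = ref * 2^(cents/1200):

```python
def cents_to_hz(ref_hz: float, cents: float) -> float:
    """Frequency that lies `cents` cents away from `ref_hz`.
    Standard definition: 100 cents per semitone, 1200 per octave."""
    return ref_hz * 2 ** (cents / 1200)

def hz_error(ref_hz: float, cents: float) -> float:
    """Hz deviation corresponding to a cents offset from `ref_hz`."""
    return cents_to_hz(ref_hz, cents) - ref_hz

# A +1.5 cent error at A = 440 Hz is well under half a hertz:
print(round(hz_error(440.0, 1.5), 3))   # about 0.381 Hz sharp
# On a lower reed (say ~130.81 Hz) the same cents error is a
# smaller absolute Hz deviation still:
print(round(hz_error(130.81, 1.5), 3))  # about 0.113 Hz
```

This also shows why meters display in cents rather than Hz: a fixed cents tolerance corresponds to a different Hz tolerance at every pitch.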
All this is why engineers talk in tolerances.
My next quest was to establish my own tuning tolerance standard, assuming an accurate measurement and a repeatable meter that would allow me to read with an appropriate level of discrimination. I read up on various medical and musical papers and came upon the statement that most people can detect an error of 5 cents between two 'same' notes; some can say which of two notes is sharp or flat relative to the other, but many cannot. I was taken by the statement that people can hear a dime fall. It took me quite some time to realise that a dime was 5 cents, but we don't have dimes in Yorkshire. This 5-cent detection was for two continuous notes. I reasoned that musical notes are transitory: at a MM of 60 beats each note gets 1 second if playing crotchets, 1/2 a second if playing quavers, and MM 120 clearly halves all this again. So +/- 2.5 cents seemed reasonable, say +/- 2 cents to allow for drift etc.
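The note-duration arithmetic above can be restated as a one-line helper (my illustration, in Python):

```python
def note_seconds(mm: float, beats_per_note: float = 1.0) -> float:
    """Duration of one note in seconds at metronome mark `mm`
    (beats per minute), for a note lasting `beats_per_note` beats."""
    return 60.0 / mm * beats_per_note

print(note_seconds(60))        # crotchet at MM 60  -> 1.0 s
print(note_seconds(60, 0.5))   # quaver at MM 60   -> 0.5 s
print(note_seconds(120))       # crotchet at MM 120 -> 0.5 s
```

The shorter the note, the less time the ear has to judge its pitch, which is the basis for relaxing the 5-cent continuous-tone threshold.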
Playing 3rds, 5ths & octaves, I re-evaluated this to my current standard of +/- 1.5 cents from nominal across the range. The only exceptions are at a player's request, or for a low-pitched reed on an Anglo, which can sometimes bend flat when played above mp.
I raise this topic as a result of working on two instruments for a certain player who could detect errors of less than 1 cent by ear, and tell which of two notes is the sharper, all this on an Anglo. The best I could manage was +/- 0.5 cents, using a display discrimination of 0.1 cents and an instrumentation discrimination of probably 0.01 cents.
Whilst I am quite happy to work to +/- 1.5 cents from nominal, and my various customers and players (except one) are also happy with this, I wonder what tolerance other repairers are using. I will add that I get instruments brought to me for tuning tweaks, known (or reported) to have been serviced, tuned or repaired by others or before acquisition, that can have variations of 5 cents or more from nominal. Some are probably the result of drift, but much is the result of rushed and sloppy workmanship. I can understand an inaccurately tuned instrument, say all notes 3 cents sharp, or 2 cents flat, but not a wide spread in the overall precision of the tuning.
It would be good to establish an agreed standard, to protect players, provide a control standard for repairers, and guide home restorers alike.
So I have declared my standard, what does the rest of the world do?