A digital multimeter, set up as a milliammeter on the 100 milliamp scale, is known to have an accuracy of plus or minus 2%. A display reading of 10.0 milliamps would indicate an actual line current between what two values?
Concepts involved:
• Instrument accuracy as a percentage of the reading
• How to calculate ±2% of a measured value
• Interpreting results as a range of possible true values
Questions to guide your thinking:
• If the meter shows 10.0 mA and the accuracy is ±2%, how many milliamps is 2% of 10.0?
• Once you know the 2% amount, what is the lowest possible true current? What is the highest possible true current?
• Compare your low and high values to the answer choices: which choice gives a range that fully covers both limits?
Before settling on an answer:
• Be sure you calculate 2% of 10.0 mA, not 2% of the 100 mA full-scale range
• Confirm that you apply the tolerance both below and above the indicated reading
• Check that your final range includes the displayed value of 10.0 mA and matches one of the listed choices (a short worked sketch follows this list)
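If you want to check your arithmetic, here is a minimal Python sketch of the calculation. It assumes the ±2% accuracy applies to the displayed reading rather than the full-scale range, as the checks above emphasize; the function name reading_range is just for illustration.

```python
def reading_range(reading_ma: float, accuracy_pct: float) -> tuple[float, float]:
    """Return (low, high) bounds on the true current, in milliamps,
    for a meter whose accuracy is a percentage of the reading."""
    tolerance_ma = reading_ma * accuracy_pct / 100.0  # e.g. 2% of 10.0 mA = 0.2 mA
    return reading_ma - tolerance_ma, reading_ma + tolerance_ma

low, high = reading_range(10.0, 2.0)
print(f"True current lies between {low:.1f} mA and {high:.1f} mA")
# Prints: True current lies between 9.8 mA and 10.2 mA
```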