A digital multimeter, set up as a milliammeter on the 100 milliamp scale, is known to have an accuracy of plus or minus 2%. A meter reading of 5.0 milliamps would indicate an actual line current between what two values?
• Percentage error and how to apply it to a measured value
• The difference between the meter's full-scale range (100 mA) and the displayed reading (5.0 mA)
• How to convert a percentage of a value into a plus-or-minus range
• Is the ±2% accuracy applied to the 100 mA full-scale value or to the 5.0 mA reading itself? Think carefully about what the question is implying.
• How do you calculate 2% of the quantity it applies to, and then add and subtract that amount from the reading to find the minimum and maximum possible currents? (A short sketch of this arithmetic follows this list.)
• Once you know the total possible error in milliamps, which option's range matches that spread around 5.0 mA?
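If it helps to see the arithmetic, here is a minimal Python sketch, not part of the original question, that works out the range under both possible readings of the ±2% figure. The variable names and the side-by-side comparison are purely illustrative; the question itself expects you to decide which interpretation applies.

```python
full_scale_ma = 100.0   # meter range in milliamps (from the question)
reading_ma = 5.0        # displayed reading in milliamps (from the question)
accuracy = 0.02         # plus or minus 2%, written as a decimal

# Interpretation A: the 2% applies to the reading itself.
error_a = accuracy * reading_ma              # 0.1 mA
range_a = (reading_ma - error_a, reading_ma + error_a)

# Interpretation B: the 2% applies to the full-scale value of the range in use.
error_b = accuracy * full_scale_ma           # 2.0 mA
range_b = (reading_ma - error_b, reading_ma + error_b)

print(f"2% of the reading:    {range_a[0]:.1f} mA to {range_a[1]:.1f} mA")
print(f"2% of the full scale: {range_b[0]:.1f} mA to {range_b[1]:.1f} mA")
```

Running it prints the two candidate ranges, which makes it easy to see how much the choice of interpretation changes the spread around 5.0 mA.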
• Be clear about whether the 2% applies to the 5.0 mA reading or to the 100 mA full-scale value; the two choices give very different error amounts.
• Double-check your percent-to-decimal conversion (2% = 0.02).
• After calculating the minimum and maximum values, verify that the midpoint of the range is 5.0 mA and that the width of the range matches your error calculation (a small check like the one sketched below can confirm this).
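As a quick sanity check, a helper along these lines (the check_range name and 1e-9 tolerance are just illustrative choices) confirms that a candidate range is centered on the reading and reports the implied plus-or-minus error:

```python
def check_range(low_ma, high_ma, reading_ma=5.0):
    """Confirm a candidate range is centered on the reading and return
    the implied plus-or-minus error in milliamps."""
    midpoint = (low_ma + high_ma) / 2
    half_width = (high_ma - low_ma) / 2
    assert abs(midpoint - reading_ma) < 1e-9, "midpoint should equal the 5.0 mA reading"
    return half_width

# Plug in whichever minimum and maximum you calculated, for example:
print(f"{check_range(4.9, 5.1):.1f} mA")   # 0.1 mA, i.e. 2% of the 5.0 mA reading
print(f"{check_range(3.0, 7.0):.1f} mA")   # 2.0 mA, i.e. 2% of the 100 mA full scale
```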