Which two types of distributions would be expected when portraying a product measurement as a deviation from a check standard?


When portraying a product measurement as a deviation from a check standard, the two expected types of distributions are the normal and the lognormal.

A normal distribution is characterized by its bell-shaped curve and is commonly observed in natural phenomena where random variation occurs around a mean value. This type of distribution is critical in quality control, including calibration, because many measurement processes, when properly controlled, produce results that cluster around a mean with symmetric probabilities of deviating in either direction.
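As a quick, hypothetical illustration (not drawn from the exam material), the Python sketch below simulates repeated product measurements and expresses them as deviations from a check standard value. The nominal value, spread, and sample size are made-up numbers chosen only for the example.

```python
import numpy as np

# Hypothetical example: simulate product measurements and express them as
# deviations from a check standard. All numeric values are illustrative.
rng = np.random.default_rng(seed=1)

check_standard = 5.000  # accepted value of the check standard (arbitrary units)
measurements = rng.normal(loc=5.0003, scale=0.002, size=500)  # simulated readings

deviations = measurements - check_standard

# Under a well-controlled process the deviations cluster symmetrically
# about their mean, consistent with a normal distribution.
mean_dev = deviations.mean()
std_dev = deviations.std(ddof=1)
skew = ((deviations - mean_dev) ** 3).mean() / std_dev ** 3

print(f"mean deviation: {mean_dev:+.4f}")
print(f"std deviation:  {std_dev:.4f}")
print(f"skewness (near 0 for normal data): {skew:+.3f}")
```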

On the other hand, a lognormal distribution occurs when the logarithm of the variable follows a normal distribution. It applies when the data are constrained to be positive and can reflect real-world quantities such as the time until an event occurs or the size of a product feature. This makes it important in calibration work where the characteristic being measured produces skewed data, often because it arises from a multiplicative process.
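The sketch below (again with made-up parameters) illustrates the defining property of a lognormal variable: the raw values are strictly positive and right-skewed, while their logarithms are approximately normal.

```python
import numpy as np

# Hypothetical example: a positive, right-skewed measured quantity modeled
# as lognormal. Parameters are illustrative only.
rng = np.random.default_rng(seed=2)

values = rng.lognormal(mean=0.0, sigma=0.4, size=500)  # strictly positive, skewed right
log_values = np.log(values)

def skewness(x):
    """Sample skewness; near zero for approximately normal data."""
    x = np.asarray(x)
    return ((x - x.mean()) ** 3).mean() / x.std(ddof=1) ** 3

print(f"skewness of raw values: {skewness(values):+.3f}")      # noticeably positive
print(f"skewness of log values: {skewness(log_values):+.3f}")  # close to zero
```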

Together, the normal and lognormal distributions capture the range of behavior seen in product measurements compared against a check standard, giving a comprehensive picture of the potential variation.
