In statistics, the range of a set of data is the difference between the largest and smallest values. It gives a rough idea of how spread out the data are before they are examined in detail. "Difference" here is specific: the range of a set of data is the result of subtracting the smallest value from the largest value.
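The subtraction described above can be sketched in a few lines of Python (the function name `data_range` is my own choice for illustration):

```python
def data_range(values):
    """Return the statistical range: the largest value minus the smallest."""
    return max(values) - min(values)

scores = [7, 2, 9, 4, 11]
print(data_range(scores))  # largest 11 minus smallest 2 gives 9
```

Note that the result depends only on the two extreme observations, which is why the range is sensitive to outliers.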
In descriptive statistics, however, this concept of range has a more precise meaning. The range is the size of the smallest interval which contains all the data, and it provides an indication of statistical dispersion. It is measured in the same units as the data. Since it depends on only two of the observations, it is most useful in representing the dispersion of small data sets.

The range of an analytical method is the interval between the upper and lower concentrations of analyte (including these concentrations) for which it has been demonstrated that the analytical procedure has a suitable level of precision, accuracy and linearity. The specified range is normally derived from linearity studies and depends on the intended application of the procedure. It is established by confirming that the analytical procedure provides an acceptable degree of linearity, accuracy and precision when applied to samples containing amounts of analyte within or at the extremes of the specified range.

Typical specified ranges are:

- For assay, usually not less than (NLT) 80 to 120% of the test concentration.
- For determination of content uniformity, usually NLT 70 to 130% of the test concentration.
- For determination of impurities, usually from the reporting limit of the impurity to 120% of the specification.
- For dissolution testing, usually ±20% over the expected concentration.