When we compute averages, we assume we are producing useful, relevant information about the population being summarized. Yet an average captures only one dimension at a time, even when we are talking about a concept such as intelligence, which may have more than a hundred components and no agreed-upon overall definition. One who bandies averages about typically exudes the hubris of someone who has “found it” (Nirvana) without the slightest clue as to what is missing. The average thus becomes an ill-founded testament to one particularized feature: we disregard all the other data about the base population, ignore the frequency distribution of the measured aspect, and simply presume that all cases are alike. So when we cite an average for an American population, be it height, IQ, glucose level, religiosity, or years of education, that singular number called the average or mean tells us nothing whatsoever about the individual members.
Populations can be described by compiling data and “smoothing” it into a curve, such as the normal curve, the bimodal curve, and other shapes, and most real distributions are skewed. True, the image of a frequency distribution conveys far more information than the mean, yet it barely scratches the surface, considering that we can measure populations along thousands of variables. We can also examine clusters of variables and their correlations, say, race and hypertension, or years of post-high-school education and income.
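To make the point concrete, here is a minimal sketch with three hypothetical datasets I have invented for illustration: one uniform, one skewed by a single outlier, and one bimodal. All three share the very same mean, yet their shapes, and what they say about their members, could hardly differ more.

```python
# Three invented datasets with identical means but very different shapes,
# illustrating how much information the mean alone "bakes in" and hides.
from statistics import mean, median, stdev

uniform_like = [50] * 10                             # every case identical
skewed = [10] * 9 + [410]                            # one extreme outlier
bimodal = [10] * 5 + [90] * 5                        # two distinct clusters

for name, data in [("uniform", uniform_like),
                   ("skewed", skewed),
                   ("bimodal", bimodal)]:
    # The mean is 50 in every case; median and spread reveal the difference.
    print(f"{name:8s} mean={mean(data)} median={median(data)} "
          f"stdev={stdev(data):.1f}")
```

A reader given only “the average is 50” cannot distinguish a population where everyone scores 50 from one where nine in ten score 10 and a single outlier scores 410.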
The point is that with averages, all sorts of information is “baked in” and absorbed into the mean, with no explanation of what is there, nor any insight into why we should ever attend to and keep track of averages at all. Perhaps Malcolm Gladwell, in his fantastically interesting book “Outliers,” intuited this and helped spur a few of us into decrying the notion of averages.