In statistics there exists a well-known aphorism:
All models are wrong but some are useful.
— George Edward Pelham Box, 1919-2013
Taken at the level of the definition of the word “model”, this is true in an absolute sense: a model implies that approximations are made, and as such discrepancies with “the real system” will exist. The real system is therefore the only “not wrong” description of itself. In the exact sciences, the real system is often nature itself. This may lead some scientists to believe that experimental results, and by extension the conclusions based on them, are true by default. When confronted with theoretical results that disagree with experimental conclusions, the quick response is then to blame the theoretical model, since it was not real nature that was worked with, but only a model.
Quite often this is true, and it leads to the formulation of new and better models of reality: this is, for example, how Newton’s laws of motion evolved into special relativity and further into general relativity. However, equally often (in materials science at least) something else may be going on: the scientist may have forgotten that the experimentalist is also using a model to produce his/her experimental results.

Broadly speaking, experimental results can be categorized as either direct or indirect. Direct results are what you could call “WYSIWYG” results: what you measure is the quantity you are interested in, e.g. the contact angle of a liquid, measured as the angle between a drop of the liquid and the substrate surface, or scanning tunneling and atomic force microscopy images of a surface,… Indirect results, on the other hand, require some post-processing of a direct result to obtain the quantity of interest. This post-processing step involves a model that links the direct result to the property of interest. Take, for example, the atomic structure of a material. Here the direct result is the measured X-ray diffraction (XRD) spectrum, while the model and its assumptions are nowadays neatly hidden inside well-performing software. This software fits known crystal models to the provided XRD spectrum to obtain lattice parameters and atomic positions. This means, however, that the obtained result is the best fit that could be found, which is not necessarily the actual atomic structure.
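To make the hidden model step tangible, here is a minimal sketch of the simplest version of such post-processing: extracting a lattice parameter from diffraction peak positions via Bragg’s law, assuming a cubic crystal. The peak positions and Miller indices below are made-up illustrative numbers, and real XRD software performs a far more elaborate full-profile refinement rather than this simple averaging.

```python
import math

WAVELENGTH = 1.5406  # Cu K-alpha X-ray wavelength in angstrom

# Hypothetical measured peaks: assigned (h, k, l) indices and the
# observed 2-theta angle in degrees (illustrative values only).
peaks = [((1, 1, 1), 38.4), ((2, 0, 0), 44.7), ((2, 2, 0), 65.1)]

def lattice_parameter(hkl, two_theta_deg):
    """Invert Bragg's law (lambda = 2 d sin(theta)) for a cubic cell,
    where d = a / sqrt(h^2 + k^2 + l^2), so a = d * sqrt(h^2+k^2+l^2)."""
    h, k, l = hkl
    theta = math.radians(two_theta_deg / 2.0)
    d = WAVELENGTH / (2.0 * math.sin(theta))
    return d * math.sqrt(h * h + k * k + l * l)

# The "fit" here is simply the average over all peaks; this is the
# model-dependent step that turns raw angles into a structural claim.
estimates = [lattice_parameter(hkl, tt) for hkl, tt in peaks]
a_fit = sum(estimates) / len(estimates)
print(f"fitted cubic lattice parameter: {a_fit:.3f} angstrom")
```

Note that every number coming out of this depends on model assumptions (cubic symmetry, correct peak indexing): change those, and the “measured” lattice parameter changes too, even though the raw spectrum is untouched.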
Another important aspect to remember with regard to experimental results is that different samples are truly different systems. For example, a material grown as a single crystal or synthesized as a powder may give subtly different XRD spectra. In a recent paper with Thomas Bogaerts, we investigated how well different models for the MIL-47(V) Metal-Organic Framework (MOF) fit experimental XRD spectra of this material. We found that, depending on which experimental spectrum (single-crystal or powder XRD) we fitted to, a different model was preferred, showing that nature holds multiple truths for the same system. The structural difference between these models is minute, since they entail different spin configurations on the same topology. However, the effort of the more extended fitting procedure performed by Thomas was well worth it, since it provided a new (indirect) method for determining the spin configuration in these rather complex structures, giving access to slightly less wrong models in the future.