empiricism
noun
em·pir·i·cism | im-ˈpir-ə-ˌsi-zəm, em-
1 a : a former school of medical practice founded on experience without the aid of science or theory
  b : quackery, charlatanry
2 a : the practice of relying on observation and experiment especially in the natural sciences
  b : a tenet arrived at empirically
3 : a theory that all knowledge originates in experience