My name is Matteo Colombo. I am an Assistant Professor in the Tilburg Center for Logic, Ethics and Philosophy of Science, and in the Department of Philosophy at Tilburg University in The Netherlands. My work is mostly in the philosophy of the cognitive sciences, in the philosophy of science, and in moral psychology. I am interested in questions about evidence and explanation, and in how resources from the sciences can help address philosophical puzzles about mind and moral behaviour. I pursue these interests by combining experimental and non-experimental methods.
In a recent experimental study, Leandra Bucher, Yoel Inbar, and I investigated how moral value can bias explanatory judgement. Our general goal was to assess the empirical adequacy of a popular view in the philosophy of science, which contends that scientific reasoning is objective to the extent that the appraisal of scientific hypotheses is not influenced by moral, political, or economic values, but only by the available evidence. In particular, Lea, Yoel, and I wanted to understand how the prior credibility of scientific hypotheses, their perceived moral offensiveness, and the motivation to be accurate in judging their explanatory power affect one’s assessment of a scientific report.
We asked our participants to express their opinions on the quality of a series of scientific reports. Imagine, for example, that you are asked to evaluate a report that provides evidence that being raised by a same-sex couple increases the chances of suffering from certain developmental disorders. In itself, this hypothesis carries no apparent moral value and can, in principle, be objectively tested. Although you may find it morally offensive, you are sure that your personal values will not influence your appraisal of the quality of the report. You are also confident that a monetary incentive to assess properly how the evidence bears on the hypothesis will make no difference to your considered judgement. But will these convictions be borne out in practice? Will your personal values really play no significant role in your assessment of the evidence?
In our experiments, we found that the more morally offensive participants perceived the conclusion of a scientific report to be, the less credible and rigorous they judged the report. Furthermore, a monetary incentive to assess the evidence accurately did not mitigate the impact of moral offensiveness on explanatory judgement.
Our findings indicate that people’s judgements about scientific results are often imbued with moral value. Besides suggesting that, as a matter of psychological fact, the ideal of a value-free science may not be achievable, this conclusion raises important questions about the attainment of scientific knowledge in democratic societies. How can scientific evidence be conveyed to the public more effectively? What drives public controversy over issues such as climate change, vaccination, and genetically modified organisms? Does the prevailing political and moral homogeneity of many present-day scientific communities hinder or systematically bias their pursuit of knowledge?