Contribution to Raymond Piccoli’s opinion piece
The initiative taken by EUROPEAN SCIENTIST to carry out a “European Tour” of research funding is timely. It could contribute to a better understanding of research problems and, where they exist (!), of the policies implemented to address them.
Scientific publication is not an end in itself. It should not be confused with an “advertising” approach, let alone with “comparative advertising” in a competitive context. Its “content” is supposed to contribute to the advancement of knowledge, in the case of cognitive or fundamental research, and to the improvement of the human condition, in the case of what we call applied research.
While Raymond Piccoli denounces fraudulent scientific practices such as plagiarism and data manipulation, the primary requirements that form the basis of the honest, effective, productive and inventive science he goes on to discuss are not clearly stated.
The appearance of the “scientific publication bubble” and its worryingly harmful effects are noxious by-products of the current system, which encourages an explosion in the number of scientific publications, publications that appear to be the only means of evaluating researchers.
It is the wobbly tripod of “researcher funding, publication, and evaluation” that must be examined as a whole, so that urgent action can be taken, with evaluation methodology as the first priority.
The process of examination of scientific integrity begun over the past ten years by Cédric Villani, Claudie Haigneré, Laurent de Gosse, Jean-Pierre Alix, Pierre Corvol and several others has been based on the following observations:
- Research is the main engine of progress.
- “Public opinion” puts its faith in research but distrusts researchers.
- It demands certainty and allows for neither hesitation nor doubt.
- Attacks on scientific integrity, in which the media are now taking an interest, risk undermining confidence in research, which is, moreover, becoming a priority target for fake news.
The acceleration of progress in innovative and unexpected fields and the explosion in the cost of essential material and intangible investment have been root causes of the major revolutions in French research discussed by Raymond Piccoli, revolutions which are not confined to France.
The concepts of “critical mass” and of a “definition of scientific objectives” are essential requirements. Critical mass does not necessarily imply “an approach based on reduction in numbers” but rather “an approach based on cooperation and mutual support”. The essential work of setting priorities, unless one takes the risky step of “sprinkling” financial resources thinly across the board, presupposes choices that are objective and transparent, in short, democratic.
An example of this transparency is OPECST, the Office Parlementaire d’Evaluation des Choix Scientifiques et Technologiques [Parliamentary Office for Scientific and Technological Assessment], created by an act of the French parliament on 8 July 1983 and mandated to inform parliament, and through parliament public opinion, about the consequences of the choices made.
The creation of the first “Cancer Plan” in 2003 and the initiatives that followed it showed both the benefits and the limits of a clear definition of targeted objectives and of the coherent deployment of the means needed to reach them, with the participation of researchers, healthcare professionals and “civil society” through the voluntary sector.
The challenges of research no longer lie solely at the national and European levels. Calls for tender, or better, requests for project proposals, can only be “operational” at the international level, European at the very least, so that national and European calls are considered alongside each other to maximise their value.
What is disastrous, and rightly denounced by Raymond Piccoli, are the lumberingly slow, complex and “nit-picking” administrative procedures that discourage and hinder researchers, to the sole benefit of certain Brussels agencies.
These considerations are not “off topic”. They are intended to situate the “scientific publication bubble” within the scientific research ecosystem. The starting point must be the status of the researcher, whose basic principles were defined by UNESCO’s Recommendation on the Status of Scientific Researchers in 1974 and by the European Charter for Researchers, and more recently in France by the National Charter of Scientific Integrity.
As Raymond Piccoli has it, “the core of the mechanism” is not to be found in the funding-publishing relationship, but in the evaluation of the research and the researcher. What is perverse and disastrous is, as he writes: “The researcher who publishes, publishes and publishes again is now considered a good researcher” and, further on: “I will never fail to be surprised by this breathtaking spectacle: the current evaluation criteria are the opposite of those of an effective, productive, inventive honest science” … “It is a real problem because there is no possible metric for quality”.
This problem is so entrenched, and the lack of response in the face of rising scientific fraud so distressing, that we resort to denial or pretend that everything is fine.
If, in response to the European Scientist initiative, “decision-makers and scientists” could agree on qualitative evaluation criteria, the scientific community would be eternally grateful. For the time being, as well as applying the criterion of reproducibility, we need to prevent, detect, alert and punish.
The training of young researchers, respect for the research charter and for ethics, and possibly the swearing of an oath, as has already been implemented in some places, place the researcher’s remit well above commercial concerns.
The theory of participatory research or contributory science may have its limitations. But it is built on values far removed from commercial gain, and for that reason alone it deserves to be explored.