Assembling a dream team of international researchers could offer a useful snapshot of the UK’s strength in chemical engineering, says Mark Peplow
‘Invited speaker at an imaginary conference’ would seem a rather nebulous honour, ranking just below ‘honorary unicorn wrangler’. But we must take our plaudits where we can find them.
It must be no small thrill, then, to discover that you could soon be selected to appear at a virtual world congress (VWC). And there’s no need to slave over slides or hunt for budget travel options. In fact, you might not even know you attended.
The congress is all part of an international benchmarking exercise being conducted by the Engineering and Physical Sciences Research Council (EPSRC) to assess the UK’s reputation in chemical engineering. Last year, the EPSRC consulted with the Royal Society of Chemistry, the Royal Academy of Engineering and the Institution of Chemical Engineers to carve chemical engineering into 40 sub-fields, from catalysis to carbon capture, and then invited chemists and chemical engineers to nominate session organisers for each of the areas. Those nominated will then fill the programme of the fictitious congress with the cream of the world’s researchers. The results, out in summer, should show where UK science excels, and how it compares with other nations.
International assessments traditionally blend the subjective views of an expert panel with bibliometric data (such as citation rates) to profile a nation’s research. But these sprawling reports tend to eat up time and money, so the council is trialling this ‘fantasy football’ approach as a potentially quicker and cheaper option.
The VWC method was first tested by the US National Academies in the late 1990s, as it tried to profile the country’s research in reliable, objective ways. It featured in a series of US benchmarking exercises in 2007, which assessed chemistry, chemical engineering and mechanical engineering, but has never been tried in the UK before.
In principle, the VWC offers several advantages over conventional approaches. Relying on tallies of researchers, publications and funding is never enough to gauge a nation’s leadership in science – not least because it can be extremely difficult to compile comparable information from other countries. Although reputation is hard to quantify, session organisers should know talent when they see it.
A researcher’s reputation can spread through the community much faster than citation metrics can keep up, so canvassing expert views offers a more immediate snapshot of the rising stars, as well as the leaders in a field. This is particularly true of chemical engineering, where a lot of research is reported in conference proceedings that are not always picked up by bibliometrics.
Also, chemical engineers often work in other departments – chemistry or engineering, for example – which can make it difficult to tease apart their contributions to the field. The imaginary conference allows the organisers to precisely tailor which research themes they want to assess, without relying on such artificial designations.
And in contrast to bibliometric assessments, which are increasingly criticised by researchers as an inadequate measure of research quality, an exercise driven by scientists’ own recommendations could have more legitimacy in the community.
Of course, recommendations are subjective, and the VWC could be skewed by the same bias that showed up in the US congresses back in 2007: sessions organised by US chemists typically contained about 15% more American speakers than those organised by non-US chemists. However, the EPSRC says it will mitigate the risk of localism by using statistical analysis to highlight any bias.
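The EPSRC has not said exactly what that analysis will involve, but as a rough illustration of how such a check might work (the session data below are invented purely for this sketch, not drawn from the exercise), one could compare the share of home-country speakers in sessions picked by US-based organisers with the share in everyone else’s sessions, and ask how often a gap of that size would arise by chance:

```python
# Illustrative sketch only: one way an analyst might test whether sessions
# organised by home-country researchers favour home-country speakers.
# The session data below are invented for the example.
import random
from statistics import mean

random.seed(0)

# Each session: (organiser_is_us, fraction_of_us_speakers)
sessions = [
    (True, 0.60), (True, 0.55), (True, 0.50), (True, 0.45),
    (False, 0.40), (False, 0.35), (False, 0.45), (False, 0.30),
]

us_organised = [f for is_us, f in sessions if is_us]
other = [f for is_us, f in sessions if not is_us]
observed_gap = mean(us_organised) - mean(other)

# Permutation test: shuffle the organiser labels many times and see how
# often a gap at least this large arises by chance alone.
fractions = [f for _, f in sessions]
n_us = len(us_organised)
extreme = 0
n_trials = 10_000
for _ in range(n_trials):
    random.shuffle(fractions)
    gap = mean(fractions[:n_us]) - mean(fractions[n_us:])
    if gap >= observed_gap:
        extreme += 1

p_value = extreme / n_trials
print(f"Observed gap: {observed_gap:.2f}, one-sided p ≈ {p_value:.3f}")
```

A small p-value here would flag sessions whose speaker lists look suspiciously parochial; whatever method the EPSRC actually uses, the principle is the same.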
The council is at pains to point out that this is a trial run, although it plans to apply the method to other disciplines if the exercise is judged a success. And the VWC will not completely displace more established assessment methods – an expert panel will still interpret the list of delegates, and the results will be complemented by a bibliometric survey.
Oddly enough, though, that list may remain secret. The EPSRC says that it is up to the steering panel to decide whether to publish the complete list of speakers, but reading between the lines it seems unlikely that they will.
I can understand their reticence. Although the VWC is not meant to rate individual researchers, few could resist the temptation to check whether their peers had given them a slot – and those left out in the cold might feel miffed.
But the community should be able to scrutinise in full any assessment of its collective competence, even if it risks causing a little friction. A completely transparent selection process is more likely to win the community’s support for what appears to be a promising assessment technique. I also doubt that the list of speakers could be kept secret for long anyway, and it would be better for the EPSRC to release the names rather than have them leaked.
Even if a few do feel slighted for not being picked, many more are likely to be flattered by their inclusion in the VWC’s programme. After all, who wouldn’t want to know they were part of a dream team?