This past weekend, I read a line that struck me in an article about how the Houston Astros combined statistical analysis with human scouting to build a machine-learning system -- one a journalist correctly predicted (in 2014) would make them World Series champions in 2017. The quote about the "black box" the Astros created was simply:
"If you knew how it worked, it wouldn't."
The psychology and incentives surrounding transparent -- i.e., understandable and understood -- systems of metrics compromise the metrics' effectiveness. In scholarly publishing, the impact factor is that metric. It has been shown to drive people to fabricate data, distort results, and cheat to gain an advantage. It has been used to incentivize scientists both indirectly (through tenure, grants, and so forth) and directly (through payments for publication in high-impact journals).
The quote could be mistaken for Goodhart's Law, which states that "when a measure becomes a target, it ceases to be a good measure," but it is essentially different. In this example, the Astros share the same goal as every other team -- win more games, score more runs, defeat more opponents -- so the measure (wins, runs, losses -- take your pick) is not the issue. Wins are still a good measure of baseball success, even if they are also a target.
Baseball is competitive, and there is only one champion. The zero-sum nature of each season (I win, you lose) is fundamentally different from the situation in scholarly publishing. A citation to your paper doesn't preclude a citation to mine, which leads to abuses such as editors, senior authors, and reviewers urging -- or even requiring -- authors to cite their papers to boost an h-index or a journal's impact factor. Because how the impact factor works is well understood and transparent (to use the modern term), it is ripe for abuse and distortion. Goodhart's Law is in effect.
What's interesting about the quote heading this post is that the opacity of the Astros' approach defeats this psychology and evades Goodhart. The Astros are certainly pursuing one or more measures -- likely a single calculated measure, since they use recursions -- but nobody else knows what it is with any certainty. For them, it is a good target and a good measure. It has placed them among the elite of the sport. It delivered a championship. Other teams have other measures. But if every team knew how the Astros' approach worked, it wouldn't.
This is the secret of the high-impact journals. They have a way of remaining in the elite: a flywheel of reputation they know how to keep charged and spinning, internal measures they don't share, a system of sabermetrics all their own.
Maybe Goodhart's Law isn't quite as simple as we think. Maybe metrics that aren't transparent can evade it, and maybe these private metrics feed into a common metric that isn't as all-encompassing or representative as we might believe. Maybe each journal is like a competitive team, with its own approach, culture, and measures. Maybe we only know the standings, but don't really know the games and teams as well as we think.