The recent NCC paper of that name prompted a comment. Plausible? Yes, certainly. This idea has been doing the rounds for some time. The basic premise is that the response to radiative forcing is not quite what would be predicted from a simple energy balance model with a fixed sensitivity, in which the radiative feedback depends solely on the global mean temperature anomaly; instead, the pattern of warming also affects the radiative feedback. In particular, during a warming scenario the feedback is a bit higher (and therefore the effective sensitivity a bit lower) in the transient phase than at the ultimate warmer equilibrium. This happens in most (almost all) climate models, and it's certainly plausible that it applies to the real climate system too. It is the major weakness of the regression-based "Gregory method" for estimating the equilibrium sensitivity of models, in which extrapolation of a warming segment tends to underestimate the equilibrium result. Here's a typical example of that from a paper by Maria Rugenstein:
The regression line based on the first 150 years predicts an equilibrium response of 5.4C (for 4xCO2) but the warming actually continues past 6.2C (and who knows how much further it may continue). There are numerous pics of this sort of thing floating around with multiple models showing qualitatively similar results under a range of scenarios, so this is not just an artefact of the specific experiment shown here.
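The Gregory method regresses the top-of-atmosphere imbalance N against the temperature anomaly T over a warming run and extrapolates the line to N = 0. The curvature that defeats it can be mimicked with a toy two-layer energy balance model in which deep-ocean heat uptake has an "efficacy" greater than one, a standard idealisation of the pattern effect. This is only a sketch: the model and all parameter values below are illustrative assumptions, not taken from the paper or the figure.

```python
# Toy two-layer energy balance model with heat-uptake efficacy (eps > 1),
# an idealised stand-in for the pattern effect. Parameters are illustrative.
import numpy as np

F = 7.4       # 4xCO2 forcing, W/m^2 (assumed value)
lam = 1.2     # equilibrium feedback, W/m^2/K
C, C_d = 8.0, 100.0   # surface / deep-ocean heat capacities, W yr m^-2 K^-1
gam = 0.7     # coupling to the deep ocean, W/m^2/K
eps = 1.3     # heat-uptake efficacy; eps > 1 curves the Gregory plot

def run(years, dt=0.1):
    """Integrate the model, returning annual-mean-ish (T, N) samples."""
    T = Td = 0.0
    out = []
    per_year = int(1 / dt)
    for i in range(int(years / dt)):
        H = gam * (T - Td)                  # deep-ocean heat uptake
        N = F - lam * T - (eps - 1.0) * H   # top-of-atmosphere imbalance
        T += dt * (F - lam * T - eps * H) / C
        Td += dt * H / C_d
        if (i + 1) % per_year == 0:
            out.append((T, N))
    return np.array(out)

data = run(150)                              # first 150 years only
slope, intercept = np.polyfit(data[:, 0], data[:, 1], 1)
ecs_regression = -intercept / slope          # x-intercept of the Gregory line
ecs_true = F / lam                           # actual equilibrium warming

print(f"Gregory estimate: {ecs_regression:.2f} K, true: {ecs_true:.2f} K")
```

Because the early, fast-warming years sit on a steeper branch of the N-T trajectory than the later slow adjustment, the straight-line fit over the first 150 years crosses N = 0 short of the true equilibrium, qualitatively reproducing the underestimate seen in the figure.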
Kyle Armour has also done a fair amount of research looking at the way regional warming patterns combine with regional feedbacks to generate this sort of behaviour (eg here). This new work tries to quantify the effect in order to interpret the observed 20th century temperature changes more precisely in terms of the equilibrium response. For some reason I can't get through the paywall right now to check the details, but in principle it seems entirely reasonable.