First the link to this week's complete list as HTML and as PDF.
Lean et al. is a convincing and well done confirmation of previous assumptions. That said, one can't help noticing that here too statistics is performed as a cargo-cult ritual, without understanding of what one is doing. Their figure 3 is labelled “confidence interval”. A confidence interval is an inference about the population mean from a limited sample. This is nothing of the sort: fig. 3 is a purely descriptive statistic. A finite group of probands was fully measured and the 5 % outliers were removed before drawing the results. That is the inner 95 % of the distribution of the data, something entirely different. Looking more closely, it's probably not even that. All the error bars are completely symmetric. That is no representation of the real distribution but the two-sigma band of a fitted Gaussian, although the true distribution is certain to be decidedly non-Gaussian. On top of that the spread is smallest for the control group and widest for the small subgroups of drop-outs in the different phases. So it is not even the standard deviation but the standard error, a totally nonsensical quantity in this context. We do not have one single standard patient whose value we determine ever more precisely the larger our number of measurements becomes. Only for him would the standard error make sense (and then only if all our measurements were free of any systematic bias). When the performers fail to understand what they are doing and just follow the ritual, the result is bound to be nonsense.
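The difference between the three quantities is easy to make concrete. A minimal sketch with made-up, deliberately skewed data (not the paper's):

```python
# Three different "95 % intervals" computed from the same skewed sample,
# showing why they must not be conflated. Data are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
sample = rng.lognormal(mean=0.0, sigma=0.75, size=200)  # decidedly non-Gaussian

mean = sample.mean()
sd = sample.std(ddof=1)
se = sd / np.sqrt(len(sample))           # standard error of the mean

# 1. Confidence interval for the population mean (an inference, uses the SE):
ci = (mean - 1.96 * se, mean + 1.96 * se)
# 2. Symmetric +/- 2 sigma band of a fitted Gaussian (what symmetric bars show):
gauss = (mean - 2 * sd, mean + 2 * sd)
# 3. Inner 95 % of the data themselves (a descriptive statistic, asymmetric
#    for skewed data):
inner = tuple(np.percentile(sample, [2.5, 97.5]))

print("CI of the mean:     ", ci)     # very narrow
print("fitted +/-2 sigma:  ", gauss)  # symmetric by construction
print("inner 95 % of data: ", inner)  # wide and lopsided
```

For a skewed sample the three intervals differ wildly in both width and symmetry, which is exactly why a symmetric, SE-sized bar tells us nothing about the distribution of the patients.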
What is the “sunk cost fallacy”? To give an example: you have already paid a large irrecoverable deposit on, say, a house when a new and better offer comes up. The rational choice is to compare the remaining cost of the first choice to the total price of the new one. The deposit is gone and irrecoverable and ought not to have any influence on your view of the future. This means that sunk costs only ever come into play when there is new information or when your expectations for the future change. This is not the case with Sweis et al., and so, contrary to what the title implies, their study has nothing whatever to do with sunk costs.
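With hypothetical numbers the comparison becomes explicit; note that the deposit never appears in it:

```python
# Illustrative numbers only. The deposit is sunk, so the rational choice
# compares the remaining cost of house A with the full price of house B.
def rational_choice(remaining_cost_a, total_price_b):
    """Pick the cheaper forward-looking option; sunk costs never enter."""
    return "A" if remaining_cost_a <= total_price_b else "B"

deposit = 30_000           # already paid, irrecoverable -- irrelevant from now on
remaining_cost_a = 270_000
total_price_b = 250_000

print(rational_choice(remaining_cost_a, total_price_b))  # -> B
```

Only if new information changes `remaining_cost_a` or `total_price_b` does the decision change; the fallacy would be letting `deposit` sway it.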
In Sweis et al.'s setup the future is precisely known from the start and never changes, so the only rational choice is to stick with the decision once made. If you do change your choice at all, you ought to do so right at the beginning. Something worth waiting 20 seconds for should certainly be worth a wait of five seconds. The result in their figure 2 is perfectly rational and has nothing whatever to do with sunk costs, much less with any fallacy.
So why change at all, if that is irrational? Neuroscientists like to play gods and expect to be treated as such; their pronouncements are irrefutable truth. That is not how the real world works, the one we are optimally adapted to. When the clues point to a long wait of thirty seconds, it might just as well turn out much shorter. There may also be more factors at work. Cologne tram stops display the waiting time, and that information is reliable. So why stop and then walk on after a short wait? Having just walked up to the stop, a rest may look appealing, while after a short while boredom sets in and walking on becomes much more attractive. Will the probability of walking off at a given remaining waiting time correlate with the time already waited? It certainly will. The readiness to stop and wait at a given waiting time will strongly depend on my current tiredness, the distance to the next stop, and whether that stop is my destination or just another chance to step onto the same tram. These criteria persist over the whole wait. They are sunk costs, but correlating with them is no fallacy.
As a last example: you have already put much effort into refurbishing a run-down house, with a lot still left to do, when the offer of a (seemingly) much better one comes up. The psychologist designing the test expects you to take his word on the “much better” without any doubt. In real life the work already done will have made you know the first house intimately. Your estimate of the remaining work will have become quite reliable over time. The new house may look perfect at first glance, but practical experience has made you aware of the possibility of all kinds of hidden defects. That is also why the many probands preferring 80 dollars now to 100 in the future are no hedonists or short-sighted innumerates. What you get now you have, and who knows whether a promise for the future will be fulfilled or not. Sunk costs all too often correlate with better knowledge.
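The 80-now-versus-100-later preference can be made concrete with an assumed probability that the promise is actually kept (the numbers are illustrative, not taken from any study):

```python
# Preferring 80 dollars in hand over a promised 100 is rational whenever
# the promise is kept with probability below the break-even point.
def expected_value(promised, p_kept):
    """Expected payout of an uncertain future promise."""
    return promised * p_kept

now = 80
promised = 100
break_even = now / promised   # probability at which both options are equal
print(break_even)             # -> 0.8

assumed_p = 0.7               # assumed chance the future payment materialises
print(expected_value(promised, assumed_p))  # -> 70.0, less than the 80 in hand
```

Any distrust of the promise above 20 % makes the “impatient” choice the better bet, and that is before any time discounting at all.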
People and animals are not as stupid and irrational as all-knowing and godlike experimenters make them out to be.
Carbon dioxide is highly politicized and provides a means for acquiring large grants and state subsidies. So an article dealing with long-term carbon storage in the oceans cannot be purely scientific but is bound to be political as well, and should be written in a way that's accessible to lay audiences. Rau et al. do their best towards just that goal by consistently using such common and easy-to-visualize units as EJ/Gt. But their main result can still be extracted from the data given in their introduction:
The total anthropogenic annual carbon dioxide emission is about 41 Gt/a, and current electricity generation comes to about 90 EJ/a worldwide. Their process fixes around 0.15 Gt of CO2 per EJ of electricity consumed. So to offset the total emission they would need to install an extra 270 EJ/a, or three times the total current generating capacity, as renewable, carbon-free electricity. If we lower our goals to the electricity sector alone, we would only have to fix a quarter of the total emission, about 10 Gt/a, requiring 67 EJ/a of electricity. Seeing that a substantial portion of the total electricity supply already comes from nuclear, hydro, and other renewables, this is probably more than the sum of all fossil generation. So why not just substitute that in the first place, instead of running all that generation twice and using one half to suck up the emissions of the other half? By the way, 0.15 Gt/EJ comes out as 540 g/kWh, substantially less than a coal-fired plant emits. Citing that single number in the common unit practitioners are used to would already have told us all we need to know.
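The arithmetic above can be checked in a few lines (values as quoted in the text, rounded as there):

```python
# Back-of-the-envelope check of the numbers quoted from Rau et al.'s
# introduction. All inputs are the rounded values given in the text.
EMISSIONS_GT_PER_A = 41.0    # total anthropogenic CO2 emissions, Gt/a
ELECTRICITY_EJ_PER_A = 90.0  # current worldwide electricity generation, EJ/a
FIXATION_GT_PER_EJ = 0.15    # CO2 fixed per EJ of electricity consumed

# Electricity needed to offset all emissions:
needed_ej = EMISSIONS_GT_PER_A / FIXATION_GT_PER_EJ
print(round(needed_ej))                            # -> 273, i.e. about 270 EJ/a
print(round(needed_ej / ELECTRICITY_EJ_PER_A, 1))  # -> 3.0 times current capacity

# Offsetting only the electricity sector (about a quarter of emissions):
quarter_ej = (EMISSIONS_GT_PER_A / 4) / FIXATION_GT_PER_EJ
print(round(quarter_ej))   # -> 68 (the text rounds the quarter to 10 Gt/a, ~67)

# Converting 0.15 Gt/EJ into the unit practitioners know, g/kWh:
g_per_ej = FIXATION_GT_PER_EJ * 1e15   # Gt -> g
kwh_per_ej = 1e18 / 3.6e6              # J per EJ over J per kWh
print(round(g_per_ej / kwh_per_ej))    # -> 540
```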
There’s more in the details. The headline tries to sell the process as one producing hydrogen. Their first process, 1a, uses up all its hydrogen to neutralize the simultaneously produced, highly toxic chlorine gas. The recombination can be used to produce some electricity, but instead of offsetting that against their gross use, they list the full calorific value of the interim hydrogen as a gain. Their second process, 1b, also frees chlorine without telling us where it goes. The third one, 1c, is the only one with a real net hydrogen production, but it is also the least well explained of the three. The only thing this process is ever going to suck up in the total net balance is a huge amount of taxpayer subsidy, and the article turns out to be well designed towards just that goal.
Is the current drought in the Sahel proof of anthropogenic influence, as Carré et al. try to imply? True, the current amplitude is the highest in 1600 years, but the steep rise started well before 1800 and the curve shows no sign of any influence of industrialization. If anything, it may be passing its current peak, and the last 15 years of rising precipitation need not be a short-term fluctuation, as they suppose, but may signal the beginning of a return to the long-term average.
What’s more important in their result is the inverse relationship of temperature and humidity across the Medieval Climate Anomaly (MCA) and the Little Ice Age (LIA). This is the opposite of the long-term trend, where both move in step. Both the MCA and the LIA are often said to be purely European in scope, with other parts of the world often exhibiting the opposite trend in temperatures. The Senegalese sea surface temperature, for one, does not follow the same course.