An analysis of fossil imprints of ancient raindrops suggests that the density of the atmosphere 2.7 billion years ago was much the same as that today. This result casts fresh light on a long-standing palaeoclimate paradox.
A team of physicists says it can use lasers to see whether the universe stores information like a hologram. But some key theorists think the test won't fly.
Not everyone cheers the effort, however. In fact, Leonard Susskind, a theorist at Stanford University in Palo Alto, California, and co-inventor of the holographic principle, says the experiment has nothing to do with his brainchild. “The idea that this tests anything of interest is silly,” he says, before refusing to elaborate and abruptly hanging up the phone. Others say they worry that the experiment will give quantum-gravity research a bad name.
Bousso notes that a premise of special relativity called Lorentz invariance says the rules of physics should be the same for all observers, regardless of how they are moving relative to one another. The holographic principle maintains Lorentz invariance, Bousso says. But Hogan's uncertainty formula does not, he argues: An observer standing in the lab and another zipping past would not agree on how much an interferometer's beam splitter jitters. So Hogan's uncertainty relationship cannot follow from the holographic principle, Bousso argues.
s336-0245-Supplement.pdf
Skilled readers use information about which letters are where in a word (orthographic information) in order to access the sounds and meanings of printed words. We asked whether efficient processing of orthographic information could be achieved in the absence of prior language knowledge. To do so, we trained baboons to discriminate English words from nonsense combinations of letters that resembled real words. The results revealed that the baboons were using orthographic information in order to efficiently discriminate words from letter strings that were not words. Our results demonstrate that basic orthographic processing skills can be acquired in the absence of preexisting linguistic representations.
s336-0095-Supplement.pdf
Behavioral economic studies involving limited numbers of choices have provided key insights into neural decision-making mechanisms. By contrast, animals' foraging choices arise in the context of sequences of encounters with prey or food. On each encounter, the animal chooses whether to engage or, if the environment is sufficiently rich, to search elsewhere. The cost of foraging is also critical. We demonstrate that humans can alternate between two modes of choice, comparative decision-making and foraging, depending on distinct neural mechanisms in ventromedial prefrontal cortex (vmPFC) and anterior cingulate cortex (ACC) using distinct reference frames; in ACC, choice variables are represented in invariant reference to foraging or searching for alternatives. Whereas vmPFC encodes values of specific well-defined options, ACC encodes the average value of the foraging environment and cost of foraging.
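The engage-or-search logic described here lends itself to a toy illustration. The sketch below is a hypothetical rule, not the model reported in the study: it engages an encountered option only when the option's value beats the average value of the environment net of the cost of further search. Every name and number in it is an assumption chosen for illustration.

# Minimal sketch of an engage-or-search foraging rule (hypothetical illustration,
# not the model used in the study). The agent engages with an encountered option
# when its value beats what the environment is expected to offer, net of search cost.

def engage(option_value, environment_values, search_cost):
    """Return True to engage the current option, False to keep searching."""
    average_environment_value = sum(environment_values) / len(environment_values)
    return option_value >= average_environment_value - search_cost

# Example: a rich environment with cheap search makes the agent picky.
environment = [3.0, 5.0, 8.0, 6.0]  # values of the alternatives still out there
print(engage(option_value=4.0, environment_values=environment, search_cost=0.5))  # False
print(engage(option_value=4.0, environment_values=environment, search_cost=3.0))  # True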
Baboons can distinguish between written words and nonwords.
The authors developed a task to assess whether baboons can successfully perform orthographic processing. To receive a food pellet reward, the baboons had to discriminate four-letter English written words from nonwords of the same length. Nonwords were constructed of letter combinations that were rare in the sample of English words. The baboons were able to learn not only a specific list of words, but also to predict whether a new letter sequence was a real English word or not. Presumably, the baboons learned the statistics of letter combinations in the four-letter English words and used this information to perform the task. Because baboons do not speak any human language, this finding strongly suggests that the speech-based model of reading may be at best incomplete and possibly wrong.
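To make "learning the statistics of letter combinations" concrete, here is a minimal sketch of a bigram-frequency score for four-letter strings. The word list and the scoring rule are illustrative assumptions, not the training procedure used with the baboons.

# Toy bigram-statistics score for word-like vs. nonword-like four-letter strings.
# Illustrative only: the word list and scoring are assumptions, not the procedure
# used in the baboon experiments.

from collections import Counter

WORDS = ["done", "land", "them", "vast", "kite", "wasp", "stem", "then"]  # hypothetical training words

def bigram_counts(words):
    """Count adjacent letter pairs across the training words."""
    counts = Counter()
    for w in words:
        for a, b in zip(w, w[1:]):
            counts[a + b] += 1
    return counts

BIGRAMS = bigram_counts(WORDS)

def wordlike_score(string):
    """Average frequency of the string's bigrams in the training words."""
    pairs = [a + b for a, b in zip(string, string[1:])]
    return sum(BIGRAMS[p] for p in pairs) / len(pairs)

# Strings built from common bigrams score higher than strings built from rare ones.
print(wordlike_score("tend"))  # shares bigrams with the training words
print(wordlike_score("xqzk"))  # rare combinations -> score near 0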
n484-0359-Supplement.pdf
According to the 'Faint Young Sun' paradox, during the late Archaean eon a Sun approximately 20 % dimmer warmed the early Earth such that it had liquid water and a clement climate1. Explanations for this phenomenon have invoked a denser atmosphere that provided warmth by nitrogen pressure broadening1 or enhanced greenhouse gas concentrations2. Such solutions are allowed by geochemical studies and numerical investigations that place approximate concentration limits on Archaean atmospheric gases, including methane, carbon dioxide and oxygen2-7. But no field data constraining ground-level air density and barometric pressure have been reported, leaving the plausibility of these various hypotheses in doubt. Here we show that raindrop imprints in tuffs of the Ventersdorp Supergroup, South Africa, constrain surface air density 2.7 billion years ago to less than twice modern levels. We interpret the raindrop fossils using experiments in which water droplets of known size fall at terminal velocity into fresh and weathered volcanic ash, thus defining a relationship between imprint size and raindrop impact momentum. Fragmentation following raindrop flattening limits raindrop size to a maximum value independent of air density, whereas raindrop terminal velocity varies as the inverse of the square root of air density. If the Archaean raindrops reached the modern maximum measured size, air density must have been less than 2.3 kg m⁻³, compared to today's 1.2 kg m⁻³, but because such drops rarely occur, air density was more probably below 1.3 kg m⁻³. The upper estimate for air density renders the pressure broadening explanation1 possible, but it is improbable under the likely lower estimates. Our results also disallow the extreme CO2 levels required for hot Archaean climates8.
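The inverse-square-root scaling is what turns a fossil imprint momentum plus an assumed maximum drop size into an upper bound on air density. The sketch below spells out that arithmetic; apart from today's air density of 1.2 kg m⁻³ quoted above, all numbers are placeholders chosen only to illustrate the calculation.

# Sketch of the inverse-square-root scaling used to bound ancient air density
# from raindrop imprint momentum. Except for today's air density (1.2 kg/m^3,
# quoted above), the numbers are placeholders for illustration.

RHO_MODERN = 1.2  # kg/m^3, present-day surface air density

def terminal_velocity(v_modern, rho_air):
    """Terminal velocity of a fixed-size drop scales as 1/sqrt(air density)."""
    return v_modern * (RHO_MODERN / rho_air) ** 0.5

def density_bound(drop_mass, imprint_momentum, v_modern):
    """
    Upper bound on air density: the fossil fixes the impact momentum p = m * v,
    so v = p / m, and rho = rho_modern * (v_modern / v)^2.
    A larger assumed drop mass gives a slower required velocity and hence a
    looser (higher) density bound, which is why the maximum drop size matters.
    """
    v_required = imprint_momentum / drop_mass
    return RHO_MODERN * (v_modern / v_required) ** 2

# Illustrative numbers only: a ~3.8 mm drop (~2.9e-5 kg) falling at ~9 m/s today.
print(terminal_velocity(v_modern=9.0, rho_air=2.3))  # same drop falls more slowly in denser air
print(density_bound(drop_mass=2.9e-5, imprint_momentum=2.6e-4, v_modern=9.0))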
Antiquity recently published the first continuous Glacial radiocarbon calibration curve (van Andel 1998), based on the geomagnetic extrapolations of Laj et al. (1996), but it was challenged by Van der Plicht (1999) in the immediately succeeding volume of Antiquity. Van der Plicht's reply urges caution, advising prehistorians that radiocarbon calibration for the Last Glacial period is unreliable "for the purposes of prehistoric and environmental calibration" (Van der Plicht 1999, 119) until a number of issues concerning the time-scales of the Greenland GRIP and GISP2 ice-cores and the Japanese Lake Suigetsu varve chronology published by Kitagawa & Van der Plicht (1998a; 1998b) have been resolved. This response to Van der Plicht (1999) documents that the data sets required for such purposes are already available today and, with possibilities to test each subset of data, do indeed allow first approximations of Calendric Age-Conversions of Last Glacial radiocarbon data.
Key-words: radiocarbon calibration, Calendric Age-Conversions, Last Glacial, contextual palaeoclimate archives, deep-sea records, ice-cores
Standardized criteria will help to reliably distinguish marks made by hominid tools from feeding traces of other animals.