the fallacy of obviousness: a new interpretation of a classic psychology experiment will change your view of perception, judgment – even human nature
teppo felin 2018
https://aeon.co/essays/are-humans-really-blind-to-the-gorilla-on-the-basketball-court
“computers and algorithms – even the most sophisticated ones – cannot address the fallacy of obviousness. Put differently, they can never know what might be relevant.”
“while Kahneman calls for large-scale replications of priming studies, the argument here is not that we need more studies or data to verify that people indeed miss blatantly obvious gorillas. Instead, we need better interpretation and better theories. After all, additional studies and data would undoubtedly verify the gorilla finding. But the more important issue is the interpretation of the gorilla finding and the experimental construction of the finding.
Having a ‘blind to the obvious’-baseline assumption about human nature biases the types of experiments that are crafted by scientists in the first place, what scientists go on to observe and look for, and how they interpret what they find. And importantly, the assumption of human blindness or bias makes scientists themselves blind to the other, more positive aspects of human cognition and nature. Thus the problem is more upstream, in the set of a priori questions and theories that science has. Namely, if our theory focuses on some aspect of human blindness and bias, and if we construct lab experiments to prove it (or look for naturally occurring instances of it), then yes, we are likely to find evidence.”
“Any well-intentioned efforts to correct human blindness also need to recognise that making these corrections comes with costs or tradeoffs. The trivial point here is that if we want to correct for the blindness of not spotting something (such as the gorilla), then this correction comes at the cost of attending to any number of other obvious things (eg, number of basketball passes). But the far more important point is that we also need to recognise and investigate the remarkable human capacities for generating questions and theories that direct our awareness and observations in the first place. Bias and blindness-obsessed studies will never get us to this vital recognition. In other words, continuing to construct experiments that look for and show bias and blindness – and adding them to the very large and growing list of cognitive biases and forms of blindness – will always leave out the remarkable capacities of humans to generate questions and theories. At its worst, the fascination with blindness and bias flattens humans, and science, to a morally dubious game of ‘gotcha’.”
rationality, perception, and the all-seeing eye
teppo felin et al. 2016
https://doi.org/10.3758/s13423-016-1198-z
the theory-based view: economic actors as theorists
teppo felin and todd r. zenger 2017
https://doi.org/10.1287/stsc.2017.0048
mind, rationality, and cognition: an interdisciplinary debate
nick chater et al. 2017
https://doi.org/10.3758/s13423-017-1333-5
scientists reveal the number of times you’re actually conscious each minute: spoiler: it’s not very often (and that’s a good thing)
emma betuel 2018
https://www.inverse.com/article/48300-why-is-it-hard-to-focus-research-humans
Four times every second, Princeton Neuroscience Institute researcher Ian Fiebelkorn, Ph.D., explains to Inverse, the brain stops focusing on the task at hand. That’s about 240 times a minute.
“The brain is wired to be somewhat distractible,” he says. “We focus in bursts, and between those bursts we have these periods of distractibility, that’s when the brain seems to check in on the rest of the environment outside to see if there’s something important going on elsewhere. These rhythms are affecting our behavior all the time.”
To understand these “rhythms of attention,” Fiebelkorn suggests imagining standing in Times Square on New Year’s Eve, surrounded by people, cars, and music. The scene presents far more sensory information than one human brain is capable of sorting through, and so, the brain deals with all of the information in two ways. First, it focuses on a single point of interest: the street corner where you might meet a friend, or Ryan Seacrest combing the crowd for interviews. Like a filmstrip, the brain takes snapshots of these moments and pieces them together into a cohesive narrative, or “perceptual cycle.”
We experience that moment as continuous, but in reality, we’ve only sampled certain elements of the environment around us. It feels continuous because our brains have filled in the gaps for us, explains Berkeley’s Knight Lab researcher and first author Randolph Helfrich, Ph.D., to Inverse.
“I think it’s more a philosophical problem than it is a scientific problem,” he says. “Because when we look at brain data we see a pattern that waxes and wanes, they’re never constant and stable. Everyone perceives the world as continuous and coherent, but the real tricky part is, how does the brain do that?”
Modern society tends to think of distractibility as a bad thing, Fiebelkorn says, but it might have offered early humans and our distant ancestors a huge evolutionary advantage. The brain’s natural tendency to “zoom out” and become distracted by the environment, even for just a few milliseconds, could have allowed them the time to detect the presence of a threat and react accordingly.
“Say you spot a shiny red apple in a tree and you focus on that,” Fiebelkorn says. “You’re going to go and pick that apple, but you’ll want to know if there’s any larger animal with bigger teeth going after the same apple. So, having these windows of distractibility helps you to detect these stimuli you might otherwise miss.”
The findings of these two papers in conjunction are powerful evidence that these rhythms are highly adaptive and have been preserved in humans and their relative species for millions of years. This hypothesis is based on the fact that the teams noticed nearly identical neural patterns of attention (the “rhythms of distractibility”) in both the humans and macaques. For a trait to still be so similar in species that diverged from a common ancestor millions of years ago, it very likely must provide a useful evolutionary advantage that has been preserved by natural selection.
neural mechanisms of sustained attention are rhythmic
randolph f. helfrich et al. 2018
https://doi.org/10.1016/j.neuron.2018.07.032
“Our subjective experience of the visual world is an illusion,” said Sabine Kastner, a professor of psychology and the Princeton Neuroscience Institute (PNI). “Perception is discontinuous, going rhythmically through short time windows when we can perceive more or less.”
The researchers use different metaphors to describe this throb of attention, including a spotlight that waxes and wanes in its intensity. Four times per second — once every 250 milliseconds — the spotlight dims and the house lights come up. Instead of focusing on the action “onstage,” your brain takes in everything else around you, say the scientists.
Their work appears as a set of back-to-back papers in the Aug. 22 issue of Neuron; one paper focuses on human research subjects, the other on macaque monkeys.
“The question is: How can something that varies in time support our seemingly continuous perception of the world?” said Berkeley’s Randolph Helfrich, first author on the human-focused paper. “There are only two options: Is the data wrong, or is our understanding of our perception biased? Our research shows that it’s the latter. Our brains fuse our perceptions into a coherent movie — we don’t experience the gaps.”
Perception doesn’t flicker on and off, the researchers emphasized, but four times per second it cycles between periods of maximum focus and periods of a broader situational awareness.
“Every 250 milliseconds, you have an opportunity to switch attention,” said Ian Fiebelkorn, an associate research scholar in PNI and the first author on the macaque-focused paper. You won’t necessarily shift your focus to a new subject, he said, but your brain has a chance to re-examine your priorities and decide if it wants to.
Brain rhythms have been known for almost a century, since electroencephalograms — better known as EEGs — were invented in 1924. “But we didn’t really understand what these rhythms are for,” said Kastner, who was the senior author on both papers. “We can now link brain rhythms for the first time to our behavior, on a moment-to-moment basis. ... This is a very surprising finding, more since these rhythmic processes are evolutionarily old — we find them in non-human primates as well as in our own species.”
This pulsing attention must present an evolutionary advantage, the researchers suggest, perhaps because focusing too intently on one subject could allow a threat to catch us by surprise.
“Attention is fluid, and you want it to be fluid,” said Fiebelkorn. “You don’t want to be over-locked on anything. It seems like it’s an evolutionary advantage to have these windows of opportunity where you’re checking in with your environment.”
“It’s an elegant way to allocate brain resources — to sample the environment and not have any lapses,” said Robert Knight, a professor of psychology and neuroscience at Berkeley and a co-author on the human-focused paper.
Kastner’s lab focuses on macaque research, so she reached out to Knight’s lab, which does similar studies on humans. The resulting papers are unprecedented, Knight said.
“This is cross-species validation of a fundamental aspect of human behavior,” he said. “I have not seen any back-to-back human and monkey papers appear anywhere ... and these are in the same issue of Neuron, a preeminent journal.”
Fiebelkorn agreed: “We have an assumption that what we find in the monkey will hold up in humans, but it’s rarely checked as carefully as it is here.”
“Originally, we wanted to study something very different,” said Kastner. “We wanted to ask how we can select objects from our cluttered visual environments. ... We were particularly looking at how the intake of visual information unfolds over time — something that is rarely done in behavioral studies — and this revealed the rhythmic structure of perception. It was a complete surprise finding.”
abstract •The functional architecture of attention is rhythmic
•Frontoparietal theta activity predicts behavior on a rapid timescale
•Theta activity controls cortical excitability and information flow
•Rhythmic sampling is independent of task structure and context
Classic models of attention suggest that sustained neural firing constitutes a neural correlate of sustained attention. However, recent evidence indicates that behavioral performance fluctuates over time, exhibiting temporal dynamics that closely resemble the spectral features of ongoing, oscillatory brain activity. Therefore, it has been proposed that periodic neuronal excitability fluctuations might shape attentional allocation and overt behavior. However, empirical evidence to support this notion is sparse. Here, we address this issue by examining data from large-scale subdural recordings, using two different attention tasks that track perceptual ability at high temporal resolution. Our results reveal that perceptual outcome varies as a function of the theta phase even in states of sustained spatial attention. These effects were robust at the single-subject level, suggesting that rhythmic perceptual sampling is an inherent property of the frontoparietal attention network. Collectively, these findings support the notion that the functional architecture of top-down attention is intrinsically rhythmic.
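The abstract’s central claim, that perceptual outcome varies with theta phase, is easy to illustrate with a toy simulation. The sketch below is not from either paper; the frequency, trial counts and hit-rate modulation are made-up values chosen only to show how binning behavioral outcomes by oscillatory phase recovers a rhythmic pattern.

```python
# Toy simulation (not from the papers): detection probability modulated by ~4 Hz theta phase.
import numpy as np

rng = np.random.default_rng(0)
theta_freq = 4.0          # Hz, within the 3-8 Hz theta band discussed in the papers
n_trials = 5000

# Each trial probes perception at a random time; compute the ongoing theta phase at that moment.
probe_times = rng.uniform(0, 10, n_trials)             # seconds
phase = (2 * np.pi * theta_freq * probe_times) % (2 * np.pi)

# Assume (illustratively) that hit probability swings between 0.4 and 0.8 with theta phase.
p_hit = 0.6 + 0.2 * np.cos(phase)
hits = rng.random(n_trials) < p_hit

# Bin behavioral outcomes by phase: the rhythmic modulation reappears in the hit rate.
bins = np.linspace(0, 2 * np.pi, 9)
idx = np.digitize(phase, bins) - 1
for b in range(8):
    sel = idx == b
    print(f"phase bin {b}: hit rate = {hits[sel].mean():.2f}")
```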
a dynamic interplay within the frontoparietal network underlies rhythmic spatial attention
ian c. fiebelkorn et al. 2018
https://doi.org/10.1016/j.neuron.2018.07.038
abstract •Non-human primates, like humans, sample the visual scene in rhythmic cycles
•Neural oscillations in the frontoparietal network modulate perceptual sensitivity
•Theta phase acts as a clocking mechanism, organizing alternating attentional states
•Temporal dynamics linked to specific function and cell type define attentional state
Classic studies of spatial attention assumed that its neural and behavioral effects were continuous over time. Recent behavioral studies have instead revealed that spatial attention leads to alternating periods of heightened or diminished perceptual sensitivity. Yet, the neural basis of these rhythmic fluctuations has remained largely unknown. We show that a dynamic interplay within the macaque frontoparietal network accounts for the rhythmic properties of spatial attention. Neural oscillations characterize functional interactions between the frontal eye fields (FEF) and the lateral intraparietal area (LIP), with theta phase (3–8 Hz) coordinating two rhythmically alternating states. The first is defined by FEF-dominated beta-band activity, associated with suppressed attentional shifts, and LIP-dominated gamma-band activity, associated with enhanced visual processing and better behavioral performance. The second is defined by LIP-specific alpha-band activity, associated with attenuated visual processing and worse behavioral performance. Our findings reveal how network-level interactions organize environmental sampling into rhythmic cycles.
the ‘saw but forgot’ error: a role for short-term memory failures in understanding junction crashes?
chloe j. robbins et al. 2019
http://dx.doi.org/10.1371/journal.pone.0222905
drivers looked at, but seconds later failed to recall, critical approaching vehicles on up to 15% of occasions. Drivers were around 5 times more likely to forget a motorcycle compared with a car.
The research was funded by the ESRC and carried out by PhD student Chloe Robbins, supervised by Dr Peter Chapman, in the School of Psychology. It suggests that many 'Look but Fail to See' (LBFTS) crashes may have been misclassified and are more likely to be a case of 'Saw but Forgot' (SBF) errors.
"The 'Saw But Forgot' error: A role for short-term memory failures in understanding junction crashes?" has been published in the online journal PLOS ONE and offers practical interventions that may prevent SBF crashes in the future.
Short term memory failure
For each real-world crash there are hundreds of thousands of safe, successful junction crossings, but when errors do occur they can have fatal consequences. To understand what is going on, the research team explored where drivers looked and what they remembered while crossing junctions in a driving simulator. The big surprise from the research was the fact that some drivers have absolutely no recollection of seeing an oncoming vehicle at all even as they are about to pull out at a junction.
Their results suggest that it's what happens in the moments between seeing an approaching vehicle and pulling out that can lead to a complete absence of memory -- particularly for approaching motorcycles.
Dr Chapman, an expert in the psychology of driving, said: "Typical interpretations of the LBFTS crash are based on the idea that the driver pulling out has failed to devote sufficient attention to the traffic on the road. Our study set out to look for systematic biases in attention towards and memory for different vehicle types. Although these effects were found, the most striking finding was not subtle biases in vision or memory, but the fact that in some cases there was a complete absence of memory, particularly for approaching motorcycles."
Study 1: Stopping at a junction
The research team recorded the eye movements of 60 drivers crossing junctions in the University of Nottingham's high fidelity driving simulator -- part of NITES (the Nottingham Integrated Transport and Environment Simulation facility). Although drivers seemed to look in the right places as they approached the junction, there were 20 occasions where a driver couldn't remember one of the oncoming vehicles. The forgotten vehicle was an LGV on 2 occasions, a car on 4 occasions and a motorcycle on 14 occasions.
Study 2: Pulling out of the junction
For this study 30 drivers were required to approach a series of junctions and go straight on, if they thought it was safe to do so. The simulation of oncoming vehicles involved either 2 cars or a car and a motorcycle. The driver's eye movements were tracked continuously throughout the experiment and memory tests were only given if the driver actually pulled out in front of oncoming vehicles. Out of the 120 times memory was tested, drivers failed to report a car on one occasion and a motorcycle on 8 occasions.
Study 3: Tracking head and eye movements
This experiment used the same design as study 2 but now 45 drivers wore lightweight eye-tracking glasses to obtain highly accurate measures of exactly where they looked before pulling out. Out of the 180 memory tests, drivers failed to report a car on 3 occasions and a motorcycle on 16 occasions. Of these 16 memory failures there were 5 occasions when the driver had not looked directly at the oncoming motorcycle. These could be examples of typical LBFTS (Look but Fail to See) errors where the driver looked in the right direction but failed to see the motorcycle. In contrast, on the remaining 11 occasions the driver clearly looked directly at the oncoming motorcycle, but couldn't remember it a few seconds later. The researchers have described these as SBF (Saw but Forgot) errors.
This study also showed that SBF errors were associated with more head movements and a longer gap between fixating on the motorcycle and pulling out. The researchers suggest that this is where the forgetting is occurring. Things the driver looks at between seeing the oncoming vehicle and pulling out might be overwriting the initial contents of visuospatial memory so information about the oncoming vehicle is no longer available at the time a decision is made to pull out.
Dr Chapman said: "These studies compellingly demonstrate that even in safety-critical situations it is possible to observe dramatic failures of visual memory. These 'Saw but Forgot' errors were remarkably frequent in the simulator and we have every reason to think that they may be equally prevalent in the real world. The surprising lack of memory may be exactly why these crashes appear so mysterious."
The phonological loop -- 'See bike say bike'
As a result of their findings the research team has established a new framework to understand dynamic risky decision-making with an emphasis on the role of short-term memory in such situations. The 'Perceive Retain Choose' (PRC) model creates new predictions and proposals for practical interventions.
Specifically, they suggest teaching drivers that if they see a motorcycle approaching, they should say so out loud -- 'See Bike, Say Bike'.
Dr Chapman said: "If relevant visual information is encoded phonologically it has been shown that it is no longer subject to visuospatial interference. Clearly any research that improves our understanding of these crashes and the kind of countermeasures that can be used to prevent them, has the potential to be a major contribution to world health."
abstract Motorcyclists are involved in an exceptionally high number of crashes for the distance they travel, with one of the most common incidents being where another road user pulls out into the path of an oncoming motorcycle frequently resulting in a fatal collision. These instances have previously been interpreted as failures of visual attention, sometimes termed ‘Look but Fail to See’ (LBFTS) crashes, and interventions have focused on improving drivers’ visual scanning and motorcycles’ visibility. Here we show from a series of three experiments in a high-fidelity driving simulator, that when drivers’ visual attention towards and memory for approaching vehicles is experimentally tested, drivers fail to report approaching motorcycles on between 13% and 18% of occasions. This happens even when the driver is pulling out into a safety-critical gap in front of the motorcycle, and often happens despite the driver having directly fixated on the oncoming vehicle. These failures in reporting a critical vehicle were not associated with how long the driver looked at the vehicle for, but were associated with drivers’ subsequent visual search and the time that elapsed between fixating on the oncoming vehicle and pulling out of the junction. Here, we raise the possibility that interference in short-term memory might prevent drivers holding important visual information during these complex manoeuvres. This explanation suggests that some junction crashes on real roads that have been attributed to LBFTS errors may have been misclassified and might instead be the result of ‘Saw but Forgot’ (SBF) errors. We provide a framework for understanding the role of short-term memory in such situations, the Perceive Retain Choose (PRC) model, as well as novel predictions and proposals for practical interventions that may prevent this type of crash in the future.
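One way to probe the abstract's claim that forgetting tracks subsequent visual search and elapsed time, rather than how long the driver looked at the vehicle, would be a trial-level regression. The sketch below is purely illustrative: the data are simulated, the variable names are hypothetical, and the logistic model is an assumption rather than the analysis reported in the paper.

```python
# Hypothetical analysis sketch: does forgetting relate to intervening fixations and delay?
# (Illustrative only; variable names, data and model choice are assumptions, not the paper's code.)
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "intervening_fixations": rng.poisson(4, n),   # fixations between seeing the bike and pulling out
    "delay_s": rng.gamma(2.0, 1.5, n),            # seconds between fixating the bike and pulling out
})
# Simulate forgetting that becomes more likely with more interference and longer delays.
logit = -3.0 + 0.3 * df["intervening_fixations"] + 0.4 * df["delay_s"]
df["forgot"] = rng.random(n) < 1 / (1 + np.exp(-logit))

model = sm.Logit(df["forgot"].astype(int),
                 sm.add_constant(df[["intervening_fixations", "delay_s"]]))
print(model.fit(disp=0).summary())
```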
neuroscientist lisa feldman barrett explains how emotions are made 2017
theverge.com/2017/4/10/15245690/how-emotions-are-made-neuroscience-lisa-feldman-barrett
the vision thing: how babies colour in the world
nicola davis 2017
theguardian.com/lifeandstyle/2017/apr/11/vision-thing-how-babies-colour-in-the-world
the sea was never blue
the greek colour experience was made of movement and shimmer. can we ever glimpse what they saw when gazing out to sea?
maria michela sassi 2017
aeon.co/essays/can-we-hope-to-understand-how-the-greeks-saw-their-world
hunter-gatherer olfaction is special
majid and kruspe 2018
http://dx.doi.org/10.1016/j.cub.2017.12.014
abstract •People struggle to name odors, but this limitation is not universal
•Is superior olfactory performance due to subsistence, ecology or language family?
•Hunter-gatherers and non-hunter-gatherers from the same environment were compared
•Only hunter-gatherers were proficient odor namers, showing subsistence is crucial
People struggle to name odors. This has been attributed to a diminution of olfaction in trade-off to vision. This presumption has been challenged recently by data from the hunter-gatherer Jahai who, unlike English speakers, find odors as easy to name as colors. Is the superior olfactory performance among the Jahai because of their ecology (tropical rainforest), their language family (Aslian), or because of their subsistence (they are hunter-gatherers)? We provide novel evidence from the hunter-gatherer Semaq Beri and the non-hunter-gatherer (swidden-horticulturalist) Semelai that subsistence is the critical factor. Semaq Beri and Semelai speakers—who speak closely related languages and live in the tropical rainforest of the Malay Peninsula—took part in a controlled odor- and color-naming experiment. The swidden-horticulturalist Semelai found odors much more difficult to name than colors, replicating the typical Western finding. But for the hunter-gatherer Semaq Beri odor naming was as easy as color naming, suggesting that hunter-gatherer olfactory cognition is special.
“There has been a long-standing consensus that ‘smell is the mute sense, the one without words,’ and decades of research with English-speaking participants seemed to confirm this,” says Asifa Majid of Radboud University in the Netherlands. “But, the Jahai of the Malay Peninsula are much better at naming odors than their English-speaking peers. This, of course, raises the question of where this difference originates.”
To find out whether it was the Jahai who have an unusually keen ability with odors or whether English speakers are simply lacking, Majid and Nicole Kruspe at Lund University in Sweden looked to two related, but previously unstudied, groups of people in the tropical rainforest of the Malay Peninsula: the hunter-gatherer Semaq Beri and the non-hunter-gatherer Semelai. The Semelai are traditionally horticulturalists, combining shifting rice cultivation with the collection of forest products for trade.
The Semaq Beri and Semelai not only live in a similar environment; they also speak closely related languages. The question was: how were they at naming odors?
“If ease of olfactory naming is related to cultural practices, then we would expect the Semaq Beri to behave like the Jahai and name odors as easily as they do colors, whereas the Semelai should pattern differently,” the researchers wrote. And, that’s exactly what they found.
Majid and Kruspe tested the color- and odor-naming abilities of 20 Semaq Beri and 21 Semelai people. Sixteen odors were used: orange, leather, cinnamon, peppermint, banana, lemon, licorice, turpentine, garlic, coffee, apple, clove, pineapple, rose, anise, and fish. For the color task, study participants saw 80 Munsell color chips, sampling 20 equally spaced hues at four degrees of brightness. Kruspe tested participants in their native language by simply asking, “What smell is this?” or “What color is this?”
The results were clear. The hunter-gatherer Semaq Beri performed on those tests just like the hunter-gatherer Jahai, naming odors and colors with equal ease. The non-hunter-gatherer Semelai, on the other hand, performed like English speakers. For them, odors were difficult to name.
The results suggest that the downgrading in importance of smells relative to other sensory inputs is a recent consequence of cultural adaptation, the researchers say. “Hunter-gatherers’ olfaction is superior, while settled peoples’ olfactory cognition is diminished,” Majid says.
They say the findings challenge the notion that differences in neuroarchitecture alone underlie differences in olfaction, suggesting instead that cultural variation may play a more prominent role. They also raise a number of interesting questions: “Do hunter-gatherers in other parts of the world also show the same boost to olfactory naming?” Majid asks. “Are other aspects of olfactory cognition also superior in hunter-gatherers,” for example, the ability to differentiate one odor from another? “Finally, how do these cultural differences interact with the biological infrastructure for smell?” She says it will be important to learn whether these groups of people show underlying genetic differences related to the sense of smell.
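Naming studies in this literature typically score how well speakers agree on a name for each stimulus; one common statistic is Simpson's diversity index over the responses. The sketch below assumes that measure and uses invented responses; it is not the paper's code or data, just an illustration of how "odors as codable as colors" can be quantified.

```python
# Illustrative sketch (not the paper's code): naming agreement via Simpson's diversity index,
# a measure commonly used in naming studies; higher values mean speakers agree more on a name.
from collections import Counter

def simpson_diversity(names):
    counts = Counter(names).values()
    n = sum(counts)
    if n < 2:
        return 0.0
    return sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Hypothetical free-naming responses for one odor stimulus in two groups.
semaq_beri_responses = ["musty", "musty", "musty", "fragrant", "musty"]
semelai_responses    = ["like fruit?", "sweet", "smells nice", "like flowers", "sweet"]

print("agreement (hunter-gatherer example):", simpson_diversity(semaq_beri_responses))
print("agreement (horticulturalist example):", simpson_diversity(semelai_responses))
```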
spatial representations of the viewer’s surroundings
satoshi shioiri et al. 2018
http://dx.doi.org/10.1038/s41598-018-25433-5
Spatial representation surrounding a viewer including outside the visual field is crucial for moving around the three-dimensional world. To obtain such spatial representations, we predict that there is a learning process that integrates visual inputs from different viewpoints covering all the 360° visual angles. We report here the learning effect of the spatial layouts on six displays arranged to surround the viewer, showing shortening of visual search time on surrounding layouts that are repeatedly used (contextual cueing effect). The learning effect is found even in the time to reach the display with the target as well as the time to reach the target within the target display, which indicates that there is an implicit learning effect on spatial configurations of stimulus elements across displays. Since, furthermore, the learning effect is found between layouts and the target presented on displays located even 120° apart, this effect should be based on the representation that covers visual information far outside the visual field.
contextual cueing effect (CCE),
CCE of surrounds (CCES)
the visual system constructs representations that link information within the visual field and information outside the visual field through repeated observation of the same spatial arrangements, that is, the CCES. The CCES is implicit and done without awareness of repeated observation of the same stimulus. Representations obtained by repetition without awareness are useful for moving around in familiar spaces and also in spaces that have structures in common with familiar places. Such representations should support actions in everyday life as well as specific actions for sports, driving, and so on.
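The contextual cueing effect itself is just a difference in search times between repeated and novel layouts. A minimal sketch, with hypothetical reaction times rather than the study's data:

```python
# Minimal sketch (assumed analysis, not the authors' code): the contextual cueing effect (CCE)
# is the search-time advantage for repeated spatial layouts over novel ones.
import numpy as np

rng = np.random.default_rng(2)
n_trials = 100
rt_repeated = rng.normal(1.10, 0.15, n_trials)   # hypothetical search times (s), repeated layouts
rt_novel    = rng.normal(1.30, 0.15, n_trials)   # hypothetical search times (s), novel layouts

cce = rt_novel.mean() - rt_repeated.mean()
print(f"contextual cueing effect: {cce * 1000:.0f} ms faster on repeated layouts")
```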
avian uv vision enhances leaf surface contrasts in forest environments
cynthia tedore, dan-eric nilsson 2019
http://dx.doi.org/10.1038/s41467-018-08142-5
Human colour vision is based on three primary colours: red, green and blue. The colour vision of birds is based on the same three colours -- but also ultraviolet. Biologists at Lund have now shown that the fourth primary colour of birds, ultraviolet, means that they see the world in a completely different way. Among other things, birds see contrasts in dense forest foliage, whereas people only see a wall of green.
"What appears to be a green mess to humans are clearly distinguishable leaves for birds. No one knew about this until this study," says Dan-Eric Nilsson, professor at the Department of Biology at Lund University.
For birds, the upper sides of leaves appear much lighter in ultraviolet. From below, the leaves are very dark. In this way the three-dimensional structure of dense foliage is obvious to birds. This in turn makes it easy for them to move, find food and navigate. People, on the other hand, do not perceive ultraviolet, and see the foliage in green, the primary colour where contrast is the worst.
Dan-Eric Nilsson founded the world-leading Lund Vision Group at Lund University. The study in question is a collaboration with Cynthia Tedore and was conducted during her time as a postdoc in Lund. She is now working at the University of Hamburg.
It is the first time that researchers have succeeded in imitating bird colour vision with a high degree of precision. This was achieved with the help of a unique camera and advanced calculations. The camera was designed within the Lund Vision Group and equipped with rotating filter wheels and specially manufactured filters, which make it possible to show what different animals see clearly. In this case, the camera imitates with a high degree of accuracy the colour sensitivity of the four different types of cones in bird retinas.
"We have discovered something that is probably very important for birds, and we continue to reveal how reality appears also to other animals," says Dan-Eric Nilsson, continuing:
"We may have the notion that what we see is the reality, but it's a highly human reality. Other animals live in other realities, and we can now see through their eyes and reveal many secrets. Reality is in the eye of the beholder," he concludes.
abstract UV vision is prevalent, but we know little about its utility in common general tasks, as in resolving habitat structure. Here we visualize vegetated habitats using a multispectral camera with channels mimicking bird photoreceptor sensitivities across the UV-visible spectrum. We find that the contrast between upper and lower leaf surfaces is higher in a UV channel than in any visible channel, and that this makes leaf position and orientation stand out clearly. This was unexpected since both leaf surfaces reflect similarly small proportions (1–2%) of incident UV light. The strong UV-contrast can be explained by downwelling light being brighter than upwelling, and leaves transmitting < 0.06% of incident UV light. We also find that mirror-like specular reflections of the sky and overlying canopy, from the waxy leaf cuticle, often dwarf diffuse reflections. Specular reflections shift leaf color, such that maximum leaf-contrast is seen at short UV wavelengths under open canopies, and at long UV wavelengths under closed canopies.
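The "bird vision" imagery rests on a standard visual-ecology calculation: each receptor's quantum catch is the illumination-weighted overlap between surface reflectance and that receptor's spectral sensitivity, and contrasts can then be compared channel by channel. The sketch below uses placeholder spectra and illustrative cone peaks, not the study's calibrated camera data; it only shows the shape of the computation.

```python
# Hedged sketch of the standard quantum-catch calculation (placeholder spectra, not the study's data):
# Q_i = sum over wavelengths of receptor sensitivity * surface reflectance * illumination.
import numpy as np

wavelengths = np.arange(300, 701, 10)                      # nm, UV through visible

def gaussian_sensitivity(peak, width=40):
    return np.exp(-0.5 * ((wavelengths - peak) / width) ** 2)

# Approximate peak sensitivities for a UV-sensitive bird's four single cones (illustrative values).
cones = {"UV": 370, "S": 445, "M": 508, "L": 565}
illumination = np.ones_like(wavelengths, dtype=float)      # flat daylight placeholder

# Placeholder leaf reflectances: upper surface slightly brighter, especially at short wavelengths.
upper_leaf = 0.02 + 0.10 * (wavelengths > 500)
lower_leaf = 0.01 + 0.09 * (wavelengths > 500)

for name, peak in cones.items():
    s = gaussian_sensitivity(peak)
    q_upper = np.sum(s * upper_leaf * illumination)
    q_lower = np.sum(s * lower_leaf * illumination)
    michelson = (q_upper - q_lower) / (q_upper + q_lower)  # per-channel Michelson contrast
    print(f"{name} channel contrast: {michelson:.2f}")
```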
wild hummingbirds discriminate nonspectral colors
mary caswell stoddard et al. 2020
dx.doi.org/10.1073/pnas.1919377117
“Humans are color-blind compared to birds and many other animals,” said Mary Caswell Stoddard, an assistant professor in the Princeton University Department of Ecology and Evolutionary Biology. Humans have three types of color-sensitive cones in their eyes — attuned to red, green and blue light — but birds have a fourth type, sensitive to ultraviolet light. “Not only does having a fourth color cone type extend the range of bird-visible colors into the UV, it potentially allows birds to perceive combination colors like ultraviolet+green and ultraviolet+red — but this has been hard to test,” said Stoddard.
To investigate how birds perceive their colorful world, Stoddard and her research team established a new field system for exploring bird color vision in a natural setting. Working at the Rocky Mountain Biological Laboratory (RMBL) in Gothic, Colorado, the researchers trained wild broad-tailed hummingbirds (Selasphorus platycercus) to participate in color vision experiments.
“Most detailed perceptual experiments on birds are performed in the lab, but we risk missing the bigger picture of how birds really use color vision in their daily lives,” Stoddard said. “Hummingbirds are perfect for studying color vision in the wild. These sugar fiends have evolved to respond to flower colors that advertise a nectar reward, so they can learn color associations rapidly and with little training.”
Stoddard’s team was particularly interested in “nonspectral” color combinations, which involve hues from widely separated parts of the color spectrum, as opposed to blends of neighboring colors like teal (blue-green) or yellow (green-red). For humans, purple is the clearest example of a nonspectral color. Technically, purple is not in the rainbow: it arises when our blue (short-wave) and red (long-wave) cones are stimulated, but not green (medium-wave) cones.
While humans have just one nonspectral color (purple), birds can theoretically see up to five: purple, ultraviolet+red, ultraviolet+green, ultraviolet+yellow and ultraviolet+purple.
Stoddard and her colleagues designed a series of experiments to test whether hummingbirds can see these nonspectral colors. Their results appear June 15 in the Proceedings of the National Academy of Sciences.
The research team, which included scientists from Princeton, the University of British Columbia (UBC), Harvard University, University of Maryland and RMBL, performed outdoor experiments each summer for three years. First they built a pair of custom “bird vision” LED tubes programmed to display a broad range of colors, including nonspectral colors like ultraviolet+green. Next they performed experiments in an alpine meadow frequently visited by local broad-tailed hummingbirds, which breed at the high-altitude site.
Each morning, the researchers rose before dawn and set up two feeders: one containing sugar water and the other plain water. Beside each feeder, they placed an LED tube. The tube beside the sugar water emitted one color, while the one next to the plain water emitted a different color. The researchers periodically swapped the positions of the rewarding and unrewarding tubes, so the birds could not simply use location to pinpoint a sweet treat. They also performed control experiments to ensure that the tiny birds were not using smell or another inadvertent cue to find the reward. Over the course of several hours, wild hummingbirds learned to visit the rewarding color. Using this setup, the researchers recorded over 6,000 feeder visits in a series of 19 experiments.
The experiments revealed that hummingbirds can see a variety of nonspectral colors, including purple, ultraviolet+green, ultraviolet+red and ultraviolet+yellow. For example, hummingbirds readily distinguished ultraviolet+green from pure ultraviolet or pure green, and they discriminated between two different mixtures of ultraviolet+red light — one redder, one less so.
“It was amazing to watch,” said Harold Eyster, a UBC Ph.D. student and a co-author of the study. “The ultraviolet+green light and green light looked identical to us, but the hummingbirds kept correctly choosing the ultraviolet+green light associated with sugar water. Our experiments enabled us to get a sneak peek into what the world looks like to a hummingbird.”
Even though hummingbirds can perceive nonspectral colors, appreciating how these colors appear to birds can be difficult. “It is impossible to really know how the birds perceive these colors. Is ultraviolet+red a mix of those colors, or an entirely new color? We can only speculate,” said Ben Hogan, a postdoctoral research associate at Princeton and a co-author of the study.
“To imagine an extra dimension of color vision — that is the thrill and challenge of studying how avian perception works,” said Stoddard. “Fortunately, the hummingbirds reveal that they can see things we cannot.”
“The colors that we see in the fields of wildflowers at our study site, the wildflower capital of Colorado, are stunning to us, but just imagine what those flowers look like to birds with that extra sensory dimension,” said co-author David Inouye, who is affiliated with the University of Maryland and RMBL.
Finally, the research team analyzed a data set of 3,315 feather and plant colors. They discovered that birds likely perceive many of these colors as nonspectral, while humans do not. That said, the researchers emphasize that nonspectral colors are probably not particularly special relative to other colors. The wide variety of nonspectral colors available to birds is the result of their ancient four color-cone visual system.
“Tetrachromacy — having four color cone types — evolved in early vertebrates,” said Stoddard. “This color vision system is the norm for birds, many fish and reptiles, and it almost certainly existed in dinosaurs. We think the ability to perceive many nonspectral colors is not just a feat of hummingbirds but a widespread feature of animal color vision.”
abstract Birds have four color cone types in their eyes, compared to three in humans. In theory, this enables birds to discriminate a broad range of colors, including many nonspectral colors. Nonspectral colors are perceived when nonadjacent cone types (sensitive to widely separated parts of the light spectrum) are predominantly stimulated. For humans, purple (stimulation of blue- and red-sensitive cones) is a nonspectral color; birds’ fourth color cone type creates many more possibilities. We trained wild hummingbirds to participate in color vision tests, which revealed that they can discriminate a variety of nonspectral colors, including UV+red, UV+green, purple, and UV+yellow. Additionally, based on an analysis of ∼3,300 plumage and plant colors, we estimate that birds perceive many natural colors as nonspectral.
Many animals have the potential to discriminate nonspectral colors. For humans, purple is the clearest example of a nonspectral color. It is perceived when two color cone types in the retina (blue and red) with nonadjacent spectral sensitivity curves are predominantly stimulated. Purple is considered nonspectral because no monochromatic light (such as from a rainbow) can evoke this simultaneous stimulation. Except in primates and bees, few behavioral experiments have directly examined nonspectral color discrimination, and little is known about nonspectral color perception in animals with more than three types of color photoreceptors. Birds have four color cone types (compared to three in humans) and might perceive additional nonspectral colors such as UV+red and UV+green. Can birds discriminate nonspectral colors, and are these colors behaviorally and ecologically relevant? Here, using comprehensive behavioral experiments, we show that wild hummingbirds can discriminate a variety of nonspectral colors. We also show that hummingbirds, relative to humans, likely perceive a greater proportion of natural colors as nonspectral. Our analysis of plumage and plant spectra reveals many colors that would be perceived as nonspectral by birds but not by humans: Birds’ extra cone type allows them not just to see UV light but also to discriminate additional nonspectral colors. Our results support the idea that birds can distinguish colors throughout tetrachromatic color space and indicate that nonspectral color perception is vital for signaling and foraging. Since tetrachromacy appears to have evolved early in vertebrates, this capacity for rich nonspectral color perception is likely widespread.
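Whether the birds discriminate a color pair comes down to whether their visits to the rewarded feeder beat the 50% expected by chance. A minimal sketch with hypothetical visit counts (the test choice and numbers are assumptions, not the paper's analysis):

```python
# Illustrative sketch (assumed analysis, not the authors' code): test whether hummingbird visits to
# the rewarded color exceed the 50% expected if the two colors were indistinguishable.
from scipy.stats import binomtest

correct_visits, total_visits = 230, 300          # hypothetical counts for one color pair
result = binomtest(correct_visits, total_visits, p=0.5, alternative="greater")
print(f"proportion correct = {correct_visits / total_visits:.2f}, p = {result.pvalue:.2g}")
```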
human senses
transduction of the geomagnetic field as evidenced from alpha-band activity in the human brain
connie x. wang et al. 2019
http://dx.doi.org/10.1523/eneuro.0483-18.2019
Many animals, such as migratory birds and sea turtles, have a geomagnetic sense that supports their biological navigation system. Although magnetoreception has been well-studied in these animals, scientists have not yet been able to determine whether humans share this ability.
Geoscientist Joseph Kirschvink, neuroscientist Shin Shimojo, and their colleagues at Caltech and the University of Tokyo set out to address this long-standing question using electroencephalography to record adult participants' brain activity during magnetic field manipulations. Carefully controlled experiments revealed a decrease in alpha-band brain activity -- an established response to sensory input -- in some participants. The researchers replicated this effect in participants who responded strongly and confirmed these responses were tuned to the magnetic field of the Northern Hemisphere, where the study was conducted.
Future studies of magnetoreception in diverse human populations may provide new clues into the evolution and individual variation of this ancient sensory system.
abstract Magnetoreception, the perception of the geomagnetic field, is a sensory modality well-established across all major groups of vertebrates and some invertebrates, but its presence in humans has been tested rarely, yielding inconclusive results. We report here a strong, specific human brain response to ecologically-relevant rotations of Earth-strength magnetic fields. Following geomagnetic stimulation, a drop in amplitude of EEG alpha oscillations (8-13 Hz) occurred in a repeatable manner. Termed alpha event-related desynchronization (alpha-ERD), such a response has been associated previously with sensory and cognitive processing of external stimuli including vision, auditory and somatosensory cues. Alpha-ERD in response to the geomagnetic field was triggered only by horizontal rotations when the static vertical magnetic field was directed downwards, as it is in the Northern Hemisphere; no brain responses were elicited by the same horizontal rotations when the static vertical component was directed upwards. This implicates a biological response tuned to the ecology of the local human population, rather than a generic physical effect.
Biophysical tests showed that the neural response was sensitive to static components of the magnetic field. This rules out all forms of electrical induction (including artifacts from the electrodes), which depend solely on dynamic components of the field. The neural response was also sensitive to the polarity of the magnetic field. This rules out free-radical 'quantum compass' mechanisms like the cryptochrome hypothesis, which can detect only axial alignment. Ferromagnetism remains a viable biophysical mechanism for sensory transduction and provides a basis to start the behavioral exploration of human magnetoreception.
Significance Statement Although many migrating and homing animals are sensitive to Earth’s magnetic field, most humans are not consciously aware of the geomagnetic stimuli that we encounter in everyday life. Either we have lost a shared, ancestral magnetosensory system, or the system lacks a conscious component with detectable neural activity but no apparent perceptual awareness by us. We found two classes of ecologically-relevant rotations of Earth-strength magnetic fields that produce strong, specific and repeatable effects on human brainwave activity in the EEG alpha band (8-13 Hz); EEG discriminates in response to different geomagnetic field stimuli. Biophysical tests rule out all except the presence of a ferromagnetic transduction element, such as biologically-precipitated crystals of magnetite (Fe3O4).
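Alpha-ERD is conventionally quantified as the drop in 8-13 Hz amplitude after stimulus onset relative to a pre-stimulus baseline. The sketch below builds a synthetic single trial and runs that computation; the filtering choices and sampling rate are assumptions, not the study's pipeline.

```python
# Hedged sketch (not the study's pipeline): alpha event-related desynchronization (alpha-ERD)
# as a percentage change in 8-13 Hz amplitude after a stimulus, relative to a pre-stimulus baseline.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250.0                                     # Hz, assumed sampling rate
t = np.arange(-1.0, 2.0, 1 / fs)               # one trial: 1 s baseline, 2 s post-stimulus
rng = np.random.default_rng(3)

# Synthetic EEG: 10 Hz alpha whose amplitude drops after stimulus onset at t = 0, plus noise.
alpha_amp = np.where(t < 0, 1.0, 0.6)
eeg = alpha_amp * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

b, a = butter(4, [8 / (fs / 2), 13 / (fs / 2)], btype="band")
envelope = np.abs(hilbert(filtfilt(b, a, eeg)))

baseline = envelope[t < 0].mean()
post = envelope[(t > 0.2) & (t < 1.0)].mean()
print(f"alpha-ERD: {100 * (post - baseline) / baseline:.1f}% change from baseline")
```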
chemosensory modulation of neural circuits for sodium appetite
sangjun lee et al. 2019
http://dx.doi.org/10.1038/s41586-019-1053-2
When the body is low on sodium, the brain triggers specific appetite signals that drive the consumption of sodium. Though the mechanisms of these appetite signals are not fully understood, a team of researchers has now discovered a small population of neurons in the mouse hindbrain that controls the drive to consume sodium.
Led by graduate student Sangjun Lee, working in the laboratory of Caltech biologist Yuki Oka, the team used genetic tools to manipulate the activity of these neurons so that they could be stimulated with light. The researchers observed that artificially stimulating these neurons caused mice to lick a piece of rock salt repeatedly, even when their bodies were completely sated with sodium.
Next, the researchers measured the activity of these neurons while mice ate sodium. Within several seconds of sodium hitting the animal's tongue, the activity of the sodium-appetite neurons was inhibited. However, a direct infusion of sodium into the stomach of these mice did not suppress the neural activity. This neural suppression also did not occur when sodium receptors on the tongue were pharmacologically blocked. Taken together, the research shows that oral sodium signals, likely mediated by the taste system, are necessary to inhibit the sodium-appetite neurons.
"The desire to eat salt is the body's way of telling you that your body is low on sodium," says Oka. "Once sodium is consumed, it takes some time for the body to fully absorb it. So, it's interesting that just the taste of sodium is sufficient to quiet down the activity of the salt-appetite neurons, which means that sensory systems like taste are much more important in regulating the body's functions than simply conveying external information to the brain."
Interestingly, in many species, including humans, consuming sodium can drive the desire to eat even more. In future work, Oka and his collaborators would like to understand how sodium-appetite neurons are modulated over time. Answering this question may open up avenues to help people with health issues to eat less sodium in their diets.
abstract Sodium is the main cation in the extracellular fluid and it regulates various physiological functions. Depletion of sodium in the body increases the hedonic value of sodium taste, which drives animals towards sodium consumption1,2. By contrast, oral sodium detection rapidly quenches sodium appetite3,4, suggesting that taste signals have a central role in sodium appetite and its satiation. Nevertheless, the neural mechanisms of chemosensory-based appetite regulation remain poorly understood. Here we identify genetically defined neural circuits in mice that control sodium intake by integrating chemosensory and internal depletion signals. We show that a subset of excitatory neurons in the pre-locus coeruleus express prodynorphin, and that these neurons are a critical neural substrate for sodium-intake behaviour. Acute stimulation of this population triggered robust ingestion of sodium even from rock salt, while evoking aversive signals. Inhibition of the same neurons reduced sodium consumption selectively. We further demonstrate that the oral detection of sodium rapidly suppresses these sodium-appetite neurons. Simultaneous in vivo optical recording and gastric infusion revealed that sodium taste—but not sodium ingestion per se—is required for the acute modulation of neurons in the pre-locus coeruleus that express prodynorphin, and for satiation of sodium appetite. Moreover, retrograde-virus tracing showed that sensory modulation is in part mediated by specific GABA (γ-aminobutyric acid)-producing neurons in the bed nucleus of the stria terminalis. This inhibitory neural population is activated by sodium ingestion, and sends rapid inhibitory signals to sodium-appetite neurons. Together, this study reveals a neural architecture that integrates chemosensory signals and the internal need to maintain sodium balance.
unexpected arousal modulates the influence of sensory noise on confidence
micah allen et al. 2016
https://doi.org/10.7554/eLife.18103
how your mind, under stress, gets better at processing bad news
tali sharot 2018
https://aeon.co/ideas/how-your-mind-under-stress-gets-better-at-processing-bad-news
the influential mind: what the brain reveals about our power to change others
tali sharot 2017
supernormal stimuli
oral somatosensatory acuity is related to particle size perception in chocolate
scott p. breen et al. 2019
http://dx.doi.org/10.1038/s41598-019-43944-7
The study involved 111 volunteer tasters who had their tongues checked for physical sensitivity and were then asked about their perceptions of various textures in chocolate.
"We've known for a long time that individual differences in taste and smell can cause differences in liking and food intake -- now it looks like the same might be true for texture," said John Hayes, associate professor of food science. "This may have implications for parents of picky eaters since texture is often a major reason food is rejected."
The perception of food texture arises from the interaction of a food with mechanoreceptors in the mouth, Hayes noted. It depends on neural impulses carried by multiple nerves. Despite being a key driver of the acceptance or rejection of foods, he pointed out, oral texture perception remains poorly understood relative to taste and smell, two other sensory inputs critical for flavor perception.
One argument is that texture typically is not noticed when it is within an acceptable range, but that it is a major factor in rejection if an adverse texture is present, explained Hayes, director of the Sensory Evaluation Center. For chocolate specifically, oral texture is a critical quality attribute, with grittiness often being used to differentiate bulk chocolate from premium chocolates.
"Chocolate manufacturers spend lots of energy grinding cocoa and sugar down to the right particle size for optimal acceptability by consumers," he said. "This work may help them figure out when it is good enough without going overboard."
Researchers tested whether there was a relationship between oral touch sensitivity and the perception of particle size. They used a device called Von Frey Hairs to gauge whether participants could discriminate between different amounts of force applied to their tongues.
When participants were split into groups based on pressure-point sensitivity -- high and low acuity -- there was a significant relationship between chocolate-texture discrimination and pressure-point sensitivity for the high-acuity group on the center of the tongue. However, a similar relationship was not seen for data from the lateral edge of the tongue.
Chocolate texture-detection experiments included both manipulated chocolates produced in a pilot plant in the Rodney A. Erickson Food Science Building and two commercially produced chocolates. Because chocolate is a semi-solid suspension of fine particles from cocoa and sugar dispersed in a continuous fat base, Hayes explained, it is an ideal food for the study of texture.
"These findings are novel, as we are unaware of previous work showing a relationship between oral pressure sensitivity and ability to detect differences in particle size in a food product," Hayes said. "Collectively, these findings suggest that texture-detection mechanisms, which underpin point-pressure sensitivity, likely contribute to the detection of particle size in food such as chocolate."
Research team member Nicole Etter, assistant professor of communication sciences and disorders in the College of Health and Human Development, trained students on the team to administer tactile pressure tests she developed on participants' tongues using the Von Frey Hairs. As a speech therapist, she explained that her interest in the findings -- recently published in Scientific Reports -- was different from that of the food scientists.
"The overarching purpose of my work is to identify how we use touch sensation -- the ability to feel our tongue move and determine where our tongue is in our mouth -- to behave," she said. "I'm primarily interested in understanding how a patient uses sensation from their tongue to know where and how to move their tongue to make the proper sound."
However, in this research, Etter said she was trying to determine whether individual tactile sensations on the tongue relate to the ability to perceive or identify the texture of food -- in this case, chocolate. And she focused on another consideration, too.
"An important aspect of speech-language pathology is helping people with feeding and swallowing problems," she said. "Many clinical populations -- ranging from young children with disabilities to older adults with dementia -- may reject foods based on their perception of texture. This research starts to help us understand those individual differences."
This study sets the stage for follow-on cross-disciplinary research at Penn State, Etter believes. She plans to collaborate with Hayes and the Sensory Evaluation Center on studies involving foods beyond chocolate and older, perhaps less-healthy participants to judge the ability of older people to experience oral sensations and explore food-rejection behavior that may have serious health and nutrition implications.
abstract Texture affects liking or rejection of many foods for clinically relevant populations and the general public. Phenotypic differences in chemosensation are well documented and influence food choices, but oral touch perception is less understood. Here, we used chocolate as a model food to explore texture perception, specifically grittiness perception. In Experiment 1, the Just Noticeable Difference (JND) for particle size in melted chocolate was ~5 μm in a particle size range commonly found in commercial chocolates; as expected, the JND increased with particle size, with a Weber Fraction of ~0.17. In Experiment 2, individual differences in touch perception were explored: detection and discrimination thresholds for oral point pressure were determined with Von Frey Hairs. Discrimination thresholds varied across individuals, allowing us to separate participants into high and low sensitivity groups. Across all participants, two solid commercial chocolates (with particle sizes of 19 and 26 μm; i.e., just above the JND) were successfully discriminated in a forced-choice task. However, this was driven entirely by individuals with better oral acuity: 17 of 20 of more acute individuals correctly identified the grittier chocolate versus 12 of 24 less acute individuals. This suggests phenotypic differences in oral somatosensation can influence texture perception of foods.
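The abstract's two texture numbers fit together via the Weber relation: the just noticeable difference scales with baseline particle size, JND ≈ k × size. A quick check with the reported Weber fraction:

```python
# Worked check of the Weber-fraction relation reported in the abstract: JND ~ k * baseline size.
weber_fraction = 0.17                      # value reported in the abstract
for baseline_um in (19, 26, 30):
    print(f"baseline {baseline_um} um -> predicted JND ~ {weber_fraction * baseline_um:.1f} um")
# The 19 vs 26 um commercial pair differs by 7 um, just above the ~5 um JND reported for this range.
```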
dual-target hazard perception: could identifying one hazard hinder a driver’s capacity to find a second?
robert j. sall, jing feng et al. 2019
http://dx.doi.org/10.1016/j.aap.2019.06.016
"This is a phenomenon called a subsequent search miss (SSM), which was first described in the context of doctors evaluating medical images -- their ability to spot a problem was hindered if they had already found another problem in the same image," says Jing Feng, corresponding author of a paper on the research and an associate professor of psychology at NC State. "We wanted to determine whether SSMs might impact driving safety. What we've found suggests that SSMs may play an important role."
To test this, researchers conducted three studies. Each study asked participants to evaluate 100 traffic images and identify any potential hazards that would prevent them from driving in a given direction. Each image contained between zero and two hazards. Some hazards were "high-salience" targets, meaning they were glaringly obvious -- like a red sports car. Other hazards were low-salience targets, such as drably dressed pedestrians.
In the first study, researchers gave 20 participants approximately one second to identify any hazards. The participants were able to detect 70% of low-salience targets if they were the only hazard in the scene. But only 30% of the low-salience targets were identified when there were two hazards in the scene. In other words, low-salience hazards were 40% less likely to be identified when they appeared in the same scene as a high-salience hazard.
In the second study, researchers gave 29 participants up to five seconds to spot any hazards. In this study, participants did a better job of identifying both high-salience and low-salience targets -- but low-salience targets were still 15% less likely to be identified in scenes where there were two hazards. In other words, while performance improved with extra time, SSMs were still present.
In the final study, researchers gave 30 participants up to five seconds to identify any hazards -- but there was a twist. Scenes were introduced as having either a high risk or a low risk of containing multiple targets.
"Here, we found that participants spent more time evaluating traffic scenes after being told the scenes were high risk," says Robert Sall, first author of the paper and a Ph.D. student at NC State. "However, there was still a distinct pattern of performance that could be attributed to SSMs."
When told scenes were low-risk, low-salience targets were 18% less likely to be identified in two-hazard scenes. When given high-risk instructions, low-salience targets were 31% less likely to be identified in two-hazard scenes.
"This work gives us a much better understanding of why people miss certain hazards when driving," Sall says. "It could help us modify driver training to reduce accidents, and inform the development of in-vehicle technologies that focus on accident reduction."
"Our findings will also likely be useful for those whose work involves traffic accident diagnostics," Feng says. "It's now clear that SSMs have the potential to prevent drivers from noticing important pieces of visual information, which may contribute to lapses in driving performance. A great deal of work now needs to be done to determine the scope of the problem and what we can do about it."
abstract •Visual search of multiple hazards is critical for driving but error-prone.
•Finding one hazard impaired detection of another in the same scene.
•Warnings led to modest improvements in hazard detection performance.
•Visual search errors were still present following high-risk warnings.
•The novelty of this cognitive flaw for drivers warrants further investigation.
Low-level cognitive processes like visual search are crucial for hazard detection. In dual-target searches, subsequent search misses (SSMs) are known to occur when the identification of one target impedes detection of another that is concurrently presented. Despite the high likelihood of concurrent hazards in busy driving environments, SSMs have not been empirically investigated in driving. In three studies, participants were asked to identify safety-related target(s) in simulated traffic scenes that contained zero, one, or two target(s) of low or high perceptual saliency. These targets were defined as objects or events that would have prevented safe travel in the direction indicated by an arrow preceding the traffic scene. Findings from the pilot study (n = 20) and Experiment 1 (n = 29) demonstrated that detecting one target hindered drivers’ abilities to find a second from the same scene. In Experiment 2 (n = 30), explicit instructions regarding the level of risk were manipulated. It was found that search times were affected by the instructions, though SSMs persisted. Implications of SSMs in understanding the causes of some crashes are discussed, as well as future directions to improve ecological and criterion validity and to explore the roles of expertise and cognitive capabilities in multi-hazard detection.
covert digital manipulation of vocal emotion alter speakers’ emotional states in a congruent direction
jean-julien aucouturier, petter johansson, lars hall, rodrigo segnini, lolita mercadié, and katsumi watanabe 2016
http://doi.org/10.1073/pnas.1506552113
Significance
We created a digital audio platform to covertly modify the emotional tone of participants’ voices while they talked, shifting it toward happiness, sadness, or fear. Independent listeners perceived the transformations as natural examples of emotional speech, but the participants remained unaware of the manipulation, indicating that we are not continuously monitoring our own emotional signals. Instead, as a consequence of listening to their altered voices, the emotional state of the participants changed in congruence with the emotion portrayed. This result is the first evidence, to our knowledge, of peripheral feedback on emotional experience in the auditory domain. This finding is of great significance, because the mechanisms behind the production of vocal emotion are virtually unknown.
Abstract
Research has shown that people often exert control over their emotions. By modulating expressions, reappraising feelings, and redirecting attention, they can regulate their emotional experience. These findings have contributed to a blurring of the traditional boundaries between cognitive and emotional processes, and it has been suggested that emotional signals are produced in a goal-directed way and monitored for errors like other intentional actions. However, this interesting possibility has never been experimentally tested. To this end, we created a digital audio platform to covertly modify the emotional tone of participants’ voices while they talked in the direction of happiness, sadness, or fear. The result showed that the audio transformations were being perceived as natural examples of the intended emotions, but the great majority of the participants, nevertheless, remained unaware that their own voices were being manipulated. This finding indicates that people are not continuously monitoring their own voice to make sure that it meets a predetermined emotional target. Instead, as a consequence of listening to their altered voices, the emotional state of the participants changed in congruence with the emotion portrayed, which was measured by both self-report and skin conductance level. This change is the first evidence, to our knowledge, of peripheral feedback effects on emotional experience in the auditory domain. As such, our result reinforces the wider framework of self-perception theory: that we often use the same inferential strategies to understand ourselves as those that we use to understand others.
phytochromes function as thermosensors in arabidopsis
jae-hoon jung et al. 2016
http://dx.doi.org/10.1126/science.aaf6005
phytochrome b integrates light and temperature signals in arabidopsis
martina legris et al. 2016
http://dx.doi.org/10.1126/science.aaf5656
young children see a single action and infer a social norm: promiscuous normativity in 3-year-olds
marco f. h. schmidt, lucas p. butler, julia heinz, and michael tomasello 2016
http://dx.doi.org/10.1177/0956797616661182
the racialized construction of exceptionality: experimental evidence of race/ethnicity effects on teachers' interventions
rachel elizabeth fish 2016
http://dx.doi.org/10.1016/j.ssresearch.2016.08.007
social class and the motivational relevance of other human beings: evidence from visual attention
p. dietze, e. d. knowles 2016
http://dx.doi.org/10.1177/0956797616667721
the threat of increasing diversity: why many white americans support trump in the 2016 presidential election
b. major, a. blodorn, g. major blascovich 2016
http://dx.doi.org/10.1177/1368430216677304
behavioral and neural correlates to multisensory detection of sick humans
christina regenbogen et al. 2017
dx.doi.org/10.1073/pnas.1617357114
In the perpetual race between evolving organisms and pathogens, the human immune system has evolved to reduce the harm of infections. As part of such a system, avoidance of contagious individuals would increase biological fitness. The present study shows that we can detect both facial and olfactory cues of sickness in others just hours after experimental activation of their immune system. The study further demonstrates that multisensory integration of these olfactory and visual sickness cues is a crucial mechanism for how we detect and socially evaluate sick individuals. Thus, by motivating the avoidance of sick conspecifics, olfactory–visual cues, both in isolation and integrated, may be important parts of circuits handling imminent threats of contagion.
Throughout human evolution, infectious diseases have been a primary cause of death. Detection of subtle cues indicating sickness and avoidance of sick conspecifics would therefore be an adaptive way of coping with an environment fraught with pathogens. This study determines how humans perceive and integrate early cues of sickness in conspecifics sampled just hours after the induction of immune system activation, and the underlying neural mechanisms for this detection. In a double-blind placebo-controlled crossover design, the immune system in 22 sample donors was transiently activated with an endotoxin injection [lipopolysaccharide (LPS)]. Facial photographs and body odor samples were taken from the same donors when “sick” (LPS-injected) and when “healthy” (saline-injected) and subsequently were presented to a separate group of participants (n = 30) who rated their liking of the presented person during fMRI scanning. Faces were less socially desirable when sick, and sick body odors tended to lower liking of the faces. Sickness status presented by odor and facial photograph resulted in increased neural activation of odor- and face-perception networks, respectively. A superadditive effect of olfactory–visual integration of sickness cues was found in the intraparietal sulcus, which was functionally connected to core areas of multisensory integration in the superior temporal sulcus and orbitofrontal cortex. Taken together, the results outline a disease-avoidance model in which neural mechanisms involved in the detection of disease cues and multisensory integration are vital parts.
seeing it both ways: openness to experience and binocular rivalry suppression
anna antinori, olivia l. carter, luke d. smillie 2017
dx.doi.org/10.1016/j.jrp.2017.03.005
Demonstrates personality and mood can impact low-level perceptual experiences.
Mixed percept, a binocular rivalry state, positively correlated with openness.
Findings were replicated across samples and response bias was excluded.
Used a perceptual-aesthetic mood induction that increased mixed percept in open people.
Openness to experience is characterised by flexible and inclusive cognition. Here we investigated whether this extends to basic visual perception, such that open people combine information more flexibly, even at low levels of perceptual processing. We used binocular rivalry, where the brain alternates between perceptual solutions and periods where neither solution is fully suppressed (mixed percept). Study 1 showed that openness is positively associated with duration of mixed percept and ruled out the possibility of response bias. Study 2 showed that mixed percept increased following a positive mood induction, particularly for open people. Overall, the results showed that openness is linked to differences in low-level visual perceptual experience. Further studies should investigate whether this may be driven by common neural processes.
magic
now you see it, now you… seeing things that are hidden; failing to see things in plain sight. how magic exploits the everyday weirdness of perception
vebjørn ekroll 2017
aeon.co/essays/how-real-magic-happens-when-the-brain-sees-hidden-things
what brain-bending magic tricks can teach us about the mind
rachel becker 2019
http://theverge.com/2019/4/5/18297272/magic-psychology-optical-illusions-perception-cognition-experiencing-the-impossible
a crash in visual processing: interference between feedforward and feedback of successive targets limits detection and categorization
jacob g. martin et al. 2019
http://dx.doi.org/10.1167/19.12.20
a "crash in visual processing," happens when the neurons busy processing one image are tasked with processing another too quickly, and then either one or both images do not reach conscious awareness.
"Prior studies have shown that people are rather poor at detecting objects of interests that appear close together in time, even though the human brain can process up to 70 images per second," says the study's senior investigator, neuroscientist Maximilian Riesenhuber, PhD, a professor of neuroscience at Georgetown University Medical Center. "Our study shows a specific limitation of the visual system and explains why our consciousness cannot keep up. When someone tells you they didn't see something that occurred in a chaotic situation, maybe they did, but they didn't know that they did."
The study provides evidence for the theory that a bottleneck can occur in the neuronal pathway that takes in visual stimuli. That pathway starts at the back of the brain and extends forward, rapidly processing the visual signals up to the frontal cortex ("feed forward"), and then sending them back again to the areas the stimuli were first processed in ("feedback").
"The feedback wave appears to be crucial for participants to actually become conscious of the stimuli their brains had processed in the 'feedforward' pass," Riesenhuber explains.
The study included a series of EEG experiments in which participants viewed images of natural scenes streamed to them in short bursts at a rate of 12 per second, and answered how many images contained animals, and also what the animals were.
The crash in visual processing happens when the back of the brain is stimulated again with a second image before the feed forward and feedback loop needed for the first image is completed, Riesenhuber explains.
The researchers say their conclusions not only are relevant to how, when, and where capacity limits in the brain's processing abilities can arise, but also have ramifications that span consciousness to learning and attention.
"In addition to introducing a theory that explains the underlying reason for the lack of awareness, our study also shows how to avoid the neuronal signal 'crash' and increase awareness," explains the study's lead author, Jacob G. Martin, PhD. "When we experimentally reduced the interference between the feedforward and feedback portions of the two stimuli, we observed improved detection and categorization performance."
abstract The human visual system can detect objects in streams of rapidly presented images at presentation rates of 70 Hz and beyond. Yet, target detection is often impaired when multiple targets are presented in quick temporal succession. Here, we provide evidence for the hypothesis that such impairments can arise from interference between “top-down” feedback signals and the initial “bottom-up” feedforward processing of the second target. Although it has recently been shown that feedback signals are important for visual detection, this “crash” in neural processing affected both the detection and categorization of both targets. Moreover, experimentally reducing such interference between the feedforward and feedback portions of the two targets substantially improved participants' performance. The results indicate a key role of top-down re-entrant feedback signals and show how their interference with a successive target's feedforward process determines human behavior. These results are not just relevant for our understanding of how, when, and where capacity limits in the brain's processing abilities can arise, but also have ramifications spanning topics from consciousness to learning and attention.
the order of disorder: deconstructing visual disorder and its effect on rule-breaking
hiroki p. kotabe, omid kardan, marc g. berman 2016
http://dx.doi.org/10.1037/xge0000240
confirmation bias in human reinforcement learning: evidence from counterfactual feedback processing
stefano palminteri et al. 2017
doi.org/10.1371/journal.pcbi.1005684
Previous studies suggest that factual learning, that is, learning from obtained outcomes, is biased, such that participants preferentially take into account positive, as compared to negative, prediction errors. However, whether or not the prediction error valence also affects counterfactual learning, that is, learning from forgone outcomes, is unknown. To address this question, we analysed the performance of two groups of participants on reinforcement learning tasks using a computational model that was adapted to test if prediction error valence influences learning. We carried out two experiments: in the factual learning experiment, participants learned from partial feedback (i.e., the outcome of the chosen option only); in the counterfactual learning experiment, participants learned from complete feedback information (i.e., the outcomes of both the chosen and unchosen option were displayed). In the factual learning experiment, we replicated previous findings of a valence-induced bias, whereby participants learned preferentially from positive, relative to negative, prediction errors. In contrast, for counterfactual learning, we found the opposite valence-induced bias: negative prediction errors were preferentially taken into account, relative to positive ones. When considering valence-induced bias in the context of both factual and counterfactual learning, it appears that people tend to preferentially take into account information that confirms their current choice.
While the investigation of decision-making biases has a long history in economics and psychology, learning biases have been much less systematically investigated. This is surprising as most of the choices we deal with in everyday life are recurrent, thus allowing learning to occur and therefore influencing future decision-making. Combining behavioural testing and computational modeling, here we show that the valence of an outcome biases both factual and counterfactual learning. When considering factual and counterfactual learning together, it appears that people tend to preferentially take into account information that confirms their current choice. Increasing our understanding of learning biases will enable the refinement of existing models of value-based decision-making.
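A minimal sketch of the asymmetric-update idea described above. This is not the authors' fitted model: the function name and the learning-rate values are illustrative assumptions, intended only to show how weighting choice-confirming prediction errors more heavily (positive errors for the chosen option, negative errors for the forgone option) produces the reported confirmation-like bias.

```python
# Sketch of a valence- and choice-dependent Rescorla-Wagner update (illustrative,
# not the model fitted by Palminteri et al.). Learning rates are assumed values.

def update_values(q_chosen, q_unchosen, r_chosen, r_unchosen,
                  alpha_confirm=0.4, alpha_disconfirm=0.1):
    delta_chosen = r_chosen - q_chosen        # factual prediction error (obtained outcome)
    delta_unchosen = r_unchosen - q_unchosen  # counterfactual prediction error (forgone outcome)

    # Choice-confirming news gets the larger learning rate:
    # good outcomes for the chosen option, bad outcomes for the forgone option.
    a_chosen = alpha_confirm if delta_chosen > 0 else alpha_disconfirm
    a_unchosen = alpha_confirm if delta_unchosen < 0 else alpha_disconfirm

    return (q_chosen + a_chosen * delta_chosen,
            q_unchosen + a_unchosen * delta_unchosen)

# Example: the chosen option pays off and the forgone option does not,
# so both updates use the larger (confirmatory) learning rate.
print(update_values(q_chosen=0.5, q_unchosen=0.5, r_chosen=1.0, r_unchosen=0.0))
# -> (0.7, 0.3)
```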
enduring extremes? polar vortex, drought, and climate change beliefs
benjamin a. lyons et al. 2018
http://dx.doi.org/10.1080/17524032.2018.1520735
alpha activity reflects the magnitude of an individual bias in human perception
laetitia grabot, christoph kayser 2020
http://dx.doi.org/10.1523/jneurosci.2359-19.2020
The researchers used electroencephalography to monitor the brain activity of adults while they made a decision. The participants saw a picture and heard a sound milliseconds apart and then decided which one came first. Prior to the experiment, the researchers determined if the participants possessed a bias for choosing the picture or sound. Before the first stimulus appeared, the strength of the alpha waves revealed how the participants would decide. Weaker alpha waves meant resisting the bias; stronger alpha waves indicated succumbing to the bias.
a drama movie activates brains of holistic and analytical thinkers differentially
mareike bacha-trams et al. 2018
http://dx.doi.org/10.1093/scan/nsy099
Aalto University researchers showed volunteers the film My Sister's Keeper on a screen while the research subjects were lying down in an MRI scanner. The study compared the volunteers' brain activity, and concluded that holistic thinkers saw the film more similarly with each other than analytical thinkers. In addition, holistic thinkers processed the film's moral issues and factual connections within the film more similarly with each other than the analytical thinkers.
Before conducting the MRI scan, the 26 persons participating in the research were divided into holistic and analytical thinkers on the basis of a previously established evaluation survey. According to previous studies, analytical thinkers pay attention to objects and persons while looking at photographs, whereas holistic thinkers consider also the background and context.
'Holistic thinkers showed more similarities in extensive areas of the cerebral cortex than analytical thinkers. This suggests that holistic thinkers perceive a film more similarly with each other than analytical thinkers,' says Professor Iiro Jääskeläinen.
Significantly more similarity was observed in holistic thinkers in the parts of the brain generally related to moral processing -- in the occipital, prefrontal and anterior parts of the temporal cortices. This suggests the holistic thinkers processed the moral questions of My Sister's Keeper in a similar way to one another. The anterior parts of the temporal lobes, however, process meanings of words.
Analytical thinkers showed similarities mainly in the sensory and auditory parts of the brain. They listen to the dialogue literally, whilst holistic thinkers perceive the meanings through the context and their own interpretation of the film's narrative.
'It was surprising to find so many large differences in so many cerebral areas between the groups,' Professor Jääskeläinen said. 'Analytical and holistic thinkers clearly see the world and events in very different ways. On the basis of the visual cortex, it can still be concluded that holistic thinkers follow the film scenes more similarly, whereas analytical thinkers are more individual and focus more on details.'
So far, research dealing with analytical and holistic views has focused on cultural differences between the east and west: more analytical thinking has been detected in western cultures, and more holistic thinking in eastern cultures. Now the study was carried out within one culture and, for the first time, as a film study.
'The research can help people understand the way other people observe the world. A holistic thinker may find it frustrating that an analytical thinker interprets things literally, sticks to details and does not see the big picture or context. An analytical thinker may, on the other hand, see the holistic thinker as a superstitious person, who believes in long causal links, such as the butterfly effect.'
abstract People socialized in different cultures differ in their thinking styles. Eastern-culture people view objects more holistically by taking context into account, whereas Western-culture people view objects more analytically by focusing on them at the expense of context. Here we studied whether participants, who have different thinking styles but live within the same culture, exhibit differential brain activity when viewing a drama movie. A total of 26 Finnish participants, who were divided into holistic and analytical thinkers based on self-report questionnaire scores, watched a shortened drama movie during functional magnetic resonance imaging. We compared intersubject correlation (ISC) of brain hemodynamic activity of holistic vs analytical participants across the movie viewings. Holistic thinkers showed significant ISC in more extensive cortical areas than analytical thinkers, suggesting that they perceived the movie in a more similar fashion. Significantly higher ISC was observed in holistic thinkers in occipital, prefrontal and temporal cortices. In analytical thinkers, significant ISC was observed in right-hemisphere fusiform gyrus, temporoparietal junction and frontal cortex. Since these results were obtained in participants with similar cultural background, they are less prone to confounds by other possible cultural differences. Overall, our results show how brain activity in holistic vs analytical participants differs when viewing the same drama movie.
differential inter-subject correlation of brain activity when kinship is a variable in moral dilemma
mareike bacha-trams et al. 2018
http://dx.doi.org/10.1038/s41598-017-14323-x
is overconfidence a social liability? the effect of verbal versus nonverbal expressions of confidence
elizabeth r. tenney et al. 2018
http://dx.doi.org/10.1037/pspi0000150
The team conducted a series of experiments in which participants met potential collaborators or advisers and decided which — the confident or cautious — they trusted and wanted to work with most. On average, they strongly preferred the confident candidate; however, once they learned that person was overconfident and the cautious counterpart was well-calibrated, caution won.
“Interestingly, though, we found that if the overly confident candidates expressed their confidence nonverbally, they remained the most trusted and desirable choice, even when revealed to be over-the-top,” Meikle says.
The findings illustrate how politicians, business leaders and others are able to retain their status and influence even when they are potentially exposed as being overconfident: by leveraging plausible deniability — their ability to deny responsibility due to a lack of concrete evidence.
“The plausible deniability hypothesis explains why overconfidence sometimes, but not always, is punished,” Meikle says. “For example, verifiably overconfident claims, void of plausible deniability, will face consequences. However, there are a number of ways people can create plausible deniability.”
abstract What are the reputational consequences of being overconfident? We propose that the channel of confidence expression is one key moderator—that is, whether confidence is expressed verbally or nonverbally. In a series of experiments, participants assessed target individuals (potential collaborators or advisors) who were either overconfident or cautious. Targets expressed confidence, or a lack thereof, verbally or nonverbally. Participants then learned targets’ actual performance. Across studies, overconfidence was advantageous initially—regardless of whether targets expressed confidence verbally or nonverbally. After performance was revealed, overconfident targets who had expressed confidence verbally were viewed more negatively than cautious targets; however, overconfident targets who had expressed confidence nonverbally were still viewed more positively than cautious ones. The one condition wherein nonverbal overconfidence was detrimental was when confidence was clearly tied to a falsifiable claim. Results suggest that, compared with verbal statements, nonverbal overconfidence reaps reputational benefits because of its plausible deniability.
pre–suasion: a revolutionary way to influence and persuade
robert cialdini 2016 9781501109812
surprise, recipes for surprise, and social influence
jeffrey loewenstein 2018
http://dx.doi.org/10.1111/tops.12312
perceived entitlement causes discrimination against attractive job candidates in the domain of relatively less desirable jobs
margaret lee et al. 2017
dx.doi.org/10.1037/pspi0000114
prediction
now you see it: our brains predict the outcomes of our actions, shaping reality into what we expect. that’s why we see what we believe
daniel yon 2019
https://aeon.co/essays/how-our-brain-sculpts-experience-in-line-with-our-expectations
déjà vu: an illusion of prediction
anne m. cleary, alexander b. claxton 2018
http://dx.doi.org/10.1177/0956797617743018
what you saw is what you will hear: two new illusions with audiovisual postdictive effects
noelle r. b. stiles et al. 2018
http://dx.doi.org/10.1371/journal.pone.0204217
the first-member heuristic: group members labeled “first” influence judgment and treatment of groups
janina steinmetz et al. 2019
http://dx.doi.org/10.1037/pspi0000201
The researchers conducted seven separate studies to confirm that the performance of a group's first member can significantly influence people's decisions about the rest of the group.
One implication from this research can be found in supermarkets or retailers where numbered cash registers are in use -- specifically on registers labelled with the numeral one.
Because the number one on a register labels its cashier as the group's first member, even though in reality it is an arbitrary number, a person who has a bad experience with the cashier at register number one will judge the whole store more harshly than if they had a bad experience at cashier number three, five or any other number.
Conversely, a pleasant experience at register number one will result in greater positivity about the store than it would at any other register.
"If the first group member to do something is bad then the whole group is seen as bad, if the first group member to do something is great then the whole group is seen as great, and this is much less the case if the middle or the last member does something," Dr Steinmetz said.
This is because, in people's experience, the first group member is often influential for the group -- a company's first employee shapes its culture much more than later employees, for example -- and people then apply this logic to first members in general.
In one research study, participants were presented with a scenario where five international cancer researchers are granted temporary work visas with a potential for extension.
When the participants were told that the scientist whose visa was approved first made a grave mistake they were more likely to judge the whole group of scientists as incompetent and less likely to support extending their work visas.
"When the first one makes the big mistake, people are more likely to say that all these scientists are terrible and we don't want them in the country," Dr Steinmetz said.
"People are more forgiving when the mistake is made by the scientist who receives their visa in the middle or last in the group and don't make such a harsh judgment."
This effect occurred although there was no reason to believe that the first researcher was special in any way, or that the group was actually incompetent. Instead, people were ready to deport the scientists because the bad apple in their group happened to be first in some arbitrary way when receiving the visa.
Other studies conducted during the research found the effect is replicated in judging students, athletes, and even racehorses.
"If the first horse of a group that are trained together runs very slowly in its race then other members of its group are expected to be slow as well and people would be less likely to bet on them," Dr Steinmetz said.
abstract People often make judgments about a group (e.g., immigrants from a specific country) based on information about a single group member. Seven studies (N = 1,929) tested the hypothesis that people will expect the performance of an arbitrarily ordered group to match that of the group member in the first position of a sequence more closely than that of group members in other positions. This greater perceived diagnosticity of the first member will in turn affect how people treat the group. This pattern of judgment and treatment of groups, labeled the “first-member heuristic,” generalized across various performance contexts (e.g., gymnastic routine, relay race, and job performance), and regardless of whether the focal member performed poorly or well (Studies 1–3). Consistent with the notion that first members are deemed most informative, participants were more likely to turn to the member in the first (vs. other) position to learn about the group (Study 4). Further, through their disproportionate influence on the expected performance of other group members, first members’ performances also influenced participants’ support for policies that would benefit or hurt a group (Study 5) and their likelihood to join a group (Study 6). Finally, perceived group homogeneity moderated the first-member heuristic, such that it attenuated for nonhomogeneous groups (Study 7).
symbolic sequence effects on consumers’ judgments of truth for brand claims
dan king, sumitra auschaitrakul 2019
http://dx.doi.org/10.1002/jcpy.1132
The brain attempts to organize information in ways that follow familiar patterns and sequences. One of the most universal, well-known patterns is the alphabet, and the investigators suspected that claims with first letters conforming to the arbitrary "ABCD" sequence -- such as Andrenogel Increases Testosterone -- would be perceived as more truthful. The study is available online in the Journal of Consumer Psychology.
"We go about our lives looking for natural sequences, and when we find a match to one of these patterns, it feels right," says study author Dan King, PhD, an assistant professor at the University of Texas Rio Grande Valley. "An embedded alphabetic sequence, even if unconsciously perceived, feels like a safe haven, and our brains can make unconscious judgments that cause-and-effect statements following this pattern are true."
To test this "symbolic sequence effect," the researchers conducted an experiment in which one group of participants read 10 claims that followed the natural alphabetic sequence, such as "Befferil Eases Pain" or "Aspen Moisturizes Skin," and the control group read statements that did not conform to alphabetical order, such as "Vufferil Eases Pain" or "Vaspen Moisturizes Skin." Then both groups rated their estimation of the truthfulness of the claims. The truthfulness ratings were significantly higher for the claims that followed an alphabetical order, even if participants could not attribute the source of the feeling of truthfulness.
Then the researchers tested whether they could temporarily alter the brain's pattern recognition process and consequently influence an individual's perception of a claim's truthfulness. In this experiment, one group of participants watched a short video clip of the alphabet sung normally while another group saw the clip with the ABC song sung in reverse order. Later, the groups rated the truthfulness of 10 claims.
The truthfulness ratings for claims following the reversed alphabetical sequence -- such as "Uccuprin Strengthens Heart" -- were higher for participants who had heard the alphabet sung in reverse.
The finding suggests that companies may be more likely to convince consumers that a slogan or claim is true if the causal statement follows an alphabetical order, King says. The more frightening implication, though, relates to fake news. Headlines with cause-effect statements that are in alphabetical order may feel more true, even if they are not.
"Consumers need to make evaluations based on fact or experimental evidence rather than whether something feels right," says King. "The alphabet is a random, arbitrary sequence we have learned, and it can play tricks on the brain when it comes to making judgments."
abstract We introduce symbolic sequence effects—the consequences of whether the sequence of the initial letters of a pair of words (e.g., a word representing a putative cause and another word representing a putative effect) conforms to the structure of symbolic sequences that are stored in the mind as overlearned natural language traces (“natural sequence”). We synthesize insights from psychophysics as well as numerical and natural language symbolic representations to demonstrate that consumers are able to unconsciously perceive the mere sequence of symbols contained in a brand claim, and that this sequence information influences judgments of truth. Across three experiments, we showed that when a brand claim is structured in a way that is consistent with the natural sequence of symbols (“A causes B” rather than “B causes A”), people experience feelings of sequential fluency, which in turn influences judgments of truth. This occurs despite the inability of participants to attribute the true source of the feelings. Our results suggest that carefully designed brand claims are likely to benefit from this natural sequencing. These findings provide important contributions to the literatures on processing fluency, branding, and advertising. These findings also have sobering societal implications and warn that fake news might be more persuasive if the perpetrators understand symbolic sequencing techniques.
rude color glasses: the contaminating effects of witnessed morning rudeness on perceptions and behaviors throughout the workday
andrew woolum et al. 2017
http://psycnet.apa.org/doi/10.1037/apl0000247
coffee cues elevate arousal and reduce level of construal
eugene y. chan, sam j. maglio 2019
http://dx.doi.org/10.1016/j.concog.2019.02.007
"As long as individuals see a connection between coffee and arousal, whatever its origin may be, mere exposure to coffee-related cues might trigger arousal in and of themselves without ingesting any form of caffeine," Dr Chan said.
"Smelling coffee gives rise to the beverage's psychoactive, arousing effects. This is because the brains of habitual coffee consumers are conditioned to respond to coffee in certain ways, as per the prominent Pavlov's dog theory.
"So walking past your favourite café, smelling the odours of coffee grounds, or even witnessing coffee-related cues in the form of advertising can trigger the chemical receptors in our body enough for us to obtain the same arousal sensations without consumption."
Researchers exposed 871 participants from Western and Eastern cultures to coffee and tea-related cues across four separate experiments that would make them think of the substance without actually ingesting it.
In one study, participants had to come up with advertising slogans for coffee or tea. In another, they had to mock-up news stories about the health benefits of drinking coffee or tea. The arousal levels and heart rates were monitored by the researchers throughout the studies.
The study centred on a psychological effect called 'mental construal'. This determines how individuals think and process information, whether they focus on narrow details or the bigger picture.
Results showed that priming people with coffee cues -- exposing them to images and other stimuli (smells and sounds) about coffee -- increased their alertness, energy levels, heart rate, and made them think narrowly.
The cognitive-altering effects of coffee were more prevalent in participants from Western countries, where coffee is more popular and has connotations related to energy, focus and ambition, compared to those from Eastern countries. Coffee was also associated with greater arousal than tea.
"Our research can offer intriguing implications, as it relies not on physiology but rather psychological associations to change our cognitive patterns," Dr Chan said.
"This study could even help to explain how drinking decaffeinated coffee can produce faster reaction times on tasks. Perhaps the mental association between coffee and arousal is so strong that it can produce cognitive changes even where there's no caffeine ingestion physiologically.
"This adds to the growing amount of literature documenting that the foods we eat and the beverages we drink do more than simply provide nutrition or pleasure -- mere exposure to, or reminders of them, affect how we think."
abstract •In Western societies, coffee is associated with greater arousal than tea.
•Thus, exposure to coffee- (vs. tea-) related cues should increase arousal and lower mental construal level.
•We conducted four experiments to test this hypothesis, presenting participants with cues related to either coffee or tea.
•The results suggest that exposure to coffee cues can lead to a concrete construal via greater arousal.
•The effects arise even without actually drinking coffee or tea.
Coffee and tea are two beverages commonly-consumed around the world. Therefore, there is much research regarding their physiological effects. However, less is known about their psychological meanings. Derived from a predicted lay association between coffee and arousal, we posit that exposure to coffee-related cues should increase arousal, even in the absence of actual ingestion, relative to exposure to tea-related cues. We further suggest that higher arousal levels should facilitate a concrete level of mental construal as conceptualized by Construal Level Theory. In four experiments, we find that coffee cues prompted participants to see temporal distances as shorter and to think in more concrete, precise terms. Both subjective and physiological arousal explain the effects. We situate our work in the literature that connects food and beverage to cognition or decision-making. We also discuss the applied relevance of our results as coffee and tea are among the most prevalent beverages globally.
expertise fails to attenuate gendered biases in judicial decision-making
andrea l. miller 2018
http://dx.doi.org/10.1177/1948550617741181
emotion
emotion sensitivity across the lifespan: mapping clinical risk periods to sensitivity to facial emotion intensity
lauren a. rutter et al. 2019
http://dx.doi.org/10.1037/xge0000559
For the study, researchers created a digital test of emotion sensitivity that was completed by nearly 10,000 men and women, ranging in age from 10 to 85. The test allowed researchers to measure how much each person was able to detect subtle differences in facial cues of fear, anger, and happiness. The test also identified how people in different age groups displayed changes in their sensitivity to those facial emotions.
Rutter, the study's lead author and a research fellow at McLean Hospital's Laboratory for Brain and Cognitive Health Technology, explained that participants were tested using the web-based platform TestMyBrain.org. They were shown images of faces, presented in pairs, and were asked "Which face is more angry?," "Which face is more happy?," or "Which face is more fearful?" Rutter stated that the online platform helped the researchers tap into a "much larger and more diverse sample set" than previous studies. She also said that the novel testing method helped improve the accuracy of the results for decoding facial cues.
Germine, the study's senior author, said that the new testing method and the large sample size helped the researchers gain a deeper understanding of differences in emotion processing. "From studies and anecdotal evidence, we know that the everyday experiences of an adolescent are different from those of a middle-aged or older person, but we wanted to understand how these experiences might be linked with differences in basic emotion understanding," said Germine, who is the technical director of the McLean Institute for Technology in Psychiatry and director of the Laboratory for Brain and Cognitive Health Technology. Rutter added that "the paper grew out of knowing that these differences existed and wanting to compare these differences across the emotion categories."
Through their study, the researchers also drilled down on the way emotion sensitivity develops during adolescence.
"We found that sensitivity to anger cues improves dramatically during early to mid-adolescence," said Rutter. "This is the exact age when young people are most attuned to forms of social threat, such as bullying. The normal development of anger sensitivity can contribute to some of the challenges that arise during this phase of development."
On the other end of the life span, the study showed that sensitivity to facial cues for fear and anger decrease as people age, but the ability to detect happiness cues stays the same. "It's well established that there is an age-related decline in the ability to decode emotion cues, in general, but here we see very little decline in the ability to detect differences in happiness," Germine said. This is even though the study was designed to be sensitive to differences in happiness sensitivity with age, based on principles from psychometrics and signal detection theory.
"What's remarkable is that we see declines in many visual perceptual abilities as we get older, but here we did not see such declines in the perception of happiness," she said. "These findings fit well with other research showing that older adults tend to have more positive emotions and a positive outlook."
Now, the researchers are building on this study by conducting new work that examines how emotion sensitivity is related to differences in aspects of mental health, such as anxiety. The team is also looking at how sensitivity to anger and happiness cues might be related to the development of poorer mental health after trauma.
abstract Face emotion perception is important for social functioning and mental health. In addition to recognizing categories of face emotion, accurate emotion perception relies on the ability to detect subtle differences in emotion intensity. The primary aim of this study was to examine participants’ ability to discriminate the intensity of facial emotions (emotion sensitivity: ES) in three psychometrically matched ES tasks (fear, anger, or happiness), to identify developmental changes in sensitivity to face emotion intensity across the lifespan. We predicted that increased age would be associated with lower anger and fear ES, with minimal differences in happiness ES. Participants were 9,546 responders to a Web-based ES study (age range = 10 to 85 years old). Results of segmented linear regression confirmed our hypotheses and revealed differential patterns of ES based on age, sex, and emotion category. Females showed enhanced sensitivity to anger and fear relative to males, but similar sensitivity to happiness. While sensitivity to all emotions increased during adolescence and early adulthood, sensitivity to anger showed the largest increase, potentially related to the importance of anger perception during adolescent development. We also observed age-related decreases in both anger and fear sensitivity in older adults, with little to no change in happiness sensitivity. Unlike previous studies, the effect observed here could not be explained by task-related confounds (e.g., ceiling effects for happiness recognition), lending strong support to observed differences in ES for happiness, anger, and fear across age. Implications for everyday functioning and the development of psychopathology across the lifespan are discussed.
the perils of murky emotions: emotion differentiation moderates the prospective relationship between naturalistic stress exposure and adolescent depression
lisa r. starr et al. 2019
http://dx.doi.org/10.1037/emo0000630
"Adolescents who use more granular terms such as 'I feel annoyed,' or 'I feel frustrated,' or 'I feel ashamed' -- instead of simply saying 'I feel bad' -- are better protected against developing increased depressive symptoms after experiencing a stressful life event," explains lead author Lisa Starr, an assistant professor of psychology at the University of Rochester.
Those who score low on negative emotion differentiation tend to describe their feelings in more general terms such as "bad" or "upset." As a result, they are less able to benefit from useful lessons encoded in their negative emotions, including the ability to develop coping strategies that could help them regulate how they feel.
"Emotions convey a lot of information. They communicate information about the person's motivational state, level of arousal, emotional valence, and appraisals of the threatening experience," says Starr. A person has to integrate all that information to figure out -- "am I feeling irritated," or "am I feeling angry, embarrassed, or some other emotion?"
Once you know that information you can use it to help determine the best course of action, explains Starr: "It's going to help me predict how my emotional experience will unfold, and how I can best regulate these emotions to make myself feel better."
The team found that a low NED strengthens the link between stressful life events and depression, leading to reduced psychological well-being.
By focusing exclusively on adolescence, which marks a time of heightened risk for depression, the study zeroed in on a gap in the research to date. Prior research suggests that during adolescence a person's NED plunges to its lowest point, compared to that of younger children or adults. It's exactly during this developmentally crucial time that depression rates climb steadily.
Previous research had shown that depression and low NED were related to each other, but the research designs of previous studies did not test whether a low NED temporally preceded depression. To the researchers, this phenomenon became the proverbial chicken-and-egg question: did those youth who showed signs of significant depressive symptoms have a naturally low NED, or was their NED low as a direct result of their feeling depressed?
The team, made up of Starr, Rachel Hershenberg, an assistant professor of psychiatry at Emory University, and Rochester graduate students Zoey Shaw, Irina Li, and Angela Santee, recruited 233 mid-adolescents in the greater Rochester area with an average age of nearly 16 (54 percent of them female) and conducted diagnostic interviews to evaluate the participants for depression.
Next, the teenagers reported their emotions four times daily over the period of seven days. One and a half years later, the team conducted follow-up interviews with the original participants (of whom 193 returned) to study longitudinal outcomes.
The researchers found that youth who are poor at differentiating their negative emotions are more susceptible to depressive symptoms following stressful life events. Conversely, those who display high NED are better at managing the emotional and behavioral aftermath of being exposed to stress, thereby reducing the likelihood of having negative emotions escalate into a clinically significant depression over time.
Depression ranks among the most challenging public health problems worldwide. As the most prevalent mental disorder, it not only causes recurring and difficult conditions for sufferers, but also costs the U.S. economy tens of billions of dollars each year and has been identified by the World Health Organization as the number one cause of global burden among industrialized nations. Depression in adolescent girls is a particularly important area to study, the researchers note, as this age brings a surge in depression rates, with a marked gender disparity that continues well into adulthood.
Adolescent depression disrupts social and emotional development, which can lead to a host of negative outcomes, including interpersonal problems, reduced productivity, poor physical health, and substance abuse. Moreover, people who get depressed during adolescence are more likely to become repeatedly depressed throughout their life span, says Starr. That's why mapping the emotional dynamics associated with depression is key to finding effective treatments.
"Basically you need to know the way you feel, in order to change the way you feel," says Starr. "I believe that NED could be modifiable, and I think it's something that could be directly addressed with treatment protocols that target NED."
The team's findings contribute to a growing body of research that tries to make inroads in the fight against rising rates of adolescent depression, suicidal thoughts, and suicide. According to the most recent CDC data, about 17 percent of high school students nationwide say they have thought of suicide, more than 13 percent said they actually made a suicide plan, and 7.4 percent attempted suicide in the past year.
"Our data suggests that if you are able to increase people's NED then you should be able to buffer them against stressful experiences and the depressogenic effect of stress," says Starr.
abstract Negative emotion differentiation (NED) refers to the ability to identify and label discrete negative emotions. Low NED has been previously linked to depression and other indices of low psychological well-being. However, this construct has rarely been explored during adolescence, a time of escalating depression risk, or examined in the context of naturalistic stressors. Further, the association between NED and depression has never been tested longitudinally. We propose a diathesis-stress model wherein low NED amplifies the association between stressful life events (SLEs) and depression. A sample of 233 community-recruited midadolescents (mean age 15.90 years, 54% female) completed diagnostic interviews and reported on mood and daily stressors 4 times per day for 7 days. SLEs were assessed using a semistructured interview with diagnosis-blind team coding based on the contextual threat method. Follow-up interviews were conducted 1.5 years after baseline. Low NED was correlated with depression but did not predict prospective changes in depression as a main effect. Confirming predictions and supporting a diathesis-stress model, low NED predicted (a) within-subjects associations between daily hassles and momentary depressed mood, (b) between-subjects associations between SLE severity and depression, and (c) prospective associations between SLE severity and increases in depression at follow-up. Results were specific to negative (vs. positive) emotion differentiation. Results suggest that low NED is primarily depressogenic in the context of high stress exposure.
prevalence-induced concept change in human judgment
david e. levari et al. 2018
http://dx.doi.org/10.1126/science.aap8731
anomaly detection in paleoclimate records using permutation entropy
joshua garland et al. 2018
http://dx.doi.org/10.3390/e20120931
When making sense of the massive amount of information packed into an ice core, scientists face a forensic challenge: how best to separate the useful information from the corrupt.
A new paper published in the journal Entropy shows how tools from information theory, a branch of complexity science, can address this challenge by quickly homing in on portions of the data that require further investigation.
"With this kind of data, we have limited opportunities to get it right," says Joshua Garland, a mathematician at the Santa Fe Institute who works with 68,000 years of data from the West Antarctic Ice Sheet Divide ice Core. "Extracting the ice and processing the data takes hundreds of people, and tons of processing and analysis. Because of resource constraints, replicate cores are rare. "
By the time Garland and his team got ahold of the data, more than 10 years had passed from the initial drilling of the ice core to the publishing of the dataset it contained. The two-mile ice core was extracted over five seasons from 2007-2012, by teams from the multiple universities funded by the National Science Foundation. From the field camp in West Antarctica, the core was packaged, then shipped to the National Science Foundation Ice Core Facility in Colorado, and finally to the University of Colorado. At the Stable Isotope Lab at the Institute of Arctic and Alpine Research, a state-of-the-art processing facility helped scientists pull water isotope records from the ice.
The result is a highly resolved, complex dataset. Compared to previous ice core data, which allowed for analysis every 5 centimeters, the WAIS Divide core permits analysis at millimeter resolution.
"One of the exciting thing about ice core research in the last decade is we've developed these lab systems to analyze the ice in high resolution," says Tyler Jones, a paleoclimatologist at the University of Colorado Boulder. "Quite a while back we were limited in our ability to analyze climate because we couldn't get enough data points, or if we could it would take too long. These new techniques have given us millions of data points, which is rather difficult to manage and interpret without some new advances in our [data] processing."
In previous cores, Garland notes that decades, even centuries, were aggregated into a single point. The WAIS data, by contrast, sometimes gives more than forty data points per year. But as scientists move to analyze the data at shorter time scales, even small anomalies can be problematic.
"As fine-grained data becomes available, fine-grained analyses can be performed," Garland notes. "But it also makes the analysis susceptible to fine-grained anomalies."
To quickly identify which anomalies require further investigation, the team uses information theoretic techniques to measure how much complexity appears at each point in the time sequence. A sudden spike in the complexity could mean that there was either a major, unexpected climate event, like a super volcano, or that there was an issue in the data or the data processing pipeline.
"This kind of anomaly would be invisible without a highly detailed, fine-grained, point-by-point analysis of the data, which would take a human expert many months to perform," says Elizabeth Bradley, a computer scientist at the University of Colorado Boulder and External Professor at the Santa Fe Institute. "Even though information theory can't tell us the underlying cause of an anomaly, we can use these techniques to quickly flag the segments of the data set that should be investigated by paleoclimate experts."
She compares the ice core dataset to a Google search that returns a million pages. "It's not that you couldn't go through those million pages," Bradley says. "But imagine if you had a technique that could point you toward the ones that were potentially meaningful?" When analyzing large, real-world datasets, information theory can spot differences in the data that signal either a processing error or a significant climate event.
In their Entropy paper, the scientists detail how they used information theory to identify and repair a problematic stretch of data from the original ice core. Their investigation eventually prompted a resampling of the archival ice core -- the longest resampling of a high-resolution ice core to date. When that portion of the ice was resampled and reprocessed, the team was able to resolve an anomalous spike in entropy from roughly 5,000 years ago.
"It's vitally important to get this area right," Garland notes, "because it contains climate information from the dawn of human civilization."
"I think climate change is the most pressing problem ever to face humanity, and ice cores are undoubtedly the best record of Earth's climate going back hundreds of thousands of years," says Jones. "Information theory helps us sift through the data to make sure what we're putting out into the world is the absolute best and most certain product we can."
abstract Permutation entropy techniques can be useful for identifying anomalies in paleoclimate data records, including noise, outliers, and post-processing issues. We demonstrate this using weighted and unweighted permutation entropy with water-isotope records containing data from a deep polar ice core. In one region of these isotope records, our previous calculations (See Garland et al. 2018) revealed an abrupt change in the complexity of the traces: specifically, in the amount of new information that appeared at every time step. We conjectured that this effect was due to noise introduced by an older laboratory instrument. In this paper, we validate that conjecture by reanalyzing a section of the ice core using a more advanced version of the laboratory instrument. The anomalous noise levels are absent from the permutation entropy traces of the new data. In other sections of the core, we show that permutation entropy techniques can be used to identify anomalies in the data that are not associated with climatic or glaciological processes, but rather effects occurring during field work, laboratory analysis, or data post-processing. These examples make it clear that permutation entropy is a useful forensic tool for identifying sections of data that require targeted reanalysis—and can even be useful for guiding that analysis.
intention
cues to intention bias action perception toward the most efficient trajectory
katrina l. mcdonough et al. 2019
http://dx.doi.org/10.1038/s41598-019-42204-y
the way humans "see" the actions of others is slightly distorted by their expectations.
The new study shows these changes really reflect the intentions we attribute to others, and specifically happen when watching people but not other objects.
For the study, participants watched brief videos of an actor starting to reach either straight for an object or making an arched reach over an obstacle. Before the action was complete, the hand suddenly disappeared, and participants identified the point on the touch screen where they last saw it.
The trick was that in some trials, the hand made an arched reach even though there was no obstacle or started to reach straight even though an obstacle was in the way, so that it would knock into it. In other words, what people saw was clearly in conflict with their expectations about how people typically act.
As in the authors' earlier work, the results showed people were able to judge the expected actions accurately. The perception of unexpected ones was, however, subtly coloured by their expectations.
For unexpected straight reaches, people reported the hand disappeared slightly higher than it really did, as if they "saw" it start to avoid the obstacle even though it clearly did not.
Similarly, if there was no obstacle to avoid, high arched reaches were reported slightly lower, as if people saw a straighter action than was really shown. In other words, people tended to see the actions as they expected, but not as they really were.
The question was what would happen with another group of participants who watched not moving hands but balls -- objects to which no goals are typically attributed, and certainly no tendencies to avoid obstacles. When these participants reported what they had seen, they did not make these misjudgments, particularly when the ball did not move in a biological, human-like way.
The new results therefore show that it is really the intentions we attribute to other people -- but not objects -- that lead us to mis-perceive their actions.
Lead author Katrina L McDonough, a PhD candidate at the University, said: "The misjudgements we found were not large. People did not see a completely different movement than was really there, but even the subtle changes we measured could have large impacts in everyday life. If we see a person behaving ambiguously, for example, such small changes may be enough to make us interpret the behaviour differently or cause us to miss the true intention behind it."
Dr Patric Bach, Associate Professor and Head of the Action Prediction Lab, added: "While this study was conducted with typically developing participants, it may provide new avenues for understanding psychological conditions. It could explain, for example, why people with an autism spectrum condition sometimes find it hard to read the meaning of other people's behaviour. Conversely, it may help explain why people with schizophrenia are more prone to see meaning and intention where none exists."
abstract Humans interpret others’ behaviour as intentional and expect them to take the most energy-efficient path to achieve their goals. Recent studies show that these expectations of efficient action take the form of a prediction of an ideal “reference” trajectory, against which observed actions are evaluated, distorting their perceptual representation towards this expected path. Here we tested whether these predictions depend upon the implied intentionality of the stimulus. Participants saw videos of an actor reaching either efficiently (straight towards an object or arched over an obstacle) or inefficiently (straight towards obstacle or arched over empty space). The hand disappeared mid-trajectory and participants reported the last seen position on a touch-screen. As in prior research, judgments of inefficient actions were biased toward efficiency expectations (straight trajectories upwards to avoid obstacles, arched trajectories downward towards goals). In two further experimental groups, intentionality cues were removed by replacing the hand with a non-agentive ball (group 2), and by removing the action’s biological motion profile (group 3). Removing these cues substantially reduced perceptual biases. Our results therefore confirm that the perception of others’ actions is guided by expectations of efficient actions, which are triggered by the perception of semantic and motion cues to intentionality.
egocentric bias in emotional understanding of children and adults
hajimu hayashi et al. 2019
http://dx.doi.org/10.1016/j.jecp.2019.04.009
Even if the end result is the same, the recipient of an action usually responds with stronger emotions if the actions are intentional rather than accidental. For example, if we compare a scenario in which B intentionally destroyed A's treasured possession in front of A with a scenario in which B accidentally destroyed the possession, we assume that A is sadder if the action was intentional. In this case A sees B's actions, so A knows B's intentions. If A did not know B's intentions, A would feel the same amount of sadness in both cases. But if we as a third party judge that A is sadder if B intentionally destroyed A's possession, we are interpreting another person's emotions based on knowledge that only we have (the intentions of B), and egocentric bias is taking place.
This study looked at whether egocentric bias occurs in emotional understanding, and whether it differs by age or by positive versus negative context. The participants were 106 children aged 8-9, 108 children aged 11-12, 122 children aged 15-16, and 154 adults.
The research team prepared four pairs of stories. Two pairs featured negative contexts in which the protagonists were harmed by the actors and felt sad, and the other two pairs featured positive contexts in which the protagonists were helped by the actors and felt happy. The only difference between each pair of scenarios was whether the actions were intentional or accidental.
Two conditions were prepared in terms of the knowledge or ignorance of the protagonists. In the knowledge condition, the protagonists in both stories watched the actors and therefore knew that the actors intentionally or accidentally harmed or helped them. In the ignorance condition, the protagonists in both stories did not watch the actors and therefore did not know their intentions. After some fact-checking questions, the researchers asked the participants questions about emotional understanding, such as "Which girl is sadder at the end of each story?"
Scores were assigned based on the answers to these questions, and the researchers calculated average scores for the two pairs of stories in each of the negative and positive contexts (a minimal scoring sketch follows the list below):
+1: The participant judged that the protagonist was sadder when the harm was intentional (negative context) or happier when the help was intentional (positive context)
-1: The participant judged that the protagonist was sadder when the harm was accidental (negative context), or happier when the help was accidental (positive context)
0: The participant judged that both protagonists are equally sad (negative context), or equally happy (positive context)
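A minimal sketch of that scoring arithmetic, with invented data (this is not the authors' analysis code): each story-pair judgment is coded +1, 0 or -1 and averaged per participant, and in the ignorance condition a group mean reliably above 0 is the signature of egocentric bias.

```python
# Minimal sketch of the +1 / 0 / -1 scoring described above, with invented data;
# not the authors' analysis code. In the ignorance condition the expected group
# mean is 0, so a mean reliably above 0 signals egocentric bias.
from statistics import mean

def code_judgment(stronger_when_intentional, judged_equal):
    """Map one story-pair judgment onto the study's +1 / 0 / -1 score."""
    if judged_equal:
        return 0
    return 1 if stronger_when_intentional else -1

# Invented example: three participants, two story pairs each, in one context.
participants = [
    [code_judgment(True, False), code_judgment(False, True)],   # +1, 0  -> 0.5
    [code_judgment(True, False), code_judgment(True, False)],   # +1, +1 -> 1.0
    [code_judgment(False, False), code_judgment(False, True)],  # -1, 0  -> -0.5
]
per_participant = [mean(p) for p in participants]
print(f"group mean = {mean(per_participant):.2f}  (0 would mean no bias)")
```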
Average scores in the knowledge condition (in which the protagonists were aware of the actors' intentions) were significantly greater than 0 for all age groups in both contexts, confirming a general trend to judge the feelings of the recipient to be stronger when the actions are intentional. In the ignorance condition (in which the protagonists were ignorant of the actors' intentions), they should feel the same whether the action is accidental or intentional, so logically the average scores should be 0. However, the average scores were significantly greater than 0 for all age groups in both contexts, revealing that some participants interpret the protagonists' emotions based on information that the protagonists do not possess. This shows that people of any age may demonstrate an egocentric bias when interpreting the feelings of others, although this bias is stronger in younger children. There was no significant difference between the score for negative and positive contexts, so the type of emotion does not affect the level of bias.
Understanding the emotions of others is the key to social behaviors such as smooth communication and helping others. This study reveals that our understanding of the strength of other people's emotions can be distorted by the information we possess. This may lead to miscommunication and have an effect on actions like offering help and comfort. Let's put ourselves in the position of the actors in this study: if we accidentally harm someone else, we may wrongly assume that his/her feelings of sadness are weaker even if he/she is ignorant of our intention. Quarrels between children over trivial matters may also occur because of this sort of egocentric bias. This bias is particularly strong in children up to elementary school age, so adults can use their awareness of this in education and guidance for child socialization.
ingroup outgroup
contextual knowledge provided by a movie biases implicit perception of the protagonist
mamdooh afdile et al. 2019
http://dx.doi.org/10.1093/scan/nsz028
People's brains are naturally biased towards other people who are the same as them -- a behavioural trait scientists call 'in-group favouritism'. The opposite is also true: people are often naturally biased against people who are not the same as them, called 'out-group derogation'. Mamdooh Afdile -- a filmmaker studying for a PhD in neuroscience at Aalto University -- decided to use cinema to explore this.
Afdile used the film Priest to create a 20-minute stimulus film version that explored biases in two social groupings: heterosexual and homosexual men. 'If knowledge gained from our social environment can implicitly bias how we perceive each other, this should hold true to characters in movies as well,' Afdile explained. To see if watching the movie biased the viewers subconsciously, Afdile flashed the face of the protagonist repeatedly for a brief duration of 40 milliseconds before and after showing the movie.
Even though the viewer wouldn't be able to notice being shown a person's face -- much less have time to recognise the person -- their subconscious brain responded to the flashed face based on whether or not they had become biased. By using functional MRI, the researchers were able to detect how people's biases could be changed.
At the beginning of the movie, the viewer gets the impression that the priest is heterosexual and falling in love with a woman. At the 10-minute mark, the viewer finds out the priest is in fact in love with another man. The study groups watching the film consisted of 14 homosexual and 15 heterosexual men, and the team measured the bias felt by each group towards the priest character when they thought he was straight, and when they knew he was gay.
The social groupings were chosen by the researchers because, unlike race or gender, we cannot perceive another person's sexual orientation just by looking at their face -- so any bias response by the participants in the experiment toward the face presented to them would be dependent on what they came to know about the person. The subconscious response to the face of the protagonist after seeing the movie, compared to before seeing it, was significantly different between the two groups, and this result was not symmetrical. The results from the heterosexual group showed a very mild negative bias response, while, interestingly, those from the homosexual group showed a very strong response in brain regions associated with in-group processes such as empathy and favouritism.
These results are interesting for our understanding of unconscious bias because they demonstrate that the brain responds in a biased way to traits it can't detect using our basic senses.
'This study shows the brain can be biased based on learned knowledge and not only by external factors,' explains Mamdooh Afdile. 'By combining movies with subliminal measurement we can now investigate the subconscious brain in ways that were extremely difficult before.'
abstract We are constantly categorizing other people as belonging to our in-group (“one of us”) or out-group (“one of them”). Such grouping occurs fast and automatically and can be based on others’ visible characteristics such as skin color or clothing style. Here we studied neural underpinnings of implicit social grouping not often visible on the face, male sexual orientation. Fourteen homosexual and 15 heterosexual males were scanned in functional MRI while watching a movie about a homosexual man, whose face was also presented subliminally before (subjects did not know about the character’s sexual orientation) and after the movie. We discovered significantly stronger activation to the man’s face after seeing the movie in homosexual but not heterosexual subjects in medial prefrontal cortex (mPFC), frontal pole (FP), anterior cingulate cortex (ACC), right temporal parietal junction (rTPJ) and bilateral superior frontal gyrus (sFG). In previous research, these brain areas have been connected to social perception, self-referential thinking, empathy, theory of mind, and in-group perception. In line with previous studies showing biased perception of in/out-group faces to be context dependent, our novel approach further demonstrates how complex contextual knowledge gained under naturalistic viewing can bias implicit social perception.
sensitivity
genetic architecture of environmental sensitivity reflects multiple heritable components: a twin study with adolescents
elham assary et al. 2020
http://dx.doi.org/10.1038/s41380-020-0783-8
The study compared pairs of identical and non-identical 17-year-old twins to see how strongly they were affected by positive or negative experiences — their ‘sensitivity’ level. The aim was to tease out how much of the differences in sensitivity could be explained by either genetic or environmental factors during development: nature or nurture.
Twins who are brought up together will mostly experience the same environment. But only identical twins share the same genes: non-identical twins are like any other siblings. If identical twins show no more similarity in their levels of sensitivity than non-identical twins, then genes are unlikely to play a role.
Using this type of analysis, the team found that 47 percent of the differences in sensitivity between individuals were down to genetics, leaving 53 percent accounted for by environmental factors. The research, from Queen Mary University of London and King's College London, is the first to show this link conclusively in such a large study. The findings are published in Molecular Psychiatry.
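For intuition about where a figure like 47 percent comes from, the classic twin-design shortcut is Falconer's formula, which compares identical (MZ) and non-identical (DZ) twin correlations: heritability h² = 2(rMZ - rDZ), shared environment c² = 2·rDZ - rMZ, and unique environment e² = 1 - rMZ. The sketch below uses invented correlations chosen to land near the reported estimate; the study itself used classic twin-design modelling on its full sample of 2868 adolescent twins, so this is only a back-of-the-envelope illustration.

```python
# Back-of-the-envelope ACE decomposition from twin correlations (Falconer's
# formula). The correlations below are invented for illustration and are not
# values reported by the study.
def falconer(r_mz: float, r_dz: float) -> dict:
    """Heritability, shared-environment and unique-environment shares."""
    a2 = 2 * (r_mz - r_dz)   # additive genetic share (heritability)
    c2 = 2 * r_dz - r_mz     # shared-environment share
    e2 = 1 - r_mz            # unique-environment share (plus measurement error)
    return {"heritability": a2, "shared_env": c2, "unique_env": e2}

print(falconer(r_mz=0.5, r_dz=0.25))
# {'heritability': 0.5, 'shared_env': 0.0, 'unique_env': 0.5} -- the same
# ballpark as the reported 47 percent genetic / 53 percent environmental split.
```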
Michael Pluess, Professor of Developmental Psychology at Queen Mary University of London and study lead, said: “We are all affected by what we experience — sensitivity is something we all share as a basic human trait. But we also differ in how much of an impact our experiences have on us. Scientists have always thought there was a genetic basis for sensitivity, but this is the first time we’ve been able to actually quantify how much of these differences in sensitivity are explained by genetic factors.”
Over 2800 twins were involved in the study, split between around 1000 identical twins and 1800 non-identical twins, roughly half of whom were same sex. The twins were asked to fill out a questionnaire, developed by Professor Pluess, which has been widely used to test an individual’s levels of sensitivity to their environment. This test will be made available online later this month so anyone can assess their own sensitivity.
The questionnaire is also able to tease out different types of sensitivity — whether someone is more sensitive to negative experiences or positive experiences — as well as general sensitivity. The analysis by the team suggested that these different sensitivities also have a genetic basis.
Co-researcher Dr Elham Assary said: “If a child is more sensitive to negative experiences, it may be that they become more easily stressed and anxious in challenging situations. On the other hand, if a child has a higher sensitivity to positive experiences, it may be that they are more responsive to good parenting or benefit more from psychological interventions at school. What our study shows is that these different aspects of sensitivity all have a genetic basis.”
Finally, the team explored how sensitivity relates to other common and established personality traits, known as the ‘Big Five’: openness, conscientiousness, agreeableness, extraversion and neuroticism. They found that there was a shared genetic component between sensitivity, neuroticism and extraversion, but not with any of the other personality traits.
Professor Pluess believes the findings could help us understand and handle sensitivity, both in ourselves and in others.
“We know from previous research that around a third of people are at the higher end of the sensitivity spectrum. They are generally more strongly affected by their experiences,” he said. “This can have both advantages and disadvantages. Because we now know that this sensitivity is as much due to biology as environment, it is important for people to accept their sensitivity as an important part of who they are and consider it as a strength not just as a weakness.”
abstract Humans differ substantially in how strongly they respond to similar experiences. Theory suggests that such individual differences in susceptibility to environmental influences have a genetic basis. The present study investigated the genetic architecture of Environmental Sensitivity (ES) by estimating its heritability, exploring the presence of multiple heritable components and its genetic overlap with common personality traits. ES was measured with the Highly Sensitive Child (HSC) questionnaire and heritability estimates were obtained using classic twin design methodology in a sample of 2868 adolescent twins. Results indicate that the heritability of sensitivity was 0.47, and that the genetic influences underlying sensitivity to negative experiences are relatively distinct from sensitivity to more positive aspects of the environment, supporting a multi-dimensional genetic model of ES. The correlation between sensitivity, neuroticism and extraversion was largely explained by shared genetic influences, with differences between these traits mainly attributed to unique environmental influences operating on each trait.
vision
adaptive colour change and background choice behaviour in peppered moth caterpillars is mediated by extraocular photoreception
amy eacock et al. 2019
http://dx.doi.org/10.1038/s42003-019-0502-7
Cephalopods, chameleons and some fish camouflage themselves by adapting their color to their surroundings. These animals have a system to perceive color and light independently of the eyes. Some insects, such as caterpillars of the peppered moth (Biston betularia), also match their body color to the twig color of their food plant; although this color change is rather slow compared to other animals. Until now, scientists have not known how insect larvae can perceive the color of their environment and how the color change occurs. Two theories dating back more than 130 years proposed that the color change could be caused by the diet or by the animal seeing the color. As some insects are known to be able to perceive light -- but not color -- by the skin, researchers from Liverpool University and the Max Planck Institute for Chemical Ecology pursued three different approaches to finally solve the riddle of how caterpillars of the peppered moth match the color of their surroundings.
First, they tested if caterpillars of the peppered moth, whose eyes were painted over with black acrylic paint, could still adjust their color to the background. The blindfolded caterpillars were raised on white, green, brown and black branches and their body color observed. Even without being able to see, the caterpillars changed color to resemble the background to the same extent as caterpillars whose eyes were not covered. "It was completely surprising to me that blindfolded caterpillars could still change their color and match it to the background. I don't think my supervisor, Ilik Saccheri, believed me until he saw it by himself," says Amy Eacock, one of the lead authors of the new study and currently a postdoc at the Max Planck Institute for Chemical Ecology.
In behavioral experiments, blindfolded caterpillars had the choice to move to differently colored twigs. Consistently, the caterpillars rested on the twig most similar to their own color.
In a third approach, the researchers examined in which parts of the body genes related to vision were expressed. They found them not only in the head of the caterpillars, where the eyes are, but also in the skin of all body segments. One visual gene was expressed even more in the skin than in the heads of the caterpillars. "We assume that this gene is involved in the perception of background color by the skin," notes Hannah Rowland, second lead author and leader of the Max Planck Research Group, Predators and Toxic Prey.
"One of the major challenges animals face is how to avoid being eaten by predators. Numerous species have evolved camouflage to avoid being detected or recognised. A considerable problem, however, is how prey animals can match the range of visual backgrounds against which they are often seen. Color change enables animals to match their surroundings and potentially reduce the risk of predation," says Hannah Rowland, highlighting the study's ecological context. Amy Eacock adds: "We constructed a computer model that can 'see' the same way birds do, so we are able to conclude that these adaptations -- color change, twig-mimicking, behavioral background-matching -- likely evolved to avoid visual detection by predators." Caterpillars with better color sensing may have been eaten less by birds, while birds with improved vision may prey more upon these larvae, continuing the evolutionary predator-prey arms race.
abstract Light sensing by tissues distinct from the eye occurs in diverse animal groups, enabling circadian control and phototactic behaviour. Extraocular photoreceptors may also facilitate rapid colour change in cephalopods and lizards, but little is known about the sensory system that mediates slow colour change in arthropods. We previously reported that slow colour change in twig-mimicking caterpillars of the peppered moth (Biston betularia) is a response to achromatic and chromatic visual cues. Here we show that the perception of these cues, and the resulting phenotypic responses, does not require ocular vision. Caterpillars with completely obscured ocelli remained capable of enhancing their crypsis by changing colour and choosing to rest on colour-matching twigs. A suite of visual genes, expressed across the larval integument, likely plays a key role in the mechanism. To our knowledge, this is the first evidence that extraocular colour sensing can mediate pigment-based colour change and behaviour in an arthropod.
neural correlates of the conscious perception of visual location lie outside visual cortex
sirui liu et al. 2019
http://dx.doi.org/10.1016/j.cub.2019.10.033
“Our study provides clear evidence that the visual system is not representing what we see but is representing the physical world,” said lead author, Sirui Liu, a graduate student of psychological and brain sciences at Dartmouth. “What we see emerges later in the processing hierarchy, in the frontal areas of the brain that are not usually associated with visual processing.”
To examine how the perception of position occurs in the brain, participants were presented with visual stimuli and asked to complete a series of behavioral tasks while in a functional magnetic resonance imaging (fMRI) scanner. For one of the tasks, participants were asked to stare at a fixed black dot on the left side of the computer screen inside the scanner while a dot that flickered between black and white, known as a Gabor patch, moved in the periphery. Participants were asked to identify the direction the patch was moving. The patch appears to move across the screen at a 45 degree angle, when in fact it is moving up and down in a vertical motion. Here, the perceived path is strikingly different from the actual physical path that lands on the retina. This creates a “double-drift” illusion. The direction of the drift was randomized across the trials, where it drifted either towards the left, right or remained static.
Using fMRI data and multivariate pattern analysis, a method for studying neural activation patterns, the team investigated where the perceived path, tilted left or right from vertical, appears in the brain. They wanted to determine where conscious perception emerges and how the brain codes this. On average, participants reported that the perceived motion path was different from the actual path by 45 degrees or more. The researchers found that while the visual system collects the data, the switch between coding the physical path and coding the perceived path (illusory path) takes place outside of the visual cortex all the way in the frontal areas, which are higher-order brain regions.
“Our data firmly support that frontal areas are critical to the emergence of conscious perception,” explained study co-author and co-principal investigator, Patrick Cavanagh, a research professor of psychological and brain sciences at Dartmouth, and senior research fellow and adjunct professor of psychology at Glendon College. “While previous research has long established the frontal lobes are responsible for functions such as decision-making and thinking, our findings suggest that this area of the brain is also the end step for perceiving where objects are. So, that’s kind of radical,” he added.
abstract •Perceived path of the double-drift illusion is not represented in visual cortex
•Consciously perceived location emerges in anterior areas beyond the visual cortex
•The illusion’s long time constant matches properties of frontal not visual cortex
•Low-level features differentiate illusory stimuli in V2, V3, and MT+, but not V1
When perception differs from the physical stimulus, as it does for visual illusions and binocular rivalry, the opportunity arises to localize where perception emerges in the visual processing hierarchy. Representations prior to that stage differ from the eventual conscious percept even though they provide input to it. Here, we investigate where and how a remarkable misperception of position emerges in the brain. This “double-drift” illusion causes a dramatic mismatch between retinal and perceived location, producing a perceived motion path that can differ from its physical path by 45° or more. The deviations in the perceived trajectory can accumulate over at least a second, whereas other motion-induced position shifts accumulate over 80–100 ms before saturating. Using fMRI and multivariate pattern analysis, we find that the illusory path does not share activity patterns with a matched physical path in any early visual areas. In contrast, a whole-brain searchlight analysis reveals a shared representation in anterior regions of the brain. These higher-order areas would have the longer time constants required to accumulate the small moment-to-moment position offsets that presumably originate in early visual cortical areas and then transform these sensory inputs into a final conscious percept. The dissociation between perception and the activity in early sensory cortex suggests that consciously perceived position does not emerge in what is traditionally regarded as the visual system but instead emerges at a higher level.
oculomotor freezing reflects tactile temporal expectation and aids tactile perception
stephanie badde et al. 2020
doi.org/10.1038/s41467-020-17160-1
heart–brain interactions shape somatosensory perception and evoked potentials
esra al et al. 2020
http://dx.doi.org/10.1073/pnas.1915629117
The first mechanism establishes a relationship between the phase of the heartbeat and conscious experience. In a regular rhythm, the heart contracts in the so-called systolic phase and pumps blood into the body. In a second phase, the diastolic phase, the blood flows back and the heart fills up again. In a previous publication from the MPI CBS, it was reported that perception of external stimuli changes with the heartbeat. In systole, we are less likely to detect a weak electric stimulus in the finger compared to diastole.
Now, in a new study, Esra Al and colleagues have found the reason for this change in perception: brain activity changes over the heart cycle. In systole, a specific component of brain activity associated with consciousness, the so-called P300 component, is suppressed. In other words, it seems that -- in systole -- the brain makes sure that certain information is kept out of conscious experience. The brain seems to take into account the pulse which floods the body in systole and predicts that pulse-associated bodily changes are "not real" but rather due to the pulse. Normally, this helps us to not be constantly disturbed by our pulse. However, when it comes to weak stimuli which coincide with systole we might miss them, although they are real.
During their investigations on heart-brain interactions, Al and colleagues also revealed a second effect of heartbeat on perception: If a person's brain shows a higher response to the heartbeat, the processing of the stimulus in the brain is attenuated -- the person detects the stimulus less often. "This seems to be a result of directing our attention between external environmental signals and internal bodily signals," explains study author Al. In other words, a large heartbeat-evoked potential seems to reflect a "state of mind" in which we are more focused on the functioning of our inner organs such as the blood circulation, but less aware of stimuli from the outside world.
The results not only have implications for our understanding of heart-brain interactions in healthy persons, but also in patients. The senior author, Arno Villringer explains, "The new results might help to explain why patients after stroke often suffer from cardiac problems and why patients with cardiac disease often have impaired cognitive function."
The researchers investigated these relationships by sending weak electrical stimuli to electrodes clamped onto the study participants' fingers. In parallel, they recorded each participant's brain processes using an EEG and their cardiac activity using an EKG.
abstract Even though humans are mostly not aware of their heartbeats, several heartbeat-related effects have been reported to influence conscious perception. It is not clear whether these effects are distinct or related phenomena, or whether they are early sensory effects or late decisional processes. Combining electroencephalography and electrocardiography, along with signal detection theory analyses, we identify two distinct heartbeat-related influences on conscious perception differentially related to early vs. late somatosensory processing. First, an effect on early sensory processing was found for the heartbeat-evoked potential (HEP), a marker of cardiac interoception. The amplitude of the prestimulus HEP negatively correlated with localization and detection of somatosensory stimuli, reflecting a more conservative detection bias (criterion). Importantly, higher HEP amplitudes were followed by decreases in early (P50) as well as late (N140, P300) somatosensory-evoked potential (SEP) amplitudes. Second, stimulus timing along the cardiac cycle also affected perception. During systole, stimuli were detected and correctly localized less frequently, relating to a shift in perceptual sensitivity. This perceptual attenuation was accompanied by the suppression of only late SEP components (P300) and was stronger for individuals with a more stable heart rate. Both heart-related effects were independent of alpha oscillations’ influence on somatosensory processing. We explain cardiac cycle timing effects in a predictive coding account and suggest that HEP-related effects might reflect spontaneous shifts between interoception and exteroception or modulations of general attentional resources. Thus, our results provide a general conceptual framework to explain how internal signals can be integrated into our conscious perception of the world.
unconventional consumption methods and enjoying things consumed: recapturing the “first-time” experience
ed o’brien, robert w. smith 2018
http://dx.doi.org/10.1177/0146167218779823
“experiences vs things”
stuffocation
james wallman 20 978-0-9575245-2-1
“experiences are more prone to positive reinterpretation, less likely to be dulled by hedonic adaptation, harder to compare, more likely to contribute to identity, and they bring you closer to people.”
dopamine d2 receptors in discrimination learning and spine enlargement
yusuke iino et al. 2020
http://dx.doi.org/10.1038/s41586-020-2115-1
Psychosis is a debilitating psychological condition with a long history. Described in the medical writings of Hippocrates as early as the 4th century B.C., the psychotic state of hallucinations, delusions and disordered thought represents an existential threat to an afflicted human mind. Now, a team of researchers from the International Research Center for Neurointelligence (IRCN) and the Graduate School of Medicine at the University of Tokyo, and the Graduate School of Informatics at Kyoto University, proposes that psychosis involves defective neural signaling in a deep brain area called the ventral striatum during a behavior called discrimination learning.
Led by Lecturer Sho Yagishita and Professor Haruo Kasai, the researchers studied the way mice predict future rewards in their environment, a behavior known as reward learning, which is shared by us humans and other mammals, too. Reward learning involves the release of a chemical messenger dopamine to a receptor protein in the brain called dopamine D1 receptor (D1R) to signal the anticipation of a reward. Specifically, the team searched for a second dopamine signal that occurs only when the anticipated reward fails to materialize — reward omission.
The researchers suspected this signal for reward omission existed in neurons of the ventral striatum area of the brain that contain a counterpart to D1R, dopamine D2 receptor (D2R). Coincidentally, D2R is the major brain receptor for nearly every antipsychotic medication used to date. The team showed that reward omission triggers a signal in these neurons called the dopamine dip, a drop in dopamine levels, which lasts less than a second.
These dips seem to contribute to the process of discrimination learning, which includes how all animals, including humans, judge previously learned rewards and punishments. To explore the connection between dips and discrimination learning, the researchers used sophisticated optogenetic technologies to artificially increase or decrease the dips for the first time and measured their effects on how the mice estimated rewards. Optogenetics is a way to activate artificial light-sensitive proteins with finely controlled laser light to turn neuronal activity on or off.
“We initially observed that dips caused certain synaptic structures called spines to expand and send signals within D2R neurons,” said Yagishita. “We searched for several years before we discovered that discrimination learning was the cognitive process that refines reward learning following dopamine dips.”
To establish a link to psychosis, the authors administered a well-known psychosis-inducing drug, methamphetamine, and showed that both discrimination learning and dopamine dips were impaired. As a result, mice showed exaggerated behavioral and dopamine responses even when no reward was presented, as is the case in human psychosis. These deficits could be prevented with an antipsychotic compound that blocks D2R activity.
“If D2R signaling and discrimination learning is impaired, subjects may be unable to assign an appropriate significance to objects or people in their environment, and their fears or insecurities may fill in the gap,” said Yagishita. “For example, persecutory delusions arise from mistakenly assigning malevolent intent to strangers who pose no threat.”
The authors propose that these findings open a previously unknown window into psychosis. Their data show that an antipsychotic D2R drug can reverse effects of a psychosis-inducing one by specifically restoring the dopamine dips and discrimination learning to normal levels. Their hypothesis is that an impairment in discrimination learning can result in an inability to predict the environment accurately, leading to overt symptoms of psychosis or schizophrenia.
“The brain seems to have an intrinsic capacity for fantasy or delusional thinking, but there are built-in controls like D2R discrimination learning that help us to correct our misjudgments,” commented Kasai. “Our study raises the possibility that when these corrective controls break down, we can risk losing contact with reality and may enter a downward spiral of pathology.”
abstract Dopamine D2 receptors (D2Rs) are densely expressed in the striatum and have been linked to neuropsychiatric disorders such as schizophrenia. High-affinity binding of dopamine suggests that D2Rs detect transient reductions in dopamine concentration (the dopamine dip) during punishment learning. However, the nature and cellular basis of D2R-dependent behaviour are unclear. Here we show that tone reward conditioning induces marked stimulus generalization in a manner that depends on dopamine D1 receptors (D1Rs) in the nucleus accumbens (NAc) of mice, and that discrimination learning refines the conditioning using a dopamine dip. In NAc slices, a narrow dopamine dip (as short as 0.4 s) was detected by D2Rs to disinhibit adenosine A2A receptor (A2AR)-mediated enlargement of dendritic spines in D2R-expressing spiny projection neurons (D2-SPNs). Plasticity-related signalling by Ca2+/calmodulin-dependent protein kinase II and A2ARs in the NAc was required for discrimination learning. By contrast, extinction learning did not involve dopamine dips or D2-SPNs. Treatment with methamphetamine, which dysregulates dopamine signalling, impaired discrimination learning and spine enlargement, and these impairments were reversed by a D2R antagonist. Our data show that D2Rs refine the generalized reward learning mediated by D1Rs.
taste
functions of opsins in drosophila taste
nicole y. leung et al. 2020
http://dx.doi.org/10.1016/j.cub.2020.01.068
It is the “first example of a role of opsins in taste, or in any form of chemical sensation,” said coauthor Craig Montell, a distinguished professor of molecular, cellular, and developmental biology.
Scientists in the late 1800s discovered the light-sensing role of rhodopsin — which consists of an opsin bound to retinal, a form of vitamin A — and it has since become the most studied sensory receptor. Until recently, researchers believed that the family of rhodopsin proteins was involved only in light reception. However, in 2011, Montell and his colleagues found that an opsin enables the fruit fly Drosophila melanogaster to detect small temperature changes within its comfortable range.
Animals have many types of sensory proteins that respond to stimuli from the environment. Some require a strong stimulus, such as scalding heat, to activate. Rhodopsins are able to respond to very subtle changes or very low levels of stimuli — like those in very dim light conditions — and then initiate a molecular cascade that amplifies the signal, ultimately activating a sensory response.
Researchers in Montell’s lab used aristolochic acid — a toxic, bitter compound found in some plants — to study taste receptors in fruit flies. High concentrations of this bitter chemical activate the flies’ taste neurons by directly opening a channel protein called TRPA1, which lets calcium and sodium into the cells. This leads to a bitter taste the animals avoid. However, the flies also avoid even highly diluted aristolochic acid, which isn’t a strong enough signal to open the channels directly.
Montell and lead author Nicole Leung, who recently completed her predoctoral studies at UC Santa Barbara, suspected opsin molecules might be at work in detecting subtle chemical signals as well, via a signal amplification process.
They presented flies with a choice between sugar alone or sugar spiked with dilute aristolochic acid. Unsurprisingly, the flies rejected the sugar laced with the bitter chemical and ate the pure sugar.
The scientists then raised fruit flies with mutations that prevented them from synthesizing different opsin proteins. They found that flies with defects in any one of three types of opsins couldn’t detect the small concentrations of acid, and ate nearly equal amounts of the sugar laced with the bitter compound as the pure sugar. However, the mutant animals were still sensitive to large amounts of the bitter compound, which they continued to avoid. According to Montell, the large amounts of the bitter chemical directly activated the TRPA1 channel, which was still present in the flies missing the opsins.
The team showed that aristolochic acid activated these opsins by binding to the same site that retinal normally does in rhodopsin. Much like rhodopsins turned on by very dim light, the chemically-activated opsins then initiated a molecular cascade that amplified the small signals. This enabled the flies to detect concentrations of the compound that would otherwise be insufficient to trigger a response in their sensory neurons.
“Rhodopsins were discovered back in the 1870s,” Montell said, “so to discover that opsins have roles in taste after 150 years or so is pretty exciting.”
Montell speculates that chemoreception may have been the original role of opsin proteins. Chemical reception, he said, is a more basic requirement for life than is light reception. Knowing what to eat and what dangerous chemicals to avoid serves a more ancient survival function than does the ability to detect light. Perhaps by chance, he ventured, a retinal became bound to an opsin and conferred light sensitivity to the opsin.
Following Montell’s 2011 discovery that opsins function in temperature sensation, another group found that opsins play a role in hearing in flies. Now, with the demonstration that opsins are taste receptors as well, Montell suspects they may be involved in still additional senses.
“In every case, they provide a mechanism for sensing low levels of stimuli by initiating an amplification cascade,” he said.
The new finding likely extends beyond the fruit flies the scientists studied. “The ramifications are that maybe opsins represent a new class of taste receptor in mammals, including humans,” said Montell, a hypothesis the team is currently investigating.
abstract Rhodopsin is a light receptor comprised of an opsin protein and a light-sensitive retinal chromophore. Despite more than a century of scrutiny, there is no evidence that opsins function in chemosensation. Here, we demonstrate that three Drosophila opsins, Rh1, Rh4, and Rh7, are needed in gustatory receptor neurons to sense a plant-derived bitter compound, aristolochic acid (ARI). The gustatory requirements for these opsins are light-independent and do not require retinal. The opsins enabled flies to detect lower concentrations of aristolochic acid by initiating an amplification cascade that includes a G-protein, phospholipase Cβ, and the TRP channel, TRPA1. In contrast, responses to higher levels of the bitter compound were mediated through direct activation of TRPA1. Our study reveals roles for opsins in chemosensation and raises questions concerning the original roles for these classical G-protein-coupled receptors.
olfaction
mechanosensory-based phase coding of odor identity in the olfactory bulb
ryo iwata et al. 2017
http://dx.doi.org/10.1016/j.neuron.2017.11.008
•Mechanosensation in olfactory sensory neurons generates sniff-coupled oscillations
•Phase coding in mitral/tufted cells distinguishes odor from mechanical signals
•Phase coding is more stable than rate coding across time and odor concentrations
•The loss of mechanosensory-based oscillations impairs robust phase coding of odors
Mitral and tufted (M/T) cells in the olfactory bulb produce rich temporal patterns of activity in response to different odors. However, it remains unknown how these temporal patterns are generated and how they are utilized in olfaction. Here we show that temporal patterning effectively discriminates between the two sensory modalities detected by olfactory sensory neurons (OSNs): odor and airflow-driven mechanical signals. Sniff-induced mechanosensation generates glomerulus-specific oscillatory activity in M/T cells, whose phase was invariant across airflow speed. In contrast, odor stimulation caused phase shifts (phase coding). We also found that odor-evoked phase shifts are concentration invariant and stable across multiple sniff cycles, contrary to the labile nature of rate coding. The loss of oscillatory mechanosensation impaired the precision and stability of phase coding, demonstrating its role in olfaction. We propose that phase, not rate, coding is a robust encoding strategy of odor identity and is ensured by airflow-induced mechanosensation in OSNs.
volatile biomarkers of symptomatic and asymptomatic malaria infection in humans
consuelo m. de moraes et al. 2018
http://dx.doi.org/10.1073/pnas.1801512115
gray whales strand more often on days with increased levels of atmospheric radio-frequency noise
granger et al. 2020
http://dx.doi.org/10.1016/j.cub.2020.01.028
champions of illusion: the science behind mind-boggling images and mystifying brain puzzles
susana martinez-conde, stephen macknik 2017
reading people: how seeing the world through the lens of personality changes everything
anne bogel 2017
seeing what others don’t: the remarkable ways we gain insights
gary klein 2013
how to be more interesting
edward de bono 2010
yes! 50 secrets from the science of persuasion
noah goldstein, steve martin, robert cialdini 2007
blindspot: hidden biases of good people
mahzarin banaji & anthony greenwald 2013
race on the brain: what implicit bias gets wrong about the struggle for racial justice
jonathan kahn 2017
is science racist?
jonathan marks 2017
less than human: why we demean, enslave, and exterminate others
david livingstone smith 2012
coming to our senses: perceiving complexity to avoid catastrophes
viki mccabe 2014
war of the worldviews: where science and spirituality meet and do not
deepak chopra, leonard mlodinow 2012
the smell of fresh rain: the unexpected pleasures of our most elusive sense
barney shaw 2017
simplexity: why simple things become complex (and how complex things can be made simple)
jeffrey kluger 2008
our senses: an immersive experience
rob desalle 2018
the power of intuition: how to use your gut feelings to make better decisions at work
gary klein 2004
the mind is flat: the illusion of mental depth and the improvised mind
nick chater 2018
the forgetting machine: memory, perception, and the “jennifer aniston neuron”
rodrigo quian quiroga 2017
the intuitive way: the definitive guide to increasing your awareness
penney peirce 1997
the confidence game: the psychology of the con and why we fall for it every time
maria konnikova 2017
joyful: the surprising power of ordinary things to create extraordinary happiness
ingrid fetell lee 2018
the perils of perception: why we’re wrong about nearly everything
bobby duffy 2018
the case against reality: why evolution hid the truth from our eyes
donald d. hoffman 2019
experiencing the impossible: the science of magic
gustav kuhn 2019
how to make the world add up
tim harford 2020 unread
world too small
when someone's world becomes too small, that is when problems start. They start to attribute even accidental consequences to alleged malicious behaviour by the people around them.
our perspective sometimes narrows, and it is then that we are more likely to abuse.
when our perspective narrows, it becomes harder to break out of our current outlook, to learn of the changing shape of reality, to look beyond what we already believe.
see–feel–value
we literally cannot see or feel some things (like risk or climate change), so most people cannot value them
if we cannot “see” or perceive something, then we have little hope of understanding or changing it.