noise-induced schooling of fish
jitesh jhawar et al. 2020
doi.org/10.1038/s41567-020-0787-y
on the behavioural dynamics that govern alignment, or collective motion, of cichlid fish, offering new insights into the dynamics of schooling and, potentially, the coordinated behaviour of other animals.
“In the fish that we have studied, schooling turns out to be noise-induced. It’s not what anyone traditionally thought it was,” says Dr Richard Morris from UNSW Science, co-leader of the study and EMBL Australia group leader in UNSW’s Single Molecule Science.
“Noise, in this setting, is simply the randomness arising from interactions between individual fish.”
In the study, the researchers present the first experimental evidence of noise-induced ordering, which previously only existed as a theoretical possibility. The interdisciplinary team of ecologists, physicists and mathematicians achieved this by combining the capabilities of their scientific disciplines to integrate experiments with computer simulations and analysis.
“Everyone’s been aware of noise-induced phenomena, theoretically, but it’s quite rare to find in practice. You can only observe it when the individuals in a study can actually make decisions. For example, you wouldn’t find this type of noise-induced behaviour studying electrons or particles,” says Dr Morris.
The newly proposed model contradicts the standard ‘moving average’ theories of schooling and herding behaviour, which assume that the animals are capable of estimating the overall direction of the group.
“Every fish only interacts with one other fish at any given time. They either spontaneously change direction, or copy the direction of a different fish. Calculating an average direction of the group — which was the popular theory until now — is likely too complicated for a fish to compute,” explains Dr Morris.
To study the behavioural dynamics, the researchers filmed schools of 15, 30 and 60 cichlid fish, tracking their trajectories to analyse the mechanism behind mutual alignment, or schooling.
“Smaller groups of fish schooled more coherently than large groups. This is counterintuitive, since the randomness, or noise, from individual interactions plays a bigger role in smaller groups than larger ones,” Dr Morris says.
When researchers interpret data, noise is usually an unrelated factor that obscures and distracts from the information, like glare from the sun that you would try to eliminate to get a clearer photo.
In this case, Dr Morris explains that the random copying between pairs of fish gives rise to a different class of noise, and is actually what drives their highly coordinated behaviour. This new insight highlights the importance of noise, showing that noise may encode some important information about behavioural dynamics of fish and other animals.
“Here the signal is the noise. If you ignored the fluctuations completely, you couldn’t explain schooling at all.”
Beyond fish behaviour, the discovery has the power to reshape the understanding of collective motion in animals, and calls for a revision of how noise is treated in studies of behaviour dynamics.
abstract We report on the dynamics of collective alignment in groups of the cichlid fish Etroplus suratensis. Focusing on small- to intermediate-sized groups (10 ≲ N ≲ 100), we demonstrate that schooling (highly polarized and coherent motion) is noise induced, arising from the intrinsic stochasticity associated with finite numbers of interacting fish. The fewer the fish, the greater the (multiplicative) noise and therefore the greater the likelihood of alignment. Such rare empirical evidence tightly constrains the possible underlying interactions that govern fish alignment, suggesting that E. suratensis either spontaneously change their direction or copy the direction of another fish, without any local averaging (the otherwise canonical mechanism of collective alignment). Our study therefore highlights the importance of stochasticity in behavioural inference. Furthermore, rather than simply obscuring otherwise deterministic dynamics, noise can be fundamental to the characterization of emergent collective behaviours.
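a minimal sketch (my own, not the authors' code) of the pairwise rule the abstract describes: each fish either spontaneously picks a new direction or copies one randomly chosen fish, with no local averaging. rates and function names here are illustrative assumptions; the point is that smaller groups end up more polarized because finite-size noise is stronger.

```python
import numpy as np

def simulate_school(n_fish, n_steps=100_000, p_spontaneous=0.01, seed=1):
    """Pairwise copy / spontaneous-turn model; returns the mean polarization."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0, 2 * np.pi, n_fish)       # each fish's heading angle
    polarization = np.empty(n_steps)
    for t in range(n_steps):
        i = rng.integers(n_fish)                    # one fish updates per step
        if rng.random() < p_spontaneous:
            theta[i] = rng.uniform(0, 2 * np.pi)    # spontaneous direction change
        else:
            theta[i] = theta[rng.integers(n_fish)]  # copy one other fish, no averaging
        # polarization = length of the mean heading vector (1 means perfectly aligned)
        polarization[t] = np.abs(np.exp(1j * theta).mean())
    return polarization[n_steps // 2:].mean()       # discard the initial transient

for n in (15, 30, 60):
    print(n, round(simulate_school(n), 3))
# smaller groups typically come out more polarized: finite-size noise is stronger
```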
scattershot approach to knowledge: all it takes is one thing to be very right, and your investment in the many things that have not come true yet has already paid off
the 1/n rule of investment is to invest equally across all opportunities. this works because losses, which will occur, are spread across many areas, while we still give a chance to the rare opportunity with a very high return. this is the same whether for financial investments or for evolutionary investments.
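a toy sketch of the 1/n idea under assumed numbers (most opportunities pay nothing, a rare few pay fifty times the stake); the equal-weight spread nearly always catches a rare big win, while a single concentrated bet usually catches nothing.

```python
import numpy as np

rng = np.random.default_rng(0)
n_opportunities = 100
budget = 1.0

# assumed payoff model: 95% of opportunities return nothing, 5% return 50x the stake
payoffs = np.where(rng.random(n_opportunities) < 0.05, 50.0, 0.0)

equal_stake = budget / n_opportunities                        # the 1/n rule
portfolio_return = (equal_stake * payoffs).sum()              # spread the budget evenly
single_bet = budget * payoffs[rng.integers(n_opportunities)]  # all eggs in one basket

print(f"1/n portfolio pays {portfolio_return:.2f}; one concentrated bet pays {single_bet:.2f}")
# the equal-weight spread almost always catches at least one rare 50x win,
# while the single bet is usually worth nothing
```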
so what did i learn? well i made sure not to put all my eggs in one basket, and also not to get my hopes too high. i made sure to approach the whole thing as a learning experience and to appreciate the tools and processes that i created in the process of applying. i made sure to get on with my life in the meantime and not to put too much store by the outcome
it is a mistake to put our hopes on someone being perfect. people are often far from what they would like to be, and to place the burden of our expectation solely on one person is to invite disappointment. this does not mean we don’t love them, just that it is wise to keep eggs in more than one basket. if there is something you know a person is not able to do, do not hope that they can suddenly do it when you need them to. make alternative plans so things can still work out.
the more I learn, the more I understand that “there but for the grace of god go I” — even though I have been abused, as have many others, I was “lucky” enough that I was moved away from most influences by parental job postings (expat) and that most influences could not follow me as a child (pre–internet) — a scattershot childhood allowed bad influences to wane and good influences to have a chance
a scattershot approach is also needed in science, where the scarcity of publications, and the concentrated investment of time and effort demanded by the current model of scientific publication, create incentives for rose–tinted observations and an unwillingness to move on
selected writings of hermann von helmholtz
hermann von helmholtz; russell kahl 1971
9780819540393 unread
The Facts of Perception (1878) from Selected Writings of Hermann Helmholtz
(helmholtz on kant) here relevant for treatment of “scattershot” mode of vision
marxists.org/reference/subject/philosophy/works/ge/helmholt.htm
scribd.com/document/272211224/Helmholtz-the-Facts-of-Perception-1878
Among the traces which frequently repeated perceptions leave behind in the memory, the ones conforming to law and repeated with the greatest regularity are strengthened, while those which vary accidentally are obliterated. In a receptive, attentive observer, intuitive images of the characteristic aspects of the things that interest him come to exist; afterward he knows no more about how these images arose than a child knows about the examples from which he learned the meanings of words.
we are concerned here with the elementary processes which are the real basis of all thought, even though they lack the critical certainty and refinement to be found in the scientific formation of concepts and in the individual steps of scientific inferences.
The memory traces of previous experience play an even more extensive and influential role in our visual observations. An observer who is not completely inexperienced receives without moving his eyes (this condition can be realised experimentally by using the momentary illumination of an electric discharge or by carefully and deliberately staring) images of the objects in front of him which are quite rich in content. We can easily confirm with our own eyes, however, that these images are much richer and especially much more precise if the gaze is allowed to move about the field of vision, in this way making use of the kind of spatial observations which I have previously described as the most fundamental. Indeed, we are so used to letting our eyes wander over the objects we are looking at that considerable practice is required before we succeed in making them — for purposes of research in physiological optics — fix on a point without wandering.
It is always well to keep this in mind in order not to infer from the facts more than can rightly be inferred from them. The various idealistic and realistic interpretations are metaphysical hypotheses which, as long as they are recognised as such, are scientifically completely justified. They may become dangerous, however, if they are presented as dogmas or as alleged necessities of thought. Science must consider thoroughly all admissible hypotheses in order to obtain a complete picture of all possible modes of explanation. Furthermore, hypotheses are necessary to someone doing research, for one cannot always wait until a reliable scientific conclusion has been reached; one must sometimes make judgments according to either probability or aesthetic or moral feelings. Metaphysical hypotheses are not to be objected to here either. A thinker is unworthy of science, however, if he forgets the hypothetical origin of his assertions. The arrogance and vehemence with which such hidden hypotheses are sometimes defended are usually the result of a lack of confidence which their advocates feel in the hidden depths of their minds about the qualifications of their claims.
I have frequently noted in my previous works the agreement between the more recent physiology of the senses and Kant’s teachings. I have not meant, of course, that I would swear in verba magistri to all his more minor points. I believe that the most fundamental advance of recent times must be judged to be the analysis of the concept of intuition into the elementary processes of thought. Kant failed to carry out this analysis or resolution; this is one reason why he considered the axioms of geometry to be transcendental propositions. It has been the physiological investigations of sense perception which have led us to recognise the most basic or elementary kinds of judgment, to inferences which are not expressible in words. These judgments or inferences will, of course, remain unknown and inaccessible to philosophers as long as they inquire only into knowledge expressed in language.
Some philosophers who retain an inclination toward metaphysical speculation consider what we have treated as a defect in Kant’s system, resulting from the lack of progress of the special sciences in his time, to be the most fundamental part of his philosophy. Indeed, Kant’s proof of the possibility of metaphysics, the alleged science he did nothing further to develop, rests completely upon the belief that the axioms of geometry and the related principles of mechanics are transcendental propositions, given a priori. As a matter of fact, however, Kant’s entire system really conflicts with the possibility of metaphysics, and the more obscure points in his theory of knowledge, over which so much has been argued, stem from this conflict.
Be that as it may, the natural sciences have a secure, well–established foundation from which they can search for the laws of reality, a wonderfully rich and fertile field of endeavour. As long as they restrict themselves to this search, they need not be troubled with any idealistic doubts. Such work will, of course, always seem modest to some people when compared to the high–flown designs of the metaphysicians.
book review
russell kahl. (ed., with intro.) selected writings of hermann von helmholtz. middletown, connecticut: wesleyan university press, 1971. pp. xiv + 542. $25.00
authors
j. a. cardno
doi.org/10.1002/1520-6696(197404)10:2%3C270::AID-JHBS2300100218%3E3.0.CO;2-O
the black swan: the impact of the highly improbable
nassim nicholas taleb 2007
on the influence of rare, important, unpredictable events
ensemble vs time probability
medium.com/incerto/the-logic-of-risk-taking-107bf41029d3
skin in the game: hidden asymmetries in daily life
nassim nicholas taleb 2018 9780425284636
fooled by randomness: the hidden role of chance in life and in the markets
nassim nicholas taleb 2001
statistics describes the shape of luck
same average, less variance
prosecutors’ fallacy
en.m.wikipedia.org/wiki/Prosecutor’s_fallacy
the drunkard’s walk: how randomness rules our lives
leonard mlodinow 2008 978-0-307-37754-8
three doors problem
i would say this affects the feminist bank teller problem — by giving details of the other prosocial tendencies of linda, we have opened mlodinow’s 100 doors — we have intervened in the game and altered its probabilities
it shows that if two companies compete head-to-head or two employees within a company compete, though there may be a winner and a loser each quarter or each year, to get a reliable answer regarding which company or which employee is superior by simply tallying who beats whom, you’d have to make the comparison over decades or centuries. If, for instance, employee A is truly superior and would in the long run win a performance comparison with employee B on 60 out of 100 occasions, in a simple best-of-5 series of comparisons the weaker employee will still win almost one-third of the time. It is dangerous to judge ability by short-term results.
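a quick check of mlodinow's numbers: if the stronger employee wins any single comparison with probability 0.6, the weaker one still takes a best-of-5 series almost a third of the time.

```python
from math import comb

p_weaker = 0.4  # the weaker employee wins any single comparison with probability 0.4
# probability the weaker employee wins a best-of-5 (i.e. at least 3 of 5 comparisons)
p_series = sum(comb(5, k) * p_weaker**k * (1 - p_weaker)**(5 - k) for k in range(3, 6))
print(f"{p_series:.3f}")  # about 0.317 -- almost one-third of the time
```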
as Bernoulli put it, “One should not appraise human action on the basis of its results.”
are life expectancies skewed in a hidden way? should we be including all lives since conception rather than since birth? in a similar manner, do processes that seem to follow a normal distribution simply appear to do so because we do not have enough data over a long enough timescale? it would follow that heights, like life expectancies, would be skewed in this way because lives that end before birth are omitted.
“Galton soon realized that processes that did not exhibit regression toward the mean would eventually go out of control.”
“It is only when pure luck occasionally leads to a lopsided preponderance of hits from some particular direction—the molecular analogue of Roger Maris’s record year in baseball—that a noticeable jiggle occurs. When Einstein did the math, he found that despite the chaos on the microscopic level, there was a predictable relationship between factors such as the size, number, and speed of the molecules and the observable frequency and magnitude of the jiggling. Einstein had, for the first time, connected new and measurable consequences to statistical physics. That might sound like a largely technical achievement, but on the contrary, it represented the triumph of a great principle: that much of the order we perceive in nature belies an invisible underlying disorder and hence can be understood only through the rules of randomness. As Einstein wrote, “It is a magnificent feeling to recognize the unity of a complex of phenomena which appear to be things quite apart from the direct visible truth.””
“perception, Faraday recognized, is not a direct consequence of reality but rather an act of imagination.”
“We also use our imagination and take shortcuts to fill gaps in patterns of nonvisual data. As with visual input, we draw conclusions and make judgments based on uncertain and incomplete information, and we conclude, when we are done analyzing the patterns, that our “picture” is clear and accurate. But is it?”
“When we look closely, we find that many of the assumptions of modern society are based, as table moving is, on shared illusions.”
“So the relevant question is, if thousands of people are tossing coins once a year and have been doing so for decades, what are the chances that one of them, for some fifteen-year period, will toss all heads? That probability is far, far higher than the odds of simply tossing fifteen heads in a row.”
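rough illustrative numbers (mine, not the book's) for the same point: one particular person's chance of fifteen heads in a row is tiny, but the chance that at least one of many independent tossers manages it is far larger, even before counting all the possible fifteen-year windows.

```python
p_single = 0.5 ** 15                         # one person, one specific run of 15 tosses
n_people = 1_000                             # assumed number of independent coin-tossers
p_at_least_one = 1 - (1 - p_single) ** n_people
print(f"one person: {p_single:.6f}; at least one of {n_people}: {p_at_least_one:.3f}")
# roughly 0.00003 for a single person, but about 0.03 that somebody manages it --
# and counting every possible 15-year window over decades raises the group odds further
```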
“in all aspects of our lives we encounter streaks and other peculiar patterns of success and failure. Sometimes success predominates, sometimes failure. Either way it is important in our own lives to take the long view and understand that streaks and other patterns that don’t appear random can indeed happen by pure chance. It is also important, when assessing others, to recognize that among a large group of people it would be very odd if one of them didn’t experience a long streak of successes or failures.”
“as Langer wrote, “While people may pay lip service to the concept of chance, they behave as though chance events are subject to control.”
In real life the role of randomness is far less obvious than it was in Langer’s experiments, and we are much more invested in the outcomes and our ability to influence them. And so in real life it is even more difficult to resist the illusion of control.”
“The human brain has evolved to be very efficient at pattern recognition, but as the confirmation bias shows, we are focused on finding and confirming patterns rather than minimizing our false conclusions. Yet we needn’t be pessimists, for it is possible to overcome our prejudices. It is a start simply to realize that chance events, too, produce patterns. It is another great step if we learn to question our perceptions and our theories. Finally, we should learn to spend as much time looking for evidence that we are wrong as we spend searching for reasons we are correct.”
“as the Nobel laureate Max Born wrote, “Chance is a more fundamental conception than causality.”
Max Born, Natural Philosophy of Cause and Chance (Oxford: Clarendon Press, 1948), p. 47. Born was referring to nature in general and quantum theory in particular.”
“In the scientific study of random processes the drunkard’s walk is the archetype. In our lives it also provides an apt model, for like the granules of pollen floating in the Brownian fluid, we’re continually nudged in this direction and then that one by random events. As a result, although statistical regularities can be found in social data, the future of particular individuals is impossible to predict, and for our particular achievements, our jobs, our friends, our finances, we all owe more to chance than many people realize. On the following pages, I shall argue, furthermore, that in all except the simplest real-life endeavors unforeseeable or unpredictable forces cannot be avoided, and moreover those random forces and our reactions to them account for much of what constitutes our particular path in life.”
“In any complex string of events in which each event unfolds with some element of uncertainty, there is a fundamental asymmetry between past and future. This asymmetry has been the subject of scientific study ever since Boltzmann made his statistical analysis of the molecular processes responsible for the properties of fluids (see chapter 8). Imagine, for example, a dye molecule floating in a glass of water. The molecule will, like one of Brown’s granules, follow a drunkard’s walk. But even that aimless movement makes progress in some direction. If you wait three hours, for example, the molecule will typically have traveled about an inch from where it started. Suppose that at some point the molecule moves to a position of significance and so finally attracts our attention. As many did after Pearl Harbor, we might look for the reason why that unexpected event occurred. Now suppose we dig into the molecule’s past. Suppose, in fact, we trace the record of all its collisions. We will indeed discover how first this bump from a water molecule and then that one propelled the dye molecule on its zigzag path from there to here. In hindsight, in other words, we can clearly explain why the past of the dye molecule developed as it did. But the water contains many other water molecules that could have been the ones that interacted with the dye molecule. To predict the dye molecule’s path beforehand would have therefore required us to calculate the paths and mutual interactions of all those potentially important water molecules. That would have involved an almost unimaginable number of mathematical calculations, far greater in scope and difficulty than the list of collisions needed to understand the past. In other words, the movement of the dye molecule was virtually impossible to predict before the fact even though it was relatively easy to understand afterward.”
“That fundamental asymmetry is why in day-to-day life the past often seems obvious even when we could not have predicted it.”
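a minimal random-walk sketch of the diffusive motion both quotes rely on: the typical (root-mean-square) displacement grows only like the square root of the number of steps, so aimless collisions still carry the molecule a measurable distance. step sizes and counts here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n_walkers, n_steps = 500, 10_000
steps = rng.choice([-1.0, 1.0], size=(n_walkers, n_steps))  # unit steps left or right
positions = steps.cumsum(axis=1)                            # each walker's drunkard's walk

for t in (100, 1_000, 10_000):
    rms = np.sqrt((positions[:, t - 1] ** 2).mean())
    print(f"after {t:>6} steps: rms displacement ~ {rms:6.1f} (sqrt(t) = {t**0.5:.1f})")
```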
“normal accident theory, Perrow’s doctrine describes how that happens—how accidents can occur without clear causes, without those glaring errors and incompetent villains sought by corporate or government commissions. But although normal accident theory is a theory of why, inevitably, things sometimes go wrong, it could also be flipped around to explain why, inevitably, they sometimes go right. For in a complex undertaking, no matter how many times we fail, if we keep trying, there is often a good chance we will eventually succeed.”
“We cannot see a person’s potential, only his or her results, so we often misjudge people by thinking that the results must reflect the person. The normal accident theory of life shows not that the connection between actions and rewards is random but that random influences are as important as our qualities and actions.”
“Does even unearned “success” instill a feeling of superiority? To find out, pairs of volunteers were asked to cooperate on various pointless tasks. In one task, for instance, a black-and-white image was briefly displayed and the subjects had to decide whether the top or the bottom of the image contained a greater proportion of white. Before each task began, one of the subjects was randomly chosen to receive considerably more pay for participating than the other. When that information was not made available, the subjects cooperated pretty harmoniously. But when they knew how much they each were getting paid, the higher-paid subjects exhibited more resistance to input from their partners than the lower-paid ones. Even random differences in pay lead to the backward inference of differences in skill and hence to the development of unequal influence. It’s an element of personal and office dynamics that cannot be ignored.”
“We unfortunately seem to be unconsciously biased against those in society who come out on the bottom.”
“We miss the effects of randomness in life because when we assess the world, we tend to see what we expect to see. We in effect define degree of talent by degree of success and then reinforce our feelings of causality by noting the correlation. That’s why although there is sometimes little difference in ability between a wildly successful person and one who is not as successful, there is usually a big difference in how they are viewed.”
“since chance does play a role, one important factor in success is under our control: the number of at bats, the number of chances taken, the number of opportunities seized. For even a coin weighted toward failure will sometimes land on success. Or as the IBM pioneer Thomas Watson said, “If you want to succeed, double your failure rate.”
“the true power of the theory of random processes, however, lies in the fact that once we understand the nature of random processes, we can alter the way we perceive the events that happen around us.”
“We judge people and initiatives by their results, and we expect events to happen for good, understandable reasons. But our clear visions of inevitability are often only illusions. I wrote this book in the belief that we can reorganize our thinking in the face of uncertainty. We can improve our skill at decision making and tame some of the biases that lead us to make poor judgments and poor choices. We can seek to understand people’s qualities or the qualities of a situation quite apart from the results they attain, and we can learn to judge decisions by the spectrum of potential outcomes they might have produced rather than by the particular result that actually occurred.”
thinking in bets: making smarter decisions when you don’t have all the facts
annie duke 2018 9780735216365
on the relationship between personal experience, affect and risk perception: the case of climate change
sander van der linden 2014
doi.org/10.1002/ejsp.2008
equivalence of wave–particle duality to entropic uncertainty
patrick j. coles et al. 2014
doi.org/10.1038/ncomms6814
counterfactual reasoning underlies the learning of priors in decision making
ariel zylberberg et al. 2018
doi.org/10.1016/j.neuron.2018.07.035
People can learn base rates without feedback and apply them to make better decisions
The estimate of base rate is updated based on the confidence in each decision
The form of confidence used is counterfactual, as if the base rate were uninformative
The study extends the Bayesian framework from choice to prior probability estimation
Accurate decisions require knowledge of prior probabilities (e.g., prevalence or base rate), but it is unclear how prior probabilities are learned in the absence of a teacher. We hypothesized that humans could learn base rates from experience making decisions, even without feedback. Participants made difficult decisions about the direction of dynamic random dot motion. Across blocks of 15–42 trials, the base rate favoring left or right varied. Participants were not informed of the base rate or choice accuracy, yet they gradually biased their choices and thereby increased accuracy and confidence in their decisions. They achieved this by updating knowledge of base rate after each decision, using a counterfactual representation of confidence that simulates a neutral prior. The strategy is consistent with Bayesian updating of belief and suggests that humans represent both true confidence, which incorporates the evolving belief of the prior, and counterfactual confidence, which discounts the prior.
we need vaccines, but what is the cost?
buzzfeed.com/shaunlintern/these-nhs-staff-were-told-the-swine-flu-vaccine-was-safe
cd8 t cells from patients with narcolepsy and healthy controls recognize hypocretin neuron-specific antigens
natasja wulff pedersen et al. 2019
doi.org/10.1038/s41467-019-08774-1
'We have found autoreactive cytotoxic CD8 T cells in the blood of narcolepsy patients. That is, the cells recognise the neurons that produce hypocretin which regulates a person's waking state. It does not prove that they are the ones that killed the neurons, but it is an important step forward. Now we know what the cells are after,' says Associate Professor Birgitte Rahbek Kornum from the Department of Neuroscience.
The immune system is designed to recognise viruses and bacteria. When its cells are autoreactive -- which is the case in autoimmune diseases -- the immune system recognises the body's own cells and attacks them. That they are cytotoxic means that they are capable of killing other cells. In most narcolepsy patients, the neurons that produce hypocretin and thus regulate our waking state have been destroyed.
'To kill other cells, e.g. neurons producing hypocretin, CD4 and CD8 T cells usually have to work together. In 2018, scientists discovered autoreactive CD4 T cells in narcolepsy patients. This was really the first proof that narcolepsy is in fact an autoimmune disease. Now we have provided more, important proof: that CD8 T cells are autoreactive too,' says Birgitte Rahbek Kornum.
Autoreactive Cells Were Also Found in Healthy Individuals
In the study, the researchers studied and analysed blood samples from 20 persons with narcolepsy. In addition, they analysed blood samples from a control group of 52 healthy persons. In nearly all 20 narcolepsy patients the researchers found autoreactive CD8 T cells. But autoreactivity was not only found in persons suffering from the sleep disorder. The researchers also discovered autoreactive cells in a lot of the healthy individuals.
'We also found autoreactive cells in some of the healthy individuals, but here the cells probably have not been activated. It is something we see more and more often with autoimmunity -- that it lies dormant in all of us, but is not activated in everyone. The next big puzzle is learning what activates them', says Birgitte Rahbek Kornum.
According to Birgitte Rahbek Kornum, the discovery of autoreactive cells in healthy individuals also stresses the theory that something has to trigger narcolepsy and activate autoreactivity. Scientists still do not know what causes the disease. They expect a combination of genetics, autoreactive cells and a form of trigger to bring about the disease, e.g. a virus infection. The disease can be treated medically today, but the new research results may pave the way for even better treatments.
'Now there will probably be more focus on trying to treat narcolepsy with drugs allaying the immune system. This has already been attempted, though, because the hypothesis that it is an autoimmune disease has existed for many years. But now that we know that it is T cell-driven, we can begin to target and make immune treatments even more effective and precise,' says Birgitte Rahbek Kornum.
There are two types of narcolepsy. People suffering from type 1, which is the most common form, lack the transmitter substance hypocretin which regulates the waking state, and they suffer from cataplexy which is brief loss of muscle control.
Persons with type 2 do not lack hypocretin and do not suffer from cataplexy. Still, they experience the same symptoms as type 1 patients. In this study, the researchers focussed on type 1.
abstract Narcolepsy Type 1 (NT1) is a neurological sleep disorder, characterized by the loss of hypocretin/orexin signaling in the brain. Genetic, epidemiological and experimental data support the hypothesis that NT1 is a T-cell-mediated autoimmune disease targeting the hypocretin producing neurons. While autoreactive CD4+ T cells have been detected in patients, CD8+ T cells have only been examined to a minor extent. Here we detect CD8+ T cells specific toward narcolepsy-relevant peptides presented primarily by NT1-associated HLA types in the blood of 20 patients with NT1 as well as in 52 healthy controls, using peptide-MHC-I multimers labeled with DNA barcodes. In healthy controls carrying the disease-predisposing HLA-DQB1*06:02 allele, the frequency of autoreactive CD8+ T cells was lower as compared with both NT1 patients and HLA-DQB1*06:02-negative healthy individuals. These findings suggest that a certain level of CD8+ T-cell reactivity combined with HLA-DQB1*06:02 expression is important for NT1 development.
detecting species at low densities: a new theoretical framework and an empirical test on an invasive zooplankton
jake r. walsh et al. 2018
doi.org/10.1002/ecs2.2475
“Our original idea was (to ask): ‘How is this possible? In what scenario would we miss spiny water flea for 10 years, even after so much effort?’” says Jake Walsh, lead author of the study and a postdoctoral researcher at the UW-Madison Center for Limnology.
The answer is that completely missing a species “is not only possible, it’s likely,” says Walsh, noting that the study can help inform invasive species ecology and is a “way of using math and computer modeling to fill in the blanks of what we see.”
With Center for Limnology director Jake Vander Zanden and Eric Pedersen, a colleague from Fisheries and Oceans Canada, Walsh developed a theory of the probability of detecting a species as its population densities change.
Their modeling shows that when species are in low abundance in a given habitat, the ability for scientists to detect them drops off precipitously.
This may explain why spiny water fleas passed undetected in Lake Mendota for a decade. Early on, researchers would have needed to dip their nets into the lake “hundreds or even thousands” of times, Walsh says. Once the invaders became more abundant, detection became much easier: “You can go out sampling three times and likely detect spiny water fleas.”
Part of the problem is size. Even if there was one spiny water flea for every cubic meter of water in Lake Mendota, catching one in a net would be like finding a sesame seed in roughly 250 gallons of water.
One of the solutions, the study shows, may be for scientists to increase the size of the funnel-shaped plankton nets they drag through the water when looking for the small creature. Standard nets are roughly a foot in diameter, but by upgrading to a one-meter-wide net (about three feet in diameter), “your detection increases by quite a bit,” Walsh says.
Being more deliberate about sampling for spiny water fleas and other invasives at the right places and times may also improve scientists’ chance for detection, Walsh says. Spiny water fleas are a type of zooplankton (small, free-floating crustaceans) that travel in groups called swarms and are pushed around by wind and currents. A swarm may move at any time out of any given sampling site. And their abundances vary throughout the year. For instance, spiny water fleas are present in the greatest numbers in Lake Mendota in the fall.
“If you were to double your effort at sampling for spinies in the fall,” Walsh says, “you get the same advantage as if you were to double your effort across the entire year.”
The study offers some “basic rules of thumb” for designing species surveillance programs of any kind — from likely invasives to rare or endangered natives, Walsh says.
“It has to do with targeting our efforts better and finding times of the year where things are more abundant or areas where they’re more abundant because that dramatically increases your detection rate,” he says. “If you take a little extra time to get to know the species you’re looking for, it can really pay off.”
abstract Species often occur at low abundance and, as a result, may evade detection during sampling. Here, we develop a general theoretical model of how the probability of detecting a species in a discrete sample varies with its mean density in that location based on sampling statistics. We show that detection probability at low densities can be approximated with a simple one parameter saturating function: The probability of detection (P) should increase with the mean number of individuals in a discrete sample (N) as P ~ αN/(1 + αN). We further show how the parameter α will be affected by species spatial aggregation, within‐sample observability, and gear catchability. We use the case of the invasive zooplankter, spiny water flea (Bythotrephes longimanus), to demonstrate that this theoretical model fits the empirical pattern of detectability. In Lake Mendota, WI, Bythotrephes went undetected despite rigorous long‐term zooplankton monitoring (possibly as long as 14 yr with 15 sampling dates/yr). Using our modeling framework, we found that the likelihood of detecting Bythotrephes was low unless peak annual densities exceeded 0.1 individuals/m3 over multiple years. Despite using models based on Bythotrephes ecology to identify ways to increase detectability, detection probabilities remained low at low densities, such that cases of missed detection such as Lake Mendota are possible despite rigorous monitoring effort. As such, our theoretical model provides a simple rule of thumb for estimating sampling effort required to detect rare species when densities are expected to be low: The minimum samples required (S) to reliably detect a species can be approximated as S ~ (1 + N)/N.
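a small sketch of the two rules of thumb quoted in the abstract, P ≈ αN/(1 + αN) and S ≈ (1 + N)/N; the value of α below is a placeholder, not a fitted value from the paper.

```python
def detection_probability(mean_per_sample, alpha=1.0):
    """Probability of detecting the species in one discrete sample: P ~ aN / (1 + aN)."""
    return alpha * mean_per_sample / (1 + alpha * mean_per_sample)

def samples_needed(mean_per_sample):
    """Rule-of-thumb minimum samples to reliably detect the species: S ~ (1 + N) / N."""
    return (1 + mean_per_sample) / mean_per_sample

for n in (0.01, 0.1, 1.0, 10.0):  # expected individuals per discrete sample
    print(f"N = {n:5}: P = {detection_probability(n):.3f}, S = {samples_needed(n):.1f}")
# at N = 0.01 you need on the order of a hundred samples; at N = 10 a couple suffice
```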
pluck or luck: does trait variation or chance drive variation in lifetime reproductive success?
robin e. snyder, stephen p. ellner 2018
doi.org/10.1086/696125
systematic unpredictability
on chaotic dynamics in transcription factors and the associated effects in differential gene regulation
mathias l. heltberg et al. 2019
doi.org/10.1038/s41467-018-07932-1
The researchers investigated how a particular protein produced within cells, NF-kB, stimulates genes. Among other things, this particular protein is vital for maintaining the body's immune defense system and thereby, the body's ability to combat disease. The concentration of NF-kB fluctuates over time, and these swings impact the genes and subsequently, the condition of cells.
The researchers demonstrated that chaotic swings in the concentration of the protein -- what in mathematics is known as chaotic dynamics -- can increase the activation of a number of genes that are otherwise not activated. In other words, when in a chaotic state, the NF-kB protein is most effective at activating genes and optimally "tuning" the immune system.
"The results can have a tremendous impact on our understanding of how the immune system functions and how the incidence of some of the most serious illnesses, including diabetes, cancer and Alzheimer's, might be avoided. For example, we know that cancer is related to a failure of signaling within the body. So, to avoid cancer, it is imperative to have the right dynamic at work in cells," says Mogens Høgh Jensen, a professor in biocomplexity at the University of Copenhagen's Niels Bohr Institute.
Improved knowledge can improve cancer treatment
The researchers point out that this new knowledge can be deployed in future therapies.
"These could come in the form of new medications that ensure proper protein function. Therapies could also involve the withdrawal and testing of cells from a body to gauge whether cells are in the right condition to have the correct swings. If they aren't, it may be possible to predict and discover illnesses before they occur," explains Mathias Heltberg, a PhD student in Biocomplexity.
The research results are among the first to prove that chaos can be an important aspect of the mechanisms that steer the enormous complexity characteristic of all living things. Even the researchers were surprised by their discovery, as chaotic dynamics is often seen as something that living organisms seek to avoid. The new knowledge opens up an entirely new understanding of how genes can be regulated through varied swings in the proteins that control them.
"Chaos is a mathematically well-defined dynamic, one that, for example, has previously been used to explain great changes in weather systems. With the enormous complexity that characterizes higher order living things, it is evident that chaotic dynamics will occur in different types of systems. But how chaos plays a decisive role in living cells is entirely new," concludes Mogens Høgh Jensen.
Based upon a range of experimental results, the researchers arrived at their conclusions through mathematical calculations and theoretical arguments
abstract The control of proteins by a transcription factor with periodically varying concentration exhibits intriguing dynamical behaviour. Even though it is accepted that transcription factors vary their dynamics in response to different situations, insight into how this affects downstream genes is lacking. Here, we investigate how oscillations and chaotic dynamics in the transcription factor NF-κB can affect downstream protein production. We describe how it is possible to control the effective dynamics of the transcription factor by stimulating it with an oscillating ligand. We find that chaotic dynamics modulates gene expression and up-regulates certain families of low-affinity genes, even in the presence of extrinsic and intrinsic noise. Furthermore, this leads to an increase in the production of protein complexes and the efficiency of their assembly. Finally, we show how chaotic dynamics creates a heterogeneous population of cell states, and describe how this can be beneficial in multi-toxic environments.
protean behaviour: the biology of unpredictability
p. m. driver and d. a. humphries 1988 unread
protean primates: the evolution of adaptive unpredictability in competition and courtship
geoffrey f. miller 1997
irresponsible captain tylor
無責任艦長タイラー
1993
why can only 24% solve bayesian reasoning problems in natural frequencies: frequency phobia in spite of probability blindness
patrick weber et al. 2018
doi.org/10.3389/fpsyg.2018.01833
transcriptome-wide noise controls lineage choice in mammalian progenitor cells
hannah h. chang et al. 2008
doi.org/10.1038/nature06965
selective survival of embryos can explain dna methylation signatures of adverse prenatal environments
elmar w. tobi et al. 2018
doi.org/10.1016/j.celrep.2018.11.023
The new research was motivated by the observation that people conceived during the Dutch Hunger Winter of 1944-1945 suffer from reduced cardiovascular health in their sixties. This can be attributed to persistent changes in how genes are expressed, through so-called epigenetic modification of the DNA. "We know that a lack of nutrition decreases the likelihood of an embryo to survive. Our new study indicates that surviving famine in the uterus hinged on having a DNA methylation pattern allowing continued growth of the embryo in spite of limited resources. But those same methylation patterns may have adverse health effects much later in life," says Bas Heijmans, epigeneticist at the Leiden University Medical Center.
To understand the interplay between epigenetics and survival of the embryo, the researchers took inspiration from evolutionary biology. In evolution, random genetic variation is filtered by natural selection, resulting in accumulation of variants that best 'fit' the environment. A computer model showed that random epigenetic variation between embryos is inevitable, just like genetic mutation. Some of the random DNA methylation variants may enhance an embryo's chance to survive on low nutrition. As a consequence, those epigenetic variants will become more common in cohorts that were exposed to a famine as embryos. "We have always struggled to explain how early embryos would be able to modify specific epigenetic marks in response to nutrition. It is fascinating that selective survival based on random epigenetic variation fits the data best," says Tobias Uller, evolutionary biologist at Lund University.
Some health effects of the Dutch Famine only show later in life and those exposed during early gestation seem to be most affected. "These findings have often been interpreted as conclusive proof of fetal adaptations in the womb that will lead to adult disease if the adult environment changes for the better. But our findings point to a different mechanism," says L.H. Lumey, MD, epidemiologist at Columbia Mailman School and principal investigator of the Dutch Hunger Winter Families study.
•Variation in gene expression and DNA methylation is generated in early development
•Adverse prenatal environments may result in selection on this variation in utero
•Selection reduces the variance in DNA methylation at loci that affect survival
•Selection may help explain some of the health effects of prenatal adversity
An adverse intrauterine environment is associated with long-term physiological changes in offspring. These are believed to be mediated by epigenomic marks, including DNA methylation (DNAm). Changes in DNAm are often interpreted as damage or plastic responses of the embryo. Here, we propose that stochastic DNAm variation, generated during remodeling of the epigenome after fertilization, contributes to DNAm signatures of prenatal adversity through differential survival of embryos. Using a mathematical model of re-methylation in the early embryo, we demonstrate that selection, but not plasticity, will generate a characteristic reduction in DNAm variance at loci that contribute to survival. Such a reduction in DNAm variance was apparent in a human cohort prenatally exposed to the Dutch famine, illustrating that it is possible to detect a signature of selection on epigenomic variation. Selection should be considered as a possible mechanism linking prenatal adversity to subsequent health and may have implications when evaluating interventions.
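a toy illustration of the signature the abstract predicts: if embryos whose (random) methylation level at a locus lies nearer some survival-favoured value are more likely to survive, the variance of methylation among survivors shrinks at that locus. all numbers below are my own illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_embryos = 100_000
methylation = rng.normal(loc=0.5, scale=0.1, size=n_embryos)  # stochastic DNAm per embryo

# assumed survival rule under famine: values nearer 0.6 survive more often
favoured, width = 0.6, 0.05
survival_prob = np.exp(-((methylation - favoured) ** 2) / (2 * width ** 2))
survivors = methylation[rng.random(n_embryos) < survival_prob]

print(f"variance before selection: {methylation.var():.5f}")
print(f"variance among survivors:  {survivors.var():.5f}  (reduced)")
print(f"mean shifts toward the favoured value: {methylation.mean():.3f} -> {survivors.mean():.3f}")
```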
engineering vibrationally assisted energy transfer in a trapped-ion quantum simulator
d. j. gorman et al. 2019
doi.org/10.1103/physrevx.8.011038
When electrons move in a conducting material or when energy absorbed from sunlight is shifted between molecules during photosynthesis, the process can be described using the wave equations of quantum mechanics. Wave interference can lead to a suppression of the transport of charge or energy, but, counterintuitively, random fluctuations such as thermal noise can actually improve this transport. Researchers have now demonstrated both of these effects for energy transport in a chain of ten atoms, where the degree of disorder and noise in the system can be carefully controlled. Studying such effects in a clean, well-controlled experiment could lead to a better understanding of how they operate in larger, more complex situations such as in electronic and optical devices.
An electric current can be described as a wave, but so can other kinds of energy transport. For example, energy can be transferred between atoms or molecules as a series of absorption and emission events of photons, which can be represented as a wave of energy moving from one component to the next. If the waves remain in step, they are said to be coherent, in which case constructive interference (alignment of peaks and troughs) can amplify the flow of charge or energy. But disorder, such as impurities or defects in the lattice of atoms, may cause coherent destructive interference (peaks canceling out troughs) that averages out to suppress the flow entirely. This suppression, which can trap charge or halt the flow of energy in a specific region, is called Anderson localization.
Theoretical studies have predicted that noise in these quantum waves, such as that from thermal fluctuations, may have the surprising effect of breaking down Anderson localization and boosting transport. This boost occurs because noise destroys the coherence of the waves and so removes the possibility of coherent destructive interference. The effect is known as environment-assisted quantum transport (ENAQT), and it has been reported previously in systems ranging from optical fibers [1] to superconducting circuits [2]. It has also been suggested that ENAQT might operate in photosynthetic energy transfer [3], although this hypothesis remains unproven.
Most of the previous studies of ENAQT used just a few macroscopic and therefore hard-to-control components, or they contained too few components to be relevant for real materials [4]. To investigate the effect in a clean, well-defined, many-particle system, Christian Roos of the Institute for Quantum Optics and Quantum Information in Innsbruck, Austria, and coworkers reproduced it in ten calcium ions suspended in a row in free space by electric fields.
The trapped-ion array was essentially a one-dimensional crystal. The researchers could switch the ions between two electronic states, and, when injecting energy into the array using a laser pulse, they could monitor the wavelike passage of the energy along the array from ion to ion. To make the system somewhat disordered and induce Anderson localization, they could vary the transition energies of the ions along the chain by exciting each one with a beam of a different intensity. With ten separate beams, “we can create disorder in a very controlled way that can be switched on and off on short time scales,” says Roos.
In the team’s experiments, when the degree of disorder was high, they found that energy transport along the chain was lowered by Anderson localization, as expected. But when Roos and colleagues introduced noise into their array by rapidly changing the beam intensity exciting each ion, so that the transition frequencies “wobbled,” things changed. The transport efficiency increased with increasing noise level, consistent with ENAQT predictions.
However, if the noise was too great, the energy transport declined again, thanks to the so-called quantum Zeno effect: In the ion-array experiments, a high level of noise mimics the effect of repeated observation, which, according to quantum mechanics, can freeze a system and suppress any change of state.
Roos says that his team’s setup allows such quantum effects to be studied “in artificially engineered quantum systems where we have excellent control over the network connectivity, coupling strengths between sites, [and] local disorder.” ENAQT, or more generally, transport processes influenced by their environment, may occur in a wide range of systems and so are important to investigate, says Clemens Gneiting, a quantum physicist at RIKEN, near Tokyo. “That they can investigate it here with high precision and clean control of the system on the quantum level allows for clear comparison with theory.”
abstract Many important chemical and biochemical processes in the condensed phase are notoriously difficult to simulate numerically. Often, this difficulty arises from the complexity of simulating dynamics resulting from coupling to structured, mesoscopic baths, for which no separation of time scales exists and statistical treatments fail. A prime example of such a process is vibrationally assisted charge or energy transfer. A quantum simulator, capable of implementing a realistic model of the system of interest, could provide insight into these processes in regimes where numerical treatments fail. We take a first step towards modeling such transfer processes using an ion-trap quantum simulator. By implementing a minimal model, we observe vibrationally assisted energy transport between the electronic states of a donor and an acceptor ion augmented by coupling the donor ion to its vibration. We tune our simulator into several parameter regimes and, in particular, investigate the transfer dynamics in the nonperturbative regime often found in biochemical situations.
a belief in meritocracy is not only false: it’s morally wrong
clifton mark 2019
aeon.co/ideas/a-belief-in-meritocracy-is-not-only-false-its-morally-wrong
simpson’s paradox
wikipedia
apparent mismatch between result for smaller time samples and result over entire period
simpson’s paradox and causality
prasanta s. bandyopadhyay et al. 2015
academia.edu/11600200/Simpsons_Paradox_and_Causality
There are three questions associated with Simpson’s Paradox (SP): (i) Why is SP paradoxical? (ii) What conditions generate SP?, and (iii) What should be done about SP? By developing a logic-based account of SP, it is argued that (i) and (ii) must be divorced from (iii). This account shows that (i) and (ii) have nothing to do with causality, which plays a role only in addressing (iii). A counter-example is also presented against the causal account. Finally, the causal and logic-based approaches are compared by means of an experiment to show that SP is not basically causal.
“the collapsibility principle (CP) … says that relationships between variables that hold in the sub-populations (e.g., the rate of acceptance of females is higher than the rate of acceptance of males in both sub-populations) must hold in the overall population as well (i.e., the rate of acceptance of females must be higher than the rate of acceptance of males in the population).”
our mental shortcut is to apply cp when this is not warranted
but this is a common assumption in the social sciences — it is the major extrapolation from experimental populations to non–experimental populations
“Our resolution of the paradox has illuminated another aspect of human frailty. We explained its apparent paradoxical nature by invoking the failure of our widespread intuitions about numerical inference. The failure of collapsibility, which is non-causal, in Simpson’s paradox-type cases is what makes them puzzling.”
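a small worked example of CP failing (illustrative numbers, not from the paper): female applicants have the higher acceptance rate in each department yet the lower rate overall, because most of them apply to the harder department.

```python
admissions = {
    # department: (female accepted, female applied, male accepted, male applied)
    "easy department": (18, 20, 160, 200),  # female 90% vs male 80%
    "hard department": (51, 180, 6, 30),    # female ~28% vs male 20%
}

f_acc_all = f_app_all = m_acc_all = m_app_all = 0
for dept, (f_acc, f_app, m_acc, m_app) in admissions.items():
    print(f"{dept}: female {f_acc / f_app:.0%}, male {m_acc / m_app:.0%}")
    f_acc_all, f_app_all = f_acc_all + f_acc, f_app_all + f_app
    m_acc_all, m_app_all = m_acc_all + m_acc, m_app_all + m_app

print(f"overall: female {f_acc_all / f_app_all:.0%}, male {m_acc_all / m_app_all:.0%}")
# female applicants do better in both departments but worse overall: CP fails
```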
truths about simpson’s paradox: saving the paradox from falsity
prasanta s. bandyopadhyay, r. venkata raghavan, don wallace dcruz and gordon brittan jr. 2015
academia.edu/11600189/Truths_about_Simpson_s_Paradox_Saving_the_Paradox_from_Falsity
There are three questions associated with Simpson’s paradox (SP): (i) Why is SP paradoxical? (ii) What conditions generate SP? and (iii) How to proceed when confronted with SP? An adequate analysis of the paradox starts by distinguishing these three questions. Then, by developing a formal account of SP, and substantiating it with a counter-example to causal accounts, we argue that there are no causal factors at play in answering questions (i) and (ii). Causality enters only in connection with action.
A paradox is an (apparently) inconsistent set of sentences each of which seems to be true
Saving Truth from Paradox
Hartry Field discusses the philosophical significance of paradoxes. According to him, “[a]ny resolution of the paradoxes will involve giving up (or at least restricting) some very firmly held principles: ... [and] [t]he principles to be given up, are the ones to which the average person simply can’t conceive of alternatives. That’s why the paradoxes are paradoxes.” [4, p.17].
“… SP also highlights the significant role played by CP in generating the paradoxical result. Such principles are what Field would suggest we jettison to escape paradoxes.”
USA june statistics, simpson’s paradox?
mobile.twitter.com/mbeckett/status/1278750652160634880
probabilities change in unexpected fashions
snakebites and climate change in california, 1997–2017
caleb phillips et al. 2018
doi.org/10.1080/15563650.2018.1508690
for every 10 percent increase in rainfall over the previous 18 months, cases of snake bites spiked by 3.9 percent in California’s 58 counties.
“This study shows a possible unexpected, secondary result of climate change,” said Phillips, an adjunct assistant professor in CU Boulder’s Department of Computer Science. “We probably need to take climatological changes into account when we coordinate systems that may seem unrelated like planning how we distribute antivenin supplies or funding poison control centers.”
Phillips and his colleagues suspect that the reason for the surge in snake bites during wet years may come down to snake food. Mice and other rodents, the prime meals for rattlesnakes, flourish in rainy years — and that might give snakes a boost.
Phillips, an avid trail runner and trained wilderness first responder, urges outdoor enthusiasts like him to stay calm. “If you encounter a rattlesnake,” Phillips said, “don’t pick a fight with it, and it won’t pick a fight with you.”
abstract Patterns of precipitation and drought had a significant and predictive effect on snakebites in California over a 20-year period. Snakebite incidence decreased following drought, and increased after precipitation.
errors as a primary cause of late-life mortality deceleration and plateaus
saul justin newman et al. 2018
doi.org/10.1371/journal.pbio.2006776
As we age through adulthood, the probability of dying increases year after year. But studies in multiple species, including humans, have suggested that, at the far end of the lifespan, the rate of increase slows, or even plateaus. Biological explanations for such late-life mortality deceleration have been developed, but are controversial, and a role for statistical error has also been proposed.
In the new report, Newman shows that a variety of errors, individually and combined, have the effect of producing a slowing of apparent mortality at the end of the lifespan, and can largely explain away the observed trends. Categories of error include those in demographic sampling, birth and death records, age reporting, and others.
For instance, random errors in reporting of age within a population will result in some younger individuals being mistakenly recorded as older, and vice versa. As this population ages, older individuals mistakenly recorded as younger will die earlier than expected, but those mistakenly recorded as older will die later, enriching the pool of very old individuals and flattening the mortality curve.
Newman found that an error rate of as low as one in ten thousand would be sufficient to produce the observed declines in apparent age-related mortality. Furthermore, he was able to show that an improvement in data quality in large population studies corresponded with a reduction in late-life mortality deceleration.
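a minimal sketch of the error mechanism (my own assumptions, not newman's fitted values, with the error rate exaggerated so the effect shows up in a cohort this small): true mortality follows a gompertz curve, a small fraction of ages at death are misrecorded, and the apparent hazard at the oldest recorded ages bends toward a plateau.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
a, b = 1e-4, 0.1                       # assumed Gompertz hazard h(x) = a * exp(b * x)

# sample true ages at death by inverse transform of the Gompertz survival function
u = rng.random(n)
true_age = np.log(1 - b * np.log(1 - u) / a) / b

# misrecord a fraction of ages by up to +/- 20 years (rate exaggerated relative to the
# paper's one-in-ten-thousand figure so the plateau is visible in a cohort this small)
error_rate = 0.01
errs = rng.random(n) < error_rate
recorded_age = true_age.copy()
recorded_age[errs] += rng.integers(-20, 21, errs.sum())

def apparent_hazard(ages, lo, hi):
    """Deaths in [lo, hi) per person-year lived in [lo, hi) by those recorded as reaching lo."""
    at_risk = ages[ages >= lo]
    years = np.clip(at_risk, lo, hi) - lo
    return np.nan if years.sum() == 0 else (at_risk < hi).sum() / years.sum()

for lo in range(70, 101, 5):
    underlying = a * np.exp(b * (lo + 2.5))              # true hazard at the bin midpoint
    apparent = apparent_hazard(recorded_age, lo, lo + 5)
    print(f"{lo:>3}-{lo + 4}: underlying {underlying:5.2f}, apparent {apparent:5.2f}")
# the apparent hazard tracks the underlying curve at first, then flattens at the
# oldest ages, where misrecorded (younger) individuals dominate the records
```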
"These findings suggest that human late-life mortality plateaus are largely, if not entirely, artefacts of error processes," Newman concludes. The finding has important consequences for understanding human longevity, since predictions that lifespan can be greatly increased have depended in part on the apparent decelerations and plateaus previously reported in the biological and demographic literature.
In a separate short paper, Newman asked whether such errors might even explain away the late-life mortality plateau reported in a recent high-profile paper published in Science Magazine earlier this year by Elisabetta Barbi, Kenneth Wachter and colleagues -- that paper used a high-quality dataset of nearly 4,000 death records from Italy to show that death rates decelerate after the age of 80 and plateau after 105. Newman calculates that this apparent effect could still be down to plausible error rates in record-keeping. In a response to this, Wachter defends the quality of their dataset, and describes Newman's proposed error rate as "wildly implausibly high."
Newman does note that in at least one species, the fruit fly, an observed late-life mortality plateau does not seem to be due to error, and may require a biological explanation. "Discriminating between real and artefactual cases will require careful case-by-case analysis, and will constitute an ongoing challenge in the study of aging."
abstract Several organisms, including humans, display a deceleration in mortality rates at advanced ages. This mortality deceleration is sufficiently rapid to allow late-life mortality to plateau in old age in several species, causing the apparent cessation of biological ageing. Here, it is shown that late-life mortality deceleration (LLMD) and late-life plateaus are caused by common demographic errors. Age estimation and cohort blending errors introduced at rates below 1 in 10,000 are sufficient to cause LLMD and plateaus. In humans, observed error rates of birth and death registration predict the magnitude of LLMD. Correction for these sources of demographic error using a mixed linear model eliminates LLMD and late-life mortality plateaus (LLMPs) without recourse to biological or evolutionary models. These results suggest models developed to explain LLMD have been fitted to an error distribution, that ageing does not slow or stop during old age in humans, and that there is a finite limit to human longevity.
Author summary
In diverse species, mortality rates increase with age at a relatively fixed rate within populations. However, recent discoveries have suggested this relationship breaks down in advanced old age, with mortality rate increases slowing and even reaching a plateau. This late-life mortality deceleration has initiated sustained debate on the cause of late-life deceleration and plateaus. Proposed explanations include evolutionary patterns, the exhaustion of selective pressure, population heterogeneity, and even the cessation of the ageing process. Here, I demonstrate that apparent late-life mortality decelerations and plateaus can be generated by low-frequency errors. I then reveal how indicators of demographic data quality predict the magnitude of late-life mortality deceleration and the existence of late-life plateaus in human populations. These findings suggest that human late-life mortality plateaus are largely, if not entirely, artefacts of error processes. As a result, late-life mortality plateaus and decelerations may be explained by error patterns in humans and many other species without invoking complex biological, heterogeneity, or evolutionary models. This finding has immediate consequences for demographic modelling, evolutionary biology, and the projected upper limits of human and nonhuman life.
uncertainty in local outcomes shaping risk-averse behaviour
the developmental origins of risk and time preferences across diverse societies
amir, d. et al. 2019
psycnet.apa.org/doi/10.1037/xge0000675
personality is not only about who but also where you are
dorsa amir 2019
aeon.co/ideas/personality-is-not-only-about-who-but-also-where-you-are
success and failure of ecological management is highly variable in an experimental test
easton r. white et al. 2020
doi.org/10.1073/pnas.1911440116
Ecological systems might contain a lot of inherent randomness that makes them difficult to manage. One of the most difficult parts of managing an invasive species or a fishery is determining whether or not the management strategy was effective. If a management strategy failed to reach some goal, was this because it was the wrong strategy, or because of inherent randomness in the system? Perhaps that particular management strategy would have been the right choice 9 times out of 10, and managers were simply unlucky.
The study, led by Dr. Easton White from the University of Vermont in collaboration with scientists in California and Colorado, first used mathematical models to demonstrate that there can be high levels of variability in species management outcomes. The team then tested these ideas on an experimental invasive species, the flour beetle (Tribolium confusum).
“In nature, we might only have a single study site we are concerned with managing,” White says. “This means we typically only have a single replicate under study, making it difficult to determine the ultimate cause of management success or failure. The combination of mathematical models and laboratory experiments provides replication and a measure of ecological management variability.”
The team also found that the highest levels of management variability occurred at intermediate levels of management effort. In other words, unless a large amount of effort is used to control a system, we are likely to fail or succeed simply by chance. This is concerning for real systems where we have limited budgets.
“Our results suggest that much of ecological management is bound to succeed or fail simply because of good or bad luck,” notes White. “In our experiment we were able to control the laboratory conditions precisely, reducing variability caused by the environment. Thus, we might expect that managing natural systems would lead to even higher levels of variability.”
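The point about intermediate effort can be illustrated with a toy replicate experiment. The sketch below (Python; a generic harvested Ricker population with demographic noise, not the authors' Tribolium experiment or their actual treatments) runs many replicate "managements" at several harvest rates: low effort reliably fails, very high effort reliably succeeds, and intermediate effort splits between success and failure largely by chance.

    # toy illustration of stochastic variability in management outcomes
    # (not the published model): a harvested Ricker population with noise
    import numpy as np

    rng = np.random.default_rng(1)
    R, K = 1.0, 100      # growth rate and carrying capacity (illustrative)
    T, REPS = 30, 1000   # time steps per run, replicate managements per effort
    TARGET = 10          # success = final population at or below this size

    def run(harvest):
        """one managed population; harvest = expected fraction removed each step"""
        n = K
        for _ in range(T):
            expected = n * np.exp(R * (1 - n / K))   # Ricker growth
            n = rng.poisson(expected)                # demographic stochasticity
            n = rng.binomial(n, 1 - harvest)         # random removals
        return n

    print("effort  success_rate  sd_of_final_pop")
    for harvest in (0.2, 0.4, 0.6, 0.8):
        finals = np.array([run(harvest) for _ in range(REPS)])
        print(f"{harvest:6.1f}  {np.mean(finals <= TARGET):12.2f}  {finals.std():15.1f}")

With only a single replicate (as in most real sites), a manager applying the intermediate effort could easily conclude the strategy itself was right or wrong, when the outcome was largely luck.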
The team also investigated combinations of different management strategies. To control the invasive species, they tried direct harvesting and controlling beetle movement. They found that combinations of strategies, as opposed to a single strategy alone, were often more effective.
abstract When managing natural systems, the importance of recognizing the role of uncertainty has been formalized as the precautionary approach. However, it is difficult to determine the role of stochasticity in the success or failure of management because there is almost always no replication; typically, only a single observation exists for a particular site or management strategy. Yet, assessing the role of stochasticity is important for providing a strong foundation for the precautionary approach, and learning from past outcomes is critical for implementing adaptive management of species or ecosystems. In addition, adaptive management relies on being able to implement a variety of strategies in order to learn—an often difficult task in natural systems. Here, we show that there is large, stochastically driven variability in success for management treatments to control an invasive species, particularly for moderate, and more feasible, management strategies. This is exactly where the precautionary approach should be important. Even when combining management strategies, we show that moderate effort in management either fails or is highly variable in its success. This variability allows some management treatments to, on average, meet their target, even when failure is probable. Our study is an important quantitative replicated experimental test of the precautionary approach and can serve as a way to understand the variability in management outcomes in natural systems which have the potential to be more variable than our tightly controlled system.
improbability principle
(quoted from?)
dance with chance: making luck work for you
spyros makridakis 2009
the signal and the noise: why so many predictions fail - but some don’t
nate silver 2012
risk intelligence: how to live with uncertainty
dylan evans 2015
the blind spot: science and the crisis of uncertainty
william byers 2011
the jungles of randomness: a mathematical safari
ivars peterson 1998
the evolution of scientific knowledge: from certainty to uncertainty
edward dougherty 2016
the improbability principle: why coincidences, miracles, and rare events happen every day
david hand 2014
the possibility principle: how quantum physics can improve the way you think, live, and love
mel schwartz 2017
understanding uncertainty
dennis lindley 2006
the logic of miracles: making sense of rare, really rare, and impossibly rare events
laszlo mero, márton moldován 2018
knock on wood: luck, chance, and the meaning of everything
jeffrey s. rosenthal 2018
probability demystified second edition
allan g. bluman 2018
forewarned: a sceptic’s guide to prediction
paul goodwin 2018
luck: the brilliant randomness of everyday life
nicholas rescher 2001
fall, or dodge in hell
neal stephenson 2019
the art of statistics: learning from data
david spiegelhalter 2019
python for probability, statistics, and machine learning
josé unpingco 2019
accidental
alex richards 2020
how to decide: simple tools for making better choices
annie duke 2020
let us make chance habits in (seemingly different levels of) our lives. for example, i often run a script that randomly chooses between several preset options, so the same action produces different outputs depending on the random selection. i change the set of options at intervals, most often when i have finished one of the books i am reading (and that book was one of the options). the book itself is chosen semi-randomly: a random number generator outputs a number between 1 and 100, and if a folder with that number exists, i choose a book from that folder to read next. the books in the folders are preselected as ones that sounded interesting to me but that i have not read. they might well be books i disagree with strongly, but that i feel i might learn something by reading, though i do not tend to know what that might be in advance (or often, not even for some time after).
we can do the same with non-urgent tasks. or even urgent tasks, if there are too many to easily choose between, and you are willing to allow that level of randomness.
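a minimal sketch of this kind of chance habit (python; the folder path and layout are assumptions for illustration, not the actual script):

    # roll 1-100; if a folder with that number exists, pick something from it at random
    import random
    from pathlib import Path

    OPTIONS_DIR = Path("~/chance-options").expanduser()   # hypothetical location

    roll = random.randint(1, 100)
    folder = OPTIONS_DIR / str(roll)

    if folder.is_dir():
        choices = sorted(p for p in folder.iterdir() if p.is_file())
        if choices:
            print("next:", random.choice(choices).name)
        else:
            print(f"folder {roll} is empty, nothing chosen this time")
    else:
        print(f"no folder numbered {roll}, nothing chosen this time")

the same pattern works for the non-urgent (or even urgent) tasks above: put each candidate in a numbered folder, or swap the folders for lines in a plain text file.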
melodies of life
白鳥 英美子 (emiko shiratori)
alone for a while / i’ve been searching through the dark
for traces of the love you left inside my lonely heart
a mosaic / of / the pieces that remain
a melody of life / love lost’s refrain
our paths they did cross / though we cannot say just why
we met / we laughed / we held on fast / and then we said goodbye
to weave a fabric for fables yet untold
let them ring out loud as they unfold
in my dearest memories / i see you reaching out to me
though you’re gone, i still remember that you called / out my name
☆a voice from the past / joining yours and mine
adding to the layers of harmony
and so it goes, on and on
melodies of life,
to the sky beyond the soaring birds
forever and ever
so far and away, see the birds as they fly by
soaring through the shadows / of the clouds up in the sky
i have laid my memories and dreams upon those wings
leave them now and see what tomorrow brings
in your dearest memories / do you remember loving me?
was it fate / that brought us close and now leaves / me behind?
☆repeat
if i should leave / this lonely world behind
a voice will still remember our harmonies
now i know we’ll carry on
melodies of life
entwine and bond deep in our hearts
as long as we remember
role of chance and luck
not to judge someone by their achievements, as we do not know their circumstances.
the bad things that people do to you, to each other, to the world — we really do not understand what we do… if we did truly understand, we would not do these things. the bad things that people do to you — mostly they do not even know who you are, or are acting on an image they have of you, not on who you truly are. this is because our society and culture have not yet evolved to deal with our reality. so too the good things that people do — it may be that we mostly do them without understanding what we are truly doing, and any good consequences are just luck rather than the result of understanding. by this reasoning, we cannot take credit for the good things we do, nor the blame for the bad — we are still as children, not understanding the consequences of what we do, of how we live our lives.