
I thought it might be kind of nice to have a thread where we can just add all the great stuff that we find which may not merit its own thread.

This thread can be used to add:

  • Relevant quotes
  • Books and journals
  • Websites or other useful links
  • Definitions of important words, concepts, theories
  • Papers, articles, and reports in PDF format

Even if you don't have anything to add right now, please select 'follow' so that you can be notified of new additions.

Replies to This Discussion

The Critical Thinking Glossary from UNC Charlotte


Affirming the consequent
Like denying the antecedent, affirming the consequent is a formal fallacy: the fallacy lies solely in the form itself. It has the following pattern: if p then q; q; therefore p. Any argument that fits this pattern is invalid; that is, even if the premises are true, the conclusion drawn from them may not be true. A valid form, by contrast, guarantees that if the premises are true, the conclusion will be true. Indeed, if an argument has a valid form and true premises, then it is impossible for the conclusion to be false.
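
Because validity is a matter of form alone, the fallacy can be exhibited mechanically. Here is a minimal sketch in Python (my illustration, not part of the glossary entry) that enumerates the truth table and searches for a counterexample, a row where both premises are true and the conclusion is false:

```python
from itertools import product

def implies(p, q):
    # Material conditional: "if p then q" is false only when p is true and q is false.
    return (not p) or q

# Affirming the consequent: premises "if p then q" and "q", conclusion "p".
bad = [(p, q) for p, q in product([True, False], repeat=2)
       if implies(p, q) and q and not p]
print(bad)   # [(False, True)]: both premises true, conclusion false

# Compare modus ponens ("if p then q", "p", therefore "q"): no counterexample.
good = [(p, q) for p, q in product([True, False], repeat=2)
        if implies(p, q) and p and not q]
print(good)  # []
```

The single counterexample, p false and q true, is what makes the form invalid; the same search for modus ponens comes up empty, which is what a valid form guarantees.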


Alternatives
Alternatives are necessary for critical thinking. By itself a claim is difficult if not impossible to understand or assess. But if one has a rival interpretation one can compare and contrast the two, choosing the better alternative. By generating this second reading the critical reader can then determine which one best fits the text and make an informed judgment. This sort of critical awareness is key to critical thinking. Without relevant choices one cannot make informed judgments.

For instance, in "What the Doctor Ordered" the nurse could have accepted the explanation that the doctor was "weaning" the patient from the respirator but, based on her knowledge of medical procedures, she figured out that the prescribed procedure would result in the patient's death.

Also, by considering alternative interpretations, one can come to understand that an intelligent author made informed choices. She or he did not write down the first thing that came to her or his mind. S/he considered alternatives, choosing the one that best fit her or his purpose. In other words, the text is not sacred. It was constructed. One even can--with practice--recover the choices made by an author and then assess them.


Ambiguity
Ambiguity is the use of a term, phrase or statement in two or more distinct senses. If a headline in the University Times were to read:

Chancellor on Drugs

we would probably expect an article about the Chancellor speaking on drugs. (Oops! "The Chancellor speaking on drugs" is also ambiguous.) We would expect an article about a speech on the issue of drugs. But, in a different context, the words, "Chancellor on drugs" could mean something else. Ambiguity can be distinguished from vagueness. See "Who Needs the WWW?" for an example of ambiguity.


Analysis
Analysis (from a Greek word that means "a loosing") is an intellectual activity much valued by philosophers, which consists of understanding a concept or situation by identifying its constituent parts. Thus a term can be defined by identifying what is implicit in the concept. For instance, "sister" can be defined as "a female sibling." There are two characteristics common to all those we so designate: female and sibling. Again, an object, situation or event can be understood by sorting it into various categories. Buildings on the UNC Charlotte campus can be analyzed in terms of name, use, occupants, type of construction, age, etc. Thus the Winningham Building is used for offices, classrooms and storage, but is occupied by the Department of Philosophy and some members of the art department. Thus Winningham can be analyzed either in terms of name, use or occupancy. The assumption of philosophical analysis is that terms, objects and events can be understood by mentally taking them apart. See necessary and sufficient conditions.
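
As a toy illustration (my example, in the spirit of the glossary's "sister" case but not taken from it), the analyzed definition can be written as a predicate whose conjuncts are each necessary and jointly sufficient:

```python
# "Sister" analyzed into its constituent parts: "female" and "sibling" are
# each necessary, and together they are sufficient.
def is_sister(person):
    return person["female"] and person["sibling"]

alice = {"female": True, "sibling": True}
bob = {"female": False, "sibling": True}   # fails the necessary condition "female"
print(is_sister(alice))  # True
print(is_sister(bob))    # False
```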


Appeal to authority
If fallacious, an appeal to authority relies on an inappropriate authority, inappropriate because the person appealed to is either insincere (=untruthful) or lacking the necessary expertise. The relevant rule that it violates is: One may rely, to a greater or lesser extent, on the information or advice provided by someone who is truthful and knowledgeable about the issue in question.

Linus Pauling, who won the Nobel Prize for his work in chemistry, advocated the use of megadoses of Vitamin C for controlling the common cold. Therefore you should take lots of Vitamin C for your cold.

Pauling's expertise is not in health care. Causal links are difficult to identify. Even experts in a particular field do not ask us to accept a claim on their authority alone; they cite studies that show causation.

Here is an amusing appeal to authority:

A fellow wanted to take advantage of an advertised special at a fast food place--a hamburger, fries and large drink for $2.50. But instead of a large drink he decided to get a small drink. When he got to the cashier, she charged him for each item separately and the total came to more than $2.50. When he objected, she explained that there could be no changes and, if he wanted the lower price he would have to exchange the small drink for a large one. So he did, noticing, to his dismay, that the counterman dropped the small drink in the trash before giving him a large one. When he returned to the cashier he commented that this made no sense. But she failed to see the problem. She just pointed to the computerized cash register and said, "So how come that's what it says here?" (Adapted from an entry in the Metropolitan Diary section of The New York Times, October 11, 1998).

For another example see "Who's To Say?".

A List of Cognitive Biases

A cognitive bias is a pattern of deviation in judgment that occurs in particular situations. Implicit in the concept of a "pattern of deviation" is a standard of comparison; this may be the judgment of people outside those particular situations, or may be a set of independently verifiable facts. The existence of some of these cognitive biases has been verified empirically in the field of psychology.


Cognitive biases are instances of evolved mental behavior. Some are presumably adaptive, for example, because they lead to more effective actions in given contexts or enable faster decisions when faster decisions are of greater value. Others presumably result from a lack of appropriate mental mechanisms, or from the misapplication of a mechanism that is adaptive under different circumstances.


Cognitive bias is a general term that is used to describe many distortions in the human mind that are difficult to eliminate and that lead to perceptual distortion, inaccurate judgment, or illogical interpretation.


Decision-making and behavioral biases


Many of these biases are studied for how they affect belief formation, business decisions, and scientific research.


Anchoring – the common human tendency to rely too heavily, or "anchor," on one trait or piece of information when making decisions.

Bandwagon effect – the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior.

Bias blind spot – the tendency to see oneself as less biased than other people.[2]

Choice-supportive bias – the tendency to remember one's choices as better than they actually were.

Confirmation bias – the tendency to search for or interpret information in a way that confirms one's preconceptions.[3]

Congruence bias – the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.

Contrast effect – the enhancement or diminishing of a weight or other measurement when compared with a recently observed contrasting object.[4]

Denomination effect – the tendency to spend more money when it is denominated in small amounts (e.g. coins) rather than large amounts (e.g. bills).[5]

Distinction bias – the tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.[6]

Endowment effect – "the fact that people often demand much more to give up an object than they would be willing to pay to acquire it".[7]

Experimenter's or Expectation bias – the tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.[8]

Extraordinarity bias – the tendency to value an object more than others in the same category as a result of an extraordinarity of that object that does not, in itself, change the value.

Focusing effect – the tendency to place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.[9]

Framing effect – drawing different conclusions from the same information, depending on how that information is presented.

Continue reading on Wikipedia.

 

I haven't read this; I just ran across it, but now it is on my list.

How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life

Sports fans who think that basketball players shoot in "hot streaks," and maternity nurses who maintain that more babies are born when the moon is full adhere to erroneous beliefs, according to Gilovich, associate professor of psychology at Cornell. With examples ranging from the spread of AIDS to the weight of Scholastic Aptitude Test scores, he skewers popular but mistaken assumptions. Faulty reasoning from incomplete or ambiguous data, a tendency to seek out "hypothesis-confirming evidence" and the habit of self-serving belief are among the factors Gilovich pinpoints in his sophisticated analysis. However, in the book's second half, his debunking of holistic medicine, ESP and paranormal phenomena is superficial and one-sided, marred by some of the very tendencies he effectively exposes in the "true believers."

Copyright 1991 Reed Business Information, Inc.

Introspection illusion

The introspection illusion is a cognitive illusion in which people wrongly think they have direct insight into the origins of their mental states, while treating others' introspections as unreliable. In certain situations, this illusion leads people to make confident but false explanations of their own behavior or predictions about their future mental states.

The illusion has been examined in psychological experiments, and suggested as a basis for biases in how people compare themselves to others. These experiments have been interpreted as suggesting that, rather than offering direct access to the processes underlying mental states, introspection is a process of construction and inference, much as people indirectly infer others' mental states from their behavior.

When people mistake unreliable introspection for genuine self-knowledge, the result can be an illusion of superiority over other people. For example, each person thinks they are less biased and less conformist than the rest of the group. Even when experimental subjects are provided with reports of other subjects' introspections, in as detailed a form as possible, they still rate those other introspections as unreliable while treating their own as reliable. Although the hypothesis of an introspection illusion informs some psychological research, the existing evidence is arguably inadequate to decide how reliable introspection is in normal circumstances.

I have always enjoyed books on decision theory and on what makes choices irrational. One book I love is Predictably Irrational by Dan Ariely, in which he discusses these biases along with the experiments that further explain what our minds are going through. He also wrote a similar book, The Upside of Irrationality, about approaches that seem like they should work from a rational perspective but fail in practice. One example he gives is bonuses: they improve productivity when the task is mechanical/physical, but not when it is mental/cognitive, and he finds that a better reward for people doing mental tasks is more freedom from the job. Anyway, here are two book recommendations I hope you enjoy.

Motivated Skepticism in the Evaluation of Political Beliefs

ABSTRACT: We propose a model of motivated skepticism that helps explain when and why citizens are biased information processors. Two experimental studies explore how citizens evaluate arguments about affirmative action and gun control, finding strong evidence of a prior attitude effect such that attitudinally congruent arguments are evaluated as stronger than attitudinally incongruent arguments. When reading pro and con arguments, participants (Ps) counterargue the contrary arguments and uncritically accept supporting arguments, evidence of a disconfirmation bias. We also find a confirmation bias – the seeking out of confirmatory evidence – when Ps are free to self-select the source of the arguments they read. Both the confirmation and disconfirmation biases lead to attitude polarization – the strengthening of t2 over t1 attitudes – especially among those with the strongest priors and highest levels of political sophistication. We conclude with a discussion of the normative implications of these findings for rational behavior in a democracy.

 

 

Motivated skepticism is the mistake of deliberately applying more skepticism to claims that you don't like (or intuitively disbelieve), than to claims that you do like. Because emotional disposition towards a claim isn't generally evidence about its truth, including it in the process of arriving at a belief means holding the belief partly for reasons other than because it's true.
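
A rough way to see how the disconfirmation bias described above can produce attitude polarization is a toy simulation. The sketch below is my illustration, not the paper's model, and every number in it is invented: two readers start with mild opposing priors, see the same balanced stream of pro and con arguments, and discount only the arguments that cut against their prior attitude.

```python
def evaluate(belief, argument, prior_attitude, skepticism=0.7):
    # Disconfirmation bias: arguments that cut against the prior attitude are
    # counterargued, i.e. discounted by `skepticism`; congruent arguments are
    # accepted uncritically at full weight.
    congruent = (argument >= 0) == (prior_attitude >= 0)
    weight = 1.0 if congruent else 1.0 - skepticism
    return belief + weight * argument

# A perfectly balanced stream of arguments: 20 pro (+1) and 20 con (-1).
stream = [+1.0] * 20 + [-1.0] * 20

pro, con = +0.5, -0.5   # two readers with mild opposing priors
for argument in stream:
    pro = evaluate(pro, argument, prior_attitude=+1)
    con = evaluate(con, argument, prior_attitude=-1)

print(pro, con)  # 14.5 -14.5
```

Both readers end much farther from neutral than they began, even though they saw identical evidence; that is the t2-over-t1 polarization effect in miniature.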

The Situationist (blog)

There is a dominant conception of the human animal as a rational, or at least reasonable, preference-driven chooser, whose behavior reflects preferences, moderated by information processing and will, but little else. Laws, policies, and the most influential legal theories are premised on that same conception. Social psychology and related fields have discovered countless ways in which that conception is wrong. “The situation” refers to causally significant features around us and within us that we do not notice or believe are relevant in explaining human behavior. “Situationism” is an approach that is deliberately attentive to the situation. It is informed by social science—particularly social psychology, social cognition, cognitive neuroscience and related fields—and by the discoveries of market actors devoted to influencing consumer behavior—marketers, public relations experts, and the like.

The Situationist is a forum for scholars, students, lawyers, policymakers, and interested citizens to examine, discuss, and debate the effect of situational forces – that is, non-salient factors around and within us – on law, policy, politics, policy theory, and our social, political, and economic institutions. The Situationist is associated with The Project on Law and Mind Sciences at Harvard Law School.

Situationism is premised on the social scientific insight that the naïve psychology—that is, the highly simplified, affirming, and widely held model for understanding human thinking and behavior—on which our laws and institutions are based is largely wrong. Situationists (including critical realists, behavioral realists, and related neo-realists) seek first to establish a view of the human animal that is as realistic as possible before turning to legal theory or policy. To do so, situationists rely on the insights of scientific disciplines devoted to understanding how humans make sense of their world—including social psychology, social cognition, cognitive neuroscience, and related disciplines—and the practices of institutions devoted to understanding, predicting, and influencing people’s conduct—particularly market practices. Jon Hanson & David Yosifon, The Situation: An Introduction to the Situational Character, Critical Realism, Power Economics, and Deep Capture, 152 U. Pa. L. Rev. 129, 149–77 (2003).


Situationism has been applied to such topics as power economics, natural disasters, obesity, commercial speech and junk-food advertising, Supreme Court dynamics, racial injustice, affirmative action, race and rape, employment discrimination, employee adherence to workplace rules, legitimization of war, inside counsel, corporate law, and player autonomy in the National Basketball Association, among other topics.


For more on situationism, visit The Project on Law and Mind Sciences' website.

Selective exposure theory is a theory of communication positing that individuals prefer exposure to arguments supporting their position over those supporting other positions. As media consumers gain more choice over which outlets and content they are exposed to, they tend to select content that confirms their own ideas and to avoid information that argues against their opinions. People do not want to be told that they are wrong, nor do they want their ideas challenged; they therefore select media outlets that agree with their opinions and attitudes on different subjects, avoiding that form of dissonance, and follow only those programs.

Situationism in psychology refers to an approach to personality that holds that people are more influenced by external, situational factors than by internal traits or motivations.


It therefore challenges the position of trait theorists, such as Hans Eysenck or Raymond B. Cattell. The term is popularly associated with Walter Mischel, although he himself does not appear to like the term. The empirical evidence upon which situationists base their claims takes the form of cross-situational measures of traits such as extraversion, in which only low correlations of the same trait measured in different situations have been found. However, in response to such evidence, Hans Eysenck has pointed out that the correlations, while low, are typically still high enough to reach statistical significance. A midrange position, which holds that personality is best understood as resulting from a subtle interplay of internal and external factors, is known as "interactionism".
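
Eysenck's rejoinder is easy to check numerically. Here is a back-of-the-envelope sketch (the correlation and sample size are chosen for illustration, not taken from any particular study) using the standard t-test for a correlation coefficient:

```python
import math

# A "low" cross-situational trait correlation can still reach statistical
# significance once enough subjects are measured.
r, n = 0.20, 200                                  # illustrative values
t = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)  # t-statistic for r
print(round(t, 2))  # 2.87, with n - 2 = 198 degrees of freedom

# The two-tailed 5% critical value for 198 df is about 1.97, so r = 0.20 is
# significant even though it explains only r**2 = 4% of the variance.
```

This is precisely the tension in the debate: the correlation is real, but it accounts for little of the cross-situational variance.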


Some notable situationist studies include Zimbardo's Stanford prison experiment, the bystander-effect experiments, obedience experiments such as Milgram's, and experiments on heat and aggression.

How apropos is this?

The mind projection fallacy, a term coined by the physicist and Bayesian philosopher E. T. Jaynes, occurs when one takes for granted that the way one sees the world reflects the way the world really is, going as far as assuming the real existence of imagined objects. Another form of the fallacy is treating one's own lack of knowledge about how things really are as meaning that they are indeterminate.

I haven't read these yet. I just ran across them and thought I'd pass them along.

 

Blind Spots: Why Smart People Do Dumb Things

Clinical psychologist Van Hecke has compiled a list of 10 mental glitches that have infiltrated contemporary society, afflicting even the smartest among us and limiting thought, success, and relationships. Van Hecke devotes a chapter to each blind spot, including "Not stopping to think," "Not noticing," "Jumping to conclusions" and "Missing the big picture." Examining each in detail, she traces the root causes of these unconscious habits ("information overload," "our tendency to habituate") and offers tactics for overcoming them, using humorous anecdotes and other real-life examples to drive home her points; the key is remaining open to new ideas and taking a step back from our busy lives in order to process information, situations, and people. Filling in "the big picture" herself, Van Hecke demonstrates how embracing and understanding our weaknesses can improve not only personal and professional relationships but entire communities; this self-help guide is a welcome, highly readable first step.


Sway: The Irresistible Pull of Irrational Behavior

Recently we have seen plenty of irrational behavior, whether in politics or the world of finance. What makes people act irrationally? In a timely but thin collection of anecdotes and empirical research, the Brafman brothers—Ori (The Starfish and the Spider), a business expert, and Rom, a psychologist—look at sway, the submerged mental drives that undermine rational action, from the desire to avoid loss to a failure to consider all the evidence or to perceive a person or situation beyond the initial impression and the reluctance to alter a plan that isn't working. To drive home their points, the authors use contemporary examples, such as the pivotal decisions of presidents Lyndon B. Johnson and George W. Bush, coach Steve Spurrier and his Gators football team, and a sudden apparent epidemic of bipolar disorder in children (which may be due more to flawed thinking by the doctors making the diagnoses). The stories are revealing, but because it focuses on a few common causes of irrational behavior, the book doesn't delve deeply into the psychological demons that can devastate a person's life and those around him.


The Invisible Gorilla: How Our Intuitions Deceive Us

Psychology professors Chabris and Simons write about six everyday illusions of perception and thought, including the beliefs that we pay more attention than we actually do, our memories are more detailed than they are, confident people are competent people, we know more than we actually know, and our brains have reserves of power that are easy to unlock. Through a host of studies, anecdotes, and logic, the authors debunk conventional wisdom about the workings of the mind and what "experts" really know (or don't). Presented almost as a response to Malcolm Gladwell's Blink, the book pays special attention to "the illusion of knowledge" and the danger of basing decision-making, in areas such as investing, on short-term information; in the authors' view, careful analysis of assumed truths is preferable to quick, intuitive thinking. Chabris and Simons are not against intuition, "...but we don't think it should be exalted above analysis without good evidence that it is truly superior."


Why We Make Mistakes: How We Look Without Seeing, Forget Things in Seconds, and Are All Pretty Sure We Are Way Above Average

We forget our passwords. We pay too much to go to the gym. We think we’d be happier if we lived in California (we wouldn’t), and we think we should stick with our first answer on tests (we shouldn’t). Why do we make mistakes? And could we do a little better?

We human beings have design flaws. Our eyes play tricks on us, our stories change in the retelling, and most of us are fairly sure we’re way above average. In Why We Make Mistakes, journalist Joseph T. Hallinan sets out to explore the captivating science of human error--how we think, see, remember, and forget, and how this sets us up for wholly irresistible mistakes.

In his quest to understand our imperfections, Hallinan delves into psychology, neuroscience, and economics, with forays into aviation, consumer behavior, geography, football, stock picking, and more. He discovers that some of the same qualities that make us efficient also make us error prone. We learn to move rapidly through the world, quickly recognizing patterns--but overlooking details. Which is why thirteen-year-old boys discover errors that NASA scientists miss—and why you can’t find the beer in your refrigerator.

Why We Make Mistakes is enlivened by real-life stories--of weathermen whose predictions are uncannily accurate and a witness who sent an innocent man to jail--and offers valuable advice, such as how to remember where you’ve hidden something important. You’ll learn why multitasking is a bad idea, why men make errors women don’t, and why most people think San Diego is west of Reno (it’s not).

Why We Make Mistakes will open your eyes to the reasons behind your mistakes--and have you vowing to do better the next time.
