Scarcity
Eliezer Yudkowsky, LessWrong, 27 March 2008
haha, reminds me of when i first got my gmail account almost four years ago. ah, but i still love it. i guess this theory explains why i keep my hotmail accounts even though i don't use them anymore--they were grandfathered over from when syncing with outlook was free.

"Buyers for supermarkets, told by a supplier that beef was in scarce supply, gave orders for twice as much beef as buyers who were told it was readily available."

In defense of consumers, this behavior is in fact rational. A shortage increases the likelihood of a stockout in the near future, so it behooves you to buy more now, while you can, to maintain your own personal inventory and avoid the consequences of the coming stockout. This behavior is one cause of the Bullwhip Effect.

This is better described as a "noisy clue" than a "bias." On average, the fact that something is rare, in demand, or hidden on purpose is in fact a reason to be more interested in it. Of course, sometimes people can use our willingness to follow noisy clues to fool us.

Right, over time this tactic can become cliché in a given context? I hope I'm not the only one who rolls his eyes when a salesman claims that a certain product model is in short supply. Then again, I just missed a chance to buy a car at the price I wanted by ignoring such a claim and waiting. Or did I (miss a genuine rather than fake chance)?

See infomercials. All the effective ones say "but for a limited time only..." and "call while supplies last!" and "we can only guarantee this offer for the next 24 hours..." even though it's the exact same infomercial that has been on at the exact same time every night for a month. I've also never seen one of these products that wasn't "on sale" at some sort of "reduced" price. They also usually include freebies "worth" hundreds of dollars with an item they are selling for less than $20.
I've always wondered how anybody could be stupid enough to think that an item actually worth $100 would be included in a million $20 orders.

I agree with Bobvis: a LOT of this is rational:
This seems straight Bayes to me. The banning of the speech counts as information about the chance that you'll agree with it, and for a reasonably low probability of banning speech that isn't dangerous to the administration (i.e., speech that won't convince), Everyone's Favorite Probability Rule kicks in and makes it totally rational to become more opposed to coed dorms -- assuming, that is, that you believe your chance of being convinced comes largely from rational sources (a belief that practical agents are at least somewhat committed to having).
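The update described above can be sketched with Bayes' rule. Every number below is invented purely for illustration (nothing in the post or in Cialdini supplies them), and the function name is mine:

```python
# A sketch of the commenter's argument: observing that a speech was banned
# is evidence about whether the speech would have convinced you, provided
# the administration is more likely to ban convincing speech than harmless
# speech. All probabilities here are made up for illustration.

def posterior_convincing(prior, p_ban_if_convincing, p_ban_if_harmless):
    """P(speech would convince | speech was banned), by Bayes' rule."""
    banned_and_convincing = prior * p_ban_if_convincing
    banned_and_harmless = (1 - prior) * p_ban_if_harmless
    return banned_and_convincing / (banned_and_convincing + banned_and_harmless)

prior = 0.3                 # prior belief that the anti-coed-dorm case is convincing
p_ban_if_convincing = 0.6   # administrators often ban speech that might persuade
p_ban_if_harmless = 0.1     # they rarely bother banning speech that won't

print(posterior_convincing(prior, p_ban_if_convincing, p_ban_if_harmless))  # roughly 0.72
```

On these made-up numbers, learning of the ban moves the probability that the speech would convince you from 0.3 to about 0.72; how large the jump is depends entirely on how selectively the banning authority acts.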
This too seems rational, though in this case only mostly, not totally. We can understand jurors as trying to balance the costs and the benefits of the award (not their legal job, but a perfectly sane thing to do). And the diminishing marginal utility of wealth suggests that imposing a large judgment on an insurance company causes less disutility to the person paying (or people, distributing that over the company's clients) than imposing it on a single person.

As for the judge's informing the jurors that insurance information is inadmissible: again, they can interpret that instruction as information about the presence of insurance and update accordingly. (Although that might not be accurate in the context of how judges give instructions, jurors need not know that.) Of course, it seems like they updated too much, since they increased their awards much more when p(insurance) increased but was less than 1 than they did when they learned that p(insurance) = 1. So it's still probably partially irrational. But not an artifact of some kind of magical scarcity effect.

Bobvis, I believe the irrational part is that the consumer would buy more meat depending on how scarce the information of scarcity was. For example, a consumer who overheard the butcher talk about meat being scarce might buy 8 times more, instead of 4 times more if the scarcity of the meat were common knowledge.

Re: "information that appears forbidden or secret, seems more important and trustworthy"
Michael Scheuer says the same thing about how the CIA analyzes data. He claims that public sources are often ignored in favor of confidential ones, even when it's irrational to do so.

Minor point to speed up finding the book: I believe (or rather, Google believes) that the correct name is "Influence: The Psychology of Persuasion" (i.e., drop the "Social"). Cialdini also seems to have put out the same info in a textbook (which does not read like one), "Influence: Science and Practice." Amazon reviews say it is nearly identical, except it has chapter reviews and problems. I only mention this because this is the version that was available at my 2 nearest library systems. Very good reading a quarter of the way in -- so thanks for the tip. EY -- what other books are in the "own 3 copies" club?

Scarce things become objects of envy, increasing their social value to ourselves, and they also increase the probable resale value, even in the absence of other information. The scarcity often signals high demand from others, as in the out-of-stock example. I run an SEO blog, and it seems the articles that are most unique, or are perceived to be less available, or are general insider secrets are a lot more read. Also, things perceived to be wrong create more interest themselves. Nice article, worth a lot of money to an avid marketer I'd think.

This reminds me that one of the first things I did when I made my account here was to disable the setting that had me ignore posts that had been downvoted, because I found I was always clicking the link to view them anyway. So downvoting a post actually made me pay more attention to it than a post with no points at all.

I remember that I used to intentionally put some things that I wanted out of reach, so that I still got that feeling of yearning for them, all the while knowing that if I achieved it then it wouldn't be as great. Then I eventually realized that if achieving something lessens the value of it, then I shouldn't really want it in the first place.
"When University of North Carolina students learned that a speech opposing coed dorms had been banned, they became more opposed to coed dorms (without even hearing the speech). (Probably in Ashmore et al. 1971.)"

De-platforming may be effective in a different direction than intended.

Oof, be wary of Tim Ferriss, for he is a giant phony. I bought one of his books once, and nearly every single piece of advice in it was a bad generalization from a single study, and all of it was either already well known outside of the book, or ineffective, or just plain wrong. I have had great luck by immediately downgrading the trustworthiness of anything that mentions him, and especially anything that treats him as an authority. I have found the same with NLP. Please don't join that club.

Tim Ferriss is an utterly amoral agent. His purpose is to fill pages with whatever you will buy, and to sell them to you, for money. Beyond that, he does not care at all. I expect he has read Robert Cialdini's Influence, but only as a guidebook to the most efficient ways to extract money from suckers. This is just a warning to all readers.
What follows is taken primarily from Robert Cialdini's Influence: The Psychology of Persuasion. I own three copies of this book, one for myself, and two for loaning to friends.

Scarcity, as that term is used in social psychology, is when things become more desirable as they appear less obtainable.

Similarly, information that appears forbidden or secret seems more important and trustworthy:

When University of North Carolina students learned that a speech opposing coed dorms had been banned, they became more opposed to coed dorms (without even hearing the speech). (Probably in Ashmore et al. 1971.)

When a driver said he had liability insurance, experimental jurors awarded his victim an average of four thousand dollars more than if the driver said he had no insurance. If the judge afterward informed the jurors that information about insurance was inadmissible and must be ignored, jurors awarded an average of thirteen thousand dollars more than if the driver had no insurance. (Broeder 1959.)

The conventional theory for explaining this is "psychological reactance", social-psychology-speak for "When you tell people they can't do something, they'll just try even harder." The fundamental instincts involved appear to be preservation of status and preservation of options. We resist dominance when any human agency tries to restrict our freedom. And when options seem to be in danger of disappearing, even from natural causes, we try to leap on the option before it's gone.
Leaping on disappearing options may be a good adaptation in a hunter-gatherer society (gather the fruits while the tree is still in bloom), but in a money-based society it can be rather costly. Cialdini (1993) reports that in one appliance store he observed, a salesperson who saw that a customer was evincing signs of interest in an appliance would approach, and sadly inform the customer that the item was out of stock, the last one having been sold only twenty minutes ago. Scarcity creating a sudden jump in desirability, the customer would often ask whether there was any chance that the salesperson could locate an unsold item in the back room, warehouse, or anywhere. "Well," says the salesperson, "that's possible, and I'm willing to check; but do I understand that this is the model you want, and if I can find it at this price, you'll take it?"
As Cialdini remarks, a chief sign of this malfunction (wanting a thing because it seems unobtainable, rather than for the use you would get out of it) is that you dream of possessing something, rather than using it. (Timothy Ferriss offers similar advice on planning your life: ask which ongoing experiences would make you happy, rather than which possessions or status-changes.)
But the really fundamental problem with desiring the unattainable is that as soon as you actually get it, it stops being unattainable. If we cannot take joy in the merely available, our lives will always be frustrated...
Ashmore, R. D., Ramachandra, V. and Jones, R. A. (1971). "Censorship as an Attitude Change Induction." Paper presented at Eastern Psychological Association meeting, New York, April 1971.

Brehm, S. S. and Weintraub, M. (1977). "Physical Barriers and Psychological Reactance: Two-year-olds' Responses to Threats to Freedom." Journal of Personality and Social Psychology 35: 830-36.

Broeder, D. (1959). "The University of Chicago Jury Project." Nebraska Law Review 38: 760-74.

Cialdini, R. B. (1993). Influence: The Psychology of Persuasion, Revised Edition. Pp. 237-71. New York: Quill.

Knishinsky, A. (1982). "The Effects of Scarcity of Material and Exclusivity of Information on Industrial Buyer Perceived Risk in Provoking a Purchase Decision." Doctoral dissertation, Arizona State University.

Mazis, M. B. (1975). "Antipollution Measures and Psychological Reactance Theory: A Field Experiment." Journal of Personality and Social Psychology 31: 654-66.

Mazis, M. B., Settle, R. B. and Leslie, D. C. (1973). "Elimination of Phosphate Detergents and Psychological Reactance." Journal of Marketing Research 10: 390-95.