We show how the prevailing majority opinion in a population can be rapidly reversed by a small fraction p of randomly distributed committed agents who consistently proselytize the opposing opinion and are immune to influence. Specifically, we show that when the committed fraction grows beyond a critical value p_c = 10%, there is a dramatic decrease in the time T_c taken for the entire population to adopt the committed opinion. In particular, for complete graphs we show that when p < p_c, T_c ~ exp[α(p)N], whereas for p > p_c, T_c ~ ln N. We conclude with simulation results for Erdős-Rényi random graphs and scale-free networks which show qualitatively similar behavior.
And this is important, with implications for everything from church governance to church planting movements. Perhaps the most critical line here is: "randomly distributed committed agents who consistently proselytize [their] opinion and are immune to influence." This relates to church planting movements and the conversion process, because conversions almost always happen in the context of relationship (Stark, The Rise of Christianity, pp. 16-17).

1. This principle, if true, would work in both small groups and large. In an organization of 500 (the size of a medium-sized church or mission agency), just 50 committed agents would probably be enough to change any policy within it.

2. Simply holding an opinion is not enough to change the winds of culture. If the Tenth Column [like that?] isn't proselytizing its beliefs, nothing will happen. Sharing your opinion--through speaking, writing, webinars, blogs, Twitter, Facebook, over coffee, in workshops, at Sunday School, whatever--is important. You have to take the risk, speak out, and find others who share your ideas, even though you are in the minority in the larger community. (This doesn't necessarily mean speaking out very publicly, which is where wisdom and discernment come in.)

3. The rate of conversion to the new idea determines the speed at which transformation occurs. The study says nothing about how long the process will take; indeed, there is some debate about its parameters. In the study's model, (1) a "listener" talks with a "speaker"; (2) if the listener's opinion differs from the speaker's, the listener moves on, though the exposure is remembered; (3) if the listener then encounters a second speaker with the same opinion, the listener adopts the new belief. In real life, it could take far more time and exposures before someone changes their mind.
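The three-step process described in point 3 is simple enough to simulate. Here is a minimal sketch in Python--my own illustrative code and parameter choices, not the researchers' implementation--in which a random speaker talks to a random listener each step; a first exposure puts the listener in a mixed state, and a second exposure to the same opinion completes the conversion:

```python
import random

def simulate(n=200, p=0.15, max_steps=2_000_000, seed=1):
    """Sketch of the two-opinion 'committed agents' dynamic.

    Agents hold {'A'}, {'B'}, or the mixed state {'A', 'B'}. The first
    int(p * n) agents are committed: they hold {'A'} and never change.
    Each step a random speaker utters one of its opinions to a random
    listener:
      - if the listener lacks that word, it is added (first exposure);
      - if the listener already has it, both collapse to that word
        (the second exposure completes the conversion).
    Returns the number of steps until everyone holds only 'A', or None
    if that doesn't happen within max_steps.
    """
    rng = random.Random(seed)
    n_committed = int(p * n)
    opinions = [{'A'} for _ in range(n_committed)] + \
               [{'B'} for _ in range(n - n_committed)]
    for step in range(1, max_steps + 1):
        speaker, listener = rng.sample(range(n), 2)
        word = rng.choice(sorted(opinions[speaker]))
        if word in opinions[listener]:
            # Agreement: both parties collapse to the uttered word,
            # except committed agents (indices < n_committed), who are
            # immune to influence and stay {'A'}.
            if listener >= n_committed:
                opinions[listener] = {word}
            if speaker >= n_committed:
                opinions[speaker] = {word}
        elif listener >= n_committed:
            opinions[listener].add(word)  # first exposure: mixed state
        if all(o == {'A'} for o in opinions):
            return step
    return None

if __name__ == "__main__":
    print("steps to consensus:", simulate())
```

With the committed fraction set above the reported ~10% tipping point (p=0.15 here), the run ends with the whole population holding the committed opinion; push p well below 10% and the time to consensus grows dramatically, just as the abstract describes.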
Therefore, the chief goal should be to increase the rate of exposure to an idea, try to determine the "threshold" of exposures it takes for a listener to change their mind, and accelerate to reach that threshold in 10% of the population.

4. The reason the 10% threshold works: with every person added, there is an increasing chance of encountering someone who holds the opinion. Think about it: if you have 1 person in 100, there is no chance of growth on this model, because it takes exposure to two different people to "convert." (You might disagree with the math, but set that aside for the moment and consider the process.) When there are 2 people, there is only a very small chance of converting a third: the chance of encountering one of the two committed speakers in a given interaction is 2/100, and of then encountering the other is 1/100, so the overall chance is roughly 2 in 10,000. With each additional person, however, the chance of hitting the two-exposure threshold rises, because there are more "speakers" to talk to. Once you get to 10%, each individual (because of connectedness) likely knows several committed speakers and will shift easily. (In a crowd of 100, you know 10 with the opposing attitude; if it takes 2 exposures, you have 5x the required exposures to shift.) You can see this kind of thing in models for everything from fax machines to Facebook. "Follow this link--you'll need an account on Facebook. It's free." If you get more than a few of those from people whose links you want to follow, you'll get yourself a free account. Once there, it's a short jump to, "Let's start a Facebook group to discuss this."

5. Just because an opinion becomes the prevailing opinion does not mean it will remain so. Consider: "Freedom is good." From there, it moves quickly to "I ought to be free of government control in the decisions I make," and from there to "everyone else ought to be free too." But will it jump from there to "people should be free to change their religion"?
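The arithmetic in point 4 can be laid out as a short calculation. This is a back-of-envelope illustration under a simplifying assumption of my own (each encounter is a fresh random draw from the crowd, and conversion requires meeting two different committed speakers), not the paper's analysis:

```python
def two_exposure_chance(committed, crowd=100):
    """Rough chance that two successive random encounters are with two
    different committed speakers -- the two exposures the model requires."""
    if committed < 2:
        return 0.0  # one speaker alone can never supply both exposures
    first = committed / crowd           # meet any committed speaker
    second = (committed - 1) / crowd    # then meet a different one
    return first * second

for committed in (1, 2, 5, 10, 25):
    c = two_exposure_chance(committed)
    label = "no chance" if c == 0 else f"about 1 in {round(1 / c):,}"
    print(f"{committed:>3} committed in 100 -> {c:.4f} ({label})")
```

The numbers show the post's point: 2 committed speakers give roughly a 2-in-10,000 chance per pair of encounters, while 10 give about 1 in 111--the odds improve far faster than the headcount does.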
The Arab Spring may be a test of this: one idea jumps to the majority, but what minority ideas can reclaim that thought space? It works both ways.

6. Small thresholds of bad ideas need to be responded to early. If you are a thought leader in an organization, it behooves you to know what is being discussed in small groups--because small groups have less incentive to compromise, more incentive to spread quietly, and can fairly easily reach the "magic 10%." Respond to ideas when they are at the 1% level and they are much easier to deal with. This is why China is working so hard to regulate its people.

7. Conversion requires not just taking on a new opinion, but also becoming a proselytizer. The only way this works is if the "listener" likewise becomes a "speaker." For an idea to spread, the newly converted cannot simply hold the opinion. It is good if they speak up about it, but for it to really spread, they must also make listeners into speakers. Therefore, one way to mitigate the spread of a bad idea is to prevent this "leaders-make-leaders" phenomenon. And one key ingredient in spreading a good idea is to codify it, teach people to spread it, and inspire them to do so.

This is a critical piece of research. The original article is $25 and likely worth the purchase for you. How will you review this, test it, and pursue its use in your own work and ministry?