OT: NYT article on, among other things, the limits of our ability to acknowledge what we don’t know
This topic has 395 replies, 22 voices, and was last updated 14 years, 2 months ago by bearishgurl.
July 2, 2010 at 8:23 AM #574755 — briansd1 (Guest)
This is a great article about a Russian mathematician.
Who would turn down a $1 million prize for solving a math problem?
Perhaps the smartest man in the world.
http://www.washingtonpost.com/wp-dyn/content/article/2010/07/01/AR2010070106247.html?hpid=topnews
July 20, 2010 at 6:06 AM #580255 — Arraya (Participant)

http://www.boston.com/bostonglobe/ideas/articles/2010/07/11/how_facts_backfire/
In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
July 20, 2010 at 6:25 AM #580260 — blahblahblah (Participant)

[quote=Arraya]http://www.boston.com/bostonglobe/ideas/articles/2010/07/11/how_facts_backfire/
In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.[/quote]
Interesting but no surprise, really. I’ve come to the same conclusion just reading the comments section on internet blogs.