Divisiveness/consensus metric in automatic report #1661

Answered by metasoarous
phoqui asked this question in Q&A

Hi @phoqui. Thanks for asking this question.

The "divisiveness" metric is defined in our methods paper using the term "extremity" (see pages 9-10). Basically, it's a measure of how strongly predictive a comment is of where participants fall in the 2D "opinion space" represented in the visualization. It's more or less the norm (distance from [0,0]) of the PCA vector entries corresponding to the comment in question. Our rationale for looking at this metric specifically is to be able to quickly get a sense of which comments best explain the opinion space and groupings.
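As a rough illustration of that definition, here is a minimal sketch of the extremity calculation, assuming a simple vote matrix coded as agree = 1, disagree = -1, pass = 0. This is a simplified, hypothetical reconstruction (using a plain SVD-based PCA), not the actual Polis implementation:

```python
import numpy as np

def comment_extremity(votes: np.ndarray) -> np.ndarray:
    """Extremity of each comment: the norm (distance from [0, 0]) of its
    entries in the top-2 PCA components of the vote matrix.

    votes: participants x comments matrix, with agree = 1, disagree = -1,
    pass = 0 (a simplifying assumption for this sketch).
    """
    # Center each comment's column so PCA reflects variation, not offsets
    centered = votes - votes.mean(axis=0)
    # The top-2 right singular vectors span the 2D "opinion space" axes
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:2]  # shape (2, n_comments)
    # Each comment contributes an (x, y) pair of loadings; its extremity
    # is that pair's Euclidean distance from the origin
    return np.linalg.norm(components, axis=0)
```

A comment on which everyone votes identically contributes nothing to the opinion space, so its extremity is near zero, while comments that split participants along the principal axes score highest.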

Admittedly, this has sometimes been confused with a more raw measure of the total vote variance or overall level of agreement…

Answer selected by metasoarous