Right, but that is a survey of the type of people who answer surveys. I have to wonder how many people who don’t bother to vote also do bother to answer surveys about voting.
Pretty sure an organization like Pew knows how to handle the most basic challenges with polling (self-selection bias among those who answer polls). There are validated, proven ways to address those issues with a large enough sample size and specific methods for how and whom they poll.
And yet they are still regularly wrong, because statistics give you probability, not certainty.
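A minimal sketch, in Python with hypothetical poll numbers, of what “probability, not certainty” means in practice: the standard 95% margin of error for a proportion estimated from a simple random sample.

```python
import math

# Hypothetical poll: 1,000 respondents, 52% pick option A.
n, p = 1000, 0.52

# Standard 95% margin of error for a sampled proportion:
# z * sqrt(p * (1 - p) / n), with z ~= 1.96 for 95% confidence.
moe = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"52% +/- {moe:.1%}")  # roughly +/- 3.1 points

# A "52 vs 48" result sits inside the noise: being off by a couple of points
# is exactly what the confidence interval allows, even with perfect sampling.
# Non-response bias comes on top of that.
```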
“Pretty sure” means you don’t know.
I grew up on Pew data; I was disappointed years ago when they stopped using face-to-face interviews.
Later, I could not get a good answer about how they dealt with the scam epidemic of the last few years.
I’m beginning to think most polling companies in the USA have serious flaws in their methodology because of changes in the last few years, and they’re not going back to in-person interviews.
But these are institutions now in the USA, so most people assume they know what they are doing.
This, exactly.
There’s no real solution for selection bias if you don’t have other respondents from that group. With something like race or education, you have their demographics and can upsample those who do respond. But if the group is specifically defined by not wanting to respond to polls, and that comes with biases on the poll questions, you don’t have anything to upsample.
Now whether such a group is really a distinct entity out there that can’t be approximated by people who share other traits is the question. If white conservatives have a spectrum of trust in pollsters and the non-responders would just answer the questions the same way, you’re fine. But if those with low trust are also more anti-vax, or some sort of distinct population like an insular community, you couldn’t just approximate them with people who did respond.
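A minimal sketch (Python with pandas, entirely hypothetical numbers) of the upsampling described above, usually called post-stratification weighting: respondents in under-represented cells get weights above 1 so the weighted sample matches known population shares. It also shows the failure mode from the comment: a group with zero respondents gets no weight at all.

```python
import pandas as pd

# Hypothetical respondents: a demographic cell plus one yes/no answer.
respondents = pd.DataFrame({
    "age_group": ["18-29", "18-29", "30-44", "45-64", "45-64", "65+"],
    "answer":    [1,       0,       1,       0,       0,       1],
})

# Known population shares per cell (e.g., from census data; made up here).
population_share = {"18-29": 0.20, "30-44": 0.25, "45-64": 0.35, "65+": 0.20}

# Post-stratification weight = population share / sample share, per cell.
sample_share = respondents["age_group"].value_counts(normalize=True)
respondents["weight"] = respondents["age_group"].map(
    lambda g: population_share[g] / sample_share[g]
)

# Weighted estimate, with the unweighted mean for comparison.
weighted = (respondents["answer"] * respondents["weight"]).sum() / respondents["weight"].sum()
print(f"unweighted: {respondents['answer'].mean():.3f}, weighted: {weighted:.3f}")

# The failure mode from the thread: a cell defined by "never answers polls"
# contributes zero rows, so its sample share is zero and no finite weight can
# recover it. Reweighting only rescales people you actually reached.
```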
So do you have any evidence suggesting that willingness to respond to a survey has anything to do with political orientation?
https://www.surveylegend.com/customer-insight/generational-differences-in-surveys/
A quick Google search shows massive differences in how willing different generations are to respond to surveys, especially depending on how they are delivered: 40% of Gen Z respondents will abandon a survey if they are asked for personally identifying information.
Another user in this thread mentioned that this particular survey was delivered by mail, which means it was only able to reach people who have a mailing address, actually read non-essential mail, and were willing to respond.
I agree that being young makes you less likely to RESPOND to a survey.
What we are talking about are the results of a survey whose methodology shows it compensated for that bias by making sure it reached enough people in every demographic, across the political spectrum.
They are reaching enough Gen Z respondents to know the Gen Z opinion, I promise you. If you need it proven, go to the paper and read the survey’s methodology.