Fact Check on Joel Isabirye’s Television Research

By Reality Check Uganda Experts

A number of our readers wrote asking us to give an opinion, on the poll results that Radio Consultant Joel Isabirye authors.

Our panel of experts sat down and this is their response.

To begin, a “poll” is a survey of public opinion drawn from a particular sample. Polls are usually designed to represent the opinions of a population by asking a series of questions and then extrapolating generalities, expressed as ratios or within confidence intervals.
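The confidence-interval idea can be sketched in a few lines of Python. The 520-of-1,000 figures below are purely illustrative, not from any actual poll:

```python
import math

def poll_confidence_interval(successes, sample_size, z=1.96):
    """Approximate confidence interval for a polled proportion
    (normal approximation; z=1.96 corresponds to 95% confidence)."""
    p = successes / sample_size
    margin = z * math.sqrt(p * (1 - p) / sample_size)
    return p - margin, p + margin

# Illustrative numbers: 520 of 1,000 respondents favour a programme.
low, high = poll_confidence_interval(520, 1000)
print(f"52% support, interval {low:.3f} to {high:.3f}")
```

Note that the margin shrinks only with the square root of the sample size: quadrupling the sample merely halves the margin, and (as discussed below) no sample size fixes a biased sample.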

The choice between administration modes is influenced by: 1) cost, 2) coverage of target population, 3) flexibility of asking questions, 4) respondents’ willingness to participate, and 5) response accuracy.

Different methods create mode effects that change how respondents answer. The most common modes of administration are face-to-face interviews, telephone interviews, mail questionnaires and online surveys.

Response Formats

Usually, a survey consists of a number of questions that the respondent answers in a set format. A distinction is made between open-ended and closed-ended questions. An open-ended question asks the respondent to formulate his or her own answer, while a closed-ended question has the respondent choose an answer from a given number of options. The response options for a closed-ended question should be exhaustive and mutually exclusive. The four types of response scales for closed-ended questions are:

Dichotomous: The respondent has two options.

Nominal-polytomous: The respondent has more than two unordered options.

Ordinal-polytomous: The respondent has more than two ordered options.

(Bounded) Continuous: The respondent is presented with a continuous scale.

A respondent’s answer to an open-ended question can be coded into a response scale afterwards or analyzed using more qualitative methods.
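Coding open-ended answers into a nominal scale can be illustrated with a toy keyword matcher. This is a deliberate simplification; real survey coding normally uses a codebook and trained coders, and the categories and keywords below are made up:

```python
# Hypothetical codebook: map free-text answers to nominal categories
# by simple keyword matching.
CODES = {
    "news": ["news", "bulletin"],
    "music": ["music", "song"],
}

def code_answer(text):
    """Return the first matching category, or 'other' if none match."""
    lowered = text.lower()
    for code, keywords in CODES.items():
        if any(keyword in lowered for keyword in keywords):
            return code
    return "other"

answers = ["I like the evening news", "Mostly songs", "Talk shows"]
print([code_answer(a) for a in answers])  # → ['news', 'music', 'other']
```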

Problems with polls typically stem either from methodological issues that bias the sample or from biases in the responses themselves.

Another source of error stems from faulty demographic models used by pollsters who weight their samples by particular variables, such as program identification in a viewership poll.
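The weighting step itself is mechanical; the danger described here is that if the pollster's assumed population shares are wrong, the weights bake that error into every result. A minimal sketch of post-stratification weighting, with made-up age groups and shares:

```python
# Hypothetical viewership sample: 75% young respondents, but we assume
# the true population is split 50/50 between the two age groups.
sample = [
    {"age_group": "18-34", "watches_show": True},
    {"age_group": "18-34", "watches_show": True},
    {"age_group": "18-34", "watches_show": False},
    {"age_group": "35+",   "watches_show": False},
]

population_share = {"18-34": 0.5, "35+": 0.5}    # assumed true demographics
sample_share     = {"18-34": 0.75, "35+": 0.25}  # what the sample contains

# Weight each respondent by population share / sample share.
for r in sample:
    r["weight"] = population_share[r["age_group"]] / sample_share[r["age_group"]]

weighted_yes = sum(r["weight"] for r in sample if r["watches_show"])
total_weight = sum(r["weight"] for r in sample)
print(round(weighted_yes / total_weight, 3))  # 0.333, versus 0.5 unweighted
```

If the assumed 50/50 population split is wrong, every weighted estimate inherits that mistake, which is exactly the faulty-demographic-model problem.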

A number of theories and mechanisms have been offered to explain erroneous polling results. Some of these reflect errors on the part of the pollsters; many of them are statistical in nature. Others blame the respondents for not giving candid answers (the controversial Bradley effect and Shy Tory Factor).

Non-response Bias

Since some people do not answer calls from strangers or refuse to answer the poll, poll samples may not be representative of the population due to non-response bias. Because of this selection bias, the characteristics of those who agree to be interviewed may be markedly different from those who decline. That is, the actual sample is a biased version of the universe the pollster wants to analyze. In these cases, bias introduces new errors in addition to errors caused by sample size. Error due to bias does not become smaller with larger sample sizes: taking a larger sample size simply repeats the same mistake on a larger scale.
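This last point can be demonstrated with a small simulation. In the hypothetical population below, 50% hold opinion A, but A-holders are only half as likely to agree to respond; conditional on responding, the share holding A is 0.125 / 0.375 = 1/3, so the poll converges on one-third rather than one-half no matter how large the sample:

```python
import random

random.seed(0)

# Hypothetical population: 50% hold opinion A, but A-holders respond
# with probability 0.25 versus 0.5 for everyone else (non-response bias).
def biased_poll(n):
    answers = []
    while len(answers) < n:
        holds_a = random.random() < 0.5
        responds = random.random() < (0.25 if holds_a else 0.5)
        if responds:
            answers.append(holds_a)
    return sum(answers) / n

# The estimate settles near 1/3, not the true 50%, however large n gets.
for n in (100, 10_000):
    print(n, round(biased_poll(n), 2))
```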

Response Bias

Surveys may be affected by response bias, where the answers given by respondents do not reflect their true beliefs. This may be deliberately engineered by unscrupulous pollsters in order to generate a certain result or please their clients, but more often is a result of the detailed wording or ordering of questions. Respondents may deliberately try to manipulate the outcome of a poll by advocating a more extreme position than they actually hold in order to boost their side of the argument or give rapid and ill-considered answers in order to hasten the end of their questioning. Respondents may also feel under social pressure not to give an unpopular answer. In American political parlance, this phenomenon is often referred to as the Bradley effect. If the results of surveys are widely publicized this effect may be magnified in a phenomenon commonly referred to as the spiral of silence.

Wording of Questions

It is well established that the wording of the questions, the order in which they are asked, and the number and form of alternative answers offered can influence the results of polls. For instance, the public is more likely to indicate support for a person who is described by the interviewer as one of the “leading candidates.”

A common technique to control for this bias is to rotate the order in which questions are asked. Many pollsters also split-sample: two different versions of a question are prepared, and each version is presented to half the respondents.
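A split-sample assignment is just a seeded random split of the respondent list. A minimal sketch, with hypothetical respondent IDs:

```python
import random

def assign_version(respondent_ids, seed=42):
    """Randomly assign question version A to half the respondents, B to the rest."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    ids = list(respondent_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {rid: ("A" if i < half else "B") for i, rid in enumerate(ids)}

versions = assign_version(range(10))
print(sorted(versions.values()))  # five "A"s and five "B"s
```

Comparing the results from the two halves then reveals how much the wording itself moves the answers.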

Coverage Bias

Another source of error is the use of samples that are not representative of the population as a consequence of the polling methodology. For example, telephone sampling has a built-in error because in many times and places, those with telephones have generally been richer than those without.

Selection Bias

Selection bias occurs when some units have a differing probability of selection that is unaccounted for by the researcher. For example, some households have multiple phone numbers making them more likely to be selected in a telephone survey than households with only one phone number.
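When the differing selection probabilities are known, the standard correction is to weight each unit by the inverse of its selection probability. A minimal sketch for the multiple-phone-number example, with made-up households:

```python
# Hypothetical households: those with more phone lines are proportionally
# more likely to be drawn in a telephone sample.
households = [
    {"id": 1, "phone_lines": 1, "watches": True},
    {"id": 2, "phone_lines": 2, "watches": False},
    {"id": 3, "phone_lines": 1, "watches": True},
]

# Weight by the inverse of the (relative) selection probability.
for h in households:
    h["weight"] = 1 / h["phone_lines"]

weighted = sum(h["weight"] for h in households if h["watches"])
total = sum(h["weight"] for h in households)
print(round(weighted / total, 2))  # 0.8, versus 0.67 unweighted
```

The bias the article describes arises precisely when the researcher does not know, or does not account for, these differing probabilities.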

In statistics, self-selection bias arises in any situation in which individuals select themselves into a group, causing a biased sample with non-probability sampling. The term is commonly used to describe situations where the characteristics of the people that cause them to select themselves into the group create abnormal or undesirable conditions in the group.

There may be purposeful intent on the part of respondents, leading to self-selection bias, whereas other types of selection bias may arise more inadvertently, possibly as the result of mistakes by those designing a given study.

In conclusion, our verdict is that for Isabirye’s polls to be taken seriously, he needs to come clean and tell his audience the name of the company that carried out the research, the method of data collection and the sample size. Short of that, he is bound to mislead people with his alleged research.

Do you have any issue that you want us to fact-check? WhatsApp your question to +256772745010

1 Comment
  1. Admin says

    Here’s a comment from one of our readers
    I want to agree with you greatly on the authenticity of Joel’s polls. Most times they are conducted to appease the clients and give room for consultancy on his part or better still create an impression that listeners have switched from one program to another or from one television station to another. Adult listeners are the most conservative when it comes to stations as opposed to younger listeners. For example, till to date people have listened to Radio Uganda in some circles and still prefer the government narrative on issues? Who captures such information, or which group is listening to what program and at what time.
    This was a brilliant exposition on your part and it would take a sober person to reply such a well researched criticism. It should not take rocket science to see that in bars and salons people are tuned to NTV OR BUKEDDE for that matter especially at prime time news…
    Good work you people are doing.
