IHSN Survey Catalog
Global Barometer Survey 2001-2008, Round 1

Argentina, Benin, Bolivia, Brazil, Botswana, Chile, Colombia, Cabo Verde, Costa Rica, Algeria, Ecuador..., 2001 - 2008
Reference ID
WLD_2001_GBS_v01_M
Producer(s)
Department of Political Science, National Taiwan University
Metadata
Documentation in PDF DDI/XML JSON
Created on
Jul 10, 2013
Last modified
Aug 26, 2021
Page views
16542
Downloads
58
Identification

    Survey ID number

    WLD_2001_GBS_v01_M

    Title

    Global Barometer Survey 2001-2008

    Subtitle

    Round 1

    Country
    Name Country code
    Argentina ARG
    Benin BEN
    Bolivia BOL
    Brazil BRA
    Botswana BWA
    Chile CHL
    Colombia COL
    Cabo Verde CPV
    Costa Rica CRI
    Algeria DZA
    Ecuador ECU
    Ghana GHA
    Guatemala GTM
    Honduras HND
    Indonesia IDN
    India IND
    Jordan JOR
    Japan JPN
    Kenya KEN
    Kuwait KWT
    Lebanon LBN
    Sri Lanka LKA
    Lesotho LSO
    Morocco MAR
    Madagascar MDG
    Mexico MEX
    Mali MLI
    Mongolia MNG
    Mozambique MOZ
    Malawi MWI
    Namibia NAM
    Nigeria NGA
    Nicaragua NIC
    Nepal NPL
    Pakistan PAK
    Panama PAN
    Peru PER
    Philippines PHL
    Paraguay PRY
    Senegal SEN
    Singapore SGP
    El Salvador SLV
    Thailand THA
    Taiwan, China TWN
    Tanzania TZA
    Uganda UGA
    Uruguay URY
    Venezuela, RB VEN
    Vietnam VNM
    West Bank and Gaza WBG
    South Africa ZAF
    Zambia ZMB
    Zimbabwe ZWE
    Study type

    Other Household Survey [hh/oth]

    Abstract

The Global Barometer Survey is a systematic comparative survey of attitudes and values toward politics, power, reform, democracy and citizens' political actions in Africa, Asia, Latin America and the Arab region. It is based on a common module of questions contained in regional barometer surveys.

    Cross-national comparative surveys have been implemented in 55 political systems. In each of the 55 countries or regions, a national research team administers a country-wide face-to-face survey using standardized survey instruments to compile the required micro-level data under a common research framework and research methodology.

    Kind of Data

    Sample survey data [ssd]

    Unit of Analysis

Individuals

    Version

    Version Description
    • v01

    Scope

    Notes

Survey topics: Economic evaluation, Trust, Feeling of security, Vote and election, Political interest and situation, Media exposure, Party, Demonstration, Democracy, Citizen power, Freedom, Corruption, Leader, Army, Gender, Education, Marital status, Religion, Income, Occupation.

    Coverage

    Geographic Coverage

Country-wide survey in 55 political systems, namely Japan, Korea, Mongolia, Philippines, Taiwan, Thailand, Indonesia, Singapore, Vietnam, Argentina, Bolivia, Brazil, Colombia, Costa Rica, Chile, Ecuador, El Salvador, Guatemala, Honduras, Mexico, Nicaragua, Panama, Paraguay, Peru, Uruguay, Venezuela, Benin, Botswana, Cape Verde, Ghana, Kenya, Lesotho, Madagascar, Malawi, Mali, Mozambique, Namibia, Nigeria, Senegal, South Africa, Tanzania, Uganda, Zambia, Zimbabwe, Bangladesh, India, Nepal, Pakistan, Sri Lanka, Jordan, Palestine, Algeria, Morocco, Kuwait and Lebanon.

    Universe

The surveys of Algeria, Argentina, Bolivia, Botswana, Brazil, Cape Verde, Colombia, Costa Rica, Chile, Ecuador, El Salvador, Ghana, Guatemala, Honduras, India, Japan, Jordan, Kenya, South Korea, Kuwait, Lebanon, Lesotho, Madagascar, Malawi, Mali, Mexico, Mongolia, Morocco, Mozambique, Namibia, Nepal, Nicaragua, Nigeria, Pakistan, Palestine, Panama, Paraguay, Peru, Philippines, Senegal, South Africa, Sri Lanka, Taiwan, Tanzania, Thailand, Uganda, Uruguay, Venezuela, Vietnam, Zambia and Zimbabwe cover both sexes aged 18 years and older. The survey of Indonesia covers both sexes aged 17 years and older, and the survey of Singapore covers both sexes aged 21 years and older.

    Producers and sponsors

    Primary investigators
    Name
    Department of Political Science, National Taiwan University
    Producers
    Name
    List of producers http://www.jdsurvey.net/gbs/GBSParticipants.jsp
    ASEP/JDS

    Sampling

    Sampling Procedure

    Sampling procedure for each individual country is available at http://www.jdsurvey.net/gbs/GBSTechnical.jsp

    Response Rate

    Benin: Contact rate: 0.89 Cooperation rate: 0.90 Refusal rate: 0.02 Response rate: 0.80

    Indonesia: Response Rate. The original sample size was 1,600 respondents. There were 1,440 successful interviews without substitution, and therefore the response rate is 90%. The number of substitutions is 160.

    Botswana: Contact rate: 0.91 Cooperation rate: 0.87 Refusal rate: 0.03 Response rate: 0.79

    Cape Verde: Contact rate: 0.92 Cooperation rate: 0.78 Refusal rate: 0.11 Response rate: 0.72

    Ghana: Contact rate: 0.98 Cooperation rate: 1.00 Refusal rate: 0.00 Response rate: 0.98

Japan: The second wave of the ABS in Japan was conducted between February and March 2007 and yielded 1,067 valid cases from a sample of 2,500, a response rate of 42.7%. Both surveys used additional sub-sampling that followed the same two-stage random sampling procedure. The response rate for Japan was lower than it had been in the EAB 2003 (the first wave of the ABS).

    Jordan: Response rate in Jordan’s sample was 95%.

    Kenya: Contact rate: 0.80 Cooperation rate: 0.75 Refusal rate: 0.07 Response rate: 0.60

    South Korea: A total of 3,224 addresses were selected. At 649 addresses, there was no one at home after two callbacks so that the household residents could not be enumerated and a respondent selected. Of the 2,575 households where an individual name could be selected by the birthday method, 32 individuals were not interviewed because they were too old or infirm or absent from the household; 630 refused; and 413 were not completed because of the respondent’s impatience, for which Koreans are notorious. A total of 1,500 interviews were satisfactorily completed, registering a response rate of 58 percent. Of the completed interviews, 20 percent were randomly selected and independently validated.

    Lesotho: Contact rate: 0.78 Cooperation rate: 0.85 Refusal rate: 0.02 Response rate: 0.66

    Madagascar: Contact rate: 0.77 Cooperation rate: 0.84 Refusal rate: 0.03 Response rate: 0.65

    Malawi: Contact rate: 0.91 Cooperation rate: 0.86 Refusal rate: 0.04 Response rate: 0.78

    Mozambique: Contact rate: 0.96 Cooperation rate: 0.95 Refusal rate: 0.03 Response rate: 0.91

    Namibia: Contact rate: 0.85 Cooperation rate: 0.70 Refusal rate: 0.21 Response rate: 0.60

    Nigeria: Contact rate: 0.77 Cooperation rate: 0.78 Refusal rate: 0.10 Response rate: 0.61

    Senegal: Contact rate: 0.95 Cooperation rate: 0.90 Refusal rate: 0.07 Response rate: 0.85

    Singapore: The original sample size was 1,000 respondents plus another 1,000 as reserve. There were 1,012 successful interviews with 456 cases replaced by the cases in the reserve list, and therefore the response rate is 69.52%.

    South Africa: Contact rate: 0.95 Cooperation rate: 0.91 Refusal rate: 0.06 Response rate: 0.87

    Tanzania: Contact rate: 0.98 Cooperation rate: 0.98 Refusal rate: 0.00 Response rate: 0.96

    Zambia: Contact rate: 0.86 Cooperation rate: 0.83 Refusal rate: 0.07 Response rate: 0.72

    Zimbabwe: Contact rate: 0.82 Cooperation rate: 0.80 Refusal rate: 0.09 Response rate: 0.66
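As a quick sanity check, the reported response rates above are consistent with the product of the contact and cooperation rates. A minimal sketch of this check (the dictionary, tolerance, and sample of countries are ours, not part of the survey documentation):

```python
# Illustrative check: response rate ~= contact rate * cooperation rate,
# using a few of the rates reported in the survey documentation above.
rates = {
    # country: (contact, cooperation, reported_response)
    "Benin":    (0.89, 0.90, 0.80),
    "Botswana": (0.91, 0.87, 0.79),
    "Ghana":    (0.98, 1.00, 0.98),
    "Kenya":    (0.80, 0.75, 0.60),
}

for country, (contact, coop, reported) in rates.items():
    implied = contact * coop
    # Reported figures are rounded to two decimals, so allow a small tolerance.
    assert abs(implied - reported) <= 0.01, country
    print(f"{country}: implied {implied:.2f}, reported {reported:.2f}")
```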

    Weighting

    Benin, Botswana, Ghana, Malawi, Mozambique, Namibia, Senegal, Zambia, Zimbabwe: None

    Cape Verde: Weighted to take account of over- or under-samples with respect to island and urban-rural distribution.

    Kenya: Weighted to account for over- or under-samples with respect to province and urban-rural distribution.

    South Korea: The sample was more or less consistent with the survey population with respect to age, gender, and region. Hence no weighting variable was constructed.

    Lesotho: Weighted to account for over- or under-samples with respect to district and urban-rural distribution.

    Madagascar: Weighted to account for individual selection probabilities.

Mali: Weighted to account for over- or under-samples with respect to region and urban-rural distribution.

Nigeria: Weighted to account for an over-sample in Bayelsa, Delta and Rivers states, and an under-sample in the Northwest Region.

Singapore: To yield representative figures at the national level, census-based population weights are applied to the survey data. The weight projection is computed by age. Since Singapore is geographically quite small, age is statistically more significant than area, as age represents the bandwidth of the population. Appropriate projection factors were applied so that the original population proportions were reflected in the data tables, using this formula: Projection factors (Weight) = Population / No. of Interviews.
The SPSS version of the data file is already weighted according to the above projection factors. As the data are weighted, the total number of cases that appear is 2,616,457.

    South Africa: Weighted to account for individual selection probabilities.

    Taiwan:
    (c) Goodness of Fit and Data Weighting
The purpose of the chi-square test within the SPSS nonparametric statistical tests is to establish that the data are consistent with the distribution of the entire population. Three sample characteristics are addressed: gender, age and education.
Both age and education failed the chi-square test, meaning that the sample data for age and education are inconsistent with the whole population. The problem is rooted in the sampling. To rectify the data, "raking", a weighting method in accordance with multiple variables, was used to render the sample's gender, age and education data consistent with the entire population.

    Tanzania: Weighted to correct for over-sample in Zanzibar.

    Uganda: Weighted to account for over- or under-samples with respect to region and urban-rural distribution.

    Vietnam:
    To yield representative figures at the national level, census-based population weights are applied to the survey data. The weight projection is computed by dividing the projected population in the area by the sample size of the same area. Appropriate projected factors were applied so that original population proportions were reflected in the data tables using this formula.
    Projection factors (Weight) = Population/No. of Interviews
The SPSS version of the data file is already weighted according to the above projection factors. As the data are weighted, the total number of cases that appear is 45,847. The figure is in thousands, i.e., 45,847,142 persons, representing the NSO's projected number of adults (18 years old and above) for the year 2005, based on the 1999 Census.
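The projection-factor formula above can be sketched directly (a minimal illustration; the function name and the example population figure are ours, not NSO data):

```python
# Sketch of the census-based projection weighting described above:
# weight = projected population of an area / number of interviews in that area.
def projection_weight(population: float, n_interviews: int) -> float:
    """Projection factor (Weight) = Population / No. of Interviews."""
    return population / n_interviews

# Hypothetical example: an area with a projected 2,400,000 adults
# covered by 240 interviews; each interview then represents 10,000 adults.
w = projection_weight(2_400_000, 240)
print(w)  # 10000.0
```

Summing the weights of all interviews then reproduces the projected population total, which is why the weighted SPSS files report case counts in the millions.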

    Survey instrument

    Questionnaires

    Algeria: Languages: Arabic (with a small number of respondents preferring French)

    Indonesia:
    (1) Questionnaire
    The definitive language version of the questionnaire is Bahasa Indonesia, which is a translated version from the original questionnaire in English. Then the language translation underwent cognitive pretests to make sure that the messages were conveyed accurately.
    (2) Pre-Testing and Finalizing the Questionnaire
    LSI pre-tested the questionnaire on 17 voting-age adults from different socio-economic classes in order to:

    • Determine the time length of the interview,
    • Improve the wording of the questions, if necessary,
    • Eliminate unnecessary questions or add new items, as the case may be,
• Test question sequence and identify biases,
    • Correct and improve translation,
    • Change open-ended questions into multiple-choice questions,
    • Find out which items are conceptually vague,
    • Check accuracy and adequacy of the questionnaire instructions,
    • Determine whether the focus of the question is clear, and
    • Identify interviewer’s recording difficulties.

    Kuwait: Language: Arabic
    Lebanon: Language: Arabic
    Morocco: Language: Arabic
    Palestine: Language: Arabic

    Singapore:
    (1) Questionnaire
    The original language of the questionnaire was English and it was translated into Chinese and Malay by qualified translators. Back-translation was done to ensure accuracy.
    (2) Pre-Testing and Finalizing the Questionnaire
    Five interviewers were tasked with pre-testing the pilot questionnaire on 20 respondents, with a good spread of respondents in terms of gender, race and age. The pilot test served to achieve the following:

    • Determine the time length of the interview
    • Improve the wording of the questions, if necessary
    • Eliminate unnecessary questions or add new items, as the case may be
• Test question sequence and identify biases
    • Correct and improve translation
    • Change open-ended questions into multiple-choice questions
    • Find out which items are conceptually vague
    • Check accuracy and adequacy of the questionnaire instructions
    • Determine whether the focus of the question is clear
    • Identify interviewer's recording difficulties

    Vietnam:
    (1) Questionnaire
The English version of the questionnaire was translated into Vietnamese. The translation went through a cognitive pre-test before the actual pretest was undertaken.
    (2) Pre-Testing and Finalizing the Questionnaire
    IHS pre-tested the questionnaire on 100 voting-age adults from different socio-economic classes in order to:

    • Determine the time length of the interview
    • Improve the wording of the questions, if necessary
    • Eliminate unnecessary questions or add new items, as the case may be
• Test question sequence and identify biases
    • Correct and improve translation
    • Change open-ended questions into multiple-choice questions
    • Find out which items are conceptually vague
    • Check accuracy and adequacy of the questionnaire instructions
    • Determine whether the focus of the question is clear
    • Identify interviewer's recording difficulties

    Data collection

    Dates of Data Collection
    Start End
    2001-06-24 2008-12-31
    Data Collectors
    Name Affiliation
    MORI Argentina Argentina
    Institute for Empirical Research in Political Economy Benin
    Equipos MORI Consultores Bolivia
    Centre of Specialisation in Public Administration and Management Botswana
    Department of Political and Administrative Studies, University of Botswana Botswana
Brazilian Institute of Public Opinion and Statistics Brazil
    Afro-Sondagem, Praia Cape Verde
    Yanhaas Colombia
    CID-GALLUP Costa Rica
    MORI Chile Chile
    Apoyo, Opinión y Mercado Ecuador
    CID-GALLUP El Salvador
    Center for Democratic Development, Accra Ghana
    CID-GALLUP Guatemala
    CID-GALLUP Honduras
    Lembaga Survei Indonesia Indonesia
    Central Research Services Japan
    Institute for Development Studies, Nairobi Kenya
    Garam Research Institute South Korea
    Sechaba Consultants, Maseru Lesotho
    National Institute of Statistics (INSTAT) and COEF Ressources Madagascar
    Institute for Economic and Social Research Malawi
    Groupe de Recherche en Economie Appliquee et Theorique Mali
    Mundamericas Mexico
    Academy of Political Education Mongolia
    Centre for Population Studies (CPS), Eduardo Mondlane University Mozambique
    Research Facilitation Services, Windhoek Namibia
    CID-GALLUP Nicaragua
    Practical Sampling International, Lagos Nigeria
    CID-GALLUP Panama
    Equipos MORI Consultores Paraguay
    Apoyo, Opinión y Mercado Peru
    Social Weather Stations Philippines
    GERCOP l’ENEA, L’Université Gaston Berger de Saint Louis Senegal
    AC Nielsen South Africa
    Institute of Political Science, Academia Sinica Taiwan
    Research on Poverty Alleviation Tanzania
    King Prajadhipok's Institute Thailand
    Wilsken Agencies, Ltd., Kampala Uganda
    Equipos MORI Consultores Uruguay
    DOXA Venezuela
    Institute of Human Studies Vietnam
    Steadman Research Services, Kenya Zambia
    Mass Public Opinion Institute, Harare Zimbabwe
    Data Collection Notes

    Indonesia:
    (3) Training
(a) LSI had two levels of training. LSI conducted the first level in Jakarta by inviting all area coordinators of LSI to special sessions. The aim of this training was to give the coordinators a general picture of the survey and to read and review the questionnaire. The second level of training took place in various provinces of Indonesia, where the area coordinators disseminated what they had learned to surveyors in their respective areas. Particular attention was given to areas with large primary sampling units, such as West Java, Central Java, and East Java, where researchers from LSI's head office were most involved in the training.
    (b) Training time - The minimum training time for group supervisors and interviewers was two days prior to field implementation. The third day was the start-off, where the field supervisor observed the field interviewers during their first round of interviews.
    (c) Training Activities - These mainly consisted of one or two days of office training to learn the basics of the project and mock interviews among participants. The latter activity meant that field interviewers interviewed field anchors as if they were respondents in order to get accustomed to the flow of interviewing and the questionnaire format. Interviewers practiced with a supervisor until they could sufficiently conduct interviews on their own.
    (d) Evaluation of interviewer’s work - A field supervisor observed and evaluated all of the first interviews of each field interviewer. Only after meeting certain evaluation criteria was an interviewer left to interview on her own. The field supervisor always stayed within the vicinity of the sample spot to conduct checks, however.

    Fieldwork:
    (1) Workers on Hand
    For this project, LSI deployed a total of 174 field staff:
    Overall Field Manager = 1
Western Indonesia: Field Anchors = 12, Field Interviewers = 135
Central Indonesia: Field Anchors = 10, Field Interviewers = 35
Eastern Indonesia: Field Anchors = 2, Field Interviewers = 4
    (2) Supervision
Supervisors reported to the field manager and monitored the study full-time. They observed interviewers (at least 10% of the total interviews were observed by supervisors), conducted follow-ups, and did surprise checks on the field interviewers. They also ensured that field logistics were received promptly and administered properly.
    (3) Spot-checking
    Part of quality control was to make sure at least 30% of each interviewer’s output was spot-checked and back-checked. Once an incomplete or inconsistent answer was spotted in the questionnaire, the field interviewer went back to the respondent’s house to re-ask the question for verification.

    Japan:
    Fieldwork: The interviews were also conducted by Central Research Services interviewers. Though the interviewers were skilled at this kind of fieldwork, they were still required to participate in an orientation training session for these particular interviews. Interviews were conducted in Japanese. The mean length of the interviews was 50.3 minutes, with a range from 19 to 178 minutes (the SD was 16.2).

    South Korea:
    Fieldwork: The South Korea survey was conducted during the month of February 2003.
    Fieldwork was undertaken by regularly employed interviewers of the Garam Research Institute. Each interviewer participated in a one-day orientation session and completed three trial interviews.
    If no one was at home at a household or if the adult selected for interview was not at home, the interviewer was instructed to call back two times a day. The mean length of interviews was 60 minutes; the range was from 50 to 90 minutes.

    Mongolia:
Fieldwork: The survey was conducted face-to-face with respondents aged 18 years and above. The number of respondents was based on the list of participants in the 2000 parliamentary elections.

    Singapore:
    (3) Training
Training was conducted for each and every interviewer involved in this project. They were made familiar with the questionnaire, and mock interviews were conducted to test their understanding of the questions. All interviewers were required to present the first three pieces of their work for a '1st Check'. Only if these three pieces were totally error-free were they allowed to proceed with the study. If there were errors in the questionnaires, they were required to go back to the respondent, rectify the error, and do another three pieces for a '2nd Check'. This process continued until the interviewer's work was free of errors.
    (1) After each interview, the interviewer was asked to go over his own work and check for consistency.
    (2) Office editors conducted final consistency checks on all questionnaires prior to coding.
    (3) Logic checks were also put in place during data processing.
Fieldwork: Sixty interviewers in total were used for this survey, deployed to locations all over the island. Fieldwork supervision was conducted to observe the interviewers at work so as to maintain the quality of fieldwork.

    Vietnam:
    (3) Training
(a) Training time - IHS staff were trained for fieldwork. The minimum training time for group supervisors and interviewers was four days prior to field implementation. The third day was the start-off, where the field supervisor observed the field interviewers on their first interviews.
(b) Training Activities - These mainly consisted of one or two days of office training to learn the basics of the project, plus mock interviews among participants, i.e. field interviewers interviewing field anchors as respondents, to get accustomed to the flow of interviewing and the questionnaire format. Interviews were practiced with a supervisor present until the interviewer could be left on her own.
(c) Evaluation of interviewer's work - All first interviews of each field interviewer were observed by her field supervisor and then evaluated. Only after meeting certain evaluation criteria was an interviewer left to interview on her own, although her field supervisor always stayed within the vicinity of the sample spot to conduct checks.
    (1) After each interview, the interviewer was asked to go over her own work and check for consistency.
    (2) All accomplished interview schedules were submitted to the assigned group supervisor who, in turn, edited every interview.

    Fieldwork:
    (1) Workers on Hand
For this project, a total of 30 field staff were deployed:
Field Manager = 1
Red River Delta: Field Anchors = 2, Field Interviewers = 6
North East: Field Anchors = 1, Field Interviewers = 3
North West: Field Anchors = 1, Field Interviewers = 2
North Central Coast: Field Anchors = 1, Field Interviewers = 5
South Central Coast: Field Anchors = 1, Field Interviewers = 2
Central Highlands: Field Anchors = 1, Field Interviewers = 3
South East: Field Anchors = 1, Field Interviewers = 4
Mekong Delta: Field Anchors = 1, Field Interviewers = 5
    (2) Supervision
Supervisors reporting to the field manager monitored the study full-time. They observed interviewers (at least 10% of the total interviews were observed by supervisors), conducted follow-ups, and did surprise checks on the field interviewers. They also ensured that field logistics were received promptly and administered properly.
    (3) Spot-checking
    Part of quality control was to make sure at least 30% of each interviewer’s output was spot-checked and back-checked. Once an incomplete or inconsistent answer was spotted in the questionnaire, the field interviewer went back to the respondent’s house to re-ask the question for verification.

Dates of data collection for individual countries can be found at http://www.jdsurvey.net/gbs/GBSTechnical.jsp

    Data processing

    Data Editing

    Indonesia:
    (c) Data Processing
    (1) Office editors conducted final consistency checks on all interviews prior to coding.
    (2) A data entry computer program verified and checked the consistency of the encoded data before data tables were generated.

    Data appraisal

    Estimates of Sampling Error

    Argentina: Estimated Error: 3%
    Benin: Estimated Error: +/- 3% with 95% confidence level
    Bolivia: Estimated Error: 2.8%
    Botswana: Estimated Error: +/- 3% with 95% confidence level.
Brazil: Estimated Error: 2.8%
    Cape Verde: Estimated Error: +/- 3% with 95% confidence level.
    Colombia: Estimated Error: 3%
    Costa Rica: Estimated Error: 3.1%
    Chile: Estimated Error: 3%
    Ecuador: Estimated Error: 2.8%
    El Salvador: Estimated Error: 3.1%
    Ghana: Estimated Error: +/- 3% with 95% confidence level.
    Guatemala: Estimated Error: 3.1%
    Honduras: Estimated Error: 3.1%

    Indonesia: Estimated Error: ± 2.5 %
Sample Sizes and Error Margins. An indicator of data quality is the standard error of the estimate, on which the margin for sampling error is based. As survey statistics are mostly proportions, the key measure of data precision is the standard error of a proportion taken from a sample. It is computed as follows: ± Z √(p(1-p)/n),
where Z at a 95% confidence level is 1.96, p is the sample proportion estimate, and n is the sample size.
    The overall sample size of 1,600 voting-age adults gives a maximum error margin of ± 2.5 % at the 95% confidence level, assuming a simple random sampling design. The sampling error is at its highest when the true proportion being estimated is close to 50%.
    The following approximate 95%-confidence margins for sampling error should be made when aggregating data at various levels:

Sample Size    Error Margin

Original sample: 1,600    ±2.5%
Actual with substitution: 1,600    ±2.5%
Actual without substitution: 1,440    ±2.7%

    However, somewhat higher error margins should be expected since multi-stage cluster sampling was used; this design-effect is not readily measurable through established statistical software.
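The margin-of-error formula above can be evaluated directly (a minimal sketch; the function name is ours). With n = 1,600 and p = 0.5 it reproduces the ±2.5% figure for Indonesia, and with n = 1,200 the ±2.83% figure quoted for Vietnam further down:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Margin of error for a proportion under simple random sampling:
    z * sqrt(p * (1 - p) / n). Maximized at p = 0.5, z = 1.96 for 95% CI."""
    return z * math.sqrt(p * (1 - p) / n)

# Indonesian sample of 1,600 voting-age adults: about ±2.5%
print(round(margin_of_error(1600) * 100, 2))  # 2.45
# Vietnamese sample of 1,200 voting-age adults: about ±2.83%
print(round(margin_of_error(1200) * 100, 2))  # 2.83
```

As the documentation notes, these figures assume simple random sampling; the multi-stage cluster designs actually used carry a design effect that widens the true margins.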

    Kenya: Estimated Error: +/- 3% with 95% confidence level.
    Lesotho: Estimated Error: +/- 3% with 95% confidence level.
    Madagascar: Estimated Error: +/- 3% with 95% confidence level.
    Malawi: Estimated Error: +/- 3% with 95% confidence level.
    Mozambique: Estimated Error: +/- 3% with 95% confidence level.
    Namibia: Estimated Error: +/- 3% with 95% confidence level.
    Nicaragua: Estimated Error: 3.1%
    Nigeria: Estimated Error: +/- 2% with 95% confidence level.
    Panama: Estimated Error: 3.1%
    Paraguay: Estimated Error: 4%
    Peru: Estimated Error: 2.8%
    Senegal: Estimated Error: +/- 3% with 95% confidence level.
    South Africa: Estimated Error: +/- 2% with 95% confidence level.
    Tanzania: Estimated Error: +/- 3% with 95% confidence level.
    Uganda: Estimated Error: +/- 2% with 95% confidence level.
    Uruguay: Estimated Error: 2.8%
    Venezuela: Estimated Error: 3%
    Vietnam: Estimated Error: ± 2.83 %

    Vietnam:
Sample Sizes and Error Margins. An indicator of data quality is the standard error of the estimate, on which the margin for sampling error is based. As survey statistics are mostly proportions, the key measure of data precision is the standard error of a proportion taken from a sample. It is computed as follows: ± Z √(p(1-p)/n),
where Z at the 95% confidence level is 1.96, p is the sample proportion estimate, and n is the sample size. The overall sample size of 1,200 voting-age adults gives a maximum error margin of ± 2.83 % at the 95% confidence level, assuming a simple random sampling design. The sampling error is at its highest when the true proportion being estimated is close to 50%.
    The following approximate 95%-confidence margins for sampling error should be made when aggregating data at various levels:

Sample Size    Error Margin

Vietnam (national): 1,200    ±3%
Red River Delta: 234    ±6%
North East: 234    ±6%
North West: 36    ±6%
North Central Coast: 156    ±6%
South Central Coast: 102    ±6%
Central Highlands: 78    ±6%
South East: 168    ±6%
Mekong Delta: 252    ±6%

    However, somewhat higher error margins should be expected since multi-stage cluster sampling was used; this design-effect is not readily measurable through established statistical software.

    Zambia: Estimated Error: +/- 3% with 95% confidence level.
    Zimbabwe: Estimated Error: +/- 3% with 95% confidence level.

    Data Access

    Access authority
    Name URL Email
    ASEP/JDS http://www.jdsurvey.net/gbs/gbs.jsp asep.jds@jdsurvey.net
    Access conditions

    Dataset is available to download from external repository at http://www.jdsurvey.net/gbs/GBSData.jsp

    Citation requirements

Use of the dataset must be acknowledged with a citation that includes:

    • the Identification of the Primary Investigator
    • the title of the survey (including country, acronym and year of implementation)
    • the survey reference number
    • the source and date of download

    Example:

    Department of Political Science, National Taiwan University. Global Barometer Survey (GBS) 2001-2008, Ref. WLD_2001_GBS_v01_M. Dataset downloaded from [url] on [date].

    Disclaimer and copyrights

    Disclaimer

    The user of the data acknowledges that the original collector of the data, the authorized distributor of the data, and the relevant funding agency bear no responsibility for use of the data or for interpretations or inferences based upon such uses.

    Contacts

    Contacts
    Name Affiliation Email
    Yu-tzung Chang Department of Political Science, National Taiwan University yutzung@ntu.edu.tw
    Ms Kai-Ping Huang Department of Political Science, National Taiwan University asianbarometer@ntu.edu.tw
    ASEP/JDS asep.jds@jdsurvey.net

    Metadata production

    DDI Document ID

    DDI_WLD_2001_GBS_v02_M

    Producers
    Name Affiliation Role
    Development Data Group The World Bank Documentation of the DDI
    Date of Metadata Production

    2013-03-01

    Metadata version

    DDI Document version

    DDI Document - Version 02 - (04/21/21)
This version is identical to DDI_WLD_2001_GBS_v01_M, but the country field has been updated to capture all the countries covered by the survey.

    Version 01 (March 2013)
