
Survey Analysis Report Example: What to Include and How to Present


Surveys provide invaluable first-hand data on customer opinions, but raw survey results alone don’t tell the full story. To derive actionable insights, you need to analyze the data and present it effectively through a survey analysis report.

This article will walk through creating an effective survey analysis report. You’ll learn:

  • Key elements to include, such as objectives, methodology, data visualizations, conclusions, and recommendations.
  • Visualization best practices to present data clearly.
  • A step-by-step breakdown of every element of a survey analysis report, illustrated with an example.

Understanding the Essentials of a Survey Analysis Report

Before we dive into the example, let’s establish a clear understanding of what a survey analysis report should encompass:

1. Define the Survey Goals and Methodology

The survey analysis report should begin by recapping the goals and methodology. Remind readers why the survey was conducted and how you gathered the data.

Clearly state the purpose, such as:

  • Evaluate customer satisfaction
  • Assess brand awareness
  • Gauge product demand

Explain how and when the survey was administered. Share details like:

  • Survey length – e.g. 20 questions
  • Mode – online, email, phone, in-person
  • Dates fielded – e.g. July 1 – July 15, 2022
  • Number and source of respondents

This context helps readers interpret the findings.

2. Present Key Data Insights Up Front

Don’t bury the most important survey findings. Highlight key takeaways upfront through an executive summary or overview.

Use data visualizations like charts to showcase crucial insights. For example, include:

  • A satisfaction score breakdown showing percentages of satisfied, neutral, and dissatisfied respondents.
  • A column chart of the most frequent survey responses.
  • A comparison chart on satisfaction ratings across customer segments.

Summarize the most actionable findings. This executive view prepares readers to dive into the analysis details.

3. Analyze Closed-Ended Survey Questions


Closed-ended questions with set response options are easier to quantify and analyze. Review response rates for each option.

For rating questions like satisfaction scales, calculate the weighted average. Compare averages across customer groups with charts.

Use frequency tables to show counts and percentages of respondents choosing each option. Filter by respondent segments for deeper analysis.

Statistical testing can reveal significant differences in responses across segments. Include relevant statistical analysis to back conclusions.
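These steps can be sketched in Python with the standard library alone (the ratings and segment labels below are invented for illustration; real data would come from your survey export):

```python
from collections import Counter
from statistics import mean

# Hypothetical responses to a 1-5 satisfaction question, tagged by segment
responses = [
    {"segment": "new", "rating": 5}, {"segment": "new", "rating": 4},
    {"segment": "new", "rating": 3}, {"segment": "returning", "rating": 2},
    {"segment": "returning", "rating": 4}, {"segment": "returning", "rating": 5},
]

# Average rating overall and per customer segment
overall = mean(r["rating"] for r in responses)
by_segment = {
    seg: mean(r["rating"] for r in responses if r["segment"] == seg)
    for seg in {r["segment"] for r in responses}
}

# Frequency table: count and percentage of respondents per option
counts = Counter(r["rating"] for r in responses)
freq_table = {
    rating: (n, round(100 * n / len(responses), 1))
    for rating, n in sorted(counts.items())
}

print(round(overall, 2))  # 3.83
print(by_segment)         # per-segment averages
print(freq_table)         # {2: (1, 16.7), 3: (1, 16.7), 4: (2, 33.3), 5: (2, 33.3)}
```

The same shape extends naturally to significance testing (e.g. a chi-square test on the segment-by-option counts) once a statistics library is available.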

4. Uncover Themes in Open-Ended Feedback

For open-ended questions, use qualitative analysis to uncover themes. Group similar ideas and comments into categories using coding.

Text analysis tools can automatically detect top themes and sentiments in open-ended feedback. Include keyword summaries or representative sample quotes.

Watch for trends and correlations, like dissatisfied customers frequently mentioning certain issues.

Compare themes across customer segments, such as complaints about technical issues primarily coming from older respondents.

Prioritize addressing feedback from key customer groups, even if it is not the most frequent.
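The coding step above can be sketched with a simple keyword lookup (the themes, keywords, and comments are invented for illustration; real coding schemes are usually built by reading a sample of responses first):

```python
from collections import Counter

# Hand-built coding scheme: each theme maps to signal keywords
themes = {
    "support": ["support", "agent", "wait"],
    "pricing": ["price", "expensive", "cost"],
    "shipping": ["delivery", "shipping", "late"],
}

comments = [
    "Waited 40 minutes for support, very frustrating",
    "Great product but too expensive for what it does",
    "Delivery was late twice this month",
    "Support agent resolved my issue quickly",
]

def code_comment(text):
    """Tag a comment with every theme whose keywords appear in it."""
    lowered = text.lower()
    return [t for t, kws in themes.items() if any(k in lowered for k in kws)]

# Count how often each theme appears across all comments
theme_counts = Counter(t for c in comments for t in code_comment(c))
print(theme_counts.most_common())  # "support" is the top theme here
```

Dedicated text-analysis tools automate this, but the principle is the same: map free text onto a small set of categories you can count and compare across segments.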

5. Put Survey Findings in Context

Don’t just present the survey data at face value. Provide context around findings to make them meaningful.

Benchmark results against competitors or industry standards. Show if ratings and metrics are above or below benchmarks.

Point out statistically significant differences between segments.

Compare results to previous surveys to determine trends over time.

Link findings to broader organizational goals and business metrics. Demonstrate their real-world impact.

Thorough analysis and commentary transform raw survey data into insightful revelations.

6. Present Visually Appealing Data Visualizations

Replace dense survey data tables with appealing charts and graphs. Well-designed visuals make findings more accessible and memorable.

Use simple, easily interpreted charts like bar and column graphs to show response distributions, averages, and comparisons. Reserve complex charts for technical presentations.

Choose colors, layouts, and styles that fit your organization’s brand for professional data visuals.

Follow best practices for effective visuals, like consistent scales, legible text, clear labels, and concise chart titles.

Let the data shine by keeping visuals simple. Avoid over-embellishment and clutter.
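As a concrete sketch of these practices using matplotlib (the satisfaction percentages are invented for illustration):

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Hypothetical satisfaction breakdown to visualize
labels = ["Satisfied", "Neutral", "Dissatisfied"]
shares = [72, 16, 12]  # percentages (illustrative)

fig, ax = plt.subplots(figsize=(6, 4))
ax.bar(labels, shares)

# Best-practice touches: concise title, labeled axis, consistent scale
ax.set_title("Customer Satisfaction, Q2")
ax.set_ylabel("Share of respondents (%)")
ax.set_ylim(0, 100)  # keep the same scale across related charts

# Direct data labels are often clearer than a legend for a simple bar chart
for i, v in enumerate(shares):
    ax.text(i, v + 2, f"{v}%", ha="center")

fig.savefig("satisfaction_breakdown.png", dpi=150, bbox_inches="tight")
```

Fixing the y-axis at 0-100% means two charts from different segments or quarters can be compared at a glance without re-reading the scale.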

7. Provide Conclusions and Recommendations

Wrap up the survey analysis report by summarizing conclusions and providing recommendations.

Summarize the key takeaways from the survey in relation to the original objectives. Highlight the most important findings revealed through the analysis.

Offer data-driven recommendations on actions to take based on conclusions. For example, suggest:

  • Improving low-rated experiences based on feedback.
  • Increasing marketing in regions showing high demand.
  • Conducting further research into unclear dynamics the survey uncovered.

Clear recommendations give stakeholders direction on how to apply the survey insights.

Following this survey analysis report structure will showcase your data in the best light. Pairing compelling data presentation with insightful analysis ensures your findings achieve maximum impact.

Now, let’s examine each section of a survey analysis report using a comprehensive example.

Survey Analysis Report Example: Customer Satisfaction Survey


1. Title and Introduction

Title: Customer Satisfaction Survey Analysis – Q2 2023

Introduction:

Background: In an effort to enhance customer experience and loyalty, XYZ Corporation conducted a customer satisfaction survey during the second quarter of 2023. This report aims to analyze the survey findings and provide actionable insights.

Objectives: The survey sought to assess overall satisfaction, identify areas for improvement, and measure the effectiveness of recent service enhancements.

Methodology: We collected responses from 1,500 customers through online surveys, with a focus on post-service interactions.

2. Survey Results

Data Presentation:

  • Overall Satisfaction : (Bar chart displaying satisfaction ratings)
  • Service Attributes : (Pie chart showing the distribution of satisfaction across key service attributes)

Key Findings:

  • Overall satisfaction remains high, with 88% of respondents reporting satisfaction levels of 4 or 5 on a 5-point scale.
  • Timeliness of service delivery emerged as the most critical attribute, with 92% of customers expressing satisfaction.

Detailed Analysis:

While satisfaction levels are positive, it’s crucial to examine the comments provided by dissatisfied customers. Several highlighted difficulties in reaching customer support, indicating room for improvement in communication channels.

3. Discussion and Interpretation

Contextualization:

Comparing our current satisfaction levels with industry benchmarks, we are ahead by 10%. This suggests that our recent service enhancements have had a positive impact.

Implications:

The high satisfaction with service attributes highlights our strengths. However, the challenges in reaching customer support demand immediate attention to maintain our overall positive reputation.

4. Recommendations

  • Streamline customer support channels, ensuring quicker response times.
  • Leverage customer feedback for targeted training programs to address areas identified by dissatisfied customers.

5. Conclusion

In conclusion, while our customer satisfaction levels are commendable, there are areas requiring our focus. By addressing the issues related to customer support and continuing our efforts to enhance service timeliness, we can further elevate the customer experience.

6. Appendices

Appendix A: Survey Questionnaire

(Include the full survey questionnaire for reference)

Appendix B: Raw Data

(Provide the raw survey data for transparency)

7. References

(Cite any external sources or references used in the report)

This example illustrates a comprehensive survey analysis report, but remember that each report may vary depending on the survey’s focus and objectives. Use this template as a guide to craft your reports effectively, ensuring that your organization not only collects data but also acts upon it to drive improvements.


How to make a survey report: A guide to analyzing and detailing insights

March 8, 2024

The survey report: Meaning and importance


In today’s data-driven world, surveys are indispensable tools for gathering valuable intelligence and making informed decisions. Whether conducting market research, gauging customer satisfaction, or gathering employee feedback, the key to unlocking a survey’s true potential lies in the subsequent analysis and reporting of the collected data.

Whether you’re new to surveys or a seasoned researcher, mastering analysis and reporting is essential. To unlock the full potential of surveys, one must learn how to make a survey report.

Before diving into the process, let’s clarify what a survey report is and why it’s crucial. It is a structured document that presents the findings, analysis, and conclusions derived from survey data. It serves as a means to communicate the insights obtained from the survey to stakeholders, enabling them to make informed decisions.

The importance of a survey report cannot be overstated. It provides a comprehensive overview of collected data, allowing stakeholders to gain a deeper understanding of the subject matter. Additionally, it serves as a reference point for future decision-making and strategy development, ensuring that actions are based on sound evidence rather than assumptions.

Key components of a survey report

Start a survey report with a brief overview of the purpose of the survey, its objectives, and the methodology used for data collection. This sets the context for the rest of the report and helps readers understand the scope of the survey.

The introduction serves as the roadmap that guides readers through the document and provides essential background information. It should answer questions such as why the survey was conducted, who the target audience was, and how the data was collected. By setting clear expectations upfront, the groundwork is laid for a coherent and compelling report.

Present the key findings of the survey in a clear and organized manner. Use charts, graphs, and tables to visualize the data effectively. Ensure that the findings are presented in a logical sequence, making it easy for readers to follow the narrative.

The survey findings section is the heart of the report, where the raw data collected during the survey is presented. It’s essential to organize the findings in a way that is easy to understand and digest. Visual aids such as charts, graphs, and tables can help illustrate trends and patterns in the data, making it easier for readers to grasp the key insights.

Dive deeper into the survey data by analyzing trends, patterns, and correlations. Provide insights into what the data means and why certain trends may be occurring. The findings must be interpreted in the context of the survey’s objectives and any relevant background information.

Analysis and interpretation are where the real value of the survey report lies. This is where surface-level findings are moved beyond to uncover the underlying meaning behind the data. By digging deeper to provide meaningful insights, stakeholders gain a deeper understanding of the issues at hand and identify potential opportunities for action.

Based on the analysis, offer actionable recommendations or suggestions that address the issues identified in the survey. These recommendations should be practical, feasible, and tied directly to the survey findings.

The recommendations section is where insights are translated into action. It’s not enough to simply present the findings—clear guidance on what steps should be taken next must be provided. Recommendations should be specific, actionable, and backed by evidence from the survey data. Such practical guidance empowers stakeholders to make informed decisions that drive positive change.

Don’t forget to summarize the key findings, insights, and recommendations presented in the report. Reinforce the importance of the survey results and emphasize how they can be used to drive decision-making.

The conclusion serves as a final wrap-up, summarizing the key takeaways and reinforcing the importance of the findings. It’s an opportunity to remind stakeholders of the survey’s value and how the results can be used to inform decision-making and drive positive change. By ending on a strong note, readers have a clear understanding of the significance of the survey and the actions that need to be taken moving forward.

Best practices for survey report writing

In addition to understanding the key components of a survey report, it’s essential to follow best practices when writing and presenting findings. Here are some tips to ensure that a survey report is clear, concise, and impactful.

Avoid technical jargon or overly complex language that may confuse readers. Instead, use clear and straightforward wording that is easily understood by the target audience.

Organize a survey report into clearly defined sections:

  • Introduction.
  • Survey findings.
  • Analysis and interpretation.
  • Recommendations.
  • Conclusion.

This helps readers navigate the document and find needed information quickly.

Always provide the background of findings by explaining the significance of the survey objectives and how the data relates to the broader goals of the organization or project.

Charts, graphs, and tables can help illustrate key findings and trends in the data. Use them sparingly and ensure they are properly labeled and explained in the text.

When referencing external sources or previous research, be sure to cite them properly. This adds credibility to the findings and allows readers to explore the topic further if they wish.

Before finalizing a survey report, take the time to proofread and edit it for grammar, spelling, and formatting errors. A polished and professional-looking report reflects positively on your work and enhances its credibility.

Following these best practices ensures that a survey report effectively communicates findings and insights to stakeholders, empowering them to make informed decisions based on the data collected.

Short survey report example

To illustrate the process, let’s consider a hypothetical short survey report example:

The purpose of this survey was to gather feedback from customers regarding their satisfaction with our products and services. The survey was conducted online and received responses from 300 participants over a two-week period.

  • 85% of respondents reported being satisfied with the quality of our products.
  • 70% indicated that they found our customer service to be responsive and helpful.
  • The majority of respondents cited price as the primary factor influencing their purchasing decisions.

The high satisfaction ratings suggest that our products meet the expectations of our customers. However, the feedback regarding pricing indicates a potential area for improvement. By analyzing the data further, we can identify opportunities to adjust pricing strategies or offer discounts to better meet customer needs.

Based on the survey findings, we recommend conducting further market research to better understand pricing dynamics and competitive positioning. Additionally, we propose exploring initiatives to enhance the overall value proposition for our products and services.

The survey results provide valuable insights into customer perceptions and preferences. By acting on these findings, we can strengthen our competitive position and drive greater customer satisfaction and loyalty.

Creating a survey report involves more than just presenting data; it requires careful analysis, interpretation, and meaningful recommendations. By following the steps outlined in this guide and using a survey report example like the one provided, you can effectively communicate survey findings and empower decision-makers to take action based on valuable insights.

Ready to turn survey insights into actionable results? Try SurveyPlanet, our powerful survey tool designed to streamline survey creation, data collection, and analysis. Sign up now for a free trial and experience the ease and efficiency of gathering valuable feedback with SurveyPlanet. Your journey to informed decision-making starts here!


A Comprehensive Guide to Survey Research Methodologies

For decades, researchers and businesses have used survey research to produce statistical data and explore ideas. The survey process is simple: ask questions and analyze the responses to make decisions. Data is what makes the difference between a valid and an invalid statement, and as the American statistician W. Edwards Deming said:

“Without data, you’re just another person with an opinion.” - W. Edwards Deming

In this article, we will discuss what survey research is, its brief history, types, common uses, benefits, and the step-by-step process of designing a survey.

What is Survey Research?

A survey is a research method that is used to collect data from a group of respondents in order to gain insights and information regarding a particular subject. It’s an excellent method to gather opinions and understand how and why people feel a certain way about different situations and contexts.

Brief History of Survey Research

Survey research may have its roots in the American and English “social surveys” conducted around the turn of the 20th century. The surveys were mainly conducted by researchers and reformers to document the extent of social issues such as poverty. (1) Despite being a relatively young field compared to many scientific domains, survey research has experienced three stages of development (2):

  • First Era (1930-1960)
  • Second Era (1960-1990)
  • Third Era (1990 onwards)

Over the years, survey research adapted to the changing times and technologies. By exploiting the latest technologies, researchers can gain access to the right population from anywhere in the world, analyze the data like never before, and extract useful information.

Survey Research Methods & Types

Survey research can be classified into seven categories based on objective, concept testing, data source, research method, deployment method, distribution, and frequency of deployment.


Surveys based on Objective

Exploratory Survey Research

Exploratory survey research is aimed at diving deeper into research subjects and finding out more about their context. It’s important for marketing or business strategy, and the focus is on discovering ideas and insights rather than gathering statistical data.

Generally, exploratory survey research is composed of open-ended questions that allow respondents to express their thoughts and perspectives. The final responses present information from various sources that can lead to fresh initiatives.

Predictive Survey Research

Predictive survey research is also called causal survey research. It’s preplanned, structured, and quantitative in nature. It’s often referred to as conclusive research as it tries to explain the cause-and-effect relationship between different variables. The objective is to understand which variables are causes and which are effects and the nature of the relationship between both variables.

Descriptive Survey Research

Descriptive survey research is largely observational and is ideal for gathering numeric data. Due to its quantitative nature, it’s often compared to exploratory survey research. The difference between the two is that descriptive research is structured and pre-planned.

The idea behind descriptive research is to describe the mindset and opinions of a particular group of people on a given subject. The questions are everyday multiple-choice questions, and users must choose from predefined categories. With predefined choices you don’t get unique insights; rather, you get statistically inferable data.

Survey Research Types based on Concept Testing

Monadic Concept Testing

Monadic testing is a survey research methodology in which the respondents are split into multiple groups, and each group is asked questions about a separate concept in isolation. Generally, monadic surveys are hyper-focused on a particular concept and shorter in duration. The important thing in monadic surveys is to avoid getting off-topic or exhausting the respondents with too many questions.

Sequential Monadic Concept Testing

Another approach to monadic testing is sequential monadic testing. In sequential monadic surveys, groups of respondents are surveyed in isolation. However, instead of surveying three groups on three different concepts, the researchers survey the same groups of people on three distinct concepts one after another. In a sequential monadic survey, at least two topics are included (in random order), and the same questions are asked for each concept to eliminate bias.

Based on Data Source

Primary Data

Data obtained directly from the source or target population is referred to as primary survey data. When it comes to primary data collection, researchers usually devise a set of questions and invite people with knowledge of the subject to respond. The main sources of primary data are interviews, questionnaires, surveys, and observation methods.

 Compared to secondary data, primary data is gathered from first-hand sources and is more reliable. However, the process of primary data collection is both costly and time-consuming.

Secondary Data

Survey research is generally used to collect first-hand information from respondents. However, surveys can also be designed to collect and process secondary data: data collected from third-party sources, or from primary sources in the past.

 This type of data is usually generic, readily available, and cheaper than primary data collection. Some common sources of secondary data are books, data collected from older surveys, online data, and data from government archives. Beware that you might compromise the validity of your findings if you end up with irrelevant or inflated data.

Based on Research Method

Quantitative Research

Quantitative research is a popular research methodology that is used to collect numeric data in a systematic investigation. It’s frequently used in research contexts where statistical data is required, such as sciences or social sciences. Quantitative research methods include polls, systematic observations, and face-to-face interviews.

Qualitative Research

Qualitative research is a research methodology where you collect non-numeric data from research participants. In this context, the participants are not restricted to a specific system and provide open-ended information. Some common qualitative research methods include focus groups, one-on-one interviews, observations, and case studies.

Based on Deployment Method

Online Surveys

With technology advancing rapidly, the most popular method of survey research is an online survey. With the internet, you can not only reach a broader audience but also design and customize a survey and deploy it from anywhere. Online surveys have outperformed offline survey methods as they are less expensive and allow researchers to easily collect and analyze data from a large sample.

Paper or Print Surveys

As the name suggests, paper or print surveys use the traditional paper and pencil approach to collect data. Before the invention of computers, paper surveys were the survey method of choice.

Though many would assume that surveys are no longer conducted on paper, it's still a reliable method of collecting information during field research and data collection. However, unlike online surveys, paper surveys are expensive and require extra human resources.

Telephonic Surveys

Telephonic surveys are conducted over telephones, where a researcher asks a series of questions to the respondent on the other end. Contacting respondents over a telephone requires less effort and fewer human resources, and is less expensive.

What makes telephonic surveys debatable is that people are often reluctant to give information over a phone call. Additionally, the success of such surveys depends largely on whether people are willing to invest their time in a phone call answering questions.

One-on-one Surveys

One-on-one surveys, also known as face-to-face surveys, are interviews where the researcher and respondent meet in person. Interacting directly with the respondent introduces the human factor into the survey.

Face-to-face interviews are useful when the researcher wants to discuss something personal with the respondent. The response rates in such surveys are always higher as the interview is being conducted in person. However, these surveys are quite expensive and the success of these depends on the knowledge and experience of the researcher.

Based on Distribution

Email Surveys

The easiest and most common way of conducting online surveys is sending out an email. Sending out surveys via email yields a higher response rate, as your target audience already knows about your brand and is likely to engage.

Buy Survey Responses

Purchasing survey responses also yields higher response rates, as the respondents signed up for the survey. Businesses often purchase survey samples to conduct extensive research. Here, the target audience is often pre-screened to check whether they’re qualified to take part in the research.

Embedding Survey on a Website

Embedding surveys on a website is another excellent way to collect information. It allows your website visitors to take part in a survey without ever leaving the website and can be done while a person is entering or exiting the website.

Post the Survey on Social Media

Social media is an excellent medium for reaching a broad range of audiences. You can publish your survey as a link on social media, and people who follow the brand can take part and answer questions.

Based on Frequency of Deployment

Cross-Sectional Studies

Cross-sectional studies are administered to a small sample from a large population within a short period of time. This provides researchers a peek into what the respondents are thinking at a given time. The surveys are usually short, precise, and specific to a particular situation.

Longitudinal Surveys

Longitudinal surveys are an extension of cross-sectional studies where researchers make an observation and collect data over extended periods of time. This type of survey can be further divided into three types:

  • Trend surveys allow researchers to understand the change in the thought process of the respondents over time.
  • Panel surveys are administered to the same group of people over multiple years. These are usually expensive, and researchers must stick to their panel to gather unbiased opinions.
  • In cohort surveys, researchers identify a specific category of people and regularly survey them. Unlike panel surveys, the same people do not need to take part over the years, but each individual must fall into the researcher’s primary interest category.

Retrospective Survey

Retrospective surveys allow researchers to ask questions that gather data about respondents’ past events and beliefs. They are similar to longitudinal surveys in the time span they cover, except that retrospective surveys are shorter and less expensive.

Why Should You Conduct Research Surveys?

“In God we trust. All others must bring data” - W. Edwards Deming

In the information age, survey research is of utmost importance and essential for understanding the opinion of your target population. Whether you’re launching a new product or conducting a social survey, surveys can be used to collect specific information from a defined set of respondents. The data collected via surveys can then be used by organizations to make informed decisions.

Furthermore, compared to other research methods, surveys are relatively inexpensive even if you’re giving out incentives. Compared to the older methods such as telephonic or paper surveys, online surveys have a smaller cost and the number of responses is higher.

What makes surveys useful is that they describe the characteristics of a large population. With a larger sample size, you can rely on getting more accurate results. However, you also need honest and open answers for accurate results. Since surveys are anonymous and responses remain confidential, respondents are more likely to provide candid and accurate answers.
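The relationship between sample size and accuracy can be made concrete with a margin-of-error calculation for a proportion (a standard normal approximation; the 85% figure is illustrative):

```python
from math import sqrt

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p in a sample of n."""
    return z * sqrt(p * (1 - p) / n)

# 85% satisfied: how precise is that estimate at different sample sizes?
small = round(100 * margin_of_error(0.85, 300), 1)   # in percentage points
large = round(100 * margin_of_error(0.85, 1500), 1)

print(small)  # 4.0
print(large)  # 1.8
```

Quintupling the sample here roughly halves the uncertainty, which is why larger (honest) samples yield more reliable results.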

Common Uses of a Survey

Surveys are widely used in many sectors, but the most common uses of the survey research include:

  • Market research: surveying a potential market to understand customer needs, preferences, and market demand.
  • Customer satisfaction: finding out your customers’ opinions about your services, products, or company.
  • Social research: investigating the characteristics and experiences of various social groups.
  • Health research: collecting data about patients’ symptoms and treatments.
  • Politics: evaluating public opinion regarding policies and political parties.
  • Psychology: exploring personality traits, behaviors, and preferences.

6 Steps to Conduct Survey Research

An organization, person, or company conducts a survey when they need the information to make a decision but have insufficient data on hand. Following are six simple steps that can help you design a great survey.

Step 1: Objective of the Survey

The first step in survey research is defining an objective. The objective helps you define your target population and samples. The target population is the specific group of people you want to collect data from and since it’s rarely possible to survey the entire population, we target a specific sample from it. Defining a survey objective also benefits your respondents by helping them understand the reason behind the survey.

Step 2: Number of Questions

The number of questions, or the size of the survey, depends on the survey objective. However, it’s important to ensure that there are no redundant queries and that the questions are in a logical order. Rephrased and repeated questions in a survey are almost as frustrating as in real life. For a higher completion rate, keep the questionnaire short so that respondents stay engaged to the very end. The ideal length of an interview is less than 15 minutes. (2)

Step 3: Language and Voice of Questions

While designing a survey, you may feel compelled to use fancy language. However, remember that difficult language is associated with higher survey dropout rates. You need to speak to the respondent in a clear, concise, and neutral manner, and ask simple questions. If your survey respondents are bilingual, then adding an option to translate your questions into another language can also prove beneficial.

Step 4: Type of Questions

In a survey, you can include any type of question, both closed-ended and open-ended. However, opt for the question types that are easiest for respondents to understand and that offer the most value. For example, compared to open-ended questions, people prefer to answer closed-ended questions such as MCQs (multiple-choice questions) and NPS (net promoter score) questions.
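As a concrete example, an NPS question asks for a 0-10 recommendation rating; the score is the share of promoters (9-10) minus the share of detractors (0-6). A sketch with invented ratings:

```python
# Hypothetical 0-10 "how likely are you to recommend us?" ratings
scores = [10, 9, 9, 8, 7, 6, 10, 3, 9, 5]

promoters = sum(s >= 9 for s in scores)   # ratings 9-10
detractors = sum(s <= 6 for s in scores)  # ratings 0-6
nps = round(100 * (promoters - detractors) / len(scores))

print(nps)  # 20
```

Scores of 7-8 (passives) count toward the total but neither add to nor subtract from the score, so NPS ranges from -100 to +100.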

Step 5: User Experience

Designing a great survey is about more than just questions. A lot of researchers underestimate the importance of user experience and how it affects their response and completion rates. An inconsistent, difficult-to-navigate survey with technical errors and poor color choice is unappealing for the respondents. Make sure that your survey is easy to navigate for everyone and if you’re using rating scales, they remain consistent throughout the research study.

Additionally, don’t forget to design a good survey experience for both mobile and desktop users. According to Pew Research Center, nearly half of smartphone users access the internet mainly from their phones, and 14 percent of American adults are smartphone-only internet users. (3)

Step 6: Survey Logic

Last but not least, logic is another critical aspect of the survey design. If the survey logic is flawed, respondents may not continue in the right direction. Make sure to test the logic to ensure that selecting one answer leads to the next logical question instead of a series of unrelated queries.

How to Effectively Use Survey Research with Starlight Analytics

Designing and conducting a survey is almost as much science as it is art. To craft great survey research, you need technical skill, attention to the psychological elements at play, and a broad understanding of marketing.

The ultimate goal of the survey is to ask the right questions in the right manner to acquire the right results.

Bringing a new product to the market is a long process and requires a lot of research and analysis. In your journey to gather information or ideas for your business, Starlight Analytics can be an excellent guide. Starlight Analytics' product concept testing helps you measure your product's market demand and refine product features and benefits so you can launch with confidence. The process starts with custom research to design the survey according to your needs, execute the survey, and deliver the key insights on time.

  1. Survey research in the United States: roots and emergence, 1890–1960. https://searchworks.stanford.edu/view/10733873
  2. How to create a survey questionnaire that gets great responses. https://luc.id/knowledgehub/how-to-create-a-survey-questionnaire-that-gets-great-responses/
  3. Internet/broadband fact sheet. https://www.pewresearch.org/internet/fact-sheet/internet-broadband/



How to Write a Complete Survey Report


Finding ways to encourage a large number of responses to your surveys is an art. But so is analyzing the data in a way that lets you turn it into actionable insights.

Once you’ve done all the hard work of persuading people, be it your customers or employees, to fill out your survey, the last thing you want is for all that important data to go to waste.

This happens when surveyors take the answers at face value. The outcome becomes actionable only when you analyze the survey data.

That’s why it’s so important to formulate a complete survey report.


What is a survey report?

A survey report is a document with important metrics gathered from customer feedback.

The goal of a survey report is to present the data in a full and objective manner. The report presents all the results that were collected.

A complete survey report includes:

  • Completion rates
  • Number of responses
  • Date of last response
  • Survey views
  • Breakdown of answers per survey respondent
  • Breakdown of closed-ended questions

All of these are calculated or broken down for you within the Survicate dashboard.

Survicate provides survey reports

Let’s analyze why these metrics are important and what they tell you.

Completion rate

The completion rate is the number of questions answered divided by the total number of questions in your survey. 

If you have a survey of 12 questions but most respondents only answered 6 of those, you have a completion rate of 50%.
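Using the definition above, the calculation is straightforward; here is a minimal sketch in Python:

```python
def completion_rate(answered: int, total_questions: int) -> float:
    """Questions answered divided by total questions, as defined above."""
    if total_questions == 0:
        raise ValueError("survey has no questions")
    return answered / total_questions

rate = completion_rate(6, 12)  # 6 of 12 questions answered -> 0.5
```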

Depending on the survey tool you use, the completion rate can indicate many things. 

For instance, if most respondents were only asked 6 questions out of 12 because half of the questions were not relevant and were skipped, that’s likely a completion rate you’ll be happy with.

But what if your 50% completion rate results from people skipping questions willfully? It might suggest that you need to improve your survey.

With Survicate, you will see responses from partially completed surveys so you don’t miss out on valuable data. 

Number of responses

You need to know exactly how many people responded to your survey to have enough data to properly analyze your survey results. Beware – some survey tools don’t count individual respondents, only their responses to individual questions.

Hence, it’s important that your survey platform allows you to count how many different people responded, so you can determine whether you have a significant sample size.

How do you determine the survey sample size you need?

This depends on what data you want to analyze – from your entire audience or just those from a chosen segment.

For example, if you are a beauty brand that sells face creams specifically for women over thirty-five, you may find out in your survey that you also have younger women who use your products.

You may decide to segment these responses into separate age groups to obtain the data you want.

So, if you were surveying them on the effectiveness of a new age-defying cream, you may find that women under thirty had very different responses from those in their sixties.

This is the kind of data that you could have overlooked but can help you with your marketing efforts (and will result in a survey report that's pure gold!).
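A sketch of this kind of segmentation, grouping ratings into age buckets (the responses and cut-offs below are invented for illustration):

```python
from collections import defaultdict
from statistics import mean

# Invented responses: (respondent age, effectiveness rating on a 1-5 scale)
responses = [(28, 3), (42, 5), (61, 4), (24, 2), (55, 5), (33, 4)]

def bucket(age: int) -> str:
    """Assign an age segment; the cut-offs are illustrative."""
    if age < 30:
        return "under 30"
    if age < 50:
        return "30-49"
    return "50+"

# Group ratings by segment, then average each segment
by_segment = defaultdict(list)
for age, rating in responses:
    by_segment[bucket(age)].append(rating)

avg_by_segment = {seg: mean(ratings) for seg, ratings in by_segment.items()}
```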

If you are using Survicate, make sure to integrate with a distribution tool that gathers demographic data. You can also include demographic-style questions in your survey.

Date of last response

If you’re running a survey for a short, specific time period, this may not seem important.

Still, if you ask customers to fill out a customer service feedback survey after every ticket is closed, you may get years of data. This can help you figure out whether your customer service team is properly trained.

On the other hand, if you introduce a redesign on your website, develop a new feature, or make some other significant change, a long-term NPS or CSAT survey can show you the impact.

When you are able to determine the response time, you can split your data and analyze responses relevant to each new implementation.

Survey views

You need to know the total number of survey views and the number of unique survey views (total views versus the number of different people who viewed the survey, since some people may have viewed it more than once).

If there is a large disparity between these two totals, this can point to several things.

First, your survey may be targeted at a large audience and the questions aren’t relevant enough for all your respondents to answer.

Respondents may also view the survey and then decide not to take it because:

  • They don’t have the time
  • They don’t have the right device (things like open-ended questions can be difficult and tedious to answer on a small phone screen)
  • They see the first questions and decide that taking the survey isn’t for them

Such insights can let you know whether you need to work on your survey design or customer segmentation.
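As a sketch, the total-versus-unique comparison can be computed from a log of viewer IDs (the IDs here are hypothetical):

```python
def view_stats(view_events: list[str]) -> tuple[int, int, float]:
    """Total views, unique viewers, and views-per-viewer from a log of viewer IDs."""
    total = len(view_events)
    unique = len(set(view_events))
    # views-per-viewer > 1 means some people viewed the survey more than once
    return total, unique, total / unique if unique else 0.0
```

A views-per-viewer ratio well above 1, combined with a low response count, is the kind of disparity described above.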

Breakdown of answers per respondent

You want to see the breakdown per respondent so you can see how individuals answered all the questions in the survey. This can be helpful for spotting trends in certain respondents’ answers.

For example, you may notice a pattern that each person who dealt with a particular customer service agent gave a negative response to your Customer Effort Score (CES) survey.

Then you know you need to train that agent and improve their performance.
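A minimal sketch of spotting such a pattern, using invented CES scores tagged by the (hypothetical) agent who handled each ticket:

```python
from collections import defaultdict
from statistics import mean

# Invented CES responses (1 = very difficult ... 7 = very easy), tagged by agent
ces_responses = [
    ("agent_a", 6), ("agent_a", 7), ("agent_b", 2),
    ("agent_b", 3), ("agent_a", 6), ("agent_b", 2),
]

scores = defaultdict(list)
for agent, score in ces_responses:
    scores[agent].append(score)

avg = {agent: mean(s) for agent, s in scores.items()}
flagged = [agent for agent, m in avg.items() if m < 4]  # below the scale midpoint
```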

Within the “analyze” tab, Survicate allows you to click on any response to view the other answers.

Survicate answer breakdown by respondent

And if you integrate with particular tools like Google Analytics or Intercom, you may even be able to capture demographic data and contact the respondents individually.

Survicate captures demographic data

Breakdown of closed-ended questions

When you think of a survey report, you likely picture graphs and pie charts displaying the data obtained from closed-ended questions.

Survicate single-choice question breakdown

This is important for a good survey report because it allows you to take in a large quantity of data at a glance, and can be easily distributed to those who may find the data valuable.

Graphic representation makes survey analysis user-friendly and doesn’t require a lot of time or prior skills to analyze.

In the example below, we can see the NPS (Net Promoter Score) response breakdown: over 75% of respondents are promoters of our brand, 3.2% are detractors, and we had 800 responses overall. All of this data is plain to see and easy to interpret.

Survicate NPS survey report
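NPS itself is simple to compute from 0–10 scores: the percentage of promoters (scores 9–10) minus the percentage of detractors (scores 0–6). A sketch:

```python
def nps(scores: list[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)
```

Scores of 7 and 8 (passives) count toward the total but toward neither group, which is why they dilute the score without changing its sign.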

Survey report example

If you’re not sure how to present your questionnaire results, choose a survey tool that will prepare a mockup for you. Make sure the software you use doesn’t just spit out rows of data in a spreadsheet. 

Your survey report should present the most important information in a neat and easy-to-understand way so you can draw conclusions quickly. 

With Survicate, you don’t have to create a survey report manually. You get a results summary within the dashboard, with all the most important metrics ready to screengrab.

Create a perfect survey report with Survicate

Depending on the type of survey you run and the questions you ask, you might see the results presented differently. 

NPS survey report example

With Survicate’s NPS survey report, you can see at a glance all the most important stats you need to be aware of. 

From the total response number to the completion rate, you can sort the stats by date and compare how they fluctuated over time. 

See the most important stats at a glance with Survicate

When you run a survey report with Survicate, you will see a breakdown of all the responses in the form of a graph. What’s more, you’ll be able to review how the NPS score changed over time, which can be helpful in trying to identify any issues with your product or service from the users’ perspective.

Create a perfect NPS survey report using Survicate

We recommend you integrate Survicate with Google Sheets to get live updates in spreadsheets. If you never want to miss out on feedback, you can also integrate your Slack or Microsoft Teams with Survicate for convenient notifications. With the click of a single button, you can jump to survey results and even follow up with the respondent.

Survicate can send survey responses to Slack and Microsoft Teams

Create a complete survey report with Survicate

You don’t need a dedicated team to crunch survey insights for you. A great survey platform will organize your respondents’ data into an easy-to-read dashboard and help you start acting on the data you’ve received.

Start creating awesome survey reports with Survicate's intuitive survey tool. It comes with a generous free trial that gives you access to all Business plan features for 10 days. Sign up and start collecting feedback today!


Survey Methodology: How to reach your Audience

A survey methodology is a technique that is carried out by applying a questionnaire to a sample of people. Let's talk about it.

When conducting research and collecting data, it is critical to get accurate information. You also need the right sample size to ensure that you are drawing reliable conclusions. There are currently many ways to reach your target audience. Let’s talk about survey methodology.

The main difference between methodologies is the cost, depending on the type of data you want to collect and the number of responses you need for your specific research. However, respondent behavior has changed over time, and different factors should be considered when choosing your survey methodology.

What is a Survey Methodology?

A survey methodology is a technique that is carried out by applying a questionnaire to a sample of people. Surveys provide information on citizens’ opinions, attitudes, and behaviors, and there’s more than one method to conduct them. In fact, there’s a whole universe to learn from in the surveys world.


Survey Methodology Examples

The most common channels to contact respondents are in-person, telephone, mail, and online. Each of these methods has strengths and weaknesses that should be considered.

In-person surveys

These are conducted with an interviewer meeting with the respondent directly. This survey can be in a panel setting, where respondents meet the interviewers in a central location or where the interviewer goes to a product or service location to find customers interested in completing a survey.

The format allows the customer to give open-ended feedback, which a skilled interviewer can put into quantitative values. The interviewer will be able to explain the content of the survey and read non-verbal cues from the customer. This allows the customer to give additional information to the researcher. The feedback saves time in categorizing and interpreting responses later in the research process.

It is the most intimate survey methodology, which makes it a good option for discussing more sensitive topics. It is also the most expensive option: gathering a large number of responses means covering travel, setting up in a physical location, and training interviewers to manage and record a survey in this way.

Telephone interviews

These surveys are a way to follow the in-person interview format, with less cost for travel and the ability to reach people more easily. One interviewer will be able to contact more respondents by telephone. While the interviewer can’t read visual cues, they will still be able to explain the survey content and collect more open-ended feedback from the respondent.

However, currently, people are less likely to answer the phone for numbers they don’t recognize and are even less likely to give out information to anyone. This will restrict your sample size while still having costs and time for interviewers to attempt to contact people.

Paper surveys by mail

Surveys can be completed at the respondents’ convenience. They are a great way to reach many potential respondents for much less cost, and an easy way to reach people without feeling as intrusive as a phone call could be. Because the contact method is less personal, it is all the more important that the survey content is planned out so that questions are clear to the respondent.

This method requires good address records to ensure they end up in the right person’s hands, and unlike an interview, respondents may put off replying and forget to send back a response. If done correctly, this will result in answers that researchers will be able to use when analyzing the data.

This is an excellent method for primarily quantitative data when you want a large sample size . Since they are less personal and responses are sent back by mail, respondents may also be less willing to reply on more sensitive topics. 

Online survey methodology for collecting information

Online survey methodology is prevalent now. An online survey can reach a wide audience and can be completed at respondents’ convenience. A short survey can even be conducted while they are waiting for a site to load or for some other activity to complete. Similar to email surveys, an online survey has to be written so that a respondent can follow the instructions without an interviewer to help them, and should focus on quantitative feedback.

When collecting data online, the answers can be stored directly in a database, saving the cost of tabulating paper responses. Online surveys are cost-effective compared with paper and telephone methods, and they can be taken anywhere in the world, through the internet as well as offline means.

Online surveys are scalable: you can reach much larger sample sizes than with paper-based or telephone surveys, and they are faster than traditional paper-based surveys. However, depending on its content, a survey may be junked as spam because of the links and keywords it contains, which shrinks your effective sample or target size.

Online surveys are also sometimes ignored simply because respondents are bombarded with them, so important surveys can be missed. Another major disadvantage is their inability to reach people residing in far-off, remote locations with no access to the internet.

Older people who lack internet access or are not proficient with online platforms are likewise difficult to reach with web-based surveys.

An important area where online surveys lag behind is capturing what respondents really feel about the problems mentioned in the survey. Being completely online, there is no way you can notice the facial expressions or body language of the participants. All you have is the responses to decode what customers really feel.


The future of Survey Methodology

Modern technology means that many respondents are already spending more time on devices connected to the internet, which makes it an excellent contact channel for getting survey responses. Since it is cost-effective and efficient to focus on a smaller number of contact channels, it is critical to understand the strengths and weaknesses of online surveys, including how they affect the demographics of your respondents.

Finding the best way to reach your audience online is important to get the responses needed to conduct meaningful research. Ensure that you have proper contact information, whether that is ensuring that companies are keeping accurate email databases or that web portals for online panels are up to date. Pay attention to respondent feedback to ensure that you are keeping their attention long enough to get responses and that they are motivated to give true responses in an impersonal environment.

You will be competing with many other sources that are trying to get your respondents’ attention online, so ensure you are contacting them in a way where it is clear why your survey should be important to them. With market research to know how to best target your audience, you have the opportunity to collect a large amount of data in an efficient manner.


Authors : Devendra Joag & Davin Moloney



Doing Survey Research | A Step-by-Step Guide & Examples

Published on 6 May 2022 by Shona McCombes . Revised on 10 October 2022.

Survey research means collecting information about a group of people by asking them questions and analysing the results. To conduct an effective survey, follow these six steps:

  • Determine who will participate in the survey
  • Decide the type of survey (mail, online, or in-person)
  • Design the survey questions and layout
  • Distribute the survey
  • Analyse the responses
  • Write up the results

Surveys are a flexible method of data collection that can be used in many different types of research .

Table of contents

  • What are surveys used for?
  • Step 1: Define the population and sample
  • Step 2: Decide on the type of survey
  • Step 3: Design the survey questions
  • Step 4: Distribute the survey and collect responses
  • Step 5: Analyse the survey results
  • Step 6: Write up the survey results
  • Frequently asked questions about surveys

What are surveys used for?

Surveys are used as a method of gathering data in many different fields. They are a good choice when you want to find out about the characteristics, preferences, opinions, or beliefs of a group of people.

Common uses of survey research include:

  • Social research: Investigating the experiences and characteristics of different social groups
  • Market research: Finding out what customers think about products, services, and companies
  • Health research: Collecting data from patients about symptoms and treatments
  • Politics: Measuring public opinion about parties and policies
  • Psychology: Researching personality traits, preferences, and behaviours

Surveys can be used in both cross-sectional studies , where you collect data just once, and longitudinal studies , where you survey the same sample several times over an extended period.


Step 1: Define the population and sample

Before you start conducting survey research, you should already have a clear research question that defines what you want to find out. Based on this question, you need to determine exactly who you will target to participate in the survey.

Populations

The target population is the specific group of people that you want to find out about. This group can be very broad or relatively narrow. For example:

  • The population of Brazil
  • University students in the UK
  • Second-generation immigrants in the Netherlands
  • Customers of a specific company aged 18 to 24
  • British transgender women over the age of 50

Your survey should aim to produce results that can be generalised to the whole population. That means you need to carefully define exactly who you want to draw conclusions about.

It’s rarely possible to survey the entire population of your research – it would be very difficult to get a response from every person in Brazil or every university student in the UK. Instead, you will usually survey a sample from the population.

The sample size depends on how big the population is. You can use an online sample calculator to work out how many responses you need.

There are many sampling methods that allow you to generalise to broad populations. In general, though, the sample should aim to be representative of the population as a whole. The larger and more representative your sample, the more valid your conclusions.
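One common way such sample calculators estimate the required number of responses is Cochran's formula with a finite-population correction. The sketch below assumes 95% confidence (z = 1.96), a 5% margin of error, and the conservative response proportion p = 0.5:

```python
import math

def sample_size(population: int, z: float = 1.96,
                margin: float = 0.05, p: float = 0.5) -> int:
    """Cochran's formula with a finite-population correction.

    z=1.96 corresponds to 95% confidence; p=0.5 is the most conservative
    assumption about the response proportion.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2      # infinite-population estimate
    n = n0 / (1 + (n0 - 1) / population)           # finite-population correction
    return math.ceil(n)
```

Note how weakly the result depends on population size: a population of 10,000 needs roughly 370 responses, while even a very large population tops out around 385 under these assumptions.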

Step 2: Decide on the type of survey

There are two main types of survey:

  • A questionnaire , where a list of questions is distributed by post, online, or in person, and respondents fill it out themselves
  • An interview , where the researcher asks a set of questions by phone or in person and records the responses

Which type you choose depends on the sample size and location, as well as the focus of the research.

Questionnaires

Sending out a paper survey by post is a common method of gathering demographic information (for example, in a government census of the population).

  • You can easily access a large sample.
  • You have some control over who is included in the sample (e.g., residents of a specific region).
  • The response rate is often low.

Online surveys are a popular choice for students doing dissertation research , due to the low cost and flexibility of this method. There are many online tools available for constructing surveys, such as SurveyMonkey and Google Forms .

  • You can quickly access a large sample without constraints on time or location.
  • The data is easy to process and analyse.
  • The anonymity and accessibility of online surveys mean you have less control over who responds.

If your research focuses on a specific location, you can distribute a written questionnaire to be completed by respondents on the spot. For example, you could approach the customers of a shopping centre or ask all students to complete a questionnaire at the end of a class.

  • You can screen respondents to make sure only people in the target population are included in the sample.
  • You can collect time- and location-specific data (e.g., the opinions of a shop’s weekday customers).
  • The sample size will be smaller, so this method is less suitable for collecting data on broad populations.

Oral interviews are a useful method for smaller sample sizes. They allow you to gather more in-depth information on people’s opinions and preferences. You can conduct interviews by phone or in person.

  • You have personal contact with respondents, so you know exactly who will be included in the sample in advance.
  • You can clarify questions and ask for follow-up information when necessary.
  • The lack of anonymity may cause respondents to answer less honestly, and there is more risk of researcher bias.

Like questionnaires, interviews can be used to collect quantitative data : the researcher records each response as a category or rating and statistically analyses the results. But they are more commonly used to collect qualitative data : the interviewees’ full responses are transcribed and analysed individually to gain a richer understanding of their opinions and feelings.

Step 3: Design the survey questions

Next, you need to decide which questions you will ask and how you will ask them. It’s important to consider:

  • The type of questions
  • The content of the questions
  • The phrasing of the questions
  • The ordering and layout of the survey

Open-ended vs closed-ended questions

There are two main forms of survey questions: open-ended and closed-ended. Many surveys use a combination of both.

Closed-ended questions give the respondent a predetermined set of answers to choose from. A closed-ended question can include:

  • A binary answer (e.g., yes/no or agree/disagree )
  • A scale (e.g., a Likert scale with five points ranging from strongly agree to strongly disagree )
  • A list of options with a single answer possible (e.g., age categories)
  • A list of options with multiple answers possible (e.g., leisure interests)

Closed-ended questions are best for quantitative research . They provide you with numerical data that can be statistically analysed to find patterns, trends, and correlations .
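A minimal sketch of turning closed-ended answers into such numerical data, using invented agree/disagree responses:

```python
from collections import Counter

# Invented responses to a five-point agree/disagree question
answers = ["agree", "strongly agree", "agree", "neutral", "disagree", "agree"]

# Count each option, then express the counts as percentages of all responses
counts = Counter(answers)
percentages = {option: 100 * n / len(answers) for option, n in counts.items()}
```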

Open-ended questions are best for qualitative research. This type of question has no predetermined answers to choose from. Instead, the respondent answers in their own words.

Open questions are most common in interviews, but you can also use them in questionnaires. They are often useful as follow-up questions to ask for more detailed explanations of responses to the closed questions.

The content of the survey questions

To ensure the validity and reliability of your results, you need to carefully consider each question in the survey. All questions should be narrowly focused with enough context for the respondent to answer accurately. Avoid questions that are not directly relevant to the survey’s purpose.

When constructing closed-ended questions, ensure that the options cover all possibilities. If you include a list of options that isn’t exhaustive, you can add an ‘other’ field.

Phrasing the survey questions

In terms of language, the survey questions should be as clear and precise as possible. Tailor the questions to your target population, keeping in mind their level of knowledge of the topic.

Use language that respondents will easily understand, and avoid words with vague or ambiguous meanings. Make sure your questions are phrased neutrally, with no bias towards one answer or another.

Ordering the survey questions

The questions should be arranged in a logical order. Start with easy, non-sensitive, closed-ended questions that will encourage the respondent to continue.

If the survey covers several different topics or themes, group together related questions. You can divide a questionnaire into sections to help respondents understand what is being asked in each part.

If a question refers back to or depends on the answer to a previous question, they should be placed directly next to one another.

Step 4: Distribute the survey and collect responses

Before you start, create a clear plan for where, when, how, and with whom you will conduct the survey. Determine in advance how many responses you require and how you will gain access to the sample.

When you are satisfied that you have created a strong research design suitable for answering your research questions, you can conduct the survey through your method of choice – by post, online, or in person.

Step 5: Analyse the survey results

There are many methods of analysing the results of your survey. First you have to process the data, usually with the help of a computer program to sort all the responses. You should also cleanse the data by removing incomplete or incorrectly completed responses.

If you asked open-ended questions, you will have to code the responses by assigning labels to each response and organising them into categories or themes. You can also use more qualitative methods, such as thematic analysis , which is especially suitable for analysing interviews.
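Coding is typically done by human analysts, but a keyword lookup can illustrate the idea of assigning category labels to free-text responses. This is a deliberately simplified sketch, and the codebook below is invented:

```python
# An invented codebook mapping keywords to category labels
CODEBOOK = {
    "price": "pricing",
    "expensive": "pricing",
    "slow": "performance",
    "crash": "reliability",
    "support": "customer service",
}

def code_response(text: str) -> set[str]:
    """Assign every matching category label to a free-text response."""
    words = text.lower().split()
    return {label for keyword, label in CODEBOOK.items() if keyword in words}
```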

Statistical analysis is usually conducted using programs like SPSS or Stata. The same set of survey data can be subject to many analyses.

Step 6: Write up the survey results

Finally, when you have collected and analysed all the necessary data, you will write it up as part of your thesis, dissertation , or research paper .

In the methodology section, you describe exactly how you conducted the survey. You should explain the types of questions you used, the sampling method, when and where the survey took place, and the response rate. You can include the full questionnaire as an appendix and refer to it in the text if relevant.

Then introduce the analysis by describing how you prepared the data and the statistical methods you used to analyse it. In the results section, you summarise the key results from your analysis.

A Likert scale is a rating scale that quantitatively assesses opinions, attitudes, or behaviours. It is made up of four or more questions that measure a single attitude or trait when response scores are combined.

To use a Likert scale in a survey, you present participants with Likert-type questions or statements and a continuum of possible responses, usually five or seven, to capture their degree of agreement.

Individual Likert-type questions are generally considered ordinal data, because the items have clear rank order, but don’t have an even distribution.

Overall Likert scale scores are sometimes treated as interval data. These scores are considered to have directionality and even spacing between them.

The type of data determines what statistical tests you should use to analyse your data.
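As a sketch of this distinction, the example below scores an invented five-item Likert scale: medians for the individual (ordinal) items, and a mean across respondents for the combined scale score, which is often treated as interval:

```python
# Sketch of scoring a hypothetical five-item Likert scale
# (1 = strongly disagree ... 5 = strongly agree). Data is invented.

from statistics import mean, median

# rows = respondents, columns = the five items
responses = [
    [4, 5, 3, 4, 4],
    [2, 3, 2, 3, 2],
    [5, 5, 4, 5, 5],
]

item1 = [r[0] for r in responses]
print("item 1 median:", median(item1))         # ordinal items: use the median

scale_scores = [sum(r) for r in responses]     # combined score per respondent
print("mean scale score:", mean(scale_scores)) # interval-like: the mean is common
```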

A questionnaire is a data collection tool or instrument, while a survey is an overarching research method that involves collecting and analysing data from people using questionnaires.


McCombes, S. (2022, October 10). Doing Survey Research | A Step-by-Step Guide & Examples. Scribbr. Retrieved 2 April 2024, from https://www.scribbr.co.uk/research-methods/surveys/


Understanding Survey Methodology: Sociological Theory and Applications

Edited by Philip S. Brenner, Department of Sociology, University of Massachusetts Boston, Boston, USA

  • Applies a sociological lens to survey methodology
  • Draws a road map forward for a cohesive paradigm in the “sociological aspects of survey methodology”
  • Highlights research on a variety of social issues of current interest

Part of the book series: Frontiers in Sociology and Social Research (FSSR, volume 4)



Table of contents (15 chapters)

Front Matter

Why Survey Methodology Needs Sociology and Why Sociology Needs Survey Methodology

Philip S. Brenner

Correction to: Power, Culture and Item Nonresponse in Social Surveys

  • Katharina M. Meitinger, Timothy P. Johnson

Sociological Theory and Survey Methodology

Towards Survey Response Rate Theories That No Longer Pass Each Other Like Strangers in the Night

  • Don A. Dillman

Advancing Theories of Socially Desirable Responding: How Identity Processes Influence Answers to “Sensitive Questions”

Culture and Response Behavior: An Overview of Cultural Mechanisms Explaining Survey Error

  • Henning Silber, Timothy P. Johnson

Translating Lessons from Status Characteristics and Expectation States Theory to Survey Methods

  • Bianca Manago

Applications

Stigma and the Meaning of Social Desirability: Concealed Islamophobia in the Netherlands

  • Mathew J. Creighton

Is Not Knowing the Same as Being Incorrect? An Examination of ‘Don’t Know’ Responses to Questions about Immigrant Population Size

  • Daniel Herda

Power, Culture and Item Nonresponse in Social Surveys

The Measurement of Sexual Attraction and Gender Expression: Cognitive Interviews with Queer Women

  • Dana Garbarski, Dana LaVergne

How Do Interviewers and Respondents Navigate Sexual Identity Questions in a CATI Survey?

  • Jerry Timbrook, Jolene D. Smyth, Kristen Olson

Male/Female Is Not Enough: Adding Measures of Masculinity and Femininity to General Population Surveys

  • Jolene D. Smyth, Kristen Olson

Correlates of Differences in Interactional Patterns among Black and White Respondents

  • Jennifer Dykema, Dana Garbarski, Nora Cate Schaeffer, Isabel Anadon, Dorothy Farrar Edwards

Theories of Public Opinion Change Versus Stability and their Implications for Null Findings

  • Kevin H. Wozniak, Kevin M. Drakulich, Brian R. Calfano

Conclusions and Future Directions for Understanding Survey Methodology

Back Matter

Keywords

  • Applying Sociological Theory
  • Survey Research Methods from A Sociological Lens
  • Measures of Sociological Concepts
  • Sociological Theories and Applications in Survey Research
  • Conversation Analysis and Survey Methodology
  • Meaning of Social Desirability
  • Sexual Attraction and Gender Expression
  • Size of the Immigrant Population
  • Measures of Masculinity and Femininity
  • Research in Sociological Survey Methodology
  • History of Sociology and Survey Research Methods
  • Identity Theory
  • Social Exchange Theory
  • Status Characteristics Theory

Book Title: Understanding Survey Methodology

Book Subtitle: Sociological Theory and Applications

Editors: Philip S. Brenner

Series Title: Frontiers in Sociology and Social Research

DOI: https://doi.org/10.1007/978-3-030-47256-6

Publisher: Springer Cham

eBook Packages: Social Sciences, Social Sciences (R0)

Copyright Information: Springer Nature Switzerland AG 2020

Hardcover ISBN: 978-3-030-47255-9 (published 24 October 2020)

Softcover ISBN: 978-3-030-47258-0 (published 24 October 2021)

eBook ISBN: 978-3-030-47256-6 (published 23 October 2020)

Series ISSN: 2523-3424

Series E-ISSN: 2523-3432

Edition Number: 1

Number of Pages: X, 346

Number of Illustrations: 18 b/w illustrations, 3 illustrations in colour

Topics: Research Methodology; Statistics for Social Sciences, Humanities, Law; Methodology of the Social Sciences



What is survey research?

Find out everything you need to know about survey research, from what it is and how it works to the different methods and tools you can use to ensure you’re successful.

Survey research is the process of collecting data from a predefined group (e.g. customers or potential customers) with the ultimate goal of uncovering insights about your products, services, or brand overall.

As a quantitative data collection method, survey research can provide you with a goldmine of information that can inform crucial business and product decisions. But survey research needs careful planning and execution to get the results you want.

So if you’re thinking about using surveys to carry out research, read on.


Types of survey research

Calling these methods ‘survey research’ slightly underplays the complexity of this type of information gathering. From the expertise required to carry out each activity to the analysis of the data and its eventual application, a considerable amount of effort is required.

As for how you can carry out your research, there are several options to choose from — face-to-face interviews, telephone surveys, focus groups (though these are closer to interviews than surveys), online surveys, and panel surveys.

Typically, the survey method you choose will largely be guided by who you want to survey, the size of your sample, your budget, and the type of information you’re hoping to gather.

Here are a few of the most-used survey types:

Face-to-face interviews

Before technology made it possible to conduct research using online surveys, telephone and mail were the most popular methods for survey research. However, face-to-face interviews were considered the gold standard — the only reason they weren’t as popular was their prohibitively high cost.

When it came to face-to-face interviews, organizations would use highly trained researchers who knew when to probe or follow up on vague or problematic answers. They also knew when to offer assistance to respondents when they seemed to be struggling. The result was that these interviewers could get sample members to participate and engage in surveys in the most effective way possible, leading to higher response rates and better quality data.

Telephone surveys

While phone surveys have been popular in the past, particularly for measuring general consumer behavior or beliefs, response rates have been declining since the 1990s.

Phone surveys are usually conducted using a random dialing system and software that a researcher can use to record responses.

This method is beneficial when you want to survey a large population but don’t have the resources to conduct face-to-face research surveys or run focus groups, or want to ask multiple-choice and open-ended questions.

The downsides: they can take a long time to complete depending on the response rate, and you may have to do a lot of cold-calling to get the information you need.

You also run the risk of respondents not being completely honest; instead, they’ll answer your survey questions quickly just to get off the phone.

Focus groups (interviews — not surveys)

Focus groups are a separate qualitative methodology rather than surveys — even though they’re often bunched together. They’re normally used for survey pretesting and design, but they’re also a great way to generate opinions and data from a diverse range of people.

Focus groups involve putting a cohort of demographically or socially diverse people in a room with a moderator and engaging them in a discussion on a particular topic, such as your product, brand, or service.

They remain a highly popular method for market research, but they’re expensive and require a lot of administration to conduct and analyze the data properly.

You also run the risk of more dominant members of the group taking over the discussion and swaying the opinions of other people — potentially providing you with unreliable data.

Online surveys

Online surveys have become one of the most popular survey methods due to being cost-effective, enabling researchers to accurately survey a large population quickly.

Online surveys can essentially be used by anyone for any research purpose – we’ve all seen the increasing popularity of polls on social media (although these are not scientific).

Using an online survey allows you to ask a series of different question types and collect data instantly that’s easy to analyze with the right software.

There are also several methods for running and distributing online surveys that allow you to get your questionnaire in front of a large population at a fraction of the cost of face-to-face interviews or focus groups.

This is particularly true when it comes to mobile surveys as most people with a smartphone can access them online.

However, you have to be aware of the potential dangers of using online surveys, particularly when it comes to the survey respondents. The biggest risk is that, because online surveys require access to a computer or mobile device to complete, they could exclude elderly members of the population who don’t have access to the technology or don’t know how to use it.

It could also exclude those from poorer socio-economic backgrounds who can’t afford a computer or consistent internet access. This could mean the data collected is more biased towards a certain group and can lead to less accurate data when you’re looking for a representative population sample.


Panel surveys

A panel survey involves recruiting respondents who have specifically signed up to answer questionnaires and who are put on a list by a research company. This could be a workforce of a small company or a major subset of a national population. Usually, these groups are carefully selected so that they represent a sample of your target population — giving you balance across criteria such as age, gender, background, and so on.

Panel surveys give you access to the respondents you need and are usually provided by the research company in question. As a result, it’s much easier to get access to the right audiences as you just need to tell the research company your criteria. They’ll then determine the right panels to use to answer your questionnaire.

However, there are downsides. The main one being that if the research company offers its panels incentives, e.g. discounts, coupons, money — respondents may answer a lot of questionnaires just for the benefits.

This might mean they rush through your survey without providing considered and truthful answers. As a consequence, this can damage the credibility of your data and potentially ruin your analyses.

What are the benefits of using survey research?

Depending on the research method you use, there are lots of benefits to conducting survey research for data collection. Here, we cover a few:

1.   They’re relatively easy to do

Most research surveys are easy to set up, administer, and analyze. As long as the planning and survey design are thorough and you target the right audience, the data collection is usually straightforward regardless of which survey type you use.

2.   They can be cost-effective

Survey research can be relatively cheap depending on the type of survey you use.

Generally, qualitative research methods that require access to people in person or over the phone are more expensive and require more administration.

Online surveys or mobile surveys are often more cost-effective for market research and can give you access to the global population for a fraction of the cost.

3.   You can collect data from a large sample

Again, depending on the type of survey, you can obtain survey results from an entire population at a relatively low price. You can also administer a large variety of survey types to fit the project you’re running.

4.   You can use survey software to analyze results immediately

Using survey software, you can use advanced statistical analysis techniques to gain insights into your responses immediately.

Analysis can be conducted using a variety of parameters to determine the validity and reliability of your survey data at scale.

5.   Surveys can collect any type of data

While most people view surveys as a quantitative research method, they can just as easily be adapted to gain qualitative information by simply including open-ended questions or conducting interviews face to face.

How to measure concepts with survey questions

While surveys are a great way to obtain data, that data on its own is useless unless it can be analyzed and developed into actionable insights.

The easiest and most effective way to measure survey results is to use a dedicated research tool that puts all of your survey results in one place.

When it comes to survey measurement, there are four measurement types to be aware of that will determine how you treat your different survey results:

Nominal scale

With a nominal scale, you can only keep track of how many respondents chose each option from a question, and which response generated the most selections.

An example of this would be simply asking a respondent to choose a product or brand from a list.

You could find out which brand was chosen the most but have no insight as to why.

Ordinal scale

Ordinal scales are used to judge an order of preference. They do provide some level of quantitative value because you’re asking respondents to choose a preference of one option over another.

Ratio scale

Ratio scales can be used to judge the order and difference between responses. For example, asking respondents how much they spend on their weekly shopping on average.

Interval scale

In an interval scale, values are lined up in order with a meaningful difference between them but no true zero point — for example, temperature or a credit score.

Step by step: How to conduct surveys and collect data

Conducting a survey and collecting data is relatively straightforward, but it does require some careful planning and design to ensure it results in reliable data.

Step 1 – Define your objectives

What do you want to learn from the survey? How is the data going to help you? Having a hypothesis or series of assumptions about survey responses will allow you to create the right questions to test them.

Step 2 – Create your survey questions

Once you’ve got your hypotheses or assumptions, write out the questions you need answering to test your theories or beliefs. Be wary about framing questions that could lead respondents or inadvertently create biased responses .

Step 3 – Choose your question types

Your survey should include a variety of question types and should aim to obtain quantitative data with some qualitative responses from open-ended questions. Using a mix of questions (simple Yes/No, multiple-choice, rank in order, etc.) not only increases the reliability of your data but also reduces survey fatigue and respondents simply answering questions quickly without thinking.


Step 4 – Test your questions

Before sending your questionnaire out, you should test it (e.g. have a random internal group do the survey) and carry out A/B tests to ensure you’ll gain accurate responses.

Step 5 – Choose your target and send out the survey

Depending on your objectives, you might want to target the general population with your survey or a specific segment of the population. Once you’ve narrowed down who you want to target, it’s time to send out the survey.

After you’ve deployed the survey, keep an eye on the response rate to ensure you’re getting the number you expected. If your response rate is low, you might need to send the survey out to a second group to obtain a large enough sample — or do some troubleshooting to work out why your response rates are so low. This could be down to your questions, delivery method, selected sample, or otherwise.

Step 6 – Analyze results and draw conclusions

Once you’ve got your results back, it’s time for the fun part.

Break down your survey responses using the parameters you’ve set in your objectives and analyze the data to compare to your original assumptions. At this stage, a research tool or software can make the analysis a lot easier — and that’s somewhere Qualtrics can help.

Get reliable insights with survey software from Qualtrics

Gaining feedback from customers and leads is critical for any business. Data gathered from surveys can prove invaluable for understanding your products and your market position, and with survey software from Qualtrics, it couldn’t be easier.

Used by more than 13,000 brands and supporting more than 1 billion surveys a year, Qualtrics empowers everyone in your organization to gather insights and take action. No coding required — and your data is housed in one system.

Get feedback from more than 125 sources on a single platform and view and measure your data in one place to create actionable insights and gain a deeper understanding of your target customers.

Automatically run complex text and statistical analysis to uncover exactly what your survey data is telling you, so you can react in real-time and make smarter decisions.

We can help you with survey management, too. From designing your survey and finding your target respondents to getting your survey in the field and reporting back on the results, we can help you every step of the way.

And for expert market researchers and survey designers, Qualtrics features custom programming to give you total flexibility over question types, survey design, embedded data, and other variables.

No matter what type of survey you want to run, what target audience you want to reach, or what assumptions you want to test or answers you want to uncover, we’ll help you design, deploy and analyze your survey with our team of experts.


How to Analyze Survey Results Like a Data Pro

Swetha Amaresan

Updated: November 23, 2021

Published: October 04, 2021

Obtaining customer feedback is difficult. You need strong survey questions that effectively derive customer insights. Not to mention a distribution system that shares the survey with the right customers at the right time. However, survey data doesn't just sort and analyze itself. You need a team dedicated to sifting through survey results and highlighting key trends and behaviors for your marketing, sales, and customer service teams. In this post, we'll discuss not only how to analyze survey results, but also how to present your findings to the rest of your organization.


How to Analyze Survey Results

1. Understand the four measurement levels.

Before analyzing data, you should understand the four levels of measurement. These levels determine how survey questions should be measured and what statistical analysis should be performed. The four measurement levels are nominal scales, ordinal scales, interval scales, and ratio scales.

Nominal Scale

Nominal scales classify data without any quantitative value, similar to labels. An example of a nominal scale is, "Select your car's brand from the list below." The choices have no relationship to each other. Due to the lack of numerical significance, you can only keep track of how many respondents chose each option and which option was selected the most.



Ordinal Scale

Ordinal scales are used to depict the order of values. For this scale, there's a quantitative value because one rank is higher than another. An example of an ordinal scale is, "Rank the reasons for using your laptop." You can analyze both mode and median from this type of scale, and ordinal scales can be analyzed through cross-tabulation analysis.

Interval Scale

Interval scales depict both the order and difference between values. These scales have quantitative value because data intervals remain equivalent along the scale, but there's no true zero point. An example of an interval scale is an IQ test. You can analyze mode, median, and mean from this type of scale and analyze the data through ANOVA, t-tests, and correlation analyses. ANOVA tests the significance of survey results, while t-tests and correlation analyses determine if datasets are related.

Ratio Scale

Ratio scales depict the order and difference between values, but unlike interval scales, they do have a true zero point. With ratio scales, there's quantitative value because the absence of an attribute can still provide information. For example, a ratio scale could be, "Select the average amount of money you spend online shopping." You can analyze mode, median, and mean with this type of scale, and ratio scales can also be analyzed through t-tests, ANOVA, and correlation analyses.
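A small sketch of which summary statistics each measurement level supports, using Python's standard library (all data values below are invented):

```python
# Which summary statistics are meaningful at each measurement level.
# All data values are invented for illustration.

from statistics import mode, median, mean

nominal  = ["Toyota", "Ford", "Toyota", "Honda"]  # labels only
ordinal  = [1, 2, 2, 3, 5]                        # ranks (order, uneven spacing)
interval = [98, 115, 100, 87]                     # e.g. IQ scores (no true zero)
ratio    = [0.0, 25.5, 40.0, 12.0]                # e.g. money spent (true zero)

print(mode(nominal))    # nominal: mode only
print(median(ordinal))  # ordinal: mode and median
print(mean(interval))   # interval: mode, median, and mean
print(mean(ratio))      # ratio: all of the above, and ratios are meaningful
```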

2. Select your survey question(s).

Once you understand how survey questions are analyzed, you should take note of the overarching survey question(s) that you're trying to solve. Perhaps it's "How do respondents rate our brand?"

Then, look at survey questions that answer this research question, such as "How likely are you to recommend our brand to others?" Segmenting your survey questions will isolate data that are relevant to your goals.

Additionally, it's important to ask both close-ended and open-ended questions.

Close-Ended Questions

A close-ended survey question gives a limited set of answers. Respondents can't explain their answer and they can only choose from pre-determined options. These questions could be yes or no, multiple-choice, checkboxes, dropdown, or a scale question. Asking a variety of questions is important to get the best data.

Open-Ended Questions

An open-ended survey question will ask the respondent to explain their opinion. For example, in an NPS survey, you'll ask how likely a customer is to recommend your brand. After that, you might consider asking customers to explain their choice. This could be something like "Why or why wouldn't you recommend our product to your friends/family?"

3. Analyze quantitative data first.

Quantitative data is valuable because it uses statistics to draw conclusions. While qualitative data can bring more interesting insights about a topic, this information is subjective, making it harder to analyze. Quantitative data, however, comes from close-ended questions which can be converted into a numeric value. Once data is quantified, it's much easier to compare results and identify trends in customer behavior .

It's best to start with quantitative data when performing a survey analysis. That's because quantitative data can help you better understand your qualitative data. For example, if 60% of customers say they're unhappy with your product, you can focus your attention on negative reviews about user experience. This can help you identify roadblocks in the customer journey and correct any pain points that are causing churn.

4. Use cross-tabulation to better understand your target audience.

If you analyze all of your responses in one group, it isn't entirely effective for gaining accurate information. Respondents who aren't your ideal customers can overrun your data and skew survey results. Instead, if you segment responses using cross-tabulation, you can analyze how your target audience responded to your questions.

Split Up Data by Demographics

Cross-tabulation records the relationships between variables. It compares two sets of data within one chart. This reveals specific insights based on your participants' responses to different questions. For example, you may be curious about customer advocacy among your customers based in Boston, MA. You can use cross-tabulation to see how many respondents said they were from Boston and said they would recommend your brand.

By pulling multiple variables into one chart, we can narrow down survey results to a specific group of responses. That way, you know your data is only considering your target audience.

Below is an example of a cross-tabulation chart. It records respondents' favorite baseball teams and what city they reside in.

[Cross-tabulation chart: respondents' favorite baseball teams by city of residence]

5. Determine statistical significance.

If the statistical significance or p-value for a data point is equal to or lower than 0.05, it has moderate statistical significance since the probability for error is less than 5%. If the p-value is lower than 0.01, that means it has high statistical significance because the probability for error is less than 1%.
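A minimal plain-Python sketch of building such a cross-tab (the city and recommendation data below are invented); a chi-square test of independence, e.g. scipy.stats.chi2_contingency, could then be run on the resulting table to obtain a p-value:

```python
# Cross-tabulating respondents by city and "would recommend" answer.
# Data is invented for illustration.

from collections import Counter

responses = [
    ("Boston", "yes"), ("Boston", "yes"), ("Boston", "no"),
    ("New York", "yes"), ("New York", "no"), ("New York", "no"),
]

crosstab = Counter(responses)  # counts each (city, answer) pair
for city in ("Boston", "New York"):
    print(city, "yes:", crosstab[(city, "yes")], "no:", crosstab[(city, "no")])
```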

6. Consider causation versus correlation.

Another important aspect of survey analysis is knowing whether the conclusions you're drawing are accurate. For instance, let's say we observed a correlation between ice cream sales and car thefts in Boston. Over a month, as ice cream sales increased so did reports of stolen cars. While this data may suggest a link between these variables, we know that there's probably no relationship.

Just because the two are correlated doesn't mean one causes the other. In cases like these, there's typically a third variable — the independent variable — that influences the two dependent variables. In this case, it's temperature. As the temperature increases, more people buy ice cream. Additionally, more people leave their homes and go out, which leads to more opportunities for crime.

While this is an extreme example, you never want to draw a conclusion that's inaccurate or insufficient. Analyze all the data before assuming what influences a customer to think, feel, or act a certain way.

7. Compare new data with past data.

While current data is good for keeping you updated, it should be compared to data you've collected in the past. If you know 33% of respondents said they would recommend your brand, is that better or worse than last year? How about last quarter?

If this is your first year analyzing data, make these results the benchmark for your next analysis. Compare future results to this record and track changes over quarters, months, years, or whatever interval you prefer. You can even track data for specific subgroups to see if their experiences improve with your initiatives.
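As a sketch of such a benchmark comparison, the example below runs a two-proportion z-test in plain Python on invented "would recommend" counts from two years (a statistics package would normally be used instead of this hand-rolled version):

```python
# Two-proportion z-test comparing this year's recommend rate with last
# year's. Counts are invented; this is a hand-rolled sketch, not a
# replacement for a proper statistics library.

from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return z, p_value

# 33% recommended this year (132 of 400) vs ~28% last year (106 of 380)
z, p = two_proportion_z(132, 400, 106, 380)
print(round(z, 2), round(p, 3))
```

A p-value above 0.05 here would suggest the year-over-year change could plausibly be noise rather than a real shift.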

Now that you've gathered and analyzed all of your data, the next step is to share it with coworkers, customers, and other stakeholders. However, presentation is key in helping others understand the insights you're trying to explain.

The next section will explain how to present your survey results and share important customer data with the rest of your organization.

1. Use a graph or chart.

Graphs and charts are visually appealing ways to share data. Not only are the colors and patterns easy on the eyes, but data is often easier to understand when shared through a visual medium. However, it's important to choose a graph that highlights your results in a relevant way.
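For a published report you would use a charting tool (Canva, Excel, matplotlib, and so on), but the underlying step is always the same: turn response counts into scaled bars. A quick stdlib-only sketch with hypothetical counts:

```python
# Hypothetical response counts for one survey question.
responses = {"Very satisfied": 48, "Satisfied": 90, "Neutral": 30, "Dissatisfied": 12}

def ascii_bar_chart(counts, width=30):
    """Quick text bar chart; one row per answer, bars scaled to `width`."""
    peak = max(counts.values())
    lines = []
    for label, n in counts.items():
        bar = "#" * round(width * n / peak)
        lines.append(f"{label:<15} {bar} {n}")
    return "\n".join(lines)

print(ascii_bar_chart(responses))
```

Even in this rough form, the most frequent answer is visible at a glance, which is the point of choosing a chart type that highlights your result.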


2. Minimal Formal Annual Report

This Canva report template lets the data speak for itself. The minimal portrait layout offers plenty of negative space so the content can breathe. Bold numbers and percentages can remain or be omitted depending on your needs for each page. A rare strength of this template is how it balances large, clear images with the important written information on the page without crowding it out. Use this template for hybrid text-visual designs.


4. Empowerment Keynote Presentation

This presentation template makes a great research report template due to its clean lines, contrasting graphic elements, and ample room for visuals. The headers in this template virtually jump off the page to grab the reader's attention. There aren't many ways to present quantitative data using this template, but it works well for qualitative survey reports like focus groups or product design studies where original images will be discussed.




Questionnaire Design | Methods, Question Types & Examples

Published on July 15, 2021 by Pritha Bhandari. Revised on June 22, 2023.

A questionnaire is a list of questions or items used to gather data from respondents about their attitudes, experiences, or opinions. Questionnaires can be used to collect quantitative and/or qualitative information.

Questionnaires are commonly used in market research as well as in the social and health sciences. For example, a company may ask for feedback about a recent customer service experience, or psychology researchers may investigate health risk perceptions using questionnaires.

Table of contents

  • Questionnaires vs. surveys
  • Questionnaire methods
  • Open-ended vs. closed-ended questions
  • Question wording
  • Question order
  • Step-by-step guide to design
  • Frequently asked questions about questionnaire design

A survey is a research method where you collect and analyze data from a group of people. A questionnaire is a specific tool or instrument for collecting the data.

Designing a questionnaire means creating valid and reliable questions that address your research objectives, placing them in a useful order, and selecting an appropriate method for administration.

But designing a questionnaire is only one component of survey research. Survey research also involves defining the population you’re interested in, choosing an appropriate sampling method , administering questionnaires, data cleansing and analysis, and interpretation.

Sampling is important in survey research because you’ll often aim to generalize your results to the population. Gather data from a sample that represents the range of views in the population for externally valid results. There will always be some differences between the population and the sample, but minimizing these will help you avoid several types of research bias, including sampling bias, ascertainment bias, and undercoverage bias.


Questionnaires can be self-administered or researcher-administered. Self-administered questionnaires are more common because they are easy to implement and inexpensive, but researcher-administered questionnaires allow deeper insights.

Self-administered questionnaires

Self-administered questionnaires can be delivered online or in paper-and-pen formats, in person or through mail. All questions are standardized so that all respondents receive the same questions with identical wording.

Self-administered questionnaires can be:

  • cost-effective
  • easy to administer for small and large groups
  • anonymous and suitable for sensitive topics

But they may also be:

  • unsuitable for people with limited literacy or verbal skills
  • susceptible to a nonresponse bias (most people invited may not complete the questionnaire)
  • biased towards people who volunteer because impersonal survey requests often go ignored.

Researcher-administered questionnaires

Researcher-administered questionnaires are interviews that take place by phone, in-person, or online between researchers and respondents.

Researcher-administered questionnaires can:

  • help you ensure the respondents are representative of your target audience
  • allow clarifications of ambiguous or unclear questions and answers
  • have high response rates because it’s harder to refuse an interview when personal attention is given to respondents

But researcher-administered questionnaires can be limiting in terms of resources. They are:

  • costly and time-consuming to perform
  • more difficult to analyze if you have qualitative responses
  • likely to contain experimenter bias or demand characteristics
  • likely to encourage social desirability bias in responses because of a lack of anonymity

Your questionnaire can include open-ended or closed-ended questions or a combination of both.

Using closed-ended questions limits your responses, while open-ended questions enable a broad range of answers. You’ll need to balance these considerations with your available time and resources.

Closed-ended questions

Closed-ended, or restricted-choice, questions offer respondents a fixed set of choices to select from. Closed-ended questions are best for collecting data on categorical or quantitative variables.

Categorical variables can be nominal or ordinal. Quantitative variables can be interval or ratio. Understanding the type of variable and level of measurement means you can perform appropriate statistical analyses for generalizable results.
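The level of measurement determines which summary statistics are sensible. A small Python sketch with made-up coded responses (the mappings and data are illustrative assumptions):

```python
from collections import Counter
from statistics import mean, median

# Hypothetical coded responses at each level of measurement.
nominal = ["Asian", "White", "White", "Black or African American"]  # unordered categories
ordinal = ["Agree", "Neutral", "Agree", "Strongly agree"]           # ordered categories
interval_scores = [14, 18, 11, 20]                                  # e.g. summed Likert items
ratio_counts = [0, 2, 5, 1]                                         # true zero, e.g. purchases

# Nominal -> mode (most frequent category).
print(Counter(nominal).most_common(1)[0][0])

# Ordinal -> median of the ranks, using an explicit rank mapping.
rank = {"Strongly disagree": 1, "Disagree": 2, "Neutral": 3,
        "Agree": 4, "Strongly agree": 5}
print(median(rank[r] for r in ordinal))

# Interval -> mean is meaningful; ratio -> means and ratios are both meaningful.
print(mean(interval_scores))
print(sum(ratio_counts) / len(ratio_counts))
```

Matching the statistic to the variable type is what makes the later analysis generalizable rather than misleading.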

Examples of closed-ended questions for different variables

Nominal variables include categories that can’t be ranked, such as race or ethnicity. This includes binary or dichotomous categories.

It’s best to include categories that cover all possible answers and are mutually exclusive. There should be no overlap between response items.

In binary or dichotomous questions, you’ll give respondents only two options to choose from.

  • White
  • Black or African American
  • American Indian or Alaska Native
  • Asian
  • Native Hawaiian or Other Pacific Islander

Ordinal variables include categories that can be ranked. Consider how wide or narrow a range you’ll include in your response items, and their relevance to your respondents.

Likert scale questions collect ordinal data using rating scales with 5 or 7 points.

When you have four or more Likert-type questions, you can treat the composite data as quantitative data on an interval scale. Intelligence tests, psychological scales, and personality inventories use multiple Likert-type questions to collect interval data.

With interval or ratio scales, you can apply strong statistical hypothesis tests to address your research aims.
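Computing such a composite is straightforward. A hedged sketch, assuming four hypothetical 5-point Likert items in which the third item is negatively worded and must be reverse-scored before summing:

```python
# Hypothetical: four 5-point Likert items measuring a single attitude.
POINTS = 5
REVERSED_ITEMS = {2}   # zero-based positions of reverse-scored (negatively worded) items

def composite_score(item_scores):
    """Sum the items into one interval-scale score, reverse-scoring as needed."""
    return sum(
        (POINTS + 1 - s) if i in REVERSED_ITEMS else s
        for i, s in enumerate(item_scores)
    )

print(composite_score([5, 4, 2, 5]))   # 5 + 4 + (6 - 2) + 5 = 18
```

Reverse-scoring before summing is essential; otherwise a negatively worded item cancels out the very attitude the scale is trying to measure.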

Pros and cons of closed-ended questions

Well-designed closed-ended questions are easy to understand and can be answered quickly. However, you might still miss important answers that are relevant to respondents. An incomplete set of response items may force some respondents to pick the closest alternative to their true answer. These types of questions may also miss out on valuable detail.

To solve these problems, you can make questions partially closed-ended, and include an open-ended option where respondents can fill in their own answer.

Open-ended questions

Open-ended, or long-form, questions allow respondents to give answers in their own words. Because there are no restrictions on their choices, respondents can answer in ways that researchers may not have otherwise considered. For example, respondents may want to answer “multiracial” for the question on race rather than selecting from a restricted list.

  • How do you feel about open science?
  • How would you describe your personality?
  • In your opinion, what is the biggest obstacle for productivity in remote work?

Open-ended questions have a few downsides.

They require more time and effort from respondents, which may deter them from completing the questionnaire.

For researchers, understanding and summarizing responses to these questions can take a lot of time and resources. You’ll need to develop a systematic coding scheme to categorize answers, and you may also need to involve other researchers in data analysis for high reliability.
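A standard reliability check for such a coding scheme is Cohen's kappa, which measures how much two coders agree beyond what chance alone would produce. A minimal stdlib sketch with made-up codes for ten open-ended answers:

```python
# Hypothetical: two researchers independently code 10 open-ended answers
# into the categories of a shared coding scheme.
coder_a = ["price", "quality", "price", "support", "price",
           "quality", "support", "price", "quality", "price"]
coder_b = ["price", "quality", "price", "price", "price",
           "quality", "support", "price", "support", "price"]

def cohens_kappa(a, b):
    """Cohen's kappa: (observed - expected agreement) / (1 - expected agreement)."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    labels = set(a) | set(b)
    expected = sum((a.count(lab) / n) * (b.count(lab) / n) for lab in labels)
    return (observed - expected) / (1 - expected)

print(round(cohens_kappa(coder_a, coder_b), 2))   # -> 0.67
```

Values around 0.6 to 0.8 are usually read as substantial agreement; much lower values suggest the coding scheme needs clearer category definitions.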

Question wording can influence your respondents’ answers, especially if the language is unclear, ambiguous, or biased. Good questions need to be understood by all respondents in the same way (reliable) and measure exactly what you’re interested in (valid).

Use clear language

You should design questions with your target audience in mind. Consider their familiarity with your questionnaire topics and language and tailor your questions to them.

For readability and clarity, avoid jargon or overly complex language. Don’t use double negatives because they can be harder to understand.

Use balanced framing

Respondents often answer in different ways depending on the question framing. Positive frames are interpreted as more neutral than negative frames and may encourage more socially desirable answers.

Use a mix of both positive and negative frames to avoid research bias, and ensure that your question wording is balanced wherever possible.

Unbalanced questions focus on only one side of an argument. Respondents may be less likely to oppose the question if it is framed in a particular direction. It’s best practice to provide a counter argument within the question as well.

Avoid leading questions

Leading questions guide respondents towards answering in specific ways, even if that’s not how they truly feel, by explicitly or implicitly providing them with extra information.

It’s best to keep your questions short and specific to your topic of interest.

  • The average daily work commute in the US takes 54.2 minutes and costs $29 per day. Since 2020, working from home has saved many employees time and money. Do you favor flexible work-from-home policies even after it’s safe to return to offices?
  • Experts agree that a well-balanced diet provides sufficient vitamins and minerals, and multivitamins and supplements are not necessary or effective. Do you agree or disagree that multivitamins are helpful for balanced nutrition?

Keep your questions focused

Ask about only one idea at a time and avoid double-barreled questions. Double-barreled questions ask about more than one item at a time, which can confuse respondents.

For example: “Do you agree or disagree that the government should be responsible for providing clean drinking water and high-speed internet to everyone?” This question could be difficult to answer for respondents who feel strongly about the right to clean drinking water but not high-speed internet. They might answer only about the topic they feel passionate about or provide a neutral answer instead, but neither option captures their true view.

Instead, you should ask two separate questions to gauge respondents’ opinions.

Do you agree or disagree that the government should be responsible for providing high-speed internet to everyone?

  • Strongly Agree
  • Agree
  • Undecided
  • Disagree
  • Strongly Disagree

You can organize the questions logically, with a clear progression from simple to complex. Alternatively, you can randomize the question order between respondents.

Logical flow

Using a logical flow to your question order means starting with simple questions, such as behavioral or opinion questions, and ending with more complex, sensitive, or controversial questions.

The question order that you use can significantly affect the responses by priming respondents in specific directions. Question order effects, or context effects, occur when earlier questions influence the responses to later questions, reducing the validity of your questionnaire.

While demographic questions are usually unaffected by order effects, questions about opinions and attitudes are more susceptible to them.

  • How knowledgeable are you about Joe Biden’s executive orders in his first 100 days?
  • Are you satisfied or dissatisfied with the way Joe Biden is managing the economy?
  • Do you approve or disapprove of the way Joe Biden is handling his job as president?

It’s important to minimize order effects because they can be a source of systematic error or bias in your study.

Randomization

Randomization involves presenting individual respondents with the same questionnaire but with different question orders.

When you use randomization, order effects will be minimized in your dataset. But a randomized order may also make it harder for respondents to process your questionnaire. Some questions may need more cognitive effort, while others are easier to answer, so a random order could require more time or mental capacity for respondents to switch between questions.
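In an online questionnaire, per-respondent randomization is a few lines of code. A hedged sketch (question labels and respondent IDs are made up), seeding the shuffle per respondent so each person's order can be reproduced during analysis:

```python
import random

questions = ["Q1", "Q2", "Q3", "Q4", "Q5"]

def randomized_order(questions, respondent_id):
    """Return this respondent's question order: a deterministic, seeded shuffle."""
    order = questions[:]                         # copy; never mutate the master list
    random.Random(respondent_id).shuffle(order)
    return order

print(randomized_order(questions, respondent_id=1))
print(randomized_order(questions, respondent_id=2))  # a different order
```

Seeding by respondent ID means the presented order is stable if the respondent reloads the page, while still varying across the sample, which is what spreads order effects out.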

Step 1: Define your goals and objectives

The first step of designing a questionnaire is determining your aims.

  • What topics or experiences are you studying?
  • What specifically do you want to find out?
  • Is a self-report questionnaire an appropriate tool for investigating this topic?

Once you’ve specified your research aims, you can operationalize your variables of interest into questionnaire items. Operationalizing concepts means turning them from abstract ideas into concrete measurements. Every question needs to address a defined need and have a clear purpose.

Step 2: Use questions that are suitable for your sample

Create appropriate questions by taking the perspective of your respondents. Consider their language proficiency and available time and energy when designing your questionnaire.

  • Are the respondents familiar with the language and terms used in your questions?
  • Would any of the questions insult, confuse, or embarrass them?
  • Do the response items for any closed-ended questions capture all possible answers?
  • Are the response items mutually exclusive?
  • Do the respondents have time to respond to open-ended questions?

Consider all possible options for responses to closed-ended questions. From a respondent’s perspective, a lack of response options reflecting their point of view or true answer may make them feel alienated or excluded. In turn, they’ll become disengaged or inattentive to the rest of the questionnaire.

Step 3: Decide on your questionnaire length and question order

Once you have your questions, make sure that the length and order of your questions are appropriate for your sample.

If respondents are not being incentivized or compensated, keep your questionnaire short and easy to answer. Otherwise, your sample may be biased with only highly motivated respondents completing the questionnaire.

Decide on your question order based on your aims and resources. Use a logical flow if your respondents have limited time or if you cannot randomize questions. Randomizing questions helps you avoid bias, but it can take more complex statistical analysis to interpret your data.

Step 4: Pretest your questionnaire

When you have a complete list of questions, you’ll need to pretest it to make sure what you’re asking is always clear and unambiguous. Pretesting helps you catch any errors or points of confusion before performing your study.

Ask friends, classmates, or members of your target audience to complete your questionnaire using the same method you’ll use for your research. Find out if any questions were particularly difficult to answer or if the directions were unclear or inconsistent, and make changes as necessary.

If you have the resources, running a pilot study will help you test the validity and reliability of your questionnaire. A pilot study is a practice run of the full study, and it includes sampling, data collection, and analysis. You can find out whether your procedures are unfeasible or susceptible to bias and make changes in time, but you can’t test a hypothesis with this type of study because it’s usually statistically underpowered.


A questionnaire is a data collection tool or instrument, while a survey is an overarching research method that involves collecting and analyzing data from people using questionnaires.

Closed-ended, or restricted-choice, questions offer respondents a fixed set of choices to select from. These questions are easier to answer quickly.

Open-ended or long-form questions allow respondents to answer in their own words. Because there are no restrictions on their choices, respondents can answer in ways that researchers may not have otherwise considered.

A Likert scale is a rating scale that quantitatively assesses opinions, attitudes, or behaviors. It is made up of 4 or more questions that measure a single attitude or trait when response scores are combined.

To use a Likert scale in a survey, you present participants with Likert-type questions or statements and a continuum of items, usually with 5 or 7 possible responses, to capture their degree of agreement.

You can organize the questions logically, with a clear progression from simple to complex, or randomly between respondents. A logical flow helps respondents process the questionnaire easier and quicker, but it may lead to bias. Randomization can minimize the bias from order effects.

Questionnaires can be self-administered or researcher-administered.

Researcher-administered questionnaires are interviews that take place by phone, in-person, or online between researchers and respondents. You can gain deeper insights by clarifying questions for respondents or asking follow-up questions.


J Adv Pract Oncol, v.6(2); Mar-Apr 2015

Understanding and Evaluating Survey Research

A variety of methodologic approaches exist for individuals interested in conducting research. Selection of a research approach depends on a number of factors, including the purpose of the research, the type of research questions to be answered, and the availability of resources. The purpose of this article is to describe survey research as one approach to the conduct of research so that the reader can critically evaluate the appropriateness of the conclusions from studies employing survey research.

SURVEY RESEARCH

Survey research is defined as "the collection of information from a sample of individuals through their responses to questions" (Check & Schutt, 2012, p. 160). This type of research allows for a variety of methods to recruit participants, collect data, and utilize various methods of instrumentation. Survey research can use quantitative research strategies (e.g., using questionnaires with numerically rated items), qualitative research strategies (e.g., using open-ended questions), or both strategies (i.e., mixed methods). Because it is well suited to describing and exploring human behavior, survey research is frequently used in social and psychological research (Singleton & Straits, 2009).

Information has been obtained from individuals and groups through the use of survey research for decades. It can range from asking a few targeted questions of individuals on a street corner to obtain information related to behaviors and preferences, to a more rigorous study using multiple valid and reliable instruments. Common examples of less rigorous surveys include marketing or political surveys of consumer patterns and public opinion polls.

Survey research has historically included large population-based data collection. The primary purpose of this type of survey research was to obtain information describing characteristics of a large sample of individuals of interest relatively quickly. Large census surveys obtaining information reflecting demographic and personal characteristics and consumer feedback surveys are prime examples. These surveys were often provided through the mail and were intended to describe demographic characteristics of individuals or obtain opinions on which to base programs or products for a population or group.

More recently, survey research has developed into a rigorous approach to research, with scientifically tested strategies detailing who to include (representative sample), what and how to distribute (survey method), and when to initiate the survey and follow up with nonresponders (reducing nonresponse error), in order to ensure a high-quality research process and outcome. Currently, the term "survey" can reflect a range of research aims, sampling and recruitment strategies, data collection instruments, and methods of survey administration.

Given this range of options in the conduct of survey research, it is imperative for the consumer/reader of survey research to understand the potential for bias in survey research as well as the tested techniques for reducing bias, in order to draw appropriate conclusions about the information reported in this manner. Common types of error in research, along with the sources of error and strategies for reducing error as described throughout this article, are summarized in the Table.

Table. Sources of Error in Survey Research and Strategies to Reduce Error

The goal of sampling strategies in survey research is to obtain a sufficient sample that is representative of the population of interest. It is often not feasible to collect data from an entire population of interest (e.g., all individuals with lung cancer); therefore, a subset of the population or sample is used to estimate the population responses (e.g., individuals with lung cancer currently receiving treatment). A large random sample increases the likelihood that the responses from the sample will accurately reflect the entire population. In order to accurately draw conclusions about the population, the sample must include individuals with characteristics similar to the population.

It is therefore necessary to correctly identify the population of interest (e.g., individuals with lung cancer currently receiving treatment vs. all individuals with lung cancer). The sample will ideally include individuals who reflect the intended population in terms of all characteristics of the population (e.g., sex, socioeconomic characteristics, symptom experience) and contain a similar distribution of individuals with those characteristics. As discussed by Mady Stovall beginning on page 162, Fujimori et al. (2014), for example, were interested in the population of oncologists. The authors obtained a sample of oncologists from two hospitals in Japan. These participants may or may not have similar characteristics to all oncologists in Japan.

Participant recruitment strategies can affect the adequacy and representativeness of the sample obtained. Using diverse recruitment strategies can help improve the size of the sample and help ensure adequate coverage of the intended population. For example, if a survey researcher intends to obtain a sample of individuals with breast cancer representative of all individuals with breast cancer in the United States, the researcher would want to use recruitment strategies that would recruit both women and men, individuals from rural and urban settings, individuals receiving and not receiving active treatment, and so on. Because of the difficulty in obtaining samples representative of a large population, researchers may focus the population of interest to a subset of individuals (e.g., women with stage III or IV breast cancer). Large census surveys require extremely large samples to adequately represent the characteristics of the population because they are intended to represent the entire population.
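The idea of drawing a sample that mirrors the population's subgroups is proportionate stratified sampling. A hedged, purely illustrative Python sketch (the strata and sizes are made up; real studies also need careful sampling-frame construction and remainder handling when shares don't round evenly):

```python
import random

# Hypothetical population: 20% rural, 80% urban records.
population = (
    [("rural", i) for i in range(200)] +
    [("urban", i) for i in range(800)]
)

def stratified_sample(population, size, key=lambda rec: rec[0], seed=42):
    """Draw from each stratum in proportion to its share of the population."""
    rng = random.Random(seed)
    strata = {}
    for rec in population:
        strata.setdefault(key(rec), []).append(rec)
    sample = []
    for members in strata.values():
        share = round(size * len(members) / len(population))  # proportional allocation
        sample.extend(rng.sample(members, share))
    return sample

sample = stratified_sample(population, size=100)
print(sum(1 for rec in sample if rec[0] == "rural"))   # 20 of 100, matching the 20% share
```

A simple random sample of 100 would only approximate the 20/80 split; stratifying guarantees it, which is why the technique helps when subgroup coverage matters.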

DATA COLLECTION METHODS

Survey research may use a variety of data collection methods, with the most common being questionnaires and interviews. Questionnaires may be self-administered or administered by a professional, may be administered individually or in a group, and typically include a series of items reflecting the research aims. Questionnaires may include demographic questions in addition to valid and reliable research instruments (Costanzo, Stawski, Ryff, Coe, & Almeida, 2012; DuBenske et al., 2014; Ponto, Ellington, Mellon, & Beck, 2010). It is helpful to the reader when authors describe the contents of the survey questionnaire so that the reader can interpret and evaluate the potential for errors of validity (e.g., items or instruments that do not measure what they are intended to measure) and reliability (e.g., items or instruments that do not measure a construct consistently). Helpful examples of articles that describe the survey instruments exist in the literature (Buerhaus et al., 2012).

Questionnaires may be in paper form and mailed to participants, delivered in an electronic format via email or an Internet-based program such as SurveyMonkey, or a combination of both, giving the participant the option to choose which method is preferred (Ponto et al., 2010). Using a combination of methods of survey administration can help to ensure better sample coverage (i.e., all individuals in the population having a chance of inclusion in the sample), therefore reducing coverage error (Dillman, Smyth, & Christian, 2014; Singleton & Straits, 2009). For example, if a researcher were to only use an Internet-delivered questionnaire, individuals without access to a computer would be excluded from participation. Self-administered mailed, group, or Internet-based questionnaires are relatively low cost and practical for a large sample (Check & Schutt, 2012).

Dillman et al. (2014) have described and tested a tailored design method for survey research. Improving the visual appeal and graphics of surveys by using a font size appropriate for the respondents, ordering items logically without creating unintended response bias, and arranging items clearly on each page can increase the response rate to electronic questionnaires. Attending to these and other issues in electronic questionnaires can help reduce measurement error (i.e., lack of validity or reliability) and help ensure a better response rate.

Conducting interviews is another approach to data collection used in survey research. Interviews may be conducted by phone, computer, or in person and have the benefit of visually identifying the nonverbal response(s) of the interviewee and subsequently being able to clarify the intended question. An interviewer can use probing comments to obtain more information about a question or topic and can request clarification of an unclear response (Singleton & Straits, 2009). Interviews can be costly and time intensive, and therefore are relatively impractical for large samples.

Some authors advocate for using mixed methods for survey research when no one method is adequate to address the planned research aims, to reduce the potential for measurement and nonresponse error, and to better tailor the study methods to the intended sample (Dillman et al., 2014; Singleton & Straits, 2009). For example, a mixed methods survey research approach may begin with distributing a questionnaire and following up with telephone interviews to clarify unclear survey responses (Singleton & Straits, 2009). Mixed methods might also be used when visual or auditory deficits preclude an individual from completing a questionnaire or participating in an interview.

FUJIMORI ET AL.: SURVEY RESEARCH

Fujimori et al. (2014) described the use of survey research in a study of the effect of communication skills training for oncologists on oncologist and patient outcomes (e.g., oncologist’s performance and confidence and patient’s distress, satisfaction, and trust). A sample of 30 oncologists from two hospitals was obtained, and though the authors provided a power analysis concluding that the number of oncologist participants was adequate to detect differences between baseline and follow-up scores, the conclusions of the study may not be generalizable to a broader population of oncologists. Oncologists were randomized to either an intervention group (i.e., communication skills training) or a control group (i.e., no training).

Fujimori et al. (2014) chose a quantitative approach to collect data from oncologist and patient participants regarding the study outcome variables. Self-report numeric ratings were used to measure oncologist confidence and patient distress, satisfaction, and trust. Oncologist confidence was measured using two instruments, each using 10-point Likert rating scales. The Hospital Anxiety and Depression Scale (HADS) was used to measure patient distress and has demonstrated validity and reliability in a number of populations including individuals with cancer (Bjelland, Dahl, Haug, & Neckelmann, 2002). Patient satisfaction and trust were measured using 0 to 10 numeric rating scales. Numeric observer ratings were used to measure oncologist performance of communication skills based on a videotaped interaction with a standardized patient. Participants completed the same questionnaires at baseline and follow-up.
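The pre/post design described above lends itself to simple change-score computation. As a minimal sketch in Python, using hypothetical 0–10 confidence ratings (the IDs and values below are invented for illustration, not the study’s data):

```python
from statistics import mean

# Hypothetical 0-10 confidence ratings at baseline and follow-up,
# mirroring the pre/post design described above (IDs and values are made up)
baseline = {"onc1": 4, "onc2": 5, "onc3": 6}
followup = {"onc1": 7, "onc2": 6, "onc3": 8}

# Per-participant change score: follow-up minus baseline
change = {k: followup[k] - baseline[k] for k in baseline}
print(change)                 # {'onc1': 3, 'onc2': 1, 'onc3': 2}
print(mean(change.values()))  # 2
```

Comparing intervention and control groups would then amount to contrasting the mean change score in each arm.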

The authors clearly describe what data were collected from all participants. Providing additional information about the manner in which questionnaires were distributed (i.e., electronic, mail), the setting in which data were collected (e.g., home, clinic), and the design of the survey instruments (e.g., visual appeal, format, content, arrangement of items) would assist the reader in drawing conclusions about the potential for measurement and nonresponse error. The authors describe conducting a follow-up phone call or mail inquiry for nonresponders; using the Dillman et al. (2014) tailored design for survey research follow-up may have further reduced nonresponse error.

CONCLUSIONS

Survey research is a useful and legitimate approach to research that has clear benefits in helping to describe and explore variables and constructs of interest. Survey research, like all research, has the potential for a variety of sources of error, but several strategies exist to reduce the potential for error. Advanced practitioners aware of the potential sources of error and strategies to improve survey research can better determine how and whether the conclusions from a survey research study apply to practice.

The author has no potential conflicts of interest to disclose.

Grad Coach

How To Write The Methodology Chapter

The what, why & how explained simply (with examples).

By: Jenna Crossley (PhD) | Reviewed By: Dr. Eunice Rautenbach | September 2021 (Updated April 2023)

So, you’ve pinned down your research topic and undertaken a review of the literature – now it’s time to write up the methodology section of your dissertation, thesis or research paper. But what exactly is the methodology chapter all about – and how do you go about writing one? In this post, we’ll unpack the topic, step by step.

Overview: The Methodology Chapter

  • The purpose  of the methodology chapter
  • Why you need to craft this chapter (really) well
  • How to write and structure the chapter
  • Methodology chapter example
  • Essential takeaways

What (exactly) is the methodology chapter?

The methodology chapter is where you outline the philosophical underpinnings of your research and describe the specific methodological choices you’ve made. The point of the methodology chapter is to tell the reader exactly how you designed your study and, just as importantly, why you did it this way.

Importantly, this chapter should comprehensively describe and justify all the methodological choices you made in your study: the approach you took to your research (i.e., qualitative, quantitative or mixed), who you collected data from (i.e., your sampling strategy), how you collected your data and, of course, how you analysed it. If that sounds a little intimidating, don’t worry – we’ll explain all of these methodological choices in this post.


Why is the methodology chapter important?

The methodology chapter plays two important roles in your dissertation or thesis:

Firstly, it demonstrates your understanding of research theory, which is what earns you marks. A flawed research design or methodology would mean flawed results. So, this chapter is vital as it allows you to show the marker that you know what you’re doing and that your results are credible.

Secondly, the methodology chapter is what helps to make your study replicable. In other words, it allows other researchers to undertake your study using the same methodological approach, and compare their findings to yours. This is very important within academic research, as each study builds on previous studies.

The methodology chapter is also important in that it allows you to identify and discuss any methodological issues or problems you encountered (i.e., research limitations), and to explain how you mitigated the impacts of these. Every research project has its limitations, so it’s important to acknowledge these openly and highlight your study’s value despite its limitations. Doing so demonstrates your understanding of research design, which will earn you marks. We’ll discuss limitations in a bit more detail later in this post, so stay tuned!


How to write up the methodology chapter

First off, it’s worth noting that the exact structure and contents of the methodology chapter will vary depending on the field of research (e.g., humanities, chemistry or engineering) as well as the university. So, be sure to always check the guidelines provided by your institution for clarity and, if possible, review past dissertations from your university. Here we’re going to discuss a generic structure for a methodology chapter typically found in the sciences.

Before you start writing, it’s always a good idea to draw up a rough outline to guide your writing. Don’t just start writing without knowing what you’ll discuss where. If you do, you’ll likely end up with a disjointed, ill-flowing narrative. You’ll then waste a lot of time rewriting in an attempt to stitch all the pieces together. Do yourself a favour and start with the end in mind.

Section 1 – Introduction

As with all chapters in your dissertation or thesis, the methodology chapter should have a brief introduction. In this section, you should remind your readers what the focus of your study is, especially the research aims. As we’ve discussed many times on the blog, your methodology needs to align with your research aims, objectives and research questions. Therefore, it’s useful to frontload this component to remind the reader (and yourself!) what you’re trying to achieve.

In this section, you can also briefly mention how you’ll structure the chapter. This will help orient the reader and provide a bit of a roadmap so that they know what to expect. You don’t need a lot of detail here – just a brief outline will do.

The intro provides a roadmap to your methodology chapter

Section 2 – The Methodology

The next section of your chapter is where you’ll present the actual methodology. In this section, you need to detail and justify the key methodological choices you’ve made in a logical, intuitive fashion. Importantly, this is the heart of your methodology chapter, so you need to get specific – don’t hold back on the details here. This is not one of those “less is more” situations.

Let’s take a look at the most common components you’ll likely need to cover. 

Methodological Choice #1 – Research Philosophy

Research philosophy refers to the underlying beliefs (i.e., the worldview) regarding how data about a phenomenon should be gathered, analysed and used. The research philosophy will serve as the core of your study and underpin all of the other research design choices, so it’s critically important that you understand which philosophy you’ll adopt and why you made that choice. If you’re not clear on this, take the time to get clarity before you make any further methodological choices.

While several research philosophies exist, two commonly adopted ones are positivism and interpretivism. These two sit roughly on opposite sides of the research philosophy spectrum.

Positivism states that the researcher can observe reality objectively and that there is only one reality, which exists independently of the observer. As a consequence, it is quite commonly the underlying research philosophy in quantitative studies and is oftentimes the assumed philosophy in the physical sciences.

Contrasted with this, interpretivism, which is often the underlying research philosophy in qualitative studies, assumes that the researcher performs a role in observing the world around them and that reality is unique to each observer. In other words, reality is observed subjectively.

These are just two philosophies (there are many more), but they demonstrate significantly different approaches to research and have a significant impact on all the methodological choices. Therefore, it’s vital that you clearly outline and justify your research philosophy at the beginning of your methodology chapter, as it sets the scene for everything that follows.

The research philosophy is at the core of the methodology chapter

Methodological Choice #2 – Research Type

The next thing you would typically discuss in your methodology section is the research type. The starting point for this is to indicate whether the research you conducted is inductive or deductive.

Inductive research takes a bottom-up approach, where the researcher begins with specific observations or data and then draws general conclusions or theories from those observations. Therefore, these studies tend to be exploratory in terms of approach.

Conversely, deductive research takes a top-down approach, where the researcher starts with a theory or hypothesis and then tests it using specific observations or data. Therefore, these studies tend to be confirmatory in approach.

Related to this, you’ll need to indicate whether your study adopts a qualitative, quantitative or mixed approach. As we’ve mentioned, there’s a strong link between this choice and your research philosophy, so make sure that your choices are tightly aligned. When you write this section up, remember to clearly justify your choices, as they form the foundation of your study.

Methodological Choice #3 – Research Strategy

Next, you’ll need to discuss your research strategy (also referred to as a research design). This methodological choice refers to the broader strategy in terms of how you’ll conduct your research, based on the aims of your study.

Several research strategies exist, including experimental research, case studies, ethnography, grounded theory, action research and phenomenology. Let’s take a look at two of these, experimental and ethnographic, to see how they contrast.

Experimental research makes use of the scientific method, where one group is the control group (in which no variables are manipulated) and another is the experimental group (in which a specific variable is manipulated). This type of research is undertaken under strict conditions in a controlled, artificial environment (e.g., a laboratory). By having firm control over the environment, experimental research typically allows the researcher to establish causation between variables. Therefore, it can be a good choice if you have research aims that involve identifying causal relationships.

Ethnographic research , on the other hand, involves observing and capturing the experiences and perceptions of participants in their natural environment (for example, at home or in the office). In other words, in an uncontrolled environment.  Naturally, this means that this research strategy would be far less suitable if your research aims involve identifying causation, but it would be very valuable if you’re looking to explore and examine a group culture, for example.

As you can see, the right research strategy will depend largely on your research aims and research questions – in other words, what you’re trying to figure out. Therefore, as with every other methodological choice, it’s essential to justify why you chose the research strategy you did.

Methodological Choice #4 – Time Horizon

The next thing you’ll need to detail in your methodology chapter is the time horizon. There are two options here: cross-sectional and longitudinal. In other words, whether the data for your study were all collected at one point in time (cross-sectional) or at multiple points in time (longitudinal).

The choice you make here depends again on your research aims, objectives and research questions. If, for example, you aim to assess how a specific group of people’s perspectives regarding a topic change over time, you’d likely adopt a longitudinal time horizon.

Another important factor to consider is simply whether you have the time necessary to adopt a longitudinal approach (which could involve collecting data over multiple months or even years). Oftentimes, the time pressures of your degree program will force your hand into adopting a cross-sectional time horizon, so keep this in mind.

Methodological Choice #5 – Sampling Strategy

Next, you’ll need to discuss your sampling strategy. There are two main categories of sampling: probability and non-probability sampling.

Probability sampling involves a random (and therefore representative) selection of participants from a population, whereas non-probability sampling entails selecting participants in a non-random  (and therefore non-representative) manner. For example, selecting participants based on ease of access (this is called a convenience sample).

The right sampling approach depends largely on what you’re trying to achieve in your study – specifically, whether you’re trying to develop findings that are generalisable to a population or not. Practicalities and resource constraints also play a large role here, as it can oftentimes be challenging to gain access to a truly random sample. In the video below, we explore some of the most common sampling strategies.
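To make the distinction concrete, here is a minimal Python sketch contrasting the two categories (the population and sample sizes are purely illustrative):

```python
import random

def probability_sample(population, n, seed=None):
    """Simple random sample: every member has an equal chance of selection."""
    return random.Random(seed).sample(population, n)

def convenience_sample(population, n):
    """Non-probability sample: take the first n members encountered."""
    return population[:n]

population = list(range(1000))
print(probability_sample(population, 5, seed=42))  # varies with the seed
print(convenience_sample(population, 5))           # always [0, 1, 2, 3, 4]
```

A convenience sample systematically over-represents whoever is easiest to reach (here, the start of the list), which is exactly why it is non-representative.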

Methodological Choice #6 – Data Collection Method

Next up, you’ll need to explain how you’ll go about collecting the necessary data for your study. Your data collection method (or methods) will depend on the type of data that you plan to collect – in other words, qualitative or quantitative data.

Typically, quantitative research relies on surveys, data generated by lab equipment, analytics software or existing datasets. Qualitative research, on the other hand, often makes use of collection methods such as interviews, focus groups, participant observations and ethnography.

So, as you can see, there is a tight link between this section and the design choices you outlined in earlier sections. Strong alignment between these sections, as well as with your research aims and questions, is therefore very important.

Methodological Choice #7 – Data Analysis Methods/Techniques

The final major methodological choice that you need to address is that of analysis techniques. In other words, how you’ll go about analysing your data once you’ve collected it. Here it’s important to be very specific about your analysis methods and/or techniques – don’t leave any room for interpretation. Also, as with all choices in this chapter, you need to justify each choice you make.

What exactly you discuss here will depend largely on the type of study you’re conducting (i.e., qualitative, quantitative, or mixed methods). For qualitative studies, common analysis methods include content analysis, thematic analysis and discourse analysis. In the video below, we explain each of these in plain language.

For quantitative studies, you’ll almost always make use of descriptive statistics, and in many cases, you’ll also use inferential statistical techniques (e.g., correlation and regression analysis). In the video below, we unpack some of the core concepts involved in descriptive and inferential statistics.

In this section of your methodology chapter, it’s also important to discuss how you prepared your data for analysis, and what software you used (if any). For example, quantitative data will often require some initial preparation such as removing duplicates or incomplete responses. Similarly, qualitative data will often require transcription and perhaps even translation. As always, remember to state both what you did and why you did it.
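As a minimal Python sketch of the kind of preparation just described (the responses are invented for illustration): drop duplicate submissions and incomplete rows, then compute basic descriptive statistics.

```python
from statistics import mean, stdev

# Hypothetical raw responses: (respondent_id, satisfaction score 1-5 or None)
raw = [
    ("r1", 4), ("r2", 5), ("r2", 5),  # r2 submitted twice
    ("r3", None),                     # incomplete response
    ("r4", 3), ("r5", 4),
]

# Preparation: drop duplicate respondents (keep first) and incomplete rows
seen, clean = set(), []
for rid, score in raw:
    if rid in seen or score is None:
        continue
    seen.add(rid)
    clean.append(score)

# Simple descriptive statistics on the cleaned data
print(len(clean), mean(clean), round(stdev(clean), 2))  # 4 4 0.82
```

In practice you would document each exclusion rule (and how many responses it removed) in this section of the chapter.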

Section 3 – The Methodological Limitations

With the key methodological choices outlined and justified, the next step is to discuss the limitations of your design. No research methodology is perfect – there will always be trade-offs between the “ideal” methodology and what’s practical and viable, given your constraints. Therefore, this section of your methodology chapter is where you’ll discuss the trade-offs you had to make, and why these were justified given the context.

Methodological limitations can vary greatly from study to study, ranging from common issues such as time and budget constraints to issues of sample or selection bias. For example, you may find that you didn’t manage to draw in enough respondents to achieve the desired sample size (and therefore, statistically significant results), or your sample may be skewed heavily towards a certain demographic, thereby negatively impacting representativeness.

In this section, it’s important to be critical of the shortcomings of your study. There’s no use trying to hide them (your marker will be aware of them regardless). By being critical, you’ll demonstrate to your marker that you have a strong understanding of research theory, so don’t be shy here. At the same time, don’t beat your study to death. State the limitations, why these were justified, how you mitigated their impacts to the best degree possible, and how your study still provides value despite these limitations.

Section 4 – Concluding Summary

Finally, it’s time to wrap up the methodology chapter with a brief concluding summary. In this section, you’ll want to concisely summarise what you’ve presented in the chapter. Here, it can be a good idea to use a figure to summarise the key decisions, especially if your university recommends using a specific model (for example, Saunders’ Research Onion).

Importantly, this section needs to be brief – a paragraph or two maximum (it’s a summary, after all). Also, make sure that when you write up your concluding summary, you include only what you’ve already discussed in your chapter; don’t add any new information.

Keep it simple

Methodology Chapter Example

In the video below, we walk you through an example of a high-quality research methodology chapter from a dissertation. We also unpack our free methodology chapter template so that you can see how best to structure your chapter.

Wrapping Up

And there you have it – the methodology chapter in a nutshell. As we’ve mentioned, the exact contents and structure of this chapter can vary between universities, so be sure to check in with your institution before you start writing. If possible, try to find dissertations or theses from former students of your specific degree program – this will give you a strong indication of the expectations and norms when it comes to the methodology chapter (and all the other chapters!).

Also, remember the golden rule of the methodology chapter – justify every choice! Make sure that you clearly explain the “why” for every “what”, and reference credible methodology textbooks or academic sources to back up your justifications.

If you need a helping hand with your research methodology (or any other component of your research), be sure to check out our private coaching service, where we hold your hand through every step of the research journey. Until next time, good luck!



Methodology

The American Trends Panel survey methodology

The American Trends Panel (ATP), created by Pew Research Center, is a nationally representative panel of randomly selected U.S. adults. Panelists participate via self-administered web surveys. Panelists who do not have internet access at home are provided with a tablet and wireless internet connection. Interviews are conducted in both English and Spanish. The panel is being managed by Ipsos.

Data in this report is drawn from ATP Wave 143, conducted from Feb. 13 to 25, 2024. A total of 12,693 panelists responded out of 14,762 who were sampled, for a response rate of 89% (AAPOR RR3). The survey includes an oversample of 2,051 Jewish and Muslim Americans from Ipsos’ KnowledgePanel, SSRS’s Opinion Panel, and NORC at the University of Chicago’s AmeriSpeak Panel. These oversampled groups are weighted to reflect their correct proportions in the population. The cumulative response rate accounting for nonresponse to the recruitment surveys and attrition is 4%. The break-off rate among panelists who logged on to the survey and completed at least one item is less than 1%. The margin of sampling error for the full sample of 12,693 respondents is plus or minus 1.5 percentage points.
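The reported ±1.5-point margin reflects both the sample size and the precision lost to weighting. A hedged sketch of the standard calculation (the design-effect value below is illustrative, not Pew’s actual figure):

```python
import math

def margin_of_error(n, deff=1.0, p=0.5, z=1.96):
    """95% CI half-width for a proportion, in percentage points.
    deff inflates the variance to account for weighting;
    deff=1 corresponds to simple random sampling."""
    return z * math.sqrt(deff * p * (1 - p) / n) * 100

n = 12_693
print(round(margin_of_error(n), 2))            # 0.87 under simple random sampling
print(round(margin_of_error(n, deff=3.0), 2))  # 1.51 with an illustrative design effect
```

The gap between the 0.87-point simple-random-sampling figure and the reported 1.5 points is what a design effect captures.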

ATP Panel recruitment

The ATP was created in 2014, with the first cohort of panelists invited to join the panel at the end of a large, national, landline and cellphone random-digit-dial survey that was conducted in both English and Spanish. Two additional recruitments were conducted using the same method in 2015 and 2017. Across these three surveys, a total of 19,718 adults were invited to join the ATP, of whom 9,942 (50%) agreed to participate.

In August 2018, the ATP switched from telephone to address-based sampling (ABS) recruitment. A study cover letter and a pre-incentive are mailed to a stratified, random sample of households selected from the U.S. Postal Service’s Delivery Sequence File. This Postal Service file has been estimated to cover as much as 98% of the population, although some studies suggest that the coverage could be in the low 90% range. 5 Within each sampled household, the adult with the next birthday is asked to participate. Other details of the ABS recruitment protocol have changed over time but are available upon request. 6

Table shows American Trends Panel recruitment surveys

We have recruited a national sample of U.S. adults to the ATP approximately once per year since 2014. In some years, the recruitment has included additional efforts (known as an “oversample”) to boost sample size with underrepresented groups. For example, Hispanic, Black and Asian adults were oversampled in 2019, 2022 and 2023, respectively.

Across the six address-based recruitments, a total of 23,862 adults were invited to join the ATP, of whom 20,917 agreed to join the panel and completed an initial profile survey. Of the 30,859 individuals who have ever joined the ATP, 11,920 remained active panelists and continued to receive survey invitations at the time this survey was conducted.

The American Trends Panel never uses breakout routers or chains that direct respondents to additional surveys.

Sample design

The overall target population for this survey was noninstitutionalized persons ages 18 and older living in the U.S., including Alaska and Hawaii. All active panel members who completed the ATP wave which fielded from July 31 to Aug. 6, 2023 (ATP W132), or panelists who previously identified as Jewish or Muslim, were invited to participate in this wave.

The ATP was supplemented with an oversample of self-identified Jewish and Muslim American panelists from three other probability panels: Ipsos’ KnowledgePanel, SSRS’s Opinion Panel, and NORC at the University of Chicago’s AmeriSpeak panel. All panelists who met the selection criteria were selected with certainty.

Questionnaire development and testing

The questionnaire was developed by Pew Research Center in consultation with Ipsos, SSRS and NORC. The survey for ATP and KP panelists was programmed by Ipsos, while the survey for SSRS and NORC panelists was programmed by SSRS. A small number of SSRS panelists took their survey over the phone with an interviewer. Both web programs were rigorously tested on both PC and mobile devices by the Ipsos, SSRS and NORC project management teams and Pew Research Center researchers. The Ipsos project management team also populated test data that was analyzed in SPSS to ensure the logic and randomizations were working as intended before launching the survey.

All ATP respondents were offered a post-paid incentive for their participation. Respondents could choose to receive the post-paid incentive in the form of a check or a gift code to Amazon.com or could choose to decline the incentive. Incentive amounts ranged from $5 to $20 depending on whether the respondent belongs to a part of the population that is harder or easier to reach. Differential incentive amounts were designed to increase panel survey participation among groups that traditionally have low survey response propensities.

Respondents from the Ipsos KnowledgePanel, SSRS Opinion Panel and AmeriSpeak were offered the cash equivalent of $10 for completing this survey.

Data collection protocol

The data collection field period for this survey was Feb. 13 to 25, 2024. Postcard notifications were mailed to a subset of ATP panelists with a known residential address on Feb. 12. 7

Invitations were sent out in separate launches. Sixty ATP panelists and 300 KP panelists were included in the soft launch, which began with an initial invitation sent on Feb. 13. The ATP and KP panelists chosen for the soft launch were known responders who had completed previous surveys within one day of receiving their invitation. All remaining ATP and KP sampled panelists were included in the full launch and were sent an invitation on Feb. 14.

Table shows Invitation and reminder dates, ATP Wave 143

Overall, 129 SSRS panelists were included in the SSRS soft launch, which began with an initial invitation on Feb. 14. And 110 NORC panelists were included in the NORC soft launch, which began with an initial invitation on Feb. 15. All remaining SSRS and NORC sampled panelists were included together in the full launch and were sent an invitation on Feb. 15. 

All panelists with an email address received an email invitation and up to four email reminders if they did not respond to the survey. All ATP panelists who consented to SMS messages received an SMS invitation and up to four SMS reminders.

Data quality checks

To ensure high-quality data, the Center’s researchers performed data quality checks to identify any respondents showing clear patterns of satisficing. This includes checking whether respondents left questions blank at very high rates or always selected the first or last answer presented. As a result of these checks, six ATP respondents were removed from the survey dataset prior to weighting and analysis.
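The satisficing checks described above can be sketched in code. This is a minimal illustration, not the Center’s actual procedure: the respondent data, column names, and the 50% blank-rate threshold are all hypothetical.

```python
import pandas as pd

# Hypothetical respondent-level data: one row per respondent, one column
# per survey question; NaN marks an unanswered (blank) question.
responses = pd.DataFrame({
    "q1": [1, 2, None, 1],
    "q2": [3, 2, None, 1],
    "q3": [2, 4, None, 1],
    "q4": [1, 3, None, 1],
})

# Flag respondents who left questions blank at a very high rate
# (the 0.5 cutoff is an illustrative choice, not a documented one).
blank_rate = responses.isna().mean(axis=1)
high_blank = blank_rate > 0.5

# Flag respondents who gave the same answer to every question
# (straight-lining, e.g. always picking the first or last option shown).
straight_lined = responses.nunique(axis=1, dropna=True) <= 1

flagged = high_blank | straight_lined
print(flagged.tolist())  # respondents at index 2 and 3 are flagged
```

In practice such flags would feed a manual review rather than an automatic drop, which is consistent with the small number of removals reported here.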

Table shows American Trends Panel weighting dimensions

The data was weighted in a multistep process that accounts for multiple stages of sampling and nonresponse that occur at different points in the survey process. First, each panelist begins with a base weight that reflects their probability of inclusion in the panel to which they belong. Separately for each of the four panels (ATP, KP, SSRS, NORC), the base weights for Muslim and Jewish respondents were scaled to be proportional to the group’s effective sample size. These weights were then combined and calibrated so that the overall proportions of Jewish and Muslim respondents respectively match the National Public Opinion Reference Survey (NPORS) benchmark.

This weight is then calibrated again to align with the full set of population benchmarks identified in the accompanying table (which also includes the NPORS benchmarks for the shares of Jewish and Muslim adults). In order to reduce the loss in precision stemming from variance in the weights, the weights were trimmed separately among Jewish, Muslim, Hispanic, non-Hispanic Black, and non-Hispanic Asian respondents at the 98th percentile, and among all other respondents at the 99.5th percentile. Sampling errors and tests of statistical significance take into account the effect of weighting.
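The trimming rule described above can be illustrated with a short sketch. The weights below are simulated and the grouping is hypothetical; only the 98th/99.5th percentile caps come from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

def trim_weights(weights, percentile):
    """Cap weights at the given percentile to reduce variance in the weights."""
    cap = np.percentile(weights, percentile)
    return np.minimum(weights, cap)

# Simulated survey weights for two groups, mirroring the rule above:
# oversampled subgroups trimmed at the 98th percentile, all other
# respondents at the 99.5th percentile.
subgroup_weights = rng.lognormal(mean=0.0, sigma=1.0, size=1000)
other_weights = rng.lognormal(mean=0.0, sigma=0.5, size=5000)

trimmed_subgroup = trim_weights(subgroup_weights, 98.0)
trimmed_other = trim_weights(other_weights, 99.5)
```

Trimming trades a small amount of bias for a reduction in variance; production weighting systems typically rescale the trimmed weights afterward so they still sum to the population total, a step omitted here.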

The following table shows the unweighted sample sizes and the error attributable to sampling that would be expected at the 95% level of confidence for different groups in the survey.

Table shows Sample sizes and margins of error, ATP Wave 143

Sample sizes and sampling errors for other subgroups are available upon request. In addition to sampling error, one should bear in mind that question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of opinion polls.
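The sampling errors referenced above follow the standard formula for a proportion, inflated by the design effect from weighting. The sketch below is a generic illustration; the sample size and design effect are made-up inputs, not figures from this survey.

```python
import math

def margin_of_error(n, deff=1.0, p=0.5, z=1.96):
    """Margin of error in percentage points for a proportion at the 95%
    confidence level, inflated by the design effect (deff) of weighting.
    p=0.5 gives the most conservative (largest) margin."""
    return z * math.sqrt(deff * p * (1 - p) / n) * 100

# Hypothetical subgroup: 1,000 unweighted respondents, design effect 1.5.
print(round(margin_of_error(1000, deff=1.5), 1))  # 3.8 percentage points
print(round(margin_of_error(1000), 1))            # 3.1 without weighting loss
```

The gap between the two numbers shows why the text notes that significance tests "take into account the effect of weighting": ignoring the design effect understates the error.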

Dispositions and response rates

Table shows Final dispositions, ATP Wave 143

© Pew Research Center, 2024

  • AAPOR Task Force on Address-based Sampling. 2016. “AAPOR Report: Address-based Sampling.”
  • Email [email protected].
  • Postcard notifications are sent to 1) panelists who have been provided with a tablet to take ATP surveys, 2) panelists who were recruited within the last two years, and 3) panelists recruited prior to the last two years who opt to continue receiving postcard notifications.



Family and Staff Well-Being in Head Start FACES Programs in Fall 2021: The 2021-2022 Study


Published: 2024

Introduction

Head Start is a national program that helps young children from families with low incomes prepare to succeed in school. It does this by working to promote children’s early learning and health and their families’ well-being. Head Start connects families with medical, dental, and mental health services to ensure that children are receiving the services they need to support their development. Head Start also involves parents in their children’s learning and development and helps parents make progress on their own goals, such as housing stability, continuing education, and financial security (Administration for Children and Families 2020). Head Start provides grants to local public and private nonprofit and for-profit agencies. The agencies in turn deliver comprehensive services to children and families with low incomes.

In 2021, the Office of Planning, Research, and Evaluation in the Administration for Children and Families, U.S. Department of Health and Human Services, contracted with Mathematica to design and conduct the 2021–2022 Study of Family and Staff Well-Being in Head Start Family and Child Experiences Survey Programs (the 2021–2022 Study). The 2021–2022 Study builds on the Head Start Family and Child Experiences Survey (FACES). This report includes information on the 2021–2022 Study design and presents key findings from the study’s fall 2021 data collection.

The study focused on family and staff well-being, including:  

Children’s social-emotional and learning skills  

Children’s physical health and disability status  

Teacher characteristics  

This report (1) provides information about the 2021–2022 Study, including the background, design, methodology, assessments, and analytic methods; and (2) reports detailed findings on children, families, and teachers from fall 2021.

Key Findings and Highlights

For children’s characteristics, family background, and home environment (Section A), the tables show:   

Demographic characteristics (for example, age, race/ethnicity, gender, language(s) spoken in the home, child’s primary caregiver(s), who lives in the household)  

Participation in an Early Head Start program  

Parents’ level of completed education and employment status  

Changes in parents’ employment status and household income due to the COVID-19 pandemic  

Family economic well-being (for example, total household income; household income as a percentage of federal poverty threshold; financial strain; food security; family housing, utility, and medical hardships; and sources of public assistance)  

Parents’ mental health (for example, depressive symptoms scores, anxiety symptoms scores, and stress and anxiety level compared to stress and anxiety before March 2020)  

Parents’ overall health status  

Social and community supports available to and useful for parents  

Family housing status, stability, and quality  

Parents’ report of relationship with the child  

How the child attended school in fall 2021  

Strategies parents use to meet child care needs outside of their regular child care arrangements  

Household routines (for example, reading to the child and bedtime and family dinner routines)  

Children’s access to healthcare providers  

Families’ health experiences with COVID-19  

Challenges of and coping strategies for the COVID-19 pandemic and events related to racial injustice  

For children’s social-emotional and learning skills (Section B), the tables show:   

Reliability of and scores for teacher-reported items that measure children’s social skills, problem behaviors, approaches to learning, and literacy skills  

Reliability of and scores for parent-reported approaches to learning   

Parent report of changes in the child’s behavior since March 2020  

Teacher-reported early literacy skills  

Teacher-reported math knowledge and skills   

For children’s physical health and disability status (Section C), the tables show:   

Teacher’s report of child’s disability status and type and Individualized Education Program (IEP) or Individual Family Service Plan (IFSP) status  

Parent’s report of child’s health status  

For teacher’s characteristics (Section D), the tables show:   

Teacher demographic characteristics (for example, gender, age, and race/ethnicity)  

Teacher experience, credentials, and education

Teacher’s mental health (for example, depressive symptoms scores; anxiety symptoms scores)  

Teacher’s feelings about their jobs due to the COVID-19 pandemic  

Teacher’s caregiving situations, among teachers who were primary caregivers at home (for example, stress and anxiety level compared to stress and anxiety before March 2020; parenting behaviors and stress; instructional approach offered by their own children’s schools or child care providers; strategies used to meet child care needs outside of regular child care arrangements)  

The 2021–2022 Study gathered data from three sources in fall 2021:

  • A survey of children’s parents, in which children’s primary caregivers answered questions about their children in Head Start FACES programs and their households.
  • A teacher child report (TCR) survey, in which teachers answered questions about the development of specific children in their classrooms.
  • A teacher survey, in which teachers answered questions about themselves.

In total, 60 programs and 113 centers participated in the study in fall 2021. Within those programs and centers, we received parental consent for 1,363 children to participate. We received 785 completed parent surveys, 887 completed TCRs, and 191 completed teacher surveys. Of the parents and teachers who completed surveys, most did so in January 2022, during the omicron wave of the COVID-19 pandemic.  
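Using the counts reported above, the completion rates among consented children can be computed directly; this is an illustrative calculation, not a study-reported statistic (the study may define its rates against different denominators).

```python
# Fall 2021 completion counts taken from the text above.
consented_children = 1363
parent_surveys = 785
teacher_child_reports = 887

# Completion rates relative to children with parental consent
# (an assumed denominator for illustration).
parent_rate = parent_surveys / consented_children
tcr_rate = teacher_child_reports / consented_children

print(f"Parent survey completion: {parent_rate:.1%}")  # about 57.6%
print(f"TCR completion: {tcr_rate:.1%}")               # about 65.1%
```

Rates in this range are one concrete reason the report cautions, below, that the responding sample may not represent all Head Start children and families.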

Although we selected a nationally representative sample of children and teachers in fall 2021, fewer children and their families participated in the 2021–2022 Study than expected. Therefore, the responding sample may not be representative of all Head Start children and their families. Teachers participated at expected rates, and estimates based on their survey data represent teachers in Head Start FACES programs in the 2021–2022 program year.

Because participation and response rates were lower than expected, readers should use caution when interpreting the 2021–2022 Study estimates in this report.

Doran, Elizabeth, Davis Straske, Natalie Reid, Charlotte Cabili, Tutrang Nguyen, Xinwei Li, Myah Scott, Aden Bhagwat, Will Ratner, Judy Cannon, Jeffrey Harrington, Addison Larson, Ashley Kopack Klein, Katie Gonzalez, Nikki Aikens, and Sara Bernstein (2024). Family and Staff Well-Being in Head Start FACES Programs in Fall 2021: The 2021-2022 Study, OPRE Report 2024-037, Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

