Questionnaire Design | Methods, Question Types & Examples

Published on July 15, 2021 by Pritha Bhandari. Revised on June 22, 2023.

A questionnaire is a list of questions or items used to gather data from respondents about their attitudes, experiences, or opinions. Questionnaires can be used to collect quantitative and/or qualitative information.

Questionnaires are commonly used in market research as well as in the social and health sciences. For example, a company may ask for feedback about a recent customer service experience, or psychology researchers may investigate health risk perceptions using questionnaires.

Table of contents

  • Questionnaires vs. surveys
  • Questionnaire methods
  • Open-ended vs. closed-ended questions
  • Question wording
  • Question order
  • Step-by-step guide to design
  • Other interesting articles
  • Frequently asked questions about questionnaire design

A survey is a research method where you collect and analyze data from a group of people. A questionnaire is a specific tool or instrument for collecting the data.

Designing a questionnaire means creating valid and reliable questions that address your research objectives , placing them in a useful order, and selecting an appropriate method for administration.

But designing a questionnaire is only one component of survey research. Survey research also involves defining the population you’re interested in, choosing an appropriate sampling method , administering questionnaires, data cleansing and analysis, and interpretation.

Sampling is important in survey research because you’ll often aim to generalize your results to the population. Gather data from a sample that represents the range of views in the population for externally valid results. There will always be some differences between the population and the sample, but minimizing these will help you avoid several types of research bias , including sampling bias , ascertainment bias , and undercoverage bias .


Questionnaires can be self-administered or researcher-administered . Self-administered questionnaires are more common because they are easy to implement and inexpensive, but researcher-administered questionnaires allow deeper insights.

Self-administered questionnaires

Self-administered questionnaires can be delivered online or in paper-and-pen formats, in person or through mail. All questions are standardized so that all respondents receive the same questions with identical wording.

Self-administered questionnaires can be:

  • cost-effective
  • easy to administer for small and large groups
  • anonymous and suitable for sensitive topics

But they may also be:

  • unsuitable for people with limited literacy or verbal skills
  • susceptible to a nonresponse bias (most people invited may not complete the questionnaire)
  • biased towards people who volunteer because impersonal survey requests often go ignored

Researcher-administered questionnaires

Researcher-administered questionnaires are interviews that take place by phone, in person, or online between researchers and respondents.

Researcher-administered questionnaires can:

  • help you ensure the respondents are representative of your target audience
  • allow clarifications of ambiguous or unclear questions and answers
  • have high response rates because it’s harder to refuse an interview when personal attention is given to respondents

But researcher-administered questionnaires can be limiting in terms of resources. They are:

  • costly and time-consuming to perform
  • more difficult to analyze if you have qualitative responses
  • likely to contain experimenter bias or demand characteristics
  • likely to encourage social desirability bias in responses because of a lack of anonymity

Your questionnaire can include open-ended or closed-ended questions or a combination of both.

Using closed-ended questions limits your responses, while open-ended questions enable a broad range of answers. You’ll need to balance these considerations with your available time and resources.

Closed-ended questions

Closed-ended, or restricted-choice, questions offer respondents a fixed set of choices to select from. Closed-ended questions are best for collecting data on categorical or quantitative variables.

Categorical variables can be nominal or ordinal. Quantitative variables can be interval or ratio. Understanding the type of variable and level of measurement means you can perform appropriate statistical analyses for generalizable results.

Examples of closed-ended questions for different variables

Nominal variables include categories that can’t be ranked, such as race or ethnicity. This includes binary or dichotomous categories.

It’s best to include categories that cover all possible answers and are mutually exclusive. There should be no overlap between response items.

In binary or dichotomous questions, you’ll give respondents only two options to choose from.

  • White
  • Black or African American
  • American Indian or Alaska Native
  • Asian
  • Native Hawaiian or Other Pacific Islander

Ordinal variables include categories that can be ranked. Consider how wide or narrow a range you’ll include in your response items, and their relevance to your respondents.

Likert scale questions collect ordinal data using rating scales with 5 or 7 points.

When you have four or more Likert-type questions, you can treat the composite data as quantitative data on an interval scale . Intelligence tests, psychological scales, and personality inventories use multiple Likert-type questions to collect interval data.

With interval or ratio scales , you can apply strong statistical hypothesis tests to address your research aims.
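As a concrete illustration of treating combined Likert-type items as interval data, a composite score can be computed by summing item scores, with negatively worded items reverse-coded so a higher score always points the same way. A minimal sketch (the item names, the 5-point scale, and which items are reverse-coded are all hypothetical):

```python
# Combine 5-point Likert items into one composite score.
# Reverse-coded items are flipped so that a higher score always
# means stronger agreement with the trait being measured.

SCALE_MAX = 5  # hypothetical 5-point scale: 1 = strongly disagree

def composite_score(responses, reverse_coded=()):
    """Sum Likert item scores, reverse-coding the named items."""
    total = 0
    for item, score in responses.items():
        if item in reverse_coded:
            score = SCALE_MAX + 1 - score  # 1<->5, 2<->4, 3 stays 3
        total += score
    return total

# One respondent's answers to four hypothetical items:
answers = {"q1": 4, "q2": 2, "q3": 5, "q4": 1}
print(composite_score(answers, reverse_coded={"q2", "q4"}))  # 4+4+5+5 = 18
```

In practice the composite would usually be averaged or standardized before analysis; the summing step is the same either way.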

Pros and cons of closed-ended questions

Well-designed closed-ended questions are easy to understand and can be answered quickly. However, you might still miss important answers that are relevant to respondents. An incomplete set of response items may force some respondents to pick the closest alternative to their true answer. These types of questions may also miss out on valuable detail.

To solve these problems, you can make questions partially closed-ended, and include an open-ended option where respondents can fill in their own answer.

Open-ended questions

Open-ended, or long-form, questions allow respondents to give answers in their own words. Because there are no restrictions on their choices, respondents can answer in ways that researchers may not have otherwise considered. For example, respondents may want to answer “multiracial” for the question on race rather than selecting from a restricted list.

  • How do you feel about open science?
  • How would you describe your personality?
  • In your opinion, what is the biggest obstacle for productivity in remote work?

Open-ended questions have a few downsides.

They require more time and effort from respondents, which may deter them from completing the questionnaire.

For researchers, understanding and summarizing responses to these questions can take a lot of time and resources. You’ll need to develop a systematic coding scheme to categorize answers, and you may also need to involve other researchers in data analysis for high reliability .
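One common reliability check when multiple researchers code open-ended answers is inter-rater agreement, for example Cohen's kappa, which corrects raw agreement for agreement expected by chance. A minimal sketch (the category labels are hypothetical):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' category assignments."""
    n = len(coder_a)
    # Observed proportion of answers both coders categorized identically
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement expected from each coder's marginal frequencies
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two coders categorizing six hypothetical open-ended responses:
a = ["pay", "pay", "flex", "other", "flex", "pay"]
b = ["pay", "flex", "flex", "other", "flex", "pay"]
print(round(cohens_kappa(a, b), 2))  # 0.74
```

Values near 1 indicate strong agreement; low values suggest the coding scheme needs clearer category definitions before the full dataset is coded.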

Question wording can influence your respondents’ answers, especially if the language is unclear, ambiguous, or biased. Good questions need to be understood by all respondents in the same way (reliable) and measure exactly what you’re interested in (valid).

Use clear language

You should design questions with your target audience in mind. Consider their familiarity with your questionnaire topics and language and tailor your questions to them.

For readability and clarity, avoid jargon or overly complex language. Don’t use double negatives because they can be harder to understand.

Use balanced framing

Respondents often answer in different ways depending on the question framing. Positive frames are interpreted as more neutral than negative frames and may encourage more socially desirable answers.

Use a mix of both positive and negative frames to avoid research bias , and ensure that your question wording is balanced wherever possible.

Unbalanced questions focus on only one side of an argument. Respondents may be less likely to oppose the question if it is framed in a particular direction. It’s best practice to provide a counterargument within the question as well.

Avoid leading questions

Leading questions guide respondents towards answering in specific ways, even if that’s not how they truly feel, by explicitly or implicitly providing them with extra information.

It’s best to keep your questions short and specific to your topic of interest.

  • The average daily work commute in the US takes 54.2 minutes and costs $29 per day. Since 2020, working from home has saved many employees time and money. Do you favor flexible work-from-home policies even after it’s safe to return to offices?
  • Experts agree that a well-balanced diet provides sufficient vitamins and minerals, and multivitamins and supplements are not necessary or effective. Do you agree or disagree that multivitamins are helpful for balanced nutrition?

Keep your questions focused

Ask about only one idea at a time and avoid double-barreled questions. Double-barreled questions ask about more than one item at a time, which can confuse respondents.

This question could be difficult to answer for respondents who feel strongly about the right to clean drinking water but not high-speed internet. They might only answer about the topic they feel passionate about or provide a neutral answer instead – but neither of these options captures their true answer.

Instead, you should ask two separate questions to gauge respondents’ opinions.

  • Strongly agree
  • Agree
  • Undecided
  • Disagree
  • Strongly disagree

Do you agree or disagree that the government should be responsible for providing high-speed internet to everyone?


You can organize the questions logically, with a clear progression from simple to complex. Alternatively, you can randomize the question order between respondents.

Logical flow

Using a logical flow to your question order means starting with simple questions, such as behavioral or opinion questions, and ending with more complex, sensitive, or controversial questions.

The question order you use can significantly affect responses by priming respondents in specific directions. Question order effects, or context effects, occur when earlier questions influence the responses to later questions, reducing the validity of your questionnaire.

While demographic questions are usually unaffected by order effects, questions about opinions and attitudes are more susceptible to them.

  • How knowledgeable are you about Joe Biden’s executive orders in his first 100 days?
  • Are you satisfied or dissatisfied with the way Joe Biden is managing the economy?
  • Do you approve or disapprove of the way Joe Biden is handling his job as president?

It’s important to minimize order effects because they can be a source of systematic error or bias in your study.


Randomization involves presenting individual respondents with the same questionnaire but with different question orders.

When you use randomization, order effects will be minimized in your dataset. But a randomized order may also make it harder for respondents to process your questionnaire. Some questions may need more cognitive effort, while others are easier to answer, so a random order could require more time or mental capacity for respondents to switch between questions.
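Randomized ordering is typically implemented by shuffling the question list independently for each respondent; seeding the shuffle by respondent makes every ordering reproducible at analysis time. A minimal sketch (the question texts and the seeding scheme are hypothetical):

```python
import random

QUESTIONS = [
    "How satisfied are you with your commute?",
    "Do you favor flexible work-from-home policies?",
    "How many days per week do you work remotely?",
]

def questionnaire_for(respondent_id):
    """Return this respondent's question order, reproducibly."""
    rng = random.Random(respondent_id)  # per-respondent seed, replayable later
    order = QUESTIONS[:]                # copy so the master list stays fixed
    rng.shuffle(order)
    return order

# Different respondents generally see different orders, while the
# same respondent always sees the same order.
print(questionnaire_for(1))
print(questionnaire_for(2))
```

Storing the seed (or the realized order) alongside each response also lets you test for residual order effects during analysis.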

Step 1: Define your goals and objectives

The first step of designing a questionnaire is determining your aims.

  • What topics or experiences are you studying?
  • What specifically do you want to find out?
  • Is a self-report questionnaire an appropriate tool for investigating this topic?

Once you’ve specified your research aims, you can operationalize your variables of interest into questionnaire items. Operationalizing concepts means turning them from abstract ideas into concrete measurements. Every question needs to address a defined need and have a clear purpose.

Step 2: Use questions that are suitable for your sample

Create appropriate questions by taking the perspective of your respondents. Consider their language proficiency and available time and energy when designing your questionnaire.

  • Are the respondents familiar with the language and terms used in your questions?
  • Would any of the questions insult, confuse, or embarrass them?
  • Do the response items for any closed-ended questions capture all possible answers?
  • Are the response items mutually exclusive?
  • Do the respondents have time to respond to open-ended questions?

Consider all possible options for responses to closed-ended questions. From a respondent’s perspective, a lack of response options reflecting their point of view or true answer may make them feel alienated or excluded. In turn, they’ll become disengaged or inattentive to the rest of the questionnaire.

Step 3: Decide on your questionnaire length and question order

Once you have your questions, make sure that the length and order of your questions are appropriate for your sample.

If respondents are not being incentivized or compensated, keep your questionnaire short and easy to answer. Otherwise, your sample may be biased with only highly motivated respondents completing the questionnaire.

Decide on your question order based on your aims and resources. Use a logical flow if your respondents have limited time or if you cannot randomize questions. Randomizing questions helps you avoid bias, but it can take more complex statistical analysis to interpret your data.

Step 4: Pretest your questionnaire

When you have a complete list of questions, you’ll need to pretest it to make sure what you’re asking is always clear and unambiguous. Pretesting helps you catch any errors or points of confusion before performing your study.

Ask friends, classmates, or members of your target audience to complete your questionnaire using the same method you’ll use for your research. Find out if any questions were particularly difficult to answer or if the directions were unclear or inconsistent, and make changes as necessary.

If you have the resources, running a pilot study will help you test the validity and reliability of your questionnaire. A pilot study is a practice run of the full study, and it includes sampling, data collection, and analysis. You can find out whether your procedures are infeasible or susceptible to bias and make changes in time, but you can’t test a hypothesis with this type of study because it’s usually statistically underpowered.

If you want to know more about statistics , methodology , or research bias , make sure to check out some of our other articles with explanations and examples.

  • Student’s t-distribution
  • Normal distribution
  • Null and Alternative Hypotheses
  • Chi square tests
  • Confidence interval
  • Quartiles & Quantiles
  • Cluster sampling
  • Stratified sampling
  • Data cleansing
  • Reproducibility vs Replicability
  • Peer review
  • Prospective cohort study

Research bias

  • Implicit bias
  • Cognitive bias
  • Placebo effect
  • Hawthorne effect
  • Hindsight bias
  • Affect heuristic
  • Social desirability bias

A questionnaire is a data collection tool or instrument, while a survey is an overarching research method that involves collecting and analyzing data from people using questionnaires.

Closed-ended, or restricted-choice, questions offer respondents a fixed set of choices to select from. These questions are easier to answer quickly.

Open-ended or long-form questions allow respondents to answer in their own words. Because there are no restrictions on their choices, respondents can answer in ways that researchers may not have otherwise considered.

A Likert scale is a rating scale that quantitatively assesses opinions, attitudes, or behaviors. It is made up of 4 or more questions that measure a single attitude or trait when response scores are combined.

To use a Likert scale in a survey , you present participants with Likert-type questions or statements, and a continuum of items, usually with 5 or 7 possible responses, to capture their degree of agreement.

You can organize the questions logically, with a clear progression from simple to complex, or randomize their order between respondents. A logical flow helps respondents process the questionnaire more easily and quickly, but it may introduce bias. Randomization can minimize the bias from order effects.

Questionnaires can be self-administered or researcher-administered.

Researcher-administered questionnaires are interviews that take place by phone, in person, or online between researchers and respondents. You can gain deeper insights by clarifying questions for respondents or asking follow-up questions.

Cite this Scribbr article


Bhandari, P. (2023, June 22). Questionnaire Design | Methods, Question Types & Examples. Scribbr. Retrieved January 2, 2024, from https://www.scribbr.com/methodology/questionnaire/


A Model and Questionnaire of Professional Competencies of Iranian EFL Teachers at Public and Private Sectors and its Relation with their Students’ Achievement

  • Published: 30 August 2022
  • Volume 53, pages 551–568 (2022)
  • Authors: Mansooreh Hosseinnia (ORCID: orcid.org/0000-0002-7860-9161), Hamid Ashraf & Hossein Khodabakhshzadeh


Teachers’ professional competencies are essential abilities that underpin teachers’ success in performing their responsibilities. The major purpose of the present study is to construct a questionnaire of EFL teachers’ professional competencies in Iran and to examine the relationship between teachers’ professional competencies and their students’ achievement. To this end, all steps and stages of questionnaire development and validation were carried out. The scale consists of six main categories: personality factors, interpersonal factors, professional factors, factors related to teaching materials, learner factors, and assessment factors. The first draft of the scale consisted of 51 items. Confirmatory Factor Analysis (CFA) revealed that the questionnaire has high validity; to improve model fit and construct validity, one item with low factor loadings was omitted from the first draft. In addition, according to the findings, there were significant positive relationships between all six sub-constructs of teachers’ competencies and students’ achievement. Finally, the findings are discussed and implications are presented in the context of English language teaching. The findings contribute empirical evidence toward a framework for assessing and evaluating EFL teachers’ competencies in their profession.



This study was funded by the researchers.

Author information

Authors and Affiliations

Department of English, Torbat-E Heydarieh Branch, Islamic Azad University, Torbat-e Heydarieh, Iran

Mansooreh Hosseinnia, Hamid Ashraf & Hossein Khodabakhshzadeh


Corresponding author

Correspondence to Mansooreh Hosseinnia .

Ethics declarations

Conflict of interest.

The researchers have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Informed consent

Informed consent was obtained from all individual participants included in the study.




About this article

Cite this article

Hosseinnia, M., Ashraf, H. & Khodabakhshzadeh, H. A Model and Questionnaire of Professional Competencies of Iranian EFL Teachers at Public and Private Sectors and its Relation with their Students’ Achievement. Interchange 53 , 551–568 (2022). https://doi.org/10.1007/s10780-022-09471-7


Received : 23 December 2020

Accepted : 16 August 2022

Published : 30 August 2022

Issue Date : December 2022

DOI : https://doi.org/10.1007/s10780-022-09471-7


  • Professional competencies
  • Personality factors
  • Interpersonal factors
  • Professional factors
  • Factors related to teaching materials
  • Learner factors

Competency Assessment Questionnaire Interpretation Key*


Related Papers

Healthcare Quarterly


Jonathan Gosling

Jon Fallesen

Centre for Leadership …

Jonathan Gosling , Antonio Marturano

Edmond Yunis

Energy and Buildings

Pantelis Botsaris

Frontiers in neuroscience

Edward Malthouse

Musical preference is highly individualized and is an area of active study to develop methods for its quantification. Recently, preference-based behavior, associated with activity in brain reward circuitry, has been shown to follow lawful, quantifiable patterns, despite broad variation across individuals. These patterns, observed using a keypress paradigm with visual stimuli, form the basis for relative preference theory (RPT). Here, we sought to determine if such patterns extend to non-visual domains (i.e., audition) and dynamic stimuli, potentially providing a method to supplement psychometric, physiological, and neuroimaging approaches to preference quantification. For this study, we adapted our keypress paradigm to two sets of stimuli consisting of seventeenth to twenty-first century western art music (Classical) and twentieth to twenty-first century jazz and popular music (Popular). We studied a pilot sample and then a separate primary experimental sample with this paradigm, an...

Journal of Chemical Education

Martin Pitt

Philosophical transactions of the Royal Society of London. Series B, Biological sciences

Markus F. Peschl

Animal innovations range from the discovery of novel food types to the invention of completely novel behaviours. Innovations can give access to new opportunities, and thus enable innovating agents to invade and create novel niches. This in turn can pave the way for morphological adaptation and adaptive radiation. The mechanisms that make innovations possible are probably as diverse as the innovations themselves. So too are their evolutionary consequences. Perhaps because of this diversity, we lack a unifying framework that links mechanism to function. We propose a framework for animal innovation that describes the interactions between mechanism, fitness benefit and evolutionary significance, and which suggests an expanded range of experimental approaches. In doing so, we split innovation into factors (components and phases) that can be manipulated systematically, and which can be investigated both experimentally and with correlational studies. We apply this framework to a selection ...

JIMD reports

Guja Astrea

Mutations in the guanosine diphosphate mannose (GDP-mannose) pyrophosphorylase B (GMPPB) gene encoding a key enzyme of the glycosylation pathway have been described in families with congenital (CMD) and limb girdle (LGMD) muscular dystrophy with reduced alpha-dystroglycan (α-DG) at muscle biopsy.Patients typically display a combined phenotype of muscular dystrophy, brain malformations, and generalized epilepsy. However, a wide spectrum of clinical severity has been described ranging from classical CMD presentation to children with mild, yet progressive LGMD with or without intellectual disability. Cardiac involvement, including a long QT interval and left ventricular dilatation, has also been described in four cases.Two missense mutations in GMPPB gene, one novel and one already reported, have been identified in a 21-year-old man presenting with elevated CK (38,650 UI/L; normal values <150 UI/L) without overt muscle weakness. Major complaints included limb myalgia, exercise intol...


M. Elhefnawi , Ahmed Salem

Cytometry Part A


Journal of Infection Prevention

Shelley Gower

Journal of Nursing Education and Practice

Mona Cockerham

International Journal of Radiation Oncology*Biology*Physics

Publications of the Astronomical Society of Australia

Roberto Nesci

Materials Today: Proceedings

Faouzi Errachidi

IEEE Transactions on Instrumentation and Measurement

Priscilla Cushman

Indian Dermatology Online Journal

Anubhav Garg

Journal of Physics: Conference Series

Environmental and Experimental Botany

R. Albrizio

Clinical Rheumatology

The Journal of Physical Chemistry C

Mikhail Uimin

Aslib Proceedings

David E. Thornton

Kuman Gabriel

Annals of Nuclear Energy

Vitor Fernandes de Almeida

Julien Théry



Dissertations / Theses on the topic 'Competency-based assessments'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles.


Consult the top 50 dissertations / theses for your research on the topic 'Competency-based assessments.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
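The 'Add to bibliography' workflow described above (work metadata in, styled reference string out) can be sketched as follows; the field names and the two style templates are simplifying assumptions for illustration, not the site's actual implementation:

```python
# Hypothetical sketch of an "Add to bibliography" formatter: turn one work's
# metadata into a reference string in a chosen citation style. The metadata
# fields and templates here are illustrative assumptions only.

def format_reference(meta, style="APA"):
    if style == "APA":
        return (f"{meta['author']} ({meta['year']}). {meta['title']}. "
                f"{meta['publisher']}. {meta['url']}")
    if style == "MLA":
        return (f"{meta['author']}. \"{meta['title']}.\" "
                f"{meta['publisher']}, {meta['year']}, {meta['url']}.")
    raise ValueError(f"unsupported style: {style}")

# Example metadata, taken from the first thesis in the list below:
work = {
    "author": "Chelimo, Sheila",
    "year": "2018",
    "title": "Structural Validity of Competency Based Assessments",
    "publisher": "Ohio University / OhioLINK",
    "url": "http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1529504437498332",
}
print(format_reference(work, "APA"))
```

Real citation generators use style-language definitions (e.g. CSL) rather than hard-coded templates, but the in/out shape is the same.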

Chelimo, Sheila. "Structural Validity of Competency Based Assessments: An Approach to Curriculum Evaluation." Ohio University / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1529504437498332.

White, Melissa. "An intervention study to investigate development centres as an avenue to improve the self-efficacy of university graduates." University of Western Cape, 2020. http://hdl.handle.net/11394/7933.

Mhlongo, Nanikie Charity, and n/a. "Competency-Based assessment in Australia - does it work?" University of Canberra. Education and Community Studies, 2002. http://erl.canberra.edu.au./public/adt-AUC20050530.094237.

Curwood, Maurice Robert. "Competency-based training and assessment in the workplace /." Connect to thesis, 2004. http://eprints.unimelb.edu.au/archive/00001072.

Olivier, Marina. "The development of a model for the assessment of the subject entrepreneurship and business management at the N4 level using an outcomes based education approach." Thesis, Port Elizabeth Technikon, 2002. http://hdl.handle.net/10948/86.

Mothapo, Mocheko Edward. "Factors contributing to the implementation of Outcomes Based Assessment in Mankweng Circuit Primary Schools, Limpopo Province." Thesis, University of Limpopo (Turfloop Campus), 2011. http://hdl.handle.net/10386/536.

Brooks, Billy, Brian Martin, Paula Masters, and Robert Pack. "Tennessee Public Health Workforce Needs Assessment: A Competency-Based Approach." Digital Commons @ East Tennessee State University, 2013. https://dc.etsu.edu/etsu-works/3188.

McAllister, Sue Margery. "Competency based assessment of speech pathology students' performance in the workplace." University of Sydney, 2005. http://hdl.handle.net/2123/1130.

Wells, Elaine, and n/a. "ANCI Competencies: An Investigation of Uniqueness and Importance." Griffith University. School of Nursing, 2003. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20030527.132438.

Slamat, Jerome Albert. "Teachers, assessment and outcomes-based education: a philosophical enquiry." Thesis, Stellenbosch : University of Stellenbosch, 2009. http://hdl.handle.net/10019.1/1131.

Mahoney, Glenna. "Competency Assessment in Sexual Assault Nursing Practice| An Evidence-Based Approach." Thesis, Carlow University, 2013. http://pqdtopen.proquest.com/#viewpdf?dispub=3595809.

The purpose of this project was to develop and test a pilot competency assessment tool for sexual assault nurses. The content for the competency assessment was based on available evidence, primarily targeting current standards of sexual assault nurse examiner (SANE) practice. Descriptive statistics from a regional crime lab allowed the researcher to identify areas for improvement in the evidence-collection technique. This information was then used to develop the content of the competency assessment. A team of experts helped inform the development of an online competency assessment using a web-based platform. The competency assessment was tested on a small sample of sexual-assault nurse examiners. The instrument demonstrated a reasonable level of consistency and reliability (KR20 was 0.66) for an initial assessment. The aim of developing and testing an online instrument to serve as a baseline for establishing a valid and reliable competency assessment for sexual assault nurse examiners was achieved.
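The KR-20 value reported above (0.66) is the Kuder-Richardson 20 internal-consistency coefficient for dichotomously scored items. A minimal sketch of the computation, using made-up scores rather than the study's data:

```python
# Kuder-Richardson 20 (KR-20) reliability for 0/1-scored test items.
# The demo matrix is invented illustration data, not the study's.

def kr20(scores):
    """scores: list of respondents, each a list of 0/1 item results."""
    k = len(scores[0])                     # number of items
    n = len(scores)                        # number of respondents
    totals = [sum(row) for row in scores]  # each respondent's total score
    mean = sum(totals) / n
    var_total = sum((t - mean) ** 2 for t in totals) / n  # population variance
    pq_sum = 0.0
    for j in range(k):
        p = sum(row[j] for row in scores) / n  # proportion scoring item j correct
        pq_sum += p * (1 - p)
    return (k / (k - 1)) * (1 - pq_sum / var_total)

demo = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]  # 4 respondents, 3 items
print(kr20(demo))  # 0.75 for this toy matrix
```

Higher values indicate more consistent items; the 0.66 reported above is plausibly framed as "reasonable" for an initial pilot instrument.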

McAllister, Sue. "Competency based assessment of speech pathology students' performance in the workplace." Connect to full text, 2005. http://hdl.handle.net/2123/1130.

Ng, Wai-yan Vivian. "Impact of competency based assessment on teaching and learning of business subjects." Click to view the E-thesis via HKUTO, 2005. http://sunzi.lib.hku.hk/hkuto/record/B36255853.

Ng, Wai-yan Vivian, and 吳維欣. "Impact of competency based assessment on teaching and learning of business subjects." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://hub.hku.hk/bib/B36255853.

Brings, Stanley Dean. "Competency-based assessment techniques : evaluating the effectiveness of community college contract training /." view abstract or download file of text, 2003. http://wwwlib.umi.com/cr/uoregon/fullcit?p3095237.

Flinton, David Maurice. "Competency based assessment using virtual reality (VERT) : is it a realistic possibility?" Thesis, University of East London, 2015. http://roar.uel.ac.uk/5174/.

Hannah, Kerry. "A Qualitative Assessment of Professional Development in a Competency-Based Education Model." ScholarWorks, 2019. https://scholarworks.waldenu.edu/dissertations/7872.

Phuma, Ellemes Everret. "Development of neonatal nursing care clinical competency-based assessment tool for Nurse-midwife technicians in CHAM nursing colleges, Malawi." University of the Western Cape, 2015. http://hdl.handle.net/11394/5079.

Seamonson, Melissa C. "An analysis of authentic assessment in an informational technology networking course at WCTC." Menomonie, WI : University of Wisconsin--Stout, 2007. http://www.uwstout.edu/lib/thesis/2007/2007seamonsonm.pdf.

Tippett, Steven R. Palmer James C. "Student outcome assessment in physical therapy education." Normal, Ill. Illinois State University, 2001. http://wwwlib.umi.com/cr/ilstu/fullcit?p3006628.

Vilakazi, Lesson Ndiyase. "A study of teachers' assessment of learners' work and its influence on the culture of learning in schools." Pretoria : [s.n.], 2002. http://upetd.up.ac.za/thesis/available/etd-07282005-112302.

Liu, Jinghua. "The effect of performance-based assessment on eighth grade students mathematics achievement /." free to MU campus, to others for purchase, 2000. http://wwwlib.umi.com/cr/mo/fullcit?p9974655.

Klein, Colleen J. Padavil George. "Correlation of the competency outcomes performance assessment (COPA) model curriculum process with senior students' self-reported perceptions of nursing competence." Normal, Ill. : Illinois State University, 2006. http://proquest.umi.com/pqdweb?index=0&did=1276394541&SrchMode=1&sid=4&Fmt=2&VInst=PROD&VType=PQD&RQT=309&VName=PQD&TS=1202155104&clientId=43838.

Staley, Marsha L. "Barriers to the trainer-of-trainers' model as used by the Missouri Assessment program : one district's experience /." free to MU campus, to others for purchase, 2001. http://wwwlib.umi.com/cr/mo/fullcit?p3013027.

Harmse, Rudi Gerhard. "A conceptual object-oriented model to support educators in an outcomes-based environment." Thesis, Port Elizabeth Technikon, 2001. http://hdl.handle.net/10948/47.

Lumby, Gail. "Teaching towards outcomes and its effect on assessment practices in a language, literacy and communications classroom." Pretoria : [s.n.], 2006. http://upetd.up.ac.za/thesis/available/etd-02072007-235439.

Ortiz, José Agustín. "Critical factors for universities teaching under a competency-based model." En Blanco y Negro, 2015. http://repositorio.pucp.edu.pe/index/handle/123456789/117129.

Rekman, Janelle. "The Development of a Workplace-Based Surgical Clinic Assessment Tool." Thesis, Université d'Ottawa / University of Ottawa, 2016. http://hdl.handle.net/10393/34234.

Booi, Kwanele. "The implications of the introduction of outcomes based education in the natural sciences curriculum at Cape College of Education: the assessment of perceptions of squatter camp teachers in Khayelitsha towards the outcomes based education." Thesis, Rhodes University, 2000. http://hdl.handle.net/10962/d1003451.

DiGiacomo, Karen. "Program Evaluation of a Competency-Based Online Model in Higher Education." ScholarWorks, 2017. https://scholarworks.waldenu.edu/dissertations/3938.

Duke, Amy McGowan. "Performance-based assessment within a balanced literacy framework an analysis of teacher perceptions and implementation in elementary classrooms /." Click here to access dissertation, 2007. http://www.georgiasouthern.edu/etd/archive/spring2007/amy_m_duke/duke_amy_m_200708_edd.pdf.

Masigan, Peterson. "Competency-based assessment in clinical high-fidelity simulation : a survey of methods used in undergraduate nursing." Thesis, University of British Columbia, 2015. http://hdl.handle.net/2429/54980.

Wilmot, Pamela Dianne. "Teachers as recontextualisers: a case study analysis of outcomes-based assessment policy implementation in two South African schools." Thesis, Rhodes University, 2006. http://hdl.handle.net/10962/d1003677.

De, Bruler Curran A. "Assessment, knowledge and the curriculum : the effects of a competence-based approach to the training of teachers in further and adult education." Thesis, n.p, 2001. http://dart.open.ac.uk/abstracts/page.php?thesisid=131.

Motsenbocker, Pamela S. "A Comparative Analysis of Competency-Based versus Traditional Assessment with Respect to Academic Performance and Feedback Processes." Thesis, Concordia University Chicago, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10747435.

The purpose of this study was to compare the traditional grading and feedback systems used in most classrooms to a competency-based grading and feedback system. The traditional system used the familiar grading system of A, B, C, D and F applied to assignments. The competency model was based on providing students formative and summative feedback regarding their achievement toward proficiency of specific skills and concepts.

This quasi-experimental action research study had a control group and an intervention group comprised of general education and special education sixth grade students in language arts classes. Quantitative data in the form of student achievement scores and student survey responses was analyzed. Qualitative data in the form of teacher interview responses was analyzed.

Overall there was no statistically significant change in the MAP reading scores between the control and intervention group. However, when the variables of time, group and gender from an ANOVA were analyzed, the males in the intervention group showed a statistically significant increase in achievement. This increase held regardless of whether the male student had an IEP. Overall, the results do not show that either the control or intervention group sees the feedback as effective. However, the girls' responses in the control group were statistically significant: the girls in the control group did see the provided feedback as effective. The teachers' interview responses provided three main themes, including that students applied feedback more in the competency-based classroom than in the traditional classroom. Both teachers used the feedback to adjust curriculum and instruction. Additionally, the intervention teacher pointed out that more time is needed to help students apply the competency-based system.

Based on this study, the first recommendation is to implement competency-based grading and feedback processes. The second recommendation is that formative and summative feedback processes based on proficiencies be implemented to assist students in identifying their understanding of and performance on skills and concepts. Recommendations for future studies include having a larger sample size and continuing the study for a longer period of time.

Kruger, Sandra Carolina. "The use of rubrics in the assessment of social sciences (history) in the get band in transformational outcomes-based education." Thesis, Cape Peninsula University of Technology, 2007. http://hdl.handle.net/20.500.11838/1910.

Swartz, Jennifer-Hellen. "Reconceptualising assessment practices in South African schools : making an argument for critical action /." Thesis, Link to the online version, 2006. http://hdl.handle.net/10019/1591.

Wright, Julie. "Implementation of project based learning in a training package context." RMIT University. Education, 2008. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20080729.165211.

Mortensen, Mark H. "An Assessment of Learning Outcomes of Students Taught a Competency-Based Computer Course in an Electronically-Expanded Classroom." Thesis, University of North Texas, 1995. https://digital.library.unt.edu/ark:/67531/metadc277899/.

Ramoroka, Noko Jones. "Educators' understanding of the premises underpinning outcomes-based education and its impact on their classroom assessment practices." Pretoria : [s.n.], 2006. http://upetd.up.ac.za/thesis/available/etd-04052007-185249/.

Mellroth, Elisabet. "High achiever! Always a high achiever? : A comparison of student achievements on mathematical tests with different aims and goals." Licentiate thesis, Karlstads universitet, Institutionen för matematik och datavetenskap, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-34516.

Van, Wyk Milton Lester. "Die leerderportefeulje as 'n assesseringsinstrument in die leerarea sosiale wetenskappe, intermediêre fase (Grade 4-6)." Thesis, Stellenbosch : Stellenbosch University, 2007. http://hdl.handle.net/10019.1/19885.

Solomons, Inez Denise. "A conceptual exploration of the teaching and assessment of values within the South African Outcomes-Based curriculum /." Thesis, Online access, 2009. http://etd.uwc.ac.za/usrfiles/modules/etd/docs/etd_gen8Srv25Nme4_3915_1277409913.pdf.

Naicker, Sigamoney Manicka. "An investigation into the implementation of outcomes based education in the Western Cape Province." Thesis, University of the Western Cape, 2000. http://etd.uwc.ac.za/index.php?module=etd&action=viewtitle&id=gen8Srv25Nme4_5229_1181560156.

Bragg, John M. (John Morris) 1949. "The Effect of Remediation on Students Who Have Failed the TEAMS Minimum Competency Test." Thesis, University of North Texas, 1988. https://digital.library.unt.edu/ark:/67531/metadc330810/.

Lombard, Elsa Helena. "Identifying the need for the development of an instrument to determine senior phase teachers' science-assessment competence." Thesis, Port Elizabeth Technikon, 2002. http://hdl.handle.net/10948/100.

Mtetwa, Albert Charles. "Has it happened in Mpumalanga? An evaluation of the implementation of Curriculum 2005." Diss., Pretoria : [s.n.], 2003. http://upetd.up.ac.za/thesis/available/etd-03042004-141957/.

Sanguinetti, Jill. "Within and Against Performativity: Discursive Engagement in Adult Literacy and Basic Education." Deakin University. Information not given, 1999. http://tux.lib.deakin.edu.au./adt-VDU/public/adt-VDU20040615.103017.

Van, Rensburg Gail Janse. "The development of a modularised curriculum for computer competency courses for technikon learners / Gail Janse van Rensburg." Thesis, Potchefstroom University for Christian Higher Education, 2003. http://hdl.handle.net/10394/516.

Snyman, Margaretha Alberta. "Assessment of professional behaviour in occupational therapy education: investigating assessors’ understanding of constructs and expectations of levels of competence." Thesis, Stellenbosch : Stellenbosch University, 2012. http://hdl.handle.net/10019.1/20037.

Orion Group Holdings: Top Construction Stock To Consider In 2024


  • Orion Group Holdings has achieved a quarterly gross profit of $19.1 million, a 42.5% increase YoY.
  • The company is focusing on improving profitability and has a high backlog of $920 million, exceeding its FY 2022 revenue.
  • Orion Group is divesting its Central Texas business and focusing on Dallas and Houston, which offer higher revenue potential.


Construction company Orion Group Holdings ( NYSE: ORN ) grew its quarterly gross profit to $19.1 million, a 42.5% (YoY) increase from the $13.4 million realized in Q3 2022. The company's share price has surged 110.21% (YoY), indicating a positive turnaround into 2024, boosted by lower cost of revenues and operational expenses.

Orion's focus on improving its profitability has seen it solidify its concrete business, not just on an adjusted-EBITDA basis but also on quality revenue and higher operating income. The company has lined up various marine projects that will accelerate its overall momentum, having won building/design contracts for high-end businesses into 2024. I am particularly excited about the high backlog, which stood at $920 million (as of Q3 2023), greater than the company's annual revenue of $748.3 million in 2022.

Business Value Proposition

Orion’s concrete business has been steadily increasing since Q1 2023, and it recorded sales of $50.8 million by Q3 2023 against an overall gross profit of $19.1 million. The gross margin in the segment grew by 390 basis points, helped by low equipment costs and other operational efficiencies, especially in its dredging business, which incurred low labor-utilization costs. In its Q3 2023 financial call, Orion explained that it was exiting its Central Texas business and focusing on Dallas and Houston, where it expects higher-quality revenue.

First, Dallas and Houston are what I would term "hot spots" for diverse business centers and energy-transition efforts, respectively. For instance, a report shows that over the years Houston has registered more than "60 new low-carbon, climate startups and an innovation ecosystem for energy transition investments." Further, it is not only the 4th-biggest US city but also houses at least "26 of the Fortune 500 companies." For its part, Dallas has great growth potential, providing a home to more than 65,000 businesses, including international and small businesses, and offering multiple employment prospects.

The company also announced entry into the Bahamas, where it was awarded a design-build turnkey contract for the “Grand Bahamas Shipyard Dry Dock Replacement project” valued at $100 million. The project’s conclusion is scheduled for Q4 2025 and will be done in conjunction with subcontractors from the Bahamas. One important aspect of the Bahamas is that the business environment in Q2 and Q3 of 2023 favored commercial and public developments over private spaces, while permit values/numbers continued to increase. Other than Grand Bahama, there are also the New Providence and Family Islands, which will also provide important business to Orion (as seen in the construction statistics below).

[Chart: Bahamas government construction statistics, showing growing construction valuation in the Bahamas]

There was a 100% (YoY) increase in the number of projects completed within the commercial/industrial space in the Bahamas, against a 7.54% (YoY) decrease in the number of private/housing projects between Q2 2022 and Q2 2023. In turn, the value of public projects in the same period grew by 2,172.73% (YoY).

Market-level competency

Orion Group also announced a recent $121 million contract award spanning both the concrete and marine space, indicating its increasing skill and competency and giving it a competitive edge in the market. Up to Q3 2023, Orion’s total backlog and project contracts were valued at $920 million, which is not only more than 5 times the company’s market capitalization (about $160.5 million) but also more than $170 million above the FY 2022 revenue of $748.3 million.

Of special interest is the recent mention of an upcoming dredging contract from the Army Corps of Engineers. The projects associated with this government agency are mostly water infrastructure, including dams and levees (mostly in the civil works segment). In 2022, the Army Corps was authorized (through the Water Resources Development Act of 2022) to use transaction agreements, which allow the agency to contract for these projects in the absence of “procurement/ cooperative agreements of grants.”

Under such a transaction agreement, Orion will have the chance to offer its expertise on various concrete mixes, helping the Army Corps with its environmental-protection initiative. I view this strategy as an opportunity for Orion to explore new areas of collaboration to raise its revenue and expand its area of operation. With $50.8 million in bid wins in the quarter and Q3 2023 revenue at $169 million against diluted earnings per share of $0.02, Orion is on the path to increasing its profitability into 2024.

Risk to the Business

Low cash availability

Orion’s cash stands at $3.9 million, representing a 56.18% (QoQ) decline from the $8.9 million recorded in Q2 2023. However, this cash balance shows a 44.4% (YoY) increase from the $2.7 million realized in Q3 2022. Still, this cash is lower than the total debt balance, which sits at $125.3 million, a 22.24% (QoQ) increase from the $102.5 million recorded in Q2 2023. As seen, Orion's main form of financing has been debt, which has eroded the company's cash position.

Future opportunities to consider

In my view, Orion Group is a very attractive stock, considering it is trading under $5 with contracts and a solid backlog of almost $1 billion into 2024. Safe to say, it may be a target for major construction players, considering its renewed focus on the concrete market. Market leader Granite Construction Incorporated ( GVA ) recently completed the acquisition of Coast Mountain Resources for just $26.93 million without any material effect on its balance sheet. An acquisition of Orion Group by GVA would, in my view, push the stock to at least $7. GVA's revenue in the 3 months ending September 30, 2023, stood at $1.117 billion. Of this amount, about $945 million was attributed to construction, while materials took up $171.1 million. Acquiring Orion would bring on board the marine segment, thereby expanding GVA's revenue base.

Orion’s forward enterprise value (EV) to sales stands at 0.40 against the industry average of 1.82 (representing a difference of -78.03%). Additionally, ORN’s forward price-to-sales ratio is 0.23 against the sector average of 1.45 (indicating a difference of -84.25%). These metrics show that ORN is highly undervalued with an upside potential above 75% into 2024.
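As a sanity check, the YoY growth and valuation-discount figures quoted in this article follow from simple percentage-change ratios (the inputs below are the article's rounded values, so the last decimal can differ slightly from the quoted figures):

```python
# Reproducing the article's percentage figures from its rounded inputs.

def pct_change(new, old):
    """Percentage change from old to new."""
    return (new - old) / old * 100

gp_growth = pct_change(19.1, 13.4)     # gross profit, Q3 2022 -> Q3 2023: ~42.5%
ev_sales_gap = pct_change(0.40, 1.82)  # forward EV/sales vs. industry: ~-78%
ps_gap = pct_change(0.23, 1.45)        # forward P/S vs. sector: ~-84%
print(round(gp_growth, 1), round(ev_sales_gap, 1), round(ps_gap, 1))
```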

Bottom Line

Despite the low cash-to-debt balance, Orion Group is a buy, considering its growing revenue and contractual obligations within and outside the US. The company's backlog is about $920 million into 2024, higher than its FY 2022 revenue. I have also considered the acquisition aspect of this stock, with the valuation showing an upside potential of at least 75%.

This article was written by Stella Mwende.

Analyst’s Disclosure: I/we have a beneficial long position in the shares of ORN either through stock ownership, options, or other derivatives. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.

Seeking Alpha's Disclosure: Past performance is no guarantee of future results. No recommendation or advice is being given as to whether any investment is suitable for a particular investor. Any views or opinions expressed above may not reflect those of Seeking Alpha as a whole. Seeking Alpha is not a licensed securities dealer, broker or US investment adviser or investment bank. Our analysts are third party authors that include both professional investors and individual investors who may not be licensed or certified by any institute or regulatory body.




  1. Assessing Students Competencies and Learning in Master Thesis Projects: Towards an Integrated Evaluation Approach


  2. PDF Assessing Students Competencies and Learning

    preliminary results of the students' competencies survey. Competencies survey: the study design was thought of as a cross-sectional survey of postgraduate students who were completing a thesis as part of their coursework master's degree in three knowledge field areas: master in marketing and sales, master in

  3. PDF Questionnaire evaluating teaching competencies in the university ...

    Questionnaire evaluating teaching competencies in the university environment. Evaluation of teaching competencies in the university Juan Antonio Moreno-Murcia1*, ... and Carrascosa (2005) created a questionnaire made up of 25 items broken down into four dimensions: interaction with the students, methodology, teaching obligations and evaluation,

  4. The role of academic competences and learning processes in predicting

    Writing a thesis requires competences such as analysing information. • Students' organising skills should be supported especially in the Bachelor phase. • The role of thesis grade as indicator of study success should be critically evaluated. Abstract

  5. PDF Towards a framework for assessing teacher competence

    Competency (plural competencies) is a narrower, more atomistic concept used to label particular abilities (see also McConnell, 2001). Based on a study of dozens of definitions of competence (e.g. Bunk, 1994; Spencer and Spencer, 1993; Parry, 1996), Mulder ...

  6. PDF Guide to the Competency-based Learning Survey for Students

    1 Survey constructs, modules, and items of the Competency-based Learning Survey for Students 4 2 Comparison of select survey items adapted for use in New Hampshire and Maine 6 B1 Number and percentage of students selecting each statement as most true when asked why their school uses a competency-based grading system, 2014/15 B-1

  7. Questionnaire to Evaluate the Competency in Evidence‐Based ...

    EBP-COQ Prof© is a valid, reliable, and easily administered questionnaire that measures the self-perceived competency of registered nurses in EBP based on an updated and specific competency framework.


    Abstract: Preparing students for the real world of work is a vital responsibility of higher education institutions. To determine their preparedness, assessment tools are necessary to verify whether...

  9. Questionnaire evaluating teaching competencies in the university

    Although there is no consensus on teachers' professional competencies, it is still possible to summarize teachers' professional competencies as a set of different abilities, skills,...

  10. Competency Definitions, Development and Assessment: A Brief Review

    Competencies have been used as valid predictors of superior on-the-job performance in business organizations over the last 40 years. An abundance of empirical evidence has suggested that...

  11. PDF Assessment of the Writing Competency Requirement

    of college-level writing, particularly with respect to the ability to formulate and develop a thesis. In addition, a significant proportion of FP students show weaknesses in their ability to support ... While the Writing Competency requirement includes writing in the First-Year Preceptorial, FP writing had previously been assessed independently ...

  12. Questionnaire Design

    Revised on June 22, 2023. A questionnaire is a list of questions or items used to gather data from respondents about their attitudes, experiences, or opinions. Questionnaires can be used to collect quantitative and/or qualitative information. Questionnaires are commonly used in market research as well as in the social and health sciences.

  13. (PDF) Managerial Competencies and Organizations Performance

    January 2015. Authors: Ruba Osama Hawi, Dina Alkhodary, Tareq Hashem (Applied Science Private University). Abstract: This study explores the link between the managerial competencies and the...

  14. Teacher Competencies and Its Relation to Academic Performance of The

    It is the principal instrument of data collection. The researchers used a checklist type of survey questions for the third part of the questionnaire, seeking the respondents' assessment of the teaching competencies of their MAPEH teachers. The questionnaire consists of three parts. The first part is the introduction of the study.

  15. PDF Grammatical Competence of Junior High School Students

    The study determined the level of grammatical competence of 177 Junior High School students and the design and development of a supplementary learning material to enhance the grammatical competence of the students along subject-verb agreement. The study revealed that students favored textbooks as their preferred reading material at home.

  16. PDF Measuring Digital Competence and ICT Literacy: An Exploratory ...

    Those key competencies identified by the European Parliament and of the Council (2006) include: (1) communication in the mother tongue, (2) communication in foreign languages, (3) mathematical competence and basic competences in science and technology, (4) digital competence, (5) learning to learn, (6) social and civic competences, (7) a sense of in...

  17. A Model and Questionnaire of Professional Competencies of ...

    Teachers' professional competencies are essential abilities which assure the success of teachers in performing their responsibilities. The major purpose of the present study is to construct an EFL teachers' professional competencies Questionnaire in Iran and to examine the relationship between teachers' professional competencies and their students' achievement. With this aim, all the ...

  18. (Pdf) Hospitality Management Competencies: Identifying Graduates

    Table 1. Competencies for Hospitality Graduates (No. / Competency / Percentage %): 1. Communication skills (oral, professional writing and email etiquette): 100; 2. Delivering exceptional and...

  19. Competency Assessment Questionnaire Interpretation Key*

    A competency is the knowledge, skill, ability and/or enabling behavior required to effectively perform work. The competency profile for each job describes the particular knowledge, skills, abilities, enabling behaviors and level of performance required to do that job.

  20. Dissertations / Theses: 'Development of competencies'

    The Survey of Competencies for Teaching an Online Course, a 23-item instrument designed by the researcher, was mailed to 28 distance education administrators with membership to the Florida Distance Learning Consortium (FDLC) and 100 faculty teaching mathematics or statistics online during spring term 2006.

  21. Dissertations / Theses: 'Staff competencies'

    "Wright's Competency Model and Quality and Safety Competencies." Text, ScholarWorks, 2019. https://scholarworks.waldenu.edu/dissertations/6667. APA, Harvard, Vancouver, ISO, and other styles Abstract: Competent nurses are instrumental in assuring that a patient receives safe patient care of the highest quality.

  22. PDF ESLP 82 Questionnaire: Self-Assessment of English Writing Skills and

    ESLP 82 Questionnaire: Self-Assessment of English Writing Skills and Use of Writing Strategies. Please rate your abilities for each item below on a scale from 1 to 5. Circle your choice. 1 = never or almost never true of me; 2 = usually not true of me; 3 = somewhat true of me; 4 = usually true of me; 5 = always or almost always true of me

  23. Dissertations / Theses: 'Competency-based assessments'

    Competency-Based Assessment (CBA) is an integral part of CBT that needs particular attention if the new system is to succeed. The key aims of this thesis are to investigate the current assessment policy and practice at the Canberra Institute of Technology (CIT) underpinned by the Competency-Based Training system.

  24. Orion Group Holdings: Top Construction Stock To Consider In 2024

    Orion Group also announced a recent $121 million contract award in both the concrete and marine space, indicating its increasing skill & competency giving it a competitive edge in the market. Up ...