Eberly Center

Creating and Using Rubrics

A rubric is a scoring tool that explicitly describes the instructor’s performance expectations for an assignment or piece of work. A rubric identifies:

  • criteria: the aspects of performance (e.g., argument, evidence, clarity) that will be assessed
  • descriptors: the characteristics associated with each dimension (e.g., argument is demonstrable and original, evidence is diverse and compelling)
  • performance levels: a rating scale that identifies students’ level of mastery within each criterion  

Rubrics can be used to provide feedback to students on diverse types of assignments, from papers, projects, and oral presentations to artistic performances and group projects.

Benefitting from Rubrics

Rubrics help instructors:

  • reduce the time spent grading by allowing them to refer to a substantive description without writing long comments
  • more clearly identify strengths and weaknesses across an entire class and adjust their instruction appropriately
  • ensure consistency across time and across graders
  • reduce the uncertainty which can accompany grading
  • discourage complaints about grades

Rubrics help students:

  • understand instructors’ expectations and standards
  • use instructor feedback to improve their performance
  • monitor and assess their progress as they work towards clearly indicated goals
  • recognize their strengths and weaknesses and direct their efforts accordingly

Examples of Rubrics

Here we provide a sample set of rubrics designed by faculty at Carnegie Mellon and other institutions. Although your particular field of study or type of assessment may not be represented, viewing a rubric designed for a similar assessment may give you ideas for the kinds of criteria, descriptions, and performance levels to use on your own rubric.

Papers

  • Example 1: Philosophy Paper This rubric was designed for student papers in a range of courses in philosophy (Carnegie Mellon).
  • Example 2: Psychology Assignment Short, concept-application homework assignment in cognitive psychology (Carnegie Mellon).
  • Example 3: Anthropology Writing Assignments This rubric was designed for a series of short writing assignments in anthropology (Carnegie Mellon).
  • Example 4: History Research Paper This rubric was designed for essays and research papers in history (Carnegie Mellon).

Projects

  • Example 1: Capstone Project in Design This rubric describes the components and standards of performance from the research phase to the final presentation for a senior capstone project in design (Carnegie Mellon).
  • Example 2: Engineering Design Project This rubric describes performance standards for three aspects of a team project: research and design, communication, and teamwork.

Oral Presentations

  • Example 1: Oral Exam This rubric describes a set of components and standards for assessing performance on an oral exam in an upper-division course in history (Carnegie Mellon).
  • Example 2: Oral Communication This rubric is adapted from Huba and Freed, 2000.
  • Example 3: Group Presentations This rubric describes a set of components and standards for assessing group presentations in history (Carnegie Mellon).

Class Participation/Contributions

  • Example 1: Discussion Class This rubric assesses the quality of student contributions to class discussions. This is appropriate for an undergraduate-level course (Carnegie Mellon).
  • Example 2: Advanced Seminar This rubric is designed for assessing discussion performance in an advanced undergraduate or graduate seminar.

See also " Examples and Tools " section of this site for more rubrics.

Center for Teaching Innovation

Using rubrics

A rubric is a type of scoring guide that assesses and articulates specific components and expectations for an assignment. Rubrics can be used for a variety of assignments: research papers, group projects, portfolios, and presentations.  

Why use rubrics? 

Rubrics help instructors: 

  • Assess assignments consistently from student to student. 
  • Save time in grading, both short-term and long-term. 
  • Give timely, effective feedback and promote student learning in a sustainable way. 
  • Clarify expectations and components of an assignment for both students and course teaching assistants (TAs). 
  • Refine teaching methods by evaluating rubric results. 

Rubrics help students: 

  • Understand expectations and components of an assignment. 
  • Become more aware of their learning process and progress. 
  • Improve work through timely and detailed feedback. 

Considerations for using rubrics 

When developing rubrics consider the following:

  • Although it takes time to build a rubric, time will be saved in the long run as grading and providing feedback on student work will become more streamlined.  
  • A rubric can be a fillable PDF that can easily be emailed to students. 
  • They can be used for oral presentations. 
  • They are a great tool to evaluate teamwork and individual contribution to group tasks. 
  • Rubrics facilitate peer-review by setting evaluation standards. Have students use the rubric to provide peer assessment on various drafts. 
  • Students can use them for self-assessment to improve personal performance and learning. Encourage students to use the rubrics to assess their own work. 
  • Motivate students to improve their work by using rubric feedback to resubmit their work incorporating the feedback. 

Getting Started with Rubrics 

  • Start small by creating one rubric for one assignment in a semester.  
  • Ask colleagues if they have developed rubrics for similar assignments, or adapt rubrics that are available online. For example, the AAC&U has VALUE rubrics for topics such as written and oral communication, critical thinking, and creative thinking. RubiStar helps you develop your rubric from templates.  
  • Examine an assignment for your course. Outline the elements or critical attributes to be evaluated (these attributes must be objectively measurable). 
  • Create an evaluative range for performance quality under each element; for instance, “excellent,” “good,” “unsatisfactory.” 
  • Avoid using subjective or vague criteria such as “interesting” or “creative.” Instead, outline objective indicators that would fall under these categories. 
  • The criteria must clearly differentiate one performance level from another. 
  • Assign a numerical scale to each level (see the sketch after this list). 
  • Give a draft of the rubric to your colleagues and/or TAs for feedback. 
  • Train students to use your rubric and solicit feedback. This will help you judge whether the rubric is clear to them and will identify any weaknesses. 
  • Rework the rubric based on the feedback. 
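
If you keep your rubric in a spreadsheet or script your grading, the steps above map onto a small data structure: criteria, an evaluative range of labeled levels, and a numerical scale attached to each level. Here is a minimal sketch in Python; the criterion names, level labels, and point values are illustrative, not prescribed by any particular rubric.

```python
# A hypothetical analytic rubric: each level label carries a numeric score,
# and each criterion carries a short descriptor.
LEVELS = {"excellent": 4, "good": 3, "marginal": 2, "unsatisfactory": 1}

RUBRIC = {
    "thesis": "Clear, arguable thesis stated in the introduction.",
    "evidence": "Claims are supported with relevant, cited sources.",
    "organization": "Ideas progress logically with effective transitions.",
}

def score(ratings):
    """Total one student's assignment given one level per criterion."""
    missing = RUBRIC.keys() - ratings.keys()
    if missing:
        raise ValueError(f"unrated criteria: {sorted(missing)}")
    return sum(LEVELS[level] for level in ratings.values())

print(score({"thesis": "good",
             "evidence": "excellent",
             "organization": "marginal"}))  # 9 out of a possible 12
```

Keeping levels and criteria as plain data also makes the later steps easier: sharing a draft with colleagues or students is just sharing the two tables.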

Assessment and Curriculum Support Center

Creating and using rubrics.

Last Updated: 4 March 2024.

On this page:

  • What is a rubric?
  • Why use a rubric?
  • What are the parts of a rubric?
  • Developing a rubric
  • Sample rubrics
  • Scoring rubric group orientation and calibration
  • Suggestions for using rubrics in courses
  • Equity-minded considerations for rubric development
  • Tips for developing a rubric
  • Additional resources & sources consulted

Note: The information and resources contained here serve only as a primer to the exciting and diverse perspectives in the field today. This page will be continually updated to reflect shared understandings of equity-minded theory and practice in learning assessment.

1. What is a rubric?

A rubric is an assessment tool often shaped like a matrix, which describes levels of achievement in a specific area of performance, understanding, or behavior.

There are two main types of rubrics:

Analytic Rubric : An analytic rubric specifies at least two characteristics to be assessed at each performance level and provides a separate score for each characteristic (e.g., a score on “formatting” and a score on “content development”).

  • Advantages: provides more detailed feedback on student performance; promotes consistent scoring across students and between raters
  • Disadvantages: more time consuming than applying a holistic rubric

Use an analytic rubric when:

  • You want to see strengths and weaknesses.
  • You want detailed feedback about student performance.

Holistic Rubric: A holistic rubric provides a single score based on an overall impression of a student’s performance on a task.

  • Advantages: quick scoring; provides an overview of student achievement; efficient for large group scoring
  • Disadvantages: does not provide detailed information; not diagnostic; may be difficult for scorers to decide on one overall score

Use a holistic rubric when:

  • You want a quick snapshot of achievement.
  • A single dimension is adequate to define quality.
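
In data terms, the difference between the two formats is simply a set of per-characteristic scores versus one overall score. A minimal sketch, with invented characteristic names and values:

```python
# Structural difference between the two rubric formats (invented data).
analytic_result = {"formatting": 3, "content_development": 4}  # score per characteristic
holistic_result = 4                                            # single overall impression

print("analytic, per criterion:", analytic_result)
print("analytic total:", sum(analytic_result.values()))  # 7
print("holistic score:", holistic_result)
```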

2. Why use a rubric?

  • A rubric creates a common framework and language for assessment.
  • Complex products or behaviors can be examined efficiently.
  • Well-trained reviewers apply the same criteria and standards.
  • Rubrics are criterion-referenced, rather than norm-referenced. Raters ask, “Did the student meet the criteria for level 5 of the rubric?” rather than “How well did this student do compared to other students?”
  • Using rubrics can lead to substantive conversations among faculty.
  • When faculty members collaborate to develop a rubric, it promotes shared expectations and grading practices.

Faculty members can use rubrics for program assessment. Examples:

Example 1: The English Department collected essays from students in all sections of English 100. A random sample of essays was selected, and a team of faculty members evaluated the essays by applying an analytic scoring rubric. Before applying the rubric, they “normed”; that is, they agreed on how to apply the rubric by scoring the same set of essays and discussing them until consensus was reached (see below: “6. Scoring rubric group orientation and calibration”).

Example 2: Biology laboratory instructors agreed to use a “Biology Lab Report Rubric” to grade students’ lab reports in all Biology lab sections, from 100- to 400-level. At the beginning of each semester, instructors met, discussed sample lab reports, and agreed on how to apply the rubric and on their expectations for an “A,” “B,” “C,” etc., report in 100-level, 200-level, and 300- and 400-level lab sections. Every other year, a random sample of students’ lab reports is selected from 300- and 400-level sections. Each of those reports is then scored by a Biology professor, and the score given by the course instructor is compared to the score given by the Biology professor. The scores are also reported as part of the program’s assessment report. In this way, the program determines how well it is meeting its outcome, “Students will be able to write biology laboratory reports.”
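
The comparison step in the Biology example lends itself to a quick agreement check once both sets of scores are recorded. Below is a sketch of one simple way to quantify it, using invented scores; a fuller analysis would use a chance-corrected statistic such as Cohen’s kappa.

```python
# Each sampled lab report gets two rubric scores: one from the course
# instructor and one from the reviewing Biology professor (invented data).
instructor = [4, 3, 3, 2, 4, 3]
professor  = [4, 3, 2, 2, 4, 4]

pairs = list(zip(instructor, professor))
exact = sum(a == b for a, b in pairs) / len(pairs)
close = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)

print(f"exact agreement:  {exact:.0%}")   # 67%
print(f"within one level: {close:.0%}")   # 100%
```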

3. What are the parts of a rubric?

Rubrics are composed of four basic parts. In its simplest form, the rubric includes:

  • A task description. The outcome being assessed or instructions students received for an assignment.
  • The characteristics to be rated (rows). The skills, knowledge, and/or behavior to be demonstrated.
  • Levels of mastery/scale (columns). Commonly used label sets include:
      • Beginning, approaching, meeting, exceeding
      • Emerging, developing, proficient, exemplary
      • Novice, intermediate, intermediate high, advanced
      • Beginning, striving, succeeding, soaring
  • A description of each characteristic at each level of mastery/scale (cells). Also called a “performance description.” Explains what a student will have done to demonstrate they are at a given level of mastery for a given characteristic.

4. Developing a rubric

Step 1: Identify what you want to assess

Step 2: Identify the characteristics to be rated (rows). These are also called “dimensions.”

  • Specify the skills, knowledge, and/or behaviors that you will be looking for.
  • Limit the characteristics to those that are most important to the assessment.

Step 3: Identify the levels of mastery/scale (columns).

Tip: Aim for an even number (4 or 6) because when an odd number is used, the middle tends to become the “catch-all” category.

Step 4: Describe each level of mastery for each characteristic/dimension (cells).

  • Describe the best work you could expect using these characteristics. This describes the top category.
  • Describe an unacceptable product. This describes the lowest category.
  • Develop descriptions of intermediate-level products for intermediate categories.
Important: Each description and each characteristic should be mutually exclusive.

Step 5: Test rubric.

  • Apply the rubric to an assignment.
  • Share with colleagues.
Tip: Faculty members often find it useful to establish the minimum score needed for student work to be deemed passable. For example, faculty members may decide that a “1” or “2” on a 4-point scale (4 = exemplary, 3 = proficient, 2 = marginal, 1 = unacceptable) does not meet minimum quality expectations. We encourage a standard-setting session to set the score needed to meet expectations (also called a “cutscore”); Monica has posted materials from standard-setting workshops, one offered on campus and the other at a national conference (including speaker notes with the presentation slides). Faculty may then set a criterion for success, e.g., 90% of the students must score 3 or higher. If assessment study results fall short, action will need to be taken.
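
As a worked example of that criterion for success: suppose the cutscore is 3 on the 4-point scale and the target is 90% of students at or above it. The scores below are invented for illustration.

```python
# One rubric score per sampled student (invented data).
scores = [4, 3, 3, 2, 4, 3, 3, 4, 3, 3]
CUTSCORE = 3   # minimum score that meets expectations
TARGET = 0.90  # criterion for success

met = sum(s >= CUTSCORE for s in scores) / len(scores)
print(f"{met:.0%} of students scored {CUTSCORE} or higher")  # 90%
print("criterion met" if met >= TARGET else "action needed")
```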

Step 6: Discuss with colleagues. Review feedback and revise.

Important: When developing a rubric for program assessment, enlist the help of colleagues. Rubrics promote shared expectations and consistent grading practices which benefit faculty members and students in the program.

5. Sample rubrics

Rubrics are on our Rubric Bank page and in our Rubric Repository (Graduate Degree Programs). More are available at the Assessment and Curriculum Support Center in Crawford Hall (hard copy).

These open as Word documents and are examples from outside UH.

  • Group Participation (analytic rubric)
  • Participation (holistic rubric)
  • Design Project (analytic rubric)
  • Critical Thinking (analytic rubric)
  • Media and Design Elements (analytic rubric; portfolio)
  • Writing (holistic rubric; portfolio)

6. Scoring rubric group orientation and calibration

When using a rubric for program assessment purposes, faculty members apply the rubric to pieces of student work (e.g., reports, oral presentations, design projects). To produce dependable scores, each faculty member needs to interpret the rubric in the same way. The process of training faculty members to apply the rubric is called “norming.” It’s a way to calibrate the faculty members so that scores are accurate and consistent across the faculty. Below are directions for an assessment coordinator carrying out this process.

Suggested materials for a scoring session:

  • Copies of the rubric
  • Copies of the “anchors”: pieces of student work that illustrate each level of mastery. Suggestion: have 6 anchor pieces (2 low, 2 middle, 2 high)
  • Score sheets
  • Extra pens, tape, post-its, paper clips, stapler, rubber bands, etc.

Hold the scoring session in a room that:

  • Allows the scorers to spread out as they rate the student pieces
  • Has a chalk or white board, smart board, or flip chart

During the scoring session:

  • Describe the purpose of the activity, stressing how it fits into program assessment plans. Explain that the purpose is to assess the program, not individual students or faculty, and describe ethical guidelines, including respect for confidentiality and privacy.
  • Describe the nature of the products that will be reviewed, briefly summarizing how they were obtained.
  • Describe the scoring rubric and its categories. Explain how it was developed.
  • Analytic: Explain that readers should rate each dimension of an analytic rubric separately, and they should apply the criteria without concern for how often each score (level of mastery) is used. Holistic: Explain that readers should assign the score or level of mastery that best describes the whole piece; some aspects of the piece may not appear in that score and that is okay. They should apply the criteria without concern for how often each score is used.
  • Give each scorer a copy of several student products that are exemplars of different levels of performance. Ask each scorer to independently apply the rubric to each of these products, writing their ratings on a scrap sheet of paper.
  • Once everyone is done, collect everyone’s ratings and display them so everyone can see the degree of agreement. This is often done on a blackboard, with each person in turn announcing his/her ratings as they are entered on the board. Alternatively, the facilitator could ask raters to raise their hands when their rating category is announced, making the extent of agreement very clear to everyone and making it very easy to identify raters who routinely give unusually high or low ratings.
  • Guide the group in a discussion of their ratings. There will be differences. This discussion is important to establish standards. Attempt to reach consensus on the most appropriate rating for each of the products being examined by inviting people who gave different ratings to explain their judgments. Raters should be encouraged to explain by making explicit references to the rubric. Usually consensus is possible, but sometimes a split decision is developed, e.g., the group may agree that a product is a “3-4” split because it has elements of both categories. This is usually not a problem. You might allow the group to revise the rubric to clarify its use but avoid allowing the group to drift away from the rubric and learning outcome(s) being assessed.
  • Once the group is comfortable with how the rubric is applied, the rating begins. Explain how to record ratings using the score sheet and explain the procedures. Reviewers begin scoring.

After the scoring session, discuss the results and consider questions such as:

  • Are results sufficiently reliable?
  • What do the results mean? Are we satisfied with the extent of students’ learning?
  • Who needs to know the results?
  • What are the implications of the results for curriculum, pedagogy, or student support services?
  • How might the assessment process, itself, be improved?

7. Suggestions for using rubrics in courses

  • Use the rubric to grade student work. Hand out the rubric with the assignment so students will know your expectations and how they’ll be graded. This should help students master your learning outcomes by guiding their work in appropriate directions.
  • Use a rubric for grading student work and return the rubric with the grading on it. Faculty save time writing extensive comments; they just circle or highlight relevant segments of the rubric. Some faculty members include room for additional comments on the rubric page, either within each section or at the end.
  • Develop a rubric with your students for an assignment or group project. Students can then monitor themselves and their peers using agreed-upon criteria that they helped develop. Many faculty members find that students will create higher standards for themselves than faculty members would impose on them.
  • Have students apply your rubric to sample products before they create their own. Faculty members report that students are quite accurate when doing this, and this process should help them evaluate their own projects as they are being developed. The ability to evaluate, edit, and improve draft documents is an important skill.
  • Have students exchange paper drafts and give peer feedback using the rubric. Then, give students a few days to revise before submitting the final draft to you. You might also require that they turn in the draft and peer-scored rubric with their final paper.
  • Have students self-assess their products using the rubric and hand in their self-assessment with the product; then, faculty members and students can compare self- and faculty-generated evaluations.

8. Equity-minded considerations for rubric development

Ensure transparency by making rubric criteria public, explicit, and accessible

Transparency is a core tenet of equity-minded assessment practice. Students should know and understand how they are being evaluated as early as possible.

  • Ensure the rubric is publicly available & easily accessible. We recommend publishing on your program or department website.
  • Have course instructors introduce and use the program rubric in their own courses. Instructors should explain to students connections between the rubric criteria and the course and program SLOs.
  • Write rubric criteria using student-focused and culturally-relevant language to ensure students understand the rubric’s purpose, the expectations it sets, and how criteria will be applied in assessing their work.
  • For example, instructors can provide annotated examples of student work using the rubric language as a resource for students.

Meaningfully involve students and engage multiple perspectives

Rubrics created by faculty alone risk perpetuating unseen biases as the evaluation criteria used will inherently reflect faculty perspectives, values, and assumptions. Including students and other stakeholders in developing criteria helps to ensure performance expectations are aligned between faculty, students, and community members. Additional perspectives to be engaged might include community members, alumni, co-curricular faculty/staff, field supervisors, potential employers, or current professionals. Consider the following strategies to meaningfully involve students and engage multiple perspectives:

  • Have students read each evaluation criterion and talk out loud about what they think it means. This will allow you to identify what language is clear and where there is still confusion.
  • Ask students to use their language to interpret the rubric and provide a student version of the rubric.
  • If you use this strategy, it is essential to create an inclusive environment where students and faculty have equal opportunity to provide input.
  • Be sure to incorporate feedback from faculty and instructors who teach diverse courses, levels, and in different sub-disciplinary topics. Faculty and instructors who teach introductory courses have valuable experiences and perspectives that may differ from those who teach higher-level courses.
  • Engage multiple perspectives including co-curricular faculty/staff, alumni, potential employers, and community members for feedback on evaluation criteria and rubric language. This will ensure evaluation criteria reflect what is important for all stakeholders.
  • Elevate historically silenced voices in discussions on rubric development. Ensure stakeholders from historically underrepresented communities have their voices heard and valued.

Honor students’ strengths in performance descriptions

When describing students’ performance at different levels of mastery, use language that describes what students can do rather than what they cannot do. For example:

  • Instead of: Students cannot make coherent arguments consistently.
  • Use: Students can make coherent arguments occasionally.

9. Tips for developing a rubric

  • Find and adapt an existing rubric! It is rare to find a rubric that is exactly right for your situation, but you can adapt an already existing rubric that has worked well for others and save a great deal of time. A faculty member in your program may already have a good one.
  • Evaluate the rubric . Ask yourself: A) Does the rubric relate to the outcome(s) being assessed? (If yes, success!) B) Does it address anything extraneous? (If yes, delete.) C) Is the rubric useful, feasible, manageable, and practical? (If yes, find multiple ways to use the rubric: program assessment, assignment grading, peer review, student self assessment.)
  • Collect samples of student work that exemplify each point on the scale or level. A rubric will not be meaningful to students or colleagues until the anchors/benchmarks/exemplars are available.
  • Expect to revise.
  • When you have a good rubric, SHARE IT!

10. Additional resources & sources consulted:

Rubric examples:

  • Rubrics primarily for undergraduate outcomes and programs
  • Rubric repository for graduate degree programs

Workshop presentation slides and handouts:

  • Workshop handout (Word document)
  • How to Use a Rubric for Program Assessment (2010)
  • Techniques for Using Rubrics in Program Assessment by guest speaker Dannelle Stevens (2010)
  • Rubrics: Save Grading Time & Engage Students in Learning by guest speaker Dannelle Stevens (2009)
  • Rubric Library, Institutional Research, Assessment & Planning, California State University-Fresno
  • The Basics of Rubrics [PDF], Schreyer Institute, Penn State
  • Creating Rubrics, Teaching Methods and Management, TeacherVision
  • Allen, Mary. University of Hawai’i at Manoa Spring 2008 Assessment Workshops, May 13-14, 2008 [available at the Assessment and Curriculum Support Center]
  • Mertler, Craig A. (2001). Designing scoring rubrics for your classroom. Practical Assessment, Research & Evaluation, 7(25).
  • NPEC Sourcebook on Assessment: Definitions and Assessment Methods for Communication, Leadership, Information Literacy, Quantitative Reasoning, and Quantitative Skills [PDF] (June 2005)

Contributors: Monica Stitt-Bergh, Ph.D., TJ Buckley, Yao Z. Hill Ph.D.

How to (Effectively) Use a Presentation Grading Rubric

Almost all higher education courses these days require students to give a presentation, which can be a beast to grade. But there’s a simple tool to keep your evaluations on track. 

Enter: The presentation grading rubric.

With a presentation grading rubric, giving feedback is simple. Rubrics help instructors standardize criteria and provide consistent scoring and feedback for each presenter. 

How can presentation grading rubrics be used effectively? Here are 5 ways to make the most of your rubrics. 

1. Find a Good Customizable Rubric

There’s practically no limit to how rubrics are used, and there are oodles of presentation rubrics on Pinterest and Google Images. But not all rubrics are created equal. 

Professors need to be picky when choosing a presentation rubric for their courses. A good rubric clearly defines the target students are aiming for and describes what performance looks like at each level. 

2. Fine-Tune Your Rubric

Make sure your rubric accurately reflects the expectations you have for your students. It may be helpful to ask a colleague or peer to review your rubric before putting it to use. After using it for an assignment, take notes on how well the rubric worked as you grade. 

You may need to tweak your rubric to correct common misunderstandings or meet the criteria for a specific assignment. Make adjustments as needed and frequently review your rubric to maximize its effectiveness. 

3. Discuss the Rubric Beforehand

On her blog Write-Out-Loud, Susan Dugdale advises instructors not to keep rubrics a secret. Rubrics should be openly discussed before a presentation is given. Make sure reviewing your rubric with students is listed on your lesson plan.

Set aside time to discuss the criteria with students ahead of presentation day so they know where to focus their efforts. To help students better understand the rubric, play a clip of a presentation and have students use the rubric to grade the video. Go over what grade students gave the presentation and why, based on the rubric’s standards. Then explain how you would grade the presentation as an instructor. This will help your students internalize the rubric as they prepare for their presentations.

4. Use the Rubric Consistently

Rubrics help maintain fairness in grading. When presentation time arrives, use a consistent set of grading criteria across all speakers to keep grading unbiased. 

An effective application for rubrics is to apply a quantitative value to students across a cohort and over multiple presentations. These values show which students made the most progress and where they started out (relative to the rest of their class). Taken together, this data tells the story of how effective or ineffective the feedback has been.
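
If those rubric scores are recorded electronically, the progress story can be read straight from the data. A minimal sketch, with invented student names and rubric totals:

```python
# Rubric totals per student across three presentations (invented data),
# listed from largest to smallest gain between first and latest attempt.
history = {
    "student_a": [12, 15, 18],
    "student_b": [16, 16, 17],
    "student_c": [10, 14, 19],
}

for name, scores in sorted(history.items(),
                           key=lambda kv: kv[1][-1] - kv[1][0],
                           reverse=True):
    print(f"{name}: started at {scores[0]}, "
          f"latest {scores[-1]}, gain {scores[-1] - scores[0]:+d}")
```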

5. Share Your Feedback

If you’re using an electronic system, sharing feedback might be automatic. If you’re using paper, try to give copies to presenters as soon as possible. This will help them incorporate your feedback while everything is still fresh in their minds. 

If you’re looking to use rubrics electronically, check out GoReact, the #1 video platform for skill development. GoReact allows you to capture student presentations on video for feedback, grading, and critique. The software includes a rubric builder that you can apply to recordings of any kind of presentation.

Presenters can receive real-time feedback by live recording directly to GoReact with a webcam or smartphone. Instructors and peers submit feedback during the presentation. Students improve astronomically. 

A presentation grading rubric is a simple way to keep your evaluations on track. Remember to use a customizable rubric, discuss the criteria beforehand, follow a consistent set of grading criteria, make necessary adjustments, and quickly share your feedback.

By following these five steps, both you and your students can reap the benefits that great rubrics have to offer.

ESL Speaking

ESL Speaking Rubric | ESL Speaking Assessments and Tests

If you want to find out about an ESL speaking rubric, you’re in the right place. I have everything you need to know about ESL rubrics, along with ideas for ESL speaking tests. Keep on reading for all the details about ESL speaking rubrics to evaluate students easily and fairly.

Speaking Rubric ESL

This speaking rubric for ESL is appropriate for a conversation between two students, but not for a presentation or speech-style test, or a conversation with the teacher. It’s also not an appropriate way to evaluate reading or writing skills, although it does touch on listening.

For a more formal assessment, have a look at this one: IELTS Speaking Evaluation .

Speaking Rubric ESL: Everything You Need to Know

Here are some of the most important factors to keep in mind when evaluating speaking for your English learners. Keep on reading for more details about:

  • Simple vs complicated
  • Grammar + vocabulary
  • Interesting, detailed answers
  • Quality of questions

It works well with everyone from elementary school students to college students and adults. And it’s quite a helpful framework for cutting through the confusion and simply separating the top students from the weaker ones. You can also move beyond looking only for errors and start rewarding students who go above and beyond.

Simple vs Complicated ESL Speaking Rubrics

There are also a million and one ways to evaluate speaking tests with an ESL speaking rubric. However, I always prefer the simple way for just about anything, especially with language learners.

If you look on the Internet, you’ll notice that lots of other people have talked about this before. But, a lot of the other ESL speaking rubrics you see are so complicated that I don’t think their students will actually understand them.

I’d rather make it simple, and easy to understand for my students. I want them to know how to get a good score on the test when they’re studying for it. It just seems fair.

I have three categories in my ESL speaking rubric, and each one is worth an equal number of points.

Quick teaching tip for grading: If your speaking test is worth 30% of the final grade, make each category worth 10 points! Or, if you’ve allotted 15% for it, make each category out of 5 points.

It’ll save teachers a ton of time at the end of the semester. Plus, your students will be able to add up their own scores really easily this way.
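
If you like a quick script (or a spreadsheet formula), the tip above is one division plus one sum. Here’s a tiny sketch using the three category names from this rubric; the student’s scores are made up.

```python
# Split the test's share of the final grade evenly across three categories.
def points_per_category(test_weight_pct, categories=3):
    return test_weight_pct // categories

print(points_per_category(30))  # 10 points per category
print(points_per_category(15))  # 5 points per category

# Totaling one student's test when each category is out of 10 points.
ratings = {"grammar_vocab": 8, "detailed_answers": 9, "good_questions": 7}
print(sum(ratings.values()), "/ 30")  # 24 / 30
```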

ESL Speaking Rubric: 3 Sections

Let’s get to the three categories in my ESL Speaking Rubric.

  • Grammar and vocabulary (10 points)
  • Interesting, detailed answers (10 points)
  • Good questions (10 points)

It’s not just useful for English tests but could be applied to any foreign language.

#1: Grammar and Vocabulary

This section does not cover all the vocabulary and grammar possible in the English language, only what we studied in class up to that point. For example, if we’ve been studying the passive voice, I’d expect students to use it when appropriate for the topic.

Including ALL English grammar and vocabulary isn’t really fair, especially for beginner-level students.

If we’ve been studying about laws and punishment, I’d expect students to use vocabulary terms like jaywalking, shoplifting, life sentence and parole in their answer, if appropriate. Simple words or talking around these words by describing them but not actually saying them would result in a deduction.

For example, using “walking across the street, not at the correct place” would be considered incorrect if I’ve clearly taught “jaywalking” in class. Students know to expect this so it’s not a surprise to them!

I also include other very simple, basic things that students at their level would be expected to have down cold. For example, high-intermediate students should have a very firm grasp on using the simple past and not make mistakes, even though we may not have explicitly studied it.

Absolute beginners require special consideration for this because they usually have no English skills beyond what you’ve taught them. In this case, I stick almost exclusively to what they’ve studied in my class.

In all cases, talking around vocabulary we’ve studied instead of using it will result in a lower score.

When you’re explaining the test, be fair and give plenty of examples about this so students are clear that you expect them to use the appropriate vocabulary terms.

#2: Interesting, Detailed Answers

This means that students should not just give very simple answers to their partner, but should elaborate with one or two extra details. I encourage this in class every single day, so a failure to do it on the test does not make me happy!

Have the students actually thought about the topics and subjects discussed, instead of just giving answers straight out of the textbook? Yes? Great. No? They won’t score that highly on this section.

Basically: is it easy to have a conversation with this student, or not? The best students will find it very easy to get a perfect score in this section.

For example, if a student asks the question:

Q: What do you think is a big problem facing students in Korea these days?

A: Maybe cell-phone addiction.

This answer would result in a very low score. They should have elaborated with 1-2 supporting details. Or, even a follow-up question to their partner would have been okay.

As it is, the burden is on their partner to keep the conversation going.

That said, I always tell students that 1-2 details is enough. Nobody likes having a conversation with someone who won’t stop talking!

#3: Interesting Questions

This involves actually listening to their partner and asking appropriate follow-up questions in order to keep the conversation going.

It also involves thinking of an interesting way to start the conversation, since I just give them very general topics but leave the actual conversation starter up to them. Since I give my students the topics a couple of weeks before the test, there’s almost no excuse, except laziness, not to have an interesting conversation starter!

I always give plenty of ridiculous examples of all kinds of terrible follow-up questions when I’m explaining the test. It’s funny, but it seems to work, and most students do quite well in this section. Students are free to ask any sort of question they want to follow up on something, but it has to match the answer.

Q: What’s a big problem facing students in Korea these days?

A: I think student debt. Lots of families can’t afford to pay for university anymore, so students have to take on debt. But, it’s a big burden when they graduate because they can’t save money for a house.

Q: So, what about cell-phone addiction?

This is a terrible follow-up question. The second student gave quite an interesting answer, but the first student didn’t even listen to it.

ESL Speaking Test Evaluation Paper

The paper that I use to log scores is super simple. You can see it below:

Grammar + vocabulary: 1  2  3  4  5

Interesting, detailed answers: 1  2  3  4  5

Good questions: 1  2  3  4  5

Should My Lesson Plans Reflect this ESL Speaking Rubric?

That’s a great question and I’m happy you want to know! I’ve seen all kinds of things over the years. For example, a teacher who spends much of their conversation class time focusing on listening or reading but then tests exclusively on speaking.

Or, another teacher who spends the majority of the class time on free-talking, but then requires a presentation for the final exam. To me, these two examples seem to be a major error in teaching methodology.

I prefer to focus my lesson plans on what I’ll be testing on. Because I test conversational English between students, most of the activities in the class are a reflection of this.

Of course, I don’t teach to the test; there’s no standardized test and I get to design my own. However, students perceive the test as more fair, and the class overall as better, if you mostly test what you’ve been teaching.

When Do Students Get to See Their Speaking Test Scores?

I’ll generally show students their scores right after they’re done with their tests. I make a point of showing each student their score privately, because it’s certainly nobody else’s business! But then I’ll make my comments to them in front of the other students.

I find that this works well if students have questions or want to challenge their scores. It’s impossible to remember exactly what happened a week or two later when you have 100+ students or even 20 of them for that matter!

I generally provide a ton of feedback at the midterm exam so that students have a chance to learn from their mistakes and improve their scores for the final exam.

Here’s Another ESL Speaking Rubric for Language Learners 

Don’t like this one? Here’s another one that you might want to try out. All of these categories are equally weighted:

  • Clarity (Are the questions and answers clear and comprehensible?)
  • Correct Pronunciation (ranging from native-speaker-like to incomprehensible)
  • Fluency (how quickly the student asked and answered questions)
  • Comprehension (whether or not the student understood the questions, and was able to give appropriate answers)
  • Content (very simple, to quite detailed answers). This is closely related to task completion.

This one is very simple, and it wouldn’t cause information overload when you explain it to the students.

And One More Grading Rubric for English Speaking

Here’s another one you might want to consider. All the categories are equally weighted.

  • Interactive communication
  • Pronunciation

This one is also quite simple, and understanding the test wouldn’t be a problem for almost any student.

What about Technology for Testing Speaking?

There are some teachers who like to make use of technology when grading English speaking. For example, they’ll record all the conversations and then refer back to them later. Or, they’ll record videos. When testing languages, particularly speaking, things can happen so fast and the teacher may wish they had more time to process things.

I don’t personally do this because I find it adds a layer of complication to it. What if your recording device isn’t working, or runs out of batteries half way through? What if you think you’re recording but actually aren’t? The possibility for disaster is quite high!

I find that it’s easy enough for most teachers to evaluate on the fly if they’re not actually engaging in the conversation. However, if you’re the conversation partner as well as the evaluator (as opposed to just an observer), it can be quite tricky. You’re doing two jobs instead of one, and it’s sometimes not that easy.

My School Assigns an ESL Rubric: Do I Have to Use It?

Some schools allow a huge degree of freedom with regards to this, while others are specific in how they want students to be tested. Both of these situations have their advantages and disadvantages.

However, if your school requires you to use a specific rubric, then use it and don’t have a second thought about it. No matter which one you use, it’s likely to reward the best English speakers with the best grades and vice-versa.

What is ESL Assessment? 

ESL assessment can happen informally throughout the course, and it allows teachers to track the ongoing progress of their students. It’s also known as authentic or alternative assessment. On the other hand, standardized ESL tests measure students’ abilities at a particular point in their English learning journey. Both are useful for ESL students.

3 Options for an English-Speaking Test

There are various ways for language teachers to do an ESL speaking test, all of which have their positives and negatives. Some are easier on the teacher and some are more difficult. Some are more accurate, while others are less so.

What you choose for an English speaking test really depends on your personality, the amount of time you have to administer your tests, and the number of students. Keep on reading for some ESL speaking test questions, along with tips and tricks for doing these kinds of tests the better way.

I will give only the most basic of overviews of three different English speaking test methods for English as a Second Language students. If you want to dive deeper into the topic, I recommend this book: Language Assessment: Principles and Classroom Practices (2nd Edition).

ESL Speaking Test #1: 1-1 Interview with the Teacher

The 1-1 interview with the teacher method is generally thought to have the highest validity, since weaker students cannot affect the stronger students in any way. However, I think there are more negatives than positives:

  • The power dynamic which can come into play
  • The necessity of having a student alone with you in an office or classroom. This is something that I’ll always try to avoid if possible.
  • Exhaustion on the part of the teacher. It simply takes a lot of time and mental energy. In some semesters, I’ve had upwards of 200 students, and it’s just not feasible to test every single one of them in a one-week period.
  • The teacher needs to serve as examiner and conversation partner, which can get tricky at times, especially at the end of a long day of tests. This is especially hard with lower-level students, who will often depend on you to keep the conversation going.

ESL Speaking Test #2: Conversations and Role-Plays

Many English teachers get the students to conduct 1-1 conversations among themselves while the teacher just listens, observes and evaluates.

The big negative of this one is that a weaker student can affect a stronger student, and although the teacher accounts for this in grading, it can often be seen as “not fair” in the students’ eyes.

However, there are lots of positives to this 1-1 conversation between students:

  • No power dynamics
  • It can at least partly replicate “real” conversation, where the people are at a similar level of English ability.
  • The teacher can just focus on listening and not have to act as a conversation partner.
  • Students often feel less nervous with at least one other person in the room besides the teacher.
  • It’s far less tiring than option #1 for the teacher because they only have to listen, not participate in the conversation.

Find out more details about how I conduct this kind of test with my students, and also how I prevent the “memorization” factor.

ESL Speaking Tests, Conversation Style

ESL Speaking Test #3: Presentations

Presentations are perhaps the easiest on the part of the teacher to administer, especially in groups. You can “test” a group of 30 students in as little as a single 1.5 hour class.

The biggest negative of presentations is that they don’t replicate “conversation” at all, and conversation is most often what these courses consist of, especially at universities. But if the teacher actually spends time teaching students how to do presentations, it can be a valuable life skill that students take with them throughout their lives.

If you do decide to teach and test students on their presentation skills, the best resource I recommend is Speaking of Speech: Basic Presentation Skills for Beginners. I’ve taught presentations for years and have stuck with this book the entire time, with excellent results.

Presentations: I Don’t Use Them for Tests

I personally will have a “presentation day” (or two, depending on class size) in my courses. I make it a small percentage of the final grade (around 10%) and give students lots of freedom about group sizes (1-4), and topic (it can be anything in the news lately).

It usually ends up being one of the most interesting classes of the semester! But, I prefer not to do this for a test in a conversation class.

For more details about this, check out:

Current Events Presentation Project

ESL Speaking Tests FAQs

There are a number of common questions that people have about ESL speaking, including tests and exams. Here are the answers to some of the most popular ones.

How can I evaluate ESL speaking?

There are a number of ways in which you can evaluate ESL speaking. Some of the most common criteria (make a rubric!) include pronunciation, accuracy, fluency, interaction, and the ability to communicate effectively. This is an important part of an ESL teaching philosophy.

How can I teach ESL students to speak?

If you’re trying to teach ESL students to speak, employ some of the following ideas:

  • Focus on communication and fluency, not accuracy all the time.
  • Have students study new vocabulary and key grammatical concepts.
  • Use student-centred ESL activities and games.
  • Do lots of pair and group work.
  • Have as much student-talking time in classes as possible.
  • Teach the difference between formal and informal English.

How do I pass an ESL speaking test?

If you’re trying to pass an ESL speaking test, here are some tips. First, make sure you know the format of the test, because it varies from exam to exam. Second, brush up on your grammar and vocabulary. Finally, do lots of speaking practice in the format of the test, with a language partner or teacher.

How can I know my English speaking skills?

One of the best ways to know your English speaking skills is to take an English proficiency exam like the IELTS. Besides that, pay attention to what happens when you talk to other people. Are they easily able to understand what you’re saying or not? And do they give the expected answer or something else?

What is the World Language Presentational Speaking Rubric?

The “World Language Presentational Speaking Rubric” typically refers to a set of guidelines or criteria used to assess and evaluate a person’s ability to deliver a presentation or speech in a world language (a language other than their native language). This type of rubric is commonly used in language learning and assessment contexts to evaluate students’ proficiency in speaking and presenting in a foreign language.

While specific rubrics can vary depending on the institution, educational level, and language being assessed, here is a general overview of what such a rubric might include.

Content and Organization:

  • Clear introduction, main points, and conclusion.
  • Well-structured content with logical progression of ideas.
  • Relevant and accurate information presented.

Language Usage:

  • Appropriate vocabulary and grammar for the level of proficiency.
  • Varied and descriptive language to engage the audience.
  • Correct and accurate use of tenses, verb forms, and sentence structures.

Pronunciation and Intonation:

  • Clear pronunciation of words and phrases.
  • Appropriate stress and intonation patterns for effective communication.
  • Minimal interference from the speaker’s native language.

Fluency and Coherence:

  • Smooth and uninterrupted flow of speech.
  • Effective use of transitional phrases and connectors.
  • Ability to handle unexpected questions or interruptions coherently.

Cultural Awareness:

  • Demonstration of understanding cultural nuances and appropriate language use.
  • Inclusion of culturally relevant examples and references.

Engagement and Audience Interaction:

  • Maintaining eye contact and using appropriate body language.
  • Engaging the audience through effective gestures and facial expressions.
  • Encouraging questions or interaction from the audience.

Time Management:

  • Staying within the allocated time limit for the presentation.
  • Balancing the time spent on different parts of the presentation (introduction, main points, conclusion).

Creativity and Originality:

  • Incorporation of creative elements such as anecdotes, visuals, or personal experiences.
  • Presentation that stands out and captures the audience’s attention.

Overall Impression:

  • Overall effectiveness of the presentation in conveying the intended message.
  • The speaker’s confidence, enthusiasm, and engagement.

Do you like this ESL Speaking Rubric and ESL Speaking Tests?

Yes? Thought so. It really is super simple and easy to use this rubric for all your English speaking classes.

If you need some more simple, easy ideas for your English classes, then you’re going to want to check out this book over on Amazon: 101 ESL Activities: For Teenagers and Adults. It’ll make your lesson planning easy, guaranteed.

Just open up the book to the section you’re looking for: speaking, writing, warm-ups, etc. and find an interesting and engaging activity or game to use in your classes. It’s easier than ever to vary your lessons and keep things fun.

You can check out the book for yourself over on Amazon. But, only if you want to get some ESL awesome going on.

It’s available in both digital and print formats. You can download the e-version onto any device with the free Kindle reading app. Yes, it really is that easy to have a ton of fun ESL activities at your fingertips for lesson planning at your favourite coffee shop.

Or, keep a copy on the bookshelf in your office as a handy reference guide. Check out 101 ESL Activities for yourself over on Amazon:

ESL Speaking Rubric: Have Your Say!

What do you include in your rubric for evaluating English speaking? Are there any notable things that you don’t include? How do you evaluate other subjects for ESL students?

Leave a comment below and let us know your thoughts. We’d love to hear from you.

And be sure to contact us with any questions you have about teaching English. Also be sure to give this article a share on Facebook, Twitter, or Pinterest. It’ll help other teachers like yourself find this useful teaching resource.

About Jackie

Jackie Bolen has been teaching English for more than 15 years to students in South Korea and Canada. She’s taught all ages, levels, and kinds of TEFL classes. She holds an MA degree, along with the CELTA and DELTA English teaching certifications.

Jackie is the author of more than 100 books for English teachers and English learners, including 101 ESL Activities for Teenagers and Adults and 1001 English Expressions and Phrases . She loves to share her ESL games, activities, teaching tips, and more with other teachers throughout the world.

Rubric formats for the formative assessment of oral presentation skills acquisition in secondary education

  • Development Article
  • Open access
  • Published: 20 July 2021
  • Volume 69, pages 2663–2682 (2021)

  • Rob J. Nadolski (ORCID: orcid.org/0000-0002-6585-0888)
  • Hans G. K. Hummel
  • Ellen Rusman
  • Kevin Ackermans

Acquiring complex oral presentation skills is cognitively demanding for students and demands intensive teacher guidance. The aim of this study was twofold: (a) to identify and apply design guidelines in developing an effective formative assessment method for oral presentation skills during classroom practice, and (b) to develop and compare two analytic rubric formats as part of that assessment method. Participants were first-year secondary school students in the Netherlands (n = 158) who acquired oral presentation skills with the support of either a formative assessment method with analytic rubrics offered through a dedicated online tool (experimental groups), or a method using more conventional (rating scales) rubrics (control group). One experimental group was provided text-based rubrics and the other video-enhanced rubrics. No prior research is known about analytic video-enhanced rubrics, but, based on research on complex skill development and multimedia learning, we expected this format to best capture the (non-verbal aspects of) oral presentation performance. Significant positive differences in oral presentation performance were found between the experimental groups and the control group. However, no significant differences were found between the two experimental groups. This study shows that a well-designed formative assessment method, using analytic rubric formats, outperforms formative assessment using more conventional rubric formats. It also shows that the higher costs of developing video-enhanced analytic rubrics cannot be justified by significantly greater performance gains. Future studies should address the generalizability of such formative assessment methods for other contexts, and for complex skills other than oral presentation, and should lead to a more profound understanding of video-enhanced rubrics.


Introduction

Both practitioners and scholars agree that students should be able to present orally (e.g., Morreale & Pearson, 2008; Smith & Sodano, 2011). Oral presentation involves the development and delivery of messages to the public with attention to vocal variety, articulation, and non-verbal signals, and with the aim to inform, self-express, relate to, and persuade listeners (Baccarini & Bonfanti, 2015; De Grez et al., 2009a; Quianthy, 1990). The current study is restricted to informative presentations (as opposed to persuasive presentations), as these are most common in secondary education. Oral presentation skills are complex generic skills of increasing importance for both society and education (Voogt & Roblin, 2012). However, secondary education seems to lack instructional design guidelines for supporting oral presentation skills acquisition. Many secondary schools in the Netherlands struggle with how to teach and assess students' oral presentation skills, lack clear performance criteria for oral presentations, and fall short in offering adequate formative assessment methods that support the effective acquisition of oral presentation skills (Sluijsmans et al., 2013).

Many researchers agree that the acquisition and assessment of presentation skills should depart from a socio-cognitive perspective (Bandura, 1986 ) with emphasis on observation, practice, and feedback. Students practice new presentation skills by observing other presentations as modeling examples, then practice their own presentation, after which the feedback is addressed by adjusting their presentations towards the required levels. Evidently, delivering effective oral presentations requires much preparation, rehearsal, and practice, interspersed with good feedback, preferably from oral presentation experts. However, large class sizes in secondary schools of the Netherlands offer only limited opportunities for teacher-student interaction, and offer even fewer practice opportunities. Based on research on complex skill development and multimedia learning, it can be expected that video-enhanced analytic rubric formats best capture and guide oral presentation performance, since much non-verbal behavior cannot be captured in text (Van Gog et al., 2014 ; Van Merriënboer & Kirschner, 2013 ).

Formative assessment of complex skills

To support complex skills acquisition under limited teacher guidance, more effective formative assessment methods are needed (Boud & Molloy, 2013), based on proven instructional design guidelines. During skills acquisition, students perceive specific feedback as more adequate than non-specific feedback (Shute, 2008). Adequate feedback should inform students about (i) their task performance, (ii) their progress towards intended learning goals, and (iii) what they should do to progress further towards those goals (Hattie & Timperley, 2007; Narciss, 2008). Students receiving specific feedback on criteria and performance levels become equipped to improve their oral presentation skills (De Grez et al., 2009a; Ritchie, 2016). Analytic rubrics are therefore promising formats for providing specific feedback on oral presentations, because they can demonstrate the relations between subskills and explain the open-endedness of ideal presentations (through textual descriptions and their graphical design).

Ritchie (2016) showed that adding structure and self-assessment to peer- and teacher-assessments resulted in better oral presentation performance. Students were required to use analytic rubrics for self-assessment during their (project-based) classroom education. In this way, they had ample opportunity for observing and reflecting on (good) oral presentation attributes, which was shown to foster the acquisition of their oral presentation skills.

Analytic rubrics incorporate performance criteria that inform teachers and students when preparing oral presentations. Such rubrics support mental model formation, and enable adequate feedback provision by teachers, peers, and self (Brookhart & Chen, 2015; Jonsson & Svingby, 2007; Panadero & Jonsson, 2013). This research is inconclusive about which formats and delivery media are most effective, and most studies dealt with analytic text-based rubrics delivered on paper. However, digital video-enhanced analytic rubrics are expected to be more effective for acquiring oral presentation skills, since many behavioral aspects refer to non-verbal actions and processes that can only be captured on video (e.g., body posture or use of voice during a presentation).

This study is situated within the Viewbrics project, where video-modelling examples are integrated with analytic text-based rubrics (Ackermans et al., 2019a). Video-modelling examples contain question prompts that illustrate behavior associated with (sub)skill performance levels in context, and are presented by young actors the target group can identify with. The question prompts require students to link behavior to performance levels, and to build a coherent picture of the (sub)skills and levels. To the best of the authors' knowledge, there are no previous studies on such video-enhanced analytic rubrics. The Viewbrics tool has been incrementally developed and validated with teachers and students to structure the formative assessment method in classroom settings (Rusman et al., 2019).

The purpose of our study is twofold. On the one hand, it investigates whether the application of evidence-based design guidelines results in a more effective formative assessment method in the classroom. On the other hand, it investigates (within that method) whether video-enhanced analytic rubrics are more effective than text-based analytic rubrics.

Research questions

The twofold purpose of this study is stated by two research questions: (1) To what extent do analytic rubrics within formative assessment lead to better oral presentation performance? (the design-based part of this study); and (2) To what extent do video-enhanced analytic rubrics lead to better oral presentation performance (growth) than text-based analytic rubrics? (the experimental part of this study). We hypothesize that all students will improve their oral presentation performance over time, but that students in the experimental groups (receiving analytic rubrics designed according to proven design guidelines) will outperform a control group (receiving conventional rubrics) (Hypothesis 1). Furthermore, we expect the experimental group using video-enhanced rubrics to achieve more performance growth than the experimental group using text-based rubrics (Hypothesis 2).

After this introduction, the second section describes previous research on design guidelines that were applied to develop the analytic rubrics in the present study. The actual design, development, and validation of these rubrics is described in the “Development of analytic rubrics tool” section. The “Method” section describes the experimental method of this study, whereas the “Results” section reports its results. Finally, in the concluding “Conclusions and discussion” section, the main findings and limitations of the study are discussed, and suggestions for future research are provided.

Previous research and design guidelines for formative assessment with analytic rubrics

Analytic rubrics are inextricably linked with assessment, either summative (for final grading of learning products) or formative (for scaffolding learning processes). They provide textual descriptions of skills’ mastery levels with performance indicators that describe concrete behavior for all constituent subskills at each mastery level (Allen & Tanner, 2006 ; Reddy, 2011 ; Sluijsmans et al., 2013 ) (see Figs.  1 and 2 in “ Development of analytic rubrics tool ” section for an example). Such performance indicators specify aspects of variation in the complexity of a (sub)skill (e.g., presenting for a small, homogeneous group as compared to presenting for a large heterogeneous group) and related mastery levels (Van Merriënboer & Kirschner, 2013 ). Analytic rubrics explicate criteria and expectations, can be used to check students’ progress, monitor learning, and diagnose learning problems, either by teachers, students themselves or by their peers (Rusman & Dirkx, 2017 ).

Figure 1: Subskills for oral presentation assessment

Figure 2: Specification of performance levels for criterion 4

Several motives for deploying analytic rubrics in education can be distinguished. A review study by Panadero and Jonsson (2013) identified the following motives: increasing transparency, reducing anxiety, aiding the feedback process, improving student self-efficacy, and supporting student self-regulation. Analytic rubrics also improve reliability among teachers when rating their students (Jonsson & Svingby, 2007). Evidence has shown that analytic rubrics can enhance student performance and learning when used for formative assessment purposes in combination with metacognitive activities, like reflection and goal-setting, but research shows mixed results about their learning effectiveness (Panadero & Jonsson, 2013).

It remains unclear what exactly is needed to make rubric feedback effective (Reddy & Andrade, 2010; Reitmeier & Vrchota, 2009). Apparently, transparency of assessment criteria and learning goals (i.e., making expectations and criteria explicit) is not enough to establish effectiveness (Wöllenschläger et al., 2016). Several researchers stressed the importance of how and which feedback to provide with rubrics (Bower et al., 2011; De Grez et al., 2009b; Kerby & Romine, 2009). We continue this section by reviewing design guidelines for analytic rubrics encountered in the literature, and then specifically address what the literature says about the added value of video-enhanced rubrics.

Design guidelines for analytic rubrics

Effective formative assessment methods for oral presentation and analytic rubrics should be based on proven instructional design guidelines (Van Ginkel et al., 2015). Table 1 presents an overview of the seventeen guidelines on analytic rubrics we encountered in the literature. Guidelines 1–4 inform us how to use rubrics for formative assessment; Guidelines 5–17 inform us how to use rubrics for instruction, with Guidelines 5–9 on a rather generic, meso level and Guidelines 10–17 on a more specific, micro level. We will now briefly describe them in relation to oral presentation skills.

Guideline 1: use analytic rubrics instead of rating scale rubrics if rubrics are meant for learning

Conventional rating-scale rubrics are easy to generate and use, as they contain scores for each performance criterion (e.g., on a 5-point Likert scale). However, since each performance level is not clearly described or operationalized, rating can suffer from rater subjectivity, and rating scales do not provide students with unambiguous feedback (Suskie, 2009). Analytic rubrics can address those shortcomings, as they contain brief textual performance descriptions for all subskills, criteria, and performance levels of complex skills like presentation, but they are harder to develop and score (Bargainnier, 2004; Brookhart, 2004; Schreiber et al., 2012).

Guideline 2: use self-assessment via rubrics for formative purposes

Analytic rubrics can encourage self-assessment and self-reflection (Falchikov & Boud, 1989; Reitmeier & Vrchota, 2009), which appears essential when practicing presentations and reflecting on other presentations (Van Ginkel et al., 2017). The usefulness of self-assessment for oral presentation was demonstrated in Ritchie's study (2016), but no such effect was found in a study by De Grez et al. (2009b) that used the same rubric.

Guideline 3: use peer-assessment via rubrics for formative purposes

Peer-feedback is more (readily) available than teacher-feedback, and can be beneficial for students’ confidence and learning (Cho & Cho, 2011 ; Murillo-Zamorano & Montanero, 2018 ), also for oral presentation (Topping, 2009 ). Students positively value peer-assessment if the circumstances guarantee serious feedback (De Grez et al., 2010 ; Lim et al., 2013 ). It can be assumed that using analytic rubrics positively influences the quality of peer-assessment.

Guideline 4: provide rubrics for usage by self, peers, and teachers as students appreciate rubrics

Students appreciate analytic rubrics because they support them in their learning, in their planning, in producing higher quality work, in focusing efforts, and in reducing anxiety about assignments (Reddy & Andrade, 2010 ), aspects of importance for oral presentation. While students positively perceive the use of peer-grading, the inclusion of teacher-grades is still needed (Mulder et al., 2014 ) and most valued by students (Ritchie, 2016 ).

Guidelines 5–9

Heitink et al. (2016) carried out a review study identifying five relevant prerequisites for effective classroom instruction on a meso level when using analytic rubrics (for oral presentations): train teachers and students in using these rubrics, establish a policy for their use in instruction, take school and classroom contexts into account, and follow a constructivist learning approach. The next section describes how these guidelines were applied to the design of this study's classroom instruction.

Guidelines 10–17

The review study by Van Ginkel et al. (2015) presents a comprehensive overview of effective factors for oral presentation instruction in higher education on a micro level. Although our research context is secondary education, the findings from that study seem very applicable, as they are rooted in firmly researched and well-documented instructional design approaches. The guidelines pertain to (a) instruction, (b) learning, and (c) assessment in the learning environment (Biggs, 2003). The next section describes how these guidelines were applied to the design of this study's online Viewbrics tool.

Video-enhanced rubrics

Early analytic rubrics for oral presentations were all text-based descriptions. This study assumes that such analytic rubrics may fall short when used for learning to give oral presentations, since much of the required performance refers to motoric activities, time-consecutive operations, and processes that can hardly be captured in text (e.g., body posture or use of voice during a presentation). Text-based rubrics also have a limited capacity to convey contextualized and more ‘tacit’ behavioral aspects (O’Donovan et al., 2004), since ‘tacit knowledge’ (or ‘knowing how’) is interwoven with practical activities, operations, and behavior in the physical world (Westera, 2011). Finally, text leaves more space for personal interpretation (of performance indicators) than video, which negatively influences mental model formation and feedback consistency (Lew et al., 2010).

We can therefore expect video-enhanced rubrics to overcome such restrictions, as they can integrate modelling examples with analytic text-based explanations. The video-modelling examples and their embedded question prompts can illustrate behavior associated with performance levels in context, and contain information in different modalities (moving images, sound). Video-enhanced rubrics foster learning from active observation of video-modelling examples (De Grez et al., 2014; Rohbanfard & Proteau, 2013), especially when combined with textual performance indicators. Looking at the effects of video-modelling examples, Van Gog et al. (2014) found increased task performance when a video-modelling example by an expert was also shown. De Grez et al. (2014) found comparable results for learning to give oral presentations. Teachers in training who assessed their own performance with video-modelling examples appeared to overrate their performance less than without examples (Baecher et al., 2013). Research on mastering complex skills indicates that both modelling examples (in a variety of application contexts) and frequent feedback positively influence the learning process and skill acquisition (Van Merriënboer & Kirschner, 2013). Video-modelling examples not only capture the ‘know-how’ (procedural knowledge), but also elicit the ‘know-why’ (strategic/decisive knowledge).

Development of analytic rubrics tool

This section describes how design guidelines from previous research were applied in the actual development of the rubrics in the Viewbrics tool for our study, and then presents the subskills and levels for oral presentation skills as defined for this study.

Application of design guidelines

The previous section already mentioned that analytic rubrics should be restricted to formative assessment (Guidelines 2 and 3), and that there are good reasons to assume that a combination of teacher-, peer-, and self-assessment will improve oral presentations (Guidelines 1 and 4). Teachers and students were trained in rubric usage (Guidelines 5 and 7), and students were motivated to use rubrics (Guideline 7). As the participating schools were already using analytic rubrics, a positive initial attitude may be assumed. Although the policy towards using analytic rubrics might not have been widely known among school staff, the participating teachers in our study were knowledgeable (Guideline 6). We carefully considered the school context, as a representative set of secondary schools in the Netherlands was part of the Viewbrics team (Guideline 8). The formative assessment method was embedded within project-based education (Guideline 9).

Within this study, and on the micro level of design, the learning objectives for the first presentation were clearly specified by lower performance levels, whereas advice for students' second presentation focused on improving specific subskills that had been performed with insufficient quality during the first presentation (Guideline 10). Students carried out two consecutive projects of increasing complexity (Project 1, Project 2) with authentic tasks, among which the oral presentations (Guideline 11). Students were provided with opportunities to observe peer models to increase their self-efficacy beliefs and oral presentation competence; in our study, only students that received video-enhanced rubrics could observe videos with peer models before their first presentation (Guideline 12). Students were allowed enough opportunities to rehearse their oral presentations, to increase their presentation competence and decrease their communication apprehension. Within our study, only two oral presentations could receive feedback, but students could rehearse as often as they wanted outside the classroom (Guideline 13). We ensured that feedback in the rubrics was of high quality, i.e., explicit, contextual, adequately timed, and of suitable intensity for improving students' oral presentation competence. Both experimental groups used digital analytic rubrics within the Viewbrics tool (with teacher-, peer-, and self-feedback). The control group received feedback via a more conventional (rating-scale) rubric, and could therefore not use the formative assessment and reflection functions (Guideline 14). The setup of the study implied that peers played a major role during formative assessment in both experimental groups, because they formatively assessed each oral presentation using the Viewbrics tool (Guideline 15); the control group received feedback from their teacher. Both experimental groups used the Viewbrics tool to facilitate self-assessment (Guideline 16); the control group did not receive analytic progress data to inform their self-assessment. Specific goal-setting within self-assessment has been shown to positively stimulate oral presentation performance, improve self-efficacy, and reduce presentation anxiety (De Grez et al., 2009a; Luchetti et al., 2003), so the Viewbrics tool was developed to support both specific goal-setting and self-reflection (Guideline 17).

Subskills and levels for oral presentation

Reddy and Andrade ( 2010 ) stress that rubrics should be tailored to the specific learning objectives and target groups. Oral presentations in secondary education (our study context) involve generating and delivering informative messages with attention to vocal variety, articulation, and non-verbal signals. In this context, message composition and message delivery are considered important (Quianthy, 1990 ). Strong arguments (‘logos’) have to be presented in a credible (‘ethos’) and exciting (‘pathos’) way (Baccarini & Bonfanti, 2015 ). Public speaking experts agree that there is not one right way to do an oral presentation (Schneider et al., 2017 ). There is agreement that all presenters need much practice, commitment, and creativity. Effective presenters do not rigorously and obsessively apply communication rules and techniques, as their audience may then perceive the performance as too technical or artificial. But all presentations should demonstrate sufficient mastery of elementary (sub)skills in an integrated manner. Therefore, such skills should also be practiced as a whole (including knowledge and attitudes), making the attainment of a skill performance level more than the sum of its constituent (sub)skills (Van Merriënboer & Kirschner, 2013 ). A validated instrument for assessing oral presentation performance is needed to help teachers assess and support students while practicing.

When we started developing rubrics with the Viewbrics tool (late 2016), there were no studies or validated measuring instruments for oral presentation performance in secondary education, although several schools used locally developed, non-validated assessment forms (i.e., conventional rubrics). In higher education, Schreiber et al. (2012) had developed an analytic rubric for public speaking skills assessment, aimed at faculty members and students across disciplines. They identified eleven (sub)skills of public speaking that could be subsumed under three factors (‘topic adaptation’, ‘speech presentation’, and ‘nonverbal delivery’, similar to logos-ethos-pathos).

Such previous work holds much value, but still had to be adapted and elaborated for the context of the current study. This study elaborated and evaluated eleven subskills that can be identified within the natural flow of an oral presentation and its distinctive features (see Fig. 1 for an overview of subskills, and Fig. 2 for a specification of performance levels for a specific subskill).

The names of the subskills, as they appear in the dashboard of the Viewbrics tool (Fig. 3), are given between brackets.

Figure 3: Visualization of oral presentation progress and feedback in the Viewbrics tool

The upper part of Fig. 2 shows the scoring levels for first-year secondary school students for criterion 4 of the oral presentation assessment (four values, from more expert (4 points) to more novice (1 point), from right to left), an example of the conventional rating-scale rubrics. The lower part shows the corresponding screenshot from the Viewbrics tool, representing a text-based analytic rubric example. A video-enhanced analytic rubric example for this subskill provides a peer modelling the required behavior at expert level, with question prompts on selecting reliable and interesting materials. Performance levels were inspired by previous research (Ritchie, 2016; Schneider et al., 2017; Schreiber et al., 2012), but also based on current secondary school practices in the Netherlands, and developed and tested with secondary school teachers and their students.

All eleven subskills are scored on similar four-point Likert scales, and have equal weights in determining total average scores. Two pilot studies tested the usability, validity, and reliability of the assessment tool (Rusman et al., 2019). Based on this input, the final rubrics were improved, embedded in a prototype of the online Viewbrics tool, and used for this study. The formative assessment method consisted of six steps: (1) study the rubric; (2) practice and conduct an oral presentation; (3) conduct a self-assessment; (4) consult feedback from teacher and peers; (5) reflect on feedback; and (6) select personal learning goal(s) for the next oral presentation.
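To make the scoring scheme concrete, the sketch below aggregates the eleven four-point subskill ratings into the total performance score used in this study (maximum 44 points). This is a minimal Python illustration; the function and variable names are hypothetical and not taken from the Viewbrics tool.

```python
# Minimal sketch of the scoring scheme: eleven subskills, each rated
# 1 (novice) to 4 (expert), equally weighted, maximum total = 44.
SUBSKILL_COUNT = 11

def total_performance_score(subskill_scores):
    """Sum equally weighted subskill ratings into a total score (11-44)."""
    if len(subskill_scores) != SUBSKILL_COUNT:
        raise ValueError(f"expected {SUBSKILL_COUNT} subskill scores")
    if any(not 1 <= s <= 4 for s in subskill_scores):
        raise ValueError("each subskill score must be between 1 and 4")
    return sum(subskill_scores)

# Example: a presentation rated mostly at level 3.
print(total_performance_score([3, 3, 4, 2, 3, 3, 3, 4, 2, 3, 3]))  # 33
```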

After the second project (Project 2), which used the same setup and assessment method as the first project, students in the experimental groups could also see their visualized progress in the ‘dashboard’ of the Viewbrics tool (see Fig. 3, with English translations provided between brackets) by comparing performance on their two project presentations during the second reflection assignment. The dashboard shows progress (inner circles), with green reflecting improvement on subskills, blue indicating constant subskills, and red indicating declining subskills. Feedback is provided by emoticons with text. Students’ personal learning goals after reflection are shown under ‘Mijn leerdoelen’ [My learning goals].
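The dashboard's colour coding amounts to a simple per-subskill comparison. The following sketch is illustrative only (hypothetical integer scores; not code from the Viewbrics tool):

```python
# Progress colouring as described above: compare a subskill's score on the
# second presentation with the first. Green = improved, blue = constant,
# red = declined.
def progress_colour(first_score, second_score):
    if second_score > first_score:
        return "green"  # subskill improved
    if second_score < first_score:
        return "red"    # subskill declined
    return "blue"       # subskill constant

print(progress_colour(2, 3))  # green: improved from 2 to 3
print(progress_colour(3, 3))  # blue: constant
```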

The previous sections described how design guidelines for analytic rubrics from the literature (“Previous research and design guidelines for formative assessment with analytic rubrics” section) were applied in a formative assessment method with analytic rubrics (“Development of analytic rubrics tool” section). The “Method” section describes this study's research design for comparing rubric formats.

Research design of the study

All classroom scenarios followed the same lesson plan and structure for project-based instruction, and consisted of two projects with specific rubric feedback provided in between. Both experimental groups used the same formative assessment method with validated analytic rubrics, but differed in the analytic rubric format (text-based, video-enhanced). Students in the control group did not use such a formative assessment method, and only received teacher feedback on these presentations (via a conventional rating-scale rubric consisting of a standard form with attention points for presentations, without further instructions). All three scenarios required similar time investments from students. Six school classes were randomly assigned to the three conditions, so students from the same class were in the same condition. Figure 4 graphically depicts an overview of the research design of the study.

Figure 4: Research design overview

A mixed repeated-measures ANOVA on oral presentation performance (growth) was carried out to analyze the data, with rubric format (three conditions) as between-groups factor and measurement moment (two moments) as within-groups factor. All statistical analyses were conducted with SPSS version 24.
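The original analysis was run in SPSS. For readers without SPSS access, the same mixed design can be expressed in Python with the pingouin library; the sketch below is a hypothetical equivalent, with an assumed long-format data file and column names (not artifacts of the study).

```python
# Hypothetical re-expression of the SPSS analysis: a mixed ANOVA with
# rubric format (3 conditions) as between-groups factor and measurement
# moment (2 presentations) as within-groups factor.
import pandas as pd
import pingouin as pg

# Assumed long format: one row per student per moment, with columns
# 'student' (id), 'condition' ('video' | 'text' | 'control'),
# 'moment' ('pres1' | 'pres2'), and 'score' (total score, 11-44).
df = pd.read_csv("presentation_scores.csv")  # hypothetical file

aov = pg.mixed_anova(data=df, dv="score", within="moment",
                     subject="student", between="condition",
                     effsize="np2")  # partial eta squared, as reported
print(aov)
```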

Participants

Participants were first-year secondary school students (all within the 12–13 years range) from two Dutch schools, with participants equally distributed over schools and conditions ( n  = 166, with 79 girls and 87 boys). Classes were randomly allocated to conditions. Most participants completed both oral presentations ( n  = 158, so an overall response rate of 95%). Data were collected (almost) equally from the video-enhanced rubrics condition ( n  = 51), text-based condition ( n  = 57), and conventional rubrics (control) condition ( n  = 50).

A related study within the same context and with the same participants (Ackermans et al., 2019b) analyzed concept maps elicited from participants, revealing that their mental models (indicating mastery levels) for oral presentation were similar across conditions. From that finding we can conclude that students possessed similar mental models for presentation skills before starting the projects. Results from the online questionnaire (see the “Anxiety, preparedness, and motivation” section) reveal that students in the experimental groups did not differ in anxiety, preparedness, and motivation before their first presentation. Together with the teachers' assessments of the similarity of classes, we can assume similarity of students across conditions at the start of the experiment.

Materials and procedure

Teachers from both schools worked closely together in guaranteeing similar instruction and difficulty levels for both projects (Project 1, Project 2). Schools agreed to follow a standardized lesson plan for both projects and their oral presentation tasks. Core team members then developed (condition-specific) materials for teacher- and student workshops on how to use rubrics and provide instructions and feedback (Guidelines 5 and 7). This also assured that similar measures were taken for potential problems with anxiety, preparedness and motivation. Teachers received information about (condition-specific) versions of the Viewbrics tool (see “ Development of analytic rubrics tool ” section). The core team consisted of three researchers and three (project) teachers, with one teacher also supervising the others. The teacher workshops were given by the supervising teacher and two researchers before starting recruitment of students.

Teachers estimated the similarity of all six classes with respect to students' prior presentation skills before the first project started. All classes were informed via an introduction letter from the core team and their teachers. Participation in this study was voluntary. Students and their parents/caretakers were informed about four weeks before the start of the first project, and received information on research-specific activities, time investment, and schedule. Parents/caretakers signed an informed consent form on behalf of their minor children before the study started. All were informed that data would be anonymized for scientific purposes, and that students could withdraw at any time without giving reasons.

School classes were randomly assigned to conditions. Students in the experimental groups were informed that the usability of the Viewbrics tool for oral presentation skills acquisition was investigated, but were left unaware of the different rubric formats. Students in the control group were informed that their oral presentation skills acquisition was investigated. From all students, concept maps about oral presentation were elicited (reflecting their mental model and mastery level). Students participated in workshops (specific to their condition and provided by their teacher) on how to use rubrics and provide peer feedback (all materials remained available throughout the study).

Before giving their presentations on Project 1, students filled in the online questionnaire via LimeSurvey. Peers and teachers in experimental groups provided immediate feedback on given presentations, and students immediately had to self-assess their own presentations (step 3 of the assessment method). Subsequently, students could view the feedback and ratings given by their teacher and peers through the tool (step 4), were asked to reflect on this feedback (step 5), and to choose specific goals for their second oral presentation (step 6). In the control group, students directly received teachers’ feedback (verbally) after completing their presentation, but did not receive any reflection assignment. Control group students used a standard textual form with attention points (conventional rating-scale rubrics). After giving their presentations on the second project, students in the experimental groups got access to the dashboard of the Viewbrics tool (see “ Development of analytic rubrics tool ” section) to see their progress on subskills. About a week after the classes had ended, some semi-structured interviews were carried out by one of the researchers. Finally, one of the researchers functioned as a hotline for teachers in case of urgent questions during the study, and randomly observed some of the lessons.

Measures and instruments

Oral performance scores on presentations were measured by both teachers and peers. A short online questionnaire (with 6 items) was administered to students just before their first oral presentation at the end of Project 1 (see Fig.  4 ). Interviews were conducted with both teachers and students at the end of the intervention to collect more qualitative data on subjective perceptions.

Oral presentation performance

Students’ oral presentation performance progress was measured by comparing the performance scores on both oral presentations (with three months in between). Both presentations were scored by teachers using the video-enhanced rubric in all groups (half of the score in the experimental groups, the full score in the control group). For participants in both experimental groups, oral presentation performance was also scored by peers and self, using the condition-specific rubric version (either video-enhanced or text-based) (the other half of the score). For each of the eleven subskills, between 1 point (novice level) and 4 points (expert level) could be earned, with a maximum total performance score of 44 points. For participants in the control group, the same scale applied, but no scores were given by peers or self. The inter-rater reliability of assessments between teachers and peers was Cohen’s Kappa = 0.74, which is acceptable.
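Inter-rater agreement of this kind can be computed from paired ratings of the same items; a minimal sketch with scikit-learn, using hypothetical rating vectors:

```python
# Cohen's Kappa between teacher and peer ratings of the same items
# (the study reports Kappa = 0.74). The rating vectors are hypothetical.
from sklearn.metrics import cohen_kappa_score

teacher_ratings = [3, 2, 4, 3, 1, 3, 2, 4, 3, 3]
peer_ratings    = [3, 2, 4, 2, 1, 3, 2, 4, 3, 4]

kappa = cohen_kappa_score(teacher_ratings, peer_ratings)
print(f"Cohen's Kappa = {kappa:.2f}")
```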

Anxiety, preparedness, and motivation

Just before presenting, students answered the short questionnaire with five-point Likert scores (from 0 = totally disagree to 4 = totally agree) as an additional control for potential differences in anxiety, preparedness, and motivation, since especially these factors might influence oral presentation performance (Reddy & Andrade, 2010). Notwithstanding this, teachers were the major source for controlling the similarity of conditions with respect to presentation anxiety, preparedness, and motivation. The two items for anxiety were: “I find it exciting to give a presentation” and “I find it difficult to give a presentation”, a subscale with a satisfactory internal reliability (Cronbach’s Alpha = 0.90). The three items for preparedness were: “I am well prepared to give my presentation”, “I have often rehearsed my presentation”, and “I think I’ve rehearsed my presentation enough”, a subscale with a satisfactory Cronbach’s Alpha = 0.75. The item for motivation was: “I am motivated to give my presentation”. Unfortunately, the online questionnaire was not administered in the control group, due to unforeseen circumstances.
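Cronbach's Alpha for such a subscale follows directly from the item variances and the variance of the summed scores, via the standard formula α = k/(k−1) · (1 − Σσᵢ²/σ²_total). The sketch below applies it to hypothetical item data (one row per student, one column per item):

```python
# Cronbach's Alpha for a questionnaire subscale (the study reports
# 0.90 for anxiety and 0.75 for preparedness). Data are hypothetical.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array-like of shape (n_respondents, n_items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

anxiety_items = [[4, 3], [2, 2], [3, 3], [1, 0], [4, 4]]  # 0-4 Likert
print(f"alpha = {cronbach_alpha(anxiety_items):.2f}")
```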

Semi-structured interviews with teachers (six) and students (thirty) were meant to gather qualitative data on the practical usability and usefulness of the Viewbrics tool. Example questions are: “Have you encountered any difficulties in using the Viewbrics online tool? If so, could you please mention which one(s)?” (students of experimental groups and teachers); “Did the feedback help you to improve your presentation skills? If not, what feedback do you need to improve your presentation skills?” (students only); “How do you evaluate the usefulness of formative assessment?” (students and teachers); “Would you like to organize things differently than in this study when applying formative assessment? If so, what would you organize differently?” (teachers only); and “How much time did you spend on providing feedback? Did you need more or less time than before?” (teachers only).

Interviews with teachers and students revealed that the reported rubrics approach was easy to use and useful within the formative assessment method. Project teachers could easily stick to the lesson plans as agreed upon in advance. However, project teachers regarded the classroom scenarios as relatively time-consuming, and expected that it might be challenging for some other schools to follow the Viewbrics approach. None of the project teachers had to consult the hotline during the study, and no deviations from the lesson plans were observed by the researchers.

Results

The most important results on the performance measures and the questionnaire are presented below and compared between conditions.

A mixed ANOVA, with oral presentation performance as within-subjects factor (two scores) and rubric format as between-subjects factor (three conditions), revealed an overall significant improvement of oral presentation performance over time, with F(1, 157) = 58.13, p < 0.01, ηp² = 0.27. Significant differences over time were also found between conditions, with F(2, 156) = 17.38, p < 0.01, ηp² = 0.18. Tests of between-subjects effects showed significant differences between conditions, with F(2, 156) = 118.97, p < 0.01, ηp² = 0.59, with both experimental groups outperforming the control group as expected (so we could accept H1). However, only control group students showed significant progress on performance scores over time (at the 0.01 level). At both measurement moments, no significant differences between the experimental groups were found, contrary to our expectation (so we had to reject H2). For descriptives of group averages (over time), see Table 2.

A post-hoc analysis, using multiple pairwise comparisons with Bonferroni correction, confirms that the experimental groups significantly (at the p < 0.01 level) outperform the control group at both moments in time, and that the two experimental groups do not differ significantly at either measurement moment. Regarding performance progress over time, only the control group shows significant growth (again with p < 0.01). The difference between the experimental groups in favour of video-enhanced rubrics approached significance (p = 0.053), but formally H2 had to be rejected. This finding is, however, a promising trend to be explored further with larger numbers of participants.
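Continuing the hypothetical pingouin sketch from the research design section, Bonferroni-corrected pairwise comparisons of this kind could be expressed as follows (file and column names are the same assumptions as before; the output column names are pingouin's own):

```python
# Post-hoc pairwise comparisons between conditions and moments with
# Bonferroni correction (pg.pairwise_ttests in older pingouin versions).
import pandas as pd
import pingouin as pg

df = pd.read_csv("presentation_scores.csv")  # same hypothetical file

posthoc = pg.pairwise_tests(data=df, dv="score", within="moment",
                            subject="student", between="condition",
                            padjust="bonf")  # Bonferroni-corrected p-values
print(posthoc[["Contrast", "A", "B", "p-corr"]])
```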

Independent t-tests comparing participants in the two experimental groups before their first presentation showed no differences in anxiety, preparedness, or motivation, with t(98) = 1.32, p = 0.19 for anxiety; t(98) = −0.14, p = 0.89 for preparedness; and t(98) = −1.24, p = 0.22 for motivation (see Table 3 for group averages).
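Such independent t-tests on the subscale scores are straightforward with SciPy; the arrays below are hypothetical per-student subscale means, not study data:

```python
# Independent t-test comparing the two experimental groups on a
# questionnaire subscale (the study reports no significant differences).
from scipy import stats

anxiety_video = [2.0, 3.0, 1.5, 2.5, 3.5, 2.0, 1.0, 2.5]  # hypothetical
anxiety_text  = [1.5, 2.5, 2.0, 3.0, 2.5, 1.5, 2.0, 3.0]  # hypothetical

t, p = stats.ttest_ind(anxiety_video, anxiety_text)
print(f"t = {t:.2f}, p = {p:.2f}")
```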

As mentioned in the previous section (interviews with teachers), teachers judged that presentation anxiety, preparedness, and motivation in the control group did not differ from the experimental groups. It can therefore be assumed that all groups were similar regarding presentation anxiety, preparedness, and motivation before presenting, and that these factors did not confound the oral presentation results. Questionnaire data were missing for 58 respondents: one in the video-enhanced condition, seven in the text-based condition, and all fifty in the control group.

Conclusions and discussion

The first purpose was to study whether applying evidence-informed design guidelines in the development of formative assessment with analytic rubrics supports the oral presentation performance of first-year secondary school students in the Netherlands. Students who used such validated rubrics indeed outperformed students using conventional rubrics (so H1 could be accepted). This study demonstrates that the design guidelines can also be effectively applied in secondary education, which makes them more generic. The second purpose was to study whether video-enhanced rubrics are more beneficial to oral presentation skills acquisition than text-based rubrics, but we did not find significant differences here (so H2 had to be rejected). However, post-hoc analysis shows that growth in performance scores over time indeed seems higher when using video-enhanced rubrics, a promising difference that is only marginally significant. Preliminary qualitative findings from the interviews indicate that the Viewbrics tool can be easily integrated into classroom instruction and appears usable for the target audiences (both teachers and students), although teachers state that it is rather time-consuming to conform to all guidelines.

All students had prior experience with oral presentations (from primary school) and relatively high oral presentation scores at the start of the study, so limited room remained for improvement between their first and second oral presentation. Participants in the control group scored relatively low on their first presentation, so they had more room for improvement during the study. In addition, the somewhat more difficult content of the second project (Guideline 11) might have slightly reduced the quality of the second oral presentation. Also, more intensive training, additional presentations, and their assessments might have demonstrated more added value of the analytic rubrics. Learning might still have occurred without being visible in performance, since adequate mental models of skills are not automatically applied during performance (Ackermans et al., 2019b).

A first limitation (and strength at the same time) of this study was its contextualization within a specific subject domain and educational sector over a longer period of time, which implies that we cannot completely exclude the influence of confounding factors. A second limitation is that the Viewbrics tool has been specifically designed for formative assessment, and is not meant for summative assessment purposes. Although our study revealed the inter-rater reliability of our rubrics to be satisfactory (see the “Measures and instruments” section), it is likely to be lower and less suitable when compared to more traditional summative assessment methods (Jonsson & Svingby, 2007). Thirdly, having a reliable rubric provides no evidence of content validity (representativeness, fidelity of the scoring structure to the construct domain) or generalizability to other domains and educational sectors (Jonsson & Svingby, 2007). Fourth, one might criticize the practice-based research design of our study, as it is less controlled than laboratory studies. We acknowledge that applying more unobtrusive and objective measures to better understand the complex relationship between instructional characteristics, student characteristics, and cognitive learning processes and strategies would best be achieved by combining laboratory research with practice-based research. Notwithstanding these issues, we deliberately chose design-based research and evidence-informed findings from educational practice.

Future research could examine the Viewbrics approach to formative assessment of oral presentation skills in different contexts (other subject matters and educational sectors). The Viewbrics tool could be extended with functions for self-assessment (e.g., recording and replaying one's own presentations), for coping with speech anxiety (Leary & Kowalski, 1995), and for goal-setting (De Grez et al., 2009a). As this is a first study on video-enhanced rubrics, more fine-grained and fundamental research into their beneficial effects on cognitive processes is needed, also to justify the additional development costs: developing video-enhanced rubrics is more costly than developing text-based rubrics. Another line of research might develop multiple measures for objectively determining oral presentation competence, for example using sensor-based data gathering and algorithms for guidance and meaningful interpretation (Schneider et al., 2017), or direct measures of cortisol levels for speaking anxiety (Bartholomay & Houlihan, 2016; Merz & Wolf, 2015). Other instructional strategies might also be considered; for example, repeated practice of the same oral presentation might result in performance improvement, as suggested by Ritchie (2016). This would also reduce the emphasis on presentation content and allow more focus on presentation delivery. Finding good instructional technologies to support complex oral presentation skills will remain important throughout the twenty-first century and beyond.

References

Ackermans, K., Rusman, E., Brand-Gruwel, S., & Specht, M. (2019a). Solving instructional design dilemmas to develop Video-Enhanced Rubrics with modeling examples to support mental model development of complex skills: The Viewbrics-project use case. Educational Technology Research & Development, 67(4), 993–1002.


Ackermans, K., Rusman, E., Nadolski, R. J., Brand-Gruwel, S., & Specht, M. (2019b). Video-or text-based rubrics: What is most effective for mental model growth of complex skills within formative assessment in secondary schools? Computers in Human Behavior, 101 , 248–258.

Allen, D., & Tanner, K. (2006). Rubrics: Tools for making learning goals and evaluation criteria explicit for both teachers and learners. CBE Life Sciences Education, 5 (3), 197–203.

Baccarini, C., & Bonfanti, A. (2015). Effective public speaking: A conceptual framework in the corporate-communication field. Corporate Communications, 20 (3), 375–390.

Baecher, L., Kung, S. C., Jewkes, A. M., & Rosalia, C. (2013). The role of video for self-evaluation in early field experiences. Teaching and Teacher Education, 36, 189–197.

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory . Prentice-Hall.

Bargainnier, S. (2004). Fundamentals of rubrics. In D. Apple (Ed.), Faculty guidebook (pp. 75–78). Pacific Crest.

Bartholomay, E. M., & Houlihan, D. D. (2016). Public Speaking Anxiety Scale: Preliminary psychometric data and scale validation. Personality and Individual Differences, 94, 211–215.

Biggs, J. (2003). Teaching for quality learning at University . Society for Research in Higher Education and Open University Press.

Boud, D., & Molloy, E. (2013). Rethinking models of feedback for learning: The challenge of design. Assessment & Evaluation in Higher Education, 38 (6), 698–712.

Bower, M., Cavanagh, M., Moloney, R., & Dao, M. (2011). Developing communication competence using an online Video Reflection system: Pre-service teachers’ experiences. Asia-Pacific Journal of Teacher Education, 39 (4), 311–326.

Brookhart, S. M. (2004). Assessment theory for college classrooms. New Directions for Teaching and Learning, 100 , 5–14.

Brookhart, S. M., & Chen, F. (2015). The quality and effectiveness of descriptive rubrics. Educational Review, 67 (3), 343–368.

Cho, Y. H., & Cho, K. (2011). Peer reviewers learn from giving comments. Instructional Science, 39 (5), 629–643.

De Grez, L., Valcke, M., & Berings, M. (2010). Peer assessment of oral presentation skills. Procedia Social and Behavioral Sciences, 2 (2), 1776–1780.

De Grez, L., Valcke, M., & Roozen, I. (2009a). The impact of goal orientation, self-reflection and personal characteristics on the acquisition of oral presentation skills. European Journal of Psychology of Education, 24 (3), 293–306.

De Grez, L., Valcke, M., & Roozen, I. (2009b). The impact of an innovative instructional intervention on the acquisition of oral presentation skills in higher education. Computers & Education, 53 (1), 112–120.

De Grez, L., Valcke, M., & Roozen, I. (2014). The differential impact of observational learning and practice-based learning on the development of oral presentation skills in higher education. Higher Education Research & Development, 33(2), 256–271.

Falchikov, N., & Boud, D. (1989). Student self-assessment in higher education: A meta-analysis. Review of Educational Research, 59 (4), 395–430.

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77 (1), 81–112.

Heitink, M. C., Van der Kleij, F. M., Veldkamp, B. P., & Schildkamp, K. (2016). A systematic review of prerequisites for implementing assessment for learning in classroom practice. Educational Research Review, 17 , 50–62.

Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2 (2), 130–144.

Kerby, D., & Romine, J. (2009). Develop oral presentation skills through accounting curriculum design and course-embedded assessment. Journal of Education for Business, 85 (3), 172–179.

Leary, M. R., & Kowalski, R. M. (1995). Social anxiety . Guilford Press.

Lew, M. D. N., Alwis, W. A. M., & Schmidt, H. G. (2010). Accuracy of students’ self-assessment and their beliefs about its utility. Assessment and Evaluation in Higher Education, 35 (2), 135–156.

Lim, B. T., Moriarty, H., Huthwaite, M., Gray, L., Pullon, S., & Gallagher, P. (2013). How well do medical students rate and communicate clinical empathy? Medical Teacher, 35 , 946–951.

Luchetti, A. E., Phipps, G. L., & Behnke, R. R. (2003). Trait anticipatory public speaking anxiety as a function of self-efficacy expectations and self-handicapping strategies. Communication Research Reports, 20(4), 348–356.

Merz, C. J., & Wolf, O. T. (2015). Examination of cortisol and state anxiety at an academic setting with and without oral presentation. The International Journal on the Biology of Stress, 18 (1), 138–142.

Morreale, S. P., & Pearson, J. C. (2008). Why communication education is important: Centrality of discipline in the 21st century. Communication Education, 57 , 224–240.

Mulder, R. A., Pearce, J. M., & Baik, C. (2014). Peer review in higher education: Student perceptions before and after participation. Active Learning in Higher Education, 15 (2), 157–171.

Murillo-Zamorano, L. R., & Montanero, M. (2018). Oral presentations in higher education: A comparison of the impact of peer and teacher feedback. Assessment & Evaluation in Higher Education, 43 (1), 138–150.

Narciss, S. (2008). Feedback strategies for interactive learning tasks. In J. M. Spector, M. D. Merrill, J. J. G. van Merrienboer, & M. P. Driscoll (Eds.), Handbook of research on educational communications and technology (3rd ed., pp. 125–144). Lawrence Erlbaum Associates.

O’Donovan, B., Price, M., & Rust, C. (2004). Know what I mean? Enhancing student understanding of assessment standards and criteria. Teaching in Higher Education, 9(3), 325–335.

Panadero, E., & Jonsson, A. (2013). The use of scoring rubrics for formative assessment purposes revisited: A review. Educational Research Review, 9, 129–144.

Quianthy, R. L. (1990). Communication is life: Essential college sophomore speaking and listening competencies . National Communication Association.

Reddy, Y. M. (2011). Design and development of rubrics to improve assessment outcomes. Quality Assurance in Education, 19 (1), 84–104.

Reddy, Y. M., & Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35 (4), 435–448.

Reitmeier, C. A., & Vrchota, D. A. (2009). Self-assessment of oral communication presentations in food science and nutrition. Journal of Food Science Education, 8(4), 88–92.

Ritchie, S. M. (2016). Self-assessment of video-recorded presentations: Does it improve skills? Active Learning in Higher Education, 17 (3), 207–221.

Rohbanfard, H., & Proteau, L. (2013). Live versus video presentation techniques in the observational learning of motor skills. Trends in Neuroscience and Education, 2, 27–32.

Rusman, E., & Dirkx, K. (2017). Developing rubrics to assess complex (generic) skills in the classroom: How to distinguish skills’ mastery levels? Practical Assessment, Research & Evaluation. https://doi.org/10.7275/xfp0-8228


Rusman, E., Nadolski, R. J., & Ackermans, K. (2019). Students’ and teachers’ perceptions of the usability and usefulness of the first Viewbrics-prototype: A methodology and online tool to formatively assess complex generic skills with video-enhanced rubrics in Dutch secondary education. In S. Draaijer, D. Joosten-ten Brinke, E. Ras (Eds), Technology enhanced assessment. TEA 2018. Communications in computer and information science (Vol. 1014, pp. 27–41). Springer, Cham.

Schneider, J., Börner, D., Van Rosmalen, P., & Specht, M. (2017). Presentation trainer: What experts and computers can tell about your nonverbal communication. Journal of Computer Assisted Learning, 33 (2), 164–177.

Schreiber, L. M., Paul, G. D., & Shibley, L. R. (2012). The development and test of the public speaking competence rubric. Communication Education, 61 (3), 205–233.

Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78 (1), 153–189.

Sluijsmans, D., Joosten-ten Brinke, D., & Van der Vleuten, C. (2013). Toetsen met leerwaarde [Assessments with value for learning] . The Hague, The Netherlands: NWO. Retrieved from https://sluijsmans.net/wp-content/uploads/2019/02/Toetsen-met-leerwaarde.pdf

Smith, C. M., & Sodano, T. M. (2011). Integrating lecture capture as a teaching strategy to improve student presentation skills through self-assessment. Active Learning in Higher Education, 12 , 151–162.

Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). Wiley.

Topping, K. (2009). Peer assessment. Theory into Practice, 48 (1), 20–27.

Van Ginkel, S., Gulikers, J., Biemans, H., & Mulder, M. (2015). Towards a set of design principles for developing oral presentation competence: A synthesis of research in higher education. Educational Research Review, 14 , 62–80.

Van Ginkel, S., Laurentzen, R., Mulder, M., Mononen, A., Kyttä, J., & Kortelainen, M. J. (2017). Assessing oral presentation performance: Designing a rubric and testing its validity with an expert group. Journal of Applied Research in Higher Education, 9 (3), 474–486.

Van Gog, T., Verveer, I., & Verveer, L. (2014). Learning from video modeling examples: Effects of seeing the human model’s face. Computers and Education, 72 , 323–327.

Van Merriënboer, J. J. G., & Kirschner, P. A. (2013). Ten steps to complex learning (2nd ed.). Lawrence Erlbaum.

Voogt, J., & Roblin, N. P. (2012). A comparative analysis of international frameworks for 21st century competences: Implications for national curriculum policies. Journal of Curriculum Studies, 44 , 299–321.

Westera, W. (2011). On the changing nature of learning context: Anticipating the virtual extensions of the world. Educational Technology and Society, 14 , 201–212.

Wöllenschläger, M., Hattie, J., Machts, N., Möller, J., & Harms, U. (2016). What makes rubrics effective in teacher-feedback? Transparency of learning goals is not enough. Contemporary Educational Psychology, 44–45 , 1–11.


Acknowledgements

The authors would like to thank the reviewers for their constructive comments on the paper, as well as all students and teachers who participated in this study and the management of the participating schools.

The Viewbrics-project is funded by the practice-oriented research program of the Netherlands Initiative for Education Research (NRO), part of The Netherlands Organization for Scientific Research (NWO), under Grant Number: 405-15-550.

Author information

Authors and affiliations

Faculty of Educational Sciences, Open University of the Netherlands, Valkenburgerweg 177, 6419 AT, Heerlen, The Netherlands

Rob J. Nadolski, Hans G. K. Hummel, Ellen Rusman & Kevin Ackermans


Corresponding author

Correspondence to Rob J. Nadolski.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This research has been approved by the ethics committee of the authors' institution (U2017/05559/HVM).

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Nadolski, R.J., Hummel, H.G.K., Rusman, E. et al. Rubric formats for the formative assessment of oral presentation skills acquisition in secondary education. Education Tech Research Dev 69, 2663–2682 (2021). https://doi.org/10.1007/s11423-021-10030-7


Accepted: 03 July 2021

Published: 20 July 2021

Issue Date: October 2021

DOI: https://doi.org/10.1007/s11423-021-10030-7


Keywords

  • Digital rubrics
  • Analytic rubrics
  • Oral presentation skills
  • Formative assessment method

Rubric

Rubric refers to the written instructions for a task. It can also refer to the criteria for assessing tasks.

Example: The writing part of a proficiency exam such as FCE or IELTS has a rubric giving instructions.

In the classroom: Learners may need to be trained to pay attention to the rubric, especially for an exam. They can be encouraged to do this through group analysis of questions, highlighting key points, and making quick answer plans, which they can then explain to other learners.

Further links:
https://www.teachingenglish.org.uk/blogs/katherine-bilsborough/teachers-materials-writers-some-considerations
https://www.teachingenglish.org.uk/blogs/vicky-saumell/vicky-saumell-using-rubrics-assess-projects



15 Helpful Scoring Rubric Examples for All Grades and Subjects

In the end, they actually make grading easier.


When it comes to student assessment and evaluation, there are a lot of methods to consider. In some cases, testing is the best way to assess a student’s knowledge, and the answers are either right or wrong. But often, assessing a student’s performance is much less clear-cut. In these situations, a scoring rubric is often the way to go, especially if you’re using standards-based grading. Here’s what you need to know about this useful tool, along with lots of rubric examples to get you started.

What is a scoring rubric?

In the United States, a rubric is a guide that lays out the performance expectations for an assignment. It helps students understand what’s required of them, and guides teachers through the evaluation process. (Note that in other countries, the term “rubric” may instead refer to the set of instructions at the beginning of an exam. To avoid confusion, some people use the term “scoring rubric” instead.)

A rubric generally has three parts (see the sketch after this list):

  • Performance criteria: These are the various aspects on which the assignment will be evaluated. They should align with the desired learning outcomes for the assignment.
  • Rating scale: This could be a number system (often 1 to 4) or words like “exceeds expectations, meets expectations, below expectations,” etc.
  • Indicators: These describe the qualities needed to earn a specific rating for each of the performance criteria. The level of detail may vary depending on the assignment and the purpose of the rubric itself.
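
To make those three parts concrete, here is a minimal sketch of how a rubric could be represented in code. It is purely illustrative: the Rubric and Criterion classes, their field names, and all the sample criteria and descriptors are hypothetical, not taken from any particular grading tool.

    # A minimal sketch of the three-part structure: criteria, a rating
    # scale, and indicators. All names here are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Criterion:
        name: str                     # a performance criterion, e.g. "Delivery"
        indicators: dict[int, str] = field(default_factory=dict)  # rating -> descriptor

    @dataclass
    class Rubric:
        title: str
        rating_scale: dict[int, str]  # e.g. {4: "Exceeds expectations", ...}
        criteria: list[Criterion] = field(default_factory=list)

    presentation_rubric = Rubric(
        title="Oral Presentation",
        rating_scale={
            4: "Exceeds expectations",
            3: "Meets expectations",
            2: "Approaching expectations",
            1: "Below expectations",
        },
        criteria=[
            Criterion("Organization", {
                4: "Clear storyline; ideas arranged logically with smooth transitions",
                3: "Mostly logical order; transitions occasionally abrupt",
                2: "Some structure, but the audience must work to follow it",
                1: "No discernible structure",
            }),
            Criterion("Delivery", {
                4: "Confident eye contact; gestures reinforce the message",
                3: "Regular eye contact; occasional reliance on notes",
                2: "Limited eye contact; reads mostly from notes",
                1: "No eye contact; inaudible or monotone",
            }),
        ],
    )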

Rubrics take more time to develop up front, but they help ensure more consistent assessment, especially when the skills being assessed are more subjective. A well-developed rubric can actually save teachers a lot of time when it comes to grading. What’s more, sharing your scoring rubric with students in advance often helps improve performance. This way, students have a clear picture of what’s expected of them and what they need to do to achieve a specific grade or performance rating.

Learn more about why and how to use a rubric here.

Types of Rubrics

There are three basic rubric categories, each with its own purpose.

Holistic Rubric

A holistic scoring rubric laying out the criteria for a rating of 1 to 4 when creating an infographic

Source: Cambrian College

Holistic rubrics combine all the scoring criteria in a single scale. They’re quick to create and use, but they have drawbacks. If a student’s work spans different levels, it can be difficult to decide which single score to assign. They also make it harder to provide feedback on specific aspects.

Traditional letter grades are a type of holistic rubric. So are the popular “hamburger rubric” and “cupcake rubric” examples. Learn more about holistic rubrics here.

Analytic Rubric

Layout of an analytic scoring rubric, describing the different sections like criteria, rating, and indicators

Source: University of Nebraska

Analytic rubrics are more complex and generally take a great deal more time to design up front. They spell out the expected learning outcomes and describe what is required to earn each performance rating for every criterion. Each rating is assigned a point value, and the total number of points earned determines the overall grade for the assignment.

Though they’re more time-intensive to create, analytic rubrics actually save time while grading. Teachers can simply circle or highlight any relevant phrases in each rating, and add a comment or two if needed. They also help ensure consistency in grading, and make it much easier for students to understand what’s expected of them.

Learn more about analytic rubrics here.

Developmental Rubric

A developmental rubric for kindergarten skills, with illustrations to describe the indicators of criteria

Source: Deb’s Data Digest

A developmental rubric is a type of analytic rubric, but it’s used to assess progress along the way rather than to determine a final score on an assignment. Developmental rubrics leave off the point values and focus instead on giving feedback using the criteria and indicators of performance. Those details help students understand what they’ve achieved and highlight the specific skills they still need to improve.
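
Under the same hypothetical classes as above, a developmental rubric might simply report the matching indicator text for each criterion instead of totaling points; a sketch follows.

    # Same hypothetical classes as above: report the indicator for each
    # criterion's rating instead of summing point values.
    def feedback(rubric: Rubric, ratings: dict[str, int]) -> list[str]:
        lines = []
        for criterion in rubric.criteria:
            rating = ratings[criterion.name]
            level = rubric.rating_scale[rating]
            descriptor = criterion.indicators[rating]
            lines.append(f"{criterion.name} ({level}): {descriptor}")
        return lines

    for line in feedback(presentation_rubric, {"Organization": 3, "Delivery": 2}):
        print(line)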

Learn how to use developmental rubrics here.

Ready to create your own rubrics? Find general tips on designing rubrics here. Then, check out these examples across all grades and subjects to inspire you.

Elementary School Rubric Examples

These elementary school rubric examples come from real teachers who use them with their students. Adapt them to fit your needs and grade level.

Reading Fluency Rubric

A developmental rubric example for reading fluency

You can use this one as an analytic rubric by counting up points to earn a final score, or just to provide developmental feedback. There’s a second rubric page available specifically to assess prosody (reading with expression).

Learn more: Teacher Thrive

Reading Comprehension Rubric

Reading comprehension rubric, with criteria and indicators for different comprehension skills

The nice thing about this rubric is that you can use it at any grade level, for any text. If you like this style, you can get a reading fluency rubric here too.

Learn more: Pawprints Resource Center

Written Response Rubric


Rubrics aren’t just for huge projects. They can also help kids work on very specific skills, like this one for improving written responses on assessments.

Learn more: Dianna Radcliffe: Teaching Upper Elementary and More

Interactive Notebook Rubric

Interactive Notebook rubric example, with criteria and indicators for assessment

If you use interactive notebooks as a learning tool, this rubric can help kids stay on track and meet your expectations.

Learn more: Classroom Nook

Project Rubric

Rubric that can be used for assessing any elementary school project

Use this simple rubric as it is, or tweak it to include more specific indicators for the project you have in mind.

Learn more: Tales of a Title One Teacher

Behavior Rubric

Rubric for assessing student behavior in school and classroom

Developmental rubrics are perfect for assessing behavior and helping students identify opportunities for improvement. Send these home regularly to keep parents in the loop.

Learn more: Teachers.net Gazette

Middle School Rubric Examples

In middle school, use rubrics to offer detailed feedback on projects, presentations, and more. Be sure to share them with students in advance, and encourage them to use them as they work so they’ll know if they’re meeting expectations.

Argumentative Writing Rubric

An argumentative rubric example to use with middle school students

Argumentative writing is a part of language arts, social studies, science, and more. That makes this rubric especially useful.

Learn more: Dr. Caitlyn Tucker

Role-Play Rubric

A rubric example for assessing student role play in the classroom

Role-plays can be really useful when teaching social and critical thinking skills, but it’s hard to assess them. Try a rubric like this one to evaluate and provide useful feedback.

Learn more: A Question of Influence

Art Project Rubric

A rubric used to grade middle school art projects

Art is one of those subjects where grading can feel very subjective. Bring some objectivity to the process with a rubric like this.

Source: Art Ed Guru

Diorama Project Rubric

A rubric for grading middle school diorama projects

You can use diorama projects in almost any subject, and they’re a great chance to encourage creativity. Simplify the grading process and help kids know how to make their projects shine with this scoring rubric.

Learn more: Historyourstory.com

Oral Presentation Rubric

Rubric example for grading oral presentations given by middle school students

Rubrics are terrific for grading presentations, since you can include a variety of skills and other criteria. Consider letting students use a rubric like this to offer peer feedback too.

Learn more: Bright Hub Education

High School Rubric Examples

In high school, it’s important to include your grading rubrics when you give assignments like presentations, research projects, or essays. Kids who go on to college will definitely encounter rubrics, so helping them become familiar with them now will help in the future.

Presentation Rubric

Example of a rubric used to grade a high school project presentation

Analyze a student’s presentation both for content and communication skills with a rubric like this one. If needed, create a separate one for content knowledge with even more criteria and indicators.

Learn more: Michael A. Pena Jr.

Debate Rubric

A rubric for assessing a student's performance in a high school debate

Debate is a valuable learning tool that encourages critical thinking and oral communication skills. This rubric can help you assess those skills objectively.

Learn more: Education World

Project-Based Learning Rubric

A rubric for assessing high school project based learning assignments

Implementing project-based learning can be time-intensive, but the payoffs are worth it. Try this rubric to make student expectations clear and end-of-project assessment easier.

Learn more: Free Technology for Teachers

100-Point Essay Rubric

Rubric for scoring an essay with a final score out of 100 points

Need an easy way to convert a scoring rubric to a letter grade? This example for essay writing earns students a final score out of 100 points.

Learn more: Learn for Your Life
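
For context, converting a 100-point total to a letter grade is simple threshold arithmetic. The sketch below assumes the common 90/80/70/60 cutoffs; the rubric linked above may well use different ones.

    # Threshold arithmetic only; the 90/80/70/60 cutoffs are an assumption,
    # not necessarily the scale the linked rubric uses.
    def letter_grade(points: int) -> str:
        for minimum, letter in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
            if points >= minimum:
                return letter
        return "F"

    print(letter_grade(87))   # B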

Drama Performance Rubric

A rubric teachers can use to evaluate a student's participation and performance in a theater production

If you’re unsure how to grade a student’s participation and performance in drama class, consider this example. It offers lots of objective criteria and indicators to evaluate.

Learn more: Chase March

How do you use rubrics in your classroom? Come share your thoughts and exchange ideas in the WeAreTeachers HELPLINE group on Facebook.

Plus, 25 of the best alternative assessment ideas.



English Presentation Rubric Resources

A selection of ready-made presentation rubrics from teacher-resource marketplaces:

  • iRubric: Grade 7 Group Presentation Rubric
  • 2 Oral Presentation Rubrics in Spanish & English | Used for any Language! (Word document)
  • Presentation Rubric for English Language Learners (Google Docs™)
  • Speech, oral presentation organizer, rubric, Spanish/English
  • Presentation Rubric Animated Google Slide for English Language Learners (Google Slides™)
  • English - Generic Rubric - Oral Presentation
  • English Jurassic Park Ethical Issues Presentation Rubric
  • "The Kite Runner" Social Issues Presentation Rubric (English)
  • Free Presentation Rubric for English and Social Studies (Secondary)
  • English Presentation Rubric
  • Oral Presentation Rubric
  • Personal Narrative Essay Writing - Presentation, Graphic Organizers, and Rubric
  • 5 Paragraph Essay Writing Presentation and Essay Graphic Organizers and Rubric
  • Editable Google Slides Presentation Rubric (Google Apps™)
  • Mastering Presentation Skills: Lesson, Handouts & Rubric (Common Core Aligned)
  • Writing Rubrics (3), Homework, Groups, Behavior and Presentation: BUNDLE
  • Student Self Assessment Oral Presentation Rubric, Special Education, 1st-3rd (Easel Activity)
  • Visual Presentation Rubric (Google Docs or Google Slides)
  • Reader's Theatre Rubrics: Assessing Oral Fluency and Presentations
  • The Complete High School English Rubric Bundle
  • ¡Quién Soy Yo! // Who I Am Presentation Project (Spanish & English)
  • EDITABLE Group Oral Presentation Rubric - T/st: Peer, formative, reflection



Further Reading and Sample Rubrics

  1. Presentation Marking Rubric (PDF)

     A group presentation rubric with a 4-to-1 scale. At the top level for visual appeal, slides have no errors in spelling, grammar, or punctuation, present information clearly and concisely, and are visually appealing and engaging; lower levels show increasing errors and too much information on two or more slides.

  2. Using rubrics

     Notes that a rubric can be a fillable PDF emailed to students, and that beyond written assignments, rubrics work for oral presentations, for evaluating teamwork and individual contributions to group tasks, and for peer review, since they set shared evaluation criteria.

  3. Rubric for Presentation: PUBH 5900 Graduate Project (PDF)

     A scoring rubric for oral presentations on a graduate public health project.

  4. Oral Communication Rubric (PDF)

     Individuals are evaluated separately in panel or group presentations. The rubric best applies to presentations long enough to convey a central message, supported by one or more forms of supporting material and a purposeful organization; it is not designed for an oral answer to a single question.

  5. iRubric: Survival Guide: Group Video Presentation Rubric

     Students put together a poster, PowerPoint, collage, or video to show during their oral presentation on an assigned topic; the group presentation must last at least 15 minutes.

  6. iRubric: Role-Play Presentation Rubric

     Evaluates a team role-play presentation (subject: business; grades 6-8).

  7. Creating and Using Rubrics (program assessment example)

     Describes scoring-rubric group orientation and calibration: to produce dependable scores on essays, reports, oral presentations, or design projects, each faculty member needs to interpret the rubric in the same way.

  8. How to (Effectively) Use a Presentation Grading Rubric

     Its first tip: find a good customizable rubric. There are oodles of presentation rubrics on Pinterest and Google Images, but not all rubrics are created equal; professors need to be picky and choose one that clearly defines the target for students.

  9. iRubric: ESL Group Oral Presentation Rubric

     Evaluates the key components of an effective oral presentation for a small group of ESL students as a holistic assessment.

  10. Role Play Rubric (PDF)

     Four levels of quality (Excellent, Proficient, Adequate, Limited) for criteria such as participation in preparation and presentation, ranging from always to rarely willing and focused during group work and the presentation.

  11. Oral Presentation Rubric (PDF)

     A 0 (Unacceptable) to 3 (Excellent) scale for criteria such as body language (from no movement or descriptive gestures up to fluid movements that help the audience visualize) and eye contact.

  12. Group Participation Rubric (sample analytic rubric, PDF)

     Levels of Distinguished, Proficient, Basic, and Unacceptable for criteria such as workload (e.g., "did a full share of the work or more" at the Distinguished level).

  13. Speaking Rubric ESL

     Three ten-point sections: grammar and vocabulary; interesting, detailed answers; and good questions. Useful beyond English tests; it can be applied to any foreign language.

  14. Rubric formats for the formative assessment of oral presentation skills

     The study cited above: it identifies and applies design guidelines for an effective formative assessment method for oral presentation skills in classroom practice, and develops and compares two analytic rubric formats as part of that method.

  15. 15 Helpful Scoring Rubric Examples for All Grades and Subjects

     The WeAreTeachers article reproduced above, by Jill Staake (June 16, 2023).

  16. iRubric: Grade 7 Group Presentation Rubric

     Assesses the presentation of a group activity, considering the performance of the group as a whole as well as individual contributions.

  17. Rubrics For Group Activity

     Two rubrics for evaluating group activities, each using the same criteria (cooperation, organization, time frame, and presentation) rated on a scale from 1 to 5, with 5 the highest score.

  18. ESL Level 5 Oral Presentation Rubric (PDF)

     The student gives a well-organized oral presentation on a general or specialized topic for 3 minutes or longer; 25 points total, with content/organization worth up to 10 points.

  19. Assessment Rubric for Presentations (PDF)

     At the top level, the presentation has a concise, clearly stated focus relevant to the audience; it is well structured with a clear storyline; ideas are arranged logically and strongly support the focus; and sections are well connected with smooth transitions.

  20. Group Presentation Rubric by Laurel Barnes

     A rubric to make grading group presentations easier.

  21. English Presentation Rubric Teaching Resources

     Two oral presentation rubrics adapted from ACTFL and made more student-friendly with visuals and easier-to-understand vocabulary: one entirely in Spanish and one entirely in English. They are universal and can be used for any foreign language; many customers have used them for French class.