Assessment Analytics 2.0 lets you take a deeper dive into performance on assessments given in Otus. To access Assessment Analytics 2.0, follow the steps below.

  • Step 1: Navigate to the Analytics module. Note: the screenshot above is from a teacher account. If you are an administrator, the Analytics module is located second from the top.

  • Step 2: Select Assessment 2.0.

  • ❗ Step 3: Check the date range (very important). The default date range is set for the previous 7 days. If the assessment you are looking for does not fall within that date range, it will not appear in the dropdown menu. For best results, we recommend changing the date range to All Time.

Once you have identified the necessary date range, you can click into the Search Assessment field to see a dropdown list of the assessments that appear within that date range. You can also search using keywords.

In this article, you will find the following sections: Completion Widget, Performance Breakdown, Item Analysis, Student Performance, and Optional Filters.

❗ This article is written specifically for Simple Assessments that use the points grading scale. If you are looking for details on how to analyze a different type of assessment, check out this article.


Completion Widget

The completion widget will show you the overall completion status of the assessment.

  • 1: A visual depiction of the number of graded (dark blue) assessments relative to the total number of students this assessment was assigned to.

  • 2: The number of graded assessments.

  • 3: The total number of students this assessment was assigned to.

  • 4: The percentage of assessments that have been graded.
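The percentage in the widget is simply the graded count divided by the total number of students assigned. A minimal sketch of that arithmetic (the counts below are hypothetical, not taken from the screenshot):

```python
def completion_percentage(graded: int, assigned: int) -> float:
    """Percentage of assigned assessments that have been graded."""
    if assigned == 0:
        return 0.0
    return round(graded / assigned * 100, 2)

# Hypothetical counts: 9 of 11 assigned assessments have been graded.
print(completion_percentage(9, 11))  # → 81.82
```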


Performance Breakdown

The performance breakdown allows you to see a summary of the results, including the highest score, average score, and average time to complete.

  • The x-axis contains the percentage ranges.

  • The y-axis represents the number of students.

The default display (shown above) is the Overall Performance, which represents all of the students that took the assessment. Each bar represents a 10% range; they are listed highest to lowest from left to right. Hovering over any of the bars will show you the number of students that performed in that range. For details on the other ways to see the breakdown (by class, group, or gender), keep reading, or skip ahead 🔗 .
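Under the hood, each bar is just a count of the students whose overall percentage falls in that range. Here is a hedged sketch of that bucketing; the scores are made up, and Otus's exact boundary handling (for example, which bar a score of exactly 70 lands in) is not documented, so this version uses inclusive lower bounds:

```python
def bucket_counts(scores, intervals):
    """Count students per percentage range.

    `intervals` are the range widths (they must sum to 100); lower bounds
    are inclusive, and a perfect 100 falls in the top range.
    """
    edges, lo = [], 0
    for width in intervals:
        edges.append((lo, lo + width))
        lo += width
    counts = {f"{a}-{b}%": 0 for a, b in edges}
    for score in scores:
        for a, b in edges:
            if a <= score < b or (b == 100 and score == 100):
                counts[f"{a}-{b}%"] += 1
                break
    return counts

# Hypothetical class of six students, using the default 10%-wide ranges:
histogram = bucket_counts([95, 88, 72, 72, 61, 45], [10] * 10)
print(histogram["70-80%"])  # → 2
```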

Customizing the X-Axis

➡️ Change Order: you can choose from highest to lowest (default) and lowest to highest.

➡️ Edit X-Axis: you can change the percentage ranges.

  • As shown above, the default contains 10 ranges (the maximum number), with equal intervals of 10.

💡 Tip from the Otus Team: the ranges do not have to be equal intervals; the only rule is that they have to add up to 100.

See below for examples of customized ranges:

Example 1: Four ranges with equal intervals of 25.

Example 2: Three ranges, which do not have equal intervals. The first interval is 70, the second is 15, and the third is 15.

What do these buttons do?

  • 1 - Removes that range (x)

  • 2 - Adds a range (+)

  • 3 - Evenly distributes the intervals based on the number of ranges you have (=). See the example below; originally, the three ranges were 70, 15, and 15. Selecting the = button changed them to roughly equal intervals of about 33.33 each (rounded to the hundredths place).
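The = button's behavior amounts to dividing 100 by the number of ranges and rounding to the hundredths place. How Otus resolves the leftover hundredth is not documented; this hypothetical sketch folds the remainder into the last interval so the total still sums to 100:

```python
def equalize(num_ranges):
    """Split 100 into (nearly) equal intervals, rounded to the hundredths place."""
    base = round(100 / num_ranges, 2)
    intervals = [base] * (num_ranges - 1)
    intervals.append(round(100 - sum(intervals), 2))  # remainder goes to the last range
    return intervals

def is_valid(intervals):
    """The only rules: at most 10 ranges, and the intervals must sum to 100."""
    return len(intervals) <= 10 and abs(sum(intervals) - 100) < 0.01

print(equalize(3))             # → [33.33, 33.33, 33.34]
print(is_valid([70, 15, 15]))  # → True
```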

❓ Can you customize the y-axis?

  • No. When looking at Overall Performance, the y-axis will always represent the number of students. Its intervals adjust automatically based on the total number of students being measured, and it is not something that can be customized.

➡️ Other Performance Breakdown Options:

In addition to Overall performance, the performance breakdown can be analyzed in three other ways:

By Class

If multiple classes took this assessment, you can see the average percentage of each class.

By Group

If students in your classes have been assigned to district-created groups, you can see the average percentage of each group.

Note: only district-created groups will be represented here. If district-created groups haven't been assigned to your students, you will not have any data to display with this option.

By Gender

If students have an assigned gender attribute, you can see the average percentage separated by gender.

Note: if gender has not been assigned as an attribute in your classes, you will not have any data to display with this option.


Item Analysis

The Item Analysis shows performance on each individual question of the assessment. You can see this analysis in two forms:

Table

The table is sorted by performance by default, lowest to highest, based on the number of students who earned full credit on each question. You can change the sort order by clicking on the heading of any column.

Let's analyze the first row of the table to better understand the columns:

  • Question: This is question #1.

  • % Correct Visual Display: 45.45% of the students earned full credit for this question; this shows the percentage of students that earned full credit for the question (dark blue) relative to the percentage that did not (orange).

  • Number Correct: Five students earned full credit on this question.

  • Number Incorrect: Six students did not earn full credit on this question.

  • A - F: Two students chose A, zero chose B, two chose C, and five chose D. The orange color indicates an incorrect answer choice, while the blue indicates the correct answer choice. True/False questions will only have options A and B, where A represents True and B represents False. Short answer questions will not have any information in these columns.

  • Avg. Score: The average score on this question was 0.73 out of 1.

  • Q Type: This was a Multiple Choice question (other options are True/False or Short Answer).
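The % Correct figure is just the full-credit count divided by the number of graded students. Reproducing the example row's arithmetic:

```python
# From the example row: 5 students earned full credit, 6 did not.
num_correct, num_incorrect = 5, 6
total_graded = num_correct + num_incorrect

pct_correct = round(num_correct / total_graded * 100, 2)
print(pct_correct)  # → 45.45

# Avg. Score, by contrast, is the mean of the points earned on the question
# across all graded students, so it is reported out of the question's point
# value (1 here) rather than as a percentage.
```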

Individual Question

Clicking on any row in the table will show a more detailed view. Let's click on the first row of the table, also used in the example above.

  • You will see the content of the question, along with the answer choices, with the correct answer indicated in dark blue.

  • To the right, you'll see that 5 out of 11 students earned full credit on this question. The doughnut chart shows the percentage of students that earned full credit for the question (dark blue) relative to the percentage that did not (orange).

  • You can navigate to the next question or go to the previous question by using the arrows in the top-right corner.

  • To go back to the table, close out the individual item analysis by selecting the x in the top-right corner.


Student Performance

Student Performance shows the performance of each individual student who was assigned the assessment. You can see this analysis in two forms:

Table

The default sort order is alphabetical by last name. You can change the sort order by clicking on the heading of any column.

Let's analyze the first row of the table to better understand the columns:

  • First and Last Name: The first and last name of the student is Chandler Bing.

  • Submitted: This student submitted the assessment on January 13th, 2021.

  • Points: This student earned 5.5 out of 6 points.

  • Percentage: This student earned a 92% on this assessment. The horizontal bar is a visual display of their percentage.

  • Email: This is the student's email address.

  • Site: This is the school the student is rostered in.

  • Grade: The student's grade level (if assigned) would be in this column.
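The Percentage column is the Points column converted to a percent and rounded to the nearest whole number. Checking the example row's arithmetic:

```python
# From the example row: 5.5 points earned out of 6 possible.
points_earned, points_possible = 5.5, 6
percentage = round(points_earned / points_possible * 100)
print(percentage)  # → 92
```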

Individual Student

Clicking on any student in the table will show a more detailed view of that student. Let's click on the first student, also used in the example above.

  • You will find the student's overall score and the length of time it took them to complete the assessment in the upper-right corner.

  • The table contains details for each question: the score, the question type, and the student's selection (their answer).


Optional Filters

If you are looking for other ways to filter the data, you can select the filter icon in the top-right corner. When you make any selections here, the Run Report button will flash, indicating that you need to re-run the report to show the results for the selected filters.

