Around our rating element, Content Designer offers a comprehensive, customizable data source for the Moodle report builder. Thanks to its variable logic, you can create cross-course, data-driven evaluations. Each rating element can be assigned to a variable, and variables can be grouped into variable types, allowing for diverse and flexible reporting schemes.
For example, rating results can be compared over time, showing how a learner evaluates themselves before and after a learning unit. Instructors teaching multiple courses can also be evaluated, making it possible to identify consistent strengths or areas for improvement across different courses. Additionally, classic metrics like the Net Promoter Score (NPS) can be analyzed, either by variable assignment or by course creator.
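To make the NPS idea concrete, here is a minimal Python sketch of how such a score could be computed from exported rating data. It assumes responses are available as plain 0–10 scores; the list and function names are illustrative only and are not part of the plugin or the report builder.

    # Minimal sketch: Net Promoter Score from exported 0-10 rating scores.
    # The "scores" list stands in for values pulled from a report export;
    # it is illustrative only, not the plugin's actual data format.
    def net_promoter_score(scores):
        """Return NPS as % promoters (9-10) minus % detractors (0-6)."""
        if not scores:
            return 0.0
        promoters = sum(1 for s in scores if s >= 9)
        detractors = sum(1 for s in scores if s <= 6)
        return 100.0 * (promoters - detractors) / len(scores)

    scores = [10, 9, 8, 7, 10, 6, 9, 3, 10, 8]
    print(f"NPS: {net_promoter_score(scores):.1f}")  # NPS: 30.0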
The data source captures not only the course and activity where the rating was placed but also who rated what, when, and how. Course creators can also add individual labels to each rating, ensuring that every response is clearly identified and available for reporting.
User responses are recorded both as a numeric value and as a label. For example, a traditional Likert-scale response is reported with both the numeric score (e.g., “5”) and the corresponding label (e.g., “Strongly agree”), allowing for flexible and detailed analysis.
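To illustrate how this dual format supports analysis, the short Python sketch below aggregates hypothetical exported responses once by numeric score and once by label. The field names ("score", "label") are assumptions made for the example, not the actual export schema.

    from collections import Counter

    # Hypothetical exported responses: each row carries both forms of the answer.
    # Field names are assumptions for this example, not the real export schema.
    responses = [
        {"score": 5, "label": "Strongly agree"},
        {"score": 4, "label": "Agree"},
        {"score": 5, "label": "Strongly agree"},
        {"score": 2, "label": "Disagree"},
    ]

    # Numeric analysis: average score across all responses.
    average_score = sum(r["score"] for r in responses) / len(responses)

    # Label-based analysis: how often each answer text was chosen.
    label_counts = Counter(r["label"] for r in responses)

    print(f"Average score: {average_score:.2f}")  # Average score: 4.00
    print(label_counts.most_common(1))            # [('Strongly agree', 2)]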

✔️ Cross-course, data-driven insights
Compare ratings over time and across multiple courses, identify patterns, and evaluate instructors’ impact consistently.
✔️ Flexible variable and reporting structure
Assign ratings to variables and group them into variable types, enabling customized reporting schemes like skill growth, instructor feedback, or Net Promoter Score (NPS) analyses.
✔️ Detailed and dual-format data capture
User responses are recorded both numerically and by label, alongside the course, activity, user, and time of each rating.