Back to basics: let’s review the key dimensions that structure digital assessment processes and assemble them into a systematic overview. Taken together, they raise questions about the current educational system’s ability to fully grasp this challenge. It’s complex and the stakes are high, so let’s take a deep breath and think critically!
I’ve identified 14 dimensions that digital assessment designers must consider. Mapping your project against them is a good way to position your work and refine your approach.
Dimension 1: Purpose
Purpose | Comparison basis | Examples |
---|---|---|
Formative | Against training content | An online quiz after a lesson to check students’ immediate understanding. |
Summative | Against training program | An online exam at the end of a module to assess overall knowledge acquired. |
Certifying | Against a diploma framework | A certification exam to obtain a diploma or professional qualification. |
Diagnostic | Expected difficulties | A test to identify gaps in students’ knowledge or skills. |
Placement | Based on knowledge/competence thresholds | An initial test to determine students’ prior knowledge, helping to place them in the appropriate course. |
Normative | Performance relative to a group | A standardized test where students’ results are compared to a reference group, like national or international tests. |
Criterion-based | Mastery thresholds across various domains | An evaluation where performance is measured against predefined criteria, such as a driver’s license test. |
Standardized | Uniform administration | A test administered identically for all participants to ensure comparability of results, like PISA or national exams. |
Performance-based | Focused on activity-based performances | A practical evaluation where a student must demonstrate specific skills, such as a musical recital or project presentation. |
Aptitude-based | Measures potential | A test assessing a student’s innate abilities or potential, like verbal or logical aptitude tests for career orientation. |
Efficiency-based | Measures speed of learning | An evaluation that measures how quickly a student acquires new skills, such as timed mental math tests. |
Self-assessment | Helps students shape their learning methods | A questionnaire or rubric that allows students to self-evaluate, reflect on their learning, and identify strengths and areas for improvement. |
Control-based | Verifies expected work (follows teacher’s rules) | Regular check-ins to ensure students complete assigned tasks, like submitting assignments or participating in virtual classes. |
Dimension 2: Modalities
Instructions ↓ / Responses → | Written | Oral | Visual | In Action | In a Simulation
---|---|---|---|---|---
Written | Create a short text (open-ended response) | Read aloud a sentence in English | Match labels | Click to respond | Set up a context to perform a task |
Oral | Open-ended response | Dialogue | Draw on a Canvas | Click to respond | Dialogue |
Visual | The instruction relies on a diagram for an open-ended response | The instruction relies on a diagram for an open-ended response | Modify a diagram | Perform an operation on a diagram (clicking areas) | Drawing/graphic action like painting, circling… |
By Simulation | Fill in text fields within the simulation | Oral interaction within the simulation (phone call) | Graphic interaction within the simulation | Interaction with a defined interface in the simulation | Restore a situation, modify, sequence necessary actions… |
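This matrix lends itself to being encoded directly in an authoring tool, so that item configurations can be validated before a test is assembled. Here is a minimal TypeScript sketch; the names (`InstructionModality`, `ResponseModality`, `describe`) are hypothetical, not taken from any particular platform:

```typescript
// Hypothetical model of the instruction/response matrix above; names
// are illustrative, not from any real platform.
type InstructionModality = "written" | "oral" | "visual" | "simulation";
type ResponseModality = "written" | "oral" | "visual" | "inAction" | "inSimulation";

interface ItemModality {
  instruction: InstructionModality;
  response: ResponseModality;
}

// Every cell of the matrix above is filled, so all combinations are
// admissible in principle; a real platform would narrow this set to
// what its test player actually supports.
function describe(item: ItemModality): string {
  return `${item.instruction} instruction, ${item.response} response`;
}

console.log(describe({ instruction: "oral", response: "inSimulation" }));
```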
Dimension 3: Sequencing of Tests/Items
Sequencing | Descriptions |
---|---|
Adaptive | The next item depends on the correctness of the previous one (see the sketch after this table).
Linear | Items follow a fixed order; answers are final once submitted.
Non-linear | Students can revisit and change their answers. |
Conditional | Some questions guide the next steps in the questionnaire. |
Adaptive branching | Following questions depend on previous answers, creating a personalized path. |
Independent modules | Students can choose the order in which they complete different evaluation modules. |
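To make the “Adaptive” row concrete, here is a minimal sketch of difficulty-driven item selection. The `Item` shape, the 1-to-5 difficulty scale, and the one-step-up/one-step-down rule are illustrative assumptions, not a reference implementation:

```typescript
// Minimal sketch of adaptive sequencing: the next item depends on the
// correctness of the previous answer. Difficulty scale and selection
// rule are illustrative assumptions.
interface Item {
  id: string;
  difficulty: number; // e.g., 1 (easy) to 5 (hard)
}

function nextItem(pool: Item[], current: Item, wasCorrect: boolean): Item | undefined {
  // Aim one difficulty step up after a correct answer, one step down otherwise.
  const target = current.difficulty + (wasCorrect ? 1 : -1);
  // Pick the remaining item whose difficulty is closest to the target.
  return pool
    .filter((i) => i.id !== current.id)
    .sort((a, b) => Math.abs(a.difficulty - target) - Math.abs(b.difficulty - target))[0];
}

const pool: Item[] = [
  { id: "q1", difficulty: 1 },
  { id: "q2", difficulty: 2 },
  { id: "q3", difficulty: 3 },
  { id: "q4", difficulty: 4 },
];

console.log(nextItem(pool, pool[1], true)); // picks the item nearest difficulty 3
```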
Dimension 4: Support
Support (in addition to instructions) | Descriptions |
---|---|
None | The question is complete and needs no additional support. |
Single stimulus | A single stimulus may relate to one or more questions. |
Multiple stimuli | Several stimuli support the questions, e.g., information retrieval across multiple documents.
Controlled external resources | Access to specific documents permitted by the teacher. |
Open web navigation | To be used with caution. |
Open book | The questions are designed so that consulting learning resources is necessary.
Applications | Use of simulations, virtual labs, or immersive environments. Tools like GeoGebra. |
Physical materials | Instruments, models, or specific equipment required for certain practical evaluations. |
Dimension 5: Frequency
Frequency | Examples |
---|---|
Single | Summative/graduating evaluation. |
Multiple | Continuous assessment. |
Regular | Weekly or monthly follow-up on learning. |
Continuous | Portfolio mode: every piece of student work counts toward the evaluation.
Ongoing | Evaluation used during lessons. |
Sporadic | Spot checks to encourage work discipline. |
Student-requested | The student chooses when to take an evaluation. |
Dimension 6: Assistance
Assistance | Descriptions |
---|---|
None | No assistance provided. |
Immediate feedback | Instant feedback after each response to guide learning. |
Hints (tips) | Hints guide students toward the answer, e.g., by eliminating choices (see the penalty-scoring sketch after this table).
Comments/Reminders | Comments or reminders of course content; using such assistance can factor into the score as a penalty.
Step-by-step guidance | Progressive assistance to help students solve complex questions. |
Resource bank | Access to explanations or lessons related to the questions. |
Optional assistance | Students can ask for help. |
Solution | Correct answer provided after submission. |
Detailed review at the end | Helps students avoid repeating mistakes. |
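The “Hints” and “Comments/Reminders” rows mention that assistance can carry a score penalty. A minimal sketch of penalty scoring, assuming an illustrative penalty rate and a floor at zero:

```typescript
// Sketch of hint-penalized scoring: each hint consumed deducts a
// fraction of the item's points. The penalty rate and the zero floor
// are illustrative assumptions.
interface Attempt {
  correct: boolean;
  hintsUsed: number;
}

function scoreItem(attempt: Attempt, maxPoints = 1, penaltyPerHint = 0.25): number {
  if (!attempt.correct) return 0;
  return Math.max(0, maxPoints - attempt.hintsUsed * penaltyPerHint * maxPoints);
}

console.log(scoreItem({ correct: true, hintsUsed: 0 })); // 1
console.log(scoreItem({ correct: true, hintsUsed: 2 })); // 0.5
console.log(scoreItem({ correct: true, hintsUsed: 5 })); // 0 (floored)
```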
Dimension 7: Control
Control (Proctoring) | Descriptions/Considerations |
---|---|
Control strategy | The control level should match the evaluation stakes. |
Generating equivalent but different questions | Items are designed to generate variation automatically. |
Shuffling responses | Option at the item level, e.g., MCQ options, ordering items, matching pairs (see the shuffling sketch after this table).
Shuffling questions | Option at the test level. |
Student isolation: headset and webcam | Limits communication with neighboring students in a computer lab. The webcam can monitor surroundings; the microphone may remain on to detect voices. |
Limiting internet access | Specific URLs are blocked, though risks remain (the blocklist may miss some).
Banning internet access | Total isolation to prevent messaging or AI usage. |
Mobile phone ban | Especially for messaging apps. |
Online surveillance | Student navigation can be monitored, with warnings of consequences post-test. |
Reinforced authentication | Secure access to the assessment platform. |
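Shuffling is only auditable if it is reproducible, which means seeding it per student. A minimal sketch using a Fisher-Yates shuffle driven by a small seeded PRNG; the `mulberry32` generator and the seeding scheme are assumptions for illustration:

```typescript
// Sketch of per-student shuffling (question order or MCQ options) with
// a seeded PRNG, so the same student always sees the same order and
// grading disputes can be reproduced. Generator choice is illustrative.
function mulberry32(seed: number): () => number {
  return () => {
    seed = (seed + 0x6d2b79f5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), seed | 1);
    t = (t + Math.imul(t ^ (t >>> 7), t | 61)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

function shuffle<T>(items: T[], rand: () => number): T[] {
  const out = [...items];
  // Fisher-Yates shuffle driven by the seeded generator.
  for (let i = out.length - 1; i > 0; i--) {
    const j = Math.floor(rand() * (i + 1));
    [out[i], out[j]] = [out[j], out[i]];
  }
  return out;
}

const rand = mulberry32(42 /* e.g., a hash of studentId + testId */);
console.log(shuffle(["A", "B", "C", "D"], rand));
```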
Dimension 8: Material & Infrastructure
Material/Infrastructure | Descriptions |
---|---|
Computer lab at an educational institution | Uniform equipment and controlled environment. |
Personal computer | Specific precautions for a smooth evaluation process, e.g., monitoring sessions remotely using webcams. |
Tablet | Often suited to specific formats. Online tests should follow responsive design rules. |
Mobile phone | Different modalities possible; the environment can be controlled (BYOD). |
Network access: internet/local network | Constraints come with network access; strictly local applications are logistically difficult for large groups. |
Accessibility | Consideration for students with disabilities (screen readers, subtitles, adapted interfaces). |
Dimension 9: Grading Modalities
Grading modality | Descriptions |
---|---|
Score calculation | Scoring rules should be designed when the assessment is conceived and must align with the assessment type, to simplify usage and reduce post-processing.
Automated grading | Simple algorithms verify responses against a predefined scheme of correct answers (see the sketch after this table).
Human grading | Requires a correction platform and trained human coders for consistency. |
AI grading | Specific AI procedures for grading, though LLMs are unreliable for this purpose. |
Multi-grading and mixed grading | Combination of automated, human, and AI grading for open-ended questions. |
Self-assessment | Students correct themselves using provided criteria, often in digital tools. |
Peer assessment | Students correct each other’s work, fostering collaborative learning. |
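For the “Automated grading” row, here is a minimal sketch of grading against a predefined answer key. The key format (one correct option id per MCQ item) and one point per item are illustrative assumptions:

```typescript
// Minimal sketch of automated grading against a predefined answer key.
// Key format and scoring (one point per item) are assumptions.
type AnswerKey = Record<string, string>; // itemId -> correct option id

interface Submission {
  studentId: string;
  answers: Record<string, string>; // itemId -> chosen option id
}

function grade(submission: Submission, key: AnswerKey): { score: number; max: number } {
  let score = 0;
  const items = Object.keys(key);
  for (const itemId of items) {
    if (submission.answers[itemId] === key[itemId]) score += 1;
  }
  return { score, max: items.length };
}

const key: AnswerKey = { q1: "b", q2: "d" };
console.log(grade({ studentId: "s42", answers: { q1: "b", q2: "a" } }, key)); // { score: 1, max: 2 }
```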
Dimension 10: Data Usage
Data usage | Descriptions |
---|---|
Scores for group comparison | Standardization: a student’s level or acceptability relative to the tested population.
Profile definition | Strengths/weaknesses related to the tested activity. |
Threshold verification | Checking whether knowledge or competence thresholds are met (see the sketch after this table).
Review | Detailed breakdown of correct/incorrect answers. |
Recommendations | Comments and advice. |
Predictive analysis | Using data to anticipate learning needs or identify risks of failure. |
Personalized learning | Adapting teaching content based on assessment results. |
Dashboards | Visualizing performance for students and teachers. |
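The “Threshold verification” and “Profile definition” rows combine naturally: per-domain scores compared against mastery cut-offs yield a strengths/weaknesses profile. A minimal sketch, assuming an illustrative 0-to-1 scale and hypothetical domain names:

```typescript
// Sketch of per-domain threshold verification producing a simple
// strengths/weaknesses profile. Scale and domain names are illustrative.
interface DomainResult {
  domain: string;
  score: number; // 0..1
  threshold: number; // mastery cut-off, 0..1
}

function profile(results: DomainResult[]): { mastered: string[]; toReview: string[] } {
  return {
    mastered: results.filter((r) => r.score >= r.threshold).map((r) => r.domain),
    toReview: results.filter((r) => r.score < r.threshold).map((r) => r.domain),
  };
}

console.log(
  profile([
    { domain: "algebra", score: 0.8, threshold: 0.7 },
    { domain: "geometry", score: 0.5, threshold: 0.7 },
  ])
); // { mastered: ["algebra"], toReview: ["geometry"] }
```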
Dimension 11: Evaluation Evolution
Evaluation evolution | Examples |
---|---|
Public release | Practice exercises for students. |
Confidential storage | Question banks. |
Historical preservation | Comparing results over time. |
Archiving | Keeping past assessments to avoid repeating items or to enable comparison.
Regular updates | Updating questions to reflect changes in the subject or curriculum. |
Feedback on question quality | Collecting feedback from students and teachers to improve assessments. |
Versioning | Tracking changes made to assessments. |
Dimension 12: Pedagogical Aspects
Pedagogical aspects | Descriptions |
---|---|
Alignment with learning objectives | Ensuring each question evaluates a specific goal (see the tagging sketch after this table).
Bloom’s Taxonomy | Classifying questions by cognitive level (knowledge, comprehension, application, analysis, synthesis, evaluation). |
Variety of question types | MCQs, open-ended questions, case studies, problem-solving tasks. |
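To check alignment and cognitive coverage in practice, items can be tagged with a learning objective and a Bloom level. A minimal sketch, with hypothetical objective ids:

```typescript
// Sketch of tagging items with a learning objective and a Bloom level,
// then grouping by objective to inspect coverage. Objective ids are
// illustrative assumptions.
type BloomLevel =
  | "knowledge"
  | "comprehension"
  | "application"
  | "analysis"
  | "synthesis"
  | "evaluation";

interface TaggedItem {
  id: string;
  objective: string;
  bloom: BloomLevel;
}

function coverage(items: TaggedItem[]): Map<string, BloomLevel[]> {
  const byObjective = new Map<string, BloomLevel[]>();
  for (const item of items) {
    const levels = byObjective.get(item.objective) ?? [];
    levels.push(item.bloom);
    byObjective.set(item.objective, levels);
  }
  return byObjective;
}

console.log(
  coverage([
    { id: "q1", objective: "OBJ-1", bloom: "knowledge" },
    { id: "q2", objective: "OBJ-1", bloom: "application" },
  ])
); // Map { "OBJ-1" => ["knowledge", "application"] }
```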
Dimension 13: Security and Confidentiality
Security & confidentiality | Descriptions |
---|---|
Protection of personal data | Compliance with regulations (e.g., GDPR). |
Platform security | Protection against intrusions and information leaks. |
Dimension 14: Impact on Learning
Impact on learning | Descriptions |
---|---|
Student motivation | How does the evaluation influence student engagement and motivation? |
Stress and well-being | Considering the psychological impact of assessments on students. |