Sharing iOBE Data Safely to Protect Data Privacy

Sunday, August 10, 2025

The iOBE system generates valuable analytics on student performance, offering deeper insights to educators, students, and institutions alike.

While these benefits are clear, the way in which iOBE data is shared is equally important. All sharing of iOBE outputs on this website strives to follow relevant data privacy standards.

In this post, I will outline the principles and practices I follow when sharing iOBE data, providing context, examples, and illustrations of how these data are presented and shared safely and effectively.

1. Data is Presented Only in Aggregated and Collective Form

All iOBE outputs that I share online or publish formally are aggregated summaries of assessment results. These are typically presented as Box Plots or Population Charts (as shown in the figure below) that represent the overall performance of a class or cohort.
Key characteristics of this approach:
  • No row-level data is released. That is, no dataset is published where each row corresponds to a specific student’s marks across assessment elements.
  • Individual scores are never displayed directly; instead, they are merged into collective statistics that summarise group trends.
  • No personal identifiers (names, IDs, email addresses, or any other identifiable attributes) are ever included in the published outputs.
This method ensures that while the general performance of the group is clearly communicated, no single student’s results can be isolated or traced back to them.
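
For readers who prefer to see the idea in code, here is a minimal Python sketch of what "aggregation only" means in practice. iOBE itself is a compiled MATLAB application, so this is an illustration of the principle rather than its actual implementation, and the marks in the example are made up.

```python
import numpy as np

def aggregate_marks(marks):
    """Reduce individual marks to collective statistics only.

    The returned summary contains no row-level data, so no single
    student's result can be recovered from it.
    """
    marks = np.asarray(marks, dtype=float)
    return {
        "n": int(marks.size),                      # cohort size
        "min": float(marks.min()),
        "q1": float(np.percentile(marks, 25)),
        "median": float(np.median(marks)),
        "q3": float(np.percentile(marks, 75)),
        "max": float(marks.max()),
    }

# Illustrative marks only -- not real student data.
class_marks = [45, 52, 58, 61, 63, 67, 70, 74, 78, 85]
print(aggregate_marks(class_marks))
```

Only the dictionary of summary statistics would ever be shared; the underlying list of individual marks stays private.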

2. Use of Simulated or Adapted Datasets When Necessary

Where possible, iOBE visuals are generated from simulated or adapted datasets — particularly in demonstration, training, or public outreach contexts.

When real data is used, the following safeguards apply before any of it is shared online:
  • Removal of all identifiers before any analysis or visualisation takes place.
  • Presentation in aggregated visual form only, never as individual rows or lists.
  • Exclusion of small group data where small sample sizes could increase the likelihood of indirect identification.
In certain cases, real datasets may undergo minor controlled modifications (e.g., small randomised adjustments to scores) to further reduce the risk of re-identification, while keeping overall trends intact.
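
As an illustration of how these safeguards might be chained together, here is a hedged Python sketch. The column names, the minimum group size of 10, and the jitter magnitude are assumptions chosen for the example, not fixed rules of the iOBE workflow.

```python
import numpy as np
import pandas as pd

MIN_GROUP_SIZE = 10  # assumed threshold below which data is not shared

def prepare_for_sharing(df, score_cols, jitter=0.0, seed=0):
    """Apply the safeguards above before anything is shared publicly.

    df is assumed to hold identifier columns (e.g. 'name', 'student_id')
    alongside the assessment score columns listed in score_cols.
    """
    shared = df[score_cols].copy()            # 1. drop all identifiers
    if len(shared) < MIN_GROUP_SIZE:          # 2. exclude small groups
        raise ValueError("Group too small to share safely")
    if jitter > 0:                            # 3. optional controlled noise
        rng = np.random.default_rng(seed)
        shared = (shared + rng.normal(0, jitter, shared.shape)).clip(0, 100)
    return shared.describe()                  # 4. release aggregates only

# Illustrative usage (column names are hypothetical):
# safe_summary = prepare_for_sharing(results, ["understand", "strategize", "solve"], jitter=1.0)
```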

3. Compliance with Legal and Ethical Data Protection Standards

The iOBE data-sharing process has been designed to align with:
  • The Personal Data Protection Act (PDPA), insofar as it applies to student assessment data.
  • Best practices in educational data protection, as recommended in academic and institutional guidelines.
Specific compliance measures include:
  • Anonymisation by aggregation – Once scores are summarised in statistical form, they no longer constitute personal data under PDPA definitions, provided they cannot be used to identify an individual.
  • Removal of identifiers – No names, student IDs, or indirect identifiers are included in shared outputs.
  • No re-identification risk – Outputs are constructed in a way that prevents linking back to an individual, even if additional external information is available.
This compliance framework ensures that published visuals are not only useful but also legally sound and ethically responsible.

4. Advantages of Aggregated Visual Sharing Over Row-Based Lists

It is still common practice in some settings to share results in row-based lists—often anonymised using passcodes or partial student IDs. While this may appear to protect privacy, it has several drawbacks:
  • Re-identification risk remains – Students can often deduce each other’s codes, especially in small cohorts or close-knit groups.
  • Limited insight – Row lists only show individual scores without revealing the broader performance trends.
  • Increased complexity for interpretation – Students must scan through multiple rows to guess where they stand compared to peers.
By contrast, iOBE visual outputs:
  • Provide instant performance context through statistical summaries.
  • Eliminate direct identification risk by avoiding row-level disclosures entirely.
  • Communicate trends, variability, and benchmarks more efficiently than lists.
In short, iOBE’s method not only meets privacy standards but also delivers richer and more actionable insights.

5. Purpose and Benefits of the iOBE Data-Sharing Approach

The aim of sharing iOBE data is twofold:
  • Educational value – To provide clear and meaningful feedback to students and educators, enabling targeted improvement.
  • Trust and transparency – To demonstrate that data is handled with care, in line with legal and ethical obligations.
For educators, these visuals:
  • Highlight class-wide strengths and weaknesses.
  • Support evidence-based teaching decisions.
  • Serve as a communication tool that is both privacy-conscious and information-rich.
For students, aggregated visuals help them:
  • Understand their position in relation to class medians, quartiles, and performance ranges.
  • Recognise areas where improvement is needed, without the discomfort of being individually exposed.

6. Summary of Data Privacy Safeguards

To reiterate, all iOBE data sharing follows these principles:
  • Aggregated data only – No individual records are published.
  • No student identifiers – Neither direct nor indirect identifiers are included.
  • Effective anonymisation – Data is presented in a form that prevents re-identification.

Final Note

The iOBE data-sharing framework reflects a commitment to maximising educational benefit while minimising privacy risks. By replacing traditional row-based result lists with aggregated visual summaries, educators can share important performance insights in a format that is clear and secure.

For further details on implementing this approach, please feel free to contact me here.


Using iOBE to Assess Students' Ability to Solve Problems

Wednesday, July 30, 2025

What Are We Really Measuring in Problem-Solving?

In engineering education — and in many other disciplines — we often highlight “problem-solving ability” as a core learning outcome. It's written in course syllabi, emphasized in program objectives, and cited as a critical skill for future professionals. But when it comes to actual assessment, there’s often a disconnect between this aspiration and the methods we use to evaluate students.

Most assessments tend to focus narrowly on whether the student arrived at the correct answer, sometimes with partial credit awarded for showing steps. While this approach may capture the outcome of a student’s work, it often fails to reveal how the student got there — what they understood about the problem, how they planned their approach, and what choices they made during the solution process.

In other words, we claim to value problem-solving as a skill, but we rarely assess the cognitive process behind it. What remains invisible is the thinking journey — the way a student interprets the question, builds a mental model, navigates uncertainties, and reflects on the plausibility of their answer. This gap matters, because it’s the process — not just the final result — that defines whether a student truly knows how to solve problems.

A Structured Way to Solve Problems

If we want to assess students' problem-solving ability more meaningfully, then we need to define it more carefully—not just as a destination, but as a process. Over the past decade, I’ve been developing and refining a structured method I call the 3-Step Approach, designed to help students tackle problems in a more intentional and organized way.

The three core steps are:
  1. Understand
  2. Strategize
  3. Solve
A fourth step, Evaluate, is often included when appropriate to encourage students to reflect on the reasonableness, limitations, or broader implications of their solution. However, I intentionally don't include it as part of the 3-Step process, since many test or exam questions tend to involve smaller-scale problems that do not require deeper evaluation beyond arriving at a solution.

Each of these steps reflects a distinct stage of cognition. Rather than diving straight into equations, students are encouraged to begin by grasping the essence of the problem — through sketching, interpreting what is known and unknown, and identifying what’s being asked. Once that foundation is clear, they proceed to plan their approach before executing it. Finally, they revisit their solution with a critical lens.

This structure not only mirrors how expert problem-solvers actually work, but it also gives students a practical, repeatable framework they can carry into different contexts — be it academic, professional, or real-world.

To support students more concretely, I’ve expanded this structured approach into a 3x3 Framework, where each major step is broken down into three sub-steps. This helps guide learners more precisely through the thinking process. You can explore this framework — and the story behind its development — in these two Medium articles:

By introducing this structure in class, I found that many students who previously struggled with vague, intuitive approaches began to approach problems more deliberately. The framework gave them something to hold onto — a map of what it means to "solve" rather than just "answer."

But even with this structure in place, one important question remained: How do we assess these steps in a way that is consistent, meaningful, and actionable — for both lecturers and students?

Assessing These Steps with iOBE

Having a structured method to solve problems is only part of the solution. The next challenge is how to assess students along that structure — consistently, meaningfully, and at scale.

Over the years, I’ve developed a set of Master Rubrics that allow me to evaluate student performance across the key problem-solving steps: Understanding, Strategizing, Solving, and, where applicable, Evaluating. These rubrics are designed to be flexible enough to apply across various question types and difficulty levels, while still preserving the logic of the 3-Step Approach. They are also applicable to other assessment types, including project presentations, technical reports, and essays.
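
To give a concrete (if highly simplified) sense of how rubric bands can be turned into marks, here is a small Python sketch. The step names follow the 3+1 structure described above, but the 1-4 band scale, the equal weighting, and the scoring arithmetic are illustrative assumptions, not the Master Rubrics themselves.

```python
# The actual Master Rubrics are not reproduced here; the 1-4 band scale and
# equal weighting below only illustrate the general structure.
RUBRIC_STEPS = ["Understanding", "Strategizing", "Solving", "Evaluating"]

def score_question(bands, max_band=4, weights=None):
    """Convert rubric bands awarded per step into a percentage mark.

    bands   : dict mapping step name -> band awarded (1..max_band)
    weights : optional dict of per-step weights (defaults to equal weighting)
    """
    assert set(bands).issubset(RUBRIC_STEPS), "unknown step name"
    weights = weights or {step: 1.0 for step in bands}
    earned = sum(weights[s] * bands[s] for s in bands)
    possible = sum(weights[s] * max_band for s in bands)
    return 100.0 * earned / possible

# Example: a student who understood the problem well but barely evaluated it.
print(score_question({"Understanding": 4, "Strategizing": 3, "Solving": 2, "Evaluating": 1}))
# -> 62.5
```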

Once students’ marks are assigned across these stages, the next step is to make sense of the data. This is where the iOBE software becomes essential. It allows me to convert these staggered evaluations into clear visual patterns that reveal how the process unfolds — not just for individual students, but for the class as a whole.

Here’s how it works:
  • First, the marks are entered into a pre-formatted Excel sheet. If the data is already available, this step usually takes less than five minutes.
  • Then, with a single click, iOBE extracts the data and processes it within seconds — even for an entire course. (Users interact with the software through its simple GUI panel, as shown in the figure below.)
  • The software then automatically generates two key visualizations (in addition to other data): Box Plots and Population Charts.
The Box Plots display statistical summaries for each stage — minimum, maximum, median, and quartile ranges — allowing us to see how tightly or widely student performance is spread.

The Population Charts group students by performance bands, using intuitive color coding (e.g., green for high performers, red for those struggling). These charts make it immediately clear how many students fall within each outcome range, and where the bulk of the class stands.
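
The sketch below approximates those two visualizations in Python under assumed file and column names; it is not how iOBE (a MATLAB application) produces them, but it shows the kind of computation involved: quartile summaries for the box plots and banded counts for the population chart.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Assumed layout of the pre-formatted sheet: one row per student and one
# column per stage; the file and column names here are hypothetical.
stages = ["Overall", "Understanding", "Strategizing", "Solving", "Evaluating"]
df = pd.read_excel("course_marks.xlsx", usecols=stages)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Box plot: median, quartiles, and spread for each stage.
df[stages].plot.box(ax=ax1, rot=45)
ax1.set_ylabel("Marks (%)")

# Population chart: number of students in each performance band, per stage.
bands = [0, 40, 60, 80, 100.01]                 # upper edge covers a mark of 100
labels = ["<40", "40-59", "60-79", "80-100"]
counts = {s: pd.cut(df[s], bands, labels=labels, right=False).value_counts().sort_index()
          for s in stages}
pd.DataFrame(counts).T.plot.bar(stacked=True, ax=ax2,
                                color=["red", "orange", "yellowgreen", "green"])
ax2.set_ylabel("Number of students")

plt.tight_layout()
plt.show()
```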

A Real Example from My Course

To illustrate, here’s an example from one of my courses:


To produce the graphs above, I compiled student marks from several problem-solving questions across two separate tests. Each question was evaluated using the same Master Rubrics on the 3+1 steps for solving problems. After importing the data into iOBE, the software aggregated the results and produced visual outputs that revealed more than raw marks ever could.

Each of the two charts in the figure above includes five columns. These columns, from left to right, are:
  1. Overall marks
  2. Understanding
  3. Strategizing
  4. Solving
  5. Evaluating
This layout mirrors the structure of the 3-Step Approach (plus Evaluation), allowing a direct comparison between students’ performance across each phase of the problem-solving process.
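
For readers curious about the aggregation behind those five columns, here is a simplified Python sketch. It assumes each question has already been scored against the four steps and simply averages across questions per student; the actual compilation in iOBE may weight questions differently.

```python
import pandas as pd

STEPS = ["Understanding", "Strategizing", "Solving", "Evaluating"]

def compile_step_marks(question_frames):
    """Average each step across all rubric-marked questions, per student.

    question_frames: list of DataFrames (one per question), each indexed by an
    anonymised student key, with one column per step (column names assumed).
    """
    stacked = pd.concat(question_frames)                # all questions together
    per_step = stacked.groupby(level=0)[STEPS].mean()   # mean per student, per step
    per_step.insert(0, "Overall", per_step[STEPS].mean(axis=1))
    return per_step

# Illustrative usage: q1..q4 would be the per-question mark sheets from two tests.
# combined = compile_step_marks([q1, q2, q3, q4])
# combined.describe()   # the aggregate view behind the five columns
```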

What emerged from these charts was a pattern I’ve seen repeatedly over the years — and one that often goes unnoticed without tools like iOBE. Performance tends to start relatively strong in Understanding, then declines noticeably in Strategizing, and continues downward through Solving and Evaluating. The final step is often the most overlooked, even though it plays a critical role in developing reflective thinkers.

These insights don’t just confirm what we suspect — they give us a clear, visual map of student cognition, which we can respond to through our teaching.

What the Data Reveals

When viewed together, the data from the class example revealed a consistent and telling pattern — one that aligns with my experiences across multiple cohorts over the years.

As we trace student performance from Understanding to Evaluating, there is a noticeable and progressive decline. The box plots and population charts reflect this clearly: students start reasonably well with identifying and interpreting the problem, but their performance drops as they move into planning, executing, and evaluating their solutions.

This is more than just a grading trend — it reflects how students engage with the cognitive demands of problem-solving.

Most students begin with a fair grasp of the problem. Their Understanding step is often adequate, although not always complete. They may sketch something, list what is known, and state the target variable. However, this step also includes making appropriate assumptions to model the physical situation effectively — a sub-skill that many students overlook or apply loosely. This oversight weakens the foundation upon which the rest of the solution process depends.

The next phase—Strategizing—is where difficulties become more apparent. Some students either engage with it only superficially or even skip this step entirely. Rather than selecting a coherent path toward the solution, they often default to recalling formulas from familiar examples without confirming if those are actually relevant. Their plan, if present, lacks structure or justification.

This weak planning carries over into the Solving phase. Execution becomes error-prone when students operate without a clear roadmap. Even when the correct equations are known, their use may be inconsistent or fragmented. Students may substitute values prematurely, mismanage units, or misapply boundary conditions. One critical part of this step — often neglected — is the internal verification of results. This includes checking whether the final answer has appropriate units, whether it is within a plausible range, and whether the outcome aligns with expectations. These checks belong to the end of Step 3, and when omitted, they increase the likelihood of unnoticed errors.

The Evaluation step, which comes after solving, involves a broader kind of reflection. It asks students to consider the meaning, implications, or limitations of their results. In real-world contexts, this might involve asking: What does this result tell us? Can it inform a design decision? Would the solution still hold under different conditions? Yet in classroom settings, this step is rarely emphasized. As a result, most students treat their answer as an endpoint rather than a springboard for further thinking.

These patterns — made visible through iOBE’s visualizations — offer more than just confirmation of what we suspect. They provide a structured, data-driven diagnosis of where students are struggling. Rather than guessing whether a low mark was due to weak content knowledge or careless error, we can pinpoint the stage of cognitive breakdown. And this opens up opportunities to respond intentionally — through targeted instruction, guided practice, and more formative feedback.

Sharing Outcomes Data with Students

One of the most immediate benefits of using iOBE is how easily and clearly feedback (in the form of these outcomes data) can be shared with students — often right after a test, assignment, or project evaluation has been completed.

After entering the scores and running the software, I typically show students the aggregated visual results for the entire class. These graphs — always anonymized — help students see where the cohort as a whole performed well, and where they struggled. It also allows each student to reflect on where they stand in relation to others, not just in overall marks, but in the specific cognitive phases of problem-solving: Understanding, Strategizing, Solving, and Evaluating.

This kind of class-level transparency fosters a shared awareness of learning — not as competition, but as a collective journey through complex thinking tasks. It allows students to normalize struggle in certain areas, and to identify stages they personally need to improve.

Importantly, this approach to feedback is not limited to tests. For project work or coursework, the same visual structure can be used, but the plotted metrics (i.e., the horizontal axis) can be adjusted. Instead of using the 3+1 steps, the x-axes might represent course learning outcomes, Bloom’s taxonomy levels (or other taxonomies), or even specific topic categories, as illustrated in the graphs below. iOBE is flexible in this sense — it supports a wide range of learning goals while preserving a common format of feedback.
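
The idea of swapping the horizontal axis can be sketched very simply: map each assessment item to whichever category scheme you want to report against, then regroup the marks by that mapping. The item names and CLO labels below are purely hypothetical.

```python
import pandas as pd

# Hypothetical mapping of assessment items to the categories chosen for the
# x-axis (course learning outcomes here; Bloom's levels or topic categories
# would be handled the same way).
ITEM_TO_CATEGORY = {
    "Q1": "CLO1", "Q2": "CLO1",
    "Q3": "CLO2", "Q4": "CLO3",
}

def marks_by_category(marks_df):
    """Regroup per-item marks (columns Q1..Qn) into per-category averages.

    marks_df has one row per student and one column per assessment item.
    The result plots exactly like the step-based charts; only the
    horizontal axis changes.
    """
    categories = pd.Series(ITEM_TO_CATEGORY)
    return marks_df[categories.index].T.groupby(categories).mean().T

# Illustrative usage:
# by_clo = marks_by_category(test_marks)
# by_clo.plot.box()   # same chart format, different x-axis
```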


What students receive from this visual dataset is more than a percentage grade. They see their performance in context, broken down by stages of thinking or categories of competence. They gain clarity — not just about what went wrong, but about what kind of thinking needs to improve. A student may realize they skipped the planning phase, applied content without reflection, or didn't evaluate the relevance of their result. These are subtle gaps that conventional grading rarely makes visible.

Over time, this kind of feedback helps nurture self-awareness and more self-regulated learning. Students begin to engage not just with whether they scored well, but with how they approached a problem—and what they could do differently next time.

Feedback from Students

Over the years, I've received a great deal of feedback from students on their experience of learning and using this 3-Step Approach to solve problems. Many of them spoke positively about it (usually towards the end of the semester), but quite a few also described difficulties in becoming familiar with the approach and embedding it into their problem-solving routine.

One recent comment I received was quite direct and honest in recalling his initial impression of having to go through this unlearning and relearning process with me:
At the beginning, I thought that this kind of teaching method was just wasting time and making it harder for us to obtain a good grade for this course. In this digital world, we can obtain an answer in a quick and easy way. Why do we need to think so much about a simple question? But after some lessons, I realized that this kind of teaching method not only trained us to think critically and systematically but also helped us appreciate the real application...

...it provided a structural [method] for me to answer an engineering question. In the past, I often rushed into calculations without fully grasping the problem’s intent. This approach slowed me down in a good way, which forced me to pause, sketch diagrams, make assumptions, and clearly define what I knew and what I needed.

Another student wrote the following, reflecting on meaningful - and potentially lifelong - changes in her approach to learning:
Before taking this course, I tended to focus mainly on learning how to perform calculations to answer exam questions. I would memorise the steps and formulas required, often without fully understanding the theory behind the problem. The 3-Step Approach, however, forces me to really understand the theoretical concepts before doing any calculations. As a result, I have found myself learning more meaningfully, and I no longer rely solely on memorisation. I believe this change in myself will be beneficial for my future, regardless of the career path I pursue as the ability to understand a problem thoroughly before attempting to solve it is a fundamental skill in any job.

During the individual test, I really experienced how writing the “Understanding” part helped clarify the concepts in the question. By the time I reached the “Solution” part, I found that it was much easier than expected because I had fully understood the problem...

Why This Matters

In most classrooms, grades reduce a student’s performance into a single number — an average that hides the complexity of how they arrived at that outcome. While useful for summarizing achievement, such scores rarely tell us why a student succeeded or struggled.

What iOBE offers is not just finer resolution in marking — it offers a way to see the architecture of learning as it happens. It reveals trends across stages of thinking, uncovers recurring bottlenecks, and enables us to ask better questions about how students think — not just what they got right.

This kind of insight is valuable for instructors, but just as powerful for students. When learning is reduced to scores, students often assume their only path forward is to “try harder.” But when they see where their thinking faltered — whether in interpretation, planning, execution, or reflection — they gain specific, actionable guidance. They shift from blind repetition to informed revision.

Seen this way, assessment becomes more than a tool for judgment. It becomes a tool for awareness, intervention, and ultimately, growth. And tests are no longer endpoints—but mirrors for deeper learning.

Final Thoughts

Problem-solving is often treated as a black box. We pose a question, wait for an answer, and assign a grade. But what happens between the posing and the solving is where the real work lies — and where the true nature of learning resides.

When we begin to look inside that process — to see how students understand, plan, and reason — we start teaching differently. And when students see their own thinking with clarity, they start learning differently, too.

The iOBE software doesn’t impose a new pedagogy — it makes existing thinking visible. And in doing so, it supports a shift: from evaluating work to understanding minds, from checking answers to cultivating insight.

In the long run, helping students see how they think may be just as important as teaching them what to think. Because when they leave our classrooms, the problems will keep coming. What we give them, in the end, is not just content — but a way of engaging with complexity that lasts far beyond the exam.


Introducing iOBE to Secondary Schools

Monday, June 23, 2025

A Step towards More Insightful Student Assessment at the Secondary Level

Today, I had the opportunity to meet with two teachers from MRSM Transkrian — Cg. Elias and Cg. Amir — in a follow-up session (from an earlier MUAFAKAT meeting) to explore the use of the iOBE software in the MRSM secondary school system.

The conversation built on an earlier introduction, where I presented how iOBE allows schools to visualize student assessment data in compact and insightful ways — especially through box plots and population plots that compress rich information into concise forms, conveying more than the outputs typically available in many assessment systems (at either the school or university level).

This follow-up discussion marked a positive step towards introducing the iOBE software to pre-university school systems — and outside of the tertiary education system for the first time — demonstrating its versatility in adapting to different assessment systems and processes. The main aim is to help schools make sense of their assessment data — not just for reporting, but for deeper insights and meaningful actions.

Simple Setup, Powerful Insights

One of the key strengths of iOBE is its simplicity. Preparing the input requires only a standard Excel spreadsheet containing student assessment data — something schools and teachers already manage regularly. From there, the software is run with just a single click, either for a single subject or across multiple courses.
Figure: The only windows panel that users need to interact with when using the iOBE software.

The process is fast and completely offline. It runs locally on any standard Windows-based computer/laptop and produces results within seconds. This ease of use makes iOBE especially suitable for school environments, where time and infrastructure are often limited.

Visualizing What Really Matters

Where iOBE makes the biggest difference is in how it presents information. Rather than displaying raw marks, general averages, or simple bar charts, the system generates box plots and color-coded population plots that offer a multidimensional view of collective student performance.

These visuals highlight trends and variations in learning outcomes across a group of students. Box plots reveal the spread, median, and quartiles of achievement, while population plots show performance distribution in a more intuitive, color-coded form. With a quick glance, teachers can identify strengths, areas for attention, and outcome-specific patterns — without needing to manually sort or interpret spreadsheets.
Figure: Students' collective performance plotted against the subtopics covered in Mathematics.

Figure: Students' collective performance plotted against the typical nine subjects taken at secondary levels.

Note: the iOBE plots shown above are generated from simulated data, not from real students' data.

Empowering Students, Teachers, and the Wider School Community

The benefits of these visual tools extend beyond the classroom. While the graphs present collective data anonymously, individual students can privately interpret their own position within the broader performance curve. This helps students reflect on where they stand and consider how to improve, based on real evidence.

At the same time, visuals like these can help inform parent-teacher conversations, especially within groups like MUAFAKAT, MRSM’s equivalent of the PIBG groups for national schools. By presenting the overall performance landscape clearly (without revealing information on any specific students), parents and school communities can better understand progress at a cohort level — while respecting individual privacy.

Most importantly, these shared visuals strengthen communication between students, teachers, and parents. Everyone sees the same picture and can work together with greater clarity and purpose.

From Data to Action

More than just a reporting tool, iOBE encourages teachers to turn data into actionable insights. Whether it’s identifying students who may need extra support, reflecting on how well certain learning outcomes are being met, or adjusting teaching strategies based on cohort-level patterns, iOBE helps educators move from intuition to evidence.

It also allows schools to track changes and improvements over time — helping build a clearer picture of learning progress, not just at the end of a term, but throughout the academic year.

Fostering a Culture of Shared Insight

Sharing assessment results — especially in visual form — can be a powerful way to strengthen collaboration across all levels of the school community. When data is presented clearly and collectively, it invites reflection, discussion, and the opportunity to grow together.

The focus of iOBE is to support this kind of environment—one where insights drawn from data help teachers reflect on trends, celebrate progress, and consider areas for continued improvement. At the same time, students and parents gain a better understanding of how learning is progressing across the board, making the entire assessment process more open, meaningful, and responsive.

By encouraging a shared view of outcomes, iOBE helps schools build a stronger bridge between teaching strategies, student performance, and community engagement—moving everyone forward with a clearer sense of direction.


An invitation to an f2f workshop on the iOBE software

Tuesday, June 17, 2025

SEEING BEYOND THE STANDARD OUTCOMES ASSESSMENT

An iOBE Workshop @ USM, Penang | National Training Week 2025
📅 Friday, 20 June 2025
🕗 8:30 AM – 10:30 AM
📍 Auditorium 3, Kompleks Eureka, USM
👋 A Warm Invitation to Educators and Program Coordinators
If you're an educator or program coordinator who’s looking for a better way to understand and improve your student assessment results, I’d like to personally invite you to join a focused, face-to-face workshop at USM Penang.

This session is part of National Training Week (NTW) 2025, and it’s designed to help you rethink how you interpret student outcomes—with the help of a versatile yet easy-to-use software tool called The Integrated OBE (iOBE) Software.

Why This Workshop Matters
Let’s be honest — many of us track student results using averages, total marks, or spreadsheet tables. But often, the outcomes data we collect just sits there — unanalyzed, uncommunicated, and underutilized.

What if we could do more with that data? Imagine turning it into simple, visual graphs that you could share with your students—showing how they performed in a midterm test, a final exam, a project, or even the course overall. These visuals can help students see not just their individual scores, but also where they stand in relation to the rest of the class.
When performance is made visible and meaningful, students often feel more motivated to improve, and educators gain clearer insights into how to adapt their teaching strategies.

This workshop is designed to help you realize that potential — by showing you how to identify patterns, communicate outcomes, and make informed improvements using easy-to-read graphs generated with the iOBE software.

💡 What iOBE Brings to the Table
The iOBE software is a smart and lightweight tool designed specifically for educators to compute, analyze, and visualize student learning outcomes at both the course and program levels.
At the course level, you can:
  • Assess outcomes from tests, assignments, projects, exams, and the overall course
  • Generate compact graphs and statistical summaries for quick decision-making
  • Communicate results directly to students using visual tools like box plots and population graphs
At the program level, iOBE scales up to:
  • Aggregate results from multiple courses
  • Support structured CQI and accreditation reporting
  • Provide visual clarity when presenting outcomes across course groups
Whether you’re working solo on your course or contributing to program-level reviews, iOBE gives you the insights you need—fast.

🧠 What You'll Experience in the 2-Hour Session
This isn't a theory-only session. It’s designed to be interactive and immediately useful, broken into three engaging parts:
  1. Conceptual Session: We’ll start by revisiting the fundamentals of outcome-based assessment, and explore how computed outcomes (rather than raw marks alone) offer deeper insights into student learning and course performance.
  2. Software Demonstration: A guided walk-through of how the iOBE software works, using real or sample datasets. You’ll see how outcomes are mapped, processed, and visualized.
  3. Practical Hands-On Session: You’ll try it out yourself—entering sample data, generating graphs, and discussing how the outputs can inform your teaching and CQI strategies.

🎯 Who Should Join?
This session is designed for:
  • Lecturers, educators, and program coordinators at tertiary institutions
  • Anyone involved in course assessment, CQI documentation, or accreditation processes
  • Educators who want to present assessment results more clearly to students in their classes
  • Those new to OBE tools — you don’t need technical expertise to benefit
If you’re curious about improving how you see, interpret, and act on student outcomes, this session is for you.

📝 How to Register
This workshop is part of National Training Week (NTW) 2025, and registration is open now through the official portal:
📌 Once you’re on the site:
Click ENROL NOW to secure your spot!


Making Assessment Data Meaningful

Wednesday, June 11, 2025

How Compact Visualizations Support Teaching in Higher Education

As academics in higher education, we’re used to dealing with extensive data from exams, coursework, and projects. Often, though, this valuable data ends up in detailed tables and spreadsheets—thorough but sometimes overwhelming. While we rely heavily on this information, its presentation in numeric form can make it challenging to quickly recognize patterns, interpret performance accurately, or take timely action to improve teaching.

A more effective way to approach assessment results is to visualize the data in compact, meaningful forms. Graphs generated by the iOBE software, such as box plots and population plots (see the figure below, plotted against program outcomes), simplify complex information into clear, readable formats, highlighting crucial insights quickly and efficiently.

These visualizations enable educators to quickly identify key patterns and potentially concerning trends in student performance, guiding them in refining and improving their teaching and learning strategies.

Additionally, if the data indicates performance issues arising primarily from student-specific challenges rather than instructional factors, educators can use these insights to provide targeted support, helping students understand their own areas for improvement and encouraging them to proactively engage in enhancing their academic performance.

Why Spreadsheet Tables Can Be Limiting

Most of us have experienced the struggle of sorting through dense spreadsheets. When results are presented only as raw numbers in tables — or reduced to simplistic bar charts that display just a single metric, such as average values — it becomes difficult to grasp the full picture of what’s really happening in our classes. Essential questions may come up but remain tricky to answer clearly:
  • How is the overall performance in my course?
  • Which learning outcomes are consistently challenging for students?
  • Are recent adjustments in teaching methods having any measurable impact?
Answering these questions is considerably easier when the data is presented visually in a compact, information-rich form, allowing quick identification of meaningful trends.

The Power of Visual Summaries

Compact visualizations, like box plots and population distribution graphs, help educators to rapidly understand and interpret data:
  • Box plots instantly reveal performance ranges, median values, and variability, highlighting strengths and areas needing attention at a glance.
  • Population plots clearly illustrate how each student's achievement compares to the entire class, making it easier to provide targeted support or adjustments.
Such visuals not only enhance individual reflection but also facilitate productive conversations among colleagues and with students themselves.

Effective Across Different Academic Scales

One notable strength of visualizing assessment data is the flexibility it provides across different academic scales:
  • Program-level: Visual summaries help coordinators identify long-term trends and achievement patterns, crucial for program reviews, curriculum adjustments, and accreditation.
  • Course-group level: A group of lecturers who manage multiple related courses can collectively analyze outcomes, spot common difficulties, and align their teaching practices.
  • Single course-level: Individual lecturers gain clearer insights into their semester results, facilitating more precise and timely teaching interventions.
  • Micro-level (tests and assignments): Visuals are even useful at the smallest scale, clarifying student responses to particular tests (see the graphs below, plotted against Problem-solving Steps) or assignments, thereby directly informing teaching strategies.
This scalability supports reflective and evidence-based teaching practices, enabling lecturers to continuously refine their approaches.

Driving Better Conversations Through Clearer Data

Ultimately, the greatest value of presenting assessment data visually is its capacity to spark meaningful, practical conversations. With clearer data, educators can quickly pinpoint issues, discuss strategies, and adjust their teaching effectively. Students also benefit from visual clarity—understanding their performance in a broader context empowers them to actively participate in improving their learning.

When we move beyond traditional numeric tables toward concise visual representations, we strengthen the connection between data-driven insights and practical teaching strategies—benefiting educators and learners alike.


iOBE Workshop @ UPNM Engineering

Tuesday, May 20, 2025

We successfully conducted a full-day workshop titled "Understanding and Using iOBE for the Preparation of Course and Programme Assessment Reports" at UPNM Engineering yesterday (Monday 19/5/2025). Many thanks to Assoc. Prof. Dr. Ku Zarina (Academic Deputy Dean) and Assoc. Prof. Dr. Rashdan (HoD for the Aviation Dept.) for the invitation and for hosting this workshop. Huge thanks to all the participants for being good sports throughout the entire full-day session.

This was the second iOBE session for the faculty members @ UPNM Engineering, with the practical aim of having them immediately apply iOBE to their own course data. The workshop started with a theoretical session covering the foundations of OBE assessment, the framework of the software, and the interpretation of its results.

In the afternoon, we ran the hands-on session, where the lecturers practised by running the software on sample courses first, before moving on to their own courses and to aggregating data from several courses together. It was exciting to see all of them successfully run the software on their courses (producing results within seconds) and view their students' performance from many different angles not seen before.

We hope this workshop opens up an opportunity for them to see the potential of looking into their course and programme data at a deeper level, with the outcome of having meaningful data to "close the loop" in the CQI process. We wish them all the best in their upcoming preparation of the Course and Programme Assessment Reports.


iOBE Software Version 7.0: Major Enhancements and New Features

Tuesday, March 18, 2025

I'm excited to announce the release of iOBE version 7.0, marking a significant upgrade from iOBE v6.2. This latest version supports AI-assisted analysis of its outputs and introduces substantial improvements in data visualization, data reporting, computational efficiency, and user experience, while maintaining full compatibility with older input files down to iOBE v6.1. The full version remains freely available to the public, especially for educational institutions seeking robust OBE solutions.


🔹 Key Enhancements in iOBE v7.0

1️⃣ Advanced Visualization and Interactive MATLAB Figures
iOBE now produces high-quality visual outputs as MATLAB figures, giving users enhanced interactivity and flexibility when working with their data. Users can:
✔ Zoom in/out and pan graphs for a closer inspection of data points.
✔ Move and reposition legend boxes for improved clarity.
✔ Resize and refine graph appearances dynamically before exporting them.
✔ Save figures in multiple formats, including PNG, JPEG, PDF, and vector formats (EPS, EMF, SVG), ensuring compatibility with different publishing and reporting needs.
📌 Additional Benefits of using MATLAB Figures
✔ Lossless resizing and scaling, especially useful when exporting figures for research papers, presentations, or reports.
✔ Custom modifications, such as changing axis labels, gridlines, or line styles, even after the figure is generated.
✔ Multi-window support, allowing users to compare multiple datasets simultaneously.
2️⃣ Improved Data Export to Excel
Users can now easily access and analyze iOBE-generated data, which is automatically saved in Excel files. These include:
✔ Student outcomes data at the course, multi-course, and program levels.
✔ Statistical data from box plots (median, quartiles, outliers).
✔ Population distribution data, capturing performance trends at both the course and program levels.
This enhancement allows users to customize their own analysis, integrate data into reports, and use AI tools for deeper insights.
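
As a simple illustration of that customization, the snippet below loads every sheet of an exported workbook into pandas for further analysis. The file name and sheet structure are placeholders; check the actual files produced by your own iOBE run.

```python
import pandas as pd

# File and sheet names are placeholders -- use the names of the Excel files
# that your own iOBE run actually produces.
outcomes = pd.read_excel("iobe_course_output.xlsx", sheet_name=None)  # dict of all sheets

for sheet_name, table in outcomes.items():
    print(f"--- {sheet_name}: {table.shape[0]} rows x {table.shape[1]} columns ---")
    print(table.describe(include="all").head())

# From here the exported statistics (box-plot summaries, population
# distributions, outcome data) can be merged with other records, re-plotted,
# or condensed into a CQI report.
```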

3️⃣ Enhanced User Experience & Front-End Processing
To further streamline workflow and usability, the following upgrades have been implemented:
Improved progress bar tracking: Users now receive detailed updates on the computation stages during software execution, giving them a clearer picture of ongoing processes.

Reduced screen clutter: Data tables are now written directly into Excel files, eliminating unnecessary on-screen outputs and improving processing speed.

Optimized computational processing: The back-end algorithm refinements enhance data processing speed and efficiency, ensuring faster computations and smoother operation.
4️⃣ AI-Assisted Data Analysis Capability
A key enhancement in this version is the ability to leverage AI tools like ChatGPT 4o to perform in-depth analysis of the output files generated by iOBE.
🔹 AI-Driven Insights: The Excel files and visualization charts produced by iOBE contain a substantial amount of computed data. These files can be fed into AI tools like ChatGPT for detailed technical analysis using pattern recognition, statistical analysis, and advanced data analytics.

🔹 Enhanced Decision-Making & CQI: By analyzing iOBE-generated data with AI, educators can gain deeper insights into student performance trends, statistical distributions, and program outcomes. This facilitates better decision-making for Continuous Quality Improvement (CQI) in academic programs.

📌 Important Note: iOBE does not have built-in AI integration. However, its structured output files (Excel + graphs) are compact yet rich in detail and are easily analyzable by AI tools, allowing users to extract more insights beyond what the software presents directly.
5️⃣ User-Friendly Design
While introducing powerful new capabilities, iOBE version 7.0 continues to prioritize ease of use with a streamlined and intuitive interface:
Single-Click Operations – The software features a compact (and the one and only) control panel that users interact with to execute functions with a single click, making data processing effortless. (Shown below is the actual iOBE panel on a full desktop screen, against the background of a self-captured Kinabalu sunrise.)
Simple Input Files – iOBE is designed to work with straightforward input files, ensuring that users can operate the software with minimal setup.

Offline Functionality – iOBE remains an offline tool, providing full data security and privacy and eliminating concerns about exposure to online threats, cloud dependencies, or slow server issues that can impact performance and accessibility.

Comprehensive Documentation – The software is supported by detailed documentation, ensuring that users can easily understand its features, troubleshoot issues, and maximize its capabilities.

📌 Designed for Efficiency: These features make iOBE accessible to all users, from individual educators to institutional decision-makers, without requiring extensive training or technical expertise.
🖥️ Software Availability and Requirements
📌 The Download Links:
  1. iOBE Software Version 7.0
  2. MATLAB Runtime Compiler R2024b (24.2) for 64-bit Windows
  3. Software Manual v7.0
  4. Other Documentation

🖥️ System Requirements:
  1. Built using MATLAB R2024b (64-bit Windows OS).
  2. Requires the MATLAB Runtime R2024b (included in the download links above).
  3. Can also run on Apple computers, but requires a 64-bit Windows environment to be installed first.
🎯 Who Can Benefit from iOBE?
iOBE is designed to support Outcome-Based Education (OBE) assessment at tertiary institutions, but can also be easily adapted for primary and secondary education levels.
✔ Lecturers and teachers can use iOBE for immediate insights into their students’ performance.
✔ Departments and institutions can integrate iOBE into their academic quality assurance processes.
✔ Education researchers can analyze trends in student achievement over time using iOBE’s robust statistical features.
📩 Contact for Training & Support
If your department or institution is interested in implementing iOBE and requires further explanation or training on its effective use for academic programs, please feel free to contact me here.

🔹 Summary of What’s New in iOBE v7.0
✅ New MATLAB interactive figures for refined data visualization & export.
✅ Excel data export for student outcomes, statistics, and performance trends.
✅ Improved front-end processing with detailed progress updates.
✅ More efficient computational algorithms for faster performance.
✅ Full compatibility with older input files (iOBE v6.1+).
✅ AI-assisted analysis compatibility, allowing iOBE data to be analyzed with AI.


Online iOBE Seminar @ UPNM Engineering Faculty

Thursday, March 13, 2025

An online seminar titled "Implementing iOBE for the Preparation of Self-Assessment Reports (SAR)" was conducted on 12/3/2025 for the group of HoDs at the UPNM Engineering Faculty, for them to evaluate the suitability of the iOBE software for their outcomes assessment and CQI processes.

This session introduced and demonstrated, for the first time, the newly upgraded version of the software - iOBE v7.0 - with enhanced visualization graphics using box plots, new color schemes for easier interpretation of the population distribution charts, and extended data computations easily accessible through Excel files.

Many thanks to Assoc. Prof. Dr. Rashdan (Aeronautics HoD) for initiating this session and to Assoc. Prof. Dr. Ku Zarina (Academic Deputy Dean) for organizing and hosting this seminar.

Inshaa Allah, we will follow up soon with a live workshop @UPNM for a more detailed presentation and demonstration of the software.

