For many companies, LMS reporting is a black box of data that provides limited insight. It’s not that there isn’t any data, but accessing or using the data can be very challenging and frustrating.
Data-driven LMS reporting is a fundamental part of assessing the success of your online learning initiative. Without data-driven insights, you have no way of measuring learners' success or improving your online learning programs; it's like flying blind.
5 Basic Ways to Measure Learning Success with LMS Reporting
Usually, companies are blocked by one or more of the following challenges:
- Technology limitations—the LMS is limited in its ability to let you use the data it collects
- Inadequate access to data—especially data about the time people spend learning
- Lack of reporting expertise—even with robust data reporting tools, you need expertise on your team to make the best use of the tools and the data
In fact, a recent survey by Chief Learning Officer and Raytheon found that 50% of learning and development departments lacked the skills to do formal reporting. With these tips, we'll show you how to take the first steps toward measuring your learners' success.
Many people want to be data-driven and want to do more reporting on learning success, but they don’t know where to start. Here are five basic, straightforward LMS reports that you can quickly get started on.
1. Monitoring Learner Progress
This is what you might call the first wave of reporting. It contains very basic information about enrollment and learner activity—including the following data:
- Who is enrolled?
- Who is NOT enrolled?
- What are the start, end, and due dates?
- Who has completed the course, and who hasn't?
While these reports are basic, they serve many purposes, such as tracking enrollment and monitoring progress. They can also help you quickly identify learners who are falling behind.
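If your LMS can export enrollment records, even a few lines of code can surface learners who are falling behind. Here is a minimal sketch in Python; the record format and field names are assumptions for illustration, not any particular LMS's export schema:

```python
from datetime import date

# Hypothetical enrollment records exported from an LMS (field names are assumptions)
records = [
    {"learner": "Avery", "due": date(2024, 3, 1), "complete": True},
    {"learner": "Blake", "due": date(2024, 3, 1), "complete": False},
    {"learner": "Casey", "due": date(2024, 6, 1), "complete": False},
]

def behind_schedule(records, today):
    """Learners whose due date has passed without a completion."""
    return [r["learner"] for r in records
            if not r["complete"] and r["due"] < today]

print(behind_schedule(records, date(2024, 4, 1)))  # ['Blake']
```

The same filter, pointed at a real export, becomes a recurring "at risk" report you can send to managers or instructors.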
2. Course Activity and Engagement
Course activity and engagement reports provide the second wave of reporting. You'll start to get more insight into each learner's activity and engagement. These reports help you see trends and correlations among different data sets, and compare results across different activities.
Your reports can include a wide range of information about activity and engagement level. For example:
- Number of logins, course visits, or resource views (URLs, videos, quizzes, etc.). Notice which resources were used and not used. Are some resources more useful? Are some resources more difficult to find?
- Forum activity. How many posts, replies, and views did each learner have? Use this information to infer topics that were most interesting or challenging.
- Activity submissions
- Completion rates
- Time spent. Notice how much time was spent on each of the resources, sessions, and activities. This can be a very helpful indicator of interest, value, and perceived importance.
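Most of the metrics above reduce to counting events per learner. Assuming your LMS can export a raw activity log (the event names below are hypothetical), a tally like this is a quick starting point:

```python
from collections import Counter

# Hypothetical activity log exported from an LMS: (learner, event type) pairs
events = [
    ("avery", "login"), ("avery", "video_view"), ("avery", "forum_post"),
    ("blake", "login"), ("avery", "login"), ("blake", "quiz_attempt"),
]

# Count each (learner, event type) pair to compare engagement levels
counts = Counter(events)
print(counts[("avery", "login")])  # 2
```

From here you can pivot the counts by resource instead of learner to see which materials go unused.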
3. Time Spent Learning
Time Spent Learning reports focus mostly on the time that learners spend on activities and courses. You’ll want to look for correlations and trends between their time spent learning and course grades.
This can be helpful for instructional design purposes. As you analyze the data, ask questions such as these:
- Does the course or activity take as long to complete as intended?
- Are topics difficult to grasp?
- Are learners actually spending time on learning?
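One concrete way to explore the time/grade relationship is a simple correlation. This sketch computes a Pearson correlation by hand on made-up sample data (the numbers are illustrative, not from any real course):

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Illustrative data: minutes spent in the course vs. final grade
minutes = [30, 45, 60, 90, 120]
grades  = [55, 60, 70, 85, 90]
print(round(pearson(minutes, grades), 2))  # 0.98
```

A strong positive value suggests time on task is paying off; a weak or negative one is a prompt to ask the design questions listed above, not a verdict by itself.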
4. Analyze Learning Effectiveness
This report is the third wave of reporting and gets at Level 2 of the Kirkpatrick model—providing the data to determine if the courses are actually effective. With this report, you’re analyzing the learning itself: did students actually learn what they were supposed to learn, and to what degree?
You can use data from feedback analysis and pre-/post-assessments. For example:
- Feedback, surveys, questionnaires
- Quiz statistics
- Quiz questions that stand out (e.g., a high percentage of learners get a specific question wrong)
As you evaluate the data, look for trends across courses, sections, and groups of learners (age, sex, occupation, etc.). Spikes or trends like these can reveal poorly worded questions or unintended built-in biases.
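Flagging questions that stand out is straightforward once you have per-question results. This sketch assumes a simple export format (question IDs mapped to correct/incorrect flags, which is an illustration rather than a real LMS schema) and flags any question that more than half of learners got wrong:

```python
# Hypothetical quiz results: question id -> correct (True) / incorrect (False) per learner
answers = {
    "Q1": [True, True, False, True, True],
    "Q2": [False, False, True, False, False],
    "Q3": [True, True, True, False, True],
}

def flag_hard_questions(answers, threshold=0.5):
    """Questions whose wrong-answer rate exceeds the threshold."""
    return sorted(q for q, marks in answers.items()
                  if marks.count(False) / len(marks) > threshold)

print(flag_hard_questions(answers))  # ['Q2']
```

A flagged question might indicate a genuinely hard concept, a poorly worded item, or content the course never actually taught; the report tells you where to look, and reviewing the item tells you which.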
5. Visualizing Compliance Data
Not everyone will benefit from this report, but it can be very helpful in organizations that need to audit compliance. If you provide compliance training, you can create automated reports that keep your compliance officers fully informed of your company’s status. You can also automate reminders to employees when their certifications are about to expire.
Your compliance reports can include the following data:
- Completion / attendance information
- Time spent on courses
- Certificates issued or expired
- Audit trail reporting—have the learners actually gone through all of the required components of the course?
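The expiry-reminder piece is easy to prototype. Assuming a certification export with employee names and expiry dates (field names here are hypothetical), this sketch lists certifications expiring within a reminder window:

```python
from datetime import date, timedelta

# Hypothetical certification records exported from an LMS
certs = [
    {"employee": "Avery", "cert": "Fire Safety", "expires": date(2024, 4, 10)},
    {"employee": "Blake", "cert": "First Aid", "expires": date(2024, 9, 1)},
]

def expiring_soon(certs, today, window_days=30):
    """Certifications that expire within the next `window_days` days."""
    cutoff = today + timedelta(days=window_days)
    return [c for c in certs if today <= c["expires"] <= cutoff]

print([c["employee"] for c in expiring_soon(certs, date(2024, 3, 20))])  # ['Avery']
```

Run on a schedule, the same filter can feed both the automated employee reminders and the summary your compliance officers see.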
Meaningful Reports You Can Actually Use
Reporting the data and reporting it meaningfully are two different things. Learning reports can be arduous to decipher if you don't have a reporting solution that makes them easy to read.
Zoola Analytics is cloud-based reporting software that transforms your LMS data into meaningful learning and training insights in minutes. Zoola makes it faster to perform analyses and build reports: you can dig deeper into learner data, generate insights into your learning programs, and create customized reports quickly, eliminating hours of effort spent reporting on eLearning usage and outcomes. Zoola even lets you create stunning visual reports and dashboards, making your data easy to understand at a glance.
Discover for yourself how Zoola can make your data-driven reporting easy and manageable—watch our on-demand Zoola Analytics webinar.
About Stewart Rogers