
How to Use a Learning Management System to Solve the I-Didn’t-Get-the-Memo Problem

February 5, 2013

Richard Nantel, Vice President, Enterprise Learning Solutions, Blatant Media | Absorb LMS

Somewhere among the hundreds or perhaps thousands of e-mails in your inbox, or in the skyscraper-high piles of paper on the corner of your desk, is something really important. It might be an urgent policy or procedural change. If you work in emergency response, healthcare, security, transportation, or any of many other industries, people’s safety may be compromised if you don’t receive and act on this information.

Photo: URGENT sign, some rights reserved by RambergMediaImages/Flickr

E-mail has been, and continues to be, the primary communication medium in the workplace. Consequently, many organizations turn to e-mail for such updates. They’ll send out an e-mail blast with the words ‘IMPORTANT’ or ‘URGENT’ in the subject line to the people who need to be informed. Perhaps the sender will check the little boxes in their mail client to request delivery and read receipts, but if you’ve sent the notice to more than just a few recipients, manually tracking who has opened the e-mail is a headache.

A learning management system (LMS) can provide an effective way to monitor who has viewed such important information and who may need a follow-up telephone call. Here’s how:

1. Create a ‘course’ that contains the critical information embedded as a PDF document. PDF is a good format for something like this because it displays on many different devices: iPads and other tablets, phones, and so on.


2. Create the e-mail that will be sent when people are enrolled in the course as well as the reminder e-mail. Make sure these e-mails communicate the urgency of this information. If you can set the frequency of the reminder e-mails, don’t be shy about nagging the individual daily.


3. Enroll the individuals who should get this important update into this ‘course.’ The system will send out the enrolment e-mail and, depending on your LMS, a message to the learner’s LMS dashboard.


Learning management systems typically provide feedback to learners on their progress. For the learner, this is like crossing an item off their TO-DO list, which provides happy feelings of accomplishment.


4. Track who has accessed the document using your LMS’s reporting features. In the example below, British singer/songwriter Laura Marling and ex-Led Zeppelin guitarist Jimmy Page have accessed the important update. The others have not.


The purpose of tracking who has accessed such important information isn’t to lay blame. There are many valid reasons why people may not have read important updates: spam filters, being off the grid due to travel or meetings, illness, and so on. The real purpose of tracking access is to identify who may require a secondary attempt through means other than e-mail, perhaps a phone call.
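To make step 4 concrete, here’s a minimal Python sketch of turning an LMS completion report into a follow-up call list. The report structure, field names, and sample data are purely illustrative; real LMS reporting APIs vary.

```python
# Hypothetical sketch: turn an LMS completion report into a follow-up list.
# The field names and sample data are illustrative, not from any real LMS.

def needs_follow_up(report):
    """Return enrollees who have not yet opened the important update."""
    return [row["name"] for row in report if row["status"] == "not started"]

report = [
    {"name": "Laura Marling", "status": "completed"},
    {"name": "Jimmy Page", "status": "completed"},
    {"name": "Some Learner", "status": "not started"},
]

# These are the people who get the secondary attempt (e.g. a phone call)
print(needs_follow_up(report))
```

The point isn’t the code itself, of course; it’s that an LMS gives you this report out of the box instead of leaving you to reconcile read receipts by hand.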

“Can’t We Just Manage Learning Using Email and a Survey Tool?”

March 20, 2012

Hammer and screw, photo by justinbaeder

My home is filled—evidently—with a visual history of using the wrong tool for the job. Unable to find nails, I’ve hung artwork with screws hammered into walls. When a frame has crashed to the ground, I’ve filled the hole with toothpaste for lack of drywall filler. Invariably, these workarounds have eventually caused greater problems and expense. I now believe in using the right tool for the job.

I recently spoke to someone who works in a highly regulated industry who mentioned that his organization is weighing the pros and cons of different learning management approaches. One of the options they are considering is to:

  • E-mail their 1250 learners a link to an online course located on a server
  • Have learners complete an assessment created using a popular, low-cost survey tool such as SurveyMonkey or Zoomerang
  • Issue a certificate to qualified learners

Are e-mail and online surveys the right tools for this job?

Online survey tools have become easy to use, powerful, and very affordable. They’re a fantastic way to gather feedback, and are commonly used for market research. This approach would be an easy way to give learners access to a course and gather learner-reaction data, commonly referred to as Kirkpatrick Level 1 evaluations. The challenge, though, lies in tracking progress and assessing whether learners qualify to receive a certificate.

Survey tools are not assessment tools. Here’s what SurveyMonkey has to say about using their technology for assessments:

“At this time, we do not have a question type that can be automatically scored or graded. But you can manually score all responses.”

Zoomerang similarly states:

“Zoomerang does not automatically grade quizzes … You (can) login to Zoomerang at any time to review and grade quizzes.”

Unless you’re prepared to manually grade 1250 ‘tests,’ popular online survey tools are a poor choice for assessing your learners’ knowledge and establishing whether they qualify to be certified.

Since survey tools fail to provide assessment for this learning initiative, you might instead decide to create tests using a popular authoring tool such as Adobe Captivate and embed those tests in your course. You can even configure Captivate to display the test results to the learner. The problem with this approach is getting reports for all learners in an easy, convenient manner. Popular authoring tools assume that tracking information will be passed to a learning management system, which will, in turn, handle reporting.
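For a sense of what “manually score all responses” means in practice: if the survey tool can at least export responses as a CSV, you’d end up writing and maintaining your own scoring script. A Python sketch might look like this (the answer key, pass mark, and export format are all illustrative assumptions, not features of any particular tool):

```python
import csv
import io

# Hypothetical sketch: scoring raw survey responses yourself, assuming the
# survey tool can export answers as a CSV with one row per learner.
ANSWER_KEY = {"q1": "B", "q2": "D", "q3": "A"}   # illustrative answer key
PASS_MARK = 2                                    # illustrative pass threshold

def score(row):
    """Count how many answers in this learner's row match the key."""
    return sum(1 for q, correct in ANSWER_KEY.items() if row[q] == correct)

EXPORT = (
    "learner,q1,q2,q3\n"
    "alice@example.com,B,D,A\n"
    "bob@example.com,B,C,A\n"
)

for row in csv.DictReader(io.StringIO(EXPORT)):
    verdict = "pass" if score(row) >= PASS_MARK else "fail"
    print(row["learner"], score(row), verdict)
```

Now imagine maintaining that script, the export step, and the certificate step for 1250 learners in a regulated industry; that’s exactly the bookkeeping an LMS is built to do for you.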

How to evaluate the right approach

In evaluating which learning management approach is best, it’s a good idea to ignore the technology and write out, in plain language, an efficient way to manage the initiative. These descriptions, called use cases, are immensely useful in identifying technological requirements and, consequently, in evaluating potential solutions.

For the initiative above, here’s a use case that might make sense:

  • Send learners an e-mail informing them that they are enrolled in a course
  • Send reminder e-mails, every seven days, to learners who haven’t started the course
  • Provide administrators/instructors/managers with reports showing:
    • Who has completed the course
    • Who is currently undertaking the course
    • Who has yet to access the course
    • How each learner scored in the final exam and intermediate knowledge checks
    • How much time the learner spent in each section of the course
  • Provide instructional designers with reports showing whether certain test questions were more likely to be answered incorrectly, which might indicate a problem with the course content or the wording of the questions.
  • Send a congratulatory completion e-mail and certificate to qualified learners
  • Gather Kirkpatrick Level 1 reaction data from learners. (This data will help instructional designers improve the course over time.)
  • Send a post-certification “two-minute review” to learners two weeks after they’ve completed the initial course, and perhaps a “one-minute review” one month later
  • Provide learners with a way to view their learner transcript and print their certificate
  • Provide learners with ongoing access to the course so that they can review the content any time and reference the content for performance support
  • Automatically enroll learners in related learning events if the course in question is part of a learning plan
  • Track the date certification expires for each learner and e-mail recertification notices to learners prior to the expiry of their certification
  • In the event of an audit, provide auditors with a centralized repository of learner data
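Several of the bullets above are, at bottom, date arithmetic that an LMS automates. As one illustration, the seven-day reminder rule might reduce to something like this Python sketch (the function and field names are invented for the example):

```python
from datetime import date, timedelta

# Hypothetical sketch of one use-case rule above: remind learners who
# haven't started the course, every seven days after enrolment.
REMINDER_INTERVAL = timedelta(days=7)

def reminder_due(enrolled_on, last_reminder, started, today):
    """Decide whether a learner should receive a reminder e-mail today."""
    if started:
        return False
    anchor = last_reminder or enrolled_on
    return today - anchor >= REMINDER_INTERVAL

today = date(2012, 3, 20)
print(reminder_due(date(2012, 3, 1), date(2012, 3, 8), False, today))   # True
print(reminder_due(date(2012, 3, 1), date(2012, 3, 16), False, today))  # False
```

One rule is trivial; the use case above has a dozen of them, plus reporting, certificates, expiry tracking, and audit trails, and that’s the case for a purpose-built system.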

You don’t need a learning management system for every learning initiative. But when you’re dealing with 1250 learners in a regulated industry and need to assess their understanding of the content, track their progress, and issue certificates, a learning management system beats pen and paper, spreadsheets, and e-mail-plus-surveys for ease of use, learner experience, administrative effort, and return on investment.