This is the second (slow!) of four pieces reflecting on the experiences of Emilio, a subject matter expert who was tasked with converting his successful F2F training into an elearning offering. Emilio has let me interview him during the process.
This piece focuses on the thorny issue of learning objectives at the front end of an elearning project and assessment at the other end. You can find the context in part 1 here. (Disclaimer: I was an adviser to the project, and my condition of participation was being able to do this series of blog posts, because there is really useful knowledge to share, both within the colleague’s organization and more widely. So I said I’d write the blog reflections – without pay – as long as I could share them.)
Nancy: Looking back, let’s talk about learning objectives. You started with all of your F2F material, then had to hone it down for online. You received feedback from the implementation team along the way. What lessons came out of that process? How do we make content more precise when you have fewer options to assess learner needs and interests “in the moment,” as you do face to face, and when online attention spans are limited?
Emilio: I realize now that I had not thought about being disciplined with learning objectives. I had created them with care when I first developed my F2F offering. Once I had tested the course several times, though, I realized I had forgotten my own initial learning objectives, because in a F2F setting I adapted to students’ interests and knowledge gaps on the spot, and I was also able to clarify any doubts about the content. Over time, those learning objectives became malleable depending on the group of students, and thus lost presence in my mind.
This became apparent as I was creating the quizzes for the online work and got comments back from Cheryl (the lead consultant). She noted which of my quiz questions were and were NOT grounded in the learning objectives and content. I realized I was asking a bunch of questions that were not crucial to the learning objectives.
With that feedback, I narrowed down to the questions most important for achieving and measuring the learning objectives. It was an aha moment. This is something that is not necessarily obvious or easy. You have to put your mind to it, especially when you are developing an e-learning course. It applies to the F2F context as well, but in an e-learning setup you are forced to be more careful because you cannot clarify things on the spot; there is less opportunity for that online. That was very critical. (Note: most of the course was asynchronous. There were weekly “office hours” where clarifications happened. Learners who participated in the office hours also had higher completion rates.)
It was clear I had to simplify the content for the elearning setup – and that was super useful. While my F2F materials were expansive so that I could adapt to the local context, online that became overload.
Nancy: What was your impression of the learners’ experiences?
Emilio: It was hard to really tell, because online we were dealing with a whole different context. Your indicators change drastically. When I’m in F2F I can probe and sense whether the learners are understanding the material. It is harder online to get the interim feedback and know how people are doing. For the final assessment, we relied on a final exam with an essay question. The exam was very helpful in assessing the learners’ experience, but since it is taken at the end of the course, there are no corrective measures one can take.
Nancy: Yes, I remember talking about that as we reviewed pageviews and the unit quizzes during the course. The data gives you some insight, but it isn’t always clear how to interpret it. I was glad you were able to get some feedback from the learners during your open “office hours.”
We used the learning objectives as the basis for some learner assessment (non-graded quizzes for each unit and a graded final exam that drew from the quizzes). How did the results compare with your expectations of the learners’ acquisition of knowledge and insights? How well did we hit the objectives?
Emilio: We had 17 registered learners and 7 completed. That may sound disappointing. Before we started, I asked you about participation rates and you warned me they might be low – which is why I am not crying. The 7 who completed scored really well on the final exam and you could see their engagement. They went through the material, did the quizzes, and participated in the office hours. One guy got 100% on all of the quizzes, and then 97% on the exam.
We had 8 people take the final exam. One learner failed to reach the required 70% benchmark, but digging deeper into it, Terri (one of our consultants) discovered that the way Moodle was grading the multiple-choice answers was not configured precisely: it was giving full credit for partially correct answers. We need to fix that. Still, only one learner failed to reach the 70% benchmark, even with the error.
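(Note: purely as an illustration – this is a hypothetical Python sketch, not Moodle’s actual grading logic – the difference between all-or-nothing and partial-credit scoring of a multiple-answer question looks roughly like this. If a question is configured for partial credit when it was meant to be all-or-nothing, partially correct responses end up counting toward a pass.)

```python
# Hypothetical sketch of two scoring rules for a multiple-answer question.
# Function names and data are illustrative only, not Moodle's implementation.

def score_all_or_nothing(correct, selected):
    """Full credit only when the selected options exactly match the answer key."""
    return 1.0 if set(selected) == set(correct) else 0.0


def score_partial_credit(correct, selected):
    """Fraction of correct options chosen, minus a penalty for wrong picks."""
    correct, selected = set(correct), set(selected)
    if not correct:
        return 0.0
    hits = len(selected & correct) / len(correct)
    wrong = len(selected - correct) / len(correct)
    return max(0.0, hits - wrong)


# A learner who picks two of the three correct options and nothing wrong:
key = {"A", "B", "C"}
answer = {"A", "B"}
print(score_all_or_nothing(key, answer))   # 0.0  -> marked incorrect
print(score_partial_credit(key, answer))   # ~0.67 -> partial credit counts toward the total
```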
The essay we included in the exam had really good responses. It achieved my objective of getting an in-depth look at the context the learners were coming from. Most of them described an institutional context. Then they noted what they thought was most promising from all the modules – what was most applicable or relevant to their work. The answers were very diverse, but I saw some trends that were useful. However, it would have been useful to know more of this before and during the course.
Nancy: How difficult was it to grade the essays? This is something people often wonder about online…
Emilio: I did not find it complicated, although there is always some degree of subjectivity. The basic criteria I used were their focus on the question asked, and their application of the principles taught during the course that relate to the context described in the question.
Nancy: One of the tricky things online is meaningful learner participation. How did the assessment reflect participation in the course?
Emilio: We decided not to give credit for participation in activities because we were not fully confident of how appropriately we had designed such activities for an e-learning environment in this first beta test. I think this decision was the right one.
First, I feel that I did not do a good job of creating an atmosphere, that sense of community, that would encourage participation. Even though I responded to every single comment that got posted, I don’t really feel that people responded that much in some of the exercises. So I would have been penalizing students for something that is not their fault.
Second, we had one learner who did every exercise but did not comment on any of the posts. He is a very good student and I would have penalized him if completion relied on participation. Another learner who failed did participate, went to the office hours and still did not pass the final exam.
We failed miserably with the group exercise for the second module. I now realize the group exercise requires a lot of work to build the community beforehand. I sense this is an art. You told me that it is completely doable in the elearning atmosphere, but after going through the experience I really feel challenged to make it work. Not only with respect to time, but how do you create that sense of community? I feel I don’t have a guaranteed method for it to work. It is an art to charm people in. I may or may not have it!
Nancy: The challenges of being very clear about what content you want to share with learners, how you share it, and how you assess it should not be underestimated. So often people think it is easy: here is the content! Learning design in general is far more than content, and learning design online can be trickier because of your distance from your learners – not just geographic distance, but the social distance, where there is less time and space for the very important relational aspects of learning.
Up Next: Facilitating Online
The distance between learning objectives and assessment as two poles within elearning is a process that starts from a bad premise, because what is most interesting about elearning is precisely its capacity to implement not just the final processes and evaluations, but the LEARNING PROCESSES themselves and their continuous assessment. And yet Nancy seems to believe that the kind of elearning that is still done today is better – a kind that, had things continued that way, would never have allowed elearning to take off.
Although they present evidence from the experience, it is gathered in the old-fashioned way, expressing things like the following: “It is harder online to get the interim feedback and know how people are doing. For the final assessment, we relied on a final exam with an essay question. The exam was very helpful in assessing the learners’ experience, but since it is taken at the end of the course, there are no corrective measures one can take.” …. Juan Domingo Farnos
“I sense this is an art. You told me that it is completely doable in the elearning atmosphere, but after going through the experience I really feel challenged to make it work. Not only with respect to time, but how do you create that sense of community? I feel I don’t have a guaranteed method for it to work. It is an art to charm people in. I may or may not have it!” Nancy White
Hi Juan Domingo. Your reply came through in Spanish, so I have been working from a Google Translate version of it – which means I may or may not be fully understanding! 🙂
If I understand correctly, your concern is the art of engaging people in online learning? I think there are a couple of pieces to this – and one that was challenging in this instance is the pedagogical approach of delivering CONTENT online. It is so easy to become a passive consumer of content. And yet it is a bigger investment of time and attention to be a creator of content, and that pushes us, as learners, to deeper engagement. On top of that, I think – just like in the classroom – some teachers have a better sense of how to RELATE to their learners (vs. just talking at them). This is not unique to online, but our narrowed means of communication may make it more challenging.
Does that make any sense Juan Domingo?