My proposed theory of causation for online learning success

After reading Dooley’s (2001) chapter on theories and constructs, I decided to draw up a mock theory based on my own proposed research question. Below is the result:

[Image: my-theory.png – Figures 1 and 2 of the proposed theory]

Steps in making & using theory – adapted from Dooley (2001)

In Figure 1, the straight single-headed arrows assert a causal link. In my example, academic ability and effective learning strategies would each affect online learning success (a causal link), but a learner could have either one or the other to be successful, and not necessarily both. The curved double-headed arrow indicates that no claim is made that academic ability and effective learning strategies are necessarily related. Motivation and online learning success have reciprocal causation: learners would be motivated if they are successful online learners, and their motivation may also be what causes their online learning success. Support provides indirect causation, as it might increase motivation (an intervening variable), which in turn leads to online learning success.

In Figure 2, I have taken one theoretical variable, effective learning strategies, to propose that these can lead to success in online learning. The square boxes indicate the constructs made concrete so that they can be observed. For learning strategies, qualitative data can be gathered by asking learners what learning strategies they used. Quantitative data can be gathered in the form of grades and course completion to indicate online learning success.

Should I use this proposed theory, I can see how pragmatism would be the most suitable paradigm, as I would be able to gather qualitative and quantitative data. Based on the results, I could propose that effective learning strategies be incorporated into online course design as one practical outcome of the theory.


Reference

Dooley, D. (2001). Theory: Tentative explanations. In Social research methods (4th ed., pp. 58–72). New Jersey: Prentice-Hall.

Philosophical assumptions

Following on from the question “why am I doing research?”, the next logical step is to align this to a philosophical belief. Using Creswell’s (2013) excellent chapter on philosophical assumptions and interpretive frameworks, I realised the philosophy I wanted to use, pragmatism, is better suited to research where practical actions are taken during the research. Pragmatism appealed to my sense that “reality is what is useful, is practical, and ‘works’” (Creswell, 2013, p. 37). However, my research question might work better using social constructivism (or interpretivism), as I am trying to find out the realities that are constructed through lived experiences and interactions with others. Pragmatism would be a great paradigm if I wanted to act on the results of the research.

Social constructivism philosophical beliefs:

Ontology: Multiple realities are constructed through our lived experiences and interactions with others.

Epistemology: Reality is co-constructed between the researcher and the researched and shaped by individual experiences.

Axiology: Individual values are honoured, and are negotiated among individuals.

Methodology: More of a literary style of writing used. Use of an inductive method of emergent ideas (through consensus) obtained through methods such as interviewing, observing, and analysis of texts.

Pragmatism philosophical beliefs:

Ontology: Reality is what is useful, is practical, and “works”.

Epistemology: Reality is known through using many tools of research that reflect both deductive (objective) evidence and inductive (subjective) evidence.

Axiology: Values are discussed because of the way that knowledge reflects both the researchers’ and the participants’ views.

Methodology: The research process involves both quantitative and qualitative approaches to data collection and analysis.

(Adapted from Creswell, 2013, pp. 36–37)

After reviewing both paradigms I am still in a quandary but leaning towards pragmatism as I am keen to use both qualitative and quantitative data measures.


Reference

Creswell, J. W. (2013). Qualitative inquiry and research design: Choosing among five approaches. London: Sage Publications.

Why am I doing research?

I have just finished reading Dawn Darlaston-Jones’ excellent article Making connections: the relationship between epistemology and research methods. Not only has Dawn provided me with a clear understanding of why I should be investigating a research question, but also of how this connects with our world view as human beings.

I have struggled to come up with a research question that I was happy with because I had not connected it to my view of reality. As Dawn suggests, by asking why we are investigating a particular issue, we can get to the core of what we believe and how we think the world works. The why will show my view of reality: whether I assume there is a single reality that exists whether or not we can perceive it (realism), or that there are multiple realities that are contextual and socially constructed (relativism). Reading Dawn’s article made me recognise that I am now closer to researching a question that I am truly interested in, as it fits with my view of the world.

Why am I doing the research?

To progress successfully, a real passion for the subject is required: a gap in your understanding of the world that you hope to fill by doing the research. My specialisation in the Masters of Education is Online and Distributed Learning, and this is also my full-time job as an Online Educational Designer. My own interest in this area was sparked by starting this fully online external degree after not having studied for over 10 years. I quickly recognised that it was only my persistence and determination, through a huge effort, that enabled me to succeed, and not the way the course was delivered. I was disappointed by the lack of teaching and engagement from other students. Essentially, I was learning on my own using some curated content.

Even though I have been successful, I only recently recognised that I have been studying very inefficiently, using inadequate learning strategies that meant I spent huge amounts of time and effort without very good learning outcomes. This is a factor I was not aware of that might contribute to the high attrition rate and stress of online study, adding to other factors such as the sense of isolation online students feel and a lack of time and support. I noticed in myself, as well as in others, that my study skills and learning strategies were ill-informed. I did not know how to study independently and effectively. I was spending large amounts of time on study to the detriment of my family and social life. The fact that I did well I cannot attribute to the ‘teaching’ of the online courses but rather to my doggedness. Working in a university myself, I know that there is a greater push for students to study independently and online. The students who choose to study online often do so because they are already in full-time employment and may wish to advance or change their career.

I thus want to focus my research on successful online learning, and before reading Dawn’s article I had been looking for a paradigm with an ontology and methodology that would fit. Instead I should have been going the other way around: understanding why I am doing the research and why I have asked a particular question, which then informs the epistemology (how I will know what I know), which in turn informs the methodology. This is where Jonathan Grix’s article on social science terminology was particularly useful in understanding the relationship between all the elements and that we must start at the top with ontology. For me, my view of reality regarding online learning is the answer to why I am doing this research.

With all this in mind, I now think I am closer to investigating a topic I am truly interested in. From this, and Dawn’s own research question, I now pose a very similar question:

Why are some students successful at online learning while others are not?


References

Darlaston-Jones, D. (2007). Making connections: The relationship between epistemology and research methods. Special Edition Papers, 19(1), 19–27.

Grix, J. (2002). Introducing students to the generic terminology of social research. Politics, 22(3), 175–186.

Reflection: evaluation of content

How does the content of my flexible learning experience contribute to meeting my identified needs?

Let’s recap what my needs were, with my evaluation of each below:

My purpose

🙂 to experience a course that was specifically created to be taught online by a team of experts on a conceptual topic

Yes, this was achieved by choosing the Think101: The science of everyday thinking course. The course was professionally delivered, featuring 22 experts as well as the 2 course instructors. The online teaching method used learning principles that can easily be applied to a less flexible face-to-face course. Indeed, the course provided a sample of how it was delivered in a face-to-face environment.

🙂 to enable evaluation of the course for the Assignment 2 in the Designing for Flexible Learning course

Yes, not only was I able to evaluate the course on its content but also on its design. Both the content and design will enable me to better design courses in the future.

🙂 My skills & knowledge

  • review and (re)design higher education courses
  • good technical skills
  • ability to learn independently (self-directed)

Yes, I was able to use my existing skills and knowledge and apply this to the course. Not only that, my skills and knowledge were tested during the course and I had to adjust my thinking.

🙂 My expectations

🙂 to see if environments can be designed where people learn how to think and learn

Yes, with the caveat that you still need to be self-directed, motivated and willing to put in effort to succeed.

🙂 to experience an online learning environment created by a team of experts

Yes, I was introduced to the edX learning environment, and I understand that the course was created as a very collaborative effort – expert interviews, high-quality video, online activities, navigation, video captions and scripts.

🙂 fully online course that is self-paced

Yes, although it would have been useful if the external links were changed in colour once visited, as I navigated back and forth through episodes and would forget which links I had already been to

🙂 ability to re-use design ideas and concepts

Yes, biggest gain here was the ability to not just use the design ideas but also the concepts of how people think

🙂 no credentials required

Yes

🙂 access course through internet on laptop and mobile devices

Yes, although I only accessed the course using my laptop. Three links to external content no longer worked.

😐 variety of mediums and teaching approaches

Not really; the main teaching method was video interviews, with quizzes and discussion boards as summative and formative assessment. There was limited variety in teaching approaches. Ironically, this was something brought up in a video interview with Richard Feynman, The Pleasure of Finding Things Out. Feynman pointed out that he tried many different teaching methods over the years, as he found that no one approach would suit everyone, so it is best to have many different approaches. This relates strongly to using Universal Design for Learning principles in course design.

🙂 highly rated course by other students

Yes, this was evident in the initial search and selection of the course but also by scanning the comments on the weekly discussion boards.

🙂 not a ‘how-to’ course

Well, some skills are provided in the course that can be applied – such as desirable difficulties and the 6 lead questions that help you to change your and other people’s minds. The main thread of the course is how to apply scientific processes to everyday thinking.

🙂 free

Yes

Desirable Difficulties in Think101

The instructors and designers in Think101 have created an online course that takes advantage of desirable difficulties.

Distributed Practice

Self-paced: The content has been spaced over 12 episodes. Each episode contains the content and activities. Learners can self-pace through the course. It is easy to bookmark a page, but the course also tracks where you are up to and opens at that page.

Structured activities: each episode has a quiz that tests knowledge of that episode and the previous ones. Each week has a discussion forum at the end.

Every week is shot in a different location and context using different experts.

Retrieval Practice

Discussion forums give students an opportunity to link concepts with what they already know and with everyday life. Your understanding of concepts is also tested. Staff provide feedback on discussion questions.

Quizzes test your retrieval of what has been covered in the episode and in previous episodes. The quizzes can be saved and submitted later. They also provide instant feedback once submitted.

Everyday examples are provided to discuss content or ‘test’ content outside of the course (with family, friends etc).

Interleaving

The distribution of learning one subject and then another does not seem to have been used much in this course, apart from the weekly quizzes. The course has a clear structure moving from concepts to scientific methods to application of those methods.

I have been moving through the course using this established structure, although students are able to move freely between episodes. As this is a free, online, self-paced course that has balanced the learning principles with high-production video and expert interviews, perhaps it can be said that students can do their own interleaving.

Certainly the interleaving occurs by interviewing lots of experts who provide different perspectives.

Students do have the opportunity at the end of each episode to ‘Learn more’, ‘Learn even more’ and ‘Learn lots’, which provide links to a variety of mediums for learning more.

[Image: screenshot-think101]

Critical observations

Overall, I think the Think101 learning experience has been great. However, here are a few things that I have observed that would be worth considering when re-designing the course, or when designing a course in general:

  • All content is presented as video interviews – there are very few other mediums except those found at the end of each episode (Learn more etc. – references to relevant work)
  • All interviews and experts are dominated by males – the two teachers and 18 experts are male whilst only 4 women are interviewed (there might be a perfectly good reason for this but it would be worthwhile for the instructors to point this out)
  • Every week is similar – content (videos) > quiz > discussion forum:
    • content could be presented in different ways
    • weekly quizzes are very valuable and should always be used but need thoughtful questions
    • discussion forums are too big (too many entries), so that instead of ‘discussion’ you might just put up a post without engaging with others – I have a problem with their use as they are not really discussions, due to their asynchronous nature.
  • All experts basically identify that learning and changing your mind is hard work, something that you need to do. If you are unwilling to do this no matter how well the course is designed you will not learn or be able to change your mind.

 

Desirable Difficulties

I promised that I would correct my previous retrieval attempts to list Bob Bjork’s desirable difficulties. I have made my bed and had some dinner so I feel I have ‘interleaved’ enough to write this blog post 😉

Retrieval

Bjork and Bjork (2011) call this the generation effect: trying to generate (or retrieve) an answer without looking it up. Testing is seen as far more beneficial for learning than re-studying, as it allows you to apply the retrieval strategy and identify what has or has not been understood. Bjork and Bjork (2011) put it in the following way:

Basically, any time that you, as a learner, look up an answer or have somebody tell or show you something that you could, drawing on current cues and your past knowledge, generate instead, you rob yourself of a powerful learning opportunity (p. 61)

 

Spacing

In addition to my summary of spacing, Bjork and Bjork (2011) add that spacing also improves the transfer of learning as it allows you to build on existing knowledge to enhance new learning.

Interleaving

I struggled with this one as it is closely related to spacing. Bjork and Bjork (2011) state that interleaving is a good way to space different topics or tasks. Interleaving will also improve retrieval and transfer of learning. Interleaving is the opposite of blocking (massed practice) – which is what I said under spacing.

Varying the conditions of practice

I missed the target on this one. What Bjork and Bjork (2011) mean is that if learning always occurs in the same setting, the learning becomes contextualised to that setting. They suggest changing the settings (e.g. study at work, at home, in a different room, socially, online/offline). I had this listed under interleaving.

Designing courses

As I was learning about these strategies, I noted down ways to incorporate these into the courses I help design as well as my own learning:

 ‘Test yourself’ – retrieval strategies

  • summarise from memory
  • debate course material
  • describe examples (on discussion boards, blogs etc)
  • apply what you have learnt to everyday life
  • test understanding using quizzes (student or teacher generated)
  • fill in the blanks (created by another student)
  • flash cards
  • reproduce outline of a chapter
  • students asking each other questions about content
  • others? …

Spacing

  • segmenting content and activities
  • self-pacing
  • time planners (weekly estimates)
  • shorter content and activities
  • weekly quizzes

Interleaving

  • previous topics in new topics
  • separate topics or tasks

Varying the conditions of practice

  • novel ways to present/generate information
  • encourage students to study in different locations
  • opportunities for practice at work, life
  • learning outside the classroom/learning management system

Up next…

Were desirable difficulties used in the Think101 course?


References

Bjork, E. L., & Bjork, R. (2011). Making Things Hard on Yourself, But in a Good Way: Creating Desirable Difficulties to Enhance Learning. In Psychology and the Real World: Essays Illustrating Fundamental Contributions to Society (pp. 56–64). Retrieved from https://teaching.yale-nus.edu.sg/wp-content/uploads/sites/25/2016/02/Making-Things-Hard-on-Yourself-but-in-a-Good-Way-2011.pdf

Reflection on content

I have just finished Episode 5 in my flexible learning experience. Episode 5 was called Learning to Learn, which is not only useful for my own learning but also for how I set up learning for others as part of my job as an educational designer. I learnt a great deal; in actual fact, I learnt that I haven’t been learning in the right way – at least not the right way for long-term memory gain. Block learning, note-taking, highlighting and re-reading apparently do not serve much of a purpose in learning for the long term. Sure, you can do well on an exam by cramming the night before, but a month (or so) after your exam you will have forgotten what you learnt.

Another element of this is that both teachers and students falsely believe that good performance is due to the technique of block learning. In actuality, only short-term memory is measured in the performance, not long-term memory. This equally applies to learning new skills and concepts. In a study, people were shown 6 pictures by different artists. The group that was shown the pictures in a block had more trouble identifying the artist when shown a random new picture. The people who were shown the pictures by different artists in random order had more success identifying who painted the new picture. However, people self-reported that they would learn better if the pictures were shown in a block of the same artist before moving on to the next. This is called the fluency effect, and it relates to our flawed learning strategies such as re-reading or note-taking. By re-reading an article we seem to understand it better because we have already read it, and we mistake this ‘fluency’ for understanding the article and its concepts.

Input less, output more

Bob Bjork describes four learning strategies that will enhance your learning for long-term gain – he calls these desirable difficulties. As part of writing this blog post I am not referring to my notes, re-reading or re-listening to the content. I am trying to use the learning methods described in the course, so I write the following with the caveat that it might not be entirely correct, but this is the way I have understood it. This is one of the desirable difficulties that Bjork describes: retrieval strategy.

Retrieval strategy

This desirable difficulty is created when trying to recall a concept without looking at your notes or re-reading/re-watching the content. At this moment I am trying to recollect what this strategy is and to convey it to you. The process of trying to retrieve the information is vastly more powerful for my learning of the concept (even if I get it wrong) than if I had just looked up the answer. Of course this method is more difficult and requires more effort, but that is exactly what makes it a desirable difficulty.

In a slight segue, I feel that because everyone is connected to the internet, with ready-made answers a Google search away, the skill of retrieval using only your own brain is more crucial than ever if we are going to improve learning – especially learning on our own.

Ways you can increase your retrieval opportunities:

  • Flash cards – write the concept on one side and try to remember what it means without looking at the answer until you have retrieved your version (see the small sketch after this list)
  • Discuss the concept with a friend
  • Ask and answer questions with other students
  • Summarise a reading
  • Teach someone else what you have learnt
  • Provide examples
  • Debate
  • Test yourself
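
As a small aside with my educational-designer hat on, here is a rough sketch of what a do-it-yourself flash-card quiz could look like as a little script. The cards and wording below are my own invented examples for illustration – they are not taken from Think101 or from Bjork and Bjork.

```python
# A tiny self-testing flash-card quiz: retrieval practice at the command line.
# The cards below are my own example content, not taken from the Think101 course.
import random

cards = {
    "desirable difficulty": "A learning condition that feels harder but improves long-term retention.",
    "retrieval practice": "Trying to generate an answer from memory before looking it up.",
    "spacing": "Distributing study over time instead of massing it into one block.",
    "interleaving": "Mixing different topics or tasks within a study session.",
}

def run_quiz() -> None:
    prompts = list(cards)
    random.shuffle(prompts)  # vary the order so recall is not cued by sequence
    score = 0
    for prompt in prompts:
        input(f"\nDefine: {prompt}\n(press Enter once you have attempted an answer from memory)")
        print(f"Reference answer: {cards[prompt]}")
        if input("Did you retrieve it correctly? [y/n] ").strip().lower() == "y":
            score += 1
    print(f"\nRetrieved {score} of {len(prompts)} – revisit the misses after a spaced interval.")

if __name__ == "__main__":
    run_quiz()
```

Running it in a terminal forces you to attempt an answer from memory before the reference answer is revealed, which is the whole point of the exercise.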

Spacing

Spacing is another desirable difficulty that will aid long-term learning. Spacing is the antithesis of block learning. As an example, imagine you wrote an essay the night before it was due, in seven hours. If you had instead spent one hour a day over seven days (or even weeks), you would retain the information longer even though the total time spent is the same.

Interleaving

I struggled with this strategy, as I currently tend to delve into a topic head on and might not come up for air until I have completed all the content. I believe (desperately trying to retrieve 😉) this learning strategy means we need to mix up our activities so that they are not all logically organised like in a textbook. I think content needs to be mixed up, but also that we need to involve ourselves in different activities – maybe mixing up two subjects or learning another skill. Interleaving also refers to mixing up where you learn – the environment. As learning is context sensitive, learning something in a novel environment or way helps us to retain the information better and to transfer it to other domains.

There was a 4th strategy and I am sitting here desperately trying to retrieve what it was. It is so tempting to look at my notes as I wrote this down but I am trying to resist. I think it is integrated practice. I’m going to put this down but will follow up this post with one where I correct my errors as I do not want to mislead.

Integrated practice

Relate the ideas that you have learnt to your own life. Rather than watching or re-reading content, practice what you have learnt.

Note:

This is one of the hard things in online learning: learning by yourself. You do not have others to discuss things with, so we are left with discussions with colleagues, friends and family, who may or may not have the interest. Discussion boards seem hopelessly inadequate, as the asynchronous ‘discussion’ is not really a discussion. Even a blog post is the same. Really, I would have preferred to do this on my own, but I wanted to highlight my struggles with these ‘new’ learning methods. In reality, I could just persist with my note-taking, re-reading and block learning and perform well enough for my course. However, I am REALLY interested in learning for long-term memory and in how to create conditions/designs so that others will learn in this better way too, beyond a course term.

Thus I am trying to output more and input less even if it does lay bare my difficulties.

Reflection on search & selection

And the winner is…

Think101x

How did I locate and select Think101x?

  1. I randomly searched for online courses…
  2. I started a few…mainly how-to courses on Lynda.com
  3. I did a needs analysis on myself…
  4. Searched The 50 best free online courses of all time! by www.class-central.com (an online course search engine)
  5. I looked at a popular choice, Learning How to Learn, but didn’t go ahead as it looked like you had to pay
  6. I previewed Think101x course overview video and read student reviews
  7. I selected Think101x to fulfil my needs:
    • free, self-paced professional course developed by a reputable university created by and with experts
    • a highly rated course as reviewed by former students
    • the course enables me to incorporate what I learn from the design perspective and content in the courses I design – the course is conceptual rather than technical
    • improves my own thinking, learning and understanding
    • direct application in my own life
  8. I started episode 1 of 12 and liked what I learnt
  9. I was motivated to continue by the personable teachers, interesting interviews, and activities that made links from the content that I am learning to ‘real’ life

I have immersed myself quickly and am up to episode 4. There is one interesting point, though, made by Professor Richard Nisbett in episode 3 that made me think that all this reflection on my search and selection was rather pointless, as:

"...we have no real insights into the determinants of our own behaviour..."

So I wonder really if there was any point to my search and selection process and whether my random choice would have been just as good?

Are we in control of our own decisions?

Needs analysis reflection

The course I am currently studying online, Designing for Flexible Learning Environments, asks that students participate in a flexible learning experience and review that experience. Interestingly, this is what I do professionally as an educational designer: I review and (re)design higher education courses. Privately, I also analyse the courses that I take as a student in a Masters of Education to see if they have been designed well. My interest is to see if environments can be designed where people learn how to think and learn.

The search for a flexible learning experience

Ironically, I could have selected the Designing for Flexible Learning Environments course, but I decided not to as I wanted to experience something different: a course that was specifically created to be taught online by a team of experts, rather than one academic plugging away inexpertly in a learning management system.

To find a suitable flexible learning experience, a needs analysis should be done – this time on oneself! Initially I found this counter-intuitive and just randomly looked for courses that might be of interest to me, without doing the needs analysis. This led me to start several courses on Lynda.com, including a course on WordPress. I thought this might be a good course to do as I could use the skills in my job and personal life. As I reflected on my choice and talked to others, it dawned on me that my real interest in learning would not be satisfied: I didn’t want a technical ‘how-to’ course, I wanted a course about thinking. Unwittingly, I had begun the process of a needs analysis.

A needs analysis enables you to identify the purpose of creating a course – in this case, the purpose for doing the course. Here is what I learnt about myself using the Instructional Design Expert’s analysis process, part of the ADDIE model, as a guide (Instructional Design Expert, 2009).

My purpose

  • to experience a course that was specifically created to be taught online by a team of experts on a conceptual topic
  • to enable evaluation of the course for the Assignment 2 in the Designing for Flexible Learning course

My skills & knowledge

  • review and (re)design higher education courses
  • good technical skills
  • ability to learn independently (self-directed)

My expectations

  • to see if environments can be designed where people learn how to think and learn
  • to experience an online learning environment created by a team of experts
  • fully online course that is self-paced
  • ability to re-use design ideas and concepts
  • no credentials required
  • access course through internet on laptop and mobile devices
  • variety of mediums and teaching approaches
  • highly rated course by other students
  • not a ‘how-to’ course
  • free

Stay tuned for my reflection on the course selection…


References

Instructional Design Expert. (2009). Analysis Process. Retrieved from http://www.instructionaldesignexpert.com/analysisProcess.html

My DBR Proposal

Statement of the problem and context

I work as an Online Educational Designer at a higher education institution in Australia. My role is to assist staff in the online delivery of courses. The courses are delivered using the Moodle learning management system, but the university is open to, and invested in, using any technology that will create more engaging and interactive online courses. Beyond knowledge competence, graduate attributes such as communication skills, problem-solving skills, critical and creative thinking, collaboration, digital literacy and life-long learning are also required to be embedded in fully online and blended courses.

A sample course review from a student’s point of view has revealed that many courses are content heavy but show little evidence of online student engagement, interaction or knowledge construction. The online environments are teacher-led, didactic and information dense. Conversion from a traditional face-to-face approach to online has meant the online environment is used more as an add-on – a tool much like a textbook – augmenting the course at most.

Discussions with the head of school and the academic developer confirmed the well documented issues that teachers face in converting their traditional pedagogy to an online environment (Rovai & Jordan, 2004; Torrisi-Steele & Drew, 2013). Current teaching practice can be adapted to online delivery if professional development focuses on academics’ current pedagogy and on support from peers, using networked learning principles.

Research questions

  • What is (online) learning?
  • What are good teaching practices?
  • What is a good online course design?
  • What is a good course review tool?
  • What is missing/hidden from online teaching?
  • How can academics learn from each other’s online teaching practice?

Literature Review

Before we can address how online courses could be designed so that students can learn, we have to understand how people learn and what good instructional methods are. Both students and teachers have to adapt to a new teaching method. In How people learn (Bransford, Brown, & Cocking, 2000), three core learning principles that enhance learning are described: first, teachers must understand students’ pre-existing knowledge and understandings; second, teachers must have in-depth factual knowledge with many examples; and third, metacognitive skills must be taught in all subject areas (Bransford et al., 2000).

The networked learning principles fit well with the three core learning principles. To foster learning in the online environment, networked learning principles are useful because they (1) are learner-centred; (2) facilitate knowledge construction and distribution; and (3) develop connections (Anderson & Dron, 2012; Ehlers, 2013; Siemens, 2005). In networked learning the student is central, as they are able to choose their own learning path based on their prior knowledge and interests. In-depth factual knowledge can be found in various people and tools beyond the teacher, providing many examples and perspectives. Teaching students about connections lays bare the learning process: how to find information, who the people to connect to are, and what tools to use.

Most current higher education teaching does not use networked learning principles, even though the online environment affords teachers the technology and tools to do so. Most traditional education is still focused on teaching factual knowledge in a didactic manner rather than teaching thinking skills using real-world, problem-based approaches (Bransford et al., 2000; Cabrera & Colosi, 2012; Putnam & Borko, 2000). This is a problem with education in general, not just online education. Educational practices in higher education are still teacher-dominated rather than learner-centred, thinking skills are not explicitly taught, and courses are still content driven (Keppell, Suddaby, & Hard, 2015; Kirkwood, 2014; Sharkova, 2014).

Academics use learning management systems to make information, drawing on various perspectives and media, accessible to students. Student interaction is possible, but this does not mean that student interaction, learning or changes in teaching practice are occurring (Dias & Diniz, 2014; Price & Kirkwood, 2013; Sharkova, 2014). Networked learning principles can be used to transition information-output-driven online course environments into environments where the student constructs knowledge via people, tools and connections.

Higher education institutions often focus on the transformational ability of online learning and technologies (Graham, Woodfield, & Harrison, 2013). However, Clark (1994) states that learning outcomes can be achieved regardless of the medium used.

“All methods required for learning can be delivered by a variety of media and media attributes. It is the method which is the ‘active ingredient’ or active independent variable that may or may not be delivered by the medium to influence learning” (Clark, 1994, p. 26)

Clark (1994) suggests that teachers should focus on the instructional methods, regardless of medium, to achieve learning outcomes. Bransford et al. (2000) state that “there is no universal best teaching practice” (p. 22). In higher education the problem is further exacerbated because academics are not necessarily teachers; their time is divided between research and teaching. “Professors know their content; however, more likely than not, they have not been exposed to the pedagogy of teaching and learning” (Bernauer & Tomei, 2015). Technology, time, design and support are further challenges for academics in universities around the world (Garrison & Kanuka, 2004; Moskal, Dziuban, & Hartman, 2013; Wanner & Palmer, 2015).

Students, especially fully online students, expect and require an engaging online learning environment to interact with teachers and peers and receive feedback on their learning (Garrison & Kanuka, 2004). Students “increasingly see a disconnect between the tools they use to learn and the tools they use to live and operate in modern life” (Herrington & Parker, 2013, p. 608).

Designing for learning has often been led by the technology rather than focussing on learning outcomes (Graham et al., 2013; Kirkwood, 2014). Developing the use of educational technology has often emphasised the know-how rather than the why, or the educational goals, of using the technology (Kirkwood, 2014). Some staff have changed their pedagogy, but a new method of teaching such as online learning is not widespread among teaching academics (Bohle Carbonell, Dailey-Herbert, & Gijselaers, 2013; Torrisi-Steele & Drew, 2013). The design process is also iterative, occurring before, during and after teaching (Bennett, Agostinho, & Lockyer, 2016). Good practice reports on technology enhanced learning produced by the Office for Learning & Teaching (Partridge, Ponting, & McCay, 2011) are available, but “evidence of the widespread awareness and uptake of the reports by educators has remained limited” (Keppell et al., 2015).

Numerous intervention strategies and tools are available to assist academics to move to a more learner-centred pedagogy for blended and fully online learning environments. A plethora of resources exist, from the LAMS tool for designing, managing and delivering online collaborative learning activities (LAMS Foundation, 2015) to standards and rubrics for online learning (Online Learning Consortium, 2016; Quality Matters, 2014; University of Western Sydney, 2016) to design templates. Some of these strategies work for some people but, considering the lack of overall improvement in online learning, the learning principles that should underpin the learning environment are not evident.

The tools are too generic, are not contextualised, and do not link staff with others in their field. The Teaching Teachers of the Future project (Romeo, Lloyd, & Downes, 2013) recommended that each discipline should have its own accessible database of resources, support in how to use those resources, and good practice models in the use of information communication technology. Price and Kirkwood (2013) concluded that academics prefer to discuss the use of technology in teaching with their peers and academic developers rather than reading the literature. Bennett et al. (2016) “argue that tools to support teachers’ design work are more likely to be adopted if they first seek to connect with teachers’ existing practices”. The proposed intervention hopes to build on existing practices, use peer review, provide exemplars, identify leaders, leverage good practice and connect staff in a discipline to practically incorporate the recommendations from the literature.

Guidelines to proposed intervention

To change the dynamic between academics and online learning, the intervention strategy needs to embody networked learning principles that could equally be applied when academics design their own online learning environments. The proposed intervention is underpinned by the following networked learning principles:

  • Learner-centred: the intervention needs to be available at the point of need and contextualised
  • Knowledge construction and distribution: the intervention should allow the creation and self-publishing of content
  • Developing connections: self-regulated collaborations, reflection and peer-networking

The proposed intervention will empower staff to self-review their online delivery of a course and compare it with others in their discipline. The review tool will make exemplar elements of teaching design and practice transparent by linking to people and online course environments. The tool will allow staff to leverage good practice and identify leaders by making connections, because it will be created in a cloud-based environment where all responses are visible to those who use the tool.

Description of the proposed intervention

The proposed intervention is the creation of an Online Course Delivery Self-Review Tool for academics that allows critical reflection on their own and others’ teaching practice in an online environment. The self-review tool will allow other teachers to access the self-review data and the associated online course so that they can view and gain ideas from other teachers in their discipline. The tool enables the designable elements of an online learning environment to be reviewed using Carvalho and Goodyear’s (2014) analytic framework: the set design (the look and feel); the epistemic design (the tasks, assessments and learning outcomes); and the social design (interaction and collaboration). The Online Course Delivery Self-Review Tool is itself designed using this framework.

The intended outcomes are that staff are able to build on their existing knowledge, share and reflect on their own teaching practice and consider the examples from their peers. The tool could become part of their design repertoire. Staff could use the tool to discover leaders, exemplars and form groups in interest areas.

Networked learning principles are particularly useful in the design, as the tool is learner-centred: it contains the user’s knowledge and connections with others’ knowledge. Knowledge may reside inside someone else’s course, and/or creators can be contacted for further explanation. Peer learning and co-creation are possible by using the tool. The more academics use the tool to self-reflect on their courses, the more data is available for others to access and be informed by.

The Online Course Delivery Self-Review Tool:

  • is dynamic so that we can learn from the change
  • is collaborative and social – co-created by all those that use it
  • uses technology so that access to information is just-in-time/just-for-me, facilitating the iterative design process
  • is networked, which fosters mentoring, site visits, and learning by seeing
  • allows external feedback & peer review for validating and improving the online learning environment
  • allows you to link and leverage existing knowledge and people by creating a peer network. (Adapted from Flipcurric, 2016)

Plan for implementation

The implementation plan consists of the following stages:

[Image: implementation plan]

Consult – In the first consulting stage, the basic course review that was conducted as part of the peer-review problem analysis process will inform the design of the Online Course Delivery Self-Review Tool. There will be consultation with the academic developer and head of school/discipline, and a pilot group of academics will be identified to finesse the tool.

Create & Distribute – The new cloud-based Online Course Delivery Self-Review Tool will be created by the educational designer using a Google Form. The form and its responses will be accessible to the identified pilot group. A user guide for the form will also be created and distributed. Users can review entries via a graphical representation and qualitative comments, and a link to the Moodle course site is included with each entry.
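
To make the ‘graphical representation’ idea a little more concrete, below is a minimal sketch of how the pilot responses might be summarised once they are exported from the form’s response sheet as a CSV file. The column names (course_code, set_design, epistemic_design, social_design, moodle_url) and the 1–5 rating scale are my own assumptions for illustration, not the actual form fields.

```python
# Hypothetical sketch: summarise exported self-review responses per designable element.
# Column names and the 1-5 rating scale are assumptions, not the real form layout.
import csv
from statistics import mean

ELEMENTS = ["set_design", "epistemic_design", "social_design"]

def summarise(csv_path: str) -> None:
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    print(f"{len(rows)} self-reviews found\n")
    for element in ELEMENTS:
        ratings = [int(row[element]) for row in rows if row.get(element)]
        avg = mean(ratings) if ratings else 0
        # A crude text "bar chart" per designable element.
        print(f"{element:18} avg {avg:.1f}/5  " + "#" * round(avg * 4))
    # Point readers towards courses reviewers rated highly on social design.
    for row in rows:
        if row.get("social_design") and int(row["social_design"]) >= 4:
            print(f"Exemplar for social design: {row['course_code']} -> {row['moodle_url']}")

if __name__ == "__main__":
    summarise("self_review_responses.csv")
```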

Review – The review phase will be conducted after the pilot group have all reviewed their courses. The data will then be analysed, and interviews with the academics will be conducted to identify problems, the effectiveness of use, and ideas on how to promote the tool to a wider audience.

Amend – The form, user guide and promotional resources will be amended or created by the educational designer based on the review phase, including the data from the pilot group so that new users can review responses before doing a self-review or to get ideas.

Distribute – The tool will be distributed to a wider audience using a variety of methods (email, meetings, forums etc.).

Consult – The responses will be reviewed periodically by the educational designer, heads of discipline and academic developers to identify strengths and weaknesses. The data can be used to contact staff and/or provide targeted support using a variety of methods (training, one-on-one meetings, peers etc.). Academics will have the opportunity to contact peers or simply review other people’s courses, perhaps looking for sharable design elements – technologies used, tasks, teaching practices, assessments or even look and feel. Ultimately, interest groups or communities of practice might be created.

Unknowns

  • How to make the tool robust? Is a Google Form the best tool?
  • The responses might have to be cleared out periodically to stop the form from getting unwieldy and to enable courses that have changed to be updated. Does a new form get recreated/designed on a yearly basis?
  • How can data already entered be changed at a later date?
  • Can some elements of the tool be automated? e.g. data flows from the learning management system
  • Can automatic flags be created? e.g. emails/RSS feeds for usage alerts/highly rated elements (see the sketch after this list)
  • How to incorporate the tool into the existing iterative design repertoire of an academic?
  • Other uses? e.g. educational designers could ask academics to use the tool prior to meeting with them
  • How to create interdisciplinary links?
  • How to create accessible connections to other disciplines’ reviews?
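
On the question of automatic flags above, here is a hypothetical sketch of how highly rated self-reviews could trigger an email alert from the exported responses. The SMTP host, addresses, column names and threshold are all placeholders; in practice this might be done with triggers inside the form environment itself rather than a separate script.

```python
# Hypothetical sketch of an automatic flag: email an alert for highly rated self-reviews.
# SMTP settings, addresses and the CSV layout are placeholders, not real infrastructure.
import csv
import smtplib
from email.message import EmailMessage

SMTP_HOST = "smtp.example.edu"                 # placeholder mail server
ALERT_TO = "educational.designer@example.edu"  # placeholder recipient
THRESHOLD = 4                                  # flag ratings of 4 or 5 out of 5

def flag_exemplars(csv_path: str) -> None:
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            rating = row.get("overall_rating")
            if rating and int(rating) >= THRESHOLD:
                msg = EmailMessage()
                msg["Subject"] = f"Exemplar online course flagged: {row['course_code']}"
                msg["From"] = ALERT_TO
                msg["To"] = ALERT_TO
                msg.set_content(
                    f"{row['course_code']} was self-reviewed at {rating}/5.\n"
                    f"Moodle site: {row.get('moodle_url', 'n/a')}"
                )
                with smtplib.SMTP(SMTP_HOST) as smtp:
                    smtp.send_message(msg)

if __name__ == "__main__":
    flag_exemplars("self_review_responses.csv")
```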

The design of the Online Course Delivery Self-Review Tool should be refined with each iteration using the principles of networked learning. The tool could be redesigned at any stage and should be continuously evolving. Ultimately, the academics have to collectively take ownership of it and incorporate it into their course design methodology. If successful, the tool could be rolled out to other disciplines, contextualised and re-designed to suit each discipline. The tool can be considered successful if:

  • there is a positive impact on practice
  • there is widespread awareness
  • disengaged staff become engaged
  • a cultural change from ‘why don’t you’ towards ‘why don’t we’ happens
    (Flipcurric, 2016)

References

Anderson, T., & Dron, J. (2012). Learning technology through three generations of technology enhanced distance education pedagogy. European Journal of Open, Distance and E-Learning. Retrieved from http://www.eurodl.org/?p=archives&year=2012&halfyear=2&article=523

Bennett, S., Agostinho, S., & Lockyer, L. (2016). The process of designing for learning: understanding university teachers’ design work. Educational Technology Research and Development, 1–21. http://doi.org/10.1007/s11423-016-9469-y

Bernauer, J. A., & Tomei, L. A. (2015). Integrating pedagogy and technology: Improving teaching and learning in higher education. Lanham, Maryland: Rowman & Littlefield.

Bohle Carbonell, K., Dailey-Herbert, A., & Gijselaers, W. (2013). Unleashing the creative potential of faculty to create blended learning. Internet and Higher Education, 18, 29–37.

Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How people learn: Brain, mind, experience, and school (Expanded ed.). Washington, D.C.: National Academy Press.

Cabrera, D., & Colosi, L. (2012). Thinking at every desk: Four simple skills to transform your classroom. WW Norton & Company.

Carvalho, L., & Goodyear, P. (2014). The architecture of productive learning networks. New York: Routledge.

Cassano, N. (2016). Nat8117. Retrieved from http://nat8117.weebly.com/

Clark, R. E. (1994). Media will never influence learning. Educational Technology Research and Development, 42(2), 21–29.

Dias, S. B., & Diniz, J. A. (2014). Towards an enhanced learning management system for blended learning in higher education incorporating distinct learners’ profiles. Educational Technology & Society, 17(1), 307–319.

Ehlers, U.-D. (2013). Open learning cultures: A guide to quality, evaluation, and assessment for future learning. Springer Science & Business Media.

Flipcurric (2016) Networked learning & quality-assured peer support as a key implementation support & learning tool. Retrieved from http://flipcurric.edu.au/make-it-happen/peer-support

Garrison, D. R., & Kanuka, H. (2004). Blended learning: Uncovering its transformative potential in higher education. Internet and Higher Education, 7, 95–105.

Graham, C. R., Woodfield, W., & Harrison, J. B. (2013). A framework for institutional adoption and implementation of blended learning in higher education. Internet and Higher Education, 18, 4–14.

Herrington, J., & Parker, J. (2013). Emerging technologies as cognitive tools for authentic learning. British Journal of Educational Technology, 44(4), 607–615.

Keppell, M., Suddaby, G., & Hard, N. (2015). Assuring best practice in technology-enhanced learning environments. Research in Learning Technology, 23(1063519). http://doi.org/10.3402/rlt.v23.25728

Kirkwood, A. (2014). Teaching and learning with technology in higher education: blended and distance education needs “joined-up thinking” rather than technological determinism. Open Learning: The Journal of Open, Distance and E-Learning, 29(3), 206–221. http://doi.org/10.1080/02680513.2015.1009884

LAMS Foundation (2015) LAMS Foundation. Retrieved from https://www.lamsfoundation.org/

Moskal, P., Dziuban, C., & Hartman, J. (2013). Blended learning: A dangerous idea? Internet and Higher Education, 18, 15–23.

Northcote, M., Seddon, J., & Brown, P. (2011). Benchmark yourself: Self-reflecting about online teaching. In G. Williams, P. Statham, N. Brown & B. Cleland (Eds.), Changing demands, changing directions. Proceedings ASCILITE Hobart 2011 (pp. 904-908). Hobart, Australia: Australasian Society for Computers in Learning in Tertiary Education

Online Learning Consortium (2016). Blended Learning Quality Scorecard. Retrieved from http://onlinelearningconsortium.org/consult/quality-scorecard/olc-blended-quality-scorecard/

Partridge, H., Ponting, D., & McCay, M. (2011). Good Practice Report: Blended Learning. Retrieved from http://www.olt.gov.au/resource-blended-learning-2011

Price, L., & Kirkwood, A. (2013). Using technology for teaching and learning in higher education: a critical review of the role of evidence in informing practice. Higher Education Research & Development, 1–16. http://doi.org/10.1080/07294360.2013.841643

Putnam, R., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29(1), 4–15.

Quality Matters (2014). Quality Matters Higher Education Rubric 5th edition. Retrieved from https://www.qualitymatters.org/rubric

RMIT University (2016). Peer Review Dimensions of Teaching. Retrieved from http://www1.rmit.edu.au/browse;ID=nbzgaxw4n9t9.

Romeo, G., Lloyd, M., & Downes, T. (2013). Teaching teachers for the future: How, what, why, and what next? Australian Educational Computing, 27(3), 3–12.

Rovai, A. P., & Jordan, H. M. (2004). Blended Learning and Sense of Community: A comparative analysis with traditional and fully online graduate courses. International Review of Research in Open and Distance Learning, 5(2), 1–13.

Sharkova, N. (2014). Learning supported by technology in higher education: From experience to practice. Education Inquiry, 5(3), 429–444. http://doi.org/10.3402/edui.v5.24610

Siemens, G. (2005). Connectivism: A Learning Theory for the Digital Age. International Journal of Instructional Technology & Distance Learning, 2(1). Retrieved from http://www.itdl.org/journal/jan_05/article01.htm

Torrisi-Steele, G., & Drew, S. (2013). The literature landscape of blended learning in higher education: the need for better understanding of academic blended practice. International Journal for Academic Development, 18(4), 371–383.

University of Western Sydney (2016). Quality in Learning & Teaching Resources. Retrieved from http://www.westernsydney.edu.au/qilt/qilt/resources

Wanner, T., & Palmer, E. (2015). Personalising learning: Exploring student and teacher perceptions about flexible learning and assessment in a flipped university course. Computers & Education, 88, 354–369.

Peer Review

The peer review process was used for problem analysis. The process included an evaluation of current online course delivery for a sample of Bachelor of Education courses and two problem analysis discussions. An initial discussion was held with student peer Natalie Cassano (2016), using blog posts and comments to define the problem. An initial problem statement was identified:

How can an online learning environment be designed that enables students to learn?

Carvalho and Goodyear’s (2014) activity-centred analytic framework provided the lens for a sample online course review, which in turn allowed the peer review discussions to be framed. The framework includes three designable elements of networked learning environments: one, the set design – the physical look and feel of the environment; two, the epistemic design – the tasks, knowledge and processes; and three, the social design – the relationships and roles of teachers and students. Co-creation and co-configuration of the learning environment by participants is paramount to learning using this framework (Carvalho & Goodyear, 2014).

Sample course review: 3 courses at each year level (12 courses)

The sample course review process provided an objective problem analysis from the point of view of a student navigating the site to discover content, assessment and activities to be performed online. The sample review revealed that the courses are information dense with little evidence of online student engagement, interaction or knowledge construction.

Online Course Review form (based on University of Western Sydney’s Basic Standards for Blended and Fully Online Learning Environments)

Problem analysis discussion: 1 hour meeting organised via contextualised email (14 Oct)

Problem analysis discussion with the School of Education Head confirmed the problems identified in the course review and indicated some causes such as staff turnover, difficulty in converting face-to-face teaching to online, use of technologies and workload. Regional students have complained about lack of engagement in online courses and desire face-to-face courses. Solutions suggested were a course audit, education course template and educational technology training for teaching staff.

Problem analysis discussion: 1 hour meeting (17 Oct)

The academic developer suggested that the audit/review itself can be used as an intervention strategy. For the intervention to be accepted by teachers, stakeholders and evidence need to be included in the design – for example, current good practice examples in schools and current exemplary courses/teaching. Rather than designing another template, she suggested trying to address: What is a good course site? What is a good review tool? What is missing/hidden from online teaching?

Peer Review Notes

Response

The peer review problem analysis process steered me to investigate exemplary online courses and course review tools (Northcote, Seddon & Brown, 2011; OLC, 2016; Quality Matters, 2014; RMIT, 2016) and to identify principles of good practice in teaching (Bransford, Brown, & Cocking, 2000; Chickering & Gamson, 1987; Race, 2010). Learning happens through teaching strategies as diverse as each teacher’s pedagogy. To enable staff to design online environments for learning, this diverse range of design and teaching strategies needs emphasis. Teachers need to explicitly think about their current and preferred teaching strategies, the needs of the students, and the resources available to them (Bates, 2015). A (self-)review tool will endeavour to recognise individuals’ design and teaching strategies, map them against principles of good practice, and enable connections to be built with peers and with technologies that enable online learning.

My Design-based Research Proposal Mindmap

https://atlas.mindmup.com/2016/10/f5f5e1409e6b11e6be347bef5e8f3072/design_based_research_proposal/index.html

References

Bates, A. W. (2015). Teaching in the Digital Age. Tony Bates Associates Ltd. Retrieved from https://opentextbc.ca/teachinginadigitalage/

Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How people learn: Brain, mind, experience, and school. (Expanded Edition). Washington, D.C.: National Academy Press.

Carvalho, L., & Goodyear, P. (2014). The architecture of productive learning networks. New York: Routledge.

Cassano, N. (2016). Nat8117. Retrieved from http://nat8117.weebly.com/

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, (3), 7.

Northcote, M., Seddon, J., & Brown, P. (2011). Benchmark yourself: Self-reflecting about online teaching. In G. Williams, P. Statham, N. Brown & B. Cleland (Eds.), Changing demands, changing directions. Proceedings ASCILITE Hobart 2011 (pp. 904-908). Hobart, Australia: Australasian Society for Computers in Learning in Tertiary Education.

Online Learning Consortium (2016). Blended Learning Quality Scorecard. Retrieved from http://onlinelearningconsortium.org/consult/quality-scorecard/olc-blended-quality-scorecard/

Quality Matters (2014) Quality Matters Higher Education Rubric 5th edition. Retrieved from https://www.qualitymatters.org/rubric

Race, P. (2010) Making Learning Happen: a Guide for Post-compulsory Education. (2nd Edition). London: Sage Publications.

RMIT University (2016) Peer Review Dimensions of Teaching. Retrieved from http://www1.rmit.edu.au/browse;ID=nbzgaxw4n9t9.

University of Western Sydney (2016). Quality in Learning & Teaching Resources. Retrieved from http://www.westernsydney.edu.au/qilt/qilt/resources

Talking through the problem

In conversation with Natalie via blog post comments, through my own musings, and by talking to one of my peers at work, I have nutted out the problem I’d like to tackle for my DBR proposal. The following are some of my observations from these conversations, in part taken from my comments on Natalie’s blog post (sorry for the repeat, Natalie).

Good intentions

Everyone has such good intentions in providing students with assistance: building websites, course sites and program sites, or providing links to information on a case-by-case basis, giving students the contextualised answer that they desire – instant gratification for the student! At the University of South Australia, 24/7 academic help has been implemented via a service called YourTutor. The link to YourTutor is at the top of each course page. As I discovered by clicking on it accidentally, they are very persistent, and apparently this has been well received by students, especially because of the 24/7 nature of the assistance.

Another UniSA example is that in the library you can no longer talk to a librarian face-to-face. You have a choice of picking up a phone (and connecting with the library staff, who are just sitting behind a wall) or using the chat service. The chat service is enabled on all library websites and provides answers from library assistants (not librarians, who are the ones qualified to provide reference support) located behind the wall, or from a bot for frequently asked questions (for example, what the library opening times are). The FAQs are somehow input and are spat out as if a real human is chatting. Again, management state that this has been extremely successful. However, I know that when you take away a service such as face-to-face support, students just don’t know what they are missing and won’t ask. No longer are they taken through the process of how to find the information, as was done via the face-to-face service. It is a self-fulfilling prophecy: remove the face-to-face service, and then, when nobody asks for it, claim there is no need for it!

Learning

We want students to become effective lifelong learners who can find information and answers to questions themselves. If we continue to focus on providing students with this ‘instant gratification’, how do we teach students to learn rather than just directing them to information (no matter how well intended)? We are not providing learners with the ability to explore, discover and create for themselves.

We provide students with information on websites and in courses, but we do not provide them with a learning process that enables them to transfer these skills to other areas of learning. How do we teach students the transferable skills so that they can find the information relevant to themselves at their point of need?

Point of view

Natalie is trying to combat this from her point of view as an academic advisor. As I do not really ‘teach’ anything, I have to provide guidance via design. The problem from my point of view is:

How can an online environment be designed that enables students to learn?

 

The next step…listen

Now that I have framed my problem, I need to ask how others see this problem and potential solutions that can be built into the design-based research proposal. I hope to seek feedback from:

  • my manager
  • other educational designers
  • students
  • academics
  • case studies
  • literature
  • other?