I am still exploring ways to better study the impact of technology on learning, such as in online and blended learning. A few weeks ago I presented some of my ideas and queries through a CIDER presentation. In that session I shared that I was not completely satisfied with my doctoral study methodology, nor with much of what I have seen in the literature. We need a stronger mix of qualitative and quantitative methods that can capture the impact of learning with technology, such as the study of:
- human-computer interaction
- the brain (cognitive learning)
- the senses (visual, design)
- system and social dynamics
EDUCAUSE has recently created an initiative to find ways to explore the ‘evidence’ that would reveal the impact of technology, as they, too, are unsatisfied with current research methods. They state:
“With many options and constrained budgets, faculty and administrators must make careful decisions about what practices to adopt and about where to invest their time, effort, and fiscal resources. As critical as these decisions are, the information available about the impact of these innovations is often scarce, uneven, or both. What evidence do we have that these changes and innovation are having the impact we hope for?”
Some of their preliminary readings/literature focus on established learning theories and the use of technology. I think this is a good start, and we must not throw out the baby with the bathwater. Regardless of our intent to claim something new and innovative, we need an established framework from which to work. People are people, and they behave and learn in certain ways, albeit shaped and affected by an evolving environment and technology.
Back to my notion of getting more technical with research data. A new discussion is emerging about creating systems that collect and interpret data on the resources used and activities performed by learners/users, whether informally or formally. The point is to mine data on behalf of users, who face an overwhelming amount of information and resources on the net. Even the most information-literate person struggles to bring together, peruse, and decipher it all. I can hardly keep up!
Creating a data collection and interpretation system would hone information and provide recommendations (e.g., what to read or explore next based on past activities). The group at Athabasca/TEKRI with George Siemens (sense-making and learning analytics) and the group at NRCC with Stephen Downes (Web X) are exploring this concept.
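To make the recommender idea concrete, here is a minimal sketch of one possible approach, not the method any of these groups actually uses: suggest resources to a learner based on overlap between their activity log and other learners' logs. All resource names and data below are hypothetical.

```python
from collections import Counter

def recommend(user_history, all_histories, top_n=3):
    """Suggest resources the learner has not yet used, ranked by how
    often they appear in the logs of learners with similar activity."""
    seen = set(user_history)
    scores = Counter()
    for other in all_histories:
        other = set(other)
        overlap = len(seen & other)
        if overlap == 0:
            continue  # no shared activity, skip this learner
        for resource in other - seen:
            scores[resource] += overlap  # weight by shared activity
    return [resource for resource, _ in scores.most_common(top_n)]

# Hypothetical activity logs: lists of resource identifiers per learner.
histories = [
    ["intro-video", "quiz-1", "forum-thread-a"],
    ["intro-video", "reading-2", "forum-thread-a"],
    ["quiz-1", "reading-2", "simulation-x"],
]
print(recommend(["intro-video", "quiz-1"], histories))
# → ['forum-thread-a', 'reading-2', 'simulation-x']
```

A real learning-analytics system would of course work at a much larger scale and with richer signals (time spent, sequence, social ties), but even this toy version shows how raw activity data can be turned into a "what to explore next" suggestion.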
I am excited about the possibility of using honed and interpreted data to study the processes and needs of learners who use technology. Learners are no longer in a bubble; they have extended personal learning environments (PLEs), social networks, and vast amounts of web-based resources. How are they using these artefacts and systems? How can we help them use these tools to learn? How does their learning develop through them, or fail to develop? Such questions take on new meaning compared to earlier eras because of advances in information, communication, and networked technologies.
I hope to attend the learning analytics conference to learn more about this research method. Until then, I’ll keep exploring potential methodologies to study the impact of technology on learning.