AcAdv, AdvTech, Higher Education

Today’s #AcAdv Chat Topic: Data Analytics in Academic Advising #highered

A couple of weeks ago, I was fortunate to join the Open SUNY COTE Summit 2017. I will be sure to share more about the #COTEsummit learning in the coming weeks; however, the last session helped me frame TODAY’s (3/21) #AcAdv Chat, which I’ll be moderating from 12-1 pm CT: Data Analytics in #AcAdv

During the #COTEsummit Learning Analytics panel hosted by OLC, we dug into what information we know about our learners and how we use it to understand them better. Many academic advising units/divisions often jump straight to the platform or process for analyzing students to predict learner behavior.


But before advising leaders in higher ed jump on the big data bandwagon or decide to implement a technology platform to collect data, I think our support units need to identify what information and data we actually need to support our learners effectively. Let’s make decisions based on the data that is most helpful, instead of letting predictive analytics make decisions for us at our institutions. What often gets lost in this conversation and planning is the learning itself: learning, and learning analytics.

Learning analytics are about learning (Gašević, Dawson, & Siemens, 2015). We sometimes forget this when the word “data” is tossed around at the “strategic-planning-task-force-retention-student-success-operation” meetings at our universities and colleges. Sure, learning analytics might be most relevant for instructors and faculty; however, learning data is also critical for those who support instructional design, scaffold student success, and provide academic advising/support in higher education.

Image c/o Giulia Forsythe

In thinking about academic advising and learner support, I have SO many questions about data and data analytics for this #AcAdv Chat topic… here are just a few:

  • How does your institution collect, store, and share data campus-wide?
  • What do you do as a staff or faculty member to interpret the data?
  • Are you able to interpret, read, and translate the information provided about your learners?
  • Are there real-time notifications where students, staff, and faculty can interpret academic progress? What does this look like at your campus?
  • Do your data sets on campus talk to one another? Is there much interaction between your student information system, learning management system, institutional portal, or institutional research data? Why or why not? (See the sketch after this list.)
  • What challenges and/or issues have you thought about for how data is collected and/or reviewed for learner support?
  • Who or what office can you reach out to on campus for “data analysis,” or to help dig further into your learner data and interpret it to support the work you do?
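
To make the question about campus data sets talking to one another a bit more concrete, here is a minimal sketch of joining a student information system (SIS) extract with a learning management system (LMS) activity extract to surface students an advisor might check in with. The file names, column names, and thresholds (sis_enrollments.csv, lms_activity.csv, a midterm grade below 70, fewer than 2 recent logins) are all hypothetical assumptions for illustration, not a real campus schema or a recommended rule; the point is that the merged view supports an advisor’s judgment rather than replacing it.

```python
# Hypothetical sketch: merging SIS and LMS extracts so an advisor can review
# the combined picture. All file names, columns, and cutoffs are invented.
import pandas as pd

# SIS extract: one row per student per course enrollment
sis = pd.read_csv("sis_enrollments.csv")   # columns: student_id, course_id, midterm_grade
# LMS extract: recent activity per student per course
lms = pd.read_csv("lms_activity.csv")      # columns: student_id, course_id, logins_last_14d

# The "data sets talking to one another" step: merge on shared keys
merged = sis.merge(lms, on=["student_id", "course_id"], how="left")
merged["logins_last_14d"] = merged["logins_last_14d"].fillna(0)

# Flag for a human conversation, not an automated decision:
# low midterm grade AND low recent LMS activity
flagged = merged[(merged["midterm_grade"] < 70) & (merged["logins_last_14d"] < 2)]
print(flagged[["student_id", "course_id"]].drop_duplicates())
```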

What thoughts or questions do you have about this issue, higher ed? Won’t you join us for today’s #AcAdv chat conversation?

TWEETS from the #AcAdv Chat conversation on 03.21.17

Reference:

Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64-71.

dalmooc

Do You Want to Learn About Learning Analytics? #dalmooc

Last week, I attended the UTA LINK Lab talk presented by Dragan Gašević (@dgasevic) on learning analytics and research. The talk explored the digital traces of learning that can be collected and measured in our various learning environments, and questioned how well we are doing some of these analytics within our institutions. Although we have plenty of statistics, data, and information on our learners, how can we offer actionable insight, summative feedback, and information about learner progress? Our post-secondary institutions seem to want to deal only with the “R” word: Retention. Institutions are often looking to identify students at risk, provide information about learning success, and understand how to enhance learning; but how can we use data effectively when our metrics often focus only on single outcomes?

Photo c/o the #dalmooc edX Course Site

Instead, it is the process and context that our educational institutions need to identify when looking at learning analytics, that is, the need to understand and optimize learning (Butler & Winne, 1995). Whether we apply the community of inquiry framework’s cognitive presence, which includes triggering events, exploration, integration, and resolution (Garrison, Anderson, & Archer, 2001), or the COPES (Conditions, Operations, Products, Evaluations, & Standards) model (Winne, 1997), it is the meaningful data points for learning analytics that really need to be identified within our educational institutions. As @dgasevic said, “Learning analytics is about LEARNING!” Often we assume the data collected from our courses and our systems will provide us with the answers; however, if those data are not identified in a purposeful way, why bother? What we really need to consider is: what does it mean to study and support the learning experience, and not just the end results?
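
As one illustration of a process-oriented data point, here is a toy sketch that counts how a learner’s discussion posts distribute across the cognitive presence phases named above. The coded posts are invented sample data, and in practice the phase labels would come from trained raters or a validated classifier; the sketch only shows the shape such process data could take.

```python
# Toy illustration of "meaningful data points": rather than a single retention
# flag, summarize where each learner's posts fall across the cognitive
# presence phases (Garrison, Anderson & Archer, 2001). Sample data is invented.
from collections import Counter

PHASES = ["triggering_event", "exploration", "integration", "resolution"]

# Each post has already been coded with one cognitive presence phase
coded_posts = [
    {"learner": "L1", "phase": "triggering_event"},
    {"learner": "L1", "phase": "exploration"},
    {"learner": "L1", "phase": "exploration"},
    {"learner": "L2", "phase": "integration"},
    {"learner": "L2", "phase": "resolution"},
]

# A process-oriented view: where does each learner spend their inquiry?
by_learner = {}
for post in coded_posts:
    by_learner.setdefault(post["learner"], Counter())[post["phase"]] += 1

for learner, counts in by_learner.items():
    print(learner, {p: counts.get(p, 0) for p in PHASES})
```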

Here are a few areas of learning analytics and data evaluation that need to be considered:

  • learner agency and self-regulation
  • interaction effect – external and internal conditions
  • formal and informal learning communities
  • instructional intervention methods
  • multimodal learning
  • emerging technology impact, e.g. mobile, wearable tech, etc.

Here are questions our institutions need to consider when they want to examine learning analytics:

  • What data are we collecting? And why?
  • How does the learner information we know contribute to the PROCESS of learning?
  • Who should be part of this learning analytics research?
  • How can we best present and interact with the data? Can this be more immediate?
  • How can we encourage and support multidisciplinary teams to study learning analytics at our institutions?
  • Are we being driven by questions of need, access, and availability when collecting learning data?
  • What ethical and privacy issues should be considered when collecting data around learning? (One small safeguard is sketched below.)
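
On that last question, one concrete (and hypothetical) example of a privacy safeguard is pseudonymizing student identifiers before learner data leaves the office that owns it. The key, field names, and sample records below are illustrative assumptions, and any real project should follow institutional policy and applicable law (e.g., FERPA in the U.S.); this is a minimal sketch, not a complete privacy program.

```python
# Hypothetical sketch of one privacy safeguard: replace student IDs with keyed
# pseudonyms before sharing learner data with an analytics team. The key and
# field names are placeholders, not a real campus scheme.
import hashlib
import hmac

SECRET_KEY = b"rotate-and-store-this-securely"  # placeholder, not a real key

def pseudonymize(student_id: str) -> str:
    """Deterministic keyed hash: the same student always maps to the same
    pseudonym, but it cannot be reversed without the secret key."""
    return hmac.new(SECRET_KEY, student_id.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

records = [
    {"student_id": "A0012345", "logins_last_14d": 1},
    {"student_id": "A0067890", "logins_last_14d": 9},
]
shareable = [
    {"pseudonym": pseudonymize(r["student_id"]), "logins_last_14d": r["logins_last_14d"]}
    for r in records
]
print(shareable)
```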

Interested in learning more about learning analytics and data in education? Check out the paper in press by Gašević, Dawson, and Siemens http://bit.ly/techtrends15 or, better yet, join the 9-week Data Analytics & Learning MOOC that UTA & edX are hosting on this very topic starting Monday, October 20th: http://linkresearchlab.org/dalmooc/ or follow along with the conversation on Twitter #dalmooc.

References

Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65(3), 245-281.

Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7-23.

Gašević, D., Dawson, S., & Siemens, G. (in press). Let’s not forget: Learning analytics are about learning. TechTrends. http://bit.ly/techtrends15

Winne, P. H. (1997). Experimenting to bootstrap self-regulated learning. Journal of Educational Psychology, 89(3), 397-410.