Minding the Gap: Analytics and Organizational Capacity

Analytics enjoys strong support and endorsement at most of our institutions; today it's probably harder to find a campus without an analytics initiative than one with it.  To those of us with backgrounds in the field, it's a very exciting time, with lots of potential projects being envisioned.  But to recall the late Stephen Covey (Covey, 1989), all things are created twice: first when we set a goal, and a second time in the actions we take to achieve it.  In the case of analytics in higher education, I believe that we have an organizational capacity gap between the first and second creations.

[Image: London Underground sign reading "Mind the Gap"]

The 2012 ECAR analytics study found high expectations for analytics; 86% of respondents thought that in two years analytics would be more important to higher education's success than it is today (Bichsel, 2012).  The study also found current deployments limited, especially with respect to predictive analytics and the proactive use of data for decision-making.  These results are all too familiar.  In the 2005 EDUCAUSE report that coined the term "academic analytics," 63% of respondents (among those with capacity beyond transaction systems) planned to upgrade their analytics infrastructure within two years (Goldstein & Katz, 2005).  What happened?  Either this capacity was created and lies unused, or the plans were never implemented.  Why do we continue to have these unfulfilled expectations when it comes to analytics in higher education?

Both of these studies report a lack of staff with the requisite knowledge and backgrounds to conduct analytics.  But just adding staff won't create an analytics program.  If such people were found and convinced to join our ranks, where would they fit in an institution?  Asked another way, whose responsibility is analytics?

At present, I don't think that analytics fits neatly anywhere.  Institutional Research is overburdened with mandated and standard reports, and frequently unaccustomed to working with innovative enterprise technologies and changing requirements.  Institutional (or Academic) Technology is accustomed to innovation and technology, but lacks the research methods skills.  Academic Affairs often has experienced leaders in these areas, but doesn't have the resources for campus-wide implementations.

If campuses want to break this cycle of deflated expectations, a preliminary question to answer is where to place analytics as a program and set of responsibilities.  Success will require changes to our organizations: not just leaders using data for decision-making, but an organizational culture and set of responsibilities that don't currently exist.  We've already made the first creation; it's time to succeed at the second.

Works Cited

Bichsel, Jacqueline. (2012). 2012 ECAR Study of Analytics in Higher Education. Boulder, CO: EDUCAUSE Center for Applied Research and Association for Institutional Research.

Covey, Stephen R. (1989). The 7 Habits of Highly Effective People. New York: Fireside (Simon and Schuster).

Goldstein, Philip J., & Katz, Richard N. (2005). Academic analytics: The uses of management information and technology in higher education. Washington, DC.


On Getting Lost, Serendipity and Learning Analytics

[Image: redwood canopy photo]

I'm wrapping up my doctoral dissertation and have been presenting the findings of my study (here and here).  But as I reflect on my graduate school experience, I'm realizing that the most important thing I learned in graduate school wasn't quantitative methods, educational theory, or the empirical literature in learning analytics.  It was how I produce my best work.  And what I learned runs counter to my notions of productivity and efficiency.

I found that my best ideas don't come immediately; they take time to cultivate, mature, and develop.  Once launched, they seem to grow on their own.  To write something good (i.e., original and interesting), I needed to pre-write, brainstorm, and outline my chapters.  But once I started writing, I deviated from that outline.  It turned out to be less a script than a warmup exercise.  The process seemed hopelessly inefficient at times, especially compared to cohort-mates who would write everything in one sitting.  But once I started writing, the words would flow.  And I even found the writing process a little bit fun.

This approach was encouraged by my graduate school program.  One of our lead professors (Dr. Heckman) used his trademark New York style to give compelling terms for this process, encouraging us to continue "noodling" on our ideas and "thinking about thinking."  I have to admit that at the time, as a part-time student with a full-time job and family responsibilities, I just wanted to get to my end goal.  But about midway through my dissertation, the benefits of this process became clear.

This realization made me wonder … about how we work, write, and analyze data for decision-making outside of the protected space of academic research programs.

Rebecca Solnit speaks to this process in "A Field Guide to Getting Lost."  She traces the etymology of the word "lost" to the Old Norse word los, which means "the disbanding of an army … suggests soldiers falling out of formation to go home, a truce with the wide world" (p. 7).  She distinguishes getting lost (operating without a mental map or predictive sense of where you are) from losing something (not being able to locate something you already know).  I love to hike.  What I love best about getting into the wild is precisely this aspect of getting lost in the complexity of nature's beauty.  Lately I've been trying to get lost in suburbia and in my work life, to discover more of what's around me.

With low staffing and reduced funding, we have far more questions and formulated hypotheses than we have time to investigate.  But with big data (and even small data), we have the opportunity to get lost and see something new.  I revised my research questions three times, which might be considered taboo.  But I was able to ask better questions once I got familiar with the data.  Beyond refuting or validating a hypothesis, we can discover something we didn't know before; we can discover questions we didn't have.  This process takes more time, thought, and energy.  It is also less "productive" in the tasks we check off.  But the results have incomparably greater potential.

As we develop Learning Analytics for decision making, I hope we’ll give ourselves a chance to get lost in the data, to disband our conceptual armies and discover new patterns to help us improve student success.

OK, now I’d better get to writing that summary and checking off the next to-do item.

Thanks to Phil Hill (@PhilOnEdTech), Michael Feldstein (@mfeldstein67), and Kimberley Hayworth for their feedback and comments on this post.

Just getting rolling ….

This website is under construction (1/13/2013); I'm posting ideas, resources, and research related to learning analytics.  I'm especially interested in the use of data for student success, particularly data from educational technologies that helps us understand what college students are doing – and how we can improve student achievement.