#BbWorld16 – Open Source Analytics for Growing Adoption of Blackboard Learn using BbStats

Pictured from left to right: Eric Kunnen, Shahbaz Khan, Cheryl McKearin, and Szymon Machajewski

BbStats has been a leading OSCELOT project for about seven years. The analytics tool reports on Blackboard Learn system health and usage and provides valuable information about activity and use patterns. This session will dive deeper into the tools and reports needed to increase adoption.

This session highlighted the use of the free open source Blackboard Building Block called BbStats, written by Szymon Machajewski. The session was facilitated by Szymon, who is an affiliate instructor at Grand Valley State University (GVSU) and serves as a Blackboard system administrator at UIC.

Eric Kunnen, Associate Director of eLearning and Emerging Technologies at GVSU, participated on a panel with Shahbaz Khan and Cheryl McKearin from the University of Illinois at Chicago.

Highlights from the session included discussing the value of using BbStats for:

  1. Administrative Communication – Institutional Awareness, Transparency and Common Understanding/Importance of using Blackboard at the University

    “What are the system metrics of usage?”

  2. Measuring and Informing Teaching and Learning Impact, Discovery, and Increasing Adoption – Targeted Emailing, PD, and Campus Events

    “Who are our innovative and engaged faculty and what tools are they using?”

  3. Showcasing, Celebrating, Marketing, and Promotion – Infographics, Email Highlights

    “Did you know how Blackboard is used and by how many faculty and students?”

Szymon Machajewski provided an overview and tour of the large variety of data that BbStats collects and displays.


Here is an example infographic for GVSU’s use of Blackboard from the Fall 2015 semester using BbStats data:

  • 90% of faculty and 97% of students logged into Blackboard at GVSU




#BbWorld16 – Mining for Gold in your Institution’s Blackboard Learn Data

Analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.

Faculty and students are most interested in

San Diego State University Learning Analytics Study

  • President-level initiative
  • Goal: identify effective interventions driven by learning analytics “triggers”
  • Multiple triggers include: LMS Access, Grades, Online Homework
  • Focused on high need courses
  • Reference: EDUCAUSE Presentation Slides

Research Study Plan

  1. Identify courses and recruit instructors
  2. Prior to course start, review the syllabus and identify meaningful “triggers” for each course (e.g. attendance, graded items, Bb use, etc.)
  3. Run reports in Bb, online homework, and quizzes to identify students with low activity or performance
  4. Send “flagged” students in the experimental group a notification/intervention (a friendly reminder sent by the instructor)
  5. Aggregate data and analyze
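Steps 3 and 4 above amount to a simple rule-based flagging pass over course activity data. The sketch below illustrates that idea in Python; the thresholds, field names, and roster structure are illustrative assumptions, not part of BbStats or any Blackboard API.

```python
# Hypothetical sketch of trigger-based flagging (steps 3-4 of the study plan).
# Thresholds and data fields are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class StudentActivity:
    name: str
    lms_logins: int       # "LMS access" trigger
    homework_avg: float   # "online homework" trigger (0-100)
    quiz_avg: float       # "quiz" trigger (0-100)

def flag_students(roster, min_logins=3, min_score=70.0):
    """Return students who trip any low-activity or low-performance trigger."""
    flagged = []
    for s in roster:
        triggers = []
        if s.lms_logins < min_logins:
            triggers.append("low LMS access")
        if s.homework_avg < min_score:
            triggers.append("low homework score")
        if s.quiz_avg < min_score:
            triggers.append("low quiz score")
        if triggers:
            flagged.append((s.name, triggers))
    return flagged

roster = [
    StudentActivity("A. Lee", lms_logins=1, homework_avg=82.0, quiz_avg=75.0),
    StudentActivity("B. Cruz", lms_logins=9, homework_avg=91.0, quiz_avg=88.0),
]
for name, triggers in flag_students(roster):
    # In the study, flagged students in the experimental group received
    # a friendly reminder sent by the instructor (step 4).
    print(f"Notify {name}: {', '.join(triggers)}")
```

In the actual study the triggers varied by course and section, so the thresholds here would be tuned per course rather than fixed globally.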


  • Triggers were found to vary by course and by section
  • Triggers were highly effective in courses with high LMS use
  • High school GPA is a good early predictor of success, but LMS usage overtook it as a predictor after 2 weeks
  • Interventions didn’t make a measurable difference in learning outcomes, due in part to the low reach of intervention messages (75% of recipients opened the email, but only 35% clicked the link)

New Intervention – Supplemental Instruction

  • Supplemental instruction (participation in optional peer-led sessions) showed dramatic improvements in results

How course design can influence student success (via Bb Analytics for Learn), from UMBC:

  • Usage of Blackboard: content (syllabus, documents, videos, textbooks), interactive tools (discussion, chats, blogs, wikis), and assessment (grades, assignments, quizzes)
  • Key finding:

    “students were 1.5 to 2 times more likely to earn a C or better when the Grade Center was used”

  • Retention by Learn Risk Profile – High Grades and High Engagement is Key

Data Sources available in Bb: