#BbWorld16 – Open Source Analytics for Growing Adoption of Blackboard Learn using BbStats

Pictured from left to right: Eric Kunnen, Shahbaz Khan, Cheryl McKearin, and Szymon Machajewski

BbStats has been a leading OSCELOT project for about seven years. The analytics tool reports on Blackboard Learn system health and usage and provides valuable information about activity and use patterns. This session will dive deeper into the tools and reports needed to increase adoption.

This session highlighted the use of the free open source Blackboard Building Block called BbStats, written by Szymon Machajewski. The session was facilitated by Szymon, who is an affiliate instructor at Grand Valley State University (GVSU) and serves as a Blackboard system administrator at UIC.

Eric Kunnen, Associate Director of eLearning and Emerging Technologies at GVSU, participated on a panel with Shahbaz Khan and Cheryl McKearin from the University of Illinois at Chicago.

Highlights from the session included discussing the value of using BbStats for:

  1. Administrative Communication – Institutional Awareness, Transparency and Common Understanding/Importance of using Blackboard at the University

    “What are the system metrics of usage?”

  2. Measuring and Informing Teaching and Learning Impact, Discovery, and Increasing Adoption – Targeted Emailing, PD, and Campus Events

    “Who are our innovative and engaged faculty and what tools are they using?”

  3. Showcasing, Celebrating, Marketing, and Promotion – Infographics, Email Highlights

    “Do you know how Blackboard is used and by how many faculty and students?”

Szymon Machajewski provided an overview and tour of the large variety of data that BbStats collects and displays.


Here is an example infographic for GVSU’s use of Blackboard from the Fall 2015 semester using BbStats data:

  • 90% of faculty and 97% of students logged into Blackboard at GVSU



#BbWorld16 – Mining for Gold in your Institution’s Blackboard Learn Data

Learning analytics is the analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.

Faculty and students are most interested in

San Diego State University Learning Analytics Study

  • President level initiative
  • Goal: identify effective interventions driven by learning analytics “triggers”
  • Multiple triggers include: LMS Access, Grades, Online Homework
  • Focused on high need courses
  • Reference: EDUCAUSE Presentation Slides

Research Study Plan

  1. Identify courses and recruit instructors
  2. Prior to course start, review the syllabus and identify meaningful “triggers” for each course (e.g. attendance, graded items, Bb use, etc.)
  3. Run reports in Bb, online homework, and quizzes to identify students with low activity or performance
  4. Send “flagged” students in the experimental group a notification/intervention (a friendly reminder sent by the instructor)
  5. Aggregate data and analyze
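The flagging step of the plan can be sketched in a few lines. This is an illustrative example only, not the study's actual tooling; the field names and thresholds here are hypothetical.

```python
# Illustrative sketch of step 3: flag students whose LMS activity or grade
# falls below a course-specific "trigger" threshold. Field names and
# thresholds are hypothetical, not taken from the SDSU study.

def flag_students(records, min_logins=3, min_grade=70.0):
    """Return the IDs of students below either trigger threshold."""
    return [
        r["student_id"]
        for r in records
        if r["logins"] < min_logins or r["grade"] < min_grade
    ]
```

Each course would supply its own thresholds, since the study found that meaningful triggers vary by course and section.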


  • Triggers were found to vary by course and by section
  • Highly effective in courses with high LMS use
  • High school GPA is a good predictor of success, but LMS usage overtook it as a predictor after two weeks
  • Interventions didn’t make a difference in learning outcomes – the reach of intervention messages was low (75% were opened and only 35% clicked the link in the email)
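The low reach follows from simple arithmetic: if 75% of flagged students opened the email and 35% of those clicked the link (assuming the reported click rate is measured among openers, which is an assumption), only about a quarter of students actually saw the intervention content.

```python
def effective_reach(open_rate, click_rate_among_openers):
    """Fraction of all recipients who both open the email and click the link."""
    return open_rate * click_rate_among_openers

# With the rates reported in the session (click rate assumed to be among openers):
reach = effective_reach(0.75, 0.35)  # 0.2625, roughly 26% of flagged students
```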

New Intervention – Supplemental Instruction

  • Supplemental instruction (optional sessions led by a peer) showed dramatic improvement in results

How course design can influence student success (via Bb Analytics for Learn), from UMBC:

  • Usage of Blackboard (content – syllabus, documents, videos, textbooks, interactive tools – discussion, chats, blogs, wikis, assessment – grades, assignments, quizzes)
  • Key finding:

    “students were 1.5 to 2 times more likely to earn a C or better when the Grade Center was used”

  • Retention by Learn Risk Profile – High Grades and High Engagement is Key

Data Sources available in Bb:

#ETOM16 and MCO Summer Retreat

The Educational Technology Organization of Michigan and Michigan Colleges Online join together every year for a summer retreat. This #ETOM16/MCO Summer Retreat is focused on distance education and educational technology. This year more than 30 attendees from a variety of universities and community colleges joined together to discuss research and good practice around supporting student success in online learning, as well as the use of open educational resources.

Online Student Success – From Theory to Practice

How Do We Know?


Presenter: Bill Knapp, Chief Academic Technology Officer at Lakeland Community College
Session Links via Diigo

We will consider the research, literature, and evidence surrounding Online Student Success related to learner engagement, student satisfaction, retention, persistence, and student achievement (GPA). This highly interactive session will include presentations of research material and case studies interspersed with small group breakout sessions. Small groups will reflect on the research findings in an effort to identify what we know and what we think we know (assumptions and beliefs) about student success and online learning. The case studies will reflect innovative approaches to campus initiatives aimed at improving online student success and measuring learning outcomes.

Our presenter is Bill Knapp. Bill serves as the Chief Academic Technology Officer at Lakeland Community College in northeast Ohio, where he oversees the Library, the Center for Learning Innovation, Technical Customer Services, and Distance Learning. Bill has over eighteen years of experience in supporting the campus community in learning technologies and distance learning. He has presented at national, regional, and state conferences on a wide range of topics related to online teaching & learning.


  • Persistence – NCES uses retention as an institutional measure and persistence as a student measure. In other words, institutions retain and students persist.

“Students are more likely to become committed to the institution and, therefore stay, when they come to understand that the institution is committed to them.” – Vince Tinto

  • Tinto’s Model of Student Retention – Includes prior qualifications, individual attributes, and family attributes as inputs, followed by goal and institutional commitment. Social and academic integration is key in preventing dropouts.
  • Student Persistence and Online Learning (Hart, 2012)
    • Sense of Belonging to a Learning Community
    • Student Motivation
    • Peer and Family Support
    • Time Management
    • Increased Communication with Instructor
  • Social presence is key; with it, students are more likely to complete the course. – Gomez/Yen 2009
  • Measuring Social Presence
    • Social Context (informal)
    • Online Communication (meaningful)
    • Interactivity (responsive)
    • Online Privacy (confidential)
  • Community of Inquiry

    • Teaching Presence
    • Social Presence
    • Cognitive Presence
  • Faculty Involvement – Croxton (2014): “Online course interactivity, particularly between students and instructor, plays an important role in a student’s choice to persist in an online course.” This is key for persistence.
  • Interactivity and student GPA: online instructors tend to make minimal use of interactive technologies, yet the more interpersonal interaction, the better the student GPA. (Creating an Effective Online Instructor Presence – Community College Research Center)
  • Virtual Office Hours – The average satisfaction in classes that offered virtual office hours was HIGHER than in classes without them (Pitts, 2009). It is important to at least offer them, as the perception is that you are available as an instructor.
  • Early Alert LMS Analytics – Self-regulating learning in order of importance: regular study, late submissions, number of session logins, proof of reading course materials.
  • Open Textbooks and Learning Outcomes – Withdrawal rates: 21% with a commercial textbook vs 6% with open textbooks. Credit load: students in courses using OER enrolled in a significantly higher number of credits the following semester.
  • Building a Culture of Inquiry – “It is up to us to take advantage of the research and data that is there to carry us forward.”
  • Grade, completion, and attrition rate comparisons for online/hybrid/face-to-face classes are helpful to measure. These data can reveal differences between face-to-face and online learning and whether there are further questions to ask or areas to target for improvement.
  • Innovative online orientations using video – research has shown that withdrawal rates were reduced by 13%!
  • What would it look like if more universities and colleges implemented a mandatory online orientation before students could register for an online class?
  • Student support services are important for student success including embedded librarians, tutors, or student success center staff.
  • What could Lakeland do to help you be more successful in online courses?
    • Reliable Technology, Video, More Online Courses, Assignment Reminders, Consistency in Online Course Design/Navigation, Instructor Availability, Timely Feedback, Faculty Involvement, Online Testing
  • More session resources:

ETOM Board Meeting around the Campfire at Center Lake


Michigan Colleges Online Update

Ronda Edwards, Executive Director for Michigan Colleges Online, will provide an update on MCO initiatives and issues around online learning – including SARA, HLC accreditation, group technology purchases, the MCO OER repository, and the MCO professional development series for 2016-2017.


  • MCO Survey – Distance Education Survey Results
    • Online enrollments are up 4.7% nationally and down 3.7% in Michigan. This is the first year there has been a decline. Reasons seem to center on faculty contracts and the offering of fewer sections. Some colleges also indicate that higher HLC requirements for qualified instructors forced some reduction.
    • Nationally, 92% of institutions offer at least one online program; in Michigan, 75% offer online degrees. Enrollment tends to grow when you have a strategic direction for courses offered within online programs.
    • Course development: the average development time is 3 to 6 months, with an average of 10 courses developed per institution and 173 new courses developed overall.
    • 88% have mandatory training to teach online.
    • Re-certification of online teaching credentials is at 8%.
    • 75% of colleges have gone through HLC approval for courses and programs.
    • SARA – Currently 3 colleges completed the state application, 5 colleges have completed and submitted to the state, 5 colleges will seek individual state authorizations, 1 college seeking funds next year, 1 college approved, 7 colleges haven’t decided yet, 1 college does not register out-of-state students.
    • 50% of institutions use an internal quality standard, followed by QM.
    • 58% of institutions report use of OER textbooks.
    • 33% of institutions use a team development model when developing an online course, with an average of 9.6 new online courses created.
    • The average length of time to develop an online course is 3-6 months, reported by 67%.
    • The greatest challenges reported include adequate assessment of DL courses, followed by accessibility and budgets.
    • Faculty policies requiring response/interaction with students.
    • Required orientations.
  • Last week it was announced that the US Dept of Ed will push to finalize rule on state approval of online programs before the end of the year.
  • Michigan Colleges Online – OER Repository Initiative is now available in beta. The goal is to improve student success and completion, lower costs, inter-institutional faculty collaboration. Steering committee includes faculty, instructional designers, DE admins, and librarians.
  • OER activities have been underway with webinars, training, repository hub/group, publishing, adopting/adapting, grants for OER work, policy work, working with bookstores, and next tracking savings and evaluating success.
  • OER Commons has a variety of excellent webinars available on faculty and how they are using open resources.
  • MCO Collaborative Programs – MRI Technician Program includes: Kellogg, Lake Michigan, Mid-Michigan, GRCC, Lansing CC, and GVSU.
  • MCO Collaborative Purchases – Include: NetTutor, TechSmith (50% off), ZOOM, ReadSpeaker. ReadSpeaker is an accessibility tool and is offered 40% off. Next year, captioning, online testing, Camtasia Relay.
  • Professional development this year:
    • More OER Training
    • Online Remedial Math
    • Competency Based Courses
    • Authentication Options (Currently only required to have a unique password/account.)
    • Pell Grant Fraud
    • Authentic Assessment
    • Net Tutor
    • Instructor Review
    • Program Review
    • Gamification
  • Help Desk Initiative – Working on a 24×7 model with Kirtland, KVCC, LMC, NCMC, Mott via Black Belt Help. This is used to augment what you already have.
  • MCO Guided Digital Pathway Tool – Provides a student success strategy: a tech solution that crosses institutional silos and connects with all students, covering options for exploring colleges, a scheduling tool, and an advisor/mentor communication tool. Students can use the pathways tool to explore career interests; explore majors and programs at the college; take free assessments for personality, learning preferences, readiness for online learning, etc.; plan finances; connect with a college and develop a plan; and maintain connections with the college throughout their academic career.

Setting up Your Own OER Initiative

During this presentation, Nicole Finkbeiner, Associate Director of Institutional Relations at OpenStax, will utilize her experience working with colleges and universities across the country to outline the key components of a successful OER initiative. She’ll cover key metrics, real-world examples of successful strategies, and suggestions on how to adapt an OER initiative to your specific budget and campus culture. Nicole will also preview the new OpenStax authoring tool that makes it easy for faculty to customize their textbooks, and the new initiative with bookstores.


  • Measure Outcomes not Actions
    • Number of faculty using OER
    • Number of students using OER
    • Amount of savings for students (the average is $98 in savings per student)
    • Student Success (grades, completion, etc.)
  • Actions that contribute to, but don’t equal success
    • Holding a meeting
    • Having a workshop
  • Sample OpenStax report:
    • 859 students using OpenStax books, saving $84,000
    • Only 2.5% of students are using OpenStax in this example, so the impact could be much larger
  • Focus on Scale “How many students will be impacted by OER”
    • Key to transformative change
    • Focus on high-enrollment courses
  • Start with easy wins to impact students now!
  • We = not me – there really needs to be a designated leader and a champion.
  • Have a leader but include: faculty, admin, librarians, instructional support, bookstore, students, …everyone!
  • Successful initiatives take several approaches simultaneously.
  • Implementation Strategies:
    • Expressed support from administration – e.g. an email from the Provost recommending adoption of the initiative.
    • Presentations during department meetings
    • Ask faculty directly to try the OpenStax books in a pilot
    • Promote “Textbook Heroes” and ask them to promote OER
    • Hold faculty workshops and offer a stipend for evaluating and reviewing OER resources
    • Provide an OER grant program (orientation day, measurable outcomes, efficacy studies); easy wins are high-enrollment OpenStax courses with adoptions at scale.
    • Connect work to faculty tenure or promotion
    • Involve students – student government, student newspapers
    • Include a search filter for OER classes in your SIS ($40 low cost/no cost courses).
  • Writing OER
    • Start with adopting current textbooks as OpenStax texts (or other such as Saylor)
    • Reinventing the wheel is time consuming and costly vs finding existing resources
    • Independent peer review is critical for scale, other institutions to adopt
    • Check with OER groups first to see if another project is underway
  • Modifying OER
    • Most faculty say they want to modify OER, but few actually do
    • Modifying OER can increase print-copy costs and affect the ability to resell used copies
    • Check licenses carefully
  • OpenStax CNX
    • Can be used to customize and publish open education resources.
  • Custom Print Options via accesyourtextbook.com
    • Can be used to create a custom print version of an OpenStax textbook.
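The OpenStax report numbers above are easy to sanity-check: 859 students times the $98 average saving quoted earlier comes to roughly $84,000, and dividing by the 2.5% adoption share gives a rough ceiling for full-scale impact. A back-of-envelope sketch (the full-scale projection is illustrative, not a figure from the presentation):

```python
def oer_impact(students_using, avg_saving, adoption_share):
    """Current OER savings plus a naive projection if adopted at full scale."""
    current = students_using * avg_saving
    potential = current / adoption_share  # if every student were covered
    return current, potential

current, potential = oer_impact(859, 98, 0.025)
# current comes to $84,182 (about $84,000); potential is about $3.4 million
```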

Highlights from MACUL16

The MACUL (Michigan Association for Computer Users in Learning) conference is one of the premier events in Michigan, bringing together educators from across the state and nationally to talk about how technology can advance education.

This post highlights a couple of sessions from the conference, captured by Colleen Cameron, Systems Analyst, in the eLearning and Emerging Technologies department at GVSU.

Iteration and Innovation Drives Transformation

Jaime Casap, Chief Education Evangelist, Google, Inc. kicked off MACUL with some great questions to help change the way we think about education. Some of my favorites are highlighted below:

 “What’s the right education system that we need for the different economy we’re facing?”

In today’s globally connected economy, the need for advanced skills continues to grow (and change). As we look to the future, what skills do students need and how can we best educate students in a global way? What is the best combination of soft-skills that can be informed by computer literacy?

“What’s the role of technology?”

Generation Z is global, social, visual, and technological. They have unlimited access to information, but that information isn’t useful until it is converted into intelligence.

“Collaboration is HOW problems are solved. …Real collaboration is the ability to

  • listen
  • ask good questions
  • change your mind
  • build consensus”

When thinking about the future and working with students, it’s helpful to change your question – instead of asking kids, “what do you want to be when you grow up?”, ask “what problem do you want to solve?” Then, provide the skills needed to solve any problem.

“Iteration consists of success and failure” – there is no end point.


Sustainable Innovation: Successful Strategies for Schools, Not Startups

Presentation materials from the session are available on Google drive.

Elson Liu, Director of Integrated Technology Systems, Plymouth Canton Community Schools

Elson began by quoting Simon Sinek’s “Why > How > What” model. In business, marketing the “Why” is really what drives people to purchase a product. The best example of this is Apple. When you see an Apple commercial, they’re not marketing their device, they’re marketing a lifestyle of innovation and design. Think of how boring it would be if Apple only marketed the “What”: for example, “Here is our new phone, which is slightly larger than our last phone.”

With education, it’s important to know the “why” – your destination.  If you know the “why”, then the device you’re using doesn’t matter. It’s important not to mistake the device for the innovation – because the device will constantly change and improve (the “fail forward” theory). Education is the real innovation.

Lessons to learn from current technology trends:

  • Devices are designed to fail fast.
  • Consumerization of IT – people have expectations of devices, which may differ when you bring them into the classroom (e.g. encountering web filters or firewalls).
  • Products will follow the market, not education.
  • No strategy will last forever.

Create time to learn, and space to take risks.

“Innovation requires risk.”


Additional presentations included the following:

Ottawa County Innovation and Technology Forum 2016


Big Data Panel Discussion

Kevin Desouza, Rod Davenport, and Paul Stephenson, professor and chair of the GVSU Department of Statistics, talk about the value and challenges of big data.



What is Big Data?

Source: https://whatsthebigdata.com/2016/02/08/big-data-landscape-2016/

Big Data at GVSU

  • Real-world applications (link from the real world back to the classroom)
  • Big Data (real world context and data sets are available)
  • Complex Content
  • Internships and Jobs (students are interested)
  • Knowledgeable Community (educational institutions have a drive)
  • Students at universities have: tech skills, eager and creative minds, discretionary time.

Data Scientist Skills

  • Visualization, Communication, Storytelling
  • Basic statistics and computer programming.
  • Domain knowledge and teamwork
  • Sampling and data storage/retrieval
  • Statistical modeling and machine learning
  • Curious, evaluative (critical thinking), innovative, strategic

Challenges of Big Data

  • Data quality
  • Accessibility of data
  • Re-purposed data
  • Privacy and security
  • Complicated systems
  • Analysts that don’t understand the question or the solution
  • Inferential thinking (focus on error bars and intervals)
  • Differentiating “signals from noise”
  • Balancing time constraints

Realizing the Promise of Data and Technologies for Local Governments

Kevin Desouza, associate dean for research at the College of Public Service & Community Solutions and ASU Foundation professor in the School of Public Affairs at Arizona State University, presented at the Ottawa County Innovation and Technology Forum.


  • Complex platforms and governance now require the use of tech, data, and mobile.
  • Data and technologies provide situational awareness, transparency, engagement, policy, innovation, and governance.
  • Open data includes many platforms, including crowdsourcing, so that data management and tool development happen on top of the open data.
  • Issues with open data include: limited tech talent, public/private partnerships, success metrics are not defined, and there is a transparency vs privacy concern.
  • Example: Arizona Budget Analysis Tool (AZBat) – Took four months to build, with 8 undergrads, for a reasonable cost vs hiring an outside firm.
  • Big data issues: local governments lack the IT infrastructure and talent to conduct large-scale predictive analytics projects, and much of the data is “volunteered”. How can we link all of our separate databases? With access to the data, what does that mean ethically? Should we/can we “discriminate” (i.e. predictive policing), knowing that we have the data?
  • Mobile data includes Fitbit-like devices that contribute data around health and activities; this data is given up by end users. Real-time data from phones, wearable tech, social networks, etc. is growing rapidly. Issues include BYOD, regulating apps, encryption, interoperability, and video data processing and curation (police cameras).
  • Emerging tech such as autonomous vehicles will cost local governments big bucks. There are big data concerns and challenges.
  • We all make decisions based on data, yet once we make a decision we often stop processing data. People have emotions, hopes, and instincts, but without data you can’t align resources.

Top 10 Governing Data and Tech for Societal Value

  1. Start with a Goal in Mind – Evidence-based decision making. Know what objective and outcome are needed.
  2. Explore Design Options – Designing for the customer vs designing with the customer. Needs are best met by working directly with the customer.
  3. Rapid Prototype Development – Open and frugal innovation.
  4. Manage Scope Creep – Bound the problem and hold.
  5. Build Partnerships – Leverage and connect to resources.
  6. Harness Collective Intelligence – Design civic labs and crowdsourcing platforms. Open and welcoming for people to experiment.
  7. Experiment Constantly – Test interventions; simulate intended and unintended consequences. Bring in end users to actually test, use, simulate, experiment, try, and provide recommendations.
  8. Release in Beta – Iterative and the project is really never done.
  9. Promote “Intrapreneurship” – Develop competencies from within and promote innovation inside the organization.
  10. Outputs and Outcomes – Track both for evaluation and communicate the ROIs.


#MVUsym16 – Managing Quality in Online Learning

Session Description

A panel of online learning experts will discuss issues of quality as they pertain to online teachers and instruction, online content and course design, and program evaluation.


Peter Arashiro, Director of Instructional Product Design, MVU
Kristin Flynn, Interim Executive Director of Student Learning Services, MVU
Joe Friedhoff, Vice President of Research, Policy, and Professional Learning, MVU.


Session Presentation Slides

Redefining Quality Online Instruction

  • Moved contracted instructors to part-time employees.
  • The HumanEx screener tool was used to find student-centered staff with empathy, a positive attitude, results orientation, etc. The phone screener was a 30-minute call, with over 200 interviews conducted.

Uniform Onboarding Experience and Establishing Consistency Online

  • Common policies around academic integrity.
  • Shared best practices in engagement, efficiency, feedback.
  • Collaboration opportunities via Google Apps for Education
  • Pre-flight Checklists
  • Instructor monitor checklists for classroom observations.

Do your faculty have a growth mindset? “When we are failing we are learning…”


Building Capacity Online

  • Annual performance reviews.
  • Ongoing professional development includes “synergy” sessions with part time faculty.
  • “Collaboration of the Minds” is an event to bring everyone together to compare notes and share best practices.
  • Webinars are offered regularly.
  • 1-1 Coaching Opportunity.
  • Reflection is captured through blogging.

Managing Quality through the Course Design Process

  • DESIGN is intentional (produces useful results, derived from process, able to stand on their own)
  • QM is used to monitor Quality
  • Scope and Sequence, Standards, Meeting Objectives, Alignment of Assessments (Iterative process is key.)


Program Evaluation BEFORE

  • iNACOL, QM, AdvancED



Program Evaluation DURING

  • Learning Analytics
  • Monitoring Gradebook and Student Achievement
  • Tracking Login and Activity within Blackboard
  • Tracking Tool Activities in Courses
  • Tracking Completion


Program Evaluation AFTER


#MVUsym16 – Digital Badging in Educational Settings

Session Description
Digital badging provides a mechanism to acknowledge both formal and informal learning by students. This session will provide an overview of digital badging and highlight several Michigan badging projects that are underway.

Michelle Ribrant, Assistant Director, Office of Education Improvement and Innovation (MDE)

See > mibadges.org


  • Career and College Readiness is a focus of MDE.

  • Personalized learning is based on intentional instruction and integration, competency-based education, flexible learning options with a foundation of multi-tiered systems of support.
  • Personalized Learning
    • Choice, Context, Pacing, Relevance, Proficiency
  • Personalized Teaching
    • Collaboration, Flexibility, Student Ownership, Facilitation
  • Educational Technology
    • Access, Customization, Engagement, Data Use
  • Open badges are part of a reporting system. Competency-based pathways include: demonstrated mastery vs seat time, explicit and measurable learning objectives, rapid and differentiated support, application of knowledge, and flexible learning options and multiple pathways.
  • How do we recognize and value the way we learn…
  • Badges are digital documentation of skills and achievement. A badge is a digital icon; within the badge are credentials recording who issued it, what the user had to do to earn it, and the evidence of the outcomes/standards met.

Image from Classhack
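The credentials a badge carries can be pictured as a small metadata record. This is a simplified illustration loosely modeled on the Open Badges idea; the field names and values below are hypothetical, not the actual Open Badges assertion schema.

```python
# Simplified, hypothetical sketch of the metadata a digital badge carries:
# who issued it, what the earner had to do, and the evidence behind it.
badge = {
    "name": "Intro Cybersecurity Lab",                # hypothetical badge name
    "issuer": "Example School District",              # who issued the badge
    "criteria": "Complete all five lab modules",      # what the earner had to do
    "evidence": "https://example.org/portfolio/123",  # link to the earner's work
    "issued_on": "2016-06-15",
}
```

Because the criteria and evidence travel with the badge, anyone viewing it (a school or an employer) can judge its credibility, which is the concern raised below.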

  • Students have control of what they have learned, and they can share their badges on social media via Mozilla Open Badges.
  • Badges can play an integral role by supporting recognition on a skill or competency level and allowing learners to create custom pathways.
  • Why badges? Schools and the workforce can see student learning that happens in and out of school, in informal settings. Completion of a project, mastery of a skill, and life experience all represent student learning. Badges provide a comprehensive picture and demonstrated evidence of gained competencies, give students ownership of their learning, and assure credibility.
  • Considerations for issuing badges: Aligned to standards (academic, industry, out of school learning). Multiple pathways to demonstrate content. Communication of badges and levels of accomplishments.
  • Credibility is important in badging. There is informal credibility, but also there are formal badges and the continuum therein.
  • Cybersecurity badges are a good example. It could take a student years to get through the “gold standard” certification.
  • Sample MDE badges include the Digital Adventures badge for Detroit Public TV, FIRST Lego League, MiBadges, and CS First (a Google program). MDE is using MSU’s badges.msu.edu site to help manage badges.
  • Credly is another great solution to issuing and collecting badges.
  • FIRST in Michigan is a great example of what is happening locally: https://www.firstbadges.com This includes an attendance badge for 60 hours, followed by additional 2nd- and 3rd-level badges.
  • Moving ahead:
    • Continue work with partners to develop criteria for awarding badges to ensure legitimacy
    • Alignment to TRIG, 21st Century Afterschool Learning, First Robotics, 21 Things for Students, CS First, DPTV, and other initiatives.
    • Design/enhance reporting systems that include competencies and badges to indicate student skills and knowledge aligned to career and college readiness.