Talk by Tony Wagner at GLOBALEDCON

This summary comes from a talk by Tony Wagner, author of The Global Achievement Gap, which he gave at the online Global Education Conference. Recordings are available at the website:

Two problems in education today:

  1. Access to education is becoming more available outside of school (Udacity, Coursera, etc.); hence there is no need to go to school to acquire knowledge.
  2. Changing nature of work, i.e. “global knowledge economy”, which calls for new workplace skills

He was influenced by Thomas Friedman’s book “The World Is Flat”. Based on extensive analysis and interaction with major corporations, Wagner concludes that our students need seven survival skills:

  1. Critical thinking skills – how to improve their company’s productivity. In the real world, critical thinking is defined as “asking the right questions”, unlike what school does – rewarding the right answers
  2. Collaboration in the professional network to solve problems often involving cross-cultural, cross-religious boundaries
  3. Agility and adaptability
  4. Initiative and entrepreneurial skills
  5. Effective oral and written communication – the number one complaint from senior executives. “They do not know how to write with voice,” quoted one CEO; the ability to speak with passion
  6. Accessing and analyzing a lot of information
  7. Creativity and imagination

What is taught and tested through standardized assessment under accountability pressure is in conflict with 21st-century demands. Hence he suggests an approach to developing a new accountability system:

  • We should work with businesses and organizations in the real world to develop a new system of accountability
  • Starting from kindergarten, students should develop an electronic portfolio that shows their skills and talents
  • He cited Finland for being very successful in developing the right kind of accountability in education

Another compounding problem is the global economic situation. A BA degree is not worth as much anymore; graduating with a BA does not, by itself, add workplace skills. This economy is innovation-driven.

“What must we do differently?” – He interviewed a number of highly motivated and innovative young people from different social groups, then interviewed their parents and mentors or teachers. His conclusion was that their teachers, from elementary school all the way through college, were “outlier teachers” – people with a unique approach to teaching. He identified contrasts between traditional schooling and outlier teaching:

  1. Culture of schooling – celebrating individualism vs. culture of innovation – valuing teamwork and collaboration
  2. Compartmentalized approach to teaching vs. cross-disciplinary approach
  3. Class of students sitting passively, a consumer approach vs. class of creators and producers
  4. Mistakes are stigmatized vs. failing often is encouraged – to the point that the word is not even used; ‘iteration’ is the more appropriate word
  5. Extrinsic motivation (grades, money, parents’/teachers’ satisfaction) vs. intrinsic motivation, i.e. inner interest encouraged by parents through ‘play and passion’

These young people were all driven by a sense of purpose to make the world a better place.

Neat website: 

Social Network Indicators for Learning

This talk was delivered online as part of the MOOC on “Current/Future of Higher Education” (available at this page).

  • Social networks are defined in a broader sense, not only specific to online social media
  • Components: actors (individuals, collectives, other), ties (strong or weak), relations, networks
  • Using visualization tools and data from online platforms, distance and blended course instructors can analyze how different online tools (chats, discussion boards, email) form different kinds of networks.
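To make these components concrete, here is a minimal sketch of turning discussion-board reply data into actors, ties, and a simple tie-strength measure. The reply log and actor names are invented for illustration, and the "strong if more than one interaction" rule is a toy assumption, not something from the talk:

```python
from collections import defaultdict

# Hypothetical reply log from a discussion board: (author, replied_to) pairs.
replies = [
    ("ana", "ben"), ("ben", "ana"), ("ana", "ben"),
    ("cal", "ana"), ("dee", "ben"),
]

# Ties: actors are nodes; each reply adds weight to the tie between two actors.
ties = defaultdict(int)
for a, b in replies:
    ties[frozenset((a, b))] += 1

# Toy rule: a tie is "strong" if the pair interacted more than once, else "weak".
strength = {tuple(sorted(pair)): ("strong" if w > 1 else "weak")
            for pair, w in ties.items()}

# Degree: how many distinct actors each participant is connected to.
degree = defaultdict(set)
for a, b in replies:
    degree[a].add(b)
    degree[b].add(a)

print(strength)
print({actor: len(peers) for actor, peers in degree.items()})
```

The same adjacency data is what visualization tools draw as a network diagram; different data sources (chat, email) would simply populate `replies` differently.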

A talk by Simon Buckingham Shum: Social Learning Analytics

This talk was delivered online as part of the MOOC on “Current/Future of Higher Education” (available at this page).

  • MOOCs are partly about big data; they provide us with the data to answer a lot of pedagogical questions
  • His take on analytics – three levels: micro (individual users), meso (institutional level), and macro (regional, state, international)
  • Use of analytics calls for a new kind of literacy – do stakeholders and decision makers have that?
  • A lot of efforts among VLE providers to integrate analytics in their solutions
  • Analytics provides a “feedback loop” for educators with a larger sample and a better sampling rate
  • This feedback loop is available to learners too
  • R&D related questions: who else to involve in the analytics endeavor: data capture and design teams, IT, etc.
  • The goal is to optimize the system. What are we optimizing it for? … for better engagement, assessment, outcomes, and larger scale.
  • Learning analytics is a “revolutionary technology” because we have much more quantifiable data handy
  • Analytics is assessment – what kind of assessment is good?  – formative assessment, i.e. feedback that is constructive, timely, sensitive to learners’ emotions, conducive to learners’ self-reflection, motivational
  • Three kinds of discourse in educational settings: disputational, cumulative, and exploratory, the last one being the best (from Mercer et al.). Learning analytics should have a theory of learning underlying it. This sociocultural theoretical approach can be applied using computational linguistics – scanning for and identifying cues that signal certain kinds of discourse (a project co-authored by the presenter)
  • Another example of research using the theory above – a machine learning approach (a project co-authored by the presenter)
  • Scholarly writing with analytics using syntax parsers (e.g. the Xerox Incremental Parser) (a project co-authored by the presenter)
  • Semantic analysis between learners (a project co-authored by the presenter)
  • Visualized data of semantic webs can be filtered by factors of interest, e.g. learner, friend network vs. response network, etc.
  • Bottom line: learning analytics can “catalyze” a debate about what “good education looks like”, not just about “what we can measure” with this tool.
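The cue-scanning idea behind classifying discourse types can be illustrated with a toy keyword matcher. This is only a sketch: the cue lexicon below is invented, and the projects mentioned in the talk use much richer computational-linguistics pipelines, not substring matching:

```python
# Invented cue lexicon for Mercer et al.'s three discourse types.
CUES = {
    "exploratory": ["because", "i think", "what if", "perhaps", "evidence"],
    "disputational": ["no you", "wrong", "that's not"],
    "cumulative": ["yes, and", "me too", "i agree"],
}

def classify(utterance):
    """Return the discourse type whose cues appear most often in the utterance."""
    text = utterance.lower()
    scores = {kind: sum(cue in text for cue in cues)
              for kind, cues in CUES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unclassified"
```

Applied to forum posts, even a crude classifier like this could flag which threads tend toward exploratory talk, which is the pedagogically valuable kind.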

Some Thoughts

  • How to make learning analytics more constructive and learner-centered is a great point. How do we do it practically?

A talk by Erik Duval: Learning analytics

This talk was delivered online as part of the MOOC on “Current/Future of Higher Education”  (available at this page)

Erik Duval from Leuven, Belgium:

  • Learning analytics is about collecting and analyzing ‘traces’ of users’ online presence
  • Educational data mining is something they do not like doing because it disempowers students and faculty
  • Duval does not like LMSs because they are a barrier to innovation, and because the learner activity shown in an LMS is only the ‘tip of the iceberg’
  • Learning analytics should happen in real time, not ‘post-mortem’, because the idea is to help learners in the process – a sort of formative assessment
  • LMS dashboards are very limiting for students. He cites Bb dashboards, and generalizes this problem to other LMSs too.
  • Quantified Self – tracking food intake, jogging, alcohol intake, etc. – is like learning analytics.
  • Learning analytics is like a mirror in which learners can reflect on their learning, understand what effort is, and compare it with the professor’s definition of learning
  • His main goal is building learner-friendly dashboards that help students make decisions as they take the course, and they design the dashboards by involving students: Twitter, schedule, comments, files, all, RSS, and a meter that shows where a learner stands compared to others.
  • Perhaps comparing to others or certain expectations is not a good idea according to Duval
  • He’s not excited about the idea of telling students what to do, i.e. building algorithms, action items for students to take based on the data that emerges. He wants to provide this info to the students and have them reflect and act on it on their own.
  • The more students do “drill and practice” activities, the easier it is to use learning analytics. You need to be more creative to tap into more open-ended tasks and make sense of them.
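The "meter that shows where a learner stands compared to others" could be as simple as a percentile over some activity measure. The sketch below is my own illustration, not taken from Duval's dashboards, and the minutes-of-activity metric is an assumption:

```python
def standing(learner_minutes, class_minutes):
    """Fraction of the class whose activity the learner meets or exceeds.

    learner_minutes: this learner's total activity time.
    class_minutes: list of activity times for everyone in the class.
    """
    if not class_minutes:
        return 0.0
    return sum(m <= learner_minutes for m in class_minutes) / len(class_minutes)

# A learner with 120 minutes of activity in a class of [30, 60, 120, 200]
# stands at the 0.75 mark.
print(standing(120, [30, 60, 120, 200]))
```

Note Duval's own caveat applies: showing such a comparison may or may not be a good idea pedagogically; the point of his dashboards is reflection, not ranking.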

Some thoughts:

  • It seems that using LMS data for decision making about student engagement and for predicting their performance is less justified with hybrid or F2F courses. Perhaps the proposal for the integrated model of learning analytics is a solution to this problem (discussed in the article by Siemens et al.).
  • Because I don’t have experience with LMS learning analytics dashboards, it’s not clear how the learning analytics dashboard developed by Erik Duval is different from the LMS dashboards.

A talk by John Baker

This talk was delivered online as part of the MOOC on “Current/Future of Higher Education”  (available at this page)

John Baker: The Desire2Learn Story: Problem Finding, Passion, Perseverance

Some major shifts:

  • move towards online and blended learning among campus students
  • move from hard-copy textbooks to digital books
  • flipped classrooms and webcasting of courses
  • learners becoming producers vs. consumers of knowledge

Academic & Learning Analytics

This topic comes from Week 4 of the MOOC on “Current/Future of Higher Education”


A talk available at

  • LMSs do have some benefits, e.g. they keep track of students’
  • Reference to the ECAR study of 2005 with 5 stages of analytics use by schools

Purdue Experience

  • “Signals” in Blackboard, an example of a predictive approach to analytics
  • Success = academic preparation + performance + EFFORT; hence the need to identify measures of effort
  • CMS and other technologies (real time) => Prediction <= SIS Data (historic), Other data
  • Better student performance: more Bs and Cs, fewer Ds and Fs [apparently not much impact on B-students]
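The "preparation + performance + effort" formula suggests a simple weighted risk signal combining historic SIS data with real-time CMS activity. The weights, thresholds, and traffic-light cutoffs below are invented for illustration; Purdue's actual Signals model is more sophisticated than this:

```python
def signal(preparation, performance, effort, weights=(0.3, 0.4, 0.3)):
    """Toy traffic-light risk signal from three inputs scaled to [0, 1].

    preparation: historic SIS data (e.g. normalized prior GPA),
    performance: current course grades,
    effort: normalized LMS activity (the hard-to-measure ingredient).
    """
    score = (weights[0] * preparation
             + weights[1] * performance
             + weights[2] * effort)
    if score >= 0.7:
        return "green"
    if score >= 0.4:
        return "yellow"
    return "red"

# A well-prepared, active student gets a green light; a struggling,
# inactive one gets red.
print(signal(0.9, 0.8, 0.9), signal(0.2, 0.3, 0.1))
```

The design point the talk makes is that effort is the lever students control, which is why a usable measure of it matters so much to the prediction.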

UMBC Experience

  • D students consistently spend about 30-40% less time in Blackboard than C students
  • Students may draw faulty conclusions by connecting their online presence to grades on specific assignments too directly
  • According to Fritz’s research, 28% of students were surprised by what the data showed about their performance
  • Overall, it is not clear whether and how students use these data, how they interpret them, and what they do with them
  • At the institutional level, the data help identify the most active courses and best-practice teachers
  • Frequency and duration of clicks are critical to consider, not just the number of clicks
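One way to get at duration rather than raw click counts is to sessionize click timestamps. The sketch below is my own illustration, not UMBC's method, and the 30-minute session gap is an arbitrary assumption:

```python
def session_time(clicks, gap=30):
    """Estimate time on task from click timestamps (in minutes).

    Clicks more than `gap` minutes apart are assumed to start a new
    session; time between sessions is not counted.
    """
    clicks = sorted(clicks)
    total, start, prev = 0, clicks[0], clicks[0]
    for t in clicks[1:]:
        if t - prev > gap:
            total += prev - start  # close the previous session
            start = t              # open a new one
        prev = t
    return total + (prev - start)  # close the final session

# Five clicks, but only ~20 minutes of actual activity across two sessions.
print(session_time([0, 5, 10, 100, 110]))
```

By this measure, two students with the same click count can have very different time on task, which is exactly the distinction the bullet above calls for.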

Grand Rapids Community College: Project Astro Experience

  • Bb Building Block for analytics, available for installation free of charge[??]: a dashboard that shows visuals for tool use overall and by course, instructor vs. student activity, specific student activity, etc.
  • Offers an alert system: a flag can be raised for specific students a) manually by the instructor or student counselling office or b) automatically by the system
  • More effort is needed to bring multiple systems, sources, and databases together to merge different kinds of data for decision making
  • FERPA concerns about student privacy

PENETRATING THE FOG by Long and Siemens

  • A lot of decisions are not data driven
  • Learning (course- and department-specific level) vs. Academic (institutional, regional, national/international level) analytics
  • Interesting study cited by Morris, Finnegan and Wu about online presence and behavior of undergraduate students in online courses
  • Important caution against taking a deterministic view of analytics as opposed to a more justified probabilistic view


  • This is a proposal for a “broad-based, multi-sourced, contextual and integrated” model of learning analytics (p. 6).
  • It pulls together a number of data sources to draw a more holistic picture of student learning and academic effectiveness at the course and institutional level.
  • This model also provides recommendations by outsourcing to external sites, e.g. Amazon, etc.


  1. What analytics tools are available for Moodle?
  2. When is the proposed platform going to be ready?

Beyond MOOC Hyperbole: Why We Should Support MOOC Experimentation … Critically and Carefully by Siva Vaidhyanathan

This is the first post based on the MOOC “Current/Future of Higher Education 2012” (available at this page).

Since I joined this MOOC late, I ended up listening to recordings of presentations instead of attending live events. This one was for Week 1 (direct link at ). Here are some key ideas as they relate to my work and experience:

  • Siva Vaidhyanathan takes a critical view of MOOCs based on his experience at the University of Virginia. His university had high expectations for offering MOOCs, and there were some controversies around it.
  • He’s still positive about MOOCs and their potential, especially for marginalized, disenfranchised populations.
  • He’s hopeful that MOOCs could help identify the best technologies for more traditional courses.
  • Privacy of students’ data, including grades – one of his concerns
  • He cites a website that offers MOOCs that is apparently sensitive to student privacy data
  • Another of his concerns is the talking-head mode in which most MOOCs are delivered

Final notes:

Some neat courses I personally liked at

  • Surviving Disruptive Technologies
  • Introduction to Sustainability
  • Writing in the Sciences
  • Social Network Analysis

EFL self-study plan for an 11-year-old

Here’s another case, a family that would like to facilitate their 11-year-old daughter’s learning of English.

She has a decent level of English (around B2 according to the CEFR). The four language skills are more or less equally developed, which is exceptional in the Armenian context, where most kids of this age attending a regular school are not able to express themselves fluently in speaking.

After I talked to her and her family, we identified the following areas of improvement for her:

  • enrich her vocabulary and structures so they represent more authentic English
  • improve her reading skills to handle texts that are beyond her listening and speaking skills
  • raise her awareness that words in English may be pronounced differently from how they are written, and expose her to new words by listening to them, not always by reading them
Some challenges that we’ve identified:
  • She seems very sensitive to ambiguity: if she reads, watches, or listens to something beyond her level, she is easily turned off and loses motivation
Some strengths:
  • She’s interested in speaking English with English-speaking people around her
  • She’s highly motivated to improve her English
  • Her parents support her and always look for ways to create opportunities for learning
  • She natively speaks three other languages: Armenian, Russian, and Farsi
  • She continues attending the Sunday Baha’i class for English-speaking children – once/week/1hour
  • She continues working on “Get, Set, Go” series with her father – 1-2 times/week/1hour
  • She continues speaking only English with her father – 1-2 hours/daily
  • They will find reading resources that she herself picks and reads – 1-2 times/week/30 min:
a) They will check out a local English language library for authentic magazines and books
b) They will visit local bookstores and buy storybooks for her reading level that she herself picks
c) They will work on online activities that she herself picks from the Bag of Tricks social bookmarks (search by ‘reading’ tag)
  • They will write for different purposes using any of the resources below – 1-2 times/week/30 min
a) They will consider starting an ebook that she is personally interested in writing. She will share her stories with friends in the English Club at Facebook
b) They will consider working on . She will share her stories with friends and in the English Club at Facebook
c) They will consider creating online multimedia posters on topics of her own choosing. She will share her glogs in the English Club at Facebook; she could make glogs based on what she reads or listens to.