I thought I would take a stab at defining what connectivist metrics could include. In his post Connectivist Dynamics in Communities, Stephen writes that connectivist networks produce connective knowledge, and that four elements (autonomy, diversity, openness, and interactivity & connectedness) distinguish a knowledge-generating network from a mere set of connected elements. That seemed like a sensible place to start.
The metrics typically suggested include page views, new memberships, number and type of new media contributions, number of discussion threads, ratings (satisfaction and post ratings), number of new topics, number of connections, social network tracking, and number of posts/discussions.
I would venture some possible metrics for the four categories that Stephen outlined. Disclaimer: These are random speculations for now.
autonomy
- Number of individuals who joined the network through invitation vs. those who requested membership or enrolled through their own agency
- Number of members who know how to use the tools the network relies on to perform the basic functions necessary for participation
- Number of members who initiated a conversation that involved other people
- Number of people who participated in a learning activity initiated by others
- Number of times a member agreed or disagreed with an opinion expressed in the community
diversity
- Number of members belonging to distinct backgrounds (there could be multiple views of "background" here)
- Unique resources bookmarked per member
- Unique connections vs. shared connections
- How many unique conversations exist at any point in time?
- Number of homogeneous or differentiated conversations, by context and by participation
openness
- What is the net flow of connections to and from the network? (positive/negative, and high/medium/low magnitude)
- Number of accepted/rejected requests to join the network
interactivity & connectedness
- How many members are engaged in each conversation, on average?
- Per-member, per-background, and per-conversation participation statistics
- Trend analyses offered by social network analysis (SNA)
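Some of these could be prototyped directly from raw interaction data. Below is a minimal Python sketch, using entirely made-up member names and a hypothetical data layout (neither reflects any real system), that computes two of the metrics above: unique vs. shared connections per member, and the average number of members engaged per conversation.

```python
# Hypothetical interaction data: who is connected to whom, and which
# members took part in which conversation. All names and structures
# here are illustrative assumptions, not a real data model.
connections = {
    "alice": {"bob", "carol"},
    "bob": {"alice", "dave"},
    "carol": {"alice"},
    "dave": {"bob"},
}
conversations = {
    "thread-1": {"alice", "bob", "carol"},
    "thread-2": {"bob", "dave"},
}

def shared_vs_unique(member, others):
    """Split a member's connections into those also held by at least
    one other member (shared) and those only this member holds (unique)."""
    their = connections[member]
    rest = set().union(*(connections[o] for o in others)) - {member}
    shared = their & rest
    return shared, their - shared

# Average number of members engaged per conversation.
avg_engaged = sum(len(m) for m in conversations.values()) / len(conversations)
```

With the toy data above, alice's link to bob is "shared" (dave also connects to bob) while her link to carol is "unique", and the average engagement works out to 2.5 members per conversation. The point is only that, once interaction data exists in some structured form, these metrics are cheap to compute.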
Further, the ROI of the network would perhaps emerge if these metrics could answer the following questions:
- Did the network generate new knowledge?
- Did members who needed to learn in order to perform actually learn?
- Were any members disadvantaged in a way that prevented them from benefiting from the interactions?
- Did any innovative ideas arise out of the interactions?
- Did nodes in the network become more connected?
- Did members show an increased ability to manage new information and adapt?
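At least one of these questions admits a simple quantitative proxy: "did nodes in the network become more connected?" could be approximated by comparing network density before and after a period of activity. The sketch below uses toy edge lists (all names and edges are illustrative assumptions, not real data):

```python
def density(nodes, edges):
    """Fraction of possible undirected links that actually exist."""
    n = len(nodes)
    possible = n * (n - 1) / 2
    return len(edges) / possible if possible else 0.0

# Hypothetical snapshots of the network at two points in time.
nodes = {"alice", "bob", "carol", "dave"}
edges_before = {("alice", "bob"), ("bob", "carol")}
edges_after = edges_before | {("carol", "dave"), ("alice", "dave")}

became_more_connected = density(nodes, edges_after) > density(nodes, edges_before)
```

Density is only one proxy; a fuller answer would also look at degree distribution or clustering, but even this crude before/after comparison turns the question into something measurable.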
More thoughts to follow.
I like these metrics. I have given examples of some of these in my conversations and tweet conversations, but you do a great job of consolidating them. Great post Viplav.
Of course, now we need to figure out how to make these easily measurable.
Thanks Manish. Could you excerpt some of the tweets and summarize some of your conversations around these metrics? That would be really great!
I have responded to your post via my response post at http://suifaijohnmak.wordpress.com.
Many thanks for your wonderful insights on metrics.
I appreciate the consolidation of factors according to various parameters. A lot of these factors, such as participation, have traditionally been used to assign grades in educational institutions.
However, I feel that the parameters are quantitative. For instance, is a multi-part reply to a post better than a single consolidated reply?
In my humble opinion, we still have to look for new factors that can provide a wider understanding of networks. Malcolm Gladwell, in his book The Tipping Point, suggested a factor he calls the STICKINESS FACTOR. Quantifying this factor could be a starting point for defining the connectivist matrix.
This is very relevant to a project I am working on at the moment.
The problem we face is how to gather this information automatically, or whether that is even possible. Our tutors have understandable time constraints that are a barrier to gathering this valuable information.
Thanks for any thoughts!
Unfortunately Maria, there is no single link I can direct you to. Part of the problem is that not many people have focused on collecting data in this manner. Traditional systems like SNAPP (see also George Siemens's Learning Analytics group at http://groups.google.com/group/learninganalytics – there are some interesting discussions there, including his discussion on data trails) don't transcend the quantitative ("learner clicked a link") type of analytics. That is also because we need newer tools, or structured collaboration activities, to assess situations such as "did any new knowledge result" or "does X trust Y". As George points out, there are many other sources outside a formal LMS or portal kind of environment that can contribute to learning analytics. I recommend reaching out to the OPUS2 team (Al Pedrazzoli at GSIServer). Viplav
Thank you Viplav, I appreciate the feedback. I had a feeling this would be the case!
I will look at those links for more information.