Club Admiralty

v7.0 - moving along, a point increase at a time

Multilitteratus Incognitus

Traversing the path of the doctoral degree

First attempt at recording lectures...

I feel like this took forever to do, but it's finally done!

Last November we had a guest lecturer, Dr. Bessie Dendrinos, from the National and Kapodistrian University of Athens, come speak to us on the subject of "Global Economy and the Urgent Need for Languages: American and European responses to foreign language learning exigency".  I recorded the video on a FlipCam (remember those?) and edited it really quickly in Camtasia to add in some clearer slides.  The FlipCam isn't that bad, but I think a lavalier microphone and a tripod would do wonders for anything I record in the future.  The transcription (big thank you to Kathleen, Liz, and Laura) and the captioning on YouTube took a while since we're all new at this.  Let's see how the next one goes in terms of production.



Assessment of....?

Image from Flickriver, Brian Hillegas
A few days ago, and totally by a stroke of chance, I happened upon a twitter discussion between @HybridPed, @otterscotter, @actualham, and a few others.  I am not sure what the original topic was, but I came in when they were discussing assessment. Do we assess learning or competency? Some regarded learning as transcending competency and some saw competency as transcending learning. It's hard to really have a meaningful exchange of ideas in 140 characters, especially when the twitter train grows and grows.

When I jumped into this conversation I took the stance that what we assess is learning, not competency.  Competency, I would argue, is something that develops over a period of time. It is something you hone and improve.  Your skill (i.e. your competency in something) improves the more you practice it. And, by practice, I mean being present while doing it and analyzing your own performance during the task, not just going on autopilot.  Learning, on the other hand, for me, is learning distinct facts: that the Declaration of Independence was signed in 1776, that the Greek War for Independence began in 1821, that many historians think the assassination of Archduke Ferdinand kicked off World War I, that π ≈ 3.14 (with a decimal expansion that goes on forever), and so on.†

From what I gathered, the individuals that saw learning as transcending competency think of competency the way I think of learning - i.e. I can demonstrate that I know how to code an HTML page (got a badge for that, I haz competency), whereas learning is something akin to lifelong learning. It is a skill you acquire to continue learning and it is something that can continue ad infinitum if the learner wants.  Both positions, in my mind, are equally valid because we are defining things differently.

Some types of learning are easy to assess, at least in the short term.  Assessments that ask the learner to regurgitate discrete pieces of information are easy to implement (short answers, multiple choice tests, and so on).  Assessments that require the learner to demonstrate systems knowledge can be done at the individual class level, if you can find microcosms of the system skills that you want to assess and extrapolate from them some broader competency.  It's a bit easier to do this near the end of one's studies, through a thesis or some sort of comprehensive exam, because the learner will have a broader set of learnings to draw from in order to explain what is going on in that system.

This issue of learning and assessment is big.  It's big in many fields.  It's big money for  companies like Pearson.  It's a big question for accrediting agencies.  It's big in the field of MOOCs.  I've most recently seen it in MOOCs where some claim that watching videos is learning, and some claim it is not.  Videos are just a tool. They can be used for learning, but they can also go merrily on in the background and they can become background noise.  I've had the privilege of being able to see live videos...aka lectures...when I was an undergrad (and a grad student sometimes too!). They were just as interactive as the videos I watch on OCW or various xMOOCs.

Even in courses that were interactive and where active learning took place, do I still remember everything 10 years down the road?  As part of my BA, focusing on computer science and minoring in Italian (and almost minoring in German; I just needed one more class), I took courses in Italian literature, German culture, world history since 1500, and the history of the Weimar Republic and WWII Germany. I learned ANSI C and Java. I learned SQL, and about automata.  Do I remember everything?  Hell no.  Does this mean that my undergraduate education is null and void? I don't think so.  It was just a little building block to get me to where I am now, despite the fact that I don't remember discrete pieces of information.  Even with my most recent MA in Applied Linguistics there are things that I just don't remember any more.  There are some things that are really vivid because I know them, and some that are vivid because they still trouble me today (Processability Theory being one of them).

I agree with Maha, who joined the twitter train on that topic, that some types of learning cannot be assessed as easily as others. Maybe they'll take my instructional designer practitioner's membership card away for agreeing (LOL), but I don't think everything can be assessed by an ABCD method (Audience, Behavior, Condition, Degree‡). This might be doable for some skills, such as firearm training, but in many topics in education there is just too much fuzziness for ABCD to work without reducing assessment to a caricature.

Maha continued with another comment, which is also quite true, that assessment is no guarantee of lifelong learning.  I am sure I did well in all those classes I mentioned (I got the degrees to prove that I didn't fail anything), but the lifelong journey I am on has little to do with those classes specifically and more to do with my own curiosity.  I'd expand on Maha's comment and say that assessment is no guarantee of practice in that field either.  I completed my computer science degree, but I opted not to get a job in that field. Something else came up that seemed more interesting, and I haven't coded anything in Java or C since.  The closest I come to coding is HTML and Javascript on my own website.

So, the question is this: beyond credentialing and certification, does assessment matter?  And if it does matter, in what ways does it matter?  Take #rhizo15 for instance. This was a course♠, but how does one assess what I "learned" in it?  Does it matter to anyone but me?




SIDENOTES
† hey, I am channeling Latour with all of these examples!
‡ an example of ABCD is "Learners in INSDSG 601 with a blank chart of the Dick & Carey model will be able to demonstrate knowledge of the names of the phases of the Dick & Carey Model with 80% accuracy"
♠ Rhizo15 was a course, wasn't it? I guess that's a whole other discussion about what makes a course...

DALMOOC, episode 2: Of tools and definitions

My Twitter Analytics, 10/2014
Another day, another #dalmooc post :)  Don't worry, I won't spam my blog with DALMOOC posts (even if you want me to); I don't have that much time.  I think over the next few days I'll be posting more than usual in order to catch up a bit.  This post reflects a bit of the week 1 (last week's) course content and prodding questions. I am still exploring ProSolo, so no news there (except that I was surprised that my twitter feed comes into ProSolo; I hope others don't mind seeing non-DALMOOC posts on my ProSolo profile).

Week 1 seemed to be all about on-boarding: of tools and definitions.  So what is learning analytics?  According to the SOLAR definition, "Learning Analytics is the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs." It's a nice, succinct definition - one which I had honestly forgotten about since I was in LAK11.

Analytics has interesting potential for assisting in learning and teaching. Data collected from social interactions in the various learning spaces (the LMS comes to mind as the main one, but that's not necessarily the only one, external-to-the-LMS and internal-to-the-LMS spaces can also count as learning spaces in their own right), learning content (learner-content interaction for instance), and the data on the effects of various interventions and course changes can potentially yield useful insights.

These insights might be about the learning that might be happening, participants' patterns of interaction, their feelings and attitudes toward people and non-animate resources, how people learn, and what they might do next based on what they've already done (predictive analytics?).  My main issue with learning analytics is that those with whom I've interacted about analytics seem to feel like this is a magic bullet, that analytics will be some sort of panacea that will help us teach better and help our learners learn.  It's a similar thing to what we've seen with the MOOC hype, mind you.

The truth is that certain things cannot be quantified yet, and things that can be quantified can't always tell us what's going on.  As an example, I had a conversation with a colleague recently who came to me because of my background in applied linguistics and educational technology. The query was about text response length (presumably in discussion forums?) and student achievement; were there any studies around this topic?  The answer (at least according to my knowledge of the field) was no, there aren't studies like that (that I know of).  That said, even if someone wanted to do a study around this, I think the study would be flawed if it only looked at textual comments in a discussion forum from a quantitative perspective.  Length doesn't really tell you much about the quality and relevance of the posted text; other dimensions, qualitative ones, need to be examined in order to come to better conclusions (good ol' Grice comes to mind as another possible analysis dimension). Don't get me wrong, I think there probably is some positive correlation between achievement and response length within a goldilocks zone, but response length isn't the end-all-be-all determinant of student achievement. If the only rubric for me getting an "A" is an essay of 4000 words, I'll just give you Lorem Ipsum text :-)
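To make that point a bit more concrete, here's a minimal sketch in Python of the kind of purely quantitative check my colleague had in mind - correlating word count with a grade. Everything in it (the responses, the grades, the helper function) is invented for illustration; even a tidy correlation here would tell you nothing about the quality or relevance of what was actually written.

# Hypothetical illustration only: correlating forum response length with grades.
# The responses and grades below are made up; a real study would also need
# qualitative coding of the posts themselves (relevance, Gricean quality, etc.).
from statistics import mean

responses = [
    ("I agree with the reading because the author shows how assessment drifts...", 78),
    ("The argument about competency rests on an assumption about transfer that...", 85),
    ("Lorem ipsum dolor sit amet " * 40, 60),  # long, but content-free
]

def pearson(xs, ys):
    """Plain Pearson correlation, no external libraries needed."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

lengths = [len(text.split()) for text, _ in responses]
grades = [grade for _, grade in responses]
print(pearson(lengths, grades))  # a single number about length, nothing about quality

With these made-up numbers the correlation actually comes out negative, which is sort of the point: the longest "response" is the Lorem Ipsum one.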

Another thing pointed out in week 1 was that there are Ethical implications and privacy issues around the use of analytics.  I think that this is a much larger topic.  If it comes up in a future week I'll write about it (or if you really want me to write about my thoughts on this earlier, just leave a note).

So, those were the definitions. Now for some tools! There were a number of tools discussed, such as NodeXL (free, social network analysis tool), Pentaho (30-day trial, integrated suite), the IBM analytics suite (integrated suite, definitely not free), SAS (integrated suite - also not free), the R language (free), and Weka (free, Java-based).  R is something that we use in corpus linguistics analysis.  I haven't delved too much into that field, but I am considering it since there are analytics-related corpus projects that might be of interest.  One of my colleagues might be teaching this course in the spring semester, so I'll see if I can sit in (if I have time; not sure how much time EDDE 802 will take). SNAPP (free) was another tool mentioned, and this is something I remember from LAK11.  I've tried to get this installed on our Blackboard server over the last few years, but I've been unsuccessful at convincing the powers that be.  I'd love to run SNAPP in my courses to see how connections are formed and maintained amongst the learners in my classes.  This is one of the issues when you don't run your own servers: you're waiting for someone else to approve the installation of a Bb extension.  Oh well... Maybe in 2015.
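For what it's worth, the kind of picture I'd want SNAPP or Gephi to give me can be roughed out in a few lines of Python with the networkx library. This is just a sketch with invented learner names and invented reply data (it isn't SNAPP's output, and Blackboard doesn't hand you a tidy reply list like this):

# A rough sketch of forum-style social network analysis; assumes networkx is
# installed (pip install networkx). All names and reply data are invented.
import networkx as nx

# Each pair is (poster, person they replied to) in a course discussion forum.
replies = [
    ("alice", "bob"), ("bob", "alice"), ("carol", "alice"),
    ("dave", "alice"), ("carol", "bob"), ("erin", "carol"),
]

G = nx.DiGraph()
G.add_edges_from(replies)
G.add_node("frank")  # enrolled in the course, but never interacted with anyone

# Who sits at the center of the discussion, and who is on the periphery?
for learner, score in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
    print(f"{learner}: {score:.2f}")

# Learners with no connections at all - the ones I'd want to check in on.
print("isolated:", [name for name in G if G.degree(name) == 0])

Degree centrality is a crude proxy for "connections formed and maintained," but even that crude picture is more than I can currently get out of our LMS.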

Anyway, those are all the tools that we won't be using directly in DALMOOC.  These are the tools that we will be using: Tableau (paid, but free for us until January 2015), Gephi (free), RapidMiner (has a free version), and LightSide (free).  Gephi I have already downloaded and installed because I was auditing the Coursera Social Network Analysis course that they are currently running.  I'll be going back to those videos in January (or next summer, it all depends on EDDE 802) and messing around more with it then. I know we'll be using it here, but I am not sure to what extent.  Tableau I already downloaded and installed last week on my work machine.  I'll be messing around with the Week 2 data when I get back in the office on Monday.  This looks pretty interesting!

Finally (for this post anyway), DALMOOC has a bazaar assignment each week. Here is the description:
In this collaborative activity, we will reflect on what you have learned about the field of learning analytics. We would like you to do this portion of the assignment online with a partner student we will assign to you. You will use the Bazaar Collaborative Chat tool. To access the chat tool, click on the link below. You will log in using your EdX ID. When you log in, you will enter a lobby program that will assign you to a partner. If it turns out that a partner student is not available, after 5 minutes it will suggest that you try again later.  

For experimentation purposes I know I should give this a try, but I probably won't do these bazaar assignments. I have an affinity for asynchronous learning (as Maha Bali put it in one of her posts) :)



SIDENOTES:
1) Great to put a face to a name.  I realized that Carolyn Rose and Matt Crosslin are part of this MOOC. Carolyn is writing a piece for the upcoming special issue of CIEE (this used to be the Great Big MOOC Book), and Matt is co-authoring a piece for the CIEE special issue for summer 2015 on the instructional design of MOOCs.

2) Carolyn mentioned in one of the videos that statistics are pretty cool.  I've been lukewarm on them since I was a college undergraduate, mostly because I mess up the math and my numbers don't make sense ;-)

Educational Based Research - Part 1


Well, in a week I will be in Edmonton starting off my EdD in distance education at Athabasca University.  I know that most North American doctoral students probably don't think of their dissertation topic this early (I haven't even completed my first course), but I want to be proactive and work on the thing while taking courses.  So, Rebecca's post on Educational Design Research (EDR) was quite timely.  This isn't my first go-around with a dissertation topic; my current topic has evolved over the past couple of years as I was thinking about what I want to do (and which university is best to pursue this).

My initial idea was to blend my background in instructional design and MOOCs to teach a language, specifically designing a MOOC to teach Greek as a foreign language to novices. This actually came out of making a MOOC out of my MEd capstone.  This was circa 2011-2012, after my experiences with MobiMOOC, LAK11, and CCK11, and before the xMOOCs invaded the scene.  After some thought I decided that Greek would be way too much work for a dissertation. There isn't a lot of material out there available as free OER, or under Creative Commons, for me to use in designing this course.  If I were to not only design and implement the MOOC, but also create a lot of the kickstart materials for it, I might be stuck in dissertation purgatory. So that was scrapped.

Then, I read about the Polytechnic University of Milan switching the language of instruction to English, and I thought that  this would be a good opportunity to refocus the project on native speakers of Italian, in Higher Education, improving their English through a MOOC format.  This had the benefits of having an audience that would need the course (so I wouldn't worry about students joining, in theory), it would be focused in terms of  the target learner (thus cutting down on the variables for any post-dissertation analyses I wanted to conduct and write about), and it gave me an opportunity to brush up on my Italian.  This was when I was thinking about applying to the National & Kapodistrian University of Athens. As we know, I ended up not applying there because there were strikes which prevented me from getting my materials in (and also getting my degrees from the US translated, which required me to go to Greece).  This idea is on the back burner, but I would love to explore it in the future.  I downloaded a lot of CALL and SLA articles during this time frame as an initial literature review which I would love to read and put to use.

Finally, after getting accepted at Athabasca, I thought about who is currently there and what their expertise is, and then further refined what might be a dissertation topic.  It should be stated that my goal with my dissertation is not to do something earth-shattering and wow people with any potential brilliance I might have. The point of the dissertation, for me anyway, is to receive an apprenticeship into research, to hone my skills, and to then be certified (by getting my dissertation approved) as knowing what I am doing without the training wheels on.  In pursuit of this goal I decided to take the course that I currently teach, INSDSG 684: The Design and Instruction of Online Learning, and make that into a MOOC. This isn't going to be new or novel.  If you look at OLDSMOOC, Learning to Teach Online (UNSW/Coursera), and Teaching Online: Reflections and Practice (Kirkwood CC/Canvas.net), just to name a few (I am sure there are others), you'll see that others are tackling the topic now. But, as I said, I am not interested in novelty, and any research that comes out of those courses will most likely be on the data analysis side (at least for the xMOOCs).

So, to address the questions that Rebecca posted on her blog post about this latest incarnation of my topic:

What is your research question? Is it a ‘design question’?
The thing I wish to tackle with this dissertation is the "conversion" process (even though conversion is not really the right term for this) of a campus course to a MOOC.  This MOOC would need to address the needs of a traditional online student population paying for the opportunity to learn and be evaluated for 3 graduate credits, as well as a population of professionals out there who need the skills but are pursuing them more as professional development and don't need (or want to pay for) graduate credits. While I would love to analyze data collected from this experiment in other ways, the dissertation will focus strictly on the ADDIE aspects of the course.

Do enough academics at your institution appreciate ‘design’ as research?
It's hard to say at this point. I have read faculty profiles a couple of times already, but it's hard to really know people until you've talked to them, one way or another.  My instinct tells me that there are enough people at Athabasca University who are interested in design, and considering that this is a "professional"† doctorate, I would think that design research would be interesting to someone.

How will you defend your study to researchers who don’t see ‘design’ as research?
I guess I will cross that bridge when I get to it, but my main take-away point is that all research is designed.  There is a certain know-how and skill required in order to even set up a research design, so design research is really (in a sense) further upstream.  Furthermore, there is a real need to go back to the established literature on learning (and online learning specifically), design MOOC interventions based on this literature, evaluate, and iterate. Otherwise, further downstream, your wonderful data analytics are just digital clutter with nothing prior informing them.

How will you differentiate research from practice?
This is also another false dichotomy, in my opinion. You can't separate research from practice in the field of education. I see this with students in courses I've taught (or MEd students I've chatted with outside of the course).  Many of them seem to come into the program wanting concrete answers, absolutes, processes, and procedures to be awesome designers, but they don't like research articles that are really focused, or that provide caveats and exceptions, or that state that "more research is required." They don't seem to get that in order to be good practitioners they need to engage with the research in some fashion, and do it continuously. Even the research folks in education can't operate in a vacuum. They need to see what's happening in the field so that they can ponder, problematize, hypothesize, and test. It's all cyclical; trying to break this into two distinct and separate items is a big issue.


† for what it's worth I dislike the term "professional" doctorate. It sounds like an insult to both those who have worked hard to attain it, and to those who have PhDs, because it makes PhDs sound like they are not professionals. I wonder who came up with this.

Teaching at a distance...or not?

You are using it wrong...
A little while back I was reading Rebecca's post titled When teaching online doesn’t mean ‘at a distance’. Quite a few things came to mind, but they were too many for one blog post, so I thought I would do two separate ones.  One from my experiences as a program coordinator (and unofficial instructional designer) for the Applied Linguistics department where I work, and one for my role as an (adjunct) faculty member in the Instructional Design program.  This first post tackles the day job role (program coordinator).

In the post, Rebecca writes that she taught a course, online, for a predominantly residential program. The students were thus using online tools to schedule face-to-face meetings, instead of using the tools to facilitate online meetings to get group work done.  This seems like a lost opportunity to those of us who have been using technology to communicate and work online for a while now, but to a novice the path of least resistance (and the optimal path) is to use what you already know instead of learning something else.

At work we have quite a peculiar setup.  We are one graduate program with two facets: the online facet and the on-campus facet.  The two don't mix except in certain elective courses, where both online and campus students can be cross-enrolled into the same online course.  Our online students work cooperatively (or collaboratively) with the tools they have.  They don't have an option.  The campus students, it seems to me, used to cluster around each other in the online courses and work together so as to avoid having to use the technology to work with others online.

This was a bit of a problem in many ways.  A few years ago we embarked on an experiment.  We had a faculty member teaching the same course online and on-campus every fall semester.  Why maintain two Blackboard courses when one would do? The content (readings, viewings, etc.) was the same for both, and so were the major course deliverables. We decided to put the on-campus course into the online course shell.  This way, if campus students wanted to follow along with the weekly online discussions they could.  To get people's toes wet (since most campus students have no online experience), one class a month was substituted with a fully online module (thus, in effect, making the course a hybrid course).  For major course projects, which were group based, on-campus and online students were deliberately put into mixed groups, so that campus students had to interact and work with people who were miles away and for whom meeting in person was not an option.  This had a two-fold goal: to increase the ICT literacy of campus students (not just tool usage, but also how to work with one another), and to enable cross-pollination of ideas.  The online students are in diverse contexts all over the US, East Asia, South East Asia, Central and South America, Europe, and the Middle East, whereas campus students mostly come from a Massachusetts context. This cross-pollination helps the distance students see a very specific Massachusetts-local context, and helps campus students see other local contexts.

The first year we did this the reviews were not that great.  This was to be expected.  Most people "revolted" because they just didn't want to do online courses; they didn't want to do the technology thing.  Even though they were meeting on-campus every week, the fact that they were required to participate online every so often and work with online students made the course, in their mind, an online course.  There was a bit of a reason for the students' dissatisfaction, but not for the reasons that they claimed.  The reason was simple: lack of scaffolding.  Up until that point on-campus courses did not have course complements on the LMS. Students were expected to come in once a week for their face-to-face time and go off and do their thing until the next week.  The LMS made them more connected, and thus more "on the hook" in-between face-to-face meetings.  This course was also taken more than half-way through their studies, so they were set in their ways as learners.  I am not saying that learners shouldn't adapt, but I see where they were coming from.  Fast forward to today: the majority of our campus courses use an LMS complement from the start of the program, so students are more aware that they will have to be active in the course even when they are not physically present in the classroom.


The other thing that Rebecca notes is that in her graduate program they "had a six-week course dedicated specifically to what it meant to learn in an online environment. This involved activities that allowed us to learn how to do online group work effectively (which looked a lot different before Internet-based audio was an option)."

I was asking around in my department to see if we ever had such a thing in our program, since our online students are only online, and since I started my day job I've had no knowledge of something similar.  It turns out that we don't, but I think it's an important thing.  A six-week course on how to be a successful online learner, coupled with interactions with the faculty teaching online, would be a good introduction to our program and help build a sense of community.  I am wondering what other online programs do for their on-boarding.  How comprehensive are their on-boarding programs, and what happens when students don't complete the on-boarding?  I guess I am thinking a bit with an administrator's hat here and not a pedagogue's.  Any other program admins out there?

Attention splitting in MOOCs

The other day I caught a post by Lenandlar on the #Rhizo14 MOOC, which is over, but which we amazingly are keeping going.  At the end of his post on motivation there were a few questions that I wanted to address, since they've been on my mind and they've come up a few times in the past week.

Are MOOC participants in favor of shorter or longer videos or it doesn’t matter?  

I can't speak for all MOOC participants, I can only speak for myself, and from my own experiences. I can say that video length does matter, but it's not just about the video length.  On average, I would say that you don't need a video that is longer than 20 minutes. My feeling is that if I want to watch a documentary, I will watch a documentary, not participate in a MOOC. Anything longer than 20 minutes is probably unfocused and not suitable to the medium and the goals of the course.

Of course, simply having 20 minutes to work with doesn't mean that you should take up all that time.  This goes back to figuring out what your message is, what you need to talk about, how you are going to present it, and what the ultimate goals are of the video.  I think that Grice's maxims are a perfect fit here :). If your video is 20 minutes long but is just right for what you intend to do with it, great.  If your video is 5 minutes but it failed miserably, then you wasted my time, or worse you diminished my interest in a topic I was previously interested in. At the end of the day, it's not about the length. So long as the learner knows the duration of the video, and any dependencies (i.e. do I need to watch something else before I watch this), if the video is well made, on point, and on-time, you are OK. The learner can carve out the time that they need to watch certain videos if they know the duration ahead of time.

What is the extent of discussions taking place on Forums set up for MOOCs?  
Again, this is only my experience.  I think that some forums work well, and some do not.  Forums that work as list-servs work well for me because I can keep an eye on things that are happening in the forums while I commute, on my smartphone, and respond accordingly.  If I have to wait to get home, after all else is done, then I am lost in a sea of posts.  This is useless, so I avoid those forums.  A good example of forums working well was MobiMOOC 2011.

There is of course another element here, and that is learner choice.  If forum discussions are created with prompts, like traditional online learning, then the forums get barraged by the 2, 3, 4, 10, 15 possible answers and you end up having a lot of repetition.  There are countless examples of this, but one that comes to mind is the Games in Education MOOC that I did last fall.  Interesting stuff, and I did try to participate in the forums, but as soon as one person makes a post about a particular game (let's say Metal Gear Solid), why are there six other threads with the same game?  Those should all be in one thread.

Meaningful discussion could conceivably take place in a MOOC discussion forum, but I don't think that the variables have yet been determined as to how to best set up a forum from a technological and a pedagogical end. The other thing that comes to mind is the notion (from the PLENK2010 research, if I remember correctly - Kop et al.?) that there were quite a few people who seemed to be "refugees from the forum" and who started blogging. Having an alternative vehicle is great, but the thing I started pondering in Rhizo14 was how many media a learner can reasonably keep track of at any given time.  For me, in Rhizo14, the blog and the Facebook group were primary.  P2PU was secondary (check in every few days), and twitter tertiary, in other words whenever I could remember.  Two primaries and two secondaries are what I could handle (and not that well, I might add).  So in MOOCs where forums don't work well, or where forums are an option among other venues (Fb groups, G+, twitter, blogs, wikis, and so on), what toll does that take on the learner?

Does course duration matter to MOOC participants? If so, what is an optimal length? What is too short? What is too long?
I will refer you back to my video answer for this one ;-). In all honesty, it depends on the subject matter at hand, and who you expect the learners/participants to be.  In some cases, like CCK, the course was structured for 13 weeks (if I remember correctly).  Perhaps this was a university requirement, since it did run for credit at the University of Manitoba, but it may just as well have been a design consideration outside of university norms.  That said, I would say, from the research I've read thus far, such as Weller's analysis of Katy Jordan's data (I think I've seen a recent article by her on IRRODL that I have not read yet), that the sweet spot seems to be six weeks for MOOCs.  Now, I think this data is based on Coursera xMOOCs, so the design decisions for those MOOCs are probably affecting the appropriate length.

Going back to my earlier comment, I would say that if you think of your message, your delivery, and your goals, you will have an idea of how long the MOOC needs to be.  I will go ahead and state that a MOOC that is less than 3 weeks long is not really a MOOC. I don't know what it is, but a MOOC it ain't (assuming C = course).  It took me a couple of weeks to get acclimated to the people in Rhizo14, even if I knew some of them from before.  Depending on your participation in the MOOC, it may take you a week to get comfortable and into the head-space you need to be in to participate, or to lurk/consume.  Thus, three weeks is, for me, the very minimum needed.  The max... well, current research seems to indicate six, or maybe eight, weeks, but this depends on a variety of factors.




1 week, 3 completed MOOCs, 1 MOOC Experience Reflection

Online Games & Narrative Course Logo
Last summer, when I signed up for these things, I really didn't keep proper track of the timing of the courses I signed up for, and I ended up in three concurrent MOOCs while working a full-time job and messing around with other interesting (MOOC-related) things.  In any case, after several PACKED weeks, three MOOCs are done, and I have some thoughts about MOOC design and MOOC process to go along with them, from my own personal experiences.

MOOC 1
The first MOOC was on Coursera, and it was Online Games: Literature, New Media, and Narrative with Jay Clayton of Vanderbilt University. The thing that attracted me to this course was the aspect of online gaming and how it tied into other media. The theme was Lord of the Rings, which I am sort of lukewarm about. It's fine, but it's not the type of literature, or game for that matter, that I would spend a ton of time on.  The nice thing about the course was that the people talking about the materials were real geeks about it.  The material wasn't dry, and the enthusiasm about the subject really came alive on the screen (at least for me).  The course did fall a bit short for me in the assessment area.  I did partake in the quizzes, which were good enough for formative assessment - in other words, they helped me make sure I was on the right track - but beyond that, I really didn't have an incentive to participate in the forums. The combination of being a bit "m'eh" about the subject and the fact that forums in MOOCs just don't work that well made me avoid the forums for this course.  The other thing that was a bit of an eyebrow raiser was the "distinction track" of the course.  Now that I have seen all three peer-reviewed assignments, I really don't see the distinctiveness of the distinction track.  Yes, it requires more work, but at the end of the day it's peer reviewed, and that peer-reviewer grade doesn't necessarily do justice to any work I would have contributed. The two redeeming things about the Distinction Track assignments were that (1) they didn't limit the number of words you could use for text submissions, and (2) you could actually use a variety of media (papers, game-making, videos) to submit your work.  As an assignment it was interesting; as an evaluation of learning it was not.

MOOC 2
The second MOOC was on Coursesites, and this was the Mozilla Open Badges MOOC that you've seen me write about on this blog. This MOOC had weekly live-streamed sessions (as well as recordings of them), and Open Labs for badges.  In addition, there were discussion forums and weekly challenge assignments that could award you badges. I have to say that the awarding of badges was motivational for me, because the assignments required enough time and thought that I don't know if I would have bothered to put pen to paper to hash out some ideas if there wasn't some external award for them.  Now, this is part of my PhD brainstorm, so I would have written something in my PhD ideas notebook, but I wouldn't necessarily have gone into this amount of detail.

So, as far as process for this MOOC goes: I liked the live sessions. They kept a degree of regularity in the course that allowed me to attend the live videos on Mondays, think about the content for a few days, jot down some notes, and on Saturdays write up a little something for the weekly challenge and submit it.  The challenges were interesting, but some were a bit out of my domain, so I used some assumptions to complete them.  Probably a third of what I submitted was returned to me for improvement, and I resubmitted it.  I actually got feedback on what I submitted, which was awesome.  This was something that continued my motivation to participate in the MOOC.  As far as discussion forums go... well, I did make an attempt to participate in the forums, but I didn't participate as much as I had intended to. The nice thing is that there were separate areas (using the Groups tool) to discuss badges in different contexts, such as badges for MOOCs or badges for higher ed courses. Unfortunately there didn't seem to be many participants in the MOOC (or at least in some groups), so while the "intimate" feeling was nice, it also meant that it didn't really fit with the type of participation I wanted, which was 70% read, 30% write.  All things considered, this wasn't bad, and the discussions made much more sense.  Still not optimal, but good enough.

MOOC 3
Finally, the last MOOC, which in theory is concluding this week but which I am already done with, is the MOOC on Pragmatics at the Virtual Linguistics Campus.  My motivation here was to fill in some knowledge from the time I took an introduction to linguistics course and we only did a few weeks on pragmatics.  The course, like before, was set up like a self-paced eLearning course, with automated testing, self-paced multimedia, and lectures.  Everything was available at the beginning of the course, so there was no need to wait for someone to release a new module for you.

This MOOC was a bit of a hit or miss. I definitely enjoyed the Phonetics and Transcription MOOC that they had last spring more than I did this one.  The recorded lectures were fine, and the self-paced eLearning materials were fine as well.  There seemed to be less attention given to some of the assessments (multiple-choice quizzes) this time, in that some assessments in some chapters were just one question! If you get it right, you pass the module with 100%.  The other issue was that learner evaluations in Module X referenced things that learners would learn in Module X+3, so some people were confused by this.  Luckily I was not, since I had already covered some of this through my Master's in Applied Linguistics. This seems like an oversight, but attention to those finer details is something that, for me, can make or break a MOOC.

Discussions were used mostly as a way to troubleshoot, for me anyway, even though some participants used them as a way to disambiguate, especially in those evaluations from Module X where things from Module X+3 were mentioned and taken as previous knowledge. Just like in the spring Phonetics & Transcription course, I didn't spend a lot of time in the forum. The other thing that was different this time around was the lack of reading materials.  Last spring, each week, there was a scan from a book chapter (a different book each time). This was pretty nice because each week, in addition to the self-paced eLearning materials, I read something from a book. This year they cut that out, presumably for copyright reasons. In any case, the course this time around seemed more bare.  It lacked a level of detail that I had come to expect.

Finally, it remains to be seen what the Statement of Participation looks like, but I hope that they didn't overlook design issues because they were looking to make some money from certified, graded Certificates of Participation.  Last spring, the certificate of participation listed all modules taken and the final percentage grade.  This year, it seems that you will need to pay to get that level of reporting (and have your grade registered at the University of Marburg), and people who don't pay just get a Coursera-style certificate of participation.  Let's wait and see.

I am now enrolled in some other MOOCs. Let's see how those pan out.  In the meantime, back to reading about MOOCs in the press, and writing more about them.

Language MOOCing

This past week, crazy events in Boston aside, two new MOOCs began: LTMOOC, on Blended Language Teaching, and the Phonetics and Phonology MOOC from the Virtual Linguistics Campus at the University of Marburg.  The Edx course on the Ancient Greek Hero took a hiatus week to allow people to catch up.  I am still sticking to the Ancient Greek Hero course, and I did try to catch up with the scrolls, a secondary reading that's meant to be "fast reading," but apparently I am not fast enough (I seem to be taking my time).  In any case, my strategy for the Edx course is to read the main reading, and participate in the course, and worry about the scrolls later.

As far as the language MOOCs go, I decided to stick only with the Phonetics MOOC.  Blended learning is something that I already know about since I am an instructional designer, and given my applied linguistics background I can put 1 + 1 together; so, with limited time and resources, I opted to just keep an eye on LTMOOC.  It would be interesting to talk to the Instremia guys at some point, but I don't need a MOOC for it.

The phonetics MOOC, thus far, is pretty interesting.  I've been interested in the topic for a while, and now I have the opportunity to learn a bit more, in a novel way.  The interesting thing about this MOOC is that it really brings me back to around 2001, when I was taking online workshops through ICIA and getting my CTS certification. The user interface, the teaching style, and the exercises really are a throwback to those old days of self-paced learning.

Don't get me wrong, I really like the subject (which means I am highly motivated), and as an instructional designer I now get to see this setup with a new pair of eyes, but the learning experience is a solitary one.  There are forums, but they are not really well integrated with the course.  All of the learning modules are available from the start (which is a nice plus!), so learners can proceed at their own pace.  For example, the recommended pacing means that we should be on Module 1, but I am already on Module 3. Who knows, maybe I will complete this MOOC before I leave for vacation :)

More on the MOOC learning experience as the modules progress.

Yay! Linguistics MOOCs!

Well, now we're talking! ;-)

I came across two MOOCs that are related to (one of) my subject(s) of study :-)  The first MOOC comes to us from Germany, although it looks like it will be conducted in English, and it's the Phonetics, Phonology and Transcription MOOC from the Virtual Linguistics Campus. I am actually quite psyched about this MOOC for several reasons:
  • Phonetics and Phonology is something I've been wanting to undertake for a while, but haven't had the time;
  • It uses a platform that I have not seen before, so I am curious on the technical end;
  • It comes from a non-English-speaking country (I am interested in academic production in other languages, and how it is represented in MOOCs).

Here is the intro video for the course:




The second MOOC is LTMOOC (language teaching MOOC), which tackles the topic of blended language teaching. This probably won't be new to me (unlike the phonetics MOOC), but I signed up nevertheless because I am curious about the platform that they will be using, and I want to see what they say about blended language learning.  This (blended language learning) is something I worked on for my Master's Thesis/Capstone a few years back with a project I called Greek for Travelers. :-)

So, who is enough of a language geek to join me in these MOOCs? :-)

Connecting and weaving knowledge

This week's Change topic was a nice break for me; it allowed me to take a moment and catch up with other people's blogs, and the weekly session (which I made an attempt to visit while it was live, but somehow missed) was loose enough to allow for this break.

In any case, the topic of this week was "Triangulating, weaving and connecting our learning." I've written before about disconnected knowledge (although I forget if it was a blog post here, somewhere else, or a comment on someone else's blog...), and disconnected knowledge is an issue in teaching and learning. One can't learn facts and figures in isolation; they are meaningless and we end up forgetting them anyway. If we can put them to use, that is one way of making these facts meaningful to us, and thus of providing a mechanism for remembering them.

This week's topic reminded me of a mentor I had in one of my master's programs. When I was a first-semester graduate student in the instructional design program, the program director helped me start mindfully (or consciously?) connecting all of my previous education and knowledge to what I was doing. I had a gut feeling about how things connected with each other, but when people saw my resume their first comment was that I lacked focus.  After all, I was someone with a BA in computer science who loved languages (in their minds two separate things), who went on to get an MBA with foci in IT and Human Resources, who then went on to an MS in IT and an MEd in Instructional Design, and then wrapped it all up with Applied Linguistics.

To most people it seems like I am jumping from discipline to discipline, but I always saw some sort of connection between all of them; it was just hard to articulate it to others.  It seems like both in academia and in the "real world" there is an emphasis on specialization (or hyper-specialization once you get to the PhD level), and this specialization entails some sort of certification that, in the minds of people, clearly delineates one thing from another; however, what some people don't seem to realize is that it is all connected. Some links are stronger and some links are weaker, but when it all comes down to it, there is no such thing as "no connection"... just a lack of the ability to see connections :-)

In order to try to explain the method to the madness, I ended up creating a mindmap of how all of this previous knowledge, education, and interest fit together, although I think that it is rather simplistic.  I can "see" more connections if I drill down deeper, but without some sort of 3D diagram it will just look like one big messy spaghetti bowl ;-)
