Networked Learning you say?
Mon, May 15 2017 03:00 | dissertation, education, network, Networked learning, PhD, procrastination, research
Hybrid Presence which Suzan presented for us (since I could not attend in person) and we worked on an expanded version of the paper which should be coming out in a book soon.
Networked Learning was a new concept to me, so I thought I would spend some time reading more on the topic. I got my hands on as many books on the subject as I could last year and started reading. Now that journey is coming to an end, as I've started reading the last book I could find on the subject. I briefly considered downloading and going through all of the conference proceedings from the past 10 years, but each article is a separate download, and perhaps a better use of my time is to go back to my own dissertation topic and read up on that rather than academically procrastinate by learning more about Networked Learning ;-)
So, what is networked learning? Networked Learning is defined by Goodyear, Banks, Hodgson, and McConnell (2004)* as:
learning in which information and communications technology (ICT) is used to promote connections: between one learner and other learners; between learners and tutors; between a learning community and its learning resources

One (or more, actually) of the chapters that I read says that this definition has been remarkably resilient to the passage of time. The thing I've noticed about this definition is that it's remarkably broad, which might explain its resiliency. From my observations (from readings), the ideas and concepts that have fallen under (or play nicely with) the main concept of Networked Learning include specific types of problem-based learning, mobile learning, online learning, web-enhanced face-to-face learning, learning in augmented reality, informal learning, authentic learning, and many more. I've also noticed that most people writing about the topic tend to be from Europe; the concept has not been widely adopted elsewhere in the world. It strikes me that here (where everyone seems to strive to coin a name for something) such a broad definition wouldn't necessarily have sticking power. I do like it, though, because it's a good foundation to build further work on.
In my short(ish) detour into Networked Learning I've come across some ideas for my own dissertation as well... which I noted somewhere... I do admit that I need to get a little better at note taking for longer works if I am to make my way through this dissertation process. My note taking has been tuned for shorter articles (the standard 6,000 to 8,000 word research article), and for a 200-page dissertation (where some topics need to be ELI5) my current note-taking practices may not be cutting it.
What do you know of networked learning? Have you used the concept? Have you written about it? Are there any articles from the conference proceedings from the past 10 years that are a must read?
* in the book Advances in research on networked learning (Boston, MA: Kluwer)
Are conferences places where we repeat ourselves?
Leafing through these booklets I noticed something that hasn't been as evident to me in the past: it's the same people in the presentation spotlights this year as have been in the past two, five, or more years! Now, the truth is that I had noticed this in previous years, but this year some conferences have moved to a new location (which isn't local), and it was a bit odd to see certain locals highlighted as presenters when the new venue is a 16-hour drive (or 3-hour flight) away. Thinking back on other conferences too - both ones that appeal to academia and ones that appeal to the private industry of learning design - I've noticed that year after year the list of "A-list" presenters and session leaders tends to be the same.
This made me wonder about my own recent distaste for (or perhaps burnout is a better word) EdTech (and related) conferences. When I started attending these types of conferences (with any regularity, and always only locally) about 8 or so years ago, they were amazing... well, at least amazing to me. New ideas, new products (yes, I love gizmos), the ability to talk to people who were implementing, and getting data from, things I had considered doing myself. Generally I really liked the freshness and the new-ideas vibe. Then I noticed that while presentations were incrementally new, the people never really changed a whole lot. Don't get me wrong, there are people that I'd like to see again to find out what they are working on now, but the "point release"-ness of presentations and topics has made me not care as much about what people present at conferences any more. I tend to get more intellectual stimulation out of Virtually Connecting sessions than from attending conferences in person. Yes, Virtually Connecting frequently piggybacks off conferences, but I find it much more potent. Perhaps because I know I can sign up for one session, attend, discuss, think, and get back to other parts of life, rather than feel like the ROI of time spent vs. learning isn't working out in my favor.
As I was pondering this, Joshua Kim and Kristen Eshleman posted on EdSurge with their five reasons [they] will avoid EdTech conferences. It's interesting that they (too!) bring up things like vConnecting. Of the things that Kristen and Joshua mention, the two that immediately jump out at me, and are echoed in my sentiments about EdTech conferences, are the ROI and getting over the hype. Even if I still like talking to vendors (take note that I don't like your emails most of the time!), there have been fewer and fewer new products on the market. Even presenters are (in some respects) hawking their wares; in their case it might not be a product, but mindshare for themselves and/or their institutions. This leads me to ROI, both for the intangibles (my time and energy) and the tangibles (money to get there, and conference registrations). I don't think the product is worth the investment any longer.
That said, I think Kristen and Joshua make a point that doesn't immediately pop up from my own 'me-centric' view: where are the faculty and students? Perhaps faculty can have their attendance paid for by their institutions, but students are effectively priced out. Those are the people I'd most like to interact with after we all get to speak to vendors or listen to presentations from peers at other institutions, because then we can have meaningful discussions about what we can do at our institution, and what sort of interesting pedagogical things we can do with other institutions. Most of the people who attend these conferences are techies (like me), and while I can see applicability for the classes that I teach, I am also part of a larger department, with colleagues who don't get to see what I see.
In the end, I am wondering: what's next? If we aren't doing conferences (because we are bored, uninterested, and/or priced out), how do we work on our professional development in meaningful ways this summer?
EDDE 806: Epilogue (of reboots and alternative universes)
I guess this is my "806 is dead, long live 806!" post ;-)
One of the final requirements for EDDE 806 is to:
Create a final blog post linking to the 6 earlier posts and providing a final reflection, feedback and any recommendations on the course as a whole.
For those who are keeping score at home, other course requirements included the following:
- Present a 30-45 minute presentation on their proposal or dissertation work and progress, and respond to comments and questions.
- Post reactions and reflections on at least 6 of the presentations (over one or more semesters), using a response template created by the instructor, to their blog on the Athabasca Landing (tagged with EDDE 806).
- Attend and participate in discussion in at least 6 sessions over one or two semesters of the course.

I am not really sure what a final reflection looks like for 806, especially considering that I will most likely attend quite a few sessions next fall, when the remainder of my cohort will be presenting their in-progress proposals. So, I thought that doing a reflection on the EdD program up to now would be worthwhile, along with proposing a new way of pacing the program; in the process, something about 806 might come of it.
It seems hard to believe that about three years ago (give or take a month) I had just been accepted to the EdD program at Athabasca University (the stomping grounds of Anderson, Dron, Siemens, Cleveland-Innes, and other researchers whose work I had been reading in the years preceding my application to this university), and I had submitted my program fee payment to matriculate. Three years and nine AU courses later, I am plugging away at my dissertation proposal. If you are wondering, those 9 courses are:
- 6 compulsory courses (EDDE 801, 802, 803, 804, 805, 806)
- 2 optional courses (MDDE 701, 702)
- 1 course in which I was a teaching assistant/intern (MDDE 620), for the "Greek cohort" no less :-)
Spring 2014 (same as the original timeline)
- Accepted! Woohoo!
- Got books mailed to me (nice job, AU!)
- May 2014 - "welcome to the program" online adobe connect session with the cohort and select faculty. Placed in pre-selected groups to work on Assignment 1 (due when we meet in August in Edmonton)
Summer 2014 (same as the original timeline)
- June-July: meet with Marc, Renate, and Steph (cohort-mates) online a few times to work on assignment 1
- June-July: read through the textbook, and download PDFs from the course site
- August: meet in Edmonton, meet cohort-mates in person, polish the assignment, present the assignment

Fall 2014 (somewhat similar to what I did in the original timeline)
- EDDE 801 (Advanced Topics and Issues in Distance Education)
- Weekly live sessions for class
- Weekly guests from the field of distance education (woot!)
- EDDE 806.
- Peruse the recent recorded sessions and listen to two.
- No need for reflective posts --> Get feet wet, see what other cohorts are doing, ideate on own work. Maybe in final 801 session students share their ideas from having viewed 2 recordings.
- EDDE 802 (Advanced Research Methods in Education)
- Bi-weekly live session for class
- EDDE 806
- Attend 2 sessions during the weeks 802 didn't meet.
- Reflect on 1 presentation (live or recorded)
- MDDE 702 (qualitative refresher for those who need a refresher)
- Start brainstorming on your proposals. Jot down big research questions, diagram some potential research methods for them, and write brief abstracts about the problem to be solved. This is sort of like a TV show elevator pitch. Be ready to pitch 3 ideas.
- Go back to 802 materials and see them in light of your pitches
- EDDE 803 (Teaching and Learning in Distance Education)
- Bi-weekly class sessions
- Tackle topics on teaching and learning.
- Does anything from course seem to connect to your 3 pitches? File it!
- Intern in an MDE class
- EDDE 806
- Attend 2 live sessions (during weeks when 803 does not meet)
- Reflect on two sessions (either live or recorded that semester)
- Present your top 3 pitches and receive feedback from audience in one of the live sessions
- Continue refining your 3 pitches based on feedback (this task crosses into spring 2016)
- EDDE 804 (Leadership and Project Management in Distance Education)
- Bi-Weekly sessions
- Tackle topics on leadership.
- Does anything from course seem to connect to your 3 pitches? File it!
- EDDE 806
- Attend 2 live sessions (during weeks when 804 does not meet)
- Reflect on two sessions (either live or recorded that semester)
- Pitch 2 expanded ideas (expanded in terms of what type of literature you might look into, and updated ideas about methodology and problem)
- MDDE 701 (Quantitative research refresher for those that need it)
- Pick one of your pitches and develop a literature review (due at the beginning of 805)
- Pick one of your pitches and develop a preliminary intro (due at the beginning of 805)
- EDDE 805
- Bi-weekly class meetings
- Polish off a draft of your proposal that includes summer deliverables + methods
- EDDE 806
- Attend 2 live sessions
- Reflect on two sessions (either live or recorded that semester)
- EDDE 806
- Attend 6 live sessions
- Reflect on 4 live sessions
- Present your proposal
- Polish off proposal
- Connect with your cohort over the summer (support network)
- Defend proposal & start research

So, what are the differences between what I did and what I wish I had done (had it been an option)? I had dipped my toes into 806 when I started 801; I was curious, and since the course was open to anyone, why not? But once 802 kicked in, it was difficult to keep up, so I was on-and-off in 806 throughout the program (more off than on, until 805). Despite the on-and-off nature, I ended up reflecting on some of the sessions before I officially signed up for the course (hence the number of posts at the end). I think that 806 was originally conceived as a course to keep the band together in some formal manner while we're all off doing our own thing, but I think that the strength of 806 is really in being a connecting strand (on the program side) from start to end. Being part of 806 from the start can help future cohorts conceptualize what they want to do, see what others are doing, and understand that proper execution of the dissertation requires a lot of planning (and we can all commiserate over our setbacks and celebrate our victories). I think this is important to see early on.
In terms of summer terms, summer is definitely a good time to kick back, relax, and have a few beers in the garden while reading your favorite fiction... but in all honesty, two weeks off would have been fine, and then we should have been on the path again. Coming from a US background, I am used to the term ending at the end of May. AU's early start and early end of the spring (winter) semester meant that I had boatloads of free time. This dissertation won't write itself, and I think that the summers (mid-April through August) are a good time to work on it in a structured (cohort-driven, program-driven) way. Summers can be a time when you work on your elevator pitch for some ideas - you can present them in 806 (during your second year) to see what might stick, and you can get feedback and things to think about. During the summer of year 2 you can spend those 4 months doing a lit review and an introduction (2 chapters!), and be ready to roll in 805 (start of year 3). I think this plan positions you better to defend at the beginning of year 4.
A potentially controversial issue might be the requirements for 806. At the moment it's an attend-6-and-reflect-on-6 setup. My proposal has you attending and/or viewing 14 sessions (as opposed to 6) and reflecting on more sessions overall, but you'd have the freedom to reflect on some recorded ones - in case the live session wasn't something you could really say much about - and EDDE 806 becomes more integrated into the entire EdD process. Of course, this means that 805 + 806 (Research Seminar I & II) would need to be tweaked, but I think the end result would be better.
So, what do other EdD folks think? Does this work for you?
|Post Title|Reflection & Live attendance|
|Post I - On prepping for a dissertation||
EDDE 806 - Post XIII - It's the end of the semester, and I feel fine
The three proposed research projects were Kim's, titled "Student Satisfaction Levels among Canadian Armed Forces Members toward their Distance Learning Experiences," which deals with Canadian Armed Forces training and distance education; Rosemarri's, titled "Transforming Learning in Higher Education: Implementing UDL in Higher Education"; and Scott's, titled "College Leadership and Distance Learning."
There were some common themes among these three presentations, and the presentations done previously in the semester, be it in the underlying reasons for the research, the methodologies employed, or the potential timelines. Having seen the timelines of friends from Cohort 6 (and to some extent from Cohort 5), I can say that I've certainly revised my own timeline to a much more realistic expectation (how does 2019 sound?).
Going back to some common threads: Kim discussed a little bit about the training costs associated with the CAF (approximately $1.3B Canadian per year). I am not sure what the size of the CAF is, but I was wondering how much that is amortized per member of the CAF; not that every member of the CAF will have an equal dollar amount of training spent on them, but it was a thought. The thing that really stood out for me was the story about officer training, and how a member of the CAF can spend 1 year in residence to complete their training, or do it over a period of 2 years via distance education in the field (because distance education is considered by the brass to be less rigorous, and hence you have to do more of it). It's interesting to see such (unfounded) biases alive not just in academia (my playground) but also in other places. This question isn't really related to Kim's research (which is survey research), but I'd love to see a compare-and-contrast of the on-campus officer training vs. the online version. They should have equal outcomes, but I am wondering what the pros/cons are for each modality.
Another presentation (Scott's) dealt more with college leadership and the adaptation of colleges in Canada to distance education. The idea behind this research is to look at leadership variables that promote the growth of distance education at the university president level. The underlying rationale (at least one of the points) was that the role of the university is changing, and the university must adapt or go extinct. Scott quoted O'Meara, who described higher education as "irretrievably immersed in a merciless marketplace." If I remember correctly, Scott is the only person from our cohort who has presented thus far that is doing mixed methods.
I think the idea of leadership variables promoting growth at the university is important. Bad leaders do have a chilling effect, both on individuals and on the organization as a whole. That said, the thing that was running through my mind was the framing of the argument. A lot of what we see today (at least on my side of the border) tends to frame higher education within a market economy: get degree X to do job X, come back often for CPD. There is, however, in my mind a disjuncture here. School costs a lot, both financially and in time (not to mention emotionally). Education isn't a new pair of jeans you buy every other season. Adapting to a market economy (IMHO) isn't what institutions of higher education should be doing. We should be innovative, but the evolve-or-die framing doesn't work well for this particular sector of life and society.
As an aside, in the chat Norine mentioned that she wrote a paper called "Adult learning theories: shoes that hurt the feet" - LOL. Now I am curious to read that paper. I am not sure how this came up, but it must have been in someone's lit review :-)
That's all for this season of EDDE 806 :-) See you in the fall!
EDDE 806 - Post XII - Of Navigators and Succession...
One question that came to mind, outside of the context of these presentations, was: how long do EdD students stick around in 806 after they have met the requirements of the course? If they don't come back, why is that? If they do return, why do they return, and what influences the regularity of their participation? I guess this could be a dissertation topic in and of itself, but it's a question that came to mind as I saw some very familiar names in the guest list on Adobe Connect last night, and noticed the absence of other names that I've seen over the last year or so of my 'informal' 806 participation. Of course, a dissertation topic like this would most likely add 2-3 years to my studies, and that doesn't seem like an appealing prospect :p
For the presentations of the evening, Neera presented her proposed study, titled Succession planning in higher education: condition for sustainable growth and operational resilience, and Stephanie's proposed research topic is titled Developing routine practices for health system navigation in Canada. Neera is focusing on succession planning, with a focus on polytechnics (with potential study participants coming from Alberta, Saskatchewan, Ontario, and British Columbia). I was surprised that there are community colleges in Canada - I tend to think of "community college" as a US term. Looking into it a little, community college might be one of many terms that refer to the same type of institution in Canada.
In any case, I think that Neera made an interesting point, something that I've seen at my own institution: in higher education, it often seems that hiring new people, or replacements for key positions, takes a lot of time. That ends up potentially costing the institution money, because there isn't someone in the position to take care of the institution's critical needs; and if someone is hired but the hire fails (a bad fit, for example), it's still money down the drain. Hence a good succession plan (implemented well) would conceivably benefit the institution.
For Stephanie's presentation, since I am not in the healthcare field, I am a little less able to say anything other than that it's a cool project :-) I don't want to just summarize her presentation, though. The thing that struck me, with both Stephanie's and Neera's presentations (and other presentations I've seen over the years), is that most dissertations and dissertation proposals seem to be either qualitative in nature or mixed methods; I have yet to see a strictly quantitative approach. I wonder if others have seen those in their experiences in EDDE 806.
Loyalty a one way street?
I recently came across an Inside Higher Ed article titled Loyalty Is a One-Way Street, whose tagline was "Loyalty of students and faculty is often demanded. Is it returned?" The main thesis of the article is that in higher education the job you're in is the job you're in, unless you apply for another job and get it, at which point you can either leave your old job or use your new offer as leverage for a better job (or better pay) at your current one. The article is written from a faculty perspective, but it resonated with my own experiences at the university. However, I wouldn't really call it an issue of loyalty; rather, it's an issue of organizational culture and a lack of meaningful (to the individual) rewards for that loyalty. Here are my observations as a staff member over the last (close to) 20 years at my institution, and a story from my first job on campus.
When I first started working here, I worked as an assistant (no benefits, hourly employee) in media services while studying as an undergraduate. It was a fun job; I worked with, and for, interesting people, and I can say that I learned a lot from the job and from the people I interacted with. The job was always meant to be temporary, since it was renewed on a semester-by-semester basis. In my second year of employment I got more responsibility by being given the reins of the weekend operations (again hourly, non-benefited, but more responsibility). After about 3 years of being non-benefited, someone retired and his job posting opened up.
Having shown progressive responsibility, I was a prime candidate for the position. I applied, interviewed, and ultimately got the position. I still worked in the same place as before, doing about the same things, but now I was benefited, full-time, with managerial responsibilities on top of everything else. For five years I did my best to learn more about my job, and to try to be innovative to help the department. I started an MBA, I joined a professional association (with my own money), I learned, prepared for, and passed the relevant entry-level certification, I connected with IT folks from the university to keep my department in the loop, and I volunteered for AV projects with my colleagues during slow periods in the office. I didn't do this for recognition, but so that I could be better at my job. Ultimately, however, one does expect some sort of recognition (in some way, shape, or form). Our university does not award merit points for employees who keep up with their professional development. Everyone gets the same cost-of-living (COLA) increase as everyone else. If you want a pay increase, you need to show that your duties have significantly changed since you were hired.
In five years my duties had indeed changed in practice, but not on my job description (which is what governs your pay). I was doing different work than my colleagues, but we were all paid the same; actually, they were paid more as a result of compounding COLA increases, because they had been working here longer, which was fine. Our supervisor was a nice guy, but he hated to differentiate (the kind of person who treats all his kids equally, no matter what). This was problematic, because everyone he managed "exceeded expectations", and this praise felt a little hollow after a few years. Praise needs to be accompanied by something else to be useful (if you use it a lot): a little more flexibility on vacation, or a pay increase, or some money to attend a PD event, or whatever. So, the only option for a little more money was to go through the official procedure (which was fine). My boss at the time told me that he supported me, but privately he told others that he would never support it unless others got the same deal (regardless of their duties) - a natural extension of "treat everyone the same". Since I ultimately did not get a promotion there, I looked elsewhere for work. It was sad, because I liked both the job and my colleagues, but you do what you have to do. When I told my boss that I got another job, his boss attempted to retain me in the department, asking if I would stay if they matched the salary. I would! But I wouldn't wait around for it (a bird in the hand is worth two in the bush). Since they couldn't make it happen, I left. I still kept in contact with my colleagues there - they were great people (and it's a small campus) - but I left that department. And they were inconvenienced, because they couldn't hire a replacement right away, and my area was the busiest on campus (based on department-held statistics).
To bring it back to the IHE article: without knowing that this is the game to play in academia, I ended up playing this game. I looked for other jobs, interviewed, got an offer, and did respond in the affirmative that I would stay if they matched the salary (which would also mean getting a new job description, the thing that was originally turned down). But given the steep bureaucracy of the university (at least mine), it wouldn't have been nimble enough to act as quickly as accepting the new job offer; and the trust relationship was broken, since my supervisor told me one thing and told others something else (those others eventually telling me), so there was no guarantee that the retention offer was any good.
This is one story. I've experienced other things in my close-to-20 years here, and I've spoken with colleagues and heard their stories too over the years. My 2c on the matter are as follows (mileage at your college may vary; this is just based on my local colleagues around the Boston area):
There is a fundamental problem of organizational culture. Warner writes that he has "witnessed genuine loyalty among colleagues at the department level, but this is a reaction and response to the lack of loyalty at the larger institution level. They have banded together as protection from above." I've seen this myself, and have heard it from colleagues at other institutions as well. Some departments are better at self-supporting than others, but this creates structural inequalities within the organization as a whole. If your supervisor likes you and you get all the perks in your department, but a colleague is not liked (or has an ineffective supervisor, or doesn't enjoy the group protection you do, etc.), they do not get any of the perks you get, and in some cases for doing the same job! This type of unequal treatment isn't a hypothetical; it's happening. And in instances where merit payments are involved, some employees may be eligible for a merit pool because their supervisor loves them and gives them "exceeds expectations" all the time, while other employees might be working for someone who believes in the power of the bell curve, where everyone "meets expectations" with the exception of a few 'high performers' and a few 'low performers'. In essence, these managers not only shoehorn people into the bell curve, they deny them an opportunity for merit/bonus pay that they would be entitled to if their supervisor were someone else.
Another issue I've seen is that everything is treated as a zero-sum outcome: someone's gain is regularly someone else's loss. In a big department (or a college/faculty within a university), if an employee has an opportunity to grow in their job, but that growth takes them out of their smaller sub-unit into another sub-unit of the organization, the organization is resistant to embracing this. Even though the employee would still be connected to their previous sub-unit, and could help take care of work and issues within that sub-unit as well, the "transfer" would most likely be blocked because the originating sub-unit would not necessarily be able to get funding to replace that position. The concern seems to be how many warm bodies each department has, and not necessarily what type of work needs to be done. As an example, take my first job on campus: it's been 12 years since I left that job, and the number of warm bodies doing the same work has remained the same, even after 2 retirements and (sadly) 1 death. Those positions have been refilled to do pretty much the same thing, regardless of where research into educational technology and learning has led us since. That department is still a separate fiefdom, and people get annoyed when they are asked to take care of something that another IT department "should" be doing (never mind that they are all part of the same IT parent department).
Finally, it might seem that my position is that a higher salary (or other monetary perks) should be the general acknowledgment of employees' good work and loyalty; or, associated with more money, moving up the ladder into a more managerial position. While money and career development are nice, sometimes they are not the end-all be-all. My former colleagues seemed to like what they did. They didn't seem interested in changing jobs for higher pay. Maybe pay wasn't even a top concern for them (bills paid, mortgage paid, savings at an OK level), but they may have wanted more flexibility for vacations in order to spend more time with their families. A flexible organization should be able to give such perks (fairly, and across the organization) to people who earn them (through good work, loyalty, and so on), and at the same time have the resilience to work around any issues that might arise from this individual flexibility.
I do think that my institution attempts to demonstrate appreciation for the loyalty of its employees, and I do think that upper administrators care (to some extent at least); that is to say, I don't believe them to be greedy monsters who just look at the bottom line. One of the events we have each year is the "years of service" event, where people are recognized for their service in 5-year increments. Last year, for example, I was recognized for 15 years of (benefited) service to the institution.
In the end, I don't think it's a matter of loyalty. Loyalty (or lack of loyalty) is a conscious effort (or lack of effort). I think the issue is systemic, and it's really an issue of management.
So... my question (to anyone who is reading this) is: how do we make academia responsive, and at the same time equitable and flexible, so that it works both at the individual level and at the organizational level? Thoughts?
EDDE 806 - Post XI - Get your Waldorf on...
|Statler & Waldorf, muppet critics|
Angie is coming at this problem from a corporate instructional design lens, where a lot of money is spent on training, yet 50%-90% of this training is deemed ineffective. Because of this, training departments are among the first things to get cut when a company needs to tighten the budget (which explains a lot of the angst that friends who are corporate IDs feel). I do wonder, though, what it is about corporate training that makes it ineffective. Being a bit of a Waldorf (or am I more of a Statler?), it seems to me that fellow instructional designers in corporate settings do what's expected of them (self-paced, drill & kill interventions), but those don't work because they are usually compliance training (and everyone seems to hate that). If instructional designers were more integral to the talent development cycle, the interventions might be more effective. Anyway, I digress.
So, one might ask, what is DACUM? DACUM was new to me, and it is defined as:
Developing a Curriculum (DACUM) is a process that incorporates the use of a focus group in a facilitated storyboarding process to capture the major duties and related tasks included in an occupation, as well as the necessary knowledge, skills, and traits. This cost-effective method provides a quick and thorough analysis of any job.

It seems to me to be one of the tools used by instructional designers in the needs analysis phase to determine what a learning intervention needs to accomplish. Apparently DACUM is currently only done in person, which can be quite expensive for the same reason that training is deemed expensive at times: you need to pull employees away from their work to do it. Angie is looking at employing Design-Based Research (DBR) with a Delphi approach. Her expert informants will be 6 PhD psychometricians at her company, distributed over a geographic distance (some are in the same office, but some are not). She will have one group of senior psychometricians and one group of junior psychometricians (it will be interesting to see if there are differences between those who are more senior).
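As an aside, for readers unfamiliar with the Delphi approach: panels rate items over several rounds, and items that the panel agrees on are retired while contested items go back for re-rating. Here's a toy sketch of one common way of operationalizing that (my own illustration, not Angie's design): consensus defined as an interquartile range of at most 1 on a rating scale. The task names, ratings, and the IQR threshold are all invented for the example.

```python
import statistics

def iqr(ratings):
    """Interquartile range of a list of numeric ratings."""
    q1, _, q3 = statistics.quantiles(ratings, n=4)
    return q3 - q1

def delphi_round(item_ratings, consensus_iqr=1.0):
    """Split items into those that reached consensus (IQR <= threshold)
    and those that go back to the panel for another round."""
    settled, revisit = {}, {}
    for item, ratings in item_ratings.items():
        if iqr(ratings) <= consensus_iqr:
            settled[item] = statistics.median(ratings)
        else:
            revisit[item] = ratings
    return settled, revisit

# Hypothetical first round: 6 experts rate two job tasks on a 1-5 scale.
round1 = {
    "task_A": [4, 4, 5, 4, 4, 5],   # tight agreement -> settled
    "task_B": [1, 5, 2, 4, 3, 5],   # wide disagreement -> next round
}
settled, revisit = delphi_round(round1)
```

In a real study the "revisit" items would be sent back to the panel along with the group's summary statistics, so experts can reconsider their ratings, and rounds repeat until consensus (or a round limit) is reached.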
On another note, it's interesting that this is not Angie's first idea. She's had several over the years, but opportunities dry up and doctoral students are left trying to pick up the pieces. I often wonder what happens if you've passed your dissertation proposal defense (and hence are formally an EdD candidate), but that opportunity dries up and you need to do something else. Does your committee ask you to re-defend something new? Do you try to salvage what you have with what little is left? Do you put together a new proposal with just your advisor? With coursework it's pretty cut and dried: you do the work, you get a good grade, you pass. The dissertation, though, can be a year-long project (or longer) after you defend the proposal. What happens when stuff hits the fan while you're in the thick of it?
If any cohort 1 or cohort 2 folks are reading this, advice is definitely welcomed :-)
Are MOOCs really that useful on a resume?
Thu, Feb 23 2017 12:45 | assessment, cMOOC, CV, HR, instructionalDesign, MBA, MOOC, pondering, resume, work, xMOOC
I recently came across 7 Tips for Listing MOOCs on Your Résumé, which cites the CEO of an employer/employee matchmaking firm. One piece of advice is to create a new section for MOOCs taken and list them there. This is not all that controversial, since I do the same; not on my résumé, but on my extended CV (which I don't share with anyone), and it serves more as a means of self-documentation than anything else.
The first thing that got me thinking was the piece of advice that says to "only list MOOCs that you have completed". The rationale is as follows:
"Listing a MOOC is only an advantage if you've actually completed the course," Mustafa noted. "Only about 10 percent of students complete MOOCs, so your completed courses show your potential employer that you follow through with your commitments. You should also be prepared to talk about what you learned from the MOOC — in an interview — and how it has helped you improve."
This bothered me a little bit. In my aforementioned CV I list every MOOC I signed up for (†) and "completed" in some way, shape, or form; however, I define what it means to have "completed" a MOOC. I guess this pushback on my part stems from having started my MOOC learning with cMOOCs, where there (usually) isn't a quiz or some other deliverable that is graded by a party other than the learner. When I signed up for specific xMOOCs, I did so for a variety of reasons, including interest in the topic, the instructional form, the design form, the assessment forms, and so on. I've learned something from each MOOC, but I don't meet the criterion of "completed" if I go by the rubrics set forth by the designers of those xMOOCs. I actually don't care what those designers set as the completion standards for their MOOCs, because a certificate of completion carries little currency anywhere. Simple time-based economics dictate that my time shouldn't be spent on activities leading to a certificate that carries no value, if I don't see value in those assessments or activities either. Taking a designer's or professor's path through the course is only valuable when there is a valuable carrot at the end of the path. Otherwise, it's perfectly fine to be a free-range learner.
Another thing that made me ponder a bit is the advice about linking to badges and showcasing your work. Generally speaking, in the US at least, résumés are a brief window into who you are as a potential candidate. What you're told to include in a résumé is a brief snapshot of your relevant education, experience, and skills for the job you are applying for. The general advice I hear (which I think is stupid) is to keep it to 1 page. I ignore this and go for 1 sheet of paper (two pages if printed on both sides). Even that is constraining if you have been in the workforce for more than 5 years. The cover letter expands on the résumé, but that too is brief (1 page, single-spaced). So a candidate doesn't really have a ton of space to showcase their work, and external links (to portfolios and badges) aren't really encouraged. At best, a candidate can whet the hiring committee's appetite enough to get called in for an interview. This is why I find this advice a little odd.
What are your thoughts on listing MOOCs on a résumé?
† This includes cMOOC, xMOOC, pMOOC, iMOOC, uMOOC, etcMOOC...
Course beta testing...
Wed, Feb 22 2017 10:38 | coursera, development, instructionalDesign, MOOC, qualitymatters, software, testing, xMOOC
A Slashdot story recently asked: Software Goes Through Beta Testing. Should Online College Courses? I don't often see educational news on Slashdot, so it piqued my interest. The story links to an EdSurge article where Coursera courses are described as going through beta testing by volunteers (unpaid labor...)
The beta tests cover things such as:
... catching mistakes in quizzes and pointing out befuddling bits of video lectures, which can then be clarified before professors release the course to students.
Fair enough; these are things that we tend to catch in developing our own (traditional) online courses as well, and that we fix or update in continuous offering cycles. The EdSurge article quite explicitly compares xMOOCs to traditional online courses. It mentions rubrics like Quality Matters and SUNY's open-access OSCQR ("oscar") rubric for online 'quality'. One SUNY college is reportedly paying external reviewers $150 per course for such reviews of its online courses, and the overall question seems to be: how do we get people to beta test their online courses?
This article had me doing a bit of a Janeway facepalm when I read it (and when I read the associated comments). The first reason I had a negative reaction was that the article assumes such checks don't happen. At the instructional design level there are (well, there are supposed to be) checks and balances for this type of testing. If an instructional designer is helping you design your course, you should be getting critical feedback as a faculty member on that course. In academic departments where only designers do the design and development (in consultation with the faculty member as the expert), the entire process is run by IDs, who should see to this testing and quality control. Even when faculty work on their own (without instructional designers), which is often the case in face-to-face courses, there are checks and balances. There are touch-points throughout the semester, and at the end, where you get feedback from your students and can update materials and the course as needed. So, I don't buy the notion that courses aren't 'tested'.†
Furthermore, a senior instructional designer at SUNY is cited as saying that one of the challenges "has been figuring out incentives for professors or instructional designers to conduct the quality checks," but at the same time is quoted as saying “on most campuses, instructional designers have their hands full and don’t have time to review the courses before they go live.” You can't say (or insinuate) that you are trying to coax someone into a specific task, and then say that those individuals don't have enough time to do the task you are trying to coax them into. When would they accomplish it? Maybe the solution is to hire more instructional designers. Maybe look at your institution's tenure and promotion processes and see what can be done there to encourage better review/testing/development cycles for faculty who teach. Maybe hire designers who are also subject matter experts to work with those departments.‡
Another problem I have with this beta-testing analogy is that taught courses (as opposed to self-paced courses, which is what xMOOCs have become) have the benefit of a faculty member actually teaching the course, not just creating course packet material. Even multimodal course materials such as videos, podcasts, and animations are, in the end, a self-paced course packet if there isn't an actual person tutoring or helping to guide you through that journey. When you have an actual human being teaching/instructing/facilitating/mentoring the course and the students in it, there is a certain degree of flexibility. You do want to test somewhat, but there are a lot of just-in-time fixes (or hot-fixes) as issues crop up. In a self-paced course you do want to test the heck out of the course to make sure that self-paced learners aren't stuck (especially when there is no other help!), but in a taught course, extensive testing is almost a waste of limited resources. The reason is that live courses (unlike self-paced courses and xMOOCs) are meant to be kept up to date and to evolve as new knowledge comes into the field (I deal mostly with graduate online courses). Hence, spending a lot of time and money testing courses, some component of which will change within the next 12-18 months, is not a wise use of a finite set of resources.
At the end of the day, I think it's important to critically query our underlying assumptions. When MOOCs were the new and shiny thing they were often (and wrongly) compared with traditional courses; they are not the same, and they don't have the same functional requirements. Now that MOOCs are 'innovating' in other areas, we want to make sure these innovations are found elsewhere as well, but we don't stop to ask whether the functional requirements and the environment are the same. Maybe for a 100-level (intro) course that doesn't change often, and that is taken by several hundred students per year (if not per semester), you DO spend the time to exhaustively test and redesign (and maybe those beta testers get 3 credits of their college studies for free!), but for courses that have the potential to change often and have fewer students, this is overkill. In the end, for me, it comes down to local knowledge and the prioritization of limited resources. Instructional designers are a key element in this, and it's important that organizations utilize their skills effectively for the improvement of the organization as a whole.
† Yes, OK, there are faculty out there who have taught the same thing for the past 10 years without any change, even with the same typos in their lecture notes! I hope that these folks are the exception in academia and not the norm.
‡ The comparison here is to the librarian world, where you have generalist librarians, and librarians who also have subject matter expertise in the discipline they serve. Why not do this for instructional designers?
Wed, Feb 15 2017 01:30 | academia, Employment, institutionalMemory, knowledgeManagement, Management, work
It's been a long time since I've blogged about something educational, other than my classes of course. With one thing down (and a million more to go), I decided to take a little breather to see what's accumulated on Pocket over these past few months. I saw a post by Martin Weller on Institutional Memory, and it seemed quite pertinent to my day-to-day work existence these past six or so months. Martin points to a BBC article indicating that the optimal time in a specific job is around 3 years.
This isn't the first time I've heard this. About 11 years ago (wow!) I was working for my university library. I was new to the Systems Department (the IT department in a library) and my supervisor was new as well. When we were getting to know each other's work histories (before you could just look at LinkedIn profiles), she told me that she aimed to stay there for a few years and then move on; in her view, people should only stay in their current job for 3 years. At the time I found this advice a little odd; after all, I had stayed with my previous department for 8 years before moving to the library, and even then I stayed within the institution.
From my own experience I can say that if institutions were perfectly running machines, with perfectly documented procedures and good version histories that we could reference for insight into why things are done the way they are done, then "short" 3-year stays at a job (or an institution) might (in theory) make sense. You come in, the institution benefits from your expertise, you benefit from the experience, and you (metaphorically) hug and go your separate ways at the end of your tour. However, institutions are complex organisms. The reasons why things are the way they are might not be documented. Sometimes the procedure was a backroom deal between one academic dean and another. Sometimes it's the duct tape and paper clips that hold everything together, because at the time the organization didn't have the ability to break everything down and rework it from scratch. Other times it's good ol' fashioned human relationships that make things work (i.e. bypassing parts of the system where things are bottlenecked but no one will change them).
Given this reality, I think 3 years is a rather short time to spend at a job or an institution. I know that when I've changed jobs it's taken me up to a year to fully "get" all the connections, the people, and the systems in place, so as to do my job not just adequately but effectively and efficiently. Leaving before you can make a lasting impact is a little selfish: the employee gets good exposure to new skills and ideas, but leaves before they can really put those to use on anything more than a bandaid†.
Sure, even when you stay at an organization for more than 3 years, after a little while you will reach a plateau of efficiency in what you are doing. It may take you 3 years, it might take you 2, it might take more; sooner or later you will get there. At that point, the organization has a responsibility to keep things fresh for its employees. This benefits both the organization and the employees: employees feel challenged in good ways (think of it as a ZPD for work), and organizations get to retain and employ the talent they've incubated. If people leave because they feel bored, that's a shortcoming of the organization.
I know from my own experience working at my university (19 years now) that even though my jobs have changed, and my departments have changed, institutional knowledge follows me, and I share it with other people. Just because something might not be of particular use to me right now doesn't mean it's not useful to a colleague who is newer at the institution. Having this oral history, and a means of passing it down to others, is valuable. High turnover, with people leaving their posts every few years, is detrimental to an institution‡.
† Don't get me wrong: private sector companies, especially ones that vehemently refuse union organization and use globalization as a way to use and abuse employees (by not paying them a living wage, not providing good benefits, and shirking their responsibilities in their social contracts), are not worthy of employee loyalty of this nature. But we just can't afford, as people, to say "I am only looking out for myself".
‡ Another thing that came to mind as I was writing this has to do with hiring. Hiring isn't as simple as posting a job at the university's "help wanted" site. Between the time a need for someone arises and the time someone is hired, it can take a very (very) long time. Two jobs I applied for come to mind. One was my current job (applied in March, interviewed in December, started in February). The other was my job at library systems (applied in February, I think; got the call for an interview in November; heard I got the job in December; started in January). All of this is considered "fast", so when it takes that long to get hired, I would say that 3 years somewhere is a rather short time.