Club Admiralty

v7.2 - moving along, a point increase at a time

Multilitteratus Incognitus

Pondering what to learn next 🤔

When MOOCs turn into Self-Paced eLearning...

decorative image showing the letters MOOC

It is true that, as of this writing, there is much more serious stuff happening in the world today, both in the US and abroad, but this has been percolating in my brain for a while, so I thought I'd jot down some thoughts on one of my favorite topics: the MOOC.

Now that I am done with my dissertation, and I've had a little time to rest my brain and refocus on what I want to geek out on, I've wanted to do a retrospective piece on MOOCs. I was going to call it a post-mortem because I think that the time of the MOOC has passed. Don't get me wrong, I think there is still gas in the tank of companies like Coursera, edX, and FutureLearn, but I wouldn't call their offerings MOOCs. The innovative pedagogical stuff I saw early on doesn't quite seem to be there these days, with the focus shifting to AI, massification, and machine learning.

In any case, my idea for a post-mortem was particularly poignant because 2022 is the 10th anniversary of the year of the MOOC (time flies... 😮). To be fair, MOOCs existed prior to 2012, with classics like CCK (Connectivism and Connective Knowledge) debuting in 2008 and arguably defining the genre of the MOOC (of the connectivist variety), but as is the case with media, even in academia you need a little popular awareness to spice things up, so the post-mortem would have started from the debut of the xMOOC. If you're interested in a curated collection that provides a retrospective on the decade of MOOCs, check out Katy Jordan's co-edited JIME issue, which came out this May.

While I did do some lit review on the MOOC for my dissertation, the MOOC wasn't the focus of my work, so there was a bit of a gap that I wanted to address for my own personal knowledge, and in turn, use that top-level view to write something. Alas, while I was away the MOOC seems to have turned [even more] into your standard self-paced eLearning 🙄. A learner can log in, view the content provided by the design team, complete some multiple-choice formative and summative assessments, and log out. Sure, in the heyday of the cMOOC it was possible to lurk in a MOOC, get what you wanted out of it, and not interact with anyone; however, many people did blog, tweet, and reflect in their own spaces, and those spaces were aggregated via the MOOC, which allowed others to engage with you there. The cMOOC days permitted more content, ideas, and ponderings (and more opportunities for connection).

In the xMOOC I tend to avoid the only place for connection, the forum, because it is one huge mess. Contributions get lost in a forum with that many people, and it really is a demotivator when it comes to participation. Why put in the effort to write anything cogent when you don't own the platform and people may not read it? To make things worse, mandatory participation by means of forum posts (borrowed, no doubt, from smaller-scale online classes) invites short responses such as "agree" and "nice job!" or, perhaps, a regurgitation of what was in that 10-minute mini-lecture. The xMOOC has tried to be "innovative" by shutting off access to past courses (unless, in some cases, you subscribe...), which ratchets up FOMO, forcing you to participate in what is essentially a self-paced eLearning course.

Of course, when you're focused on "completing" the materials by the end of the access period you cram, and in the end, you remember nothing 🙄. It's just wasted time. Some of my more memorable xMOOCs were ones from the early days of edX and Coursera where I could sign up for a variety of courses, actually treat them as self-paced eLearning without worrying about expiration, and immerse myself in the content (and cMOOC it a bit by writing about it).

Finally, being the MOOC expert that I am (😅), I am invited to peer review articles on the subject. People seem to be treating MOOCs as a technology, which they are not. People also treat MOOCs as self-paced eLearning or content dumps, which IMO they shouldn't be. I guess the MOOC has evolved and moved on... or perhaps I've stuck with a more idealistic version of the connectivist variety while financial interests have morphed the MOOC into something that might make money. Either way, I am having more engaging conversations in spaces like MyFest, DS106, and tweetchats than in the ecosystem that was once called a MOOC.

So, my question for MOOCers current and past is this: what do you think? Is the MOOC as a concept done? Is there something there that can be salvaged? Or do we just pick out the parts that we liked and remix them into something new? 🤔


Graduate Students as OSCQR reviewers

OSCQR Digital Badge

In the beforetimes (summer 2019), I had access to some graduate assistant hours and I needed to find a project for them. Since this group of graduate assistants was destined to become educators, I thought it would be an interesting idea to train them on the OSCQR rubric and have them be "reviewer 1" and "reviewer 2" on a few course reviews that I wanted to undertake. I took on the role of the instructional designer in this exercise (reviewer 3). Now, I know that the faculty member who is teaching these courses also needs to be part of the conversation, but more on that later...

My original goal for this exercise, beyond the actual review, was to conduct a collaborative autoethnography of the process of having graduate students conduct OSCQR reviews of courses that they themselves had most likely taken as learners. Content-wise the material should have been similar even if the instructors and modalities were potentially different. Well, the Fall 2019 semester got very busy, and then we've been living in COVID world since 2020. Additionally, those students graduated and moved on with their professional lives, so such a paper is no longer possible. I was considering using an autoethnographic approach with just my own reflections on the process (after all, I still have most of my notes), but I've got several other irons in the fire, and I am not sure how useful it would be to go through the process of trying to find a journal to get a peer-reviewed version out. Recently, though, I was inspired by Maha and her blogging of an unpublished paper, so I thought I would give this approach a try. So here it goes!

The Need

It's been my observation over the last 15 years in higher education that there are many silos that just don't connect. One might expect silos to exist between departments, but through interactions with various departments I've come to the conclusion that many faculty know what they teach and have a vague idea of what other courses exist in their department; however, they don't have deep knowledge of what goes on in those courses. Additionally, even if faculty are "peer reviewing" a fellow colleague's course (including courses of colleagues in other departments), there is a reluctance to address issues with pedagogy or material because of "academic freedom" 🙄. Even things like accessibility get rolled into the avoidance technique of academic freedom, which makes course improvement an issue. I suppose there is a curriculum committee in each department for this, but as a designer, every time I've brought it up with academic departments I always get audible grunts or rolled eyes. No one wants to peer review someone else's course, even with a rubric.

In any case, for this project, I was interested not in what course materials faculty were using in their courses, but in things like the alignment of objectives, readings, and activities (making sure the connections were clear to learners), accessibility, design, and tool usage. My main focus was the student experience, and the goal was to present findings and recommendations in the Fall semester. This would be a good opportunity to open up a dialogue between the faculty on course improvement and cross-course communication in general.

The Assistants

The graduate assistants who were my reviewers for this project were students 25-30 years old who were mostly done with their degrees, so they had already experienced most of the curriculum that they were reviewing through a student lens. Some courses were new to the reviewers. None of the assistants had any instructional design experience, and their pedagogical training only covered face-to-face contexts. My hope in training them on OSCQR during this review was that they would also be able to take what they learned, both from OSCQR and from reviewing fully virtual classes, into their face-to-face classrooms (and virtual classrooms if the need arose). Each assistant had 10-12 hours per week of work (if I remember correctly).

The Process

The process started with a virtual training session on OSCQR 3.0. This took place over Zoom since, even in 2019, we had a hard time getting everyone to campus at the same time. It was also not necessary to meet face-to-face for this. I walked the three assistants through the OSCQR rubric and its annotations and provided a quick instructional-design-for-online-learning boot camp. This was the highlights-of-the-highlights version of ID for OL. To demonstrate the rubric I used one of the soon-to-be-reviewed courses as an exemplar.

There were 18 courses to be reviewed. These were the most commonly offered courses in the department, so they represented the biggest bang for time spent. With two graduate assistants per course, there were 36 review slots to split among the three assistants, meaning each reviewer would review 12 courses over the summer. This also meant that we were reviewing about one course each week of the summer, and once a week we tried to have a debrief about our findings. I was available for questions throughout the week. At the conclusion of this review, each of the 18 courses had qualitative feedback from the three reviewers (the two assistants plus me).
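For anyone who likes to sanity-check the workload math, here's a quick sketch. The numbers come from the description above; the script itself is just illustrative, not a tool we actually used:

```python
# Reviewer workload for the summer OSCQR project.
courses = 18                  # courses to be reviewed
ga_reviewers_per_course = 2   # graduate assistants assigned per course
assistants = 3                # graduate assistants available

review_slots = courses * ga_reviewers_per_course  # total GA reviews needed
per_assistant = review_slots // assistants        # courses per assistant

print(review_slots, per_assistant)  # 36 reviews total, 12 per assistant
```

Spread over a roughly 12-week summer, 12 courses per assistant works out to about one review per person per week, which matches the cadence we settled into.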

What worked in the training and what didn't?

One of the areas for improvement is definitely the demo course used to train the reviewers on OSCQR. I didn't have much of an opportunity to create a sample course to run the training, so we all evaluated a course from the pool of courses that were going to be reviewed that summer. I think there are pros and cons here. One pro is that this course ends up more thoroughly under the microscope than it normally would be; on the other hand, you risk reviewing every course in that collection against that initial course, so reviews might end up recommending that courses look like that first one (if the reviewers thought it was particularly well designed).

In retrospect, I should have used another course; for example, I could have reached out to a colleague in another department to see if they'd volunteer their course to train on. I did end up using one of my old INSDSG courses to provide exemplars or alternate ways of addressing elements of the different OSCQR categories. I think the Zoom training session worked well, and the weekly debriefs worked well enough. If I were to do this again, I'd formalize the debrief a bit, maybe ask reviewers to show and tell some things that they felt worked rather well in courses that they reviewed, along with any questions they may have had. I might also include a weekly reflection.

One of the things I wanted to get at with some sort of ethnographic component in the original project was whether student reviewers had any trepidations about reviewing courses in their department, whether they felt like they needed to be positive, or whether they feared retaliation. I didn't sense a lot of this, but it would be good to have some data.

Finally, I think that 18 courses was a little much for the reviewers we had available. The reviews felt a little bit like a conveyor belt rather than the opportunity for review and conversation that I had hoped to foster.

How did the courses fare?

Most courses made it through the OSCQR review with minor or moderate changes suggested. I think that this is partly a sign of the maturity of the program (courses have undergone many revisions over the last 15 years) and partly due to faculty familiarity with things like Quality Matters and OSCQR. Also, in order to teach online (at least back then), faculty needed to complete some preparatory coursework. Additionally, our ID group had begun an accessibility campaign the year prior to the review, so I have a sense that many of the inaccessible elements were handled then. We did find some accessibility issues, but I think there would have been more had our colleagues in ID not been proactive with their initiatives.

Some of the things that came up as elements for review included onboarding information (such as technological requirements for the courses), links and information about campus services (library, accessibility office, tech support, etc.), contact info for the department, and of course the accessibility of attached documents and presentations. My two big takeaways here are these:

1) Some common elements of courses (such as access to campus services, department contacts, and so on) can go into a template that every course can use. While I don't think one template could apply to every single course a department uses, at the very least some common things that you want every student in the department to have and know should be included. Onboarding for each class will vary depending on the topic, the course, and the instructor, but some things can be standardized. I think having faculty develop their own department's template(s) would go a long way toward helping learners recognize the signposts in each course so that they can wayfind with more ease, while still retaining faculty voice in the decisions made in those templates.

2) Accessibility is an ongoing effort! It needs to be baked in, not sprinkled on (as my old friend Margaret used to say). Even though we worked hard in 2018-2019 to address accessibility issues, there were still elements of courses that were inaccessible. I think that this is one of those battles that require a few stakeholders to be involved. The faculty member should certainly strive to make the content as accessible as possible right from the start, but maybe there could be a team in the department or the college that helps ensure last-mile compliance. Automation has certainly helped a lot, for example automatically generated closed captions, but those should be verified by a human.

Reflection: Technosolutionism and Faculty Learning Communities

Over the past decade (or more), our ID group has been great! They've nurtured faculty who've been teaching online (at least from what I can see from my end), but there is a certain spirit that seems to just stick around: a faith in technosolutionism. From automatic AI transcriptions, to "plagiarism detection" tools like Turnitin, to remote proctoring. From chats with other colleagues, some departments seem to see cheaters and plagiarizers everywhere. This isn't healthy.

Another issue arises with adjunct faculty. There needs to be a way to welcome our adjuncts into the overall discussion and training around teaching and learning, but it can't be uncompensated. Full-time faculty can count this kind of thing toward their PD and be paid for it, whereas adjuncts are only paid for the time they are in the classroom. If we want to include our adjunct colleagues in these discussions (which in turn translate into classroom pedagogies), we need to compensate them for their time and encourage (or require?) participation in faculty learning communities, both intra- and inter-departmentally.

I also think it's a good idea (at least for graduate students who will be going into teaching after they graduate) to prepare them to review courses and offer constructive feedback, even to people they know. I am not sure how to get current faculty members unstuck from the fear/concern of making suggestions to their colleagues (you know... because of "academic freedom"), but maybe we can break that cycle with the next generation of teachers and IDers.

That's it for now.  Your thoughts?


What Dual-Moding Taught Me about Remote Work

person sitting in a cubicle amongst many empty cubicles.
"I go into the office for collaboration" - the collaboration...

I suppose the title should be "some of what I learned...," since I probably can't fit everything in a blog post, but let's begin and see where I end up ;-)  

The TL;DR: be careful with the word "choice." While choice is good, you might get results that you didn't expect and are ill-equipped to handle.

I follow a Twitter personality who evangelizes about the remote office. I admit, I am biased and lean toward positive views of the remote office and remote work in general.  This morning one of the Twitter posts went like this:



Old people: young people need the office for social contact

Young people: actually we'd like to live closer to our friends and family

Respondent 1 (toward personality):

My 20-year-old son prefers to go into the office and doesn't enjoy working from home the one day a week. People need choice. Working from home is not for everyone. Would I go back to an office? No.

Respondent 2 (toward respondent 1):

That's the point: choice. Employers, instead, impose office or hybrid. Should be a choice. X wants full time/hybrid office? go ahead. I want full-time remote work. Employers can impose office/hybrid, but they will lose talent attraction over time, and die.


I think that choice in life is important, but what sometimes overrides a choice (or personal preference) is critical mass! Take my dual-mode institution, for example. Back in 2008, I was pursuing my last two master's degrees concurrently. Both programs had an on-campus variety and an online variety. If you lived within 20 miles of the campus (within the I-495 belt, for those familiar with the region), you were required to apply to the campus program; you were simply not going to be accepted into the online variety. I suppose that some small exceptions were probably made for people with disabilities or "extraordinary life circumstances," but those students really had to parade their special needs to be accepted into the online version (or so it seemed to me). At the time, I was working on campus, so leaving work and stepping into the classroom 50 meters down the hall wasn't a big deal. The late nights were a bit of a problem, but I still had the energy to just do it. There were differences between the two master's programs, of course. Here's how things started when I was a student:

MEd program: More courses were offered online than on-campus. On-campus offered 2-3 core (required) courses regularly, and 1-2 electives per semester. Online offered 2-3 core courses each semester and 3-4 electives. The electives on-campus and online were never the same, so students picked based on both modality and topic (if they could make it to campus). You could complete the majority (75%) of your degree online (as an on-campus student) if you wanted to, since there were only 4 core courses. As an online student, there were two classes you needed to complete in person, but they were offered in a blended format with a 1-week residential (for each course). You could not complete the degree purely on-campus; some courses needed to be online (course offerings were not as plentiful on campus, so you always needed at least 1 elective online). I was a campus student but completed ¾ of the degree online. Campus classes had a healthy number of students (about 15 per class), but the late nights eventually got to me.

MA program: Distinct on-campus and online populations. Online students were in a cohort setting, and there was no class variation possible. Everyone took the same courses. No electives were offered online (electives were essentially preset, canceling the notion of an "elective"). Campus students had more flexibility in their electives, and more electives were offered. Classes both online and on-campus had a good number of learners in each section (about 20), but some of the campus numbers were sustained by external funding, which attracted more students to campus. I was a campus student and completed all courses on-campus. By and large, the campus and online populations didn't mix.


OK, so, fast-forward a few years (circa 2013). The requirement of being forced into a specific modality based on your location was dropped for both programs (and mostly across the university, I think). Do you know what happened? More people started applying to the online programs while fewer applied to the campus programs! This was true even for people who lived a few miles from campus! How did this impact the two programs?

MEd program: Campus core courses were down to 1-2 per semester, and maybe 1 elective. Electives had a hard time getting enough students to run. Eventually, all electives ran online-only, and by 2015(ish?) all courses were online-only as well. The program still required a blended residential component, but as soon as other institutions started offering similar programs without the residential component, that was dropped too. It was undeniable that student enrollment had gone down, and the "oh, but students crave the social experience of in-person, so we need to keep our residential requirement" rationale that some folks held had no proof to support it. Currently, the program is fully online. Some folks do need a campus residence component for work/grant/financial-aid requirements, but it's not possible to offer that to them. There simply isn't a critical mass of people to offer it to.

MA program: More students applied to the online program than the on-campus one. External grants also dried up during the Trump years. More electives were offered online, online students now had options for electives, and ultimately electives were offered only online (so the elective situation flipped from the original state in 2008). It became easier to offer electives online because both learner populations (campus and online) could participate in an online format. Additionally, there was a critical mass to offer more than 2 electives per semester if the student populations combined, and to try some new things out! Alone, the campus population wouldn't have been able to sustain even 1 elective offered on-campus. All core (required) courses were offered in fall and spring for both online and on-campus, but eventually, the campus program had to pare down and offer courses in alternating semesters. There just wasn't enough of a critical mass to make courses run on-campus.


Now, from my examples above, it might seem like there is no interest in the on-campus programs. That's not the case. For both programs, there is interest by some learners in being on-campus. The problem is that there aren't enough people to form a critical mass to run the courses and to have a learning community. Personal choice is important, but it doesn't mean that the economics and the community mechanics work out to the degree needed to make the learning experience happen. It's like having a HyFlex class where the instructor and 3 students are in a room, while 25 other students are on Zoom or asynchronous. At that point, you decide that the weekly F2F doesn't make sense for the amount of effort you put in.

This is my parallel to remote work: you can have employees who can work remotely (based on the job's needs) but want to be in the office. Fair enough (I have colleagues like that); however, is there enough of a critical mass within an organizational unit (the people you work closely with) to be in the office?

Many people who talk about remote work seem to frame it as a matter of choice. I suppose it can be a matter of choice whether you are 1 day, 2 days, 4 days, or 0 days in the office, but it depends on which direction you are coming from. If you're coming from a "butts in seats, on-site" mode of thinking, sure, remote is probably a choice. However, if you're coming from a remote-first frame of reference, then I am not convinced that, for your average team, choice and critical mass complement one another. The numbers might be there for an organization to offer hot-desking accommodations for anyone in the company who wants to work in an office setting, but being in a space and actually being cognitively present with a community of peers are not the same thing. If you work with others, this kind of flexibility means that unless there is a critical mass in the office, you are essentially a remote employee, just not at your home (or preferred location).

For certain jobs, remote (with maybe a little hybrid) is the future, and people need to develop the interpersonal and managerial skills to work in both environments. Just going into the office won't be the panacea you are looking for if what you are seeking is collaboration. The physical office could provide a quiet space, a non-distracting space, a whatever space to get your work done. It may provide you with a cognitive break between work and life (if you haven't developed that and need some external stimulus to activate it), but it's important to understand that there needs to be a critical mass of people who want to be in person to make it happen, and it takes real leadership skills to make telework and in-person work seamlessly together. This comes not just from observations during the pandemic years, but from observations working in an academic department for the past 11(ish?) years with mixed-mode faculty, being a learner both online and on-campus, being an adjunct who teaches online, and being a person who collaborates and chats with colleagues across campus academic programs.


Dr. Academic Generalist?

Puzzle board generalist

Over the past few months, this idea has been floating around in my head, but I haven't really found the words to describe my general ponderings, so here goes a freewriting activity that I hope makes sense...

July will be the one-year anniversary of my passing my dissertation defense (yay!) and becoming a "doctor" 👏 (not the "damn it, Jim!" kind 😜). Over the last few years, leading up to my dissertation defense, I had spent a lot of time becoming an expert in collaboration, specifically in an open educational context. There was a little rhizomatic stuff there, but I need to go back and read more about it. I had also spent a lot of time building my expertise on the Community of Inquiry model before I abandoned that line of inquiry, as well as communities of practice, MOOCs, and other areas peripheral to collaboration and open ed.

One of my friends, who had already completed their doctoral journey a few years prior, told me to really take in, and relish, the experience of doing a deep dive into a particular topic, because after I graduated I would not have many opportunities like it. I took this to mean that future academic work would probably either build on top of past knowledge, with literature reviews being more additive in nature (i.e., what's new this year?), or that one doesn't really have time for these sorts of fun deep dives.

I've seen the latter (the shallow dive) with people who come from other disciplines and start to write about education. This lack of depth really shows in much of what's been published as "research" into Emergency Remote Teaching these past couple of years. But I digress... there's a book chapter about that forthcoming 😅.

Now that I am getting my research groove back, I've started to ponder: is research (the deep-dive stuff) incompatible with breadth of expertise? Or, to put it a different way, do academics tend to become typecast and pigeonholed into a specific field (or limited areas of inquiry) so that they have the opportunity to do deep dives throughout their careers? Or are there academics who are jacks of many academic areas of inquiry?

Is there room in the field for an academic, with a doctorate, who's a generalist? Someone who reads a lot, processes it, maybe writes on a blog, engages with others, and generally has a broader view of the field than someone who specializes in something specific during their career? It seems like much of the oxygen in the field is sucked up by people who publish a lot (be it articles, chapters, or books), but there really is little recognition for those deep divers who might be quietly brilliant. 🤔* I guess traditional "success" for an academic is measured in papers and h-indexes, but that seems limiting to me. My own success has been measured in interesting experiences, good thought-provoking discussions with people who've become friends, and "aha moments." This seems like enough to me, but does the broader field value that? At the moment, I am not eyeing any academic jobs, so I don't particularly care whether they value it or not; I'm just curious. So, if you've gotten to this point in the blog...

What does "success" look like for you, fellow doctors and ABDers?


* Re-reading this before I post, I am sure there is a lurker connection somewhere here...


Pondering the MOOC post-mortem

Statue of a thinking man
Photo by NeONBRAND on Unsplash

Back in December, I had an idea: 2022 is the 10-year anniversary of the "year of the MOOC," so why not write something about it? After all, open education and MOOCs are subjects that still interest me a lot. MOOCs of course go back before 2012, and the 2012 "year of the MOOC" was really most relevant in North America. I've seen other proclamations for the year of the MOOC being 2013 or 2014, but those are in different contexts. Anyway, since it's 10 years from some proclamation, and this year is actually one where I am free and clear of the obligations of dissertation writing, I thought it would be fun to revisit my old stomping grounds and do a 10-13 year retrospective research article (I need to get back into publishing somehow, no? 😂). I also have a title: MOOC post-mortem: A decade(+) of MOOCs. More on this title later.

Anyway, I decided to start this project in a very predictable way. I already had a treasure trove of research from 2009 through 2019(ish) about MOOCs that I had pored over for my dissertation. I was already familiar with the majority of the major findings. So, the logic went, I only really needed to see what was new from 2019(ish) to 2022(ish) and revisit the literature review work I did for my dissertation with a new lens. However, this left me totally unfulfilled. I mean, I've read the vast majority of this literature. I've also authored or co-authored some of it. Heck, I probably peer-reviewed some of it. I didn't want to do yet another meta-analysis. There are a few of those around: I've done one with Zaharias, Aras and his colleagues did one, George and Peter did one, Sangrà and his colleagues have one, Zhu et al. have one... Do we really need another meta-analysis? 🤔 Probably not. 😅🙄

So, I've taken a step back to really reassess what I want to do with this MOOC post-mortem. I really feel like there is a need to do a post-mortem on MOOCs because they are essentially dead. Sure, platforms like edX, FutureLearn, and Coursera still enroll students. I even took a Johns Hopkins course on Coursera in 2020 about contact tracing (fascinating stuff!), and I periodically jump into FutureLearn for new courses. FutureLearn has emerged as my favorite platform, but that's a thought for another post. Anyway, I am, at this point, hesitant to call these things MOOCs. I think there is something missing here, and I am not quite sure what that thing is, but I doubt that I'll begin this exploration in the currently published literature.

In thinking about this project, what comes to mind is something like an oral history (maybe... 🤔), but I am not quite sure how to proceed with it. MOOCs were pretty formative for me, both as a post-post-graduate learner and as a researcher. I learned a lot from the courses and the communities I was a member of, but I am not quite sure how to proceed in researching this post-mortem because MOOCs went from being an exciting thing to examine and be part of... to being meh 🤷‍♂️, to being things that are constrained by the same (or similar) things that constrain regular education. All of this happened while I was in my dissertation cocoon, so the before/after MOOC comparison right now is pretty striking. IDK, maybe others have already experienced this and moved on, and I am just catching up.

So, I guess my question to the audience that reads this: Do you feel like the MOOC is dead? Does a post-mortem (and/or eulogy) make sense? How might one go about it? 🤔

Who moved my cheese?

I was reading Twitter the other daaayyy (just picture someone from Letterkenny saying this 😜) and I came upon yet another discussion that basically boiled down to online (or remote) versus face-to-face. I made the mistake of reading the replies 🙄.

On the one hand, I was hoping that there would be a more nuanced discussion (and to be fair, I did get those types of replies from people I follow), but there were so many more about "online not being for everyone," about how people who teach face to face can "see if their points are landing and if learners are confused," or how "face to face is easier for building community" (maybe for you extroverts... 🤨), or even "online works great if you love reading information off a screen and taking self-paced tests" (someone's learning like it's 1999... 🤣), and other such (insert word/phrase of your choice). Obviously, the quotes are paraphrased.

I became upset and irritated reading such nonsense posts, so I took a step away and reflected on one of my MBA courses from a long, long time ago: Change Management. I am not necessarily upset at individual faculty who are biased toward physically-proximal learning (although they do raise my blood pressure at times). I've dealt with similar biases over the past decade teaching INSDSG684, but in class I have a whole semester to work with learners on these perceptions; Twitter's 280 characters aren't suited for this. These beliefs are what such faculty have been enculturated in, and if their institutions didn't expect them to continue their professional development into different pedagogies and modalities, they aren't necessarily to blame. They've remained in their bubble. However, I am also reminded that this is a perfect Who Moved My Cheese situation. This is a kid's book that we read and discussed as part of the Change Management course, and it's a story about adapting to change (see Wikipedia for a synopsis).

As I write this, we are concluding our second full year of COVID and all that this entails. The coming year, 2022, doesn't look to be any better, and I blame university administrators (broadly, as an industry, not any particular institution) and complicit faculty who are stuck on a physically-proximal teaching modality for our collective burnout. They keep doing the same thing over and over again and expecting things to change (that's the definition of insanity, according to Vaas 😆).

Listen, I get it.  In March 2020, we didn't know the extent of this pandemic.  We were, in earnest, doing emergency remote teaching.  Everyone who had a class in a physically-proximal modality was moved so quickly online that they probably got virtual whiplash.  Many faculty were not prepared to teach in a virtual setting, yet there they were.  They did the best they could and should be applauded. The same with learners.  They were not prepared to be at a distance, but they did the best they could to wrap up the spring 2020 semester.

Summer 2020 came and we were all talk. Committees were formed at campuses across the nation (maybe even the world) with the mandate to have different plans for different contingencies, and their (administrators') hearts were filled with hope that we'd once again be physically proximal in the fall. We know that this didn't happen. Neither did it happen in Spring 2021, nor in Summer 2021. At my institution, we brought back some classes in Fall 2021, but we've also had to deal with masks, enforcing vaccination requirements, still getting sick, being away from class and quarantined, and being worried about getting sick. This is not normal 🙄. We keep pining for physically-proximal learning and the good ol' days (which weren't that good for all learners, but let's put that aside for the moment). It ain't gonna happen in 2022 (putting on my Nostradamus hat 🎩). Stop doing the same stuff over and over again and hoping for a different result. That's not how science works (we're in academia, after all!).

As we brace for the third year of the pandemic, how about we collectively make a resolution to be realistic about the situations we're facing and not have faith of the heart that some miracle will happen and COVID will just disappear? Barring some sort of mass vaccination success where everyone is vaccinated, I don't see that happening, because we are, apparently, selfish as a species. So what does this more realistic outlook look like?

First, let's accept that we should no longer be in emergency mode for most classes. Emergency was March 2020, and perhaps Summer & Fall 2020.  Anything beyond that is really us putting our heads in the sand by expecting to go back, and then having the Surprised Pikachu face when we're told that we're going to be remote.  It's not good for students, it's not good for faculty, and it's not good for staff.  How about we use our time constructively to prepare our faculty members to teach good online and blended courses for the duration of this pandemic?

Second, let's prepare faculty. I mean really prep them. A once-and-done boot camp doesn't do much good beyond emergency management. For the spring semester, give every faculty member a course load reduction (or reduce expectations for other types of service and publishing) and create faculty support communities to help them become better distance and blended educators. Once the spring semester is over, pay faculty, especially adjuncts, to be learners themselves during the summer months: to attend workshops and work on their fall and spring semester course materials. Be prepared, even if you're prepping to go back to a physically-proximal mode of teaching. Have a backup!

Third, let's help students become successful in any modality. In our physically-proximal teaching modalities, we let our learners' past experience do a lot of the heavy lifting in student preparation. If we're being honest with one another, it really is a sink-or-swim situation. We can do better, and we really should! We should help learners navigate being a learner in higher education, both before they start their journey with us and throughout our curricula. Part of this may actually involve campus space where learners can have access to computing, internet, and quiet study space that is safe.

Finally, we need learner-friendly policies. One awesome policy change during the first year of the pandemic was that learners were able to withdraw from a class on the last day of courses. This way, if they had a chance of passing and wanted to get a course over and done with, they could see where things stood on the last day of class and decide then whether to stay or withdraw. This academic year that's gone, and we're back to the old system where learners need to decide one month before the semester ends whether they are staying or going. Seems unfriendly to learners, if you ask me 🤷‍♂️. There are other learner-friendly policies that have cropped up over the last 2 years. Why not learn from them and actually make them part of standard operations?

So... instead of resigning ourselves to being yo-yos by opting for campus-first and then scurrying back to "emergency remote" when that's not possible, how about we plan a little? How about we acknowledge that offering a good class experience takes time from faculty and staff, and energy from students? That energy gets sapped when we're all working from an emergency position with no stable footing, not just for students, but also for faculty and staff. There are few things worse in a work environment than dealing with an emergency that could have been prevented or even mitigated. Not every course can be fully online; we get that. Labs can have components that are virtual, but at some point you need hands-on time with beakers, test tubes, centrifuges, breadboards, accelerometers, and oscilloscopes (depending on your discipline). Planning for a smart return to campus is better for all involved.

Just my 2c.  What do you think? 


Wicked Smaht: ID PD as a branching path and not a ladder

It's a Boston thing...

Alright, this blog post has been sitting in my drafts for a while. Since I am procrastinating on writing that paper on video game preservation (a story for another blog), why not blog? 😂

Now that the dissertation is done and the doctoral degree is completed, I've been spending a little more time observing the ID-sphere on Reddit, Facebook, and Twitter, and I have seen a fair number of threads that solicit feedback and advice regarding doctoral studies in the field of instructional design or something like educational leadership. These two things come up often, and it's no surprise given that advertising for PhDs in ID and Ed Leadership comes up even for me (including on Instagram, where I basically just post nature photos!!!). It's usually certain for-profit universities that are responsible for the bulk of this advertising, at least for me, but I see certain names come up in the Facebook and Reddit threads as well. It's never your local state college 😜

Anyway, I digress. The advertising practices of certain schools aren't the issue here (maybe they should be, but that's a thought for a different post). The issue is the lack of proper advising and mentoring during a master's program, which leaves graduates susceptible to the siren calls of such advertising and can lead them to rough terrain and unpleasantly unexpected destinations. I am not sure degree programs often prepare students to be lifelong learners; to enable them to find their way to new learning opportunities and to critically assess them. I think we need a kind of "all the questions you had about a PhD but were afraid to ask" requirement for graduates of our graduate programs, for their sake.

The three reasons people give for wanting to pursue a doctoral degree are:

  1. Continued professional development, but they essentially want to still be IDers and they don't care about research;
  2. Continued professional development and they want to be professors (and may or may not care about research);
  3. Receiving inquiries and advice from friends and family who are not in academia; they acknowledge their loved one's mastery of a subject (aka "being wicked smaht") and ask, unassumingly, when they are going to do a PhD, because for them education is a ladder to be climbed.

Let me tackle the third one first, because I have experienced it. Back in 2010, when I finished my last master's degree, I did hear from friends, family, and colleagues. They were impressed by my accomplishments, and the natural next step, in their minds, was the doctorate (and perhaps becoming a professor). I suppose to some extent they are correct: the doctorate is a terminal degree in many fields. But I would say that most non-academics don't really know what a doctorate entails, what the degree prepares you for, and what the job market is like for holders of those degrees. They mean well, but this kind of thing makes for social pressure that is unhealthy, and perhaps exploited by certain schools.

So, interested people start out looking at PhD programs, but they don't know what they are looking for or how to compare programs, and that's where the issues become even greater. When I was shopping around for doctoral programs in early 2012, I made a spreadsheet comparing programs, locations, faculty of interest, degree requirements, and so on. After a lot of analysis, I ended up where I thought would be best for me and my goals. I don't know if prospective students who come from the ID field do this. Most probably don't, because it is a type of literacy that they may not be exposed to during their master's degree program. Some might have bought into the supposed difference between the professional EdD and the more research-y PhD, but I've written elsewhere about why I think this is bunk, so I won't repeat myself. The core idea is that all doctorates are about research, and both research and research translation/application are important, regardless of whether it's an EdD or a PhD.

Anyway, this misunderstanding (or total ignorance that the doctoral degree is about research), leads prospective students down the wrong path, and you get comments like this one (paraphrased of course):

I didn't realize that a PhD was designed to make you a researcher when I applied. My friends were like "you're smaht, apply to a PhD!!!" And, so I did! And I learned that it was about research when I got here. It was never my intention to become a researcher, yet here we are.

Another comment in the IDsphere was a bit like this:

It took me until year 3 (when all coursework was done) and my advisor sitting me down to hear: "we're training you to be a researcher! All the stuff you do around [school topic redacted] is great, but you're here to do research and hone your research skills." Well, here I was thinking that I was here to improve the lives of students that I care about! Silly me 🙄

I think the onus lies partly on the student to know what they are getting into (especially when it costs money!), but institutions of higher education also have a responsibility to prepare their master's graduates to seek out pertinent information about PhDs, to help their alumni community avoid situations like these and the siren calls of marketing folks. One such university lists the following careers as potential placements for graduates of its PhD program (list concatenated and cleaned up, since it seems to come from job titles from the Bureau of Labor Statistics):

  • Instructional Coordinators
  • Instructional Designers and Technologists
  • Training and Development Managers
  • Education Administrators
  • Distance Learning Coordinators
  • Training and Development Specialists
  • Education Teachers, Postsecondary

Literally, none of these requires a PhD! But you might not know that if you don't have a support network already in the field to let you know!

You shouldn't be in Year 2, Year 3, or Year 4 when you realize that it's all about the research!

Some comments elsewhere on the IDsphere talk about the desire to have different paths in doctoral programs. The researchers can have their cake, but PD should exist for the rest of us - or so their logic goes.  This is a relatively recent example that encapsulates this thought process:

I think we should have different paths in doctoral programs. If we did, I'd probably have finished the PhD I started; I basically finished everything except for the dissertation. A research-based dissertation doesn't interest me at all. I wanted the knowledge and experience that came with a degree. That's it.

Putting aside that the knowledge and experience of the degree are basically about doing research, this seems an odd position to take. That is, until you realize that most US doctoral programs throw 2-3 years of coursework at students before they start engaging with research, thus potentially obscuring the ultimate goal or outcome of the degree. College websites (even state colleges' these days 🙄) are pretty opaque about what the degree entails unless you know to look at a graduate catalog and work out what the degree requirements are.

In any case, my point here is that you don't need a doctoral degree to continue your education. It's not a ladder; rather, it's many branching and intersecting pathways that you can take. Yes, you won't be called a doctor once you're done, but your ultimate goal of further education will be realized. You can, for instance, pursue another master's degree, or you can pursue credentials like FHEA/SFHEA, ISTE, CPTD, CPT/CFT/CDT, CMALT, the various CompTIA certifications if you are into IT, PMP for project management, PSM or CSM for Agile, CTS if you're an IDer who does AV, SHRM for IDers in HR, and so forth. Heck, you can even go back to your alma mater and take courses as a non-matriculated student to learn more about a topic that interests you (I am doing that this semester with the Archives course I am taking). There is no dearth of places to learn and credentials to earn. They cost money, but so does a PhD. At least certification programs are short, and many require continuing PD to maintain the certification after 3 years. Depending on what you need, there are also free options. Google, Microsoft, and Apple (for example) have free training options for educators.

At the end of the day, expecting learners to magically know all of this stuff on their own is a fantasy.  I think that, as professionals in higher education, we need to prepare MA students for lifelong learning, and if they are considering a doctoral degree, we should prepare them for the realities of PhD life.

Just my Toonie's [that's a Canadian thing] worth of thoughts on the matter 😜

Your thoughts? Leave a comment :-)


Pondering the point of publishing as an "alt-ac"

Ancient Greek theater masks on wall
Image by Greg Montani from Pixabay

Okay... okay! I know, it's only been a couple of months since I defended my dissertation, and only one month since it was totally official and on my transcript, but in thinking about further research I am simultaneously filled with both excitement and dread. There are some things I want to pull out of my dissertation and polish up for an article, and there are also threads on MOOCs and lurking that I want to return to, but I feel this sense of "oof 😫" when I think about actually jumping in again. It is quite possible that I need a much longer break, and maybe an actual vacation, but this reflection on research and publishing has gotten me to ponder the point of publishing as an alt-ac.

Now that I am done with the doc program, many people ask if I'm going to pursue a tenure-track position somewhere. It's an interesting thought (neither appealing nor unappealing), but the question does make me reflect on what I do now, on my employment as an alt-ac. I enjoy what I do for work, and I enjoy teaching (as a side gig). Yes, I also enjoy research, but I think that tenuredom is broken and exploitative, so... that aspect is not appealing to me. I've heard way too many stories of toxic departments (both from staff and faculty, and across institutions) over my last 20 years in academia that I really don't want to risk finding my way into a potential snakepit. It seems like colleagues in tenured positions are entrenching around tenure but aren't thinking critically about what tenure means. Tenure is just a word. What do you get for it, and why don't others have it? Why are there so many faculty precariously employed as lecturers or adjuncts? And why are the tenured folks so snooty at times toward those other colleagues? Is this the system that we should be supporting? Anyway, I think I got a bit off course there. Let's put this plane back on the landing runway.

So, with that said, I've been pondering the value of academic research publishing for staff members. While I enjoy the process of going it alone, at times anyway, the most enjoyable research times for me have been when I've worked with others to explore ideas. Even if/when I reconnect with my thought-provocateurs from the various rhizos, and if/when we work on research projects together, I've been wondering: What value does an alt-ac get from having stuff published, beyond the sense of fiero you get from knowing that you've done the thing? If you're in a tenure-track job, it's a requirement: you get to keep your job if your peers deem your research worthy enough to reward with tenure. If you're a staff member and don't necessarily have your sights on tenure, what is the value of publishing in peer-reviewed publications? Is it really a glorious hobby? Or do people get tangible career (or other) benefits from it, sort of like a halo effect?

As an aside, there are two pieces that I still get tagged on Twitter for: my critique of "Digital Natives" and our work on Lurking (done collaboratively with friends online). It makes my day when people find our work useful, but do employers value this value-add? 🤔 When faced with limited hours outside of work to do this sort of thing, where do other alt-acs draw the line?

Your thoughts?

Cut the bull: The demise of the Baccalaureate has been greatly exaggerated


"Cut the crap" decorative image
Courtesy of Redbubble

After one too many "news" posts on IHE about plagiarism and "rigor," I decided to stop subscribing to IHE's RSS feed. Idiocies that used to be comments left on an actual news article (comments you could ignore) have now been promoted to opinion pieces on IHE. To put it quite simply, the junk-to-treasure ratio is now completely off on that site, and there is no longer any value in checking it as part of my regular news feed.

That said, there was an article back in August (wow, a month went by!) by Ray Schroeder that I wanted to respond to. The article is titled Demise of the Baccalaureate Degree, and it provokes the reader with the following lede: "Overpriced, outdated and no longer required by an increasing number of employers, is the baccalaureate in a death spiral?" Let me just say right now that this is BS. Go ahead and read it, if you'd like, and come back after that.

It's disheartening to see a leader in distance education, one with so much experience and expertise, spew the same nonsense as any 25-year-old "entrepreneur" with an idea for a mobile app. I want to tackle some of Ray's points, point by point:

Ray asks (and answers): "Are we teaching the competencies and emphases that will be required to thrive in 2025? I fear not"

The future is unknown. Even the immediate future is unknown. Hell, if someone had told me that my dream of working from home (most of the time) would come true because of a global pandemic, I would have said "shut the front door!" The best thing we can do is prepare for an uncertain future as best we can. We don't know tomorrow's required competencies. All we can do is bet that what we think tomorrow's competencies will be is what actually materializes. The best way, IMO, to be prepared is to always be learning and to instill curiosity in our learners. They need to become flexible, adaptable, lifelong learners who take their past knowledge and apply it to critically analyze all of the future's uncertainties. We need folks who won't try to push a square peg into a round hole, and that is something we can prepare learners for. Not some amorphous moving target like "20th-century knowledge" or "21st-century knowledge" or "Competencies 2025"; that target will just keep moving, and in the end nothing meaningful and actionable can come out of it. Self-directed, lifelong learning, on the other hand, is real, and you do need a historical basis to be able to actualize it.

Ray continues, "I have sat through debates as to whether foreign language courses (not foreign culture courses) should be required of every student in a time when computer-driven automatic written and oral speech translation is readily available. I find the argument for such requirements akin to the debate some decades ago, of whether college students should be allowed to use calculators. Let me be clear that I do not suggest that such courses should not be available to students, rather, that they should not be required of all students in this 21st century."

Sadly, Ray, no, those two things are not the same. Using a calculator is not the same as learning another language. As Mr. Saru said, "am I the only one who bothered to learn another language?" [YouTube link; Star Trek: Discovery]. You can do math on your own if you know a few basic building blocks. It may take time, but it can be done. Language also has building blocks, but it's much more complex. It's not formulaic (at least not all of it), and once you superimpose the layers of cultural knowledge and aspects of intertextuality, you can't just use a universal translator to do day-to-day work. You can probably get around as a tourist, but that's a far cry from the nativized/acquired competence that you get from studying languages and cultures! Certain knowledge is fundamental to lifelong learners, and the acquisition of a second language (actual acquisition, not just "take an intro to [language] and call it a day") is part of that foundation.

Another point made by Ray: "We currently document our baccalaureate degrees with transcripts that are owned and controlled by the colleges and universities. This documentation of what has been achieved is withheld if parking tickets have not been paid or other infringements have been committed. A growing movement demands that transcripts shift to blockchain delivery controlled by the students."

There is a problem with withholding (or holding hostage) a transcript because someone has unpaid library fines or parking fines. This is something my campus should do some soul-searching on. That said, the answer isn't blockchain. Blockchain has problems, lots of them. For an interesting podcast on the topic, listen to Tech Won't Save Us. The issuing authority (the university) has the responsibility of access control and verification of the credentials it issues. Along with that responsibility comes the responsibility of maintaining an evergreen means of accessing those records, even when the institution shuts its doors. When an institution closes, as we've seen in recent years, those alumni records need to be maintained by someone until they are presumably no longer needed. At my campus, for example, we maintain the Boston State College alumni records because we merged with them back in the day. If blockchain dies, as we've seen with the lackluster adoption of digital badges, the credential itself becomes meaningless because it's not verifiable. Colleges and universities that retain control of their records do maintain them and provide verification in perpetuity.

Ray also says: "This [blockchain] holds the potential for students to assemble their own transcripts with selected courses, internships, monitored experiences, projects and more from a variety of sources -- not just one university -- that can be validated by HR departments and others reviewing the transcripts."

There is nothing out there that prevents people from creating their own learner record ("transcript"). In fact, things like LinkedIn, CVs, and other means already exist to document your learning. We don't need blockchain for it. Maybe we need a new way of conceptualizing the CV, for example, but those efforts have met with resistance from the very same people who Ray says need these new things: HR departments and employers. Degreed, for instance, showed promise, but what's the actual usage? [Wikipedia link, in case Degreed is ever down]. This point also seems to have nothing to do with the bachelor's degree, and more to do with skills documentation.

Ray's other point: "Employers and students are seeking shorter credentialing than the baccalaureate; in particular they are looking for alternative credentials in the form of professional and continuing and online programs that are to the point and immediately applicable in the workforce."

While I don't disagree that people are looking for shorter credentialing schemes (at least for certain things), nothing says that the bachelor's degree is in direct competition with these. The idea that a college degree isn't required at high-tech companies (and, by extension, companies in general) is a myth. It gets brought out like the corpse in Weekend at Bernie's. Don't get me wrong, it's an appealing myth because it implies that other pathways exist (and in my mind they should exist), but actual hiring practices don't bear this out. Go on, look at the open job postings at Amazon, Apple, Google, and other major companies. I'll wait... Anyway, I'd say that lifelong learning is compatible with short credentials as people grow through various stages of their personal and professional development. I don't see this making the bachelor's degree obsolete.

Finally, Ray says: "It seems that the 'clients' of higher education -- both the students and the employers -- recognize that the baccalaureate is too long and all too often teaches dated material rather than preparing students for the future. Shorter, just-in-time sequences of courses could better address the emerging needs in the workforce and society as a whole."

Maybe employers think this, but maybe not, based on what they post as hiring requirements for open jobs. Even if the bachelor's degree takes "too long" to complete, perhaps the BA/BS simply isn't the right tool for that job! Again, Ray leans into the tired trope of the university being too slow to adapt and teaching dated materials. It's hard to argue against "dated materials" because it's predictably unspecific. You'd have to do a deep dive into the discipline to determine what is "dated," and under what rubrics and units of measurement. Different disciplines have different requirements, and different jobs have different requirements. There isn't always going to be a match between the two, nor should a college degree be simply a ticket to enter a job. Employers are able to control who they hire, and they choose college grads. The college degree is just your "starter pack" for work life and civic life. If your employer needs you to do something specific, they need to invest in employee training and talent development! The cost of this type of training should not be laid on students' shoulders.

Just my 2c. Adieu, IHE 🤩


Pondering on counteroffers

Talent Development, manager supporting employees in climbing the ladder
I don't often get to put on my HR/management/talent development hat, but I came across an interesting tweet this past week that got me thinking. I started to write a reply to it, but it got too lengthy (so here we are, back on the blog 😁). A week or so ago I met with some former colleagues and mentors (folks who've already retired), which also reminded me of the mini scramble when I resigned from a position years ago and counteroffers were considered.

The text of the tweet (for posterity) is as follows:

Resigned today.

Current employer is scrambling to counter. 

This is my PSA to those with the power to promote - don't wait until your best employees want to leave to give them an offer that shows you value them.

I couldn't agree more with the sentiments expressed by the tweet.  When someone resigns they most likely have another job already lined up and are least likely to take you up on your counteroffer. At this point, you've already lost your valued employee.  I assume they are valued because you bothered to make a counteroffer.

Personally, I think that counteroffers don't work because they focus on the monetary aspect of compensation. At best, a counteroffer is a temporary bandage: if someone takes the offer and stays, I would bet good money that they will leave the company within a few years, and maybe even start looking elsewhere within a few quarters. Monetary compensation shouldn't be discounted (it is important); however, other things matter at a job as well. Things such as flex time, working from home, vacation time (and being able to take it), respect for one's skills and what they bring to the team, and possibly things like professional development opportunities or career ladders.

Focusing on the monetary, in addition to not addressing corporate culture, has another hurdle: timing. Most organizations (that I know of) can't adjust your salary right away; it needs to be budgeted. The old saying is true: a bird in the hand is worth two in the bush. Even if the counteroffer is higher than what you got elsewhere, you still have to wait for it, and it's still a promise at your current company, not a concrete fact reflected in your paystub. You might not even get it if you stay, or they might drag things out, delaying your higher earnings. If you leave now, you already know what your compensation and benefits will be. And if the company can afford to increase your salary right now and beat that other offer, it raises the question: why did they wait until you resigned to do it? 🤔 It doesn't paint a good picture for long-term retention at that company.

My recent reunion with old colleagues (from my first "real" job) got me thinking back on the reasons I resigned and started working elsewhere. After all, I liked the work and the people I worked with. For me, it was a combination of compensation, benefits, and respect. The new job (in the same org that I am in now) actually paid more. Not a whole lot more, but it paid more. Prior to leaving my first department, I had submitted paperwork for a promotion, along with a rationale for it. This promotion would have been roughly equivalent in pay to the job offer I had, and it would have given me equal vacation time. The request for review was denied by HR after a long period of radio silence. The rationale given by HR was that there would soon be a wholesale IT revaluation (I was part of the broader IT department at the time) and everyone's job would be looked at systematically for promotion and re-org purposes. Just hang in there! It sounded fine at the time, but people in the know told me that my promotion was nixed because my supervisor didn't want to have people reporting to him who were all at different levels and jobs, and he didn't want to give everyone a promotion ๐Ÿคทโ€โ™‚๏ธ. Subsequently, the CIO who supposedly made this proclamation left, no wholesale review happened, and I don't think any of my colleagues ever got their jobs reviewed for promotion purposes. I was already gone by then.

Some of the benefits I would have received via the promotion included more vacation time. Having relatives abroad means that I would have benefitted from more paid time off so that I could spend more time with them. Up until that point, I took a vacation every 2 years so that I could bank the time and take it all in a 4-week block. Sadly, even this was disallowed (after my first 4-week block was complete one year). Managing summer employee schedules was a little too much for Mr. Supervisor, so people were not allowed to take more than a week off at a time. I am sure this impacted my co-workers and was probably in violation of collective bargaining agreements. Younger me didn't know any better.

The final thing for me was the job ceiling. After 7 years in the department (4 years as benefitted), I had reached a point where I hit the ceiling. No professional development was available (unless I really fought hard for it, and that didn't win me friends), and whatever professional development did come my way, I wouldn't have an opportunity to apply it, and definitely not in a way that led to a promotion. I had paid for my own PD and my own certifications, but that got me nowhere. When I resigned, I did so by email and gave my two weeks' notice, just before Christmas break. I had already accepted a position in another area of IT. Mr. Supervisor was surprised (unsurprising, given how little he communicated with us), and his supervisor asked if I'd accept a counter (which would have had to wait 6-7 months for budgetary reasons anyway, so it was more of a promise ๐Ÿ˜‰, and see above for an example of broken promises). I would have considered higher monetary compensation (compared to the offer I got from where I was going) if the pay increase were immediate (as in, HR drafting up the paperwork while I was on the phone with said supervisor), but the erosion of trust and the career ceiling meant that I would be looking elsewhere soon enough anyway.

Once your mind is made up to leave, a monetary counter isn't enough, and organizations need to pay attention to the culture they are fostering in order to develop and retain talent. While we live in a capitalist society where monetary means are used to entice workers to stay or go elsewhere, I think the current pandemic has also shone a spotlight on other things that are important to employees, things like flex-time and WFH. Responsive organizations listen to their employees and foster inclusive environments with open communication. Those that don't ultimately lose out on talent.

Your thoughts?
