Club Admiralty

v7.0 - moving along, a point increase at a time

Multilitteratus Incognitus

Traversing the path of the doctoral degree

Graduate admissions process pondering


This post has been brewing in my head for a couple of years now. Since I am waiting for IRB clearance for my dissertation, I thought it would be a good time to jot things down and see what others think.  I tend to have people in my network who either teach or work in some sort of instructional design (or faculty development) capacity.  I don't (think I) know many people on the higher education administration side of things to discuss these kinds of topics with (so I may just be speaking to no one 😝).

Anyway, one thing that's been gnawing at me for the past number of years is how one enters graduate programs.  I'll focus more on the master's level for a few reasons: I manage an MA program, I teach for an MEd program, and from observation I've seen that master's programs probably don't have as many administrative constraints around things like [virtual] classroom space or a tightly integrated cohort model [doctoral programs tend to be more tightly interconnected], and most master's programs allow non-matriculated students to take courses for fun or interest (doctoral programs typically do not).

In the US, a typical application to a graduate program requires the following:
  • An application (usually electronic these days)
  • An application fee (my institution has it at $60 for domestic students)
  • A statement of purpose (why do you want to apply to the program, how does it meet your goals, etc.)
  • 2-3 letters of recommendation from academic referees (former professors)
  • Your transcript indicating that you completed an undergrad program (sometimes ALL transcripts, even if you decided to take ENGL501 for fun somewhere...).
  • Some sort of standardized test score (though this is not as common these days)
I've done this song and dance four times since 2004.  The first time I did it (for my MBA), I didn't think much of it.  The second time (for my MS) was a little easier, albeit annoying because I was applying to the college I was already a student in.  The third and fourth times (MA and MEd degrees) I reused letters of recommendation and phoned it in: I actually used one application to apply to two programs, and one statement of intent that covered both.  I almost did it a fifth time for another MA degree, but they required GRE scores to prove that I could do graduate work (after having successfully completed 4 masters programs, mind you...), which is when I decided not to apply to that program.

As part of my day-to-day work, the admissions process is one of those things I manage, and I'm convinced that the entire apparatus is mostly an arcane and archaic gatekeeping mechanism, built around the constraints of "p-learning" (Dron, 2016) and program scarcity (when MA programs only really accepted a few students each year).   The admissions process doesn't feel like an enabling mechanism.

Take letters of recommendation, for example. If you've attended a western institution and you ask a former professor for a letter of recommendation, the letter they provide will mostly be good. No one decent will agree to write a recommendation if they are going to evaluate you negatively. Hence, the asking for, and receiving of, a recommendation becomes a ritual: you ask former professors to vouch for you, knowing ahead of time that, if they agree, what they write will most likely be glowing.  It privileges people who've already built those connections with some professors, and it is a colossal waste of time for the referee. It also privileges western applicants because (in my experience) non-western referees tend to write sparse recommendations that simply reference a transcript ("AK was a good student, got A's in my class").  In cases where someone doesn't have academic recommendations (for whatever reason), a common tactic is to take a course or two in the department (as a non-matriculated student) to obtain those letters.  If they've taken two courses and gotten good grades (however you define "good grades"), then the recommendations are pointless (especially since those who write the recommendations are often the same people who read them for admissions🤷).

Another thing that I find interesting is the requirement for a BA (sometimes the requirement is just a BA, even if it's not in the discipline).  A bachelor's degree can be verified by transcript, but I do find it odd that some schools require *every* transcript. So if someone has earned an Associate's degree in automotive engineering (school 1) and a Bachelor's in sociology (school 2), and wants to apply to an MA program at school 3 (let's say in a sociology program), that school could ask for the transcript from the AA degree even though it's not relevant, just in case the applicant has anything to hide🙄. If the applicant has demonstrated that they can do academic work by showing a transcript with a completed degree, why request everything?

Related to this, here is a hypothetical (I haven't seen or heard of this happening in real life, but it theoretically could): students are able to take graduate courses as non-matriculated students.  As a matter of fact, learners can string together all of the required courses for a degree as non-degree students and still not earn the MA, either because they lack a BA, or because they took all their courses in non-degree status and we can't count most of them if they matriculate near the end of their studies.  I am actually not sure where I stand on this.  On the one hand, the MA is positioned as an advanced degree in something (which assumes an introductory degree in it?), but on the other hand, if you can do the work, and you demonstrated that you can by actually doing the work, why should you be prevented from earning that degree just because you don't have the previous one in the series? 🤔

The graduate admissions protocols (at least for master's programs) seem like unnecessary gatekeeping these days.  Because of all of the documentation required, and the process of reviewing the materials submitted, admissions has defined calendar deadlines each year, which can shut people out if they happen to miss them.  Personally, I'd like to see more open admissions processes start to take place.  Perhaps some introductory and required courses could be open to anyone interested in taking them (I'll pick a random number and say that this represents 40% of the curriculum), and then whoever is interested in continuing their studies can submit a brief application and a reflective essay/statement of purpose, and the faculty can make a more informed decision on formal acceptance given their own experiences in class with the applicant. This would cut out the unnecessary transcript evaluation, letters of recommendation, and standardized scores (and all the processing and vetting that goes on with those).

Your thoughts?

Technology will save us all!


...or wait... will it?

It's been a while since I wrote something on here†, and in all honesty, I thought about taking a sabbatical from blogging to focus on dissertation-related matters.  However, I really hate the current practice of threading on twitter, where someone writes 10, 20, 30, or 40 tweets in a thread.  We've even invented an app to make these threads more readable.  I can't roll my eyes hard enough at this because it's a solution for a problem we shouldn't have.  We have long-form means of communicating - they are called blogs.  But anyway - I'll cease my "get off my lawn"-ness and move on to the point.  Now, where was I?  Oh yeah... I wanted to respond to something I saw on twitter, but I didn't want to just create a stupidly long thread.

So, in case you have not been paying attention, there is a bit of a global health scare going on, namely COVID-19 (or Coronavirus as the media calls it). It's gotten to the point where cities, states, or even whole regions are under quarantine.

Screenshot of WHO COVID-19 tracker
The question that comes to our mind, as education professionals, is this: well, what happens to school?  And people tend to respond by saying "put it online! problem solved!"  Well... the problem is not solved.  There is no magic fairy dust that will make this a "turnkey" solution, or any other marketing jargon that will make this seamless. I've been seeing a whole lot of nonsense tweets about this over the last few days as more and more universities announce that they are going online...for now.  I've (snarkily) written responses like "I think I rolled my eyes so hard I experienced whiplash...🙄" to both technoloving and technohating tweets. But I think it's important to be a little more detailed in my 🙄 reaction to some of these so that we can have a constructive conversation around this topic, and so that I don't just come off as a snarky teenager saying "OK, boomer".

So a fellow colleague tweeted the following:
Hello #MOOC platform providers  @edXOnline @coursera @udacity @udemy @FutureLearn @CanvasLMS and others: many higher education institutions are in need of scalable technologies to serve the needs of students and teachers in times of the #COVID19 #coronavirus crisis. Can you help?

Canvas may be the exception here, seeing as they have a "regular" LMS that they also use for their Canvas Network MOOC platform, but most MOOC platforms are awful. I say this as a user of them!  Yes, I do enjoy the free lifelong learning content that they provide‡, but those platforms have been created with very specific UX design constraints in mind. Furthermore, many appear to rely on pre-recorded videos for their pedagogical approach, something which really won't mesh well with the short timeframes that we might be experiencing in the coming weeks♠.  There is also an issue in thinking that a technology solutions provider will be your best bet as a subject-expert contact to help your institution move online.  They sell a product.  A product with specific design and pedagogical constraints, and - as we've seen recently - with potentially murky data practices.  Your go-to shouldn't be a technology provider to solve your issues.  Your go-to should be the staff that you employ at your university: your instructional designers, systems architects, and IT/IS people. They are the ones who know your needs, and they can figure out what the minimally viable product is.  If it turns out that edX is the right platform for you...then guess what?  It's open source; you can run it on your own!  The same is true of LMSs like Moodle and Sakai, which are not MOOC-related and have been used to deliver courses at a distance for 18 years!

Another colleague wrote:
Taking college courses temporarily online as an emergency measure to provide minimally acceptable continuity of instruction in response to a pandemic is not an admission that MOOCs are a good or even acceptable substitute for in-person teaching.

The three fallacies here are as follows:

  • You are conflating MOOCs with distance learning broadly.
  • You are assuming that MOOCs are just "lousy products".
  • You are putting on-campus courses on a pedestal.


MOOCs being conflated with any (and all) forms of distance learning has been happening since xMOOCs hit the market in 2011/2012. They are not one and the same.  MOOCs are a form of distance learning, but they are not the form of distance learning.  MOOCs are also not a bad product.  You always have to go back and ask "what is our goal?" and even then "what is this good for?"  The adhesive used on post-it notes is a lousy product.  Yes, you heard that right.  It's a lousy product because the goal was to develop a super-strong adhesive. However, someone saw this product and created an ingenious use for it, and something that couldn't have existed without the lousy product was created♥. MOOCs have their purpose. It may not be the lofty goal of democratizing education¤ that we kept hearing about back in 2012, but that doesn't mean that they are failures in totality.

On another track, many colleagues have been posting about this outbreak being the perfect opportunity for institutions to embrace online learning, and about how this global turn of events will (magically) make people see the light. The unspoken assumption is that attitudes will change, and that long-term practices will change with them.  This is completely and utterly false, and it's exemplified by the tweet above.  Vanguards of the "campus is best for learning" camp won't experience an attitudinal change en masse because of this turn of events.  They'll most likely hold their metaphorical nose, get through it, and then go back to their established practices.  Why?  Many reasons§, but here are the highlights IMO:

Attitudinal change requires an open mind - I don't think most campus faculty have that when it comes to pedagogy (sorry!). This lack of creativity, I would say, comes from a lack of pedagogical training.  Doctoral programs prepare you to research, and teaching is always secondary (or even tertiary!).  It seems like many doctoral programs just drop people into teaching situations and have them sink or swim (pretty stressful, if you ask me!).  So what happens? Those doctoral students rely on mimicry - doing what they've seen done unto them in the classroom.  Maybe some will break through this cycle and experiment with pedagogy, but that's not a given. And, when faculty are hired, lots of attention is paid to attending conferences and publishing, but little (if any) to teaching PD! So, previous behavior and belief patterns are reinforced through the pre-tenure period¶ and the post-tenure period∞.  I don't need to see the outcome of the coronavirus to know that teaching faculty with these attitudes will use distance learning like a rented car, and when their ride is back from the shop, they will never think about its affordances (and the learners who might need online learning) again...or at least until the next emergency.

Anyway - to wrap this up, one voice that is conspicuously absent in all this is the voice of staff members.  Staff will be called upon to support learners at a distance, and/or faculty who will (maybe, possibly, probably) be teaching online for a little while.  What is their role in all this?  How are they supported to do their work, and what are their thoughts and needs in the process?  The university is a complex organism, but only faculty are seen as valuable stakeholders here🙄. This attitude needs to change if we are to have productive solutions and discussions when it comes to emergencies.


thoughts? comments?


Notes and Marginalia:
† hey, this is starting to sound like a confessional...let's see where it goes...
‡ I am currently signed up for 2 MOOCs on FutureLearn and 1 on EdX
♠ I'd also argue that Udemy is more of a self-paced eLearning platform and not a MOOC LMS...but that's a whole other discussion.
♥ and used all over the world in offices today
¤ personally I think this goal was overstated as people got swept up in the MOOC fever and institutional FOMO.  We might be seeing another kind of FOMO here with this coronavirus.
§ and probably best suited for a separate blog post
¶ where you might be on emergency-mode all the time while you're attempting to get tenure
∞ if your institution hasn't spent too much time fretting about your teaching until now, why would they do it in the future?


Letters of recommendation - what's up with that?

It's been a while since I've blogged, or at least it really feels like it.  I've had my nose stuck in (virtual) books trying to get through my literature review - but more on that in some other blog post. I came across an article on InsideHigherEd this past week asking whether or not letters of recommendation are really necessary. My most immediate context is admissions, given that that's part of my work at the university, but the people who gave their two cents also mentioned something I had not considered: academic jobs. I won't rehash the opinions of the people who wrote for the article, but I will add my own thoughts, mostly from a graduate admissions perspective. I don't have a fully formed opinion on letters of recommendation for employment purposes, but I'll add my two cents as a prospective hire (in a few years, when I might be done with my EdD :p).

For admission to a graduate course of study, be it a master's program, a PhD program, or even a certificate program, I personally don't see much value in letters of recommendation any longer.  My point of view is framed from the perspective of a student, an instructor, and a program administrator.  When I was applying for my first master's degree I bought into the rationale given to me for letters of recommendation: former professors can provide the admissions committee qualitative information about you as a learner that a transcript cannot provide.  This is all fine and dandy, and for me it worked out: I was working on campus as an undergraduate student, and I had some professors who I had for more than one course and who were able to write letters of recommendation.  This was a privilege that I had that other students may not have had.  For my second master's I was applying to the same college, and to a new program of the college that was looking for students, so getting recommendations wasn't that big of a deal.  Once I finished my second master's, I really didn't want to deal with more solicitations for letters of recommendation - I started to feel odd about going back to the regular well of people for recommendations.

So, I applied to two programs concurrently so that I could write one statement, and the letters of recommendation could pull double duty.  After I finished my last two master's degrees I took some time off from regular, "regimented" school and programs and focused on MOOCs.  Going back to earn an EdD posed some issues as far as recommendations go.  I had previously applied to a PhD program at my university (at the college in which I earned two master's! - never heard a final decision on my application, by the way), and by the time I wanted to apply to Athabasca I felt that the well had run dry for recommendations.  Former professors still gave me recommendations, but I kind of feel I was taking advantage of their kindness by asking for a recommendation for yet another degree program I wanted to pursue (don't judge, at least I complete my degree programs haha 😜).  Not that I am thinking a ton past my completion of the EdD, but should I want to pursue a regimented course of study in the future (degree or certificate program), the recommendations will be an issue; not because I can't get them, but because I feel bad about asking for them - after all, I am asking someone to volunteer their time to give me a recommendation when my academic record should suffice. This is how I feel about the GRE and other entrance tests, by the way.  If you've completed undergraduate studies then the GRE is pointless - you can do academic work.  If you are unsure of the academic capabilities of applicants, accept them provisionally.  Just my two cents.

Another lens I view this through is the administrative one.  Asking for letters of recommendation, and subsequently receiving them (or not), requires time.  It requires time from the student (especially in tracking down referees if they don't submit things in time), it requires processing time from admissions departments, and it requires reading time on the part of committees who review applications. When a system takes that much time and effort, you have to ask what the benefit, or net positive gain, is.  Going back to the story I was told - the qualitative complement to the transcript, basically - it does make sense in theory, but in practice... not so much.

While I don't make decisions on applications that come to my department for review, I sneak a peek at materials that come in because I need to process them.  What I've noticed is that, by and large, (1) recommendations are uneven, and (2) they tend to be the same, more or less, just with different names.  The unevenness is partially cultural in nature.  If you get a recommendation from someone employed at a western institution you tend to get (more or less) what you seek.  However, non-western colleagues don't use the recommendation system, so for them a recommendation is just an affirmation that the student was indeed in their class, in the specific semester, and that from what they remember they performed well.  The "basically the same" aspect of recommendations runs into the same problem as non-western recommendations; that is, recommendations basically boil down to: student was in class, they performed well, so accept them.  It just turns out that western colleagues are more verbose in their recommendations, so they happen to add in some anecdotes of your awesomeness as a candidate, but even those anecdotes tend to run along the same wavelength most of the time: asked interesting questions in class, was the first to post in forums, engaged fellow classmates, submitted assignments early, etc.  From an administrative perspective there is (so far as I know) no background check on the folks providing recommendations, so we are taking what they write in good faith.

Finally, as an instructor, I am lucky, in a sense, that I haven't had to write a ton of recommendations.  I've done so a couple of times, but after a few original recommendations I've basically gone back to the "awesome student, accept them, here are a couple of anecdotes" formula, because that's life; we're not living on Lake Wobegon. I'd gladly give a recommendation to former students who did well in my classes, but it's hard not to feel like I am writing a generic letter sometimes. So why spend time writing something that feels like a template letter if I am not providing much value to the system?

In short, recommendations for admission add no value while taking away time and resources from other areas.

In terms of letters of recommendation for academic employment, on a purely theoretical basis I'd say that they are pointless too, both for reasons articulated in the IHE commentary piece and for a reason similar to graduate program admissions: the genericness aspect.  I think having some references is fine, but I think a quick conversation (or heck, a survey-style questionnaire) would be preferable to a letter. The reason I think letters aren't that useful in hiring decisions is the same reason no one gives recommendations anymore (for us regular plebes getting work): people sue if they get wind that they got a bad recommendation. Generally speaking, no one will agree to give you a letter of recommendation (or reference) if they can't give you positive reviews, and HR departments just confirm dates of employment these days.  Nothing more, nothing less; otherwise they risk a lawsuit. So, if you're not getting much information about the candidate, and if the information is skewed toward the positive (because that's how the system works), then is the information you're getting valuable?  I'd say no.

So, what are your thoughts?

Campus deadzones, and creepy hallways: where did everyone go?

Found image on Google
(not actually a photo of me)
Happy Friday dear readers! (umm...anyone still there?  I swear! I am alive! 😆)

I've been attempting to write a blog post all week (and trying to do the 10 minutes of writing per day), but I've been failing on that account...I guess Fridays are a better day as things wind down from the week.  In any case, there is an article from the Chronicle that's been on my mind this week titled "Our Hallways are too quiet". Our department chair sent this to us (everyone in the department) as a thought piece, perhaps something to ponder and discuss in the fall - probably because our department is also like the department that is described in the article.

I had a variety of cognitive and emotional processes go off, and gears grinding, while I was reading this.  I actually hadn't noticed that the author was from MIT...which only recently "discovered" online learning (like Columbus discovering the New World).  Yes, I am a little snarky, but I also think that your frame of reference is really important.  If you are a bricks-and-mortar institution, what you consider "community" might look different from an institution that is focused on distance education (or at least has a substantial DE component).  But I think I am getting ahead of myself.  Let me just say this:  My job title is "Online Program Manager" - as in the person who runs the online components of a specific MA program.  Having been on campus for close to 20 years now, in a variety of roles, I can see both sides.  I think this particular article is really biased, in ways that its author doesn't even get.

Let's start with this:
Entire departments can seem like dead zones, and whole days can pass with only a glimpse of a faculty member as someone comes to campus to meet a student, attend a meeting, or teach a class. The halls are eerily quiet. Students, having figured this out, are also absent. Only the staff are present.
This excerpt, as well as the rest of the article, is very faculty-centric.  As if the faculty (or this particular faculty member, anyway) are the only ones who suffer any consequences from creepy hallways.  In my most recent job (headed into my 6th year soon!), and my first in an academic department, I've experienced the demoralization that comes with the absence of colleagues.  In all of my other jobs on campus I've always had colleagues around (with the exception of vacations and such), whereas in an academic department I didn't (don't) always see people.  In my induction period (when I was getting the lay of the land and doing a SWOT analysis of the program I was managing so I could be more effective), Mondays through Thursdays I'd at least see my fellow program managers and faculty here and there, but on Fridays it almost felt like being in the movie I Am Legend.  Granted, this didn't bother me back then because there were a lot of paper records to go through and make heads and tails of.  Being busy meant that I didn't really mind being alone.  Once all the paper was organized and made sense of, and work could be done remotely, the big question that comes to mind is this:  Well, if I can do my work remotely, and I don't have to deal with the x-hour commute, why would I need to go in?  Especially for someone who manages a distance learning program.  If one group of employees (faculty) can work remotely (effectively), why not another group whose job duties are conducive to it?  I do agree with one point made above: students, having figured out that faculty aren't there, are also not there; but there is a big caveat here: who are your students? Students in my department are (by and large) working adults, so even if faculty were around it doesn't mean we'd suddenly have students sitting around in semi-circles, drinking their dunkies coffee (local affectionate term for Dunkin' Donuts) and discussing Derrida.  If you think that way, you're living in a fantasy.  Student demographics matter.

Goin' on to the next point. The author writes about faculty avoiding the office for a variety of old-fashioned reasons, such as not being able to get work done, avoiding feuds, and avoiding time-sinks like watercooler talk, but then she turns her attention to the perennial foe: technology!
A big reason for decreased faculty presence in their campus offices is technology. Networked computers that allow one to write anywhere also allow us to have conversations with students and colleagues that used to take place in person. Creating new course materials and ordering books is easily done online. Cloud software has made pretty much all our work processes easily done from home, a vacation cabin, a foreign conference hotel. For many scholars, this has been a very liberating occurrence, giving them wondrous flexibility.
Pardon me, I don't know you, but I call 💀💢🐄💩😡 on this argument.  Yes, technology has facilitated certain efficiencies, like not having to fill out a form in triplicate, or not having to wait overnight for a journal article query that only returns the title and abstract of potentially relevant articles.  Technology has not caused faculty to not want to come to the office.  Other organizational factors play a major role in the day-to-day decisions on whether or not to work remotely.  When research productivity is sought more, people will do what they need to do to be more productive in their research.  If community engagement, service, teaching, or other aspects of the professoriate are valued more, then people will gravitate toward those.  It basically comes down to incentives, and when there is little incentive to be on campus to meet those objectives, then you will undertake them at a place that is most convenient for you.  I think a lot has to do with the expectations set forth by the institution, the institutional culture, and by extension the departmental culture.  Sure, you can have a department chair (the head honcho in an academic department) mandate that everyone (yes, including faculty) has to be there 3 days per week, and put in at least 10 hours of 'face time' into the department during regular business hours (9-5).  That's really only about 3 hours per day. Does 3 hours per day really build community?  Nope.  Does 3 hours per day guarantee that people will be there on those same days and hours?  Nope.  This is the equivalent of butts in seats, for no good reason.  It's as anachronistic as forcing students to endure a long lecture just because you haven't thought through your pedagogies.  First you determine what your root goal is (and no, more face time isn't a worthy goal), and then you hatch a plan to get there, while at the same time taking into consideration the various local variables, norms, and expectations (heck, maybe those need some rethinking too!).

Every time I hear about technology as the "big bad" I am reminded of the rebooted (and cancelled) ThunderCats.  From the fan wiki article (with my own annotations in brackets):
Most citizens [of Thundera] abhorred technology, denying the existence of machinery entirely and leaving thoughts of such things as fairy tales. This belief was a major contributing factor to their destruction as the lizards [their enemy] attacked them with advanced bipedal war machines Warbots while the ThunderCats fought with bows and arrows.
Just an interesting side-trip - take it as you will 😂

Anyway, moving along, finally, I see a conflation of the sense of community with face time, and they are not the same thing.  The author writes:
Some would argue that worrying about departmental community is ridiculous. After all, professors aren’t hired or promoted on the basis of departmental relationships, or civic engagement, and most faculty members desperately need quiet time in which to do research and write. True enough. As my colleague, Sherry Turkle, has argued: Conversation matters. Personal contact matters. It is very hard to build relationships with people we do not see in person, and such relationships are the bedrock of so much else that matters on any campus.
I think community is important.  However, just because someone is not in their office at the same time YOU are in your office doesn't mean that you can't have community.  And just because you're not meeting face to face doesn't mean that you aren't communicating.  And just because you aren't meeting face to face doesn't mean that you aren't having personal contact! I've had lots of meaningful conversations, and personal contact, with my many distance friends, family, and colleagues over the years.  From my doctoral cohort, to vconnecting friends and colleagues (sorry I've been a ghost - dissertation is sucking my mental energy), to colleagues who are geographically dispersed.  Every time I hear of Sherry Turkle I can't help but roll my eyes. Yes, face to face is nice.  Yes, I like face to face sometimes, but face to face ain't the end-all be-all of conversations, connections, communities, and work.  Yes, we do need community.  Without it we are just a loosely joined confederation of people maybe striving toward a common goal (maybe not), but with community we become stronger, and we get smarter.  But community can be achieved in different ways (look at vconnecting, for example).

To wrap up, I am reminded of a joke, or something that one of my mentors (Pat Fahy) kept saying: "It's the parking, stupid!"  This was the response to the question "why do students pursue distance education?"  Of course, this is just one piece of the puzzle; others being things like mobility issues, health issues, childcare, elder-care, working two (or more) jobs, and so on.  I think in an era where we are offering some really great distance education programs (oh yeah...welcome to the party, MIT), and we've seriously considered what makes a good online program for our disciplines in order to get here, it would behoove us to also look at what makes our jobs effective and how we can effectively build communities of various modalities.  Forcing grown human beings to have face time so that they form community is the equivalent of forcing your kids to stay with "weird uncle mike" or grandma because you feel like your kids need a connection with the rest of your family, when you haven't bothered making them part of your family in the day-to-day, except on holidays.  Both kids and adults resent such forced actions.  We can do better.  Just sayin'.

OK, now that I've ranted on 😏 - what do you think? 😃



MOOCs as admissions considerations


It's been a while since I've sat down to blog (with the exception of my brief postings last week).  I guess I've had my nose firmly planted in books (physical and digital) trying to get through the reading components of my dissertation proposal so I can sit down and write. I tend to find (for me, anyway) that having a bit more of a complete picture in my head of what I want to write about cuts down on a ton of edits down the road. Because of this I also haven't really engaged a lot with my learning community (MOOCs and LOOMs alike).

That said, a recent work encounter broke my blogging slumber and has pulled me from my dissertation a bit.  In my day job one of my roles is to answer questions about our department's program (what is applied linguistics, anyway? j/k 😆), and that includes questions about admissions. While we prefer applicants with a background in linguistics or a related field such as languages (French, Italian, Spanish, Arabic, Greek, whatever language and literature background), we do accept others who did their BA in something different.  Personally, I think that the language is archaic and comes from a time when the mission and vision of our department were slightly different, but that's neither here nor there.  My point is that when there are people interested in our program who come from a background other than languages (such as business, or computer science, for example), the question always becomes: how can I better prepare for this program and ensure I get admitted?  Basically, it's about the applicant showing some sort of connection between their interest in our program and what they did, or want to do.

In the past couple of years MOOCs have come up!  Even though I've been steeped in MOOCs for the past six years, I didn't really think others were.  Furthermore, it amazes me how much value others place in MOOCs, and in the MOOCs that they have taken. Personally, while I like taking xMOOCs (I just signed up for about 10 of them recently through edX and FutureLearn, and I am trying to do one on Canvas on collaborative ICT...), I don't know if I would ever mention my exploits in the MOOC arena to others (except maybe through my blog, or through a group of close MOOC friends).  My rationale for not sharing my learning is this:  While I personally derive value from what I do in MOOCs (it expands my own horizons, even if I am just viewing some videos), I also know that assessments are a little forced in xMOOCs.  Simple MCEs or short-answer peer-graded assignments don't really point toward mastery of something.  In ye olde days of xMOOCs the certificates of participation were free, provided that you completed the MOOC in its original run.  Now xMOOCs require you to pay for a certificate of participation, and I personally don't see any value in that.  Even if you pay for a verified certificate, where someone proctors you while you take the MCEs, what does that really mean?  That you can take a test?

This all got me thinking about the potential use of MOOCs for application purposes.  I personally think that taking (and completing) a MOOC shows interest in the topic, so that's a positive for the applicant, but it doesn't necessarily show any mastery. So, while useful, it definitely has its limitations.  The certificates don't really mean much to me for my current work, and yes - I do hold on to the certs that I received while they were still free (😉), but I don't see additional value in the ones that people get these days in exchange for cash.

What do you think? Is there a value to students doing MOOCs with the aim of getting into a specific part of higher ed?

Curriculum Management as a Supply Chain issue?

I don't often write about my dayjob - as manager of an academic program. There are probably a lot of interesting and nuanced things to study academically in higher education administration and non-profit management, things that I also find interesting (from time to time) - but I tend to spend most of my time looking at EdTech, pedagogy, language learning, and the like (more so than higher ed administration).

Recently I saw a blog post from a friend who is also pursuing a PhD that made me put on my management academician thinking cap, and it got me in a reflecting mood as far as my dayjob goes. It also brought back fond memories of being an MBA student in a supply-chain management course. The successful running of an academic program is a complex dance between various actors external to the academic department, such as the admissions office, the registrar's office, the bursar's office, and the room scheduling office (if your program is on-campus). This is in addition to internal actors such as curriculum committees, admissions committees, faculty, advisors, and the students (I think of students as being "in" the department).  If we look at it from an Actor Network Theory perspective, there are also those devices that facilitate (or put up roadblocks for) our efforts. One big actor in this network for me is Google Docs, given that I use it to plan for a lot of things.

I've been in my position for over four years now and it's been quite an interesting, and educational, experience. Some of our faculty are tenured, and some are adjunct lecturers - although they've been with us so long that we really think of them as one of us.  One thing that has stood out to me, comparing the then with the now, is how important those connections are, and the domino effect of the supply chain.  When I was a student studying supply chain management it was fascinating to see how changes in factory output, the connections between factories, the warehouses, stores, and pricing made a huge systemic difference in what was happening in the end‡.

When I started working there had been a gap in that position for the program, which meant that there wasn't really any day-to-day maintenance happening (nor was there systematic improvement). One of the things that had lagged behind (seriously behind) was communications with current students and communications with prospective students.  That for me was a huge domino that had already fallen, and we were seeing its effects - lower than average applications. Why would one apply to a program if there isn't good communication?  Effective, and timely, communication is important not only with your prospective students, but also your current ones, in order to ensure that prospective students find the right program for them (even if it isn't yours) and that current students are on a steady path to graduation. An internal policy of 2 business days (at most) to respond to inquiries and emails seems to have solved that issue.  Email also became the preferred method of contact. This doesn't please everyone (especially those who like talking on the phone), but with limited resources it's the most efficient.  Phone conversations are available for more in-depth and tricky subjects, not "routine" questions.

Another area where I see the supply chain as much more prevalent is course registration.  Course registration is probably the major cause of departmental firefighting (we're all familiar with putting out fires, right?). It was also a bottleneck for hiring and assigning courses to adjuncts.  In a nutshell, prior to my arrival†, students were able to sign themselves up for classes.  This made course sign-up the student's responsibility. There is something empowering about signing up for courses, but even with a late registration fee (if students registered after a certain date), many students would simply wait to register. This meant that we didn't know if some courses would run (you need to have a minimum number of students in each course to make it viable). Not knowing if something would run also meant that you couldn't commit to assigning specific courses to our adjunct faculty, which meant that they didn't have access to Blackboard and the resources they needed to plan effectively for their courses. It also meant that for the students who did sign up early, there might be a mad dash near the beginning of the semester to change courses if their courses were axed.  Lots of fires to put out right there!

The solution, which seems to work, is to prevent students from enrolling themselves and have us (in the office) enroll them in courses, and to make sure, through advising, that a month before the current semester ends students have spoken to their advisor and we know what courses they'd like to be signed up for. This gatekeeping activity has been pretty successful thus far.  About 85% of students see their advisor and are queued up for fall courses 45 days before the semester ends, and around 10 days before the late fee kicks in, 94% are signed up.  Not too shabby if I do say so myself!  Having an (almost) 95% completion rate also means that our faculty have a better idea of what they are teaching in the fall so that they can prep over the summer (if they'd like), and the college and HR departments can start processing their fall contracts earlier than before, since we have confirmed enrollments. These contracts also mean that instructors gain access to university resources that they need - such as email access!

Supply chain management may seem too impersonal for a higher education context, but I think that it has applicability.  I wonder what others in higher education admin think about this?




NOTES:
‡ I also hated the grading schema for that class, as I've probably written in this blog before, but the class was pretty interesting all things considered.
† I didn't singlehandedly do this - it was a team effort, but I did initiate a lot of this

Measuring Learning

I know... I know... This is perhaps a tricky question to answer, but bear with me here. Perhaps the answer to this question of "how do we measure learning" is "well, d'uh! with some sort of test or assessment".  This might be true in one-off training, where you visibly see employees either performing or not performing, but when it comes to a higher education context, what does it mean to have been badged, branded, certified (whatever the term you use) as having had an education?  In higher education we measure "learning" through a credit hour system. But what the heck does that mean? Well, I know what it means and its history, but how the heck does that connect to learning?

There are three critical incidents that got me thinking about this today.  First is a conversation I had with a prospective student for my day-job. The person who was inquiring about our program was asking how many weeks our courses run each semester.  When I informed them that our programs run on a semester basis and run for 13 weeks, this person was perplexed as to why the courses were 3 credits and not 5. This in turn perplexed me, which opened the door for an interesting mental exercise.  The potential student, it turns out, is used to an 8-week course structure for 3 credits, and so they rightfully assumed that all schools do the same thing. There was also a good assumption (a folk explanation, but good for the amount of data that they had) that credits are based on the number of weeks a course runs.

For those who don't know, in the US the definition of a credit hour is a minimum of 3 hours of student effort per week, for 15 weeks, for 1 college credit. This means that a 3-credit course will require a minimum effort of 135 hours on the part of learners.  This, however, is the minimum.  A more realistic amount of effort is 4 hours per week for each credit, which makes the total 180 hours per 3-credit course.  Now, your mileage may vary.  If you are taking a course in which you already know some stuff, your hours may be less.  If you are wicked smaht it might take less.  If you are like me (going off on tangents to explore interesting things) it might take more.  That said, the definition in the US ultimately boils down to the number of hours a student has put in some effort for those credits.  So, a 30-credit graduate degree from my department is something like 1400 hours - assuming you put in the minimum amount of time during class, and throw in some token time to study for your comprehensive exams. A bit short of Gladwell's 10,000 hour rule, but getting there ;-)
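Since we're doing back-of-the-envelope arithmetic anyway, here's a minimal sketch of that credit-hour math in Python. The 15-week term and the 3- or 4-hour weekly effort figures are the assumptions stated above, and the 50 hours of comps study is just my token number:

```python
# Back-of-the-envelope US credit-hour math (assumptions as stated above).
WEEKS_PER_TERM = 15

def effort_hours(credits: int, hours_per_credit_per_week: int = 3) -> int:
    """Total effort hours implied by a credit count (credits x hours/credit/week x weeks)."""
    return credits * hours_per_credit_per_week * WEEKS_PER_TERM

print(effort_hours(3))        # 135 hours: one 3-credit course, minimum effort
print(effort_hours(3, 4))     # 180 hours: same course, more realistic effort
print(effort_hours(30) + 50)  # ~1400 hours: a 30-credit MA plus token comps study
```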

When I was an undergraduate, and even a graduate student, I didn't give this any thought.  I needed something like 120 credits for my undergraduate degree, and anywhere between 30 and 54 credits for each graduate degree I earned.  I never really thought about what those credits meant, just that I needed them.  Courses were assigned a certain number of credits, so I just scratched off courses from my list like a prison inmate at times.  With my graduate degrees it didn't feel that bad, but my undergrad felt like I needed to put in my time.  Once those courses were done I banked my credits and moved on to the next course. Being where I find myself now, as a college lecturer and a graduate program manager, what credits mean, and how to measure learning, is much more important to me than when I was a student (ironic, eh?).

This whole situation reminded me of something that happened at work last year (incident #2). One of our alumnae was applying for a PhD program in Europe (#woot!) and she needed a document from us certifying that she had completed a certain number of graduate-level hours for her Masters degree in Applied Linguistics.  I was expecting to be able to get some sort of US-to-ECTS conversion calculator and be done with it, but it was more complicated than that.  Europe runs on a credit system as well.  Doing a little digging on the interwebs, masters programs are rated anywhere between 30 and 60 ECTS.  One ECTS, at least in Spain, is 25 hours of effort (Greece is 30), so the range of hours of effort for a European MA varies between 750 and 1500 hours.  This is still based on effort put in, and not real actual learning.  Students are still banking credits like I did.
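The ECTS side is the same kind of arithmetic, so extending the sketch above (again, the 25 and 30 hours-per-credit figures are the Spain and Greece numbers I mentioned, not universal constants):

```python
# ECTS effort math (hours-per-credit figures as mentioned above).
HOURS_PER_ECTS = {"Spain": 25, "Greece": 30}

def ects_effort_hours(ects: int, country: str = "Spain") -> int:
    """Total effort hours implied by an ECTS credit count."""
    return ects * HOURS_PER_ECTS[country]

print(ects_effort_hours(30))  # 750 hours: low end of the MA range, in Spain
print(ects_effort_hours(60))  # 1500 hours: high end of the MA range, in Spain
```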

Then, the third incident was a few weeks ago, when I blogged about Campus Technology, one of the keynotes for this year, and competency-based education. This isn't the first time there was something about competency-based education in a Campus Tech conference keynote.  A few years back there was a keynote by someone from Western Governors University.  The CBE model doesn't seem to be based on time spent on task, or effort put in by students. To some extent it seems to be more connected to what students can do rather than how much time is spent doing it.  That said, I am still left wondering how we go about measuring learning - measuring what students have learned so we can accurately certify them with confidence, so that we are reasonably assured that we have done our due diligence to prepare minds for the world outside of our own sphere of influence.

Even with the CBE model, it seems to me that there is a certain degree of banking going on.  I've gone through, I've completed x-many courses, I still have y-many courses left to complete to get my degree in Z. It seems to me that the discussions around curriculum are still constrained by the notion that undergrad degrees look like xyz and require abc of effort; MA degrees, regardless of discipline, look like xyz and require abc of effort; and so on.  I think that this isn't just a matter of assessment, but also a matter of curriculum, but it all comes back to the same question for me: how do you measure something that may or may not be measurable?

Thoughts?

Institutional Affiliation or Itinerant Scholar?

Rebecca, the other, posted a question on Twitter on #adjunctchat, and later wrote at a little more length on her blog about this question: What is the value in affiliation? More specifically:
In our new world of adjunctification and alt-metrics, does an affiliation matter? Am I better to declare myself as an itinerant scholar than a scholar associated with a particular university? What is the value of the affiliation, especially when the institution isn’t providing any resources to support the project?
Just to start off, I like the idea of the Itinerant or Nomadic Scholar. I suppose that this notion of nomadism has sort of stuck with me from my work with cMOOCs, and I see nomadic scholars as an extension of this idea. So, the question is what is the value of affiliation?  I think it depends. If you are doing certain types of research, even if the University doesn't support you as a researcher-scholar due to the nature of your adjunct employment, there may be doors that you can open simply by dropping a name. Now, that name doesn't have to go in your final scholarship, but claiming some affiliation at the onset of a research project can help in getting things started.

I would argue that when a scholar reports their institutional affiliation in published research, it is the university that benefits from this reporting. The university can count on the brand-name recognition it receives when scholarship is penned under the auspices of that university.  When tenured faculty publish (or even if your institution doesn't have tenure, but has some other method of permanence), then it makes sense to publish under the name of that university. The university has hired you to teach, research and publish, and to provide service.  This, I would say, is expected from the terms of your employment.  Adjuncts, however, are only hired to teach specific courses; they aren't hired with research or service in mind.  In cases like these I think that it's not fair for a university to claim some glory from the work of nomadic researchers that they didn't support.  One may argue that they are "supporting" research by hiring that scholar to teach; however, I don't see it this way.  I think that if universities want a shot at the limelight they need to support the research of adjuncts.

In my case, I am an adjunct at the university where I am employed full time.  My day-job is flexible enough, and appreciative enough of research, that if I needed some time "on the job" to finish off a paper submission that is due tomorrow, I could do it on the clock without any hassle. It helps that I keep on top of my regular duties, too - but working on research outside of the scope of my duties isn't frowned upon.  In this instance I do get support from work, measured in time "off" from work, to support my research, so I am more than happy to put the university's name as my affiliation (as much as I like the Nomadic Scholar title).

If I were an adjunct and taught at a few schools, I would most likely claim the Itinerant Scholar status, and if any research support were given to me by a specific institution, I would put that in an acknowledgements section.  The reason for this is as follows: being employed by more than one institution of higher education is problematic.  If you put one institution down instead of another, you might be seen as playing favorites, and in future semesters you might be asked to choose your place of employment - us or them.  This is an unfair position to put an adjunct in, so the acknowledgements section of published work is a mid-way point.  You acknowledge any help or support received from the institution without making them the marquee. This way you can sidestep questions of why University A was mentioned as the affiliation and not University B, but still give a tip of the hat to the appropriate entities.

At the end of the day I think that the current adjunct system is not a good way forward, and higher education needs to address this adjunct issue.  Your thoughts on this?

Online Doctorates, degree designation, and misunderstanding of what it all means...



Happy new year to all! The other day I was catching up on some reading in my Pocket account when I read an article in eLearn Magazine about online doctorates. I feel like I should have a grumpy-cat image on this blog with a big "no" on it, since there were a number of things that seemed really wrong to me about this article. Some of them are probably the author's interpretation, or way of explicating things, and other things are widespread academia myths (in my view). I think the article author means well, as the crux of the article seems to be to research things well before you commit to any online program, but things seem to go awry in the explanations.  First the author writes:

As the number of master's students from the initial flush of fully online degrees stabilizes, those interested in increased revenue streams have opened up the university gates wide and have started to look to doctoral-level education for the next big democratization of higher education.
I think that this is a gross misassessment of what online degrees do, including the doctorate.  I do think that there is a demand for more education, and because we are all too busy during traditional school hours to attend MA and PhD programs, online programs are growing or expanding to fill in the gap.  That said, this should not be mistaken for the democratization of education. Just because higher education is now potentially available to more people, via online delivery, it does not mean that there is an actual democratization of knowledge and information.  People still have to pay tuition and fees in order to be able to take those online programs.  When the financial cost of attending school is not $0, then it's not democratization.  Democratization is when knowledge and information become available to all of those who want to partake in them, regardless of their socioeconomic background.  Online degrees, even the doctorate, are still only available to the elite who are able to pay for them!

The second thing that really didn't sit well with me is the artificial distinction between a "real" doctorate (PhD) and the "professional" doctorate (EdD in my discipline).  The distinction between the two, the one that I've heard over the years and the one that the author uses, is the following.  In a PhD (or academic doctorate, as the author refers to them):

The expectation for those in academic doctorates is that they will focus on the creation and dissemination of new knowledge in their disciplines. They will have experience in presenting their ideas to academic communities for criticism and feedback. Academic doctoral students are expected to publish their work in peer-reviewed journals with findings from their original research.
In a "professional" doctorate, like the EdD however:

[doctoral candidates] are trained differently. While much of the coursework will appear similar, those inside the academy can attest to the differences in the kinds of experiences for these two different tracks. Those with practical or professional doctoral preparation will focus on improving practice. They are more likely to learn to consume research rather than producing copious amounts of original research themselves. They focus on translating current research findings into practical implications for those in their fields. They tend to be leaders in their chosen practice areas, and typically don't work in academic appointments.
This to me is more or less bunk.  First of all, it seems to me that in most disciplines (i.e. those that are not considered lab sciences) this distinction is irrelevant.  In academia one would never hire a PhD in any of the disciplines I've studied only to conduct research.  Being part of academia, on the tenure track, means that you are conducting research, peer reviewing, serving on committees to provide service to the institution and to the profession, and teaching.  Teaching is about mentoring and about bringing the theoretical and the practical together so that students can see theory in application, and then in turn start hypothesizing and working things out on their own.  As a matter of fact, we've seen a whole alt-ac movement in which PhDs who are considering not going into academia (because, let's face it, there is a dearth of tenure-track jobs available) need the skills required to put theory into practice and to translate that knowledge for other audiences.  Are you telling me that they should have gone for a professional doctorate instead?

On the other hand, you've got the professional doctorate, which, by this definition, seems to me to be the mama bird who pre-digests food for her younglings.  This is also wrong. Professional doctorates might have more of a focus on practice (maybe skipping that required skill-building for alt-ac), however this does not absolve their holders of research responsibilities, if they want to be taken seriously.  People who graduate from professional doctoral programs need to be every bit as much a thought leader, to borrow a phrase from a friend of mine, as those graduating from academic doctoral programs. They still need to be able to conduct research as part of their day-to-day job, because that's how we determine whether theory, put into day-to-day use, actually works. No one needs to produce copious amounts of research to be an academic; they just need to produce good research.  Research isn't measured by the pound, but rather by impact. And what if someone with a professional doctorate wants to pursue the tenure track? Should this degree designation prevent them from doing so?

Finally, the author cites a 2005 article on the perceptions of hiring authorities, in academia, about online doctorates.  Leaving aside that this was 10 years ago and things will most likely have changed, I think the conclusion, that those who earned a doctorate online are not welcome to apply for academic postings, is too simplistic.  I think we need to dive deeper into the nitty-gritty here and ask who was offering online doctorates ten years ago.  The only people I know who earned online doctorates during that period are people who went to Capella, or maybe even University of Phoenix.  The push-back felt in these cases is not because of the online doctorate, but because of what that online doctorate is associated with: a for-profit institution that is seen to lack quality (in other words, the perception of the diploma mill). If the degree had been attained through Harvard Online (if that had existed back then), I would say an online degree would most likely be welcomed, because it would be associated with a positive name.  It's not the online degree, but rather the name.


There is no excuse for a poor academic program - period - be it at the BA, MA, or PhD level. The sense that I am getting from articles like this is that the difference between a PhD and an EdD basically translates to "PhD = more rigorous" and "EdD = be done quickly, call yourself a doc" - this to me is a false conclusion. I personally don't see a difference, from a theoretical perspective, between a PhD and an EdD.  Your thoughts?
 Comments (3)

Connecting the dots...thoughts about working in academia

 Permalink
[warning: lengthier post than usual] Before I left for my December mini vacation I had a holiday-themed catch-up with a number of friends and colleagues on campus. With the semester winding down, and with the holidays as an excuse, it was a good opportunity for people to get together and share some news about what had transpired over the past semester, share notes, best practices, and so on. One of my colleagues inquired how things are going in the office as far as admissions go. There seems to be some doom and gloom over falling admissions on campus, but that's a topic for another day. Things are going well in my department (knock on wood), so much so that we are not able to admit all qualified applicants, since we don't have enough people to teach for us.

My colleague's solution (my colleague is a full-time instructional designer, for what it's worth) was that we need to "change the model": instead of relying on tenure-stream professors to teach our courses, we could have subject matter experts design the online courses and hire an army of adjuncts to teach for us; the tenured professors would have final say on the content, and the adjuncts, who cost less, would teach to that content. This, after all, seems to be the model that other schools employ, especially those with online programs, so the message seemed to be that we need to get with the program and move away from an outdated model.  Now, tenure may have its issues, but I think that swinging the pendulum all the way in the other direction is the wrong solution. My bullshit alarm (for lack of a better term) starts to go off when I hear about some of these "new models," in the same way my BS alarm went off when I was hearing about sub-prime mortgages and derivatives as an MBA student (you remember those?).

I don't know how I found myself in higher education administration, but I did end up here. As a matter of fact, I am coming up on three years in my current job (closing in on those 10,000 hours that Malcolm Gladwell wrote about!). The thing that has become abundantly clear to me is that there is a compartmentalization of information, know-how, and, most importantly, understanding of what needs to happen in a large organization such as a university, so simplistic solutions, such as "changing the model," become the norm in thinking. This is quite detrimental, in my opinion, to the overall longevity of programs. These simplistic solutions may come from the best of intentions, but when one doesn't have all the information at their disposal, it's easy to arrive at bad solutions.

First, there is an assumption that we don't have an overall curriculum, which brings up the point of "master courses" that any ol' adjunct can teach. The fact is that we do have extensive program-level outcomes in our program, and a somewhat set curriculum.  At the broad level it is set, but at the day-to-day level there is flexibility for subject matter expertise.  I don't want to get into the issue of academic freedom; I find that this term gets abused to mean (almost) anything that faculty members want it to mean. However, in this case I do want to draw upon it to illustrate the point that at the day-to-day level of a class, so long as faculty are meeting the learning objectives of the course, the readings they choose as substitutes for the agreed-upon curriculum (especially when more than two people are in charge of teaching the same course) are not put under the microscope, and faculty aren't prevented from exercising their professional license.

Secondly, and most importantly, simplistic (and often cheap for the institution) solutions to expand capacity treat all adjuncts as the same and interchangeable. This is patently wrong on so many levels. The way I see it, there are two types of adjuncts (those of you who study higher education administration may have more - please feel free to comment). The first type are the people the adjunct system was "built" for.  Those are people like me: people who have a day-job somewhere, enjoy what they do, and share their practice with those who are training to enter our profession. Our day-jobs essentially pay our wages, and what we do we do as service to the profession and for the love of teaching. This way the (usually) small payment per course can really be seen as an honorarium rather than as payment for services rendered.  The second type of adjunct is the person who is doing it as their day-job and thus needs to teach many courses (perhaps at multiple institutions) to make ends meet.  This second type of adjunct is probably the most prevalent in academia today, at least from what I read.

Regardless of whether they are of type 1 or type 2, adjuncts who teach, both for our institution and elsewhere, are professionals who have earned their PhDs, in many cases conduct research, and are active in their fields in one way or another; but most of all, they are human beings. By coming to the table with the mentality that they are interchangeable - just give them a pre-made course shell and let them run with it - you are undermining not only their humanity but also their expertise in the field. After all, someone you wind up and let run doesn't necessarily have a voice to help your department improve its course offerings and programs. You are shutting them out.

Now, as a case study, let's take my program.  I would estimate that, depending on the semester, anywhere from 75%-90% of the online courses are taught by adjuncts.  In the summers (optional semesters) the ratio is actually the inverse. By hiring more adjuncts in order to matriculate more students, the tenure to non-tenure ratio gets even more skewed. This, to me, is problematic.  A degree program isn't just the 10 courses you take in order to complete your degree.  A degree program is about more than that, and tenure-stream faculty (i.e. permanent faculty) are vital to the health of degree programs and to the success of learners in them. Adjuncts, as seasonal employees, are hired only to teach the specific courses they are hired for, and nothing else. This represents a big issue for programs. Here is my list of six issues with over-reliance on adjunct labor:

Issue 1: Advising

I must admit my own experience with advising, throughout my entire learner experience, has been spotty at best.  Some students don't take advantage of advising; we think we know better and have all the answers.  Some advisors treat advising as a period to get students signed up for courses.  Both attitudes are wrong.  Advising is about relationships. It's about getting to know the student, their goals, their intents, and their weaknesses, and working with them to address those issues. By the end of a student's studies, the advising that occurred along the way should help them get to the next leg of their journey on their own.  Through this type of relationship-building, advisors get to know their advisees and can even provide references for them if they decide to move on to the next level of study, or if they require a reference for work. Even if one compensated adjuncts for advising, how do you quantify the pay?  Do you do it in terms of hours? That's kind of hard to do.  Even if you arrived at fair and equitable pay for the work, adjunct hiring is subject to volatility: you don't make a long-term commitment to them, and they don't necessarily make one to you! (see issue 3).  This is no way to build an advising relationship.

Issue 2: Committee work

This second issue brings us back to those master courses that my colleague talked about.  These things are decided by committee in the grand scheme of things, since a curriculum needs to make sense - it's not a hodgepodge of a little-bit-of-this and a little-bit-of-that. Faculty are not hourly employees, but adjuncts would be treated as hourly employees if we decided to compensate them for this type of work. It may work, but it might require punching a card.  For people who are basically paid honoraria, do you really want to nickel-and-dime them? Sometimes committees meet for their usual x hours per month and things are done fairly quickly; other times committees meet for many hours, in preparation for accreditation for example. This, of course, assumes that adjunct faculty members can do committee work for some additional pay (which usually isn't a lot). What if they can't? What if they have other priorities? If this is the case, all of the work falls upon the few tenure-stream people in the department. This has the effect of both keeping adjuncts away from critical decisions and implementations made by the department, and dumping more work on the full-time people in the department. Adding more adjuncts to the payroll would most likely amplify this, and add to the factory model of producing academic products.


Issue 3: Department stability vis-à-vis perpetual hiring

When you hire a full-time staff member, chances are high that they will be around for a while if they are worth their salt. If you hire a faculty member on the tenure stream, chances are that this is a career move and this person won't be leaving any time soon.  This provides the department with stability in many ways.  It provides a core group of people to shepherd the department, its curriculum, and, most importantly, the students.  With adjuncts, given their semester-to-semester nature (i.e. no long-term contract with the institution), it makes sense that these individuals will most likely be working elsewhere and have other commitments, or they might just be looking for a full-time gig, in which case your institution or department comes second.  This isn't good, and if adjunct instructors leave your department you need to look for replacements. This adds to the workload of the few full-time faculty, who need to start a search, review CVs, and interview people.  This isn't a job for one person, but rather for a committee of at least 3 members to vet and verify what's on the CVs and conduct the interviews.

Once the hiring is complete, there is some mentoring that goes on to make sure new hires are successful, and even then you aren't guaranteed that they will work out. I'd say that you need at least 2, if not 3, semesters to get an accurate idea of how well these new hires teach, work, and fit in with your institutional culture. If things work out, great! Then you pray that they won't leave you in the lurch when something better comes along.  If it doesn't work out, not only do you have to start the search again (which is time- and energy-consuming), you may have issues with your learners; it may be the case that these new hires were awful and as such did a major disservice to your learners. This is something that needs mending, both from a content perspective and a human relations perspective.  Again, this takes time and effort.  Yes, I hear some of you say that this is also the case with tenure-stream faculty.  This is true! It's true for all new hires. There is a period of trial-and-error, acclimation, and kicking the tires that happens, on both the new hire's side and the department's side. However, once a new hire passes their 4th-year review and is reasonably certain of tenure, that's basically it; you don't generally need to worry that you are going to lose them and will need to start your search all over again. Not so with adjuncts. Commitment is a two-way street.

Issue 4: Quality of adjuncts

The issue of adjunct quality cuts a number of ways.  If you luck out and find someone good in your search, you'll know within a semester or two whether they pass muster (and they will know whether they are a good fit for your department). Having any new hire is risky, especially one with so much power over the learning of a group of students, as I mentioned above.  There are, however, other dimensions of quality. One of my considerations for quality is how current people are in their fields.  I generally do not like it when people myopically present their own research as the cutting edge of the field, but staying research-active is one of the legitimate ways of keeping current.

Many departments that I've been in contact with use one measurement for adjunct quality: course evaluations.  I am the first to say that I am not an expert in this arena, since I have not studied it, but I think this is complete bunk.  As I like to say, you can have an instructor who is Mr. or Ms. Congeniality and basically bamboozles students into thinking that they have learned something relevant and worthwhile. Thus students are more apt to give good reviews to bad instructors, and those people are then hired to continue teaching, to the detriment of future learners. As an aside, I just read a story on NPR about course evaluations. Pretty interesting read - course evaluations apparently are bad measurement instruments.

Finally, just to wrap this section up, another issue I've seen is course-creep.  Someone is hired specifically to teach one course, CRS 100 for example, and then, for many and varying reasons, they are given courses CRS 150, 200, 350, 400, 420, and 450.  The person may not really be a subject expert in those areas, and may not even have enough time to catch up on the latest developments for their own sake and the sake of their learners, but due to inadequate quality measurement instruments those people get to teach more and more courses in their respective programs.  As a side note, it seems as though accreditors might be taking notice of the increased reliance on adjunct faculty.

Issue 5: Disproportionate representation of faculty through teaching more courses, and issues of diversity

So, we've come to a point in our discussion (with my instructional design colleague) where the suggestion is to just create additional sections for the instructors who have proven themselves over the years.  First, this assumes that the people we hire can teach additional courses for us, which is generally not the case.  The people who teach for us have day-jobs. They are professors at their own institutions and have responsibilities to their own home departments.  Adding more courses to their teaching rosters simply isn't feasible from a logistics point of view.  Even if it were possible, departments don't grow by simply hiring more of the same.  The way organizations grow is through diversification. New faculty hires would surely be able to teach some intro-level courses in our program, but they would also bring in their own expertise.  This expertise would allow the department to create additional tracks of study, offer different electives, and provide seminar series on diverse interests for current students and alumni.  The more-of-the-same approach may work in the short term, but it's not a great long-term strategy.

Still, some departments do expand someone's course-load to include more courses.  As we saw in issue #4, this is an issue of quality.  It is also an issue of lack of diversity and of disproportionate representation of one faculty member.  I would feel very odd if I were teaching and students were doing 1/4, 1/3, or 1/2 of their courses with me because it was compulsory.  If students really opted to take more courses with me, then more power to them; they've made an informed decision.  However, if courses are required and students have only 1 faculty member to choose from, then that is bad for them in the long run, because they don't get a diversity of views, opinions, expertise, and know-how from the field (if the adjuncts are from a more practical background).

Issue 6: Research of tenure-stream faculty

Now, as I wrote above, I really don't like it when faculty drone on and on about their research and their research agenda, and look for ways to get out of teaching. Being a faculty member is often compared to sitting on a three-legged stool: teaching, research, and service. You can't extend one leg, shorten another, and expect to have balance.  If you wish to be only a researcher, then by all means quit your academic job and go find a research-only job.  That said, research, and being up-to-date, is important.  For me it connects with a measurement of quality. Adjuncts are hired, and paid, only to teach.  Since there is no research requirement in their jobs, research and continuous quality improvement may not be something they undertake. This is bad not only for the students, but also for the department.  One of the ways we are able to attract students to our respective programs is through name-brand recognition.  At a recent open house my department displayed books published by our faculty.  Several students commented on the fact that we had that Donaldo Macedo, who worked with the Paulo Freire, in our department. Yes, we have that Charles Meyer, who's a pioneer in corpus linguistics. These are just two examples, but it gets people to pay attention to you.  Even with my own studies, one of the reasons I chose Athabasca was the fact that I had read work by Fahy, Anderson, Dron, Ally, and Siemens. I was familiar with the CoI framework and the work done on it, and I am a reader of IRRODL.  The fact that AU is the place where all these things are happening was a catalyst for me to apply and attend. All of this comes directly from the research work and public outreach of the full-time faculty of the institution. Adding more adjuncts to the payroll doesn't get you this in the long term. Again, you invest in your faculty and you get paid back with dividends!


Conclusion

To wrap this up: in the big organizations we all work in, we all have many different jobs, little communication, and no one has the big picture. I consider myself lucky. Having worked as a media technician, a library systems person, a library reference and training person, an instructional designer, an adjunct faculty member, and now a program manager, I've seen many of the different levels of what goes on in academia.  I have a more complete picture, much more so than many of my colleagues on the same job/career path. The upper administration is still a bit of a mystery to me, but I guess I still have room to grow. I am grateful that friends and colleagues want to help grow our program, but without all of the information, I am afraid that "changing the model" is simply code for "do it quicker and cheaper and churn out more students."  Students need mentors, advisors, and role models. The adjuncts who have been teaching for us for the past 3 years (or more) are great and do, unofficially, provide that to our learners. However, you can't grow a program on adjuncts. What it comes down to, for me, is recognizing the humanity of adjuncts, compensating them well, bringing them into the fold as valuable contributors to the department, and investing long-term in programs.  Figure out what you need tenure-stream people for and what you need Lecturers for (adjuncts with long-term contracts), and work strategically. Semester-to-semester, adjunct-majority staffing is not the way forward.


Your thoughts?
 Comments