Club Admiralty

v7.0 - moving along, a point increase at a time

Multilitteratus Incognitus

Traversing the path of the doctoral degree

Graduate admissions process pondering


This post has been brewing in my head for a couple of years now. Since I am waiting for IRB clearance for my dissertation, I thought it would be a good time to jot things down and see what others think.  The people in my network usually either teach or work in some sort of instructional design (or faculty development) capacity.  I don't (think I) know many people on the higher education administration side of things to discuss these kinds of topics with (so I may just be speaking to no one 😝).

Anyway, one thing that's been gnawing at me for the past number of years is how one enters graduate programs.  I'll focus more on the master's level for a few reasons: I manage an MA program, I teach for an MEd program, and from observation, I've seen that master's programs probably don't have as many administrative constraints. For example, [virtual] classroom space, a cohort model that's less tightly integrated [as compared to doctoral programs, which are more tightly interconnected], and the fact that most master's programs allow non-matriculated students to take courses for fun or interest (doctoral programs typically do not).

In the US, a typical application to a graduate program requires the following:
  • An application (usually electronic these days)
  • An application fee (my institution has it at $60 for domestic students)
  • A statement of purpose (why do you want to apply to the program, how does it meet your goals, etc.)
  • 2-3 letters of recommendation from academic referees (former professors)
  • Your transcript indicating that you completed an undergrad program (sometimes ALL transcripts, even if you decided to take ENGL501 for fun somewhere...).
  • Some sort of standardized test score (though this is not as common these days)
I've done this song and dance four times since 2004.  The first time I did it (for my MBA), I didn't think much of it.  The second time (for my MS), it was a little easier, albeit annoying because I was applying to the college I was already a student in.  The third and fourth times (MA and MEd degrees) I reused letters of recommendation and phoned it in: I actually used one application to apply to two programs, and one statement of intent that covered both.  I almost did it a fifth time for another MA degree, but they required GRE scores to prove that I could do graduate work (after I had successfully completed four master's programs, mind you...), which is when I decided not to apply to that program.

As part of my day-to-day work, the admissions process is one of those things I manage, and I'm convinced that the entire apparatus is mostly an arcane and archaic gatekeeping mechanism, built around the constraints of "p-learning" (Dron, 2016) and program scarcity (when MA programs only really accepted a few students each year).   The admissions process doesn't feel like an enabling mechanism.

Take letters of recommendation, for example. If you've attended a western institution and you ask a former professor for a letter of recommendation, the letter they provide will mostly be good: no one decent will agree to write a recommendation if they are going to evaluate you negatively. Hence, the asking for, and receiving, of a recommendation becomes a ritual of asking former professors to vouch for you while knowing ahead of time that, if they agree, what they write will most likely be glowing.  It privileges people who've already built those connections with some professors, and it is a colossal waste of time for the referee. It also privileges western applicants because (in my experience) non-western referees tend to write sparse recommendations that simply reference a transcript ("AK was a good student, got A's in my class").  In cases where someone doesn't have academic recommendations (for whatever reason), a common tactic is to take a course or two in the department (as a non-matriculated student) to obtain those letters of recommendation.  If they've taken two courses and gotten good grades (however you define "good grades"), then the recommendations are pointless (especially since those who write the recommendations are often also those who read them for admissions🤷).

Another thing that I find interesting is the requirement for a BA (sometimes the requirement is just a BA, even if it's not in the discipline).  A bachelor's degree can be verified by transcript, but I do find it odd that some schools require *every* transcript. So if someone has earned an Associate's degree in automotive engineering (school 1) and a Bachelor's in Sociology (school 2), and wants to apply to an MA program at school 3 (let's say in a sociology program), that school could ask for the transcript from the AA degree even though it's not relevant, just in case the applicant has anything to hide🙄. If the applicant has demonstrated that they can do academic work by showing a transcript with a completed degree, why request everything?

Related to this, here is a hypothetical (I haven't seen or heard of this happening in real life, but it theoretically could): students are able to take graduate courses as non-matriculated students.  As a matter of fact, learners can string together all the required courses for a degree as a non-degree student and still not earn the MA, because they either lack a BA, or because they took all their courses as a non-degree student and we can't count most of them if they matriculate near the end of their studies.  I am actually not sure where I stand on this.  On the one hand, the MA is positioned as an advanced degree in something (which assumes an introductory degree in it?), but on the other hand, if you can do the work, and you demonstrated that you can by actually doing the work, why should you be prevented from earning that degree just because you don't have the previous one in the series? 🤔

The graduate admissions protocols (at least for master's programs) seem like unnecessary gatekeepers these days.  Because of all of the documentation required, and the process of reviewing the materials submitted, admissions has defined calendar deadlines each year, which can shut people out if they happen to miss them.  Personally, I'd like to see more open admissions processes start to take hold.  Perhaps some introductory and required courses could be open to anyone interested in taking them (I'll pick a random number and say that this represents 40% of the curriculum), and then whoever is interested in continuing their studies can submit a brief application and a reflective essay/statement of purpose, and the faculty can make a more informed decision about formal acceptance given their own experiences in class with the applicant. This would cut out the unnecessary transcript evaluation, letters of recommendation, and standardized scores (and all the processing and vetting that goes on with those).

Your thoughts?

Academic Facepalm (evaluation edition)

Back in December, I was searching for the #tenure hashtag on twitter.   There was some discussion (probably started by Jesse Stommel 😜) which prompted me to search for this hashtag out of curiosity, to see what was tagged.  Along with heartwarming stories of people who'd just earned tenure (a nice perk right before the winter break!), there was this wonderful tweet specimen...



I'm not gonna lie.  IT BUGS ME.

It bugs me as a learner.  I've always completed course evaluations, and I've tried to give honest feedback to the professor.  If the course was easy, hard, or just right, I wanted them to know.  If I was appreciative, I wanted them to know.  Yes, sometimes I've half-assed it and just completed the Likert scale with a "loved the course" comment at the end, but many times I try to be more concrete about the feedback.

It bugs me as a program manager. I am the person who sets up the course evaluations, collects them, and often reminds students about them.  My colleague is in charge of making sure things like these get into personnel files, maintains department records, and also seems to manage the tenure and promotion paperwork for our department (among her other duties).  Faculty committees spend time discussing this each year for merit increases.  So. much. wasted. effort!

It bugs me as an adjunct.  Yes, I teach for the fun of it. I like helping new instructional designers find their footing.  As an adjunct, if my course evaluations are bad I could be not hired again just for that.  There are no protections.  And then you've got this tenured individual who openly flaunts their privilege.

Now, don't get me wrong.  I know that Level 1 evaluations are flawed.  They don't measure learning; they measure reactions to the learning event.   But they are feedback nevertheless.   If you don't give a bleep about what students say about your course, one day, despite your tenure, you might not have any students left...

As an aside, I feel like tenure is an outdated institution.  I'd advocate for strong unions over tenure any day of the week.

Your thoughts?

A look back at this summer's PD - Part I: Conferences


Summer is usually the time for some professional development; after all, during the academic year things go at such a fast and furious pace that there isn't much time (let alone brain/mind-space) to undertake much professional development.  This summer (because of "factors") professional development was not as easygoing as it has been the past few years, so I needed to pick a time to schedule the PD rather than pick it up throughout the summer.  This year one of my big work projects was to manage and lead an OSCQR review of my department's online courses.  I started out with a manageable goal of 10 courses (our core courses, constituting 80% of the required curriculum for all of our students), but once I saw that our three fabulous summer student aides were cracking through those 10 courses in about half the time I had originally budgeted, I decided to utilize the resources that I had on hand (three great reviewers) and add another 8 courses, for a total of 18 (or, to put it another way, 75% of our entire course catalog).  Because of this massive project I ended up bracketing my PD to a week in June and a week in August.  Even though it was a bit more compact than in previous years, I thought I'd reflect a bit on it, and perhaps reflect a bit on the OSCQR process.

In this blog post I will focus on the two conferences I attended in June: LINC at MIT, and the Mass College Online Conference.  Hat tip to John (@dbeloved) for letting me know about the LINC conference - something that was in my back yard, but that I was totally oblivious to.  The proceedings for the 2019 conference are not online yet (hopefully they will be soon) because I want to dive into the sessions that I missed.  In addition to this thing being local, one of the things that piqued my curiosity was that Peter Senge (of Fifth Discipline fame) was presenting.  He was a favorite of some past professors I had in the instructional design program, so it was an opportunity to see him live.  Throughout the three days there were a number of very interesting talks (too many to recall them all - but names do pop out when I look at the schedule).  From this conference there are two things that are still vivid in my mind:

There seems to be a lot of talk about using technology platforms and MOOCs to teach ESL. As a language learning geek, and a MOOC person (yes, still am!), I find this fascinating.  I WANT to see it happen: language learning through massive online environments.  It is undeniable that English at this point has a massive advantage as the world's lingua franca, and it's understandable that people want to use technology to teach ESL broadly.  But... what about using the expertise that we have, and the technology at our hands, to teach other languages?  For example, why not a Greek MOOC†?  Or an Arabic MOOC? Or an Algonquian one (the language of the people who were native to Massachusetts prior to colonization)?  It seems like you can't walk a kilometer in Greece (or any other country in Europe, for that matter) without seeing an advertisement about learning English.  There are so many people who teach English, so why not use our technology and knowledge to promote other languages?

The other thing that stands out is the panel about Educating the Future of Work (see here for the schedule), which had panelists from Microsoft, American Job Exchange, and the San Jose – Evergreen Community College District.  One of the things that is often talked about is alternative credentials (and, related to that, micro-credentials).  Not that it was brought up in the panel (as far as I remember), but news (greatly exaggerated, IMO) about companies doing away with the college degree as an entry credential to the job market has been making the rounds.  Taking these two threads and putting them together might make it sound like the days of the college degree are numbered.  However, I don't feel like we reached that conclusion with this panel.  A lot, from what I read into the discussion, seems to revolve around trust relationships.  A college degree, as vague as it may be when it comes down to the specifics, comes from a trusted source, whereas some letter vouching for your apprenticeship, or micro-credentials (as they currently stand), does not.  Furthermore, there is a bit of an implicit bias favoring bigger college names.  So, someone who graduated from MIT (we liked to pick on MIT since they hosted us 😜) would have an advantage over someone who graduated from a state college somewhere.  There is lots to unpack here, I think, and lots of room for discussion. It's also such a multilevel/complex problem that I think we need constituents discussing it not just from higher education, but from government and private industry as well.

Anyway - some short reflections from the two conferences I attended this past June.  Your thoughts?




MARGINALIA:
† - MOOC here is just a placeholder. Fill it in with whatever open classroom you'd like.


Hey! This isn't what I signed up for!


In my last blog post I was responding to the academy that isn't - or, perhaps, as some comments indicated, the academy that never actually was.  This past week I was at MIT's LINC conference.  It was a good opportunity to attend (since it was local), listen in on some interesting panel discussions, and meet some folks from all over the world doing interesting things.  It was also a good opportunity to connect with folks (via twitter, mostly, for me) to think about academia (and the role it has) from a systems point of view.  I was rather happy to have been there to see Peter Senge speak at the end of LINC 2019, as he is a systems person and someone whose work was foundational in my instructional design learning.

Now, I wasn't really planning a follow-up to my last post.  I sort of wrote it in order to contribute my 2 cents to the discussion, as a response to @Harmonygritz (George), and also to point people to it when they ask me if I want to pursue a tenure-track job.  However, the topic of faculty not being prepared to do what their schools ask them to do came up on twitter during my #mitlinc2019 posts (via @ksbourgault), and oddly enough, when I returned home and checked the subreddit r/professors, the following post was made by one of the users:

I got into academia because I love creating and sharing knowledge. As I sit here working through my day, I can't help but wonder how I turned into a website administrator and customer service agent. Next year I've been told I'm going to have administration/management duties I never wanted and won't be very good at. I used to be the kind of person that didn't work a day in their life because I loved what I did. Now...well...getting through the days require medication. God dammit. Dammit. Dammit. Dammit.

So, I thought - what the heck?  Why not write about this?  After all, some people tend to give you a strange glance when you point out the problem but offer no solutions.  So,...here is my tentative solution, as imperfect as it may be.

As I mentioned in my previous post, a tenured (or tenure-track) professor job has three main responsibilities: Teaching, Research, and Service. I would say that here, at the "job description" level, there is a problem. Faculty are not prepared for all of these things during their studies.  Faculty are only prepared for one thing in their doctoral studies, and that one thing is Research.

Educating credible, ethical, and competent researchers is the distinguishing characteristic of a doctoral program, and that is what makes a doctoral program different from a master's program.  Some people may argue with me that this is specific to a "PhD" whereas an "EdD" is more applied in nature - but I respectfully disagree; I've written this in another blog post years ago, and I am sticking with it. The crux of my argument was this:  both PhDs and EdDs need to be able to critically consume literature, critically produce research literature, and critically apply research literature.  If you can't do that, then there is a problem.

What you'll notice is that those three verbs (consume, produce, apply) do not include the verb "to teach".  This is something that, in my opinion, could be remedied at the doctoral education level.  It's also something that could be remedied at the hiring level.  K-12 teachers (and other professionals) are expected to complete a certain number of hours of CPD (continuous professional development) every year to maintain their teaching license.  Why not tenure-track faculty?  My fellow instructional designers bemoan the fact that faculty rarely reach out to them for training, and no one attends the workshops that they spend a lot of time preparing.  Well, I can tell you why (and I've told my colleagues this too):  this type of CPD is not something that is valued at an institutional level.  No one is forcing faculty members to attend CPD sessions and apply what they learn in their teaching. It's not something that faculty get 'brownie points' for on their annual reviews, and when push comes to shove and they need something to clear off their plates, CPD is it.   In my proposal, doctoral students would get their "starter pack" in instructional design and teaching while they are doing their doctoral studies, and then they would continue with CPD at the workplace (and have it be required).  Simple.

But hold on, let me get a little more granular here, because I think it's needed. My proposal doesn't just stop at mandatory CPD.  I would argue that - depending on the needs of the organization - the job duties of the "professor" position should be malleable and negotiable every so often. What do I mean by this?  Well, I'd say that we should start off with two "starter" JDs (job descriptions), and for lack of better terms I'll call them:
  • Researching Professor (RP)
  • Teaching Professor (TP)
The RP would spend 25% of their time teaching, and 75% of their time applying for and getting grants, researching, and publishing.  The TP would spend 75% of their time teaching and 25% of their time researching and publishing.  Both positions would be compensated the same, would get the same prestige and the same benefits, but there would be a difference in how they were evaluated.  An RP would be mostly evaluated on the quality and volume of their published work; they would need to attend teaching CPD (although, I think, less than a TP), and they would be evaluated on their teaching, but we'd go "light" on them since this would be a part-time responsibility for them.   The TP, on the other hand, would be required to have higher amounts of teaching & learning CPD for the year, given their teaching-first responsibilities, and conversely would be evaluated annually with more weight going to the teaching than to the research output.  This is important because at the moment (from my own little microcosm) I see a lot of emphasis placed on research and publishing in tenure and promotion cases.  Knowing what "track" you've applied to, and what track you are in, is extremely important in my proposed model.

Another key element here is the negotiability of the position.  How frequently this happens is up to the organizational needs.  But let's say that I am hired into a TP tenure-track position, and after a few years of courses I really want to focus a bit on my research for the next year or two. Maybe I want to be 50-50 (teaching/research), maybe I want to be 25/75 (thus being moved into the RP structure).  This should be negotiable between the faculty member and the chair - keeping in mind the needs of the department as well as the needs of the individual.  Likewise, if I am in an RP-type position but my department suddenly has a ton of new students and needs me to teach more courses, I could negotiate to go into a TP-type position for a year or two with this new cohort of learners, and thus be evaluated mostly on my teaching.  The key thing here with evaluations is that we don't privilege teaching over research (or vice versa) when conducting annual evaluations (or even tenure/promotion evaluations).

But...wait!  You are asking me "what happened to service???"  Well... service is kind of a tricky subject, isn't it?  I would treat service as a category that could push a faculty member into the "exceeds expectations" category of the annual job evaluation, and because of this consideration, the service category would potentially warrant merit pay (for a job well done, not just for having it on paper). One reason for this is that lots of things could fall under service, such as organizing a conference, undertaking student advising, sitting on someone's thesis/dissertation committee, doing some marketing for the program or recruiting new students, serving on the library committee or on the technology advisory board, etc. Because there is such variety in terms of service postings, it's hard to say what faculty should or should not be part of. However, CPD and some method of evaluation should be part of these service decisions.

For example, when I was an undergraduate, meetings with my major advisor were short: he looked at my transcript, signed off on courses I wanted with very little dialog (even when I tried to engage him about my interests in computer science and my future goals, I'd basically hear crickets), and that was about it. Except when my GPA dipped and he advised me to change majors - instead of figuring out why this was the case, where my areas of deficit were, and how to improve (I guess he was more worried about departmental averages than about retaining students in the STEM field...).  In the meantime (the last 15 years), I've met faculty "advisors" from all across campus who "advise" students without even knowing the degree requirements for their own programs. This is just plain wrong. So, taking this as a use case: if your service is advising, I'd say you should attend CPD to get informed on (and tested on) departmental, college, and university policies; get trained on degree requirements; know the costs of attending college; and get to know people in other departments that support students (such as the writing center, the ADA center, and so on).

There are some service duties that faculty shouldn't be in charge of, marketing and recruitment being one of them (I am sure there are others too).  Faculty just don't have the skill set, and it's not really an efficient use of their time to have them obtain it.  There are people on campus whose job is to do this type of thing.  If faculty want to switch careers, that's fine, but I do have an issue with faculty keeping their position as faculty while half-assing something (or worse, passing it on to staff...). Faculty can be part of these processes (of course), as experts in their own discipline and experts of their department, but really, marketing and recruitment should reside elsewhere, not with faculty.

In the end, here are the guidelines for my NuFaculty setup:
  • Get rid of Tenured and Non-Tenured distinctions.  Everyone now becomes tenure track, with two possible starting points:  an RP and a TP. Having tenured and non-tenured tracks leads to discrimination and classism, IMO. Just as there can be a lot of different types of professionals, there can be (and should be) a lot of different, equal types of faculty.
  • Faculty get evaluated not on a one-size-fits-all model, but rather based on their designated positions and through consultation with their department chairs.
  • Faculty CPD is a requirement, especially for TP.  CPD factors into annual and tenure evaluations.
  • Positions are flexible based on the needs of the individuals and the needs of the department, but they need to be set up before the evaluation period begins (can't change horses in mid-stream...)
  • Faculty positions are 12-month positions, not 9-month as they currently are.  Yes, faculty accrue vacation time that they can take. As 12-month employees they can decide when their "summers" (the periods when they don't teach) are, so their period of teaching responsibility can be flexible, and this is a win-win both for faculty and for the school.
  • Service isn't required for faculty positions, but it is highly encouraged.  To undertake service, the appropriate service type needs to be matched with pre-existing skills.  CPD is available for those skills if faculty want to grow into that area, but you can't practice until you show some competency.  Depending on department needs, service can substitute for 25% of the research or teaching component, with prior approval and for defined periods of time.
  • The incentive to sign up for service is merit pay.

Your thoughts?





This is not the academy you are looking for...

Have PhD...
George Station had posted this article, titled "The academy I dreamed of for 20 years no longer exists, and I am waking up", with the lead-in of: Ellen Kirkpatrick has yearned for an academic career for many years. But 18 months after finally earning her doctorate, she is no longer sure she wants to remain in a sector defined by precarity, exploitation – and ‘quit lit’

George asked us (tried to bait us? 😏) to see what we think about it in his Fb posting, and I am surprised (given the circle of people George and I follow) that no one else jumped into the discussion.  I thought I'd give it the old college try and write a blog post about it.  It is something that's been on my mind over the past couple of years. Once the coursework component of my EdD was done, people started asking me what I plan to do after I earn my EdD.  I am still at the proposal stage (one of these days I'll write about it), but inching forward, so I guess the end will be near at some point - which makes the question quite salient: what's next? For the purposes of this piece, a PhD and an EdD are the same (I've written elsewhere about the artificial distinction).

So, why a doctoral degree in the first place?
Any discussion about post-doctoral work (not a PostDoc, but post-graduation employment) would have to start with the original motivation as to why one wants to pursue a doctoral degree in the first place.  It's a very good question! Everyone has a different answer, and I would say that if your answer is "I want to become a professor," then you had better have a Plan B!

For me it's the convergence of a few factors:
(1) Many people in my immediate circles saw the number of master's degrees I had earned and asked me (some jokingly, some not) when I'd finally go get a doctorate.  As if a college degree of any sort is something you can just get (I'd say it's earned).  Still, I suspect that they saw it as my natural progression.  So one small reason to pursue it was to shush those (well-meaning) folks up.  However, there is always another follow-up question from the crowds once you earn that doctorate: what's next?
(2) I wanted the credential for my own purposes. For better or for worse, a doctoral degree does open up some doors that those without a doctorate can't open.
(3) I wanted to learn from people in my field (and the people at the university I applied to are world renowned for that).
(4) Finally, a tenure-track job was a minor consideration, but factors 2 and 3 were the bigger ones in my decision-making process.  Having started my career in clerical-style work (could earn overtime, but had little autonomy) and moved on to holding down various professional jobs (lots of autonomy, no overtime), I thought that a potential jump to the professoriate could be a thing for me.  After all, I did like research, I liked teaching, and from teaching I liked mentoring those who were just entering the instructional design field. So, #4 has a little bit of #2 in it.   This rationale came before I really got entrenched in an academic department.

So, what do I think now?
I guess the questions never end, eh? Aren't you people happy? 😛 The same people who were asking me when I would go get my doctoral degree are now asking me: what's next? Are you going to apply for faculty jobs?

The tl;dr answer is: No, I won't be applying for faculty jobs. If something comes my way - fine, maybe, but I won't pursue it.

Why?  Well, I've had time to think about it, and, like the article's author, I've found that what I thought the academy was 20 years ago turns out not to be true (I started working at my institution a little over 20 years ago too!). I've basically distilled it to a few key points (from my own views and data gathered over the years).

1) Tenure is its own type of oppression - I know what you are thinking now: AK! What are you smoking?  Professors have a sweet deal!  They only teach (x-many) classes, they hold their own office hours, come and go as they want, and they have the entire summer off!  Jeez man! Who doesn't want that?  -- Well, when I was an hourly employee I thought this too.  Even as a salaried employee who didn't have much contact with the ins and outs of tenure I thought so.  But having seen behind the curtain, the path to tenure (or promotion) doesn't seem very rosy.  The way that I would describe the tenure process is as one long probationary period. A probationary period where people should be mentored into the profession, but that often seems not to be the case. In addition, there is the famous publish-or-perish aspect, where some institutions prescribe what and where one is to publish (and how much), and some just look at the raw count of publications (and it's up to you to figure it all out, and hope that your eventual reviewers will approve of your choices). During this time period there is precarity in the profession, and you are royally screwed if you happen to be hired by a department (or university) with toxic personalities and unsupportive colleagues.  In my professional life I have never had such a protracted probationary period. Furthermore, I have always been paid for my time [faculty often work summers, even though their contracts run September-May] and (generally) been asked to do things that I know how to do in order to prove my skills. This brings me to point #2:

2) Faculty are not (always) prepared for the work asked of them, and have no clear understanding of the system that they are a part of - This might range from teaching (although I think doctoral programs are getting better at preparing candidates for teaching) to meaningless committee work that doesn't match their skills, but that they nevertheless need in order to make it to tenure (or promotion).  There are many examples of this: from faculty who are placed on marketing and recruitment committees (universities have staff for that!) to chairs and directors who reluctantly do the work because someone has to do it, and it has to be faculty. Now, I do acknowledge that some people have the knack for this sort of work, some have the knowledge, and some have been conscientious enough to attend workshops, get feedback, and be the best [insert management title here] that they can be; and I've been lucky enough to have met and worked with some of those folks.  That said, the modern academy needs professionals in those positions; professionals who are not going to cycle through their term as [management title], only for a whole new group of younglings to enter the [management title] position, ask the same questions of staff, and need to be trained to do the work.  Yes, we collectively complain of administrative bloat, but we need to realize that we need the right hire(s) for the right job(s), and that in the modern academy someone who is trained as a doctor in engineering, sociology, linguistics, mathematics, computer science...etc. won't necessarily be the best manager, marketer, director, or curriculum specialist. This brings me to point #3

3) The faculty system, as a whole, is classist and the power dynamics are messed up - There are many examples of this, and when I say system, I do mean the system as a whole.  I've met many, many fabulous faculty members who see staff (clerical, professional, librarian) and adjuncts (lecturers) as colleagues, and want to work together collaboratively for the benefit of the students.  I've also met raging idiots (to put it mildly) who look down upon their colleagues, whether those colleagues are staff members, part-time adjuncts, full-time lecturers, or just "junior" faculty who have not obtained tenure yet.  To be honest, any organization will have highs and lows, but as a system the tenure system both tacitly encourages such classist attitudes and at the same time provides the space and fertile ground for meaningless ego stroking. One example is point #2, where faculty are asked to do stuff they have no skills or preparation in doing [simply because they see it as their job], but at the same time some faculty think they are the bee's knees at that topic even when what they do is meaningless and/or badly done.

There is also a level of tone-deafness to the faculty system.  For example, my campus recently had major parking cost issues. The short version goes like this: management is supposed to negotiate with unions across campus if they want to raise prices.  The faculty union on our campus broke away from the campus coalition of unions that negotiated this thing jointly, in the hopes (?) that they would get a better deal than the rest of us.  At the same time, on another front, the Faculty Council (you know, joint governance and all) issued a statement encouraging faculty [tenure track, that is] to not come to campus on the days that they don't teach, since the costs of parking are prohibitive.  However, in this resolution there was clearly no thought of the students (who might need to be here 3-5 days per week) or the staff, who usually have to be here all week.  Such unintentional or intentional classism is what I dislike about this whole system.  This brings me to point #4

4) There is more than enough fear to go around! -  Now granted, there will be some idiots in any work environment (as I said above); however, I have a sneaking suspicion that much of what motivates faculty is fear. Fear can make us do some pretty bad things, and make us be pretty crappy people. Fear is an awful way to live one's life!   Now, what do I mean by fear?  I've seen countless examples over the last 15 years.   First, fear that if you don't publish enough, or don't publish the type of work that people expect [but don't necessarily communicate to you], you will perish; at the same time, publishing requirements can be either opaque, so you don't know what you're being evaluated on, or super specific and set at a high bar, making it hard to meet those requirements.  For example, last semester I was conversing with a colleague from the Classics department. Some of the high-ranking Classics journals have a 2-year wait time! That's easily half your tenure-trial period!  In my field, one of the big-name Open Access journals has already met its quota for 2019 and is not accepting any submissions past May 1st (so a month ago).  That is pressure if you are a tenure-track faculty member.   Furthermore, lots of publishing guidelines include journals that are high impact and not Open Access.  If your philosophical positioning is that you want to publish OA and create valuable OER, these things won't necessarily count for tenure.  So, there is fear that you won't make tenure if you don't comply.  There is also fear that you won't reach full faculty rank.

So, between pre-tenure and promotion, that's easily 15 years where you might keep your head down, say yes to whatever service comes your way, and smile and try not to make enemies that might derail your tenure or promotion down the road.  This fakeness and fake politeness is bad for the profession.  When you are on a committee whose charge is to evaluate courses that are coming up as new offerings, and you, as a subject expert, detect bad pedagogical design and are afraid or reluctant to say anything because of academic freedom (or a mistaken notion thereof), or are afraid to offer constructive critique because the person receiving it might make your life difficult in the future...then, my friend, there is a problem! Don't get me wrong, working with others can be a challenge at times (this is undisputed), but we should all expect a professional demeanor and expect that we are all working toward the common goal of improved teaching and learning outcomes for our learners.

This leads me to my last point about tenure...

5) Tenure is a trap! - Perhaps I am being a bit dramatic...but maybe not.   Tenure is a trap, in my view.  Once you get tenure, or even once you get promoted to full Professor, you don't want to leave - regardless of the on-the-ground conditions.  You were successful in running the gauntlet.  You got a permanent job at your institution and you're set for life [errmmm...maybe].  After 5 (or 15) years of keeping your head down you've made it. Now you can do what you really want, right?  Sure!  Or you might be resigned and bitter because of all those years.  But there are three caveats:

a) Even tenure isn't a foolproof way of guaranteeing that you get to keep your job long term.  If your institution folds (like many SLACs have in the New England area over the last two years), your job is on the chopping block.  If your department shuts down or gets absorbed, your job is on the line.  You say: well sure, AK, that only makes sense, right? True, and you could apply to other institutions for jobs. However, if you go to another institution, chances are high that you will have to go through some tenure-track process again (argh!). Even if you do get hired with tenure, you might still not be at the same rank as before.  And I'd venture to say that most people don't get hired with tenure; not unless you're some sort of Chomsky-type and the institution is actively courting you.

b) Tenure is like the Hotel California: once you get it at your institution, you can't easily move elsewhere.  Let's say life circumstances change and you need to move to another state for whatever reason - it's not like you can get hired again elsewhere with tenure; again, not unless you're some sort of Chomsky-type and the institution is actively courting you. If you aren't in the power position, many people [appear to] treat you as if there is something wrong with you.  What is it?  Why did AK leave his tenured position at ___?  Was it a personality conflict? Was AK a total jerk? Do we want a total jerk at our university? Oh man, what if he's an axe murderer??? - as you can see, a totally rational decision to move can devolve into something ludicrous due to speculation and how people view the field. It's just beyond the comprehension of many in the field as to why anyone would leave a tenured position! As an hourly or salaried employee I've never had to deal with this type of catch-22.  I've always felt that if the job doesn't suit me anymore, or I want to try my hand at something different, I can look for, and apply for, another job without fearing that my good standing at my current job will be twisted into some sort of "what's wrong with him?"

c) Finally, what if you don't get tenure?  If you leave before you obtain tenure, or are flatly denied tenure (for whatever reason, a real deficiency or just campus politics), I get the sense that you just get a scarlet letter on you, and other institutions don't want to take a risk on you, especially with such an oversupply of doctoral graduates in the market.  Same as part (b) above, but I think worse.

Tenure, as a system, was meant to incentivize honest opinions, feedback, and true academic freedom, but it's become a place where all those things go to die - because of...fear. This fear is perpetuated and amplified when one considers the precarity of adjuncts: not just how poorly paid they are for their academic labor, but also how far they are from actual job security.  The tenure system, as it exists, pits person against person in this academic fight club. This is plainly wrong.

The wrap: Pay & Job Security
So... where does this leave me personally?  Well, I tend to think practically and pragmatically about job decisions. After all, there are bills to pay. At the moment I think I have a fair amount of job security, broadly speaking, because I've done a variety of jobs in the past, and I have the skills to go to a variety of places and fill a variety of positions if need be. I personally don't want to join the tenure-track system, where there is a loss of nominal job security until you get tenure, and I certainly don't want to join the precarity of adjuncthood.  But...what if there were a monetary incentive to do so? What if the pay were high enough to defray the costs of such precarity and fear?  That's where I needed some data.  So...I headed to the 2019 AAUP faculty compensation survey, which gave me some data.

According to the above data, the average public university Assistant Professor salary is $84,062 [a tenure-track position], while the average pay for a public university Lecturer is $57,079.  Hmmm... The Assistant Professor salary may do it... the Lecturer salary definitely does not.  What does the report say about my university specifically? Well, according to the report, the average pay for an Assistant Professor at my current university is $91,400, while the average pay for a Lecturer is $73,400. These are numbers that made me question reality for a moment. I know that we hired new faculty recently, and none of them got anywhere near $91,000, or even $84,000, as assistant professors.  Hmmm...  So, I looked at the faculty union's contract, and lo and behold, there are salary floors for the various ranks. The floor for a Lecturer (the starting rank for that position; there are two higher ranks that you can be elevated to through a tenure-like review) is $52,000.  The floor for an Assistant Professor is $64,000. I know that the people who were hired as new assistant professors got a little better than the $64,000 floor, but there is a big chasm in my mind between the floor and the average reported institutionally in AAUP. I think that the reality is that most new incoming (tenure-track) faculty around here are paid around $70,000.



Considering that faculty often work unpaid over the summer (if you want to make tenure, you use your summer to research and publish!), and also considering the classist attitudes of the tenure system - i.e., why isn't 1 FTE [lecturer] the same in terms of pay as 1 FTE [tenure]? They are both 1 FTE and both have terminal degrees! - my answer is a hard pass on pursuing tenure, mostly on points 1, 3, and 4 above.  So, from a practical perspective, I'd venture to say that the pay provided does not defray the costs (monetary, mental, emotional, physical) of pursuing the tenure track. I think I would be better paid, and more impactful in students' lives, as a staff member (as I am now) than as a tenure-track faculty member.


Ultimately my big takeaway from the article was this:
I have no answers to these questions, yet. But I know this. I do not want to further the culture of precarity by relocating for a temporary position. I do not want to prop up the current academic publishing model, in which publishers take all of the profit and bear none of the risk. I do not want to teach so many hours that I cannot pursue my intellectual curiosity and creativity: the things that got me here in the first place. I do not want to nurture that nascent competitive twitch over my collaborative sensibilities. And I do not want to sacrifice my work-life balance and, it follows, my mental and physical health.

I couldn't agree more!


What do you think?

eLearning and Identity

A week or so ago...well...maybe two weeks by now♣, the topic of eLearning 3.0 was identity, and the video guest of the week was Maha Bali.  I finally managed to view all of it, even though it was in 10-minute increments. My Pocket save-for-later list is getting rather lengthy now that I am saving articles daily to read at some future time. This time of the semester is rather busy, so I guess that's my disclaimer for this post: I am only commenting on the video and the points that were brought up.

In looking at the notes I took during the vConnecting session in week 4 (mid-way through the MOOC!), a few organizing themes sort of came to me, so I've organized the post around them.

What's in a name?
At the beginning of the conversation Stephen had a bit of a hard time getting the native pronunciation of Maha's name. It's interesting to kick off a discussion about identity in such a mundane way, but I think that the concept of a name is quite powerful, on many levels.  Oftentimes names are given to us and we have no control over them. My name, for example, was given to me by my parents and godparents. There is a particular nickname, given to me in grade 4, that I really only use in the company of a certain close-knit group of friends from that time in my life.  Other times we choose our names; case in point, the username that you chose for your email address, that forum you joined, or your Xbox gamertag. Those names we choose for ourselves usually have a story behind them. I would venture to say that stories are associated with names, and (my hypothesis is that) no two names share the same origin story. The multiple names that identify us - legal names, user names, nicknames - are in some fashion pointers to some aspect of our multifaceted identity.

Seeing as most languages are spoken, how one says someone else's name is also important. When I moved back to the states, only my language teachers (French, ESL, and English) could correctly pronounce my given name without correction. No surprise there; I suspect that they (being language teachers) had a sense of how different languages have different rhythms and cadences.  Early on I adopted "AK" (there's a story there), as well as the English pronunciation of "Apostolos"† (minor eye roll, but whatever), as synonyms for Απόστολος. Many Greek Americans (second or third generation, usually) also attempted to call me Paul, which didn't really curry much favor with me. Paul is another name entirely. I suspect that someone, at some point, probably at Ellis Island, decided that an Apostolos would be a Paul in English, and newly arriving immigrants just adopted it for a variety of reasons. Paul is an identity marker that I rejected right off the bat, simply because Paul ain't my name, and Greek Americans should be able to say Απόστολος, no? 😛


What about the things we studied?
Stephen asked if there is an identification with the institutional affiliation, and how important that is as part of one's own identity. And, in relation to that, the discussion flowed toward our identities as learners and how those are connected to the institutions that were part of our learning experiences‡. This gave me pause to ponder.  As far as my educational experiences (in higher education) go, I have two institutions: UMass Boston and Athabasca University.  While I've made many connections at both of these institutions - connections to people, places, knowledge, and experiences - I don't really think of myself as a "Beacon" (the UMB mascot).  I am wondering if part of this is cultural.  I do see a lot of people with academic paraphernalia (mugs, hats, cups, sweaters, flags) from their alma maters, but this isn't something that appears to be of importance to me.  Now, granted, I have (over the years) had mugs, hats, and sweaters from UMB...and a coffee mug from AU that I've used so much that it's falling apart, but I really don't know whether, if those things were gone, I'd miss them and hence seek out a replacement.  The institutions that have played a role in our education have certainly helped to shape us into who we are, but as far as identity goes, what do those various physical objects that we keep say about us in the end? Is it about the institution itself? Or about the individual who gifted the item to us?♠

Related to the institution is what we studied. In one of the past vConnecting sessions I was part of, when we were socializing before starting the recording, Maha had made an observation that a number of us (me included) had started off as computer scientists for our undergraduate degrees, but have moved on from that field. My own learning journey has taken me into many twisting, turning, and branching areas of knowledge; and while I ultimately chose Education as the field for my doctoral work, the previous fields I've studied are still elements of curiosity for me, and I would say they inform my day-to-day work.  Over the past number of years I've wanted to get back into coding, partly because I thought that this is what computer scientists do. I was thinking to myself: well, how can I call myself a computer scientist if I don't code?  Maybe I am a "lapsed" computer scientist? Nah, that sounds too negative; after all, I still use a lot of the knowledge gained through that course of study to understand the world now, so how can I be lapsed? Then came the aha moment:  Maha said something interesting - she said that she was a "computer scientist who left the code behind".  This was a good definition of where I am.  I still consider myself a computer scientist, but I have no interest in coding at the moment (at least not for money or for my 9-to-5). What we study/ied has helped frame how we see the world and how we interact with it, whether we like it or not; and we get to see connections between one domain of knowledge and another♦.


What about what we do?
Another strand of being and identity came from what we do (I assume this is professional, or from our pastimes).  The question is: do the things we do define us?  And if we stopped doing them, would they still define us?  I think that what we do defines us in some ways, but not in others.  For example, from a professional perspective, I've had a variety of job titles over the past 20 years at my institution (hard to believe it's been that long). While I am no longer performing the duties of a media services worker, or a library worker, or even an IT worker on a day-to-day basis, many of my colleagues who've known me over the last two decades (who are still here) do reach out to me for Tech-y, IT-y, Instructional Design-y things that concern my department, even though many of those things are not in my formal job description.  The institution, by means of the people involved, remembers me and my skillset, and it's just natural for them to reach out to the person that they know to get things done.  Do I miss certain aspects of old jobs?  Sure.  Do I miss the old jobs? Not really.  Would I miss my current job if I moved elsewhere?  Probably not.  I enjoy what I do, but it doesn't define me.  The relationships between people (faculty, staff, and students of my department) are what define my time here, and who I am (in relation to them), more than being the person who manages the various processes that need to happen for a department to run successfully.


Identity is like a tree
I am not sure if the heading is something I thought of while watching the video chat between Stephen and Maha, or if it's something they said.  Either way, I see identity as a tree. It's a complex organism. It has roots that are nurtured by what surrounds it.  It is impacted by the environment it's in, and it grows leaves and branches. Periodically it gets pruned as conditions change.



Miscellaneous Observations
academic identity
There was an interesting point that Maha brought up.  She indicated that she isn't all that up to speed with reading literature in her field in Arabic.  Her academic identity is English (or in English? it's been a while since I took these notes). This was interesting in that I identify the same way.  My academic competence was developed in English (college and graduate school), and the last time I was in an academic environment where Greek was the language of instruction was in 8th grade. While I can read just fine (and I would like to expand my repertoire to read academic literature in my field in Greek), I do not consider myself fully bilingual when it comes to academic materials.  I can read and comprehend just fine, but writing academic materials in Greek isn't as easy as it is in English.

Despite the bumpy road with academic Greek, I've wanted to write in Greek, partly because it would potentially open up access to non-English speakers. However, when considering the finite time one has, the cost of such a transaction (time and effort spent), and the fact that it doesn't necessarily advance your academic career, you do have to pause and wonder whether your resources are well spent.  If translating your work into another language is a hobby - great.  Another thing that I consider, even as someone who is on the fence about such a career, is that you want to 'future-proof' your career in a sense, so English makes the most sense as the primary language to publish in.

Finally, a big question that has come up♥ is: who gets to call themselves an academic? While I do teach from time to time, that's not my day-job (I am a manager by title, administrator by function).  When I teach I am a lecturer, even though I don't lecture, and I am never a professor, even though some of my students address me with that title.  I do research and publish from time to time, but that's neither required nor rewarded by either my part-time teaching gig or my day-job. While I do perform tasks in the three categories that many consider key to the work of an academic in the US∋, I am not generally considered an academic (and I feel rather weird calling myself that).  I think Frances Bell and Jenny Mackness call themselves Itinerant Scholars, which sounds more appealing to me. While I do work in higher education, the noun academic doesn't feel welcoming as a title/descriptor.  At least in the US there seems to be a sharp distinction between faculty and staff (everyone not faculty), faculty being more prestigious and at a higher tier than us lowly staff. Academic many times feels like a synonym for faculty; hence the oppression of the system I work in somehow makes claiming that title feel wrong - like you're an impostor. At the end of the day, who gets to call themselves an academic?  Is it an endonym? Or an exonym? What should it be?


personal brand
As much as I want this term to die out already, watching this interview I was left wondering where identity fits in with the concept of personal brand. Not sure what the answer is - maybe an entire discussion of its own.


One of these days I need to do that identity graph 'assignment' :-)



-------------
Marginalia:
♣ OK, fine, it's been quite a while ago - been busy with other things ;-)
† emphasis to note where people mistakenly put the accent
‡ OK! OK! This is my extrapolation of the discussion in order to make this category. The discussion itself seemed much more interwoven with institutions and what we do. Bear with me.
♠ In thinking about both the AU and UMB stuff I have, most have been gifts from mentors and colleagues. I have yet to buy something with my own money with the university logo on it.
♦ The lack of interconnection is something I actually see when I peer review journal submissions.  Many people tend to publish in their disciplinary journals, and only do their literature reviews in those domains of knowledge, even when they are writing about teaching and learning (e.g., a chemist, or a management PhD, writing about teaching online).  This leads to a lot of poor research writing, because the lack of cross-disciplinary connections means that the people writing don't have a good understanding of the field they are writing about, and when they attempt an understanding it's often surface level. Just a random thought.
♥ as I re-read this blog post several weeks after I started it...
∋ Research, Teaching, and Service being those three categories. 

Letters of recommendation - what's up with that?

 Permalink
It's been a while since I've blogged, or at least it really feels like it.  I've had my nose stuck in (virtual) books trying to get through my literature review - but more on that in some other blog post. I came across an article on InsideHigherEd this past week asking whether or not letters of recommendation are really necessary. My most immediate context is admissions, given that that's part of my work at the university, but the people who gave their two cents also mentioned something I had not considered: academic jobs. I won't rehash the opinions of the people who wrote for the article, but I will add my own two cents, mostly from a graduate admissions perspective. I don't have a fully formed opinion on letters of recommendation for employment purposes, but I'll weigh in as a prospective hire (in a few years when I might be done with my EdD :p)

For admission to a graduate course of study, be it a masters program, a PhD program, or even a certificate program, I personally don't see much value in letters of recommendation any longer.   My point of view is framed from the perspective of a student, an instructor, and a program administrator.   When I was applying for my first master's degree I bought into the rationale given to me for letters of recommendation: former professors can provide the admissions committee qualitative information about you as a learner that a transcript cannot provide.  This is all fine and dandy, and for me it worked out: I was working on-campus as an undergraduate student, and I had some professors who I had for more than one course and who were able to write letters of recommendation.  This was a privilege that I had that other students may not have had.  For my second masters I was applying to the same college, to a new program that was looking for students, so getting recommendations wasn't that big of a deal.  Once I finished my second masters, I really didn't want to deal with more solicitations for letters of recommendation - I started to feel odd about going back to the same well of people for recommendations.

So, I applied to two programs concurrently so that I could write one statement, and the letters of recommendation could pull double duty.  After I finished my last two masters degrees I took some time off regular, "regimented" school and programs and focused on MOOCs.  Going back to earn an EdD posed some issues as far as recommendations go.  I had previously applied to a PhD program at my university (at the college in which I earned two masters! - never heard a final decision on my application, by the way), and by the time I wanted to apply to Athabasca I felt that the well had run dry for recommendations.  Former professors still gave me recommendations, but I kind of felt I was taking advantage of their kindness by asking for a recommendation for yet another degree program I wanted to pursue (don't judge, at least I complete my degree programs haha 😜).  Not that I am thinking a ton past my completion of the EdD, but should I want to pursue a regimented course of study in the future (degree or certificate program), recommendations will be an issue; not because I can't get them, but because I feel bad about asking for them - after all, I am asking someone to volunteer their time to give me a recommendation when my academic record should suffice. This is how I feel about the GRE and other entrance tests, by the way.  If you've completed undergraduate studies then the GRE is pointless - you can do academic work.  If you are unsure of the academic capabilities of applicants, accept them provisionally.  Just my two cents.

Another lens I view this through is the administrative one.  Asking for letters of recommendation, and subsequently receiving them (or not), requires time.  It requires time from the student (especially in tracking down referees if they don't submit things in time), it requires processing time from admissions departments, and it requires reading time on the part of committees who review applications. When such a system demands that much time and effort, you have to ask what the benefit, or net positive gain, is.  Going back to the story I was told - the qualitative complement to the transcript, basically - it does make sense in theory, but in practice... not so much.

While I don't make decisions on applications that come to my department for review, I sneak a peek at materials that come in because I need to process them.  What I've noticed is that by and large (1) recommendations are uneven, and (2) they tend to be the same, more or less, just with different names.  The unevenness is partially cultural in nature.  If you get a recommendation from someone employed at a western institution you tend to get (more or less) what you seek.  However, non-western colleagues don't use the recommendation system, so for them a recommendation is just an affirmation that the student was indeed in their class, in the specific semester, and that from what they remember they performed well.  The "basically the same" aspect of recommendations runs into the same problem as non-western recommendations; that is, recommendations basically boil down to: student was in class, they performed well, so accept them.  It just turns out that western colleagues are more verbose in their recommendations, so they happen to add in some anecdotes of your awesomeness as a candidate, but even those anecdotes tend to run along the same wavelength most of the time:  asked interesting questions in class, was the first to post in forums, engaged fellow classmates, submitted assignments early, etc.  From an administrative perspective there is (so far as I know) no background check on the folks providing recommendations, so we are taking what they write in good faith.

Finally, as an instructor, I am lucky, in a sense, that I haven't had to write a ton of recommendations.  I've done so a couple of times, but after a few original recommendations I've basically gone back to the "awesome student, accept them, here are a couple of anecdotes" formula, because that's life - we're not living on Lake Wobegon. I'd gladly give a recommendation to former students who did well in my classes, but it's hard not to feel like I am writing a generic letter sometimes. So why spend time writing something that feels like a template letter if I am not providing much value to the system?

In short, recommendations for admission add no value while taking away time and resources from other areas.

In terms of letters of recommendation for academic employment, on a purely theoretical basis I'd say that they are pointless too - both for the reasons articulated in the IHE commentary piece, and for a reason similar to graduate program admissions: the genericness aspect.  I think having some references is fine, but a quick conversation (or heck, a survey-style questionnaire) would be preferable to a letter. The reason I think letters aren't that useful in hiring decisions is the same reason no one gives recommendations anymore (for us regular plebes getting work): people sue if they get wind that they got a bad recommendation. Generally speaking, no one will agree to give you a letter of recommendation (or reference) if they can't give you positive reviews, and HR departments just confirm dates of employment these days.  Nothing more, nothing less; otherwise they risk a lawsuit. So, if you're not getting much information about the candidate, and if the information is skewed toward the positive (because that's how the system works), then is the information you're getting valuable?  I'd say no.

So, what are your thoughts?

Campus deadzones, and creepy hallways: where did everyone go?

 Permalink
[Image found on Google - not actually a photo of me]
Happy Friday, dear readers! (umm... anyone still there?  I swear! I am alive! 😆)

I've been attempting to write a blog post all week (and trying to do the 10 minutes of writing per day), but I've been failing on that account... I guess Fridays are a better day, as things wind down from the week.  In any case, there is an article from the Chronicle that's been on my mind this week, titled "Our Hallways are too quiet". Our department chair sent this to us (everyone in the department) as a thought piece, perhaps something to ponder and discuss in the fall - probably because our department is also like the department described in the article.

I had a variety of cognitive and emotional processes go off, and gears start grinding, while I was reading this.  I actually hadn't noticed that the author was from MIT... which only recently "discovered" online learning (like Columbus discovering the New World).  Yes, I am a little snarky, but I also think that your frame of reference is really important.  If you are a bricks-and-mortar institution, what you consider "community" might look different from what it looks like at an institution that is focused on distance education (or at least has a substantial DE component).  But I think I am getting ahead of myself.  Let me just say this:  my job title is "Online Program Manager" - as in, the person who runs the online components of a specific MA program.  Having been on campus for close to 20 years now, in a variety of roles, I can see both sides.  I think this particular article is really biased, in ways that its author doesn't even get.

Let's start with this:
Entire departments can seem like dead zones, and whole days can pass with only a glimpse of a faculty member as someone comes to campus to meet a student, attend a meeting, or teach a class. The halls are eerily quiet. Students, having figured this out, are also absent. Only the staff are present.
This excerpt, as well as the rest of the article, is very faculty-centric.  As if the faculty (or this particular faculty member, anyway) are the only ones who suffer any consequences from creepy hallways.  In my most recent job (headed into my 6th year soon!), and my first in an academic department, I've experienced the demoralization that comes with the absence of colleagues.  In all of my other jobs on campus I've always had colleagues around (with the exception of vacations and such), whereas in an academic department I didn't (don't) always see people.  In my induction period (when I was getting the lay of the land and doing a SWOT analysis of the program I was managing so I could be more effective), Mondays through Thursdays I'd at least see my fellow program managers and faculty here and there, but on Fridays it almost felt like being in the movie I Am Legend.  Granted, this didn't bother me back then because there were a lot of paper records to go through and make heads and tails of. Being busy meant that I didn't really mind being alone.  Once all the paper was organized and made sense of, and work could be done remotely, the big question that comes to mind is this:  well, if I can do my work remotely, and I don't have to deal with the x-hour commute, why would I need to go in?  Especially for someone who manages a distance learning program.  If one group of employees (faculty) can work remotely (effectively), why not another group whose job duties are conducive to it?  I do agree with one point made above:  students, having figured out that faculty aren't there, are also not there; but there is a big caveat here:  who are your students? Students in my department are (by and large) working adults, so even if faculty were around it doesn't mean we'd suddenly have students sitting around in semi-circles, drinking their dunkies coffee (local affectionate term for Dunkin' Donuts) and discussing Derrida.  If you think that way, you're living in a fantasy.  Student demographics matter.

Goin' onto the next point. The author writes about faculty avoiding the office for a variety of old-fashioned reasons, such as not being able to get work done, avoiding feuds, and avoiding time-sinks like watercooler talk, but then she turns her attention to the perennial foe: technology!
A big reason for decreased faculty presence in their campus offices is technology. Networked computers that allow one to write anywhere also allow us to have conversations with students and colleagues that used to take place in person. Creating new course materials and ordering books is easily done online. Cloud software has made pretty much all our work processes easily done from home, a vacation cabin, a foreign conference hotel. For many scholars, this has been a very liberating occurrence, giving them wondrous flexibility.
Pardon me, I don't know you, but I call 💀💢🐄💩😡 on this argument.  Yes, technology has facilitated certain efficiencies, like not having to fill out a form in triplicate, or not having to wait overnight for a journal article query that only returns the title and abstract of potentially relevant articles.  But technology has not caused faculty to not want to come to the office.  Other organizational factors play a major role in the day-to-day decisions on whether or not to work remotely.  When research productivity is what's sought, people will do what they need to do to be more productive in their research.  If community engagement, service, teaching, or other aspects of the professoriate are valued more, then people will gravitate toward those.  It basically comes down to incentives, and when there is little incentive to be on campus to meet those objectives, you will undertake them at a place that is most convenient for you.  I think a lot has to do with the expectations set forth by the institution, the institutional culture, and by extension the departmental culture.  Sure, you can have a department chair (the head honcho in an academic department) mandate that everyone (yes, including faculty) has to be there 3 days per week, and put in at least 10 hours of 'face time' in the department during regular business hours (9-5).  That's really only about 3 hours per day. Does 3 hours per day really build community?  Nope.  Does 3 hours per day guarantee that people will be there on those same days and hours?  Nope.  This is the equivalent of butts in seats, for no good reason.  It's as anachronistic as forcing students to endure a long lecture just because you haven't thought through your pedagogies.  First you determine what your root goal is (and no, more face time isn't a worthy goal), and then you hatch a plan to get there, while at the same time taking into consideration the various local variables, norms, and expectations (heck, maybe those need some rethinking too!)

Every time I hear about technology as the "big bad" I am reminded of the rebooted (and cancelled) ThunderCats.  From the fan wiki article (with my own annotations in brackets):
Most citizens [of Thundera] abhorred technology, denying the existence of machinery entirely and leaving thoughts of such things as fairy tales. This belief was a major contributing factor to their destruction, as the lizards [their enemy] attacked them with advanced bipedal war machines (Warbots) while the ThunderCats fought with bows and arrows.
Just an interesting side-trip - take it as you will 😂

Anyway, moving along. Finally, I see a conflation of the sense of community with face time, and they are not the same thing.  The author writes:
Some would argue that worrying about departmental community is ridiculous. After all, professors aren’t hired or promoted on the basis of departmental relationships, or civic engagement, and most faculty members desperately need quiet time in which to do research and write. True enough. As my colleague, Sherry Turkle, has argued: Conversation matters. Personal contact matters. It is very hard to build relationships with people we do not see in person, and such relationships are the bedrock of so much else that matters on any campus.
I think community is important.  However, just because someone is not in their office at the same time YOU are in your office doesn't mean that you can't have community.  And just because you're not meeting face to face doesn't mean that you aren't communicating.  And just because you aren't meeting face to face doesn't mean that you aren't having personal contact! I've had lots of meaningful conversations, and personal contact, with my many distance friends, family, and colleagues over the years.  From my doctoral cohort, to vconnecting friends and colleagues (sorry I've been a ghost - the dissertation is sucking my mental energy), to colleagues who are geographically dispersed.  Every time I hear of Sherry Turkle I can't help but roll my eyes. Yes, face to face is nice.  Yes, I like face to face sometimes, but face to face ain't the end-all be-all of conversations, connections, communities, and work.  Yes, we do need community.  Without it we are just a loosely joined confederation of people maybe striving toward a common goal (maybe not), but with community we become stronger, and we get smarter.  But community can be achieved in different ways (look at vconnecting, for example).

To wrap up, I am reminded of a joke, or something that one of my mentors (Pat Fahy) kept saying: "It's the parking, stupid!"  This was the response to the question "why do students pursue distance education?"  Of course, this is just one piece of the puzzle; others being things like mobility issues, health issues, childcare, elder-care, working two (or more) jobs, and so on.  I think in an era where we are offering some really great distance education programs (oh yeah... welcome to the party, MIT), and we've seriously considered what makes a good online program for our disciplines in order to get here, it would behoove us to also look at what makes our jobs effective and how we can effectively build communities of various modalities.  Forcing grown human beings to have face time so that they form community is the equivalent of forcing your kids to stay with "weird uncle Mike" or grandma because you feel like your kids need a connection with the rest of your family, but you haven't bothered making them part of your family in the day-to-day, except only on holidays.  Both kids and adults resent such forced actions.  We can do better.  Just sayin'

OK, now that I've ranted on 😏 - what do you think? 😃



University Education, the Workplace, and the learning gray areas in-between

 Permalink

Many years ago, maybe around 16 years ago, I was sitting in the office of my computer science major advisor, getting my academic plan for the next semester signed off on.  My computer science program was actually an offshoot of the mathematics department, and until recent years (2003?) they were one and the same.  My advisor, while looking at my transcript, noticed that (on average) I was doing better in language courses than in my computer science courses; which was technically true, but many courses designated as CS courses (and ones that were required for my degree) were really math courses, so you needed to do a deeper dive to see what I was actually doing better in.

I never really forgot what he said next.  He said I should switch majors; and it was odd that he didn't offer any suggestions as to how to improve†...  Being a bit stubborn (and relatively close to graduation) I doubled down and completed my major requirements (ha!).  During this chat I told him that I really wished my degree required more coursework in additional programming languages, because that is what I would be expected to know for work when I graduated. His response was that I could learn that on the job... needless to say, my 20-year-old self was thinking "so why am I majoring in this now, anyway?"

Fast forward to the recent(ish) past, with a flashback brought to you courtesy of this post on LinkedIn. I had recently completed my last degree (this time in Instructional Design) and I was having coffee with some good friends (and former classmates). We were a year or so out of school. Two of us already had jobs (same institutions as when we were in school) and one was on the hunt. His complaint was that school didn't prepare him for the work environment because he didn't know the software du jour (which at the time was Captivate and Articulate). I did my best not to roll my eyes, because software comes and goes, but theory (for the most part) really underlies what we do as professionals. In class there wasn't a dearth of learning about software, but there were limitations: namely the 30-day trial period of these two eLearning titles.  So we did as much as we could with them in the time we had, and we applied what we learned from the theoretical perspective.  No, we didn't spend a ton of time (relatively speaking) on the software, because that sort of practice in a graduate program should really be up to the learner, and it would cost them.  Captivate cost $1100 for a full license, while Articulate cost $999/year to license. That cost is actually more than double what the course cost! Furthermore, it privileges one modality (self-paced eLearning) and two specific eLearning titles. The fact of the matter is that not all instructional designers do self-paced eLearning enabled by these titles. Not all instructional designers are content developers‡. I find the author's following suggestion a bit ludicrous:

To replace the non-value add courses, decision makers can study current open job descriptions, and ignore academic researchers' further suggestions. Programs can then be revolutionized with relevant course topics. These new courses can include relevant production tools (e.g. Storyline, Captivate, Camptasia, GoAnimate, Premier, etc.) and numerous cycles of deliberate practice, where students develop a course on their own, and receive the feedback they need. This will make hiring managers very happy.
While I do see value in learning specific technologies, that's not the point of a graduate degree, and graduate courses should prepare you to be a self-supporting, internally motivated learner.  Courses should give you the staples that you need to further make sense of your world on your own, and to pick up the tools and know-how that you need for specific situations♠.  Focusing a graduate degree on production tools is a sure way to ignore the vast majority of what makes instructional design what it is. Practice is important (i.e., building your learning solutions) but it's not the only thing that's important. I also think that employers need to do a better job when posting instructional designer job descriptions, but that's a whole other blog post.

I do think that if you are new to any field you (as a learner) should be taking advantage of any sort of internship, where the rubber (theory) meets the road.  In some programs internships are required, and in others they are optional.  I do think that internships are an important component for newbies in the field.  When I was pursuing my MA in applied linguistics, in a program that focused on language acquisition and language teaching, the field experience (aka internship) was a requirement.  People with classroom teaching experience could waive the requirement and take another course instead, but for me it was valuable (as much as I had to be dragged to it kicking and screaming).  In hindsight, it gave me an opportunity to see what happens in different language classrooms, something I wouldn't have experienced otherwise.

So, what are your thoughts? What do you think of the LinkedIn article?


Notes:
† I guess this must have been a problem with advising in the college in general, because years later the college of science and maths put together a student success center.  They were probably hemorrhaging students.

‡ I suspect this is another brewing blog post.

♠ So, yeah... Years later I see some of the wisdom of my advisor.  I think he was partly right, in that I should be able to pick up what I need once I get the basic blocks, but I think he was wrong to suggest that I change majors, and I do think that less math and more computer science with applied cases would have been a better curricular package.

Academic Identities, Terminal Degrees, power of the network...

 Permalink
It's been a while since I last just sat down to think and write about something (like the good old days when I was cMOOCing...).  These past few weeks have been about conferences, and getting back on track with my dissertation proposal (although I think I am the only one keeping score on that at this point).

In my attempt to get back to writing, and engaging with friends and colleagues out there in the wild blue yonder that is the internet, I thought I would pick through my accumulated Pocket list until it's almost empty.  One of the ponderings of interest came by means of an article on Inside Higher Ed titled Academic Identities and Terminal Degrees, where the overall question was: does one need an academic terminal degree to identify professionally with that discipline? And, as Josh goes on to explicate:

Can only someone with a Ph.D. in economics call herself an economist? Do you need a Ph.D. in history to be a historian? How about sociology and sociologist? Biology and biologist? Anthropology and anthropologist?

My views on the topic have changed in the past fifteen years; I basically compare my views as someone who had just finished a BA to my current views... on the road to earning a doctorate (are we there yet? 😂).  Originally I would have said that someone could call themselves something only if they've earned a degree in that field. I think today I would call that a protected professional title, and a degree or some sort of certification would be a way to demonstrate that you've been vetted into that profession somehow by somebody. Now, which titles (economist, linguist, archaeologist, biologist, etc.) are protected, and which are up for grabs... well... that's a subject for debate! At the time, the only means of obtaining that expertise (in my mind) was through formal degree programs.

Since that time, in addition to completing a few masters programs and discovering new fields and new knowledge, I've also discovered the power of the network, the potency of communities of practice, groups such as Virtually Connecting, and expanding my own learning and practice outside of the classroom.  My current feeling is that it's not really as black and white as my younger self thought.  I do think that obtaining a doctorate in the field is one path to getting there, but it's not the main criterion for developing your identity in that field.  The main criterion that I have (at this point in time, anyway) is practice and the expansion of your own skill set in that field. I guess a good way to describe this is through some examples that came to mind while I was trying to tease it out for myself:

Example 1: The non-practicing PhD
A few years ago I was a member of a search committee looking to fill the position of program director for an academic program at my university. Among the requirements for this position was a terminal degree (PhD or EdD, as defined in the job posting).  We got a variety of CVs from interested applicants.  In reviewing CVs I noticed an interesting cluster of applicants: those who had earned a terminal degree (four, five, six, ten years ago), but had no publications (or other academic work) under their name other than their dissertation.  Their dissertation was listed on their CV, but nothing else. I am not saying that publishing in academic journals is the only way to demonstrate academic work. You could, for example, be presenting at conferences, presenting at professional association workshops, or writing for a blog or professional publication (basically translating academese for professionals). These job applicants had none of that, so they were demonstrating a lack of practice and continuous improvement in their field.  They had earned their badge of honor by completing a doctoral program, but there was no follow-through.   For individuals like that, I'd have a hard time calling them an economist, a biologist, a demographer, or whatever.  I'd call them Doctor so-and-so, but they - in my mind - are not an embodiment of what it means to be a ___________ (fill in blank).


Example 2: Word ambiguity
When I was close to finishing my degree in Applied Linguistics I came across a podcast and a blog of someone who called himself a linguist. I was really happy to come across these because I could continue to learn about a topic of interest once I graduated (and also while I was in school), and this was exciting because back then there weren't really that many linguistics blogs or podcasts around.   My working definition of linguist is a person who studies linguistics (where linguistics is the scientific study of language).  This is how I've always understood linguistics.  The person on the other end of this podcast was not a linguist in that sense.  He was a linguist in the dictionary sense of a person skilled in foreign languages.  Personally, I'd call that a polyglot and not a linguist. It wouldn't have bothered me too much that this person called himself a linguist if he hadn't started to preach in his podcast about the best way to learn a language.  At that moment, I feel, he crossed the line into the domain of what I consider linguists: those who are clinical linguists (for lack of a better term), and those who are teachers of language who take an inquisitive and critical approach to their teaching and share what they've learned through their research (published or not). This individual calling himself a linguist was neither a teacher nor a linguist (in the scientific meaning). Hence the more accurate term I would use is polyglot, not linguist.


Example 3: The practicing MA graduate
In many fields completing an MA thesis is the only means of graduating from your master's program.  Even if you don't conduct a thesis to graduate, if you've studied research methods, continue to hone your skills of inquiry, and continue to read up on advances in the field, I feel like you have the right to call yourself a ________ (fill in relevant blank) - if, of course, there isn't a regulatory board for your profession (nursing, medical, legal, accounting, and other professions of that type). There are many smart people out there who do a lot of work, and who diligently work on keeping their knowledge and skills updated.  Some of them even research and publish.  Through their continued efforts I think that they've demonstrated that they are serious enough about their profession to be included in that group that calls themselves a ___________ (fill in blank).


At the end of the day, for me, an academic identity isn't necessarily tied to a degree earned.  A degree on someone's CV might give you clues as to what their academic identity is, but it's not the only consideration.  I think that practice and application are key considerations when you're deciding whether someone is in the group or not.  I think if a word has a double meaning - as with example #2 - the thing to do is stick with the more accepted or widely used meaning, instead of one that is rarely used.  I think it's the honest thing to do.


Your thoughts?
