Club Admiralty

v7.0 - moving along, a point increase at a time

Multilitteratus Incognitus

Traversing the path of the doctoral degree

Is "online learning" the new "community college"?



[Image: me, pondering]
OK, maybe the analogy isn't totally clear to you, so let me explain my context. 

When I was in high school (mid-to-late 90s) the advertised (or expected) path after high school seemed pretty clear to me: go to college. There were really no "buts" about it, and gap years weren't even considered (those were luxuries for well-off people who had money to burn). It was an expectation, from guidance counselors, from teachers, from parents, maybe even from society. Higher education was the path to a good middle-class life, and people were willing to take out loans to go to their dream school in order to achieve this goal. This was a pretty important goal for my parents, considering that neither one of them made it to university and I'd be the first in the family (maybe even my broader family) to do this. No pressure, eh? ;-)

One thing that seemed like an undercurrent was how dismissive some (many?) people were of community colleges at the time. I had never really thought of community college as an option because of jokes like this one:

You better do well in __(subject)__, otherwise you'll be attending Cape Cod Community College!

I don't know why CCCC was the butt of the joke for this particular high school teacher, but the frequency of such jokes (and how virally they spread among students) definitely left the impression that community college was a consolation prize rather than a fantastic (and comparatively cheaper) educational resource! Imagine how much money would be saved if students decided to complete the first two years of their higher education at a CC and then transfer to another school! Or graduate from CC and then go into university with advanced standing. From what I know, in my local context, CCs were (and are) commuter schools. You don't live on campus at a CC. Compare that to some big-name school in Cambridge (Massachusetts) that my folks wanted me to apply to, which required first-year students to live in the dorms if they wanted to attend.

Anyway, I digress from my original point. The main idea here is that CC, although valuable, was constantly dismissed. Fast forward to our current pandemic world: students are suing universities for the return of their tuition and fees. Never mind that some of these law firm pitches sound a lot like ambulance chasing; let's dive down to the core. Universities have been pitched as places where people go to explore subjects and topics, a type of free-range learning. This is true for both undergraduate and graduate education.

In recent years, what you saw in university advertising tended to be anything but the learning. Learning objectives? Snore! Learning outcomes? Yawn! Rooms with lavish wood paneling? Noice! Parties? Awesome! Spring fling dances and cookouts? I'm there! When you consider the marketing message of the modern university, which focuses on amenities, it's not hard to see why people are pushing back against the price tag. If you paid for a cruise in the Bahamas, why would you "settle" for the Holodeck?

What's hiding behind those amenities is the promise of a free-range learning environment where you, too, can learn and be inspired by the greats! The reality, though, is that you aren't really in a free-range learning environment. When your tuition and fees cost $60,000 per year (or more), a wise student would do a reality check and see that it's not free-range learning but rather (in many cases) a prix-fixe menu: students pick X-many courses from column A, Y-many from column B, and Z-many from column C so they can graduate as soon as possible. The longer you stay, the bigger your bill!

Conversely, in online learning, where you don't have the striking visuals of campus life and all the non-academic distractions, you are forced to start with the learning outcomes. You need to assess programs based on those outcomes, and you need to advertise based on the transformative experience of the learning and the sorts of careers it prepares you for, not the extracurriculars. However, it seems that prospective students (and their parents) don't have metrics by which to assess programs on their learning outcomes, so, lacking the social visuals or metrics offered by a campus experience, they dismiss online education, much like CC education was dismissed by the relevant authority figures in my teenage life.

I think that for-profit schools also haven't helped the reputation of online learning, but talking about "zoom university" and framing educational costs as all-or-nothing is also not very productive. Education is valuable. I would argue that education at $60k/year was never valuable to people like me, first-generation students, but I hope that more people are teasing out what matters in education. I hope the medium doesn't impact the message in this case. And I hope that dual-mode universities finally put some support behind their online offerings beyond the classroom.

Your thoughts?  Do you see a connection between online learning and the community college in how they are talked about?

~AK


Depth or Breadth?

I was reading a post on Slashdot the other day about a person going back to school to complete their computer science degree.

Here's a quick quote:

I recently went back to college to finish my CS degree, however this time I moved to a new school. My previous school taught only C++, except for a few higher level electives (OpenGL). The school I am now attending teaches what seems like every language in the book. The first two semesters are Java, and then you move to Python, C, Bash, Oracle, and Assembly. While I feel that it would be nice to get a well-rounded introduction to the programming world, I also feel that I am going to come out of school not having the expertise required in a single language to land a good job. After reading the syllabi, all the higher level classes appear to teach concepts rather than work to develop advanced techniques in a specific language. Which method of teaching is going to better provide me with the experience I need, as well as the experience an employer wants to see in a college graduate?



Now, there are a ton of opinions in the Slashdot thread that geeks and non-geeks alike should have a look at, because the question posed is a good one: what type of education should you get? Should it be as broad as possible? Or should it be more focused but deeper?

This story also reminds me of an interesting exchange I had with my undergraduate advisor in computer science. My computer science program did not take the breadth approach, but rather the narrower one. Yes, we did learn about automata, basic and advanced algorithms, logic, and so on (so all the things that are mentioned in the comments section, and all the things that every computer scientist should know), BUT we didn't do a lot of languages. We covered Java, ANSI C, and x86 Assembly, and if you took specific electives you would get PL/SQL and SmallTalk.

The problem for me was that I was not being familiarized with more of the languages that exist out there in the real world (like C#, for example). What I failed to realize back then is that Java, C, and assembly are really all you need to get started. My advisor told me that the program focuses on concepts (well, duh!) and that I could easily learn any language I wanted on my own. The issue I had was that the languages in the curriculum weren't used a whole heck of a lot: two semesters of Java, two of C, and one of assembly.
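
To illustrate (this is just my own quick sketch, not anything from either school's curriculum; the function name and the sample numbers are made up), here's what that transfer looks like: a binary search written in Python, a language I never took a course in, with comments mapping each line back to the Java/C ideas we did cover.

# binary_search.py - illustrative sketch only; names and sample data are made up
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if it is absent."""
    low, high = 0, len(items) - 1        # same setup as Java: int low = 0, high = items.length - 1;
    while low <= high:                   # same loop condition, minus the parentheses and braces
        mid = (low + high) // 2          # integer division, like (low + high) / 2 on Java ints
        if items[mid] == target:
            return mid                   # found it
        elif items[mid] < target:
            low = mid + 1                # discard the left half
        else:
            high = mid - 1               # discard the right half
    return -1                            # not found

if __name__ == "__main__":
    print(binary_search([2, 3, 5, 7, 11, 13], 7))   # prints 3
    print(binary_search([2, 3, 5, 7, 11, 13], 4))   # prints -1

Nothing there is hard because of Python; the hard parts (the loop invariant, the boundary conditions, the integer division) are exactly the concepts the Java and C courses drilled, which I suppose was the advisor's argument in a nutshell.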

Yes, you need to take the bull by the horns, program your own projects, and have what the Greeks call μεράκι (I guess the closest equivalent is the concept of being "jazzed" about something), but as an undergraduate with a full course load and a job, it's not easy to fit in projects just for fun.

Personally, I would have preferred more familiarity with more languages, which I could then practice on my own, rather than this uncomfortable in-between.

What do you think?

50 years of Strunk and White

Or...rather...50 years of bad grammar advice!

I was reading this article on the Chronicle of Higher Ed a few weeks back and I didn't get an opportunity to fully savor it, so I re-read it.

As a typical American undergraduate student, I had Strunk and White as a required book, a style manual that we had to abide by. I remember really disliking my English 101 and 102 classes, but I don't remember why. Perhaps Strunk and White was one of the reasons - it seems I have completely blocked the experience from memory :-)


In any case, the article was QUITE interesting and I recommend that you read it, even if you are not that much into writing or grammar or linguistics.

How much do you remember from LANG 101/102?

I was reading Revising and Defending the Foreign Language Major on InsideHigherEd the other day when I had a small flashback to recent conversations I've had with former classmates about their language learning experiences and how much of those languages they have retained.

In high school, I was required to take two years of a foreign language in order to graduate. I elected to take four years (coming up to an intermediate-advanced level). Had I started French in 8th grade, I would have had the opportunity to take fifth-year French (AP level).

When I went to college as an undergrad, I was required to take two semesters (101 and 102) of a language in order to graduate. I elected to minor in Italian (6 or 7 courses, if I remember correctly) and I almost minored in German (I took 6 out of 7 courses). My interest in language is cultural and communicative - not literary - and that seventh German course would have been a German literature course taught in English (so I couldn't even practice the language), and it would have meant one extra semester to graduation - no thanks, I said. With French, German, and Italian I am conversant to various degrees (depending on the language).

Now, I also took 101 and 102 of Japanese, Chinese, and Russian. My recollection of these languages is very limited. I can say good morning, hello, and thank you, maybe even "my name is..."

I have asked classmates who only took 101/102 with me if they remember much beyond that, and the response was negative. In other words: wasted time, wasted money, wasted credits! What is the point of requiring someone to take x number of classes in a foreign language if they don't see a benefit from it? At least with Art, Sociology, Psychology, and Philosophy, the way you look at things, the way you think, the way you process is altered in some fashion. Language is a communicative process. We learn language to communicate with others, so requiring so-many-courses and having nothing to show for it is not good.

So here's my modest proposal: Require every undergraduate to minor in a language and pass a proficiency exam before they can graduate, and no one is exempt! You know Greek and English? Excellent opportunity to pick up Chinese, or Russian, or Japanese, or Spanish or whatever! You only know English? Excellent opportunity to learn more about another language and culture.

A minor is six or seven courses. Within the confines of 18 to 21 credits, students can become conversant in a foreign language, learn a bit about world history as it relates to that language and culture, learn a bit of its literature (don't overdo the literature; after all, the focus is communication), and be able to communicate well in an oral and written manner!

Now, if high schools were the same, if all students were required to have four years of English and four years of another language (plus pass competency exams), the time spent in the classroom would be well worth it because we would come out with tangible outcomes!


-just my two cents on the issue

When the academic world and the real world meet

I saw this article over at the NEA journal. (click here for the full PDF)

I recently visited my dad, a person who is very intelligent but who, like the dad in the article, didn't go to college (heck, my dad didn't even go to middle school). This story reminded me of a conversation I had with him about his work and salary versus mine (i.e., they're about the same) despite my education.

I've heard a lot of banter over at blogs like Brazen Careerist about not learning concrete skills in college. My undergrad experience was more of a "learn how to think" lesson: learn to be critical, analytical, and calculating, and get that well-rounded learning that everyone covets. When I first graduated, I felt like the early-20-somethings on Brazen Careerist, like my college education was almost a waste of time because I did not learn concrete skills.

I kinda learned Java, and kinda learned C, but I wouldn't have been readily employable by a company. In recent years, though, my undergraduate education has surfaced many, many times in the oddest of places! Those computer science classes that I thought were useless are actually useful. The only thing I wish I had had was a required internship.

In the article we see some advocacy for required internships, or hands-on learning, so that when you graduate you don't only have theoretical knowledge but also employable skills. A mix of academic and vocational training is a good thing.




Excerpt:
My father was born in 1911. Like many from that era, he left formal education before completing grade school and went to work helping support his family. He never learned to read well. When I was a child, it was my father’s Sunday morning ritual to gather his kids around him on the sofa and read us the comics from the just delivered Milwaukee Journal. It was an act of love for his children, but by the time I was in third grade, I could read the “funnies” more quickly than my father. I am certain he would have failed any exam I have ever given to my students.

But my father was an excellent automotive mechanic who owned and operated a Ford-Mercury dealership for over 40 years, and owned and operated the school buses in my small central Wisconsin hometown for nearly 30 years. Between both businesses he typically had 30 to 40 full- and part-time employees. He was one of a handful of individuals instrumental in building the first hospital in our community, was president of the hospital board for many years, served locally as president of the Chamber of Commerce, and statewide as chairman of the Wisconsin School Bus Owners Association. He made considerably more money than I do as a teacher, but he worked far more hours, and year-round.

An accomplished and intelligent human being, my father lacked the sophistication that comes with higher education. His abilities would have appeared marginal by most of the measurements used in academic assessment. Those who worked with him knew better of course, but the point is that our concepts of what it means to be educated or intelligent are often inadequate. Just as important, my father’s abilities would have meant nothing had they not been supported by his attitudes—his deep humility, simple approach to life, and unwavering commitment to those around him.

I tell this story because it relates to the students I now teach and to issues I believe need to be addressed. To better understand this, it might be helpful to tell the story of my own journey through the educational system. It would probably not be noteworthy, except that I hear variations of it from many of my students.