Too cool for school: a retrospective on dropping out of university

The homo computatis college drop-out was a cliché whose establishment in folklore predated my departure from the University of Georgia by at least two decades. Nevertheless, I also joined the club. In the first semester of the 2006-2007 academic year, after dabbling half-heartedly in coursework for two years as a philosophy major, I threw open the gates and exiled myself into the great beyond.

Although my actions conformed to a known stereotype, I still feel I was something of an early adopter, virtually alone in my group of peers. I came to count many acquaintances who never pursued post-secondary education in the first place, or who floated in and out of community and technical colleges amidst work and financial struggles, but I knew of vanishingly few, especially at that time, who straight-up dropped out of a four-year institution–that is, students who abruptly withdrew from their courses mid-semester and skipped town with no intention of returning. That sort of thing seemed to be the province of Larry Page, Sergey Brin and Bill Gates–definitely outliers. And unlike them, I wasn’t at the helm of a world-changing startup on a clear trajectory into the multi-billion dollar stratosphere, so I couldn’t point to an overwhelming and self-evident justification.

A lot has changed since then. By all appearances, we seem to be passing through a watershed moment in which existential questions about the value and purpose of college and traditional higher education in America are emerging at the mass level among Millennials and Generation Z-ers. This discussion has been spurred on by the housing crisis, a growing tower of debilitating student loan debt, rising tuition, and mounting questions about the future of employment in the developed world, especially the ways in which opportunity has become more and less democratic in the context of technological shifts and globalisation. There’s also a growing interest in open courseware and novel forms of technology-enabled correspondence learning–though, I should say, I don’t share in the TEDdies’ conflation of the Khan Academy with higher education. Still, nearly a decade has passed since I made my fateful decision to forsake the path of higher learning, so it seems like a good time to reflect on where it’s taken me and whether it was a good call.

Visiting Western Michigan University in Kalamazoo for a conference, somewhere around 4th or 5th grade.

Some aspects of the progression of events will sound familiar to many in IT. I grew up mostly around university environments, and computer programming figured prominently among my childhood interests. It was an interest easily encouraged by proximity to lots of expensive computer equipment, good Internet connectivity and access to sympathetic graduate student mentors. I had been playing with computers and the Internet since age 8 or so, and wrote my first program at 9. I spent much of my adolescent and teenage years nurturing this hobby, with a strong interest in both software engineering and operational-infrastructural concerns. As most people in IT know, ours is a profession that offers unrivaled self-teaching opportunities, enabled by a highly Googleable open-source software ecosystem and a collaborative social dynamic. That’s why so many programmers like me are self-taught.

I also had various other intellectual interests, however, and had no plans to make a tech career. In fact, for most of my life prior to age eighteen or so, I wasn’t even particularly aware that I had a marketable skill set. The desire to get into computing as a child was utterly innocent and non-ulterior, as arbitrary as some kids’ choice to take up the cello or oil painting. I entered UGA in 2004 as a political science major and shortly switched to philosophy, with vague ideas of law school in the future.

It’s also worth remarking that I came from a humanities-oriented academic family and cultural background; my parents were professors of philosophy at a top-tier Soviet university, and my father is a professor of the same at UGA. My extended family background includes a venerable dynasty of musicians, including my great-grandfather, Mikhail Tavrizian, the conductor of the Yerevan State Opera and a People’s Artist of the USSR, as well as his Russian wife Rassudana, a renowned ballerina at the Bolshoi Theatre. My late grandmother was a philologist and a member of the philosophy faculty of the Russian Academy of Sciences. When my parents emigrated to the US in 1992 (I was six years old), they redid graduate school entirely at the University of Notre Dame, which is where my primary school years were spent. My social and cultural life at that time played out in housing for married graduate students with children, where I ran around with friends of dozens of different nationalities.

All this to say, I was on a strong implicit academic trajectory as a function of my upbringing, a trajectory rooted in the humanities, not the hard sciences. In fact, my parents were not especially supportive of my computing hobbies. As they saw it, I think, spending my days holed up in my room modem-ing away interfered with schoolwork and did little to promote the kind of cultural development consonant with the mantle I was meant to inherit.

Nevertheless, I began working when I was eighteen (my parents did not let me work prior to that–Russian parents do not, as a rule, share the American faith in the virtues of part-time work for teenagers or students). My first job was in technical support at a small Internet Service Provider in our university town of Athens, GA, at first very much part-time and reasonably complementary to college. I earned a 4.0 GPA in the first semester of my freshman year.

However, I was ambitious and precocious, decidedly more interested in work than school. Within a year, after some job-hopping (which included a stint in IT and/or warehouse labour at a Chinese importer of home and garden products–how’s that for division of labour?), I returned to the ISP at twice the pay rate and assumed the role of systems administrator. I was learning a great deal about real-world business and technology operations, getting my hands on industrial infrastructure and technologies, and rapidly assimilating practical knowledge. I had been around theoretical publications and underpaid graduate student assistants hunkered in dimly lit carrels my whole life, but I had never had to learn the basics of business or how to communicate with all kinds of everyday people on the fly. Although the cultural clash was sometimes frustrating, the novelty of learning practical skills and how to run a real-world operation was intoxicating. It occasionally led to an outsized moral superiority complex, too, as I became conscious of the fact that at age 19, I could run circles around most of the job candidates being interviewed, some of whom had master’s degrees. Clearly, I was doing something right!


Fielding customer calls at the ISP. Clearly, I’m thrilled to be doing customer support despite being a sysadmin.

From that point, my career rapidly evolved in a direction not compatible with school. Formally, I was still part-time and hourly, but it was effectively closer to a full-time gig, and I rapidly took on serious responsibilities that affected service and customers. Small as the company was, in retrospect, I was a senior member of its technical staff and an authority sought after by people ten to twenty years older. My commitment to school, already a decidedly secondary priority, rapidly deteriorated. I had no semblance of campus life immersion or student social experience. From my sophomore year onward, I was effectively a drop-in commuter, leaving in the middle of the day to go to a class here and a class there, then hurrying back to the office. I neither had time for studying nor made the time. My GPA reflected that. I didn’t care. School droned on tediously; meanwhile, T1 circuits were down and I was busy being somebody!

As my interests in telecom, networking, telephony and VoIP deepened, it became clear that the next logical career step for me was to move to Atlanta; Athens is a small town whose economy would not have supported such niche specialisation. Toward the end of the second semester of my sophomore year, I began looking for jobs in Atlanta. I unconsciously avoided the question of what that meant for my university career; I was simply too engrossed in work and captivated by career advancement. In the first semester of my junior year, by which point my effort at university had deteriorated to decidedly token and symbolic attendance, I finally found a job in Alpharetta (a suburb of Atlanta) at a voice applications provider. In October 2006, at the age of twenty, I announced that I was quitting university and moving to the “big city”.

My parents reacted better than I thought they would. I halfway expected them to disown me. However, in hindsight, I think they were pragmatic enough to have long realised where things were headed. It’s hard for me to say, even now, to what degree they were disappointed or proud. I don’t know if they themselves know. What was most clear at that moment was that I am who I am, and will do as I do, and there’s no stopping me.

That’s not to say that East and West didn’t collide. I remember having a conversation that went something like:

– “But what happens if you get fired in a month?”

– “Well, I suppose that’s possible, but if one performs well, it’s generally unlikely.”

– “But is there any guarantee that you won’t lose your job?”

Guarantee? That was definitely not a concept to which I was habituated in my private sector existence.

– “There are never any guarantees. But my skills are quite portable; if such a thing happened, I could find another job.”

– “It just seems very uncertain.”

– “That’s how it goes in the private sector.”

All the same, it was clear enough that, for all the problems this decision might cause me, I certainly wasn’t going to starve. Even in Athens, I was an exceptionally well-remunerated twenty-year-old. My first salary in Atlanta was twice that. Moreover, it was clear that IT was an exceptionally democratic and meritocratic space; if one had the skills, one got the job. My extensive interviews in Atlanta drove home the point that potential employers did not care about my formal higher education credentials by that point in my career development. The “education” section of my résumé had long since been deleted, replaced by a highly specific employment history and a lengthy repertoire of concrete, demonstrable skills and domain knowledge with software and hardware platforms, programming languages, and so on. The résumés I was sending out to Atlanta companies at age twenty proffered deep and nontrivial experience with IP, firewalls, routers, switches, BGP, OSPF, Cisco IOS, Perl, Bash, C, PHP, TDM, ADSL aggregation, workflow management systems, domain controllers, intrusion detection systems–what didn’t I do at that ISP? There aren’t many companies that would have let someone of my age and experience level touch production installations of all those technologies. I was bright-eyed, bushy-tailed, and soaked it all up like a sponge. And when asked for substantiation by potential employers, I sold it.

Those of you in IT know how this works: formal education is used by employers as signalling about a candidate only in the absence of information about concrete experience or skills. All other things being equal, given two green, inexperienced candidates among whom one has a university diploma and one doesn’t, employers will choose the one who finished university, as it’s a proxy for a certain minimal level of intelligence and ability to complete a non-trivial multi-year endeavour. When concrete experience and skills are present, however, the educational credentials fly out the window for most corporate engineering and operations jobs, and the more one’s career evolves, the less relevant early-stage credentials become. Moreover, there are innumerable people in mainstream IT whose university degrees were not in computer science or affiliated subject matter, but rather in a specialty like literature, history or ecology.

My next three jobs were in Atlanta, within the space of about a year and a half. I averaged a job change every six months or so, often considerably increasing my income in the process. By the time I was barely twenty-two, I had worked for a voice applications provider, a major local CLEC and data centre operator, and an online mortgage lender.

Of course, certain jobs were off-limits. I couldn’t do research work that required a formal computer science background, nor take jobs in government or certain other large institutions that remained sticklers for credentials. I lacked the formal mathematics and electrical engineering background necessary for low-level hardware design work. It’s also quite likely that if I had tried to climb the corporate ladder into middle to upper management, I would, at some point later in life, have bumped into a ceiling for degree-less drop-outs. When one gets high enough, it becomes comme il faut to have an alma mater in one’s biography, even if it’s just an airy-fairy management degree from a for-profit correspondence course mill. The only way I know of to get around that is to have had a famous and inscrutable business success (e.g. an acquisition) to overshadow it. Click on “Management Team” under the “About” section of some tech companies’ web sites to get the drift. Appearances are important at that level.

I didn’t stick around long enough to figure out where exactly the limits were (although I didn’t get the impression there were many, as long as one could demonstrably do the work). In early 2008, I was abruptly fired after some political clashes. Also, they don’t take kindly to the habitual morning tardiness of “programmer’s hours” in financial services. Instead of looking for my seventh job in four years, I decided to go out on my own. I had been itching to do it for quite some time, but didn’t quite have the wherewithal to walk away from the steady paycheck. Getting fired has a way of forcing that issue.

And so, on a cold, windy January day in 2008, barely twenty-two, I left the building with my belongings in a box, with nearly zero dollars to my name, having wiped out my savings with a down payment on a downtown Atlanta condo. I had no revenue and no customers. A friend and I went to celebrate. I was determined, though, and hit the ground running, and that’s how I started Evariste Systems, the VoIP consultancy turned software vendor that I continue to operate today, nearly eight years later.

Because the US does not have a serious vocational education programme and because the focus of the “everyone must go to college” narrative of the last few decades is purported success in the job market (or, more accurately, the threat of flipping burgers for the rest of one’s life), the first and most pertinent question on American students’ minds would be: do I feel that I have suffered professionally because I did not finish my degree?

I didn’t think I would then, and I still don’t think so now. Notwithstanding the above-mentioned limitations, it’s safe to say that I could qualify for almost any mainstream, mid-range IT job I wanted, provided I evolved my skill set in the requisite direction. In that way, IT differs considerably from most white-collar “knowledge work” professions, which are variously guilded (e.g. law, medicine) or have formal disciplinary requirements, whether by the nature of the field (civil engineering) or by custom and inertia (politics, banking). Although politics perverts every profession, IT is still exceptionally meritocratic; by and large, if you can do the job, you’re qualified.

The inextricable connection of modern IT to the history and cultural development of the Internet also moves me to say that it’s still the easiest and most realistic area in which one can receive a complete education through self-teaching. You can learn a lot about almost anything online these days, but the amount of resources available to the aspiring programmer and computer technologist is unparalleled.

An insert from a stack of mid-1990s PC Magazines discarded by my neighbour, which I coveted tenaciously.

That doesn’t mean I’d recommend skipping college generically to anyone who wants to enter the profession at roughly the same level. I put in, as a teenager, the requisite ten to twenty thousand hours thought to be necessary to achieve fundamental mastery of a highly specialised domain. However, I can’t take all the credit. I was fortunate to have spent my formative years in a university-centric environment, surrounded by expensive computers and highly educated people (and their children), some of whom became lifelong mentors and friends. Although my parents were not especially thrilled with how I spent my free time (or, more often, just time), they had nevertheless raised a child–as most intelligentsia parents do–to be highly inquisitive, open-minded, literate and expressive, with exposure to classical culture and literature. Undergoing emigration to a foreign land, culture and language at the age of six was challenging and stimulating to my developing mind, and the atmosphere in which I ended up on the other side of our transoceanic voyage was nurturing, welcoming and patient with me. The irony is not lost upon me that I essentially–if unwittingly–arbitraged the privilege associated with an academic cultural background into private sector lucre. A lot owes itself to blind luck, just being in the right place at the right time. I could probably even make a persuasive argument that I lucked out because of the particular era of computing technology in which my most aggressive uptake played out.

This unique intersection of fortuitous circumstances leads me to hesitate to say that nobody needs a computer science degree to enter the IT profession. My general sense is that a computer science curriculum would add useful, necessary formalisation and depth to the patchwork knowledge of the average self-taught techie, and this certainly holds true for me as well–my understanding of the formal side of machine science is notoriously impoverished, and stepping through the rigorous mathematics and algorithms exercises would doubtless have been beneficial, though I don’t think it would have been especially job-relevant in my particular chosen specialisation.

Still, I’m not committed to any particular verdict. I’m tempted to say to people who ask me this question: “No, you don’t need a degree to work in private industry–but only if you’re really good and somewhat precocious.” Many nerds are. Almost all of the really good programmers I know have programmed and tinkered since childhood. It comes part and parcel, somewhat like in music (as I understand it). In the same vein, I don’t know anyone who wasn’t already gifted in IT but came out of a CS degree that way.

On the other hand, for the median aspiring IT professional, I would speculate that a CS degree remains highly beneficial and perhaps even essential. For some subspecialisations within the profession, it’s strictly necessary. I do wonder, though, whether a lot of folks whose motive in pursuing a CS degree is entirely employment-related wouldn’t be better off entering industry right out of high school. They’d start off in lowly entry-level positions, but I would wager that after four years of real-world experience, many of them could run circles around their graduating peers, even if the latter do have a more rigorous theoretical background. If practicality and the job market are the primary concern, there are few substitutes for experience. Back at my ISP job, CS bachelors (and even those with master’s degrees) were commonly rejected; they had a diploma, but they couldn’t configure an IP address on a network interface.

Another reason I don’t have a clear answer is that things have changed since then; a decade is a geological age in IT terms. I’ve also spent twice as much time self-employed by now as I did in the employed world, and niche self-employment disconnects one from the pulse of the mass market. I know what I want in an employee, but I don’t have a finely calibrated sense of what mainstream corporate IT employers want from graduates these days. When I dropped out, Facebook had just turned the corner from TheFacebook.com, there were no smartphones as we know them, Ruby on Rails and Amazon EC2 were brand-new, and there was no “cloud orchestration”, no Node.js, no Docker, no Heroku, no Angular, no MongoDB. The world was still wired up with TDM circuits, MPLS was viewed as next-generation, and VoIP was still for relatively early adopters. The point is, I don’t know whether the increasing specialisation at the application layer, and increasing abstraction more generally, has afforded even more economic privilege to concrete experience over broad disciplinary fundamentals, and if so, how much.

All I can firmly say on the professional side is that it seems to have worked out for me. If I have been hindered in some way by the lack of a university diploma, I haven’t noticed. I’ve never been asked about it in any employment interview since my student-era “part-time” jobs. For what I wanted to do, dropping out was the right choice professionally, and I would do it again without hesitation. It’s not a point of much controversy for me.

The bigger and more equivocal issue on which I have ruminated as I near my thirtieth birthday is how dropping out has shaped my life outside of career.

I don’t mean so much the mental-spiritual benefits of a purportedly well-rounded liberal education–I don’t think I was in any danger of receiving that at UGA. 80% of my courses were taught by overworked graduate teaching assistants of phenomenally varying pedagogical acumen (a common situation in American universities, especially public ones). The median of teaching quality was not great. And so, I’m not inclined to weep for the path to an examined life cut short. It’s not foreclosed access to the minds of megatonic professorial greats that I bemoan–not for the most part, anyway.

However, moving to Atlanta as a twenty-year-old meant leaving my university town and a university-centric atmosphere. My relatively educated environs were replaced with a cross-section of the general population, and in my professional circles, particularly at that time, I had virtually no peers. My median colleague was at least ten years older, if not twenty, and outside of work, like most people living in a desolate and largely suburban moonscape, I had nobody to relate to. At the time I left, I found value in the novelty of learning to work and communicate with the general public, since I had never had to do it before. I thought our college town was quite “insular”. In retrospect, though, it would not be an exaggeration to say that I robbed myself of an essential peer group, and it’s no accident that the vast majority of my enduring friendships to this day are rooted in Athens, in the university, and in the like-minded student personalities that our small ISP there attracted.


Beautiful morning view from the balcony of my recently foreclosed condo.

As a very serious and ambitious twenty-year-old moving up the career ladder, I also took a disdainful view of the ritualised rite of passage that is the “college social experience” in American folklore. I didn’t think at the time that I was missing out on gratuitous partying, drinking, and revelatory self-discovery in the mayhem of dating and sex. If anything, I had a smug, dismissive view of the much-touted oat-sowing and experimentation; I was leapfrogging all that and actually doing something with my life! Maybe. But I unraveled several years later anyway, and went through a brief but reckless and self-destructive phase in my mid-twenties that wrought havoc upon a serious romantic relationship with a mature adult. I also at times neglected serious worldly responsibilities. Being a well-remunerated mid-twenties professional didn’t help: it only amplified the gross financial mistakes I made during that time, whereas most people in their twenties are limited in the damage they can do to their lives by modest funds. I’m still paying for some of those screw-ups. For example, few twenty-one-year-olds are equipped to properly weigh the wisdom of purchasing a swanky city condo at the top of a housing bubble, and subsequent developments suggest that I was no exception. Oh, a word of advice: pay your taxes. Some problems eventually disappear if you ignore them long enough. Taxes work the opposite way.

But in hindsight, a bigger problem is that I also missed out on the contemplative coffee dates, discussion panels, talks and deep, intelligent friendships that accompany student life in the undergraduate and post-graduate setting. While the median undergraduate student may not be exceptionally brilliant, universities do densely concentrate smart people with thoughtful values. It’s possible to find such connections in the undifferentiated chaos of the “real world”, but it’s much harder. I situated myself in a cultural frame which, while it undergirds the economy, is not especially affirming of the workings of the intellect. To this day, there is an occasionally cantankerous cultural clash between my wordy priorities and the ruthlessly utilitarian exigencies of smartphone-thumbing business. Get to the point, Alex, because business. Bullet points and “key take-aways” are the beloved kin of e-solutions, but rather estranged from philosophy and deep late-night conversations.

This facet of campus life is less about education itself than about proximity and concentration of communities of intelligent people at a similar stage of life. Because I grew up in universities, I didn’t appreciate what I had until I lost it; I traded that proximity to personal growth opportunities for getting ahead materially and economically, and my social life has been running on fumes since I left, powered largely by the remnants of that halcyon era of work and school.

If leaving the university sphere was a major blow, self-employment was perhaps the final nail. Niche self-employment in my chosen market is a largely solipsistic proposition that rewards hermitism and prolific coding, perfect for an energetic, disciplined introvert. I probably would have done better at it in my teenage years, but it didn’t suit my social nature or my changed psychological priorities as an adult. A lot of time, money and sacrifice was emitted as wasted light and heat into the coldness of space as I spun my wheels in vain trying to compensate for this problem without fully understanding it.

The essential problem is much clearer in hindsight: in leaving the university and the employment world, with its coworker lunches and water cooler talk, I had robbed myself of any coherent institutional collective, and with it, robbed myself of the implicit life script that comes with having one. I was a man without any script whatsoever. I rapidly sequestered myself away from the features of civilisation that anchor most people’s social, romantic and intellectual lives, with deleterious consequences for myself. I did not value what I had always taken for granted.

There are upsides to being a heterodox renegade, of course. Such persistent solipsism mixed with viable social skills can make one very fluid and adaptable. I took advantage of the lifestyle flexibility afforded by the “non-geographic” character of my work to travel for a few years, and found, in wearing numerous cultural hats, an unparalleled freedom few will ever experience. I had the incredible fortune to reconnect with my relatives and my grandmother on another continent. For all its many hardships, self-employment in IT has much to recommend it in the dispensation it affords to write the book of one’s life in an original way.

Be that as it may, the foundations of my inner drive, motivation and aspirations are notoriously ill-suited to the cloistered life of a free-floating hermit, yet I had taken great pains to structure such a life as quickly as possible, and to maximal effect. My reaction to this dissonance was to develop a still-greater penchant for radical and grandiose undertakings, a frequent vacillation between extremes, in an effort to compensate for the gaping holes in my life. The results were not always healthy. While there’s nothing wrong with marching to the beat of one’s own drum, I should have perhaps taken it as a warning sign that as I grew older and made more and more “idiosyncratic” life choices, the crowd of kindred spirits in my life drastically thinned out. “Original” is not necessarily “clever and original”.

In sum, I flew too close to the Sun. When I reflect upon the impact that my leaving the university has had upon my life, I mourn not professional dreams deferred, nor economic hardship wrought, but rather the ill-fated conceit that I could skip over certain stages of a young adult’s personal development. Now that the novelty has worn off and the hangover has set in, I know that it would have been profoundly beneficial to me if they had unfolded not within the fast and loose patchwork I cobbled together, but within a mise en scène that captures the actions, attitudes and values of the academy–my cultural home.


On “communication skills” and pedagogy

Here’s a pet peeve: the widespread belief that any two people, regardless of the disparity in their levels of intellectual development, are destined to fruitfully converse, as long as both exhibit “good communication skills”.

First, acknowledgment where it’s due. It is indeed an important life skill to be able to break down complex ideas and make them accessible to nonspecialists.

“If you can’t explain it simply, you don’t understand it well enough” is a remark on this subject often attributed to Einstein (though, as I gather, apocryphally). The idea is that explaining something simply, in ways anyone can understand, is the sign of true mastery of a subject, because only deep knowledge allows you to navigate adroitly up and down the levels of abstraction required to do so.

Those of us in the business world also know about the importance of connecting with diverse personalities–customers, managers, coworkers. In the startup economy, there’s a well-known art of the “elevator pitch”, wherein a nontrivial business model is packaged into ten-second soundbites that can hold a harried investor’s attention–the given being that investors have the attention span of an ADHD-afflicted chipmunk.

I would also concur with those who have observed that scholarly interests which don’t lend themselves to ready explanation–that are “too complex” for most mortals to fathom–are often the refuge of academic impostors. There are a lot of unscrupulous careerists and political operators in academia, more interested in what is politely termed “prestige” than in advancement of their discipline and of human understanding. These shysters, along with more innocent (but complicit) graduate students caught up in the pressures of the “publish or perish” economy, are the spammers of house journals, conferences and research publications, often hiding behind the shield of “well, you see, it’s really complicated”. Most legitimate scholarly endeavours can be explained quite straightforwardly, if hardly comprehensively. Complexity is an inscrutable fortress and a conversation-stopper in which people more interested in being cited and operating “schools of thought” (of which they are the headmasters, naturally) hide from accountability for scholarly merit.

All this has been polished into the more general meme that productive interaction is simply a question of “learning to communicate”. With the right effort, anyone can communicate usefully with anyone. It doesn’t matter if someone is speaking from a position of education and intelligence to someone bereft of those gifts. Any failure to achieve necessary and sufficient understanding is postulated as a failure of communication skills, perhaps even social graces (e.g. the stereotypical nerdling).

This is an extreme conclusion fraught with peril. We should tread carefully lest we impale ourselves on the hidden obstacles of our boundless cultural enthusiasm for simplification.

First, there’s a critical distinction between clarity and simplicity. It is quite possible to take an idea simple at heart and meander around it circuitously, taking a scenic journey full of extraneous details. Admittedly, technologists such as programmers can be especially bad about this; their explanations are often vacillatory, uncommitted to any particular level of abstraction or scope, and full of tangents about implementation details which fascinate them immeasurably but are fully lost on their audience. I’ve been guilty of that on more than a few occasions.

However, there is an intellectually destructive alchemy by which the virtues of clarity and succinctness become transformed into the requirement of brevity. Not all concepts are easily reducible or lend themselves to pithy sloganeering–not without considerable trade-offs in intellectual honesty. This is a point lost on marketers and political activists alike. It leads to big ideas and grandiose proclamations that trample well-considered, moderate positions, as the latter are thermodynamically outmatched by simplistic reductions. Brandolini’s Law, or the Bullshit Asymmetry Principle, states: “The amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it.” As always, sex sells–a fact of which the TEDdies have a firm grasp, with their peddling of seductive insight porn. As Evgeny Morozov said:

“Brevity may be the soul of wit, or of lingerie, but it is not the soul of analysis. The TED ideal of thought is the ideal of the ‘takeaway’—the shrinkage of thought for people too busy to think.”

Second, the idea that “communication skills” are at the heart of all matters has wormed its way into pedagogy rather disturbingly in the form of group work and so-called collaborative models of learning. As the thinking goes, the diversity of a student body is an asset; students have much to learn from each other, not just the lecturer, and encouraging them to do so prepares them for “the real world”, where they’re ostensibly going to be coworkers, police officer and arrestee, and so on.

It reminds me of an episode recounted by my favourite author William Blum in his memoir about the political upheaval of the 1960s:

“At one point I enrolled for a class in Spanish at the so-called Free University of Washington, and at the first meeting I was flabbergasted to hear the ‘teacher’ announce that he probably didn’t know much more Spanish than the students. And that’s the way it should be, he informed us–no authoritarian hierarchy. He wanted to learn from us as much as we wanted to learn from him.”

The counterculture kids were challenging incumbent hierarchies of authority. I see the same kind of anti-intellectualism recycled today into the putatively more laudable goal of social flattening.

But there’s a limit to the productive fruit of such ventures. It’s best illustrated by an anecdote from my own life.

When I was a freshman at the University of Georgia, I took an obligatory writing and composition course, as part of the infamous “core requirements” (remedial high school) that characterise the first year or two of four-year undergraduate university education in the US. One day in November, our drafts of an expository essay were due, presumably for commentary and feedback on writing mechanics by the English graduate student teaching the course.

Instead, we were paired with a random classmate and told to critique each other’s papers. My partner was an Agriculture major–a farmer’s son, he readily volunteered–who was only at the university because his father insisted that he needed a college degree before taking up his place in the family business. I would estimate his reading level to have been somewhere in the neighbourhood of fifth to eighth grade. I was going to critique his paper, and he was going to critique mine.

Candidly, his paper was largely unintelligible gibberish; it would have taken many improbable megajoules of energy input for it to rise merely to the level of “unpolished”. Were the problems strictly mechanical–paragraphs lacking topic sentences, no discernible thesis in sight, no clear evidentiary relationship between his central claims and the sentences supporting them–I would have earned my keep in a few minutes with a red pen.

The problem was much deeper: his ideas were fundamentally low-quality, benighted in a commonsensically evident kind of way. They were at once trite, obvious, and all but irrelevant to the assigned topic. The few empirical claims made ranged from startling falsehoods to profoundly unfalsifiable arrangements of New Agey words that grated on the ear of someone accustomed to the idea that the purpose of arranging words was to convey meaning. He was hurtling at light speed toward an F. What could I do, rewrite his paper for him? How would I even begin to explain what is wrong with it? There was no room to start small or to evolve toward bigger, more summative problem statements; it was a genuine can of worms: pry it open, and all the worms come out to play at once.

I don’t mean to impugn him as a human being; he just wasn’t suited to the university’s humanities wing, whose business was reputed to be the life of the mind, set in a programme of liberal education. He didn’t know how to argue or how to write–period. He was more of a hero of proletarian labour, as it were, reared in a life script ineffably different to my own, never having crossed paths with me or anyone else in the pampered, effete, bourgeois “knowledge work” setting before, and destined never to cross paths with me in any such setting again. I was utterly paralysed; there just wasn’t much I could do to help him. Plainly, I couldn’t tell him that his thoughts issued forth from a nexus of civilisation unrecognisable to me. There wasn’t much of anything to say, really. I made a few perfunctory remarks and called it a day.

His feedback on my paper, which in turn suffered from organisational and topic transition problems that continue to dog my writing today, was: “Looks good, man!” Verily, his piercing insight knew no bounds. We really learned a lot from each other that day. Along the way, I overheard bits and pieces of a rather erudite peer review by a considerably better-educated classmate. Why couldn’t she review my paper? It would have almost certainly helped. My writing wasn’t stellar, and my devoted readership–I do it all for you, much love!–knows it still isn’t.

Later, I privately enquired of the lecturer how I was supposed to condense a lifetime–however hampered by the limitations of my age and experience–of literacy, intellectual curiosity, familial and cultural academic background, semi-decent public education and informal training in polemic and rhetoric into a functional critique that would realistically benefit my beleaguered classmate and help him write a better paper. She replied: “That was the whole point; you need to work on your communication skills.”

In defiance not only of the comme il faut tenets of political correctness, but in fact–in some sense–of the national mythos of our putatively classless and democratic melting pot, I brazenly suggest something that is, I think, considered fairly obvious elsewhere: not all categories of people are destined to communicate deeply or productively.

When such discord inevitably manifests, we should not reflexively blame so-called communication skills or processes. People operate in silos that are sometimes “civilisationally incommensurable”, as it were, and sometimes there just isn’t much to communicate. This is the reality of culture, class and education, and the thinking on collaborative learning and teaching methodologies should incorporate that awareness instead of unavailingly denying it. Matching partners in group activities by sociological and educational extraction clearly presents political challenges in the modern classroom, though. Instead, I would encourage teachers to rediscover–“reimagine” is the appropriate neologism, isn’t it?–the tired, hackneyed premise of leadership by example. At the risk of a little methodological authoritarianism and a few droopy eyelids, perhaps the best way to ensure that students leave your course better than you found them is to focus on their communication with you. They’ll have the rest of their lives to figure out how to transact with each other.