More lessons from my twenties: surviving as an amateur literator in tech

My upbringing and extraction unquestionably lie in the liberal arts and humanities. I’m moderately extroverted, and always leaned hard on the side of verbal, expressive and linguistic capabilities. For the time I was in university, I was a philosophy major with vague notions of law school. My parents are philosophy professors, and most of my relatives have an academic pedigree. I also had the unique intercultural experience of coming to America at age six with no knowledge of English and subsequently learning it in an academic social setting (my parents were graduate students). I carried that formative experience, and the globally conscious, relativistic outlook on language and people that it fosters, forward with me through life.

That doesn’t mean I’m a great writer, but I can write. Great writing, though, is really hard. As with many other things, if you plot a line from “can’t write at all” to “great writer”, you’d have to plot it on a logarithmic scale. Having a broad vocabulary, a firm command of language, and adroit self-expression will get you to the table stakes of “can write”, but that last bit, on the right, is a hundred, a thousand times as hard as what precedes it.

You know great expository writing when you read it; the thoughts and ideas are scrupulously organised, yet presented in a compelling way, with varied transitions and entertaining use of language, at once colourful and precise. Come to think of it, it feels pedestrian to anatomise it this way. You know great writing when you see it.

My writing is far too disorganised and repetitive to hit those notes. I’m verbose and can write a lot quickly and easily, but quantity is not quality; organisation has always been a struggle amidst my desire to relate a lot of details. If you read this blog with any regularity, you’ve seen that battle play out.

Though I’ve got better at condensing my thoughts and communicating ideas simply with age, I’ll never write like my friend Alan. His writing is incredibly brief and terse, but his gift is succinctness per se, which is not the same as brevity, though the two are very often confused in contemporary minimalistic fashions in communication. He can say much with little where many others merely say little with little.

As far as I can tell, the real gift there is the ability to accurately foresee the details and connections that the reader’s mind can work out for itself. Then you can say only what’s necessary to anchor the conceptual tent, cloth not included, avoiding most of the potential redundancy that makes verbose text tedious. This post would be about six times shorter if Alan were writing it, yet say every bit as much–if it’s truly important.

I tried, for a time, to emulate his style growing up, but the results were farcical, much more along the lines of saying little with little. Not everyone’s intellectual output can be compact and tidy. I have to ply my version of the craft, such as it is, differently.

Anyway, I lay out all these concerns not to be pompous, but rather to say that the kind of stuff I spend a lot of time worrying about doesn’t typify the STEM personality one commonly finds in the software engineering profession, nor the pragmatic, utilitarian–and often Spartan, at least when it comes to writing–communicators in the business world. There are exceptions, of course, but as a whole, my life experience is that it’s a valid generalisation about engineer types and MBAs thumbing out curt txt spk on their Blackberries. And this is the environment in which I’ve spent almost my entire adult life, having dropped out of university to seek the exalted heights of corporate America.

BUCHAREST, ROMANIA - OCTOBER 23, 2013: Unidentified lonely, thoughtful retired man in his 60s rests on a bench in a park, autumn scene. Retirees represent 22% of Bucharest's total population.

As you can imagine, this occasionally leads to amusing and infuriating conflicts of style and culture, and in general doesn’t make for an easy professional life. It’s not easy to talk to people when you have completely different psychological priorities than they do. The curse of being somewhat better-rounded is that my mind often takes detours not travelled by fellow Professionals. To their mostly utilitarian sensibilities, idle musings and the cultivation of an inner life beyond the immediate task at hand are, above all else, a waste of time. It’s not enough to just write this e-mail; it must be a good e-mail, at once brief and useful, but also poignant, articulate, maybe even with a dash of wit, or a clever and original use of the English language. They’re thinking: get to the point, Alex, because business. There’s money on the line, or action items or something. Never mind the existential why! Business!

Being wordy didn’t make for an easy childhood, either. I don’t think I came off overtly as bookish, being mostly chatty and rarely seen with an actual book per se. All the same, I can’t remember how many times I was called “Dictionary” or “Thesaurus” in school, or otherwise suffered social opprobrium for… well, for using words like “opprobrium”.

Outside of the liberal arts wing of a university environment (which I forsook at age twenty), the rest of the world offers a pretty steady diet of hostility to aspirant wordsmiths, and, as far as I can see, more generally to broader combinations of the intellect. There’s the automatic, default hostility of idle, unemployed kids in school, and the studied hostility of busy professional grown-ups. It’s easy to get depressed shouting into a waterfall, or, more accurately, pissing into the wind. I often feel an impostor, not quite sure what I’m doing here donning the regalia of tech entrepreneurship. When almost everyone I mix with expects small talk, being the guy always keen to start some big talk is demoralising and lonely.

And yet, as I pass the thirty mark, I’ve noticed something interesting. As more and more friends, colleagues and classmates move up the career ladder or otherwise evolve higher-order life needs, they’re coming to me for help in formulating thoughts: delicate requests, polite demands, cover letters, biographies, dating profiles, admissions essays, crowdfunding campaigns, petitions. All of this and more has landed at my feet in the past year.

“You always know how to say this stuff just right.”

“I don’t know how to say this – help!”

“You can put it a lot better than I can.”

Every once in a while, I’ll even get a note from a customer: “We always appreciate your thorough explanations and your going the extra mile.”

So, good news from our own “It Gets Better Project” for fellow closeted English majors in their twenties and thirties: keep your head up. As folks who know you move up the value chain into managerial realms requiring them to flex their communication muscles for the first time, you’re going to be more in demand.

Moreover, through my own experiences in hiring and being hired in the technology sector, I’m firmly of the impression that the most valuable candidates in the long run are those who both possess raw skills and can communicate well. There’s a lot of bottom-line value in clear analysis, disentangling messy ideas, and presenting esoteric information in an accessible way to outside stakeholders. Wordy missives may always be ignored by MBA frat boys as a matter of course, but effective and engaging communicators have more influence and audience.

The point is, as you gain confidence on your professional ascent and increase your leverage, stop taking shit from philistines. Don’t shy away from selecting aggressively for employers, customers and partners who realise that better-rounded people bring more to the table and appreciate you for who you are. Much has been said about how the customer is always right, and while compromises are necessary in life, you don’t have to concede everything and always. The fibres most integral to your self-actualisation should be armoured. The rightful sense of self is not for sale.

Evaluating potential hires for “culture fit” is all the rage in human resources now. Why not turn the tables and evaluate employers for culture fit? What’s the culture like at the new gig? Never-ending arguments about last night’s Steelers vs. Cowboys game and the impact on Fantasy Football picks? Spirited discussion of the pros and cons of sundry brotein shakes? A thriving marketplace of World of Warcraft items? Hackneyed memes about bringing democracy to Syria? Whichever it is, fire ’em. Sounds like bad “culture fit”.

Finally, choose your cohorts and your spouses wisely. Your true friends will help, not hinder you in leading an examined life.


Nitpicking is the lowest form of criticism

In general, pedantic nitpicking isn’t one tenth as insightful as it might seem. In a pointed critique, it’s best to think holistically about big, central ideas and refute on that level.

It’s harder work, since it requires a close reading to truly grapple with and comprehend the idea one aims to dismantle. It’s also more risky, because there’s more “skin in the game”; big-picture theses necessarily depend on advancing certain generalisations, summations and interpretations of opposing positions, exposing the critic to accusations of having failed to properly understand what they’re assailing.

As with many gambles in nature, however, there’s a risk-reward ratio here. Fortune favours the bold. Engaging at a high level is a lot more efficient, effective and persuasive, but you’ve got to put yourself out there. That is why taking small pot shots at ideas from the sidelines is a more popular sport; guerrilla warfare from a sniper position, ensconced in the foliage, is a lot safer. It’s a common refuge of intellects too feeble or cowardly to wrestle with the main corpus.

However, not everyone whose criticism operates on technicalities is feeble or a coward. Some simply don’t understand that it’s not generally insightful. I’ve commonly encountered this in two areas.

One is in academic disciplines in the humanities, where bright-eyed, bushy-tailed graduate students, often young, uninitiated and not capable of much original thinking in their field, fall prey to great institutional pressures to “publish or perish” into the sizzling carousel of spam cite-able research work. A seductive, intellectually lazy cop-out for them is to bring a little of the quantitative “rigour” of the hard sciences to the reputedly equivocal headspaces of their fields by “troubling” broad claims with pedantic caveats. And so, conference papers and entire dissertations are built on the dubious notion that there’s something meaningful to be illuminated in demonstrating that yes, Virginia, there do exist exceptions to generalisations!

Another bastion of notionally insightful pedantry is techies, and more broadly, technocrats. The kinds of left-brained–if you’ll excuse the pop psychology–people who are stereotypically drawn to mathematics and computer science often seem to have an axe to grind with the humanities and “soft” sciences. Many programmers get into software in part because they are powerfully drawn, at an aesthetic and psychological level, to disciplines with finite, deterministic, and self-consistent systems of deductively logical rules. Such systems are not only elegant in their eyes, but appeal strongly to their sense of justice and fairness. Clear, distinct, binary, black-and-white right and wrong answers leave little to whims, tastes, customs and habits.

And so, aggrieved and slighted by the “subjectivity” of some literature, arts, history or philosophy professor in their past–fields for which they did not show exceptional aptitude–they find in pedantry a personal vehicle for restitutive vindication. If they can just show that not all truisms in sociology apply to everyone, they’ll expose the whole field for the colossal tower of bullshit that it is! There are powerful currents of vicious contempt circulating in the technocracy for anything not “user-friendly”–that is, problems which threaten to burden the thinker with consideration of relative meaning and varied interpretation. As most crystal balls into the future of private sector employment in the developed world prophesy demand for human thought processes and skills that are complementary to machine intelligence, I do worry about what this means for the already imperilled state of the liberal arts.

Anyway, there are certainly some claims where small details matter or whose foundations can be invalidated by singular exceptions. Such quality control is table stakes for the design of satellite guidance systems and aircraft engines, for the teaching of open-heart surgery, and the certification of medications for sale. We wouldn’t have it any other way.

Still, before being “that guy”–ever quick on the draw with the “actually…”–it’s wise to ask oneself in an honest and open-minded way: if I pull this block out, will the whole tower collapse? Or have I spun my wheels, emitting a lot of heat and light into the cold emptiness of space, in committing a disposable, utterly forgettable act of superficial vandalism?


Too cool for school: a retrospective on dropping out of university

The homo computatis college drop-out was a cliché whose establishment in folklore predated my departure from the University of Georgia by at least two decades. Nevertheless, I also joined the club. In the first semester of the 2006-2007 academic year, after dabbling half-heartedly in coursework for two years as a philosophy major, I threw open the gates and exiled myself into the great beyond.

Although my actions conformed to a known stereotype, I still feel I was something of an early adopter, virtually alone in my group of peers. I came to count many in my acquaintance who never pursued post-secondary education in the first place, or who floated in and out of community and technical colleges amidst working and financial struggles, but knew of vanishingly few, especially at that time, who straight-up dropped out of a four-year institution–that is, academic majors who unregistered abruptly from their courses mid-semester and skipped town with no intention of returning. That sort of thing seemed to be the province of Larry Page, Sergey Brin and Bill Gates–definitely outliers. And unlike them, I wasn’t at the helm of a world-changing startup on a clear trajectory into the multi-billion dollar stratosphere, so I couldn’t point to an overwhelming and self-evident justification.

A lot has changed since then. By all appearances, we seem to be passing through a watershed moment where existential questions about the value and purpose of college and traditional higher education in America are emerging onto the mass level among Millennials and Generation Z-ers. This discussion has been spurred on by the intervening housing crisis, a growing tower of debilitating student loan debt, tuition rises, and mounting questions about the future of employment in the developed world, especially the ways in which opportunity has become more and less democratic in the context of technological shifts and globalisation. There’s also a growing interest in open courseware and novel forms of technology-enabled correspondence learning–though, I should say, I don’t share in the TEDdies’ conflation of the Khan Academy with higher education. Still, nearly a decade has passed since I made my fateful decision to forsake the path of higher learning, so it seems like a good time to reflect on where it’s taken me and whether it was a good call.

Visiting Western Michigan University in Kalamazoo for a conference, somewhere around 4th or 5th grade.

Some aspects of the progression of events will sound familiar to many in IT. I grew up mostly around university environments, and computer programming figured dominantly among my childhood interests. It was an interest easily encouraged by proximity to lots of expensive computer equipment, good Internet connectivity and access to sympathetic graduate student mentors. I had been playing with computers and the Internet since age 8 or so, and wrote my first program at 9. I spent much of my adolescent and teenage years nurturing this hobby, having a strong interest in both software engineering and operational-infrastructural concerns. As most people in IT know, ours is a profession that offers unrivaled self-teaching opportunities, enabled by a highly Googleable open-source software ecosystem and collaborative social dynamic. That’s why so many programmers like me are self-taught.

I also had various other intellectual interests, however, and had no plans to make a tech career. In fact, for most of my life prior to age eighteen or so, I wasn’t even particularly aware that I had a marketable skill set. The desire to get into computing as a child was utterly innocent and non-ulterior, as arbitrary as some kids’ choice to take up the cello or oil painting. I entered UGA in 2004 as a political science major and shortly switched to philosophy, with vague ideas of law school in the future.

It’s also worth remarking that I came from a humanities-oriented academic family and cultural background; my parents were professors of philosophy at a top-tier Soviet university, and my father is a professor of the same at UGA. My extended family background includes a venerable dynasty of musicians, including my great-grandfather, Mikhail Tavrizian, the conductor of the Yerevan State Opera and a People’s Artist of the USSR, as well as his Russian wife Rassudana, a renowned ballerina at the Bolshoi Theatre. My late grandmother was a philologist and a member of the philosophy faculty of the Russian Academy of Sciences. When my parents emigrated to the US in 1992 (I was six years old), they redid graduate school entirely at the University of Notre Dame, which is where my primary school years were spent. My social and cultural life at that time played out in housing for married graduate students with children, where I ran around with friends of dozens of different nationalities.

All this to say, I was on a strong implicit academic trajectory as a function of my upbringing, a trajectory rooted in the humanities, not hard sciences. In fact, my parents were not especially supportive of my computing hobbies. As they saw it, I think, spending my days holed up in my room modem-ing away interfered with schoolwork and was not especially promotional of cultural development consonant with the mantle I was meant to inherit.

Nevertheless, I began working when I was eighteen (my parents did not let me work prior to that–Russian parents do not, as a rule, share the American faith in the virtues of part-time work for teenagers or students). My first job was technical support at a small Internet Service Provider in our university town of Athens, GA, at first very much part-time and reasonably complementary to college. I earned a 4.0 GPA in the first semester of my freshman year.

However, I was ambitious and precocious, decidedly more interested in work than school. Within a year, after some job-hopping (which included a stint in IT and/or warehouse labour at a Chinese importer of home and garden products–how’s that for division of labour?), I returned to the ISP at twice the pay rate and assumed the role of systems administrator. I was learning a great deal about real-world business and technology operations, getting my hands on industrial infrastructure and technologies, and rapidly assimilating practical knowledge. I had been around theoretical publications and underpaid graduate student assistants hunkered in dimly lit carrels my whole life, but I had never had to learn the basics of business or how to communicate with all kinds of everyday people on the fly. Although the cultural clash was sometimes frustrating, the novelty of learning practical skills and how to run a real-world operation was intoxicating. It occasionally led to an outsized moral superiority complex, too, as I became conscious of the fact that at age 19, I could run circles around most of the job candidates being interviewed, some of whom had master’s degrees. Clearly, I was doing something right!


Fielding customer calls at the ISP. Clearly, I’m thrilled to be doing customer support despite being a sysadmin.

From that point, my career rapidly evolved in a direction not compatible with school. Formally, I was still part-time and hourly, but it was effectively closer to a full-time gig, and I rapidly took on serious responsibilities that affected service and customers. Small as the company was, in retrospect, I was a senior member of its technical staff and an authority sought out by people ten to twenty years my senior. My commitment to school, already a decidedly secondary priority, rapidly deteriorated. I had no semblance of campus life immersion or student social experience. From my sophomore year onward, I was effectively a drop-in commuter, leaving in the middle of the day to go to a class here and a class there, then hurrying back to the office. I neither had time for studying nor made the time. My GPA reflected that. I didn’t care. School droned on tediously; meanwhile, T1 circuits were down and I was busy being somebody!

As my interests in telecom, networking, telephony and VoIP deepened, it became clear that the next logical career step for me was to move to Atlanta; Athens is a small town whose economy would not have supported such niche specialisation. Toward the end of the second semester of my sophomore year, I began looking for jobs in Atlanta. I unconsciously avoided the question of what that meant for my university education; I was simply too engrossed in work and captivated by career advancement. In the first semester of my junior year, by which point my effort at university had deteriorated to decidedly token and symbolic attendance, I finally found a job in Alpharetta (a suburb of Atlanta) at a voice applications provider. In October 2006, at the age of twenty, I announced that I was quitting university and moving to the “big city”.

My parents reacted better than I thought they would. I halfway expected them to disown me. However, in hindsight, I think they were pragmatic enough to have long realised where things were headed. It’s hard for me to say, even now, to what degree they were disappointed or proud. I don’t know if they themselves know. What was most clear at that moment was that I am who I am, and will do as I do, and there’s no stopping me.

That’s not to say that East and West didn’t collide. I remember having a conversation that went something like:

– “But what happens if you get fired in a month?”

– “Well, I suppose that’s possible, but if one performs well, it’s generally unlikely.”

– “But is there any guarantee that you won’t lose your job?”

Guarantee? That was definitely not a concept to which I was habituated in my private sector existence.

– “There are never any guarantees. But my skills are quite portable; if such a thing happened, I could find another job.”

– “It just seems very uncertain.”

– “That’s how it goes in the private sector.”

All the same, it was clear enough that, for all the problems this decision might cause me, I certainly wasn’t going to starve. Even in Athens, I was an exceptionally well-remunerated twenty-year-old. My first salary in Atlanta was twice that. Moreover, it was clear that IT was an exceptionally democratic and meritocratic space; if one had the skills, one got the job. My extensive interviews in Atlanta drove home the point that potential employers did not care about my formal higher education credentials by that point in my career development. The “education” section of my résumé had long since been deleted, replaced by a highly specific employment history and a lengthy repertoire of concrete, demonstrable skills and domain knowledge with software and hardware platforms, programming languages, and so on. The résumés I was sending out to Atlanta companies at age twenty proffered deep and nontrivial experience with IP, firewalls, routers, switches, BGP, OSPF, Cisco IOS, Perl, Bash, C, PHP, TDM, ADSL aggregation, workflow management systems, domain controllers, intrusion detection systems–what didn’t I do at that ISP? There aren’t many companies that would have let someone of my age and experience level touch production installations of all those technologies. I was bright-eyed, bushy-tailed, and soaked it all up like a sponge. And when asked for substantiation by potential employers, I sold it.

Those of you in IT know how this works: formal education is used by employers as signalling about a candidate only in the absence of information about concrete experience or skills. All other things being equal, given two green, inexperienced candidates among whom one has a university diploma and one doesn’t, employers will choose the one who finished university, as it’s a proxy for a certain minimal level of intelligence and ability to complete a non-trivial multi-year endeavour. When concrete experience and skills are present, however, the educational credentials fly out the window for most corporate engineering and operations jobs, and the more one’s career evolves, the less relevant early-stage credentials become. Moreover, there are innumerable people in mainstream IT whose university degrees were not in computer science or affiliated subject matter, but rather in a specialty like literature, history or ecology.

My next three jobs were in Atlanta, all within the space of about a year and a half. I averaged a job change every six months or so, often considerably increasing my income in the process. By the time I was barely twenty-two, I had worked for a voice applications provider, a major local CLEC and data centre operator, and an online mortgage lender.

Of course, certain jobs were off-limits. I couldn’t do research work that required a formal computer science background, nor take jobs in government or certain other large institutions that remained sticklers for credentials. I lacked the formal mathematics and electrical engineering background necessary for low-level hardware design work. It’s also quite likely that if I had tried to climb the corporate ladder into middle to upper management, I would at some point, later in life, bump into a ceiling for degree-less drop-outs. When one gets high enough, it becomes comme il faut to have an alma mater in one’s biography, even if it’s just an airy-fairy management degree from a for-profit correspondence course mill. The only way I know of to get around that is to have had a famous and inscrutable business success (i.e. an acquisition) to overshadow it. Click on “Management Team” under the “About” section of some tech companies’ web sites to get the drift. Appearances are important at that level.

I didn’t stick around long enough to figure out where exactly the limits were (although I didn’t get the impression there were many, as long as one could demonstrably do the work). In early 2008, I was abruptly fired after some political clashes. Also, they don’t take kindly to the habitual morning tardiness of “programmer’s hours” in financial services. Instead of looking for my seventh job in four years, I decided to go out on my own. I had been itching to do it for quite some time, but didn’t quite have the wherewithal to walk away from the steady paycheck. Getting fired has a way of forcing that issue.

And so, on a cold, windy January day in 2008, barely twenty-two, I left the building with my belongings in a box, with nearly zero dollars to my name, having wiped out my savings with a down payment on a downtown Atlanta condo. I had no revenue and no customers. A friend and I went to celebrate. I was determined and hit the ground running, though, and that’s how I started Evariste Systems, the VoIP consultancy turned software vendor that I continue to operate today, nearly eight years later.

Because the US does not have a serious vocational education program and because the focus of the “everyone must go to college” narrative of the last few decades is reputed success in the job market (or, more accurately, the threat of flipping burgers for the rest of one’s life), the first and most pertinent question on American students’ minds would be: do I feel that I have suffered professionally because I did not finish my degree?

I didn’t think I would then, and I still don’t think so now. Notwithstanding the above-mentioned limitations, it’s safe to say that I could qualify for almost any mainstream, mid-range IT job I wanted, provided I evolved my skill set in the requisite direction. In that way, IT differs considerably from most white-collar “knowledge work” professions, which are variously guilded (e.g. law, medicine) or have formal disciplinary requirements, whether by the nature of the field (civil engineering) or by custom and inertia (politics, banking). Although politics perverts every profession, IT is still exceptionally meritocratic; by and large, if you can do the job, you’re qualified.

The inextricable connection of modern IT to the history and cultural development of the Internet also moves me to say that it’s still the easiest and most realistic area in which one can receive a complete education through self-teaching. You can learn a lot about almost anything online these days, but the amount of resources available to the aspiring programmer and computer technologist is especially unparalleled.

An insert from a stack of mid-1990s PC Magazines discarded by my neighbour, which I coveted tenaciously.

That doesn’t mean I’d recommend skipping college generically to anyone who wants to enter the profession at roughly the same level. I put in, as a teenager, the requisite ten to twenty thousand hours thought to be necessary to achieve fundamental mastery of a highly specialised domain. However, I can’t take all the credit. I was fortunate to have spent my formative years in a university-centric environment, surrounded by expensive computers and highly educated people (and their children), some of whom became lifelong mentors and friends. Although my parents were not especially thrilled with how I spent my free time (or, more often, just time), they had nevertheless raised a child–as most intelligentsia parents do–to be highly inquisitive, open-minded, literate and expressive, with exposure to classical culture and literature. Undergoing emigration to a foreign land, culture and language at the age of six was challenging and stimulating to my developing mind, and the atmosphere in which I ended up on the other side of our transoceanic voyage was nurturing, welcoming and patient with me. The irony is not lost upon me that I essentially–if unwittingly–arbitraged the privilege associated with an academic cultural background into private sector lucre. A lot owes itself to blind luck, just being in the right place at the right time. I could probably even make a persuasive argument that I lucked out because of the particular era of computing technology in which my most aggressive uptake played out.

This unique intersection of fortuitous circumstances leads me to hesitate to say that nobody needs a computer science degree to enter the IT profession. My general sense is that a computer science curriculum would add useful, necessary formalisation and depth to the patchwork of the average self-taught techie, and this certainly holds true for me as well–my understanding of the formal side of machine science is notoriously impoverished, and stepping through the rigorous mathematics and algorithms exercises would have doubtless been beneficial, though I don’t think it would have been especially job-relevant in my particular chosen specialisation.

Still, I’m not committed to any particular verdict. I’m tempted to say to people who ask me this question: “No, you don’t need a degree to work in private industry–but only if you’re really good and somewhat precocious.” Many nerds are. Almost all of the really good programmers I know have programmed and tinkered since childhood. It comes part and parcel, somewhat like in music (as I understand it). In the same vein, I don’t know anyone who wasn’t particularly gifted in IT going in but came out that way after a CS degree.

On the other hand, for the median aspiring IT professional, I would speculate that a CS degree remains highly beneficial and perhaps even essential. For some subspecialisations within the profession, it’s strictly necessary. I do wonder, though, whether a lot of folks whose motive in pursuing a CS degree is entirely employment-related wouldn’t be better off entering industry right out of high school. They’d start off in low entry-level positions, but I would wager that after four years of real-world experience, many of them could run circles around their graduating peers, even if the latter do have a more rigorous theoretical background. If practicality and the job market are the primary concern, there are few substitutes for experience. Back at my ISP job, CS bachelors (and even those with master’s degrees) were commonly rejected; they had a diploma, but they couldn’t configure an IP address on a network interface.

Another reason I don’t have a clear answer is that things have changed since then; a decade is a geological age in IT terms. I’ve also spent twice as much time self-employed by now as I did in the employed world, and niche self-employment disconnects one from the pulse of the mass market. I know what I want in an employee, but I don’t have a finely calibrated sense of what mainstream corporate IT employers want from graduates these days. When I dropped out, Facebook had just turned the corner from TheFacebook.com; there were no smartphones as we know them, Ruby on Rails was in its infancy, and there was no Amazon EC2, no “cloud orchestration”, no Node.js, no Docker, no Heroku, no Angular, no MongoDB. The world was still wired up with TDM circuits, MPLS was viewed as next-generation, and VoIP was still for relatively early adopters. The point is, I don’t know whether the increasing specialisation at the application layer, and increasing abstraction more generally, has afforded even more economic privilege to concrete experience over broad disciplinary fundamentals, and if so, how much.

All I can firmly say on the professional side is that it seems to have worked out for me. If I were in some way hindered by the lack of a university diploma, I haven’t noticed. I’ve never been asked about it in any employment interview after my student-era “part-time” jobs. For what I wanted to do, dropping out was the right choice professionally, and I would do it again without hesitation. It’s not a point of much controversy for me.

The bigger and more equivocal issue on which I have ruminated as I near my thirtieth birthday is how dropping out has shaped my life outside of career.

I don’t mean so much the mental-spiritual benefits of a purportedly well-rounded liberal education–I don’t think I was in any danger of receiving that at UGA. 80% of my courses were taught by overworked graduate teaching assistants of phenomenally varying pedagogical acumen (a common situation in American universities, especially public ones). The median of teaching quality was not great. And so, I’m not inclined to weep for the path to an examined life cut short. It’s not foreclosed access to the minds of megatonic professorial greats that I bemoan–not for the most part, anyway.

However, moving to Atlanta as a twenty-year-old meant leaving my university town and a university-centric atmosphere. My relatively educated environs were replaced with a cross-section of the general population, and in my professional circles, particularly at that time, I had virtually no peers. My median colleague was at least ten years older, if not twenty, and outside of work, like most people living in a desolate and largely suburban moonscape, I had nobody to relate to. At the time I left, I found value in the novelty of learning to work and communicate with the general public, since I had never had to do it before. I thought our college town was quite “insular”. In retrospect, though, it would not be an exaggeration to say that I robbed myself of an essential peer group, and it’s no accident that the vast majority of my enduring friendships to this day are rooted in Athens, in the university, and in the likeminded student personalities that our small ISP there attracted.


Beautiful morning view from the balcony of my recently foreclosed condo.

As a very serious and ambitious twenty-year-old moving up the career ladder, I also took a disdainful view of the ritualised rite of passage that is the “college social experience” in American folklore. I didn’t think at the time that I was missing out on gratuitous partying, drinking, and revelatory self-discovery in the mayhem of dating and sex. If anything, I had a smug, dismissive view of the much-touted oat-sowing and experimentation; I was leapfrogging all that and actually doing something with my life! Maybe. But I unravelled several years later, anyway, and went through a brief but reckless and self-destructive phase in my mid-twenties that wrought havoc upon a serious romantic relationship with a mature adult. I also at times neglected serious worldly responsibilities. Being a well-remunerated mid-twenties professional didn’t help: it only amplified the gross financial mistakes I made during that time, whereas most people in their twenties are limited in the damage they can do to their lives by modest funds. I’m still paying for some of those screw-ups. For example, few twenty-one-year-olds are equipped to properly weigh the wisdom of purchasing a swanky city condo at the top of a housing bubble, and subsequent developments suggest that I was not an exception. Oh, a word of advice: pay your taxes. Some problems eventually disappear if you ignore them long enough. Taxes work the opposite way.

But in hindsight, a bigger problem is that I also missed out on the contemplative coffee dates, discussion panels, talks and deep, intelligent friendships that accompany student life in the undergraduate and post-graduate setting. While the median undergraduate student may not be exceptionally brilliant, universities do concentrate smart people with thoughtful values densely. It’s possible to find such connections in the undifferentiated chaos of the “real world”, but it’s much harder. I situated myself in a cultural frame which, while it undergirds the economy, is not especially affirmative of the combinations of the intellect. To this day, there is an occasionally cantankerous cultural clash between my wordy priorities and the ruthlessly utilitarian exigencies of smartphone-thumbing business. Get to the point, Alex, because business. Bullet points and “key take-aways” are the beloved kin of e-solutions, but rather estranged from philosophy and deep late-night conversations.

This facet of campus life is less about education itself than about proximity and concentration of communities of intelligent people at a similar stage of life. Because I grew up in universities, I didn’t appreciate what I had until I lost it; I traded that proximity to personal growth opportunities for getting ahead materially and economically, and my social life has been running on fumes since I left, powered largely by the remnants of that halcyon era of work and school.

If leaving the university sphere was a major blow, self-employment was perhaps the final nail. Niche self-employment in my chosen market is a largely solipsistic proposition that rewards hermitism and prolific coding, perfect for an energetic, disciplined introvert. I probably would have done better at it in my teenage years, but it didn’t suit my social nature or changed psychological priorities as an adult. A lot of time, money and sacrifice was emitted as wasted light and heat into the coldness of space as I spun my wheels in vain trying to compensate for this problem without fully understanding it.

The essential problem is much clearer in hindsight: in leaving the university and the employment world, with its coworker lunches and water cooler talk, I had robbed myself of any coherent institutional collective, and with it, robbed myself of the implicit life script that comes with having one. I was a man without any script whatsoever. I rapidly sequestered myself away from the features of civilisation that anchor most people’s social, romantic and intellectual lives, with deleterious consequences for myself. I did not value what I had always taken for granted.

There are upsides to being a heterodox renegade, of course. Such persistent solipsism mixed with viable social skills can make one very fluid and adaptable. I took advantage of the lifestyle flexibility afforded by the “non-geographic” character of my work to travel for a few years, and found, in wearing numerous cultural hats, an unparalleled freedom that few will ever experience. I had the incredible fortune to reconnect with my relatives and my grandmother on another continent. For all its many hardships, self-employment in IT has much to recommend it in the dispensation it affords to write the book of one’s life in an original way.

Be that as it may, the foundations of my inner drive, motivation and aspirations are notoriously ill-suited to the cloistered life of a free-floating hermit, yet I had taken great pains to structure such a life as quickly as possible, and to maximal effect. My reaction to this dissonance was to develop a still-greater penchant for radical and grandiose undertakings, a frequent vacillation between extremes, in an effort to compensate for the gaping holes in my life. The results were not always healthy. While there’s nothing wrong with marching to the beat of one’s own drum, I should have perhaps taken it as a warning sign that as I grew older and made more and more “idiosyncratic” life choices, the crowd of kindred spirits in my life drastically thinned out. “Original” is not necessarily “clever and original”.

In sum, I flew too close to the Sun. When I reflect upon the impact that my leaving the university has had upon my life, I mourn not professional dreams deferred, nor economic hardship wrought, but rather the ill-fated conceit that I could skip over certain stages of a young adult’s personal development. Now that the novelty has worn off and the hangover has set in, I know that it would have been profoundly beneficial to me if they had unfolded not within the fast and loose patchwork I cobbled together, but within a mise en scène that captures the actions, attitudes and values of the academy–my cultural home.


On “communication skills” and pedagogy

Here’s a pet peeve: the widespread belief that any two people, regardless of the disparity in their levels of intellectual development, are destined to fruitfully converse, as long as both exhibit “good communication skills”.

First, acknowledgment where it’s due. It is indeed an important life skill to be able to break down complex ideas and make them accessible to nonspecialists.

“If you can’t explain it simply, you don’t understand it well enough” is a remark on this subject often attributed to Einstein (though, as I gather, apocryphally). The idea is that explaining something simply, in ways anyone can understand, is the sign of true mastery of a subject, because only deep knowledge allows you to navigate adroitly up and down the levels of abstraction required to do so.

Those of us in the business world also know about the importance of connecting with diverse personalities–customers, managers, coworkers. In the startup economy, there’s a well-known art of the “elevator pitch”, wherein a nontrivial business model can be packaged into ten-second soundbites that can hold a harried investor’s attention–the given being that investors have the attention spans of an ADHD-afflicted chipmunk.

I would also concur with those who have observed that scholarly interests which don’t lend themselves to ready explanation–that are “too complex” for most mortals to fathom–are often the refuge of academic impostors. There are a lot of unscrupulous careerists and political operators in academia, more interested in what is politely termed “prestige” than in advancement of their discipline and of human understanding. These shysters, along with more innocent (but complicit) graduate students caught up in the pressures of the “publish or perish” economy, are the spammers of house journals, conferences and research publications, often hiding behind the shield of “well, you see, it’s really complicated”. Most legitimate scholarly endeavours can be explained quite straightforwardly, if hardly comprehensively. Complexity is an inscrutable fortress and a conversation-stopper in which people more interested in being cited and operating “schools of thought” (of which they are the headmasters, naturally) hide from accountability for scholarly merit.

All this has been polished into the more general meme that productive interaction is simply a question of “learning to communicate”. With the right effort, anyone can communicate usefully with anyone. It doesn’t matter if someone is speaking from a position of education and intelligence to someone bereft of those gifts. Any failure to achieve necessary and sufficient understanding is postulated as a failure of communication skills, perhaps even social graces (e.g. the stereotypical nerdling).

This is an extreme conclusion fraught with peril. We should tread carefully lest we impale ourselves on the hidden obstacles of our boundless cultural enthusiasm for simplification.

First, there’s a critical distinction between clarity and simplicity. It is quite possible to take an idea simple at heart and meander around it circuitously, taking a scenic journey full of extraneous details. Admittedly, technologists such as programmers can be especially bad about this; their explanations are often vacillatory, uncommitted to any particular level of abstraction or scope, and full of tangents about implementation details which fascinate them immeasurably but are fully lost on their audience. I’ve been guilty of that on more than a few occasions.

However, there is an intellectually destructive alchemy by which the virtues of clarity and succinctness become transformed into the requirement of brevity. Not all concepts are easily reducible or lend themselves to pithy sloganeering–not without considerable trade-offs in intellectual honesty. This is a point lost on marketers and political activists alike. It leads to big ideas and grandiose proclamations that trample well-considered, moderate positions, as the latter are thermodynamically outmatched by simplistic reductions. Brandolini’s Law, or the Bullshit Asymmetry Principle, states: “The amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it.” As always, sex sells–a fact of which the TEDdies have a firm grasp, with their peddling of seductive insight porn. As Evgeny Morozov said:

Brevity may be the soul of wit, or of lingerie, but it is not the soul of analysis. The TED ideal of thought is the ideal of the “takeaway”—the shrinkage of thought for people too busy to think.

Second, the idea that “communication skills” are at the heart of all matters has wormed its way into pedagogy rather disturbingly in the form of group work and so-called collaborative models of learning. As the thinking goes, the diversity of a student body is an asset; students have much to learn from each other, not just the lecturer, and encouraging them to do so prepares them for “the real world”, where they’re ostensibly going to be coworkers, police officer and arrestee, and so on.

It reminds me of an episode recounted by my favourite author William Blum in his memoir about the political upheaval of the 1960s:

At one point I enrolled for a class in Spanish at the so-called Free University of Washington, and at the first meeting I was flabbergasted to hear the “teacher” announce that he probably didn’t know much more Spanish than the students. And that’s the way it should be, he informed us–no authoritarian hierarchy. He wanted to learn from us as much as we wanted to learn from him.

The counterculture kids were challenging incumbent hierarchies of authority. I see the same kind of anti-intellectualism recycled today into the putatively more laudable goal of social flattening.

But there’s a limit to the productive fruit of such ventures. It’s best illustrated by an anecdote from my own life.

When I was a freshman at the University of Georgia, I took an obligatory writing and composition course, as part of the infamous “core requirements” (remedial high school) that characterise the first year or two of four-year undergraduate university education in the US. One day in November, our drafts of an expository essay were due, presumably for commentary and feedback on writing mechanics by the English graduate student teaching the course.

Instead, we were paired with a random classmate and told to critique each other’s papers. My partner was an Agriculture major–a farmer’s son, he readily volunteered–who was only at the university because his father insisted that he needed a college degree before taking up his place in the family business. I would estimate his reading level to have been somewhere in the neighbourhood of fifth to eighth grade. I was going to critique his paper, and he was going to critique mine.

Candidly, his paper was largely unintelligible gibberish; it would have taken many improbable megajoules of energy input for it to rise merely to the level of “unpolished”. Were the problems strictly mechanical–paragraphs lacking topic sentences, no discernible thesis in sight, no clear evidentiary relationship between his central claims and the sentences supporting them–I would have earned my keep in a few minutes with a red pen.

The problem was much deeper: his ideas were fundamentally low-quality, benighted in a commonsensically evident kind of way. They were at once trite, obvious, and all but irrelevant to the assigned topic. The few empirical claims made ranged from startling falsehoods to profoundly unfalsifiable arrangements of New Agey words that grated on the ear of someone accustomed to the idea that the purpose of arranging words was to convey meaning. He was hurtling at light speed toward an F. What could I do, rewrite his paper for him? How would I even begin to explain what is wrong with it? There was no room to start small or to evolve toward bigger, more summative problem statements; it was a genuine can of worms: pry it open, and all the worms come out to play at once.

I don’t mean to impugn him as a human being; he just wasn’t suited to the university’s humanities wing, whose business was reputed to be the life of the mind, set in a programme of liberal education. He didn’t know how to argue or how to write — period. He was more of a hero of proletarian labour, as it were, reared in a life script ineffably different to my own, never having crossed paths with me or anyone else in the pampered, effete, bourgeois “knowledge work” setting before, and destined to never cross paths with me in any such setting again. I was utterly paralysed; there just wasn’t much I could do to help him. Plainly, I couldn’t tell him that his thoughts issue forth from a nexus of civilisation unrecognisable to me. There wasn’t much of anything to say, really. I made a few perfunctory remarks and called it a day.

His feedback on my paper, which in turn suffered from organisational and topic transition problems that continue to dog my writing today, was: “Looks good, man!” Verily, his piercing insight knew no bounds. We really learned a lot from each other that day. Along the way, I overheard bits and pieces of a rather erudite peer review by a considerably better-educated classmate. Why couldn’t she review my paper? It would have almost certainly helped. My writing wasn’t stellar, and my devoted readership–I do it all for you, much love!–knows it still isn’t.

Later, I privately enquired of the lecturer how I was supposed to condense a lifetime–however hampered by the limitations of my age and experience–of literacy, intellectual curiosity, familial and cultural academic background, semi-decent public education and informal training in polemic and rhetoric into a functional critique that would realistically benefit my beleaguered cohort and help him write a better paper. She replied: “That was the whole point; you need to work on your communication skills.”

In defiance not only of the comme il faut tenets of political correctness, but in fact–in some sense–of the national mythos of our putatively classless and democratic melting pot, I brazenly suggest something that is, I think, considered fairly obvious elsewhere: not all categories of people are destined to communicate deeply or productively.

When such discord inevitably manifests, we should not reflexively blame so-called communication skills or processes. People operate in silos that are sometimes “civilisationally incommensurable”, as it were, and sometimes there just isn’t much to communicate. This is the reality of culture, class and education, and the thinking on collaborative learning and teaching methodologies should incorporate that awareness instead of unavailingly denying it. Matching partners in group activities by sociological and educational extraction clearly presents political challenges in the modern classroom, though. Instead, I would encourage teachers to rediscover–“reimagine” is the appropriate neologism, isn’t it?–the tired, hackneyed premise of leadership by example. At the risk of a little methodological authoritarianism and a few droopy eyelids, perhaps the best way to ensure that students leave your course better than you found them is to focus on their communication with you. They’ll have the rest of their lives to figure out how to transact with each other.


Predictions for 2015

It is challenging to make prognostications that are specific and testable, or, in the thinking of Karl Popper’s brand of empiricism, falsifiable. Many predictions are formulated such that they could be argued, post facto, to be true regardless of what actually happens.

I’ll try to avoid making those. At the same time, prognoses about complex phenomena like global economics and war are often, by their nature, vague and open to interpretation, and they consist of many parts. They are replete with statements about things that aren’t necessarily discrete or quantifiable. At best, they can be viewed as compositions, as systems of many interdependent propositions and variables which must be accepted or dismissed holistically. This is the insight of Thomas Kuhn, who found the Popperian view of how science evolves, through the testing of individual hypotheses, to be naive. He popularised the use of the word “paradigm”, a superset of a hypothesis. It may be that the apparent truth or falsehood of my predictions will depend on whether you buy into certain paradigms.

Without further ado:

1. The price of crude oil will rebound to US $80/bbl or more.

It’s not in the interest of any oil-producing nation to perpetuate the current freefall, which has led, at the time of this writing, to oil trading in the $50-$60/bbl range.

I’m not an industry analyst, but my impression is that the current slide is caused by a combination of:

  • Reduced short-term global demand.
  • Saudi Arabia breaking ranks with its OPEC cohorts and refusing to restrict output.
  • Optimism about the increasing production of the American domestic energy sector (leading to expectations of higher supplies).

I don’t think Saudi Arabia going rogue is going to last. Saudi Arabia can weather low prices better than many rival oil autocracies, but ultimately, these states have common interests. Someone will probably make a deal with the Kingdom to encourage them to return to cartel discipline. The Russian economy suffers particularly heavily from a fall in oil, as its state budget depends almost entirely on high oil price targets, and it exports almost nothing apart from energy and raw materials. It’s possible that Russia, despite not being an OPEC member state, will offer some carrots to get the Saudis to cut output.

As for the optimism about American domestic energy supplies, there’s definitely an underlying reality: undeniably, output has increased profoundly in the last few years, to the extent that it may be one of the most significant structural changes of the early 21st century. But I would wager that some of the corresponding market movements can be attributed to irrational exuberance and speculative trading, too; once the sizzling party cools down and the fast rave music gives way to slower ballads, cracks will emerge, in the form of concerns about the sustainability of yields (at certain EROEIs–energy returned on energy invested) and the environmental impact of this boom. I don’t believe that short-term advances in hydraulic fracturing of shale change the big picture, which is that the EROEI of global hydrocarbon energy supplies is falling.

I hope that the current trend of increasing energy capture efficiency (and therefore EROEI) in photovoltaic cells and wind turbines, as well as ongoing research into hydrogen fuel cells, thorium reactors, and other alternative sources, will continue undistracted by short-term fossil fuel supply bumps. I’m not holding my breath for capital markets to get smart enough to emphasise long-term prospects over short-term speculative opportunities, though. I’ve always thought that energy and environmental issues are key examples of market failure, alongside healthcare.

2. Russia will continue to be mildly depressed.

Most of Russia’s present economic malaise and inflation (the rouble has lost roughly half its value) probably owe more to the crash in crude oil prices than to any effect of Western sanctions, in the same way that Russia’s stabilisation in the 2000s owed more to the rise in crude oil prices than to Putin’s much-touted economic policies.

So, I think any recovery is likely to be linked to the price of oil. As I expect the gains in crude to be modest (to ~$80/bbl or so, rather than $100+), I don’t expect Russia to see a dramatic reversal of its current fortunes.

3. The Ukraine conflict will stagnate, unresolved.

There is neither the political commitment nor the opportunity for the West or Russia, as outside influences, to drive the Ukraine conflict to any endgame. The only actor with an incentive to take decisive action and definitively consolidate Ukraine is the Kiev government, but it has neither the military capability nor the finances to do so.

What I expect is that the current status quo will coalesce into a kind of uneasy détente, not wholly unlike the aftermath of the Georgian-Russian conflict of 2008, in which South Ossetia and Abkhazia became nominally independent and de facto Russian-controlled–and, at the very least, ungovernable for Georgia. This kind of situation has a tendency to ossify over time, unless some dramatic turn of events suddenly reanimates active conflict.

That is to say, the Russian-adhering Eastern Ukraine will continue to be a semi-ungovernable patchwork for the central government. Professional ‘rebels’ from the likes of the Donetsk People’s Republic will continue to carve out careers for themselves, and be variously, depending on what’s going on, nudged by half-hearted Ukrainian army offensives, officially disavowed by Russia, or unofficially utilised by Russia as a tool to coerce Ukraine and the West (with the implicit threat of whipping up pro-Russian nationalist agitation among such groups). More generally, Eastern Ukraine will continue to run nominally as part of Ukrainian territory, but with its fronds tending increasingly toward the Russian sun in terms of economic and political linkage.

This situation will, over time, reach some sort of equilibrium in which nobody really “wins”, while everybody claims to have won and trades accusations of banditry. The world will forget about Ukraine and move on; Russian economic relations with the West will slowly and subtly renormalise, particularly with Europe, though the remnants of sanctions will continue to be used by the US to pressure Russia, while the instability of the East–with the implied influence Russia has over it–will be used by Russia as leverage against the West. Both the US and Russia win in having an outside enemy to point to, particularly the Putin regime, which will hold up US sanctions as an easy scapegoat for ongoing internal malaise it cannot fix.

4. US-Iran relations will improve and the US will ease sanctions against Iran.

While some recent improvements in US-Iran relations can probably be attributed to the ascendance of more moderate post-Ahmadinejad forces, the more significant incentive for collaboration involves the common enemy of ISIS. It is likely that there will be some quiet, underplayed horse-trading and compromise regarding Iran’s nuclear programme in the service of this awkward alliance.

5. The ISIS-driven partition of Iraq will solidify.

With increasing US commitment to airstrikes against ISIS, which is also perceived as a regional threat by nearly all incumbent regimes, it seems likely that ISIS military gains will be arrested. On the other hand, another US ground war to decisively drive ISIS out of the territories they currently occupy does not seem politically possible.

This will probably lead to a hardening of existing boundaries between ISIS-controlled and non-ISIS-controlled Iraq. Small flare-ups will occur between ISIS and local forces with more enthusiasm for driving them out decisively than the US has, such as the battle-hardened Peshmerga of Iraqi Kurdistan, though the latter will not succeed, for lack of supplies and firepower. Other locals will probably come to an accommodation with ISIS, though neither side will publicise this, because doing so is a losing proposition PR-wise (failure to defeat ISIS on one side, failure to comprehensively consolidate an Islamic caliphate on the other).

6. The Syrian Civil War will continue without resolution or settlement.

This war will continue and inflict even more destruction upon an already devastated and war-weary country.

In all likelihood, however, it will increasingly be simplified to a pro- vs. anti-ISIS conflict, especially since the American view is that any enemy of hard-line Islamists is its friend. Depending on the overall military successes of ISIS in Iraq and Syria, this may even lead to a delicate, understated thawing or rapprochement with the Assad regime, though neither side will publicise it, because it’s a lose-lose PR proposition.

7. Chinese (PRC) growth will continue to plateau.

As China’s industrialisation matures without a clear next step beyond its specialisation in manufacturing, its GDP growth will continue to plateau. Combined with a possible pop in its overheated urban real estate market, this may lead to a mild recession.

8. China will become an increasingly attractive IT outsourcing destination relative to India.

As a result of its growth and increasing (though very unequally distributed) affluence, India has become too expensive for many Western firms’ tastes, and they will increasingly look to China to fill the gap, particularly in low-skill business process outsourcing and back-office functions. But this in itself is unlikely to be China’s ticket out of a looming existential crisis of macroeconomic purpose.

9. The West African Ebola epidemic will peak and fade from public view.

The WHO currently projects the West African Ebola epidemic to peak in April or May of 2015. After that, it will probably fade from public view entirely, punctuated by the occasional incident of an infected individual making it across Western borders.

10. Pope Francis will face political challenges from conservative hardliners.

It is unlikely that the conservatives within the Vatican’s inner circles, and conservative bishops elsewhere, will allow Pope Francis’ spate of rapid and theologically radical liberal reforms to continue indefinitely. Such conflicts have already flared up, and thus far, the Pontiff has prevailed.

Canon law has no procedure to impeach or recall a pope, so I don’t think his job is in danger. His influence over certain conservative constituencies and their local leaders will probably be eroded by increasingly conspicuous acts of defiance, though.

11. The global appreciation of the US Dollar will end.

The increased dollar demand is most likely due to:

  • The Federal Reserve nearing the end of its most recent round of Quantitative Easing (QE), which is expected to rein in the money supply.
  • The strong performance of the S&P 500 and the broader US stock market.
  • Exuberance about American energy supplies and perceived domestic recovery in jobs and real estate.
  • Increased demand for hard reserve currency in places with high inflation, such as Russia and many of the former Soviet republics (whose currencies’ fate is, in many instances, closely linked to the rouble).

I think this exuberant climate will cool by the end of Q2 2015, and that inflation in the former USSR will stabilise in tandem with rebounding oil prices. The stock market is also widely thought to be due for a correction. Together, these factors should send the dollar back down toward historical norms.


Journalism, epistemology and conspiracy theories

I have an acquaintance who has tried numerous times to persuade me that Osama bin Laden was not killed in a commando raid in Abbottabad in 2011, but actually died of natural causes back in 2001. Just look at the photos of bin Laden in the press from the last ten years on WhatReallyHappened.com! They are “obviously” doctored!

We all know that one guy, or several. They’re devotees of their favourite “what the government doesn’t want you to know” web site. When you tell them that such claims, while interesting, suffer from a dire lack of peer review and corroboration from credible sources, that just amps them up even more: “I don’t need the megacorporate disinformation machine to know what’s true. I think independently. Free your mind! Look at the photos and judge for yourself; could the World Trade Center towers really have collapsed due to fire?”

There’s no arguing with conspiracy theorists. The conversation is complicated by the fact that established media certainly have, in their history, disseminated their share of outright lies, or, far more often, omissions and skewed narratives. Mainstream media often disseminate Official Truth in any country and political system, and, in more democratic societies, this is sometimes done under the guise of hard-hitting investigative journalism and putative independence. Yet, anyone who’s grown beyond the infantile thumb-sucking stage of critical thinking understands that the idea of media as a professionally neutral conveyor of consistently objective truth is, at a minimum, complicated and caveat-ridden, and perhaps even slightly risible. Furthermore, amidst the noise on WhatReallyHappened.com and tenc.net there are, in bits and pieces, some kernels of truth.

Such a dispute cannot be untangled by arguing about “the facts”, as people are wont to do with conspiracy theorists. That is a trap. What does the idea of “the facts” even mean? To my mind, this issue goes back to the very nature of our knowledge about the external world, and is, at heart, a philosophical one.

In a sense, it’s a fundamentally defeatist stance, because if we’re going to interrogate the pillars of our construction of “the facts”, we have to be very honest about the limits of what we can justifiably claim to know. It’s the only reasonable thing to do. However, our loony friends aren’t going to reciprocate our concessions; you may not be completely sure about anything (because you cannot reasonably be, as I’ll discuss below), but the guy writing at TheNewReichstagFire.com/the-big-lie-always-works-time-after-time/ about how 757s don’t just smash into the Pentagon like that? He’s sure. Very sure. Your vague, slightly noncommittal scepticism is no match for his positive assertions. And yet, there are reasons why it’s more justifiable to believe The New York Times account than his.

For those who slept through their Philosophy 101 course (or an introductory course on scientific methodology), let’s revisit a well-beaten horse. There are two types of reasoning recognised in logical argumentation: deductive and inductive.

Deductive reasoning is the kind used to reach conclusions that follow with certainty from their premises, according to ironclad rules of logic. It operates in self-contained systems of well-defined rules, such as mathematics. The kinds of proofs you used to do in geometry class are deductive in nature, because the conclusion follows necessarily from the premises.

The simplest illustration of a deductive claim is a syllogism such as:

  1. All men are yellow.
  2. Alex is a man.
  3. Therefore, Alex is yellow.

Alex’s yellowness follows from the premises (the first two points). It cannot be otherwise. Given the premises that he’s a man and that all men are yellow, he must be yellow.

Note that we haven’t considered the question of whether all men are, in fact, yellow. That is not essential for this claim to be deductively valid. Maybe not all men are yellow. Maybe Alex is not a man (though I’m pretty sure I am). The claim is simply that given that all men are yellow and that Alex is a man, he must be yellow.
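For the notationally inclined, the same syllogism can be rendered as a one-line inference in first-order logic (my own rendering; “Man”, “Yellow” and “alex” are just symbols standing in for the terms of the premises):

\[
\forall x\,\bigl(\mathrm{Man}(x) \rightarrow \mathrm{Yellow}(x)\bigr),\;\; \mathrm{Man}(\mathrm{alex}) \;\vdash\; \mathrm{Yellow}(\mathrm{alex})
\]

The turnstile (⊢) asserts only that the conclusion is derivable from the premises; it says nothing whatsoever about whether the premises themselves are true.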

Deductive logic has relatively narrow applications in the grand universe of human endeavours. It’s usable in bounded systems that meet the Closed-World Assumption, most notably in mathematics, but also, in a more applied dimension, in finite deterministic systems such as digital electronics.

Inductive reasoning is the other kind. Inductive claims are probabilistic in nature: a strong inductive claim is a “good bet”, not a guaranteed or certain conclusion. An example of an inductive claim would be:

  1. The Sun has risen every day.
  2. Tomorrow is a new day.
  3. Therefore, the Sun will rise tomorrow.

There’s no guarantee the Sun will rise tomorrow merely on the basis that it has done so in the past. It could explode overnight. You never know. But it’s a pretty good bet that it will rise tomorrow; it has a very consistent track record of doing so. Of course, fleshing out this claim realistically requires quite a few more premises about what we know to be true of the Sun, quite apart from the fact that it rises. But even if we add all the scientific knowledge in the world, there’s still no logical guarantee that the Sun will, in fact, rise tomorrow. Maybe it won’t. But it probably will. And that’s the essence of inductive reasoning.
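If you want a number to attach to “a pretty good bet”, one classical (and admittedly toy) formalisation is Laplace’s rule of succession, which Laplace himself applied to the sunrise: under a uniform prior, having observed the Sun rise on n consecutive days, the probability that it rises again tomorrow works out to

\[
P(\text{sunrise tomorrow} \mid n \text{ consecutive sunrises}) = \frac{n+1}{n+2},
\]

which creeps toward certainty as n grows but never actually reaches it. That gap between “overwhelmingly likely” and “guaranteed” is the whole point; the formula is my illustration, not something the argument depends on.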

Pretty much all human knowledge is based on inductive claims. Pretty much anything humans say to each other, ever, rests on a giant cascade of inductive claims. There’s not much that we know about the outside world that could be said to be deductively true; deductive claims live within artificial, self-contained systems. That’s why it’s a bit silly when a person arguing with another about who really shot John F. Kennedy says: “Your logic is terrible!” The other party’s logic is probably not wrong–at least, not in the sense of deductive logic. Most intelligent people are pretty good at basic deductive logic. The disagreement is generally about the truth or falsehood of the premises, or about their evidentiary relationship to the conclusion being advanced. Lee Harvey Oswald may have been on the sixth floor of the Texas School Book Depository with a rifle, but how can you be sure he shot Kennedy?

Even direct human sensory experience can be called into some doubt: just because you’ve seen it with your own eyes doesn’t mean it’s true. The very idea of perceived physical reality as being “true” has been the subject of ongoing philosophical contention. It relies on a lot of metaphysical assumptions, such as that causes resemble their effects (just because you feel like you’re sitting in this chair and it feels very “chair-like” doesn’t mean the underlying object–if it even exists–is essentially “chair-like”). 

Worse yet, inductive logic is not deductively valid. The Scottish philosopher David Hume famously pointed out that the justification for inductive logic uses circular reasoning, because the only real justification for inductive reasoning is that inductive reasoning has worked in the past–itself an inductive claim. “Induction, because induction” isn’t very persuasive, and it certainly isn’t what most people would intuitively call “logical”.

But, this wouldn’t be a very interesting post if I ended it on a note of “so, you really don’t know anything at all, period”. Let’s skip over the metaphysical issues and assume, among other things, that our direct sensory experience of reality is mostly accurate, in the sense that the underlying reality meaningfully resembles our apprehension of it through our five senses.

So, back to journalism. When you open The New York Times, do you really know that anything you see is true at all? Have you ever even been physically present, in a position to directly observe any occurrence that the Times has reported on? I’m not sure that I have. Most people haven’t either. And even if you have, it’s probably something that happens a few times in a lifetime, unless you’re the President.

The metro section says there was a fire on 125th St. It’s got photos of fire engines and a menacing blaze. Were you there? Did you see it? Probably not. Maybe it’s all fake. Maybe the whole story is a fake. Maybe there was no fire.

Maybe. But it’s probably true. It’s true because you can’t think of a compelling reason for the Times to make something like that up, a matter of fact seemingly without any ideological valence or political significance. It’s true because other articles on similar events in the past have been supported by the testimony of people you know. It’s true because the same, or a substantially similar, report is also carried by other local newspapers, television stations, and online sources, and an elaborate conspiracy among them all to convincingly report a fake fire is unparsimonious. It’s true because the Times would not be considered a credible news source if it just made stuff up all the time.

And yet, all of those are inductive claims. The last premise is particularly ludicrous, as it is circular: the Times is credible because it’s credible. Certainly, it could be that all of these different sources of information that you’re cross-referencing and integrating elaborately conspired to report on a fire that never happened, motivated by reasons unknown to us. It doesn’t pass Occam’s Razor, but it certainly wouldn’t be the first time they’ve reported something that was subsequently shown to be completely false; Gulf of Tonkin incident, anybody? And yet, when I say “shown”, was it shown to you? No, it was “shown” by other sources that you believe to be credible, because you’ve experienced them as having credibility in the past.

Well, that’s very inductive of you. And none of it is deductively valid: you can’t prove any of it. Your justifications are just other inductive claims, things you reasonably believe to have a high probability of being true but are in no position to directly verify. Indeed, it’s safe to say that ninety-nine percent of human knowledge–which we utilise with great conviction every day–is that way: for all our cosmopolitanism, there’s not a whole lot within the range of an individual human’s direct observation and experience. You’ve probably never experimentally verified whether hypothermia truly sets in at or below 35°C. Have you ever personally tested whether antibiotics really work? You’ve probably taken them, but what if it was just your awesome immune system? Want to give yourself and your friends a menacing infection and really explore it with some rigour? Be sure there’s a control; someone’s got to get the placebos!

Most of the time people argue about matters of fact in the outside world, they are arguing about things they have never seen or touched; they just read about them somewhere. I confidently contend that you can safely land a Cirrus SR22 without flaps (given a longer runway), because a pilot told me so, citing the Cirrus operator’s manual. But hell if I’ve ever tried. I don’t even know how to deploy the flaps on a Cirrus. I don’t even know how to get it up in the air. Yet I’m perfectly comfortable making this claim to you, and I’ll bet you $1000 it’s true.

Have you ever been to Niue? What if Niue doesn’t really exist? But surely it does! And yet, all the reasons you have for supposing so are based, foundationally, on other inductive claims, few or none of which you’ve directly verified. You simply suppose that atlases and maps generally reflect geographic fact, and that if there’s an entry in Wikipedia, it must be true. How gullible of you! (Incidentally, there have been plenty of fake entries on Wikipedia. What if the entry for the planet Jupiter is one of them? Is Jupiter a real thing?)

Even the everyday work of researchers and specialists in scientific fields derives its meaning from inductive assumptions that are widely held in their disciplines. These assumptions are sometimes invalidated and, throughout history, have undergone massive upheaval and revolution. A shift in high-level assumptions can invalidate decades or centuries of scientific labour. (There is an entire field, Philosophy of Science, that seeks to capture and describe how all of this actually happens at a theoretical level, as well as to make occasional prescriptions about how it ought to work.)

I think we’ve satisfactorily illustrated that nearly all knowledge about the external world rests on a complex, interdependent house of cards, where the cards are inductive claims. Where that leaves us with regard to journalistic credibility is this:

It is reasonable to say that The New York Times occupies a higher place in an inductive “hierarchy of believability” than does WhatReallyHappened.com or something Alex Jones said. It is higher in that hierarchy because, on average, the things written in it correlate much more extensively and profoundly with other sources of (inductive) knowledge that we draw upon elsewhere. We believe that the Times is a more professional journalistic enterprise that goes to greater lengths to check its facts, review its sources and report the truth, because of other things that we believe to be true about journalistic organisations that do and don’t do that (the latter are seldom cited as a source of truth across a wide spectrum of human endeavours).
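To see why corroboration is doing so much of the work here, consider a crude back-of-the-envelope model (mine, not a claim about how newsrooms actually behave): suppose each of k genuinely independent outlets would fabricate or badly botch a mundane story like a house fire with some small probability ε. The probability that every one of them gets it wrong at once is then

\[
P(\text{all } k \text{ sources wrong}) = \varepsilon^{k},
\]

which collapses rapidly as k grows, provided the sources really are independent, which is precisely the assumption a committed conspiracy theorist denies.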

None of this guarantees that any given thing written in the Times is true. I personally cannot say that the World Trade Center towers definitively did not collapse due to a controlled demolition. But, Mr. What-They-Don’t-Want-You-To-Know can’t say that they did so, either, and that’s what he’s missing. Like me, he wasn’t there, and he’s no more of a structural engineer than I am. What’s more, my secondhand source is better than his.

Provable? Absolutely not. It’s just a better bet.