In Response to the Cult of Remote Working

Remote working and working from home have become hallowed totems of the progressive side of IT business in recent years. Advocacy for the benefits of working from home and collaborative technologies that bring distributed teams together is widespread on weather vane forums of IT culture like Hacker News. Even in the more formal business press, there’s been a steady drumbeat of analysis taking an optimistic view of the possible benefits.

I’ll be the first to admit that I have been a beneficiary. I’ve been self-employed since I was twenty-two, and have spent a good chunk of my adult life working remotely, in coffee shops and coworking spaces, as well as living overseas and working from wherever I happened to be.

Furthermore, the remote work thesis has dragged onto the stage with it several related insights that unquestionably needed exposure to a wider audience. For some time now, we’ve been highlighting the failings of old-fashioned, Twentieth Century “butts in chairs” presumptions of corporate America, namely that if you’re sat in your cube like a good Organisation (Wo)Man, you must be working productively. It’s good to see more mainstream recognition of the fact that people work differently and have different biological and psychological rhythms, from which it follows that the standard 9-to-5 schedule is not the most productive one for a lot of people (I make a crappy 9-to-5er, and so do many developers I know). And more generally, I welcome the pressure to take a more results-based view of productivity that privileges what people are actually getting done over how and when they do it.

Still, remote work has become something of a religion now among Millennial professionals in the “digital realm”, and it’s reached a fever pitch. I’ve heard from multiple people and in various forms the claim that modern tech companies, or software companies, simply do not need offices. It’s trotted out as an incontrovertible fact that trumps all business and people-specific considerations. Among certain segments of affluent Millennial professionals, it’s become a cult.

The world wants for a more sober and equanimous analysis, which, if undertaken, leads to more ambivalent conclusions.

Business model and knowledge

I think the main thing missing from the generic focus of lyrical encomiums to working at home is an awareness of how knowledge is shared and transmitted. That’s going to be strongly tied up in the nature of the business model and its specific workflows.

Yes, remote can work well in a small team of professionals who work mostly independently on compartmentalised work items. That suitably describes a lot of web startups. Good web developers, for example, can be relied upon to maintain and expand their skill set independently of the concrete work they do. Essentially, they’re freelancers with a W-2 paycheck.

That’s not how a lot of business in the “knowledge economy” works, though. I got my career start at a relatively small-town Internet service provider, rapidly rising from a part-time student tech support employee to the principal system administrator in 1-2 years’ time. I came into the first role at age 18, having good raw technical skills from a childhood of Linux and C programming but with no real-world work background, business experience, or knowledge of industrial equipment. I was an eager knowledge sink and learned a great deal from older colleagues who mentored me. A lot of the gaps that needed filling weren’t so much technical skills as applied experience with how to implement technology to serve real-world business cases, and the trade-offs involved in doing so. I had no exposure to business growing up; I had never dealt with the complexities of real customers or contracts, and knew nothing about how to price services or the true cost structure of a company, CAPEX vs. OPEX, etc. Like all over-eager, bright-eyed, bushy-tailed 18 year-old beavers, I had to be slowly disabused of an overwhelming tendency to recommend “build” over “buy”.

Furthermore, I forged strong relationships with the interesting and eclectic crowd that this employer attracted. These remain my strongest social connections more than a decade later, and have been professionally as well as personally important. And when I left that ISP role at age 20, I was able to leverage that broad experience and parlay it into a rather meteoric rise in professional status at a big-boy corporate job in Atlanta. This process made me into the professional I became.

The small-town ISP wasn’t lucrative. The bargain with employees was, for the most part: we pay low student-type wages, you learn more, and more quickly, than almost anywhere else you could conceivably work. It was a fair trade, and one that exists in a lot of places in the economy. The average 19 year-old, even of the precocious sort, doesn’t get to administer BGP routers or help deploy SANs. This was all socially transmitted knowledge, the organic outcome of shared culture built around the proverbial water cooler.

I saw where the cables ran and how real networks looked. Even in our Cloudy world, where these links are increasingly software-defined, it’s important to see and touch. I paid attention to how my coworkers worked, their mannerisms, how they reacted to difficult situations, and I copied and adapted many of their habits. I came to have similar instincts. In ruminating upon how I learned, I learned how to better teach and train others. I learned to bring a business outlook to bear on many issues as well as a technical one. I learned a lot about common organisational anti-patterns and what not to do. These are the things that made me valuable to future employers as much as any technical skill set I possessed.

I have trouble imagining how this would have worked if I were sat at my home computer, given a bunch of logins to network equipment and told to inquire on something like Slack if I had any questions. I was there in person to pester — and occasionally frustrate — my senior coworkers, and, with time, to teach and mentor my junior team members, and it made all the difference.

Techno-utopian fantasies and the human factor

For the last decade or so, I have been doing SIP and Kamailio consulting for VoIP service providers. VoIP is a weird intersection of the technology universe where telephony meets computers, two worlds that don’t traditionally converse. The business opportunity as a consultant comes largely from the fact that the phone guys traditionally don’t know much about IP packet networks, data and IT, while the IT guys don’t know much about phones.

And although that world is slowly changing, VoIP providers still have to talk to the PSTN (Public Switched Telephone Network), AKA the traditional telephone network. To be a useful vendor to VoIP service providers, you need some rather esoteric domain knowledge about arcane PSTN concepts that go back to 1980s technology. The PSTN is highly regulated, and you need to understand that regulatory environment to be able to understand customer needs as they relate to billing and interconnection.

That sort of thing is called domain knowledge, and exotic domain knowledge is the essence of most commercially viable consulting endeavours. VoIP is an uncommon skill set; there’s a very limited number of people out there who possess it, and you can’t just hire off the street for it. Even when you do find someone with that expertise, it’s almost certainly going to be in an allied, but different subspecialisation of the field. In addition to imparting niche technical skills, you’re going to have to teach them about the industry and the customers.

How do you do that over Slack and Hangouts? Well, I thought I could. I have hired four or five people during the lifetime of my business, all remote, reasoning — rather contemporarily — that working at home is a nice benefit to provide and technology can bridge the gaps.

It can’t. It didn’t work. And I specialise in telecommunications. When it comes down to it, phone, e-mail, chat and video are all directed communications graphs. Any communication is particularised, deliberate, and has a certain cost, even if it’s relatively low. Chat inherently privileges the short sound-bite and the “quick takeaway”, the favoured refuge of people too busy to think. Few people are going to type as much as they speak. And any commitment to do so leads to self-consciousness about using “work time” for that purpose in ways that hallway chats do not.

Of course, it’s not that all “meatspace” workplaces are socially robust, thriving marketplaces of ideas and nexuses of collegial friendship. I’ve worked in plenty of corporate environments where people come in, sit in their cube, type things, have a meeting, and go home. But still, real-world tech work isn’t always about churning out code in a generic, undifferentiated way. Often, it can’t be divorced from deep knowledge of the business domain in which you participate. It’s very important that your employees come to have social knowledge of that business domain, and the inefficiencies of remote communication are a surprisingly strong headwind.

What’s more, any honest entrepreneur can tell you that convincing people to work for you and applying their work in an economically useful way is actually an incredibly hard problem. It’s often harder than getting customers to pay for the product, which is usually the more central preoccupation of business lore. Knowing your (expensive, indispensable) people, what makes them tick, keeping them happy, and maintaining a finger on their pulse is more art than science. Accenture and Deloitte may think of people as “Linux resources”, but in the world of small business, this is your crew, your livelihood, your life-blood. Emojis don’t promote that kind of deep connection to so-called “human capital”.

I think this is all a special case of a more general fallacy that pervades the technocratic bent of Valley thinking: the conceit that technology can solve broad classes of timeless management problems that are essentially human. A lot of the sales pitch behind ticketing systems, project management systems, CRMs, Slack, Basecamp, etc. has the meta-message that if you just had the right tools, you could bridge all work and process gaps, or somehow guarantee or force productivity, or provide browser-based surrogates for the psychological feedback of solidarity and shared purpose. You can’t. Not even with uncompressed 8K video and a million-dollar telepresence system. Ask the airlines if anyone still travels to have important business meetings. There are certain categories of problems for which more technology is not the answer.

A related pitfall of technocratic utopianism—that it is in tools and technology that our salvation lies—is that it often leads to solving the wrong problems. For example, metro Atlanta is practically a poster child for the sprawling suburban dystopia of which I have treated at length. It’s an accepted fact that no matter where you locate your company office in Atlanta, you’re dooming a high percentage of your work force to a potentially soul-crushing commute across Atlanta’s unconscionable freeway distances. It’s not news that length of commute correlates inversely with health and happiness. So, what’s our response? Instead of taking a fresh critical look at our crappy infrastructure, lack of public transport, and automobile-centric, sprawling built environment, we flush the positive value of the enterprise of “going to work” out with the bathwater of “the hated car commute”. But they are not one and the same.

Personality and social needs

As I wrote in another post from 2015:

Oh, [working at home] seemed incredibly cool when it was the forbidden fruit. Back when I had to make a bleary-eyed, tedious commute to some cube at 9 AM and put cover sheets on TPS reports or listen to coworkers’ incessant sports talk, working from home was a rare and coveted treat, the stuff of dreams. Imagine, saving the world in my bathrobe, all the fine things in life at my fingertips: refrigerator, snacks, couch, coffee table, a breather on the balcony!

However, after I went out into the reputedly exciting world of self-employment around this time eight years ago, the novelty wore off after a week or two and the bleak reality set in. I’m an extrovert and I don’t handle extended loneliness well. Not leaving the house was depressing and unhealthy. It was not conducive to a routine; I quickly developed a chronically dispirited mood, exquisitely strange and shifty sleep rhythms (even by my nocturnal standards), and eating habits worthy of a bulletin from the Surgeon General.

Oddly enough, this was unrelated to whether I lived alone, with a long-term romantic partner, or family and friends. Certainly, I can’t work at home these days in a small apartment with three young kids, but for most of the eight-year history of this business, I lived alone or with an adult partner no less busy than I. Also, I spent a few years living overseas. In all cases, I was dysfunctional working at home–or whatever place served the role of home–and I hated it. To stay sane and produce consistently, I need some kind of routine, a commute, movement and walking, coworkers, water-cooler talk, lunch meetings, and the overall psychological compartmentalisation that comes with a distinctive work-space. If I don’t have that, things go downhill fast.

Working remotely from one’s residence certainly doesn’t have universal appeal.

An understated but important subtext of the remote work discussion in IT culture is a celebration of the stereotypical techie introvert, who resents being subjected to mandatory social participation in the typical corporate workplace shuffle. In a world where such people feel — to some extent legitimately — crowded out by extroverts, this is fair play.

But not all extroverts are facile schmoozers and gold chain-wearing womanisers from Sales. A question that seems to get drowned out in ecstatic praise of remote working is about the psychological foundations of motivation. Productive relatedness to one’s fellow man is a universal psychological need. We all need, in one measure or another, to be seen, admired, included, valued, recognised and praised for our distinctive contributions to larger endeavours. There’s a lot of unexplored territory around how a chronic state of remote work bends this dynamic and affects long-term job satisfaction. I am not ashamed to admit my investment in my work and my professional identity generates social needs that languishing at home does not fulfill.

There are other wrinkles in the fabric of human psychology, the blunt and unvarnished truths we’re all supposed to learn as we get older, wiser and more savvy to the fragility, equivocality and capriciousness of the human condition. By way of illustration, one possible wrinkle is the role of the workplace as a refuge, however unconsciously sought, from a difficult and stressful home life, for men and women alike.

We’ve heard a lot lately from people who say, “Now that I work at home and don’t have to commute, I get so much more done in less time while still staying on top of chores and spending more time with my kids! My life is so much better!” Well, good. Everyone should have your idyllic life and happy marriage. Everyone should be young, affluent and healthy instead of old, bankrupt and ill, and they should live in a village full of warm, loving friends and relatives, instead of alone and forgotten. Home life should be easy and cheerful instead of overwhelming and demoralising. Who wouldn’t rather live in Mister Rogers’ Neighborhood, where addiction, abuse and depression are the unintelligible words of a foreign language? Given the choice, instead of dealing with Mom’s methadone withdrawals, bailing their cousin out of jail again, going in for an MRI of a meningioma, or staring at a foreclosure sale notice, I think anyone would sure as hell prefer to crush some P90X, bro down on some Scrum board stories in Awesome.js, and make a fat 401(k) contribution because it’s payday. All without leaving the house! So much winning.

But my life experience suggests there is a not-insignificant number of people for whom escape to a workplace and a single-minded focus on work is what they need to stay sane. It may be all they have. Even for a lot of the affluent professional middle class, as often as not the best case of ordinary American existence is that it’s bleak and offers little to come home to and little to go out to. Or it can be much worse. That’s real life.

Rather generally

Well, that took a turn into a peculiar niche area, you might say. But we’ve got to think about stuff like that before we declare the workplace, as we classically understand the concept, to be unequivocally obsolete. We overzealously declared the pedestrian civic realm and the public plaza obsolete half a century ago, and look what happened.

There’s probably a reason why we have evolved an entire cultural vernacular not just around the specific places and facilities in which we work, but the idea of being “at work” and not “at home”. Being “at work” isn’t just about where you are located right now—it can also have a more cosmological dimension. It’s a state of affairs. It’s a punctuation mark to many would-be run-on sentences, far from all pleasant.

As a software engineer, I’m the first to say that not everything need be expensive and physical. However, humanity cannot be wholly separated from its physicality and “uploaded” to The Cloud. Many of the physical structures and in-person rituals we have built are a necessary manifestation—indeed, a mindful assertion—of inspired productive communion in our short time on Earth. It may be that a place of work—a work-place—is one of them.

Ten years of VoIP & SIP

I woke up today and realised that as of summer 2016, I’ll have been doing VoIP & SIP for ten years. It was about ten years ago that I first connected an Asterisk 1.2 server to my home landline via an FXO card, registered a Snom 190 to it and got down to business with my copy of O’Reilly’s Asterisk: The Future of Telephony.
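For readers who never saw that era, the moving parts were roughly as below. This is a hedged sketch from memory rather than my actual configuration (the secret and dial patterns are invented), but the sip.conf / zapata.conf / extensions.conf division is how an Asterisk 1.2 box with a Zaptel FXO card and a SIP hardphone was typically wired together:

```
; sip.conf -- let the SIP hardphone register (secret is invented)
[snom190]
type=friend
host=dynamic
secret=changeme
context=internal

; zapata.conf -- the FXO port facing the landline; note that FXO ports
; are configured with FXS signalling, named for the far end of the link
[channels]
signalling=fxs_ks
context=from-pstn
channel => 1

; extensions.conf -- the dialplan tying the two together
[from-pstn]
exten => s,1,Dial(SIP/snom190,20)            ; inbound PSTN calls ring the phone

[internal]
exten => _NXXNXXXXXX,1,Dial(Zap/1/${EXTEN})  ; outbound calls seize the landline
```

Any SIP device registering with those credentials lands in the internal context and can seize the landline, which is all the “business” there was to get down to.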


I know that 2006 isn’t exactly old-school by the standards of Asterisk early adopters, and especially not by those of butt set-wearing colleagues who think E&M Wink is too newfangled and won’t drive cars made after 1975 due to all those new anti-pollution requirements. If you did Dialogic IVR programming on NT boxes in the 1990s, or rolled out AT&T WATS lines in the 1970s, you won’t be impressed. However, this field encompasses a third of my life span and nearly my entire adult life and career. So, from my perspective, it’s been a minute.

I was never an early adopter. At my small ISP job of the time, several people were tinkering with Asterisk from 2003 onward with varying degrees of commercial implementation success. I was introduced to the notion in 2005, and at first responded with relative indifference. In fact, if you had told me I was going to be doing anything with voice or telephones before 2006, I would have looked at you like you were crazy.

As the modem boom rapidly wound down and our ISP floundered desperately in search of a new business model, the connectivity side became a strictly Layer 3 proposition, a kind of sales channel for the ILEC. They owned the network and transport we sold. Do you remember the brief window in which independent ADSL providers were allowed to exist in the US, in the early-mid 2000s? That was us.

Nevertheless, I had a burgeoning interest in the technical and business side of networking that grew out of my deepening responsibilities at work. As I aggressively pursued that curiosity, I plumbed down the protocol stack and became intensely interested in how all these circuits worked at the physical level (which was out of our reach as we did not own the network). T1: A Survival Guide was a very enlightening tome that I literally could not put down. Anyway, as one slices through the OSI burrito, the subject matter rapidly converges with the history of voice and the Bell system, through the pathway of digital trunking facilities and digital loop carriers, multiplexing, etc. Now I was interested in voice. The whole Asterisk and open-source IP PBX phenomenon was not central to my interests, but critical to pursuing them; at the right time and place in this developmental process, it pushed down telephony from the mystical realm of the proprietary into commodity hardware and closer to the application level, which meant I could actually afford to get my hands dirty with it.

Speaking of commodity hardware, my foray into voice was, as my forays into many things as a twenty year-old, chaotic and irresponsible. I took a PBX PC with a Digium quad FXS/FXO card home from work and impertinently formatted it into a fresh Debian install. Why not? It seemed to be sitting downstairs doing nothing for a few months, give or take. Spare hardware, right? As it turns out, its disk contained–well, had contained–the only copy of a home-spun Asterisk PBX CD distribution project into which my immediate supervisor had invested many months of work. There were no backups. Oops. I still feel terrible about it. Granted, there were no change control processes, asset tagging, version control, or project management systems, but common sense might have invited one to ask first.

Anyway, I returned the hardware, built my own Asterisk server, got my own FXO card, and plugged into my land line. Those who frequented the 24/7 coffee shop Hot Corner back in Athens, GA in those days (summer 2006) might remember me sitting there with a telephone plugged into my laptop, which I was using to bridge it onto the WiFi network. Making calls out of my home landline from the coffee shop! How cool was that!


In a data centre in 2007. The smile is definitely posed; I do not typically enjoy my time in data centres this much.

Here, my career took an interesting turn and perhaps suffered from a bit of confusion. I’ve never been a so-called visionary for innovation, but my twenty year-old self was especially impaired as a barometer of industry trends. I didn’t really realise that the classical TDM stuff was on its way out. I saw Asterisk as a pedestrian and entry-level window into the Really Serious Stuff, a stepping stone to the exalted heights of multiplexers, DACSs, and big-iron switches. Instead of stopping for a moment to wonder why the O’Reilly book was called “Asterisk: The Future of Telephony”, I was suddenly enamoured with things like ISDN, SS7, and SONET. I wanted to get into hardcore CLEC operations–interconnection, switch translations, the works.

So, instead of getting deeper into VoIP per se or taking much of an interest in the new Layer 5-and-up capabilities Asterisk opened up, I spent a few years investing in that career direction, rather the opposite of where things were going. I was a bit of an enigma: a guy who got into VoIP and discovered his love for the CSU/DSU.

I kept up my Asterisk skills and used it frequently, and I also started to get into OpenSER around this time. However, it wasn’t really what I cared about the most. Real excitement came at moments like when a colleague of mine set me loose on his Cisco AS5300–the first time I encountered the voice side of IOS–to get a few inbound PRIs working and routed to a SIP server. That was living!

Besides the fact that my core areas of interest were not so much hot topics in 2006-07 as they were moribund, there were a few other things I didn’t understand in my very early twenties:

  • TDM & physical-layer stuff didn’t pay, for the most part.

    Back at my ISP job, I thought the BellSouth CWAs who came in to reprogram our building mux in TL1 via the craft port were highly-paid, highly-skilled, in-demand demigods, true engineering luminaries, practitioners of the dark and recondite arts. Anyone can do good old IT folk traditions; few can program a fibre mux!

    I didn’t realise that they’re actually considered kind of blue-collar, not especially well-paid, and that their work consists largely of fastidiously adhering to procedures that are laboriously articulated in three-ring binders.

  • Engineering vs. operations.

    Diving into the voice service provider and CLEC world from the angle that I did, I ended up tracked for an “operations” career for a while without realising it. It took me some time to learn that this was a thing, and moreover, that I, historically a developer, belonged more on the engineering side. It dawned on me when I ended up a senior NOC tech at a mid-size company (a few hundred staff) and noticed that only “engineers” actually get to implement things or change code; “operations” people just do things like server administration and monitoring.

    Coming from a small-company background (our ISP was ~5 people), I certainly didn’t know that Corporate America makes a distinction between “engineering” and “operations”. As you might imagine, in an environment of half a dozen or so, everyone did everything, and owned their responsibilities end-to-end. In that sense, it was a fairly seamless transition from how I grew up doing things at a hobbyist level to a professional environment. I was already accustomed to having to know all aspects of what I was doing.

    That’s why I was so confused when the word “DevOps” first popped up. What the hell is “DevOps”? It sounds like just “how we normally work”, right? Are there really developers who think their work ends when the IDE closes or whatever? All of us had to both write the code and cultivate the skill set to deploy it in the real world, secure it, etc. Are there really sysadmins who throw anything involving for loops, if statements or SQL over to the Programming Department, because that’s not really “operational”? Our sysadmins back at the ISP may not have been professional software engineers, but if they couldn’t write their own utilities and script glue, or execute the odd inner join, they couldn’t do their job for more than five minutes.

    So, nobody was more surprised to learn than I that I was apparently in something called “operations”, where we had to get three “engineers” on a conference call to fix that one Bash script called from the one cron job. Shouldn’t I just do that for them? I thought I was in the get-things-done department? Bumping up against that division of labour caused me some problems both as an employee and as an employer.

  • The misery of proprietary platforms.

    I grew up breathing open-source like air, so I kind of took the culture, values and skill set that it brings for granted. I got a rude awakening and a newfound appreciation for it when I ended up in a role where my job was to be the Broadsoft specialist, deal with Metaswitch and Acme Packet, etc. Suddenly, no O’Reilly books, no conferences, no mailing lists, no forums, and, worst of all, no Google and no source code. Operating these platforms required an entirely different skill set: strong-arming the vendor and playing political games.

    What little documentation existed was not conceptual in nature; it read like a reference manual–fine if you already know what you want to find out and how to express that in the vendor’s proprietary vernacular, but completely useless if you don’t know what you should want or what it’s called. Worse yet, it’s very clear that a lot of these platforms aren’t seriously designed for their users to run them, but for large operators who rely on the vendor’s consulting to architect, build and maintain their networks. That gets really fun if you don’t have official vendor support, but that’s another story.

    It turns out that bashing my head against the CLI of various black boxes to see what the autocomplete turns up and fruitlessly pleading with vendor support people for answers is not my cup of tea. I was really bad at that job and didn’t get a lot done. I know for a fact that there’s a certain kind of personality out there that is very good at amassing this kind of knowledge by more oblique means, and those people are indispensable in the enterprise world. However, I’m not one of them. If you give me a box under whose hood I cannot easily look and which is not augmented by a user community or freely available literature, I’m pretty useless.

    More importantly, I realised that it’s a dead end; should you choose to become (by way of rather expensive training and certification) a specialist in one of these platforms, your employability and fate will rise and fall strictly with the commercial fate of that platform. Knowing how to Combobulate the ANI Presentation Screen List on the Translations Profile of a Business Group is valuable, but it’s not Knowledge with a capital K, nor even a fundamental skill set per se–at least, not as I was accustomed to thinking of it. It’s just the privilege of carrying some part of the reference manual in your head. I wasn’t going to bracket myself into that. I’d rather be the guy who doesn’t know the answer offhand, but can Google it in 15 seconds, and has the strong fundamentals to be able to (a) find the answer in the results, (b) intuit that what he found really is the answer and (c) apply the information.
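To make the “script glue” point concrete, here is an invented example (not one of our actual scripts, and the log lines are made up) of the sort of throwaway pipeline a small-shop sysadmin of the kind described above writes constantly: summarising failed logins per user from a log fragment.

```shell
# Hypothetical log fragment, inlined for illustration.
log_lines='Jan 01 sshd: Failed password for alice
Jan 01 sshd: Failed password for bob
Jan 02 sshd: Failed password for alice'

# Classic Unix glue: filter the lines of interest, project out the
# username (the last field), then count occurrences, busiest first.
summary=$(printf '%s\n' "$log_lines" \
  | grep 'Failed password' \
  | awk '{print $NF}' \
  | sort | uniq -c | sort -rn)

printf '%s\n' "$summary"
```

Nothing here rises to the level of “software engineering”, but a sysadmin who can’t compose this kind of thing on demand is, as noted above, stuck after five minutes.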

Anyway, the rest is largely history. In early 2008, I started a certain consulting practice and rapidly specialised in SIP service delivery platform development that heavily emphasises OpenSER / Kamailio. It’s been a slow, painful (ahem, “organic”) growth path that has taken far too long, but we’re well on our way to the product life, which is a generally positive development.

In hindsight, I don’t feel set back by the circuitous path I took to get here. Experience with development as well as infrastructure & operations goes far. Getting invested in the hard PSTN side of voice, instead of viewing the world solely through the prism of open-source VoIP systems, has empowered me to provide added value to my customers: I can understand the migration path from “legacy” infrastructure intimately and consult on intricate matters of PSTN-side economics. A lot of VoIP folks don’t really know what’s going on on the other side of the fence; they just send their calls to the ITSP and provision DIDs without much of an inkling of how it works. One could argue that this sort of esoteric knowledge is useful only so long as the PSTN is around, and less important in a world where voice is just another IP application, but, so far, even in 2016, the death of the PSTN has been greatly exaggerated.

Moreover, while there’s certainly a need to revolutionise paradigms and whatnot, being more deeply conversant with the folk traditions of telephony both allows one to play a more constructive role in bridging the gap and, with a little luck, might allow one to avoid some of the errors that those who don’t know history are doomed to repeat in any scenario, circuit-switched or packet-switched. There’s a reason it took decades to engineer voice to the standard of reliability POTS users have come to expect. I strongly believe there are some lessons in that even for the WebRTC-in-The-Cloud enthusiasts out there.

It’s an interesting industry and, without a doubt, a very small circle of key actors. From a business perspective, that has its upsides and its downsides (small market). I’ve been fortunate to make many good friends and colleagues along the way, and the support has been both indispensable and deeply appreciated.

That’s the best thing about being in a small industry; like Cheers, it’s nice to go where everybody [well, some people] knows your name. You are too numerous to list, but I owe a debt of gratitude to all of my friends, colleagues, customers, hiring managers and bosses for helping me to learn where I fit best.

More lessons from my twenties: surviving as an amateur literator in tech

My upbringing and extraction unquestionably lie in the liberal arts and humanities. I’m moderately extroverted, and have always leaned hard on the side of verbal, expressive and linguistic capabilities. During my time in university, I was a philosophy major with vague notions of law school. My parents are philosophy professors, and most of my relatives have an academic pedigree. I also had the unique intercultural experience of coming to America at age six with no knowledge of English and subsequently learning it in an academic social setting (my parents were graduate students). I carried that formative experience, and the globally conscious, relativistic outlook on language and people that it fosters, forward with me through life.

That doesn’t mean I’m a great writer, but I can write. Great writing, though, is really hard. As with many other things, if you plot a line from “can’t write at all” to “great writer”, you’d have to plot it on a logarithmic scale. Having a broad vocabulary, a firm command of language, and adroit self-expression will get you to the table stakes of “can write”, but that last bit, on the right, is a hundred, a thousand times as hard as what precedes it.

You know great expository writing when you read it; the thoughts and ideas are scrupulously organised, yet presented in a compelling way, with varied transitions and entertaining use of language, at once colourful and precise. Come to think of it, it feels pedestrian to anatomise it this way. You know great writing when you see it.

My writing is far too disorganised and repetitive to hit those notes. I’m verbose and can write a lot quickly and easily, but quantity is not quality; organisation has always been a struggle amidst my desire to relate a lot of details. If you read this blog with any regularity, you’ve seen that battle play out.

Though I’ve got better at condensing my thoughts and communicating ideas simply with age, I’ll never write like my friend Alan. His writing is incredibly brief and terse, but his gift is succinctness per se, which is not the same as brevity, though the two are very often confused in contemporary minimalistic fashions in communication. He can say much with little where many others merely say little with little.

As far as I can tell, the real gift there is the ability to accurately foresee the details and connections that the reader’s mind can work out for itself. Then you can say only what’s necessary to anchor the conceptual tent, cloth not included, avoiding most of the potential redundancy that makes verbose text tedious. This post would be about six times shorter if Alan were writing it, yet say every bit as much–if it’s truly important.

I tried, for a time, to emulate his style growing up, but the results were farcical, much more along the lines of saying little with little. Not everyone’s intellectual output can be compact and tidy. I have to ply my version of the craft, such as it is, differently.

Anyway, I lay out all these concerns not to be pompous, but rather to say that the kind of stuff I spend a lot of time worrying about doesn’t typify the STEM personality one commonly finds in the software engineering profession, nor the pragmatic, utilitarian–and often Spartan, at least when it comes to writing–communicators in the business world. There are exceptions, of course, but as a whole, my life experience is that it’s a valid generalisation about engineer types and MBAs thumbing out curt txt spk on their Blackberries. And this is the environment in which I’ve spent almost my entire adult life, having dropped out of university to seek the exalted heights of corporate America.

As you can imagine, this occasionally leads to amusing and infuriating conflicts of style and culture, and in general doesn’t make for an easy professional life. It’s not easy to talk to people when you have completely different psychological priorities than they do. The curse of being somewhat better-rounded is that my mind often takes detours not travelled by fellow Professionals. To their mostly utilitarian sensibilities, idle musings and the cultivation of an inner life beyond the immediate task at hand are, above all else, a waste of time. It’s not enough to just write this e-mail; it must be a good e-mail, at once brief and useful, but also poignant, articulate, maybe even with a dash of wit or a clever and original use of the English language. They’re thinking: get to the point, Alex, because business. There’s money on the line, or action items or something. Never mind the existential why! Business!

Being wordy didn’t make for an easy childhood, either. I don’t think I came off overtly as bookish, being mostly chatty and rarely seen with an actual book per se. All the same, I can’t remember how many times I was called “Dictionary” or “Thesaurus” in school, or otherwise suffered social opprobrium for… well, for using words like “opprobrium”.

Outside of the liberal arts wing of a university environment (which I forsook at age twenty), the rest of the world offers a pretty steady diet of hostility to aspirant wordsmiths, and, as far as I can see, to broader exercises of the intellect more generally. There’s the automatic, default hostility of idle, unemployed kids in school, and the studied hostility of busy professional grown-ups. It’s easy to get depressed shouting into a waterfall, or, more accurately, pissing into the wind. I often feel like an impostor, not quite sure what I’m doing here donning the regalia of tech entrepreneurship. When almost everyone I mix with expects small talk, being the guy always keen to start some big talk is demoralising and lonely.

And yet, as I pass the thirty mark, I’ve noticed something interesting. As more and more friends, colleagues and classmates move up the career ladder or otherwise evolve higher-order life needs, they’re coming to me for help in formulating thoughts: delicate requests, polite demands, cover letters, biographies, dating profiles, admissions essays, crowdfunding campaigns, petitions. All of this and more has landed at my feet in the past year.

“You always know how to say this stuff just right.”

“I don’t know how to say this – help!”

“You can put it a lot better than I can.”

Every once in a while, I’ll even get a note from a customer: “We always appreciate your thorough explanations and your going the extra mile.”

So, good news from our own “It Gets Better Project” for fellow closeted English majors in their twenties and thirties: keep your head up. As folks who know you move up the value chain into managerial realms requiring them to flex their communication muscles for the first time, you’re going to be more in demand.

Moreover, through my own experiences in hiring and being hired in the technology sector, I’m firmly of the impression that the most valuable candidates in the long run are those who both possess raw skills and can communicate well. There’s a lot of bottom-line value in clear analysis, disentangling messy ideas, and presenting esoteric information in an accessible way to outside stakeholders. Wordy missives may always be ignored by MBA frat boys as a matter of course, but effective and engaging communicators have more influence and audience.

The point is, as you gain confidence on your professional ascent and increase your leverage, stop taking shit from philistines. Don’t shy away from selecting aggressively for employers, customers and partners who realise that better-rounded people bring more to the table, and who appreciate you for who you are. Much has been said about how the customer is always right, and while compromises are necessary in life, you don’t have to concede everything, always. The fibres most integral to your self-actualisation should be armoured. The rightful sense of self is not for sale.

Evaluating potential hires for “culture fit” is all the rage in human resources now. Why not evaluate employers for culture fit, too? What’s the culture like at the new gig? Neverending arguments about last night’s Steelers vs. Cowboys and the impact on Fantasy Football picks? Spirited discussion of the pros and cons of sundry brotein shakes? A thriving marketplace of World of Warcraft items? Hackneyed memes about bringing democracy to Syria? Whichever it is, fire ’em. Sounds like bad “culture fit”.

Finally, choose your cohorts and your spouses wisely. Your true friends will help, not hinder you in leading an examined life.

Why I love Industrious, and about coworking and offices more generally

I’m one of those people who just hates working at home.

Oh, it seemed incredibly cool when it was the forbidden fruit. Back when I had to make a bleary-eyed, tedious commute to some cube at 9 AM and put cover sheets on TPS reports or listen to coworkers’ incessant sports talk, working from home was a rare and coveted treat, the stuff of dreams. Imagine, saving the world in my bathrobe, all the fine things in life at my fingertips: refrigerator, snacks, couch, coffee table, a breather on the balcony!

However, after I went out into the reputedly exciting world of self-employment around this time eight years ago, the novelty wore off after a week or two and the bleak reality set in. I’m an extrovert and I don’t handle extended loneliness well. Not leaving the house was depressing and unhealthy. It was not conducive to a routine; I quickly developed a chronically dispirited mood, exquisitely strange and shifty sleep rhythms (even by my nocturnal standards), and eating habits worthy of a bulletin from the Surgeon General.

Oddly enough, this was unrelated to whether I lived alone, with a long-term romantic partner, or with family and friends. Certainly, I can’t work at home these days in a small apartment with three young kids, but for most of the eight-year history of this business, I lived alone or with an adult partner no less busy than I. Also, I spent a few years living overseas. In all cases, I was dysfunctional working at home–or whatever place served the role of home–and I hated it. To stay sane and produce consistently, I need some kind of routine, a commute, movement and walking, coworkers, water-cooler talk, lunch meetings, and the overall psychological compartmentalisation that comes with a distinctive work-space. If I don’t have that, things go downhill fast.

I had to go somewhere to work routinely, even if it was just a coffee shop. And so began a long journey through half a dozen or so workspaces, whose evolution somewhat tracked the growth and increasing sophistication of my company. I’m going to talk about a few of them here before I get to Industrious, my happy home for as long as they’re willing to have me.

Coffee shops and open-area coworking spaces

I’m pretty sure coffee shops are part of the workspace lifecycle of just about every freelancer. Freelancer suitably describes the state of my consulting business in the first two years (2008-2009), and I spent untold hours at Starbucks or its local-colour alternatives in Atlanta and Athens, GA.


The Cascade complex in Yerevan, Armenia, and the location of the second Green Bean–one of my favourite places to work.

For putting up with my tiresome hunched-over-laptop presence over the years, big thanks go out to all of them.

I can only hope that all the money I spent made up for the annoyance of me bumming around endlessly.

The basic problems of working out of coffee shops are fairly well known. They’re noisy, so phone calls are tough to impossible. Seating is not guaranteed, and is far from always comfortable or ergonomic. There’s always the awkward economic aspect of earning your keep; even if you’re not frugal (I’m not), there’s only so much stuff you can buy while sitting somewhere for the better part of a day, and depending on how much foot traffic the coffee shop has, the owners may or may not love you as a semi-permanent fixture. Even if they love you, it gets socially weird showing up at the same retail food service outlet every day as if you live there. It’s a coffee shop, not an office.

But the real problem for me was that I needed an office, not a table. Much has been written about the importance for developer productivity of having a room with a door that closes, and I won’t belabour it here. More critically, I’m not one of those people who can live out of a laptop. If nothing else, my poor eyes and limbs can’t take it. I need peripherals, a real display (or several), a quality desk phone, and a place to store files, documents and gadgets. In other words, I need an actual workplace, not an ephemeral seat at a table.

My first move up the chain from retail coffee shops was to classical coffee shop/open area-style coworking spaces, then a relatively new idea. In late 2009, I tried the new and experimental (and since defunct) Ignition Alley collective near City Hall East here in Atlanta. It was a good concept and I admire Tim Dorr and Mike Schinkel for trying it, but it was just not viable from a physical perspective. It was one of numerous hipster-type urban renewal projects, a valiant effort to rehabilitate a dilapidated, grungy industrial building. There wasn’t enough money behind it to actually do that. It was a cold, damp, and clammy winter. Like many projects of its kind, IA wasn’t a place I would have brought a customer or a colleague to.

The fundamental deficiency in this type of coworking space, rather popular now globally, is that it’s not much of an improvement over a coffee shop. You’re still expected to come in and find a seat with your laptop–there’s no persistence. They’re better than a coffee shop because they’re quieter, understood to be for working, and you can stay there all day without feeling guilty or awkward, but in the best case it ends up being something like a library study room.

Some coworking spaces offer rentable dedicated desks to address this, as Ignition Alley did. I also spent a few months in spring 2013 renting a desk-spot (with no entitlement to any particular desk) at the quirky Sankt Oberholz in Berlin, which has both a coworking-oriented downstairs coffee shop and a dedicated members-only coworking space upstairs. It’s all situated in a fascinating 19th century building off the Rosenthaler Platz U-Bahn station in East Berlin. I also spent a month or so at Roam Atlanta in Dunwoody in late 2014, having just returned to the US and not yet found my new working home. They have created a nice members-only coffee shop with lots of nooks and crannies, but it’s still just a coffee shop.

I was content with these compromises while travelling abroad, but I can’t run a serious company–one-man or otherwise–out of a coffee shop. It clearly works for a large category of freelancers and so-called “digital nomads”, but not for me. I needed to step up to something real.

Regus and other offices

Commercial office space is expensive, and, as anyone looking for a small amount of real estate will quickly discover, generally rents in large increments (suites or entire floors) over multi-year leases.

One option is to sublet an office from someone with extra rooms. A colleague and I did this for a while in 2008-09, renting an office attached to a telecom company and data centre in Atlanta’s west side (near the Georgia Tech area). It was nothing special, but comfortable, access-controlled, and compatible with techie sensibilities.

We went the barter route on this instead of paying cash, since we could offer consulting that was notionally of value to our landlord. I advise strongly against bartering tech services, because this will lead to one of two outcomes:

  • The landlord exploiting you mercilessly, given their relatively high leverage; or
  • You feeling guilty and/or the landlord feeling screwed because you’re not really providing them much value.

In my case, it was the second scenario. The landlord was a very kind person who was much too honourable to meddle, impose or demand, and so the guilt that I wasn’t doing enough to earn the barter equivalent of market cash rent gnawed at me. However, I’ve known plenty of techies who fell for the first scam, and ended up doing thousands of dollars of free work per month to pay maybe a few hundred dollars’ rent.

Another option is to find a small, privately owned office building where space can be had in smaller increments. They do exist. I was in a four-room suite rented in a two-storey building owned by a nice middle-aged couple in downtown Decatur for a while, and the rent was quite reasonable for the square footage. However, four rooms was about the smallest they could offer, so if you’re like me, just looking for a room and not a whole suite, you’re going to have to band together with some others to get space in buildings like this. Moreover, as a general rule, I would say that such buildings are not fancy; this is Class B real estate territory, the reality of which can vary from merely somewhat drab all the way to bombed-out urban warehouse conversion.


That’s ice under my door!

Small sidenote: if you find yourself living in the “developing world” for a while, you may be tempted to think, as I was, that you can get a decent office cheaply. While “cheap” is a relative notion, my experience suggests that you’ll get exactly what you paid for. I rented an office in a Soviet-era building in Yerevan, Armenia for about a year at roughly $200/mo (allegedly expensive, if you listen to the locals),


Some renovation going on next door.

and I can say that as a spoiled IT brat, I would have been better off paying closer to mainstream First World prices at fancy new business parks. I had to fight for air conditioning in 40C summers, never got particularly viable Internet access, and found the place uninhabitable in freezing winter temperatures unless I were willing to daisy-chain a bunch of space heaters like the other occupants. In talking with other people who have rented space in notionally “cheap” countries, I get the impression this is not an unusual experience. Unless you come from an expensive place like NYC, London, SF, Tokyo, etc., you should expect that comparable levels of creature comforts will cost more or less… well, comparably… anywhere. Have you noticed that Starbucks-style lattes cost no less than a few dollars everywhere, in poor and rich countries alike, even if the specific amount of dollars varies? Think of good offices the same way.

(Disclaimer: Opinions mine, based solely on my subjective impressions at the time of service, and not in any way fact-checked with the rigour of the scientific method or the fastidiousness of investigative journalism.)

Back to matters stateside, though. Now we come to Regus, a UK-based company that provides, along with several smaller competitors (including HQ, which Regus acquired), the canonical answer to the small businessperson’s demand for individual office space. Regus and its ilk are specifically designed to meet this need, occupying entire floors of “Class A” office space in skyscrapers and premium office towers and parceling them out by the room. Regus has a seemingly unparalleled global network of these facilities; if you live in something like a city, you’ve probably got at least one “Regus centre”, if not a dozen or more.

I’ve been in three different Regus facilities in Atlanta — two in high-rise Midtown and Downtown office towers (2010, 2011), and one in the Perimeter area (2014-15). The first two were split with a colleague, and the third I rented on my own. I’ve probably spent more time with Regus than I have anywhere else.

The good thing about Regus is that the space is truly “Class A” and the service offering is complete. All centres are in premium office parks or downtown skyscrapers and high-rises. If appearances are important to you in your business because you meet customers in your office, e.g. you’re an attorney or an accountant, this is your cultural home. If you want to be in the same kind of upmarket, well-appointed office building as a large ex-employer, Regus will get you in there. Needless to say, it’s not cheap; don’t bother with dollars-per-square-foot arithmetic, it’ll make you cry.

For most young tech companies, the value proposition in “Class A” office space is probably quite poor. Most of my customers, for product and consulting alike, are not local to Atlanta, and I hardly see them face-to-face. I could work in a single-wide trailer in rural Alabama and they’d be none the wiser. I also found that the aesthetic of “Class A” interiors becomes quite bland after a while–depressingly cookie-cutter and unimaginative, if undeniably clean and comfortable.

The bad side of Regus is virtually everything else. Candidly, they’re probably one of my least favourite companies.


The first thing you’ll notice about Regus is that it’s an entirely sales-driven organisation. It’s almost like it’s really a sales company with the actual provision of office space as an afterthought. The managers of the facilities had titles like “Area Sales Manager”, and this really reflects the Glengarry Glen Ross character of the place. Those guys are always closing. The entire experience is numbers-driven, and you’ll feel very nickel-and-dimed for a la carte items that you probably figured were just included in the rent — the $100/mo Internet access (any alternatives require paying far more to bypass Regus), the $40/mo “kitchen fee” (so you can drink their Keurig-type coffee), etc. Faxes and copies are $X/page. There’s a lot of tax, title and tag beyond your base rent. They’ll try to upsell you on phone service and answering service, sell you on marked-up office supplies — always be closing. Nothing’s free or included. The conference rooms are scrupulously monetised. Remember Dwight Schrute as Dunder-Mifflin’s new landlord? Think intensely capitalistic thoughts.

Then there’s the nature of the rent itself. One-year lease agreements are mandatory and there is no early termination option, which is anathema to the realities of a young company, whether struggling or growing. Regus lease agreements emphasise that you do not receive a tenancy interest in any particular office, just that you are entitled to an office of a certain area. It’s more like a hotel, as they’re fond of saying. Although it’s not too common, they can and will move you to a different office if it suits a commercial objective of the moment. This is probably fine for people that need little more than a desk and a laptop, but not so good for those of us who nest with equipment, files or books in our workspaces, and more generally, build nontrivial attachments to the aesthetics of particular rooms. On one occasion, we were bumped (well, strictly speaking, priced) out of our then-current Regus centre altogether to make room for a client who wanted lots of short-term offices clustered together and was waving a lot of money around. You’re just a number.

Speaking of numbers, prices seem to be continually adjusted to reflect “market conditions” (i.e. a salesperson’s sense of how much you’re willing to pay) and proposed annual rent increases can be jaw-dropping. It’s a lot like sitting on a plane, knowing that every single passenger paid a different price for their ticket, a price that should be (in the view of the airline’s revenue management department) specially calibrated for every individual’s unique needs, so to speak. Traditional corporate landlords will wheel-deal and negotiate within a certain band that tracks average rental trends in a given market, but Regus have the flexibility to negotiate at the individual office level, so all bets are off.

Despite the high quality of the office space and the professionalism with which it is managed, all this makes for a rather user-hostile experience. If you don’t like the slimy feeling of constantly being sold to, you won’t like Regus. On the other hand, if money is not a key object and you like let’s-make-a-deal, you can find a nice office in the best buildings damn near anywhere in the world. If that’s important to you, might be worth a look.

One more thing: most traditional office space of any kind isn’t going to solve the solipsism of the solopreneur. I’ve never met anybody in Regus or other conventional spaces. Everybody scurries to and from their opaque offices and doesn’t socialise much. I imagine many folks like it that way.


The first time I saw Industrious’ promos, I knew I wanted to be there. Finally, there’s an office company that really gets it! I’m obviously not the only person to have come to that conclusion, as their occupancy in Atlanta is high and they’re expanding aggressively nationwide.

Since September 2015, the Atlanta Midtown location is Evariste’s new home, and we’re not going anywhere. I think that’s the first time I’ve ever said that with certainty and conviction about any office space.

The Industrious model combines all the social virtues of coworking with the critical realisation that private offices with doors that close are essential for serious companies. It’s real office space, but it’s also Gemeinschaft. When you want to hang out, you can come to the common areas, and when you want to hide in your office, you can do that. When you want to be a freelance hipster, you can do that, and when you need your business to grow up and be taken seriously, it lends itself to that as well.


Our actual office.

Open-area coworking spaces don’t necessarily provide socialisation or collaboration opportunities consonant with the sales pitch. In places like that, people are trying to carve out their cocoon of concentration, siloed off into earphone-wearing atoms. At Industrious, there’s a real feeling of collective, which I think is an underappreciated key to the psychological health of the solipsistic entrepreneur or isolated small team.

All the coworking spaces push this angle strongly, but it’s only at Industrious that I’ve really seen it work. Some of that is just blind luck, I imagine; I was fortunate not to end up in another sea of SEO Creative Catalysts and Twitter emoticon visionaries with whom I had zero professional overlap. But some of that is because at Industrious there’s real balance, just like in a residential community; sometimes you want to go to the festival, but other times you just want to have a quiet evening at home. Industrious lets you do both.

I never made any friends at the open-area coworking spaces I’ve been in, but here, I’ve already done business with one neighbour and am getting ready to enter into a long-term contract with another! No collective setting can guarantee that one will make friends or strike up fruitful professional connections, but I think that Industrious’ claim is more substantive. They’ve got the facilities to situate non-trivial companies and groups of people engaged in higher-order business, not just individual professionals, so I think there’s a better chance of finding someone you’ll want to talk to.

The physicality is unique, original and well thought-out: all-glass office walls and front windows, a well-lit fishbowl in keeping with the latest architectural fashions in startup land. It’s very pleasing on all levels, and the ability to see into everyone’s office lends a surprising additional visual diversity to the interior, as the decorations, creature comforts, and the occupants themselves become part of a living, breathing décor. The glassy, transparent style of the place is oddly addictive. It makes me feel like I really want to be there. The transparency probably takes away privacy in the eyes of some, but it seems to me that the upsides greatly outweigh that. Overall, they’ve done a good job of skimping on things that the incorrigibly entrepreneurial don’t really care about; there’s exposed HVAC pipework, lots of concrete and a slightly industrial feel. However, they’ve been scrupulously attentive to design concepts for things that do matter, and I really like the values and priorities expressed in those choices. Having been through my share of projects with an “urban renewal” take on this, I know that’s hard to find.

The transparent, glassy anatomy is also an ingenious way to increase the ever-troubled value of interior (no window) offices, since daylight still finds its way into them, and one can see other people. Interior offices at Industrious are not depressing, isolated dungeons.

The common area is a pleasant place to relax, and there are frequent low-key social events. That appears to be consistent across all the locations.

Did I mention that you can draw on the glass?


Industrious provide a variety of room sizes, too, ranging from a small single-person office to rooms that can support five or even ten people. You can grow a team quite a lot before you outgrow Industrious. Upsizing or downsizing is trivial, subject to availability.

I wouldn’t say it’s cheap per se, and pricing does vary by market and facility. The larger rooms can stretch into the thousands of dollars monthly. However, all rent is strictly month-to-month; there are no leases. When you consider that none of their rental revenue is under contract, you can’t expect it to be cheap. There’s no slimy, high-pressure sales atmosphere; Internet access is included, as are snacks and coffee. There are no games. You can print things without anxiety.

I think the concept is a clear winner. This is unquestionably the right way to do a la carte office space for freelancers, startups and small companies. I think they’ve succeeded in redefining the category and anchoring a trend.

(No mention of my positive experience with Industrious at Atlanta-Midtown can be complete without a shout-out to Mary Catherine Hardage, the location manager. Aside from being a genial personality and pleasant to work with, she’s highly organised, diligent, and scrupulously attentive to detail. She takes excellent care of the place and is a huge asset to the company. Say I’m being gushy and saccharine, fine, but in bottom-line terms, I can’t tell you how much this stuff matters when you need something.)

The Hacker News playbook and enterprise software startups

Big Silicon Valley capital is largely interested in consumer-facing, high-growth “world domination” plays, in anything that has the potential to become a household name. Naturally, the Valley’s startup-grooming tributaries (e.g. Y Combinator) aim to position tech entrepreneurs at an angle complementary to those criteria, since that’s how they make money.

A lot of the cultivated folklore and intellectual work-product of the cultural leaders of this space, as epitomised by the writing of Paul Graham, speaks the language of upward exponential curves, critical mass, and gargantuan user bases–all things Valley web economy VCs like. As PG says here and elsewhere, startups are, most essentially of all, about going big by building something lots of people want.

But what if you’re like us, making a foray into the “boring” world of intra-industrial business software or a product that is specialised deeply into a vertical-specific niche?

I don’t mean a “lifestyle business”, nor do I specifically mean the long-run, sustainable, bootstrapped approach for the advocacy of which 37Signals and DHH have distinguished themselves (although nonparticipation in mainstream tech investment is implied); 37Signals still have products that millions of people want. I’m talking about building something relatively expensive that almost nobody wants.

Think of some Byzantine water pump control mechanism for a sewage treatment plant, something you can elevator-pitch in two seconds to a very select audience but that you couldn’t easily explain in ten minutes to anyone else. We build something like that for the VoIP telephony industry.

We’ve been around for eight years, we’re tiny, and we’ve morphed into a product company largely out of a consulting heritage. We’re clearly not a startup as YC and Valley “VC-istan” would have it, and nobody would fund us. We’re in no danger whatsoever of a “rocket-ship trajectory”, do not leverage “network effects”, we’re not “going viral”, and our customer acquisition cost is pretty high. While we too have benefited from the structural decline in the cost of starting a technology company (e.g. cloud servers), we’re largely unable to benefit from some of the biggest shifts to a lower cost basis and barriers to entry: Google web ads don’t do us much good because that’s not how our typical customers find us, and we don’t have anything to put in a mobile app store.

It’s mostly old-fashioned relationship building, personal brand, conferences and trade shows for us. It’s boring, it’s expensive, it’s slow. Imagine an SEO-tweaked conversion funnel with low-touch onboarding. We’ve got whatever the opposite of that is: a trickle of leads that, when we get lucky, spool out into long, drawn-out, consultative sales cycles measured in months or even years. It’s not the stuff of compelling pitch decks.

And so, the question I’ve been pondering for a long time is: what is the size and location of the cultural and methodological intersection between the Hacker News flavour of startup lore and our kind of business model? Do we still have something to learn and apply? Can any useful takeaways be mined from the corpus of essays and “thought leadership” that PG and YC have provided? Are there useful entrepreneurial insights to be captured from Hacker News?

I think the answer is yes. One must simply be careful to cherry-pick the right bits. Here are a few thoughts:

Making Something People Want

In the business software realm, this needs to be recast with a sharp emphasis on “solve problems people have”. The advice to solve one’s own problems, or at least problems directly relatable to one’s industry experience, is, at its core, essential.

I first had an idea for something like our present-day product in 2006. At that time, I was young, inexperienced and new to telecom, and conceived of the problem space in very a priori terms. Had I moved forward and tried to take the concept to market at that stage, it would have been a spectacular flop because it did not provide institutionally acceptable solutions to actually-existing problems.

It’s possible that, had I been in a financial position to commit to it full time and avoided being bogged down in consulting for several years, I would have realised a speed advantage from being able to “fail fast”, “iterate” and/or “pivot” in response to the negative feedback for my initial concept. However, in a small industry where personal brand and reputation play an important role, I’m not sure the speed advantage would have outweighed the personal brand deterioration resulting from putting something out there that simply doesn’t work.

Thus, I’m moved to say that the emphasis on empiricism and really understanding one’s target user is triply important in business software. In particular, I would add that non-trivial domain expertise in the user’s industry is probably a must, unless you’re building something that is, at heart, rather broad and generic.

There’s a particularly ludicrous current of wishful thinking out there that presupposes “customer discovery” to be a free-floating skill set unrelated to any particular industry or sphere of expertise. There probably are some backward industries where daily workflows consist mostly of disposable paper-pushing, and where an application with some commodity CRUD screens could make a meaningful dent. However, in our industry, having a JS/CSS-savvy “UX quarterback” shadow “everyday users” for a few days to “really discover their pain points” would be a hopelessly naive waste of time.

There are no shortcuts: you come to know a lot about telecom by working in telecom. To make a good telecom product, you have to be deeply conversant with the history of voice and data, the supply chains, the acronyms, and the regulation–oh God, the regulation. The same probably holds true of most industries you could build complex solutions for. If you think you can walk into property-casualty reinsurance and “disrupt” the place with a month-long Ruby on Rails bender (how very Agile of you), the vertical niche business market is not for you.

In this light, the market validation provided by an organic, consulting-driven funding strategy–which PG is generally sour on–is highly valuable. It might be worth building your product that way even if you could go the fundraising route instead. You’ll learn a lot. I doubt our product would have any market traction if we had tried to leapfrog the several years of hard lessons learned about our target market from our otherwise tiresome and financially stressful consulting slog.

Making Something Users Love

Having said all that, if you read a lot of PG and Sam Altman, it’s easy to become discouraged by repeated talk of the importance of building something users love. Good products simply roll off the shelves, like those round-ish late-1990s iMacs. Marketing is just an optimisation for more eyeballs; the product fundamentally sells itself on a powerful wave of early-adopter enthusiasm, and if your product isn’t grabbing most people who come across it, the implication is that it’s just not a good product.

There’s a fine line there. You do have to know when to call it quits if nobody’s buying what you’re selling. Given enough time and effort, it’s possible to sell at least one unit of something to someone, somewhere; that doesn’t imply a good commercial prospect. You should have some way of figuring out whether your product is really taking hold.

However, sales in this area is hard, and you should expect that; the idea that your potential customers are going to just want or love what you’re selling in a self-evident kind of way is complicated by, well, the complexity of what’s being sold. Don’t be fooled into thinking that your product isn’t good just because every sale feels like bruising hand-to-hand combat. You’re fighting against institutional inertia, the customs and habits of the boys at the country club, sclerotic management bureaucracy, combative purchasing departments, and the marketing stranglehold of big-brand competitors on risk-averse management. In our case, we’re selling something that requires the customer to remove core infrastructure in a growing, revenue-generating, and intensely downtime-averse business and replace it with our own. It’s easy to get everyone to agree it’s ultimately a good idea, but it takes political wherewithal. “We’re really in a hurry to do that,” said no executive decision-maker ever.

Many of our most loyal customers of today didn’t know they had a commercial problem our product could solve. The function of marketing becomes very important here: as a general rule for the world of capital goods, customers have to be educated. The “bounce rate” on eyeballs alone is going to be close to 100%.

The popular understanding of what it means to make something users love is often tied up in good user experience and front-end mechanics. This is a siren song. In the world of capital goods and the complex solution sales that go with them, the most important criterion is, “Does it make or save us money?”

That’s not to say a good UI isn’t a competitive advantage, but don’t sweat it too much. Users will put up with a pretty bad UI on a machine that prints money for their company. More importantly, a good UI won’t do a damn thing for a product that doesn’t have positive bottom-line impact or significant business-level differentiation.

The closest thing to a product that sells itself in the business market is a product that obviously enables new revenue streams or improves margins on existing ones. You’ll want to focus on that, not giving the end-user Microsoft Word in a browser. The latest trending JavaScript front-end framework is not important. Delivering business results is important. That’s something the users will love.


Building a Great Team

It is indeed critical to hire the right early-stage employees; hiring mistakes in the early stage will break your company even with all the good products and marketing tailwinds in the world. Much of what PG and the YC crowd have to say on the importance of building a great team is directly applicable.

Early-stage hiring is particularly complicated in niche verticals because your customers are buying a vendor service relationship as much as they are buying a software system. Your staff will be engaged closely with customers who expect credibility and expertise from your people, and it’s the warm fuzzies from that collaboration that often close the sale.

That often means that run-of-the-mill skill sets found in the general IT population you can hire off the street simply won’t do. Entry-level people are an especially pernicious hiring choice here, since there’s so much more background knowledge to impart. Your early-stage employees will need to be both technical and essentially fluent in the industry domain to which you are selling, which greatly reduces your hiring pool and makes candidates even more expensive.

Accordingly, your success or failure depends on your ability to get people with some vertical-specific industry background in the door. You should expect to make even more compromises here than is typical in Valley web startup land as far as equity grants and so on. You’ll want to be mindful of the regions and labour markets around the country that concentrate IT people with particular domain knowledge above and beyond table-stakes technical skills: if you’re doing something in energy, get comfortable with Houston, and if you’re doing fintech, think Chicago or New York.


Charging From Day One

There really are no pre-revenue business models in enterprise software; at least, there shouldn’t be. All talk of building a “critical mass” user base and figuring out how to monetise it or render it profitable later is irrelevant and should be summarily ignored. Your first customer should be paying.

There are, of course, some strategic API and platform plays out there whose primary purpose is to set up for an acquisition. These often don’t have paying users or don’t generate a lot of revenue. However, as a general rule, acquisitions in the business market are more rationalistic and quantitative, so bringing more revenue to the table redounds to the benefit of your valuation and bargaining power. This is a bit different from the voodoo valuation process of mass-market startups, where irrational investor exuberance can sometimes be maximised by removing the constraint of concrete, earth-bound revenue and encouraging the acquirer to “really dream big”. All this to say: book revenue. You’re not going to get more for less by letting a freemium cat out of the bag and into the open market. More revenue is always better.

Otherwise, in a market with a low volume of high-magnitude transactions, every customer counts. Arithmetically, price segmentation of some description is usually required to make a product economically viable. One of your biggest preoccupations early on should be to delineate the needs of your “lite” users at one extreme, versus your “enterprise” or “platinum plan” users on the other, and to tier your product accordingly. Joel Spolsky’s classic Camels and Rubber Duckies, dated though it may be, still comes highly recommended as one of the best introductions I’ve seen on this subject.
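To make the arithmetic concrete, here is a minimal sketch in Python; every figure in it is invented purely for illustration and is not drawn from our business or any real market. The point it demonstrates: when a handful of customers have wildly different willingness to pay, a single flat price forfeits most of the revenue that a tiered price list would capture.

```python
# Hypothetical annual willingness-to-pay figures (USD) for five customers.
# All numbers here are made up for illustration only.
customers = [500, 800, 1200, 5000, 20000]

# Flat pricing: one price for everyone; anyone below it walks away.
flat_price = 1200
flat_revenue = sum(flat_price for wtp in customers if wtp >= flat_price)

# Tiered pricing: each customer buys the priciest tier they can justify.
tiers = [500, 1200, 5000, 20000]

def best_tier(wtp):
    """Return the most expensive tier at or below the customer's budget."""
    affordable = [t for t in tiers if t <= wtp]
    return max(affordable) if affordable else 0

tiered_revenue = sum(best_tier(wtp) for wtp in customers)

print(flat_revenue)    # 3600: only three customers buy at the flat price
print(tiered_revenue)  # 27200: tiering captures most of the surplus
```

With so few transactions, the difference between those two totals can be the difference between a viable product and a dead one, which is why delineating the “lite” and “enterprise” extremes early matters so much.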

Too cool for school: a retrospective on dropping out of university

The homo computatis college drop-out was a cliché whose establishment in folklore predated my departure from the University of Georgia by at least two decades. Nevertheless, I also joined the club. In the first semester of the 2006-2007 academic year, after dabbling half-heartedly in coursework for two years as a philosophy major, I threw open the gates and exiled myself into the great beyond.

Although my actions conformed to a known stereotype, I still feel I was something of an early adopter, virtually alone in my group of peers. I came to count many in my acquaintance who never pursued post-secondary education in the first place, or who floated in and out of community and technical colleges amidst working and financial struggles, but knew of vanishingly few, especially at that time, who straight-up dropped out of a four-year institution–that is, academic majors who unregistered abruptly from their courses mid-semester and skipped town with no intention of returning. That sort of thing seemed to be the province of Larry Page, Sergey Brin and Bill Gates–definitely outliers. And unlike them, I wasn’t at the helm of a world-changing startup on a clear trajectory into the multi-billion dollar stratosphere, so I couldn’t point to an overwhelming and self-evident justification.

A lot has changed since then. By all appearances, we seem to be passing through a watershed moment where existential questions about the value and purpose of college and traditional higher education in America are emerging onto the mass level among Millennials and Generation Z-ers. This discussion has been spurred on by the housing crisis, a growing tower of debilitating student loan debt, tuition rises, and mounting questions about the future of employment in the developed world, especially the ways in which opportunity has become more and less democratic in the context of technological shifts and globalisation. There’s also a growing interest in open courseware and novel forms of technology-enabled correspondence learning–though, I should say, I don’t share in the TEDdies’ conflation of the Khan Academy with higher education. Still, nearly a decade has passed since I made my fateful decision to forsake the path of higher learning, so it seems like a good time to reflect on where it’s taken me and whether it was a good call.

Visiting Western Michigan University in Kalamazoo for a conference, somewhere around 4th or 5th grade.

Some aspects of the progression of events will sound familiar to many in IT. I grew up mostly around university environments, and computer programming figured dominantly among my childhood interests. It was an interest easily encouraged by proximity to lots of expensive computer equipment, good Internet connectivity and access to sympathetic graduate student mentors. I had been playing with computers and the Internet since age 8 or so, and wrote my first program at 9. I spent much of my adolescent and teenage years nurturing this hobby, having a strong interest in both software engineering and operational-infrastructural concerns. As most people in IT know, ours is a profession that offers unrivalled self-teaching opportunities, enabled by a highly Googleable open-source software ecosystem and collaborative social dynamic. That’s why so many programmers like me are self-taught.

I also had various other intellectual interests, however, and had no plans to make a tech career. In fact, for most of my life prior to age eighteen or so, I wasn’t even particularly aware that I had a marketable skill set. The desire to get into computing as a child was utterly innocent and non-ulterior, as arbitrary as some kids’ choice to take up the cello or oil painting. I entered UGA in 2004 as a political science major and shortly switched to philosophy, with vague ideas of law school in the future.

It’s also worth remarking that I came from a humanities-oriented academic family and cultural background; my parents were professors of philosophy at a top-tier Soviet university, and my father is a professor of the same at UGA. My extended family background includes a venerable dynasty of musicians, including my great-grandfather, Mikhail Tavrizian, the conductor of the Yerevan State Opera and a National People’s Artist of the USSR, as well as his Russian wife Rassudana, a renowned ballerina in the Bolshoi Theatre. My late grandmother was a philologist and a member of the philosophy faculty of the Russian Academy of Sciences. When my parents emigrated to the US in 1992 (I was six years old), they redid graduate school entirely at the University of Notre Dame, which is where my primary school years were spent. My social and cultural life at that time played out in housing for married graduate students with children, where I ran around with friends from dozens of different nationalities.

All this to say, I was on a strong implicit academic trajectory as a function of my upbringing, a trajectory rooted in the humanities, not hard sciences. In fact, my parents were not especially supportive of my computing hobbies. As they saw it, I think, spending my days holed up in my room modem-ing away interfered with schoolwork and did little to promote the kind of cultural development consonant with the mantle I was meant to inherit.

Nevertheless, I began working when I was eighteen (my parents did not let me work prior to that–Russian parents do not, as a rule, share the American faith in the virtues of part-time work for teenagers or students). My first job was technical support at a small Internet Service Provider in our university town of Athens, GA, at first very much part-time and reasonably complementary to college. I earned a 4.0 GPA in the first semester of my freshman year.

However, I was ambitious and precocious, decidedly more interested in work than school. Within a year, after some job-hopping (which included a stint in IT and/or warehouse labour at a Chinese importer of home and garden products–how’s that for division of labour?), I returned to the ISP at twice the pay rate and assumed the role of systems administrator. I was learning a great deal about real-world business and technology operations, getting my hands on industrial infrastructure and technologies, and rapidly assimilating practical knowledge. I had been around theoretical publications and underpaid graduate student assistants hunkered in dimly lit carrels my whole life, but I had never had to learn the basics of business and how to communicate with all kinds of everyday people on the fly. Although the cultural clash was sometimes frustrating, the novelty of learning practical skills and how to run a real-world operation was intoxicating. It occasionally led to an outsized moral superiority complex, too, as I became conscious of the fact that at age 19, I could run circles around most of the job candidates being interviewed, some of whom had master’s degrees. Clearly, I was doing something right!


Fielding customer calls at the ISP. Clearly, I’m thrilled to be doing customer support despite being a sysadmin.

From that point, my career rapidly evolved in a direction not compatible with school. Formally, I was still part-time and hourly, but it was effectively closer to a full-time gig, and I rapidly took on serious responsibilities that affected service and customers. Small as the company was, in retrospect, I was a senior member of its technical staff and a sought-after authority by people ten to twenty years older. My commitment to school, already a decidedly secondary priority, rapidly deteriorated. I had no semblance of campus life immersion or student social experience. From my sophomore year onward, I was effectively a drop-in commuter, leaving in the middle of the day to go to a class here and a class there, then hurrying back to the office. I neither had time for studying nor made the time. My GPA reflected that. I didn’t care. School droned on tediously; meanwhile, T1 circuits were down and I was busy being somebody!

As my interests in telecom, networking, telephony and VoIP deepened, it became clear that the next logical career step for me was to move to Atlanta; Athens is a small town whose economy would not have supported such niche specialisation. Toward the end of the second semester of my sophomore year, I began looking for jobs in Atlanta. I unconsciously avoided the question of what that meant for my university education; I was simply too engrossed in work and captivated by career advancement. In the first semester of my junior year, by which point my effort at university had deteriorated to decidedly token and symbolic attendance, I finally found a job in Alpharetta (a suburb of Atlanta) at a voice applications provider. In October 2006, at the age of twenty, I announced that I was quitting university and moving to the “big city”.

My parents reacted better than I thought they would. I halfway expected them to disown me. However, in hindsight, I think they were pragmatic enough to have long realised where things were headed. It’s hard for me to say, even now, to what degree they were disappointed or proud. I don’t know if they themselves know. What was most clear at that moment was that I am who I am, and will do as I do, and there’s no stopping me.

That’s not to say that East and West didn’t collide. I remember having a conversation that went something like:

– “But what happens if you get fired in a month?”

– “Well, I suppose that’s possible, but if one performs well, it’s generally unlikely.”

– “But is there any guarantee that you won’t lose your job?”

Guarantee? That was definitely not a concept to which I was habituated in my private sector existence.

– “There are never any guarantees. But my skills are quite portable; if such a thing happened, I could find another job.”

– “It just seems very uncertain.”

– “That’s how it goes in the private sector.”

All the same, it was clear enough that, for all the problems this decision might cause me, I certainly wasn’t going to starve. Even in Athens, I was an exceptionally well-remunerated twenty-year-old. My first salary in Atlanta was twice that. Moreover, it was clear that IT was an exceptionally democratic and meritocratic space; if one had the skills, one got the job. My extensive interviews in Atlanta drove home the point that potential employers did not care about my formal higher education credentials by that point in my career development. The “education” section of my résumé had long since been deleted, replaced by a highly specific employment history and a lengthy repertoire of concrete, demonstrable skills and domain knowledge with software and hardware platforms, programming languages, and so on. The résumés I was sending out to Atlanta companies at age twenty proffered deep and nontrivial experience with IP, firewalls, routers, switches, BGP, OSPF, Cisco IOS, Perl, Bash, C, PHP, TDM, ADSL aggregation, workflow management systems, domain controllers, intrusion detection systems–what didn’t I do at that ISP? There aren’t many companies that would have let someone with my age and experience level touch production installations of all those technologies. I was bright-eyed, bushy-tailed, and soaked it all up like a sponge. And when asked for substantiation by potential employers, I sold it.

Those of you in IT know how this works: formal education is used by employers as signalling about a candidate only in the absence of information about concrete experience or skills. All other things being equal, given two green, inexperienced candidates among whom one has a university diploma and one doesn’t, employers will choose the one who finished university, as it’s a proxy for a certain minimal level of intelligence and ability to complete a non-trivial multi-year endeavour. When concrete experience and skills are present, however, the educational credentials fly out the window for most corporate engineering and operations jobs, and the more one’s career evolves, the less relevant early-stage credentials become. Moreover, there are innumerable people in mainstream IT whose university degrees were not in computer science or affiliated subject matter, but rather in a specialty like literature, history or ecology.

My next three jobs were in Atlanta, within the space of about a year and a half. I averaged a job change every six months or so, often considerably increasing my income in the process. By the time I was barely twenty-two, I had worked for a voice applications provider, a major local CLEC and data centre operator, and an online mortgage lender.

Of course, certain jobs were off-limits. I couldn’t do research work that required a formal computer science background, nor take jobs in government or certain other large institutions who remained sticklers for credentials. I lacked the formal mathematics and electrical engineering background necessary for low-level hardware design work. It’s also quite likely that if I had tried to climb the corporate ladder into middle to upper management, I would at some point, later in life, bump into a ceiling for degree-less drop-outs. When one gets high enough, it becomes comme il faut to have an alma mater in one’s biography, even if it’s just an airy-fairy management degree from a for-profit correspondence course mill. The only way I know of to get around that is to have had a famous and inscrutable business success (i.e. an acquisition) to overshadow it. Click on “Management Team” under the “About” section of some tech companies’ web sites to get the drift. Appearances are important at that level.

I didn’t stick around long enough to figure out where exactly the limits are (although I didn’t get the impression there were many, as long as one could demonstrably do the work). In early 2008, I was abruptly fired after some political clashes. Also, they don’t take kindly to the habitual morning tardiness of “programmer’s hours” in financial services. Instead of looking for my seventh job in four years, I decided to go out on my own. I had been itching to do it for quite some time, but didn’t quite have the wherewithal to walk away from the steady paycheck. Getting fired has a way of forcing that issue.

And so, on a cold, windy January day in 2008, barely twenty-two, I left the building with my belongings in a box, with nearly zero dollars to my name, having wiped out my savings with a down payment on a downtown Atlanta condo. I had no revenue and no customers. A friend and I went to celebrate. I was determined and hit the ground running, though, and that’s how I started Evariste Systems, the VoIP consultancy turned software vendor that I continue to operate today, nearly eight years later.

Because the US does not have a serious vocational education programme and because the focus of the “everyone must go to college” narrative of the last few decades is its reputed payoff in the job market (or, more accurately, the threat of flipping burgers for the rest of one’s life), the first and most pertinent question on American students’ minds would be: do I feel that I have suffered professionally because I did not finish my degree?

I didn’t think I would then, and I still don’t think so now. Notwithstanding the above-mentioned limitations, it’s safe to say that I could qualify for almost any mainstream, mid-range IT job I wanted, provided I evolved my skill set in the requisite direction. In that way, IT differs considerably from most white-collar “knowledge work” professions, which are variously guilded (e.g. law, medicine) or have formal disciplinary requirements, whether by the nature of the field (civil engineering) or by custom and inertia (politics, banking). Although politics perverts every profession, IT is still exceptionally meritocratic; by and large, if you can do the job, you’re qualified.

The inextricable connection of modern IT to the history and cultural development of the Internet also moves me to say that it’s still the easiest and most realistic area in which one can receive a complete education through self-teaching. You can learn a lot about almost anything online these days, but the wealth of resources available to the aspiring programmer and computer technologist is unparalleled.

An insert from a stack of mid-1990s PC Magazines discarded by my neighbour, which I coveted tenaciously.

That doesn’t mean I’d generically recommend skipping college to anyone who wants to enter the profession at roughly the same level. I put in, as a teenager, the requisite ten to twenty thousand hours thought to be necessary to achieve fundamental mastery of a highly specialised domain. However, I can’t take all the credit. I was fortunate to have spent my formative years in a university-centric environment, surrounded by expensive computers and highly educated people (and their children), some of whom became lifelong mentors and friends. Although my parents were not especially thrilled with how I spent my free time (or, more often, just time), they had nevertheless raised a child–as most intelligentsia parents do–to be highly inquisitive, open-minded, literate and expressive, with exposure to classical culture and literature. Undergoing emigration to a foreign land, culture and language at the age of six was challenging and stimulating to my developing mind, and the atmosphere in which I ended up on the other side of our transoceanic voyage was nurturing, welcoming and patient with me. The irony is not lost upon me that I essentially–if unwittingly–arbitraged the privilege associated with an academic cultural background into private sector lucre. A lot owes itself to blind luck, just being in the right place at the right time. I could probably even make a persuasive argument that I lucked out because of the particular era of computing technology in which my most aggressive uptake played out.

This unique intersection of fortuitous circumstances leads me to hesitate to say that nobody needs a computer science degree to enter the IT profession. My general sense is that a computer science curriculum would add useful, necessary formalisation and depth to the patchwork of the average self-taught techie, and this certainly holds true for me as well–my understanding of the formal side of machine science is notoriously impoverished, and stepping through the rigorous mathematics and algorithms exercises would have doubtless been beneficial, though I don’t think it would have been especially job-relevant in my particular chosen specialisation.

Still, I’m not committed to any particular verdict. I’m tempted to say to people who ask me this question: “No, you don’t need a degree to work in private industry–but only if you’re really good and somewhat precocious.” Many nerds are. Almost all of the really good programmers I know have programmed and tinkered since childhood. It comes part and parcel, somewhat like in music (as I understand it). In the same vein, I don’t know anyone who wasn’t particularly gifted in IT to begin with but came out that way after a CS degree.

On the other hand, for the median aspiring IT professional, I would speculate that a CS degree remains highly beneficial and perhaps even essential. For some subspecialisations within the profession, it’s strictly necessary. I do wonder, though, whether a lot of folks whose motive in pursuing a CS degree is entirely employment-related wouldn’t be better off entering industry right out of high school. They’d start off in low entry-level positions, but I would wager that after four years of real-world experience, many of them could run circles around their graduating peers, even if the latter do have a more rigorous theoretical background. If practicality and the job market are the primary concern, there are few substitutes for experience. Back at my ISP job, CS bachelors (and even those with master’s degrees) were commonly rejected; they had a diploma, but they couldn’t configure an IP address on a network interface.

Another reason I don’t have a clear answer is because things have changed since then; a decade is geological in IT terms. I’ve also spent twice as much time self-employed by now as I did in the employed world, and niche self-employment disconnects one from the pulse of the mass market. I know what I want in an employee, but I don’t have a finely calibrated sense of what mainstream corporate IT employers want from graduates these days. When I dropped out, Facebook had just turned the corner, and there were no smartphones, no Ruby on Rails, no Amazon EC2, no “cloud orchestration”, no Node.js, no Docker, no Heroku, no Angular, no MongoDB. The world was still wired up with TDM circuits, MPLS was viewed as next-generation, and VoIP was still for relatively early adopters. The point is, I don’t know whether the increasing specialisation at the application layer, and increasing abstraction more generally, has afforded even more economic privilege to concrete experience over broad disciplinary fundamentals, and if so, how much.

All I can firmly say on the professional side is that it seems to have worked out for me. If I were in some way hindered by the lack of a university diploma, I haven’t noticed. I’ve never been asked about it in any employment interview after my student-era “part-time” jobs. For what I wanted to do, dropping out was the right choice professionally, and I would do it again without hesitation. It’s not a point of much controversy for me.

The bigger and more equivocal issue on which I have ruminated as I near my thirtieth birthday is how dropping out has shaped my life outside of career.

I don’t mean so much the mental-spiritual benefits of a purportedly well-rounded liberal education–I don’t think I was in any danger of receiving that at UGA. 80% of my courses were taught by overworked graduate teaching assistants of phenomenally varying pedagogical acumen (a common situation in American universities, especially public ones). The median of teaching quality was not great. And so, I’m not inclined to weep for the path to an examined life cut short. It’s not foreclosed access to the minds of megatonic professorial greats that I bemoan–not for the most part, anyway.

However, moving to Atlanta as a twenty-year-old meant leaving my university town and a university-centric atmosphere. My relatively educated environs were replaced with a cross-section of the general population, and in my professional circles, particularly at that time, I had virtually no peers. My median colleague was at least ten years older, if not twenty, and outside of work, like most people living in a desolate and largely suburban moonscape, I had nobody to relate to. At the time I left, I found value in the novelty of learning to work and communicate with the general public, since I had never had to do it before. I thought our college town was quite “insular”. In retrospect, though, it would not be an exaggeration to say that I robbed myself of an essential peer group, and it’s no accident that the vast majority of my enduring friendships to this day are rooted in Athens, in the university, and in the likeminded student personalities that our small ISP there attracted.


Beautiful morning view from the balcony of my recently foreclosed condo.

As a very serious and ambitious twenty-year-old moving up the career ladder, I also took a disdainful view of the ritualised rite of passage that is the “college social experience” in American folklore. I didn’t think at the time that I was missing out on gratuitous partying, drinking, and revelatory self-discovery in the mayhem of dating and sex. If anything, I had a smug, dismissive view of the much-touted oat-sowing and experimentation; I was leapfrogging all that and actually doing something with my life! Maybe. But I unraveled several years later anyway, and went through a brief but reckless and self-destructive phase in my mid-twenties that wrought havoc upon a serious romantic relationship with a mature adult. I also at times neglected serious worldly responsibilities. Being a well-remunerated mid-twenties professional didn’t help: it only amplified gross financial mistakes I made during that time, whereas most people in their twenties are limited in the damage they can do to their life by modest funds. I’m still paying for some of those screw-ups. For example, few twenty-one year olds are equipped to properly weigh the wisdom of purchasing a swanky city condo at the top of a housing bubble, and subsequent developments suggest that I was not an exception. Oh, a word of advice: pay your taxes. Some problems eventually disappear if you ignore them long enough. Taxes work the opposite way.

But in hindsight, a bigger problem is that I also missed out on the contemplative coffee dates, discussion panels, talks and deep, intelligent friendships that accompany student life in the undergraduate and post-graduate setting. While the median undergraduate student may not be exceptionally brilliant, universities do concentrate smart people with thoughtful values densely. It’s possible to find such connections in the undifferentiated chaos of the “real world”, but it’s much harder. I situated myself in a cultural frame which, while it undergirds the economy, is not especially affirmative of the combinations of the intellect. To this day, there is an occasionally cantankerous cultural clash between my wordy priorities and the ruthlessly utilitarian exigencies of smartphone-thumbing business. Get to the point, Alex, because business. Bullet points and “key take-aways” are the beloved kin of e-solutions, but rather estranged from philosophy and deep late-night conversations.

This facet of campus life is less about education itself than about proximity and concentration of communities of intelligent people at a similar stage of life. Because I grew up in universities, I didn’t appreciate what I had until I lost it; I traded that proximity to personal growth opportunities for getting ahead materially and economically, and my social life has been running on fumes since I left, powered largely by the remnants of that halcyon era of work and school.

If leaving the university sphere was a major blow, self-employment was perhaps the final nail. Niche self-employment in my chosen market is a largely solipsistic proposition that rewards hermitism and prolific coding, perfect for an energetic, disciplined introvert. I probably would have done better at it in my teenage years, but it didn’t suit my social nature or changed psychological priorities as an adult. A lot of time, money and sacrifice was emitted as wasted light and heat into the coldness of space as I spun my wheels in vain trying to compensate for this problem without fully understanding it.

The essential problem is much clearer in hindsight: in leaving the university and the employment world, with its coworker lunches and water cooler talk, I had robbed myself of any coherent institutional collective, and with it, robbed myself of the implicit life script that comes with having one. I was a man without any script whatsoever. I rapidly sequestered myself away from the features of civilisation that anchor most people’s social, romantic and intellectual lives, with deleterious consequences for myself. I did not value what I had always taken for granted.

There are upsides to being a heterodox renegade, of course. Such persistent solipsism mixed with viable social skills can make one very fluid and adaptable. I took advantage of the lifestyle flexibility afforded by the “non-geographic” character of my work to travel for a few years, and found unparalleled freedom few will experience in wearing numerous cultural hats. I had the incredible fortune to reconnect with my relatives and my grandmother on another continent. For all its many hardships, self-employment in IT has much to recommend it in the dispensation it affords to write the book of one’s life in an original way.

Be that as it may, the foundations of my inner drive, motivation and aspirations are notoriously ill-suited to the cloistered life of a free-floating hermit, yet I had taken great pains to structure such a life as quickly as possible, and to maximal effect. My reaction to this dissonance was to develop a still-greater penchant for radical and grandiose undertakings, a frequent vacillation between extremes, in an effort to compensate for the gaping holes in my life. The results were not always healthy. While there’s nothing wrong with marching to the beat of one’s own drum, I should have perhaps taken it as a warning sign that as I grew older and made more and more “idiosyncratic” life choices, the crowd of kindred spirits in my life drastically thinned out. “Original” is not necessarily “clever and original”.

In sum, I flew too close to the Sun. When I reflect upon the impact that my leaving the university has had upon my life, I mourn not professional dreams deferred, nor economic hardship wrought, but rather the ill-fated conceit that I could skip over certain stages of a young adult’s personal development. Now that the novelty has worn off and the hangover has set in, I know that it would have been profoundly beneficial to me if they had unfolded not within the fast and loose patchwork I cobbled together, but within a mise en scène that captures the actions, attitudes and values of the academy–my cultural home.

On “communication skills” and pedagogy

Here’s a pet peeve: the widespread belief that any two people, regardless of the disparity in their levels of intellectual development, are destined to fruitfully converse, as long as both exhibit “good communication skills”.

First, acknowledgment where it’s due. It is indeed an important life skill to be able to break down complex ideas and make them accessible to nonspecialists.

“If you can’t explain it simply, you don’t understand it well enough” is a remark on this subject often attributed to Einstein (though, as I gather, apocryphally). The idea is that explaining something simply in ways anyone can understand is the sign of true mastery of a subject, because only deep knowledge allows you to navigate adroitly up and down the levels of abstraction required to do so.

Those of us in the business world also know about the importance of connecting with diverse personalities–customers, managers, coworkers. In the startup economy, there’s a well-known art of the “elevator pitch”, wherein a nontrivial business model can be packaged into ten-second soundbites that can hold a harried investor’s attention–the given being that investors have the attention spans of an ADHD-afflicted chipmunk.

I would also concur with those who have observed that scholarly interests which don’t lend themselves to ready explanation–that are “too complex” for most mortals to fathom–are often the refuge of academic impostors. There are a lot of unscrupulous careerists and political operators in academia, more interested in what is politely termed “prestige” than in advancement of their discipline and of human understanding. These shysters, along with more innocent (but complicit) graduate students caught up in the pressures of the “publish or perish” economy, are the spammers of house journals, conferences and research publications, often hiding behind the shield of “well, you see, it’s really complicated”. Most legitimate scholarly endeavours can be explained quite straightforwardly, if hardly comprehensively. Complexity is an inscrutable fortress and a conversation-stopper in which people more interested in being cited and operating “schools of thought” (of which they are the headmasters, naturally) hide from accountability for scholarly merit.

All this has been polished into the more general meme that productive interaction is simply a question of “learning to communicate”. With the right effort, anyone can communicate usefully with anyone. It doesn’t matter if someone is speaking from a position of education and intelligence to someone bereft of those gifts. Any failure to achieve necessary and sufficient understanding is postulated as a failure of communication skills, perhaps even social graces (e.g. the stereotypical nerdling).

This is an extreme conclusion fraught with peril. We should tread carefully lest we impale ourselves on the hidden obstacles of our boundless cultural enthusiasm for simplification.

First, there’s a critical distinction between clarity and simplicity. It is quite possible to take an idea simple at heart and meander around it circuitously, taking a scenic journey full of extraneous details. Admittedly, technologists such as programmers can be especially bad about this; their explanations are often vacillatory, uncommitted to any particular level of abstraction or scope, and full of tangents about implementation details which fascinate them immeasurably but are fully lost on their audience. I’ve been guilty of that on more than a few occasions.

However, there is an intellectually destructive alchemy by which the virtues of clarity and succinctness become transformed into the requirement of brevity. Not all concepts are easily reducible or lend themselves to pithy sloganeering–not without considerable trade-offs in intellectual honesty. This is a point lost on marketers and political activists alike. It leads to big ideas and grandiose proclamations that trample well-considered, moderate positions, as the latter are thermodynamically outmatched by simplistic reductions. Brandolini’s Law, or the Bullshit Asymmetry Principle, states: “The amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it.” As always, sex sells–a fact of which the TEDdies have a firm grasp, with their peddling of seductive insight porn. As Evgeny Morozov said:

Brevity may be the soul of wit, or of lingerie, but it is not the soul of analysis. The TED ideal of thought is the ideal of the “takeaway”—the shrinkage of thought for people too busy to think.

Second, the idea that “communication skills” are at the heart of all matters has wormed its way into pedagogy rather disturbingly in the form of group work and so-called collaborative models of learning. As the thinking goes, the diversity of a student body is an asset; students have much to learn from each other, not just the lecturer, and encouraging them to do so prepares them for “the real world”, where they’re ostensibly going to be coworkers, police officer and arrestee, and so on.

It reminds me of an episode recounted by my favourite author William Blum in his memoir about the political upheaval of the 1960s:

At one point I enrolled for a class in Spanish at the so-called Free University of Washington, and at the first meeting I was flabbergasted to hear the “teacher” announce that he probably didn’t know much more Spanish than the students. And that’s the way it should be, he informed us–no authoritarian hierarchy. He wanted to learn from us as much as we wanted to learn from him.

The counterculture kids were challenging incumbent hierarchies of authority. I see the same kind of anti-intellectualism recycled today into the putatively more laudable goal of social flattening.

But there’s a limit to the productive fruit of such ventures. It’s best illustrated by an anecdote from my own life.

When I was a freshman at the University of Georgia, I took an obligatory writing and composition course, as part of the infamous “core requirements” (remedial high school) that characterise the first year or two of four-year undergraduate university education in the US. One day in November, our drafts of an expository essay were due, presumably for commentary and feedback on writing mechanics by the English graduate student teaching the course.

Instead, we were paired with a random classmate and told to critique each other’s papers. My partner was an Agriculture major–a farmer’s son, he readily volunteered–who was only at the university because his father insisted that he needed a college degree before taking up his place in the family business. I would estimate his reading level to have been somewhere in the neighbourhood of fifth to eighth grade. I was going to critique his paper, and he was going to critique mine.

Candidly, his paper was largely unintelligible gibberish; it would have taken many improbable megajoules of energy input for it to rise merely to the level of “unpolished”. Were the problems strictly mechanical–paragraphs lacking topic sentences, no discernible thesis in sight, no clear evidentiary relationship between his central claims and the sentences supporting them–I would have earned my keep in a few minutes with a red pen.

The problem was much deeper: his ideas were fundamentally low-quality, benighted in a commonsensically evident kind of way. They were at once trite, obvious, and all but irrelevant to the assigned topic. The few empirical claims made ranged from startling falsehoods to profoundly unfalsifiable arrangements of New Agey words that grated on the ear of someone accustomed to the idea that the purpose of arranging words was to convey meaning. He was hurtling at light speed toward an F. What could I do, rewrite his paper for him? How would I even begin to explain what is wrong with it? There was no room to start small or to evolve toward bigger, more summative problem statements; it was a genuine can of worms: pry it open, and all the worms come out to play at once.

I don’t mean to impugn him as a human being; he just wasn’t suited to the university’s humanities wing, whose business was reputed to be the life of the mind, set in a programme of liberal education. He didn’t know how to argue or how to write — period. He was more of a hero of proletarian labour, as it were, reared in a life script ineffably different to my own, never having crossed paths with me or anyone else in the pampered, effete, bourgeois “knowledge work” setting before, and destined to never cross paths with me in any such setting again. I was utterly paralysed; there just wasn’t much I could do to help him. Plainly, I couldn’t tell him that his thoughts issued forth from a nexus of civilisation unrecognisable to me. There wasn’t much of anything to say, really. I made a few perfunctory remarks and called it a day.

His feedback on my paper, which in turn suffered from organisational and topic transition problems that continue to dog my writing today, was: “Looks good, man!” Verily, his piercing insight knew no bounds. We really learned a lot from each other that day. Along the way, I overheard bits and pieces of a rather erudite peer review by a considerably better-educated classmate. Why couldn’t she review my paper? It would have almost certainly helped. My writing wasn’t stellar, and my devoted readership–I do it all for you, much love!–knows it still isn’t.

Later, I privately enquired of the lecturer as to how I was supposed to condense a lifetime–however hampered by the limitations of my age and experience–of literacy, intellectual curiosity, familial and cultural academic background, semi-decent public education and informal training in polemic and rhetoric into a functional critique that would realistically benefit my beleaguered cohort and help him write a better paper. She replied: “That was the whole point; you need to work on your communication skills.”

In defiance not only of the comme il faut tenets of political correctness, but in fact–in some sense–of the national mythos of our putatively classless and democratic melting pot, I brazenly suggest something that is, I think, considered fairly obvious elsewhere: not all categories of people are destined to communicate deeply or productively.

When such discord inevitably manifests, we should not reflexively blame so-called communication skills or processes. People operate in silos that are sometimes “civilisationally incommensurable”, as it were, and sometimes there just isn’t much to communicate. This is the reality of culture, class and education, and the thinking on collaborative learning and teaching methodologies should incorporate that awareness instead of unavailingly denying it. Matching partners in group activities by sociological and educational extraction clearly presents political challenges in the modern classroom, though. Instead, I would encourage teachers to rediscover–“reimagine” is the appropriate neologism, isn’t it?–the tired, hackneyed premise of leadership by example. At the risk of a little methodological authoritarianism and a few droopy eyelids, perhaps the best way to ensure that students leave your course better than you found them is to focus on their communication with you. They’ll have the rest of their lives to figure out how to transact with each other.