In general, pedantic nitpicking isn’t one tenth as insightful as it might seem. In a pointed critique, it’s best to think holistically about big, central ideas and refute on that level.
It’s harder work, since it requires a close reading to truly grapple with and comprehend the idea one aims to dismantle. It’s also more risky, because there’s more “skin in the game”; big-picture theses necessarily depend on advancing certain generalisations, summations and interpretations of opposing positions, exposing the critic to accusations of having failed to properly understand what they’re assailing.
As with many gambles in nature, however, there’s a risk-reward ratio here. Fortune favours the bold. Engaging at a high level is a lot more efficient, effective and persuasive, but you’ve got to put yourself out there. That is why taking small pot shots at ideas from the sidelines is a more popular sport; guerrilla warfare from a sniper position, ensconced in the foliage, is a lot safer. It’s a common refuge of intellects too feeble or cowardly to wrestle with the main corpus.
However, not everyone whose criticism operates on technicalities is feeble or a coward. Some simply don’t understand that it’s not generally insightful. I’ve commonly encountered this in two areas.
One is in academic disciplines in the humanities, where bright-eyed, bushy-tailed graduate students, often young, uninitiated and not capable of much original thinking in their field, fall prey to great institutional pressures to “publish or perish” into the sizzling carousel of spam cite-able research work. A seductive, intellectually lazy cop-out for them is to bring a little of the quantitative “rigour” of the hard sciences to the reputedly equivocal headspaces of their fields by “troubling” broad claims with pedantic caveats. And so, conference papers and entire dissertations are built on the dubious notion that there’s something meaningful to be illuminated in demonstrating that yes, Virginia, there do exist exceptions to generalisations!
Another bastion of notionally insightful pedantry is techies and, more broadly, technocrats. The kinds of left-brained–if you’ll excuse the pop psychology–people who are stereotypically drawn to mathematics and computer science often seem to have an axe to grind with the humanities and “soft” sciences. Many programmers get into software in part because they are powerfully drawn, at an aesthetic and psychological level, to disciplines with finite, deterministic, and self-consistent systems of deductively logical rules. Such systems are not only elegant in their eyes, but appeal strongly to their sense of justice and fairness. Clear, distinct, binary, black-and-white right and wrong answers leave little to whims, tastes, customs and habits.
And so, aggrieved and slighted by the “subjectivity” of some literature, arts, history or philosophy professor in their past–fields for which they did not show exceptional aptitude–they find in pedantry a personal vehicle for restitutive vindication. If they can just show that not all truisms in sociology apply to everyone, they’ll expose the whole field for the colossal tower of bullshit that it is! There are powerful currents of vicious contempt circulating in the technocracy for anything not “user-friendly”–that is, problems which threaten to burden the thinker with consideration of relative meaning and varied interpretation. As most crystal balls into the future of private-sector employment in the developed world prophesy demand for human thought processes and skills that are complementary to machine intelligence, I do worry about what this means for the already imperiled state of the liberal arts.
Anyway, there are certainly some claims where small details matter or whose foundations can be invalidated by singular exceptions. Such quality control is table stakes for the design of satellite guidance systems and aircraft engines, for the teaching of open-heart surgery, and for the certification of medications for sale. We wouldn’t have it any other way.
Still, before being “that guy”–ever quick on the draw with the “actually…”–it’s wise to ask oneself in an honest and open-minded way: if I pull this block out, will the whole tower collapse? Or have I spun my wheels, emitting a lot of heat and light into the cold emptiness of space, in committing a disposable, utterly forgettable act of superficial vandalism?
Big Silicon Valley capital is largely interested in consumer-facing, high-growth “world domination” plays, in anything that has the potential to become a household name. Naturally, the Valley’s startup-grooming tributaries (e.g. Y Combinator) aim to position tech entrepreneurs at an angle complementary to those criteria, since that’s how they make money.
A lot of the cultivated folklore and intellectual work-product of the cultural leaders of this space, as epitomised by the writing of Paul Graham, speaks the language of upward exponential curves, critical mass, and gargantuan user bases–all things Valley web economy VCs like. As PG says here and elsewhere, startups are, most essentially of all, about going big by building something lots of people want.
But what if you’re like us, making a foray into the “boring” world of intra-industrial business software or a product that is specialised deeply into a vertical-specific niche?
I don’t mean a “lifestyle business”, nor do I specifically mean the long-run, sustainable, bootstrapped approach for the advocacy of which 37Signals and DHH have distinguished themselves (although nonparticipation in mainstream tech investment is implied); 37Signals still have products that millions of people want. I’m talking about building something relatively expensive that almost nobody wants.
Think of some Byzantine water pump control mechanism for a sewage treatment plant, something you can elevator-pitch in two seconds to a very select audience but that you couldn’t easily explain in ten minutes to anyone else. We build something like that for the VoIP telephony industry.
We’ve been around for eight years, we’re tiny, and we’ve morphed into a product company largely out of a consulting heritage. We’re clearly not a startup as YC and Valley “VC-istan” would have it, and nobody would fund us. We’re in no danger whatsoever of a “rocket-ship trajectory”, we don’t leverage “network effects”, we’re not “going viral”, and our customer acquisition cost is pretty high. While we too have benefited from the structural decline in the cost of starting a technology company (e.g. cloud servers), we’re largely unable to benefit from some of the biggest shifts toward a lower cost basis and lower barriers to entry: Google web ads don’t do us much good because that’s not how our typical customers find us, and we don’t have anything to put in a mobile app store.
It’s mostly old-fashioned relationship building, personal brand, conferences and trade shows for us. It’s boring, it’s expensive, it’s slow. Imagine an SEO-tweaked conversion funnel with low-touch onboarding. We’ve got whatever the opposite of that is: a trickle of leads that, when we get lucky, spool out into long, drawn-out, consultative sales cycles measured in months or even years. It’s not the stuff of compelling pitch decks.
And so, the question I’ve been pondering for a long time is: what is the size and location of the cultural and methodological intersection between the Hacker News flavour of startup lore and our kind of business model? Do we still have something to learn and apply? Can any useful takeaways be mined from the corpus of essays and “thought leadership” that PG and YC have provided? Are there useful entrepreneurial insights to be captured from Hacker News?
I think the answer is yes. One must simply be careful to cherry-pick the right bits. Here are a few thoughts:
Making Something People Want
In the business software realm, this needs to be recast with a sharp emphasis on “solve problems people have”. The advice to solve one’s own problems, or at least problems directly relatable to one’s industry experience, is, at its core, essential.
I first had an idea for something like our present-day product in 2006. At that time, I was young, inexperienced and new to telecom, and conceived of the problem space in very a priori terms. Had I moved forward and tried to take the concept to market at that stage, it would have been a spectacular flop because it did not provide institutionally acceptable solutions to actually-existing problems.
It’s possible that, had I been in a financial position to commit to it full time and avoided being bogged down in consulting for several years, I would have realised a speed advantage from being able to “fail fast”, “iterate” and/or “pivot” in response to the negative feedback for my initial concept. However, in a small industry where personal brand and reputation play an important role, I’m not sure the speed advantage would have outweighed the personal brand deterioration resulting from putting something out there that simply doesn’t work.
Thus, I’m moved to say that the emphasis on empiricism and really understanding one’s target user is triply important in business software. In particular, I would add that non-trivial domain expertise in the user’s industry is probably a must, unless you’re building something that is, at heart, rather broad and generic.
There’s a particularly ludicrous current of wishful thinking out there that presupposes “customer discovery” to be a free-floating skill set unrelated to any particular industry or sphere of expertise. There probably are some backward industries where daily workflows consist of mostly disposable paper pushing, and where an application with some commodity CRUD screens could make a meaningful dent. However, in our industry, having a JS/CSS-savvy “UX quarterback” shadow “everyday users” for a few days to “really discover their pain points” would be a hopelessly naive waste of time.
There are no shortcuts here: the only way to know a lot about telecom is to work in telecom. To make a good telecom product, you have to be deeply conversant with the history of voice and data, the supply chains, the acronyms, and the regulation–oh God, the regulation. This probably holds true of most industries you could build complex solutions for. If you think you can walk into property-casualty reinsurance and “disrupt” the place with a month-long Ruby on Rails bender (how very Agile of you), the vertical niche business market is not for you.
In this light, the market validation provided by an organic, consulting-driven funding strategy–which PG is generally sour on–is highly valuable. It might be worth building your product that way even if you could go the fundraising route instead. You’ll learn a lot. I doubt our product would have any market traction if we had tried to leapfrog the several years of hard lessons learned about our target market from our otherwise tiresome and financially stressful consulting slog.
Making Something Users Love
Having said all that, if you read a lot of PG and Sam Altman, it’s easy to become discouraged by repeated talk of the importance of building something users love. Good products simply roll off the shelves, like those round-ish late-1990s iMacs. Marketing is just an optimisation for more eyeballs; the product fundamentally sells itself on a powerful wave of early-adopter enthusiasm, and if your product isn’t grabbing most people who come across it, the implication is that it’s just not a good product.
There’s a fine line there. You do have to know when to call it quits if nobody’s buying what you’re selling. It’s possible to sell at least one unit of something to someone, somewhere, given enough time and effort; that alone doesn’t imply a good commercial prospectus. You should have some way of figuring out whether your product is really taking hold.
However, sales in this area is hard, and you should expect that; the idea that your potential customers are going to just want or love what you’re selling in a self-evident kind of way is complicated by, well, the complexity of what’s being sold. Don’t be fooled into thinking that your product isn’t good just because every sale feels like bruising hand-to-hand combat. You’re fighting against institutional inertia, the customs and habits of the boys at the country club, sclerotic management bureaucracy, combative purchasing departments, and the marketing stranglehold of big-brand competitors on risk-averse management. In our case, we’re selling something that requires the customer to remove core infrastructure in a growing, revenue-generating, and intensely downtime-averse business and replace it with our own. It’s easy to get everyone to agree it’s ultimately a good idea, but it takes political wherewithal. “We’re really in a hurry to do that,” said no executive decision-maker ever.
Many of our most loyal customers of today didn’t know they had a commercial problem our product could solve. The function of marketing becomes very important here: as a general rule for the world of capital goods, customers have to be educated. The “bounce rate” on eyeballs alone is going to be close to 100%.
The popular understanding of what it means to make something users love is often tied up in good user experience and front-end mechanics. This is a siren song. In the world of capital goods and the complex solution sales that go with them, the most important criterion is, “Does it make or save us money?”
That’s not to say a good UI isn’t a competitive advantage, but don’t sweat it. Users will put up with a pretty bad UI on a machine that prints money for their company. More importantly, a good UI won’t do a damn thing for a product that doesn’t have positive bottom-line impact or significant business-level differentiation.
It is indeed critical to hire the right early-stage employees, and hiring mistakes in the early stage will break your company even with all the good products and marketing tailwinds in the world. Much of what PG and the YC crowd have to say on the importance of building a great team is directly applicable.
Early-stage hiring is particularly complicated in niche verticals because your customers are buying a vendor service relationship as much as they are buying a software system. Your staff will be engaged closely with customers who expect credibility and expertise from your people, and it’s the warm fuzzies from that collaboration that often close the sale.
That often means that run-of-the-mill skill sets found in the general IT population you can hire off the street simply won’t do. Entry-level people are an especially pernicious hiring choice here, since there’s so much more background knowledge to impart. Your early-stage employees will need to be both technical and essentially fluent in the industry domain to which you are selling, which greatly reduces your hiring pool and makes candidates even more expensive.
Accordingly, your success or failure depends on your ability to get people with some vertical-specific industry background in the door. You should expect to make even more compromises here than is typical in Valley web startup land as far as equity grants and so on. You’ll want to be mindful of the regions and labour markets around the country that concentrate IT people with particular domain knowledge above and beyond table stakes technical skills: if you’re doing something in energy, get comfortable with Houston, and if you’re doing fintech, think Chicago or New York.
There really are no pre-revenue business models in enterprise software; at least, there shouldn’t be. All talk about building a “critical mass” user base and figuring out how to monetise it or render it profitable later is irrelevant and should be summarily ignored. Your first customer should be paying.
There are, of course, some strategic API and platform plays out there whose primary purpose is to set up for an acquisition. These often don’t have paying users or don’t generate a lot of revenue. However, as a general rule, acquisitions in the business market are more rationalistic and quantitative, so bringing more revenue to the table redounds to the benefit of your valuation and bargaining power. This is a bit different from the voodoo valuation process of mass-market startups, where irrational investor exuberance can sometimes be maximised by removing the constraint of concrete, earth-bound revenue and encouraging the acquirer to “really dream big”. All this to say: book revenue. You’re not going to get more for less by letting a freemium cat out of the bag and into the open market. More revenue is always better.
Otherwise, in a market with a low volume of high-magnitude transactions, every customer counts. Arithmetically, price segmentation of some description is usually required to make a product economically viable. One of your biggest preoccupations early on should be to delineate the needs of your “lite” users at one extreme, versus your “enterprise” or “platinum plan” users on the other, and to tier your product accordingly. Joel Spolsky’s classic Camels and Rubber Duckies, dated though it may be, still comes highly recommended as one of the best introductions I’ve seen on this subject.
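To make the arithmetic concrete, here is a toy sketch in Python. The willingness-to-pay figures and the three tier prices are invented purely for illustration; the point is only that in a low-volume market with widely varying customer value, a single price point leaves revenue on the table that tiers can capture.

    # Toy illustration of price segmentation; every figure here is hypothetical.
    # Assumed annual willingness to pay of ten prospective customers:
    willingness_to_pay = [2000, 3000, 3500, 5000, 8000, 12000, 20000, 35000, 60000, 90000]

    def revenue_single_price(price, wtp):
        # Everyone who can justify the price buys; everyone else walks away.
        return sum(price for w in wtp if w >= price)

    def revenue_tiered(tiers, wtp):
        # Each buyer takes the most expensive tier they can still justify.
        return sum(max(t for t in tiers if t <= w) for w in wtp if w >= min(tiers))

    best_single = max(revenue_single_price(p, willingness_to_pay) for p in willingness_to_pay)
    tiered = revenue_tiered([3000, 12000, 60000], willingness_to_pay)  # "lite", "pro", "platinum"

    print(f"Best single price captures:  ${best_single:,}")   # $120,000
    print(f"Three-tier pricing captures: ${tiered:,}")        # $168,000

Even in this crude model, tiering captures substantially more of the value that actually exists in the market, which is Spolsky’s central point.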
You come to pick up my sofa — still in original packaging, never opened, offered as-is on Craigslist. Naturally, your first thought is: “What’s it cost?”
“Don’t worry about it, man! Take it home, then we’ll run the numbers later today.”
“Very well”, you say, figuring it couldn’t be more than what you’d reasonably expect a really basic, small-ish new sofa with a cheap fabric cover to cost, less depreciation once it left the showroom — maybe $500, maybe even $700? You’ve got about $1200 in your checking account, so you’re good to go. Later that day, I call you with the total: $13,860.
“What the hell?! Who pays $13,860 for a sofa?”
“Oh, no worries,” I tell you. “I offer a generous 85% discount for same-day on-the-spot payment. For today’s charges of $2079, are you paying with Visa or MasterCard?”
Your face is awash with total astonishment. “Who the hell pays $2079 for a sofa?”
“Well, that’s a very generous prompt payment discount of 85%.”
“A discount off of $13,860? Where the hell did you come up with that?”
“Um, that’s the Standard Sofa Charge, sir.”
“Standard? Based on what? Do you know of a single person, business or other entity who would pay anywhere near $13,860 for a sofa?”
“$13,860 is the Standard Sofa Charge for anyone,” I say, handing you a short itemised bill with a line item “SOFA”, quantity 1, unit price 13860.00.
“Come on, you and I both know nobody pays $13,860 for a sofa!”
“Well… will today’s charges of $2079 be on your Visa or MasterCard?”
Your first inclination is to return my sofa, but it’s a little late for that. Your kids are in love with it, you’ve already spilled Corn Flakes on it — it’s yours now. It was your mistake not to have agreed upon a firm price before hauling it away, anyway. With a heavy sigh of resignation, you accept your fate and the cross you must bear for your expensive mistake:
“Okay, fine. I’ll pay $2079, although — just for the record — that’s ridiculous. I can get a tiny little two-seat fabric-covered sofa like this in any other developed country for less than half of that, new, right from the furniture store. It’s going to take me a week or two to come up with two grand, though. I’m going to need to borrow some from my friends and family. Can you give me about 30 days? I’ll pay you $500 now so you know I’m good for it.”
“Sure thing,” I answer helpfully. “Actually, you can pay off the remainder over four months, in four easy payments of $741.25.”
“WHAT? That’s a total of $3465.”
“In this case, I can only extend a 75% discount.”
“So, you’re asking me to pay $3465 instead of $2079 now. That’s an extra $1386.”
“That’s correct, sir. To extend terms you must pay an additional 10%.”
“10%? What are you talking about? That’s an extra 66% on top of the $2079 I’d otherwise have to pay!”
“Well, no, you’re only paying 25% of the Standard Sofa Charge.”
“What the hell are you talking about? I’ve seen your other Craigslist ads, and I have a really hard time believing anyone would pay you more than about $800 for any of your other entry-level sofas. I could see a really good one going for $1000, but that’s tops!”
“Hmm, well, actually, many people pay the Standard Sofa Charge. You’re getting a hell of a deal.”
“Can you show me at least one person who has paid anywhere near the Standard Sofa Charge? Even half of the Standard Sofa Charge?”
“I am not at liberty to disclose that information, I’m sorry.”
“Why am I being asked to negotiate against this bullshit, made-up Standard Sofa Charge figure, when there is not an iota of evidence that it, or anything remotely close to it, clears the market? If you can show me at least one person or organisation that has paid anywhere near the Standard Sofa Charge for a sofa of this class…”
“Well,” I say smugly, “obviously, some of my repeat buyers have negotiated volume discounts due to their market-moving power – resellers, secondhand furniture stores and so on.”
“I’ll bet! And I have a pretty hard time believing any of them are paying more than $2000 for even the best, most premium fabric starter sofa. $2000 will buy me a leather L-shaped sectional.”
“I can’t disclose that. But you’d be surprised how close we get to the Standard Sofa Charge. If you worked for Foolhardy O’Toole’s Pre-Owned, you’d get to see the bills and get an idea of how much they pay.”
“Look,” you say, “I don’t really care how much O’Toole’s pays, although may a vengeful God smite me where I stand for I know it sure as hell isn’t $13,860, or even $3465. $2079 is already beyond the pale, but I figure what’s done is done. However, there is simply no way I can afford $3465! I came here looking for a Craigslist deal!”
I clear my throat and shift in my seat slightly uncomfortably. “Well, we do offer financial aid for those who may have trouble paying. We may be able to get you a discount of even more than 85%.”
“What the hell? Why didn’t you say that before?”
“Here’s a four page application. You’ll need to disclose your gross income, itemise your expenses, and provide three months of bank statements and last year’s tax returns.”
“That’s a lot of work”, you figure. “It sounds like the kind of work someone with a very low income would find worthwhile to do.”
“Mhm…” I mutter.
“… it almost sounds like a price segmentation and revenue optimisation strategy, designed to extract the highest price possible, in the most opaque and duplicitous manner imaginable, for one of the most price-inelastic goods or services in existence, from every patient, according to a scrupulously calibrated sense of their maximum ability to pay…..”
“Please return the last three pages of the form to me by e-mail or fax at your earliest convenience.”
By now, you’re visibly frustrated:
“Look man, I just have a cash flow issue because I was not even remotely prepared to have to fork over two grand for the sofa in one day. I can get you the full $2079 in 30 days. I don’t need a multi-month installment plan or what amounts to a 66% financing charge.”
“You’d be surprised at the range of incomes of qualified applicants.”
“But there’s one thing I still don’t understand. You and I both know that the Standard Sofa Charge figure is bullshit, that it has absolutely zero attachment to market reality, and that nobody pays anywhere near the Standard Sofa Charge. So, why is the Standard Sofa Charge even a thing? What on God’s earth is the least bit ‘Standard’ about it? And why are all of your ‘discounts’ with reference to it as a negotiation anchor when you and I know damn well that you can’t sell a sofa for $13,860?”
“That’s the Standard Sofa Charge, sir.”
“And why in the hell did you come at me with the financial aid offer only once I threw enough of a shit-fit?”
“Well…” I chuckle nervously. “That’s not exac–I gotta get to a meeting…” I trail off as I beeline for the door.
My son Roman was born on Thursday via C-section. We had a four-day hospital stay afterward and just got home. We are uninsured cash payers, and this is more or less the conversation I had with the hospital’s finance office today. The only difference is that the Standard C-Section Charge was $24,790¹, the cash payer discount was 75% (to $6197), and the giant middle finger installment plan came with only a 65% discount ($8676 — a 40% premium, or an extra $2479, over the cash price).
Welcome to the only developed country where this is possible. You can’t make this stuff up.
¹ This $25k is just the hospital fee for the four-day stay, nursing, and use of the operating room and assistance; it does not include several thousand dollars in additional obstetrician-gynaecologist and anaesthesiologist fees for the actual operation.
The homo computatis college drop-out was a cliché whose establishment in folklore predated my departure from the University of Georgia by at least two decades. Nevertheless, I also joined the club. In the first semester of the 2006-2007 academic year, after dabbling half-heartedly in coursework for two years as a philosophy major, I threw open the gates and exiled myself into the great beyond.
Although my actions conformed to a known stereotype, I still feel I was something of an early adopter, virtually alone in my group of peers. I came to count many in my acquaintance who never pursued post-secondary education in the first place, or who floated in and out of community and technical colleges amidst working and financial struggles, but knew of vanishingly few, especially at that time, who straight-up dropped out of a four-year institution–that is, academic majors who unregistered abruptly from their courses mid-semester and skipped town with no intention of returning. That sort of thing seemed to be the province of Larry Page, Sergey Brin and Bill Gates–definitely outliers. And unlike them, I wasn’t at the helm of a world-changing startup on a clear trajectory into the multi-billion dollar stratosphere, so I couldn’t point to an overwhelming and self-evident justification.
A lot has changed since then. By all appearances, we seem to be passing through a watershed moment where existential questions about the value and purpose of college and traditional higher education in America are emerging onto the mass level among Millennials and Generation Z-ers. This discussion has been spurred on by the ensuing housing crisis, a growing tower of debilitating student loan debt, tuition rises, and mounting questions about the future of employment in the developed world, especially the ways in which opportunity has become more and less democratic in the context of technological shifts and globalisation. There’s also a growing interest in open courseware and novel forms of technology-enabled correspondence learning–though, I should say, I don’t share in the TEDdies’ conflation of the Khan Academy with higher education. Still, nearly a decade has passed since I made my fateful decision to forsake the path of higher learning, so it seems like a good time to reflect on where it’s taken me and whether it was a good call.
Some aspects of the progression of events will sound familiar to many in IT. I grew up mostly around university environments, and computer programming figured dominantly among my childhood interests. It was an interest easily encouraged by proximity to lots of expensive computer equipment, good Internet connectivity and access to sympathetic graduate student mentors. I had been playing with computers and the Internet since age 8 or so, and wrote my first program at 9. I spent much of my adolescent and teenage years nurturing this hobby, having a strong interest in both software engineering and operational-infrastructural concerns. As most people in IT know, ours is a profession that offers unrivaled self-teaching opportunities, enabled by a highly Googleable open-source software ecosystem and collaborative social dynamic. That’s why so many programmers like me are self-taught.
I also had various other intellectual interests, however, and had no plans to make a tech career. In fact, for most of my life prior to age eighteen or so, I wasn’t even particularly aware that I had a marketable skill set. The desire to get into computing as a child was utterly innocent and non-ulterior, as arbitrary as some kids’ choice to take up the cello or oil painting. I entered UGA in 2004 as a political science major and shortly switched to philosophy, with vague ideas of law school in the future.
It’s also worth remarking that I came from a humanities-oriented academic family and cultural background; my parents were professors of philosophy at a top-tier Soviet university, and my father is a professor of the same at UGA. My extended family background includes a venerable dynasty of musicians, including my great-grandfather, Mikhail Tavrizian, the conductor of the Yerevan State Opera and a National People’s Artist of the USSR, as well as his Russian wife Rassudana, a renowned ballerina in the Bolshoi Theatre. My late grandmother was a philologist and a member of the philosophy faculty of the Russian Academy of Sciences. When my parents emigrated to the US in 1992 (I was six years old), they redid graduate school entirely at the University of Notre Dame, which is where my primary school years were spent. My social and cultural life at that time played out in housing for married graduate students with children, where I ran around with friends from dozens of different nationalities.
All this to say, I was on a strong implicit academic trajectory as a function of my upbringing, a trajectory rooted in the humanities, not hard sciences. In fact, my parents were not especially supportive of my computing hobbies. As they saw it, I think, spending my days holed up in my room modem-ing away interfered with schoolwork and was not especially promotional of cultural development consonant with the mantle I was meant to inherit.
Nevertheless, I began working when I was eighteen (my parents did not let me work prior to that–Russian parents do not, as a rule, share the American faith in the virtues of part-time work for teenagers or students). My first job was technical support at a small Internet Service Provider in our university town of Athens, GA, at first very much part-time and reasonably complementary to college. I earned a 4.0 GPA in the first semester of my freshman year.
However, I was ambitious and precocious, decidedly more interested in work than school. Within a year, after some job-hopping (which included a stint in IT and/or warehouse labour at a Chinese importer of home and garden products–how’s that for division of labour?), I returned to the ISP at twice the pay rate and assumed the role of systems administrator. I was learning a great deal about real-world business and technology operations, getting my hands on industrial infrastructure and technologies, and rapidly assimilating practical knowledge. I had been around theoretical publications and underpaid graduate student assistants hunkered in dimly lit carrels my whole life, but I had never had to learn the basics of business and how to communicate with all kinds of everyday people on the fly. Although the cultural clash was sometimes frustrating, the novelty of learning practical skills and how to run a real-world operation was intoxicating. It occasionally led to an outsized moral superiority complex, too, as I became conscious of the fact that at age 19, I could run circles around most of the job candidates being interviewed, some of whom had master’s degrees. Clearly, I was doing something right!
From that point, my career rapidly evolved in a direction not compatible with school. Formally, I was still part-time and hourly, but it was effectively closer to a full-time gig, and I rapidly took on serious responsibilities that affected service and customers. Small as the company was, in retrospect, I was a senior member of its technical staff and an authority sought out by people ten to twenty years older. My commitment to school, already a decidedly secondary priority, rapidly deteriorated. I had no semblance of campus life immersion or student social experience. From my sophomore year onward, I was effectively a drop-in commuter, leaving in the middle of the day to go to a class here and a class there, then hurrying back to the office. I neither had time for studying nor made the time. My GPA reflected that. I didn’t care. School droned on tediously; meanwhile, T1 circuits were down and I was busy being somebody!
As my interests in telecom, networking, telephony and VoIP deepened, it became clear that the next logical career step for me was to move to Atlanta; Athens is a small town whose economy would not have supported such niche specialisation. Toward the end of the second semester of my sophomore year, I began looking for jobs in Atlanta. I unconsciously avoided the question of what that meant for my university education; I was simply too engrossed in work and captivated by career advancement. In the first semester of my junior year, by which point my effort at university had deteriorated to decidedly token and symbolic attendance, I finally found a job in Alpharetta (a suburb of Atlanta) at a voice applications provider. In October 2006, at the age of twenty, I announced that I was quitting university and moving to the “big city”.
My parents reacted better than I thought they would. I halfway expected them to disown me. However, in hindsight, I think they were pragmatic enough to have long realised where things were headed. It’s hard for me to say, even now, to what degree they were disappointed or proud. I don’t know if they themselves know. What was most clear at that moment was that I am who I am, and will do as I do, and there’s no stopping me.
That’s not to say that East and West didn’t collide. I remember having a conversation that went something like:
– “But what happens if you get fired in a month?”
– “Well, I suppose that’s possible, but if one performs well, it’s generally unlikely.”
– “But is there any guarantee that you won’t lose your job?”
Guarantee? That was definitely not a concept to which I was habituated in my private sector existence.
– “There are never any guarantees. But my skills are quite portable; if such a thing happened, I could find another job.”
– “It just seems very uncertain.”
– “That’s how it goes in the private sector.”
All the same, it was clear enough that, for all the problems this decision might cause me, I certainly wasn’t going to starve. Even in Athens, I was an exceptionally well-remunerated twenty-year-old. My first salary in Atlanta was twice that. Moreover, it was clear that IT was an exceptionally democratic and meritocratic space; if one had the skills, one got the job. My extensive interviews in Atlanta drove home the point that potential employers did not care about my formal higher education credentials by that point in my career development. The “education” section from my résumé had long since been deleted, replaced by a highly specific employment history and a lengthy repertoire of concrete, demonstrable skills and domain knowledge with software and hardware platforms, programming languages, and so on. The résumés I was sending out to Atlanta companies at age twenty proffered deep and nontrivial experience with IP, firewalls, routers, switches, BGP, OSPF, Cisco IOS, Perl, Bash, C, PHP, TDM, ADSL aggregation, workflow management systems, domain controllers, intrusion detection systems–what didn’t I do at that ISP? There aren’t many companies that would have let someone with my age and experience level touch production installations of all those technologies. I was bright-eyed, bushy-tailed, and soaked it all up like a sponge. And when asked for substantiation by potential employers, I sold it.
Those of you in IT know how this works: formal education is used by employers as signalling about a candidate only in the absence of information about concrete experience or skills. All other things being equal, given two green, inexperienced candidates among whom one has a university diploma and one doesn’t, employers will choose the one who finished university, as it’s a proxy for a certain minimal level of intelligence and ability to complete a non-trivial multi-year endeavour. When concrete experience and skills are present, however, the educational credentials fly out the window for most corporate engineering and operations jobs, and the more one’s career evolves, the less relevant early-stage credentials become. Moreover, there are innumerable people in mainstream IT whose university degrees were not in computer science or affiliated subject matter, but rather in a specialty like literature, history or ecology.
My next three jobs were in Atlanta, within the space of about a year and a half. I averaged a job change every six months or so, often considerably increasing my income in the process. By the time I was barely twenty-two, I had worked for a voice applications provider, a major local CLEC and data centre operator, and an online mortgage lender.
Of course, certain jobs were off-limits. I couldn’t do research work that required a formal computer science background, nor take jobs in government or certain other large institutions who remained sticklers for credentials. I lacked the formal mathematics and electrical engineering background necessary for low-level hardware design work. It’s also quite likely that if I had tried to climb the corporate ladder into middle to upper management, I would at some point, later in life, bump into a ceiling for degree-less drop-outs. When one gets high enough, it becomes comme il faut to have an alma mater in one’s biography, even if it’s just an airy-fairy management degree from a for-profit correspondence course mill. The only way I know of to get around that is to have had a famous and inscrutable business success (i.e. an acquisition) to overshadow it. Click on “Management Team” under the “About” section of some tech companies’ web sites to get the drift. Appearances are important at that level.
I didn’t stick around long enough to figure out where exactly the limits are (although I didn’t get the impression there were many, as long as one could demonstrably do the work). In early 2008, I was abruptly fired after some political clashes. Also, they don’t take kindly to the habitual morning tardiness of “programmer’s hours” in financial services. Instead of looking for my seventh job in four years, I decided to go out on my own. I had been itching to do it for quite some time, but didn’t quite have the wherewithal to walk away from the steady paycheck. Getting fired has a way of forcing that issue.
And so, on a cold, windy January day in 2008, barely twenty-two, I left the building with my belongings in a box, with nearly zero dollars to my name, having wiped out my savings with a down payment on a downtown Atlanta condo. I had no revenue and no customers. A friend and I went to celebrate. I was determined and hit the ground running, though, and that’s how I started Evariste Systems, the VoIP consultancy turned software vendor that I continue to operate today, nearly eight years later.
Because the US does not have a serious vocational education program and because the focus of the “everyone must go to college” narrative of the last few decades is reputed success in the job market (or, more accurately, the threat of flipping burgers for the rest of one’s life), the first and most pertinent question on American students’ minds would be: do I feel that I have suffered professionally because I did not finish my degree?
I didn’t think I would then, and I still don’t think so now. Notwithstanding the above-mentioned limitations, it’s safe to say that I could qualify for almost any mainstream, mid-range IT job I wanted, provided I evolved my skill set in the requisite direction. In that way, IT differs considerably from most white-collar “knowledge work” professions, which are variously guilded (e.g. law, medicine) or have formal disciplinary requirements, whether by the nature of the field (civil engineering) or by custom and inertia (politics, banking). Although politics perverts every profession, IT is still exceptionally meritocratic; by and large, if you can do the job, you’re qualified.
The inextricable connection of modern IT to the history and cultural development of the Internet also moves me to say that it’s still the easiest and most realistic area in which one can receive a complete education through self-teaching. You can learn a lot about almost anything online these days, but the amount of resources available to the aspiring programmer and computer technologist is especially unparalleled.
That doesn’t mean I’d recommend skipping college generically to anyone who wants to enter the profession at roughly the same level. I put in, as a teenager, the requisite ten to twenty thousand hours thought to be necessary to achieve fundamental mastery of a highly specialised domain. However, I can’t take all the credit. I was fortunate to have spent my formative years in a university-centric environment, surrounded by expensive computers and highly educated people (and their children), some of whom became lifelong mentors and friends. Although my parents were not especially thrilled with how I spent my free time (or, more often, just time), they had nevertheless raised a child–as most intelligentsia parents do–to be highly inquisitive, open-minded, literate and expressive, with exposure to classical culture and literature. Undergoing emigration to a foreign land, culture and language at the age of six was challenging and stimulating to my developing mind, and the atmosphere in which I ended up on the other side of our transoceanic voyage was nurturing, welcoming and patient with me. The irony is not lost upon me that I essentially–if unwittingly–arbitraged the privilege associated with an academic cultural background into private sector lucre. A lot owes itself to blind luck, just being in the right place at the right time. I could probably even make a persuasive argument that I lucked out because of the particular era of computing technology in which my most aggressive uptake played out.
This unique intersection of fortuitous circumstances leads me to hesitate to say that nobody needs a computer science degree to enter the IT profession. My general sense is that a computer science curriculum would add useful, necessary formalisation and depth to the patchwork of the average self-taught techie, and this certainly holds true for me as well–my understanding of the formal side of machine science is notoriously impoverished, and stepping through the rigorous mathematics and algorithms exercises would have doubtless been beneficial, though I don’t think it would have been especially job-relevant in my particular chosen specialisation.
Still, I’m not committed to any particular verdict. I’m tempted to say to people who ask me this question: “No, you don’t need a degree to work in private industry–but only if you’re really good and somewhat precocious.” Many nerds are. Almost all of the really good programmers I know have programmed and tinkered since childhood. It comes part and parcel, somewhat like in music (as I understand it). In the same vein, I don’t know anyone who wasn’t particularly gifted in IT going in but came out that way after a CS degree.
On the other hand, for the median aspiring IT professional, I would speculate that a CS degree remains highly beneficial and perhaps even essential. For some subspecialisations within the profession, it’s strictly necessary. I do wonder, though, whether a lot of folks whose motive in pursuing a CS degree is entirely employment-related wouldn’t be better off entering industry right out of high school. They’d start off in low entry-level positions, but I would wager that after four years of real-world experience, many of them could run circles around their graduating peers, even if the latter do have a more rigorous theoretical background. If practicality and the job market are the primary concern, there are few substitutes for experience. Back at my ISP job, CS bachelors (and even those with master’s degrees) were commonly rejected; they had a diploma, but they couldn’t configure an IP address on a network interface.
Another reason I don’t have a clear answer is that things have changed since then; a decade is a geological age in IT terms. I’ve also spent twice as much time self-employed by now as I did in the employed world, and niche self-employment disconnects one from the pulse of the mass market. I know what I want in an employee, but I don’t have a finely calibrated sense of what mainstream corporate IT employers want from graduates these days. When I dropped out, Facebook had just turned the corner from TheFacebook.com, and there were no smartphones, no Ruby on Rails, no Amazon EC2, no “cloud orchestration”, no Node.js, no Docker, no Heroku, no Angular, no MongoDB. The world was still wired up with TDM circuits, MPLS was viewed as next-generation, and VoIP was still for relatively early adopters. The point is, I don’t know whether the increasing specialisation at the application layer, and increasing abstraction more generally, have afforded even more economic privilege to concrete experience over broad disciplinary fundamentals, and if so, how much.
All I can firmly say on the professional side is that it seems to have worked out for me. If I were in some way hindered by the lack of a university diploma, I haven’t noticed. I’ve never been asked about it in any employment interview after my student-era “part-time” jobs. For what I wanted to do, dropping out was the right choice professionally, and I would do it again without hesitation. It’s not a point of much controversy for me.
The bigger and more equivocal issue on which I have ruminated as I near my thirtieth birthday is how dropping out has shaped my life outside of career.
I don’t mean so much the mental-spiritual benefits of a purportedly well-rounded liberal education–I don’t think I was in any danger of receiving that at UGA. 80% of my courses were taught by overworked graduate teaching assistants of phenomenally varying pedagogical acumen (a common situation in American universities, especially public ones). The median of teaching quality was not great. And so, I’m not inclined to weep for the path to an examined life cut short. It’s not foreclosed access to the minds of megatonic professorial greats that I bemoan–not for the most part, anyway.
However, moving to Atlanta as a twenty year-old meant leaving my university town and a university-centric atmosphere. My relatively educated environs were replaced with a cross-section of the general population, and in my professional circles, particularly at that time, I had virtually no peers. My median colleague was at least ten years older, if not twenty, and outside of work, like most people living in a desolate and largely suburban moonscape, I had nobody to relate to. At the time I left, I found value in the novelty of learning to work and communicate with the general public, since I never had to do it before. I thought our college town was quite “insular”. In retrospect, though, it would not be an exaggeration to say that I robbed myself of an essential peer group, and it’s no accident that the vast majority of my enduring friendships to this day are rooted in Athens, in the university, and in the likeminded student personalities that our small ISP there attracted.
As a very serious and ambitious twenty-year-old moving up the career ladder, I also took a disdainful view of the ritualised rite of passage that is the “college social experience” in American folklore. I didn’t think at the time that I was missing out on gratuitous partying, drinking, and revelatory self-discovery in the mayhem of dating and sex. If anything, I had a smug, dismissive view of the much-touted oat-sowing and experimentation; I was leapfrogging all that and actually doing something with my life! Maybe. But I unraveled several years later, anyway, and went through a brief but reckless and self-destructive phase in my mid-twenties that wrought havoc upon a serious romantic relationship with a mature adult. I also at times neglected serious worldly responsibilities. Being a well-remunerated mid-twenties professional didn’t help: it only amplified gross financial mistakes I made during that time, whereas most people in their twenties are limited in the damage they can do to their life by modest funds. I’m still paying for some of those screw-ups. For example, few twenty-one-year-olds are equipped to properly weigh the wisdom of purchasing a swanky city condo at the top of a housing bubble, and subsequent developments suggest that I was not an exception. Oh, a word of advice: pay your taxes. Some problems eventually disappear if you ignore them long enough. Taxes work the opposite way.
But in hindsight, a bigger problem is that I also missed out on the contemplative coffee dates, discussion panels, talks and deep, intelligent friendships that accompany student life in the undergraduate and post-graduate setting. While the median undergraduate student may not be exceptionally brilliant, universities do concentrate smart people with thoughtful values densely. It’s possible to find such connections in the undifferentiated chaos of the “real world”, but it’s much harder. I situated myself in a cultural frame which, while it undergirds the economy, is not especially affirmative of the combinations of the intellect. To this day, there is an occasionally cantankerous cultural clash between my wordy priorities and the ruthlessly utilitarian exigencies of smartphone-thumbing business. Get to the point, Alex, because business. Bullet points and “key take-aways” are the beloved kin of e-solutions, but rather estranged from philosophy and deep late-night conversations.
This facet of campus life is less about education itself than about proximity and concentration of communities of intelligent people at a similar stage of life. Because I grew up in universities, I didn’t appreciate what I had until I lost it; I traded that proximity to personal growth opportunities for getting ahead materially and economically, and my social life has been running on fumes since I left, powered largely by the remnants of that halcyon era of work and school.
If leaving the university sphere was a major blow, self-employment was perhaps the final nail. Niche self-employment in my chosen market is a largely solipsistic proposition that rewards hermitism and prolific coding, perfect for an energetic, disciplined introvert. I probably would have done better at it in my teenage years, but it didn’t suit my social nature or my changed psychological priorities as an adult. A lot of time, money and sacrifice was emitted as wasted light and heat into the coldness of space as I spun my wheels in vain trying to compensate for this problem without fully understanding it.
The essential problem is much clearer in hindsight: in leaving the university and the employment world, with its coworker lunches and water cooler talk, I had robbed myself of any coherent institutional collective, and with it, robbed myself of the implicit life script that comes with having one. I was a man without any script whatsoever. I rapidly sequestered myself away from the features of civilisation that anchor most people’s social, romantic and intellectual lives, with deleterious consequences for myself. I did not value what I had always taken for granted.
There are upsides to being a heterodox renegade, of course. Such persistent solipsism mixed with viable social skills can make one very fluid and adaptable. I took advantage of the lifestyle flexibility afforded by the “non-geographic” character of my work to travel for a few years, and found unparalleled freedom few will experience in wearing numerous cultural hats. I had the incredible fortune to reconnect with my relatives and my grandmother on another continent. For all its many hardships, self-employment in IT has much to recommend it in the dispensation it affords to write the book of one’s life in an original way.
Be that as it may, the foundations of my inner drive, motivation and aspirations are notoriously ill-suited to the cloistered life of a free-floating hermit, yet I had taken great pains to structure such a life as quickly as possible, and to maximal effect. My reaction to this dissonance was to develop a still-greater penchant for radical and grandiose undertakings, a frequent vacillation between extremes, in an effort to compensate for the gaping holes in my life. The results were not always healthy. While there’s nothing wrong with marching to the beat of one’s own drum, I should have perhaps taken it as a warning sign that as I grew older and made more and more “idiosyncratic” life choices, the crowd of kindred spirits in my life drastically thinned out. “Original” is not necessarily “clever and original”.
In sum, I flew too close to the Sun. When I reflect upon the impact that my leaving the university has had upon my life, I mourn not professional dreams deferred, nor economic hardship wrought, but rather the ill-fated conceit that I could skip over certain stages of a young adult’s personal development. Now that the novelty has worn off and the hangover has set in, I know that it would have been profoundly beneficial to me if they had unfolded not within the fast and loose patchwork I cobbled together, but within a mise en scène that captures the actions, attitudes and values of the academy–my cultural home.
Here’s a pet peeve: the widespread belief that any two people, regardless of the disparity in their levels of intellectual development, are destined to fruitfully converse, as long as both exhibit “good communication skills”.
First, acknowledgment where it’s due. It is indeed an important life skill to be able to break down complex ideas and make them accessible to nonspecialists.
“If you can’t explain it simply, you don’t understand it well enough” is a remark on this subject often attributed to Einstein (though, as I gather, apocryphally). The idea is that explaining something simply in ways anyone can understand is the sign of true mastery of a subject, because only deep knowledge can allow you to adroitly navigate up and down the levels of abstraction required to do so.
Those of us in the business world also know about the importance of connecting with diverse personalities–customers, managers, coworkers. In the startup economy, there’s a well-known art of the “elevator pitch”, wherein a nontrivial business model can be packaged into ten-second soundbites that can hold a harried investor’s attention–the given being that investors have the attention spans of an ADHD-afflicted chipmunk.
I would also concur with those who have observed that scholarly interests which don’t lend themselves to ready explanation–that are “too complex” for most mortals to fathom–are often the refuge of academic impostors. There are a lot of unscrupulous careerists and political operators in academia, more interested in what is politely termed “prestige” than in advancement of their discipline and of human understanding. These shysters, along with more innocent (but complicit) graduate students caught up in the pressures of the “publish or perish” economy, are the spammers of house journals, conferences and research publications, often hiding behind the shield of “well, you see, it’s really complicated”. Most legitimate scholarly endeavours can be explained quite straightforwardly, if hardly comprehensively. Complexity is an inscrutable fortress and a conversation-stopper in which people more interested in being cited and operating “schools of thought” (of which they are the headmasters, naturally) hide from accountability for scholarly merit.
All this has been polished into the more general meme that productive interaction is simply a question of “learning to communicate”. With the right effort, anyone can communicate usefully with anyone. It doesn’t matter if someone is speaking from a position of education and intelligence to someone bereft of those gifts. Any failure to achieve necessary and sufficient understanding is postulated as a failure of communication skills, perhaps even social graces (e.g. the stereotypical nerdling).
This is an extreme conclusion fraught with peril. We should tread carefully lest we impale ourselves on the hidden obstacles of our boundless cultural enthusiasm for simplification.
First, there’s a critical distinction between clarity and simplicity. It is quite possible to take an idea simple at heart and meander around it circuitously, taking a scenic journey full of extraneous details. Admittedly, technologists such as programmers can be especially bad about this; their explanations are often vacillatory, uncommitted to any particular level of abstraction or scope, and full of tangents about implementation details which fascinate them immeasurably but are fully lost on their audience. I’ve been guilty of that on more than a few occasions.
However, there is an intellectually destructive alchemy by which the virtues of clarity and succinctness become transformed into the requirement of brevity. Not all concepts are easily reducible or lend themselves to pithy sloganeering–not without considerable trade-offs in intellectual honesty. This is a point lost on marketers and political activists alike. It leads to big ideas and grandiose proclamations that trample well-considered, moderate positions, as the latter are thermodynamically outmatched by simplistic reductions. Brandolini’s Law, or the Bullshit Asymmetry Principle, states: “The amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it.” As always, sex sells–a fact of which the TEDdies have a firm grasp, with their peddling of seductive insight porn. As Evgeny Morozov said:
Brevity may be the soul of wit, or of lingerie, but it is not the soul of analysis. The TED ideal of thought is the ideal of the “takeaway”—the shrinkage of thought for people too busy to think.
Second, the idea that “communication skills” are at the heart of all matters has wormed its way into pedagogy rather disturbingly in the form of group work and so-called collaborative models of learning. As the thinking goes, the diversity of a student body is an asset; students have much to learn from each other, not just the lecturer, and encouraging them to do so prepares them for “the real world”, where they’re ostensibly going to be coworkers, police officer and arrestee, and so on.
It reminds me of an episode recounted by my favourite author William Blum in his memoir about the political upheaval of the 1960s:
At one point I enrolled for a class in Spanish at the so-called Free University of Washington, and at the first meeting I was flabbergasted to hear the “teacher” announce that he probably didn’t know much more Spanish than the students. And that’s the way it should be, he informed us–no authoritarian hierarchy. He wanted to learn from us as much as we wanted to learn from him.
The counterculture kids were challenging incumbent hierarchies of authority. I see the same kind of anti-intellectualism recycled today into the putatively more laudable goal of social flattening.
But there’s a limit to the productive fruit of such ventures. It’s best illustrated by an anecdote from my own life.
When I was a freshman at the University of Georgia, I took an obligatory writing and composition course, as part of the infamous “core requirements” (remedial high school) that characterise the first year or two of four-year undergraduate university education in the US. One day in November, our drafts of an expository essay were due, presumably for commentary and feedback on writing mechanics by the English graduate student teaching the course.
Instead, we were paired with a random classmate and told to critique each other’s papers. My partner was an Agriculture major–a farmer’s son, he readily volunteered–who was only at the university because his father insisted that he needed a college degree before taking up his place in the family business. I would estimate his reading level to have been somewhere in the neighbourhood of fifth to eighth grade. I was going to critique his paper, and he was going to critique mine.
Candidly, his paper was largely unintelligible gibberish; it would have taken many improbable megajoules of energy input for it to rise merely to the level of “unpolished”. Were the problems strictly mechanical–paragraphs lacking topic sentences, no discernible thesis in sight, no clear evidentiary relationship between his central claims and the sentences supporting them–I would have earned my keep in a few minutes with a red pen.
The problem was much deeper: his ideas were fundamentally low-quality, benighted in a commonsensically evident kind of way. They were at once trite, obvious, and all but irrelevant to the assigned topic. The few empirical claims made ranged from startling falsehoods to profoundly unfalsifiable arrangements of New Agey words that grated on the ear of someone accustomed to the idea that the purpose of arranging words was to convey meaning. He was hurtling at light speed toward an F. What could I do, rewrite his paper for him? How would I even begin to explain what was wrong with it? There was no room to start small or to evolve toward bigger, more summative problem statements; it was a genuine can of worms: pry it open, and all the worms come out to play at once.
I don’t mean to impugn him as a human being; he just wasn’t suited to the university’s humanities wing, whose business was reputed to be the life of the mind, set in a programme of liberal education. He didn’t know how to argue or how to write — period. He was more of a hero of proletarian labour, as it were, reared in a life script ineffably different to my own, never having crossed paths with me or anyone else in the pampered, effete, bourgeois “knowledge work” setting before, and destined to never cross paths with me in any such setting again. I was utterly paralysed; there just wasn’t much I could do to help him. Plainly, I couldn’t tell him that his thoughts issue forth from a nexus of civilisation unrecognisable to me. There wasn’t much of anything to say, really. I made a few perfunctory remarks and called it a day.
His feedback on my paper, which in turn suffered from organisational and topic transition problems that continue to dog my writing today, was: “Looks good, man!” Verily, his piercing insight knew no bounds. We really learned a lot from each other that day. Along the way, I overheard bits and pieces of a rather erudite peer review by a considerably better-educated classmate. Why couldn’t she review my paper? It would have almost certainly helped. My writing wasn’t stellar, and my devoted readership–I do it all for you, much love!–knows it still isn’t.
Later, I privately enquired of the lecturer as to how I was supposed to condense a lifetime–however hampered by the limitations of my age and experience–of literacy, intellectual curiosity, familial and cultural academic background, semi-decent public education and informal training in polemic and rhetoric into a functional critique that would realistically benefit my beleaguered cohort and help him write a better paper. She replied: “That was the whole point; you need to work on your communication skills.”
In defiance not only of the comme il faut tenets of political correctness, but in fact–in some sense–of the national mythos of our putatively classless and democratic melting pot, I brazenly suggest something that is, I think, considered fairly obvious elsewhere: not all categories of people are destined to communicate deeply or productively.
When such discord inevitably manifests, we should not reflexively blame so-called communication skills or processes. People operate in silos that are sometimes “civilisationally incommensurable”, as it were, and sometimes there just isn’t much to communicate. This is the reality of culture, class and education, and the thinking on collaborative learning and teaching methodologies should incorporate that awareness instead of unavailingly denying it. Matching partners in group activities by sociological and educational extraction clearly presents political challenges in the modern classroom, though. Instead, I would encourage teachers to rediscover–“reimagine” is the appropriate neologism, isn’t it?–the tired, hackneyed premise of leadership by example. At the risk of a little methodological authoritarianism and a few droopy eyelids, perhaps the best way to ensure that students leave your course better than you found them is to focus on their communication with you. They’ll have the rest of their lives to figure out how to transact with each other.
In tech, we’re always talking about workplace ergonomics, the fine points of Aeron chairs and standing desks, the wretchedness of open-plan spaces and cubicle farms versus private offices, how many monitors to have, and so on. Usually, I’m an eager participant, very much attuned to the idea that comfort and pleasant aesthetics are essential to output, motivation, and sustained concentration in high-focus, specialised creative labour.
But sometimes it does help to get a little perspective. A glance at the work environments of many world-class musicians, scholars, authors and researchers from the 18th, 19th and early 20th centuries, or at the cramped physical environments in which highly capable, clever IT people work in developing countries, may convey a useful reminder that if you really want to do something, you can do it just about anywhere.
For that matter, my first job, as a technical hand and later sysadmin at a small-town ISP, when I was a bright-eyed, bushy-tailed 18-to-20-year-old, was in office space not altogether Class A; it was a freezing, windowless dungeon in a small 4-room suite whose HVAC controls were shared with a machine room in need of constant, high-intensity cooling, and the rooms were littered with stray computer hardware and accessorised with fairly second-rate furniture. Yet, I’ve never been so productive or so excited to go to work since those days.
In fact, the physical configurations in which I eagerly wrote code for many hours as a teenager are downright sadistic by cushy Silicon Valley standards: a crude wooden desk from Big Lots, a far too low wooden dining room chair, a 7 lb (~3.2 kg) laptop, a shared family PC in the tiny living room of graduate student barracks. And what a PC that was–a 386SX/40 with 2 MB of RAM, at a time when most respectable citizens were packing 60 and 90 MHz Pentiums. We were poor–by American standards, anyway–but I didn’t really notice.
I have a friend and colleague who travels around the world, living his digital life out of a rather clunky Lenovo ThinkPad, sat on a variety of surfaces, whichever are available at the moment. While he’s not usually staying in mosquito-ridden youth hostels in the Congolese jungle, I fully grant, he doesn’t have 30″ IPS displays or wrist rests, to say nothing of a snazzy office with an adjustable-height desk and a foosball table–you know, the bare essentials for Ruby on Rails jockeys in the Bay Area. Yet he’s one of the most disciplined and entrepreneurially successful people I know. I think of him every time someone says that they “literally cannot work” without a chair with proper lumbar support.
More extremely, some of my Armenian colleagues learned to code in a time when Yerevan was largely without grid electricity, during the disastrous Nagorno-Karabakh War and its contemporaneous power crisis. They hooked up their cobbled-together computers to car batteries for a few hours a day, batteries which they improvised some means of charging occasionally, usually by leeching surplus electricity from cables to critical facilities that did have it. They’re some of the most capable and multifarious IT guys (not to mention electrical engineers!) I know.
Yes, one’s eyes, back, wrists, etc. become more fragile and capricious with age, and it would be prudent to afford them some care. As I near 30, I’m acutely aware of that fact, having all sorts of aches and pains I didn’t use to have.
Nevertheless, my conclusion on the psychological purpose of constant twiddling of small features of our environment – desks, monitor sizes and so forth – is that it’s more about tricking yourself into working on stuff you don’t really want to do. It’s to create the illusion that now everything is truly right with the world, and productivity will seamlessly spring forth from one’s fingers. It’s a means of papering over the fact that you’d rather be doing something else. If most of us were actually doing something stimulating, we’d probably be happy enough doing it on a rooftop while it’s snowing.
To drive that point home, a story:
I rented an office in a repurposed Soviet-era building in Yerevan, Armenia for a while. I should pause to say that this was very much a “you get what you pay for” kind of thing, so please don’t think all life in Yerevan looks like this. Anyway, the very room next door was undergoing “renovation” some of the time, and on some winter days ice formed under my doorway.
The “included” Internet connectivity was sorely lacking; in the end, I settled for a WiMax receiver that gave me 384k/128k on a good day. I grumbled about it some of the time, sure. But I also wrote much of our product’s middleware, user interface and API documentation there. I was furiously productive, and it would not be unreasonable to say that our product made a titanic generational leap whilst I was there. Some other parts of this product infrastructure were written, also at a very impressive clip, whilst sat in a stiff chair at a Spartan metal table in the rentable upstairs workspace at Sankt Oberholz, a Berlin coffee shop and coworking enterprise situated in a late 19th century building. I was hunched over uncomfortably, my nose stuck in a 13″ ultrabook with a loathsome keyboard. My eyes burned and my wrists tingled by the end of most days.
Now I’m sitting in my eminently comfortable Class A office in Atlanta, with great connectivity and a 32″ LED display on my desk, a favourite Das Keyboard at my fingertips, and I’m writing blog posts, poking around on Facebook. The difference is pretty clear to me: when I was in Yerevan and Berlin, I wanted to work, and now I don’t.
I won’t end with some hackneyed and nauseating Millennial “thought leadership” about “doing what you love”, nor the facile conclusion that it’s all in your head. I would just offer the modest speculation that an ounce of tweaks to the intellectual content of one’s work, or to other, more existential life issues that inform one’s inner drive, is probably worth a pound of major adjustments to one’s office furniture, seating, barriers, and computer peripherals.
It’s wonderful to be back amidst the beautiful Alpine scenery of Innsbruck, and I’m overwhelmed by nostalgia. I was last here almost exactly twelve years ago, in 2003, when I was 17, for about two months during the summer before my senior year of high school.
This was before the era of smartphones and ubiquitous WiFi, and we had no Internet access in our rented apartment. I still had to use a real map, and had maybe an hour of Internet access a day. For the first time in many years, I learned to happily do without, and to go outside and enjoy life without anxiety about the torrent of news, information and opinion to which I was not privy.
My father got a rare and coveted opportunity to teach on a summer abroad programme for American students. In practice, classes would let out around noon on Thursday, and the weekend was ours for travelling; this way, I got to see Vienna, Rome, Florence, Berlin and Paris, generally connecting by train through München. The München-Innsbruck EC train got to feel something like a commute home by the end of it all.
But it was in Innsbruck itself that I got my first exposure to Western European life, and it did a lot to mellow me out of my teenage angst, in those times expressed through what we might call “niche” intellectual and ideological fixations. In everyday life in Austria, waking up after dawn to behold this ring of spectacular mountains above and piles of unlocked bicycles below, I found my idea of “capitalism with a human face”. Its attachment to reality is a complex topic, but irrelevant; my teenage mind had learned to stop worrying and love the small things, love the petite bourgeoisie. Apart from a brief exposure to the Rockies, I had never been in mountainous settings. I had only known the stifling humidity and mugginess of the Eastern half of the US and never crisp, cool air. The last time I had seen vestiges of daylight at 10 PM was in the “white nights” of Riga when I was four–a last-hurrah holiday in 1990 preceding the secession of the Baltic Republics.
In many ways, my quotidian walks up and down Maria-Theresien-Straße, ventures west on Anichstraße to the Universität Innsbruck cafeteria for our included lunch of schnitzel, and my hikes to Hungerburg (868 m) did more for my spiritual health than the whirlwind of train travel.
I returned to America in August calmer, thinner, fitter and happier, with very concrete–for once, not theoretical–expansion of horizons. I had forgotten a lot of Spanish grammar just in time for the AP course, but had a bit of German up my sleeve.
My literature teacher from the previous semester asked, dejectedly, “Where is the angry communist Balashov?” I had no answer for him; I was in good spirits, and it was a great time to be alive.