The world is abuzz with talk of “coding” lately. Lots of people tell me their brother or their cousin is “into coding”; “you know, he does web sites and stuff”. Indeed, I saw this book at Fry’s yesterday:
(Apparently, this book, mostly a tome of basic HTML and CSS, passes for “coding” nowadays, but that’s a rant for another day.)
On the shelf below it, there was another title: “Python for Kids”. And lots of tech colleagues tell me they’re teaching their five and six year-olds basic programming.
And as I have a two year-old son, and given what I do—though it has precious little to do with web development per se, in the main—I am asked fairly often: “Are you going to teach Roman to code?” It seems to be almost rhetorical in the mind of many doing the asking, almost a fait accompli.
I’ve always found the question puzzling. I don’t know. Am I going to teach him to code? To me, it sounds rather arbitrary, a bit like, “Are you going to enroll him in karate lessons?” or “Are you going to have him tutored in oil painting?” or “Is he going to play basketball?” It depends on what he’s like as a growing person, and whether he seems interested or appears to have any aptitude for it, I suppose. He’ll doubtless be exposed to it, given his parentage; there’s probably no avoiding that. Beyond that, it’s really a question of whether he’s keen on it.
There’s an important balance to strike; when it comes to specialisations, kids don’t know what they don’t know, and one of the main reasons we have general public education (and general ed/survey course requirements at the university level, in the USA) is to expose growing minds to the range of occupational possibilities, academic disciplines and fields of human endeavour generally. Still, I’m acutely aware of what happens when parents try to remake children to any degree in their own professional or intellectual image. I got this mildly, in the form of being subjected to parental projections of Soviet intelligentsia values: mandatory piano lessons, assigned reading of literary classics, lots of classical musical concerts, ballet performances, etc. In hindsight, it probably did me some good, though my adolescent self rebelled powerfully on the inside. I see much sharper examples in the lives of others, whose parents want them to proceed down some similar track — play football in college, learn the family business, or, as it happens, become a software engineer.
My own interest in IT as a child arose in a particular context, a historical conjuncture of many factors: university environment, emergence of the commercial Internet, supportive academic social community, adolescent quest for identity, efficacy, communication. There’s no reason to think the same motivations will drive others in an era in which all this is long commoditised. A lot of people seem to subject their kids to forcefully projected nostalgia for a different time and place. I know my love for computers came from a different time and place. I am not sure I’d have been lured by them as they are today.
I think the question about “coding” runs deeper, though. There’s a widespread awareness—and perhaps it’s fair to say, anxiety—about software eating the world. There seems to be some consensus that the foreseeable future of gainful employment in the developed world dovetails extensively with machine intelligence. Automation as a reputed killer of low to medium-skilled service jobs is a routine headline. I think what’s really being asked is, “given that we’re going to be a society of computer programmers, will Roman take part?”
I suppose I don’t buy the given. It’s fair to say that use of computer technology has become routine and necessary in most full-time professional jobs. I also think it’s important for kids to have some idea of how software works so that they can make sense of the world around them; it can’t all be “magic”, and indeed, that lack of understanding is an obstacle as we rapidly leap into a very software-driven world.
But it doesn’t follow that everyone needs to learn to speak to computers in code. Indeed, one could convincingly argue that the general arc of software progress and the commoditisation of computers has been to make this less necessary over time; there was a time when everyday uses of computers required speaking an assembler, COBOL, BASIC, while nowadays a substantial portion of the digitally savvy population taps through “apps”, and frankly, so do I. I started writing socket (network-related) code in C on Linux when I was 12, but I only have the broadest idea of how my Galaxy S8 works. I’ve asked younger Millennials for Android help before.
Moreover, people learn what they need to; I know plenty of otherwise technically illiterate accountants who have conquered snow-capped summits of Excel macro wizardry, the likes of which I could not have even conceived.
My undergraduate-aged babysitter is far from a technologist, but her mobile and desktop computer literacy surpasses that of many Baby Boomer and Gen X professionals. Why? She was born in the late 1990s; she’s always known the Internet. I jokingly asked her once if she realised music wasn’t always on iPods or in MP3 format, but based on her matter-of-fact response, I don’t think she really heard the full notes of the humour. It was almost like asking me if I realised history used to be recorded on papyrus.
In short, I don’t see law, medicine, writing, poetry, music, art, or the myriad of skilled professions becoming a fancy, domain-specific branch of computer programming. These fields will—as they do—put computers and the Internet to business use, but why are we talking as if everyone’s sat in front of a PDP-11?
That leads me to the heart of what inflames me about this cultural moment of software mania and metaphysical, cosmological technocracy: technology is a tool, not an end in itself, and we mustn’t forget that. It is a force subordinated to human purpose, not the other way around. It is as lifeless and mechanical as a jackhammer, not an organism in need of care and feeding, nor a capricious god to which we must pay tribute or sacrifice our young. It does not intrinsically solve most timeless sociopolitical problems. It’s not a raison d’être, and neither is “coding”.
Speaking of sacrificing our young, while my own childhood obsession with programming and the Internet got me a well-compensated occupation in an in-demand and growing field, as well as a supportive network of likeminded online cohorts, I’m all too aware of the human costs, physical and psychological. At least ten thousand hours were spent in a sedentary pose as an adolescent and teen. I missed out on almost all social features of high school, since there was always C code to be tinkered with or someone was wrong on Kuro5hin or something. (Though, there’s no particular reason to think it’d have been epic otherwise, for reasons Paul Graham articulated better than I could). The shockingly low amounts of sleep I ran on most school days between grades 6 and 11, bleary-eyed from the blue light-soaked all-nighters of homo computatis, ought to be the subject of some kind of study, I swear. I wear multiple pairs of glasses due to eye strain. I dropped out of college because I cared so much more about my work. The fact that anyone ever dated me seems like a miracle sometimes; I somehow had a girlfriend my senior year of high school, which finally had me looking after myself more, but you, too, would ask “how?”; it didn’t (heh) compute.
I’m not saying I necessarily regret any of it, though of course we’d all tweak a few things with the benefit of hindsight and time travel. What I will say is that I don’t bill my lawyer-ish hourly rate for nothing. I got here at the cost of much of my childhood and adolescence, as we ordinarily understand those stages of life, and at this point I’ve fed more than two-thirds of my life span to the exacting and jealous machine. The road to being pretty good at what I do was long and arduous. Computers are addictive as all hell. It’s no accident I’m finishing this post at 4:45 AM; when you mess up your biorhythms from such an early age, old habits don’t die. I’m very mindful of all that as I consider the full list of possible consequences of parentally encouraged geekery for kids.
I suppose there is one way in which Roman will be socialised in the shape of his father: he’s genetically part philosopher, and if he does take up programming, we’re going to spend a lot of time on: “But code what? And why?” In the meantime, I have no plans to plant him in front of a Raspberry Pi or “Python For Kids”.
One of the notes that the dog whistle of the Trump political machine hits is rural parochialism and provincialism, a conviction of the inhabitants of the American “heartland” that it is the essential America.
America hardly invented pitchfork-wielding country chauvinism, but it has had a particularly powerful historical current of that kind of know-nothingism, owing to its roots in a medley of dispossessed immigrant political minorities—especially religious minorities. There is ample lore in the heritage to support the popular psychological metaphor of taking a nihilistic wrecking ball to the state. There’s burning the corrupt edifice to the ground, nuking it from orbit, methodically strangling it with “starve the beast” spending legislation, “draining the swamp”—whatever form it takes.
These people don’t want to hear about big city folk and their reputedly elite cosmopolitan problems. No, it is we, on this two-lane road in the middle of southern nowhere, amidst the vestigial shells of Rust Belt industry, in the bucolic crop fields of South Dakota, who are the “real” America! It’s time to take it back! We don’t need no stinking foreigners, we couldn’t give a hoot about “diversity”, and (as was remarked to me recently) come on, show us one legit immigrant from Yemen, we dare you.
As always, the problem is that going full ostrich doesn’t work.
Like it or not, we do live in a highly globalised, interdependent world. Interdependence means complexity, and—to return to thermodynamics, as we all someday do—complexity means fragility. Fragility means that diplomacy and a multilateral approach to human affairs are necessary. Omaha mostly buys Samsung smartphones, too. Banks in Kansas are exposed to the Hong Kong Dollar and the German stock market. There are Caterpillar excavators and Boeing airplanes everywhere, and the manuals are translated into two dozen languages. Ohioans entrust their very lives to talented Japanese embedded software engineers every day, and will again tomorrow and the day after that—isn’t it nice to plug one’s ears and not have to wonder why one’s Honda doesn’t spontaneously explode or run off the road? The working-class enlisted sons of Kentucky are shipped off to places their parents would do well to learn to pronounce. The more economically privileged ones are going to discover a good Jordanian immigrant-owned falafel stand in their college towns.
(My own tech career was launched at a small-town Internet service provider owned by a Syrian Arab and his Pakistani business partner. But everyone knows devout Muslims don’t create jobs in Murrica.)
Or, more cosmologically: events in Yemen, Somalia, Egypt, South Korea, China and India, inter alia, are highly relevant to every American’s existence.
I know a lot of people who would prefer not to venture beyond the county line and like their intellectual preoccupations as they like their beer—domestic. To you I say:
Sorry, you can’t put the milk back in the cow and rewind the clock to 1802 A.D.1
Moreover, without the Coastal Mind Control Elites, this iconic heartland would have neither the government subsidies on which it extensively relies, nor markets for its products. I won’t patronise you with one of the many charts showing the geographic distribution of net inflows and outflows of federal dollars and the rural-urban trade balance. Y’all are actually pretty good at Google when there’s pervasive liberal bias and insidious Soros funding to be unearthed on the Internet, or some Julius Streicher-style Breitbart screed in need of scholarly citation.
Big conurbations concentrate economic opportunities and institutions. You don’t have to live in them if you don’t want to, and you don’t have to take part in the so-called Knowledge Work economy, but you can’t just give it all a sociopolitical middle finger and pretend that they, their denizens, or the larger world don’t exist. You guys just want to rap about how you’re screwed by globalisation and immigration, drop the mic, and leave hard problems of making the world turn ignored and unsolved. It’s facile, it’s petulant, and it makes you sound like an overgrown toddler.
If that’s how it’s going to be, fine, but could you at least let the overedumacated, book-learned cityfolk do their jobs instead of sticking them with a reality TV demagogue and his diabolical alt-right posse?
1 This is precisely why the [seemingly] intentional deafness of the Libertarian candidates to foreign policy is a non-starter.
I am rather deeply disenchanted with and bored of The Daily Show. This has two distinct, unrelated causes I can identify:
1) One simply gets too old to tolerate its dick-and-fart jokes and potty humour. At this point in my life, I’m looking for more incisive, more subtle and less superficial political satire and social critique.
I don’t know if that problem has a solution, given that the Daily Show’s seeming target is college-aged audiences. But I find myself wishing that, in terms of sophistication, it would graduate along with the early-mid 2000s students who were the engine of its ascent.
2) Trevor Noah is a fantastic and brilliant comedian—if you watch his stand-up from South Africa and the UK.
His gift for accents and languages is uncanny, and his satire of the South African political elite (e.g. Julius Malema, Jacob Zuma) and everyday interactions had me doubling over from the stomach pain of laughter. Watch his old material; if you don’t know much about South Africa, you’ll learn.
Some of my favourite examples on YouTube (always subject to removal, you know how YouTube is) include his treatments of airport terminal announcements, the inclusion of Chinese in the BEE (Black Economic Empowerment) programme, and this rather obscure act poking fun at the frustration of interactions with Johannesburg City Power. And his performances about cultural differences in America need no introduction at this point.
Since that was my first contact with him, that’s how I see the real Trevor Noah. At The Daily Show, I have the acute sense that he’s not being allowed to carry his artistic identity over. He’s being shoehorned into an American dick-and-fart comedian role, and to say that it’s not a natural fit for him is an epic understatement.
I understand that the logic of the show has to have some continuity with Jon Stewart and with the show’s historical inertia, audience and so forth, but he’s not Jon Stewart, and he shouldn’t be.
To really reap the rewards of Trevor Noah as an artist, an observer and a commentator, he must be allowed to be his “South African” self and to bring—perhaps it would be more apt to say, to impose—a distinctly foreign perspective to his American audience, nolens volens.
It might not be what the focus groups tell Viacom executives would sell, but there’s real potential there. Yes, there’s always the spectre of Piers Morgan, and with his demise, the idea that foreign commentators interpreting American affairs and telling Americans how things should be is a fatally unpopular formula. However, I think—perhaps naively and overoptimistically—that the younger and more malleable viewership of the Daily Show would respond better than CNN’s, and would have a greater inclination to stretch their brains and grapple with such a phenomenon.
I cringe when I see Trevor forced into various contrivances and artifices of a false pretense of American mass-cultural familiarity. He’s not American, and he shouldn’t be made to pretend as if he is. Low-brow toilet humour and American Millennial lore don’t work for him. Laundering facile Internet memes is not his competency. If he could control more of his material and delivery, and tailor it to his traditional style, there would be a show more worth watching.
I’ve written some in the past about how the predominant suburban design of the US is one of the worst features of life here, viewed from the perspective of a European immigrant like me, at any rate.
Far from posing a mere logistical or aesthetic problem, it shapes–or perhaps more accurately, it circumscribes–our experience of life and our social relationships in insidious ways. The destruction of the pedestrian public realm is not merely an economic or ecological absurdity; it has real deleterious effects. For just one small example of many: life in a subdivision cul-de-sac stops children exploring and becoming conversant with the wider world around them because it tethers their social lives and activities to their busy parents’ willingness to drive them somewhere. There’s literally nowhere for them to go. The spontaneity of childhood in the courtyard, on the street or in the square gives way to the managed, curated, prearranged “play-date”. Small wonder that kids retreat within the four walls of their house and lead increasingly electronic lives. (The virtues of a private backyard are easily exaggerated; it’s vacuous and isolated, and kids quickly outgrow it.)
However, it’s been difficult to elucidate in specific physical terms what it is about suburbia that makes it so hostile to humanity. To someone with no training in architecture, it’s often experienced as a great, nonarticulated existential malaise, like depression. You know it sucks, but it’s hard to say exactly why. The same holds true in reverse; North Americans who have not travelled abroad extensively and don’t have a clear basis for comparison can be tongue-tied when asked to explain what exactly makes a non-sprawl city street “charming” or “cozy”. It’s telling that we have no widespread cultural vernacular for why classical urban settlements, which draw on millennia of intellectual background and corpuses of architectural knowledge, are pleasant. It’s because Americans took that inheritance and unceremoniously discarded it, consonantly with the rise of the mass-produced automobile. It irks me that many of us know, on some level, that we live in a dystopian nightmare but can’t say what makes it a dystopian nightmare.
That’s how I came to spend a fair amount of time recently thinking about and researching what exactly makes suburbia suburbia. I don’t mean the abstract reasons why it sucks; I’ve pontificated on that plenty. I mean the physicality. For example, I live in Atlanta, a suburban mega-agglomeration that sucks in the same general way as cities like Los Angeles, Dallas-Ft. Worth, Houston and Phoenix. When someone asks me where I’m from, I roll my eyes and diffidently groan, “Atlanta…” Why? It’s worth asking what specifically makes Atlanta “[groan] Atlanta”.
If one hopes to avoid vague generalities like, “designed for cars, not humans” and instead to get specific, then there’s no single linchpin attribute that makes suburbia what it is. It’s an interdependent constellation of misanthropic zoning rules, building codes, and planning guidelines. My aim is to list as many of these as I’ve discovered and been able to formulate.
1. Single-use zoning
American zoning law (in all but its oldest cities) forecloses on the possibility of mixed-use development. This means traditional design patterns like shops and offices on the first floor with apartments above are impossible. Residences are constructed in special areas zoned for residential construction, while shopping and work take place in altogether different areas zoned for commercial development.
The idea, of course, is that the peaceful slumber of the suburbanite should not be interrupted by the noise generated by the transaction of commerce or any other public-sphere human activities. The result is that running any errand or attending to any need, no matter how small, requires getting in one’s car and driving somewhere else, in many cases several miles or more.
Since separation of commercial and residential zones by vast tracts built at automobile scale (rather than human scale) removes the possibility of accessing useful destinations on foot, it removes any practical motive for walking. Without consequential destinations that are part of normal human activity, by and large, the only people who walk on suburban streets do so for exercise. And the only reason they would do that is because their automobile-powered daily existence does not otherwise compel much movement.
2. Hierarchical traffic distribution
The chief complaint of most residents of suburban sprawl is traffic. The most obvious cause is, of course, that everything requires driving, but there are more subtle reasons, too.
The endless cul-de-sacs, winding loops and seas of parking lot in suburbia empty into larger “collector roads”, often constraining traffic in a given neighbourhood to a single preordained path.
Traditional neighbourhoods and cities are designed in a dense grid and/or interconnected web of streets, so there are many alternative paths between two points.
3. Set-backs from the street & parking ratios
Local building ordinances in suburban sprawl don’t allow buildings to directly abut the kerb. That means one cannot simply enter a building from the street. Instead, the building is set back from the kerb, requiring one to traverse a parking lot to reach it.
In the case of larger shopping centres, this means the building is set back several hundred feet, separated from the street by a large sea of parking. This is because suburban building ordinances require a generous proportion of parking spaces in relation to the surface area of the building. So, the larger the building, the more parking it must have, and, seemingly, it must be in front of, not behind, the building (more on that later).
4. Proximity does not mean pedestrian accessibility
On the other hand, it is not so uncommon in suburbia to live very close to a nearby shopping centre. I’ve had lots of suburban friends tell me, “actually, the grocery store is 1000 ft from me. Very convenient.” Indeed, when we lived in an apartment complex in the Perimeter Mall area in Dunwoody, the nearby Walmart shopping strip was within spitting distance. I could almost see the store entrance from my bedroom window.
But, perversely, that doesn’t mean I could walk to the store, as a normal person from virtually anywhere else on the planet might conclude from that statement. In its fanatical quest to eviscerate the pedestrian realm and make cars exclusive first-class objects, suburbia manages to make far even that which is conceptually close. Building ordinances generally require some sort of “divider” between these adjacent land parcels, like a ditch, a chain-link fence, or a concrete wall or noise barrier. In our case, that meant I had to walk out of the apartment complex, go around the divider, and then cross several hundred feet of parking lot to go to the store.
It goes without saying that most normal people would choose to drive the distance. And that’s the idea.
5. Economic segregation by building type
It does not bear repeating here that one of the things that makes interesting places interesting is variety. However, one more subtle effect of the enforced homogeneity of suburban residential neighbourhoods is economic segregation.
In older and more traditional neighbourhoods, multiple types of buildings of varying sizes coexist closely. Yes, it is a universal premise of building regulation and planning that they must be united by some sort of overarching organising aesthetic principle and geometrically agree in some way or another, but that doesn’t mean they must all be of approximately the same type or size. As a result, it’s quite possible for poor, middle class and rich people to live side by side in one neighbourhood, with the difference that the rich people’s houses or apartments are merely bigger.
Local building ordinances in suburbia aggressively disallow this, and it’s the fastest way to tank property values within the logic of the suburban system. That’s why every new subdivision varies only by a handful of approximately similar house types, and the residents are all in a similar income bracket.
Suburban building codes also commonly disallow affordable housing hacks available in older neighbourhoods, such as above-garage apartments (sometimes known as granny flats). It is no mean feat to get approval for a small secondary edifice in one’s backyard–something the size of a toolshed, but habitable. Contrary to the individualist-libertarian ideology underpinning widespread suburban attitudes, even use of the space behind one’s walls, within the private sphere, is highly constrained and regulated.
6. No street enclosure and definition
The geometry of streets and sidewalks is a critical topic. Generally speaking, the reason settled streets in older neighbourhoods and European cities feel “cozy” and “charming” is that they provide a feeling of enclosure, which humans want because it gives them a coherent sense of place, like rooms in a house.
I’m not a sociobiologist and cannot say exactly why this is, but would speculate that it caters to people’s primal need for shelter and clear directional orientation. Whatever the case, it’s an established fact that people gravitate toward places that have clear borders and relatively comprehensive enclosures; it’s a kind of axiom for the discipline of architecture. People feel vulnerable and uncomfortable in open areas with ill-defined margins.
That’s the difference between standing on Saint-Germain:
And standing in the middle of nowhere:
Creating that enclosure and definition cannot happen if buildings are sparse and set back from the street. It also requires a certain broadly rectangular building geometry, with more right angles and fewer campy avant-garde twists (more on that later). Suburban streets are notable for the degree to which they don’t provide a sense of place. Their curved, winding trajectory also robs one of a sense of cardinal direction–that’s why it’s so easy to get lost in suburbia. I am much more likely to need GPS aid in navigating through a subdivision than through a downtown.
Pleasant, walkable streets have other important features, such as protection of the pedestrian sphere from automobile traffic. This delineation is provided by architectural buffers such as trees, high kerbs, and street-parked automobiles themselves. All of these things can arrest a car about to plough into a crowd.
Another thing that takes away from the feeling of place and enclosure is large kerb radii. You’ll notice that in dense cities and older neighbourhoods, sidewalks adhere to the street at right angles, providing a minimal crossing distance for the pedestrian. However, suburban kerbs are optimised for cars, allowing them to maintain some speed while turning right–and to easily mow down anyone who is misled by the formal presence of a crosswalk into the belief that they’re actually meant to walk there.
7. Useless, ugly and wasted space
When quizzed about the advantages of suburban life, the most common answer is “space”. But even if you like lots of space, you’d have to agree that the quality depends on what kind of space it is.
Suburban development ordinances are replete with requirements for useless frontages, pointless greenspace between compatible land uses, as well as chain-link fences, concrete barriers, and drainage pits. Space is still inhabited by humans, and has to be articulated to match their specific uses for it. A lot of open space in suburbia lacks that articulation; it’s neither pristine forest nor a particularly usable surface. It’s just kind of there.
The absurdly large width requirements for inner residential streets are a special case of their own. Small, low-density streets don’t need to be so wide that one almost can’t see his opposite neighbour’s house because of the intervening curvature of the Earth, especially given that street parking is generally not done in these places because, evidently, everyone needs their very own [expensively and unnecessarily] paved driveway. The formal reason for large width requirements is generally something comical, like to accommodate a full-size fire engine or other large emergency vehicle in case tragedy should strike. Well, sure, conceivably you might need to land an A380 there, too.
8. Parking-first aesthetics, garage façades, no alleys, no interior yards
It took me some time to consciously realise it, but one of the biggest differences that makes traditional neighbourhoods more appealing is that parking typically happens behind the house, reached through an alley. One is not likely to see an alley approved in suburban construction; that’s where robbery happens, right?
Instead, suburban houses are set back to make room for a driveway. Much of the façade of many houses is accounted for by a garage. This telegraphs the impression that the primary function of a house is really, above all else, to provide parking for one’s car.
Considering that suburbia is reputedly sterile and safe, there ought to be many other uses for alleys and common interior courtyards located at the rear of buildings, away from the street. In addition to being the proper place for cars, those are good places to put trash and recycling bins. Instead, the suburban street is surreally dotted with plastic trash cans at least weekly. So much for the pretense of civilisation.
What this says is: we have such a dilapidated and depressing public realm, so few memorable places and things worth seeing, that we truly don’t care. This tension also accounts for the kitschy, farcical schizophrenia of the suburban home façade:
It’s a castle, a veritable homage to colonnades! But wait, there’s more: there’s a front porch–and if you’re a toddler, you can fit on it! Seriously, what is this thing? It looks like it’s trying to be a lot of different things from the annals of written history.
It’s not a house. At best, it’s awkward and unsettled eclecticism, and at worst, it’s a caricature, as Kunstler would say. The form of normal houses much more closely follows their function. The problem is, when there’s nothing else worth looking at, developers are maximally exposed to the charge of building “sterile” suburbs if they build a merely functional house, the sort of thing that would be thought attractive for its simplicity and cohesion elsewhere on the globe. And that’s how we get to the neurotic potpourri of superficial ornamentation above.
The same dialectic is often a driver of the infamous suburban NIMBYism. When the public realm is so depressing and demoralising, describable mainly in terms of the car traffic it generates, it’s understandable that nobody would want to see more of the same built nearby. It ultimately comes down to the fact that we don’t value our public realm in America, and, no surprise, we’ve not built a public realm worth valuing but instead retreated into escapism in the private one. All escapists, ranging from readers of fantasy literature to video game players to drug addicts, are generally irritated by any effort to somehow disrupt or meddle with the ongoing process of their withdrawal from reality.
9. No street life or visible human activity
Periodically, people will ask me: “Well, if you’re so committed to walking, why not just … do it?” They mean right here, on the highway, next to six lanes of traffic, in 90F heat.
Well, in actually-existing psychological reality, people aren’t going to walk where it’s neither comfortable nor interesting to walk. Contrary to popular Republican-type mockery of the notion, “interesting” doesn’t require a hipster paradise of airy-fairy, frou-frou creature comforts like street cafes (though they do uncannily arise in interesting places). “Interesting” just means there’s some intimation of human presence and activity expressed in the architecture and scenery.
There’s nothing about a treeless six-lane highway that conveys this. I’m going to drive, not walk, because to walk would be boring, tedious, uncomfortable, dangerous, and, in a sprawling geography designed at automobile scale, impractically slow.
10. No public transport
Aside from its superior efficiency and ecological footprint, the primary value of public transport is not in being able to commune with the armpits of your fellow man, but in being able to spend your time in some way other than chained to one’s steering wheel cursing the traffic. You can read a book, catch up on e-mail, or just close your eyes for a while.
In suburban sprawl, you’re doomed to spending vast amounts of time at the wheel–time you cannot do much else with, and which you won’t get back. The nature of low-density automobile sprawl cities is that everything is insanely far away from everything else, so no matter what you do, you’re doomed to driving vast distances to see most friends, to commute to work and so on.
Clearly, it bears mention at this point that self-driving cars could address the chained-to-steering-wheel factor. But it remains to be seen to what extent they can shift the larger paradigm. I can envisage self-driving cars doing very little to change the overall blight (and environmental costs) of suburbia, or I could see them evolving more rationally into a kind of semi-personalised public transit. It’s a phenomenon that has the theoretical potential to either greatly further our atomisation into the pathetically sybaritic techno-pods of a WALL-E type world, or to turn into a moderately pleasant band-aid.
Whatever the case, they don’t solve the more fundamental problem of our vicious contempt for the idea of a public realm.
11. Improper interface between city and highway
In most places in the world, one will find that high-speed highways run between cities, not through them. You’ll also find that intercity highways don’t have a lot of commercial development along them, allowing unadulterated views of the countryside.
In places like Atlanta, interstate highways are something like main thoroughfares. Three of them converge downtown, along with numerous other high-speed roadways.
The effect is to induce lots of derivative traffic within the city. Freeways breed on-ramps and car-centric development along the corridor. At the same time, the city, especially its most important historic parts, is partitioned by an ugly exoskeleton.
12. Lack of regional planning vision
Turning back to Atlanta: decades of unbridled free-for-all building there have led to a widely dissonant, fragmented patchwork that cannot deliver a coherent thesis for future development in the city.
Some individual neighbourhoods in Atlanta, like Midtown (where I live), have made great strides over time to become walkable and present viable in-city living options. The problem is, as soon as you need to leave such a neighbourhood, you still have to get in your car.
The same problem can even play out on the block level. I’ve been to some downtowns of suburban sprawl cities and found them to have a number of blocks or sectors that are actually quite pedestrian-friendly, well-designed and interesting. The problem is, these blocks are like a chessboard; they’re not contiguous! Want to go more than 500 ft? Better start the car.
The point is, Metro Atlanta covers nine counties and untold municipalities, incorporated and not. With all the resources and initiative in the world, there’s nothing the City of Atlanta can fundamentally do to alter the reality of life in 95% of Metro Atlanta. I haven’t seen anything inhabitable constructed in America through a laissez-faire approach to building across such a patchwork. Charge has to be taken at the regional level.
As far as I can tell, the same holds true almost everywhere, since everything in the US that is–gallingly–called a “city” consists of fragments scattered across unconscionable stretches of freeway. I have a special place in my heart for Dallas-Ft. Worth, much of which should be reclassified as a rural area outright if one is to judge by density. But the need for a regional approach to development priorities and transportation probably applies almost everywhere, including places like St. Louis, Indianapolis, and Omaha.
This post draws in part upon the work of James Howard Kunstler, including his widely disseminated TED talk, as well as upon the data and ideas in the well-known New Urbanist title Suburban Nation by Andres Duany, Elizabeth Plater-Zyberk and Jeff Speck.
My upbringing and extraction unquestionably lie in the liberal arts and humanities. I’m moderately extroverted, and always leaned hard on the side of verbal, expressive and linguistic capabilities. During my time in university, I was a philosophy major with vague notions of law school. My parents are philosophy professors, and most of my relatives have an academic pedigree. I also had the unique intercultural experience of coming to America at age six with no knowledge of English and subsequently learning it in an academic social setting (my parents were graduate students). I carried that formative experience, and the globally conscious, relativistic outlook on language and people that it fosters, forward with me through life.
That doesn’t mean I’m a great writer, but I can write. Great writing, though, is really hard. As with many other things, if you plot a line from “can’t write at all” to “great writer”, you’d have to plot it on a logarithmic scale. Having a broad vocabulary, a firm command of language, and adroit self-expression will get you to the table stakes of “can write”, but that last bit, on the right, is a hundred, a thousand times as hard as what precedes it.
You know great expository writing when you read it; the thoughts and ideas are scrupulously organised, yet presented in a compelling way, with varied transitions and entertaining use of language, at once colourful and precise. Come to think of it, it feels pedestrian to anatomise it this way. You know great writing when you see it.
My writing is far too disorganised and repetitive to hit those notes. I’m verbose and can write a lot quickly and easily, but quantity is not quality; organisation has always been a struggle amidst my desire to relate a lot of details. If you read this blog with any regularity, you’ve seen that battle play out.
Though I’ve got better at condensing my thoughts and communicating ideas simply with age, I’ll never write like my friend Alan. His writing is incredibly brief and terse, but his gift is succinctness per se, which is not the same as brevity, though the two are very often confused in contemporary minimalistic fashions in communication. He can say much with little where many others merely say little with little.
As far as I can tell, the real gift there is the ability to accurately foresee the details and connections that the reader’s mind can work out for itself. Then you can say only what’s necessary to anchor the conceptual tent, cloth not included, avoiding most of the potential redundancy that makes verbose text tedious. This post would be about six times shorter if Alan were writing it, yet say every bit as much–if it’s truly important.
I tried, for a time, to emulate his style growing up, but the results were farcical, much more along the lines of saying little with little. Not everyone’s intellectual output can be compact and tidy. I have to ply my version of the craft, such as it is, differently.
Anyway, I lay out all these concerns not to be pompous, but rather to say that the kind of stuff I spend a lot of time worrying about doesn’t typify the STEM personality one commonly finds in the software engineering profession, nor the pragmatic, utilitarian–and often Spartan, at least when it comes to writing–communicators in the business world. There are exceptions, of course, but as a whole, my life experience is that it’s a valid generalisation about engineer types and MBAs thumbing out curt txt spk on their Blackberries. And this is the environment in which I’ve spent almost my entire adult life, having dropped out of university to seek the exalted heights of corporate America.
As you can imagine, this occasionally leads to amusing and infuriating conflicts of style and culture, and in general doesn’t make for an easy professional life. It’s not easy to talk to people when you have completely different psychological priorities than they do. The curse of being somewhat better-rounded is that my mind often takes detours not travelled by fellow Professionals. To their mostly utilitarian sensibilities, idle musings and the cultivation of an inner life beyond the immediate task at hand are, above all else, a waste of time. It’s not enough for me to just write this e-mail; it must be a good e-mail, at once brief and useful, but also poignant and articulate, maybe even with a dash of wit or a clever, original turn of phrase. They’re thinking: get to the point, Alex, because business. There’s money on the line, or action items or something. Never mind the existential why! Business.
Being wordy didn’t make for an easy childhood, either. I don’t think I came off overtly as bookish, being mostly chatty and rarely seen with an actual book per se. All the same, I can’t remember how many times I was called “Dictionary” or “Thesaurus” in school, or otherwise suffered social opprobrium for… well, for using words like “opprobrium”.
Outside of the liberal arts wing of a university environment (which I forsook at age twenty), the rest of the world offers a pretty steady diet of hostility to aspirant wordsmiths, and, as far as I can see, more generally to the broader cultivation of the intellect. There’s the automatic, default hostility of idle, unemployed kids in school, and the studied hostility of busy professional grown-ups. It’s easy to get depressed shouting into a waterfall, or, more accurately, pissing into the wind. I often feel an impostor, not quite sure what I’m doing here donning the regalia of tech entrepreneurship. When almost everyone I mix with expects small talk, being the guy always keen to start some big talk is demoralising and lonely.
And yet, as I pass the thirty mark, I’ve noticed something interesting. As more and more friends, colleagues and classmates move up the career ladder or otherwise evolve higher-order life needs, they’re coming to me for help in formulating thoughts: delicate requests, polite demands, cover letters, biographies, dating profiles, admissions essays, crowdfunding campaigns, petitions. All of this and more has landed at my feet in the past year.
“You always know how to say this stuff just right.”
“I don’t know how to say this – help!”
“You can put it a lot better than I can.”
Every once in a while, I’ll even get a note from a customer: “We always appreciate your thorough explanations and your going the extra mile.”
So, good news from our own “It Gets Better Project” for fellow closeted English majors in their twenties and thirties: keep your head up. As folks who know you move up the value chain into managerial realms requiring them to flex their communication muscles for the first time, you’re going to be more in demand.
Moreover, through my own experiences in hiring and being hired in the technology sector, I’m firmly of the impression that the most valuable candidates in the long run are those who both possess raw skills and can communicate well. There’s a lot of bottom-line value in clear analysis, disentangling messy ideas, and presenting esoteric information in an accessible way to outside stakeholders. Wordy missives may always be ignored by MBA frat boys as a matter of course, but effective and engaging communicators have more influence and audience.
The point is, as you gain confidence on your professional ascent and increase your leverage, stop taking shit from philistines. Don’t shy away from selecting aggressively for employers, customers and partners who realise that better-rounded people bring more to the table and appreciate you for who you are. Much has been said about how the customer is always right, and while compromises are necessary in life, you don’t have to concede everything and always. The fibres most integral to your self-actualisation should be armoured. The rightful sense of self is not for sale.
Evaluating potential hires for “culture fit” is all the rage in human resources now. Why not turn the tables and evaluate employers for culture fit? What’s the culture like at the new gig? Neverending arguments about last night’s Steelers vs. Cowboys and the impact on Fantasy Football picks? Spirited discussion of the pros and cons of sundry brotein shakes? A thriving marketplace of World of Warcraft items? Hackneyed memes about bringing democracy to Syria? Whatever the case, fire ’em. Sounds like bad “culture fit”.
Finally, choose your cohorts and your spouses wisely. Your true friends will help, not hinder you in leading an examined life.
My twenties are coming to a close in a few days. Like many people in my position, I got to thinking: “What do I really wish I had known when I was twenty?”
I suppose I could recapitulate professional and business lessons in easily digestible form, but does the tech entrepreneur self-help genre really need my help to survive? I could write about the evolution of my politico-philosophical positions, but I do that already. When I think about the kinds of insights I most wish I had when I was twenty, I think about the harder pills to swallow–the ones in the world of life, love and people. So, I’m going to talk about those.
It’s not that age thirty is a snow-capped summit of shareable wisdom–no, indeed, one of the lessons from my twenties is how little I truly know. But another thing I’ve learned acutely in my twenties is that the road can end abruptly at any time, so if you want to write down some thoughts, don’t wait until you’re seventy and retired to write memoirs. It’s a cruel trade-off, because at seventy you’ll have a lot more credibility. But you may not get the chance to employ it.
Not all these thoughts originate in my direct experience. Some do, and I’ll own that. Some I’d rather disavow so as to not look like an idiot; you know, “asking for a friend” here. Some others are from observing the lives and fates of those close by, particularly where their experiences are more diverse. The ambiguity I’ve created here lets me steal a bit of their thunder and make my twenties potentially seem more interesting and original–or catastrophic–than they were.
1. The biggest advantage of youth is youth.
You only get:
- Bright-eyed, bushy-tailed excitement
- Huge amounts of physical and mental energy
- Low maintenance and high risk tolerance
once, and if you don’t use them, they’re gone forever. You’re never getting them back.
I’m going to frame this in tech entrepreneurship terms because that’s close to home, but it applies equally well to any hard endeavours: launching a substantive career, getting a doctorate, doing significant research, writing a book, opening a shop, becoming an expert.
We often talk about “working smarter” versus “working harder”, and surely, working smarter is an important evolution. But to attach to the value chain in the first place, there’s a lot to be said for working harder. Sometimes there’s just no substitute for raw hustle. The cruel reality is that not everything can run on “vision” and “management”; someone’s got to be down in the boiler room, and some problems can only be tackled with huge inputs of raw energy, high motivation and brute force.
When I was twenty, I could write code for twelve hours in a dark room without a care in the world. Nowadays, I’d say two to three hours is a banner day, and I might need a mental break tomorrow. If nothing else, my eyes and limbs can’t take it; I’ve got all sorts of little aches and pains I didn’t used to have.
Much is made of “a lifetime of learning”, and that’s good and fine, but the reality is that most of us do become more sclerotic with age and our habits become more ingrained. We get intellectually fatigued from seeing the same patterns over and over. We get physically tired.
From a competitive standpoint, it’s really hard to kill a “Ramen-profitable” 23 year-old rooming cheaply with some buddies. At that age and life situation, one needs almost nothing to survive. What’s the worst that could happen? He could fail completely and move back in with his parents for a while? A businessman like that is like a cockroach; you could drop an atomic bomb on him and he’d still be kicking. In contrast, a guy with two kids, a mortgage, daycare, medical bills and wifely lifestyle expectations is a sitting duck with a massive burn rate. If his income stalls below six figures, he’s going to have to quit and do something else.
Now that it’s become sociologically normal to view one’s twenties as an extension of high school, many folks let their twenties go to waste. I pissed away my twenties, too. I started my company at age 22, but by that point I had managed to buy a downtown condo, incurring two mortgages and a car payment. That high minimal personal expense base doomed me to spinning my wheels for years on consulting in order to stay afloat when I should have been building product–effectively, the same kind of funding constraints as the 40 year-old guy with a family. And I didn’t hustle nearly as hard as I should have; I wasted a lot of time and money with stupid distractions.
Yeah, I’ve got some kind of “work smarter” play in motion, but the point is that I’m not getting my 22 year-old self back. If you’re lucky enough to still be in your early twenties, recognise that your time is now, and the world is your oyster. You may not have the wisdom and experience of older folk, but you’ve got 200,000 lbs of thrust and tiny gross tonnage; that’ll get you to space, if you really want to go. Stop watching Celebrity Apprentice and go do something real. You’ll never have the same opportunity again.
2. It’s important to build a real identity.
It’s relatively commonplace nowadays to see people in their twenties while away some of their most socially formative and interactively significant years on an exterior of “ironic” or sarcastic hipsterism, or veil themselves in thick shrouds of pop culture inside jokes, movie references, or Internet memes. That’s about the only kind of conversation you can have with them.
The appeal is easy to understand; it’s lazy as can be, requiring little personal ethos and cultural literacy (of the non-pop culture sort), and only moderate brain candlepower. More importantly, it’s risk-free, since anything one says in this insincere mode of social operation is easily disavowed or denied. As Christy Wampole said in the excellent article Living Without Irony:
… irony functions as a kind of credit card you never have to pay back. In other words, the hipster can frivolously invest in sham social capital without ever paying back one sincere dime. He doesn’t own anything he possesses.
Social capital is a game where one must pay to play. If you bring nothing to the table, you get nothing. Despite their occasional short-lived hit singles, the full-time “ironic” and “absurdist” are utterly discardable, forgettable people. When their black day arrives, nobody will come to their funeral, because they impressed nothing upon anyone worth remembering. Now that you’re hopefully relieved of the naive invincibility of your late teenage years, it’s time to give some thought to what would be said at your eulogy and written on your tombstone if you died tomorrow. Do you want your immortal contribution to the world to be that you had a kitschy hat, a snarky one-liner, or a Lolcat for every occasion?
To build real relationships, to learn and to grow, you’ve got to do the hard work of growing intellectual and moral backbone, and you’ve got to learn to defend it while negotiating bridges of understanding with others. It requires putting yourself out there and making yourself vulnerable. It requires applying yourself toward a directed purpose, an ongoing project of who you want to be when you grow up.
3. The pervasive current of truth about most of humanity is a conservative one — and that’s okay.
You can pick up on this most easily by observing the habits and lifestyles of white middle-class liberals in the US. You’ll notice they mostly have rather conventional and morally upright marriages, and aim to raise rather morally upright and conventionally successful children. Officially, they’ll pay much lip service to morally relativistic fashions and postmodern eclecticism, but that mostly concerns other people’s rights, not their own lives. Consider where they choose to live, which schools they send their children to, and the positions they take in zoning forums. Sure, they’re all for recognition of non-cisgendered non-binary pansexual genderbenderqueer whatever, but watch their reaction when their son comes home in drag and says he’s screwing a black guy.
In general, even these people tend toward those who are sociologically similar. They can be notionally against the death penalty, but not when it’s their sister’s murderer. They’re against militaristic foreign policy until someone flies a plane into their office tower. They’re against militarisation of police and martial authority until their neighbourhood is overrun by vagrant looters. They’re enthusiastically for affirmative action and equal opportunity policies to rectify historical racial inequity until a busload of Hispanic gangbangers is unloaded into their kids’ AP Calculus class.
They’re not hypocrites. They’re just trying harder than most to keep the politically correct cat in the bag, because that’s their shtick. But occasionally, fissures form in this elaborate fiction, and if you peek in and look around, you’ll see that they’re normal people, after all. There are some earnest progressives among the progressives–mostly unreconstructed flower children and their confused descendants. But when it comes to things like family morality, sexual mores and sociocultural dilution, most liberals agitate for rainbow causes on the implicit theory that other people can “live and let live”. While other people deal with the consequences, liberals trumpet their progressive, tolerant values without their own skin in the game, and everyone walks away happy. This is great news for people living at the margins who used to be actively persecuted and just want to be left alone, but it can give a very wrong idea about what people think privately.
Still, in our highly individualistic and politically correct age, it’s comme il faut in all but the most parochial circles to belch out at least a nominal paean to social liberalism–that is, unless you want to taint yourself with some kind of retrograde Republicanism and its populist dog whistle, conservative Christianity. This leads to the misapprehension that most people are quite liberal, but it’s a siren song. Taking it at face value in one’s twenties means ignoring the realities of people’s private judgement, and it can lead to some harsh consequences.
One area is friendship and reputation. I’ve lost good friends over some unhealthy lifestyle choices in my mid-twenties which telegraphed flimsy constitution and poor impulse control. It’s one of those things where everyone is superficially very tolerant and accepting–democratic live-and-let-live and all this–but one day I woke up and realised that, while they do speak to me, they’re not really my friends anymore.
It’s astoundingly easy to fling yourself clear out of respectable society if you don’t see past the veneer of tolerance–if you actually allow yourself to believe that people don’t believe in respectable society anymore. I don’t know what’s worse, falling for that scam or never having been taught that there’s such a thing as respectable society. The latter is the stuff of bad education from unreconstructed sixties hippies, who never quite got over their sentimental attachment to the idea of disestablishing the whole thing. It’s poetic, but it sets up your children for failure.
“Don’t judge me!” is a facile bit of sloganeering, a thumb-sucking fantasy. Of course people will judge you! Judgement about others’ character is part of the basic inductive reasoning integral to our species’ survival. We live in a socially and morally interdependent universe, and the painstakingly evolved mechanics of social cohesion and shared morality–the stuff of anthropology–predate the latest fads and nouveau projects in sociology by what, a few millennia?
The consequences of my actions in this area are entirely mine to own and live with. I wish I could go back and give myself a blunt reminder that the world works more like my parents and other elders said it does, and less like the ostensibly freewheeling Bacchanalia of mass-culture would have us believe.
Also, because I’ve seen it happen to others around me: sleeping around robs both genders of valuable experience and skills in relationship-building, since sloppy, drunk sex requires none of them. I’ve seen far too many peers come out of a Lost Decade like that with no capacity to relate to another human being. It’s common for twenty-somethings to conflate quantity and quality in talking about the much-prized “life experience”. A decade on heroin is knowledge, but not good or useful knowledge.
4. The quality of the people with whom you surround yourself is of paramount importance.
One of the fallacies of the Anglo-American penchant for individualism is its tendency to paint one’s journey through life as that of a free electron, associating incidentally with various atoms from time to time in a kind of undifferentiated way. This is amplified by the American national mythos of the socially mobile, implicitly classless society, as well as the value of “diverse experiences” to “broaden horizons”.
If there’s one thing I’ve learned from my twenties, it’s that this is a grand, epic lie. Life experience is good, but not all diverse experiences are worth having, and not all horizons need to be broadened. We are social animals, and we adapt to those around us. It follows from this that if you surround yourself with bad people, they will slowly pull you down to their level, no matter how clever you are. Your choice of friends and partners is a powerful signal to other people whose opinion is important to you, just like anything else. There’s no judgement-free zone.
To advance through life productively, it is important to take a page from the social rules of the Old World and acknowledge the existence of social strata and concepts like “level of cultural development”; your friends and your loved ones must embody the values with which you wish to commune. If you aim to conserve a cultured upbringing, you must be around cultured people. If you would like to live a healthy life, you must do it with healthy people.
I’ve also come to appreciate that shared values are the single most strongly indicated prerequisite for a successful marriage or long-term relationship. When I was younger, I would have said that the most important thing is intellectual parity and intelligence. This is not exactly correct; a couple with disparate intellects but with a wide base of common inner-cultural understandings and unspoken agreements on what’s important, right and wrong will be a lot more durable than two brilliant people who are what you might call “civilisationally incommensurable”. It just happens that common values among intelligent people tend to necessitate common intelligence; if education, literacy, and higher-order self-actualisation are important values for both of you, it will logically come to pass that you will get together only if you’re both smart.
The iceberg to watch out for here is falling out of the bottom. This is easier to do in a large, highly individualistic society largely bereft of traditional drip pans. You’re free to fall through the cracks, and in the name of official tolerance, everyone will let you. However, as I said in #3, your fellow human beings aren’t actually as fluid, bendable and tolerant as the brochure advertises.
I said above: “It’s astoundingly easy to fling yourself clear out of respectable society if you don’t see past the veneer of tolerance…” The meaning here is that if you lose your connection to the people who are truly important to you and let the aspects of life you most value slip through your fingers, it may be hard to get them back.
Here, too, the folk American sociology tells some lies: people are forever saying it’s easy to make a fresh start. Perhaps. But while the world is large, the parts of it in which you’d want to hold membership are much smaller. They talk. The most important lesson of all in my twenties is that this culture wants for a resurrection of the pedagogical primacy that used to be placed on the concepts of reputation, word, pride and dignity. If someone had informed me that most of the world still believes in these things, I would have made rather different choices, as would–hopefully–many of my peers.