Expanded ideas
I am a unique snowflake
Every software person is as special and unique as they think they are. But things go weird, in my experience, when I try to express my snowflakeness in production code. If I want to be weird or try something new, I should at least do it in a side/passion/mastery project. Even better: hobbies!
Ideas for Twittering better
When it comes to Twitter, things can get out of hand fast. Setting aside the hostile environment some people face when they participate in Twitter (which is setting aside a doozy!), it helps to have a few defense mechanisms for what appears in your stream.
Most importantly, I evaluate each potential follow by the rule of “smart and happy”. Which doesn’t mean smart, angry people are automatically off the list. But, they have to show a really unique intelligence to get past my emotional filter. I made a graphic to boil down my “should follow?” decision:
[Image caption: How to decide to follow someone on Twitter.]
Non-brilliant and happy? Probably in! Brilliant and happy? Probably in! Smart with a little bit of edge? Maybe. Just angry? No thanks.
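The quadrants above can be sketched as a tiny decision function. A rough sketch: the parameter names are mine, not from the graphic, and the "maybe" case is collapsed into a boolean for simplicity.

```python
def should_follow(smart: bool, happy: bool, uniquely_smart: bool = False) -> bool:
    """The 'smart and happy' follow filter, roughly as described above."""
    if happy:
        return True   # happy? probably in, brilliant or not
    if smart and uniquely_smart:
        return True   # smart with an edge: only a truly unique intelligence gets past
    return False      # just angry? no thanks

print(should_follow(smart=False, happy=True))   # True
print(should_follow(smart=True, happy=False))   # False
```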
Information overload, confirmation bias, and overwhelming negativity are also handy things to manage. I do a few things to keep my head above water and a not-too-dismal outlook on life:
- Don't worry about keeping up. It's impossible. That's OK!
- When I have stuff that needs doing, shut it down. The tweets will go on without me.
- Follow people with a perspective different from your own.
- Keep a private list for high signal-to-noise follows. Good friends and people whose ideas I don't want to miss end up here.
- But follow a lot more people as a firehose of interesting and diverse voices.
- When on vacation: don't even care about Twitter. Disconnect as much as possible.
I hope one of these ideas can help you Twitter better!
Hype curve superpositions
It seems, these days, that technologies can exist in multiple phases of the hype curve, simultaneously. Two data points I read this weekend:
- Node, which I personally place somewhere between "trough of disillusionment" and "plateau of productivity", is in the "exceptional exuberance" phase for the author of Monolithic Node.js
- Ruby, which I personally place in the "plateau of productivity" phase, is in the "trough of disillusionment" for the author of The Ruby Community: The Next Version
In short, I strongly disagree with both of these opinions. But I think that’s not the useful datapoint here. The takeaway is that both viewpoints can exist simultaneously, in their own context, and not be entirely wrong.
Missing the big picture for the iterations
I.
Driving in Italy is totally unlike driving in America. For one thing, there are very often no lane markers. Occasionally a 1.5 lane road is shared by two cars moving in opposite directions. Even if there were lane markers, it’s doubtful Italian drivers would heed them. Italian traffic flows like water, always looking for shortcuts, ways to squeeze through, and running around temporary obstacles. For an American, driving in a big Italian city is a white-knuckle affair.
My conjecture is that the unspoken rule of Italian drivers is “never break stride”. Ease in and out of lanes, blend in at traffic circles. There’s almost a body language to Italian driving by which you can tell when someone is going to merge into your lane, when a motorbike may swerve in front of you, or when a tiny delivery van is going to blow past you on a two-lane road.
II.
Start with the result. I find myself mired in optimizing for short-term results that I can incrementally build upon. This is a fine tactic, especially when getting started. It’s a nice way to show progress quickly and keep making progress when rhythm matters.
But, it’s a tactic. To make a musical analogy, it’s how you write a song, not how you write a whole album. At some point I need a strategy, a bigger idea. I need a result in mind.
III.
I love to tinker with new technology. The grass is always greener with new languages, libraries, tools, etc. I’ve learned a lot this way, and kept up with the times. I’ve got lots of surface-level experience with lots of things. But increasingly I want more experience with deeply accomplishing or understanding something.
IV.
Driving in Italy was extremely jarring for me at first. It closely resembled chaos. Eventually, I got used to it, at small and medium scales. (But never drive in Rome or Milan.) Now, I sort of miss driving in Italy, at least the good parts. I miss the freedom to overtake other drivers without having to swerve through lanes, and I miss not stopping at traffic signals any time there’s an intersection.
Maybe this is a reminder, for me, that getting out of my routine (American driving) isn’t so bad. Worth the initial shock. Maybe my routines, my tactics, my tool/library/language novelty seeking, were helping me along as much as constraining me.
Maybe the big picture result, not the iteration, is the thing and how you get there (highly ordered American driving or seemingly unordered Italian driving) is of less consequence.
Sometimes
- Sometimes you fall into a writing slump. Usually, just throwing something at the wall is how you undo that.
- Sometimes you notice that a lot of your writing can end up in platitudes and that deepens your slump. We regret the error, those responsible have been sacked.
- Sometimes you get kinda caught up in learning, tinkering, and enjoying things and forget to write. That's ok!
18 months is a smelly interval
18 months is a dangerous window, when it comes to building a product. It’s far enough in the future that it seems like you could deliver an ambitious idea within a year and a half. But it’s a long enough timeline that one is tempted to skip the necessary contemplation, dividing-and-conquering, and hypothetical thinking that the planning process forces on you.
As it happens, 18 months is roughly 540 days (if you count 30-day months). And 540 degrees is one and a half revolutions. As in, you started a revolution, but ended up regressing.
Saddest trombone. Be wary of anything that promises to happen in 18 months. It’s the “epic handwaving” of project management.
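The degree pun above is just this bit of arithmetic, assuming 30-day months:

```python
months = 18
days = months * 30        # 540, give or take (real months average closer to 30.4 days)
degrees = days            # read each "day" as one degree of a turn
revolutions = degrees / 360

print(days, revolutions)  # 540 1.5 -- one and a half turns: a revolution, then a regression
```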
Megaprojects: megacool
Megaproject. It’s a cool word. It’s an even cooler list-of-pages on Wikipedia. I’ve only worked on projects limited to tens or dozens of people. The human and geographical scale of some of these endeavors just blows my mind. The coordination and planning required for something like the Boeing 747 or Apollo program is beyond my comprehension. (OK, maybe I’m still really into aerospace; I did go to Space Camp. Twice.)
I would love to be a fly on the wall of the meeting where it’s decided to go ahead with a project that will be visible from quite some height above the earth like Denver International Airport or Walt Disney World. That’s a pretty huge commitment. I waffle for weeks when I decide I’m going to buy a new car!
If those don’t whet your appetite, perhaps speculative megastructures are more your speed? Trans-global highways! And of course, Dyson spheres, ringworlds, et cetera.
It’s good to remind myself that, occasionally, humankind is capable of building really great stuff.
Vacation, disposable, and calm computing
1
Let me talk about vacation computing. The prime directive of vacation computing is that you should compute on vacation as little as possible. Neglect your email, abandon your social mediums. Don’t do the things you normally do, regardless of how computery your regular work is.
From there, it follows that your vacation computer should basically not be a computer. That means smartphones, tablets, and book readers are the only options. But smartphones are pretty much synonymous with social media, so they aren’t really viable as a vacation computer (though you probably want one anyway, because it’s a superpower). Tablets are nearly computers now, so that’s not viable either.
It follows that a book reader is the only acceptable vacation computer.
2
Let me talk about disposable computing now. We put a lot of important stuff on our computers these days. Important passwords, legal documents, email, family pictures, private pictures, computer games, purchased and bespoke music, Hollywood and home video, etc. Sometimes those computers are in our pockets, sometimes they’re on our laps and coffee tables, and occasionally you might still find them on our desks!
For the drama and heartbreak that can occur when we lose these computers, we take astoundingly bad care of them. We don’t back them up, we reuse passwords. A moment without wireless networking is the worst and yet we don’t take steps to prevent even more dramatic losses due to password breaches and storage failure.
Given all of this, a computer is made better by making it a disposable object. Backup your data, and backup your backups. Practice good password habits as much as possible so your accounts are isolated and somewhat disposable. Know your gameplan and what happens to your stuff if your computer or backups fall into a lava pit.
3
Knowing about vacation and disposable computing, I’m led to an odd and dissonant conclusion: an e-ink Kindle is the perfect computer. It does not do work, it does not social media. You can take it through airport security without any extra steps, which feels a little perverse and seems a bit surreal. It does not interrupt, it does not beep or blorp, it just barely displays text. As modern computers go, it’s basically useless.
But. You can read on it. And reading is so wonderful. And you can put stress aside. A Kindle gets wet? Not a big deal. Drop a Kindle? Not a big deal. Try to use it by the pool, out in nature, out in weather, out where the internet does not go? Not a big deal. Lose your Kindle? Buy another one, it costs a fraction of all your other computers.
The one scenario where you will find yourself absolutely screwed with a Kindle is when you have to enter text. Logging into Amazon or a wireless network for the first time? That’s a bad time.
In every other respect, the Kindle is a computer that does nothing to increase your stress level. That’s pretty remarkable today. Let’s make more calm computing devices, ok?
Apple, Disney, and obsession
People in technology disproportionately like to comment on Apple’s products and business. Outside of technology, there are just as many folks who love to obsess over Disney’s theme parks. Based on my friend networks, I’d wager that for every person who obsesses over Apple’s keynotes, there’s a Disney enthusiast keeping up on special events and the best way to enjoy the theme parks.
The connection that strikes me is that both of these companies pay more attention to details than their competitors. Apple’s competitors throw software and hardware at the wall like spaghetti, hoping it will stick. Disney’s competitors rename their rides to match the blockbusters of summer. By comparison, Apple makes a big deal about the fit and finish of their hardware (let’s not talk about that camera though!) and has a coherent story about how all their products fit together into a useful landscape. Disney carefully arranges their parks to keep the guests in a cohesive movie world and pays attention to the little details that enhance or optimize the experience.
I could make some value judgement here about how attention to detail is more profitable, better design, better engineering, or whatever. I suspect all of those are true, but it’s not what excites me about Apple or Disney. When I read about changes to Disney’s theme parks or Apple’s keynotes, I’m excited that there are companies, quite large and successful ones, that are connecting a lot of dots in an intriguing way. They’re extracting delight from large scale complexity. Megaprojects are nifty and often enhance humanity, but they’re mostly out of reach or out of sight. It’s nice that some of us can experience and enjoy these commercial projects of vast scale and quality execution.
Multiplication over management
When a developer becomes a manager, it’s not a promotion, it’s a career change:
If you want to do your leadership job effectively, you will be exercising a vastly different set of skills on a daily basis to what you are exercising as an engineer. Skills you likely haven't developed and are unaware of.

Your job is not to be an engineer. Your job is not to be a manager. Your job is to be a multiplier.
Don’t miss the section on how we undervalue non-technical skills. It’s not unlike developing software, it’s just that your levers are people and processes instead of software and data centers. See also, Managing Humans.
Make systems from goals
Use systems to get where you’re going, not goals:
My problem with goals is that they are limiting. Granted, if you focus on one particular goal, your odds of achieving it are better than if you have no goal. But you also miss out on opportunities that might have been far better than your goal. Systems, however, simply move you from a game with low odds to a game with better odds. With a system you are less likely to miss one opportunity because you were too focused on another. With a system, you are always scanning for any opportunity.
Applies to personal life, biz, programming, hobby, whatever. Use goals to figure out what systems you need in place, then get habits and systems going to make those goals, or something better, happen.
Yet another way you can use your skills as a developer to construct a system that really solves the problem, and not a symptom of the problem!
Businesses can empathy too
Empathy is sometimes described as a personal trait, but it’s a skill, a skill that can be learned, that can be honed, and that can be instilled as a core value of a company.
The post is about applying empathy to the core values of a business, how it shapes the actions and culture of the institution. But it’s a reminder empathy is about people, and how they experience you, your team, or your employer.
Follow up reminder: empathy is your most important skill. I’d go further than saying it’s a skill; those I’ve met with outstanding empathy basically have superpowers.
Stop me if you've heard this one
Lately, I find myself stopping to make sure I haven’t previously written the thing I’m currently writing. For starters, I have a horrible method for moving things out of my “I should finish this idea” folder into the “I wrote about this idea!” folder. It doesn’t help that I often draft articles in my head while I run, shower, or do chores and then forget that I had the thought. It’s kind of a mess in here.
Assuming that I slip up and write the idea down twice, hopefully it’s in a way that doesn’t look like I’m plagiarizing myself. Is it weird to write about the same thing multiple times, if it’s nearly the same idea?
I hate repeating myself, telling the same stories over and over. “Have I told you this one before?” is a frequent prologue to great stories. But is it necessary? Hearing a mediocre story twice is slightly painful, but hearing a great story twice is no chore at all.
If I keep writing an idea, coming back to it, maybe there’s something important there? Perhaps it’s still bouncing around in my brain for a reason. I haven’t fully wrapped my head around it, or articulated the idea in a way I find satisfying and essential.
This is a personal website; the line between “just play the hits” and “stop trying to make fetch happen” doesn’t have to be so strong here as it would on, say, the New York Times. So, stop me if I write about an idea so much I run it into the ground. It’s just that I’m trying to get it out of my head in the right way.
Let the right something in
There will always be more somethings we want to do than we have time to do. Right? Maybe.
- A lot of the right somethings can add up to a great thing, even if the somethings aren't of the highest quality or express the biggest idea.
- A lot of the wrong somethings aren't that interesting, unless your work is generally of great enough import that historians take an interest in it.
- A lot of the wrong somethings may not add up to much at all and are unlikely to attract the interest of historians.
- If you don't care about whether something's great, you can produce a lot of somethings.
- If you don't care if something expresses a big idea, you can produce a lot of somethings.
A lot of truisms will tell you that quality is the important thing and quantity is secondary. But perhaps there are all sorts of cases where that’s not entirely true.
Mozart wrote way more music than Beethoven; Beethoven’s was more sophisticated but their bodies of work are considered on the same level. There are way more episodes of Law and Order and all its spin-offs than Breaking Bad; one made more money, one gathered more acclaim.
Rather than deciding to pursue quality over quantity, perhaps it’s better to:
- Choose your somethings with care
- Execute on the idea central to those somethings
- Produce as many somethings as possible without hating the quality of your work
The false bad guy
A pet peeve, in writing and thinking: introducing a false antagonist to create tension in a story. This is rampant in tech writing. Apple vs. Google, Ive vs. Forstall, Rails vs. Django, Ember vs. Angular, etc. But there’s no there, there. It’s only filling space.
I would love to have Ernest Hemingway weigh in on introducing a bad guy where one isn’t needed. I suspect there would be a lot of cursing involved.
Quit your desk
Things I’ve quit doing at my desk:
Many writers maintain a private writing hut. The hut has one purpose: it’s the place they go to write. They don’t do anything else there. Once they can’t write any more, they go do something else. I think we need to think of our desks in the same way: these are places where we get work done.

I like my desk, but I know the hours I can sit at it and get work done before fatigue sets in are finite. I try to mix in standing at our bar-height dining table, sitting on the couch (most recently, with three dogs), working from coffee shops and occasionally sitting on the front or back porch.
The big idea from that article, burning a hole in my head, is that we should step away from our desks when we’re not working (for me, telling computers to do things). Thinking can happen on a walk, standing outside, or in the shower. Socializing can happen from the couch or mobile device. Procrastinating by reading, surfing, social networking, etc. can happen anywhere.
Once I freed my mind from the idea that I’m only working the moments my butt is in a chair at a desk in front of a computer, my work improved and my life got better. Quit your desk and find out for yourself.
Coffee and other warmups
Making a cup of coffee sometimes helps me prepare for the process of solving puzzles with computers. Something about choosing AeroPress, French press, Chemex, or Clever; heating the water to 212F or 200F; medium-fine, medium, or coarse grinding of the beans. The weighing and grinding of the beans, boiling the water, rinsing the filter, pouring the water, waiting, pouring more water, agitating, pressing the coffee, discarding the filter and grinds. Now I’m left with a cup that I made for myself. A minor victory for the day.
All sorts of things require warm-ups. Stretching, air-squatting, or a quick jog lets my body know it’s almost time to exert itself. Word association or playing little games tells my brain it’s time to improvise.
Updating some documentation. A tiny, superficial refactoring or layout change to some code. Drawing a picture in my notebook. Making some coffee. That’s how I know it’s time to solve puzzles.
Off my grid
Ignorance: pros and cons
We can often, but not always, choose to ignore those on the internet, on TV, and in our lives with different ideas, philosophies, or opinions about the world. Whether intentional or accidental, this is ignorance.
Ignorance is handy because it can keep us sane. We can’t know all the things or have all the experiences. We all value things based on our own experiences and learnings. We cross-reference that with our ego and emotions and come up with our “truths”. Conflicting “truths” can hurt, and so we only let some kinds of them in and trim our lives to exclude the others. This is helpful for reducing stress and making for more happy days.
It’s not great though, because it isolates us from seeing more of the world and understanding it more clearly. Many media fights/beefs/arguments are rooted in conflicting “truths” and collisions of ignorance. You ignore the value of a supportive government, I ignore the value of maximum personal liberty, and boom! we’re arguing. We’re not getting things done.
Personally, that arguing is stressful. I’d rather not get worked up about politics, governance, and technical minutiae if at all possible. Therefore, I selectively engage in ignorance. I try to double check my assumptions and ignorance occasionally; I find ignorance is a useful tactic, not a long-term strategy.
If you could imagine a world where empathy ruled and everyone possessed a superpower for compromise, you might see a world where ignorance isn’t so much of a problem and amazing things can get done. Oh, what a fantastic, science-fiction world!
Ignorance is bliss and that which prevents us from achieving really big things. Use your ignorance carefully and with consideration.
Twice the podcast listening
I like to listen to podcasts and screencasts at two or three times the recorded speed. The application I use (Instacast) does this with pitch correction, a feature that’s probably built into iOS at this juncture. In short, I can listen to a thirty minute podcast in ten to fifteen minutes and they only sound funny when music plays. I do mean funny; listen to Radiohead’s “Creep” at 3x speed and it comes out downright chipper.
Our brains can process speech at these accelerated rates just fine. In fact, when I listen to some of my favorite podcasters in “real” time, they sound like they’re thinking really hard and speaking slowly, or that they’re flat-out drunk. The interesting bit is when an accelerated speaker has an accent or when there is radio interference with the FM transmitter I use in the car. At this point, all bets are off and I have to slow the podcast down or listen when the signal is better.
The bottom line is that, empirically, human speech has built-in redundancy. We tend to speak at a rate that, if you miss some sounds, you can probably still make out the words. Further, the space in-between words is probably filled with our own thoughts anyway; we only listen part of the time we’re listening.
Nifty things, our brains are.