Advocacy = empathy + speaking to someone else’s conceptual framework. When I’m trying to convey an idea from my head to someone else’s head, the biggest challenge is converting from my conceptual framework and values to theirs. Hence, the words that work:
For example, one partner in a conversation might use concepts like power, tradition, and authority to make a case, while the other might rely on science, statistics, or fairness. One person might argue with tons of emotional insight, while someone else might bring up studies and peer reviews.
Advocacy rarely succeeds without a lot of empathy: understanding your conversation partner and letting go of the little details to reach a new local maximum of mutual understanding.
Yesterday I was handed a fresh, nifty new laptop. This is, for me, mildly terrifying. The last time I did a clean operating system install was seven years ago. I’ve carried an idiosyncratic mixtape of dotfiles, macOS preferences, files, and cruft with me on my personal laptop ever since.
A brand-new, stock laptop is a shock to my highly acclimated and particular system.
I started contemplating how exactly I could get set up relatively quickly. At the same time, I want to pay down a little bit of automation debt. By the next time I’m faced with this situation (when I buy my own computer, if a disk is struck by lightning, etc.), I shouldn’t feel so much like a deer in headlights.
At first I thought I’d attempt to transmogrify my current lightsaber into something like Gina Trapani’s dotfiles. I like how this is structured, and that the initial setup of apps and Unix-y things is bootstrapped by Homebrew. But then I remembered Thoughtbot’s laptop and dotfiles and convinced myself this was the way to go.
Indeed, laptop helped me cut the Gordian knot of setting up my new machine so I can write code and feel at home on it. I highly recommend it if you have the means.
New dotfile repo forthcoming!
I’m starting a new job tomorrow. I decided to take a week off in between jobs, mostly to make a quick trip to Disneyland.
I hid most social media apps away, stopped paying attention to news, and caught up on reading. I gave myself a three-day weekend to decompress before our trip, spent three days at Disneyland, and had another three-day weekend to relax before I start the next thing. I’ve done a fair bit of writing, watching movies, tinkering with Ableton, and playing games too. A great vacation sandwiched between two staycations, in the lexicon of our times.
My mind feels like it’s had a chance to reset and get back to a neutral state. I’m hoping this will help me keep my frame of mind pointed forward as I start the next job. This was a great decision, and I highly recommend you do something like this yourself (granted, Disneyland isn’t everyone’s thing), if you have the means.
My previous forays into machine learning left me a little frustrated: I could tell there were language, patterns, and notation to this, but I couldn’t see them past the novelty of new-to-me words like sigmoids, convolutions, and hidden layers. Turns out those are part of the language.
But the really handy idioms are encoded in TensorFlow’s high-level model-and-layer API. A model encapsulates a chunk of machine learning that can be trained to classify inputs (images, texts, etc.) based on a mess of training data (pre-classified stuff). Every model is built from a network of layers; layers use linear algebra to transform numbers into classifications.
Once you’ve built a model, you feed it a bunch of training data so that it can learn the coefficients and other number-stuff that goes inside the math-y network. You also provide it with an optimizer and loss function so that as the model is trained, it knows whether it’s getting better or worse at classifying data.
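In code, that looks roughly like this. A minimal sketch using tf.keras, TensorFlow’s high-level model-and-layer API in Python; the layer sizes, optimizer, and loss here are illustrative choices of mine, not prescriptions:

```python
import tensorflow as tf

# A model is a network of layers; each layer is a linear-algebra
# transform of its inputs, usually followed by an activation function.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # 28x28 image -> 784 numbers
    tf.keras.layers.Dense(128, activation="relu"),    # hidden layer
    tf.keras.layers.Dense(10, activation="softmax"),  # one score per digit, 0-9
])

# The optimizer nudges the coefficients during training; the loss
# function is how the model knows whether it's getting better or worse.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```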
A really cool thing is that you run this training process on your computer’s GPU. GPUs, like machine learning models, are big networks of fast math-y stuff. Beautiful symmetry! On the other hand, you usually can’t fit your training data set into GPU memory, so you end up batching your training data and submitting it to the GPU in loops.
Once all this runs, you’ve got a trained model that can take image inputs (in this case, hand-written digits) and classify them as decimal digits (0-9). Magic!
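Continuing the sketch above (still tf.keras; the epoch and batch numbers are arbitrary): Keras handles the batching loop for you via batch_size, feeding the GPU one chunk of training data at a time.

```python
# MNIST: 60,000 hand-written digit images for training, 10,000 for testing.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels into 0..1

# The whole training set won't necessarily fit in GPU memory, so it's
# submitted in batches of 64 images, looping over the data 5 times.
model.fit(x_train, y_train, epochs=5, batch_size=64)

# The trained model classifies a new image as a digit 0-9.
scores = model.predict(x_test[:1])  # ten scores, one per digit
print(scores.argmax())              # the most likely digit
```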
For some reason, identifier schemes that are globally unique, coordination-free, somewhat human-readable, and efficiently indexed by databases are a thing I really like. Universally Unique Lexicographically Sortable Identifier (ulid, for humans) is one of those things. Implementations are available for dozens of languages! They look like this: 01ARZ3NDEKTSV4RRFFQ69G5FAV.
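The spec is simple enough to sketch from scratch: 48 bits of millisecond timestamp, then 80 random bits, all rendered as 26 characters of Crockford base32. A minimal Python sketch of the idea, not any particular library’s API:

```python
import os
import time

# Crockford's base32 alphabet, per the ULID spec (no I, L, O, or U).
# Because it's in ascending order and the timestamp occupies the most
# significant bits, ULIDs sort lexicographically by creation time.
ALPHABET = "0123456789ABCDEFGHJKMNPQRSTVWXYZ"

def ulid() -> str:
    """Generate a 26-character ULID: 48-bit timestamp + 80 random bits."""
    timestamp = int(time.time() * 1000)  # milliseconds since the Unix epoch
    value = (timestamp << 80) | int.from_bytes(os.urandom(10), "big")
    chars = []
    for _ in range(26):
        chars.append(ALPHABET[value & 0b11111])  # peel off 5 bits at a time
        value >>= 5
    return "".join(reversed(chars))

print(ulid())  # something like 01ARZ3NDEKTSV4RRFFQ69G5FAV
```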
Paul Ford’s website is twenty years old. For maybe half that time I’ve been extremely jealous of how well he writes about technology without being dry and technical. When I grow up, I’ll write like that!
How Awesome Engineers Ask For Help. So much good stuff there that I can’t pick just one quote. There’s something in there for new and experienced engineers alike. In particular: don’t give up, actively participate in the process of getting unstuck, take and share notes, and give thanks afterwards.
The best time to work on your dotfiles is on weekends between high-intensity project pushes at work. No better time to do some lateral thinking and improving of your workflow. Feels good, man.
Over the past few weekends, I’ve been reading on two topics that are way outside my technical comfort zone. I’ve spent the majority of my software development career building web applications, and neither of these overlaps much with web apps right now:
blockchains, cryptocurrencies, and autonomous contracts
machine learning, neural networks, general purpose GPUs, deep learning
With blockchain stuff, there are very interesting fundamentals underlying a sprawling system of hype and information asymmetry. Every time I go in, it’s “shields up!”: time to defend myself from people trading reputation for short-term speculation or actively spreading inaccurate information. In other words, here come the snake oil salesmen!
That said, there are cool ideas in there. Solidity is a language built into Ethereum for writing programs that run on the blockchain. You wouldn’t want to build a normal application this way, but if you want some degree of confidence in a system, like voting or accounting, a system built with Solidity on Ethereum might make sense. Even more strange, to a web developer: you have to pay, in Ether itself, for the compute time your program requires. Strange and intriguing!
By comparison, machine learning is equally hyped but attracts far less speculation. The two involve about the same level of mathematical and computational complexity, which is probably how I’ve managed to avoid both so far: I’m far better at social reasoning, which is a big deal in web applications, than I am at math. But I’m trying to change that!
I found deeplearning.js, and it seems like a nice gateway into the domain of building neural networks for machine learning, computer vision, etc. It utilizes your GPUs, if present, which is pretty neat, because GPUs are strange little computers we seem to have more and more of as the days go on.
No idea where this line of thinking is going. All I know is it’s more fun than reading about yet another client or server side framework. ;)
I watched pal Drew Yeaton work in Ableton briefly, and it was pretty incredible. He laid down a keyboard-and-drums beat, fixed up all the off-beat stuff, and proceeded to tinker with his myriad synthesizers and effects rack with speed. I had no idea what his hands were doing as he moved between MIDI keyboard, mouse, and computer keyboard like a blur. Seems pretty cool!
I talked myself into and out of porting this website to Jekyll three times over the past week. Hence, the writing dropped off, which is silly because just last month I blogged about not tinkering with blog tools. WordPress.com doesn’t quite do the things I want it to, and its syntax highlighting is keeping the dream of the nineties alive. I’m writing these short-form bits in lieu of a sidebar thing for now. No idea how I’ll make do with the code highlighting.
The Good Place is an amazing show. Ted Danson, Kristen Bell, and the rest of the cast are fantastic. There is an amazing-for-a-comedy twist. Do not read the internet until you watch the first season of this show. It’s just started its second season; get on board now!
I did the preliminaries for this last year and ended up turning back from it. I understand Docker and virtualization superficially at best, and I don’t want to impose them on teammates. It’s still too hard to search for Unix-y error messages and fix your development environment. Trying to figure out whether your host Unix, Docker, or a virtualized Unix is the problem is not something I wanted to inflict on someone else.
Is Amazon Lightsail a move by AWS into the space occupied by Linode, Digital Ocean, etc.? Related to virtualized localhost setups: someone write me a thing to drop my dotfiles from macOS onto a Digital Ocean, AWS, etc. instance so I can do development from an iPad, a keyboard, and an SSH client.
Hammerspoon is a really cool way to do all-the-things with your keyboard and some Lua. I use it to launch or switch to my most frequent dozen apps, plus some light Markdown helpers. But something about it is correspondingly creepy: it can, theoretically, scoop up every keystroke (which, to be honest, is true of every bit of open source I install via Homebrew). Maybe I could replace it with a clever bit of Alfred workflow and scripting: catch a triggering keystroke, then give me a constrained list of apps to switch to. Yes, this is a very strange way to hit Command-Tab! I wonder how well a few custom Alfred workflows would fit into a dotfiles repo.