musings

Random thoughts and writings.

"I Don't Trust Anything That We Didn't Build"

The problems started small, as they often do. But as we've seen many times before, lots of small problems in quick succession tend to make one big problem.

In this case, the problem got big fast. It started off easy enough: read the bug report, find the bug, fix it, the usual. Our bug-tracking team located the source of the issue right away, and my team set about trying to work out the fix. We found the data source that was causing the issue, and it happened to be a web service owned by another team. We couldn't see into it; we could only see the inputs and outputs. It was essentially a black box to us.

Here's where the bad part starts. Due to a lack of efficient communication on all sides, impending deadlines, frustrated coders and general hesitancy to deal with this particular, ahem, time-intensive project, the actual bug fix took about four days to nail down. Yes, four days; we were just as annoyed as you are, and probably more so.

To make a long story short, the project we were attempting to deal with was:

  • old,
  • slow,
  • in desperate need of a rewrite,
  • using a data source which we had no visibility into (the aforementioned service),
  • not written by anyone currently on the team,
  • still our responsibility to fix AND
  • needed to be fixed right friggin now.

You should read that list and cringe a little for each bullet point. I know I did.

All of those problems put together (plus the fact that it took us four days to figure it out) prompted my manager, normally a well-reasoned, thoughtful individual, to say during our bug post-mortem:

"I'm starting to really loathe this project. It's getting to the point where I don't trust anything that we didn't build."

I have to say, it's hard to blame him for being wary of anything that was not invented here.

It's incredibly easy for a software development team, even an experienced one like mine, to fall into the comfortable trap of believing that everybody else's code is terrible and their own is awesome. We developers often forget (or, quite possibly, want to forget) that most of the time the bug is in our code, no matter how much we wish that it wasn't.

Do this enough and the untamed wild of other people's code starts to look like an intimidating place. It's safer, easier, to believe that your code is correct and everyone else's is wrong, because that means you don't have to spend time trying to understand what the other person was thinking, or, just as often, figuring out how you are wrong, something nobody enjoys doing.

I've written before that I believe code should exist for a reason. The difficulty in working with other people's code is that not only are you trying to understand what the code does, you're trying to comprehend the reason why it does that. That's a difficult thing to accomplish in the best of times (efficient communication being a feat we usually manage only by accident), and when you're approaching a deadline and trying to have a meaningful conversation with the original developer, who has his own deadlines and responsibilities to deal with, it can be nigh impossible.

Let me be perfectly honest: there are times I completely understand my manager's frustration. It would be SO much easier if the only code I had to deal with was my own, because then the only stupid person in the equation is me and I can fix that. Dealing with other stupid people is infinitely more frustrating than dealing with your own stupidity.

To be clear, I am not calling my coworkers stupid; they are in fact quite the opposite. But it's tempting to fall back to lazy thinking and believe they are stupid merely because they were dealing with requirements and scenarios that I didn't have time to thoroughly understand. That temptation, to believe that things are stupid because I don't understand them, is something I find myself fighting against on a daily basis. It's an innate human quality, and not unique to programmers or other technical people.

Here is a basic fact of life: people, on the whole, are not stupid. Programmers do not write code for no reason, as the best code is no code at all, and if we could have our way there would be no code, ever. But because code needs a reason to exist, it almost certainly had a set of requirements, or scenarios, or something which shaped its current form. Even if those requirements were merely thoughts in the original developer's head, they existed. It is not the fault of that developer that some idiot who saunters up to a laptop and tries to break her code doesn't understand what said code is meant to do.

But it's easy to think that, isn't it? It's easy, it's simple, it's lazy. When we don't have time or energy to think, really think, the lazy thoughts are what we are left with. Given that programming is an almost-entirely-mental task, accepting the lazy thoughts as fact could even be seen as a reprieve from needing to think critically all day, every day.

Resist the lazy thoughts. Resist the idea that your fellow programmers are stupid, or wrong, or only doing a half-done job. Resist Not Invented Here syndrome. Resist the idea that because someone didn't understand you, they're dumb. Resist all these little thoughts that end up with a conclusion of "those other people are stupid," and instead try to answer "what were they trying to accomplish?" There's nothing wrong with digging a little deeper for a better understanding.

That's what I say to you: resist the lazy thoughts, and dig a little deeper. You will eventually have to trust something you didn't build. If you keep digging, you'll find what you are looking for.

Post image is Digging a hole for the digester from Wikimedia Commons, used under license.

In Praise of the Junior Developer

"She's a project," my boss said to me. "She's green, and even though she's been working here for several months, you should consider her like a brand new college graduate. She'll need a lot of oversight, a lot of hand-holding and you'll still be expected to finish your projects on time."

"Excellent," I said.

I'm not sure why my boss felt the need to warn me. Melissa is a new, green developer. I know that; I knew that from the moment the coding section of the interview began. She had trouble with the FizzBuzz test, stumbled over the differences between an abstract class and an interface, and generally showed that she needs some experience. This ain't my first interview; I know she's a newbie, that she'll need some help (maybe a lot of help) to get started.
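
(If you've never come across it, the FizzBuzz test just asks the candidate to write a short loop like the sketch below; this is a minimal C# version of the classic rules, nothing more.)

```csharp
using System;

class FizzBuzz
{
    static void Main()
    {
        // Print 1 through 100, substituting "Fizz" for multiples of 3,
        // "Buzz" for multiples of 5, and "FizzBuzz" for multiples of both.
        for (int i = 1; i <= 100; i++)
        {
            if (i % 15 == 0) Console.WriteLine("FizzBuzz");
            else if (i % 3 == 0) Console.WriteLine("Fizz");
            else if (i % 5 == 0) Console.WriteLine("Buzz");
            else Console.WriteLine(i);
        }
    }
}
```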

And it doesn't friggin matter. I'm happy to have her on my team.

We need junior programmers. And I don't mean my company specifically, I mean our industry as a whole. This profession runs on the backs of the juniors. They're the ones who get the crap work, who check the nitty-gritty details, who learn and advance and become seniors and leads and managers. They are the people who will eventually replace me. I'm not scared of them; I want to see them succeed. But to do that they're going to need teachers, mentors, someone to help answer their questions.

I've been saying for a long time that what this profession needs is more teachers of technology. The problem, I'm now beginning to realize, goes much deeper than that. The real issue is not a lack of people willing to teach others, it's a lack of compassion in doing so.

The odd thing about programming is that it's entirely mental (with all the different usages of that word applying). There's no real physical component. Sure, we type on a keyboard, but that's the result of the mental work in progress, not the end goal. And it's tiring. Programming is mentally exhausting work. If the brain is a muscle, then programmers work it to the metaphorical bone each and every day. Good programmers solve people's problems using code in an efficient, skillful manner.

When you think about it, that's also the issue with teaching: it's a mentally exhausting job. Now, instead of trying to work out problems, you're trying to work out how different people learn and then teach to their skills, which is infinitely more difficult than just solving static problems. People are dynamic, changing, varying from one to the next. If you're a good teacher, you can innately understand how people learn, and then construct situations in which they will acquire the skills they need in the most efficient manner.

Here's the rub: the skills that make you good at programming (solving problems) and the skills that make you a good teacher (solving people) are not, and never will be, fully compatible. You can be the greatest damn programmer in the world, and yet anybody you try to teach will be just as bewildered as before, if you don't have compassion for the learner.

Compassion bridges the gap between teacher and learner. The less skilled you are at teaching, the more compassion you need to have for your learner.

(I'm tempted to use the word patience here instead of compassion, but you can be patient and still not be compassionate. It's the difference between a boss needing work done and a teacher helping a student study; they might both wait a long time and be perfectly fine with that, but the boss will simply expect the work to be done, while the teacher will understand why it took so long and work with the student to improve.)

Look, not everyone will be a good teacher. Not everyone will be a good programmer. But those of us who are in a position to help others learn should take advantage of that. At the very least, attempting to teach others will help your communication skills; by describing a problem, you increase your understanding of it, otherwise rubber duck debugging would not be a thing.

But everyone can be a compassionate teacher. I can, you can, even newbies like Melissa can. Skill doesn't matter, ability doesn't matter; compassion matters. Compassion is what makes a good programmer into a great teacher.

So, bring it on, Melissa. Bring it on, junior devs. We need you. And we'll be doing our level best to be compassionate, to be teachers, even if we're not very good at it. Despite all our misgivings, despite all the hate and impatience and intolerance you might run into out there in the wilds of the Internet, there are still those who want to see you succeed, and are willing to use their compassion to help you reach your goals. This industry runs on the backs of the junior developers, and we would be loath to forget that.

Post image is Representatives of the database development team from the U.S. Department of Agriculture, used under license.

Show Up, Kick Ass, Go Home

I refuse to work overtime. In the five years I've been at my current company, I've worked overtime exactly once, and that was because our server was literally on fire. Overtime is just not worth it to me.

I'm a salaried employee. A rather well-paid salaried employee, at least compared to many other professions. In the United States where I live (where I am classified as an "exempt" employee), that means that I will not be paid for work done above and beyond 40 hours a week. So, as far as I am concerned, my employer pays me to work 40 hours a week. I show up on time, I kick ass for 8 hours a day, and then I go home.

What I don't do, at least not on a regular basis, is work overtime.

Unpaid overtime dilutes your hourly rate. If you get paid a salary of $60k per year, that's approximately $29/hr if you work 40 hours a week. If you work just 5 hours more a week (45 hours per week), your hourly rate diminishes to approximately $26/hr. You've just devalued yourself by $3 an hour. Further, you've told your company that that's what you're worth, since they're already paying you a set amount. From their perspective, overtime is free work, and who would turn down free work?
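
If you want to check the math yourself, here's a quick sketch (it assumes a 52-week year; the salary and hours are just the figures from the example above):

```csharp
using System;

class RateDilution
{
    static void Main()
    {
        const decimal salary = 60_000m;   // annual salary from the example above
        const decimal weeksPerYear = 52m; // assumes you're paid for all 52 weeks

        // Effective hourly rate = salary / (weeks per year * hours per week)
        decimal RateAt(decimal hoursPerWeek) => salary / (weeksPerYear * hoursPerWeek);

        Console.WriteLine($"40 hrs/week: ${RateAt(40):F2}/hr"); // $28.85/hr
        Console.WriteLine($"45 hrs/week: ${RateAt(45):F2}/hr"); // $25.64/hr
    }
}
```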

And for what? I'm an American, but one ideal this country seems to hold is absolutely ludicrous to me: living to work. I don't live to work; I've written before that I live to live, to do things with my family. I don't want more money; I already have enough that my family and I can live comfortably, if not extravagantly. I want more time.

Time is the one thing I can't ever get more of. No amount of salary negotiations, of GitHub commits, of stand up meetings can ever replace the time with my family that I lose when I work. And "lose" is the correct word here; it's not time I can make back up.

I have to wonder: why do so many people do this? Why do so many people commit themselves body and soul to a company, to work? I don't have any proof, but I personally think it has a lot to do with the illusion of control.

See, in many people's lives, things are simply beyond our control. We can't always protect our children from every little thing; we can't always get that promotion we so desire; hell, we can't even always catch the damn Pokemon that we need to complete our collection. But we can do our job. We can file the correct paperwork, we can write the appropriate tests, we can get all the appropriate projects planned out months in advance. Those are things we can control.

Control is a big deal. Anything we can control, we tend to hold on to for far longer than we should, far longer than is rational (not that humans are always rational, of course). After all, why lose something when all it takes is our hard work to make it worthwhile?

But it's not. Hard work, work above and beyond what you get paid to do, is not worthwhile. It's the opposite of worthwhile, because it diminishes the amount of time you get to spend on other activities. It reduces the time spent with your family, with your loved ones, with your hobbies that give you purpose. It gives us control, but it also wastes our time. It's a time-sink.

Now, at this point in my life, my most valuable commodity is not money, it's time. I can't get any more, no matter how hard I work. I have a limited amount of keystrokes left in my life and I refuse to voluntarily use them up for some company, some effort, some goal that I don't believe in. I've done that before, and it never works out.

Fellow salaried employees: don't work overtime, at least not on a regular basis. Your time is more valuable than that.

Code Is Ephemeral, Concepts Are Eternal

Lots of people ask me things like "should I learn MVC or Web API first?" "HTML or JavaScript?" "Angular or React?" etc. After all, there are only so many hours in the day, and what time we have to spend learning is often limited by other factors (energy, work policies, etc.). This leads to the most common question I get from junior programmers: What framework, stack, or language should I spend my precious time learning?

I always tell them the same thing: It. Does. Not. Matter.

It doesn't matter if you pick Angular, or ASP.NET, or Ruby, or Java. It does not matter if you want to build web sites, or iOS apps, or Windows programs. It does not matter if you're a fresh-out-of-school graduate or a 30-year programming veteran. All of these technologies, all of these platforms, will ultimately help you learn the same things, the same tried-and-true best practices. Just pick one!

Remember: you will be obsolete someday. That will happen, especially in a business where you must continually stay on top of your own learning in order to do your job. You have a finite number of keystrokes left. Therefore you should spend your limited time learning whatever will stave off that obsolescence for as long as possible.

Concepts fight obsolescence. Even when ASP.NET inevitably dies, the concepts I've learned from programming in it for ten plus years will still be useful. Concepts have a longer shelf life than details, because details change. Languages are born and die, frameworks become unpopular overnight, companies go out of business, support will end. But the thoughts, the ideas, the best practices? They live forever.

Learn about SOLID. Learn KISS, DRY, and YAGNI. Learn how important naming is. Learn about proper spacing, functional vs. object-oriented, composition vs. inheritance, polymorphism, etc. Learn soft skills like communication and estimation. Learn all the ideas that result in good code, rather than the details (syntax, limitations, environment, etc.) of the code itself. Mastering the ideas trains your mind to warn you when you are writing bad code (as you will inevitably do).
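
To make just one of those concepts concrete, here's a minimal sketch of composition vs. inheritance in C# (all the type names are hypothetical, invented purely for illustration): rather than inheriting its formatting behavior from a base class, the report is composed with a formatter and delegates to it.

```csharp
using System;

// Hypothetical types, purely for illustration.
interface IFormatter
{
    string Format(string content);
}

class JsonFormatter : IFormatter
{
    public string Format(string content) => $"{{ \"body\": \"{content}\" }}";
}

class Report
{
    private readonly IFormatter _formatter;

    // Behavior is injected (composition), not inherited,
    // so swapping formats requires no new Report subclass.
    public Report(IFormatter formatter) => _formatter = formatter;

    public string Render(string content) => _formatter.Format(content);
}

class Demo
{
    static void Main()
    {
        var report = new Report(new JsonFormatter());
        Console.WriteLine(report.Render("hello")); // { "body": "hello" }
    }
}
```

As a bonus, the same sketch brushes up against the "D" in SOLID: Report depends on an abstraction, so supporting a new output format means writing a new formatter, not a new subclass.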

Don't fret about the how. How you learn the concepts is irrelevant. It doesn't matter what framework you like, what stack you use, what technology you're currently in love with. Just pick one, learn that, master that, and remember some of the pain you had to deal with for the next project. Write a small project, post it to GitHub, blog about it. Get some experience with it! Experience is king, and nothing can substitute for real-world experience.

Code is ephemeral, concepts are eternal. Code is static; it will die, fall apart, degrade. It may take a long time, years or decades, but it will happen. But the concepts that programming is built on do not die; they live on.

So again I pose the question: what should you spend your precious time learning?