Writing computer code can be so easy — and so hard at the same time

This is what coding is: You type some special words into a text editor. It probably has a black background. Some of those words turn different colors as you type them. Once you’ve typed all the special words, you save, or save and build, or press compile. And then you look at the output. If it doesn’t look right, you go to a white and orange website and hunt around until you find someone who has done more or less what you want to do and you copy and paste that.

This is also what coding is: You’re at a whiteboard. It’s covered with mathematical symbols. There are Greek letters up there. A colleague is drawing some dotted lines. Someone is talking about memory management. You all have at least one PhD.

When we talk about coding, we’re talking about everything from putting together basic HTML for a static page to creating software that runs custom embedded systems inside nuclear power stations. Coding is easy and coding is extremely hard. It contains multitudes.

I do easy coding. Strings. Strings are bits of text. That’s what I write. Computers like to know what they’re dealing with, so when you give them information, generally you tell them ahead of time what you’re going to give them: I’m going to give you some text (a string, for example, “Hello Computer, you think you’re so smart?”) or I’m going to give you a whole number (an integer, for example, 7) or I’m going to give you a decimal (a float, for example, 7.1). After a while, it all gets a bit maths-y.
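
In code, that conversation looks something like this. This is a TypeScript sketch (a language that makes you spell the types out); the variable names are mine, and other languages write it differently:

const greeting: string = "Hello Computer, you think you're so smart?"; // some text
const wholeNumber: number = 7;  // a whole number (TypeScript files integers and decimals together as "number")
const decimal: number = 7.1;    // a decimal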

In coding, different types of data are referred to as “types.” There are lots of types. Inevitably, I use most of them from time to time. But let’s be honest, most of what I do is passing around strings. That’s the case with many websites. On Twitter you type a string into a box, Twitter sends it to a database, and it shows up in a big list. On WhatsApp, you type a string into a box, and it gets sent to someone else’s screen. Obviously there is more going on (scale is a killer), but if you wanted to throw together a cheap Twitter clone over a weekend, that’s what you’d do — pass some strings around.
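
And the cheap weekend Twitter clone really is only a few lines once you strip the scale away. A minimal sketch, again in TypeScript, with an array standing in for the database and all the names made up:

const timeline: string[] = [];  // the "database": just a list of strings

function post(tweet: string): void {
  timeline.push(tweet);         // a string goes in
}

function render(): string {
  return timeline.join("\n");   // the strings come back out, in a big list
}

post("Hello Computer, you think you're so smart?");
console.log(render());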

So mainly I use strings. Splitting them up here, adding them together there. I pass them around and I write some basic logic control flows. It’s all reasonably straightforward.
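
Concretely, the splitting and the adding look like this (the names are illustrative):

const fullName = "Ada Lovelace";
const parts = fullName.split(" ");            // splitting up: ["Ada", "Lovelace"]
const greeting = "Hello, " + parts[0] + "!";  // adding together: "Hello, Ada!"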

We call these CRUD apps. Not out of rudeness, but because of what they do: Create (new items in a database), Read (from a database), Update (items already in a database), and Delete (from a database). At a computer science level, we often make the same app over and over again. In theory, this is bad. Developers don’t like to repeat themselves. They have mantras like SSoT (single source of truth) and DRY (don’t repeat yourself). But in practice, it’s much easier and better for everyone to make the same things over and over again with small differences.
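
Spelled out, a CRUD app is just those four verbs. A toy sketch of one, with a Map standing in for the database and the function names mine:

const db = new Map<number, string>();
let nextId = 1;

function create(body: string): number {  // Create: a new item in the database
  const id = nextId++;
  db.set(id, body);
  return id;
}

function read(id: number): string | undefined {  // Read: fetch it back
  return db.get(id);
}

function update(id: number, body: string): void {  // Update: change an existing item
  db.set(id, body);
}

function remove(id: number): void {  // Delete ("delete" is a reserved word)
  db.delete(id);
}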

Smart people who use advanced applications and are experts in finance or mechanical engineering, but who have never tried development, look over my shoulder when I’m coding and go to pieces. In their heads, coding is a mystical art, and to misquote Arthur C. Clarke’s famous adage, it is sufficiently advanced to be indistinguishable from magic. I type incantations into this black screen (if, then, while, do, else, class, border) and a website appears. I’ve summoned it with those magic words. I might as well be Harry Potter saying “NPMicus Compilate!” and waving around a wand.

Recently, a colleague came across some garbled nonsense output on a computer screen. I forget what it was now. Maybe a base64 encoded string, a load of hex, or a memory address to an error somewhere. Whatever it was, it was a page full of nonsense characters.

“Oh,” he said, “I guess you feel right at home here. It’s like what you do all day.” He was making a joke, but I could see that to him, a screen full of unreadable hex code was indistinguishable from a screen of basic markup.

Personally I blame The Matrix.

There’s a famous British television sketch from the 1960s. Three comedians line up, arranged from tallest to shortest. The one in the middle turns to the taller one and says “I look up to him because he is upper class,” then turns to the shorter one and says “but I look down on him because he is lower class.” In 1960s England this was considered hilarious.

It may not be true for the British class system anymore, but it feels like it still represents coding skills pretty well. “I look down on him because he doesn’t know the HTML syntax for creating a link. But I look up to him because he does complex functional programming in F#.” (Note him in both cases. What can I say — software development has a diversity problem.)

People who don’t code think that syntax is the hard bit. Learning how to write the magic words to make the computer do what you want it to.

if (answer == true) {
  doTheThing();
}

And yes, the first time you try coding, there are things that might confuse you — all the brackets, and double or even triple equal signs. The curly braces are weird, the way of writingWordsWithCapitalLettersAndNoSpaces or of writing_words_with_underscores seems like a strange affectation. Not to mention putting () at the end of words. And the semicolons. God, the semicolons. Everywhere. They’re scary enough in normal written English without them being here too.

Code is unusual at first. But for most modern coding, a small bit of experience is all you need to get started. Some estimates say there are 20 million people writing code around the world today. The vast majority of these coders are building small sites on top of the world’s APIs. Calling Google Maps here to add navigation, calling Stripe there to take payments. And when they get stuck, Stack Overflow is there for them to ask questions, copy and paste from, and laugh at jokes about parsing HTML with regexes and Unicode.
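
Day to day, "building on the world's APIs" looks something like this. The endpoint and response shape below are entirely made up; real services like Stripe or Google Maps have their own URLs, keys, and SDKs:

async function findPlace(query: string): Promise<unknown> {
  // Hypothetical geocoding API, for illustration only.
  const response = await fetch("https://api.example.com/v1/geocode?q=" + encodeURIComponent(query));
  if (!response.ok) {
    throw new Error("API returned " + response.status);  // the dreaded 403 lives here
  }
  return response.json();
}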

And yet, websites take ages to build. And everything crashes or gets hacked. Every app is broken. The output seems disproportionate to how easy the activity of writing code is. Writing code means making lots of small decisions and taking lots of small actions, each of which is easy on its own, but which add up to something hugely complex. Code is easy and code is hard.

There are lots of points to make here about code complexity: That you write code once, but you read it many times. That there are many parts you need to hold in your mind at once. That you need 100% test coverage, but even then you don’t know if you’ve tested everything. That documentation is important and comments are good (or comments are bad). All of this is to say that naively we think coding is telling the computer what to do, but really it’s making sure other humans (including ourselves in the future) know what we are telling the computer to do.

As an activity, coding is simultaneously very easy and very difficult. It plays to the strengths of the human mind and also to its weaknesses. We’re very good at coming up with ideas and expressing them precisely. We’re very bad at holding thousands of those ideas in our mind at once and reasoning through every one of them without overlooking any.

In that sense we’re the opposite of computers. Tell a computer to iterate over a billion items and check each one, and it’ll do it flawlessly in an instant. But ask it to come up with a good app idea and it’s not even that you’ll get bad ideas like Tinder for cats or Uber for vacuum cleaners. You won’t get any ideas at all.
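
If you want to feel that asymmetry, here is a task that would take a human several lifetimes and a computer a second or two (a trivial TypeScript example; the check itself is arbitrary):

let count = 0;
for (let i = 0; i < 1_000_000_000; i++) {
  if (i % 7 === 0) count++;  // "check each one": here, divisibility by 7
}
console.log(count);          // a billion checks, zero mistakes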

I worry sometimes about the widening gap in computer knowledge in society. In the class sketch, I am the middle-class man, Ronnie Barker — I look up and I look down. I have days where I worry that I don’t know C++, that I’m not good enough at Git, and that I like PHP more than I should. Then someone asks me for help with their computer. A website isn’t loading properly. “Have you cleared your cache?” I might ask.

“My what?” they reply, a look of utter incomprehension on their face. In their eyes, I am a computer God.

I work at a large organization with 20,000–40,000 co-workers, depending on how you define their employment status. Within the company, there are people with PhDs in computer science working on huge machine learning systems. And there is a man who once, when I showed him what button to press in Outlook to sort by file size, picked up a marker pen and drew a circle on his monitor screen. These people are equally senior in the organization.

In an environment like this, managing technology is difficult. So although I say I do easy coding, it’s easy coding for difficult business problems. And after a while all that simple code starts to become difficult. Code is easy to write and difficult to think about. It is easy to understand at a line level, and difficult to understand at an application level.

At a certain point in your career as a developer you stop doing any actual coding. I’m not even talking about management. I’m talking about debugging configuration files and troubleshooting why a downstream system isn’t working. Our modern digital world is built on the shoulders of slightly doddery giants. And so the code you write is microscopic compared to the time you spend trying to work out the vagaries of the giants whose shoulders you’re standing on. Why didn’t the Diggle API respond with a valid response? Why is Glurk 2.0 returning a 403? (These products are made up — but it’s only a matter of time.)

By no means do I want to downplay the difficulty of writing extremely performant complex algorithms. But let’s be honest, you’re probably not involved in that. Most likely you’re trying to allow a user to select from a dropdown. At least that’s what I’m usually doing.

All of this is to say that the difficulties of code are difficulties of thought. We hear of “computer bugs,” but those bugs are almost never caused by someone coding incorrectly, but by someone overlooking one of the implications of their code. They’re still mistakes, of course, but they’re not due to a rogue semicolon.
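
For example, every line below is syntactically correct, and the bug is still there (a TypeScript sketch, names mine):

function averageScore(scores: number[]): number {
  let total = 0;
  for (const score of scores) {
    total += score;
  }
  return total / scores.length;  // overlooked implication: an empty array gives 0 / 0, which is NaN
}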

This results in a slightly strange conversation with new developers. They want to get better at writing more complex, beautiful, elegant programs, using ever more obscure commands and precise conventions. If they’ve done computer science, they’ll be especially keen on algorithms. But when they ask what they need to do next, I’ll talk about naming things clearly, using consistent, simple control flows, and speaking to users to understand what the real problem is. This is probably frustrating for them. Coming up with a name seems easy, as does just talking to someone. And it’s difficult to focus on that when they want to work on more challenging problems. But hey, what can I say: Coding is hard, right?
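
For what it's worth, here is the unglamorous version of that advice in practice. Both functions below implement the same logic; only the second one tells the next human what is going on (names mine):

// Before: correct, but the reader has to decode it.
function chk(d: { a: number } | null): boolean {
  if (d != null) {
    if (d.a >= 18) {
      return true;
    } else {
      return false;
    }
  } else {
    return false;
  }
}

// After: the same check, with clear names and a flat control flow.
function isAdult(user: { age: number } | null): boolean {
  if (user == null) return false;
  return user.age >= 18;
}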
