*...or how two pieces of writing by computer programmers changed my life after I learned to "code".*
I want to lead with an apology. This article _can_ and will be classified as TMI by many. TMI stands for Too Much Information, and a lot of information will follow in the darned first person: I, me, mine, myself, et cetera. Unfortunately, this text being a semi-personal, somewhat intimate story, I see no way to avoid that entirely.
My name is Andy "progapandist" Baranov and I learned to code eight years ago. Right after graduating first from a Ruby on Rails bootcamp and then _failing_ to graduate from a Computer Science program at VU Amsterdam (although I will forever be grateful to Andrew "Type A Personality" Tanenbaum of MINIX fame for teaching me Networks 101)—I stumbled onto one of the most important online articles for new programmers: [Teach Yourself Programming in Ten Years](https://norvig.com/21-days.html) by Peter Norvig.
This seminal text, now translated into more than 20 languages, is, unlike the one you're reading, a concise and clear set of instructions on how to develop your programming expertise over ten years: the span of time widely considered to be enough practice to _master_ anything.
Eight years into the track, I want to share how a different piece of writing, a poem called "The Zen of Python" by Tim Peters of [Timsort](https://en.wikipedia.org/wiki/Timsort) fame, changed the way I see code. Reading the poem is easy: all you need is a Python interpreter and the magic words `import this`, which are something of an in-joke in themselves. Understanding it, I believe, requires years of practice, and I'm finally ready to say I know it by heart.
![[Pasted image 20240521203756.png]]
## Beautiful is better than ugly.
This subheading will have a period despite everything the Journalism school gave me. There, they taught me that titles should never contain a period, at least not at the end. This particular occurrence is what programmers call a "corner case", so I am allowed some wiggle room. This section's title is a direct quote that contains a period, so we _can_ use a period. Look:
![[Pasted image 20240516105325.png]]
This line was written by Tim Peters in 2004. That same year, I celebrated my first year as a night-time International Desk editor in the newsroom of the All-Russian State Television and Radio Broadcasting Co (what a mouthful!). If you just got triggered by the "All-Russian" part, please watch [this report](https://www.youtube.com/watch?v=r61CXKGrwl4) by ARTE to see why I:
a) Haven't worked for Russian TV since 2016 legally, and since 2014 (the annexation of Crimea) de facto;
b) Will probably never go back to Russia or journalism in my lifetime, judging by how things look now.
My job in 2004 consisted of inserting a Betacam SP tape into the studio deck at exactly the right moment, when Associated Press TV (APTV) or Reuters started transmitting video feeds from the 2003 Iraq war (American invasion, technically); watching the transmission live to the end; rewinding it back and forth to cut a 30-second "montage" of tanks shooting at bridges; and then writing the text that the anchorperson in the newsroom would read out from the teleprompter live on air a few hours into the morning, once the whole broadcast had been prepared.
Back to the beauty.
Fast forward 20 years. I am now a senior backend and infrastructure engineer with some notable career wins and losses under my belt. The beauty I first saw in magnetic Betacam tapes and decks, in the early golden hours of the morning when my cameraman and I filmed a chain of Syrian refugees crossing the Hungarian border on foot along empty, rusty train tracks—that same beauty I now see in code, in functions and classes, in deployment scripts and YAML configs.
A lot has been said about the niceties, expressiveness, and [MINASWAN](https://en.wiktionary.org/wiki/MINASWAN)'y "happiness optimisation" of Ruby, which happened to become my first "proper" language (meaning I used it to help develop the backend for what later became a big travel startup, although none of my code made it into prod, of course).
After that initial rush of moderately successful web programming ("Look, it's alive, it does things!"), it took me years to understand that this is NOT beautiful:
```rb
%w[foo bar baz].map(&:capitalize).reduce(Hash.new { |hash, key| "" }) { |acc, tng| acc[tng] = []; acc }
```
By year 8 of my programming career, I believe in senior developers (actually, I don't believe in these distinctions, not really) writing junior code and adding _lots of comments_ to help newcomers to the codebase understand what they're dealing with.
This is _more_ beautiful\*:
```rb
output = {}
input = %w[foo bar baz]
# We need the things, as capitalized thangs, to be keys pointing to
# empty arrays for such and such product reason...
input.map { |thing| thing.capitalize }.each do |thang|
  output[thang] = []
end
output
```
_* This is subjective, and I do not urge you to put this code in production; I am merely trying to make a point for explicitness and for making every step super clear._
Leaving aside all the talk about aesthetics, it is pretty much common knowledge these days that looking at beautiful (aka "easier to grasp") things, as opposed to ugly ones, makes humans less stressed. In year eight of a ten-year programmer journey, managing daily stress becomes important!
As for efficiency and the fact that intermediate variables take up memory... well, you have already lost that argument if you are using Ruby or Python or any other interpreted language. If you are not dealing with low-level systems code where performance optimizations count, I would say that beauty trumps efficiency.
The Zen of Python goes on for 18 more aphorisms. One could even call them _truisms_, but one would never _smirk_ saying that. Those truisms are the truest truths in the business of writing software for humans. The software is executed by machines, of course, but ultimately it is written for us, humans.
> Those truisms are the truest truths in the business of writing software for humans.
## Explicit is better than implicit.
In the first Ruby example above, a new programmer feels drunk with the power of composition and forgets that writing code is nothing but composing machine instructions that robots will execute but _humans_ will read and improve upon. The code is not _explicit_: it _implies_ knowledge of what `&:method_name` does in Ruby, of what `map` actually returns, of how `reduce` gets initialized with an argument and a block, of what the "yielded" locals are, and so on.
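To unpack some of that implied knowledge, here is a small sketch of what those shorthands spell out to (the hash-building mirrors the one-liner above):
```rb
words = %w[foo bar baz]

# `map(&:capitalize)` is just shorthand for passing an explicit block:
words.map(&:capitalize)              # => ["Foo", "Bar", "Baz"]
words.map { |word| word.capitalize } # => the same, spelled out

# `reduce` takes an initial accumulator and yields it, along with each
# element, to the block; whatever the block returns becomes the
# accumulator on the next iteration.
words.reduce({}) do |acc, word|
  acc[word.capitalize] = []
  acc # must hand the accumulator back, or the next iteration breaks
end
# => {"Foo"=>[], "Bar"=>[], "Baz"=>[]}
```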
I would dare to say that the second verse of Tim Peters' genius poem is a disambiguation of the first one; at least that is my personal interpretation. In the world of the arts, the implicit _is_ often the essence of beauty. Negative space, the _lack_ of an object or of a mention in a painting or a poem, can cause our hearts to skip a beat, but in the field of software programming _implying_ always leads to things getting uglier by the minute (or by the commit).
## Simple is better than complex.
Have you ever tried to write safe concurrent code? If you have, you would probably agree that Golang's approach, beautifully put by [Mr. Rob Pike](https://go.dev/talks/2012/concurrency.slide#1) himself:
> Don't communicate by sharing memory, share memory by communicating.
...is simple to internalize, and it ultimately makes us better at cooking concurrent compute than, say, Ruby's approach, which combines processes, threads, and something called ["ractors"](https://ruby-doc.org/core-3.0.0/Ractor.html), with so many ways to shoot your RAM in the foot that I would not even start.
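To be fair to Ruby, those very Ractors can be used in the Go spirit. Here is a minimal sketch of "share memory by communicating", assuming Ruby 3.x: the worker owns its state, and everyone else just sends messages.
```rb
# No shared counter, no mutex: state lives inside the ractor,
# and the outside world communicates by sending values.
worker = Ractor.new do
  total = 0
  while (msg = Ractor.receive) != :done
    total += msg
  end
  total # becomes the ractor's final value
end

[1, 2, 3].each { |n| worker.send(n) }
worker.send(:done)
puts worker.take # => 6
```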
Simple is better than complex. That is the sort of truism all of us, humans, instinctively understand.
## Complex is better than complicated.
Okay, let's take a little pause here. It's been smooth sailing till now, but this one requires some graphing. Wanna know what the difference between complex and complicated looks like on a chart? Here it is:
![[Pasted image 20240521202225.png]]
The figure above shows a plot generated by [churnalizer](https://github.com/gosukiwi/churnalizer)—a Ruby gem by Federico Ramirez that certainly needs more love than it currently enjoys. According to the gem's author, the inspiration for the visual Churn vs. Complexity analysis comes from two sources: Sandi Metz's blog post [Breaking Up the Behemoth](https://www.sandimetz.com/blog/2017/9/13/breaking-up-the-behemoth), and [another great article](https://www.stickyminds.com/article/getting-empirical-about-refactoring) by Michael Feathers linked from Sandi's post.
The idea is very simple: run `churnalizer my_app_dir` on your code project and then see if there are any dots in the upper-right corner of the plot. If there are, you're in for fun refactoring!
Any module that finds itself in the "fun zone" has two distinct features:
1. It has an insane [cyclomatic complexity](https://docs.codeclimate.com/docs/flog) score, also known as a `flog` score in the Ruby community, thanks to the slightly BDSM-named family of libraries that can put numbers next to the mess of real-world production code.
2. The file changes often. Meaning it is both _complicated_ for humans to parse and an important part of the _complex_ machinery that is your application.
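If you want a quick, zero-dependency approximation of the same analysis, a rough sketch might look like this (file length stands in, very crudely, for a real complexity score; `churnalizer` and `flog` do this properly):
```rb
# Back-of-the-envelope churn-vs-complexity scan for a git repo.
# Churn = number of commits that touched a file; line count is a
# crude stand-in for a proper complexity metric.
churn = Hash.new(0)
`git log --name-only --pretty=format:`.each_line do |line|
  file = line.strip
  churn[file] += 1 if file.end_with?(".rb")
end

churn.sort_by { |_, count| -count }.first(10).each do |file, count|
  next unless File.exist?(file) # skip files deleted since
  puts format("%-50s churn=%3d lines=%4d", file, count, File.readlines(file).size)
end
```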
I once worked at a company where the whole unreadable ball-of-mud of a codebase, which nevertheless earned millions of dollars, hinged on the fate of one file that described a (leaky) abstraction over how the business domain model should work.
I also worked at another company where I met a programmer who soon became my number one role model for who I want to be once my "ten years to become a programmer" have passed. You know what his favorite motto was?
> More Epic Visualisations!
By epic visualizations, Guido meant graphs, schemes, diagrams, any sort of visual cues that help us better understand the software we write.
This brings us to the next couple of lines.
## Flat is better than nested. Sparse is better than dense. Readability counts.
In the imaginary world of my employers (we don't want to dox anybody, do we?) there was a Company X that relied heavily on _seeded_ database data: data that did not come from any sort of UI, be it customer-facing or admin. That data was of the utmost importance for the business; I'd dare say of _existential_ importance.
As the input for that type of data lacked _any_ external interface, it was configured internally as a bunch of YAML files stored in the same repository as the application code.
Well, "a bunch" is a heavy understatement, as there were thousands of files. Each of existential importance. The files created Extremely Important Entities (EIE) in the database on each deploy of the application. Those EIEs, surely, had relations to each other, and those relations were set by _manually pasting_ UUIDs into YAML files. Besides this whole setup being sheer madness by any standard, the YAMLs lived in a deeply nested directory structure.
Up to six layers of nesting, in fact. My attempt to change that was ill-fated from the beginning, as more senior co-workers reported that they were _used_ to the structure being a certain way, and that questioning the number of levels was entirely moot and beside the point.
Eight years into my ten-year journey, I fully understand and partially relate to this sentiment, yet I still believe that everyone's life would have been much better if, instead of...
```sh
folder1
├── subfolder1
│ ├── subsubfolder1
│ │ ├── subsubsubfolder1
│ │ │ ├── subsubsubsubfolder1
│ │ │ │ ├── subsubsubsubsubfolder1
│ │ │ │ │ ├── file1.txt
│ │ │ │ │ └── file2.txt
│ │ │ │ └── file3.txt
│ │ │ └── file4.txt
│ │ └── file5.txt
│ └── file6.txt
└── file7.txt
```
...Company X and its _accustomed_ employees could have had:
```sh
folder1_subfolder1_subsubfolder1_subsubsubfolder1_subsubsubsubfolder1_subsubsubsubsubfolder1_file1.txt
folder1_subfolder1_subsubfolder1_subsubsubfolder1_subsubsubsubfolder1_file2.txt
folder1_subfolder1_subsubfolder1_subsubsubfolder1_file3.txt
folder1_subfolder1_subsubfolder1_file4.txt
folder1_subfolder1_file5.txt
folder1_file6.txt
folder1_file7.txt
```
...and then maybe trim some of those sub-paths, as surely it could be fewer than six!
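The migration itself is a throwaway script. A sketch, assuming the tree above and flattening into the current directory:
```rb
require "fileutils"

# Turn every nested path into a flat, underscore-joined file name.
Dir.glob("folder1/**/*.txt").each do |path|
  FileUtils.mv(path, path.tr("/", "_"))
end
```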
Humans are just better at grasping flat lists, and that is exactly why OS makers like Apple have, for the past decade or so, relied on things like Spotlight and on surfacing the smart Recents folder at the top of the Finder sidebar: so people can just drop their files into _flat_ folders and rely on fuzzy search to fish out the needed file on demand.
I can bet that hitting a shortcut and typing a few letters is _faster_ than drilling down a hierarchy of directories. However, this is only my personal opinion; I am not able to cite any studies that would actually prove it.
I still can bet though!
## Special cases aren't special enough to break the rules. Although practicality beats purity.
Guido, my role model from the Big Commerce Company (BCC), was ruthless about me cutting corners in my PRs. You know, that line that calls an API the class should not know about, just to make things easier, and to get into production faster. This was a good demonstration of the first verse: special cases aren't special enough to break the rules.
The second verse—*Although practicality beats purity*—is a mantra I repeat to myself every time I want to show off with a *pure function* or pick up yet another FP book. It's all fun and games until the entropy rises and chaos reigns, as it always does in the human world populated by software.
## Errors should never pass silently. Unless explicitly silenced.
Company Y, which I once had the pleasure to work with, declared a "Zero Exception" policy. No, that did not mean the exception caused by a division by zero; it meant that we should have _no uncaught exceptions_ in our code. While I consider this undertaking noble, I would not recommend anyone follow that path to the letter: it will almost inevitably lead to a point where important exceptions are _implicitly_ silenced by something along the lines of a bare `rescue`, and you should consider yourself lucky if that rescue sends anything to an APM like New Relic or Datadog. If not, you're in for an exciting bug hunt that will take some explaining to your PM, who will be standing over your head with a handful of Jira tickets.
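To illustrate the difference (the names `charge_customer`, `Billing::CardDeclined`, `logger`, and `order` are hypothetical stand-ins), implicit versus explicit silencing looks roughly like this:
```rb
# Implicitly silenced: a bare rescue swallows everything,
# including the bugs you actually wanted to hear about.
begin
  charge_customer(order) # hypothetical billing call
rescue StandardError
  # nothing here: good luck explaining the missing revenue
end

# Explicitly silenced: one known, named error, with a trace left behind.
begin
  charge_customer(order)
rescue Billing::CardDeclined => e
  logger.warn("card declined for order #{order.id}: #{e.message}")
end
```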
Never pass errors silently. Never silence them implicitly. Better yet, never silence them. Let things rip. Someone will thank me later.
## In the face of ambiguity, refuse the temptation to guess.
This one sounds deeply philosophical... because it is! One can adopt this maxim as a general GQ-style "Rule of Life", regardless of whether you code at all. Daniel Kahneman literally [received a Nobel Prize](https://en.wikipedia.org/wiki/Daniel_Kahneman) for showing that humans are notoriously bad at guessing: our evolutionary firmware, firing on all cylinders, makes a myriad of guesses each second about the world around us and the obstacles we face, guesses that are sometimes _good enough_, but often also _not great_.
When ambiguity stares us in the face on a commercial programming project, the temptation to guess can cost human lives. Remember, remember, the ~~5th of November~~ [Patriot clock bug](https://www.reddit.com/r/WarCollege/comments/18xcf54/how_bad_was_the_patriot_clock_bug_and_would_it/).
## There should be one-- and preferably only one --obvious way to do it. Although that way may not be obvious at first unless you're Dutch.
I did my Computer Science degree (the one I never finished) in the Netherlands. In total, I spent about two years in Amsterdam, and these were not the happiest days by any measure: I was recovering from years of psychological trauma, having been forced to stretch my conscience and sometimes outright _lie_, live on air, for years, as my TV career progressed from night-shift newsroom editor to morning news to the prime-time bulletin, then to on-air correspondent, then to chief of the European bureau in Brussels, then Paris. Then my home country decided to annex a part of my other home country (both of my bloodlines come from different Russian and Jewish settlements in Ukraine), and my mind broke in half.
I abandoned a successful (not morally, but "professionally") career in media to take up programming using Zed Shaw's "Learn Python the Hard Way". This is also when I first discovered the `import this` Easter egg. And REPLs!
I was amazed by the immediate feedback and the interactive nature of coding in a REPL environment. It felt like a conversation with the computer, where I could quickly test out ideas and see the results in real-time. My life changed at that very moment.
Was it _the only way_ to do it? To abandon the mirrored palace of propaganda lies and build a new life in Europe? I would never know.
Go ask the Dutch!
## Now is better than never. Although never is often better than *right* now.
In my TV years, everything I had to do was _right now_. Go cover this right now: buy that plane ticket _right damn now_, pack clothes for we-don't-know-how-long, drive the company car out of the garage _now_, pick up your cameraperson with all the gear, check it onto the airplane, take off _now, now, now!_ Find the fixer, pay them loads of cash, drive a strange car to the border of a war-torn country, cross that border (technically illegally, but hey, the posts are abandoned), drive through the desert to the closest hotel that hosts the whole BBC-CNN-FRANCE24 media circus—now! Dodge that bullet, interview that victim, use this shitty GPRS connection to upload a minute of dramatic footage. Now! 5-4-3-2-1, ON AIR!
For a while, I used the same approach at my software jobs. Did it work? Yes, it even earned me some points for being that very "10x" programmer, as I coded FAST and _now_. Did it make my code better, and the systems I worked on ultimately more resilient? No.
Through years of practice I learned that it is more than OK to:
a) Say "no" to your EM. Say "no, I won't do it _right_ now, I need time to examine the outcomes". It is OK to take your time to prototype the solution before sharing it with anyone.
b) Say "I will need N hours/days/weeks" to simply _read_ code before contributing to it.
Try this, if you never have. You might lose your job if your engineering manager is hell-bent on Jira tickets (it happened to me once), but ultimately your colleagues will thank you. Plus, if you lose that job, it's not like it was a great place to thrive anyway, right?
## If the implementation is hard to explain, it's a bad idea. If the implementation is easy to explain, it may be a good idea.
Here goes the mandatory, proverbial rubber duck story that I am going to omit. Explain code to people, not machines. Explain. Code. To. People. Period!
## Namespaces are one honking great idea -- let's do more of those!
What can be said? There's no way we could write so much mediocre software without them!
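If one tiny, entirely artificial Ruby illustration helps: two modules can each own a class with the same name, and nothing clashes.
```rb
module Billing
  class Invoice
    def to_s
      "an invoice you send"
    end
  end
end

module Inventory
  class Invoice
    def to_s
      "an invoice you receive"
    end
  end
end

# The namespace does the disambiguation for us.
puts Billing::Invoice.new   # => an invoice you send
puts Inventory::Invoice.new # => an invoice you receive
```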
-----
As I reflect on my journey from a novice wannabe programmer to a somewhat seasoned engineer, I am grateful for the wisdom imparted by Peter Norvig and Tim Peters. Their words guided me through the ups and downs of my coding adventures, reminding me of the importance of patience, perseverance, and passion in the pursuit of mastery.
In conclusion, the art of coding is not just about writing lines of code; it is about crafting elegant solutions, solving complex problems, and pushing the boundaries of what is possible. By embracing the principles of beauty, simplicity, and clarity in our code, we can create software that not only functions flawlessly but also inspires and delights those who interact with it. And so, I continue on my journey, eager to explore new horizons, tackle new challenges, and write more beautiful code.
With love,
Andy B.
https://github.com/progapandist