How Information Technology became the realm of bigots

Imagine you are in the Emergency Room of a modest hospital, a seasoned worker trying to save a patient.

You have not even begun to diagnose the patient when someone with a fancy title comes up behind you and says: diagnose for what? We are losing money.

Let's shower the patient in gold flakes and put him on a three-day diet without water! That is exactly what works at the fancy hospital I come from.

You oppose the decision, saying it could kill the patient, and they laugh at you: well, who cares? We are neither financially nor legally liable, and who cares about the costs? The NHS will pay anyway.

Well, I do. A professional cares about costs and about the survival of the entity he works for. Oh, and if the patient dies, it is my fault, and I don't like people dying.

The patient I am going to talk about today is Information Technology and computer-related technology: it is sick and FUBAR (like foo/bar).

We have symptoms:
  • multiplication of dependencies and frameworks;
  • non-determinism in tools that used to be deterministic (systemd, C compilers);
  • non-transactional backends used for transactions;
  • bloatware that creates operational and energy cost sinks;
  • insecure software due to an ever-growing attack surface.
It all looks like what security engineers call a shift of concern.

A collective denial of what our job is all about.

I could stay theoretical on this one, or I could just share my daily experience of what happens in real life.

I am not really a coder; I am first an analyst: I never make a decision without investigating what the problem could be. I analyze, plan and anticipate before I code.

But the process of diagnosing a problem is unpredictable, which makes it hard to plan. And nowadays we have all these magical tools for gathering gazillions of data points. The funny part is that I never have the data I need.

H = −k ln(Ω).

Information is not having access to huge quantities of data, but to the few relevant data points. There are two ways to get more information: better discrimination (AI) or smarter data (science).
So the other day, there was a discussion about how to get insights working for our product: either take the upstream branch of the product we work on, or develop our own.

The problem was settled in 30 seconds: forking is bad, end of discussion.

The upstream solution means adding to a web stack of 500k lines of code that already includes Django, scipy, matplotlib, bootstrap, jquery, ruby dependencies on java, hadoop, elasticsearch and openstack.

Our company has no sysadmins and is pretty tight on developers.

What is hadoop used for? Time-series display of events occurring at a fairly low frequency. Something that could be handled in more than one way: events could be dispatched at creation time to carbon/graphite, or aggregated by a custom script in a crontab with even fewer dependencies, using only the python stdlib...
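The cron-script option can be sketched with nothing but the python stdlib; the timestamps and bucket width below are made up for illustration, just to show how little machinery low-frequency event counting actually needs:

```python
from collections import Counter

def bucket_events(timestamps, bucket_seconds=60):
    """Group unix timestamps into fixed-width time buckets and count them.

    The resulting {bucket_start: count} mapping is trivial to plot, or to
    push to carbon/graphite with a few lines of socket code.
    """
    counts = Counter((int(ts) // bucket_seconds) * bucket_seconds
                     for ts in timestamps)
    return dict(sorted(counts.items()))

# Three events in the first minute, one in the second:
series = bucket_events([0, 10, 59, 61])
# series == {0: 3, 60: 1}
```

Run from a crontab every few minutes, a script like this covers the "time-series display of low-frequency events" use case with zero new infrastructure.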

But no: in modern computer science, diagnosing before making a decision is bad because it makes the company lose time, and everybody knows that big data is the way to go; so if a tool is marketed as big data and used by big companies, it must be good.

I am fully aware this falls under the YAGNI principle, the subject of an article recently published on the internet.

But it does not end there, because this is one of many symptoms of cargo-cult science: borrowing the signs of science to look scientific while discarding any scientific methodology.

I did half a dozen internships in science. Science is not about theory and knowledge; it is a method based on making sure your model matches your observations. For this, measurements are important, and so is accepting that you may be wrong. And believe me, when you deal with the real world, you often are. You will eventually fail in your decisions!

I am a boring coder, longing for boredom.

I do not like crunch. I want to come home at the end of the day and enjoy the company of my family. I do hate death marches, useless innovations, and adding new servers to the architecture when I already have a swarm of 26 servers at 20% load handling 50 requests per second (any resemblance to my actual work data would be fortuitous and coincidental).

Did I tell you that the linux experts in my company do not know that the maximum CPU load on linux is CPU count × 100%? So the actual CPU load on most servers is more like 5%. And my newly hired experts, who make the infrastructure decisions, were scared when a consultant proposed to consolidate infrastructure. Let's use another VM instead! Yet another billing line in the never-ending growth of costs.
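To make the load arithmetic concrete, here is a tiny sketch; the four-core figure is my assumption (the article does not say how many cores the servers have), chosen because it turns the reported 20% into the 5% mentioned above:

```python
def utilization(reported_percent, cpu_count):
    """Convert an aggregate CPU load percentage, whose maximum on linux
    is cpu_count * 100, into a fraction of total machine capacity."""
    return reported_percent / (cpu_count * 100)

# On a hypothetical 4-core server, a reported "20%" load is
# only 5% of the machine's real capacity.
fraction = utilization(20, 4)
# fraction == 0.05
```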

Anyway, I would like to work with a boring stack. I really do not like wasting hours reading poorly documented code where "read the source, Luke" involves a buggy version of ansible (python), shell scripts, and Vagrantfiles (ruby) built on magical assumptions about environment variables; and not only dealing with js/HTML/CSS but also with 3 js frameworks, one of which adds yet another templating system duplicating templates we already serve in mako.

And when I ask why all this is so complex, the answer is always: because we trust our IT specialists, who are in line with what everyone else is saying.

I am the rebel here: the one wondering why no one backs financially binding choices, with clear operational impact, with an argument about the costs of the different options.

Logic has clearly left the realm of IT.

We are no longer in the realm of individual responsibility and accountability; we are in the realm of collective intelligence and benchmarking. We are well paid because we are supposed to carry responsibility: we can make choices that can sink a company. But it seems that following the herd to look innovative now matters more.
Is the herd a troop of lemmings jumping off a cliff? We all seem to be marching euphorically toward that cliff.

Except me: I don't like excitement and bromance.

I have a wife and a kid.

I hate documenting, yet I do document. Why? Because some operations require more state than I can keep in my head. So I document complex tasks. It is boring.

My coworkers also document complex tasks ... tasks they never actually performed, and that they wish would get done one day.

And I am like: oh, this is religion!

Documenting something that exists is sane.

Documenting something that does not exist is fantasy. Especially when there is no rationale for setting those tasks in motion other than "I learned during my training that this is the way it should be done."

The freaking One Best Way of the engineering schools. Where you never did anything with your hands, but are raised to think you are a special snowflake with the magical power to be better than others at organizing their work.

A short military saying: if you want to be good at giving orders, first learn to obey. What is funny is that these kinds of people seem good at writing a lot, yet unable to carry out a simple procedure they are given. Yes, I am an asshole: I eventually trapped one of these guys into having to perform a simple task that was documented.

How did we end up in a world that basically runs on magical thinking rather than scientific thinking?

How did observation and measurement come to be dismissed as less efficient than just doing what the majority does?

Because of Conway's law: the problems in your code reflect the problems in your organization. This mess of dependencies and complexity is a reflection of our organizational problems.

There is no longer any conceptual integrity (The Mythical Man-Month) ensured by a small team of knowledgeable coders.

If one project manager has experience with photoshop, I will have to generate documents as if we were a print workshop.

If a PR person is enthusiastic about a new article by an IT guru, we have to embed the technology.

We live in companies where the chain of command is utterly broken. Those with responsibility and accountability are no longer the ones making the decisions. And the mismatch between responsibility and authority is called stress. I am a boring coder: I hate useless stress, because an unstable production system is already stressful enough for me. For every functionality I prune, I am ordered to add four. The fight against entropy is lost.

The multiplication of dependencies simply reflects how many people are involved in the decision-making process: people who do not actually face the consequences of their actions. Faced with the scarcity of coders, IT has chosen to hire lots of average, quickly trained coders, and to value management above the skill of actually doing. Basically, skills are undervalued relative to social status.

It is also a negation of what coding is. Coding is about analyzing before coding. Most of a codebase's life is maintenance: being screened for defects and for better resource handling.

Management in IT, however, is about hiring manageable coders who will not fight stupid decisions.

But why?

And that concludes my pamphlet: because we are living through an IT bubble.

In a normal market, one that actually allocates resources efficiently, companies that do not care about efficiency disappear.

There is clearly a hidden incentive that pushes decisions toward the irrational.

50 requests/second (disclaimer: these are not the true figures, etc.) should not require 26 servers × 2, plus 3 rows of load balancers for continuous delivery, in a company of fewer than 50 people with a tightly controlled budget, unless that is the norm in the economic environment.
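A back-of-envelope check of those figures (which, as stated, are deliberately fuzzy) shows just how little traffic each machine actually handles:

```python
# How much traffic does each server in this hypothetical fleet handle?
requests_per_second = 50
servers = 26 * 2          # 26 servers, doubled

per_server = requests_per_second / servers
# per_server is below 1: each server sees less than
# one request per second on average.
```

That is a workload a single modest machine could absorb with room to spare, which is the whole point of the complaint.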

It is the norm if and only if money flows so freely that profitability does not matter.

Modern IT is just a sign that our economic system no longer works and that people have become irrational. Global warming is real. But because computers emit smoke only on very rare occasions (a warehouse catching fire), people do not account for the energy they consume.

The watts per request nowadays are ridiculously high. Who cares? It is green.
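To see the order of magnitude, here is a crude energy-per-request estimate; every number is hypothetical (the article gives no power figures), the 200 W being a rough guess for a lightly loaded 1U server:

```python
# Hypothetical figures, only to illustrate the order of magnitude.
servers = 52                      # the 26 x 2 fleet above
watts_per_server = 200            # assumed draw of one lightly loaded server
requests_per_second = 50

joules_per_request = servers * watts_per_server / requests_per_second
# joules_per_request == 208.0: roughly 200 joules burned per request.
```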

Modern IT has no legal or financial liability, and people base their businesses, and their own liability, on a layer of unreliable products (software and hardware) that carry no liability at all. Who cares? There is no incentive to deliver products that work as advertised.

The IT and computer industry revolution is based on the fact that nations have given this industry a wild card, removing from it any liability or responsibility. The industry is now used as a Trojan horse to break every right protecting consumers, stakeholders, citizens and workers. Uber, Amazon, Goldman Sachs, google, facebook all use the special treatment granted by the AS IS clause in every software license to ignore the laws to which businesses are normally bound, such as paying taxes.

As a rational person, I do not think this can end well, because this is a clear incentive to destroy the industries that actually handle resources efficiently: the ones that provide goods and services with economic rationality. When they are gone, the IT gurus will just cash out, take their profits and retire on a desert island, never held accountable for the damage they did. Why? Because they have a boulevard of incentives to do so.

Can we stop this craziness?

Yes: we remove the incentive, namely the absence of financial and legal liability. IT is not above the law: not above common law, nor economics, nor physics.

And what can YOU do?

Hire fucking boring coders instead of the crazy bigots who are going to bankrupt your company.
