Why We Should Say Bye Bye to Hello World!

In physics, when someone comes up with a wonderful idea, the next question is: can you show us something practical (often just fun) we could do with it?

When introducing programming, the classic "hello world" example is there to prove how easy it is and how wonderfully anyone can code.

In my opinion, this example attracts incompetent companies and utterly unskilled developers alike to a field that may provide us money, but also the burden of their misconceptions. And sometimes, cowardly, I dream I could trade the money I earn for a living in exchange for a saner environment. Yes, at this level of insanity I would trade freedom for comfort.

Follow my steps into the realm of the white rabbit on steroids.

What are you supposed to do to print hello world on a screen?

Short answer:
  • open a file
  • write the incantation called programming
  • save it / compile it / hand it to an interpreter
  • magic happens
Neat, no?
That is exactly the harmful way of thinking.

This short example is a kind of ode to the black magic of reusability that is plaguing our current world.

After you have done hello world, you still know nothing, Jon Snow.

In fact, you are ignoring the biggest problem of programming: resources.

Resources you are squandering by requiring gigabytes of disk space, plus memory, a screen, a keyboard, an operating system, and a 250 W power supply, just to be able to launch your editor.
The prerequisites you expect from a computer for doing that are huge.

In assembly language, the hello world example is pretty straightforward.

Given that we have full control of memory, with access both to main memory and to video memory (the framebuffer), it may fit in less than 6 KB.

I have no operating system. Just memory I can read and write.

First, I must think ahead: does a computer even know how to print a character on a screen at startup?

No. For this, as for hello world, a firmware on the motherboard, knowing nothing of the outside world, must boot, probe its devices, and bring them to life.

Actually, the motherboard could launch software to do it... if only it knew about the thousands of peripherals sold everywhere, and had the code to boot them, patched to the latest versions. But since deluded people falsely claimed that programming was black magic, every computer user has come to expect a computer with no operating system loaded to already be configured enough to print stuff on screen, like:

This is ambios feng shui 2008
Let me show off and print stuff on screen while I should never have been programmed to do it, so that customers are happy watching my gibberish status.


So on top of my six thousand instructions, as simple as hello world, to make hello world work I also need tens of thousands of lines of code just to make the screen come to life.

But how do I get this code?

By the black magic of PCI.

A decoder with a fixed address that, when wired to the right cable, can exchange information to turn devices on, through a very elaborate dance like:

Knock knock: address, PCI id (lit), model id (lit). I am the PCI investigator; state your claim.

Yes, "I am ....": a 256-byte CV to print for your amazed eyes. I also want these relative addresses reserved for me, and register these interrupts for me, and set these parameters, oh, and call me back later.

That is done; your extra info is ....

OK, now you can go to this address, jump into my firmware in pure assembly, and execute it. Thanks....

Woohoo, I love executing arbitrary code..... Oh! And now I can run routines at fixed addresses, calling code before your hello world even exists! In my "console"! I can print my status...
I am the mad CPU, woohoo....

If I were designing a device claiming to be secure, I would refuse to let the computer execute arbitrary code. I would be scared to see a screen working before MY code had set it up to work.

So we stack black magic on black magic for the sake of unrealistic expectations. We want computers that execute arbitrary code from outside vendors, vendors that may be funded by people whose interests potentially diverge from mine, to be secure... That is unrealistic.

We expect all computers to be the same, conforming to a purely abstract, twisted model that contradicts reality.

Claiming that this is atomic (executed in one cycle):

*p = 4;

is a lie.

Memory should not be presented as linear, because causality matters.

Ugh. When you design a computer, the speed limit is set by the longest wire; with the frequency basically tied to wire length, and the speed of light as a hard limit, you cannot do much.

Open your computer and look: the memory banks are CENTIMETERS away from the CPU. Do the computation, look at the figure, look at the frequency, and say: wut?

Yep: basically, if we insisted on accessing all memory in one cycle, our computers would be millions of times slower. And the more memory, the greater the physical distance. So the more linearly accessible memory you have, the lower your frequency. Real memory is never linear, and will never tend towards linearity.

That is the raison d'être of the address bus, and of the L1/L2/L3 caches.

RAM is accessed in hundreds of cycles, caches in a handful to a few dozen cycles.

The more memory you want to access, the more layers of indirection you add.

The result: speed diminishes the more memory you use, or the faster your cores are.

The ultimate L0 memory is called the registers. That is where we can do programming by playing with a memory model of at most 64 words in one cycle. No programming language lets you access it directly. That is where programming really happens: in a claustrophobic place on a die of silicon, inside what you call a core. A place that programming languages and operating systems push you away from. You have never been there. It requires a hellish initiation from which you come back seeing the dark side of the world. Instead, you let compilers, interpreters, and operating systems transform what you said into code that may not respect your intentions.

So when you access the other memories (not the registers), every request is analyzed, decoded, and rerouted by circuitry.
They are all handled by the CPU, with circuitry complex enough to be considered a computer in itself once you have learned to program microcontrollers. It presents you data not from where you asked, but from another location: the cache. Magically! And in the middle, it may have disrupted the other code flowing through the pipelines, breaking the claim of isolation.

It makes the magic of performance happen by handling three levels of indirection to mask the real performance of the computer, perturbing other code along the way. A cost you pay "only in the worst case".

The worst case being, ironically, the one you wish to reach: your computer fully occupied, justifying the investment.

Where is the worst case? It is near the transition to full load, in the best possible case for a business: a lot of customers with different contexts, doing a lot of different things. You wish for every business to sit at the crossroads of many different profiles of buyers and providers. That is, you wish to become a de facto standard in a place where you minimize risks. Risks come from not diversifying your sources of income and your providers.

Well, that is the worst case: the computer will probably take page faults more often than average, thus penalizing the best possible case, because of the expectations you had. You will work for the locality of variables: you are therefore going to aim to cluster things and reduce differences. That is the purpose of big data: to create ways of reducing customers to a set of "relevant data", that is, to classify them by sex, age, location... You also have devops justifying running CPUs at 100% to reduce OPEX, resulting in degraded performance, resulting in more levels of indirection, redirections, dependencies, availability issues... sucking you into a spiral of diminishing OPEX through devops black-magic mantras. Did I mention it also makes your CPU leak information, turning security into a nightmare?

Hell is paved with good intentions, and bankruptcy with inept cost reductions.

Just to avoid a worst case in a computer...

Just because some believe in black magic...
Just because people have no intuition...
Just because idiots spread the rumour that once you have done a "hello world" you have learnt programming. No! You have unlearnt the fact that there is a price for everything, and that people have made a lot of questionable cost/price choices on your behalf.
Just because of this, your choices are constricting your profitable business income.

You may not need an operating system in the first place. You may not need firmware, nor a bus to communicate with the hardware, nor even L3 cache memory. You may be totally fine using a 1.5 V battery to light LEDs on a gigantic banner, 255 levels of color per pixel, for the three days of a comic-con convention, because you obviously may not want to bring along, for a light trip, a full computer with a 17-inch screen, and to install a C compiler, Linux, maybe Python, Apache... plus drivers for the banner, which you might have to write yourself if it is custom, resulting in yet more lines of code for the sake of the black-magic god of interoperability. Just writing a hello world module/driver for Linux outweighs the complexity of lighting an LED in assembly language.

Hello world instills a very bad habit of magical thinking about the use of resources.

Whatever happens, the printf equivalent used for hello world implies an abstraction of interfaces, and assumptions that reflect a certain model of what computers are, one totally disconnected from reality.

The first lie of hello world is that the context in which you program does not matter.
That resources are free.

All business should be about mastering costs and prices. People who do not care about costs should be barred from going near production lines.

It is another lie of programming: that code is what matters. Code matters, but not as much as data structures. Because the art of data structures is to map the data as close to the wires as possible, in a fashion that plays to the strengths of the computer. For stuff to print on a screen, you want a contiguous memory mapping, so that the specialized "media/string/blitter" routines can kick in.

If you want a good data model for accessing data in memory with one level of indirection, your data structures have to map the reality of the hardware's memory.

For every use case, data structures are where the biggest gains in productivity are found.

Read this article for a complete essay: "You're Doing It Wrong" (Poul-Henning Kamp).

Yep: knowing about B-trees is more essential than doing recursion, OOP, or even functional programming. The language matters less than fine control over your data locations.

Hello world focuses on code. Not on data.

And in four pages I have not even raised the issue of the bloatware-ish dependencies involved in hello world: "standard" libraries, module control, exec/thread models, scheduling, I/O and devices, dynamic libraries, the security/user model, permissions, the optional arguments in the signature of printf, the handling of function calls with the stack, implicit memory allocations...

Hello world is an invitation to learn what programming is the wrong way. It pretends that knowing the map is more important than the territory.

Nothing comes as a solution except the careful analysis of a problem. Hello world is just a universal solution to your programming problems: expecting developers with limited knowledge of computers to be able to make code work well enough in a business context. The result is cashflow poured into solutions with randomly diminishing returns, making your growth antagonize your profits, based on the initial figures of a toy proof of concept that pretends to "scale".

Pushing hello world as the first introductory step in programming is as misleading as claiming that turning on a camera's flash is your first mandatory step towards being a fully competent photographer... Programming is not about coding and bringing solutions; it is about analyzing and solving problems, eventually by writing code, when that costs less money in the long run than not writing it. For this, the complexity of computers should not be hidden under stacks of magic.

Every time someone writes "hello world", Cthulhu shimmers, kittens are killed, and you can hear the Knights of the Apocalypse drawing nearer, one hello world at a time.

Hello world is magic; thus it should be excommunicated from programming culture and burnt as a witch. All references and books printing this example should be burnt in the name of the enlightenment of the people by pure reason. No obscurantism should be mistaken for careful mastery through the experience of trial and error. Amen.


Benoît Rouits said...

It would be great to be able to read this article in French.
/me tired of English.

julien tayon said...

I try to choose the right language for the right domain.

But the more I speak English, the less I see the need to do so.