Can programming be liberated from the von Neumann style?

Universal machine: John von Neumann and his high-speed computer, circa 1952.

This was a visionary article, well ahead of its time. Backus pinpointed a fundamental problem in how we think about programming, one that is still valid 40 years later. The argument is not really about imperative vs. functional programming; it is about how people see programming as explicit, statement-by-statement control flow, without even realizing that this is not the only possible approach.

Backus’ paper was controversial at the time, but it was certainly a step in the right direction.

The first part of the paper sets the tone for the pages that follow:

Forty years have passed since this was written, but is it really no longer relevant? I think most of it still applies to your preferred mainstream programming language. I hear you saying this is no longer true because most mainstream languages are not imperative but Object Oriented. Different paradigms!

But when I read through Backus’ article, he highlights problems that still apply to mainstream languages: they keep getting bigger but not stronger; they are fat and weak; they still rely on primitive statements and a close coupling of semantics to state transitions.

Sure, “word-at-a-time” processing is not literally relevant anymore, but how often do we write code based on arrays, not only to express a collection of elements but to encode other data structures like trees, binary search trees and graphs? In mainstream languages most algorithms use arrays as the basic data structure, and you are left implementing error-prone logic and index arithmetic. This is “word-at-a-time” processing, even if you don’t call it that. Modifying mutable variables, using assignments and control structures such as if-then-else, loops, break, continue and return is the most common informal way to understand imperative programs: as instruction sequences for a von Neumann computer. There is a strong correspondence between mutable variables and memory cells, variable dereferences and load instructions, variable assignments and store instructions, control structures and jumps. But how does this scale up once we conceptualize whole programs “word-at-a-time”?
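To make the array-encoding point concrete, here is a minimal sketch (mine, not Backus’, in Python purely for illustration) of the same binary tree stored two ways: implicitly in an array, where the structure lives in index arithmetic, and explicitly as linked nodes, where the structure lives in the data:

```python
# (a) Implicit tree in a flat array: children of node i sit at 2*i+1 and 2*i+2.
# The "word-at-a-time" style: correctness depends on index formulas.
tree = [8, 3, 10, 1, 6]  # 8 is the root; 3 and 10 are its children; 1 and 6 are under 3.

def sum_array_tree(nodes, i=0):
    if i >= len(nodes):          # every access must get the arithmetic right
        return 0
    return (nodes[i]
            + sum_array_tree(nodes, 2 * i + 1)
            + sum_array_tree(nodes, 2 * i + 2))

# (b) Explicit tree: no index arithmetic to get wrong.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    value: int
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def sum_tree(node):
    if node is None:
        return 0
    return node.value + sum_tree(node.left) + sum_tree(node.right)

root = Node(8, Node(3, Node(1), Node(6)), Node(10))
assert sum_array_tree(tree) == sum_tree(root) == 28
```

Both functions compute the same sum, but only the first one can be broken by an off-by-one in an index formula while still type-checking and often still running.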

So many questions. In the end you may say that imperative languages belong to the past, so it’s not your problem anymore. Moreover, most of the software industry is not interested in solving algorithmic problems at all. Mainstream languages are mostly used to encode business logic and to build models that solve real business problems with Object Oriented programming. Let’s model the real world, they say! But bear with me: most of the time people use objects (classes) as mutable data wrappers, combined with the same control structures used in the von Neumann programming style.

Backus is also highlighting:

By the way, even though we have a mathematical model for functional programming (the lambda calculus), people writing business software in FP don’t need to provide mathematical proofs that their programs are correct. That is not the point. (GHC, for example, compiles Haskell down to a small typed intermediate language, Core, which can itself be type-checked; this is part of why Haskellers say “if it compiles, it works”.) The simple fact that your programming language is backed by a mathematical model lets you think about business problems in different ways, and gives you building blocks with well-defined mathematical properties, allowing you to write better software with richer expressiveness and far better abstractions. He continues:

This is one of the weakest elements of the von Neumann programming style, and the one Backus seems particularly heated about. His vision of a better model is essentially variable-free, with a coherent algebra, precise definitions and applicative state-transition properties.

In the second part of the article, Backus proposes a solution to the von Neumann programming problem.

Backus describes his vision of functional programming, and it feels rather modern even though 40 years have passed. The syntax of the FP language he proposes is quite different from what modern functional programming languages look like, but the ideas are still fresh, and this was certainly the way to go at the time. It is based on function composition and higher-order functions, and programs are evaluated through reduction. He describes a short program, then compares it to the von Neumann style, highlighting the following properties, which I’ll try to explain:
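Backus’ running example in the paper is the inner product, defined in FP by composing three combining forms: fold with +, map with ×, and transpose. A rough Python transcription of that definition (my own sketch; `compose`, `trans`, `apply_to_all` and `insert` are names I chose, not Backus’ notation) might look like this:

```python
# Backus's FP inner product, roughly:  Def IP = (/ +) o (ALPHA x) o Trans
# i.e. transpose the pair of vectors, multiply each resulting pair, fold with +.
from functools import reduce
from operator import add, mul

def compose(*fs):
    """Right-to-left function composition, like Backus's 'o'."""
    def composed(x):
        for f in reversed(fs):
            x = f(x)
        return x
    return composed

trans = lambda pair: list(zip(*pair))                     # Trans
apply_to_all = lambda f: lambda xs: [f(p) for p in xs]    # ALPHA f (map)
insert = lambda f: lambda xs: reduce(f, xs)               # / f (fold)

inner_product = compose(insert(add),
                        apply_to_all(lambda p: mul(*p)),
                        trans)

# Backus's example: IP applied to <<1,2,3>, <6,5,4>> yields 28.
assert inner_product(([1, 2, 3], [6, 5, 4])) == 28
```

Notice there is no loop, no index variable and no intermediate mutable state: the whole program is a single composition of generic combining forms.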

This is the core of functional programming. This is why it is so powerful and easy to work with.

This is not 100% clear to me but I think today we talk about “composition” and not “hierarchy”.

This is also a core idea of functional programming. We care about what the program means, not how it is implemented. There is no cognitive load of mentally executing a program step by step just to understand what it does. How many times have you mentally stepped through a program in your preferred popular language to understand it?
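One way to see this (my own illustration, not from the paper) is referential transparency: a pure expression can be replaced by its value anywhere it appears, so understanding the program becomes algebra rather than mental execution. Mutation destroys exactly that property:

```python
# With a pure function, equals can be substituted for equals.
def double(x):
    return 2 * x

a = double(3) + double(3)
b = 2 * double(3)
c = 12
assert a == b == c   # all three are interchangeable; nothing hidden to trace

# With mutation, the same-looking expression stops being substitutable:
counter = 0
def next_id():
    global counter
    counter += 1
    return counter

# next_id() + next_id() is NOT the same as 2 * next_id():
assert next_id() + next_id() != 2 * next_id()   # 1 + 2  vs  2 * 3
```

To know what the impure version computes you have to replay its state changes in your head, in order, which is precisely the step-by-step mental execution the quote complains about.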

Higher level abstractions.

I’m not sure about this one, but it seems to me he is still talking about the simplicity of a language that can be abstracted over any kind of data structure.

It seems he is proposing a simple way to compose functions.

For me, what he describes here as “housekeeping” is all the constructs we write over and over in our mainstream languages: loops, if-else branching and so on. How many times have you written loops of almost exactly the same shape without even realizing what a poor abstraction that is? It is rooted so deeply in our way of programming that we don’t even notice we’re still writing von Neumann, “word-at-a-time” style code. In functional programming these recurring shapes were long ago captured as patterns, so we don’t need to write them over and over: instead of loops we have higher-order functions, and if-else chains can be replaced by powerful pattern-matching decomposition.

Backus goes on to describe his vision of a functional programming system in many details and mathematical proofs that I’m not able to explain properly, but I hope I have captured the core idea of the paper. It is definitely worth reading in full.

However, you may ask: if functional programming is so good, why isn’t it mainstream?

Not yet! But it’s getting more and more popular, for obvious reasons. Part of the answer can be found in the paper:

and then:

It seems that functional architectures failed not because they were unworkable, but because there was no support from a market rooted in the von Neumann style, and little support from industry or academia either. Mainstream languages have steamrolled ahead since then, and Object Oriented programming forced itself on us as the only valid way of writing programs at scale, getting more and more popular. But popularity is not quality. Not everything popular is bad, but definitely not everything popular is good. In the end, popularity may only be evidence of the minimum level of quality acceptable to the majority of people.

Object Oriented programming is good enough, and that is what allows it to be so widely used. People are happy to choose good-enough solutions that seem easy, no matter whether better solutions exist. OOP is relatively accessible to the average programmer, even if in the end it is more complex than functional programming; the root cause may be the way programming is taught in universities. Functional programming is often associated with mathematics, which can indeed help, but bear with me: you don’t need to know category theory or the lambda calculus to write composable functional software.

We’ve also touched upon the lack of verifiable mathematical properties in Object Oriented languages, and on how hard it is to formally prove the correctness of Object Oriented programs, yet this didn’t stop wide adoption and growing popularity. Indeed, OOP is mostly used for business applications where ultimate correctness isn’t necessary. You don’t need 100% correctness most of the time: from a business point of view, you need software good enough to earn more money than you lose. Time to market is often a more important factor than the ultimate correctness and stability of your program, and you work with the skills available on the market. That’s how business works! Still, in my opinion you can write software in an FP language faster, and more reliably, than in OOP; it mostly depends on which paradigm the team is familiar with.

Forty years later, I have a feeling we’re still fighting the same problems Backus described in this paper. It was a visionary article, ahead of its time. The biggest problem is not the mainstream languages themselves but how most people collectively think inside the box, without even realizing that other paradigms of thought are available. Many people solve problems by locking them into the mental structure of the von Neumann style, as if it were the only way. Even though functional programming has been growing in popularity for some time, it will take a while before we can call it mainstream, or another paradigm may emerge in the meantime (maybe quantum computing will change everything we know about programming). On the positive side, many concepts from functional programming are leaking into mainstream languages. Some will see that as good, others as bad, but what I see is an opportunity to learn to think and to solve problems differently.

And you, what do you think? Are we still trapped in von Neumann style?
