From logic and math to code. For dummies. Part I: Basics

Roman Krivtsov
3 min read · May 16, 2017

We are used to thinking that software development is a kind of exact science. At least we believe it is. How else are we going to prove to our customers that our software will work? But in reality things are usually not so clear (you'll find the answer why at the end of Part III).

So what’s wrong with software development?

The end goal of programming is to translate business requirements from a proposal in natural language into concrete processor commands that make a machine do what we really want. The problem is that informal logic and natural language are very hard to adapt to formal systems, if it is possible at all. Perhaps that's because our mind and consciousness evolved under different laws.

But somehow we manage the translation, and sometimes even more or less properly. Maybe someday we will be able to do it automatically?

No need in developers anymore! Yay!

Such thoughts excite many futurists and journalists. Recently Microsoft even created a neural network which can write code by itself from informal human instructions.

Formal and informal logic

But fundamentally, the relationship between informal and formal logic is not clear yet, complete processing of natural language remains unsolved, and don't forget that requirements expressed in informal logic can easily contain errors which are imperceptible.

So let’s imagine the situation from freelance board:
Premise: Bob is a programmer
Premise: Programmer develops software
Premise: Facebook is software
Conclusion: Bob can develop Facebook

If you, as a freelancer, asked the customer to explain what “Facebook” means, so that you could write a mostly correct implementation, he or she would have to design a formal logic system describing the entity “Facebook”.

Ideally, such a description should look like humanity's greatest attempts to build logically grounded definitions: Euclid's “Elements” or Spinoza's “Ethics”, where the final subjects (geometry and ethics) are inferred from atomic notions like point, line, circle, god, nature and consciousness.

Also, all statements should be proved correctly and be valid, complete and consistent with other known theorems. And even then you can get into trouble, for example because of Gödel's incompleteness theorem, which says that even in a consistent formal system rich enough to express arithmetic you can encounter statements that can be neither proved nor disproved (compare the Liar Paradox).
The customer would probably think you are crazy and find someone else to build just some kind of social network.

So we are not going to dwell on the prospect of replacing developers with neural networks; instead, step by step, we'll explore the magic of the transition from formal logic to code. So let's go.

Formal logic

The first person who introduced the notion of formal logic (or analytics, as it was called at the time) and described its basics was a great guy: Aristotle. He discovered 3 of the 4 laws of logic (here and below I'll use C++ logical operators, which every developer is familiar with, instead of mathematical ones):

  • Law of identity (A == A)
  • Law of noncontradiction ((A && !A) == false)
  • Law of excluded middle ((!A || A) == true)

Does any of this seem familiar?

Exactly! Boolean algebra, which operates only on truth values and extends the rules of formal logic.
The fourth law was discovered by another great logician, Leibniz, and it's called the “Principle of sufficient reason”. By the way, he was also the one who introduced 1 and 0 for true and false.
This law is not formalized as well as Aristotle's laws, but it has to be used to make any kind of judgment. In other words, we must first somehow prove that our true/false values really are true or false before we can draw any conclusions.

But what can we do in real life with such low-level logic? Basically nothing. We need rules to build more abstract propositions.
And here Propositional calculus comes in! It allows us to analyze the logical structure of the really complicated propositions we deal with in real life.

So here we go: Part II: Higher-order logic

Find me on Twitter
