
Perspectives on Object Oriented Design Principles

Cătălin Tudor
Principal Software Engineer
@IXIA




Popular belief about entropy leaves no room for interpretation: the bigger the entropy, the greater the chance for disorder and chaos to rear their ugly heads at every step. This means unpredictability, which is of course not among the desired qualities of a good design. However, as we shall see in a minute, high entropy (I am referring here to Shannon entropy, not the thermodynamic version, although there are similarities) is not a mark of bad design. In fact, by looking at a design's overall entropy we cannot say anything more about it than that it solves a problem requiring as many states as the design allows.

While this is counterintuitive, the only way to get a good design is to grow it towards increasing entropy, as any attempt at reducing entropy will result in unwanted strong coupling and weird behaviour.

The purpose of this article is to look at how some of the well-known design principles influence local design entropy.

I will start with the Liskov Substitution Principle (LSP), as its influence on entropy is the most straightforward.

In a couple of words, LSP is a rule that helps programmers decide when to use inheritance. One of the best-known examples of an LSP violation is the "Squares and Rectangles" problem.

Imagine you"ve created an application that manages Rectangles. The application is so successful that users are requesting a new feature in order for it to also handle Squares. Knowing that a Square is a Rectangle, your first design choice is to use inheritance (inherit Square for Rectangle) in this way you are reusing all the functionality already implemented.

class Rectangle
{
public:
  // virtual so that Square can actually override these methods
  virtual void SetWidth(double w)  { width = w; }
  virtual void SetHeight(double h) { height = h; }

  double GetWidth() const  { return width; }
  double GetHeight() const { return height; }

private:
  double width;
  double height;
};

Square derives from Rectangle and overrides SetWidth and SetHeight so that both dimensions always stay equal.

class Square : public Rectangle
{
public:
  void SetWidth(double w) override;
  void SetHeight(double h) override;
};

void Square::SetWidth(double w)
{
  Rectangle::SetWidth(w);
  Rectangle::SetHeight(w);
}

void Square::SetHeight(double h)
{
  Rectangle::SetHeight(h);
  Rectangle::SetWidth(h);
}

I will not go into more detail on why this turns out to be a bad idea (more can be found in Robert Martin's article on Object Mentor: http://www.objectmentor.com/resources/articles/lsp.pdf), but I will show how it influences design entropy.
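Still, a minimal sketch makes the symptom visible. The client function g below is my own illustration: it is correct for any true Rectangle and silently breaks when handed a Square.

#include <cassert>

// A client that only knows about Rectangle. Setting width and height
// independently must leave both values intact.
void g(Rectangle& r)
{
  r.SetWidth(5);
  r.SetHeight(4);
  assert(r.GetWidth() * r.GetHeight() == 20); // a Square yields 4 * 4 = 16
}

Pass g a Square and the assertion fires: the subclass has changed behaviour that clients of the base class rely on.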

First, a couple of words on entropy. Shannon entropy is a measure of the uncertainty associated with a random variable and is typically measured in bits. Yup, the same bits as in memory capacity or network throughput. If that wasn't strange enough, consider that a single toss of a fair coin has an entropy of exactly one bit! Entropy measures the quantity of information needed to represent all the states of a random variable.

For small quantities of information we can identify simple rules to represent all the states. Chaos, however, or large random sequences, have huge entropies: they are an explosion of information, and there is no simple rule for guessing the next number in the sequence.

[A short note on the similarity between thermodynamic entropy and Shannon entropy: in the end, both represent the same thing. Take as an example a container with two liquids, one white and one black, separated by a wall. After the wall is removed the liquids mix, thereby increasing the entropy. From the information theory point of view, identifying the position of each particle relative to the separation wall requires much more information once the liquids are mixed.]

There is also a definition and a formula for entropy. For a random variable X with n outcomes (x1, x2, x3, …, xn), the Shannon entropy, written H(X), is:

H(X) = − Σ (i = 1..n) p(xi) · log2 p(xi)

where p(xi) is the probability of the outcome xi.
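To get a feel for the formula, here is a small helper that evaluates it for any discrete distribution (a sketch of my own; the name ShannonEntropy is not from any library):

#include <cmath>
#include <iostream>
#include <vector>

// Shannon entropy, in bits, of a discrete probability distribution.
// Outcomes with probability 0 contribute nothing (p * log2(p) -> 0 as p -> 0).
double ShannonEntropy(const std::vector<double>& p)
{
  double h = 0.0;
  for (double pi : p)
    if (pi > 0.0)
      h -= pi * std::log2(pi);
  return h;
}

int main()
{
  std::cout << ShannonEntropy({0.5, 0.5}) << "\n";               // 1 bit
  std::cout << ShannonEntropy({0.25, 0.25, 0.25, 0.25}) << "\n"; // 2 bits
}

The two calls correspond to the two examples that follow.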

Let"s take some examples in order to get a grasp on what this means and why it makes sense to measure it in bits.

[Depending on the base of the logarithm, the entropy is measured in bits (base 2), nats (base e) or bans (base 10).]

Example 1

How much information is required to store a variable X having possible outcomes in the set {0, 1}? Consider

p(0) = p(1) = 1/2

[That means 0 and 1 have the same chance, 50%, of being assigned to X.]

The formula gives H(X) = −(1/2 · log2(1/2) + 1/2 · log2(1/2)) = 1 (bit).

Example 2

How much information is required to store a variable X having possible outcomes in the set {00, 01, 10, 11}? Consider

p(00) = p(01) = p(10) = p(11) = 1/4

[That means all values have the same chance, 25%, of being assigned to X.]

The formula tells us that

H(X) = −4 · (1/4 · log2(1/4)) = 2 (bits)

[Ask yourself if you know the definition of the bit. The definition is not straightforward, and every programmer thinks he or she knows it (which is kind of funny, because it turns out they usually don't).]

This all looks nice and tidy, but how does it apply to object-oriented design? Well, let's go further with our analysis.

How much entropy is in the Rectangle class? We can look at its fields, width and height, but we'll use a simplified case where they can take only the values 0 and 1. The Rectangle class is then described by a random variable XR = {wh} with possible outcomes {00, 01, 10, 11}, each a different combination of width (w) and height (h), and we know from the second example that the entropy equals 2.

H(XR)=2 (bits)

How much entropy is in the Square class? We can look again at its fields, width and height, which can take only the values 0 and 1. The Square class is described by a random variable XS = {wh} with possible outcomes {00, 11}. The entropy is now different because width and height no longer vary independently: every time the width (w) gets set, the height (h) gets set to the same value. We know from the first example that the entropy equals 1 in this case.

H(XS)=1 (bit)

Here is our first rule for how entropy should be allowed to vary in a design:

Whenever class S (Square) extends class R (Rectangle), it is necessary that H(XS) >= H(XR). In our case 1 = H(XS) < H(XR) = 2! What actually happens when we break the "entropy rule"?

  • Let"s say we have method (function) m using objects of type R
  • If class S extends class R by breaking the entropy rule then method m will have to account for the missing entropy in class S (read this as strong coupling, adding if statements to discriminate between Square and Rectangle is one possible scenario)
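A short sketch of that scenario (the body of m is my illustration, not a recommendation):

// The missing entropy in Square leaks into every client of Rectangle:
// m can no longer treat all rectangles uniformly.
void m(Rectangle& r)
{
  if (dynamic_cast<Square*>(&r) != nullptr)
  {
    // special-case the Square: setting the width also changes the
    // height, so the generic logic below would give wrong results
  }
  else
  {
    r.SetWidth(5);
    r.SetHeight(4);
  }
}

Every new subclass that breaks the rule forces another branch into m; this is exactly the strong coupling the rule warns about.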

[An important aspect is how a design grows its entropy, because the chaos and disorder inside source code also come from how the entropy is structured, grown and used within classes.]

A real-world example

That"s a door. You"ve guessed!

What does a door do? What is its behavior? Well… a door opens (if it's not locked), and it uses the interior of a room to do it. Here's a simple way to write it in code:

class Door
{
public:
  void Open(Room& r)
  {
    // ... the door opens inside the room
  }
};

Imagine the entropy of the room is proportional to its volume. What would happen if we extended the class Room while reducing its entropy (volume)? Let's call this new class FakeRoom. Well… the next picture speaks for itself. The missing information (entropy) in the room has to be accounted for, and it gets coded into the door (by cutting out its bottom part so it can still be opened). Now the door and the room are strongly coupled: you can no longer use this door with another room without raising some eyebrows!
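In code, the same coupling might look like this (Room's Height method, FakeRoom and the height check are my own illustration of the picture, not the article's code):

class Room
{
public:
  virtual double Height() const { return 2.5; } // metres
  virtual ~Room() = default;
};

// FakeRoom extends Room while reducing its entropy (its volume).
class FakeRoom : public Room
{
public:
  double Height() const override { return 1.0; }
};

class Door
{
public:
  void Open(Room& r)
  {
    // The door is forced to account for the missing entropy:
    if (r.Height() < 2.0)
    {
      // "cut out the bottom part" - logic that couples this door
      // to one particular kind of room
    }
    // ... the door opens inside the room
  }
};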

[Developers should understand that their design will end up looking just like this picture. My advice is not to ignore the signs; a vase and some flowers will not transform it into a good design.]

[We can imagine a second example with a water pump and pipes of different sizes.]

Conclusions

  1. Lowering the entropy through inheritance is a sign of broken encapsulation.
  2. Aggregation should be favored over inheritance. There's no way to break the entropy rule when using aggregation: the entropy can be varied as desired (see the sketch after this list).
  3. Depending on abstractions is good practice, as an interface doesn't have a lower entropy bound and allows for any customization.
  4. As a mathematician would put it, the entropy rule for LSP is necessary but not sufficient: if we obey the rule we might get a good design, while if we break it we definitely get a bad design.
  5. Design entropy is a perspective that can augment existing methods of detecting LSP violations.
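For conclusion 2, here is a minimal sketch of the aggregation alternative (my own illustration): Square uses a Rectangle instead of being one, and exposes its own, smaller interface.

// Square is no longer a Rectangle; it merely contains one.
class Square
{
public:
  void SetSide(double s)
  {
    rect.SetWidth(s);
    rect.SetHeight(s);
  }

  double GetSide() const { return rect.GetWidth(); }

private:
  Rectangle rect; // aggregation: Rectangle's full entropy stays encapsulated
};

No client of Rectangle can ever receive a Square, so no method m has to compensate for missing entropy, and the entropy rule cannot be broken.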
