OOP and dependencies

  1. OOP is a paradigm that has left a certain bad aftertaste for me for a while now. It's not exactly caused by OOP itself, but by how OOP is typically practised, or, I dare say, misused.

    OOP advocates that problems be solved with objects. These objects have properties and behaviour associated with them. This is an approach that I think is a meaningful one in most cases. The typical way to define objects, though not the only one, is to write classes, which can inherit from each other in very useful ways. This all promotes modularity and re-use. So far so good.

    This is where common practices start to play out, and contrary to some beliefs, they are not part of the OOP paradigm itself.

    Fault Tolerance

    Software programs are complex creations, which means mistakes and errors will occur. OOP programmers are typically taught that the methods they write must be made fault tolerant. This includes checking parameter types and value boundaries. I would argue that this practice actually goes a bit against the OOP paradigm, by introducing problems not actually associated with the objects themselves, but with their interaction. In actuality, their existence is even less justified than that; the code is written to handle the programmer's inability to process and pass correct data. Not only that, but it often includes quite a bit of redundant code, as many types of errors can occur in many places. This type of fault tolerance usually involves throwing exceptions, which in itself also has some problems.
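To make the defensive style described above concrete, here is a hypothetical sketch of a class written in it; the class, its methods, and its checks are my own invention for illustration, not drawn from any particular codebase:

```python
class Account:
    """Hypothetical example of the defensive, fault-tolerant style."""

    def __init__(self, balance):
        # Guard clauses that check the caller's data, not the object's own logic.
        if not isinstance(balance, (int, float)):
            raise TypeError("balance must be a number")
        if balance < 0:
            raise ValueError("balance must not be negative")
        self.balance = balance

    def withdraw(self, amount):
        # The same kind of checks repeat in every method: redundant code
        # guarding against callers passing incorrect data.
        if not isinstance(amount, (int, float)):
            raise TypeError("amount must be a number")
        if amount < 0 or amount > self.balance:
            raise ValueError("invalid amount")
        self.balance -= amount
        return self.balance
```

Note how the checking and exception-throwing outweighs the two lines that actually do the work.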

    I would argue that most of the time, programmers are able to write correct programs at the time they write them. This should especially apply to software written in the OOP paradigm, since it's such a highly modular approach. So instead of spending time cluttering the code to account for every possible problem that may arise, I think it's a better idea to track dependencies.

    That's a fairly easy thing to do. What I suggest is something contrary to typical version management. Instead of overwriting the modules, make new copies and let the names include a version number. Next, load the modules dynamically by including the version number. This way, the programmer interacting with the code can clearly state what specification was relevant at the time of writing. To track possible inconsistencies, write a wrapper around the loader that logs any case where the version being used is not the newest one. Using the log, finding the module that isn't up-to-date is easy, and the problem can quickly be fixed.
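A minimal Python sketch of this versioned-loading idea, assuming a file naming scheme like parser_v2.py and a hand-maintained registry of newest versions; the module names, the version scheme, and the registry are all assumptions of mine:

```python
import importlib
import logging

log = logging.getLogger("dependency-tracker")

# Newest known version of each module (hypothetical registry for this sketch).
NEWEST = {"parser": 3, "renderer": 2}

def resolve(name, version):
    """Build the versioned module name and log when it is out of date."""
    newest = NEWEST.get(name, version)
    if version < newest:
        log.warning("%s_v%d in use, but v%d exists", name, version, newest)
    return f"{name}_v{version}"

def load(name, version):
    """Dynamically import the exact version the caller wrote against."""
    return importlib.import_module(resolve(name, version))
```

A caller would then write something like `parser = load("parser", 2)`, pinning the specification it was written against, and the log later reveals every call site that lags behind the newest version.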


    A much greater sin, I think, is the overuse of patterns. A pattern is a sort of best-practice blueprint for avoiding certain problems. These problems include things like code coupling and cohesion. It's an attempt to increase the modularity and re-use of a piece of software by making its components less dependent on each other. This might sound like a good idea, and sometimes it is, but I believe that dependencies are not necessarily an evil thing. In fact, I have just made a point about why I think dependencies are important: to track programmers' intents, or "reality", at the time of writing. Besides, the chances that code has to be changed at some point are usually so high anyway, so why put so much effort into avoiding it when you really can't escape it?

    The evidence that patterns can be bad appears quickly when you start to implement them. They introduce a lot of new components that do not relate to the problem space and are thus additional clutter. Just coming up with their names is hard, because of their abstract nature, and they are very unintuitive to anyone not well versed in all the common patterns. They also usually turn a simple line of code into several lines with more symbols in them, reducing readability.
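As a deliberately small, hypothetical illustration of that expansion, compare a direct construction with the same construction routed through a simple factory; the classes here are invented for the example:

```python
class JsonReport:
    """A concrete class the caller could simply use directly."""

    def render(self):
        return "{}"

# Direct: one clear line, admittedly coupled to the concrete class.
report = JsonReport()

# The same thing through a factory pattern: extra components (the factory
# class, the registry, the string lookup key) that live outside the
# problem space and exist only to decouple caller from class.
class ReportFactory:
    _registry = {"json": JsonReport}

    @classmethod
    def create(cls, kind):
        return cls._registry[kind]()

report = ReportFactory.create("json")
```

Both versions produce the same object, but the second buries the one line that matters under naming, indirection, and a lookup key.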

    If you decide to accept, and maybe even embrace, dependencies rather than shun them, you can reduce the amount of clutter significantly. Clean, elegant code makes programmers happy, and maintainability makes the managers' job easier. Loosening and splitting up code isn't necessarily a good thing.