OOP is just a mental model. Deep down everything is made of bits. The church of OOP has failed, but if something looks like a duck, walks like a duck, and talks like a duck, it's probably useful to make a duck class. We're now down to fighting over nuances. You can do most things with or without OOP, but each path has some upsides and downsides, and most of the time it's good to use the pieces it provides where they make sense and not get too religious about it. The great architect has the foresight on how the code will be used in five years and designs it accordingly.
I think this relates to what you're saying.
I've never felt any frustration that OOP is the wrong tool when I'm using languages that give me the choice to use it or not (like Python and JavaScript). But when I'm using Java, as one example, it often feels like I'm really locking myself into a design up front.
In Python especially, I'll find myself starting all experiments or simple projects with functions and basic data types. As something evolves and I want some semantic clarity, I'll stop using dicts and start using namedtuples. Then at some point I may replace the namedtuples with classes. From there I may discover value in having subclasses, so I'll add a few (but this is exceedingly rare in my line of work).
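That evolution can be sketched in a few lines; the `Point` example and its fields here are invented for illustration:

```python
from collections import namedtuple

# Stage 1: a plain dict -- fine for a quick experiment.
point = {"x": 1.0, "y": 2.0}

# Stage 2: a namedtuple for semantic clarity -- same data, named fields.
Point = namedtuple("Point", ["x", "y"])
p = Point(x=1.0, y=2.0)

# Stage 3: a class, once behavior needs to live next to the data.
class MovablePoint:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def translate(self, dx, dy):
        self.x += dx
        self.y += dy

q = MovablePoint(1.0, 2.0)
q.translate(0.5, -0.5)
```

Each stage costs almost nothing to move to from the previous one, which is exactly what makes this incremental style pleasant.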
But that's the whole point of JAVA. It's an opinionated platform with a hyper-standardized workflow. Sure, that limits creativity, but in many business contexts, the last thing you want is your programmers getting "cute".
There's a straight line from requirements to implementation; no meandering involved. At least that's the theory. In practice...
Yes. That may very well be. And I'm not saying Java is bad because of what I described. Just that I experience the, "ugh OOP isn't helping me do what I want to do here" with Java.
It probably is the right tool if you have an enterprise problem that needs an enterprise solution.
In my experience, "enterprise problems" are more about scalability, data integrity, APIs, and workforce hiring than about "what programming paradigm to use."
It was a hypothesis, not a theory. Usually a hypothesis is abandoned after so many counter examples.
https://medium.com/stanford-d-school/want-some-creativity-cr...
It’s interesting - when you begin adding constraints, sometimes it helps solve the problem. You can’t be creative unless you’ve created a solution. It would be interesting to see the impact on this with programming languages.
I'm glad I don't work places that try to keep me from being "cute".
> But that's the whole point of JAVA. It's an opinionated platform with a hyper-standardized workflow.
That may have been where Java wanted to go, but, when I'm working in Java, I don't feel like that's where I am. Ways of doing things in Java tend to be wildly inconsistent from project to project. Partially, I think, because so much core functionality in the Java ecosystem was allowed to be federated out to 3rd-party projects for so long. Take the long-standing popularity (and rivalry) of Guava and Apache Commons for handling even basic tasks that are hard to get done using the core Java APIs. If there's such a thing as a "platform smell", I'd say that certainly qualifies.
With Python, on the other hand, there is a fairly consistent common understanding of what "Pythonic" means, and, even when there really is more than one way to do it, the question of which one to use can usually be quickly resolved to a predictable outcome by simply pointing out that one option is the more Pythonic way to do things.
(edit: Though, to be fair, Java was first released into a world where languages like C, C++ and Common Lisp represented the status quo. Expectations were lower at the time.)
To be honest, I've never seen a problem I couldn't solve more easily with core modern JDK libraries than with "3rd party" libraries.
It's definitely gotten better over the past 5 or so years. But there was a lot of time spent acquiring technical debt over the preceding couple decades.
Even if I don't use Guava or Apache Commons myself, for example, I still occasionally run into dependency conflicts that I need to resolve with awful hacks like package relocation because so many other major libraries rely on one or the other, and neither library is a particularly great citizen about breaking changes.
When faced with complexity, you end up using an 'opinionated, hyper-standardized workflow' to build 'un-opinionated, hyper-unstandardized workflows'. The net result is like trying to sculpt glass with a chisel and hammer.
It's then you realize you'd have been better off starting with clay.
My experience with Python is largely the same. I often start with local variables in a script, then put some parts into functions, then start putting some shared/persistent data into dicts. If after some time a pattern emerges where I think "gee, wouldn't it be useful to pack these three variables that always occur together and interact with each other into a class, and use some methods to consistently modify them?", then I create a class that does that specific thing. This transformation occurs gradually over the development process, and the data shapes the design of the application. I would call this process prototype development. At the end I would have the knowledge about the design to start implementing it in Java or C++. But by then the problem is already solved well enough in Python, and you almost never need to take the leap to another language.
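A toy sketch of that turning point, with an invented `Account` example: three values that always change in lockstep become one class whose methods keep them consistent.

```python
# Before: balance, currency, and a transaction log lived as three loose
# variables that every call site had to mutate in lockstep.

# After: one class that keeps them consistent.
class Account:
    def __init__(self, currency):
        self.balance = 0
        self.currency = currency
        self.log = []

    def deposit(self, amount):
        self.balance += amount
        self.log.append(("deposit", amount))

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        self.log.append(("withdraw", amount))

acct = Account("EUR")
acct.deposit(100)
acct.withdraw(30)
```

The class earns its keep only because the three fields genuinely interact; nothing here required designing the class up front.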
The program is not good, the design is not well thought out, and the implementation is not very clean. It is a design prototype that you use to write the real application. If only there were time for a fresh start. The solution that we have by that point is not perfect, but it does three things: it actually solves the problem, it solves it well enough for regular usage, and it allows further modification with only minor pain.
I wish I could tell the full story but most software stories end before the grand finale.
> I've never felt any frustration that OOP feels like the wrong tool when I'm using languages that give me the choice to use it or not (like Python, and JavaScript).
Do you not miss more advanced features, like multiple dispatch? Do you just implement it ad-hoc when you need it?
I don't know any way to say it that doesn't come off sounding condescending, but this looks like the Blub Paradox to me. Python's brand of OOP is better than most, but it still feels pretty limiting to me. It doesn't even offer syntactic abstraction to make it easy to work around. You have to hope (as the 'multimethod' package does) that other features accidentally allow you to.
It's not something I've ever needed with such frequency that I wished it was in the core language. When I need it I just use a library that exposes it. I use a decorator-based multidispatch.
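A minimal sketch of what such a decorator-based multidispatch can look like under the hood; this is a toy registry for illustration, not the actual API of the 'multimethod' package, and it dispatches on exact argument types only (no subclass/MRO lookup):

```python
from functools import wraps

def multidispatch(fn):
    impls = {}  # maps a tuple of argument types to an implementation

    @wraps(fn)
    def dispatcher(*args):
        types = tuple(type(a) for a in args)
        impl = impls.get(types)
        if impl is None:
            raise TypeError(f"no implementation of {fn.__name__} for {types}")
        return impl(*args)

    def register(*types):
        def deco(impl):
            impls[types] = impl
            return dispatcher
        return deco

    dispatcher.register = register
    return dispatcher

@multidispatch
def collide(a, b):
    """Dispatch on the types of both arguments."""

@collide.register(int, int)
def _(a, b):
    return "int/int"

@collide.register(int, str)
def _(a, b):
    return "int/str"
```

The point is that Python's decorators make this easy to bolt on as a library, which is exactly why many people never feel the need for it in the core language.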
> OOP is just a mental model. Deep down everything is made of bits. The church of OOP has failed but if something looks like a duck, walks like a duck and talks like a duck it probably is useful to make a duck class. We're now down to fighting for nuances.
No... I think this misses precisely the point of the article (see "(B)" in the text). This post is not a fight over OOP religion. The point is that if you misunderstand or mischaracterize "nuances" about some idea (if these are actually all nuances, which I think is debatable) and propagate them, then you shield other people from prior (edit: or even current) literature in the area, and hence prevent them from understanding what the relevant techniques really are and how to use them properly at all. This makes them lose a potentially powerful tool in their toolset, which you should agree is an awful thing regardless of what coding 'religion' you follow.
Just because prior literature exists, does not mean it should not be superseded. For example, generics wasn't even a thing when OOP originally started and yet LINQ and basic list ADT's wouldn't be as powerful without it.
> Just because prior literature exists, does not mean it should not be superseded. For example, generics wasn't even a thing when OOP originally started and yet LINQ and basic list ADT's wouldn't be as powerful without it.
No, this is not an example... generics did not "supersede" OOP.
My point is that earlier OOP documents did not implement generics.
When they were finally implemented in OOP, it superseded the original intentions of OOP.
The question is, are we now supposed to remove generics because they don't conform to the early literature of OOP?
Looks like the OOP edited his comment though, so my point is irrelevant.
> My point is that earlier OOP documents did not implement generics.
> When they were finally implemented in OOP, it superseded the original intentions of OOP.
No, when they were implemented in static OOP, it brought static OOP closer to intentions of the original, dynamic, OOP, where generic-ness doesn't require parametric polymorphism.
> My point is that earlier OOP documents did not implement generics.
So what? Early cars didn't have AC; therefore AC superseded cars? Or therefore I'm somehow arguing we should use AC instead of cars?
> The question is, are we now supposed to remove generics because they don't conform to the early literature of OOP?
Who ever claimed such a thing in the first place? It certainly wasn't me. If there is any question like this under discussion, it is whether OOP should be removed because generics somehow superseded it (your idea), not the other way around. In either case the answer is clearly No, because the idea is obviously ridiculous and not something anybody suggested.
> Looks like the OOP edited his comment though, so my point is irrelevant.
Not sure what this is referring to, but I haven't been able to agree with your comment since it was initially written.
So you don't really know what I was originally talking about yet you continue to act like you do? My point was in a response to the OP's implication that somehow the original OOP literature was perverted by misunderstandings and wrong implementations of the current generation, as if the original OOP documents were somehow "pure".
I pointed out that just because the literature is original doesn't mean it can't be superseded. I gave generics as an example that was later implemented in OOP languages and fail to see how implementing these perverted the original intentions.
I was then told that OOP intentions were originally meant to be dynamically typed and that the static typing of generics was meant to put it towards the original intentions of this dynamic structure (this is untrue because Alan Kay just didn't like static typing, but didn't make dynamic typing a requirement for OOP). Upon further research, the earliest OOP concepts were explained in the 1960's and the first OO language (Simula) was statically typed and a superset of ALGOL 60 which was a language made in 1960, with Simula following in 1965. Smalltalk came in at 1972 (around the time of generics) and is considered the definitive OOP language which is dynamically typed.
So it's hard to say, without direct sources, what the original intentions of OOP were, but considering OOP appeared before the first languages that contained generics (i.e. the 1970s), generics is an idea that superseded OOP.
Considering the utility of generics, it's clear that later concepts added to OOP did not somehow pervert the original literature.
So we've established the following: 1) The earliest OOP language (Simula) was statically typed. 2) Generics came in after Simula. 3) The original intention of OOP could probably be attributed to Alan Kay, who created Smalltalk, but it borrowed heavily from Simula. And while Alan Kay coined the term OOP, the idea was not created in a vacuum, as OOP concepts predate Smalltalk.
Hopefully this provides some clarification. But my guess is people will continue to misinterpret what I meant.
> For example, generics wasn't even a thing when OOP originally started
Generics weren't part of the earliest OOP because generics only make sense with static typing, and the earliest OOP languages were dynamically typed; generics were around elsewhere at the time of early OOP, though.
Just to add on to that.
Very few programmers know the prior art wrt. OOP, or have worked with the kind of code in which OOP is done well (I guess "OOD", using the terminology from the article). Instead, almost all junior (and even senior) programmers I encounter parrot something along the lines of OOP being too enterprise-y and crufty, and something about inheritance being stupid. OOP is dismissed out of hand. It's high time for a correction in that mindset. The ability to structure your data and the operations on that data together in one place is incredibly powerful, and OOP is a good approach to doing that.
Schools are partly to blame. They teach OOP as if it is an exercise in abstracting some sort of reality (e.g. "a dog barks, a cat meows, and both walk"). But that approach falls apart for the sort of concepts programmers work with. OOP is at its core a way to structure code, and to do so cleanly, to avoid repetition, and to enable easy navigation through a program. It is not intended to be a mental map of some external reality.
>The ability to structure your data and the operations on that data together in place is incredibly powerful.
Agreed. In a lot of cases if you don't have objects (the good parts) you are doomed to reinvent them:
https://www.cs.cmu.edu/~aldrich/papers/objects-essay.pdf
"Linux uses service abstractions in order to support multiple file systems. There are vtable-like structures such as file operations that are used to dispatch operations such as read to the code that implements file reading in a particular driver."
I don't really see a problem with reinventing things when you need them. Classes etc are just syntactic sugar on top of functions and structs.
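As a sketch of that "syntactic sugar" claim: an object can be reinvented in plain Python from a closure over a dict (the "struct") plus a dict of functions (a hand-rolled "vtable", much like Linux's file_operations); the counter example is invented for illustration:

```python
def make_counter(start=0):
    state = {"n": start}          # the "struct": plain data

    def increment():              # the "methods" close over the struct
        state["n"] += 1
        return state["n"]

    def value():
        return state["n"]

    # the "vtable": a dispatch table from operation names to functions
    return {"increment": increment, "value": value}

c = make_counter()
c["increment"]()
c["increment"]()
```

`c["increment"]()` plays the role of `c.increment()`; what a class adds is mostly nicer syntax and a shared, introspectable structure for this pattern.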
A specific church of OOP.
OOP the Eiffel, Sather, Smalltalk, C+@, BETA, CLOS, SELF way isn't the same thing as most people learn in school as THE OOP.
Just like there isn't a single way of doing FP or LP.
Also, let's not forget that all successful FP/LP languages are actually multi-paradigm and also include OOP concepts.
>The great architect has the foresight on how the code will be used in five years and design it accordingly.
Isn't he better off taking that crystal ball that gives him the foresight, using it to pick the correct lottery numbers and simply retiring?
Predicting the position and velocity of little plastic balls tumbling inside a rotating container is a very different problem from predicting the behaviour of future API consumers. Especially if those consumers work in the same company you do and have shared objectives.
So no, he's not better using that "crystal ball" to predict the lottery.
Many applications run maybe 10 times on real data and then their purpose is fulfilled. You don't need to design a microservice architecture with redundant servers when a simple shell script could do the job better.
You need to foresee the scale of what you're building and how you would proceed to the next level. Some things must be solved right before the first deployment because you can never change them after the application is deployed. You must know what these things are and solve them right. You must also reduce their number ideally to zero if possible. You must use solutions that allow refactoring and later scaling in areas where your crystal ball is not sure. If the hard things are solved correctly you can use average workers to do the rest and it will work well.
Yup, I like to think the religious war has subsided, and we're a little more free to do things in ways which work, are concise, readable and maintainable, even in previous bastions of enterprise OOP fervour like Java.
> The great architect has the foresight on how the code will be used in five years and design it accordingly
Perhaps this is a function of me working in startups and consulting my whole career, but it seems extremely misguided, if not negligent, for an experienced engineer to try to design for use cases five years in the future. Five months into the future is even pushing it.
What kind of companies operate in this way?
Consultant here, and I think that's a big blind spot we tend to have: we don't stick around for long enough to see the consequences of what we designed, usually.
I have more experience at startups than as a consultant. I was the 6th hire at a company that grew to 130 over three years, and I was never thinking more than a few months in advance. A lot changes in five years: your customers, the competitive landscape. It's an enormously long period of time in technology. It seems like such a waste of time when you have customers with real, unsolved problems today.
Unless it's some one off batch process or a prototype, if the codebase doesn't last at least six months then it's unlikely to be something that makes any money.
In my experience, badly designed code tends to become a net loss after a couple of months because after that time someone is going to have to modify or fix it.
Companies that are here to stay and plan to stay for longer, not the hipster wannabe unicorn type
Probably every company outside the startup domain? If you're working on product-market fit, then you can expect to discard lots of systems, however, that's a niche and even startups are only temporarily in that position (unless they fail). For any business where there is a clear product-market fit (which, employee-wise, is pretty much all businesses) the systems rarely go away, they accumulate - if you're not a startup, or if you have found your product-market fit, then you expect that your products and processes won't disappear after 5 months or 5 years, and neither will the code that supports them, unless it's so broken that it's prudent to invest in a full rewrite.
Even if a company fails, their products, processes (and code) usually get absorbed by some other company and need to be maintained. Startups get acquihires that keep teams but discard products; "normal companies" get M&As that discard headcount but keep product lines, divisions and processes that require lots and lots of running code. The large companies often have multiple "inherited" codebases from all the other companies they have absorbed.

And there is a lot of old code running; nothing is as permanent as temporary code. I have seen comments stating "this won't work properly on the boundary between fiscal years, but the system is scheduled to be replaced by then" that were made IIRC 6 years before I was looking at that system, so it obviously did not get replaced back then.

In many industries a 10-year old company is a young company; heck, most of the current "internet startup unicorns" are 10, 20 or more years old. In established industries (you do know that the vast majority of software people work in non-software companies, right? most code is written for internal business needs, not sold as a service or product or consulting to others) there is a lot of mature code serving business processes that have been there for decades, will be there for decades, but often have some changes that require code adaptations too. The same goes for all the code that's inside industrial products: the automotive industry, the home electronics industry, etc. You may have a new model of car every year, but most of the code in that car will be much older than that.
I mean, the trivial fact is that if we look across the whole industry, all the statistics show that the majority of programmer manpower is spent on maintenance. So the total costs of software are dominated by how easy it is to maintain it, and a lot of that comes from proper design that takes into account what the likely needs are going to be after a bunch of years.
> everything is made of bits
Bits are objects, too...
I kind of like the way D handles oop. D gives you structs, classes and interfaces.
Structs are stack-allocated and have no inheritance, but otherwise work syntactically like classes.
Classes are heap-allocated and allow single inheritance from another class.
Interfaces are similar to classes, but their member functions must be overridden by implementers. A class can inherit from multiple interfaces.
I tend to use a mix of these and templates depending on the type of data I'm handling. I find it gives me the best of whatever design pattern works well for different parts of a project, without locking you into a certain paradigm throughout, while still keeping everything fairly logical and coherent to read through and understand.
How has the church of OOP failed? Nearly every widely used language is based almost entirely on OOP. OOP makes organizing software and code reuse incredibly easy. The only real downsides to OOP are that it's arguably slower and has more overhead. But that's only a problem in niche applications (e.g. embedded apps).
It failed by teaching that every data structure should be put into classes using many levels of inheritance, interfaces, encapsulation, accessors and the whole shebang when the thing really is just a plain old integer. OOP is really useful and powerful in many areas and applications but it is not the only tool that has to be used for everything.
That sounds like a failure of the teachers then, not of OOP.
There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy.
OOP-based languages are certainly more common, but definitely not the only game in town. Clojure, Erlang, Lisp, Perl 4, Forth, Fortran, Haskell...
> OOP makes organizing software and code reuse incredibly easy.
That is the big promise and the big lie of OOP. It, in fact, accomplishes the opposite.
The medium used across systems today is data, not objects. Your objects are not compatible with systems across the wire; they need to be converted to data (JSON, XML, ...). They're not compatible with your database; they need to be converted to data (SQL, ...). And if you want to use other people's objects (say, from a library), you first have to write a layer that translates them into your own objects, since objects from other systems won't directly fit the model of your own object system; they always need to be engineered in. And if the objects encapsulate data that you actually need but don't offer a way to get at it (private members), you often have to jump through hoops to get it.
Not to mention the fact that OOP often entails mutable state, which leads to problems in multithreaded processing.
Clearly the answer is to use a more data-oriented perspective and use a programming language focused around data. Clojure gets it right and that's what I use. It's all concise functional code that skips all that class creation OOP loves, instead operating directly on immutable data (numbers, strings, maps, vectors, sets).
I recommend watching some talks by Rich Hickey (the guy who made Clojure). They're almost all excellent.
Except for the fact that objects can be serialized, and ORMs exist to map your data to objects. How do you think Entity Framework works? How do you think Rails works? You can build almost an entire ORM from a database without even writing any code in Entity Framework.
> And if you want to use other people's objects (say from a library) you first have to make a layer that translates them from your own objects
Funny, I've been using .NET framework objects without any type of translation. And the point is to use inheritance to mitigate the translation.
Don't blame the model for its poor use. If there's no way to get the data you need, then that means the class was designed so you didn't actually need it or there are other ways to get it (i.e. through an interface).
>Clojure gets it right and that's what I use.
Except Clojure uses OOP principles and even admits to saying it uses immutable objects in the form of interfaces. Interfaces are essentially stripped down abstract classes. How this is not a subset of OOP, I don't know.
I'm sure Clojure is great, but can you write interactive applications with it without integrating OOP libraries?
> Except for the fact that objects can be serialized and ORM's exist to convert your data into OOP. How do you think Entity Framework works? How do you think Rails works? You can build almost an entire ORM from a database without even writing any code in Entity Framework.
That was actually sort of my point. You need all of this extra stuff _because_ your code is all objects. Data doesn't get serialised, data just gets sent and then it gets received. Why should you spend time serialising and de-serialising an object, when you can just send your map data structure directly? Maps can be represented 1:1 as e.g. JSON. Any JSON data is basically a big map data structure. It's one function call instead of hours of writing ORM classes or custom serialisation methods just to send some data over a wire.
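In Python terms, the "one function call" round trip looks like this (the payload is an invented example):

```python
import json

# A plain map of data -- no class definition, no serializer, no ORM.
order = {"id": 42, "items": ["ham", "eggs"], "total": 7.5}

payload = json.dumps(order)     # one call: data -> wire format
restored = json.loads(payload)  # one call: wire format -> data
```

Because the map and the JSON have the same shape, nothing is lost or translated in between; that 1:1 correspondence is the whole argument.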
> Except Clojure uses OOP principles and even admits to saying it uses immutable objects in the form of interfaces. Interfaces are essentially stripped down abstract classes. How this is not a subset of OOP, I don't know.
You don't use interfaces in Clojure, you tend to use multi-methods for most purposes where an interface is needed.
I wouldn't say Clojure uses OOP principles. Its core is written in Java, so obviously that part is forced to use objects, but that is only used to create the immutable data structures used in Clojure, which are represented as data literals, not as objects. You typically don't operate on objects in Clojure unless you're doing interop with Java or JavaScript. Instead what you do is use pure functions that take immutable data as input and spits out new immutable data. There is no object to consider, only the raw input data. A vector or a map is as much of an object as a struct or an enum. Those are data types that also existed in C, not exactly intended as an object-oriented language.
> I'm sure Clojure is great, but can you write interactive applications with it without integrating OOP libraries?
Well, yeah? Why wouldn't you be able to?
>Maps can be represented 1:1 as e.g. JSON. Any JSON data is basically a big map data structure. It's one function call instead of hours of writing ORM classes or custom serialisation methods just to send some data over a wire
Again, you don't write the ORM classes, the framework does it for you.
And what you advocate is essentially sending a table over the network. So, what happens if the data within that map is complex? Are you suggesting sending every piece of a complex data type over the wire in separate chunks? If so, how do you relate it in the application? You still need to make some sense of that JSON data in your application. Having it in a big map structure is akin to a god object.
In my mind, all you're doing is masking objects in different concepts just because you don't like using classes.
And now you have to rely on a framework, where there's more opportunity for leaky abstractions, more surface area where bugs can show up.
I'm not sure what you mean by complex data - data is data, and using a serial format like edn allows you to encode a lot of different stuff as data - even functions. I think you're stuck in the oo mode where you're passing around objects and classes instead of just data. Data is so much easier to deal with!
> Except for the fact that objects can be serialized and ORM's exist to convert your data into OOP.
Except for the fact that there's usually a mismatch between how your database handles data and how you want to get them back into objects. Cue: Object-relational impedance mismatch.
And while your objects become bigger and bigger and your domain more complicated, you end up relying on ORMs that keep re-creating those objects from the database with every transaction and load lots of data no one requested. And then you wonder why your stuff doesn't scale.
> Interfaces are essentially stripped down abstract classes. How this is not a subset of OOP, I don't know.

It's not. Interfaces have been around way before OO entered the field.
> That is the big promise and the big lie of OOP.
Well, I wonder where all those reusable libraries I'm using all the time, such as Boost, Qt, POCO, openFrameworks, etc., come from then. Am I dreaming them?
> The medium used across systems today is data, not objects. Your objects are not compatible with systems across the wire, they need to be converted to data (JSON, XML, ...). They're not compatible with your data base, they need to be converted to data (SQL, ...)
Not all code on earth is your average server app that communicates with a DB and sends JSON over the internet. I don't think I have even one installed program working like this on my computers. However, I have an office suite, a lot of GUI apps, media authoring software, a music player, a web browser, a mail client... and they are all built with OOP languages (C++ being the one used for the immense majority) and OOP patterns.