i'm back to needing a work icon
Sep. 11th, 2008 01:22 pm
Question for object oriented gurus:
I am currently reviewing some code implementing a current standard of an algorithm i frequently use. (I want to examine some modifications to the algorithm, but i need a good baseline to compare to.) In it we see something like:
struct velocity
{
    int size;          /* number of dimensions actually in use */
    double v[D_max];   /* component values; D_max is the compile-time maximum */
};
There are a lot of these structs - position, quantum, etc.
Thing is, in my code, i generally declare
int num_dim = n;            // this is what they are using size for up above
double position[num_dim];   // arrays sized by num_dim at runtime
double velocity[num_dim];
(quantum, for the record, appears to be taking the place of what i usually declare as a constant Eps, and is used to get around numerical issues when looking for zero.)
and so on. I do not have additional structs. Thing is, i find all this structifying to be sort of pointless and irritating. Pointless because i do not know what the structs are adding to the code. Irritating because i think they add a level of obfuscation, rendering the code not only longer, but also much less readable.
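For concreteness, here is a minimal C++ sketch of the two styles side by side. The function name speed_squared and the value D_max = 3 are made up for illustration, not taken from the code under review; the struct's main contribution is that the size travels with the data, so a caller passes one thing instead of a pointer plus a dimension count:

const int D_max = 3;   // assumed maximum dimension, standing in for the real constant

struct velocity
{
    int size;          // number of dimensions actually in use
    double v[D_max];   // component values
};

// struct style: the size travels with the data
double speed_squared(const velocity &vel)
{
    double s = 0.0;
    for (int i = 0; i < vel.size; ++i)
        s += vel.v[i] * vel.v[i];
    return s;
}

// bare-array style: the caller passes the dimension separately
double speed_squared(const double *v, int num_dim)
{
    double s = 0.0;
    for (int i = 0; i < num_dim; ++i)
        s += v[i] * v[i];
    return s;
}

int main()
{
    velocity vel = {2, {3.0, 4.0, 0.0}};
    double a = speed_squared(vel);              // one argument
    double b = speed_squared(vel.v, vel.size);  // same answer, two arguments
    return a == b ? 0 : 1;
}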
My question - what, if anything, am i missing in this situation? I get, generally, what object oriented-ness does for you. But i haven't used it very much in the past 6 or so years. (Matlab's excuse for object oriented isn't worth bothering with.) Right now i find myself faced with a few examples of modern code that are object oriented up the ass, and it just seems like it all has been taken too far. If i give myself three months will i become a believer? Will i stop feeling like there should be some sort of natural progression through code and adapt to having objects interacting at will?
no subject
Date: 2008-09-12 04:24 am (UTC)

no subject
Date: 2008-09-12 02:55 pm (UTC)

Since there are no cultural associations to organize around data or around code-flow (neither basically OO nor basically procedural), most people do neither. Thus, my comment about it being hard to trace both, which is what happens when you don't work to make it easy to do one or the other. It's entirely possible that OCaml, an unholy hybrid of functional (not pure-functional) and OO, manages to have those OO "Kingdom of Nouns" cultural expectations and organize its programs around data like regular OO programs. I've avoided OCaml, mainly because I'm neither a big OO fan nor at all an ML fan, and what few things I liked about ML would be basically destroyed by an OO type system and what it would do to type inference.
Yes, pure-functional (no modifications or side-effects allowed) does solve some problems with ordering, and especially with concurrency. When I said "functional" I meant more "allowing functions as first class objects and encouraging passing such objects, and large data structures, through function calls" rather than "prevents all side-effects." While the two are usually culturally bundled, they're technically orthogonal issues -- you could have a language as limited as C and still prevent side-effects. It's just that nobody's crazy enough to do that, because avoiding side-effects is much harder than not, so you generally need the language to work harder to accommodate you while you're working out how to do things without all those intuitive imperative tricks like "print this" or "assign this value to a variable" that are now verboten.
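A tiny C++ sketch of that orthogonality (the data are made up): the first-class-function machinery is perfectly happy to carry a side effect, and the same result can be had without one:

#include <algorithm>
#include <numeric>
#include <vector>

int main()
{
    std::vector<int> xs = {1, 2, 3, 4};

    // first-class functions WITH a side effect: the lambda mutates 'total'
    int total = 0;
    std::for_each(xs.begin(), xs.end(), [&total](int x) { total += x; });

    // the same sum with no visible mutation: a plain fold
    int also_total = std::accumulate(xs.begin(), xs.end(), 0);

    return total == also_total ? 0 : 1;
}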
However, if you think of "functional" as meaning "has functions as first-class objects", it's a whole different set of assumptions and problems. Ruby, Python, and SML/NJ are all languages that allow first-class function objects without any prohibition on side-effects (functional, but not pure-functional). So they don't get the advantages you mention, but they still tend to organize code in wacky ways. When you're using a lot of map/reduce code (or its equivalent in other languages), you wind up wanting to create big sequences and pass them through a set of big sequence operations, which is what map and reduce are. It doesn't much matter how you organize your static chunks of data, and barely matters how you organize your data structures at that point -- the heart and soul of the program is going to be that very small amount of nearly-impenetrable code that subtly handles your map/reduce stuff, along with some lambdas which will be defined nearby....
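A rough C++ rendering of that "build a sequence, push it through map and reduce" shape (the data and variable names are invented; std::transform and std::accumulate play the map and reduce roles):

#include <algorithm>
#include <numeric>
#include <vector>

int main()
{
    std::vector<double> v = {0.5, 1.5, 2.5};

    // "map": square every element into a new sequence
    std::vector<double> squared(v.size());
    std::transform(v.begin(), v.end(), squared.begin(),
                   [](double x) { return x * x; });

    // "reduce": fold the squares down to one number
    double sum_sq = std::accumulate(squared.begin(), squared.end(), 0.0);

    return sum_sq > 0.0 ? 0 : 1;
}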
Which is fine, and works well for many people. Hell, my Ruby code looks exactly like that. And I try to organize it as procedural rather than OO, because if I'm going to make it that hard to follow control flow then I might as well organize to make it a *little* easier to follow control flow.
no subject
Date: 2008-09-12 03:24 pm (UTC)

I've seen code that was just a lambda soup. It helped make me despise Ruby, although it wasn't the language's fault (plenty of other things were). Most code I run into sees a function that takes a closure as basically a loop, which is perfectly readable.
no subject
Date: 2008-09-12 04:17 pm (UTC)

Assuming PL == "programming language", you overgeneralize. That is one thing functional means. Much like "OO" can mean "using objects for polymorphism", and usually does, but may not. It can also mean "all types descended from a single parent type," but often does not.
Most code I run into sees a function that takes a closure as basically a loop, which is perfectly readable
Yup, that's actually how Ruby does most iteration.
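In C++ terms the pattern looks something like this (the helper 'each' is hypothetical, not a standard function): the closure the caller hands in is, in effect, the loop body:

#include <vector>

// a function that takes a closure and uses it as a loop body
template <typename F>
void each(const std::vector<double> &xs, F body)
{
    for (double x : xs)
        body(x);
}

int main()
{
    std::vector<double> v = {1.0, 2.0, 3.0};
    double sum = 0.0;
    each(v, [&sum](double x) { sum += x; });   // reads like "for each x in v"
    return sum == 6.0 ? 0 : 1;
}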
And yeah, lambda soup is a pain in the ass. Ruby is interesting because it has little enough in common with most of its predecessor languages that its fans are really still figuring out how to write it, stylistically. So "Ruby style" is all over the map, and still evolving fairly rapidly for a language of its age.
no subject
Date: 2008-09-12 05:02 pm (UTC)

My main beef with Ruby was that it didn't seem to offer much over python, except for tricky design issues that perl and python faced and overcame 5-10 years earlier. What it has come up with since I last got burned is a mature Ruby on Rails, which is the shiznit when it comes to writing web sites, or so I'm told. My main beef with both python and ruby was that they do almost no static checking, which means that if you misspell a function name on a line of code that is only invoked after 30 minutes of computation, you want to kill something when those 30 minutes are up and you just lost the data.
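For contrast, a statically checked language rejects the misspelled name at compile time rather than 30 minutes into the run; a toy C++ sketch (the function names here are invented):

double long_computation() { return 1.0; }   // stand-in for the 30-minute run

int main()
{
    // double r = long_computaton();   // misspelled: a compiler flags this
    //                                 // before anything runs at all
    double r = long_computation();     // the correct spelling compiles and runs
    return r > 0.0 ? 0 : 1;
}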
As for PL, I meant the theoretical programming languages research community. Functional means no side-effects; first-class functions mean lambdas; polymorphism usually means parametric polymorphism, not the god-awful OO inheritance stuff; first- and higher-order types are in vogue; any language in common industrial usage is hardly worth dignifying with the term "language." None of that is really in dispute in these halls (among the PL types -- others are very happy about how Ruby "doesn't get in the way" [of writing buggy code]). Once you drink the kool-aid, it's pretty hard to go back. Languages in industrial usage lag the research community by 20-30 years of course, though Simon Peyton Jones at MSR Cambridge is pushing some pretty modern features into C#.
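For anyone outside those halls, a small C++ sketch of what "parametric polymorphism" means there (the template and names are invented for illustration): one definition, usable at any type, with no inheritance hierarchy in sight:

#include <string>

// parametric polymorphism: one definition, any type T, no base classes involved
template <typename T>
T first_of(T a, T)
{
    return a;
}

int main()
{
    int i = first_of(1, 2);                                        // T = int
    std::string s = first_of(std::string("a"), std::string("b"));  // T = std::string
    return (i == 1 && s == "a") ? 0 : 1;
}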
We're by now pretty far afield of the original topic aren't we...
no subject
Date: 2008-09-12 05:07 pm (UTC)

Most of what I like about Ruby specifically is metaprogramming, which can be thought of as either poor-man's LISP macros or structured-eval-plus-runtime-type-definition. Rails happened in Ruby because Ruby has metaprogramming, which Rails uses quite extensively.
I like Python just fine, but its indents-are-syntactic quirk makes it very hard to embed as a templating language. That's not a huge drawback overall, but again, Rails needs a very robust templating language since a lot of what it does involves generating HTML from templates. And honestly, I just like Ruby syntax better for most stuff.
no subject
Date: 2008-09-12 05:21 pm (UTC)