[conspire] End of an era for Java

Nick Moffitt nick at zork.net
Wed Oct 19 16:20:45 PDT 2011


Tony Godshall:
> On Wed, Oct 19, 2011 at 10:21 AM, Rick Moen <rick at linuxmafia.com>
> wrote:
> > Many people will have missed the fact that Oracle Corp. changed the
> > licensing on Sun Java as of Java SE 7 / JDK7 so that third parties
> > (such as Linux distributions) are no longer allowed to distribute
> > it. ...
> 
> I'm wondering if it makes sense to code in Java anymore.

The joke that Java is COBOL for the 21st century certainly has legs, but
I have come to think of Java more as the FORTRAN of the 21st century.

Fortran is a language with a pedigree, of sorts.  It was the only
language Feynman knew how to code in, according to Danny Hillis.  It's
been used by scientific projects for over half a century now, and you
still see Fortran programs used as the payload for some of the
mind-bogglingly parallel supercomputers they tend to build at national
laboratories.

Scientists and EEs tend to write bad code, as a rule.  This isn't a dis,
really, but they tend to assume that either calculus or large tables
full of data will save them.  As a result, the development of modern
Fortran compilers has focused on anticipating these coding styles and
the assumptions behind them, and on optimizing around them.  So a good
compiler can look at your matrix operations, say, determine that some
expression you've laid out will reduce to an identity matrix, and just
substitute that as a constant.  It will do the same kinds of
optimizations that a program like Mathematica would.
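
To make that concrete, here's the shape of the rewrite, sketched in
Java rather than Fortran since Java is the topic at hand (no real
compiler's internals here, and the class is my own invention):

    public class IdentityFold {
        // Build an n-by-n identity matrix the long way.
        static double[][] eye(int n) {
            double[][] m = new double[n][n];
            for (int i = 0; i < n; i++) m[i][i] = 1.0;
            return m;
        }

        // Naive: y = I * x, computed element by element, the way a
        // scientist in a hurry might write it.
        static double[] naive(double[] x) {
            int n = x.length;
            double[][] m = eye(n);
            double[] y = new double[n];
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    y[i] += m[i][j] * x[j];
            return y;
        }

        // What the optimizer effectively substitutes, having proven
        // that the matrix is the identity: y = x.
        static double[] folded(double[] x) {
            return x.clone();
        }

        public static void main(String[] args) {
            double[] x = {3.0, 1.0, 4.0};
            System.out.println(java.util.Arrays.equals(naive(x), folded(x)));
        }
    }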

So Fortran, like Java, is a language where bad code with good intentions
is made to run well.  Java still leads the herd on a few bits and bobs
of JIT technology, and the relationships between code and VM, and
between VM and bare metal, are so well explored now that you're
hard-pressed to find a more finely-tuned beast.

So what future for Java?  Well, for starters, the whole "VM" model has
been cursed with (as discussed earlier on this list) something of a
MacOS 6 memory model.  You have to pre-allocate a memory arena for any
app, and if the app overruns it you're basically SOL.  So people default
to crazy gigaboobles, plug their ears chanting "NA NANANANANAA", and
leave deployment notes advising increases in memory allocation when the
datasets grow too large.  I mean, the spare pages will all swap out
anyway, riiiight?
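
The arena problem in miniature (the class name is my own invention;
-Xmx and the OutOfMemoryError are the real things):

    import java.util.ArrayList;
    import java.util.List;

    // Run with a small cap, e.g.:  java -Xmx16m Arena
    // The JVM will not grow the heap past -Xmx; it just throws.
    public class Arena {
        public static void main(String[] args) {
            List<byte[]> hog = new ArrayList<byte[]>();
            try {
                while (true)
                    hog.add(new byte[1 << 20]);  // grab 1 MiB at a time
            } catch (OutOfMemoryError e) {
                int grabbed = hog.size();
                hog = null;  // drop the ballast so we can even print
                System.err.println("Hit the -Xmx wall after ~"
                        + grabbed + " MiB");
            }
        }
    }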

As for JITs, the PyPy project has been doing some amazing work with
Python 2.x (and thus delaying anyone's interest in Python 3.x still
further), creating an adaptive VM that can work its way up to speeds
that CPython can't ever reach.  The Python community is still balking at
the quirks of JIT environments that the Javatroids worked through a
decade or more ago ("What do you mean I have to wait ~500 requests
before my app server is up to speed?  Have you the brain worms?!"), but
the solutions to most of these things have already been worked out
solidly in the rest of the standard deployment infrastructure (most
proxy front-ends know how to let an app server warm up before taking any
serious load, these days).
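
The warm-up complaint in miniature, and emphatically not a proper
benchmark (class and method names are mine; -XX:+PrintCompilation is a
real HotSpot flag):

    public class WarmUp {
        // A deliberately dumb hot loop.
        static long work(int n) {
            long acc = 0;
            for (int i = 0; i < n; i++) acc += (i * 31L) % 7;
            return acc;
        }

        public static void main(String[] args) {
            long sink = 0;  // consume results so the JIT can't discard them
            for (int round = 1; round <= 10; round++) {
                long t0 = System.nanoTime();
                sink += work(1000000);
                long t1 = System.nanoTime();
                System.out.printf("round %2d: %9d ns%n", round, t1 - t0);
            }
            System.out.println("(sink = " + sink + ")");
            // Try -XX:+PrintCompilation to watch HotSpot promote work().
        }
    }

Later rounds typically come in much faster once the method gets
compiled, which is the whole "wait ~500 requests" grumble in two dozen
lines.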

And of course there's Java's famous propensity for architecture
astronauts to read the Gang of Four "Design Patterns" book as
prescriptive rather than descriptive, and to treat terminology as
technique.  It's hard to avoid being the sort of cynic who saw BASIC as
brain damage in the 1980s, and writing the authors of this sort of code
off as too far from reality to be saved.  I hold out hope that the shock
of learning something new will bring out diamonds in the rough, and that
the patternheads can keep on refactoring their
AbstractFactoryGeneratorVisitorSingletons in legacy codebases for a
comfortable wage well into retirement.
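
For the uninitiated, the flavor of the thing, with names invented for
the joke rather than lifted from any real codebase:

    public class PatternSoup {
        interface Greeting { String render(); }

        static class HelloGreeting implements Greeting {
            public String render() { return "Hello, world"; }
        }

        interface GreetingFactory { Greeting create(); }

        static class HelloGreetingFactory implements GreetingFactory {
            private static final HelloGreetingFactory INSTANCE =
                    new HelloGreetingFactory();  // the Singleton bit
            static HelloGreetingFactory getInstance() { return INSTANCE; }
            public Greeting create() { return new HelloGreeting(); }
        }

        public static void main(String[] args) {
            // Five types of ceremony, versus:
            //     System.out.println("Hello, world");
            System.out.println(
                    HelloGreetingFactory.getInstance().create().render());
        }
    }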

Myself, I worry more for Python's future.  It seems the really hip kids
these days are all working in Ruby, which is a great language to write
in, but whose code is a lot harder to read than Python's.

When you bring up choice of programming language, the defenders of the
maligned languages (PHP, Perl, etc.) will roll their eyes and say "But a
good programmer can write good code in any language!"  To this I always
say: Mozart could write grand opera in German, but that doesn't mean
it's the best language for the rest of us to work in.  It isn't
interesting that Knuth could conceivably have written TeX in readable
INTERCAL.

Apologies for the meandering.  I blame the Lagavulin.  I ought to give a
double to the sigmonster, this time.

-- 
"If you carefully examine the intercal package (which
was not available for a month despite emails about it
being a 404), you will discover that . is in ESR's
PATH."   -- Joey Hess



