Discussion:
Programming languages, operating systems, despair and anger
Jeff Bone
2009-11-12 19:08:40 UTC
Or, the State of the End-User Environment / Programming Onion...

So, in a confluence of events, I have lately been looking at Scala and
yesterday spent some time looking at Pike / Thompson / Google et al.'s
"Go" language for "systems programming." And Guido halts Python
syntactic evolution before the job is done...

And it fills me with despair and anger.

Language designers in general: You are IN THE WAY. Go here:

http://www.rebol.com/oneliners.html

...then do us all a favor and don't resist the urge to fall on that
wakizashi when your shame overcomes you.

First of all, neither Scala nor Go is anything more than incremental
evolution; Go in particular put me over the edge. It's merely the
latest of a series of languages from these same people (former god-
like heroes to me) --- most recently Limbo, each successive one of
which implements the same ideas in barely, idiosyncratically,
ever-so-slightly different ways. And so for 20 years now these folks
--- *the* shining lights, in many ways, of "practical"
programming-language, operating-system, and general systems research
--- have
continued to fail to "get" the fundamental practical needs of everyday
programmers working in The Real World. "Go" is just another language
written first for the compiler and only secondarily for the programmer
--- and stuck in a 70s mindset* about the relationship of that
programmer to the (digital) world within which they live and work!
(But hey, it compiles fast! Which is, of course, THE problem that
really needs addressing.) (Or rather, a fast-forwarded 70s mindset
projected into a fictional future. I get the sense that Google is a
bit Star Trek-ish; an anachronistic-nostalgic future that doesn't
really exist outside of its own fictional milieu. I mean --- it took
'em 200 years to figure out the implications of the transporter
buffers --- and then they didn't even take advantage of them
consistently? GMAFB. But I digress.)

I think Carl Sassenrath may be the only person on the planet working
in this area that accurately perceives the "pain" involved in general,
everyday, practical programming and is *actually* trying to address
it. Too bad his solution is locked in closed source, in corporate
lock-in, and in a company that *clearly* is suffering from a
completely schizophrenic (lack of) "business model" --- no doubt due
to the influence of investors and suits and isolation. (Not to
mention suffering from ghastly-ugly, mondo-1989 graphic design of its
built-in GUI toolkit components and hence all application UIs.)
(Other candidates for clue include Erik Meijer who, unfortunately,
seems to have been unable to translate clue to practice in many ways,
despite lots of "enterprise-scale" innovation; and Alan Kay, who has
some real ideas ("Worlds" == absolutely awesome idea) but is
apparently suffering from "research / think tank / industry alumni"
inability to reduce research to practical practice.) (Wolfram might
get it... maybe, part of it... but let's not go there, same lock in
probs as Carl's gig but a whole lot worse, and a lot of other issues
too.)

What do we do? Communication, coordination, automation, analysis, and
visualization / animation / interaction. That's IT. Over a few-dozen
basic kinds of information and in a handful of typical hardware
environment / architectural scenarios. Come on!

J.H.C, folks, it's nearly 2010. Let's get a few things straight:

- most programming involves schlepping a few but complex data types
between different string representations
- programmers have become plumbers and documentation-archaeologists
mostly, which is sad and uninteresting
- programming languages are for *programmers* --- not compilers and
compiler-writers
- until you make the everyday, "simple" things simple, it will
continue to be a dark art practiced by fewer and fewer
- any language that makes you explicitly import an IO module to
read a file or stdin is fucked
- declarations are a pointless anachronism (same for explicit
memory management)
- if I have to understand category theory to write a program that
does IO, IT IS A NON STARTER!
- less stupid line-noise syntax and punctuation, people
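The first two bullets are easy to demonstrate. Here is one instant arriving in three different string representations (values invented for illustration), each demanding a different parser from a different corner of Python's standard library:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

# One instant, three string representations -- the everyday "schlepping"
# the list above complains about.  (Values invented for illustration.)
iso   = "2009-11-12T19:08:40+00:00"        # ISO 8601
epoch = "1258052920"                       # Unix epoch seconds
rfc   = "Thu, 12 Nov 2009 19:08:40 +0000"  # RFC 2822, as in mail headers

a = datetime.fromisoformat(iso)
b = datetime.fromtimestamp(int(epoch), tz=timezone.utc)
c = parsedate_to_datetime(rfc)
print(a == b == c)  # True -- same instant, three unrelated parsers
```

Three representations, three unrelated APIs: exactly the plumbing-and-documentation-archaeology work the list objects to.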

SIMPLE GUI PROGRAMMING! Remember BASIC? Logo? Zero to graphics in
three minutes, max. How the hell are kids supposed to learn to
program these days? Even Python's learning curve is too high, IMHO,
though it's probably the closest thing to a reasonable starting ground
that has any real traction. And the GUI bar is raised these days:
kids and non-developers (and busy developers w/o time for little,
interesting toy projects *just because project set-up cost is so high
in almost any language / environment!) need to be able to throw
together complex multi-agent 3D microworlds within minutes. (Yes, I'm
aware of Kodu and friends. It's a start, maybe; but it needs its
textual equivalent; enough with the "only visual programming" crap,
it has shown its limits.)

If you provide some binding / embedding to Tk and you think that's
sufficient to satisfy your GUI needs, then IT'S A NON-STARTER! If you
DO NOT provide a CANONICAL cross-platform GUI toolkit, IT'S A NON-
STARTER!

If it's more than 5 readable lines to produce a "hello, world" web
server --- NON STARTER!

If sending an e-mail isn't a one-liner --- NON STARTER!

Getting a web page or making a simple http post > 1 line --- NON
STARTER!

Figuring out the number of days between two dates > 1 line --- NON
STARTER!
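For calibration, here is how Python fares on the date-arithmetic test (the dates are arbitrary); it does manage a one-liner:

```python
from datetime import date

# Days between two dates: this one genuinely is a one-liner in Python.
days = (date(2010, 1, 1) - date(2009, 11, 12)).days
print(days)  # 50
```

The web-server test is closest to `python -m http.server` from the shell (a static-file server rather than a true "hello, world" server), and sending an e-mail via smtplib still takes a handful of lines --- which rather supports the rant.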

While I'm on a rant, FUCK JSON. I mean, first fuck XML thoroughly and
nastily, but let's call it like it is -wrt- JSON. Anything that
forces me to do this shit:

{
  "someSymbolicKeyAsString": "someUriValueAsString"
}

...is BROKEN BY DESIGN!

NO MORE TUNNELING CRAP IN STRINGS!!!
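A small Python illustration of the tunneling complaint: JSON has no date, URI, or symbol types, so type information is flattened into strings in transit and must be re-parsed by hand on arrival (the record is invented):

```python
import json
from datetime import date

# A date and a URI, both forced to travel as strings inside JSON --
# json.dumps(date(...)) raises TypeError, so we stringify first.
wire = json.dumps({"due": date(2009, 11, 12).isoformat(),
                   "home": "http://www.rebol.com"})
back = json.loads(wire)
print(type(back["due"]).__name__)           # str -- the date-ness is gone
restored = date.fromisoformat(back["due"])  # caller must re-parse by hand
print(restored.year)  # 2009
```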

*REAL WORLD*, modern datatypes, built-in, literal, batteries-included
PLEASE!!! If the following aren't first-class types with first-class
literal constructors / representations supported *at the language
level* then your new programming language ISN'T EVEN ON THE PLAYING
FIELD:

- symbols AS SUCH
- numbers AS A SINGLE TYPE
- strings NOT AS BYTE ARRAYS
- a single, universal aggregate type (cf. Lua tables, more or
less; Rebol blocks)
- e-mail addresses, file / directory paths, and general URI types
- date and time constructions with simple operator arithmetic
- generic markup constructs (cf. Meijer "Unifying Documents [etc.]")
- domain names and IP addresses
- regexes and structural regular expressions
- generalized patterns for pattern-based dispatch
- quantities --- numbers with attached dimensionality
- booleans and *potentially tri-state logic support*
- ranges and slices
- some controlled ability to do open classes
- concurrency constructs AS SUCH
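As a point of comparison, Python reaches most of these types only through library constructors, never literals --- which is exactly the complaint (the values below are arbitrary):

```python
from pathlib import Path
from ipaddress import ip_address
from fractions import Fraction
from datetime import date

# None of these are literals in Python: each value arrives via a
# constructor call, usually with the data tunneled in a string.
p  = Path("/var/log")            # file / directory path
ip = ip_address("192.168.0.1")   # IP address
n  = Fraction(1, 3)              # exact number, distinct from int/float
d  = date(2009, 11, 12)          # date, no literal syntax either
print(ip.is_private, n * 3, d.isoformat())
```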

Also: if your language doesn't include a CPAN / CRAN / etc. repo and
package-distribution capability BUILT IN, it's again a non-starter!
Dammit, people!

And let's get a few things straight: I don't care if it's the OS,
language, or language-integrated runtime (as opposed to externalized
"standard library" or even worse third-party library) but:

- the "operating system" or runtime should provide *real world
abstractions*, too
- higher-level stuff BUILT IN, like:
- person, group, social network, presence, identity,
authority, permission, etc.
- media objects: audio, video, pdf, still images, etc. (just
a few!)
- pages, bookmarks, notes, tasks, todo lists, other lists,
calendars, events, etc.
- spreadsheets, charts, diagrams, formulas, maps, etc.
- tagging rather than hierarchy --- EVERYWHERE
- workspace, context, orthogonal persistence, location-
transparency
- context awareness, "cluestreams", user activity,
remembrance agents enabled / made easy
- not just multi-core single-machine or homogeneous-cluster
programming, but rather:
- a heterogeneous, loosely-connected, multi-device, multi-
input, user-centered universe-of-tools
- multiple machines, multi-machine-aware tools and objects
- programmable / multi-machine clipboard / clipping /
plumbing
- scheduling (i.e. cron), rules, etc. (e.g. things like
"Puppet" unnecessary!)
- caching / replicated / fully versioned home dirs, drop
boxes, etc.
- deep support for both client- and server-side personalizable
HTTP and mail services
- handling of disconnection / replication / synchronization
- dispense with application-level balkanization of data!
- composition and re-use


Grrrr.... it's pretty damn sad that something as limited and now-
ancient as bash represents some kind of optimum of productivity for
many real-world "everyday programming" tasks --- and yet fails so
miserably for so many other everyday programming tasks due to lack of
data abstraction and richness.

If I have to write one more polyglot bash / awk / python script to
gather data from log files on a bunch of different machines, demux
that into a time-ordered event stream, pipe it through something to
munge it into some slightly different format, ship that off via post
to some web address and get some JSON back, parse that into some other
shit, do some computation over it like aggregation or date math over
time stamps with unlike representations, wrap the results up in an
HTML table and send that table in a MIME-enabled e-mail to myself I
think I am going to *explode.*
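For the record, the pipeline above condenses to roughly this sketch, with invented log lines and the HTTP-post and MIME-mail legs elided:

```python
from datetime import datetime, timezone

# Two machines, two timestamp representations (log lines invented).
logs_a = ["1258052920 login alice", "1258052980 login bob"]  # epoch seconds
logs_b = ["2009-11-12T19:08:50+00:00 login carol"]           # ISO 8601

# Demux into one time-ordered event stream, doing the date math over
# unlike representations by hand.
events = []
for line in logs_a:
    ts, _, rest = line.partition(" ")
    events.append((datetime.fromtimestamp(int(ts), tz=timezone.utc), rest))
for line in logs_b:
    ts, _, rest = line.partition(" ")
    events.append((datetime.fromisoformat(ts), rest))
events.sort()

# "Some computation over it": aggregate per action.
counts = {}
for _, rest in events:
    action = rest.split()[0]
    counts[action] = counts.get(action, 0) + 1

# Wrap the results in an HTML table (posting and mailing elided).
rows = "".join(f"<tr><td>{k}</td><td>{v}</td></tr>" for k, v in counts.items())
print(f"<table>{rows}</table>")
```

Every line of that is impedance-matching; none of it is the actual problem being solved.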

90% of the shit that gets written doesn't even involve anything as
complicated as finding set partitions. Really.

Somebody do something about this, before I LOSE MY FUCKING MIND!?!?!


jb
James Tauber
2009-11-12 19:14:52 UTC
And Guido halts Python syntactic evolution before the job is done...
Not forever. Just to let implementations and libraries catch up to 3.X

James
Benjamin Black
2009-11-12 19:48:34 UTC
Post by James Tauber
And Guido halts Python syntactic evolution before the job is done...
Not forever. Just to let implementations and libraries catch up to 3.X
Given Guido's well-documented hatred for metaprogramming and recursion,
it might as well be forever.


b
Damien Morton
2009-11-12 19:31:17 UTC
Post by Jeff Bone
90% of the shit that gets written doesn't even involve anything as
complicated as finding set partitions.  Really.
90% of the shit that gets written involves bridging and adapting
between different utopian ideals that partially implement what you
describe in the rest of your email.

I remember a blog posting about how great Java programmers were
because they could discuss at length the advantages and disadvantages
of 7 different inversion-of-control containers, while .NET programmers
are bad because they have never heard of the term (and instead just
use the one container given to them).

Bleh.

What you describe is a massive undertaking - and one rife with
problems that, when resolved, result in a mess of not-quite-right
solutions. Your platonic ideals will not help you here.
Benjamin Black
2009-11-12 19:59:20 UTC
Post by Jeff Bone
So, in a confluence of events, I have been lately looking at Scala
The most enjoyable Scala-related experience I've had is going to #scala
on freenode and referring to Scala as a 'kitchen sink language'. I was
informed that Scala's (numerous) features are 'orthogonal' and that I
had not read the right (large) set of research papers to understand
that all of its features are not complex, but simple.
Post by Jeff Bone
and
yesterday spent some time looking at Pike / Thompson / Google et al.'s
"Go" language for "systems programming."
http://twitter.com/benjaminblack/status/5603154596

"Wrote my first concurrent Go program: func fuck_yourself() { ; //
8=====D } go fuck_yourself();"
Post by Jeff Bone
And Guido halts Python
syntactic evolution before the job is done...
How could anyone take him seriously after this?

http://neopythonic.blogspot.com/2009/04/tail-recursion-elimination.html

So, I agree with your observations that much of language 'design' is
turd polishing, but I wonder in what field that is not the case. There
are a few bright spots, and the rest is mediocrity. I don't see your
prescriptions as improvements, however.

Any language with reasonable metaprogramming facilities can produce the
result (see all the DSL-ish stuff being embedded in Ruby, for example).
One-liners are useful, yet many useful things can't be expressed as
one-liners without a whole lot of lines underneath, lines that are often
very specific to the task at hand. You have chosen a set of things you
want available for creation of one-liners, but I expect others would
have different ideas of what things are most important. If that were
not the case, this problem would be trivial.


b
Jeff Bone
2009-11-12 20:05:20 UTC
Post by Damien Morton
What you describe is a massive undertaking - and one rife with
problems that, when resolved, result in a mess of not-quite-right
solutions.
And one that has mostly been done, pretty well -- cf. the
aforementioned Rebol. While almost everything else out there is a
"language for the 80s conceived in the 90s" Rebol is a language for
the teens conceived in the 90s.

It's just unfortunately crippled by lots of other baggage. But it's
been out there for a decade now, the strength and utility of its novel
ideas should be patently obvious to anyone, and it's only the myopic
enterprise-developer (rather than the improvisational duct-tape
programmer) that fails to instantly recognize this. So why haven't
the damn language geeks figured this shit out?

Other promising contenders, hobbled for other reasons: PowerShell
(.Net is ghastly, and it shows through) and arc (still too religiously-
lispy to get behind the kind of strict literalism Rebol initiates and
I am endorsing.) Erlang is good for what it is (a language for the
90s designed in the 80s and 90s, finding use in the late-Oughts) but
it's just a DSL for join calculus and large-scale systems-building.
Syntactically a mess, but at least it has symbols --- but bad for much
of the kind of stuff many of us do in terms of data-wrangling. Other
almost-contenders litter the roadside of development: FiSH (both of
them), various Logos, Shoes (DAMN WTLS!), Pluvo, etc. all promising,
never to realize their potential. Various analytical DSLs --- R comes
to mind --- absolutely *shine* but fail to generalize enough beyond
their niche. Etc...

Don't BS me about what a "massive" undertaking this is. It's
obviously do-able, there are 80% existence proofs. It's just that
none of them cover enough of the set of disjoint use cases to get
enough traction --- *or* they fail the adoption test for non-technical
reasons.
Post by Damien Morton
Your platonic ideals will not help you here.
Let's not be rude. As previously discussed, I sit far closer to the
Epicurean end of that false one-dimensional landscape than the
Platonic. But not even that anymore, so much, really.

jb
Jeff Bone
2009-11-12 20:19:29 UTC
You have chosen a set of things you want available for creation of
one-liners, but I expect others would have different ideas of what
things are most important. If that were not the case, this problem
would be trivial.
*I* haven't chosen any such set of one-liners; I'm merely pulling
this stuff out of an existing set of e.g. Rebol one-liners and other
one-liners, *assembled by (many, varied) others working in that
language, which I do not.* The point being, there is a large common
subset of tasks frequently performed on data types that are now common
but didn't exist 10 years ago, for which insufficient language- and
environment-level support forces solutions in most languages to
involve an unpleasantly-large amount of code at an uncomfortably-low
level of abstraction, mostly resolving impedance mismatches between
leaky and unnecessarily different levels of abstractions, data types,
representations, and APIs.

I didn't come up with that set, nor do I suggest that it's
authoritative or comprehensive. But seriously: go look at that list
of Rebol one-liners again. Then go look at the lists of any set of
"one-liners", tiny hacks, etc. (in any such language that supports
such things). Arrange all these lists of one-liners by language
according to language chronology, and see the evolution in abstraction
over time. Observe that, for the most part, things COMPLETELY stalled
in terms of evolving language-level support for "new" data types and
tasks about the same time the 'Net took off --- modulo Rebol and
perhaps a few others.

I am tempted to conclude that *almost every* language designer starts
with a relatively specialized beef with some existing language in some
particular and self-absorbed usage scenario and then sets out to build
a language that addresses *just that beef* in *just that use case* ---
rather than actually attempting to take stock of the problems
*actually, commonly* faced day-to-day in *using computers (and
networks) to do interesting things.*

Others *do* have different ideas, no doubt. I'm not suggesting any
language should seek to be a silver bullet and tile the entire
potential use-case space. However there is *clearly* a *very large*
set of present and anticipatable use-cases that could be covered much
more adequately if only we acknowledged that (a) we live in a
networked world, (b) you're not just using one box, much less sitting
in front of a fucking TTY, anymore and (c) there are several ---
maybe couple-dozen --- data types *that we all use* (and operations on
those) that, if we simply supported them better, would ease all our
lives considerably.

What that list comprises is open for debate. That such an
intersection exists for a VAST amount of the programming that is done
today *by anyone* --- is NOT.


jb
Damien Morton
2009-11-12 20:31:24 UTC
What you describe is a massive undertaking - and one rife with problems
that, when resolved, result in a mess of not-quite-right solutions.
And one that has mostly been done, pretty well -- cf. the aforementioned
Rebol.  While almost everything else out there is a "language for the 80s
conceived in the 90s" Rebol is a language for the teens conceived in the
90s.
If those Rebol one-liners are anything to go by, I'd say it wasn't done
right. It's way out there on the APL branch somewhere, from what I can
see.
Other promising contenders, hobbled for other reasons:  PowerShell (.Net is
ghastly, and it shows through) and arc (still too religiously-lispy to get
behind the kind of strict literalism Rebol initiates and I am endorsing.)
 Erlang is good for what it is (a language for the 90s designed in the 80s
and 90s, finding use in the late-Oughts) but it's just a DSL for join
calculus and large-scale systems-building.  Syntactically a mess, but at
least it has symbols --- but bad for much of the kind of stuff many of us do
in terms of data-wrangling.  Other almost-contenders litter the roadside of
development:  FiSH (both of them), various Logos, Shoes (DAMN WTLS!), Pluvo,
etc. all promising, never to realize their potential.  Various analytical
DSLs --- R comes to mind --- absolutely *shine* but fail to generalize
enough beyond their niche.  Etc...
PowerShell... maybe. Arc is barely a sparrow fart - you can't seriously
be suggesting it is a contender for a Java/Python/etc. replacement?
Erlang suffers from bit-rot, its libraries written by people from
completely different mindsets and now locked into a morass of
inconsistencies. The others I don't know so well.
Don't BS me about what a "massive" undertaking this is.  It's obviously
do-able, there are 80% existence proofs.  It's just that none of them cover
enough of the set of disjoint use cases to get enough traction --- *or* they
fail the adoption test for non-technical reasons.
It is a massive undertaking, and unfortunately one that has only been
done by staged evolution. Java, .NET, and Python are the
cross-platform fully-fledged dev systems I am most familiar with (and
ones that approach the completeness you propose), and they have all
evolved over time and carry different strata of thinking within them.

To take one example, .NET - 1.0 was just Java, 2.0 added generics (a
huge step), 3.0 added LINQ (a limited form of metaprogramming) and type
inference, and 4.0 adds dynamic language features and duck typing. You
could make a case that all of these things should have been in 1.0,
and you can certainly make a case that their not being there has
resulted in a massively confusing library system. Then again, the
project would have been canned if they hadn't shoved 1.0 out there when
they had. Worse is better, blah blah blah.
Your platonic ideals will not help you here.
Let's not be rude.  As previously discussed, I sit far closer to the
Epicurean end of that false one-dimensional landscape than the Platonic.
 But not even that anymore, so much, really.
Not being rude - trying to make an allusion to a scene in a movie
"your god will not help you here".
Jeff Bone
2009-11-12 20:42:36 UTC
Post by Damien Morton
If those Rebol one-liners are anything to go by, I'd say it wasnt
done right. Its way out there on the APL branch somewhere, from what
I can see.
You must be high. CLEARLY this is a Lisp-derivative, just from the
surface syntax. (In fact, it is exactly that. It, like so many other
things, started life as a Scheme interpreter w/ a surface
m-expression-ish syntax and additional literal data type constructor
syntax. I mean, literally, it did. Ask Carl et al. Or Google for it,
this was
discussed by somebody somewhere.)

APL my ass. Clearly you snoozed through your "programming language
genealogy" class. J, K, even f-script --- sure. Rebol's a Lisp.
That's obvious to anyone that's ever used either any Lisp or any APL-
derivative.

jb
Jeff Bone
2009-11-12 21:15:06 UTC
a) Clean syntax b) Rich basic datatypes c) Functional operations d)
Metaprogramming
Am I missing your point?
While you are correct in naming these 4 as general facilities
required, they are the necessary but not sufficient set. Or rather,
sufficient but not adequate, kind of like claiming that since almost
any language is Turing complete there's no benefit to using
C vs. assembler.

First point: Ruby doesn't supply "syntactic closure" over a large-
enough subset of its data types. Almost no language does! I.e.,
there aren't enough literal data type constructors. This is *the*
main advantage of e.g. Rebol over anything else, these days. The
ability to simply literally construct various data types directly is
THE key to resolving a lot of the various impedance mismatches that
characterize most coding today (cf. Meijer, http://tinyurl.com/yfbal8z
--- though this is the work that eventually became LINQ, nothing Rebol-
related.) (You could argue that homoiconic languages provide this in
a much more general and powerful way, but if that were true Lisp
wouldn't have evolved to have strings. Ironically, bash achieves much
of its power by simply treating *everything* as a string.)

Second point: A rich standard library (or standard class library)
isn't the same, expressively, as having a lot of "rich basic
datatypes" that are supported at the language level. And even Ruby
doesn't get particularly further than other similar entrants (Lua,
Python, I'm looking at you.) It might have a slight advantage over
both of those, though I have to say Lua's tables rock the free world.

Third point: write a program that, when your phone is near your
laptop at work, checks your security system at the house, determines
via various heuristics whether there's anyone else in the family home,
locks the doors remotely if they aren't and arms the alarm, and sends
you a message (determined by where you're most active --- SMS if
computer is idle, IM if not idle, copies e-mail in either case)
indicating that it detected that the last person out (also determined
heuristically by spanning machines to build a complex "context") forgot
to lock the door. Also, if this person was your teenage son, sends
him a text indicating that he's grounded, gets on his cell provider
web page and "locks down" his account to strictly-family-and-emergency
use, and automatically logs into the bank and changes the PIN
on his ATM card. (Where all does this program run? How does it
orchestrate all this activity? Etc.)

That *SHOULD* be simple. It should be TRIVIAL!

It's not, and it won't be until we start picking apart what all the
abstractions are and supporting them better.

Re: Shoes... Shoes is a bright light along the way, not so much
strictly from a language level, but from recognizing and making a run
at a large subset of the problem in the first place. If you get why
Shoes is cool, and you understand the use case I described above and
why existing stuff is *essentially* and *fundamentally* prohibitive of
addressing that in any sensible way, then you can bootstrap your way
to understanding the kinds of "other things" that belong on your list
besides (a)-(d).

Sadly, Shoes was actually a kind of back-to-the-future attempt. It
was sort of an attempt to get back to the ability to simply put
together rather interesting, useful, or fun kinds of little graphics
apps that folks used to build in these little language environments
like BASIC and Logo and such. An attempt to build, say, a kind of
Squeak w/o the cultism. But we need to not just get back to the
future --- we need to push the future further along. At least to the
*present.*




jb
Benjamin Black
2009-11-12 21:24:44 UTC
Third point: write a program that, when your phone is near your laptop
at work, checks your security system at the house, determines via
various heuristics whether there's anyone else in the family home, locks
the doors remotely if they aren't and arms the alarm, and sends you a
message (determined by where you're most active --- SMS if computer is
idle, IM if not idle, copies e-mail in either case) indicating that it
detected that the last person out (also determined heuristically by
spanning machines to build a complex "context") forgot to lock the door.
Also, if this person was your teenage son, sends him a text indicating
that he's grounded, gets on his cell provider web page and "locks down"
his account to strictly-family-and-emergency use, and automatically logs
into the bank and changes the PIN on his ATM card. (Where all
does this program run? How does it orchestrate all this activity? Etc.)
That *SHOULD* be simple. It should be TRIVIAL!
This is a key point. Many of the problems we face in writing software
are integration, not simply automation. Many of the facilities you
describe as desirable are only relevant for integration. This is
actually the line of logic that led to the creation of the Chef
automation tool: Puppet tries to be all things itself, while Chef
assumes the internal bits are easy, but talking to the inevitable
outside stuff is hard.

Now, leaping from there to "only integration-focused languages need
apply", I don't know if I can manage.


b
Jeff Bone
2009-11-12 23:16:48 UTC
This is a key point. Many of the problems we face in writing
software are integration, not simply automation.
Actually, that is THE point, the whole point, and nothing but the point.

Erik Meijer realized this, cf. the paper I referenced earlier. MOST
programming these days is pipe-fitting between different data models,
communication models, and data types. (And without any real standards
about those pipe fittings, but that's a different part of the
picture.) MOST OF IT. ALMOST ALL OF IT. And most of *this* happens
at the lowest, gorpiest, ugliest level possible.

Markup doesn't help, it hurts. YAML, semi-structured text like
various Wiki markups, etc. are an improvement for some things. JSON
is an improvement for what it does, but it's still world-of-strings
wrapped in maps.

One *major* impediment to all integration tasks is this: when you're
passing your data between components, you generally either (a) are
trapped in a very type-specific and strict implementation regime (RPC
stubs, CORBA IDL, and now e.g. Thrift and Google's protocol buffers
are attempts to resolve the problem, not very successfully IMHO as
they imply a kind of development cycle involving a lot of static, non-
interactive crap) or (b) you're marshalling and unmarshalling strings,
parsing files that are strings that encode data structures that embed
strings ad infinitum --- probably with lossy semantics.
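Case (b) in miniature: an invented log line whose third CSV field is itself a JSON document, so three layers of string decoding stand between the bytes and the data:

```python
import csv, json, io

# Strings encoding data structures that embed strings: CSV wrapping JSON
# wrapping a date-as-string.  (The line itself is invented.)
line = '1258052920,web01,"{""status"": ""ok"", ""when"": ""2009-11-12""}"'
row = next(csv.reader(io.StringIO(line)))  # layer 1: CSV
payload = json.loads(row[2])               # layer 2: JSON
when = payload["when"]                     # layer 3: still only a string
print(when)  # 2009-11-12
```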

Let the programmer express the common type values they want to express
directly, literally, in a way that makes it unambiguous; do so in a
generic way that doesn't entail the kind of massive machinery and
process that inhibits on-the-fly interactive programming of the style
we used to enjoy. THEN and only then will we begin to move in the
other direction -wrt- this impedance mismatch / integration issue that
will otherwise increasingly KILL forward progress ENTIRELY.

It starts with a data language. Lua started life as a configuration
file / data language, but is too impoverished to serve the real
purpose --- and it's still not a particularly good language for
interactive use.

Rebol's got the data literals, but... baggage. And not good on the
interactivity front either.

Sigh...


jb
Stephen D. Williams
2009-11-13 01:30:00 UTC
Post by Jeff Bone
This is a key point. Many of the problems we face in writing software
are integration, not simply automation.
Actually, that is THE point, the whole point, and nothing but the point.
Erik Meijer realized this, cf. the paper I referenced earlier. MOST
programming these days is pipe-fitting between different data models,
communication models, and data types. (And without any real standards
about those pipe fittings, but that's a different part of the
picture.) MOST OF IT. ALMOST ALL OF IT. And most of *this* happens
at the lowest, gorpiest, ugliest level possible.
Markup doesn't help, it hurts. YAML, semi-structured text like
various Wiki markups, etc. are an improvement for some things. JSON
is an improvement for what it does, but it's still world-of-strings
wrapped in maps.
One *major* impediment to all integration tasks is this: when you're
passing your data between components, you generally either (a) are
trapped in a very type-specific and strict implementation regime (RPC
stubs, CORBA IDL, and now e.g. Thrift and Google's protocol buffers
are attempts to resolve the problem, not very successfully IMHO as
they imply a kind of development cycle involving a lot of static,
non-interactive crap) or (b) you're marshalling and unmarshalling
strings, parsing files that are strings that encode data structures
that embed strings ad infinitum --- probably with lossy semantics.
I've been thinking of this stuff for a long time. esXML / esDOM was one
of my attempts to solve the problem for many integration cases. I won't
bore you with details, but I was right about a number of things. I'm
thinking of reworking Google Protocol Buffers in a couple of ways to merge
the ideas.

Done with libraries, but designed to be fundamental to a language.
Basically, everything can be done dynamically, and wire format is the
same as memory format: no parsing or serialization, with little overhead
(but some) and little wasted space. This doesn't solve everything;
however, it would suffice for many business applications. A high-profile
application I helped design used this API over standard XML DOM.
http://esxml.org/w3cout/img40.html
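A rough sketch of the "wire format is the same as memory format" idea, using a fixed binary layout read in place with no separate parse step (the layout is invented here, not esXML's actual format):

```python
import struct

# A record whose bytes on the wire are its bytes in memory: a little-
# endian uint32 id followed by an 8-byte padded name.  (Layout invented.)
buf = struct.pack("<I8s", 42, b"alice")

# The reader overlays the same layout on the buffer -- no parse step,
# no intermediate object tree, just field access at fixed offsets.
ident, name = struct.unpack_from("<I8s", buf)
print(ident, name.rstrip(b"\x00").decode())  # 42 alice
```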

However, as mentioned, we also need better data and interchange models.
While XML and XML-like tree models (which are also like object hierarchy
models) are nice, they don't compete with graph-based models in terms of
flexibility and change / difference resiliency. My current thinking is
that XML-like (or microformat-like) data should be considered a view of
graph-based data (RDF et al), so that both are unified to some extent.

An interesting, but old, sequence of API choices:
http://www.xml.com/pub/a/2003/10/15/dive.html

Google APIs use AtomPub for getting and setting all data.
Post by Jeff Bone
Let the programmer express the common type values they want to express
directly, literally, in a way that makes it unambiguous; do so in a
generic way that doesn't entail the kind of massive machinery and
process that inhibits on-the-fly interactive programming of the style
we used to enjoy. THEN and only then will we begin to move in the
other direction -wrt- this impedance mismatch / integration issue that
will otherwise increasingly KILL forward progress ENTIRELY.
We need a lot more direct expression with a lot less plumbing. A
library of data / application situations and one or more of the simplest
possible pseudocode / nirvana language code snippets would be useful.

person p; p.FirstName="Bob"; p.LastName="Smith";
msg b { type="newCustomer"; p; };
status={b.put http://example.org/application; };
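For comparison, the pseudocode above rendered in today's Python. The shapes and the endpoint come straight from the snippet; the `put` helper is a hypothetical stand-in for an HTTP call, just to keep the sketch self-contained.

```python
# A rough Python rendering of the three-line pseudocode above. Note how
# much ceremony even the "simple" version needs compared to the ideal.
import json
from types import SimpleNamespace

p = SimpleNamespace(FirstName="Bob", LastName="Smith")
b = {"type": "newCustomer", "person": vars(p)}

def put(url, payload):
    # stand-in for an HTTP PUT; a real one would use urllib.request
    return {"url": url, "ok": True, "bytes": len(json.dumps(payload))}

status = put("http://example.org/application", b)
print(status["ok"])
```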

How about some actual examples of your ultimate data formats?

Besides general verboseness and maximum expressiveness, what are you
optimizing for in particular?

Perhaps pluggable data representation parsing modules would allow enough
arbitrary in-place data formatting.

In addition to the things-that-should-be-straightforward list you gave,
we need these at least:

We need to be able to represent:
hierarchical data members (instances of object hierarchies, XML, JSON,
various other examples)
graphs, including all degenerate forms such as lists, and a superset of
RDF-like knowledge graphs
tables, sparse tables
arbitrary text, including full Unicode
spatial / temporal (and similar complex) data
numbers and rationals
images, diagrams, 3D models, sounds, and other multimedia constructs
efficient arbitrary data formats (genome, protein)

We need this to be clean and clear, to have concise in-memory and
on-the-wire formatting, and to support all needed operations. This
especially includes minimal-agreement interchange between subroutines,
processes, communications links, web services, database storage, queues,
and other kinds of integration. Versioning, transactions, and similar
should be supported in memory, between modules or programs, and
remotely, preferably in some unified way.
Post by Jeff Bone
It starts with a data language. Lua started life as a configuration
file / data language, but is too impoverished to serve the real
purpose --- and it's still not a particularly good language for
interactive use.
Rebol's got the data literals, but... baggage. And not good on the
interactivity front either.
Sigh...
jb
sdw
Jeff Bone
2009-11-13 02:27:46 UTC
Permalink
Post by Tom Higgins
Neither the prog langs nor the bitching about them has changed all
that much since I got into it in the late 70's.
That's true --- and sort of makes the point. The *nature* of our
computing environment has changed *a lot* since the late 70s --- but
with a very few exceptions, *nothing* that we're doing with our
languages and data today wasn't either already in place or thoroughly
described and thought out by about that time.

There's still a damn *teletype* in the middle of things in your spiffy
Mac or Linux notebook, for xsakes. (And for that matter, the
operating system underlying both is still essentially the same,
concerned with the same things, though buried under millions of lines
of bolt-on cruft that was added in a fashion inconsistent with the
architectural style of the OS itself, e.g. Berkeley sockets.)

Both languages and operating systems have *obviously* failed to keep
pace with the increasingly-rich data, forms of interaction, and
diversity and ubiquity of devices we all swim within these days. I'm
not sure old-timer "been there, done that, had that flame war 30 years
ago about goto" arguments are going to help jar us out of the cul-de-
sac.

Though I'll spot you points... you are making the point. And despite
my focus on integration, there's an even bigger boogeyman looming that
we've known about all along but mostly completely ignored. John Backus
called it in his 1977 Turing Award Lecture re: the dilemma we now
face doing multi-core. A fringey few have been seriously concerned
about what he had to say since that time, but even the language
designers who took him seriously mostly failed to get the point.
(Maybe "stay the course" would be a better phrase.) (I.e., the
Haskell community's category-theoretic gestalt sent them into the
weeds of academic la-la land from which they have, and can, never
return.) Joe Armstrong's perhaps the only guy that "got" the problem
and delivered a practical solution; maybe Gelernter, though his
solution was more dissimilar to what Backus envisioned than
Armstrong's. But they've been mostly ignored for the last two
decades, and it's only now that the magnitude of the issue is --- and
their solutions are --- really appreciated.

Don't get me wrong, Pike et al. did something about it w/ Limbo a
decade ago, and now that's resurrected in this "Go" revenant with its
too-cutesy "goroutines." But IMHO, the problem still has not been
thoroughly or adequately solved, and even where there are partial
solutions they are several years of learning-curve (at least) ahead of
it being reduced to practice by most everyday programmers.
(Seriously. Even Very Smart Folks (tm) still try to do all kinds of
stupid shit with threads. Go figure.)

Anywaze...


jb
J. Andrew Rogers
2009-11-13 04:26:15 UTC
Permalink
John Backus called it in his 1977 Turing Award Lecture re: the dilemma we now face doing multi-core.
In the last year or so I have been working with a relatively rich and diverse set of massively parallel architectures, including some that are pretty exotic. One of the hard-won lessons learned from a software design standpoint is that the model of multi-threading that almost every programmer is learning today is, to be blunt, dead wrong. The experience has been enlightening.

In conventional multi-threading models we are given the concept of synchronization primitives. We are correctly led to believe that synchronization is expensive, and most code is written for this model, but that assumption does not scale.

When designing software for massively parallel systems, the correct model is one of two assumptions: that synchronization is free or that synchronization is non-existent. Both assumptions will vex the conventional programmer. For the former assumption, just about everything you traditionally learn in computer science is naive and under-exploits the power of this model. These models also tend to be exotic; while they scale, not that many people have experience with them. For the latter assumption, which most real-world "synchronization is expensive" models collapse to at scale, we have a different problem where synchronization -- such as it is -- is not a simple matter of defining a sync variable. We really do not prepare people for this scenario either.


The problem with multi-core is not so much the idea of multi-core as it is an implementation that is attempting to scale the idea that synchronization is expensive. Just about all of our programming languages parrot this assumption. If they want the concept to scale, they will have to make it either free or non-existent at the architecture and language level. A few languages do an adequate job of "synchronization is non-existent", and none (that I can think of) support the notion that synchronization is free (it is typically implemented as a set of extensions to standard languages in lieu of traditional synchronization primitives).
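One reading of the "synchronization is non-existent" camp, the share-nothing message-passing style alluded to upthread via Armstrong's Erlang and Go's goroutines, can be sketched minimally in Python. This is an illustration of the model only, not anything from the exotic architectures described above.

```python
# Share-nothing sketch: a worker owns its state and exchanges messages
# over queues, so no user-level locks appear anywhere in the program.
import threading
import queue

def worker(inbox, outbox):
    while True:
        n = inbox.get()
        if n is None:          # sentinel: shut down
            break
        outbox.put(n * n)      # all state is local; nothing is shared

inbox, outbox = queue.Queue(), queue.Queue()
t = threading.Thread(target=worker, args=(inbox, outbox))
t.start()

for n in range(5):
    inbox.put(n)
inbox.put(None)                # tell the worker to stop
t.join()

results = sorted(outbox.get() for _ in range(5))
print(results)  # [0, 1, 4, 9, 16]
```

The queues do synchronize internally, of course; the point is that the *programming model* exposes only message send/receive, which is what lets it scale out in a way that lock-per-shared-variable code does not.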
Tom Higgins
2009-11-13 18:57:28 UTC
Permalink
Both languages and operating systems have *obviously* failed to keep pace
with the increasingly-rich data, forms of interaction, and diversity and
ubiquity of devices we all swim within these days.  I'm not sure old-timer
"been there, done that, had that flame war 30 years ago about goto"
arguments are going to help jar us out of the cul-de-sac.
I think the 450-pound elephant in the room is this: why are we even
doing programming anymore? As you say, with multi-core nth-dimensional
aspects on the table I think it is safe to say the meatbags have a
class issue, as in we do not have the right stuff to do what needs
doing well enough, consistently enough, and with a steady ramp-up of
iterative betterment to support the coming Multilarity(tm). What we
seem to do well is the same stuff over and over in different dialects
and with different interfaces. Sure, a chimp can work a typewriter,
and yes, the old chestnut about a million monkeys pumping out Hamlet
might be right... or wrong... it does make a good illustration to bring
to this party.

Could it be our best course would be to build the things that build
the code, or even build the things that will then assemble themselves
to build the things to build the code?

I for one wish to extend my warmest welcome to our skynet overlords,
now can you please pay my bills , heat my house, generate enough
income to do both as well as the cooking, email/twitter/fb/irc
filtering, cleaning and bittorrenting...thanks. I will be over here
getting busy being a post multilarity(tm) human...nanunanu.


-tom(The Multilarity(tm) will not be televised, it will be shown on
Hulu (5 trailing episodes only))higgins
Benjamin Black
2009-11-13 20:38:04 UTC
Permalink
Post by Tom Higgins
Both languages and operating systems have *obviously* failed to keep pace
with the increasingly-rich data, forms of interaction, and diversity and
ubiquity of devices we all swim within these days. I'm not sure old-timer
"been there, done that, had that flame war 30 years ago about goto"
arguments are going to help jar us out of the cul-de-sac.
I think the 450-pound elephant in the room is this: why are we even
doing programming anymore?
That is a perspective shared by many a windmill tilter. Have a look at
Charles Simonyi's Intentional Programming diversion for one of the
longest struggles to make that work. It's not even worth doing, in my
opinion.
Post by Tom Higgins
As you say, with multi-core nth-dimensional
aspects on the table I think it is safe to say the meatbags have a
class issue, as in we do not have the right stuff to do what needs
doing well enough, consistently enough, and with a steady ramp-up of
iterative betterment to support the coming Multilarity(tm).
This is a required view to support the previous assertion, but is also
not a given. We have grown how many orders of magnitude in devices and
traffic over the past 20 years? The 'Multilarity' has been here for some
time, it's just not evenly distributed.

Different abstractions are most appropriate for expressing different
concepts. We see this in art, in natural languages, everywhere.
Programming languages are not exceptions. We have different ones
because they are better at different jobs, or better at mapping from how
some folks think to what the machines do. The right abstraction for the
job is often not more abstraction, just a different one.
Post by Tom Higgins
What we
seem to do well is the same stuff over and over in different dialects
and with different interfaces. Sure a chimp can work a typewriter,
and yes the old chestnut about a million monkeys pumping out Hamlet
might be right... or wrong...it does make a good illustration to bring
to this party.
This is only a good illustration if your point is that programming is
often a very creative activity, like writing prose, and the notion that
we want machines to automatically write all our code is counter to our
enjoyment of it, even assuming you could build such beasts. I don't
think that is your point, but it would be mine.
Post by Tom Higgins
Could it be our best course would be to build the things that build
the code, or even build the things that will then assemble themselves
to build the things to build the code?
See above.


b
Tom Higgins
2009-11-14 00:11:33 UTC
Permalink
Post by Benjamin Black
This is only a good illustration if your point is that programming is
often a very creative activity, like writing prose, and the notion that
we want machines to automatically write all our code is counter to our
enjoyment of it, even assuming you could build such beasts.  I don't
think that is your point, but it would be mine.
Much of what I see in code is not art, it's plumbing, duct tape and
rehashes. Now, to be sure, there are MacGyver moments to be had, but
mostly it's the mental equivalent of showing butt crack. Those
overdone and mundane parts are long overdue to be abstracted away to
mouse gestures and/or includes, depending on what religion you favor.

The creative, the joyous endeavors to make with the new jimmies and see
things not seen before... the aha moments... those are the elusive bits
that go beyond the plumber's crack or hackneyed bash scripting. Often
it is old code in new containers, often new ways of communicating bits
across various meat and otherly interfaces... take for instance the
SixthSense technology.

But the same tired crap that many folks have been coding and recoding
for years, decades, etc. in order to somehow claim relevance or have
job security... yeah, that shit is old like mold and drippy like a hippy.

A place for learning the guts of a bubble sort, the whys of how
multiplication works, the alternative paths you can take for the
longest-path problem... sure, sure, that's what the padawan stage is all
about... take out your CARDIAC boards and bum 3 steps out of this
division routine... A time now and again to touch back and reconnect
so that you can move forward, sure. But as a place to orbit in
perpetuity with maybe the hope of changing the lingo every now and
again... yeah, that's the rub right there.


-tom(of course this is all the subjective view of a 45 year old who
does not code for a living anymore, take it for what it's
worth...please have exact change)higgins
Russell Turpin
2009-11-14 12:40:55 UTC
Permalink
Post by Tom Higgins
Much of what I see in code is not art, it's plumbing, duct tape and
rehashes.
Yep. Which causes me to notice that we have not yet automated
plumbing. When our water pipes leak, we still call a guy in a truck.
Damien Morton
2009-11-14 13:38:32 UTC
Permalink
http://lua-users.org/lists/lua-l/2009-11/msg00576.html
"""
There is nothing new under the sun.

Have you seen Algol 68? As you might imagine, it's an Algol variant that
came out in 1968. (Which makes it older than I am.) It begat Pascal,
Modula, Oberon etc and is a cousin of C. (I think it's Lua's uncle.)

...

So, to get back to the point: Go vs Algol-68. TBH, I think the
41-year-old language is richer, clearer and more expressive. It's
certainly missing some features, like any form of object orientation,
polymorphic functions, but the overall design is much more consistent
and well thought-out.

In 1968 Neil Armstrong was still pootling about in orbit and integrated
circuits were rocket science. Why am I able to compare a language from
this era with a contemporary one on an equal basis? Because,
depressingly, state-of-the-art in programming languages hasn't moved
much in those 41 years.

I think that it's entirely plausible that if someone were to take the
Algol-68 spec, redraft it in modern terminology (types instead of modes,
names instead of variables, etc), update some of the odder areas such as
transput, give it a catchy Web-2.0 name and produce a decent compiler
for it, then it would be heralded as the next great development in
programming languages. Which is kind of depressing...
"""
Eugen Leitl
2009-11-14 15:17:59 UTC
Permalink
Post by Russell Turpin
Yep. Which causes me to notice that we have not yet automated
plumbing. When our water pipes leak, we still call a guy in a truck.
Of course you realize that plumbing is a hard problem. Arguably,
Turing-complete, or nearly. Making plumbing easier would require
complete overhaul of our infrastructure.

It would be much easier to just grow stuff in place.
kelley
2009-11-21 19:45:09 UTC
Permalink
Late to the party, but wanted to ask: who here has read Dreaming in Code? I
know one of the stars of the book is on this list, but curious if any
FoRKers have read it and what they thought of the book? I loved the book
and have gotten the CIO to read it and managed to slip some of it into
various presentations I've had to give. I have definitely been all fanboy
about it for a while, but always interested in hearing better criticisms
than mine.

Currently reading Pragmatic Thinking and Learning -- with which I have some
disagreements (mostly at the nitpicky level of not comfortable with his
surface understanding of brain research which makes me doubt his expertise
and wonder if he's fully grasping emerging research). Has anyone here read
Andy Hunt's work, or been to a talk? He was in our area recently, but I had
a presentation the next day, so didn't attend. I heard Hunt was pretty good
at the talk but he'd had to condense most of his work into 30 minutes when
he normally conducted all day seminars.

kelley

Jeff Bone
2009-11-16 05:06:33 UTC
Permalink
It is my pious hope, to quote Roger Penrose, that none of the
challenges I've described above are fundamental and all could be
solved with only a modicum of effort from some motivated folk.
Whether they are the same sort of problems that raised Jeff Bone's
ire I can't say, but I remain quite optimistic there isn't cause for
despair or anger in this.
Rather stream-of-consciousness, caveat lector.

Well-acquainted with both Puppet (as in, user) and Chef (not a user,
but definitely an admirer.) And I'll even go so far as to say that,
indeed, they both do in fact represent a kind of benchmark in terms of
what's reasonable and possible in expression of complex integration
environments and scenarios. (Particularly Chef; I didn't choose
Puppet over Chef in the situation in which I'm acquainted with it, but
probably would have had it the other way around had it been my call.)

That said, neither is quite the answer to either the programming-in-
the-large or programming-in-the-small scenarios I envision; they
don't make solving either of the (still-toy) use cases I sort of
informally sketched out as *trivial* as I believe they could be, with
only a bit more batteries-included abstraction supported *throughout
the toolchain.*

All that information, buried in all those strings. Ruby's DSLish
support improves things a bit. Not enough, IMHO.

Despair and anger? I mean come on. How can you experience anything
else when e.g. (a) it's nearly 2010 and *printing still doesn't work
right, really*, and (b) there's a damn teletype in your sleek MacBook
Air, and (c) sockets, network servers, and files all still live in
different namespaces, with different non-generic interfaces, in most
contexts? (Particularly the latter is frustrating: I see Plan 9,
circa early-90s, perhaps less so Inferno a bit later, as sort of
practical benchmarks in *how* systems integration should happen.
Maybe also Plan B and Octopus, slightly later but definitely academic-
research progeny. But the kinds of things enabled by a little
abstraction, universally applied in such environments is science
fiction in "real world" operating systems and user environments. And
*those folks* are the very ones that are now obsessing over receiver
type designation and return value type declaration syntax in a
"systems programming language" that looks only epsilon different than
the language experimentation these guys were doing nearly two decades
ago. How can you innovate wildly at the system level of things yet be
so incredibly overcautious in your language experimentation...?)

Perhaps your level of nerviness is less than mine, or perhaps I just
enjoy a good rant more than you. ;-)

Can you abstract away a lot of useful stuff with existing tools and
types, enabling some kinds of easier integration and using
representations of those higher-level things in terms of lower-level
things? But of course; "solving" many (valuable) problems of
integration at that level is quite tractable and not at all
fundamental, and actually quite useful. Hence e.g. Chef. But is it
sufficient in the long run? I don't think so. Let's be honest: do
we really think that Chef has the potential longevity of, say, the
Bourne Shell? Doubtful, and I'd guess that even its authors would
agree; it's a point along the way and a useful one, but it's
evolutionary. We need a revolution. Well, at least I *want* one. :-)

Absent another pipelines-like "aha" moment about how things should
plug together, talk to each other, and exchange data --- maybe some
reconsideration of namespaces and generic interfaces, ala Plan 9, is a
reasonable starting point for some serious contemplation --- the
future is doomed to being rife with Vinge-ian software
archaeologists. That's even assuming we even get there, which
possibility is made more distant by saying that the state-of-the-art
in such things circa 1970 (or 1990, or whatever) is "good enough."

E.g., I can "pretend" that the tabular data flowing through my shell
pipeline into my awk program is actually something structured as
"records" consisting of "fields" --- and awk itself does so.
Semantically preserving that intent across an arbitrary pipeline of
arbitrary components written by arbitrary other people at arbitrary
points in the past --- not so much a winner. Raising the bar by
making those abstractions more first-class and preserving those types
in some transitive yet reflective rather than strict way *definitely*
leads to better expression and better composition.
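The awk point can be made concrete with a toy contrast; JSON lines here is just a stand-in for any self-describing record format that could carry the "records of fields" intent across stage boundaries.

```python
# Contrast: stage boundaries that carry types (self-describing records)
# versus awk-style whitespace fields, where every downstream stage must
# re-guess what the bytes mean.
import json

rows = [{"user": "alice", "hits": 3}, {"user": "bob", "hits": 10}]

# awk-style: flatten to text; downstream re-splits and loses the types
flat = "\n".join(f"{r['user']} {r['hits']}" for r in rows)
reparsed = [line.split() for line in flat.splitlines()]
# reparsed[1][1] is the string "10" -- the int-ness was lost in transit

# records-with-types: each stage reads/writes one JSON object per line
wire = "\n".join(json.dumps(r) for r in rows)
typed = [json.loads(line) for line in wire.splitlines()]
print(typed[1]["hits"] + 1)  # still an int: 11
```

The structured pipe composes exactly like the text pipe, but arbitrary components written by arbitrary people at arbitrary times can agree on what a field *is* without re-parsing conventions.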


YMMV...


jb


PS - "pious hope" --- well, exactly.
Michael Cummins
2009-11-12 20:41:30 UTC
Permalink
Post by Jeff Bone
isn't a one-liner --- NON STARTER!
I've been really happy with Cold Fusion, been writing applications with it
since version 5.

Your enumerated one liners are pretty much one liners, plus you can leverage
your Java and .NET knowledge into it fairly easily. Since Adobe also owns
Flash, it integrates quite nicely with that as well for some nice AS3
powered interfaces. It plays well with XML and JSON, and has all kinds of
factory installed goodies. Recently they added ORM support, which I haven't
really played with yet, but it looks like a real time saver for the many
simple applications I churn out almost daily.

http://www.adobe.com/products/coldfusion/features/

Anyway. There's my plug for my favorite environment.

-- Michael Cummins
Lucas Gonze
2009-11-12 23:17:54 UTC
Permalink
Post by Michael Cummins
I've been really happy with Cold Fusion, been writing applications with it
since version 5.
Me too. Lately I've been doing my device drivers with it. Though I
really miss the REM statement from DOS batch files.
Tom Higgins
2009-11-13 00:17:47 UTC
Permalink
Neither the prog langs nor the bitching about them has changed all
that much since I got into it in the late 70's. Folks love to set out
utopian constructs as much as others love to rip em a new one. Some
are more creative than others, 99% are just the same old rehash
rehashed. Having to pick something for the kids to start with I am
very much loving the CARDIAC environment.

-tom(goto=evil yea, the good old days)higgins
Michael Cummins
2009-11-13 16:34:57 UTC
Permalink
Post by Lucas Gonze
Me too. Lately I've been doing my device drivers with
it. Though I really miss the REM statement from DOS
batch files.
I don't know why you have to be like that, Lucas. Python came up, so I
didn't think the high level syntax tangent was too much off topic and Jeff's
REBOL/Command supports web server interfaces, which looks really neat, so
why is touting the Cold Fusion environment for a moment so mockable? I
wasn't being all fan boy about it.

Also, part of the discussion was about "pipe-fitting", something CF is
pretty good at. Right now I'm daydreaming about REBOL FastCGI scripting and
wondering what useful things I could do with that. It's all interesting
stuff.

http://www.rebol.com/docs/fastcgi.html
Lucas Gonze
2009-11-13 20:05:40 UTC
Permalink
Post by Lucas Gonze
Me too. Lately I've been doing my device drivers with
it. Though I really miss the REM statement from DOS
batch files.
why is touting the Cold Fusion environment for a moment so mockable?  I
wasn't being all fan boy about it.
Sorry Michael, I didn't mean to mock in a mean way. I meant to mock
in a mocky way.

What do I know anyway? Take it in that spirit -- that my own
limitations are so blatant that it makes no difference whether I enjoy
a little sport.
Michael Cummins
2009-11-13 21:38:28 UTC
Permalink
Post by Lucas Gonze
Take it in that spirit
Humorous mocking spirit taken :)

It *was* funny.
Dr. Ernie Prabhakar
2009-11-12 20:43:08 UTC
Permalink
Hi Jeff,
Post by Jeff Bone
http://www.rebol.com/oneliners.html
...then do us all a favor and don't resist the urge to fall on that wakizashi when your shame overcomes you.
Help me out here.
Michael Cummins
2009-11-12 20:46:04 UTC
Permalink
Go here: http://www.rebol.com/oneliners.html
That looks really interesting.

-- Michael Cummins
Ken Ganshirt @ Yahoo
2009-11-12 21:14:16 UTC
Permalink
Post by Jeff Bone
J.H.C, folks, it's nearly 2010. Let's get a few
- most programming involves schlepping a few but
complex data types between different string representations
- programmers have become plumbers and
documentation-archaeologists mostly, which is sad and
uninteresting
- programming languages are for *programmers* ---
not compilers and compiler-writers
- until you make the everyday, "simple" things
simple, it will continue to be a dark art practiced by fewer
and fewer
.... etc. ...
About here I'm starting to think, He's talking about BASIC. ..... Naw, can't be. He's w-a-y too young...
Post by Jeff Bone
SIMPLE GUI PROGRAMMING! Remember BASIC?
Logo? Zero to graphics in three minutes, max.
How the hell are kids supposed to learn to program these
days? ...
I began my second professional life as a programmer (in-house development at a telco). That was ... you don't want to know. I gave up looking at programming languages somewhere between C++ and Java. In my opinion none of them did a much better job of making a programmer's life easier than COBOL.

Don't go off the deep end, folks. That's NOT an endorsement of COBOL.

I'm just sayin'.

It doesn't sound like it's improved much in the intervening couple of decades.
Post by Jeff Bone
Somebody do something about this, before I LOSE MY FUCKING
MIND!?!?!
Ummm.... Sounds like it's way too late.

...ken...


__________________________________________________________________
Looking for the perfect gift? Give the gift of Flickr!

http://www.flickr.com/gift/
Ken Ganshirt @ Yahoo
2009-11-13 22:31:50 UTC
Permalink
Post by Tom Higgins
What we
seem to do well is the same stuff over and over in different dialects and with different interfaces. Sure a chimp can work a typewriter, and yes the old chestnut about a million monkeys pumping out Hamlet might be right... or wrong... it does make a good illustration to bring to this party.
This is only a good illustration if your point is that programming is often a very creative activity, like writing prose, and the notion that we want machines to automatically write all our code is counter to our enjoyment of it, even assuming you could build such beasts.  I don't think that is your point, but it would be mine.
If I understand the implications of the "singularity" correctly as it relates to AI or machine intelligence, would this not be one good reason to fear it?

.... Sorry if this has been chewed to death. It seems like a terribly obvious concern if someone is looking at it from different angles. But I'm way late to the party. A simple Yes or No is sufficient if the subject is too boring.

...ken...


__________________________________________________________________
Connect with friends from any web browser - no download required. Try the new Yahoo! Canada Messenger for the Web BETA at http://ca.messenger.yahoo.com/webmessengerpromo.php
Eugen Leitl
2009-11-14 11:01:04 UTC
Permalink
Post by Ken Ganshirt @ Yahoo
If I understand the implications of the "singularity" correctly as it relates to AI or machine intelligence, would this not be one good reason to fear it?
There's very good reason to assume that a naturally
intelligent system has no code that anybody, itself
included, could understand.

We're not interested in the code, we're interested
in functionality. The whole idea of massaging text
in a text editor only makes sense if one has
been doing it for most of one's conscious life.
Post by Ken Ganshirt @ Yahoo
.... Sorry if this has been chewed to death. It seems like a terribly obvious concern if someone is looking at it from different angles. But I'm way late to the party. A simple Yes or No is sufficient if the subject is too boring.
--
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://www.ativel.com http://postbiota.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A 7779 75B0 2443 8B29 F6BE
Ken Ganshirt @ Yahoo
2009-11-14 18:43:12 UTC
Permalink
On Fri, Nov 13, 2009 at 6:11 PM, Tom
Post by Tom Higgins
Much of what I see in code is not art, it's plumbing, duct tape and
rehashes.
Yep. Which causes me to notice that we have not yet
automated plumbing. When our water pipes leak, we still call a guy in
a truck.
But we are advancing. The last two times I had to call a plumber there was no butt-crack showing.

...ken...


Tom Higgins
2009-11-14 19:57:05 UTC
Permalink
The problem with plumbing is not that it is hard, I have had to deal with
enough of it over the last several houses; it is that it's mucky and
repetitive.

Find the leak, seal the leak.
Find the clog, clean out the clog.
Assess load balancing, increase the pipe at the bottle necks.
Fight gravity. (basement dirt floor wine cellar needs a working
sink..say hello to my pumppy friend)
Connect a pipeline from the user input to the proper output.

And yes, often you are dealing with the previous "plumbers'" work, and if
you think coders don't document... oi...

I mean sure, could you make an AND gate between two bathrooms and the
sewer pipe? Sure, heck, given enough trips to the Home Despot and room
behind the walls you could possibly build a Turing machine out of
schedule 40 and fittings.

Mostly though it's repetitive mucky tasks in spaces not easily gettable
by someone of my largess, so yeah, if the wife is not game for the task
we call a handy friend; if it's very dire we toss money at a pro.

tom(In a post-NthIlarity world though I expect that if I want a cuppa
water I will motion with my hand and a temporary nanobot buckytube
will be extended to me from the nearest water source.)higgins
Eugen Leitl
2009-11-15 10:42:50 UTC
Permalink
Post by Tom Higgins
The problem with plumbing is not that it is hard, I have to deal with
It is pretty hard even for people. You're typically working in
an unstructured, constrained space, using a large set of tools,
including modifying existing tools, and using a very large set
of parts, including custom ones.

Of course a robot would have advantages, too, since lighting and
video (even in confined spaces) as well as accurate measurements
are a given. But you need a long arm or two with lots of degrees of
freedom, exchangeable tools at the end, and ability to exercise
considerable force/torque. And of course without a said plumbing
intelligence on the other end it would be still no good.
Post by Tom Higgins
enough of it over the last several houses; it is that it's mucky and
repetitive.
Find the leak, seal the leak.
Find the clog, clean out the clog.
Assess load balancing, increase the pipe at the bottle necks.
Fight gravity. (basement dirt floor wine cellar needs a working
sink..say hello to my pumppy friend)
Connect a pipeline from the user input to the proper output.
And yes often you are dealing with the previous "plumbers" work and if
you think coders dont document...oi..
I mean sure, could you make an AND gate between two bathrooms and the
sewer pipe? Sure, heck given enough trips to the Home Despot and rom
behind the walls you could possibly build a Turing machine out of
schedule 40 and fittings.
Mostly though it's repetitive mucky tasks in spaces not easily reachable
by someone of my largesse, so yeah, if the wife is not game for the task
I spent around 4-5 hours yesterday on my back under the kitchen
sink, quite a few times requiring cooperation from above. Sorry, no
all-purpose plumbing robots, nor all-purpose surgeon robots. Some
things are just not easy.
Post by Tom Higgins
we call a handy friend; if it's very dire we toss money at a pro.
tom(In a post-NthIlarity world though I expect that if I want a cup of
water I will motion with my hand and a temporary nanobot buckytube
will be extended to me from the nearest water source.)higgins
In a post-Singularity world you wouldn't need water to start with.
The physical layer would probably look like something Giger designed.
--
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://www.ativel.com http://postbiota.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A 7779 75B0 2443 8B29 F6BE
Benjamin Black
2009-11-15 23:53:02 UTC
Permalink
Inspired by the constructive, though almost incomprehensible, stylings
of my old friend tomwhore, I offer this to the conversation.

Those of you in the large-scale technology operations space will be
familiar with Puppet, a long-time favorite for infrastructure
automation, and Chef, a more recent entrant. Both of them are open
source and both have a component responsible for system discovery:
Puppet includes a tool called facter and Chef includes one called ohai.

System discovery is the task of collecting various facts (hence the name
facter) about the system on which you are running so the rest of the
automation can run from a consistent view. System discovery is
abstraction, and that brings a host of questions around implementation
and presentation. facter takes a minimalist approach: it returns a
compact set of information and relies on a number of native C extensions
(which, one might argue, pushes you towards only returning a compact set
of information). ohai (now) takes a rather maximalist approach: the
data returned can be quite large, for example when run on OSX with the
plist gem installed, and avoids use of any native C extensions.

I cannot comment on the history or philosophy of facter, but I can do so
for ohai. I wrote quite a bit of the ohai code and am primarily
responsible for the volume of information it collects compared to
similar tools. ohai began life as approximately a pure Ruby version of
facter to support Chef. The data returned was similar (and similarly
unstructured), the main difference being its avoidance of C extensions.
The motivation for remaining pure Ruby was some combination of
simplicity and a desire for consistency with the rest of Chef. Where
facter uses native interfaces to collect system data, ohai relies on a
lot of popen4() and regex matching. This has made ohai incredibly easy
to port to new platforms, and it went from 1 (Linux) to 4 (Linux of
various descriptions, Solaris, FreeBSD, and OSX) in a couple of weeks.

In so doing, I learned quite a bit about how command-line output varies
between platforms and the issues of semantic mismatch between
programming languages and the operating systems on which they run. My
lessons do not lead me to conclude we are so far from an "80%" solution
or that there is cause for despair.

The first contribution I made to ohai was a slight restructuring to
support multiple platforms. This introduced hierarchy both in the code
layout (with OS-specific plugins) and in the data output. The use of
JSON for the output is the least bad option, given the common
alternatives and the use of CouchDB at the server, but is not entirely
satisfactory. The most bothersome issue is its lack of references. I
might have several IP addresses on a host, but I want one that I can
refer to as its canonical address in the rest of the automation.
Automatically deciding which IP to use is easy (take the primary IP
address on the interface used for the default route), but indicating
which address has been chosen creates a new problem: I have a top level
notion of the IP address, but no way to indicate, in the data structure,
where it came from.
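The selection heuristic described above (take the primary address on the
default-route interface) can be sketched with the classic UDP-connect
trick; `default_ip` here is an illustrative helper, not ohai's actual
code:

```python
import socket

def default_ip():
    # A UDP connect() sends no packets; it only asks the kernel which
    # source address it would use to reach the peer -- which is exactly
    # the address on the interface used for the default route.
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("192.0.2.1", 53))  # TEST-NET-1; never actually contacted
        return s.getsockname()[0]
    finally:
        s.close()
```

On a host with no default route the connect raises OSError, which a
production version would need to handle.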

As an example, the top level entry looks like this:

"ipaddress": "172.16.100.202"

And the actual network interface definition (which is in the
network->interfaces sub-hash) looks like this:

"en1": {
"status": "active",
"flags": [
"UP",
"BROADCAST",
"SMART",
"RUNNING",
"SIMPLEX",
"MULTICAST"
],
"number": "1",
"addresses": {
"00:23:6c:90:47:10": {
"family": "lladdr"
},
"fe80::223:6cff:fe90:4710": {
"scope": "Link",
"prefixlen": "64",
"family": "inet6"
},
"172.16.100.202": {
"broadcast": "172.16.100.255",
"netmask": "255.255.255.0",
"family": "inet"
}
},
"mtu": "1500",
"media": {
"supported": [
{
"autoselect": {
"options": [

]
}
}
],
"selected": [
{
"autoselect": {
"options": [

]
}
}
]
},
"type": "en",
"arp": {
"172.16.100.1": "0:1b:c:f:90:23",
"172.16.100.201": "0:23:12:a8:2d:84",
"172.16.100.246": "0:16:cb:a9:70:4b"
},
"encapsulation": "Ethernet"
}

If I want to know where the default address came from, I have to iterate
over the interfaces to find it. If I added a tag to the default
interface, I would then have to update two places should there be a change.
Storing a reference to the default interface would be a cleaner
solution, but is not supported in JSON. Creating a JSON-based format
that supports references seems no great problem; it just hasn't been
done, to my knowledge (and please don't suggest XML, it is too bloated
and complex for consideration). This is minor compared to the other,
big challenges, though.
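One shape such a format could take is sketched below: the top-level
entry stores an RFC 6901 JSON Pointer into the interface tree instead
of a duplicated value. The document layout and the `$ref` convention
are illustrative assumptions, not ohai's format, and `resolve` is a
toy pointer resolver:

```python
import json

doc = json.loads("""{
  "ipaddress": {"$ref": "/network/interfaces/en1/addresses/172.16.100.202"},
  "network": {"interfaces": {"en1": {"addresses": {
      "172.16.100.202": {"family": "inet"}}}}}
}""")

def resolve(document, pointer):
    # Walk a JSON Pointer: split on "/", unescape ~1 then ~0 per RFC 6901.
    node = document
    for token in pointer.lstrip("/").split("/"):
        node = node[token.replace("~1", "/").replace("~0", "~")]
    return node

target = resolve(doc, doc["ipaddress"]["$ref"])
print(target["family"])  # inet
```

The canonical address now lives in exactly one place, so a change to
the interface data never has to be mirrored at the top level.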

The second problem, and one most clearly an issue for all languages
interacting with the OS for systems work, is process management. While
abstractions like threads and event callbacks are (reasonably) well
understood, Unix-style process management remains just this side of a
black art; look at the daemonization code in any C server code for an
example. Scripting languages like Ruby and Python tend to just punt and
directly expose the C process management interface, hence the use of
popen4() all over the place in ohai. Mocking out popen4() for testing
and the complexity of spawning a child (A), that in turn spawns a child
(B), and returns, orphaning B and leaving your initial process without
its return value, well, it's not fun. It is also unnecessary, but
nobody has bothered to write reasonable libraries to do this in Ruby
(parts of it are now in Chef), and I am not familiar enough with Python
to know what folks do there. Again, there is a semantic gap between
what the OS is exposing and how the languages consume it. As an
aside, this gap does not really exist for the lightweight concurrency
mechanisms, particularly event-based concurrency, where the language
support is quite good (see EventMachine in Ruby and Twisted in Python,
both of which are libraries, not language features; process management
should yield to similar effort).
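The orphaned-grandchild scenario has at least one workable escape
hatch, sketched here in Python rather than Ruby's popen4(): because the
grandchild inherits the write end of the stdout pipe, reading to EOF
collects its output even after the intermediate child exits.

```python
import subprocess

# The sh child backgrounds a grandchild and exits immediately,
# orphaning it -- the scenario described above. The grandchild still
# holds the write end of the pipe, so reading stdout to EOF (which is
# what communicate() does) collects its output anyway.
proc = subprocess.Popen(
    ["sh", "-c", "( sleep 0.2; echo from-grandchild ) &"],
    stdout=subprocess.PIPE, text=True)
out, _ = proc.communicate()
print(out.strip())  # from-grandchild
```

This recovers the grandchild's output but not its exit status; getting
that back reliably still needs the kind of library effort argued for
above.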

The third big problem I encountered was the wild variation in command
output. At the amusing end of the spectrum, I received a bug report
from someone running Linux with German localization and the output of
ifconfig was entirely translated into German, something you are unlikely
to see in a C API. Generally, the challenge in working across platforms
might be summarized in this way: the more optimized a system is for
direct consumption by a human operator, the harder it is to write
automation that doesn't use 'native' APIs. Windows is the obvious
extreme example of this, but the unexpected offender here is Solaris.
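A common defence against the localization problem (illustrative, not
what ohai actually does) is to force the C locale when shelling out, so
the tool's text comes back untranslated and stable:

```python
import os
import subprocess

# Overriding LC_ALL/LANG pins the child to the untranslated C locale,
# so output like month names or "inet addr:" labels stays parseable
# regardless of the user's language settings.
env = dict(os.environ, LC_ALL="C", LANG="C")
result = subprocess.run(["date", "-u"], env=env,
                        capture_output=True, text=True)
print(result.stdout.strip())
```

This only helps when the tool honors locale variables at all; it does
nothing about structural differences in output between platforms.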

Solaris is, in my estimation, the best OS core (kernel, filesystems,
etc) on the market. It is also the long-time favorite of old-school
sysadmins who pride themselves on knowing every last inch of their
systems and only using automation to take care of certain, recurring
tasks, rather than the full-auto, lights out style encouraged by Puppet
and Chef. The output from things like ifconfig is optimized for them,
being particularly verbose and human-readable, but extensive variation
in output makes them very involved to parse (see the ifconfig man page
for a taste:
http://docs.sun.com/app/docs/doc/816-5166/ifconfig-1m?a=view). At
another point in the space, there are things like the OSX
system_profiler command that will happily generate XML output exactly
for ease of consumption by code rather than people. All of which is
really to say operating systems can, should, and sometimes do
expose interfaces above the level of the native C APIs, but intended for
consumption by scripting tools. Things like system_profiler show one
way of doing that, though the XML-ified plist output is not a winner.
An OS that had 'automation modes' on all its system management tools
would be a massive win for system language users and would, I think, not
be hard (where lots of simple code is not hard).
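For comparison, consuming plist output of the kind `system_profiler
-xml` emits is a structured data walk rather than regex scraping. The
tiny hand-made plist below is an illustrative structure, not the real
system_profiler schema, so the sketch runs anywhere:

```python
import plistlib

PLIST = b"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
 "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<array>
  <dict>
    <key>_dataType</key><string>SPNetworkDataType</string>
    <key>interface</key><string>en1</string>
  </dict>
</array>
</plist>"""

# Structured parse: no regexes, no locale trouble, just a data walk.
data = plistlib.loads(PLIST)
print(data[0]["interface"])  # en1
```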

I didn't intend this post to be quite so long, so my apologies and
thanks to those of you who made it this far. It represents my
experience in one, possibly representative, corner of dealing with the
challenges at the interface between systems languages and systems. It
is my pious hope, to quote Roger Penrose, that none of the challenges
I've described above are fundamental and all could be solved with only a
modicum of effort from some motivated folk. Whether they are the same
sort of problems that raised Jeff Bone's ire I can't say, but I remain
quite optimistic there isn't cause for despair or anger in this.


b
David Edmondson
2009-11-16 12:41:43 UTC
Permalink
Not at all disagreeing with your general point...
Post by Benjamin Black
Solaris is, in my estimation, the best OS core (kernel, filesystems,
etc) on the market. It is also the long-time favorite of old-school
sysadmins who pride themselves on knowing every last inch of their
systems and only using automation to take care of certain, recurring
tasks, rather than the full-auto, lights out style encouraged by Puppet
and Chef. The output from things like ifconfig is optimized for them,
being particularly verbose and human-readable, but extensive variation
in output makes them very involved to parse (see the ifconfig man page
http://docs.sun.com/app/docs/doc/816-5166/ifconfig-1m?a=view). At
another point in the space, there are things like the OSX
system_profiler command that will happily generate XML output exactly
for ease of consumption by code rather than people. All of which is
really to say operating systems can, should, and sometimes do
expose interfaces above the level of the native C APIs, but intended for
consumption by scripting tools. Things like system_profiler show one
way of doing that, though the XML-ified plist output is not a winner.
An OS that had 'automation modes' on all its system management tools
would be a massive win for system language users and would, I think, not
be hard (where lots of simple code is not hard).
There's work to improve this specific aspect of Solaris - newer commands are expected to sport a "-p -o" set of options to help you parse the output:

: lynx-01; dladm show-link
LINK CLASS MTU STATE BRIDGE OVER
igb2 phys 1500 unknown -- --
igb0 phys 1500 up -- --
igb1 phys 1500 unknown -- --
igb3 phys 1500 unknown -- --
: lynx-01; dladm show-link -p -o link,class
igb2:phys
igb0:phys
igb1:phys
igb3:phys
: lynx-01;

There are some obvious problems with the 'colon separated' formatting, but it is a start. This doesn't help with ifconfig at the moment, though there is an undocumented option 'configinfo':

: lynx-01; ifconfig -a configinfo
lo0 inet plumb mtu 8232 index 1 set 127.0.0.1 netmask 0xff000000 up
igb0 inet plumb mtu 1500 index 2 set 10.6.70.170 netmask 0xfffffe00 broadcast 10.6.71.255 up
: lynx-01;

ifparse(1M) may also be useful.
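The colon-separated form is trivially machine-readable; a sketch of
consuming it (the sample is pasted from the session above, since dladm
exists only on Solaris):

```python
# Each `-p -o` line is the requested fields joined by ":", in the order
# they were named on the command line (here: link,class).
SAMPLE = """igb2:phys
igb0:phys
igb1:phys
igb3:phys"""

links = [dict(zip(("link", "class"), line.split(":")))
         for line in SAMPLE.splitlines()]
print(links[1])  # {'link': 'igb0', 'class': 'phys'}
```

A production parser would also need to handle escaped literal colons
inside field values, one of the 'obvious problems' with the format.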
Benjamin Black
2009-11-16 17:20:50 UTC
Permalink
Post by David Edmondson
: lynx-01; dladm show-link
LINK CLASS MTU STATE BRIDGE OVER
igb2 phys 1500 unknown -- --
igb0 phys 1500 up -- --
igb1 phys 1500 unknown -- --
igb3 phys 1500 unknown -- --
: lynx-01; dladm show-link -p -o link,class
igb2:phys
igb0:phys
igb1:phys
igb3:phys
: lynx-01;
: lynx-01; ifconfig -a configinfo
lo0 inet plumb mtu 8232 index 1 set 127.0.0.1 netmask 0xff000000 up
igb0 inet plumb mtu 1500 index 2 set 10.6.70.170 netmask 0xfffffe00 broadcast 10.6.71.255 up
: lynx-01;
ifparse(1M) may also be useful.
Wow, excellent! Thanks, David.


b
Damien Morton
2009-11-16 13:09:28 UTC
Permalink
Post by Benjamin Black
An OS that had 'automation modes' on all its system management tools
would be a massive win for system language users and would, I think, not
be hard (where lots of simple code is not hard).
Powershell
Benjamin Black
2009-11-16 17:18:33 UTC
Permalink
Post by Damien Morton
Post by Benjamin Black
An OS that had 'automation modes' on all its system management tools
would be a massive win for system language users and would, I think, not
be hard (where lots of simple code is not hard).
Powershell
Writing Powershell scripts to drive WMI or COM interfaces is not the
direction I would hope anyone else to take. I am mystified why anyone
outside Microsoft finds Powershell attractive.


b
Damien Morton
2009-11-16 17:29:30 UTC
Permalink
Post by Benjamin Black
Post by Damien Morton
Post by Benjamin Black
An OS that had 'automation modes' on all its system management tools
would be a massive win for system language users and would, I think, not
be hard (where lots of simple code is not hard).
Powershell
Writing Powershell scripts to drive WMI or COM interfaces is not the
direction I would hope anyone else to take.  I am mystified why anyone
outside Microsoft finds Powershell attractive.
Basic principle: instead of streams of bytes or characters being the
underlying medium of communication between 'automation nodes', the
fundamental premise of Powershell is to have all 'automation nodes'
communicate via streams of strongly typed data structures.

While you discuss having Unix utilities have an XML input and output
mode, Powershell has already solved that problem by piggybacking on
the .NET typesystem.

Whatever you may or may not think of Microsoft, having structured data
along with a central repository of types is a huge improvement when
it comes to 'automation nodes'.
Benjamin Black
2009-11-16 18:16:55 UTC
Permalink
Post by Damien Morton
Post by Benjamin Black
Post by Damien Morton
Post by Benjamin Black
An OS that had 'automation modes' on all its system management tools
would be a massive win for system language users and would, I think, not
be hard (where lots of simple code is not hard).
Powershell
Writing Powershell scripts to drive WMI or COM interfaces is not the
direction I would hope anyone else to take. I am mystified why anyone
outside Microsoft finds Powershell attractive.
Basic principle: instead of streams of bytes or characters being the
underlying medium of communication between 'automation nodes', the
fundamental premise of Powershell is to have all 'automation nodes'
communicate via streams of strongly typed data structures.
Yes, premise #1 is we need a richer way to interchange data than what
JSON provides, without getting into the excesses of XML. Powershell
consistently almost gets there, and keeps missing. In my opinion, it
would've benefited from a lot of earlier use by the only group at
Microsoft doing serious, large-scale automation: Bing (née Search). As
far as I know they never touched it.
Post by Damien Morton
While you discuss having Unix utilities have an XML input and output
mode, Powershell has already solved that problem by piggybacking on
the .NET typesystem.
I was explicit that I thought XML was the wrong approach:

"Things like system_profiler show one way of doing that, though the
XML-ified plist output is not a winner."
Post by Damien Morton
Whatever you may or may not think of Microsoft, having structured data
along with a central repository of types is a huge improvement when
it comes to 'automation nodes'.
Piling on to .NET creates new problems. We gain a type system, but we
now also have 3 different ways of interacting: GUI, command-line, and
.NET. If I want to go from REPL-ish interaction on the command-line to
discover the solution to my problem, to automation consuming the output
of the command(s) I've been using, I'm stuck unless they are Powershell.
I will almost certainly have to figure out what WMI calls are being
made underneath and write completely new code to make those calls.
Going from something in a GUI to WMI calls is even more onerous.

What I think of Microsoft's approach to systems automation is that they
optimize for someone sitting at a desk pushing buttons. That is not a
value judgment, and there is clearly a market for it, but it is not
conducive to the sort of automation common outside the Windows world.


b
Damien Morton
2009-11-16 20:03:20 UTC
Permalink
Post by Damien Morton
Whatever you may or may not think of Microsoft, having structured data
along with a central repository of types is a huge improvement when
it comes to 'automation nodes'.
Piling on to .NET creates new problems.  We gain a type system, but we
now also have 3 different ways of interacting: GUI, command-line, and
.NET.  If I want to go from REPL-ish interaction on the command-line to
discover the solution to my problem, to automation consuming the output
of the command(s) I've been using, I'm stuck unless they are Powershell.
 I will almost certainly have to figure out what WMI calls are being
made underneath and write completely new code to make those calls.
Going from something in a GUI to WMI calls is even more onerous.
It's true that Microsoft now supports several strata in its automation
stack. This is just a symptom of having a long history and a huge
userbase.

I haven't tracked Powershell for a while, but I would guess that there
are projects underway to replicate or bridge to all their previous
utilities and systems. For example, I seem to remember there being a
LINQ to WMI project somewhere.
What I think of Microsoft's approach to systems automation is that they
optimize for someone sitting at a desk pushing buttons.  That is not a
value judgment, and there is clearly a market for it, but it is not
conducive to the sort of automation common outside the Windows world.
I'm not sure that can be said of Powershell.

I think you are confusing Microsoft with an entity capable of acting
with a cohesive plan.

I doubt, for example, that many at Microsoft foresaw the success of
.NET - it was simply a wild attack on Java that was thrown out there.

That's what Microsoft does, and in fact what all large organisations do -
they throw a plethora of concepts out there and see what sticks.

Once something sticks, then they think about how it might fit into the
ongoing narrative, retroactively adapted to the overall plan, and
presented as wisdom and foresight.
Dr. Ernie Prabhakar
2009-11-16 21:15:01 UTC
Permalink
Post by Benjamin Black
Creating a JSON-based format
that supported references seems not such a problem, it just hasn't been
done, to my knowledge (and please don't suggest XML, it is too bloated
and complex for consideration). This is minor compared to the other,
big challenges, though.
Have you looked at JSONQuery? I'm pretty sure there was at least one proposal for those sort of references:

http://groups.google.com/group/json-query/web/json-query-requirements

-- Ernie P.