
The rationale for Ceylon, Red Hat's new programming language (ars technica)

Red Hat has been working on a new programming language called Ceylon. Ars technica looks at the language and the rationale for its creation. "One of the chief goals behind Ceylon is to create a language that will be easy to learn and easy for existing Java programmers to adopt. King seems to believe that a functional programming language would have difficulty meeting those goals. It also seems like a matter of strong personal preference for King—his slides include a rather trollish dismissal of programming languages that are based on "the lambda calculus used only by theoretical computer scientists.""


The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 21:08 UTC (Fri) by emk (subscriber, #1128) [Link] (17 responses)

...languages that are based on "the lambda calculus used only by theoretical computer scientists.""

Well, and by web developers and people programming in Visual Basic. :-) Seriously, pretty much every mainstream language other than Java and C has lambda expressions at this point, including the forthcoming C++ draft.

It will be interesting to see how this language compares to Clojure, Scala, JRuby and Mirah. There's a ton of great languages on the JVM these days, and many of them are both highly-expressive and relatively easy to learn.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 21:47 UTC (Fri) by elanthis (guest, #6227) [Link] (10 responses)

You seem to be confusing the concept of languages that feature anonymous functions and functions as first-class objects with languages that are based on lambda calculus.

Yes, anonymous functions are often called "lambda functions" by various languages, but that does not in any way shape or form make them proper functional languages based on the concepts of actual lambda calculus.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 22:09 UTC (Fri) by gowen (guest, #23914) [Link] (8 responses)

The problem with pure-lambda-calculus languages is that quite frequently side-effects are the only useful things that a real-world program is required to do. It's not really important how quickly you can calculate n!, if you're not allowed to print the answer to the console.

While pure functional languages are interesting research tools, the fact that Haskell had to sully itself with Monads just to enable a programmer to write "Hello World" suggests that their role will, ultimately, be limited.

However, the notion of functions as first-class objects, and the things functional languages have taught imperative languages (closures, efficient recursion, and ideas like map-reduce), will prove to be their really important role.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 22:15 UTC (Fri) by nix (subscriber, #2304) [Link] (3 responses)

Quite. Writing functionally is often useful, but you don't write whole programs that way: you write computationally-intense subsets of larger programs which are not themselves written in a functional style.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 16, 2011 0:52 UTC (Sat) by dgm (subscriber, #49227) [Link] (1 responses)

> you write computationally-intense subsets of larger programs

Not really. Functional style is more descriptive than imperative. In an imperative language you write HOW to go from data A to B (or from one state to another). In a functional language you write WHAT the relation between A and B is (the compiler/runtime then decides the how). This is _especially_ useful for, say, linking data from a database to a form: instead of spelling out in full detail how to copy the data around, you just describe the relation and let the machine do its thing. If you consider how much of a typical application's code is just moving data around and changing it slightly, you will realize how useful this can be.
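As a sketch of that distinction, in (post-2011) Java the same transformation can be written both ways; the method names here are invented for illustration:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

public class Declarative {
    // Imperative: spell out HOW to build the result, step by step.
    static List<Integer> doubledImperative(List<Integer> xs) {
        List<Integer> out = new ArrayList<>();
        for (int x : xs) {
            out.add(x * 2);
        }
        return out;
    }

    // Declarative: state WHAT the result is (each element, doubled)
    // and let the stream machinery decide the how.
    static List<Integer> doubledDeclarative(List<Integer> xs) {
        return xs.stream().map(x -> x * 2).collect(Collectors.toList());
    }
}
```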

The problem is that, for computationally intensive stuff, I can be much more clever than the compiler or runtime, because I can _understand_ what the data is, and thus find better ways of doing the critical stuff by making trade-offs, maybe sacrificing accuracy for speed (something an automatic translator would never be able to do).

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted May 8, 2011 11:44 UTC (Sun) by nix (subscriber, #2304) [Link]

Well, yes. I know what functional programming is like. But the way it is used in practice is almost always as subsets of larger, imperative programs. Functional top-levels are not very common, simply because they are not very useful, because the higher levels of a program are often the parts that have to actually communicate with the user, and functional languages are terrifically clumsy at that sort of side-effectual stuff.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 16, 2011 6:10 UTC (Sat) by gowen (guest, #23914) [Link]

I initially wrote a sentence that was extremely similar to this one, but edited it out. So, yes "Me too", "+1", "this" and other statements of agreement.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 23:18 UTC (Fri) by Cyberax (✭ supporter ✭, #52523) [Link] (1 responses)

Uhm. Monads do not 'sully' a language, they are just one way to represent the 'sequentialness' of the global state.

In fact, there is a more fundamental concept: http://en.wikipedia.org/wiki/Arrow_%28computer_science%29 which generalizes monads.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 16, 2011 6:08 UTC (Sat) by gowen (guest, #23914) [Link]

they are just one way to represent the 'sequentialness' of the global state
You're right. It's the concept of global state itself that sullies a pure-functional language ;)

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 16, 2011 0:01 UTC (Sat) by emk (subscriber, #1128) [Link]

...Haskell had to sully itself with Monads...

I/O is the least interesting use of monads. They become truly useful when you want to implement list comprehensions, software transactional memory, provably localized state (the ST monad), or probabilistic programming. Or half a dozen other really cool things, including logical inference in a functional language.
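As one small taste of the pattern outside Haskell: the JDK's later Optional type exposes a monadic bind as flatMap, sequencing possibly-failing steps and short-circuiting on absence. The helper names below are made up for illustration:

```java
import java.util.Optional;

public class MonadSketch {
    // Step 1: parsing may fail, so the result type says so.
    static Optional<Integer> parse(String s) {
        try {
            return Optional.of(Integer.parseInt(s));
        } catch (NumberFormatException e) {
            return Optional.empty();
        }
    }

    // Step 2: division may fail too (divide by zero).
    static Optional<Integer> reciprocalPercent(int n) {
        return n == 0 ? Optional.empty() : Optional.of(100 / n);
    }

    // flatMap is the monadic bind: it chains the steps, and an
    // empty result anywhere makes the whole pipeline empty.
    static Optional<Integer> pipeline(String s) {
        return parse(s).flatMap(MonadSketch::reciprocalPercent);
    }
}
```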

If Haskell had done nothing more than illustrate the broad usefulness of monads, that would have been enough to make me love it.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Oct 21, 2013 9:29 UTC (Mon) by Bienlein (guest, #93501) [Link]

This is a great thread with many interesting points being made, but as far as I'm concerned the question of the rationale behind Ceylon is still mostly open. To me Ceylon is to some extent a language built with the lessons learned from Scala in mind, e.g. no elaborate type system to keep build times scalable, no implicits to keep the code readable, and other things.

Other than that, I don't understand the rationale behind abandoning Java's core library and creating its own. All right, there are things in the collection hierarchy that could have been solved more elegantly, but I personally don't see this as a reason to abandon the entire JDK. To me the argument that the JDK is not well designed, and that Ceylon therefore wants to build its own, >might< be a pretextual argument, because Red Hat wants to build its own silo beside Java. Note that I said might, because I can't really make head or tail of this myself. What also makes me wonder is why the basic types have been defined so that a mapping is necessary from Ceylon's basic types to Java's; see this article: http://zeroturnaround.com/rebellabs/the-adventurous-devel...

Cheers, Bienlein

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 23:50 UTC (Fri) by emk (subscriber, #1128) [Link]

You seem to be confusing the concept of languages that feature anonymous functions and functions as first-class objects with languages that are based on lambda calculus.

I don't think that's a terribly useful distinction. If a language has anonymous first class functions which (1) close over the local scope, and (2) potentially outlive the creating function, then there's a trivial morphism from the lambda calculus onto a subset of the language. And this is true for JavaScript, Visual Basic 2008, and most other languages designed recently. All of these languages were designed quite consciously with the lambda calculus in mind, and all of them may be used as functional languages.
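Both properties are easy to demonstrate in any language with proper closures; a sketch in (post-2011) Java, with invented names:

```java
import java.util.function.Supplier;

public class ClosureDemo {
    // The returned lambda (1) closes over makeCounter's local state,
    // and (2) outlives makeCounter itself, which has long returned
    // by the time the counter is first invoked.
    static Supplier<Integer> makeCounter() {
        int[] count = {0}; // one-element array: works around Java's effectively-final capture rule
        return () -> ++count[0];
    }
}
```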

In fact, recent versions of Visual Basic and C# even have monads in the form of LINQ's query expressions, and Scala has them hiding in its 'for' loop. But compared to lambda expressions, monads are still slightly radical, and not completely ubiquitous in new languages.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 22:00 UTC (Fri) by hp (guest, #5220) [Link]

If that dig was supposed to be at Scala, it's a little tough to buy. Scala is pragmatic and has a bunch of OO stuff, including some improvements on Java (traits, singleton objects rather than statics). And overall it adopts smooth Java interop as a design goal and so just plain has to work with the Java object system. We aren't talking Haskell purity here.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 16, 2011 2:31 UTC (Sat) by revertTS (guest, #64772) [Link] (4 responses)

Just to note, I think Ars grossly misrepresented this quote. If you look at the slides, he's mentioning lambda calculus in the context of syntax.

"Java’s syntax is rooted in standard, everyday mathematical notion taught in high schools and used by mathematicians, engineers, and software developers not the lambda calculus used only by theoretical computer scientists"

So it definitely looks like a criticism of Lisp-like syntax and not at all a trollish dismissal of functional languages as a whole. Indeed, he goes on to criticise Java for lack of first-class/higher-order functions.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 16, 2011 23:40 UTC (Sat) by HelloWorld (guest, #56129) [Link] (3 responses)

> "Java’s syntax is rooted in standard, everyday mathematical notion taught in high schools and used by mathematicians, engineers, and software developers not the lambda calculus used only by theoretical computer scientists"
The funny thing about this quote is that it's just plain wrong. Java's syntax doesn't actually have much in common with standard, everyday mathematical notation. The meaning of the = operator is fundamentally different, same thing with curly braces, there's no "return" keyword in math, etc. And he also misses the fact that theoretical computer science is actually just a branch of mathematics.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 20, 2011 13:56 UTC (Wed) by ballombe (subscriber, #9523) [Link]

True. Haskell syntax is much closer to maths than C/Java etc.
For one, in mathematics and in Haskell, all variables are immutable.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 22, 2011 16:07 UTC (Fri) by phiggins (subscriber, #5605) [Link] (1 responses)

I notice that Ceylon actually uses "=" for immutable declarations, and ":=" for variable assignment, so at least that difference from mathematical notation did not go unnoticed.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 22, 2011 16:29 UTC (Fri) by vonbrand (guest, #4458) [Link]

That is exactly what Pascal does...

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 21:38 UTC (Fri) by miekg (subscriber, #4403) [Link] (12 responses)

I welcome new languages, but it seems this one doesn't have any built-in support for concurrency like Erlang and Go have. Too bad.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 21:42 UTC (Fri) by biehl (subscriber, #14636) [Link] (11 responses)

http://www.infoq.com/news/2011/04/ceylon

Gavin King: I strongly believe that concurrency, at least until some kind of revolution comes along, is something that libraries should solve. The language doesn't need built-in concurrency primitives. There are a number of good competing patterns for handling concurrency - everything from actors to STM, and the language should make it possible to write libraries that implement these patterns.
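On the JVM this is already how things work in practice: java.util.concurrent provides thread pools and futures as ordinary library objects, with no language-level concurrency primitives required. A minimal sketch (method names invented for illustration):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class LibraryConcurrency {
    // Concurrency as a library: the pool and the futures are plain
    // objects obtained from java.util.concurrent, nothing built into
    // the language itself.
    static int sumInParallel() throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        try {
            Future<Integer> a = pool.submit(() -> 1 + 2);
            Future<Integer> b = pool.submit(() -> 3 + 4);
            return a.get() + b.get(); // blocks until both tasks finish
        } finally {
            pool.shutdown();
        }
    }
}
```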

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 22:58 UTC (Fri) by josh (subscriber, #17465) [Link] (10 responses)

So on the one hand he dismisses functional programming as academic, and on the other hand he wants libraries to handle all concurrency rather than taking the practical approach of "that doesn't work well enough yet, we need to give programmers access to concurrency". And for that matter, some of the most impressive attempts to do concurrency automatically rely on functional programming to allow program transformations that make parallelization easier.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 16, 2011 5:59 UTC (Sat) by elanthis (guest, #6227) [Link] (1 responses)

Unless the language is capable of being run on a modern GPU (which entails a huge number of limitations compared to general purpose programs), then adding built-in concurrency is worthless for most types of programs.

Very few people have access to huge super-computer clusters or mega-networks like Google does, and consumer-grade CPUs are still being sold with just a single core, but just about everyone has a machine with anywhere from 32 to 2,000 shader units.

The interesting concurrency for _most_ applications is taking place in OpenCL/CUDA/DirectCompute. It is not happening anywhere inside the main application logic.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 16, 2011 9:23 UTC (Sat) by tialaramex (subscriber, #21167) [Link]

"consumer-grade CPUs are still being sold with just a single core"

Other equally true statements

"houses are still being built with just a single room"

"razors are still being sold with just one blade"

Equally true and equally useless for assessing whether anything important has changed.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 16, 2011 12:05 UTC (Sat) by Ben_P (guest, #74247) [Link] (7 responses)

I don't really understand his dismissal of Java collections. Disclaimer: I have done a couple of years of work on JDK7 java.util.concurrent, as well as some more recent work on JDK8. We trivially get standard Java library data structures to scale to hundreds and thousands of CPUs, with no special-purpose JVM or GC tweaks. Sure, if you use the wrong structure, or if structures are not used correctly, you'll see poor performance, but that doesn't have much to do with the collections framework. Are there any community-wide gripes with the collections framework?

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 16, 2011 16:26 UTC (Sat) by HelloWorld (guest, #56129) [Link] (6 responses)

The way immutable collections are "supported" by the collections frameworks is about as broken as it gets.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 17, 2011 14:38 UTC (Sun) by Ben_P (guest, #74247) [Link] (5 responses)

I haven't really seen a strong use case for immutable collections. You can use a small (<10 line) wrapper class for most collections. Other times it may just be more appropriate to copy the collection. Are there any good articles/papers etc. making the case for immutable collections?

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 17, 2011 14:45 UTC (Sun) by Ben_P (guest, #74247) [Link] (4 responses)

Forgot to mention that even with the current immutable collections, throwing an exception seems like the right thing to do. If a caller tries to write to an immutable collection, that is wrong and should be handled. Local try/catches mostly end up as simple conditional jumps by the time the code reaches machine code; they don't have any of the speed penalties generally associated with exceptions.
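For concreteness, this is how the JDK's unmodifiable wrappers actually behave: the write compiles cleanly and is rejected only at run time (class and method names below are invented):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class UnmodifiableDemo {
    static boolean writeRejected() {
        List<String> backing = new ArrayList<>();
        backing.add("a");
        // The wrapper shares the List type with mutable lists, so the
        // compiler cannot tell the difference.
        List<String> view = Collections.unmodifiableList(backing);
        try {
            view.add("b"); // compiles fine; the type system is silent
            return false;
        } catch (UnsupportedOperationException e) {
            return true;   // the mistake surfaces only at run time
        }
    }
}
```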

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 17, 2011 16:24 UTC (Sun) by HelloWorld (guest, #56129) [Link] (3 responses)

> If a caller tries to write to an immutable collection that is wrong and should be handled.
It should be handled through the type system at compile time.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 18, 2011 5:28 UTC (Mon) by alankila (guest, #47141) [Link] (2 responses)

I don't think it's easy to do. Are you suggesting, for instance, that the List interface should be split between UnmodifiableList and a ModifiableList that extends UnmodifiableList?

Note that unless immutable vs. mutable objects really *are* different at runtime, one can just cast the distinction away: you could be given an UnmodifiableList, but because the same object implements ModifiableList, a cast will succeed. I am guessing that handling this would be a rather costly implementation effort for a feature that doesn't really seem to have many use cases.

For me, immutability comes up in the context of avoiding having to hand the caller a full copy of some collection: you just give the caller an unmodifiable view of some internal property, and can still trust that your own methods are the only ones doing any mutating.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 18, 2011 10:42 UTC (Mon) by Ben_P (guest, #74247) [Link]

Immutable and mutable collections are different at run time. The former overrides methods which change the collection to throw an exception. With that exception, they are the same.

You are correct in that the Collection class hierarchy would have to be shuffled around a bit to get immutable collections to error at compile time if someone tries to call an add/remove/etc. However, getting around those compile time checks would be trivial with reflection or even with some specific casting tricks. As it is now, no amount of reflection or jvm foolishness will let you change an immutable collection.

Is it common practice to surround all calls that pass your immutable collection with a try catch? Without such a requirement I'd imagine your UncaughtExceptionHandler would be quite busy.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 18, 2011 13:23 UTC (Mon) by HelloWorld (guest, #56129) [Link]

> I don't think it's easy to do. Are you suggesting, for instance, that the List interface should be split between UnmodifiableList and a ModifiableList that extends UnmodifiableList?
Yes.

> Note that unless the immutable vs. mutable objects really *are* different at runtime, one can just cast the distinction away: you could be given an UnmodifiableList, but because the same object implements a ModifiableList, a cast will succeed.
The point of the type system is to guard against accidents, not against bad will, so the fact that you can cast to circumvent it doesn't make it less useful.

Besides, you can still use wrapper classes that can't be cast to mutable collections if you think that's necessary, for example if you're writing a program that will load potentially malicious plugins.
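A minimal sketch of such a split; the interface and class names here are hypothetical, and the real java.util.List is not organized this way:

```java
public class SplitSketch {
    // Hypothetical read-only supertype: no mutators declared at all.
    interface ReadableList<T> {
        T get(int index);
        int size();
    }

    // Mutation is only available through the subtype.
    interface MutableList<T> extends ReadableList<T> {
        void add(T element);
    }

    static class ArrayBacked<T> implements MutableList<T> {
        private final java.util.ArrayList<T> items = new java.util.ArrayList<>();
        public T get(int i) { return items.get(i); }
        public int size() { return items.size(); }
        public void add(T e) { items.add(e); }
    }

    // A caller handed only a ReadableList cannot call add():
    // the mistake is caught at compile time, not by an exception.
    static int describe(ReadableList<String> view) {
        return view.size();
    }
}
```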

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 21:57 UTC (Fri) by aliguori (guest, #30636) [Link]

As far as I can tell, the language is statically typed, relying heavily on type inference; there is no explicit overloading of operators or methods, but operators are implemented in terms of polymorphic functions (like Python). Closures are fully supported, interfaces tend to have private visibility unless declared public, and references are non-nullable by default.

Nothing terribly radical here except for the non-Nullable references perhaps. That's a pretty difficult thing to implement in a non-annoying way. I wonder if once they get a compiler going and start writing code, they'll regret that decision.
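For comparison, the closest library-level analogue in Java is making possible absence explicit in the type, as the JDK's later Optional does; the method names below are invented for illustration:

```java
import java.util.Optional;

public class NonNullSketch {
    // Plain reference: the signature promises a value is always present.
    static String greet(String name) {
        return "Hello, " + name;
    }

    // Possible absence is visible in the type, and callers must unwrap
    // it deliberately. This is roughly what non-nullable-by-default
    // buys at the language level, minus the convenient syntax.
    static String greetMaybe(Optional<String> name) {
        return name.map(NonNullSketch::greet).orElse("Hello, stranger");
    }
}
```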

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 22:35 UTC (Fri) by bjacob (guest, #58566) [Link] (8 responses)

Interesting how more and more companies/projects decide to invest R&D resources into new programming languages.

Google->Go
Mozilla->Rust
RedHat->Ceylon

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 22:59 UTC (Fri) by ncm (guest, #165) [Link] (3 responses)

Vanity languages are a sign of corporate decadence. In the old days, the same impulse would result in a vanity chip architecture. Most don't remember the Z8000, M88000, iAPX 432, or NS16032, and don't suffer for it.

The only interesting (very incomplete) language I've run across lately is Clay. No built-in "O-O gook" or GC, but a type system that supports real generics.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 18, 2011 18:16 UTC (Mon) by ejr (subscriber, #51652) [Link]

Thanks for the reference to Clay. The snippets feel (mentally) like a compile-time relative of Dylan or Cecil... Having iteration protocols in a language focused on static compilation would make me thrilled. (To save others the search: http://tachyon.in/clay/ )

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 22, 2011 3:55 UTC (Fri) by raven667 (subscriber, #5198) [Link]

I wonder if the i960 falls into that category as well.

I wonder how much this is a real sign of the decadence that often accompanies decline.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 22, 2011 4:20 UTC (Fri) by mjg59 (subscriber, #23239) [Link]

The freedom my coworkers have to write new languages is the same freedom that lets me draw a salary for supporting enterprise Linux on servers and spend a significant portion of my time working on laptop support anyway. It may be decadent, but I think it's a corporate culture that benefits Linux.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 17, 2011 1:16 UTC (Sun) by Lennie (guest, #49641) [Link] (3 responses)

And the funny thing is that JavaScript/ECMAScript is getting into more and more places. It is really weird in a way. JSON is derived directly from JavaScript but is being used in many, many places by many different languages. JSON is even used with some NoSQL databases. In many places XML seems to be overkill. And node.js (event-based network programming for the server) is the third most popular project on github.com.

Lots of people are using web-based technologies (HTML5/CSS3/JS and so on) to build apps for phones, especially because they work on all these devices. Sometimes an extra layer like Phonegap is used, which gives some access to the needed native phone APIs; if you don't need to update Phonegap (or whatever app-API-library you are using), you can update the rest without going through the App Store, which makes it really easy to update.

JavaScript may not look, at first glance, like a lot of thought went into it (hell, it was created in just 10 days!*), but it is really widespread and seems to be growing.

It has some really bad parts which are pretty easy to avoid; it is a pretty small language and deceptively easy.

* quote from Brendan Eich: Ten days to implement the Javascript lexer, parser, bytecode emitter (which I folded into the parser; required some code buffering to reorder things like the for(;;) loop head parts and body), interpreter, built-in classes, and decompiler... Ten days without much sleep to build JS from scratch, "make it look like Java" (I made it look like C), and smuggle in its saving graces: first class functions (closures came later but were part of the plan), Self-ish prototypes (one per instance, not many as in Self).

http://news.ycombinator.com/item?id=1797304

Well, it's not because JavaScript is any good...

Posted Apr 17, 2011 16:29 UTC (Sun) by khim (subscriber, #9252) [Link] (2 responses)

JavaScript is used because it's ubiquitous, not because it's good. As for JSON... it's a good illustration of how things should be done: first you create a prototype in a toy language like JavaScript; later, if it works, you rewrite it in some industrial language.

Today even Web-browsers offer JSON-handling written in industrial languages...

Well, it's not because JavaScript is any good...

Posted Apr 18, 2011 6:41 UTC (Mon) by wahern (subscriber, #37304) [Link] (1 responses)

Linux and OS X implement routines like strlen() in assembly. Is this because C isn't industrial enough?

Nope...

Posted Apr 18, 2011 7:02 UTC (Mon) by khim (subscriber, #9252) [Link]

No, it's because not everything must be implemented in industrial language.

There is assembler, which is wickedly fast but notoriously difficult (glibc has had many different errors in the most trivial functions, like memcpy or strlen); you use it for the most time-critical pieces of code (and for things which are flat-out impossible in higher-level languages). There are toy languages, which you use to "play around": to develop prototypes and to write simple scripts. And there are industrial languages, which are used for code that is maintained for years by different people.

JavaScript is a separate phenomenon: it's a toy language forced to play the role of an industrial language. It's painful. But yes, there are different ways to cope.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 23:01 UTC (Fri) by cmccabe (guest, #60281) [Link] (30 responses)

So to sum it up, it's Java but:
* generics via type reification instead of type erasure
* a non-nullable pointer type
* no method overloading, even for constructors
* curried functions
* no primitive types

Interestingly, they seem to be borrowing a lot from C++. Ceylon has:
* the local keyword, their version of the C++0x auto keyword
* the ability to specify functions that *aren't part of classes*!
* default parameters (yuck)
* something that's kind of like operator overloading, but only for the equals sign, the less than and greater than signs, and a few other things.
* A C-style structure initializer syntax
* function pointers!
* the ability to put *code* into Interfaces (ew)

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 16, 2011 9:36 UTC (Sat) by juliank (guest, #45896) [Link] (29 responses)

> * the ability to put *code* into Interfaces (ew)
That's useful. Treating an interface as a purely abstract class has no advantage over treating interfaces as stateless classes (mixins). In fact, stateless classes with methods are far more useful than purely abstract ones, since they allow more multiple inheritance while still preventing the problems caused by having state in multiple parent classes.
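Java itself eventually adopted exactly this in the form of default methods (Java 8). A stateless interface carrying code might look like the following sketch (names invented):

```java
public class MixinDemo {
    interface Greeter {
        String name(); // abstract: any state lives in the implementing class

        // Concrete but stateless: defined purely in terms of name(),
        // so the interface carries behavior without carrying fields.
        default String greeting() {
            return "Hello, " + name() + "!";
        }
    }

    static class Bot implements Greeter {
        public String name() { return "Ceylon"; }
    }
}
```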

Actually it's not true...

Posted Apr 16, 2011 12:03 UTC (Sat) by khim (subscriber, #9252) [Link] (28 responses)

Treating an interface as a purely abstract class has no advantage over treating interfaces as stateless classes (mixins).

Sorry, but no. Mixins are a huge disadvantage for an "industrial" language. Why? They make it harder to understand a program. If interfaces are just declarations, then you don't need to know what they actually declare when you debug your class. If they may define functions, then you can suddenly introduce a change in behavior just by adding a new interface to the declaration of a class.

This may be a win for a "toy" language (like Python) where you have just one developer; it may even increase said developer's productivity. But for an industrial language it's a no-go... though I've not seen exactly how Ceylon implements this; perhaps it's not so problematic there.

RE: Actually it's not true...

Posted Apr 16, 2011 17:44 UTC (Sat) by nybble41 (subscriber, #55106) [Link] (27 responses)

If the implementation is anything like the same feature in D, there is no real risk of changing the behavior of a class. Functions defined in the interface can only refer to other declarations in the interface (concrete or abstract), not to anything added by the implementing class, and cannot be overridden. Your program may fail to compile if your class and the interface happen to define methods with the same name, but otherwise you can safely ignore the concrete interface functions.

Well...

Posted Apr 16, 2011 17:49 UTC (Sat) by khim (subscriber, #9252) [Link] (26 responses)

If that's true then they are mostly harmless... and useless. These are not mixins by any stretch of the imagination. But they may save a small number of keystrokes, so maybe it's a net win in the end.

Well...

Posted Apr 16, 2011 19:48 UTC (Sat) by elanthis (guest, #6227) [Link] (25 responses)

Nothing to do with saving keystrokes, everything to do with avoiding code duplication. You sound like one of those people who think inheritance is just a lazy feature to avoid a little cut-n-pasting.

I don't want to write the same code twice because I want to keep the bug count down. _real_ programmers of large systems understand that, and don't go around calling features like that "toys" just because their pet language lacks the feature.

Sorry, but no. It just looks like you have never worked with large (a million files or more) systems.

Posted Apr 17, 2011 8:27 UTC (Sun) by khim (subscriber, #9252) [Link] (24 responses)

You sound like one of those people who think inheritance is just a lazy feature to avoid a little cut-n-pasting.

Cut-n-paste is Ok, it's Copy-n-Paste which is bad :-)

As for inheritance... well, time has shown that it's actually a very bad way to avoid copy-n-paste, because you exchange the time needed to write code for time needed to debug said code. With large systems debugging dominates, so class inheritance is bad while interface inheritance is good. But then, how do you avoid copy-n-paste if you only have interface inheritance? Well, the old-fashioned way: by using separate functions. That way, every time you call a function which does some kind of "inherited" operation, you clearly see what exactly it can affect.
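That old-fashioned way can be sketched in Java: shared behavior lives in a named function that each class calls explicitly (all names here are hypothetical):

```java
public class NoInheritance {
    // Shared behavior lives in a plain function, not a base class.
    static String normalize(String s) {
        return s.trim().toLowerCase();
    }

    // Each class calls the helper explicitly; nothing is inherited,
    // so the origin of the behavior is visible at every call site.
    static class UserName {
        final String value;
        UserName(String raw) { this.value = normalize(raw); }
    }

    static class TagName {
        final String value;
        TagName(String raw) { this.value = normalize(raw); }
    }
}
```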

Don't go around calling features like that "toys" just because their pet language lacks the feature.

Is this a knee-jerk reaction to the "toy language (like Python)" phrase? Sorry, but Python is a toy language (the same as Smalltalk, BTW). This is not a figure of speech; this is a well-established fact. We have millions of lines of code here, including Python code. While modules written in different languages have different fates, the Python projects fall into two distinct groups:
1. Projects which are still maintained by their original authors.
2. Projects which are a constant source of irritation and maintainability problems.
There is no third group. The reasons are simple: Python is powerful and clever, a real joy to program in, but it employs a lot of features which are easy to abuse and which create hard-to-understand-and-debug programs.

Any feature in an industrial language must be:
1. Hard to abuse (strict requirement).
2. Easy to use (nice to have).
A good example is C++. By itself C++ is a horrible language: it contains literally tons of features which are easy to abuse. But in an industrial context this can be mitigated: just write a "Style Guide" and forbid the problematic features in it. It's workable. I suspect that even Python could be used as an industrial language if you forbade most of the dangerous (and most powerful!) features. But if you design a new "industrial" language, then it's a good idea not to include features which are known problems. Default parameters, for example.

P.S. Sometimes it's a good idea to include some dangerous but powerful features if you can show that in some circumstances they can save literally years of work. Operator overloading is one such feature: in the aforementioned "Style Guides" you can often find a phrase like "operator overloading is forbidden unless it's permitted by such-and-such group of experts". But I rarely see mixins used this way: more often than not they are used to avoid proper refactoring.

Sorry, but no. It just looks like you have never worked with large (== a million files or more) systems.

Posted Apr 17, 2011 11:14 UTC (Sun) by mpr22 (subscriber, #60784) [Link] (16 responses)

... Python is powerful and clever ...

There is an irreconcilable mismatch between this statement (even in its full context) and the notion of Python being a "toy language" for any useful definition of the term "toy language".

Why do you think so?

Posted Apr 17, 2011 13:34 UTC (Sun) by khim (subscriber, #9252) [Link] (15 responses)

There is an irreconcilable mismatch between this statement (even in its full context) and the notion of Python being a "toy language" for any useful definition of the term "toy language".

Huh? A toy language == a language you use to play. Basically, to write code which is not supposed to outlive you (or your involvement in a particular project).

Toy languages may be quite powerful and clever (Haskell, Perl, Python and Smalltalk are good examples), but since it's practically impossible to maintain code in these languages long-term unless you wrote it yourself, they are a drain on the organizations which employ them. If one of these languages is forced to play the role of an industrial-strength language, it leads to a huge waste of resources (as happened with JavaScript).

There is nothing wrong with toy languages. In fact you probably write more throwaway code than "long-term" code - and it's fine to use these languages there. They often increase your individual productivity and so save resources. The problem begins when you keep code in a toy language around too long: once individuals other than the principal author are forced to play the role of maintainers, the whole house of cards falls apart.

Why do you think so?

Posted Apr 17, 2011 14:10 UTC (Sun) by mpr22 (subscriber, #60784) [Link]

I happen to parse "toy language" as carrying a strong implication that the language has no worthwhile use in the speaker's eyes. I'm actually astounded to find that there are people who don't.

Why do you think so?

Posted Apr 17, 2011 14:54 UTC (Sun) by vonbrand (guest, #4458) [Link] (13 responses)

This is quite wrong. There is a ton of "toy language" code around, healthy and maintained. You'd think by now whoever is still writing new code and maintaining the old would have wised up and thrown the language out (or at the very least would be actively looking for a replacement).

Also, the "throwaway" code you write today might turn out to be more useful than you thought, and get expanded and refined, and used over and over. Better make sure it isn't trash from the start.

Examples, please...

Posted Apr 17, 2011 16:01 UTC (Sun) by khim (subscriber, #9252) [Link] (12 responses)

There are tons of "toy language" code around

That's quite true.

healthy and maintained.

This I have yet to see. The world is a large place, so perhaps you can find a couple of such projects, but as I've said, most projects out there (and certainly all the internal projects I've seen) fall into the two aforementioned categories:
1. Either they are actively developed and maintained by their original authors, or
2. The original authors are gone and most feature requests and even the majority of bugs are closed with a "WONTFIX" resolution under one pretext or another.

You'd think by now whoever is still writing new code and maintaining the old would have wised up and thrown the language out.

Sorry, but this is the typical situation. The people who develop the "exciting and new" code and the people who maintain the "old and crusty" code are usually different people. When you develop something in a powerful and forgiving toy language you are more productive, and the fact that someone will spend years trying to fix that mess does not matter to you. It does matter to your manager if your company is large enough - which is why very few languages are used in large projects.

Also, the "throwaway" code you write today might turn out more useful than you thought, and get expanded and refined, and used over and over.

If it's useful and important then it can be rewritten in an "industrial language" (think of the original BitTorrent and today's BitTorrent AKA uTorrent), but most throwaway code is just that: throwaway code.

Examples, please...

Posted Apr 18, 2011 8:00 UTC (Mon) by anselm (subscriber, #2796) [Link] (11 responses)

The world is a large place, so perhaps you can find a couple of such projects, but as I've said, most projects out there (and certainly all the internal projects I've seen) fall into the two aforementioned categories:
1. Either they are actively developed and maintained by their original authors, or
2. The original authors are gone and most feature requests and even the majority of bugs are closed with a "WONTFIX" resolution under one pretext or another.

Yeah right. As if that doesn't happen with C++ code. Just look at KDE, where the SOP is to reimplement from scratch after the original developers of an application have left, simply because the code is usually too crufty to maintain.

Of course it happens with C++!

Posted Apr 18, 2011 8:24 UTC (Mon) by khim (subscriber, #9252) [Link] (3 responses)

You can write unmaintainable code in any language. The question is not whether there are projects which are broken, but whether there are projects which are not. It's easy to say of a new language (like Google Go): well, there are no such projects because nothing has been abandoned yet. It's not so easy to say the same about Python: it's 15 years old! If it's not a toy language, there must be projects which changed maintainers without a full rewrite. You know, like WebKit (formerly KHTML) or Inkscape (formerly Sodipodi). The question is: how many of them are out there?

Of course it happens with C++!

Posted Apr 18, 2011 8:49 UTC (Mon) by anselm (subscriber, #2796) [Link] (2 responses)

I think whether that sort of thing works depends much more on good development practices than on what language was used. I tried to think of some large Python projects that underwent a radical developer change, but the ones I use (at least) seem to have avoided this so far by gradually bringing new people into the core team who are then ready to step up if long-standing developers leave. This, together with sound engineering and good documentation, is probably a better approach than hoping that the language will make the project easier for others to pick up if the original developers leave it lying on the floor.

Also note that, e.g., with WebKit, the whole point of Apple taking over KHTML was to avoid having to do a full rewrite. The Apple developers would probably have been perfectly capable of coming up with their own HTML rendering library from scratch if they'd wanted to; as things happened, they used KHTML as a starting point but in the process butchered it into something the original KHTML developers wouldn't recognise if they met it in the street. I'm not convinced this is actually good evidence for C++ as a »non-toy language«.

But this is reality...

Posted Apr 18, 2011 11:58 UTC (Mon) by khim (subscriber, #9252) [Link] (1 responses)

I tried to think of some large Python projects that underwent a radical developer change but the ones I use (at least) seem to have avoided this so far by gradually bringing new people into the core team who are then ready to step up if long-standing developers leave.

But this is the reality of "industrial programming". People are hired (or contracted) to write the system. Then they go away (to different projects, or maybe to a different company). Then, a few years later, different people are hired (or contracted) to change the system. And so on.

This, together with sound engineering and good documentation, is probably a better approach than hoping that the language will make the project easier to pick up by others if the original developers leave it lying on the floor.

"A project lying on the floor" is the definition of "industrial programming". Sometimes it may come with adequate documentation; more often than not the documentation will be incomplete or just plain wrong. You don't have the luxury of keeping the developers around and keeping the project alive, because it quickly becomes economically unfeasible. Yes, you can try to mitigate the effects of such freeze/thaw cycles, but if you cannot cope with them at all, then the language is not suitable for use in large systems (when you have millions of lines of code and only hundreds or thousands of programmers, most of your code is "frozen" - you just cannot afford any other approach).

Note: most "industrial projects" are "inside projects"; they are never published publicly, so it's hard to collect statistics - but surely, with Python as popular as it is, if it's capable of going beyond "toy language" status we should see at least some projects which successfully survived such a transition?

But this is reality...

Posted Apr 18, 2011 13:18 UTC (Mon) by pboddie (guest, #50784) [Link]

Note: most "industrial projects" are "inside projects"; they are never published publicly, so it's hard to collect statistics - but surely, with Python as popular as it is, if it's capable of going beyond "toy language" status we should see at least some projects which successfully survived such a transition?

I went to the Python Wiki to see if I could dig some projects up, and straight out of the gate, MoinMoin (which is what runs the Wiki) is precisely the kind of project which ended up being maintained by a bunch of people who are not the original maintainers. There are projects like lxml whose founder doesn't have that much to do with it any more, apart from the occasional contribution, perhaps. I happen to run a ViewVC installation, and I'm pretty sure the original author of that software doesn't have anything to do with maintaining it. Likewise, Roundup is maintained by someone other than the person who originally wrote it. I've even picked up other people's projects and enhanced and maintained them, albeit not projects measured in gigalines of code because I don't have time to commit to such things, so it's not exactly the "scary Python makes understanding hard" situation you'd like everyone to believe in.

Of course, the programming language chosen only exerts a certain amount of influence over the long-term management of software projects: there are quite a few other factors at work, as most people are already aware.

Examples, please...

Posted Apr 18, 2011 9:22 UTC (Mon) by rdale (subscriber, #70788) [Link] (6 responses)

"Just look at KDE, where the SOP is to reimplement from scratch after the original developers of an application have left, simply because the code is usually too crufty to maintain."

No, that is quite wrong. In my experience, the vast majority of application code in KDE is well written and can be understood by developers who aren't the original authors. Before a KDE application is included in a KDE module it is subject to a review process, where coding style and any design issues are discussed. Once the app has made it into the module, the commits can be reviewed by developers who aren't on the project as they are made. Design and coding issues can also be discussed on mailing lists which are open to everyone, not just the original developers. The KDE project has a strong community which allows a project to outlive its original developers because it follows these good practices.

If you are implying that, in general, Qt-based C++ projects are inherently hard to maintain, don't scale, and can't be understood by anyone but the original author, then as far as I can see there is an awful lot of evidence out there, in various code repositories, to discount that view.

Examples, please...

Posted Apr 18, 2011 10:32 UTC (Mon) by anselm (subscriber, #2796) [Link] (5 responses)

If you are implying that, in general, Qt-based C++ projects are inherently hard to maintain, don't scale, and can't be understood by anyone but the original author, then as far as I can see there is an awful lot of evidence out there, in various code repositories, to discount that view.

I didn't say that. I'm a long-term KDE user and I agree that judging from results, much of KDE seems reasonably maintainable. However, from experience and from talking to people within the project it seems to me that this does not apply to all of KDE – and when important pieces of infrastructure like KMail go essentially unmaintained for extended periods of time because the original developers have lost interest and nobody else dares touch the code then I would consider this a problem. A large contributing factor is that, given a choice, most free-software developers would probably prefer writing shiny new code with their name on it to maintaining loads of crufty stuff that other people have dumped on them, which leads to »reimplementation in favour of maintenance« if the pre-existing code base is at all difficult to work with. Also consider kprinter, which (according to what I've been told) is far too complicated to have been ported from KDE 3.5 to KDE 4. I mean, seriously?

Most sub-projects within KDE probably manage to transition gradually to new maintainers, but if – as does happen even within KDE – the existing developers of a sub-project lose interest in the project before getting new people up to speed, then there is no magic within C++ (especially magic that other languages like Python don't have) that makes it easier for a completely new person to get the hang of the code. If the code in question is well-written and adequately documented it may be easier for outsiders to pick up but again that doesn't depend on the language. As khim said, it's possible to write unmaintainable code in any language, but conversely, it is also possible to write maintainable code.

Examples, please...

Posted Apr 18, 2011 12:05 UTC (Mon) by rdale (subscriber, #70788) [Link] (4 responses)

"when important pieces of infrastructure like KMail go essentially unmaintained for extended periods of time because the original developers have lost interest and nobody else dares touch the code then I would consider this a problem."

The reason that there isn't much of a focus on maintaining KMail is that the KDE PIM team have been developing Akonadi, a server-based back end for contacts, email etc., along with something like a 'construction kit' for writing mail clients.

I wasn't really going to contribute to this discussion as I don't have much of an opinion about a language like Ceylon, which doesn't actually exist yet. I was only trying to correct what I saw as some misleading, though possibly well meant, comments about KDE Qt/C++ development.

But I would say a big problem with many Java programmers is that they have only ever learned Java - in the same way that, many years ago, many COBOL programmers had only really learnt that one language. From experience I am always very wary of people who claim that the language they know best is the 'one true language' and is best for everything. Or, similarly, that C++/Java-style static typing is the 'one true way' and the other sorts of languages must be mere toys.

I'm more impressed with someone who knows when to choose JavaScript, Scala, Python, Erlang or whatever, given the nature of the particular project, the people working on it currently, and those who will maintain it in the future. Maybe in a year or two Ceylon will be a worthy candidate to consider, but I doubt very much that it will be 'the last language you ever need'.

Examples, please...

Posted Apr 18, 2011 13:58 UTC (Mon) by anselm (subscriber, #2796) [Link] (3 responses)

The reason that there isn't much of a focus on maintaining KMail, is because the KDE PIM team have been developing Akonadi, which is a server based back end for contacts, email etc, along with something like a 'construction kit' for writing mail clients.

I don't think that's the real explanation (even though it sounds rather convenient). IIRC there wasn't a lot going on with KMail even before the whole Akonadi issue came up. My company shared a booth with the KDE project at CeBIT, 2004 – i.e., two years before Akonadi was launched – and I seem to remember talking to one of the KDE-PIM people there. The gist of what he said was that there wasn't much interest in doing things with KMail because the original author had left and the code base was in such an abominable state that nobody else wanted to take charge of it.

but I doubt very much that it will be 'the last language you ever need'.

The history of computing is full of languages claiming to be »the last language you ever need«. COBOL started out as such a language. If we have learned anything during the last 50+ years of computer programming it is that there is no single language that will solve all programming problems. Anyone who comes up with another one of those should be treated with the same respect that is accorded to the inventors of perpetual motion machines.

Examples, please...

Posted Apr 18, 2011 15:11 UTC (Mon) by rdale (subscriber, #70788) [Link]

"I don't think that's the real explanation (even though it sounds rather convenient). IIRC there wasn't a lot going on with KMail even before the whole Akonadi issue came up. My company shared a booth with the KDE project at CeBIT, 2004 – i.e., two years before Akonadi was launched – and I seem to remember talking to one of the KDE-PIM people there. The gist of what he said was that there wasn't much interest in doing things with KMail because the original author had left and the code base was in such an abominable state that nobody else wanted to take charge of it."

I've been using KMail as my main mail client since before 2004, and at no point has it stopped working, stopped being maintained or exhibited serious bugs. I am using KMail 1.13.5 at the moment. You may well be right that the KMail 1.x code isn't the most perfect in the world, but it appears to have been ported from Qt3 to Qt4 at some point after 2004 and I don't think the developers would have bothered if it was that bad.

Thomas McGuire gives a good account of what was happening with Akonadi and KMail as of a year ago, and I'm sure it has progressed perfectly well since.

I'm still not clear exactly what we are discussing here - are you saying that if only the KDE project had the awesome power of the unreleased Ceylon language, then re-architecting major projects would just work?

"The history of computing is full of languages claiming to be »the last language you ever need«. COBOL started out as such a language. If we have learned anything during the last 50+ years of computer programming it is that there is no single language that will solve all programming problems. Anyone who comes up with another one of those should be treated with the same respect that is accorded to the inventors of perpetual motion machines."

Yes I agree completely.

Examples, please...

Posted Apr 18, 2011 16:26 UTC (Mon) by vonbrand (guest, #4458) [Link] (1 responses)

The history of computing is full of languages claiming to be »the last language you ever need«. COBOL started out as such a language.

I believe you are confusing COBOL with PL/1 (a merger (sort of) of COBOL ("business oriented") and FORTRAN ("scientific/engineering language") with a dash of ALGOL structuring on top).

Examples, please...

Posted Apr 18, 2011 17:21 UTC (Mon) by anselm (subscriber, #2796) [Link]

[…] PL/1 (a merger (sort of) of COBOL ("business oriented") and FORTRAN ("scientific/engineering language") with a dash of ALGOL structuring on top).

Yep, there goes another one. Fast-forward another decade to Ada, and so on.

The original idea behind COBOL was to make programs look more like natural language so after the programmers had written the code, their (non-technical) managers could proof-read it to see whether the programmers had got things right.

Sorry, but no. It just looks like you have never worked with large (== a million files or more) systems.

Posted Apr 17, 2011 17:42 UTC (Sun) by pboddie (guest, #50784) [Link] (6 responses)

Sorry, but Python is a toy language (the same as Smalltalk, BTW). This is not a figure of speech, it's a well-established fact.

Facts require more than assertions or anecdotes as documentation.

We have millions of lines of code here - including Python code. While modules written in different languages meet different fates, the Python projects fall into two distinct groups:
1. Projects which are still maintained by their original authors.
2. Projects which are a constant source of irritation and maintainability problems.
There is no third group. The reason is simple: Python is powerful and clever, a real joy to program in - but it includes a lot of features which are easy to abuse and which create hard-to-understand-and-debug programs.

With regard to the latter remark, you could say the same about any language. The old joke about shooting yourself in the foot with C and blowing your whole leg off with C++ arose precisely because both languages provide "a lot of features which are easy to abuse and create hard-to-understand-and-debug programs". No large-scale C or C++ project can manage without a "house coding style", and any large-scale Python project ought to restrain developers from using exotic stuff they barely understand.

And it is ridiculous to assert that Python itself causes projects to be abandoned and undermaintained in some way: such issues are purely social ones, unless you are bringing in some kind of language snobbery about how other programming languages have better practitioners who manage their projects better, write better documentation, and so on. Given recent experiences with some awful but widely-used Java tools, I beg to differ.

Python itself would benefit from simplification and having features thrown overboard that language snobs insisted be added for Python not to be, as you call it, a "toy" language. But certainly, programming language evolution should be driven by feedback from actual experiences of actual practitioners, not people who have already decided that a language isn't suitable for a particular kind of project (while others are already doing such projects in that language).

Well, that's the problem...

Posted Apr 17, 2011 19:08 UTC (Sun) by khim (subscriber, #9252) [Link] (5 responses)

No large-scale C or C++ project can manage without a "house coding style", and any large-scale Python project ought to restrain developers from using exotic stuff they barely understand.

Sadly, in Python the "exotic stuff" is in fact unavoidable. It's in the language core: dynamic typing. Sure, sometimes static typing hurts, and a lot of "clever" concepts are enabled by Python-style duck typing. But the fact that you are using dynamic typing not in a few carefully designed places but everywhere means that you have many times more failure points than in, say, Java. Or even C++.
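A tiny illustration of the "failure points" argument, with made-up names (this is not from the comment itself):

```python
# With dynamic typing, a type mismatch surfaces only when the offending
# line actually executes; a statically typed language would reject the
# call before the program runs. Purely illustrative.
def total_length(items):
    return sum(len(item) for item in items)

ok = total_length(["ab", "cde"])   # fine: 2 + 3
try:
    total_length(["ab", 42])       # 42 has no len(); fails only at runtime
    failure = None
except TypeError as exc:
    failure = type(exc).__name__
```

The bad call type-checks nowhere and is only caught if that code path is actually exercised, which is the "test coverage" that static typing provides for free.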

But certainly, programming language evolution should be driven by feedback from actual experiences of actual practitioners.

"Actual practitioners" spoke long ago: large projects need at least static typing with compile-time checking - it's as simple as that. It's possible to discuss the advantages of different approaches to static typing, but without it... the language is a toy language for sure. It's possible to create statically typed languages using Python as a starting point, but it certainly looks like few people are interested in such languages. And most Python lovers would prefer to use Java or Objective-C rather than a "crippled" statically typed Python.

Well, that's the problem...

Posted Apr 18, 2011 7:12 UTC (Mon) by juliank (guest, #45896) [Link] (3 responses)

I'm proposing something that could be called predicate-based typing. This needs an introspectable language where we can see which methods exist on objects. Then we could say that a function accepts objects fulfilling certain requirements, such as, in Python, specifying __lt__:

def f(x: predicate(hasattr, '__lt__')):
    return do_something(x)

This is probably something like duck typing with static analysis. It would probably be useful.
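For what it's worth, a runnable sketch of this idea is below. The `predicate` helper and the call-time checker are hypothetical inventions, and a real implementation would check statically, as proposed; I also use `upper` rather than `__lt__` so the failing case is easy to show (in Python 3 almost every object has `__lt__`):

```python
import functools
import inspect


def predicate(func, *extra):
    # Builds a check that holds when func(value, *extra) is truthy,
    # e.g. predicate(hasattr, 'upper') accepts anything with .upper().
    return lambda value: func(value, *extra)


def check_annotations(f):
    # Enforces predicate annotations when f is called. A real static
    # analyser would do this before the program runs instead.
    sig = inspect.signature(f)

    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            ann = sig.parameters[name].annotation
            if ann is not inspect.Parameter.empty and not ann(value):
                raise TypeError("%r fails the predicate for %r" % (value, name))
        return f(*args, **kwargs)

    return wrapper


@check_annotations
def shout(s: predicate(hasattr, 'upper')):
    return s.upper()
```

With this, `shout("hi")` works, while `shout(42)` raises `TypeError` because integers have no `upper` method.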

Well, that's the problem...

Posted Apr 18, 2011 7:44 UTC (Mon) by mlobo (guest, #72557) [Link]

Structural typing.

It may be useful, but it still will not make python acceptable for large projects...

Posted Apr 18, 2011 8:16 UTC (Mon) by khim (subscriber, #9252) [Link] (1 responses)

Static typing is not important because it can be used to notify the reader and the compiler about something - there are many ways to do that.

Static typing is important, and only makes sense, because it's opt-out, not opt-in - and because it is static: it happens before the program is even run. This basically adds a perfect (or almost perfect) test-coverage layer which is very hard to remove. Why is it important that it's hard to remove? Because we are talking about old code - code which you didn't write and which is not clearly described (well, there is probably some kind of documentation, but it's outdated and probably plain wrong in places). When you change the code you break the unit tests - and it's hard to know whether you've just introduced a new bug in the code... or exposed a hidden assumption in the unit test which is no longer true. But you know that types should match: trying to circumvent the type system is actively frowned upon and often requires you to ask explicit permission from other people. It's like dimensional analysis in physics: it does not guarantee that something is right, and sometimes it's even wrong (E=mc^2, remember?), but that happens rarely, so it's still a very important tool in practice.

That's why I said "it's possible to discuss the advantages of different approaches to static typing": if a language asks you to explicitly override static type checking too often then you lose most of the benefits, but if it does not convey enough information then the aforementioned "free coverage" will be too weak. It's possible to add static typing to Python, but to be useful it must be ubiquitous - and for that it must be simple to use. Your scheme is too hard, but maybe some modification?

In any case I feel it's too late: Python is an old language. It's not easy to change fundamental assumptions in it. IMO it's better to have a good toy language for the cases where you want to play (and not to design large complex systems) rather than yet another poorly designed industrial language. If you want to see "what it takes"... well, JavaSE 1.4 to JavaSE 5 is a good example. JavaSE 1.4 was statically typed, but its type system was so poorly designed that any non-trivial program contained a huge number of type casts. JavaSE 5 fixed this problem, but it was a major redesign of the language. To make a "statically typed Python" usable you'd need something of similar magnitude.

It may be useful, but it still will not make python acceptable for large projects...

Posted Apr 18, 2011 8:34 UTC (Mon) by juliank (guest, #45896) [Link]

> To make "statically typed Python" usable you'll
> need something of similar magnitude.

What you need in order to statically type a program are type annotations of function parameters, function return values, and object attributes. With this type information, all types in the remaining program can be inferred.

We already have a way to annotate functions in Python 3, so we'd only need to annotate all functions, and add a way to annotate properties. Then we would theoretically be able to perform static type checking.
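As a concrete illustration of the Python 3 annotation syntax mentioned above (the checker itself remains hypothetical; the function here is a made-up example):

```python
# Python 3 stores these annotations but does not enforce them; an
# external static checker would consume scale.__annotations__.
def scale(vector: list, factor: float) -> list:
    return [x * factor for x in vector]


# The annotations are introspectable at runtime:
annotated = sorted(scale.__annotations__)   # ['factor', 'return', 'vector']
```

This is exactly the hook a static type checker would need: the type information is attached to the function object without changing how it runs.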

But I did not say we should choose Python for it. It was just an example of what I'd like to see as a type system, and now I know it's called a structural type system.

Well, that's the problem...

Posted Apr 18, 2011 9:15 UTC (Mon) by pboddie (guest, #50784) [Link]

But the fact that you are using dynamic typing not in a few carefully designed places but everywhere means that you have many times more failure points than in, say, Java.

So upon deploying a Java-based application recently, I shouldn't have expected it to fail at all - but then it got confused about which classes it should have been loading. You can introduce dynamic behaviour in any system, but at least dynamically-typed languages give you mechanisms to control it, rather than encouraging mechanisms that just bomb out with low-level errors.

It's possible to discuss advantage of different approaches to static typing, but without it... the language is toy language for sure.

If you can get stuff done in a language, it is not a "toy language", and there are plenty of systems out there written in Python. Now, I may agree with you that it can be very useful to be able to predict program behaviour, and I don't encourage the kind of dynamic configuration tricks which get people into trouble (even in languages like Java, as noted above), but repeating "toy language" all the time while conflating the social issues (people aren't maintaining their projects) with the technical ones makes it pretty hard to take your remarks seriously. And I am generally of the opinion that Python has encouraged the exotic and superfluous in recent times, so I should be sympathetic to some of your arguments.

Of course, looking past the rhetoric, your assertion may have been that only the original maintainers of Python projects are able to maintain them - the code is surely too unfathomable for other people - but there are plenty of projects whose maintainers are not the people who developed those projects in the first place. And one might expect people to struggle with maintaining many projects in Python, yet apart from the need to reacquaint oneself with a project after a certain absence, the limiting factors are frequently matters like the availability of conceptual documentation for the project; widely available tools can usually provide everything else one needs to know to get back into the code.

I'm sorry if all this boring peripheral stuff doesn't support the notion of a "silver bullet", but that's just how software engineering is.

Ad hominem attack.

Posted Apr 15, 2011 23:28 UTC (Fri) by Cyberax (✭ supporter ✭, #52523) [Link] (1 responses)

First, look at the designer of the language - Gavin King. He's the one who inflicted Hibernate on the unsuspecting world.

At first, Hibernate appears to be a nice ORM with great features. Then you discover that it's buggy, slow and poorly maintained (if you encounter a bug, then chances are it's already in their bugtracker, probably since 2006).

And instead of fixing Hibernate, Gavin King went off to create another abomination: JBoss Seam which somehow merges the worst features of JSF and dependency injection in a large NIH-suffering way.

This language appears to suffer from the same problems. It offers almost _nothing_ new compared to Scala and other JVM languages.

It'd just be better to continue working on Scala.

Ad hominem attack.

Posted Apr 16, 2011 6:10 UTC (Sat) by rsidd (subscriber, #2582) [Link]

Specifically, it probably refers to Ceylon tea (cf. Java coffee).

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 15, 2011 23:34 UTC (Fri) by mfedyk (guest, #55303) [Link] (3 responses)

Is it just me, or does this immediately remind you of the Cylons from Battlestar Galactica?

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 16, 2011 3:58 UTC (Sat) by njs (guest, #40338) [Link]

It reminds me of, uh, Ceylon.

The rationale for “Ceylon”

Posted Apr 16, 2011 7:30 UTC (Sat) by pjm (guest, #2080) [Link]

Nah, why would anyone name a programming language after the enemies of Kobol?

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 29, 2011 1:33 UTC (Fri) by wtanksleyjr (subscriber, #74601) [Link]

It's disappointing to see ad homonym attacks on LWN.

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 16, 2011 16:08 UTC (Sat) by b7j0c (guest, #27559) [Link] (6 responses)

meh

vala scala c#/mono go d

welcome ceylon to the big boring world of big boring replacements for c++ that will never actually replace c++

Boring replacements for C++

Posted Apr 17, 2011 17:19 UTC (Sun) by pboddie (guest, #50784) [Link] (5 responses)

My reaction is also somewhat "meh" but for different reasons. However, in your list of "replacements for c++", Scala and Go are not really targeted at the same function. Go, in particular, is a successor to Alef and Limbo, and although it is meant for "systems programming" like C++, one can argue that it belongs to a completely different school of systems programming: one that shuns object-orientation in favour of message-passing components.

My "meh" reaction springs from the observation that in the list of desirable programming language features driving the Ceylon project, if one just dropped "statically-typed", you'd land pretty close to Python. There's probably a law of programming language design that states that it's easier for people to come up with new languages than to markedly improve existing ones for a particular purpose.

Boring replacements for C++

Posted Apr 18, 2011 7:55 UTC (Mon) by cmccabe (guest, #60281) [Link] (4 responses)

> Go, in particular, is a successor to Alef and Limbo, and although it is
> meant for "systems programming" like C++, one can argue that it belongs to
> a completely different school of systems programming: one that shuns
> object-orientation in favour of message-passing components.

Actually, it is quite easy to write object-oriented code in Go. Locks don't compose, but Go-style message passing does.

Also, creating new objects by building them out of existing ones-- composition-- has been shown to be more maintainable than creating them through inheritance. James Gosling even said that if he did Java again, he'd "leave out inheritance."

I think maybe Go should emphasize its object-oriented aspects more. A lot of programmers still seem to think that Go takes a "bah, humbug" approach to object orientation, when nothing could be further from the truth.
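The composition-over-inheritance point above can be sketched in Go, where struct embedding and structurally-satisfied interfaces stand in for a class hierarchy. The `Logger`/`Server` names here are invented purely for illustration:

```go
package main

import "fmt"

// Logger is a small reusable capability.
type Logger struct {
	prefix string
}

func (l Logger) Log(msg string) string {
	return l.prefix + ": " + msg
}

// Server gains Log by embedding Logger (composition),
// not by inheriting from a base class.
type Server struct {
	Logger // embedded value: Server "has a" Logger
	addr   string
}

// Loggable is satisfied structurally; no declared
// hierarchy is needed anywhere.
type Loggable interface {
	Log(msg string) string
}

func main() {
	s := Server{Logger: Logger{prefix: "srv"}, addr: ":8080"}
	var l Loggable = s // satisfied via the embedded Logger's promoted method
	fmt.Println(l.Log("started on " + s.addr)) // prints "srv: started on :8080"
}
```

Because `Log` is promoted from the embedded `Logger`, `Server` satisfies `Loggable` without declaring anything, which is the sense in which Go is object-oriented without inheritance.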

Boring replacements for C++

Posted Apr 18, 2011 11:17 UTC (Mon) by oelewapperke (guest, #74309) [Link] (3 responses)

Message-passing composes without problems?

I have totally different experience. It's horrendously difficult to even get a single message-passing system to behave correctly, and it always has subtle bugs.

Composing different message protocols is ridiculously difficult since you may have unannounced resets, messages lost, ... And all of this may happen at any time at all. Every last thing you do must be separately confirmed, with timing backoffs and the whole shebang. It is horrendously difficult, and sometimes completely impossible to maintain consistency in these cases.

I know of only one type of network programming that composes without problems: single-threaded asynchronous programming. A version of cooperative multitasking, for the people who know C: the select loop. Now that works and composes very nicely. Of course, it'll never be scalable across different cores and definitely won't work across different machines. But if you use this method of programming, you can compose to your heart's content and bugs will not start appearing out of subtle interactions.
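For comparison, Go's `select` statement plays much the same role for channels that select(2) plays for file descriptors in the single-threaded loop described above: all handling happens on one goroutine, so composing event sources cannot introduce data races. A minimal single-goroutine sketch (channel names invented for illustration):

```go
package main

import "fmt"

// runLoop multiplexes two event sources from a single
// goroutine - the channel analogue of a C select loop.
// A source is retired by setting its channel to nil,
// which makes its case never ready again.
func runLoop(a, b <-chan string) []string {
	var out []string
	for a != nil || b != nil {
		select {
		case v, ok := <-a:
			if !ok {
				a = nil // source a is done
				continue
			}
			out = append(out, "a:"+v)
		case v, ok := <-b:
			if !ok {
				b = nil // source b is done
				continue
			}
			out = append(out, "b:"+v)
		}
	}
	return out
}

func main() {
	a := make(chan string, 2)
	b := make(chan string)
	a <- "ping"
	a <- "pong"
	close(a)
	close(b)
	fmt.Println(runLoop(a, b)) // prints "[a:ping a:pong]"
}
```

Events from a single source are handled in order, and since only one goroutine ever touches `out`, no locks are needed; the trade-off, as the comment notes, is that the loop runs on one core.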

Boring replacements for C++

Posted Apr 18, 2011 18:05 UTC (Mon) by cmccabe (guest, #60281) [Link]

> Composing different message protocols is ridiculously difficult since you
> may have unannounced resets, messages lost, ... And all of this may happen
> at any time at all. Every last thing you do must be separately confirmed,
> with timing backoffs and the whole shebang. It is horrendously difficult,
> and sometimes completely impossible to maintain consistency in these cases.

Er, message passing in Go doesn't involve network communication. Everything happens on a single computer. You seem to be thinking of some kind of distributed systems problem.

Boring replacements for C++

Posted Apr 18, 2011 20:19 UTC (Mon) by wahern (subscriber, #37304) [Link] (1 responses)

+1 cmccabe's clarification.

Also, there are definitely subtle bugs with select loops, or at least with the way lots of software supports them. Callback patterns are the source of all sorts of memory corruption bugs because they raise a significant subset of the object ownership and lifetime issues that plague threaded programs.

I learned long ago that for my asynchronous libraries to be properly composable, they shouldn't implement any callback scheme. They should make it easy to be used with callback schemes, but that's the end of it. Rather than have a member function take a callback to signal completion, I make it restartable and have it return EAGAIN and a single, unique pollable fd. Composability improves 10-fold this way.

The only callbacks in my software are at the core loop. I wrote an entire DNS library from scratch just so I could get rid of the callback craziness that c-ares, adns, and other asynchronous libraries involve. Ditto for my SSL sockets library, HTTP library, RTSP library, MySQL client library, etc. I've re-written an entire ecosystem of asynchronous software just to get rid of callbacks, and it has been absolutely, positively worth it. It makes the logic and code flow of the larger applications much more clear and simple, and each application can have its own quirks as required without having to smooth over the various asynchronous schemes of the various libraries.

It also makes integration in threaded applications much easier because with a few lines of code you can get synchronous behavior, and also moving objects between threads is less bug prone because objects don't keep any hidden dependencies on particular polling loops. Ditto for binding with higher level languages like Lua or Perl.
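The restartable poll-then-retry pattern described above can be approximated in Go, with a channel standing in for the pollable fd. This is an illustrative sketch, not the commenter's actual library code; `Resolver`, `Ready`, and `Check` are invented names:

```go
package main

import (
	"errors"
	"fmt"
)

// ErrAgain mirrors EAGAIN: the result isn't ready yet,
// wait on Ready() and call Check() again.
var ErrAgain = errors.New("not ready; wait on Ready() and retry")

type Resolver struct {
	ready  chan struct{}
	result string
}

// NewResolver starts the work but never calls back into
// caller code; the caller polls for completion instead.
func NewResolver(name string) *Resolver {
	r := &Resolver{ready: make(chan struct{})}
	go func() {
		r.result = name + " -> 192.0.2.1" // stand-in for real DNS work
		close(r.ready)                    // "fd becomes readable"
	}()
	return r
}

// Ready returns the single pollable handle; in C this
// would be a file descriptor handed to select/poll.
func (r *Resolver) Ready() <-chan struct{} { return r.ready }

// Check is restartable: it returns ErrAgain until the
// result is available, then the result every time after.
func (r *Resolver) Check() (string, error) {
	select {
	case <-r.ready:
		return r.result, nil
	default:
		return "", ErrAgain
	}
}

func main() {
	r := NewResolver("example.com")
	for {
		if res, err := r.Check(); err == nil {
			fmt.Println(res)
			return
		}
		<-r.Ready() // block until the handle fires, like select(2)
	}
}
```

Synchronous behavior falls out for free, as the comment says: blocking on `Ready()` before retrying `Check()` is those "few lines of code," and the library holds no hidden reference to any particular polling loop.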

Boring replacements for C++

Posted Apr 18, 2011 23:42 UTC (Mon) by cmccabe (guest, #60281) [Link]

I agree. I'd rather have an event-based API than a callback-based one.

An event-based API lets me choose how many threads I have. I can have no threads, a thread blocking on each outstanding operation, or anything in between. In contrast, in a callback-based API, I'm stuck with whatever the library decides.

Callbacks from libraries generally force you to use mutexes. And if the callback is made under a library mutex, things get even more complicated. Keep in mind, you can't have one part of the program where you hold mutex A and then acquire B, and another where you hold mutex B and acquire mutex A, without risking deadlock. And what happens if you spend a long time in the callback? Better read the library source code carefully.

Callbacks are nice, but not in library APIs.
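A minimal Go sketch of such an event-style API: the library hands back a channel and the caller decides whether to block, spawn goroutines, or multiplex with `select`. `fetchAsync` is a made-up name, not a real library call:

```go
package main

import "fmt"

// Event-style API: the library starts the work and returns
// a channel. It never calls back into caller code, so it
// cannot invoke the caller while holding an internal mutex.
func fetchAsync(url string) <-chan string {
	results := make(chan string, 1)
	go func() {
		results <- "response for " + url // stand-in for real I/O
	}()
	return results
}

func main() {
	// Synchronous use: block right here; the caller sees
	// no extra concurrency at all.
	fmt.Println(<-fetchAsync("http://example.com/a"))

	// Concurrent use: the same API, but now the caller
	// chooses to multiplex two operations with select.
	c1 := fetchAsync("http://example.com/b")
	c2 := fetchAsync("http://example.com/c")
	for i := 0; i < 2; i++ {
		select {
		case v := <-c1:
			fmt.Println(v)
			c1 = nil // done; a nil channel is never ready again
		case v := <-c2:
			fmt.Println(v)
			c2 = nil
		}
	}
}
```

The same function supports both styles, which is exactly the flexibility the comment asks for: the API exposes events, and the threading policy stays entirely in the caller's hands.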

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 17, 2011 1:11 UTC (Sun) by euske (guest, #9300) [Link] (1 responses)

Any conventional (i.e. procedural) programming language is based on "the Turing machine used only by theoretical computer scientists." So what?

The rationale for Ceylon, Red Hat's new programming language (ars technica)

Posted Apr 19, 2011 16:11 UTC (Tue) by butlerm (subscriber, #13312) [Link]

"based on" and "is equivalent to" are completely different things.


Copyright © 2011, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds