Monday, June 21, 2010

Is Java dying? A response

This is actually a response to an article, "Is Java dying? Dependency injection and other Java necessary evils", written by my friend Juanfri. His article started out as a response to yet another article and grew too large for a comment; mine started out the same way as a response to his. In the end I decided to just post it to my own blog (that way it at least gets some attention).

I must say I liked the original article, but in the end I don't agree with some of its assertions.

I don't agree, for example, that there is anything wrong with GWT using overlay types. The code example is disingenuous because it solves a different problem. In fact I wrote the same kind of code (using simple-json) just this weekend and it looks completely similar:
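
Something along these lines, with json-simple (the class name and the payload below are made up, just to give the flavour):

import org.json.simple.JSONArray;
import org.json.simple.JSONObject;
import org.json.simple.JSONValue;

public class SimpleJsonExample {
    public static void main(String[] args) {
        String json = "{\"name\": \"John\", \"languages\": [\"Java\", \"Ruby\"]}";
        // json-simple hands everything back as Object, so each access needs a cast.
        JSONObject person = (JSONObject) JSONValue.parse(json);
        String name = (String) person.get("name");
        JSONArray languages = (JSONArray) person.get("languages");
        for (Object language : languages) {
            System.out.println(name + " writes " + language);
        }
    }
}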


Is the integration in Ruby nicer and simpler? Sure, but the mapping provided by GWT is genuinely useful, and the implementation is normally completely hidden from view.
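
On the GWT side, an overlay type looks roughly like this (the Person class and its fields are invented for the illustration):

import com.google.gwt.core.client.JavaScriptObject;

// An overlay type: a typed Java view over a plain JSON/JavaScript object.
// The accessors compile down to direct field reads, so there is no wrapper
// object at runtime.
class Person extends JavaScriptObject {
    // Overlay types must have a protected, zero-argument constructor.
    protected Person() {}

    public final native String getName() /*-{ return this.name; }-*/;
    public final native String getEmail() /*-{ return this.email; }-*/;
}

// Typically obtained by casting a parsed JSON value, e.g. jsObject.<Person>cast().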

The examples about dependency injection, and his efforts to create a DI framework for Ruby, are really funny, especially because they are so recognizable: all of us in this business have gone through the same moments (and will probably go through them a couple more times in the years to come). Encountering a new technology and then applying it to absolutely everything you come across is normal.

It is the reason almost everything in Java is configured with XML: it was simply the way things were done some years ago and will probably stay that way for some years more. But the whole drive towards annotations shows that people are looking for different approaches (and making the same mistake all over again, where suddenly everything uses annotations). And Guice is a prime example of configuration being done in code instead of in configuration files. Of course, given the limitations of Java's syntax the result will be less "beautiful" than the same thing in some other languages.
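
For instance, wiring things up with Guice looks roughly like this (PaymentService, CreditCardPaymentService and BillingModule are just names I made up for the example):

import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Injector;

interface PaymentService {
    void charge(int cents);
}

class CreditCardPaymentService implements PaymentService {
    public void charge(int cents) { /* talk to the payment provider */ }
}

// The wiring lives in plain Java code instead of an XML file.
class BillingModule extends AbstractModule {
    @Override
    protected void configure() {
        bind(PaymentService.class).to(CreditCardPaymentService.class);
    }
}

public class GuiceExample {
    public static void main(String[] args) {
        Injector injector = Guice.createInjector(new BillingModule());
        injector.getInstance(PaymentService.class).charge(100);
    }
}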

And finally I also don't agree that there is a huge movement to dynamically typed languages. Instead I would say there is a strong desire to find "The Next Big Thing", in this case "The Next Big Language".

I agree it probably has to do with the fact that Java has been with us for a long time and we have come to know its limitations: 15 years ago it was brilliant, today it is still nice, but we know we can do much better.

And of course we know we can do better because there are several examples: C# is already somewhat better (but too similar, not revolutionary); dynamically typed languages like Groovy and Ruby have many advantages (but are too messy and need too much testing for my tastes); functional languages like Haskell and Erlang might become essential in a future where large numbers of processors are the norm (but are so thoroughly functional that they remain unknown to many programmers); and some, like Scala, try to combine the best features of all of them.

Personally I don't really like dynamically typed languages for anything large. It's really cool to be able to work your way into the innards of any object in any way you like, but pretty soon I feel the need to "formalize" what I'm doing in some way, some way of saying "I know what I'm doing, trust me" or even "this is the way to do it, don't try it any other way!".

At this moment I'm trying to learn Haskell and Scala because my "functional side" is really underdeveloped: Scala because it's not such a big step coming from Java, and Haskell because I want some experience with a language that was designed from a purely functional point of view from the beginning. The problem, of course, is that neither has attracted a huge following yet, so I'm not really sure I'll ever be able to use either of them for work. But in the meantime at least I can enjoy learning something new and interesting.

In the end I think "The Next Big Language" doesn't exist yet, but it seems we're all, consciously or unconsciously, waiting for it.

Friday, June 11, 2010

Nexus One: after a few days

So far my impressions of this device are very positive. It's fast, it's very flexible, has loads of options (being a KDE user on Linux probably means I'm one of those people who just love options) and looks nice to boot.

As many people all over the world have commented, I think Google should invest in better graphic and interaction designers for its operating system, Android, because in a first-impressions comparison the iPhone wins hands down in my opinion. The Android interface just seems more cluttered and chaotic, and somehow less appealing and intuitive.

On the other hand finally having real multi-tasking is a blessing. Not being constrained anymore by Apple's weird decisions about what you can and cannot do with your phone is even better and the integration with all kinds of Google services by default is just the best part about it all.

But the Nexus One and Android do have some glaring faults. The touch buttons just below the screen are horrible: every time you write on the virtual keyboard and try to hit keys on the bottom row, you run the risk of accidentally hitting the Back or Home buttons. That just plain sucks. I've tried an HTC Desire, which is basically the same phone but with real buttons, and it doesn't have that problem because pressing them takes more force than you normally use on the touch screen.

Another big problem for me, as a multilingual Dutchman living in Spain, is that I very regularly switch languages while writing messages. I might be writing to several people at a time in three different languages and need to be able to switch with a single key press. Not only is that not possible in Android, it doesn't even let you select a different keyboard language from the one you use for the operating system! So either you switch your entire phone from Spanish to English (including all locale settings like dates, money and whatever) or you disable the keyboard corrections (which doesn't help much, because I want the OS in English but still need the special keys for Spanish!).

But in the end, this being an Android phone from Google (which is definitely not Apple), I'm sure the problem can be solved by looking around on the Market. Because where Apple says "you can't do this", Google says "go ahead, have fun".

So although the new iPhone 4 seems like a nice piece of hardware I'm glad to have "escaped" Apple's "clutches" (yeah, overly melodramatic, but hey, the post is already boring enough).

Thursday, June 10, 2010

Nexus One

Today I got my Nexus One and I decided to do one of those silly unwrapping things, just for shits and giggles.

So this is the box it comes in:


And here is what is inside:





Wednesday, April 7, 2010

/proc/self/fd/0

cat somefile | cp /proc/self/fd/0 someotherfile

I never knew this was possible!

Now you might ask yourself what this is good for, but just the other day I was wondering how I could turn a copy of a drive partition made with dd into a sparse file. For example, this copies the entire first partition of your first disk to a file:

dd if=/dev/sda1 of=partition.backup

Of course, depending on the size of the partition this file can get quite large. But given that most of the time a disk contains a large number of empty blocks, and that Linux supports sparse files, wouldn't it be great if the file only contained the blocks that actually have data in them?

Now, the dd command doesn't support sparse files, but cp does have a --sparse=always option. So you could just use that to make a sparse copy of the file. But then you would have two copies of a potentially very large file on your system, which you might not have room for.

So I had been wondering if there was a way to pipe the output of dd into the input of cp. For dd this is simple, because it writes its output to standard output by default if you don't give it an output file name, but cp does not support reading from standard input... until I found the /proc/self/fd/0 trick shown above on the ntfsclone man page. Cool stuff!
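
Putting the two together, the whole backup should become a single pipeline, something like this (same example partition as above):

dd if=/dev/sda1 | cp --sparse=always /proc/self/fd/0 partition.backup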

PS: Quite often the free blocks on a drive aren't actually empty, but are still filled with old data that used to be stored there. In those cases trying to make a sparse copy doesn't help. To remedy that situation you can use zerofree before making the copy of the partition.
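
For example, something like this (assuming the partition from the example above holds an ext2/3/4 filesystem and is not mounted, or is only mounted read-only):

zerofree /dev/sda1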

Tuesday, March 9, 2010

Curried and Confused

In the article Curried and Confused on coderspiel the author gives us this example of Scala code that people supposedly think is too difficult for regular folks:

def sum(nums: List[Int]) = (0 /: nums) { _ + _ }

He then goes on to say that it might be because of the /: symbol, but that this version does not make it any better:

def sum(nums: List[Int]) = (nums foldLeft 0) { _ + _ }

And this is the point where we differ completely. For me, if the first version hadn't been called "sum" I would have had no idea what the code does. The moment I read the second version and saw "foldLeft", I knew it was about applying some function to a list of numbers to produce a single result (hence the folding).

For me this isn't about reading (nums foldLeft 0) and not knowing it's the same as nums.foldLeft(0); you learn that once when reading about Scala's syntax and it sticks. Maybe you need to think about it a bit more the first couple of times, but it's not hard: the basics are still the same, the order is still the same, and you can still think, "I've got this object nums on which I'm going to perform a foldLeft and... what's this 0 here? Oh, it must be an argument."

It also isn't about the closure bit, where you suddenly see braces at the end of a statement, because once you get used to functional programming you know this is just part of the brand new toolbox you've got. The _ + _ part does pose a bit of a problem for Scala first-timers, I agree.

But in the end, for me the real problem is this /: symbol. What does it mean? If I see it, it doesn't ring any bells! There is no mathematical symbol that looks like this; in fact it looks like the symbol we've been using for ages to denote division!

This is about readability. I have always agreed with the Java tenet that you tend to read code many, many more times than you write it. And it might be read by many people who have no way to look into your mind, so it's important to write code in a way that makes its function as obvious as possible.

/: is NOT obvious.

But reading foldLeft is, once you've read at least once about folding and what it is used for. In the end it doesn't matter which language you use to implement the folding: it's a concept that's easy to grasp. But what happens if you have to learn by heart all the strange symbols that someone can come up with? Especially when each language decides to use different symbols to express the same thing.

I now understand why they left out operator overloading in Java.
(yes, I know that in Scala this is not operator overloading strictly speaking, take your nitpicking somewhere else)

Monday, March 8, 2010

Regular expressions and backreferences

While trying to find a way to match quoted strings in a text, taking into account that both double and single quotes can be used, I first looked at using two separate sub-expressions OR-ed together. But then I started wondering if it was possible to say something like "match a sequence that's exactly the same as the one you matched earlier". After some looking around I found the answer in backreferences. The following will match a single- or double-quoted string, for example:

(["']).*?\1

Sunday, March 7, 2010

Heavy Rain

I started playing Heavy Rain today and I must say that I really like what I've seen so far! The quality of the graphics and the 3D models isn't too great: it's easy to see the typical "broken" lines where curves have been divided up into separate polygons, for example. People's faces tend to have a glassy-eyed look and, depending on the person, a somewhat "plastic" look to them. Moving around the world can sometimes be a bit of a chore for somebody like me who is used to mouse and keyboard rather than a console controller.


Now those were the main bad points. As for the good points: it's just a fun, engrossing game with a great atmosphere. I'm not even sure you could call it a game as such, although it definitely has game aspects, like when you are an FBI agent looking for clues with your futuristic scanner glasses. Then there are the "action" sequences, which are like those horrible press-the-right-button-at-the-right-time console sequences... except that this time they aren't horrible. There doesn't seem to be a way to do anything really wrong: the game doesn't punish you for reacting 5 ms too late, the sequence just takes a different turn.


As you can see in the video above, you just walk around until a symbol appears indicating there's something to do there. There are lots of little things to discover; most of them are just there for fun, to add some depth to the whole thing, but luckily most of them disappear after the first time you use or perform them, so you don't get swamped all the time with symbols floating in the air. It also keeps you focused on the story, which turns this more into an interactive movie than just a graphically advanced adventure game. That might turn some people off, because (so far) it isn't very difficult; I don't think you will ever feel much of a sense of achievement from finding the solution to a particularly devious puzzle, for example. But I still think it's worth it because the story is just so compelling. As a movie it might be your average psychological thriller about catching a serial killer, but for a game it's great stuff that hasn't been done enough before.