Wednesday, July 28, 2010

JVM Languages Summit Day 2


Another good day of talks, starting with Doug Lea on parallel recursive decomposition using fork-join. Amazing how much subtle tweaking is required to get good performance.
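
Lea's recursive decomposition pattern is roughly what the fork-join framework (which shipped in java.util.concurrent with Java 7) looks like in practice. A minimal sketch of my own, summing an array - not Lea's example - where the split threshold is exactly the kind of tuning knob he was talking about:

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// Recursive decomposition: split the range in half until it is "small
// enough", then sum sequentially. THRESHOLD is the kind of knob that
// needs subtle tweaking to get good performance.
class SumTask extends RecursiveTask<Long> {
    static final int THRESHOLD = 1_000;
    final long[] data;
    final int lo, hi;

    SumTask(long[] data, int lo, int hi) {
        this.data = data;
        this.lo = lo;
        this.hi = hi;
    }

    @Override
    protected Long compute() {
        if (hi - lo <= THRESHOLD) {
            long sum = 0;
            for (int i = lo; i < hi; ++i)
                sum += data[i];
            return sum;
        }
        int mid = (lo + hi) >>> 1;
        SumTask left = new SumTask(data, lo, mid);
        left.fork();                        // run the left half asynchronously
        long rightSum = new SumTask(data, mid, hi).compute();
        return rightSum + left.join();      // join only after doing our own share
    }

    static long parallelSum(long[] data) {
        return ForkJoinPool.commonPool().invoke(new SumTask(data, 0, data.length));
    }
}
```

Split too finely and task overhead swamps the actual work; too coarsely and cores sit idle - hence all the tweaking.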

This led into Joshua Bloch's talk on performance. There is so much complexity in all the layers from CPU to OS to language that performance varies a lot. He showed a simple example of a Java program that gave consistent times on multiple runs within a given JVM, but sometimes when you restarted the JVM it would consistently give quite different results! Cliff Click's theory was that it was caused by non-deterministic behavior of the JIT compiler, since it runs concurrently. The behavior is still "correct", it can just settle into different states. The solution? Run tests over multiple (e.g. 40) different JVM instances. That's on any given machine; of course you should also test on different CPUs and different numbers of cores. Easy for them to say.
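
Bloch's prescription could be approximated with a small harness like the following - my own sketch, not his code, and the workload is just a placeholder. The key point is that timing loops inside one JVM only capture that JVM's JIT state; the whole program has to be re-launched in fresh JVM instances to see the variation he described.

```java
// Times a placeholder workload after a warmup. Because the JIT can
// settle into different steady states per JVM, this whole program
// should itself be launched in many fresh JVM instances (Bloch
// suggested on the order of 40) and the per-JVM results compared.
class Bench {
    static long workload() {                  // stand-in for real benchmark code
        long x = 0;
        for (int i = 0; i < 1_000_000; ++i)
            x += i * 31;
        return x;
    }

    static double avgMillis(int warmups, int runs) {
        long sink = 0;
        for (int i = 0; i < warmups; ++i)
            sink += workload();               // give the JIT a chance to compile the hot code
        long start = System.nanoTime();
        for (int i = 0; i < runs; ++i)
            sink += workload();
        double ms = (System.nanoTime() - start) / 1e6 / runs;
        if (sink == 42)                       // keep the workload from being dead-code eliminated
            System.out.println(sink);
        return ms;
    }
}
```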

Neal Gafter talked about Microsoft's LINQ technology - pretty cool, although nothing to do with the JVM. 
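
LINQ embeds query operators (filter, project, sort) directly into C#. Java had nothing comparable at the time; the streams API that arrived later in Java 8 gives a rough feel for the style (my analogy, not anything from Gafter's talk):

```java
import java.util.List;
import java.util.stream.Collectors;

// A LINQ-ish query in Java: from the names, keep those longer than
// three characters (like LINQ's "where"), uppercase them ("select"),
// and sort ("orderby").
class QueryDemo {
    static List<String> query(List<String> names) {
        return names.stream()
                .filter(n -> n.length() > 3)
                .map(String::toUpperCase)
                .sorted()
                .collect(Collectors.toList());
    }
}
```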

Kresten Thorup talked about his Erlang implementation on the JVM using Kilim for coroutines. Erlang is an interesting language, and quite different from Java so it was interesting to see how he implemented it. He actually runs the byte code produced by the existing Erlang compiler. 

I talked to Remi Forax about whether I should use his JSR292 backport in jSuneido. This would let me use the new technology before Java 7 is released (who knows when). Of course, he said I should. But ... it means developing with the "beta" JDK 7 which still has bugs and is not supported by IDEs. And then it requires an extra run-time agent. I'm not sure I want to complicate my life that much!
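
For context, this is roughly what the JSR 292 method-handle API looks like as it eventually shipped in java.lang.invoke with Java 7 - a minimal sketch of my own, not Forax's backport:

```java
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;

// A MethodHandle is a typed, directly invocable function pointer - the
// primitive that invokedynamic call sites get linked to. A dynamic
// language runtime like jSuneido can relink handles at call sites
// instead of going through reflection.
class HandleDemo {
    public static String greet(String name) {
        return "hello " + name;
    }

    static String callViaHandle(String arg) {
        try {
            MethodType type = MethodType.methodType(String.class, String.class);
            MethodHandle mh = MethodHandles.lookup()
                    .findStatic(HandleDemo.class, "greet", type);
            return (String) mh.invokeExact(arg);   // exact-type invocation, no boxing
        } catch (Throwable t) {
            throw new RuntimeException(t);
        }
    }
}
```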

Monday, July 26, 2010

JVM Languages Summit Day 1

My hotel is "behind" the Sun/Oracle campus so I had to circle around through the endless acres of parking lots to get to the right entrance. But it was pretty easy, and not as far as it looked on the map. I've never had much to do with giant organizations so it's still a little mind-boggling when you use an automated kiosk (like the ones at the airport) to get your visitor's badge.


The sessions were a real mixture, from stuff where the nuances were pretty much over my head, to thinly veiled sales pitches with no technical details. It was pretty neat to be in the company of people you're used to thinking of as gurus, like Doug Lea, John Rose, Joshua Bloch, and Cliff Click. And nice to see they have their frustrations with the technology also.


It's naive of me, but unconsciously I expect really smart people to be "rational", and it's always a bit of a disappointment when that proves to be untrue. Smart people have egos; they can be insecure, argumentative, negative, defensive, or obnoxious, just like everyone else.


Maybe I'm just too soft, but I felt bad for one guy who basically got told he was doing it wrong. It seems like they could just as well have asked "did you consider ..." or "what would you think about ...", rather than just "that's wrong, you should have done ..."


Mostly I just listened to the conversations. In the Java / JVM area I still feel like a relative novice and I'm not sure I have much to contribute yet. But for the most part I didn't feel too much out of my depth, so that's good.

Sunday, July 25, 2010

OSCON Wrapup

The last two days of OSCON were good.

I got to hear Rob Pike talk about the Go language. Rob is a legend in software and Go is a cool language. In some ways Go is more like C or C++ in that it's compiled to machine code with no VM. But unlike C++ it compiles extremely fast. For what he called a party trick, he showed it compiling every time he typed a character - and it kept up. It has features I miss in Java like pointers (but safe) and values on the stack (not always heap-allocated). It also has "goroutines" - lightweight threads like coroutines. But its attractions aren't quite sufficient to tempt me away from Java. They don't even have a Windows version yet, let alone all the support libraries and frameworks that Java has.

I also got to hear another legend, Walter Bright, talk about his D language. I used Walter's C and C++ compilers for many years. D also has some very interesting features. Andrei Alexandrescu of Modern C++ Design fame is now working on D and has written The D Programming Language book.

One feature D has that is sorely missing from other languages is the ability to declare functions as "pure" (i.e. no side effects) and have the compiler verify it. This (not lambdas) is the key to functional programming. And yet languages like Scala that claim to support functional programming don't have this.
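
To illustrate the distinction in Java terms - where, as noted, nothing enforces it:

```java
// In D, a function declared `pure` is verified by the compiler to have
// no side effects. Java has no such declaration, so purity is only a
// matter of discipline: both functions below compile identically.
class Purity {
    static int square(int x) {          // pure: depends only on its argument
        return x * x;
    }

    static int counter = 0;
    static int nextId() {               // impure: reads and mutates hidden state
        return ++counter;
    }
}
```

A compiler that can check this can safely memoize, reorder, or parallelize calls to `square`, but not to `nextId` - which is why verified purity, not just lambda syntax, matters for functional programming.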

I also went to a talk by Tim Bray whose blog I read. He was working at Sun but moved to Google after the Oracle buyout. His talk was on concurrency. He was very pro Erlang and Clojure but didn't mention Scala. When asked about it he said he thought the Scala language was too big. It does have a sophisticated type system, but the actual syntax is quite simple - smaller than Java. Scala's Actor implementation has been criticized but it's been improved in 2.8 and there are alternatives like Akka.

Wednesday, July 21, 2010

OSCON Another Day

After the less than thrilling keynotes, I went to another Scala session. A lot of it was repetition, but I learn a bit more at each one. The recurring theme seems to be: if you're going to use Scala, use Simple Build Tool (SBT).

Next I went to a talk on GPARS, a Groovy concurrency library. The talk was as much about concurrent programming patterns as about Groovy, which suited me.

Next, I was going to go to one of the database talks, but then I realized it was a vendor presentation and they're usually more sales spiel than technical. Looking for an alternative I noticed Robert (R0ml) Lefkowitz had a talk on Competition versus Collaboration. If you've never listened to one of his talks, it's well worth it. They are as much performances as presentations and always thought provoking. (search for Lefkowitz on the Conversations Network)

Next was a talk on Clojure (a Lisp that runs on the JVM). The title was "Practical Clojure Programming" which sounds like an intro, but it was actually about build tools, IDE's, and deployment. Like most of the audience, I would have rather heard more about Clojure itself. I guess we should have read the description more closely.

Finally, I snuck into the Emerging Languages Camp to hear Charles Nutter talk about Mirah (formerly Duby), his close-to-the-JVM-but-like-Ruby language. I would have been tempted to go to the Emerging Languages Camp (it was even free) but by the time they announced it, I'd already signed up for the regular sessions.

All in all, a pretty good day. Lots of food for thought, which is, of course, the point.

One thing I forgot to mention yesterday is that a lot of the Scala people are also (or ex-) Ruby/Rails people.  Maybe that's simply because they're the kind of people that like to learn/adopt new things.

But a lot of people left Java and went to Ruby, so it's surprising to see them coming almost full circle back to Scala. Scala is better than Java, but it's still a statically typed language like Java, which is part of what people seemed to reject. Maybe Ruby wasn't the silver bullet they were hoping for. Maybe it was performance issues. Maybe they realized that static typing does have advantages after all. Maybe they realized the advantages of running on the JVM (although jRuby allows that). Maybe Scala's improvements over Java are enough to win people back.

Tuesday, July 20, 2010

OSCON Part One - Scala

I just finished the first day and a half of OSCON - in the Scala Summit. And contrary to my anti-social nature, I even went to a Scala meetup last night.

As I've written before, I'm quite intrigued with the Scala programming language. It runs on the JVM and interfaces well with Java. It combines functional and object-oriented programming. It has actors for concurrency. And it has a rich type system that is probably Turing complete, like C++ templates. 

Scala seems to be attracting a lot of attention. Of course, the Scala Summit attendees are all keen, but there were also several other language developers (like Charles Nutter of jRuby) interested in borrowing ideas from Scala.

Programming in Java is like driving a truck. It's not fast or fuel efficient or fun to drive or comfortable. But it gets the job done without a lot of problems and it can haul just about anything. 

Why do people like to drive a fast car? It's not like they regularly need to go from 0 to 60 in 5 seconds, or hit 200 km/hr. But they like to know that if they "need" to, they could.

Similarly, I think people like to know their language "has a lot of power under the hood", even if they never use it. And it gives lots of potential for books and conference talks that keep people excited.

And like you could use C++ simply as a better C, you can use Scala as a better Java. This lets less sophisticated users get involved as well. 

The funny part is that it's very reminiscent of C++. There were all kinds of talks and articles and books about all the cool stuff you could do with templates and other fancy C++ features. Now, all you hear are negative comments about C++: too complicated, too hard to use, etc. I'm starting to think I'm the only person that liked C++.

But there are differences too. Scala didn't try to be backwards compatible with Java the way C++ was backwards compatible with C. And Java already had object-oriented programming, so Scala isn't as big a jump as C++ (in that respect). 

If nothing else, it's made me really want to spend some more time with Scala. It will sound crazy, but I'm very tempted to rewrite jSuneido in Scala. Thankfully, that's something that could be done gradually. I think it would let me make the code much smaller and cleaner. I think it would also make it easier to switch back and forth with Suneido coding, since Scala has things like named and default arguments, optional semicolons, and type inference that make the code much more similar to Suneido code.

Wednesday, July 07, 2010

Time Capsules

I use an Apple Time Capsule for my home wireless router and backup storage.

Time Capsules have a bad reputation for dying, and I've had mine for quite a few years, so I was a little nervous about it. If it died I wouldn't have a backup of my iMac, which would be ok unless the iMac happened to die at the same time. That seems unlikely, but it's surprising how often you do get simultaneous failures - for example, a power surge due to lightning. An external drive also means I can keep it at work, so I have an offsite backup in case the house burns down.

I had a 500 GB external drive that I'd used for backups, but it's not big enough to do a complete backup of my 1 TB iMac. So I went and bought a LaCie 2 TB external drive and used SuperDuper to make a backup. I used the free version, but I'll probably get the paid version so I can update the backup without redoing it.

Then I decided I should also back up my MacBook and my MacMini. I didn't have much critical stuff on them, but a backup would save hassle if I needed to restore. But SuperDuper takes over the whole drive, so how could I back up additional machines? The answer seemed to be to partition the drive, but I didn't want to have to redo my iMac backup (600 GB takes a while to back up, even with FireWire 800). I searched on the web and found various complicated ways to resize partitions. Finally, I found that with recent OS X you can resize right from Disk Utility. All the complicated instructions were for older versions of OS X.

The "funny" part of this story is that a few days later I went to use wireless and it wasn't working. I went and checked on the Time Capsule and it was turned off. Strange, because I leave it running all the time. I turned it on and about 10 seconds later it turned itself back off.

My computer was still fine, so I didn't actually need the external backup, but I was glad to have it nonetheless.

I phoned the local Apple dealer (Neural Net). The receptionist wanted me to bring it in and they would look at it in the next few days. I didn't want to be without internet for days so she let me speak to the technician. When I described the symptoms he said the power supply had died. But Apple doesn't let them repair them and doesn't supply any parts. Apple has been promoting their environmentally friendly products, but no matter how they're built, a non-repairable "disposable" product isn't very environmentally friendly.

I would have preferred to give Neural Net the business, but they didn't have any Time Capsules in stock, so I picked up a 2 TB Time Capsule from Future Shop. Ten minutes after opening the box I was back up and running. (Although the initial Time Machine backups took considerably longer.)

One nice thing is that Time Machine backups seem to be a lot less intrusive. Before, when Time Machine kicked in it would really bog down my machine. If I was trying to work I'd often stop it. But now I don't even notice when it runs. I'm not sure how a new external device would change the load on my computer, but it's nice anyway.

My old Time Capsule ran quite hot, even when it wasn't doing anything. I was hoping the new ones would be better, but it seems much the same. I haven't measured it, but I assume heat means power consumption. I'm not sure why it can't go into a low power mode when it's not active. The other reason they run hot is that they have no ventilation or heat sink. Apparently there is an internal fan but all it does is stir the air around inside a small sealed box. You'd think they could come up with some better heat management without compromising their design. I would guess the heat is one of the big reasons they have a reputation for dying. Electronic components tend to have a shorter life at higher temperatures.

Rather than throwing out the old Time Capsule, I passed it on to one of the guys at work that tinkers with hardware. I thought he could at least extract the 1 TB drive. But he managed to repair the power supply and is using the whole unit. I guess the hardest part was getting the case open! I'm glad it was saved from the landfill for a while longer.

Now that I have a bigger drive I thought I might as well back up Shelley's Windows machine as well. I have it set up with Mozy for on-line backups, but just with the free 2 GB account, so I'm only backing up selected things. I'd seen that Mozy will now back up to external drives, so I thought I'd set this up. Unfortunately, it only backs up to directly attached drives (i.e. internal or USB), not to network drives. I'm not sure what the logic is behind that choice. I could use different software, but I think what I'll do (I haven't got round to it yet) is use the old 500 GB external drive.

Tuesday, July 06, 2010

iPhone Multi-tasking

A lot of people made a big thing about how the iPhone didn't have multi-tasking. But personally, I never saw any of the reasons as very compelling. Given limited CPU power and memory and a tiny screen, it made perfect sense to single-task.

Sure, there are probably a few reasonable uses of multi-tasking. But there are way more potential abuses.

I look at my Windows machine, which is continually thrashing away when I'm not doing anything. Every piece of software I install wants to run stuff in the background. Just to check for updates once a week they want to have a process running constantly. That's not a benefit, that's abuse.

Two days after I installed iOS 4 with its long awaited multi-tasking, I went to use my iPhone in the morning and the battery was totally dead - it wouldn't even turn on. It had plenty of power the evening before. I can't be 100% sure of the culprit, but I assume it must have been some application left running by the new multi-tasking. I suspect it was a mapping app that I'd been playing with, probably running the GPS constantly or something like that.

On the other hand, according to Apple, most apps do not actually run in the background, they just get suspended and resumed. If that's really true in all cases, then I'm not sure what drained my battery overnight.

So now I find myself regularly "killing" all the active apps because I'm paranoid about this kind of scenario. Great, a "feature" has imposed a large manual overhead on me.

Hopefully the situation will improve once apps are updated to work better with the multi-tasking. But there will always be poorly behaving apps.

What I find strange is that Apple went from not allowing multi-tasking to making it the default, with no way to switch it off. It seems like an opt-in setting (or at least some way to disable it) would still have silenced the critics, but wouldn't have imposed the cost on those of us that didn't want it.