I recently picked up a copy of Groovy in Action. Groovy is a dynamic language built on Java infrastructure - it compiles to Java bytecode, runs on the Java virtual machine, and has full access to Java libraries. You can call Java from Groovy and vice versa.
I've just started reading the book, but the Groovy language already looks interesting. There are some close similarities with Suneido, and of course differences. For example, this could be either Suneido or Groovy:
list = [1, 2, 3]
map = [name: "Fred", age: 24]
although Suneido would also allow this, which Groovy wouldn't:
listmap = [1, 2, 3, name: "Fred", age: 24]
This is because Suneido uses a combined list/map data type whereas Groovy has separate list and map types.
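It's easy to picture how a combined container could work - roughly, a vector for the positional part plus a hash map for the named part. Here's a toy sketch of that general idea in C++ (an illustration only, not Suneido's actual implementation):
#include <map>
#include <string>
#include <vector>

// Toy sketch of a combined list/map container, along the lines of
// listmap = [1, 2, 3, name: "Fred", age: 24].
// Illustration only - not Suneido's actual implementation.
struct Value { std::string s; int n = 0; }; // stand-in for a dynamic value type

class ListMap
{
    std::vector<Value> list;          // positional members: [1, 2, 3]
    std::map<std::string, Value> map; // named members: name:, age:
public:
    void add(const Value& v) { list.push_back(v); }                // append to the list part
    void put(const std::string& k, const Value& v) { map[k] = v; } // set a named member
    Value& at(std::size_t i) { return list.at(i); }
    Value& at(const std::string& k) { return map.at(k); }
};
One container then serves for both list = [1, 2, 3] and map = [name: "Fred", age: 24], which is why the mixed literal is legal in Suneido.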
BTW I like this map notation better than Ruby's, which uses a lot more punctuation:
map = { :name => "Fred", :age => 24 }
Groovy's closures are also quite similar to Suneido's blocks:
Groovy: { arg -> ... }
Suneido: { |arg| ... }
I borrowed Suneido's block syntax from Smalltalk. The extra '|' before the parameters makes it easier to parse.
One feature I like is that you can write:
list.each { key, value -> ... }
whereas in Suneido you'd need parentheses after the "each" function call:
list.each() { |key, value| ... }
This might not be too hard to add to Suneido since it's currently a syntax error (not ambiguous). In both Groovy and Suneido this is equivalent to list.each({...}).
In addition to =~ for regular expression matching (the same as Suneido), Groovy also has ==~, which must match the entire string, i.e. the same as wrapping the pattern in "^(...)$". I'm tempted to add this to Suneido because it's a common mistake to omit the '^' and '$' and/or the parentheses.
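The same partial-match versus full-match distinction shows up in other languages too. Purely as an illustration (this is not how Groovy or Suneido implement it), C++'s <regex> library splits the two into separate functions:
#include <iostream>
#include <regex>
#include <string>

int main()
{
    std::regex digits("\\d+");   // one or more digits
    std::string s = "abc123def";

    // like =~ : succeeds if the pattern matches anywhere in the string
    std::cout << std::regex_search(s, digits) << "\n";    // prints 1

    // like ==~ : succeeds only if the pattern covers the entire string
    std::cout << std::regex_match(s, digits) << "\n";     // prints 0
    std::cout << std::regex_match("123", digits) << "\n"; // prints 1
}
regex_match behaves as if the pattern were wrapped in "^(...)$", which is exactly the mistake ==~ is meant to prevent.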
That's about as far as I've gotten. I'm not sure if I'll ever use Groovy, but it's always interesting to look at other languages. And since the Java platform is nearly ubiquitous, Groovy is widely applicable. Groovy also has a web framework called Grails that is similar to Ruby on Rails.
Friday, October 26, 2007
Freebase and Cinespin
I recently listened to a podcast with one of the developers of Freebase. I've been meaning to have a look at Freebase for a while. One of the people behind it is Danny Hillis of Thinking Machines.
For an interesting application based on Freebase data, have a look at Cinespin.
Thursday, October 25, 2007
Amazing Open Source
The variety and quality of open source software even in specialized areas is pretty amazing.
Art Gallery - Art of Illusion
Monday, October 22, 2007
Use at Least Two Compilers
I recently made some minor changes to Suneido. I compiled with MinGW and it worked fine.
A bit later I compiled with Visual C++ 7 (2003), since that produces the smallest, fastest code at the moment. It wouldn't run at all - it crashed immediately on startup!?
I recompiled the MinGW version - it still worked fine.
I reverted my changes and the VC7 version worked again, so it was clearly something in my changes.
I checked my changes several times but they looked fine.
Finally I found the problem. While making my changes I had done some minor refactoring. (I know, you shouldn't mix the two, but it seemed minor.) I had found something like:
int fn1()
{
    static const int list = symnum("list");
    ...
}
int fn2()
{
    static const int list = symnum("list");
    ...
}
So I eliminated the duplication by moving the constant outside the functions:
static const int list = symnum("list");
int fn1()
{
    ...
}
int fn2()
{
    ...
}
The problem is that this changes when symnum is called. The old way, it isn't called until the first time each function runs. The new way, it is called during startup, probably before something else that it needs is initialized. The order of static initialization across translation units is undefined and obviously varies between compilers. Sure enough, putting it back the old way fixed the problem.
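For what it's worth, there is a standard way to keep a single definition and still defer the symnum call until first use - the "construct on first use" idiom, where the static lives inside an accessor function. A sketch (assuming symnum returns an int):
int symnum(const char* s); // existing function, declared elsewhere

// "Construct on first use": the static isn't initialized until the
// first call, so startup initialization order no longer matters.
static int list_symnum()
{
    static const int list = symnum("list");
    return list;
}

int fn1()
{
    int list = list_symnum();
    ...
}
int fn2()
{
    int list = list_symnum();
    ...
}
That avoids both the duplication and the undefined startup order.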
Lessons:
- build with multiple compilers to catch this kind of dependence on undefined behavior
- don't mix changes and refactoring; do them one at a time and test in between
Saturday, October 20, 2007
From Ruby on Rails to PHP
7 reasons I switched back to PHP after 2 years on Rails - O'Reilly Ruby
A lot of the points he makes I find familiar from Suneido. Most of the time you can reproduce the needed parts of some other whiz-bang system in less time than it takes to learn the other system. It won't do everything that other system does, but if it does what you need, who cares?
We've run into some of the same frustrations he mentions with our Rails project (eTrux). We didn't have the problem of trying to deal with an existing database (we were starting from scratch), but it always seems like there's something you want to do that isn't on the Rails easy path. But despite the frustrations, I think Rails was a good choice for this project.
My Chumby! Or Maybe Not
I finally received my invitation to get one of the "insider" early-release Chumbys.
I go to order, pick the color, get to the checkout, and find that it's United States only. Argh!
I realize Free Trade doesn't really mean free trade, but I find the US export regulations pretty frustrating and ridiculous. This thing is all open source software and I can't imagine the hardware is anything special - what are they protecting? I probably shouldn't put all the blame on the US either. Canada has its own safety code and approval process that devices have to pass. To an ignorant consumer it seems like the US and Canada are similar enough that they could agree on a standard approval process, but I'm sure that's unlikely.
I'm not blaming the companies, I'm sure it's a hassle for them too. And US companies probably (rightly) see Canada as a minor market that's not worth much effort, at least initially.
Sigh. Maybe I can find someone in the US to order my Chumby for me.
(Don't get the wrong idea, it's not like I'm "desperate" for a Chumby. I'm sure I could live without it :-) But us geeks like our toys, especially if it's a new toy that not everyone has!)
[I see from my earlier post that I already knew it was US only. In the excitement of my invitation I obviously forgot that. Either that, or I'm just getting old and forgetful!]
Friday, October 19, 2007
Using S3 for Customer Backups
For some time we've been using S3 to back up our customers' systems.
Originally we set them up to FTP the data back to our office. This was handy for us because if we wanted to look at their data we had it in-house. But as the number of clients and the size of their data grew we ran out of bandwidth. (Thankfully we did these transfers at night so it didn't slow down our daytime internet access.)
The other problem was the sheer size of the data and the issue of how to guard against disk failure. Although we don't promote this to our clients as a "backup" service, many of them still end up falling back on us when they discover their own backups weren't done or were no good. And people often don't discover problems till days or weeks later, so they need older copies, not just the most recent.
We decided to give S3 a try and so far it has worked out well. We're up to about 8 GB of data transfer per night and we currently have about 240 GB of storage. This is costing us about $50 per month - a bargain as far as I'm concerned.
We are currently keeping the last 8 days, the last 5 weeks, the last 13 months, and one copy per year - something like 25-30 copies per customer. On top of this redundancy, Amazon says they store multiple copies of each file.
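This is essentially a grandfather-father-son rotation. As a hypothetical sketch (made-up helper, not our actual script), the pruning rule for a given nightly copy might look like:
#include <chrono>

using namespace std::chrono;

// Hypothetical sketch of the retention rule (not our actual script):
// keep nightlies for 8 days, one weekly copy for 5 weeks, one monthly
// copy for 13 months, and one copy per year forever.
bool keep(sys_days made, sys_days today)
{
    const auto age = (today - made).count(); // age in days
    const year_month_day ymd{made};
    const weekday wd{made};

    if (age < 8) return true;                               // last 8 daily copies
    if (age <= 5 * 7 && wd == Sunday) return true;          // last 5 weekly copies
    if (age <= 13 * 31 && ymd.day() == day{1}) return true; // last 13 monthly copies
    return ymd.month() == January && ymd.day() == day{1};   // yearly copies, kept forever
}
Each night, after the new copy is uploaded, a script would walk the stored copies and delete those for which keep() is false.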
A minor downside is that when we want to look at someone's data we have to download it, but that's not a big deal. And it's still a lot easier to download from S3 than to download directly from the customer.
There is a potential concern with storing data with a third party, but we encrypt the files and Amazon has decent security on top of that, so it seems OK. It doesn't seem any worse than other hosting situations.
Overall, we're pretty happy with this setup.
G.ho.st - Global Hosted Operating System
G.ho.st – Home Page
Apparently this is implemented using Amazon S3 and EC2. Interesting idea, but it's pretty annoying how they maximize my browser window and hide the address bar and toolbars.
Thursday, October 11, 2007
Telnet now optional in Vista
Today I went to use telnet to test some server code I was working on. Except there's no telnet!?
A quick web search tells me it's not "turned on" by default, you have to go to:
Control Panel > Programs and Features > Turn Windows features on or off
Microsoft's justification for this is to decrease the "footprint" of Windows and to increase security. Neither really makes much sense. The telnet client is a tiny program, and having it in a directory on the path versus some other "turned off" directory doesn't seem to change the "footprint" much. The code to turn it on and off is probably bigger than telnet itself. And I'm not sure how a telnet client is much of a security risk. Any attack that gets far enough into your system to use your telnet client is probably not going to be stopped by it being "turned off", especially since the attacker could easily turn it back on from the command line. (I can see leaving the telnet service turned off, but that's not what I'm talking about.)
I'm really tempted to do some Microsoft bashing at this point, but I'll restrain myself.
Wednesday, October 10, 2007
Ask 37signals: Is it really the number of features that matter? - (37signals)
Ask 37signals: Is it really the number of features that matter?
The idea of "editing" is interesting. It's a role I play with our application software, both in deciding which features to add, and in reviewing what we've done to try to ensure it's usable.
Of course, I still struggle with trying to pursue simplicity. Sales people, customers, customer support - they all think more features are the answer.
The idea of "editing" is interesting. It's a role I play with our application software, both in deciding which features to add, and in reviewing what we've done to try to ensure it's usable.
Of course, I still struggle with trying to pursue simplicity. Sales people, customers, customer support - they all think more features is the answer.
Monday, October 08, 2007
Chumby Coming Soon, But Not Here
Soon you'll be able to buy a Chumby, but only in the United States, not in Canada :-(
I think one of the things that helped the internet spread so fast was that it wasn't restricted like this. Although when I went looking on the internet for a TV show that I missed, I found Fox limits access to its shows to just the US. And Amazon's MP3 sales are US only. So the internet is not necessarily free of restrictions either.
Sunday, October 07, 2007
Ubuntu on Parallels on Mac
I just installed Ubuntu on my Mac under Parallels using these instructions:
How to install Ubuntu 7.04 in OS X using Parallels Desktop 3.0
It seemed to go smoothly. Of course, Ubuntu immediately wanted to download a pile of updates, but that worked ok.
The only problem is that you seem to have to manually turn on the networking within Ubuntu every time you boot it up. The instructions mentioned this, but I thought it was just during installation. There may be a fix; I haven't looked yet. I don't think it's a big deal because normally I just suspend the virtual machines rather than shutting down and restarting them. Hmmm... that's assuming the network stays turned on after being suspended; I haven't actually tested that.
Parallels has a pre-installed Ubuntu virtual machine, but the downloads were very slow so I gave up. But I should be able to take the virtual machine I've now created and copy it from my Mac Mini to my MacBook.
Aside from curiosity, one of my motivations for installing Ubuntu is to get back to working on porting Suneido to Linux. I should be able to do a lot of that natively under OS X but I assume I'll still want to work under Linux as well.
Friday, October 05, 2007
MinGW GCC Version 4
I see MinGW (Minimalist GNU for Windows) now has a preview of GCC 4. (It looks like it came out in August, but I didn't hear about it.) I've been waiting for this for a while.
I haven't looked at it yet, but I keep hoping that MinGW GCC will close the gap with commercial compilers so I can use it for Suneido.
Thursday, October 04, 2007
The Big Rewrite
ChadFowler.com The Big Rewrite
I've been involved in a number of "big rewrites" over the years. The Suneido project could be called a rewrite of a previous in-house project called C4, which in turn was a "rewrite" of a framework built on top of a personal information manager called Lucid. Our accounting applications are also the third rewrite. So it is possible to pull it off. And I would bet some of Chad's projects have succeeded in the end. But his points about the dangers and problems of big rewrites are definitely valid.
Joel Spolsky calls the big rewrite "the single worst strategic mistake that any software company can make".
In the end it comes down to Software is Hard.
Tuesday, October 02, 2007
Digital Music Not Quite Here Again
When Apple announced DRM-free music on the iTunes store, I thought I'd finally be able to buy music without the roundabout process of buying a physical CD and ripping it to get the digital version, which is all I use these days. But unfortunately, their selection of DRM-free music is small and doesn't seem to be growing very fast. It seems like every time I go to look for something it isn't available.
So when Amazon announced DRM-free MP3s I was hopeful, but knew the real test would be selection. I checked for a few things and although not everything was available, it seemed better than iTunes (and cheaper, at least for some things).
BUT when I actually went to buy something I found out it's United States only. Argh!
I found someone on the web saying they had managed to purchase from Canada by giving a fake address but that didn't work for me. Maybe because I was logged in and therefore it knew where I was.
Note: My reasons for wanting DRM-free music have nothing to do with piracy. I just don't want the headaches of DRM, especially when I use multiple computers and music players. It seems crazy how much resistance the music companies are putting up against DRM-free sales, when they've been selling DRM-free music (i.e. CDs) all along.
Patience, grasshopper.
Problems Embedding Google Maps in Blogger
I decided to add embedded Google Maps showing routes to some of my recent Sustainable Adventure posts, but I had some problems.
First, it's quite hard to get the zoom/scale to come up properly on the embedded map. It would either be zoomed out too far or zoomed in too far. Presumably the zoom in the embedded map is related to the zoom when you ask for the link/embed code, but there doesn't seem to be a simple correlation. With trial and error, zooming in and out and resizing the window, I could usually get it more or less right.
In some cases it seemed to help to copy the link address, open a new tab/window and go to that address. I'm not sure why that would help. Maybe it was just coincidence that it happened to work after doing that.
One of my maps refused to show the route. The map itself would display (although missing one tile). Eventually, after playing with it and making a minor change to the route it started working. (I gave up on trying to get the zoom right on this one, I was happy enough to just get it to work.)
It's sad how much the success of an "expert user" comes down to trial and error and randomly poking at things. The equivalent of kicking the machine. Isn't software supposed to be consistent and predictable? The problem is that if the software gets complex enough (as most software is these days) then it becomes impossible to control/know all the inputs and state, so it ends up seeming random, unpredictable, and inconsistent. Yuck!
I searched for other people having this problem but didn't find much. Is it something I'm doing differently? Issues related to Blogger? But lots of people use Blogger. Maybe I just didn't hit on the right search terms.