15 December, 2007

Spring AOP: Dynamic Proxies vs. CGLib proxies

Spring's AOP is proxy-based. Spring provides two different options to create the proxies. One is based on JDK dynamic proxies and works with interfaces, the other one utilizes CGLib and is based on classes. (That's why the property is called proxyTargetClass or proxy-target-class, respectively.) For the moment I just want to provide a quick summary of the pros and cons of both options:

JDK dynamic proxies:

  • The class has to implement interfaces. Otherwise you will get ClassCastExceptions saying that $Proxy0 cannot be cast to the particular class.

  • In effect, dynamic proxies force you to program to interfaces since you cannot cast the proxy to the class - a feature I really like about them (see the sketch after this list).
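
To make the ClassCastException point concrete, here is a minimal sketch using plain JDK dynamic proxies (independent of Spring). The Greeter interface, the GreeterImpl class and the logging handler are made up for illustration:

    import java.lang.reflect.InvocationHandler;
    import java.lang.reflect.Method;
    import java.lang.reflect.Proxy;

    public class JdkProxyDemo {

        // Hypothetical interface and implementation, for illustration only.
        public interface Greeter {
            String greet(String name);
        }

        public static class GreeterImpl implements Greeter {
            public String greet(String name) {
                return "Hello " + name;
            }
        }

        public static void main(String[] args) {
            final Greeter target = new GreeterImpl();

            // The proxy implements only the Greeter interface.
            Greeter proxy = (Greeter) Proxy.newProxyInstance(
                    Greeter.class.getClassLoader(),
                    new Class[] { Greeter.class },
                    new InvocationHandler() {
                        public Object invoke(Object proxyObject, Method method, Object[] arguments)
                                throws Throwable {
                            System.out.println("intercepted " + method.getName());
                            return method.invoke(target, arguments);
                        }
                    });

            System.out.println(proxy.greet("world")); // works via the interface

            // GreeterImpl impl = (GreeterImpl) proxy; // would throw a ClassCastException:
            // $Proxy0 cannot be cast to GreeterImpl, since the proxy only implements Greeter.
        }
    }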


CGLib proxies:

  • The proxies are created by subclassing the actual class. This means that wherever an instance of the class is used, it is also possible to use the CGLib proxy (see the sketch after this list).

  • The class needs to provide a default constructor, i.e. one without any arguments. Otherwise you'll get an IllegalArgumentException: "Superclass has no null constructors but no arguments were given." This makes constructor injection impossible.

  • The proxying does not work with final methods since the proxy subclass cannot override the class's implementation.

  • The CGLib proxy is final, so proxying a proxy does not work. You will get an IllegalArgumentException saying "Cannot subclass final class $Proxy0". But this feature is usually not needed anyway. (This issue might be solved in the future.)

  • Since two objects are created (the instance of the class and the proxy as an instance of a subclass), the constructor is called twice. In general this should not matter. I consider changing the class's state based on constructor calls a code smell anyway.

  • You have CGLib as an additional dependency.
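
To see the subclassing mechanism in isolation, here is a minimal sketch using CGLib directly (not through Spring). The ReportService class is made up for illustration; note that it relies on the default constructor and that a final method could not be intercepted:

    import java.lang.reflect.Method;

    import net.sf.cglib.proxy.Enhancer;
    import net.sf.cglib.proxy.MethodInterceptor;
    import net.sf.cglib.proxy.MethodProxy;

    public class CglibProxyDemo {

        // Hypothetical class, for illustration only.
        public static class ReportService {
            public String render(String name) {
                return "Report for " + name;
            }
        }

        public static void main(String[] args) {
            Enhancer enhancer = new Enhancer();
            enhancer.setSuperclass(ReportService.class);
            enhancer.setCallback(new MethodInterceptor() {
                public Object intercept(Object obj, Method method, Object[] arguments,
                                        MethodProxy methodProxy) throws Throwable {
                    System.out.println("intercepted " + method.getName());
                    return methodProxy.invokeSuper(obj, arguments);
                }
            });

            // The proxy is an instance of a generated subclass of ReportService,
            // so it can be used wherever a ReportService is expected.
            ReportService proxy = (ReportService) enhancer.create();
            System.out.println(proxy.render("world"));
        }
    }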


Both options come with some caveats (not really issues, but you have to be aware of them):

  • Most importantly, proxy-based AOP only works from the "outside": internal method calls are never intercepted (see the sketch after this list).

  • Second, the object has to be managed by the Spring container. Instantiating it yourself using new does not work.

  • The proxies are not Serializable.
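
A minimal sketch of the first caveat, assuming declarative transactions via @Transactional; the OrderService class and its methods are made up for illustration:

    import org.springframework.transaction.annotation.Transactional;

    // Hypothetical Spring-managed bean, for illustration only.
    public class OrderService {

        @Transactional
        public void importOrders() {
            // ... read a batch of orders ...
            saveOrder(); // internal call: goes straight to 'this', so the proxy
                         // and its transaction advice around saveOrder() are bypassed
        }

        @Transactional
        public void saveOrder() {
            // Only calls that arrive through the Spring proxy from outside,
            // e.g. orderService.saveOrder() called from another bean, are advised.
        }
    }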


Regarding the performance of one approach versus the other, I have read different things. I remember reading a blog post claiming CGLib proxies are faster, while one of the comments says dynamic proxies are. Actually the Spring reference itself has a paragraph on this:
There's little performance difference between CGLIB proxying and dynamic proxies. As of Spring 1.0, dynamic proxies are slightly faster. However, this may change in the future. Performance should not be a decisive consideration in this case.

I emphasized the last sentence intentionally.

Especially because they enforce programming to interfaces and allow constructor injection, I strongly prefer the JDK dynamic proxies.
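
To make the choice concrete, here is a minimal sketch using Spring's programmatic ProxyFactory, whose proxyTargetClass property is the one mentioned above (in XML configuration the corresponding attribute is proxy-target-class, e.g. on <aop:config> or <tx:annotation-driven>). The MyService interface, its implementation and the logging interceptor are made up for illustration:

    import org.aopalliance.intercept.MethodInterceptor;
    import org.aopalliance.intercept.MethodInvocation;
    import org.springframework.aop.framework.ProxyFactory;

    public class ProxyChoiceDemo {

        // Hypothetical service, for illustration only.
        public interface MyService {
            String greet(String name);
        }

        public static class MyServiceImpl implements MyService {
            public String greet(String name) {
                return "Hello " + name;
            }
        }

        public static void main(String[] args) {
            MethodInterceptor logging = new MethodInterceptor() {
                public Object invoke(MethodInvocation invocation) throws Throwable {
                    System.out.println("before " + invocation.getMethod().getName());
                    return invocation.proceed();
                }
            };

            ProxyFactory factory = new ProxyFactory(new MyServiceImpl());
            factory.addAdvice(logging);

            // false (the default): JDK dynamic proxy that implements MyService only.
            // true: CGLib proxy that subclasses MyServiceImpl.
            factory.setProxyTargetClass(false);

            MyService proxy = (MyService) factory.getProxy();
            System.out.println(proxy.greet("world"));
        }
    }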

04 December, 2007

Please tell Microsoft about this problem

If you are a Windows user you probably know this request quite well. No, this post is not yet another round of Microsoft or Windows bashing (though I switched to Mac OS X lately). It's only about an incident I had last weekend on Times Square in New York, where I really did not expect to see that message:



(I took the picture on Nov 24th 2007, around 4:30 pm.)

When I saw this on one of the big screens I really had to laugh. How embarrassing is this? For Windows and Flash (though I was surprised and impressed that these screens run on Windows and Flash). It will be (justified) bad press for them, though. Everybody sees their prejudices confirmed on the big screen. But it is especially embarrassing for the advertiser. How many thousands of people walk across Times Square on the Saturday after Thanksgiving? At least it was incredibly crowded, and I was probably not the only one who noticed it.

14 September, 2007

When the shit hits the fan ...

Not that long ago I received my first Mac, more precisely the top MacBook Pro with the 17" high-resolution wide screen. I was quite unhappy with Mac OS X - and actually I am still not satisfied. One of the things I first complained about was the way files are deleted. It's not possible to do this directly; it always has to happen via the trash bin. Now I know why they handle it that way, which did not help me though ...

Now what actually happened? Last Monday I visited the overly expensive King Tutankhamun exhibition. They have impressive stuff there, but the most famous golden mask is missing - and you are not allowed to take any pictures. But it was a nice evening and on the way home I took a few pictures. At home I wanted to copy them to my MacBook, and here the story actually starts.

Mac OS X has some really strange features. The first one is the one mentioned above: I'm not able to delete files directly. Of course I want to delete the pictures from the memory card after copying them to the hard disk. Here we get to strange feature 2: Deleting them on the memory card does not free the space since they are still in the card's trash bin. And you can't selectively delete items from the trash bin either - it's all or nothing, which is strange feature 3. I want to keep stuff in the trash bin where it makes sense; being forced to empty the trash bin completely is rather stupid.

Now I found a workaround which seemed quite convenient. I use CocoViewX for viewing and managing my pictures. It's not as good as IrfanView, which I used to use on Windows, but much better than iPhoto. The latter is kind of unacceptable for me since it insists on managing the pictures in its own directory structure and creates 3 copies (originals, data, modified) of all of them - I have 3.5 GB of photos! Anyway, CocoViewX notices when there is a memory card with photos and offers to import them with 3 options: keeping the photos on the card, moving them to the trash, or deleting them directly from the card. The last one is what I was looking for and used. So far, so good. It creates an import folder where it puts the new pictures.

After I had sorted the pictures and moved them to my folders I wanted to delete the import folder. Again CocoViewX offers the three possibilities to keep it, to move it to the trash, or to delete it. I knew this folder was empty and I was very, very sure that I wanted to delete it. But I was too lazy to switch to the mouse and wanted to switch between the buttons with the keyboard. I played around with Tab and the left arrow key, which I'm used to from Windows, with different combinations of the modifier keys. I only found that Apple+Left changed the directory in the background, but nothing moved the focused button. Apple seems to be completely unusable without a mouse!

Anyway, in the second screenshot you see where I ended up, and I selected "Yeah, throw it away". And what happened? Due to the changed context in the background it threw away my complete "Bilder" (meaning pictures in English) folder! Even though the question was still about the import folder! It took some seconds until I noticed what had happened.

Then I slowly started to think about what I had really lost: 200 pictures of my last two months here in Philadelphia, that is, since I arrived in the US. Among them pictures of my trips to Washington D.C. and New York City. Some I had already sent to friends, so I could get at least small variants of those back, 12 to be exact.

Then I thought about what to rescue and how. Undelete for Mac OS X! Everything I found was anything but promising. I found references to tools like Stellar Phoenix, File Salvage, Data Rescue II and Virtual Lab Data Recovery, but none of them was free. But wait, there is still the memory card! I have a second notebook here running Windows XP. I knew there are free undelete tools for Windows, and with one of them, FreeUndelete 2.0, I could recover most pictures from my NYC trip.

And the rest? All 4 Mac OS X tools offered evaluation versions where you can search your disk for recoverable files, but not actually recover them. I tested all of them by downloading them directly to a second 1 GB memory card. Stellar Phoenix did not even find my logical drive, and Virtual Lab could not (potentially) restore my pictures to the memory card, if I remember correctly. Both File Salvage and Data Rescue II at least worked, and I scanned my disk. They found more than 5,000 JPEGs. Wow! Pay $100 and then on top of that do the work of filtering the files? I nearly gave up and just wanted to ask a colleague of mine the next day who has more experience with Mac OS X. Unfortunately he had never been in need of such a tool.

But in the evening I started searching for a tool again. A Unix/Linux operating system but no open-source tool for undelete?! I finally found one called PhotoRec by CGSecurity. It's rather low-level, started from the shell, but provides a menu. Unfortunately, it cannot just scan the free space; it can only scan the whole hard disk. It groups restored files into directories of 500. Since I knew I would run out of space on my 1 GB memory card I cleaned up the directories as soon as they were "finished", i.e. had 500 files in them. PhotoRec found thousands of useless operating system files, mostly very small, some bigger ones like the background images. I had no problem keeping the space usage on the card rather low. As soon as it got to my actual pictures I could not keep up and the card filled up more and more. When the free space reached 10 MB I stopped the process in the hope that I could continue afterwards - unfortunately not. Anyway, after more than 5 hours, scanning around 55% of the sectors of my hard disk and restoring 19,000 (!) JPEGs, I also had most of my pictures from Washington D.C. and Philadelphia back - and was quite happy.

I sent Christophe Grenier (that's what the CG in CGSecurity stands for) an email about my observations with the tool. He responded pretty fast. Regarding scanning only the free space instead of the whole disk he wrote:
Currently this feature is only available for FAT and NTFS. I hope to provide it for Ext2/ext3, HFS/HFS+ isn't planned.

He also told me how to start from a specific sector by modifying photorec.ses. This file contains a huge list of scanned sector ranges starting from 0. All entries up to the ones that should still be scanned must be dropped. This worked pretty well on evening 3. It took only another hour to scan the rest and restore another 1,000 JPEGs, nearly all of which were to be deleted anyway.

Yeah, so overall I had to filter out around 120 pictures out of 20,000. The first criterion was of course the size, which should have sorted out around 80% of the pictures. Since my hard disk is rather new it is also not that fragmented yet, and my pictures came mostly in groups. I could easily delete the ones I had on my backup and keep the other groups. All in all it took 2.5 days because of annoying features in Mac OS X and bugs in CocoViewX - and I consider the former worse than the latter.

It really annoys me that Mac OS X does not allow me to delete things directly, especially since I can't delete stuff selectively from the trash bin. This might be good in general since it is so hard to restore deleted files. But I usually know what I'm doing and don't want to be patronized by my operating system. Since there are more issues I have with Mac OS X (about which I might blog in the near future as well), it might even happen that I switch to another one (with Linux more likely than Windows).

10 August, 2007

The next big thing!

No, I'm not talking about Web 3.0 or a successor to Spring and JEE, not even about the upcoming Cocoon 2.2 release. It's only about the next Cocoon GetTogether, which, as recently announced, will take place in Rome for the first time this year. It will run from October 3rd to 5th with a two-day hackathon and a one-day conference as usual.

For those who don't know it, a short history: Starting in 2002, for the first three years it was held in the beautiful town of Ghent (Belgium) by the Outerthought guys as an occasion for developers and users to meet each other. At around 100 € it has always been a rather cheap event. The American spare ribs of a local restaurant (does somebody know the name or a link?) became most famous and a tradition at the Cocoon GT.

In 2005 the Cocoon GT moved to Amsterdam, organized by the Hippo team. Amsterdam is a wonderful city as well, with more nightlife and sights, but it was also more expensive, especially the hotels. And the spare ribs were not half as good as the ones in Ghent. We came back to Amsterdam last year though.

And now it's Rome and the Sourcesense team. Again it's a very interesting city, one where it is worth leaving the laptops behind once in a while. I have been there already - 11 years ago ... it would be about time to visit Rome again. But not this year, since I won't make it to the Cocoon GT. For the time being I'm in the US and would rather spend my holidays visiting the interesting places here.

For anybody who is interested in Cocoon or uses it, I can only recommend the Cocoon GT. It has always been very interesting and enlightening. You can meet the developers and get deep inside views, or get your hands dirty working on your first bug during the famous bug hunt.

For a few impressions also take a look at the pictures.

07 August, 2007

1,000th post

Nothing that really matters to the general public, I guess. But I made my 1,000th post in the Spring forums yesterday. At the moment it's the community I'm most active in, after having worked with Spring on my last two major projects.

The first project was a major refactoring. The web application was mainly separated into a web tier based on Apache Cocoon and a business tier without a container or distinct architecture. I introduced Spring for dependency injection in the business tier and replaced the home-grown transaction framework with a JTA-based solution. Besides the database we access a Corba server and the file system transactionally, so we need distributed transactions and two-phase commits. I wrote an XAResource/JCA implementation for Apache Commons Transaction. Unfortunately I can't release this stuff as open source but will have to reimplement it when time allows. I use Jencks, which integrates Apache Geronimo's transaction and connection management, so I don't need a full-blown JEE server but can use Spring and Apache Tomcat.

The second project was a portal. Starting from scratch was really nice. I could reuse my JTA/JCA stuff and added JMS to the picture. Liferay 4.1.2 was chosen as the portal server - and I learned to hate it quite fast. So many issues, especially (but not only) with IBM DB2. Lately I switched to 4.3.0 - and I am quite happy with it. They have improved tremendously and now I can also recommend it. For the portlets I used Spring Portlet MVC, which was really new at that time and still had a few issues. But in contrast to Liferay (this has improved a lot as well) and especially Hibernate (that's a topic of its own), I really got help from the community and issues got fixed really fast. That's when living on the bleeding edge is fun and not just frustrating.

Getting more and more used to the different parts of Spring, I also started to share my knowledge. First it was the custom scopes (which I used in both projects) and proxying (mostly for declarative transactions); later I added Spring MVC in general, Spring Portlet MVC in particular and the PropertyEditors to the picture. And that's what I mainly focus on in the Spring forums.

Actually my home community is Apache Cocoon. That's where I started at the end of 2000, and I became a committer in mid-2003. Unfortunately, by not really working with Cocoon for the last two years I have more or less lost track of its development. I still try to get involved from time to time though (as lately by advocating PropertyEditors like in Spring ;-) ). I'd like to work more in this community again in the future, especially with the forthcoming version 2.2.

04 August, 2007

Killing Music

Do you remember "Copy kills music"? That was a campaign of the German branch of the International Federation of the Phonographic Industry (IFPI), an umbrella organization "representing the recording industry". In 1999 the campaign tried to prove with ludicrous reasoning that copying kills music. What they actually meant - and even said so in the text - is that it endangers the earnings of the recording industry. The consequences read as dramatically as follows: the recording industry will no longer be able to finance risky projects off the mainstream, and so the musical landscape will become desolate.

In my opinion it's the recording industry itself, or rather its major players, that is to blame. What were the last 5 quality albums released by the major players? Nobody should wonder at the decreasing earnings, which, as I admit, are caused mostly by copying. But in the meantime you have to pay around 16 to 18 € for any CD in a shop in Germany; Amazon is around 2 € cheaper. And they missed the move to the internet. If there had been a platform like iTunes right from the beginning, I claim they would not have these problems nowadays. By the way, this is no call to copy music. Just to mention it: I have far more than 400 original audio CDs.

Now it was Elton John who said that the internet destroys music. At least he refers to creativity, not money. The internet is supposedly preventing people from going out and being creative. This results in only 10 fantastic albums per year - while there were 10 per week in the early 70s. Maybe it's just that taste in music has changed a lot in the meantime? I surely don't consider many of those fantastic albums as fantastic as Elton John does. Anyway, he wants to shut down the internet for 5 years and expects better music to arise! And:
Hopefully the next movement in music will tear down the internet.


In my humble opinion both the IFPI and Elton John just don't understand what the internet is about - but in completely different respects. Elton John asks people to communicate since this results in creativity. I say there is no better communication platform than the internet. Face to face would be better of course, but there isn't someone around for every interest.

The IFPI says the internet endangers the recording industry's earnings and thus its jobs. I say I don't care. This does not mean I don't care about the affected people - it's simply from an economic point of view. Or in other words: Who cared about the gunsmiths when they became obsolete? Things just change. And yes, I consider the recording industry obsolete in its current form. I don't need their talent scouts and their marketing for mainstream music. They should only focus on the production and distribution of the actual recordings. Then they would not need to care about copying and to lobby for more restrictive laws. They demand access to the internet connection data which internet providers will soon need to store for 6 months in Europe (for those who like lengthy German words, it's the so-called Vorratsdatenspeicherung). Those data were intended for anti-terrorism investigations and are now supposed to be misused for civil law cases. That means I have to relinquish my fundamental right to privacy for the purely monetary interest of another party. In my opinion not even the so-called war against terrorism justifies those restrictions of fundamental rights, since their effectiveness is questionable at best. You might remember that the US authorities actually had a great deal of data about the 9/11 terrorists, but this could not prevent their attack. But I completely digress ...

Back to the internet killing music in terms of creativity and earnings. There is already an example that belies both fears: open-source software development. Whoever claims there is no creativity involved makes him- or herself ridiculous. And there are also a lot of successful companies that base their business around open-source development. The secret is that they add value to the easily copyable source code by providing services, training or more trivial things such as discs and documentation.

I really can't see why music should be so different from software. So what can the music industry learn from open-source software? What will it look like in the future? Actually, we just need to look back to before the rise of the music industry. Probably the music itself will become less important from an economic point of view. It's the additional value that will matter in terms of money. People will still buy audio CDs as they do now, despite the possibility of just downloading the music. Performances and concerts will also become more important.

First steps have already been taken. The most famous example of a successful career started on the internet is surely the Arctic Monkeys (though I don't like the music). Another band following this example is Clap Your Hands Say Yeah (which I really like a lot). Furthermore, the recording industry seems to recognize that any attempt to prevent copying is doomed to fail. (Don't consider Apple and EMI benefactors; they did it for purely economic reasons.) I guess this change of view started with the huge disaster of Sony BMG's rootkit-based DRM. Now the others just need to follow and listen to their customers instead of fighting them.

I might conclude with a slightly modified version of Elton John's quote:
Hopefully the next movement in music will tear down the music industry.

But since I need them to get my audio CDs, I guess it's more appropriate to conclude with a quote from one of their managers, Irving Azoff. Unfortunately I only found this quote used by somebody else as the conclusion of a preview of the music industry in 2010 and have no idea in which context Irving Azoff originally used it, but here it goes:
When all the changes are done there will be still music.


Update: Universal joined the party

The Universal Music Group has announced that it will sell some of its music without DRM - at least for the time being. And Amazon followed shortly after EMI, which I had missed at that time.

15 July, 2007

There is always insufficient information ...

From long experience on several mailing lists and forums I can tell you there is (nearly) always insufficient information. So thanks actually go to Andrew Stevens for the name of this blog. I laughed sooo much about his post on the Cocoon users list.

But what actually makes a post a good post?

1. Keep in mind that there are no dumb questions, even if a question has been asked a hundred times. This only means that the answer might be hidden too deep in the documentation (a problem the Cocoon documentation suffers from a lot). The users must be taken seriously, and so must their documentation needs. If the code becomes an end in itself, the project will become obsolete. So listen to the users.

2. That's actually more of a 1b: RTFM. Really, please try to solve an issue you have on your own first. Read the documentation, reference, manual, whatever it is called. Search the mailing lists. Use Google or your search engine of choice. It's rare that your issue hasn't occurred before (though somebody obviously always has to be the first one). Even if you cannot solve the issue on your own, having read the documentation will help you to understand (and later explain) the issue better.

3. Give your post a meaningful subject. "Help" is not a good one. Any mention of "urgent" is useless; you won't get your answers any faster. Don't waste people's time by making them read stuff they are actually not interested in.

4. Give context to your issue. Maybe you are trying to achieve something in a completely unusual way - if not even a dead end. Even if nobody has tried it that way and can't help you with your actual issue, they might know alternative solutions or workarounds. It also makes it easier to understand your issue.

X. Use an appropriate way of writing and styling your post. You should only post in plain text to mailing lists. That's not because all developers are purists; it's a matter of readability in the mailing list archives. Reading an HTML mail in the archive is like reading the HTML source of a web page - or even worse, since < and > are often escaped to &lt; and &gt;.
In forums it's the other way around. Reading code in variable-width fonts is really hard, not to mention that the code loses its indentation since the whitespace gets collapsed in HTML.

XX. To be completed ...