Smack my iPod up

On my way home yesterday I thought I'd listen to my favourite podcast via my iPod and iTrip. I selected the usual Recently Added playlist, found the show I wanted to listen to and pressed the Select button. Then the iPod reset itself – OK, I thought, I'd seen this behaviour before, and tried again. This time when I selected the show the iPod just hung. I'd seen that behaviour before too, and normally a full reset does the job; but it wouldn't reset. Odd – now that I had not seen before.

By the time I got home, and after half an hour of trying, the thing did eventually manage a reset. Once the familiar menu came up I shut it down and figured I'd have a look later. Later that evening I thought I'd listen to my favourite podcast – when this happened:



To accompany the picture I also had a nice little "click, click, whurr!" going in a repetitive cycle. After about 30 seconds of this the iPod shut down and would not restart. I went to the Apple support site to see what I could see – I've been around computers a long time now and I can tell when the head on a hard disk is stuck – and, as expected, the help articles from Apple just said I needed to get it serviced. The issue I had with that, other than it being massively expensive at £166.00, was the statement that I would lose all my data; no "ifs", "buts" or "maybes" – the data would evaporate.

This is because the way Apple solve this problem is to send you another iPod (notice I did not say a new iPod) that they have already fixed (they call it refurbished). This causes me a big problem, as I have been using my iPod as an external drive for all my data as well as an MP3 player. As I have been upgrading and moving between machines a lot since the whole Vista CTP and Beta merry-go-round started, using my iPod seemed to be the perfect solution: I'd always have it with me and there was plenty of room for the data (shame on me for trusting Apple – or is it really Toshiba I should be pissed at?).

Anyway, given that this was not a route I wanted to pursue I pulled out my favourite search engine and looked to see what I could find. I started by searching for information on how to open the thing up – so I could have a look-see. I found a good article or two: here's one and another. In the latter post I found, buried in the comments, something that completely goes against all that I have learned about working with and using technical equipment:

Simply hold it in one hand, and smack it with your other hand. I was told to try this by a friend online, and wa-la, it has not had a single problem since which was over 2 months ago

I had nothing to lose – the thing would not start and it had the sound of stuck heads; plus I'd already convinced myself that I was going to try and do something with the disk: replace it, mount it or something – what's the worst that could happen?

Well, this morning I put the iPod in my left hand, with the wheel pointing upwards, and smacked the device with my right hand as hard as my hand could stand until the Apple logo appeared (it took three smacks). It's a happy ending (so far...): my iPod is now working great! Now I've got to go – I have some data to copy.

Laziness is Good

Another gem from Code Complete:
Laziness: The quality that makes you go to great effort to reduce overall energy expenditure. It makes you write labour-saving programs that other people will find useful, and document what you wrote so that you don't have to answer so many questions about it.
- Larry Wall

REST vs. SOA (again!)

Great article on REST and SOA: REST Eye for the SOA Guy. Steve Vinoski of IONA Technologies tries to explain REST from the viewpoint of someone steeped in SOA, with the intention of helping SOA people understand the value the REST camp so rightfully touts.

Outsourcing

Whilst reading through Matt's post at Technagerial on outsourcing, one thing leapt out at me... "Quality". Matt says:
Then there is the question of quality. In my experience, with only a few exceptions, the quality of code developed offshore has been good.

Matt's experiences have been different to mine (though he has had more experience in this area than I have) – for me, on the whole, the code developed offshore has been very poor. However, the code did work and therefore it got released; which is different from being good. From a manager's perspective, though, "it was released" can be seen as "it was good", and that can lead a manager to believe the quality was good too. That is not the definition of good code; more on why this poor code gets released follows later.

Matt goes on to say:

However, this requires vigilance and extra time from your key, local staff to review, comment and guide the offshore team. It is advised to have your coding standards ready and validated and make sure code is reviewed regularly to check for compliance and clarity.

I entirely agree, and here's some additional advice in this area specifically: do use automated test suites; do use static code analysis; do have standards AND enforce them; do not assume that the code will be OK or "good enough"; and do not accept the code if you are not happy with it. If it is not the code you would have written yourself then it's not the code for you. How will you ever maintain or grow the code if you do not love it as your own? Code is read many more times than it is written, so it is vitally important that the code does more for you than "just work".

There is a common issue I have seen in business time and time again with projects that use outsourcing: the project plan does not allow for the code to be reworked.

This is like having only one test cycle in the plan: there will be no time to put anything right. Without this being factored into the plan all you will get is an issues list but you'll have zero time to do anything about it. The fact that you'll have a "statement of work" and contracts that state what you and they must do will make little or no difference in the end - because if time has run out and the code works you'll be under pressure to release it, regardless of the quality.

If you're seriously considering outsourcing here's something that I have found extremely useful, Steve McConnell's insightful article on the topic: Managing Outsourced Projects.

McConnell Rocks!


Steve McConnell of Code Complete, Software Estimation and Software Project Survival Guide fame is appearing on my favourite podcast. If you have not heard of either the podcast or Steve then I encourage you to check it out, especially if you're involved in the business of software.

del.icio.us



If you have not yet discovered del.icio.us then you need to get yourself over there and create an account. This site (owned by Yahoo) is another in the long line of services carrying the "Web 2.0" handle; I have found it to be excellent. Check it out, it'll be the last bookmark you ever make... in your browser, that is.

Binary Watch





The thing I love about this watch is the idea of walking down the street and someone stopping me for the time...

REST or not to REST

REST (Representational State Transfer) is a very interesting style of service architecture; this morning another architect and I were looking at the pros and cons of REST vs. SOAP and WSDL. We also looked into the possibility of combining the two worlds by using REST with standard SOAP as the payload, gaining the URI loveliness of REST and the structure and standards of SOAP. The upshot for me was that REST is definitely a viable and compelling technology. My thinking is: if your service is simple then keep it simple with something like REST; if you need a highly structured, secure service then SOAP and WSDL would probably make more sense.

One other point of interest was the obvious tight coupling between REST and HTTP, i.e. between the service implementation style and the protocol; there is no such constraint with SOAP and WSDL. However, RESTful services exploit this nicely by using the verbs defined in HTTP to state the intent of the message, e.g. POST == Create, GET == Read, PUT == Update and DELETE == Delete. The intent then does not need to be inferred, in the worst case, or made explicit in the name of the call, meaning I can operate just on the named resource, e.g. Customer or PurchaseOrder etc., clearly and cleanly:


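Something along these lines, for example (a hypothetical request – the resource path and payload shape here are my own invention, purely for illustration):

```
PUT /customers/1234 HTTP/1.1
Host: www.example.com
Content-Type: application/xml

<Customer>
  <Postcode>SW1A 1AA</Postcode>
</Customer>
```

The verb (PUT) carries the intent, the URI names the resource, and the payload only carries the data being changed.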

The intent here is to change the Postcode for a Customer entity – I like how clean that is; the equivalent in SOAP would be much more verbose, and the intent would be specified by the SOAPAction and/or inferred from the wrapped type, e.g. ChangeCustomerAddressRequest. There are some obvious problems with the REST approach in relation to enterprise systems, but you cannot argue with the simplicity, and when it comes to just getting things done, I like it.

My biggest fear with employing the HTTP verbs is that most firewalls are not verb friendly – in that they only allow POST and GET, where POST basically means INVOKE and the intent of the call is buried somewhere deep in the payload. This probably means an alternative header is needed, à la SOAPAction and the GData X-HTTP-Method-Override approach. I'm really interested to see how this plays out over time – will the verb usage just go away, or will there be more widespread support for the verbs?

This, however, is not enough of an issue to scare me away; but I'm also unlikely to use the verbs other than POST and GET, and will jump straight to the alternate X-Header approach to save the thinking time and the clock cycles (maybe not even bother with GET!). The core reason this issue does not make me want to run a mile is that, for me, the power of REST is in the simplicity of the URI approach that RESTful services have; this is powerful and very compelling.
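To illustrate the X-Header idea: the Customer Postcode update described above could be tunnelled through a firewall-friendly POST, à la the GData approach, with the true verb carried in a header (again, a hypothetical request – resource path and payload are my own invention):

```
POST /customers/1234 HTTP/1.1
Host: www.example.com
X-HTTP-Method-Override: PUT
Content-Type: application/xml

<Customer>
  <Postcode>SW1A 1AA</Postcode>
</Customer>
```

The firewall sees only a POST, while a server that understands the override header can still dispatch on the intended PUT semantics.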

Architecture Books

I was recently asked if there were any good books on Software Architecture - I'm saddened to say that I have never found one that I liked. In my view the main reason for this is that I think it's really hard to squeeze that kind of information into <1000 pages and still be current and pertinent, whilst also keeping the reader engaged throughout. The topic is so vast, you could only write generally, in most cases, and you would not be able to delve into the detail that would be necessary to actually implement anything.

Having said all that I do think that there are some good resources out there; here are just a few that I keep my eye on, which keep me in-line with all things architectural in my problem domain:
  • The Architecture Journal - This is a Microsoft offering that I get sent in paper form every month, but it is also completely available on-line.
  • SkyScrapr - An on-line resource which, in their words: "... is your window into the architectural perspective". Here they talk about all the different elements of architecture from the perspective of the different types of Architect i.e. Solutions, Enterprise, Infrastructure etc.
  • HanselMinutes - This one is a little off the wall in terms of architecture, in that it is a developer podcast (by a guy called Scott Hanselman) and not about architecture; but his insight into the industry is very compelling, and from this podcast I have learned about so many new technologies that (almost) every Architect should be aware of.

I'm sure that there are many other good resources, and maybe there is a good book or two, it's just that I have not found them. If you do then please share with the group.

i-name, CardSpace and OpenID

Have you registered your i-name yet? Soon these identifiers will be like your email address, and if you don't register your name someone else will. Wouldn't you have loved to have had the name yourname@hotmail.com, yourname@gmail.com etc.?

Security in the world of the Internet is moving, and fast. With the introduction of Microsoft's implementation of the Information Card security spec with CardSpace in Windows Vista, and Bill Gates's recent announcement at RSA of support for OpenID with CardSpace, these areas are going to get hot very soon.

My i-name is =paul.jackson (personal i-names always start with a "=" and company i-names with an "@"); I have some forwarding set up with my i-name service provider using the standard XRI dictionary names. Once you have your globally unique identity, via your i-name, you then need your OpenID persona. I have used http://myopenid.com/ but I'm reliably informed that there are many others too; once you have this you're all set for the coming Web 2.0 solution to identity management. Soon OpenID will support i-names and CardSpace will support OpenID; job done.

Do it today, do it now, solve your identity problems.

Web 2.0

I have heard, seen, read and been present for many definitions of the phrase "Web 2.0" over the last year or so; however, today I have seen the answer to the "what is Web 2.0?" question that, I think, most completely answers it for the technical and non-technical alike, and in only ~4:30 minutes:
Web 2.0 ... The Machine is Us/ing Us

A Little Flurry

The scene when I arrived at work this morning...

public Stream Filter { get; set; }

Update: I have completely re-written this post - Apologies if you have already read it, but if you read it again it might actually make sense this time :-) [PJ]

Work has been really busy recently, so my posts have not been as regular as I would have liked; however, in the short term I wanted to share a little something that I thought might be useful to others. Very soon I'll be doing a piece (maybe a couple of pieces) on a private project which has involved JavaScript, XML, ASP.Net, C# and JSON - focusing on the "How To" part of JSON.

Anyway, on with this post. I have recently had occasion to write a post-processing rules engine for ASP.Net, the basic idea being that we need to change the content of the out-going HTTP response stream without the originator of the HTML content being involved in the process. If this all sounds a little convoluted then that'll be because it is - the real issue is that we do not want to change the existing codebase, but we still need to make some minor modifications to the HTML that the existing code spews out.

What follows is some pseudo code that outlines the intent of how we wanted to achieve that goal:

string responseText = null;
HttpResponse.OutputStream.Position = 0;
using (StreamReader reader =
    new StreamReader(HttpResponse.OutputStream))
{
    responseText = reader.ReadToEnd();
}

// ...check the response text for "whatever"
// ...and replace
responseText = FindAndReplace(responseText);

HttpResponse.OutputStream.Position = 0;
using (StreamWriter writer =
    new StreamWriter(HttpResponse.OutputStream))
{
    writer.Write(responseText);
}
The idea would then be to wire this all up in an event within an HttpModule. Well, that was the intent rather than the implementation. In fact, what is shown above is the prototype code I originally wrote when trying to prove the idea; however, what has been described is not possible - HttpResponse.OutputStream is write-only: you cannot move the position of the cursor manually, nor can you read from this stream in any way whatsoever; all you can do is add more stuff to it.

So if this isn't what I did, the question surely must be: how do you replace the contents of the response stream at runtime?

Well, the answer lies in the filtering capability of the HttpResponse object (and the HttpRequest object, for that matter), which allows you to insert a chain of filters that get in the way of writes to the underlying stream. The kudos for this answer goes to Fritz Onion and his great book (the only one you ever need to read for ASP.Net), to which there is now a second volume that discusses .Net 2.0, which I have yet to read but most definitely will. An example of the technique is illustrated below:
HttpResponse.Filter = new MyCustomStream(HttpResponse.Filter);
Now, any time a call is made to the underlying stream in the HttpResponse object your filter is called; this gives you the opportunity to mess with the data to be written in any way you see fit, before passing it along to the base stream (or the next stream in the chain).

All that remains is for you to write a custom stream class, as the filter really is a stream - WTF! I hear you say... Well, this is not as daunting a task as it may first seem; the only method that you do not pass directly on to your contained base stream is the call to Write(), so really it's just a bunch of typing rather than being hard or complicated to do. Here's an example:
public class MyCustomStream : Stream
{
    private Stream _baseStream;

    public MyCustomStream(Stream baseStream)
    {
        this._baseStream = baseStream;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        // ...work your magic with the buffer here
        ...

        // ...once done, write the data to the base stream
        this._baseStream.Write(buffer, offset, count);
    }

    // ...remainder of the class implementation
    ...
}
The only thing left to do is to wire up your stream in the right place. This can be achieved in an HttpModule wherever you see fit, but bear in mind that you may not want your filter to play a part in all requests. A sample application that shows off this concept, and saves you a whole bunch of typing, can be downloaded from here.
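For the sake of illustration, here's a minimal sketch of that wiring inside an HttpModule - the class name, event choice and path check are my own invention, not taken from the sample application:

```csharp
// Hypothetical module; register it in web.config under <httpModules>.
public class ResponseFilterModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.BeginRequest += new EventHandler(this.OnBeginRequest);
    }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;

        // Only install the filter for the requests we care about.
        if (app.Request.Path.EndsWith(".aspx"))
        {
            app.Response.Filter = new MyCustomStream(app.Response.Filter);
        }
    }

    public void Dispose()
    {
    }
}
```

The path check is the important bit: filtering every request (images, downloads, etc.) is both wasteful and a good way to corrupt binary responses.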

Simply Simple

I have been working with JavaScript quite a bit recently (on a project that I promise to post more about soon); whilst I'm a big fan of the language and its awesome capability, I'm beginning to remember how much I dislike actually working with it - the realities of that particular ECMA standard, when working across multiple implementations of it, can be a little... "complex".

Einstein:
Everything should be made as simple as possible, but not simpler.