Friday, May 30, 2008

Hark, where art thou, keys?

I was talking to my mothra (mum) last night and she had recently found an email I had sent back in August 2005. If I remember correctly, I sent this to my boss because I was running late and couldn't find my car keys.

I’ve been looking for half an hour, to the point I’ve looked into the shower.
I’ve searched high and low, left and right. In a bank of snow, with and without light.
I’ve looked far and wide, deep and shallow. Under my feet and, under my pillow.
So if you see my keys then let me know. A reward you will see, to work I will go.
To ease my thoughts and waste my time. I had a coffee and wrote this rhyme.

I think it took a few days to find my keys. They had fallen down the back of one of our La-Z-Boy chairs and had gotten caught in the internal framework. This meant that they couldn't be found by putting your hand down the side of the chair, nor could they be seen by lifting the chair off the ground. Fun!

Have a good weekend!

Thursday, May 29, 2008

Attachments and Requirements in QualityCenter with C#

Previously I covered creating new requirements and how to handle requirement post failure. Today I'll cover attachments which are fairly trivial but have a gotcha that might not be immediately obvious.

To add an attachment to a requirement, the requirement must already have been posted; otherwise you will get an exception. The following code shows how to add the attachment. It is very simple.

code - adding attachments to requirements
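The code was originally posted as an image that hasn't survived, so here is a hedged sketch of the pattern described. The variable names and the literal attachment-type value are my assumptions, not the original code.

```csharp
// Hedged sketch: 'req' is a TDAPIOLELib.Req that has already been posted.
// Calling this before posting the requirement throws a COM exception.
TDAPIOLELib.AttachmentFactory attachFactory =
    (TDAPIOLELib.AttachmentFactory) req.Attachments ;

// As with requirements, AddItem wants System.DBNull.Value, not null.
TDAPIOLELib.Attachment attachment =
    (TDAPIOLELib.Attachment) attachFactory.AddItem (System.DBNull.Value) ;

attachment.Type = 1 ;                                 // 1 = TDATT_FILE (file attachment)
attachment.FileName = "D:\\Path\\to\\my\\image.jpg" ; // local path to the file
attachment.Post () ;                                  // uploads the file to the QC server
```

Remember the gotcha below: do this only after you have created any child requirements.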

Debugging
When you set the path of the image it will be adjusted from something similar to: "D:\\Path\\to\\my\\image.jpg"
to:
"C:\\DOCUME~1\\user\\LOCALS~1\\Temp\\TD_80\\504857af\\Attach\\REQ1722\\D:\\path\\to\\my\\image.jpg"

Whether this is an artifact of the Quality Center API or the .NET development environment is beyond my knowledge of either.

The gotcha?
You have to add your attachments after creating any child objects. Otherwise every single child requirement (and their children) will get a flag added to it saying that a parent requirement has changed and that you should review the potential impact of the change. Sometimes you want this, sometimes you don't. If you trigger it accidentally, it can take a long time to remove all those flags.

Wednesday, May 28, 2008

Handling Post failure with QC Requirements in C#

Yesterday I covered creating new requirements and updating custom fields. Today I will discuss post failure and why handling it is non-trivial.

Quality Center requires that no two requirements with the same parent share the same name. If you attempt to save such a requirement using the QC UI, it shows an error and the user simply tries again. If you try from C#, you get a COM exception.

It also resets your object.

I couldn't believe it either. It means you can't just append an incremental counter to the requirement name and post again until it works. It means that if you are building your requirement up from a source document, you have to go and redo your work. This is a pain if you were using recursion.

There is a solution, and it is not complicated. What I did was define a new class called SafeQcRequirement, which owns a standard QC requirement object and each of the attributes you care about. In the worst-case scenario you are duplicating all of the attributes; in my case it was only a handful.

As you build up your requirement, store all the data in the safe requirement. When it comes time to post, build the real QC requirement and post it. If that fails, rebuild it with a new name and try again. Keep going until you succeed.

The following screenshot is what my post routine resembles.

code in a bitmap, hideous... apologies... a solution is almost here.
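Since the bitmap is gone, here is a hedged reconstruction of the retry pattern described above. The SafeQcRequirement members, the parameter names, and the duplicate-name message check are my assumptions, not the original code; check what message text your QC version actually returns.

```csharp
// Sketch of the retry-on-duplicate-name pattern. All names are illustrative.
public TDAPIOLELib.Req Post (TDAPIOLELib.ReqFactory factory,
                             SafeQcRequirement safeReq)
{
    return Post (factory, safeReq, 0) ;
}

protected TDAPIOLELib.Req Post (TDAPIOLELib.ReqFactory factory,
                                SafeQcRequirement safeReq, uint attempt)
{
    // Rebuild the real QC requirement from our safe copy every time,
    // because a failed Post resets the COM object.
    TDAPIOLELib.Req req =
        (TDAPIOLELib.Req) factory.AddItem (System.DBNull.Value) ;
    req.Name = (attempt == 0)
        ? safeReq.Name
        : safeReq.Name + " (" + attempt + ")" ;
    req.ParentId = safeReq.ParentId ;
    // ... copy across any other attributes you care about ...

    try
    {
        req.Post () ;
        return req ;
    }
    catch (System.Runtime.InteropServices.COMException ex)
    {
        // COM gives no specific exception type for this, so inspect the
        // message text. The "duplicate" check here is an assumption.
        if (ex.Message.Contains ("duplicate"))
        {
            return Post (factory, safeReq, attempt + 1) ;
        }
        throw ;
    }
}
```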

The second, protected Post method is a recursive loop that simply tries until it succeeds. There are no guards against stack overflows or more than uint.MaxValue requirements. Based on my usage scenarios this is unlikely. If it does occur, shotgun not testing.

In actual fact, any duplicate requirement names are recorded and fed back to the User Centered Design people (who created the source document) and they update their source document so that the duplicates no longer exist.

Regarding the exceptions: COM doesn't give you a useful exception type, so just capture the base exception and use the message text to determine whether it is the one we care about.

Our requirements are sourced from an Axure document, and in the next week I'll put up a post about parsing that document for requirements. Tomorrow I'll cover adding attachments to requirements, which is fairly trivial but still has a few gotchas.

Tuesday, May 27, 2008

Creating a Quality Center requirement using C#

Unfortunately Quality Center's Open Test Architecture (OTA) API doesn't have any documentation for C#. It's all in Visual Basic, which can make it a bit of a trial to work out what is required. In the case of creating requirements, the documentation states:
Passing NULL as the ItemData argument creates a virtual object, one that does not appear in the project database. After creating the item, use the relevant object properties to fill the object, then use the Post method to save the object in the database.
Sounds simple enough, until you find that null doesn't work, and neither does a null object of the type, a null object, zero, nothing, or about 15 other things we tried that could be considered null. In the end we needed to pass in the singleton System.DBNull.Value object.

TDAPIOLELib.Req req = (TDAPIOLELib.Req) m_reqFactory.AddItem (System.DBNull.Value) ;

Once you have created a requirement, you can start setting the value of each attribute. C# properties are provided for standard attributes, while custom attributes must be set via the indexer.

req["QC_USER_01"] = "my value" ;

Where QC_USER_01 is the name of the attribute previously defined in QC. Once you have set your values, call Post to write to the QC server, or call Undo to reset any changes you have made to the object.

req.Post () ;
req.Undo () ;


If your Post call fails, then you need to handle it accordingly. It is non-trivial and I'll talk about that tomorrow.

note: I would like to thank all the brave developers who walked past my desk whilst this was being worked out and helpfully shouted different ideas for what null could be.

Monday, May 26, 2008

Leveraging MSVC's output window

If you use scripts to identify parts of your code that may not meet standards, you can simplify the developer's job by making use of Visual Studio's Output window. Any script output that appears on a separate line in the Output window in the following format is clickable.

$filepath($line) : $message

When you click on it, you are taken to the specified file and line, and the message is displayed in the status bar. This is how it works when you let Visual Studio perform builds for you.
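Any tool that writes lines in that shape will do. As a rough illustration (my actual tool is a Perl script; this is a C# equivalent I've sketched here, with the file pattern and TODO marker as assumptions), a scanner might look like:

```csharp
// Sketch: scan .cs files under a root directory for //TODO comments and
// print them in the "filepath(line) : message" format that Visual
// Studio's Output window makes clickable.
using System ;
using System.IO ;

class TodoScanner
{
    static void Main (string[] args)
    {
        string root = args.Length > 0 ? args[0] : "." ;
        foreach (string file in Directory.GetFiles (root, "*.cs",
                                                    SearchOption.AllDirectories))
        {
            string[] lines = File.ReadAllLines (file) ;
            for (int i = 0 ; i < lines.Length ; ++i)
            {
                int pos = lines[i].IndexOf ("//TODO") ;
                if (pos >= 0)
                {
                    // e.g. D:\src\Foo.cs(42) : TODO: fix this
                    Console.WriteLine ("{0}({1}) : {2}",
                        Path.GetFullPath (file), i + 1,
                        lines[i].Substring (pos + 2).Trim ()) ;
                }
            }
        }
    }
}
```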

So now that you know how to format messages for the Output window, you need to be able to run your script so that its output is sent there.

This is pretty easy to do... in Visual Studio, go to Tools > External Tools and set up your tool a bit like mine. My example is a Perl script that looks for //TODO comments in code and writes the information to the standard output.
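The screenshot of my External Tools setup hasn't survived, but the settings were roughly along these lines (the paths and names here are examples, not the originals):

```
Title:             &Find TODOs
Command:           C:\Perl\bin\perl.exe
Arguments:         C:\scripts\find_todos.pl $(SolutionDir)
Initial directory: $(SolutionDir)
[x] Use Output Window
```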


The "Use Output Window" checkbox is the kicker here; otherwise it'll run in a separate console window and be less useful to you. When you write your scripts, dump the output to the standard output. That is what gets piped into the Output window.

With this in mind you can start writing scripts to give useful feedback to developers, like:
  • parameters that aren't const but could be
  • identifying classes / methods without unit tests
  • poor coding standards / formatting
  • implementing FxCop-like functionality (for non-C# languages)
Basically, anything you can identify in a script. I use it for enforcing coding standards and for letting other developers mark areas for me that could be performance tuned.

It is also useful for doing ASCII-to-Unicode conversions. Conversions like this are non-trivial, and carte-blanche find-and-replace methods generally don't work. You can write a script to identify potential areas for change and then tackle them one at a time.

Happy Scripting!


note: I don't know how to get it to show up in the Error List window (not that I've looked recently) but if anyone knows, send me a link. That would be super.

Sunday, May 25, 2008

The potential defect

For those that want to know, the defect was a memory problem. When you changed from tab to tab, the application (Nokia Nseries PC Suite) would acquire several hundred kilobytes of private bytes. If you kept changing tabs, the memory usage continued to climb. This application is designed to start with the computer and remain running the entire time (it's for phone synchronisation), so if you don't turn your computer off at night, after a few days of usage its memory footprint can get quite high.

Personally I don't really use the application; as a matter of fact it bugged me, because it starts when Windows starts and I don't like applications doing that without asking first. I was looking for a way to turn that "feature" off and just happened to have Process Explorer open on my secondary monitor, set to sort by private bytes. After much clicking about, I noticed it working its way up the list... after that, like all good testers, I deliberately tried to work out what was causing the memory usage to rise.

Reporting Defects in Proprietary Software

I, like just about everyone else on the planet, find bugs in proprietary software. Sometimes I like to report them to the vendor. One thing that really annoys me is how hard it is to report a bug. Often these sites have no Contact Us About Gaping Holes In Our Software or, more to the point, any Contact Us Regarding Potential Software Problems.

This is even more the case for non-software companies that produce software, often to accompany their hardware. I shall use Nokia as my example. I would still say that they are a non-software company (they make phones), but they are heading in the direction of being at least partially a software company (they are trying to buy Trolltech) and they invest in other software companies like Symbian.

Nowhere on their site was a place to report a defect in any of their software products. In the end I sent an email with the defect report to the customer service people responsible for my phone and asked them to forward it on. Not an ideal solution and normally if it gets to this point I don't bother. Today I must be feeling extra kind.

At this point I would state my hard-line position and say something like: seriously, if you produce software in any form, provide a mechanism by which users can report bugs in your software. An email address is usually enough for me. A potentially better solution is to make use of Windows Error Reporting. I've never used it, but I can't see how it wouldn't be useful: every little bit of information regarding your application's reliability in the field helps. Note that WER is only useful for crashing or hanging apps.

The problem is that non-software companies that produce software traditionally don't have organisational practices around defect reporting and management, or the eventual software evolution that flows from them. The smaller or less mature the company, the greater the chance it has no such practices.

There is not much you can do about that. Supporting software is an expensive process; even if they wanted to, it may not be possible for them to set up the requisite infrastructure. Some may argue that they should have thought of this before getting into the game, but that doesn't change the fact that the software is already written. And if the vendor has gone bankrupt, you have zero chance.

What about the concept of an open defect registry, where users could report bugs against the software they use? It would mean the defect is documented somewhere, but it doesn't solve the following problems:
  1. doesn't mean the developer will fix it
  2. someone has to manage the reported defects to handle duplicates and not-a-defect reports
  3. the service would effectively be doing the work of software companies for free
  4. still doesn't help if the vendor is out of business
The first issue applies just as much to defects reported through a formal process with the vendor; you can't help that. The second and third points could be solved by charging vendors a nominal flat fee to access the defect information regarding their software. This does two things: it pays for someone to manage the defects, and it spreads the operational costs over many organisations, reducing the cost of defect management across the board.

That leaves the final problem of vendors going out of business. What we need on top of this is an open-source graveyard, where applications that are still in use but no longer supported have their code released to the public. Kept in a common location, they would allow open-source developers to revive them, fixing bugs or extending them on demand.

The eventual advantages of such a process: users could rate the importance of a defect, giving vendors better visibility into the top 20% of defects; software could report defects automatically through an API; and we would get better software all around.

note: This wasn't intended to be a post about a shared service of defect reporting. Train of thought blogging, ftw!