Testing Certification – My experience of the ISEB Foundation Certification

Recently, Red Gate sent me on the ISEB Foundation Testing certification course. I had heard a lot of reports about the certification from fellow testers, but the course was only two days, in-house, and I had some free time, so I thought why not – I might learn something.

Now that I have my mark, I feel I should ‘share’ my experience and my view of the certification.

Theory and Terminology

I understand that this is a foundation course; however, it still includes a lot of terminology, which accounts for a large percentage of the course. The aim is to have a consistent terminology across the industry and make sure everyone is reading from the same page. All of the terminology used is listed in the glossary. Personally, I don’t find knowing the correct terminology that useful in the real world – for example, knowing what the term ‘failure’ means is useless if you cannot identify one.

In general, the course focused more on the theory side of testing, mainly theory for a perfect world where you have a fully testable spec. The advice was that if the spec wasn’t testable, you should alert the business that you can’t do any work until it is – I’m not sure how many companies would welcome this demanding stance. As a result of focusing on specification-driven testing, it didn’t go into much depth on Exploratory Testing, apart from defining the term, which I feel is much more important and generally more applicable.

Ignorant of new development processes

Most of the testing process discussion was based around the V-Model. The V-Model is great as a visual presentation of a sequence of steps: on the left you have development tasks, while on the right you have the corresponding verification steps. For example, when you receive your detailed, testable specification, you create the acceptance tests that will be used to verify the system. However, it starts to get a bit murky when you haven’t got the stages down the left, or when you’re doing Test Driven Development (according to the model, you design, code, then test).

While the V-Model doesn’t define which development methodology it applies to, it sits most naturally in the waterfall development lifecycle. I would have liked to see more of a focus on agile development – mentioning iterative development and the Rational Unified Process (RUP) doesn’t count.

We should be encouraging agile processes, showing how testers fit into them and how testers can work more closely with BAs and devs to ensure testable requirements and testable stories.

Agile is not just a fad or a new-age idea that will never catch on! It should be given a much more positive treatment in our teaching.

“Just learn it, don’t argue with it”

For me, this is the single biggest problem with certification. Even if you feel, or know, that something is wrong – don’t argue, just learn it so you can pass and have your name on a piece of paper. This encourages MCSE 2000-style certification, where all you need to pass is a brain dump of the terminology and sample questions.

This is not a very effective way to teach, and it is definitely not an effective way to learn. With software development generally, I have learnt by having in-depth discussions about the topic in question – the rights and wrongs, best practices and alternative approaches – in order to gain a good understanding. Someone saying that this is the only way you can do something is not very constructive.

However, you do just need to ‘learn’ the material in order to pass the exam. During my Scrum Master training, we had really good discussions about how Scrum works in the real world. By using the material as a guide rather than the be-all and end-all, and by not having to worry about passing an exam, we were able to dig deep into certain areas and have a discussion, and as a result took much more away from the two days.

“Principle 6: Standardized tasks and processes are the foundation for continuous improvement and employee empowerment.” (The Toyota Way)

Maybe we (I) have this all wrong? Maybe the aim of testing certification isn’t to teach you the latest and greatest techniques, but to provide you with a set of standardised tasks and processes to use as a foundation – it is, after all, a foundation certification. I’m currently reading The Toyota Way, and this is similar to Principle 6: have standardised tasks and processes to allow for improvement, instead of reinventing the wheel each time. It would make more sense.

If this is the case, then where is the continuous improvement and updating of the course material to take into account new processes, tools and best practices? By standardising these new ideas, we could refine them into new best practices, improving the industry in general. While the content is updated, it appears to be very static in terms of ideas.

The future for testing certifications?

What is the future for testing certification? From the numbers taking the examination, it looks like testing certification is here to stay. I think there are two initial approaches to improving it. The first is for the foundation course to drop the exam and instead follow an approach similar to the Certified Scrum Master training, allowing for discussion and the sharing of ideas. With no exam there is less red tape – no writing and marking of papers – allowing the content to be updated with more flexibility. The course could then be changed to include new ideas, sharing best practices and improving the industry in general.

In terms of training, it would be great to see a course aimed at testers similar to what developers have with “Nothing but .Net” from Jean-Paul Boodhoo: a serious deep dive into different testing techniques, tools and approaches. Alongside this, conferences have their role to play. This year, DeveloperDay in the UK has a number of different testing-based sessions, all of which cover real-world ‘take back to your office and use’ subjects. A number of testing conferences I have seen focus more on the academic side and papers on testing, which, while interesting, do not help improve your work today.

I wonder if the practitioner exam is any better?


GUI Automation – A waste of time? (Potentially first of many posts)

Alan Page, a Microsoft Tester, recently posted about GUI Automation and I just wanted to provide my view. Alan’s main comment was:

“For 95% of all software applications, automating the GUI is a waste of time.”

Bold statement, but one I have to agree with – to a point.

Personally, I think we should automate UIs; however, we need to be very clear and careful about what we automate and the tools we use. The most important parts of UI testing are how you structure your tests and knowing what you are aiming for with them.

Generally, there are two types of automated testing tools. The first is the ‘record and playback’ approach, using tools such as QTP (HP QuickTest Professional). These tools have been around for a long time and, from my point of view, have generally given automated GUI testing a bad name.

QTP records all the user’s interactions with the application (web or desktop) and scripts them out, using a scripting language built on top of VBScript, into a file for playback.

Sadly, the application doesn’t support sharing of steps, which means you have to repeat the same actions within each test. If a repeated action changes, you need to delete and recreate all of your existing tests – something which is really frustrating when you are just about to release but need to change the name of a button. This causes a lot of waste.

The other approach is more programmatic, where you program against an object model which interacts with the SUT (System Under Test). For example, with WatiN, the html <a> element turns into an Ie.Link object. I’ve spoken before about WatiN and Project White; two of the talks I suggested for DeveloperDay are based on these topics.

As mentioned in those posts, these frameworks allow you to interact in a more domain-driven way. They allow you to write tests in a reusable way using your application’s terminology, making them much more readable and maintainable. However, from my experience it’s still relatively slow to start creating the tests. Once they are created, it’s fine, but identifying the GUI controls and how to interact with them can take a while.
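As a rough sketch of the style this enables – the URL and element names below are invented for illustration:

using NUnit.Framework;
using WatiN.Core;

[TestFixture]
public class SearchSmokeTests
{
    [Test]
    public void Searching_shows_a_results_page()
    {
        // WatiN drives a real instance of Internet Explorer (and needs an STA thread).
        using (var ie = new IE("http://localhost/myapp"))
        {
            ie.TextField(Find.ByName("query")).TypeText("ironeditor");
            ie.Button(Find.ByValue("Search")).Click();

            // The Ie.Link / Ie.Button style object model keeps the test readable.
            Assert.IsTrue(ie.ContainsText("Results"));
        }
    }
}

Because the test talks to an object model rather than recorded coordinates, renaming a button means changing one Find clause rather than re-recording every test.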

Are these tests worth writing? Or are they just a waste?

That really depends on your aim. Both approaches will get you good test coverage, but the programmatic approach will produce much more maintainable tests, which results in less waste.

One approach which I have recently been thinking about is having a set of core smoke tests, created using WatiN/White, focusing on the core functionality of the application. These tests are just to ensure that the happy path through the application works: if someone picks up the latest build, will they be able to use it? Or when it launches, is the first thing a user sees an unhandled exception?

For example, with Windows Live Writer, the test would create a new post, write some text, click publish and then verify the blog post was published. If that breaks, I really want to know, and as soon as possible. I guess this is the 5% of UI tests.
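As a sketch of what such a smoke test might look like using White – the executable path, window title and control identifiers below are invented, and White’s exact namespaces vary between versions:

using NUnit.Framework;
using White.Core;
using White.Core.UIItems;
using White.Core.UIItems.WindowItems;

[TestFixture]
public class WriterSmokeTests
{
    [Test]
    public void Publishing_a_post_happy_path()
    {
        // Launch the real application and drive it through the happy path.
        Application app = Application.Launch(@"C:\Program Files\Windows Live Writer\WindowsLiveWriter.exe");
        try
        {
            Window window = app.GetWindow("Windows Live Writer");

            window.Get<TextBox>("postBody").Text = "Post from automated smoke test";
            window.Get<Button>("publishButton").Click();

            // Deliberately coarse check: the application is still alive.
            Assert.IsFalse(window.IsClosed);
        }
        finally
        {
            app.Kill();
        }
    }
}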

The rest of the GUI – the 95% where UI automation is generally waste – is automated using a similar approach to testing an API/business logic layer. You are still automating the behaviour behind the UI, but you are not interacting directly with the UI itself; the UI itself is covered by manual test cases, exploratory testing and generally making sure the application is usable – something which you can only do manually. I want to cover this in more detail in a later post, but the CompositeWPF (Prism) project from Patterns and Practices is a great start.
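As a hand-rolled sketch of the idea – all of the types below are hypothetical stand-ins for your application’s presenter and services:

using NUnit.Framework;

// View contract the real UI would implement.
public interface IPostView
{
    string Title { get; set; }
    string Body { get; set; }
}

public class FakePostView : IPostView
{
    public string Title { get; set; }
    public string Body { get; set; }
}

public interface IBlogService
{
    void Post(string title, string body);
}

public class FakeBlogService : IBlogService
{
    public bool ReceivedPost;
    public void Post(string title, string body) { ReceivedPost = true; }
}

// The presenter holds the behaviour we want covered by automation.
public class PostPresenter
{
    private readonly IPostView view;
    private readonly IBlogService service;

    public PostPresenter(IPostView view, IBlogService service)
    {
        this.view = view;
        this.service = service;
    }

    public void Publish()
    {
        service.Post(view.Title, view.Body);
    }
}

[TestFixture]
public class PostPresenterTests
{
    [Test]
    public void Publishing_a_post_sends_it_to_the_blog_service()
    {
        // No browser, no window handles - just the logic behind the screen.
        var view = new FakePostView { Title = "Hello", Body = "World" };
        var service = new FakeBlogService();
        var presenter = new PostPresenter(view, service);

        presenter.Publish();

        Assert.IsTrue(service.ReceivedPost);
    }
}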

Together, this should provide you with confidence that the application works as you expect and want, in the most maintainable way. If you can’t test the 95% using the correct patterns, then you might need to look at using WatiN/White for more than just the core functionality.

As I said, this is just one approach, which has advantages and disadvantages.

UPDATE: I think I might have been wrong with my percentages – in fact, ignore the percentages. There should be a bigger ratio of tests using White/WatiN, but they should still only focus on core functionality.

My main point was that these tests shouldn’t attempt to cover all possible inputs and outputs; their aim should be to give you confidence that your application will work on a given platform. In order to have this confidence, you might want more tests, which is fine as long as you are structuring your tests correctly and using frameworks such as WatiN.


Bad Software – Google Ads – They have a time and place

Google Ads (AdSense) has a time and a place on websites. For example, it’s a great way for bloggers to earn some additional revenue to cover hosting costs, and many companies benefit from its advertising model.

This morning I was searching around for some test tools and was directed to one particular site. After landing on the company’s website, which supposedly sold software testing tools, I was greeted with a large banner, followed by a large block of Google ads, followed by some content.


I’m not sure how many other websites are doing this, but if you are attempting to sell something to a visitor of your site, what is the point in having Google ads? If you have to use Google Ads to supplement the income from your product, I think you need to look again at your product.

When selling products on your website, focus on the user’s experience, making sure the information is clear and easy to understand. Make sure nothing gets in the way of users being able to find this information – that is how you earn your money.

Why attempt to earn a few additional dollars on the side by ruining this experience and making your core message more difficult to find, losing potential customers in the process?

Needless to say, I didn’t stop long on the site.


Alt.Net.UK – Post Conference

This weekend was the Alt.Net.UK conference. Personally, I had a great time, met some great people and had some great conversations. However, one thing which surprised me was the level of automation and testing topics. While I proposed a few sessions on testing (as I would), other people raised great discussions around this area. To my surprise, they weren’t just about TDD/BDD but focused more on Acceptance Testing and how we can improve the quality of software. This is something I wasn’t expecting, and I think it’s a sign of things to come, with developers and testers working as a single unit.

The discussions have definitely got me thinking, and hopefully others who attended too. There are clear issues within software development which are causing pain; the question is – how do we solve these problems?


Thank you to everyone who attended – without you the conference wouldn’t have worked – and also to our great sponsors. While I felt it was a successful conference, there is still room for improvement, which we will take into account next time.

If you want to know more about Alt.Net and some of the topics, a good starting point is the Foundations of Programming e-book.


Using MSBuild to create a deployment zip

Automated builds are one of the fundamental musts for software development. However, your build doesn’t just have to build the solution and execute your unit tests. For IronEditor, my build also creates two zip files. One zip is the output from the build, for archiving purposes; the second is my deployment zip – the zip which actually gets pushed up to CodePlex, containing only the files required by the application. In this post, I will cover how you can get MSBuild to zip your build output.

To use zipping functionality within your build scripts, you need the MSBuild Community Tasks, a great collection of MSBuild extensions and a must-have if you are using MSBuild.

In order to zip your files, you need to specify which files you want to zip. In the script below, I create an ItemGroup called ZipFiles; this includes all the subdirectories (**) and files (*.*) from my Release directory, which is my build output folder. I also specify that this group should not include any other zip files. I then create a Builds directory if it doesn’t already exist. Finally, I use the Zip task, passing in my ZipFiles ItemGroup so the task knows which files to include.

<!-- Requires the MSBuild Community Tasks import for the Zip task -->
<Target Name="ZipBuildOutput">
  <ItemGroup>
    <ZipFiles Include="$(BuildDir)\Release\**\*.*"
              Exclude="$(BuildDir)\Release\**\*.zip" />
  </ItemGroup>

  <MakeDir Directories="$(BuildDir)\Builds" />

  <Zip Files="@(ZipFiles)"
       WorkingDirectory="$(BuildDir)\Release"
       ZipFileName="$(BuildDir)\Builds\IronEditor-Build-$(Version).zip"
       ZipLevel="9" />
</Target>

The most important property is the WorkingDirectory; this is the root directory where all the files you want to zip live. If you don’t have this set correctly, your zip file will contain additional directories which have to be navigated through to get to your actual files, which just looks rubbish.

My deployment zip target looks very similar and is executed after the above target. The only difference is that I individually specify which files and directories to include. For some directories, such as Config, I still include all the sub-directories and files they contain, as they will all be relevant and required.

<Target Name="CreateDeploymentZip">
  <ItemGroup>
    <!-- One Include per file the application ships with; the entries below are illustrative -->
    <DeploymentFiles Include="$(BuildDir)\Release\IronEditor.exe" />
    <DeploymentFiles Include="$(BuildDir)\Release\Config\**\*.*" />
  </ItemGroup>

  <Zip Files="@(DeploymentFiles)"
       WorkingDirectory="$(BuildDir)\Release"
       ZipFileName="$(BuildDir)\Builds\IronEditor-$(Version).zip"
       ZipLevel="9" />
</Target>

One thing which tripped me up: although each ItemGroup is created within a target, it actually has global scope. As such, the groups in the two different targets need to be given different names (ZipFiles and DeploymentFiles above).

Once my script has executed, I have two zip files created – one containing everything, the other ready to be released on CodePlex.



Overriding MSBuild variables via command line options

While working on a build script for the Pex Extensions project, I wanted to be able to specify the version of Pex which the project was built against, for use within the AssemblyInfo and file names. This build script is executed manually in order to create all the zip files to upload, but I didn’t want to have to manually edit the script to set the value – I wanted to provide the version as a command line argument when I executed MSBuild.

After a bit of searching on MSDN, I found that you can override any variable created within a PropertyGroup. At the top of your build script, you generally define your variables within a PropertyGroup to specify paths, versions etc. Within my build script, my PropertyGroup looked like this:

<PropertyGroup>
  <BuildDir>$(MSBuildProjectDirectory)</BuildDir>
  <PexVersion>0.6</PexVersion>
</PropertyGroup>
Using the /p: command line switch, I can override the PexVersion property by executing the script as follows:

> msbuild Pex.msbuild /p:PexVersion=0.6.30728.0

If I forget to set the argument, the value set within the script will be used.


Continuous Integration Builds for CodePlex Projects

As many of you might be aware, I maintain the IronEditor project on CodePlex. I like CodePlex: I find the wiki and issue tracking useful, but the directory structure and searchability of projects on the site most useful of all. However, the one thing which I wish it could offer – and this goes for Google Project Hosting as well – is a continuous integration server. I would like to know that, after I have pushed my changes to CodePlex, they do actually compile.

As I’ve posted before, I run TeamCity on my home machine, as it is free and easy to configure. Since TeamCity can connect to Team Foundation Server (which CodePlex runs under the covers), I decided I would just be able to connect to the CodePlex servers and treat it like any normal source control system.

Sadly, I was wrong. When I tried to connect, I found the following (extremely helpful) error:

[Screenshot of the TeamCity connection error]

If anyone knows the reason why this failed, please leave a comment.

However, all is not lost! CodePlex has created SVNBridge, a small application which runs on your local machine – or, in this case, your CI server. This translates SVN commands into TFS commands, with the result that you can connect to CodePlex using SVN clients, one of which is TeamCity!

Connecting TeamCity via SVNBridge to CodePlex

SVNBridge is a single executable. It lives in the system tray and listens for communications on a particular port, in my case 8081.


In order to connect to SVNBridge, and as such your CodePlex project, you must use a URL of the form SVNBridgeAddress/YourCodePlexServer/YourProject.

For example, IronEditor lives on tfs03.codeplex.com and in order for me to connect via the bridge my URL would be:

http://localhost:8081/tfs03.codeplex.com/IronEditor

With my setup, I simply wanted to be able to click Run and have TeamCity go off, download the latest code from TFS, build my solution and execute my unit tests, just to be sure I hadn’t broken anything (or, in my case, simply forgotten to upload the latest code changes).

In order to connect TeamCity to CodePlex, you simply need to use the Subversion VCS settings and point them at your SVNBridge. Below are my settings for IronEditor. The username and password you need to connect are your website login details.

[Screenshot: TeamCity Subversion VCS settings for IronEditor]

On TeamCity’s dashboard, I now have my new CodePlex build alongside my local IronEditor build; both run the IronEditor.build MSBuild script, they just pull the sources from different locations. After I have pushed my changes to CodePlex, I manually click Run, and off it goes. The reason I don’t have it checking for modifications automatically is that I rarely push changes to CodePlex, and as such I’m happy to be in control of when it happens.


TeamCity is now happily talking to my CodePlex repository, building as and when asked. If you happen to have a spare server, or a Windows Home Server, then you could run this setup from there and have it automatically check for modifications, with all the developers on your project benefiting.


How to have Pex generate NUnit, MbUnit and XUnit tests

By default, Pex will generate MSTest tests; however, that doesn’t mean you are constrained to using MSTest for your project. The Pex Extensions project on CodePlex has a set of extensions which allow Pex to generate code for the three main test frameworks. I’ve just committed some changes to make the code work against the newly released Pex 0.6. In order to generate tests for a particular framework, you need to follow these instructions.

XUnit

1) Reference Pex.xUnit.dll

2) In your test project’s AssemblyInfo.cs, add the attribute [assembly: Pex.Xunit.PexXunitPackage]

3) Pex will now start generating xUnit tests.

NUnit

1) Reference Pex.NUnit.dll

2) In your test project’s AssemblyInfo.cs, add the attribute [assembly: Pex.NUnit.PexNUnitPackage]

3) Pex will now start generating NUnit tests.

MbUnit v3

1) Reference Pex.MbUnit.dll

2) In your test project’s AssemblyInfo.cs, add the attribute [assembly: Pex.MbUnit.PexMbUnitPackage]

3) Pex will now start generating MbUnit v3 tests. An MbUnit extension is also included within the Gallio/MbUnit package.

Note: At the moment, you need to download the source code from CodePlex and compile the binaries yourself; there is a single solution file which will build everything.


You also need to ensure both the extensions and your test project are built against the same version of the unit testing framework; otherwise there will be a type mismatch and Pex won’t be able to generate any tests.
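Once one of the package attributes above is in place, Pex generates its framework-specific tests by exploring parameterized unit tests. A minimal sketch of what that looks like – the class under test and all names here are hypothetical:

using Microsoft.Pex.Framework;

// Hypothetical class under test.
public class Calculator
{
    public int Divide(int a, int b) { return a / b; }
}

[PexClass(typeof(Calculator))]
public partial class CalculatorTests
{
    // Pex explores inputs for this method (finding, for example, the
    // divide-by-zero case) and emits concrete test cases in whichever
    // framework the package attribute selects.
    [PexMethod]
    public int Divide(int a, int b)
    {
        return new Calculator().Divide(a, b);
    }
}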


Very exciting news! Red Gate’s .Net Reflector

Red Gate announced today that, under a new agreement, it will be responsible for the future development of .NET Reflector, the popular tool authored by Lutz Roeder.

Reflector is one of the must-have tools if you are a .Net developer or tester, allowing you to really understand software and APIs in a unique fashion. Personally, I always have a copy of Reflector open on my desktop, and without it .Net development would be very different!

If you are interested in knowing more about this agreement, I recommend you read the interview between James Moore and Lutz Roeder on Simple Talk. The most important part: “Red Gate will continue to offer the tool for free to the community.”

If you haven’t tried Reflector yet, I really recommend you download it. It could change the way you develop .Net applications – http://reflector.red-gate.com/
