Are we afraid of test automation?

After spending the last nine months working on a test automation implementation for a team I do not run myself, I have started to wonder what the main success factors for test automation are, besides the obvious one: achieving the goal you set out with in the first place.

One of the main factors that will help test automation succeed in your organisation is to not treat test automation as a goal in itself, but as a means to achieve a goal. The goal should not be something like “have x% of regression automated”; it should be something test automation can help you reach, for example freeing up time so testers can focus on important things rather than spending most of their time on regression testing.

Another very important thing to keep in mind when implementing test automation is to guard against trying to automate everything. Technically it may be possible, but it is hardly ever a good idea. Quite a few things require human intervention or human interpretation and cannot simply be reduced to a Boolean (which is what an automated check is in the end: something is either true or false).

Look & feel testing, or validation, for example, should in most cases not be automated, for the simple reason that, although it is possible, it will more often than not raise false positives or, more likely, false negatives. Since these tests often rely on some form of image recognition and comparison, a change of screen resolution or a small CSS change (a different font, for example) will make the test fail, resulting in either extra maintenance or tests being switched off as redundant.

For me, however, the main sign of success is that the automated tests are actually used and actively maintained.

Having a test automation framework and relying on it to do its job is not good enough. Just like any other software, automated tests want some TLC, or at least some attention and maintenance.

Something that is still often seen in test automation is that the tests run but fail with non-deterministic errors, in other words errors that are not really caused by the test cases themselves but are definitely not bugs in the system under test either.

If you show your automation suite some tender loving care, these errors will be spotted, investigated and fixed. More often, however, the errors are spotted, someone “quickly” attempts a fix and fails, after which the “constantly failing test” is deemed useless and switched off.
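In my experience one frequent cause of these non-deterministic failures is timing: the test reads or clicks an element before the page has finished updating. As a hedged sketch (the URL and element ID below are made up), this is roughly what replacing a direct lookup with an explicit wait looks like in a C# WebDriver test:

    using System;
    using OpenQA.Selenium;
    using OpenQA.Selenium.Chrome;
    using OpenQA.Selenium.Support.UI;

    class FlakySearchCheck
    {
        static void Main()
        {
            IWebDriver driver = new ChromeDriver();
            driver.Navigate().GoToUrl("http://example.com/search"); // made-up URL

            // Brittle version: assumes the result counter is already on the page.
            // IWebElement count = driver.FindElement(By.Id("resultCount"));

            // More robust: wait up to 10 seconds for the element to actually appear.
            var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(10));
            IWebElement count = wait.Until(d => d.FindElement(By.Id("resultCount")));

            Console.WriteLine(count.Text);
            driver.Quit();
        }
    }

A small bit of attention like this is often all it takes to turn a “constantly failing test” back into one people trust.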
Besides non-deterministic errors there is another thing I have seen happen a lot in the past.

Some automation engineers spend a lot of time and effort building a solid framework with clean and clear reporting. They add a lot of test cases, connect the setup to a continuous integration environment and make sure they keep all tests working and running.
They then go ahead building more and more tests, adding all kinds of nice new tests and possibilities. What often gets forgotten, however, is to involve the rest of the department: they do not show and tell to the (manual?) testers, they do not share with the developers. So the developers keep their unit tests to themselves, and if they do any functional testing they do it manually and sloppily. The (manual) testers go about their usual business of testing the software, as they have always done, not realising that they could use and abuse large parts of the test automation suite to make their lives easier. They spend time on data seeding, manually. They spend time on look and feel verification across several browsers, manually.

All this time the developers and (manual) testers could have been using the automation framework and the existing tests in it to make their lives a lot easier.

While writing this down it starts to sound silly and unlikely to me, yet I have seen it happen time and time again. What is it that makes developers, but especially testers, afraid of or hesitant to use automated tests?
I love using test automation to make my life easier when doing manual testing, despite having a very strong dislike of writing code.

Test automation should be seen as a tool; it is there to make life a hell of a lot easier. It cannot replace manual testing, but it can take away the repetitiveness and tediousness of manual testing. It can give you an idea of what state a system is in before you even start testing, and it can also help you, a lot, in getting the system into all kinds of states!
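To make that last point a bit more concrete, here is a hedged, made-up sketch of what reusing automation for setup rather than checking could look like: a small C# WebDriver helper that seeds a handful of accounts before a manual test session, instead of clicking through the registration form by hand (the URL, field IDs and values are all hypothetical):

    using System;
    using OpenQA.Selenium;
    using OpenQA.Selenium.Chrome;

    // Hypothetical helper that reuses the automation tooling for data seeding:
    // it creates a few accounts so a manual session can start from a known state.
    class TestDataSeeder
    {
        static void Main()
        {
            IWebDriver driver = new ChromeDriver();

            for (int i = 1; i <= 5; i++)
            {
                driver.Navigate().GoToUrl("http://example.com/register"); // made-up URL
                driver.FindElement(By.Id("username")).SendKeys($"manual-test-user-{i}");
                driver.FindElement(By.Id("password")).SendKeys("Secret123!");
                driver.FindElement(By.Id("register")).Click();
            }

            Console.WriteLine("Seeded 5 accounts for the manual test session.");
            driver.Quit();
        }
    }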

Where are the technically inclined testers?

–Edit–

This post would have been better had it been called something like “Where are the technically curious testers”, as Anna Baik pointed out. The wording is not perfect.

//Edit//

Over the years I have worked with a whole range of testers, both functional and technical, who have shown different levels of comfort with the technology under test. Some were good, some were great and others I simply never want to work with ever again. The latter ones, to me, are a problem we are currently facing in software testing. There seem to be too many software testers who lack an understanding of how software is actually built and how the internals of a program work.

I have never understood how one can work in technology, or to be more specific in information technology, and not have at least a basic grip on how the things you are supposed to test work. Even when you are purely focussed on functional testing, I really do not understand how you can not grasp the basics this entire industry is built on.

How come that even now, when it is becoming more and more clear that software testing is an important specialisation within the IT industry, we still see a lot of testers with no technical knowledge whatsoever? I realize that historically there was the idea that “anyone can be a software tester”. I am just not convinced that statement still holds true.

Take, for example, a building inspector; that is also a tester of sorts, no? He verifies whether all the rules of proper architecture have been followed, whether the design has been implemented as promised, and so on. Imagine you are having your own house built and this building inspector comes by. You pin your hopes on this person to validate that your house is what you want it to be: warm, safe and as you requested.

This building inspector, upon arriving, grabs his Excel sheet with the requirements for the house and ticks off the ones he can verify without needing any knowledge of architecture, or of constructing a house for that matter. He then hands you this checklist and tells you he has functionally covered everything about your house.

Considering that you live in an area with lots of snowfall in winter, you ask this person whether the roof can cope with a load of snow on it and whether the heating will not collapse under the strain of having to heat the entire house during a snowstorm.

He says: I don’t know, I am neither an architect nor a construction worker, I am merely the building inspector. I just go through this list of things I need to check; why would I know anything about building a roof or what the heating can take? You should probably ask the construction workers what they think.

Would this give you faith in his verdict that the house is indeed what you wanted?

I believe not, yet this is a practice I see happening fairly often in software testing. So where does it go wrong? How come there is still this tendency to believe there is no need to understand software in order to test it?

How often do you see the same bug come by when testing an application? Once you see a bug more than once, in a very similar form, you should be able to come up with the idea that there might be an underlying issue. Instead, what I often see is that a new bug is created in the bug tracker for every occurrence. To understand that there quite likely is one underlying problem, the tester does not need to know how to program; you should, however, have a basic idea of how a program is (or should be) built up, and if you are not certain whether it might indeed be one and the same issue in the code, how about talking to the developer?

When exchanging ideas and thoughts with other testers on Twitter, forums and the like, I quite often see an amazing lack of knowledge in this area. To make things even worse, a fair share of testers seem to have a degree (bachelor or master) in computer science yet have no clue what, for example, a regular expression is. Is it just me, or is this indeed worrying? When reading through the computer science curriculum of an average university here in the Netherlands, I see all kinds of interesting subjects and descriptions that would lead me to believe basic programming is part of what you get taught; however, when talking to the graduates that end up in software testing I see nothing of that knowledge.

Where are the technically strong testers? The ones who can have a discussion with developers about how the structure of a program was set up, who can tell a developer that the SQL query he wrote is extremely inefficient? I know I see some of them online; these are the testers I enjoy following on Twitter and on blogs, but there must be more than these happy and noisy few. Where are they hiding? In my experience there are not enough of them, at least not in the Netherlands.

— Edit–

There is a nice article dealing with similar questions and frustrations on testnieuws.nl: http://www.testnieuws.nl/2011/06/06/tester-praat-ook-eens-met-een-ontwikkelaar/ (sorry, it is in Dutch; if you do want to attempt it, have a look at the Google-translated version).

–Edit 03/11/2011 —

Really nice to read Elisabeth Hendrickson’s article on a similar subject, but from a different point of view: http://testobsessed.com/blog/2010/10/20/testers-code/

I am sad to see, though, that there are people commenting on her article calling QTP and especially Selenium basically record/playback tools. If you have ever used either, you know that QTP is a hell of a lot more than just record/playback, and Selenium is clearly NOT a record/playback tool (unless you mean the Selenium IDE rather than the entire Selenium toolset).

Educational reflections of a fairly successful tester

Experience in any field comes from learning and perfecting the skills and knowledge you apply in many different areas. Working in the software testing industry has taught me a lot. Over the years I have gained quite some programming skills, analytical skills and people management skills, learned a lot about all the different testing theories, read a boatload of books about testing, programming, test automation and quality assurance, heard a lot of people speak about these subjects and gained quite a broad general knowledge of testing, or so I hope. All of this knowledge I apply, fairly successfully, on a daily basis in my job.
So what would it take for me to go from fairly successful to very successful? Do I need to gain some more specific knowledge? If so, what exactly?

I love learning new things in general, and things related to my job in particular. I enjoy studying to get new ideas and have my mind challenged, making space for the most unexpected and fresh approaches, perceptions and solutions.

Working for Polteq gives me the advantage of an employer that provides quite a lot of training courses I can sign up for. Unfortunately, when going over that list there is not all that much I would want to study at this point. I have studied books on TMap, ISTQB, ITIL, Prince, TQM and many other subjects; why would I now all of a sudden need to do courses in them?
Agreed, they might give me some new ideas, but I believe they will not really challenge my mind, especially since the courses are mainly oriented towards gaining certifications in these subjects.

A while ago I needed to hand in an overview of my “educational needs” to my manager, so the company could see how many people have desires in a similar direction and thus which courses they can arrange with a fair-sized group. My answer to this request went something like this:

I feel a need for further education; however, at the moment I have no clear idea in which direction I would want or need this education to go.

I am inclined, however, to say that my education should be oriented more towards gaining skills, both testing skills and soft skills, given the kind of work I am being inspired by at the moment. This will help move test automation within Polteq and the testing community to the next level, which is what I am aiming for in my professional life at the moment.

Is it more difficult to learn something new as an adult than it was when we were kids?

Over the last three years I have continually been amazed watching our son learn new skills. Learning seems to come so naturally to kids. They go about it with extreme ease and are absolutely not dissuaded by initial failure, or even by repeated failure; instead, they change their approach and try again, but now from a different angle or point of view (usually literally a different angle or point of view).

This has kept me thinking: why is it so difficult for adults to learn something new?

Experiential learning

Experiential Learning Model

Kids have a big advantage when trying to learn things: they (often) look at them for the very first time. This helps them not to worry about what something is supposed to do, but instead to figure out what it actually does. Whether that is the intended function or not, the child learns what the object can and cannot do in a rapid and fun way.

Whenever we adults encounter something new, we will always try to draw from the past, look for something that may have been similar and go forward based on assumptions rather than intuition. In quite a few cases this helps, but in just as many cases I believe it hampers us. Most adults have a form of built-in “best practices” which they will use when they encounter something new (as described in the concept of experiential learning).

So how can we, as software testers, try not to use our default boxed-in thinking? Is there a way to break through the barriers experience builds and look at something as if you are truly seeing it for the first time?

One way I try to ensure I do not get blocked by my knowledge when starting on a new piece of software is by exploring it as I believe my child would. Rather than guessing what it is supposed to do, I try to figure out what I can do with it.

Fairly soon, however, I tend to run into the feeling that I have seen something like it before, or the feeling of recognizing a pattern, which then triggers all kinds of things I have learned in the past. So how do you get past this? Or should you even want to get past it?

The fact that our mind works in boxes seems a limiting factor, but apparently these boxes come fairly naturally to the human mind. One way I try to achieve the so-called “thinking outside of the box” is to expand the boxes in my head to contain several different boxes.

In the example above, where I start recognizing a pattern or have the feeling I know what I am looking at, I try to combine the two feelings, or hunches, and through that push my mind onto a whole new trail of thought, thus breaking out of the original boxes and shoving all of that into a bigger one. On top of that I try to change the context in which I know these patterns, transplant the feeling to my current situation and see how it could apply here, since I am dealing with a different situation, with different circumstances: different data, a different programming language, different developers and designers, and quite likely a totally different objective for the software I am looking at.

There have been all kinds of studies and theories that should help one think outside of the box, but what seems to work for me is a combination of things:

  • try to apply the knowledge I gain from reading books (and I read a fair amount) to a practical situation
  • try to combine the new experience and the “learned” responses into their own, separate context of the here and now (e.g. the context in which I am currently working)
  • try to find several points of recognition or deja-vu if you will, and mash these ideas together into a whole new thing, within the current context
  • and to top it all off: I try to think of what my son would do to figure out what this object (or piece of software) can do for him.
This approach may not always be the most efficient, or for that matter the most effective, but it does ensure that whenever I try to learn something new, new software I am testing for example, I expand my horizons a bit further beyond just the object under investigation. I quite often even gain new insights into how I could have tackled a previous problem more effectively, which then get stored in memory and at some point get reused yet again.
The virtuous circle of learning is great, and finding new ways to expand your knowledge and new ways of learning things is even greater!

Getting a junior up to speed on test automation with FitNesse

Last week we had the privilege of having a junior test engineer working with us for a few days to see what it would take to get him fully up and running with test automation as we have implemented it at our customer.

Our intern, as I introduced him at our client, has a solid education in civil engineering but lacked any kind of knowledge of test automation or of writing software. He had just finished his first project as a junior tester, which was testing a data warehouse migration.

Motivation

What drove us to try this was simple: curiosity, and questions about my personal coaching and explaining skills. Thus far I had the feeling that I had somewhat failed in showing anyone who is not part of a management group how easy our FitNesse setup with its custom fixture is to use. With this engineer I wanted to find out whether this was me not explaining things clearly or people not properly following what I explained (i.e. me explaining things in the wrong wording).

Starting point

Our “intern”, as said, has little to no hard IT knowledge. He is one of the junior test engineers who came out of the group of trainees Polteq Test Services hired at the beginning of the year. With his degree in civil engineering he is certainly a smart guy, but considering he had never been on the construction side of software, he had some way to go.

Considering that he had no prior knowledge of either the FitNesse/WebDriver setup or the environment we are working on, we started with a morning session of explaining, flooding him with information by answering the following questions.

  • What do the applications under test do?
  • What is the service oriented architecture underneath?
  • How does this all tie together into what you see when you start using the applications?
  • What is FitNesse?
  • What is WebDriver?
  • How do these two work together?
  • What is the architecture of the Fixture?
  • What is a fixture and how does that relate to WebDriver and FitNesse?

After this session he was set to work with FitNesse. The main thing that slowed him down somewhat was getting used to the syntax as we enforce it through our custom Slim fixture. At this point he still had only a basic knowledge of what the application under test does or is supposed to do. Based on the existing functional test cases he managed to automate a set of test cases fairly rapidly, or more specifically, to transform them from basic functional descriptions into Slim tables that run successfully as tests.
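To give an idea of what such a transformation looks like, below is a made-up example (not one of our actual tests) of a functional step, "search for a product and check that results appear", written as a FitNesse Slim script table against a hypothetical Advanced Search page:

    !|script|advanced search page|
    |open search page|
    |enter search term|bicycle|
    |submit search|
    |check|number of results|3|

Each row maps to a method on the underlying fixture (open search page becomes OpenSearchPage, and so on), which is why the step from a functional description to a runnable test is so small.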

The result

Writing the test cases was easy work for him; he picked up the basic syntax really fast and managed to pump out some 15 working tests in a very short period. It was time for a bit of a challenge.

Considering he had never written a line of code in his life, I thought we might as well see how fast he would pick up writing a simple wrapper in C# around an Advanced Search page, which contains a set of dropdowns, text fields, radio buttons and checkboxes that can be manipulated, along with a submit and a reset button.

We wrote the first two methods out together, him typing while I explained what is what: why you want the method to be public, why it needs to be a bool, what arguments are and how you deal with them in FitNesse, where you find the identifier of the object you are trying to wrap, what you do when there is no ID, and how you extract an XPath and make that work well. Once we got through the first few methods I set him to work to figure it out for himself.
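To illustrate the kind of methods we wrote, here is a hedged sketch of what a couple of them could look like; the class name, element IDs and XPath are made up for this post and are not our actual fixture code:

    using OpenQA.Selenium;

    // Hypothetical wrapper around the Advanced Search page, called from FitNesse
    // through the Slim fixture; method names map to Slim table cells
    // (e.g. "enter search term" maps to EnterSearchTerm).
    public class AdvancedSearchPage
    {
        private readonly IWebDriver driver;

        public AdvancedSearchPage(IWebDriver driver)
        {
            this.driver = driver;
        }

        // Public and returning bool so FitNesse can mark the step as pass or fail.
        public bool EnterSearchTerm(string term)
        {
            IWebElement field = driver.FindElement(By.Id("searchTerm")); // made-up ID
            field.Clear();
            field.SendKeys(term);
            return field.GetAttribute("value") == term;
        }

        public bool SubmitSearch()
        {
            // This button has no ID on the made-up page, so fall back to an XPath.
            driver.FindElement(By.XPath("//form[@id='advancedSearch']//input[@type='submit']")).Click();
            return true;
        }

        public int NumberOfResults()
        {
            return driver.FindElements(By.ClassName("search-result")).Count;
        }
    }

The wiring of the WebDriver instance is simplified here; the sketch is only meant to show the public/bool pattern and the By.Id versus XPath choice discussed above.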

The first question I received after a while was: ok, so now I’m done writing these things out in code, then what? How can I see this working in FitNesse? After I made an extremely feeble attempt at explaining what compiling is and decided to just show him where the compile button is, he set to work verifying in FitNesse that his code is indeed capable of reaching and manipulating every element on the search page and of getting to a sensible search result.

Takeaway

What did I learn this week? For starters, that with close enough coaching it is very simple to get someone without experience up and running with FitNesse the way we have set it up, which is good to have confirmed again, since that was the aim.

Another thing we have seen proven is that adding new methods to the fixture is dead simple, and changing the IDs of objects on the pages should not lead to too much hassle in maintaining the fixture. For the base classes quite a bit of developer knowledge is still required, but once that part is in place, expanding the testable part can be done with some coaching. So technically we should start handing over maintenance of our base classes to someone within development and hand off maintaining the rest of the fixture to someone within the test teams here.

One of the things we might consider to make maintenance easier is to split the leaf nodes, i.e. the page level, off from the base and helper classes so that the two are completely independent of one another. That way the developer can add to and refactor the base without breaking current functionality, and once the refactoring or additions are done, the tests can talk to the new DLL.
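In code, that split could look something like the sketch below (all names and namespaces are hypothetical): the base and helper classes live in an assembly that development owns and can refactor freely, while the page-level classes sit in a separate project maintained by the test team, referencing whichever version of that DLL is shipped.

    // Assembly 1, e.g. "Fixture.Base.dll", owned and refactored by development.
    namespace Fixture.Base
    {
        using OpenQA.Selenium;

        public abstract class PageBase
        {
            protected IWebDriver Driver { get; }

            protected PageBase(IWebDriver driver)
            {
                Driver = driver;
            }

            // Shared helper that page classes build on.
            protected bool Click(By locator)
            {
                Driver.FindElement(locator).Click();
                return true;
            }
        }
    }

    // Assembly 2, e.g. "Fixture.Pages.dll", maintained by the test team.
    namespace Fixture.Pages
    {
        using Fixture.Base;
        using OpenQA.Selenium;

        public class AdvancedSearchPage : PageBase
        {
            public AdvancedSearchPage(IWebDriver driver) : base(driver) { }

            public bool SubmitSearch()
            {
                return Click(By.Id("submitSearch")); // made-up ID
            }
        }
    }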

Maybe I am getting carried away with making things modular now though…

Overall, it is good to see that our idea of making things easy to transfer does indeed seem to work well, although I do not want to say that this one week was enough to hand over everything, of course!

Based on this week I have started explaining things to the test team internally, which does indeed seem to be an improvement. I do believe this week gave me a chance to play around with the ways in which I explain things, especially on a non-technical level.

Erwin, thanks for being here and listening to my explanations, following instructions and asking questions! It was a joy working with you this week.