Testing is a craft that can be performed in many different ways. There are several approaches to defining all the ways, styles or schools of testing, but most of these approaches are one-dimensional or at most two-dimensional. Reality, of course, is different.
All these classifications are only models, representations of one or a few aspects. And all these representations are valid - from their specific point of view. One of the most common models lists scripted testing at one end and exploratory testing at the other; but in this classification it is hard to fit disciplines like performance testing.
So performance testing does not fit this model. In performance testing you will always have a script - or rather at least one script, probably multiple scripts or a whole framework of scripts. But that does not prevent you from testing exploratorily, utilizing different combinations of scripts, configurations and test data. This concept is not widespread yet, and many people associate performance testing with the classical load-testing approach used to validate performance requirements. It is seldom used during a development cycle to give feedback about actual behaviour in edge situations; but since performance requirements are seldom strict requirements and depend on many other system variances, an exploratory approach is even more important here than for functional testing. It enables performance engineering, including tuning, optimization, troubleshooting and bug fixing. It might even give you the ability to define performance specifications in the first place.
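Such an exploratory performance session can be as simple as looping over combinations of configuration values instead of replaying one fixed script. A minimal sketch in Python, where `process` is a hypothetical stand-in for the system under test:

```python
import itertools
import statistics
import time

def process(batch_size, workers):
    """Hypothetical system under test; simulates work that grows
    with batch size and shrinks with parallelism."""
    time.sleep(0.001 * batch_size / workers)

# Exploratory run: measure each combination of configuration values
# and keep the median timing, so surprising combinations stand out.
results = {}
for batch_size, workers in itertools.product([10, 50], [1, 5]):
    timings = []
    for _ in range(3):
        start = time.perf_counter()
        process(batch_size, workers)
        timings.append(time.perf_counter() - start)
    results[(batch_size, workers)] = statistics.median(timings)

for combo, median in sorted(results.items()):
    print(combo, f"{median:.4f}s")
```

The interesting part is not any single number but comparing the combinations - that comparison is what drives the next exploratory question.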
In light of this, the concept of a scripted test no longer seems to fit performance testing. Given this new viewpoint on performance testing, what would be the opposite of the exploratory context? Regression testing fits best.
In regression testing we want to make sure that we don't introduce regressions in further iterations of the product lifecycle. We want this to be as quick and cheap as possible, so we use test automation (whenever possible) for this task. I often refer to this type of testing as checking. For a new system we would use exploratory testing to gain insights about system behaviour, which gives us the ability to define the scripted tests we will use as regression tests in later iterations.
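Such a scripted check can be very small. A minimal sketch, with a hypothetical `discount` rule standing in for behaviour that an earlier exploratory session pinned down:

```python
# Minimal scripted regression check. The discount rule and its boundary
# values are illustrative assumptions, not taken from any real system.
def discount(order_total):
    """Hypothetical function under test."""
    return 0.10 if order_total >= 100 else 0.0

def check_discount_regression():
    # Checking answers one binary question: does it still behave as before?
    assert discount(100) == 0.10   # boundary value found during exploration
    assert discount(99.99) == 0.0
    assert discount(250) == 0.10

check_discount_regression()
print("all checks passed")
```

The check is cheap to re-run in every iteration, which is exactly why it qualifies as checking rather than testing.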
With this definition, the different testing approaches look complementary instead of trying to beat each other. Questioning one approach or favoring the other is flawed from the beginning.
Another example where you can use both approaches complementarily is a highly modularized, configurable system consisting of several software components. Automated component tests can be supplemented by exploratory system testing focused on system configuration and component interactions.
That said, the classification into regression and exploratory testing is of course not complete! There are various other points of view on testing, and none of them is the ultimate one. Relying on only one strategy, however, should be avoided.
When it comes to testing mobile devices, one question becomes relevant sooner or later:
What devices do I need to test? What configurations (OS, browser versions) do I need to test?
One good approach is to ask marketing, business development or whoever is responsible what the target market is (or what they promised the customer). Certainly don't trust your device pool to be representative. Another good approach is to look at statistics on current device distribution. I'd like to recommend two sources:
A weekly updated statistic of iOS device usage: https://david-smith.org/iosversionstats/
Google's Android OS distribution: http://developer.android.com/about/dashboards/index.html
Once again I rebooted my blog. I will start by linking to other content, posts and articles and commenting on them.
I reinstalled Octopress - now in version 3. Again it looks promising, and again it is easy to use. I still have some draft posts from 2012 and 2013 that I might want to pick up again.
Yesterday I facilitated our monthly coding dojo again. The topic was 'Refactoring'. I chose the Gilded Rose kata, which can be found in Emily Bache's GitHub repository. In my opinion it went quite well, but I also learned some things; which of course is a good thing.
The first lesson is not to introduce too many new things; my coding dojo group is quite new to these topics. Some developers are young and inexperienced, just as I am inexperienced in facilitating katas. As suggested in the kata, we used texttest, a tool that compares the output of a test run against a golden master text output. I had installed this tool a month ago and had tried it only once myself. While it worked like a charm for me, it was not so easy for some of my coworkers. We spent over thirty minutes setting up everyone's environment to get everybody started coding. I was also not well enough prepared to give installation and setup instructions.
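The idea behind a golden master check can be shown in a few lines of plain Python, independent of any tool: record the program's text output once, then compare later runs against that recorded copy. The `run_system` function and its output here are illustrative stand-ins:

```python
from pathlib import Path

def run_system():
    # Stand-in for the real program's text output (e.g. the kata's
    # item report); in a real setup you would capture stdout instead.
    return "Aged Brie, sell_in: 1, quality: 2\n"

golden = Path("golden_master.txt")
if not golden.exists():
    # First run: record the current output as the golden master.
    golden.write_text(run_system())
    print("golden master recorded")
else:
    # Later runs: any difference in output fails the check.
    assert run_system() == golden.read_text(), "output differs from golden master"
    print("output matches golden master")
```

This is exactly what makes the approach attractive for refactoring: you get broad coverage without writing detailed assertions first.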
The next lesson for me was to make clear that "refactoring is not reimplementation". Some chose to delete the whole code and reimplement it. That is clearly not the goal of the kata. Out of four implementations, two completely ditched the code and reimplemented the provided requirements. Last night I thought about how to prevent this. One solution would be not to provide the detailed requirements, so that you have to rely on the given implementation. Instead of the requirements, only a rough description of the system would be given.
BUT this would make one of the approaches, writing your own tests, extremely difficult. When I did the kata the first time myself, I did not use the provided texttest setup but implemented some acceptance tests myself to get reasonable coverage of the provided requirements.
Only one of the teams started to implement their own tests. And they got lost implementing very detailed and complicated tests, so they ran out of time without even starting to refactor the code.
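A handful of coarse checks would have been enough. A minimal sketch, assuming a simplified update rule loosely modeled on the Gilded Rose requirements (not the kata's actual code): "Aged Brie" gains quality, everything else loses it, twice as fast once the sell-by date has passed:

```python
# Simplified, illustrative update rule - NOT the real kata implementation.
def update_item(name, sell_in, quality):
    if name == "Aged Brie":
        quality = min(50, quality + 1)   # quality never rises above 50
    else:
        # normal items degrade; twice as fast after the sell-by date
        quality = max(0, quality - (1 if sell_in > 0 else 2))
    return sell_in - 1, quality

# A few coarse characterization checks pin down the behaviour
# without modeling every rule in detail.
assert update_item("Aged Brie", 2, 0) == (1, 1)
assert update_item("Elixir of the Mongoose", 5, 7) == (4, 6)
assert update_item("Elixir of the Mongoose", 0, 7) == (-1, 5)
print("characterization checks pass")
```

Tests at this granularity protect the refactoring without eating the whole timebox.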
I ended the session with a retrospective and a closing circle, where we had very interesting discussions about the difficulties posed by the code. And of course we discussed when to refactor and when to start reimplementing.
There are many explanations out there (and very good blog posts) of what checking is and what testing is.
One thing came to my mind a few minutes ago:
In checking, only the result matters - the answer to the question "Is this working as expected?". Testing is the pursuit of this answer, and it will give you more than just this binary result.
A little bit of history
A few years ago, I was blogging on blog.stefan-weigand.de. The blog was called b.l.o.g. I used various kinds of blogging systems: ExpressionEngine, Movable Type and, in the end, WordPress. I blogged in German, I blogged in English, I blogged about fun stuff on the web, about technology and about personal things.
I stopped blogging in December 2010 - I didn't know what to write about anymore. I thought the web was full of information and of people writing and blogging better than I could. So I stopped.
I didn't consider writing simply because I want to write; because I want to express myself.
Continue or restart
During the last year (and maybe even longer) I have been thinking about different things regarding my work and work-related education as a software developer, software tester and the other roles I take on in these fields.
Along with this thinking, I am developing the need to write about it. I could have continued on my old b.l.o.g., but I don't think it is the appropriate place, as my focus is different now than it was in the years before. So I am restarting.
Which domain to use?
Currently I am still struggling with the question of what domain I should use for this writing project. For now I will use the subdomain octo.stefan-weigand.de, but maybe it will move somewhere else. "octo" refers to the underlying software, Octopress. This might not be wise, but I can't come up with a better name at the moment; so it is as good as any other. This new blog is titled O.C.T.O. This is similar to the naming of b.l.o.g. and simply stands for itself. ;-)
So what will be the topic of O.C.T.O.?
I want to do a separate post on this topic as well. In short, it will be about my work-related life, at the moment about the topics of software development, testing and software craftsmanship. I want to write, but I also want to comment on other blog posts or sometimes simply point to them.
This made my day! Now I can call it a day. :-D
On top of that, new or changed requirements still arrive shortly before the deadline and have to be taken into account!