August 9, 2016

Probe Testing in The Martian

The story from The Martian about testing the probe is worth a post (some scenes and phrases are skipped).

July 8, 2016

Bachelor Thesis "Version Update Automation Using Scripting Language Bash"


I've finally finished the University of Tartu, so I am a Bachelor of Science in Engineering now!

My thesis was about a version updater Bash script that I wrote at work for updating Java applications on different web servers. I once wrote a post about the first version of this script – Scripting For Automated Update (Tomcat 6) – the final version has more features and much more code.

The thesis is written in Estonian and can be found on GitHub – Version Update Automation Using Scripting Language Bash.pdf (or in the UT registry).

The script itself is open source (currently only the Tomcat 8 part) and can be used in other projects. It's located in the GitHub repo – github.com/iriiiina/version-updater; the manual covering installation, configuration and usage is on GitBook – iriiiina.gitbooks.io/version-updater-manual.

And here are the slides from the defence (also in Estonian). I don't know why you might need them, but since we are talking about the thesis, I'll leave them here:

I feel proud that I got an A for the thesis, but I don't have any good feelings about finishing the university itself. Two years ago I wrote a post about Secrets of a Buccaneer-Scholar by James Marcus Bach where I explained my opinion about universities and high schools, so graduating wasn't a priority for me. I decided to finish just so as not to lose the credit I had already earned for completed courses – I had already passed 99% of the programme, so it would be a shame for that time to count for nothing.

June 17, 2016

Bash Scripts for Transferring Files Between Server and Local Computer


Yes, Bash scripts again! Two scripts for copying files from a server to the local computer and vice versa. I usually use them when I want to work with some text files in graphical editors rather than in Vim: I can download the file to my local computer and then upload it back to the server.

./copy-file-from-server.sh [SERVER] [FILE]
./copy-file-to-server.sh [SERVER] [FILE]

Basically, it's the same as using the scp command, but the scripts let you define short names for servers. For example, instead of username@192.168.1.1, username@some.hostname.com or something1-23-45-678-9.eu-east-0.compute.amazonaws.com you can use short names like test, demo, latest etc.

To set your own hosts in the script you need to modify the function setHost(), as sketched below. In the copy-file-from-server.sh script you can also set a fixed directory where you want to download files. Currently it downloads to the same directory the script was run from, but you can change that in the downloadFile() function.
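
For illustration, a setHost() along these lines (using the made-up addresses from above – the real script may structure it differently) maps the short names to full addresses:
function setHost() {
  case "$1" in
    test)   host="username@192.168.1.1" ;;
    demo)   host="username@some.hostname.com" ;;
    latest) host="username@something1-23-45-678-9.eu-east-0.compute.amazonaws.com" ;;
    *)      echo "Unknown server: $1"; exit 1 ;;   # fail fast on an unknown short name
  esac
}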

Also, don't forget about aliases! You can assign a short name to each script and use it from any directory.

June 7, 2016

Script for Sending E-mails About Certain JIRA Issues


"Still life with various Unix shells" by Bartholomeus van der Ast (source)


Recently I wrote a new Bash script that may be useful for others.

./notify-about-issues.sh
It sends an e-mail when a specific JIRA JQL query returns any results. You can create any JQL query you want: return issues that were created in a specific project, issues that were commented on by a specific user, issues where the status was changed, etc. You probably want to limit the query by time period so that you monitor only recent changes. This period can be the same as the frequency of the e-mail sending: for example, if you check this filter every 10 minutes, then you can look at changes from the last 10 minutes.

To implement this script you need to define the following variables in the code (a rough sketch of how they fit together follows the list):
$jiraUrl – URL of your JIRA
$filter – JQL query, encoded for URL (you can use some online URL encoder/decoder)
$user – JIRA username
$password – JIRA password (the authentication is very primitive and not safe, but you can use some general credentials or keep the script file in a safe place)
$to – e-mail address the notification should be sent to
$from – e-mail address the notification should be sent from (could be the same as $to)
$subject – subject of the e-mail

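Just to give an idea of how these variables fit together, here is a minimal sketch – not the actual script; the placeholder values, the JSON parsing and the mail handling are my own assumptions. It asks the JIRA REST search endpoint how many issues match the filter and sends a plain-text e-mail if there are any:
#!/bin/bash
# Sketch only – replace the placeholder values with your own.
jiraUrl="https://jira.example.com"
filter="project%20%3D%20TEST%20AND%20created%20%3E%3D%20-10m"   # URL-encoded JQL
user="jira-user"
password="secret"
to="me@example.com"
subject="New JIRA issues found"

# Ask JIRA how many issues match the JQL (maxResults=0 – only the total is needed).
total=$(curl -s -u "$user:$password" \
  "$jiraUrl/rest/api/2/search?jql=$filter&maxResults=0" \
  | sed -n 's/.*"total":\([0-9]*\).*/\1/p')

# Send a notification if anything matched. Setting the From address depends on
# your local mail setup (e.g. -r on some mailx implementations).
if [ "${total:-0}" -gt 0 ]; then
  echo "JQL filter returned $total issue(s): $jiraUrl/issues/?jql=$filter" \
    | mail -s "$subject" "$to"
fi
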
Automatic running can be configured with crontab – read the post 10 Mac OS X Terminal Commands for Testers for details.

The most complex part is writing a proper JQL query that returns exactly what you want. You can use the Advanced searching documentation for complex queries or ask for advice on Atlassian Answers.

May 27, 2016

Web Application Security Training by Clarified Security


You shouldn't find any EXIF data in this picture.

I've participated in the Web Application Security Training by Elar Lang from Clarified Security. It's a very detailed and thorough hands-on 4-day training for web programmers, testers and everyone involved in web development.

In my opinion, it's a must-do for all web developers and testers. If you look at the content (especially the client-side part), it may seem too general and basic, but these well-known things are actually explained at quite a deep and advanced level.

The organization is a really nice feature of this training: first we had 2 days of the client-side part and the next week 2 days of the server-side part, so between them I had 3 working days to work through client-side attacks on my application. Also, there can be a maximum of 12 participants, which makes the training more individual.

In summary – I really recommend it to people who think that their software is safe, or that there is nothing to worry about even if it isn't.

October 23, 2015

10 Mac OS X Terminal Commands for Testers


I really like terminals, command lines, Bash scripting etc. They are a handy and easy way to automate routine tasks in my everyday work. That's why, after I saw the article Eight Terminal Utilities Every OS X Command Line User Should Know, I decided to write something similar, but for testers.

These are OS X commands, but some of them also work on Linux systems and Windows (if you use a UNIX-like terminal such as Babun, Cygwin or Cmder).

This is definitely not a complete list. Here you can find the complete list of all OS X commands, but this post is about my favourite and most frequently used commands.

1. alias
It's the most useful command of all – it allows you to replace some command with a short word. That means that if you have a long command you regularly run in the terminal, you can assign an alias to it and use a simple short substitution instead. For example, say you always monitor application logs using tail -1000f application.log; instead of writing this whole string every day, you can create an alias like logs and type only 4 letters into the command line. Moreover, you can use autocomplete with aliases (TAB key), just like with regular commands. See my previous post Bash Scripts for Working With Documentation for more examples of using alias.

There is one trap that many beginners fall into – you can just type the following command into your command line:
alias logs='tail -1000f application.log'
but in that case you create the alias only for your current session. That means it will disappear after you close it. To create permanent aliases you need to put them into a specific file, which is usually ~/.bash_aliases. You may also need to add the following code to your ~/.bash_profile file (you can add it at the end):
# Aliases
if [ -f ~/.bash_aliases ]; then
  . ~/.bash_aliases
fi
And you need to restart your session for these changes to take effect (just close and reopen your terminal, or log out and back in to the server).

2. crontab
This one allows you to automate commands. For example, say you have some documentation in SVN (or a similar version control system) and you need to update it regularly to get the latest state. You can add the following job to your cron schedule, which updates the SVN directory every 10 minutes:
*/10 * * * * svn update ~/Documents/SVN-documentation
To open the crontab editor, where you can insert the previous command, you need to run:
crontab -e
(or sudo crontab -e if you are not under an admin user)
See the Ubuntu CronHowTo article to understand what these asterisks mean.

3. grep
Grep is a very powerful tool for finding strings in any output you can produce (log files, query responses, script output etc.). I use it to filter log files. For example, I can get all rows where the word ERROR is present:
tail -1000 application.log | grep ERROR
Or I can get all rows where today's date is mentioned at the beginning of the row:
tail -1000 application.log | grep ^23.10.2015
Grep works with regular expressions, so you may need to understand them to perform some complex filtering, but a simple search for one word may be enough for you.
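For instance, a slightly more involved filter (a made-up example) that catches both errors and warnings in one pass:
tail -1000 application.log | grep -E "ERROR|WARN"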

4. find
Find searches for files in a directory. Again, in my previous post Bash Scripts for Working With Documentation I wrote about scripts for finding documents by name or by content – the whole functionality there is done by find. So I use it in my daily work for finding documents with the required data. You can find (ha-ha) a lot of examples in the following article: Find Command in Unix and Linux Examples.
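As a small illustration (the paths are made up), finding all .doc files with "bug" in the name, or all logs modified during the last day, could look like this:
find ~/Documents/specs -iname "*bug*.doc"
find ~/logs -name "*.log" -mtime -1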

5. open
Simply opens a file, a folder, an application or a URL, just as if you had double-clicked the file's icon. It's useful if you have first done some complicated search for the required file (using find), got its path and want to open it (with the default or some other application).
The second useful case is opening directories (especially hidden ones). For example, open . opens the current directory.
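Combining it with find from the previous point, a one-liner like this (hypothetical path, and assuming at least one match) opens the first document it finds:
open "$(find ~/Documents/specs -iname "*bug*.doc" | head -n 1)"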

6. awk
AWK is actually a programming language designed for text processing. It can be used as a reporting tool, and I usually use it to beautify my scripts' output – for example, to color some words. I am not sure it's useful in everyday life, but if you want to write your own script, it had better have meaningful output.
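As a small sketch of the kind of beautification I mean (the codes are standard ANSI escapes), you could highlight every ERROR in red:
tail -100 application.log | awk '{ gsub("ERROR", "\033[31m&\033[0m"); print }'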

7. scp
It stands for Secure Copy. It is useful for anybody who works with servers (application updating, log monitoring etc.). With this command you can copy files from one server to another, or from a server to your computer and vice versa. You can find some examples in Example syntax for Secure Copy (scp); I copy one of them here:
scp your_username@remotehost.edu:foobar.txt /some/local/directory

8. pbcopy / pbpaste
These two commands may save you from endless scrolling: pbcopy copies output to the clipboard, pbpaste pastes content from the clipboard. For example, you have some log file that you want to investigate on your computer in your favorite editor. The simplest way is to copy its content to the clipboard and paste it into another document. With these commands you don't need to scroll anything to select the text, or google "how to select the whole text in vi" or anything like that – just type:
cat application.log | pbcopy
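And to go the other way, pbpaste dumps whatever is in the clipboard into a file (the file name is just an example):
pbpaste > notes-from-clipboard.txt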

9. screencapture
Honestly, I don't use this command in my daily work, but I have a feeling that it's important. It does what it says – it captures a screenshot. You can set a time delay, and you can write a script for a time lapse (taking a screenshot every 10 seconds or so – a sketch of such a loop is below). It just happens that I use screenshots quite rarely in my work, but I guess it can be an interesting command for testers who use them a lot.
You can see examples here: Take a screen capture from the command line.
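A minimal time-lapse sketch (the file names are made up) could be a loop like this:
while true; do
  screencapture -x "screen-$(date +%Y%m%d-%H%M%S).png"   # -x: no camera sound
  sleep 10                                               # wait 10 seconds between shots
done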

10. sips
The last (and least) one is SIPS – Scriptable Image Processing System. You can use it for processing image files, for example, resizing all PNG files in the current directory to 1024×768 (ignoring the aspect ratio):
sips -z 768 1024 *.png
Again, I don't work with screenshots and picture files a lot, but it's the simplest, fastest (and cheapest, compared to something like Photoshop) way to process a large number of picture files, or to work with pictures in your Bash script.



Don't forget the alias! You don't need to know all these commands by heart – you can just figure out how to use them and which parameters suit your case, put that into an alias, and thereafter use only short and nice commands.
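For example (purely illustrative), the log filter from points 1 and 3 fits nicely into an alias:
alias errors='tail -1000f application.log | grep ERROR'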

BONUS!
telnet towel.blinkenlights.nl
This one is just for fun (once you've already opened the terminal). Some guys made a pretty awesome ASCII version of Star Wars IV, which you can watch over Telnet. I don't know how long it lasts – I watched 15 minutes and Luke had only just met Chewbacca and Han Solo, so I guess they did the whole movie.

September 30, 2015

Bash Scripts for Working With Documentation

Recently I wrote a couple of Bash scripts for working with specification documents. They solve fairly general problems, so I decided to share them here.

All scripts can be downloaded from my GitHub repo: github.com/iriiiina/scripts

All scripts can be executed in a UNIX terminal (for Windows users there are alternatives such as Babun, Cygwin, Cmder). I also recommend using aliases: just pick a short and unique command, specify the path to the script, and you can execute the script with that short command from wherever you are (like spec bug.01.1 for finding a specification with such text in the title).


find-specification-by-name.sh
./find-specification-by-name.sh [text_in_file_title]
The script looks for .doc files (the extension can be changed) in a specific directory (which can also be changed) whose title contains the text that the user gives to the script.
In the example I have a directory specifications with 4 documents, but only 3 of them contain the word bugs in the title. There is also a directory Archive with old documents, which the script also finds, but it shows the Archive folder in red, so the user can easily tell old documents from new ones (you can also change the words that should be colored).

Lines that you may want to change to use this script in your own way:
  • 6 path="/Users/irina/Desktop/specs" – here you should specify the absolute path (not a relative one, because you want to execute this script from different places) to the directory where you keep all your specifications (usually it's your CVS or SVN root directory).
  • 31 find $path -iname "*$text*" | awk -v text="$text" -v lower="$lower" -v upper="$upper" '{ gsub(text, "\033[36m&\033[0m"); gsub(lower, "\033[36m&\033[0m"); gsub(upper, "\033[36m&\033[0m"); gsub("Arhiiv", "\033[31m&\033[0m"); print }' – the awk part is responsible for coloring. Codes like \033[36m are colors – this particular one means cyan, but you can change it. You also need to change the word "Arhiiv" ("Archive" in Estonian), which helps to tell old documents apart. Or you can remove this part if you don't want to color anything.

open-last-specification.sh
./open-last-specification.sh [text_in_file_title]
This one does basically the same as the previous one, except that it automatically opens the most recently modified file. You can use it if your components have unique identifiers and the specs about these components have many versions, like this:
Lines that you may want to change (a rough sketch of the find-and-open core follows this list):
  • 6 path="/Users/irina/Desktop/specs" – the same as in the previous one, the absolute path to the directory with your specs.
  • 43 function checkIsFileIsDoc(){} – in my case I want to open only .doc files – I don't want to open folders or Excel files – but you may want to remove this check.
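
A rough sketch of the core idea (not the actual script, which has more checks): find the matching .doc files, sort them by modification time and open the newest one:
path="/Users/irina/Desktop/specs"                               # same path variable as above
newest=$(find "$path" -iname "*$1*.doc" -print0 | xargs -0 ls -t | head -n 1)   # assumes a modest number of matches
[ -n "$newest" ] && open "$newest"                              # open only if something was found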

find-text-in-specification.sh
./find-text-in-specification.sh [text_in_the_file_content]
This one is also very similar to the first one, only it finds files with the given text in their content.
For example, I want to know in which specification the database column BUG.STATUS is mentioned:
The lines that you may need to change are the same as in the first one.
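The core idea can be approximated with a recursive grep (a simplification – plain grep works best on text-based formats, and the real script also colors the output):
grep -rli "BUG.STATUS" "/Users/irina/Desktop/specs"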

nortal-logo.sh
I've always wanted to print some pictures with a Bash script, so here is my first attempt.
It's the logo of the company where I work. It was picked as easy to reproduce, because it has strict lines.

bug.sh

July 31, 2015

Presentation in DevClub About Biases in Testing


Recently I gave a presentation about biases in testing at the Tallinn DevClub. As you can guess, DevClub stands for developers' club, so the target audience was developers.

DevClub is a pretty nice community of developers in Tallinn which organizes small meetings every month. Usually the meetings are held in the evening (so every developer can attend after work) and there are 3 speakers per evening. Most of the topics are technical and about software development, but there are also non-technical topics, about photography or beer, for example. Before me, Anton Karputkin talked about quantum computers, and after me Alek Aldzanov talked about storing picture files for a sales service.


My slides are available on Slideshare. They are mainly in Russian (as the whole presentation was in Russian), but some of them have English translations. Actually, they don't have much text at all, because biases are not very illustrative things.


DevClub records all sessions, so a video is also available:



The presentation was mainly built on John Stevenson's book The Psychology of Software #Testing, which has a chapter about biases. I have already written a post about how much I liked this book, and I have also written a post about John Stevenson's workshop at Let's Test, where my main takeaway was "if you want to learn something, try to write about it or teach it". These two facts were the main reasons why I chose this topic and why I decided to give a presentation at all.

I am not going to write about the presentation's details – I'll mention only one thing that would be great to change next time: people want to hear about personal experience in detail. I had some examples, but they were too abstract and general. So next time I'll try to add very concrete situations.

It was certainly a very positive experience; the main surprise was that after the presentation people started to share interesting examples of biases with me.

June 11, 2015

My First Ever Workshop at Nortal TechDay 2015


Recently I did my first ever workshop at a conference (it was Nortal's internal conference, so you probably haven't heard about it). It was a sort of bug bash, where participants had to find bugs and vulnerabilities in a certain application. The application was written by me using Angular.js, JavaScript and Node.js, and it used REST services and a DB. The whole technical part was on me and the documentation part on Alla Tarnovskaja, the co-author of the workshop.

As we had started to use Angular.js and REST services in our company, the idea was to give some insight into potential bugs that may appear with these technologies. Participants didn't have any documentation or specification – just the application with very simple functionality and a lot of bugs of different levels of complexity, so every tester could find at least one bug. Testers worked in pairs, so there was an opportunity to share experience. Pairs had about 1 hour for testing, after which I revealed the known bugs and some pairs shared bugs that I hadn't mentioned. Also, in the middle of the workshop we published some tools that we suggested using to explore the application – a little hint for testers who were out of ideas.


Alla Tarnovskaja drew beautiful pictures specially for the workshop. This bug is defeated by Nortal team.

My main fear before the workshop was the stability of the servers – the application uses web and REST servers (on Node.js), and as I hadn't written such servers before, I was scared that testers would bring them down with some injections or special symbols. And they did. They brought the REST server down about 10 times and the web server 2 times. But during the workshop I constantly monitored the logs, so I managed to bring them back up very quickly (1-2 seconds), and the total downtime was not more than 2 minutes. The problem is that all pairs used the same servers, so if one pair managed to bring a server down, none of the pairs had access to the application anymore. I thought about giving the source code of the services and the application to all pairs, so they could launch them on their own machines (or on virtual machines) and not depend on others' injections, but these things usually take a lot of time at the beginning and we didn't have that time.

One idea that came up during the workshop – it would be fun to make some nicely formatted server logs and put them on a big screen, so that during the workshop every pair could see what is going on on the server. But you can do this only if the pairs use one shared server, which is risky.
Also, I got a lot of ideas about bugs that I can put into the application from the workshops at the Let's Test conference.

There are 2 things that I really liked in our workshop and didn't see in other similar workshops. The first one – very detailed documentation about the known bugs and how to reproduce them (the documentation was opened at the end of the workshop), so every participant could try to reproduce them at home after the conference. The second one – one very interesting and hard-to-find bug that nobody found. It's always a little bit boring to hear about known bugs being revealed if you have already found them on your own; it's more interesting to see how to find bugs that you didn't find.

Running a workshop is fun and useful – usually I start to ask myself the right questions only when I have responsibilities and fears. So giving a workshop is way more productive for me than participating in one.

June 8, 2015

Let's Test 2015: Final Day 3

Software Talks by Julian Harty (@julianharty)


Photo from Let's Test Conference Flickr

The Security Testing talk by Jari Laakso was replaced by this one. Julian Harty talked about mobile software, customer feedback and statistics. Some facts about Google: about 75% of bugs are not fixed (usually minor bugs that are closed or postponed); Google Analytics doesn't know the data of one particular user, it aggregates all data into one big picture; there are not so many manual testers at Google – basically they collect data from users, which helps to find problems.

One very interesting thing about mobile testing: if your mobile application connects to the internet, then you should test its traffic – it shouldn't send or receive more data than needed, because users pay for each byte.

And one lesson about customer service: developers (or somebody on the product development team) should follow user comments on the store page. First of all, it's a good source of problems and bugs. Secondly, if there is a reported problem without an answer, users are likely to post more negative comments and lower the rating. If the development team answers that they are dealing with the reported problem, users are more likely to trust them.

Coders To The Left by Jan Eumann (@JanEumann) and Philip Quinn


Photo from Let's Test Conference Flickr

A workshop about how to find bugs using the source code and how to fix them. A very useful workshop for me, because 5 days earlier I had done something similar at our company's internal conference. So it was an interesting experience to see how other people do the same stuff as I did (especially taking into account that it was the first workshop I had ever done).

During the session we worked in groups and had to find bugs and fix them. I discovered some new functionality in DevTools for myself and had a couple of ideas about how I can improve the application for my own workshop.


Me working in pair with Kadri-Annagret Petersen (@kadriannagret).
Photo from Let's Test Conference Flickr

Closing keynote Detecting the Heartbleed Vulnerability by Tuomo Untinen


Photo from Let's Test Conference Flickr

It seems the presenter was not very experienced (which was a big contrast with the rest of the conference), so it was hard to listen to this keynote, and especially hard to focus after such intensive days. But the topic was very interesting, so I even wrote down some interesting points: finding the Heartbleed bug wasn't luck, it was more like a decision; the vulnerability was introduced on the last day of December 2012, which is just a coincidence. They wanted to set up some honeypots to find out whether somebody was exploiting this vulnerability, but it went public too fast (through OpenSSL's fault), so that didn't succeed.

The main takeaway – if you notice something new and unclear, try to understand it.


Siim Sutrop (@SiimSutrop) asking quite an interesting question – was finding the Heartbleed bug luck or good testing? The answer – it was a decision.
Photo from Let's Test Conference Flickr

Summary

In the first post I already wrote that it was the best conference I have ever attended. I met a lot of open, smart and inspired people there, who give a lot of energy and ideas. You can talk about testing there from very different angles; for example, I even participated in a small talk about testing and religion. So, if you have an opportunity to participate in a conference and don't know which one to choose – I definitely suggest choosing Let's Test. The next date is already known – May 23rd-25th, 2016!


Me thinking about the conference during the final keynote.
Photo from Let's Test Conference Flickr


See posts about other days:
Let's Test 2015: Day 1
Let's Test 2015: Day 2, Exploring Web App (In)Security