Android design inspirations

2017-03-19 Android, Design, DSP2017 No comments

When we are planning to create our next Android app, besides the codebase, it’s also good to plan its design, UI, and UX. Before doing that, we can take a look at the work of other people to gather a few UI design patterns, inspirations, and ideas.
A good resource for such inspirations is the Android Niceties website. As the authors write, this website is
a collection of screenshots encompassing some of the most beautiful looking Android apps.

You can also take a look at

Do you know any other similar resources? Share them in the comments :-).

Unit test coverage report with Travis CI for Robolectric on Android

2017-03-19 Android, Continuous Integration, DSP2017, Gradle, Unit Tests No comments


Some time ago, I wrote an article about Test coverage report for Android application. It got some interest (many comments below the article and many visits according to Google Analytics), so I decided to refresh this topic. Previously, I wrote instrumentation unit tests, which needed to be executed on a real device or an emulator. It’s a good approach when you want to test functionalities strongly connected with the device, e.g. operations on a real SQLite database. Nevertheless, this approach has huge disadvantages. It’s hard to run such tests on a Continuous Integration server, because we need to have the emulator or device up & connected all the time, and the tests need to interact properly with the device to pass, which is not so easy. In most cases, mocking part of the application’s behavior is enough. In that case, we can easily run tests on a CI server and have deterministic test results. In order to do that, we can use Robolectric.

Gradle configuration

First, we have to add the appropriate dependency for the jacoco-android plugin in our top-level build.gradle file:

buildscript {
  repositories {
  }
  dependencies {
    classpath ''
    classpath 'com.dicedmelon.gradle:jacoco-android:0.1.1'
  }
}

Next, we need to add the appropriate test dependencies in the build.gradle file of our app or library module.

dependencies {
  testCompile 'junit:junit:4.12'
  testCompile ''
  testCompile 'org.robolectric:robolectric:3.1.2'
  testCompile 'org.mockito:mockito-core:2.7.17'
}

I’ve also added dependencies for the JUnit, Truth and Mockito libraries, which are used in my tests.

We also need to add appropriate plugins:

apply plugin: 'jacoco'
apply plugin: 'jacoco-android'

To prevent our tests from being ignored by the coverage report, we need to configure the following settings:

android {
  testOptions {
    unitTests.all {
      jacoco {
        includeNoLocationClasses = true
      }
    }
  }
}

Next, we need to configure the report output:

jacocoAndroidUnitTestReport {
  csv.enabled false
  html.enabled true
  xml.enabled true
}

Travis CI configuration

We are done with the Gradle configuration. I’m assuming we have a Travis CI build configured. If you don’t know how to do this, visit the Travis CI website and enable builds for your project. It’s pretty easy. Now, we should visit the website of a code coverage service, register there (e.g. with a GitHub account) and add our project. After that, we need to add the following items to our .travis.yml file:

  - bash <(curl -s

  - ./gradlew clean build test jacocoTestReport check

Here we perform a clean, build the application, run unit tests, generate the test coverage report with JaCoCo and run checks (Lint, FindBugs, PMD & CheckStyle).

Writing unit tests with Robolectric

Next, we can place our tests in the src/test/ directory.
A sample unit test can look as follows:

In my case, I also needed to create a file in the src/test/resources/ directory with the following content:


because Robolectric didn’t work with an Android SDK newer than 23. Moreover, I also needed to use Robolectric v. 3.1.2, because I had problems with running tests and generating the coverage report with the latest version of Robolectric.


When we have everything configured, we can push our tests to the GitHub repository, the Travis CI build will be triggered and we can see a beautiful test coverage report, which can help us improve our tests.

I’ve applied the approach described in this article in the ReactiveNetwork open-source library. If you want to see the complete solution, take a look at the source code of this project.

My approach to Git aliases

2017-03-12 DSP2017, Git No comments

While working with Version Control Systems like Git, it’s good to adapt them to our needs to perform daily work in a more productive way. People often create so-called Git aliases, which are shortcuts for longer commands. E.g. you can edit your .gitconfig file, which is usually located in your home directory, and place a few aliases in the [alias] section. For example:

  ls = log --pretty=format:"%C(yellow)%h%Cred%d\\ %Creset%s%Cblue\\ [%cn]" --decorate

Then you can type git ls in your Git repository to see a pretty Git log.
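Instead of editing the file by hand, the same alias can also be registered with the git config command. The sketch below writes to a throwaway config file via -f purely so the example leaves your real configuration alone; use --global to write to ~/.gitconfig for real.

```shell
# Register the "ls" alias from the command line. The -f flag points
# at a temporary file here so this example doesn't modify ~/.gitconfig.
cfg=$(mktemp)
git config -f "$cfg" alias.ls 'log --pretty=format:"%C(yellow)%h%Cred%d\ %Creset%s%Cblue\ [%cn]" --decorate'

# Show what was stored:
git config -f "$cfg" --get alias.ls
```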

Sometimes people go further and create many more aliases like:

cp = cherry-pick
st = status
cl = clone
ci = commit
co = checkout
br = branch

and so on. I’ve seen configurations containing 20 aliases or more, consisting of shortcuts of 2 or 3 letters. Usually, we don’t use 20 commands every day. I can remember e.g. 5 shortcuts, but I don’t want to remember more.

Instead of alias:

lcm = log -1 --pretty=%B

I prefer:

last-commit-msg = log -1 --pretty=%B

When I’m using a terminal on Linux or macOS, I have command completion, so I can type git la, hit Tab and the terminal will autocomplete my command to git last-commit-. Then I can hit Tab again, choose one of my aliases and select it by hitting Enter.

git alias hints in unix terminal

Now, I don’t have to remember all of my aliases. I treat my .gitconfig file as documentation. Whenever I want to browse aliases, I can type git list-aliases (it’s also an alias, for !git config -l | grep alias | cut -c 7- | sort) and if I want to find aliases related to diffs, I can type git list-aliases | grep diff. I also have more descriptive aliases like:

undo-last-commit = reset --hard HEAD^

so I know what this command actually does.
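For illustration, the list-aliases pipeline can be tried on a small, hand-made sample of git config -l output. The helper function and the sample entries below are mine, just for demonstration:

```shell
# Simulated `git config -l` output; real output also contains
# non-alias entries, which grep filters out.
sample_config() {
  echo ""
  echo "alias.st=status"
  echo "alias.ci=commit"
  echo "alias.last-commit-msg=log -1 --pretty=%B"
}

# Keep alias lines, cut the leading "alias." (6 characters), sort:
sample_config | grep alias | cut -c 7- | sort
```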

Moreover, I divided my aliases into separate sections and marked these sections with comments. The sections are as follows:

  • showing metadata
  • showing urls
  • showing commits, logs & branches
  • ignoring files
  • adding & reviewing changes
  • resetting and reverting changes
  • merging changes
  • branching
  • showing diffs
  • searching files

It allows me to keep my aliases in a more organized way. It’s useful when our .gitconfig file “lives” and we update it during the work day as we need to.

Maybe this approach won’t be the best way of using Git for everyone, but it works for me and allows me to solve my daily tasks more easily and faster.

You can find complete source of my .gitconfig file in my dotfiles repository at

Further reading

Happy coding!

Working with different Git configs

2017-03-10 DSP2017, Git, Linux, macOS No comments

Short introduction

Sometimes people need to specify multiple values in a single .gitconfig file, or they want to share just part of the configuration between two machines. There are different approaches to that. I can show you mine.

Different configs for different Operating Systems

On my private computer, I use Linux. I use Git for my private projects and I use my private e-mail address there. At the same time, I use Git at work on macOS with exactly the same Git configuration, but with a different e-mail address. How to deal with that?

In my .gitconfig file, I set my private e-mail address, which is used by default. In my .zshrc file, I created two aliases:

alias setupGitPersonal="git config --global \"\""
alias setupGitForWork="git config --global \"\""

Hint: If you want to configure more than just an e-mail address, you can do it in the appropriate alias, or you can create separate shell scripts for that and place them in the /usr/local/bin/ directory.
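The aliases above lost their arguments in this post. Based on the description, they switch the globally configured e-mail address, so they could look roughly like this; the user.email key is my reconstruction and the addresses are placeholders, not the author’s:

```shell
# .zshrc aliases switching the global Git e-mail address.
# Substitute your own addresses for the placeholders below.
alias setupGitPersonal='git config --global ""'
alias setupGitForWork='git config --global ""'
```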

Then, on Linux, I don’t have to do anything and my private e-mail address is used out of the box.
On macOS, I do the following trick in the .zshrc file:

if [ `uname` = "Darwin" ]; then
  setupGitForWork

  # rest of the macOS config goes here...
fi

After that, every time I start terminal on macOS, it automatically sets up my e-mail address to the one I use at work and keeps my .gitconfig file updated.

Hint: If you don’t use zsh, instead of .zshrc file, edit .bashrc file.

Different configs for the same OS on two machines

If you’re using different configs on different machines with the same OS, you can try another trick. Create a configuration file, e.g. .machine_name, in your home directory. Set up one name on one machine and another name on the other machine. Next, include this file in your .zshrc or .bashrc file, perform the appropriate check and load different settings based on the variable’s value.

. ~/.machine_name

if [ "$machineName" = "workMachine" ]; then
  setupGitForWork
fi

The contents of the .machine_name file are simple:

machineName="workMachine"

Different configs on the single machine with one OS

In such a case, we are supposed to perform the switch manually. We can use the aliases provided above. When we want to have personal settings, we can open a terminal and type setupGitPersonal. When we want to apply work settings, we can type setupGitForWork.


As we can see, keeping different configs for different machines or operating systems, and changing them depending on our needs, is not so hard. I hope these ideas will help you to manage your configs.

SJUG updates – March 2017

2017-03-05 DSP2017, Java, Meetups No comments

SJUG aka Silesia Java Users Group meetups were reactivated one year ago and the group is still alive! I think it’s a great success of the community of Java developers located there. It’s not an easy thing to gather people from different cities in the region in one place every month, but it’s possible! During the year we gained a few sponsors and partners, discounts for software tools and tickets for conferences. Moreover, we had guests from outside the Silesian region. There were really interesting presentations and discussions. Recently, I updated the SJUG website, so it automatically downloads information about the latest meetup. Now, we don’t have to do it manually every time.
The next meetup is planned for this Friday (10.03.2017) and if you’re interested, feel free to join.

More details about the group:

Control Spotify on Linux like a hacker

2017-03-05 DSP2017, Linux, Open source, Python No comments

Recently, I created a tiny script called spotify-cli, which allows you to control Spotify on Linux from the terminal. It’s inspired by shpotify, a shell script doing similar things on macOS. My script is written in Python and uses dbus under the hood, which allows it to communicate with the message bus daemon and pass messages between applications. I used pactl for controlling the system sound.

You can install spotify-cli as follows via wget:

sh -c "$(wget -O -)"

or via curl:

sh -c "$(curl -fsSL"

After that, you can just type spotify-cli in your terminal.
You can use spotify-cli with the following parameters:

--help, -h          shows help
--status            shows status (currently played song name and artist)
--play              plays the song
--pause             pauses the song
--playpause         plays or pauses the song (toggles a state)
--next              plays the next song
--previous, --prev  plays the previous song
--volumeup          increases sound volume
--volumedown        decreases sound volume

That’s it! Happy listening!

Source code of the project can be found at

KrkDataLink Meetup – Functional Programming & Data Science

2017-02-09 Meetups No comments

Recently, a lot of new interesting meetups connected with IT have started in Poland. Yet another meetup is starting in Kraków in the middle of February. It’s called KrkDataLink and is related to functional programming and data science. These topics have become hot recently, so it’s worth staying up to date with them. You can find more details about this event on the meetup website.

Visit the website of the event!

One year of using Macbook Pro at work as a software developer

2017-01-29 Apple, macOS 3 comments


For most of the time, I was an MS Windows user. I used this OS from Windows 95 up to Windows 10 (the last one only occasionally). About 3 years ago, I switched to Linux (Ubuntu) on my private computer. I had used it before in a virtual machine or sometimes in a dual boot with Windows. About one year ago, I changed my job and decided to switch to a Macbook Pro at work. I also had the option of choosing a laptop with MS Windows, but I was already a bit familiar with Unix, so I decided to learn something new and try Apple stuff. It had OS X El Capitan installed and was later upgraded to macOS Sierra. I decided to collect my thoughts related to using the Macbook in this article.
Below you can see a photo of my current workstation in the office.

In the beginning, it seemed that it would be something different from the systems I had already used, but it turned out to be quite similar to Linux. There’s a Unix shell and a HUD like in Unity on Ubuntu. I don’t use Unity on Linux anymore, but I’m familiar with that concept. The settings window also looks similar to Unity or Gnome. The Macbook is connected to one monitor via HDMI and to another one via Mini DisplayPort. The keyboard is connected via USB and the Magic Mouse is connected wirelessly via Bluetooth. It’s worth mentioning that the Macbook doesn’t have an RJ45 port, so you need an adapter if you want to be connected to the network via cable. There are adapters from Mini DisplayPort to RJ45 and from USB to RJ45; you can choose one of them. If you need any specific device setup, it’s better to check the hardware specification, available adapters and their cost.

Configuration as a code

After some time of using Linux, I discovered that I can keep my personal configuration as code. My config is open-source and you can see it in my dotfiles repo. This is very useful when you want to restore your configuration after installing the system from scratch, or when you accidentally lose your data and want to have a backup. Moreover, when you are using two Unix machines, like me right now, you can share your configuration between them. On macOS I can use the same stuff as on Linux, e.g. Zsh, Tmux, Vim, etc. Nevertheless, there are differences between Linux and macOS. That’s why in my .zshrc file I have a separate section for Linux and a separate section for macOS. Moreover, sometimes scripts also need to be customized separately for different systems. Using macOS helped me make my config more robust and now I can easily use it both on macOS and Linux without huge problems.


Of course, I needed to find apps that would be useful for me during daily work and usage of the system. Below you can see what I find useful.


  • Homebrew – the missing package manager for macOS; package management is an obvious thing on Linux systems, but on macOS you need additional software for that.
  • iTerm2 – a terminal emulator with the possibility of searching, creating tabs, and splitting panes horizontally and vertically. It’s better than the default Terminal app.
  • Spectacle – an app for resizing and moving windows. For me it’s very annoying that on macOS, when you resize a window to full screen, it hides the top menu (HUD) and the Dock and jumps into a separate workspace. We can solve that problem by installing Spectacle, and with appropriate shortcuts we can resize windows without hiding anything. It also has a few additional features regarding window resizing. Moreover, it’s good to remember to disable the shortcuts that conflict with the iTerm2 shortcuts for splitting panes, if you are using them like me.

Development Tools

Here are the basic apps I use for development. I also use terminal tools, but I mentioned them earlier in this article. All of them work pretty much the same as on other platforms. Docker for Mac was improved, so we can use Docker as easily as on Linux right now.


We also have the basic messengers. There’s no difference compared to other systems. Maybe Skype is simpler than the Windows version. The Linux version of Skype is also quite simple, because they probably stopped developing it. It’s strange, because it’s good and bad at the same time.

Menu indicators

I like the concept of menu indicators in the HUD. It’s similar to the Gnome Classic and Gnome 3 desktop environments for Linux.

  • Caffeine. It’s the same as on Linux: an indicator, which prevents screen lock.
  • Menu Meters. It’s similar to the Linux indicators. You can measure CPU, memory and network usage.
  • Degrees. It’s an indicator for checking the weather and temperature in your city.

Additional apps

There are also a few additional apps I use…

  • Commander One – a “Total Commander”-like app for macOS. Total Commander is one of the apps I really miss on non-MS systems, because its replacements are never as good as the original app.
  • The default Mail and Calendar apps
  • MS Outlook – the newest version looks much better after the update; nevertheless, notifications are still not consistent with the macOS UI (at least they look a bit better now) and closing the app means killing it, like on Windows, which is uncommon behavior for macOS.
  • MS Office apps
  • Evernote
  • Wunderlist
  • Spotify

What’s different?

There are things which are slightly different on macOS, and I needed to get used to them over time.

  • Apps are not closed until you explicitly close them. Apps behave a bit like on Android: they go to the background and you can wake them up. If you want to kill an app explicitly, you have to use the Command+Q shortcut or choose the “Quit” option from the context menu.
  • Keyboard and shortcuts. The keyboard is different. We have an additional Command key, which replaces the Windows key, and a few additional keys. System shortcuts are also different. There’s no “Print Screen” key, but there’s a shortcut for that. Moreover, IntelliJ shortcuts are also different than on Windows and Linux, but I learned them over time without any shortcut re-mapping like some people do. In my opinion, it’s a better approach than re-mapping, because we have a different keyboard and an opportunity to train our brain a little bit.
  • Workspaces. This functionality became a standard in the latest versions of OS X and Windows. Nevertheless, it was available in Linux Desktop Environments a long time ago.
  • Dividing the screen. This is similar to Windows and Linux, but with a few differences. When we divide the screen and share it between two windows, this view goes to a separate workspace without the HUD and Dock. I don’t like it. Additionally, we can grab the dividing line and adjust the space of one window or the other by dragging, like in Tmux. This is a really cool feature, which is not available in Windows and Linux DEs as far as I know.
  • Head-Up Display and Dock. As I mentioned earlier in this article, the HUD is a concept similar to the Unity DE from Ubuntu, but I suppose Apple was first with that idea. I’m not a fan of the Dock. Its functionality is more or less the same as the bottom bar from Windows, but with a bit different UI/UX. It’s hard to say which one is better.
  • Hiding and showing hidden files. I haven’t found an option for showing hidden files like I can do on Windows or Linux. I had to do a trick and create an alias in my .zshrc file to show and hide hidden files. The Commander One app is able to show and hide hidden files via its GUI, but it’s a third-party app.
  • Spotlight. Spotlight is a useful feature, which you can use for launching apps. You can also use it for other things like searching mail, calendar, places and so on, but I don’t really use these additional things.
  • Siri. Siri is an assistant with voice recognition. It was available on the iPhone earlier; in macOS they brought it to the desktop. It’s cool, you can ask it about the weather or launch an app with a voice command, but I don’t really use it in daily work.
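For the record, the hidden-files trick mentioned above is typically done with a pair of aliases along these lines. This is a sketch based on the commonly used defaults write approach, not the author’s exact config:

```shell
# .zshrc aliases toggling hidden files in Finder (macOS only);
# Finder is restarted so the change takes effect.
alias showHiddenFiles='defaults write com.apple.finder AppleShowAllFiles YES && killall Finder'
alias hideHiddenFiles='defaults write com.apple.finder AppleShowAllFiles NO && killall Finder'
```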


Here are a few observations I made while using the Mac.

  • It doesn’t work exactly the same as Linux, but it’s in the same Unix family. We can use a nice terminal, Zsh, Vim, Tmux and stuff similar to Linux, but sometimes we need to adjust things to the Mac, and not all commands which are valid for Linux are also valid for the Mac.
  • It’s more stable than Linux. I don’t remember encountering any crashes or problems related to the system itself on the Mac. Maybe the system hung once or twice in a year.
  • It’s less customizable than Linux. On the Mac you cannot change the desktop environment or do a lot of customizations like on Linux. Luckily, the UI of macOS is nice, although the Gnome UI is, in my opinion, a bit more minimalistic, which I actually like.
  • It’s simpler than MS Windows. From my perspective, macOS is simpler than MS Windows. In MS Windows they tend to change everything in successive releases and the UI gets more complicated. In macOS they try to keep everything simple and consistent with their standards.
  • Software vendors care about macOS users. Every popular app has an official macOS version (e.g. Evernote, Wunderlist, Photoshop, etc.). You cannot say that about Linux.
  • Almost no simple app is free. I’m not against buying software and I use licensed commercial software. Nevertheless, in the App Store almost every simple app for doing one thing is paid, which can be annoying, especially if you are using the Mac at work and don’t want to connect it with your personal payment cards.

Design & Hardware

We can discuss software issues, but there are things which Apple does best, leaving the competition behind.
These things are design & hardware. The Macbook’s touchpad is the best touchpad I’ve ever used. It’s smooth, easy to use and integrated with the system in a very convenient way. Page scrolling is really smooth and natural. There are gestures for switching workspaces, scattering windows, zooming, etc. The Retina display is very clear, has a high resolution and is much better than the displays of other laptops. The speakers are incredibly clear and music sounds better than on regular laptops. A lot of people complain about the necessity of using adapters. I haven’t really had such problems. The only adapter I needed was a USB to RJ45 adapter for a wired network. I could connect one external screen via the Mini DisplayPort and another one via the HDMI port. I think the issue with adapters can be serious for people who use a lot of specific external devices. The battery in the Macbook works fine, but it degrades the longer you use the laptop. If you start energy-consuming services and apps, it can be drained in less than 8 hours now. I think it was better at the beginning (one year ago). The thing I really like about the Macbook is the design. It’s very clean & simple. It’s also made from high-quality materials and looks really good. People who care about aesthetics will appreciate that.


To sum up, I can say that the Macbook is a really nice & expensive piece of hardware. I can recommend such a device to people who can afford it and want to spend more money than usual on their computer. I would also recommend this device to people who travel a lot and need to work with the laptop’s screen and touchpad, because Apple made them better than the competition. I can also recommend this device to people who don’t want to spend too much time on configuration & customization of the software.
I wouldn’t recommend this device to people who want to save money. You can get a laptop with similar specs for about 30% of the Macbook’s price, e.g. a Thinkpad. I also wouldn’t recommend it to people who use external devices quite often. If you are using an external screen, mouse & keyboard, the lack of Apple’s touchpad and Retina display shouldn’t be a deal-breaker for you. I also wouldn’t recommend it to people who use a lot of specific external devices. If you choose a Macbook, you’ll probably need a lot of adapters and some stuff may not work fine for you. In addition, if you like to customize your system, you should probably get a Thinkpad and install Linux. Moreover, you need to remember that the Macbook Pro is not a gaming laptop. In my opinion, macOS is better than Windows for the type of programming I do (recently Java, Bash & Python). Nevertheless, I don’t think it’s better than Linux for that.

As you can see, using a Macbook Pro has its pros and cons.
You need to remember that the choice of a device should be dictated by logic and pragmatism, not by marketing and fashion.

Automate boring stuff

2017-01-01 Bash, Git, Python No comments


In my current company, all the people who perform creative work (mostly programmers) need to prepare a so-called PKUP report. PKUP stands for Podwyższone Koszty Uzyskania Przychodu in Polish. It’s a legal regulation in Poland, which allows paying a lower income tax due to the particular type of work. For a regular employee, it means that he or she will simply get a slightly higher salary per month.

How does the report look in practice?

As a programmer, I simply create software as source code. Added, removed and modified lines of code in the existing codebase are treated as my creative work. Luckily, we use Git, so I can generate *.diff files from the Git repositories I’m contributing to. Besides that, I need to prepare a document as a *.docx file with a short description of my work. My tasks look different every month, but the report actually looks almost the same every month. Preparing this report is boring and repetitive stuff.

Let’s automate it!

Generating *.diff files from Git repos

I simply created a shell script, which goes through predefined project directories and saves *.diff files, named after the project directories, containing changes performed by me from the 20th day of the previous month until now.

Generating *.docx document

Next, I created a Python script, which is parametrized and used by the shell script. It uses the python-docx library for generating the *.docx report. I’ve chosen this option, because it’s one of the simplest solutions I’ve found and it’s lightweight. Moreover, it can be easily used on Unix systems and integrated with shell scripts.


I wanted to make the script available and usable for everyone, so I created a .pkup.conf file, which is responsible for personalization and configuration of the script. I think it looks pretty straightforward.

yDEV_PROJECTS_LIST=(backoffice platform-backoffice cockpitng backofficesearch pcm pcmbackoffice cockpit cockpit-core)
yDEV_NAME="Your name"
yDEV_SURNAME="Your surname"
yDEV_ROLE="Software Developer"
yDEV_MANAGER="Your manager name and surname"

Installation and uninstallation

I also created an installation script, which allows you to start using the scripts faster. The installation script installs the dependencies of the Python script, copies the shell script and the Python script into the /usr/local/bin/ directory and the .pkup.conf file into the home directory. The configuration file needs to be adjusted manually by the user after installation. Of course, there’s another script, which can be used for uninstallation.
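The install/uninstall scripts could be sketched along these lines. The file names ( and, the PKUP_PREFIX override and the commented-out dependency step are my assumptions based on the description, not the project’s actual contents:

```shell
# Copy the scripts to the install prefix and the default config to
# the home directory. PKUP_PREFIX is a hypothetical override used
# here to keep the sketch testable; the real target is /usr/local/bin.
install_pkup() {
  prefix=${PKUP_PREFIX:-/usr/local/bin}
  # pip install --user python-docx   # dependency of the Python script
  cp "$prefix/"
  cp .pkup.conf "$HOME/.pkup.conf"   # adjust manually after installation
}

# Remove everything the installer put in place.
uninstall_pkup() {
  prefix=${PKUP_PREFIX:-/usr/local/bin}
  rm -f "$prefix/" "$prefix/" "$HOME/.pkup.conf"
}
```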


There are Python tests for this solution, but they’re quite poor right now due to the limited amount of time. They can be a subject of improvement in the future. Note that such scripting solutions rarely have tests, because they’re small and created ad hoc. Nevertheless, I wanted to follow the philosophy from my last blog article and create tests for any kind of software I make.


I’ve spent some time preparing this stuff, but it was fun and I think it will save me, and hopefully my co-workers, some time when creating reports every month. In the future, it can be improved by automatic generation of report messages and sending data to the server.

To sum up, preparing reports manually is boring. Generating reports automatically is exciting!

The complete solution described in this article, along with documentation, is available on GitHub:

Lifting quality of a shell script

2016-11-30 Bash, Linux No comments


In the release cycle of our team at work, we need to perform so-called system tests. In order to do that, we need to log into Artifactory, search for the latest release package, check if it’s up to date, download it, unzip it, install the internal configuration recipe, compile, initialize & run it. Not all of that can be easily automated, but I thought that at least the searching & downloading phase could be done from the terminal in a semi-automated way. That’s why I created the ydownloader shell script.

Writing a script with unit tests and continuous integration

I’m not an expert in shell scripting, so I also wanted to learn more about it. In addition, I wanted to apply best software development practices in that script. Someone could say that in the case of a simple shell script proper engineering may be superfluous, but in my opinion, the simplicity of the project is not an excuse for not doing it the right way. Especially if we want to use it in the future. That’s why I divided this script into smaller functions, added command-line argument handling and help for the users. Moreover, I added unit tests with shunit2 (yes, we can write unit tests for shell scripts) and continuous integration with the Travis CI server. In the “Clean Code” book, we can read that code without unit tests is not clean by definition. After dividing the script into smaller functions, it was much easier to test. My script accepts command-line arguments, so I needed to do the following trick to make it testable and include it in the testing script:

if [ "$TEST_MODE" = "" ]; then
  TEST_MODE=false
fi

if [ "$TEST_MODE" = false ]; then
  # parse command line arguments here...
else
  echo "TEST_MODE enabled"
fi

Then, I could write unit tests:

. ./ydownloader # load script to be tested

testCutLastChars() {
  # given (sample values; the originals were omitted here)
  valueToCut="value123"
  expectedValue="value"

  # when
  actualValue=$(echo $valueToCut | cutLastChars 3)

  # then
  assertEquals $expectedValue $actualValue
}

# more tests go here...
. ./shunit2/shunit2 # load shunit2
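The ydownloader script itself isn’t shown here, but a cutLastChars implementation consistent with the test above could be as simple as the following. This is my guess at such a function, not the original one:

```shell
# Read a value from stdin and strip the last N characters with sed.
cutLastChars() {
  sed "s/.\{$1\}$//"
}

echo "value123" | cutLastChars 3   # prints "value"
```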

There are also other solutions for unit testing shell scripts, like bats and others. We can choose what we like. We can also use additional tools like shunit2-colorize to make the console output of shunit2 tests look like a rainbow, if we are not fans of a monochromatic terminal. Moreover, we can use static code analysis tools for shell scripts, like shellcheck.

In addition, I prepared a simple install script, which allows you to install the script locally via curl or wget. Of course, the project has sufficient documentation.

Short recap

It was a really nice coding exercise. Now I feel much more comfortable with shell scripting, but there’s still a lot to learn. I recommend applying a similar approach in your scripts if you haven’t done it yet.

If you want to browse the complete project, check it out in my repository: