Today I Learned

Debugging IntelliJ IDEA Performance Issues

All of a sudden our work with the IntelliJ IDEA platform started to feel sluggish. Each code change caused delays lasting multiple seconds. Watching CPU activity, we saw it repeatedly spiking to around 100% load. It was impossible to keep working like that, so we set out to find what exactly was behind these massive spikes.

It turns out that IntelliJ IDEA has a pretty simple built-in activity monitor (Activity Monitor under the Help menu) for seeing what is going on at any given moment. When everything works smoothly, the monitor shows something like this on our work computer:

IntelliJ IDEA Activity Monitor

We saw specific plugins taking most of the load at the moments the lag spikes occurred. After going through all the enabled plugins and removing those that were no longer needed, we restarted IDEA and it was working smoothly again. It took just a small review of our work setup to get back up to optimal speed. If you no longer even notice how sluggish your install of the program is, or you feel like it could be faster, then I suggest taking a few minutes to review the performance of your main working tool. It is definitely worth the time.

Oracle DB in Docker

Just in case anybody needs it in the future.

Note that Oracle no longer provides images of Oracle XE, and if you build your own for 18c, it will be more bloated (8 GB) than the Oracle Enterprise 12.x images (2 GB for slim).

Here is a script to run it:


echo "Oracle container registry will now open in the browser."
echo "Please navigate to Database/Enterprise, then login, select language and accept terms and conditions before you can download Oracle images"
echo "This will be effective for the next 8 hours"
read -p "Press enter to continue"

echo "Now login via Docker"
docker login container-registry.oracle.com

# append '-slim' to the image tag if you don't need APEX, will save ~1.5Gb
docker run -d -it --name oracle-db -p 1521:1521 container-registry.oracle.com/database/enterprise:12.2.0.1

# default SYS/SYSTEM password is 'Oradoc_db1'
# connection string: jdbc:oracle:thin:@localhost:1521/ORCLPDB1.localdomain

Add sequence field to child table rows

I recently got a request from a customer to add the possibility to reorder invoice rows manually. To achieve this on the database side, you need to add a new column which holds the sequence number of each row. It is easy to add this column with an ALTER TABLE statement. But how do you assign a value to each existing row?

This is a simplified version of the invoice_rows table:

SELECT * FROM invoice_row;
| id   | invoice_id | product   |
|    1 |          1 | Pencil    |
|    2 |          1 | Book      |
|    3 |          2 | Coffee    |
|    4 |          2 | Sandwich  |
|    5 |          2 | Newspaper |

Now I add the column for the sequence number:

ALTER TABLE invoice_row ADD COLUMN sequence INT NOT NULL DEFAULT 0;
This will result in all rows having sequence value 0. Now I want to have a growing sequence number for rows of each invoice so that the end result would be this:

SELECT * FROM invoice_row;
| id   | invoice_id | product   | sequence |
|    1 |          1 | Pencil    |        0 |
|    2 |          1 | Book      |        1 |
|    3 |          2 | Coffee    |        0 |
|    4 |          2 | Sandwich  |        1 |
|    5 |          2 | Newspaper |        2 |

Is it possible to achieve this with a single UPDATE statement? I didn't believe so initially, but after some looking around I found a very neat solution that works on MySQL.

MySQL supports variables in queries which you can change for each row. Here is an example:

SELECT
  @sequence := IF(@invoice = invoice_id, @sequence + 1, 0) sequence,
  @invoice := invoice_id invoice_id
FROM
  (SELECT @invoice := NULL, @sequence := 0) vars
  JOIN invoice_row
ORDER BY invoice_id, id;
| sequence | invoice_id |
|        0 |          1 |
|        1 |          1 |
|        0 |          2 |
|        1 |          2 |
|        2 |          2 |

Start reading it from the FROM section: we set two variables to initial values and join them with the table that we are actually querying.

Then look at the SELECT section: for each row we assign a new value to both variables - the @sequence value is either incremented by one (if the invoice_id is the same as in the previous row) or reset to zero (if we have got a new invoice_id). And @invoice always gets the current invoice_id value.

The ORDER BY is important because you want all rows of the same invoice to come together in the result. Otherwise the IF() computing the new @sequence value would work incorrectly.

Once you have this, it can be used in an UPDATE statement to change all rows:

UPDATE invoice_row ir,
  (SELECT id,
     @sequence := IF(@invoice = invoice_id, @sequence + 1, 0) sequence,
     @invoice := invoice_id
   FROM
     (SELECT @invoice := NULL, @sequence := 0) vars
     JOIN invoice_row
   ORDER BY invoice_id, id) t
SET ir.sequence = t.sequence
WHERE ir.id = t.id;

Notice that I added id to the selected column list as this is how you join the result with the actual updated table.

And now I have a proper sequence value for each row!

Being able to do this with a single SQL statement means that I can easily add it as a database migration and it will be executed correctly in all environments.
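To make the counter behaviour of the two variables explicit, here is the same logic sketched in plain Python (a hypothetical stand-alone function, not part of the migration; rows are assumed to arrive sorted by invoice_id, id, just as the ORDER BY ensures):

```python
def assign_sequences(rows):
    """Assign a 0-based sequence per invoice_id, mirroring the
    @sequence/@invoice variable logic in the MySQL query.
    `rows` is a list of (invoice_id, id) tuples sorted by (invoice_id, id)."""
    current_invoice = None
    sequence = 0
    result = []
    for invoice_id, row_id in rows:
        if invoice_id == current_invoice:
            sequence += 1          # same invoice: keep counting
        else:
            sequence = 0           # new invoice: reset the counter
            current_invoice = invoice_id
        result.append((row_id, sequence))
    return result

rows = [(1, 1), (1, 2), (2, 3), (2, 4), (2, 5)]  # (invoice_id, id)
print(assign_sequences(rows))
# [(1, 0), (2, 1), (3, 0), (4, 1), (5, 2)]
```

The sort requirement is exactly why the ORDER BY in the SQL version matters: without it, the "same invoice as previous row" check breaks.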

Can two applications listen on the same port at the same time?

We are integrating with an external system which makes HTTP POST requests to an agreed URL. To learn the exact format of the data posted to us, we set up a primitive server written in Python that logs the request body to a file and responds with a dummy response.

We kept this server running for several days while we were building our actual implementation. When we wanted to try out our implementation, we decided to ssh into the server where the requests were arriving and port forward the traffic to our development machine (ssh -R 8090:localhost:8090 user@server).

We expected the port forwarding to fail because our primitive server was already listening on the same port. To our great surprise, it didn't fail. The forwarding worked just fine and we could test our implementation. We got disconnected from the server at some point and reconnected again when we needed to do more testing. To an even greater surprise, the primitive server had continued to receive requests while we were disconnected!

This was contrary to everything we had learned so far about sockets, so we decided to investigate how this is possible. We used netstat to see who was actually listening on that port:

$ sudo netstat -lp
tcp    0  0 *:8090             *:*           LISTEN      32256/python
tcp6   0  0 localhost:8090     [::]:*        LISTEN      4487/1


The primitive server was written in Python, and it seems that it binds to all IPv4 interfaces on that machine. However, ssh binds to the IPv6 localhost interface, where no one else is listening on that port.

The system we are integrating with was running on that same machine and was configured to make requests to the URL http://localhost:8090/. It seems that there is a hierarchy in place where localhost:8090 is first picked up by the IPv6 stack, and if no one is listening there, then IPv4 gets a chance.

So when we had the ssh tunnel in place, IPv6 was used and we got all the requests on our development machine. When we disconnected, IPv4 was used and our primitive logger server handled the requests.

This is a nice feature of the network stack and something that could be utilized again in the future.
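The coexistence itself can be demonstrated in a few lines of Python: an IPv4 wildcard listener and an IPv6 loopback listener can share the same port number. This is only a minimal sketch, using an ephemeral port instead of 8090; the IPV6_V6ONLY option keeps the second socket off the IPv4 stack, which is roughly the situation the netstat output above shows:

```python
import socket

# IPv4 listener on all interfaces, ephemeral port
# (stand-in for the primitive Python logging server)
v4 = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
v4.bind(('0.0.0.0', 0))
v4.listen(1)
port = v4.getsockname()[1]

# IPv6 listener on loopback, same port number (stand-in for the ssh
# tunnel endpoint); IPV6_V6ONLY keeps it purely on the IPv6 stack,
# so it does not conflict with the IPv4 socket
v6 = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
v6.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 1)
v6.bind(('::1', port))
v6.listen(1)

print('both sockets listening on port', port)
v4.close()
v6.close()
```

Both bind() calls succeed because the two sockets live in different address families; a client connecting to "localhost" then reaches whichever stack its resolver tries first.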

Using DBDeploy in Gradle

As you probably know, DBDeploy is a tool for easily and automatically installing database changes, and Gradle is a next-generation build automation tool (like Ant and Maven). The question is: how to use DBDeploy in Gradle scripts? It seems that DBDeploy doesn't have a Gradle plugin yet, nor does Gradle have a DBDeploy plugin. After some experimenting we found that the easiest way is to reuse the DBDeploy Ant task. Let's see the example below, assuming that our project has a "db" folder with all the SQL scripts:

  • db

    • create_changelog_table.sql
    • 001_create_customer_table.sql
    • 002_create_address_table.sql
    • 003_etc...
  • build.gradle

We can create a Gradle build file containing 3 tasks:

project.ext {
  dbDriver = 'com.mysql.jdbc.Driver'
  dbUrl = 'jdbc:mysql:///codeborne?useUnicode=yes&characterEncoding=UTF-8'
  dbUsername = 'codeborne'
  dbPassword = 'codeborne'
}

task updateDatabase << {
  ant.taskdef(name: 'dbdeploy',
              classname: 'com.dbdeploy.AntTarget',
              classpath: configurations.compile.asPath)

  ant.dbdeploy(driver: dbDriver,
    url: dbUrl,
    userid: dbUsername,
    password: dbPassword,
    dir: 'db',
    dbms: 'mysql',
    undooutputfile: 'db/undo_last_change.sql')
}

task createChangelogTable << {
  ant.sql(driver: dbDriver,
          url: dbUrl,
          userid: dbUsername,
          password: dbPassword,
          encoding: 'UTF-8',
          classpath: configurations.compile.asPath) {
      fileset(file: 'db/create_changelog_table.sql')
  }
}

task undoLastChange << {
  ant.sql(driver: dbDriver,
          url: dbUrl,
          userid: dbUsername,
          password: dbPassword,
          encoding: 'UTF-8',
          classpath: configurations.compile.asPath) {
      fileset(file: 'db/undo_last_change.sql')
  }
}

Now we have 3 Gradle tasks:

> gradle createChangelogTable
> gradle updateDatabase
[ant:dbdeploy] dbdeploy 3.0M3
[ant:dbdeploy] Reading change scripts from directory /home/andrei/projects/blog-gradle-dbdeploy/db...
[ant:dbdeploy] Changes currently applied to database:
[ant:dbdeploy] 1..61
[ant:dbdeploy] Scripts available:
[ant:dbdeploy] 62..62
[ant:dbdeploy] To be applied:
[ant:dbdeploy] 62..62
[ant:dbdeploy] Applying #62: 062_migrate_currency_to_eur.sql...
[ant:dbdeploy] -> statement 1 of 5...
[ant:dbdeploy] -> statement 2 of 5...
[ant:dbdeploy] -> statement 3 of 5...
[ant:dbdeploy] -> statement 4 of 5...
[ant:dbdeploy] -> statement 5 of 5...
[ant:dbdeploy] Generating undo scripts...

> gradle undoLastChange

Now you must run "gradle createChangelogTable" once and can then execute "gradle updateDatabase" as often as you wish, running "gradle undoLastChange" to roll back the latest change (as long as you haven't committed your changes!). The bottom line: Gradle has a very concise, readable syntax for build scripts, and DBDeploy is a simple and stable way of applying database changes. They work fine together. Happy databasing!

SoapUI Tests with Ant+Ivy

Do you like quest games? Everybody does! Today we had to play one. Its name was "run SoapUI tests with an Ant script".

SoapUI is the de facto number one tool for testing SOA services. It's young enough to have a Maven 2 plugin, and it's old enough to have a Maven 1 plugin. But we are not lucky enough: SoapUI doesn't have an Ant task. This article shows how to run SoapUI tests from an Ant script. We suppose you are already using Ivy. So, let's begin.

Ant script

The Ant task is simple enough:

<target name="test">
    <mkdir dir="test-results"/>
    <java classname="" errorproperty="tests-failed" fork="yes" dir="test-results">
      <arg line="-j -f${basedir}/test-results"/>
      <arg value="-t${basedir}/soapui-settings.xml"/>
      <arg value="${basedir}/MY-SMART-soapui-project.xml"/>
      <classpath>
        <fileset dir="lib" includes="*.jar"/>
      </classpath>
    </java>

    <junitreport todir="test-results">
      <fileset dir="test-results">
        <include name="TEST-*.xml"/>
      </fileset>
      <report format="frames" todir="reports/html"/>
    </junitreport>

    <fail if="tests-failed"/>
</target>

Ivy configuration

Next, you need to add corresponding dependencies to ivy.xml:

<dependencies defaultconf="default->default">
    <dependency org="junit" name="junit" rev="4.10+" />
    <dependency org="eviware" name="maven-soapui-plugin" rev="4.0.1" />
    <dependency org="net.sf.jtidy" name="jtidy" rev="r938+"/>
    <exclude org="jtidy" module="jtidy"/>
</dependencies>

And finally, you need to add repository to your ivysettings.xml:

<ivysettings>
  <settings defaultResolver="default"/>
  <resolvers>
    <ibiblio name="public" m2compatible="true"/>
    <ibiblio name="eviware" m2compatible="true" root=""/>
    <chain name="default" returnFirst="true">
      <resolver ref="eviware"/>
      <resolver ref="public"/>
    </chain>
  </resolvers>
</ivysettings>

Yes, eviware should come first, because it overrides some artifacts from the central repository, e.g. javax.jms:jms.

SoapUI settings

Typically all tests have some configuration parameters. It's common to declare them in SoapUI's global properties. Fortunately, these can be stored in VCS in the file soapui-settings.xml:

<?xml version="1.0" encoding="UTF-8"?>
<con:soapui-settings xmlns:con="">
  <con:setting id="GlobalPropertySettings@properties"><![CDATA[<xml-fragment xmlns:con="">
  <property xmlns="">


Execute ant test and be prepared to download ~60 MB of jars. I have no idea why SoapUI needs so much (for example, why does it need javax.jms?), but that's Java, guys. After the run, you will find JUnit-style HTML reports in the folder reports/html, and several log files in the test-results folder.


Missing dependencies

Since the eviware repo overrides some artifacts from the central repository, you can get in trouble if some artifacts are already cached in your local repository. For example, we got this problem with the javax.jms:jms artifact.

[ivy:retrieve] :::: WARNINGS
[ivy:retrieve] [NOT FOUND] javax.jms#jms;1.1!jms.jar (0ms)
[ivy:retrieve] ==== public: tried
[ivy:retrieve] ::::::::::::::::::::::::::::::::::::::::::::::
[ivy:retrieve] :: FAILED DOWNLOADS ::
[ivy:retrieve] :: ^ see resolution messages for details ^ ::
[ivy:retrieve] ::::::::::::::::::::::::::::::::::::::::::::::
[ivy:retrieve] :: javax.jms#jms;1.1!jms.jar
[ivy:retrieve] ::::::::::::::::::::::::::::::::::::::::::::::

The workaround is simple: rm -fr ~/.ivy2/cache/javax.jms/jms


Though the SOA concept is the offspring of the devil, SoapUI is a great tool for testing it, and automated testing is a great practice. Putting them together is a good level in a programmer's quest game. The Maven repository is the inevitable beast of Satan at the end of the level. Be patient and kill this boss.

How to use Mobile-ID in Python

Mobile-ID (Mobiil-ID) is a personal mobile identity in Estonia and Lithuania, provided by an additional application on a SIM card.

The good thing is that it is backed by the government and provides the same level of security for authentication and digital signatures as a national ID card, without the need for a smart card reader.

So, while thinking about adding Mobile-ID authentication to our free domain service, I came up with this Python code, which is incredibly simple. Replace +372xxxxxx with your own phone number for testing and register it with the test service before running the code.

from sys import exit
from suds.client import Client
import time
import logging

logging.basicConfig()

url = '' # test env
client = Client(url)

mid = client.service.MobileAuthenticate('', '', '+372xxxxxx', 'EST', 'Testimine', ' login', '12345678901234567890', 'asynchClientServer', 0, False, False)
if mid.Status != 'OK':
    print mid.Status
    exit(1)

print 'Challenge: ' + mid.ChallengeID

# poll until the user has entered the PIN on the phone
status = 'OUTSTANDING_TRANSACTION'
while status == 'OUTSTANDING_TRANSACTION':
    time.sleep(5)
    status = client.service.GetMobileAuthenticateStatus(mid.Sesscode, False).Status

if status != 'USER_AUTHENTICATED':
    print status
    exit(1)

print 'Authenticated: ' + mid.UserGivenname + ' ' + mid.UserSurname + ', ' + mid.UserIDCode

Don't forget to install the SUDS Python library, which does the magic of creating SOAP requests (apt-get install python-suds on Debian & Ubuntu).

Archived comments

ornyx 2012-03-07T20:17:07.200Z

Sadly, this script does not work for me in the real environment, even though the customer has an agreement - it returns a 101 server fault. What is the real URL and request? Is everything the same?

Keep API simple

I want to share a success story of designing a simple API, when the problem seemed to be complex at first glance.

Recently we got a task: we had to log every action a user performs in our web application. In other words, we needed a simple class (an API) which could easily be used from almost all the controllers in our application. Additionally, we should log different parameters depending on the action. We also got a draft solution for this task from unknown developers. We are not going to look inside the logging itself; we are interested in how easy the logging is to use (its API). Below you can see typical controller code for logging a user action:

private void logUserAction(User user) throws Exception {
  UserActionModel model = new UserActionModel();
  model.setAction("Buy a ticket");

  List<String> values = new ArrayList<String>();
  values.add("PersonCode");
  values.add("UserName");

  model.setParams(LogUtil.getParamsWithFieldNames(user, values));
}

You can probably guess how LogUtil.getParamsWithFieldNames works. Given a "user" object and a list of Strings, it calls the corresponding getters: user.getPersonCode(), user.getUserName(), etc.

Isn't it nice?

Look, what a mature, multi-purpose solution! It doesn't depend on a concrete class - it could be User, Client, Customer or whatever else. You can pass any object and a list of field names, and this universal LogUtil can log all those fields. Yeah, that's a smart API!


But you know what? You just don't need this smartness. Stop for a while and think: couldn't it be done more simply? Of course it could! Why do you need reflection? Why lose compile-time checking of getter names? Why bother with handling reflection's (checked) exceptions? Dude, just use getters!


The final solution didn't use reflection and was not smart - it was simple. The following is typical controller code for logging a user action:

Action action = new Action("Buy a ticket")
    .add("PersonCode", user.getPersonCode())
    .add("PersonName", user.getPersonName())
    .add("EmailAddress", user.getContactInformation().getEmailAddress())
    .add("Language", user.getContactInformation().getLanguage());


As simple as possible.
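The whole trick is that add() returns the object itself, which is what makes the chaining work. A minimal Python sketch of the same fluent pattern (class and method names assumed from the Java example above, sample values invented):

```python
class Action:
    """A minimal sketch of the fluent Action API from the example above."""

    def __init__(self, name):
        self.name = name
        self.params = []          # ordered (key, value) pairs

    def add(self, key, value):
        self.params.append((key, value))
        return self               # returning self is what enables chaining

# chained calls read like a declaration of what gets logged
action = (Action("Buy a ticket")
          .add("PersonCode", "38001010000")
          .add("PersonName", "Mari Maasikas"))
print(action.name, action.params)
```

No reflection, no string-to-getter mapping, and a typo in a key is just a wrong label rather than a runtime lookup failure.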

Archived comments

Ivo 2011-10-19T12:24:13.531Z

Why would you implement this programmatically anyway? IMHO this is a clear case for aspect usage ...