All posts by tobias

Boomi: Debugging, oh finally

If you have ever wanted to do some debugging in Boomi, well, it is a bit of a challenge if you are used to what is available when doing traditional programming in any language.

This Groovy script will dump the following information to the App Log.

  • DDP: Dynamic Document Properties
  • DPP: Dynamic Process Properties
  • Document content

The following Groovy script prints out exactly that, and it can be injected anywhere in your flow.
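A minimal sketch (one assumption: Boomi does not, as far as I know, let a script enumerate DPPs, so the DPP names below are placeholders you replace with your own):

import java.io.ByteArrayInputStream
import java.util.Properties
import com.boomi.execution.ExecutionUtil

def logger = ExecutionUtil.getBaseLogger()

// DPPs have to be fetched by name (placeholder names)
['dpp_batch_id', 'dpp_run_date'].each { name ->
    logger.info("DPP ${name} = ${ExecutionUtil.getDynamicProcessProperty(name)}")
}

for (int i = 0; i < dataContext.getDataCount(); i++) {
    Properties props = dataContext.getProperties(i)

    // All document-level properties; DDPs show up with the
    // document.dynamic.userdefined. prefix
    props.stringPropertyNames().each { key ->
        logger.info("DDP ${key} = ${props.getProperty(key)}")
    }

    // Dump the content, then pass the document through unchanged
    String content = dataContext.getStream(i).getText('UTF-8')
    logger.info("Document ${i} content: ${content}")
    dataContext.storeStream(new ByteArrayInputStream(content.getBytes('UTF-8')), props)
}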

Just add a “Data Process” shape, include the script, and inject it anywhere you like.

When running, the debugging info can be found in the App Log; see the example below.

Network problems in Azure – Development Tools console to the rescue !

If you, like me, just can’t figure out why you are unable to reach a machine/host, the built-in Console for an Azure Web App can be quite powerful. You will find it under Development Tools.

However, if you turn to “ping” in the console for help, you get the following

Unable to contact IP driver. General failure.

Use instead the command

tcpping www.tsoft.se
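Note that tcpping can also test a specific port by appending it to the host name, e.g.

tcpping www.tsoft.se:443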

If you would like to find out the IP address, then do

nslookup www.tsoft.se

Boomi: Check if already in cache before adding

I found myself with the error

Found more than 1 document in the document cache (index: dummyUserIndex, keys: [id (Root/Object/id) = 100])

and figured I just needed to add a Decision Shape and check if the key (object) is already in the cache, and if so not add it. BUT it turned out to be more complicated than I first thought.

Here is my first attempt…

Seems reasonable … I have two steps: first I check if the object is in the cache; if it is I don’t do anything, and if it is not I go and fetch it and add it to the cache.

The problem is that the JSON that comes in is a list of objects, including a deliberate duplicate to show the point I am making.
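Something like this (hypothetical values; id 100 is the deliberate duplicate):

[
  { "id": 100, "name": "Alice" },
  { "id": 101, "name": "Bob" },
  { "id": 102, "name": "Carol" },
  { "id": 100, "name": "Alice" },
  { "id": 103, "name": "Dave" },
  { "id": 104, "name": "Eve" }
]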

Now what we end up having is 6 JSON objects. When Boomi processes them, it will run ALL of them through the Decision Shape (“Already in cache?”) above before it does anything else, and hence NONE of them are in the cache yet.

So when we execute it, it looks like this

Apart from the error we get, we can see a peculiar thing: we never get a cache hit, instead we get all cache misses !!!

I ended up having to add a Flow Control shape and select “Run each document individually”, see below.

Now it runs fine …

We can see that we have a cache hit ! This is great, and we also did not get an error 🙂

Over and out !

Boomi: Dynamic Document Properties in shapes and Groovy scripts

In my case I wanted to count the number of elements inside an XML result I got back from an SAP query. Below is a small example.

Inside the Set Properties shape you specify a Dynamic Document Property like this

In the Groovy script we set that attribute using the full path of the dynamic document property, which must be prefixed with document.dynamic.userdefined

document.dynamic.userdefined.rowcount

E.g. a minimal sketch of such a Data Process script (the element name ‘row’ is a placeholder, adjust it to your SAP result):
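import java.io.ByteArrayInputStream
import java.util.Properties

for (int i = 0; i < dataContext.getDataCount(); i++) {
    String text = dataContext.getStream(i).getText('UTF-8')
    Properties props = dataContext.getProperties(i)

    // Count the elements we are interested in anywhere in the tree;
    // the element name 'row' is a placeholder
    def xml = new XmlSlurper().parseText(text)
    int rowcount = xml.'**'.findAll { it.name() == 'row' }.size()

    // The DDP must be set with the full document.dynamic.userdefined prefix
    props.setProperty('document.dynamic.userdefined.rowcount', String.valueOf(rowcount))

    // Pass the original document through unchanged
    dataContext.storeStream(new ByteArrayInputStream(text.getBytes('UTF-8')), props)
}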

When you later try to access the Dynamic Document Property in a shape, you just use the last part again, in my example “rowcount”.

Boomi: Create XML or JSON document through Message shape

Creating a JSON object in the Message shape looks like this. Here it is important that the message starts with a ‘ and ends with a ‘ (single quote).
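For example (hypothetical payload):

'{ "id": 100, "name": "Alice" }'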

Creating an XML document

For JSON the message must start and end with a ‘ (single quote); this is not necessary for XML, but it works there too.
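A minimal hypothetical example (no surrounding quotes needed):

<materials>
  <material>
    <id>100</id>
  </material>
</materials>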

However, here you should NOT have the <?xml …?> declaration at the front, because then Boomi is unable to parse it as an XML element.

<?xml version="1.0" encoding="UTF-8" standalone="no"?>

Including the XML declaration does not work !!!

And if you do forget, this is the error you can expect:

The test run for fetch Materials from MARA (SAP) you are attempting completed with the following errors: Embedded message: Unable to create XML files from data, the document may not be well-formed xml ; Caused by: Illegal processing instruction target (“xml”); xml (case insensitive) is reserved by the specs. at [row,col {unknown-source}]: [2,5]

Using Special characters in XML…

So if you intend to include special characters like a single quote or similar, you need to XML-encode them. Use a tool like https://coderstoolbox.net/string/#!encoding=xml&action=encode&charset=us_ascii

E.g. if you would like to have an XML that looks like this

MAKTL EQ 'ABC123'

SAP query example where single quotes are used

Then that should be converted into

MAKTL EQ &#39;ABC123&#39;

This is how the encoded XML should look.
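A sketch of how the encoded value could sit inside the XML (element names are placeholders, not the actual SAP connector structure):

<filter>
  <row>
    <TEXT>MAKTL EQ &#39;ABC123&#39;</TEXT>
  </row>
</filter>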

Lua scripting in KrakenD

Here are some notes on how to get going with Lua scripting in KrakenD.

Imagine you have a backend that returns a JSON like this
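Say something like this (hypothetical payload; note the internal “databaseId” field, which becomes relevant below):

{
  "databaseId": 45,
  "name": "Alice",
  "email": "alice@example.com"
}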

So we route:

KrakenD: GET /public/api/get-indv/45  →  Service A: GET /api/indv/45

The most basic KrakenD config would be something like this (a sketch; host and port are placeholders):
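{
  "version": 3,
  "endpoints": [
    {
      "endpoint": "/public/api/get-indv/{id}",
      "method": "GET",
      "output_encoding": "json",
      "backend": [
        {
          "host": [ "http://service-a:8080" ],
          "url_pattern": "/api/indv/{id}",
          "encoding": "json"
        }
      ]
    }
  ]
}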

In fact you could remove ‘output_encoding’ and ‘encoding’, because those are ‘json’ by default, but having them gives clarity to the example.

Remove unwanted fields from the JSON

Let’s say that we do not want to expose the field “databaseId”. Then we can apply a Lua script in KrakenD that solves this for us. The configuration in KrakenD can look something like this (a sketch; the file and function names are my own placeholders):
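"extra_config": {
  "modifier/lua-proxy": {
    "sources": [ "clean_response.lua" ],
    "post": "remove_database_id()"
  }
}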

The Lua code we place in a file in the same directory as the KrakenD config.

And the Lua function could look something like this (placeholder names again):
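-- clean_response.lua
function remove_database_id()
  local r = response.load()
  -- drop the field we do not want to expose
  r:data():del("databaseId")
end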

Dump some request info and the Data-object to the KrakenD log

Let’s place the Lua script at the ‘proxy’ level to dump some info; the config could look something like this (again with placeholder names):
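"extra_config": {
  "modifier/lua-proxy": {
    "sources": [ "dump.lua" ],
    "pre": "dump_request_info()",
    "post": "dump_data()"
  }
}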

The Lua code (print() output ends up in the KrakenD log; the “name” key is from the example payload above):
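-- dump.lua
function dump_request_info()
  local r = request.load()
  print("method: " .. r:method())
  print("path:   " .. r:path())
  print("url:    " .. r:url())
end

function dump_data()
  local r = response.load()
  -- the data object is userdata, so print individual keys
  print("name: " .. tostring(r:data():get("name")))
end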

Set or modify the body when it is Content-Type: text

Note that KrakenD needs to be told how to interpret the data. Above, the encoding was set explicitly to ‘json’; here we will set it to ‘string’ (i.e. text). The KrakenD config could look something like this (a sketch):
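{
  "endpoint": "/public/api/text",
  "method": "GET",
  "output_encoding": "string",
  "backend": [
    {
      "host": [ "http://service-a:8080" ],
      "url_pattern": "/api/text",
      "encoding": "string",
      "extra_config": {
        "modifier/lua-backend": {
          "sources": [ "body.lua" ],
          "post": "set_body()"
        }
      }
    }
  ]
}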

The Lua code below (placeholder names again):
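-- body.lua
function set_body()
  local r = response.load()
  -- with 'string' encoding the whole body lives under the key 'content'
  local body = r:data():get("content")
  r:data():set("content", body .. " -- modified by Lua")
end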

Note that when we would like to modify the body, the data-object key will be ‘content’.

Set the HTTP header Content-Type from a Lua script

Imagine that encoding/output_encoding was set to ‘string’; then KrakenD will return the data with Content-Type set to ‘text/plain; charset=utf-8’.

If you for some reason get a string back but would like to set the Content-Type to ‘application/json; charset=utf-8’, you can do this from a Lua script like this (a sketch; hook it in as a ‘post’ function like the ones above):
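function set_content_type()
  local r = response.load()
  -- headers() with two arguments sets the header
  r:headers('Content-Type', 'application/json; charset=utf-8')
end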

Bless you !
-Tobias

SSH tunneling

Local Port Forwarding

When you want to access your MySQL database from home, but only have an ssh connection.

I find myself reading up on this time and time again; I guess I do not do it often enough. So I decided to write this down, at least for my own sake, but perhaps someone else is also in need of this small example/tutorial, so let’s put it out on the internet for all to see.

The example below intends to show how I can connect from my laptop at home to my office, where an ssh connection has been allowed through a port forward in the router to a machine (a Raspberry Pi), and then connect to a database on another machine in the same network. See the picture below.

The ssh command looks like this

ssh -L 9900:192.168.0.30:3306 -p 21022 ssh-user@88.123.234.100

What the command does:

  1. Connect to the router at 88.123.234.100 (which is a public IP).
  2. The router listens on port 21022 and forwards the connection to port 22 on 192.168.0.20 (the Raspberry Pi’s ssh server).
  3. The ssh server on the Raspberry Pi forwards the connection once more to 192.168.0.30 (the Linux PC) on port 3306, which is the MySQL port.
  4. Port 9900 is a local port on my laptop; connecting my database client (e.g. DBeaver) to localhost:9900 forwards it to the MySQL process listening on port 3306 on 192.168.0.30, as in the example below.
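For example, with the stock MySQL command line client (the user name is a placeholder):

mysql -h 127.0.0.1 -P 9900 -u dbuser -p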

That is it !
/Tobias

Logstash

Base config
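A reasonable starting point for experimenting (a sketch; stdin/stdout are just for trying out the filters below):

input {
  stdin { codec => json }
}

filter {
  # the filters from the sections below go here
}

output {
  stdout { codec => rubydebug }
}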

Remove nested field

The JSON looks something like this (hypothetical values, field names kept from my case):
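{
  "entitet": {
    "attribut": {
      "mottagare": "someone@example.com",
      "namn": "Alice"
    }
  }
}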

With this config the field “entitet / attribut / mottagare” can be removed (using the mutate filter’s nested-field syntax):
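filter {
  mutate {
    # nested fields are addressed as [outer][inner][field]
    remove_field => [ "[entitet][attribut][mottagare]" ]
  }
}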

Convert JSON field to string

If we would like to convert a field that is a JSON object into a string, we can use the ruby filter to make it a string instead. Something like this (reusing the field from the example above):
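filter {
  ruby {
    # serialize the object under 'entitet' into a JSON string
    code => "require 'json'; event.set('entitet', event.get('entitet').to_json)"
  }
}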

The output now looks like this
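(continuing the example above; the nested object is now a single string value)

{
  "entitet": "{\"attribut\":{\"mottagare\":\"someone@example.com\",\"namn\":\"Alice\"}}"
}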

Postman Variables and Scripts

Tired of updating your JSON payload (attributes)? Let Postman do it for you with a script and a variable.

Sometimes you need to generate new timestamps or IDs or … for every new request you send with Postman. The solution is to use pre-request scripts. These scripts are JavaScript and populate values in your body content (JSON in my case) through variables.

With the following JSON
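(the attribute names and {{…}} variables are placeholders of mine)

{
  "id": "{{myId}}",
  "timestamp": "{{myTimestamp}}"
}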

And the following pre-request script
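(a sketch using the pm API and the built-in $guid dynamic variable)

// generate fresh values before every request
pm.variables.set("myId", pm.variables.replaceIn("{{$guid}}"));
pm.variables.set("myTimestamp", new Date().toISOString());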

then you can get the following result
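(hypothetical values)

{
  "id": "3f2c1a9e-0b7d-4f7e-9c2a-1e5d8b6a4c3f",
  "timestamp": "2024-01-01T12:00:00.000Z"
}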

Inserting a json attribute from one query into another using a script and variable

In my case I was first querying to get an authentication token (CSRF_NONCE) that I should use in the forthcoming queries as a header.

Step 1) Query for a token

GET https://my.super-secret.server.com/api/security/csrf

Using Basic Authentication (Username/Password)

I would get back a JSON payload like this (the attribute name below is an assumption, matching the script that follows):
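{
  "nonce": "a1b2c3d4e5f6"
}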

Now let’s create a script that extracts that nonce token

NOTE ! You use the “Post-response” script tab to the left.

The script could be something like this
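(a sketch, assuming the payload above)

// extract the nonce from the response and store it in a
// global variable called NONCE
const body = pm.response.json();
pm.globals.set("NONCE", body.nonce);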

Step 2) Query for the data

Add the header attribute (CSRF_NONCE in my case) and set the value to {{NONCE}}, which is the variable you set in the script above.

After the query in Step 1) has been executed you can verify that the global variable has received the right value by going to the following view

You can also see the value in the Postman Console.

jq to the rescue

jq is a really good tool when you have a lot of JSON and you are just interested in some of that data.

What if I just want to count the number of entries in a list?

Imagine I have this
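(hypothetical data, reused in the later examples)

[
  { "name": "Alice", "email": "alice@example.com" },
  { "name": "Bob",   "email": "bob@example.com" },
  { "name": "Carol", "email": "carol@example.com" },
  { "name": "Dave",  "email": "dave@example.com" },
  { "name": "Eve",   "email": "eve@example.com" }
]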

And now you want to know how many entries there are in that array/list

The jq query is so simple

. | length

(dot pipe length)

and the result is simply the number of elements e.g. 5

Only see the first N entries in an array (list)

.[:3]

The above would result in returning the first 3 entries in the array (list)

Only see some attributes

Imagine you have a JSON like the list above, and you only want to see the “email” attribute

(extract specific attributes from a JSON array of objects, from array to array)

The jq query would then be something like this

[.[] | { "email": .email } ]

and that would give you the following output
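(given the example list above)

[
  { "email": "alice@example.com" },
  { "email": "bob@example.com" },
  { "email": "carol@example.com" },
  { "email": "dave@example.com" },
  { "email": "eve@example.com" }
]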

You can use jqplay, JQ Kung Fu and other online alternatives to try it out, and of course the command line ‘jq’ command.

Just a list of strings for one “column” / attribute

If you want to just create an array of strings, then try this jq query

[.[] | .email ]

Which would give you the following
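(again for the example list)

[
  "alice@example.com",
  "bob@example.com",
  "carol@example.com",
  "dave@example.com",
  "eve@example.com"
]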

JSON to CSV
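One way to produce CSV from the example list is jq’s @csv builtin; my guess at the kind of query used is

.[] | [ .name, .email ] | @csv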

You will get the following output, with lots of quotes
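"\"Alice\",\"alice@example.com\""
"\"Bob\",\"bob@example.com\""
…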

That is not so nice, so to avoid the escaped quotes use the “Raw Output” function (the -r flag on the command line);
then the output will look like this
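"Alice","alice@example.com"
"Bob","bob@example.com"
…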

Much better 🙂