Building A Microservice

July, 2018
Hunter Cote
Microservice Documentation

What are we building?

In this tutorial we’ll build a microservice that acts as the middleman between Loyyal’s JSON-based HTTP APIs and the client/browser. By definition, a microservice is an HTTP server with a small API surface that functions independently of the other parts making up a larger application. Our main goal is to gather transaction data and present it on the frontend. Our server will be built with Node.js, Redis will act as our database and cache, and jQuery will power the (very minimal) frontend.

What purpose does this serve?

The short answer: significantly increased speed. Transaction data will display on the frontend, and for the best user experience, latency should be kept to a minimum. We’re solving two main problems by creating this caching layer.

Hitting Loyyal’s APIs every time the page loads is both slow and expensive. The goal is usually to interact with your database no more often than you absolutely have to. Say, for example, your programme contains 500 wallets. Calling the servers directly, it would take more than 10 seconds to retrieve the tx history: one API call for a record of all wallets, plus one API call per wallet for that wallet’s tx history (501 calls in total). That’s 10 seconds each time you load the page. Now imagine multiple parties loading the page, each querying the servers again for all the data. You can see why we need a solution.
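To make the call-count arithmetic concrete, here is a tiny sketch. The per-call latency (20 ms) is an illustrative assumption, not a measured figure:

```javascript
// Rough cost of hitting the API directly on every page load.
// One call fetches the wallet list, then one history call per wallet.
// The 20 ms round-trip time below is a hypothetical assumption.
function directLoadCost(walletCount, msPerCall) {
  const totalCalls = 1 + walletCount;
  return totalCalls * msPerCall;
}

console.log(directLoadCost(500, 20)); // 10020 ms, i.e. roughly 10 seconds
```

Even with optimistic latency, the cost scales linearly with the number of wallets on every single page load, which is exactly what the cache removes.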

This is where the microservice comes into play. It acts as the middleman: it hits Loyyal’s APIs for fresh data on a recurring schedule (you set the interval) and stores the response behind a new endpoint that clients can repeatedly hit to receive data in <1 second. Not only are data retrieval and thus the frontend display much quicker, you can also narrow the data set returned to you, promoting even faster response times, because the client no longer parses an entire history of transactions when it only needs the ones from the last few days. This relatively simple application improves performance and provides a better experience for the frontend user.

This post will be broken down into the following parts.
  1. Setup our node server and connect redis
  2. Write our main function that will
    1. call Loyyal’s ‘programme’ endpoint
    2. loop through the returned data (all associated wallets)
    3. make another API call to a different Loyyal endpoint, which returns all txs the wallet has initiated
    4. save the historical tx data to redis cache
  3. Set up our own API endpoint that
    1. when called, returns all tx history from the cache
    2. allows for query parameters to specify or limit what data to send back
      1. client can request tx data within a certain time frame
      2. client can also request only to get back a certain number of txs
  4. On the frontend, use AJAX and jQuery to request data from the last 7 days or 30 days

 

This tutorial does not cover installing Node or Redis.

Server Side

 

Start by initializing a node application.

$ npm init

Create index.js, the file that will contain our server-side code.

$ touch index.js

Install the node packages express, redis, axios, and nodemon. Redis will be our cache and database, axios will make our API calls, and nodemon is a helpful package for any node app that automatically restarts your server after each save.

$ npm install express redis axios nodemon
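After these two commands, your package.json should look roughly like the sketch below (the name and the version numbers are illustrative; npm pins whatever is current when you install):

```json
{
  "name": "tx-microservice",
  "version": "1.0.0",
  "main": "index.js",
  "scripts": {
    "start": "nodemon index.js"
  },
  "dependencies": {
    "axios": "^0.18.0",
    "express": "^4.16.0",
    "nodemon": "^1.18.0",
    "redis": "^2.8.0"
  }
}
```

Having index.js as "main" is what lets you later start the app with a bare `nodemon` command.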

Open index.js in your editor. Require the packages and initialize express.

const express = require('express');
const redis = require('redis');
const axios = require('axios');
const app = express();

In a new terminal tab, start the redis server.

$ redis-server

In index.js, create and connect to the redis client.

const client = redis.createClient();

client.on('connect', function() {
  console.log('Connected to Redis...');
});

client.on('error', function(err) {
  console.log(err);
});

Create the node server’s entrypoint (conventionally kept at the bottom of the file).

const port = process.env.PORT || 3000;

app.listen(port, function() {
  console.log(`Server started on port ${port}`);
});

In a different tab than the redis server, start the node server.

$ nodemon

 

CHECKPOINT

In the terminal, you should see the two console.log() messages stating that redis is connected and the node server has started.

 

Set CORS headers so all domains can access the API we’ll soon create.

app.use(function(req, res, next) {
  res.header("Access-Control-Allow-Origin", "*");
  res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
  next();
});

Define variables.

// Sets how often we request new data to update the cache (in milliseconds)
const frequency = 120000;
// Token required in the headers of the API calls to Loyyal
const accessToken = 'your_access_token';
const APIDomain = 'https://loyyal.loyyal-devnet.com';

Define the main function of our app. Note: explanations are included as inline comments to keep the function in one code block.

function getDataAndCache() {

  // API call to get all wallets associated to a programme
  axios({
    method: 'POST',
    url: `${APIDomain}/reporting/search-wallets`,
    headers: {'x-access-token': accessToken, 'content-type': 'application/json'},
    data: { "programme": "your_programme_name" }
  })
  // we loop through the returned array of wallets, and for each, we make another API call
  // the response of this additional request will be the tx history for that wallet
  .then(function(response) {
    const dataLength = response.data.wallets.length;
    for (let i = 0; i < dataLength; i++) {
      axios({
        method: 'POST',
        url: `${APIDomain}/reporting/search-history`,
        headers: {'accept': 'application/json', 'x-access-token': accessToken, 'content-type': 'application/json'},
        data: { "from": response.data.wallets[i].aliases[0].toString() }
      })
      .then(function(res) {
        // if a wallet has multiple txs, they're returned in an array
        // so to get each tx and its date (score), we loop through the wallet's entries as well
        for (let j = 0; j < res.data.entries.length; j++) {
        // calling 'client.ZADD' stores this response in our redis db as a sorted set, in the following format:
        // client.ZADD('sorted_set_name', 'member_score', 'unique_member_name')
        // the score must be an integer, which is why we convert the date
        // here each unique member is a tx, and its score is the date it completed at
        // redis only stores strings, which is why we need JSON.stringify()
          client.ZADD('data', `${Date.parse(res.data.entries[j].completed_at)}`, JSON.stringify(res.data.entries[j]));
        }
      })
      // error handling for the per-wallet history call
      .catch(function(err) {
        console.log(`Error fetching a wallet's tx history: ${err}`);
      });
    }
  })
  .catch(function(err) {
    console.log(`Error fetching the wallet list: ${err}`);
  });
}
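The ZADD calls above boil down to deriving an integer score from each tx’s completed_at timestamp and a string member from the tx object itself. A minimal sketch of that encoding, with a made-up tx in the shape this tutorial uses (redis itself is not involved here):

```javascript
// How one tx becomes a (score, member) pair for the sorted set.
// The tx object below is a hypothetical sample, not real API output.
const tx = {
  from: 'wallet_a',
  to: 'wallet_b',
  amount: { value: '25', currency: 'PTS' },
  completed_at: '2018-07-01T12:00:00.000Z'
};

// Score: completed_at converted to integer milliseconds, as ZADD requires
const score = Date.parse(tx.completed_at);
// Member: the whole tx serialized, since redis only stores strings
const member = JSON.stringify(tx);

console.log(score);                   // 1530446400000
console.log(JSON.parse(member).from); // 'wallet_a' — the object round-trips cleanly
```

Because the score is the completion time, the sorted set keeps txs in chronological order for free, which is what makes the date-range queries in the next section possible.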

 

Now that the tx data is stored in redis, we’ll serve it from a new API endpoint of our own. We’ll also set up query parameters to narrow the dataset returned to you; it’s much faster and cleaner to request txs from the last 24 hours than to request the entire history of txs and then filter through it. Using the redis command ZRANGEBYSCORE, we’ll allow the client to:

  • retrieve all tx data
  • retrieve all tx data within a specific time frame
  • paginate through tx data within a specific time frame (e.g. request only page 1 of the list of txs)
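Before wiring this into express, it helps to see what ZRANGEBYSCORE with a LIMIT clause actually does. Its semantics can be mimicked in plain JS: keep members whose score falls in [min, max], then apply an offset and count. A sketch with a made-up sorted set:

```javascript
// Plain-JS model of: ZRANGEBYSCORE key min max LIMIT offset count
// `set` stands in for the redis sorted set: members with integer scores.
const set = [
  { score: 100, member: 'tx1' },
  { score: 200, member: 'tx2' },
  { score: 300, member: 'tx3' },
  { score: 400, member: 'tx4' }
];

function zrangebyscore(set, min, max, offset, count) {
  const inRange = set
    .filter(e => e.score >= min && e.score <= max)
    .sort((a, b) => a.score - b.score)
    .map(e => e.member);
  // a count of -1 means "to the end", matching redis LIMIT semantics
  return count === -1 ? inRange.slice(offset) : inRange.slice(offset, offset + count);
}

console.log(zrangebyscore(set, 150, 400, 0, -1)); // ['tx2', 'tx3', 'tx4']
console.log(zrangebyscore(set, 0, 400, 1, 2));    // ['tx2', 'tx3']
```

The route below maps the 'from'/'to' query parameters onto min/max, and the 'page'/'count' parameters onto offset/count.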

 

app.get('/api/tx-history', function(req, res) {

  // the two if statements below tell the cache: if the 'from' and 'to' date parameters are not given, return everything from 0 to now
  let fromDate = req.query.from;
  let toDate = req.query.to;
  if (fromDate === undefined) {
    fromDate = 0;
  }
  if (toDate === undefined) {
    toDate = Date.now();
  }
  // we want to allow the client to determine the number of items to get back as well by passing through page & count parameters
  let page = req.query.page;
  let itemsPerPage = req.query.count;
  let start = 0;
  // if page & count are given, return results starting at top of that page
  if (page !== undefined && itemsPerPage !== undefined) {
    start = (page * itemsPerPage) - itemsPerPage;
  }
  // if page is given but count is missing, return everything (otherwise, the page will return an error)
  if (page !== undefined && itemsPerPage === undefined) {
    start = 0;
    itemsPerPage = -1;
  }
  // if page & count missing, return everything
  if (itemsPerPage === undefined && page === undefined) {
    start = 0;
    itemsPerPage = -1;
  }
  // the following command reads: from sorted set 'data', return members with a score between our 'from' and 'to' date integer values, limiting the number of txs sent back: offset the returned items by 'start' indexes and give us itemsPerPage items
  client.ZRANGEBYSCORE(['data', fromDate, toDate, 'limit', start, itemsPerPage], function(err, objByDateAndPage) {
    if (objByDateAndPage) {
      res.send(objByDateAndPage);
    } else {
      console.log(`Error reading tx history from the cache: ${err}`);
    }
  });
});
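The page/count branching above reduces to a small pure function, which makes the offset math easy to check. A sketch (the parameter names mirror the query params; note that in the real route req.query values arrive as strings and the arithmetic relies on coercion, while this sketch uses numbers for clarity):

```javascript
// Derive the LIMIT offset/count pair from optional page & count params.
// Mirrors the if-chain in the /api/tx-history route above.
function pageToLimit(page, count) {
  if (page !== undefined && count !== undefined) {
    // pages are 1-based: page 1 starts at index 0, page 2 at `count`, etc.
    return { start: (page * count) - count, count: count };
  }
  // count missing (or both missing): return everything; -1 means "no cap"
  return { start: 0, count: -1 };
}

console.log(pageToLimit(2, 10));                 // { start: 10, count: 10 }
console.log(pageToLimit(undefined, undefined));  // { start: 0, count: -1 }
```

So `?page=2&count=10` skips the first 10 txs and returns the next 10, while omitting both parameters returns the full range.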

In most cases, you won’t need to clear the cache, since each run of the main function writes over the existing data. If you do need to clear the cache, run this function.

function cleardb() {
  client.flushdb();
  console.log('Redis db flushed');
}

cleardb();

Lastly, we call our main function inside setInterval() so that fresh data lands in our db on a recurring basis. frequency was defined at the top of the file and determines the length of time between each refresh of data.

getDataAndCache(); // run once on startup so the cache isn't empty until the first interval fires
setInterval(getDataAndCache, frequency);

That’s all the code required for the backend.

Client Side

 

On the client side, we retrieve the data by making an API call to the endpoint we created a few moments ago. Here I’m doing it via AJAX and jQuery. In the code block below I’ve included some helpful computations:

  • The integer date for 24 hours ago
  • The integer date for one week ago
  • The integer date for 30 days ago
  • Example API URL requests with the week-ago & month-ago parameters already passed in
  • Looping through example response data and picking out key values
    • Number of txs
    • Number of wallets that have sent a tx
    • Average value of txs
    • Listing all attributes of a tx
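The metrics in this list can also be computed with a small pure function, decoupled from the DOM. A sketch against made-up response data (each entry is a JSON string, as the microservice returns; only the 'from' and 'amount.value' fields are assumed):

```javascript
// Compute tx count, unique sending wallets, and average tx value
// from the array of JSON strings the microservice returns.
function summarizeTxs(txData) {
  const txs = txData.map(s => JSON.parse(s));
  const wallets = new Set(txs.map(tx => tx.from));
  const total = txs.reduce((sum, tx) => sum + parseInt(tx.amount.value, 10), 0);
  return {
    numTxs: txs.length,
    numWallets: wallets.size,
    avgTxVal: txs.length ? total / txs.length : 0
  };
}

// Hypothetical sample data in the shape used throughout this tutorial
const sample = [
  JSON.stringify({ from: 'w1', amount: { value: '10' } }),
  JSON.stringify({ from: 'w1', amount: { value: '30' } }),
  JSON.stringify({ from: 'w2', amount: { value: '20' } })
];
console.log(summarizeTxs(sample)); // { numTxs: 3, numWallets: 2, avgTxVal: 20 }
```

The jQuery code that follows computes the same numbers inline while it renders each tx to the page.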

 

$( document ).ready(function() {
  console.log("jQuery ready");

  // helper: ISO date string for n days prior to now
  function daysAgo(n) {
    const d = new Date();
    d.setDate(d.getDate() - n);
    return d.toISOString();
  }

  const dayAgo = daysAgo(1);    // last 24 hours
  const weekAgo = daysAgo(7);   // 1 week ago
  const monthAgo = daysAgo(30); // 30 days ago

  // useful endpoints with parameters 
  const apiEndpoint = `http://localhost:3000/api/tx-history`;
  const prevDayData = `http://localhost:3000/api/tx-history/?from=${Date.parse(dayAgo)}`;
  const prevWeekData = `http://localhost:3000/api/tx-history/?from=${Date.parse(weekAgo)}`;
  const prevMonthData = `http://localhost:3000/api/tx-history/?from=${Date.parse(monthAgo)}`;

  // helpful values
  // console.log(Date.parse(dayAgo));
  // console.log(Date.parse(weekAgo));
  // console.log(Date.parse(monthAgo));
  // console.log(Date.now());

  $.ajax({
  url: apiEndpoint,
  method: 'GET',
  dataType: 'json',
  success: function(txData) {
    let numTxs = txData.length;
    let wallets = [];
    let totalTxVal = 0;

    // the length of the returned data is the number of txs for this time period
    for (var i = 0; i < txData.length; i++) {
      // each entry is a JSON string, so parse it once per iteration
      const tx = JSON.parse(txData[i]);
      totalTxVal += parseInt(tx.amount.value, 10);

      // visualizing each tx on the DOM
      $('body').append(`<div><h3>Tx ${i}</h3>
        <p><span>From:</span> ${tx.from}</p>
        <p><span>To:</span> ${tx.to}</p>
        <p><span>Ref:</span> ${tx.ref}</p>
        <p><span>Name:</span> ${tx.name}</p>
        <p><span>Value:</span> ${tx.amount.value}</p>
        <p><span>Currency:</span> ${tx.amount.currency}</p>
        <p><span>Status:</span> ${tx.status}</p>
        <p><span>Submitted at:</span> ${tx.submitted_at}</p>
        <p><span>Completed at:</span> ${tx.completed_at}</p>
      </div>`);

      // if multiple txs were sent by the same wallet, we'll come across it multiple times
      // to keep only unique values in our array, push a wallet only if it isn't already there
      if (!wallets.includes(tx.from)) {
        wallets.push(tx.from);
      }
    }

    // computing the average tx value for this time period
    const avgTxVal = numTxs ? totalTxVal / numTxs : 0;

    // appending the DOM elements with the values
    $('.numTxs').html(`${numTxs}`); // total number of txs
    $('.numWallets').html(`${wallets.length}`); // total number of wallets
    $('.avgTxVal').html(`${avgTxVal.toFixed(2)}`); // average tx value
  },
  error: function(err) {
    console.log(err);
  }
  });


// end of document.ready()
});

There are several ways to dissect the data depending on your needs, and this should be a nice head start.

End

 

There you have it. We’ve now bridged the gap between the data at Loyyal’s API endpoints and the client, and we’ve done it in a way that promotes speed for a positive user experience without burdening Loyyal’s servers.
