A practical introduction to functional programming with JavaScript.

Many articles talk about advanced functional programming topics, but I want to show you simple and useful code that you can use in your day-to-day developer life. I’ve chosen JavaScript because you can run it almost everywhere and it’s well suited for functional programming. Two of the reasons it’s so well suited are that functions are first-class citizens and that you can create higher-order functions.

Update: You can also read this post on DZone.

Higher-order functions are functions that take a function as an argument or return a function as a result, such as createAdd below:

function createAdd(a) {
    return function (x) {
        return a + x;
    };
}

var add3 = createAdd(3);

console.log(add3(5));
// output is 8

Note how you can store a function in a variable and call it later. A function stored in a variable is treated just like any other value:

typeof add3
// "function"

But why is it great that you can return a function as a result? Because you can return behaviour, not just values; this raises the level of abstraction and makes the code more readable and elegant.

Let’s take an example: you want to print the double of every number in an array, something every one of us does once in a while. You would probably write something like this:

var nums = [1, 2, 3, 4, 5];

for (var x = 0; x < nums.length; x++) {
    console.log(nums[x] * 2);
}

You can see a common pattern here: looping through an array is behaviour that you can extract into a function, so you don’t have to write it again.

How to do it?

var nums = [1, 2, 3, 4, 5];

var printDouble = function (k) {
    console.log(k * 2);
};

function forEach(array, functionToRunOnEveryElement) {
    for (var x = 0; x < array.length; x++) {
        functionToRunOnEveryElement(array[x]);
    }
}

forEach(nums, printDouble);

// output: 
// 2 
// 4 
// 6 
// 8 
// 10

The forEach function gets an array of numbers and a function, printDouble, and calls printDouble on every element of the array. This is actually such a useful function that it’s implemented on Array’s prototype, so you don’t have to write the previous code in every codebase you work on.

(forEach is a higher-order function too because it takes a function as a parameter.)

var nums = [1, 2, 3, 4, 5];

var printDouble = function (k) {
    console.log(k * 2);
};

nums.forEach(printDouble);

// output:
// 2
// 4
// 6
// 8
// 10

Welcome to a life without having to write loops again to do something with an array.

You can also write the previous code this way:

[1, 2, 3, 4, 5].forEach((x) => console.log(x * 2));

JavaScript has abstractions for similar common patterns, such as:

  • reduce can be used to produce a single output value from an array:
var nums = [1, 2, 3, 4, 5];

var add = function (a, b) {
    return a + b;
};

nums.reduce(add, 0);

// returns 15

What it does is:

0 + 1 = 1
1 + 2 = 3
3 + 3 = 6
6 + 4 = 10
10 + 5 = 15
  • map is similar to forEach, but the function you pass in transforms each element, and map returns a new array of the results:
var nums = [1, 2, 3, 4, 5];

function isEven(x) {
    if (x % 2 == 0) {
        return x + " is an even number";
    } else {
        return x + " is an odd number";
    }
}

nums.map(isEven);

// returns an array:
// [ '1 is an odd number',
//   '2 is an even number',
//   '3 is an odd number',
//   '4 is an even number',
//   '5 is an odd number' ]
  • filter is used for removing the elements that do not match a criterion:
var nums = [1, 2, 3, 4, 5];

var isEven = function (x) {
    return x % 2 == 0;
};

nums.filter(isEven);

// returns an array with the even numbers: [ 2, 4 ]

or using the arrow function syntax:

[1, 2, 3, 4, 5].filter((x) => x % 2 == 0);
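These abstractions also compose nicely. As an added illustration (not from the original post), here is filter, map, and reduce chained together:

```javascript
// Chaining filter, map, and reduce: the sum of the squares of the even numbers.
var result = [1, 2, 3, 4, 5]
    .filter(function (x) { return x % 2 == 0; })   // keep the even numbers: [2, 4]
    .map(function (x) { return x * x; })           // square each of them: [4, 16]
    .reduce(function (a, b) { return a + b; }, 0); // add them up: 20

console.log(result);
// 20
```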

A similarly common example from a web application:

function addAMonthOfSubscriptionToUser(username) {
    var user = db.getUserByUsername(username);
    user = addAMonthOfSubscription(user);
}

function addAYearOfSubscriptionToUser(username) {
    var user = db.getUserByUsername(username);
    user = addAYearOfSubscription(user);
}

function cancelSubscriptionForUser(username) {
    var user = db.getUserByUsername(username);
    user = cancelSubscription(user);
}

Don’t repeat yourself – as every good programmer knows.

In this scenario we can see a pattern, but it cannot be abstracted away with OOP structures, so let’s do our functional magic.

var modifyUser = function (modification) {
    return function (username) {
        var user = db.getUserByUsername(username);
        user = modification(user);
    };
};

var addAMonthOfSubscriptionToUser = modifyUser(addAMonthOfSubscription);
var addAYearOfSubscriptionToUser = modifyUser(addAYearOfSubscription);
var cancelSubscriptionForUser = modifyUser(cancelSubscription);


In the end, we have the same functions, expressed in a more elegant way.
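To see the pattern run end to end, here is a self-contained sketch where a hypothetical in-memory db object stands in for a real database layer (the db shape and the subscribedUntil field are made up for illustration):

```javascript
// Hypothetical in-memory "database" for illustration only.
var db = {
    users: { alice: { name: 'alice', subscribedUntil: 0 } },
    getUserByUsername: function (username) {
        return this.users[username];
    }
};

// An example modification: extend the subscription by 30 days.
var addAMonthOfSubscription = function (user) {
    user.subscribedUntil += 30;
    return user;
};

// The higher-order function: takes a modification, returns a
// username-based function that looks the user up and applies it.
var modifyUser = function (modification) {
    return function (username) {
        var user = db.getUserByUsername(username);
        user = modification(user);
        return user;
    };
};

var addAMonthOfSubscriptionToUser = modifyUser(addAMonthOfSubscription);

console.log(addAMonthOfSubscriptionToUser('alice').subscribedUntil);
// 30
```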

I think functional programming will increasingly creep into our everyday work and will be used alongside OOP, so we can have more abstract and readable codebases.

The cost of all this nice, abstract code is some runtime efficiency and more expensive developers.

Current state of our home project

Quick overview

Last summer, my cousin and I started working on our idea to crawl the comments of the internet and harness the data in them. Our current goal is to help marketers get useful insights from our data about the effects of their campaigns, releases, and presence in the digital world.

For example, a Chinese brand releases new phones. How do they get information about their users’ feedback? Mostly by looking at sales numbers, returned handsets, and perhaps emails from customers about features not working.

Our quest is to solve this problem by having a huge dataset of comments; by analysing it we can give useful insights about:

  • How positively the brand is perceived.
  • What people think is positive/negative about certain products.
  • Yearly/monthly/weekly breakdown of the buzz around them on the internet, with channel distribution (on which sites the comments appeared).
  • Whether people shared their own ideas or someone else’s content about the product.
  • Conversational clouds: what people mentioned in relation to the product (screen, charger, packaging).
  • Performance comparison with similar products (number of mentions, Samsung vs Xiaomi).
Current state of the solution

Our application’s backend is based on ‘templates’/scripts which instruct the crawler service to fetch the comments of various forums (currently a few tech sites, such as Ars Technica and XDA Developers); we have 12M documents at the moment. The comments are then analysed in microservices by their language, their sentiment, and the structure of their sentences. This info is saved into a database, and the data is indexed in an Elasticsearch cluster so we can query it quickly.

The frontend (you can access it here) currently enables you to search the data and create some really basic pie charts based on keywords.

The architecture without the language detection and the sentiment analysis services looks like this:

The tech stack is:

  • Spring Boot for microservices
  • Angular 1 + Bootstrap for the frontend
  • MongoDB for storage
  • Elasticsearch for indexing & querying

Things to add so we can demo it:

  • Create views which show where keywords were mentioned, broken down by time/site/language; sentiment of keywords; conversational clouds. Basically, any statistic that delivers value with as little development time as possible.
  • Add user/group/organisation management so only people with certain rights can access the data and generate reports. (We planned to use Stormpath, but their future is kinda shady.)

Our current goal is to deliver the MVP in 2-3 months and get feedback from customers as fast as possible so we can make sure we are heading in the right direction.


Thanks for reading, we appreciate any feedback/ideas in the comments or in an email.

Best lightweight Git service for your Raspberry/SOHO server

Not a paid ad.

I’ve finally found what I’ve been looking for: Gitea. It’s love at first sight, not only because of the beautiful UI, but because it doesn’t use SO MUCH GODDAMN MEMORY, which is expensive in the cloud for such a mundane thing as version management. For the past years I have had GitLab/Bitbucket/Stash servers for my personal projects, but they used too much memory, considering that the server was used by two people tops (GitLab recommends 4 GB and runs with 1 GB RAM + 3 GB swap). The problem with them is that they are written in Java and designed for massive scalability; Gitea, on the other hand, is a lightweight Go service forked from Gogs, consuming ~30 MB of memory under light usage. It is also blazing fast, and has great ticket management and a built-in wiki. It’s almost as good as GitHub.

You can get it here: https://gitea.io/

It’s also fairly quick to set up:

  • clone it
  • create a user for it
  • register it as a service (optional, since it’s only a single binary)
  • edit the config file

GOD BLESS the great guys who designed Go, so people can write efficient applications and don’t have to bother with C and manual memory allocation anymore.

How to offload CPU-heavy code to Lambda with Node.js


The title says it all: if you have a small service (in this case written in Node.js) running on a server with limited capabilities, you might run into problems when you want to do processor/memory-heavy computations. A solution for this is to offload the work to Lambda, which scales automatically and where you only pay for computation. So if you only rarely need the computation, you can do this instead of renting a bigger server and save heaps of money.

Talk is cheap, let’s build it.

First of all, if you don’t have them already, you will need the following:

Amazon AWS account

We will create the lambda function with this, and our extension of the application will run here.


Node.js is an open-source, cross-platform JavaScript runtime environment for developing a diverse variety of tools and applications.

You can get it here: download nodejs


It’s for managing JavaScript packages; if you install Node.js, you will have it.


You have to log in to AWS and create a new lambda function:


You can choose the “Blank template” blueprint.

Name your function, give it a description, and choose the Node.js environment:


Insert the following code. It returns a JSON response when it receives a request; if the request JSON has an attribute named “data_to_transform”, it returns its value squared.

var compute_heavy_task = function (data) {
    return data * data;
};

exports.handler = (event, context, callback) => {
    callback(null, {
        lambda_function_name: 'convert_stuff',
        original_data: event.data_to_transform,
        transformed_data: compute_heavy_task(event.data_to_transform)
    });
};

Create a role from a policy template and name it:


Review your settings:


Now you can test it with the test settings, and you’re supposed to get back a JSON like:

{
  lambda_function_name: 'convert_stuff',
  original_data: 500,
  transformed_data: 250000
}

if your test request looks like:

{ "data_to_transform": 500 }

So in the second part, we are going to create a Node.js app which has AWS auth credentials and can call the previously created function.

But first we have to create the credentials. Go to AWS console / IAM -> Users -> Add user.




Save the credentials as a CSV file. Afterwards, clone the following git repository, run npm install, and run it with Node.js:

git clone https://github.com/yodeah/offload_to_lambda
cd offload_to_lambda
npm install

!! edit the config file with the necessary data !!

node app.js


Nginx HTTPS load balancer with Let’s Encrypt cert (on AWS)


Part 1: Create a working http load balancer

I’ve decided to use Amazon for hosting my (Ubuntu 14.04 Trusty) server on a t2.nano (still overkill; anything with 256 MB of RAM is more than sufficient).

  1. You have to create a security profile which opens port 22 for SSH, 80 for HTTP, and 443 for HTTPS.
  2. ssh into your server.
  3. Fetch the updates from the server and install nginx: sudo apt-get update && sudo apt-get upgrade, then sudo apt-get install nginx
  4. Back up the config file; it is always considered good practice to do so: cp /etc/nginx/nginx.conf /etc/nginx/nginx.conf.backup
  5. Modify the http part of the config file to the code below; the 2 servers are the ones you’re sending the load to (the connection to those is plain HTTP): sudo nano /etc/nginx/nginx.conf
    http {
        upstream myapp1 {
            server google.com;
            server yahoo.com;
        }
        server {
            listen 80;
            location / {
                proxy_pass http://myapp1;
            }
        }
    }
  6. Restart nginx (sudo service nginx restart). If everything is alright, you should have a working load balancer which responds with either something from Google or Yahoo. Congrats.

Part 2: Generating a cert & assigning it to nginx.

  1. Install the certbot script which helps you to get the cert quickly
    wget https://dl.eff.org/certbot-auto
    chmod a+x certbot-auto
  2. Run it with sudo ./certbot-auto certonly and the prompt will guide you through, though it is recommended to turn nginx off while you do this, so you don’t have anything listening on ports 80 and 443. After you have finished, you’ll have your cert files in /etc/letsencrypt/live/yoururl
  3. Modify the nginx conf to use the cert files.
    http {
        upstream myapp1 {
            server google.com;
            server yahoo.com;
        }
        server {
            listen 443 ssl;
            server_name beta.daggersandsorcery.com www.daggersandsorcery.com;
            ssl on;
            ssl_certificate /etc/letsencrypt/live/beta.daggersandsorcery.com/fullchain.pem;
            ssl_certificate_key /etc/letsencrypt/live/beta.daggersandsorcery.com/privkey.pem;
            location / {
                proxy_pass http://myapp1;
            }
        }
    }
  4. Restart nginx. Well done, it should be working for you.


The following script does the same thing that I’ve shown you in this article, with the addition of an automatic renewal process via crontab.
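For the renewal part, a minimal crontab entry might look like the following sketch. The certbot-auto path, schedule, and log file are assumptions for illustration; certbot’s renew subcommand only renews certificates that are close to expiry, and the pre/post hooks stop and start nginx so ports 80/443 are free during the renewal attempt.

```shell
# Hypothetical weekly renewal job (add via crontab -e); adjust the path
# to wherever certbot-auto was downloaded.
# Runs every Monday at 03:00 and logs the output.
0 3 * * 1 /home/ubuntu/certbot-auto renew --pre-hook "service nginx stop" --post-hook "service nginx start" >> /var/log/certbot-renew.log 2>&1
```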