
Software Development | 22 Min Read

How to Write Asynchronous Tasks in Modern JavaScript?




Posted on: Wednesday November 6, 2019


JavaScript has all the characteristics of a mature programming language, and as a developer it's crucial to understand how the code you write will actually run.

Talking about JavaScript: first, it's synchronous in nature, which means code runs sequentially, line after line, much like our normal reading pattern; and second, it's single-threaded, which means only one command is executed at any time.
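As a minimal illustration of both properties (our own example, not from any library), a long-running synchronous loop blocks everything that comes after it:

```javascript
console.log('start');

// A heavy synchronous loop: nothing below runs until it completes,
// because the single thread is busy with it
let total = 0;
for (let i = 0; i < 10000000; i++) {
  total += i;
}

console.log('end'); // always printed after the loop, in source order
```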

But as the language evolved, new features allowed asynchronous execution. Developers adopted several distinct approaches while solving more complex algorithms and data flows, which led to the emergence of new interfaces and patterns around them.

Synchronous Execution and Observer Patterns

As stated above, JavaScript mostly executes code in sequence. Even in its early years, the language had certain exceptions to this rule, which you might already know: HTTP requests, DOM events and time intervals.

For example, if we add an event listener to respond to the user's click on an element, the language interpreter will run the code written in the listener callback when the click happens, and then return to the program's normal flow.

The same happens with an interval or a network request: setTimeout, addEventListener and XMLHttpRequest were the first artifacts giving web developers access to asynchronous execution.

Although these were exceptions to synchronous execution in JavaScript, it's important to understand that the language is still single-threaded. We can achieve this concurrency, but the interpreter will still run only one line of code at a time.
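A small sketch of that behavior using setTimeout (our own example): even with a delay of zero milliseconds, the callback only runs after the current code has finished.

```javascript
console.log('first');

// The callback is queued; the single thread finishes the current
// code before ever running it, even with a delay of zero
setTimeout(() => console.log('third'), 0);

console.log('second');
```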

Here's an example of a network request:

var request = new XMLHttpRequest();

request.open('GET', '//some.api.at/server', true);

// observe for server response
request.onreadystatechange = function () {
  if (request.readyState === 4 && request.status === 200) {
    console.log(request.responseText);
  }
}

request.send();

In the code above, whenever the server responds, the method assigned to onreadystatechange gets called, after which the program resumes its ongoing code sequence.

The same happens when reacting to user interaction:

const button = document.querySelector('button');

// observe for user interaction
button.addEventListener('click', function (e) {
  console.log('user click just happened!');
})

Notice that we are hooking into an external event and passing a callback to specify what the code should do when it takes place. Over a decade ago, "What is a callback?" was an obvious interview question, since this pattern was used everywhere in most codebases.

In each of the cases mentioned, we are responding to an external event: a time interval elapsing, a user action or a server response. We weren't able to create an asynchronous task per se; we always observed occurrences happening outside of our reach.

Code written this way follows the Observer Pattern, best represented in JavaScript by the addEventListener interface. Later on, event emitter libraries and frameworks exposing this pattern flourished as well.

Node.js and Event Emitters

Node.js, which describes itself as "an asynchronous event-driven JavaScript runtime", is a good example: event emitters and callbacks were first-class citizens there, thanks to the already-implemented EventEmitter constructor.

const EventEmitter = require('events');

const emitter = new EventEmitter();

// respond to events
emitter.on('greeting', (message) => console.log(message));

// send events
emitter.emit('greeting', 'Hi there!');

This was not just another go-to approach for asynchronous execution, but a core pattern and convention of its ecosystem. Node.js marked the arrival of a new era of writing JavaScript in a completely different environment, even outside the web. As a consequence, other asynchronous situations became possible, like creating new directories or writing files.

const { mkdir, writeFile } = require('fs');

const styles = 'body { background: #ffdead; }';

mkdir('./assets/', (error) => {
  if (!error) {
    writeFile('assets/main.css', styles, 'utf-8', (error) => {
      if (!error) console.log('stylesheet created');
    })
  }
})

Notice that callbacks receive an error as their first argument; if response data is expected, it comes as the second argument. This was called the Error-first Callback Pattern, which later became a convention that authors and contributors adopted for their own packages and libraries.
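A minimal sketch of the convention (divideAsync is a hypothetical function of our own, not part of any library):

```javascript
// A made-up async function following the error-first convention
function divideAsync(a, b, callback) {
  setTimeout(() => {
    if (b === 0) {
      // failure: an Error object goes in the first argument
      callback(new Error('division by zero'));
    } else {
      // success: the first argument is null, data comes second
      callback(null, a / b);
    }
  }, 0);
}

divideAsync(10, 2, (error, result) => {
  if (error) return console.error(error.message);
  console.log(result); // prints 5
});
```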


Promises and the Endless Callback Chain

As web development teams started facing more complex problems to solve, the need for better asynchronous artifacts persisted. Notice in the code snippet above the repeated callback chaining, which doesn't scale well as the number of tasks increases.

Here's an example that reads a file and preprocesses styles:

const { mkdir, writeFile, readFile } = require('fs');
const less = require('less')

readFile('./main.less', 'utf-8', (error, data) => {
  if (error) throw error
  less.render(data, (lessError, output) => {
    if (lessError) throw lessError
    mkdir('./assets/', (dirError) => {
      if (dirError) throw dirError
      writeFile('assets/main.css', output.css, 'utf-8', (writeError) => {
        if (writeError) throw writeError
        console.log('stylesheet created');
      })
    })
  })
})

It's easy to see how programs written this way become harder to follow with the human eye as they grow, because of the repeated callback chaining and the error handling scattered across every step.

Promises, Wrappers and Chain Patterns

Promises didn't receive much applause when they were first announced as a new addition to the JavaScript language. They weren't a new concept, as other languages had had similar implementations years earlier. Frankly speaking, though, they turned out to change a lot of the semantics and structure of most projects.

Promises not only brought a built-in solution for developers to create asynchronous code, but also opened a new stage in web development, serving as the foundation for later web spec features such as fetch.

Migrating methods from a callback to a promise-based approach became more and more common in projects (i.e. libraries and browsers), and even Node.js started slowly migrating to them.

Here's an example of wrapping Node's readFile method:

const { readFile } = require('fs');

const asyncReadFile = (path, options) => {
  return new Promise((resolve, reject) => {
    readFile(path, options, (error, data) => {
      if (error) reject(error);
      else resolve(data);
    })
  });
}

Here we obscure the callback by executing it inside a Promise constructor, calling resolve when the method result is successful and reject when the error object is defined.

When a method returns a Promise object, we can follow its successful resolution by passing a function to then, whose argument is the value the promise was resolved with, in this case data. If an error is thrown during the method, the catch function is called, if present.
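In its simplest form, with illustrative values of our own:

```javascript
// A promise resolved right away: then receives the value
Promise.resolve(42)
  .then(value => console.log('resolved with', value)); // prints "resolved with 42"

// A rejected promise skips then and lands in catch
Promise.reject(new Error('boom'))
  .then(() => console.log('never runs'))
  .catch(error => console.error('rejected with', error.message));
```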

{For a deeper understanding of how Promises work, check Jake Archibald's "JavaScript Promises: An Introduction" article.}

We can now use these new methods and avoid callback chains:

asyncReadFile('./main.less', 'utf-8')
  .then(data => console.log('file content', data))
  .catch(error => console.error('something went wrong', error))

Having a native way to create asynchronous tasks, and a clear interface to follow their possible outcomes, enabled the industry to move away from the Observer Pattern, while the promise-based approach seemed to solve the unreadable and error-prone code.

Just as better syntax highlighting or clearer error messages help while writing code, code that is easier to reason about becomes more predictable for any developer analyzing it, with a better picture of the execution path and an easier way to catch a possible pitfall.

The adoption of Promises was so widespread in the community that Node.js rapidly released built-in versions of its I/O methods that return Promise objects, such as importing the file operations from fs.promises. It even provided a util.promisify function to wrap any function following the Error-first Callback Pattern and transform it into a Promise-based one.

Are Promises helpful in all cases?

To answer this, let's re-imagine the style preprocessing task written with Promises.

const { mkdir, writeFile, readFile } = require('fs').promises;
const less = require('less')

readFile('./main.less', 'utf-8')
  .then(data => less.render(data))
  .then(result =>
    mkdir('./assets')
      .then(() => writeFile('assets/main.css', result.css, 'utf-8'))
  )
  .catch(error => console.error(error))

 

Notice there is a clear reduction of redundancy in the code, especially around the error handling, as we now rely on catch. However, Promises somehow failed to deliver a clear code indentation directly related to the concatenation of actions.

This is actually achieved in the first then statement after readFile is called. What happens after those lines is the need to create a new scope where we can first make the directory and then write the result to a file. This causes a break in the indentation rhythm, making it hard to determine the sequence of instructions at first glance.

One way to solve this is to pre-bake a custom method that handles this and allows the correct concatenation of the chain, but that would introduce one more level of complexity to code that already seems to have what it needs to achieve the task we want.

{Note: this is an example program, and we are in control of some of the methods, which all follow industry conventions, but that's not always the case.}

Introducing a library with a different shape, or more complex concatenations, can easily break our code style.

Thanks to the JavaScript community constantly learning from other language syntaxes, a new notation was later added to help in these cases, where asynchronous task concatenation is not as pleasant or straightforward to read as synchronous code.


Async & Await

A Promise might be defined as an unresolved value at execution time, and creating an instance of a Promise is an explicit call to this artifact.


Inside an async method, we can use the await reserved word to determine the resolution of a Promise before continuing execution.

Let's revisit our code snippet using this syntax:

const { mkdir, writeFile, readFile } = require('fs').promises;
const less = require('less')

async function processLess() {
  const content = await readFile('./main.less', 'utf-8')
  const result = await less.render(content)
  await mkdir('./assets')
  await writeFile('assets/main.css', result.css, 'utf-8')
}

processLess()

{Notice that we moved all of our code into a method, since we can't use await outside the scope of an async function.}

Whenever an async method finds an await statement, it will stop executing until the awaited value or promise gets resolved.

The consequence of using the async/await notation is that the code looks synchronous, despite being asynchronous in execution, which is what web developers are more used to seeing.
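To see that pause in action, here's a small sketch (delay is a made-up helper of our own, built on setTimeout):

```javascript
// delay is a hypothetical helper: a promise that resolves after some time
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function run() {
  console.log('before');
  await delay(100); // this function pauses here for ~100ms
  console.log('after');
}

run();
console.log('meanwhile'); // the surrounding code keeps running
```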

As for error handling, we can use statements that have been present in the language for a long time: try and catch.

const { mkdir, writeFile, readFile } = require('fs').promises;
const less = require('less')

async function processLess() {
  try {
    const content = await readFile('./main.less', 'utf-8')
    const result = await less.render(content)
    await mkdir('./assets')
    await writeFile('assets/main.css', result.css, 'utf-8')
  } catch (e) {
    console.error(e)
  }
}

processLess()

We can rest assured that any error occurring within the process will be handled by the code inside the catch statement, a centric place that takes care of error handling, resulting in code that's much easier to read and follow.

Consequent actions that don't return a value we need, like mkdir, don't have to be stored in variables, don't break the code's rhythm, and don't require creating a new scope to access the value of result in a later step.

It's fair to say that Promises were a fundamental artifact introduced into the language, necessary to enable the async/await notation in JavaScript, which you can use in both modern browsers and newer versions of Node.js.

 Conclusion

Introducing Promises into the web development arena changed the way we sequence actions in our code, how we reason about code execution and how we author libraries and packages.

However, moving away from callback chains was harder to solve. Having a method to pass to then didn't help us escape a train of thought built up year after year of getting used to the Observer Pattern and the approaches adopted by pioneers in the community, such as Node.js.

There are numerous articles online about incorrect uses of Promise concatenation, old callback habits and ways to escape those pitfalls. Promises were essential as a middle step to enable the creation of asynchronous tasks, but on their own they didn't help us move toward better code patterns; for that, a more adaptable and improved language syntax was required.

It's hard to predict how the ECMAScript spec will look in the coming years, as the community keeps extending JavaScript's governance outside the web to solve more complicated puzzles.

It's also hard to say what exactly we will require from the language in the future for some of those puzzles to turn into simpler programs, but we're happy with how the web and JavaScript are moving, adapting to challenges and newer environments. JavaScript is now a more asynchronous-friendly language than it was over a decade ago.