Signals (JS: Functions)

The sort() method is a good demonstration of how important and convenient higher-order functions are for everyday tasks. The sorting algorithm is described once, and we get different behaviors simply by specifying the comparison right where the sort is called. The same applies to the map(), filter(), and reduce() methods discussed above.
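
For example, the same built-in sort() produces different orderings depending on the comparator passed at the call site (a small illustration with made-up data; the spread copies the array because sort() mutates it in place):

const scores = [10, 2, 33];

console.log([...scores].sort((a, b) => a - b)); // => [ 2, 10, 33 ]
console.log([...scores].sort((a, b) => b - a)); // => [ 33, 10, 2 ]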

When using higher-order functions, it is customary to split a task into subtasks and perform them one after another, building them into a chain of operations. This resembles passing data through a transformation pipeline.

SICP compares this approach to the way signals are processed when designing electrical circuits. The current flowing through a circuit passes through a chain of transducers: filters, noise suppressors, amplifiers, and so on. The voltage (and the current it produces) represents the data, and the transducers play the role of functions.

This is the main way of working with collections in JavaScript. Loops, by contrast, are rarely used: they are less flexible, require more code (and therefore leave more room for errors), and make it harder to split a complex algorithm into independent steps.

Signal processing

Suppose we need to write a function that takes a list of file system paths, finds the files with a .js extension (case-insensitive) among them, and returns the names of those files. To solve this problem, we need the following functions:

  • fs.existsSync(filepath) — checks whether a file exists at the specified path
  • fs.lstatSync(filepath).isFile() — checks whether the object is a normal "regular" file (not a directory, a link, or another file type)
  • path.extname(filepath) — extracts the extension from the file name
  • path.basename(filepath) — extracts the file name from the full path

const fs = require('fs');
const path = require('path');

const getJSFileNames = (paths) => {
  const result = [];
  // The opposite of the pipeline approach:
  // everything happens in one place, without being split into separate steps
  for (const filepath of paths) {
    // Extract the extension
    const extension = path.extname(filepath).toLowerCase();
    // If the path exists, points to a regular file, and has a .js extension
    if (fs.existsSync(filepath) && fs.lstatSync(filepath).isFile() && extension === '.js') {
      // Normalize the name and add it to the resulting list
      result.push(path.basename(filepath.toLowerCase(), extension));
    }
  }

  return result;
};

const names = getJSFileNames(['index.js', 'wop.JS', 'nonexists', 'node_modules']);
console.log(names); // => ['index', 'wop']

The example above is a typical loop-based solution. Its algorithm can be described as follows:

  1. Go over each path
  2. If the current path points to a regular file with a .js extension (case-insensitive), add its name to the resulting array

If you try to do the same using the reduce() method, you will end up with code that is essentially identical to the loop solution. But if you think about it, you can see that the task actually breaks down into two subtasks: filtering and mapping.
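
For comparison, here is roughly what a reduce()-based version might look like (a sketch, assuming the same fs and path modules as in the loop example above); the callback still does everything at once, just like the loop body:

const getJsFileNamesWithReduce = (paths) => paths.reduce((acc, filepath) => {
  const extension = path.extname(filepath).toLowerCase();
  // The exact same checks, all crammed into one place
  if (fs.existsSync(filepath) && fs.lstatSync(filepath).isFile() && extension === '.js') {
    acc.push(path.basename(filepath.toLowerCase(), extension));
  }
  return acc;
}, []);

Now compare it with the version below, where filtering and mapping are separated into independent steps: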

const getJsFileNames = (paths) => paths
  // selecting files that actually exist
  .filter((filepath) => fs.existsSync(filepath))
  // selecting by file type
  .filter((filepath) => fs.lstatSync(filepath).isFile())
  // selecting by extension
  .filter((filepath) => path.extname(filepath).toLowerCase() === '.js')
  // mapping into names (we need an array of names)
  .map((filepath) => path.basename(filepath.toLowerCase(), '.js'));

const names = getJsFileNames(['index.js', 'wop.JS', 'nonexists', 'node_modules']);
console.log(names); // => ['index', 'wop']

The code is slightly shorter (not counting the comments) and more expressive, but the main point is not its size. As the number of operations and their complexity grow, code broken down this way is much easier to read and analyze, because each operation is applied independently to the whole set at once. You have to keep fewer details in your head and can immediately see how each operation affects all the data. However, learning to break a task into subtasks is not as easy as it may seem; it takes some practice before your code becomes readable.

Note that the filtering here is split into three steps rather than done in one. Given how concise function definitions are in JavaScript, it is usually better to split the checks across several simple filters than to pack them into one complex one.

Standard Interfaces

The very possibility of such a breakdown rests on a simple idea sometimes called "standard interfaces". The point is that both the input and the output of each function use the same kind of data, in this case an array. This lets you connect functions and build chains that perform a large variety of tasks without having to implement new functions. The operations discussed earlier - mapping, filtering, and aggregation - combine to solve the vast majority of collection-processing tasks. We have all seen something similar when assembling Lego sets: thanks to a common connection system, a small number of primitive parts lets you build structures of almost unlimited complexity.
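
To make the idea concrete, here is a tiny illustration with made-up data: each step takes an array and returns an array, so the steps can be connected in any order and in any number:

const numbers = [1, 2, 3, 4, 5, 6];

const result = numbers
  .filter((n) => n % 2 === 0) // [2, 4, 6]
  .map((n) => n * n); // [4, 16, 36]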

In fact, such chains frequently end with aggregation, because aggregation reduces the collection to some final value.
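
Continuing the small sketch above, the same chain can be finished with reduce(), which collapses the array into a single value:

const total = numbers
  .filter((n) => n % 2 === 0)
  .map((n) => n * n)
  .reduce((acc, n) => acc + n, 0);

console.log(total); // => 56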

Performance

So far, performance has been left out of the picture. Some of you may have noticed that every call to a collection-processing function traverses the entire list; the more such calls, the more traversals. It would seem the code gets slower, so why do it this way? In practice, extra traversals are rarely a problem (see "The Mature Optimization Handbook"). Tasks that require processing tens or hundreds of thousands of elements at once are extremely rare. Most operations are performed on lists of at most a few thousand items, and for a list like that, one traversal more or less makes no difference.

But that is not the whole story. There are special collections that do not perform operations immediately when filtering, mapping, and so on are called. Instead, they accumulate the required actions and, on first use, perform them all in a single traversal. These are the so-called "lazy collections".
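
The details of lazy collections are beyond this lesson, but the idea can be sketched with generators (this is just an illustration of the concept, not a library API): nothing is computed until the result is actually consumed, and each element then passes through the whole chain in a single traversal.

function* lazyFilter(coll, predicate) {
  for (const item of coll) {
    if (predicate(item)) {
      yield item;
    }
  }
}

function* lazyMap(coll, mapper) {
  for (const item of coll) {
    yield mapper(item);
  }
}

// Nothing has been computed yet, only the chain has been described
const lazy = lazyMap(lazyFilter([1, 2, 3, 4], (n) => n % 2 === 0), (n) => n * 10);

// The work happens here, in one pass over the data
console.log([...lazy]); // => [ 20, 40 ]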


Recommended materials

  1. Signal processing
  2. The Mature Optimization Handbook
