There are two approaches to writing code: top-down and bottom-up. With the top-down approach, the high-level logic is implemented first, followed by the details. With the bottom-up approach, everything is reversed: the details come first, followed by the general logic.
These approaches are frequently contrasted in books, and it's generally assumed that choosing one means excluding the other. However, this is not the case. In this article, I'll explain why going down only one path can cause problems.
Developers generally agree that functions should be designed from the top down. The focus here is on how the code will be used rather than on its internal structure. This is important because usage strongly shapes the interface of the code: the classes, methods, functions, and everything else that clients of this code interact with.
This, by the way, is one of the advantages of test-driven development (TDD). A developer practicing TDD takes on the role of a user of their own code to better understand how to design it.
When building in the opposite direction, there is a risk that after a long development effort the resulting interface turns out to be inconvenient to use. Such code has to be rewritten, although this could have been avoided.
From this we can conclude that, at the initial stage, top-down design is the better choice. First of all, you must understand what the code is supposed to do and how it will be used; only then should you pick a design direction.
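For example, this is exactly how a test-first workflow begins: the desired usage is written down before any implementation exists. Here's a minimal sketch, assuming a Jest-style test runner; capitalize is a hypothetical function that hasn't been written yet:
// The test fixes the interface before any implementation is written;
// `capitalize` is a hypothetical function that does not exist yet
import capitalize from '../src/capitalize.js';

test('capitalize', () => {
  expect(capitalize('hello')).toBe('Hello');
  expect(capitalize('')).toBe('');
});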
Consider the second Hexlet project, in which students must create a library to print differences between two files. To summarize, the entire library interface is a single function that takes two paths to files of particular formats (yml, json, ini) and returns a description of the differences between their structures. Here's how you can use it:
import genDiff from 'differ';
const result = genDiff(pathToFile1, pathToFile2);
console.log(result);
The code above is an example of the top-down approach. We have described how the function will be used without knowing how it will be written (or whether it has been written already). The best part is that this functionality can be implemented in completely different ways, none of which affects how the library is used. This means we've achieved excellent modularity.
Now, let's go one level down, inside this function. Reading the files is one of the first (hypothetical) tasks to be completed. Because file paths can be relative, they must be normalized in order to obtain the full (absolute) path to each file. This is done with the following code:
import fs from 'fs';
import path from 'path';

export default (path1, path2) => {
  // get the full path by resolving the given path against the current working directory
  const fullPath1 = path.resolve(process.cwd(), path1);
  const data1 = fs.readFileSync(fullPath1).toString();
  const fullPath2 = path.resolve(process.cwd(), path2);
  const data2 = fs.readFileSync(fullPath2).toString();
  // the rest of the code
};
In theory, this code could stay as it is. It's simple and easy to read, with no conditional logic. However, developers adore abstractions and strive to add as many of them as possible. Quite often I see the code above transformed into this:
import fs from 'fs';
import path from 'path';

const readFiles = (path1, path2) => {
  const fullPath1 = path.resolve(process.cwd(), path1);
  const data1 = fs.readFileSync(fullPath1).toString();
  const fullPath2 = path.resolve(process.cwd(), path2);
  const data2 = fs.readFileSync(fullPath2).toString();
  return [data1, data2];
};

export default (path1, path2) => {
  const [data1, data2] = readFiles(path1, path2);
  // the rest of the code
};
Before continuing, ask yourself: "What is wrong with this code?"
Unfortunately, not everything here is as it should be. The resulting function is too specific: it is tied to exactly two files, even though nothing in its logic actually connects them. If there were three files, the function would have to be rewritten, even though the way each individual file is processed would not change. And what if the number of files is not known in advance?
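To make the problem concrete, imagine (purely hypothetically) that a third file appears. The entire function has to be rewritten, even though the way a single file is read hasn't changed at all:
// Hypothetical three-file version: the whole function must be rewritten,
// although the per-file logic is exactly the same
const readFiles = (path1, path2, path3) => {
  const fullPath1 = path.resolve(process.cwd(), path1);
  const data1 = fs.readFileSync(fullPath1).toString();
  const fullPath2 = path.resolve(process.cwd(), path2);
  const data2 = fs.readFileSync(fullPath2).toString();
  const fullPath3 = path.resolve(process.cwd(), path3);
  const data3 = fs.readFileSync(fullPath3).toString();
  return [data1, data2, data3];
};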
Such code significantly reduces the modularity of the program. Some parts of the application begin to depend on others in places where this could have been avoided. As a result, a change in one place breaks the code in another, where it was least expected.
Although the function itself is small, the problem it illustrates is quite serious. When developers think only from the top down, they start tailoring lower-level modules and functions to the upper levels, to the data available there and its structure. Internal code, however, should normally be completely isolated from the application. A file-reading function should work on its own, regardless of how many files there are.
Let's try to refactor this code by extracting good abstractions:
import fs from 'fs';
import path from 'path';

// reads a single file by its (possibly relative) path
const readFile = (filepath) => {
  const fullPath = path.resolve(process.cwd(), filepath);
  const data = fs.readFileSync(fullPath).toString();
  return data;
};

export default (path1, path2) => {
  const data1 = readFile(path1);
  const data2 = readFile(path2);
  // the rest of the code
};
If we were working with a large number of files, or a number that isn't known in advance, we could probably do this:
const results = paths.map(readFile);
The example in this article is just one of many examples of poor design. Sometimes a function is carved out correctly, but its arguments are adapted to the structures of the external code, when it should be the other way around. I occasionally come across the following variant of the genDiff function's signature:
genDiff(paths);
That is, an array is passed instead of two paths, even though we know for certain that this function works with exactly two paths and there are no other options. Where does such code come from? The library is used by an executable file that reads command-line arguments. In effect, the library becomes part of a program that is invoked like this:
$ gendiff path/to/file1 path/to/file2
Command-line arguments arrive as an array. A developer who approaches the problem from this end assumes that an array of paths has been provided as input and starts adjusting the internals, namely the genDiff function, to it.
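A better approach is to unpack the array at the boundary, inside the executable itself, so that genDiff keeps its natural two-argument interface. Below is a minimal sketch of such an executable; the file name bin/gendiff.js and the direct use of process.argv are assumptions made for illustration:
#!/usr/bin/env node
// hypothetical bin/gendiff.js: command-line specifics stay at the boundary
import genDiff from 'differ';

// process.argv looks like [node, script, path1, path2]; unpack the array here
const [path1, path2] = process.argv.slice(2);
console.log(genDiff(path1, path2));
This way, the array exists only where the command line does, and the library's interface is driven by its own domain rather than by the shape of the caller's data.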