Frontend Optimization in Zerouno

Introduction

When I joined Zerouno, I saw the immense effort that had gone into a frontend that, despite being quite old-style, was extremely modular and composable, yet rigid in its graphic customizations, which had to meet certain usability and company-coherence standards. I was pleasantly shocked to see that, at first glance, not a single pixel was out of place, and it seemed I had stumbled upon something I could learn a lot from (spoiler: that was the case, but the "hard way")... I distinctly remember the first time I compiled that frontend application, right after seeing and trying it firsthand.

[insert gif shocked face]

"It must be just the first compilation, let's see how it behaves with incremental builds", we add a console.log somewhere and save, let's see what happens with live reloading... still minutes of compilation... shocked, I turned to the nearest colleague:

  • "is it normal that it takes minutes to compile every time I add a ';'?"
  • "yes, it's a bit slow"

[panic]

From there, I began studying ways to incrementally reduce compilation times. SPOILER: there is no magic and no magical flag; the truth lies in the last two sections.

Note: This article was written more than a year after the start of this transformation, so it may contain historical or technical inaccuracies, and it does not contain statistical reports on the operations described below. Rather, it aims to give an overview of the reasoning and the journey that transformed an application that was extremely difficult to maintain, and unnecessarily large in file count and size, into one with the same feature set but optimized in several respects.

Removal of Unnecessary Steps in Development Mode Compilation

Starting with the approaches that took the least effort, I looked at what the compiler offers in terms of configuration to improve the situation (a sample configuration follows the list)...

  • aot: Use AOT as long as you can; compared to the other flags mentioned below, it is the only one I recommend keeping enabled (true)...
  • optimization: Adds minification steps for scripts and style sheets...
  • buildOptimizer: Angular's documentation does not say much about what this flag does...
  • vendorChunk: Generates a separate bundle containing only third-party modules...
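
For reference, a minimal sketch of how these flags might look in angular.json for fast development builds; the project name and values here are illustrative, and on recent Angular versions the "development" configuration already carries similar defaults:

    {
      "projects": {
        "app": {
          "architect": {
            "build": {
              "options": { "aot": true },
              "configurations": {
                "development": {
                  "optimization": false,
                  "buildOptimizer": false,
                  "vendorChunk": true
                }
              }
            }
          }
        }
      }
    }

Disabling optimization and buildOptimizer in development skips production-only minification work, while vendorChunk keeps third-party code in its own bundle so it is not re-emitted on every rebuild.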

Despite these interesting precautions, in my experience they are NOT the cause of slow compilation in an Angular application. The Angular compiler is actually very fast, and the basic setup provided by the auto-generated configuration file when running 'ng new app' is more than adequate. Nine times out of ten, the problem lies elsewhere, in architectural or configuration issues.

tsconfig.json and Existential Conflicts with angular.json

If you come from a clean Angular project generated with 'ng new ...', you can be fairly certain that the tsconfig.json file you find is fine for your needs. But if, like me, you are thrown into an existing project with thousands of files, it's not a bad idea to give it a check.

  • incremental: in TypeScript, setting this flag to true enables incremental builds that speed up every build after the first, both for live reloading and for cold starts from disk. Angular seems to ignore this flag; for live reloading the Angular compiler already tries to recompile only the modified code (plus whatever depends on it), but this does not happen for cold starts from disk.
  • files, include, exclude: Carefully checking which files are included in or excluded from compilation can have a significant impact on compilation time. Maybe having { "include": ["**/*.ts"] } is not a great idea. In particular, in a standard Angular application you should not need to specify which files to compile inside tsconfig.json at all; instead, the entry point is specified in the projects.<projectname>.architect.build.options.main property of angular.json (see the sketch after this list).
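
A lean tsconfig.json might look something like the sketch below; the options and excluded paths are illustrative, not a drop-in configuration (tsconfig.json tolerates comments, so the notes are inline):

    {
      "compilerOptions": {
        // honored by plain tsc; the Angular builder appears to ignore it
        "incremental": true,
        "tsBuildInfoFile": "./.tsbuildinfo",
        "target": "es2017",
        "module": "es2020",
        "strict": true
      },
      // no broad "include": ["**/*.ts"]: the entry point comes from
      // projects.<projectname>.architect.build.options.main in angular.json
      "exclude": ["node_modules", "dist", "**/*.spec.ts"]
    }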

Packaging

We arrive at one of the key points of the journey, where we need to get our hands dirty: packaging. One of the biggest challenges developers face when working with a monolithic application, especially as it grows past 8,000 TypeScript files, each hundreds of lines long, is the slowness of the compilation process.

Nx to the Rescue

Briefly, Nx is an extremely powerful tool that, among other things, lets developers organize code in a monorepo, with the ability to logically divide it into shared libraries. These libraries can be managed as separate npm packages, allowing for more granular code management (https://nx.dev/getting-started/why-nx).
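
To give a concrete flavor, extracting a library could look roughly like this; the @zerouno/* import paths are hypothetical, and the generator comes from Nx's Angular plugin (published as @nrwl/angular at the time):

    # one publishable library per logical scope
    npx nx g @nrwl/angular:library ui --publishable --importPath=@zerouno/ui
    npx nx g @nrwl/angular:library renderers --publishable --importPath=@zerouno/renderers

    # each library can then be built on its own
    npx nx build ui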

Getting to the Point: How Do I Transform a Monolithic Source into Packages?

Proceeding step by step, I began by identifying the parts of the code that could be logically separated. Fortunately, the source was already well divided into folders by scope (ui, renderers, i18n, utils, etc.); the challenge was to untangle an intricate 'labyrinth' of interdependencies (but that is another story). Using Nx, I created an npm package for each library. Each package had its own set of dependencies and could be compiled separately. At that point, for each package created, I removed the corresponding code from the repository that held the applications and the remaining, still-monolithic libraries, and patched up some remnants here and there to make everything work.
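
The visible effect on application code is mostly in the imports: deep relative paths into the monolith become imports from each library's public API (the names below are made up for illustration):

    // before: reaching into the monolith through fragile relative paths
    import { DateRenderer } from '../../../renderers/date/date.renderer';

    // after: importing from the extracted package's public entry point
    import { DateRenderer } from '@zerouno/renderers';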

Nice... But Really... How Does It Work?

The magic of packaging lies in incremental compilation. Through an intelligent caching system that stores compilation results, Nx compiles only the packages that have actually been modified, or the parts that depend on them, instead of recompiling the entire monolith on every change. If a library has not been modified, Nx simply uses the cached version. Furthermore, since we now have a well-defined dependency tree, Nx can also compile packages in parallel: where possible, multiple libraries are compiled simultaneously.
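
As a sketch, the classic task-runner setup in nx.json that enables this cache looks something like this (abridged; the runner module path has moved between Nx versions):

    {
      "tasksRunnerOptions": {
        "default": {
          "runner": "nx/tasks-runners/default",
          "options": {
            "cacheableOperations": ["build", "lint", "test"]
          }
        }
      }
    }

With the cache in place, a command along the lines of 'nx affected --target=build --parallel=3' rebuilds only the packages touched by a change (and their dependents), several at a time.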

Generalization and Many Kisses (KISS Pattern)

The secret and mantra I like to say I brought to Zerouno is 'Simplify and eliminate'. The real difference in compilation times, maintainability, and so on comes from greatly simplifying an overengineered source structure, eliminating dozens and dozens of files and thousands of lines of code. Of course, this should not be done at random: it requires careful consideration of potential breaking changes and of the levels of abstraction to maintain, and all of this, if you are (un)lucky like me, without a single line of tests to check that what you are doing does not break logic scattered throughout the application.

So, after eliminating here and there, refactoring method after method and class after class, reaching an almost perfect understanding of all the gears that move the application, and with the boss's blessing, you can form an idea of how to apply the Pareto principle to your own application's context, aiming to deliver 80% of the existing functionality with 20% of the effort.

Comments are welcome: [email protected]