To resolve this problem, we have adopted the monorepo structure:

For JavaScript/TypeScript there are many libraries that can help us build monorepos. The latest versions of the most common JS/TS package managers, such as {% c-line %}yarn{% c-line-end %} or {% c-line %}npm{% c-line-end %}, support the monorepo structure. For our monorepos we are going to use the {% c-line %}lerna{% c-line-end %} library and {% c-line %}yarn{% c-line-end %} workspaces.
For the monorepo, we are going to look at the following structure:
a) We are going to have a root {% c-line %}/package.json{% c-line-end %} where the common configuration will live. With this change, we will not have duplicated, obsolete, or inconsistent configuration in our packages because it is managed in a single place:
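As a sketch (the package name and dependency versions here are illustrative, not our actual configuration), the root {% c-line %}/package.json{% c-line-end %} might look like this:

```json
{
  "name": "my-monorepo",
  "private": true,
  "workspaces": ["packages/*"],
  "devDependencies": {
    "lerna": "^6.0.0",
    "typescript": "^4.9.0",
    "mocha": "^10.0.0"
  }
}
```

The `workspaces` glob tells yarn where the packages live, and the shared tooling is declared once at the root instead of in every package.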
b) In the {% c-line %}/packages/{% c-line-end %} folder, we will have all the packages. Those packages will have only the necessary files, which means that only the specific configuration will be there. For example, there we can find the TS build configuration or the Mocha configuration for testing.
This is a cleaner way to manage and maintain a monorepo. Find below the final structure of the monorepo:

{% c-line %}lerna{% c-line-end %} supports Conventional Commits. By adding the {% c-line %}conventionalCommits{% c-line-end %} flag to the {% c-line %}lerna.json{% c-line-end %} file we can start committing using the conventional commit syntax.
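A minimal {% c-line %}lerna.json{% c-line-end %} enabling this could look like the following (a sketch; the exact fields depend on the lerna version in use):

```json
{
  "version": "independent",
  "npmClient": "yarn",
  "useWorkspaces": true,
  "command": {
    "publish": {
      "conventionalCommits": true
    }
  }
}
```

With `"version": "independent"`, each package in the monorepo is versioned on its own from its commit history rather than sharing a single version number.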
So how does it work if we have multiple packages in one repo?
Let's see an example that will clarify a lot about how it works. Imagine that we have two packages: {% c-line %}B{% c-line-end %}, which has a dependency on {% c-line %}A{% c-line-end %}. Let's see some common scenarios:
The next version is calculated when the CI process runs the {% c-line %}lerna publish{% c-line-end %} command.
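In a CI pipeline, that release step could be sketched like this (the flags shown are standard lerna options; the exact step definition is an assumption, not our actual config):

```yaml
# CI release step (sketch)
- run:
    name: Version and publish changed packages
    # Bumps versions from conventional commits, updates
    # changelogs, tags, and publishes without prompting.
    command: yarn lerna publish --conventional-commits --yes
```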
As we have a single repository with independent packages, we have to configure SonarCloud according to the monorepo structure. There are two ways to do that:
1. The monorepo configuration is supported natively by SonarCloud, not only for JS/TS but for other languages as well. You can see the instructions here: https://sonarcloud.io/documentation/analysis/setup-monorepo/. With this configuration, you have to create a new SonarCloud project for each package in the monorepo and then run multiple SonarCloud Scans.
2. The other option is to create a single SonarCloud project with different SonarCloud sub-modules. For this configuration you need a {% c-line %}sonar-project.properties{% c-line-end %} file in the root folder and one per package as well. The important thing in this configuration is that we only have to run one SonarCloud Scan. On the GUI you will find the monorepo project with the information of all its modules in a single place:
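As an illustrative sketch (the project key, module names, and paths are placeholders), the root {% c-line %}sonar-project.properties{% c-line-end %} could declare the sub-modules like this:

```properties
# Root sonar-project.properties (keys and names are placeholders)
sonar.projectKey=my-org_my-monorepo
sonar.projectName=my-monorepo

# Each module points at its package folder; each folder holds
# its own sonar-project.properties with package-specific settings.
sonar.modules=package-a,package-b
package-a.sonar.projectBaseDir=packages/package-a
package-b.sonar.projectBaseDir=packages/package-b
```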

We recommend the second option because it requires fewer SonarCloud Scan executions.
Using a monorepo structure changes the {% c-line %}yarn install{% c-line-end %} command because we are using {% c-line %}yarn workspaces{% c-line-end %} and {% c-line %}lerna{% c-line-end %}.
Starting with {% c-line %}yarn workspaces{% c-line-end %} we will see the following improvements:
Moving on to {% c-line %}lerna{% c-line-end %}, the {% c-line %}lerna bootstrap{% c-line-end %} command will be executed immediately after the {% c-line %}yarn install{% c-line-end %} command. This command is key because it will do the following tasks:

When a {% c-line %}yarn install{% c-line-end %} command is executed, links between all shared libraries are created automatically. It is similar to the {% c-line %}yarn link{% c-line-end %} command but much more powerful. We have to specify the exact version of the packages in the dependencies; with that, {% c-line %}lerna{% c-line-end %} knows that there is a dependency and it will create a symlink automatically:
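For example (package names and versions are illustrative), if package B declares a dependency on package A at the version that lives in the same monorepo, lerna will symlink the local package instead of downloading it from the registry:

```json
{
  "name": "@my-org/package-b",
  "version": "1.2.0",
  "dependencies": {
    "@my-org/package-a": "1.2.0"
  }
}
```

Because the declared version matches the local `packages/package-a`, `node_modules/@my-org/package-a` inside package B becomes a symlink to that folder.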

Since we introduced our first monorepo, some improvements had to be applied to the pipelines in order to reduce build times and do things more intelligently. Some of our monorepos have microservices as {% c-line %}packages{% c-line-end %}, others contain only libraries, and others have a mix. This is why we introduced the following improvements depending on the monorepo composition.
The main change introduced for the monorepos is that not all {% c-line %}packages{% c-line-end %} have changes in every commit or in every PR. Due to this, we need to implement a multi-level change detection that allows us to determine which job or workflow we need to execute.
Thanks to the new dynamic configurations from CircleCI, we can trigger a configuration file with some toggles depending on the changes of the current branch compared with the {% c-line %}master{% c-line-end %} branch. So if we take a look at the monorepo and its CircleCI config, we will see the pattern explained above:

Here, you can see that depending on the changes, and more specifically, depending on the packages that have changed, we will set some {% c-line %}vars{% c-line-end %} to {% c-line %}true{% c-line-end %}. After specifying the {% c-line %}env{% c-line-end %} vars, we will trigger the {% c-line %}continue-config.yaml{% c-line-end %} configuration file with those values properly set.
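A sketch of that setup configuration using CircleCI's path-filtering orb (the orb version, package names, and parameter names here are illustrative assumptions):

```yaml
# .circleci/config.yml (setup workflow; names are illustrative)
version: 2.1
setup: true

orbs:
  path-filtering: circleci/path-filtering@0.1.1

workflows:
  detect-changes:
    jobs:
      - path-filtering/filter:
          base-revision: master
          config-path: .circleci/continue-config.yaml
          # Map changed paths to pipeline parameters that are
          # passed as toggles to continue-config.yaml.
          mapping: |
            packages/package-a/.* run-package-a true
            packages/package-b/.* run-package-b true
```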

The new CircleCI configuration will trigger the workflows and the jobs required depending on the changes from {% c-line %}master{% c-line-end %} branch.
This is a great improvement, but we can go further. Why is this pattern not enough? Because it only looks at the changes against the {% c-line %}master{% c-line-end %} branch instead of the last commit. Because of this, we can make an addition: determine the packages that have changed in every single commit and execute the jobs and commands only in those specific packages, along with the common ones.
To determine if a pipeline has to be executed or not, we have implemented two mechanisms that detect the changes from the last commit.
1. Determine if a job has to be executed:
a) In the last chapter, we saw that the dynamic configuration detects changes against {% c-line %}master{% c-line-end %}, which means that when we change a file in a {% c-line %}package{% c-line-end %}, the toggle will always be set to true, so all the jobs and workflows guarded by that toggle will be executed. We have therefore added a check in all the jobs that decides whether a job has to be executed depending on the changes in the last commit. For this, we have created the {% c-line %}stop_if_no_changes{% c-line-end %} command:
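A minimal sketch of such a command (the command name comes from this post; the body below is an assumed implementation, not our exact one), halting the job early when the given path has no changes in the last commit:

```yaml
# continue-config.yaml (excerpt; implementation is a sketch)
commands:
  stop_if_no_changes:
    parameters:
      package_path:
        type: string
    steps:
      - run:
          name: Halt job if the package did not change
          command: |
            # Compare the last commit against its parent and
            # halt the job if the package path is untouched.
            if git diff --quiet HEAD~1 HEAD -- << parameters.package_path >>; then
              circleci-agent step halt
            fi
```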

2. Determine if a command has to be executed: there are some jobs that are common to all the packages, like unit tests, integration tests, lint, etc. For those, we have created a command called {% c-line %}exec_command_monorepo{% c-line-end %} that only executes those tasks in the {% c-line %}packages{% c-line-end %} that have changed since the last commit:
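A sketch of what {% c-line %}exec_command_monorepo{% c-line-end %} could look like (the command name comes from this post; the body is an assumption), iterating over the packages touched by the last commit and running the given script in each one:

```yaml
commands:
  exec_command_monorepo:
    parameters:
      command:
        type: string
    steps:
      - run:
          name: Run << parameters.command >> in changed packages only
          command: |
            # List package folders touched by the last commit
            # and run the given yarn script inside each of them.
            for pkg in $(git diff --name-only HEAD~1 HEAD -- packages \
                         | cut -d/ -f1-2 | sort -u); do
              (cd "$pkg" && yarn << parameters.command >>)
            done
```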

These commands will detect the changes against the last commit and will determine the package that has to run that command. With that, we have a smarter pipeline!
Here you can find a Proof of Concept with everything explained in this blog post: https://github.com/voiceflow/poc-monorepo-ci
As you can see, monorepos can help us a lot during the development and release processes. With the setup and structure explained above, we can focus on developing new features and fixing bugs instead of upgrading a number of repositories. With monorepos, everything is automated.