Developing in Node.js as a hobby is fun, but when it comes to serving many users in production, there are a few things you should know to avoid long response times and crashes.
While working at MyHeritage, we developed the doppelgänger service for Eurovision 2019, which you could use to find out which of the contestants you look most like.
In addition to the face recognition logic, the application had a very clear requirement: it had to serve tens of thousands of simultaneous users, because Eurovision is watched by millions of people around the world.
We realized very quickly that the load balancer in front of an application configured using Auto Scaling is not enough for resiliency. The following helped us a lot:
Synchronous operations (like fs.readFileSync) are tempting because the code looks cleaner, but they kill performance: they block the single thread. Instead, use async/await operations, because during the execution of an asynchronous operation the CPU is available for other tasks (see the event loop).
Before:
const res = fs.readFileSync('file.txt');
After:
const res = await fs.promises.readFile('file.txt');
Node is set to a default heap limit of about 1 GB. If the server has, say, 4 GB available specifically for your application, you will need to raise the maximum memory limit manually with the following CLI flag:
node --max-old-space-size=4096 server.js
Node runs your code in a single thread. If you did not specifically set up a configuration that runs several processes or threads, save money by choosing a server with 1 core.
In our case, these tips led to a ten-fold improvement in performance and helped keep the production environment stable, even when thousands of users had to be served at the same time.
Thanks for reading.