Node.js Development in 2024: trends and tools [eng]

Talk presentation

This talk aims to highlight the latest features and tools in the Node.js ecosystem. Nikita will showcase new JavaScript/TypeScript constructs, explain use cases for the new features in Node.js v20, and provide insight into the rising popularity of various frameworks and tools. However, the main focus will be on providing insightful answers to critical questions: how, when, and most importantly, why these features should be used in product development.

Nikita Galkin
Independent Contractor
  • Fractional CTO/Cloud Architect/GDE
  • node.recipes Author
  • Loves developing clean code with JavaScript / TypeScript, Node.js, Docker, and AWS/GCP
  • Believes that 'Software is easy, people are hard'
  • Knows how to solve problems at the right level
  • Website

Talk transcription

Hi, Anna. Thank you for the warm welcome. I'm delighted to see you, and I look forward to questions from you and the community at the end of this talk. Let me begin. Hi, everybody. What will we discuss today? We will dig into Node.js development for the upcoming year. To start, we need to understand what has changed in our toolkit and how we intend to use it. Let's begin with my experience. First, I serve as a fractional CTO, which means U.S.-based startups typically engage me as a contractor to implement best practices. I regularly share my expertise with the Ukrainian community via a Telegram channel named Node.js Recipes. Additionally, as a Google Developer Expert, I oversee GDG Cloud activities here.

Now, what is the agenda for today? From a product standpoint, we are focused on product development. What does this entail? Primarily, we prioritize business needs, which constitute the apex of our pyramid. Our product, whether developed in Node.js, Python, or PHP, is exposed as an API. This API may employ technologies such as GraphQL or WebSockets. Typically, businesses are concerned with integrating our frontend or mobile clients with our backend, which is exposed as Node.js services or potentially another language. Business stakeholders are primarily concerned with factors such as latency and the development cost of the API, rather than the specifics of the codebase, whether it's Node.js, Python, or any other language.

Underneath the API gateway, we have our codebase operating at different layers. Let's zoom in a bit. Our codebase can be divided into three layers: application code, system code, and runtime. Application code represents what we, as product developers, work on daily. System code encompasses the libraries, SDKs, or frameworks necessary to run our application code. This may include internal code that is not open source. Ideally, all system code should be open source. Lastly, the runtime refers to the specific version of Node.js that we typically specify in our Dockerfile. Throughout this talk, I will guide you through these three parts. If time permits, I will also share insights into other aspects of our ecosystem. So, let's begin.

Starting with the application code, there have been some recent changes due to ECMAScript's new features. As a friendly reminder, we have TC39 responsible for driving proposals, which progress through five stages. Stage 0 represents the initial idea, followed by Stage 1, where the proposal is formatted and describes the problem. If you're interested in the evolution of JavaScript, you'll find many ideas at Stage 1. Some may progress to the draft stage, while others remain in discussion. Stage 3 is significant, as TypeScript typically begins to incorporate these syntaxes into its features. Once a proposal reaches widespread adoption at Stage 3, it can be merged and eventually included in the specification. Let's examine the changes in ECMAScript features over the past and current years. Node.js and TypeScript have gained explicit resource management capabilities. This means that when we need to clean up resources, we can use `using` declarations backed by the `Symbol.dispose` protocol, as illustrated in the example. This ensures that cleanup operations are executed, even in the event of errors.

But regardless, we still need to manage the cleanup of these resources, and the new disposable pattern gives us exactly that functionality. This is advantageous because it covers not only files but also other resources, such as deleting data from third-party APIs or closing database transactions. However, it's important to note that in most cases this cleanup logic belongs to system code rather than application code, specifically internal system code, so unlike features adopted by modern frameworks, it may not spread quickly. Nonetheless, there is potential for widespread adoption; for example, TypeScript decorators for transactional purposes could be implemented on top of the disposable pattern.
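To make the disposable pattern concrete, here is a minimal sketch of what a `using` declaration (TypeScript 5.2+) desugars to. The `Connection` class and `withConnection` helper are invented for illustration, and a stand-in symbol is used instead of `Symbol.dispose` so the sketch runs on any Node.js version:

```typescript
// Minimal sketch of the disposable pattern: the resource exposes a
// dispose method under a well-known symbol, and the caller guarantees
// it runs at block exit — even when the body throws.
const dispose: unique symbol = Symbol('dispose');

class Connection {
  closed = false;
  [dispose]() {
    this.closed = true; // cleanup: close sockets, roll back transactions, ...
  }
}

function withConnection<T>(fn: (c: Connection) => T): T {
  const c = new Connection();
  try {
    return fn(c);   // the body may throw
  } finally {
    c[dispose]();   // cleanup always runs, just like `using`
  }
}
```

With real `using` syntax, the whole try/finally collapses into a single `using c = new Connection();` line.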

I am somewhat concerned that this feature may not see widespread use. On a related note, for similar functionality, we have had the AbortController available in Node.js for several years. Although it's not commonly used in product code, it is highly recommended for managing the duration of connections to databases or third-party services. Most modern SDKs, such as those for AWS S3, support the AbortController, making it a valuable tool for managing connections effectively. Moving on to another feature, we have the ability to change arrays by copy. Although this is not frequently utilized in Node.js code, it is more commonly employed in frontend development, where re-rendering is a frequent concern. This feature enables frontend developers to work with objects and arrays more efficiently from a re-rendering perspective, particularly in frameworks like React.
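As a sketch of the AbortController pattern, `AbortSignal.timeout()` (available in modern Node.js) bounds the duration of any operation that accepts a signal. The `slowTask` function below is a hypothetical stand-in for a database or SDK call; real SDKs such as fetch or the AWS SDK v3 clients accept the same `signal` option:

```typescript
// Cancelling a slow operation with an AbortSignal.
function slowTask(signal: AbortSignal): Promise<string> {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => resolve('done'), 5_000);
    signal.addEventListener('abort', () => {
      clearTimeout(timer);
      reject(signal.reason); // a TimeoutError when the signal timed out
    });
  });
}

// Give the task 50 ms before it is aborted automatically.
slowTask(AbortSignal.timeout(50)).catch((err) => {
  console.log('aborted:', err.name);
});
```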

Similarly, the structured clone feature already exists in our codebase, allowing us to create a full copy of an object, including circular references. These features provide us with the ability to work with our codebase safely, ensuring that modifications to one part of the code do not inadvertently affect other requests. For instance, in my production code, a common scenario involves setting default values for responses to clients based on business logic. Instead of creating a function to return the default value each time, structured cloning can be used to achieve this more efficiently.

Now, let's address a frequent question regarding application code: JavaScript or TypeScript? While this debate may have been settled for some time, it remains a topic of discussion. In my view, and based on business needs, TypeScript is the preferred option. The lower ownership cost of TypeScript codebases, attributable to reduced development and maintenance costs, makes it the more economical choice. While JavaScript offers greater flexibility, TypeScript's superior refactorability makes it more advantageous in the long run. Additionally, tools like GitHub Copilot can significantly accelerate development, particularly when working with TypeScript: in my experience, it produces noticeably better suggestions from typed code than from plain JavaScript, making it a valuable asset for developers.

From a Node.js perspective, the recommendation is to extend a maintained base TypeScript configuration rather than fixing compiler issues by hand. Configuring the TypeScript compiler appropriately, as shown in the screenshot, ensures compatibility. Regarding TypeScript-specific updates, the recently added `switch (true)` narrowing (TypeScript 5.3) simplifies branching in code, enhancing readability and refactoring capabilities. This feature streamlines code branches, making them more intuitive and easier to manage, as demonstrated in the example. However, TypeScript's ecosystem still faces challenges, such as issues with `any` and spread syntax, which need to be addressed.
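A sketch of `switch (true)` narrowing (TypeScript 5.3+); the `describe` function is invented for illustration:

```typescript
// TypeScript 5.3 narrows the checked variable inside `switch (true)`
// branches, flattening long if/else chains into one readable construct.
function describe(value: string | number | null): string {
  switch (true) {
    case value === null:
      return 'empty';
    case typeof value === 'number' && value > 100:
      return 'big number';
    case typeof value === 'string':
      return `text: ${value}`; // value is narrowed to string here
    default:
      return 'other';
  }
}
```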

One of the most common issues encountered, especially by those critical of TypeScript, is related to the use of spread syntax. When using spread with maps or objects, TypeScript may provide workarounds, but these may not always function as expected in edge cases. Therefore, I strongly advise exercising caution when using spread with TypeScript. Object.assign, on the other hand, enjoys better support from TypeScript. This is a well-known fact, and it's advisable to use Object.assign in your daily development tasks.
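One concrete instance of the spread pitfall: spreading a Map type-checks but copies no entries, because spread only copies own enumerable properties while Map stores its data internally. A minimal sketch, with an explicit conversion as the safer alternative:

```typescript
// Spreading a Map compiles but produces an empty object at runtime.
const settings = new Map([['theme', 'dark']]);
const copied = { ...settings };            // {} — silently loses the data

// An explicit conversion keeps both the compiler and the runtime honest.
const safe = Object.fromEntries(settings); // { theme: 'dark' }
```

For plain objects, `Object.assign` behaves predictably and, as noted above, enjoys solid TypeScript support.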

Additionally, it's not uncommon to receive `any` types from system code. In such cases, it's often necessary to write explicit type casts to the required type. However, when you encounter `any`, consider changing it to `unknown` for better type safety. For example, TypeScript now defaults to `unknown` in catch blocks, addressing the fact that JavaScript allows throwing any value, including strings or objects that are not Error instances.
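A small sketch of treating the catch variable as `unknown` (the behavior enabled by `useUnknownInCatchVariables`, on by default under `strict` since TypeScript 4.4); `safeParse` is a hypothetical helper:

```typescript
// JavaScript lets code throw anything, so the catch variable is `unknown`
// and must be narrowed before use.
function safeParse(json: string): unknown {
  try {
    return JSON.parse(json);
  } catch (err: unknown) {
    const message = err instanceof Error ? err.message : String(err);
    return { error: message };
  }
}
```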

Despite these challenges, it's important to recognize that TypeScript's issues are often rooted in how JavaScript itself operates. TypeScript remains the preferred language for writing application code, despite its challenges. One notable feature that has not seen widespread adoption is the array form of the `extends` option in tsconfig.json (TypeScript 5.0). It allows combining base configurations, such as one targeting your Node.js version and another enabling strict mode, but it is not yet fully supported across the ecosystem.
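As a sketch, the array form of `extends` can combine community base configs; this assumes the `@tsconfig/node20` and `@tsconfig/strictest` packages from the tsconfig/bases project are installed:

```json
{
  "extends": [
    "@tsconfig/node20/tsconfig.json",
    "@tsconfig/strictest/tsconfig.json"
  ],
  "compilerOptions": {
    "outDir": "dist"
  }
}
```

Later entries in the array win on conflicting options, so order the bases from general to specific.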

Conflicts between ECMAScript and TypeScript, such as those involving decorators, are well-documented. While switching from TypeScript to plain ECMAScript may seem tempting, it can lead to significant technical debt over time. It's important to strike a balance between adopting new ECMAScript features and maintaining compatibility with TypeScript. Looking ahead, conflicts may arise between features like `switch (true)` and the early-stage pattern matching proposal; these represent two different approaches to the same goal, with potential performance differences. In such cases, it's crucial to configure your toolchain appropriately. Tools like Uniform can enhance code readability and enforce best practices in your application code.

Transitioning from application code to system code introduces additional considerations. System code encompasses components like the node_modules folder. When examining the popularity of packages, tools like NPM Rank can provide valuable insights. For example, Chalk is a popular choice for terminal string styling in Node.js CLI applications, highlighting the prevalence of CLI usage in the Node.js ecosystem.

Express remains the go-to choice for REST development in Node.js, reflecting its widespread adoption in application code. Meanwhile, Nest emerges as a popular framework, offering a structured approach to building REST-based applications. Fastify, while gaining traction, is still overshadowed by Express in terms of popularity. Overall, understanding the intricacies of both application and system code is essential for effectively developing and maintaining Node.js projects. By staying informed about the latest developments and best practices, developers can navigate the complexities of the Node.js ecosystem with confidence.

When discussing frameworks for REST-based applications in Node.js projects, NestJS has emerged as the de facto standard. However, for projects requiring functionalities beyond REST, such as workers for queue management, alternative frameworks may be considered. With the increasing prevalence of AI-related projects anticipated in the coming year, there will likely be greater demand for such applications. As a result, frameworks like LangChain and similar solutions may experience increased adoption.

LangChain is positioned as the second recommended framework to learn for the upcoming year, following NestJS. Like NestJS, LangChain is expected to evolve and attract competitors, reflecting its potential to become a standard solution in the Node.js ecosystem. This is particularly relevant as we explore new methods of working with Large Language Models (LLMs), such as chatbots powered by models like GPT. In parallel, offline discussions and talks, such as those led by Vitaly Ratushnyi, are addressing AI-related topics and prompt engineering. It's essential to stay informed about these developments as they shape the future of Node.js development.

Moving beyond application code, it's crucial to consider internal or closed code, which should be treated with the same care as open-source code. This involves using best practices for managing code, such as classifying ML-specific code separately from other codebases. Additionally, generating SDKs from code specifications can streamline development processes, ensuring consistency and efficiency across teams.

Transitioning to system code, recent built-in additions to Node.js present exciting opportunities. Stability indices for experimental features give developers confidence when integrating them into projects. Features such as the permission model (experimental in Node.js 20) enhance security, particularly in local development environments. However, challenges remain, such as Deno's earlier adoption of a similar permissions approach, highlighting the evolving landscape of JavaScript runtimes. One notable feature is the ability to create single executable applications. While this may be useful for certain use cases, such as shipping desktop tools, it may not be the optimal solution for all scenarios. For instance, Electron remains a popular choice for developing cross-platform desktop applications in the Node.js ecosystem.

Finally, Node.js 20 introduces built-in support for .env files, offering enhanced functionality for developers. As Node.js continues to evolve, developers can anticipate further improvements and features to support a wide range of use cases and development scenarios. When configuring the behavior of your Node.js application, it's essential to use environment variables, a common practice highlighted in the 12-factor app methodology. Previously, packages like dotenv were used for managing environment variables, but now you can simply use a .env file, primarily for local development purposes. For other environments, such as CI/CD or production runtimes, it's recommended to manage environment variables through the configured environment, such as Docker or other runtime environments. DevOps teams can provide guidance on configuring these environments effectively.
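A minimal sketch of that setup (file names and variables are illustrative). Since Node.js 20.6, a local .env file can be loaded with the built-in `--env-file` flag instead of the dotenv package:

```typescript
// app.ts — configuration comes from the environment (12-factor style).
// Local development:   node --env-file=.env app.js
// CI/CD & production:  the runtime (Docker, Kubernetes, ...) injects the
// same variables, so the code does not care where they come from.
const config = {
  port: Number(process.env.PORT ?? 3000),
  logLevel: process.env.LOG_LEVEL ?? 'info',
};
console.log(config);
```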

Regarding .env files, I highly recommend the dotenv-safe package, which offers enhanced safety by comparing a .env.example file against the existing environment variables. This ensures that the environment is properly configured before the application proceeds, promoting consistency and reliability. For testing, adopting a Test-Driven Development (TDD) approach is beneficial, and modern test runners, such as the built-in Node.js test runner (marked stable in Node.js 20) or Jest, offer excellent support for this methodology. These, combined with packages like Supertest, are suitable for integration testing, especially when interacting with third-party services like Stripe or databases.

While single executable applications offer convenience, tools like Electron may be more suitable for product development, particularly for desktop applications. Alternative runtimes such as Deno and Bun are gaining traction, but it's essential to understand the differences in runtime environments, particularly when deploying to cloud vendors. Cloud-native approaches focus on optimizing time to market, making them attractive options for businesses. Considering business risks, Node.js remains a stable and mature technology, making it a preferred choice for many organizations. However, exploring alternative runtimes for local development can provide valuable insights and skills. Cloud-native knowledge is increasingly important for Node.js developers, as it aligns with modern infrastructure practices and can lead to career growth in cloud engineering.

In conclusion, prioritizing best practices in configuration management, testing methodologies, and runtime selection is crucial for developing robust Node.js applications. Continuous learning and exploration of new technologies can further enhance development processes and career prospects in the ever-evolving landscape of software development. Now, I'm open to any questions or discussions you may have, whether it's part of this session or on our Discord platform.
