Building a Reliable Bulk SMS System with Fastify, Node.js, and Sinch

Frequently Asked Questions

How do I build a reliable bulk SMS system with Node.js?

Use a robust queuing system like BullMQ with Redis, along with a framework like Fastify and the Sinch SMS API. This architecture handles individual messages reliably in the background, preventing server overload and ensuring delivery even through temporary failures. The provided example uses a Fastify API endpoint to accept bulk requests and queue individual SMS sending tasks, processed by a dedicated Node.js worker.
What role does Fastify play in this architecture?

Fastify is a high-performance Node.js web framework. It serves as the API layer, receiving bulk SMS requests, validating input, and adding individual message sending jobs to the BullMQ queue. Fastify's speed and efficiency are beneficial for handling a high volume of requests.
Why use a message queue like BullMQ for bulk SMS?

A message queue like BullMQ with Redis decouples the API request from the actual SMS sending. This allows the API to respond quickly without waiting for each message to be sent, improving performance and reliability. It also provides retry mechanisms and handles failures gracefully.
What are PostgreSQL and Prisma used for?

PostgreSQL, combined with the Prisma ORM, provides persistent storage for tracking the status of each individual message, the overall batch status, and any errors encountered during the sending process. Prisma simplifies database interaction and ensures type safety.
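A schema along these lines would support that tracking; the model and field names here are a hypothetical sketch, not the article's actual `schema.prisma`:

```prisma
// Hypothetical sketch: model and field names are illustrative.
model MessageBatch {
  id        String    @id @default(uuid())
  status    String    @default("PENDING")
  createdAt DateTime  @default(now())
  messages  Message[]
}

model Message {
  id      String       @id @default(uuid())
  to      String
  body    String
  status  String       @default("QUEUED")
  error   String?      // last failure reason, if any
  batch   MessageBatch @relation(fields: [batchId], references: [id])
  batchId String
}
```

Keeping a per-message `error` column is what makes later failure analysis possible without digging through worker logs.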
How does the system handle failed messages?

The system leverages BullMQ's built-in retry mechanism, allowing it to automatically retry failed messages multiple times with exponential backoff. Error logs are also stored to assist with identifying and resolving persistent issues. The system uses a database to track each message's status, supporting better error analysis and recovery.
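With BullMQ's exponential strategy, the delay before each retry roughly doubles. The `attempts`/`backoff` option shape below matches BullMQ's documented job options; the `retryDelays` helper is purely illustrative:

```javascript
// Job options enabling BullMQ's built-in retries (documented option shape):
const jobOptions = {
  attempts: 3, // try up to 3 times in total
  backoff: { type: 'exponential', delay: 1000 }, // base delay in ms
};

// Illustrative helper: the wait before retry n is roughly
// delay * 2^(n - 1) under the exponential strategy.
function retryDelays({ attempts, backoff }) {
  const delays = [];
  for (let n = 1; n < attempts; n++) {
    delays.push(backoff.delay * 2 ** (n - 1));
  }
  return delays;
}

console.log(retryDelays(jobOptions)); // delays before the 2nd and 3rd attempts
```

Spacing retries out this way gives transient Sinch or network errors time to clear instead of hammering the API.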
What is Redis used for in this system?

Redis acts as the backend for the BullMQ job queue, storing the individual SMS sending tasks until the worker process can pick them up and send the messages via the Sinch API. Its in-memory nature ensures fast queue operations.
Can I check the status of a bulk SMS batch?

Yes, the provided system includes a dedicated `/status/:batchId` API endpoint. Clients can use this endpoint to retrieve information about a specific batch of SMS messages, including the overall status and the status of each individual message within the batch.
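The aggregation behind such an endpoint can be sketched as a pure function over the per-message rows fetched from the database; the status names and `summarizeBatch` helper here are assumptions, not the article's exact implementation:

```javascript
// Roll per-message statuses up into a batch summary, as a
// /status/:batchId handler might after querying the Message table.
function summarizeBatch(messages) {
  const counts = { SENT: 0, FAILED: 0, QUEUED: 0 };
  for (const m of messages) counts[m.status] = (counts[m.status] || 0) + 1;

  let status = 'PROCESSING'; // still has queued messages
  if (counts.QUEUED === 0) {
    status = counts.FAILED > 0 ? 'COMPLETED_WITH_ERRORS' : 'COMPLETED';
  }
  return { status, counts, total: messages.length };
}

const summary = summarizeBatch([
  { status: 'SENT' },
  { status: 'SENT' },
  { status: 'FAILED' },
]);
console.log(summary.status);
```

Returning counts alongside the overall status lets clients show progress without fetching every individual message row.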
What dependencies do I need to install?

Use `npm install fastify axios bullmq ioredis dotenv @prisma/client pino-pretty @fastify/rate-limit @fastify/helmet` for production and `npm install --save-dev prisma nodemon` for development dependencies. This installs the web framework, the HTTP client used to call the Sinch API, the queueing system, the Redis client, and other essentials. You'll also need Docker for local Redis and PostgreSQL setup, or access to standalone instances.
What Sinch credentials do I need?

You'll need a Sinch account with a Service Plan ID, an API Token, and a configured Sinch virtual number. This information can be obtained from the Sinch Customer Dashboard under SMS -> APIs. The Sinch number should be in E.164 format (e.g., +12xxxxxxxxxx).
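E.164 numbers are a leading `+` followed by up to 15 digits, the first of which is non-zero. A small sanity check like the following (an illustrative helper, not part of the article's code) can reject malformed recipients before they are queued:

```javascript
// E.164: '+', then a non-zero digit, then 1-14 more digits (max 15 total).
const E164 = /^\+[1-9]\d{1,14}$/;

function isE164(number) {
  return E164.test(number);
}

console.log(isE164('+12025550123')); // well-formed
console.log(isE164('12025550123')); // missing the leading +
```

Validating at the API boundary keeps guaranteed-to-fail sends from consuming queue capacity and retry attempts.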
When should I use this system?

Consider using this system for applications requiring large-scale messaging, such as marketing campaigns, important notifications, or alerts. The queueing system and retry logic ensure reliable delivery, essential when reaching a wide audience. Don't use this approach for sending a small number of messages; a direct API call is simpler and more efficient in that case.
How do I set up PostgreSQL and Redis locally?

Use the provided `docker-compose.yml` to start PostgreSQL and Redis containers locally. This simplifies the setup process and ensures consistency across development environments. Ensure your `.env` file's connection URLs match the Docker configuration.
What does the worker process do?

The worker process is dedicated to consuming jobs from the Redis queue. It fetches individual SMS sending tasks from the queue, sends the messages via the Sinch API, and updates the database with the status of each message. This asynchronous operation keeps the main API responsive and allows for high throughput.
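The worker's core control flow can be sketched with the Sinch call and the database update injected as functions, so the logic is visible without a live queue; the `processSmsJob` name, job shape, and stubs below are assumptions for illustration. In the real system, a function like this would be passed to BullMQ's `new Worker('sms', processor, { connection })`:

```javascript
// Processor sketch: send one SMS, then record the outcome.
// sendSms would wrap the Sinch API call; updateStatus would be a
// Prisma update on the Message row.
async function processSmsJob(job, { sendSms, updateStatus }) {
  try {
    const result = await sendSms(job.data.to, job.data.message);
    await updateStatus(job.data.messageId, 'SENT', null);
    return result;
  } catch (err) {
    // Record the failure; re-throwing lets BullMQ apply the
    // job's retry/backoff settings.
    await updateStatus(job.data.messageId, 'FAILED', err.message);
    throw err;
  }
}

// Exercise the sketch with stubs instead of the real Sinch API and Prisma:
const updates = [];
processSmsJob(
  { data: { messageId: 'msg-1', to: '+12025550123', message: 'hi' } },
  {
    sendSms: async () => ({ id: 'stub-response' }),
    updateStatus: async (id, status) => updates.push([id, status]),
  }
).then(() => console.log(updates[0][1]));
```

Re-throwing on failure is the key design choice: it hands the error back to BullMQ, which decides whether to retry or move the job to the failed set.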
How is API rate limiting handled?

The system uses the `@fastify/rate-limit` plugin. By default, it limits to 100 requests per minute per IP address using an in-memory store. For scaled environments, a Redis backend is highly recommended for distributed rate limiting. You can configure `REDIS_URL` in the `.env` file.
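The options object for that limit might look as follows; `max`, `timeWindow`, and `redis` are documented `@fastify/rate-limit` options, while the commented-out Redis hook-up is an assumed sketch of the distributed setup:

```javascript
// Options for @fastify/rate-limit; in the app this object would be
// passed via fastify.register(require('@fastify/rate-limit'), options).
const rateLimitOptions = {
  max: 100, // requests allowed per IP within the window
  timeWindow: '1 minute',
  // redis: new Redis(process.env.REDIS_URL), // ioredis client: share
  // counters across instances for distributed rate limiting
};

console.log(rateLimitOptions.max);
```

Without the shared Redis store, each Node.js instance keeps its own counters, so a load-balanced deployment would effectively multiply the allowed rate.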
What is Prisma and why is it used?

Prisma is a modern database toolkit that simplifies database operations and provides type safety. It serves as the ORM (Object-Relational Mapper) for interacting with PostgreSQL, managing database migrations, and generating a type-safe client for accessing data.