resilience4ts
is a suite of packages that provides ergonomic tools for building performant, safe distributed systems with TypeScript. While there are other TypeScript ports of Java libraries such as Hystrix and resilience4j, and of .NET packages such as Polly, resilience4ts is designed specifically for highly concurrent, distributed applications.
Following in the footsteps of its namesake resilience4j, resilience4ts also aims to be a transparent fault-tolerance layer via higher-order functions (decorators).
Resilience4ts provides 10 core decorators for the following patterns:

- resilience4ts-bulkhead: Bulkhead pattern
- resilience4ts-circuit-breaker: Circuit Breaker pattern
- resilience4ts-cache: Distributed or request-scoped caching
- resilience4ts-concurrent-lock: Distributed Lock
- resilience4ts-concurrent-queue: Distributed Queue
- resilience4ts-hedge: Hedge pattern
- resilience4ts-fallback: Fallback pattern
- resilience4ts-rate-limiter: Rate-Limiting pattern
- resilience4ts-retry: Retry pattern
- resilience4ts-timeout: Timeout pattern
🚀 The @forts/resilience4ts-all package bundles all of the above decorators, along with additional decorators for composing them into reusable pipelines.
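The pipeline helpers in @forts/resilience4ts-all are not shown here; as a rough sketch of the underlying idea, the core decorators also compose by hand, because each `.on()` call returns an async function with the same signature as the one it wraps (see the Bulkhead example below). The `fetchProfile` function, endpoint, and configuration values are illustrative only.

```ts
import { Retry } from '@forts/resilience4ts-retry';
import { Timeout } from '@forts/resilience4ts-timeout';

// Hypothetical downstream call; any `(...args) => Promise<T>` function works.
const fetchProfile = async (userId: string) => {
  const res = await fetch(`https://profiles.internal.example/${userId}`);
  return res.json();
};

const timeout = Timeout.of('profile-timeout', { timeout: 2000 });
const retry = Retry.of('profile-retry', { maxAttempts: 3, backoff: 250 });

// Decorators nest: the timeout bounds each individual attempt,
// and the retry wraps the timed-out call as a whole.
const resilientFetchProfile = retry.on(timeout.on(fetchProfile));

const profile = await resilientFetchProfile('user-123');
```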
💡 For effortless integration into NestJS applications, check out the resilience4ts-nestjs package!
$ npm install @forts/resilience4ts-all
Or to install individual packages:
$ npm install @forts/resilience4ts-bulkhead
$ npm install @forts/resilience4ts-circuit-breaker
$ npm install @forts/resilience4ts-cache
$ npm install @forts/resilience4ts-concurrent-lock
$ npm install @forts/resilience4ts-concurrent-queue
$ npm install @forts/resilience4ts-hedge
$ npm install @forts/resilience4ts-fallback
$ npm install @forts/resilience4ts-rate-limiter
$ npm install @forts/resilience4ts-retry
$ npm install @forts/resilience4ts-timeout
$ npm install @forts/resilience4ts-nestjs
import { Bulkhead } from '@forts/resilience4ts-bulkhead';
const myFunction = async (...args: unknown[]) => {
// do something
};
const bulkhead = Bulkhead.of({
maxConcurrentCalls: 10, // Maximum number of calls allowed to execute concurrently.
maxWaitDuration: 1000, // Maximum time (ms) a call may wait for capacity before being rejected.
});
const decoratedFn = bulkhead.on(myFunction);
const result = await decoratedFn(...args); // Invoke with the same arguments the undecorated function expects.
import { Cache } from '@forts/resilience4ts-cache';
const cache = Cache.of('my-cache', {
extractKey: (...args: Parameters<MyDecoratedMethod>) => UniqueId, // Function that returns a unique id for the call from the decorated function args.
ttl: 1000, // Time to live in milliseconds.
maxCapacity: 100, // Maximum number of entries in the cache.
});
const result = await cache.on(async () => {
// do something
});
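As a more concrete sketch (the getUser function, key scheme, and values below are illustrative, not part of the library), extractKey derives the cache key from the decorated function's arguments:

```ts
import { Cache } from '@forts/resilience4ts-cache';

// Hypothetical lookup; the user id doubles as the cache key.
const getUser = async (userId: string) => {
  // ...load the user from a database or remote service
  return { id: userId, name: 'Ada' };
};

const userCache = Cache.of('user-cache', {
  extractKey: (userId: string) => userId, // Key derived from the decorated function's args.
  ttl: 30_000,      // Entries expire after 30 seconds.
  maxCapacity: 500, // Keep at most 500 users cached.
});

const cachedGetUser = userCache.on(getUser);
const user = await cachedGetUser('user-123'); // First call misses; repeat calls within the TTL are served from the cache.
```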
import { RequestScopedCache, RequestScopedCacheType } from '@forts/resilience4ts-cache';
const cache = RequestScopedCache.of('my-cache', {
extractScope: (...args: Parameters<MyDecoratedMethod>) => Record<string, any>, // Function that returns a "scope" to associate with the cache entry, derived from the decorated function args.
extractKey: (...args: Parameters<MyDecoratedMethod>) => UniqueId, // Function that returns a unique id for the call.
type: RequestScopedCacheType.Local | RequestScopedCacheType.Distributed, // RequestScopedCacheType.Local uses a WeakMap to store the cache entries and is GC'd once the `extractScope` value falls out of scope; RequestScopedCacheType.Distributed uses a distributed cache.
clearOnRequestEnd: boolean, // Distributed only. Whether to clear the cache when the request ends, or persist it for the next request with the same scope.
});
const result = await cache.on(async () => {
// do something
});
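As an illustrative sketch (the RequestContext type, getFeatureFlags function, and values are hypothetical), a per-request context object carried in the call arguments can serve as the scope for a Local request-scoped cache:

```ts
import { RequestScopedCache, RequestScopedCacheType } from '@forts/resilience4ts-cache';

type RequestContext = { requestId: string };

// Hypothetical per-request lookup that should only be computed once per request.
const getFeatureFlags = async (ctx: RequestContext, userId: string) => {
  // ...fetch flags from a remote service
  return { darkMode: true };
};

const flagCache = RequestScopedCache.of('flag-cache', {
  extractScope: (ctx: RequestContext) => ctx,                    // Entries are tied to this request context object.
  extractKey: (_ctx: RequestContext, userId: string) => userId,  // Key within that scope.
  type: RequestScopedCacheType.Local,                            // WeakMap-backed; entries are GC'd once `ctx` falls out of scope.
});

const cachedGetFeatureFlags = flagCache.on(getFeatureFlags);
const ctx: RequestContext = { requestId: 'req-1' };
const flags = await cachedGetFeatureFlags(ctx, 'user-123');
```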
import { CacheBuster } from '@forts/resilience4ts-cache';
import { PredicateBuilder } from '@forts/resilience4ts-core';
const cacheBuster = CacheBuster.of('my-cache-buster', {
invalidatesKeys: (...args: any[]) => string | string[], // key(s) that should be invalidated upon decorated method execution.
invalidateOnException: true, // Whether to invalidate the cache if the decorated method throws an exception.
shouldInvalidate?: PredicateBuilder, // Constructs a predicate that is evaluated upon completion of the decorated method. If the predicate returns true, the cache is invalidated.
});
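A CacheBuster typically sits on the write path next to a Cache on the read path. Below is an illustrative sketch (updateUser and the key scheme are hypothetical, and it assumes the buster and the read-side Cache share the same key space), in which the write invalidates the key the cache uses for that user:

```ts
import { CacheBuster } from '@forts/resilience4ts-cache';

// Hypothetical write that should invalidate the cached read for the same user.
const updateUser = async (userId: string, patch: { name?: string }) => {
  // ...persist the change
  return { id: userId, ...patch };
};

const userCacheBuster = CacheBuster.of('user-cache-buster', {
  invalidatesKeys: (userId: string) => userId, // The same key the read-side Cache derives for this user.
  invalidateOnException: false,                // Leave the cached value alone if the write fails.
});

const bustingUpdateUser = userCacheBuster.on(updateUser);
await bustingUpdateUser('user-123', { name: 'Grace' });
```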
import { CircuitBreaker, CircuitBreakerStrategy } from '@forts/resilience4ts-circuit-breaker';
const circuitBreaker = CircuitBreaker.of('my-circuit-breaker', {
strategy: CircuitBreakerStrategy.Percentage,
threshold: 0.5, // With the Percentage strategy, the failure rate (0-1) at which the circuit opens.
interval: 1000 * 60 * 15,
minimumFailures: 3, // Minimum number of recorded failures before the circuit may open.
whitelist: [], // Error[]. If the decorated method throws an error that is in the whitelist, the circuit breaker will not record it as a failure.
circuitConnectionRetries: 3,
halfOpenLimit: 3, // Number of trial calls permitted while the circuit is half-open.
});
const result = await circuitBreaker.on(async () => {
// do something
});
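The whitelist is useful for errors that reflect a problem with the call rather than with the dependency. A sketch, assuming the whitelist accepts error classes (the NotFoundError class, endpoint, and values below are illustrative):

```ts
import { CircuitBreaker, CircuitBreakerStrategy } from '@forts/resilience4ts-circuit-breaker';

class NotFoundError extends Error {}

// Hypothetical downstream call.
const getInvoice = async (invoiceId: string) => {
  const res = await fetch(`https://billing.internal.example/invoices/${invoiceId}`);
  if (res.status === 404) throw new NotFoundError(`invoice ${invoiceId} not found`);
  return res.json();
};

const invoiceBreaker = CircuitBreaker.of('invoice-breaker', {
  strategy: CircuitBreakerStrategy.Percentage,
  threshold: 0.5,
  interval: 1000 * 60 * 15,
  minimumFailures: 3,
  whitelist: [NotFoundError], // A 404 is the caller's problem, not the dependency's, so it never trips the breaker.
  circuitConnectionRetries: 3,
  halfOpenLimit: 3,
});

const guardedGetInvoice = invoiceBreaker.on(getInvoice);
const invoice = await guardedGetInvoice('inv-42');
```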
import { ConcurrentLock } from '@forts/resilience4ts-concurrent-lock';
const lock = ConcurrentLock.of('my-lock', {
withKey: (...args: Parameters<MyDecoratedMethod>) => UniqueId, // Function that returns the key the distributed lock is acquired on, derived from the call args.
});
const result = await lock.on(async () => {
// do something
});
import { ConcurrentQueue } from '@forts/resilience4ts-concurrent-queue';
const queue = ConcurrentQueue.of('my-queue', {
withKey: (...args: Parameters<MyDecoratedMethod>) => UniqueId, // Function that returns the key used to queue related calls, derived from the call args.
});
const result = await queue.on(async () => {
// do something
});
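A sketch of how withKey might derive a key from the decorated call's arguments so that work for the same entity is coordinated across instances; the capturePayment function and key format are hypothetical, and the same shape applies to ConcurrentQueue:

```ts
import { ConcurrentLock } from '@forts/resilience4ts-concurrent-lock';

// Hypothetical operation that must never run concurrently for the same order.
const capturePayment = async (orderId: string, amount: number) => {
  // ...charge the customer and record the capture
  return { orderId, amount, status: 'captured' };
};

const paymentLock = ConcurrentLock.of('payment-lock', {
  withKey: (orderId: string) => `order:${orderId}`, // One lock per order; different orders proceed in parallel.
});

const lockedCapturePayment = paymentLock.on(capturePayment);
await lockedCapturePayment('order-42', 1999);
```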
import { Hedge } from '@forts/resilience4ts-hedge';
const hedge = Hedge.of('my-hedge', {
delay: 1000, // Time (ms) to wait before issuing a hedged (duplicate) attempt.
});
const result = await hedge.on(async () => {
// do something
});
import { Fallback } from '@forts/resilience4ts-fallback';
import { PredicateBuilder } from '@forts/resilience4ts-core';
const fallback = Fallback.of('my-fallback', {
shouldHandle?: PredicateBuilder, // Optional predicate that determines which failures the fallback handles.
fallbackAction: (...args: Parameters<MyDecoratedMethod>) => Promise<MyDecoratedMethodReturn> | MyDecoratedMethodReturn, // Invoked with the original call args when the decorated method fails.
});
const result = await fallback.on(async () => {
// do something
});
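For instance (the getRecommendations function and endpoint are illustrative, and shouldHandle is simply omitted), a fallbackAction can degrade gracefully to a default value when the primary call fails:

```ts
import { Fallback } from '@forts/resilience4ts-fallback';

// Hypothetical lookup with a safe default for when the recommendation service is unavailable.
const getRecommendations = async (userId: string): Promise<string[]> => {
  const res = await fetch(`https://recs.internal.example/${userId}`);
  if (!res.ok) throw new Error(`recommendation service returned ${res.status}`);
  return res.json();
};

const recommendationFallback = Fallback.of('recommendation-fallback', {
  fallbackAction: (_userId: string): string[] => [], // Degrade gracefully to an empty list.
});

const safeGetRecommendations = recommendationFallback.on(getRecommendations);
const recs = await safeGetRecommendations('user-123');
```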
import { RateLimiter } from '@forts/resilience4ts-rate-limiter';
const rateLimiter = RateLimiter.of('my-rate-limiter', {
permitLimit: 1000, // Maximum number of calls permitted per window.
window: 1000, // Window length in milliseconds.
});
const result = await rateLimiter.on(async () => {
// do something
});
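As an illustration (the sendWebhook function, URL, and limits below are hypothetical), an outbound call to a quota-limited partner API can be capped at 100 calls per second:

```ts
import { RateLimiter } from '@forts/resilience4ts-rate-limiter';

// Hypothetical outbound call to a partner API with a strict quota.
const sendWebhook = async (url: string, payload: unknown) => {
  await fetch(url, { method: 'POST', body: JSON.stringify(payload) });
};

const webhookLimiter = RateLimiter.of('webhook-limiter', {
  permitLimit: 100, // At most 100 calls...
  window: 1000,     // ...per one-second window.
});

const limitedSendWebhook = webhookLimiter.on(sendWebhook);
await limitedSendWebhook('https://partner.example/webhooks', { event: 'order.created' });
```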
import { Retry } from '@forts/resilience4ts-retry';
const retry = Retry.of('my-retry', {
maxAttempts: 3, // Maximum number of attempts before giving up.
backoff: 1000, // Delay (ms) between attempts.
});
const result = await retry.on(async () => {
// do something
});
import { Timeout } from '@forts/resilience4ts-timeout';
const timeout = Timeout.of('my-timeout', {
timeout: 1000, // Time (ms) after which the call times out.
});
const result = await timeout.on(async () => {
// do something
});
- Bulkhead implementation
- Circuit Breaker implementation
- Cache implementation
  - Request-scoped implementation
- Concurrent lock implementation
- Hedge implementation
- Fallback implementation
- Rate Limiter implementation
- Retry implementation
- Timeout implementation
- NestJS package
  - Decorators
  - Documentation
  - Quick start examples
    - NestJS quickstart
    - Express quickstart
- HttpClient
  - Supports cancellable requests
- GrpcClient
  - Supports cancellable requests
- DistributedContext module
- Chaos Engineering module
- Metrics
  - Datadog integration
  - MetricsController / Service for @forts/resilience4ts-nestjs