Deploy Prisma ORM

Projects using Prisma Client can be deployed to many different cloud platforms. Given the variety of cloud platforms and the different names they use, it's worth distinguishing the main deployment paradigms, as they affect how you deploy an application that uses Prisma Client.

Deployment paradigms

Each paradigm has different tradeoffs that affect the performance, scalability, and operational costs of your application.

Moreover, the user traffic pattern of your application is also an important factor to consider. For example, an application with consistent user traffic may be better suited to a continuously running paradigm, whereas an application with sudden traffic spikes may be better suited to serverless.

Traditional servers

Your application is traditionally deployed if a Node.js process runs continuously and handles multiple requests at the same time. Your application could be deployed to a Platform-as-a-Service (PaaS) such as Heroku or Koyeb, as a Docker container to Kubernetes, or as a Node.js process on a virtual machine or a good old bare-metal server.
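
As a minimal sketch of this paradigm, the example below assumes an Express server and a generated Prisma Client with a `User` model; the route and port are illustrative placeholders. The key point is that a single Prisma Client instance (and its connection pool) is shared by every request the long-running process handles.

```ts
import express from 'express'
import { PrismaClient } from '@prisma/client'

// One PrismaClient for the whole process: its connection pool is reused
// across all requests handled by this long-running Node.js process.
const prisma = new PrismaClient()
const app = express()

// Illustrative route; assumes a `User` model exists in the Prisma schema.
app.get('/users', async (req, res) => {
  const users = await prisma.user.findMany()
  res.json(users)
})

app.listen(3000, () => {
  console.log('Server listening on port 3000')
})
```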

See also: Connection management in long-running processes

Serverless functions

Your application is serverless if the Node.js processes of your application (or subsets of it broken into functions) are started as requests come in, and each function only handles one request at a time. Your application would most likely be deployed to a Function-as-a-Service (FaaS) offering, such as AWS Lambda or Azure Functions.

Serverless environments have the concept of warm starts, which means that subsequent invocations of the same function may reuse an already existing container, with its allocated processes, memory, file system (/tmp is writable on AWS Lambda), and even database connection still available.

Typically, any piece of code outside the request handler remains initialized between such invocations.
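
The sketch below illustrates this for an AWS Lambda-style handler; it assumes a generated Prisma Client and a `User` model. Because the client is instantiated outside the handler, a warm start reuses the same instance and its database connection instead of reconnecting on every invocation.

```ts
import { PrismaClient } from '@prisma/client'

// Instantiated outside the handler: on a warm start the container is reused,
// so this client (and its database connection) survives across invocations.
const prisma = new PrismaClient()

export const handler = async () => {
  // Illustrative query; assumes a `User` model exists in the Prisma schema.
  const users = await prisma.user.findMany()
  return {
    statusCode: 200,
    body: JSON.stringify(users),
  }
}
```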

See also: Connection management in serverless environments

Edge functions

Your application is edge deployed if it is serverless and its functions are distributed across one or more regions close to the user.

Typically, edge environments also use a different runtime than a traditional or serverless environment, which can make common Node.js APIs unavailable.
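
One possible pattern, sketched below under stated assumptions, is to query through Prisma Accelerate from an edge-compatible client. This example assumes a Cloudflare Worker-style fetch handler, the `@prisma/extension-accelerate` package, an Accelerate connection string exposed as `DATABASE_URL`, and a `User` model; adapt it to your own runtime and schema.

```ts
import { PrismaClient } from '@prisma/client/edge'
import { withAccelerate } from '@prisma/extension-accelerate'

export default {
  async fetch(request: Request, env: { DATABASE_URL: string }): Promise<Response> {
    // The edge runtime lacks the Node.js TCP APIs the default engine relies on,
    // so queries are sent over HTTP through Accelerate instead.
    const prisma = new PrismaClient({ datasourceUrl: env.DATABASE_URL }).$extends(
      withAccelerate()
    )

    // Illustrative query; assumes a `User` model exists in the Prisma schema.
    const users = await prisma.user.findMany()
    return new Response(JSON.stringify(users), {
      headers: { 'content-type': 'application/json' },
    })
  },
}
```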