
Working with PostgreSQL using raw SQL queries

Object-Relational Mapping (ORM) libraries can often help us write our code faster. An ORM allows us to avoid writing raw SQL and instead manipulate the data using an object-oriented paradigm. Using an ORM can ease the learning curve of working with databases, because we don't need to go deep into learning SQL. Instead, we write the data model in the programming language we use to develop the application. On top of that, an ORM should have security mechanisms that deal with issues such as SQL injection.

Unfortunately, ORMs have disadvantages too. For example, depending on the ORM to generate the database structure based on our models can lead to not grasping the intricacies of the underlying architecture. Also, an ORM automatically generates the SQL queries for fetching, inserting, or modifying data; those queries are not always optimal and can lead to performance issues. Finally, ORMs can have problems and bugs of their own.

It is fine to use an ORM if we don't need a lot of control over our SQL queries. In a big project, though, we might prefer to handle database management through raw SQL queries. In this article, we figure out how to structure our NestJS project to use raw SQL queries with a PostgreSQL database. You can find the code from this article in this repository.

Connecting to the database

As usual in this series, we use Docker to create an instance of the PostgreSQL database for us.

docker-compose.yml
```yaml
version: "3"
services:
  postgres:
    container_name: postgres-nestjs
    image: postgres:latest
    ports:
      - "5432:5432"
    volumes:
      - /data/postgres:/data/postgres
    env_file:
      - docker.env
    networks:
      - postgres
  pgadmin:
    links:
      - postgres:postgres
    container_name: pgadmin-nestjs
    image: dpage/pgadmin4
    ports:
      - "8080:80"
    volumes:
      - /data/pgadmin:/root/.pgadmin
    env_file:
      - docker.env
    networks:
      - postgres
networks:
  postgres:
    driver: bridge
```

To provide Docker with the necessary environment variables, we need to create the docker.env file.
docker.env
```
POSTGRES_USER=admin
POSTGRES_PASSWORD=admin
POSTGRES_DB=nestjs
PGADMIN_DEFAULT_EMAIL=admin@admin.com
PGADMIN_DEFAULT_PASSWORD=admin
```

We must also provide a similar set of variables for our NestJS application.

.env
```
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_USER=admin
POSTGRES_PASSWORD=admin
POSTGRES_DB=nestjs
```

It is also a good idea to validate that the environment variables are provided when the application starts.

app.module.ts
```typescript
import { Module } from '@nestjs/common';
import { ConfigModule } from '@nestjs/config';
import * as Joi from 'joi';

@Module({
  imports: [
    ConfigModule.forRoot({
      validationSchema: Joi.object({
        POSTGRES_HOST: Joi.string().required(),
        POSTGRES_PORT: Joi.number().required(),
        POSTGRES_USER: Joi.string().required(),
        POSTGRES_PASSWORD: Joi.string().required(),
        POSTGRES_DB: Joi.string().required(),
      }),
    }),
  ],
})
export class AppModule {}
```

Establishing the connection

In this article, we use the node-postgres library to establish a connection to our PostgreSQL database and run queries.

```
npm install pg @types/pg
```

To manage a database connection, we can create a dynamic module. Thanks to that, we could easily copy and paste it to a different project or keep it in a separate library. If you are not familiar with dynamic modules, check out API with NestJS #70. Defining dynamic modules.

database.module-definition.ts
```typescript
import { ConfigurableModuleBuilder } from '@nestjs/common';
import DatabaseOptions from './databaseOptions';

export const CONNECTION_POOL = 'CONNECTION_POOL';

export const {
  ConfigurableModuleClass: ConfigurableDatabaseModule,
  MODULE_OPTIONS_TOKEN: DATABASE_OPTIONS,
} = new ConfigurableModuleBuilder<DatabaseOptions>()
  .setClassMethodName('forRoot')
  .build();
```

We use forRoot above because we want our DatabaseModule to be global. When the DatabaseModule is imported, we expect a particular set of options to be provided.
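Because we call setClassMethodName('forRoot'), the ConfigurableModuleBuilder generates both a forRoot and a forRootAsync static method on the module. As a sketch of how the DatabaseModule could then be imported, assuming the ConfigModule setup shown earlier (the exact options factory below is our assumption, not code from the article's repository):

```typescript
// app.module.ts (sketch): importing the DatabaseModule through the generated
// forRootAsync method, so the options can be read from the ConfigService.
import { Module } from '@nestjs/common';
import { ConfigModule, ConfigService } from '@nestjs/config';
import DatabaseModule from './database/database.module';

@Module({
  imports: [
    ConfigModule,
    DatabaseModule.forRootAsync({
      imports: [ConfigModule],
      inject: [ConfigService],
      useFactory: (configService: ConfigService) => ({
        host: configService.get('POSTGRES_HOST'),
        port: configService.get('POSTGRES_PORT'),
        user: configService.get('POSTGRES_USER'),
        password: configService.get('POSTGRES_PASSWORD'),
        database: configService.get('POSTGRES_DB'),
      }),
    }),
  ],
})
export class AppModule {}
```
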
databaseOptions.ts
```typescript
interface DatabaseOptions {
  host: string;
  port: number;
  user: string;
  password: string;
  database: string;
}
export default DatabaseOptions;
```

Using a connection pool

The node-postgres library recommends we use a connection pool. Since we are creating a dynamic module, we can define our pool as a provider.

database.module.ts
```typescript
import { Global, Module } from '@nestjs/common';
import {
  ConfigurableDatabaseModule,
  CONNECTION_POOL,
  DATABASE_OPTIONS,
} from './database.module-definition';
import DatabaseOptions from './databaseOptions';
import { Pool } from 'pg';
import DatabaseService from './database.service';

@Global()
@Module({
  exports: [DatabaseService],
  providers: [
    DatabaseService,
    {
      provide: CONNECTION_POOL,
      inject: [DATABASE_OPTIONS],
      useFactory: (databaseOptions: DatabaseOptions) => {
        return new Pool({
          host: databaseOptions.host,
          port: databaseOptions.port,
          user: databaseOptions.user,
          password: databaseOptions.password,
          database: databaseOptions.database,
        });
      },
    },
  ],
})
export default class DatabaseModule extends ConfigurableDatabaseModule {}
```

There is an advantage to defining the connection pool as a provider: it is a very good place to specify additional asynchronous configuration. You can find a proper example in a repository created by Jay McDoniel from the NestJS team.

Because we defined the provider above using the CONNECTION_POOL token, we can now inject it into our service.

database.service.ts
```typescript
import { Inject, Injectable } from '@nestjs/common';
import { Pool } from 'pg';
import { CONNECTION_POOL } from './database.module-definition';

@Injectable()
class DatabaseService {
  constructor(@Inject(CONNECTION_POOL) private readonly pool: Pool) {}

  async runQuery(query: string, params?: unknown[]) {
    return this.pool.query(query, params);
  }
}
export default DatabaseService;
```

Managing migrations using Knex

We could manage migrations manually, but using an existing tool might save us time and trouble.
Therefore, in this article, we use Knex.

```
npm install knex
```

The first step in using Knex is to create the configuration file.

knexfile.ts
```typescript
import type { Knex } from 'knex';
import { ConfigService } from '@nestjs/config';
import { config } from 'dotenv';

config();

const configService = new ConfigService();

const knexConfig: Knex.Config = {
  client: 'postgresql',
  connection: {
    host: configService.get('POSTGRES_HOST'),
    port: configService.get('POSTGRES_PORT'),
    user: configService.get('POSTGRES_USER'),
    password: configService.get('POSTGRES_PASSWORD'),
    database: configService.get('POSTGRES_DB'),
  },
};

module.exports = knexConfig;
```

Above, we use dotenv to make sure that our .env file is loaded before we use the ConfigService.

We can now create our first migration using the migration CLI.

```
npx knex migrate:make add_posts_table
```

Running the above command creates a file where we can define our migration.

20220825173451_add_posts_table.ts
```typescript
import { Knex } from 'knex';

export async function up(knex: Knex): Promise<void> {}

export async function down(knex: Knex): Promise<void> {}
```

We need to fill the up and down functions with the SQL queries Knex will run when applying the migration and when rolling it back.

20220825173451_add_posts_table.ts
```typescript
import { Knex } from 'knex';

export async function up(knex: Knex): Promise<void> {
  return knex.raw(`
    CREATE TABLE posts (
      id int GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
      title text NOT NULL,
      post_content text NOT NULL
    )
  `);
}

export async function down(knex: Knex): Promise<void> {
  return knex.raw(`
    DROP TABLE posts
  `);
}
```

If you want to know more about identity columns, check out Serial type versus identity columns in PostgreSQL and TypeORM.

We must execute one last command if we want Knex to run our migration.

```
npx knex migrate:latest
```

Running migrations creates the knex_migrations table, which contains information about which migrations Knex has already executed. Knex also creates the knex_migrations_lock table to prevent multiple processes from running the same migrations at the same time.
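To make the bookkeeping role of the knex_migrations table concrete, here is a minimal, self-contained sketch (not Knex's actual implementation) of the decision it enables: given the migration files on disk and the names already recorded as executed, only the pending ones should run, in name order, since the timestamp prefix encodes creation order.

```typescript
// Sketch: compute which migrations still need to run, the way a
// knex_migrations-style ledger allows. Names sort chronologically because
// they start with a timestamp.
function pendingMigrations(onDisk: string[], executed: string[]): string[] {
  const done = new Set(executed);
  return [...onDisk].sort().filter((name) => !done.has(name));
}

const onDisk = [
  '20220825173451_add_posts_table.ts',
  '20220901120000_add_users_table.ts',
];
const executed = ['20220825173451_add_posts_table.ts'];

console.log(pendingMigrations(onDisk, executed));
// → [ '20220901120000_add_users_table.ts' ]
```
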
The official documentation is a good resource if you want to know more about managing migrations with Knex.

The repository pattern and working with models

It might be a good idea to keep the logic of accessing a particular table's data in a single place. A very popular way of doing that is using the repository pattern.

posts.repository.ts
```typescript
import { Injectable } from '@nestjs/common';
import DatabaseService from '../database/database.service';

@Injectable()
class PostsRepository {
  constructor(private readonly databaseService: DatabaseService) {}

  async getAll() {
    const databaseResponse = await this.databaseService.runQuery(`
      SELECT * FROM posts
    `);
    return databaseResponse.rows;
  }
}
export default PostsRepository;
```

When running the getAll method, we get the data in the following format:

```
[ { id: 1, title: 'Hello world', post_content: 'Lorem ipsum' } ]
```

We often might want to transform the raw data we get from the database. A fitting way to do that is with the class-transformer library, a popular choice when working with NestJS.

post.model.ts
```typescript
import { Expose } from 'class-transformer';

class PostModel {
  id: number;
  title: string;

  @Expose({ name: 'post_content' })
  content: string;
}
export default PostModel;
```

We need to call the plainToInstance function in our repository to use the above model.

posts.repository.ts
```typescript
import { Injectable } from '@nestjs/common';
import DatabaseService from '../database/database.service';
import { plainToInstance } from 'class-transformer';
import PostModel from './post.model';

@Injectable()
class PostsRepository {
  constructor(private readonly databaseService: DatabaseService) {}

  async getAll() {
    const databaseResponse = await this.databaseService.runQuery(`
      SELECT * FROM posts
    `);
    return plainToInstance(PostModel, databaseResponse.rows);
  }
}
export default PostsRepository;
```
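The renaming that @Expose({ name: 'post_content' }) performs can be illustrated without class-transformer. Below is a minimal, self-contained sketch (types and function names are ours, purely for illustration) of mapping a raw snake_case row onto a camelCase model:

```typescript
// Sketch: the shape node-postgres hands back for a row of the posts table.
interface RawPostRow {
  id: number;
  title: string;
  post_content: string;
}

// The camelCase shape we want to expose to the rest of the application.
interface Post {
  id: number;
  title: string;
  content: string;
}

// Manual equivalent of the @Expose({ name: 'post_content' }) mapping.
function toPostModel(row: RawPostRow): Post {
  return { id: row.id, title: row.title, content: row.post_content };
}

const row: RawPostRow = {
  id: 1,
  title: 'Hello world',
  post_content: 'Lorem ipsum',
};
console.log(toPostModel(row));
// → { id: 1, title: 'Hello world', content: 'Lorem ipsum' }
```
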
