Best Practices
Learn best practices for using InceptTools in your projects.
Project Structure
Here's a recommended project structure when using InceptTools:
my-project/
├── src/
│   ├── config/
│   │   └── database.ts       # Database configuration
│   ├── models/
│   │   ├── mongodb/          # MongoDB schemas
│   │   │   ├── user.ts
│   │   │   └── product.ts
│   │   └── sql/              # SQL models
│   │       ├── order.ts
│   │       └── customer.ts
│   ├── services/
│   │   └── db.service.ts     # Database service initialization
│   ├── migrations/           # Database migrations
│   │   ├── mongodb/
│   │   └── sql/
│   └── index.ts              # Entry point
├── .env                      # Environment variables
└── package.json
Connection Management
Follow these best practices for managing database connections:
- Singleton Pattern: Create a single instance of your database service and reuse it throughout your application.
- Connection Pooling: Use connection pooling for SQL databases to improve performance.
- Graceful Shutdown: Always close database connections when your application shuts down.
- Error Handling: Implement proper error handling for connection failures.
// src/services/db.service.ts
import { DBService, SUPPORTED_DBS } from "@inceptools/db";
import { mongoModels } from "../models/mongodb";
import { sqlModels } from "../models/sql";

// Singleton instance
let dbInstance: DBService | null = null;

export const getDBService = async () => {
  if (!dbInstance) {
    const config = {
      mongodb: {
        type: SUPPORTED_DBS.MONGO_DB,
        connectionString: process.env.MONGODB_URI,
        models: mongoModels,
      },
      postgres: {
        type: SUPPORTED_DBS.SQL,
        connectionString: process.env.POSTGRES_URI,
        models: sqlModels,
        configOptions: {
          dialect: "postgres",
          pool: {
            max: 5,
            min: 0,
            acquire: 30000,
            idle: 10000,
          },
        },
      },
    };

    dbInstance = new DBService(config);

    try {
      await dbInstance.connect();
      console.log("Connected to databases");
    } catch (error) {
      // Don't cache an instance that failed to connect
      dbInstance = null;
      console.error("Failed to connect to databases:", error);
      throw error;
    }
  }

  return dbInstance;
};

// Graceful shutdown
export const closeDBConnections = async () => {
  if (dbInstance) {
    await dbInstance.closeConnection();
    dbInstance = null;
    console.log("Database connections closed");
  }
};

// In your main application file
process.on("SIGINT", async () => {
  await closeDBConnections();
  process.exit(0);
});
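Putting it together, the entry point shows where the signal handler above lives. This is a minimal sketch assuming an Express server; Express, port 3000, and the main function are illustrative assumptions, not part of InceptTools:

// src/index.ts (hypothetical entry point)
import express from "express";
import { getDBService, closeDBConnections } from "./services/db.service";

const app = express();

async function main() {
  // Establish database connections before accepting traffic
  await getDBService();

  app.listen(3000, () => {
    console.log("Server listening on port 3000");
  });
}

// Close database connections on both SIGINT and SIGTERM
for (const signal of ["SIGINT", "SIGTERM"] as const) {
  process.on(signal, async () => {
    await closeDBConnections();
    process.exit(0);
  });
}

main().catch((error) => {
  console.error("Startup failed:", error);
  process.exit(1);
});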
Model Organization
Organize your models in a modular way:
// src/models/mongodb/user.ts
import mongoose from "mongoose";

export const userSchema = new mongoose.Schema({
  name: String,
  email: {
    type: String,
    required: true,
    unique: true,
  },
  password: {
    type: String,
    required: true,
  },
  createdAt: {
    type: Date,
    default: Date.now,
  },
});

// src/models/mongodb/index.ts
import { userSchema } from "./user";
import { productSchema } from "./product";

export const mongoModels = {
  users: userSchema,
  products: productSchema,
};
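The SQL side can follow the same layout. The sketch below assumes Sequelize-style attribute definitions (the dialect and pool options used earlier suggest Sequelize underneath); customerModel and orderModel are illustrative names, and the exact shape InceptTools expects for sqlModels may differ:

// src/models/sql/customer.ts
import { DataTypes } from "sequelize";

// Column definitions for the "customers" table
export const customerModel = {
  name: { type: DataTypes.STRING, allowNull: false },
  email: { type: DataTypes.STRING, allowNull: false, unique: true },
};

// src/models/sql/index.ts
import { customerModel } from "./customer";
import { orderModel } from "./order";

export const sqlModels = {
  customers: customerModel,
  orders: orderModel,
};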
Error Handling
Implement proper error handling for database operations:
// Example of proper error handling
async function getUserById(id) {
  try {
    const dbService = await getDBService();
    const user = await dbService.mongodb.users.findById(id);

    if (!user) {
      // Handle not found case
      return null;
    }

    return user;
  } catch (error) {
    // Log the error
    console.error(`Error fetching user with ID ${id}:`, error);
    // Rethrow or handle appropriately
    throw new Error(`Failed to fetch user: ${error.message}`);
  }
}
Migrations
Use migrations to manage database schema changes:
// Example of a MongoDB migration
// src/migrations/mongodb/20230101_add_user_roles.js
module.exports = {
  async up(db) {
    // Give every existing user a default role
    await db.collection('users').updateMany({}, {
      $set: { roles: ['user'] }
    });
  },

  async down(db) {
    // Revert changes
    await db.collection('users').updateMany({}, {
      $unset: { roles: "" }
    });
  }
};

// Running migrations
const { getDBService } = require('../services/db.service');

async function runMigrations() {
  const dbService = await getDBService();
  const migrationService = dbService.mongodb.migrationService;

  try {
    await migrationService.migrate();
    console.log('Migrations completed successfully');
  } catch (error) {
    console.error('Migration failed:', error);
  }
}
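A SQL migration under src/migrations/sql/ can mirror the same up/down shape. The sketch below assumes a Sequelize QueryInterface is passed in; the file name, column, and exact signature InceptTools uses for SQL migrations are assumptions:

// src/migrations/sql/20230101_add_customer_status.ts (hypothetical)
import { QueryInterface, DataTypes } from "sequelize";

export async function up(queryInterface: QueryInterface) {
  // Add a nullable status column to customers
  await queryInterface.addColumn("customers", "status", {
    type: DataTypes.STRING,
    allowNull: true,
  });
}

export async function down(queryInterface: QueryInterface) {
  // Revert: drop the status column
  await queryInterface.removeColumn("customers", "status");
}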
Performance Optimization
Follow these tips to optimize performance:
- Indexing: Create appropriate indexes for frequently queried fields.
- Query Optimization: Use projection to select only the fields you need.
- Caching: Use Redis for caching frequently accessed data.
- Connection Pooling: Configure connection pools appropriately for your workload.
- Batch Operations: Use batch operations for bulk inserts or updates.
// Example of using Redis for caching
async function getUserWithCaching(id) {
  const dbService = await getDBService();

  // Try to get from cache first
  const cachedUser = await dbService.redis.get(`user:${id}`);
  if (cachedUser) {
    return JSON.parse(cachedUser);
  }

  // If not in cache, get from database
  const user = await dbService.mongodb.users.findById(id);
  if (user) {
    // Store in cache for 1 hour
    await dbService.redis.set(
      `user:${id}`,
      JSON.stringify(user),
      'EX',
      3600
    );
  }

  return user;
}
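The remaining tips map to code just as directly. A minimal sketch, assuming Mongoose-style model access through dbService.mongodb as in the examples above; listUserEmails and importUsers are illustrative helpers, not part of InceptTools:

// Indexing: declare indexes next to the schema (src/models/mongodb/user.ts)
userSchema.index({ email: 1 }, { unique: true });
userSchema.index({ createdAt: -1 });

// Query optimization: project only the fields you need
async function listUserEmails() {
  const dbService = await getDBService();
  return dbService.mongodb.users.find({}, { name: 1, email: 1 }).lean();
}

// Batch operations: insert many documents in one round trip
async function importUsers(users) {
  const dbService = await getDBService();
  return dbService.mongodb.users.insertMany(users, { ordered: false });
}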