Remove obsolete deployment guides and ESLint report files to streamline project documentation. Update package configurations and scripts to improve the development experience and logging integration.
This commit is contained in:
parent 5e21d2840a
commit d055ba34d8
@@ -1,141 +0,0 @@
# 🚀 Pre-built Images Deployment Guide

This guide shows how to deploy using pre-built Docker images instead of building on Plesk.

## Benefits

- ✅ No build failures on Plesk
- ✅ Faster deployments (no compilation time)
- ✅ Consistent images across environments
- ✅ Better security (images are built in a controlled environment)
- ✅ Easy rollbacks and version control

## Prerequisites

1. **GitHub account** (for the free container registry)
2. **Docker installed locally** (for building images)
3. **Plesk with the Docker extension**

## Step 1: Set Up GitHub Container Registry

1. Go to GitHub → Settings → Developer settings → Personal access tokens → Tokens (classic)
2. Create a new token with these permissions:
   - `write:packages` (to push images)
   - `read:packages` (to pull images)
3. Save the token securely

## Step 2: Log In to GitHub Container Registry

```bash
# Replace YOUR_USERNAME and YOUR_TOKEN
echo "YOUR_TOKEN" | docker login ghcr.io -u YOUR_USERNAME --password-stdin
```

## Step 3: Update the Build Script

Edit `scripts/build-and-push.sh`:

```bash
# Change this line:
NAMESPACE="your-github-username"  # Replace with your actual GitHub username
```

## Step 4: Build and Push Images

```bash
# Build and push with a version tag
./scripts/build-and-push.sh v1.0.0

# Or build and push as latest
./scripts/build-and-push.sh
```

## Step 5: Update the Plesk Compose File

Edit `compose-plesk.yaml` and replace:

```yaml
image: ghcr.io/your-github-username/portal-frontend:latest
image: ghcr.io/your-github-username/portal-backend:latest
```

with your actual GitHub username.
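The image references above all follow one pattern. A small helper — hypothetical, for illustration only, not part of the repo's scripts — makes the substitution explicit:

```shell
# Compose a GHCR image reference (hypothetical helper; tag defaults to "latest").
ghcr_image() {
  local user="$1" name="$2" tag="${3:-latest}"
  echo "ghcr.io/${user}/${name}:${tag}"
}

ghcr_image your-github-username portal-frontend
# → ghcr.io/your-github-username/portal-frontend:latest
ghcr_image your-github-username portal-backend v1.0.0
# → ghcr.io/your-github-username/portal-backend:v1.0.0
```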
## Step 6: Deploy to Plesk

1. **Upload `compose-plesk.yaml`** to your Plesk server
2. **Plesk → Docker → Add Stack**
3. **Paste the contents** of `compose-plesk.yaml`
4. **Deploy**

## Step 7: Configure the Plesk Reverse Proxy

1. **Plesk → Domains → your-domain.com → Apache & Nginx Settings**
2. **Add to "Additional directives for HTTP":**

```nginx
location / {
    proxy_pass http://127.0.0.1:3000;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_cache_bypass $http_upgrade;
}

location /api {
    proxy_pass http://127.0.0.1:4000;
    proxy_http_version 1.1;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
}
```

## Step 8: Secure Database Access

Add to the Plesk firewall:

```
# Allow the Docker bridge network
ACCEPT from 172.17.0.0/16 to any port 5432
ACCEPT from 172.17.0.0/16 to any port 6379

# Deny external access to the databases
DROP from any to any port 5432
DROP from any to any port 6379
```

## Updating Your Application

1. **Make code changes**
2. **Build and push new images:**
   ```bash
   ./scripts/build-and-push.sh v1.0.1
   ```
3. **Update `compose-plesk.yaml`** with the new version tag
4. **Redeploy in Plesk**
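The release cycle above can be collected into a cheat sheet. This sketch (a hypothetical helper, using the script and file names from this guide) only prints the commands; the build/push step runs locally, while the pull/up steps run on the Plesk host after the tag in `compose-plesk.yaml` has been updated:

```shell
# Print the release-cycle commands for a given tag (hypothetical helper;
# nothing is executed here — run the printed commands on the right host).
update_cycle() {
  local tag="$1"
  echo "./scripts/build-and-push.sh ${tag}"
  echo "docker compose -f compose-plesk.yaml pull"
  echo "docker compose -f compose-plesk.yaml up -d"
}

update_cycle v1.0.1
```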
## Troubleshooting

### Images not found
- Check that you are logged in: `docker login ghcr.io`
- Verify image names match your GitHub username
- Ensure images are public, or that Plesk can authenticate

### Build failures
- Run the build locally first: `docker build -f apps/portal/Dockerfile .`
- Check Docker logs for specific errors
- Ensure all dependencies are listed in `package.json`

### Connection issues
- Verify the firewall allows the Docker bridge network (172.17.0.0/16)
- Check that `DATABASE_URL` uses the correct IP (172.17.0.1)
- Test the database connection from the backend container
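To make the last check concrete, one option is to probe the ports from inside the container. This sketch only prints the probe commands; the container name `portal-backend` and the bridge gateway IP are this guide's examples — adjust them to your stack before running anything on the server:

```shell
# Print connectivity probes for Postgres and Redis (hypothetical helper;
# container name and host IP are assumptions from this guide's examples).
db_probe_cmds() {
  local host="${1:-172.17.0.1}"
  echo "docker exec portal-backend sh -c 'nc -z ${host} 5432 && echo postgres-ok'"
  echo "docker exec portal-backend sh -c 'nc -z ${host} 6379 && echo redis-ok'"
}

db_probe_cmds
```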
## Security Notes

- The database is only accessible from the Docker bridge network
- The backend API is only accessible via the reverse proxy
- Use strong passwords and JWT secrets
- Consider using Docker secrets for sensitive data
- Regularly update base images for security patches
@@ -1,75 +0,0 @@
# 🚀 Deployment Guide

## 📁 **Environment Files Overview**

### **Development:**
- `.env` - Your local development environment (active)
- `.env.example` - Development template for new developers

### **Production:**
- `.env.production` - Production environment for Plesk deployment
- `compose-plesk.yaml` - Docker Stack definition

## 🔧 **Plesk Deployment Steps**

### **Step 1: Authenticate Docker (One-time)**
```bash
# SSH to the Plesk server
echo "YOUR_GITHUB_TOKEN" | docker login ghcr.io -u ntumurbars --password-stdin
```

### **Step 2: Upload Files to Plesk**
Upload these files to your domain directory:
1. `compose-plesk.yaml` - Docker Stack definition
2. `.env.production` - Environment variables (rename to `.env`)

### **Step 3: Deploy the Stack**
1. **Plesk → Docker → Stacks → Add Stack**
2. **Project name**: `customer-portal`
3. **Method**: Upload file or paste the `compose-plesk.yaml` content
4. **Deploy**

### **Step 4: Configure the Nginx Proxy**
1. **Plesk → Websites & Domains → yourdomain.com → Docker Proxy Rules**
2. **Add rule**: `/` → `portal-frontend` → port `3000`
3. **Add rule**: `/api` → `portal-backend` → port `4000`
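Once both proxy rules are in place, they can be smoke-tested. This helper (hypothetical, for illustration) only derives the URLs each rule should serve; check each one with `curl` afterwards:

```shell
# Derive the URLs the proxy rules above map (domain is a placeholder).
smoke_urls() {
  local domain="$1"
  echo "https://${domain}/"      # served by portal-frontend (port 3000)
  echo "https://${domain}/api"   # served by portal-backend (port 4000)
}

smoke_urls yourdomain.com
# Then check each URL, e.g.: curl -fsS -o /dev/null "$url" && echo OK
```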
## 🔄 **Update Workflow**

### **When You Push Code:**
1. **GitHub Actions** builds new images automatically
2. **SSH to Plesk** and update:
```bash
cd /var/www/vhosts/yourdomain.com/httpdocs/
docker compose -f compose-plesk.yaml pull
docker compose -f compose-plesk.yaml up -d
```

## 🔐 **Environment Variables**

Your compose file uses these key variables from `.env.production`:

### **Database:**
- `POSTGRES_DB`, `POSTGRES_USER`, `POSTGRES_PASSWORD`
- `DATABASE_URL` - Full connection string

### **Application:**
- `JWT_SECRET`, `CORS_ORIGIN`
- `NEXT_PUBLIC_API_BASE`, `NEXT_PUBLIC_APP_NAME`

### **External APIs:**
- `WHMCS_BASE_URL`, `WHMCS_API_IDENTIFIER`, `WHMCS_API_SECRET`
- `SF_LOGIN_URL`, `SF_CLIENT_ID`, `SF_USERNAME`

### **Email & Logging:**
- `SENDGRID_API_KEY`, `EMAIL_FROM`
- `LOG_LEVEL`, `LOG_FORMAT`

## ✅ **Ready to Deploy!**

Your setup is clean and production-ready:
- ✅ Environment variables properly configured
- ✅ Secrets supplied via environment variables
- ✅ Database and Redis secured (localhost only)
- ✅ Automated image building
- ✅ Clean file structure
@@ -5,6 +5,7 @@
  "compilerOptions": {
    "deleteOutDir": true,
    "watchAssets": true,
    "assets": ["**/*.prisma"]
    "assets": ["**/*.prisma"],
    "tsConfigPath": "tsconfig.build.json"
  }
}
@@ -6,7 +6,7 @@
  "private": true,
  "license": "UNLICENSED",
  "scripts": {
    "build": "nest build",
    "build": "nest build -c tsconfig.build.json",
    "format": "prettier --write \"src/**/*.ts\" \"test/**/*.ts\"",
    "start": "nest start",
    "dev": "NODE_OPTIONS=\"--no-deprecation\" nest start --watch",
@@ -82,7 +82,7 @@
    "source-map-support": "^0.5.21",
    "supertest": "^7.1.4",
    "ts-jest": "^29.4.1",
    "ts-loader": "^9.5.2",
    "ts-node": "^10.9.2",
    "tsconfig-paths": "^4.2.0",
    "typescript": "^5.9.2"
@@ -1,234 +0,0 @@
import type { Params } from "nestjs-pino";
import type { Options as PinoHttpOptions } from "pino-http";
import type { IncomingMessage, ServerResponse } from "http";
import type { ConfigService } from "@nestjs/config";
import { join } from "path";
import { mkdir } from "fs/promises";

export class LoggingConfig {
  static async createPinoConfig(configService: ConfigService): Promise<Params> {
    const nodeEnv = configService.get<string>("NODE_ENV", "development");
    const logLevel = configService.get<string>("LOG_LEVEL", "info");
    const appName = configService.get<string>("APP_NAME", "customer-portal-bff");

    // Ensure logs directory exists for production
    if (nodeEnv === "production") {
      try {
        await mkdir("logs", { recursive: true });
      } catch {
        // Directory might already exist
      }
    }

    // Base Pino configuration
    const pinoConfig: PinoHttpOptions = {
      level: logLevel,
      name: appName,
      base: {
        service: appName,
        environment: nodeEnv,
        pid: typeof process !== "undefined" ? process.pid : 0,
      },
      timestamp: true,
      // Ensure sensitive fields are redacted across all logs
      redact: {
        paths: [
          // Common headers
          "req.headers.authorization",
          "req.headers.cookie",
          // Auth
          "password",
          "password2",
          "token",
          "secret",
          "jwt",
          "apiKey",
          // Custom params that may carry secrets
          "params.password",
          "params.password2",
          "params.secret",
          "params.token",
        ],
        remove: true,
      },
      formatters: {
        level: (label: string) => ({ level: label }),
        bindings: () => ({}), // Remove default hostname/pid from every log
      },
      serializers: {
        // Keep logs concise: omit headers by default
        req: (req: {
          method?: string;
          url?: string;
          remoteAddress?: string;
          remotePort?: number;
        }) => ({
          method: req.method,
          url: req.url,
          remoteAddress: req.remoteAddress,
          remotePort: req.remotePort,
        }),
        res: (res: { statusCode: number }) => ({
          statusCode: res.statusCode,
        }),
        err: (err: {
          constructor: { name: string };
          message: string;
          stack?: string;
          code?: string;
          status?: number;
        }) => ({
          type: err.constructor.name,
          message: err.message,
          stack: err.stack,
          ...(err.code && { code: err.code }),
          ...(err.status && { status: err.status }),
        }),
      },
    };

    // Development: Pretty printing
    if (nodeEnv === "development") {
      pinoConfig.transport = {
        target: "pino-pretty",
        options: {
          colorize: true,
          translateTime: "yyyy-mm-dd HH:MM:ss",
          ignore: "pid,hostname",
          singleLine: false,
          hideObject: false,
        },
      };
    }

    // Production: File logging with rotation
    if (nodeEnv === "production") {
      pinoConfig.transport = {
        targets: [
          // Console output for container logs
          {
            target: "pino/file",
            level: logLevel,
            options: { destination: 1 }, // stdout
          },
          // Combined log file
          {
            target: "pino/file",
            level: "info",
            options: {
              destination: join("logs", `${appName}-combined.log`),
              mkdir: true,
            },
          },
          // Error log file
          {
            target: "pino/file",
            level: "error",
            options: {
              destination: join("logs", `${appName}-error.log`),
              mkdir: true,
            },
          },
        ],
      };
    }

    return {
      pinoHttp: {
        ...pinoConfig,
        // Auto-generate correlation IDs
        genReqId: (req: IncomingMessage, res: ServerResponse) => {
          const existingIdHeader = req.headers["x-correlation-id"];
          const existingId = Array.isArray(existingIdHeader)
            ? existingIdHeader[0]
            : existingIdHeader;
          if (existingId) return existingId;

          const correlationId = LoggingConfig.generateCorrelationId();
          res.setHeader("x-correlation-id", correlationId);
          return correlationId;
        },
        // Custom log levels: only warn on 4xx and error on 5xx
        customLogLevel: (_req: IncomingMessage, res: ServerResponse, err?: unknown) => {
          if (res.statusCode >= 400 && res.statusCode < 500) return "warn";
          if (res.statusCode >= 500 || err) return "error";
          return "silent" as unknown as
            | "error"
            | "warn"
            | "info"
            | "debug"
            | "trace"
            | "fatal"
            | "silent";
        },
        // Suppress success messages entirely
        customSuccessMessage: () => "",
        customErrorMessage: (
          req: IncomingMessage,
          res: ServerResponse,
          err: { message?: string }
        ) => {
          const method = req.method ?? "";
          const url = req.url ?? "";
          return `${method} ${url} ${res.statusCode} - ${err.message ?? "error"}`;
        },
      },
    };
  }

  /**
   * Sanitize headers to remove sensitive information
   */
  private static sanitizeHeaders(
    headers: Record<string, unknown> | undefined | null
  ): Record<string, unknown> | undefined | null {
    if (!headers || typeof headers !== "object") {
      return headers;
    }

    const sensitiveKeys = [
      "authorization",
      "cookie",
      "set-cookie",
      "x-api-key",
      "x-auth-token",
      "password",
      "secret",
      "token",
      "jwt",
      "bearer",
    ];

    const sanitized: Record<string, unknown> = { ...headers } as Record<string, unknown>;

    Object.keys(sanitized).forEach(key => {
      if (sensitiveKeys.some(sensitive => key.toLowerCase().includes(sensitive.toLowerCase()))) {
        sanitized[key] = "[REDACTED]";
      }
    });

    return sanitized;
  }

  /**
   * Generate correlation ID
   */
  private static generateCorrelationId(): string {
    return `${Date.now()}-${Math.random().toString(36).substr(2, 9)}`;
  }

  /**
   * Get log levels for different environments
   */
  static getLogLevels(level: string): string[] {
    const logLevels: Record<string, string[]> = {
      error: ["error"],
      warn: ["error", "warn"],
      info: ["error", "warn", "info"],
      debug: ["error", "warn", "info", "debug"],
      verbose: ["error", "warn", "info", "debug", "verbose"],
    };

    return logLevels[level] || logLevels.info;
  }
}
@@ -1,7 +1,7 @@
import { Global, Module } from "@nestjs/common";
import { ConfigModule, ConfigService } from "@nestjs/config";
import { LoggerModule } from "nestjs-pino";
import { LoggingConfig } from "./logging.config";
import { createNestPinoConfig } from "@customer-portal/shared";

@Global()
@Module({
@@ -10,7 +10,7 @@ import { LoggingConfig } from "./logging.config";
      imports: [ConfigModule],
      inject: [ConfigService],
      useFactory: async (configService: ConfigService) =>
        await LoggingConfig.createPinoConfig(configService),
        await createNestPinoConfig(configService),
    }),
  ],
  exports: [LoggerModule],
@@ -139,6 +139,7 @@ async function bootstrap() {
  logger.log(
    `🗄️ Database: ${configService.get("DATABASE_URL", "postgresql://dev:dev@localhost:5432/portal_dev")}`
  );
  logger.log(`🔗 Prisma Studio: http://localhost:5555`);
  logger.log(`🔴 Redis: ${configService.get("REDIS_URL", "redis://localhost:6379")}`);

  if (configService.get("NODE_ENV") !== "production") {
apps/bff/tsconfig.build.json (new file, 15 lines)
@@ -0,0 +1,15 @@
{
  "extends": "./tsconfig.json",
  "compilerOptions": {
    "noEmit": false,
    "incremental": true,
    "tsBuildInfoFile": "./tsconfig.build.tsbuildinfo",
    "outDir": "./dist",
    "sourceMap": true,
    "declaration": false
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist", "test", "**/*.spec.ts"]
}
@@ -3,8 +3,8 @@
  "version": "0.1.0",
  "private": true,
  "scripts": {
    "dev": "next dev -p ${NEXT_PORT:-3000} --turbopack",
    "build": "next build",
    "dev": "next dev -p ${NEXT_PORT:-3000}",
    "build": "next build --turbopack",
    "build:turbo": "next build --turbopack",
    "start": "next start -p ${NEXT_PORT:-3000}",
    "lint": "eslint .",
@@ -37,7 +37,6 @@
    "@types/react": "^19.1.10",
    "@types/react-dom": "^19.1.7",
    "tailwindcss": "^4.1.12",
    "tw-animate-css": "^1.3.7",
    "typescript": "^5.9.2"
  }
}
@@ -1,134 +1,5 @@
/**
 * Application logger utility
 * Provides structured logging with appropriate levels for development and production
 * Compatible with backend logging standards
 */
import { createPinoLogger, getSharedLogger } from "@customer-portal/shared";

type LogLevel = "debug" | "info" | "warn" | "error";

interface LogEntry {
  level: LogLevel;
  message: string;
  data?: unknown;
  timestamp: string;
  service: string;
  environment: string;
}

class Logger {
  private isDevelopment = process.env.NODE_ENV === "development";
  private service = "customer-portal-frontend";

  private formatMessage(level: LogLevel, message: string, data?: unknown): LogEntry {
    return {
      level,
      message,
      data,
      timestamp: new Date().toISOString(),
      service: this.service,
      environment: process.env.NODE_ENV || "development",
    };
  }

  private log(level: LogLevel, message: string, data?: unknown): void {
    const entry = this.formatMessage(level, message, data);

    if (this.isDevelopment) {
      const safeData =
        data instanceof Error
          ? {
              name: data.name,
              message: data.message,
              stack: data.stack,
            }
          : data;

      const logData = {
        timestamp: entry.timestamp,
        level: entry.level.toUpperCase(),
        service: entry.service,
        message: entry.message,
        ...(safeData != null ? { data: safeData } : {}),
      };

      try {
        console.log(logData);
      } catch {
        // no-op
      }
    } else {
      // In production, structured logging for external services
      const logData = {
        ...entry,
        ...(data != null ? { data } : {}),
      };

      // For production, you might want to send to a logging service
      // For now, only log errors and warnings to console
      if (level === "error" || level === "warn") {
        try {
          console[level](JSON.stringify(logData));
        } catch {
          // no-op
        }
      }
    }
  }

  debug(message: string, data?: unknown): void {
    this.log("debug", message, data);
  }

  info(message: string, data?: unknown): void {
    this.log("info", message, data);
  }

  warn(message: string, data?: unknown): void {
    this.log("warn", message, data);
  }

  error(message: string, data?: unknown): void {
    this.log("error", message, data);
  }

  // Structured logging methods for better integration
  logApiCall(
    endpoint: string,
    method: string,
    status: number,
    duration: number,
    data?: unknown
  ): void {
    this.info(`API ${method} ${endpoint}`, {
      endpoint,
      method,
      status,
      duration: `${duration}ms`,
      ...(data != null ? { data } : {}),
    });
  }

  logUserAction(userId: string, action: string, data?: unknown): void {
    this.info(`User action: ${action}`, {
      userId,
      action,
      ...(data != null ? { data } : {}),
    });
  }

  logError(error: Error, context?: string, data?: unknown): void {
    this.error(`Error${context ? ` in ${context}` : ""}: ${error.message}`, {
      error: {
        name: error.name,
        message: error.message,
        stack: error.stack,
      },
      context,
      ...(data != null ? { data } : {}),
    });
  }
}

// Export singleton instance
export const logger = new Logger();
// Prefer a shared singleton so logs share correlationId/userId across modules
export const logger = getSharedLogger();
export default logger;
@@ -9,12 +9,23 @@
        "name": "next"
      }
    ],

    // Path mappings
    "paths": {
      "@/*": ["./src/*"]
    }
      "@/*": [
        "./src/*"
      ]
    },
    // Enforce TS-only in portal and keep strict mode explicit (inherits from root)
    "allowJs": false,
    "strict": true
  },
  "include": ["next-env.d.ts", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"],
  "exclude": ["node_modules"]
  "include": [
    "next-env.d.ts",
    "**/*.ts",
    "**/*.tsx",
    ".next/types/**/*.ts"
  ],
  "exclude": [
    "node_modules"
  ]
}
File diff suppressed because one or more lines are too long

package.json (13 changed lines)
@@ -9,16 +9,18 @@
  },
  "packageManager": "pnpm@10.15.0",
  "scripts": {
    "dev": "pnpm --parallel --recursive run dev",
    "build": "pnpm --recursive run build",
    "start": "pnpm --parallel --filter portal --filter @customer-portal/bff run start",
    "predev": "pnpm --filter @customer-portal/shared build",
    "dev": "./scripts/dev/manage.sh apps",
    "dev:all": "pnpm --parallel --filter @customer-portal/shared --filter @customer-portal/portal --filter @customer-portal/bff run dev",
    "build": "pnpm --recursive -w --if-present run build",
    "start": "pnpm --parallel --filter @customer-portal/portal --filter @customer-portal/bff run start",
    "test": "pnpm --recursive run test",
    "lint": "pnpm --recursive run lint",
    "lint:fix": "pnpm --recursive run lint:fix",
    "format": "prettier -w .",
    "format:check": "prettier -c .",
    "prepare": "husky",
    "type-check": "pnpm --recursive run type-check",
    "type-check": "pnpm --filter @customer-portal/shared build && pnpm --recursive run type-check",
    "clean": "pnpm --recursive run clean",
    "dev:start": "./scripts/dev/manage.sh start",
    "dev:stop": "./scripts/dev/manage.sh stop",
@@ -44,7 +46,8 @@
    "db:reset": "pnpm --filter @customer-portal/bff run db:reset",
    "update:check": "pnpm outdated --recursive",
    "update:all": "pnpm update --recursive --latest && pnpm audit && pnpm type-check",
    "update:safe": "pnpm update --recursive && pnpm audit && pnpm type-check"
    "update:safe": "pnpm update --recursive && pnpm audit && pnpm type-check",
    "dev:watch": "pnpm --parallel --filter @customer-portal/shared --filter @customer-portal/portal --filter @customer-portal/bff run dev"
  },
  "devDependencies": {
    "@eslint/js": "^9.13.0",
@@ -5,14 +5,29 @@
  "type": "module",
  "main": "dist/index.js",
  "types": "dist/index.d.ts",
  "private": true,
  "sideEffects": false,
  "files": [
    "dist"
  ],
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "default": "./dist/index.js"
    }
  },
  "scripts": {
    "build": "tsc",
    "dev": "tsc -w --preserveWatchOutput",
    "clean": "rm -rf dist",
    "type-check": "tsc --noEmit",
    "test": "echo \"No tests specified for shared package\"",
    "lint": "eslint .",
    "lint:fix": "eslint . --fix"
  },
  "dependencies": {
    "pino": "^9.9.0"
  },
  "devDependencies": {
    "typescript": "^5.9.2"
  }
@@ -5,3 +5,5 @@

export * from "./logger.config.js";
export * from "./logger.interface.js";
export * from "./pino-logger.js";
export * from "./nest-logger.config.js";
packages/shared/src/logging/nest-logger.config.ts (new file, 126 lines)
@@ -0,0 +1,126 @@
// Lightweight, framework-agnostic factory that returns an object compatible
// with nestjs-pino's LoggerModule.forRoot({ pinoHttp: {...} }) shape without importing types.
import { join } from "path";
import { mkdir } from "fs/promises";

export async function createNestPinoConfig(configService: {
  get<T = string>(key: string, defaultValue?: T): T;
}) {
  const nodeEnv = configService.get<string>("NODE_ENV", "development");
  const logLevel = configService.get<string>("LOG_LEVEL", "info");
  const appName = configService.get<string>("APP_NAME", "customer-portal-bff");

  if (nodeEnv === "production") {
    try {
      await mkdir("logs", { recursive: true });
    } catch {
      // ignore
    }
  }

  const pinoConfig: Record<string, unknown> = {
    level: logLevel,
    name: appName,
    base: {
      service: appName,
      environment: nodeEnv,
      pid: typeof process !== "undefined" ? process.pid : 0,
    },
    timestamp: true,
    redact: {
      paths: [
        "req.headers.authorization",
        "req.headers.cookie",
        "password",
        "password2",
        "token",
        "secret",
        "jwt",
        "apiKey",
        "params.password",
        "params.password2",
        "params.secret",
        "params.token",
      ],
      remove: true,
    },
    formatters: {
      level: (label: string) => ({ level: label }),
      bindings: () => ({}),
    },
    serializers: {
      req: (req: { method?: string; url?: string; remoteAddress?: string; remotePort?: number }) => ({
        method: req.method,
        url: req.url,
        remoteAddress: req.remoteAddress,
        remotePort: req.remotePort,
      }),
      res: (res: { statusCode: number }) => ({ statusCode: res.statusCode }),
      err: (err: { constructor: { name: string }; message: string; stack?: string; code?: string; status?: number }) => ({
        type: err.constructor.name,
        message: err.message,
        stack: err.stack,
        ...(err.code && { code: err.code }),
        ...(err.status && { status: err.status }),
      }),
    },
  };

  if (nodeEnv === "development") {
    (pinoConfig as any).transport = {
      target: "pino-pretty",
      options: {
        colorize: true,
        translateTime: "yyyy-mm-dd HH:MM:ss",
        ignore: "pid,hostname",
        singleLine: false,
        hideObject: false,
      },
    };
  }

  if (nodeEnv === "production") {
    (pinoConfig as any).transport = {
      targets: [
        { target: "pino/file", level: logLevel, options: { destination: 1 } },
        {
          target: "pino/file",
          level: "info",
          options: { destination: join("logs", `${appName}-combined.log`), mkdir: true },
        },
        {
          target: "pino/file",
          level: "error",
          options: { destination: join("logs", `${appName}-error.log`), mkdir: true },
        },
      ],
    };
  }

  return {
    pinoHttp: {
      ...(pinoConfig as any),
      genReqId: (req: any, res: any) => {
        const existingIdHeader = req.headers?.["x-correlation-id"];
        const existingId = Array.isArray(existingIdHeader) ? existingIdHeader[0] : existingIdHeader;
        if (existingId) return existingId;
        const correlationId = `${Date.now()}-${Math.random().toString(36).substr(2, 9)}`;
        res.setHeader?.("x-correlation-id", correlationId);
        return correlationId;
      },
      customLogLevel: (_req: any, res: any, err?: unknown) => {
        if (res.statusCode >= 400 && res.statusCode < 500) return "warn";
        if (res.statusCode >= 500 || err) return "error";
        return "silent" as any;
      },
      customSuccessMessage: () => "",
      customErrorMessage: (req: any, res: any, err: { message?: string }) => {
        const method = req.method ?? "";
        const url = req.url ?? "";
        return `${method} ${url} ${res.statusCode} - ${err.message ?? "error"}`;
      },
    },
  };
}
packages/shared/src/logging/pino-logger.ts (new file, 178 lines)
@@ -0,0 +1,178 @@
import pino from "pino";
import { DEFAULT_LOG_CONFIG, formatLogEntry, sanitizeLogData } from "./logger.config.js";
import type { ILogger, LoggerOptions } from "./logger.interface.js";

/**
 * Create a cross-platform Pino-based logger that implements ILogger
 * Works in Node and browser environments
 */
export function createPinoLogger(options: LoggerOptions = {}): ILogger {
  const level = options.level ?? DEFAULT_LOG_CONFIG.level;
  const service = options.service ?? DEFAULT_LOG_CONFIG.service;
  const environment = options.environment ?? DEFAULT_LOG_CONFIG.environment;

  // Context that flows with the logger instance
  let correlationId: string | undefined = options.context?.correlationId;
  let userId: string | undefined = options.context?.userId;
  let requestId: string | undefined = options.context?.requestId;

  // Configure pino for both Node and browser
  const isBrowser = typeof window !== "undefined";
  const pinoLogger = pino({
    level,
    name: service,
    base: {
      service,
      environment,
    },
    // Pretty output only in development for Node; browsers format via console
    ...(isBrowser ? { browser: { asObject: true } } : {}),
    formatters: {
      level: (label: string) => ({ level: label }),
      bindings: () => ({}),
    },
    redact: {
      paths: [
        "req.headers.authorization",
        "req.headers.cookie",
        "password",
        "password2",
        "token",
        "secret",
        "jwt",
        "apiKey",
        "params.password",
        "params.password2",
        "params.secret",
        "params.token",
      ],
      remove: true,
    },
  });

  function withContext(data?: unknown): Record<string, unknown> | undefined {
    if (data == null) return undefined;
    const sanitized = sanitizeLogData(data);
    return {
      ...(correlationId ? { correlationId } : {}),
      ...(userId ? { userId } : {}),
      ...(requestId ? { requestId } : {}),
      data: sanitized,
    } as Record<string, unknown>;
  }

  const api: ILogger = {
    debug(message, data) {
      pinoLogger.debug(withContext(data), message);
    },
    info(message, data) {
      pinoLogger.info(withContext(data), message);
    },
    warn(message, data) {
      pinoLogger.warn(withContext(data), message);
    },
    error(message, data) {
      pinoLogger.error(withContext(data), message);
    },
    trace(message, data) {
      pinoLogger.trace(withContext(data), message);
    },

    logApiCall(endpoint, method, status, duration, data) {
      pinoLogger.info(
        withContext({ endpoint, method, status, duration: `${duration}ms`, ...(data ? { data } : {}) }),
        `API ${method} ${endpoint}`
      );
    },
    logUserAction(user, action, data) {
      pinoLogger.info(withContext({ userId: user, action, ...(data ? { data } : {}) }), "User action");
    },
    logError(error, context, data) {
      pinoLogger.error(
        withContext({
          error: { name: error.name, message: error.message, stack: error.stack },
          ...(context ? { context } : {}),
          ...(data ? { data } : {}),
        }),
        `Error${context ? ` in ${context}` : ""}: ${error.message}`
      );
    },
    logRequest(req, data) {
      pinoLogger.info(withContext({ req, ...(data ? { data } : {}) }), "Request");
    },
    logResponse(res, data) {
      pinoLogger.info(withContext({ res, ...(data ? { data } : {}) }), "Response");
    },

    setCorrelationId(id) {
      correlationId = id;
    },
    setUserId(id) {
      userId = id;
    },
    setRequestId(id) {
      requestId = id;
    },

    child(context) {
      const child = pinoLogger.child(context);
      const childLogger = createPinoLogger({
        level,
        service,
        environment,
        context: {
          correlationId,
          userId,
          requestId,
          ...context,
        },
      });
      // Bind methods to use child pino instance
      // We cannot replace the underlying pino instance easily, so we wrap methods
      return {
|
||||
...childLogger,
|
||||
debug(message, data) {
|
||||
child.debug(withContext(data), message);
|
||||
},
|
||||
info(message, data) {
|
||||
child.info(withContext(data), message);
|
||||
},
|
||||
warn(message, data) {
|
||||
child.warn(withContext(data), message);
|
||||
},
|
||||
error(message, data) {
|
||||
child.error(withContext(data), message);
|
||||
},
|
||||
trace(message, data) {
|
||||
child.trace(withContext(data), message);
|
||||
},
|
||||
} as ILogger;
|
||||
},
|
||||
|
||||
async flush() {
|
||||
// Flushing is typically relevant in Node streams; browsers are no-ops
|
||||
try {
|
||||
if (typeof (pinoLogger as unknown as { flush?: () => void }).flush === "function") {
|
||||
(pinoLogger as unknown as { flush?: () => void }).flush?.();
|
||||
}
|
||||
} catch {
|
||||
// no-op
|
||||
}
|
||||
},
|
||||
};
|
||||
|
||||
return api;
|
||||
}
|
||||
|
||||
// Default singleton for convenience
|
||||
let defaultLogger: ILogger | undefined;
|
||||
export function getSharedLogger(): ILogger {
|
||||
if (!defaultLogger) {
|
||||
defaultLogger = createPinoLogger();
|
||||
}
|
||||
return defaultLogger;
|
||||
}
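The factory above combines two patterns worth seeing in isolation: merging ambient context into every log entry (`withContext`) and dropping secret fields (pino's `redact` with `remove: true`). The sketch below is a minimal, dependency-free illustration of both; `mergeContext` and `scrubSecrets` are hypothetical helper names, not exports of this module.

```typescript
// Illustrative sketch of the context-merge and secret-redaction patterns.
// mergeContext and scrubSecrets are hypothetical helpers, not module exports.
type AmbientContext = { correlationId?: string; userId?: string; requestId?: string };

const SECRET_KEYS = new Set(["password", "password2", "token", "secret", "jwt", "apiKey"]);

// Drop top-level keys on the redact list (mirrors redact.remove: true)
function scrubSecrets(data: Record<string, unknown>): Record<string, unknown> {
  return Object.fromEntries(
    Object.entries(data).filter(([key]) => !SECRET_KEYS.has(key))
  );
}

// Wrap the payload under `data` and hoist the ambient context beside it,
// as withContext does above
function mergeContext(
  ctx: AmbientContext,
  data?: Record<string, unknown>
): Record<string, unknown> | undefined {
  if (data == null) return undefined;
  return {
    ...(ctx.correlationId ? { correlationId: ctx.correlationId } : {}),
    ...(ctx.userId ? { userId: ctx.userId } : {}),
    ...(ctx.requestId ? { requestId: ctx.requestId } : {}),
    data: scrubSecrets(data),
  };
}

const entry = mergeContext(
  { correlationId: "abc-123" },
  { action: "login", password: "hunter2" }
);
// entry carries correlationId at the top level; password is removed from the payload
```

Note that the real module scrubs recursively via `sanitizeLogData` and pino's path-based `redact`; the top-level filter here only demonstrates the shape of the result.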
pnpm-lock.yaml (generated)
@@ -295,6 +295,10 @@ importers:
        version: 5.9.2

  packages/shared:
    dependencies:
      pino:
        specifier: ^9.9.0
        version: 9.9.0
    devDependencies:
      typescript:
        specifier: ^5.9.2
@@ -15,13 +15,11 @@ PROJECT_NAME="portal-dev"
# Colors
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
RED='\033[0;31m'
NC='\033[0m'

log() { echo -e "${GREEN}[DEV] $1${NC}"; }
warn() { echo -e "${YELLOW}[DEV] $1${NC}"; }
info() { echo -e "${BLUE}[DEV] $1${NC}"; }
error() { echo -e "${RED}[DEV] ERROR: $1${NC}"; exit 1; }

# Change to project root

@@ -111,10 +109,13 @@ start_apps() {
    log "🔗 Database: postgresql://dev:dev@localhost:5432/portal_dev"
    log "🔗 Redis: redis://localhost:6379"
    log "📚 API Docs: http://localhost:${BFF_PORT:-4000}/api/docs"
    log ""
    log "Starting apps with hot-reload..."

    pnpm dev

    # Start Prisma Studio (opens browser)
    (cd "$PROJECT_ROOT/apps/bff" && pnpm db:studio &)

    # Start apps (portal + bff) with hot reload in parallel
    pnpm --parallel --filter @customer-portal/portal --filter @customer-portal/bff run dev
}

# Reset environment
@@ -26,13 +26,9 @@

    // Performance and compatibility
    "skipLibCheck": true,
    "allowJs": true,

    // Build settings
    "incremental": true,
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true
    "incremental": true
  },

  // This is a workspace root - individual packages extend this