https://vimeo.com/856341807?fl=pl&fe=cm : Node.js Event Loop
https://gist.github.com/carefree-ladka/b134007f6749fc3bf743fcdd5013ce6a : Libuv
Node.js is a JavaScript runtime built on Chrome's V8 JavaScript engine. It allows developers to run JavaScript on the server-side, enabling full-stack JavaScript development.
Key Points:
- Built on V8 engine (written in C++)
- Event-driven, non-blocking I/O model
- Designed for building scalable network applications
- Uses libuv for handling asynchronous operations
Common Interview Question: "Explain the difference between Node.js and traditional web servers like Apache."
Answer: Node.js uses an event-driven, non-blocking I/O model, handling multiple concurrent connections with a single thread. Traditional servers like Apache use a multi-threaded approach, creating a new thread for each request, which can be resource-intensive.
The Event Loop is the heart of Node.js's asynchronous architecture. It continuously checks the call stack and task queues to execute pending operations.
Event Loop Phases:
- Timers - executes setTimeout() and setInterval() callbacks
- Pending callbacks - executes I/O callbacks deferred to the next loop iteration
- Idle, prepare - internal use only
- Poll - retrieves new I/O events
- Check - executes setImmediate() callbacks
- Close callbacks - executes close event callbacks
Common Interview Question: "Explain the order of execution for setTimeout, setImmediate, and process.nextTick."
console.log('1');
setTimeout(() => console.log('2'), 0);
setImmediate(() => console.log('3'));
process.nextTick(() => console.log('4'));
console.log('5');
// Output: 1, 5, 4, 2, 3 (or 2 and 3 may swap depending on timing)
Non-blocking I/O allows Node.js to handle multiple operations concurrently without waiting for any single operation to complete.
Example:
// Blocking (synchronous)
const fs = require('fs');
const data = fs.readFileSync('file.txt', 'utf8');
console.log(data);
console.log('Done');
// Non-blocking (asynchronous)
fs.readFile('file.txt', 'utf8', (err, data) => {
if (err) throw err;
console.log(data);
});
console.log('Done'); // This runs before file is read
Node.js is single-threaded for JavaScript execution but uses multiple threads internally via libuv for I/O operations.
Advantages:
- Lower memory overhead
- No thread synchronization issues
- Simplified development model
Limitations:
- CPU-intensive tasks can block the event loop
- Cannot fully utilize multi-core systems without clustering
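To make the first limitation concrete, here is a small sketch of a synchronous loop starving a timer (the iteration count is arbitrary):

```javascript
const start = Date.now();

setTimeout(() => {
  // Scheduled for 10 ms, but it can only fire after the loop below
  // releases the event loop, so the measured delay is much larger.
  console.log(`timer fired after ${Date.now() - start} ms`);
}, 10);

// CPU-bound work blocks the single JavaScript thread
let total = 0;
for (let i = 0; i < 1e9; i++) total += i;
console.log('loop done');
```

While the loop runs, no timers, I/O callbacks, or incoming requests are serviced; this is why CPU-heavy work belongs in worker threads or separate processes.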
CommonJS (Legacy):
// Exporting
module.exports = { foo, bar };
exports.baz = () => {};
// Importing
const myModule = require('./myModule');
ES Modules (Modern):
// Exporting
export const foo = 'bar';
export default MyClass;
// Importing
import MyClass, { foo } from './myModule.js';
Key Differences:
- CommonJS loads synchronously; ES Modules load asynchronously
- CommonJS uses require(); ES Modules use import
- ES Modules are the standard and support tree-shaking
- Need "type": "module" in package.json or a .mjs extension for ES Modules
require():
- Synchronous loading
- Can be called conditionally
- Returns the exported value
- Part of CommonJS
import:
- Static (parsed before execution)
- Cannot be used conditionally (use dynamic import() for that)
- Hoisted to the top
- Part of ES6 standard
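Because import statements are static, conditional loading needs the dynamic import() function, which also works from CommonJS and returns a Promise for the module namespace (node:path is used here only as a convenient built-in):

```javascript
// import is hoisted and must sit at module top level; import() is a
// function call that can run anywhere, including inside conditions.
async function loadPath() {
  const path = await import('node:path'); // Promise -> module namespace
  return path.join('users', 'john');
}

loadPath().then(console.log); // 'users/john' on POSIX systems
```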
package.json Structure:
{
"name": "my-app",
"version": "1.0.0",
"description": "My application",
"main": "index.js",
"scripts": {
"start": "node index.js",
"test": "jest",
"dev": "nodemon index.js"
},
"dependencies": {
"express": "^4.18.0"
},
"devDependencies": {
"nodemon": "^2.0.0"
}
}
Common Interview Questions:
- What's the difference between dependencies and devDependencies?
- What do ^, ~, and * mean in version numbers?
  - ^1.2.3 - compatible with 1.x.x (minor and patch updates)
  - ~1.2.3 - compatible with 1.2.x (patch updates only)
  - * - any version
Best Practices:
- Use package-lock.json or yarn.lock for consistent installs
- Regularly update dependencies for security patches
- Audit dependencies with npm audit
- Consider using tools like Snyk or Dependabot
Traditional way of handling asynchronous operations in Node.js.
function fetchData(callback) {
setTimeout(() => {
callback(null, 'Data fetched');
}, 1000);
}
fetchData((err, data) => {
if (err) console.error(err);
else console.log(data);
});
Callback Hell:
getData((err, data) => {
processData(data, (err, processed) => {
saveData(processed, (err, result) => {
// Deeply nested...
});
});
});
Promises are a cleaner alternative to callbacks, representing the eventual completion or failure of an asynchronous operation.
function fetchData() {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve('Data fetched');
}, 1000);
});
}
fetchData()
.then(data => console.log(data))
.catch(err => console.error(err));
// Chaining
fetchData()
.then(data => processData(data))
.then(processed => saveData(processed))
.then(result => console.log(result))
.catch(err => console.error(err));
Async/await is syntactic sugar over Promises, making asynchronous code look synchronous.
async function getData() {
try {
const data = await fetchData();
const processed = await processData(data);
const result = await saveData(processed);
return result;
} catch (err) {
console.error(err);
}
}
// Parallel execution
async function getMultipleData() {
const [data1, data2, data3] = await Promise.all([
fetchData1(),
fetchData2(),
fetchData3()
]);
}
Try-Catch with Async/Await:
async function riskyOperation() {
try {
const result = await someAsyncOperation();
return result;
} catch (error) {
console.error('Error:', error.message);
throw error; // Re-throw if needed
}
}
Promise Error Handling:
promise
.then(result => handleSuccess(result))
.catch(error => handleError(error))
.finally(() => cleanup());
Uncaught Exception Handling:
process.on('uncaughtException', (err) => {
console.error('Uncaught Exception:', err);
process.exit(1);
});
process.on('unhandledRejection', (reason, promise) => {
console.error('Unhandled Rejection at:', promise, 'reason:', reason);
});
const fs = require('fs');
const fsPromises = require('fs').promises;
// Async (callback)
fs.readFile('file.txt', 'utf8', (err, data) => {
if (err) throw err;
console.log(data);
});
// Async (promises)
async function readFileAsync() {
try {
const data = await fsPromises.readFile('file.txt', 'utf8');
console.log(data);
} catch (err) {
console.error(err);
}
}
// Sync (blocking - use sparingly)
const data = fs.readFileSync('file.txt', 'utf8');
// Writing files
await fsPromises.writeFile('output.txt', 'Hello World');
// Appending
await fsPromises.appendFile('log.txt', 'New log entry\n');
// Directory operations
await fsPromises.mkdir('newdir', { recursive: true });
const files = await fsPromises.readdir('.');
const http = require('http');
const server = http.createServer((req, res) => {
res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end('Hello World\n');
});
server.listen(3000, () => {
console.log('Server running on port 3000');
});
// Making HTTP requests
const https = require('https');
const options = {
hostname: 'api.example.com',
port: 443,
path: '/data',
method: 'GET'
};
const req = https.request(options, (res) => {
let data = '';
res.on('data', (chunk) => data += chunk);
res.on('end', () => console.log(data));
});
req.end();
const path = require('path');
path.join('/users', 'john', 'documents'); // '/users/john/documents'
path.resolve('file.txt'); // Absolute path
path.basename('/users/john/file.txt'); // 'file.txt'
path.dirname('/users/john/file.txt'); // '/users/john'
path.extname('file.txt'); // '.txt'
path.parse('/users/john/file.txt');
// { root: '/', dir: '/users/john', base: 'file.txt', ext: '.txt', name: 'file' }
const os = require('os');
os.platform(); // 'linux', 'darwin', 'win32'
os.arch(); // 'x64'
os.cpus(); // Array of CPU information
os.totalmem(); // Total memory in bytes
os.freemem(); // Free memory in bytes
os.hostname(); // Hostname
os.uptime(); // System uptime in seconds
const EventEmitter = require('events');
class MyEmitter extends EventEmitter {}
const myEmitter = new MyEmitter();
myEmitter.on('event', (arg) => {
console.log('Event occurred:', arg);
});
myEmitter.emit('event', 'some data');
// Once (fires only once)
myEmitter.once('single', () => {
console.log('This runs only once');
});
// Remove listeners
const callback = () => console.log('Callback');
myEmitter.on('test', callback);
myEmitter.removeListener('test', callback);
Streams are collections of data that might not be available all at once and don't have to fit in memory.
Types:
- Readable (fs.createReadStream)
- Writable (fs.createWriteStream)
- Duplex (TCP sockets)
- Transform (zlib, crypto)
const fs = require('fs');
// Reading large files efficiently
const readStream = fs.createReadStream('large-file.txt');
const writeStream = fs.createWriteStream('output.txt');
readStream.pipe(writeStream);
// Manual handling
readStream.on('data', (chunk) => {
console.log('Received chunk:', chunk.length);
});
readStream.on('end', () => {
console.log('Finished reading');
});
// Transform stream
const { Transform } = require('stream');
const upperCaseTransform = new Transform({
transform(chunk, encoding, callback) {
this.push(chunk.toString().toUpperCase());
callback();
}
});
readStream.pipe(upperCaseTransform).pipe(writeStream);
Buffers handle binary data in Node.js.
// Creating buffers
const buf1 = Buffer.from('Hello', 'utf8');
const buf2 = Buffer.alloc(10); // 10 bytes of zeros
const buf3 = Buffer.allocUnsafe(10); // Uninitialized
// Reading/writing
buf1.toString(); // 'Hello'
buf1.toString('hex'); // Hex representation
// Concatenating
const buf4 = Buffer.concat([buf1, buf2]);
// Slicing (Buffer.prototype.slice is deprecated; use subarray)
const slice = buf1.subarray(0, 2);
Middleware functions have access to the request object, the response object, and the next middleware function in the request-response cycle.
const express = require('express');
const app = express();
// Application-level middleware
app.use((req, res, next) => {
console.log('Time:', Date.now());
next();
});
// Built-in middleware
app.use(express.json()); // Parse JSON bodies
app.use(express.urlencoded({ extended: true })); // Parse URL-encoded bodies
app.use(express.static('public')); // Serve static files
// Router-level middleware
const router = express.Router();
router.use((req, res, next) => {
console.log('Router middleware');
next();
});
// Third-party middleware
const morgan = require('morgan');
app.use(morgan('combined'));
const cors = require('cors');
app.use(cors());
// Basic routing
app.get('/', (req, res) => {
res.send('Hello World');
});
app.post('/users', (req, res) => {
res.json({ message: 'User created' });
});
// Route parameters
app.get('/users/:id', (req, res) => {
const userId = req.params.id;
res.json({ userId });
});
// Query parameters
app.get('/search', (req, res) => {
const query = req.query.q;
res.json({ query });
});
// Route handlers (multiple callbacks)
app.get('/example',
(req, res, next) => {
console.log('First handler');
next();
},
(req, res) => {
res.send('Second handler');
}
);
// Router organization
const userRouter = express.Router();
userRouter.get('/', getAllUsers);
userRouter.post('/', createUser);
userRouter.get('/:id', getUserById);
userRouter.put('/:id', updateUser);
userRouter.delete('/:id', deleteUser);
app.use('/api/users', userRouter);
// Error handling middleware (4 parameters)
app.use((err, req, res, next) => {
console.error(err.stack);
res.status(err.status || 500).json({
error: {
message: err.message,
status: err.status || 500
}
});
});
// Custom error class
class AppError extends Error {
constructor(message, statusCode) {
super(message);
this.statusCode = statusCode;
this.isOperational = true;
}
}
// Usage
app.get('/error', (req, res, next) => {
next(new AppError('Something went wrong', 400));
});
// Async error handling
const asyncHandler = (fn) => (req, res, next) => {
Promise.resolve(fn(req, res, next)).catch(next);
};
app.get('/async', asyncHandler(async (req, res) => {
const data = await fetchData();
res.json(data);
}));
// Request object
app.post('/example', (req, res) => {
req.body; // Parsed body (requires body parser)
req.params; // Route parameters
req.query; // Query string parameters
req.headers; // Request headers
req.cookies; // Cookies (requires cookie-parser)
req.method; // HTTP method
req.path; // Request path
req.ip; // Client IP address
req.get('Content-Type'); // Get header value
});
// Response object
app.get('/example', (req, res) => {
res.status(200); // Set status code
res.send('Hello'); // Send string
res.json({ key: 'value' }); // Send JSON
res.sendFile('/path/to/file.html'); // Send file
res.redirect('/other-route'); // Redirect
res.set('Content-Type', 'text/html'); // Set header
res.cookie('token', 'abc123'); // Set cookie
res.download('/path/to/file.pdf'); // Download file
});
const mongoose = require('mongoose');
// Connection
// useNewUrlParser and useUnifiedTopology are no longer needed in Mongoose 6+
mongoose.connect('mongodb://localhost:27017/mydb');
// Schema definition
const userSchema = new mongoose.Schema({
name: { type: String, required: true },
email: { type: String, required: true, unique: true },
age: { type: Number, min: 0 },
createdAt: { type: Date, default: Date.now }
});
// Model
const User = mongoose.model('User', userSchema);
// CRUD operations
// Create
const user = new User({ name: 'John', email: 'john@example.com' });
await user.save();
// or
await User.create({ name: 'Jane', email: 'jane@example.com' });
// Read
const users = await User.find({ age: { $gte: 18 } });
const user = await User.findById(id);
const user = await User.findOne({ email: 'john@example.com' });
// Update
await User.updateOne({ _id: id }, { name: 'John Doe' });
await User.findByIdAndUpdate(id, { age: 30 }, { new: true });
// Delete
await User.deleteOne({ _id: id });
await User.findByIdAndDelete(id);
// Middleware (hooks)
userSchema.pre('save', async function(next) {
if (this.isModified('password')) {
this.password = await bcrypt.hash(this.password, 10);
}
next();
});
// Virtual properties
userSchema.virtual('fullName').get(function() {
return `${this.firstName} ${this.lastName}`;
});
// Population (relationships)
const postSchema = new mongoose.Schema({
title: String,
author: { type: mongoose.Schema.Types.ObjectId, ref: 'User' }
});
const posts = await Post.find().populate('author');
const { Sequelize, DataTypes, Op } = require('sequelize');
// Connection
const sequelize = new Sequelize('database', 'username', 'password', {
host: 'localhost',
dialect: 'postgres'
});
// Model definition
const User = sequelize.define('User', {
id: {
type: DataTypes.INTEGER,
primaryKey: true,
autoIncrement: true
},
name: {
type: DataTypes.STRING,
allowNull: false
},
email: {
type: DataTypes.STRING,
unique: true,
allowNull: false,
validate: {
isEmail: true
}
},
age: {
type: DataTypes.INTEGER,
validate: {
min: 0
}
}
});
// Sync (create tables)
await sequelize.sync();
// CRUD operations
// Create
const user = await User.create({ name: 'John', email: 'john@example.com' });
// Read
const users = await User.findAll({ where: { age: { [Op.gte]: 18 } } });
const user = await User.findByPk(id);
const user = await User.findOne({ where: { email: 'john@example.com' } });
// Update
await User.update({ name: 'John Doe' }, { where: { id } });
// Delete
await User.destroy({ where: { id } });
// Associations
User.hasMany(Post);
Post.belongsTo(User);
const posts = await Post.findAll({ include: User });
// Transactions
const t = await sequelize.transaction();
try {
await User.create({ name: 'John' }, { transaction: t });
await Post.create({ title: 'Post' }, { transaction: t });
await t.commit();
} catch (error) {
await t.rollback();
}
// MongoDB
mongoose.connect('mongodb://localhost:27017/mydb', {
maxPoolSize: 10, // Maintain up to 10 socket connections (poolSize in older drivers)
serverSelectionTimeoutMS: 5000,
socketTimeoutMS: 45000
});
// PostgreSQL with pg
const { Pool } = require('pg');
const pool = new Pool({
user: 'username',
host: 'localhost',
database: 'mydb',
password: 'password',
port: 5432,
max: 20, // Maximum pool size
idleTimeoutMillis: 30000,
connectionTimeoutMillis: 2000
});
const client = await pool.connect();
try {
const result = await client.query('SELECT * FROM users');
console.log(result.rows);
} finally {
client.release();
}
ORMs (Object-Relational Mapping):
- Examples: Sequelize, TypeORM, Mongoose
- Pros: Abstraction, type safety, relationships, migrations
- Cons: Performance overhead, learning curve, less control
Query Builders:
- Examples: Knex.js
- Pros: More control, better performance, flexible
- Cons: More verbose, manual relationship handling
// Knex.js example
const knex = require('knex')({
client: 'pg',
connection: {
host: 'localhost',
user: 'username',
password: 'password',
database: 'mydb'
}
});
const users = await knex('users')
.where('age', '>=', 18)
.select('*');
await knex('users').insert({ name: 'John', email: 'john@example.com' });
await knex('users').where('id', id).update({ name: 'John Doe' });
await knex('users').where('id', id).del();
Authentication - Verifying who you are
Authorization - Verifying what you can access
// Basic authentication middleware
const authenticate = (req, res, next) => {
const token = req.headers.authorization?.split(' ')[1];
if (!token) {
return res.status(401).json({ error: 'No token provided' });
}
try {
const decoded = jwt.verify(token, process.env.JWT_SECRET);
req.user = decoded;
next();
} catch (error) {
res.status(401).json({ error: 'Invalid token' });
}
};
// Role-based authorization
const authorize = (...roles) => {
return (req, res, next) => {
if (!roles.includes(req.user.role)) {
return res.status(403).json({ error: 'Forbidden' });
}
next();
};
};
// Usage
app.get('/admin', authenticate, authorize('admin'), (req, res) => {
res.json({ message: 'Admin access granted' });
});
const jwt = require('jsonwebtoken');
// Sign (create) token
const token = jwt.sign(
{ userId: user.id, email: user.email },
process.env.JWT_SECRET,
{ expiresIn: '24h' }
);
// Verify token
try {
const decoded = jwt.verify(token, process.env.JWT_SECRET);
console.log(decoded); // { userId, email, iat, exp }
} catch (error) {
console.error('Invalid token');
}
// Refresh token pattern
const generateTokens = (user) => {
const accessToken = jwt.sign(
{ userId: user.id },
process.env.ACCESS_TOKEN_SECRET,
{ expiresIn: '15m' }
);
const refreshToken = jwt.sign(
{ userId: user.id },
process.env.REFRESH_TOKEN_SECRET,
{ expiresIn: '7d' }
);
return { accessToken, refreshToken };
};
const cors = require('cors');
// Simple usage
app.use(cors());
// Configured CORS
app.use(cors({
origin: 'https://example.com',
credentials: true,
optionsSuccessStatus: 200
}));
// Dynamic origin
const corsOptions = {
origin: function (origin, callback) {
const whitelist = ['https://example.com', 'https://app.example.com'];
if (whitelist.indexOf(origin) !== -1 || !origin) {
callback(null, true);
} else {
callback(new Error('Not allowed by CORS'));
}
}
};
app.use(cors(corsOptions));
// Preflight handling
app.options('*', cors());
const { body, validationResult } = require('express-validator');
// Validation middleware
const validateUser = [
body('email').isEmail().normalizeEmail(),
body('password').isLength({ min: 8 }).trim().escape(),
body('age').isInt({ min: 0, max: 120 }),
(req, res, next) => {
const errors = validationResult(req);
if (!errors.isEmpty()) {
return res.status(400).json({ errors: errors.array() });
}
next();
}
];
app.post('/register', validateUser, async (req, res) => {
// Process validated data
});
// Joi validation
const Joi = require('joi');
const userSchema = Joi.object({
email: Joi.string().email().required(),
password: Joi.string().min(8).required(),
age: Joi.number().integer().min(0).max(120)
});
const { error, value } = userSchema.validate(req.body);
if (error) {
return res.status(400).json({ error: error.details });
}
SQL Injection Prevention:
// Bad (vulnerable)
const query = `SELECT * FROM users WHERE email = '${email}'`;
// Good (parameterized)
const query = 'SELECT * FROM users WHERE email = $1';
const result = await client.query(query, [email]);
XSS Prevention:
- Escape user input
- Use Content Security Policy headers
- Validate and sanitize all inputs
const helmet = require('helmet');
app.use(helmet());
app.use(helmet.contentSecurityPolicy({
directives: {
defaultSrc: ["'self'"],
scriptSrc: ["'self'", "'unsafe-inline'"]
}
}));
CSRF Protection:
const csrf = require('csurf');
const csrfProtection = csrf({ cookie: true });
app.use(csrfProtection);
app.get('/form', (req, res) => {
res.render('form', { csrfToken: req.csrfToken() });
});
Rate Limiting:
const rateLimit = require('express-rate-limit');
const limiter = rateLimit({
windowMs: 15 * 60 * 1000, // 15 minutes
max: 100 // limit each IP to 100 requests per windowMs
});
app.use('/api/', limiter);
Node.js runs JavaScript on a single thread. Clustering allows you to utilize all CPU cores.
const cluster = require('cluster');
const os = require('os');
const numCPUs = os.cpus().length;
if (cluster.isPrimary) { // cluster.isMaster before Node 16
console.log(`Master ${process.pid} is running`);
// Fork workers
for (let i = 0; i < numCPUs; i++) {
cluster.fork();
}
cluster.on('exit', (worker, code, signal) => {
console.log(`Worker ${worker.process.pid} died`);
cluster.fork(); // Restart worker
});
} else {
// Workers share the TCP connection
const app = require('./app');
app.listen(3000);
console.log(`Worker ${process.pid} started`);
}
Worker threads run CPU-intensive tasks without blocking the main thread.
const { Worker } = require('worker_threads');
// Main thread
function runService(workerData) {
return new Promise((resolve, reject) => {
const worker = new Worker('./worker.js', { workerData });
worker.on('message', resolve);
worker.on('error', reject);
worker.on('exit', (code) => {
if (code !== 0) {
reject(new Error(`Worker stopped with exit code ${code}`));
}
});
});
}
// Usage
const result = await runService({ task: 'heavy-computation' });
// worker.js
const { parentPort, workerData } = require('worker_threads');
function heavyComputation(data) {
// CPU-intensive work
let result = 0;
for (let i = 0; i < 1e9; i++) {
result += i;
}
return result;
}
const result = heavyComputation(workerData);
parentPort.postMessage(result);
// In-memory caching with node-cache
const NodeCache = require('node-cache');
const cache = new NodeCache({ stdTTL: 600 }); // 10 minutes
app.get('/api/data', async (req, res) => {
const cacheKey = 'data';
const cachedData = cache.get(cacheKey);
if (cachedData) {
return res.json(cachedData);
}
const data = await fetchDataFromDB();
cache.set(cacheKey, data);
res.json(data);
});
// Redis caching
const redis = require('redis');
const client = redis.createClient();
await client.connect(); // Required in node-redis v4+
app.get('/api/user/:id', async (req, res) => {
const { id } = req.params;
const cacheKey = `user:${id}`;
// Try cache first
const cached = await client.get(cacheKey);
if (cached) {
return res.json(JSON.parse(cached));
}
// Fetch from DB
const user = await User.findById(id);
// Store in cache
await client.setEx(cacheKey, 3600, JSON.stringify(user));
res.json(user);
});
// Cache invalidation
await client.del(`user:${id}`);
await client.flushAll(); // Clear all cache
Using PM2:
pm2 start app.js -i max # Start app with max instances (CPU cores)
pm2 start app.js -i 4 # Start with 4 instances
Nginx as reverse proxy:
upstream backend {
server localhost:3000;
server localhost:3001;
server localhost:3002;
server localhost:3003;
}
server {
listen 80;
location / {
proxy_pass http://backend;
}
}
// Monitor memory usage
const formatMemoryUsage = (data) => `${Math.round(data / 1024 / 1024 * 100) / 100} MB`;
const memoryUsage = process.memoryUsage();
console.log({
rss: formatMemoryUsage(memoryUsage.rss),
heapTotal: formatMemoryUsage(memoryUsage.heapTotal),
heapUsed: formatMemoryUsage(memoryUsage.heapUsed),
external: formatMemoryUsage(memoryUsage.external)
});
// Heap dump
const v8 = require('v8');
const fs = require('fs');
const heapSnapshot = v8.writeHeapSnapshot();
console.log('Heap snapshot written to', heapSnapshot);
// Increase heap size
// node --max-old-space-size=4096 app.js
// Memory leaks to avoid
let globalArray = []; // Don't accumulate in global scope
setInterval(() => {
globalArray.push(new Array(1000000)); // Memory leak!
}, 1000);
// Fix: clear when done or use WeakMap/WeakSet
Jest Example:
// sum.js
function sum(a, b) {
return a + b;
}
module.exports = sum;
// sum.test.js
const sum = require('./sum');
describe('sum function', () => {
test('adds 1 + 2 to equal 3', () => {
expect(sum(1, 2)).toBe(3);
});
test('handles negative numbers', () => {
expect(sum(-1, -2)).toBe(-3);
});
test('handles zero', () => {
expect(sum(0, 5)).toBe(5);
});
});
// Async testing
test('fetches user data', async () => {
const data = await fetchUserData();
expect(data.name).toBe('John');
});
// Using matchers
expect(value).toBe(4);
expect(array).toContain('item');
expect(obj).toHaveProperty('key');
expect(fn).toThrow();
expect(str).toMatch(/pattern/);
Mocha with Chai:
const chai = require('chai');
const expect = chai.expect;
describe('Array', () => {
describe('#indexOf()', () => {
it('should return -1 when value is not present', () => {
expect([1, 2, 3].indexOf(4)).to.equal(-1);
});
});
});
// Async with done callback
it('should complete async operation', (done) => {
setTimeout(() => {
expect(true).to.be.true;
done();
}, 100);
});
// Async with promises
it('should resolve promise', () => {
return fetchData().then(data => {
expect(data).to.exist;
});
});
const request = require('supertest');
const app = require('./app');
describe('GET /api/users', () => {
it('should return all users', async () => {
const res = await request(app)
.get('/api/users')
.expect('Content-Type', /json/)
.expect(200);
expect(res.body).toBeInstanceOf(Array);
expect(res.body.length).toBeGreaterThan(0);
});
});
describe('POST /api/users', () => {
it('should create a new user', async () => {
const newUser = {
name: 'John Doe',
email: 'john@example.com'
};
const res = await request(app)
.post('/api/users')
.send(newUser)
.expect(201);
expect(res.body).toHaveProperty('id');
expect(res.body.name).toBe(newUser.name);
});
it('should return 400 for invalid data', async () => {
const res = await request(app)
.post('/api/users')
.send({ name: 'John' }) // Missing email
.expect(400);
expect(res.body).toHaveProperty('error');
});
});
// Jest mocking
jest.mock('./database');
const db = require('./database');
test('fetches user from database', async () => {
db.findUser.mockResolvedValue({ id: 1, name: 'John' });
const user = await getUser(1);
expect(db.findUser).toHaveBeenCalledWith(1);
expect(user.name).toBe('John');
});
// Sinon for stubbing
const sinon = require('sinon');
it('should call external API', async () => {
const stub = sinon.stub(axios, 'get').resolves({ data: { result: 'success' } });
const result = await fetchExternalData();
expect(stub.calledOnce).to.be.true;
expect(result.result).to.equal('success');
stub.restore();
});
// Spy on function calls
const spy = sinon.spy(console, 'log');
myFunction();
expect(spy.calledWith('Expected message')).to.be.true;
spy.restore();
# Jest coverage
npm test -- --coverage
# Istanbul (nyc) with Mocha
npm install --save-dev nyc
npx nyc mocha
// package.json
{
"scripts": {
"test": "jest",
"test:coverage": "jest --coverage",
"test:watch": "jest --watch"
},
"jest": {
"collectCoverageFrom": [
"src/**/*.js",
"!src/index.js"
],
"coverageThreshold": {
"global": {
"branches": 80,
"functions": 80,
"lines": 80,
"statements": 80
}
}
}
}
// .env file
PORT=3000
DB_HOST=localhost
DB_USER=admin
DB_PASS=secret123
JWT_SECRET=your-secret-key
NODE_ENV=development
// Loading with dotenv
require('dotenv').config();
const port = process.env.PORT || 3000;
const dbHost = process.env.DB_HOST;
// Validation
const requiredEnvVars = ['DB_HOST', 'DB_USER', 'JWT_SECRET'];
requiredEnvVars.forEach(varName => {
if (!process.env[varName]) {
throw new Error(`Missing required environment variable: ${varName}`);
}
});
// Different environments
// .env.development
// .env.production
// .env.test
require('dotenv').config({
path: `.env.${process.env.NODE_ENV}`
});
# Install PM2
npm install -g pm2
# Start application
pm2 start app.js
# Start with name
pm2 start app.js --name "my-app"
# Start with cluster mode
pm2 start app.js -i max
# List processes
pm2 list
# Monitor
pm2 monit
# Logs
pm2 logs
pm2 logs my-app
# Restart/Stop/Delete
pm2 restart my-app
pm2 stop my-app
pm2 delete my-app
# Save process list
pm2 save
# Startup script (auto-restart on reboot)
pm2 startup
# Update PM2
pm2 update
ecosystem.config.js:
module.exports = {
apps: [{
name: 'my-app',
script: './app.js',
instances: 'max',
exec_mode: 'cluster',
env: {
NODE_ENV: 'development'
},
env_production: {
NODE_ENV: 'production'
},
error_file: './logs/err.log',
out_file: './logs/out.log',
log_date_format: 'YYYY-MM-DD HH:mm:ss Z',
max_memory_restart: '1G',
watch: false,
ignore_watch: ['node_modules', 'logs']
}]
};
// pm2 start ecosystem.config.js --env production
Dockerfile:
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY . .
EXPOSE 3000
USER node
CMD ["node", "app.js"]
docker-compose.yml:
version: '3.8'
services:
app:
build: .
ports:
- "3000:3000"
environment:
- NODE_ENV=production
- DB_HOST=db
depends_on:
- db
volumes:
- ./logs:/app/logs
restart: unless-stopped
db:
image: postgres:14
environment:
- POSTGRES_DB=mydb
- POSTGRES_USER=admin
- POSTGRES_PASSWORD=secret
volumes:
- postgres-data:/var/lib/postgresql/data
restart: unless-stopped
volumes:
postgres-data:
Commands:
# Build image
docker build -t my-app .
# Run container
docker run -p 3000:3000 my-app
# Docker Compose
docker-compose up -d
docker-compose down
docker-compose logs -f
GitHub Actions (.github/workflows/node.yml):
name: Node.js CI
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
node-version: [16.x, 18.x, 20.x]
steps:
- uses: actions/checkout@v3
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v3
with:
node-version: ${{ matrix.node-version }}
cache: 'npm'
- run: npm ci
- run: npm run build --if-present
- run: npm test
- run: npm run lint
- name: Upload coverage
uses: codecov/codecov-action@v3
with:
file: ./coverage/lcov.info
Winston Logger:
const winston = require('winston');
const logger = winston.createLogger({
level: 'info',
format: winston.format.combine(
winston.format.timestamp(),
winston.format.errors({ stack: true }),
winston.format.json()
),
defaultMeta: { service: 'my-app' },
transports: [
new winston.transports.File({ filename: 'error.log', level: 'error' }),
new winston.transports.File({ filename: 'combined.log' })
]
});
if (process.env.NODE_ENV !== 'production') {
logger.add(new winston.transports.Console({
format: winston.format.simple()
}));
}
// Usage
logger.info('Server started', { port: 3000 });
logger.error('Database connection failed', { error: err.message });
logger.warn('High memory usage', { usage: memoryUsage });
Morgan for HTTP logging:
const morgan = require('morgan');
// Development
app.use(morgan('dev'));
// Production (combined format)
app.use(morgan('combined'));
// Custom format
morgan.token('user', (req) => req.user?.id || 'anonymous');
app.use(morgan(':method :url :status :user'));
// Log to file
const fs = require('fs');
const path = require('path');
const accessLogStream = fs.createWriteStream(
path.join(__dirname, 'access.log'),
{ flags: 'a' }
);
app.use(morgan('combined', { stream: accessLogStream }));
Application Monitoring:
// Health check endpoint
app.get('/health', (req, res) => {
res.json({
status: 'ok',
uptime: process.uptime(),
timestamp: Date.now()
});
});
// Readiness check
app.get('/ready', async (req, res) => {
try {
await db.ping(); // Check database connection
res.json({ status: 'ready' });
} catch (error) {
res.status(503).json({ status: 'not ready', error: error.message });
}
});
Stream Types:
- Readable Streams:
const { Readable } = require('stream');
class MyReadable extends Readable {
constructor(options) {
super(options);
this.index = 0;
}
_read(size) {
if (this.index < 100) {
this.push(`data-${this.index++}\n`);
} else {
this.push(null); // Signal end
}
}
}
const readable = new MyReadable();
readable.on('data', (chunk) => {
console.log('Received:', chunk.toString());
});
- Writable Streams:
const { Writable } = require('stream');
class MyWritable extends Writable {
_write(chunk, encoding, callback) {
console.log('Writing:', chunk.toString());
callback();
}
}
const writable = new MyWritable();
writable.write('Hello ');
writable.write('World');
writable.end();

- Transform Streams:
const { Transform } = require('stream');
class UpperCaseTransform extends Transform {
_transform(chunk, encoding, callback) {
this.push(chunk.toString().toUpperCase());
callback();
}
}
const transform = new UpperCaseTransform();
process.stdin.pipe(transform).pipe(process.stdout);

Backpressure Handling:
const readable = fs.createReadStream('large-file.txt');
const writable = fs.createWriteStream('output.txt');
readable.on('data', (chunk) => {
const canContinue = writable.write(chunk);
if (!canContinue) {
readable.pause();
}
});
writable.on('drain', () => {
readable.resume();
});

const { exec, execFile, spawn, fork } = require('child_process');
// exec - buffered output
exec('ls -la', (error, stdout, stderr) => {
if (error) {
console.error('Error:', error);
return;
}
console.log('Output:', stdout);
});
// execFile - similar to exec but safer
execFile('node', ['--version'], (error, stdout, stderr) => {
console.log('Node version:', stdout);
});
// spawn - streaming output
const ls = spawn('ls', ['-la']);
ls.stdout.on('data', (data) => {
console.log(`stdout: ${data}`);
});
ls.stderr.on('data', (data) => {
console.error(`stderr: ${data}`);
});
ls.on('close', (code) => {
console.log(`Process exited with code ${code}`);
});
// fork - specialized for Node.js processes
const child = fork('./worker.js');
child.on('message', (message) => {
console.log('Received from child:', message);
});
child.send({ task: 'compute', data: [1, 2, 3] });
// worker.js
process.on('message', (message) => {
const result = message.data.reduce((a, b) => a + b, 0);
process.send({ result });
});

const WebSocket = require('ws');
// Server
const wss = new WebSocket.Server({ port: 8080 });
wss.on('connection', (ws) => {
console.log('Client connected');
ws.on('message', (message) => {
console.log('Received:', message.toString()); // ws delivers Buffers by default
// Broadcast to all clients
wss.clients.forEach((client) => {
if (client.readyState === WebSocket.OPEN) {
client.send(message);
}
});
});
ws.on('close', () => {
console.log('Client disconnected');
});
ws.send('Welcome to the WebSocket server!');
});
// Client
const ws = new WebSocket('ws://localhost:8080');
ws.on('open', () => {
console.log('Connected to server');
ws.send('Hello server!');
});
ws.on('message', (data) => {
console.log('Received:', data.toString());
});
ws.on('close', () => {
console.log('Disconnected from server');
});
// With Express
const express = require('express');
const http = require('http');
const app = express();
const server = http.createServer(app);
const wss = new WebSocket.Server({ server });
server.listen(3000);

const { ApolloServer, gql } = require('apollo-server-express');
// Type definitions
const typeDefs = gql`
type User {
id: ID!
name: String!
email: String!
posts: [Post!]!
}
type Post {
id: ID!
title: String!
content: String!
author: User!
}
type Query {
users: [User!]!
user(id: ID!): User
posts: [Post!]!
}
type Mutation {
createUser(name: String!, email: String!): User!
createPost(title: String!, content: String!, authorId: ID!): Post!
}
`;
// Resolvers
const resolvers = {
Query: {
users: () => User.find(),
user: (_, { id }) => User.findById(id),
posts: () => Post.find()
},
Mutation: {
createUser: (_, { name, email }) => {
return User.create({ name, email });
},
createPost: (_, { title, content, authorId }) => {
return Post.create({ title, content, author: authorId });
}
},
User: {
posts: (user) => Post.find({ author: user.id })
},
Post: {
author: (post) => User.findById(post.author)
}
};
// Server setup
const server = new ApolloServer({ typeDefs, resolvers });
await server.start();
server.applyMiddleware({ app });
// Query example
// query {
// user(id: "123") {
// name
// email
// posts {
// title
// }
// }
// }

API Gateway Pattern:
// gateway.js
const express = require('express');
const axios = require('axios');
const app = express();
// Route to user service
app.use('/users', async (req, res) => {
try {
const response = await axios({
method: req.method,
url: `http://user-service:3001${req.path}`,
data: req.body
});
res.json(response.data);
} catch (error) {
res.status(error.response?.status || 500).json({ error: error.message });
}
});
// Route to order service
app.use('/orders', async (req, res) => {
try {
const response = await axios({
method: req.method,
url: `http://order-service:3002${req.path}`,
data: req.body
});
res.json(response.data);
} catch (error) {
res.status(error.response?.status || 500).json({ error: error.message });
}
});
app.listen(3000);

Message Queue (RabbitMQ):
const amqp = require('amqplib');
// Publisher
async function publishMessage(queue, message) {
const connection = await amqp.connect('amqp://localhost');
const channel = await connection.createChannel();
await channel.assertQueue(queue, { durable: true });
channel.sendToQueue(queue, Buffer.from(JSON.stringify(message)), {
persistent: true
});
await channel.close();
await connection.close();
}
// Consumer
async function consumeMessages(queue) {
const connection = await amqp.connect('amqp://localhost');
const channel = await connection.createChannel();
await channel.assertQueue(queue, { durable: true });
channel.prefetch(1);
channel.consume(queue, async (msg) => {
const message = JSON.parse(msg.content.toString());
console.log('Received:', message);
// Process message
await processMessage(message);
channel.ack(msg);
});
}
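One gap in the consumer above: if processMessage throws, the message is never acked and sits unacknowledged forever. A small wrapper (names illustrative) acks on success and nacks on failure:

```javascript
// Wraps a handler so failed messages are rejected instead of left hanging
function makeConsumer(channel, handler) {
  return async (msg) => {
    if (msg === null) return; // consumer was cancelled by the broker
    try {
      await handler(JSON.parse(msg.content.toString()));
      channel.ack(msg);
    } catch (err) {
      console.error('Processing failed:', err.message);
      // requeue=false: routes to a dead-letter exchange if one is configured
      channel.nack(msg, false, false);
    }
  };
}

// Usage: channel.consume(queue, makeConsumer(channel, processMessage));
```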
// Usage
await publishMessage('tasks', { task: 'send-email', to: 'user@example.com' });
await consumeMessages('tasks');

Service Discovery (Consul):
const Consul = require('consul');
const consul = new Consul();
// Register service
await consul.agent.service.register({
name: 'user-service',
address: 'localhost',
port: 3001,
check: {
http: 'http://localhost:3001/health',
interval: '10s'
}
});
// Discover service
const services = await consul.health.service('user-service');
const serviceUrl = `http://${services[0].Service.Address}:${services[0].Service.Port}`;

- [ ] Explain Node.js architecture and V8 engine
- [ ] Describe the Event Loop and its phases
- [ ] Understand blocking vs non-blocking I/O
- [ ] Know when Node.js is appropriate vs inappropriate
- [ ] Explain single-threaded model and its implications
- [ ] Differentiate CommonJS and ES Modules
- [ ] Know how require() and import work internally
- [ ] Understand module caching
- [ ] Explain package.json structure
- [ ] Know semantic versioning (^, ~, *)
- [ ] Master callbacks, promises, and async/await
- [ ] Handle errors in async code properly
- [ ] Understand promise chaining and Promise.all/race
- [ ] Know how to avoid callback hell
- [ ] Explain microtasks vs macrotasks
- [ ] Use fs module for file operations
- [ ] Create HTTP servers with http/https
- [ ] Work with paths correctly (cross-platform)
- [ ] Understand EventEmitter pattern
- [ ] Know when to use streams vs buffers
- [ ] Build RESTful APIs with proper routing
- [ ] Implement middleware (built-in, custom, third-party)
- [ ] Handle errors properly in middleware chain
- [ ] Understand request/response lifecycle
- [ ] Organize routes and controllers effectively
- [ ] Connect to databases (MongoDB/PostgreSQL)
- [ ] Use ORMs/ODMs (Mongoose/Sequelize)
- [ ] Implement CRUD operations
- [ ] Understand connection pooling
- [ ] Handle transactions when needed
- [ ] Implement authentication (JWT, sessions)
- [ ] Set up authorization and RBAC
- [ ] Prevent common vulnerabilities (XSS, CSRF, SQL Injection)
- [ ] Use HTTPS and CORS properly
- [ ] Validate and sanitize inputs
- [ ] Implement rate limiting
- [ ] Use clustering for multi-core utilization
- [ ] Implement caching strategies (Redis, in-memory)
- [ ] Optimize database queries
- [ ] Handle memory leaks
- [ ] Profile and monitor application performance
- [ ] Write unit tests (Jest/Mocha)
- [ ] Create integration tests
- [ ] Mock dependencies appropriately
- [ ] Achieve good test coverage
- [ ] Use TDD when appropriate
- [ ] Manage environment variables
- [ ] Use process managers (PM2)
- [ ] Containerize with Docker
- [ ] Set up CI/CD pipelines
- [ ] Implement logging and monitoring
- [ ] Understand streams in depth
- [ ] Use child processes