Prepared for: SDE Interview Preparation
Last Updated: February 2026
Focus Areas: JavaScript Internals, React, Next.js, Node.js, Express, MongoDB, MERN Architecture
== vs ===, Promises, async/await, this, Destructuring/Spread/Rest — these are the JavaScript questions that are heavily tested in MERN interviews. Know these cold.
Answer:
JavaScript is single-threaded — it can only execute one operation at a time. The event loop is the mechanism that allows it to handle async operations (like HTTP requests, timers, file reads) without blocking.
Components involved:
- Call stack — where synchronous code executes, one frame at a time
- Web APIs / Node APIs — timers, HTTP requests, and file I/O run here, off the main thread
- Macrotask (callback) queue — completed setTimeout/setInterval and I/O callbacks wait here
- Microtask queue — Promise callbacks (.then, .catch, .finally, async/await continuations) and queueMicrotask go here
- Event loop — moves queued callbacks onto the call stack whenever it is empty

Execution Priority Order:
Call Stack → Microtask Queue → (Render if browser) → Macrotask Queue → repeat
Microtasks (higher priority): Promise.then/catch/finally, queueMicrotask, MutationObserver
Macrotasks (lower priority): setTimeout, setInterval, setImmediate (Node), I/O callbacks, MessageChannel
Example:
console.log('1'); // sync → call stack
setTimeout(() => console.log('2'), 0); // macrotask queue
Promise.resolve().then(() => console.log('3')); // microtask queue
console.log('4'); // sync → call stack
// Output: 1, 4, 3, 2
// Why? Sync runs first (1, 4), then microtasks (3), then macrotasks (2)
Another Example (common interview trick):
console.log('start');
setTimeout(() => console.log('timeout'), 0);
Promise.resolve()
.then(() => {
console.log('promise 1');
return Promise.resolve();
})
.then(() => console.log('promise 2'));
console.log('end');
// Output: start, end, promise 1, promise 2, timeout
Answer:
A closure is a function that remembers and has access to variables from its outer (enclosing) scope even after the outer function has finished executing.
How it works: When a function is created, it captures a reference to its lexical environment (the scope it was defined in), not a copy of it.
function makeCounter() {
let count = 0; // this variable is "closed over"
return function() {
count++;
return count;
};
}
const counter = makeCounter();
console.log(counter()); // 1
console.log(counter()); // 2
console.log(counter()); // 3
// makeCounter() has returned, but count still lives in the closure
Practical use cases:
- Data privacy / encapsulation (private variables, as in makeCounter above)
- Function factories and partial application
- Memoization caches
- Event handlers and callbacks that need to remember state
Classic interview trap — closures in loops:
// Wrong way (before ES6):
for (var i = 0; i < 3; i++) {
setTimeout(() => console.log(i), 1000);
}
// Output: 3, 3, 3 (because var is function-scoped, all callbacks share the same i)
// Fix 1: Use let (block-scoped)
for (let i = 0; i < 3; i++) {
setTimeout(() => console.log(i), 1000);
}
// Output: 0, 1, 2
// Fix 2: Use an IIFE to create a new scope
for (var i = 0; i < 3; i++) {
(function(j) {
setTimeout(() => console.log(j), 1000);
})(i);
}
// Output: 0, 1, 2
| Feature | var | let | const |
|---|---|---|---|
| Scope | Function-scoped | Block-scoped | Block-scoped |
| Hoisting | Hoisted, initialized as undefined | Hoisted, NOT initialized (TDZ) | Hoisted, NOT initialized (TDZ) |
| Re-declaration | Allowed | Not allowed in same scope | Not allowed in same scope |
| Re-assignment | Allowed | Allowed | Not allowed |
| Global object property | Yes (browser) | No | No |
Temporal Dead Zone (TDZ):
console.log(x); // undefined (var is hoisted and initialized)
var x = 5;
console.log(y); // ReferenceError: Cannot access 'y' before initialization
let y = 5;
const with objects:
const obj = { a: 1 };
obj.a = 2; // ✅ Allowed — you can mutate properties
obj = { a: 2 }; // ❌ TypeError — you cannot reassign the binding
Answer:
Hoisting is JavaScript's default behavior of moving declarations to the top of their scope during the compilation phase (before execution).
Rules:
- var declarations are hoisted and initialized to undefined
- let/const are hoisted but NOT initialized (TDZ — accessing them before declaration throws ReferenceError)
- Function declarations are fully hoisted (callable before their definition)
- Function expressions and arrow functions follow the rules of the variable they are assigned to (var or let/const)
// Function declaration — fully hoisted
greet(); // "Hello" — works!
function greet() { console.log("Hello"); }
// Function expression — NOT fully hoisted
sayHi(); // TypeError: sayHi is not a function
var sayHi = function() { console.log("Hi"); };
// var sayHi is hoisted as undefined, then called → TypeError
// Arrow function — same as function expression
arrowFn(); // TypeError or ReferenceError depending on const/var
const arrowFn = () => console.log("Arrow");
Answer:
In JavaScript, every object has a hidden [[Prototype]] property that points to another object (its prototype). When accessing a property, JS first checks the object itself, then traverses up the prototype chain until it finds it or reaches null.
const animal = {
eat() { return `${this.name} is eating`; }
};
const dog = Object.create(animal);
dog.name = 'Rex';
dog.bark = function() { return `${this.name} barks`; };
dog.eat(); // "Rex is eating" — found on prototype (animal)
dog.bark(); // "Rex barks" — found on dog itself
With Classes (ES6 — syntactic sugar over prototype chain):
class Animal {
constructor(name) { this.name = name; }
eat() { return `${this.name} is eating`; }
}
class Dog extends Animal {
bark() { return `${this.name} barks`; }
}
const d = new Dog('Rex');
d.eat(); // inherited from Animal.prototype
d.bark(); // from Dog.prototype
Prototype Chain:
d → Dog.prototype → Animal.prototype → Object.prototype → null
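The chain above can be verified at runtime with Object.getPrototypeOf, re-using the Dog/Animal classes from the example:

```javascript
class Animal {
  constructor(name) { this.name = name; }
  eat() { return `${this.name} is eating`; }
}
class Dog extends Animal {
  bark() { return `${this.name} barks`; }
}
const d = new Dog('Rex');

// Walk the chain explicitly, link by link
const p1 = Object.getPrototypeOf(d);   // Dog.prototype
const p2 = Object.getPrototypeOf(p1);  // Animal.prototype
const p3 = Object.getPrototypeOf(p2);  // Object.prototype
const p4 = Object.getPrototypeOf(p3);  // null — end of the chain
```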
Answer:
== performs type coercion before comparison.
=== performs strict comparison — no coercion, types must match.
0 == false // true (false coerces to 0)
0 === false // false (different types)
'' == false // true (both coerce to 0)
null == undefined // true (special case)
null === undefined // false
NaN == NaN // false (NaN is never equal to itself)
NaN === NaN // false
Always prefer === in practice.
Answer:
A Promise is an object representing the eventual completion or failure of an asynchronous operation. It has three states:
- pending — initial state, neither fulfilled nor rejected
- fulfilled — the operation completed successfully
- rejected — the operation failed
const promise = new Promise((resolve, reject) => {
const success = true;
if (success) resolve('Data fetched');
else reject(new Error('Fetch failed'));
});
promise
.then(data => console.log(data)) // "Data fetched"
.catch(err => console.error(err))
.finally(() => console.log('Done')); // always runs
Promise chaining:
fetch('/api/user')
.then(res => res.json())
.then(user => fetch(`/api/posts/${user.id}`))
.then(res => res.json())
.then(posts => console.log(posts))
.catch(err => console.error(err));
Promise combinators:
| Method | Behavior |
|---|---|
| Promise.all(arr) | Waits for ALL to resolve. Rejects if ANY rejects |
| Promise.allSettled(arr) | Waits for ALL to settle (resolve or reject). Never short-circuits |
| Promise.race(arr) | Resolves/rejects with the FIRST settled promise |
| Promise.any(arr) | Resolves with FIRST fulfilled. Rejects only if ALL reject |
// all — parallel execution
const [users, posts] = await Promise.all([
fetch('/api/users').then(r => r.json()),
fetch('/api/posts').then(r => r.json())
]);
// allSettled — when you want all results regardless of failure
const results = await Promise.allSettled([p1, p2, p3]);
results.forEach(r => {
if (r.status === 'fulfilled') console.log(r.value);
else console.log(r.reason);
});
Answer:
async/await is syntactic sugar built on top of Promises and Generators. It makes async code look synchronous. An async function always returns a Promise.
async function fetchUser(id) {
try {
const res = await fetch(`/api/users/${id}`); // pauses here, returns control
const data = await res.json();
return data;
} catch (err) {
console.error('Error:', err);
throw err; // re-throw for callers
}
}
Parallel vs Sequential async:
// Sequential — slow! (waits for each before starting next)
const user = await fetchUser(1);
const posts = await fetchPosts(1);
// Parallel — fast! (starts both simultaneously)
const [user, posts] = await Promise.all([fetchUser(1), fetchPosts(1)]);
// Another parallel pattern
const userPromise = fetchUser(1);
const postsPromise = fetchPosts(1);
const user = await userPromise;
const posts = await postsPromise;
Common mistake — async in forEach:
// WRONG — forEach does not await async callbacks
const ids = [1, 2, 3];
ids.forEach(async (id) => {
const user = await fetchUser(id); // runs concurrently, not awaited
console.log(user);
});
// CORRECT — use for...of for sequential
for (const id of ids) {
const user = await fetchUser(id);
console.log(user);
}
// Or for parallel:
await Promise.all(ids.map(id => fetchUser(id)));
this keyword — how it works
Answer:
The value of this depends on how a function is called, not where it is defined (except for arrow functions).
| Call Pattern | Value of this |
|---|---|
| Global scope (non-strict) | window / global |
| Global scope (strict mode) | undefined |
| Object method | The object before the dot |
| new constructor | The newly created object |
| .call(ctx) / .apply(ctx) | Explicitly set to ctx |
| .bind(ctx) | Permanently bound to ctx |
| Arrow function | Lexically inherited from enclosing scope |
const obj = {
name: 'Test',
regular: function() { console.log(this.name); },
arrow: () => { console.log(this); } // inherits outer this (global/undefined in strict)
};
obj.regular(); // "Test"
obj.arrow(); // undefined (or window in browser non-strict)
Arrow functions and this in React — why it matters:
class Counter extends React.Component {
constructor(props) {
super(props);
this.state = { count: 0 };
}
// Regular method — 'this' is undefined when used as callback
increment() {
this.setState({ count: this.state.count + 1 });
}
// Fix 1: Arrow function as class field
increment = () => {
this.setState({ count: this.state.count + 1 });
}
render() {
return <button onClick={this.increment}>+</button>;
}
}
// Array destructuring
const [a, b, ...rest] = [1, 2, 3, 4, 5];
// a=1, b=2, rest=[3,4,5]
// Object destructuring with rename and default
const { name: fullName = 'Anonymous', age } = { name: 'Jash', age: 21 };
// fullName='Jash', age=21
// Nested destructuring
const { address: { city, pin } } = { address: { city: 'Bangalore', pin: '560001' } };
// Spread — shallow copy
const arr1 = [1, 2, 3];
const arr2 = [...arr1, 4, 5]; // [1,2,3,4,5]
const obj1 = { a: 1 };
const obj2 = { ...obj1, b: 2 }; // { a:1, b:2 }
// Shallow vs Deep copy
const shallow = { ...obj1 }; // nested objects still share reference
const deep = JSON.parse(JSON.stringify(obj1)); // true deep copy (loses functions/dates)
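The JSON round-trip caveat can be avoided with structuredClone (available in Node 17+ and modern browsers — an environment assumption). A sketch of the difference:

```javascript
const original = { user: { name: 'Jash' }, created: new Date(0) };

// JSON round-trip: Date degrades to an ISO string
const viaJson = JSON.parse(JSON.stringify(original));

// structuredClone: Date survives as a real Date object
const viaClone = structuredClone(original);

// Both are deep copies — nested objects are NOT shared with the original
```

Note that structuredClone handles Dates, Maps, Sets, and cyclic references, but it still throws on functions.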
function greet(greeting, punctuation) {
return `${greeting}, ${this.name}${punctuation}`;
}
const user = { name: 'Jash' };
greet.call(user, 'Hello', '!'); // "Hello, Jash!" — args individually
greet.apply(user, ['Hello', '!']); // "Hello, Jash!" — args as array
const boundGreet = greet.bind(user);
boundGreet('Hi', '.'); // "Hi, Jash." — creates new permanent function
// Currying — converting f(a,b,c) into f(a)(b)(c)
const add = a => b => c => a + b + c;
add(1)(2)(3); // 6
// Practical: reusable configurator
const multiply = x => y => x * y;
const double = multiply(2);
const triple = multiply(3);
double(5); // 10
triple(5); // 15
function memoize(fn) {
const cache = new Map();
return function(...args) {
const key = JSON.stringify(args);
if (cache.has(key)) return cache.get(key);
const result = fn.apply(this, args);
cache.set(key, result);
return result;
};
}
const expensiveCalc = memoize((n) => {
// simulate heavy computation
return n * n;
});
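To see the cache doing its job, a self-contained check (re-stating the memoize helper above) that counts how often the underlying function actually runs:

```javascript
function memoize(fn) {
  const cache = new Map();
  return function (...args) {
    const key = JSON.stringify(args);
    if (cache.has(key)) return cache.get(key);
    const result = fn.apply(this, args);
    cache.set(key, result);
    return result;
  };
}

let calls = 0;
const square = memoize((n) => { calls++; return n * n; });

square(4); // computed — underlying function runs
square(4); // cache hit — underlying function skipped
square(4); // cache hit
// 'calls' is 1: the wrapped function ran exactly once for the repeated argument
```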
function* idGenerator() {
let id = 1;
while (true) {
yield id++;
}
}
const gen = idGenerator();
gen.next(); // { value: 1, done: false }
gen.next(); // { value: 2, done: false }
// async generators (useful in Node.js streaming)
// ('chunks' and 'processChunk' are assumed to be defined elsewhere)
async function* streamData() {
for (const chunk of chunks) {
yield await processChunk(chunk);
}
}
| | Map | WeakMap | Set | WeakSet |
|---|---|---|---|---|
| Keys/Values | Any type | Objects only as keys | Unique values | Objects only |
| Iterable | Yes | No | Yes | No |
| GC eligible | No | Yes (keys can be GC'd) | No | Yes |
| Size property | Yes | No | Yes | No |
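The first rows of the table can be seen directly — a Map keeps object keys and distinct key types, where a plain object would coerce everything to strings:

```javascript
const m = new Map();
const keyObj = { id: 1 };

m.set(keyObj, 'metadata'); // object as key — a plain object would coerce this to '[object Object]'
m.set(1, 'number key');    // 1 and '1' stay distinct keys
m.set('1', 'string key');

// Maps iterate in insertion order and expose .size directly
const keys = [...m.keys()];
```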
Use cases:
- Map over plain objects when keys aren't strings, or insertion order matters
- WeakMap for storing private data associated with objects (no memory leaks)
- Set for unique collections
- WeakSet to track if an object exists without preventing garbage collection

Optional Chaining ?., Nullish Coalescing ??, Logical Assignment
Optional chaining — safely access deeply nested properties without throwing if an intermediate value is null or undefined:
const user = null;
user.profile.name; // ❌ TypeError: Cannot read properties of null
user?.profile?.name; // ✅ undefined (short-circuits)
// Also works for method calls and bracket notation
user?.greet?.(); // undefined (doesn't call, doesn't throw)
data?.items?.[0]; // undefined
// Combine with nullish coalescing for defaults
const name = user?.profile?.name ?? 'Anonymous';
Nullish coalescing ?? vs logical OR ||:
// || uses falsy check (0, '', false, null, undefined all trigger fallback)
const port1 = 0 || 3000; // 3000 — BUG! 0 is a valid port
// ?? uses nullish check (ONLY null or undefined trigger fallback)
const port2 = 0 ?? 3000; // 0 — correct!
const name = '' ?? 'default'; // '' — correct (empty string is intentional)
Logical assignment operators (ES2021):
// ??= assigns only if current value is null/undefined
user.name ??= 'Anonymous'; // = user.name ?? (user.name = 'Anonymous')
// ||= assigns only if current value is falsy
user.role ||= 'guest';
// &&= assigns only if current value is truthy
user.profile &&= formatProfile(user.profile);
A Symbol is a guaranteed unique, immutable primitive value. Useful as unique object keys.
const id1 = Symbol('id');
const id2 = Symbol('id');
id1 === id2; // false — always unique
// Use as object keys (won't clash with any string key)
const ID = Symbol('id');
const user = { [ID]: 123, name: 'Jash' };
user[ID]; // 123
// Symbols are NOT enumerable:
Object.keys(user); // ['name'] — ID not included
JSON.stringify(user); // '{"name":"Jash"}' — ID stripped
Object.getOwnPropertySymbols(user); // [Symbol(id)]
// Well-known Symbols — customize built-in behaviour
class MyArray {
[Symbol.iterator]() {
let i = 0;
const data = [10, 20, 30];
return {
next: () => i < data.length
? { value: data[i++], done: false }
: { value: undefined, done: true }
};
}
}
for (const val of new MyArray()) console.log(val); // 10, 20, 30
// Symbol.toPrimitive, Symbol.hasInstance, Symbol.toStringTag
class MyClass {
get [Symbol.toStringTag]() { return 'MyClass'; }
}
Object.prototype.toString.call(new MyClass()); // '[object MyClass]'
Proxy wraps an object and intercepts operations (get, set, delete, etc.) via traps.
const handler = {
get(target, key, receiver) {
console.log(`Getting ${key}`);
return Reflect.get(target, key, receiver); // use Reflect for default behaviour
},
set(target, key, value, receiver) {
if (typeof value !== 'number') throw new TypeError('Numbers only');
return Reflect.set(target, key, value, receiver);
},
deleteProperty(target, key) {
console.log(`Deleting ${key}`);
return Reflect.deleteProperty(target, key);
}
};
const obj = new Proxy({}, handler);
obj.x = 5; // OK
obj.x = 'hi'; // TypeError
console.log(obj.x); // "Getting x" → 5
// Practical: validation proxy
function createValidator(target, schema) {
return new Proxy(target, {
set(obj, key, value) {
if (schema[key] && !schema[key](value)) {
throw new Error(`Invalid value for ${key}: ${value}`);
}
obj[key] = value;
return true;
}
});
}
const user = createValidator({}, {
age: v => typeof v === 'number' && v >= 0 && v <= 150
});
user.age = 25; // OK
user.age = -1; // Error!
// Practical: reactive data (how Vue 3 Reactivity works)
function reactive(obj) {
return new Proxy(obj, {
set(target, key, value) {
const result = Reflect.set(target, key, value);
triggerUpdate(); // notify subscribers
return result;
}
});
}
Reflect mirrors all Proxy traps as static methods — the "default" implementation for each trap:
Reflect.get(obj, key) // same as obj[key]
Reflect.set(obj, key, value) // same as obj[key] = value
Reflect.has(obj, key) // same as key in obj
Reflect.deleteProperty(obj, key)// same as delete obj[key]
Reflect.ownKeys(obj) // own property keys including Symbols
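A quick runnable check of those equivalences — note that Reflect.ownKeys returns symbol keys too, unlike Object.keys:

```javascript
const obj = { a: 1, [Symbol.for('tag')]: 'hidden' };

Reflect.set(obj, 'b', 2);          // same as obj.b = 2
Reflect.deleteProperty(obj, 'a');  // same as delete obj.a

const stringKeys = Object.keys(obj);  // ['b'] — symbols excluded
const allKeys = Reflect.ownKeys(obj); // ['b', Symbol(tag)] — symbols included, strings first
```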
typeof / instanceof / Object.is
// typeof quirks
typeof undefined // 'undefined'
typeof null // 'object' ← historical bug
typeof [] // 'object'
typeof {} // 'object'
typeof function(){} // 'function'
typeof class{} // 'function'
typeof NaN // 'number'
// Better type checking
Array.isArray([]) // true
obj instanceof MyClass // checks prototype chain
Object.prototype.toString.call([]) // '[object Array]'
Object.prototype.toString.call(null) // '[object Null]'
// instanceof pitfall: fails across different realms (iframes, vm modules)
const vm = require('vm'); // Node's built-in vm module
const arr = vm.runInNewContext('[]'); // array created in another realm
arr instanceof Array; // false! (different Array constructor)
Array.isArray(arr); // true ✓
// Object.is vs ===
Object.is(NaN, NaN); // true (=== returns false)
Object.is(0, -0); // false (=== returns true)
// More precise for edge cases, same as === otherwise
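The Object.prototype.toString trick above can be folded into a tiny helper (the name getType is illustrative, not a standard API):

```javascript
function getType(value) {
  // '[object Array]' → 'array', '[object Null]' → 'null', etc.
  return Object.prototype.toString.call(value).slice(8, -1).toLowerCase();
}

// Handles every case where typeof is misleading:
// getType(null) → 'null', getType([]) → 'array', getType(NaN) → 'number'
```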
Always extend Error to create domain-specific errors, enabling instanceof checks and better stack traces.
class AppError extends Error {
constructor(message, statusCode = 500, code = 'INTERNAL_ERROR') {
super(message);
this.name = this.constructor.name; // 'AppError' or subclass name
this.statusCode = statusCode;
this.code = code;
Error.captureStackTrace(this, this.constructor); // clean stack trace
}
}
class NotFoundError extends AppError {
constructor(resource) {
super(`${resource} not found`, 404, 'NOT_FOUND');
}
}
class ValidationError extends AppError {
constructor(field, reason) {
super(`Validation failed on '${field}': ${reason}`, 400, 'VALIDATION_ERROR');
this.field = field;
}
}
class UnauthorizedError extends AppError {
constructor(message = 'Unauthorized') {
super(message, 401, 'UNAUTHORIZED');
}
}
// Usage in Express error middleware
app.use((err, req, res, next) => {
if (err instanceof AppError) {
return res.status(err.statusCode).json({
error: { code: err.code, message: err.message }
});
}
// Unknown error — don't leak internals
console.error(err); // log the real error
res.status(500).json({ error: { code: 'INTERNAL_ERROR', message: 'Something went wrong' } });
});
// Usage in route
throw new NotFoundError('User'); // caught by error middleware → 404
throw new ValidationError('email', 'must be valid email'); // → 400
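Outside Express, the hierarchy can be exercised directly — a minimal re-statement of AppError and NotFoundError showing that instanceof works across the whole chain:

```javascript
class AppError extends Error {
  constructor(message, statusCode = 500, code = 'INTERNAL_ERROR') {
    super(message);
    this.name = this.constructor.name; // subclass name, not 'Error'
    this.statusCode = statusCode;
    this.code = code;
  }
}
class NotFoundError extends AppError {
  constructor(resource) { super(`${resource} not found`, 404, 'NOT_FOUND'); }
}

const err = new NotFoundError('User');
// err is simultaneously a NotFoundError, an AppError, and an Error
```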
Any object with a [Symbol.iterator] method is iterable and works with for...of, spread, destructuring, Array.from.
// Custom iterable: range
const range = {
from: 1,
to: 5,
[Symbol.iterator]() {
let current = this.from;
const last = this.to;
return {
next() {
return current <= last
? { value: current++, done: false }
: { value: undefined, done: true };
}
};
}
};
for (const n of range) console.log(n); // 1, 2, 3, 4, 5
[...range]; // [1, 2, 3, 4, 5]
const [a, b] = range; // a=1, b=2
// Generators auto-implement the iterator protocol
function* range2(from, to) {
for (let i = from; i <= to; i++) yield i;
}
[...range2(1, 5)]; // [1, 2, 3, 4, 5]
// Infinite lazy sequence — only compute on demand
function* fibonacci() {
let [a, b] = [0, 1];
while (true) {
yield a;
[a, b] = [b, a + b];
}
}
const fib = fibonacci();
Array.from({ length: 8 }, () => fib.next().value); // [0,1,1,2,3,5,8,13]
Instead of attaching listeners to every element, attach one listener to the parent and use event.target to determine which child was clicked. React uses this pattern internally.
// Without delegation — bad for dynamic lists, memory-heavy
document.querySelectorAll('.item').forEach(el => {
el.addEventListener('click', handleClick); // N listeners
});
// With delegation — one listener handles all current AND future children
document.getElementById('list').addEventListener('click', (e) => {
const item = e.target.closest('.item');
if (!item) return; // click wasn't on an item
console.log('Clicked:', item.dataset.id);
});
// Works even if items are added after the listener is set up!
// How React uses delegation:
// React attaches ONE event listener per event type to the root (#root)
// React's synthetic event system then dispatches to the correct component
// This is why React's event handlers look different from native DOM events
Answer:
React is a declarative, component-based UI library for building user interfaces. Key ideas:
- Declarative: describe WHAT the UI should look like for a given state; React figures out HOW to update the DOM
- Component-based: UIs are composed from small, reusable components
- Unidirectional data flow: data flows down via props, events flow up via callbacks
How React renders:
1. JSX compiles to React.createElement() calls
2. createElement returns plain element objects (the Virtual DOM)
3. React reconciles the new element tree against the previous one
4. The minimal set of real DOM mutations is committed

Answer:
The Virtual DOM is a plain JavaScript object tree that mirrors the real DOM structure. React uses it as a staging area to batch and minimize real DOM updates.
Reconciliation Algorithm (React Fiber):
- Different element types → tear down and rebuild the subtree
- Same type → keep the DOM node, update only changed props
- Lists are compared by key — key props indicate which elements are stable across renders

Keys — why they matter:
// Without keys — React re-creates the entire list on reorder
{items.map(item => <li>{item.name}</li>)}
// With keys — React can match and reuse DOM nodes
{items.map(item => <li key={item.id}>{item.name}</li>)}
// NEVER use index as key when list can reorder/filter
{items.map((item, i) => <li key={i}>{item.name}</li>)} // BAD
Answer:
JSX (JavaScript XML) is a syntax extension that looks like HTML but is JavaScript. Babel transpiles it to React.createElement() calls.
// JSX
const element = <h1 className="title">Hello, {name}</h1>;
// What Babel compiles it to
const element = React.createElement(
'h1',
{ className: 'title' },
'Hello, ',
name
);
JSX Rules:
- Return a single root element (use a <> fragment if needed)
- className instead of class, htmlFor instead of for
- Self-close void tags (<img />)
- Embed JavaScript expressions with {}
- Comments are written as {/* comment */}

| Feature | Class Component | Functional Component |
|---|---|---|
| State | this.state | useState hook |
| Lifecycle | Lifecycle methods | useEffect hook |
| this binding | Required | Not needed |
| Boilerplate | More | Less |
| Performance | Slightly heavier | Lighter |
| Hooks | Cannot use | Can use |
| Current preference | Legacy | Preferred |
// Class Component
class Counter extends React.Component {
state = { count: 0 };
increment = () => this.setState(s => ({ count: s.count + 1 }));
render() {
return <button onClick={this.increment}>{this.state.count}</button>;
}
}
// Functional Component (equivalent, preferred)
function Counter() {
const [count, setCount] = useState(0);
return <button onClick={() => setCount(c => c + 1)}>{count}</button>;
}
| | Props | State |
|---|---|---|
| Ownership | Passed from parent | Owned by component |
| Mutability | Read-only (immutable by receiver) | Mutable via setState/setter |
| Who controls? | Parent | The component itself |
| Re-render trigger | Parent re-renders | setState / setter call |
| Default | Provided by parent | Initialized in component |
// Props
function Greeting({ name, age = 18 }) { // default prop value
return <p>Hello {name}, you are {age}</p>;
}
// State
function Form() {
const [email, setEmail] = useState('');
return <input value={email} onChange={e => setEmail(e.target.value)} />;
}
Three phases: Mounting → Updating → Unmounting.
| Lifecycle Method | Hook Equivalent |
|---|---|
| constructor | useState initial value |
| componentDidMount | useEffect(() => {}, []) |
| componentDidUpdate | useEffect(() => {}, [dep]) |
| componentWillUnmount | useEffect(() => { return () => cleanup }, []) |
| shouldComponentUpdate | React.memo + useMemo/useCallback |
| getDerivedStateFromProps | useState + useEffect or useMemo |
| getSnapshotBeforeUpdate | No direct equivalent (rare, use ref) |
Controlled: React controls the form element's value via state.
function ControlledInput() {
const [value, setValue] = useState('');
return (
<input
value={value}
onChange={(e) => setValue(e.target.value)}
/>
);
}
Uncontrolled: DOM manages its own state; you access it via a ref.
function UncontrolledInput() {
const inputRef = useRef(null);
const handleSubmit = () => console.log(inputRef.current.value);
return <input ref={inputRef} />;
}
Use controlled components for most cases — they're predictable and testable.
Use uncontrolled for file inputs, or when integrating with non-React code.
When two sibling components need to share state, move the state to their closest common ancestor and pass it down via props.
// Problem: Sibling A and B need to share inputValue
function Parent() {
const [inputValue, setInputValue] = useState('');
return (
<>
<InputComponent value={inputValue} onChange={setInputValue} />
<DisplayComponent value={inputValue} />
</>
);
}
React.StrictMode is a development-only tool that helps surface potential bugs by intentionally running certain lifecycles and render functions twice (in development).
// Wrap your app (or part of it)
root.render(
<React.StrictMode>
<App />
</React.StrictMode>
);
What StrictMode detects:
- Unsafe legacy lifecycles (componentWillMount, componentWillReceiveProps)
- Legacy string ref API usage
- Deprecated findDOMNode usage
- Unexpected side effects (by double-invoking renders and effects)

Why effects run twice in StrictMode (React 18+):
React simulates mounting → unmounting → mounting again to verify your effects are properly cleaned up. If your effect doesn't clean up, you'll see bugs in StrictMode first. The fix is always a proper cleanup function.
// ❌ Bug exposed by StrictMode — click listener added twice!
useEffect(() => {
window.addEventListener('click', handler);
// no cleanup → adds listener twice in StrictMode
}, []);
// ✅ Correct
useEffect(() => {
window.addEventListener('click', handler);
return () => window.removeEventListener('click', handler);
}, []);
Automatic batching (React 18): Before React 18, state updates inside async callbacks (setTimeout, fetch) were NOT batched — each caused a separate render.
// React 17 — 2 renders inside setTimeout
setTimeout(() => {
setCount(c => c + 1); // render 1
setName('Jash'); // render 2
}, 1000);
// React 18 — batched automatically → 1 render
setTimeout(() => {
setCount(c => c + 1); // }
setName('Jash'); // } batched → 1 render
}, 1000);
// To opt OUT of batching (rare):
import { flushSync } from 'react-dom';
flushSync(() => setCount(c => c + 1)); // renders immediately
flushSync(() => setName('Jash')); // renders immediately
startTransition — marking updates as non-urgent:
import { startTransition, useTransition, useDeferredValue } from 'react';
// Without startTransition: typing blocks heavy re-render
function SearchPage() {
const [query, setQuery] = useState('');
const [results, setResults] = useState([]);
const handleChange = (e) => {
setQuery(e.target.value); // urgent — update input
// React prioritizes this over the heavy search
startTransition(() => {
setResults(searchItems(e.target.value)); // non-urgent — can be interrupted
});
};
return <input value={query} onChange={handleChange} />;
}
// useTransition — with pending state
function TabBar() {
const [isPending, startTransition] = useTransition();
const [tab, setTab] = useState('home');
function selectTab(t) {
startTransition(() => setTab(t)); // non-urgent tab switch
}
return isPending ? <Spinner /> : <TabContent tab={tab} />;
}
// useDeferredValue — defer a derived value, keeping UI responsive
function ProductList({ query }) {
const deferredQuery = useDeferredValue(query);
// deferredQuery lags behind query — user sees instant input, list updates when idle
const results = useMemo(() => heavyFilter(deferredQuery), [deferredQuery]);
return <List items={results} />;
}
React uses a synthetic event system that wraps native browser events to normalize behaviour across browsers.
function Button() {
const handleClick = (e) => {
// e is a SyntheticEvent wrapping the native MouseEvent
e.preventDefault(); // works the same as native
e.stopPropagation(); // works the same as native
e.nativeEvent; // access the underlying native event
// React 17+: events are no longer pooled (no need to call e.persist())
setTimeout(() => console.log(e.type), 500); // 'click' — still accessible
};
return <button onClick={handleClick}>Click</button>;
}
// React attaches ONE delegated listener per event type on the root DOM node
// (not on each element — this is the delegation pattern)
// Difference from native addEventListener:
// React: onClick={handler} — synthetic, bubbles through React tree
// Native: el.addEventListener('click', fn) — direct DOM listener
// Both can coexist — native fires first, then React synthetic
onCapture events — React supports capture phase handlers:
<div onClickCapture={handleCapture}> {/* runs in capture phase */}
<button onClick={handleBubble}>click</button>
</div>
// Capture fires first (top-down), then Bubble (bottom-up)
const [state, setState] = useState(initialValue);
Key rules:
- State updates are asynchronous and batched; don't read state immediately after setting it
- Use the functional updater when the next state depends on the previous one
- The setter replaces the state (no automatic merge for objects, unlike class setState)
- Call hooks only at the top level of the component, never in loops or conditions
// Functional updater — safe for stale closures
setCount(prev => prev + 1);
// Object state — must spread, setState does NOT merge automatically in hooks
const [user, setUser] = useState({ name: '', age: 0 });
setUser(prev => ({ ...prev, name: 'Jash' })); // ✅ keeps other fields
// Lazy initialization — runs only once (useful for expensive computations)
const [data, setData] = useState(() => JSON.parse(localStorage.getItem('data')) || []);
useEffect(() => {
// effect
return () => {
// cleanup (optional)
};
}, [dependencies]);
Dependency array behavior:
- [] (empty) → runs only once after mount (like componentDidMount)
- [dep1, dep2] → runs after mount AND whenever dep1 or dep2 changes
- Omitted entirely → runs after every render (like componentDidUpdate)

Cleanup function: Runs before the next effect execution AND on unmount. Used to cancel subscriptions, clear timers, abort fetch calls.
useEffect(() => {
const subscription = api.subscribe(id, handler);
return () => subscription.unsubscribe(); // cleanup
}, [id]);
// Aborting fetch requests
useEffect(() => {
const controller = new AbortController();
fetch('/api/data', { signal: controller.signal })
.then(r => r.json())
.then(setData)
.catch(err => {
if (err.name !== 'AbortError') console.error(err);
});
return () => controller.abort();
}, []);
Common mistake — missing dependencies:
// ❌ Stale closure — count is captured at time of effect creation
useEffect(() => {
const timer = setInterval(() => console.log(count), 1000);
return () => clearInterval(timer);
}, []); // Missing dependency!
// ✅ Correct
useEffect(() => {
const timer = setInterval(() => console.log(count), 1000);
return () => clearInterval(timer);
}, [count]);
// 1. Create context with a default value
const ThemeContext = React.createContext('light');
// 2. Provide value at a high level
function App() {
const [theme, setTheme] = useState('light');
return (
<ThemeContext.Provider value={{ theme, setTheme }}>
<MainContent />
</ThemeContext.Provider>
);
}
// 3. Consume anywhere in the tree without prop drilling
function Button() {
const { theme, setTheme } = useContext(ThemeContext);
return (
<button className={theme} onClick={() => setTheme(t => t === 'light' ? 'dark' : 'light')}>
Toggle
</button>
);
}
When NOT to use Context: For state that changes frequently and is consumed by many components. Every consumer re-renders when context value changes. For high-frequency updates, prefer Zustand/Redux or split your contexts.
const ref = useRef(initialValue);
// ref.current holds the value — changes DO NOT trigger re-renders
Use Case 1: Accessing DOM elements
function TextInput() {
const inputRef = useRef(null);
const focusInput = () => inputRef.current.focus();
return (
<>
<input ref={inputRef} />
<button onClick={focusInput}>Focus</button>
</>
);
}
Use Case 2: Storing mutable values that persist across renders without causing re-renders
function Timer() {
const intervalRef = useRef(null);
const start = () => {
intervalRef.current = setInterval(() => console.log('tick'), 1000);
};
const stop = () => {
clearInterval(intervalRef.current);
};
return (
<>
<button onClick={start}>Start</button>
<button onClick={stop}>Stop</button>
</>
);
}
Use Case 3: Tracking previous value
function usePrevious(value) {
const ref = useRef();
useEffect(() => { ref.current = value; });
return ref.current; // returns the value from the PREVIOUS render
}
const memoizedValue = useMemo(() => computeExpensiveValue(a, b), [a, b]);
Recalculates only when dependencies change. Prevents re-running expensive computations on every render.
function ProductList({ products, filter }) {
// Without useMemo: recalculates on every render regardless
// With useMemo: only recalculates when products or filter changes
const filteredProducts = useMemo(
() => products.filter(p => p.category === filter),
[products, filter]
);
return <ul>{filteredProducts.map(p => <li key={p.id}>{p.name}</li>)}</ul>;
}
Don't over-use useMemo. React's re-renders are fast; memoization has its own cost. Use it when:
- The computation is genuinely expensive (large lists, heavy math)
- You need referential stability for a value passed to a memoized child or used in a dependency array
const memoizedCallback = useCallback(() => doSomething(a, b), [a, b]);
Returns a memoized version of the callback that only changes when dependencies change.
Why it matters: Functions are recreated on every render. If you pass a function as a prop to a React.memo child, the child re-renders every time even if nothing changed (because the function reference is new).
function Parent() {
const [count, setCount] = useState(0);
const [text, setText] = useState('');
// Without useCallback: new function reference every render → Child always re-renders
// With useCallback: same reference as long as count doesn't change
const handleClick = useCallback(() => {
setCount(c => c + 1);
}, []); // no deps needed because we use functional updater
return (
<>
<input value={text} onChange={e => setText(e.target.value)} />
<MemoizedChild onClick={handleClick} />
</>
);
}
const MemoizedChild = React.memo(({ onClick }) => {
console.log('Child rendered');
return <button onClick={onClick}>Increment</button>;
});
const [state, dispatch] = useReducer(reducer, initialState, init?);
An alternative to useState for complex state transitions, especially when next state depends on previous state or multiple sub-values are related.
const initialState = { count: 0, loading: false, error: null };
function reducer(state, action) {
switch (action.type) {
case 'INCREMENT': return { ...state, count: state.count + 1 };
case 'DECREMENT': return { ...state, count: state.count - 1 };
case 'RESET': return initialState;
case 'SET_LOADING': return { ...state, loading: action.payload };
default: throw new Error(`Unknown action: ${action.type}`);
}
}
function Counter() {
const [state, dispatch] = useReducer(reducer, initialState);
return (
<>
<p>Count: {state.count}</p>
<button onClick={() => dispatch({ type: 'INCREMENT' })}>+</button>
<button onClick={() => dispatch({ type: 'DECREMENT' })}>-</button>
<button onClick={() => dispatch({ type: 'RESET' })}>Reset</button>
</>
);
}
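Because the reducer is a pure function, state transitions can be unit-tested without rendering anything. A minimal sketch (repeating the reducer from above so the snippet stands alone):

```javascript
// The reducer from the example above, repeated so this snippet is self-contained.
const initialState = { count: 0, loading: false, error: null };
function reducer(state, action) {
  switch (action.type) {
    case 'INCREMENT': return { ...state, count: state.count + 1 };
    case 'DECREMENT': return { ...state, count: state.count - 1 };
    case 'RESET': return initialState;
    case 'SET_LOADING': return { ...state, loading: action.payload };
    default: throw new Error(`Unknown action: ${action.type}`);
  }
}

// Pure-function tests: same input → same output, no React required.
const s1 = reducer(initialState, { type: 'INCREMENT' });
console.log(s1.count); // 1
const s2 = reducer(s1, { type: 'SET_LOADING', payload: true });
console.log(s2.loading, s2.count); // true 1
console.log(initialState.count); // 0 — inputs are never mutated
```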
useState vs useReducer:
useState: Simple, independent state values
useReducer: Multiple related state values, complex transitions, testable state logic (the reducer is a pure function)
Custom hooks are JavaScript functions starting with use that can call other hooks. They let you extract and reuse stateful logic.
// Custom hook: useFetch
function useFetch(url) {
const [data, setData] = useState(null);
const [loading, setLoading] = useState(true);
const [error, setError] = useState(null);
useEffect(() => {
let mounted = true;
const controller = new AbortController();
fetch(url, { signal: controller.signal })
.then(res => {
if (!res.ok) throw new Error(`HTTP ${res.status}`);
return res.json();
})
.then(data => { if (mounted) { setData(data); setLoading(false); } })
.catch(err => {
if (err.name !== 'AbortError' && mounted) {
setError(err.message);
setLoading(false);
}
});
return () => { mounted = false; controller.abort(); };
}, [url]);
return { data, loading, error };
}
// Usage
function UserProfile({ id }) {
const { data: user, loading, error } = useFetch(`/api/users/${id}`);
if (loading) return <Spinner />;
if (error) return <Error message={error} />;
return <div>{user.name}</div>;
}
More custom hook examples:
// useLocalStorage
function useLocalStorage(key, initialValue) {
const [value, setValue] = useState(() => {
try {
const stored = localStorage.getItem(key);
return stored !== null ? JSON.parse(stored) : initialValue;
} catch {
return initialValue; // malformed JSON in storage shouldn't crash the app
}
});
const setStoredValue = (newValue) => {
setValue(newValue);
localStorage.setItem(key, JSON.stringify(newValue));
};
return [value, setStoredValue];
}
// useDebounce
function useDebounce(value, delay = 300) {
const [debouncedValue, setDebouncedValue] = useState(value);
useEffect(() => {
const timer = setTimeout(() => setDebouncedValue(value), delay);
return () => clearTimeout(timer);
}, [value, delay]);
return debouncedValue;
}
// useWindowSize
function useWindowSize() {
const [size, setSize] = useState({ width: window.innerWidth, height: window.innerHeight });
useEffect(() => {
const handler = () => setSize({ width: window.innerWidth, height: window.innerHeight });
window.addEventListener('resize', handler);
return () => window.removeEventListener('resize', handler);
}, []);
return size;
}
// ❌ WRONG — conditional hook call
function Component({ show }) {
if (show) {
const [val, setVal] = useState(0); // Breaks hook ordering!
}
}
// ✅ CORRECT — hook always called
function Component({ show }) {
const [val, setVal] = useState(0);
if (show) { /* use val */ }
}
Why this rule? React tracks hooks by their call order. If hooks are called conditionally, the order can differ between renders, breaking React's ability to correctly associate hook state.
useLayoutEffect is identical to useEffect except it fires synchronously after DOM mutations, before the browser paints.
// useEffect timeline:
// render → DOM update → PAINT (user sees update) → useEffect fires
// useLayoutEffect timeline:
// render → DOM update → useLayoutEffect fires → PAINT (user sees update)
// Use case: measuring DOM before paint (prevents flicker)
function Tooltip({ target }) {
const tooltipRef = useRef(null);
const [position, setPosition] = useState({ top: 0, left: 0 });
useLayoutEffect(() => {
// Measure DOM synchronously before paint
const rect = tooltipRef.current.getBoundingClientRect();
if (rect.bottom > window.innerHeight) {
setPosition({ top: -rect.height, left: 0 }); // flip upwards
}
});
return <div ref={tooltipRef} style={position}>Tooltip</div>;
}
// useEffect for this would cause visible flicker (tooltip jumps)
// useLayoutEffect updates position before the user ever sees the tooltip
// Rule: default to useEffect. Only switch to useLayoutEffect if you see visual flicker
// from layout calculations that happen after paint.
| | useEffect | useLayoutEffect |
|---|---|---|
| Fires | After paint | Before paint (after DOM update) |
| Blocking | No (async) | Yes (blocks paint) |
| SSR | Doesn't run (no warning) | Doesn't run (React warns) |
| Use for | Data fetching, subscriptions | DOM measurement, preventing flicker |
useId (React 18) generates a stable, unique ID that is consistent between server and client renders — prevents hydration mismatches.
// Problem: Math.random() on server ≠ on client → hydration mismatch
// Problem: Counter that increments → IDs differ between SSR and CSR
function PasswordField() {
const id = useId(); // e.g., ':r0:' — stable across SSR/CSR
return (
<>
<label htmlFor={id}>Password:</label>
<input id={id} type="password" />
</>
);
}
// For multiple related IDs, use a prefix
function FormField({ label }) {
const id = useId();
return (
<>
<label htmlFor={`${id}-input`}>{label}</label>
<input id={`${id}-input`} aria-describedby={`${id}-hint`} />
<span id={`${id}-hint`}>Enter your {label}</span>
</>
);
}
// Don't use for list keys — useId is NOT for lists, only for accessibility IDs
useSyncExternalStore is the canonical hook for subscribing to external stores (Redux, browser APIs) in a way that is safe for concurrent rendering.
// Subscribe to browser online/offline status
function useOnlineStatus() {
return useSyncExternalStore(
(callback) => {
// subscribe: called when store changes
window.addEventListener('online', callback);
window.addEventListener('offline', callback);
return () => {
window.removeEventListener('online', callback);
window.removeEventListener('offline', callback);
};
},
() => navigator.onLine, // getSnapshot (client)
() => true // getServerSnapshot (server — assume online)
);
}
function App() {
const isOnline = useOnlineStatus();
return <p>{isOnline ? 'Online' : 'Offline'}</p>;
}
// Why not just useEffect + useState?
// In concurrent mode, React can re-render between reading the store and subscribing,
// leading to "tearing" (different components see different store values mid-render).
// useSyncExternalStore prevents this by snapshotting consistently.
// This is the hook Redux uses internally since React 18.
Answer:
A Higher-Order Component is a function that takes a component and returns a new enhanced component.
// HOC: withAuth
function withAuth(WrappedComponent) {
return function AuthenticatedComponent(props) {
const { isLoggedIn } = useAuth();
if (!isLoggedIn) return <Redirect to="/login" />;
return <WrappedComponent {...props} />;
};
}
// Usage
const ProtectedDashboard = withAuth(Dashboard);
// HOC: withLoading
function withLoading(WrappedComponent) {
return function WithLoadingComponent({ isLoading, ...rest }) {
if (isLoading) return <Spinner />;
return <WrappedComponent {...rest} />;
};
}
// HOC: withLogger (logs render)
function withLogger(WrappedComponent) {
return function LoggedComponent(props) {
useEffect(() => {
console.log(`${WrappedComponent.displayName || WrappedComponent.name} rendered`);
});
return <WrappedComponent {...props} />;
};
}
Downsides of HOCs: wrapper hell (deeply nested component trees in DevTools), prop-name collisions between stacked HOCs, and indirection that obscures where props come from. Custom hooks cover most of these use cases today.
Answer:
A technique where a component receives a function as a prop and calls it to render its children, sharing stateful logic.
// Mouse tracker using render props
class Mouse extends React.Component {
state = { x: 0, y: 0 };
handleMouseMove = (e) => this.setState({ x: e.clientX, y: e.clientY });
render() {
return (
<div onMouseMove={this.handleMouseMove}>
{this.props.render(this.state)}
</div>
);
}
}
// Usage — different rendering, same logic
<Mouse render={({ x, y }) => <p>Mouse at {x}, {y}</p>} />
<Mouse render={({ x, y }) => <Cat x={x} y={y} />} />
Modern equivalent: Custom hooks replace most render props patterns:
function useMouse() {
const [pos, setPos] = useState({ x: 0, y: 0 });
useEffect(() => {
const handle = (e) => setPos({ x: e.clientX, y: e.clientY });
window.addEventListener('mousemove', handle);
return () => window.removeEventListener('mousemove', handle);
}, []);
return pos;
}
// Pattern: Split state and dispatch contexts to avoid unnecessary re-renders
const CountStateContext = React.createContext();
const CountDispatchContext = React.createContext();
function CountProvider({ children }) {
const [count, dispatch] = useReducer(reducer, 0);
return (
<CountStateContext.Provider value={count}>
<CountDispatchContext.Provider value={dispatch}>
{children}
</CountDispatchContext.Provider>
</CountStateContext.Provider>
);
}
// Components that only dispatch DON'T re-render when state changes
function IncrementButton() {
const dispatch = useContext(CountDispatchContext);
return <button onClick={() => dispatch({ type: 'INC' })}>+</button>;
}
Components that catch JavaScript errors in their child tree and display a fallback UI.
class ErrorBoundary extends React.Component {
constructor(props) {
super(props);
this.state = { hasError: false, error: null };
}
static getDerivedStateFromError(error) {
return { hasError: true, error };
}
componentDidCatch(error, errorInfo) {
logErrorToService(error, errorInfo.componentStack);
}
render() {
if (this.state.hasError) {
return this.props.fallback ?? <h1>Something went wrong.</h1>;
}
return this.props.children;
}
}
// Usage
<ErrorBoundary fallback={<ErrorPage />}>
<Dashboard />
</ErrorBoundary>
Note: Error boundaries only catch errors in render, lifecycle methods, and constructors. They do NOT catch errors in event handlers, async code (setTimeout), or server-side rendering.
Render a component's output outside the parent DOM hierarchy while keeping it in the React tree.
import { createPortal } from 'react-dom';
function Modal({ children, isOpen }) {
if (!isOpen) return null;
return createPortal(
<div className="modal-overlay">
<div className="modal-content">{children}</div>
</div>,
document.getElementById('modal-root') // DOM node outside React root
);
}
Use cases: Modals, tooltips, dropdowns, toasts — elements that need to visually "escape" their container (overflow: hidden, z-index stacking context issues).
A pattern where multiple components work together to form a cohesive UI, sharing implicit state via context.
// Select compound component
const SelectContext = React.createContext();
function Select({ children, onChange }) {
const [selected, setSelected] = useState(null);
const handleSelect = (val) => { setSelected(val); onChange?.(val); };
return (
<SelectContext.Provider value={{ selected, onSelect: handleSelect }}>
<div className="select">{children}</div>
</SelectContext.Provider>
);
}
Select.Option = function Option({ value, children }) {
const { selected, onSelect } = useContext(SelectContext);
return (
<div
className={`option ${selected === value ? 'selected' : ''}`}
onClick={() => onSelect(value)}
>
{children}
</div>
);
};
// Usage
<Select onChange={console.log}>
<Select.Option value="a">Option A</Select.Option>
<Select.Option value="b">Option B</Select.Option>
</Select>
const HeavyComponent = React.lazy(() => import('./HeavyComponent'));
// import() is dynamic — browser fetches this chunk only when needed
function App() {
return (
<Suspense fallback={<LoadingSpinner />}>
<HeavyComponent />
</Suspense>
);
}
// Route-based code splitting (most impactful)
const Dashboard = React.lazy(() => import('./pages/Dashboard'));
const Profile = React.lazy(() => import('./pages/Profile'));
function App() {
return (
<Suspense fallback={<PageLoader />}>
<Routes>
<Route path="/dashboard" element={<Dashboard />} />
<Route path="/profile" element={<Profile />} />
</Routes>
</Suspense>
);
}
// ForwardRef lets parent access child's DOM node
const FancyInput = React.forwardRef((props, ref) => {
return <input ref={ref} className="fancy" {...props} />;
});
function Parent() {
const inputRef = useRef(null);
return (
<>
<FancyInput ref={inputRef} />
<button onClick={() => inputRef.current.focus()}>Focus</button>
</>
);
}
// useImperativeHandle — expose only specific methods to parent
const VideoPlayer = React.forwardRef((props, ref) => {
const videoRef = useRef(null);
useImperativeHandle(ref, () => ({
play: () => videoRef.current.play(),
pause: () => videoRef.current.pause(),
// Only these methods are exposed, not the full DOM node
}));
return <video ref={videoRef} src={props.src} />;
});
Choosing the right state management solution is a common interview topic.
When to use built-in Context + useReducer:
Redux Toolkit (RTK):
import { createSlice, configureStore } from '@reduxjs/toolkit';
const counterSlice = createSlice({
name: 'counter',
initialState: { value: 0 },
reducers: {
increment: (state) => { state.value += 1; }, // Immer under the hood (safe mutation)
decrement: (state) => { state.value -= 1; },
incrementBy: (state, action) => { state.value += action.payload; },
}
});
export const { increment, decrement, incrementBy } = counterSlice.actions;
// Async action with createAsyncThunk
import { createAsyncThunk } from '@reduxjs/toolkit';
const fetchUser = createAsyncThunk('users/fetchById', async (userId) => {
const res = await fetch(`/api/users/${userId}`);
return res.json();
});
const store = configureStore({ reducer: { counter: counterSlice.reducer } });
Zustand — minimal, no boilerplate:
import { create } from 'zustand';
const useStore = create((set, get) => ({
count: 0,
user: null,
increment: () => set(state => ({ count: state.count + 1 })),
fetchUser: async (id) => {
const user = await fetchUserApi(id);
set({ user }); // partial updates, no spreading needed
},
reset: () => set({ count: 0, user: null }),
}));
// React component
function Counter() {
const count = useStore(state => state.count); // ← subscribe to slice
const increment = useStore(state => state.increment); // ← stable reference
return <button onClick={increment}>{count}</button>;
}
// Components only re-render when their selected slice changes
Comparison:
| | Context | Redux Toolkit | Zustand |
|---|---|---|---|
| Boilerplate | Low | Medium | Minimal |
| DevTools | ❌ | ✅ Excellent | ✅ (plugin) |
| Bundle size | 0 | ~11KB | ~1KB |
| Async | Manual | RTK Query / thunk | Built-in |
| Learning curve | Low | Medium | Low |
| Re-render control | Coarse | Fine (selectors) | Fine (selectors) |
| Use when | Simple app global state | Large app, complex state | Medium app, simple API |
Suspense lets components "wait" for something before rendering, showing a fallback in the meantime. Initially only for code splitting (React.lazy), now also for data fetching (React 18 + frameworks).
// Code splitting (supported everywhere)
const UserProfile = React.lazy(() => import('./UserProfile'));
// Data fetching with Suspense (Next.js App Router, React Query v5, etc.)
// The component "suspends" (throws a Promise) until data is ready
// Next.js Server Component — automatically integrates with Suspense
async function UserPage() {
const user = await fetchUser(id); // server-side, can suspend
return <UserProfile user={user} />;
}
function App() {
return (
<Suspense fallback={<PageSkeleton />}>
<UserPage /> {/* Shows skeleton until user data loads */}
</Suspense>
);
}
// Multiple boundaries — granular loading states
function Dashboard() {
return (
<div>
<Suspense fallback={<HeaderSkeleton />}>
<Header />
</Suspense>
<Suspense fallback={<FeedSkeleton />}>
<Feed /> {/* loads independently */}
</Suspense>
<Suspense fallback={<SidebarSkeleton />}>
<Sidebar /> {/* loads independently */}
</Suspense>
</div>
);
}
// Each section shows its skeleton and fills in when ready — parallel loading!
React.memo is a higher-order component that memoizes a functional component. It only re-renders if props change (shallow comparison).
const ExpensiveChild = React.memo(function ExpensiveChild({ value, onUpdate }) {
console.log('Child rendered');
return <div>{value}</div>;
});
// With custom comparison
const SmartChild = React.memo(Component, (prevProps, nextProps) => {
return prevProps.data.id === nextProps.data.id; // return true to SKIP re-render
});
1. React.memo — wrap components that receive same props often
2. useMemo — memoize expensive computed values
3. useCallback — memoize callbacks passed to memo'd children
4. Code splitting — React.lazy + Suspense
5. List virtualization — react-window, react-virtual for long lists
6. Avoid anonymous functions/objects in JSX render (create new references each render)
7. State colocation — keep state close to where it's used
8. Split contexts — separate high-frequency and low-frequency contexts
9. Production build — always test performance in production mode
10. React DevTools Profiler — identify wasted renders
// ❌ WRONG — direct mutation
const [items, setItems] = useState([1, 2, 3]);
items.push(4); // mutates state directly
setItems(items); // React may not re-render! Same reference
// ✅ CORRECT — new reference
setItems([...items, 4]); // new array
setItems(items.filter(i => i !== 2)); // new array
setItems(items.map(i => i === 2 ? 99 : i)); // new array
// Object state
const [user, setUser] = useState({ name: 'Jash', age: 21 });
// ❌
user.age = 22;
setUser(user);
// ✅
setUser({ ...user, age: 22 });
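Spread only copies one level deep, so nested updates need a fresh copy at every level on the path to the change. A plain-JavaScript sketch:

```javascript
// Nested immutable update: copy each level you change; share the rest.
const state = {
  user: { name: 'Jash', address: { city: 'Mumbai', zip: '400001' } },
  items: [1, 2, 3]
};

// ❌ state.user.address.city = 'Pune' would mutate the original object
// ✅ copy each level on the path to the change:
const next = {
  ...state,
  user: {
    ...state.user,
    address: { ...state.user.address, city: 'Pune' }
  }
};

console.log(next.user.address.city);   // 'Pune'
console.log(state.user.address.city);  // 'Mumbai' — original untouched
console.log(next.items === state.items); // true — unchanged branches are shared
```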
This is a critical topic for SDE interviews.
CSR (Client-Side Rendering):
Browser downloads a minimal HTML shell plus the JS bundle; React renders everything in the browser.
SSR (Server-Side Rendering):
HTML is generated on the server for every request. getServerSideProps (Pages Router) / async Server Components (App Router).
SSG (Static Site Generation):
HTML is generated once at build time and served statically. getStaticProps (Pages Router) / default Server Components (App Router).
ISR (Incremental Static Regeneration):
Static pages revalidate after `revalidate` seconds: the next request triggers a rebuild in the background while that user still gets the cached version; the fresh version is served to the next user. getStaticProps with revalidate (Pages Router) / fetch with next.revalidate (App Router).
Comparison Table:
| Strategy | Rendering Time | Data Freshness | SEO | Use Case |
|---|---|---|---|---|
| CSR | At request (client) | On demand | ❌ Poor | Dashboards, authenticated apps |
| SSR | At request (server) | Always fresh | ✅ Good | Social media, personalized pages |
| SSG | At build time | Stale until rebuild | ✅ Best | Docs, blogs, marketing |
| ISR | At build + background | Configurable | ✅ Good | E-commerce, news, semi-static |
| Feature | Pages Router | App Router (Next 13+) |
|---|---|---|
| Directory | /pages | /app |
| Data fetching | getServerSideProps, getStaticProps | async Server Components, fetch |
| Layouts | _app.js / manual | Nested layout.js files |
| Server Components | No | Yes (default) |
| Client Components | All components | Opt-in with 'use client' |
| Loading UI | Manual | loading.js file |
| Error UI | _error.js | error.js file |
| Metadata | <Head> component | metadata export / generateMetadata |
| Streaming | No | Yes |
Server Components (default in App Router):
Render on the server; can access databases, the filesystem, and secrets directly; cannot use useState/useEffect, event handlers, or browser APIs.
Client Components ('use client' directive):
Run in the browser; can use state, effects, event handlers, and browser APIs.
// ServerComponent.tsx (no directive = server component)
async function UserProfile({ id }: { id: string }) {
const user = await db.user.findById(id); // direct DB access!
return <div>{user.name}</div>;
}
// ClientComponent.tsx
'use client';
import { useState } from 'react';
function Counter() {
const [count, setCount] = useState(0);
return <button onClick={() => setCount(c => c + 1)}>{count}</button>;
}
Composition pattern — Server wraps Client:
// ✅ OK — Server Component renders Client Component
// app/page.tsx (Server)
export default function Page() {
return (
<main>
<ServerData /> {/* server component, async, DB access */}
<InteractiveUI /> {/* client component, has state */}
</main>
);
}
// Server Component — async by default
async function Posts() {
// No cache — always fresh (SSR equivalent)
const data = await fetch('https://api.example.com/posts', {
cache: 'no-store'
});
// Default cache — cached permanently (SSG equivalent)
const staticData = await fetch('https://api.example.com/config');
// Revalidate — ISR equivalent
const freshData = await fetch('https://api.example.com/news', {
next: { revalidate: 60 } // re-fetch if older than 60 seconds
});
const posts = await data.json();
return <ul>{posts.map(p => <li key={p.id}>{p.title}</li>)}</ul>;
}
Caching layers in Next.js:
Request memoization — identical fetch() calls in one render pass are deduplicated
Data cache — fetch results persist across requests (until marked no-store or revalidated)
Server Actions let you define async functions that run on the server and can be called directly from client components — no separate API route needed.
// app/actions.ts
'use server'; // marks all exports in this file as Server Actions
import { revalidatePath } from 'next/cache';
import { redirect } from 'next/navigation';
export async function createPost(formData: FormData) {
const title = formData.get('title') as string;
const body = formData.get('body') as string;
// Validate
if (!title || title.length < 3) throw new Error('Title too short');
// Direct DB access on server — no API layer!
await db.post.create({ data: { title, body } });
revalidatePath('/posts'); // invalidate cached route
redirect('/posts'); // redirect after mutation
}
// Client component calling Server Action
'use client';
import { createPost } from './actions';
import { useFormStatus } from 'react-dom';
function SubmitButton() {
const { pending } = useFormStatus();
return <button disabled={pending}>{pending ? 'Posting...' : 'Post'}</button>;
}
export default function NewPostForm() {
return (
<form action={createPost}> {/* action = server action */}
<input name="title" placeholder="Title" />
<textarea name="body" placeholder="Body" />
<SubmitButton />
</form>
);
}
Server Actions vs API Routes:
Parallel Routes — render multiple pages in the same layout simultaneously using @folder convention:
app/
layout.tsx
page.tsx
@modal/
page.tsx → renders in parallel with main page
@sidebar/
page.tsx
// app/layout.tsx
export default function Layout({ children, modal, sidebar }) {
return (
<div>
{sidebar}
{children}
{modal} {/* modal slot — conditionally rendered */}
</div>
);
}
// Use case: a feed with a side-by-side chat, or modals that are URL-addressable
Intercepting Routes — show a route in a different context (e.g., open a photo in a modal while keeping the gallery beneath):
app/
photos/
[id]/page.tsx → full page at /photos/123
(.)photos/ → intercepts /photos route when navigating within same dir
[id]/page.tsx → shows as modal overlay
// Every route can opt into Edge Runtime:
export const runtime = 'edge'; // or 'nodejs' (default)
// Edge Runtime — Cloudflare Workers-like V8 isolate at CDN edge
// ✅ Globally distributed, near-zero cold starts
// ✅ Fastest TTFB for simple responses
// ❌ No Node.js built-ins (no fs, no Buffer, no crypto.X509Certificate...)
// ❌ No native modules (bcrypt, sharp — use WebAssembly versions)
// ❌ 25MB max bundle size, no long-running processes
// Node.js Runtime (default) — full Node.js environment on a server
// ✅ All Node APIs and npm packages
// ✅ Long-running connections (WebSockets)
// ❌ Cold starts under serverless deployments
// ❌ Not edge-distributed
// Example: edge is great for auth checks (fast, lightweight)
// app/api/auth/check/route.ts
export const runtime = 'edge';
export async function GET(req: Request) {
const token = req.headers.get('authorization');
const valid = await verifyJWT(token); // WebCrypto API works in Edge
return Response.json({ valid });
}
// Reading cookies in Server Component
import { cookies, headers } from 'next/headers';
async function ServerPage() {
const cookieStore = cookies();
const token = cookieStore.get('auth-token')?.value;
const headersList = headers();
const userAgent = headersList.get('user-agent');
return <div>Token: {token}</div>;
}
// Setting cookies in Server Action
'use server';
import { cookies } from 'next/headers';
export async function login(credentials) {
const { token } = await authenticate(credentials);
cookies().set('auth-token', token, {
httpOnly: true, // not accessible via document.cookie (XSS protection)
secure: true, // HTTPS only
sameSite: 'lax', // CSRF protection
maxAge: 60 * 60 * 24 * 7, // 7 days
path: '/'
});
}
App Router routing:
app/
page.tsx → /
about/
page.tsx → /about
blog/
page.tsx → /blog
[slug]/
page.tsx → /blog/:slug
(auth)/ → route group (doesn't affect URL)
login/page.tsx → /login
@modal/ → parallel routes
[...slug]/page.tsx → /any/nested/path (catch-all)
[[...slug]]/page.tsx → /optionally/nested (optional catch-all)
Special files:
| File | Purpose |
|---|---|
| layout.tsx | Shared UI that wraps children, persists across navigations |
| page.tsx | Unique UI of a route, makes route publicly accessible |
| loading.tsx | Instant loading UI (Suspense boundary) |
| error.tsx | Error UI (Error Boundary) |
| not-found.tsx | 404 UI |
| route.ts | API endpoint (replaces the /pages/api convention) |
| middleware.ts | Runs before a request is completed |
// app/api/users/route.ts
import { NextRequest, NextResponse } from 'next/server';
export async function GET(request: NextRequest) {
const { searchParams } = new URL(request.url);
const page = searchParams.get('page') ?? '1';
const users = await db.user.findMany({ skip: (parseInt(page) - 1) * 10, take: 10 });
return NextResponse.json(users);
}
export async function POST(request: NextRequest) {
const body = await request.json();
const user = await db.user.create({ data: body });
return NextResponse.json(user, { status: 201 });
}
// app/api/users/[id]/route.ts
export async function DELETE(request: NextRequest, { params }: { params: { id: string } }) {
await db.user.delete({ where: { id: params.id } });
return new NextResponse(null, { status: 204 });
}
Middleware runs before a request is completed — before the page/API route is called. Used for auth checks, redirects, rewriting URLs, adding headers.
// middleware.ts (root level)
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';
export function middleware(request: NextRequest) {
const token = request.cookies.get('auth-token');
if (!token && request.nextUrl.pathname.startsWith('/dashboard')) {
return NextResponse.redirect(new URL('/login', request.url));
}
// Add custom header
const response = NextResponse.next();
response.headers.set('x-custom-header', 'value');
return response;
}
// Configure which paths it runs on
export const config = {
matcher: ['/dashboard/:path*', '/api/protected/:path*']
};
import Image from 'next/image';
import { Inter } from 'next/font/google';
const inter = Inter({ subsets: ['latin'] }); // auto-optimizes, zero layout shift
// next/image:
// - Automatic WebP/AVIF conversion
// - Lazy loading by default
// - Prevents Cumulative Layout Shift (requires width/height or fill)
// - Built-in blur placeholder
<Image
src="/hero.jpg"
alt="Hero"
width={1200}
height={600}
priority // loads eagerly (above the fold)
placeholder="blur"
/>
Answer:
Node.js is a JavaScript runtime built on Chrome's V8 engine that allows running JavaScript on the server-side. Key differences from the browser:
| | Node.js | Browser |
|---|---|---|
| Global object | global | window |
| DOM | ❌ No | ✅ Yes |
| File system | ✅ fs module | ❌ No |
| HTTP | ✅ http/https module | Via fetch |
| Module system | CommonJS (require) + ESM | ESM (import) |
| Built-in modules | fs, path, os, crypto, etc. | Web APIs |
| Process control | process.exit(), env vars | ❌ No |
| this at top level | module.exports object (CJS) | window |
// CommonJS (older, Node.js default)
const fs = require('fs');
const { readFile } = require('fs');
module.exports = { myFunction };
module.exports.myFunction = myFunction;
// ES Modules (modern, use "type": "module" in package.json)
import fs from 'fs';
import { readFile } from 'fs';
export function myFunction() {}
export default class MyClass {}
Key differences:
CommonJS loads synchronously; ESM supports top-level await
CommonJS has __dirname and __filename; ESM uses import.meta.url instead
Streams handle data in chunks rather than loading everything into memory — essential for large files, network data, etc.
Types:
Readable — a source of data (fs.createReadStream, HTTP request body)
Writable — a destination for data (fs.createWriteStream, HTTP response)
Duplex — both readable and writable (TCP socket)
Transform — a duplex stream that modifies data as it passes through (zlib.createGzip)
const fs = require('fs');
const zlib = require('zlib');
// Pipe: file → gzip → output (memory efficient for large files)
fs.createReadStream('input.txt')
.pipe(zlib.createGzip())
.pipe(fs.createWriteStream('input.txt.gz'))
.on('finish', () => console.log('Done'));
// vs reading entire file into memory
const data = fs.readFileSync('input.txt'); // loads ENTIRE file into RAM
// Buffer: fixed-size chunk of memory (binary data)
const buf = Buffer.from('hello', 'utf-8');
buf.toString('hex'); // '68656c6c6f'
buf.toString('base64'); // 'aGVsbG8='
buf.length; // 5 (bytes)
// Useful for:
// - Binary file operations
// - Cryptography
// - Network protocol implementations
// - Processing image/audio data
| | Worker Threads | Child Process | Cluster |
|---|---|---|---|
| Use case | CPU-intensive JS tasks | Run any program | Scale HTTP servers |
| Shared memory | Yes (SharedArrayBuffer) | No (separate memory) | No (separate processes) |
| Communication | postMessage | IPC / stdin/stdout | IPC |
| Node instance | Same | New process | New process (fork) |
| Overhead | Low (threads) | High (processes) | High (processes) |
// Cluster — scale HTTP server across CPU cores
const cluster = require('cluster');
const http = require('http');
const numCPUs = require('os').cpus().length;
if (cluster.isPrimary) {
for (let i = 0; i < numCPUs; i++) cluster.fork();
} else {
http.createServer((req, res) => res.end(`Worker ${process.pid}`)).listen(3000);
}
Node's event loop has 6 phases (libuv):
┌───────────────────────────┐
┌─>│ timers │ ← setTimeout, setInterval callbacks
│ └─────────────┬─────────────┘
│ ┌─────────────┴─────────────┐
│ │ pending callbacks │ ← I/O errors from previous iteration
│ └─────────────┬─────────────┘
│ ┌─────────────┴─────────────┐
│ │ idle, prepare │ ← internal use
│ └─────────────┬─────────────┘
│ ┌─────────────┴─────────────┐
│ │ poll │ ← retrieve new I/O events
│ └─────────────┬─────────────┘
│ ┌─────────────┴─────────────┐
│ │ check │ ← setImmediate callbacks
│ └─────────────┬─────────────┘
│ ┌─────────────┴─────────────┐
└──┤ close callbacks │ ← 'close' events
└───────────────────────────┘
Between EACH phase: process.nextTick() and Promise microtasks run
process.nextTick vs setImmediate:
setImmediate(() => console.log('setImmediate'));
process.nextTick(() => console.log('nextTick'));
Promise.resolve().then(() => console.log('promise'));
// Order: nextTick → promise → setImmediate
// The nextTick queue is drained completely first, then the Promise
// microtask queue; both run between every phase of the event loop,
// so both fire before setImmediate's check phase.
// When you require() a module, Node:
// 1. Resolves the path
// 2. Checks the cache (require.cache)
// 3. If cached, returns the cached exports
// 4. If not, loads, executes, caches, and returns exports
// This means require() is synchronous and modules are singletons
const db = require('./database'); // same instance everywhere
// process.env contains all environment variables
const PORT = process.env.PORT || 3000;
const DB_URL = process.env.DATABASE_URL;
// .env file (use dotenv package in development)
require('dotenv').config();
// Never hardcode secrets — always use env vars
// NODE_ENV is conventionally 'development', 'test', or 'production'
if (process.env.NODE_ENV === 'production') {
// production config
}
EventEmitter is the backbone of Node's asynchronous event-driven architecture. Streams, http.Server, net.Socket — all extend EventEmitter.
const { EventEmitter } = require('events');
class OrderService extends EventEmitter {
async createOrder(data) {
const order = await db.order.create(data);
this.emit('order:created', order); // notify all listeners
return order;
}
}
const orderService = new OrderService();
// Subscribe
orderService.on('order:created', async (order) => {
await emailService.sendConfirmation(order.userId, order.id);
});
orderService.on('order:created', async (order) => {
await inventoryService.reserve(order.items);
});
// One-time listener
orderService.once('connect', () => console.log('Connected'));
// Remove listener (important to avoid memory leaks)
const handler = (order) => console.log(order);
orderService.on('order:created', handler);
orderService.off('order:created', handler); // or .removeListener()
// Max listeners warning (default 10) — raise for legitimate use
orderService.setMaxListeners(20);
// Error events — ALWAYS handle them or the process crashes!
orderService.on('error', (err) => {
console.error('OrderService error:', err);
});
// If 'error' is emitted with no listener → uncaught exception → process crash
// List listeners
orderService.listeners('order:created'); // array of handlers
orderService.eventNames(); // ['order:created', 'error']
orderService.listenerCount('order:created');
// Unhandled promise rejections (e.g., await without try/catch)
process.on('unhandledRejection', (reason, promise) => {
console.error('Unhandled Rejection at:', promise, 'reason:', reason);
// In production: log to error tracker, then gracefully shut down
process.exit(1);
});
// Uncaught synchronous exceptions
process.on('uncaughtException', (err) => {
console.error('Uncaught Exception:', err);
// The process is in an unknown state after this — shut down immediately
process.exit(1);
});
// Graceful shutdown — clean up before process exits
process.on('SIGTERM', async () => {
console.log('SIGTERM received. Shutting down gracefully...');
httpServer.close(() => {
mongoose.connection.close(); // close DB connection
console.log('Server closed');
process.exit(0);
});
// Force shut after 10s
setTimeout(() => process.exit(1), 10000);
});
// SIGINT is Ctrl+C; SIGTERM is sent by process managers (PM2, Kubernetes).
// Extract the handler above into a gracefulShutdown function and register it for both:
process.on('SIGINT', gracefulShutdown);
// process.exit codes:
// 0 = clean exit 1 = error exit 2 = misuse of shell builtins
const crypto = require('crypto');
// Generate secure random tokens (password reset, auth tokens)
const resetToken = crypto.randomBytes(32).toString('hex'); // 64 char hex string
const apiKey = crypto.randomUUID(); // UUID v4
// HMAC — signed hash, used for JWT signatures and webhook verification
const signature = crypto
.createHmac('sha256', process.env.SECRET)
.update(data)
.digest('hex');
// Verify webhook signature (Stripe, GitHub, etc.)
function verifyWebhook(payload, receivedSig, secret) {
const expected = crypto
.createHmac('sha256', secret)
.update(payload)
.digest('hex');
// Use timingSafeEqual to prevent timing attacks!
return crypto.timingSafeEqual(
Buffer.from(expected, 'hex'),
Buffer.from(receivedSig, 'hex')
);
}
// AES encryption for sensitive data at rest
const algorithm = 'aes-256-gcm';
const key = crypto.scryptSync(password, salt, 32); // derive 32-byte key from a password + salt
const iv = crypto.randomBytes(12); // 12 bytes is the recommended IV length for GCM
const cipher = crypto.createCipheriv(algorithm, key, iv);
let encrypted = cipher.update(data, 'utf8', 'hex');
encrypted += cipher.final('hex');
const authTag = cipher.getAuthTag().toString('hex');
// Hashing sensitive data (API keys stored in DB — compare hash not plaintext)
const hashApiKey = (key) => crypto.createHash('sha256').update(key).digest('hex');
const path = require('path');
const { URL } = require('url');
// path module — cross-platform path operations
path.join('/users', 'jash', 'docs', '../file.txt'); // '/users/jash/file.txt'
path.resolve('src', 'utils', 'helpers.js'); // absolute path from cwd
path.dirname('/users/jash/file.txt'); // '/users/jash'
path.basename('/users/jash/file.txt'); // 'file.txt'
path.extname('/users/jash/file.txt'); // '.txt'
// __dirname vs import.meta.url
// CommonJS:
const dir = __dirname;
const file = path.join(__dirname, 'public', 'index.html');
// ESM (no __dirname):
import { fileURLToPath } from 'url';
import { dirname } from 'path';
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);
// URL module
const url = new URL('https://api.example.com/users?page=2&limit=10#top');
url.hostname; // 'api.example.com'
url.pathname; // '/users'
url.searchParams.get('page'); // '2'
url.searchParams.set('page', '3');
url.toString(); // updated URL string
// Parse query string from request
const reqUrl = new URL(request.url, `http://${request.headers.host}`); // req.url is path-only, so a base is required
const page = parseInt(reqUrl.searchParams.get('page') ?? '1');
Express is a minimal, unopinionated web framework for Node.js. It provides routing, a middleware pipeline, and request/response helpers as a thin layer over Node's built-in http module.
const express = require('express');
const app = express();
// Middleware
app.use(express.json()); // parse JSON body
app.use(express.urlencoded({ extended: true })); // parse form data
// Route
app.get('/users/:id', async (req, res, next) => {
try {
const user = await User.findById(req.params.id);
if (!user) return res.status(404).json({ error: 'Not found' });
res.json(user);
} catch (err) {
next(err); // pass to error middleware
}
});
// Error middleware — must have 4 params
app.use((err, req, res, next) => {
console.error(err.stack);
res.status(err.status || 500).json({ error: err.message });
});
app.listen(3000);
Middleware functions have access to req, res, and next. They run in the order they are defined.
// Types of middleware:
// 1. Application-level: app.use()
// 2. Router-level: router.use()
// 3. Error-handling: (err, req, res, next)
// 4. Built-in: express.json(), express.static()
// 5. Third-party: cors, morgan, helmet, compression
const logger = (req, res, next) => {
console.log(`${req.method} ${req.path} - ${Date.now()}`);
next(); // call next or the request hangs!
};
// Auth middleware
const authenticate = async (req, res, next) => {
const token = req.headers.authorization?.split(' ')[1];
if (!token) return res.status(401).json({ error: 'No token' });
try {
const payload = jwt.verify(token, process.env.JWT_SECRET);
req.user = payload;
next();
} catch {
res.status(401).json({ error: 'Invalid token' });
}
};
// Applying selectively
app.use('/api', logger);
app.get('/profile', authenticate, profileHandler);
// routes/users.js
const router = express.Router();
router.use(authenticate); // applies to all routes in this router
router.get('/', getAllUsers);
router.post('/', createUser);
router.get('/:id', getUserById);
router.put('/:id', updateUser);
router.delete('/:id', deleteUser);
module.exports = router;
// app.js
app.use('/api/users', usersRouter);
const rateLimit = require('express-rate-limit');
const limiter = rateLimit({
windowMs: 15 * 60 * 1000, // 15 minutes
max: 100, // 100 requests per window per IP
message: { error: 'Too many requests, try again later' },
standardHeaders: true,
legacyHeaders: false,
});
app.use('/api', limiter);
// Stricter limit for auth routes
const authLimiter = rateLimit({ windowMs: 15 * 60 * 1000, max: 10 });
app.use('/api/auth', authLimiter);
const multer = require('multer');
// Memory storage (for processing before saving to S3, etc.)
const upload = multer({
storage: multer.memoryStorage(),
limits: { fileSize: 5 * 1024 * 1024 }, // 5MB
fileFilter: (req, file, cb) => {
if (!file.mimetype.startsWith('image/')) {
return cb(new Error('Only image files allowed'), false);
}
cb(null, true);
}
});
app.post('/upload', upload.single('avatar'), async (req, res) => {
// req.file has: fieldname, originalname, buffer, mimetype, size
const url = await uploadToS3(req.file.buffer, req.file.originalname);
res.json({ url });
});
Zod is a TypeScript-first schema validation library — preferred over Joi in modern projects.
const { z } = require('zod');
// Define schema
const createUserSchema = z.object({
name: z.string().min(2).max(50).trim(),
email: z.string().email(),
age: z.number().int().min(0).max(150).optional(),
password: z.string()
.min(8)
.regex(/[A-Z]/, 'Must contain uppercase')
.regex(/[0-9]/, 'Must contain number'),
role: z.enum(['user', 'admin']).default('user'),
});
// Validate and throw on failure
const user = createUserSchema.parse(req.body); // throws ZodError
// Validate and return result object (safer)
const result = createUserSchema.safeParse(req.body);
if (!result.success) {
return res.status(400).json({
errors: result.error.errors.map(e => ({
field: e.path.join('.'),
message: e.message
}))
});
}
const validatedData = result.data;
// Middleware factory
const validate = (schema) => (req, res, next) => {
const result = schema.safeParse(req.body);
if (!result.success) {
return res.status(400).json({ errors: result.error.errors });
}
req.body = result.data; // replace with sanitized/coerced data
next();
};
// Usage
router.post('/users', validate(createUserSchema), createUserHandler);
// Type inference (TypeScript)
type CreateUserInput = z.infer<typeof createUserSchema>;
const cookieParser = require('cookie-parser');
const session = require('express-session');
const MongoStore = require('connect-mongo');
// Cookie parser
app.use(cookieParser(process.env.COOKIE_SECRET)); // secret for signed cookies
app.get('/set-cookie', (req, res) => {
res.cookie('userId', '123', {
httpOnly: true, // inaccessible to JavaScript (prevents XSS theft)
secure: true, // HTTPS only
sameSite: 'lax', // 'strict' | 'lax' | 'none' — CSRF protection
maxAge: 7 * 24 * 60 * 60 * 1000, // 7 days in ms
signed: true, // tamper-proof with cookie secret
});
res.json({ ok: true });
});
// Read signed cookie
app.get('/me', (req, res) => {
const userId = req.signedCookies.userId; // verified signature
res.json({ userId });
});
// Session (stores session data server-side, sends session ID in cookie)
app.use(session({
secret: process.env.SESSION_SECRET,
resave: false,
saveUninitialized: false,
store: MongoStore.create({ mongoUrl: process.env.DATABASE_URL }),
cookie: { httpOnly: true, secure: true, maxAge: 24 * 60 * 60 * 1000 }
}));
// JWT in httpOnly cookie (more secure than Authorization header)
app.post('/login', async (req, res) => {
const { accessToken, refreshToken } = await authenticateUser(req.body);
res.cookie('refreshToken', refreshToken, {
httpOnly: true, secure: true, sameSite: 'strict',
maxAge: 7 * 24 * 60 * 60 * 1000
});
res.json({ accessToken }); // access token still in response body (short-lived, 15min)
});
const etag = require('etag');
// ETag — conditional requests, 304 Not Modified
app.get('/api/products', async (req, res) => {
const products = await Product.find().lean();
const responseBody = JSON.stringify(products);
const etagValue = etag(responseBody);
if (req.headers['if-none-match'] === etagValue) {
return res.status(304).end(); // client cache is still valid, no body sent
}
res.setHeader('ETag', etagValue);
res.setHeader('Cache-Control', 'public, max-age=60, s-maxage=120');
// max-age=60: browser caches for 60s
// s-maxage=120: CDN caches for 120s
res.json(products);
});
// Cache-Control directives:
// public: anyone (browser, CDN) can cache
// private: only browser can cache (don't cache at CDN)
// no-cache: must revalidate with server before using cached version
// no-store: never cache
// max-age=N: cache for N seconds (browser)
// s-maxage=N: cache for N seconds (shared/CDN) — overrides max-age for proxies
// stale-while-revalidate=N: serve stale while fetching fresh in background
// must-revalidate: once stale, must revalidate (don't serve stale)
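These directives compose into a single header value; a small helper (hypothetical, not from any library) makes the combinations explicit:

```javascript
// Hypothetical helper — builds a Cache-Control header value from options.
function cacheControl({ scope = 'public', maxAge, sMaxAge, swr, noStore = false } = {}) {
  if (noStore) return 'no-store'; // never cache — overrides everything else
  const parts = [scope];
  if (maxAge !== undefined) parts.push(`max-age=${maxAge}`);
  if (sMaxAge !== undefined) parts.push(`s-maxage=${sMaxAge}`);
  if (swr !== undefined) parts.push(`stale-while-revalidate=${swr}`);
  return parts.join(', ');
}

console.log(cacheControl({ maxAge: 60, sMaxAge: 120 }));
// 'public, max-age=60, s-maxage=120' — same value as the products route above
console.log(cacheControl({ scope: 'private', maxAge: 0, swr: 30 }));
// 'private, max-age=0, stale-while-revalidate=30'
```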
Answer:
MongoDB is a document-oriented NoSQL database. Data is stored as BSON documents (Binary JSON) in collections (analogous to SQL tables).
Key Concepts:
Database → holds collections; Collection → holds documents (schemaless, unlike SQL tables); Document → a BSON record, each with a unique _id (ObjectId by default).
CRUD Operations:
// Insert
db.users.insertOne({ name: 'Jash', email: 'jash@example.com', age: 21 });
db.users.insertMany([{ name: 'A' }, { name: 'B' }]);
// Read
db.users.find({ age: { $gt: 18 } }); // find all adults
db.users.findOne({ email: 'jash@example.com' }); // find one
db.users.find({}, { name: 1, email: 1, _id: 0 }); // projection (only name and email)
// Update
db.users.updateOne({ _id: id }, { $set: { age: 22 } });
db.users.updateMany({ age: { $lt: 18 } }, { $set: { blocked: true } });
db.users.findOneAndUpdate({ _id: id }, { $set: { name: 'New' } }, { returnDocument: 'after' });
// Delete
db.users.deleteOne({ _id: id });
db.users.deleteMany({ blocked: true });
Common Query Operators:
| Operator | Meaning |
|---|---|
| $eq, $ne | Equal, not equal |
| $gt, $gte, $lt, $lte | Comparison |
| $in, $nin | In array, not in array |
| $and, $or, $not, $nor | Logical |
| $exists | Field exists |
| $regex | Pattern match |
| $elemMatch | Array element matches query |
The aggregation pipeline processes documents through a sequence of stages.
db.orders.aggregate([
// Stage 1: Filter
{ $match: { status: 'completed', createdAt: { $gte: new Date('2024-01-01') } } },
// Stage 2: Join with users collection
{ $lookup: {
from: 'users',
localField: 'userId',
foreignField: '_id',
as: 'user'
}},
// Stage 3: Flatten the joined array
{ $unwind: '$user' },
// Stage 4: Group and aggregate
{ $group: {
_id: '$user.country',
totalRevenue: { $sum: '$amount' },
orderCount: { $sum: 1 },
avgOrderValue: { $avg: '$amount' }
}},
// Stage 5: Sort
{ $sort: { totalRevenue: -1 } },
// Stage 6: Limit
{ $limit: 10 },
// Stage 7: Reshape output
{ $project: { _id: 0, country: '$_id', totalRevenue: 1, orderCount: 1 } }
]);
Key aggregation stages:
| Stage | Purpose |
|---|---|
| $match | Filter documents (use early for performance) |
| $group | Group by field and aggregate |
| $project | Include/exclude/compute fields |
| $sort | Sort documents |
| $limit / $skip | Pagination |
| $lookup | Left outer join with another collection |
| $unwind | Deconstruct array field into multiple documents |
| $facet | Multiple aggregation pipelines in parallel |
| $addFields | Add computed fields |
Indexes dramatically speed up queries by creating efficient data structures but slow down writes and use storage.
// Single field
db.users.createIndex({ email: 1 }); // 1=ascending, -1=descending
// Compound index — order matters! (covers left-prefix queries)
db.orders.createIndex({ userId: 1, createdAt: -1 });
// covers: { userId } and { userId, createdAt } queries
// does NOT cover: { createdAt } alone
// Unique index
db.users.createIndex({ email: 1 }, { unique: true });
// Sparse index (only indexes documents that have the field)
db.users.createIndex({ phone: 1 }, { sparse: true });
// TTL index (auto-delete documents after a time period)
db.sessions.createIndex({ expiresAt: 1 }, { expireAfterSeconds: 0 });
// Text index (full-text search)
db.articles.createIndex({ title: 'text', body: 'text' });
db.articles.find({ $text: { $search: 'mongodb indexing' } });
// Explain query (see if index is used)
db.users.find({ email: 'x@x.com' }).explain('executionStats');
// Check: totalDocsExamined (low = good), indexesUsed
const mongoose = require('mongoose');
const userSchema = new mongoose.Schema({
name: {
type: String,
required: [true, 'Name is required'],
trim: true,
minlength: 2,
maxlength: 50,
},
email: {
type: String,
required: true,
unique: true,
lowercase: true,
validate: {
validator: (v) => /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(v),
message: 'Invalid email'
}
},
password: { type: String, select: false }, // never returned in queries by default
role: { type: String, enum: ['user', 'admin', 'moderator'], default: 'user' },
createdAt: { type: Date, default: Date.now },
profile: {
bio: String,
avatar: String,
}
}, {
timestamps: true, // auto-adds createdAt and updatedAt
toJSON: { virtuals: true }
});
// Virtual field (computed, not stored in DB)
userSchema.virtual('fullName').get(function() {
return `${this.firstName} ${this.lastName}`; // assumes firstName/lastName fields on the schema
});
// Pre-save middleware (hook) — hash password
userSchema.pre('save', async function(next) {
if (!this.isModified('password')) return next();
this.password = await bcrypt.hash(this.password, 12);
next();
});
// Instance method
userSchema.methods.correctPassword = async function(candidate) {
return bcrypt.compare(candidate, this.password);
};
// Static method
userSchema.statics.findByEmail = function(email) {
return this.findOne({ email: email.toLowerCase() });
};
const User = mongoose.model('User', userSchema);
| | MongoDB | PostgreSQL/MySQL |
|---|---|---|
| Data model | Flexible documents | Fixed schema, tables |
| Relationships | Embedding / $lookup | JOINs (efficient) |
| Schema changes | Easy | Migrations required |
| Transactions | Yes (multi-doc, v4+) | Yes (strong ACID) |
| Horizontal scaling | Built-in sharding | Harder |
| Query language | MQL / Aggregation | SQL |
| Best for | Hierarchical data, rapid iteration, catalogs | Financial data, complex relations, strong consistency |
For financial platforms: strong consistency is essential, so they either use MongoDB carefully (multi-document transactions for critical operations) or keep ledger-type data in PostgreSQL.
const session = await mongoose.startSession();
session.startTransaction();
try {
const user = await User.create([{ name: 'Jash' }], { session });
await Wallet.create([{ userId: user[0]._id, balance: 0 }], { session });
await session.commitTransaction();
} catch (error) {
await session.abortTransaction();
throw error;
} finally {
session.endSession();
}
One of the most important MongoDB design decisions. There is no single right answer — it depends on access patterns.
Embedding (denormalization): Store related data inside the same document.
// Embedded: blog post with its comments
{
_id: ObjectId('...'),
title: 'MongoDB Tips',
content: '...',
comments: [ // ← embedded array
{ author: 'Jash', text: 'Great!', createdAt: Date },
{ author: 'Sam', text: 'Thanks!', createdAt: Date }
]
}
// ✅ Single query fetches post AND comments
// ✅ Atomic updates on one document
// ❌ Document size limit (16MB) — if comments grow unboundedly
// ❌ Can't efficiently query/index all comments across posts
Referencing (normalization): Store related data in separate documents with ObjectId references.
// Post document
{ _id: ObjectId('post1'), title: 'MongoDB Tips', authorId: ObjectId('user1') }
// Comment documents (separate collection)
{ _id: ObjectId('c1'), postId: ObjectId('post1'), text: 'Great!', author: 'Jash' }
{ _id: ObjectId('c2'), postId: ObjectId('post1'), text: 'Thanks!', author: 'Sam' }
// ✅ Unbounded growth (comments collection can be huge)
// ✅ Can efficiently query all comments across posts
// ✅ Reusable data (e.g., user info shared across posts)
// ❌ Requires $lookup joins (slower than embedded)
// ❌ Not atomic across documents (need transactions)
Decision rules:
| Embed when | Reference when |
|---|---|
| Data is always accessed together | Data is accessed independently |
| 1-many with few items (< ~100) | 1-many with many/unbounded items |
| Nested data is rarely queried alone | Need to query nested data directly |
| You need atomic updates | Each document has independent lifecycle |
// Basic populate (foreign key reference)
const postSchema = new mongoose.Schema({
title: String,
author: { type: mongoose.Schema.Types.ObjectId, ref: 'User' } // ← reference
});
const post = await Post.findById(id)
.populate('author', 'name email') // select fields (can't mix inclusion and exclusion in one select)
.populate('comments.user', 'name'); // nested populate
// Chained populate
await Order.findById(id)
.populate({ path: 'items.product', select: 'name price' })
.populate({ path: 'customer', select: 'name email' });
// Virtual populate — reverse population without storing references in parent
const userSchema = new mongoose.Schema({ name: String }, { toJSON: { virtuals: true } });
userSchema.virtual('posts', {
ref: 'Post', // model to reference
localField: '_id', // field in User
foreignField: 'author',// field in Post that references User
justOne: false // array of posts
});
const user = await User.findById(id).populate('posts'); // runs a query on Post collection
// This keeps User document clean (no postIds array) while still fetching posts
// select: false on sensitive fields still respected with populate
userSchema.add({ password: { type: String, select: false } });
await Post.findById(id).populate('author'); // password NOT included
await Post.findById(id).populate({ path: 'author', select: '+password' }); // explicitly include
Critical for distributed systems — controls consistency vs availability tradeoff.
Write Concern — how many replica set members must acknowledge before the write is considered successful:
// w:1 — only the primary acknowledges. Fast but can lose data on failover
// (the server default is w:'majority' since MongoDB 5.0, w:1 before that)
await User.create({ name: 'Jash' }); // uses the connection's default write concern
// w:'majority' — most replicas acknowledge. Safer, slightly slower
await Transaction.create(data, { writeConcern: { w: 'majority' } });
// j:true — data written to journal before acknowledging. Survives crashes
await CriticalData.create(data, { writeConcern: { w: 'majority', j: true } });
// w:0 — fire and forget. Fastest, no acknowledgement
// Use only for non-critical data (metrics, logs)
await AnalyticsEvent.collection.insertOne(event, { writeConcern: { w: 0 } });
Read Preference — which replica set member to read from:
// primary (default): read from primary. Always fresh data
// primaryPreferred: primary if available, else secondary
// secondary: always read from secondary. May be stale!
// secondaryPreferred: secondary if available, else primary
// nearest: lowest latency member
// Read from secondary for heavy analytics (offload primary)
const reports = await Order.find({ createdAt: { $gte: lastMonth } })
.read('secondaryPreferred');
// Mongoose connection-level setting
mongoose.connect(uri, {
readPreference: 'secondaryPreferred',
readConcern: { level: 'majority' } // only read data acknowledged by majority
});
Change streams let you listen to real-time changes in collections — built on the oplog (operations log).
// Watch a collection for changes
const changeStream = Order.watch(
  [{ $match: { 'fullDocument.status': 'paid', operationType: 'update' } }],
  { fullDocument: 'updateLookup' } // without this, updates carry only the delta, not fullDocument
);
changeStream.on('change', async (change) => {
const { operationType, fullDocument, documentKey, updateDescription } = change;
if (operationType === 'insert') {
await sendOrderConfirmation(fullDocument.userId, fullDocument._id);
}
if (operationType === 'update') {
console.log('Modified fields:', updateDescription.updatedFields);
if (fullDocument.status === 'shipped') {
await sendShippingNotification(fullDocument);
}
}
if (operationType === 'delete') {
await cleanupOrderData(documentKey._id);
}
});
// Resume after restart using resumeToken
changeStream.on('change', (change) => {
lastResumeToken = change._id; // save this to DB/Redis
});
// Resume on reconnect
const resumableStream = Order.watch(pipeline, {
resumeAfter: lastResumeToken // pick up where you left off
});
// Watch entire database
const dbStream = mongoose.connection.db.watch();
// Watch entire cluster (via the underlying MongoClient)
const clusterStream = mongoose.connection.getClient().watch();
// Requirements: MongoDB 3.6+, replica set (or sharded cluster), no standalone
// Use case: real-time notifications, cache invalidation, audit logs, sync to Elasticsearch
[Client] [Server]
│ │
│──── POST /auth/login ─────>│
│ { email, password } │
│ │ 1. Validate credentials
│ │ 2. Create JWT (header.payload.signature)
│<─── { token, refreshToken }│
│ │
│── GET /api/profile ───────>│
│ Authorization: Bearer jwt │
│ │ 3. Verify JWT signature
│ │ 4. Decode payload (userId, role, exp)
│<─── { user data } ─────────│
// Server: Create JWT
const jwt = require('jsonwebtoken');
function createTokens(userId) {
const accessToken = jwt.sign(
{ userId, type: 'access' },
process.env.JWT_SECRET,
{ expiresIn: '15m' }
);
const refreshToken = jwt.sign(
{ userId, type: 'refresh' },
process.env.JWT_REFRESH_SECRET,
{ expiresIn: '7d' }
);
return { accessToken, refreshToken };
}
// Server: Verify JWT middleware
function authenticate(req, res, next) {
const token = req.headers.authorization?.split(' ')[1];
if (!token) return res.status(401).json({ error: 'No token provided' });
try {
const payload = jwt.verify(token, process.env.JWT_SECRET);
req.userId = payload.userId;
next();
} catch (err) {
if (err.name === 'TokenExpiredError') {
return res.status(401).json({ error: 'Token expired', code: 'TOKEN_EXPIRED' });
}
res.status(401).json({ error: 'Invalid token' });
}
}
CORS (Cross-Origin Resource Sharing): A browser security mechanism that restricts HTTP requests made from one origin to another.
An origin = protocol + domain + port. http://localhost:3000 ≠ http://localhost:5000
// Express CORS configuration
const cors = require('cors');
// Simple: allow all origins (don't use in production for sensitive APIs)
app.use(cors());
// Production: explicit allowlist
app.use(cors({
origin: ['https://app.mycompany.com', 'https://staging.mycompany.com'],
methods: ['GET', 'POST', 'PUT', 'PATCH', 'DELETE', 'OPTIONS'],
allowedHeaders: ['Content-Type', 'Authorization'],
credentials: true, // allow cookies
maxAge: 86400 // cache preflight for 24 hours
}));
# docker-compose.yml
version: '3.8'
services:
mongo:
image: mongo:7
environment:
MONGO_INITDB_ROOT_USERNAME: root
MONGO_INITDB_ROOT_PASSWORD: password
volumes:
- mongo_data:/data/db
ports:
- "27017:27017"
redis:
image: redis:7-alpine
ports:
- "6379:6379"
backend:
build: ./backend
environment:
- NODE_ENV=development
- DATABASE_URL=mongodb://root:password@mongo:27017/mydb
- REDIS_URL=redis://redis:6379
ports:
- "5000:5000"
depends_on:
- mongo
- redis
frontend:
build: ./frontend
ports:
- "3000:3000"
depends_on:
- backend
volumes:
mongo_data:
Cache-Aside (Lazy Loading) — most common:
const Redis = require('ioredis');
const client = new Redis(process.env.REDIS_URL);
async function getUser(userId) {
const cacheKey = `user:${userId}`;
// 1. Check cache first
const cached = await client.get(cacheKey);
if (cached) {
return JSON.parse(cached); // cache HIT
}
// 2. Cache MISS — fetch from DB
const user = await User.findById(userId).lean();
if (!user) return null;
// 3. Store in cache with TTL
await client.setex(cacheKey, 3600, JSON.stringify(user)); // TTL = 1 hour
return user;
}
// Cache invalidation on update
async function updateUser(userId, updates) {
const user = await User.findByIdAndUpdate(userId, updates, { new: true });
await client.del(`user:${userId}`); // evict stale cache
return user;
}
// Pattern 2: Write-Through — update cache on every write
async function updateUserWt(userId, updates) {
const user = await User.findByIdAndUpdate(userId, updates, { new: true }).lean();
await client.setex(`user:${userId}`, 3600, JSON.stringify(user));
return user;
}
// Pattern 3: Cache stampede prevention (many requests on cold cache)
async function getUserWithLock(userId) {
const cacheKey = `user:${userId}`;
const lockKey = `lock:${cacheKey}`;
const cached = await client.get(cacheKey);
if (cached) return JSON.parse(cached);
// Try to acquire lock (SET NX = set if not exists)
const lock = await client.set(lockKey, '1', 'EX', 5, 'NX');
if (!lock) {
// Another request is fetching — wait and retry
await new Promise(r => setTimeout(r, 100));
return getUserWithLock(userId);
}
const user = await User.findById(userId).lean();
await client.setex(cacheKey, 3600, JSON.stringify(user));
await client.del(lockKey);
return user;
}
Other Redis use cases:
// Leaderboard with sorted sets
await client.zadd('leaderboard', score, userId);
await client.zrevrange('leaderboard', 0, 9, 'WITHSCORES'); // top 10
// Rate limiting with a fixed window (one counter per time bucket)
async function isRateLimited(ip, limit = 100, windowSecs = 60) {
  const key = `rl:${ip}:${Math.floor(Date.now() / (windowSecs * 1000))}`;
  const count = await client.incr(key);
  if (count === 1) await client.expire(key, windowSecs); // set TTL only on first hit
  return count > limit;
}
// Session storage
await client.setex(`session:${sessionId}`, 86400, JSON.stringify(sessionData));
// Pub/Sub for real-time
const Redis = require('ioredis'); // a connection in subscribe mode can't run other commands — use separate clients
const subscriber = new Redis();
const publisher = new Redis();
subscriber.subscribe('notifications');
subscriber.on('message', (channel, message) => io.emit('notification', JSON.parse(message)));
publisher.publish('notifications', JSON.stringify({ userId, text: 'Order shipped!' }));
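The Redis counter above is a fixed window; for contrast, here is a single-process, in-memory sliding-window sketch (hypothetical — fine for one Node instance, but use Redis for anything distributed):

```javascript
// Sliding window: keep recent hit timestamps per key, count those in-window.
function createRateLimiter({ limit = 100, windowMs = 60_000 } = {}) {
  const hits = new Map(); // key -> timestamps of recent hits
  return function isLimited(key, now = Date.now()) {
    const recent = (hits.get(key) ?? []).filter((t) => now - t < windowMs);
    recent.push(now);
    hits.set(key, recent);
    return recent.length > limit;
  };
}

const isLimited = createRateLimiter({ limit: 3, windowMs: 1000 });
console.log(isLimited('1.2.3.4', 0));    // false (1st hit)
console.log(isLimited('1.2.3.4', 100));  // false (2nd)
console.log(isLimited('1.2.3.4', 200));  // false (3rd)
console.log(isLimited('1.2.3.4', 300));  // true  (4th within the 1s window)
console.log(isLimited('1.2.3.4', 2000)); // false (old hits slid out of the window)
```

Unlike the fixed window, this never allows a burst of 2× the limit across a bucket boundary, at the cost of storing a timestamp per hit.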
Offload slow/heavy tasks (email sending, image processing, report generation) from the HTTP request-response cycle.
const { Queue, Worker, QueueEvents } = require('bullmq');
const redis = { host: 'localhost', port: 6379 };
// Producer — add jobs to queue
const emailQueue = new Queue('emails', { connection: redis });
app.post('/orders', async (req, res) => {
const order = await Order.create(req.body);
// Enqueueing is fast (one Redis write) — the slow email work happens in the worker
await emailQueue.add('order-confirmation', {
to: order.userEmail,
orderId: order._id,
items: order.items
}, {
attempts: 3, // retry up to 3 times on failure
backoff: { type: 'exponential', delay: 2000 }, // 2s, 4s, 8s
delay: 0, // process immediately
removeOnComplete: 100 // keep last 100 completed jobs
});
res.status(201).json(order); // fast response
});
// Consumer — process jobs (separate process/worker)
const emailWorker = new Worker('emails', async (job) => {
const { to, orderId, items } = job.data;
// Can take as long as needed — doesn't block HTTP server
await sendTemplateEmail(to, 'order-confirmation', { orderId, items });
await Analytics.track('email_sent', { orderId });
return { sent: true, timestamp: Date.now() }; // result stored in Redis
}, { connection: redis, concurrency: 5 }); // 5 parallel workers
emailWorker.on('completed', (job, result) => console.log(`Job ${job.id} done:`, result));
emailWorker.on('failed', (job, err) => console.error(`Job ${job.id} failed:`, err));
// Scheduled jobs (cron)
const reportQueue = new Queue('reports', { connection: redis });
await reportQueue.add('daily-report', {}, {
repeat: { pattern: '0 8 * * *' } // every day at 8am (BullMQ v3+ uses 'pattern'; older versions used 'cron')
});
Resource naming:
✅ GET /users — list users
✅ POST /users — create user
✅ GET /users/:id — get one user
✅ PATCH /users/:id — partial update (preferred over PUT for partials)
✅ PUT /users/:id — full replace
✅ DELETE /users/:id — delete user
✅ GET /users/:id/orders — sub-resource (user's orders)
❌ GET /getUsers — verb in URL
❌ POST /users/delete/:id — method in URL
❌ GET /userOrders?userId=123 — should be /users/123/orders
Response consistency:
// Success response structure
res.status(200).json({
success: true,
data: { user },
meta: { page: 1, total: 100, limit: 10 }
});
// Error response structure
res.status(400).json({
success: false,
error: {
code: 'VALIDATION_ERROR',
message: 'Email is invalid',
field: 'email' // for field-level errors
}
});
// Pagination pattern
// GET /users?page=2&limit=20&sortBy=createdAt&order=desc
router.get('/users', async (req, res) => {
const { page = 1, limit = 20, sortBy = 'createdAt', order = 'desc' } = req.query;
const skip = (page - 1) * limit;
const [users, total] = await Promise.all([
User.find().sort({ [sortBy]: order }).skip(skip).limit(Number(limit)).lean(),
User.countDocuments()
]);
res.json({
data: users,
meta: { page: Number(page), limit: Number(limit), total, pages: Math.ceil(total / limit) }
});
});
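Query params arrive as strings and should be clamped before hitting the database; a small helper (hypothetical, extracted from the inline logic above) makes that explicit:

```javascript
// Hypothetical helper — normalize ?page=&limit= into safe numbers plus skip.
function paginate(query, { defaultLimit = 20, maxLimit = 100 } = {}) {
  const page = Math.max(1, parseInt(query.page, 10) || 1);        // floor at 1
  const limit = Math.min(maxLimit,                                // cap huge limits
    Math.max(1, parseInt(query.limit, 10) || defaultLimit));
  return { page, limit, skip: (page - 1) * limit };
}

console.log(paginate({ page: '2', limit: '20' })); // { page: 2, limit: 20, skip: 20 }
console.log(paginate({ page: '-5' }));             // { page: 1, limit: 20, skip: 0 }
console.log(paginate({ limit: '9999' }));          // { page: 1, limit: 100, skip: 0 }
```

Clamping limit matters: without it, `?limit=1000000` lets any client force a full-collection scan.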
// Strategy 1: URL path versioning (most common, most visible)
app.use('/api/v1', v1Router);
app.use('/api/v2', v2Router);
// GET /api/v1/users vs GET /api/v2/users
// Strategy 2: Header versioning (cleaner URLs)
app.use('/api', (req, res, next) => {
const version = req.headers['api-version'] || 'v1';
req.apiVersion = version;
next();
});
// Strategy 3: Query param (rarely used)
// GET /api/users?version=2
// Best practices:
// - Never break existing clients (add, don't change)
// - Deprecate with headers: Deprecation: true, Sunset: Wed, 31 Dec 2025 00:00:00 GMT
// - Version only when you have breaking changes
// - Keep v1 working for at least 1 year after v2 launch
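The Deprecation/Sunset advice above can be wired up as Express middleware. A sketch (the Sunset date comes from the note above; route names are illustrative):

```javascript
// Sketch: mark every /api/v1 response as deprecated via headers
function deprecateV1(req, res, next) {
  res.set('Deprecation', 'true');
  res.set('Sunset', 'Sat, 31 Dec 2025 00:00:00 GMT');
  res.set('Link', '</api/v2>; rel="successor-version"');
  next();
}

// app.use('/api/v1', deprecateV1, v1Router);
```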
Short-lived access tokens + long-lived refresh tokens = security + good UX.
// Full implementation
// Login — issue both tokens
async function login(email, password) {
const user = await User.findOne({ email }).select('+password');
if (!user || !(await user.correctPassword(password))) {
throw new UnauthorizedError('Invalid credentials');
}
const accessToken = jwt.sign(
{ userId: user._id, role: user.role },
process.env.JWT_SECRET,
{ expiresIn: '15m' } // short-lived!
);
const refreshToken = jwt.sign(
{ userId: user._id, tokenFamily: crypto.randomUUID() },
process.env.JWT_REFRESH_SECRET,
{ expiresIn: '7d' }
);
// Store refresh token hash in DB (not plaintext)
await RefreshToken.create({
tokenHash: crypto.createHash('sha256').update(refreshToken).digest('hex'),
userId: user._id,
expiresAt: new Date(Date.now() + 7 * 24 * 60 * 60 * 1000)
});
// Return refresh token in httpOnly cookie, access token in body
return { accessToken, refreshToken };
}
// Refresh — issue new access token, rotate refresh token
async function refreshTokens(oldRefreshToken) {
let payload;
try {
payload = jwt.verify(oldRefreshToken, process.env.JWT_REFRESH_SECRET);
} catch {
throw new UnauthorizedError('Invalid refresh token');
}
const tokenHash = crypto.createHash('sha256').update(oldRefreshToken).digest('hex');
const stored = await RefreshToken.findOne({ tokenHash, userId: payload.userId });
if (!stored) {
// Token reuse detected! Invalidate ALL tokens for this user (potential theft)
await RefreshToken.deleteMany({ userId: payload.userId });
throw new UnauthorizedError('Refresh token reuse detected');
}
// Rotate: delete old, issue new
await stored.deleteOne();
const newAccessToken = jwt.sign({ userId: payload.userId }, process.env.JWT_SECRET, { expiresIn: '15m' });
const newRefreshToken = jwt.sign({ userId: payload.userId }, process.env.JWT_REFRESH_SECRET, { expiresIn: '7d' });
await RefreshToken.create({
tokenHash: crypto.createHash('sha256').update(newRefreshToken).digest('hex'),
userId: payload.userId,
expiresAt: new Date(Date.now() + 7 * 24 * 60 * 60 * 1000)
});
return { accessToken: newAccessToken, refreshToken: newRefreshToken };
}
// Logout — invalidate refresh token
async function logout(refreshToken) {
const tokenHash = crypto.createHash('sha256').update(refreshToken).digest('hex');
await RefreshToken.deleteOne({ tokenHash });
}
1. Injection (NoSQL Injection)
// VULNERABLE: using user input directly in query
const user = await User.findOne({ username: req.body.username });
// If username = { $gt: '' } → logs in as first user!
// SAFE: use mongoose validation or sanitize
const username = String(req.body.username); // force string
const user = await User.findOne({ username });
// Or use express-mongo-sanitize middleware
const mongoSanitize = require('express-mongo-sanitize');
app.use(mongoSanitize());
2. Broken Authentication (weak passwords, no rate limiting, long-lived sessions; mitigated by bcrypt hashing, login rate limiting, and the short-lived token + rotation flow above)
3. XSS in React
// React auto-escapes by default — this is SAFE
<div>{userInput}</div>
// dangerouslySetInnerHTML — DANGEROUS, avoid or sanitize
<div dangerouslySetInnerHTML={{ __html: sanitizeHtml(userInput) }} />
4. CSRF
Mitigate with SameSite=Strict cookies or CSRF tokens.
5. Security Headers (use Helmet)
const helmet = require('helmet');
app.use(helmet()); // sets 15+ security headers
// Content-Security-Policy, X-Frame-Options, X-XSS-Protection, etc.
const bcrypt = require('bcryptjs');
// Hashing (at registration)
const SALT_ROUNDS = 12; // higher = slower but more secure
const hash = await bcrypt.hash(plainPassword, SALT_ROUNDS);
// Verifying (at login)
const isMatch = await bcrypt.compare(plainPassword, hash);
// Why bcrypt? It's adaptive (can increase cost factor), includes salt automatically
// Never: MD5, SHA1 for passwords (too fast, GPU-crackable)
// Good: bcrypt, argon2, scrypt
This is one of the most debated security decisions in MERN:
| | localStorage / sessionStorage | httpOnly Cookie |
|---|---|---|
| XSS vulnerability | ❌ Yes — JS can read it | ✅ No — JS cannot read it |
| CSRF vulnerability | ✅ No — can't be auto-sent | ❌ Yes — auto-sent with requests |
| SSR/SSG | ❌ Not available on server | ✅ Available in Server Components |
| Sent automatically | ❌ No — must set header manually | ✅ Yes — browser sends automatically |
| Size limit | ~5MB | ~4KB per cookie |
| Recommended for | Non-sensitive data | Access tokens and refresh tokens |
Best practice recommendation:
- Refresh token: httpOnly; Secure; SameSite=Strict cookie
- Access token: kept in memory on the client
// Interceptor to refresh expired access token (React + Axios)
let accessToken = null; // in memory — clears on tab close
axios.interceptors.request.use(config => {
if (accessToken) config.headers.Authorization = `Bearer ${accessToken}`;
return config;
});
axios.interceptors.response.use(
res => res,
async error => {
if (error.response?.status === 401 && error.config && !error.config._retry) {
error.config._retry = true;
try {
const res = await axios.post('/api/auth/refresh', {}, { withCredentials: true });
// withCredentials: true → sends httpOnly refresh token cookie
accessToken = res.data.accessToken;
error.config.headers.Authorization = `Bearer ${accessToken}`;
return axios(error.config); // retry original request
} catch {
accessToken = null;
window.location.href = '/login';
}
}
return Promise.reject(error);
}
);
const https = require('https');
const fs = require('fs');
// Self-signed cert (development only):
// openssl req -x509 -newkey rsa:4096 -keyout key.pem -out cert.pem -days 365 -nodes
const options = {
key: fs.readFileSync('key.pem'),
cert: fs.readFileSync('cert.pem'),
};
https.createServer(options, app).listen(443, () => {
console.log('HTTPS server running on port 443');
});
// Redirect HTTP to HTTPS
const http = require('http');
http.createServer((req, res) => {
res.writeHead(301, { Location: `https://${req.headers.host}${req.url}` });
res.end();
}).listen(80);
// In production: NEVER terminate TLS in Node directly
// Use nginx/Cloudflare/load balancer for TLS termination — much more performant
// Node only handles plain HTTP behind the reverse proxy
// TLS concepts:
// TLS 1.3 — current standard (faster, more secure than 1.2)
// Certificate = public key + identity verified by CA
// HSTS header: app.use(helmet.hsts({ maxAge: 31536000 })) — force HTTPS for 1 year
// Certificate pinning: verifies specific cert fingerprint (mobile apps)
// CSP prevents XSS by telling browsers which sources are valid for each resource type
app.use(helmet.contentSecurityPolicy({
directives: {
defaultSrc: ["'self'"], // default: only same origin
scriptSrc: ["'self'", "'nonce-{nonce}'", "https://cdn.jsdelivr.net"],
styleSrc: ["'self'", "'unsafe-inline'", "https://fonts.googleapis.com"],
imgSrc: ["'self'", "data:", "https:"],
fontSrc: ["'self'", "https://fonts.gstatic.com"],
connectSrc: ["'self'", "https://api.mycompany.com"],
frameSrc: ["'none'"], // prevent clickjacking
objectSrc: ["'none'"], // no plugins
upgradeInsecureRequests: [], // auto-upgrade http: to https:
}
}));
// Nonce-based CSP (allows only scripts with the matching nonce)
app.use((req, res, next) => {
res.locals.cspNonce = crypto.randomBytes(16).toString('base64');
next();
});
// In template: <script nonce="<%= cspNonce %>">...</script>
// Other Helmet headers:
// X-Frame-Options: SAMEORIGIN → prevent clickjacking
// X-Content-Type-Options: nosniff → prevent MIME sniffing
// Referrer-Policy: strict-origin-when-cross-origin
// Permissions-Policy: camera=(), microphone=(), geolocation=()
E2E Tests (Playwright, Cypress)
↑ few, slow, expensive, high confidence
Integration Tests
↑ some, test service + DB + API
Unit Tests (Jest, Vitest)
↑ many, fast, cheap, isolated
Unit testing Express routes:
// Using Jest + Supertest
const request = require('supertest');
const app = require('../app');
describe('GET /api/users/:id', () => {
it('should return 404 for non-existent user', async () => {
const res = await request(app)
.get('/api/users/507f1f77bcf86cd799439011') // valid-format ObjectId that matches nothing (an invalid id would trigger a CastError, not a 404)
.set('Authorization', `Bearer ${testToken}`);
expect(res.status).toBe(404);
expect(res.body.error).toBe('User not found');
});
});
React Testing Library:
import { render, screen, fireEvent, waitFor } from '@testing-library/react';
import Login from './Login';
test('shows error on invalid login', async () => {
render(<Login />);
fireEvent.change(screen.getByLabelText('Email'), {
target: { value: 'invalid@test.com' }
});
fireEvent.click(screen.getByRole('button', { name: /login/i }));
await waitFor(() => {
expect(screen.getByText('Invalid credentials')).toBeInTheDocument();
});
});
// 1. Mock a module
jest.mock('../services/emailService', () => ({
sendEmail: jest.fn().mockResolvedValue({ messageId: 'test-123' })
}));
// 2. Mock only specific functions
jest.mock('../services/userService');
const userService = require('../services/userService');
userService.findById.mockResolvedValue({ _id: '123', name: 'Jash' });
// 3. Spy on (and optionally mock) existing function
const spy = jest.spyOn(bcrypt, 'compare').mockResolvedValue(true);
// Verify it was called
expect(spy).toHaveBeenCalledWith('password', 'hashedPassword');
spy.mockRestore(); // restore original
// 4. Mock environment variables
process.env.JWT_SECRET = 'test-secret';
// Or in jest.config.js:
// testEnvironment: 'node', setupFiles: ['./test/setup.js']
// 5. Timers (setTimeout, setInterval, Date)
jest.useFakeTimers();
jest.setSystemTime(new Date('2025-01-01'));
jest.advanceTimersByTime(5000); // advance 5 seconds
jest.useRealTimers();
// Full route test example with mocking
const request = require('supertest');
const app = require('../app');
const User = require('../models/User');
jest.mock('../models/User');
describe('POST /api/auth/login', () => {
beforeEach(() => {
jest.clearAllMocks();
});
it('should return 401 for wrong password', async () => {
User.findOne.mockResolvedValue({
_id: 'userId',
email: 'test@test.com',
correctPassword: jest.fn().mockResolvedValue(false),
});
const res = await request(app)
.post('/api/auth/login')
.send({ email: 'test@test.com', password: 'wrongpass' });
expect(res.status).toBe(401);
expect(res.body.error.code).toBe('UNAUTHORIZED');
});
it('should return 200 with tokens on valid credentials', async () => {
User.findOne.mockResolvedValue({
_id: 'userId',
email: 'test@test.com',
role: 'user',
correctPassword: jest.fn().mockResolvedValue(true),
});
const res = await request(app)
.post('/api/auth/login')
.send({ email: 'test@test.com', password: 'validpass' });
expect(res.status).toBe(200);
expect(res.body).toHaveProperty('accessToken');
});
});
// jest.config.js
module.exports = {
preset: '@shelf/jest-mongodb', // OR use mongodb-memory-server directly
testEnvironment: 'node',
globalSetup: './test/globalSetup.js',
globalTeardown: './test/globalTeardown.js',
};
// test/setup.js — using mongodb-memory-server
const { MongoMemoryServer } = require('mongodb-memory-server');
const mongoose = require('mongoose');
let mongod;
beforeAll(async () => {
mongod = await MongoMemoryServer.create();
await mongoose.connect(mongod.getUri());
});
afterEach(async () => {
// Clean all collections between tests
const collections = mongoose.connection.collections;
await Promise.all(Object.values(collections).map(c => c.deleteMany({})));
});
afterAll(async () => {
await mongoose.disconnect();
await mongod.stop();
});
// Now write tests that interact with real Mongoose/MongoDB
describe('UserService', () => {
it('should hash password before saving', async () => {
const user = await User.create({ name: 'Jash', email: 'j@j.com', password: 'plain123' });
expect(user.password).not.toBe('plain123');
expect(user.password).toHaveLength(60); // bcrypt hash length
});
it('should enforce unique email', async () => {
await User.create({ name: 'A', email: 'dup@test.com', password: 'pass' });
await expect(
User.create({ name: 'B', email: 'dup@test.com', password: 'pass' })
).rejects.toThrow(/duplicate key/i);
});
});
1. Browser checks cache (memory cache, disk cache)
2. DNS resolution: domain name → IP address
- Browser DNS cache → OS DNS cache → Router/ISP DNS → Recursive DNS resolver → Authoritative NS
3. TCP handshake: SYN → SYN-ACK → ACK
4. TLS handshake (for HTTPS): negotiates encryption, verifies certificate
5. HTTP request: GET / HTTP/1.1 (or HTTP/2 multiplexed)
6. Server processes request
- Hits load balancer → routes to app server
- Middleware (auth, rate limiting, logging)
- Route handler → business logic
- Database query → cache check first
- Build response
7. HTTP response: 200 OK with HTML
8. Browser parsing:
- Parse HTML → build DOM tree
- Parse CSS → build CSSOM
- DOM + CSSOM → Render tree
- Execute JavaScript (may modify DOM)
9. Critical Rendering Path → First Paint
10. React hydration (for SSR) or React takes over (for CSR)
Entities: URL { id, shortCode, originalUrl, userId, createdAt, expiresAt, clickCount }
API:
POST /api/shorten → create short URL
GET /:code → redirect to original URL
GET /api/stats/:code → analytics
Key decisions:
1. Short code generation:
- Base62 encoding of auto-increment ID (predictable, but simple)
- crypto.randomBytes → base62 encoding (6 chars: 62^6 ≈ 56 billion codes)
- Hash of URL (collision risk, handle with counter suffix)
2. Redirect:
app.get('/:code', async (req, res) => {
const cached = await redis.get(`url:${req.params.code}`);
if (cached) {
// fire-and-forget analytics
updateClickCount(req.params.code);
return res.redirect(302, cached); // 302, not 301: browsers cache 301 permanently and bypass the server, losing click analytics
}
const url = await URL.findOne({ shortCode: req.params.code });
if (!url) return res.status(404).send('Not found');
await redis.setex(`url:${req.params.code}`, 3600, url.originalUrl);
res.redirect(302, url.originalUrl); // 302 keeps per-click analytics accurate
});
3. Analytics: Use message queue (or fire-and-forget async call) to avoid slowing redirect
4. Expiration: TTL index on expiresAt in MongoDB
5. Rate limiting: per user/IP for creating URLs
// Q1: What is the output?
for (var i = 0; i < 3; i++) {
setTimeout(() => console.log(i), 0);
}
// Output: 3, 3, 3 (var is function-scoped; all three callbacks share the same i)
// Fix: use let — block-scoped, each iteration gets its own i → 0, 1, 2
// Q2:
console.log(typeof null);
// Output: "object" (historical JS bug)
// Q3:
console.log(0.1 + 0.2 === 0.3);
// Output: false (floating point precision issue)
// Fix: Math.abs(0.1 + 0.2 - 0.3) < Number.EPSILON
// Q4:
const obj = {};
obj.a = 1;
obj.b = 2;
delete obj.a;
console.log(Object.keys(obj));
// Output: ["b"]
// Q5: Explain the output
async function foo() {
return 1;
}
console.log(foo());
// Output: Promise { 1 } — async function always returns a Promise
// Q6:
console.log([] + []); // "" (both coerce to "", "" + "" = "")
console.log({} + []); // "[object Object]"
console.log([] + {}); // "[object Object]"
console.log(+[]); // 0 ([] → "" → 0)
console.log(+{}); // NaN
Q: What triggers a re-render in React?
- setState (class) / state setter (hooks) is called
- Parent re-renders (unless the child is memoized with React.memo)
- useContext value changes (all consumers re-render)
- forceUpdate() (class components)
Q: Does re-render mean DOM update?
No. Re-render means React re-executes the component function to get the new vDOM. Then it diffs and only commits actual changes to the real DOM if something changed.
Q: What is React Fiber?
React Fiber is the reimplementation of React's core reconciliation algorithm (introduced in React 16). It enables:
- Incremental rendering: splitting rendering work into units that can be spread across frames
- Pausing, aborting, or reusing work as new updates arrive
- Priority-based scheduling of updates (the foundation for concurrent features)
Q: What are React's concurrent features?
- startTransition — mark updates as non-urgent (can be interrupted)
- useDeferredValue — defer updating a value until higher-priority updates settle
- useTransition — track if a transition is pending
import { startTransition, useTransition, useDeferredValue } from 'react';
// Heavy filter operation — mark as non-urgent
function SearchResults({ query }) {
const deferredQuery = useDeferredValue(query); // lags behind query
const results = useMemo(() => filterResults(deferredQuery), [deferredQuery]);
return <ResultsList items={results} />;
}
Q: How does Node.js handle 10,000 concurrent connections?
Node.js uses a non-blocking, event-driven architecture. It doesn't create a new thread per connection. Instead:
- A single event loop multiplexes all connections
- I/O (network, file) is delegated to the OS (epoll/kqueue) or to libuv's thread pool
- When an operation completes, its callback is queued and run on the main thread
- Each idle connection costs only a socket handle plus its callbacks, so tens of thousands are cheap
Q: What is the difference between process.nextTick and setImmediate?
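A runnable demo of the difference:

```javascript
// Demonstrates ordering: sync code, then the nextTick queue,
// then promise microtasks, then setImmediate in the check phase
const order = [];
setImmediate(() => order.push('setImmediate'));
Promise.resolve().then(() => order.push('promise'));
process.nextTick(() => order.push('nextTick'));
order.push('sync');

setImmediate(() => {
  // Registered last, so it runs after everything above has fired
  console.log(order.join(' → ')); // sync → nextTick → promise → setImmediate
});
```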
- process.nextTick: runs after the current operation completes, BEFORE the event loop continues to the next phase. Greedy — nested nextTicks can starve the event loop
- setImmediate: runs in the check phase of the current event loop iteration — after I/O callbacks
Q: How do you prevent callback hell?
- Promises with .then() chaining
- async/await (cleanest)
- Named functions instead of deeply nested anonymous callbacks
# Built-in inspector
node --inspect server.js # default port 9229
node --inspect-brk server.js # break on first line
# VS Code: attach to process in launch.json with "attach" type
# Chrome DevTools: chrome://inspect → Open dedicated DevTools for Node
# Common debugging techniques:
# 1. console.log (basic, not production)
# 2. VS Code debugger with breakpoints
# 3. node --inspect + Chrome DevTools
# 4. Structured logging: Winston, Pino
# 5. APM tools: Datadog, New Relic, OpenTelemetry
| Package | Purpose |
|---|---|
| `express` | Web framework |
| `mongoose` | MongoDB ODM |
| `dotenv` | Load .env variables |
| `jsonwebtoken` | JWT create/verify |
| `bcryptjs` | Password hashing |
| `cors` | CORS middleware |
| `helmet` | Security headers |
| `express-rate-limit` | API rate limiting |
| `morgan` | HTTP request logger |
| `joi` / `zod` | Input validation |
| `multer` | File upload handling |
| `nodemon` | Auto-restart in dev |
| `jest` + `supertest` | Backend testing |
| `axios` | HTTP client |
| `socket.io` | WebSockets |
| `bull` | Job/task queues |
| `redis` / `ioredis` | Redis client |
| `winston` / `pino` | Production logging |
// Pattern 1: try/catch (verbose but clear)
async function handler(req, res) {
try {
const user = await User.findById(req.params.id);
res.json(user);
} catch (err) {
res.status(500).json({ error: err.message });
}
}
// Pattern 2: Wrapper utility (DRY, clean routes)
const asyncHandler = (fn) => (req, res, next) =>
Promise.resolve(fn(req, res, next)).catch(next);
app.get('/users/:id', asyncHandler(async (req, res) => {
const user = await User.findById(req.params.id);
if (!user) throw { status: 404, message: 'User not found' };
res.json(user);
}));
// Centralized error handler
app.use((err, req, res, next) => {
const status = err.status || err.statusCode || 500;
const message = err.message || 'Internal Server Error';
res.status(status).json({ error: message });
});
// Pattern 3: Go-style [error, result] tuple
async function to(promise) {
try {
const data = await promise;
return [null, data];
} catch (err) {
return [err, null];
}
}
const [err, user] = await to(User.findById(id));
if (err) return res.status(500).json({ error: err.message });
Database:
✅ Create indexes for all queried fields
✅ Use projection (select only needed fields)
✅ Use aggregation pipeline efficiently (filter early with $match)
✅ Enable connection pooling (Mongoose does this by default)
✅ Use lean() for read-only Mongoose queries (plain JS object, not Mongoose doc)
User.find().lean() // 2-5x faster, less memory
Caching:
✅ Redis for frequently-read, rarely-changed data
✅ Cache invalidation strategy (TTL + event-driven)
✅ HTTP caching headers (Cache-Control, ETags)
API:
✅ Pagination for all list endpoints
✅ Rate limiting to prevent abuse
✅ Compression (gzip): app.use(require('compression')())
✅ Response streaming for large data
Node.js:
✅ Cluster mode or PM2 for multi-core utilization
✅ Avoid blocking the event loop (crypto, heavy math → worker thread)
✅ Use streaming for large file operations
✅ Proper error handling (unhandled rejections crash Node!)
// Server
const io = require('socket.io')(server, {
cors: { origin: 'http://localhost:3000' }
});
io.on('connection', (socket) => {
console.log(`Client connected: ${socket.id}`);
// Join a room
socket.on('join-room', (roomId) => {
socket.join(roomId);
});
// Broadcast to room
socket.on('message', ({ roomId, text }) => {
io.to(roomId).emit('new-message', { text, from: socket.id });
});
socket.on('disconnect', () => {
console.log(`Client disconnected: ${socket.id}`);
});
});
// Client (React)
import { io } from 'socket.io-client';
const socket = io('http://localhost:5000');
useEffect(() => {
socket.emit('join-room', roomId);
socket.on('new-message', (msg) => {
setMessages(prev => [...prev, msg]);
});
return () => {
socket.off('new-message');
};
}, [roomId]);
Hydration is the process where React attaches event listeners and makes server-rendered HTML interactive on the client. React doesn't re-create the DOM — it "walks" the existing HTML and matches it to the component tree.
Flow for SSR/Next.js:
1. Server renders HTML string → sends to browser
2. Browser paints the HTML (First Contentful Paint — fast!)
3. React bundle loads on client
4. React "hydrates" — walks existing DOM, matches to vDOM, attaches event handlers
5. App is now fully interactive
Hydration mismatch — when server HTML doesn't match what React tries to render on client:
// ❌ Common causes:
// 1. Math.random() or Date.now() in render — different each time
function Avatar() {
return <img src={`/avatar?v=${Math.random()}`} />; // different on server vs client
}
// 2. typeof window check — window doesn't exist on server
function Component() {
return <div>{window.innerWidth > 768 ? 'Desktop' : 'Mobile'}</div>; // crashes on server
}
// 3. Browser-only APIs in initial render
function Component() {
const saved = localStorage.getItem('prefs'); // not available on server
return <div>{saved}</div>;
}
// ✅ Fix 1: useEffect (only runs on client)
function Component() {
const [width, setWidth] = useState(0);
useEffect(() => setWidth(window.innerWidth), []); // runs after hydration
return <div>{width > 768 ? 'Desktop' : 'Mobile'}</div>;
}
// ✅ Fix 2: suppressHydrationWarning (only for intentional mismatches)
<time suppressHydrationWarning>{new Date().toLocaleTimeString()}</time>
// ✅ Fix 3: Dynamic imports with ssr: false (Next.js)
const BrowserOnlyChart = dynamic(() => import('./Chart'), { ssr: false });
// Common causes:
// 1. EventEmitter without cleanup — most common leak
function createServer() {
emitter.on('data', processData); // ← added every call, never removed
// fix: emitter.once() or emitter.removeListener() when done
}
// 2. Closures holding large data
let cache = [];
setInterval(() => {
cache.push(new Array(1000000).fill('*')); // grows forever
// fix: bound cache size or use WeakMap
}, 100);
// 3. Global variable accumulation
global.requests = []; // grows forever if requests aren't removed
// fix: use bounded data structures, WeakRef for non-critical references
// 4. Forgotten promise callbacks
const promises = [];
for (let i = 0; i < 10000; i++) {
promises.push(fetch('/api')); // never awaited, pending promises accumulate
}
// Detection tools:
// node --inspect=9229 server.js → Chrome DevTools → Memory tab → Heap Snapshot
// node --expose-gc → manual gc(), compare heap sizes
// clinic.js (by NearForm): clinic doctor -- node server.js
// heapdump package: heapdump.writeSnapshot('./heap-' + Date.now() + '.heapsnapshot')
process.memoryUsage();
// { rss: 45MB, heapTotal: 20MB, heapUsed: 15MB, external: 1MB, arrayBuffers: 0.5MB }
// Track this over time — growing heapUsed without plateau = leak
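The "bound cache size" fix in cause 2 can be sketched with a Map that evicts its oldest entry. A minimal illustration, not a full LRU implementation:

```javascript
// Map preserves insertion order, so the first key is always the oldest
class BoundedCache {
  constructor(maxSize = 100) {
    this.maxSize = maxSize;
    this.map = new Map();
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key); // re-insert to refresh position
    this.map.set(key, value);
    if (this.map.size > this.maxSize) {
      this.map.delete(this.map.keys().next().value); // evict the oldest entry
    }
  }
  get(key) {
    return this.map.get(key);
  }
}
```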
// Race condition: two async operations that can interfere with each other
// Example: check-then-act (classic TOCTOU race)
async function transferFunds(fromId, toId, amount) {
const from = await Account.findById(fromId);
if (from.balance < amount) throw new Error('Insufficient funds');
// ← RACE: another transfer could run here, depleting balance!
await Account.updateOne({ _id: fromId }, { $inc: { balance: -amount } });
await Account.updateOne({ _id: toId }, { $inc: { balance: +amount } });
}
// Fix 1: Atomic MongoDB operation
async function transferFundsSafe(fromId, toId, amount) {
const session = await mongoose.startSession();
session.startTransaction();
try {
const from = await Account.findOneAndUpdate(
{ _id: fromId, balance: { $gte: amount } }, // only update if sufficient
{ $inc: { balance: -amount } },
{ session, new: true }
);
if (!from) throw new Error('Insufficient funds');
await Account.updateOne({ _id: toId }, { $inc: { balance: amount } }, { session });
await session.commitTransaction();
} catch (e) {
await session.abortTransaction();
throw e;
} finally {
session.endSession();
}
}
// React race condition: stale response (user types fast, earlier response arrives last)
useEffect(() => {
let isCancelled = false;
async function search() {
const results = await fetchResults(query);
if (!isCancelled) setResults(results); // ← only update if still relevant
}
search();
return () => { isCancelled = true; }; // cancel on new query or unmount
}, [query]);
// Without pooling: creating a new connection per request is EXPENSIVE
// (TCP handshake, auth handshake, ~100ms overhead per connection)
// Mongoose handles pooling automatically
mongoose.connect(uri, {
maxPoolSize: 10, // max concurrent connections (driver default: 100)
minPoolSize: 2, // keep at least 2 connections open
serverSelectionTimeoutMS: 5000,
socketTimeoutMS: 45000,
connectTimeoutMS: 10000,
});
// Next.js / Serverless gotcha: each serverless invocation creates a NEW process
// Solution: cache the Mongoose connection in module-level variable
let cached = global._mongoose;
if (!cached) cached = global._mongoose = { conn: null, promise: null };
export async function connectDB() {
if (cached.conn) return cached.conn; // reuse existing connection
if (!cached.promise) {
cached.promise = mongoose.connect(process.env.MONGODB_URI, { maxPoolSize: 10 });
}
cached.conn = await cached.promise;
return cached.conn;
}
// Call this in every API route/Server Component — creates connection once, reuses it
Requirements: real-time in-app, email, push notifications with user preferences
Architecture:
Any Service → Message Queue (BullMQ) → Notification Worker
├── WebSocket (socket.io) → real-time
├── Email (SendGrid)
└── Push (FCM)
Database:
notifications: { _id, userId, type, title, body, data, read: false, createdAt }
notification_prefs: { userId, type, channels: ['email', 'push', 'in_app'] }
API:
GET /api/notifications?unread=true&page=1
PATCH /api/notifications/:id/read
PATCH /api/notifications/read-all
const notifWorker = new Worker('notifications', async (job) => {
const { userId, type, data } = job.data;
const prefs = await NotifPrefs.findOne({ userId, type });
if (!prefs) return; // user disabled this type
if (prefs.channels.includes('in_app')) {
const notif = await Notification.create({ userId, type, ...data });
const socketId = await redis.get(`socket:${userId}`);
if (socketId) io.to(socketId).emit('notification', notif);
}
if (prefs.channels.includes('email')) await emailService.send(userId, type, data);
if (prefs.channels.includes('push')) await fcm.send(userId, data);
}, { connection: redis });
# Cluster mode (one process per CPU core)
pm2 start server.js --name api -i max
pm2 list # see all processes
pm2 logs api # stream logs
pm2 monit # live monitoring dashboard
pm2 reload api # zero-downtime reload (cluster mode)
# ecosystem.config.js
module.exports = {
apps: [{
name: 'api',
script: 'server.js',
instances: 'max',
exec_mode: 'cluster',
max_memory_restart: '1G', // restart if memory exceeds 1GB
env_production: { NODE_ENV: 'production', PORT: 5000 },
error_file: './logs/error.log',
out_file: './logs/out.log',
}]
};
pm2 start ecosystem.config.js --env production
pm2 startup # auto-restart on server reboot
pm2 save # save current process list
REST is better when:
- Simple CRUD with clear resources (public APIs, caching critical)
- Team is unfamiliar with GraphQL
- GET requests need HTTP-level caching
GraphQL is better when:
- Mobile apps need to minimize bandwidth (request exactly what you need)
- Frontend needs multiple resources in one round-trip
- Rapid frontend iteration (frontend defines its own queries)
- Highly connected data (social graphs, e-commerce recommendations)
N+1 problem — the classic GraphQL performance issue:
Fetching 10 users → for each user fetch their posts = 11 queries!
Solution: DataLoader (batching and caching per request)
const DataLoader = require('dataloader');
// Creates one query for all IDs, not one per ID
const postLoader = new DataLoader(async (userIds) => {
const posts = await Post.find({ authorId: { $in: userIds } });
// Return array in same order as userIds
return userIds.map(id => posts.filter(p => p.authorId.toString() === id.toString()));
});
// In GraphQL resolver
const resolvers = {
User: {
posts: (user) => postLoader.load(user.id.toString()), // batched!
}
};
// 10 users → postLoader batches → 1 DB query instead of 10
End of MERN Questions Guide. This document covers React, Next.js, Node.js, Express, MongoDB, and the JavaScript fundamentals that underpin the entire MERN stack.