Mastering Asynchronous Programming: A Complete Guide from Callbacks to Async/Await

Why Async Programming Matters

Every app you love—Twitter, Netflix, Uber—relies on asynchronous programming. Without it, a single slow database query would freeze the entire interface. Async code lets programs start a task, move on, and come back when the result is ready. The payoff is simple: higher throughput, lower latency, and happier users.

The Blocking Problem

Traditional synchronous code runs top to bottom. When it hits a long operation—reading a 4 MB file, calling an API, querying Postgres—the thread sits idle. In single-threaded environments like JavaScript in the browser, that freeze locks the UI. In server environments, it wastes RAM and CPU cycles. Async patterns solve this by delegating the slow work and continuing with other tasks.
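
To make the contrast concrete, here is a minimal Node.js sketch (config.json is just a stand-in file): the synchronous read parks the whole thread, while the promise-based read lets other work run in the meantime.

import { readFileSync } from 'fs';
import { readFile } from 'fs/promises';

// Blocking: nothing else runs on this thread until the file is fully read.
const settingsSync = readFileSync('config.json', 'utf8');

// Non-blocking: the read is handed off; the thread stays free until the
// promise resolves.
const settingsAsync = await readFile('config.json', 'utf8');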

Callbacks: The Original Async Tool

A callback is a function you pass into another function to run later. In browser land, setTimeout is the classic demo:

function later(msg, cb) {
  setTimeout(() => cb(msg), 1000);
}
later('hi', console.log); // prints after 1 s

Looks simple, but real apps need sequential calls. Nesting callbacks produces the dreaded “pyramid of doom”:

getUser('ann', (err, user) => {
  if (err) return handle(err);
  getOrders(user.id, (err, orders) => {
    if (err) return handle(err);
    chargeCard(orders[0], (err, res) => {
      if (err) return handle(err);
      emailReceipt(user.email, res, (err) => {
        if (err) return handle(err);
        console.log('done');
      });
    });
  });
});

Error handling repeats, indentation marches right, and reasoning about flow becomes mental gymnastics.

Promises: Flattening the Pyramid

A Promise is an object representing the eventual completion—or failure—of an async operation. It has three states: pending, fulfilled, rejected. Creating one is straightforward:

import { readFile } from 'fs';

const load = new Promise((resolve, reject) => {
  readFile('data.txt', (e, data) => {
    e ? reject(e) : resolve(data);
  });
});

Consume it with .then and .catch:

load
  .then(text => console.log(text))
  .catch(err => console.error(err));

Best of all, you can chain the calls instead of nesting them ever deeper:

getUser('ann')
  .then(user =>
    getOrders(user.id)
      .then(orders => chargeCard(orders[0]))
      .then(receipt => emailReceipt(user.email, receipt))
  )
  .then(() => console.log('done'))
  .catch(handle);

One catch block handles every step, and a later step that needs an earlier value (user.email here) simply closes over it instead of adding a pyramid level per call. The code reads top to bottom like a synchronous script, yet still runs non-blocking.

Promise Static Methods

Promise.all waits for every promise to fulfill (and rejects as soon as any one rejects); useful for parallel API calls:

const [weather, news] = await Promise.all([
  fetch('/api/weather').then(r => r.json()),
  fetch('/api/news').then(r => r.json())
]);

Promise.race returns whichever settles first; handy for timeout patterns:

const winner = await Promise.race([
  fetch('/api/slow'),
  new Promise((_, reject) =>
    setTimeout(() => reject(new Error('timeout')), 5000)
  )
]);

Async/Await: Syntactic Bliss

ES2017 introduced async functions. Prefix any function with async and you may use await inside. Example:

async function checkout(userId) {
  const user = await getUser(userId);
  const orders = await getOrders(user.id);
  const receipt = await chargeCard(orders[0]);
  await emailReceipt(user.email, receipt);
  return 'done';
}

The engine suspends the function at each await, handing control back to the event loop, then resumes with the resolved value. Errors propagate as exceptions, so try/catch works naturally:

try {
  await checkout('ann');
} catch (e) {
  handle(e);
}

Under the hood, async/await is sugar over Promises. Anything thenable can be awaited (plain values are simply wrapped in an already-resolved promise), so mixing the two styles is seamless.
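
A quick illustration of that interchangeability (getScore is a made-up promise-returning function): the same promise can be consumed with .then or with await, whichever reads better at the call site.

function getScore() {
  return Promise.resolve(42); // any promise-returning function works
}

getScore().then(score => console.log(score)); // promise style

async function report() {
  const score = await getScore();             // await style, same promise
  console.log(score);
}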

Concurrent vs Sequential Await

Await each line in turn and you get sequential, not concurrent, work:

const a = await taskA(); // 1 s
const b = await taskB(); // 1 s
// total 2 s

Fan them out with Promise.all:

const [a, b] = await Promise.all([taskA(), taskB()]); // ~1 s total

Rule of thumb: be explicit about what must wait. Independent I/O should run together.

Error Handling Strategies

Awaiting inside a bare loop without try/catch means one failure halts the whole batch. Wrap batch jobs in a call that isolates faults, such as Promise.allSettled:

const results = await Promise.allSettled(
  urls.map(u => fetch(u))
);
results.forEach(res => {
  if (res.status === 'rejected') log(res.reason);
});

Promise.allSettled never rejects; it resolves to an array of objects describing each outcome.
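
Each entry carries a status plus either a value or a reason, so the shape looks like this:

const outcomes = await Promise.allSettled([
  Promise.resolve('ok'),
  Promise.reject(new Error('boom'))
]);
console.log(outcomes[0]); // { status: 'fulfilled', value: 'ok' }
console.log(outcomes[1]); // { status: 'rejected', reason: Error: boom }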

Parallel Rate Limiting

Firing 10,000 HTTP requests at once invites 429 errors or worse. Use a simple semaphore to cap concurrency:

class Semaphore {
  constructor(max) { this.max = max; this.c = 0; this.queue = []; }
  async acquire() {
    // Below the limit: take a slot immediately.
    if (this.c < this.max) { this.c++; return; }
    // Otherwise wait until release() hands over a slot.
    await new Promise(res => this.queue.push(res));
  }
  release() {
    this.c--;
    const next = this.queue.shift();
    // Pass the freed slot straight to the next waiter, if any.
    if (next) { this.c++; next(); }
  }
  async run(fn) {
    await this.acquire();
    try { return await fn(); }
    finally { this.release(); }
  }
}

Usage:

const sem = new Semaphore(10);
await Promise.all(
  urls.map(u => sem.run(() => fetch(u)))
);

Node.js Event Loop in One Minute

Node.js delegates slow work to the operating system and, for operations like file I/O, to libuv's thread pool. When an operation finishes, its callback is placed in the event queue, and the single main thread dequeues and runs it. Async functions hand control back at await points, letting the loop process timers, I/O callbacks, and microtasks. Blocking that thread with heavy computation defeats the model; for CPU-bound work, spawn a worker thread or child process.
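
As a minimal sketch of the worker-thread route (runInWorker and hash-heavy.js are hypothetical names), the main thread just awaits a promise while the heavy loop runs elsewhere:

import { Worker } from 'worker_threads';

// Run a CPU-heavy script off the main thread and resolve with the first
// message it posts back.
function runInWorker(script, data) {
  return new Promise((resolve, reject) => {
    const worker = new Worker(script, { workerData: data });
    worker.once('message', resolve);
    worker.once('error', reject);
  });
}

const digest = await runInWorker('./hash-heavy.js', { rounds: 12 });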

Converting Old Callback APIs

Node’s util.promisify wraps error-first callbacks automatically:

import { promisify } from 'util';
import { readFile } from 'fs';
const read = promisify(readFile);
const txt = await read('config.json', 'utf8');

For non-standard signatures, write a thin wrapper:

function delay(ms) {
  return new Promise(res => setTimeout(res, ms));
}
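
The same trick covers success/error callback pairs that promisify cannot handle; a sketch, with legacyGet standing in for an imaginary old API:

// Imaginary old API with separate success and error callbacks (not error-first).
function legacyGet(key, onSuccess, onError) {
  setTimeout(() => onSuccess(`value-for-${key}`), 100);
}

// promisify cannot handle this shape, so wrap it by hand.
function legacyGetAsync(key) {
  return new Promise((resolve, reject) => {
    legacyGet(key, resolve, reject);
  });
}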

Async Iterators and Generators

Streams and paginated APIs can be modeled as async generators. Yield each chunk as it arrives:

async function* paginate(endpoint) {
  let page = 1;
  while (true) {
    const res = await fetch(`${endpoint}?page=${page}`);
    const json = await res.json();
    if (json.items.length === 0) break;
    for (const item of json.items) yield item;
    page++;
  }
}

Consume with for-await-of:

for await (const repo of paginate('/api/repos')) {
  console.log(repo.name);
}

Browser Gotchas

Top-level await only works inside modules. If you need it inline, wrap in an async IIFE:

(async () => {
  const res = await fetch('/api');
  render(await res.json());
})();

Also, await yields only to the microtask queue, which runs before the browser paints, so CPU-heavy code sprinkled with awaits can still freeze animation. For smooth rendering, break heavy work into chunks with setTimeout(fn, 0) or requestIdleCallback.
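
One way to do that is to process data in slices and await a zero-delay timer between them; in this sketch, items and handle stand in for whatever your app iterates over and applies:

async function processInChunks(items, handle, chunkSize = 500) {
  for (let i = 0; i < items.length; i += chunkSize) {
    items.slice(i, i + chunkSize).forEach(handle);
    // A macrotask boundary lets the browser paint before the next slice.
    await new Promise(resolve => setTimeout(resolve, 0));
  }
}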

Testing Async Code

Test runners like Jest understand promises; return a promise or use async tests:

test('fetch user', async () => {
  const user = await getUser(1);
  expect(user.name).toBe('Ada');
});

Remember to await expectations when checking rejects:

await expect(flaky()).rejects.toThrow('timeout');

Performance Checklist

  • Favor concurrency with Promise.all over sequential await where order is irrelevant.
  • Pool expensive connections—database, Redis—instead of creating new ones per request.
  • Profile! Node’s --trace-events flag and Chrome DevTools Performance tab expose idle gaps.
  • Don’t await inside tight loops; collect promises and await once outside.
  • Cache promise-returning functions to avoid redundant work: memoize async results just like any expensive computation (see the sketch after this list).
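
Here is one way that last point can look in practice, a minimal async memoizer (assumes a single string key and no eviction policy):

function memoizeAsync(fn) {
  const cache = new Map();
  return key => {
    if (!cache.has(key)) {
      // Cache the promise itself so concurrent callers share one request.
      cache.set(key, fn(key).catch(err => {
        cache.delete(key); // don't keep failed results around
        throw err;
      }));
    }
    return cache.get(key);
  };
}

const getUserCached = memoizeAsync(id => fetch(`/api/users/${id}`).then(r => r.json()));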

Logging in Async Flows

Context gets lost when multiple requests interleave. Use async_hooks or continuation-local-storage to propagate request IDs:

import { createHook, executionAsyncId } from 'async_hooks';

const store = new Map();
createHook({
  init(asyncId, type, triggerId) {
    // Child async resources inherit the context of whatever created them.
    if (store.has(triggerId)) store.set(asyncId, store.get(triggerId));
  },
  destroy(asyncId) { store.delete(asyncId); }
}).enable();

// At the entry point (e.g., HTTP middleware): store.set(executionAsyncId(), requestId);

Attach a unique ID at entry point (e.g., HTTP middleware) and log it with every statement. Your distributed tracing will thank you.
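
Recent Node versions ship AsyncLocalStorage in the same async_hooks module, which handles that bookkeeping for you; a minimal sketch assuming Express-style middleware:

import { AsyncLocalStorage } from 'async_hooks';
import { randomUUID } from 'crypto';

const requestContext = new AsyncLocalStorage();

// Middleware: everything awaited downstream sees the same request ID.
function attachRequestId(req, res, next) {
  requestContext.run({ requestId: randomUUID() }, next);
}

function log(msg) {
  const ctx = requestContext.getStore();
  console.log(ctx ? `[${ctx.requestId}] ${msg}` : msg);
}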

Putting It Together: Mini Project

Build a CLI that downloads five images concurrently, resizes each, and saves thumbnails. Use node-fetch, sharp, and a concurrency limit of three:

import fetch from 'node-fetch';
import sharp from 'sharp';
import { writeFile } from 'fs/promises';
import { Semaphore } from './semaphore.js';

const sem = new Semaphore(3);

async function fetchAndResize(url, name) {
  return sem.run(async () => {
    console.log('starting', name);
    const res = await fetch(url);
    // Works with node-fetch and the built-in fetch alike.
    const buf = Buffer.from(await res.arrayBuffer());
    const thumb = await sharp(buf).resize(200).toBuffer();
    await writeFile(`thumb_${name}.jpg`, thumb);
    console.log('done', name);
  });
}

await Promise.all([
  fetchAndResize('https://i.picsum.photos/id/10/800/800.jpg', 'pic1'),
  fetchAndResize('https://i.picsum.photos/id/20/800/800.jpg', 'pic2'),
  fetchAndResize('https://i.picsum.photos/id/30/800/800.jpg', 'pic3'),
  fetchAndResize('https://i.picsum.photos/id/40/800/800.jpg', 'pic4'),
  fetchAndResize('https://i.picsum.photos/id/50/800/800.jpg', 'pic5')
]);

Run it and watch only three downloads stay active at any moment. The code stays readable, errors bubble up through the awaits to a single place, and total wall time shrinks dramatically.

Key Takeaways

Start with callbacks to understand the pain points. Move to Promises for cleaner composition, then embrace async/await for readability indistinguishable from sync code. Keep operations parallel when possible, limit concurrency when resources are scarce, and always propagate errors intentionally. Master these patterns and you will write apps that feel instant while using minimal hardware.

Disclaimer: This article is for educational purposes only and was generated by an AI language model. Verify code samples in your specific runtime before production use.
