How to Fix Slow API Requests in Your Frontend
Why API calls feel slow in the browser, how to diagnose bottlenecks with DevTools, and practical fixes from caching to request waterfalls.
You click a button and nothing happens for two seconds. The data loads eventually, but the UI feels sluggish. You open DevTools and see:
GET /api/dashboard — 1.8s
GET /api/notifications — 2.3s
GET /api/user/preferences — 900ms
Slow API requests are one of the most noticeable performance problems in any web app. Users don’t care whether the bottleneck is the server, the network, or your code. They just see a slow page.
Diagnosing the problem
Before fixing anything, figure out where the time is being spent. Open the Network tab in Chrome DevTools, trigger the slow request, and click on it. The Timing tab breaks the request into phases:
- Stalled/Queueing: The browser waited before sending the request (connection limits, service worker overhead)
- DNS Lookup: Resolving the domain name
- Initial Connection / TLS: Establishing the TCP and TLS handshake
- Time to First Byte (TTFB): Time from when the request was sent to when the first byte of the response arrived. This is server processing time plus network latency.
- Content Download: Time to transfer the response body
If TTFB is high, the server is slow. If Content Download is high, the response is too large. If Stalled time is high, you have too many concurrent requests or a connection bottleneck.
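The same phase breakdown is available programmatically through the Resource Timing API, which is useful for catching regressions outside a manual DevTools session. A minimal sketch, assuming a browser (or Node 18+) environment; the 500ms threshold is an illustrative choice, not a standard:

```javascript
// Compute the same phases DevTools shows, from a PerformanceResourceTiming entry.
function phaseBreakdown(entry) {
  return {
    stalled: entry.requestStart - entry.startTime,          // queueing/stalled
    dns: entry.domainLookupEnd - entry.domainLookupStart,   // DNS lookup
    connect: entry.connectEnd - entry.connectStart,         // TCP + TLS
    ttfb: entry.responseStart - entry.requestStart,         // server + network latency
    download: entry.responseEnd - entry.responseStart,      // body transfer
  }
}

// Inspect every network request the page has made so far.
for (const entry of performance.getEntriesByType('resource')) {
  const phases = phaseBreakdown(entry)
  if (phases.ttfb > 500) {
    console.warn(`Slow server response: ${entry.name}`, phases)
  }
}
```

This lets you log or report slow phases automatically instead of clicking through requests one at a time.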
Common causes
1. Request waterfalls
The biggest frontend performance killer. Request A finishes, then request B starts, then request C starts. Each one waits for the previous one to complete, even though they don’t depend on each other.
// Bad — sequential requests, total time = A + B + C
const user = await fetch('/api/user').then(r => r.json())
const posts = await fetch('/api/posts').then(r => r.json())
const notifications = await fetch('/api/notifications').then(r => r.json())
Fix: Fire independent requests in parallel with Promise.all.
// Good — parallel requests, total time = max(A, B, C)
const [user, posts, notifications] = await Promise.all([
  fetch('/api/user').then(r => r.json()),
  fetch('/api/posts').then(r => r.json()),
  fetch('/api/notifications').then(r => r.json()),
])
If one request failing shouldn’t block the others, use Promise.allSettled instead.
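A minimal sketch of that pattern; `settleAll` is a hypothetical helper, not a library API, and the endpoint paths are illustrative:

```javascript
// Promise.allSettled resolves even when some promises reject, so one broken
// endpoint doesn't blank the whole page. Failed requests become null here.
async function settleAll(promises) {
  const results = await Promise.allSettled(promises)
  return results.map(r => (r.status === 'fulfilled' ? r.value : null))
}

// Usage (illustrative endpoints):
// const [user, posts, notifications] = await settleAll([
//   fetch('/api/user').then(r => r.json()),
//   fetch('/api/posts').then(r => r.json()),
//   fetch('/api/notifications').then(r => r.json()),
// ])
```

Each section of the UI can then render its data if present and show an inline error otherwise.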
2. No client-side caching
Every navigation or component mount fetches the same data again. The user visits the dashboard, clicks away, clicks back, and waits for the same API calls a second time.
Fix: Use a client-side data fetching library with a cache. React Query (TanStack Query), SWR, and Apollo Client all implement stale-while-revalidate patterns.
import { useQuery } from '@tanstack/react-query'

const DashboardComponent = () => {
  const { data, isLoading } = useQuery({
    queryKey: ['dashboard'],
    queryFn: () => fetch('/api/dashboard').then(r => r.json()),
    staleTime: 30_000, // serve cached data for 30 seconds
  })

  if (isLoading) return <Spinner />
  return <Dashboard data={data} />
}
With staleTime set, revisiting the page shows cached data instantly while a background refetch updates it. The user sees data immediately instead of a loading spinner.
3. Overfetching data
The API returns the entire user object with 40 fields when you only need the name and avatar. Large response payloads take longer to transfer and parse.
Fix (backend): Add field selection or a dedicated lightweight endpoint.
GET /api/user?fields=name,avatar
Fix: If you control the API, implement pagination for list endpoints and request only the visible page from the frontend. Returning 1,000 records when the user sees 20 at a time is wasted bandwidth.
const response = await fetch('/api/posts?page=1&limit=20')
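A small helper keeps pagination parameters consistent across call sites. A sketch; `pageUrl` is a hypothetical helper, and the `page`/`limit` parameter names mirror the example above but depend on your API:

```javascript
// Build a paginated request URL. URLSearchParams handles encoding for us.
function pageUrl(base, page, limit = 20) {
  const params = new URLSearchParams({ page: String(page), limit: String(limit) })
  return `${base}?${params}`
}

// const response = await fetch(pageUrl('/api/posts', 1))
```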
4. Undebounced search or filter inputs
A search input fires an API request on every keystroke. Typing “javascript” sends 10 requests, and 9 of them are useless because the user is still typing.
Fix: Debounce the input. Wait until the user stops typing before sending the request.
function debounce(fn, delay) {
  let timer
  return (...args) => {
    clearTimeout(timer)
    timer = setTimeout(() => fn(...args), delay)
  }
}

const debouncedSearch = debounce(async (query) => {
  const results = await fetch(`/api/search?q=${encodeURIComponent(query)}`)
  renderResults(await results.json())
}, 300)

searchInput.addEventListener('input', (e) => {
  debouncedSearch(e.target.value)
})
300ms is a good starting point. Short enough to feel responsive, long enough to avoid wasted requests.
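Debouncing reduces how many requests you send, but a previous request can still be in flight when the next one starts, and responses can arrive out of order. A common complement is cancelling the stale request with AbortController. A sketch, assuming you combine it with the debounce above; `makeSearcher` is a hypothetical helper, not a library API:

```javascript
// Wrap a fetcher so that each new call aborts the previous in-flight one.
// Aborted calls resolve to null, so stale results never reach the UI.
function makeSearcher(fetcher) {
  let controller = null
  return async (query) => {
    controller?.abort() // cancel the previous in-flight request, if any
    controller = new AbortController()
    try {
      return await fetcher(query, controller.signal)
    } catch (err) {
      if (err.name === 'AbortError') return null // stale request, ignore
      throw err
    }
  }
}

// Usage (illustrative endpoint):
// const search = makeSearcher((q, signal) =>
//   fetch(`/api/search?q=${encodeURIComponent(q)}`, { signal }).then(r => r.json())
// )
```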
5. Missing HTTP cache headers
The browser has a built-in HTTP cache, but it only works if the server sends the right headers. Without Cache-Control or ETag headers, the browser treats every response as uncacheable.
Fix (server): Add cache headers for responses that don’t change frequently.
// Data that rarely changes
app.get('/api/config', (req, res) => {
  res.set('Cache-Control', 'public, max-age=3600') // cache for 1 hour
  res.json(config)
})

// Data that changes per user but is stable for a session
app.get('/api/user/profile', (req, res) => {
  res.set('Cache-Control', 'private, max-age=300') // cache for 5 minutes, per user
  res.json(profile)
})
public means shared caches (CDNs) can store it. private means only the user’s browser can cache it. Use private for any user-specific data.
6. No prefetching for predictable navigation
If you know the user is about to navigate somewhere (hovering over a link, reaching the end of a list), you can start fetching the data before they click.
// Prefetch on hover
link.addEventListener('mouseenter', () => {
  fetch('/api/next-page-data') // browser caches the response
})
React Query supports this directly:
const queryClient = useQueryClient()

const prefetchNextPage = () => {
  queryClient.prefetchQuery({
    queryKey: ['posts', nextPage],
    queryFn: () => fetchPosts(nextPage),
  })
}
Prevention
Build performance awareness into your development workflow:
- Watch the Network tab during development. If you see sequential requests that could be parallel, fix it immediately.
- Set performance budgets. If any API call regularly exceeds 500ms, investigate the server side.
- Use a data fetching library with caching from the start. Retrofitting a cache into raw fetch calls across an entire app is painful.
- Paginate everything. No list endpoint should return unlimited results.
- Log slow requests server-side. If your API consistently takes over 1 second, the fix is often a missing database index or an N+1 query, not a frontend optimization.
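The server-side logging suggestion above can be sketched as Express-style middleware; the 1-second threshold and log format are illustrative assumptions:

```javascript
// Log any request whose handler takes longer than thresholdMs to finish.
function slowRequestLogger(thresholdMs = 1000, log = console.warn) {
  return (req, res, next) => {
    const start = process.hrtime.bigint()
    res.on('finish', () => {
      const ms = Number(process.hrtime.bigint() - start) / 1e6
      if (ms > thresholdMs) {
        log(`SLOW ${req.method} ${req.originalUrl ?? req.url} ${ms.toFixed(0)}ms`)
      }
    })
    next()
  }
}

// app.use(slowRequestLogger(1000))
```

Feeding these logs into whatever monitoring you already have makes slow endpoints visible before users complain about them.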
Hushbug tracks slow network requests and performance issues automatically while you browse. Coming soon to the Chrome Web Store.