Blog / Developer Tools
March 2026 · 6 min read

Image Compression Without Webpack or Build Tools

Build-time image optimization adds complexity and slows down your builds. Pre-compression is simpler, faster, and framework-agnostic.

The conventional advice for image optimization in front-end projects is to add a build plugin: imagemin-webpack-plugin for webpack, vite-imagetools or vite-plugin-imagemin for Vite, or rely on Next.js's built-in image handling. This advice is not wrong — but it is often heavier than necessary, and it comes with real trade-offs.

The build plugin trade-offs

Adding image compression to your build pipeline introduces several costs:

Build time increases

Image compression is CPU-intensive. Compressing 200 images at build time adds 30–120 seconds per build, depending on your machine and image library size. In CI/CD, you are paying for this compute on every deploy.

Framework lock-in

A webpack plugin only works in webpack projects. A Vite plugin only works in Vite. When you move to a new framework or migrate a project, you reconfigure image compression from scratch.

Plugin maintenance overhead

Image processing plugins have native binary dependencies. They break on Node.js major version upgrades, conflict with other plugin versions, and go unmaintained. This is the imagemin problem, replicated across every framework-specific variant.

Re-compressed on every clean build

Many build setups reprocess images on every fresh build or in CI. You compress the same files repeatedly, burning compute time to produce identical output.

Pre-compression: a simpler model

The alternative is to compress images before they enter your project — not during the build. This means your repository contains already-optimized images. The build tool never has to touch them.

The model is:

  1. Images are compressed when they are added to the project (automatically via watch folder, or manually via batch drag-and-drop)
  2. Compressed files are committed to the repo
  3. Build tool processes pre-optimized images — zero extra compression work
  4. Build time is unaffected by image library size

Build time comparison

Build plugin (200 images): +45–90 seconds per build
Pre-compressed images: +0 seconds per build
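Those per-build seconds compound across a team. A back-of-the-envelope model, with purely illustrative inputs rather than benchmarks:

```typescript
// Estimate daily CI time spent re-compressing images at build time.
// Both inputs are illustrative assumptions, not measurements.
function ciMinutesPerDay(secondsPerBuild: number, buildsPerDay: number): number {
  return (secondsPerBuild * buildsPerDay) / 60;
}

// 60 s of compression overhead x 20 team builds per day = 20 minutes of CI per day
console.log(ciMinutesPerDay(60, 20)); // 20
```

Twenty CI builds a day is ordinary for a small team once PR builds are counted, which is how "seconds per build" quietly becomes "minutes per day".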

How to set this up with TinyPixels

TinyPixels handles pre-compression either manually (batch mode) or automatically (watch folder mode).

Batch mode (one-time or ad hoc): Drag your entire /public or /assets folder into TinyPixels, configure format and quality, set a separate output folder, and hit compress. Takes seconds. Run it whenever you have a batch of new images to add.

Watch folder mode (continuous): Point TinyPixels at a source image folder. It monitors for new files and compresses them automatically to your output folder — originals untouched, output always ready. You add a raw image, the compressed version appears in the output folder immediately.
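TinyPixels itself is an app, not a library, but the watch-folder model is easy to picture in code. A minimal Node sketch of the same idea — the folder names, the WebP output choice, and the `compress` step are placeholders for illustration, not TinyPixels internals:

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Map a raw source file to its compressed output path.
// "public/img" and the .webp extension are hypothetical conventions.
export function outputPathFor(srcFile: string, outDir = "public/img"): string {
  const base = path.basename(srcFile, path.extname(srcFile));
  return path.join(outDir, `${base}.webp`);
}

// Watch a source folder and hand each new raw image to a compressor.
// `compress` is a placeholder for the real encoder (an app or library).
export function watchFolder(
  srcDir: string,
  compress: (src: string, dest: string) => void,
): fs.FSWatcher {
  return fs.watch(srcDir, (event, filename) => {
    if (event !== "rename" || !filename) return; // "rename" fires on file add
    const name = filename.toString();
    if (!/\.(png|jpe?g)$/i.test(name)) return;
    compress(path.join(srcDir, name), outputPathFor(name));
  });
}
```

The point of the sketch is the shape of the workflow: originals land in one folder, compressed output appears in another, and the build tool only ever sees the output side.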

Works with Next.js, Vite, Astro, plain HTML — anything

Because pre-compression happens at the filesystem level before the build tool ever runs, it is completely framework-agnostic. Your build tool sees pre-optimized files and processes them normally — no plugin required, no special config, no compatibility concerns.

Framework: build plugin needed?
Next.js: No — pre-compressed files in /public are served as-is
Vite + React/Vue/Svelte: No — static assets in /public are copied directly
Astro: No — assets in /public bypass processing
Plain HTML: No — there is no build step; files are served directly
WordPress: No — upload pre-compressed images to the media library
Docusaurus / VitePress: No — static assets are copied without processing

What about Next.js Image component?

next/image does runtime optimization — it resizes and converts images on demand when they are requested. This is genuinely useful for responsive images and lazy loading. But it does not mean you should skip pre-compression.

next/image adds server overhead and requires the Next.js runtime. Pre-compressed source images reduce the work next/image has to do, lower memory usage during optimization, and ensure your images are lean regardless of whether they go through Next.js's image pipeline or are served as static files directly.

The two approaches are complementary, not mutually exclusive. Pre-compress your source files, then let next/image handle responsive sizing. You get the benefits of both.
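The same complementary split works outside Next.js, too: pre-compress a few width variants, then let the browser pick one via `srcset`. A hypothetical helper, assuming a `name-width.webp` naming convention that is purely a convention of this sketch:

```typescript
// Build an HTML srcset string from pre-compressed width variants.
// Assumes files like "/hero-640.webp" already exist in the static folder;
// the naming scheme is an assumption of this sketch, not a TinyPixels feature.
export function buildSrcSet(baseName: string, widths: number[]): string {
  return widths.map((w) => `/${baseName}-${w}.webp ${w}w`).join(", ");
}

// Usage in plain HTML (values produced by the helper above):
// <img src="/hero-1280.webp"
//      srcset="/hero-640.webp 640w, /hero-1280.webp 1280w, /hero-1920.webp 1920w"
//      sizes="100vw" alt="Hero image">
```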

CI/CD implications

When images are pre-compressed and committed to the repo, your CI/CD pipeline does not need an image processing step. This simplifies your pipeline configuration, reduces compute costs in CI, and eliminates an entire class of build failures related to native binary dependencies in image processing packages.
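If nothing in CI compresses images anymore, a cheap safety net is a size-budget check that fails the pipeline when someone commits a raw, uncompressed file. A minimal sketch — the 300 KB budget and the folder name are illustrative assumptions, not recommendations:

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Return images directly under `dir` that exceed a byte budget.
// The 300 KB default is an illustrative assumption; tune it per project.
export function oversizedImages(dir: string, maxBytes = 300 * 1024): string[] {
  return fs
    .readdirSync(dir)
    .filter((f) => /\.(png|jpe?g|webp|avif)$/i.test(f))
    .map((f) => path.join(dir, f))
    .filter((f) => fs.statSync(f).size > maxBytes);
}

// In a CI step: fail the build if anything is over budget.
// if (oversizedImages("public").length > 0) process.exit(1);
```

This keeps the guarantee ("no oversized images ship") without reintroducing the compression work the pipeline just shed.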

For teams with large image libraries, moving from build-time compression to pre-compression can reduce CI build times by minutes per run — across every developer on the team, every day.

The one case where build-time compression makes sense

Build-time image optimization is a better fit when images are dynamically generated or sourced from a CMS at build time (for example, a Contentful or Sanity source in a static site generator). In this case, pre-compression is not possible because the images are not known until build time.

For everything else — design assets, screenshots, marketing images, static content — pre-compression is simpler, faster, and requires no build tool integration.

Pre-compress images before they enter your build

TinyPixels runs on Mac and Windows. Zero build config required. Join the waitlist for early access and a launch discount.