How to Convert Multiple Images in Bulk
Use Cases for Bulk Conversion
Bulk image conversion is needed in scenarios such as:
- E-commerce platforms converting large numbers of product photos from PNG to WebP to improve load speeds
- Blogs or news sites migrating archives of historical JPG images to WebP
- Old scanned BMP photos needing conversion to JPG for archiving
- Website rebuilds that require standardizing on a single image format
- Game development teams converting designer-provided PSD/PNG files in bulk to engine-supported formats
Method 1: ImageMagick Command Line (Recommended)
# ImageMagick batch conversion complete guide
# Install
# macOS: brew install imagemagick
# Ubuntu: sudo apt install imagemagick
# Batch JPG to WebP (keep originals)
for f in *.jpg *.jpeg; do
[ -f "$f" ] && convert "$f" -quality 80 "${f%.*}.webp"
done
# Batch PNG to WebP (lossless)
for f in *.png; do
[ -f "$f" ] && convert "$f" -define webp:lossless=true "${f%.png}.webp"
done
# Use mogrify to convert a whole directory (with -format, originals are kept and new files get the new extension)
mogrify -format webp -quality 80 *.jpg
mogrify -format jpg -quality 85 *.bmp
# Process subdirectories recursively
find . -name "*.jpg" -exec sh -c \
'convert "$1" -quality 80 "${1%.jpg}.webp"' _ {} \;
# Parallel conversion (using multiple CPU cores)
find . -name "*.jpg" -print0 | \
xargs -0 -P 8 -I{} sh -c 'convert "$1" -quality 80 "${1%.jpg}.webp"' _ {}
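The loops above assume a POSIX shell. If you need the same ImageMagick pipeline from a script (for instance on Windows, or to add logging), a minimal sketch that shells out to the `convert` binary; `build_convert_cmd` is an illustrative helper, not part of ImageMagick, and ImageMagick 7 installs may need `magick` instead of `convert`:

```python
# Minimal sketch: drive ImageMagick's `convert` binary from Python.
# Assumes ImageMagick is installed and on PATH.
import subprocess
from pathlib import Path

def build_convert_cmd(src, quality=80):
    """Build the argument list for one JPG -> WebP conversion."""
    out = Path(src).with_suffix('.webp')
    return ['convert', str(src), '-quality', str(quality), str(out)]

if __name__ == '__main__':
    for jpg in Path('.').glob('*.jpg'):
        subprocess.run(build_convert_cmd(jpg), check=True)
```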
Method 2: Python Script (Flexible Customization)
from PIL import Image
import os
import glob
from pathlib import Path
from concurrent.futures import ThreadPoolExecutor
import time
def convert_image(input_path, output_format, quality=80, output_dir=None):
    """Convert a single image file."""
    input_path = Path(input_path)
    if output_dir:
        output_path = Path(output_dir) / input_path.with_suffix(f'.{output_format}').name
    else:
        output_path = input_path.with_suffix(f'.{output_format}')
    try:
        with Image.open(input_path) as img:
            orig_size = input_path.stat().st_size
            if output_format.lower() == 'webp':
                if img.mode in ('RGBA', 'P'):
                    img.save(output_path, 'WEBP', lossless=True)
                else:
                    img.convert('RGB').save(output_path, 'WEBP', quality=quality)
            elif output_format.lower() in ('jpg', 'jpeg'):
                if img.mode != 'RGB':
                    img = img.convert('RGB')
                img.save(output_path, 'JPEG', quality=quality)
            else:
                img.save(output_path, output_format.upper())
        new_size = output_path.stat().st_size
        reduction = (1 - new_size / orig_size) * 100
        return (str(input_path), reduction)
    except Exception as e:
        return (str(input_path), f"ERROR: {e}")

def batch_convert(source_dir, output_format='webp', quality=80,
                  output_dir=None, workers=4):
    """Batch-convert every matching image in source_dir."""
    if output_dir:
        os.makedirs(output_dir, exist_ok=True)
    patterns = ['*.jpg', '*.jpeg', '*.png', '*.bmp']
    files = []
    for pattern in patterns:
        files.extend(glob.glob(os.path.join(source_dir, pattern)))
    print(f"Found {len(files)} images to convert...")
    start = time.time()
    with ThreadPoolExecutor(max_workers=workers) as executor:
        results = list(executor.map(
            lambda f: convert_image(f, output_format, quality, output_dir),
            files
        ))
    elapsed = time.time() - start
    successes = [r for r in results if isinstance(r[1], float)]
    avg_reduction = sum(r[1] for r in successes) / len(successes) if successes else 0
    print(f"Done: {len(successes)}/{len(files)} files in {elapsed:.1f}s")
    print(f"Average size reduction: {avg_reduction:.1f}%")

batch_convert('./images', output_format='webp', quality=80,
              output_dir='./images_webp', workers=8)
Method 3: Sharp (Node.js) High-Performance Batch Conversion
// npm install sharp glob
const sharp = require('sharp');
const glob = require('glob');
const path = require('path');
const fs = require('fs');
async function batchConvert(sourceDir, outputFormat, quality = 80) {
  const files = glob.sync(`${sourceDir}/**/*.{jpg,jpeg,png,bmp}`);
  const outputDir = `${sourceDir}_${outputFormat}`;
  fs.mkdirSync(outputDir, { recursive: true });
  let converted = 0;
  let totalSaved = 0;
  // Process in parallel, 10 files per batch
  for (let i = 0; i < files.length; i += 10) {
    const batch = files.slice(i, i + 10);
    await Promise.all(batch.map(async (file) => {
      const relPath = path.relative(sourceDir, file);
      const outputPath = path.join(outputDir,
        relPath.replace(/\.[^.]+$/, `.${outputFormat}`));
      fs.mkdirSync(path.dirname(outputPath), { recursive: true });
      const origSize = fs.statSync(file).size;
      await sharp(file)[outputFormat]({ quality }).toFile(outputPath);
      const newSize = fs.statSync(outputPath).size;
      totalSaved += origSize - newSize;
      converted++;
    }));
    process.stdout.write(`\rConverted: ${converted}/${files.length}`);
  }
  console.log(`\nDone! Saved ${(totalSaved/1024/1024).toFixed(1)} MB total`);
}
batchConvert('./public/images', 'webp', 80);
Automated Workflow: Folder Monitoring
For scenarios with continuous image additions (like e-commerce adding new products daily), set up folder monitoring: when a new image is added to the "upload" folder, automatically trigger the conversion script, convert to WebP, and save to the "publish" folder. On Linux/macOS use inotifywait or fswatch; in Node.js use the chokidar library; in Python use the watchdog library.
# Python folder monitoring auto-conversion
# pip install watchdog pillow
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler
from PIL import Image
import time

class ImageHandler(FileSystemEventHandler):
    def on_created(self, event):
        if event.is_directory:
            return
        if event.src_path.lower().endswith(('.jpg', '.jpeg', '.png')):
            self.convert_to_webp(event.src_path)

    def convert_to_webp(self, path):
        output = path.rsplit('.', 1)[0] + '.webp'
        with Image.open(path) as img:
            img.save(output, 'WEBP', quality=80)
        print(f"Auto-converted: {path} -> {output}")

observer = Observer()
observer.schedule(ImageHandler(), path='./uploads', recursive=False)
observer.start()
print("Monitoring ./uploads for new images...")
try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    observer.stop()
observer.join()
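One pitfall with `on_created`: the event can fire while the uploader is still writing the file, so Pillow may see a truncated image. A small sketch that waits for the file size to stabilize before converting; the polling interval and retry count are arbitrary assumptions, tune them to your upload sizes:

```python
# Sketch: poll a new file's size until it stops growing, so we don't
# hand a half-written upload to Pillow.
import os
import time

def wait_until_stable(path, interval=0.5, retries=10):
    """Return True once the file size is non-zero and unchanged between polls."""
    last = -1
    for _ in range(retries):
        try:
            size = os.path.getsize(path)
        except OSError:
            return False  # file vanished or is not readable yet
        if size == last and size > 0:
            return True
        last = size
        time.sleep(interval)
    return False
```

Call `wait_until_stable(event.src_path)` at the top of `convert_to_webp` and skip the file if it returns False.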
Performance Optimization for Batch Conversion
When processing large numbers of images, performance optimization matters. Key recommendations:
1. Use multi-threading/multi-processing to fully utilize multi-core CPUs.
2. Sharp (Node.js) is roughly 5–10x faster than Pillow (Python) and is recommended for very large batches.
3. If the image count exceeds 10,000, process in batches (500–1,000 at a time) to avoid memory exhaustion.
4. Running on an SSD is 3–5x faster than on an HDD (I/O is the bottleneck).
5. Pre-check and skip already-converted files to avoid duplicate work.
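The batching and skip-already-converted tips can be sketched together; `pending_files` and `chunked` are hypothetical helper names, and the default chunk size of 500 is just the lower figure suggested above:

```python
# Sketch: skip files that already have a .webp counterpart,
# then yield the rest in fixed-size chunks to bound memory use.
from pathlib import Path

def pending_files(source_dir, ext='.jpg', target_ext='.webp'):
    """Return source files that do not yet have a converted counterpart."""
    return sorted(p for p in Path(source_dir).glob(f'*{ext}')
                  if not p.with_suffix(target_ext).exists())

def chunked(items, size=500):
    """Yield successive chunks of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]
```

Feed each chunk to `batch_convert` (or your converter of choice) instead of the whole list at once.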
Try the online tool now: no installation, completely free.
Open Tool →