Utility scripts for managing and hosting transferred iPhone photos.
To run locally, create a `.env` file containing your S3 IAM account credentials like so:

```
ACCESS_KEY_ID=...
SECRET_ACCESS_KEY=...
```

(see Setting up S3 for how to generate these). Then run `npm run info` to display parsed information about the photo directory structure, `npm run upload:all` to process and optimize all local photos, and `npm run dev` to start the photo server locally.
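For reference, here's a minimal sketch of how a script might pick these credentials up at runtime, assuming the `dotenv` package is used to load the file (the repo's actual loading code may differ):

```js
// Load ACCESS_KEY_ID / SECRET_ACCESS_KEY from .env into process.env.
import 'dotenv/config';

const { ACCESS_KEY_ID, SECRET_ACCESS_KEY } = process.env;
if (!ACCESS_KEY_ID || !SECRET_ACCESS_KEY) {
  throw new Error('Missing S3 credentials (see "Setting up S3")');
}
```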
The main idea with this project was to create an image server capable of hosting the many photos I've taken over the years. I transfer most of my photos to my computer, where they are sorted by date / event in the following structure:
```
Photos
|___ 2024-08-12 Santa Cruz
|    |___ IMG_...
|    |___ IMG_...
|
|___ 2025-05-11 ...
|___ ...
```
I wanted to be able to send shareable links for certain folders to people, but OneDrive was a little inconvenient to use and not primarily designed for photo sharing (plus, I wanted custom behavior for edited photos).
To process these photos for the web:

- Each photo needs to be converted to a web-renderable file format (see next section).
- Each photo needs to be optimized for size with minimal loss of quality (~2 MB → ~400 KB).
- "Duplicated" photos need to be coalesced (e.g. if a folder contains both an edited and non-edited version of a photo, keep only the edited version), and non-photo files (any stray `.AAE`s) need to be ignored; a sketch of this rule follows.
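For reference, iPhone exports an edit as `IMG_E####` alongside the untouched `IMG_####`, with a `.AAE` sidecar describing the edits. A minimal sketch of the coalescing rule (the helper here is hypothetical, not the project's actual code):

```js
// Hypothetical sketch: drop .AAE sidecars and prefer edited (IMG_E####)
// versions of photos over their originals (IMG_####).
function selectPhotos(filenames) {
  const files = filenames.filter((f) => !f.toUpperCase().endsWith('.AAE'));
  // Originals that are "shadowed" by an edited version.
  const shadowed = new Set(
    files
      .filter((f) => /^IMG_E\d+/i.test(f))
      .map((f) => f.replace(/^IMG_E/i, 'IMG_'))
  );
  return files.filter((f) => !shadowed.has(f));
}

// e.g. ["IMG_4106.HEIC", "IMG_E4106.HEIC", "IMG_4106.AAE"] → ["IMG_E4106.HEIC"]
```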
Sometime around June 2023, iPhone photos started transferring by default as HEICs instead of as JPGs. When directly converting a HEIC to a web-friendly format like JPG or WEBP with something like `heic-convert` or `heic-decode`, though, colors are lost even if the raw pixel data is used in the conversion:
```js
import { readFile } from 'node:fs/promises';
import heicDecode from 'heic-decode';
import sharp from 'sharp';

const buffer = await readFile(path);
const { width, height, data } = await heicDecode({ buffer }); // raw RGBA pixels
const img = sharp(new Uint8Array(data), {
  raw: { width, height, channels: 4 },
});
```
*(Demo video: `IMG_E4106.HEIC.2025-01-14.09-14-47.mp4`, showing the direct conversion's washed-out colors.)*
While I initially thought this was due to HEICs supporting 10-bit color, the real culprit is color spaces: HEIC uses the Display P3 color space, which is slightly wider than the sRGB color space most tools assume by default. To fix this, we can try performing a color mapping using ICC profiles, somewhat hackily relying on ImageMagick to do so.
- Using *sRGB v4 preference* (the default sRGB profile included with ImageMagick), a lot of the dark colors get washed out:

  ```
  magick IMG_E4106.HEIC -profile ..\sRGB_v4_ICC_preference.icc IMG_E4106.jpg
  ```
  *(Demo video: `IMG_E4106.HEIC.2025-01-14.09-55-12.mp4`, the sRGB v4 preference result.)*
- Using *sRGB v4 appearance* is better, but still a bit faded:

  ```
  magick IMG_E4106.HEIC -profile ..\sRGB_ICC_v4_Appearance.icc IMG_E4106.jpg
  ```
  *(Demo video: `IMG_E4106.HEIC.2025-01-14.10-50-19.mp4`, the sRGB v4 appearance result.)*
What I didn't realize was that you can just have a JPG that uses the Display P3 color space. While this still loses some color, it is the best option by far:

```
magick IMG_E4106.HEIC IMG_E4106.jpg
```
Then, the script can just send the ImageMagick output to stdout, piping it into `sharp` for the rest of the optimization.
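A rough sketch of that pipeline (`jpg:-` tells ImageMagick to write JPEG output to stdout; the sharp options here are illustrative, not the project's actual settings):

```js
import { spawn } from 'node:child_process';
import sharp from 'sharp';

function heicToOptimizedJpeg(inputPath, outputPath) {
  return new Promise((resolve, reject) => {
    // "jpg:-" = convert to JPEG and write the result to stdout.
    const magick = spawn('magick', [inputPath, 'jpg:-']);
    const chunks = [];
    magick.stdout.on('data', (chunk) => chunks.push(chunk));
    magick.on('error', reject);
    magick.on('close', (code) => {
      if (code !== 0) return reject(new Error(`magick exited with code ${code}`));
      sharp(Buffer.concat(chunks))
        .resize({ width: 2048, withoutEnlargement: true }) // illustrative size cap
        // depending on the sharp version, keepIccProfile() or withMetadata()
        // may be needed here to retain the Display P3 profile
        .jpeg({ quality: 80, mozjpeg: true }) // illustrative quality
        .toFile(outputPath)
        .then(resolve, reject);
    });
  });
}
```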
For scalability and pricing concerns, this project uses AWS S3 as a backend for image hosting. To set up S3, create a bucket to store raw photos and another bucket to store their optimized, web-friendly previews.
To allow programmatic access to S3, create an IAM user in the AWS IAM dashboard.
For simplicity, under permissions we can just grant `AmazonS3FullAccess`. After creating this user, create an access key in the console. Once done, copy the access key ID and secret access key into your `.env` from above.
Finally, to allow public access to the S3-hosted photos, we need to set a bucket policy on the preview and photos buckets. Under Bucket policy, add something like
```json
{
  "Version": "2012-10-17",
  "Id": "Policy1697863280179",
  "Statement": [
    {
      "Sid": "Stmt1697863276774",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::[bucket-name-here]/*"
    }
  ]
}
```
to allow read access on all items in the bucket.
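With the buckets and IAM user in place, programmatic access might look something like this sketch using the AWS SDK v3 (the bucket name, region, and object key here are placeholders, not the project's actual values):

```js
import { readFile } from 'node:fs/promises';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

const s3 = new S3Client({
  region: 'us-west-1', // your buckets' region
  credentials: {
    accessKeyId: process.env.ACCESS_KEY_ID,
    secretAccessKey: process.env.SECRET_ACCESS_KEY,
  },
});

// Upload an optimized preview to the previews bucket (names are placeholders).
const optimizedBuffer = await readFile('IMG_4106.webp');
await s3.send(new PutObjectCommand({
  Bucket: 'my-photo-previews',
  Key: '2024-08-12 Santa Cruz/IMG_4106.webp',
  Body: optimizedBuffer,
  ContentType: 'image/webp',
}));
```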
The `scripts` folder contains utility scripts for uploading / optimizing photos for S3. In particular:

`npm run upload:all`

This is the main script that backs up local photos to S3. `upload:all` treats the local photos directory as the single source of truth (SSOT) for photo information: it will optimize and upload missing photos to S3, and delete photos from S3 that are not present locally.
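Conceptually, the sync is a set difference in both directions; a rough sketch (the helper names here are hypothetical, not the actual script's API):

```js
// The local photo directory is the single source of truth; make S3 match it.
// listLocalKeys / listBucketKeys are hypothetical helpers returning Sets of
// "subdir/file" keys; optimizeAndUpload / deleteFromBuckets are also sketches.
const localKeys = await listLocalKeys(BASE_DIR);
const remoteKeys = await listBucketKeys(RAW_BUCKET);

for (const key of localKeys) {
  if (!remoteKeys.has(key)) await optimizeAndUpload(key); // missing on S3
}
for (const key of remoteKeys) {
  if (!localKeys.has(key)) await deleteFromBuckets(key); // deleted locally
}
```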
`npm run upload:single [subdir] [file]`

This script optimizes and uploads a single file located at `{BASE_DIR}/{subdir}/{file}`. It's useful for testing the full optimization workflow on a file, or for overriding a specific file on S3 (since image files being modified locally is an uncommon occurrence, `upload:all` skips re-uploading files that are already present on S3 for efficiency).
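For example, with the folder layout above, re-processing a single photo might look like this (the file name here is illustrative):

```
npm run upload:single "2024-08-12 Santa Cruz" IMG_4106.HEIC
```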
`npm run upload:only-small`

This script behaves like `upload:all`, but only uploads the small optimized image to S3, without modifying the raw or full-size optimized images. Use this script when tweaking `optimizeSmallToBuffer()` without changing `optimize()` (i.e. when the only files that need to be overwritten are the small preview images).