react
firebase
cloud
typescript
javascript
storage

Before you start reading this article, make sure that your Firebase project is already created, you have the Blaze (or another paid) plan enabled on Firebase, and your React app is configured. We won't dive into these topics here.

Uploading Images with Firebase and React

Uploading images with Firebase and React is quite straightforward. Nevertheless, there are potential pitfalls that can consume your time unnecessarily: validation, permissions, choosing the appropriate image format, and the upload logic itself.

With these prerequisites in mind, let's dive in! Here is a demo of the final result, which is already working on this site!

Demo of Image Upload

I've heavily simplified the example to focus on the upload logic. If you want to see how the UI from the demo is implemented, check the final repository linked at the end of the article.

Let's Start with the Frontend

To begin, we need to have a file input. Take a look at the following code:

const FileInput = () => {
  const handleChange: React.ChangeEventHandler<HTMLInputElement> = (e) => {
    console.log(e.target.files); // The files list to pick up and validate.
  };

  return <input type="file" onChange={handleChange} />;
};

We've set up an input mechanism. Now, our next step is to ensure that any uploaded images are parsed correctly before they're sent to the Cloud Function. To accomplish this, we'll create the following parser:

const readFileAsBase64 = (file: File): Promise<string> => {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    // With readAsDataURL the result is always a string.
    reader.onload = () => resolve(reader.result as string);
    reader.onerror = reject;
    // Converts image to URL -> data:image/jpeg;base64,
    reader.readAsDataURL(file);
  });
};

export { readFileAsBase64 };

The readAsDataURL method reads the file asynchronously and provides a data URL representation of its content.

Now, you need to call readFileAsBase64 just after getting the files from the file input. Take note of the null and length check - files may be null if the user cancels the dialog, so we verify that a file has actually been selected.

  const handleChange: React.ChangeEventHandler<HTMLInputElement> = async (
    e,
  ) => {
    // Files may be "null" or "FileList".
    if (e.target.files && e.target.files.length > 0) {
      const base64URL = await readFileAsBase64(e.target.files[0]);
    }
  };

The Frontend is almost ready. To finish this part, we need to invoke a Cloud Function. In our case, it will be called uploadImage.

import { FirebaseOptions, initializeApp } from 'firebase/app';
import { getFunctions } from 'firebase/functions';

// firebase-setup.ts
const config: FirebaseOptions = {
  apiKey: process.env.FIREBASE_API_KEY,
  authDomain: process.env.FIREBASE_AUTH_DOMAIN,
  projectId: process.env.FIREBASE_PROJECT_ID,
  storageBucket: process.env.FIREBASE_STORAGE_BUCKET,
  messagingSenderId: process.env.FIREBASE_MESSAGING_SENDER_ID,
  appId: process.env.FIREBASE_APP_ID,
  measurementId: process.env.FIREBASE_MEASUREMENT_ID,
};

const app = initializeApp(config);
const functions = getFunctions(app);

export { app, functions };

Then, anywhere in the React component tree, you may import the functions variable and perform a call to the Cloud Function.

import React from 'react';
import { httpsCallable } from 'firebase/functions';
import { readFileAsBase64 } from 'development-kit/file-reading';
import { functions } from './firebase-setup';

// Payload object shape.
type UploadImagePayload = {
  image: string;
};
// Response object shape.
type UploadImageResponse = {};

const FileInput = () => {
  const handleChange: React.ChangeEventHandler<HTMLInputElement> = async (
    e,
  ) => {
    if (e.target.files && e.target.files.length > 0) {
      const { data } = await httpsCallable<UploadImagePayload, UploadImageResponse>(
        functions,
        `uploadImage`,
      )({ image: await readFileAsBase64(e.target.files[0]) });
    }
  };

  return <input type="file" onChange={handleChange} />;
};

Why have we chosen the base64 representation instead of simple FormData and File? The answer is simple: we want to have the option to upload images from the clipboard without any additional headaches.
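
For example, a minimal sketch of such a paste handler (the handlePaste name is just illustrative; you would attach it to an element's onPaste prop) could reuse the same readFileAsBase64 helper:

const handlePaste: React.ClipboardEventHandler<HTMLElement> = async (e) => {
  // Clipboard data may contain images copied from other apps or screenshots.
  const file = Array.from(e.clipboardData.files).find((f) =>
    f.type.startsWith(`image/`),
  );

  if (file) {
    // The same helper works for pasted files and files picked via the input.
    const base64URL = await readFileAsBase64(file);
    // ...send base64URL to the uploadImage Cloud Function here.
  }
};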

By default, callable Cloud Functions send an application/json type of request.

That's all on the Frontend; now it's time for the Backend.

Adding Storage Bucket

We need a place to store our images. By default, there is no storage attached to your Firebase project. You need to create it manually, or programmatically based on your application logic. To create it manually, go to the Firebase Console and follow this gif:

Finding Storage in Dashboard

In my case, I already have the storage created. If it's your first time, don't worry - the UX on Firebase is great, and the console will guide you. There are just several fields to populate, nothing more.

Keep in mind that after creating the storage, a default bucket for files will be created - this is the place where we'll save images.

Creating Rules for Storage

We need to navigate to the Storage -> Rules tab in the dashboard.

Rules Setup Location

Some changes to the default rules will be required. We need to make sure that write operations on our images (update, delete, write) are allowed only for signed-in users. On the other hand, we want the images to be publicly readable for everyone. Of course, your app requirements may differ - this setup is just for the showcase:

rules_version = '2';

service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read;
      allow update, delete, write: if request.auth != null;
    }
  }
}

Cloud Function to Upload Image

Now, on the Backend, we need to create a function called uploadImage. We'll use the onCall helper from the firebase-functions SDK.

import { https } from 'firebase-functions';
import * as admin from 'firebase-admin';
import { v4 as uuid } from 'uuid';

admin.initializeApp();

const { onCall, HttpsError } = https;

type UploadImagePayload = { image: string };

export const uploadImage = onCall(
  async ({ image }: UploadImagePayload, context) => {
    const { auth } = context;

    if (!auth)
      throw new HttpsError(`unauthenticated`, `Only authorized users allowed`);

    const storage = admin.storage();
    const bucket = storage.bucket();
    const [bucketExists] = await bucket.exists();

    if (!bucketExists)
      throw new HttpsError(
        `internal`,
        `No bucket. Create it manually on dashboard or use existing one`,
      );

    const id = uuid();
    const location = `${auth.uid}/images/${id}`;
    const file = bucket.file(location);

    const [meta] = image.split(`,`);
    const contentType = meta.split(`:`)[1].split(`;`)[0]; // image/png or other...
    const extension = contentType.replace(`image/`, ``); // png or other...
    const blob = image.replace(/^data:image\/\w+;base64,/, ``); // Image data.
    const buffer = Buffer.from(blob, `base64`); // Buffer to upload.

    await file.save(buffer, {
      contentType,
    });
    // Unique URL to fetch image.
    const url = `https://firebasestorage.googleapis.com/v0/b/${
      bucket.name
    }/o/${encodeURIComponent(location)}?alt=media`;

    return { url, extension, contentType, id }; // It goes to Frontend
  },
);

At the beginning, we checked the authentication status - it's determined by the presence of the auth property in the context object. If it's missing, the user is not signed in.

Next, we checked for bucket existence. Same situation - if the storage and bucket have not been created, we throw an error.

By calling admin.storage() we get a reference to the default storage. The same happens with the storage.bucket() call. If you want to use another bucket, you may pass its name as a parameter.
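
For example, targeting a non-default bucket could look like this (the bucket name below is purely hypothetical - replace it with your own):

// Hypothetical bucket name - use the name of a bucket that exists in your project.
const customBucket = admin.storage().bucket(`my-project-custom-bucket`);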

Later, we took the original image parameter, which is a base64-formatted string - the payload from the Frontend. We split it into several parts: the content type, the extension, and the raw data.

At the end, we used the encodeURIComponent function. Without it, the image will not be displayed - instead, you'll get an error. That's because the file location may contain many / characters, and the encoded form replaces them with safe equivalents.

For example, encodeURIComponent transforms This is a & test / into This%20is%20a%20%26%20test%20%2F.

Validating Image

Before uploading an image, we need a validation mechanism - for example, to check whether the uploaded file is really an image or to verify its size. To achieve that, you can add something like this right after creating the buffer in the previous function:

    const buffer = Buffer.from(blob, `base64`); // Buffer to upload.

    const sizeAsMegabytes = Number.parseFloat(
      (Buffer.byteLength(buffer) / 1024 / 1024).toFixed(2),
    );
    // Checks if the image is smaller than 4 Megabytes
    const hasAllowedSize = sizeAsMegabytes < 4;

If you want to check the file type, use the following code:

    const extension = contentType.replace(`image/`, ``); // png or other...
    const IMAGE_EXTENSIONS = [`png`, `jpeg`, `jpg`, `gif`] as const;
    type ImageExtension = (typeof IMAGE_EXTENSIONS)[number];
    const isFormatCorrect = IMAGE_EXTENSIONS.includes(extension as ImageExtension);
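
Putting both checks together, a minimal sketch of wiring them into the uploadImage function could look like this (the error messages are just examples):

    // Reject unsupported formats before touching the storage.
    if (!isFormatCorrect)
      throw new HttpsError(
        `invalid-argument`,
        `Only png, jpeg, jpg and gif images are allowed`,
      );
    // Reject files that are too large.
    if (!hasAllowedSize)
      throw new HttpsError(
        `invalid-argument`,
        `Image must be smaller than 4 Megabytes`,
      );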

Changing Image Quality and Size

Some storage providers, such as Google Cloud, charge per GB of storage beyond a free tier. If you store the original images uploaded by users, you will quickly exhaust your budget. For instance, only 500 images of 10 MB each are needed to exceed the free tier limit on Google (500 × 10 MB = 5 GB). This happens quite rapidly. Afterward, you'll incur a cost of $0.28 per GB. If your user base grows significantly, the expenses could become exorbitant. That's why it's advisable to reduce the quality and size of images during upload.

Don't worry, there's a cool library that allows you to do this with just a few function calls. It should be installed in your backend codebase. To install the sharp library, type npm i sharp --save in your terminal.

Let's say we want to convert every input image (png, jpeg, jpg) to the modern webp format and reduce its size and quality (I've omitted format validation, as it's done beforehand).

import * as sharp from 'sharp';

const buffer = Buffer.from(blob, `base64`);

// Example target dimensions - adjust them to your needs.
const width = 1280;
const height = 720;

const webpBuffer = await sharp(buffer)
  .resize(width, height)
  .webp({ quality: 60 })
  .toBuffer();
// Decreased quality image saved in storage.
await file.save(webpBuffer, {
  contentType: `image/webp`, // The buffer now contains webp data.
});

const url = `https://firebasestorage.googleapis.com/v0/b/${
  bucket.name
}/o/${encodeURIComponent(location)}?alt=media`;

return { url };

This library has other useful methods, for example for reading image metadata. Additionally, you can change the aspect ratio, apply effects, or transform images. Feel free to explore it! Here you have the Sharp examples.
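
As a quick illustration, a minimal sketch of reading basic metadata (the describeImage helper below is just an example name) could look like this:

import * as sharp from 'sharp';

// Reads basic metadata without fully decoding the image.
const describeImage = async (buffer: Buffer): Promise<void> => {
  const { width, height, format } = await sharp(buffer).metadata();
  console.log(width, height, format); // e.g. 1920 1080 'png'
};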

Final Result and Source Code

All code that has been shown here can be found in the following repositories:

  1. Backend - https://github.com/polubis/greenonsoftware-api/tree/develop/functions/src.
  2. Frontend - https://github.com/polubis/4markdown/tree/develop.

Summary

Now you know a quick way of uploading images to Firebase Storage. We've used Cloud Functions to provide some logic on the Backend side, performed validation, and at the end saved the image in the default bucket located in the storage.

On the Frontend, we transformed the original File into the base64 format to prepare the ground for uploading images from the clipboard - especially useful on desktop devices φ(* ̄0 ̄).

About the Author: polubis

👋 Hi there! My name is Adrian, and I've been programming for almost 7 years 💻. I love TDD, monorepo, AI, design patterns, architectural patterns, and all aspects related to creating modern and scalable solutions 🧠.