Asynchronous File Uploads with JavaScript Fetch and Async/Await

Introduction

Uploading files without blocking the user interface is an essential part of modern web applications. With the power of JavaScript’s fetch API and async/await, developers can create highly responsive upload workflows, complete with smooth progress tracking and error handling. This post walks through building asynchronous file uploads using these modern features, showing how to make uploads efficient, non-blocking, and user-friendly.

1. Setting Up the Basic File Upload Form

We’ll start by creating a simple HTML form that allows a user to select files. For this example, our goal is to upload the selected file asynchronously when the form is submitted.

<form id="uploadForm">
  <input type="file" id="fileInput" name="file" />
  <button type="submit">Upload</button>
</form>
<div id="status"></div>

In the markup, we have a file input and a div for displaying progress or status updates. Now, let’s attach an event listener with JavaScript so the file can be uploaded asynchronously.

document.getElementById('uploadForm').addEventListener('submit', async (event) => {
  event.preventDefault();
  const file = document.getElementById('fileInput').files[0];
  if (file) {
    await uploadFile(file);
  }
});

Here we use async/await for clarity—ensuring each step in the upload operation reads like synchronous code, even though it’s non-blocking.

2. Performing the Asynchronous Upload with fetch()

Next, let’s construct the uploadFile function using the fetch API. This function will send the file data wrapped in a FormData object to an upload endpoint.

async function uploadFile(file) {
  const formData = new FormData();
  formData.append('file', file);
  
  try {
    const response = await fetch('/api/upload', {
      method: 'POST',
      body: formData
    });
    
    if (!response.ok) {
      throw new Error(`Upload failed with status ${response.status}`);
    }
    
    const result = await response.json();
    document.getElementById('status').innerText = 'Upload successful: ' + result.fileName;
  } catch(err) {
    document.getElementById('status').innerText = 'Error: ' + err.message;
  }
}

Using fetch() with await keeps the code looking synchronous while remaining non-blocking: while one upload is in flight, the rest of the page stays responsive. Note that fetch() rejects only on network failure; HTTP error statuses such as 404 or 500 resolve normally, which is why the response.ok check above is essential.

3. Adding Progress Tracking Using XMLHttpRequest

One limitation of fetch() is that it doesn’t provide progress events for uploads (request streaming exists, but browser support is still limited). For detailed progress tracking, we can use XMLHttpRequest (XHR) instead, wrapped in a Promise so it still composes cleanly with async/await.

function uploadWithProgress(file, onProgress) {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.open('POST', '/api/upload');

    xhr.upload.addEventListener('progress', (event) => {
      if (event.lengthComputable) {
        const percent = Math.round((event.loaded / event.total) * 100);
        onProgress(percent);
      }
    });

    xhr.onload = () => {
      if (xhr.status >= 200 && xhr.status < 300) {
        resolve(JSON.parse(xhr.response));
      } else {
        reject(new Error(`Upload failed with status ${xhr.status}`));
      }
    };

    xhr.onerror = () => reject(new Error('Network error'));

    const formData = new FormData();
    formData.append('file', file);
    xhr.send(formData);
  });
}

Now we can integrate it with async/await and visually update progress:

async function handleUpload(file) {
  const status = document.getElementById('status');
  try {
    const result = await uploadWithProgress(file, (percent) => {
      status.innerText = `Uploading... ${percent}%`;
    });
    status.innerText = 'Upload complete: ' + result.fileName;
  } catch (e) {
    status.innerText = 'Error: ' + e.message;
  }
}

This hybrid approach—XHR for progress, async/await for code clarity—combines the best of both worlds.

4. Integrating Async Uploads into a Larger Application

In practice, file uploads might be part of a more complex data pipeline: validating form inputs, scheduling batch uploads, or notifying users once files are processed. Async workflows make these pipelines easy to coordinate:

async function submitData() {
  const file = document.getElementById('fileInput').files[0];
  const meta = { userId: '12345', timestamp: Date.now() };

  if (!file) return alert('No file selected');

  try {
    const response = await uploadWithProgress(file, (progress) => {
      console.log('Progress:', progress);
    });

    const registration = await fetch('/api/registerUpload', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ fileName: response.fileName, ...meta })
    });

    if (!registration.ok) {
      throw new Error('Failed to register upload.');
    }

    alert('File uploaded and registered!');
  } catch (err) {
    alert('Error: ' + err.message);
  }
}

This demonstrates async coordination among multiple network operations, where each step logically follows the previous one while preserving non-blocking behavior.
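When a user selects several files, the same coordination extends naturally with Promise.all(). Below is a minimal sketch; uploadOne is injected as a parameter so the helper stays generic (in this page it would be the uploadWithProgress function from earlier):

```javascript
// Upload several files concurrently and resolve once every upload finishes.
// `uploadOne` is whatever single-file uploader you already have (for this
// page, uploadWithProgress); injecting it keeps the helper self-contained.
async function uploadAll(files, uploadOne) {
  const uploads = Array.from(files).map((file) => uploadOne(file));
  return Promise.all(uploads); // array of server responses, in input order
}
```

Keep in mind that Promise.all() rejects as soon as any single upload fails; use Promise.allSettled() instead if the remaining uploads should be allowed to finish.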

5. Performance Tips and Best Practices

  • Batch Uploads: Use Promise.all() to upload multiple files concurrently (or Promise.allSettled() when the remaining uploads should continue after one fails).
  • Chunked Uploads: For large files, split them into chunks and upload incrementally for better fault tolerance.
  • Error Recovery: Implement retry logic with exponential backoff when network errors occur.
  • Compression and Optimization: Compress files client-side before uploading (for example, images).
  • Server Coordination: Ensure your API returns structured JSON responses so the client can easily interpret success or failure.
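The error-recovery tip above can be sketched as a small wrapper. This is one possible shape rather than a library API; withRetry and its option names are invented for this example:

```javascript
// Retry an async operation with exponential backoff: wait 500 ms, 1 s,
// 2 s, ... between attempts, then give up and rethrow the last error.
async function withRetry(fn, { retries = 3, baseDelayMs = 500 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retries) throw err; // attempts exhausted
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}

// Usage with the uploadFile function from earlier:
// const result = await withRetry(() => uploadFile(file));
```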

By combining async programming syntax with modern browser APIs, developers can craft file upload systems that are resilient, fast, and smooth—without blocking the user interface.
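As a concrete sketch of the chunked-upload tip, Blob.slice() splits a file into fixed-size pieces that can be sent one at a time. The /api/upload-chunk endpoint and its metadata fields are assumptions for illustration; a real server would need matching reassembly logic:

```javascript
// Split a File/Blob into fixed-size pieces using Blob.slice().
const CHUNK_SIZE = 1024 * 1024; // 1 MiB per chunk

function sliceIntoChunks(file, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let offset = 0; offset < file.size; offset += chunkSize) {
    chunks.push(file.slice(offset, offset + chunkSize));
  }
  return chunks;
}

// Send chunks sequentially, tagging each with its position so the
// (hypothetical) server endpoint can reassemble the original file.
async function uploadInChunks(file) {
  const chunks = sliceIntoChunks(file);
  for (let i = 0; i < chunks.length; i++) {
    const formData = new FormData();
    formData.append('chunk', chunks[i]);
    formData.append('index', String(i));
    formData.append('total', String(chunks.length));
    formData.append('fileName', file.name);
    const response = await fetch('/api/upload-chunk', { method: 'POST', body: formData });
    if (!response.ok) throw new Error(`Chunk ${i} failed`);
  }
}
```

Because each chunk is its own request, a failure only loses one chunk’s worth of progress, and the retry wrapper above can be applied per chunk.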

Conclusion

Modern JavaScript offers elegant tools for managing asynchronous tasks. Using fetch and async/await (with XHR fallbacks for progress tracking) results in file uploads that are maintainable, responsive, and ready for professional web applications. Implement these concepts to enhance your next project’s user experience while maintaining high performance.

 
