Tags: javascript, google-chrome, base64, compression, frontend

Split an uploaded file into multiple chunks using JavaScript


I'm looking for a way to split up any text/data file on the front end, in the browser, before it is uploaded as multiple files. My limit is 40KB per upload. So if a user uploads a 400KB file, it would be split into 10 separate chunks (10 separate files) on the front end before being uploaded to the server.

Currently, I'm doing it by converting the file into a base64-formatted string, then splitting that string into 40KB pieces, which comes out to 10 separate chunks. From there I upload each chunk with a filename like chunk-1-of-10, chunk-2-of-10...
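For reference, a minimal sketch of that base64 approach might look like the following (the helper name and details are illustrative, not taken from my actual code):

    // Sketch of the approach described above: read the file as a base64
    // data URL, then split the string into 40KB pieces.
    function splitFileAsBase64(file, chunkSize = 40000) {
      return new Promise(resolve => {
        const reader = new FileReader()
        reader.onload = () => {
          // drop the "data:...;base64," prefix, keep only the base64 payload
          const base64 = reader.result.split(',')[1]
          const chunks = []
          for (let i = 0; i < base64.length; i += chunkSize) {
            chunks.push(base64.slice(i, i + chunkSize))
          }
          resolve(chunks) // each entry is uploaded as chunk-1-of-N, chunk-2-of-N, ...
        }
        reader.readAsDataURL(file)
      })
    }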

When pulling these files back down, I just concatenate all the chunks and decode the result from base64 back into its original file format.
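The download side, roughly (chunk URLs and helper name are assumptions for illustration):

    // Sketch of the reassembly step described above: fetch every chunk,
    // join the base64 strings, and decode back into a Blob.
    async function reassembleFromBase64(chunkUrls, type = 'application/octet-stream') {
      const parts = await Promise.all(chunkUrls.map(u => fetch(u).then(r => r.text())))
      const binary = atob(parts.join(''))
      const bytes = Uint8Array.from(binary, c => c.charCodeAt(0))
      return new Blob([bytes], { type })
    }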

Is there a better way of doing this? Is there a library that handles all of this instead of writing it from scratch? I'm not sure if the base64 route is the best way to do this.


Solution

  • There is no need to read the content into RAM with a FileReader. Using base64 will only increase the size of what you need to upload; base64 takes up roughly 33% more space.

    Use Blob.slice to get chunks

    Blob slices (chunks) do not increase memory use; a slice just creates a new reference to the original blob with a changed offset and a new size, telling it where to start reading and how much to read.

    When fetch sends the data, it will be piped directly from disk to the network without even touching the main thread.

    // simulate a file from an <input type="file">
    const file = new File(['a'.repeat(1000000)], 'test.txt')
    
    const chunkSize = 40000
    const url = 'https://httpbin.org/post'
    
    // top-level await works in an ES module; otherwise wrap the loop in an async function
    for (let start = 0; start < file.size; start += chunkSize) {
      // slice() only creates a reference with a new offset/size, nothing is read into memory
      const chunk = file.slice(start, start + chunkSize)
      const fd = new FormData()
      fd.set('data', chunk)
      
      await fetch(url, { method: 'POST', body: fd }).then(res => res.text())
    }