Large file upload (more than 1 GB) using HTML5 Web Workers, Drag and Drop and XHR2


This tutorial provides a guide and code examples for uploading large files with the File API inside a Web Worker. When we try to upload a large file on the main thread, the browser may show a "kill page" warning or become unresponsive. This happens because we block the user agent and consume a lot of CPU, which degrades performance while the upload runs. We need to keep this work from blocking the document. The solution is simple: upload the file with background processing, which is exactly what Web Workers provide. By uploading in a worker, the file transfer runs in the background and the page stays responsive.

In some cases reading the entire file into memory isn't the best option. For example, say you wanted to write an async file uploader. One possible way to speed up the upload would be to read and send the file in separate byte-range chunks. The server component would then be responsible for reconstructing the file content in the correct order. Lucky for us, the File interface supports a slice method for this use case. The method takes a starting byte as its first argument, an ending byte as its second, and an optional content type string as a third.
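As a quick illustration of byte-range slicing (a minimal sketch; the helper name and chunk size are mine, not part of the tutorial's code):

```javascript
// Split a Blob/File into 1 MB byte-range chunks using slice(start, end).
// Hypothetical helper for illustration; the tutorial's worker below does
// the same thing inline.
var BYTES_PER_CHUNK = 1024 * 1024; // 1 MB

function sliceIntoChunks(blob) {
  var chunks = [];
  var start = 0;
  while (start < blob.size) {
    // slice() is exclusive of the end byte; the last chunk may be shorter.
    chunks.push(blob.slice(start, Math.min(start + BYTES_PER_CHUNK, blob.size)));
    start += BYTES_PER_CHUNK;
  }
  return chunks;
}
```

Each chunk is itself a Blob, so it can be handed straight to `xhr.send()`.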

I will demonstrate the upload using both a file input and drag and drop. You could also use client-side logic to verify that an upload's MIME type matches its file extension, or to restrict the size of an upload. The most straightforward way to load a file is the standard <input type="file"> element; JavaScript returns the list of selected File objects as a FileList. The example below uses the 'multiple' attribute to allow selecting several files at once. Another technique for loading files is native drag and drop from the desktop to the browser, which the example also supports. The worker code then demonstrates reading and uploading the file in chunks.

If you load the page directly from the file system, browsers' "same origin policy" security restrictions will make the upload requests fail with a security exception. To work around this, start Chrome with a command-line flag: chrome --allow-file-access-from-files, or else run the page from a local web server.

This will solve the following problems:
  1. How to upload a large file (more than 1 GB)
  2. Synchronous file uploading
  3. Large file upload with XHR2 using Web Workers
  4. Drag and drop file upload
  5. Multiple file upload without blocking the UI
  6. Increasing performance during file uploads
  7. Background processing of file uploads
  8. Slicing the file

  <input type="file" id="files" name="files[]" multiple />
  <div id="drop_zone">
   Drop files here
  </div>

  <output id="list"></output>


var worker = new Worker('fileupload.js');

worker.onmessage = function(e) {
  console.log(e.data);
};

worker.onerror = werror;

function werror(e) {
  console.log('ERROR: Line ', e.lineno, ' in ', e.filename, ': ', e.message);
}

function handleFileSelect(evt) {
  evt.stopPropagation();
  evt.preventDefault();

  // FileList object: from drag and drop, or from the file input.
  var files = evt.dataTransfer ? evt.dataTransfer.files : evt.target.files;

  // Send the file list to the worker. Copying the FileList into a plain
  // array avoids a DataCloneError in browsers that cannot clone a FileList.
  worker.postMessage({ 'files': [].slice.call(files) });

  // files is a FileList of File objects. List some properties.
  var output = [];
  for (var i = 0, f; f = files[i]; i++) {
    output.push('<li><strong>', escape(f.name), '</strong> (', f.type || 'n/a',
                ') - ', f.size, ' bytes, last modified: ',
                f.lastModifiedDate ? f.lastModifiedDate.toLocaleDateString() : 'n/a',
                '</li>');
  }
  document.getElementById('list').innerHTML = '<ul>' + output.join('') + '</ul>';
}

function handleDragOver(evt) {
  evt.stopPropagation();
  evt.preventDefault();
  evt.dataTransfer.dropEffect = 'copy'; // Explicitly show this is a copy.
}

// Set up the dnd listeners.
var dropZone = document.getElementById('drop_zone');
dropZone.addEventListener('dragover', handleDragOver, false);
dropZone.addEventListener('drop', handleFileSelect, false);
document.getElementById('files').addEventListener('change', handleFileSelect, false);

Worker (fileupload.js):

var files = [], p = true;

function upload(blobOrFile) {
  // A synchronous XHR is allowed inside a worker and keeps the chunks in order.
  var xhr = new XMLHttpRequest();
  xhr.open('POST', '/server', false);
  xhr.onload = function(e) {
  };
  xhr.send(blobOrFile);
}

function process() {
  for (var j = 0; j < files.length; j++) {
    var blob = files[j];

    const BYTES_PER_CHUNK = 1024 * 1024; // 1MB chunk sizes.
    const SIZE = blob.size;

    var start = 0;
    var end = BYTES_PER_CHUNK;

    while (start < SIZE) {
      // Older browsers shipped slice() under a vendor prefix.
      var chunk;
      if ('mozSlice' in blob) {
        chunk = blob.mozSlice(start, end);
      } else if ('webkitSlice' in blob) {
        chunk = blob.webkitSlice(start, end);
      } else {
        chunk = blob.slice(start, end);
      }
      upload(chunk);

      start = end;
      end = start + BYTES_PER_CHUNK;
    }
    p = (j === files.length - 1);
    self.postMessage(blob.name + ' uploaded successfully');
  }
}

self.onmessage = function(e) {
  for (var j = 0; j < e.data.files.length; j++) {
    files.push(e.data.files[j]);
  }
  if (p) {
    process();
  }
};

I hope this article has helped you upload large files. Web Workers are an underutilized and under-appreciated feature of HTML5. We can minimize the work needed to upload a large file: the technique is to slice the upload into multiple chunks, spawn an XHR for each portion, and put the file together on the server. This is similar to how Gmail uploads large attachments so quickly.

Further Reading
  1. Mastering Web Workers
  2. Reading files and chunking
  3. HTML5



July 24, 2012 at 3:38 PM

Thank you for this example.
Could you share the server script as well? I don't really get how to save the file server-side.
Thank you in advance,


January 16, 2013 at 9:30 AM

While this is a great overview, I wasn't able to implement it due to a few bugs.

January 18, 2013 at 8:50 PM

Bro... change it globally... it's hard to read using mobile browsers

January 24, 2013 at 4:06 PM

I will share the PHP script in a quick session.

February 4, 2013 at 10:24 PM

consloe.log - in the first JavaScript - must be console.log.
Except that little thing - great post.

March 13, 2013 at 3:07 PM

Update: the webkitSlice prefix was removed, so you can use slice as in the specification.

April 19, 2013 at 2:32 PM

Hi. Thanks for sharing.

Dunno if I missed something, but when I select a file, I get this error in the console: "DataCloneError: The object could not be cloned.
'files' : files"

I searched on the web but didn't really find what the reason for this error can be.

As previously asked, could you share a server-side script (PHP?) to rebuild the file on the server.

Thank you in advance.

June 11, 2013 at 5:14 AM

Great article.

Can you provide the PHP script so I can try this on my computer and test how much memory it will consume?

Thanks in advance

June 27, 2013 at 2:10 PM

Great article.

Hi, can you please send the PHP script.

November 6, 2013 at 8:46 PM

Great! Could you please share the PHP script? Thanks in advance, Jan

January 26, 2014 at 11:11 AM

It would be nice if you could attach the working project for reference.

January 30, 2014 at 10:04 AM

Okay, I will place the working project ASAP.

May 22, 2014 at 9:09 AM

Can you please share how to save the file on the server side?

June 26, 2014 at 7:01 PM

Please share how to save the file on the server side?

June 26, 2014 at 7:55 PM

Please share how to save this file to a particular location on the server, i.e. a shared location on the server?

February 13, 2015 at 5:24 PM

blob.webkitSlice (or blob.WebSlice) is not working in Safari; could you please suggest an alternative? Tried a few snippets; nothing worked. The method should return a Blob object which has the size and type of the blob. Any help or pointer appreciated

February 24, 2015 at 3:47 PM

Hi Rakesh,

webkitSlice is deprecated; use blob.slice

June 1, 2015 at 9:19 PM

I am also interested in seeing your server-side script. Was that posted at some point? Thank you for the share