Bugzilla
Categories
(Core :: DOM: File, defect, P2)
People
(Reporter: sheelgautam, Assigned: baku, NeedInfo)
User Agent: Mozilla/ (Macintosh; Intel Mac OS X ; rv) Gecko/ Firefox/
Steps to reproduce:
So, I went to www.cronistalascolonias.com.ar and downloaded a big .zip file of around 1 GB in size.
Actual results:
Towards the end of the download (after 99%), the entire browser froze and ran like a slideshow. It became responsive again once the file was saved into the Downloads folder. This happens only with large files (1 GB or greater).
Expected results:
The browser should not become unresponsive. Chrome and Safari download just fine from www.cronistalascolonias.com.ar.
My computer is a Non Retina Macbook Air. It has 4GB RAM.
The browser crashes when downloading from www.cronistalascolonias.com.ar in safe mode.
So, I downloaded a file in normal mode while making a recording.
As you can see, the browser freezes towards the end of the download.
Also, QuickTime crashed midway during the recording. I couldn't complete the recording.
Is Firefox saving the download in memory instead of to the disk?
(In reply to sheelgautam from comment #5)
> Is Firefox saving the download in memory instead of to the disk?
It doesn't normally, no.
Are there any disk permissions / disk types that differ between the target directory where you're saving the file and your temporary folders?
Could you try using the profiler ( www.cronistalascolonias.com.ar ) to get a performance profile of the end of the download? If you start profiling right before the freeze happens, finish/capture the profile when Firefox returns to normal, and then use "Share" to get a link you can post here, we can have a look at what is making the process slow on your machine. Off-hand it's hard to be sure what the issue is.
Here is the profile. I hope I did it right.
www.cronistalascolonias.com.ar
Sometimes www.cronistalascolonias.com.ar says "Firefox has an insufficient buffer to decrypt data in the browser". I don't know what that means.
Reporter: The profile is interesting, thank you. There are some bits missing at the beginning (possibly because the profiling buffer became full), which means I'm not sure how you ended up in this state / what tripped the bad behaviour. Can you provide more (step-by-step) details about how to reproduce this? Is the encryption perhaps something you turn on within www.cronistalascolonias.com.ar, or something like that?
There's definitely some weird jankiness going on, and it looks like www.cronistalascolonias.com.ar uses blobs in an unusual way. :baku, it looks like marshaling (large?) blobs between processes is causing the jank. Can you take a look at the profile in comment #7 and clarify whether there's enough here for you to investigate/improve things?
Tentatively moving this to Core: DOM: File because the profile has scripts coming from as well as lots of jank from code - feel free to move along to IPC or another more appropriate component.
Here is another profile which I think is clearer than the last one.
www.cronistalascolonias.com.ar
I got this profile from a fresh boot of Firefox with only a single tab open (i.e. www.cronistalascolonias.com.ar). I started profiling when about thirty seconds of the download remained.
The stutter starts to happen when the file is about to be saved to the disk.
I don't observe stutter with smaller file sizes.
I did not turn on any special features on www.cronistalascolonias.com.ar. I think www.cronistalascolonias.com.ar has some built-in encryption feature. If you download from www.cronistalascolonias.com.ar, you will notice that it doesn't download straight away. I think it first downloads the file to some temporary location, then decrypts it (this is purely a hunch), and then transfers it to the Downloads folder.
During the previous profiling, I was browsing even when it began to stutter. This time, I did not touch the computer during the profiling.
Also, I think the first profile got messed up, probably because I ran the profiler several times before the actual profiling. I had no idea it would mess up the result.
Am I right to say that the majority of the time is taken by nsPipe to release data?
Would it be possible to call free() on a separate thread?
(In reply to Andrea Marchesini [:baku] from comment #13)
> Am I right to say that the majority of the time is taken by nsPipe to release data?
Yes, exactly. This is the input stream data of course (1GB buffer). This is freed from here
> Would it be possible to call free() on a separate thread?
It would be an improvement, but it won't be a complete solution: for example, we'll probably be holding the allocator lock during some of this time, so if the main thread tries to (de-)allocate memory while this 1GB buffer is being deallocated on the background thread, it would stall.
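For what it's worth, the off-main-thread free idea can be sketched in plain C++. This is a generic illustration with made-up names (`FreeOffMainThread` is not a Gecko API), using a raw std::thread where Gecko would presumably use a background task queue:

```cpp
#include <atomic>
#include <memory>
#include <thread>

// Hypothetical sketch: move the large download buffer into a worker so the
// underlying free() runs there, not on the caller's (main) thread.  Caveat
// from the comment above: if the allocator takes a process-wide lock, a
// concurrent main-thread allocation can still stall while the worker is
// inside free().
std::thread FreeOffMainThread(std::unique_ptr<char[]> aBuf,
                              std::atomic<bool>& aFreed) {
  return std::thread([buf = std::move(aBuf), &aFreed]() mutable {
    buf.reset();          // the actual deallocation happens here
    aFreed.store(true);
  });
}
```

The caller joins or detaches the returned thread; the main thread only pays for the move of the pointer, not for the deallocation itself.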
A better question to ask ourselves is why we need to keep 1GB of data in memory while downloading a file. (I'm using the peak memory usage of 1,MB in the profile to conclude that the entire downloaded file is stored in the parent process' memory while the download is in progress, and for up to 5 seconds afterwards too.) Can we occasionally flush this buffer, or something to that effect?
> A better question to ask ourselves is why we need to keep 1GB of data in memory while downloading a file. (I'm using the peak memory usage of 1,MB in the profile to conclude that the entire downloaded file is stored in the parent process' memory while the download is in progress, and for up to 5 seconds afterwards too.) Can we occasionally flush this buffer, or something to that effect?
This is how mega works: it creates a huge blob object in memory, decrypts it, and then lets the user download it.
As for those 5 seconds, I would like to get rid of them, but we don't have a nice way to keep blobs alive during loading and all the async steps. Let's keep that a separate issue.
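For reference, the "occasionally flush this buffer" idea from the comment above could look roughly like the following generic C++ sketch (illustrative names, not actual Gecko or nsPipe code): each incoming chunk is appended straight to the destination file, so peak memory stays at one chunk instead of the whole 1GB.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Hypothetical sketch: stream a download to disk chunk by chunk instead of
// accumulating it in one giant in-memory buffer.
class ChunkedWriter {
 public:
  explicit ChunkedWriter(const std::string& aPath)
      : mFile(std::fopen(aPath.c_str(), "wb")) {}
  ~ChunkedWriter() {
    if (mFile) std::fclose(mFile);
  }

  // Append one network chunk directly to the file; memory usage is bounded
  // by the chunk size, not the total download size.
  bool Append(const std::vector<char>& aChunk) {
    if (!mFile) return false;
    return std::fwrite(aChunk.data(), 1, aChunk.size(), mFile) ==
           aChunk.size();
  }

 private:
  std::FILE* mFile = nullptr;
};
```

This doesn't address the mega case (the site deliberately builds the whole decrypted blob in memory before handing it over), but it is the shape an occasional-flush path would take for ordinary downloads.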