WebGPU and WGSL W3C Working Drafts on Chrome desktop consume way too much RAM: up to 542 MB #4749
I agree that this topic matters. The specifications are the ground truth (or at least are meant to be) for the validation and operation of the API and the language, and lowering the friction of accessing them helps developers, whether they are on a low-power mobile device or on a powerful machine that is simply busy with other tasks. This is especially relevant now that certain text-generation systems hallucinate in their suggestions, while a search can easily lead people to the spec as a better reference. To reproduce the issue, I took a memory profile in Safari (Sonoma 14.5) on an M1 and saw 260-270 MB of usage for both the API and WGSL specs. As a simple attempt, I ran a few HTML minification tools; they did not produce a meaningful reduction in file size, but they did cut memory usage by roughly 10 MB for the API spec. However, I could not reproduce the 50 MB figure for the raw-text version: the equivalent .txt files still used 200 MB of memory in Safari. Maybe I should also try in Chrome.
Recognizing the other priorities in spec development, these improvements might come a bit later, since we need to make sure any change stays compatible with the publication flows we have. Lazy loading might be an obvious approach, but it is also important that the page keeps supporting reader tools, printing, and so on, or at least that we provide separate versions for those.
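One low-effort variant of lazy loading worth experimenting with (an assumption on my part, not something the current publication toolchain does) is CSS `content-visibility: auto`, which lets the browser skip layout and rendering work for off-screen sections while keeping the full DOM in place for find-in-page, reader tools, and printing. A minimal sketch, with an illustrative selector rather than one taken from the actual spec markup:

```css
/* Hypothetical rule for the spec's top-level sections; the selector is
   illustrative, not from the real generated HTML. */
section {
  /* Skip rendering work for sections far outside the viewport. */
  content-visibility: auto;
  /* Reserve an approximate height so the scrollbar and anchor
     navigation stay stable while sections are skipped. */
  contain-intrinsic-size: auto 800px;
}
```

Note that `content-visibility` mainly reduces rendering and layout cost rather than raw DOM memory, so it would likely help responsiveness and some memory pressure but would not eliminate the footprint of a multi-megabyte DOM.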
Thank you for the reply. For the raw text, I merely copied the entire text content (Ctrl-A, Ctrl-C) and pasted it into the empty body of an HTML file; I'm sure a functional document would require considerably more content than that. Its usage "randomly" changes from 200 to 600 MB. This must be Chrome greedily taking up as much RAM as it can to speed up page browsing, achieving the opposite effect. I'm amazed there is no simple way to tell the browser to cap the maximum memory available to a tab: the document is perfectly functional and fast with "just" 200 MB. I am glad this is seen as something that matters, even if it is not a priority right now. For comparison, the Vulkan spec instantly gobbles up 730 MB on my machine, and 2 GB is outrageous.
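For what it's worth, the raw-text comparison file described above can be generated without hand-copying from the browser. A minimal sketch using only the Python standard library (file handling and the example markup are illustrative, not taken from the actual spec):

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.parts.append(data)


def extract_text(html: str) -> str:
    """Return the visible text content of an HTML document."""
    parser = TextExtractor()
    parser.feed(html)
    return "".join(parser.parts)


if __name__ == "__main__":
    # Illustrative input; in practice you would read the saved spec HTML.
    sample = "<html><head><style>p{color:red}</style></head><body><p>Hello spec</p></body></html>"
    print(extract_text(sample))  # -> Hello spec
```

Wrapping the extracted text in a bare `<body>` gives a reproducible baseline file for the memory comparison, without the variability of manual copy-and-paste.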
The online document's pages consume up to 540 MB of memory according to Chrome. Right after opening, a page takes around 350 MB, then grows over time to 500+ MB. This makes it noticeably taxing to keep the spec open (or to keep re-opening it) for sporadic consultation on medium- or low-spec hardware.
Some effort should be made to optimize this: for most use cases (just searching and reading the text) there should be no need to keep so much loaded in memory. For comparison, an HTML file containing only the raw text of the entire document consumes 50 MB.