
feat: CompressionStream #11728

Merged: 35 commits merged into denoland:main from crowlKats:compression_stream on Jan 24, 2022

Conversation

crowlKats (Member)

Closes #7113

@lucacasonato lucacasonato self-requested a review August 16, 2021 16:21
@lucacasonato lucacasonato added this to the 1.14.0 milestone Aug 16, 2021
@lucacasonato (Member)

I just looked into this. I think the only way we can get it to work is by splitting the reads from and the writes into a compressor into separate ops that run in parallel. This is because a single write may not always correspond to a single read.

How exactly this can be most trivially implemented is still up for debate. I'll revisit this again.
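
To illustrate the mismatch with a hypothetical buffering transform (not the ops design the PR landed on): a compressor may hold input internally, so one write can yield no output while a later write or the final flush yields several chunks.

const pending: Uint8Array[] = [];
const compressorLike = new TransformStream<Uint8Array, Uint8Array>({
  transform(chunk, controller) {
    // A compressor may buffer input: this write can produce
    // no readable chunk at all...
    pending.push(chunk);
    if (pending.length >= 4) {
      // ...while a later write releases several chunks at once.
      for (const c of pending.splice(0)) controller.enqueue(c);
    }
  },
  flush(controller) {
    // Whatever is still buffered comes out on close.
    for (const c of pending.splice(0)) controller.enqueue(c);
  },
});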

@bartlomieju bartlomieju modified the milestones: 1.14.0, 1.15.0 Sep 13, 2021
@lucacasonato lucacasonato removed this from the 1.15.0 milestone Oct 11, 2021
@CLAassistant

CLAassistant commented Oct 15, 2021

CLA assistant check
All committers have signed the CLA.

@zandaqo

zandaqo commented Oct 16, 2021

This could be a nice solution for compression in Deno: it's standards-compliant and should be more performant than third-party JS/Wasm modules. As I understand it, exposing zlib in Deno the way Node does is problematic, but this would cover most use cases for compression without exposing zlib.

@linux-china (Contributor)

Please include brotli support.

@crowlKats (Member, Author)

Please include brotli support.

We won't; it isn't part of the specification.

@kt3k (Member)

kt3k commented Nov 4, 2021

Please include brotli support.

Maybe we can develop std/compress for non-standard algorithms, like we did std/crypto for non-standard digest functions.
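
For reference, std/crypto already follows that pattern for digests: it wraps Web Crypto and accepts extra algorithms. A quick sketch (the unpinned std URL is for illustration; pin a version in real code):

import { crypto } from "https://deno.land/std/crypto/mod.ts";

// Plain Web Crypto only knows the SHA family; std/crypto's wrapper
// also accepts non-standard digests such as BLAKE3.
const data = new TextEncoder().encode("hello world");
const digest = await crypto.subtle.digest("BLAKE3", data);
console.log(new Uint8Array(digest));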

@zandaqo

zandaqo commented Nov 4, 2021

I assume the most common use case for this API will be on-the-fly compression of response bodies. In that case, the default compression (zlib's level 6) is good enough, and in most cases it is arguably a much better choice than, say, brotli, given the complexity/performance trade-off.

@dojyorin

I used CompressionStream in web application development and found it very fast and convenient, so I would like Deno to implement it as well!
I'm sure it will bring a wonderful experience.
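
For reference, typical usage of the standard API looks like this (the same surface this PR implements):

// Compress a response body on the fly by piping it through the transform.
const res = await fetch("https://example.com/data.txt");
const gzipped = res.body!.pipeThrough(new CompressionStream("gzip"));

// DecompressionStream is the inverse transform.
const roundTripped = gzipped.pipeThrough(new DecompressionStream("gzip"));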

@ry ry marked this pull request as ready for review January 23, 2022 23:39
@ry (Member)

ry commented Jan 23, 2022

@crowlKats @lucacasonato This is close to working condition now. PTAL.

@lucacasonato (Member) left a comment

Looks great! The JS-side bindings are not quite correct yet. I'll clean them up tomorrow, but then this is good to land!

Great job on figuring out this flate2 stuff. It looks so simple now that it's done 🙂

Inner::GzEncoder(d) => {
    // Write the input through the gzip encoder...
    d.write_all(&input)?;
    // ...then drain whatever compressed bytes the encoder has pushed
    // into its inner Vec<u8> so far. A write does not have to produce
    // output immediately, so this may yield an empty vector.
    d.get_mut().drain(..).collect()
}
@lucacasonato (Member) commented on the diff above:
This is the trick I couldn't figure out in previous attempts. Great! 👍

@lucacasonato (Member)

@ry You either don't have an up-to-date WPT submodule, or you didn't run ./tools/wpt.ts setup before running the WPT tests, because some tests were not run. The /compression/decompression-correct-input.tentative.any.html test hangs for me when running locally.

@lucacasonato (Member)

Boiled it down to this test case failing:

const deflateChunkValue = new Uint8Array([120, 156, 75, 173, 40, 72, 77, 46, 73, 77, 81, 200, 47, 45, 41, 40, 45, 1, 0, 48, 173, 6, 36]);
const trueChunkValue = new TextEncoder().encode('expected output');

const ds = new DecompressionStream('deflate');
const reader = ds.readable.getReader();
const writer = ds.writable.getWriter();
const writePromise = writer.write(deflateChunkValue);
const { done, value } = await reader.read();
assert_array_equals(Array.from(value), trueChunkValue, "value should match");
await writePromise;
await writer.close();

The test fails on the line reader.read() with the following error:

error: Module evaluation is still pending but there are no pending ops or dynamic imports. This situation is often caused by unresolved promises.

@lucacasonato (Member)

Pushed the changes required to fix the IDL stuff. The happy case is not working at all, though; it always fails with: error: Module evaluation is still pending but there are no pending ops or dynamic imports. This situation is often caused by unresolved promises.

@ry (Member)

ry commented Jan 24, 2022

@lucacasonato Yes I experienced that too. This seems to be a bug deep inside Deno somewhere, not in this compression stream - there are no async ops.

For example if I run:

import { assertEquals } from "https://deno.land/std/testing/asserts.ts";
const deflateChunkValue = new Uint8Array([120, 156, 75, 173, 40, 72, 77, 46, 73, 77, 81, 200, 47, 45, 41, 40, 45, 1, 0, 48, 173, 6, 36]);
const trueChunkValue = new TextEncoder().encode('expected output');
async function main() {
  const ds = new DecompressionStream('deflate');
  const reader = ds.readable.getReader();
  const writer = ds.writable.getWriter();
  console.log("A", Deno.resources());
  await writer.write(deflateChunkValue);
  console.log("B", Deno.resources());
  const { done, value } = await reader.read();
  assertEquals(Array.from(value), trueChunkValue, "value should match");
  console.log("C", Deno.resources());
}
main();

It never gets to the second console.log; instead the program just exits. If you remove the await on the writer.write line, it never gets to the third console.log. Something very wrong is happening.

@lucacasonato (Member)

Hm, that looks like a bug in the TransformStream implementation then. We kind of need to solve that first. I'll investigate it later today.
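
For context on the write/read coupling here: with TransformStream's default queuing strategies (a readable high-water mark of 0), a writer.write() promise does not settle until the readable side is pulled, which is why the WPT test starts the read before awaiting the write. A minimal illustration of the spec'd behavior (not the Deno internals):

const ts = new TransformStream<string, string>();
const writer = ts.writable.getWriter();
const reader = ts.readable.getReader();

const writePromise = writer.write("chunk"); // pending: nothing has pulled yet
const { value } = await reader.read();      // the read pulls, releasing the write
await writePromise;                         // now it settles
console.log(value); // "chunk"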

@ry (Member) left a comment

I think this first pass is done. LGTM

@bartlomieju (Member) left a comment

LGTM too, nice work

@lucacasonato (Member) left a comment

LGTM!

@lucacasonato lucacasonato merged commit 30ddf43 into denoland:main Jan 24, 2022
@crowlKats crowlKats deleted the compression_stream branch January 31, 2022 17:35