
Commit

chore: update deno build
juanjoDiaz committed May 26, 2023
1 parent 64e44e8 commit 5d47e66
Showing 13 changed files with 70 additions and 136 deletions.
140 changes: 36 additions & 104 deletions packages/node/dist/deno/README.md
@@ -1,4 +1,4 @@
-# @streamparser/json
+# @streamparser/json-node

[![npm version][npm-version-badge]][npm-badge-url]
[![npm monthly downloads][npm-downloads-badge]][npm-badge-url]
@@ -10,63 +10,40 @@ Fast dependency-free library to parse a JSON stream using utf-8 encoding in Node
*tldr;*

```javascript
-import { JSONParser } from '@streamparser/json-whatwg';
-
-const inputStream = new ReadableStream({
-  async start(controller) {
-    controller.enqueue('{ "test": ["a"] }');
-    controller.close();
-  },
-});
+import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser();
-const reader = inputStream.pipeThrough(jsonparser).pipeTo(destinationStream)
+inputStream.pipe(jsonparser).pipe(destinationStream);

-// Or manually getting the values
-
-const reader = inputStream.pipeThrough(jsonparser).getReader();
-while (true) {
-  const { done, value } = await reader.read();
-  if (done) break;
-  processValue(value);
-  // There will be 3 values:
-  // "a"
-  // ["a"]
-  // { test: ["a"] }
-}
+// Or using events to get the values
+
+parser.on("data", (value) => { /* ... */ });
+parser.on("error", (err) => { /* ... */ });
+parser.on("end", () => { /* ... */ });
```

-## streamparser/json ecosystem
+## @streamparser/json ecosystem

There are multiple flavours of @streamparser:

* The **[@streamparser/json](https://www.npmjs.com/package/@streamparser/json)** package allows parsing any JSON string or stream using pure Javascript.
-* The **[@streamparser/json-whatwg](https://www.npmjs.com/package/@streamparser/json-whatwg)** wraps `@streamparser/json` into WHATWG `@streamparser/json-whatwg`.
+* The **[@streamparser/json-whatwg](https://www.npmjs.com/package/@streamparser/json-whatwg)** wraps `@streamparser/json` into a WHATWG TransformStream.
+* The **[@streamparser/json-node](https://www.npmjs.com/package/@streamparser/json-node)** wraps `@streamparser/json` into a node Transform stream.

-## Dependencies / Polyfilling
-
-@streamparser/json requires a few ES6 classes:
-
-* [Uint8Array](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array)
-* [TextEncoder](https://developer.mozilla.org/en-US/docs/Web/API/TextEncoder)
-* [TextDecoder](https://developer.mozilla.org/en-US/docs/Web/API/TextDecoder)
-* [TransformStream](https://developer.mozilla.org/en-US/docs/Web/API/TransformStream)
-
-If you are targeting browsers or systems in which these might be missing, you need to polyfill them.

## Components

### Tokenizer

-A JSON compliant tokenizer that parses a utf-8 stream into JSON tokens
+A JSON compliant tokenizer that parses a utf-8 stream into JSON tokens that are emitted as objects.

```javascript
-import { Tokenizer } from '@streamparser/json-whatwg';
+import { Tokenizer } from '@streamparser/json-node';

-const tokenizer = new Tokenizer(opts, writableStrategy, readableStrategy);
+const tokenizer = new Tokenizer(opts, transformOpts);
```

-Writable and readable strategy are standard WhatWG Stream settings (see [MDN](https://developer.mozilla.org/en-US/docs/Web/API/TransformStream/TransformStream)).
+Transform options take the standard node Transform stream settings (see [Node docs](https://nodejs.org/api/stream.html#class-streamtransform)).

The available options are:

@@ -91,12 +68,12 @@ Strings are immutable in Javascript so every string operation creates a new string
A token parser that processes JSON tokens as emitted by the `Tokenizer` and emits JSON values/objects.

```javascript
-import { TokenParser } from '@streamparser/json-whatwg';
+import { TokenParser } from '@streamparser/json-node';

-const tokenParser = new TokenParser(opts, writableStrategy, readableStrategy);
+const tokenParser = new TokenParser(opts, transformOpts);
```

-Writable and readable strategy are standard WhatWG Stream settings (see [MDN](https://developer.mozilla.org/en-US/docs/Web/API/TransformStream/TransformStream)).
+Transform options take the standard node Transform stream settings (see [Node docs](https://nodejs.org/api/stream.html#class-streamtransform)).

The available options are:

@@ -116,7 +93,7 @@ The available options are:
The full blown JSON parser. It basically chains a `Tokenizer` and a `TokenParser`.

```javascript
-import { JSONParser } from '@streamparser/json-whatwg';
+import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser();
```
@@ -135,53 +112,17 @@ const jsonParser = tokenizer.pipeThrough(tokenParser);
You can subscribe to the resulting data using the

```javascript
-import { JSONParser } from '@streamparser/json-whatwg';
-
-const inputStream = new ReadableStream({
-  async start(controller) {
-    controller.enqueue(parser.write('"Hello world!"')); // will log "Hello world!"
-    // Or passing the stream in several chunks
-    parser.write('"');
-    parser.write('Hello');
-    parser.write(' ');
-    parser.write('world!');
-    parser.write('"'); // will log "Hello world!"
-    controller.close();
-  },
-});
+import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser({ stringBufferSize: undefined, paths: ['$'] });
-const reader = inputStream.pipeThrough(jsonparser).getReader();
-while (true) {
-  const { done, value } = await reader.read();
-  if (done) break;
-  console.log(value);
-}
-```
-
-Write is always a synchronous operation so any error during the parsing of the stream will be thrown during the write operation. After an error, the parser can't continue parsing.
-
-```javascript
-import { JSONParser } from '@streamparser/json-whatwg';
-
-const inputStream = new ReadableStream({
-  async start(controller) {
-    controller.enqueue(parser.write('"""'));
-    controller.close();
-  },
-});
-const parser = new JSONParser({ stringBufferSize: undefined });
-
-try {
-  const reader = inputStream.pipeThrough(parser).getReader();
-  while (true) {
-    const { done, value } = await reader.read();
-    if (done) break;
-    console.log(value);
-  }
-} catch (err) {
-  console.log(err);
-}
+inputStream.pipe(jsonparser).pipe(destinationStream);
+
+// Or using events to get the values
+
+parser.on("data", (value) => { /* ... */ });
+parser.on("error", (err) => { /* ... */ });
+parser.on("end", () => { /* ... */ });
```

## Examples
@@ -191,47 +132,38 @@ try {
Imagine an endpoint that sends a large amount of JSON objects one after the other (`{"id":1}{"id":2}{"id":3}...`).

```js
-import { JSONParser } from '@streamparser/json-whatwg';
+import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser();

const response = await fetch('http://example.com/');
-const reader = response.body.pipeThrough(parser).getReader();
-while (true) {
-  const { done, value } = await reader.read();
-  if (done) break;
-  // TODO process element
-}
+const reader = response.body.pipe(parser);
+reader.on('data', (value) => { /* process element */ });
```

### Stream-parsing a fetch request returning a JSON array

Imagine an endpoint that sends a large amount of JSON objects one after the other (`[{"id":1},{"id":2},{"id":3},...]`).

```js
-import { JSONParser } from '@streamparser/json-whatwg';
+import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser({ stringBufferSize: undefined, paths: ['$.*'], keepStack: false });

const response = await fetch('http://example.com/');

-const reader = response.body.pipeThrough(parser).getReader();
-while (true) {
-  const { done, value: parsedElementInfo } = await reader.read();
-  if (done) break;
-  const { value, key, parent, stack } = parsedElementInfo;
-  // TODO process element
-}
+const reader = response.body.pipe(parser);
+reader.on('data', ({ value, key, parent, stack }) => { /* process element */ });
```

## License

See [LICENSE.md].

-[npm-version-badge]: https://badge.fury.io/js/@streamparser%2Fjson.svg
-[npm-badge-url]: https://www.npmjs.com/package/@streamparser/json
-[npm-downloads-badge]: https://img.shields.io/npm/dm/@streamparser%2Fjson.svg
+[npm-version-badge]: https://badge.fury.io/js/@streamparser%2Fjson-node.svg
+[npm-badge-url]: https://www.npmjs.com/package/@streamparser/json-node
+[npm-downloads-badge]: https://img.shields.io/npm/dm/@streamparser%2Fjson-node.svg
[build-status-badge]: https://github.com/juanjoDiaz/streamparser-json/actions/workflows/on-push.yaml/badge.svg
[build-status-url]: https://github.com/juanjoDiaz/streamparser-json/actions/workflows/on-push.yaml
[coverage-status-badge]: https://coveralls.io/repos/github/juanjoDiaz/streamparser-json/badge.svg?branch=main
2 changes: 1 addition & 1 deletion packages/node/dist/deno/index.ts
@@ -10,4 +10,4 @@ export {
TokenParserMode,
type StackElement,
TokenType,
-} from "https://deno.land/x/[email protected].14/index.ts";
+} from "https://deno.land/x/[email protected].15/index.ts";
2 changes: 1 addition & 1 deletion packages/node/dist/deno/jsonparser.ts
@@ -3,7 +3,7 @@ import {
type TransformOptions,
type TransformCallback,
} from "stream";
-import { JSONParser, type JSONParserOptions } from "https://deno.land/x/[email protected].14/index.ts";
+import { JSONParser, type JSONParserOptions } from "https://deno.land/x/[email protected].15/index.ts";

export default class JSONParserTransform extends Transform {
private jsonParser: JSONParser;
4 changes: 2 additions & 2 deletions packages/node/dist/deno/tokenizer.ts
@@ -5,7 +5,7 @@ import {
} from "stream";
import Tokenizer, {
type TokenizerOptions,
-} from "https://deno.land/x/[email protected].14/tokenizer.ts";
+} from "https://deno.land/x/[email protected].15/tokenizer.ts";

export default class TokenizerTransform extends Transform {
private tokenizer: Tokenizer;
@@ -55,7 +55,7 @@ export default class TokenizerTransform extends Transform {

override _final(done: any) {
try {
-this.tokenizer.end();
+if (!this.tokenizer.isEnded) this.tokenizer.end();
done();
} catch (err: unknown) {
done(err);
2 changes: 1 addition & 1 deletion packages/node/dist/deno/tokenparser.ts
@@ -3,7 +3,7 @@ import {
type TransformOptions,
type TransformCallback,
} from "stream";
-import { TokenParser, type TokenParserOptions } from "https://deno.land/x/[email protected].14/index.ts";
+import { TokenParser, type TokenParserOptions } from "https://deno.land/x/[email protected].15/index.ts";

export default class TokenParserTransform extends Transform {
private tokenParser: TokenParser;
2 changes: 1 addition & 1 deletion packages/node/dist/deno/utils.ts
@@ -1,4 +1,4 @@
-import type { ParsedElementInfo } from "https://deno.land/x/[email protected].14/utils/types/parsedElementInfo.ts";
+import type { ParsedElementInfo } from "https://deno.land/x/[email protected].15/utils/types/parsedElementInfo.ts";

export function cloneParsedElementInfo(
parsedElementInfo: ParsedElementInfo
19 changes: 10 additions & 9 deletions packages/plainjs/dist/deno/README.md
@@ -10,7 +10,7 @@ Fast dependency-free library to parse a JSON stream using utf-8 encoding in Node
*tldr;*

```javascript
-import { JSONParser } from "https://deno.land/x/[email protected].14/index.ts";
+import { JSONParser } from "https://deno.land/x/[email protected].15/index.ts";

const parser = new JSONParser();
parser.onValue = ({ value }) => { /* process data */ };
@@ -27,12 +27,13 @@ try {
}
```
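The callback style above can be illustrated with a stdlib-only toy (hypothetical `write`/`onValue` shapes, not the real incremental parser):

```javascript
// Toy sketch of the callback API shape shown above: write() feeds text
// in and onValue receives parsed results. Unlike the real JSONParser,
// this version only handles a complete JSON document per write() call.
const parser = {
  onValue: () => {},
  write(text) {
    this.onValue({ value: JSON.parse(text) });
  },
};

const seen = [];
parser.onValue = ({ value }) => seen.push(value);
parser.write('{ "test": ["a"] }');
```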

-## streamparser/json ecosystem
+## @streamparser/json ecosystem

There are multiple flavours of @streamparser:

* The **[@streamparser/json](https://www.npmjs.com/package/@streamparser/json)** package allows parsing any JSON string or stream using pure Javascript.
-* The **[@streamparser/json-whatwg](https://www.npmjs.com/package/@streamparser/json-whatwg)** wraps `@streamparser/json` into WHATWG `@streamparser/json-whatwg`.
+* The **[@streamparser/json-whatwg](https://www.npmjs.com/package/@streamparser/json-whatwg)** wraps `@streamparser/json` into a WHATWG TransformStream.
+* The **[@streamparser/json-node](https://www.npmjs.com/package/@streamparser/json-node)** wraps `@streamparser/json` into a node Transform stream.

## Dependencies / Polyfilling

@@ -51,7 +52,7 @@ If you are targeting browsers or systems in which these might be missing, you need to polyfill them.
A JSON compliant tokenizer that parses a utf-8 stream into JSON tokens

```javascript
-import { Tokenizer } from "https://deno.land/x/[email protected].14/index.ts";
+import { Tokenizer } from "https://deno.land/x/[email protected].15/index.ts";

const tokenizer = new Tokenizer(opts);
```
@@ -162,7 +163,7 @@ A drop-in replacement of `JSONparse` (with few ~~breaking changes~~ improvements


```javascript
-import { JSONParser } from "https://deno.land/x/[email protected].14/index.ts";
+import { JSONParser } from "https://deno.land/x/[email protected].15/index.ts";

const parser = new JSONParser();
```
@@ -222,7 +223,7 @@ You push data using the `write` method which takes a string or an array-like obj
You can subscribe to the resulting data using the

```javascript
-import { JSONParser } from "https://deno.land/x/[email protected].14/index.ts";
+import { JSONParser } from "https://deno.land/x/[email protected].15/index.ts";

const parser = new JSONParser({ stringBufferSize: undefined, paths: ['$'] });
parser.onValue = console.log;
@@ -240,7 +241,7 @@ parser.write('"'); // logs "Hello world!"
Write is always a synchronous operation so any error during the parsing of the stream will be thrown during the write operation. After an error, the parser can't continue parsing.

```javascript
-import { JSONParser } from "https://deno.land/x/[email protected].14/index.ts";
+import { JSONParser } from "https://deno.land/x/[email protected].15/index.ts";

const parser = new JSONParser({ stringBufferSize: undefined });
parser.onValue = console.log;
Expand All @@ -255,7 +256,7 @@ try {
You can also handle errors using callbacks:

```javascript
-import { JSONParser } from "https://deno.land/x/[email protected].14/index.ts";
+import { JSONParser } from "https://deno.land/x/[email protected].15/index.ts";

const parser = new JSONParser({ stringBufferSize: undefined });
parser.onValue = console.log;
@@ -293,7 +294,7 @@ Imagine an endpoint that sends a large amount of JSON objects one after the other
Imagine an endpoint that send a large amount of JSON objects one after the other (`[{"id":1},{"id":2},{"id":3},...]`).

```js
-import { JSONParser } from "https://deno.land/x/[email protected].14/index.ts";
+import { JSONParser } from "https://deno.land/x/[email protected].15/index.ts";

const jsonparser = new JSONParser({ stringBufferSize: undefined, paths: ['$.*'] });
jsonparser.onValue = ({ value, key, parent, stack }) => {
