
wasm serialize & hook interface #4479

Draft
wants to merge 14 commits into base: master

Conversation

jerch
Member

@jerch jerch commented Apr 16, 2023

Early WIP/PoC...

@jerch
Member Author

jerch commented Apr 16, 2023

@Tyriar The build passed in CI 😃 (seems my windows patch also works on azure containers...)

But the runtime penalty is quite big for mac and windows (>3 min), under linux it is only +20s 🤔

To avoid penalizing CI runs with the compiler SDK, I could change .gitignore to include the built wasm file, so the CI does not have to spin up a compiler at all. Downside: the wasm file would have to reside in the repo.


Edit: CI runtime is back to normal 😅 with the two commits below, but now we might need an explicit force-compile step before packing, so an outdated wasm binary cannot slip through by accident. Well, the CI setup still needs fine-tuning...

@jerch
Member Author

jerch commented Apr 17, 2023

Running xterm-benchmark gives these numbers:

   Context "out-benchmark/benchmark/SerializeAddon.benchmark.js"
      Context "Terminal: sh -c "dd if=/dev/urandom count=40 bs=1k | hexdump | lolcat -f""
         Context "serialize"
            Case "#1" : 5 runs - average throughput: 5.91 MB/s
   Context "out-benchmark/benchmark/Serialize2Addon.benchmark.js"
      Context "Terminal: sh -c "dd if=/dev/urandom count=40 bs=1k | hexdump | lolcat -f""
         Context "serialize"
            Case "#1" : 5 runs - average throughput: 64.30 MB/s

@jerch
Member Author

jerch commented Apr 18, 2023

@Tyriar Found a way to avoid costly SDK pulling during normal CI runs:

  • wasm target gets committed into repo by default
  • with inwasm -S recompilation can be force-skipped (compilation would pull SDKs), instead it just bundles & tests the repo version
  • later before packaging inwasm -f can be used to force a recompilation before doing final tests and packaging

Time to get back to the actual serialization code 😸

@Tyriar
Member

Tyriar commented May 18, 2023

later before packaging inwasm -f can be used to force a recompilation before doing final tests and packaging

@jerch committing it and forcing a compile during packaging sounds like a fine compromise to me. package and prepackage are always run before releasing. You could add the compile as postpackage, as that should always run before we release, then just try to remember to always commit it during PR changes so it stays up to date.
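A rough sketch of how that lifecycle wiring could look in package.json - only the inwasm flags come from the discussion above, the build/package commands are placeholders, not the repo's actual scripts:

```json
{
  "scripts": {
    "build": "inwasm -S && tsc",
    "prepackage": "npm run build",
    "package": "webpack",
    "postpackage": "inwasm -f"
  }
}
```

With this shape, day-to-day CI runs go through build (skipping SDK pulls via -S and using the committed wasm binary), while postpackage forces a fresh compile on the release path.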

@@ -0,0 +1,42 @@
## xterm-addon-serialize

An addon for [xterm.js](https://github.com/xtermjs/xterm.js) that enables xterm.js to serialize a terminal framebuffer into a string or HTML. This addon requires xterm.js v4+.
Member

v4 -> v5.2+? Should we just remove that line as there's the experimental line + we call it out in the release notes?

Member Author

Sure thing. I just copy-cloned from the old addon and did not bother to fix all things yet.

Member

Can we add a "serialize2 (wasm)" button to the demo for convenient testing?

Member Author

Yepp, already did that in my local branch (not pushed yet...)

});

describe('text', () => {
// TODO: write a lot more test cases here...
Member

Is it possible to just copy over the old SerializeAddon.test.ts?

Member Author

I did that, and removed all HTML test code (since the addon does not implement any HTML serialization yet). Then I realized that there are no real test cases for the text-based serializer besides a dummy one... 😱

throw new Error('Cannot use addon until it has been loaded');
}

return serialize((this._terminal as any)._core);
Member

Haven't checked what you need here, but casting it to an internal interface like this will catch typing problems at compile time:

const core = (terminal as any)._core as ITerminal;

Member Author

Well that code is still the early hacking version (weirdly hacked-in type conversions, no proper memory control, no nice interface yet). I hope you did not read it too closely, or your eyes will bleed 😺

I am still messing with the right interface idea here - technically the wasm serialize part can act as a static singleton (would save some resources/runtime later on during wasm bootstrapping), as JS does not allow multiple concurrent serializations due to its single-threaded model, and we also cannot asyncify things here, as we have to ensure that the terminal buffer does not change while a serialization is running.
So my current idea is to put everything behind a lazy static initializer, which will serve all serialization requests.
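A minimal sketch of that lazy static initializer idea - all names here (WasmSerializer, getSerializer) are illustrative, not the addon's actual API, and the serialize body is a placeholder for the real wasm call:

```typescript
// Illustrative sketch only - not the addon's actual code.
class WasmSerializer {
  // a real version would compile/instantiate the wasm module here, once
  public serialize(data: string): string {
    return data; // placeholder for the actual wasm serialization
  }
}

let _instance: WasmSerializer | undefined;

// Lazy static initializer: the singleton is only created on first use.
// JS single-threading guarantees requests are served one at a time,
// so one shared instance is enough.
function getSerializer(): WasmSerializer {
  if (!_instance) {
    _instance = new WasmSerializer();
  }
  return _instance;
}
```

The upside of the lazy variant is that consumers who never call serialize never pay for wasm bootstrapping.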

Member

I didn't look too hard, just skimmed it 🙂

Member Author

... did not read ..., tss tss tss 🤣

Comment on lines +109 to +117
// itoa LUT
// note: performant decimal conversion of numbers is actually a hard problem
// we use a LUT approach with 0-99 precomputed in utf16 for better performance
const LUT100 = new Uint32Array(100);
for (let i1 = 0; i1 < 10; ++i1) {
for (let i2 = 0; i2 < 10; ++i2) {
LUT100[i1 * 10 + i2] = (48 + i2) << 16 | (48 + i1);
}
}
Member

This isn't used other than d32.set(LUT100, 64)? We could either set it directly on d32 or put LUT100 in a closure such that we don't hold on to the Uint32Array
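For context, the LUT packs both ASCII digit chars of a value 0-99 into one 32-bit word, so a two-digit write is a single table read plus two 16-bit stores instead of division/modulo per digit. A sketch of how it could be consumed (writeTwoDigits is illustrative, not from the PR):

```typescript
// LUT layout as in the PR: low 16 bits = tens digit char, high 16 bits = ones digit char
const LUT100 = new Uint32Array(100);
for (let i1 = 0; i1 < 10; ++i1) {
  for (let i2 = 0; i2 < 10; ++i2) {
    LUT100[i1 * 10 + i2] = (48 + i2) << 16 | (48 + i1);
  }
}

// Hypothetical helper: writes the two utf16 digit chars of n (0-99)
// into target at pos and returns the next write position.
function writeTwoDigits(target: Uint16Array, pos: number, n: number): number {
  const packed = LUT100[n];
  target[pos] = packed & 0xFFFF;   // tens digit
  target[pos + 1] = packed >>> 16; // ones digit
  return pos + 2;
}

const buf = new Uint16Array(4);
writeTwoDigits(buf, 0, 42);
// String.fromCharCode(buf[0], buf[1]) === '42'
```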

Member Author

Yes, that's part of the bleeding-eye code - see my remarks above.

// single call is pretty wasteful? preload combines once instead?
writeCombinedOverload = (dst, x) => {
let dp = dst >> 1;
const comb: string = (line as any)._combined[x] || '';
Member

Any as-any cast should use internal types where possible; if we need to expose a combined property on the interface, that's fine.

Member Author

Same as above 😺

Comment on lines +122 to +124
let writeCombinedOverload: (dst: number, x: number) => number = (dst, x) => dst;
const writeCombined: (dst: number, x: number) => number = (dst, x) => writeCombinedOverload(dst, x);
let writeLinkOverload: (dst: number, x: number) => number = (dst, x) => dst;
Member

Instead of setting these functions via let, should they just be functions that take line and whatever else is needed?

Member Author
@jerch jerch May 18, 2023

Same as above 😺 - I already reworked that one locally; it is now a static function pointer on a context object and no longer at top level.


const buffer = t.buffer;
const len = t.buffer.lines.length;
let wPos = 150*65536;
Member

Surprised this isn't a lint rule:

Suggested change
let wPos = 150*65536;
let wPos = 150 * 65536;

Member Author
@jerch jerch May 18, 2023

Hmm, interesting. Yeah, I fixed all the linter nagging (was quite a bunch in my hacky code), but this one did not show up.

The hardcoded values will be replaced by proper memory management, probably static again to avoid compiling wasm over and over, instead calling into TextDecoder.decode earlier.
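For illustration, decoding a UTF-16 slice of wasm linear memory via TextDecoder could look like this - the buffer here only simulates wasm memory, and the layout/sizes are made up:

```typescript
// Stand-in for the wasm module's linear memory; real code would use
// new Uint8Array(wasmInstance.exports.memory.buffer).
const memory = new Uint8Array(64);

// Write 'hi' as UTF-16LE bytes, as a serializer might.
memory[0] = 0x68; // 'h' low byte (high byte stays 0)
memory[2] = 0x69; // 'i' low byte

// Decode only the written slice instead of copying char by char.
const decoder = new TextDecoder('utf-16le');
const text = decoder.decode(memory.subarray(0, 4));
```

Decoding early like this means the JS side never needs to grow a large static output buffer of its own.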

@jerch
Member Author

jerch commented May 23, 2023

reminder - also add overline attribute support.
