🧂 more precompile for dynamic functions! #1978
Conversation
This reverts commit a12233e.
@rikhuijzer i realised i don't really know what i'm doing 😅 can you take a look at adding more precompiles for our dynamic things?
Yeah sure. Gladly. Can you give, in code or pseudocode, a list of the methods that are called? Or I can just see what the firebasey sends to the backend and trigger those?
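For readers following along: the pattern under discussion is Julia's `precompile(f, (ArgTypes...,))` directive, placed at module top level so that type inference for that signature runs during package precompilation rather than at first use. A minimal, self-contained sketch (the module and function names below are made up, not Pluto's actual code):

```julia
# Minimal sketch with hypothetical names (not Pluto's actual code).
module TinyNotebooks

struct Notebook
    cells::Vector{String}
end

# A "dynamic" entry point that would normally be compiled at first call.
notebook_to_dict(nb::Notebook) = Dict("cells" => nb.cells)

# Ask Julia to infer and cache this concrete signature during package
# precompilation, so the first real call pays less compilation latency.
precompile(notebook_to_dict, (Notebook,))

end # module
```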
```diff
@@ -169,6 +171,7 @@ function notebook_to_js(notebook::Notebook)
         "cell_execution_order" => cell_id.(collect(topological_order(notebook))),
     )
 end
+precompile(notebook_to_js, (Notebook,))
```
Slight improvement:
Without `precompile`:

```
$ julia --project -ie 'using Pluto'

julia> @time @eval Pluto.notebook_to_js(Pluto.Notebook([Pluto.Cell("1")]));
  3.078880 seconds (6.69 M allocations: 376.505 MiB, 3.90% gc time, 91.05% compilation time)
```

With `precompile`:

```
julia> @time @eval Pluto.notebook_to_js(Pluto.Notebook([Pluto.Cell("1")]));
  3.039852 seconds (6.00 M allocations: 338.446 MiB, 5.37% gc time, 91.06% compilation time)
```
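To keep such before/after numbers comparable, each `@time` above is taken in a fresh session. A small sketch of how one might script that comparison so every measurement starts from a cold method cache (the `--project=.` path is an assumption about where Pluto's `Project.toml` lives):

```julia
# Sketch: run the first call in a brand-new Julia process, mirroring the
# manual fresh sessions used above.
code = """
using Pluto
@time Pluto.notebook_to_js(Pluto.Notebook([Pluto.Cell("1")]))
"""
run(`$(Base.julia_cmd()) --project=. -e $code`)
```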
```diff
 end
+precompile(unpack, (Vector{UInt8},))
```
Minor effect:

Without `precompile`:

```
$ julia --project -ie 'using Pluto'
...

julia> @time @eval Pluto.MsgPack.unpack([0x00]);
  0.511228 seconds (1.27 M allocations: 67.586 MiB, 24.02% gc time, 99.97% compilation time)
```

With `precompile`:

```
julia> @time @eval Pluto.MsgPack.unpack([0x00]);
  0.454856 seconds (1.24 M allocations: 65.688 MiB, 15.59% gc time, 99.98% compilation time)
```
I've been going deeper on [...]. I've obtained this by running the following notebook:

```julia
begin
    using Pkg
    Pkg.activate(; temp=true)
    Pkg.add([
        "SnoopCompile",
        "ProfileSVG"
    ])
    Pkg.develop(; path="/home/rik/git/Pluto.jl")
end
```

```julia
begin
    using SnoopCompile
    using ProfileSVG
    using Pluto
end
```

```julia
tinf = @snoopi_deep begin
    port = 13435
    options = Pluto.Configuration.from_flat_kwargs(; port, launch_browser=false, workspace_use_distributed=true, require_secret_for_access=false, require_secret_for_open_links=false)
    🍭 = Pluto.ServerSession(; options)
    server_task = @async Pluto.run(🍭)
    retry(; delays=ExponentialBackOff(n=5, first_delay=0.5)) do
        HTTP.get("http://localhost:$port/edit").status == 200
    end
    server_task
end
```

```julia
let
    g = flamegraph(tinf)
    ProfileSVG.view(g)
end
```

This flamegraph is good news. All inference is rooted on the same block, so if we can get the right [...]

EDIT: Without [...]
As another possibility, maybe we should just build [...]

```
julia> using Pluto

julia> Pluto.compile()

julia> Pluto.run()
```

to have a super speedy Pluto. Only need to call

```
julia> using Pluto

julia> Pluto.run()
```

EDIT: Nope. Let's hope for JuliaLang/julia PR 44527 and its successors to be merged in newer Julia versions.
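For reference, the `Pluto.compile()` above is hypothetical (and was abandoned per the EDIT); a sketch of what such a step could have looked like with PackageCompiler.jl's `create_sysimage` (the sysimage path and the warm-up script name are assumptions):

```julia
using PackageCompiler

# Hypothetical backing for a `Pluto.compile()` convenience function:
# bake Pluto's compiled code into a custom system image, to be loaded
# later with `julia --sysimage=pluto.so`.
create_sysimage(
    ["Pluto"];
    sysimage_path = "pluto.so",
    # A script that exercises typical Pluto usage (e.g. starting the server),
    # so the relevant methods get compiled into the image.
    precompile_execution_file = "warmup_pluto.jl",
)
```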
```diff
@@ -67,6 +67,7 @@ function run(; kwargs...)
     options = Configuration.from_flat_kwargs(; kwargs...)
     run(options)
 end
+precompile(run, ())
```
This one works. Although I cannot measure `Pluto.run()` easily, I can measure `Configuration.from_flat_kwargs`, which is the biggest source of latency according to SnoopCompile.

Without `precompile(run, ())`:

```
$ julia --project -ie 'using Pluto'
...

julia> @time @eval Pluto.Configuration.from_flat_kwargs();
  2.158083 seconds (3.60 M allocations: 194.337 MiB, 7.19% gc time, 99.96% compilation time)
```

With `precompile(run, ())`:

```
julia> @time @eval Pluto.Configuration.from_flat_kwargs();
  1.431652 seconds (2.64 M allocations: 143.595 MiB, 11.35% gc time, 99.92% compilation time)
```
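A directive on `run` can help `from_flat_kwargs` because precompiling a caller also caches inference for callees that are inferable with concrete types from the caller's signature. A tiny illustration with made-up stand-in functions (not Pluto's code):

```julia
# Illustrative only: hypothetical stand-ins for Configuration.from_flat_kwargs
# and Pluto.run.
module TransitiveDemo

build_options(port::Int) = (port = port, launch_browser = false)
start_server() = build_options(1234)

# Inferring `start_server()` also infers `build_options(::Int)`, so a single
# directive on the outer entry point covers the inner call as well.
precompile(start_server, ())

end # module
```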
With JET.jl, I've been digging into why the following happens:

```
julia> using Pluto

julia> precompile(Pluto.ExpressionExplorer.maybe_macroexpand, (Expr,))
true

julia> @time @eval Pluto.ExpressionExplorer.maybe_macroexpand(:(x = 1))
  0.213755 seconds (397.54 k allocations: 20.237 MiB, 31.72% gc time, 100.06% compilation time)

julia> @time @eval Pluto.ExpressionExplorer.maybe_macroexpand(:(x = 1))
  0.000607 seconds (75 allocations: 3.484 KiB)
```

So that's more than 100% compilation time 🤣 😬 JET shows 120 lines of runtime dispatches and optimization failures.

EDIT: Fixed most of it. Got this now in a fresh session by fixing some type inference issues:

```
julia> using Pluto

julia> @time @eval Pluto.ExpressionExplorer.maybe_macroexpand(:(@time 1))
  0.084008 seconds (86.97 k allocations: 4.649 MiB, 99.64% compilation time)
```

PR at #1991.
AMAZING 🌟🌟
Co-authored-by: Rik Huijzer <[email protected]>