JIT compilation
Just-in-time compilation: compiling parts of a program during execution based on what’s actually running.
Definition
JIT compilation (just-in-time compilation) is compilation that happens while a program is running.
Instead of compiling everything upfront, the system compiles what it needs as it goes (often focusing on frequently executed code paths).
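To make that concrete, here is a minimal, purely illustrative Python sketch of "compile only what actually gets called". The SOURCES table, the call helper, and the function names are invented for this example, and a real JIT compiles to machine code rather than Python bytecode.

```python
# Minimal sketch of compiling on demand, using Python's built-in compile()/exec().
# Everything here (SOURCES, call, square, cube) is illustrative only.

SOURCES = {
    "square": "def square(x):\n    return x * x\n",
    "cube":   "def cube(x):\n    return x * x * x\n",
}

_compiled = {}  # functions we have actually compiled so far

def call(name, *args):
    """Compile a function the first time it is called, then reuse it."""
    if name not in _compiled:
        namespace = {}
        exec(compile(SOURCES[name], f"<jit:{name}>", "exec"), namespace)
        _compiled[name] = namespace[name]
    return _compiled[name](*args)

print(call("square", 4))  # compiles "square" now, prints 16
print(call("square", 5))  # reuses the already-compiled function, prints 25
# "cube" is never compiled unless something actually calls it
```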
Synonyms and related terms
- Synonyms: dynamic compilation
- Related: profiling, warmup, code cache, de-optimization
- Often contrasted with: AOT (ahead-of-time) compilation
Why do it at runtime?
- Use real execution data to optimize what’s actually hot (a type-specialization sketch follows this list).
- Potentially reduce upfront compilation time by delaying work until it’s needed.
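As a toy illustration of the first point, the sketch below records which argument types actually show up at runtime and swaps in a specialized fast path once the profile is unambiguous. Every name in it (generic_add, specialized_int_add, profiled_add, observed_types) is invented for this example; real JITs record richer profiles and emit specialized machine code.

```python
# Toy sketch of using runtime observations to specialize code.

def generic_add(a, b):
    """Slow path: handles ints, floats, strings, lists, ..."""
    return a + b

def specialized_int_add(a, b):
    """Fast path a JIT might use after only ever seeing ints."""
    # Guard: if the assumption breaks, fall back to the generic version.
    if type(a) is int and type(b) is int:
        return a + b          # could skip dynamic dispatch entirely
    return generic_add(a, b)  # fall back to the slow path

observed_types = set()
current_add = generic_add

def profiled_add(a, b):
    """Record which types actually show up, and specialize when it's clear."""
    global current_add
    observed_types.add((type(a), type(b)))
    if observed_types == {(int, int)}:
        current_add = specialized_int_add  # optimize for what's actually hot
    return current_add(a, b)

for i in range(5):
    profiled_add(i, i + 1)      # only ints observed -> int fast path chosen
print(current_add.__name__)     # specialized_int_add
print(profiled_add("a", "b"))   # guard fails, falls back safely -> "ab"
```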
How JITs typically work (high level)
Many JIT systems:
- start by interpreting or running baseline-compiled code
- profile execution to find “hot” functions/loops
- compile hot paths into optimized machine code
- keep a code cache of compiled results
- sometimes de-optimize (fall back to slower but safe code) if assumptions become invalid; a toy version of this loop follows the list
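A deliberately toy end-to-end version of this loop, in Python: the HOT_THRESHOLD, code_cache, and memoizing "compiler" are stand-ins invented for illustration, whereas a real JIT profiles, emits, and invalidates machine code.

```python
# Toy driver for the profile -> compile -> cache -> (maybe) de-optimize loop.

HOT_THRESHOLD = 3        # how many calls before we bother optimizing
call_counts = {}         # per-function profiling counters
code_cache = {}          # name -> "optimized" callable

def interpret(name, fn, *args):
    """Baseline tier: just run the Python function (stand-in for an interpreter)."""
    return fn(*args)

def compile_optimized(name, fn):
    """Stand-in for optimizing compilation: here we merely memoize results."""
    cache = {}
    def optimized(*args):
        if args not in cache:
            cache[args] = fn(*args)
        return cache[args]
    return optimized

def run(name, fn, *args):
    """Entry point: profile, promote hot functions, fall back if needed."""
    call_counts[name] = call_counts.get(name, 0) + 1
    if name in code_cache:
        try:
            return code_cache[name](*args)   # fast path from the code cache
        except TypeError:                    # e.g. unhashable arguments
            del code_cache[name]             # de-optimize: drop the compiled code
            return interpret(name, fn, *args)
    if call_counts[name] >= HOT_THRESHOLD:
        code_cache[name] = compile_optimized(name, fn)  # function is hot: compile it
    return interpret(name, fn, *args)

def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

for i in range(6):
    print(run("fib", fib, 20))  # early calls are interpreted, later ones hit the cache
```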
A useful analogy
An AOT compiler is like planning the entire route before a trip. A JIT is like starting to drive with a rough plan, then learning from traffic and repeatedly choosing faster streets based on what’s actually happening.
Tradeoffs
- Warmup time: performance is typically lower until hot code has been profiled and compiled, so short-lived runs may never reach peak speed (a warmup measurement sketch follows this list).
- More complex runtime behavior: profiling, compilation, caching, de-optimization.
- Different performance characteristics across machines and workloads (because the runtime is adapting).
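One way to observe the warmup tradeoff directly is to time repeated calls to a JIT-compiled function. The sketch below assumes the third-party Numba package is installed; the function body and input size are arbitrary, and the exact numbers depend on the machine and Numba version.

```python
# Seeing warmup in practice, assuming numba is installed (pip install numba).
# The first call pays for JIT compilation; later calls run the cached machine code.
import time
from numba import njit

@njit
def total(n):
    s = 0.0
    for i in range(n):
        s += i * 0.5
    return s

for call in range(3):
    start = time.perf_counter()
    total(10_000_000)
    print(f"call {call}: {time.perf_counter() - start:.4f}s")
# Typically the first call is much slower than the rest because it includes
# compilation time.
```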
Common misconceptions
- “JIT is always faster.” A JIT can produce excellent peak performance, but warmup and runtime compilation overhead can hurt short-lived workloads.
- “JIT means unpredictable.” JITs can be stable, but they introduce additional runtime states (interpreting, compiling, optimized, de-optimized) that affect tails and debuggability.
JIT vs AOT
Compared to AOT compilation, JITs can be more adaptive (optimize for what actually happens) but introduce extra moving parts at runtime. Operationally, this can affect tail latency, memory usage, and predictability.