Bootstrapping (compilers)

In computer science, bootstrapping is the technique for producing a self-compiling compiler – that is, a compiler (or assembler) written in the source programming language that it intends to compile. An initial core version of the compiler (the bootstrap compiler) is written in a different language (which could be assembly language); successive, expanded versions of the compiler are then developed using the minimal subset of the language that the bootstrap compiler accepts. The problem of compiling a self-compiling compiler has been called the chicken-or-egg problem in compiler design, and bootstrapping is a solution to it.[1][2]

Bootstrapping is a fairly common practice when creating a programming language. Many compilers for many programming languages are bootstrapped, including compilers for BASIC, ALGOL, C, C#, D, Pascal, PL/I, Haskell, Modula-2, Oberon, OCaml, Common Lisp, Scheme, Go, Java, Elixir, Rust, Python, Scala, Nim, Eiffel, TypeScript, Vala, Zig and more.

Process

A typical bootstrap process works in three or four stages:[3][4][5]

  • Stage 0: preparing an environment for the bootstrap compiler to work with. This is where the source language and output language of the bootstrap compiler are chosen. In the case of a "bare machine" (one which has no compiler for any language), the source and output must be binary machine code, or the bootstrap compiler may be created by cross compiling on a machine other than the target. Otherwise, the bootstrap compiler is written in one of the programming languages that already exists on the target machine, and it generates output that can be brought to execution on the target: a high-level programming language, an assembly language, an object file, or even machine code.
  • Stage 1: the bootstrap compiler is produced. This compiler is sufficient to translate its own source into a program that can be executed on the target machine. At this point, all further development is done using the language defined by the bootstrap compiler, and stage 2 begins.
  • Stage 2: a full compiler is produced by the bootstrap compiler. This is typically done in stages as needed; for example, the compiler for version X of the language is given the ability to compile features from version X+1, but the compiler itself does not yet use those features. Once this compiler has been tested and can compile itself, version X+1 features may be used in subsequent releases of the compiler.
  • Stage 3: a full compiler is produced by the stage 2 full compiler. If more features are to be added, work resumes at stage 2, with the current stage 3 full compiler replacing the bootstrap compiler.

The full compiler is built twice in order to compare the outputs of the two stages. If they are different, either the bootstrap or the full compiler contains a bug.[3]
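
The staged build and the final comparison are usually driven by the project's build system. The following Python sketch shows the idea in miniature under purely hypothetical names: a pre-existing bootstrap compiler binary bootstrap-cc, compiler sources in compiler/main.x, and a build step reduced to a single command line; a real bootstrap (GCC's, for instance) expresses the same logic through its own build targets.

    # A minimal sketch, assuming hypothetical commands and paths; real
    # projects (GCC, rustc, ...) express these stages in their build systems.
    import hashlib
    import subprocess

    def build(compiler: str, output: str) -> None:
        """Compile the full compiler's sources with the given compiler binary."""
        # Hypothetical command line; substitute the project's real build step.
        subprocess.run([compiler, "-O2", "-o", output, "compiler/main.x"], check=True)

    def digest(path: str) -> str:
        """Hash a produced binary so that stage outputs can be compared."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # Stage 1: the bootstrap compiler builds the full compiler.
    build("bootstrap-cc", "stage1-cc")
    # Stage 2: the stage-1 compiler rebuilds the full compiler from the same sources.
    build("./stage1-cc", "stage2-cc")
    # Stage 3: the stage-2 compiler builds it once more.
    build("./stage2-cc", "stage3-cc")

    # Stages 2 and 3 were produced from identical sources by compilers that
    # should behave identically, so their outputs should match bit for bit.
    if digest("stage2-cc") != digest("stage3-cc"):
        raise SystemExit("stage 2 and stage 3 outputs differ: likely a compiler bug")
    print("bootstrap comparison passed")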

Advantages

Bootstrapping a compiler has the following advantages:[6]

  • It is a non-trivial test of the language being compiled, and as such is a form of dogfooding.
  • Compiler developers and bug reporters only need to know the language being compiled.
  • Compiler development can be performed in the higher-level language being compiled.
  • Improvements to the compiler's back-end improve not only general-purpose programs but also the compiler itself.
  • It is a comprehensive consistency check as it should be able to reproduce its own object code.

Note that some of these points assume that the language runtime is also written in the same language.

Methods

If one needs a compiler for language X that is itself written in language X, there is the question of how the first compiler can be compiled. The methods used in practice include:

  • Implementing an interpreter or compiler for language X in language Y; a minimal sketch of this approach is given below. Niklaus Wirth reported that he wrote the first Pascal compiler in Fortran.[7]
  • Another interpreter or compiler for X has already been written in another language Y; this is how Scheme is often bootstrapped.
  • Earlier versions of the compiler were written in a subset of X for which there existed some other compiler; this is how some supersets of Java, Haskell, and the initial Free Pascal compiler are bootstrapped.
  • A compiler supporting non-standard language extensions or optional language features can be written without using those extensions and features, so that it can be compiled with another compiler that supports the same base language but a different set of extensions and features. The main parts of the C++ compiler clang were written in a subset of C++ that can be compiled by both g++ and Microsoft Visual C++. Advanced features are written with some GCC extensions.
  • The compiler for X is cross compiled from another architecture where there exists a compiler for X; this is how compilers for C are usually ported to other platforms. This is also the method used for Free Pascal after its initial bootstrap.
  • Writing the compiler in X, then hand-compiling it from source (most likely in a non-optimized way) and running that on its own source code to obtain an optimized compiler. Donald Knuth used this for his WEB literate programming system.

Methods for distributing compilers in source code include providing a portable bytecode version of the compiler, so as to bootstrap the process of compiling the compiler with itself. The T-diagram is a notation used to explain these compiler bootstrap techniques.[6] In some cases, the most convenient way to get a complicated compiler running on a system that has little or no software on it involves a series of ever more sophisticated assemblers and compilers.[8]
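
As a toy illustration of the first method in the list above, the sketch below implements an interpreter for a deliberately tiny, hypothetical stack language ("X") in an existing host language ("Y", here Python). The toy language is far too small to host its own compiler; the point is only the shape of the approach: once X is rich enough, a compiler for X written in X can be run under exactly this kind of interpreter, including on its own source, as in the LISP bootstrap described below.

    # An interpreter for a tiny, made-up postfix language, written in Python.
    # Tokens: integers push themselves; "+" and "*" pop two values and push
    # the result; "print" pops a value and records it as program output.
    def interpret(program: str) -> list[int]:
        stack: list[int] = []
        output: list[int] = []
        for token in program.split():
            if token == "+":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif token == "*":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
            elif token == "print":
                output.append(stack.pop())
            else:
                stack.append(int(token))
        return output

    # (2 + 3) * 4
    print(interpret("2 3 + 4 * print"))  # [20]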

History

Assemblers were the first language tools to bootstrap themselves.

The first high-level language to provide such a bootstrap was NELIAC in 1958. The first widely used languages to do so were Burroughs B5000 Algol in 1961 and LISP in 1962.

Hart and Levin wrote a LISP compiler in LISP at MIT in 1962, testing it inside an existing LISP interpreter. Once they had improved the compiler to the point where it could compile its own source code, it was self-hosting.[9]

The compiler as it exists on the standard compiler tape is a machine language program that was obtained by having the S-expression definition of the compiler work on itself through the interpreter.

— AI Memo 39[9]

This technique is only possible when an interpreter already exists for the very same language that is to be compiled. It borrows directly from the notion of running a program on itself as input, a notion also used in various proofs in theoretical computer science, such as the variant of the undecidability proof for the halting problem that uses Rice's theorem.

Current efforts

Due to security concerns regarding the trusting trust attack (in which a compiler is maliciously modified to introduce covert backdoors into the programs it compiles, or even to replicate the malicious modification in future versions of the compiler itself, creating a perpetual cycle of distrust) and various attacks against binary trustworthiness, multiple projects are working to reduce the effort of bootstrapping from source and to allow everyone to verify that source and executable correspond. These include the Bootstrappable builds project[10] and the Reproducible builds project.[11]

References

  1. Reynolds, John H. (December 2003). "Bootstrapping a self-compiling compiler from machine X to machine Y". CCSC: Eastern Conference. Journal of Computing Sciences in Colleges. 19 (2): 175–181. The idea of a compiler written in the language it compiles stirs up the old 'chicken-or-the-egg' conundrum: Where does the first one come from?
  2. Glück, Robert (2012). "Bootstrapping compiler generators from partial evaluators". In Clarke, Edmund; Virbitskaite, Irina; Voronkov, Andrei (eds.). Perspectives of Systems Informatics: 8th International Andrei Ershov Memorial Conference, PSI 2011, Novosibirsk, Russia, June 27 – July 1, 2011, Revised Selected Papers. Lecture Notes in Computer Science. Vol. 7162. Springer. pp. 125–141. doi:10.1007/978-3-642-29709-0_13. ISBN 978-3-642-29708-3. Getting started presents the chicken-and-egg problem familiar from compiler construction: one needs a compiler to bootstrap a compiler, and bootstrapping compiler generators is no exception.
  3. "Installing GCC: Building". GNU Project - Free Software Foundation (FSF).
  4. "rust-lang/rust: bootstrap". GitHub.
  5. "Advanced Build Configurations — LLVM 10 documentation". llvm.org.
  6. Patrick D. Terry (1997). "3. Compiler Construction and Bootstrapping". Compilers and Compiler Generators: An Introduction With C++. International Thomson Computer Press. ISBN 1-85032-298-8. Archived from the original on 2009-11-23.
  7. Wirth, Niklaus (2021-02-22). "50 years of Pascal". Communications of the ACM. 64 (3). Association for Computing Machinery (ACM): 39–41. doi:10.1145/3447525. ISSN 0001-0782. S2CID 231991096.
  8. Edmund Grimley-Evans (2003-04-23). "Bootstrapping a simple compiler from nothing". homepage.ntlworld.com. Archived from the original on 2010-03-03.
  9. Tim Hart and Mike Levin. "AI Memo 39-The new compiler" (PDF). Archived from the original (PDF) on 2020-12-13. Retrieved 2008-05-23.
  10. "Bootstrappable builds". bootstrappable.org.
  11. "Reproducible Builds — a set of software development practices that create an independently-verifiable path from source to binary code". reproducible-builds.org.