Compiler-compiler

In computer science, a compiler-compiler or compiler generator is a programming tool that creates a parser, interpreter, or compiler from some form of formal description of a programming language and machine.

The most common type of compiler-compiler is called a parser generator.[1] It handles only syntactic analysis.

A formal description of a language is usually a grammar used as input to a parser generator. It often resembles Backus–Naur form (BNF) or extended Backus–Naur form (EBNF), or has its own syntax. Grammar files describe the syntax of the generated compiler's target programming language and the actions that should be taken against its specific constructs.

Source code for a parser of the programming language is returned as the parser generator's output. This source code can then be compiled into a parser, which may be either standalone or embedded. The compiled parser then accepts the source code of the target programming language as an input and performs an action or outputs an abstract syntax tree (AST).

Parser generators do not handle the semantics of the AST, or the generation of machine code for the target machine.[2]

A metacompiler is a software development tool used mainly in the construction of compilers, translators, and interpreters for other programming languages.[3] The input to a metacompiler is a computer program written in a specialized programming metalanguage designed mainly for the purpose of constructing compilers.[3][4] The language of the compiler produced is called the object language. The minimal input producing a compiler is a metaprogram specifying the object language grammar and semantic transformations into an object program.[4][5]

Variants

A typical parser generator associates executable code with each of the rules of the grammar that should be executed when these rules are applied by the parser. These pieces of code are sometimes referred to as semantic action routines since they define the semantics of the syntactic structure that is analyzed by the parser. Depending upon the type of parser that should be generated, these routines may construct a parse tree (or abstract syntax tree), or generate executable code directly.
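
For illustration, the following Python sketch hand-writes what a generated recursive-descent parser with semantic action routines might look like for a toy grammar sum = number ('+' number)*. The grammar, function names, and AST shape are invented for this example and are not the output of any particular tool; the same actions could just as easily emit code directly instead of building a tree.

    # Hand-written sketch of what a parser generator might emit for the toy
    # grammar  sum = number ('+' number)* , with a semantic action routine
    # attached to each rule (here the actions build an AST).
    import re

    def parse_number(text, pos):
        """Test function for the 'number' rule: (ast, new_pos) on success, None on failure."""
        m = re.match(r'\d+', text[pos:])
        if not m:
            return None
        return ('NUM', int(m.group())), pos + m.end()    # action: build a leaf node

    def parse_sum(text, pos=0):
        """Test function for the 'sum' rule."""
        result = parse_number(text, pos)
        if result is None:
            return None                                  # failure: input not advanced
        ast, pos = result
        while pos < len(text) and text[pos] == '+':
            result = parse_number(text, pos + 1)
            if result is None:
                return None
            rhs, pos = result
            ast = ('ADD', ast, rhs)                      # action: build an interior node
        return ast, pos

    print(parse_sum("1+2+3"))
    # (('ADD', ('ADD', ('NUM', 1), ('NUM', 2)), ('NUM', 3)), 5)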

One of the earliest (1964) and surprisingly powerful compiler-compilers is META II, which accepted an analytical grammar with output facilities producing stack machine code, and was able to compile its own source code as well as other languages.

Among the earliest programs in the original Unix versions built at Bell Labs was the two-part lex and yacc system, which was normally used to output C programming language code but had a flexible output system that could be used for everything from programming languages to text file conversion. Their modern GNU versions are flex and bison.

Some experimental compiler-compilers take as input a formal description of programming language semantics, typically using denotational semantics. This approach is often called 'semantics-based compiling', and was pioneered by Peter Mosses' Semantic Implementation System (SIS) in 1978.[6] However, both the generated compiler and the code it produced were inefficient in time and space. No production compilers are currently built in this way, but research continues.

The Production Quality Compiler-Compiler (PQCC) project at Carnegie Mellon University does not formalize semantics, but does have a semi-formal framework for machine description.

Compiler-compilers exist in many flavors, including bottom-up rewrite machine generators (see JBurg) used to tile syntax trees according to a rewrite grammar for code generation, and attribute grammar parser generators (e.g. ANTLR can be used for simultaneous type checking, constant propagation, and more during the parsing stage).

Metacompilers

Metacompilers reduce the task of writing compilers by automating the aspects that are the same regardless of the object language. This makes possible the design of domain-specific languages which are appropriate to the specification of a particular problem. A metacompiler reduces the cost of producing translators for such domain-specific object languages to a point where it becomes economically feasible to include in the solution of a problem a domain-specific language design.[4]

As a metacompiler's metalanguage will usually be a powerful string and symbol processing language, metacompilers often have strong general-purpose applications, including generating a wide range of other software engineering and analysis tools.[4][7]

Besides being useful for domain-specific language development, a metacompiler is a prime example of a domain-specific language, designed for the domain of compiler writing.

A metacompiler is a metaprogram usually written in its own metalanguage or in an existing computer programming language. The process of a metacompiler, written in its own metalanguage, compiling itself is equivalent to a self-hosting compiler. Most common compilers written today are self-hosting compilers. Self-hosting is a powerful tool of many metacompilers, allowing the easy extension of their own metaprogramming metalanguage. The feature that sets a metacompiler apart from other compiler-compilers is that it takes as input a specialized metaprogramming language that describes all aspects of the compiler's operation. A metaprogram produced by a metacompiler is as complete a program as one written in C++, BASIC, or any other general programming language. The metaprogramming metalanguage is a powerful attribute that allows easier development of computer programming languages and other computer tools. Command-line processors and text-string transformation and analysis tools are easily coded using the metaprogramming metalanguages of metacompilers.

A full featured development package includes a linker and a run time support library. Usually, a machine-oriented system programming language, such as C or C++, is needed to write the support library. A library consisting of support functions needed for the compiling process usually completes the full metacompiler package.

The meaning of metacompiler

In computer science, the prefix meta is commonly used to mean about (its own category). For example, metadata are data that describe other data. A language that is used to describe other languages is a metalanguage. Meta may also mean on a higher level of abstraction: a metalanguage operates on a higher level of abstraction in order to describe properties of a language. Backus–Naur form (BNF) is a formal metalanguage originally used to define ALGOL 60. BNF is a weak metalanguage, for it describes only the syntax and says nothing about the semantics or meaning. Metaprogramming is the writing of computer programs with the ability to treat programs as their data. A metacompiler takes as input a metaprogram written in a specialized metalanguage (a higher-level abstraction) specifically designed for the purpose of metaprogramming.[4][5] The output is an executable object program.

An analogy can be drawn: just as a C++ compiler takes as input a C++ programming language program, a metacompiler takes as input a metaprogramming metalanguage program.

Forth metacompiler

Many advocates of the language Forth call the process of creating a new implementation of Forth meta-compilation and hold that it constitutes a metacompiler. The Forth definition of metacompiler is:

"A metacompiler is a compiler which processes its own source code, resulting in an executable version of itself."

This Forth use of the term metacompiler is disputed in mainstream computer science. See Forth (programming language) and History of compiler construction. The actual Forth process of compiling itself combines Forth being a self-hosting extensible programming language with, sometimes, cross compilation, both long-established terminology in computer science. Metacompilers are a general compiler-writing system, and the Forth metacompiler concept is indistinguishable from a self-hosting, extensible language. The actual process acts at a lower level, defining a minimum subset of Forth words that can be used to define additional Forth words; a full Forth implementation can then be defined from the base set. This sounds like a bootstrap process. The problem is that almost every general-purpose language compiler also fits the Forth metacompiler description.

When (self-hosting compiler) X processes its own source code, resulting in an executable version of itself, X is a metacompiler.

Replace X with any common language (C, C++, Pascal, COBOL, Fortran, Ada, Modula-2, etc.) and X would be a metacompiler according to the Forth usage of the term. A metacompiler operates at an abstraction level above the compiler it compiles; it only operates at the same level, as a self-hosting compiler, when compiling itself. The problem with this definition of metacompiler is that it can be applied to almost any language.

However, on examining the concept of programming in Forth, adding new words to the dictionary, and so extending the language, is metaprogramming. It is this metaprogramming in Forth that makes it a metacompiler.

Programming in Forth is adding new words to the language, and changing the language in this way is metaprogramming. Forth is a metacompiler because Forth is a language specifically designed for metaprogramming: extending Forth by adding words to the Forth vocabulary creates a new Forth dialect, so Forth is a specialized metacompiler for Forth language dialects.

History

Design of the original Compiler Compiler was started by Tony Brooker and Derrick Morris in 1959, with initial testing beginning in March 1962.[8] Brooker's Compiler Compiler was used to create compilers for the new Atlas computer at the University of Manchester, for several languages: Mercury Autocode, Extended Mercury Autocode, Atlas Autocode, ALGOL 60 and ASA Fortran. At roughly the same time, related work was being done by E. T. (Ned) Irons at Princeton, and by Alick Glennie at the Atomic Weapons Research Establishment at Aldermaston, whose "Syntax Machine" paper (declassified in 1977) inspired the META series of translator writing systems mentioned below.

The early history of metacompilers is closely tied with the history of SIG/PLAN Working group 1 on Syntax Driven Compilers. The group was started primarily through the effort of Howard Metcalfe in the Los Angeles area.[9] In the fall of 1962, Howard Metcalfe designed two compiler-writing interpreters. One used a bottom-to-top analysis technique based on a method described by Ledley and Wilson.[10] The other used a top-to-bottom approach based on work by Glennie to generate random English sentences from a context-free grammar.[11]

At the same time, Val Schorre described two "meta machines", one generative and one analytic. The generative machine was implemented and produced random algebraic expressions. Meta I the first metacompiler was implemented by Schorre on an IBM 1401 at UCLA in January 1963. His original interpreters and metamachines were written directly in a pseudo-machine language. META II, however, was written in a higher-level metalanguage able to describe its own compilation into the pseudo-machine language.[12][13][14]

Lee Schmidt at Bolt, Beranek, and Newman wrote a metacompiler in March 1963 that utilized a CRT display on the time-sharing PDP-1.[15] This compiler produced actual machine code rather than interpretive code and was partially bootstrapped from Meta I.[citation needed]

Schorre bootstrapped Meta II from Meta I during the spring of 1963. The paper on the refined metacompiler system presented at the 1964 Philadelphia ACM conference is the first paper on a metacompiler available as a general reference. The syntax and implementation technique of Schorre's system laid the foundation for most of the systems that followed. The system was implemented on a small 1401, and was used to implement a small ALGOL-like language.[citation needed]

Many similar systems immediately followed.[citation needed]

Roger Rutman of AC Delco developed and implemented LOGIK, a language for logical design simulation, on the IBM 7090 in January 1964.[16] This compiler used an algorithm that produced efficient code for Boolean expressions.[citation needed]

Another paper in the 1964 ACM proceedings describes Meta III, developed by Schneider and Johnson at UCLA for the IBM 7090.[17] Meta III represents an attempt to produce efficient machine code for a large class of languages. Meta III was implemented completely in assembly language. Two compilers were written in Meta III: CODOL, a compiler-writing demonstration compiler, and PUREGOL, a dialect of ALGOL 60. (It was pure gall to call it ALGOL.)

Late in 1964, Lee Schmidt bootstrapped the metacompiler EQGEN from the PDP-1 to the Beckman 420. EQGEN was a logic equation generating language.

In 1964, System Development Corporation began a major effort in the development of metacompilers. This effort included the powerful metacompilers Book1 and Book2, written in Lisp, which had extensive tree-searching and backup ability. An outgrowth of one of the Q-32 systems at SDC is Meta 5.[18] The Meta 5 system incorporates backup of the input stream and enough other facilities to parse any context-sensitive language. This system was successfully released to a wide number of users and had many string-manipulation applications other than compiling. It has many elaborate push-down stacks, attribute setting and testing facilities, and output mechanisms. That Meta 5 successfully translates JOVIAL programs to PL/I programs demonstrates its power and flexibility.

Robert McClure at Texas Instruments invented a compiler-compiler called TMG (presented in 1965). TMG was used to create early compilers for programming languages like B, PL/I and ALTRAN. Together with the metacompiler of Val Schorre, it was an early inspiration for the last chapter of Donald Knuth's The Art of Computer Programming.[19]

The LOT system was developed during 1966 at Stanford Research Institute and was modeled very closely after Meta II.[20] It had new special-purpose constructs allowing it to generate a compiler which could in turn, compile a subset of PL/I. This system had extensive statistic-gathering facilities and was used to study the characteristics of top-down analysis.

SIMPLE is a specialized translator system designed to aid the writing of pre-processors for PL/I. SIMPLE, written in PL/I, is composed of three components: an executive, a syntax analyzer, and a semantic constructor.[21]

The TREE-META compiler was developed at Stanford Research Institute in Menlo Park, California, in April 1968. The early metacompiler history is well documented in the TREE-META manual. TREE-META paralleled some of the SDC developments. Unlike earlier metacompilers, it separated the semantics processing from the syntax processing. The syntax rules contained tree-building operations that combined recognized language elements with tree nodes. The tree structure representation of the input was then processed by a simple form of unparse rules. The unparse rules used node recognition and attribute testing; when a rule matched, the associated action was performed. In addition, like tree elements could also be tested in an unparse rule. Unparse rules were also a recursive language, able to call unparse rules, passing elements of the tree, before the action of the unparse rule was performed.

The concept of the metamachine originally put forth by Glennie is so simple that three hardware versions have been designed and one actually implemented, the latter at Washington University in St. Louis. This machine was built from macro-modular components and has for its instructions the codes described by Schorre.

CWIC (Compiler for Writing and Implementing Compilers) is the last known Schorre metacompiler. It was developed at System Development Corporation by Erwin Book, Dewey Val Schorre and Steven J. Sherman. With the full power of LISP 2, a list-processing language, optimizing algorithms could operate on syntax-generated lists and trees before code generation. CWIC also had a symbol table built into the language.

With the resurgence of domain-specific languages and the need for parser generators which are easy to use, easy to understand, and easy to maintain, metacompilers are becoming a valuable tool for advanced software engineering projects.

Other examples of parser generators in the yacc vein are ANTLR, Coco/R,[22] CUP,[citation needed] GNU Bison, Eli,[23] FSL,[citation needed] SableCC, SID (Syntax Improving Device),[24] and JavaCC. While useful, pure parser generators only address the parsing part of the problem of building a compiler. Tools with broader scope, such as PQCC, Coco/R and DMS Software Reengineering Toolkit provide considerable support for more difficult post-parsing activities such as semantic analysis, code optimization and generation.

Schorre metalanguages

The earliest Schorre metacompilers, META I and META II, were developed by D. Val Schorre at UCLA. Other Schorre-based metacompilers followed, each adding improvements to language analysis and/or code generation.

In programming it is common to use the programming language name to refer to both the compiler and the programming language, with context distinguishing the meaning. A C++ program is compiled using a C++ compiler. The same convention applies in the following: for example, META II is both the compiler and the language.

The metalanguages in the Schorre line of metacompilers are functional programming languages that use top-down, grammar-analyzing syntax equations with embedded output transformation constructs.

A syntax equation:

<name> = <body>;

is a compiled test function returning success or failure. <name> is the function name. <body> is a form of logical expression consisting of tests that may be grouped, have alternates, and output productions. A test is like a bool in other languages, success being true and failure being false.

Defining a programming language analytically top down is natural. For example, a program could be defined as:

 program = $declaration; 

This defines a program as a sequence of zero or more declarations.

In the Schorre META X languages there is a driving rule. The program rule above is an example of a driving rule. The program rule is a test function that calls declaration, a test rule that returns success or failure. The $ (loop) operator repeatedly calls declaration until failure is returned. The $ operator is always successful, even when there are zero declarations, so the program rule above would always return success. (In CWIC a long fail can bypass declaration; a long fail is part of the backtracking system of CWIC.)
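
As a rough analogy only (Python, not META II or CWIC syntax), a rule can be modeled as a test function returning success or failure, and the $ operator as a combinator that applies a test until it fails and then reports success itself, which is why program = $declaration; always succeeds. The token handling below is invented for the example.

    # Analogy: a "test" returns True (success) or False (failure); the $ loop
    # operator applies a test until it fails and then succeeds itself.

    def loop(test):
        """Model of the $ operator."""
        def run(state):
            while test(state):
                pass
            return True                      # always successful, even for zero matches
        return run

    def declaration(state):
        """Toy 'declaration' test: consume one 'decl' token if present."""
        if state and state[0] == "decl":
            state.pop(0)                     # advance the input on success
            return True
        return False                         # failure: input left untouched

    program = loop(declaration)              # models  program = $declaration;

    tokens = ["decl", "decl"]
    print(program(tokens), tokens)           # True []
    print(program([]))                       # True (zero declarations still succeed)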

The character sets of these early compilers were limited. The character / was used for the alternant (or) operator. "A or B" is written as A / B. Parentheses ( ) are used for grouping.

A (B / C) 

Describes a construct of A followed by B or C. As a boolean expression it would be

A and (B or C) 

A sequence X Y has an implied "X and Y" meaning. Parentheses ( ) are for grouping and / is the or operator. The order of evaluation is always left to right, as an input character sequence is being specified by the ordering of the tests.

Special operator words whose first character is a "." are used for clarity. .EMPTY is used as the last alternate when no previous alternant need be present.

X (A / B / .EMPTY) 

Indicates that X is optionally followed by A or B. This is a specific characteristic of these metalanguages being programming languages: backtracking is avoided. Other compiler-construction systems may instead declare the three possible sequences and leave it up to the parser to figure it out.
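
A rough Python analogy (invented here, not Schorre syntax) of how these operators behave: alternatives separated by / are tried left to right, .EMPTY is a test that always succeeds, and juxtaposition requires each test in turn, so X (A / B / .EMPTY) accepts X optionally followed by A or B without any backtracking.

    # Tests are tried left to right; .EMPTY always succeeds.
    # Models  X (A / B / .EMPTY)

    def literal(ch):
        """Test matching a single character, advancing the position on success."""
        def test(state):
            if state["pos"] < len(state["text"]) and state["text"][state["pos"]] == ch:
                state["pos"] += 1
                return True
            return False
        return test

    def empty(state):
        """.EMPTY: always succeeds without consuming input."""
        return True

    def alt(*tests):
        """The / operator: try the alternatives left to right."""
        return lambda state: any(t(state) for t in tests)

    def seq(*tests):
        """Juxtaposition: each test in order must succeed."""
        return lambda state: all(t(state) for t in tests)

    rule = seq(literal('X'), alt(literal('A'), literal('B'), empty))

    for text in ("XA", "XB", "X", "Y"):
        print(text, rule({"text": text, "pos": 0}))   # True, True, True, False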

The characteristics of the metaprogramming metalanguages above are common to all Schorre metacompilers and those derived from them.

META I

META I was a hand-compiled metacompiler used to compile META II. Little else is known of META I except that the initial compilation of META II produced nearly identical code to that of the hand-coded META I compiler.

META II

Each rule consists optionally of tests, operators, and output productions. A rule attempts to match some part of the input program source character stream returning success or failure. On success the input is advanced over matched characters. On failure the input is not advanced.

Output productions produced a form of assembly code directly from a syntax rule.
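
A minimal sketch, again in Python rather than META II itself, of a rule in this spirit: it recognizes number '+' number ... and emits stack-machine-style output as it parses, with no tree built in between. The instruction names LIT and ADD are invented for the example.

    # Sketch of a META II-style rule with output productions: code is emitted
    # directly from the syntax rule as input is recognized.
    import re

    def addexpr(text):
        out, pos = [], 0

        def number():
            nonlocal pos
            m = re.match(r'\d+', text[pos:])
            if not m:
                return False
            out.append(f"LIT {m.group()}")   # output production: emit code inline
            pos += m.end()
            return True

        if not number():
            return None
        while pos < len(text) and text[pos] == '+':
            pos += 1
            if not number():
                return None
            out.append("ADD")                # emit once both operands are parsed
        return out

    print(addexpr("1+2+3"))   # ['LIT 1', 'LIT 2', 'ADD', 'LIT 3', 'ADD']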

TREE-META

TREE-META introduced the tree-building operators :<node_name> and [<number>], moving the output production transforms to unparse rules. The tree-building operators were used in the grammar rules, directly transforming the input into an abstract syntax tree. Unparse rules are also test functions that matched tree patterns. Unparse rules are called from a grammar rule when an abstract syntax tree is to be transformed into output code. The building of an abstract syntax tree and unparse rules allowed local optimizations to be performed by analyzing the parse tree.

Moving the output productions to the unparse rules made a clear separation of grammar analysis and code production. This made the programming easier to read and understand.
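
The separation can be sketched in Python (node names and mnemonics are hypothetical): the syntax phase builds nested nodes such as ('ADD', left, right), and a separate unparse-style function pattern-matches on the node to emit output, keeping grammar analysis apart from code production.

    # Sketch of the TREE-META split: the syntax phase builds a tree, and
    # unparse-style rules pattern-match on nodes to produce code.

    def unparse(node, out):
        """Unparse rule: recognize the node pattern, recurse, then emit code."""
        kind = node[0]
        if kind == 'NUM':
            out.append(f"PUSH {node[1]}")
        elif kind == 'ADD':
            unparse(node[1], out)            # left subtree
            unparse(node[2], out)            # right subtree
            out.append("ADD")
        else:
            raise ValueError(f"no unparse rule matches {kind}")
        return out

    # Tree as the syntax phase might have built it for  1 + 2 + 3
    tree = ('ADD', ('ADD', ('NUM', 1), ('NUM', 2)), ('NUM', 3))
    print(unparse(tree, []))   # ['PUSH 1', 'PUSH 2', 'ADD', 'PUSH 3', 'ADD']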

CWIC

In 1968–1970, Erwin Book, Dewey Val Schorre, and Steven J. Sherman developed CWIC (Compiler for Writing and Implementing Compilers) at System Development Corporation.[4] (System Development Corporation records, Charles Babbage Institute Center for the History of Information Technology, Box 12, folder 21.)

CWIC is a compiler development system composed of three special-purpose, domain-specific languages, each intended to permit the description of certain aspects of translation in a straightforward manner. The syntax language is used to describe the recognition of source text and the construction from it of an intermediate tree structure. The generator language is used to describe the transformation of the tree into the appropriate object language.

The syntax language follows Dewey Val Schorre's previous line of metacompilers. It most resembles TREE-META, having tree-building operators in the syntax language. The unparse rules of TREE-META are extended to work with the object-based generator language based on LISP 2.

CWIC includes three languages:

  • Syntax: Transforms the source program input into list structures using grammar transformation formulae. A parsed expression structure is passed to a generator by placement of a generator call in a rule. A tree is represented by a list whose first element is a node object. The language has operators, < and >, specifically for making lists. The colon : operator is used to create node objects; :ADD creates an ADD node. The exclamation ! operator combines a number of parsed entries with a node to make a tree. Trees created by syntax rules are passed to generator functions, returning success or failure. The syntax language is very close to TREE-META: both use a colon to create a node, and CWIC's tree-building exclamation !<number> functions the same as TREE-META's [<number>]. (A sketch of this tree-building scheme follows this list.)
  • Generator: a named series of transforming rules, each consisting of an unparse (pattern-matching) rule and an output production written in a LISP 2-like language. The translation was to IBM 360 binary machine code. Other facilities of the generator language generalized output.[4]
  • MOL-360: an independent mid-level implementation language for the IBM System/360 family of computers, developed in 1968 and used for writing the underlying support library.
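
The tree-building scheme of the syntax language can be sketched in Python (the stack layout and helper names are assumptions made for this illustration; CWIC itself is not Python): parsed entries accumulate on a parse stack, : creates a node object, and !n combines the node with the top n entries into a list whose first element is the node.

    # Sketch of CWIC-style tree building:  :ADD  creates an ADD node and  !2
    # combines it with the top two parsed entries into the tree [ADD, left, right].

    class Node:
        def __init__(self, name):
            self.name = name
        def __repr__(self):
            return self.name

    parse_stack = []     # recognized language elements (and completed trees)
    node_stack = []      # node objects created with ':'

    def colon(name):                 # models  :ADD
        node_stack.append(Node(name))

    def bang(n):                     # models  !2
        entries = [parse_stack.pop() for _ in range(n)][::-1]
        tree = [node_stack.pop()] + entries      # list whose first element is the node
        parse_stack.append(tree)                 # the tree is itself a parsed entry

    # Recognizing  1 + 2  with a rule ending in  :ADD !2
    parse_stack.append(1)
    parse_stack.append(2)
    colon("ADD")
    bang(2)
    print(parse_stack)               # [[ADD, 1, 2]]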

Generators language

Generators Language had semantics similar to Lisp. The parse tree was thought of as a recursive list. The general form of a Generator Language function is:

 function-name(first-unparse_rule) => first-production_code_generator
              (second-unparse_rule) => second-production_code_generator
              (third-unparse_rule) => third-production_code_generator
              ...

The code to process a given tree included the features of a general purpose programming language, plus a form: <stuff>, which would emit (stuff) onto the output file. A generator call may be used in the unparse_rule. The generator is passed the element of unparse_rule pattern in which it is placed and its return values are listed in (). For example:

 expr_gen(ADD[expr_gen(x),expr_gen(y)]) =>
          <AR + (x*16)+y;>
          releasereg(y);
          return x;
     (SUB[expr_gen(x),expr_gen(y)]) =>
          <SR + (x*16)+y;>
          releasereg(y);
          return x;
     (MUL[expr_gen(x),expr_gen(y)]) =>
          . . .
     (x) =>
          r1 = getreg();
          load(r1, x);
          return r1;
 ...

That is, if the parse tree looks like (ADD[<something1>,<something2>]), expr_gen(x) would be called with <something1> and return x. A variable in the unparse rule is a local variable that can be used in the production_code_generator. expr_gen(y) is called with <something2> and returns y. A generator call in an unparse rule is passed the element in the position it occupies. Hopefully, in the above, x and y will be registers on return. The last transform is intended to load an atomic into a register and return the register. The first production would be used to generate the 360 "AR" (Add Register) instruction with the appropriate values in general registers. The above example is only part of a generator. Every generator expression evaluates to a value that can then be further processed. The last transform could just as well have been written as:

 (x)=> return load(getreg(), x); 

In this case load returns its first parameter, the register returned by getreg(). The functions load and getreg are other CWIC generators.

CWIC addressed domain-specific languages before the term domain-specific language existed

From the authors of CWIC:

"A metacompiler assists the task of compiler-building by automating its non creative aspects, those aspects that are the same regardless of the language which the produced compiler is to translate. This makes possible the design of languages which are appropriate to the specification of a particular problem. It reduces the cost of producing processors for such languages to a point where it becomes economically feasible to begin the solution of a problem with language design."[4]

Examples

See also: Comparison of parser generators

  • ANTLR
  • GNU Bison
  • Coco/R, Coco-2[22]
  • DMS Software Reengineering Toolkit, a program transformation system with parser generators
  • Epsilon Grammar Studio
  • Lemon parser generator
  • LRSTAR LR parser generator
  • META II
  • parboiled, a Java library for building parsers
  • Packrat parser
  • PackCC, a packrat parser with left recursion support
  • PQCC, a compiler-compiler that is more than a parser generator
  • Syntax Improving Device (SID)
  • SYNTAX, an integrated toolset for compiler construction
  • TREE-META
  • Yacc
  • Xtext
  • XPL
  • JavaCC

See also

  • Parsing expression grammar
  • LL parser
  • LR parser
  • Simple LR parser
  • LALR parser
  • GLR parser
  • Domain analysis
  • Domain-specific language
  • History of compiler construction
  • Self-hosting compiler
  • Metacompilation
  • Program transformation

References and notes

  1. ^ Aho, Alfred V.; Lam, Monica S.; Sethi, Ravi; Ullman, Jeffrey D. (2007). Compilers: Principles, Techniques, & Tools (Second ed.). Boston. p. 287. ISBN 978-0-321-48681-3. OCLC 70775643.
  2. ^ "A Syntax Directed Compiler for ALGOL 60" Edgar T. Irons, Communications of the ACM Volume 4 Issue 1, Jan. 1961.
  3. ^ a b "Metacompiler: (computer science) A compiler that is used chiefly to construct compilers for other programming languages." Sci-Tech Dictionary: McGraw-Hill Dictionary of Scientific and Technical Terms, 6th edition. McGraw-Hill Companies. Archived from the original on 2018-04-07. Retrieved 2018-04-07.
  4. ^ a b c d e f g h Book, Erwin; Dewey Val Schorre; Steven J. Sherman (June 1970). "The CWIC/36O system, a compiler for writing and implementing compilers". ACM SIGPLAN Notices. 5 (6): 11–29. doi:10.1145/954344.954345. S2CID 44675240.
  5. ^ a b C. Stephen Carr, David A. Luther, Sherian Erdmann, The TREE-META Compiler-Compiler System: A Meta Compiler System for the Univac 1108 and General Electric 645, University of Utah Technical Report RADC-TR-69-83.
  6. ^ Peter Mosses, "SIS: A Compiler-Generator System Using Denotational Semantics," Report 78-4-3, Dept. of Computer Science, University of Aarhus, Denmark, June 1978
  7. ^ Neighbors, J. M. Software Construction using Components. Technical Report 160, Department of Information and Computer Sciences, University of California, Irvine, 1980. Archived 2018-03-18 at the Wayback Machine.
  8. ^ Lavington, Simon (April 2016). "Tony Brooker and the Atlas Compiler Compiler" (PDF). Archived (PDF) from the original on 2023-03-26. Retrieved 2023-09-29.
  9. ^ Howard Metcalfe, "A Parameterized Compiler Based on Mechanical Linguistics" Planning Research Corporation R-311, 1 March 1963, also in Annual Review in Automatic Programming, Vol. 4
  10. ^ Robert Ledley and J. B. Wilson, "Automatic Programming, Language Translation Through Syntactical Analysis", Communications of the Association for Computing Machinery, Vol. 5, No. 3 pp. 145–155, March 1962.
  11. ^ A. E. Glennie, "On the Syntax Machine and the Construction of a Universal Computer", Technical Report Number 2, AD 240–512, Computation Center, Carnegie Institute of Technology, 1960.
  12. ^ Schorre, D. V., META II a syntax-oriented compiler writing language, Proceedings of the 1964 19th ACM National Conference, pp. 41.301-41.3011, 1964
  13. ^ Dewey, Val Schorre (1963). "A Syntax – Directed SMALGOL for the 1401". ACM National Conference, Denver, Colorado.
  14. ^ Meta I is described in the paper given at the 1963 Colorado ACM conference. See SMALGOL.
  15. ^ L. O. Schmidt, "The Status Bitt ACM SegPlan "Special Interest Group on Programming Languages" Working Group 1 News Letter, 1964.
  16. ^ Roger Rutman, "LOGIK. A Syntax Directed Compiler for Computer Bit-Time Simulation", Master thesis, UCLA, August 1964.
  17. ^ F. W. Schneider and G. D. Johnson, "A Syntax-Directed Compiler-Writing Compiler to Generate Efficient Code", Proceedings of the 19th National Conference of the Association for Computing Machinery, 1964.
  18. ^ D. Oppenheim and D. Haggerty, "META 5: A Tool to Manipulate Strings of Data", Proceedings of the 21st National Conference of the Association for Computing Machinery, 1966.
  19. ^ Knuth, Donald (1990). "The genesis of attribute grammars" (PDF). In P. Deransart; M. Jourdan (eds.). Proceedings of the International Conference on Attribute Grammars and Their Applications (Paris, France). Lecture Notes in Computer Science. Vol. 461. New York: Springer-Verlag. pp. 1–12. CiteSeerX 10.1.1.105.5365. doi:10.1007/3-540-53101-7_1. ISBN 978-3-540-53101-2. Archived (PDF) from the original on 2020-11-23. Retrieved 2020-02-06.
  20. ^ Charles R. Kirkley and Johns F. Rulifson, "The LOT System of Syntax Directed Compiling", Stanford Research Institute Internal Report ISR 187531-139, 1966.
  21. ^ George J. E. (1967a). Syntax Analyzer, Recognizer, Parser and Semantic interpretation System, Stanford Linear Accelerator Center, 15 November 1967.
  22. ^ a b Rechenberg, Peter [in German]; Mössenböck, Hanspeter [in German] (1985). Ein Compiler-Generator für Mikrocomputer - Grundlagen, Anwendungen, Programmierung in Modula-2 (in German) (1 ed.). Munich, Germany: Carl Hanser Verlag. ISBN 3-446-14495-1. (NB. The book describes the construction of Coco in Modula-2.)
  23. ^ Gray, Robert W.; Levi, Steven P.; Heuring, Vincent P.; Sloane, Anthony M.; Waite, William M. (1992). "Eli: A complete, flexible compiler construction system". Communications of the ACM. 35 (2): 121–130. doi:10.1145/129630.129637. S2CID 5121773.
  24. ^ Foster, J. M. (1968). "A syntax improving program". The Computer Journal. 11: 31–34. doi:10.1093/comjnl/11.1.31.

Further reading

  • Brooker, R. A.; MacCallum, I. R.; Morris, D.; Rohl, J. S. (1963). "The compiler-compiler" (PDF). Annual Review in Automatic Programming. 3: 229–275. doi:10.1016/S0066-4138(63)80009-9.
  • Brooker, R .A.; Morris, D.; Rohl, J. S. (February 1967). "Experience with the Compiler Compiler" (PDF). Computer Journal. 9: 350. doi:10.1093/comjnl/9.4.350. Retrieved 2021-05-07.
  • Napper, R. B. E (December 1965). An Introduction To The Compiler Compiler (PDF).
  • MacCallum, I. R. (January 1963). Some Aspects of the Implementation of the Compiler Compiler on ATLAS (PDF) (Thesis).
  • Johnson, Stephen C. (July 1975). Yacc—yet another compiler-compiler. Murray Hill, New Jersey, USA: Bell Laboratories. Computer Science Technical Report 32.
  • McKeeman, William M.; Horning, James J.; Wortman, David B. (1970). A Compiler Generator. Englewood Cliffs, New Jersey, USA: Prentice-Hall. ISBN 978-0-13-155077-3. Retrieved 2012-12-13.

External links

  • Tony Brooker and the Atlas Compiler Compiler
  • An Explanation of the Compiler Compiler listings (1963)
  • Compiler Compiler source code (starts around PDF page 182)
  • Original Compiler Compiler flowcharts
  • Brooker Autocodes
  • Catalog.compilertools.net (archived 2011-08-13 at the Wayback Machine), The Catalog of Compiler Construction Tools
  • Labraj.uni-mb.si, Lisa
  • Skenz.it, Jflex and Cup resources (in Italian)

compiler, compiler, confused, with, self, hosting, compiler, source, source, compiler, this, article, needs, additional, citations, verification, please, help, improve, this, article, adding, citations, reliable, sources, unsourced, material, challenged, remov. Not to be confused with self hosting compiler or source to source compiler This article needs additional citations for verification Please help improve this article by adding citations to reliable sources Unsourced material may be challenged and removed Find sources Compiler compiler news newspapers books scholar JSTOR October 2015 Learn how and when to remove this template message In computer science a compiler compiler or compiler generator is a programming tool that creates a parser interpreter or compiler from some form of formal description of a programming language and machine The most common type of compiler compiler is called a parser generator 1 It handles only syntactic analysis A formal description of a language is usually a grammar used as an input to a parser generator It often resembles Backus Naur form BNF or extended Backus Naur form EBNF or has its own syntax Grammar files describe a syntax of a generated compiler s target programming language and actions that should be taken against its specific constructs Source code for a parser of the programming language is returned as the parser generator s output This source code can then be compiled into a parser which may be either standalone or embedded The compiled parser then accepts the source code of the target programming language as an input and performs an action or outputs an abstract syntax tree AST Parser generators do not handle the semantics of the AST or the generation of machine code for the target machine 2 A metacompiler is a software development tool used mainly in the construction of compilers translators and interpreters for other programming languages 3 The input to a metacompiler is a computer program written in a specialized programming metalanguage designed mainly for the purpose of constructing compilers 3 4 The language of the compiler produced is called the object language The minimal input producing a compiler is a metaprogram specifying the object language grammar and semantic transformations into an object program 4 5 Contents 1 Variants 1 1 Metacompilers 1 1 1 The meaning of metacompiler 1 2 Forth metacompiler 2 History 3 Schorre metalanguages 3 1 META I 3 2 META II 3 3 TREE META 3 4 CWIC 3 4 1 Generators language 3 4 2 CWIC addressed domain specific languages before the term domain specific language existed 4 Examples 5 See also 6 References and notes 7 Further reading 8 External linksVariants editA typical parser generator associates executable code with each of the rules of the grammar that should be executed when these rules are applied by the parser These pieces of code are sometimes referred to as semantic action routines since they define the semantics of the syntactic structure that is analyzed by the parser Depending upon the type of parser that should be generated these routines may construct a parse tree or abstract syntax tree or generate executable code directly One of the earliest 1964 surprisingly powerful versions of compiler compilers is META II which accepted an analytical grammar with output facilities that produce stack machine code and is able to compile its own source code and other languages Among the earliest programs of the original Unix versions being built at Bell Labs was the two part lex and yacc system which was 
normally used to output C programming language code but had a flexible output system that could be used for everything from programming languages to text file conversion Their modern GNU versions are flex and bison Some experimental compiler compilers take as input a formal description of programming language semantics typically using denotational semantics This approach is often called semantics based compiling and was pioneered by Peter Mosses Semantic Implementation System SIS in 1978 6 However both the generated compiler and the code it produced were inefficient in time and space No production compilers are currently built in this way but research continues The Production Quality Compiler Compiler PQCC project at Carnegie Mellon University does not formalize semantics but does have a semi formal framework for machine description Compiler compilers exist in many flavors including bottom up rewrite machine generators see JBurg used to tile syntax trees according to a rewrite grammar for code generation and attribute grammar parser generators e g ANTLR can be used for simultaneous type checking constant propagation and more during the parsing stage Metacompilers edit Metacompilers reduce the task of writing compilers by automating the aspects that are the same regardless of the object language This makes possible the design of domain specific languages which are appropriate to the specification of a particular problem A metacompiler reduces the cost of producing translators for such domain specific object languages to a point where it becomes economically feasible to include in the solution of a problem a domain specific language design 4 As a metacompiler s metalanguage will usually be a powerful string and symbol processing language they often have strong applications for general purpose applications including generating a wide range of other software engineering and analysis tools 4 7 Besides being useful for domain specific language development a metacompiler is a prime example of a domain specific language designed for the domain of compiler writing A metacompiler is a metaprogram usually written in its own metalanguage or an existing computer programming language The process of a metacompiler written in its own metalanguage compiling itself is equivalent to self hosting compiler Most common compilers written today are self hosting compilers Self hosting is a powerful tool of many metacompilers allowing the easy extension of their own metaprogramming metalanguage The feature that separates a metacompiler apart from other compiler compilers is that it takes as input a specialized metaprogramming language that describes all aspects of the compiler s operation A metaprogram produced by a metacompiler is as complete a program as a program written in C BASIC or any other general programming language The metaprogramming metalanguage is a powerful attribute allowing easier development of computer programming languages and other computer tools Command line processors text string transforming and analysis are easily coded using metaprogramming metalanguages of metacompilers A full featured development package includes a linker and a run time support library Usually a machine oriented system programming language such as C or C is needed to write the support library A library consisting of support functions needed for the compiling process usually completes the full metacompiler package The meaning of metacompiler edit In computer science the prefix meta is commonly used to mean about its own 
category For example metadata are data that describe other data A language that is used to describe other languages is a metalanguage Meta may also mean on a higher level of abstraction A metalanguage operates on a higher level of abstraction in order to describe properties of a language Backus Naur form BNF is a formal metalanguage originally used to define ALGOL 60 BNF is a weak metalanguage for it describes only the syntax and says nothing about the semantics or meaning Metaprogramming is the writing of computer programs with the ability to treat programs as their data A metacompiler takes as input a metaprogram written in a specialized metalanguages a higher level abstraction specifically designed for the purpose of metaprogramming 4 5 The output is an executable object program An analogy can be drawn That as a C compiler takes as input a C programming language program a metacompiler takes as input a metaprogramming metalanguage program Forth metacompiler edit This section s tone or style may not reflect the encyclopedic tone used on Wikipedia See Wikipedia s guide to writing better articles for suggestions August 2015 Learn how and when to remove this template message Many advocates of the language Forth call the process of creating a new implementation of Forth a meta compilation and that it constitutes a metacompiler The Forth definition of metacompiler is A metacompiler is a compiler which processes its own source code resulting in an executable version of itself This Forth use of the term metacompiler is disputed in mainstream computer science See Forth programming language and History of compiler construction The actual Forth process of compiling itself is a combination of a Forth being a self hosting extensible programming language and sometimes cross compilation long established terminology in computer science Metacompilers are a general compiler writing system Besides the Forth metacompiler concept being indistinguishable from self hosting and extensible language The actual process acts at a lower level defining a minimum subset of forth words that can be used to define additional forth words A full Forth implementation can then be defined from the base set This sounds like a bootstrap process The problem is that almost every general purpose language compiler also fits the Forth metacompiler description When self hosting compiler X processes its own source code resulting in an executable version of itself X is a metacompiler Just replace X with any common language C C Pascal COBOL Fortran Ada Modula 2 etc And X would be a metacompiler according to the Forth usage of metacompiler A metacompiler operates at an abstraction level above the compiler it compiles It only operates at the same self hosting compiler level when compiling itself One has to see the problem with this definition of metacompiler It can be applied to most any language However on examining the concept of programming in Forth adding new words to the dictionary extending the language in this way is metaprogramming It is this metaprogramming in Forth that makes it a metacompiler Programming in Forth is adding new words to the language Changing the language in this way is metaprogramming Forth is a metacompiler because Forth is a language specifically designed for metaprogramming Programming in Forth is extending Forth adding words to the Forth vocabulary creates a new Forth dialect Forth is a specialized metacompiler for Forth language dialects History editDesign of the original Compiler Compiler was started by 
Tony Brooker and Derrick Morris in 1959 with initial testing beginning in March 1962 8 Brooker s Compiler Compiler was used to create compilers for the new Atlas computer at the University of Manchester for several languages Mercury Autocode Extended Mercury Autocode Atlas Autocode ALGOL 60 and ASA Fortran At roughly the same time related work was being done by E T Ned Irons at Princeton and Alick Glennie at the Atomic Weapons Research Establishment at Aldermaston whose Syntax Machine paper declassified in 1977 inspired the META series of translator writing systems mentioned below The early history of metacompilers is closely tied with the history of SIG PLAN Working group 1 on Syntax Driven Compilers The group was started primarily through the effort of Howard Metcalfe in the Los Angeles area 9 In the fall of 1962 Howard Metcalfe designed two compiler writing interpreters One used a bottom to top analysis technique based on a method described by Ledley and Wilson 10 The other used a top to bottom approach based on work by Glennie to generate random English sentences from a context free grammar 11 At the same time Val Schorre described two meta machines one generative and one analytic The generative machine was implemented and produced random algebraic expressions Meta I the first metacompiler was implemented by Schorre on an IBM 1401 at UCLA in January 1963 His original interpreters and metamachines were written directly in a pseudo machine language META II however was written in a higher level metalanguage able to describe its own compilation into the pseudo machine language 12 13 14 Lee Schmidt at Bolt Beranek and Newman wrote a metacompiler in March 1963 that utilized a CRT display on the time sharing PDP l 15 This compiler produced actual machine code rather than interpretive code and was partially bootstrapped from Meta I citation needed Schorre bootstrapped Meta II from Meta I during the spring of 1963 The paper on the refined metacompiler system presented at the 1964 Philadelphia ACM conference is the first paper on a metacompiler available as a general reference The syntax and implementation technique of Schorre s system laid the foundation for most of the systems that followed The system was implemented on a small 1401 and was used to implement a small ALGOL like language citation needed Many similar systems immediately followed citation needed Roger Rutman of AC Delco developed and implemented LOGIK a language for logical design simulation on the IBM 7090 in January 1964 16 This compiler used an algorithm that produced efficient code for Boolean expressions citation needed Another paper in the 1964 ACM proceedings describes Meta III developed by Schneider and Johnson at UCLA for the IBM 7090 17 Meta III represents an attempt to produce efficient machine code for a large class of languages Meta III was implemented completely in assembly language Two compilers were written in Meta III CODOL a compiler writing demonstration compiler and PUREGOL a dialect of ALGOL 60 It was pure gall to call it ALGOL Late in 1964 Lee Schmidt bootstrapped the metacompiler EQGEN from the PDP l to the Beckman 420 EQGEN was a logic equation generating language In 1964 System Development Corporation began a major effort in the development of metacompilers This effort includes powerful metacompilers Bookl and Book2 written in Lisp which have extensive tree searching and backup ability An outgrowth of one of the Q 32 systems at SDC is Meta 5 18 The Meta 5 system incorporates backup of the input stream and 
enough other facilities to parse any context sensitive language This system was successfully released to a wide number of users and had many string manipulation applications other than compiling It has many elaborate push down stacks attribute setting and testing facilities and output mechanisms That Meta 5 successfully translates JOVIAL programs to PL I programs demonstrates its power and flexibility Robert McClure at Texas Instruments invented a compiler compiler called TMG presented in 1965 TMG was used to create early compilers for programming languages like B PL I and ALTRAN Together with metacompiler of Val Schorre it was an early inspiration for the last chapter of Donald Knuth s The Art of Computer Programming 19 The LOT system was developed during 1966 at Stanford Research Institute and was modeled very closely after Meta II 20 It had new special purpose constructs allowing it to generate a compiler which could in turn compile a subset of PL I This system had extensive statistic gathering facilities and was used to study the characteristics of top down analysis SIMPLE is a specialized translator system designed to aid the writing of pre processors for PL I SIMPLE written in PL I is composed of three components An executive a syntax analyzer and a semantic constructor 21 The TREE META compiler was developed at Stanford Research Institute in Menlo Park California April 1968 The early metacompiler history is well documented in the TREE META manual TREE META paralleled some of the SDC developments Unlike earlier metacompilers it separated the semantics processing from the syntax processing The syntax rules contained tree building operations that combined recognized language elements with tree nodes The tree structure representation of the input was then processed by a simple form of unparse rules The unparse rules used node recognition and attribute testing that when matched resulted in the associated action being performed In addition like tree element could also be tested in an unparse rule Unparse rules were also a recursive language being able to call unparse rules passing elements of thee tree before the action of the unparse rule was performed The concept of the metamachine originally put forth by Glennie is so simple that three hardware versions have been designed and one actually implemented The latter at Washington University in St Louis This machine was built from macro modular components and has for instructions the codes described by Schorre CWIC Compiler for Writing and Implementing Compilers is the last known Schorre metacompiler It was developed at Systems Development Corporation by Erwin Book Dewey Val Schorre and Steven J Sherman With the full power of lisp 2 a list processing language optimizing algorithms could operate on syntax generated lists and trees before code generation CWIC also had a symbol table built into the language With the resurgence of domain specific languages and the need for parser generators which are easy to use easy to understand and easy to maintain metacompilers are becoming a valuable tool for advanced software engineering projects Other examples of parser generators in the yacc vein are ANTLR Coco R 22 CUP citation needed GNU Bison Eli 23 FSL citation needed SableCC SID Syntax Improving Device 24 and JavaCC While useful pure parser generators only address the parsing part of the problem of building a compiler Tools with broader scope such as PQCC Coco R and DMS Software Reengineering Toolkit provide considerable support for more difficult 
post parsing activities such as semantic analysis code optimization and generation Schorre metalanguages editThe earliest Schorre metacompilers META I and META II were developed by D Val Schorre at UCLA Other Schorre based metacompilers followed Each adding improvements to language analysis and or code generation In programming it is common to use the programming language name to refer to both the compiler and the programming language the context distinguishing the meaning A C program is compiled using a C compiler That also applies in the following For example META II is both the compiler and the language The metalanguages in the Schorre line of metacompilers are functional programming languages that use top down grammar analyzing syntax equations having embedded output transformation constructs A syntax equation lt name gt lt body gt is a compiled test function returning success or failure lt name gt is the function name lt body gt is a form of logical expression consisting of tests that may be grouped have alternates and output productions A test is like a bool in other languages success being true and failure being false Defining a programming language analytically top down is natural For example a program could be defined as program declaration Defining a program as a sequence of zero or more declaration s In the Schorre META X languages there is a driving rule The program rule above is an example of a driving rule The program rule is a test function that calls declaration a test rule that returns success or failure The loop operator repeatedly calling declaration until failure is returned The operator is always successful even when there are zero declaration Above program would always return success In CWIC a long fail can bypass declaration A long fail is part of the backtracking system of CWIC The character sets of these early compilers were limited The character was used for the alternant or operator A or B is written as A B Parentheses are used for grouping A B C Describes a construct of A followed by B or C As a boolean expression it would be A and B or C A sequence X Y has an implied X and Y meaning are grouping and the or operator The order of evaluation is always left to right as an input character sequence is being specified by the ordering of the tests Special operator words whose first character is a are used for clarity EMPTY is used as the last alternate when no previous alternant need be present X A B EMPTY Indicates that X is optionally followed by A or B This is a specific characteristic of these metalanguages being programming languages Backtracking is avoided by the above Other compiler constructor systems may have declared the three possible sequences and left it up to the parser to figure it out The characteristics of the metaprogramming metalanguages above are common to all Schorre metacompilers and those derived from them META I edit META I was a hand compiled metacompiler used to compile META II Little else is known of META I except that the initial compilation of META II produced nearly identical code to that of the hand coded META I compiler META II edit Main article META II Each rule consists optionally of tests operators and output productions A rule attempts to match some part of the input program source character stream returning success or failure On success the input is advanced over matched characters On failure the input is not advanced Output productions produced a form of assembly code directly from a syntax rule TREE META edit Main article TREE 
META TREE META introduced tree building operators lt node name gt and lt number gt moving the output production transforms to unparsed rules The tree building operators were used in the grammar rules directly transforming the input into an abstract syntax tree Unparse rules are also test functions that matched tree patterns Unparse rules are called from a grammar rule when an abstract syntax tree is to be transformed into output code The building of an abstract syntax tree and unparse rules allowed local optimizations to be performed by analyzing the parse tree Moving of output productions to the unparse rules made a clear separation of grammar analysis and code production This made the programming easier to read and understand CWIC edit In 1968 1970 Erwin Book Dewey Val Schorre and Steven J Sherman developed CWIC 4 Compiler for Writing and Implementing Compilers at System Development Corporation Charles Babbage Institute Center for the History of Information Technology Box 12 folder 21 CWIC is a compiler development system composed of three special purpose domain specific languages each intended to permit the description of certain aspects of translation in a straight forward manner The syntax language is used to describe the recognition of source text and the construction from it to an intermediate tree structure The generator language is used to describe the transformation of the tree into appropriate object language The syntax language follows Dewey Val Schorre s previous line of metacompilers It most resembles TREE META having tree building operators in the syntax language The unparse rules of TREE META are extended to work with the object based generator language based on LISP 2 CWIC includes three languages Syntax Transforms the source program input into list structures using grammar transformation formula A parsed expression structure is passed to a generator by placement of a generator call in a rule A tree is represented by a list whose first element is a node object The language has operators lt and gt specifically for making lists The colon operator is used to create node objects ADD creates an ADD node The exclamation operator combines a number of parsed entries with a node to make a tree Trees created by syntax rules are passed to generator functions returning success or failure The syntax language is very close to TREE META Both use a colon to create a node CWIC s tree building exclamation lt number gt functions the same as TREE META s lt number gt Generator a named series of transforming rules each consisting of an unparse pattern matching rule and an output production written in a LISP 2 like language the translation was to IBM 360 binary machine code Other facilities of the generator language generalized output 4 MOL 360 an independent mid level implementation language for the IBM System 360 family of computers developed in 1968 and used for writing the underlying support library Generators language edit Generators Language had semantics similar to Lisp The parse tree was thought of as a recursive list The general form of a Generator Language function is function name first unparse rule gt first production code generator second unparse rule gt second production code generator third unparse rule gt third production code generator The code to process a given tree included the features of a general purpose programming language plus a form lt stuff gt which would emit stuff onto the output file A generator call may be used in the unparse rule The generator is passed the 
CWIC addressed domain-specific languages before the term "domain-specific language" existed

From the authors of CWIC:

"A metacompiler assists the task of compiler-building by automating its non-creative aspects, those aspects that are the same regardless of the language which the produced compiler is to translate. This makes possible the design of languages which are appropriate to the specification of a particular problem. It reduces the cost of producing processors for such languages to a point where it becomes economically feasible to begin the solution of a problem with language design."[4]

Examples

See also: Comparison of parser generators

ANTLR
GNU Bison
Coco/R
DMS Software Reengineering Toolkit, a program transformation system with parser generators
Epsilon Grammar Studio
Lemon parser generator
LRSTAR, an LR parser generator
META II
parboiled, a Java library for building parsers
Packrat parser
PackCC, a packrat parser with left recursion support
PQCC, a compiler-compiler that is more than a parser generator
Syntax Improving Device (SID)
SYNTAX, an integrated toolset for compiler construction
TREE-META
Yacc
Xtext
XPL
JavaCC

See also

Parsing expression grammar
LL parser
LR parser
Simple LR parser
LALR parser
GLR parser
Domain analysis
Domain-specific language
History of compiler construction
History of compiler construction § Self-hosting compilers
Metacompilation
Program transformation

References and notes

1. Compilers: Principles, Techniques, & Tools. Alfred V. Aho, Monica S. Lam, Ravi Sethi, Jeffrey D. Ullman. Second ed. Boston, 2007. p. 287. ISBN 978-0-321-48681-3. OCLC 70775643.
2. "A Syntax Directed Compiler for ALGOL 60". Edgar T. Irons, Communications of the ACM, Volume 4, Issue 1, Jan. 1961.
3. "Metacompiler (computer science): A compiler that is used chiefly to construct compilers for other programming languages". Sci-Tech Dictionary, McGraw-Hill Dictionary of Scientific and Technical Terms, 6th edition. McGraw-Hill Companies. Archived from the original on 2018-04-07. Retrieved 2018-04-07.
4. Book, Erwin; Schorre, Dewey Val; Sherman, Steven J. (June 1970). "The CWIC/36O system, a compiler for writing and implementing compilers". ACM SIGPLAN Notices. 5 (6): 11-29. doi:10.1145/954344.954345. S2CID 44675240.
5. C. Stephen Carr, David A. Luther, Sherian Erdmann, "The TREE-META Compiler-Compiler System: A Meta Compiler System for the Univac 1108 and General Electric 645", University of Utah Technical Report RADC-TR-69-83.
6. Peter Mosses, "SIS: A Compiler Generator System Using Denotational Semantics", Report 78-4-3, Dept. of Computer Science, University of Aarhus, Denmark, June 1978.
7. Neighbors, J. M. "Software Construction using Components". Archived 2018-03-18 at the Wayback Machine. Technical Report 160, Department of Information and Computer Sciences, University of California, Irvine, 1980.
8. Lavington, Simon (April 2016). "Tony Brooker and the Atlas Compiler Compiler" (PDF). Archived (PDF) from the original on 2023-03-26. Retrieved 2023-09-29.
9. Howard Metcalfe, "A Parameterized Compiler Based on Mechanical Linguistics", Planning Research Corporation R-311, 1 March 1963; also in Annual Review in Automatic Programming, Vol. 4.
10. Robert Ledley and J. B. Wilson, "Automatic Programming, Language Translation Through Syntactical Analysis", Communications of the Association for Computing Machinery, Vol. 5, No. 3, pp. 145-155, March 1962.
11. A. E. Glennie, "On the Syntax Machine and the Construction of a Universal Computer", Technical Report Number 2, AD 240-512, Computation Center, Carnegie Institute of Technology, 1960.
12. Schorre, D. V., "META II: a syntax-oriented compiler writing language", Proceedings of the 1964 19th ACM National Conference, pp. 41.301-41.3011, 1964.
13. Dewey Val Schorre (1963). "A Syntax-Directed SMALGOL for the 1401", ACM National Conference, Denver, Colorado.
14. Meta I is described in the paper given at the 1963 Colorado ACM conference. See SMALGOL.
15. L. O. Schmidt, "The Status Bitt", ACM SegPlan Special Interest Group on Programming Languages, Working Group 1 News Letter, 1964.
16. Roger Rutman, "LOGIK, A Syntax Directed Compiler for Computer Bit Time Simulation", Master thesis, UCLA, August 1964.
17. F. W. Schneider and G. D. Johnson, "A Syntax-Directed Compiler-writing Compiler to generate Efficient Code", Proceedings of the 19th National Conference of the Association for Computing Machinery, 1964.
18. D. Oppenheim and D. Haggerty, "META 5: A Tool to Manipulate Strings of Data", Proceedings of the 21st National Conference of the Association for Computing Machinery, 1966.
19. Knuth, Donald (1990). "The genesis of attribute grammars" (PDF). In P. Deransart; M. Jourdan (eds.). Proceedings of the International Conference on Attribute Grammars and Their Applications (Paris, France). Lecture Notes in Computer Science. Vol. 461. New York: Springer-Verlag. pp. 1-12. CiteSeerX 10.1.1.105.5365. doi:10.1007/3-540-53101-7_1. ISBN 978-3-540-53101-2. Archived (PDF) from the original on 2020-11-23. Retrieved 2020-02-06.
20. Charles R. Kirkley and Johns F. Rulifson, "The LOT System of Syntax Directed Compiling", Stanford Research Institute Internal Report ISR 187531-139, 1966.
21. George, J. E. (1967a). "Syntax Analyzer, Recognizer, Parser and Semantic Interpretation System", Stanford Linear Accelerator Center, 15 November 1967.
22. Rechenberg, Peter; Mössenböck, Hanspeter (1985). Ein Compiler-Generator für Mikrocomputer: Grundlagen, Anwendungen, Programmierung in Modula-2 (in German) (1st ed.). Munich, Germany: Carl Hanser Verlag. ISBN 3-446-14495-1. (NB. The book describes the construction of Coco in Modula-2.)
23. Gray, Robert W.; Levi, Steven P.; Heuring, Vincent P.; Sloane, Anthony M.; Waite, William M. (1992). "Eli: A complete, flexible compiler construction system". Communications of the ACM. 35 (2): 121-130. doi:10.1145/129630.129637. S2CID 5121773.
24. Foster, J. M. (1968). "A syntax improving program". The Computer Journal. 11: 31-34. doi:10.1093/comjnl/11.1.31.

Further reading

Brooker, R. A.; MacCallum, I. R.; Morris, D.; Rohl, J. S. (1963). "The compiler compiler" (PDF). Annual Review in Automatic Programming. 3: 229-275. doi:10.1016/S0066-4138(63)80009-9.
Brooker, R. A.; Morris, D.; Rohl, J. S. (February 1967). "Experience with the Compiler Compiler" (PDF). Computer Journal. 9 (4): 350. doi:10.1093/comjnl/9.4.350. Retrieved 2021-05-07.
Napper, R. B. E. (December 1965). "An Introduction To The Compiler Compiler" (PDF).
MacCallum, I. R. (January 1963). Some Aspects of the Implementation of the Compiler Compiler on ATLAS (PDF) (Thesis).
Johnson, Stephen C. (July 1975). Yacc: Yet Another Compiler-Compiler. Murray Hill, New Jersey, USA: Bell Laboratories. Computer Science Technical Report 32.
McKeeman, William M.; Horning, James J.; Wortman, David B. (1970). A Compiler Generator. Englewood Cliffs, New Jersey, USA: Prentice Hall. ISBN 978-0-13-155077-3. Retrieved 2012-12-13.

External links

Tony Brooker and the Atlas Compiler Compiler
An Explanation of the Compiler Compiler (listings, 1963; Compiler Compiler source code starts around PDF page 182)
Original Compiler Compiler flowcharts
Computer50.org: Brooker Autocodes
Catalog.compilertools.net (archived 2011-08-13 at the Wayback Machine): The Catalog of Compiler Construction Tools
Labraj.uni-mb.si: Lisa
Skenz.it: Jflex and Cup resources (in Italian)
