  - **syntactic stage**: In this stage, most parsers will build an Abstract Syntax Tree (AST) which describes the relations between tokens. For instance, the program fragment ''int x = 0'' may be interpreted as a //definition// which consists of binding the variable ''x'' to the expression ''0''. This stage is also responsible for making sure that the program is syntactically correct.
  - **semantic checks**: Most of these checks are related to typing, which may be more relaxed, as in dynamic languages such as Racket or Python, or more rigid, as in most OO languages or Haskell.
  - **optimisation** and **code-generation**: During these stages, machine code is generated, as well as reorganised or rewritten, in order to increase efficiency.
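To make the lexical and syntactic stages concrete, here is a toy Python sketch (the token names, the token specification and the helper functions are invented for illustration; they are not taken from any real compiler). It splits the fragment ''int x = 0'' into lexemes and then builds a tiny tuple-based AST for the //definition//:

```python
import re

# Toy lexical specification: each pair is (token name, regular expression).
# Order matters: "int" must be tried before the general identifier pattern.
TOKEN_SPEC = [
    ("TYPE",   r"int"),
    ("IDENT",  r"[a-z]\w*"),
    ("EQUALS", r"="),
    ("NUMBER", r"\d+"),
    ("WS",     r"\s+"),
]

def lex(source):
    """Lexical stage: turn the source string into (token name, lexeme) pairs."""
    pattern = "|".join(f"(?P<{name}>{rx})" for name, rx in TOKEN_SPEC)
    return [(m.lastgroup, m.group())
            for m in re.finditer(pattern, source)
            if m.lastgroup != "WS"]          # whitespace is discarded

def parse_definition(tokens):
    """Syntactic stage: build a tiny AST for a definition  TYPE IDENT = NUMBER."""
    (_, ty), (_, name), (_, _eq), (_, value) = tokens
    return ("definition", ty, name, ("expr", int(value)))

tokens = lex("int x = 0")
# tokens is [("TYPE", "int"), ("IDENT", "x"), ("EQUALS", "="), ("NUMBER", "0")]
ast = parse_definition(tokens)
# ast is ("definition", "int", "x", ("expr", 0))
```

A real parser would of course handle arbitrary programs and report syntax errors, but the division of labour is the same: the lexer sees only characters and produces tokens, while the parser sees only tokens and produces structure.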
Finally, note that some languages (including many modern ones) do not fit perfectly into the previous description. Java is one such example. On the one hand, Java is compiled: bytecode is generated during the process, and that bytecode is later translated to machine code by the JVM. On the other hand, JIT (Just-In-Time) compilation makes the setting more complex and more similar to interpretation.
Historically, writing parsers was challenging and time-consuming. Nowadays, writing parsers from scratch is rarely done in practice. This process has been replaced by powerful abstractions, which allow us to specify what kind of lexemes we should search for during the lexical phase, and what kind of program structure we should look for during the syntactic phase. The former are the well-known **regular expressions**, while the latter are, more often than not, **context-free grammars**.
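A toy illustration of the split (the grammar and function names are invented for this sketch): a regular expression is enough to describe an integer //lexeme//, but arbitrarily nested structure, such as balanced parentheses, needs a context-free grammar, here handled by a small recursive-descent parser for the grammar ''Expr -> '(' Expr ')' | NUMBER'':

```python
import re

# Lexical level: a regular expression suffices for an integer lexeme.
INT_LEXEME = re.compile(r"\d+")

def parse_expr(s, i=0):
    """Syntactic level: parse  Expr -> '(' Expr ')' | NUMBER  starting at index i.

    Returns (AST, next index). Nesting of parentheses is unbounded, which is
    exactly what regular expressions cannot express.
    """
    if i < len(s) and s[i] == "(":
        inner, i = parse_expr(s, i + 1)
        if i >= len(s) or s[i] != ")":
            raise SyntaxError("expected ')'")
        return ("paren", inner), i + 1
    m = INT_LEXEME.match(s, i)
    if not m:
        raise SyntaxError("expected an integer")
    return ("int", int(m.group())), m.end()

tree, _ = parse_expr("((42))")
# tree is ("paren", ("paren", ("int", 42)))
```

Note how the two abstractions cooperate: the regular expression recognises the flat lexeme ''42'', while the recursive grammar rule accounts for the nesting around it.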
These abstractions are central to our lecture.