Tuesday, August 12, 2008

Sunday, August 3, 2008

activity 3

I.)

1.) If E is a variable or constant, then prefix(E) = E
2.) If E is E1 op E2, then prefix(E) = op prefix(E1) prefix(E2)
3.) If E is (E1), then prefix(E) = prefix(E1)
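The three rules above can be sketched in Python for a quick sanity check. The representation is our own assumption, not part of the exercise: an expression is either a string (rule 1) or a nested tuple (E1, op, E2), and rule 3 comes for free because parentheses are implicit in the nesting.

```python
# Rules I as a sketch: an expression E is either a string (variable or
# constant) or a tuple (E1, op, E2); parentheses are implicit in the nesting.
def prefix(e):
    if isinstance(e, str):                    # rule 1: variable or constant
        return e
    e1, op, e2 = e                            # rule 2: E is E1 op E2
    return f"{op} {prefix(e1)} {prefix(e2)}"

# (a + b) * c  becomes  * + a b c
print(prefix((("a", "+", "b"), "*", "c")))
```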

II.)

1.) If E is a variable or constant, then infix(E) = E
2.) If E is E1 E2 op, then infix(E) = infix(E1) op infix(E2)
3.) If E is (E1), then infix(E) = infix(E1)

III.)

1.) If E is a variable or constant, then prefix(E) = E
2.) If E is E1 E2 op, then prefix(E) = op prefix(E1) prefix(E2)
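The two postfix rule sets (II and III) can be sketched the same way, again under our own assumed representation: a postfix expression is either a string or a tuple (E1, E2, op).

```python
# Rules II and III as a sketch: a postfix expression E is either a string
# (variable or constant) or a tuple (E1, E2, op).
def infix(e):
    if isinstance(e, str):                    # rule 1
        return e
    e1, e2, op = e                            # rule 2: E is E1 E2 op
    return f"{infix(e1)} {op} {infix(e2)}"

def prefix(e):
    if isinstance(e, str):
        return e
    e1, e2, op = e
    return f"{op} {prefix(e1)} {prefix(e2)}"

# a b + c *  ->  infix: a + b * c,  prefix: * + a b c
print(infix((("a", "b", "+"), "c", "*")))
print(prefix((("a", "b", "+"), "c", "*")))
```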

Thursday, July 31, 2008

S → SS+ | SS* | a
a.) show how a string aa+a* can be generated by the grammar
b.) construct a parse tree for this string.
c.) what language is generated by this grammar?

S
⇒ SS*
⇒ Sa*
⇒ SS+a*
⇒ Sa+a*
⇒ aa+a*

Parse tree for aa+a* (the original image is no longer available):

S
├── S
│   ├── S → a
│   ├── S → a
│   └── +
├── S → a
└── *

The grammar generates the postfix (reverse-Polish) expressions over the operand “a” with the binary operators + and *; the shortest such string is a single “a”, and strings can grow arbitrarily long.
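One way to convince yourself of this characterization is a stack-depth check. This is a small sketch of our own (not part of the exercise): a string is generated by S → SS+ | SS* | a exactly when it is a well-formed postfix expression over the operand a.

```python
# Sketch: simulate evaluating the string as postfix; `depth` counts the
# operands on an imagined stack. Valid iff exactly one value remains.
def is_postfix(s):
    depth = 0
    for ch in s:
        if ch == "a":
            depth += 1
        elif ch in "+*":
            if depth < 2:          # a binary operator needs two operands
                return False
            depth -= 1
        else:
            return False
    return depth == 1

print(is_postfix("aa+a*"))   # True
print(is_postfix("aaa+"))    # False
```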

2.2

a.) S → 0S1 | 01
A language of strings with n 0s followed by n 1s (n ≥ 1), i.e. equal numbers of 0s and 1s where all the 0s precede the 1s; the shortest string is 01.
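The claim for (a) can be spot-checked with a tiny membership test of our own (not part of the exercise):

```python
# Sketch of the claim for (a): S -> 0S1 | 01 generates exactly 0^n 1^n, n >= 1.
def is_zeros_then_ones(s):
    n = len(s) // 2
    return len(s) >= 2 and len(s) % 2 == 0 and s == "0" * n + "1" * n

print(is_zeros_then_ones("01"), is_zeros_then_ones("0011"), is_zeros_then_ones("0101"))
```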

b.) S → +SS | -SS | a
A language of prefix (Polish-notation) expressions over the operand “a” with the binary operators + and -; the shortest string is a single “a”.


c.) S → S(S)S | ε
A language of all strings of balanced parentheses, including the empty string.

d.) S → aSbS | bSaS | ε
A language of all strings over {a, b} that contain an equal number of a’s and b’s, including the empty string.
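As with (a), the description for (d) can be spot-checked with a membership test of our own (not part of the exercise):

```python
# Sketch of the claim for (d): the grammar derives exactly the strings over
# {a, b} with equal counts of a's and b's (the empty string included).
def has_equal_ab(s):
    return set(s) <= {"a", "b"} and s.count("a") == s.count("b")

print(has_equal_ab(""), has_equal_ab("abba"), has_equal_ab("aab"))
```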

e.) S → a | S+S | SS | S* | (S)
A language of regular-expression-like strings over the single operand “a”, built from +, juxtaposition (concatenation), the postfix *, and parentheses; the shortest string is a single “a”.

2.3
The grammar in letter e is ambiguous; for example, the string a+a* has two parse trees, one reading it as (a+a)* and one as a+(a*).
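Ambiguity can be checked by brute force. The sketch below is our own (not part of the exercise): it counts the distinct parse trees that grammar (e) assigns to a string, and any string with more than one tree witnesses the ambiguity.

```python
from functools import lru_cache

# Count parse trees of s under grammar (e): S -> a | S+S | SS | S* | (S).
# Each branch below tries one production; malformed splits contribute 0.
@lru_cache(maxsize=None)
def parses(s):
    n = 0
    if s == "a":                                      # S -> a
        n += 1
    for i in range(1, len(s) - 1):                    # S -> S + S
        if s[i] == "+":
            n += parses(s[:i]) * parses(s[i + 1:])
    for i in range(1, len(s)):                        # S -> S S
        n += parses(s[:i]) * parses(s[i:])
    if len(s) > 1 and s[-1] == "*":                   # S -> S *
        n += parses(s[:-1])
    if len(s) > 2 and s[0] == "(" and s[-1] == ")":   # S -> ( S )
        n += parses(s[1:-1])
    return n

print(parses("a+a*"))  # 2, so the grammar is ambiguous
```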



2.4

Construct an unambiguous context-free grammar for each of the following languages. In each case show that your grammar is correct.

Friday, June 27, 2008

Phases of a Compiler

Programming comes to mind when most people are presented with the term “computer science”. This may not necessarily be the case, but a basic skill inherent in most computer scientists is the ability to program. What most people don’t know is that even the most sophisticated of programs will not be able to do anything without a compiler.

A compiler is a program that decodes instructions written in a higher order language and produces an assembly language program [1], and just like there are seven deadly sins, there are seven phases of a compiler—okay, that may well have been a really bad analogy. *grins*


Anyhow, a compiler is divided into 7 phases [2], which take a program all the way from the original character stream to the target machine code, assuming you want to run it on a machine (or virtual machine, for that matter).

So basically what we’re working with is shown in the following diagram [3] (the image is no longer available); it traces a program from the character stream through the lexical, syntax and semantic analyzers, the intermediate code generator, the code optimizer, and finally the code generator, which emits the target code.


Lexical Analyzer – lexical analysis breaks down the entire text of the program into the smallest atomic units, called tokens, at the same time throwing out unnecessary information such as white space and comments. [4]
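To make the idea concrete, here is a toy lexer of our own, not from any of the sources cited: real lexers are usually generated (lex/flex) or hand-written, but the token spec and the discarding of whitespace and comments work the same way.

```python
import re

# Hypothetical token spec for a toy language. SKIP is listed first so that
# a "//" comment is discarded rather than lexed as two "/" operators.
TOKEN_SPEC = [
    ("SKIP",   r"\s+|//[^\n]*"),   # whitespace and comments: thrown away
    ("NUMBER", r"\d+"),
    ("ID",     r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
]
TOKEN_RE = re.compile("|".join(f"(?P<{n}>{p})" for n, p in TOKEN_SPEC))

def tokenize(source):
    """Yield (token_kind, lexeme) pairs, dropping SKIP matches."""
    for m in TOKEN_RE.finditer(source):
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())

print(list(tokenize("x = 42 + y  // comment")))
# [('ID', 'x'), ('OP', '='), ('NUMBER', '42'), ('OP', '+'), ('ID', 'y')]
```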

Syntax Analyzer – syntax analysis, otherwise known as parsing, takes the tokens produced from lexical analysis and groups them into grammatical structures. [5]
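A minimal recursive-descent sketch of that grouping step, using an illustrative grammar of our own (not taken from the cited notes): it consumes a flat token list and builds a nested structure.

```python
# Parse expr -> NUMBER ('+' NUMBER)* from a token list, returning
# (tree, next_index). Left-associative: "1+2+3" groups as (1+2)+3.
def parse_expr(tokens, i=0):
    tree, i = tokens[i], i + 1
    while i < len(tokens) and tokens[i] == "+":
        right, i = tokens[i + 1], i + 2
        tree = ("+", tree, right)
    return tree, i

tree, _ = parse_expr(["1", "+", "2", "+", "3"])
print(tree)  # ('+', ('+', '1', '2'), '3')
```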

Semantic Analyzer – semantic analysis utilizes the syntax tree and information in the symbol table to check the semantic consistency of the source program with the language definition. This is also where type-checking is done. [6]

Intermediate Code Generator – generates an explicit low-level or machine-like intermediate representation [7], having two properties: it should be easy to produce and it should be easy to translate into the target machine. [8]

Code Optimizer – code optimization attempts to improve the intermediate code so a better target code will result. Usually better means faster, but other objectives may be desired, such as shorter code, or target code that consumes less power. [9]

Code Generator – code generation takes as input the intermediate representation as a result of code optimization, and produces the target code (e.g. machine code) to be executed by the machine (or virtual machine). [10]


Bibliography:

1 - http://wordnet.princeton.edu/perl/webwn

2 - Aho, Lam, Sethi and Ullman (2007), Compilers: Principles, Techniques, and Tools (2nd Edition), p. 5, Boston: Pearson/Addison-Wesley.

3 - ibid.

4 - P. N. Hilfinger, CS 164: Programming Languages and Compilers (Class Notes #2: Lexical), p. 1 - http://www.cs.berkeley.edu/~hilfingr/cs164/public_html/lectures/note2.pdf

5 - ibid.

6 - Aho, Lam, Sethi and Ullman (2007), Compilers: Principles, Techniques, and Tools (2nd Edition), p. 8, Boston: Pearson/Addison-Wesley.

7 - C.Sc. 131: Systems Architecture (Topic 3: Controlling the Computer), p. 3 - http://www.comp.lancs.ac.uk/computing/staff/kc/Lecturing/131/7.pdf

8 - Aho, Lam, Sethi and Ullman (2007), Compilers: Principles, Techniques, and Tools (2nd Edition), p. 9, Boston: Pearson/Addison-Wesley.

9 - Aho, Lam, Sethi and Ullman (2007), Compilers: Principles, Techniques, and Tools (2nd Edition), p. 10, Boston: Pearson/Addison-Wesley.

10 - http://en.wikipedia.org/wiki/Code_generation_(compiler)



Tuesday, June 24, 2008

Introduction

Members of the group:

Lim, Ritchell Royce M.

Lleno, John Patrick R.

Sasoy, Michael Jay G.

Our group name is JRJsolutions; the first three characters are taken from the first letters of our first names, with the word solutions appended.

This blog was made in partial fulfillment of the requirements for our ICS40 class: Compiler Design and Principles.

ICS40: Compiler Design and Principles -An introduction to the design and implementation of high-level programming language translators and compilers. It will open with a discussion of translators related to compilers, followed by an overview of the phases and data structures involved in compilation. Topics including lexical analysis, parsing techniques, symbol tables, run-time storage allocation, semantic routines, code generation, and optimization will then be covered with a series of projects assigned to illustrate practical issues.