Lexical Analysis With F# – Part 5

I’ve begun to establish a reasonably sound design pattern for the lexical analyzer. Of course, it isn’t intended as an ideal solution to the general problem of writing a tokenizer for any language; for example, it doesn’t support any kind of shorthand for describing token structure. But it isn’t overly complex, and at this stage it supports some of the common tokens seen in C, C++, or C#.
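To give a rough idea of the shape such a tokenizer can take, here is a minimal sketch of my own (not the exact design from the series) that recognizes a few of those common C-style tokens – identifiers, integer literals, and single-character operators – by pattern matching over a character list:

```fsharp
// Hypothetical token shape for illustration; the series uses its own types.
type Token =
    | Identifier of string
    | IntegerLiteral of string
    | Operator of char

// Consume characters while the predicate holds, returning the lexeme
// and the remaining input.
let rec takeWhile predicate acc chars =
    match chars with
    | c :: rest when predicate c -> takeWhile predicate (c :: acc) rest
    | _ -> (System.String(acc |> List.rev |> List.toArray), chars)

let rec tokenize chars =
    match chars with
    | [] -> []
    | c :: rest when System.Char.IsWhiteSpace c -> tokenize rest
    | c :: _ when System.Char.IsLetter c ->
        let lexeme, remaining = takeWhile System.Char.IsLetterOrDigit [] chars
        Identifier lexeme :: tokenize remaining
    | c :: _ when System.Char.IsDigit c ->
        let lexeme, remaining = takeWhile System.Char.IsDigit [] chars
        IntegerLiteral lexeme :: tokenize remaining
    | c :: rest ->
        Operator c :: tokenize rest

// Example: "int x = 42;" |> List.ofSeq |> tokenize
```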

Continue reading

Lexical Analysis With F# – Part 4

Confronting immutability

As I’ve been working on making the lexical analyzer (hereafter called the “tokenizer”) more complete and simpler to understand, it has really begun to dawn on me why immutability is so important: by eliminating the traditional concept of assignment, we are left with only function invocation.
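A small illustration of that idea (my own example, not code from the post): where an imperative loop would repeatedly assign to a counter variable, the functional version simply passes the updated value to the next recursive call.

```fsharp
// No variable is ever reassigned; each step invokes `loop`
// with the new state instead of mutating the old one.
let countDigits (input: string) =
    let rec loop count chars =
        match chars with
        | [] -> count
        | c :: rest when System.Char.IsDigit c -> loop (count + 1) rest
        | _ :: rest -> loop count rest
    loop 0 (List.ofSeq input)

// countDigits "abc123" evaluates to 3.
```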

Continue reading