This is part two of a series of videos about compilation. Part two covers lexical analysis, also known as tokenization. It explains how the lexical analyser, otherwise known as the lexer or the scanner, identifies the individual elements of a source program, known as lexemes, and converts them into tokens that are fed to the syntax analyser. It also introduces the symbol table, a repository for information about programmer-defined names that is used throughout the whole compilation process. As you will see when you watch this series, compilation involves a diverse range of themes in the field of computer science, including high- and low-level programming paradigms, the definition of context-free grammars, the application of dynamic data structures such as stacks, linked lists, hash tables, graphs and trees, memory management, processor architectures, and more. This series will give you an insight into some of the concepts and features that are typical of many compilers.
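
To make the relationship between the lexer and the symbol table concrete, here is a minimal sketch in Python. It assumes a hypothetical toy language with only integer literals, identifiers, assignment, and arithmetic operators; the token names, the keyword set, and the symbol table layout are invented for illustration and are not taken from the video series.

```python
import re

# Token patterns for a hypothetical toy language (illustrative only).
TOKEN_SPEC = [
    ("NUMBER",  r"\d+"),           # integer literal
    ("IDENT",   r"[A-Za-z_]\w*"),  # programmer-defined name or keyword
    ("ASSIGN",  r"="),
    ("OP",      r"[+\-*/]"),
    ("SKIP",    r"[ \t]+"),        # whitespace, discarded
]
KEYWORDS = {"if", "while", "print"}  # assumed keyword set
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source, symbol_table):
    """Yield (token_type, lexeme) pairs and record identifiers in the symbol table."""
    for match in MASTER_RE.finditer(source):
        kind, lexeme = match.lastgroup, match.group()
        if kind == "SKIP":
            continue  # whitespace separates lexemes but produces no token
        if kind == "IDENT" and lexeme not in KEYWORDS:
            # The symbol table holds information about programmer-defined
            # names; here just a placeholder entry keyed by the name.
            symbol_table.setdefault(lexeme, {"kind": "unknown"})
        yield (kind, lexeme)

symbols = {}
for token in tokenize("count = count + 1", symbols):
    print(token)   # the stream of tokens handed to the syntax analyser
print(symbols)     # {'count': {'kind': 'unknown'}}
```

Running this prints the tokens ('IDENT', 'count'), ('ASSIGN', '='), ('IDENT', 'count'), ('OP', '+'), ('NUMBER', '1'), and the symbol table ends up with one entry for the name count. A real compiler would store far richer information in each entry, such as the name's type, scope, and memory location, which is why the table remains in use through every later phase of compilation.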