A compiler's first phase is lexical analysis. It takes the modified source code produced by language preprocessors, written as a stream of sentences, and breaks it into a series of tokens, discarding whitespace and comments along the way. See how reserved words are handled below.
<h3>How are reserved words handled by the lexical analyzer?</h3>
When the lexical analyzer reads the source program code, it examines it character by character, and when it encounters whitespace, an operator symbol, or a special symbol, it decides that a word has been completed.
Take float floatvalue as an example. While scanning up to 'float', the lexical analyzer cannot tell whether it has just read the reserved word float or only the prefix of an identifier such as floatvalue.
According to the Longest Match Rule, the lexeme scanned should be determined by the longest match among all the tokens available, so floatvalue is read as a single identifier rather than the keyword float followed by value.
The lexical analyzer also uses rule priority, which means that a reserved word of the language, such as a keyword, takes priority over user-defined identifiers.
That is, if the lexical analyzer finds a lexeme that matches an existing reserved word where an identifier is expected, it should report an error rather than accept it as an identifier.
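The two rules above can be sketched in a few lines. This is a minimal illustrative scanner, not a real compiler front end; the token names, keyword set, and patterns are assumptions chosen for the float floatvalue example.

```python
import re

# Assumed keyword set for this sketch.
KEYWORDS = {"float", "int", "if", "else"}

# Each pattern matches greedily, so "floatvalue" is consumed as one
# identifier (Longest Match Rule) instead of stopping at "float".
TOKEN_SPEC = [
    ("NUMBER", r"\d+(\.\d+)?"),
    ("ID",     r"[A-Za-z_]\w*"),
    ("OP",     r"[=+\-*/]"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(code):
    tokens = []
    for m in MASTER.finditer(code):
        kind, lexeme = m.lastgroup, m.group()
        if kind == "SKIP":
            continue  # whitespace ends a lexeme but produces no token
        if kind == "ID" and lexeme in KEYWORDS:
            kind = "KEYWORD"  # rule priority: reserved words beat identifiers
        tokens.append((kind, lexeme))
    return tokens

print(tokenize("float floatvalue = 3.14"))
```

Running this yields a KEYWORD token for float but an ID token for floatvalue, showing both rules working together.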