A parser is made up of two parts: a lexical analyzer and a syntactic/semantic analyzer. Confusingly, the term "parser" is often used to describe both the syntactic/semantic analyzer alone and the combination of the lexical and syntactic/semantic analyzers.
1.) Lexical analysis breaks the source language text into small pieces called tokens. Each token is a single atomic unit of the language, for instance a keyword, identifier or symbol name. The token syntax is typically a regular language, so a finite state automaton constructed from a regular expression can be used to recognize it. This phase is also called lexing or scanning, and the software doing lexical analysis is called a lexical analyzer or scanner.
2.) Syntax analysis involves parsing the token sequence to identify the syntactic structure of the program.
3.) Semantic analysis is the phase that checks the meaning of the program to ensure it obeys the rules of the language; type checking is one example. Most diagnostics are emitted during semantic analysis, which in practice is frequently interleaved with syntax analysis.
Bison is a parser generator, but it only generates code for the phases described in items 2 and 3. This means the code generated by Bison requires a separate implementation of the lexical phase described in item 1.
Bison provides a function called yyparse that is the entry point into the parser. However, yyparse expects a function called yylex to be defined to generate tokens from the source language. You either have to write this function yourself, or use a lexical analyzer generator tool like flex to generate it for you.
The following page provides some good resources on building parsers/compilers using Lex/Yacc or the modern GNU equivalents Flex/Bison.
The Lex and Yacc Page