val dummyLexer =
    {n}Parser.Stream.cons
        ({n}LrVals.Tokens.{dummy token name}
             ({dummy lineno},{dummy lineno}),
         lexer)
You have to pass a Tokens structure to the lexer. This Tokens structure
contains functions which construct tokens from values and line numbers.
So to create your dummy token, just apply the appropriate token constructor
function from this Tokens structure to a value (if there is one) and the
line numbers. This is exactly what you do in the lexer to construct tokens.
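For concreteness, here is a minimal sketch of this pattern for a hypothetical
parser named Calc whose grammar declares an EOF terminal carrying no value;
the names CalcParser, CalcLrVals, and lexer are assumptions, and 0 serves as
the dummy line number:

    val dummyEOF   = CalcLrVals.Tokens.EOF (0, 0)             (* constructor applied to dummy positions *)
    val dummyLexer = CalcParser.Stream.cons (dummyEOF, lexer)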
functor Encapsulate(
    structure Parser : PARSER
    structure Interface : INTERFACE
       sharing type Parser.arg = Interface.arg
       sharing type Parser.pos = Interface.pos
       sharing type Parser.result = ...
    structure Tokens : {parser name}_TOKENS
       sharing type Tokens.token = Parser.Token.token
       sharing type Tokens.svalue = Parser.svalue) =
  struct
    ...
  end
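As an illustration only (not the actual code from the example/fol directory
mentioned below), the body of such a functor might consist of a single parse
function along the following lines. The sketch assumes the usual ML-Yacc
PARSER signature, in which Parser.makeLexer takes a reader of type
int -> string and Parser.parse takes a lookahead count, a lexer, an error
function, and an argument; the lookahead value of 15 is an arbitrary choice.

    (* a possible replacement for the ... in the struct above; a sketch only *)
    val lookahead = 15    (* tokens of lookahead used during error correction *)

    fun parse (reader : int -> string) : Parser.result =
        let
          val _ = Interface.reset ()               (* reset the line counter *)
          val lexer = Parser.makeLexer reader      (* build the token stream *)
          val (result, _) =
              Parser.parse (lookahead, lexer, Interface.error, Interface.nothing)
        in
          result
        end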
The signature INTERFACE, defined below, is a possible signature for a
structure defining the types of line numbers and arguments (types pos and
arg, respectively), along with operations for them. You need this structure
because these types will be abstract types inside the body of your functor.
signature INTERFACE =
  sig
    type pos
    val line : pos ref
    val reset : unit -> unit
    val next : unit -> unit
    val error : string * pos * pos -> unit

    type arg
    val nothing : arg
  end

The directory example/fol contains a sample parser in which the code for
tying together the lexer and parser has been encapsulated in a functor.
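For concreteness, a structure matching INTERFACE might look like the
following sketch, which uses int line numbers, prints errors on standard
error, and passes no argument to the parser (arg is unit); none of these
choices is required by the signature.

    structure Interface : INTERFACE =
      struct
        type pos = int
        val line = ref 0
        fun reset () = line := 1                   (* call before each parse *)
        fun next () = line := !line + 1            (* call at each newline   *)
        fun error (msg, leftPos, _) =
            TextIO.output (TextIO.stdErr,
                           "error, line " ^ Int.toString leftPos ^ ": " ^ msg ^ "\n")

        type arg = unit
        val nothing = ()
      end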