lex/yacc questions from a novice...

Jeff Paton paton at latcs1.oz
Thu Aug 24 19:00:21 AEST 1989


> From: jwp at larry.sal.wisc.edu (Jeffrey W Percival)
> Newsgroups: comp.unix.questions
> Subject: lex/yacc questions from a novice...
> Message-ID: <711 at larry.sal.wisc.edu>
> Date: 22 Aug 89 16:41:14 GMT
> Organization: Space Astronomy Lab, Madison WI
> Lines: 81
> 
> I am trying to use lex and yacc to help me read a dense, long, (...)
> (deleted stuff here)
> 
> My first question is how one trades off work between lex and yacc.
> 
> Along these lines, a problem I am having is getting the message "too
> many definitions" from lex, when all I have are a few keywords and
> ancillary definitions: (lex file included below for illustration).  Is
> lex truly this limited in the number of definitions?  Can I increase
> this limit?  Or am I using lex for too much, and not using yacc for
> enough?

These are probably the right sort of tools for you - I am working on a similar
sort of problem.  Some rough rules of thumb that I have found to work:

1.	lex doesn't like too many things to recognise, but you can get around a
	lot of trouble by defining start states for your rules, so that some
	rules are ignored unless their state is active - be careful with states,
	however; I recall some problem with the number that you can have
	active at any one time.

2.	Store your keywords in a table of some description, and just get lex
	to look for "words"; then do a table lookup and return the token id
	(as per the y.tab.h from yacc -d).  This may mean you need to define
	more rules in yacc, but at least your statement analysis *may* be
	clearer.

3.	Having lex tell me what sort of token it has seen works well.

4.	lex has a number of table-size parameters you can alter to make it
	recognise more things - working out what to adjust is a "suck it and
	see" exercise.  (If any of the wizards can give me some better rules,
	please do!)
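As a rough sketch of how points 1, 2 and 4 fit together, a lex specification
might look something like the following.  The %p/%n/%e/%a numbers are plucked
from the air, and lookup(), NUMBER and y.tab.h are assumed to exist - this is
an illustration, not anyone's working grammar:

```lex
%{
#include "y.tab.h"      /* token ids from yacc -d (assumed to exist) */
extern int lookup();    /* hypothetical keyword-table search */
%}

%p 3000     /* positions   - the table sizes to raise when lex  */
%n 400      /* states        complains "too many ..."           */
%e 1500     /* parse tree nodes */
%a 3000     /* transitions */

%Start COMMENT

%%
"/*"                    BEGIN COMMENT;
<COMMENT>[^*]*          ;
<COMMENT>"*"+[^*/]*     ;
<COMMENT>"*"+"/"        BEGIN 0;
[A-Za-z][A-Za-z0-9]*    return lookup(yytext);
[0-9]+                  return NUMBER;
[ \t\n]                 ;
.                       return yytext[0];
%%
```

The %Start/BEGIN pair is the "some rules are ignored" trick from point 1, and
the single identifier rule feeding lookup() is point 2 - one rule instead of
one per keyword, which is what keeps the definition count down.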

Jeff Paton		while ( --braincell ) drink(...);
