A handy utility or clever technique to parse error logs?
User (firstname.lastname@example.org) posted:
Does anyone have a handy utility or clever technique to parse a text
file, searching for lines containing a certain keyword ("Error"),
and then delete the corresponding row (based on the 'User-' name) from a
different text file (Users.dat)?
For example, I have a text file containing LoadRunner error messages
as illustrated below:
Action1.c(1271): Error -17999 : !!NO DATA Iter#-1228 User-WILLS
Action1.c(1297): Iter#-1231 User-WIREFM Pass
Action1.c(1271): Error -17999 : !!NO DATA Iter#-1253 User-789REL
I have a parameter data table (USERS.dat) that contains my user names
and passwords -- but I want to delete the rows containing the bad
user names I've discovered on previous runs. In the example above,
the rows beginning with "WILLS" and "789REL" need to be eliminated
from my data input file.
I have thousands of records in the original Users.dat file and
hundreds that need to be deleted. Can this be done with a simple
Perl script? Several iterations of Excel? A WinRunner script? I've
considered modifying my LoadRunner script just to build a
new Users.dat, but I'm sure there must be a faster way. Surely
someone has done this before, and I don't want to re-invent the
wheel. Please help ASAP.
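A short sketch of one way to do this, in Python (a Perl one-liner could do the same). It assumes the user name follows "User-" on each error line, and that Users.dat is comma-delimited with the user name in the first column; adjust the pattern and delimiter to match your actual files.

```python
import re

# Usernames appear after "User-" on log lines, e.g.
# "Action1.c(1271): Error -17999 : !!NO DATA Iter#-1228 User-WILLS"
USER_RE = re.compile(r"User-(\w+)")

def bad_users(log_lines):
    """Collect user names from lines containing the keyword 'Error'."""
    bad = set()
    for line in log_lines:
        if "Error" in line:
            m = USER_RE.search(line)
            if m:
                bad.add(m.group(1))
    return bad

def filter_users(dat_lines, bad):
    """Keep only rows whose leading field is not a bad user name.
    Assumes Users.dat is comma-delimited, user name first."""
    return [row for row in dat_lines
            if row.split(",", 1)[0].strip() not in bad]

# Driven by the log excerpt from the post:
log = [
    "Action1.c(1271): Error -17999 : !!NO DATA Iter#-1228 User-WILLS",
    "Action1.c(1297): Iter#-1231 User-WIREFM Pass",
    "Action1.c(1271): Error -17999 : !!NO DATA Iter#-1253 User-789REL",
]
users = ["WILLS,secret1", "WIREFM,secret2", "789REL,secret3"]

bad = bad_users(log)             # {'WILLS', '789REL'}
print(filter_users(users, bad))  # ['WIREFM,secret2']
```

In real use you would read the log and Users.dat with open(), write the filtered rows to a new file, and only then replace the original.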