> I guess to clarify: using your suggestions instead of awk (which I hope to
> avoid), I used the Net::Telnet module to telnet to a radius server and
> logged in successfully. From that server I want to parse a colon-delimited
> file, written in real time at a rate too fast to read, for customer login
> failures for troubleshooting.
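For the colon-delimited parse described above, cut avoids awk entirely. A
sketch, where the field positions (2 = user, 4 = status) and the 'FAIL'
marker are assumptions about the log format, and the here-doc stands in
for the real log file:

```shell
#!/bin/sh
# Sketch of parsing a colon-delimited radius log without awk.
# Field positions (2 = user, 4 = status) and the 'FAIL' marker are
# assumptions; the here-doc stands in for the real log file.
cat > /tmp/sample.log <<'EOF'
20020101120000:alice:10.0.0.1:Login OK
20020101120005:bob:10.0.0.2:Login FAIL
EOF
grep -i 'fail' /tmp/sample.log | cut -d: -f2,4
```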
Sounds like you want to 'tail -f' a remote log file?
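If so, the simplest form is just tail run over ssh. A sketch, where
'radius-box' and the log path are placeholders, not the real setup:

```shell
#!/bin/sh
# Remote 'tail -f' sketch: 'radius-box' and the log path are placeholders.
#   ssh radius-box tail -f /var/log/radius.log
# Bounded local demo of the same follow behaviour ('timeout' stops it,
# since tail -f never exits on its own):
printf 'one\ntwo\n' > /tmp/demo.log
timeout 1 tail -f /tmp/demo.log || true
```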
You might be better off using a 'logrotate' type of activity on the radius
box to transfer, say, daily logs to your logalyzer machine. Then you can
analyze at will, and you don't need to worry about 'forever' disk space
on the radius box, as only the most recent day's log will need storage.
And you can move to using ssh and scp to move the file from the Radius
box to your backend securely.
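A minimal sketch of that batch pull, assuming a rotated log named
radius.log.1 and ssh keys already in place (both assumptions):

```shell
#!/bin/sh
# 'Batch pull' sketch: copy yesterday's rotated radius log to the
# analysis box. 'radius-box' and both paths are assumptions; run from
# cron shortly after logrotate fires.
#   scp radius-box:/var/log/radius.log.1 /var/spool/radlogs/radius.$(date +%Y%m%d)
# Local stand-in for the copy step so the sketch is runnable:
printf 'sample line\n' > /tmp/radius.log.1
cp /tmp/radius.log.1 "/tmp/radius.$(date +%Y%m%d)"
```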
I'm pretty certain you can use the Net::SSH module
and do what you need to do.
If you need a streaming feed (i.e. a tee on the Radius log file)
you might want to look into using a named pipe on the Radius box to
push the stream somewhere else. I don't recommend such a setup unless
your Radius box is _really_ thin, as it creates a dependency for
*something* to be at the other end of the pipe if you push stuff down
it, and your Radius box can stall if the other end of the pipe
goes away, fills up, or can't keep up. Basically, you get a more
complex, fragile system, with not much benefit from the 'streaming
push' model over the 'batch pull' model.
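For completeness, a named-pipe version of that tee could look like the
sketch below; the paths and host name are placeholders, and the runnable
demo at the end shows the reader/writer hand-off (and why a missing
reader blocks the writer):

```shell
#!/bin/sh
# 'Streaming push' sketch via a named pipe (FIFO). On the radius box:
#   mkfifo /var/log/radius.fifo
#   tail -f /var/log/radius.log > /var/log/radius.fifo &
# and *something* must always be reading the other end, e.g.:
#   ssh radius-box cat /var/log/radius.fifo
# Runnable demo of the blocking hand-off:
rm -f /tmp/demo.fifo
mkfifo /tmp/demo.fifo
( printf 'stream me\n' > /tmp/demo.fifo ) &  # writer blocks until a reader opens
cat /tmp/demo.fifo                           # reader drains the pipe
wait
rm -f /tmp/demo.fifo
```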
perl -le "$_='7284254074:0930970:H4012816';tr[0->][ BOPEN!SMUT];print"
To unsubscribe, send email to email@example.com with
"unsubscribe silug-discuss" in the body.