  1. #1

    Best way to search massive text files!


I am doing some work with MASSIVE log files for a special project. Over time I will accumulate literally hundreds of ASCII log files, each ranging from 100 MB to 2 GB in size.

    I have been using grep and awk to search these files, but it has become cumbersome to do this.
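For context, a typical ad-hoc search looks something like the sketch below (the file names, pattern, and field positions are made-up placeholders, not from my actual logs):

```shell
# Create a tiny stand-in log so the sketch is self-contained.
mkdir -p /tmp/logdemo
printf '2024-01-01 12:00:01 ERROR disk full\n2024-01-01 12:00:02 INFO startup ok\n' > /tmp/logdemo/app.log

# Scan every log for a pattern, then pull out the timestamp and level fields.
# On hundreds of multi-GB files this re-reads everything on every query,
# which is exactly what makes the approach cumbersome.
grep -h 'ERROR' /tmp/logdemo/*.log | awk '{print $1, $2, $3}'
```

Every query pays the full I/O cost of rescanning the raw files, which is why I am looking for something index-based.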

Is there a "better" way to search really big log files? Ideally something with a richer feature set.

Is there an open-source tool that could crawl the log files and index them with fast search algorithms? Something I could interface with via a web app (JavaScript, AJAX, or PHP) to retrieve results from the index?

Does anyone know of a tool like that?


  2. #2
    Linux Engineer drl's Avatar
Join Date: Apr 2006
Location: Saint Paul, MN, USA
Distros: CentOS, Debian, Slackware, {Free, Open, Net}BSD, Solaris

I've been using glimpse for years. I have a cron job that runs nightly to update the index. By default it indexes one's home directory, so you may need some configuration to restrict it to your log files.
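A minimal sketch of that setup, assuming glimpse is installed (the paths and schedule here are placeholders; check the glimpseindex man page for the exact flags on your version):

```shell
# Build or refresh the index over a log directory, keeping the index
# files in a dedicated directory instead of $HOME (hypothetical paths):
#
#   glimpseindex -H ~/.glimpse_logs /var/log/myproject
#
# Run it nightly from cron (crontab -e), e.g. at 02:30:
#
#   30 2 * * * glimpseindex -H ~/.glimpse_logs /var/log/myproject
#
# Then searches run against the index rather than the raw files:
#
#   glimpse -H ~/.glimpse_logs 'ERROR'
```

The point of the cron job is that the expensive pass over the raw logs happens once a night, so interactive searches stay fast.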

I use only the CLI interface; I do not use the web interface.

    I'm not sure I'd call it open source, but it doesn't cost anything for personal use:

    Webglimpse and Glimpse: advanced site search software for Unix : index websites or intranets

    Best wishes ... cheers, drl
