Topic
  • 4 replies
  • Latest Post - 2013-06-06T04:06:17Z by RSMT
fossl
2 Posts

Pinned topic Using Log Analytics for non-IBM log analysis

2013-05-16T13:41:42Z

How hard is it to create text analytics for non-IBM software environments, and does IBM provide examples?

For example:

  • JBoss
  • Rails apps
  • Oracle databases
  • Red Hat Linux system logs
  • other open source applications

Thanks,

  • dmcclure
    17 Posts

    Re: Using Log Analytics for non-IBM log analysis

    2013-05-16T14:13:04Z

    We have a few options you can use to get these logs in.

    The first and easiest option is to use the Generic Annotator. All that's required is a recognizable timestamp in each log record. You'll be able to bring in any log type, and we'll automatically discover patterns in those logs that are useful for directed search activities.

    For RHEL, you can follow my blog to see examples of how to use standard RHEL rsyslog to send any /var/log/* or application logs into SCALA. For now, I'd recommend aggregating any of the log types you mention into rsyslog for shipping to SCALA's Generic Annotator.
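
    For example, here's a minimal legacy-syntax rsyslog sketch (the file path, tag, facility, and the receiving host and port are placeholders; the actual destination depends on how your SCALA receiver is configured):

        # Load the file-input module so rsyslog can tail arbitrary log files.
        $ModLoad imfile

        # Tail an application log file (path, tag, and facility are placeholders).
        $InputFileName /var/log/myapp/app.log
        $InputFileTag myapp:
        $InputFileStateFile stat-myapp
        $InputFileFacility local3
        $InputRunFileMonitor

        # Forward everything over TCP (@@) to the host feeding SCALA;
        # replace the hostname and port with your receiver's values.
        *.* @@scala-host.example.com:514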

    Moving beyond the Generic Annotator, the next phase of developing content is to add more structure to what's parsed from the logs. For example, if you know the structure of each log record, you can build your own annotators to break that out. The annotators can be scripts you write, or you can use our Eclipse-based IDE to develop your own Insight Packs. This process is much more involved and requires deeper skills enablement on your part.
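
    As a rough illustration (not from an actual Insight Pack; the module name, view names, dictionary entries, and regex are all assumptions for this sketch), an AQL annotator that pulls a severity keyword and a timestamp out of each log record might look like:

        module logSample;

        -- Severity keywords we expect to find in a log record (illustrative list).
        create dictionary SeverityDict as
          ('INFO', 'WARN', 'ERROR', 'FATAL');

        -- Mark each severity keyword occurrence in the document text.
        create view Severity as
          extract dictionary 'SeverityDict'
            on D.text as severity
          from Document D;

        -- Match an ISO-style timestamp such as 2013-05-16 13:41:42.
        create view LogTimestamp as
          extract regex /\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}/
            on D.text
            return group 0 as ts
          from Document D;

        output view Severity;
        output view LogTimestamp;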

    I'd be happy to help guide you through one or two of your use cases if you'd like. Please feel free to contact me here or directly for help.


    Doug

  • fossl
    2 Posts

    Re: Using Log Analytics for non-IBM log analysis

    2013-05-16T14:25:42Z
    • In reply to dmcclure, 2013-05-16T14:13:04Z

    Doug, 

    Thanks for the insights. In terms of developing our own annotators, is that done using the Text Analytics function within the Hadoop/BigInsights repository of the log data? Are you using AQL with the Eclipse GUI to create the Insight Packs?

    Thanks,

    Lou Foss

  • dmcclure
    17 Posts

    Re: Using Log Analytics for non-IBM log analysis

    2013-05-16T15:26:12Z
    • In reply to fossl, 2013-05-16T14:25:42Z

    We're using the very same BigInsights 2.0 components in terms of the tool kit, SystemT/AQL, etc. We've extended the BigInsights 2.0 Eclipse IDE to help develop all of the text analytics parts (splitters, annotators, dictionaries, etc.), as well as other parts specific to our product, and then package them for installation in the SmartCloud Analytics for Log Analysis tool. The key difference is that we're not acting on log data in the BigInsights Hadoop backend (yet); we're posting the annotated log data to DataExplorer for indexing and search.

    HTH,

    Doug

  • RSMT
    8 Posts

    Re: Using Log Analytics for non-IBM log analysis

    2013-06-06T04:06:17Z
    • In reply to fossl, 2013-05-16T14:25:42Z

    Hi,

    You may find the list of log types tested at:
    https://www.ibm.com/developerworks/community/wikis/home?lang=en#!/wiki/IBM%20Log%20Analytics%20Beta/page/Supported%20Log%20Types

    Regards.

    Geetha.