USM Anywhere™

Collect Other Logs from an Amazon S3 Bucket

Role Availability: Read-Only, Analyst, Manager

In addition to the native service-specific logging that Amazon Web Services (AWS) provides, individual applications you run in the AWS environment often generate their own log files. You can forward these logs to an Amazon Simple Storage Service (S3) bucket and configure USM Anywhere to collect logs from that Amazon S3 bucket. USM Anywhere does not restrict the number of logs you can collect, but AWS does set limits on the number of logs it can return in each operation.

To collect logs from an Amazon S3 bucket

  1. Go to Settings > Scheduler.
  2. In the left navigation menu, click Log Collection.

    Note: You can use the Sensor filter at the top of the list to review the available log collection jobs on your AWS Sensor.

  3. Click Create Log Collection Job.

    Click Create Log Collection Job to add a scheduled log collection job

    Note: If you have recently deployed a new USM Anywhere Sensor, it can take 10 to 20 minutes for USM Anywhere to discover the various log sources. After it discovers the logs, you must manually enable the AWS log collection jobs you want before the system collects the log data.

  4. The Schedule New Job dialog box opens.

  5. Enter the name and description for the job.

    The description is optional, but it is a best practice to provide one so that others can easily understand what the job does.

  6. In the Action Type option, select Amazon Web Services.
  7. In the App Action option, select Monitor S3 Bucket.

    Select the AWS sensor, Amazon Web Services app, and the Monitor S3 Bucket action

  8. Enter the Bucket Name and Path.

    The bucket name is the name of the Amazon S3 bucket as configured in your AWS account, such as alienvault-test-0726 in the screenshot below.

    The path is the path prefix within the Amazon S3 bucket, such as sub-folder1 in the screenshot below. This does not include the bucket name.

    Note: Logs from the directory and its subdirectories are collected.
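
    If you have the AWS CLI installed, you can confirm the bucket name and path prefix before scheduling the job. The sketch below uses the example names from this article (substitute your own) and only prints the listing command; remove the echo to run it against your account.

    ```shell
    # Bucket name and path prefix as they appear in the job configuration.
    # These are the example values from this article; substitute your own.
    BUCKET="alienvault-test-0726"
    PREFIX="sub-folder1"

    # Printed rather than executed here; drop the echo to list the objects
    # for real (requires AWS credentials with s3:ListBucket permission).
    echo aws s3 ls "s3://${BUCKET}/${PREFIX}/" --recursive
    ```

    The `--recursive` flag lists objects in the prefix and its subdirectories, matching what the collection job gathers.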

  9. In Source Format, select either of the following log formats:

    • syslog: The standard format for transmitting log data to USM Anywhere.
    • raw: Use for data that is not in syslog format.

    Specify the bucket name, path, and source format for the S3 logs
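
    If you are unsure which Source Format to choose, inspecting a sample line from your log file can help. The sketch below uses a hypothetical sample line and a simplified BSD-syslog timestamp pattern; it is an approximation, not a full syslog parser, and real logs may use other syslog variants.

    ```shell
    # Hypothetical sample line; replace with a line from your own log file.
    SAMPLE='Jan 12 10:05:01 web-01 sshd[4321]: Accepted publickey for admin'

    # A classic BSD-syslog line starts with "Mmm dd hh:mm:ss" and a hostname.
    # This simplified pattern is only a heuristic.
    if printf '%s\n' "$SAMPLE" | grep -Eq '^[A-Z][a-z]{2} +[0-9]+ [0-9:]{8} '; then
      FORMAT="syslog"
    else
      FORMAT="raw"
    fi
    echo "$FORMAT"
    ```

    If most lines in the file match a syslog-style header, choose syslog; otherwise choose raw.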

  10. In the Schedule section, specify when USM Anywhere runs the job:

    1. Select the increment as Minute, Hour, Day, Week, Month, or Year.
    2. Set the interval options for the increment.

      The selected increment determines the available options. For example, on a weekly increment you can select the days of the week to run the job.

      Set the schedule for the job to run each week

      Or on a monthly increment, you can specify a date or a day of the week that occurs within the month.

      Set the schedule for the job to run each month

    3. Set the Start time.

      This is the time that the job starts at the specified interval. It uses the time zone configured for your USM Anywhere instance (default is Coordinated Universal Time [UTC]).

  11. Click Save.

    If USM Anywhere detects an enabled job with the same configuration, it asks you to confirm before continuing, because two jobs with the same configuration generate duplicate events and alarms.

  12. In the AWS console, restart the AWS Sensor instance so that it detects the new configuration.
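
    If you prefer the command line to the AWS console, the restart in the last step can also be performed with the AWS CLI `aws ec2 reboot-instances` command. The instance ID below is a hypothetical placeholder, and the sketch only prints the command; remove the echo to run it against your sensor instance.

    ```shell
    # Hypothetical instance ID; replace with the ID of your USM Anywhere Sensor.
    SENSOR_INSTANCE_ID="i-0123456789abcdef0"

    # Printed rather than executed here; drop the echo to reboot for real
    # (requires AWS credentials with ec2:RebootInstances permission).
    echo aws ec2 reboot-instances --instance-ids "$SENSOR_INSTANCE_ID"
    ```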

Moving Logs from an Amazon EC2 Instance to an Amazon S3 Bucket

In Amazon Elastic Compute Cloud (EC2), it can be difficult to create direct network connections between isolated parts of your environment. Amazon S3 provides a convenient way to move application logs from an Amazon EC2 instance to an Amazon S3 bucket. Amazon S3 buckets are used to store objects that consist of data and metadata that describes the data. You then configure the AWS Sensor to retrieve and process the log files.

To make the logs available to USM Anywhere, synchronize them from your instance to an Amazon S3 bucket. There are multiple ways to do this; the easiest is to use the AWS Command Line Interface (AWS CLI), as documented by Amazon. Create a script similar to the following example and configure it to run periodically as a cron job.

aws s3 sync "<path_to_log>" "s3://<bucket_name>/<storage_path>/"
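
Putting the pieces together, a minimal sync script might look like the following. The log directory, bucket name, and prefix are assumptions for illustration, and the script prints the command it would run; remove the echo once the values match your environment.

```shell
#!/bin/sh
# sync-logs.sh - periodically copy application logs to an S3 bucket.

LOG_DIR="/var/log/myapp"        # hypothetical application log directory
BUCKET="alienvault-test-0726"   # bucket name from the example above
PREFIX="sub-folder1"            # path prefix monitored by the collection job

# "aws s3 sync" copies only new or changed files, so repeated runs are cheap.
# Printed rather than executed here; drop the echo to sync for real.
echo aws s3 sync "$LOG_DIR" "s3://${BUCKET}/${PREFIX}/"
```

A crontab entry such as `*/15 * * * * /usr/local/bin/sync-logs.sh` would run the script every 15 minutes; the script path and interval are illustrative, so adjust them to how quickly you need new log data collected.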