
Log file formats v1

Legacy Log Monitoring v1

You are viewing documentation for Log Monitoring v1. Dynatrace Log Monitoring v1 is considered a legacy solution.

We strongly encourage you to switch to the latest Dynatrace Log Monitoring version.

  • Compare legacy Log Monitoring v1 to the latest version of Dynatrace log monitoring.
  • Switch to the latest version of Dynatrace log monitoring.

Log Monitoring can read and analyze:

Windows event logs

The System, Security, and Application logs are automatically discovered on hosts. Other event logs in custom formats can be added manually at the process group level.

Plain-text logs

Any plain-text log file can be analyzed, as long as it contains a timestamp and meets these basic requirements:

  • The file must be encoded as UTF-8 or UTF-16. Files with any other encoding are recognized as binary.

  • A timestamp must appear at the beginning of each log entry.

  • The timestamp date can be separated using any of the following:
    (space)
    / (slash)
    - (dash)
    . (period)
    T (Combined date and time in UTC, ISO 8601 format)
    , (comma)
    @ (at sign)

  • For the timestamp date, either month abbreviations or full month names can be used in the date format.
    By default, Log Monitoring recognizes only English months, written either as numbers (1-12) or as English abbreviations (Jan-Dec).

  • A timestamp must include BOTH the date and time.

  • A timestamp time is in the following format:

plaintext
[0-9]{1,2}:[0-9]{2}:[0-9]{2}((\.|,)[0-9]{1,9})?( *AM|PM)? *((GMT|UTC)?[+-][0-9]{2,4})?(GMT|CEST|CET|OTHER_TZ_ABBREVIATION)?
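
The pattern above can be checked locally if needed. The following is a minimal, hypothetical Python sketch (not part of Dynatrace) that compiles the documented pattern and tests it against a few of the sample time values listed later on this page. Case-insensitive matching is an assumption made here so that lowercase values such as 12:13:13.123pm are accepted, and OTHER_TZ_ABBREVIATION is kept as the documentation's placeholder.

python
import re

# Time-format pattern from this page, compiled with case-insensitive matching
# (assumed here so that lowercase "pm" is accepted). OTHER_TZ_ABBREVIATION is
# the documentation's placeholder for other time-zone abbreviations.
TIME_PATTERN = re.compile(
    r"[0-9]{1,2}:[0-9]{2}:[0-9]{2}"            # hours:minutes:seconds
    r"((\.|,)[0-9]{1,9})?"                     # optional fractional seconds
    r"( *AM|PM)? *"                            # optional AM/PM marker
    r"((GMT|UTC)?[+-][0-9]{2,4})?"             # optional numeric offset
    r"(GMT|CEST|CET|OTHER_TZ_ABBREVIATION)?",  # optional time-zone abbreviation
    re.IGNORECASE,
)

for sample in ["12:23:34.123", "0:00:00 GMT", "12:13:14.123 AM", "12:13:05GMT+0100"]:
    print(sample, "->", bool(TIME_PATTERN.fullmatch(sample)))

All four sample values also appear in the list of valid time formats further down this page.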

JSON logs

The timestamp in a JSON file is automatically detected through the time or timestamp attribute, and it must be in the <year>-<month>-<day>T<time> format.
For example: 2018-02-28T16:17:50.000

Also, the JSON file must meet the following conditions:

  • The date must be in UTC.
  • The file cannot contain any headers.
  • In the file, each log entry is represented by one JSON object on one line.
    For example:
    plaintext
    { "log" : "content = 0, t = 1000000000", "time":"2019-10-09T14:45:00.000000Z", "stream" : "stderr" } { "log" : "content = 0, t = 1000000000", "time":"2019-10-09T14:46:00.000000Z", "stream" : "stderr" }
  • To be automatically recognized and indexed, attributes must be placed at the top level of the JSON object (see the sketch after this list). For example, you can make an API log ingest call with JSON that contains the following log attributes:
    json
    { "timestamp": "2021-07-29T10:54:40.962165022Z", "level": "error", "source": "Skynet", "application.id": "PaymentService-Prod", "message": "PaymentService-Prod failure.", "data": {} }
    Any nested data will be recognized as a string value for that top-level property.
    For example:
    json
    { "timestamp": "2021-07-29T10:54:40.962165022Z", "level": "error", "source": "Skynet", "application.id": "PaymentService-Prod", "message": "PaymentService-Prod failure.", "data": { "error": [ { "id": "1001", "type": "Regular" }, { "id": "1002", "type": "Extreme" } ] } }
    Everything in the data property will be treated as a string value for data.
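
As a rough illustration of the conditions above, here is a hypothetical Python sketch (the write_json_log helper and the payment-service.log file name are invented for this example) that appends log entries as one JSON object per line, with a UTC timestamp and the searchable attributes kept at the top level.

python
import json
from datetime import datetime, timezone

# Hypothetical sketch: append one JSON object per line, with a UTC timestamp
# and attributes kept at the top level so they can be recognized and indexed.
def write_json_log(path, entries):
    with open(path, "a", encoding="utf-8") as log_file:
        for entry in entries:
            record = {
                "timestamp": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ"),
                **entry,  # e.g. "level", "source", "message" as top-level keys
            }
            log_file.write(json.dumps(record) + "\n")  # one object per line, no header

write_json_log("payment-service.log", [
    {"level": "error", "source": "Skynet", "message": "PaymentService-Prod failure."},
])

Anything that should remain individually searchable stays at the top level of each object; as noted above, values nested inside a property such as data are stored as a single string value for that property.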

Examples of valid log file date formats

plaintext
2018 04 06
2018/04/06
2018-04-06
2018.04.06
2018-04-06T18:46:19Z

Examples of valid log file time formats

plaintext
12:23:34.123
12:23:34.123GMT+0100
12:23:34.123 GMT+0100
0:00:00 GMT
0:00:00 GMT+0100
00:00:00
12:13:01+0100
12:13:02.123
12:13:03.123123+0100
12:13:02,123
12:13:03,123123+0100
12:13:04GMT
12:13:05GMT+0100
12:13:06GMT+01
12:13:09+0100
12:13:10+01
12:13:12+0200
12:13:13.123pm
12:13:14.123 AM
12:13:15.123PM+01
12:13:16.123 AM+02
12:13:17CEST
12:13:18 CET
00:13:19

Examples of valid log file timestamps

plaintext
2018-04-06 09:54:04.839 UTC
2018-04-06 11:01:19,625
2018/04/06 11:06:23 UTC
Apr 6 12:23:52
Apr-6 13:35:57.621
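
Purely to show what these sample timestamps contain, the sketch below parses a few of them with Python's standard library using explicit format strings. The format strings are illustrative assumptions; Log Monitoring detects supported formats automatically and does not require any such configuration.

python
from datetime import datetime

# Illustrative only: parse a few of the sample timestamps above with explicit
# format strings. Log Monitoring itself detects these formats automatically.
samples = [
    ("2018-04-06 11:01:19,625", "%Y-%m-%d %H:%M:%S,%f"),  # comma-separated fraction
    ("2018/04/06 11:06:23 UTC", "%Y/%m/%d %H:%M:%S %Z"),  # trailing time-zone name
    ("Apr 6 12:23:52", "%b %d %H:%M:%S"),                 # month abbreviation, no year (defaults to 1900)
]
for value, fmt in samples:
    print(value, "->", datetime.strptime(value, fmt))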

Incorrect date formats

Any log file containing an invalid timestamp will generate an Incorrect date format error and will not be analyzed or stored (only the file status will be reported).