Export logs to Google Cloud Storage

This task sends logs to a Google Cloud Storage bucket.

```yaml
type: "io.kestra.plugin.ee.gcp.gcs.LogExporter"
```

Ship logs to GCP

```yaml
id: log_shipper
namespace: company.team

triggers:
  - id: daily
    type: io.kestra.plugin.core.trigger.Schedule
    cron: "@daily"

tasks:
  - id: log_export
    type: io.kestra.plugin.ee.core.log.LogShipper
    logLevelFilter: INFO
    lookbackPeriod: P1D
    logExporters:
      - id: GCPLogExporter
        type: io.kestra.plugin.ee.gcp.gcs.LogExporter
        projectId: myProjectId
        format: JSON
        maxLinesPerFile: 10000
        bucket: my-bucket
        logFilePrefix: kestra-log-file
        chunk: 1000
```
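
The example above relies on the instance's default GCP credentials. To authenticate explicitly, the exporter also accepts the serviceAccount, impersonatedServiceAccount, and scopes properties described below. A minimal sketch of the logExporters block only, assuming the service account key JSON is stored in a Kestra secret named GCP_SA_KEY (a hypothetical name):

```yaml
logExporters:
  - id: GCPLogExporter
    type: io.kestra.plugin.ee.gcp.gcs.LogExporter
    projectId: myProjectId
    bucket: my-bucket
    # Service account key JSON; GCP_SA_KEY is a hypothetical secret name
    serviceAccount: "{{ secret('GCP_SA_KEY') }}"
    # Optional: defaults to the cloud-platform scope
    scopes:
      - "https://www.googleapis.com/auth/cloud-platform"
```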
Properties

bucket
GCS bucket to upload log files.
The bucket where log files are going to be exported.

id
Validation RegExp: ^[a-zA-Z0-9][a-zA-Z0-9_-]*
Min length: 1
chunk
Default: 1000
The chunk size for every bulk request.

format
Default: JSON
Possible Values: ION, JSON
The format of the exported files.

impersonatedServiceAccount
The GCP service account to impersonate.

logFilePrefix
Default: kestra-log-file
The prefix of the log file names. The full file name will be logFilePrefix-localDateTime.json (or .ion, depending on the format).

maxLinesPerFile
Default: 100000
The maximum number of lines per file.

projectId
The GCP project ID.

scopes
SubType: string
Default: ["https://www.googleapis.com/auth/cloud-platform"]
The GCP scopes to be used.

serviceAccount
The GCP service account key.
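
Putting the file-layout properties together, here is a hedged sketch of a logExporters entry that writes ION files split into chunks of at most 50,000 lines; the bucket and prefix values are placeholders. Following the logFilePrefix description above, the resulting objects would be named like prod-logs-localDateTime.ion.

```yaml
logExporters:
  - id: GCPLogExporter
    type: io.kestra.plugin.ee.gcp.gcs.LogExporter
    projectId: myProjectId
    bucket: my-bucket          # placeholder bucket name
    format: ION                # exported files end in .ion instead of .json
    logFilePrefix: prod-logs   # placeholder prefix; files are named prod-logs-<localDateTime>.ion
    maxLinesPerFile: 50000     # split exports into files of at most 50,000 lines
```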