`crowdsec-docs/docs/appsec/configuration.md` (2 additions, 2 deletions)

```diff
@@ -6,7 +6,7 @@ sidebar_position: 6

 ## Overview

-This page explains the interraction between various files involved in AppSec configuration and the details about the processing pipeline AppSec request processing.
+This page explains the interaction between the various files involved in AppSec configuration, and details the AppSec request processing pipeline.

 **Prerequisites**:

 - Familiarity with [AppSec concepts](/appsec/intro.md)
@@ -24,7 +24,7 @@ The goals of the acquisition file are:

 - To specify the **address** and **port** where the AppSec-enabled Remediation Component(s) will forward the requests to.
 - And to specify one or more [AppSec configuration files](#appsec-configuration) to use as the definition of which rules to apply and how.

-Details can be found in the [AppSec Datasource page](/log_processor/data_sources/apps).
+Details can be found in the [AppSec Datasource page](/log_processor/data_sources/appsec.md).
```
`crowdsec-docs/docs/appsec/quickstart/traefik.mdx` (4 additions, 4 deletions)

```diff
@@ -25,7 +25,7 @@ Additionally, we'll show how to monitor these alerts through the [console](https
 - Traefik Plugin **[Remediation Component](/u/bouncers/intro)**: Thanks to [maxlerebourg](https://github.com/maxlerebourg) and team, who created a [Traefik Plugin](https://plugins.traefik.io/plugins/6335346ca4caa9ddeffda116/crowdsec-bouncer-traefik-plugin) that allows you to block requests directly from Traefik.

 :::info
-Prior to starting the guide ensure you are using the [Traefik Plugin](https://plugins.traefik.io/plugins/6335346ca4caa9ddeffda116/crowdsec-bouncer-traefik-plugin) and **NOT** the older [traefik-crowdsec-bouncer](https://app.crowdsec.net/hub/author/fbonalair/remediation-components/traefik-crowdsec-bouncer) as it hasnt recieved updates to use the new AppSec Component.
+Prior to starting the guide, ensure you are using the [Traefik Plugin](https://plugins.traefik.io/plugins/6335346ca4caa9ddeffda116/crowdsec-bouncer-traefik-plugin) and **NOT** the older [traefik-crowdsec-bouncer](https://app.crowdsec.net/hub/author/fbonalair/remediation-components/traefik-crowdsec-bouncer), as it hasn't received updates to use the new AppSec Component.
 :::

 :::warning
@@ -77,7 +77,7 @@ If you have a folder in which you are persisting the configuration files, you ca
 These steps will change depending on how you are running the Security Engine. If you are running via `docker run`, you should launch the container from the same directory as the `appsec.yaml` file. If you are using `docker-compose`, you can use a relative file mount to mount the `appsec.yaml` file.

 Steps:
-1. Change to the location where you exectued the `docker run` or `docker compose` command.
+1. Change to the location where you executed the `docker run` or `docker compose` command.
 2. Create an `appsec.yaml` file at the base of the directory.
 3. Add the following content to the `appsec.yaml` file.
@@ -96,11 +96,11 @@ Since CrowdSec is running inside a container you must set the `listen_addr` to `
 <FormattedTabs
   docker={`# Note: if you already have a container running, you will need to stop it before running this command
-docker run -d --name crowdsec -v /path/to/orginal:/etc/crowdsec -v ./appsec.yaml:/etc/crowdsec/acquis.d/appsec.yaml crowdsecurity/crowdsec`}
+docker run -d --name crowdsec -v /path/to/original:/etc/crowdsec -v ./appsec.yaml:/etc/crowdsec/acquis.d/appsec.yaml crowdsecurity/crowdsec`}
   dockerCompose={`services:
   crowdsec:
     volumes:
-      - /path/to/orginal:/etc/crowdsec ## or named volumes
+      - /path/to/original:/etc/crowdsec ## or named volumes
```
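The compose fragment in the diff above can be assembled into a complete service definition. A minimal sketch, assuming the image tag, host paths, and a `listen_addr` of `0.0.0.0:7422` (all placeholders to adapt):

```yaml
services:
  crowdsec:
    image: crowdsecurity/crowdsec:latest
    volumes:
      - /path/to/original:/etc/crowdsec            # or a named volume holding your existing config
      - ./appsec.yaml:/etc/crowdsec/acquis.d/appsec.yaml  # the AppSec acquisition file created above
    ports:
      - "7422:7422"   # expose the AppSec listener to the Remediation Component
```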
`crowdsec-docs/docs/getting_started/crowdsec_tour.mdx` (1 addition, 1 deletion)

```diff
@@ -250,7 +250,7 @@ Those metrics are a great way to know if your configuration is correct:
 The `Acquisition Metrics` are a great way to know if your parsers are set up correctly:

 - If you have 0 **LINES PARSED** for a source: you are probably *missing* a parser, or you have a custom log format that prevents the parser from understanding your logs.
-- However, it's perfectly OK to have a lot of **LINES UNPARSED** : Crowdsec is not a SIEM, and only parses the logs that are relevant to its scenarios. For example, [ssh parser](https://hub.crowdsec.net/author/crowdsecurity/configurations/sshd-logs), only cares about failed authentication events (at the time of writting).
+- However, it's perfectly OK to have a lot of **LINES UNPARSED**: CrowdSec is not a SIEM, and only parses the logs that are relevant to its scenarios. For example, the [ssh parser](https://hub.crowdsec.net/author/crowdsecurity/configurations/sshd-logs) only cares about failed authentication events (at the time of writing).
 - **LINES POURED TO BUCKET** tells you that your scenarios are matching your log sources: it means that some events from this log source made their way to an actual scenario.
```
`crowdsec-docs/docs/log_processor/data_sources/introduction.md` (26 additions, 12 deletions)

```diff
@@ -1,20 +1,25 @@
 ---
 id: intro
-title: Acquisition Datasources Introduction
+title: Acquisition Datasources
 sidebar_position: 1
 ---

-## Datasources
+To monitor applications, the Security Engine needs to read logs.
+DataSources define where to access them (either as files, or over the network from a centralized logging service).

-To be able to monitor applications, the Security Engine needs to access logs.
-DataSources are configured via the [acquisition](/configuration/crowdsec_configuration.md#acquisition_path) configuration, or specified via the command-line when performing cold logs analysis.
+They can be defined:

+- in [Acquisition files](/configuration/crowdsec_configuration.md#acquisition_path). Each file can contain multiple DataSource definitions. This configuration can be generated automatically; please refer to the [Service Discovery documentation](/log_processor/service-discovery-setup/intro.md).
+- via the command line, for cold log analysis.

 ## Datasources modules

 Name | Type | Stream | One-shot
 -----|------|--------|----------
 [Appsec](/log_processor/data_sources/appsec.md) | expose HTTP service for the Appsec component | yes | no
 [AWS cloudwatch](/log_processor/data_sources/cloudwatch.md) | single stream or log group | yes | yes
-[AWS kinesis](/log_processor/data_sources/kinesis.md)| read logs from a kinesis strean | yes | no
+[AWS kinesis](/log_processor/data_sources/kinesis.md) | read logs from a kinesis stream | yes | no
 [AWS S3](/log_processor/data_sources/s3.md) | read logs from a S3 bucket | yes | yes
 [file](/log_processor/data_sources/file.md) | single files, glob expressions and .gz files | yes | yes
@@ -46,6 +51,7 @@ An expression that will run after the acquisition has read one line, and before
 It allows you to modify an event (or generate multiple events from one line) before parsing.

 For example, if you acquire logs from a file containing a JSON object on each line, and each object has a `Records` array with multiple events, you can use the following to generate one event per entry in the array:
@@ -62,39 +68,47 @@ By default, when reading logs in real-time, crowdsec will use the time at which
 Setting this option to `true` will force crowdsec to use the timestamp from the log as the time of the event.

-It is mandatory to set this if your application buffers logs before writting them (for example, IIS when writing to a log file, or logs written to S3 from almost any AWS service).<br/>
+It is mandatory to set this if your application buffers logs before writing them (for example, IIS when writing to a log file, or logs written to S3 from almost any AWS service).<br/>
 If not set, then crowdsec will think all logs happened at once, which can lead to some false positive detections.

 ### `labels`

 A map of labels to add to the event.
 The `type` label is mandatory, and used by the Security Engine to choose which parser to use.
```
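For illustration, a minimal file DataSource showing the mandatory `type` label alongside the options discussed above (the path, `type` value, and option name `use_time_machine` are assumptions based on this page's descriptions; check the file datasource page for your version):

```yaml
# Example acquisition entry, e.g. in acquis.d/sshd.yaml
source: file
filenames:
  - /var/log/auth.log     # glob expressions and .gz files are also supported
labels:
  type: syslog            # mandatory: tells the Security Engine which parsers apply
use_time_machine: true    # optional: use the timestamp from the log, not read time
```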
`crowdsec-docs/docs/log_processor/data_sources/syslog_service.md` (2 additions, 2 deletions)

```diff
@@ -51,6 +51,6 @@ This module does not support command-line acquisition.

 :::warning
 This syslog datasource is currently intended for small setups, and is at risk of losing messages over a few hundred events/second.
-To process significant amounts of logs, rely on dedicated syslog server such as [rsyslog](https://www.rsyslog.com/), with this server writting logs to files that Security Engine will read from.
+To process significant amounts of logs, rely on a dedicated syslog server such as [rsyslog](https://www.rsyslog.com/), with this server writing logs to files that the Security Engine will read from.
 This page will be updated with further improvements of this data source.
```
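As a sketch of that recommended setup, an rsyslog configuration that receives remote syslog and persists it to a file (which a CrowdSec file datasource can then tail) might look like the following; the port, file path, and lack of filtering are assumptions to adapt:

```
# /etc/rsyslog.d/10-remote.conf -- receive remote syslog over UDP
module(load="imudp")
input(type="imudp" port="514")

# Write everything received to one file; add filters as needed
action(type="omfile" file="/var/log/remote-syslog.log")
```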
`crowdsec-docs/docs/log_processor/intro.mdx` (21 additions, 16 deletions)

```diff
@@ -4,24 +4,23 @@ title: Introduction
 sidebar_position: 1
 ---

-The Log Processor is one of the core component of the Security Engine to:
+The Log Processor is a core component of the Security Engine. It:

-- Read logs from [Data Sources](log_processor/data_sources/introduction.md) in the form of Acquistions.
-- Parse the logs and extract relevant information using [Parsers](log_processor/parsers/introduction.mdx).
-- Enrich the parsed information with additional context such as GEOIP, ASN using [Enrichers](log_processor/parsers/enricher.md).
-- Monitor the logs for patterns of interest known as [Scenarios](log_processor/scenarios/introduction.mdx).
-- Push alerts to the Local API (LAPI) for alert/decisions to be stored within the database.
-
-!TODO: Add diagram of the log processor pipeline
+- Reads logs from [Data Sources](log_processor/data_sources/introduction.md) via Acquisitions.
+- Parses logs and extracts relevant information using [Parsers](log_processor/parsers/introduction.mdx).
+- Enriches the parsed information with additional context, such as GeoIP and ASN, using [Enrichers](log_processor/parsers/enricher.md).
+- Monitors patterns of interest via [Scenarios](log_processor/scenarios/introduction.mdx).
+- Pushes alerts to the Local API (LAPI), where alerts/decisions are stored.

 - Read logs from datasources
 - Parse the logs
 - Enrich the parsed information
 - Monitor the logs for patterns of interest

+<!-- !TODO: Add diagram of the log processor pipeline -->

-## Introduction
+## Log Processor

-The Log Processor is an internal core component of the Security Engine in charge of reading logs from Data Sources, parsing them, enriching them, and monitoring them for patterns of interest.
-`acquis.yaml` file: This used to be only place to define Acquisitions prior to `1.5.0`. This file is still supported for backward compatibility.
-`acquis.d` folder: This is a directory where you can define multiple Acquisitions in separate files. This is useful when you want to auto generate files using an external application such as ansible.
```