docker-elk | The Elastic stack powered by Docker and Compose | Continuous Deployment library
kandi X-RAY | docker-elk Summary
The Elastic stack (ELK) powered by Docker and Compose.
Community Discussions
Trending Discussions on docker-elk
QUESTION
I am using the latest code of the git@github.com:deviantony/docker-elk.git repository to host the ELK stack with the docker-compose up command. Elasticsearch and Kibana are running fine, but I cannot index into Logstash with my logstash.conf, which is shown below:
...
ANSWER
Answered 2021-Mar-20 at 16:47
In your elasticsearch output plugin, set the hosts property to elasticsearch:9200.
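A minimal sketch of such an output section, assuming the docker-elk service name and the stack's default bootstrap credentials (the user and password values are placeholders for your own setup):

output {
  elasticsearch {
    # the Compose service name resolves on the Docker network
    hosts => "elasticsearch:9200"
    # only needed when security is enabled; values shown are the stack defaults
    user => "elastic"
    password => "changeme"
  }
}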
QUESTION
I'm learning how to use Logstash and I'm facing some problems reading a constantly updated file with Logstash. Here is my test:
- logstash.conf
ANSWER
Answered 2021-Feb-15 at 17:10
If you are using a text editor, then you are probably creating a new file each time you exit it. That could be an inode reuse issue. There are links to various issues in META issue 211; see especially issue 251.
Tracking which files have been read when those files can get rotated is an extremely hard problem. Way harder than most folks would initially think. A good option to get it right is to checksum the file contents (although this is not foolproof). The file input does not do that, because it can get ridiculously expensive. Instead it implements a very cheap technique that almost always gets it right (but in a few cases it decides it has already read a file that it has not read).
There are other cases where it gets it wrong by duplicating data (which is what you are hitting). As I said, it is a really hard problem.
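For testing, a minimal sketch of a file input that side-steps this tracking can help isolate the problem (the path is hypothetical; sincedb_path => "/dev/null" keeps Logstash from persisting read positions, so the file is re-read on every run):

input {
  file {
    # hypothetical path to the file being tailed
    path => "/usr/share/logstash/data/test.log"
    # read from the start of the file instead of only new lines
    start_position => "beginning"
    # do not persist read positions between runs (testing only)
    sincedb_path => "/dev/null"
  }
}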
QUESTION
I'm new to the ELK stack, and I'm trying to do a very basic experiment: send a message to the Logstash stdout with a PUT request, based on this repo: link
Logstash's port is 9600, and I use Postman to send a PUT request. It returns 404.
My logstash.conf is very simple.
ANSWER
Answered 2021-Jan-29 at 13:40
Port 9600 is the port of the Logstash API, used for monitoring Logstash, not the port of the http input.
If you want to use the http input and you did not specify a port in the configuration, use port 8080, which is the default port for this input.
You will also need to expose this port in your Docker configuration.
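A minimal sketch of both pieces, assuming the docker-elk file layout (the explicit port setting and the Compose mapping are illustrative):

In logstash/pipeline/logstash.conf:

input {
  http {
    # 8080 is the plugin default; set it explicitly to avoid surprises
    port => 8080
  }
}
output {
  # print received events so they show up in the container logs
  stdout { codec => rubydebug }
}

In docker-compose.yml, under the logstash service:

ports:
  - "8080:8080"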
QUESTION
I'm using docker-elk and I'd like to clean all the log files, but I'm not sure where they're stored. The funny thing is, when I stop and remove all the Docker containers and then run them again from the Compose file, the ELK server still contains all the old logs. Why is that?
Here's my docker-compose.yml for reference:
...
ANSWER
Answered 2020-Aug-12 at 15:59
While a non-Docker Elasticsearch logs to /var/log/elasticsearch/elasticsearch.log by default (on Linux), the Docker containers write their logs to STDOUT, which is generally a Docker best practice.
Those logs should be in /var/lib/docker/containers/, but note that on Mac this is inside the small VM layer that Docker is using, so you can't access it directly.
How do you "stop and remove all the docker containers" and still find that "the ELK server still contains all the old logs"? docker-compose down -v should remove everything. Do you see the logs in docker logs or somewhere else?
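A short sketch of the commands being discussed, assuming a Compose project with a service named elasticsearch:

# stop and remove containers, the default network, and named volumes
$ docker-compose down -v

# inspect what a service wrote to STDOUT/STDERR
$ docker-compose logs elasticsearch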
QUESTION
I am using this_repo to get started running ELK with Docker.
My question is regarding the logstash image in the docker-compose file:
When I run locally, I have 3 files
ANSWER
Answered 2020-Apr-30 at 16:11
You need to mount your pipelines.yml file into the container as well. The default location where Logstash looks for a pipelines.yml file is /usr/share/logstash/config/ (the same folder you've already mounted the logstash.yml file to).
Please note that you also have to update your current, local pipelines.yml file to point to the correct paths of the pipelines inside the container. To be precise, you need to change
path.config: "/etc/logstash/my-first-pipeline.config"
to
path.config: "/usr/share/logstash/pipeline/my-first-pipeline.config"
Also, have a look at these official guides for running Logstash with Docker and how to configure multiple pipelines:
https://www.elastic.co/guide/en/logstash/current/docker-config.html#docker-config
https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html
I hope I could help you!
EDIT:
The official documentation calls the file pipelines.yml, not pipeline.yml.
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install docker-elk
Requirements:
Docker Engine version 17.05 or newer
Docker Compose version 1.20.0 or newer
1.5 GB of RAM
By default, the stack exposes the following ports (see the sketch after this list):
5044: Logstash Beats input
5000: Logstash TCP input
9600: Logstash monitoring API
9200: Elasticsearch HTTP
9300: Elasticsearch TCP transport
5601: Kibana
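A hedged sketch of how these mappings typically appear in docker-compose.yml (service names follow the repository; the exact file may differ between releases):

services:
  elasticsearch:
    ports:
      - "9200:9200"   # Elasticsearch HTTP
      - "9300:9300"   # Elasticsearch TCP transport
  logstash:
    ports:
      - "5044:5044"   # Beats input
      - "5000:5000"   # TCP input
      - "9600:9600"   # monitoring API
  kibana:
    ports:
      - "5601:5601"   # Kibana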
The stack is pre-configured with the following privileged bootstrap user:
user: elastic
password: changeme
Although all stack components work out of the box with this user, we strongly recommend using the unprivileged built-in users instead for increased security.
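For example, one quick way to check that Elasticsearch answers with these credentials, assuming the default 9200 port mapping above:

# query the cluster root endpoint with the bootstrap user
$ curl -u elastic:changeme http://localhost:9200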
Initialize passwords for built-in users
$ docker-compose exec -T elasticsearch bin/elasticsearch-setup-passwords auto --batch
Passwords for all 6 built-in users will be randomly generated. Take note of them.
Unset the bootstrap password (optional)
Remove the ELASTIC_PASSWORD environment variable from the elasticsearch service inside the Compose file (docker-compose.yml). It is only used to initialize the keystore during the initial startup of Elasticsearch.
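A hedged sketch of that change in docker-compose.yml (the surrounding environment entry is illustrative; keep whatever your file already defines):

  elasticsearch:
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"   # illustrative existing setting
      # ELASTIC_PASSWORD: changeme        # remove or comment out after the initial startup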
Replace usernames and passwords in configuration files
Use the kibana_system user (kibana for releases <7.8.0) inside the Kibana configuration file (kibana/config/kibana.yml) and the logstash_system user inside the Logstash configuration file (logstash/config/logstash.yml) in place of the existing elastic user. Replace the password for the elastic user inside the Logstash pipeline file (logstash/pipeline/logstash.conf).
:information_source: Do not use the logstash_system user inside the Logstash pipeline file; it does not have sufficient permissions to create indices. Follow the instructions at Configuring Security in Logstash to create a user with suitable roles. See also the Configuration section below.
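A hedged sketch of what those replacements can look like (setting names reflect recent stack versions; <generated password> stands for the value produced by the setup-passwords step):

In kibana/config/kibana.yml:

elasticsearch.username: kibana_system
elasticsearch.password: <generated password>

In logstash/config/logstash.yml:

xpack.monitoring.elasticsearch.username: logstash_system
xpack.monitoring.elasticsearch.password: <generated password>

In logstash/pipeline/logstash.conf, inside the elasticsearch output:

user => "elastic"
password => "<generated password>"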
Restart Kibana and Logstash to apply changes
$ docker-compose restart kibana logstash
:information_source: Learn more about the security of the Elastic stack at Secure the Elastic Stack.