loki | loki - the lightweight online game framework | Socket library
kandi X-RAY | loki Summary
loki is an online game framework inspired by cloudwu's skynet. To get started, see the header file loki.h. loki has no documentation for now; that will be written once the built-in services are complete.

loki is not only for online game servers: it is a general multi-threaded service signal/slot system. Every service registers its own slots and sends/receives messages to offer its service. The core of loki is a single header. To define loki's functions, define loki_implementation before including loki.h in exactly one C file. loki also ships several built-in services, defined in lsvr_*.c files; each file can be built on its own as a dll/so, or built together with the lsvr_init.c file.

To use loki you must create a lk_state, which holds all the information loki uses; loki does not use global variables. A lk_state is also a lk_service, and a lk_service is also a lk_slot. To get the name of a slot, just cast it to const char*. You can emit signals to a slot with lk_emit() and process them in a function called lk_slothandler. A service must also provide a function named lk_servicehandler.
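Based only on the names mentioned above, intended usage presumably looks something like the following pseudocode sketch. Note that lk_newstate and all signatures here are guesses, not documented API; only loki.h, loki_implementation, lk_emit, lk_slothandler and lk_servicehandler come from the summary:

```c
/* Pseudocode sketch -- signatures are guessed, not taken from the real header. */
#define loki_implementation   /* in exactly one C file, before the include */
#include "loki.h"

int main(void) {
    lk_state *S = lk_newstate();          /* hypothetical constructor for the state */
    lk_service *svc = /* ... register a service; it provides a
                         lk_servicehandler and a lk_slothandler ... */;
    lk_slot *slot = /* ... look up a slot on the service ... */;
    printf("slot name: %s\n", (const char *)slot);  /* a slot can be cast to its name */
    lk_emit(slot, /* signal data */);     /* deliver a signal to the slot */
    return 0;
}
```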
Trending Discussions on loki
QUESTION
I need to create a JSON array so that I can split it into several jobs with NiFi. The array needs to be created based on an existing array inside the JSON.
I can't figure out how to dynamically create a reference to another object in the JSON. I want the reference "@(2,@)" to work, but this is not supported.
INPUT
...ANSWER
Answered 2021-Jun-08 at 15:50
You can go one step further by adding a "*" key to nest the current spec deeper while walking the tree with @(3,&); the ampersand resolves dynamically to the matched keys' values, name and id, such as:
QUESTION
I was going through the Loki documentation and came across the storage section, where you can set the storage to any DB/filesystem/in-memory backend. Currently, I need to store the logs in MongoDB. How can I do that?
I don't see any configuration option for storing the logs in MongoDB. Is there any reference or configuration file that could help me set Loki's chunks and indexes to be stored in MongoDB?
...ANSWER
Answered 2021-Jun-04 at 22:36
MongoDB is not currently supported; only a certain set of databases are.
Storage options in Loki
The following are supported for the index:
- Single Store (boltdb-shipper) - recommended for Loki 2.0 and newer; an index store that keeps BoltDB index files in the object store
- Amazon DynamoDB
- Google Bigtable
- Apache Cassandra
- BoltDB (doesn’t work when clustering Loki)
The following are supported for the chunks:
- Amazon DynamoDB
- Google Bigtable
- Apache Cassandra
- Amazon S3
- Google Cloud Storage
- Filesystem
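Putting the recommended option into practice, a minimal single-binary storage section might look roughly like this (a sketch based on Loki 2.x config conventions; the paths and the schema start date are placeholders to adapt):

```yaml
schema_config:
  configs:
    - from: 2021-01-01          # placeholder start date for this schema
      store: boltdb-shipper     # single-store index
      object_store: filesystem  # where index files and chunks live
      schema: v11
      index:
        prefix: index_
        period: 24h
storage_config:
  boltdb_shipper:
    active_index_directory: /loki/index
    cache_location: /loki/index_cache
    shared_store: filesystem
  filesystem:
    directory: /loki/chunks     # chunk storage on local disk
```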
QUESTION
I have Elasticsearch configured with Grafana and it has logs. I tried to query the Elasticsearch logs in Grafana but did not have much success. I went online to learn how to do it, but everything I find talks about Loki. Can you use Loki with Elasticsearch? I don't see a definite answer for this online.
...ANSWER
Answered 2021-May-17 at 15:52
Using Loki with ES defeats the purpose of using Loki itself.
Loki prides itself on indexing only the metadata/labels of the logs and storing the actual log data separately in a compressed manner.
This reduces storage costs and speeds up retrieval, since there is far less data to index than in an ES index, which indexes everything in a log line and, worse still, stores the attribute as empty if the data is missing (roughly analogous to the difference between SQL and NoSQL).
As of now, Loki does not support ES as the index store.
It uses two kinds of indices, stored separately and queried as and when required:
- Label/metadata/index: Cassandra, GCS, filesystem, S3
- Data chunks: Cassandra, Bigtable, DynamoDB, BoltDB
For more info see Loki storage.
QUESTION
tl;dr:
Loki-docker-log-driver -> Loki : ✅ works.
Loki-docker-log-driver -> JSON Decode -> Loki : How?
For my local development, I run several services which log in GELF format. To get a better overview and a time-ordered log stream with filtering, I use the Loki Docker log driver.
The JSON log messages (GELF style) are successfully sent to loki, but I want to get them further processed so that labels are extracted. How can I achieve that?
...ANSWER
Answered 2021-May-14 at 12:20
If you have already sent the logs to Loki in JSON format, all you need to do is select the desired log stream and pipe it to the json parser, as in the following example:
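A query along these lines does the job (the stream selector and the level label here are hypothetical; adapt them to your setup). After | json, the top-level fields of each JSON log line become extracted labels you can filter on:

```logql
{container_name="myservice"} | json | level="error"
```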
QUESTION
I am new to Loki and have made an alert in Loki, but I don't see any notification in Alertmanager. Loki is working fine (collecting logs) and so is Alertmanager (it receives alerts from other sources), but the alerts from Loki don't get pushed to Alertmanager.
Loki config:
...ANSWER
Answered 2021-May-06 at 12:12
The config looks good, similar to mine. I would troubleshoot it with the following steps:
Exec into the Docker container and check that the rules file is not empty:
cat /etc/loki/rules/rules.yaml
Check Loki's logs. When the rules are loaded properly, log lines like this will pop up:
QUESTION
I have configured PLG (Promtail, Grafana & Loki) on an AWS EC2 instance for log management. Loki uses the BoltDB shipper & an AWS store.
Grafana - 7.4.5, Loki - 2.2, Promtail - 2.2, AlertManager - 0.21
The issue I am facing is that Loki does not trigger or push alerts to Alertmanager. I cannot see any alert on the Alertmanager dashboard, though I can run a LogQL query in Grafana which shows that the condition for triggering an alert was met.
The following is a screenshot of my query on Grafana.
The following are my configs.
- Docker Compose
ANSWER
Answered 2021-Apr-15 at 22:57
If Loki is running in single-tenant mode, the required tenant ID is "fake" (yes, the literal string; we know this might seem alarming, but it's totally fine, and no, it can't be changed).
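In practice this means the ruler looks for rule files under a fake subdirectory of its rules path. A ruler section in that spirit (a sketch assuming Loki 2.x field names; the URLs and paths are examples to adapt):

```yaml
ruler:
  storage:
    type: local
    local:
      directory: /etc/loki/rules    # rule files go in /etc/loki/rules/fake/*.yaml
  rule_path: /tmp/loki/rules-temp   # scratch space used by the ruler
  alertmanager_url: http://alertmanager:9093
  ring:
    kvstore:
      store: inmemory
  enable_api: true
```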
QUESTION
hope you're all well during this pandemic.
I've got a Kubernetes cluster running. Communication between pods is done through Kafka. It currently logs to stdout only: no files, no Kafka topic. This is obviously pretty bad.
I want to setup a grafana instance that lets me centralize all logs there. The storage would be Loki + S3
In order to do that, I found that many people use tools like Fluentd, Fluent Bit and Promtail, which centralize the logs and send them to Loki. However, I already have Kafka running, and I can't see why I'd use a tool like Fluentd if I can send all logs to Kafka through a "logging" topic.
My question is: how could I send all messages inside the logging topic to Loki? Fluentd cannot get input from Kafka.
Would I have to set up some script that runs periodically, sorts the data and sends it to Loki directly?
...ANSWER
Answered 2021-Apr-14 at 09:27
I recommend you use Promtail, since it is also from Grafana, rather than the Kafka solution.
If you send the logs from your apps to Kafka, then you need to:
- modify your apps to send to Kafka instead of stdout
- configure a log forwarder to send the messages on Kafka to Loki (it can be Fluentd)
Whereas with the normally proposed approach you only need to:
- configure a log forwarder to send messages from the Docker stdout to Loki (you can use Promtail's default configuration)
But if you want to go with your solution, with Kafka in the middle, there are Fluentd plugins that let you configure Kafka as input and output: https://github.com/fluent/fluent-plugin-kafka
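A Fluentd pipeline in that spirit might look like the following sketch. The broker address, topic name, and the loki output (which comes from the separate fluent-plugin-grafana-loki plugin) are all assumptions to adapt:

```
<source>
  @type kafka                    # from fluent-plugin-kafka
  brokers kafka:9092
  topics logging
  format json
</source>

<match **>
  @type loki                     # from fluent-plugin-grafana-loki
  url http://loki:3100
  extra_labels {"job": "kafka"}  # static label attached to every stream
  flush_interval 10s
</match>
```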
QUESTION
I have a question about the retention mechanism in Grafana Loki. I need to store logs for one year and be able to query them. The setup is in k8s with the official Loki chart.
Below is my config:
...ANSWER
Answered 2021-Apr-13 at 14:38
No, you will not lose all logs from the previous year and start from scratch. The Table Manager keeps the last tables alive using the following formula:
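As far as I recall from the Table Manager documentation, the number of tables kept is roughly floor(retention_period / table_period) + 1, and retention is only enforced when deletes are enabled, along these lines:

```yaml
table_manager:
  retention_deletes_enabled: true
  retention_period: 8760h   # ~one year; must be a multiple of the index period (e.g. 24h)
```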
QUESTION
I am new to Loki, but all I want is to use it as simply as possible with Helm.
I want to get the logs of my app, which is in Kubernetes, but there seem to be no instructions on how to do that. All I can find is how to install Loki and add it to Grafana as a data source, but I don't think that's all Loki is made for.
I simply want to track my app's logs in Kubernetes, so I am using the Loki Helm chart, and all I can find about a custom config is this line:
...ANSWER
Answered 2021-Apr-12 at 20:30
After installing Loki you can set it as a data source for Grafana. For more details you can follow this example: Logging in Kubernetes with Loki and the PLG Stack.
I hope this helps you resolve your issue.
QUESTION
My application's services are deployed via docker-compose. Currently, I have also deployed Grafana, Loki and Promtail within the same docker-compose network.
Following the getting-started guide, I am collecting and displaying the log files from /var/log
with the config:
ANSWER
Answered 2021-Mar-24 at 18:27
In your pipeline stages you need to store the extracted values:
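A sketch of what that can look like in a Promtail scrape config (the JSON field names level and message are hypothetical; adapt them to your log format). The json stage extracts values, and the labels stage stores them as labels:

```yaml
pipeline_stages:
  - json:
      expressions:    # extract fields from the JSON log line
        level: level
        msg: message
  - labels:
      level:          # promote the extracted "level" value to a stream label
```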
Community Discussions, Code Snippets contain sources that include Stack Exchange Network