2017-08-22

This is my filebeat.yml file... Whenever I start the Filebeat service I get error 1053. There is some mistake somewhere in this file; please point out what is wrong. The Filebeat service is not starting because of the yml configuration.

###################### Filebeat Configuration Example ######################### 

# This file is an example configuration file highlighting only the most common 
# options. The filebeat.full.yml file from the same directory contains all the 
# supported options with more comments. You can use it as a reference. 
# 
# You can find the full configuration reference here: 
# https://www.elastic.co/guide/en/beats/filebeat/index.html 

#=========================== Filebeat prospectors ============================= 

filebeat.prospectors: 

# Each - is a prospector. Most options can be set at the prospector level, so 
# you can use different prospectors for various configurations. 
# Below are the prospector specific configurations. 



    # Paths that should be crawled and fetched. Glob based paths. 
paths: 
- E:\ELK-STACK\logstash-tutorial-dataset.log 
input_type: log 
document_type: apachelogs 
    # document_type: apachelogs 



    #paths: 
    # - E:\ELK-STACK\mylogs.log 
    #fields: {log_type: mypersonal-logs} 
     #- C:\Logs\GatewayService\GatewayService-Processor.Transactions-20170810 
    # - C:\ECLIPSE WORKSPACE\jcgA1\jcgA1\logs-logstash.* 
    # Exclude lines. A list of regular expressions to match. It drops the lines that are 
    # matching any regular expression from the list. 
    #exclude_lines: ["^DBG"] 

    # Include lines. A list of regular expressions to match. It exports the lines that are 
    # matching any regular expression from the list. 
    #include_lines: ["^ERR", "^WARN"] 

    # Exclude files. A list of regular expressions to match. Filebeat drops the files that 
    # are matching any regular expression from the list. By default, no files are dropped. 
    #exclude_files: [".gz$"] 

    # Optional additional fields. These fields can be freely picked 
    # to add additional information to the crawled log files for filtering 
    #fields: 
    # level: debug 
    # review: 1 

    ### Multiline options 

    # Multiline can be used for log messages spanning multiple lines. This is common 
    # for Java Stack Traces or C-Line Continuation 

    # The regexp Pattern that has to be matched. The example pattern matches all lines starting with [ 
    #multiline.pattern: ^\[ 

    # Defines if the pattern set under pattern should be negated or not. Default is false. 
    #multiline.negate: false 

    # Match can be set to "after" or "before". It is used to define if lines should be append to a pattern 
    # that was (not) matched before or after or as long as a pattern is not matched based on negate. 
    # Note: After is the equivalent to previous and before is the equivalent to next in Logstash 
    #multiline.match: after 


#================================ General ===================================== 

# The name of the shipper that publishes the network data. It can be used to group 
# all the transactions sent by a single shipper in the web interface. 
#name: 

# The tags of the shipper are included in their own field with each 
# transaction published. 
#tags: ["service-X", "web-tier"] 

# Optional fields that you can specify to add additional information to the 
# output. 
#fields: 
# env: staging 

#================================ Outputs ===================================== 

# Configure what outputs to use when sending the data collected by the beat. 
# Multiple outputs may be used. 

#-------------------------- Elasticsearch output ------------------------------ 
#output.elasticsearch: 
    # Array of hosts to connect to. 
# hosts: ["localhost:9200"] 

    # Optional protocol and basic auth credentials. 
    #protocol: "https" 
    #username: "elastic" 
    #password: "changeme" 

#----------------------------- Logstash output -------------------------------- 
output.logstash: 
    # The Logstash hosts 
    hosts: ["localhost:5043"] 

    # Optional SSL. By default is off. 
    # List of root certificates for HTTPS server verifications 
    #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"] 

    # Certificate for SSL client authentication 
    #ssl.certificate: "/etc/pki/client/cert.pem" 

    # Client Certificate Key 
    #ssl.key: "/etc/pki/client/cert.key" 

#================================ Logging ===================================== 

# Sets log level. The default log level is info. 
# Available log levels are: critical, error, warning, info, debug 
#logging.level: debug 

# At debug level, you can selectively enable logging only for some components. 
# To enable all selectors use ["*"]. Examples of other selectors are "beat", 
# "publish", "service". 
#logging.selectors: ["*"] 

What I am actually trying to do is ship multiple logs, specifying a document_type for each. If I remove document_type it works, but why doesn't document_type (which I see is deprecated in filebeat 5.5) or fields work?

Please help.


Validate your YAML here: http://yaml-online-parser.appspot.com/ – Val


@Val Yes, I made those changes; it is now formatted correctly, but I still get an error while starting the service. –


The parsed output is: { "output.logstash": { "hosts": [ "localhost:5043" ] }, "filebeat.prospectors": { "-input_type": "log", "paths": [ "E:\\ELK-STACK\\logstash-tutorial-dataset.log" ], "document_type": "apachelogs" } } –

Answers


There is a syntax error in your configuration file. The filebeat.prospectors key requires an array value, but you are passing it a hash instead.

In addition, there are problems with the indentation.

Here is a corrected version of your configuration file (without comments, for brevity). Make sure your YAML fields are indented consistently:

filebeat.prospectors: 
- 
    paths: 
    - E:\ELK-STACK\logstash-tutorial-dataset.log 
    input_type: log 
    document_type: apachelogs 
output.logstash: 
    hosts: ["localhost:5043"] 
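
Since document_type is deprecated in filebeat 5.5 (and removed in 6.0), one option is to tag each prospector with a custom field instead and filter on that field downstream. A minimal sketch, reusing the two log paths from the question (the log_type values are just illustrative labels):

filebeat.prospectors: 
- 
    paths: 
    - E:\ELK-STACK\logstash-tutorial-dataset.log 
    input_type: log 
    fields: 
        log_type: apachelogs 
- 
    paths: 
    - E:\ELK-STACK\mylogs.log 
    input_type: log 
    fields: 
        log_type: mypersonal-logs 
output.logstash: 
    hosts: ["localhost:5043"] 

Each event then carries a fields.log_type value that Logstash can branch on.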

Thanks! Now I have one more question: my project has logs that are essentially unstructured... how can I apply grok patterns to them? –
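
For that follow-up, grok parsing happens on the Logstash side, not in Filebeat. A minimal pipeline sketch, assuming the beats input on port 5043 from the config above and Apache-style access logs (the conditional relies on the illustrative fields.log_type label from the sketch above):

input { 
    beats { 
        port => 5043 
    } 
} 
filter { 
    # Apply grok only to events tagged as apache logs 
    if [fields][log_type] == "apachelogs" { 
        grok { 
            match => { "message" => "%{COMBINEDAPACHELOG}" } 
        } 
    } 
} 
output { 
    elasticsearch { 
        hosts => ["localhost:9200"] 
    } 
} 

Truly free-form logs will need a custom pattern in place of %{COMBINEDAPACHELOG}.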
