I accidentally loaded some data from Logstash into Elasticsearch twice.
I had started with start_position => "beginning" in the Logstash config, so when I deleted the .sincedb_* files and ran it again, a small part of the data ended up duplicated.
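For context, here is a minimal sketch of the kind of Logstash file input involved; the path is hypothetical, and only start_position comes from the question:

```text
input {
  file {
    path => "/var/log/myapp/*.log"   # hypothetical path
    start_position => "beginning"    # read files from the start
    # Logstash records its read position in .sincedb_* files (in the home
    # directory by default). Deleting them makes Logstash forget where it
    # left off and re-ingest the files, which is how duplicates appear.
  }
}
```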
I used Kibana to look at this data and clicked the "Inspect" button to see the query:
curl -XGET 'http://els:9200/logstash-2014.02.19,logstash-2014.02.18/_search?pretty' -d '{
  "facets": {
    "0": {
      "date_histogram": {"field": "@timestamp", "interval": "10m"},
      "facet_filter": {
        "fquery": {
          "query": {
            "filtered": {
              "query": {
                "query_string": {"query": "tags:\"a-tag-that-uniquely-matches-the-dupes\""}
              },
              "filter": {
                "bool": {
                  "must": [
                    {"match_all": {}},
                    {"range": {"@timestamp": {"from": 1392723206360, "to": "now"}}},
                    {"bool": {"must": [{"match_all": {}}]}}
                  ]
                }
              }
            }
          }
        }
      }
    }
  }
}'

If I run it on the ES server, I get the same result set (as expected):

{"took": 23, "timed_out": false, "_shards": {"total": 10, "successful": 10, "failed": 0},
 "hits": {"total": 558829, "max_score": 0.0, "hits": []},
 "facets": {"0": {"_type": "date_histogram", "entries": [{"time": 1392799200000, "count": 91}, ...]}}}

How can I select these 91 entries in order to remove them with a delete operation?
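Stripped of the Kibana facet boilerplate (the match_all clauses contribute nothing), the query above boils down to a filtered query: a query_string match on the tag plus a range filter on @timestamp. A small Python sketch of that essential body, using the tag and epoch-millisecond bound from the question:

```python
import json

# The essential part of the Kibana-generated facet filter: a filtered query
# combining a query_string on the tag with a range filter on @timestamp.
# The tag name and the epoch-millisecond bound are taken from the question.
query_body = {
    "query": {
        "filtered": {
            "query": {
                "query_string": {
                    "query": 'tags:"a-tag-that-uniquely-matches-the-dupes"'
                }
            },
            "filter": {
                "bool": {
                    "must": [
                        {"range": {"@timestamp": {"from": 1392723206360,
                                                  "to": "now"}}}
                    ]
                }
            }
        }
    }
}

print(json.dumps(query_body, indent=2))
```

This is exactly the shape the answer below reuses for the delete.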
Thank you,
KB
You can use delete by query, available in 1.0 and later.
I run my queries against ES manually.
Example:
DELETE /twitter/tweet/_query
{"query": {"term": {"user": "kimchy"}}}
In your case, you should use the query part of your facet filter:
DELETE /twitter/_query
{
  "query": {
    "filtered": {
      "query": {
        "query_string": {"query": "tags:\"a-tag-that-uniquely-matches-the-dupes\""}
      },
      "filter": {
        "bool": {
          "must": [
            {"range": {"@timestamp": {"from": 1392723206360, "to": "now"}}}
          ]
        }
      }
    }
  }
}
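Since a delete by query is irreversible, it is worth previewing the match count first. A hedged Python sketch of that workflow, assuming the els:9200 host and the Logstash indices from the question and the ES 1.x _query endpoint (the answerer's own tooling is not specified; this uses plain urllib):

```python
import json
import urllib.request

ES = "http://els:9200"  # host and port taken from the question; adjust as needed
INDICES = "logstash-2014.02.19,logstash-2014.02.18"

# Same filtered query as the DELETE body above.
body = json.dumps({
    "query": {
        "filtered": {
            "query": {
                "query_string": {
                    "query": 'tags:"a-tag-that-uniquely-matches-the-dupes"'
                }
            },
            "filter": {
                "bool": {
                    "must": [
                        {"range": {"@timestamp": {"from": 1392723206360,
                                                  "to": "now"}}}
                    ]
                }
            }
        }
    }
}).encode()


def count_matches():
    """Preview how many documents the query matches (_count) before deleting."""
    req = urllib.request.Request(f"{ES}/{INDICES}/_count", data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["count"]


def delete_matches():
    """Issue the delete by query against the ES 1.x _query endpoint."""
    req = urllib.request.Request(f"{ES}/{INDICES}/_query", data=body,
                                 headers={"Content-Type": "application/json"},
                                 method="DELETE")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

If count_matches() returns the 91 you expect, delete_matches() removes exactly those documents.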