Questions tagged [ingest]
41 questions
3
votes
1 answer
elasticsearch split document ingest processor
Elasticsearch provides the ingest mechanism to transform documents while they are indexed. The processors can transform fields and add or remove fields from indexed documents. For rare cases it is even possible to write your own transform plugin.…

paweloque
- 18,466
- 26
- 80
- 136
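For the question above, a minimal sketch of what such a pipeline definition can look like, here using the built-in split processor to turn a comma-separated field into an array. The pipeline id, field name, index name, and localhost endpoint are assumptions for illustration, not taken from the question.

```python
import requests

# Hypothetical pipeline: split a comma-separated "tags" string into an array.
# Pipeline id, field name and the localhost endpoint are placeholders.
pipeline = {
    "description": "Split the tags field on commas while indexing",
    "processors": [
        {"split": {"field": "tags", "separator": ",", "ignore_missing": True}}
    ],
}

# Register the pipeline, then index a document through it.
requests.put("http://localhost:9200/_ingest/pipeline/split-tags", json=pipeline)
requests.post(
    "http://localhost:9200/my-index/_doc?pipeline=split-tags",
    json={"tags": "elasticsearch,ingest,pipelines"},
)
```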
2
votes
1 answer
How to get the local day of week from timestamp in elasticsearch
I'm using the ingest pipeline script processors to extract the day of the week from the local time for each document.
I'm using the client_ip to extract the timezone, using that along with the timestamp to get the local time, and then extracting the day…

Saba Far
- 133
- 2
- 9
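A hedged sketch of the approach described in the question above: a geoip processor to look up a timezone from client_ip, then a script processor that shifts the timestamp into that zone and stores the day of week. The pipeline id, the assumption that timestamp is an ISO-8601 string, and the output field name are mine, not from the question.

```python
import requests

# Sketch only: assumes "timestamp" is an ISO-8601 string with an offset, and that
# the GeoIP database in use can return a timezone for client_ip.
pipeline = {
    "processors": [
        {"geoip": {"field": "client_ip", "properties": ["timezone"]}},
        {
            "script": {
                "source": """
                  ZonedDateTime utc = ZonedDateTime.parse(ctx.timestamp);
                  ZonedDateTime local = utc.withZoneSameInstant(ZoneId.of(ctx.geoip.timezone));
                  ctx.local_day_of_week = local.getDayOfWeek().toString();
                """
            }
        },
    ]
}
requests.put("http://localhost:9200/_ingest/pipeline/local-day-of-week", json=pipeline)
```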
2
votes
0 answers
Fetch index name from alias during ingest in elasticsearch ingest plugin java
I am using Elasticsearch v7.9 and need to get index name during ingest instead of alias name.
Alias name = employees_prod and Index Name = employees
POST /employees_prod/_doc?pipeline=test-pipeline&refresh
{
"name": "Quick Brown Fox",
…

Galet
- 5,853
- 21
- 82
- 148
1
vote
1 answer
Azure Data Explorer oneclick Ingest from blob container (UI)
I'm trying to configure and use the Azure Data Explorer OneClick Ingest from blob container (continuous ingest).
Whatever I try, the URL is never accepted; I always end up with this error:
Invalid URL. Either the URL leads to a blob instead of a…

jeromesubs
- 63
- 1
- 7
1
vote
1 answer
Elasticsearch ingest pipelines to extract log level as Field:Value
Source log sample from message field:
{"log":"2022/02/15 22:47:07 insert into public.logs (time, level, message, hostname, loggerUID, appmodule) values ('2022-02-15 22:47:07.494330952','ERROR','GetRequestsByUserv2 :pq: column \"rr.requestdate\" must…

Prasad
- 35
- 6
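For the question above, a hedged sketch of one possible shape for such a pipeline: a json processor to parse the message field, followed by a grok processor that pulls the quoted level out of the inner log line. The grok pattern, the parsed target field, and the level field name are guesses based on the sample, not a tested answer.

```python
import requests

# Assumption: "message" holds the JSON shown in the sample, whose "log" key
# contains the line with a quoted level such as 'ERROR'.
pipeline = {
    "processors": [
        {"json": {"field": "message", "target_field": "parsed"}},
        {
            "grok": {
                "field": "parsed.log",
                "patterns": ["%{DATA}'%{LOGLEVEL:level}'%{GREEDYDATA}"],
                "ignore_failure": True,
            }
        },
    ]
}
requests.put("http://localhost:9200/_ingest/pipeline/extract-log-level", json=pipeline)
```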
1
vote
1 answer
Skip Header row when loading data from csv using Ingest Utility in db2
I am trying to load data into a Db2 target table from a CSV file using the ingest utility.
I see the header row getting rejected with an error message.
Is there any option (similar to SKIPCOUNT in the import utility) to skip the header row so as to avoid…

vineeth
- 641
- 4
- 11
- 25
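If the utility itself can't be told to skip the header, one hedged workaround for the question above is to strip the header row before handing the file to INGEST. The file and table names below are placeholders, and the INGEST command at the end is only a reference string, not executed from Python.

```python
# Workaround sketch: write a copy of the CSV without its header row and point
# the Db2 INGEST utility at the copy. File and table names are placeholders.
with open("input.csv", "r", encoding="utf-8") as src, \
     open("input_noheader.csv", "w", encoding="utf-8") as dst:
    next(src)               # drop the header line
    for line in src:
        dst.write(line)

# Afterwards, something along these lines would be run from the Db2 CLP
# (shown only as a reference string):
ingest_cmd = "INGEST FROM FILE input_noheader.csv FORMAT DELIMITED INSERT INTO mytable"
```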
1
vote
1 answer
How to insert/ingest Current timestamp into kusto table
I am trying to insert the current datetime into a table which has datetime as the datatype, using the following query:
.ingest inline into table NoARR_Rollout_Status_Dummie <| @'datetime(2021-06-11)',Sam,Chay,Yes
Table was created using the following…

A D
- 51
- 2
- 12
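A hedged sketch of one way around this from Python: format the current UTC time as an ISO-8601 literal and splice it into the inline ingest command, since `.ingest inline` treats the payload as literal CSV rather than KQL expressions. The cluster, credentials, and database below are placeholders; the table name is taken from the question.

```python
from datetime import datetime, timezone

from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Placeholders for cluster and credentials; the timestamp is rendered as an
# ISO-8601 string because inline ingestion does not evaluate KQL functions.
cluster = "https://mycluster.westeurope.kusto.windows.net"
kcsb = KustoConnectionStringBuilder.with_aad_application_key_authentication(
    cluster, "<app-id>", "<app-key>", "<tenant-id>")
client = KustoClient(kcsb)

now_literal = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
command = (
    ".ingest inline into table NoARR_Rollout_Status_Dummie <| "
    f"{now_literal},Sam,Chay,Yes"
)
client.execute_mgmt("MyDatabase", command)
```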
1
vote
2 answers
Elasticsearch ingest pipeline: how to recursively modify values in a HashMap
Using an ingest pipeline, I want to iterate over a HashMap and remove underscores from all string values (where underscores exist), leaving underscores in the keys intact. Some values are arrays that must further be iterated over to do the same…

Jonathan
- 125
- 1
- 9
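The recursion in the question above is easier to see outside of Painless; below is a hedged Python sketch of the walk (strip underscores from string values, recurse into nested maps and lists, leave keys alone). The same structure would then have to be re-expressed inside a script processor, which this sketch deliberately does not attempt.

```python
# Illustration of the recursion only, not an ingest pipeline: remove underscores
# from string *values* at any depth, leaving keys untouched.
def strip_underscores(value):
    if isinstance(value, str):
        return value.replace("_", "")
    if isinstance(value, dict):
        return {key: strip_underscores(inner) for key, inner in value.items()}
    if isinstance(value, list):
        return [strip_underscores(item) for item in value]
    return value

doc = {"some_key": "a_b_c", "nested": {"other_key": ["x_y", 3]}}
print(strip_underscores(doc))
# {'some_key': 'abc', 'nested': {'other_key': ['xy', 3]}}
```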
1
vote
1 answer
Elasticsearch ingest pipeline script processor fails to cast
I am trying to reindex data and do some calculations based on fields in the source document.
I have used ingest pipelines to enrich the document with geo_point and want to calculate some other values as well.
The issue that I have is that the source…

Chibisuketyan
- 15
- 5
1
vote
1 answer
Add an object value to a field in Elasticsearch during ingest and drop empty-valued fields, all during ingest
I am ingesting CSV data into Elasticsearch using the append processor. I already have two fields that are objects (object1 and object2) and I want to append them both into an array in a different field (mainlist). So it would come out as mainlist:[…

Dimeji Olayinka
- 71
- 3
- 12
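A hedged sketch of one way to build that array during ingest: a script processor that collects the two object fields into mainlist, skipping any that are missing, which sidesteps the string templating of the append processor. The field names follow the question; the pipeline id and everything else are assumptions.

```python
import requests

# Sketch: gather object1 and object2 into mainlist if they are present.
pipeline = {
    "processors": [
        {
            "script": {
                "source": """
                  def items = [];
                  if (ctx.containsKey('object1') && ctx.object1 != null) { items.add(ctx.object1); }
                  if (ctx.containsKey('object2') && ctx.object2 != null) { items.add(ctx.object2); }
                  ctx.mainlist = items;
                """
            }
        }
    ]
}
requests.put("http://localhost:9200/_ingest/pipeline/build-mainlist", json=pipeline)
```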
1
vote
0 answers
Filtering JDBC Ingestion with AWS Glue and PySpark
I am using AWS Glue to ingest from a MySQL database. I know that I can use custom queries when using PySpark JDBC to ingest data. Does the same apply when ingesting based on a crawler?
Right now I am using this:
datasource…

Gerasimos
- 279
- 2
- 8
- 17
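I can't speak to pushing a custom query through a crawler-backed create_dynamic_frame call, but as a hedged alternative inside the same Glue job, plain Spark JDBC accepts a query option that pushes the SQL down to MySQL. The URL, credentials, and query below are placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hedged alternative: bypass the catalog and push a custom query down over JDBC.
# Connection details and the query are placeholders.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://myhost:3306/mydb")
    .option("query", "SELECT id, name FROM customers WHERE updated_at > '2022-01-01'")
    .option("user", "glue_user")
    .option("password", "***")
    .option("driver", "com.mysql.cj.jdbc.Driver")
    .load()
)
```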
1
vote
0 answers
Elasticsearch - Update a field of several records already indexed based on a value from a new record still not indexed
I need to update a field of several records already indexed based on a value from a new record still not indexed. Is there any way to implement it?
The table below describes my index with current docs. The yellow line would be a new doc…

Lavor
- 71
- 5
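An ingest pipeline only sees the document currently being indexed, so as a hedged sketch of one common workaround for the question above, the new document's value can drive an _update_by_query against the already-indexed records after (or instead of) indexing it. The index, field names, and values here are invented for illustration.

```python
import requests

# Hypothetical example: after indexing the new document, update the "status"
# field of older documents that share the same "order_id". All names are placeholders.
new_doc = {"order_id": "A42", "status": "closed"}

requests.post(
    "http://localhost:9200/my-index/_update_by_query",
    json={
        "query": {"term": {"order_id": new_doc["order_id"]}},
        "script": {
            "source": "ctx._source.status = params.status",
            "params": {"status": new_doc["status"]},
        },
    },
)
```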
1
vote
2 answers
Kusto data ingestion from an Azure Function App ends with a 403
I'm trying to ingest data from an Azure Function App into an ADX database. I followed the instructions found in the article here.
The difference is that I'd like to insert data into the table. I'm struggling with a 403 error: "Principal 'aadapp=;' is not…

Jean
- 11
- 4
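That "Principal ... is not authorized" flavour of 403 generally points at the calling AAD application missing ingest rights on the target database; below is a hedged sketch of granting them with a management command. All identifiers and the cluster URI are placeholders.

```python
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Placeholders: an admin identity grants the Function App's AAD application
# ingestor rights on the database it writes to.
kcsb = KustoConnectionStringBuilder.with_aad_application_key_authentication(
    "https://mycluster.westeurope.kusto.windows.net",
    "<admin-app-id>", "<admin-app-key>", "<tenant-id>")
client = KustoClient(kcsb)

# .add database ... ingestors is a management command, so it goes through execute_mgmt.
client.execute_mgmt(
    "MyDatabase",
    ".add database MyDatabase ingestors ('aadapp=<function-app-id>;<tenant-id>')")
```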
1
vote
1 answer
Insert Object Array or CSV file content into Kusto Table
Unable to insert data from an object array or a CSV file into a Kusto table.
My goal is to build a pipeline in Azure DevOps which reads data using PowerShell and writes it into a Kusto table.
I was able to write the data which I have read from…

A D
- 51
- 2
- 12
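For the CSV half of the question above, a hedged sketch using the Python Kusto ingest client (a PowerShell equivalent follows the same shape). Endpoints, credentials, database, and table are placeholders, and import paths can differ between azure-kusto-ingest versions.

```python
from azure.kusto.data import KustoConnectionStringBuilder
from azure.kusto.data.data_format import DataFormat  # location varies by SDK version
from azure.kusto.ingest import IngestionProperties, QueuedIngestClient

# Queued ingestion goes through the cluster's ingest- endpoint. All names are placeholders.
kcsb = KustoConnectionStringBuilder.with_aad_application_key_authentication(
    "https://ingest-mycluster.westeurope.kusto.windows.net",
    "<app-id>", "<app-key>", "<tenant-id>")
client = QueuedIngestClient(kcsb)

props = IngestionProperties(
    database="MyDatabase",
    table="MyTable",
    data_format=DataFormat.CSV,
)
client.ingest_from_file("data.csv", ingestion_properties=props)
```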
0
votes
1 answer
KDB Q Ingestion
I'm currently presented with data and can ingest a CSV file.
Currently the data is all in one column, but I need to split the data below into separate columns; a space indicates a new column, and the character count for each column may be a good way to…

toldmimsy
- 1
- 1