I have the same issue. My tracking column is a datetime2 as well. The timestamp in logstash_jdbc_last_run looks good; however, it loses its precision when it is used in the SQL statement.
From Logstash
logstash01 | [2025-01-08T09:35:00,490][INFO ][logstash.inputs.jdbc ][main][def6a17c9ec1ce2ef62f22c761ddd037290efc702480f407683d4445c1622a65] (0.007329s) SELECT RequestDtm, JsonData
logstash01 | FROM [User].[RequestAudit]
logstash01 | WHERE RequestDtm > '2025-01-08T03:39:07.542'
logstash01 | ORDER BY RequestDtm
Content of the file logstash_jdbc_last_run
$ cat logstash_jdbc_last_run
--- 2025-01-08 03:39:07.542451900 Z
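The mismatch between the two excerpts above can be sketched in a few lines (a Python illustration, not plugin code; the nanosecond value is shortened to microseconds because Python's datetime stops there):

```python
from datetime import datetime

# Value stored in logstash_jdbc_last_run (sub-millisecond precision).
last_run = datetime.fromisoformat("2025-01-08 03:39:07.542451")

# Value actually interpolated into the SQL statement (milliseconds only).
sql_value = datetime.fromisoformat("2025-01-08 03:39:07.542")

# The truncated threshold re-matches the very row the timestamp came
# from: .542451 > .542000, so that row is fetched again on the next
# run, producing a duplicate.
print(last_run > sql_value)  # True
```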
I am using the Logstash Docker image docker.elastic.co/logstash/logstash:8.16.0
Hi @ngwwm, if I remember correctly from 5 years ago, we solved the issue of receiving duplicates by not querying with the precise last timestamp value, but adjusting its last digit instead. This resolved the duplicates, but obviously we don't get all of the latest logs until new logs with a larger timestamp arrive. I hope this helps.
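One way to read the workaround above is to round the threshold up to the next value representable at the query's precision, so the already-processed row no longer matches. A hypothetical sketch (the helper `next_millisecond` is illustrative, not part of any plugin):

```python
from datetime import datetime, timedelta

def next_millisecond(ts: datetime) -> datetime:
    # Drop sub-millisecond digits, then advance by one millisecond.
    truncated = ts.replace(microsecond=(ts.microsecond // 1000) * 1000)
    return truncated + timedelta(milliseconds=1)

last_seen = datetime.fromisoformat("2025-01-08 03:39:07.542451")
threshold = next_millisecond(last_seen)
print(threshold.isoformat(sep=" "))  # 2025-01-08 03:39:07.543000

# The already-processed row no longer satisfies "RequestDtm > threshold",
# so it is not fetched again -- at the cost of missing rows between
# .542452 and .543000 until later rows arrive.
print(last_seen > threshold)  # False
```

This trades duplicates for a small window of rows that only show up once newer timestamps appear, matching the behavior described in the comment.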
The bug:
Using a datetime2 column as tracking_column with the Logstash jdbc-input plugin and MSSQL Server results in duplicated data.
config:
Screenshots from the database input and the Redis output show 398800 rows in the input and 401791 results in the output
Screenshot of min and max t_datetime2 values
Screenshots of the actual queries jdbc-input used to get the data from the database
The SQL queries are losing precision, and this may result in duplicated data.
I am using the following Logstash Docker image:
logstash/logstash:7.3.2
and the MSSQL Server image:
microsoft/mssql-server-linux
I tried to search the forum for a similar issue and only found this comment describing the same problem:
#140 (comment)
Sample data
db definition
dbdefinition.txt
dataimport
dataimport.txt
Steps to reproduce:
OS: Debian 9
Version: Logstash 7.3.2
Thank you for your response.