I am just writing a brief post about a conversation I was asked to join.
The question paraphrased:
When dates in a Microsoft SQL Server database are passed through Qlik Replicate to Kafka in Avro format, how are they interpreted? As an epoch value in UTC, or in the current time zone?
I didn't know the answer, so I created the following simple test:
CREATE TABLE dbo.JD_DATES_TEST
(
    ID INT IDENTITY(1,1) PRIMARY KEY,
    TEST_SMALLDATETIME SMALLDATETIME,
    TEST_DATE DATE,
    TEST_DATETIME DATETIME,
    TEST_DATETIME2 DATETIME2,
    TEST_DATETIMEOFFSET DATETIMEOFFSET
);
GO
INSERT INTO dbo.JD_DATES_TEST VALUES(current_timestamp, current_timestamp, current_timestamp, current_timestamp, current_timestamp);
SELECT * FROM dbo.JD_DATES_TEST;
/*
ID TEST_SMALLDATETIME TEST_DATE TEST_DATETIME TEST_DATETIME2 TEST_DATETIMEOFFSET
----------- ----------------------- ---------- ----------------------- --------------------------- ----------------------------------
1 2025-06-12 12:04:00 2025-06-12 2025-06-12 12:04:16.650 2025-06-12 12:04:16.6500000 2025-06-12 12:04:16.6500000 +00:00
(1 row(s) affected)
*/
On the other side, after passing through Qlik Replicate and on to Kafka in Avro format, we got:
{
  "data": {
    "ID": {
      "int": 1
    },
    "TEST_SMALLDATETIME": {
      "long": 1749729840000000
    },
    "TEST_DATE": {
      "int": 20251
    },
    "TEST_DATETIME": {
      "long": 1749729856650000
    },
    "TEST_DATETIME2": {
      "long": 1749729856650000
    },
    "TEST_DATETIMEOFFSET": {
      "string": "2025-06-12 12:04:16.6500000 +00:00"
    },
    "x_y": {
      "string": "1.0.0"
    }
  },
  "beforeData": null,
  "headers": {
    "operation": "REFRESH",
    "changeSequence": "",
    "timestamp": "",
    "streamPosition": "",
    "transactionId": "",
    "changeMask": null,
    "columnMask": null,
    "transactionEventCounter": null,
    "transactionLastEvent": null
  }
}
So 1749729856650000 is a count of microseconds since the Unix epoch (1970-01-01) and decodes to Thursday, June 12, 2025 12:04:16.650 PM, which is the local time that was stored: the value is carried across as-is.
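To sanity-check those numbers, you can decode the Avro values back in T-SQL. This is just a quick verification sketch, assuming the long is microseconds since 1970-01-01 and the date int is days since 1970-01-01, which is what the output above suggests:
-- Decode the Avro values back into SQL Server date types.
-- Assumption: the long is microseconds since 1970-01-01, the int is days since 1970-01-01.
DECLARE @avro_datetime BIGINT = 1749729856650000; -- TEST_DATETIME / TEST_DATETIME2
DECLARE @avro_date INT = 20251;                    -- TEST_DATE
SELECT
    DATEADD(DAY, @avro_date, CAST('1970-01-01' AS DATE)) AS decoded_date,
    DATEADD(MILLISECOND,
            CAST(@avro_datetime / 1000 % 86400000 AS INT),          -- milliseconds within the day
            DATEADD(DAY,
                    CAST(@avro_datetime / 1000 / 86400000 AS INT),  -- whole days since the epoch
                    CAST('1970-01-01' AS DATETIME2(3)))) AS decoded_datetime;
/*
decoded_date decoded_datetime
------------ -----------------------
2025-06-12   2025-06-12 12:04:16.650
*/
Both values decode to exactly what SQL Server returned above.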