I get a list of IDs as a comma-separated list of JSON values; a sample dataset looks like this: [340596,340597,340595]. The list can be quite large, sometimes around 50k IDs.
The following query joins these IDs against the table's primary key and returns the records that currently exist in the table:
SELECT s.id, s.contactid, s.Quantity
FROM JSON_TABLE('[340596,340597,340595]', '$[*]' COLUMNS (Id INT PATH '$')) AS sm
LEFT JOIN mastertable s ON s.Id = sm.Id
The main table may still contain these IDs, or those records may have been deleted from it, so the purpose of this query is to ensure that the returned result set contains only active records.
I have to apply one more filter to this query: it is based on another JSON integer array and needs to be matched against the ContactID column.
SELECT s.id, s.contactid, s.Quantity
FROM JSON_TABLE('[340596,340597,340595]', '$[*]' COLUMNS (Id INT PATH '$')) AS sm
LEFT JOIN mastertable s ON s.Id = sm.Id
WHERE s.ContactId IN (
    SELECT cm.Id
    FROM JSON_TABLE('[12345,450597,640595]', '$[*]' COLUMNS (Id INT PATH '$')) AS cm
)
However, MySQL's IN performance is poor with large value lists. Can we replace this IN with something better?
You can dump the IDs from the IN clause into a temporary table and then join it against the JSON_TABLE result. Alternatively, you can use a CTE and join against it in the same way.
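A minimal sketch of the temporary-table variant, assuming MySQL 8.0+ and the table and column names from the question (tmp_contact_ids is an illustrative name):

CREATE TEMPORARY TABLE tmp_contact_ids (id INT PRIMARY KEY);

-- INSERT IGNORE tolerates duplicate IDs in the JSON array
INSERT IGNORE INTO tmp_contact_ids (id)
SELECT cm.Id
FROM JSON_TABLE('[12345,450597,640595]', '$[*]' COLUMNS (Id INT PATH '$')) AS cm;

SELECT s.id, s.contactid, s.Quantity
FROM JSON_TABLE('[340596,340597,340595]', '$[*]' COLUMNS (Id INT PATH '$')) AS sm
JOIN mastertable s      ON s.Id = sm.Id
JOIN tmp_contact_ids tc ON tc.id = s.ContactId;

DROP TEMPORARY TABLE tmp_contact_ids;

And the same idea with a CTE, which avoids the explicit create/drop:

WITH contact_ids AS (
    SELECT cm.Id
    FROM JSON_TABLE('[12345,450597,640595]', '$[*]' COLUMNS (Id INT PATH '$')) AS cm
)
SELECT s.id, s.contactid, s.Quantity
FROM JSON_TABLE('[340596,340597,340595]', '$[*]' COLUMNS (Id INT PATH '$')) AS sm
JOIN mastertable s ON s.Id = sm.Id
JOIN contact_ids c ON c.Id = s.ContactId;

Note that once the ContactId filter is applied, the original LEFT JOIN behaves as an inner join anyway, so a plain JOIN is used in these sketches.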