I use spring-boot-starter-batch 2.7.9 with a MySQL database as the data source. I have a strange problem: the batch job runs fine locally, but it cannot be started in the development environment (Kubernetes), failing with the following exception:
Exception while starting job org.springframework.jdbc.UncategorizedSQLException: PreparedStatementCallback; uncategorized SQLException for SQL [INSERT into BATCH_JOB_EXECUTION_PARAMS(JOB_EXECUTION_ID, KEY_NAME, TYPE_CD, STRING_VAL, DATE_VAL, LONG_VAL, DOUBLE_VAL, IDENTIFYING) values (?, ?, ?, ?, ?, ?, ?, ?)]; SQL state [HY000]; error code [3098]; The table does not comply with the requirements by an external plugin.; nested exception is java.sql.SQLException: The table does not comply with the requirements by an external plugin.
Jobs are triggered through a REST API, which provides two job parameters: a string and a datetime.
Example parameters:
{ "idType" : "ALL", "triggerTime": "2023-03-16T19:54:18.262Z" }
One difference between my local database and the development database might be the replication factor. The development database is configured with 3 replicas, whereas locally I only have one. Could this be the cause of this exception? How can I solve this problem? TIA.
You need to customize the definition of the Spring Batch meta-data tables so that each table has a primary key.
By default, MySQL accepts tables without a primary key. But in any setup using MySQL replication, a table without a primary key will either not work at all (as in your Group Replication situation, which is the "external plugin" referred to in the error message) or cause operational headaches later on.
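If the meta-data tables already exist in the development database, one quick fix is to add a surrogate primary key to the offending table. In the stock MySQL script, BATCH_JOB_EXECUTION_PARAMS is created without a primary key, and it is the table in your stack trace. The sketch below is not part of the Spring Batch distribution, and the column name PARAM_ID is made up for this example; Spring Batch inserts into this table with an explicit column list (visible in the failing SQL), so an extra auto-increment column should not interfere with it:

-- Sketch: give the table from the stack trace a surrogate primary key.
-- PARAM_ID is an arbitrary name chosen for this example.
ALTER TABLE BATCH_JOB_EXECUTION_PARAMS
    ADD COLUMN PARAM_ID BIGINT NOT NULL AUTO_INCREMENT PRIMARY KEY;

-- List any remaining BATCH_* tables that still lack a primary key:
SELECT t.TABLE_NAME
FROM information_schema.TABLES t
LEFT JOIN information_schema.TABLE_CONSTRAINTS c
       ON c.TABLE_SCHEMA = t.TABLE_SCHEMA
      AND c.TABLE_NAME = t.TABLE_NAME
      AND c.CONSTRAINT_TYPE = 'PRIMARY KEY'
WHERE t.TABLE_SCHEMA = DATABASE()
  AND t.TABLE_NAME LIKE 'BATCH_%'
  AND c.CONSTRAINT_NAME IS NULL;

The surrogate key satisfies the Group Replication requirement without changing anything Spring Batch reads or writes.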
The root cause is that the DDL shipped with Spring Batch deliberately declares very few indexes, because the best indexing strategy depends heavily on the specific use case: https://docs.spring.io/spring-batch/docs/4.3.8/reference/html/schema-appendix.html#recommendationsForIndexingMetaDataTables
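For environments that are created from scratch, it is cleaner to create the table with a primary key in the first place: copy the schema-mysql.sql script from the spring-batch-core jar into your own resources, adjust the table definition there, and point Spring Boot at your copy via the spring.batch.jdbc.schema property (e.g. spring.batch.jdbc.schema=classpath:batch/schema-mysql-custom.sql, a hypothetical path), keeping in mind that the script only runs automatically against an embedded database unless you set spring.batch.jdbc.initialize-schema=always. Below is a sketch of the adjusted table, assuming the column definitions of the 4.3.x script; verify them against the script shipped with your exact version rather than copying them from here:

CREATE TABLE BATCH_JOB_EXECUTION_PARAMS (
    PARAM_ID BIGINT NOT NULL AUTO_INCREMENT PRIMARY KEY, -- added surrogate key
    JOB_EXECUTION_ID BIGINT NOT NULL,
    TYPE_CD VARCHAR(6) NOT NULL,
    KEY_NAME VARCHAR(100) NOT NULL,
    STRING_VAL VARCHAR(250),
    DATE_VAL DATETIME(6) DEFAULT NULL,
    LONG_VAL BIGINT,
    DOUBLE_VAL DOUBLE PRECISION,
    IDENTIFYING CHAR(1) NOT NULL,
    CONSTRAINT JOB_EXEC_PARAMS_FK FOREIGN KEY (JOB_EXECUTION_ID)
        REFERENCES BATCH_JOB_EXECUTION (JOB_EXECUTION_ID)
) ENGINE = InnoDB;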