Monday 12 April 2021

JDBI v3 and the @BatchChunkSize annotation: only inserting up to the batch chunk size

Well, hasn't it been a long time.


We recently upgraded to JDBI v3 and, as part of that, came across an interesting problem. With v3, a batch insert was only inserting up to the number of rows defined in the @BatchChunkSize annotation. So if the annotation was set to 50, you would get a maximum of 50 rows inserted.
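For context, @BatchChunkSize(50) tells JDBI to split the bound list into chunks of at most 50 rows and execute each chunk as its own JDBC batch. A minimal sketch of that splitting, using only the standard library (BatchChunker and chunk are my illustrative names, not JDBI API):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative only: shows the chunking behaviour that @BatchChunkSize(50)
// implies, i.e. 120 rows become batches of 50, 50 and 20.
public class BatchChunker {
    public static <T> List<List<T>> chunk(List<T> rows, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += chunkSize) {
            // each sublist corresponds to one JDBC batch execution
            int end = Math.min(i + chunkSize, rows.size());
            chunks.add(new ArrayList<>(rows.subList(i, end)));
        }
        return chunks;
    }
}
```

The bug we hit behaved as though only the first of these chunks made it to the database.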


In v2 we had a method on an interface defined as:

@GetGeneratedKeys
@SqlBatch("Insert into my_table (......)")
@BatchChunkSize(50)
void insertAsBatch(@BindBean("r") List<Foo> foos);


NB: in v2 this returned void.


So, innocently and clearly incorrectly, I changed this to return int, assuming that would be the number of rows inserted. Note that, stupidly, we had no test for this.


Users then found that we were only able to insert 50 rows.


Writing a test, I then found that the value returned was the ID of the first row inserted. The fix was very simple: change the return type to int[], i.e. this:

@GetGeneratedKeys
@SqlBatch("Insert into my_table (......)")
@BatchChunkSize(50)
int[] insertAsBatch(@BindBean("r") List<Foo> foos);
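With that signature, the returned int[] holds one generated key per inserted row, so all rows are inserted regardless of the chunk size. A hedged, non-compiling usage sketch follows; the Jdbi instance, the MyDao interface name, the H2 URL and the Foo bean are all illustrative assumptions, not from our real code:

```java
// Sketch only: assumes MyDao declares insertAsBatch as above and Foo is a
// simple bean; the connection URL is a placeholder.
Jdbi jdbi = Jdbi.create("jdbc:h2:mem:test")
                .installPlugin(new SqlObjectPlugin());

List<Foo> foos = loadFoos(); // hypothetical helper returning > 50 rows

int[] keys = jdbi.withExtension(MyDao.class, dao -> dao.insertAsBatch(foos));

// Expect one generated key per row, not just the first chunk's worth:
// keys.length == foos.size()
```

And this time, the batch insert is covered by a test.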