Converting a jOOQ Record into a vertx JsonObject

In vertx-jooq, there are a couple of ways to convert a jOOQ Record into a vertx JsonObject. First, check out the straightforward way, which involves two steps: load the POJO using a generated DAO and then convert it with the toJson method.


public class SomeJsonTest {
    public void getJsonFromPojo() {
        SomethingDao somethingDao = new SomethingDao();
        // setConfiguration / setVertx calls left out
        CompletableFuture<JsonObject> jsonFuture = somethingDao
                .fetchOneBySomeidAsync(1)
                .thenApply(ISomething::toJson);
    }
}

It turns out, however, that on the way to a JsonObject a lot of copying is going on: first the SELECT statement is mapped to a Record, which is then converted to a POJO, and finally converted into a JsonObject. The code below is taken from the DAOImpl class, which every VertxDao extends:


public abstract class DAOImpl<R extends UpdatableRecord<R>, P, T> implements DAO<R, P, T> {
    // …
    @Override
    public /* non-final */ <Z> P fetchOne(Field<Z> field, Z value) {
        R record = using(configuration)
                .selectFrom(table)
                .where(field.equal(value))
                .fetchOne();
        return record == null ? null : mapper().map(record);
    }
    // …
}


If you’ve enabled the generation of interfaces in vertx-jooq (which is the default), you can convert the Record directly into a JsonObject without creating a POJO first.


public class SomeJsonTest {
    public void getJsonFromRecord() {
        SomethingDao somethingDao = new SomethingDao();
        // setConfiguration / setVertx calls left out
        CompletableFuture<JsonObject> jsonFuture = somethingDao.executeAsync(dslContext -> dslContext
                .selectFrom(somethingDao.getTable())
                .where(Tables.SOMETHING.SOMEID.eq(1))
                .fetchOne())
                .thenApply(ISomething::toJson);
    }
}

The major disadvantage of this solution is that you have to write the SELECT statement yourself. On the other hand, you save the creation of the POJO, which is a plus. But what if we’re joining on another table, so we cannot map directly into a generated Record? See this example:


public class SomeJsonTest {
    public void getJsonFromRecord2() {
        SomethingDao somethingDao = new SomethingDao();
        // setConfiguration / setVertx calls left out
        CompletableFuture<JsonObject> jsonFuture = somethingDao.executeAsync(dslContext -> dslContext
                .select(Tables.SOMETHING.SOMEID)
                .select(Tables.SOMEOTHERTHING.SOMEVALUE)
                .from(Tables.SOMETHING)
                .join(Tables.SOMEOTHERTHING)
                .on(Tables.SOMETHING.SOMEID.eq(Tables.SOMEOTHERTHING.SOMEOTHERID))
                .where(Tables.SOMETHING.SOMEID.eq(1))
                .fetchOneMap())
                // or .fetchOne(Record::intoMap)
                .thenApply(JsonObject::new);
    }
}

Because the fetched Record is dynamic, we cannot simply call toJson on it. Instead, we call the fetchOneMap method, which “returns at most one resulting record as a name/value map.” Luckily, JsonObject has a constructor taking a Map&lt;String, Object&gt;, which is exactly the generic type returned (jOOQ even returns the same Map implementation that JsonObject uses under the hood: LinkedHashMap). It is noteworthy that this could even be done with pure jOOQ and vertx (leaving the execution aside). Some caveats though:
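To make the map shape concrete, here is a minimal, self-contained sketch (plain JDK, no jOOQ or vertx on the classpath; the column names and values are made up to mirror the join above):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FetchMapSketch {
    // What fetchOneMap() would hand back for the join above:
    // a LinkedHashMap keyed by column name, preserving column order.
    static Map<String, Object> rowAsMap() {
        Map<String, Object> row = new LinkedHashMap<>();
        row.put("SOMEID", 1);
        row.put("SOMEVALUE", "hello");
        return row;
    }

    public static void main(String[] args) {
        // new JsonObject(rowAsMap()) would wrap this map as-is, without
        // copying, so the JSON keys mirror the result set exactly.
        System.out.println(rowAsMap()); // {SOMEID=1, SOMEVALUE=hello}
    }
}
```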

  1. No conversion happens. If you use a custom generator to generate the schema objects and have overwritten handleCustomTypeToJson / handleCustomTypeFromJson, you’re probably in bad shape: if one of the fields you’re fetching is one of those types, this conversion is bypassed.
  2. Because the constructor makes no checks and takes the map ‘as is’ without copying, you might end up adding illegal JSON types.
  3. The Map‘s keys carry the names of the database columns they represent. This differs from the default setting in vertx-jooq, which uses the name of the POJO’s member variable representing that column, and would thus produce different JSON than the toJson method for the same Record/POJO. Starting with 2.3.5, however, this can be changed so the keys are rendered with the same syntax.
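Caveat 3 can be sketched in a few lines. The camelCase helper below is a hypothetical stand-in for the member-name derivation a code generator typically performs; the column name `some_value` is made up:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class KeyNamingSketch {
    // Hypothetical helper mimicking how a generator derives a POJO
    // member name (camelCase) from a database column name (snake_case).
    static String toCamelCase(String columnName) {
        StringBuilder sb = new StringBuilder();
        boolean upperNext = false;
        for (char c : columnName.toLowerCase().toCharArray()) {
            if (c == '_') {
                upperNext = true;
            } else {
                sb.append(upperNext ? Character.toUpperCase(c) : c);
                upperNext = false;
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // fetchOneMap() keys the row by column name ...
        Map<String, Object> fromRecord = new LinkedHashMap<>();
        fromRecord.put("some_value", 42);

        // ... while toJson() on the POJO would use the member name,
        // so the two JSON shapes diverge unless key rendering is aligned.
        Map<String, Object> fromPojo = new LinkedHashMap<>();
        fromPojo.put(toCamelCase("some_value"), 42);

        System.out.println(fromRecord); // {some_value=42}
        System.out.println(fromPojo);   // {someValue=42}
    }
}
```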

Lessons learned

When simple CRUD is not enough and you have to fetch joined Records and convert the results into a JsonObject, you should change the syntax in which the JsonObject's keys are rendered in the POJO (caveat 3). This is the first step towards making both ways of generating JSON interoperable.

Things get trickier when you have to deal with the conversion of custom JSON types. If your app/service mostly reads data, you can handle the conversion at the jOOQ level, so your converted type is already a valid JsonObject type. For example, if you’re using a SQL DateTime field “lastUpdated” at the database level, just write a converter that formats the DateTime as a String. In turn, both the generated POJO and the fetchOneMap/fetchMaps methods will return a String for the “lastUpdated” entry and produce the same result.
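A minimal sketch of the conversion logic such a jOOQ converter would wrap, using only java.time (the class and method names here are illustrative, not jOOQ's Converter API; ISO-8601 is just one reasonable choice of String format):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class DateTimeConversionSketch {
    // A JSON-friendly String representation for the database DateTime.
    static final DateTimeFormatter FMT = DateTimeFormatter.ISO_LOCAL_DATE_TIME;

    // database value -> user type (String), the direction used when reading
    static String from(LocalDateTime dbValue) {
        return dbValue == null ? null : FMT.format(dbValue);
    }

    // user type (String) -> database value, the direction used when writing
    static LocalDateTime to(String jsonValue) {
        return jsonValue == null ? null : LocalDateTime.parse(jsonValue, FMT);
    }

    public static void main(String[] args) {
        LocalDateTime lastUpdated = LocalDateTime.of(2018, 3, 14, 12, 30, 45);
        String asJsonValue = from(lastUpdated);
        System.out.println(asJsonValue);                          // 2018-03-14T12:30:45
        System.out.println(to(asJsonValue).equals(lastUpdated));  // true
    }
}
```

With a converter like this in place, the "lastUpdated" value is already a String by the time it reaches either the POJO or the map returned by fetchOneMap.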

This can become a problem when those converted values need to be altered by the same app: a) it is more convenient and less error-prone to set a java.time.LocalDateTime object than to deal with Strings, and b) some of those types have special SQL functions (e.g. dateAdd) which you can no longer use for that type.

Conclusion

Surprisingly, there is no golden rule. First, I recommend changing the syntax of how JSON keys are rendered so that it matches the database column names (I will most likely make that behavior the default in one of the next releases). This means you can use the fetchOneMap/fetchMaps methods most of the time to produce JSON. When dealing with custom types, you should check how frequently those are updated by your app. If you’re facing a read-only app, write custom jOOQ converters for these types. If these types are altered quite often, you should stick to their actual type and handle the conversion into a proper JSON value yourself.
