Originally posted on 2020-11-25
Yesterday I needed to convert a large object to JSON. The object had 300k+ other objects nested inside of it. At first I didn't realize how big it was and naively tried to convert it to JSON with Gson the way I normally do:
String json = new Gson().toJson(object);
This fails pretty quickly with an OutOfMemoryError when the resulting string is 1.7 GB. I could have increased my heap size to accommodate it, but both Jackson and Gson provide methods that serialize JSON directly to an output stream or writer, such as a file, which avoids the intermediate string and saves a lot of time.
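For context, bumping the heap for a one-off run is just a JVM flag; the 4 GB value and the jar name here are placeholders, not from the original post:

```shell
# -Xmx sets the maximum heap size; adjust the value to fit the intermediate string
java -Xmx4g -jar my-app.jar
```

Streaming straight to disk, as shown below, makes this unnecessary.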
Below are the two snippets, one for each library. It was interesting to see that, in this case, Jackson serialized and wrote the 1.7 GB data structure to disk in 19 seconds while Gson took 98 seconds to do the same. I'm not sure whether something else was getting in the way, but it may be worth trying Jackson if you're using Gson with large structures.
Also, I tried using Java's native serialization (ObjectOutputStream) to write the objects to disk, and it took significantly longer than either of the JSON methods. Actually, it took so long I didn't even let it finish...
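Java's built-in binary serialization lives in java.io.ObjectOutputStream. For reference, a minimal sketch of that approach, using a small placeholder list in place of the real 300k-object structure (everything in the object graph must implement Serializable):

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.ArrayList;
import java.util.List;

public class Main {
    public static void main(String[] args) throws IOException, ClassNotFoundException {
        // Placeholder data; ArrayList and String are both Serializable
        List<String> data = new ArrayList<>();
        data.add("a");
        data.add("b");

        // Write the whole object graph to disk in Java's binary format
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("native.bin"))) {
            out.writeObject(data);
        }

        // Read it back to confirm the round trip
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream("native.bin"))) {
            @SuppressWarnings("unchecked")
            List<String> roundTripped = (List<String>) in.readObject();
            System.out.println(roundTripped);
        }
    }
}
```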
NOTE: I'm using vavr's Try here but you could do the same thing with simple try/catch syntax too.
The Jackson way:
ObjectWriter objectWriter = new ObjectMapper().writer().withDefaultPrettyPrinter();
Try.withResources(() -> new FileWriter("jackson.json"))
    .of(fileWriter -> Try.withResources(() -> objectWriter.writeValues(fileWriter))
        .of(sequenceWriter -> sequenceWriter.write(object)))
    .get();
The Gson way:
Gson gson = new GsonBuilder().setPrettyPrinting().create();
Try.withResources(() -> new FileWriter("gson.json"))
    .of(writer -> Try.run(() -> gson.toJson(object, writer)).get())
    .get();
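Since vavr's Try isn't required, here is a minimal sketch of the same resource handling with plain try-with-resources. The writeJson helper and the "plain.json" filename are stand-ins for illustration; with the real libraries the body of the try block would be objectWriter.writeValue(fileWriter, object) (Jackson) or gson.toJson(object, fileWriter) (Gson):

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.Writer;
import java.nio.file.Files;
import java.nio.file.Paths;

public class Main {
    // Hypothetical stand-in for the library call that streams JSON to the writer
    static void writeJson(Writer writer, Object object) throws IOException {
        writer.write("{\"value\":\"" + object + "\"}");
    }

    public static void main(String[] args) throws IOException {
        // try-with-resources closes the FileWriter even if serialization throws,
        // which is the same guarantee Try.withResources provides in the snippets above
        try (FileWriter fileWriter = new FileWriter("plain.json")) {
            writeJson(fileWriter, "hello");
        }
        System.out.println(new String(Files.readAllBytes(Paths.get("plain.json"))));
    }
}
```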