Merging JSON Data Records in a Spring Boot Application Using Jackson

In modern Java applications, JSON is a popular data interchange format. Developers often need to merge data from multiple JSON sources for testing, analytics, or data consolidation. This article explains how to merge data records from a supplementary JSON file into a primary JSON file within a Spring Boot controller using Jackson. We focus on handling JSON tree structures, file I/O, and dynamic merging logic while following industry best practices for code quality.

Overview

Imagine you have two JSON files:

  • Primary Data File: Contains an event object, which includes metadata and a "records" array that holds existing data entries.
  • Supplementary Data File: Contains another array (e.g., "dataItems") that you wish to merge into the primary file’s "records" array.

For instance, the primary JSON might look like this:

{
  "event": {
    "class": "data-record",
    "type": "created",
    "records": [
      {
        "id": 1001,
        "value": 12.34,
        "description": "Initial record",
        "timestamp": 1624449600
      }
    ]
  }
}

And the supplementary JSON might appear as follows:

{
  "found": 200,
  "dataItems": [
    {
      "id": 2001,
      "value": 56.78,
      "description": "Additional record",
      "timestamp": 1712577600
    },
    {
      "id": 2002,
      "value": 90.12,
      "description": "Another record",
      "timestamp": 1712577601
    }
  ]
}

The technical goal is to write a method that reads both JSON files, updates any dynamic fields if necessary, and then appends all items from the supplementary file to the primary file’s "records" array. This provides a merged JSON data structure ready for further processing or output.
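
Given the two files above, the merged result would look like this:

{
  "event": {
    "class": "data-record",
    "type": "created",
    "records": [
      {
        "id": 1001,
        "value": 12.34,
        "description": "Initial record",
        "timestamp": 1624449600
      },
      {
        "id": 2001,
        "value": 56.78,
        "description": "Additional record",
        "timestamp": 1712577600
      },
      {
        "id": 2002,
        "value": 90.12,
        "description": "Another record",
        "timestamp": 1712577601
      }
    ]
  }
}

Note that only the supplementary "dataItems" entries are appended; top-level fields of the supplementary file such as "found" are not carried over.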

Technical Approach

The solution leverages several key Java components:

  • Jackson’s ObjectMapper and JsonNode Tree Model: For reading, manipulating, and writing JSON.
  • Java NIO for File I/O: To read file content from disk.
  • Spring Boot: To manage the controller logic and dependency injection.
  • Robust Error Handling: Ensuring that file reading issues can be managed gracefully with fallbacks or retries.
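
A note on dependencies: if your project already uses spring-boot-starter-web, jackson-databind is on the classpath through Spring Boot's dependency management, so no extra coordinates are needed; otherwise, add com.fasterxml.jackson.core:jackson-databind to your build.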

When merging data, the approach is as follows:

  1. Load the Primary JSON File:
    Use ObjectMapper to deserialize the file into a JsonNode tree structure. Navigate to the "event" node and then obtain the "records" array.
  2. Load the Supplementary JSON File:
    Similarly, read the supplementary JSON and extract its data array (e.g., "dataItems").
  3. Append Data:
    Iterate over each element in the supplementary array and add it to the primary file’s "records" array.
  4. Output the Merged JSON:
    Finally, serialize the updated JSON tree into a formatted JSON string, ready for use.
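
Stripped of file-resolution details, these four steps reduce to a few lines of Jackson tree-model code. The following is a minimal sketch that assumes both files exist and both arrays are present; the full example below adds path resolution and defensive checks:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
import java.io.File;
import java.io.IOException;

public class SimpleMergeSketch {

    public static String merge(File primaryFile, File supplementaryFile) throws IOException {
        ObjectMapper mapper = new ObjectMapper();

        // 1. Load the primary JSON and navigate to the "records" array
        JsonNode primary = mapper.readTree(primaryFile);
        ArrayNode records = (ArrayNode) primary.path("event").path("records");

        // 2. Load the supplementary JSON and extract its data array
        JsonNode supplementary = mapper.readTree(supplementaryFile);
        ArrayNode dataItems = (ArrayNode) supplementary.path("dataItems");

        // 3. Append every supplementary element to the primary array
        records.addAll(dataItems);

        // 4. Serialize the updated tree into a formatted JSON string
        return mapper.writerWithDefaultPrettyPrinter().writeValueAsString(primary);
    }
}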

Code Example

Below is a refined example of how to implement this merging strategy within a Spring Boot controller method. The example uses generic file and folder names to anonymize any sensitive details; log, baseFolder, separator, and fileCounter are assumed to be fields on the enclosing controller class:

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
import com.fasterxml.jackson.databind.node.ObjectNode;
import java.io.File;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;

// synchronized: the method mutates the shared fileCounter field
private synchronized String getMergedJson(String identifier, String folderType) {
    log.info("Entering getMergedJson for folder type: {}", folderType);
    // Retrieve a generic user or context object (anonymized)
    UserContext user = DataRepository.getUser(identifier);
    if (user == null) {
        log.info("User not found for identifier: {}", identifier);
        return null;
    }
    log.info("Processing JSON for user: {}", user.getUsername());

    // Choose file prefix based on folder type (e.g., primary or supplementary)
    String filePrefix = folderType.equals("primary") ? "PrimaryData" : "SupplementaryData";
    
    // Build file path for main JSON file (anonymized file path)
    File userFolder = new File(baseFolder + separator + "module" + separator + folderType + separator + user.getUsername());
    File mainJsonFile = new File(userFolder, filePrefix + fileCounter + ".json");
    if (!mainJsonFile.exists()) {
        mainJsonFile = new File(baseFolder + separator + "module" + separator + folderType + separator + filePrefix + fileCounter + ".json");
    }
    log.info("Reading main JSON from: {}", mainJsonFile.getAbsolutePath());
    
    ObjectMapper mapper = new ObjectMapper();
    JsonNode rootNode;
    try {
        String jsonContent = new String(Files.readAllBytes(mainJsonFile.toPath()), StandardCharsets.UTF_8);
        rootNode = mapper.readTree(jsonContent);
    } catch (IOException e) {
        log.error("IOException reading main JSON: {}", e.getMessage());
        if (fileCounter == 1) {
            // Already at the first file; retrying would recurse forever
            return null;
        }
        // Reset the counter and retry once from the first file
        fileCounter = 1;
        return getMergedJson(identifier, folderType);
    }
    
    // If processing the primary file, merge in the supplementary data
    if (folderType.equals("primary") && rootNode.path("event").isObject()) {
        ObjectNode eventNode = (ObjectNode) rootNode.path("event");
        JsonNode recordsNode = eventNode.path("records");
        ArrayNode records;
        if (recordsNode.isArray()) {
            records = (ArrayNode) recordsNode;
        } else {
            // Jackson's path() returns a MissingNode, never null, so test isArray()
            records = mapper.createArrayNode();
            eventNode.set("records", records);
        }
        
        // Load supplementary JSON from a separate folder
        String supPrefix = "SupplementaryData";
        File supFolder = new File(baseFolder + separator + "module" + separator + "supplementary" + separator + user.getUsername());
        File supJsonFile = new File(supFolder, supPrefix + fileCounter + ".json");
        if (!supJsonFile.exists()) {
            supJsonFile = new File(baseFolder + separator + "module" + separator + "supplementary" + separator + supPrefix + fileCounter + ".json");
        }
        log.info("Reading supplementary JSON from: {}", supJsonFile.getAbsolutePath());
        
        try {
            String supJsonContent = new String(Files.readAllBytes(supJsonFile.toPath()), StandardCharsets.UTF_8);
            JsonNode supRootNode = mapper.readTree(supJsonContent);
            JsonNode supDataNode = supRootNode.path("dataItems");
            if (supDataNode.isArray()) {
                // Append each supplementary element to the primary records array
                for (JsonNode item : supDataNode) {
                    records.add(item);
                }
            }
        } catch (IOException e) {
            log.error("IOException reading supplementary JSON: {}", e.getMessage());
            // Proceed with the primary data even if the supplementary file fails
        }
    }
    
    String mergedJson;
    try {
        mergedJson = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(rootNode);
    } catch (JsonProcessingException e) {
        // Fall back to the compact single-line representation
        mergedJson = rootNode.toString();
    }
    fileCounter++;
    log.info("Completed processing JSON for folder type {}. New fileCounter is: {}", folderType, fileCounter);
    return mergedJson;
}

Explanation of the Code

  1. File Path Resolution:
    The method builds file paths from generic folder names and an incrementing file counter. It first looks for a user-specific file and falls back to a shared default file when none exists.
  2. Primary JSON Processing:
    When the folder type is "primary" and the "event" node is present, the code retrieves the "records" array from that node. Because Jackson's path() returns a MissingNode rather than null, the code checks isArray() and creates a fresh array when none exists.
  3. Supplementary JSON Merging:
    It then resolves the supplementary JSON file (stored in a separate folder), reads its "dataItems" array, and appends each element to the primary file's "records" array.
  4. Error Management and Retrying:
    If reading the main file fails, the method logs the error, resets the counter, and retries once through a recursive call; a guard returns null instead of recursing indefinitely when the first file itself is unreadable. A failure on the supplementary file is logged, and the primary data is returned unmerged.
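
For completeness, here is one way such a method might be exposed from the controller. The class name, endpoint path, and request parameters below are hypothetical; the original method is private, so the endpoint simply delegates to it within the same class:

import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class MergedJsonController {

    // Returns the merged JSON document, or 404 if the user or file is missing.
    @GetMapping(value = "/merged-json/{identifier}", produces = MediaType.APPLICATION_JSON_VALUE)
    public ResponseEntity<String> mergedJson(@PathVariable String identifier,
                                             @RequestParam(defaultValue = "primary") String folderType) {
        String merged = getMergedJson(identifier, folderType);
        return merged == null ? ResponseEntity.notFound().build() : ResponseEntity.ok(merged);
    }

    // getMergedJson(...) is the method from the Code Example section,
    // defined on this same class; stubbed here so the sketch stands alone.
    private synchronized String getMergedJson(String identifier, String folderType) {
        return null; // replace with the implementation shown earlier
    }
}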

Conclusion

Merging JSON data dynamically using Jackson within a Spring Boot application offers a flexible way to consolidate datasets from multiple sources. In this guide, we demonstrated a generic approach—without exposing sensitive or domain-specific terms—by appending data from a supplementary file into a primary file’s records array. This strategy helps maintain data integrity, supports testing environments, and offers a robust base for further enhancements.

This solution handles varying JSON structures gracefully, and its generic nature means it can be adapted to different data domains or application requirements with little effort. From here, developers can refine the integration logic, add validation, or extend the merging process to support more complex use cases.

This article is inspired by real-world challenges we tackle in our projects. If you're looking for expert solutions or need a team to bring your idea to life, let's talk!