
Information Technology_USA - USA_Technical Architect

Real Soft, Inc.
Location: Jacksonville, FL, USA
Published: 5/4/2026
Full time
MSP Owner: Thomas Hodges
REQUIREMENT_CITY - New Jersey (White House Station). Candidates must be available for a face-to-face (F2F) interview and be willing to work in the office 3-4 days a week.
REQUIREMENT_ID - 10645635
Role Name - Data Architect

ROLE_DESCRIPTION -

We are seeking a Senior Data & Integration Architect to lead data model design and downstream integration strategy for a large-scale policy administration system modernization. You will reverse engineer legacy mainframe data structures, forward engineer them into modern SQL/MongoDB document schemas, and define how data flows to downstream consumers. You will leverage purpose-built AI agents to accelerate reverse engineering, model generation, and documentation - bringing human judgment to validate and refine AI-produced outputs.
Key Responsibilities
- Design and maintain technology-agnostic Logical Data Models (entities, relationships, cardinality, PK/FK)
- Transform LDMs into modern physical schemas applying aggregate-oriented and DDD patterns
- Reverse engineer IMS hierarchical segments and DB2 tables - extract business entities from physical storage structures without original design documentation
- Interpret COBOL copybooks as data structure definitions and map legacy field types to modern equivalents
- Define embedding vs. referencing strategies, versioning patterns, and collection boundaries for target database platforms
- Design downstream integration patterns - REST APIs, event streaming (Kafka/MQ), Change Data Capture (CDC), and data distribution to consuming systems
- Direct and validate AI agent pipelines for automated reverse engineering, ERD generation, data dictionary synthesis, and schema artifact production
- Produce data dictionaries, ERD diagrams, ETL field mapping specifications, and integration contracts
- Collaborate with SMEs to validate models and integration flows against undocumented business logic
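To illustrate the copybook-interpretation work described above: a core task is mapping COBOL PIC clauses to modern column types. The sketch below is a minimal, illustrative example only (function name and type mappings are hypothetical); real copybooks also involve OCCURS, REDEFINES, and usage clauses such as COMP-3 that need fuller handling.

```python
import re

# Illustrative mapping of simple COBOL PIC clauses to modern column types.
# Covers only common scalar cases; flags anything else for manual review.
def map_pic_clause(pic: str) -> str:
    pic = pic.upper().strip()
    # Alphanumeric: PIC X(n) -> variable-length string
    m = re.fullmatch(r"X\((\d+)\)", pic)
    if m:
        return f"VARCHAR({m.group(1)})"
    # Signed/unsigned numeric with optional implied decimal: S9(p)V9(s)
    m = re.fullmatch(r"S?9\((\d+)\)(?:V9\((\d+)\))?", pic)
    if m:
        digits = int(m.group(1))
        scale = int(m.group(2) or 0)
        if scale:
            return f"DECIMAL({digits + scale},{scale})"
        return "INTEGER" if digits <= 9 else "BIGINT"
    return "UNKNOWN"  # flag for manual review

print(map_pic_clause("X(30)"))       # VARCHAR(30)
print(map_pic_clause("S9(7)V9(2)"))  # DECIMAL(9,2)
print(map_pic_clause("9(5)"))        # INTEGER
```

In practice this kind of mapping is one small step in an AI-assisted reverse-engineering pipeline; the architect's job is validating that the generated target types preserve precision and sign semantics from the mainframe source.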

Required Qualifications
- 8+ years of experience in data architecture and system integration within OLTP / transactional domains (insurance, banking, billing, or similar)
- Hands-on experience with IBM IMS, DB2 for z/OS, and COBOL copybooks - able to read a segment hierarchy or copybook independently
- 5+ years designing physical data models for modern relational or document-oriented databases
- Strong grasp of logical modeling: ERD notation, composition vs. reference, cardinality, key design
- Proven experience designing integration architectures: REST APIs, event streaming (Kafka, MQ), CDC pipelines, and message-based data distribution
- Experience with ELT processing, including designing and implementing ELT workflows, data transformation, data cleansing, and data validation
- Experience with real-time data processing, including designing and implementing real-time data processing pipelines with event-driven architectures
- Comfort working in an AI-augmented workflow - directing LLM-based agents, reviewing AI-generated artifacts, and applying domain expertise to close gaps AI cannot resolve
- Scripting proficiency (Python or equivalent) for schema validation and artifact generation
- Ability to abstract legacy physical data structures into business-oriented target models - separating IMS/DB2 storage implementation details from true business keys and domain entities
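The scripting-for-schema-validation qualification above might look like the following minimal sketch: checking that migrated policy documents carry expected fields and types, with embedded vs. referenced data called out. Field names are hypothetical; a production pipeline would more likely use jsonschema or Pydantic.

```python
# Expected shape of a migrated policy document (illustrative only).
EXPECTED = {
    "policy_number": str,
    "effective_date": str,
    "coverages": list,   # embedded: read together with the policy (aggregate)
    "insured_id": str,   # referenced: shared across policies, stored separately
}

def validate(doc: dict) -> list:
    """Return a list of human-readable violations (empty list = valid)."""
    errors = []
    for field, expected_type in EXPECTED.items():
        if field not in doc:
            errors.append(f"missing field: {field}")
        elif not isinstance(doc[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}, "
                          f"got {type(doc[field]).__name__}")
    return errors

doc = {"policy_number": "P-1001", "effective_date": "2026-01-01",
       "coverages": [{"code": "BI", "limit": 500000}], "insured_id": "I-42"}
print(validate(doc))  # []
```

The embed/reference comments reflect the aggregate-oriented design choice the role calls for: coverages live inside the policy aggregate because they are always read with it, while the insured party is referenced because it is shared across policies.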

Preferred
- Insurance domain knowledge (policy lifecycle, coverages, LOBs, premium rating)
- Domain-Driven Design (DDD) - aggregates, bounded contexts, event-driven design
- Experience synthesizing a unified model from multiple heterogeneous sources (IMS + DB2 + application logic)
- Prior experience working with AI coding assistants (Claude, GitHub Copilot, or similar) in a software engineering or data architecture context

Skills: Data Architecture and Modeling
Experience Required: 10 years & above