Framework and execution model

SNPTX separates framework layers from runtime execution so infrastructure, governance, workload surfaces, mediated extensions, and deployment scope remain explicit.

Architecture view
Orchestrated execution for research-facing deployment

Layered framework architecture with a staged execution spine and declared extension boundaries.

The architecture is presented in two complementary views. The framework view identifies infrastructure, data adapters, execution coordination, modality-specific workloads, autonomous experimentation, and the deployment surface. The execution view describes staged transitions, persisted artifacts, and declared interfaces for downstream analytical attachment.

Framework architecture

Layered system organization

The framework separates infrastructure controls, data adapters, execution coordination, workload surfaces, experiment-selection components, and research-facing deployment interfaces.

Framework view (six layers):

1. Infrastructure and reproducible practice (foundation): config-driven execution, tests, versioned repos, artifact semantics, repeatable runs.
2. Data platform and adapters (operational): multi-dataset registry, modality adapters, embedding registry, FHIR / OMOP alignment, dataset manifests.
3. Execution coordination and extension boundary (execution layer): Snakemake spine, persisted artifacts, MLflow logging, contract validation, manifest capture; owner-mediated extension runner with contract validation and manifest capture.
4. Model and modality layer (workload surface): tabular, omics, graph, imaging, text, fusion, comparative evaluation, modality-specific methods.
5. Autonomous experimentation surface (optimization): GP surrogates, meta-features, experiment memory, value-of-information, stopping rules; an experiment-selection loop using scoped optimization inputs and stopping rules.
6. Research deployment surface (pilot surface): FastAPI serving, batch inference, audit trails, customer-hosted path, and one lab-facing workflow; research-facing deployment scope with defined workflow and reporting outputs.
Interpretation

Layered responsibilities

This view distinguishes infrastructure, coordination, workloads, and outward-facing surfaces. Execution coordination remains separate from modality-specific analysis and from deployment-facing interfaces.

Execution spine, mediated extensions, autonomous loop

The execution view traces staged transitions, persisted artifacts, the contract-validated extension runner, the experiment-selection loop, and the deployment-facing evaluation package.

Execution view (spine, extension runner, autonomy, and deployment surfaces):

Execution spine (Snakemake DAG, DVC artifacts, MLflow tracking): four stages with persisted handoffs. Ingestion (adapters + manifests) produces the dataset state; training (models + configs) produces the trained model; evaluation (metrics + comparisons) produces the evaluation set; reporting (evidence packages) produces the report bundle.
Extension runner: contract-validated attachment for calibration, aggregation, and evaluation-summary extensions.
Autonomous experimentation (ExperimentEngine): GP surrogate with EI / VoI acquisition, SPRT stopping (α = β = 0.05), DuckDB experiment catalog, validated throughput of 1,037 experiments per hour; feeds next-run selection back into training.
Deployment surface (evaluation package): FastAPI serving with RBAC and HMAC-signed tokens, a hash-chained audit trail (21 CFR Part 11 posture), and a customer-hosted path via Terraform + Helm.
Stage sequence

Staged execution

Runs proceed through ingestion, training, evaluation, and reporting in a fixed sequence, with persisted state at each handoff between stages.
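As a minimal sketch of that sequencing (the stage names follow the execution view, but `run_stage` and the payloads are illustrative, not the actual SNPTX code): each stage persists its output as a file, and the next stage reads only that persisted state.

```python
import json
from pathlib import Path
from tempfile import mkdtemp

def run_stage(name, inputs, outdir, produce):
    """Run one stage: read persisted upstream artifacts, persist our own."""
    upstream = [json.loads(p.read_text()) for p in inputs]
    artifact = outdir / f"{name}.json"
    artifact.write_text(json.dumps(produce(upstream)))
    return artifact  # the handoff: downstream stages see only this file

out = Path(mkdtemp())

# Fixed sequence: ingestion -> training -> evaluation -> reporting.
dataset = run_stage("dataset_state", [], out,
                    lambda _: {"rows": 128, "manifest": "demo.csv"})
model   = run_stage("trained_model", [dataset], out,
                    lambda s: {"trained_on_rows": s[0]["rows"]})
evals   = run_stage("evaluation", [model], out,
                    lambda s: {"auroc": 0.91, "model": s[0]})
report  = run_stage("report_bundle", [evals], out,
                    lambda s: {"summary": s[0]})

print(json.loads(report.read_text())["summary"]["auroc"])  # 0.91
```

Because every handoff is a file on disk, any stage can be re-run or inspected in isolation, which is what makes the record auditable.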

Extension boundary

Mediated attachment

Comparative analysis, modality-specific methods, and summary layers attach through a contract-validated runner rather than becoming hidden logic inside the core DAG.
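A contract-validated runner of this kind can be sketched in a few lines. The names here (`run_extension`, `ContractError`) and the key-set contract are illustrative assumptions, not the SNPTX interface; the point is that an extension is checked against its declared inputs and outputs before and after it runs.

```python
from typing import Any, Callable

class ContractError(ValueError):
    """Raised when an extension violates its declared interface."""

def run_extension(required_inputs: set,
                  required_outputs: set,
                  extension: Callable[[dict], dict],
                  artifacts: dict) -> dict:
    """Validate inputs and outputs against the declared contract, then run."""
    missing = required_inputs - artifacts.keys()
    if missing:
        raise ContractError(f"missing artifacts: {sorted(missing)}")
    # The extension sees only the artifacts it declared, nothing else.
    result = extension({k: artifacts[k] for k in required_inputs})
    absent = required_outputs - result.keys()
    if absent:
        raise ContractError(f"extension omitted outputs: {sorted(absent)}")
    return result

# A calibration summary attaching at the evaluation boundary.
summary = run_extension(
    {"evaluation"}, {"calibration_bins"},
    lambda a: {"calibration_bins": len(a["evaluation"])},
    {"evaluation": [0.2, 0.5, 0.9], "trained_model": object()},
)
print(summary)  # {'calibration_bins': 3}
```

Passing the extension a filtered view of the artifacts is what keeps its logic out of the core DAG: it cannot silently depend on state it never declared.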

Optimization interface

Guided autonomy

The autonomous loop influences next-run selection and experiment prioritization, but it operates through declared interfaces and does not override the stage sequence or artifact boundaries.
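The execution view names a GP surrogate with EI acquisition. As a hedged sketch of how expected improvement could drive next-run selection (the posterior means and standard deviations are placeholder values, and `select_next_run` is an illustrative name, not an SNPTX API):

```python
import math

def norm_cdf(z: float) -> float:
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def norm_pdf(z: float) -> float:
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def expected_improvement(mu: float, sigma: float, best: float,
                         xi: float = 0.01) -> float:
    """EI of a candidate run given its GP posterior mean/std (maximizing)."""
    if sigma <= 0.0:
        return 0.0
    z = (mu - best - xi) / sigma
    return (mu - best - xi) * norm_cdf(z) + sigma * norm_pdf(z)

def select_next_run(posteriors: dict, best_observed: float) -> str:
    """Pick the run id with the highest acquisition value."""
    return max(posteriors,
               key=lambda r: expected_improvement(*posteriors[r], best_observed))

# A confident marginal gain loses to an uncertain candidate with upside.
choice = select_next_run({"run_a": (0.70, 0.01), "run_b": (0.68, 0.10)}, 0.69)
print(choice)  # run_b
```

The loop stays scoped because its only output is a ranking over candidate runs; executing the chosen run still goes through the same staged spine as every other run.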

Deployment interface

Explicit outputs

The deployment surface consumes an evaluation package with defined scope, rather than exposing the full experimental system as an unconstrained product surface.
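The execution view mentions a hash-chained audit trail on the deployment surface. One standard construction, sketched here with illustrative field names, links each entry to the hash of its predecessor so that editing any past entry invalidates every later one:

```python
import hashlib
import json

GENESIS = "0" * 64

def _entry_hash(event: dict, prev: str) -> str:
    body = json.dumps({"event": event, "prev": prev}, sort_keys=True)
    return hashlib.sha256(body.encode()).hexdigest()

def append_event(chain: list, event: dict) -> None:
    """Append an audit entry whose hash commits to the whole prior chain."""
    prev = chain[-1]["hash"] if chain else GENESIS
    chain.append({"event": event, "prev": prev,
                  "hash": _entry_hash(event, prev)})

def chain_intact(chain: list) -> bool:
    """Recompute every link; a rewritten past entry breaks all later hashes."""
    prev = GENESIS
    for entry in chain:
        if entry["prev"] != prev or entry["hash"] != _entry_hash(entry["event"], prev):
            return False
        prev = entry["hash"]
    return True

audit = []
append_event(audit, {"action": "serve_prediction", "user": "analyst_1"})
append_event(audit, {"action": "export_report", "user": "analyst_1"})
print(chain_intact(audit))  # True
audit[0]["event"]["user"] = "someone_else"  # tamper with history
print(chain_intact(audit))  # False
```

This append-only property is what makes the audit trail suitable evidence for a regulated posture: the log can show not only what happened, but that the record of it was never rewritten.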

Why this structure matters

The separation of layers and interfaces is intended to preserve a clear execution record while allowing analytical extension, experiment selection, and pilot delivery to evolve at declared boundaries.

For researchers

Execution record is visible

Artifact handoffs, manifests, and stage boundaries create an inspectable record of what was run, what was produced, and where downstream interpretation begins.
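A manifest of content hashes is one simple way to make a stage handoff inspectable after the fact; `capture_manifest` is an illustrative helper, not the SNPTX manifest format.

```python
import hashlib
import json
from pathlib import Path
from tempfile import mkdtemp

def capture_manifest(stage_dir: Path) -> dict:
    """Record each artifact's size and content hash at a stage boundary."""
    return {p.name: {"bytes": p.stat().st_size,
                     "sha256": hashlib.sha256(p.read_bytes()).hexdigest()}
            for p in sorted(stage_dir.iterdir()) if p.is_file()}

stage = Path(mkdtemp())
(stage / "metrics.json").write_text(json.dumps({"auroc": 0.91}))
manifest = capture_manifest(stage)

# Later, anyone can check whether the stage output still matches the record.
print(capture_manifest(stage) == manifest)  # True
(stage / "metrics.json").write_text(json.dumps({"auroc": 0.95}))
print(capture_manifest(stage) == manifest)  # False
```

A manifest captured at each boundary answers the researcher's questions directly: what was produced, by which stage, and whether it has changed since.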

For extension work

Analysis stays modular

New diagnostics, modality-specific methods, and comparative logic can be attached at declared interfaces without destabilizing the execution core.

For autonomy

Optimization stays scoped

Experiment selection can be sophisticated while still operating on persisted records, declared interfaces, and explicit stopping logic.
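The execution view cites SPRT stopping with α = β = 0.05. A sketch of Wald's sequential probability ratio test over Bernoulli run outcomes shows what "explicit stopping logic" means in practice; the hypothesized success rates `p0` and `p1` are placeholder values, not SNPTX defaults.

```python
import math

def sprt(outcomes, p0: float = 0.5, p1: float = 0.7,
         alpha: float = 0.05, beta: float = 0.05) -> str:
    """Wald's SPRT: stop as soon as the cumulative log-likelihood ratio
    clears either decision boundary, otherwise keep sampling."""
    upper = math.log((1 - beta) / alpha)   # accept H1 (improvement)
    lower = math.log(beta / (1 - alpha))   # accept H0 (no improvement)
    llr = 0.0
    for success in outcomes:
        llr += math.log(p1 / p0) if success else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept_h1"
        if llr <= lower:
            return "accept_h0"
    return "continue"

print(sprt([1] * 12))      # accept_h1: strong evidence, stop early
print(sprt([0] * 12))      # accept_h0
print(sprt([1, 0, 1, 0]))  # continue: not enough evidence yet
```

Because the decision is a pure function of persisted outcomes, the stopping rule itself leaves an auditable trace rather than an opaque judgment call.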

For pilots

Deployment scope is defined

Research-facing delivery is scoped to a single surface with a specified workflow, a defined evidence package, and an auditable path into operational use.

Design rationale

The architecture keeps orchestration, analytical extension, and deployment scope explicit. That separation supports a clear execution record and auditability in the execution core while preserving flexibility at controlled interfaces.