Complete Data Flow
graph TB
Start([CLI: orion run]) --> LoadConfig[Load catalog.yml]
LoadConfig --> CreateContext[Create OrionContext]
CreateContext --> LoadPipeline[Load Pipeline via PipelineBuilder]
LoadPipeline --> InitRunner[PipelineRunner]
InitRunner --> UseCase[RunPipelineUseCase.execute]
UseCase --> Loop{Iterate Nodes}
Loop --> PrepareInputs{Prepare Inputs}
PrepareInputs -->|In Memory| MemCheck{Present in data?}
PrepareInputs -->|From Catalog| CatalogLoad[Catalog.load]
MemCheck -->|Yes| UseMemory[Use in-memory value]
MemCheck -->|No| CatalogLoad
CatalogLoad --> ExecuteNode[Execute Node Function]
UseMemory --> ExecuteNode
ExecuteNode --> StoreMemory[Store in data dict]
StoreMemory --> CheckCatalog{Output exists in catalog?}
CheckCatalog -->|Yes| AutoSave[Auto-save via Catalog]
CheckCatalog -->|No| NextNode{Next Node?}
AutoSave --> NextNode
NextNode -->|Yes| Loop
NextNode -->|No| ReturnResults[Return data dict]
ReturnResults --> End([End])
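Reading the flow as code may make the loop easier to follow. The sketch below mirrors the diagram with a dict-backed stand-in for the Data Catalog; the Node and DictCatalog shapes and the execute signature are simplified assumptions for illustration, not Orion's actual API.

```python
# Minimal sketch of the run loop above, assuming simplified Node and Catalog
# shapes; Orion's real RunPipelineUseCase may differ.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Node:
    func: Callable            # node function to execute
    inputs: list[str]         # dataset names the node consumes
    outputs: list[str]        # dataset names the node produces

class DictCatalog:
    """Toy stand-in for the Data Catalog, backed by a plain dict."""
    def __init__(self, entries: dict):
        self._entries = entries
    def exists(self, name: str) -> bool:
        return name in self._entries
    def load(self, name: str):
        return self._entries[name]
    def save(self, name: str, value) -> None:
        self._entries[name] = value

class RunPipelineUseCase:
    def __init__(self, catalog: DictCatalog):
        self.catalog = catalog

    def execute(self, nodes: list[Node]) -> dict:
        data: dict = {}                                  # in-memory results
        for node in nodes:                               # iterate nodes
            args = [
                data[name] if name in data               # reuse in-memory value
                else self.catalog.load(name)             # otherwise Catalog.load
                for name in node.inputs
            ]
            results = node.func(*args)                   # execute node function
            if len(node.outputs) == 1:
                results = (results,)
            for name, value in zip(node.outputs, results):
                data[name] = value                       # store in data dict
                if self.catalog.exists(name):            # output in catalog?
                    self.catalog.save(name, value)       # auto-save via Catalog
        return data                                      # return data dict

# Usage with one node: "raw" is loaded from the catalog, "clean" is stored
# in memory and auto-saved because it is declared in the catalog.
catalog = DictCatalog({"raw": [1, 2, 3], "clean": None})
result = RunPipelineUseCase(catalog).execute(
    [Node(func=lambda xs: [x * 2 for x in xs], inputs=["raw"], outputs=["clean"])]
)
print(result)  # {'clean': [2, 4, 6]}
```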
Layer Architecture
graph TB
subgraph "APPLICATION LAYER"
CLI[CLI Commands]
Runner[Pipeline Runner]
Builder[Pipeline Builder]
end
subgraph "CORE LAYER"
Pipeline[Pipeline Entity]
Node[Node Entity]
Interface[Interfaces]
UseCase[RunPipelineUseCase]
end
subgraph "INFRASTRUCTURE LAYER"
LocalConn[LocalCSV Connector]
DBConn[Databricks Connector]
Catalog[Data Catalog]
Logger[Console Logger]
Context[Orion Context]
end
CLI --> Runner
Runner --> UseCase
Builder --> Pipeline
UseCase --> Pipeline
Pipeline --> Node
Node --> Interface
Node --> Context
Context --> Catalog
Context --> Logger
Catalog --> LocalConn
Catalog --> DBConn
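The arrows encode the dependency rule: application code drives the core, and the core only sees interfaces that the infrastructure layer implements. Below is a hedged sketch of that wiring; the Connector and Logger interfaces and their method names are assumed for illustration and may not match Orion's real abstractions.

```python
# Sketch of the layering above: core owns abstractions, infrastructure
# implements them, and OrionContext hands them to the application layer.
from abc import ABC, abstractmethod
from dataclasses import dataclass

# Core layer: interfaces only, no knowledge of CSV files or Databricks.
class Connector(ABC):
    @abstractmethod
    def load(self, name: str): ...
    @abstractmethod
    def save(self, name: str, data) -> None: ...

class Logger(ABC):
    @abstractmethod
    def info(self, message: str) -> None: ...

# Infrastructure layer: concrete implementations of the core interfaces.
class LocalCSVConnector(Connector):
    def __init__(self, base_path: str):
        self.base_path = base_path
    def load(self, name: str):
        import csv
        with open(f"{self.base_path}/{name}.csv", newline="") as f:
            return list(csv.DictReader(f))
    def save(self, name: str, data) -> None:
        import csv
        with open(f"{self.base_path}/{name}.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=list(data[0].keys()))
            writer.writeheader()
            writer.writerows(data)

class ConsoleLogger(Logger):
    def info(self, message: str) -> None:
        print(f"[orion] {message}")

class Catalog:
    """Routes dataset names to the connector declared for them in catalog.yml."""
    def __init__(self, connectors: dict[str, Connector]):
        self._connectors = connectors
    def exists(self, name: str) -> bool:
        return name in self._connectors
    def load(self, name: str):
        return self._connectors[name].load(name)
    def save(self, name: str, data) -> None:
        self._connectors[name].save(name, data)

@dataclass
class OrionContext:
    catalog: Catalog        # built from catalog.yml at startup
    logger: Logger          # injected so nodes never construct their own
```

In this arrangement, swapping LocalCSVConnector for a Databricks connector only touches the infrastructure layer; Pipeline and Node keep depending on the Connector interface alone.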