Bringing Alloy and Ember to Snowflake: DataForge Expands to a New Ecosystem
Earlier this week, we introduced the Databricks implementation of Alloy and Ember in DataForge 10.0. Today, we are expanding that same architecture to Snowflake. DataForge now brings structured refinement, metadata-driven execution, and predictable processing to a second major cloud data platform.
Snowflake users can now run the Alloy Architecture natively inside Snowflake with table-backed refinement, metadata-driven change detection, enrichment, and merging. All stages are powered by Ember’s structured metadata model. This provides Snowflake customers with a consistent, governed refinement flow without relying on hidden logic or custom layers.
Alloy, Now Running Natively on Snowflake
Alloy’s five-layer processing flow of ORE, MINERAL, ALLOY, INGOT, and PRODUCT is now implemented as Snowflake tables and views. This allows refinement to occur directly in the warehouse using Snowflake’s native capabilities. Each stage is visible, queryable, and aligned with Snowflake’s execution patterns.
Snowflake teams benefit from a consistent refinement model that is easy to inspect, audit, and govern. This implementation preserves Alloy’s structured behavior while integrating smoothly with Snowflake’s SQL-first environment and performance model.
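To make "visible and queryable" concrete, here is a minimal sketch of how the five stage objects might be addressed once they exist as Snowflake tables and views. The database, schema, and naming convention are illustrative assumptions, not DataForge's actual Snowflake layout.

```python
# Illustrative sketch only: the naming convention below is an assumption,
# not DataForge's real Snowflake schema.
STAGES = ["ORE", "MINERAL", "ALLOY", "INGOT", "PRODUCT"]

def stage_object_name(domain: str, stage: str) -> str:
    """Build a hypothetical fully qualified name for a stage table or view."""
    if stage not in STAGES:
        raise ValueError(f"unknown stage: {stage}")
    return f"DATAFORGE.{domain.upper()}.{stage}"

# Because each stage is an ordinary table or view, it can be inspected
# with plain SQL, e.g. for auditing row counts at every layer:
for stage in STAGES:
    print(f"SELECT COUNT(*) FROM {stage_object_name('orders', stage)};")
```

The point is only that every layer is an addressable warehouse object, so standard Snowflake tooling (SQL, access policies, query history) applies to each stage.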
Powered by Ember: Structured Metadata for Snowflake
Ember acts as the definition layer behind Alloy on Snowflake. Ember stores explicit metadata that describes:
how sources are interpreted
how incremental change detection operates
how attributes relate across domains
how enrichment logic is applied
how merge behavior functions during refresh
how outputs are shaped and delivered
These definitions map directly into Snowflake’s table structures and allow Alloy to execute deterministically across domains.
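As a rough sketch of what "explicit metadata" could look like, the hypothetical definition below covers several of the concerns listed above (keys for change detection, enrichment expressions, merge behavior). The field names and shape are assumptions for illustration; Ember's actual schema is not shown in this post.

```python
from dataclasses import dataclass, field

# Hypothetical shape of an Ember source definition; all field names here
# are illustrative assumptions, not Ember's real metadata model.
@dataclass
class SourceDefinition:
    name: str
    key_columns: list            # drives incremental change detection
    change_column: str           # e.g. a last-modified timestamp
    enrichments: dict = field(default_factory=dict)  # attribute -> expression
    merge_strategy: str = "upsert"                   # behavior during refresh

orders = SourceDefinition(
    name="orders",
    key_columns=["order_id"],
    change_column="updated_at",
    enrichments={"order_total_usd": "amount * fx_rate"},
)
```

Because the definition is data rather than code, it can be stored, versioned, and translated deterministically into Snowflake table structures, which is the property the paragraph above relies on.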
Snowflake Local Agent Support
DataForge 10.0 introduces Snowflake Local Agent support. This enables ingestion, metadata synchronization, and connectivity to run inside a customer-controlled network while refinement runs inside Snowflake. It provides a hybrid option for organizations with secure networking or on-premises requirements.
Snowpark SDK Availability
DataForge 10.0 also includes a Snowpark-based SDK for Snowflake developers. This allows custom ingestion logic, connectors, and extensions to be built using Snowflake-native programming patterns while still participating in Alloy’s structured refinement model.
Incremental Processing Built Into the Architecture
Incremental behavior is implemented the same way on Snowflake as on Databricks. MINERAL isolates new or changed records. ALLOY enriches this reduced dataset while it is still small, so enrichment cost tracks the volume of change rather than the size of the full table. INGOT then merges the updates back into the complete dataset.
Because Ember defines attribute behavior at each stage, Alloy can push logic earlier in the process, reduce warehouse compute consumption, and eliminate the need for custom orchestration. Incremental workloads become predictable, efficient, and consistent across domains.
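The three-step pattern above can be simulated in a few lines of plain Python. The table contents and the enrichment rule are illustrative assumptions; in the actual product these steps run as Snowflake tables driven by Ember metadata.

```python
# Pure-Python simulation of the incremental flow: detect, enrich, merge.
# Data and the enrichment rule (a 10% uplift) are stand-ins for illustration.

full = {1: {"id": 1, "amount": 10.0, "total": 11.0},
        2: {"id": 2, "amount": 20.0, "total": 22.0}}

incoming = [{"id": 2, "amount": 25.0},   # changed record
            {"id": 3, "amount": 30.0}]   # new record

# MINERAL: isolate only new or changed records.
changed = [r for r in incoming
           if r["id"] not in full or full[r["id"]]["amount"] != r["amount"]]

# ALLOY: enrich the reduced set, before it grows to full-table scale.
enriched = [{**r, "total": round(r["amount"] * 1.1, 2)} for r in changed]

# INGOT: merge the enriched updates back into the complete dataset.
for r in enriched:
    full[r["id"]] = r
```

Note that enrichment ran over two rows, not the whole table; this is the compute saving the paragraph above describes, and in Snowflake the INGOT step would correspond to a warehouse-side merge over only the changed keys.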
A Familiar Experience for Existing DataForge Customers
Teams already using DataForge will find that the Snowflake experience matches their existing workflow. The development model is identical:
the same declarative configurations
the same attribute-level refinement model
the same Alloy flow across five stages
the same operational experience in Talos and the DataForge UI
Snowflake becomes simply another execution environment, rather than a separate implementation.
Performance and Operational Optimization for Snowflake
DataForge 10.0 includes performance optimizations specific to Snowflake environments:
improved ingestion paths optimized for Snowflake’s loading patterns
efficient merge logic in INGOT using Snowflake’s micro-partitioning
improved Data Profile extraction aligned with Snowflake’s INFORMATION_SCHEMA
reduced warehouse consumption through Alloy’s incremental design
improved agent-level performance that minimizes round-trips to Snowflake
These enhancements provide faster and more cost-effective refinement cycles.
A Unified Multi-Platform Architecture
With Snowflake support, Alloy and Ember now operate consistently across Databricks and Snowflake. This allows organizations to adopt either platform without learning a new data engineering model or maintaining multiple pipeline designs.
Alloy provides the structure.
Ember provides the definitions.
Snowflake provides the warehouse execution environment.
Together, they offer a modern, predictable refinement experience without hidden layers or platform-specific rewrites.
What’s Next
The release of DataForge 10.0 completes our largest platform launch to date. Over the past week, we introduced the Alloy Architecture, a new structured refinement model, and Ember, the metadata catalog that defines and governs Alloy’s behavior. We then brought both technologies to Databricks with a fully table-native implementation and Unity Catalog alignment. Today’s announcement extends that same architecture to Snowflake, enabling Alloy and Ember to operate consistently across multiple cloud data platforms.
With the core architecture now fully deployed, our upcoming content will focus on deep-dive technical walkthroughs, transformation framework details, Ember metadata patterns, and best practices for Alloy across platforms. We will also be sharing new customer stories that highlight how organizations are adopting Alloy and Ember to simplify complex pipelines and accelerate time-to-value.
More updates will follow throughout the next few weeks and months.