Remote Cloud Data Warehouse Architect

Remote Full-time
This is a fully remote job; the offer is available from: Pennsylvania (USA).

SUMMARY

The Cloud Data Warehouse Architect will design and deliver the next-generation enterprise analytics platform. This position is highly technical and will focus on building a cloud-native, SAP-integrated, AI-ready architecture that supports analytics, reporting, and advanced machine learning at scale. The architect will modernize the current BI and data warehouse environment, anchored today in IBM Netezza, Cognos, and Tableau, into a cloud-based architecture. This role requires deep technical expertise in data modeling, cloud-native design, and hybrid architectures that bridge legacy on-prem systems with cloud-first capabilities.

The Data Science & Insights group is at the center of analytics transformation. Our mission is to:
• Consolidate legacy BI systems (Netezza, Cognos) into a modern cloud architecture.
• Support the SAP S/4HANA migration with tight integration into the future state.
• Deliver governed, high-performance datasets for self-service analytics in Tableau, Power BI, and SAC.
• Enable AI/ML use cases through Databricks and Azure ML.
• Extend analytics capabilities to our partners and vendors via embedded reporting.

This is an opportunity to be the hands-on architect shaping the future-state data strategy, working in a fast-paced, hybrid cloud environment that balances innovation with enterprise stability.

ESSENTIAL DUTIES AND RESPONSIBILITIES

Architectural Design & Modernization
• Lead the design of a cloud data warehouse and data lakehouse architecture capable of ingesting large-scale transactional and operational data.
• Define integration strategies for core systems.
• Develop a reference architecture that leverages Azure Data Lake Storage (ADLS) and Databricks Delta Lake as core components.
• Implement semantic modeling to unify reporting across Tableau, Power BI, and SAP Analytics Cloud (SAC).

Data Engineering & Performance
• Oversee ingestion pipelines for batch (Netezza extracts, flat files, nightly jobs) and near real-time (APIs, streaming) data sources.
• Optimize query performance through partitioning, clustering, caching, and Delta Lake / warehouse design (an illustrative sketch of this pattern appears after the qualifications below).
• Establish reusable ETL/ELT patterns across Databricks notebooks, SQL-based orchestration, and integration with ActiveBatch scheduling.

Governance, Security & Compliance
• Define and enforce data governance standards (naming conventions, metadata, lineage, data quality).
• Partner with InfoSec on identity management (Azure AD), encryption, and RBAC/ABAC models.
• Implement governance tooling such as Azure Purview, SAP metadata catalogs, Databricks Unity Catalog, and Glasswing.

Collaboration & Enablement
• Partner with data engineers and visualization teams to deliver governed, high-performance datasets consumable in Tableau, Power BI, SAC, and SAP Fiori.
• Serve as the technical SME for architects, engineers, and analysts, ensuring alignment with best practices in cloud-native data warehouse design.
• Drive knowledge transfer from legacy platforms (Netezza, Cognos) into the new ecosystem.

EDUCATION and/or EXPERIENCE

Education
• Bachelor's degree in Computer Science, Engineering, or a related field.

Experience
• 7+ years in data engineering, data warehouse architecture, or cloud data architecture.
• Expertise in Azure (ADLS, Synapse, Purview, Databricks, networking, security).
• Strong proficiency in Databricks (Delta Lake, PySpark, SQL) and/or Snowflake (warehouse design, scaling, security).
• Proven experience in data modeling (3NF, star schema, semantic layers); a star-schema sketch also appears after the qualifications below.
• Deep SQL expertise across both cloud and traditional RDBMS (Netezza, SQL Server, Progress OpenEdge).
• Understanding of SAP S/4HANA integration and familiarity with SAP Datasphere.

Preferred
• Prior experience migrating from on-prem Netezza or other MPP systems to cloud-native platforms.
• Familiarity with Cognos to Tableau/Power BI migrations and dashboard optimization.
• Hands-on experience with SAP Analytics Cloud (SAC) and embedded analytics.
• Knowledge of machine learning workflows and integration with Databricks MLflow or Azure ML.
• Strong knowledge of data governance frameworks and tooling (Purview, Unity Catalog, SAC governance).

This offer from "EDI Staffing, an EDI Specialists Company" has been enriched by Jobgether.com and received a 72% flex score.
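To make the Data Engineering & Performance responsibilities above more concrete, the following is a minimal, illustrative PySpark sketch of one common Databricks pattern: landing a nightly batch extract into a partitioned Delta table and then compacting it for BI queries. The storage path, database, table, and column names (analytics.sales_transactions, order_date, region) are hypothetical placeholders, not this employer's actual environment.

```python
# Minimal sketch of a reusable batch-ingestion pattern for Delta Lake.
# Assumes a Databricks (or delta-spark enabled) Spark session; all names
# below are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("nightly-sales-ingest")
    # Delta Lake extensions are preconfigured on Databricks; shown here
    # for completeness when running on a plain Spark cluster.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# 1. Ingest a nightly extract (e.g., a Netezza export landed as CSV in ADLS).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://landing@examplelake.dfs.core.windows.net/sales/daily/")
)

# 2. Light conformance: typed date column plus a load timestamp for lineage.
conformed = (
    raw.withColumn("order_date", F.to_date("order_date"))
       .withColumn("load_ts", F.current_timestamp())
)

# 3. Append to a Delta table partitioned by the date key.
spark.sql("CREATE DATABASE IF NOT EXISTS analytics")
(
    conformed.write
    .format("delta")
    .mode("append")
    .partitionBy("order_date")
    .saveAsTable("analytics.sales_transactions")
)

# 4. Periodic layout maintenance: compact small files and cluster on a
#    frequently filtered column to speed up Tableau / Power BI queries.
spark.sql("OPTIMIZE analytics.sales_transactions ZORDER BY (region)")
```

Partitioning by the date key keeps nightly reloads cheap and enables partition pruning, while OPTIMIZE with ZORDER (a Databricks Delta feature) co-locates data on a commonly filtered column for dashboard queries.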
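Likewise, the data modeling qualification (3NF, star schema, semantic layers) can be illustrated with a small hypothetical star schema: a conformed dimension and a fact table defined as Delta tables, plus the kind of join-and-aggregate query a Tableau, Power BI, or SAC dataset would issue. Table and column names here are assumptions for illustration only.

```python
# Hypothetical star-schema sketch in Spark SQL; assumes a Delta-enabled
# Spark session (as on Databricks). Names are illustrative placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("star-schema-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

spark.sql("CREATE DATABASE IF NOT EXISTS analytics")

# Conformed dimension: one row per customer, keyed by a surrogate key.
spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.dim_customer (
        customer_sk   BIGINT,
        customer_id   STRING,
        customer_name STRING,
        region        STRING
    ) USING DELTA
""")

# Fact table at order-line grain, referencing the dimension by surrogate key
# and partitioned by date for pruning.
spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.fact_sales (
        order_date   DATE,
        customer_sk  BIGINT,
        product_sk   BIGINT,
        quantity     INT,
        net_amount   DECIMAL(18,2)
    ) USING DELTA
    PARTITIONED BY (order_date)
""")

# A typical BI query: aggregate the fact, slice by dimension attributes.
spark.sql("""
    SELECT d.region, SUM(f.net_amount) AS revenue
    FROM analytics.fact_sales f
    JOIN analytics.dim_customer d ON f.customer_sk = d.customer_sk
    GROUP BY d.region
""").show()
```

A semantic layer in Tableau, Power BI, or SAC would typically sit on top of conformed fact and dimension tables like these rather than on raw ingested data.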
Apply Now
