Data Solutions Architect (Financial Services)

Remote · Full-time
Description

• Architect and deliver end-to-end data solutions that power critical decision-making for the world's largest banks, insurers, and asset managers. You will design lakehouse platforms on Databricks that unify structured trading data, unstructured customer communications, and real-time market feeds into a single source of truth capable of sub-second analytics at petabyte scale.

• Serve as the primary technical authority during pre-sales pursuits, translating vague RFP requirements into crisp solution visions. You will run whiteboarding sessions with C-level stakeholders, build rapid prototypes in Databricks SQL and PySpark, and quantify ROI models that demonstrate how a modern lakehouse can reduce total cost of ownership by 40% while accelerating regulatory-reporting cycles from weeks to hours.

• Own the full delivery lifecycle, from initial discovery workshops through production cut-over, ensuring every architecture decision meets banking-grade security, compliance, and performance standards. You will map business capabilities to medallion-architecture zones (Bronze, Silver, Gold), define streaming ingestion patterns using Delta Live Tables, and implement Unity Catalog governance policies that satisfy Basel III, CCAR, and GDPR mandates without stifling self-service analytics.

• Lead cross-functional teams of 5–10 data engineers, data scientists, and cloud architects spread across North America, EMEA, and APAC. Provide hands-on mentorship in Spark performance tuning, Delta Lake optimization, and MLOps automation using MLflow; maintain burndown charts, risk registers, and client-satisfaction KPIs that keep multi-million-dollar programs on track and on budget.

• Continuously optimize mission-critical workloads that process over 50 TB of transactions, market data, and reference data daily. You will refactor legacy ETL pipelines into streaming Delta pipelines, reduce batch windows from 6 hours to 15 minutes, and eliminate Kafka lag spikes during market-open surges, directly impacting trading-desk P&L and regulatory-reporting deadlines.

• Establish enterprise data-governance guardrails that allow business analysts to explore sensitive datasets without compromising privacy or audit trails. Define data-quality SLAs, lineage policies, and role-based access controls that satisfy both internal risk committees and external regulators, while growing self-service analytics adoption from 20% to 80% of the user base within 12 months.

• Evaluate and integrate emerging technologies (Apache Iceberg, Snowflake native apps, Lakehouse Federation, real-time feature stores) through internal POCs and reference architectures. Package reusable accelerators (Terraform modules, dbt macros, MLflow project templates) that shorten future project ramp-up by 30% and reinforce SunnyData's reputation as the premier Databricks partner in financial services.

• Translate complex technical achievements into board-ready narratives. Build executive dashboards in Databricks SQL and Tableau that visualize data-platform ROI, cost-to-serve trends, and predictive capacity-planning metrics, guiding multi-year investment decisions and securing follow-on expansions worth $2–5M annually.

• Champion a culture of experimentation and continuous learning by running weekly architecture guilds, lunch-and-learn sessions, and quarterly hackathons. Publish best-practice blogs, speak at industry conferences, and mentor junior consultants, elevating the entire firm's technical bar and ensuring SunnyData remains two steps ahead of market demand.