Senior Data Platform Engineer

Remote · Full-time
Who We Are:
Alpaca is a US-headquartered self-clearing broker-dealer and brokerage infrastructure for stocks, ETFs, options, crypto, fixed income, 24/5 trading, and more. Our recent Series C funding round brought our total investment to over $170 million, fueling our ambitious vision. Through its subsidiaries, Alpaca is a licensed financial services company serving hundreds of financial institutions across 40 countries with our institutional-grade APIs, including broker-dealers, investment advisors, wealth managers, hedge funds, and crypto exchanges, totaling over 6 million brokerage accounts. Our global team is a diverse group of experienced engineers, traders, and brokerage professionals working to achieve our mission of opening financial services to everyone on the planet. We're deeply committed to open-source contributions and fostering a vibrant community, continuously enhancing our award-winning, developer-friendly API and the robust infrastructure behind it. Alpaca is proudly backed by top-tier global investors, including Portage Ventures, Spark Capital, Tribe Capital, Social Leverage, Horizons Ventures, Unbound, SBI Group, Derayah Financial, Elefund, and Y Combinator.

Our Team Members:
We're a dynamic team of 230+ globally distributed members who thrive working from our favorite places around the world, with teammates spanning the USA, Canada, Japan, Hungary, Nigeria, Brazil, the UK, and beyond! We're searching for passionate individuals eager to contribute to Alpaca's rapid growth. If you align with our core values of Stay Curious, Have Empathy, and Be Accountable, and are ready to make a significant impact, we encourage you to apply.

Your Role:
We are seeking a Senior Data Platform Engineer to design and develop the data management layer for our platform, ensuring it scales as we expand to larger customers and new jurisdictions. At Alpaca, data engineering encompasses financial transactions, customer data, API logs, system metrics, augmented data, and third-party systems that impact decision-making for both internal and external users. We process hundreds of millions of events daily, and this number grows as we onboard new customers. We prioritize open-source solutions in our data management approach, built on a Google Cloud Platform (GCP) foundation. This includes batch/stream ingestion, transformation, and consumption layers for BI, internal use, and external third-party sinks. We also oversee data experimentation, cataloging, and monitoring and alerting systems. Our team is 100% distributed and remote.

Responsibilities:
• Design and oversee key forward- and reverse-ETL patterns to deliver data to relevant stakeholders.
• Develop scalable patterns in the transformation layer to ensure repeatable integrations with BI tools across various business verticals.
• Expand and maintain the constantly evolving elements of the Alpaca Data Lakehouse architecture.
• Collaborate closely with sales, marketing, product, and operations teams to address key data flow needs.
• Operate the system and manage production issues in a timely manner.

Must-Haves:
• 7+ years of experience in data engineering, including 2+ years of building scalable, low-latency data platforms capable of handling >100M events/day.
• Proficiency in at least one programming language, with strong working knowledge of Python and SQL.
• Experience with cloud-native technologies like Docker, Kubernetes, and Helm.
• Strong hands-on experience with relational database systems and object-storage-based table formats such as Apache Iceberg.
• Strong hands-on experience with Google Cloud Platform and its data-related services (Composer, Dataproc, Datastream, etc.).
• Experience in building scalable transformation layers, preferably through formalized SQL models (e.g., dbt).
• Ability to work in a fast-paced environment and adapt solutions to changing business needs.
• Experience with ETL orchestration and integration frameworks like Apache Airflow and Airbyte.
• Production experience with streaming systems like Kafka.
• Exposure to infrastructure, DevOps, and Infrastructure as Code (IaC) tooling like Terraform.
• Deep knowledge of distributed systems, storage, transactions, and query processing using open-source distributed query engines like Trino (formerly PrestoSQL).

If you're passionate about data engineering and thrive in a dynamic startup environment, we'd love to hear from you!

How We Take Care of You:
• Competitive Salary & Stock Options
• Health Benefits
• New Hire Home-Office Setup: one-time USD 500
• Monthly Stipend: USD 150 per month via a Brex Card

Alpaca is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse workforce.