Analytics Platform Engineer Associate

Remote · Full-time
Summary

Huron is redefining what a global consulting organization can be and is seeking an Analytics Platform Engineer Associate to develop and maintain analytics infrastructure and deployment pipelines. This role focuses on implementing platform capabilities and supporting infrastructure development using cloud and data engineering technologies.

Responsibilities

• Develops and maintains analytics infrastructure components and deployment pipelines
• Implements infrastructure-as-code solutions following established patterns
• Supports automated deployment processes for analytics products and data pipelines
• Contributes to platform scalability and reliability improvements
• Assists in ensuring platform performance meets team requirements
• Builds and maintains ETL/ELT pipelines for data ingestion, transformation, and loading
• Develops data integration solutions connecting source systems to analytics platforms
• Implements data quality and validation processes within pipeline workflows
• Supports monitoring and alerting for data pipeline health and performance
• Assists in optimizing pipeline performance and resource utilization
• Contributes to the development of self-service tools and platform capabilities
• Helps create standardized templates and patterns for analytics use cases
• Supports API and service development for platform integration
• Implements version control, testing, and deployment automation for platform components
• Assists in maintaining platform documentation and usage guidelines
• Partners with Product Owners, Data Scientists, and Analytics teams to understand platform needs
• Supports teams in adopting and leveraging platform tools and infrastructure
• Provides technical guidance on platform capabilities under supervision
• Gathers feedback from platform users to inform improvements
• Participates in agile ceremonies and contributes to platform roadmap discussions
• Implements cloud-based analytics infrastructure using modern cloud platforms
• Applies DevOps and DataOps principles to platform development
• Supports CI/CD pipelines for analytics infrastructure and data workflows
• Implements security, access control, and compliance measures in platform components
• Assists in monitoring platform performance and optimizing resource usage
• Follows platform engineering standards and best practices
• Participates in code reviews and incorporates feedback
• Implements testing for platform components and data pipelines
• Documents technical designs and operational procedures
• Contributes to continuous improvement of platform capabilities
Qualifications

Must Haves

• 2-4+ years of experience in platform engineering, software engineering, or data engineering
• Proficiency in Python, SQL, and/or other relevant programming languages
• Experience building cloud-based data infrastructure (AWS, Azure, or GCP)
• Knowledge of ETL/ELT tools and frameworks (e.g., Airflow, dbt, Databricks, or similar)
• Understanding of infrastructure-as-code tools (Terraform, CloudFormation, or similar)
• Familiarity with CI/CD tools and practices (Jenkins, GitLab CI, GitHub Actions, or similar)
• Knowledge of data warehousing concepts and analytics architectures
• Good problem-solving skills and the ability to design technical solutions
• Effective communication and collaboration abilities
• Self-motivated, with the ability to work independently and as part of a team

Nice to Haves

• Bachelor's degree in Computer Science, Engineering, or a related technical discipline
• Experience with containerization and orchestration (Docker, Kubernetes)
• Familiarity with data lakehouse architectures and technologies
• Understanding of BI platforms and their deployment requirements (Tableau, Power BI, Looker)
• Exposure to streaming data platforms (Kafka, Kinesis, or similar)
• Knowledge of data governance, security, and compliance requirements
• Experience in agile software development practices
• Certifications in cloud platforms or relevant technologies

Benefits

• Medical, dental, and vision coverage
• Other wellness programs