Staff Data Engineer – Cloud Data Platform
This is a fully remote job, available from: North America, United States.

Please note that all emails from Calix will come from a @calix.com email address. If you receive a communication that you think may not be from Calix, please report it to us at [email protected].

This is a remote position that can be based anywhere in the United States.

Calix is leading a service provider transformation to deliver a differentiated subscriber experience around the Smart Home and Business, while monetizing the network using role-based cloud services, telemetry, analytics, automation, and the deployment of software-driven adaptive networks. As part of a high-performing global team, the right candidate will play a significant role as a Calix Cloud Data Engineer, involved in architecture design, implementation, and technical leadership in data ingestion, extraction, transformation, and analytics.

Responsibilities and Duties:
• Work closely with Cloud product owners to understand and analyze product requirements and provide feedback.
• Develop conceptual, logical, and physical models and metadata solutions.
• Design and manage an array of data design deliverables, including data models, data diagrams, data flows, and corresponding data dictionary documentation.
• Determine database structural requirements by analyzing client operations, applications, and data from existing systems.
• Provide technical leadership of software design to meet requirements of service stability, reliability, scalability, and security.
• Guide technical discussions within the engineering group and make technical recommendations; conduct design and code reviews with peer engineers.
• Guide testing architecture for large-scale data ingestion and transformations.
• Serve in a customer-facing engineering role, debugging and resolving field issues.

Qualifications:
• This role may be required to travel and attend face-to-face meetings and Calix-sponsored events.
• 10+ years of development experience in data modeling, master data management, and building ETL/data pipeline implementations.
• Cloud Platforms: Proficiency in both Google Cloud Platform (GCP) services (BigQuery, Dataflow, Dataproc, Pub/Sub/Kafka, Cloud Storage) and AWS.
• Big Data Technologies: Knowledge of big data processing frameworks such as Apache Spark and Flink.
• Programming Languages: Strong knowledge of SQL and at least one programming language (Python, Java, or Scala), plus dbt.
• Data Visualization: Experience with BI tools such as Google Data Studio, Looker, and ThoughtSpot, and with using BigQuery BI Engine for optimized reporting.
• Problem Solving: Strong analytical and troubleshooting skills, particularly in complex data scenarios.
• Collaboration: Ability to work effectively in a team environment and engage with cross-functional teams.
• Communication: Proficient in conveying complex technical concepts to stakeholders.
• Knowledge of data governance, security best practices, and compliance regulations in both GCP and AWS environments.
• Bachelor's degree in Computer Science, Information Technology, or a related field.

Location:
• Remote-based position located in the United States. #LI-Remote

The base pay range for this position varies based on geographic location. More information about the pay range specific to candidate location and other factors will be shared during the recruitment process. Individual pay is determined based on location of residence and multiple factors, including job-related knowledge, skills, and experience.
• San Francisco Bay Area: 156,400 - 265,700 USD annual
• All Other US Locations: 136,000 - 231,000 USD annual

As part of the total compensation package, this role may be eligible for a bonus. For information on our benefits, click here.