ETL designer with Google Cloud Platform_Remote
• Job ID: J49053
• Job Title: ETL designer with Google Cloud Platform_Remote
• Location: Dallas, TX
• Duration: 12 Months + Extension
• Hourly Rate: Depending on Experience (DOE)
• Work Authorization: US Citizen, Green Card, OPT-EAD, CPT, H-1B, H4-EAD, L2-EAD, GC-EAD
• Client: To Be Discussed Later
• Employment Type: W-2, 1099, C2C

Job Title: ETL designer with Google Cloud Platform (Telecom preferred)
Location: Remote/Dallas preferred; must work in the CT time zone.
Experience: years
Mail:

Job Description:
We are seeking a highly experienced ETL designer to join our team. The ideal candidate will have 12-15 years of experience designing and implementing data solutions, with a strong focus on cloud data warehousing and real-time data processing. This role requires advanced knowledge of a range of data technologies and the ability to translate business requirements into scalable, efficient data architectures.

Key Responsibilities:
• Design, develop, and maintain robust ETL pipelines on Google Cloud Platform using tools such as Dataflow, BigQuery, Cloud Composer (Apache Airflow), and Pub/Sub
• Optimize data ingestion, transformation, and loading processes for high-volume, real-time, and batch data processing
• Ensure data integrity, quality, and security throughout the ETL lifecycle
• Collaborate with data engineers, analysts, and business teams to understand data needs and translate them into efficient ETL solutions
• Implement best practices for performance tuning, error handling, and monitoring of ETL jobs
• Work with structured and unstructured data sources, integrating API-based, event-driven, and batch processing workflows
• Automate and document ETL processes to enhance maintainability and scalability
• Use Git for version control and deployment of ETL workflows, ensuring smooth CI/CD integration

Key Skills:
• 12+ years of experience in ETL design and development
• Proficiency in SQL, Python, and Apache Beam for data processing with Dataflow and Dataproc
• Strong experience with advanced DML operations and query optimization
• Strong expertise in Google Cloud Platform services (BigQuery, Dataflow, Cloud Storage, Pub/Sub, Cloud Composer)
• Experience with data modeling, data warehousing concepts, and data lakes
• Hands-on experience with orchestration tools such as Apache Airflow or Cloud Composer, including DAG design, error handling, and monitoring
• Knowledge of performance tuning, data partitioning, and optimization techniques
• Experience with real-time streaming ETL and event-driven architectures is a plus
• Familiarity with Git for version control and deployment of ETL pipelines
• Strong problem-solving skills and the ability to work in an agile environment

Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
• Proven experience in ETL design and implementation for data warehouse (DWH) systems
• Strong problem-solving skills and the ability to work in a fast-paced environment
• Excellent communication and collaboration skills

Apply to this job