Analytics Engineer III
About the position
Shaw Industries is seeking an experienced Analytics Engineer III to join our Residential Enablement and Insights team in Dalton, GA. This role sits at the intersection of business strategy, advanced analytics, and data engineering, with a strong emphasis on data modeling, Power BI, and Databricks. As an Analytics Engineer, you will design and implement scalable, trusted data models and visualization tools that drive strategic decision-making across our Residential Sales and Marketing organizations. The ideal candidate will blend technical depth with business insight, playing a critical role in enabling data self-service and modern analytics practices across Shaw. This position operates in a hybrid work environment.

Responsibilities
• Partner with senior business leaders (DVPs, VPs, and Directors) to gather requirements and translate them into scalable data models and analytic frameworks.
• Design, develop, and maintain semantic models, star schemas, and data marts to support self-service analytics.
• Build and deploy advanced analytic pipelines using Databricks, transforming large and complex data sets into analytics-ready assets.
• Create clear, actionable visualizations and dashboards in Power BI, enabling users to uncover insights and make informed decisions.
• Collaborate with Data Engineering to ensure data integrity, optimize pipeline performance, and enforce data governance standards.
• Provide subject matter expertise in Power BI development, data modeling practices, and dashboard UX.
• Coach and mentor analysts and business users in best practices for data visualization, model reuse, and analytic thinking.
• Contribute to the development of enterprise data standards, catalogs, and metadata documentation.
• Stay current with evolving technologies in analytics, cloud computing, and data engineering, especially Databricks, Power BI, and modern data architectures.

Requirements
• Advanced knowledge of SQL and experience designing scalable logical and physical data models.
• Expert-level proficiency in Power BI (DAX, Power Query, dataflows, custom visuals, row-level security).
• Proven experience with Databricks (preferably on Azure) to process and model data using notebooks and Delta Lake.
• Strong understanding of data pipeline orchestration, ETL, and data lake architectures (e.g., Azure Data Lake).
• Familiarity with Python or R for advanced analytics and automation tasks.
• Experience with version control (e.g., Git) and implementing DataOps or CI/CD practices.

Nice-to-haves
• Interest in continuous learning, especially in machine learning, AI, and advanced analytics.
• Passion for creating clean, reusable, and well-documented data assets.

Benefits
• Shaw Industries is an equal opportunity employer as to all protected groups, including protected veterans and individuals with disabilities.