Senior Data and Cloud Engineer, GFT
🇨🇦 RBC
Job Description
What is the opportunity?

Are you a talented, creative, and results-driven professional who thrives on delivering high-performing applications? Come join us!

Global Functions Technology (GFT) is part of RBC's Technology and Operations division. GFT's impact is far-reaching as we collaborate with partners from across the company to deliver innovative and transformative IT solutions. Our clients represent Risk, Finance, HR, CAO, Audit, Legal, Compliance, Financial Crime, Capital Markets, Personal and Commercial Banking, and Wealth Management. We also lead the development of digital tools and platforms to enhance collaboration.

The Retail Credit AI & Digital Transformation (ADT) Program aims to deliver AI-enhanced underwriting capabilities that will drive market-leading client acquisition and best-in-class depth of client relationships. This program will enable us to train and operationalize new ML and AI models rapidly, with improved deployment and maintenance patterns. The project focuses on advancing NextGen compute and engineering capabilities on public cloud and on-premises infrastructure, establishing data pipelines and feature stores with integrated data quality and governance for high-quality, traceable insights, and parameterizing the ML lifecycle to improve delivery speed with stronger controls, monitoring, and guardrails.

We value a positive attitude, willingness to learn, open communication, teamwork, and a commitment to clean, secure, and well-tested code.

What will you do?

- Develop a feature store with integrated data quality and governance to ensure high-quality, traceable ML insights.
- Design and implement pipelines for feature extraction, transformation, and storage using scalable cloud solutions.
- Ensure data consistency, lineage, and metadata management to support regulatory and governance needs.
- Collaborate with data scientists to standardize feature definitions and promote reusability across teams.
- Implement reusable pipelines and MLOps solutions to optimize lifecycle management of machine learning models and other quantitative algorithms.
- Apply end-to-end technical competency, including data analysis, data preprocessing, and feature engineering to prepare datasets for model training.
- Work alongside data scientists, quantitative analysts, software engineers, data engineers, and domain experts to gather requirements and design solutions.

What do you need to succeed?

Must have:

- A Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, Statistics, or a closely related field.
- Experience programming small to large-scale applications, with a focus on full-stack development.
- High-level expertise in programming languages such as Python and PySpark, with strong proficiency in data and ML frameworks such as Spark and Pandas.
- Experience with PySpark for data engineering/ETL pipelines.
- Expertise in DevOps practices and tools for CI/CD pipelines.
- Excellent problem-solving skills and analytical thinking.
- Strong communication skills.