Lead Data Engineer, GFT
🇨🇦 RBC
Job Description
What is the opportunity?

As Lead Data Engineer in the Data as a Service (DaaS) Team within Global Functions Technology, you will provide technical leadership and execute development deliverables for the reimagined Finance Data Platform. The platform, built on public cloud infrastructure, will serve as the central repository of finance-related datasets, with capabilities for acquiring, standardizing, enriching, and provisioning positional/trade data, sub-ledger and general-ledger trial balances, and reference data, along with capabilities supporting reconciliation, analytics, and reporting functions.

We are seeking a Data Engineer with extensive hands-on experience in designing and developing data platforms. This role requires strong data architecture and engineering skills, effective written and verbal communication, a strong work ethic, and the ability to multi-task effectively. You must also possess strong interpersonal, organizational, and problem-solving skills, along with a demonstrated sense of urgency in responding to changing priorities.

What will you do?

This role takes an end-to-end system view, from data sourcing, lineage, and transformation through storage, to support complex advanced analytics and data management and governance needs, and requires extensive collaboration with Business Architecture, System Architecture, business SMEs, and Data Stewards.

- Work closely with the data team and other stakeholders to understand their data requirements and assist in building data solutions that meet their needs.
- Design, develop, and support new and existing data pipelines, recommending improvements and modifications.
- Communicate strategies and processes around data modeling and architecture to cross-functional groups.
- Identify, design, and implement internal process improvements.
- Apply your expertise in data transformation techniques to enrich raw data, making it more accessible and valuable for analytics and reporting.
- Build and implement ETL frameworks to improve code quality and reliability.
- Develop scripts and programs for converting various types of data into usable formats.
- Ensure the accuracy and consistency of data processing, results, and reporting.

What do you need to succeed?

Must Have:

- Undergraduate degree/diploma in computer science/engineering or a related technology discipline.
- 7 years of experience in data engineering, with at least 3 years of hands-on experience in system integration, data engineering, and cloud architecture, and with the tools and technologies listed below:
  - Big-Data/Lakehouse platforms such as Cloudera Data Platform, Microsoft Azure, AWS
  - Data transformation tools/technologies/platforms such as Data Build Tool (DBT), Databricks, Snowflake
  - Orchestration: Apache Airflow / Stonebranch UAC / Control-M
  - Cloud and containers: OpenShift (OCP) / Docker / Kubernetes
  - API development: FastAPI or another API framework
  - Security: LDAP, Kerberos, OAuth 2.0, Microsoft Entra ID, HashiCorp
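To illustrate the data pipeline and orchestration work described under "What will you do?", the sketch below shows a minimal daily extract-transform-load DAG using Apache Airflow's TaskFlow API, one of the orchestration tools named above. The DAG name, record fields, and transformation logic are illustrative assumptions rather than details from the posting, and the example assumes Airflow 2.4 or later.

from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def finance_trial_balance_pipeline():
    # Illustrative pipeline: acquire, standardize, and provision
    # general-ledger trial-balance records.

    @task
    def extract() -> list:
        # Placeholder: pull raw records from an upstream feed or object store.
        return [{"account": "1000", "balance": "125.50", "currency": "CAD"}]

    @task
    def transform(records: list) -> list:
        # Placeholder: standardize types and enrich each record.
        return [
            {**r, "balance": float(r["balance"]), "source": "gl_trial_balance"}
            for r in records
        ]

    @task
    def load(records: list) -> None:
        # Placeholder: provision curated records to the central repository.
        print(f"loaded {len(records)} records")

    load(transform(extract()))


finance_trial_balance_pipeline()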