DevOps Data Engineer
RBC
Job Description
What is the opportunity?
Designs, develops, and optimizes data solutions, ensuring the delivery of high-quality data assets, reporting, and business intelligence to support organizational objectives. Manages data engineering projects or assignments of increasing complexity, scope, and impact, applying professional judgment and expertise.

What will you do?
- As part of the Data Engineering team, own an end-to-end view spanning data sourcing, lineage, quality, transformation, and storage.
- Execute development and integration activities (planning, execution, testing, deployment, and post-implementation support) for existing and new data and visualization technologies (e.g., Streamlit).
- Prepare data for use in diagnostic/descriptive reporting and data science work.
- Build container-based solutions, pipelines, APIs/microservices, and analytics portals for heterogeneous data sources on internal data lakes or cloud platforms.
- Collaborate with data scientists, process engineers, and business stakeholders to develop data pipelines, and assist with prescriptive and predictive analytics through consolidated data.
- Identify and resolve data movement, transformation, and processing bottlenecks.
- Research emerging data and visualization technology trends and best practices, and propose solutions to technology and business partners.

What do you need to succeed?
Must have:
- Bachelor's degree in Computer Science, Engineering, or equivalent
- Experience building end-to-end data pipelines
- Demonstrated leadership skills; ability to identify and foresee issues and to proactively recommend and build resilient solutions
- Advanced SQL knowledge and experience working with relational databases
- Experience developing scalable, configurable applications using Python and application frameworks
- Knowledge of relevant security considerations for applications in the cloud
- Knowledge of VMs and MLOps platforms such as WandB and AWS SageMaker
- Knowledge of AWS
- 5 years of hands-on experience in the following key areas:
  - Data engineering solutions: Logstash, Python, SQL Server, Kafka, Hadoop, Spark, Trino
  - APIs: Streamlit, Flask, Node.js, Django, and microservices technologies
  - Automation/DevOps: GitHub Actions, Airflow, UCD, Selenium, and similar technologies
  - Cloud technologies: OpenShift, Docker, Kubernetes
  - Git and code version management

Nice to have:
- Security frameworks: LDAP, Kerberos, OAuth 2.0, Vault integration
- Visualization tools: Dash, Plotly, Tableau
- Experience working in an agile environment
- Master's degree in Computer Science, Engineering, or equivalent
- Supervised and unsupervised machine learning, natural language processing

What's in it for you?
We thrive on the challenge to be our best, progressive thinking to keep growing, and working together to deliver trusted advice to help our clients thrive and communities prosper. We care about each other, reaching our potential, making a difference to our communities, and achieving success that is mutual. A comprehensive Total Rewards P