Requisition ID : 78244
Join the Global Community of Scotiabankers to help customers become better off.
As Scotiabank’s engine of modernization, the PLATO platform enables technology teams to build software quickly and securely using modern practices.
PLATO is an integrated set of technical capabilities, services and processes that encapsulate critical enterprise functions through standardization, re-use and automation.
The PLATO team comprises engineers, problem solvers, agilists and creatives in roles such as Enterprise Platform Engineering and Architecture, Enterprise Data Services, Cloud Infrastructure and Architecture, Product Engineering, and Product Management.
Together, the team provides the platform that enables the Bank to deliver transformative experiences that help our 24 million customers become better off.
Interested in joining an agile team that’s driving change for our customers around the world? Watch our video
The data engineer designs, develops, tests, implements, and maintains complex (SQL) ELT functions, user-defined functions and complex queries for custom solutions with limited direction.
Leads and coordinates code / peer reviews of focused development work to ensure it aligns with the business and technical requirements
1) Data Pipeline Development
Analyze, define and document requirements for data pipelines on GCP including workflow, logical processes, hardware and operating system environment, interface with other systems, internal and external checks, controls, and outputs
Work with business / data analysts to perform the investigation, analysis and evaluation needed to determine business requirements
Design, develop and maintain ELT-specific code
Design reusable components, user defined functions
Troubleshoot production support issues post-deployment and propose solutions as required
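The pipeline duties above follow a classic extract-transform-load shape. As a minimal, hedged sketch (all function names, field names and sample data below are illustrative assumptions, not part of this role's actual codebase):

```python
# Minimal ELT sketch: extract raw rows, transform them with a reusable
# function, and load the results into a target store. In a real GCP
# pipeline, extract/load would talk to services such as GCS or BigQuery;
# here they are stubbed with in-memory data for illustration.

def extract():
    # Stand-in for reading rows from a source system.
    return [
        {"customer_id": "1", "balance": "100.50"},
        {"customer_id": "2", "balance": "-20.00"},
    ]

def transform(row):
    # Reusable transformation: cast types and derive a flag.
    balance = float(row["balance"])
    return {
        "customer_id": int(row["customer_id"]),
        "balance": balance,
        "overdrawn": balance < 0,
    }

def load(rows, target):
    # Stand-in for writing to a warehouse table.
    target.extend(rows)

def run_pipeline(target):
    load([transform(r) for r in extract()], target)
    return target
```

Keeping `transform` as a pure, row-level function is what makes it reusable and easy to unit-test independently of the source and target systems.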
2) Data Testing
Coordinate coding, testing, implementation, integration and documentation of the solution. Develop program specifications
Perform complex application programming activities. Code, test, debug, document, maintain, and modify complex application programs
Provide support in the implementation of processes within the QA and production environments
3) Code Development
Develop functions / APIs in Python or Java for use in ELT tools such as Cloud Data Fusion (CDF)
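A user-defined function of the kind this role builds for ELT tooling might look like the following Python sketch (the function name and validation rule are hypothetical examples, not a Cloud Data Fusion API):

```python
import re

def normalize_phone(raw):
    # Example user-defined function for an ELT pipeline: strip all
    # formatting from a phone number and return the bare digits,
    # or None when the value is missing or too short to be valid.
    digits = re.sub(r"\D", "", raw or "")
    return digits if len(digits) >= 10 else None
```

Small, side-effect-free functions like this can be registered with an ELT tool and reused across pipelines, which is the point of building shared UDF libraries.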
Possesses solid knowledge of design and analysis methodology and application development processes from both an industry and client perspective
Ability to translate business source-to-target mappings to relevant technical source-to-target mapping
Demonstrates good understanding of the Software Development Life Cycle
Proficiency in ETL tools, specifically Talend, DataStage or Informatica. Cloud Data Fusion experience is a huge plus.
Strong understanding of Cloud concepts and technologies (GCP, Azure or AWS)
Experience working with relational or columnar databases, specifically BigQuery or Cloud Spanner
Strong SQL skills
Experience and knowledge of data architecture and concepts of relational and dimensional databases
Experience with enterprise application architecture and enterprise integration patterns
Ability to implement reusable data-integration / ETL code in an enterprise data-warehouse environment
Previous experience working in an Agile development environment
Ability to work well in a challenging environment.
Strong troubleshooting skills
Excellent writing skills, oral communication skills, strong process skills, and leadership ability.
Ability to multi-task and prioritize for self and team members.
Examine and resolve performance bottlenecks in the ETL process
Python / Java / Scala
Experience with MapReduce
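The source-to-target mapping skill listed above can be illustrated with a short sketch (the column names and mapping below are made up for this example):

```python
# Hypothetical source-to-target column mapping, as a data engineer
# might derive it from a business analyst's mapping document.
MAPPING = {
    "cust_no": "customer_id",
    "acct_bal": "account_balance",
}

def apply_mapping(source_row, mapping):
    # Keep only the mapped source columns, renamed to their target
    # names; unmapped columns are dropped.
    return {target: source_row[source] for source, target in mapping.items()}
```

Expressing the mapping as data rather than code keeps the technical mapping in one reviewable place and lets the same `apply_mapping` helper serve many tables.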
The ideal candidate will be a self-starter with drive, initiative and a can-do attitude who is eager to learn new technologies, as well as a team player who can work within or lead a team.