Senior Java (8) to create data components using Kafka within a Big Data environment (28678)
S.i. Systems


Job Title: Senior Full Stack Engineer

Location: Downtown Toronto

Contract Duration: 6 Months (chance of extension if successful)


GB&M Big Data is a Global Markets and Banking Initiative that is part of the Group Data Strategy to transform the way we govern, manage and use all our data to its full potential across the Bank.

The selected Senior Full Stack Engineer will be part of the core big data technology and design team. This individual will be entrusted to develop solutions and design ideas that enable the software to meet the acceptance and success criteria.

Work with architects and BAs to build data components in the Big Data environment.

  • Exposure to Agile project methodology, as well as other methodologies such as Kanban
  • Understanding of data modelling techniques for relational and non-relational stores
  • Experience debugging code issues and publishing the highlighted differences to the development team / architects

  • Java development / design using Java 8 (advanced understanding of core Java features and when to use them)
  • Experience with time-series / analytics databases such as Elasticsearch or a NoSQL database
  • Building microservices and RESTful APIs
  • Experience working with Kafka
  • Sound knowledge of working on Unix / Linux platforms
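As a purely illustrative sketch of the "core features of Java 8 and when to use them" called for above, the snippet below uses streams, lambdas, method references and Optional; the class, method and trade names are hypothetical, not part of this posting.

```java
import java.util.*;
import java.util.stream.*;

// Hypothetical example: aggregating trade notionals with Java 8 streams,
// lambdas, method references and Optional.
public class Java8Sketch {

    // Sum the notionals of trades whose key starts with the given desk prefix.
    static double totalNotional(Map<String, Double> tradesByDesk, String desk) {
        return tradesByDesk.entrySet().stream()
                .filter(e -> e.getKey().startsWith(desk))   // lambda predicate
                .mapToDouble(Map.Entry::getValue)           // method reference
                .sum();
    }

    // Optional instead of null for a possibly-missing result.
    static Optional<Double> largestTrade(Collection<Double> notionals) {
        return notionals.stream().max(Double::compare);
    }

    public static void main(String[] args) {
        Map<String, Double> trades = new HashMap<>();
        trades.put("FX-001", 1_000_000.0);
        trades.put("FX-002", 250_000.0);
        trades.put("EQ-001", 500_000.0);

        System.out.println(totalNotional(trades, "FX"));          // 1250000.0
        System.out.println(largestTrade(trades.values()).get());  // 1000000.0
    }
}
```

The point of the sketch is the idiom: declarative stream pipelines in place of hand-written loops, and Optional in place of null checks.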

  • Experience in UI / app development; knowledge of React JS a big nice-to-have
  • Experience with Unix / Linux environments on-premises and in the cloud
  • Experience building robust batch and real-time data processing solutions
  • Experience with most of the following technologies (Apache Hadoop, Scala, Apache Spark, Spark streaming, YARN, Hive, HBase, Presto, Python, ETL frameworks, MapReduce, SQL, RESTful services)
  • Hands-on experience building data pipelines using Hadoop components Sqoop, Hive, Pig, Spark, Spark SQL
  • Experience developing HiveQL and UDFs for analyzing semi-structured / structured datasets
  • Experience with industry standard version control tools (Git, GitHub), automated deployment tools (Ansible & Jenkins) and requirement management in JIRA
  • Knowledge of cloud computing technology such as Google Cloud Platform (GCP)
As a key member of the technical team, alongside Engineers, Data Scientists and Data Users, you will be expected to define and contribute at a high level to many aspects of our collaborative Agile development process:

  • Software design, development, automated testing of new and existing components in an Agile, DevOps and dynamic environment
  • Promoting development standards, code reviews, mentoring, knowledge sharing
  • Product and feature design, scrum story writing
  • Data Engineering and Management
  • Product support & troubleshooting
  • Implement the tools and processes, handling performance, scale, availability, accuracy and monitoring
  • Liaison with BAs to ensure that requirements are correctly interpreted and implemented, and with testers to ensure they understand how requirements have been implemented so they can be tested effectively
  • Participation in regular planning and status meetings, with input to the development process through sprint reviews and retrospectives
  • Input into system architecture and design
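The batch-processing side of the role can be sketched in miniature: below is a MapReduce-style word count written in plain Java streams, showing the map → group → reduce shape that frameworks like Hadoop or Spark run at scale. This is an illustration under that assumption, not code from any framework; the class and method names are hypothetical.

```java
import java.util.*;
import java.util.stream.*;

// Illustrative only: a MapReduce-style word count in plain Java streams,
// sketching the map -> group -> reduce shape of a batch data pipeline.
public class BatchSketch {

    // Map each line to words, group by word, reduce each group to a count.
    static Map<String, Long> wordCount(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+"))) // map
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w,          // shuffle / group
                        Collectors.counting()));                // reduce
    }

    public static void main(String[] args) {
        List<String> batch = Arrays.asList("to be or not to be", "to err is human");
        System.out.println(wordCount(batch).get("to")); // 3
    }
}
```

In a real Spark or Hadoop job the same three stages are distributed across a cluster, but the logical pipeline is the one shown here.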

    Expertise and Skills

  • Software Development (Java): 5 - 7 years
    Priority Requirements

  • Must-have: Please describe your experience using Java 8
  • Must-have: Please describe your experience with time-series / analytics databases such as Elasticsearch or a NoSQL database
  • Must-have: Please describe your experience building microservices and RESTful APIs
  • Downtown Toronto
