Data Platform Engineer
Great-West Life
London, ON, CA
4d ago

We are Canada Life

Being a part of Canada Life means you have a voice. This is a place where your unique background, perspectives and talents are valued, and shape our future success.

You can be your best here. You’re part of a diverse and inclusive workplace where your career and well-being are championed.

You’ll have the opportunity to excel in your way, finding new and better ways to deliver exceptional customer and advisor experiences.

Together, as part of a great team, you’ll deliver on our shared purpose to improve the well-being of Canadians. It’s our driving force.

Become part of a strong and successful company that’s trusted by millions of Canadians to do the right thing.

Be your best at Canada Life.

We are looking for a Data Platform Engineer

The Data Platform Engineer is involved in the complete lifecycle of our cloud-based big data platform, from design and development through implementation and operational support.

They champion and contribute to the development of standards, tools, processes, and best practices.

What you will do:

  • Installing, configuring, maintaining, and securing big data platform environments focused primarily on an Azure Synapse ecosystem, using toolsets such as (but not limited to):
  • Azure Data Factory
  • Azure Databricks clusters
  • PolyBase
  • Azure Service Bus
  • Azure Synapse
  • Azure Data Lake (Gen 2)
  • Experience in key aspects of platform performance, health, and availability monitoring and administration in a 24x7 production environment
  • Assisting users, when required, in troubleshooting data extract errors and data pipeline execution issues
  • Strong communication and presentation skills, with a high degree of comfort presenting to large and small technical audiences, including the ability to influence senior-level executives
  • Identify and implement the optimal data ingestion technology, using a variety of ETL/ELT toolsets (e.g., Syncsort, ADF), to develop and support batch processing solutions

  • Identify transformation logic to be used in Mapping Data Flows in Azure Data Factory
  • Design batch and near real-time processing solutions, including streaming
  • Design, implement, and support scheduled data pipelines from a variety of sources into the Azure ecosystem
  • Manage the data lifecycle, design Azure data storage solutions, and recommend solutions based on requirements, including data archival and restoration

What you will bring:

  • Extensive experience working with the Microsoft Azure ecosystem and strong knowledge of ADLS, Blob Storage, Data Factory, Azure Synapse, and Azure Databricks
  • Good understanding of multiple cloud data platforms
  • Technical background with RDBMS (e.g., SQL Server, DB2 LUW, and z/OS) and non-relational systems (mainframe, Hadoop-based systems, etc.) to build ingestion from a variety of disparate data sources

  • Demonstrated use of big data ecosystem tools to ingest and transform structured, semi-structured, and unstructured data (e.g., ADF, Syncsort, Python)

  • Strong background in integrated Unix and Windows environments, including scripting, batch processing, etc.
  • Ability to show flexibility, initiative, and innovation when dealing with ambiguous and fast-paced situations
  • Communication and presentation experience, with a proven track record of using insights to influence executives and colleagues