Data Engineer
GSoft
Montreal, Québec, Canada
1d ago

Job Description

So, what will your new role look like?

In addition to product usage data, our various internal teams (Marketing, Sales, Customer Experience, etc.) also generate a significant amount of data relevant to decision-making.

GSoft wants to make the most of this data to support our business decisions and product development!

You will support the team in managing the infrastructure and architecture of our data warehouses. In addition, you will:

  • Maintain and ensure the evolution of GSoft's data quality and data infrastructure.
  • Design, develop, test and maintain optimized ETL processes involving a wide variety of data sources.
  • Maintain and enhance GSoft's data architecture to create simple, reliable and efficient data models that facilitate analysis by data scientists and teams.
  • Deploy and maintain machine learning model pipelines (ML Ops) in production.
  • Communicate, raise awareness, inform and educate data scientists and various teams on processes, best practices and changes to our data models.
What does your future team look like?

You will join GSoft's Data & Analytics team, a team that brings together a range of data expertise, including data engineers and data scientists.

You will work closely with them to address the needs and evolution of our data environments.

You will also collaborate with business teams to fully understand their environments and how they use data. Finally, you will work with our development team, who manage the configuration, creation and storage of data in source systems.

What new challenges await your team?

  • Migrate our data warehouse to Azure Synapse Analytics.
  • Optimize the data models in our data warehouses.
  • Continuously improve our data infrastructure.
  • Implement standards, improve performance, and ensure the quality and integrity of our architecture and data pipelines (improving our DataOps processes).
Qualifications

  • Have a few years of development experience in the data world, preferably using SQL, MongoDB/NoSQL, C#, Java, Python or similar technologies.
  • Experience in: ETL process development in a cloud environment and/or with tools such as Azure Data Factory; data modeling and data governance; security best practices; and the use of third-party APIs.

  • Ability to monitor a data pipeline.
  • Have a good knowledge of machine learning concepts and practices.
  • Ability to understand business issues, transform them into technological requirements and clearly communicate the results of analyses.
  • Ability to adapt quickly to a changing environment.
Additional Information

At GSoft, we build together, we trust each other, and we support each other through success and failure. You will be able to express yourself, evolve and develop your creativity in an environment that adapts to your daily life and your needs.

We aspire to build a healthy and inclusive work environment, and that is everyone's responsibility.
