Intermediate Data Engineer to build and maintain data pipeline architecture.
S.i Systems
6d ago

Description : DATA ENGINEER


One of our large enterprise Energy sector clients is looking for a Data Engineer to build and maintain data pipeline architecture.

In this role, you’ll help ingest, transform, and store clean, enriched data in formats ready for business intelligence consumption.


  • You’ll implement data models and structure data in ready-for-business consumption formats
  • You’ll work independently on complex data engineering problems to support the product’s data science strategy
  • You’ll use broad and deep technical knowledge in the data engineering space to tackle complex data problems for product teams
  • You’ll improve data availability by acting as a liaison between Lab teams and source systems
  • You’ll collect, blend, and transform data using ETL tools, database management system tools, and code development
  • You’ll aggregate data across various warehousing models (e.g. OLAP cubes, star schemas, etc.) for BI purposes
  • You’ll collaborate with business teams and understand how data needs to be structured for consumption
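To make the aggregation responsibility above concrete, here is a minimal, illustrative sketch (not part of the posting) of a star-schema-style rollup for BI: fact rows are blended with dimension attributes, then aggregated by region. All table and column names are hypothetical.

```python
from collections import defaultdict

# Hypothetical star-schema tables: a sales fact table keyed to a
# customer dimension table (all names are illustrative).
dim_customer = {
    1: {"name": "Acme", "region": "West"},
    2: {"name": "Birch", "region": "East"},
}

fact_sales = [
    {"customer_id": 1, "amount": 120.0},
    {"customer_id": 2, "amount": 75.5},
    {"customer_id": 1, "amount": 30.0},
]

def aggregate_sales_by_region(facts, customers):
    """Blend fact rows with dimension attributes, then aggregate
    revenue per region -- a typical BI-ready rollup."""
    totals = defaultdict(float)
    for row in facts:
        region = customers[row["customer_id"]]["region"]
        totals[region] += row["amount"]
    return dict(totals)

print(aggregate_sales_by_region(fact_sales, dim_customer))
# {'West': 150.0, 'East': 75.5}
```

In practice the same join-then-aggregate shape would run at scale in Spark SQL or Databricks rather than in plain Python.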

  • You’ll have experience in a Data Engineer role (5+ years), with a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field
  • You build and maintain optimal data pipeline architecture.
  • You assemble large, complex data sets that meet functional / non-functional business requirements.
  • You identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, adding data quality checks, minimizing Cloud costs, etc.
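As an illustrative sketch of the data quality checks mentioned above (not part of the posting), a pipeline might gate rows before loading them downstream. The schema and field names here are hypothetical.

```python
# Illustrative data-quality gate of the kind a pipeline might run
# before loading rows downstream (schema and names are hypothetical).
REQUIRED_FIELDS = {"id", "timestamp", "amount"}

def validate_rows(rows):
    """Split rows into (clean, rejected) using two simple checks:
    all required fields are present, and amounts are non-negative."""
    clean, rejected = [], []
    for row in rows:
        if REQUIRED_FIELDS <= row.keys() and row["amount"] >= 0:
            clean.append(row)
        else:
            rejected.append(row)
    return clean, rejected

rows = [
    {"id": 1, "timestamp": "2024-01-01", "amount": 10.0},
    {"id": 2, "timestamp": "2024-01-02", "amount": -5.0},  # negative amount
    {"id": 3, "timestamp": "2024-01-03", "amount": 7.0},
]
clean, rejected = validate_rows(rows)
print(len(clean), len(rejected))  # 2 1
```

Routing rejected rows to a quarantine table rather than dropping them silently is a common design choice, since it keeps the failures auditable.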

  • Object-oriented / functional scripting languages : Python, Scala, SQL
  • You build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Databricks, and NoSQL
  • You build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • You document and communicate standard methods and tools used.
  • You work with other data engineers, data ingestion specialists, and experts across the company to consolidate methods and tool standards where practical.
  • You’re experienced using the following software / tools :
  • Big data tools : Hadoop, HDI, & Spark
  • Relational SQL and NoSQL databases, including Azure Cosmos DB
  • Data pipeline and workflow management tools : Databricks (Spark), ADF, Dataflow
  • Microsoft Azure
  • Stream-processing systems : Storm, Streaming-Analytics, IoT Hub, Event Hub
  • Business Analysis / Data Analysis (5 - 7 years)
  • Define your scripting experience (specify tools) (5 - 7 years)

    Outline your experience building and maintaining data pipeline architecture utilizing tools such as Databricks (Spark), ADF, Dataflow, etc. (5 - 7 years)
