DataOps Architect
Pythian
Greater Vancouver, Canada

What will you be doing?

  • Leading Data Ops Consultants in implementing data pipelines.
  • Working on complex and varied Data Ops projects, including tasks such as collecting, parsing, managing, analyzing, and visualizing very large datasets.
  • Translating complex functional and technical requirements into detailed designs.
  • Writing high-performance, reliable and maintainable code.
  • Performing data processing requirements analysis.
  • Performance tuning for batch and real-time data processing.
  • Securing components of clients’ Big Data platforms.
  • Diagnostics and troubleshooting of operational issues.
  • Health-checks and configuration reviews.
  • Developing data pipelines: ingestion, transformation, and cleansing.
  • Data flow integration with external systems.
  • Integration with data access tools and products.
  • Assisting application developers and advising on efficient data access and manipulations.
  • Defining and implementing efficient operational processes.
  • Participating equally in your team’s regular on-call rotation as tier 3 support.

What do we need from you?

While we realise you might not have everything on this list, to be the successful candidate for the DataOps Architect role you will likely have most of the following:
  • Experience building data pipelines in any public cloud (GCP Dataflow, AWS Glue, Azure Data Factory, or an equivalent service).
  • Experience writing ETL jobs with any popular tool.
  • Experience in data modeling, data design and persistence (e.g. warehousing, data marts, data lakes).
  • Strong knowledge of Big Data architectures and distributed data processing frameworks: Hadoop, Spark, Kafka, Hive.
  • Experience with and working knowledge of various development platforms, frameworks, and languages such as Java, Python, Scala, and SQL.
  • Experience with Apache Airflow, Oozie, and NiFi would be great.
  • General knowledge of modern data-center and cloud infrastructure including server hardware, networking and storage.
  • Strong written and verbal English communication skills.

Nice-to-haves

  • Experience with BI platforms, reporting tools, data visualization products, ETL engines.
  • Experience with data streaming frameworks.
  • DevOps experience with a good understanding of continuous delivery and deployment patterns and tools (Jenkins, Artifactory, Maven, etc.).
  • Experience with HBase.
  • Experience in data management best practices, real-time and batch data integration, and data rationalization.

What do you get in return?

  • Competitive total rewards package
  • Flexible work environment: Why commute? Work remotely from home; there are no daily travel requirements to an office!
  • Substantial training allowance: Hone your skills or learn new ones; participate in professional development days, attend conferences, become certified, whatever you like!
  • Office allowance: Pick a device of your choosing and personalise your work environment!