Big Data Engineer – Hadoop
IBISKA Telecom Inc
Ottawa
40d ago

Job Information:

IBISKA has a requirement for an Architect or Engineer to deliver a clustered computing environment for our Government client.

The role is to design and build large-scale data analytics platforms that handle the complexities of ingesting, storing, and manipulating large volumes of data in real time.

Skills and Experience Required

  • 5+ years of experience in software development / engineering, including requirements analysis, software development and implementation;
  • 5+ years of experience developing software with high level languages such as Java, C, C++;
  • Experience with distributed, scalable Big Data programming models and technologies such as Hadoop, Hive, Pig, etc.;
  • Experience with Hadoop Distributed File System (HDFS);
  • Experience with serialization formats such as JavaScript Object Notation (JSON) and/or Binary JSON (BSON);
  • Experience developing solutions integrating and extending FOSS / COTS products;
  • Experience deploying applications in a cloud environment;
  • Understanding of Cloud Scalability;
  • Hadoop / Cloud Developer Certification is an asset;
  • Experience designing and developing automated analytic software, techniques, and algorithms;
  • Experience developing and deploying data-driven analytics in heterogeneous environments.