The Specialist, Software Development contributes to the overall solution by analyzing functional specifications to identify the best technical design (blueprint) and by coding parts of the solution.
The Specialist takes full responsibility for assigned deliverables, manages their own workload, and focuses on key tasks in order to deliver as per service commitments, leveraging their expertise and skill set to achieve delivery goals.
Depending on the assignment, the role may apply in a Project, Enhancement, or Support environment.
Write performant, high-quality code that fulfills the design and passes code review with a minimal number of defects
Apply configuration to the development environment when required
Participate in implementing and supporting the full product in production
Analyze source system data to assess data quality; connect to data sources, then import and transform data for Business Intelligence
Technical Expertise 25%
Design ETL processes and develop source-to-target data mappings, integration workflows, and load processes
Interact with the Data Designer to understand solution requirements; based on detailed analysis, highlight the technical impacts of the functional design on existing solutions
Deliver technical designs and database structures for medium- to high-complexity products
Create, review, and maintain technical documentation
Analyze and troubleshoot production issues and provide remediation
Contribute to developing the design and coding standards that apply to the whole practice
Document the blueprint based on requirements and functional designs
Document designs, architect data maps, develop data quality components, and establish and/or conduct unit tests
Participate in gathering, understanding, and validating project specifications, and take part in ETL architecture design reviews
Quality Controls 25%
Ensure Quality KPIs are identified, measured, and reported, ensuring adherence to development standards; ensure the right level of testing is applied consistently across all projects
Identify problems, develop ideas, and propose solutions in varied situations requiring analytical, evaluative, or constructive thinking in daily work
Perform reviews and quality checks after data has been loaded
Decision Making & Impacts
Decide on development standards and develop the technical blueprint and design
Direct impact on the quality of the applications, solutions, and code delivered
Direct impact on the product delivery schedule, based on adherence to committed estimates and durations
Level of Interaction / Influence
External: N/A
Internal: Work closely with Development Architects, Development Practice Leads, and other Development Specialists; interact with project teams (functional, development, and test leads, project managers)
Minimum of 7-10 years of overall work experience
5 years of experience as a developer
Azure certification and Databricks certification
Knowledge of Hadoop ecosystem (Hive, Spark, HDFS, NiFi)
Education / Certification / Designation
Bachelor’s degree in Computer Science, or an equivalent degree or work experience
Functional competencies / Soft Skills
Strong communication skills, including the ability to speak clearly to both technical and non-technical audiences
Self-driven, highly motivated team player who is able to learn quickly
Technical skills / Knowledge
Proficiency with Kusto Query Language (KQL) and/or expert SQL and data modeling skills
Proficiency with programming technologies in the area of expertise: Python, Scala, PowerShell
Experience in troubleshooting and resolving database integrity and performance issues
Experience in data warehouse design, ELT/ETL, and BI reporting/analytics tools
Experience with Big Data techniques and cloud platforms; knowledge of message queues (Kafka, Azure Event Hubs, RabbitMQ, etc.) and the ELK stack
Awareness of Agile principles, automation, scripting, and DevOps
Strong understanding of data warehousing and business intelligence architecture
Experience with Azure (Data Lake, Data Factory, Databricks, Data Explorer, Data Warehouse)
Experience with version control systems (Git) and Azure DevOps
Knowledge of Big Data analytics technologies in a Cloud environment