
Project Overview

Implementation of a Data Platform solution for a Retail Bank, covering the following capabilities:

  • Core Data Platform;
  • Onboarded Data Producer/Consumer Applications, Configuration & Customizations;
  • Onboarding & Management of Data Producer/Consumer Applications;
  • Application, Data Source, Job, Job Data, Job Steps, and Data Access Taxonomies;
  • Standardized Push/Pull Streaming/Batch Ingestion Interfaces;
  • Polyglot Data-lake Storage (e.g., Bucket, KV, NoSQL, Search, Warehouse, SQL, etc.);
  • Customizable Ingestion Conformance-tier/Stage/Archive Processors;
  • Customizable Ingestion Optimized Format (e.g., Parquet/ORC) Processors;
  • Customizable Transformation/Enrichment Processors;
  • Customizable Microservices & API Processors;
  • Standardized Query API (e.g., J/ODBC) Data Access Interfaces;
  • Standardized Export API, Bucket Endpoint & Notify Data Access Interfaces;
  • Standardized Streaming Egress Data Access Interface;
  • RBAC (Role-Based Access Controls);
  • TLS (in-flight) & TDE (at-rest) Encryption capabilities;
  • Operational Metrics & Monitoring capabilities;
  • Data Platform Resource Utilization Tracking & Cost Allocation capabilities;
  • DLM (Data Lifecycle Management), Archive & Retention capabilities;
  • Schema Registry capabilities, etc.

Responsibilities:

  • Design and development of data processing pipelines for ML solutions;
  • Building cloud-native deliverables for deployment to an on-premises Kubernetes cluster or AWS;
  • Interacting with stakeholders for requirements elicitation;
  • Research and prototyping of promising tools/approaches/practices with further implementation;
  • Managing knowledge base / technical documentation for developed solutions;
  • Participating in Enterprise Data Platform design and development;
  • Close cooperation with Product Teams (POs, analysts, architects, developers, testers, IT delivery staff, etc.);
  • SW Architecture design & maintenance;
  • Release, Delivery, Change Management, Support, Incident Management & Troubleshooting:
    • Planning & control, improvement & development;
    • Technical requirements & specifications preparation & approval;
    • API specifications development, maintenance & approval;
    • Ensuring quality testing (QAT, integration);
    • Providing quality, on-time SW delivery (installation & configuration) & support.

Requirements:

  • 3+ years of experience as a DevOps Engineer;
  • Experience in Data Engineering / DB Development / ETL-ELT Development (3+ years);
  • Expert knowledge of Python, SQL, PL/SQL;
  • Good knowledge of RDBMS (Oracle, PostgreSQL, SAP IQ);
  • Experience in Big Data stack (Hadoop, HBase, Kafka, Spark, Nifi, Superset);
  • Experience in AWS (EC2, S3, EBS, EKS, Lambda, Athena, SNS/SQS);
  • Experience in Kubernetes, Docker, Linux, UNIX;
  • Experience in Git (GitLab), Bitbucket, Ansible, Bash, Nexus;
  • Experience in maintaining and executing build scripts to automate development and production builds;
  • Practical experience in IT systems building & support;
  • Automation of installation & maintenance routines, troubleshooting, and incident management (logs, monitoring KPIs, triggers & alerts);
  • Analyzing business units' needs and proposals to improve services;
  • Ensuring on-time implementation & delivery of services;
  • Experience in Network Protocols: SMTP, SNMP, ICMP, TCP/IP, FTP, Telnet, NIS, LDAP, UDP;
  • Good communication skills (English intermediate+) – verbal and written;
  • Bug tracking & documentation tools: Jira, Confluence.

Nice to have:

  • Experience in Apache NiFi, Pulsar, Groovy;
  • Knowledge of Jenkins and CI/CD principles, experience in pipelines development;
  • Knowledge and experience in Data Science / Machine Learning;
  • Experience with processing large volumes of data;
  • Analytics and design skills;
  • Experience of working in an Agile team and environment.

Higher Education: 

  • Bachelor’s Degree or higher.