
Data streaming engineer

Location: Chennai India
Job ID: 1019299HV
Date Posted: Aug 10, 2022
Segment: IT
Business Unit: Hitachi Vantara
Company Name: Hitachi Vantara Corporation
Profession (Job Category): General Management


Role Title: Data Streaming Engineer (GDC)
Experience: 4.5 Years Min
Location: Hyderabad / Pune / Bangalore [WFH]
Job Type: Permanent Position

About Company
Hitachi Vantara, a wholly owned subsidiary of Hitachi, Ltd., guides our customers from what's now to what's next by solving their digital challenges. Working alongside each customer, we apply our unmatched industrial and digital capabilities to their data and applications to benefit both business and society. More than 80% of the Fortune 100 trust Hitachi Vantara to help them develop new revenue streams, unlock competitive advantages, lower costs, enhance customer experiences, and deliver social and environmental value.

Website: https://www.hitachivantara.com/

Technical experience:
  • Design and recommend the best-suited approach for data movement to/from different sources using Apache/Confluent Kafka.
  • Good understanding of event-based architectures, messaging frameworks, and stream-processing solutions built on the Kafka messaging framework.
  • Hands-on experience with Kafka Connect and the Schema Registry in high-volume environments.
  • Strong knowledge of and exposure to Kafka brokers, ZooKeeper, KSQL, Kafka Streams (KStream), and Confluent Control Center.
  • Good knowledge of the big data ecosystem to design and develop solutions delivered through CI/CD pipelines.
  • Skilled in writing and troubleshooting Python/PySpark scripts to extract, cleanse, conform, and deliver data for consumption.
  • Strong working knowledge of the AWS data analytics ecosystem, including AWS Glue, S3, Athena, SQS, etc.
  • Good understanding of other AWS services such as CloudWatch monitoring, scheduling, and automation services.
  • Good experience with Kafka connectors such as MQ connectors, Elasticsearch connectors, JDBC connectors, FileStream connectors, and JMS source connectors, as well as tasks, workers, and converters.
  • Working knowledge of the Kafka REST Proxy and experience building custom connectors using Kafka core concepts and APIs.
  • Create topics, set up redundant clusters, deploy monitoring tools and alerts, and apply best practices.
  • Develop and ensure adherence to published system architectural decisions and development standards.
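To illustrate the key-based routing that the Kafka skills above rely on, the sketch below mimics how a producer assigns records to partitions by hashing the message key, which is what preserves per-key ordering. It is an illustrative stand-in only: Kafka's default partitioner uses murmur2, not the CRC32 used here, and no broker or client library is involved.

```python
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    """Mimic Kafka-style key-based partitioning: records with the
    same key always map to the same partition (illustrative hash)."""
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# Every event for a given order key lands on the same partition,
# so per-order ordering is preserved across the stream.
events = [("order-42", "created"), ("order-7", "created"), ("order-42", "shipped")]
routed = {}
for key, value in events:
    routed.setdefault(partition_for(key, 6), []).append((key, value))
```

In a real deployment the client computes this mapping transparently; the point is that partition count and key choice together determine ordering and parallelism.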

Good to have:
  • Ability to perform data-related benchmarking, performance analysis, and tuning.
  • Understanding of data warehouse architecture and data modelling.
  • Strong skills in in-memory applications, database design, and data integration.
  • Ability to guide and mentor team members on using Kafka.
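As a minimal example of the data-related benchmarking mentioned above, the snippet below times a toy cleansing step using only the Python standard library. The `cleanse` function is a made-up stand-in for a real transformation; taking the best of several repeats is a common way to reduce timing noise.

```python
import timeit

def cleanse(rows):
    # Toy transformation standing in for a real cleansing step:
    # strip whitespace and drop empty values.
    return [r.strip() for r in rows if r and r.strip()]

rows = [" a ", "", "b", "  ", "c "] * 1000

# Time 100 runs and take the best of 5 repeats to reduce noise.
best = min(timeit.repeat(lambda: cleanse(rows), number=100, repeat=5))
print(f"best of 5 repeats: {best:.4f}s for 100 runs")
```

The same pattern scales up to comparing alternative implementations before committing a tuning change.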