The Company
Hitachi Vantara, a wholly owned subsidiary of Hitachi, Ltd., guides our customers from what's now to what's next by solving their digital challenges. Working alongside each customer, we apply our unmatched industrial and digital capabilities to their data and applications to benefit both business and society. More than 80% of the Fortune 100 trust Hitachi Vantara to help them develop new revenue streams, unlock competitive advantages, lower costs, enhance customer experiences, and deliver social and environmental value.
Meet our Team
Our people are our greatest asset. We are a diverse team with deep experience in Healthcare, Public Sector, Manufacturing, and Financial Services across SEA. We support our employees and want them to thrive through exposure to accounts in different industries. We seek to upskill our employees in the latest technologies to ensure relevance and to meet our customers' evolving technology and solution needs. Our focus on our customers' success ensures that our employees can share in that success as well.
What you'll be doing
We seek to enable our new and existing customers to grow their business by building modern data analytics platforms that support better decision making through data and information. By building strong relationships and being a trusted technology and solutions partner, Hitachi Vantara seeks to establish long-term relationships with our customers to support them in their ongoing Digital Transformation initiatives.
Your role as a Big Data Architect (Delivery) is to engage with our customers to understand their business and their existing and future technology roadmaps, and to build solutions that enable them to achieve their business goals in the most effective manner. You will be required to:
Roles & Responsibilities
- Take on the Hadoop administrator and system administrator roles within the big data delivery team.
- Install and maintain the Hadoop ecosystem for the delivered Cloudera / MapR instances.
- Debug and investigate reported issues.
- Support the customized application development team.
- Develop ETL pipelines.
Must Have Skills
- Hadoop administration - HDFS and ecosystem components: Hive, HBase, ZooKeeper, YARN, Hue, etc.
- MapR, Cloudera CDH
- Java/J2EE, Spring Boot framework
- Spark, Python, Drill, Jupyter, etc.
- SQL
- JavaScript
- Kerberos; Linux system administration knowledge
- Streaming: Kafka or MapR ES
- NoSQL: HBase, MapRDB or Cassandra
Good to Have Skills
- DevOps experience - Docker/Kubernetes, Jenkins, OpenShift, Ansible, Git, etc.
- ETL development
- API design and development
- Cloud experience - AWS/GCP/Azure
- Angular / React
Personal Competencies
- Excellent cooperation and communication skills
- Problem solver and team player
- Responsible and proactive
- Ability to work independently
We are an equal opportunity employer. All applicants will be considered for employment without regard to age, race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, or disability status.