Data Engineer


Your Role and Responsibilities

Develops applications on Big Data and Cognitive technologies, including API development. Expected to have a traditional application development background along with knowledge of analytics libraries, open-source natural language processing, and statistical and big data computing libraries. Strong technical ability to understand, design, write, and debug complex code. Skills include Python, Spark, Kafka, IBM BigInsights, Hadoop, NoSQL, HBase, Hive, Pig, C++, SQL, Linux, Java, EAI, SOA, CEP, HDFS, and ETL.
Job Responsibilities of a Big Data Engineer

  • Design, develop, and deploy the various technology components needed for Big Data analytics
  • Lead end-to-end deployment of ITOA Big Data solutions for enterprise customers
  • Provide input to architects and data scientists at various stages of solution design
  • Integrate the AIOps solution according to the design provided by architects
  • Participate as an active member in internal capability-building projects
  • Train and support junior resources as needed
  • Provide resolution to customer queries and issues

Required Technical and Professional Expertise

Must-have skills: Hadoop, Hive, Bash, Scala or Python, Spark. Nice-to-have skills: HBase, Cloudera, Jenkins, IBM Cloud.
  • Hands-on expertise with Apache Spark and the Hadoop framework.
  • Hands-on coding experience in PySpark or Scala using Spark libraries.
  • A solid understanding of relational databases and the ability to write SQL queries.
  • Well versed in non-relational (NoSQL) databases, with good experience in MongoDB, Cassandra, Hive, HBase, etc.
  • Understanding of the publish-subscribe messaging pattern, with a good grasp of Kafka queues or other streaming services (Amazon MQ, AWS SQS, etc.).
  • Well versed in event-driven and asynchronous programming paradigms.
  • Experience writing enterprise-grade code that is secure, efficient, and unit tested.
  • Experience with static and dynamic code analyzers (SAST and DAST).
  • Well versed in coding practices, with command of unit-testing frameworks for the respective technology stack.
  • Develop scalable data and analytics solutions leveraging standard platforms, frameworks, patterns, and full-stack development skills.

Preferred Technical and Professional Experience

  • Experience working in the IT service delivery and infrastructure domain
  • Knowledge of the ITIL framework
  • Experience working with large data sets leveraging distributed systems, e.g. Spark/Hadoop
  • Working knowledge of relational databases (Oracle, Db2, SQL, etc.) and NoSQL databases, e.g. MongoDB, Cassandra
  • Demonstrated past projects executed using the above technologies

Required Education

Bachelor’s Degree

Preferred Education

Master’s Degree


