Hadoop Developer - IT, Pune

Job Description

We are looking for a Big Data / Hadoop developer to join our data ingestion team and build ingestion pipelines for loading terabytes of data into Hadoop. Ingestion involves loading both structured and unstructured data. You will also work with our data science team to help them analyze large volumes of data.

Required Skills

• Experience working with programming languages such as Java and Python.
• Knowledge of relational databases (RDBMS) such as Oracle, MySQL, and MS SQL Server.
• Understanding of, and hands-on experience with, Hadoop, Hive, Pig, Sqoop, MapReduce, Oozie, and Spark.
• Understanding of NoSQL databases such as MongoDB and HBase.
• Knowledge of scripting for process automation.
• Proficiency in both Linux and Windows environments.
• Troubleshooting skills, including an understanding of system capacity, bottlenecks, and the basics of memory, CPU, operating systems, storage, and networking.
• Familiarity and experience with all phases of the software development life cycle.
• Good understanding of algorithms, data structures, performance optimization techniques, and object-oriented programming.
• Experience building complex enterprise applications that have been successfully delivered to customers.
• Exceptional analytical abilities, creativity, and attention to detail.
• Good organizational and problem-solving skills.
• A good team player who is a self-starter and well organized.
• Strong oral and written communication skills.

Required Education

• UG: B.Tech/B.E. (any specialization) OR PG: MCA (Computers). A doctorate is not required.

Please email your resume to careers@gtpltech.com