Country/Region:  IN

Job Title 

Big Data Consultant 

Responsible for designing and mobilizing big data infrastructure for enterprise solutions, including its development and engineering. Scope of responsibility includes execution and management of the overall big data platform architecture, and management of clusters and all other resources across the entire environment. Responsible for a big data production environment that includes Jupyter and Hadoop. Enhance best practices for the big data application management process through a deep understanding of big data systems and their core features. Provide technology leadership, deliver big data and application solutions, and analyze requirements against the relevant platform. 

Basic Tasks and Responsibilities  

  • Execute and manage the comprehensive big data platform infrastructure to meet baseline performance and system requirements. 
  • Responsible for the operation, management, and engineering of big data platforms across multiple clusters, including capacity planning and requirements estimation to scale cluster capacity up or down. 
  • Distribute workloads and size clusters based on the data stored in HDFS; maintain clusters and add or remove nodes using cluster management tools; set up high-availability NameNode/worker node configurations and their scheduling; and track all business functions and take backups thereof. 
  • Responsible for setting up Anaconda packages and configuring JupyterHub for user access, including the R, PySpark, and Python 3 kernels in Jupyter notebooks. 
  • Responsible for the big data environment, which includes Hadoop (YARN and HDFS), Hive, Spark, Livy, Solr, Oozie, Kafka, Airflow, NiFi, HBase, etc. 
  • Implement tools and frameworks to automate big data flow pipelines, including loading data from different sources using ETL tools. 
  • Coordinate with the systems management team to propose and implement new hardware and software to expand the current environment. 
  • Implement data security, automation, and update processes based on requests. 
  • Cooperate closely with the development, BI, engineering, data science, and process teams on key deliverables to ensure stability of production. 
  • Monitor the health and security of the underlying system; tune performance and increase operational efficiency to deliver continuous improvements. 
  • Transfer know-how to the NGHA ERP team. 
  • Perform any other duties related to the job. 
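The capacity planning and requirements estimation mentioned above typically work backwards from the logical data size, the HDFS replication factor, and headroom for intermediate data. A minimal illustrative sketch of that arithmetic, assuming the default 3x HDFS replication and a hypothetical 25% headroom figure (the function name and overhead value are examples, not part of this role's tooling):

```python
def hdfs_raw_capacity_tb(logical_data_tb, replication_factor=3, temp_overhead=0.25):
    """Estimate raw HDFS storage needed for a given logical dataset size.

    Raw storage = logical data x replication factor, plus headroom for
    intermediate/temporary data (e.g. shuffle output, staging files).
    """
    replicated = logical_data_tb * replication_factor
    return replicated * (1 + temp_overhead)

# 100 TB of logical data at 3x replication with 25% headroom
print(hdfs_raw_capacity_tb(100))  # 375.0
```

In practice this estimate feeds the decision to add or decommission nodes: dividing the raw requirement by per-node usable disk gives a rough node count for the cluster.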

Basic Credentials 

  • Bachelor's degree in Computer Science or equivalent from a recognized academic institution – a basic requirement 

Practical Experience 

  • Minimum of 5 years of experience in the big data field.