Location: Bengaluru | Job Posted: 21/04/2020
Responsibilities and Duties:
Install and configure components of the Hadoop ecosystem and maintain their integrity.
Deploy and maintain Hadoop clusters, adding and removing nodes using cluster monitoring tools such as Ganglia, Nagios, or Cloudera Manager.
Work closely with the database, network, BI, and application teams to ensure that all big data applications are highly available and performing as expected.
Manage all aspects of the AWS infrastructure (compute, storage, network, permissions, cost) using configuration management tools such as Ansible, CloudFormation, and shell scripts (a sketch follows this list).
Assist in designing, automating, implementing, and sustaining Amazon Machine Images (AMIs) across the cloud environment.
Configure S3 buckets with lifecycle policies that archive infrequently accessed data to lower-cost storage classes as required (illustrated after this list).
Configure and manage VPCs, subnets, and route tables (see the sketch after this list).
Perform capacity planning, estimating the requirements for scaling the Hadoop cluster up or down (a back-of-the-envelope example follows this list).
Monitor cluster connectivity and performance.
Perform backup and recovery tasks.
Manage resources and security.
Troubleshoot application errors and ensure they do not recur.
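
A minimal sketch of driving CloudFormation from Python, as one way to keep the AWS infrastructure management above declarative; the stack name and inline template here are hypothetical examples, not values from this posting:

```python
import boto3

# Hypothetical illustration: create a small CloudFormation stack from an
# inline template. Stack name and resources are placeholder examples.
TEMPLATE = """
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  ArchiveBucket:
    Type: AWS::S3::Bucket
"""

cfn = boto3.client("cloudformation")
cfn.create_stack(StackName="example-infra-stack", TemplateBody=TEMPLATE)
# Block until the stack finishes creating (raises on failure/rollback).
cfn.get_waiter("stack_create_complete").wait(StackName="example-infra-stack")
```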
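The S3 lifecycle duty above can be expressed with the boto3 S3 API; in this sketch the bucket name, prefix, and transition thresholds are hypothetical:

```python
import boto3

# Minimal sketch: transition objects to Glacier after 90 days and expire
# them after 365. Bucket, prefix, and day counts are placeholder examples.
s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-archive-bucket",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-infrequent-data",
                "Status": "Enabled",
                "Filter": {"Prefix": "logs/"},  # hypothetical prefix
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```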
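A minimal boto3 sketch of the VPC wiring mentioned above: one VPC, one subnet, an internet gateway, and a route table; all CIDRs and names are hypothetical:

```python
import boto3

ec2 = boto3.client("ec2")

# Create a VPC and a subnet inside it (hypothetical CIDR ranges).
vpc_id = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]
subnet_id = ec2.create_subnet(
    VpcId=vpc_id, CidrBlock="10.0.1.0/24"
)["Subnet"]["SubnetId"]

# Create a route table, route outbound traffic through an internet
# gateway, and associate the table with the subnet.
rt_id = ec2.create_route_table(VpcId=vpc_id)["RouteTable"]["RouteTableId"]
igw_id = ec2.create_internet_gateway()["InternetGateway"]["InternetGatewayId"]
ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId=vpc_id)
ec2.create_route(
    RouteTableId=rt_id, DestinationCidrBlock="0.0.0.0/0", GatewayId=igw_id
)
ec2.associate_route_table(RouteTableId=rt_id, SubnetId=subnet_id)
```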
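HDFS capacity planning typically reduces to simple arithmetic over raw data volume, replication factor, headroom for intermediate data, and per-node disk; the numbers in this sketch are hypothetical:

```python
import math

def nodes_needed(raw_tb: float, replication: int = 3,
                 node_capacity_tb: float = 48.0,
                 headroom: float = 0.25) -> int:
    """Estimate DataNode count: raw data x replication, plus headroom
    for intermediate/temp data, divided by usable capacity per node."""
    required_tb = raw_tb * replication * (1 + headroom)
    return math.ceil(required_tb / node_capacity_tb)

# Hypothetical example: 200 TB raw data, 3x HDFS replication,
# 48 TB usable disk per node, 25% headroom.
print(nodes_needed(200))  # -> 16 DataNodes
```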
Competencies & Experience Required/Desired
You Must Have
Bachelor’s degree in computer science, information technology, data science, data analytics, interactive media, or a related field.
8+ years of experience in data warehousing or similar analytic data work.
3+ years of experience creating and managing complex data architectures.
5+ years of experience in database design, development, and data modeling.
5+ years of experience working as a Hadoop admin.
Hands-on experience with Cloudera installation, configuration, debugging, tuning, and administration.
Prior experience deploying Cloudera Hadoop clusters from scratch.
Hands-on experience with Cloudera, working with data delivery teams to set up new Hadoop users, including creating Linux users, setting up Kerberos principals, and testing HDFS and Hive access (a sketch of this workflow follows this list).
Competency in Red Hat Linux administration (security, configuration, tuning, troubleshooting, and monitoring).
Expert knowledge of Active Directory/LDAP security integration with the Cloudera big data platform.
Experience performance-tuning Cloudera clusters, Spark (PySpark, Spark, and R), and MapReduce routines.
Experience optimizing clusters for future workloads.
Hands-on experience with node management, monitoring and response, support-process creation, upgrades and patches, logging configuration, and managing user rights and space quotas (a quota example follows this list).
Working knowledge of networks, the Linux OS, and Unix shell scripting.
Experience working on Agile projects and with Agile methodology in general.
Hadoop, AWS, and Azure certifications are a plus.
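
As a concrete sketch of the user-onboarding workflow above (Linux account, Kerberos principal, HDFS smoke test), this snippet shells out to the standard CLI tools; the username, realm, and keytab path are hypothetical:

```python
import subprocess

def run(cmd):
    """Run a command, echo it, and fail loudly on error."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Hypothetical user and realm -- adapt to the actual cluster.
user = "etl_user"
realm = "EXAMPLE.COM"

# 1. Create the Linux account on the gateway node.
run(["useradd", "-m", user])

# 2. Create the Kerberos principal (randomized key) and export a keytab.
run(["kadmin.local", "-q", f"addprinc -randkey {user}@{realm}"])
run(["kadmin.local", "-q",
     f"ktadd -k /etc/security/keytabs/{user}.keytab {user}@{realm}"])

# 3. Provision the HDFS home directory and smoke-test access.
run(["hdfs", "dfs", "-mkdir", "-p", f"/user/{user}"])
run(["hdfs", "dfs", "-chown", f"{user}:{user}", f"/user/{user}"])
run(["hdfs", "dfs", "-ls", f"/user/{user}"])
```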
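And a short example of the space-quota management mentioned above, using the standard hdfs dfsadmin command; the path and size are hypothetical:

```python
import subprocess

# Cap a project directory at 10 TB of raw HDFS space (placeholder values).
subprocess.run(
    ["hdfs", "dfsadmin", "-setSpaceQuota", "10t", "/user/etl_user"],
    check=True,
)
# Verify: the 'hdfs dfs -count -q' report includes the space quota column.
subprocess.run(["hdfs", "dfs", "-count", "-q", "/user/etl_user"], check=True)
```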
To apply for an open position, send a cover letter and resume to
Sandhya.Venkataramana@cai.io