Responsible for creating, monitoring, and maintaining various databases, including Big Data platforms such as Hadoop and ClickHouse.
What you'll be responsible for
• Ensure optimal health, integrity, performance, and security of all databases.
• Develop and maintain data categorization and security standards.
• Evaluate and recommend new database technologies and management tools; optimize existing and future technology investments to maximize returns.
• Provide day-to-day support to internal IT support groups, external partners, and customers as required.
• Manage outsourced database administration services to perform basic monitoring and administrative-level tasks as directed.
• Participate in change and problem management activities, root cause analysis, and the development of knowledge articles to support the organization’s program.
• Provide subject matter expertise to internal and external project teams, application developers, and others as needed. Support application testing and production operations. Serve as database administrator.
• Document, monitor, test, and adjust backup and recovery procedures to ensure important data remains available in a disaster.
• Serve as on-call database administrator on a rotating basis.
• Develop, implement, and maintain Oracle, MySQL, PostgreSQL, and MongoDB instances, including scripts for monitoring and maintenance of individual databases.
• Implement and handle ongoing administration of Hadoop infrastructure.
• Screen Hadoop cluster job performance and handle capacity planning.
• Monitor Hadoop cluster connectivity and security
• Manage and review Hadoop log files
• File system management and monitoring
• HDFS support and maintenance
• Work closely with the infrastructure, network, database, application, and business intelligence teams to ensure high data quality and availability.
• Collaborate with various teams to install Hadoop updates, patches, and version upgrades as required.
• Perform cluster maintenance, including creation and removal of nodes, using tools such as Ganglia, Nagios, and Cloudera Manager.
• Performance tuning of Hadoop clusters and Hadoop MapReduce routines
• Familiarity with ecosystem tools such as Flume, Sqoop, Hive, and Pig
• Knowledge of workflow schedulers such as Oozie
• HBase setup experience
What you'd have
• 5-9 years of experience in managing enterprise databases
• Administration of Hadoop (HBase and Hive), Kafka, and Solr
• Oracle, MySQL, and PostgreSQL, plus knowledge of NoSQL databases such as MongoDB and Redis
• Installing MySQL, PostgreSQL, and MongoDB
• Backing up and recovering Oracle, MySQL, and PostgreSQL databases
• User-level access: risks and threats
• Synchronous and asynchronous replication, converged systems, partitioning, and storage-as-a-service (cloud technologies)
• Linux operating systems, including shell scripting
• Windows Server operating system
• Industry-leading database monitoring tools and platforms
• Data integration techniques, platforms, and tools
• Modern database backup technologies and strategies
Why join us?
We thought you would never ask! We offer all the usual stuff: competitive salary, flexible working hours, and a challenging product culture. But the real perks are:
• Challenging and fun work environment solving meaningful real-life business problems - you will never have a boring day at the office.
• World-class team who love solving tough problems and have a bias for action.
Tanla is an equal opportunity employer. We welcome and encourage diversity in the workplace regardless of race, gender, religion, age, sexual orientation, gender identity, disability, or veteran status.
Please be aware that we will contact only those candidates who best match the requirements of the position.