System Admin, Hadoop Admin, Java Learner
Last active: 12/13/2014
Around 15 years of experience in the IT industry, covering project management, team management, problem management, solution delivery, system administration, network administration, systems planning, coordination, and commissioning of IT infrastructure.
Core expertise in new infrastructure implementation and maintenance for projects in different sectors such as telecom, data warehousing, and financial services.
Core technical platform: administration of different flavors of UNIX.
My responsibilities included:
> Installation and configuration of UNIX (Linux, AIX) servers
> Implementation of hardware & software RAID (e.g. SAN/NAS, LVM, cluster manager)
> Technical account management for clients & providing training
> Assisting customers with product integrations and giving technical presentations
> Preparation of installation docket for the assigned installation projects.
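The software-RAID/LVM work listed above can be sketched roughly as below. The device names, volume names, and size are hypothetical examples, not values from any real system; the script only prints the commands unless `RUN=1` is set, so it can be reviewed safely before execution.

```shell
#!/bin/sh
# Sketch: mirrored logical volume with LVM (software RAID 1).
# /dev/sdb, /dev/sdc, the "datavg"/"datalv" names, and 10G are hypothetical.
# With RUN unset (the default) each command is only echoed, not executed.
run() { if [ "${RUN:-0}" = "1" ]; then "$@"; else echo "would run: $*"; fi; }

run pvcreate /dev/sdb /dev/sdc                          # mark disks as LVM physical volumes
run vgcreate datavg /dev/sdb /dev/sdc                   # group them into a volume group
run lvcreate --type raid1 -m 1 -L 10G -n datalv datavg  # mirrored logical volume (RAID 1)
run mkfs.ext4 /dev/datavg/datalv                        # create a filesystem on the mirror
run mount /dev/datavg/datalv /data                      # mount it
```

The dry-run wrapper is a convenience for illustration; in practice the same commands would be run directly after verifying the target disks.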
Principal certifications:
Sun Certified System Administrator Solaris 10 - SCSA
Sun Certified Network Administrator - SCNA
Sun Certified Security Administrator – SCSA
MySQL 5.0 Certified DBA
Sun Certified Cluster Administration 3.2
Foundation Certificate on ITSM / ITIL
Linux Administration (General) – Brainbench
IBM Certified pSeries Support & AIX Administration – 5L Ver.5.2
Cisco Certified Network Associate (CCNA)
Microsoft Certified Professional (MCP)
Good knowledge of the Hadoop ecosystem, HDFS, and Hadoop clusters, built on a UNIX background. Personally set up a Hadoop environment and am self-practicing Big Data technologies.
• Experienced in installing Apache Hadoop and implementing Hadoop ecosystem projects.
• Hadoop cluster configuration and deployment, integrating with different systems hardware.
• Commissioning and decommissioning of nodes; managing Hadoop services such as NameNode, DataNode, JobTracker, and TaskTracker. Configured core-site.xml, hdfs-site.xml, and mapred-site.xml.
• Working knowledge of Pig, Hive, MySQL, and HBase.
• Experienced in loading data into the cluster from dynamically generated files using Flume, as well as from the local file system to the Hadoop cluster.
• Troubleshooting, diagnosing, tuning, and solving Hadoop issues.
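As an illustration of the configuration work mentioned above, a minimal hdfs-site.xml for a Hadoop 1.x cluster (the JobTracker/TaskTracker era described here) might look like the fragment below; the directory paths and replication factor are example values, not settings from any specific deployment.

```xml
<?xml version="1.0"?>
<!-- hdfs-site.xml: example values only -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>  <!-- number of block replicas kept across DataNodes -->
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/data/hadoop/namenode</value>  <!-- NameNode metadata directory (hypothetical path) -->
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/data/hadoop/datanode</value>  <!-- DataNode block storage (hypothetical path) -->
  </property>
</configuration>
```

Node decommissioning on such a cluster is typically done by listing the hosts to retire in a file referenced by the `dfs.hosts.exclude` property and then running `hadoop dfsadmin -refreshNodes`, which drains the node's blocks before it is removed.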