DesignMind is seeking a Senior Hadoop and Database Administrator to join our Big Data Solutions team. Our clients' Big Data / Hadoop ecosystems are key next-generation platforms for payments, risk management, and analytics. This position is responsible for administering key Big Data systems and applications that handle the data ingestion, processing, and presentation layers, and for scaling those systems to handle billions of events from hundreds of millions of customers worldwide in near real time.

DesignMind offers Medical, Dental & Vision benefits, 401(k), performance bonuses, sunny offices in the heart of downtown San Francisco, and many other great perks.

Responsibilities:

  • Implementation and deployment of new Big Data core services, including Hadoop, HBase, Hive, Spark, Impala, and search, in both on-premises and cloud environments (AWS, Azure, Oracle Cloud).
  • Deployment and administration of widely used services throughout the Big Data ecosystem, including scheduling, data integration, and monitoring services.
  • Architecture, capacity planning, monitoring, maintenance, tuning, and workload management of all key services listed above, as needed to ensure the systems meet their SLAs.
  • Creation of documentation for first-level monitoring teams and end users; response to escalations, alerts, and requests for help from monitoring teams and end users.

Desired Skills & Experience:

  • 2 or more years building and operating high-performance, high-availability database platforms with a mix of traditional (Oracle, SQL Server, GoldenGate, Redshift, Teradata) and non-traditional (Hadoop, NoSQL) database technology.
  • Strong experience with automation tools such as Ansible, as well as scripting languages such as Perl, Python, and Ruby, for DevOps automation.
  • Experience deploying and monitoring applications that run on distributed systems, and with the issues that go with them: maintaining consistency with HA/DR solutions, performance tuning, and troubleshooting.
  • Experience administering Hadoop distributions, including Cloudera CDH, MapR, Hortonworks, and Apache Hadoop, as well as Amazon AWS and Microsoft Azure big data services.
  • Experience installing, configuring, and managing LDAP and Kerberos services (both Microsoft and open-source/MIT implementations), and integrating Unix clients with them.
  • Strong SQL, data warehousing, and ETL experience on traditional databases is a plus.

Apply for this Job
