We are migrating time-series data from PostgreSQL to Druid. Druid requires Hadoop for batch processing, and the migration has stalled because we have no Hadoop experience.
We are under a tight deadline and are looking for someone who can implement Hadoop processing for our Druid cluster.
Prior experience with Hadoop is required; familiarity with Druid is a plus, but not required.
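To give a sense of the work involved: loading historical data into Druid is driven by a Hadoop-based batch ingestion spec. The sketch below follows the general shape of Druid's `index_hadoop` task spec; the dataSource name and input paths are placeholders, not our actual configuration:

```json
{
  "type": "index_hadoop",
  "spec": {
    "dataSchema": {
      "dataSource": "timeseries_metrics",
      "granularitySpec": {
        "segmentGranularity": "DAY",
        "queryGranularity": "NONE"
      }
    },
    "ioConfig": {
      "type": "hadoop",
      "inputSpec": {
        "type": "static",
        "paths": "/data/postgres-export/*.json"
      }
    },
    "tuningConfig": {
      "type": "hadoop"
    }
  }
}
```

The person we hire would own specs like this end to end: exporting from PostgreSQL, staging the files for Hadoop, and tuning the indexing jobs.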
We reach 3.2 million academic researchers per month and are growing 14% per week -- we provide the 'Recommended links' widget found in thousands of academic research journals.
We're looking for a Hadoop Expert:
1) Work is remote (your location doesn't matter)
2) Schedule is flexible (part-time or full-time)
3) About our technology stack: Fabric, Git, Ruby on Rails (web app), Python (machine learning), Node.js (crawling), PostgreSQL, Solr, Elasticsearch, Druid
4) We are profitable and growing fast
5) We're based in New York and Toronto (everyone works...