Hire the best Hadoop Developers & Programmers in Armenia
Check out Hadoop Developers & Programmers in Armenia with the skills you need for your next job.
- $120 hourly
- 4.9/5
- (24 jobs)
I am a co-founder and CTO of ABSM Data Solutions. My team will design and implement a scalable Data Platform: ingesting data from any source, optimizing storage costs for 100+ TB of data, and streaming data into your analytical and production systems with <100ms latency.

🤝 What you get when you hire us
— Quality: I will personally track implementation progress and keep you posted.
— System design: we will help you understand your technical needs and design a Data Platform for your particular use case.
— End-to-end approach: we will implement a working MVP in 2–3 weeks on average.
— Experience: 10 years of hands-on work with AWS, GCP, and other clouds, using Airflow, BigQuery, ClickHouse, Kafka, S3, Redshift, Snowflake, Spark, and Hadoop.

🔥 Our recent projects
— Collected 5 TB of blockchain history to calculate and display Ethereum wallet balances with <100ms latency. Using: ECS, S3, DynamoDB, Athena, Glue.
— Connected data from 15 marketing and e-commerce ad accounts to report marketing ROI via real-time dashboards. Using: Airflow, ClickHouse, Tableau, Superset, Kinesis.
— Built scalable ML infrastructure for trend analysis and anomaly detection for investment funds, increasing revenue by 38%. Using: GCP, Dataflow, FB Prophet.
— … and more than 40 other complex solutions!

🟢 Why us?
1. You define a business goal; we offer the fastest and most efficient solution.
2. We deliver the solution end to end: from design to implementation and maintenance.
3. Average MVP delivery: 2–3 full-time weeks.
4. Fully documented project: we transfer the necessary knowledge to your team to reduce your costs for maintenance and further development.
5. Near-100% system uptime (dependent on the cloud provider's uptime).
6. Data ingestion at 10,000 requests per second (AWS) and higher.
7. Storage of 5 TB of uncompressed data for $5 monthly (S3).
8. Quick data transfer to your systems: <100ms latency.

👨‍🎓 Academic excellence
Master's degree in Computer Science. Authored a scientific paper in Big Data and Computer Science. I am an active open-source contributor: my Airflow ClickHouse Plugin is in the top 1% of Python packages worldwide. See my code: github.com/bryzgaloff

🤙 Let's have a 30-minute call!
Sign up for a free 30-minute video call: you describe your task in business terms, and I will walk you through possible solutions and implementation steps. Within three business days, I will provide timeline and cost estimates.
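The $5-per-month figure for storing 5 TB on S3 quoted above only lines up with S3's archival tiers, not S3 Standard. A quick back-of-the-envelope check, using approximate published per-GB monthly rates (the rates below are illustrative assumptions; actual pricing varies by region and over time):

```python
# Rough S3 monthly storage cost check. The per-GB rates are approximate
# us-east-1 list prices and should be treated as assumptions, not quotes.
RATES_PER_GB_MONTH = {
    "S3 Standard": 0.023,
    "S3 Glacier Deep Archive": 0.00099,
}

def monthly_cost(tb: float, tier: str) -> float:
    """Storage-only monthly cost in USD for `tb` terabytes (1 TB = 1024 GB)."""
    return tb * 1024 * RATES_PER_GB_MONTH[tier]

# 5 TB on S3 Standard costs on the order of $115/month, while the same data
# in Glacier Deep Archive is roughly $5/month — matching the claim above.
for tier in RATES_PER_GB_MONTH:
    print(f"5 TB on {tier}: ${monthly_cost(5, tier):.2f}/month")
```

Note this counts storage only; request, retrieval, and data-transfer charges are billed separately and can dominate for archival tiers.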
Skills: Hadoop, ClickHouse, Amazon Web Services, Data Warehousing, Big Data, Cloud Architecture, Blockchain, Cloud Computing, Apache Hadoop, Apache Spark, ETL Pipeline, Apache Airflow, Kubernetes, Python, SQL

- $40 hourly
- 4.8/5
- (66 jobs)
I have been a Linux DevOps and cloud architect since 2002. Most of my professional career has involved the design, setup, and DevOps of medium- and high-load web farms and time-critical NoSQL databases that require 24/7/365 uptime. Over the last several years, I have concentrated on the architecture and administration of the Hadoop ecosystem, Big Data systems (Cassandra, Elasticsearch, Riak, and others), and the distributed storage system Ceph. I have extensive experience with a variety of web servers and load balancers (Apache, Nginx, HAProxy, Tomcat, Jetty, etc.) as well as with cloud services such as AWS, Azure, and GCP.
Skills: Hadoop, Big Data, Apache HBase, Linux System Administration, Apache Cassandra, Golang, Nomad, CI/CD Platform, Apache Hadoop, Consul, Kubernetes, Elasticsearch, Google Cloud Platform, Python, Amazon Web Services, Linux

- $25 hourly
- 5.0/5
- (4 jobs)
Hi, I'm Van, a Big Data Engineer with extensive experience on FAANG projects, and I'll help you level up 🚀 your business with the right data-driven decisions. 📊 I build efficient and cost-effective data pipelines, ensuring smooth data flow and analysis to boost business performance. I can also find data for you.
Skills: Hadoop, Databricks Platform, Apache Kafka, Snowflake, Apache Hive, Apache Hadoop, Apache NiFi, Python, Java, Scala, Amazon Web Services, Google Cloud Platform, Apache Airflow, Apache Beam, Apache Spark

- $21 hourly
- 0.0/5
- (1 job)
I'm a developer experienced in Python, Java, Apache Spark, and ETL pipelines. I've worked on projects for two major Russian banks.
Skills: Hadoop, C++/CLI, Apache Spark, Apache Airflow, ETL Pipeline, Apache Hadoop, Agile Software Development, Git, ETL, Java, SQL, Golang, Python

Want to browse more freelancers?
Sign up
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.