Hire the best Amazon S3 Developers in Jersey City, NJ
Check out Amazon S3 Developers in Jersey City, NJ with the skills you need for your next job.
- $80 hourly
- 5.0/5
- (7 jobs)
I have been coding for over 20 years and hold a computer science degree, a PhD, and 7 years of IT consulting experience with major Fortune 500 clients. I am AWS certified, have intermediate experience with GCP, and am familiar with Azure. I am proficient in Python, SQL, TypeScript, Bash, Terraform, Java, React Native, and other languages. I have worked with hundreds of APIs, including Google Sheets, OpenAI, GitHub, and many more.
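The Python/AWS/S3 stack this profile lists maps to everyday tasks like landing files in a bucket. Here is a minimal boto3 upload sketch; the date-partitioned key helper and bucket layout are illustrative assumptions, not details from the profile:

```python
import mimetypes
from datetime import datetime, timezone

def build_key(prefix: str, filename: str) -> str:
    """Build a date-partitioned S3 key like 'raw/2024/05/01/report.csv'."""
    now = datetime.now(timezone.utc)
    return f"{prefix}/{now:%Y/%m/%d}/{filename}"

def upload_file(path: str, bucket: str, prefix: str = "raw") -> str:
    """Upload a local file to S3 and return the object key.
    Assumes AWS credentials are already configured in the environment."""
    import boto3
    key = build_key(prefix, path.rsplit("/", 1)[-1])
    content_type = mimetypes.guess_type(path)[0] or "application/octet-stream"
    boto3.client("s3").upload_file(
        path, bucket, key, ExtraArgs={"ContentType": content_type}
    )
    return key
```

Date-partitioned keys like these keep listings fast and make lifecycle rules easy to scope to a prefix.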
Amazon S3 · ChatGPT · GPT-4 · Data Science · Scripting · Google Cloud Platform · Amazon Web Services · JavaScript · Bash · Terraform · API Development · AWS CloudFormation · Google Sheets · Apache Spark · Python
- $40 hourly
- 5.0/5
- (6 jobs)
I am a highly efficient and reliable professional with a broad skill set in mobile app and web application development (8 years of experience). A highly motivated, versatile expert, I have demonstrated expertise in software design and delivery covering mobile apps and on-premise software.
Skills:
1. Native app development (8 years): iOS (Swift, UIKit, SwiftUI), Android (Kotlin, Java, Jetpack Compose)
2. Hybrid app development: Flutter (5 years), React Native (3 years)
3. Web development (5 years): PHP, Laravel, Angular, React
Amazon S3 · RESTful API · Amazon EC2 · App Development · Apple Xcode · RxSwift · MySQL · PHP · React Native · Flutter · Java · Kotlin · Android · Swift · iOS
- $75 hourly
- 0.0/5
- (4 jobs)
Hi, I’m Ramakrishna, an aspiring Data Engineer and a certified Tableau developer with over 2 years of experience in Data Science. My academic and practical experience in Data Science has shaped me into the dedicated and enthusiastic professional I am today. Beyond academics, I have gained invaluable experience through several extracurricular activities, strengthening my interpersonal skills and confidence. I am also a recent graduate with a Master's degree in Business Analytics, which gave me the skills to transition into an analytics professional. Excited about learning new technologies and applying them in practice, I have had the opportunity to work for industry leaders and consulting firms. Most recently, I worked as a Data Engineer at Amazon in the United States, where I gained hands-on experience delivering an end-to-end project alongside leading professionals in the domain. Before that, I spent over 2 years at a leading Data Science consulting startup in India. These are my core competencies:
- Effective communication: gathering project requirements from clients and producing polished deliverables
- Building data engineering pipelines using traditional and cloud-based solutions
- Developing dashboards for different management levels in a hierarchy
- Developing predictive models for informed decision making
Here’s a highlight of the services I commonly help my clients with:
☑️ Data Analytics Solutions & Business Intelligence Implementation (Amazon Web Services)
☑️ Data Visualisation & Dashboard Design / Development
☑️ Data Warehouse & Data Engineering Services
☑️ Data Science, Predictive Analytics & Machine Learning
☑️ Customer Segmentation & Churn Prediction
🛠 Here’s a highlight of the skill sets I commonly help my clients with:
- Data Visualization: Microsoft Power BI, Tableau, QuickSight, Google Data Studio
- Data Engineering: Amazon Web Services (AWS), AWS Lambda, AWS S3, AWS Glue, AWS Redshift, AWS MWAA
- Data Analytics: R, Python, SQL
A few of the industries I commonly work in:
🔸 Retail 🔸 E-commerce 🔸 Consumer Packaged Goods 🔸 Financial Services 🔸 Manufacturing and Production 🔸 Non-profit Organisations
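Pipelines built on S3, Glue, and Athena like the ones this profile describes usually depend on a Hive-style partitioned key layout. A small sketch of that convention (the function name and `sales` prefix are illustrative, not from the profile):

```python
from datetime import date, timedelta

def partition_prefixes(table_prefix: str, start: date, days: int) -> list[str]:
    """Generate Hive-style dt= partition prefixes under an S3 table location,
    the layout Glue crawlers and Athena expect when discovering partitions."""
    return [
        f"{table_prefix}/dt={start + timedelta(days=d):%Y-%m-%d}/"
        for d in range(days)
    ]

print(partition_prefixes("sales", date(2024, 5, 1), 3))
# ['sales/dt=2024-05-01/', 'sales/dt=2024-05-02/', 'sales/dt=2024-05-03/']
```

With keys laid out this way, Athena can prune partitions by `dt` instead of scanning the whole bucket.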
Amazon S3 · Apache Airflow · ETL Pipeline · Amazon Redshift · Data Warehousing · Amazon Athena · Data Engineering · AWS Glue · Tableau
- $90 hourly
- 0.0/5
- (0 jobs)
About Me
I’m a Senior Data Engineer and Machine Learning Practitioner with a strong software engineering background, specializing in data pipelines, AI/ML infrastructure, and cloud-native solutions. With expertise in AWS, Databricks, Apache Spark, LangChain, MLOps, and large-scale data architectures, I focus on building scalable, cost-efficient, production-ready solutions that drive real business impact. My work sits at the intersection of data engineering, machine learning, and cloud automation, allowing me to design, deploy, and optimize AI-powered applications end to end. I also hold a Master's in Data Science, further deepening my AI/ML expertise. I’ve worked in highly secure environments, including defense-related work, and I bring a security-first approach to data pipelines and machine learning deployments. Whether it’s streamlining ETL processes, optimizing ML models, or deploying AI-driven applications, I focus on solutions that scale efficiently and integrate seamlessly into enterprise cloud environments.
What I Offer
🔹 End-to-End AI Development – From data exploration to production deployment, I build and scale AI-driven applications that integrate seamlessly into cloud environments.
AI Model Development & Fine-Tuning
• Model Development & Optimization: Build and fine-tune models for structured and unstructured data, including computer vision (YOLO), NLP (transformers), clustering, anomaly detection, and time series forecasting.
• Transfer Learning & Fine-Tuning: Utilize pre-trained models (BERT, GPT, CLIP, Whisper, YOLOv8, LLaMA, Falcon, etc.) and fine-tune them for domain-specific applications to improve accuracy while reducing training costs.
• Hyperparameter Tuning: Automate hyperparameter optimization using Ray Tune, Optuna, and Bayesian optimization to enhance model performance.
• Few-Shot & Zero-Shot Learning: Implement low-data ML techniques to improve model generalization when limited labeled data is available.
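Hyperparameter tuning as described above is typically driven by tools like Ray Tune or Optuna; the core loop they automate can be sketched in plain Python as a random search over a toy objective (the objective function and search ranges here are illustrative stand-ins, not from the profile):

```python
import random

def objective(lr: float, reg: float) -> float:
    # Stand-in for a validation loss; its minimum is near lr=0.1, reg=0.01.
    return (lr - 0.1) ** 2 + (reg - 0.01) ** 2

def random_search(n_trials: int = 200, seed: int = 0):
    """Sample hyperparameters at random and keep the best trial; this is the
    baseline that Optuna-style samplers improve on with smarter strategies."""
    rng = random.Random(seed)
    best_loss, best_params = float("inf"), None
    for _ in range(n_trials):
        params = {"lr": rng.uniform(1e-4, 1.0), "reg": rng.uniform(1e-4, 0.1)}
        loss = objective(**params)
        if loss < best_loss:
            best_loss, best_params = loss, params
    return best_loss, best_params

loss, params = random_search()
```

Bayesian optimizers replace the uniform sampling with a model of the loss surface, but the trial/evaluate/keep-best structure stays the same.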
Retrieval-Augmented Generation (RAG) & Prompt Engineering
• RAG Pipelines: Build custom knowledge-augmented LLMs that efficiently retrieve and generate responses using FAISS, Pinecone, ChromaDB, and other vector databases.
• LLM Fine-Tuning & Customization: Fine-tune LLMs for domain-specific applications, optimizing for efficiency, cost, and accuracy.
• Prompt Engineering & Optimization: Craft optimized prompts for LLM applications, including Chain-of-Thought (CoT), ReAct, and meta-prompting strategies to enhance model reasoning.
• Custom AI Agents & Workflows: Develop LLM-powered AI agents that integrate with databases, APIs, and enterprise systems for intelligent automation.
Data Engineering & Cloud Pipelines
• ETL & Data Processing: Build efficient, serverless ETL pipelines using Apache Spark, AWS Glue, and dbt to process and transform large datasets.
• Streaming & Real-Time Data Pipelines: Architect real-time data ingestion and processing using AWS Kinesis, Kafka, and Flink.
• Data Lake & Warehouse Architecture: Design AWS-based data lakes (S3 + Glue) and enterprise data warehouses (Redshift, Snowflake).
• Infrastructure as Code (IaC): Automate cloud deployments using Terraform, CloudFormation, and Ansible.
• Cost-Optimized Cloud Solutions: Implement cost-efficient storage, compute scaling, and serverless architectures for AI/ML workloads.
Cloud & DevOps Automation
• CI/CD for Data & ML Pipelines: Build automated pipelines for ML training, model deployment, and data processing using AWS CodePipeline, Jenkins, and GitHub Actions.
• Kubernetes & Docker Orchestration: Deploy containerized applications with EKS (Elastic Kubernetes Service) for scalable ML models and data processing pipelines.
• Security & Compliance: Implement secure access controls, IAM policies, and data encryption to comply with SOC 2, HIPAA, and DoD security standards.
• Monitoring & Observability: Set up CloudWatch, Prometheus, and Grafana dashboards to track performance and optimize cloud resources.
Why Work With Me?
I bring a unique mix of data engineering, artificial intelligence, machine learning, cloud architecture, and security expertise, making me well-suited for highly complex, large-scale projects. Here’s what you can expect:
✔ End-to-End AI & Data Solutions – From raw data ingestion to production ML models, I handle the entire pipeline.
✔ Scalable & Cost-Efficient Infrastructure – I optimize architectures to reduce costs without sacrificing performance.
✔ Security-First Mindset – With experience in highly regulated industries, I ensure robust security and compliance.
✔ Production-Ready AI – I specialize in deploying AI models that deliver real business impact, not just research experiments.
✔ Cloud-Native Expertise – I build solutions that leverage AWS, Kubernetes, and Spark for enterprise-grade scalability.
If you’re looking for a data engineer who understands your business objectives, machine learning, DevOps, and cloud automation, let’s connect. 🚀
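The retrieval step of a RAG pipeline like the one this profile describes can be sketched without a vector database: bag-of-words vectors stand in for learned embeddings, and a brute-force cosine ranking stands in for FAISS or Pinecone (the toy corpus and function names are illustrative):

```python
import math
from collections import Counter

DOCS = [
    "S3 buckets store objects under keys",
    "Redshift is a data warehouse",
    "Glue crawls data in S3",
]

def vec(text: str) -> Counter:
    """Bag-of-words 'embedding'; a real pipeline uses a learned model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs=DOCS, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    q = vec(query)
    return sorted(docs, key=lambda d: cosine(q, vec(d)), reverse=True)[:k]
```

In production, the top-k passages come back from the vector store and are prepended to the LLM prompt; only the similarity machinery changes.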
Amazon S3 · dbt · Apache Airflow · Data Migration · Databricks Platform · Amazon Redshift · Amazon RDS · AWS Lambda · Amazon ECS · Retrieval Augmented Generation · LLM Prompt Engineering · LangChain · ETL · Machine Learning · Artificial Intelligence
- $28 hourly
- 5.0/5
- (1 job)
Hi, I am a Data Engineer with web development experience. I worked at a large retail corporation before I started freelancing. I am experienced in writing ETL pipelines to process data. I am also a native Mandarin/Cantonese speaker and am comfortable working with East Asian industries.
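A toy version of the ETL work this profile describes, using only the standard library (the sample data and schema are made up for illustration): extract parses CSV, transform drops bad rows and casts types, and load writes to a database, with SQLite standing in for a warehouse:

```python
import csv
import io
import sqlite3

RAW = """order_id,price,qty
A1, 9.99 ,2
A2,4.50,1
A3,,3
"""

def extract(text: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[tuple]:
    out = []
    for r in rows:
        if not r["price"].strip():
            continue  # drop rows with a missing price
        out.append((r["order_id"], float(r["price"]), int(r["qty"])))
    return out

def load(rows: list[tuple]) -> sqlite3.Connection:
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE orders (id TEXT, price REAL, qty INTEGER)")
    db.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return db

db = load(transform(extract(RAW)))
total = db.execute("SELECT SUM(price * qty) FROM orders").fetchone()[0]
```

In a PySpark or Glue job the same three stages appear as a read, a DataFrame transformation, and a write; only the engine changes.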
Amazon S3 · PySpark · AWS Glue · Translation · Spring Boot · Java · Python · Data Analysis · Data Engineering · Agile Software Development
- $25 hourly
- 0.0/5
- (0 jobs)
Hi, I am a full-stack engineer changing the world with React, GraphQL, and Node. I have experience with Azure and AWS.
Amazon S3 · Routing · Database · Beanstalk · Mocha Framework · Microsoft Azure · Node.js · React · JavaScript · MongoDB · Vue.js · ExpressJS · GraphQL · HTTP · Amazon Web Services
- $12 hourly
- 0.0/5
- (0 jobs)
DevOps Engineer | Cloud Specialist | Automation Expert
With a strong foundation in Information Technology and a Master's in Software Engineering, I bring a unique blend of academic excellence and hands-on experience in DevOps, cloud platforms, and infrastructure automation. My career highlights include enhancing large-scale systems, creating robust automation pipelines, and delivering innovative solutions that streamline operations and improve performance.
What I Offer:
- DevOps Expertise: Proven experience building and managing pipelines, automating workflows, and enhancing infrastructure stability using tools like Jenkins, Docker, Kubernetes, and Terraform.
- Cloud Proficiency: Skilled in leveraging AWS and Google Cloud Platform for automation, certificate management, and monitoring solutions.
- Programming Savvy: Adept in Python, Bash, and Ruby, with a track record of developing efficient scripts and automation tools.
- Security and Monitoring: Designed security pipelines to scan and quarantine files with ClamAV, ensuring compliance and safety for cloud data.
- Innovative Problem-Solving: Built an intelligent DevOps chatbot using a vector database and the Mistral 70B model, giving teams instant access to company-specific documentation.
Notable Projects:
- Automated health monitoring of a 50-node big data platform, generating comprehensive service reports and enhancing system reliability.
- Created a seamless integration that sends Nagios events to Google Chat using AWS SNS and Google Apps Script, reducing incident response time.
- Developed a RAG pipeline with a vector database to help DevOps teams retrieve critical information efficiently.
Why Hire Me?
I’m passionate about solving complex technical challenges and helping businesses optimize their infrastructure. Whether you need to automate processes, enhance system performance, or secure your cloud environment, I have the skills, tools, and drive to deliver exceptional results.
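The scan-and-quarantine pipeline mentioned above can be sketched as a routing step: a pure function decides the destination key, and a boto3 copy/delete performs the move. The `quarantine/` prefix, bucket handling, and the source of the `infected` verdict (the ClamAV scan) are assumptions for illustration, not details from the profile:

```python
def quarantine_key(key: str, infected: bool) -> str:
    """Route a scanned object: clean files keep their key,
    infected files move under the quarantine/ prefix."""
    return f"quarantine/{key}" if infected else key

def route_object(bucket: str, key: str, infected: bool) -> str:
    """Move an infected object to quarantine and return its final key.
    Assumes AWS credentials are configured; the verdict comes from the
    upstream ClamAV scan."""
    dest = quarantine_key(key, infected)
    if dest != key:
        import boto3
        s3 = boto3.client("s3")
        s3.copy_object(Bucket=bucket, Key=dest,
                       CopySource={"Bucket": bucket, "Key": key})
        s3.delete_object(Bucket=bucket, Key=key)
    return dest
```

Keeping the routing decision in a pure function makes the policy easy to unit-test without touching S3.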
Let’s collaborate to take your project to the next level! 🚀
Amazon S3 · AWS Glue · AWS Fargate · AWS Lambda · Automation · Bash · Big Data · Red Hat Enterprise Linux · Red Hat Administration · Flask · DevOps · Python · Kubernetes · Docker · CI/CD
Want to browse more freelancers?
Sign up
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.
How do I hire an Amazon S3 Developer near Jersey City, NJ on Upwork?
You can hire an Amazon S3 Developer near Jersey City, NJ on Upwork in four simple steps:
- Create a job post tailored to your Amazon S3 Developer project scope. We’ll walk you through the process step by step.
- Browse top Amazon S3 Developer talent on Upwork and invite them to your project.
- Once the proposals start flowing in, create a shortlist of top Amazon S3 Developer profiles and interview your favorites.
- Hire the right Amazon S3 Developer for your project from Upwork, the world’s largest work marketplace.
At Upwork, we believe talent staffing should be easy.
How much does it cost to hire an Amazon S3 Developer?
Rates charged by Amazon S3 Developers on Upwork can vary with a number of factors, including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.
Why hire an Amazon S3 Developer near Jersey City, NJ on Upwork?
As the world’s work marketplace, we connect highly skilled freelance Amazon S3 Developers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Amazon S3 Developer team you need to succeed.
Can I hire an Amazon S3 Developer near Jersey City, NJ within 24 hours on Upwork?
Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Amazon S3 Developer proposals within 24 hours of posting a job description.