Amazon S3 Jobs

70 jobs were found based on your criteria

Hourly - Intermediate ($$) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
We need some listing support on Amazon. We have a few products to be listed on Amazon, with variations in size and color. We want someone to take care of that while we send you details such as pictures and prices. I would like to work with someone who has been working as an expert for a long time.
Skills: Amazon S3
Hourly - Expert ($$$) - Est. Time: Less than 1 month, Less than 10 hrs/week - Posted
We have an EC2 instance (running Windows Server) and 2 RDS instances (running SQL Server) that are not currently in a VPC, and we would like to migrate them into one without breaking external connections and with minimal downtime. In addition, we would like to add an OpenVPN server that supports MFA (multi-factor authentication) to give us access to the otherwise externally inaccessible RDS instances. We have a few known complications, and there are possibly others that we will discover as we proceed. Primarily, we have a pair of Elastic IPs on our EC2 instance that need to move with the machine into the VPC, and we need to mitigate downtime so that we do not upset our existing customers. Secondarily, we hope this will prepare us to eventually move to a multi-AZ redundant system down the line. You will be required to sign an NDA and first build and document a proof of concept, then assist in the actual migration.
Skills: Amazon S3, Amazon EC2, Amazon Relational Database Service, Network Security
Hourly - Entry Level ($) - Est. Time: Less than 1 month, Less than 10 hrs/week - Posted
Hi, I am looking for a company that can manage my cloud-based servers hosted at AWS, DigitalOcean, Linode, Google Cloud, and OpenStack. Payment will be made hourly through Upwork, and only after the work is done. You have to provide support for Linux- and Windows-based servers. Our company uses the CentOS panel, WHM, and the Plesk panel on its servers. Send me a proposal with answers to the questions below; I want to filter out genuine bids only. 1. Which ports should you open in the host firewall for an email server? 2. What is YUM? 3. What is a security group in AWS? Thanks
Skills: Amazon S3, Amazon EC2, Amazon Relational Database Service, Amazon Web Services
Hourly - Entry Level ($) - Est. Time: More than 6 months, 10-30 hrs/week - Posted
We are looking for an experienced CodeIgniter developer with solid PHP skills and design experience to help us on a long-term project, starting out at a small number of hours per week and eventually expanding to full-time. The freelancer needs to be highly available, highly responsive, and able to meet strict deadlines. You need strong expertise and experience with PHP, CodeIgniter, SQL, HTML5, CSS, Bootstrap, JavaScript, jQuery, Ajax, Git/Bitbucket, and JIRA. You will also need acceptable design skills for basic and general UI design. Experience with Amazon Web Services, with PHP and JavaScript plugins and libraries, and with payment processor APIs are all pluses.
Skills: Amazon S3, Atlassian JIRA, BitBucket, CSS
Hourly - Entry Level ($) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
We're a small company looking for a Tech Virtual Assistant to help us with marketing campaigns and basic website maintenance. The job duties of this position include:
  • Setting up email marketing campaigns (autoresponders/sequences/tags)
  • Setting up the cart for sales (connecting the cart in InfusionSoft to the credit card processing system)
  • Setting up opt-in pages/landing pages/sales pages (we would give you the text/copy)
  • Adding products to the product cart when needed
  • Connecting newly purchased URLs to the database and WordPress
  • Setting up/updating the affiliate center for affiliates
  • Troubleshooting any problems that arise
The applicant is required to know: WordPress, Click Funnels, Optimize Press, InfusionSoft, and customer service.
Skills: Amazon S3, ClickFunnels, Customer support, Facebook Marketing
Fixed-Price - Intermediate ($$) - Est. Budget: $30,000 - Posted
We are looking for a specialist in AWS cloud infrastructure and service administration. Duties include administering Amazon cloud services and infrastructure such as EC2, RDS, Route 53, etc. Required: knowledge of infrastructure and of Linux and Windows operating system administration, and experience administering services on AWS.
Skills: Amazon S3, Amazon Relational Database Service, Amazon Web Services, AWS Lambda
Fixed-Price - Expert ($$$) - Est. Budget: $1,000 - Posted
There is a code example provided by Amazon Web Services at https://github.com/awslabs/amazon-kinesis-data-visualization-sample/tree/master/src/main/java/com/amazonaws/services/kinesis/samples/datavis, described in http://docs.aws.amazon.com/streams/latest/dev/kinesis-sample-application.html. The scope is fixed; only the input data formats can change slightly. We are creating a pipeline that processes a high volume of events, aggregates them, stores them in DynamoDB, and shows them in a near-real-time graph on the UI. You need to:
0. Read the provided article, then run and understand the code. If you understand it clearly, all the next steps will be easy. Also read and understand the task statement and discuss it with me.
1. Aggregate data of 5 types from 5 different Kinesis streams. The aggregation window is 1 second. The data types are: bid request, bid response, bid win, impression, and click. All are JSON records.
  • For bid requests: store TIMESTAMP as the DynamoDB hash key, w + h as the DynamoDB sort key, and the number of requests during this second.
  • For bid responses: store TIMESTAMP as the DynamoDB hash key, banner id as the sort key, and the number of responses during this second.
  • For bid wins: store TIMESTAMP as the DynamoDB hash key, banner id as the sort key, the number of wins during this second, and the total win price for this second. The total price is the sum of the winPrice field across all bid wins.
  • For impressions: store TIMESTAMP as the DynamoDB hash key, banner id as the sort key, the number of impressions during this second, and the total price of all impressions during this second.
  • For clicks: store TIMESTAMP as the DynamoDB hash key, banner id as the sort key, and the number of clicks during this second.
JSON examples are provided below. To do this you need to combine the classes provided. All appropriate tests should be created, and a console application to generate test stream data should be created for each of the 5 streams.
2. Create separate servlets (following the example) for each of the data types.
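The 1-second aggregation in step 1 can be sketched in plain Java as follows. This is a minimal in-memory sketch only; the real job would consume records via the Kinesis Client Library and write to DynamoDB, and the class and key names here are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of the 1-second aggregation window for bid-win records.
// Hash key = event timestamp truncated to the second, sort key = banner id.
public class BidWinAggregator {

    // Aggregate per (second, bannerId) key: win count and total winPrice.
    public static final class Counts {
        public long wins;
        public double totalWinPrice;
    }

    private final Map<String, Counts> table = new HashMap<>();

    // epochMillis, bannerId, and winPrice would be parsed from the JSON record.
    public void record(long epochMillis, String bannerId, double winPrice) {
        long second = epochMillis / 1000;          // truncate to the 1-second window
        String key = second + "#" + bannerId;      // composite (hash, sort) key
        Counts c = table.computeIfAbsent(key, k -> new Counts());
        c.wins++;
        c.totalWinPrice += winPrice;
    }

    public Counts get(long epochMillis, String bannerId) {
        return table.get((epochMillis / 1000) + "#" + bannerId);
    }
}
```

Two wins for the same banner inside the same second land in one row; an event one millisecond into the next second starts a new row, matching the 1-second window in the spec.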
Each graph should be configurable; it should show aggregates for:
  • the last 10 minutes, with a 2-second step, updated every 2 seconds;
  • the last day, with a 2-minute step, updated every 2 minutes;
  • the last week, with a 1-hour step, updated every hour;
  • the last month, with a 1-hour step, updated every hour.
For bid requests: just display the number of bid requests and update it every second. For bid responses: display the number of responses for a group of banner ids, passed as a servlet request parameter. For bid wins: display 2 parameters, the number of wins and the win total, for a group of banner ids passed as a servlet request parameter. For impressions: display 2 parameters, the number of impressions and the total impression price, for a group of banner ids passed as a servlet request parameter. For clicks: display the number of clicks for a group of banner ids passed as a servlet request parameter.
3. Create separate near-real-time graphs for the following:
  • eCPM (effective cost per mille) = total impression price / number of impressions * 1000
  • eCPC (effective cost per click) = total impression price / number of clicks * 100
  • CTR (click-through rate) = number of clicks / number of impressions * 100
Each of the 3 graphs should take a group of banner ids as a servlet parameter. Graphs should also be shown for the last month, week, hour, and 10 minutes.
4. For each stream: after the data are aggregated, store all stream data to S3 files. Data should be stored as is, with no conversions. Files should be stored in 5 folders, with a subfolder created for each day, hour, and minute. Each file can contain a batch of JSON records, separated by the line "===============". Many files can be created for each minute.
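Taken literally, the three metric formulas in step 3 (including the posting's own *100 factor on eCPC) compute as below; the class name is hypothetical and zero denominators are mapped to 0 as a defensive assumption:

```java
// The three derived metrics, exactly as the posting defines them.
public class BannerMetrics {

    // eCPM (effective cost per mille) = total impression price / impressions * 1000
    public static double ecpm(double totalImpressionPrice, long impressions) {
        return impressions == 0 ? 0 : totalImpressionPrice / impressions * 1000;
    }

    // eCPC = total impression price / clicks * 100 (factor as given in the posting)
    public static double ecpc(double totalImpressionPrice, long clicks) {
        return clicks == 0 ? 0 : totalImpressionPrice / clicks * 100;
    }

    // CTR (click-through rate, percent) = clicks / impressions * 100
    public static double ctr(long clicks, long impressions) {
        return impressions == 0 ? 0 : (double) clicks / impressions * 100;
    }
}
```

For example, $5.00 of impression spend over 1,000 impressions gives an eCPM of 5.0, and 10 clicks on those 1,000 impressions gives a CTR of 1.0%.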
Use https://github.com/awslabs/amazon-kinesis-connectors/blob/master/src/main/java/com/amazonaws/services/kinesis/connectors/s3/S3Emitter.java. A console application should also be created that downloads all the files of a specified stream for a specified period to a local folder; the stream and period should be specified as command-line arguments.
5 (bonus). Update the AWS template file to create all the needed streams, DynamoDB tables, policies, roles, and so on, to easily deploy the whole infrastructure.
The scope is fixed; only the input data formats can change slightly. Data examples are attached; a click example will come later. Please create a repository on GitHub (or fork the provided repo) and push your changes every day. We will also need support from time to time.
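The per-stream day/hour/minute folder layout and the "===============" record separator from step 4 could be generated like this. This is a sketch: the exact key pattern and file naming beyond "stream/day/hour/minute" are assumptions, not part of the spec:

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;
import java.util.List;
import java.util.UUID;

// Sketch of the S3 object layout from step 4: one folder per stream, then
// day/hour/minute subfolders; each object holds a batch of raw JSON records
// joined by a "===============" separator line, with no conversion.
public class S3KeyLayout {

    private static final DateTimeFormatter FMT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd/HH/mm").withZone(ZoneOffset.UTC);

    // e.g. "bid-win/2016-01-02/13/07/<random-uuid>.json"
    public static String objectKey(String streamName, Instant batchTime) {
        return streamName + "/" + FMT.format(batchTime) + "/" + UUID.randomUUID() + ".json";
    }

    // Join a batch of raw JSON records with the separator line from the spec.
    public static String batchBody(List<String> jsonRecords) {
        return String.join("\n===============\n", jsonRecords);
    }
}
```

Because many files can be created per minute, each object key gets a random suffix so concurrent emitters never overwrite each other within the same minute folder.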
Skills: Amazon S3, Amazon DynamoDB, Java, Multithreaded Programming