ETL Jobs

31 jobs were found based on your criteria

Fixed-Price - Expert ($$$) - Est. Budget: $50 - Posted
We're looking for Jaspersoft developers who can:
  • Deliver an end-to-end BI implementation
  • Demo the sample application to our prospects
  • Do requirements analysis, find gaps, and suggest best practices
  • Understand the customer's business and challenges and translate them into a technology solution
  • Show experience on Jaspersoft open-source BI projects, developing ETL and front-end reports that connect to various data sources
  • Be well versed in SQL/PL-SQL, Java, and other related technologies
Skills: "Extract, Transform and Load (ETL)" ireport JasperReports Java
Fixed-Price - Expert ($$$) - Est. Budget: $500 - Posted
Hi, this is Nandani from Softstandard Solutions LLC. Hope you are doing well. We have an urgent requirement for an ETL Developer for a remote project where you would spend 2-3 hours weekly.
Skills: "Extract, Transform and Load (ETL)"
Fixed-Price - Intermediate ($$) - Est. Budget: $100 - Posted
Need an online tutor for Informatica PowerCenter. This is mostly offline support: we will give you questions and you will create a video recording answering each question, or at times it will be email support. You never need to show your face if you don't want to. For ONE hour of video recording (4 videos of 15 minutes each), you will be paid $15. Thanks, Kish
Skills: "Extract, Transform and Load (ETL)" Informatica Oracle database Oracle Database Administration
Fixed-Price - Expert ($$$) - Est. Budget: $100 - Posted
Objective:
  1. At the end of each day, automatically import data from an affiliate network into Google Sheets via its API (perhaps using Node.js).
  2. Once the data is in Google Sheets, a script (Google Apps Script) should automatically gather and aggregate the data, organizing it under each Google Analytics KEY and its Custom Dimensions.
  3. Once the data is gathered and properly organized under its corresponding dimension, automatically export it to Google Analytics.

In summary, we need a system that, at a specific time each day, automatically extracts the relevant information from the affiliate platform and uploads it to Google Analytics.

API - data we need extracted: Conversions + Payout + Time + SubID 1 + SubID 2 + SubID 3 + SubID 4 + Clicks + Pending Payout + Session IP.

Script - once the data has been downloaded, the script should automatically:
  1. Decode SubID 2.
  2. Delete rows that contain specific words.
  3. Order the URLs alphabetically.
  4. Extract the URL information from a column and place it on another sheet (Sheet 2) under the heading KEY, then delete all duplicate URLs.
  5. On Sheet 1, aggregate all the revenue generated by each individual URL and input the sum on Sheet 2 under its corresponding dimension.
  6. On Sheet 1, aggregate all the conversions generated by each individual URL and input the sum on Sheet 2 under its corresponding dimension.

Notes: the KEY we will use to link the affiliate data and the Google Analytics data is the page URL, which we extract from a parameter in the tracking code (SubID 2).

Questions: is it possible to also organize the data by time, so we can see the information reported by hour? For example, how many conversions and how much revenue were generated at a specific time.
Skills: "Extract, Transform and Load (ETL)" Google Analytics Google Analytics API Node.js
Fixed-Price - Expert ($$$) - Est. Budget: $100 - Posted
Currently I have a dashboard (see attached image) that can be found at: The data for this dashboard is queried from a REST endpoint. The current UI is written in Angular, relying on a D3 library. The job is to create an identical dashboard in a Databricks notebook. The notebook should preferably be written in Python, or otherwise in Scala.
Skills: "Extract, Transform and Load (ETL)" Data Visualization
Fixed-Price - Intermediate ($$) - Est. Budget: $300 - Posted
I need pricing and other relevant data on the lodging industry in the vicinity of the Bluecut fire in Southern California, which burned from August 16th through August 22nd. This means I need data from the past. I have already tried Google's web cache and the Wayback Machine; these do not work. The farther back in time the better; one year or more would be terrific. I am looking for a freelancer who can use data-scraping techniques and internet archives to scrape, at a minimum, the price and zip code for each listing by date and by number of guests. For each day and each listing within 100 miles of the fire, I would like the price for a one-night stay. The specific zip codes are shown in the document. Also extract the qualitative aspects of listings, such as whether an Airbnb listing has a gym (1 or 0). The data absolutely must include the August 16th through 22nd dates I mentioned, some before, and some after. This project will run until at least October 22nd, so that data will exist for 2 full months after the event. Some basic information about the fire can be found here:
Skills: "Extract, Transform and Load (ETL)" Web Crawling Data Science Data scraping