ETL Jobs

59 jobs were found based on your criteria.

Fixed-Price - Intermediate ($$) - Est. Budget: $750 - Posted
This is an ETL project. My business website is built on Magento, with two front-ends (an e-commerce front-end and an informational front-end) sharing a single back-end. I am a medical equipment and supply dealer representing many manufacturers and wholesalers.
Skills: "Extract, Transform and Load (ETL)" Magento
Fixed-Price - Entry Level ($) - Est. Budget: $30 - Posted
Looking for someone to collect information about the main reasons onshore wind scheme planning applications were refused. What you need to do is: 1. Google the planning authority. 2. Use the reference ID to find the specific application. 3. Find out the reason the proposal was refused. 4. Record it in the Excel list. Each person will only need to look up about 20 applications, and there is no need to read every document in each one; only the 'Decision' (or 'Appeal Decision') and 'Committee Decision' documents are needed to find the reason.
Skills: "Extract, Transform and Load (ETL)" Web Crawling Data Entry Data mining
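The recording step in the post above (one row per refused application) can be kept in a plain CSV file that Excel opens directly. A minimal sketch, with hypothetical column names since the post does not specify a layout:

```python
import csv

# Hypothetical column names -- the job post does not specify a layout.
FIELDS = ["reference_id", "planning_authority", "decision_document", "refusal_reason"]

def record_applications(rows, path):
    """Write one row per refused application to a CSV that Excel can open."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(rows)

# Invented example row, for illustration only.
record_applications(
    [{"reference_id": "APP/2016/0001",
      "planning_authority": "Example Council",
      "decision_document": "Appeal Decision",
      "refusal_reason": "Visual impact on the landscape"}],
    "refusals.csv",
)
```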
Hourly - Expert ($$$) - Est. Time: 1 to 3 months, Less than 10 hrs/week - Posted
Help the team design best practices, assist with building our Pentaho environment and servers, and develop and test several jobs. Responsibilities: designing and developing Pentaho DI jobs and transformations, and connecting to various data stores (Hadoop, Redshift, Postgres); writing unit tests against Pentaho jobs in a continuous-integration environment; connecting Pentaho to REST APIs; writing complex SQL queries; designing and developing jobs that capture changes in upstream databases and sync them downstream; administering, installing, and configuring components of the Carte server; writing shell scripts; setting up logging, exception handling, notification, and performance tuning of Pentaho jobs. Requirements: 5+ years with PDI; experience with the entire project life cycle; strong written and verbal communication skills; strong critical-thinking and systematic problem-solving skills.
Skills: "Extract, Transform and Load (ETL)" Pentaho
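The "capture changes in upstream databases and sync them downstream" job described above can be prototyped outside Pentaho. A minimal sketch using in-memory SQLite stand-ins for the upstream and downstream stores; the table, columns, and version-column convention are all assumptions, not part of the post:

```python
import sqlite3

def sync_changes(upstream, downstream, last_seen):
    """Copy rows changed upstream since last_seen into the downstream store.

    Assumes the upstream table carries a monotonically increasing `version`
    column (a common change-data-capture convention); a real PDI job might
    instead read a timestamp column or the database's change log.
    """
    rows = upstream.execute(
        "SELECT id, name, version FROM customers WHERE version > ?", (last_seen,)
    ).fetchall()
    for row in rows:
        downstream.execute(
            "INSERT INTO customers (id, name, version) VALUES (?, ?, ?) "
            "ON CONFLICT(id) DO UPDATE SET name=excluded.name, version=excluded.version",
            row,
        )
    downstream.commit()
    # Return the new high-water mark for the next incremental run.
    return max((r[2] for r in rows), default=last_seen)

up = sqlite3.connect(":memory:")
down = sqlite3.connect(":memory:")
for db in (up, down):
    db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, version INTEGER)")
up.execute("INSERT INTO customers VALUES (1, 'Acme', 1), (2, 'Globex', 2)")
watermark = sync_changes(up, down, last_seen=0)
```

Persisting the returned watermark between runs is what makes the job incremental rather than a full reload.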
Fixed-Price - Intermediate ($$) - Est. Budget: $2,600 - Posted
Performance Test environment. Requirements: We are looking for an independent and proactive ETL Application Developer with 5+ years of experience in the same or a related position. ... Control-M) - Deep expertise in writing Oracle PL/SQL and in ETL tools - Strong knowledge of UNIX, shell, and Perl scripting - University degree (preferably in IT or Finance). Nice to have: - Good knowledge of DB modeling - Knowledge of JIRA/Quality Center - Experience with Agile development methodology - Excellent written and verbal communication skills - Ability to work independently and collaboratively. We offer: - Stable, long-term work among professionals on the basis of a contract of employment, B2B contract, contract of mandate, or contract for specific work - Real influence on the progress of your career path - Work in a team where your opinion matters - Support of a Business Assistant dedicated to the Specialist throughout the entire period of cooperation - Integration events, sport events, e.g.
Skills: "Extract, Transform and Load (ETL)" Database Modeling Informatica Oracle PL/SQL
Fixed-Price - Intermediate ($$) - Est. Budget: $300 - Posted
I need pricing and other relevant data on the lodging industry in the vicinity of the Bluecut fire in Southern California, which burned from August 16th through August 22nd. I am looking for a freelancer who can use data-scraping techniques and internet archives to scrape - at a minimum - the price and zip code for each listing, by date and by number of guests. For each day and each listing within 100 miles of the fire, I would like the price for a one-night stay. If you could also extract the qualitative aspects of listings - such as whether an Airbnb listing has a gym (1 or 0) - that would be stellar, and we can negotiate a bonus for that. I would like the data to range from June 16th through October 22nd. Even though the data do not fully exist yet, I would like to begin the project sooner rather than later to find out from an expert what the technological capabilities are for scraping these or similar sites. Some basic information about the fire can be found here:
Skills: "Extract, Transform and Load (ETL)" Web Crawling Data Science Data scraping
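The "within 100 miles of the fire" filter in the post above is straightforward once each listing's coordinates are known. A minimal sketch of the distance test; the fire coordinates are approximate (Cajon Pass area) and the listing data is invented:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in statute miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3958.8 * 2 * asin(sqrt(a))  # mean Earth radius ~3958.8 mi

# Approximate origin of the Bluecut fire (Cajon Pass area) -- an assumption.
FIRE_LAT, FIRE_LON = 34.31, -117.46

def within_100_miles(listing):
    return haversine_miles(FIRE_LAT, FIRE_LON, listing["lat"], listing["lon"]) <= 100.0

listings = [
    {"id": "a1", "lat": 34.05, "lon": -118.24},  # Los Angeles, roughly 50 mi away
    {"id": "b2", "lat": 36.17, "lon": -115.14},  # Las Vegas, well over 100 mi away
]
nearby = [l["id"] for l in listings if within_100_miles(l)]
```

The scraper would attach coordinates (or geocoded zip codes) to each listing and apply this predicate before storing the daily price rows.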
Hourly - Intermediate ($$) - Est. Time: More than 6 months, 30+ hrs/week - Posted
Looking to hire a full-time Microsoft SQL/SSIS expert ($1,600/month) with a sound understanding of databases. To test your understanding of database structures, we ask that you provide in your application a summary of how this database is structured:
Skills: "Extract, Transform and Load (ETL)" Data Modeling SQL SQL Server Integration Services (SSIS)
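A database-structure summary like the one requested above can be generated rather than written by hand. A minimal sketch against an in-memory SQLite stand-in with invented tables; the post concerns SQL Server, where querying `INFORMATION_SCHEMA.COLUMNS` plays the same role as the PRAGMA used here:

```python
import sqlite3

def summarize_schema(conn):
    """Return {table: [(column, declared_type), ...]} for every user table."""
    summary = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
    ).fetchall()
    for (table,) in tables:
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        summary[table] = [(c[1], c[2]) for c in cols]
    return summary

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
schema = summarize_schema(conn)
```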
Hourly - Expert ($$$) - Est. Time: Less than 1 week, 10-30 hrs/week - Posted
Our client has a document management system powered by 4D Server version 2004. The company supplying the replacement document management solution will handle importing the data into the new system. We need someone who is an expert with 4D databases and able to extract certain data into a series of flat files or construct a linkage to a SQL database. Remote Desktop access can be granted to the legacy production server.
Skills: "Extract, Transform and Load (ETL)" SQL
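The "series of flat files" half of the job above is generic once the 4D data can be queried. A minimal sketch exporting each table of a database to its own tab-delimited file with a header row; SQLite stands in for the 4D/SQL linkage, and the table and data are invented:

```python
import csv
import sqlite3

def export_tables(conn, tables, delimiter="\t"):
    """Write each table to <table>.txt as a delimited flat file with a header row."""
    for table in tables:
        cur = conn.execute(f"SELECT * FROM {table}")
        header = [d[0] for d in cur.description]  # column names from the cursor
        with open(f"{table}.txt", "w", newline="", encoding="utf-8") as f:
            writer = csv.writer(f, delimiter=delimiter)
            writer.writerow(header)
            writer.writerows(cur)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE documents (id INTEGER, title TEXT)")
conn.execute("INSERT INTO documents VALUES (1, 'Policy manual'), (2, 'Invoice 42')")
export_tables(conn, ["documents"])
```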
Hourly - Entry Level ($) - Est. Time: 3 to 6 months, Less than 10 hrs/week - Posted
I need help optimizing an ImageMagick/PowerShell script with several additional components. (The script is already complete in a basic version. I've selected an analytics professional rather than a programming professional, because you need to be experienced with extracting data from JFIF/JPG images to perform much of the below.) The components below will sit inside the main action; please just worry about completing the components. ...
1) Create a script only to REDUCE images: if an image's quality is set too high, it may take an inappropriate amount of space, so I want to reduce incoming images with Q >= 94 to Q = 83. (View the compression level with ImageMagick from the command line: identify -format '%Q' yourimage.jpg; you get a value from 0 (low quality) to 100.)
2) "Officially" determine the best final output method for saving all exported images (using jpegtran/ImageMagick) for export to the web. (Err on the side of quality, and use all lossless compression techniques available.)
3) Losslessly convert all JPG (and PNG?) images to the smallest available size (overwrite action). ( jpegcrush ) ( or automation ) ( or )
4) Remove ALL optional markers in the JFIF/JPG, but not the Exif.
5) Find images with a quality setting of ~30% or below. Sort out to a folder.
6) Find JPEGs whose bit depth is not 8 bits/channel. Sort out to a folder.
7) Find JPEG images in black/white mode. Sort out to a folder.
8) Find progressive JPEG images and JPEG images not in RGB mode. (In Photoshop (Image -> Mode), the mode can be set to RGB or any of the following: Bitmap, Greyscale, Duotone, Indexed, CMYK, Lab, or Multichannel; I would like to find all of these.) Sort out to a folder.
9) Find JPEG images with color profiles that are not RGB (and not empty). Sort to a folder and prepend the profile name to the filename.
10) Question/answer: can the color space be determined by reading the image itself (NOT the Exif)?
11) Be able to port to new folders for these 'seven types': .
12) Effect a side script for final export: add bytes to the image data near the end of the file (not Exif) to identify Self (Silhouette Graphics).
13) Detect whenever non-image data has been inserted into JFIF/JPEG files (such as merging with (DOS:) copy /b image.jpg + text.txt final.jpg, or via the method used in 12 above).
14) Question/answer: is Exif used for BMP or GIF?
15) Effect a side script for lossless rotation (if you know of a good utility that does this, that is fine!): load the image, rotate it, clear the orientation bit, and rewrite with updated/removed Exif.
Skills: "Extract, Transform and Load (ETL)" Adobe Photoshop Data Science Data scraping
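Item 13 above (data appended after the image, e.g. via `copy /b image.jpg + text.txt`) can be detected without any image library: a complete JPEG ends at the EOI marker `FF D9`, so anything after it is foreign. A minimal stdlib sketch; the sample bytes are synthetic, not a real photo:

```python
def trailing_bytes(data: bytes) -> bytes:
    """Return whatever follows the last JPEG EOI marker (FF D9); empty if clean.

    Taking the last FF D9 is a heuristic: Exif thumbnails embed their own EOI,
    so the final occurrence is used, and in principle compressed scan data
    could contain the same two bytes by coincidence.
    """
    eoi = data.rfind(b"\xff\xd9")
    if eoi == -1:
        raise ValueError("no EOI marker: not a complete JPEG")
    return data[eoi + 2:]

clean = b"\xff\xd8" + b"\x00" * 8 + b"\xff\xd9"  # minimal SOI...EOI shell
tampered = clean + b"hidden payload"              # copy /b style append
```

A batch script would run this over each file and sort any file with a non-empty result into the flagged folder.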