ETL Jobs

33 jobs were found based on your criteria

Hourly - Expert ($$$) - Est. Time: 1 to 3 months, Less than 10 hrs/week - Posted
Help the team design best practices, assist with building our Pentaho environment and servers, and develop and test several jobs. Responsibilities:
  • Designing and developing Pentaho DI jobs and transformations, and connecting with various data stores (Hadoop, Redshift, Postgres)
  • Writing unit tests against Pentaho jobs in a continuous integration environment
  • Connecting Pentaho to REST APIs
  • Writing complex SQL queries
  • Designing and developing jobs that capture changes in upstream databases and sync them downstream
  • Administering, installing, and configuring components of the Carte server
  • Writing shell scripts
  • Setting up logging, exception handling, notification, and performance tuning of Pentaho jobs
Requirements:
  • 5+ years with PDI
  • Experience with the entire project life cycle
  • Strong communication skills, written and verbal
  • Strong critical thinking and systematic problem-solving skills
Skills: "Extract, Transform and Load (ETL)" Pentaho
Hourly - Intermediate ($$) - Est. Time: More than 6 months, 30+ hrs/week - Posted
Looking to hire a full-time Microsoft SQL/SSIS expert ($1600/month) with a sound understanding of databases. To test your understanding of database structures, we ask that you include in your application a summary of how this database is structured: http://ec.europa.eu/eurostat/estat-navtree-portlet-prod/BulkDownloadListing
Skills: "Extract, Transform and Load (ETL)" Data Modeling SQL SQL Server Integration Services (SSIS)
Hourly - Expert ($$$) - Est. Time: Less than 1 week, 10-30 hrs/week - Posted
Our client has a document management system powered by 4D Server version 2004. The company supplying the replacement document management solution will handle importing the data into the new system. We need someone who is an expert with 4D databases and able to extract certain data into a series of flat files, or to construct some linkage to a SQL database. Remote Desktop access can be granted to the legacy production server.
Skills: "Extract, Transform and Load (ETL)" SQL
Hourly - Entry Level ($) - Est. Time: 3 to 6 months, Less than 10 hrs/week - Posted
I need help optimizing an ImageMagick/PowerShell script with several additional components. (The script is already complete, in a basic version. I've selected an analytics rather than a programming professional, because you need to be experienced with extracting data from JFIF/JPG images to perform much of the below.) The components below will sit inside the main Action. Please just worry about completing the components. ...
  1) Create a script only to REDUCE images. (If an image has its quality set too high, it may take an inappropriate amount of space; therefore, I want to reduce incoming images with Q >= 94 to Q = 83. View the compression level with ImageMagick from the command line: identify -format '%Q' yourimage.jpg gives a value from 0 (low quality) to 100.)
  2) "Officially" determine the best final output method for saving all exported images (using jpegtran/IM) for export to the Web. (Err on the side of quality, and use all lossless compression techniques available.)
  3) Losslessly convert all JPG (and PNG?) images to the smallest available size (overwrite action). (jpegcrush, or https://imageoptim.com/mozjpeg automation, or https://github.com/tjko/jpegoptim)
  4) Remove ALL "optional markers" in the JFIF/JPG, but not Exif.
  5) Find images with a quality setting of ~30% or below. Sort them out to a folder.
  6) Find JPEGs whose bit depth is not 8 bits/channel. Sort them out to a folder.
  7) Find JPEG images whose mode is black/white. Sort them out to a folder.
  8) Find progressive JPEG images, and JPEG images not in RGB mode. (In Photoshop (Image -> Mode), the mode can be set to RGB plus any of the following: Bitmap, Greyscale, Duotone, Indexed, CMYK, Lab, or Multichannel; I would like to find all of these.) Sort them out to a folder.
  9) Find JPEG images with color profiles that are not RGB (and not empty). Sort them to a folder and append the profile name to the beginning of the filename.
  10) Question/answer: can the color space be determined by reading the image itself (NOT the Exif)?
  11) Be able to sort out to new folders these 'seven types': http://fileformats.archiveteam.org/wiki/JPEG#Types_of_JPEG_files
  12) Create a side script for final export: add bytes to the image data near the end of file (not Exif) to identify Self (Silhouette Graphics).
  13) Find whenever non-image data has been inserted into JFIF/JPEG files (such as merging with (DOS:) copy /b image.jpg + text.txt final.jpg, or via the method used in 12, above).
  14) Question/answer: is Exif used for BMP or GIF?
  15) Create a side script for lossless rotation (if you know of a good utility that does this, that is fine!): load the image, rotate, clear the orientation bit, and rewrite with updated/removed Exif.
Skills: "Extract, Transform and Load (ETL)" Adobe Photoshop Data Science Data scraping
Hourly - Expert ($$$) - Est. Time: Less than 1 month, 10-30 hrs/week - Posted
Here's the basic idea. I want to create a fairly simple ETL architecture that can pull in data from a variety of sources and formats. ... The part I need help with is thinking through the technologies I should use, as well as what the ETL pipeline architecture should look like. I'm sure many of you have worked on Big Data analytics projects and have learned a lot in the process.
Skills: "Extract, Transform and Load (ETL)" Amazon Web Services AWS Lambda Big Data
Hourly - Expert ($$$) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
We are looking for an engineer who can develop data extraction / web scraping scripts, plus a matching DB and API, to expose data from various sources for use in our BI dashboards. This will be an ongoing, on-demand project. We will have a heavy load of work at the beginning of the project, 4+ weeks full time, transitioning to on-demand after that. Please send examples of previous work specific to this. We are looking for an expert-level candidate with multiple examples of this exact type of work. Thank you.
Skills: "Extract, Transform and Load (ETL)" API Development API Documentation Web scraping
Hourly - Entry Level ($) - Est. Time: 1 to 3 months, 30+ hrs/week - Posted
We need to build a script to extract data from multiple GA and Facebook accounts. The script should run as a cron job and output the data to CSV. In addition, we need to build a few views in React to help manage the various accounts. This is an ongoing project, and we would love to continue with the contractor full-time.
Skills: "Extract, Transform and Load (ETL)" Data scraping Facebook Development Google Analytics API