Bash Jobs

41 jobs were found based on your criteria.

Fixed-Price - Entry Level ($) - Est. Budget: $50 - Posted
Logic:

Step 1 - Folder validation:
1. Check whether the account already has "our folder"; if not, create it.
2. Check that the folder has the defined subfolders; if any are missing, add them to their parent.

Folder structure (RobotFolder):
- Autofolder - folder for processing emails from its subfolders
- Archfolder - folder for keeping processed emails in the same subfolder structure as above

Autofolder and Archfolder subfolder structure:
- today
- 1day
- 2days
- 3days
- 7days
- 14days
- 21days
- future
- archive - task_done
- archive - Someday

Step 2 - Check the Autofolder subfolders for emails inside:
1. Check the first folder; if an email is inside, send it to folderName@pinbox24.com. After the email is processed, move it to Archfolder under the same subfolder name.
2. Proceed until the folder is empty.
3. Check the next folder from the folder list.
4. Repeat until all folders are checked.

(A rough bash sketch of this workflow follows this listing.)
Skills: Bash shell scripting, JavaScript, Node.js, Scripting
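A minimal sketch of the workflow described in this listing, assuming the mailbox is reachable over IMAP via curl; the server, credentials, and the forwarding/move plumbing are assumptions, and only the folder names and the @pinbox24.com address come from the post itself.

```bash
#!/usr/bin/env bash
# Rough sketch of the folder-validation / mail-processing loop described above.
# IMAP server, credentials, and the forwarding/move steps are assumptions.
set -euo pipefail

IMAP_URL="imaps://mail.example.com"        # assumption: the account's IMAP server
IMAP_USER="user@example.com:app-password"  # assumption: credentials

# Subfolders named in the listing (the archive/task_done and Someday entries
# are left out of this sketch).
SUBFOLDERS=(today 1day 2days 3days 7days 14days 21days future archive)

imap() {  # send one raw IMAP command via curl's IMAP support
  curl -s --url "$IMAP_URL" --user "$IMAP_USER" -X "$1"
}

# Step 1: make sure RobotFolder/Autofolder, RobotFolder/Archfolder and their
# subfolders exist (CREATE is harmless if the folder is already there).
ensure_folders() {
  for parent in RobotFolder/Autofolder RobotFolder/Archfolder; do
    imap "CREATE $parent" || true
    for sub in "${SUBFOLDERS[@]}"; do
      imap "CREATE $parent/$sub" || true
    done
  done
}

# Step 2: walk the Autofolder subfolders; forward each message to
# <subfolder>@pinbox24.com, then move it into the matching Archfolder subfolder.
process_mail() {
  for sub in "${SUBFOLDERS[@]}"; do
    src="RobotFolder/Autofolder/$sub"
    dst="RobotFolder/Archfolder/$sub"
    # Placeholder: a real implementation would UID SEARCH $src, forward each
    # message to $sub@pinbox24.com, then UID MOVE it to $dst until $src is empty.
    echo "would process $src -> $sub@pinbox24.com -> $dst"
  done
}

ensure_folders
process_mail
```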
Hourly - Intermediate ($$) - Est. Time: 3 to 6 months, 10-30 hrs/week - Posted
This project will be made up of multiple sub-projects, the first of which is a set of command-line scripts (display market data, get portfolio, etc.). More detail will be given after a candidate has been selected for the next phase. If the candidate is successful in this project, a series of other related projects will be assigned directly. (A small illustrative sketch follows this listing.)
Skills: Bash shell scripting, Pandas, Python, Python Numpy
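For reference, a very small sketch of what a "display market data" command-line script could look like, under the assumption of a generic JSON quote API; the endpoint, ticker format, and field names are hypothetical and would be replaced by whatever data source the client specifies.

```bash
#!/usr/bin/env bash
# market.sh -- hypothetical "display market data" command-line sketch.
# The API endpoint and JSON field names below are assumptions for illustration only.
set -euo pipefail

symbol="${1:?usage: market.sh SYMBOL}"
API="https://api.example.com/v1/quotes"   # assumption: replace with the real feed

# Fetch one quote and print a one-line summary (requires curl and jq).
curl -sf "$API/$symbol" |
  jq -r '"\(.symbol)  last=\(.last)  bid=\(.bid)  ask=\(.ask)"'
```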
Fixed-Price - Entry Level ($) - Est. Budget: $250 - Posted
We want a system where a single CSV file specifies these settings for Squid (or similar), so that we only need to update that CSV file and the VPN configurations. We would also like a bash script that configures the server prerequisites such as Squid, Tor, etc. on CentOS 7. Proxy types: Static proxy: using a specific port (9055 and 9056 in the diagram), the client's traffic should go through a single static upstream proxy associated with that port. (A rough sketch of the CSV-to-Squid step follows this listing.)
Skills: Bash, Squid
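A minimal sketch of the CSV-driven piece, assuming a CSV with a header row and listen_port,upstream_host,upstream_port columns; the column layout, file paths, and credentials handling are assumptions, and the real format would follow the client's diagram. Each row becomes a Squid stanza tying one listening port to a single upstream parent proxy.

```bash
#!/usr/bin/env bash
# Sketch: generate per-port Squid stanzas from a CSV of proxy settings.
# Assumed CSV columns: listen_port,upstream_host,upstream_port (header row included).
# File locations and column layout are assumptions, not from the job post.
set -euo pipefail

CSV="${1:-proxies.csv}"
OUT="${2:-/etc/squid/proxies.generated.conf}"

{
  tail -n +2 "$CSV" | while IFS=, read -r lport uhost uport; do
    cat <<EOF
# traffic entering on port $lport goes to a single upstream proxy $uhost:$uport
http_port $lport name=port$lport
cache_peer $uhost parent $uport 0 no-query name=peer$lport
acl on$lport myportname port$lport
cache_peer_access peer$lport allow on$lport
cache_peer_access peer$lport deny all
never_direct allow on$lport
EOF
  done
} > "$OUT"

echo "wrote $OUT; include it from squid.conf and run: squid -k reconfigure"
```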
Hourly - Intermediate ($$) - Est. Time: More than 6 months, 30+ hrs/week - Posted
We are seeking a DevOps Engineer to maintain and actively develop our AWS-based infrastructure. That includes reliability and failover, monitoring, problem detection and resolution, auto-discovery, internal scripts and tools, continuous builds, and warm-up and tear-down of our scalability patterns. You will be exposed to modern approaches and technologies, a high-load environment, cloud computing services, and a distributed fault-tolerant stack. The role requires hands-on experience across the backend stack. Candidates should be familiar with the technologies we are using, but may not necessarily have had the opportunity to work with them in their current job setting. That's generally OK. However, you should have a true passion for technology that is apparent through your previous work experience.

Responsibilities:
- Collaborate closely with product managers and application developers
- Manage our fleet of production and development servers
- Maintain and develop our infrastructure
- Maintain and develop our continuous integration / delivery system
- Deploy new features and bug fixes every week
- Optimize server performance and responsiveness

Technology stack: Linux, nginx, PHP, Java, AWS services, Couchbase, ELK stack, Dropwizard, Redis, Storm, ZooKeeper, Kafka, Docker, Vagrant, Cucumber, Angular2, WebPack, TypeScript, ES6

Requirements:
- 3+ years of hands-on Linux administration
- Fluent in shell scripting
- Basic knowledge of PHP/Java and experience managing an application stack
- Familiarity with NoSQL solutions and event stream processing
- Able to operate freely within cloud computing infrastructure
- Experience building deployment automation

About SolidOpinion: SolidOpinion is a fun, new and dynamic commenting platform. Members earn points for being engaged in the discussion, for example by posting and viewing comments. They can use points to buy cool power-ups and dominate the discussion, engage in micro-transactions, or sell points to other members. Every transaction monetizes your content. This opportunity is full-time only. No consultancies or recruiters will be considered.
Skills: Bash shell scripting, Amazon Web Services, Continuous Integration, DevOps
Fixed-Price - Intermediate ($$) - Est. Budget: $80 - Posted
Hi, I need a script that can run on a Linux-based server (QNAP) as a cron job.
1. It should automatically download all mail attachments from a mail address with a specified suffix.
2. After that, it should rename the file with that day's date.
3. The renamed file should be uploaded to an external FTP server.
4. The script should double-check that the previous steps worked properly.
5. The mail should be moved to another IMAP folder.
The developer should also install the script as a cron job on my QNAP NAS server. (A rough sketch of these steps follows this listing.)
Skills: Bash shell scripting
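A minimal sketch of the five steps, assuming a mail tool (e.g. getmail or fetchmail) has already dropped the attachments into a local directory; the directory, FTP host, credentials, and suffix are placeholders, and the IMAP move step is left as a stub since it depends on the mail tooling chosen.

```bash
#!/usr/bin/env bash
# Sketch of the requested cron job.  Fetching the attachments themselves is left
# to a mail tool (e.g. getmail/fetchmail) that drops them into $INBOX_DIR; the
# FTP host, credentials, suffix, and IMAP move step are placeholders/assumptions.
set -euo pipefail

INBOX_DIR="/share/mail/attachments"        # assumption: where attachments land
SUFFIX=".csv"                              # assumption: the "specified suffix"
FTP_URL="ftp://ftp.example.com/upload/"    # assumption: external FTP server
FTP_USER="user:password"
TODAY="$(date +%F)"

shopt -s nullglob
for f in "$INBOX_DIR"/*"$SUFFIX"; do
  # 2. rename the file with today's date
  new="$INBOX_DIR/${TODAY}_$(basename "$f")"
  mv "$f" "$new"

  # 3. upload the renamed file to the external FTP server
  curl -sS -T "$new" --user "$FTP_USER" "$FTP_URL"

  # 4. double-check: confirm the file now appears in the remote listing
  if curl -sS --user "$FTP_USER" "$FTP_URL" | grep -q "$(basename "$new")"; then
    echo "uploaded OK: $(basename "$new")"
    # 5. move the processed mail to another IMAP folder (stub; depends on the
    #    mail tool chosen, e.g. an IMAP MOVE issued via curl or the mail client)
  else
    echo "upload verification failed for $new" >&2
    exit 1
  fi
done
```

Installed via the QNAP's crontab, it could run hourly, e.g. `0 * * * * /share/scripts/fetch_and_upload.sh`.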
Fixed-Price - Intermediate ($$) - Est. Budget: $150 - Posted
I am looking for someone to develop a custom script for me to optimize PDFs on a Linux server (Ubuntu). I routinely get large color PDF books which I use for proofreading. However, they are often very slow to scroll when viewed in a desktop program or web browser. I would like a script on my Ubuntu server which I can run to optimize these PDFs so that they load and scroll more quickly when viewed in a browser or PDF viewer. I've already played around with ImageMagick, Ghostscript, and pdftk but have had trouble getting them to convert at a reasonable speed. I am still open to using one of these, but ONLY if you can meet the requirements and essential test below. (A rough sketch follows this listing.)

REQUIREMENTS:
- Runs as an Ubuntu command-line script or program
- Easy to install (it is OK to have a few package dependencies, or even to be a thin wrapper for programs that perform this well)
- Uses a simple API, such as convertpdf.sh optimize -input mybook.pdf -output newbook.pdf
- Converts a PDF to grayscale and/or black & white and reduces the file size
- Downsamples the resolution of text and images and reduces the file size
- Performs any additional optimizations you can think of that will boost scrolling performance in a PDF viewer
- OCR and text searchability *must* remain unaffected
- Fast conversion performance: e.g., a conversion rate of 1 second or less per page
- Your optimized PDF *must* scroll at least 25% faster than the original when using the native PDF viewer in the Google Chrome web browser

ESSENTIAL TEST: I have four (4) PDF books that your script *must* optimize to speed up scrolling performance while following all of the above requirements. These four books are located in the GDrive folder here: https://drive.google.com/folderview?id=0B27-rxnuLaIGR2FOdUE0b1dQUzg&usp=sharing
Skills: Bash shell scripting, Agile software development, PDF Conversion
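A sketch of one plausible approach, using Ghostscript's pdfwrite device to convert to grayscale and downsample images; the wrapper name is taken from the listing's example API, the chosen resolutions are assumptions, and whether it hits the 1 second/page and 25% scrolling targets would have to be measured against the four test books.

```bash
#!/usr/bin/env bash
# convertpdf.sh -- sketch of the requested wrapper (name taken from the listing's
# example API).  Uses Ghostscript's pdfwrite device; the grayscale conversion and
# the chosen resolutions are assumptions and would need tuning against the test books.
set -euo pipefail

usage() { echo "usage: $0 optimize -input in.pdf -output out.pdf" >&2; exit 1; }

[ "${1:-}" = "optimize" ] || usage
shift
input="" output=""
while [ $# -gt 0 ]; do
  case "$1" in
    -input)  input="$2";  shift 2 ;;
    -output) output="$2"; shift 2 ;;
    *) usage ;;
  esac
done
[ -n "$input" ] && [ -n "$output" ] || usage

# Convert to grayscale, downsample images, and let pdfwrite re-compress streams.
# The existing text layer (and therefore OCR/searchability) is carried through.
gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.5 \
   -dPDFSETTINGS=/ebook \
   -sColorConversionStrategy=Gray -dProcessColorModel=/DeviceGray \
   -dDownsampleColorImages=true -dColorImageResolution=100 \
   -dDownsampleGrayImages=true  -dGrayImageResolution=100 \
   -dDownsampleMonoImages=true  -dMonoImageResolution=150 \
   -dNOPAUSE -dBATCH -dQUIET \
   -sOutputFile="$output" "$input"
```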