Bash Jobs

40 jobs were found based on your criteria

Fixed-Price - Expert ($$$) - Est. Budget: $1,000 - Posted
Hello! I need a good URL extractor. It has to run from a big .txt keyword file. The main goal is to extract millions of links from the web based on keyword combinations. I had one in the past based on bing.com, but everything has changed and it won't extract even 1% of what it did before. I think it's being banned or something; you would know better. I also had a tool based on search-results.com that worked very well, but I think they keep changing the parameters, because it needed modification every couple of days. Any ideas on how to build something powerful?
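A minimal sketch of this kind of keyword-driven extractor, assuming a generic search endpoint that takes a plain query parameter (the SEARCH_URL below is a placeholder, not a working service; real search engines rate-limit or block unauthenticated scraping, which is likely what broke the old Bing-based tool):

    #!/usr/bin/env bash
    # Read keyword combinations from a text file and collect every absolute URL
    # found on the corresponding search-results pages.
    # SEARCH_URL is a placeholder; substitute a search endpoint you are allowed to query.
    SEARCH_URL="https://example.com/search?q="
    KEYWORD_FILE="${1:-keywords.txt}"
    OUT="links.txt"
    while IFS= read -r keywords; do
        [ -z "$keywords" ] && continue
        query=$(printf '%s' "$keywords" | sed 's/ /+/g')
        curl -s -A "Mozilla/5.0" "${SEARCH_URL}${query}" \
            | grep -oE 'https?://[^"<> ]+' >> "$OUT"
        sleep 2   # throttle: aggressive scraping is what gets a tool banned
    done < "$KEYWORD_FILE"
    sort -u "$OUT" -o "$OUT"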
Skills: Bash, Bash shell scripting, C, C++
Fixed-Price - Intermediate ($$) - Est. Budget: $50 - Posted
I have a Torrent Auto-Upload Script that downloads torrents via one private tracker's RSS feed, builds a description (takes screenshots, uploads them to an image host, etc.), then creates a new .torrent file and uploads it to 2 public torrent sites (et, tpb). I want the developer to edit/re-code the script described below and install it on a Linux-based VPS to run it (providing me instructions/a to-do/readme for that). The edits I require: the script should download from multiple RSS feeds (more than 1), both private and public; there is a ratio-maintenance function, and since we are creating public RSS downloads, the ratio function should not run on those. Then, in the uploading part, there are 2 working websites the new torrent is uploaded to, and I want to add a few more sites there. There are also some functions in the script that are of no use, so remove them after confirming that doing so will cause no errors in processing. Further discussion can be had about editing the script.
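A rough sketch of the multi-feed download step being requested, assuming the feeds expose direct .torrent URLs; the feed addresses and watch directory are illustrative, not taken from the existing script:

    #!/usr/bin/env bash
    # Download new .torrent files from several RSS feeds (private and public).
    # Feed URLs and the watch directory are placeholders.
    PRIVATE_FEEDS=("https://private-tracker.example/rss?passkey=XXXX")
    PUBLIC_FEEDS=("https://public-site.example/rss")
    WATCH_DIR="$HOME/torrents/watch"
    mkdir -p "$WATCH_DIR"
    fetch_feed() {
        local feed="$1"
        # Pull every URL that ends in .torrent out of the feed XML.
        curl -s "$feed" | grep -oE 'https?://[^"<> ]+\.torrent[^"<> ]*' | while read -r t; do
            name=$(basename "${t%%\?*}")
            [ -f "$WATCH_DIR/$name" ] || curl -s -o "$WATCH_DIR/$name" "$t"
        done
    }
    for f in "${PRIVATE_FEEDS[@]}"; do fetch_feed "$f"; done   # ratio maintenance applies here
    for f in "${PUBLIC_FEEDS[@]}";  do fetch_feed "$f"; done   # skip ratio maintenance for these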
Skills: Bash shell scripting, MySQL Programming, PHP
Hourly - Entry Level ($) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
Integrate software for online storage: I2P, Tor, Tahoe-LAFS, and ownCloud. List of tasks to be done:
  • Set up the lab: https://github.com/Librerouter/Librekernel#installation-and-setup
  • ownCloud: connect to Tahoe-LAFS via external SFTP
  • I2P: configure it to work with Tahoe and multiple introducers (Foolscap)
  • Bash: create a script that backs up several folder paths from Linux into Tahoe (see the sketch below)
  • Tahoe: https://en.wikipedia.org/wiki/Tahoe-LAFS
Follow the development and push for having the full integration of Tor and I2P working properly; an additional budget will be released to bid in the official Tahoe dev group when everything is clear. References: https://github.com/tahoe-lafs/tahoe-lafs https://github.com/kytvi2p/tahoe-i2p-install https://david415.github.io/category/tor-hidden-services-tahoe-lafs.html http://cryto.net/projects/tahoe.html https://github.com/david415/ansible-tahoe-lafs https://cageos.org/index.php?
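For the Bash item above, a minimal sketch built on the stock 'tahoe backup' CLI command (the 'backups:' alias and the path list are assumptions; the alias must be created beforehand with 'tahoe create-alias'):

    #!/usr/bin/env bash
    # Back up several local paths into a Tahoe-LAFS grid using the built-in
    # "tahoe backup" command. The "backups:" alias must already exist
    # (created with "tahoe create-alias backups").
    PATHS=("/etc" "/home/user/documents" "/var/www")   # example paths to protect
    ALIAS="backups:"
    for p in "${PATHS[@]}"; do
        dest="${ALIAS}$(echo "$p" | tr '/' '_')"
        tahoe backup "$p" "$dest" || echo "backup of $p failed" >&2
    done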
Skills: Bash shell scripting, Internet Security, Java, JavaScript
Hourly - Intermediate ($$) - Est. Time: 1 to 3 months, 10-30 hrs/week - Posted
Looking for an experienced Hadoop systems architect/administrator (Cloudera ONLY). I have two positions (a suitable candidate can assume both roles): 1) A systems architect who can advise on areas to improve with respect to automation, deployment, performance tuning, capacity management, etc., and document the steps with diagrams (please apply only if you have experience with large deployments) - this would be an hourly job. 2) A Hadoop administrator with experience troubleshooting the various ecosystem tools and the JVM, setting up monitoring such as Ganglia and Nagios, and automation (shell and Python). Here is the detailed job description:
1) Big-data system architecture/administration (Cloudera Hadoop, Elasticsearch, MongoDB)
2) Cloudera administration, Cloudera Manager API; preferably Cloudera certified
3) In-depth knowledge of security practices on Cloudera (Kerberos, KMS, Cloudera Navigator, Sentry)
4) Expert in troubleshooting (ecosystem tools, JVM, Hive/Impala query tuning)
5) Solid scripting in Python and shell (proof needed: GitHub)
6) Experience with monitoring setup (Ganglia, Nagios) complementing the existing Cloudera Manager
7) Solid Linux admin skills
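As a hedged example of the kind of shell scripting and Cloudera Manager API work described, a small sketch that lists clusters and services over the CM REST API (hostname, credentials, API version, and cluster name are placeholders; the exact API version depends on the CM release):

    #!/usr/bin/env bash
    # Poll Cloudera Manager's REST API for the clusters and services it manages.
    # CM_HOST, credentials, the API version, and the cluster name are placeholders.
    CM_HOST="cm.example.com:7180"
    CM_USER="admin"
    CM_PASS="admin"
    API="http://${CM_HOST}/api/v11"
    # List clusters known to this Cloudera Manager instance.
    curl -s -u "${CM_USER}:${CM_PASS}" "${API}/clusters" | python -m json.tool
    # List services (and their reported health) for one cluster.
    curl -s -u "${CM_USER}:${CM_PASS}" "${API}/clusters/Cluster%201/services" | python -m json.tool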
Skills: Bash shell scripting, Ansible, Cloudera, Hadoop
Fixed-Price - Intermediate ($$) - Est. Budget: $33 - Posted
Hi, I'm looking for a shell script with a quick turnaround. Please state your approximate turnaround time. We have a weekly backup of around 10 Unix servers which runs on Sundays. The backup job creates .flar files and dumps them into a local folder. At the moment we carry out manual checks on Monday to ensure the backups have completed successfully; we do this by checking the backup size and timestamp. The script needs to automate this process: scan through the folders, find the .flar files (there should be 1 file per folder), and check the timestamp or file-created date. If the timestamp is no more than 2 days older than the current date, it's a new backup file and the job is deemed a success; otherwise it's a failure and needs to be investigated. After scanning through all the folders, an email will be sent out. See below for the header and body of the email, and see the attachment for the email table. The script needs to be commented throughout so that it's easy to understand.
Email header: 10 Flash Backups: 8 Successful, 2 Failed (ukbhu051q + ukbhu024p) - Investigate
Email body:
Flash Backup Summary
Number of Flash Backups Expected: 10
Successful Flash Backups: 8
Failed Flash Backups: 2
Information (see attachment for the email table); it doesn't have to be in a table format if that's tricky to do.
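A minimal sketch of the requested check, assuming one folder per server under a common parent directory (the directory layout and mail recipient are placeholders):

    #!/usr/bin/env bash
    # Scan each server's backup folder for its .flar file, decide success or failure
    # by whether the file was created within the last 2 days, then mail a summary.
    # BACKUP_ROOT and RECIPIENT are placeholders.
    BACKUP_ROOT="/backups"
    RECIPIENT="unix-team@example.com"
    expected=0; success=0; failed_hosts=""
    for dir in "$BACKUP_ROOT"/*/; do
        expected=$((expected + 1))
        host=$(basename "$dir")
        # -mtime -2 matches files modified less than 2 days ago.
        if find "$dir" -name '*.flar' -mtime -2 | grep -q .; then
            success=$((success + 1))
        else
            failed_hosts="$failed_hosts $host"
        fi
    done
    failed=$((expected - success))
    {
        echo "Flash Backup Summary"
        echo "Number of Flash Backups Expected: $expected"
        echo "Successful Flash Backups: $success"
        echo "Failed Flash Backups: $failed$failed_hosts"
    } | mailx -s "$expected Flash Backups: $success Successful, $failed Failed ($failed_hosts) - Investigate" "$RECIPIENT"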
Skills: Bash shell scripting, Solaris Administration, Unix shell, Unix System Administration
Fixed-Price - Intermediate ($$) - Est. Budget: $150 - Posted
I want to move all my bower and npm dependencies to composer. According to the composer-asset-plugin documentation (https://github.com/fxpio/composer-asset-plugin) it can handle this task for me. I already managed to move 21 bower dependencies into composer, and it installs them like a charm. BUT now I'm trying to install npm packages (grunt with some plugins) with composer, and it doesn't work. It seems the problem is that composer is trying to get the packages through private VCS, while npm downloads a tgz from the "dist" section of package.json. List of packages: "grunt": "^0.4.5", "grunt-cli": "^0.1.13", "grunt-contrib-clean": "~1.0.0", "grunt-mkdir": "^1.0.0", "grunt-contrib-copy": "~1.0.0", "grunt-string-replace": "^1.3.0", "grunt-contrib-concat": "^0.5.1", "grunt-contrib-cssmin": "~1.0.1", "grunt-contrib-uglify": "^0.11.0", "grunt-sass": "^1.1.0", "grunt-contrib-less": "^1.1.0", "grunt-contrib-watch": "^0.6.1". I need a working solution to this issue: composer with composer-asset-plugin should be able to install all my npm dependencies WITHOUT "npm install". Please look at the composer.json attached to this task. If you create a package.json, "npm install" will obviously work; "composer install" doesn't: [RuntimeException] Failed to execute git clone --mirror 'git+ssh://git@github.com/isaacs/abbrev-js.git' '/root/.composer/cache/vcs/git-ssh---git-github.com-isaacs-abbrev-js.git/' Permission denied (publickey). fatal: The remote end hung up unexpectedly. My environment is Debian 7 x86_64 with nodejs v4.5.0, php 5.4.45, composer 1.2.1, fxp/composer-asset-plugin v1.2.1. The solution could be:
- a fix to my composer.json (the preferred method)
- a workaround with some files I could put locally in my project file structure (not globally)
- some environment changes that don't require escalation to root
- installation of other plugins from official repositories that would somehow fix this issue (plugins and methods that lead to an "npm install" call from composer ARE NOT ACCEPTED)
- a composer-asset-plugin patch that would be accepted into its official release by the dev team
- a composer patch that would be accepted into its official release by the dev team
- a custom additional composer plugin (not just a composer-asset-plugin fork) that would fix the issue
Please dig into this issue and post only if you really KNOW how to build a solution.
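For reference, composer-asset-plugin normally pulls npm packages in under the npm-asset/ vendor prefix, roughly as below; whether this avoids the git-over-SSH clones in this particular setup is exactly the open question, so treat it as a starting point rather than the requested fix:

    # Install the plugin globally, then require the grunt packages under the npm-asset/ prefix.
    composer global require "fxp/composer-asset-plugin:^1.2"
    composer require "npm-asset/grunt:^0.4.5" \
                     "npm-asset/grunt-cli:^0.1.13" \
                     "npm-asset/grunt-contrib-clean:~1.0.0" \
                     "npm-asset/grunt-contrib-watch:^0.6.1"
    # If composer still falls back to VCS clones, preferring dist archives may help:
    composer install --prefer-dist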
Skills: Bash shell scripting, PHP, Research
Fixed-Price - Intermediate ($$) - Est. Budget: $40 - Posted
I have multiple templates with functional scripts that scrape/parse speakers, speeches, dates, and other data into XML (Akoma Ntoso, mySociety version). I need a minimum of 20 parliaments, so 20 scripts, different in their regexes but similar in download and conversion. $40 USD per script, $800 USD in total. Please review this Google Doc: https://docs.google.com/document/d/1-Z5dXelXV36PwWHMVdxrr-d2ACnaAVWk8uJcvNjKVTM/edit?usp=sharing. I need each script to: download all files in the original format; convert them to txt if needed; convert the txt to an XML format like this: ORIGINAL FORMAT http://www.camara.gov.co/portal2011/gestor-documental/doc_download/11022-003-acta-no-003-de-2015 FINISHED FORMAT http://comisionsextacamara.jhonfelipeurregomejia.com/comisi%C3%B3n-sexta-camara/2015/agosto/acta-no-003-2015-08-12.an
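A rough sketch of the shared download-and-convert half of each parliament script (the URL is the original-format example above; treating the source as a PDF is an assumption, and the regex-driven txt-to-Akoma-Ntoso step is the per-parliament part, not shown):

    #!/usr/bin/env bash
    # Download a session record in its original format and convert it to plain text.
    # Output file names are placeholders; pdftotext assumes the source is a PDF.
    URL="http://www.camara.gov.co/portal2011/gestor-documental/doc_download/11022-003-acta-no-003-de-2015"
    mkdir -p original txt
    file="original/acta-003-2015"
    curl -sL -o "$file" "$URL"
    pdftotext -layout "$file" "txt/acta-003-2015.txt" 2>/dev/null \
        || cp "$file" "txt/acta-003-2015.txt"   # already plain text? keep it as-is
    # The txt -> Akoma Ntoso XML step (speakers, speeches, dates) is parliament-specific
    # and would be built from the regex templates in the linked Google Doc.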
Skills: Bash shell scripting, Data scraping, Linux System Administration, Python
Hourly - Intermediate ($$) - Est. Time: 3 to 6 months, 30+ hrs/week - Posted
Programming skills: a) Go, b) Bash, c) Python, d) Git VCS
HTML skills: a) CSS, b) JavaScript, c) jQuery, d) AngularJS
Operational skills: a) working knowledge of DNS
Operating systems: a) Windows, b) Ubuntu, c) Mac
Skills: Bash, AngularJS, CSS, HTML