Bash Jobs

80 jobs were found based on your criteria

Fixed-Price - Expert ($$$) - Est. Budget: $1,000 - Posted
Hello! I need a good URL extractor. It has to run from a big .txt keyword file. The main goal is to extract millions of links from the web based on keyword combinations. I had one in the past, but everything has changed and now it won't extract even 1% of what it did before. I think it's getting banned or something; you would know better. I then had another tool that worked very well, but I think they keep changing the parameters, because it needed modification every couple of days. Any ideas on how to build something powerful?
Skills: Bash, Bash shell scripting, C, C++
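A first cut at the keyword-driven link extraction could be sketched in plain bash. This is a minimal illustration, not the requested tool: the inline HTML and the keyword are stand-ins, and in real use the input would come from `curl` against pages found for each entry in the keyword file.

```shell
#!/usr/bin/env bash
# Sketch: pull href links out of HTML and keep only those matching a
# keyword. The HTML string and keyword below are illustrative stand-ins;
# real input would be fetched with curl per keyword-file entry.
set -euo pipefail

extract_links() {
  # grab href="..." targets from HTML on stdin, one URL per line
  grep -oE 'href="[^"]+"' | sed -E 's/^href="//; s/"$//'
}

html='<a href="https://example.com/bash-tips">a</a> <a href="https://example.com/cats">b</a>'
keyword="bash"

links=$(extract_links <<<"$html" | grep -- "$keyword")
echo "$links"
```

At scale the hard part is not the extraction but rate limits and bans, which matches the poster's experience; that is a crawling-policy problem rather than a parsing one.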
Fixed-Price - Intermediate ($$) - Est. Budget: $50 - Posted
I have a Torrent Auto-Upload Script that downloads torrents via one private tracker's RSS feed, builds a description (takes screenshots, uploads them to an image host, etc.), then creates a new .torrent file and uploads it to two public torrent sites (et, tpb). I want a developer to edit/re-code the script described below and install it on a Linux-based VPS (providing me instructions/readme to run it). The editing I require: the script should download from multiple RSS feeds, both private and public. There is a ratio-maintenance function; since we are creating downloads from public RSS feeds, that function should not run on those. In the uploading part there are two working websites the new torrent is uploaded to; I want to add a few more sites there. Some functions in the script are of no use, so remove them after confirming that doing so causes no errors in the processing. Further details about editing the script can be discussed.
Skills: Bash shell scripting, MySQL Programming, PHP
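The multi-feed requirement above, with ratio maintenance only on private feeds, could be structured like this. A sketch only: the feed URLs and the private/public tagging scheme are assumptions, not taken from the post.

```shell
#!/usr/bin/env bash
# Sketch: iterate several RSS feeds, private and public, and decide per
# feed whether the ratio-maintenance step should run. Feed URLs and the
# "url|kind" tagging convention are illustrative assumptions.
set -euo pipefail

feeds=(
  "https://tracker.example/rss|private"
  "https://public.example/rss|public"
)

needs_ratio() {   # ratio maintenance applies only to private trackers
  [ "$1" = "private" ]
}

for entry in "${feeds[@]}"; do
  url="${entry%%|*}"    # part before the |
  kind="${entry##*|}"   # part after the |
  if needs_ratio "$kind"; then
    echo "ratio-maintained download: $url"
  else
    echo "plain download: $url"
  fi
done
```

Keeping the feed list as data (rather than hard-coded branches) is what makes "add a few more sites" a one-line change later.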
Hourly - Entry Level ($) - Est. Time: Less than 1 week, 10-30 hrs/week - Posted
Given a directory containing several .tif files, write a bash script that produces a single PDF file. You can use gs or ImageMagick's convert (convert uses gs on the backend, so it may be better to use gs directly). I can do this myself, but the PDF I produce is usually very large.
Skills: Linux System Administration
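The large-file symptom usually comes from lossless packing of scanned images; forcing JPEG compression inside the PDF shrinks it dramatically. A minimal sketch assuming ImageMagick's `convert` is installed; it prints the command as a dry run so it can be inspected first.

```shell
#!/usr/bin/env bash
# Sketch: merge every .tif in a directory into one small PDF with
# ImageMagick. -compress jpeg is what keeps the output size down.
# Printed as a dry run -- drop the `echo`-style indirection (run the
# array directly) to actually execute it.
set -euo pipefail

dir="${1:-.}"           # directory holding the .tif files
out="${2:-merged.pdf}"  # output PDF name

cmd=(convert "$dir"/*.tif -compress jpeg -quality 85 "$out")
echo "${cmd[@]}"
```

Note that gs itself consumes PostScript/PDF input rather than TIFF, so a pure-gs route needs an intermediate step (e.g. libtiff's tiff2ps), whereas convert handles TIFF directly.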
Fixed-Price - Intermediate ($$) - Est. Budget: $10 - Posted
Finally I created an ssh sync file as below: vi ~/.unison/ and below is its content:

    #!/bin/bash
    # set paths
    _paths="/home/xysbbka"
    # binary file name
    _unison=/usr/bin/unison
    # server names
    # sync with the rest of the servers in the cluster
    # i.e. _rserver=""
    _rserver=""
    # sync it
    for r in ${_rserver}; do
        for p in ${_paths}; do
            ${_unison} -batch "${p}" "ssh://${r}/${p}"
        done
    done

Because the B server uses port 2258 for its ssh connection instead of the default port 22, the test failed.
Skills: Linux System Administration
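The port mismatch described above is usually solved by handing unison extra ssh flags via its `-sshargs` preference. A sketch of the fixed invocation; the B-server hostname is an assumed placeholder, and the command is echoed as a dry run.

```shell
#!/usr/bin/env bash
# Sketch: reach a server whose sshd listens on 2258 instead of 22.
# unison's -sshargs preference forwards extra flags to the ssh command
# it spawns. Hostname is an assumed placeholder; drop `echo` to run.
set -euo pipefail

port=2258
p="/home/xysbbka"
r="bserver.example.com"

echo /usr/bin/unison -batch "$p" "ssh://$r/$p" -sshargs "-p $port"
```

Alternatively, a Host entry with `Port 2258` in ~/.ssh/config on the A server achieves the same result without touching the script.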
Fixed-Price - Intermediate ($$) - Est. Budget: $50 - Posted
Need a simple web page that shows the status of something, usually in/out or on/off. Clicking changes the image from on to off or from in to out, and while doing so runs a specific script that I designate. (The script would be some bash script with system commands, etc.) This needs to run on basic NGINX + Lua/LuaJIT without requiring PHP.
Skills: CSS, CSS3, HTML, HTML5, JavaScript, Web design, Website Development
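For the click-to-toggle behaviour on NGINX + Lua, a location that shells out to the designated script might look like the fragment below. This is a sketch assuming OpenResty (or nginx built with lua-nginx-module); the /toggle URL and the script path are made up for illustration.

```nginx
# Sketch assuming OpenResty (nginx + lua-nginx-module). The /toggle
# URL and the script path are illustrative, not from the job post.
location /toggle {
    content_by_lua_block {
        -- run the designated bash script, then report the result
        local rc = os.execute("/usr/local/bin/toggle-state.sh")
        if rc == 0 then
            ngx.say("toggled")
        else
            ngx.status = 500
            ngx.say("script failed")
        end
    }
}
```

The page's JavaScript would hit /toggle with fetch/XHR on click and swap the on/off image according to the response, so no PHP is involved anywhere.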
Hourly - Entry Level ($) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
Integrate software for online storage: I2P, Tor, Tahoe-LAFS, and ownCloud. List of tasks to be done:
  • Set up the lab
  • ownCloud: connect to Tahoe-LAFS via external SFTP
  • I2P: configure it to work with Tahoe and multiple introducers (Foolscap)
  • Bash: create a script that backs up several folder paths from Linux into Tahoe
  • Tahoe: follow the dev work and push for full, properly working integration of Tor and I2P; an additional budget will be released to bid in the official Tahoe dev group when everything is clear
Skills: Bash shell scripting, Internet Security, Java, JavaScript
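The bash backup task above could start as a loop over `tahoe backup`, which does incremental, deduplicated snapshots. A sketch only: the path list and the `tahoe:` alias are assumptions, and each command is echoed as a dry run.

```shell
#!/usr/bin/env bash
# Sketch: back several local paths up into a Tahoe-LAFS grid.
# `tahoe backup` is incremental and deduplicating. The path list and
# the tahoe: alias are assumptions; drop `echo` to really run it.
set -euo pipefail

paths=(/etc /home/user/docs /var/www)

for p in "${paths[@]}"; do
  # mirror each source path under tahoe:backups/ in the grid
  echo tahoe backup "$p" "tahoe:backups$p"
done
```

Run from cron, this gives the "several paths from Linux" backup the post asks for, with Tahoe handling versioning on its side.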
Hourly - Intermediate ($$) - Est. Time: 1 to 3 months, 10-30 hrs/week - Posted
Looking for an experienced Hadoop Systems Architect/Administrator (Cloudera ONLY). I have two positions (a suitable candidate can assume both roles):
1) Systems Architect who can advise on areas to improve with respect to automation, deployment, performance tuning, capacity management, etc., and document the steps with diagrams (please apply only if you have experience with large deployments). This would be an hourly job.
2) Hadoop Administrator with experience troubleshooting the various ecosystem tools and the JVM, setting up monitoring such as Ganglia and Nagios, and automation (Shell and Python).
Here is the detailed job description:
1) Big data system architecture/administration (Cloudera Hadoop, Elasticsearch, MongoDB)
2) Cloudera administration, Cloudera Manager API; preferably Cloudera certified
3) In-depth knowledge of security practice on Cloudera (Kerberos, KMS, Cloudera Navigator, Sentry)
4) Expert in troubleshooting (ecosystem tools, JVM, Hive/Impala query tuning)
5) Solid scripting in Python and Shell (proof needed: GitHub)
6) Experience with monitoring setup (Ganglia, Nagios) complementing the existing Cloudera Manager
Solid Linux admin skills required.
Skills: Bash shell scripting, Ansible, Cloudera Hadoop