Hortonworks Jobs

3 were found based on your criteria

Hourly - Expert ($$$) - Est. Time: 1 to 3 months, 10-30 hrs/week - Posted
Looking for a Hortonworks expert architect. My goal is to design a new data science platform. The purpose of this platform is 1) for a team of data scientists to train predictive models and 2) for a tech operations team to productionize and run these predictive models on a recurring basis. ... The team has strong expertise and best practices in data science, data architecture, data governance and data management. Our need is for a Hortonworks expert. This Hortonworks expert would be an advisor to the existing team. ... This expert is needed to provide a “generic” Hortonworks architecture and then work with the team to adapt this architecture to our specific needs.
Skills: Apache Hive, Atlas, Bash shell scripting, Hadoop, HBase, Python
Hourly - Intermediate ($$) - Est. Time: 1 to 3 months, Less than 10 hrs/week - Posted
We have expertise in multiple Big Data areas: Spark, Hadoop, R, Data Science, NiFi, Metron, Cassandra, MongoDB, Scala, Big Data, Elasticsearch, Amazon Web Services (AWS), SQL, Tableau, Cloudera, Hortonworks, DataStax, Databricks. We are involved in the whole life cycle of Big Data, Spark, Hadoop and Data Science projects.
Fixed-Price - Intermediate ($$) - Est. Budget: $20 - Posted
I am setting up Hortonworks for the first time and everything is working fine, but there seems to be a problem with the user rights: I cannot get a Pig job to run from the Ambari view, though from the CLI everything works fine. The error is:

File does not exist: /user/kwhitson/pig/jobs/g_10-09-2016-19-34-48/stdout
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1828)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1799)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1712)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:672)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:373)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2206)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2202)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2200)
Skills: Data Science
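For context, a "File does not exist: /user/kwhitson/..." error from the Ambari Pig View, when the same job runs fine from the CLI, often indicates that the user lacks an HDFS home directory or that the view cannot impersonate the user. A minimal sketch of a typical fix, assuming "hdfs" is the cluster's HDFS superuser account and "kwhitson" (taken from the error above) is the affected user — directory names and the Ambari server account ("root" below) are assumptions, not confirmed by the posting:

```shell
# Create the user's HDFS home directory as the HDFS superuser
# (assumption: the superuser account is named "hdfs").
sudo -u hdfs hdfs dfs -mkdir -p /user/kwhitson
sudo -u hdfs hdfs dfs -chown kwhitson:hdfs /user/kwhitson

# If the Ambari view still fails, proxy-user (impersonation) settings
# for the account Ambari runs as (assumed "root" here) may be missing
# from core-site.xml; after adding them, restart HDFS:
#   hadoop.proxyuser.root.hosts  = *
#   hadoop.proxyuser.root.groups = *
```

After the directory exists with the right ownership, the Pig View can write its job output (the stdout file in the stack trace) under /user/kwhitson.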