Natural Language Processing Jobs

17 jobs were found based on your criteria

Fixed-Price - Intermediate ($$) - Est. Budget: $5,000 - Posted
We are looking for an experienced web scraping/crawling expert, preferably with experience in natural language processing and web development. The web crawler will be used to scrape data from multiple sites and deposit that data into a series of tables viewable by multiple users. Please reference for an example of the profiles we're looking to create. Additionally, we will be building a login-accessible site where people can view and administer this data. We can hire multiple freelancers if the web scraping expert does not have the necessary web development skills. This contract is part of a much larger project; if the selected freelancer is successful with this pilot, there will be additional work to follow. In your quote, please include a timeline and cost estimate (e.g. "It will take 100 hours to complete this prototype, 40 hours to debug, etc."). Bids that do not include a specific timeline and cost estimate will not be accepted.
Additional information:
* We're interested in scraping the websites of manufacturing companies, identifying their capabilities (machinery, location, expertise, products manufactured, etc.), and using this information to build profiles. The scraper will need to collect information on several thousand companies, and it will need to be programmed and documented in English. These profiles will be stored in a database and published to a website where users can view and administer the data.
* The profiles that were linked (related to the Connectory) are the end product we're hoping to create. We want the profiles your scraper creates to capture the same information that's housed in those profiles: the crawler would scan the internet, identify manufacturing companies, pull their relevant information from their websites, and place it into Connectory profiles.
* Most websites follow a generic visual template: most sites have an "About Us" page, a "Products" page, etc. See for an example of what we're visualizing. Our hope is that we can program the scraper to visit a website's home page, check whether there is an "About Us" page, pull the first paragraph from that page, and deposit it into a database that we can query. We expect the quality of the profiles to improve through iterations.
Skills: Natural language processing, API Development, Data scraping, MySQL Administration
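The "About Us" extraction step this posting describes can be sketched with the standard-library HTML parser. The names here (`about_us_snippet`, `AboutLinkFinder`, the `fetch` callable) are hypothetical, and a real crawler would add HTTP fetching, politeness delays, and error handling:

```python
from html.parser import HTMLParser

class AboutLinkFinder(HTMLParser):
    """Record the href of the first link whose text mentions 'about'."""
    def __init__(self):
        super().__init__()
        self._href = None
        self.about_url = None

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")

    def handle_data(self, data):
        if self._href and self.about_url is None and "about" in data.lower():
            self.about_url = self._href

    def handle_endtag(self, tag):
        if tag == "a":
            self._href = None

class FirstParagraphExtractor(HTMLParser):
    """Collect the text of the first <p> element on a page."""
    def __init__(self):
        super().__init__()
        self._in_p = False
        self._done = False
        self.paragraph = ""

    def handle_starttag(self, tag, attrs):
        if tag == "p" and not self._done:
            self._in_p = True

    def handle_data(self, data):
        if self._in_p:
            self.paragraph += data

    def handle_endtag(self, tag):
        if tag == "p" and self._in_p:
            self._in_p = False
            self._done = True

def about_us_snippet(home_html, fetch):
    """Given a home page's HTML and a fetch(url) -> html callable,
    return the first paragraph of the 'About Us' page, or None."""
    finder = AboutLinkFinder()
    finder.feed(home_html)
    if finder.about_url is None:
        return None
    extractor = FirstParagraphExtractor()
    extractor.feed(fetch(finder.about_url))
    return extractor.paragraph.strip()
```

In the real job, `fetch` would wrap an HTTP client; here a dict lookup stands in for it.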
Fixed-Price - Intermediate ($$) - Est. Budget: $250 - Posted
Hi! I have a few simple tools written in Python. The problem is that these tools are very slow; I need a decent performance improvement for them.
Tool 1:!M00BVCxS!n4UbvDvv-z4PMTE9iLcdrAWBH3F_tVgB5KGL_AmufcQ — consists of a few modules:
- add multithreading; improve the algorithm for better performance if possible; maybe using an indexed database instead of a file as the data source would help?
- add multithreading if possible; make the algorithm as fast as possible; it should work with files of around 50 GB without eating more than 75% of the computer's RAM
- already multithreaded, but still very slow; this one needs a decent improvement, because the LSA* scripts are the most important to me
Tool 2:!8wEwAZQS!s8EAQK27Lh1IiA5XMbjGv1oISjbmSyOXKwIK5Phbatw — removes duplicates from parallel sets of texts. It works now, but the problem is that I want to process 300 GB of data and the tool tries to load everything into memory; that needs to be solved. Multithreading also needs to be introduced. Other algorithmic improvements are not needed but are welcome if possible.
Tool 3:!E0tkUKDY!4CxlRzz3omo2ux3CpUjtsqls0wWw_QLkPbEGgHdG4uI — works well, but is too slow. I need multithreading added (the tool should not consume more than 75% of the PC's RAM). Other algorithmic improvements would, in my opinion, be welcome if possible.
Skills: Natural language processing, Artificial Neural Networks, CUDA, Deep Neural Networks
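The Tool 2 requirement — deduplicating parallel texts far larger than RAM — is usually solved by streaming both files line by line and keeping only fixed-size hashes in memory. A minimal sketch; the function name and the pair-hashing scheme are assumptions, not the poster's code:

```python
import hashlib

def dedupe_parallel(src_lines, tgt_lines):
    """Stream two parallel iterables of lines and yield only the first
    occurrence of each (source, target) pair. Memory holds only a set
    of 16-byte digests, never the texts themselves, so a 300 GB corpus
    stays on disk."""
    seen = set()
    for src, tgt in zip(src_lines, tgt_lines):
        # \x1f (unit separator) keeps ("ab","c") distinct from ("a","bc")
        digest = hashlib.md5((src + "\x1f" + tgt).encode("utf-8")).digest()
        if digest not in seen:
            seen.add(digest)
            yield src, tgt
```

With open file handles passed as the iterables, the surviving pairs can be written straight back out; multithreading could then shard the input by hash prefix.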
Fixed-Price - Intermediate ($$) - Est. Budget: $400 - Posted
Why this job? I work for a travel company that sells memberships to people who travel the world. I found out that most of my clients are on dating websites, and I want to contact them in a more informal way and build a little rapport in order to get an in-person meeting, where I can seduce them with my travel products.
Who can do this job? A freelancer who understands how to create a chatbot that feels informal and uses natural language.
How to do this job? You will be creating a chatbot, just like the ones described at this link, and this chatbot must be able to start and sustain conversations on the following apps: Tinder. The chatbot will have the main goal of talking about travelling around the world, mentioning that we will be meeting in person, and then picking up the client's phone number. With the phone number collected, it should convert the conversation to an SMS conversation using an Android smartphone. The chatbot will also add all available client data (name, phone number, profile picture, descriptive text, and whatever else is available) to the "Google Contacts" account that is already synchronized with the phone. Of course, the chatbot cannot answer questions too quickly; since it is pretending to be human, it has to take some time to respond. Have a look at in order to see the type of thing I want. Our end goal is to set up meetings with as many clients as possible and keep track of all conversations, so that before each meeting I can have a quick look at the entire conversation and go from there.
How tough/easy is this job? That will depend on your ability to build these chatbots and integrate them with the mentioned apps.
How much will I get? This job is paid on a fixed-amount basis, meaning whatever you bid for this to work will be the price to develop this tool.
Other important things: I cannot micromanage, and that's why I need your help, so I want someone who is hard-working and honest. I will expect a daily report on how the work is going. I know you may still have queries, and I will help you with all of them. Once we agree, I will send instructions on how to proceed with the project. ** Please no robotic answers: read, understand, and then write a proposal if you are interested. If you have any relevant experience, please write about it.
Skills: Natural language processing, Artificial Intelligence, Artificial Neural Networks, Chatbot Development
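The "don't reply too quickly" requirement is typically met by delaying each reply in proportion to its length. A minimal sketch; the function names and the assumed typing speed are illustrative, not part of the posting:

```python
import random
import time

def humanlike_delay(message, wpm=40, jitter=0.3):
    """Return a plausible typing delay in seconds for `message`, as if
    it were typed at roughly `wpm` words per minute, with +/- `jitter`
    random variation so replies never look machine-timed."""
    words = max(1, len(message.split()))
    base = words * 60.0 / wpm
    return base * random.uniform(1 - jitter, 1 + jitter)

def reply(message, compose):
    """Compose a reply with `compose(message)`, wait a human-looking
    interval, then return it."""
    text = compose(message)
    time.sleep(humanlike_delay(text))
    return text
```

In a real bot the wait would be asynchronous (a scheduled send) rather than a blocking `sleep`.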
Fixed-Price - Intermediate ($$) - Est. Budget: $10 - Posted
We need a program (Python/Java) that will generate feature vectors from given named-entity recognition tags in IOB format. The generated features should be usable by machine learning algorithms such as SVM, CRF, etc. The output should look like the iris dataset. An example explanation has been given for POS; we need a similar NER version.
Skills: Natural language processing, Artificial Neural Networks, Machine learning
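A common way to turn IOB-tagged tokens into SVM/CRF-ready features is a per-token feature dictionary over the word and its neighbours. A minimal sketch; the exact feature set is illustrative, not prescribed by the job:

```python
def token_features(tokens, i):
    """Feature dict for token i, in the style of typical CRF NER features."""
    tok = tokens[i]
    return {
        "lower": tok.lower(),
        "is_title": tok.istitle(),
        "is_upper": tok.isupper(),
        "is_digit": tok.isdigit(),
        "suffix3": tok[-3:],
        "prev": tokens[i - 1].lower() if i > 0 else "<BOS>",
        "next": tokens[i + 1].lower() if i < len(tokens) - 1 else "<EOS>",
    }

def iob_to_features(sentence):
    """sentence: list of (token, iob_tag) pairs, e.g. [("John", "B-PER"), ...].
    Returns parallel lists X (feature dicts) and y (tags) -- the shape CRF
    toolkits take directly and, after one-hot vectorizing, SVMs as well."""
    tokens = [tok for tok, _ in sentence]
    X = [token_features(tokens, i) for i in range(len(tokens))]
    y = [tag for _, tag in sentence]
    return X, y
```

Running a dict-vectorizer over `X` yields the flat numeric matrix-plus-label layout the posting compares to the iris dataset.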
Fixed-Price - Intermediate ($$) - Est. Budget: $350 - Posted
I need you to develop some software for me. I would like this software to be developed for Linux using Python: corpus alignment (Python), online, with Moses (statistical machine translation), updating decoder features at translation time (while translating) through XML-RPC.
Skills: Natural language processing, Python, Ubuntu, XML-RPC
Fixed-Price - Intermediate ($$) - Est. Budget: $100 - Posted
I have a bunch of (scheduled) Python scrapers using Selenium WebDriver; they're just too slow to run and unreliable. I'd like to migrate them to Scrapy. At the moment, the scripts output to individual Excel files (with minor data cleaning), and a separate Python script copies the latest of each file into one Excel file (multiple sheets). This file is then normalized/recategorized before being uploaded into PostgreSQL. The job is either: i) Scrapy to one Excel file (and the next stage takes over from there), or ii) Scrapy straight to PostgreSQL. In the latter case, duplicates need to be removed, dates parsed, and there is a reference .xls file for item recategorization. The main thing is more robust scraping, without too much overhead.
Skills: Natural language processing, Machine learning, PostgreSQL, Programming, Scrapy
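Whichever option is chosen, the pre-load step — removing duplicates, parsing dates, and recategorizing against the reference table — can be sketched in pure Python. The tuple layout, date format, and function name here are assumptions about the scraper's output, not the poster's schema:

```python
from datetime import datetime

def normalize_rows(rows, category_map, date_fmt="%d/%m/%Y"):
    """Deduplicate scraped rows, parse their date strings, and re-map
    categories via a reference table, ahead of a database load.
    `rows` are (item, date_string, category) tuples; unknown categories
    pass through unchanged."""
    seen = set()
    out = []
    for item, date_str, cat in rows:
        key = (item, date_str)
        if key in seen:          # drop exact duplicates
            continue
        seen.add(key)
        out.append((item,
                    datetime.strptime(date_str, date_fmt).date(),
                    category_map.get(cat, cat)))
    return out
```

In the Scrapy-to-PostgreSQL variant, the same logic would live in an item pipeline, with `category_map` loaded once from the reference .xls file.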
Fixed-Price - Intermediate ($$) - Est. Budget: $10 - Posted
Write a program to find the five character-level trigrams (strings of three characters, such as "abc" or "rey") that appear the greatest number of times in the following poem ("When You Are Old" by W. B. Yeats). Please lowercase all letters. The trigrams should not contain spaces, but may include punctuation. The result should list the top five trigrams and how many times each occurs, in decreasing order of frequency.
POEM:
When you are old and grey and full of sleep,
And nodding by the fire, take down this book,
And slowly read, and dream of the soft look
Your eyes had once, and of their shadows deep;
How many loved your moments of glad grace,
And loved your beauty with love false or true,
But one man loved the pilgrim soul in you,
And loved the sorrows of your changing face;
And bending down beside the glowing bars,
Murmur, a little sadly, how Love fled
And paced upon the mountains overhead
And hid his face amid a crowd of stars.
Skills: Natural language processing, Python
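A straightforward solution, assuming whitespace splitting is how spaces are excluded from the trigrams:

```python
from collections import Counter

def top_trigrams(text, n=5):
    """Return the n most frequent character-level trigrams in `text`,
    lowercased, never spanning whitespace, punctuation allowed."""
    counts = Counter()
    for chunk in text.lower().split():      # splitting drops all whitespace
        for i in range(len(chunk) - 2):
            counts[chunk[i:i + 3]] += 1
    return counts.most_common(n)
```

Calling `top_trigrams(poem)` with the poem above as a string produces the required five (trigram, count) pairs in decreasing frequency order.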
Fixed-Price - Intermediate ($$) - Est. Budget: $2,000 - Posted
We need a generic crawler that can: 1) analyze the webpage structure of an input website; 2) recognize the webpages where the target information may exist; 3) parse and extract the required content from these pages using NLP and ML techniques; 4) store the scraped content in its corresponding fields in MongoDB. For example: we need product information from an online store, so we input the URL of one store. The crawler visits different links on the website and gathers statistics. It finds that some webpages share a similar structure and that this structure repeats at a high rate, so it infers that this kind of webpage may contain the product information and checks its contents. It parses those contents, and once it has recognized the fields we need — such as Product Name, Product Type, Price, and Product Description — it extracts them and stores them in the corresponding fields in MongoDB. To build this crawler, the following skills are required:
• Crawler skills
• MongoDB skills
• Machine learning
• Natural language processing
We prefer Java as the programming language; Python is acceptable. We will reveal more when we contact you.
Skills: Natural language processing, Java, Machine learning, MongoDB
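Steps 1 and 2 — spotting webpages rendered from the same template — can be approximated by comparing structural signatures of pages, e.g. the multiset of root-to-leaf tag paths. A minimal sketch with the standard-library parser (the posting prefers Java; Python shown here as it names Python acceptable, and the signature scheme is one illustrative choice, not the required design):

```python
from collections import Counter
from html.parser import HTMLParser

VOID_TAGS = {"br", "img", "hr", "meta", "link", "input"}

class TagPathSignature(HTMLParser):
    """Structural signature of a page: a Counter of tag paths like
    'html/body/div/p'. Pages rendered from the same template yield
    near-identical signatures regardless of their text content."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.paths = Counter()

    def handle_starttag(self, tag, attrs):
        self.paths["/".join(self.stack + [tag])] += 1
        if tag not in VOID_TAGS:        # void tags never get an end tag
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

def signature(html):
    parser = TagPathSignature()
    parser.feed(html)
    return parser.paths

def similarity(a, b):
    """Jaccard-style overlap between two page signatures, in [0, 1]."""
    inter = sum((a & b).values())
    union = sum((a | b).values())
    return inter / union if union else 0.0
```

Grouping pages whose pairwise similarity exceeds a threshold recovers the "high repeat rate" template clusters the posting describes; NLP/ML field extraction would then run only inside the dominant cluster.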
Fixed-Price - Intermediate ($$) - Est. Budget: $1,000 - Posted
Hi, I need a sentence-formatting engine that can re-word sentences. Example:
I saw a mouse jumping from the table.
The mouse jumped from the table and I saw it.
I was watching the table and the mouse jumped from it.
So basically each sentence should be re-worded and should still make sense. I need people who have experience with this. Let me know what you need. I need to run this on my server using PHP, but let me know if it can be implemented in another language or technology. The budget is negotiable; let me know your price.
Skills: Natural language processing, English Grammar, Machine learning, Perl