Job details

Annotators needed to evaluate the quality of search results from a search engine

Posted

2 months ago

Worldwide
Needs to hire 12 Freelancers
Who we are:
Researchers at the Allen Institute for AI (AI2), a non-profit research institute in Seattle, WA. Our team, Semantic Scholar (https://www.semanticscholar.org/), is working to create a better search experience for our users by analyzing search results for a set of queries over the course of a year (12 months), with the possibility that the task will extend longer.

Who we're looking for:
Undergraduate students, graduate students, or professionals with coursework or professional experience in computer science or biomedicine. Ideally, applicants will have experience using academic search engines to search for research papers in their field of study.

What the task involves:
In this task, you will compare search results from two search engines. You will be given a query and, based on the search results, will determine which search engine produced more relevant results. The two search engines (A and B) will be anonymized.

You will be asked to select:
1. One of three comparative judgments, along with the reason why: 'Is Search Engine A more relevant to the query than Search Engine B?', 'Is Search Engine A equally relevant to the query as Search Engine B?', or 'Is Search Engine A less relevant to the query than Search Engine B?'
2. 'Is Search Engine A overall good?' (yes/no) and 'Is Search Engine B overall good?' (yes/no)
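For illustration only, a single completed judgment could be recorded along the lines of the sketch below. This is a hypothetical structure; the field and option names are assumptions, not the actual annotation tool's format:

```python
# Hypothetical sketch of one annotation record; field and option names
# are illustrative assumptions, not the actual annotation tool's schema.
from dataclasses import dataclass
from typing import Literal

@dataclass
class SearchComparisonAnnotation:
    query: str
    # Three-way comparative judgment between the anonymized engines A and B.
    comparison: Literal["A_more_relevant", "equally_relevant", "A_less_relevant"]
    reason: str                   # the free-text 'why?' for the comparison
    engine_a_overall_good: bool   # 'Is Search Engine A overall good?' (yes/no)
    engine_b_overall_good: bool   # 'Is Search Engine B overall good?' (yes/no)

# Example of a filled-in record for one query.
example = SearchComparisonAnnotation(
    query="contrastive learning for dense retrieval",
    comparison="A_more_relevant",
    reason="Engine A's top results are on-topic papers; Engine B surfaces unrelated work.",
    engine_a_overall_good=True,
    engine_b_overall_good=False,
)
```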

Note:
To be eligible for this job, you will be screened by completing one (paid) annotation task.

This task will run monthly for a year (12 months), with a possibility that it will continue for longer. Computer science annotators will see 125-200 queries a month (about one to two hours of work), while biomed annotators will see 300-375 queries a month (about three to four hours of work). Overall, computer science annotators can expect 12-24 hours of work over the course of the year, and biomed annotators 36-48 hours. Annotations will be reviewed for quality, and contract extensions for this task will depend on the delivery of high-quality annotations.

This project is subject to the terms and conditions of the attached Participation Agreement and Recording Consent (the "Agreement").  By agreeing to participate in the project, you expressly accept and agree to the terms of the Agreement.
  • Hourly: Less than 30 hrs/week
  • Project length: More than 6 months
  • Experience level: Intermediate (a mix of experience and value)
  • Hourly rate: $15.00-$20.00
  • Project type: Ongoing project

Skills and Expertise

Computer Science, Biology, Research, Chemistry

Activity on this job

  • Proposals: Less than 5
  • Last viewed by client: 22 hours ago
  • Hires: 8
  • Interviewing: 2
  • Invites sent: 0
  • Unanswered invites: 0

About the client

Rated 5.00 out of 5 (19 reviews)
  • Seattle, United States
  • 12 jobs posted
    84% hire rate, 1 open job
  • Member since Jun 26, 2020