Quality contributions on Crowdsource
Ankur Sharma with Alex Gogarty, August 2022
They say it takes a village to raise a child. At Crowdsource, we believe this applies to us more than most. Our system is powered by contributions from people all over the world; the more quality data we receive from our “village”, the better a service we can provide to our users.
These contributions are crucial to making Google’s products accessible and inclusive for all. But how do we make sure they’re accurate? This is a topical subject at Crowdsource, so in this blog we’ll be demystifying some fundamentals of ensuring quality contributions. Let’s get into it!
The journey to genuine
The data contributed by our users is used by machine learning algorithms to power various Google products, which make life better for people across the globe. Some of this data is even open sourced and made available free of charge, allowing others to use it to solve problems that might otherwise have gone unsolved.
Naturally, with this data powering so much activity, accuracy is key. To gauge quality, we ask ourselves an important question with every contribution: was this made with a ‘genuine’ intention?
Let’s review the actual journey of a contribution:
- It begins with a contributor responding to a question.
- Every question is asked to at least 5 users in order to capture different opinions.
- The system then combines the answers identified as genuine into one single response.
- This single, combined response is stored temporarily and used at a later stage to develop a machine learning model.
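The steps above can be sketched as a simple majority-vote aggregator. This is a hypothetical illustration only: Crowdsource's actual combining logic is not public, and the function name, skip handling, and majority threshold here are all assumptions.

```python
from collections import Counter

def aggregate_answers(answers, min_responses=5):
    """Combine per-question answers into one response by majority vote.

    `answers` is a list of strings; None represents a skipped question.
    Hypothetical sketch -- not Crowdsource's real aggregation algorithm.
    """
    # Skips are excluded from voting, but in a real system they would
    # still signal that the question may not be clear enough.
    votes = [a for a in answers if a is not None]
    if len(answers) < min_responses or not votes:
        return None  # not enough data to trust a combined answer
    answer, count = Counter(votes).most_common(1)[0]
    # Require a strict majority of the non-skip votes before
    # accepting the combined answer as "genuine".
    return answer if count > len(votes) / 2 else None

# Example: four people answer, one skips; "yes" wins a clear majority.
print(aggregate_answers(["yes", "yes", "no", None, "yes"]))  # yes
```

A real pipeline would likely weight contributors by track record rather than count each vote equally, but a plain majority vote captures the idea of at least 5 opinions being reduced to one response.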
Genuine contributions are at the heart of this process. A simple maxim is used to guide the user in providing an answer that is most accurate to their experience:
“Put in your best effort, but don’t overthink it.”
Users are instructed to read the question carefully and then respond with what they think is correct. However, we also encourage them to move on, or “skip”, if they are not sure of the answer.
This is a really important factor: skipping a question signals that it was not clear enough, which helps us make changes and improve our questions.
Pumpkin or pump-con?
Here is a common example of this practice in place.
Take a look at the image below. Does it contain pumpkin soup?
Image Courtesy: Open sourced from Crowdsource
This might look delicious, but it’s also pretty ambiguous! For most people, the picture lacks enough clarity to identify the type of soup. If you thought about skipping this question, you made the right choice.
(In fact, this is actually tomato soup, topped with pepper that may look like seeds. Yum!)
Let’s try another. Does this image contain pumpkin soup?
Image Courtesy: Open sourced from Crowdsource
To a lot of people, this picture would be considerably clearer than the last one. If your answer was yes, then you’re correct: it contains pumpkin seeds.
However, this is also a good example of answering questions to the best of our knowledge. If you didn’t recognise the second picture as pumpkin soup, don’t worry! There’s no need to dwell on it for too long – the best thing to do is skip the question.
(TIP: If you find yourself skipping too many questions, feel free to switch the category on the Crowdsource app.)
Time to start quizzing!
Hopefully you found this insight useful, and are ready to make a genuine impact by providing genuine contributions! All you need to do is remember: ‘put in your best effort, but don’t overthink it.’
If you’re new here, you can get started at the main Crowdsource website or download our Android app. We would love to hear your feedback and thoughts! Please feel free to write to us on Facebook or find us on Twitter using #GoogleCrowdsource.
Thank you for reading!
Try Crowdsource. Make a difference.