Suggestion: Fiverr could do a text analysis of each review to help verify ratings


uk1000


Sometimes a buyer can leave a great, genuine review, eg. "Quick Job. Great service!", but give low ratings for all fields (eg. 1 star).

If the seller contacts support about the review, they risk getting an account warning, even though the rating as it stands is most likely invalid.

What Fiverr could do is run an automated text analysis of every review, including each new review as it's created (eg. sentiment analysis) - at least for the most recent reviews - and compare that with the rating value given. If the two are very different, the review could be flagged in some way and either manually checked by staff or an automated message sent to the buyer to confirm the rating (while showing them their review and additional info about what the ratings mean). Maybe if the system determined the rating was very likely invalid and there was no response from the buyer, the review could be removed/hidden until the buyer did respond to verify it.
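Eg. here's a very rough sketch of that check in Python (everything in it is hypothetical: the word-count function is just a toy stand-in for a real sentiment model, and the threshold is arbitrary):

POSITIVE_WORDS = {"great", "quick", "excellent", "good", "amazing"}
NEGATIVE_WORDS = {"bad", "poor", "worst", "late", "unresponsive"}

def estimate_rating_from_text(review_text):
    # Toy stand-in for a real sentiment model: count simple cue words
    words = set(review_text.lower().replace(".", " ").replace("!", " ").split())
    pos = len(words & POSITIVE_WORDS)
    neg = len(words & NEGATIVE_WORDS)
    if pos == neg:
        return 3.0  # neutral, or can't tell
    return 5.0 if pos > neg else 1.0

def review_needs_verification(review_text, star_rating, threshold=2):
    # Flag the review when the text and the star rating clearly disagree
    return abs(estimate_rating_from_text(review_text) - star_rating) > threshold

print(review_needs_verification("Quick Job. Great service!", 1))  # True - flag it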

Doing this would allow the ratings on Fiverr to be more accurate, since they'd have been through more verification.


1 hour ago, uk1000 said:

Doing this would allow the ratings on Fiverr to be more accurate, since they'd have been through more verification.

What about giving the buyer a quick "preview" of their review, so they can either confirm that's how they want to rate or go back and edit?


57 minutes ago, vickieito said:

What about giving the buyer a quick "preview" of their review, so they can either confirm that's how they want to rate or go back and edit?

Yes, I think they should do that. Though I think they could still run a sentiment analysis too, so the preview could let the buyer know if the system detected a possible discrepancy between their text review and their rating, tell them more about the numerical ratings (eg. 1=low, 5=best, fully satisfied), and then allow them to confirm or edit it.


@uk1000 There's no way for an automatic system to decide which star rating is correct based on the review text, so as you say, Fiverr would have to reach out to the buyer to confirm the rating, adding yet another step to what's already a convoluted review process.

I don't think buyers are going to love that. The whole thing's a pain as it is.

First, there are the three different star ratings, then you're expected to write a manual review. Next, Fiverr nudges you to complete a private survey, peppering their messages with serious-sounding subject lines like [ACTION REQUESTED].

Even as a seller, I don't bother reviewing other sellers anymore, unless the experience is a disaster.

Adding a preview, like @vickieito suggests, would also add another step to this process. 

And for what? Because, on a very rare occasion, a negative rating was meant to be positive? I can't imagine this happening that often. But perhaps it would happen less frequently if the process were simpler.

In my opinion, the review process should be streamlined, not complicated further. How about a single star rating, with the private review requested at the same time? Less is more sometimes.

I mean... Look at my inbox! And I'm not a "big buyer". 

[Screenshot: an inbox full of Fiverr review-request emails]


23 minutes ago, smashradio said:

There's no way for an automatic system to decide which star rating is correct based on the review text, so as you say, Fiverr would have to reach out to the buyer to confirm the rating

An automated system can calculate an approximate score for what the rating should likely be, based on that text review.

See https://en.wikipedia.org/wiki/Sentiment_analysis

Quote

Sentiment analysis is widely applied to voice of the customer materials such as reviews and survey responses

It's a thing there are APIs for, including open source ones written in Python, so there should be readily available software interfaces/functions for Fiverr to use to get an approx score. You could even try asking a language model to rate the text (ie. to perform a sentiment analysis of it) with a score of 1-5, and it will give you a score.
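Eg. a minimal sketch of the language-model approach, assuming the OpenAI Python client (the model name and prompt are just illustrative, and it assumes an API key is configured):

from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set

def llm_rating(review_text):
    # Ask the model to rate the review's sentiment from 1 to 5
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {"role": "system", "content": "Rate the sentiment of the user's review on a scale of 1 (very negative) to 5 (very positive). Reply with the number only."},
            {"role": "user", "content": review_text},
        ],
    )
    return float(response.choices[0].message.content.strip())

print(llm_rating("Quick Job. Great service!"))  # eg. 5.0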

It doesn't have to give the exact score the buyer would have given if they'd taken their time to rate properly, knowing what each rating value means (rather than just clicking anything to finish the rating, or being mistaken about the meaning of the different rating scores). It just needs to detect whether there is a big enough discrepancy between the expected rating based on the text review and the actual numerical rating given by the buyer, so it knows whether to let the buyer know about that discrepancy and ask them for confirmation (or allow them to edit it if necessary).

It could show them a preview of it too, but letting them know about the possible discrepancy, and that the system was checking to make sure reviews were accurate, would likely make them check more carefully than a preview of the rating and a confirm/edit button alone.


Here's one free, open source Python package that might be able to do the sentiment analysis for the Fiverr review text:

https://guides.library.upenn.edu/penntdm/python/textblob

This is code perplexity.ai suggested - but edited by me (just using this as an example to show how easily it might be done):

from textblob import TextBlob

def get_sentiment_score(text):
    blob = TextBlob(text)
    sentiment = blob.sentiment.polarity  # polarity is in the range -1 to 1
    score = round(((sentiment + 1) / 2) * 5, 1)  # scale -1..1 to the 0-5 range
    return score

text = "Quick and easy job - he was complete in 10 min. Great service!"
score = get_sentiment_score(text)
print(score)  # about 3.7 for this example text

So that could give an approx score for what the rating probably should have been, given the text the buyer wrote.

Then you could find the difference between that and the buyer's rating,

eg. diff = abs(score - buyers_rating)

then, if that difference is more than you expect, let the buyer know (or maybe flag it for staff to check) so the buyer can either edit or confirm it,

eg. if diff > 2 ...
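Putting those pieces together with the get_sentiment_score() function above, a minimal sketch might be (the threshold of 2 is just the example value above, and what happens when a review is flagged is left as a placeholder):

def check_review(review_text, buyers_rating, threshold=2):
    # Compare the buyer's star rating with the score estimated from their text
    score = get_sentiment_score(review_text)
    diff = abs(score - buyers_rating)
    if diff > threshold:
        # Here Fiverr could message the buyer to confirm/edit, or flag it for staff
        print(f"Possible invalid rating (diff {diff:.2f}) - ask the buyer to confirm")
    return diff

check_review("Quick Job. Great service!", 1.0)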


4 hours ago, uk1000 said:

If the seller contacts support about the review, they risk getting an account warning, even though the rating as it stands is most likely invalid.

Not sure why people aren't pushing back on this one - an account warning for seeking clarification from the support team? That's absurd.

Support functions, such as customer service, HR, training, compliance, IT, etc., exist to ensure that operations run smoothly - in this case, between buyers and sellers. No one should be penalized for contacting support.

Things we've allowed to be normalized by not pushing back where we have to 🤦‍♂️


7 minutes ago, yourbrandingpal said:

an account warning for seeking clarification from the support team? That's absurd.

It's that if they think you're trying to get a review changed, or to get a buyer to change a review (eg. in an event like this, where the review seems obviously wrong), they can give an account warning (thinking you're trying to manipulate the reviews or pressure the buyer into changing one). So it can be risky asking Fiverr about them currently, but reviews can also be left invalid in the current system.


10 minutes ago, yourbrandingpal said:

Not sure why people aren't pushing back on this one - an account warning for seeking clarification from the support team? That's absurd.

 

It clearly says that feedback manipulation is not allowed - check the terms of service. You are trying to manipulate the feedback that someone left after the order. I actually had the same situation that person did, with the lowest possible rating and positive text. People leave feedback randomly, which makes me believe some private feedback is also random. I hadn't had a bad review for many months, yet I was in the "gutter" for 3 months or so without any clarification of how bad my buyer satisfaction rate was.

Honestly, the best approach here is to let Seller Plus subscribers see their buyer satisfaction rate, at least as a score out of 100, so we can understand where we fall and how much we can improve. I for one went overboard with my customer service, extra words, etc., and eventually I started to receive messages from new clients. But we should know when/if the buyer satisfaction rate is going up or down, and showing that via a score out of 100 in Seller Plus would keep buyer privacy while still helping sellers. That is a must-have, in my opinion. Otherwise, as a seller you're always in the dark, and constantly nagging your seller manager isn't something I'm OK with. I am not that kind of person.


Just now, uk1000 said:

It's that if they think you're trying to get a review changed, or to get a buyer to change a review (eg. in an event like this, where the review seems obviously wrong), they can give an account warning (thinking you're trying to manipulate the reviews or pressure the buyer into changing one). So it can be risky asking Fiverr about them currently, but reviews can also be left invalid in the current system.

Yeah, that's what's worrying – giving this kind of power to a team that sometimes responds in a scripted manner. I mean, think of instances where your intentions aren't malicious, but you end up getting warned because the support team misunderstood you. 

1 minute ago, donnovan86 said:

You are trying to manipulate the feedback that someone left after the order.

In genuine cases, it could be termed a correction, not manipulation. There should be some sort of mechanism in place to address isolated instances where the concern about a bad review is genuine. I mean, when I run ads on Google and the ads are disapproved (automatically or after a manual review), I get to appeal the disapproval. It's not perceived as "oh, this advertiser is trying to circumvent the system".

I am only trying to emphasize the fact that sellers should be able to get their concerns addressed without being perceived as manipulative.


6 minutes ago, yourbrandingpal said:

In genuine cases, it could be termed a correction, not manipulation.

As I said, I had this happen, and the review is still on my account. Someone went to Fiverr behind my back, without me asking for that, and requested that Fiverr remove a review so they could write another one - but as soon as they did, I received an account warning. And that was without me even asking for anything. So yeah, as soon as I got that single star, I just took it and moved on. It's not worth receiving another warning. If you report it, say it's wrong, and ask for another review, you're basically seen as manipulating the system.

8 minutes ago, yourbrandingpal said:

I am only trying to emphasize the fact that sellers should be able to get their concerns addressed without being perceived as manipulative.

Reviews are final and can't be changed. Trying to change them is obviously seen as manipulation. And that comes from someone who had the random review thing happen to them. So if you encounter this, just move on.


21 hours ago, smashradio said:

There's no way for an automatic system to decide which star rating is correct based on the review text

So with this free, open source library that I linked to before: https://guides.library.upenn.edu/penntdm/python/textblob

I just tested the library function.

With code to scale the sentiment score to 1 to 5:

from textblob import TextBlob

def get_sentiment_score(text):
    blob = TextBlob(text)
    sentiment = blob.sentiment.polarity  # polarity is in the range -1 to 1

    score = (sentiment + 1) * 2.5  # scale -1..1 to 0..5
    if score < 1:
        score = 1  # clamp to the rating floor (polarity below -0.6 would map below 1)
    score = round(score, 2)  # rounding is optional; leaving it unrounded would be slightly more accurate
    return score
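For reference, here's a sketch of the kind of test loop that could produce the output below (the list of (manual_rating, review_text) pairs is abbreviated, and 2.4 is the threshold explained after the results):

reviews = [
    (1.0, "Good thank you"),
    (5.0, "Excellent work! We love our video! It looks amazing!"),
    # ... the rest of the (manual_rating, review_text) test pairs
]

for manual_rating, text in reviews:
    score = get_sentiment_score(text)
    diff = round(abs(score - manual_rating), 2)
    print(f"Manual Rating: {manual_rating}, Sentiment Score: {score}, "
          f"Rating diff: {diff:.2f} Review Text: {text}")
    if diff > 2.4:
        print(f"     ** This review has a high diff of {diff} between "
              f"sentiment score and manual rating. ** Review Text: {text}")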

These are the results of a test of how well it could judge example review text:

Manual Rating: 1.3, Sentiment Score: 2.48, Rating diff: 1.18 Review Text: Seller has extreme difficulty interpreting what is written and understanding what is requested. Not only the issue with communication, but also the lack of creativity of the professional came up during the job execution. 
Manual Rating: 1.0, Sentiment Score: 1.8, Rating diff: 0.80 Review Text: Not flexible, the quality was not good, hard to communicate with, no relevant images, ..etc
Manual Rating: 1.0, Sentiment Score: 2.5, Rating diff: 1.50 Review Text: He was unresponsive, took unannounced absences for weeks during the project, unable to complete detailed revisions in the project
Manual Rating: 3.0, Sentiment Score: 3.35, Rating diff: 0.35 Review Text: I found the animations to be good. Overall the video was great. Seller did ask me for an extra day and when I asked him to show me the progress he got really defensive.
Manual Rating: 1.7, Sentiment Score: 4.0, Rating diff: 2.30 Review Text: Nice job, but delay in delivery
Manual Rating: 1.0, Sentiment Score: 1.43, Rating diff: 0.43 Review Text: Very bad service on Fiverr. Reason: After I place order he demand extra money and I paid. But he delivered the worst Voiceover he record with phone. I really dissatisfied with work and I don't recommend to anyone.
Manual Rating: 2.3, Sentiment Score: 2.13, Rating diff: 0.17 Review Text: the gig is very expensive but does not deliver what is expected very simple but charges high
Manual Rating: 2.3, Sentiment Score: 2.31, Rating diff: 0.01 Review Text: It took too long to get that work delivered and it's not really what I was looking for as he didn't provide the changes I asked.
Manual Rating: 4.0, Sentiment Score: 4.5, Rating diff: 0.50 Review Text: Thank you, great service.
Manual Rating: 4.3, Sentiment Score: 3.56, Rating diff: 0.74 Review Text: Not bad, but it could be better if seller can have an easier communication.
Manual Rating: 4.3, Sentiment Score: 4.64, Rating diff: 0.34 Review Text: Very good experience working with him. Timely delivery and great communication skills.
Manual Rating: 1.0, Sentiment Score: 1, Rating diff: 0.00 Review Text: Cancelled order. Seller failed to deliver on time!
Manual Rating: 4.3, Sentiment Score: 3.38, Rating diff: 0.92 Review Text: Overall good
Manual Rating: 4.3, Sentiment Score: 4.5, Rating diff: 0.20 Review Text: Great job
Manual Rating: 4.0, Sentiment Score: 3.62, Rating diff: 0.38 Review Text: very prompt with changes required and good value
Manual Rating: 4.0, Sentiment Score: 2.5, Rating diff: 1.50 Review Text: thank you for getting it done
Manual Rating: 5.0, Sentiment Score: 3.5, Rating diff: 1.50 Review Text: I am so happy with the final result of my video! Seller was so patient
Manual Rating: 5.0, Sentiment Score: 3.83, Rating diff: 1.17 Review Text: Seller is nothing short of amazing. He always delivers the best, most professional, top-notch videos.
Manual Rating: 1.3, Sentiment Score: 2.62, Rating diff: 1.32 Review Text: It appears this is one of those experience you wouldn't like to have. Placing an order and not be being able to make modifications and even worse he decided to block our conversation because.... 
Manual Rating: 3.0, Sentiment Score: 1.5, Rating diff: 1.50 Review Text: We worked with him for about a year. But the voice quality has decreased over time. We've talked about this before, but nothing has changed.
Manual Rating: 4.3, Sentiment Score: 5.0, Rating diff: 0.70 Review Text: Very happy with these VOs
Manual Rating: 1.0, Sentiment Score: 4.25, Rating diff: 3.25 Review Text: Good thank you
     ** This review has a high diff of 3.25 between sentiment score and manual rating. ** Review Text: Good thank you
Manual Rating: 4.0, Sentiment Score: 3.62, Rating diff: 0.38 Review Text: He was good price and was fast.
Manual Rating: 4.0, Sentiment Score: 3.46, Rating diff: 0.54 Review Text: Quick precise and super friendly!
Manual Rating: 1.0, Sentiment Score: 3.67, Rating diff: 2.67 Review Text: Quick and easy job - he was complete in 10 min. Great service!
     ** This review has a high diff of 2.67 between sentiment score and manual rating. ** Review Text: Quick and easy job - he was complete in 10 min. Great service!
Manual Rating: 5.0, Sentiment Score: 3.17, Rating diff: 1.83 Review Text: I want to thank him so much, he was so professional and he deliever it before the deadline, ofcourse i will do more projects with him
Manual Rating: 5.0, Sentiment Score: 2.67, Rating diff: 2.33 Review Text: Seller has helped me out three times already with very sick narration that doesn't require additional reviews or much editing. Impeccable result every time for a very affordable price!
Manual Rating: 5.0, Sentiment Score: 4.48, Rating diff: 0.52 Review Text: Excellent work! We love our video! It looks amazing!

So I said: if rating_diff > 2.4, then output that there's a high difference (ie. between the sentiment score - the expected rating - and the manual rating).

So you can see in the test above that, of the example reviews, the only 2 it flagged are the ones that are most likely invalid ratings (all the other test reviews had rating differences lower than 2.4),

ie.

Manual Rating: 1.0, Sentiment Score: 4.25, Rating diff: 3.25 Review Text: Good thank you
     ** This review has a high diff of 3.25 between sentiment score and manual rating. ** Review Text: Good thank you

Manual Rating: 1.0, Sentiment Score: 3.67, Rating diff: 2.67 Review Text: Quick and easy job - he was complete in 10 min. Great service!
     ** This review has a high diff of 2.67 between sentiment score and manual rating. ** Review Text: Quick and easy job - he was complete in 10 min. Great service!

So that seems to be some evidence that this free library of functions (or something like it, eg. if they aren't using Python there) may be able to help automatically decide whether to ask the buyer for more validation of the review (or to ask CS, if they wanted). Though there are probably even better functions available for this, like ones that use large language models.

