Everything posted by charlsmcfarlane

  1. I think there are lots of ways to ask for a review. I have a quick response saved, which I wrote myself and have been using for years. In that time I have never had any dispute with Fiverr over review manipulation.
  2. Maybe it's because you've stressed that it's important. Getting a negative review wouldn't help you, so by saying it's important, there's an implication that it's important for you to receive a positive review. Also, saying it's very important for you could make the buyer feel guilty for not leaving a review. I don't know if any of this is right, by the way. I'm just throwing ideas out there as to why your comment was flagged.
  3. My concern with this is that, in most cases, a seller should be more on top of their workload than to need a 1-hour reminder. Scrambling to finish the work in the last hour isn’t a great experience for the buyer or the seller, and will likely result in the work being rushed. Maybe it’s different for orders with short turnarounds like 24 hours. I don’t know. Even then, still working until the last hour definitely isn’t optimal. It’s probably worth requesting an extension in the spirit of prioritising quality over haste. I guess it’s good for a situation where you’ve done the work but have forgotten to actually deliver it, but I think the 12-hour reminder is good enough for that, as it gives the seller enough time that the buyer won’t feel as though the work was delivered at the last second. I can’t see much downside in Fiverr adding a 1-hour reminder, though. I just can’t see much upside either.
  4. I find that maybe 90% of the time I ask a buyer for a review (after auto-completion), they actually leave one. When I don’t ask, I almost never get people leaving a review after an auto-completion.
  5. My reviews are largely unchanged. I’ve had a couple of 4s for “value for money”. I also had another review that was a 3.2 or something overall, but that one just went badly and I don’t think it has anything to do with the new review system. I’ve had almost as many 5-star reviews as normal. It feels like instead of 99/100 reviews being 5 stars, it’s more like 9/10 now. It’s harder, but it certainly doesn’t feel like the end of the world. Weirdly, I think charging more is helping the value for money score, as it seems to attract more buyers for whom cost is less important than quality.
  6. Very true. That said, rubbish sellers would then be highlighted and drop in the rankings, which helps Fiverr elevate decent sellers.
  7. I hope this isn’t factored into the success score. A buyer’s lack of interest/timeliness/efficiency shouldn’t have any bearing on how the system evaluates sellers.
  8. My suggestion was that the system restrict buyers from requesting revisions beyond the number included in the order (see the revision-gating sketch after the last post below). As I said in my post, it’s rare I get these issues, and I think part of the reason is that I offer premium services that attract particularly good buyers. Perhaps raising prices might help? Either way, sellers need to be comfortable having tough conversations with buyers when necessary, and bringing support in as a last resort. That won’t ever change. I just think the system needs to align with itself. If a seller includes 2 revisions, for example, that should be the maximum the buyer can request through the system. As I said in my original post, there would still need to be a way to request a free revision for when a seller has objectively got it wrong, but misuse of that option should be something that can get a buyer’s account flagged. That way, the behaviour of just hitting the “request a revision” button for all eternity won’t continue.
  9. Did we ever get a concrete (transparent) answer on whether deleting a low-performing gig will have a positive influence on the overall success score? I have a few gigs that I'd happily sacrifice for a positive influence on the success score, but they do occasionally bear fruit, so I'd rather keep them if it makes no difference. Thanks
  10. Some people have reported seeing a drop in success score. I had several of my lower-scoring gigs drop from an 8 to a 7, but luckily they are my lower-traffic gigs too, so it didn’t visibly affect my aggregate score. I wish they would add a decimal point though, so we know roughly where we are.
  11. There are a few success score improvements that would be very helpful.
      1. Please show a decimal point, so sellers have some idea of where they stand within their rating. I have a rating of 9 but no idea whether I’m about to drop to 8 or almost at 10. This could also be achieved by converting the 1-10 scale to a 1-100 scale (see the dashboard sketch after the last post below).
      2. Having the individual gig insights hidden behind so many clicks is really clunky. It would be much simpler to show how the success score is derived on a dashboard below the current stats, or when the success score arrow is clicked: all the gigs in a grid, with all of the insights already visible. Right now, having to click on every gig, one by one, to see which metrics are contributing to the success score is very slow and gives no ‘bird’s eye view’ of the stats. Every seller seems to agree that far more detail on how the success score is calculated needs to be shown, yet even the limited detail we can see is buried in submenus requiring lots of individual clicks. Maybe this is less of an issue for sellers with 2-3 gigs, but I have 21, and the system is not fit for that number. This information could also be shown in the “gigs” list as an extra column.
      3. Provide an overall view of things to work on and maintain that are consistent across all gigs. Examples: “We’ve noticed a dip in your communication performance recently. This might be something to reflect on and address”, and, “One of your key strengths recently is ‘delivery time’. Keep up the good work”. This could even be shown as a radar chart of all the metrics that comprise the success score, highlighting overall areas for improvement.
      4. Show specific orders where a seller fell short on a certain stat. This wouldn’t be appropriate if the data are derived from a private review, but where a metric comes from objective data, showing 3-5 orders that damaged it gives the seller something concrete to review, so they can reflect on their performance.
      5. Completely rewrite the “how to improve the success score” guidance. Right now it could be useful for new sellers or sellers that are no good, but any vaguely decent seller will read everything on that page and think, “yes, I already do all of this.” The guidance needs to be tailored to the individual seller, with specific suggestions that directly address the ways in which that seller has fallen short recently.
      Note: the success score is now fundamental to every seller’s ability to thrive on this platform, so please do not make any of these features exclusive to Seller Plus members. They represent minimum functionality that every seller should have access to. Thanks for considering these suggestions.
  12. I don’t disagree, but I can’t see Fiverr going back to the old system. I’m hoping they will be open to improving the new system, though.
  13. It seems that, in some cases, buyers don’t understand the impact the new feedback form has on the seller’s rating. Can we have a 5-star rating next to the submit button that shows what their emoji selections translate to in stars? Another way would be a pop-up that says, “Based on your selections, this order will receive a rating of 4.3 stars. Does that align with your opinion of the seller’s performance?” (See the rating-preview sketch after the last post below.) In the spirit of transparency, this would give buyers the context to understand what their rating means in real terms. Thanks
  14. I get that this thread is only a small sample size, but it’s a sample of the people who are the most passionate and invested in the platform. Also, when everyone is saying the same things, it’s relatively safe to infer the same sentiment across the whole population.
  15. Somewhat counterintuitively, I wonder if the way to do well with this score is to raise prices. If I raise prices to the point where the only people purchasing are buyers for whom price is not an important factor, those buyers are less likely to rate value for money poorly, because cost matters less to them. They see the value in paying more for a good service, so quality holds more importance for them than cost. Does this make sense? I have no evidence to support this idea. It's just a random thought I had.
  16. If you could wave a magic wand and change one part of this whole update, what would it be? For me, it’s replacing the “Value for Money” question in private and public reviews with something meaningful.
  17. However, almost every answer we have had so far has been, “speak to support”.
  18. As much as almost 750 replies speaks for itself, it’s going to be pretty hard for Fiverr staff to find and address everyone’s specific concerns. I think we need a completely new guide that explains everything in detail. Not just that communication is informed by one’s “politeness” and “responsiveness”, but how each thing is actually calculated: which metrics come from subjective reviews, which come from objective data, and specifically what data. The more I read, the more complex the questions get. For example, does a buyer requesting a revision damage a seller’s “conflict-free” metric? And beyond that, does it affect other metrics as well? Is a buyer requesting a revision considered a failure in the seller’s communication? (Obviously, in reality, no! But how does the system measure it?) My point is, there’s a huge number of questions, and I think they need to go back to the drawing board and create a comprehensive, transparent set of documentation on this. A few things simply need to be changed, plain and simple, but there are a lot of others we simply can’t evaluate yet because of the lack of clarity.
  19. This is absolutely right. I have a buyer right now who has come back once or twice a week for the last month and has left maybe 3 reviews total. The first happened because I asked for it, and they’ve left the odd review on subsequent orders, presumably when it’s been convenient for them. Every review from them has been a glowing appraisal, but there’s no way that will continue if I start bugging them to leave yet another review after every order. It’s up to customers when and if they leave a review. If Fiverr want more reviews, they should let sellers initiate them too, rather than leaving the keys to the process exclusively in buyers’ hands.
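
To make the revision-cap idea in post 8 concrete, here is a minimal Python sketch of how order-level gating could work. Everything here is an assumption: Fiverr exposes no such API, and the Order fields, function names, and the flag threshold are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Order:
    included_revisions: int    # revisions bundled with the package, e.g. 2
    used_revisions: int = 0
    error_claims: int = 0      # free-revision claims for objective seller errors

def request_revision(order: Order) -> str:
    """Gate the 'Request a revision' button by the package's included count."""
    if order.used_revisions < order.included_revisions:
        order.used_revisions += 1
        return "revision_started"
    # Past the included count, the button no longer works "for all eternity":
    # the buyer must buy an extra revision or claim an objective error instead.
    return "blocked_offer_paid_revision_or_error_claim"

def claim_free_revision(order: Order, flag_threshold: int = 3) -> str:
    """Escape hatch for when the seller has objectively got it wrong.
    Repeated use is treated as misuse and flags the buyer's account."""
    order.error_claims += 1
    if order.error_claims >= flag_threshold:
        return "revision_started_and_buyer_flagged"
    return "revision_started"

# With 2 included revisions, the third button press is blocked.
order = Order(included_revisions=2)
print(request_revision(order))   # revision_started
print(request_revision(order))   # revision_started
print(request_revision(order))   # blocked_offer_paid_revision_or_error_claim
```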
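For suggestions 1 and 2 in post 11, a small sketch of what a decimal (or 1-100) success score and a single-grid dashboard could look like. The metric names, values, and the unweighted mean are placeholders; Fiverr has not published how the score is actually aggregated.

```python
# Placeholder per-gig metrics on a 1-10 scale; the real inputs and weights
# behind the success score are unpublished, so these values are illustrative.
GIGS = {
    "logo-design": {"communication": 9.4, "delivery_time": 8.7, "conflict_free": 9.1},
    "voice-over": {"communication": 7.2, "delivery_time": 9.8, "conflict_free": 8.0},
}

def gig_score(metrics):
    """Unweighted mean as a stand-in for whatever aggregation Fiverr uses."""
    return sum(metrics.values()) / len(metrics)

def dashboard(gigs):
    """Every gig's insights in one grid, with no per-gig clicking (suggestion 2)."""
    for name, metrics in gigs.items():
        score = gig_score(metrics)
        detail = ", ".join(f"{k}={v:.1f}" for k, v in metrics.items())
        # Suggestion 1: surface 9.1/10 or 91/100 rather than a bare 9.
        print(f"{name}: {score:.1f}/10 ({score * 10:.0f}/100) | {detail}")

dashboard(GIGS)
```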
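And for the rating preview in post 13, a sketch of the emoji-to-stars translation the pop-up would need. The emoji names, the per-question mapping, and the simple average are all assumptions; the feedback form's real weighting is not public.

```python
# Assumed mapping from the feedback form's emoji choices to star values.
EMOJI_STARS = {"angry": 1, "sad": 2, "neutral": 3, "happy": 4, "delighted": 5}

def preview_rating(selections):
    """Average the per-question stars so the buyer sees, before submitting,
    e.g. 'Based on your selections, this order will receive 4.3 stars.'"""
    stars = [EMOJI_STARS[emoji] for emoji in selections.values()]
    return round(sum(stars) / len(stars), 1)

# One emoji per feedback question: (4 + 5 + 4) / 3 rounds to 4.3.
print(preview_rating({
    "communication": "happy",
    "quality": "delighted",
    "value_for_money": "happy",
}))
```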