Let's talk about reviews [12/21 Update: Section 2.1 added!]

1. Introduction

Hey everyone,

So if you haven't noticed, SFF Network has been fortunate enough to be able to do a few product reviews since we started the site earlier in the year. In fact, we've actually been doing a lot more component testing in the past month or so, with a pile of forthcoming reviews now in the pipeline, and a few of these already published. (Some serious kudos to confusis, by the way, for the many hours he's spent testing boxes upon boxes of coolers!)

To be honest with you all, though, this push of reviews is somewhat unexpected for us, especially since we originally planned on solely reviewing products of interest that we bought personally (or just happened to come across). That was certainly enough to get us going up to now, content-wise, but more recently we've actually started to hear from some companies, and reach out to others, as traffic to the news site and forum have both grown. From this, we've now found ourselves at a point where we're already receiving samples, provided by various companies, specifically for review purposes - a very exciting development, as it enables us to publish component reviews that we otherwise wouldn't be able to. (It's as if we're a legitimate news site!)

Anyway, I mention all of this because, as is the case for many technology websites, our reviews have quickly become an essential part of the content we publish, and drive a lot of the traffic and interest in our site...

...And yet, ever since I can remember, I've held a strong, deep-seated dissatisfaction with the way the great majority of product reviews are done in the industry today, with ours being no exception. Too often, product reviews are strongly tinted with the preferences and priorities of the reviewer, clouding how well a component might work for my own purposes. Most of the time, they don't tend to be critical enough, glossing over seemingly minor points that actually matter a great deal to some. Frequently, in fact, this is underscored by a lack of quantitative data to answer my questions, though more and more I've seen the opposite problem - a glut of charts and tables that completely bury the testing data that's actually relevant. And, worst of all, the very format of reviews, and their desire to distill themselves into rating scales and other arbitrary systems, doesn't help at all when trying to make comparisons, and often makes the process more difficult, not less.

I get the sense that I'm not alone in feeling this disappointment, either, even if most people don't think about or vocalize that feeling very much. But, perhaps most importantly, as an editor and contributor to a fledgling publication, I've found myself feeling somewhat responsible for voicing my thoughts, now that I'm in a position to do something about it. It's easy to complain as a seemingly powerless reader, but now I have no excuse - if I'm not critically evaluating how we make reviews here at SFF Network, any silence on my part is effectively an endorsement of how we do things. And I don't endorse how practically any publication or individual does reviews right now.

Up to today, my involvement in generating any of SFFN's reviews has been limited mostly to my day-to-day editing duties. But I feel compelled to capitalize on our opportunity to publish, and have felt for a while now that it would be an interesting and beneficial exercise to really think through how product reviews in the technology space are created - from format and testing methodologies, to ethics and writing styles. Consequently, over the next week or two, I'll be doing exactly that: writing a series of pieces that will look at what product reviews are meant to do, how the ones created today fail, and how we can possibly - just possibly - do better, for those who seek guidance in building their systems.

For years, I've tried to imagine what the "ideal" review might look like - the sort of review that can be the most helpful to the greatest number of readers - and I think I'm at a point where I can begin to flesh that out. At least, with the help of you all!

---

Everything I have to write about this couldn't possibly fit within a single post or article, so instead I'll be putting up my thoughts in a multi-part series, with updates every few days or so, starting with this one. These posts will also be published on SFFN, albeit edited somewhat so that each entry can stand on its own as an editorial piece.

Right now, I expect that the progression of these parts will look something like this:

1. Introduction (this post)
2.1 "What is a review?" (What are reviews meant to do?)
2.2 "The Problem" (What's wrong with technology reviews today?)
3.1 "The Deep Dive" (Analysis & critique of common review formats)
3.2 "Lessons Learned" (A high-level summary of what's generally right and wrong with existing product reviews)
4.1 "Building a Better Review" (Rolling up all the previous sections, outlining a format and style of review that seeks to address the problems with more traditional reviews)
4.2 "Looking ahead" (How SFFN will incorporate these ideas, taking in feedback, etc.)

During the time these go live on the thread, I encourage all of you to give me feedback, and engage in some good-ol' community discussion with respect to product reviews and related topics. As I continue to put thoughts into words, I'll be participating in that discussion, and getting a better sense of some of the positive experiences and frustrations that you all have with reviews, that I may not be thinking about.

Also, it's worth mentioning that, while SFFN's reviews will themselves change in response to this work (and your feedback), we won't be holding back on publishing reviews we're already working on, so don't worry about a dearth of content :) We may retroactively edit past reviews to whatever new format we implement down the line, but for the most part any changes in writing or process are meant to be forward-looking. At this point, having this discussion and bringing about these changes to our review process is more a personal pursuit of mine than anything else.

---

To conclude, this topic may seem like an unusual one to spend a lot of time and effort thinking over, but to me it is one of the most important parts of tech journalism. Product reviews by publications - especially large ones - hold incredible influence and sway, and have the capacity to empower millions of users to make better buying decisions, in a way that tangibly improves their lives. We as individuals couldn't possibly find the time, energy, and resources to thoroughly vet a field of competing products and discern which is best for us specifically, but journalistic entities can do this for us, and that has a huge impact on all of us as enthusiasts. It makes or breaks companies and products, and it sets the standards that those companies compete against for our dollars.

Simply put, product reviews have the potential to give us, as consumers, direction. And I want to see how far we, at SFF Network, can really take that.

Hopefully, in a few days, we'll begin to find out!

Thanks,
-PlayfulPhoenix
 

iFreilicht

FlexATX Authority
Feb 28, 2015
3,241
2,355
freilite.com
I'm really liking the way SFN (sorry, four letter abbreviations are not my thing) is headed and how professional everything feels. This editorial piece and the ones following are certainly in line with that feel, and I hope you will be able to make the reviews, or let them be made, in such a way that you are satisfied with them. Only showing temperature deltas on CPU cooler tests is a great first step in that direction, I believe.

Personally, I feel like a lot of reviews deliberately leave room for interpretation to allow readers to confirm their own beliefs about a product, which fuels discussion, which keeps the page views up. There is a lot of respect to be had for sites like SPCR or HardOCP, which take such an in-depth look at products that you almost feel dumb reading them.

What I personally want from reviews on here is special regard for the concerns of SFF. If you test a motherboard, I want to know whether there are very tall components on the backside, or whether I could save 2mm of space in a custom build. On boards with daughterboards, I want to know the height of those. With CPU coolers, I want to know about possible intrusions into keepout zones; with PSUs, odd placement of connectors.

Either way, I wish you and us the best of luck with this network; it's already shaping up to be something truly special in a world of numerous generic forums.
 

Phuncz

Lord of the Boards
Editorial Staff
Moderator
Gold Supporter
May 9, 2015
5,186
4,519
I'm very pleased to read this as the table hasn't been flipped on this topic enough.

I personally prefer Anandtech's reviews because the focus (sometimes very deeply) is on the facts and not at all on the score. A few reviewers there also share a motto that I feel answers a lot of "why does this product exist" questions, namely that there are no bad products, just bad prices. I like the deep delving into the tech - why a certain aspect is what it is - without focusing too much on how brand X "just is better".

This goes hand in hand with my personal motto: a product should be viewed as a product, not as a part of a brand. I like to read why design choices are made, substantiated by schooled insight and knowledge instead of nitrous-injected click-bait.

If I somehow can be of any useful assistance, I'd be glad to help out.
 

Regack

Trash Compacter
Sep 11, 2015
53
29
My personal pet peeve with reviews is a lack of physical measurements/dimensions of the product. For me, this seems to be something that is consistently lacking in reviews. Frequently, when measurements are provided, they are simply the basic length/width/depth provided by the manufacturer. GPUs, for example, vary quite a lot in size and shape, and a basic measurement doesn't really do justice when trying to figure out if you can fit something into a tiny little space.

I think @iFreilicht summed this up nicely, so I'm just going to repeat it verbatim and suggest that adding some detailed physical measurements to go along with this information would be fantastic.

What I personally want from reviews on here is special regard for the concerns of SFF. If you test a motherboard, I want to know whether there are very tall components on the backside, or whether I could save 2mm of space in a custom build. On boards with daughterboards, I want to know the height of those. With CPU coolers, I want to know about possible intrusions into keepout zones; with PSUs, odd placement of connectors.
 

PlayfulPhoenix

Founder of SFF.N
Original poster
SFFLAB
Chimera Industries
Moderator
Gold Supporter
Feb 22, 2015
1,049
1,960
I'm really liking the way SFN (sorry, four letter abbreviations are not my thing) is headed and how professional everything feels. This editorial piece and the ones following are certainly in line with that feel, and I hope you will be able to make the reviews, or let them be made, in such a way that you are satisfied with them. Only showing temperature deltas on CPU cooler tests is a great first step in that direction, I believe.

I'm still conflicted about the whole SFN-vs-SFFN thing :p The latter is accurate, but the former just looks and sounds and feels nicer. Ah well.

Anyways, we appreciate the feedback, and the kind words! With respect to SFF Network, I honestly only really see the areas where we need to improve - volume-wise, we need to create much more content, for example, and it's easy to see through our writing that we remain relatively new to publishing. I will say, though, that the diversity of our content, the tone of our writing, and how we're fielding said content are strong. Our execution needs to mature, but our resourcefulness is solid (at least in my own estimation). And execution is the sort of thing that can only be improved upon over time, anyway.

...To be clear, though, and if this thread wasn't indication enough, everything we're doing across SFF Forum and SFF Network is highly experimental in nature. We're definitely looking to turn these websites and platforms we've been working on into serious and prolific institutions for the SFF community and beyond - but we recognize that we're new to most of this, and we want to have fun while doing right by our fellow enthusiasts. So we'll be taking the time to try a bunch of stuff and make mistakes, since we think it's our best chance to make something great, and enjoy the journey of that creative process.

My personal pet peeve with reviews is a lack of physical measurements/dimensions of the product. For me, this seems to be something that is consistently lacking in reviews. Frequently, when measurements are provided, they are simply the basic length/width/depth provided by the manufacturer. GPUs, for example, vary quite a lot in size and shape, and a basic measurement doesn't really do justice when trying to figure out if you can fit something into a tiny little space.

I suggest adding in some detailed physical measurements to go along with this information...

Strangely enough, we haven't put a lot of thought into this, but it's a solid point. Frustratingly, manufacturers aren't at all consistent about how they measure dimensions for products that aren't otherwise standardized - the "height" of PCI devices being an oft-cited pain point, though there exist many others. Obviously, in the SFF space, 'actual' dimensions and tolerances are frequently make-or-break factors that carry significant weight when deciding whether or not to buy something.

In current reviews, we've been making note of whether or not a component exceeds the dimensions of, or otherwise interferes with, our test hardware or an industry standard (such as clearance above and around a motherboard's socket, for CPU coolers). But developing a standard of measure for the purposes of product reviews is an interesting idea, especially in the context of providing directly comparable data across multiple products. And as far as I know, no publication is doing it.

I'll have to bring this to the team and see what we can do to provide this sort of information. I don't know how detailed we can realistically get, but we should be able to do a lot, and the need is certainly there.

Personally, I feel like a lot of reviews deliberately leave room for interpretation to allow readers to confirm their own beliefs about a product, which fuels discussion, which keeps the page views up. There is a lot of respect to be had for sites like SPCR or HardOCP, which take such an in-depth look at products that you almost feel dumb reading them.

I think that, more broadly, most reviews tend to shy away from absolutes, and more specifically they tend not to elaborate on which particular users a given product may be good for. Sometimes they'll go halfway, and say "at this price point, this is good at x and bad at y", but it's rare that there isn't some ambiguity present.

For some of the later sections, I'm trying to source published examples that illustrate this sort of ambiguity, because it's really quite sneaky at times. In general, though, "experiential"-style reviews tend to lean a lot more on the qualitative opinion of the reviewer, at the expense of leaving more to interpretation on the part of readers.

What I personally want from reviews on here is special regard for the concerns of SFF. If you test a motherboard, I want to know whether there are very tall components on the backside, or whether I could save 2mm of space in a custom build. On boards with daughterboards, I want to know the height of those. With CPU coolers, I want to know about possible intrusions into keepout zones; with PSUs, odd placement of connectors.

We already do this to a point with existing review procedures, but to Regack's point, we could do a lot more. We'll have to think on this, and I'll elaborate on it in future posts.

My only real worry is that I'm not sure how much of these interactions we can really ever hope to pick up on, since we realistically can only review a component against so many configurations. We're not ever going to be able to say that "we can confirm that this is the complete list of CPU coolers that are incompatible with the daughterboard for this motherboard", for example - but we may be able to say that "given this height of the daughterboard, CPU coolers with this much clearance will interfere", or some such conclusion.

If I somehow can be of any useful assistance, I'd be glad to help out.

Participating in the discussion here will do a lot to inform me (and, thus, SFF Network) as we try to hone our review processes, and better serve our readers. So I encourage anyone reading this to chime in, even for the "little things", as this input will help immensely as we synthesize our thoughts and ideas.
 

NFSxperts

SFF Lingo Aficionado
Aug 7, 2015
112
53
Hey everyone,

Too often, product reviews are strongly tinted with the preference and priorities of the reviewer, clouding how well a component might work for my own purposes. Most of the time, they don't tend to be critical enough, glossing over seemingly minor points that actually matter a great deal to some. Frequently, in fact, this is underscored by a lack of quantitative data to answer my questions, though more and more I've seen the opposite problem - a glut of charts and tables that completely bury the testing data that's actually relevant. And, worst of all, the very format of reviews, and their desire to distill themselves into rating scales and other arbitrary systems, doesn't help at all when trying to make comparisons, and often makes the process more difficult, not less.

Thanks,
-PlayfulPhoenix


I don't think reviewer preferences and priorities are something that can be corrected. It isn't possible to provide and know every tiny little detail people are looking for.
(Maybe post in the forum asking what readers want to know before doing the review, or have two people review each product and combine their findings?)

I've stopped looking at review scores ever since GTA IV received a perfect 10/10. Nowadays, if something receives anything less than a 9, people perceive it as not worth their time, so reviewers usually end up padding the final score.

Some gripes I have with reviews:
Misleading charts
Inconsistent results compared to other sites
Inconsistent testing methods/environments from the same site
Reviews of new products that only compare against competing products, with no comparisons to older products
Reviews spread over 15 pages, with only one or two paragraphs per page
 

PlayfulPhoenix

Founder of SFF.N
Original poster
SFFLAB
Chimera Industries
Moderator
Gold Supporter
Feb 22, 2015
1,049
1,960
2.1: What are product reviews really meant to do?

Before we can ever understand the “why” of something, we must first consider the “what”. In the case of critically evaluating product reviews, then, we have to ask ourselves what the point of them is to begin with. So let’s first clarify what product reviews are supposed to do, and the problem they seek to solve, before we tear apart some of the reviews that are published today.

(I recognize that a lot of what I’m about to say will come off as almost painfully obvious - but bear with me. Having that purpose repeated, and clearly defined in our minds, should make it much easier to see whether or not a given review is actually fulfilling it.)

So what’s the ‘problem’ - or the ‘what’ - that reviews try to tackle? Well, if we take a step back, and look at what inspired product reviews to begin with, we quickly understand that the preponderance of reviews are a reflection of the many purchases that we all individually choose to make on a day-to-day basis. Furthermore, we buy products and services to begin with because they address a need or want that we have at a given moment. Simple enough.

If we lived in a world where there was only one product or service for all the needs and wants in the world, we’d be in a pretty crummy world, but at least our decision-making regarding what we purchase would be trivial - "just get the one that does the thing"! However, since we are fortunate enough to live in a world where we usually have many choices, that decision-making process is usually a lot more complicated. And this complexity is ultimately the real crux of the issue, in fact, since the best product or service for our individual needs is no longer self-evident - we must expend resources (usually in the form of time) in order to investigate and uncover it. As rational consumers, we may want whatever is best for us, but there’s a tangible cost associated with finding that.

Of course, in the PC component space, this ‘cost’ is compounded by the many interactions separate components have with our systems as a whole. For a silent-tuned build, for example, a particular video card may be the ‘best’ at staying silent for its performance class, but it might not be the best for us because it has a cooler that’s too large to fit in our enclosure. Or, perhaps a certain CPU cooler is the absolute ‘best’ performer for highly-overclocked chips, but it’s among the worst for us because it restricts use of other components in a way that negatively impacts our entire build.

Consequently, as you bring all of these dimensions into consideration, you begin to realize that this ‘cost’ of finding the best product for a particular use case is actually quite high for an individual to bear. And the only tool we really have to find the best for ourselves is research - the dredging of as much information as possible - which itself is fraught with challenges, since product information does not flow freely. More often than not, the information we need to make rational buying decisions is significantly constrained, and hidden beneath layers of distraction and obfuscation. Most quantifiable measures (that correlate with actual performance in use), in fact, aren’t even discoverable without buying and testing a component outright, defeating much of the purpose!

Through traditional retail channels, and without going well out of our way, we’re bombarded with a lot of marketing that doesn’t yield the sort of insights that can develop the informed perspective that we want. Furthermore, even as we’ve democratized retail platforms (especially online) by making feedback from buyers available, such testimonials - and even user reviews in the aggregate - are unreliable and poorly formatted. And that’s even when they’re abundant, which they often aren’t, particularly for newer products, or products that address specific niches. In combination with the sophistication of most electronics and components, the expense of investigating or testing things individually, and the sheer quantity of options that exist in most vertical markets today, it’s clear that customers who are left to fend for themselves are quite disadvantaged when it comes to deciding what to buy…

…And it’s at this point where we finally see the merit and place of product reviews, since publications are able to overcome nearly all of the limitations that we, as individual enthusiasts, face. Websites like AnandTech have the relationships with vendors that allow them to get hardware for testing, sometimes ahead of a product’s launch. They can afford the equipment and software and time investment needed to perform robust analyses. They have experienced writers who are good at distilling complex information into accessible advice. And they have the ability to specialize, and become experts, at performing testing and analysis in a consistent manner.

In a sentence, publications are able to provide consumers with the information they need to make the informed buying decisions that make the most sense for them. And I hope you’ve noticed my repeated emphasis of ‘for us’ or ‘for them’ at this point, because that distinction is perhaps the most important of them all: I don’t really care what’s best for ‘most’ people, or the ‘average’ person, or the reviewer. I care about what’s best for me, just as everyone else cares about what’s best for themselves! All else is distraction.

And yet, too often, that distinction seems to be lost, or the focus is on comparing product stats, instead of interpreting why they matter in the context of actual use cases. Thus, in the next section, I’ll take some time to elaborate on how everything from the target audience of product reviews, to the way in which most reviews are structured, contributes to a widening gap between what reviews do and what they’re actually meant to do.

Still, throughout that part - and the rest of this series, really - never forget that definition. “A product review should help its readers figure out the best product or service for their individual needs”. As obvious as it seems, practically everything I’ll write and discuss will be put up against that standard. And I think you’ll be surprised at what does and doesn’t make sense when we deliberately keep that definition fresh in our minds.
 

Phuncz

Lord of the Boards
Editorial Staff
Moderator
Gold Supporter
May 9, 2015
5,186
4,519
I'm looking forward to the other parts, I think I can feel what's coming in the next part and I've felt this for a while too.

I would like to suggest using some pointers or color to differentiate the content in each post, because when I glance away and I've forgotten where I was, I'm confronted with a wall of text.

I've done a few in this post to show you what I mean, I like visual cues a lot obviously.
 

PlayfulPhoenix

Founder of SFF.N
Original poster
SFFLAB
Chimera Industries
Moderator
Gold Supporter
Feb 22, 2015
1,049
1,960
I'm looking forward to the other parts, I think I can feel what's coming in the next part and I've felt this for a while too.

I would like to suggest using some pointers or color to differentiate the content in each post, because when I glance away and I've forgotten where I was, I'm confronted with a wall of text.

I've done a few in this post to show you what I mean, I like visual cues a lot obviously.

I'll play around with formatting for the upcoming section, and see if I can't organize the previous sections a bit ;)

Also, I've been abroad the past week but the next section is forthcoming, and should be up soon!