One of my favorite websites is OutdoorGearLab.com, founded by former big wall Yosemite climber Chris McNamara (who also founded SuperTopo.com). The best way to describe it is the *Pitchfork* of outdoor gear: a little site that has become a big site that can make or break your product. The coveted “Editors’ Choice” is the equivalent of Pitchfork’s “Best New Music” badge. I’m sure there is a bump to sales.

But in my opinion, OutdoorGearLab is a more reliable review site than Pitchfork because its reviews are far less subjective. OutdoorGearLab is a breath of fresh air in the world of gear reviews because (1) manufacturers do not pay for or provide the gear for review, (2) a team of reviewers assigns ratings (rather than an individual), and (3) each score is a sum of sub-scores, making the scores transparent.

## Brand Analysis

OutdoorGearLab has more than 194 reviews, covering 2,198 products, on its website. Each review provides a great summary of the products, including a scatterplot of score versus price. However, there is no public analysis of how brands perform across reviews.

So here is a November 2018 snapshot of brand performance compared to brand value.

Briefly, the Y-axis shows review scores and the X-axis shows value (i.e., points per dollar; a lot more explanation below). Z-scores are related to a normal distribution, where higher is better and a typical range is from -3 to 3. A Z-score can be interpreted as the **number of standard deviations above the mean**. The size of the dots is the number of reviews.
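For readers who want the interpretation in concrete terms, here is a minimal sketch of how a Z-score is computed. The scores below are hypothetical, not OutdoorGearLab data:

```python
import statistics

def z_scores(values):
    """Convert raw scores to Z-scores: (x - mean) / standard deviation."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(x - mean) / sd for x in values]

# Hypothetical review scores for one gear category.
scores = [62, 70, 75, 80, 88]
for raw, z in zip(scores, z_scores(scores)):
    print(f"score {raw}: {z:+.2f} standard deviations from the mean")
```

A product scoring exactly at the mean gets a Z-score of 0; a product one standard deviation above the mean gets +1, which is why most products land between -3 and 3.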

## The Four Quadrants of the Outdoor Industry

This figure gives us four quadrants in the outdoor industry:

- **Neither Performance nor Value.** Brands such as MSR and Five Ten haven’t delivered consistently above-average products or consistently high-value products.
- **Capable Budget Brands.** These brands are typically below-average performers, but they consistently provide good value. REI is the face of this group. Metolius and Mad Rock are undoubtedly the value brands of climbing. The North Face is surprisingly in this group.
- **Premium Brands at Premium Prices.** These are the “technical” brands, which have consistently received above-average ratings but do not provide above-average value. Arc’teryx is the face of this group; they frequently offer top-of-the-line products at insane prices.
- **Value and Performance.** These are the brands gearheads should love: above-average performance at above-average value. Outdoor Research and Black Diamond are most representative of this group.

## Four Quadrants in Outdoor Sub-Industries

It’s interesting to see that some brands position themselves differently in certain sub-industries. For example, REI lands in Value and Performance in the camping and hiking sub-industry.

Similarly, The North Face lands in Value and Performance in the shoe sub-industry, whereas Arc’teryx is barely a premium brand there. This makes sense, since Arc’teryx is a new entrant in shoes.

Patagonia (although it has only a few products reviewed) lands in Value and Performance in the climbing sub-industry, which is consistent with my experience: you don’t have to pay a premium for their climbing packs, and they perform well.

In snow sports, Salomon is positioned as a budget brand, even though their shoes are premium and their clothing is neither performance nor value.

The quadrants for the outdoor clothing sub-industry are spot on, from my experience.

And lastly, we have bikes, which I don’t know much about. Let me know if this jibes with your feel for the industry!

There is an interactive dashboard below where you can look at specific gear categories. But before setting you loose, a bit more explanation.

## Methods

First, how were the data collected? I scraped OutdoorGearLab.com in November 2018, but only downloaded data from reviews where a 1–100 score was given.

As mentioned, Z-scores are related to a normal distribution. When data follow a skewed distribution, the data must first be transformed before a Z-score is calculated.

Review scores were skewed left, so the best transformation was squaring the scores. Because scores can vary widely across reviews, I calculated the Z-score within each review. So the average **review Z-score** is the average number of **standard deviations above the mean squared score** for a review. For example, Petzl products are *on average* 0.4 standard deviations better than the other products in their reviews.

Value was calculated as points per dollar. Again, value can differ a lot across reviews (e.g., bikes vs. bike helmets), so I calculated the Z-scores within each review. Value was skewed right, so a log transform was appropriate. Thus the average **value Z-score** is the average number of **standard deviations above the mean log value** for a review. For example, Mad Rock products have *on average* 0.7 standard deviations better value than the average product in their reviews.
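Both within-review calculations can be sketched in a few lines of pandas. This is a minimal illustration of the method described above; the review names, scores, and prices are made up, not scraped data:

```python
import numpy as np
import pandas as pd

# Hypothetical data: one row per product, grouped by the review it appeared in.
df = pd.DataFrame({
    "review": ["packs", "packs", "packs", "tents", "tents", "tents"],
    "score":  [70, 80, 90, 60, 75, 95],            # 1-100 review score
    "price":  [100.0, 150.0, 300.0, 200.0, 250.0, 600.0],
})

def z(series):
    """Z-score: standard deviations above the mean."""
    return (series - series.mean()) / series.std()

# Review Z-score: Z of the *squared* score, computed within each review.
df["review_z"] = df.groupby("review")["score"].transform(lambda s: z(s ** 2))

# Value Z-score: Z of the *log* of points per dollar, within each review.
df["value"] = df["score"] / df["price"]
df["value_z"] = df.groupby("review")["value"].transform(lambda s: z(np.log(s)))
```

A brand’s position on the chart would then come from averaging its products’ `review_z` and `value_z` across all the reviews it appears in.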

## Interactive Dashboard

Here is the interactive dashboard created in Tableau. Things you can do in the dashboard:

- Select one or multiple categories using the checkboxes.
- Set a different minimum number of reviews for a brand. I recommend setting this to at least 3. (There are many brands with only one review on OutdoorGearLab.)
- Click on a brand to see it highlighted in the other charts.
- Mouse over the histogram bars to see individual product ratings, price, and awards.