Last week, I published a blog post about our experience with a recent WIRED review. In that article, I also shared a link to a survey because I wanted to hear from the community. What happened next completely blew me away.
The response was massive. We had over 500 survey responses in just a few days! We also received 100+ comments on Hacker News and were trending for hours. Overall, we had hundreds of comments across our blog, forum, and survey.
But besides the quantity, what really impressed me was the quality and thoughtfulness of the responses. A real discussion about something that affects all of us emerged.
The Hacker News discussion was fascinating because it was divided into two clear camps. People who already knew us - who understood our open-source approach, our scientific validation work, and our commitment to repairability - immediately got what I was trying to say. This was not about one bad review but rather about systematic problems in how tech journalism works.
Then there were others who seemed to have just read the title or skimmed the comments. For them, it was simple:
broken display = not recommended.
Fair enough. Some of them thought we were just whining about justified criticism.
This split was actually pretty revealing. For the people who understand what we’re trying to build and why, the methodology issues were obvious. On the other hand, for those who were just looking at the surface, it looked like a company complaining about a bad review.
The survey that was part of our blog post and my newsletter really revealed that difference as well. Nearly all of the people who answered the survey actually own an AirGradient monitor and really understand what we stand for. So the 300 comments in the survey were 99% supportive and focused on the real problem: Is tech journalism losing its way? When a major publication with millions of readers presents personal preferences as “The Best” guides, something’s broken.
One Hacker News commenter nailed it: “You’ve found the core takeaway about nearly all ‘product reviews’ in nearly all publications. They are almost all simply ’the personal preferences of the author’… But they are never based upon any objective criteria, and are never (nor ever were intended to be) reproducible in any scientific fashion.”
That’s exactly the problem. When someone’s trying to manage asthma or protect their family from air pollution, they need reliable information based on real criteria, not whether a reviewer likes how something looks. For something like a smartphone review, this subjective approach makes sense. Whether you pick an Android or iOS phone, it really comes down to user preference. However, for devices like air quality monitors - which can influence health decisions, especially for those with asthma or other respiratory conditions - objectivity on aspects like accuracy is far more important.
To all tech journalists out there working for major publications: Take this issue seriously. This is actually a great opportunity to differentiate yourself with in-depth and sound product reviews. Now that AnandTech is no more, there is a gap you can actually fill!
I read hundreds of comments across all platforms. Here’s roughly what I found:
About 70% were supportive of us, about 20% were critical, and about 10% were mixed.
And running through all of this: lots of people saying they don’t trust many mainstream tech reviews anymore. They’re turning to community sources, scientific evaluations, and long-term user experiences instead of traditional tech publications. I’m working on a more thorough write-up on this. Stay tuned.
The testimonials from actual users were powerful. One wrote: “My airgradient monitor has been online for years and sending data to Prometheus reliably. I’ve been able to plot the air quality across a few climate events and the introduction of a Samsung air filter in my bedroom. It’s a good little product.”
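For readers who want to replicate that kind of setup, here is a minimal, hypothetical sketch of how readings could be pulled into Prometheus. The local endpoint URL, JSON field names, and port below are assumptions for illustration only - check what your firmware actually exposes before relying on anything like this:

```python
# Minimal sketch: poll a monitor's local JSON endpoint and expose the readings
# as Prometheus gauges for scraping. The endpoint URL and JSON field names
# ("pm02", "rco2") are assumptions for illustration; adjust them to whatever
# your firmware actually serves.
import time

import requests
from prometheus_client import Gauge, start_http_server

MONITOR_URL = "http://airgradient.local/measures/current"  # assumed local endpoint

pm25_gauge = Gauge("airgradient_pm25_ugm3", "PM2.5 concentration in ug/m3")
co2_gauge = Gauge("airgradient_co2_ppm", "CO2 concentration in ppm")

def scrape_once() -> None:
    data = requests.get(MONITOR_URL, timeout=5).json()
    pm25_gauge.set(data["pm02"])
    co2_gauge.set(data["rco2"])

if __name__ == "__main__":
    start_http_server(9184)  # port for Prometheus to scrape; pick any free port
    while True:
        scrape_once()
        time.sleep(30)
```

From there, a standard Prometheus scrape job pointed at that port, plus a Grafana dashboard, gives exactly the kind of long-term record the commenter describes.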
Another customer wrote: “I own several AirGradient monitors and have used other brands in the past. As far as I am concerned AirGradient is clearly superior, not only for ease of use, repairability and their open source approach, but also because of their tremendous enthusiasm for getting accurate data and being totally transparent about the strengths and weaknesses of the technology.”
Someone explained our LED system perfectly: “The unit also has a series of LEDs across the top and I can read the actual status from 20’ away… One green led? Good. Two green leds? Meh. Three LEDs? They’re red now and that’s not great… The reviewer was overly severe and did his readers a disservice.”
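That LED behaviour is, at heart, a very simple mapping from a reading to a count and colour of lit LEDs. Here is a hypothetical, simplified sketch of the idea the commenter describes - the thresholds and function name are made up for illustration and are not the actual firmware logic:

```python
# Hypothetical simplification of the LED bar the commenter describes:
# map a PM2.5 reading to a count and colour of lit LEDs. Thresholds and
# the function name are illustrative assumptions, not firmware values.
def led_state(pm25_ugm3: float) -> tuple[int, str]:
    """Return (number_of_lit_leds, colour) for a PM2.5 reading."""
    if pm25_ugm3 <= 12:        # roughly "good" air
        return 1, "green"
    if pm25_ugm3 <= 35:        # roughly "meh"
        return 2, "green"
    return 3, "red"            # "not great" - time to act

print(led_state(8.0))   # (1, 'green')
print(led_state(50.0))  # (3, 'red')
```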
This is what matters. These people understand that air quality monitoring is about long-term data, accuracy, and reliability. Not whether something looks pretty in photos.
Several people explicitly backed our decision to respond publicly. One wrote: “This sounds like something Louis Rossmann should cover as a counter-example of manufacturers trying to do the right thing but fickle, corporate reviewers behaving in a petty, unfair manner.”
Another: “I hadn’t heard of you folks before, but I’m interested in your product - open source and repairability are high on my list for home monitors.”
Trust in tech media is declining, and money seems to be a big part of why. Multiple people brought up concerns about financial influence on reviews. One commenter noted something interesting: “And then there is the fact, that the reviewers favoured product has a logo on the product page for the reviewers publication. There is certainly potential for financial interests to impact reviews.”
Another was more direct: “I pretty much assume that all product review sites are at worst crooked and at best biased.” Someone else added: “There is a vanishingly small collection of youtubers that I might still trust when it comes to product reviews, and that list is shrinking.”
Maybe most telling was this observation: “I wouldn’t worry too much tbh if I was Airgradient. I don’t think anyone trusts Wired for serious tech reviews and the target audience would veer towards plug and play crowd anyway.”
A former writer explained the economic reality behind the scenes: “So if the reviewer is staff, they might be assigned three or four reviews in a given week on top of other work. If they’re freelance, they might have to take on more just to make their rent.”
That’s the system we’re dealing with. No time for proper testing, no budget for scientific rigor, and economic pressure to maintain relationships with advertisers and PR-friendly companies. Personal impressions get dressed up as authoritative guides because that’s what the economics allow.
When readers start assuming that financial relationships drive recommendations more than product quality, something fundamental is broken in the review ecosystem.
Here’s what really validated our approach: being completely transparent about this situation resonated with people. We could have quietly accepted the “Not Recommended” rating, hired a ‘reputation agency’ to make it go away, or even engaged lawyers. Instead, we opened up about what happened and why we thought it mattered.
As one person put it: “Thanks for a great product and for running a company with integrity.”
That’s what transparency gets you. Not everyone will agree with your approach, but people respect honesty.
This whole experience reinforced a few things for us. Building real relationships with people who understand what you’re trying to do matters way more than getting good press coverage. Technical users appreciate technical honesty - they get the complexities of hardware development and scientific measurement. Being transparent, even when it reveals problems, builds more trust than trying to spin everything positive.
Most importantly: The best response to criticism is to keep improving while sticking to your principles. We’re moving forward with product improvements based on community feedback, continued scientific validation, and our commitment to open-source, repairable design.
To everyone who engaged with this - supportive, critical, or somewhere between - thank you. You helped us understand not just how you see this situation, but how you research products, what you value in reviews, and what you expect from companies you support.
To our existing customers: your loyalty means everything. You supported us based on actual experience with our products, not magazine ratings.
To new people joining our community: welcome. You’re joining a group that values accuracy, repairability, and transparency.
To the critics: your feedback makes us better. Keep it coming.
We’ve already contacted the two winners of our “Not Recommended” monitor giveaway from among the survey participants - congratulations!
We’re working on analyzing the full survey results, which should provide some fascinating insights into how people actually research products in 2025. We’re also moving ahead with product improvements and continued scientific validation.
Most importantly, we’re doubling down on what got us here: building accurate, repairable, open-source monitors while being transparent about both our successes and failures.
Stay tuned for the follow-up post with in-depth survey results!