Posted by Liriel 4 days ago
You instantly got like 40k likes - but there was a catch.
The algorithm saw you getting a lot of likes from Iran/Pakistan, so it kept recommending the post in those countries, got no response, and stopped recommending the post altogether.
In a sense, it became a self-regulating system where fake impressions extinguish their very reason to be bought.
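For the curious, here's a toy version of that loop. Every number and threshold is invented; this is obviously not the real recommender, just the shape of the feedback:

```python
# Toy model of the loop above: bought likes make one region look hot,
# the recommender pushes the post there, organic engagement is ~zero,
# so the post gets dropped. All numbers are invented.
def simulate(threshold: float = 0.01) -> str:
    likes_by_region = {"IR/PK": 40_000, "US": 120}  # bought likes seed IR/PK
    organic_rate = {"IR/PK": 0.0, "US": 0.03}       # buyers don't engage again
    hot = max(likes_by_region, key=likes_by_region.get)  # chase the hot region
    impressions = 10_000
    new_likes = int(impressions * organic_rate[hot])
    if new_likes / impressions < threshold:
        return f"pushed to {hot}, got {new_likes} likes -> stop recommending"
    return f"keep recommending in {hot}"

print(simulate())  # pushed to IR/PK, got 0 likes -> stop recommending
```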
Why am I not surprised that big capital corrupts everything? Also, Goodhart's law applies again: "When a measure becomes a target, it ceases to be a good measure".
HN folks: what reliable, diverse signals do you use to quickly evaluate a repo's quality? For me it's maintenance status, age, elegance of the API, and maybe commit history.
PS: From the article:
> instead tracks unique monthly contributor activity - anyone who created an issue, comment, PR, or commit. Fewer than 5% of top 10,000 projects ever exceeded 250 monthly contributors; only 2% sustained it across six months.
> [...] recommends five metrics that correlate with real adoption: package downloads, issue quality (production edge cases from real users), contributor retention (time to second PR), community discussion depth, and usage telemetry.
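If anyone wants to play with the article's "unique monthly contributors" idea, here's a rough sketch against the public GitHub REST API. It only counts commit authors (issues, comments, and PRs are omitted for brevity), uses 30-day windows, and skips pagination, so treat it as a starting point rather than the article's actual methodology:

```python
# Rough sketch: unique commit authors per ~month for a repo, as a
# cheap stand-in for the article's "unique monthly contributors".
# Unauthenticated requests are rate-limited; add a token for real use.
from datetime import datetime, timedelta, timezone

import requests


def monthly_contributors(owner: str, repo: str, months: int = 6) -> dict:
    counts = {}
    now = datetime.now(timezone.utc)
    for i in range(months):
        until = now - timedelta(days=30 * i)
        since = until - timedelta(days=30)
        resp = requests.get(
            f"https://api.github.com/repos/{owner}/{repo}/commits",
            params={"since": since.isoformat(), "until": until.isoformat(),
                    "per_page": 100},  # pagination omitted for brevity
            headers={"Accept": "application/vnd.github+json"},
            timeout=10,
        )
        resp.raise_for_status()
        authors = {c["commit"]["author"]["email"] for c in resp.json()}
        counts[since.strftime("%Y-%m")] = len(authors)
    return counts


print(monthly_contributors("karakeep-app", "karakeep"))
```

Contributor retention (time to second PR) would be a similar loop over the pulls endpoint, grouping by author.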
Finding the odd curse word buried in the commit history is, for me, a good indication of a human working on a passion project, though YMMV.
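Half joking, but something like this would score it. The word list and the signal itself are entirely made up, and it needs a local clone:

```python
# Toy "passion project" heuristic: count curse words across the commit
# messages of a local clone. Word list is illustrative, not serious.
import subprocess

SWEARS = {"damn", "crap", "wtf"}  # illustrative only


def sweary_commits(repo_path: str) -> int:
    log = subprocess.run(
        ["git", "-C", repo_path, "log", "--format=%s %b"],
        capture_output=True, text=True, check=True,
    ).stdout.lower()
    return sum(log.count(word) for word in SWEARS)


print(sweary_commits("."))  # run inside any git checkout
```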
And there are always exceptions to the exception of the exceptions.
In my opinion, nothing could be more wrong. GitHub's own ratings are easily manipulated and measure not the quality of the project but its popularity. The problem is that popularity is rarely proportional to quality.
I'm building a product and I'm seeing that what matters is distribution and communication rather than the development itself.
Unfortunately, a project's popularity is often directly proportional to the communication "built" around it and inversely proportional to its actual quality. This isn't always the case, but it often is.
Moreover, adopting effective and objective project evaluation tools is quite expensive for VCs.
I'm not endorsing this view, but it is what it is, unfortunately.
VCs that invest based on stars either know something I don't, I guess, or they're just bad investors.
IMO, choosing projects based on star count is terrible engineering practice.
Surely a project's popularity is often related to its utility. A useful and popular project seems like exactly the kind of thing a VC might be interested in.
Hype helps raise funds and, of course, it sells.
But it doesn't necessarily lead to long-term sustainability of investments.
It’s more expensive to compute, but the resulting scores would be more trustworthy unless I’m missing something.
https://github.com/karakeep-app/karakeep
Sounds useful.
I’ll star it and check it out later ;)
Ran it on Union Labs and a few other repos from the Awesome Agents investigation. Results are interesting.
Unfortunately, I still look at them too, out of habit: a repo's star count _was_ a useful first filter in the past, and we must keep in mind that it no longer is.
> Good reminder that everything gets gamed given the incentives.
Also known as Goodhart's law [1]: "When a measure becomes a target, it ceases to be a good measure".
Essentially, VCs screwed this one up for the rest of us, I think?
I'd suggest the first question to ask is "is this an AI project or not?" If it is, don't pay attention to the stars; if it's not, use the stars as a first filter. That's how I analyse projects on GitHub now.
I agree that it has been a first filter, but should it ever have been? A star only says that someone had a passing interest in a project. Not significantly different from a 'like' on a social media post.
As a side note, it's kind of disheartening that every time there's a metric tied to popularity, some among us will try to game it for profit, essentially manipulating our natural biases.
As a side note, it's always a bit sad how the parasocial nature of the modern web makes us behave like machines, interfacing via simple widgets and becoming mechanical robots ourselves, rationalising I/O via simple metrics and forgetting that the map is never the territory.
Especially if those avatars are cute anime girls.
I know you are half joking/not joking, but this is definitely a golden signal.
It’s supposed to get people to actually try your product. If they like it, they star it. Simple.
At that point, forcing the action just inflates numbers and strips them of any meaning.
Gaming stars to showcase them as a positive signal for the product is just SHIT.