Ruby

LessWrong Team

 

I have signed no contracts or agreements whose existence I cannot mention.

Sequences

LW Team Updates & Announcements
Novum Organum

Comments

Ruby · 20

"Serendipity" is a term I've been seen used for this, possibly was Venkatesh Rao.

Ruby · 42

Curated. The wiki pages collected here, despite being written in 2015–2017, remain excellent resources on concepts and arguments for key AI alignment ideas (both those still widely used and those lesser known). I found that even for concepts/arguments like the orthogonality thesis and corrigibility, I gained crispness from reading these pages. Concepts such as epistemic and instrumental efficiency I didn't have at all, yet they feel useful in thinking about the rise of increasingly powerful AI.

Of course, there's also non-AI content that got imported. The Bayes guide likely remains the best resource for building Bayes intuition, as does the extremely thorough guide on logarithms.

Ruby · 42

I think the guide should be 10x more prominent in this post.

Ruby · 20

You should see the option when you click on the triple dot menu (next to the Like button).

Ruby · 20

So the nice thing about karma is that if someone thinks a wikitag is worthy of attention for any reason (article, tagged posts, importance of concept), they're able to upvote it and make it appear higher.

Much of the current karma comes from a pass that Ben Pace and I did. Rationality Quotes didn't strike me as a page I particularly wanted to boost up the list, but if you disagree with me you're able to Like it.

In general, I don't think having a lot of tagged posts should mean a wikitag is ranked highly. It's a consideration, but I prefer that it flow via people's judgments about whether or not to upvote it.
The categorization is an interesting question. Indeed, currently only admins can do it, and that perhaps requires more thought.

Ruby · 30

Interesting. Doesn't replicate for me. What phone are you using?

Answer by Ruby · 197

It's a compass rose, thematic with the Map and Territory metaphor for rationality/truthseeking.

The real question is why does NATO have our logo. 

Ruby · 179

Curated! I like this post for the object-level interestingness of the cited papers, but also for pulling in some interesting models from elsewhere and generally reminding us that this is something we can do.

In times of yore, LessWrong venerated the neglected virtue of scholarship. And well, sometimes it feels like it's still neglected. It's tough because many domains do have a lot of low-quality work, especially outside the hard sciences, but I'd wager there's a fair amount worth reading, and I appreciate Buck pointing at a domain where that seems to be the case.
