I follow this religiously. The process of posting is manual but it works fairly well if your intention is good and you're not blog spamming in different forums.
But I intentionally haven't added a comment section to my blog [1]. Mostly because I don't get paid to write there and addressing the comments - even the good ones - requires a ton of energy.
Also, scaling the comment section is a pain. I had Disqus integrated into my Hugo site, but it became a mess once people started having actual discussions and the section grew longer and longer.
If the write-ups are any use, they generally appear here or on Reddit, and I often link back to those discussions in the articles. That's good enough for me.
I follow this approach. It's mostly because I want to own the land I build on.
It works well, but it's hard to automate. In the end you must manually cross-post, and both the post and the discussion will vary by community. You end up being active in multiple different communities and still getting little traffic from the effort.
It's not such a great way to drive traffic. On the other hand, it's a wonderful way to work in public.
That's because social media sites have purposefully made it hard (or relatively expensive) to post on their platforms with automated tools - they specifically don't want you to POSSE.
Facebook also deprioritises posts with links in them, to disincentivize people from using the platform to promote their own primary source; that's why there's the "link in comments" crap.
They want you to engage with their user base and not just dump links. They also prefer that users stay on the platform, though recent research suggests the deprioritization is partly a myth: people simply engage less with external links.
I don't agree with your first point at all: posting on other platforms is trivial, and there must be hundreds of post-scheduling options you can hook into an RSS feed. Here's a completely open-source one that takes a lot of time configuring API keys, but then it just works: https://postiz.com/
What makes it difficult is all of the quirks you have to account for. For the most trivial example, Twitter has a character limit of 280 characters, but it's 300 on Bluesky, 500 on Threads, and on Mastodon it is whatever your instance wants it to be.
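To illustrate the per-platform quirks, here is a minimal sketch of trimming a syndication post to each platform's limit. The limits are the ones quoted above (Mastodon's 500 is only the common default), and this naively counts the URL at face value, whereas real platforms like Twitter count every link as a fixed-length shortened URL:

```python
# Per-platform character limits from the comment above; Mastodon's is
# per-instance, 500 is just the stock default.
LIMITS = {"twitter": 280, "bluesky": 300, "threads": 500, "mastodon": 500}

def syndication_text(title: str, url: str, platform: str) -> str:
    """Build a short announcement post, truncating the title with an
    ellipsis if title + link would exceed the platform's limit."""
    limit = LIMITS[platform]
    suffix = " " + url
    room = limit - len(suffix)
    if len(title) > room:
        title = title[: room - 1].rstrip() + "…"
    return title + suffix

post = syndication_text("My very long blog post title " * 20,
                        "https://example.com/post", "twitter")
```

In a real cross-poster each platform would also need its own formatting rules (threads, alt text, link cards), which is exactly the maintenance burden being described.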
For another quirk: I have a side project in which I publish snippets of DJs playing copyrighted music, and while I can post those videos to TikTok/Instagram/YouTube without worrying about copyright, I'm 99% confident my website would be instantly delisted from all the search engines if I used the POSSE strategy for that use case.
I agree with your second point that getting anything useful out of it (as in traffic to the source) is pretty much impossible. On Instagram you can only do that via stories, but you can't automate it, because you need Instagram's story editor to add a link to the story. On TikTok you can't even put a link in your bio until you reach 1,000 followers. On Twitter you might as well not bother, as the medium itself prefers completely unsourced claims. As for Facebook, I honestly don't even know why anyone would bother with it these days; it's completely irrelevant.
If I weren't more critical, I would have read this as an astroturf-by-big-tech sort of thing. Like "it's inevitable that big tech will win, so syndicate everywhere or you have lost the game." I don't really get what game we're playing, though. Why do I care if my friend who only uses Facebook sees my blog posts? What do I get from that other than the feeling of a maybe-connection (much like their criticism of federated networks: you're hoping for a future where this works out for you)?
I don't post on federated networks yet, but I would rather share in my principles with those willing to listen than throw up my hands and share my stuff everywhere.
As someone on the receiving end of POSSE, who is often on the multiple platforms people post to, this approach ends up feeling impersonal and spammy. I totally get the reasoning people have for doing it. But, to me, it's very "ship it" focused, rather than conversation focused. Maybe I'm just getting old.
atproto feels like a move in the right direction for personal publishing that makes content discovery easier without the need to post to multiple channels/platforms. https://standard.site/ is one initiative working towards making this a reality.
ATProto is a great idea that will never go anywhere because of its close association with Bluesky the service, and Bluesky the company, and that's a shame.
Impersonal in the sense the article etc isn't being presented for the specific audience. It's just being dumped everywhere with the same contextual text ("Wrote this piece about...."). So, I'm seeing it everywhere in the exact same way. Which feels way spammy (and which I've admittedly had to do myself, as per the times). But, I'm used to feeling like the person I follow is posting stuff to the community in language specific to their readers. I say "used to", but I'm probably thinking back decades now. Back when your audience / reader base was the metric of personal. Not the platform.
However, I skip permashortlinks - I try to keep my regular links relevant and short. Also, I like seeing full links, they can often indicate what content awaits there - vs short links, which are more opaque.
That's one more benefit of this workflow: it can be adjusted to fit one's personal preferences. I suppose others might prefer short links, or maybe at some point I'll change my mind; with POSSE these kinds of changes are easy.
Yeah, I’m happy to just call permashortlinks a bad idea, seldom warranted historically and roughly never now. The article offers no explanation about why to use permashortlinks—what looks to be “a few reasons why” is actually a few reasons why to link to the original (rather than copying and pasting the contents), nothing to do with the permashortlink practice.
https://indieweb.org/permashortlink does give a few reasons, but they’re bunk. “More reliable in email”? Not meaningfully so. “Quicker to recall / copy due to size”? Not typically a concern; maybe a nice-to-have, but you can adjust your regular URL style and do even better. “Less effort to manually enter”? A repeat of the previous point.
And it doesn’t address the problems of the permashortlink. Cost. Diluting across different domains. Having something different to maintain and remember.
Don’t do separate permashortlinks. Just fix your regular links to not be bad.
I built a syndication service for my site, though it only supports Mastodon. Each supported type has a toggle to syndicate out, and on success a timestamp is displayed to note when it was syndicated, after which the toggle is also displayed.
There's an RSS and JSON feed for each collection and a combined feed as well.
I like it when I read something and it has links to the "main" discussion on HN/reddit/etc. Most blogs don't have a very active comment field, and if I'm reading a few days late, it's nice to still be able to find others' thoughts on the matter.
> links to the "main" discussion on HN/reddit/etc.
I don't mean to pick on your comment specifically, but it's saddening to see how, after these years of the "appification" of the internet and corporations successfully conditioning us to think in terms of their walled gardens, we lost the web.
There shouldn't be a "main" discussion. Our browsers should be able to find these links and present the information in a way that it makes sense to consumer, not the publisher. This gets deeply frustrating for me now that I am working more on ActivityPub and Linked Data. Most of the AP projects are so focused on emulating the closed gardens, they don't even think about building their systems with linking as the primary discovery method.
I love it when this shows up from time to time. Everyone should own their own content! And the indieweb community and its underlying philosophy are worth celebrating.
If you haven't, you should try to get to a Homebrew Website Club. Go talk to people about making your own, weird spot on the web that truly represents you. It'll make you feel great about technology again, I promise.
However, I am not sure about "perma-shortlinks" for discovery on other sites as the means of networking and discovering content. It seems clunky to maintain, as it requires a human or some automation to curate/maintain the links. If a blog removes a link to another blog, then that pathway is closed.
It would be cool if we could solve that with a "DNS for tags/topics": a Domain Content Server (DCS), e.g.
1. tomaytotomato.com ==> publishes to the DCS of topics (tech, java, travel)
2. DCS adds domain to those topics
3. Rating or evaluating of the content on website based on those tags (not sure the mechanics here, but it could be exploited or gamed)
You could have several DCS for topics servers run by organisations or individuals.
e.g. the Lobsters DNS-for-topics server would be really fussy about #tech or #computerscience blog posts, and would self-select for more highbrow stuff
Meanwhile, a more casual tech group would score YouTube content or Tom's Hardware articles higher.
RSS is a refreshingly simple way (and thus, trustworthy) of taking back control over what we see in a world of algorithmic "curation" (i.e. mixing in ads and manipulation, and taking away things that would interest us).
I like the principle, but I also find that we software folk commonly mistake the creation of a website for the goal, rather than the production of "content" (e.g. blog posts). I spent years trying to publish a blog and continually getting derailed building the ultimate static website. Recently I switched to a Substack hosted on my own subdomain, and now I'm finally writing. At least I still own the subdomain.
Hah, reminds me of trying to make a blog as a teenager, 20+ years ago. I built my own CMS in PHP with various features, but never got further than a few lines of text in the draft state. Most of the time was actually spent on having rounded corners (border-radius didn't exist yet) with some kind of glass effect for a cool look (inspired by the then-unreleased Windows Longhorn). And I named my tool the generic name Publish-it, because "publi-shit" was funny.
I follow the opposite with PESOS: Publish Elsewhere, Syndicate (to your) Own Site. Works really nicely, as I've got some automation and systems in place that let me maintain a full firehose of all my posts and notable actions across the web on my own site. Then I can sort through them and reference them (which I do frequently) with ease. I do recommend it.
Nice that we have a name now for something that's pretty much standard and common practice. Not that we necessarily needed a name, but it's still nice to have one.
> something that's pretty much standard and common practice
Is it? How many people publish to their sites small texts that they then syndicate to Twitter/Bluesky/whatever? How many people publish videos to their sites and then syndicate to Youtube?
The idea is not that you necessarily write a Twitter-length post on your website - you can write a full blog post, but then post links back to that post on social media.
I don't post stuff on a blog, but I do have replies to common arguments written down in Obsidian. I can just copy stuff from there, edit a bit and post.
I can immediately see some problems to do with content formats. For example, Facebook lets you have a 'montage' of photos, but Instagram only shows one. The music available is heavily restricted on different platforms. Video length limits are different. Etc.
Does any software let you make a main post on your own site, but then render differently to the silos?
> Q: Do we need to worry about search engines penalizing apparently duplicate posts?
> A: That's why the POSSE copies SHOULD always link back to the originals. So that search engines can infer that the copies are just copies. Ideally POSSE copies on silos should use rel-canonical to link back to the originals, but even without explicit rel-canonical, the explicit link back to the original is a strong hint that it is an original.
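The advice in that FAQ, minus the silo-specific rel-canonical part, amounts to appending an explicit link back to the original to every copy. A trivial sketch of that convention (the footer wording is my own, not from the FAQ):

```python
def posse_copy(body: str, canonical_url: str) -> str:
    """Return the silo copy of a post with an explicit link back to the
    original, so search engines (and readers) can tell the copy is a copy."""
    return f"{body}\n\nOriginally published at {canonical_url}"

copy = posse_copy("Some thoughts on syndication.",
                  "https://example.com/posts/syndication")
```

On platforms that allow arbitrary HTML you would additionally emit `<link rel="canonical" href="...">`, but most silos only let you include the plain link.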
[1]: https://rednafi.com
Publish on your own site, syndicate elsewhere - https://news.ycombinator.com/item?id=46468600 - Jan 2026 (248 comments)
POSSE: Publish on your Own Site, Syndicate Elsewhere - https://news.ycombinator.com/item?id=35636052 - April 2023 (70 comments)
POSSE: Publish on your own site, syndicate elsewhere - https://news.ycombinator.com/item?id=29115696 - Nov 2021 (43 comments)
Publish on Your Own Site, Syndicate Elsewhere - https://news.ycombinator.com/item?id=16663850 - March 2018 (26 comments)
To me, it feels like Star Wars' rebellion, in a struggle against the big tech (big data, big relationship, big dopamine) empire.
POSSE? This is the way.
1. I like your blog and subscribe to its RSS
2. I see new posts in my RSS reader with syndication links to (HN/reddit/twitter/etc).
3. I can go to those places to talk about it.
Low tech version is just linking to those discussions at the bottom of your post I guess.
This is just spitballing.
The whole point of syndication is that it's curated by humans (you, if it's your own feed).
A social media feed is one feed for everyone, curated by one algorithm hosted on Facebook/Twitter/Instagram.
What I was thinking is:
- foo.social
- bar.social (tech curations)
- java.bar.social (sub curated Java list)
All these DCS (domain content servers) would be polled by your own local client
Your client can then aggregate or organise how it shows this feed
e.g. I could have a trending aggregator for situations where a blog post is shown on multiple domains (sort of shows virality)
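Since no DCS protocol actually exists, here is a purely hypothetical sketch of that trending aggregator: each "server" is stood in for by a plain mapping of topic to listed domains (foo.social / bar.social are the made-up names from the comment above), and a domain listed by more servers ranks higher:

```python
from collections import Counter

def trending(servers, topic):
    """Rank domains for a topic by how many DCS servers list them,
    so a domain appearing on multiple servers floats to the top."""
    counts = Counter()
    for listing in servers:
        # A server lists each domain at most once per topic.
        counts.update(set(listing.get(topic, [])))
    return counts.most_common()

# Hypothetical polled responses; there is no real DCS protocol today.
servers = [
    {"tech": ["tomaytotomato.com", "example.org"]},            # foo.social
    {"tech": ["tomaytotomato.com"], "java": ["example.org"]},  # bar.social
]
```

A real client would poll these over HTTP on some schedule and could weight servers differently (e.g. trust the fussy Lobsters-style one more).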
The idea is about 10 years old. At least that's when I first heard about it, with relation to RSS. It may go back earlier.
Edit: confirmed by the "See Also" section at the end of TFA.