IndieWebbing from the couch

I'm not going to pretend for a moment that building a personal website is easier than posting to a platform that someone else has already built. I know that updating content for a static site generator in a git repository from the command line introduces a whole lot of friction (and jargon) to the blogging/publishing process, when I could easily post elsewhere with a mobile app from the comfort of my couch or while commuting on a tram. But there are enough development and automation tools available for iOS that I can use my iPad or phone as a portable development lab. Here's how I write and publish away from a computer. ...

 · 369 words · Claudine Chionh

Locking out AI bots with the Dark Visitors API

The Robots Exclusion Protocol was initially developed in the early years of the web to prevent small websites from being overwhelmed by search engine crawlers, and its use has since expanded to exclude robots for a variety of reasons, most recently to keep artificial intelligence bots from feeding web content to their large language models. Various hand-picked lists of known AI bots have been circulating among web developers, and there's also Dark Visitors, which maintains a comprehensive list of known bots, categorised by type. You can use this resource to generate your own robots.txt, either by copying the example file or by using the free Dark Visitors API to customise the types you want to exclude. ...

 · 419 words · Claudine Chionh
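
A robots.txt generated this way is just a list of per-agent exclusion rules. As a minimal sketch, the two entries below block GPTBot (OpenAI's crawler) and CCBot (Common Crawl), two widely cited AI user agents; the actual file Dark Visitors produces will list many more agents, depending on the categories you select:

```
# Block two well-known AI crawlers from the whole site.
# A generated robots.txt would contain many more entries like these.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```

Note that the protocol is purely advisory: well-behaved crawlers honour these rules, but nothing technically enforces them.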