Always Simplifying Part II
2020-03-11
Periodically I intend to revisit additional steps I have taken to further uncomplicate my daily life. In January I touched on my quest to simplify every part of my life. At the time I mentioned consuming RSS feeds to follow websites and stay informed, and I also mentioned the move to this static website. While I made the move to a static website, I had also been singing the praises of Tiny Tiny RSS, which required me to maintain a PHP installation as well as a database to go with it. Tiny Tiny RSS is an absolutely awesome RSS feed reader, and if others I knew had wanted to use my installation (it is multi-user, so you could have a community), I would likely have kept it. But I wanted to eliminate the need for a PHP and database server while still having my RSS feeds available everywhere. Tiny Tiny RSS was also running on a small Linode server that was costing me 10 US dollars a month, so it was time to eliminate an expense and simplify my RSS experience.
Tiny Tiny RSS allows you to export your subscriptions to an OPML file, an XML based format that I believe any RSS reader now supports. I exported my subscriptions and used Python, plus a bit of shell and awk, to publish my daily headlines statically on my FreeBSD home server. With Tiny Tiny RSS I noticed I was seeing a lot of duplicate headlines from some sites, particularly bigger news sites, which I suspect were altering headlines or the feed in some way. Not the fault of Tiny Tiny RSS, but it nonetheless complicated my consumption of news. I do not want to be glued to my feed reader; I want to take a look once or twice a day and then move on. I recall my parents, and a much younger me, reading the newspaper in the morning or in the afternoon after school or work. Fun fact: I was a paper carrier as a kid and even won paper carrier of the year more than once. I want that feel of reading the newspaper with my coffee in the morning or at lunch time, then going about my day. I want that experience because it is simpler, faster, and just a better minimal experience than a constant flow of updates, which turns into noise. Rather than feeling like I might miss something among hundreds of updates and having to mark them all read, I want to look and move on. Every twelve hours or so most of the old updates are gone with the last issue and I can see a new, easily digestible list of linked headlines.
I intend to keep my exported OPML as is, in the event I decide to someday go back to Tiny Tiny RSS or another RSS feed reader. I also weeded out some subscriptions whose websites no longer existed, which got me down to 51 current subscriptions, many of which only post a few times a year. Most of the changes and new content come via the Reuters, BBC, and Hacker News feeds, which seemed to overrun Tiny Tiny RSS and inundate me with sometimes hundreds of unread posts if I did not look for a few hours.
I used Python and listparser to write a text-only version of my feeds to work with. Because this will be how I subscribe to and unsubscribe from feeds, I wanted it to be simple and easy to understand: a CSV file with the title and URL of each feed, which looks like the following.
"So you want to be a brewer...",http://blog.troegs.com/?feed=rss2
"Simple Desktops",http://feeds.feedburner.com/simpledesktops
"xkcd.com",http://xkcd.com/rss.xml
"Skillet",http://skillet.lifehacker.com/rss
"Reuters: Technology News",http://feeds.reuters.com/reuters/technologyNews
"BBC News - World",https://feeds.bbci.co.uk/news/world/rss.xml
"CertDepot",http://feeds.feedburner.com/Certdepot
"Command-Line-Fu",http://feeds2.feedburner.com/Command-line-fu
Now that I had a list to work with, I identified what I wanted to accomplish (a rough sketch of the resulting script follows the list).
- Download the RSS feed from each entry in the rssfeeds.csv file
- Check the title of each entry in the feed to see if it already exists in a text database my script will create.
- If the entry already exists and was recorded more than 15 hours ago, do not show it again.
- For titles that do not already exist in the database, write the title and a timestamp to the database.
- Output to an HTML file on my website only the new RSS feed titles from the last 15 hours.
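Here is a minimal sketch of how those steps might fit together, assuming the feedparser library for fetching and parsing each feed; the file names seen.db and headlines.html are placeholders, not my actual files.

#!/usr/bin/env python3
# Minimal sketch of the steps above. Assumes feedparser is installed; the
# names seen.db and headlines.html are hypothetical stand-ins.
import csv
import html
import time

import feedparser

WINDOW = 15 * 60 * 60  # only show titles first seen within the last 15 hours
now = time.time()

# Load the text "database": one tab-separated line per title already seen,
# with the time it was first seen.
seen = {}
try:
    with open("seen.db") as db:
        for line in db:
            stamp, _, title = line.rstrip("\n").partition("\t")
            seen[title] = float(stamp)
except FileNotFoundError:
    pass

sections = []
with open("rssfeeds.csv", newline="") as feeds:
    for row in csv.reader(feeds):
        if len(row) != 2:
            continue
        name, url = row
        parsed = feedparser.parse(url)
        fresh = []
        for entry in parsed.entries:
            title = entry.get("title", "").strip()
            if not title:
                continue
            first_seen = seen.setdefault(title, now)  # new titles get stamped now
            if now - first_seen <= WINDOW:
                fresh.append((title, entry.get("link", "#")))
        if fresh:
            items = "".join(
                '<li><a href="{}">{}</a></li>'.format(html.escape(link), html.escape(title))
                for title, link in fresh
            )
            sections.append("<h2>{}</h2><ul>{}</ul>".format(html.escape(name), items))

# Rewrite the database and the static headlines page.
with open("seen.db", "w") as db:
    for title, stamp in seen.items():
        db.write("{}\t{}\n".format(stamp, title))
with open("headlines.html", "w") as page:
    page.write("<html><body>{}</body></html>".format("".join(sections)))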
Could I miss something with this method? Sure, it could happen. Does it really matter if I do? No, it does not. This would be like worrying about missing something on social media: who cares. I am still informed about any major news, but I am not so overwhelmed with headlines that I feel the need to scroll through a few hundred updates five times a day. My solution usually gives me a list of fewer than 90 headlines in an easy to read format, something like the following condensed example; although not shown in the example, in my solution these are links to the articles listed. I have since updated the layout to the design you see in the screenshot, which is much easier to read and makes it easier to identify articles from specific feeds.
Mon, Mar 02 11:00 AM ( "Skillet" )
-----------------------------------------
Add a Little Booze to Your French Toast
Mon, Mar 02 11:00 AM ( "Unwinnable" )
-----------------------------------------
Horror on the Orient Express
Mon, Mar 02 11:00 AM ( "Hacker News" )
-----------------------------------------
WireGuard Gives Linux a Faster, More Secure VPN
Ask HN: Who is hiring? (March 2020)
Payment Request API
Mon, Mar 02 11:00 AM ( "BBC News - World" )
-----------------------------------------
Coronavirus: World in 'uncharted territory'
Coronavirus: Global growth could halve if outbreak intensifies
Turkey says millions of migrants may head to EU
I considered archiving versions of each page or playing with background and font coloring, but right now I am pretty happy with the simple single list. The problem with archiving is that I feel it would lead me to spend more time with this content than I intend or really need to, and I would need to identify the value in having such an archive. Good design is often as minimal as possible; you do not have to overthink it. It is amazing how consuming RSS feeds like this feels so much less like a chore than it ever has before. As a bonus, I eliminated a web server installation I had to care for and saved myself 120 US dollars a year.