Category: Tech

I am tired of people making fun of a feature just because they cannot think of a use for it. This attitude recently came to the fore with Apple’s introduction of “slofies” – the slow-motion selfies. The call to “stop trying to make slofies happen” was loud and clear from the tech community. Apparently, no one else wants to use it because we do not want to use it.

That’s a terrible take. Sure, maybe the feature rates low on the “usefulness” parameter, but it stands high on the fun scale. Our smartphones are the most personal devices we carry around with us, and that is not just because they are useful. They are equally fun too.

So stop mocking anything that you will not use. Selfies. Crazy filters. Slow-motion videos. Loop and Bounce effects – the boomerangs. They all make these dull devices a lot more fun. And their fun factor is what makes them sell to the masses.

I wonder what the purists think of the recent computational photography trend.

Google started it with its all-in-cloud touch-up of photos. And then it moved the processing on-device in the camera app. Every photo one took was stitched together from multiple shots with different settings. And eventually each OEM made their cameras smarter, “AI-driven”.

The latest iPhone 11 stitches a single photo from four underexposed frames taken before the shutter button is tapped, one normal picture and one overexposed frame. They call this process semantic rendering. What follows is some heavy processing. Here’s a snippet from The Verge’s review of the iPhone 11 Pro.

Smart HDR looks for things in the photos it understands: the sky, faces, hair, facial hair, things like that. Then it uses the additional detail from the underexposed and overexposed frames to selectively process those areas of the image: hair gets sharpened, the sky gets de-noised but not sharpened, faces get relighted to make them look more even, and facial hair gets sharpened up. Smart HDR is also now less aggressive with highlights and shadows. Highlights on faces aren’t corrected as aggressively as before because those highlights make photos look more natural, but other highlights and shadows are corrected to regain detail.

What you get as a result is an extremely clear picture with each object in the photo appropriately visible.

But with so much processing of each image, should this even be called photography any more? Here’s Wikipedia introducing the term.

Photography is the art, application and practice of creating durable images by recording light or other electromagnetic radiation, either electronically by means of an image sensor, or chemically by means of a light-sensitive material such as photographic film.

What we do with our smartphones is neither an art nor is it creating a single image.

All parts of the photo are independently captured (and even pre-captured) with the best-suited settings and processed post-capture, with even some live sections, including audio, recorded. This is not “creating an image” any more.

Someone might say it all started when digital photography became mainstream – when the physical limitations of the analog methods no longer constrained the person with a camera in hand. However, what we capture is no longer a single image. A more apt term for these might be “visual memories”. Common people are interested in doing just that; they don’t care if they are called photographers.

Let Photography stay an art.

State of Mobile Imaging

There has been a lot of positive buzz around the Huawei P30 Pro’s cameras recently. For me it started with this Twitter thread by Vlad Savov of The Verge, where he compared images from this latest phone from Huawei against the Pixel 3. Especially the pitch-dark, night-sight pictures. Just look at this specific example from the thread.

That’s simply criminal. Here’s Vlad summarizing his observation in the article.

In the following example, featuring an unlit bathroom where my eyes could detect shapes but no colors, the P30 Pro does the unbelievable by actually focusing and producing a very respectable image.

So now these mobile cameras can do better than our eyes? Nice. And this positive view of the P30 camera is shared by almost all tech reviewers. Here’s what Engadget says about it.

With less light, zooming, focus and detail should be a struggle, but the P30 Pro mostly shrugged it off. The combination of dual OIS on both the primary camera and the telephoto, in addition to the digital image stabilizing trick, gives the phone a better chance to capture images at reduced noise and do it all better.

Of course, this isn’t the perfect smartphone – far from it. It isn’t even the hands-down best camera quality and experience on a mobile phone. Rene Ritchie has a nice comparison video of the phone with the iPhone XS. It is good to be aware of the capabilities and shortcomings of the overall device.

But I am amazed at the speed with which imaging technology on smartphones is improving. It was only a few months back that we were surprised at what the Pixel 3 could achieve with its night-sight feature. And we already have a device that, if it doesn’t top it, matches it under most conditions.

The tech pundits always made us believe that camera tech would be the next big differentiator for smartphones — something that would separate the big, serious players from the emerging ones. However, the way things are going, I don’t think that will be the case. The camera modules will soon be commoditized again, and everyone will be back to the drawing board in search of that one differentiator.

This does not bode well for Google’s hardware efforts in smartphones. Apple and Samsung, for whom their brand is the primary selling point, would be pleased with this.

Displaying Webmentions with Posts

I have been using Blot, a simple blogging platform with no interface, for quite some time now to run my blog. I am not alone in saying this, but I am mighty impressed with how simple it is to post things on Blot and maintain the overall site. They are just some files in Dropbox – that’s about it. So it was pretty straightforward to customise the theme to my liking and to enable support for IndieWeb principles.

Post Notes

One thing I have noticed, though, is that most of the IndieWeb principles are not visible. They enable a more open web, providing sites a grammar they can talk to one another with. But for someone who owns the website or even someone who reads the posts on a website, whatever changes go in just aren’t apparent. Except, of course, for webmentions.

I have already detailed the steps to IndieWebify one’s website (specifically one built with Hugo). I did not go into the details of setting up webmentions. And that is exactly what I get asked about the most – how does one display mentions along with the posts?

The need is more evident with microblogging – and especially so with micro.blog. The platform fosters a very active and pleasant community focused more on interactions (replies) than meaningless reactions (likes, reposts). It also sends webmentions for every reply to a post to the sites that can receive them. So the desire to display the interactions along with the posts, microposts more so, is understandable.

In this post, I will (finally!) document the steps that can help one receive, fetch and display webmentions along with the posts. The steps are documented with reference to a Blot website. However, they can be altered at appropriate places, primarily formats, to implement the support on any other platform.

Essential Indieweb

Before you can start receiving webmentions at your site, there is an essential IndieWeb step to be achieved – making your website your identity online. It involves openly declaring your social network profiles as rel-me links and linking those profiles back to your site. This allows you to log in to any IndieAuth-enabled service using your website’s homepage – no need to create an account or maintain passwords.

To achieve this, modify the head.html of the site’s theme to add such links to your other online profiles in the <head> element. These profiles can be at Twitter, GitHub, Facebook, or even email – anywhere you can link back to your website from. Some reference links (with placeholder profile URLs) are shown below.

<link rel="me" href="https://twitter.com/yourhandle" />
<link rel="me" href="https://github.com/yourhandle" />
<link rel="me" href="mailto:you@example.com" />

After you declare your website with either Twitter or GitHub, they can authenticate your identity. With email, a link is sent to the configured email address to do the same, very much like any email-based two-factor authentication.

You also need to declare the service which will act as an authorization endpoint when needed. This is used by other services, mostly IndieAuth clients, to validate your identity. To configure this, just add the below link to your site’s head.

<link rel="authorization_endpoint" href="https://indieauth.com/auth">

Once you have this enabled, you can test your setup using IndieWebify.Me. Test the “Set up Web Sign In” section.

You are now ready to receive webmentions from other sites, including micro.blog.

Receive Webmentions

This primarily involves hosting and declaring a webmention endpoint. Of course, the active IndieWeb community already has a ready solution for this – Webmention.io by Aaron Parecki. It is “a hosted service created to easily handle webmentions”. All you have to do is sign in with your domain (i.e. validate your identity) and let Webmention.io receive all mentions to your site. Once that’s done, just declare the webmention endpoint as below in your site’s head.

<link rel="webmention" href="https://webmention.io/yourdomain.com/webmention" />

The username is typically the URL of your site (you can also find these details on the Webmention.io settings page). To test this setup, log in to the Webmention.io dashboard, and you should start seeing the mentions sent to your site (which include the replies from micro.blog).

Display Webmentions

Webmention.io also provides APIs for you to fetch the webmentions to your posts/site. You can implement a custom solution using JavaScript for fetching and displaying these mentions along with posts. Below is one of the ways this can be achieved. It specifically pulls the likes, reposts and replies and puts them below the posts. The code might look a bit untidy, but that makes it easier to follow what’s going on. You can improve on it eventually.

To start with, declare a placeholder for the webmentions. Place the below div element in your entry.html file between {{#entry}} and {{/entry}} – preferably towards the end of the file, just above {{/entry}}.

<div class="post-mentions" id="post-mentions" style="display:none">
    <ul class="mentions-list" id="mentions-list"></ul>
</div>

Of course, you could replace the unordered list ul element with anything else. This is just one of the ways you can do it.

Next, you need to fill this div element with the mentions the entry has received. You can use the JavaScript snippet available at this gist to fetch and display the webmentions along with the post. Just place the complete code available there at the end of the script.js file of your Blot theme. It would look something like below (note that this is an incomplete snapshot).

{{{appJS}}}

var post_url = window.location.href;
$(document).ready(function() {
    $("ul#mentions-list").empty();
    $.getJSON("https://webmention.io/api/mentions", { target: post_url }, function(data) {
        ...
        ...

Make sure the first {{{appJS}}} line is not removed. This import makes sure the additional JavaScript files necessary for some specific features provided by Blot – for example jQuery for image zoom, Google Analytics, etc. – are loaded.

The above JavaScript snippet does the following:

  1. Gets the current post URL and fetches the webmentions for that URL from Webmention.io.
  2. Divides and groups the mentions by the activity type (like, repost, link, reply). This is so that you can control how each activity-type is styled.
  3. Finally, populates these mentions into the above-created placeholder div element.
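For reference, the three steps above can also be sketched in plain JavaScript with the Fetch API, without jQuery. This is a minimal sketch, not the gist’s actual code: it assumes the Webmention.io jf2 API (/api/mentions.jf2) and the placeholder element ids declared earlier (post-mentions, mentions-list).

```javascript
// Step 2: group fetched mentions by activity type, so each type can be
// styled separately. Webmention.io reports the type in "wm-property".
function groupMentions(mentions) {
  const groups = { likes: [], reposts: [], replies: [], links: [] };
  for (const m of mentions) {
    switch (m["wm-property"]) {
      case "like-of":     groups.likes.push(m);   break;
      case "repost-of":   groups.reposts.push(m); break;
      case "in-reply-to": groups.replies.push(m); break;
      default:            groups.links.push(m);   // plain mentions, etc.
    }
  }
  return groups;
}

// Steps 1 and 3: fetch mentions for the current page and fill the placeholder.
async function showMentions() {
  const target = window.location.href;
  const res = await fetch(
    "https://webmention.io/api/mentions.jf2?target=" + encodeURIComponent(target)
  );
  const data = await res.json();
  const groups = groupMentions(data.children || []);

  const labels = { likes: " liked this", reposts: " reposted this",
                   replies: " replied", links: " mentioned this" };
  const list = document.getElementById("mentions-list");
  list.innerHTML = "";
  for (const [type, items] of Object.entries(groups)) {
    for (const m of items) {
      const li = document.createElement("li");
      li.className = "mention mention-" + type;
      li.textContent = ((m.author && m.author.name) || "Someone") + labels[type];
      list.appendChild(li);
    }
  }
  // Reveal the container only when there is at least one mention.
  if (list.children.length > 0) {
    document.getElementById("post-mentions").style.display = "block";
  }
}
```

Calling showMentions() on page load (for example from a DOMContentLoaded listener) fills the list and reveals the container.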

Once the above steps are carried out, you should start seeing the webmentions along with the posts.

In case the webmentions are available on the dashboard but aren’t getting loaded on the post, one possible root cause is a failure of the jQuery import. Declare a jQuery import explicitly in head.html by adding the below statement within the head tags.

<script src="https://code.jquery.com/jquery-3.4.1.min.js"></script>

You will also note that all the HTML elements in the JavaScript code are tagged with a class attribute. This allows you to style the elements to your liking. Just modify the main CSS file for your theme (typically style.css) to add styling for these classes.

For reference, find below a sample styling for the main post-mention class.

.post-mentions {
    padding-top: 15px;
    margin-top: 10px;
    border-top: 1px solid #AEADAD;
    border-bottom: 1px solid #AEADAD;
    font-size: 16px;
}

.post-mentions ul {
    list-style: none;
    padding: 0;
    margin-left: 0;
}

Similarly, you can also style the mention, mention-author, mention-social and mention-text classes.

Interactions from Social Media

Though references to your posts from IndieWeb sites are handled, what about the references made on Twitter or Facebook? They can be any form of response (likes, retweets, reposts) to the syndicated post. Of course, these services do not send webmentions (wish they did).

One option is to implement your own backfeed to poll for such interactions and handle them as responses. Well, the community has made sure there is a simpler hosted option. Enter Bridgy.

Bridgy is an open source project and proxy that implements backfeed and POSSE as a service. Bridgy sends webmentions for comments, likes, etc. on Facebook, Twitter, Google+, Instagram, and Flickr.

Just connect your social accounts with your website at Bridgy and every time there is an interaction about your post on a service, Bridgy captures that and sends a webmention to your endpoint configured earlier.

I understand that the IndieWeb journey can get overwhelming. Webmentions are a critical part of this journey and, as I said earlier, one of the more prominent pieces of the overall puzzle. I have never been comfortable using any of the commenting systems, be it those that come native with the CMSes or external systems like Disqus. I have also observed that the newer platforms, like Blot or micro.blog, rarely come bundled with commenting systems of their own.

Webmention has the potential to address that need. I hope the steps detailed above come in handy for anyone who wants to display webmentions on his or her site right next to the posts. The source for the theme that styles my blog at Blot is openly available on GitHub. So, if you like anything that you see at my blog, you can refer to the source and get inspired.

Do let me know if you face issues with getting any of the steps carried out. If you don’t face any issue and get everything working perfectly, send me a webmention — the best way would be to link to this post. If all’s well, your links would be visible below this post as mentions.

I believe “all data is anonymised” has to be the biggest lie all these data-hoarding and advertising companies tell their customers. With the amount of data they have, they can build an extremely accurate profile of any user, no matter whether each individual data point is anonymised.

Open Web and Your Social Signature

I had recently expressed my hope for more people to own their identities online.

There is nothing wrong with attempting to control what you post online, to make sure it stays online till you want it to. I do also realise that it is naive to think no one getting online will find this process irksome. Even though well defined, the (open web) principles are not for all. The simplicity of using and posting on social media services will continue to attract regular users. However, here’s wishing that at least a part of these users are inspired to get their own personal domain.

An innate wish is that more people would leave the silos behind and get online as themselves, express thoughts that are their own, not mindless reposts and shares, and at a site they control – their blogs1. At the same time, the hope is that the hosting platforms make it simpler to book such places online and get them up and running easily.

I think there has already been a huge improvement on this front. There are numerous platforms, like WordPress, Ghost and others2, that are making it simpler to get your own blogs up and running. They also allow you to link these blogs to your domain without fussing over hosting/maintenance. The promise is simple. Jump in with a free tier — if you are happy and if you want to, just switch to a paid account.

But then comes the million-dollar question: what’s the point if what I write reaches no one? If no one reads it or talks about it? If everyone keeps shouting into the void without anyone listening, one had better not spend the energy. After all, we are sociable. We like interactions, we want feedback.

RSS is a powerful protocol that could have solved this problem. Unfortunately, that’s what it remained, a protocol3. It needed a system to be built on top of it to gain any traction amongst the masses. That’s where I believe lies an opportunity for micro.blog. It brings in that social layer to the thoughts you pen on your blog.

You can either host your content there or feed your posts from an existing blog into the timeline. You write on your blog, and it’s visible to others on their timeline, just as a tweet or a Facebook post would be on their respective siloed timelines.

But it doesn’t allow reposts. It does not glorify the numbers of likes and comments and followers.

Such behaviours and numbers are the signals bots use to game the machine-curation systems. Tristan Harris put this very well during one of his podcast appearances.

Outrage just spreads faster than something that’s not outrage.

When you open up the blue Facebook icon, you’re activating the AI, which tries to figure out the perfect thing it can show you that’ll engage you. It doesn’t have any intelligence, except figuring out what gets the most clicks. The outrage stuff gets the most clicks, so it puts that at the top.

So what do we do then? As Don MacDonald pondered in one of his posts, is sharing a problem? Shall we just stop sharing?

I doubt that will be effective. It will work only when we make it work. We need to take control of what gets presented to us to consume. It cannot be done by a corporation inclined primarily to maximise its margins. It cannot be done by an algorithm that’s designed to gobble up every signal and spit out a feed to maximise engagement.

Once we start consuming and reading healthy, we will think healthy. And we should think. And share, and respond we should. Let’s just make sure it is a space that represents us. A space that one can point to and say those are my thoughts in there. My social presence, a signature. Let the open web be that space.

  1. I use blog and site interchangeably throughout this article. I do not want to get into the technicalities. And I am just focused on individuals, not companies.

  2. A lot many for professional sites too — SquareSpace, Wix etc. Again, the idea is focusing primarily on individuals.

  3. Of course, I am intentionally jumping over a phase when RSS was the buzzword. With Reader, Google had upped everyone’s hopes for the platform. And in killing Reader, it dealt RSS a dull shrug.

IndieWebify Your Hugo Website

For many, that’s too much jargon right there in the title. So, a bit of basics first. I won’t go into the history or setup of this website. TL;DR: this site is built with the static site generator Hugo, themed by a custom port of the Ghostwriter theme. In this post, I want to focus mainly on the why and how of IndieWeb.

Ever since I built this site, I had grown pretty determined to style it exactly to my liking. Every section in here is thought through and has a purpose. When I was happy with what I had at the core, I started exploring how to make this my primary identity online. I had realised that, thanks to all the silos of Facebook, Twitter, Medium, et al., my online presence was scattered across multiple sites. IndieWeb calls it sharecropping your content.

Our online content and identities are becoming more important and sometimes even critical to our lives. Neither are secure in the hands of random ephemeral startups or big silos. We should be the holders of our online presence.

When you post something on the web, it should belong to you, not a corporation. Too many companies have gone out of business and lost all of their users’ data. By joining the IndieWeb, your content stays yours and in your control.

Yes, absolutely. I just couldn’t stop nodding my head in agreement as I read along. I want this site to be the primary place where my content exists. And I hope everyone spending time online, doing as basic a task as consuming content on social media and liking stuff there, would do so first on a place they control. Post on your website, syndicate it everywhere else1.

And it doesn’t take much to enable this. IndieWeb is just a set of principles, many of them already W3C Recommendations, and the steps I mention below can be incorporated on any platform you base your site on. Here I specifically jot down the steps I took for my site.

Hugo Basics

Hugo allows you to completely define how your website is structured and styled. It even allows you to define the behaviour of pages across the site. Typically a user chooses a theme from a varied selection and lets it handle everything. However, you always have access to the theme files and the code — just some Go snippets spread across HTML pages, making it fairly simple to modify them.

  1. Familiarise yourself with your theme, typically present in <site_root>/themes/<theme_name>
  2. Identify the customisations made
  3. Make changes to the code, typically present either in the layouts folder of your site or the theme

Web Sign-in via IndieAuth

The first step in owning your identity online is allowing your site to be your identity online. All this needs is to openly declare your social network profiles as rel-me links and link those profiles back to your site. This allows you to log in to any IndieAuth-enabled website using just your website homepage, with no need to create an account or maintain more passwords.

For me, I already had links to my social network profiles in my /layouts/partials/header.html file. All I had to do was add a rel-me tag — a sample snippet for the Twitter profile is below.

<a data-hint="Twitter" title="Twitter" href="{{ . }}" target="_blank" rel="me"> ... </a>

I have selected Twitter and GitHub as the social network profiles to be used for identification.

You also need to declare the service which will act as an authorization endpoint when needed. This is used by other services, mostly IndieAuth clients, to validate your identity. To configure this, just add the below link to your site’s head.

<link rel="authorization_endpoint" href="https://indieauth.com/auth">

Identity via h-card

Of course, once you decide your site will be your identity, your profile online, you need to make sure it clearly declares your basic information, i.e. defines that identity. This allows other sites and services to know more about you. This is done via an h-card, a part of the microformats2 specification.

I added an h-card with my name, nickname, email and photo in /layouts/partials/footer.html.

<p class="h-card vcard">
    <a style="text-decoration: none" href="{{ .Site.BaseURL }}" class="p-name u-url url author metatag" rel="me">{{ .Site.Author.name }}</a> /
    <a class="p-nickname u-email email metatag" rel="me" href="mailto:{{ .Site.Author.email }}">{{ .Site.Author.nick }}</a>
    <img class="u-photo" src="img/headshot.png" alt="" />
</p>

This way your social networks are not ones that define your identity, rather it is you who declare and control it.

Content definition with microformats

After defining yourself, the next step is to define your content. It is important for other sites and services to identify not just you, but your content too. Just add an h-entry markup (again part of the microformats2 specification) to the posts template. This is possible by identifying the layouts that handle your posts and adding some markup to them.

I had to modify /layouts/partials/post-header.html, /layouts/partials/post-content.html and /layouts/post/single.html. The information I declare is the post title (p-name), post URL (u-url), author (p-author), date (dt-published) and, of course, the content (e-content). Finally, a typical post template looks as below.

<article class="h-entry">
    <h1 class="post-title p-name" itemprop="name headline">{{ .Title }}</h1>
    <span>Published <a class="u-url" href="{{ .Permalink }}"><time class="dt-published">{{ .Date }}</time></a></span>
    <span>by</span>
    <span class="p-author">{{ .Params.author | default .Site.Author.name }}</span>
    <div class="post-content clearfix e-content" itemprop="articleBody">
        {{ .Content }}
    </div>
</article>

Cross-site conversations via Webmentions

All the steps done till now define your and your content’s identity for external websites. However, how do you enable conversations with them? Every time you or your website is referenced online by a site that is IndieWeb-enabled, you can be notified via Webmentions, another W3C Recommendation.

Webmention is a simple way to automatically notify any URL when you link to it on your site. From the receivers perspective, it’s a way to request notification when other sites link to it.

In short, it is a modern alternative to the erstwhile Pingback mechanism from the prime blogging days. Of course, you have to enable your site to both send and receive webmentions.

To send a webmention to a page you have referenced, you can notify it by

  • using a form made available on the referenced page
  • using curl or one of the many open source clients

If the referenced site can receive webmentions, it will be notified that you have linked to it. But how do you enable your site to receive webmentions?
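As an illustration of what those sending clients do under the hood, here is a hedged sketch following the W3C Webmention spec: fetch the referenced page, discover its webmention endpoint, then POST the source and target URLs to it. The discovery helper is deliberately naive — it only scans the HTML for a rel attribute followed by href, ignoring the HTTP Link header and relative URLs that a real client must also handle.

```javascript
// Naive endpoint discovery: find a <link> or <a> with rel="webmention"
// (assumes rel appears before href; a full client parses more carefully).
function discoverEndpoint(html) {
  const match = html.match(
    /<(?:link|a)\b[^>]*rel=["'][^"']*\bwebmention\b[^"']*["'][^>]*href=["']([^"']*)["']/i
  );
  return match ? match[1] : null;
}

// Send a webmention: discover the target's endpoint, then notify it
// with a plain form-encoded POST of the two URLs.
async function sendWebmention(source, target) {
  const page = await fetch(target);
  const endpoint = discoverEndpoint(await page.text());
  if (!endpoint) throw new Error("No webmention endpoint found at " + target);
  return fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: "source=" + encodeURIComponent(source) +
          "&target=" + encodeURIComponent(target),
  });
}
```

The equivalent with curl, against whatever URL discovery returns: curl -i -d source=&lt;your-post-url&gt; -d target=&lt;their-post-url&gt; &lt;endpoint&gt;.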

Again, it involves hosting and declaring a webmention endpoint. Of course, the active community already has a ready solution for this – a hosted endpoint by Pelle Wessman, “created to easily handle webmentions”. All you have to do is sign in with your domain (i.e. validate your identity) and let it receive all mentions to your site. Once that’s done, just declare the webmention endpoint as below in your site’s head.

<link rel="webmention" href="" />

The service also provides APIs for you to fetch the webmentions to your content/site. You can implement your own solution for fetching and displaying these mentions along with posts. I have a simple JavaScript snippet to fetch and display them with the posts — this is still a work in progress.

Interactions from Social Media

Though references to your posts from IndieWeb sites are handled, what about the references made on Twitter or Facebook? They can be any form of response (likes, retweets, reposts) to the syndicated post. Of course, these services do not send webmentions (wish they did).

One option is to implement your own backfeed to poll for such interactions and handle them as responses. Well, of course, the community has made sure there is a simpler hosted option. Enter Bridgy.

Bridgy is an open source project and proxy that implements backfeed and POSSE as a service. Bridgy sends webmentions for comments, likes, etc. on Facebook, Twitter, Google+, Instagram, and Flickr.

I was set — now, every time there is an interaction about my post on a service, Bridgy captures that and sends a webmention to my endpoint.

One More Thing – Micropub

At this point, I had completely IndieWebified this site. I understand it might get overwhelming if you read it as a long sequence of steps. But trust me, the changes to be made are neither too many nor too complicated. It is just setting up your site with appropriate markups and endpoints. There is also a really helpful guide that walks you through every step and also validates whether you have everything set up correctly. But if you still do not want to go through the trouble, I have open sourced the changes I made to the theme I was using. You can use this theme, style it to your liking if needed, and get going.

For me though, there was one more thing pending. Around the same time I was setting this up, micro.blog had caught my attention. I already had a section dedicated to microposts. I configured a feed for these posts, the microblogs, to syndicate to my micro.blog timeline too. I was enjoying the interactions I was part of with the community there2.

There was just one issue: I could not use it to post to my site. My process to post to this site still involved text editors and git commits. I wish it was simpler — at least for such microblogs. I wish I could just post from micro.blog and have an entry made. Enter Micropub, yet another W3C Recommendation.

It is the trickiest part of the puzzle. Again, it involves creating an endpoint which can accept requests to post, verify that they come from an authorised source and finally create the post file in the required structure for Hugo to build and deploy. I have enabled the support for Micropub on this site, thanks again to Paul.

I would like to detail the solution I have running, but that would be a topic for another post. In short, I hosted Paul’s open source micropub endpoint3, configured it to generate posts in the format I need and to commit to my website source at GitHub, declared it as my site’s micropub endpoint, and then let Netlify handle the rest.
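To give a flavour of what such an endpoint does at its core, here is a hypothetical sketch (not Paul’s actual code) of turning an incoming form-encoded Micropub request into a Markdown file that Hugo can build. The front-matter keys and the slug rule are illustrative assumptions; the token verification against the IndieAuth token endpoint and the GitHub commit are left out.

```javascript
// Build a URL-friendly file name from the post title (illustrative rule).
function slugify(name) {
  return name.toLowerCase().replace(/[^a-z0-9]+/g, "-").replace(/^-|-$/g, "");
}

// Turn Micropub form fields (name, content) into a Hugo post:
// TOML front matter followed by the body. Microposts without a
// title simply skip the title key.
function micropubToHugoPost(fields, now = new Date()) {
  const lines = ["+++"];
  if (fields.name) lines.push('title = "' + fields.name + '"');
  lines.push('date = "' + now.toISOString() + '"');
  lines.push("+++", "", fields.content || "");
  return lines.join("\n");
}
```

The endpoint would then commit the returned string as a .md file (the path depends on your site’s structure) and let Netlify rebuild the site.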

You bet I would say it was all simple. I won’t.

I have realised that the idea of owning your identity online is a crucial facet of protecting yourself and your content in this fragile, siloed world of the internet. It may be perceived as ostentatious by many, but it is rather humble, driven by a virtuous intent. There is nothing wrong in attempting to control what you post online, to make sure it stays online till you want it to. The IndieWeb project allows that.

I do also realise that it is naive to think no one getting online will find this process irksome. Even though well defined, the principles are not for all. The simplicity of using and posting on social media services will continue to attract regular users4. However, here’s wishing that at least a part of these users are inspired to get their own personal domain. After all, it’s the first step in getting started on this IndieWeb journey.

  1. IndieWeb terms it POSSE – Post on Site, Syndicate Elsewhere

  2. It really is a great community. I would urge everyone with a blog or a site of their own to join the platform. It is open, stays focused on letting you own your content and makes you part of a wonderful set of people, all with some varied talent. With an open mind, there is so much to learn.

  3. Though it’s primarily created for Jekyll, it does work with Hugo too.

  4. In a way, this simplicity brought more people online, gave them a platform to create and publish their content. Gave them a voice. It would be pretentious to only blame them for their frailties and not appreciate what they have fostered.

Why Microsoft Failed with Windows Phone

A good round-up by Quartz on why Microsoft failed with Windows Phone; an attempt is also made to play out an alternate reality in which Microsoft avoided that fate. It’s based on Microsoft declaring Windows Phone free.

Compared to Google, Microsoft has much stronger connections to hardware OEMs on the one hand and software developers on the other. Its products are widely used and respected by business and consumer customers alike. By offering the Windows Phone platform for free, the company sacrifices licensing revenue, but this unnatural act is more than compensated for by the expansion of the Windows ecosystem. Windows PCs become more attractive, more compatible with the outpouring of mobile devices and applications created by enthusiastic hardware makers and eager app developers.

The biggest problem there? Microsoft just didn’t have a competitive solution in 2007–08. They were still rolling out devices with the Windows Mobile OS1 and were struggling to come up with their own alternative that would be distinct from what Apple and Google had on offer. Of course, they attempted and failed before they finally launched a good enough solution with Windows Phone. But this was around 2010 – a good three years after the iPhone was launched. It just was too late.

So, I really think it would not have mattered if Windows Mobile had been free. It just wasn't good enough against iOS and Android. What Microsoft needed was a quick relook at their mobile strategy, which their success with Windows just didn't allow. Quartz does a good job of summarising the attempts Microsoft made to turn the tide. But abstracting it to claim the Windows Phone failure was easily preventable is simply a stretch.

It wasn’t the culture. Microsoft failed because there just was no space to play between the killer duo: the open Android, with which the different OEMs combined owned the larger market share, and the closed iPhone, which gobbled up the premium market.

Windows Mobile 6.1

  1. One that had a Start button, a taskbar, a toolbar, a file system and even Internet Explorer. To get an idea of how far it lagged behind iPhone OS and Android, above is how it looked in 2008.

Why do we keep failing to foresee technology advances?

Forecasting the future of technology has always been an entertaining but fruitless game. Nothing looks more dated than yesterday’s edition of Tomorrow’s World. But history can teach us something useful: not to fixate on the idea of the next big thing, the isolated technological miracle that utterly transforms some part of economic life with barely a ripple elsewhere. Instead, when we try to imagine the future, the past offers two lessons. First, the most influential new technologies are often humble and cheap. Mere affordability often counts for more than the beguiling complexity of an organic robot such as Rachael. Second, new inventions do not appear in isolation, as Rachael and her fellow androids did. Instead, as we struggle to use them to their best advantage, they profoundly reshape the societies around us.

This is such a great article from Tim Harford, a must-read to understand how we have always been wrong when foreseeing where technology is headed. Especially important is the fact that it was never the big-bang technological advances that changed the world.

But many world-changing inventions hide in plain sight in much the same way — too cheap to remark on, even as they quietly reorder everything.

Tim calls it the toilet paper principle. Again, so apt. He has a detailed run-down of many such important but often overlooked inventions that deeply affected the multiple phases of the industrial revolution. Do read this article. It is a nice summary of what we have been getting wrong about technology over time. It also lets us reevaluate our perspective and beliefs on where we expect the advances to come from.

Static Website: Benefits & Writing Process

I have recently converted my blog to a website and, as I have already documented, I serve it as a static website. I preferred this approach over a dynamic one driven by a full-fledged blogging software or publishing platform. Of course, these were not the only possible options. As is so well captured by Christopher Heng at his website1, the options are just too many.

It wasn’t an easy call to select one; it never is. Every option one chooses has its own advantages and disadvantages, and, of course, going with a standalone site generator and serving the content as pure HTML had its own too. For me, though, the benefits really outweighed the challenges.

writing setup

Why choose a static website for blogs?

A static website is nothing but a set of HTML pages served by a hosted web server. Every document, blog post or update is a plain HTML page. Given this simplicity, it has some key benefits that attracted me.

  1. It’s quick and cheap to develop
  2. It’s easy and cheap to host
  3. It’s fast to serve

Again, Hugo’s own documentation captures them best.

Improved performance, security and ease of use are just a few of the reasons static site generators are so appealing.

The most noticeable is performance. HTTP servers are very good at sending files—so good, in fact, that you can effectively serve the same number of pages with a fraction of the memory and CPU needed for a dynamic site.

I have already seen this benefit realised with the performance of my website too.
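That performance claim is easy to see in practice: a static site’s server does nothing but copy files off the disk. Here is a minimal sketch using Python’s standard library – the folder name, page content and port are made up for the example, and this is just an illustration, not how any production host is actually configured:

```python
import functools
import http.server
import pathlib
import threading
import urllib.request

# A "static site" is just files on disk: one pre-rendered HTML page.
site = pathlib.Path("public")
site.mkdir(exist_ok=True)
(site / "index.html").write_text("<h1>Hello, static web</h1>")

# Serve the folder. No templates, no database - the server only copies bytes.
handler = functools.partial(
    http.server.SimpleHTTPRequestHandler, directory=str(site)
)
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any HTTP client gets the exact file back, byte for byte.
url = f"http://127.0.0.1:{server.server_address[1]}/index.html"
page = urllib.request.urlopen(url).read().decode()
print(page)  # -> <h1>Hello, static web</h1>
server.shutdown()
```

Production static hosts (Netlify, GitHub Pages, any Nginx box) are essentially doing this same file copying, usually with a CDN in front.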

These benefits should attract every blogger out there looking for a simple, manageable solution for their blog. Then why isn’t every blog served as a static website? The answer, of course, is that nothing’s perfect and static websites are not for all. They bring their own list of challenges.

Creating static pages needs a bit of programming skill, and a lot of interest to slog it out over web design and development. In the longer run, maintaining these pages can become cumbersome if they are to be updated, even ever so slightly, every now and then. Especially if all you care about is the text you write, what gets placed around it is really not of much interest to you. All you want to do is write and publish. You don’t want to awaken the web dev in you every time you want your footer updated or a page added.

Ironically, a rush to dynamic websites to solve this challenge is totally counter to what you really care about: the text. The text is light; the page carrying it needs to be the same. To have these pages generated every time someone requests them2 is just too costly. And to have them stored as database entries is just too messy. It is text; it needs to be stored as text.

Hugo (and the likes) provides a nice middle ground. Smashing Magazine summarizes it well.

Each generator takes a repository with plain-text files, runs one or more compilation phases, and spits out a folder with a static website that can be hosted anywhere. No PHP or database needed.

Hugo takes caching a step further and all HTML files are rendered on your computer. You can review the files locally before copying them to the computer hosting the HTTP server.

Isn’t it, after all, better to make the whole user-facing part of your website into a cache of servable HTML pages and have it generated and deployed locally — without loading the server with a programming runtime or a database? This is exactly what Hugo enables.
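To make that compile-to-folder model concrete, here is a toy generator in Python. It is not Hugo – just a sketch of the same idea under simplified assumptions: each plain-text post gets wrapped in one fixed HTML template and written out as a servable page.

```python
import pathlib

# Toy illustration of the static-generator model (not Hugo itself):
# take a folder of plain-text posts, run one compile pass, and emit
# a folder of HTML files that can be hosted anywhere.
def build_site(content_dir: pathlib.Path, out_dir: pathlib.Path) -> list[str]:
    template = "<html><body><article>{body}</article></body></html>"
    out_dir.mkdir(parents=True, exist_ok=True)
    built = []
    for post in sorted(content_dir.glob("*.md")):
        html = template.format(body=post.read_text())
        page = out_dir / (post.stem + ".html")
        page.write_text(html)  # a ready-to-serve file; no runtime needed
        built.append(page.name)
    return built

content = pathlib.Path("content")
content.mkdir(exist_ok=True)
(content / "hello.md").write_text("My first post")

print(build_site(content, pathlib.Path("public")))  # -> ['hello.html']
```

A real generator like Hugo layers templates, front matter and asset pipelines on top, but the shape of the process – plain text in, a folder of HTML out – is the same.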

This helps in designing, building, testing and maintaining the website without much hassle. And with Hugo, I have already realised all the benefits of a static website mentioned above, while bypassing the challenges it presents. The one-time effort to design and build the website was handled with little pain; now I can focus on writing.

Workflow to write, especially on mobile

There is another challenge with such websites that are built locally and served statically – all they contain is the final product, the HTML files. Any change needs to be followed by a rebuild and a redeploy. There is no online CMS to manage your content from a browser, and especially no WYSIWYG3 web editor to create or modify your posts. Of course, one way to handle this is to deploy a separate light-weight CMS.

But for blogging, there is a simple way. All you need is a continuous deployment setup and a couple of applications to handle writing and publishing your posts. I have already explained how Netlify has helped me achieve the first part. Below is how I circumvent the second challenge4.

  1. Write: It is important you can write from multiple places, especially your mobile devices. Web editors of the blogging platforms allow you just that, keeping your drafts ready for you to pick up from where you left earlier. A static website lacks this and so calls for some other alternatives.

    I use iA Writer to write all my posts. It has apps for all the platforms I own, iOS and macOS. It allows me to focus mainly on writing, automatically saving all the words as I write them. It also keeps all the posts synced up across all the devices, granting me the convenience of cross-device writing that web editors enable. I find it goes even a step further as it provides me a consistent experience across all the platforms, as compared to the messy state of online writing — more on this later.

  2. Deploy: Doing this from a desktop was always trivial; handling it from mobile was what puzzled me. However, I managed to get a workable solution. Once I have the post ready, I use Git2Go to push the final Markdown (.md) files to my Git repository. Netlify does the rest, making the post available on the website. For any minor modification, it is just an update to these .md files with iA Writer and a push via Git2Go.

  3. Workflow: Though it is easy to say just write in one place and push to the repository, it would take a significant effort to get a file ready in the Hugo-defined format every time. Any generator needs some added metadata – front matter, in Hugo land – embedded along with the content to create a serve-ready HTML. Adding this to every post I write would have been a downer, especially with my intention of writing different types of posts – fiction, non-fiction and links.

    It needed Workflow, I mean literally. The Workflow app is a boon for anyone who wants to automate common tasks on iOS. It is a powerful tool with hundreds of actions that can be easily stitched together to create a workflow — one that can handle complex tasks with a single click. For example, here’s my workflow to get a link post ready.

  • Once I find a link I want to share and add some comment on, I open it in Safari and just copy the content I want to comment on
  • I trigger the relevant workflow from the share sheet
  • Workflow then
    • fetches the template for the link post
    • adds the link to the metadata as the source URL
    • adds the current date and time as the post date
    • adds the title I provide (or the title of original post if none provided)
    • takes the content from clipboard, adds to the content body
    • saves the file and opens it in iA Writer to be edited further
  • Once the post is ready to be published, I just export the post as Markdown from iA Writer and import it to Git2Go in the content section.
  • Commit & Push, and the link post is ready on the website.
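The steps above can be sketched in code. This is not the Workflow recipe itself – just a rough Python equivalent, where the front matter field names (title, date, sourceurl), the slug rule and the file layout are illustrative assumptions, not the exact template:

```python
import datetime
import pathlib

# Rough stand-in for the Workflow automation described above; the front
# matter fields and folder layout are made up for this sketch.
def make_link_post(title: str, source_url: str, quote: str,
                   content_dir: pathlib.Path) -> pathlib.Path:
    date = datetime.datetime(2019, 9, 30, 12, 0)  # fixed date, for the example
    front_matter = (
        "---\n"
        f'title: "{title}"\n'
        f"date: {date.isoformat()}\n"
        f"sourceurl: {source_url}\n"
        "---\n"
    )
    body = f"> {quote}\n"  # the copied text, quoted as the post body
    slug = title.lower().replace(" ", "-")
    content_dir.mkdir(parents=True, exist_ok=True)
    post = content_dir / f"{slug}.md"
    post.write_text(front_matter + "\n" + body)  # ready to edit and push
    return post

post = make_link_post(
    "A great read", "https://example.com/article",
    "The most interesting line from the article.",
    pathlib.Path("content/links"),
)
print(post.name)  # -> a-great-read.md
```

The resulting .md file is what gets opened in iA Writer for further editing and, once ready, pushed to the repository for Netlify to build.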

This is just one such workflow. I have managed to create one for every type of post I write. Workflow does the routine heavy lifting for me, allowing me to focus on writing.

I know even this much effort might be overwhelming for many. The act of building, managing and updating static websites is not everybody’s cup of tea. For me, however, this triplet of enablers – iA Writer, Git2Go and Workflow – serves me well. I have never been so satisfied with either the website or the process involved in publishing to it. I am pretty confident this setup, with minor modifications, will last long.

  1. Heng’s website theSiteWizard is really a great resource for anyone interested in building or maintaining their own website. It doesn’t matter the scale – be it an individual blog or a full-fledged website for your group.

  2. Of course, dynamic generators, especially CMSes like WordPress and others, do handle this well nowadays by caching the HTML pages to avoid the unnecessary delays of generating and delivering the pages to the end users.

  3. What You See Is What You Get

  4. I focus mainly on the platforms I own, i.e. iOS and macOS. Of course, if you have a different set of devices, your solution may vary, or may not exist at all (though chances of that are slim).