Introduction
One of the challenges of building a start-up is that you are always weighing how much time and effort to put into each aspect of your business. Obviously this includes the core service you are building: should you build something that scales from day one or a disposable proof of concept? Do you build one-off services or reusable tools? But it also applies to every other aspect of running the business: where do you market your services, how many meetings should you set up, how much time and money should you spend on business services, branding and so on.
One of the easiest things to set up might be your website. If you choose to go with AWS, for example, you can register a domain, add your content to S3, put CloudFront in front of it and have a decent website hosting all your static content up and running in a couple of hours. This absolutely meets the definition of a minimum viable service: you can remove it from the to-do list and move on to the next task.
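As a rough sketch only (the bucket name and paths are illustrative, and in practice you would also want an HTTPS certificate and a DNS record pointing your domain at the distribution), the AWS CLI version of that setup looks something like this:

# Create a bucket and upload the static content
aws s3 mb s3://example-startup-site
aws s3 sync ./site s3://example-startup-site

# Put a CloudFront distribution in front of the bucket
aws cloudfront create-distribution \
    --origin-domain-name example-startup-site.s3.amazonaws.com \
    --default-root-object index.html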
However, I think there are a few additional steps which take only a few hours to implement but are well worth it in terms of elevating the overall site experience. I didn't find these collected in one place, so I thought it would be useful to document them here.
Information for Search Engines
Any public website can include a "robots.txt" file, in the root directory of the website, which tells well-behaved search engines what to index. This isn't essential; in the absence of the file, search engines will probably index your whole site. But if you are using common JavaScript files, such as the Bootstrap framework used on this site, then it's advisable to exclude those files so people don't start using your site as a download location for common libraries.
On this site the common libraries are in a /assets/ directory and the "robots.txt" file looks like this:
User-agent: *
Allow: /

# Exclude technical assets that shouldn't appear in search results
Disallow: /assets/
Disallow: /*.css
Disallow: /*.js
This should prevent library files such as "bootstrap.css" from being discovered via a web search. It should also be noted that malicious crawlers (and there are many of them) may use this file to look for hidden content, so do not rely on it to protect secrets or confidential information.
Security
It is also best practice to publish your security contact details so that security researchers can easily report any issues with your site. The standard convention is to create a ".well-known" subdirectory off the root of your website (the leading "." is important) and then create a "security.txt" file in this directory with your security contact details. Again, anyone can read this, so best practice is to include a reporting email address which you check frequently, unless you really do run a 24 x 7 incident response centre, in which case you should include that instead.
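As an illustration (the address and expiry date are placeholders), a minimal "security.txt" looks something like this:

Contact: mailto:security@example.com
Expires: 2026-12-31T23:00:00.000Z
Preferred-Languages: en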
Humans
The next step is completely optional, but as well as a "robots.txt" file you can also include a "humans.txt" file at the root of your website. There is no formal standard for this, but the convention is to list the people who have worked on the site and any other people or projects you wish to thank; this site includes a shout out to some of the open source development teams whose work we use, for example.
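As a sketch (the names and details are placeholders), a "humans.txt" following the common convention looks something like this:

/* TEAM */
Founder and developer: Jane Example
Site: www.example.com

/* THANKS */
The open source teams whose frameworks this site uses

/* SITE */
Standards: HTML5, CSS3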
Favicons
I don't think it's ever too soon to start creating a visual identity for your site, and this includes picking an icon for your brand. You might not want to lose hours or spend thousands on branding for a minimum viable product, but having a credible placeholder for your look and feel is probably a worthwhile investment. I used design.com to create an icon; I'm not 100% in love with it, but at least it starts to create a visual identity for the site and can be iterated on later.
Once you have an icon, it makes sense to save it as a favicon file to be used in browser address bars and bookmarks. But like many things on the web, there seem to be multiple standards for this "simple file".
There is a good blog post on this topic here - How to add a favicon to your website — The modern browser guide - or you can consult the very detailed Wikipedia article here - Wikipedia - Favicon. I used one of the many free favicon creation sites to take the large icon file I created at design.com and resize it to each of the sizes suggested in the blog post; it took about 10 minutes to create and download the relevant images, which I then uploaded to the root directory of my website.
In practice this seemed to be enough for most browsers and mobile devices. But to be certain I also added the icon references to the site homepage as follows:
<!-- For all browsers -->
<link rel="icon" type="image/png" sizes="32x32" href="/favicon-32x32.png">
<link rel="icon" type="image/png" sizes="16x16" href="/favicon-16x16.png">
<!-- For Google and Chrome -->
<link rel="icon" type="image/png" sizes="48x48" href="/favicon-48x48.png">
<link rel="icon" type="image/png" sizes="96x96" href="/favicon-96x96.png">
<link rel="icon" type="image/png" sizes="192x192" href="/favicon-192x192.png">
<!-- For iPad -->
<link rel="apple-touch-icon" type="image/png" sizes="167x167" href="/favicon-167x167.png">
<!-- For iPhone -->
<link rel="apple-touch-icon" type="image/png" sizes="180x180" href="/favicon-180x180.png">
Initially I thought the Apple icons might not be needed, but with the iPad being more powerful and people using them as a second screen alongside a laptop, or as a laptop replacement, it made sense to add the Apple-specific icons. Also, having a dedicated icon on your iPhone or iPad homepage for when you want to run a quick demo just looks better.
An additional benefit is that many browsers will request a favicon from your site anyway; adding one, even as a simple placeholder, will reduce the volume of "404 Not Found" errors in your website logs.
Search Engine Submission
You may feel that it's too early to submit your website to search engines, but remember, as soon as it's open to the Internet it is likely to be discovered and indexed. Therefore it's always worth controlling the process and submitting it yourself.
In addition, search engines tend to rank sites higher the longer they have existed, so again it makes sense to start early. By manually submitting your site to Google and Bing and providing a contact email address you get access to some useful free tools which explain how to start optimising your site for better search performance and results.
I submitted this site to:
Google - Improve your performance on Google Search
Bing - Bing URL submission Options
After this I used a site called Free Web Submission to submit the site to the next 40 search engines. It prompts you to upgrade to a paid plan a couple of times but otherwise seems to have done the job.
Terms and Conditions and Privacy Policy
If you are saving any data from your end users, including login details or email addresses, then both a terms and conditions document and a privacy policy become essential.
Note: if you are in the UK and recording any data about your users, you most likely need to register with the Information Commissioner's Office. There is a useful questionnaire on their website which determines how much you have to pay; if you are a small business and data processing isn't the main function of your business, the fee is around 50 pounds per year, but there can be quite large fines if you don't register.
Here are the Clouds and Light Terms and Conditions and Privacy Policy. For the terms and conditions, because Clouds and Light allows user-generated content, I included sections on copyright and on prohibited content. As this is a development site, the lack of warranty for any service is also clearly stated, as is the fact that this is a UK company operating under the laws of England and Wales.
The privacy policy is arguably even more important for any site which collects data. Here I took the time to document what data is collected and what it might be used for. I also included placeholder sections for services which aren't in place yet (such as payment processors) but which may be used in the future. The policy also reiterates users' rights under the General Data Protection Regulation. Because users have the right to see all the data held on them and to have their data deleted, I have built some tools which implement this for the database tables we use in AWS. I also included the ICO registration number for the company.
If you use third party services such as "Sign in with Amazon" or a number of payment and analytics services they will often ask for a link to your privacy policy as part of registration, so it's advisable to create it sooner rather than later.
At present the only cookies used by the site are generated by the site itself for user personalisation. This means we don't need to seek cookie consent yet. However, if we add a third party service such as Google Analytics, then cookie consent will need to be added to the site.
Open Graph and LinkedIn / Facebook
The final recommendation really only applies if you (or someone else) are likely to post your pages on LinkedIn or Facebook.
LinkedIn uses the Open Graph meta tag format for page preview descriptions. In practice this means adding four additional tags to any page you wish to share and ensuring you have an appropriate image linked; the process is described here - Make your website sharable on LinkedIn.
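For illustration, the four tags for a single page (the titles and URLs here are placeholders) look something like this:

<meta property="og:title" content="Clouds and Light - Example Page" />
<meta property="og:type" content="website" />
<meta property="og:image" content="https://www.example.com/images/preview.png" />
<meta property="og:url" content="https://www.example.com/example-page.html" />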
The catch I found here is that LinkedIn seems to keep its own cache of your pages. So if you go to share your website and the preview doesn't work, it isn't as simple as updating the tags on your website and hitting refresh on LinkedIn. So again, this step is worth implementing sooner rather than later.
Conclusion
This isn't an exhaustive or definitive list of steps to optimise your website. Rather, it's a set of things I've discovered over the last five months of building out a public website which I didn't find gathered in a single guide.
Implementing the steps above might take you three to six hours. However, I think they are all worth it: they make your website more visible and more professional, and give it documentation which could protect you in the event of a dispute.
The blog pages don't support comments (yet), but any thoughts, ideas, comments and corrections can always be sent to "support@cloudsandlight.com".
Test Site Now Live
The platform is now open for testing and is completely free to use.
The site has four example courses and a fully functioning course editor, as well as integration with your test AWS account. Not every feature is complete, but we are happy to run demos and discuss the roadmap for the platform.