I originally compiled this web-site primarily for my own amusement, not expecting many people to visit it.
I have therefore been pleasantly surprised to find a number of surfers who have ancestors who served on one of the ships involved in the hurricane at Apia, Samoa in 1889, or who are interested in the history of those dramatic events, and who have taken the trouble to make contact and provide additional data to expand the site's scope.
The web-site started off at Geocities and, being my first effort to compose a web-site, was utterly dire. I still have my old back-ups on CD, so I know! Thank goodness Geocities is only active in Japan now. The site then moved to somewhere else (that I've forgotten) but didn't get much better. I made a stronger effort when I uploaded it to Tesco, and then to Homecall with a very similar version. Hopefully, the standard has improved dramatically over the years. I have now purchased my own domain name and use Freeola as my commercial hosting service. I have retained the Homecall site whilst I get the Freeola site sorted out, and will then replace all the pages on the Homecall site with "Redirects".
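For anyone wanting to do the same, a redirect page on a free host is usually just a "meta refresh": a tiny HTML page that immediately sends the visitor (and any search engine following old links) on to the new address. A minimal sketch, assuming the new home is at www.grahamhague.com (the index.html path is only an example):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- "0" means redirect immediately; the target URL is an example path -->
  <meta http-equiv="refresh" content="0; url=http://www.grahamhague.com/index.html">
  <title>This page has moved</title>
</head>
<body>
  <p>This page has moved to
     <a href="http://www.grahamhague.com/index.html">www.grahamhague.com</a>.</p>
</body>
</html>
```

On a paid host a server-side "301" redirect is kinder to search rankings, but a meta refresh like this is often the only option a freebie host allows.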
As of July 2016, I have extensively revised the site in an attempt to make it more user-friendly for tablets and mobiles, i.e. to make use of "Responsive Design". Google's site statistics showed that around 60% of the clicks to my site originated from mobiles and tablets, so I realised I must improve the site for SEO (Search Engine Optimisation) as well as for these "other devices"; the results of my efforts are shown in the green box below.
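For anyone curious what "Responsive Design" involves on a hand-coded site like this one, the essentials are a viewport meta tag plus CSS media queries. A minimal sketch; the .column class name is purely illustrative, not taken from this site:

```html
<head>
  <!-- Tell mobiles to report their real screen width, not a pretend desktop one -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    /* Illustrative layout: two columns side by side on a desktop screen... */
    .column { float: left; width: 50%; }
    /* ...but stacked full-width on narrow screens such as phones */
    @media screen and (max-width: 600px) {
      .column { width: 100%; }
    }
  </style>
</head>
```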
I have also added some German versions of paragraphs on pages containing data that might be of interest to a German visitor. Hopefully, English-language visitors will forgive this addition. I know German visitors could use Google to translate the entire page into their language, with the same degree of accuracy as Google Translate gave me for the paragraphs, but as I have yet to receive a single German visitor that I know of, I decided that including some German text might help to attract them in their searches, especially for keywords such as "Apia, Samoa, Hurricane, Adler, Eber, Olga".
Page down to the very bottom to find links for all the sections of my site, including the "Samoa Story".
If you would like to contact me, please use the following e-mail address:
By the way, if you do contact me, I would be interested to know how you got to this site: what you were searching for, and how easy it was to find. It will help me to understand how the site is listed by the search engines.
Click on the sub-links which have opened up in the navigation panel on the left-hand side of this window to read about Scaramouche and Rafael Sabatini.
On any of the pages of my site, clicking the small image of a sword fencer (it is supposed to be André-Louis Moreau, aka Scaramouche) in the top left corner will take you to the home page, in the same way as clicking "HOME" at the bottom of the page will.
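For the curious, a clickable image like that is nothing more than an <img> wrapped in a link; the filenames below are guesses for illustration only:

```html
<!-- Hypothetical filenames: the fencer image, linked to the home page -->
<a href="index.html">
  <img src="scaramouche.gif" alt="Scaramouche - return to the Home page">
</a>
```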
Incidentally, I recently installed Windows 10 on my PC and am in general terms happy with it, though it took 3 1/2 hours to complete the download and installation. I found Microsoft OneNote to be nothing like its tutorial, which is a pity, as the tutorial made it look quite powerful when in fact it is a glorified Notepad. I now think the tutorial referred to the version of OneNote you get when you buy the 2016 Microsoft Office suite; the version free with Windows 10 is presumably a cut-down program. I did, however, find a problem with Microsoft Edge not properly loading a web page's text. I found the text often stopped abruptly, and a refresh then seemed to load it all. I don't know if this is a common problem, but if you think some text seems to be missing from a page you access, try the refresh trick. I still prefer Google Chrome, which doesn't seem to give these problems. [As of April 2016 this problem with Edge seems to have been sorted, but there is another one: see Problems Outstanding below.]
I hope you didn't expect to find out much about the modern-day Scaramouche who compiled this web-site. You wouldn't find it very interesting if I did put in anything about myself; that is, apart from the data about my family history, accessible on the HOME page. I put that in in case a Hague family member happened upon the site by searching for the name. It might get listed, but I doubt it; when I've tried it, the list is swamped with results for "The Hague".
Did you know that since 2013, the British Library has been automatically harvesting details about UK web-sites as a special part of its long-existing "Legal Deposit" scheme? It doesn't need to ask your permission, but apparently does tell you if it has harvested yours. I guess that since you have published to the web, you will not mind this action; you might even approve. But it will also apparently harvest password-protected areas of your site (they request access from you, and you have to comply within one month). I have not received any notification about this site (as of July 2016), but as the notification goes to the "web-master", this may be my freebie broadband supplier "Talk-Talk", which provided the web-space, and not me anyway.
As of February 2016, Scaramouche has finally discovered the wonders of CSS! It won't show up much when navigating my web-site, but it is a little more modern and so should be compliant with all web browsers. I can heartily recommend the website W3Schools as the source of my new-found knowledge, such as it is! (Please don't "View Source", as you will see how bad my implementation is.)
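If you are starting out with CSS yourself, the usual arrangement is one external stylesheet linked from the head of every page, so the whole site can be restyled by editing a single file. A minimal sketch; the filename and rules are just an illustration:

```html
<!-- In the <head> of every page -->
<link rel="stylesheet" href="style.css">
```

```css
/* style.css: one shared file styles every page that links to it */
body { font-family: Arial, sans-serif; }
h1   { color: navy; }
```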
You can get a comprehensive review of your web-site, or rather, in most cases, of a single page of it, using the following links; all are free. One is WebsiteGrader.com: you simply paste in the URL of your home (or index) page, and the results for this web-site's Home Page are shown in the green box below. I think that if you paste in the URL of another page on your site, it still only checks the home page. Another good tool is Broken Links Checker. The previously mentioned W3Schools also provides a web-page checker at Validator, which gives significant help in improving manually coded HTML, and does analyse the web-page URL you enter. There is also a free XML "Site Map File Generator" (see below); this one analyses your whole web-site for pages. Finally, two useful Google tools. Test My Site checks how your web-page performs on Desktop, Tablet and Mobile, gives you a report and suggests fixes; click here for the page it gave me for this site (in truth, it doesn't seem to send the report). There is also Web Page Analyzer, which checks how well your page appears on mobiles; this is an image of what it said about this site's Home page. Both these Google tools appear to analyse the entered page URL only; this is what it said about this page.
I recently discovered this site: The Site Wizard. I have only just started perusing it, but at first sight it seems to provide a great deal of very useful information for the amateur setting out to create a personal web-site. One of the things that resonated with me was the page about free domains (like my old site at http://www.scaramouche.homecall.co.uk/) and the risks associated with free providers, such as them suddenly deciding to no longer provide the service: my old site at Tesco simply closed down, losing all the Internet-wide links and Search Engine rankings. It therefore rang the bell in my brain loudly enough for me to buy a domain name, and publish the site using a commercial paid-for provider, or host, the result being this site: www.grahamhague.com
Google have some very useful free guides, see the Webmaster Academy for an excellent guide to creating or improving your personal web-site.
The following are not free tools, but as they follow on from the advice just given, I'll include them here. I now use Freeola for the hosting (VIP version at £10.47 per quarter plus VAT) and GetDotted for the "dot com" domain registration (at £11.98 per year plus VAT); prices are as of August 2016, and the domain price includes an additional "WhoIs" privacy option. I am not necessarily recommending them, and there are cheaper options out there, but they seemed reasonably competitive, and a friend has used them for a while and finds their customer support very good and downtime very low. I have heard that the phrase "You get what you pay for" is VERY relevant when it comes to finding a web host. I sent a question whilst setting up on a Sunday, and got a response and solution within minutes. But if you decide you want a commercial presence on the web, choose your own! As the "Site Wizard" web-master said, if my current provider falls over, I can simply find another, and with the domain name set for me (and which needn't change just because of a change of host), all Search Engine links will seamlessly point to the new provider. I have decided the cost is worth it, for the convenience, the more professional presence (instead of the inclusion of a known freebie provider in the address), and the protection against free-space companies withdrawing the service in future. The downside at the moment is getting the new site to have the same prominence in the Google rankings as the old one had, but once it (hopefully) does, I won't have to do it again.
When I first encountered the message that I needed to include a sitemap, I was a little indignant, as I had long included an HTML Site Map page on my site. But in fact, search engines look for a special site map file called "sitemap.xml" which lists all the URLs of the pages that make up your site. You can create this yourself in Notepad (the format is not particularly obscure), but see the next paragraph, and especially check out the Site Maps Protocol web-site for information about it.
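For reference, a "sitemap.xml" file really is little more than a list of page addresses in a thin XML wrapper; the page names below are examples only, and the optional <lastmod> tag records when a page last changed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page of the site; the URLs shown are examples -->
  <url>
    <loc>http://www.grahamhague.com/index.html</loc>
    <lastmod>2016-07-01</lastmod>
  </url>
  <url>
    <loc>http://www.grahamhague.com/sitemap.html</loc>
  </url>
</urlset>
```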
I found this "Site Map Generator", which I used to generate, download, and upload to my web-site the "sitemap.xml" file. I will wait and see if anything seems to happen. Everything seemed to work OK with the generation, uploading, and access of the new files. BTW, the Generator linked above is a freebie, and seems to have worked fine. You could do it yourself manually if you didn't have too many pages, but the paid-for version looks like it might be a good buy if you have a large, complicated, frequently revised web-site to manage. Google Search Console does like the fact that I have a sitemap.xml file, as does Web-Grader (see above). So I would advise uploading one to your site, especially as you can generate it for free!
I went to Google Account, first to "Verify" that I owned the web-site (this involved uploading a Google file to the site), after which it will (presumably automatically) be checked by Google Search. You can also access some statistics; I don't know what as yet, but it seems Google will use the "sitemap.xml" file generated above (simply a list of every linked URL on your site) to improve(?) its searching algorithms.
Well, the Google Account page gives me access to all sorts of statistics for my web-site. For example, in the last 28 days (up to 8th July 2016) I had: 7 clicks in the UK; 3 in the US; and 1 each in Canada, American Samoa, New Zealand and Samoa. I also had: 9 on desktop; 3 on mobile; and 2 on tablet, which rather reinforced the need for me to make the site more accessible to these other devices. Since I checked this, desktop traffic has reduced to less than 50%. These statistics are very interesting; I wish I had found out about them long ago. The traffic from mobiles and tablets seems to have been gaining in importance over the last few days alone.
I have also added "Google Analytics" to all the pages of my site that refer to the hurricane; I haven't got to the bottom of the reports this generates yet, but it does seem to be very comprehensive. Using the data collected is something I need to understand more fully. One thing it told me was that recently I had received more traffic from the US than the UK, and that 70% of my recent visitors were new rather than returning. I could also determine that the day before I checked, a visitor had started at "Apia Hurricane", viewed "Samoa Hurricane", "HMS Calliope", "Final Destruction", "The Other Ships" and ended on "Eber Crew Stories" in a session lasting more than an hour. All very interesting. To activate it, all you do is paste a short section of HTML code into the "head" section of each page you want to analyse. It seems to work extremely well. I'm impressed by everything Google does (except paying its UK taxes)!
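The pasted code in question (for the Universal Analytics service as issued around 2016) looks roughly like the sketch below; use the exact snippet and tracking ID that Google Analytics generates for your own account rather than copying this:

```html
<!-- Goes inside <head> on every page to be tracked.
     UA-XXXXXXX-Y is a placeholder: substitute your own tracking ID. -->
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXXXX-Y', 'auto');
ga('send', 'pageview');
</script>
```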
A check of the Search Statistics on the old Homecall site on 20 July 2016 showed (over the last 28 days): 8 from desktop; 6 from mobile; 4 from tablet. So at the moment, desktop access is now less than 50% of my traffic. Just as well I tried to make the site responsive.
I have read Google's excellent Webmaster Guides, and discovered that much of my work was rather bad practice. My "Description" tags were especially poor, and I have amended them on all the pages. Titles were not very good either, so I have addressed that too. I am hoping this will mean that the new site will begin to appear in the Google listings without too much delay. I uploaded the files to the new Host on 17th August, and added the Property to the Google Search Console. I did the usual Google Bot "Fetch & Render" and "Submit to Index". A page on the new site first appeared in the Google listings on 24th August. Since then, it has slipped down the listing and the other pages stubbornly refuse to appear. Google Analytics shows no site activity as yet, and Google says it has only indexed around 8 of the 90 pages on the new site. I guess this is simply a matter of waiting for Google to index more pages; it had only indexed 59 (of 90) on the old site after a few years! But I do hope my improved HTML will be reflected in more Google listings. I heartily recommend reading Google's Webmaster Academy material if you are starting out (or, like me, have an old site) with a web presence.
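By way of illustration, a page head with its own specific title and description might look like this; the wording below is made up for the example, not copied from the site:

```html
<head>
  <!-- Each page should get a unique, specific title and description -->
  <title>The Apia Hurricane of 1889 - HMS Calliope's Escape</title>
  <meta name="description" content="An account of the March 1889 hurricane
    at Apia, Samoa, which wrecked the American and German warships in the
    harbour, and of HMS Calliope's escape to the open sea.">
</head>
```

Search engines often display the description text beneath the page title in their listings, which is why a vague or duplicated one is such a wasted opportunity.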
I did put a "robots.txt" file onto both my sites, intended to exclude the Google-Bot Image reader from harvesting images from my sites. Some of them have appeared on "Google Images", and whilst there is nothing particularly sensitive about them, I have posted images of people's family members and have on some occasions been unable to obtain permission to use them. Those particular ones I got from the internet; I do have permission for the ones sent directly to me by family members, but I don't want to perpetuate rudeness, so I thought I ought to keep all my images out of Google Images. But ever since I uploaded the file, Google Search has stated that it couldn't index my site because the "robots.txt" file was "Unreachable". It definitely existed, but "unreachable" means Google Bot found it but was not allowed to access it. Because there might have been exclusions in it, which of course it wasn't able to establish, Google decided to abandon ALL attempts to index the site. Unable to establish what was going on, I have now removed the file from both hosts. If Google cannot "Find It", as opposed to "Reach It", it assumes everything on the site can be accessed by its Bots. But having made an attempt which failed, it may be some time before Google makes another attempt to index the new site. And it doesn't seem to index the whole site in one go; it seems to do just a few pages each time.
Some people using commercial Web-Site Creators seem to have awful trouble with the "robots.txt" file, which can be created automatically in the background (so you know nothing about it) and can even be "hidden" or "virtual", so you don't see it, and therefore cannot remove it from the uploaded files. This is a bit unfriendly. The file is used to tell search-engine crawlers if there are pages, or images, or whatever, that you don't want them to index; that raises the question of why create them in the first place, but I guess it might be useful in some cases. I wanted to stop the Google-Image-Bot from harvesting the images on my site without my permission. Obviously, anyone viewing the page could still do a "Save As" on an image, so it wasn't crucially important, and since it caused problems, deleting it seemed a good idea. I would think few "personal" web-sites need to make use of it.
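For the record, the conventional file for exactly that purpose is only a few lines, placed at the root of the site. It must actually be reachable at your-domain/robots.txt without errors, or crawlers may react as described above:

```
# Keep Google's image crawler away from everything...
User-agent: Googlebot-Image
Disallow: /

# ...but let all other crawlers index the whole site
# (an empty Disallow means "nothing is disallowed")
User-agent: *
Disallow:
```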
Google occasionally lists some of my pages as being not mobile-friendly, the message being a hyperlink to a page that tests the page and describes it as "Awesome" for mobile-friendliness! I don't understand that, but the messages do seem to be gradually disappearing.
And when I "index" my site on Google Search Console, the "desktop render" Googlebot does not render the page as Chrome does during browsing. I thought this might simply be a delay in "fetching" the site, but it's not. Again, I don't understand this, and may need to seek some help about it. At least the "mobile render" Googlebot is OK. Update: this was the stupidest thing! My browser window was not full screen, so the Google Bot image displaying the two screen views of my web-site showed each with a horizontal scroll bar, NOT because that is how it would appear to users, but because the browser window size I was using did not permit it to display them completely! Maximising my browser window showed that the Google Bot DID indeed see the screens as they were designed, with NO horizontal scroll bar.
I wish there was a site on the web into which you could enter the URL of your web-site, and then view the site as it would appear rendered in each of the various browsers. If you know of one, please let me know at the e-mail address above! In Google Analytics, you can choose to view by "Audience", "Technology" and "Browser & OS", which lists the browsers used by visitors to your site, and you can even refine this to "Browser Version". A popular browser among visitors to my site appears to be "Safari". But I still haven't found a way of seeing what such "other browsers" make of my site. Well, I did find a few, but no freebies. The site "http://browsershots.org/" seemed to offer a free service, but although it seemed to be doing something when I tried it out, nothing materialised! The afore-mentioned Site Wizard does have a page about testing your web-site in different browsers, and recommends installing a number of different browsers on your PC to allow you to do so. This is probably the best solution. I haven't followed the advice myself yet, but will probably do so. I have tested this site on IE 11 (installed on my Windows 10 PC, though it may have been "left over" from an earlier Windows version) and Microsoft Edge, and although the fonts appear rather different in these browsers than they do in Chrome (I don't know why for sure, but believe it is something I have failed to do in the CSS file), the content and layout seem OK.