Basic SEO (for initial traffic)

Summary – write relevant, quality content; update it frequently; target users who are ready to buy; ensure your site runs fast through caching and content delivery networks; submit sitemaps to search engines; separate out multilingual content; and build links back to your site around the web.

As demonstrated by many SEO ‘experts’, whose years of work can be turned on their head in an instant when major search engines (OK, Google…) change their ranking formulas, SEO is not an exact science, and strategies must stay flexible in order to work in the long term.

However, there are several ‘core tenets’ of SEO that all site owners should be aware of. Implementing these will ensure a basic level of search engine optimisation and an initial flow of traffic. With these in place, the right people will find you, and stay with you long enough to see what they need to. Converting these visitors into actual sales is another matter altogether, one I’ll tackle after these sections on SEO.

The best thing about SEO, however, is that for the most part it is free to carry out. It’s great for business as it requires only a semi-regular time investment, and it can often single-handedly speed up business growth. Helping search engines to index you, and the best clients to find you, is a good strategic move for any business concern, but especially for translators, whose work is nowadays mainly bought and sold online.

In SEO there is a general distinction between ‘on-page’ and ‘off-page’ techniques. On-page work is conducted on the site itself; off-page work is promotion that boosts the rank or reputation of a site around the internet. Let’s take a look at the basics of both.



On-page

The three main on-page aspects of a website, from a search engine’s perspective, are:

·        Content quality

·        Code correctness

·        Site indexing speed

A search engine has many thousands of computers ‘crawling’ the web for new sites and updates all day, every day, using formulas (search algorithms such as Google’s PageRank) to determine site quality, so it pays to create a site that is easy for them to find. The algorithms currently in use strive to give the highest ranking to the sites most useful to a user’s search, i.e. those providing the most relevant content. Site owners should therefore aim to provide content that is easily found and understood by the engines, while remaining genuinely useful to human readers.

To produce content of the quality that search engines rank highly, you need to consider your use of keywords. These used to be stuffed into the bottom of a page before search engines got wise to the trick; now the engines look for relevant keywords used naturally in the body of the text, as well as elsewhere.

Content quality factors

First, research the keywords most relevant to your target visitor. This can be done with Google’s own Keyword Planner. Be sure to log in to a Google (AdWords) account to get the full analysis of keywords related to your content.

The use of keywords in copy must be natural, yet considered. Cramming hundreds of search terms into your content just to draw search traffic will produce stilted copy in your articles and sales pages, and won’t convert into sales or visitor engagement.

Content must therefore have an engagement factor to keep people on-site: to get them talking about your content, sharing it online and offline, and to stop them ‘bouncing’ away (leaving immediately), which is a search engine ranking criterion in itself.

Your content must be relevant to the visitor’s needs and the search they carried out. To be most effective it must engage their interest and get them talking, thinking or acting. A sale, a retweet, a referral, a comment; any of these are the hallmarks of a successful ‘piece of content’, be it a blog post, sales page or site widget.

When it comes to content, the basic rule is quality over quantity, but if the quality is in place then having quantity multiplies the effect.

Generally speaking, blog and site owners find that the more content they have, the more visitors they have. If the content is relevant and useful, then they move up the search engine rankings and the effects snowball. If the content is relevant and useful to target clients, then converting visitors to clients is that bit easier.

Consistent content writing also helps to engage readers and show the search engine that the site is fresh, recent and relevant. Aiming to write hundreds of articles over a number of years is a solid goal to establish a reputable web presence. But all of this would be for nought if the search engines had trouble indexing it in their databases. We need to help them to do so, and the way to do that at the moment is to speak directly to them in HTML.

Fluency in HTML

HTML is the ‘language’ of the internet, in which web pages are presented. Ensuring your HTML is correct and complete can be automated and made easy, but it must be done for an optimal search engine ranking.

If you are using a CMS as laid out above, you can add plugins or modules to your site that will automate this process for you. If you are hand-coding your site, I’ll assume you know the crucial importance of tagging titles, descriptions and headers correctly, and let you move on to the next paragraph on the use of keywords in your title and header copy.
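If you’d like a quick sanity check on those tags without a plugin, here is a rough sketch in Python using only the standard library. The length limits (roughly 60 characters for a title, 160 for a description) are commonly cited rules of thumb, not official search engine values, and the function names are my own:

```python
# Sketch: check that a page's <title> and meta description exist and fall
# within commonly cited length limits. Limits are rules of thumb only.
from html.parser import HTMLParser

class HeadChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name") == "description":
                self.description = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def check_head(html):
    parser = HeadChecker()
    parser.feed(html)
    return {
        "title_ok": 0 < len(parser.title) <= 60,
        "description_ok": 0 < len(parser.description) <= 160,
    }
```

Running `check_head()` over a page’s HTML tells you at a glance whether the two most important head tags are present and sensibly sized.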

The major point on keywords is to be sparing in their use. There is often discussion around ‘keyword density’ (trying to find the ‘perfect’ frequency at which keywords should appear in titles and content), but no definitive answers ever emerge. It is generally accepted that over-using keywords is bad (known as stuffing), and that pluralising keywords to pretend they are new words, among other grammatical tricks, is not a reliable strategy.

So when writing your page titles, descriptions and headers, be sure to use your keywords; just don’t repeat them too often. I was told on good authority that the site developers of a major broadsheet newspaper in the UK aim for 5–10% keyword density on all headlines and page titles. But as Google representatives themselves have never given a definitive answer on this, it is best to stick to the basic tenets of ‘don’t stuff’ and ‘be natural’ with keywords; just make sure you use them.
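If you want to put a rough number on your own copy, keyword density is simple to compute: occurrences of the keyword phrase as a share of total words. The sketch below is illustrative only, remember there is no ‘correct’ density to aim for:

```python
# Sketch: compute keyword density of a phrase in a piece of copy,
# as a percentage of total words. A rough check against stuffing only.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase = re.findall(r"[a-z0-9']+", keyword.lower())
    if not words or not phrase:
        return 0.0
    n = len(phrase)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase)
    # Each occurrence accounts for n of the total words.
    return 100.0 * hits * n / len(words)
```

For example, a 12-word paragraph that uses a two-word phrase twice has a density of about 33%, which is almost certainly stuffing; most natural copy sits far lower.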

Site indexing speed

Fast indexing makes it easier for the search engine to index your site, as it uses fewer of its resources, and is generally seen as favourable. Visitors to a fast site also view more pages, which is good for business. The main aspects of indexing speed are site load time, crawlability and URL relevance.

Sites that are slow to load are penalised in the rankings, so ensure that your site images are suitably compressed (jpg/gif over png/bmp, depending on the quality required) and that your pages are not full of sidebar widgets and scripts that lead to excruciatingly slow load times. Use Google’s PageSpeed tool to check your site, or Pingdom’s equivalent if you’d prefer.

If you have some ‘multimedia’ elements to add to a page, be they video, audio or image, consider embedding them from a ‘content delivery network’ (CDN) which hosts files and caches pages, serving them quickly to visitors from servers located as close to them (geographically speaking) as possible. This will involve a little documentation reading to set up, but will be worth it for the increase in ranking that can be achieved. For text-only type sites, this is a lower priority, but all sites could benefit from seeking out a caching plugin or service.

Crawlability factors include having what’s known as a ‘sitemap’, preferably in XML format. Again, CMS plugins can automate this process for you, so I recommend you stick to a CMS and seek out a sitemap plugin. A sitemap allows the search engine to quickly see whether you have made any changes to the site, and to rapidly digest the page structure.
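To demystify what a plugin produces for you, here is a sketch that generates a bare-bones XML sitemap with Python’s standard library. The URLs are illustrative, and a real sitemap plugin will add more detail (change frequency, priority and so on):

```python
# Sketch: build a minimal XML sitemap in the standard sitemaps.org format.
# A CMS plugin does this for you; this just shows the shape of the file.
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """pages is a list of (url, last_modified_date) tuples."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    ("https://example.com/", "2024-01-01"),
    ("https://example.com/services", "2024-01-15"),
])
```

The resulting file is what you submit to the search engines (see the official listings section below); the `lastmod` dates are what let a crawler spot your changes quickly.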

Your URLs also need to be relevant; that is to say, the address of each page should be human-readable (lukespear.co.uk/ebooks, not lukespear.co.uk/node/5640) to be correctly ranked. Look for ‘permalinks’ in your CMS to automate this simple process.
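Under the hood, a permalink feature simply ‘slugifies’ your page title. A minimal sketch of that transformation, assuming nothing beyond the Python standard library:

```python
# Sketch: turn a page title into a human-readable URL slug,
# the kind of 'permalink' a CMS generates automatically.
import re
import unicodedata

def slugify(title):
    # Normalise accented characters to their plain ASCII equivalents,
    # replace runs of non-alphanumerics with hyphens, then lowercase.
    ascii_title = (unicodedata.normalize("NFKD", title)
                   .encode("ascii", "ignore").decode())
    slug = re.sub(r"[^a-zA-Z0-9]+", "-", ascii_title).strip("-")
    return slug.lower()
```

So a page titled ‘Free ebooks & guides’ becomes `/free-ebooks-guides`, readable by humans and rankable by search engines alike.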

Any links you include in your content must also be periodically checked. If they no longer lead to their original destinations, you have a ‘link rot’ issue, which negatively affects your site rank. Use a free service such as Brokenlinkcheck.com to weed them out.
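If you’d rather check links yourself, the idea behind such services is straightforward: request each URL and flag anything unreachable or returning an error status. A sketch with the standard library follows; a production checker would also throttle its requests and retry transient failures:

```python
# Sketch: flag 'rotten' links by requesting each URL and treating
# unreachable hosts and 4xx/5xx responses as broken.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def link_status(url, timeout=10):
    """Return the HTTP status code, or None if the host is unreachable."""
    req = Request(url, headers={"User-Agent": "link-checker-sketch"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code
    except URLError:
        return None

def is_broken(status):
    # Link rot: no response at all, or a client/server error.
    return status is None or status >= 400

def find_broken(urls):
    return [u for u in urls if is_broken(link_status(u))]
```

Run `find_broken()` over the outbound links scraped from your pages and fix or remove whatever it reports.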

Multilingual websites and SEO

Search engines still get confused by different languages, so it is best practice to separate each language into its own subdomain or sub-folder. Employing multiple languages is an effective way to increase site traffic, and therefore the odds of a sale. Subdomains (en.language.com or fr.language.com) act as separate sites, and this is reflected in search engine rank: when first launched, they don’t benefit from the rank built up on the original domain. They can also be harder to keep current over time, as each may require the maintenance of its own CMS. It is therefore better to use the category/sub-folder method (language.com/en or language.com/fr) to separate content between languages: you make the most of your existing rank, adding to it rather than competing against it. Consider translating your own blog posts and pages in order to attract search traffic from your target markets.

A final note on the keywords themselves

Ensure you use keywords that would tend to attract users who are already prepared to buy. By this I mean to attract those users who have searched for something along the lines of ‘professional Greek translator in London’ or ‘how to get a website translation’ because the chance that this user is ready to buy is much higher than one searching for ‘free and instant translation’.

Therefore your website copy ought to reflect the needs of your ideal clients to increase your sales conversion rate.



Off-page

Your off-page strategy involves building reputation through other people and sites. Search engines take into account the following aspects:

·        Links

·        Social

·        Personal

·        Official listings

Building quality inbound links from trusted sites is a key factor in any SEO strategy. Those sites must be relevant, credible and, above all, not blacklisted in any way. The text in these links can also include keywords. The number of these links, provided they are of quality, will directly affect your ranking. There are numerous ways to build ‘backlinks’, and there are certainly ways you should avoid, as noted in the next section on advanced SEO. Some of the most tried and true backlink-generating methods include:

·        Writing guest pieces for like-minded sites, giving them fresh content and giving you a solid backlink

·        Forum signatures and blog comments do work to a degree, but are not an especially sustainable or scalable technique: many sites add a ‘no-follow’ tag to avoid ‘comment spam’, as it is sometimes known, meaning search engines discard any comment links

·        Finding your way onto blogrolls and link lists – a small-scale strategy that can be slowly worked on over time

·        Business directories – not the same as the official listings described below, but an interesting (if already old) way of building backlinks. Not ideal if it adds another redundant layer between a visitor and your site, but not to be ignored, as the most popular ones can rank highly. Google, for instance, has said it splits directories into two categories: those offering guaranteed inclusion, and those not guaranteeing inclusion, often requiring a non-refundable fee; the latter rank higher

·        Links back from trusted sites – government, charity, university and broadcaster sites all carry more weight than most others when it comes to improving rankings

For the last point, there are many ways to earn links back from each kind of site. Charity sites might mention you if you translate a landing page for them on their website. Universities might mention you if you give a lecture there occasionally. Broadcasters if you have a great news story on language or translation, and government if you do any local work as a ‘languages champion’ or similar, perhaps for schools. These sites are highly reputable, so it pays to be associated where possible.

Social media can, of course, be used to post your content yourself and start the process of others sharing and linking back to you. The quantity and quality of shares can count towards a positive ranking. No guarantees here, but it all builds over time. Be wary of URL shorteners (bit.ly, etc.), which can go bust and leave all of your backlinks in the place where links go to die. According to the Economist, of 1,000 URL shortening services launched since 2001, 600 had gone bust by 2012, with most citing spam as the primary reason.

Trust is the key here, built through these shares and mentions online. A slow process, but one that adds to your ranking potential. Age contributes too: simply being online for a sustained period of time helps to improve your credibility in the eyes of the search engines. A site with 100 well-linked articles and 5 years online will invariably rank higher for a particular search term than a month-old site with 2 articles.

Personal factors, such as the country you are posting from, how local you are to the visitor on a regional level, and how often they are inclined to visit your site, all feed into average visit times, number of pages viewed and so on. Being location-agnostic may have benefits for attracting a global audience, but mentioning your country and city may also build an intangible level of trust with your visitor and potential client. An aspect to consider, at least.

Finally, no off-page strategy is complete without some mention of the web’s ‘official listings’. They aren’t so much official as the de facto standards for registering your site online. Start with Google’s own Webmaster Tools, which gives you a place to officially upload your sitemap to Google and to pass on key site preferences and information directly. The Yahoo! Directory now charges an annual fee, so it is not as attractive as Google’s offering. The DMOZ Open Directory Project, however, is well edited (though approval can take months) and should be consulted; DMOZ supplies directory data directly to Google, Alexa, Lycos, HotBot, AOL Search and many more.

To ensure that you’ve covered all of the basics, run your site through a tool such as Woorank. Be aware that Woorank will score your site against its own criteria, and this score can be displayed in search results; it can look like a star rating and could mislead clients, but the information the tool provides is extremely useful. Perhaps score a well-established website instead to view the criteria, if you’d rather your own site’s score wasn’t placed online.

Another tool frequently recommended by SEO communities is the SEO for Firefox tool by SEOBook, which provides plenty of useful information as you search the web. More tools are available in the Resources section.
