Google to Penalize Websites Lacking SSL in 2017

SSL is a way of encrypting the connection to a website to make it safer and harder to hack. Google is already penalizing sites that DON’T have this in place, and things are about to get a whole lot worse for pages with logins and credit card forms come January 2017. All WordPress sites have a login page, and thus every WordPress site will need an SSL certificate to meet this requirement.

Secure sites are identified by two things: https:// in your browser bar and, at times, a symbol or color denoting the status. Take ours for example: we’ve got full SSL in place on our website.

Is there a catch? Even if you’re using SSL you could still be penalized. Your site has to be FULLY encrypted. That means mixed content – forcing https:// on some pages and not others – will also get you into trouble. On a mixed content site that uses SSL on some content and not others, you’ll notice a lowercase ‘i’ in a circle denoting more info (some browsers use an exclamation point). If you click it, you’ll see a note indicating the non-secure connection. Not so bad? Well, that is only the first of the measures Google will start taking in 2017.
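A quick way to spot the mixed-content problem described above is to scan a page's markup for sub-resources still loaded over plain http://. The sketch below uses Python's standard html.parser; the tag list and example URLs are illustrative assumptions, and a real audit would also need to cover things like CSS url() references:

```python
# Illustrative mixed-content check: collect sub-resources fetched over
# plain http://, which break the "fully encrypted" requirement on an
# https:// page.
from html.parser import HTMLParser

# Tags that load sub-resources (links in <a> tags are not mixed content).
RESOURCE_TAGS = ("img", "script", "link", "iframe", "audio", "video", "source")

class MixedContentFinder(HTMLParser):
    """Collect src/href values on resource tags that use plain http://."""
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        if tag not in RESOURCE_TAGS:
            return
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http://"):
                self.insecure.append((tag, value))

finder = MixedContentFinder()
finder.feed('<img src="http://example.com/logo.png">'
            '<script src="https://example.com/app.js"></script>')
print(finder.insecure)  # [('img', 'http://example.com/logo.png')]
```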
If you are qualifying an off-the-shelf content management system (CMS) or creating your own custom CMS, then the capabilities listed below are imperative to achieving maximum organic search engine rankings.
A couple of excellent open source CMS options are Drupal and Joomla. It is easier to configure Drupal than Joomla to meet the search engine optimization (SEO) requirements discussed in this article.
ON PAGE SEO FEATURES NEEDED FOR CMS SYSTEM
1. Page Header Information (not visible on the page; placed between the <head> and </head> tags). Note that the title, meta-description and meta-keywords tags listed below do not allow HTML tags at all.
a. Title tag - the most important SEO attribute on the page. A summary of the body content, no more than 70 characters, with the most important keywords in front. Ideally the title will also be persuasive in nature. The title of the page is what users see and click on in search engine results pages, so you have to make your persuasive sales case in the title.
b. Meta-keywords tag - only the keywords that are in body content (spelled exactly the same), all of the keywords on the page, most important keywords in front. Place above the meta-description tag and after the title tag.
c. Meta-description tag - a summary of the body content, no more than 200 characters, with the most important keywords in front. Since many search engines, including Google, sometimes use the meta-description directly in their search engine results pages (SERPs) as the description of the page, this tag should contain keyword-rich, persuasive, sales-oriented content.
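As a rough illustration of the head-section rules above, a CMS template helper might emit and sanity-check these three tags. The function name and the warning behavior are illustrative assumptions, not part of any particular CMS:

```python
# Illustrative sketch: build and sanity-check SEO head tags per the
# guidelines above (70-char title, 200-char meta-description ceiling).
from html import escape

def build_head(title, keywords, description):
    """Return head-section markup, warning on length violations."""
    if len(title) > 70:
        print("Warning: title exceeds 70 characters")
    if len(description) > 200:
        print("Warning: meta-description exceeds 200 characters")
    # Order follows the article: title, then keywords, then description.
    return "\n".join([
        f"<title>{escape(title)}</title>",
        f'<meta name="keywords" content="{escape(", ".join(keywords))}">',
        f'<meta name="description" content="{escape(description)}">',
    ])

print(build_head(
    "Blue Widgets - Affordable Widgets Shipped Overnight",
    ["blue widgets", "affordable widgets"],
    "Shop blue widgets with overnight shipping and a money-back guarantee.",
))
```

Note that the output order matches the recommendation above: title first, then meta-keywords, then meta-description.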
Social media. I hope you're sitting down for this one. Once you have created your website, you need to tell people about it, and the best way to do that is social media. I know, shocking, right? Okay, it's not even a little bit surprising, but in a world where we are crunched for time, it is easy to overlook the simplest things.
Make sure your website is included in all of your online profiles, and then be an active member of these communities. If you write a new blog post, tweet about it, post a Facebook status with a link, and don't be afraid to click those social sharing buttons embedded next to every post. It may feel a little gauche to like, Google+, or Digg your own content, but there is nothing wrong with being proud of your work. It also encourages people to do the same.
Social media has largely replaced mass emails and has provided a forum for us to foster connections easily. Have fun and make the most of it. Your presence on these sites helps define who you are and exposes you to groups of people that you never may have encountered otherwise.
Join a blogging network. Now, let me be very clear: if you have to pay to be included in a network, forget it. Paid link networking sites are in violation of Google's Terms of Service and can not only tank your site's ranking but also cause your site to be deindexed entirely from search engines. Scary, right? That is how it should be, however, and it is definitely not worth the risk. There are reputable blogging networks, such as BlogHer, that provide a forum for you to enhance your site's visibility and engage in conversations with other bloggers. Find one that is a good fit for you and become an active participant in the community.
Make unique contributions to other sites. Propagating the web with unique content is an excellent way to increase your visibility. There are a variety of Web 2.0 properties that can expand your reach and promote your website. The key to this is to make unique and high quality contributions.
Duplicate content is penalized by search engines because it can operate like spam. While these posts are done in support of your primary site, they should be able to stand on their own as well developed content. Start small and approach them with the same thoughtfulness you applied to your primary site. You can also do things like upload photos to Flickr and videos to YouTube. Done well, these can drive considerable traffic your way.
Link to other sites. How many times have you read an article and clicked on a link that directed you to another website? Probably far too many to count, but it is a great way to enhance your own content, spread goodwill, and build relationships with other sites.
Oftentimes, when you link to someone's content, they will check you out and if they like what they see, could become regular visitors or link back to you at some point. You cannot go about this haphazardly, however, and the link should be relevant to your content. If something captures your interest, you should not only share it with others by clicking on your favorite social media button, but also write a post about it with your thoughts.
Yes, link building can benefit you with increased traffic, but it is also a great way to generate ideas for your own content. We all become stumped at some point. Other people's content can jumpstart the process and get our own creative juices flowing. Of course we are not talking about plagiarism, which should be considered one of the deadly sins, but being inspired by someone else and having something meaningful to add to the conversation.
Be an engaged and active participant in online communities. The importance of interaction cannot be emphasized enough. The web is one enormous conversation, and the only way to be included is to participate. So, submit comments, participate in forum discussion, share other people's work, and, basically, try to make friends with the people around you.
Some people make the mistake of treating the internet as an isolation booth, when it is anything but. It is a ginormous room that people are constantly walking in and out of, sharing stories and information. Sure, sometimes you may just quietly hang out and observe, but there are also times you enter the fray to offer your opinion, a compliment, or a new perspective. The more you do this, the more complex and vibrant your community will become.
As I said, learning how to promote your website is really common sense. You have all of the tools at your disposal, you just need to make use of them. It requires considerable time and effort, but it is worth it. Being an active member of a global conversation will enrich more than just your site. It will help you grow as a person, and that is the biggest payoff of all.
Article by John V. Knowing how to promote your website is critical to its success, and much of it is common sense. No one can guarantee that you will become an online sensation, but there are some things you can do to help it become an online favorite. Learn more at http://www.wpromote.com.
The Web is a wild place, and the principles of survival of the fittest are very prominent in the realm of search engine marketing and SEO. So it makes sense that Google would name its algorithm updates after some of the world's exotic wildlife.
Google's biggest changes to its search algorithm over the last two years are a pair of updates known as Panda and Penguin. Both of them had the same basic goal of lowering the rank of low-quality or "thin" websites, and thus increasing the rank of higher-quality sites. However, despite their common allegiance toward improving the quality of Google's search rankings, Panda and Penguin are very different beasts.
Over the past year, Google has really tightened the screws on what they determine to be participation in manipulative link schemes. This has been tackled from both sides of the link equation; he who giveth, and he who taketh away.
Those who give out links from a site which was engineered to manipulate search engine rankings may have their sites penalized. Manual reviews that see evidence of serious breaches of Google’s commandments may result in excommunication. Yes – banned, de-indexed and sent into galactic oblivion. That might be for the sin of selling PageRank via text-based links, giving run-of-site links, encouraging guest blog posts, allowing registration of sploggers, and each and every other form of manipulative link-building scheme known to man.
The recipients of the links won’t escape Google’s witch hunt for link miscreants either! Obviously, bad links lead directly to your website’s front door, so finding you is not at all difficult. On the basis of guilt by association, ignorance is no excuse. You may not actually be burnt at the stake, but you may be spanked for even minor link receipt infringements.
In many cases, the impact of bad links will not be dramatic. Instead, it may manifest itself as an erosion rather than an “across the board” drop. Often, a group of important keyword phrases will decline in the search engine results pages, while others remain relatively stable.
Getting your head around what’s happening can be extremely difficult because Google now has so many punitive facets to its algorithms. (*1) Incoming link quality and anchor text content, along with on-page content analysis, assessment of percentage of advertising content above the fold, exact-match keywords in domain name and heaven only knows what else! Devising a remedial strategy can be confusing in the extreme!
Basically, your best option is to take a holistic approach and address the various aspects in turn. In this instance, we shall address a mechanism for tackling algorithmic penalties.
Guilty Twinges Notwithstanding
Have you ever been seduced into building links to boost your rankings? Sure you have – hasn’t everyone tried to find a way to improve their position? Unfortunately, it’s highly probable that some of the links that boosted your rankings 5 or 6 years ago are now a liability at best. Even more likely, some are now an insidious impediment to your good rankings today.
I don’t think anyone in their right mind ever thought Google would actually penalize a website for incoming links. It seems just plain wrong to me, because it’s not an aspect of website ownership over which the site’s owner has full control. Anyone can make a link to your site without let or hindrance! In most cases, you won’t even know about it!
How to Identify Your Incoming Links
First, you’ve got to assess the big picture on incoming links and that requires access to in-depth information. If you have a Google Webmaster account, and your site is verified correctly, Google will give you access to all the links they find pointing to your site. That’s all very well and good, but the last thing you want to do is rush into dumping links that might actually be making a positive contribution to your rankings!
You need a way of determining the quality of those links, and you have at least 2 options:
Manually review every linking site and decide if it’s delivering a good link or a bad link.
Pay for a link analysis that provides a quality score to help with the decision making.
There are several commercial services that provide link quality analysis. The one I’ve used is LinkDelete, chosen because it had good online reviews. My own experiences with the service have been positive thus far, with deadlines being met and prompt responses to questions I’ve asked.
Initial payment gets your account established. Within a few days a comprehensive spreadsheet list of incoming links they’ve found via their sources is provided for your review. A link quality score is included with every link, and there are often multiple links found per linking domain.
You have three tasks at this point:
Select the links to remove.
Refine the links to the core link detail / account listing if possible.
Provide a text file listing of all links you’d like removed.
Selecting the Links to Remove
Commence by copying all the spreadsheet data to a new worksheet. In the initial phase, you could opt to eliminate all links with a positive link score, on the basis that they are probably good. Sort the remaining links by Domain / URL to get a better view of how many links have been found from each domain to your site.
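The first-pass filtering described above can be sketched in a few lines of Python. The column names ("url", "score") are assumptions about the report layout; adjust them to match the spreadsheet you actually receive:

```python
# Sketch of the first-pass filter: drop links with a positive quality
# score (probably good), then sort the remainder by linking domain so
# that multiple links from one site group together.
import csv
from urllib.parse import urlparse

def suspect_links(report_path):
    """Return rows with a zero or negative score, sorted by domain."""
    with open(report_path, newline="") as f:
        rows = list(csv.DictReader(f))
    bad = [r for r in rows if float(r["score"]) <= 0]
    bad.sort(key=lambda r: urlparse(r["url"]).netloc)
    return bad
```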
Be aware that the link quality assessment is NOT without flaws! It may not take into account the relationship of your site to the link genre, or that the site belongs to a member of your business network or contacts. Three of the sites I’ve worked on had Open Directory links showing as -100 link quality, which seems a little harsh! Most of us would commit to almost any indiscretion in exchange for a DMOZ link to our site – it’s always been the holy grail of directory links.
What is a given is that a link showing up as a -500 quality rating is highly suspect! A link showing a positive link value is most likely a good link that you’d want to retain. Link quality is a function of trust and relevance, and automated assessments may throw up a few false negatives. Regardless, it’s up to you to review the list with due diligence! (*2)
Mastery of Excel spreadsheet commands is a handy skill if you want to refine lists because you can identify and remove duplicates, search and replace to drop portions of URLs and much more.
Link Quality Assessment
Carefully scan through the list looking for:
sites you know are good
sites you know are relevant to your industry / genre
sites that are part of your network of business contacts
local or regional directories and country-specific local businesses
Remove all the “probably good” links from the remove list. To verify an individual domain’s perceived quality, check it on Google by searching for the domain name. If the search results show the site in first position, with a number of the site’s pages listed as a sub-menu, it’s very likely to be a ‘good’ site.
If the domain name search does not show the site in top position, the chances are high that it has:
been taken down
been penalized by Google
been de-indexed by Google
If it’s in top place with no sub-menu pages listed, check how many pages it has indexed by changing the search parameter to “site:www.domainname.com”. Zero pages indexed may mean it’s been blacklisted by Google and you don’t want it in your link portfolio!
Linking Page Content
Open the linking page’s URL in a separate tab or window and review the page content and the actual listing for your site. Not all links pages are created equal, and some give clear indications of quality, or lack thereof.
If the page has hundreds of miscellaneous links, you should mark the link for deletion.
If the page has relatively few links (30 or so) and all are closely related sites within a correct category, it’s probably a keeper.
If the site is clearly advertising itself as offering fast, SEO-friendly PR links and the design is cheap and nasty, lose it!
If it’s a directory site that’s smartly designed, stated as human-edited, and there’s a backlog of submissions, it’s possibly a keeper.
If the page does not open, but a custom 404 Page Not Found error page displays, it may be that the linking page has already been taken down by the owner. I’ve observed many directories that have deleted ALL listings, but the base site still loads, indicating a possible ‘start afresh’ approach.
If a domain registrar’s page opens, showing the domain as available for purchase, you can cross it off your link removal list on the basis that the link has already been taken down.
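The page-review rules of thumb above can be summarized as a toy decision helper. The inputs are judgments you make by hand while viewing each page, and the thresholds simply mirror the heuristics in this section:

```python
# Toy encoding of the linking-page heuristics above. The inputs are
# facts gathered by manual review; the thresholds (100 links, 30 links)
# mirror this section's rules of thumb, not any official standard.
def classify_links_page(link_count, same_category, looks_like_paid_seo):
    """Return 'remove', 'keep', or 'review manually' for a links page."""
    if looks_like_paid_seo:
        return "remove"            # cheap, nasty, PR-links-for-sale site
    if link_count >= 100:
        return "remove"            # hundreds of miscellaneous links
    if link_count <= 30 and same_category:
        return "keep"              # small page of closely related sites
    return "review manually"

print(classify_links_page(250, False, False))  # remove
print(classify_links_page(20, True, False))    # keep
```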
Link Anchor Text
If the initial indicators are positive, look at the link’s anchor text.
If there’s a branding element included, such as your business or domain name, this is a link you’d probably want to keep.
If the anchor text is stuffed with exact-match keyword search phrase/s, the link may be a liability.
Some experts suggest you retrieve the anchor text for every link as part of the review process. If there are only a few links, that’s something you can do quite quickly. However, if there are 5,000+ links to be reviewed and sorted into good and bad, finding and checking individual anchor text would add a huge burden to an already time-consuming and tedious task.
In the overall scheme of things, it is important to understand that Google is assessing you on multiple fronts. Keep in mind that not only is the quality of the links to you assessed, Google also analyzes the keywords within the anchor text for manipulative efforts.
Thus, a link from an otherwise good site which has over-optimized anchor text can still harm your rankings!
Stages 1 & 2: Link Quality
I’ve settled on a system of making the first pass through the Removal List and eliminating potentially good links, at the same time confirming those which are clearly suspect and possibly toxic. I get link removal requests underway as Stage 1, with a follow-up in the second month where requests were ignored.
Stage 2 is using the Disavow Tool to distance the site from potentially harmful links.
Stage 3: Anchor Text Assessment
As a Stage 3 process, you could go through every remaining link and assess the anchor text content for keyword variation and branding inclusion. This might be the time to extract the list of links to your domain from your Google Webmaster Tools account, and match those up with each link’s anchor text.
As is usual in SEO circles, there’s a range of conflicting advice on what’s the best format for anchor text! Overall, it seems Google expects that a natural link profile would show far more domain name and/or branded links than exact-match, high search volume keyword-rich anchor texts. Any potential imbalance could be addressed by addition of new “branded” links, rather than removing otherwise good links from trusted domains. If an otherwise good link contains over-optimized anchor text, you could try politely requesting the webmaster or site owner to revise the link using the text you provide. (*3)
Refine the Link Removal List
The LinkDelete process has a limit on the number of link removal requests per month, so targeting the primary link from a domain can reduce your overall list significantly. Look carefully at the links, and try to spot the core account link. For example, a directory may have your core listing, along with links to it from several category pages. Eliminating the core listing will automatically remove it from all of those other pages as well.
You should spend time trying to find the account listing URL. It may be found as a link on the “More Info” button. You may even need to search the site for your domain then track down that primary listing.
Rather than have the “Please Remove Link” request list all the pages the link was found on, simply requesting the deletion of the primary account / listing link also helps the webmaster quickly identify the source to remove, and increases the chances of a positive outcome.
Submit a Text File listing Links for Removal
After reaching the end of that marathon effort, you now need to copy and save the list of links you’d like removed into a simple text file. Log into your LinkDelete account and reply to the Project’s link list message, and attach your Links to Be Removed text file.
Link Removal Report
Approximately 3 weeks after your link removal list is submitted, the first month’s process will be completed. You will receive a report outlining which links have been successfully removed, and which remain. At the same time, you will be billed for month two of services.
The number of links you’d like removed determines your next step. A single site has an allocation of 400 link removal submissions per month. None of the sites I’ve worked on have exceeded that, so a second batch of new removal requests was unnecessary. Instead, a followup contact with those sites that had ignored the first round of removal requests was the correct option. If you have no further new links to be processed for removal, you should now cancel your monthly subscription.
Towards the end of the 2nd month, you will have the final list of links removed from the follow-up campaign and are ready to take the next step.
LinkDelete will provide you with the evidence of two months of remedial efforts to remove links you don’t trust. You now have a list of links you’ve successfully removed, and a list of sites that have not responded to two consecutive requests to delete the link.
The Disavow Tool
This was provided in late 2012 for site owners to “disown” links that might be hurting them. It’s important to heed Google’s instructions. Here’s what they say about the use of the disavow tool! (*3)
“If you believe your site’s ranking is being harmed by low-quality links you do not control, you can ask Google not to take them into account when assessing your site. You should still make every effort to clean up unnatural links pointing to your site. Simply disavowing them isn’t enough.”
However, you’re now good to go! You’ve already done the hard yards and can back that up with demonstrable results from 2 months of link removal efforts. The links you are disavowing are owned by someone who steadfastly ignored your polite removal requests over 2 consecutive months!
When you submit a disavow request, it will be processed much like a sitemap is processed. Google will flag the disavowed links pointing at your site as if they carried a rel="nofollow" attribute. Effectively, in the context of both link counts and anchor text analysis, they will be treated as if they did not exist. On the Disavow Tool page, Google provides a link to detailed instructions.
Having determined that a potentially toxic domain is linking to you, it’s sensible to simplify the process by dealing with links at the Domain Level, rather than individually specifying each link.
Basically, you need to:
Copy your final file of non-removed links into a worksheet.
Remove all the multiples so that you are left with one link per domain.
Search and replace “www.” with “domain:” so that each entry takes the form domain:example.com, which tells Google to disavow every link from that domain.
Trim the page file names off the end of all the domains to ensure all links are disavowed.
Save the list of domains to be disavowed as a text file (use WordPad or similar). The file needs to be in UTF-8 format, with a .txt extension.
Go to Webmaster Tools / Disavow: https://www.google.com/webmasters/tools/disavow-links-main
Select the correct Domain name that you wish to disavow links to, and upload the text file.
Wait patiently – potentially for quite a long time. Google may process the list quite quickly, but it may be weeks / months before the impact of the changes appears in a future iteration of the Penguin update process.
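The file-preparation steps above (dedupe to one entry per domain, swap in the domain: prefix, trim page paths, save as UTF-8 text) can be automated with a short script. This is a sketch under the assumption that your input is a plain list of bad-link URLs:

```python
# Sketch of the disavow-file preparation steps: one "domain:" entry per
# linking domain, "www." stripped, page paths trimmed, written as a
# UTF-8 .txt file ready for upload to the Disavow Tool.
from urllib.parse import urlparse

def build_disavow_file(bad_link_urls, out_path="disavow.txt"):
    """Write a domain-level disavow file and return its lines."""
    domains = set()
    for url in bad_link_urls:
        host = urlparse(url).netloc       # trims the page path for us
        if host.startswith("www."):
            host = host[len("www."):]     # disavow the bare domain
        if host:
            domains.add(host)
    lines = [f"domain:{d}" for d in sorted(domains)]
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")
    return lines

lines = build_disavow_file([
    "http://www.spammy-directory.example.com/links/page3.html",
    "http://spammy-directory.example.com/links/page7.html",
])
print(lines)  # one domain: entry despite two source URLs
```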
The Webscape Changeth
After months of link deletion efforts across websites in different genres, it’s clear that the impact of Google’s efforts to cleanse the web of link manipulation is having a broad impact.
“Bad” Sites Are Being Turned Off
While reviewing lists of links to assess quality, I’ve been surprised by the numbers of sites that have been taken down. Lots of cloned versions of dodgy SEO-friendly directories now have their domains parked or have had all content deleted.
If your *.blogspot.com site was created solely to improve SEO ranking of your main site, then there may no longer be any point in retaining it because it’s likely doing you more harm than good.
If it’s a link-scam directory, all it’s potentially good for is negative SEO. Its output is now harmful to the sites that are listed, or to unwitting sites innocently submitting links in future.
Of course, at the point where link removal requests exceed link submissions, running a directory becomes a pointless waste of time and money.
1st Month Link Removal Results
Of the sites I’ve worked on, the lowest removal success rate was 24% in the 1st month. Most achieved between 27% and 36% removal of unwanted links. The highest success rate was 100% removal of all unwanted links, albeit on a site that had a modest 81 domain links targeted for elimination.
2nd Month Link Removal Results
The follow-up emails achieved a 5% – 10% return, but provided additional supporting evidence to Google that we’ve tried to “make every effort to clean up unnatural links.”
How Rankings Improved After Link Removal
Across the board on all sites, demonstrable improvements to rankings were evident. Sites with 30% removal of poor links responded progressively, month by month. Even the site with the lowest percentage of links removed also responded positively.
Conclusion on Initial Link Removal Efforts
Clearly, the results of cleaning up potentially harmful links are positive. The task is relatively modest in cost and therefore well worth the effort. Like much of the work involved with SEO, it’s tedious and requires attention to detail.
The impact of the Google Disavow Links submissions remains an unknown quantity at the moment, and it will be interesting to see if there’s an incremental improvement as disavowed links are removed from the equation. There’s little in the way of online case studies as to the effectiveness of disavowing links and the timeframes for seeing results.
A bad link is a bad link, no matter if the anchor text is good, so the priority should be eliminating the low-quality links first. There seems to be some consensus that targeted (exact-match keyword) anchor texts should not exceed 30% of your overall link portfolio. (*4)
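The 30% rule of thumb can be checked mechanically once you have the anchor texts in hand. This sketch assumes a simple list of anchor strings; the threshold itself is the consensus figure cited here, not a published Google number:

```python
# Rough sketch of an anchor-text ratio check: given the anchor texts of
# your incoming links, report what share exactly match a targeted
# keyword phrase. The 30% ceiling discussed above is a rule of thumb,
# not an official threshold.
def exact_match_share(anchors, target_phrases):
    """Fraction of anchors that exactly match a target phrase."""
    targets = {p.lower() for p in target_phrases}
    matches = sum(1 for a in anchors if a.strip().lower() in targets)
    return matches / len(anchors) if anchors else 0.0

anchors = ["Acme Widgets", "acmewidgets.com", "buy cheap widgets",
           "buy cheap widgets", "click here"]
share = exact_match_share(anchors, ["buy cheap widgets"])
print(f"{share:.0%} exact-match")  # 2 of 5 anchors, i.e. 40%
```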
There’s no magic bullet, no quick and easy shortcuts. However, those prepared to put in the time and energy required will reap rewards commensurate with the efforts they’ve expended.
1 – http://www.seomoz.org/blog/the-difference-between-penguin-and-an-unnatural-links-penalty-and-some-info-on-panda-too
2 – http://www.seomoz.org/blog/identifying-link-penalties-in-2012
3 – http://searchengineland.com/how-google-disavow-link-tool-remove-penalties-154928
4 – http://www.seomoz.org/blog/click-here-seo
Ben Kemp, a search engine optimization consultant since 1997, is a specialist in website redesign, and a veteran with 25+ years of experience in the IT industry.
- Universe Design Websites & Photography - North Carolina, USA