
Los Angeles Search Engine Optimization (SEO)

The most powerful tool in reaching new clients online is a search engine. As we touched on earlier, the search engine is one of the few advertising mediums where a potential client or customer is actively seeking out the product or service being marketed.

At the core of every search engine are the search engine results pages (SERPs), a list of ranked sites, products, or news stories that have been sorted based on quality, relevance, and a large number of other factors. The most valuable rankings online are the top spots in organic search results that have been generated by the search engines’ algorithms.

Why are those top rankings so valuable in reaching the search user, who ultimately represents the end consumer, potential client, or customer?

User behavior studies on search engine use show that over 70% of search engine users click only on the top three rankings. If you don’t rank in the top ten of the SERPs, almost no one will bother clicking on your page link. The exact breakdown varies a little depending on the search engine, but for the most part, 40% of users click only on the first result, 60% click only on the top two results, and 70% click only on the top three.

That means a top ranking for a valuable keyword can give you significant access to consumers that a competitor may not have. And we all know that first impressions matter most.

Conceptually, it’s easy to see why Search Engine Optimization (SEO) is so valuable. As the name implies, it is the effort to improve a site’s ranking through various strategies, ‘optimizing’ it for the search engines. But how do the search engines actually determine who ranks highest, and how do we go about optimizing a site so that it can improve its ranking?

The core driving force behind each search engine is an algorithm. Algorithms are automated sets of coded instructions that repeatedly scan, evaluate, and sort data from billions of websites around the web. We refer to the scanning elements that explore each web page and web page link as robots, spiders, or crawlers. These ‘crawlers’ visit a web page either from a link submitted to a directory (a yellow pages of sorts through which webmasters can ask for their site to be listed) or from a link on a referring site. Generally, search engines give more value to sites that they find through a referring site than to ones that are submitted through a directory. Why? Think about real life dating as an example: if a stranger approaches you and asks for a date, you’re not likely to respond with much enthusiasm to their offer. However, if a very good friend of yours suggests that you meet someone, you’re much more likely to be agreeable. The same is true of search engines. Whereas a submission for listing via a directory is the equivalent of yelling ‘Look at me’ at someone, having a referral link be the source of Google’s visit is the equivalent of receiving a recommendation from someone.

Obviously, first time introductions don’t necessarily equate to lifelong partnerships, so the algorithms spend quite a bit of time returning to your site, learning about you, your business, your industry, and more. Your site can be crawled hundreds of times in a day as the algorithm attempts to get a better understanding of how it should incorporate you into its rankings.

Way back in time, right around the start of the new millennium (2000), Google evaluated sites based on keyword content. The early forerunners of the modern crawler looked at what words were used throughout the site, and how frequently they were used, to make a determination about site ranking. Of course, this simple way of rating sites led to what we know today as ‘keyword stuffing’. Early practitioners of SEO saw that if they put enough keywords on their page, they could improve their ranking with the search engines. This didn’t necessarily mean that the site had anything worthwhile in its content, or that anyone would ever want to visit it, but it worked.

Old SEO tactic using keyword stuffing
Sadly, this site is still live on the web. Who was their SEO?

In the image above, you’ll note how they’ve added hundreds of words to the bottom of their page in an effort to let Google know that they want to rank for those topics. Problem is, Google quickly realized that looking at the number of keywords was too simple a way of evaluating ranking. So it adjusted its algorithm to notice irregular frequencies of keywords, irregular use of keywords, and more. The modern search algorithm even evaluates the reading level of your site text, whether or not it is written in a way that makes sense from a conversational perspective, and whether or not your keywords are supported by relevant topics or keyword content.

Google also sets a higher priority for certain types of keywords. Meta tags are some of the most important places where you can add keywords. Most of the time, the meta title and description don’t actually show up on a web page, but they should be there. Google gives them priority over your other keywords, just as it does keyword content marked up as a heading with an HTML tag like <h1> or <h3>.
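
To make that concrete, here is a minimal, hypothetical sketch of where the title and heading tags sit in a page’s HTML; the wording and company name are placeholders, not our actual markup:

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- The title tag: shown in the browser tab and given extra keyword weight -->
    <title>Los Angeles Web Design | Example Company</title>
  </head>
  <body>
    <!-- Heading tags such as h1 and h3 also carry more weight than ordinary body text -->
    <h1>Los Angeles Web Design</h1>
    <h3>Custom Websites Built for Search</h3>
    <p>Ordinary paragraph copy counts for less than the title and headings above.</p>
  </body>
</html>
```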

As an example, look at the ‘View Source’ for CoalitionTechnologies.com. If you look towards the top, you’ll see our meta tags in a location that most users will never see.

Right there in the middle of the screen grab, you’ll see our title of ‘Los Angeles Web Design’. That shows up in your browser here:

Image of our site's source meta title.
Such an attractive title, don’t you think?

That little snippet of text shows up at the top of your browser window. Since it features so prominently to the end user, Google gives it added priority over other keyword content. Obviously, at Coalition Technologies, we want to rank for Los Angeles Web Design.

Meta keywords, another meta tag, used to have significant relevance, but because of keyword stuffing, they were largely deprecated or devalued by Google. Some search engines still use them, though, so we put them in anyway. Right beside them, there is usually another meta tag called ‘description’.

Meta descriptions don’t actually appear anywhere on your website at all. So where do they go, and what are they for?

Meta Descriptions show up for SEO in results pages
Such a succinct summary of our business

Yep, right beneath our page link on Google’s search engine results pages (SERPs). That is pretty prominent positioning, so Google gives added value to those. It would be foolish for a site to stuff a lot of keywords into the description, since many of its visitors would see what appears to be spam in the search results and would likely move along.
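
For reference, here is a hedged sketch of what the keywords and description meta tags look like in a page’s head; the values are illustrative placeholders rather than our real tags:

```html
<head>
  <title>Los Angeles Web Design | Example Company</title>
  <!-- Largely devalued by Google, but some search engines still read it -->
  <meta name="keywords" content="los angeles web design, seo, ecommerce development" />
  <!-- Never rendered on the page itself; often shown as the snippet beneath your link in the SERPs -->
  <meta name="description" content="Example Company builds search friendly websites for Los Angeles businesses." />
</head>
```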

Google next looked for a signal that would let it find out what other users actually liked to see, since that most often meant the results weren’t being influenced by someone ‘gaming’ the algorithm. What did it find to use for its next evolution of evaluation?

Links.

Simple, tiny, seemingly meaningless pathways from one web page to another became one of the most powerful factors for Google in evaluating ranking and site value.

When Google was first beginning to grow, the web wasn’t nearly as massive and powerful as it is now. It was a smaller community of businesses, individuals, networks, and developers than what we have today. As a result, most links were created with a genuine desire to share content with someone else.

If you liked a page, you’d link to it. If someone liked your page, they would link back. It was a powerful way of communicating that some websites had valuable content that other users would appreciate.

As we already mentioned, Google views links as votes of confidence or recommendations. And it still does. But not long after Google started making links valuable, people realized that they could influence their search results by creating links.

Soon, the process of linking from one site to another became just as spam-filled and corrupt as keyword use had been before. What once was an honest way of evaluating sites for quality became an easy way for companies to move their web pages up in search results.

The problem was, Google was on to something with the idea that links were, and could be, expressions of confidence in another site or product. And the fact that it was rapidly growing into the largest search provider on earth told it that it couldn’t just walk away from its indicators. So it hasn’t. Google still uses links as one of the primary methods of evaluating the quality and relevance of a site.

But that didn’t mean that it was just going to let the spammers who were trying to toy with the algorithm and rankings take over. So it made the algorithm’s method of evaluating links more intelligent and better able to assess the value of each link.

First, it began by establishing an expected ratio of actual content to links. If a site generated thousands of links, but only had several hundred words of content, Google would devalue the site’s referrals.

Some webmasters and early SEO companies responded by removing the high volumes of links and by establishing reciprocal linking schemes, which basically amounted to, “I’ll link to your site if you link to mine.” Google responded by devaluing reciprocal links in its ranking algorithm. If a crawler realized that a link took it off one website to another, and a link from that page then went back to the same page it had just left, the algorithm would devalue both links.

Google’s aggressive pursuit of a better quality search experience allowed it to quickly become the major market player for search engines. Even now, with its attention on cell phones, YouTube, and dozens of other products, Google is still the biggest search engine.

In 2010, Bing (Microsoft) and Yahoo formed a partnership that now accounts for around 30% of global monthly searches. Bing now powers Yahoo’s search engine, effectively marrying the two products and results. But even together they have a ways to go.
The next graphic accounts for the Bing and Yahoo partnership and shows just how much control Google has over the search market.

Pie chart showing Google, Bing, and Yahoo percent of market share
It looks a little like Pacman doesn’t it?

Because many of the methods by which the forerunners of today’s SEO companies responded to changes in the search algorithms essentially subverted Google’s efforts to generate only the best quality results, Google has leveraged its position as the gatekeeper of most of the world’s web traffic to create a set of rules that help ensure its algorithm operates the way it wants to. These rules are called Google’s Guidelines for Webmasters. Much of their content is aimed at how webmasters should build and market their sites so as to avoid punishment for not playing by the rules.

The Guidelines forbid certain approaches to link building, such as paid linking, where one company sells links from a high authority site to another site, or sells high volumes of low quality links to another site. They also forbid link trading schemes, where a web developer might trade a link from one good site to another site in exchange for a link from an entirely different site back to another property. Because these types of tactics are difficult to catch natively through the algorithm’s functions, the rules mainly act as a warning that engaging in those methods may result in ‘manual punishment’.

Manual punishment or action is taken by a group of individuals called Google’s Spam Team. They are headed by Matt Cutts, and their primary job is to ensure that if the algorithm misses something, there is a human element to respond to it.

Head of Google's SEO Anti-Spam Team
He doesn’t have the job just because he’s handsome.

In one of the most high profile examples of their power, they responded to accusations that JC Penney had engaged in a widespread paid linking campaign. In January of 2011, the New York Times noticed that JC Penney had a large number of links leading to its product pages that had nothing to do with the content on either the referring site or on JC Penney’s site. In one example, a video game forum discussing Half Life had a link to an evening gown category page on JCPenney.com. Google’s algorithm had picked up on the fact that the links were low value, but it hadn’t totally dismissed them. In all, there were estimated to be around 10,000,000 low value links. In that quantity, they added up enough to influence the ranking algorithm. What happened? Once the Spam Team took manual action, the average ranking for JC Penney’s products fell from #3 (on the first page) to #86 (the ninth page).

Google’s Guidelines essentially draw a line in the sand between two groups of SEO companies. One type pushes tactics that Google regards as spam. These ‘black hat’ companies often give away their lack of understanding of high level, Guideline compliant, effective SEO in their sales pitches. Usually their sites offer SEO pricing packages based on the number of keywords to be targeted and the number of links to be created to the site. Based on what you just read, you should realize that by selling set numbers of keywords and links, they’re basically selling spam.

Image of a link building sales package
Selling links by the 1000’s
Another SEO link emphasis package
More links for sale, get your hot and dirty links here!

These two packages are prominently displayed on sites belonging to companies that claim to be high quality SEO firms. Of course, a simple description of Google’s algorithm factors and rules should help to clear the air and let your client know what the differences are between our campaigns and these other companies’ offerings.

What else does Google look at, outside of keyword content and links? Here’s a quick list of some of what helps influence ranking:

Types of content: The internet is a very fun and entertaining place to be. Recognizing that, Google gives prominence to sites that have diverse types of content: text, video, pictures, animations, applications, and more are all encouraged under Google’s ranking algorithm.

Volume of content: Lots of pages and lots of content usually means a site that is a more valuable resource on a number of different topics. For many search terms, Wikipedia ranks highly because its site gets lots of added authority from the high volume of content it possesses.

Alt img tags: An HTML attribute originally intended to tell a user what a picture was before their dial-up connection could download it has now become an important way for Google to gauge an image’s relevance to the actual site content. Google’s crawlers don’t see the picture that you see, so the text description provided in an alt tag is important for cluing Google in on what the picture shows. Alt tags are also required for valid XHTML markup (see the sketch after this list).

Page load times: Google has added increasing value to page load speed. In the era of high speed internet, users don’t like waiting on unnecessarily slow pages. Writing code so that pages load faster, and choosing the right hosting partner for a website, can directly impact a site’s ranking.

Domain Authority: Almost anyone can get a .com domain, which means that .com sites aren’t as authoritative. Sites with .edu, .gov, and similar domains all get higher priority because they represent a more limited, recognizable group of organizations and institutions.

Brand Authority: Google also tracks discussions throughout the web of various brands. A company like Nike gets talked about quite extensively, with and without linking. That discussion can positively influence ranking.

Outbound linking: Google uses outbound links to influence its ranking of a page. Outbound links can serve as good references for specific content and keyword relevance.

Duplicate Content: Websites can accidentally or intentionally build duplicate content into their pages, and the issues can be internal or external. An example of internal duplicate content is when a website allows you to find the same product through multiple categories. For instance, Levi 501s might be found under both the Jeans category (www.clothes.com/jeans/levi-501s) and the Pants category (www.clothes.com/pants/levi-501s). The same descriptions, pictures, and content under different URLs tells Google that you’re trying to inflate the volume of content and the number of pages on your site with spam-like tactics. This type of issue can be resolved using link canonicalization tags, which tell Google to only view the content through a single URL pathway (see the sketch after this list). Internal duplicate content can also occur because a site doesn’t offer a permanent 301 redirect when it can be accessed through both a www.domain.com address and an http://domain.com address; a permanent 301 redirect always sends traffic to the same access point. External duplicate content occurs when someone else offers up the same material as you do. If you’re just a clone of someone else, why would you be useful to anyone? This frequently happens with e-commerce sites that use descriptions provided by a single manufacturer.

Social Media/Network: Going viral and being the topic of a lot of ‘buzz’ is well known as a successful marketing strategy. But most businesses don’t realize that search engines are testing ways to utilize social network buzz in their algorithm rankings. Getting lots of ‘likes’ on Facebook could actually impact the number of visits to your site.

Bounce Rate: A site’s bounce rate is the frequency with which users who visit the site back out (bounce) to the search engine results page that they came from. Too frequent ‘bouncing’ of users usually means that you aren’t that relevant to the search term they were using.

Time on site: How long a user spends reading/watching/exploring your site helps indicate whether or not it is relevant to end users.

Page Views: How many pages a user visits on your site serves as another indicator of site quality.

Site Maps: A properly written site map allows Google to easily analyze every page on your site. Without a site map, some pages may never be reached by the crawlers, or may not be indexed often enough.

Robots.txt: A small file that exists solely for the benefit of the search crawlers, the robots.txt file serves as a brief set of directions for how the site should be crawled.

And Many More…….
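
As promised in the list above, here is a minimal, hypothetical HTML sketch showing an alt tag and a canonicalization tag. The clothes.com URLs and file name simply mirror the Levi 501s example and are placeholders, not real markup:

```html
<head>
  <!-- Canonical tag on the duplicate /pants/ URL, pointing Google at the preferred /jeans/ URL -->
  <link rel="canonical" href="http://www.clothes.com/jeans/levi-501s" />
</head>
<body>
  <!-- The alt text describes the image for crawlers that cannot see it -->
  <img src="levi-501s.jpg" alt="Levi 501 original fit jeans in dark wash" />
</body>
```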

If you hadn’t guessed already, we are trying to point out that SEO is a much more complicated field than simple link building and keyword targeting. The SEO companies and web developers who emphasize those two items are demonstrating only the lowest levels of SEO work. Black hat efforts rarely generate long term results and often result in punishment of the offending site. That isn’t usually mentioned in those ‘SEO Platinum Packages’.
