Saturday, 12 February 2011

Bad SEO

There's a ton of search engine optimization, keyword strategy, and page ranking advice out there that promises to get your website booming with traffic from Google, Yahoo!, and Bing (formerly Live, formerly MSN). Unfortunately, much of the free help is dated and describes a system that has been overhauled many times since the major search engines took over.

Some of the alleged top Google listing tips, tricks, and services are just flat-out wrong or misleading, and can be harmful to your online business. If you really want to increase targeted traffic to your website, you should ignore the following SEO advice…



What to Avoid and Ignore When Optimizing Your Website for Search Engines


1. Hiding lists of links on your page to be picked up by web crawlers

Google looks for keywords in specific areas of the page, mainly in the title and within the content itself. Listing a bunch of keywords somewhere outside the content, especially in a way that cannot be seen by visitors, won't help you and might even get your website penalized (tags being an exception).

If it doesn't read like a decently formatted narrative, the web crawlers probably won't treat it as one either. That's what they're designed to do: figure out which webpages contain the information that people typing queries into a search engine are actually looking for.
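To make that concrete, here's a minimal Python sketch (not from the original article) that checks whether a target phrase actually appears in a page's title and visible text, the places the crawlers read. The URL and phrase are placeholders, and the extraction is rough: it also picks up script and style text, which a real crawler would ignore.

import urllib.request
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the <title> text and all other text nodes separately."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.body_text = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data
        else:
            self.body_text.append(data)

page = urllib.request.urlopen("http://example.com/").read().decode("utf-8", "ignore")
extractor = TextExtractor()
extractor.feed(page)

phrase = "example domain"  # hypothetical target keyword phrase
print("in title:", phrase in extractor.title.lower())
print("in body :", phrase in " ".join(extractor.body_text).lower())

If both checks come back False, no hidden list of links is going to fix that: the phrase needs to live in the title and the readable content.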


2. Believing that filling your meta tags with keywords will help you

Don't be naive enough to think you can simply tell the search engines what your webpage is about, which is what many meta tags claim to do. Google and the other engines no longer use the keywords meta tag to determine page ranking (Google never did), so it's an obsolete method. Google also uses the text around keywords in the content to generate page descriptions, so a description isn't strictly necessary to include. However, if you want a more consistent description in search listings, you can still use the description meta tag and Google will often pick it up and display it.
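For illustration, a minimal sketch of reading the description meta tag mentioned above; the sample markup is made up.

from html.parser import HTMLParser

sample_html = """
<html><head>
  <title>Role-Playing Game Guides</title>
  <meta name="description" content="Walkthroughs and reviews of classic RPGs.">
</head><body>...</body></html>
"""

class MetaDescription(HTMLParser):
    """Grabs the content of <meta name="description" ...>, if present."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content")

parser = MetaDescription()
parser.feed(sample_html)
print(parser.description)  # -> "Walkthroughs and reviews of classic RPGs."

That one tag is worth setting; stuffing the keywords meta tag is not.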


3. Using link exchanges to create backlinks

It's true that inbound links from other webpages greatly improve your chances of a high search engine listing, but only if they come from related sources. Google won't treat a link from a website about hiking as reliable if your website is about role-playing games. The web crawlers have gotten good at cross-referencing content, and they know the difference between an apple and an orange. People also tend to click only on related links, so most visitors to the site giving you the link would never visit you anyway.


4. Trying to rank for keywords that are not related to the content on your page

Just because you use certain keywords doesn't mean you're going to rank for them, especially if your page talks about something completely different. Just as the engines cross-reference between websites, they also cross-reference within topics. If you planted keywords for "American literature during World War II" but the page really discusses "How YouTube became popular," your page will probably just get buried in the depths of cyberspace. And even if by chance you do rank for the term, once people see your site does not match their search, they will either not click on it or leave immediately after landing on the first page.


5. Falling for schemes that will submit your website to “thousands” of search engines and directories

There are only a handful of search engines out there, and anything else is most likely powered by Google, Yahoo!, Bing, or Ask. The first three account for almost 100% of searches on the Internet. The web crawlers for these engines are also very efficient at what they do, so as long as you're linked into the web (you have at least one inbound link) and your pages are accessible to visitors, your website will get indexed (listed) anyway. You don't need to submit it to anyone, let alone pay for it.
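The "accessible" part is something you can verify yourself. A minimal sketch using Python's standard library to check that your robots.txt isn't accidentally locking crawlers out; the domain is a placeholder.

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://example.com/robots.txt")
rp.read()

# Googlebot is Google's crawler user agent; "*" covers everyone else.
for agent in ("Googlebot", "*"):
    print(agent, "may fetch /:", rp.can_fetch(agent, "http://example.com/"))

If your own homepage comes back as blocked, that, not a missing "submission", is why you aren't indexed.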


6. Falling for schemes that guarantee or promise number 1 spots on search engines

There's much you can control to get ranked: keywords, content, optimizing your website for crawlers, and some inbound links. There's also much you cannot control: competing websites' inbound links, the strength and popularity of a competitor's domain, and the age of pages, to name a few. In the end, the only people who know exactly how the ranking algorithms operate are the engineers who build them. Countless variables we simply don't know about go into page rankings, so no one can guarantee or promise high positions. At best, a good SEO specialist can give your pages a higher probability of landing on the first page, but not much more.


7. Focusing on Google's "PageRank" number for your website

This number tries to place a numerical popularity value on your page, and in large part it reflects inbound links. However, don't get cause and effect confused: the number indicates page popularity; it does not create or boost it. PageRank also does not account for every keyword phrase a given page might rank for, so focusing on this number alone can be misleading. Think of it as a score of how well that page is doing overall relative to other pages on the Internet. Even Google announced that people should not focus on it, which is why they removed it from their Webmaster Tools.
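For the curious, here's a toy sketch of the idea behind the published PageRank formula (not Google's production system, which uses far more signals): each page's score is fed by the scores of the pages linking to it, which is exactly why the number reflects popularity rather than creating it. The link graph below is made up.

damping = 0.85
links = {  # hypothetical link graph: page -> pages it links to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
ranks = {page: 1.0 / len(links) for page in links}

for _ in range(50):  # power iteration until the scores settle
    new_ranks = {}
    for page in links:
        # each linking page passes on its score divided by its out-degree
        inbound = sum(ranks[p] / len(links[p]) for p in links if page in links[p])
        new_ranks[page] = (1 - damping) / len(links) + damping * inbound
    ranks = new_ranks

print(ranks)  # "c" scores highest: the most link weight flows into it

Notice that nothing in the computation rewards you for staring at the score; only changes to the link graph move it.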


8. Listening to "Experts" who say your page won't list well because of its web language

Whether you build your website with HTML, XHTML, CSS, PHP, ASP, JavaScript, or a mix of these, search engines will be able to index your pages. There can be exceptions, such as pages that are coded poorly, but don't fall for SEO specialists' schemes to redesign your entire website just because it's in ASP.NET. Pure myth.


9. Trusting "SEO experts" because they ranked well for a keyword

Many SEO specialists like to tout their ability to rank well for a specific keyword phrase, claiming that this proves they can get your pages to rank as well. However, you need to do your research. Many of these specialists pick keywords that few or no people ever type into search engines, and that few people in the field bother to target. With no competition and no one interested in ranking for a phrase nobody uses, of course someone can step in, target it, and rank well for it. In these cases, it does not matter how many inbound links the competing websites bring in, because those websites do not directly relate to or target the keyword.

Also, watch out for SEO specialists who brag about raw result counts, for example, people who claim they ranked #1 for a phrase with 4 million other results. Search engine result counts include not just the websites using that exact phrase, but all the websites using any combination of the individual words in it. So, if someone searches for "big red hats in London," then websites containing "big red hats in London," websites containing "big red hats," and websites containing "big," "red," or "London" in every other variation you can think of will all be listed. Most of these websites do not relate to the search and do not matter in the grand scheme of things.

It does not matter how many websites show up in the results if no one searches for the phrase, no one targets it, and most of the websites only appear because their content contains snippets of it. Ask for real proof and real results: client data showing the SEO specialist got a page listed in a top position for a targeted, relevant, and commonly used keyword. If they cannot provide such evidence, it's time to look elsewhere.

Where can you check how many searches a keyword gets in a month? Start by visiting Google's Keyword Tool, and start thinking organic and natural.

If you come across anything that sounds like a cheat or a workaround, like it's trying to pull one over on the web crawlers and trick them into ranking your page, it's probably a bad idea. It won't work, and the search engines might even punish your page for the attempt to fool them.

You need to start thinking organically, about how your page would succeed naturally.
Make sure you always…

* Write content that brings real value and useful information to your visitors.
* Use keywords that relate directly to your subject matter. You can hear the importance of relevant long-tail keywords straight from the source at Google.
* Get links from websites in your own market or field, and hope that others will find your information worth sharing.
* When in doubt, review Google's webmaster guidelines. To get an idea of what Google focuses on, you can also read their technology overview page for search.
* Stay away from SEO specialists and consultants who recommend any of the methods warned against here. Trust your gut and don't be afraid to get a second opinion.


The search engines want to find your useful information and pages to share with the world. Start writing about what you know best, format it in the best and most accessible way you can, and you'll already be ahead of most of the game.



SEO Tools – Scripts That Help And Ones That Flunk

It's hard enough to find good resources and programs to help your web pages get ranked on Google, Yahoo!, and Bing without also having to worry about the numerous services that don't work or give you inaccurate information. Check out these free SEO tools: scripts that target useful information but don't live up to their promises, and others that do!


1. Domain Age Checker

The guys over at WebConfs.com host a plethora of pages where you can plug in info and get the data you need. Unfortunately, while domain age matters, with older websites seen as more trustworthy by search engines and harder to knock out of the top ten results for a keyword, their Domain Age Tool flat out lies. It's so bad, in fact, that it can be as much as 6-7 years off! Given that the Internet only really started booming in the '90s, that's a huge margin.

Lucky for us, SEOLogs.com hosts a similar program that actually reports good estimates. Of course, if you want the best information, you can always look up the domain at www.whois.com.
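If you'd rather skip the middlemen entirely, here's a minimal sketch of querying the registry yourself over the raw WHOIS protocol (TCP port 43), which is what sites like whois.com front-end. It assumes a .com domain, which the Verisign registry server answers for; other TLDs use other servers.

import socket

def whois_creation_date(domain):
    # whois.verisign-grs.com is the registry WHOIS server for .com domains
    with socket.create_connection(("whois.verisign-grs.com", 43)) as sock:
        sock.sendall((domain + "\r\n").encode())
        response = b""
        while chunk := sock.recv(4096):
            response += chunk
    for line in response.decode(errors="ignore").splitlines():
        if "Creation Date:" in line:
            return line.strip()
    return None

print(whois_creation_date("example.com"))

The creation date the registry reports is the ground truth that the estimation tools above are trying (and sometimes failing) to reproduce.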


2. Traffic Rankings

For years people heralded (and many still do) Alexa as the gold standard for traffic rankings. When considering the ability to rank a page for a keyword, domain strength and popularity matter, and so does the volume of inbound traffic. As a webmaster for multiple clients, I can easily cross-reference actual traffic against Alexa's rankings, and I can tell you with certainty that Alexa at times gave clients who received triple the traffic a lower score and position than clients who received substantially less. It's a faulty system that sometimes works and often does not. People need to stop promoting and using it. The same goes for Compete.com.

Quantcast does much better, but that's because it requires a tracking script, much like Google Analytics, and can therefore only compare sites within its own circle. So unless Google releases its own Alexa, it's a crapshoot trying to sort out rankings among these three resources. Google's PageRank might be the only alternative, but with a range of 0-10, it's hardly precise enough.


3. Keyword Generator

First off, do not buy a program to generate keywords for you. You know your market best and the lingo it uses; build your own list and branch out from there (a short sketch of this follows below). Second, do not bother with Wordtracker's Free Keyword Suggestion Tool. It only gives you data based on what people type into the tool, not into search engines! You can guess that only people concerned about keywords, not your actual market, are feeding it information. Wordtracker does not even supply an ample list. For example, I typed in "SEO Tips" and it gave me three phrases: "seo tips," "free msn seo tips," and "seo copywriting tips." It tells me nothing about which search engines they're useful for or what my competition might be like.
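As a starting point, here's a minimal sketch of building that list yourself from seed terms and modifiers you already know from your market; every term below is a made-up example.

from itertools import product

seeds = ["seo tips", "seo tutorial"]
modifiers = ["free", "beginner", "best"]

candidates = set(seeds)
for modifier, seed in product(modifiers, seeds):
    candidates.add(f"{modifier} {seed}")
    candidates.add(f"{seed} {modifier}")

for phrase in sorted(candidates):
    print(phrase)  # paste these into Google's Keyword Tool to check real volume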

Your best free option, straight from the source, is Google's AdWords Keyword Tool. Originally designed for AdWords campaigns, it bases its estimates on search data directly from Google, which handles roughly 67% of online searches. It tells you the monthly search volume, the current search volume, and how much competition targets the word (assuming businesses know their stuff, high competition means the term will be difficult to rank for). It also lists many variations of the phrase, related phrases, and even loosely related phrases. It truly lets you know whether or not you're wasting your time trying to rank for keyword phrases people rarely search for. It also lets you filter results by language and region, and you can even get recommended terms pulled directly off websites. It's awesome, and it's the best keyword tool you'll find!

Lastly, Don't Be a Tool Yourself!

There are plenty of dishonest and bad SEO practices going around. Don't let some SEO specialist or consultant pull your chain and waste your money. If your gut starts sending you signals and you don't see any good results, it might be time for a second or even a third opinion! And remember, I'm always around for a bit of hand-holding too. Feel free to send a question my way!


4. Duplicate Articles – Search Engine Listing Suicide

So you've decided to give blogging and article marketing a shot to increase traffic, create inbound links, and improve your listing position on the major search engines. You've already written a handful of articles in your niche market to distribute to high-traffic publication sites. But after you submitted them, you noticed that many did not rank in Google and the other search engines, even though you optimized them for keywords and can find the articles indexed there.

Now ask yourself…

How many websites did you submit each individual article to?

You’ve probably fallen victim to the duplicate articles dilemma.

Rewind to a few years ago when “Internet marketing gurus” were pushing their viral marketing strategies and encouraging people to submit the same article to every known article directory they could get their hands on. Well, guess what? They were encouraging countless people to essentially spam the search engines with their writings and trick a system big on copyrights and original content. Everybody knows how huge a taboo spam is. Eventually, Google and the other guys caught on and implemented algorithms in their web crawlers to weed out and bury the duplicates.

And who can blame them? How reliable would users consider an engine that listed the same exact information in the top ten spots for a search, just on different websites? That's like a library offering only different editions of the same book when someone is looking for research on a particular topic. What if someone turned in a paper whose works cited listed 10 sources that were all essentially the same piece of information? Intelligent people would find a new library, and the guy who wrote the paper would probably fail the assignment.

Google wants to bring value to their users and offer a service second to none; that's how they climbed to the top in the first place. So if you're trapped in the habit of creating duplicate content, ask yourself the same question: are you creating any real value for people?


5. The Reality of Article Submission

Most article databases receive very little traffic to begin with and return little residual effect. Their traffic cannot compete with more prominent sites like universities and news sources, or with high-traffic directories and blogs. If you've spent several hours on a top-notch article, why publish it on www.most-awesome-articles.com (not a real website), where it won't get ranked or found by users, when sites like WordPress, HubPages, Squidoo, and EzineArticles are around? If you wanted to include a guest piece in your magazine or paper, which source would you rely on: one from a prominent editor at CNN, or a random letter from spikeymikey@nowheremail.com?

Google thinks in a similar fashion on their end. Why list an article from www.most-awesome-articles.com when they already have a version from HubPages?

You might be thinking to yourself: what about Reuters? What about authorized syndication? It's true that Google and the other engines allow some room for syndication from common news sources, but most of us do not operate in those circles or have free access to that content, so why take the chance? Probability dictates that Google will pick up one of your articles, but the rest will get lost in the engine, and with them all the time you spent submitting to dozens of useless directories.

Truth be told, no one can say why Google keeps one version of web copy over another. It could be the age of the page (how long it's been live online), the reliability of the source, or the code-to-text ratio compared to the other versions… and sometimes the engine will list more than one copy. It's all guesswork, but you can bet that more often than not, only one version will remain viable.


6. The Solutions That Almost Saved You

But then along came a hybrid solution to try to trick the engines again: rewrite the article so the web crawlers think it's something new, letting you submit almost the same article to at least the top publication sites without creating anything fresh. And so began the age of www.dupecop.com and similar services like Article Checker, which check the percentage difference between documents.

Here too, though, writers beware: Google's web crawlers are more sophisticated than people give them credit for, and even an article that appears to be 50% different can be flagged as a dupe. If you insist on this technique, a service like QuoteFinder (http://blogoscoped.com/quotefinder/) can help: just change the sentences that show up in more than one online source in your new article.
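For the curious, the "percentage difference" idea is easy to approximate yourself. A minimal sketch using difflib from Python's standard library; the two articles here are stand-in strings.

import difflib

article_a = "How YouTube became popular with everyday video creators."
article_b = "How YouTube got popular with everyday video makers."

# ratio() returns 0.0-1.0; higher means the two texts overlap more
ratio = difflib.SequenceMatcher(None, article_a, article_b).ratio()
print(f"similarity: {ratio:.0%}")

Keep in mind the warning above: scoring "50% different" on a tool like this is no guarantee the engines will treat the copies as distinct.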

The Best Solution for Duplicate Content… Doh!

I, however, have an even better solution: stop creating duplicate articles entirely and start writing original, unique content that offers real value to your readers and, hopefully, potential clients. Nobody likes a fraud, a hack, or a two-timer, and that's exactly what you'll look like if smarter-than-you-think readers find out you're duplicating content, or worse, simply rehashing someone else's. Don't destroy your credibility, visibility, and profitability out of laziness. Make each article you write count and make sure it's one of a kind.

The following benefits of blogging and article marketing can be yours…

* Higher page position in search engines (with inbound links)
* Increased webpage traffic
* Increased credibility
* Increased visibility
* Increased clients and customers
* Increased profits


But if you duplicate content, all you'll do is lose the above benefits, plus…

* Waste time
* Waste money
* Become discouraged


What to do with your now unwasted time?

Search engines also like pages to stay fresh! Update your articles from time to time: add new information, take out obsolete parts, add some graphics, optimize the keywords, and keep editing until you rank on the first page for your targeted keywords. A well-maintained, fully tweaked article can bring in more traffic and more business than 15 hurried and forgotten pieces any day!

Online publication and writing can be one of the strongest ways to market yourself and your business online, as well as improve your search engine listing position. So do continue with your blogging and article marketing, and do submit your best work so people can find it on EzineArticles, your blog, or other sites, but steer clear of the temptation to submit duplicates; in the end, you're just committing Internet marketing suicide. And as with spam, we know how deadly that can be.

Source: http://golearnweb.com/seo-tutorials/bad-seo.html
