What is Important for Search Engines?


I recently received a question from someone seeking some SEO advice. He was in the process of a redesign and wanted to make sure not to make any errors along the way, which is super-smart! The time to be looking at SEO is definitely at the very first stage of any design or redesign project.

The interesting part of the email was this person’s misconceptions about what he thought were key elements for the search engines. I’d like to go through those points with you, with my comments following each of them:

* Little or no Flash.

This is a huge misconception among many who are trying to design search-engine-friendly websites. There’s nothing inherently wrong with using Flash and no reason to avoid it altogether. What you do need to avoid is an all-Flash site, as well as Flash navigation. But that’s it. And in many cases, if you have those things, there are workarounds.

* All scripts need to be called from external files.

This is a great idea for keeping file size down and making it simple to update your pages, but it has nothing to do with the search engines or how your pages are ranked in them. Search engines have long known how to ignore code that is of no use to them. Whether your scripts are right there in the source code of the page or called up externally will have no bearing on your rankings or search engine relevance.
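To illustrate the two approaches, here is what they look like side by side (the function and file name are placeholders, just for illustration); the engines skip over both when indexing your content:

```html
<!-- Inline: the script lives right in the page source -->
<script type="text/javascript">
  function showMenu() { /* ... */ }
</script>

<!-- External: the same script called up from a separate file -->
<script type="text/javascript" src="menu.js"></script>
```

The external version is easier to maintain because one file change updates every page, but that is a convenience for you, not a ranking factor.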

* The site should be made using CSS as much as possible.

Another myth. CSS doesn’t have any special properties that search engines like better than tables or any other HTML. Again, it may make it much easier for you to update your pages, or to reuse your content for other purposes, but it’s not an SEO technique that will increase rankings or relevance.

* The CSS should be called via external files.

Same as calling up scripts from external files — nice to do, but not a search engine issue in the least.

* There should be no comments in the code. They should be put into an FAQ or doc-type file.

Why not? I’m not sure where this myth originated, but I suppose if you’re convinced that file size is going to affect your engine rankings, you might believe this one too. It may also have come about because some people used to believe that adding keyword phrases to comment tags would help search engine rankings, even though it didn’t. Comment tags have long been ignored by the engines, and because of this, you can use them as much or as little in your source code as you like. I regularly comment out bits of text and code that I no longer wish to use but that I may want to add back in later. It’s absolutely, positively not an issue!
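For example, this sort of thing is perfectly safe (the promo text is made up for illustration); the engines simply ignore everything between the comment markers:

```html
<!-- Seasonal promo: commented out for now, may bring back later -->
<!--
<p>Save 20% on all orders through Friday!</p>
-->
<p>Welcome to our store.</p>
```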

* A large percentage of the code on each page needs to change from page to page so the search engines don’t see the pages as duplicate content.

No. You certainly do NOT have to change the code in your pages to avoid duplicate content issues! Website templates have code that is exactly the same from page to page. This is great and normal and definitely fine with the search engines. You’d have to think the search engineers were really foolish if they were going to punish pages just for using the identical design template from page to page! Sure, you don’t want the same exact *content* on every page of your site, but even that is not commonly a problem if it’s a few paragraphs here and there. (See my recent article on Danny’s Search Engine Land site on the Myth of Duplicate Content.)

* All image links should have text links beneath the images.

No reason for this at all. Image links that make use of the image alt attribute (aka “alt tags”) have always been followed easily by the search engines and will continue to be followed. They’re followed even without the alt attribute, but the words you place in there tell the search engines and your site users just what they’ll be getting when they follow the link. It’s essentially the same thing as the anchor text of a text link.
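Here’s a typical image link with the alt attribute filled in (file names and link targets are made up for illustration); the alt text does much the same job as the anchor text of a regular text link:

```html
<a href="widgets.html">
  <img src="logo.gif" alt="Blue widgets from Acme" width="120" height="60">
</a>
```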

* DO NOT use drop-down or fly-out menus using JavaScript.

This is fairly good advice; however, there are very easy workarounds if you have to use JavaScript menus for some reason. The “NoScript” tag is a perfectly legitimate place to recreate your menu for those who (like the search engines) can’t follow JavaScript. I’ve been using this approach since 2000 or so, when my website was built with JavaScript menus, and it’s most certainly not a problem. I just haven’t gotten around to redesigning my own site with more crawler-friendly navigation. Certainly these days, a CSS menu would be a far better option.
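A minimal sketch of the NoScript approach (the script and page names are placeholders): the plain text links mirror whatever the JavaScript menu offers, so crawlers and script-less visitors can still get around.

```html
<script type="text/javascript" src="menu.js"></script>
<noscript>
  <!-- Plain text links duplicating the JavaScript menu -->
  <a href="index.html">Home</a> |
  <a href="services.html">Services</a> |
  <a href="contact.html">Contact</a>
</noscript>
```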

* Must use basic HTML link navigation (textual navigation, no JavaScript mouse-overs, and no image map graphical navigation).

Yes and no. JavaScript links are definitely a bad practice. But there are plenty of crawler-friendly image maps, and as I mentioned previously, graphical links are A-OK with the search engines.
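A crawler-friendly client-side image map looks something like this (the file name and coordinates are illustrative only); each area gets its own href and alt text for the engines to follow:

```html
<img src="nav.gif" alt="Site navigation" usemap="#mainnav" width="300" height="50">
<map name="mainnav">
  <area shape="rect" coords="0,0,100,50" href="index.html" alt="Home">
  <area shape="rect" coords="100,0,200,50" href="about.html" alt="About Us">
  <area shape="rect" coords="200,0,300,50" href="contact.html" alt="Contact">
</map>
```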

* All pages have to be VALIDATED by an HTML validator and all style sheets need to be VALIDATED through a CSS validator.

Why? This has nothing to do with search engines. It’s nice to do, though.

* The majority of the site will be static, as static pages are much easier for search engines to spider and rank properly.

Dynamic pages are just as easy to crawl and rank as static pages. Most websites today are dynamic because they’re simply easier to maintain. The search engines figured out how to crawl and rank them just fine many years ago. It’s true that there are certain things you need to watch out for when creating a dynamic site, but most developers are aware of the worst of those issues.

You certainly should consult with an SEO if you’re changing content management systems, or if you’re having problems getting your dynamic URLs spidered and indexed. But there’s no reason to have only static pages on your site just because you’re worried about the search engines being able to index dynamic pages.

* The site needs to be browser-compatible and screen-resolution-compatible.

This is another thing that’s nice to do for your site visitors, but it has no bearing on search engine rankings or relevance.

Phew! I hope this helped clear up some misconceptions that you may have had as well. Please don’t get me wrong – I do agree that most of the things listed here are great design tips that can help you develop an awesome, user-friendly website. I just want to make it very clear that they have nothing to do with SEO, rankings, spidering, indexing, etc.
