Monday, October 31, 2005

Free Content For Your Web Site

Are you looking to add some quick, quality content to your web site? A number of web sites provide articles you can freely add to your web site, as long as you follow a few rules. The most common rule is to provide a link to the author's web site. You get a free article; the author gets a free link to their web site.

Here are links to several web sites that provide free content you can use. The first is part of the Jayde Online Network; it is an article search engine and directory that is updated daily.

Valuable Content provides free content for your newsletter, ezine, or web site. You may use articles or poems from the Valuable Content web site, with the requirement that you include the author resource box associated with the article and a link to Valuable Content.

Article City articles may be used, in whole or in part, provided that author by-lines and links are kept intact, unchanged, and functional.

Articles 4 Content - I could not find the terms of use for articles from this site. Submitting articles is quick and easy, with no sign-up required.

Article Beam allows you to freely copy and publish any of the articles in their directory, so long as they are published exactly as they appear. RSS feeds are also available.

Article Hang Out - Uses the same software as Article Beam, though the two appear to have different libraries of articles. I don't know if they are related, but submitting articles to both Article Beam and Article Hang Out is quick and uncomplicated.

Speaking of submitting articles: While adding freely available articles to your web site is nice, it is even better if others put articles you've written on their web sites. Other web sites will then be promoting your products and services, and they will be providing valuable links back to your web site. So write a few articles yourself, submit them to the above services, and let other people promote your web site.

Thursday, October 27, 2005

Sblogs

Until recently, one popular blogging service was wide open to those who used sblogging as a marketing tool.

What is a sblog?

It is a blog that was created for spamming purposes. Automated software can easily create hundreds, even thousands, of blogs whose purpose is to promote and provide links to some web site. These blogs have no value to anyone who might read one; their only purpose is to spam the search engines for promotional purposes. That's why they are called sblogs -- spam blogs.

A simple solution that stops the automated software is to require that a code, displayed in a distorted image, be typed in before a new blog entry is published. The service has instituted such a system, and I cheer on this change with a big HURRAY!!

Thursday, October 20, 2005

Free Press Release Distribution Services - Part 2

In May I posted an article about Free Press Release Distribution Services and provided links to several services. Here are links to additional free press release distribution services:

The Open Press - Be sure to follow the submission guidelines and press release format. They have very specific requirements.

PR Free - Easy to use, easy to understand press release submission.

Press Method - I can't say that I've ever seen any evidence that a press release I've submitted through this service has been seen by anyone. If you make a contribution, your press release will get better exposure. (This is true of several of the free press release distribution services.)

- Quick and easy to submit a press release.

PR News Now - I see press releases from PR News Now show up in search results, but I'm not sure how this site works. Much of the site, including the HELP section, seems to be nonfunctional.

- Only accepts a limited number of free press releases, but it does not hurt to give them a try.

Wednesday, October 19, 2005

Google Jagger Update

Google is going through a major update in their index. This update is being called Jagger by the SEO industry.

If you need information about this update, about how to report spam to Google, about how to ask that your web site be reincluded in Google, or just general information about what's happening at Google, the place to go is Matt Cutts' blog: Gadgets, Google and SEO. Matt is a Google employee who has been posting excellent and very useful information.

Tuesday, October 18, 2005

The robots.txt File

Take a look at your log files for 404 errors and you may see that the robots.txt file tops the list. This is one of the most frequently accessed files on a web site and, if you don't have one, it will top the list of files generating 404 (page not found) errors.

The robots.txt file is accessed so frequently because search engine spiders check it for rules that tell them where they can and cannot go on your web site. It's an important file because you can use it to keep spiders out of folders you don't want indexed, such as your images or stats folders.

The robots.txt file is a plain text file that goes in the root folder of your web site.

If you just want to stop the 404 errors, you can use an empty robots.txt file. There does not need to be any content in the file.

Here is an example of what might be included in a robots.txt file:

User-agent: *
Disallow: /setup.php
Disallow: /cgi-bin/
Disallow: /images/

The "User-agent" variable identifies the specific spider. In this case the asterisk means that the rules apply to all spiders.

The variable "Disallow:" identifies files or folders that the User-agent may not visit. In this case spiders may not index the setup.php file, nor may the index any of the files in the "cgi-bin" and "images" folders.

Here's another example:

User-agent: googlebot
Disallow: /images/

In the above the Google spider, Googlebot, is being excluded from the images folder.

These two examples can be combined:

User-agent: googlebot
Disallow:

User-agent: *
Disallow: /setup.php
Disallow: /cgi-bin/
Disallow: /images/

In the above, the empty Disallow line means nothing is disallowed for Googlebot, so it may index all files and folders on the web site. (The robots exclusion standard expects every record to contain at least one Disallow line; an empty value means "disallow nothing.") All other spiders may not index the setup.php file, nor the "cgi-bin" and "images" folders.
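The per-spider precedence can be verified the same way with Python's standard urllib.robotparser module. This sketch uses an explicit empty Disallow line under googlebot, which the original robots exclusion standard uses to mean "nothing is disallowed"; the spider names and file path are illustrative.

```python
# Verify that per-spider records take precedence over the "*" record.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: googlebot
Disallow:

User-agent: *
Disallow: /setup.php
Disallow: /cgi-bin/
Disallow: /images/
""".splitlines()

parser = RobotFileParser()
parser.modified()    # mark rules as freshly loaded so can_fetch() works
parser.parse(rules)

# Googlebot matches its own record, which disallows nothing...
print(parser.can_fetch("googlebot", "/images/logo.gif"))     # True
# ...while every other spider falls through to the "*" record.
print(parser.can_fetch("SomeOtherBot", "/images/logo.gif"))  # False
```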

One of the best places to go for information about the robots.txt file is an all-inclusive source on the robots.txt file and the Robots Exclusion Standard, which also provides articles about writing well-behaved web spiders. Topics covered include:

The Web Robots FAQs - Frequently Asked Questions about Web Robots, from Web users, Web authors, and Robot implementers.

Robots Exclusion - Find out what you can do to direct robots that visit your Web site.

A List of Robots - A database of currently known robots, with descriptions and contact details.

The Robots Mailing List - An archived mailing list for discussion of technical aspects of designing, building, and operating Web Robots.

Articles and Papers - Background reading for people interested in Web Robots.

Related Sites - Some references to other sites that concern Web Robots.

Wednesday, October 12, 2005

Google Webmaster Guidelines

I recently posted a link to a page with questions and answers about MSN search. So what about Google?

Google has recently updated their webmaster information pages. They provide an outstanding resource to help you understand what Google does with web sites. While they don't give away any inner secrets, it is well worth reading the Google Information for Webmasters pages.

Saturday, October 08, 2005

MSN Responds To Questions

Here's some interesting information. MSN responds to questions submitted by SEO Chat in the following areas:

A) Growth, Relevancy & Technology
B) Code & Crawling
C) Spam, Penalties & Ranking Questions
D) Webmaster Recommendations
E) MSN News and Future Offerings

Read the questions and answers at SEOmoz.