Friday, August 12, 2011

.htaccess Files

A .htaccess (hypertext access) file is a directory-level configuration file, supported by several web servers, that allows decentralized management of web server configuration.

.htaccess files (or "distributed configuration files") can override many other configuration settings, including content type and character set, CGI handlers, and more.

These files are placed inside the web tree and can override a subset of the server's global configuration for the directory they sit in and all of its sub-directories.

Although .htaccess is only a file, it can change settings on the server and let you do many different things, the most popular being custom 404 error pages.

.htaccess isn't difficult to use and is really just made up of a few simple instructions in a text file.

Things .htaccess can do include: password-protecting folders, redirecting users automatically, serving custom error pages, changing your file extensions, banning users with certain IP addresses, allowing only users with certain IP addresses, stopping directory listings, and using a different file as the index file. A few of these are shown below.
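As a rough sketch for an Apache server (the file names, IP address, and URL below are placeholders, not taken from this post), a .htaccess file covering a few of those features might look like:

# Serve a custom 404 error page
ErrorDocument 404 /404.html

# Redirect an old page to a new location
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Ban a specific IP address
order allow,deny
deny from 192.168.1.100
allow from all

# Stop directory listings
Options -Indexes

# Use a different file as the index file
DirectoryIndex home.html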

Thursday, August 11, 2011

Photo Sharing

Photo sharing is one of the newest internet-based activities, offering plenty of resources for promoting business websites that deal in products or services, and it is all the rage with SEO personnel. Search engine bots index not only website content but also photos, which then appear in search result pages.

If you are running a business, photo sharing is a great way to gain exposure for your products and services. Photo sharing, an online trend for getting noticed by the millions of people surfing the internet in search of websites, products, or services, is the latest addition to our kit of customized SEO services for website promotion. It drives lots of traffic to websites. Create an account, create albums by category, and upload your photos. As part of image optimization, give every photo full details, including a title, keywords, and a description with links. This helps your product appear in Google for specific keywords.

Image optimization with relevant and popular keywords is good SEO practice for website promotion. Images with easily crawlable URLs are highly indexable by search engine robots. Alphabetical URLs are more search engine friendly than numerical ones: visitors use words, not figures, as the search phrases to locate pictures of the products, services, or brands they are looking for.
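As a small illustration (the file name and text are invented for this example), an image optimized along these lines might be marked up as:

<img src="red-leather-handbag.jpg" alt="Red leather handbag with brass buckle">

The descriptive, word-based file name gives the image a crawlable, keyword-rich URL, and the alt text describes the picture in the words searchers actually type.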

Searching for pictures with optimized URLs is faster and easier in several different browsers.

Wednesday, August 10, 2011

Link Relevance

Link relevance is entirely dependent on the keywords that are used on the linking pages. So let's assume a random 3rd party page links to your intended landing page. If you were trying to optimize for Keyword A, then that link would have high link relevance if Keyword A appeared throughout the linking page. The most relevant link possible for Keyword A would be from a page that has been completely optimized for Keyword A. In other words, Keyword A is utilized in that page's Title tag, Meta Description, H1 tag, and a handful of times in the body content.
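As a rough sketch (the markup is illustrative, with "Keyword A" standing in for a real phrase), a page fully optimized for Keyword A would use it in all of those places:

<html>
<head>
  <title>Keyword A - Example Page</title>
  <meta name="description" content="A short page all about Keyword A.">
</head>
<body>
  <h1>Keyword A</h1>
  <p>Body copy that mentions Keyword A a handful of times...</p>
</body>
</html>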

If you have begun working on search engine optimization efforts for your website, you already know that search providers evaluate the relevance of a site largely based on the links that go into and out of it.

The rank of sites that link to your page and the anchor text they use to link to you can make a big difference in how highly your site is ranked.

The anchor text of the link that sends visitors to your site is very important to your search engine rank, but the link relevance of your outbound links can affect your site as well.
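For example (the URL and link text here are placeholders), a link like the following passes the phrase "blue widgets" to search engines as a relevance signal for the page it points to:

<a href="http://www.example.com/widgets.html">blue widgets</a>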

Assuming the link relevance of two links is equal, the link with the highest SEO value will be the link from the page with more importance (i.e. higher PR). However, if you had to choose between a link with high relevance or high importance, then you should almost always opt for the link with more relevance.

Sunday, August 7, 2011

Home page and index.html page

Every website is built inside directories on a Web server. And each Web page is a separate file on that Web server. But sometimes, when you go to a URL, there is no file listed in the URL.

When you go to a URL without a file named at the end, the server looks for a default file and displays that automatically.

There are three commonly used default page names that you can use on most Web servers:
    * index.html
    * index.htm
    * default.htm (on some Windows servers)
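On Apache servers, the set of default file names (and the order in which they are tried) is controlled by the DirectoryIndex directive; a minimal sketch, which you could also place in a .htaccess file:

# Try index.html first, then fall back to index.htm
DirectoryIndex index.html index.htm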

It's a good idea to stick with index.html or index.htm on most servers, as default.htm is most often used on Windows servers, and isn't as common as it used to be.

You Should Have an index.html Page in All Your Directories

Whenever you have a directory on your website you should have an index.html page. This allows your readers to see a page when they come to that directory without typing a file name in the URL. It also prevents them from seeing things you might not want them to see.

If you don't put an index.html file in a directory, most Web servers will display a listing of all the files in that directory. In some situations you might want that, but most of the time this is ugly at best and a security hole at worst. Writing a default Web page and naming it index.html helps solve those problems.

Your Homepage Should be an index.html Page

When you start building your website, you should create your main page and name it index.html. This is true whether you're using a free hosting service or you have your own domain name. That way, when people come to your URL, they automatically get your main page.

All other pages will have names like "about.html" or "contact.html", but your home page file should be called "index.html".

Video Promotion

Video promotion means promoting your video on the web.

Since almost everyone now has videos on the web, the second action we need to take is promoting those videos. Uploading your video to YouTube or other video sharing websites is not enough, because your video on YouTube is one among billions of videos there. After you upload the video to YouTube or any other website, you should promote it as much as you can. This is very important to your video's success.

Most people still feel that video making and video promotion are not easy jobs, or are something that needs qualified help and expertise. In fact, it is a simple task that does not demand technical know-how. There is a surprisingly capable video making tool available on almost every modern computer, and that tool is Windows Movie Maker.

Once you have finished making a movie for advertising or marketing, the only thing left to do is upload it to the popular free video hosting websites such as YouTube, Metacafe, VideoSurf, Dailymotion, and many others. This type of promotion can play an extremely significant role in attracting potential viewers and customers.

Saturday, August 6, 2011

Google Sandbox

The Google Sandbox Effect is a theory used to explain why newly registered domains, or domains with frequent ownership changes, rank poorly in Google Search Engine Results Pages (SERPs). In other words, new websites are put into a “sandbox”, or holding area, and have their search ratings put on hold until they can prove themselves worthy of ranking.

Once Google deems a website to be of quality and importance, the website will be removed from the Sandbox and can potentially show up high in Google Search Engine Results Pages. Webmasters can do numerous things to improve their website's standing with Google, but time really is the key to getting out of the Sandbox. Sandbox believers say it can take anywhere from 6 months to a year, and sometimes longer, before Google will promote a website out of the Sandbox.

Because Google does not acknowledge the Sandbox and it has not been clearly proven, the Sandbox Effect is just a theory. Even so, the Sandbox is believed in by the majority of webmasters. It is thought to have come about in 2004, when changes to Google’s algorithm kept new websites from reaching the top of Google Search Engine Results Pages (SERPs).

It may seem that the Sandbox is unfair to newly launched websites, but Google created it for good reasons. Google was trying to stop spammers from quickly pushing a site to the top of search results, getting dropped from Google, creating a new site, and showing up high in search results all over again. In the past, companies would put up a promotional website for a short period of time, and once the promotion was over the website was gone. These promotional websites would still show up high in Google Search Engine Results even after they were removed, causing many broken links and unhappy Google searchers.

Even with the Sandbox Effect, it is still possible for newly launched websites to make it to the top of Google Search Engine Results Pages (SERPs). If Google deems a new website worthy, it can appear in search results immediately, but it can still take up to 6 months for the website to rank to its fullest potential. Web designers use many methods to try to avoid the Sandbox, but because of its uncertainty, even if all algorithm variables are followed there is still no way to guarantee a new website will stay out of the Sandbox.

Friday, August 5, 2011

Dynamic URLs vs. Static URLs

In earlier times, all websites used static HTML pages, so the first search engines were oriented towards static web pages. As web technology developed, new methods of generating websites appeared, and dynamically generated web pages came into being.

A dynamic URL is the address of a Web page whose content depends on variable parameters that are provided to the server that delivers it. The parameters may already be present in the URL itself, or they may be the result of user input. A dynamic URL can often be recognized by the presence of certain characters or character strings, such as ?, &, and =, in the URL (visible in the address bar of your browser).

Example of a dynamic URL:
http://www.domainname.com/page.php?mode=view&u=123

Dynamic URLs are generated from specific queries to a site’s database, which can result in different URLs for the same content. Static URLs, by contrast, keep the same URL or file name until the webmaster changes the page's HTML code.

A static URL is a URL that does not change over a period of time. A static URL will also not contain any URL parameters.

Static URLs look like this:
http://www.domain.com/page.html

Static URLs are typically ranked better in search engine results pages, and they are indexed more quickly than dynamic URLs, if dynamic URLs get indexed at all. Static URLs are also easier for the end-user to view and understand what the page is about. If a user sees a URL in a search engine query that matches the title and description, they are more likely to click on that URL than one that doesn't make sense to them.
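If your site produces dynamic URLs, one common workaround on Apache servers is URL rewriting with mod_rewrite, which lets a static-looking URL serve a dynamic page. A rough sketch for a .htaccess file (it reuses the page.php example above and assumes mod_rewrite is enabled):

RewriteEngine On
# Serve the static-looking /view/123 from the dynamic page.php?mode=view&u=123
RewriteRule ^view/([0-9]+)$ page.php?mode=view&u=$1 [L]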

Thursday, August 4, 2011

Robots.txt

The Robot Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention to prevent cooperating web crawlers and other web robots from accessing all or part of a website which is otherwise publicly viewable.

Robots are often used by search engines to categorize and archive web sites, or by webmasters to proofread source code. The standard is different from, but can be used in conjunction with, Sitemaps, a robot inclusion standard for websites.

Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but search engines generally obey what they are asked not to do.

The format is simple enough for most intents and purposes: a User-agent: line to identify the crawler in question, followed by one or more Disallow: lines to keep it from crawling certain parts of your site.

The basic structure of a robots.txt:
User-agent: *
Disallow: /

(This particular example tells every robot to stay out of the entire site.)
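A more typical robots.txt allows crawling in general and fences off only selected areas; a sketch (the directory names are placeholders), including an optional Sitemap line from the inclusion standard mentioned above:

User-agent: *
Disallow: /cgi-bin/
Disallow: /private/

Sitemap: http://www.example.com/sitemap.xml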

Google Blogger has introduced a robots.txt file on each Blogger blog.
To check the robots.txt file of your Blogger blog, just type the following URL in the address bar of your browser.

http://www.yourblogname.blogspot.com/robots.txt
(replace yourblogname with your blog name).
