Sunday, August 7, 2011

Video Promotion

Video promotion means actively marketing your video across the web so that people can actually find it.

Since almost anyone can put videos on the web, the second step is to promote them. Uploading your video to YouTube or another video-sharing website is not enough, because there it is just one video among billions. After you upload a video to YouTube or any other site, you should promote it as much as you can. This is very important to your video's success.

Most people still feel that making and promoting videos is a difficult job, or something that needs qualified help and expertise. In fact it is a simple task that does not demand deep technical knowledge. A capable video-making tool, Windows Movie Maker, ships with almost every modern Windows computer.

Once you have made your advertising or marketing video, all that is left to do is upload it to the popular free video-hosting websites such as YouTube, Metacafe, VideoSurf, Dailymotion, and many others. This type of promotion can play a significant role in attracting potential viewers and customers.

Saturday, August 6, 2011

Google Sandbox

The Google Sandbox Effect is a theory used to explain why newly registered domains, or domains with frequent ownership changes, rank poorly in Google Search Engine Results Pages (SERPs). In other words, new websites are put into a “sandbox”, a holding area, and have their search rankings held back until they can prove themselves worthy of ranking.

Once Google deems a website to be of quality and importance, it is removed from the Sandbox and can potentially rank high in Google Search Engine Results Pages. Webmasters can do numerous things to improve their website's standing with Google, but time really is the key to getting out of the Sandbox. Sandbox believers say it can take anywhere from six months to a year, and sometimes longer, before Google will promote a website out of the Sandbox.

Because Google does not acknowledge the Sandbox and its existence has not been clearly proven, the Sandbox Effect remains just a theory. Even so, it is accepted by the majority of webmasters. The Sandbox is believed to date from 2004, when changes to Google’s algorithm kept new websites out of the top of Google Search Engine Results Pages (SERPs).

It may seem that the Sandbox is unfair to newly launched websites, but Google created it for good reasons. Google was trying to discourage spam sites from quickly reaching the top of search results, getting dropped from Google, then reappearing as a new site and repeatedly ranking high again. In the past, companies would put up a promotional website for a short period and take it down once the promotion was over. These promotional websites would still rank high in Google Search Engine Results even after they were removed, causing many broken links and unhappy Google searchers.

Even with the Sandbox Effect, it is still possible for newly launched websites to reach the top of Google Search Engine Results Pages (SERPs). If Google deems a new website worthy, it can appear in search results immediately, though it can still take up to six months for the site to rank to its fullest potential. There are many methods web designers use to try to avoid the Sandbox, some of which are discussed below. But because of the uncertainty involved, even if every known algorithm variable is satisfied, there is still no way to guarantee that a new website will stay out of the Sandbox.

Friday, August 5, 2011

Dynamic URLs vs. Static URLs

In the early days of the web, all websites used static HTML pages, so the first search engines were oriented towards static web pages. As web technology developed, new methods of generating websites appeared and dynamically generated web pages came into being.

A dynamic URL is the address of a Web page with content that depends on variable parameters that are provided to the server that delivers it. The parameters may be already present in the URL itself or they may be the result of user input. A dynamic URL can often be recognized by the presence of certain characters or character strings that appear in the URL (visible in the address bar of your browser).

Example of a dynamic URL:
http://www.domainname.com/page.php?mode=view&u=123

Dynamic URLs are generated from specific queries to a site's database, which can result in different URLs for the same content. Static URLs, by contrast, stay the same: the URL or file name of a web page does not change until the webmaster edits its HTML.

A static URL is a URL that does not change over a period of time. A static URL will also not contain any URL parameters.

Static URLs look like this:
http://www.domain.com/page.html

Static URLs typically rank better in search engine results pages, and they are indexed more quickly than dynamic URLs, if dynamic URLs get indexed at all. Static URLs also make it easier for the end user to see what the page is about. If a user sees a URL in a search result that matches the title and description, they are more likely to click on it than on one that makes no sense to them.
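As a rough sketch of the distinction, a URL that carries a query string is dynamic in the sense described above, and its parameters can be pulled apart programmatically. This uses Python's standard urllib.parse module, with the two example URLs from this post:

```python
from urllib.parse import urlparse, parse_qs

def is_dynamic(url):
    """Treat a URL as dynamic if it carries a query string (?key=value pairs)."""
    return bool(urlparse(url).query)

dynamic_url = "http://www.domainname.com/page.php?mode=view&u=123"
static_url = "http://www.domain.com/page.html"

print(is_dynamic(dynamic_url))  # True
print(is_dynamic(static_url))   # False

# The parameters the server would use to build the page:
print(parse_qs(urlparse(dynamic_url).query))  # {'mode': ['view'], 'u': ['123']}
```

Note that this is only a heuristic: some dynamic pages hide their parameters behind rewritten, static-looking paths, which is exactly the technique webmasters use to get dynamic content indexed.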

Thursday, August 4, 2011

Robots.txt

The Robot Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention to prevent cooperating web crawlers and other web robots from accessing all or part of a website which is otherwise publicly viewable.

Robots are often used by search engines to categorize and archive web sites, or by webmasters to proofread source code. The standard is different from, but can be used in conjunction with, Sitemaps, a robot inclusion standard for websites.

Robots.txt is a plain text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but well-behaved search engines generally obey what they are asked not to do.

The format is simple enough for most intents and purposes: a User-agent line identifies the crawler in question, followed by one or more Disallow: lines that bar it from crawling certain parts of your site.

The basic structure of a robots.txt (this example tells all robots to stay out of the entire site):
User-agent: *
Disallow: /

Google Blogger has introduced a robots.txt file in every Blogger blog.
To check the robots.txt file of your Blogger blog, just type the following URL in the address bar of your browser.

http://www.yourblogname.blogspot.com/robots.txt
(replace yourblogname with your blog name).
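To see how a crawler interprets these rules, here is a small sketch using Python's standard urllib.robotparser module. The robots.txt content and the example.com URLs are hypothetical, and the rules are parsed from a string rather than fetched, so the snippet runs offline:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, in the User-agent / Disallow format shown above.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Any robot ("*") may fetch the home page, but not the /private/ area.
print(rp.can_fetch("*", "http://example.com/index.html"))      # True
print(rp.can_fetch("*", "http://example.com/private/a.html"))  # False
```

In a real crawler you would point the parser at the live file with set_url() and read() instead of parsing a string.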

Keyword Stuffing

Keywords are a critical element in any search engine optimization (SEO) campaign. One of the first signs of a novice SEO campaign is a Web page that is littered with keywords.

Keyword stuffing means misleading search engines by overloading a web page's content with long lists of the keywords you wish to rank for. It includes the repeated use of a keyword in an attempt to inflate the page's relevance.

Wednesday, August 3, 2011

What is Nav Bar?

A nav bar, or navigation bar, is a sub-region of a web page that contains hypertext links for navigating between the pages of a website. In Blogger it is available in several colors and is configured in the Template tab of Blogger's interface. If you are using a classic template, you'll see a menu from which you can select a color.

For blogs using Layouts, just click the "edit" link on the Navbar page element. Easy-to-use navigation is important for any website. Many bloggers want to hide the navbar because it makes their blog layout look cleaner and more professional without a box (the navbar) above it. The navbar is also known as a links bar or link bar.

Tuesday, August 2, 2011

What is favicon?

A favicon, or favorites icon, also known as a shortcut icon, website icon, URL icon, or bookmark icon, is a file containing one or more small icons, most commonly a 16×16 pixel image, associated with a particular website or web page. It is shown in the browser's address bar when the page loads.

Browsers also display it next to your site's entry in bookmarks and on browser tabs.
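As a sketch, a favicon is usually declared in the page's head with a link tag; the file name and path below are just an example, and many browsers also fall back to looking for /favicon.ico in the site root:

```html
<link rel="icon" type="image/x-icon" href="/favicon.ico">
```

On Blogger you normally set this through the template rather than editing the HTML directly.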

Monday, August 1, 2011

Keyword Analysis

Keyword Analysis is at the heart of every successful search engine optimization (SEO) campaign. The quality and relevancy of keywords are analyzed before they are adopted for website promotion. Keyword analysis is the study of the words people use to find information on the Internet.

Keyword analysis helps you raise conversions and find new markets, but it can be time-consuming. It is the first step in website promotion, and it can make the difference between success and failure for your website. It is about which keywords you should use on your web pages to gain visibility on search engines.
