Thursday, September 8, 2011

Google Pandalized

Google Pandalized is the name given to Google's new strategy for detecting spam and making Google Search more accurate and junk-free. It is an update that checks for spam sites and demotes them in order to make the search results more and more efficient.
Sites that are pandalized, that is, judged spam by the Google Panda update, are given lower priority by search engines, although they are not completely removed from Google. To find out whether your website is pandalized or not, check for the symptoms of pandalized sites. If you are unfortunately listed as spam, you can still recover your website in the next update, for which you should know how to escape Google Panda. Being pandalized is very costly, since you must bear a heavy loss of traffic, but there is one piece of good news: you can get depandalized as well.

Friday, September 2, 2011

Google Wonder Wheel

Google introduced one of the most effective tools for improving your SERP results: Google Wonder Wheel. It is basically used to find relevant search topics for your query, presented in a graphical layout.
    * Google Wonder Wheel is considered fun, easy, and worth a look.
    * Google Wonder Wheel determines how Google classifies keywords and what keywords will be shown under those classifications.
    * Google Wonder Wheel is also a great way to generate keywords for your content and website navigation.
    * Google Wonder Wheel is useful when you are working to achieve site links for your Web site.
    * Google Wonder Wheel provides a better way to develop site content by allowing Google to make suggestions of appropriate content for a given set of keywords.
    * The major purpose of the Google Wonder Wheel is to look for relevant search terms within a given niche.
It’s a brand new feature Google added that allows people to visually see search phrases related to what they originally searched for, arranged in a wheel-like graph.

Thursday, September 1, 2011

Search Engine Keyword Ranking Reporting Features

 
SEO Basics

  
    * Visitor loyalty ("stickiness") report
    * Help text embedded, with the calculation methodology explained
    * All individual items in reports graphable over time
    * E-commerce reporting - analyze your shopping cart vs. standard web traffic*
    * Revenue source reporting - geographical data on purchases*
    * New downloads report shows all files downloaded from your site (not available via ASP solutions)
    * New drilldown reports show information succinctly yet in complete detail
    * Search Engine Marketing / SEO - page query terms report automatically shows results of cost-per-click campaigns, internal searches, and more
    * Search term report shows actual keywords typed into search engines
    * New IP Address and IP Drilldown reports
    * Intranet IP analysis
    * New Robots and Spiders reports (not available via ASP solutions)
    * New Client Parameters reports - screen colors, resolution, time zone offset, Java/JavaScript versions, etc.
    * Automatic exclusion of "bot" traffic from Visitors reports
    * Click-path analysis improved (click to/from report)
    * Improved flexibility in usernames reporting

Monday, August 29, 2011

SEO Initial Steps

1. The very first step is to analyze thoroughly the site you are going to start working on. Find out its positive and negative points, and try to identify the actual theme of the site.

2. Once you have analyzed your site thoroughly, find your site's competitors on the web and their current search engine positioning, and analyze their strategy by working a little on the key phrases they are using to establish their presence.

3. Once you have an idea of your competitors' strategy from their key phrases, page layout, and other such basics, it is time to suggest keywords for your or your client's site.

4. After you or your client has shortlisted or approved the suggested keywords, immediately start analyzing them and find the keywords that have less competition on the web. Promoting your website for a certain keyword or for multiple keywords is actually your major task: the more time you spend analyzing keywords, the better the rankings you will achieve.

5. Now it is time to implement your strategy on your website. Start writing or modifying the current titles, descriptions, alt tags, and keywords. Use your targeted keywords in the titles and descriptions especially, because these are the elements search engines have relied on for years; but make sure you don't simply repeat keywords. Try to summarize your web page's theme with the few most important keywords for which you actually want your site to attain a higher search engine placement. If needed, you can suggest SEO-friendly URLs based on those keywords (see the first sketch after this list).

6. Try to manage the density of the keywords for which you want your site to attain a higher search engine placement: keep it under 3.0%, but also make sure it stays above 2.25%.

7. Validate your site's pages against W3C standards using the online tools available for this purpose, and also try to remove any broken links on your site's pages.

8. Try to submit references to your site in useful discussion forums, blogs, and communities, and even in Google and Yahoo groups related to your site's theme.

9. Submit your robots.txt, RSS feed, and sitemap to Google and Yahoo. The sitemap is in XML format and makes it easier for search engines to crawl your website in a proper manner (see the second sketch after this list).

10. Use the Google, Yahoo, and MSN webmaster tools to improve your site's visibility in search results. They provide an easy way to make your site more SEO-friendly: they can show you how the search engine views your site, help you diagnose problems, and let you share information with the search engine to improve your site's visibility in its results. They are very simple to set up!

11. Start working on link building for your site. Focus on similar sites for this purpose, and try to get links from pages that have a better search engine placement and a higher Google PageRank. Get one-way links as much as possible; you can even use a neutral site or directory for this, using it to give back links to those who demand links from your site during the link building process.

12. Use SEO tools like WebPosition or SEO Elite to find out your site's rankings. That is in fact the first step, and one you then have to repeat regularly, normally on a weekly basis, while also monitoring the daily traffic trends.
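
Two short sketches for steps 5 and 9 above; every domain name, keyword, and date in them is an invented placeholder.

For step 5, an optimized page head might look like this:

    <head>
      <title>Blue Widgets | Affordable Blue Widgets from Example Co.</title>
      <meta name="description" content="Example Co. offers affordable, durable blue widgets with free shipping.">
      <meta name="keywords" content="blue widgets, affordable blue widgets">
    </head>

For step 9, a minimal XML sitemap following the sitemaps.org protocol has this shape:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2011-08-29</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
    </urlset>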

Resources: other site

Sunday, August 28, 2011

Difference Between Google AdWords and AdSense

The Google AdWords program enables you to create advertisements which will appear on relevant Google search results pages and partner sites.
Google AdWords is a program you can use to get listed in the search results of Google and of the other search engines, such as AOL, to which it supplies advertisements. You essentially create an account, insert your advertisement, pick the maximum amount you are willing to pay per click on your ad, and then submit your credit card. Your advertisement will go live once the credit card is approved. Every time someone clicks on your ad, Google bills your account. Every couple of weeks, or when your bill reaches a certain amount, Google actually charges your credit card. AdWords is a form of pay-per-click advertising.

The Google AdSense program differs in that it delivers Google AdWords ads to individuals' websites. Google then pays web publishers for the ads displayed on their site based on user clicks on ads or on ad impressions, depending on the type of ad.
Google AdSense is essentially the other side of the coin. There are non-search-engine sites on the web that get a lot of traffic and that Google trusts. This site is one, but it is hardly an exclusive club. With Google AdSense, these sites can set up accounts with Google and display advertisements from Google AdWords on their pages.

You pay Google to place AdWords ads on Google's system, but Google pays you to put AdSense ads on your site.

Friday, August 26, 2011

Business Reviews

In today's society, encouraging your clients to leave reviews for your business or brand wherever possible online is the equivalent of enjoying old-fashioned, direct, one-to-one word-of-mouth testimonials! In fact, negative online company reviews can have as dramatic an impact as a bad report with the Better Business Bureau. On the other side of the topic, a positive business review may encourage others to do business with your organization. Success or failure, be ready to respond accordingly and fix any problems that may crop up. Even a poor review isn't necessarily a bad thing, even if it is left anonymously by a competitor, which indeed can happen! It is all in how the business handles and responds to the issue.

Three basic principles to follow:
1. Don't fake reviews. If you do review your own company, offer a bias disclaimer!

2. Ask your satisfied customers to give you a review.

3. Expect a negative review at some point and be prepared to act toward resolution.

Thursday, August 25, 2011

Forum Postings

The Fascinating Dimension about Forum Posting


 Discussion boards, or forums, are online networking communities. The fascinating dimension of forum posting is that you can aim at a segment of users that matches the demographic profile you are searching for. Taking part in forum posting builds a reputation for your company by telling its members about your capability, and makes a positive impression on them with your competence.

 A forum posting service can be classified as a service wherein the service provider makes use of posters to produce forum postings that link to your business website.

 It additionally provides you with direct one-way incoming links to your web site, improves your search engine ranking through cheap, high-quality forum postings, and helps multiply your conversion percentages. Search engines also presuppose that you are active in the networking community, and they consistently index your portal.

 The complete process starts with the creation of a forum profile and continues with posting on different forums on your behalf, driving a high-traffic link building process. Throughout, highly qualified content writers post rich, interesting, unique, and result-oriented content on the forums to attract a huge number of hits for the business concerned. This simultaneously increases the amount of traffic to the website. In addition, highly skilled posters are used to create new threads and post replies that make use of your web site link.

 You must do some analysis before joining any forum. You should join forums that are relevant to the sites you want to get backlinks for, and only choose forums that are popular and active. Backlinks from high-authority forums are very valuable. The number of active members and the Google PageRank of the forum can give you a good idea of its popularity. You should keep your signature short and link it to your main website. You should never create posts that sound like advertising or that are irrelevant to the post topic. Pay attention to the TOS of the forum, or you risk getting banned and losing all the backlinks you built.

Monday, August 22, 2011

Canonicalization

Canonicalization is a demanding concept to grasp, yet it is important to building an optimized website.
URL canonicalization deals with content that has multiple possible URLs. Having numerous web addresses for the same content can cause issues for search engines, particularly in determining which URL should be shown in SERPs.
Search engines are able to do things like keeping or removing trailing slashes, converting URLs with capital letters to lowercase, or removing the session IDs appended by bulletin board or other software.
To get more control over how your web addresses appear in search results, and to consolidate properties such as link popularity, we suggest that you pick one canonical (preferred) URL as the preferred version of the page. You can indicate your preference to search engines in several ways.
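
One common way to indicate the canonical URL, supported by the major search engines since 2009, is a link element placed in the head of every duplicate page (the URL below is illustrative):

    <link rel="canonical" href="http://www.example.com/page.html" />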

Sunday, August 21, 2011

DAO – Digital Asset Optimization

Digital Asset Optimization (DAO) is the natural evolution of search engine optimization. SEO focuses on the text files of a website, relying upon the key phrases within that text and its tags. The web is no longer a text-only environment, and search engines like Google are evolving with it.

DAO, or Digital Asset Optimization, is an enormous part of off-page SEO. DAO means optimizing valuable assets for search engines and for users.

In essence, every digital asset you own needs to be optimized for search engines and for users of the website's search function.

That includes articles, press releases, anchor text, blog posts, microsites, photos, and even the profiles you create on social networking sites or forums. Before starting with DAO, you must carry out keyword research.

Digital Asset Optimization can be applied to various file types, such as RSS, MS Office, Flash, PDF, and many others. Since DAO extends SEO, companies need to refocus their optimization techniques on all available media. Anything that can be indexed and listed can be optimized. Digital Asset Optimization may bring in visitors who were not actually searching for your business services, but instead found your website through, for example, a picture of a keyboard used as a graphic.

Latent Semantic Indexing

Latent Semantic Indexing (LSI) is an indexing and retrieval method that uses a mathematical technique called Singular value decomposition (SVD) to identify patterns in the relationships between the terms and concepts contained in an unstructured collection of text. LSI is based on the principle that words that are used in the same contexts tend to have similar meanings. A key feature of LSI is its ability to extract the conceptual content of a body of text by establishing associations between those terms that occur in similar contexts.
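
In standard SVD notation (a general sketch, not tied to any particular implementation), the term-document matrix X is factored and then truncated to its k largest singular values:

    X = U Σ V^T
    X_k = U_k Σ_k V_k^T

The truncated X_k keeps only the dominant term-concept associations, and it is against this reduced space that LSI compares queries and documents.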

LSI has been used in several ways. The most obvious and common way is to analyze the similarity between bodies of text. This can be used in dozens of interesting ways, from finding related documents in a group to doing paragraph-wise LSI to find site summaries. It can also be used to facilitate a "smart" search of your document space, and even do document categorization (read: SPAM filtering!)

Latent Semantic Indexing (LSI) is a concept that Google has begun to employ and pioneer. It was originally used in Google’s AdSense program as a way of seeing which adverts would be the most relevant on a particular site. Google bought a company called Applied Semantics in an effort to use LSI concepts and ideas in its search rankings, and many other search engines are beginning to follow suit.

Tuesday, August 16, 2011

Hidden Text

Hidden text is one of the challenges faced by webmasters and search engines. Spammers continue to stuff keywords and keyword phrases into hidden text to increase their keyword density and improve their rankings. Search engines seek to figure out when spammers are doing this, and then take appropriate action.

One of the more common tricks used is hidden text and search engines have been known to penalize sites who adopt this technique as part of their search engine optimization campaign.

Hidden text is not visible to the human eye, yet is still readable by search engine spiders.

They attempt to "hide" the text using any of the following methods:
    • Using a font color identical to the color of the background
    • Using a font color very similar to the color of the background
    • Hiding the text behind an image
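
A purely illustrative example of the first trick, white text on a white background (the keywords are invented), would be:

    <body bgcolor="#FFFFFF">
      <p style="color: #FFFFFF">cheap widgets best widgets buy widgets</p>
    </body>

Search engines can detect simple cases like this, which is why sites using them risk penalties.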

Friday, August 12, 2011

.htaccess Files

A .htaccess (hypertext access) file is a directory-level configuration file, supported by several web servers, that allows for decentralized management of web server configuration.

 The .htaccess files(or "distributed configuration files") can override many other configuration settings including content type and character set, CGI handlers, etc.

 These files are placed inside the web tree, and are able to override a subset of the server's global configuration for that directory, and all sub-directories.

 .htaccess is only a file, but it can change settings on the server and allow you to do many different things, the most popular being custom 404 error pages.

.htaccess isn't difficult to use and is really just made up of a few simple instructions in a text file.

 Things .htaccess can do include: password-protecting folders, redirecting users automatically, serving custom error pages, changing your file extensions, banning users with certain IP addresses, allowing only users with certain IP addresses, stopping directory listings, and using a different file as the index file.
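
As a sketch of a few of these instructions on an Apache server (the file names and the IP address are placeholders):

    # Serve a custom 404 error page
    ErrorDocument 404 /notfound.html

    # Permanently redirect an old page to a new one
    Redirect 301 /old-page.html http://www.example.com/new-page.html

    # Ban one IP address (Apache 2.2 syntax)
    order allow,deny
    deny from 192.0.2.1
    allow from all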

Thursday, August 11, 2011

Photo Sharing

Photo sharing, the most recent internet-based activity, with lots of resources for promoting business websites that deal in products or services, is all the rage among SEO personnel. Not only website content but also photos are indexed by search engine bots and shown in search result pages.

If you are running a business, then it is a great place to publicize your products and services. Photo sharing, an online trend for getting noticed by the millions of people surfing the internet in quest of websites, products, or services, is the latest addition to our kit of customized SEO services for website promotion. It drives lots of traffic to websites. Create an account, create albums by category, and upload the photos you have. You should give all the photo details, including title, keywords, and description with links, as part of image optimization. This will help your product appear in Google for a specific keyword.

Image optimization with potential and popular keywords is a good SEO practice for website promotion. Images with easily crawlable URLs are highly indexable by search engine robots, and alphabetical URLs are more search engine friendly than numerical ones: visitors use words, not figures, as search phrases to locate the pictures of the products, services, or brands that they are looking for.
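
As an illustration (the file name, alt text, and keywords are invented), an image optimized along these lines might be referenced as:

    <img src="/images/red-sports-car.jpg" alt="Red sports car parked by the beach">

rather than as something like /images/IMG00923.jpg with no alt text at all.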

Searching for the pictures with optimized URLs is faster and easier in several different browsers.

Wednesday, August 10, 2011

Link Relevance

Link relevance is entirely dependent on the keywords that are used on the linking pages. So let's assume a random 3rd party page links to your intended landing page. If you were trying to optimize for Keyword A, then that link would have high link relevance if Keyword A appeared throughout the linking page. The most relevant link possible for Keyword A would be from a page that has been completely optimized for Keyword A. In other words, Keyword A is utilized in that page's Title tag, Meta Description, H1 tag, and a handful of times in the body content.

If you have begun working on search engine optimization efforts for your website, you already know that search providers evaluate the relevance of a site largely on the links that go into and out of a website.

The rank of sites that link to your page and the anchor text they use to link to you can make a big difference in how highly your site is ranked.

The anchor text of the link that sends visitors to your site is very important to your search engine rank, but the link relevance of outbound links can affect your site as well.
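
For instance (the URL and keyword are invented), descriptive anchor text passes far more relevance than generic text:

    <a href="http://www.example.com/blue-widgets.html">blue widgets</a>

tells search engines what the target page is about, whereas anchor text such as "click here" tells them nothing.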

Assuming the link relevance of two links is equal, the link with the highest SEO value will be the link from the page with more importance (i.e. higher PR). However, if you had to choose between a link with high relevance or high importance, then you should almost always opt for the link with more relevance.

Sunday, August 7, 2011

Home page and index.html page

Every website is built inside directories on a Web server. And each Web page is a separate file on that Web server. But sometimes, when you go to a URL, there is no file listed in the URL.

When you go to a URL without a file named at the end, the server looks for a default file and displays that automatically.

There are three commonly used default page names that you can use on most Web servers:
    * index.html
    * index.htm
    * default.htm (on some Windows servers)

It's a good idea to stick with index.html or index.htm on most servers, as default.htm is most often used on Windows servers and isn't as common as it used to be.

You Should Have an index.html Page in All Your Directories

Whenever you have a directory on your website, you should have an index.html page. This allows your readers to see a page when they come to that directory without typing a file name in the URL. It also prevents them from seeing things you might not want them to see.

If you don't put an index.html file in a directory, most Web servers will display a listing of all the files in that directory. While in some situations you might want that, most of the time this is ugly at best and a security hole at worst. Writing a default Web page and naming it index.html helps solve those problems.

Your Homepage Should Be an index.html Page

When you start building your website, you should create your main page and name it index.html. This is true whether you're using a free hosting service or you have your own domain name. That way, when people come to your URL, they automatically get your main page.

All other pages will have names like "about.html" or "contact.html", but your home page file should be called "index.html".
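
On Apache servers, both the default-file lookup and the directory-listing behavior can be controlled from a .htaccess file. A sketch using standard directives (the order of file names is just an example):

    # Try these default pages, in order
    DirectoryIndex index.html index.htm default.htm

    # Disable automatic directory listings
    Options -Indexes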

Video Promotion

Video promotion means promoting your video on the web.

As almost everyone should have their videos on the web, the second action we need to take is promoting those videos. Uploading your video to YouTube or another video sharing website is not enough, because your video on YouTube is one among billions of videos there. After you upload the video to YouTube or any other website, you should promote it as much as you can. This is very important to your video's success.

Most people still feel that video making and video promotion is not an easy job, or is something that needs qualified help and expertise. In fact it is a simple task that does not demand technical knowledge: there is a surprisingly capable video-making tool available on almost every modern computer, and that tool is Windows Movie Maker.

Once you have made the movie for advertising or marketing, the only thing left to do is to upload it to the famous free video hosting websites such as YouTube, Metacafe, VideoSurf, Dailymotion, and many others. This type of promotion can play an extremely significant role in attracting potential viewers and customers.

Saturday, August 6, 2011

Google Sandbox

The Google Sandbox Effect is a theory used to explain why newly registered domains, or domains with frequent ownership changes, rank poorly in Google Search Engine Results Pages (SERPs). In other words, new websites are put into a "sandbox", or holding area, and have their search ratings on hold until they can prove themselves worthy of ranking.

Once Google deems a website to be of quality and importance, the website will be removed from the Sandbox and can potentially show up high in Google Search Engine Results Pages. Webmasters can do numerous things to improve their website's standing with Google, but time really is the key to getting out of the Sandbox. Sandbox believers say it can take anywhere from 6 months to a year, and sometimes longer, before Google will promote a website out of the Sandbox.

Because Google does not acknowledge the Sandbox, and because it has not been clearly proven, the Sandbox Effect is just a theory. Even so, the Sandbox is believed in by the majority of webmasters. The Sandbox is thought to have come about in 2004, when changes to Google's algorithm kept new websites away from the top of Google Search Engine Results Pages (SERPs).

It may seem that the Sandbox is unfair to newly launched websites, but Google created the Sandbox with good reasons. Google was trying to discourage spam sites from quickly reaching the top of Search Results, getting dropped off Google, creating a new site and repeatedly showing up high in Search Results. In the past, companies would put up a promotional website for a short period of time and once the promotion was over the website was gone. These promotional websites would still show up high in Google Search Engine Results even after they were removed, causing many broken links and unhappy Google searchers.

Even with the Sandbox Effect, it is still possible for newly launched websites to make it to the top of Google Search Engine Results Pages (SERPs). If Google deems a new website worthy, it can be seen in search results immediately, but it can still take up to 6 months for the website to rank to its fullest potential. There are many methods web designers use to try to avoid the Sandbox, some of which are discussed below. But because of its uncertainty, even if all the algorithm variables are followed, there is still no way to guarantee that a new website will not be put in the Sandbox.

Friday, August 5, 2011

Dynamic URLs vs. Static URLs

During earlier times, all websites used static HTML pages, and so the first search engines were oriented toward static web pages. As web technology developed, several new methods of generating websites appeared, and dynamically generated web pages came into being.

A dynamic URL is the address of a Web page with content that depends on variable parameters that are provided to the server that delivers it. The parameters may be already present in the URL itself or they may be the result of user input. A dynamic URL can often be recognized by the presence of certain characters or character strings that appear in the URL (visible in the address bar of your browser).

Example of a dynamic URL:
http://www.domainname.com/page.php?mode=view&u=123

Dynamic URLs are generated from specific queries to a site's database, resulting in different URLs for the same content. Static URLs behave differently: the URL or file name of the webpage remains the same until the webmaster makes a change in its HTML code.

A static URL is a URL that does not change over a period of time. A static URL will also not contain any URL parameters.

Static URLs look like this:
http://www.domain.com/page.html

Static URLs are typically ranked better in search engine results pages, and they are indexed more quickly than dynamic URLs, if dynamic URLs get indexed at all. Static URLs are also easier for the end-user to view and understand what the page is about. If a user sees a URL in a search engine query that matches the title and description, they are more likely to click on that URL than one that doesn't make sense to them.
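
A dynamically generated site can still present static-looking URLs by rewriting them at the server. A sketch for Apache's mod_rewrite, mapping a friendly URL onto the dynamic example above (the URL pattern is an assumption for illustration):

    RewriteEngine On
    # /view/123 is served internally by /page.php?mode=view&u=123
    RewriteRule ^view/([0-9]+)/?$ /page.php?mode=view&u=$1 [L]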

Thursday, August 4, 2011

Robots.txt

The Robot Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention to prevent cooperating web crawlers and other web robots from accessing all or part of a website which is otherwise publicly viewable.

Robots are often used by search engines to categorize and archive web sites, or by webmasters to proofread source code. The standard is different from, but can be used in conjunction with, Sitemaps, a robot inclusion standard for websites.

Robots.txt is a text (not html) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines but generally search engines obey what they are asked not to do.

The format is simple enough for most intents and purposes: a User-agent line to identify the crawler in question, followed by one or more Disallow: lines to keep it from crawling certain parts of your site.

The Basic structure of a robots.txt:
User-agent: *
Disallow: /
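
Note that this basic structure disallows the entire site for every crawler. A more typical file blocks only selected directories and can also point crawlers at the sitemap (the directory names here are placeholders):

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/
    Sitemap: http://www.example.com/sitemap.xml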

Google Blogger has introduced a robots.txt file on each Blogger blog.
To check the robots.txt file of your Blogger blog, just type the following URL into the address bar of your browser.

http://www.yourblogname.blogspot.com/robots.txt
(replace yourblogname with your blog name).

Keyword Stuffing

Keywords are a critical element in any search engine optimization (SEO) campaign. One of the first signs of a novice SEO campaign is a Web page that is littered with keywords.

Keyword stuffing means misleading search engines by overloading the web page content with long lists of the keywords you wish to rank for. It includes the repeated use of a keyword to inflate a page's relevance.

Wednesday, August 3, 2011

What is Nav Bar?

A nav bar, or navigation bar, is a sub-region of a web page that contains hypertext links for navigating between the pages of a website. It is available in several colors and is configured in the Template tab of Blogger's interface. If you are using a classic template, you'll see a menu from which you can select a color.

For blogs using Layouts, just click the "edit" link on the Navbar page element. Having easy-to-use navigation is important for any web site, but many bloggers want to hide their navbar because it makes the blog layout look better and more professional without a box above it. The navbar is also known as a links bar or link bar.
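
The usual way to hide it is a CSS rule added to the template's stylesheet. The selector below is the one commonly cited for Blogger templates of this era; treat it as an assumption to verify against your own template:

    #navbar-iframe { display: none !important; }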

Tuesday, August 2, 2011

What is favicon?

A favicon, or favorites icon, also known as a shortcut icon, website icon, URL icon, or bookmark icon, is a file containing one or more small icons, most commonly a 16×16 pixel image, associated with a particular website or webpage and shown in the address bar of a browser when the page loads.

This is used to decorate your site in the bookmarks and tab bar of browsers.
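
Browsers look for /favicon.ico at the site root by default, but you can also declare the icon explicitly in the page head (the path is a placeholder):

    <link rel="shortcut icon" href="/favicon.ico" type="image/x-icon">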

Monday, August 1, 2011

Keyword Analysis

Keyword analysis is at the heart of every successful search engine optimization (SEO) campaign. An analysis of the quality and relevancy of keywords is made prior to adopting them for website promotion. Keyword analysis is the study of the words that people use to find information on the Internet.

Keyword analysis helps you raise conversions and find new markets, but it can be time-consuming. It is the initial process of website promotion and can make the difference between success and failure for your website. It is about which keywords you should use on web pages to gain visibility on search engines.

Sunday, July 31, 2011

Google Plus

Google Plus, Google's latest social network, launched on Tuesday, June 28, 2011. It’s much like the Facebook News Feed, allowing users to share photos, videos, links, or their location with friends.

What are Keywords?

In a search engine, you type in one or more keywords to search for the information you need. Within seconds, you get a list of results. You can add keywords to narrow down your choices, or you can start opening page after page to find out whether it has the information you need.

Keywords are the words that identify what a page is about, and they are chiefly used by search engines. Keywords help search engines to categorize your site, and they allow people to find your pages more quickly.

Saturday, July 30, 2011

List of HTTP Response Codes

1xx Informational
Request received, continuing process.
This class of status code indicates a provisional response, consisting only of the Status-Line and optional headers, and is terminated by an empty line. Since HTTP/1.0 did not define any 1xx status codes, servers must not send a 1xx response to an HTTP/1.0 client except under experimental conditions.

100 Continue
    This means that the server has received the request headers, and that the client should proceed to send the request body (in the case of a request for which a body needs to be sent; for example, a POST request). If the request body is large, sending it to a server when a request has already been rejected based upon inappropriate headers is inefficient. To have a server check if the request could be accepted based on the request's headers alone, a client must send Expect: 100-continue as a header in its initial request and check if a 100 Continue status code is received in response before continuing (or receive 417 Expectation Failed and not continue).

101 Switching Protocols
    This means the requester has asked the server to switch protocols and the server is acknowledging that it will do so.

102 Processing (WebDAV) (RFC 2518)
    As a WebDAV request may contain many sub-requests involving file operations, it may take a long time to complete the request. This code indicates that the server has received and is processing the request, but no response is available yet. This prevents the client from timing out and assuming the request was lost.

122 Request-URI too long
    This is a non-standard IE7-only code which means the URI is longer than a maximum of 2083 characters.

2xx Success
This class of status codes indicates the action requested by the client was received, understood, accepted and processed successfully.

200 OK
Standard response for successful HTTP requests. The actual response will depend on the request method used. In a GET request, the response will contain an entity corresponding to the requested resource. In a POST request the response will contain an entity describing or containing the result of the action.

201 Created
Following a POST command, this indicates success, but the textual part of the response line indicates the URI by which the newly created document should be known.

202 Accepted
    The request has been accepted for processing, but the processing has not been completed. The request might or might not eventually be acted upon, as it might be disallowed when processing actually takes place.

203 Non-Authoritative Information
When received in the response to a GET command, this indicates that the returned meta-information is not the definitive set available from the origin server, but is gathered from a local or a third-party copy. This may include annotation information about the object, for example.

204 No Content
The server has received the request but there is no information to send back, and the client should stay in the same document view. This is mainly to allow input for scripts without changing the document at the same time.

205 Reset Content
    The server successfully processed the request, but is not returning any content. Unlike a 204 response, this response requires that the requester reset the document view.

206 Partial Content
   The server has fulfilled the partial GET request for the resource. The request MUST have included a Range header field indicating the desired range, and MAY have included an If-Range header field to make the request conditional.
If the 206 response is the result of an If-Range request that used a strong cache validator the response SHOULD NOT include other entity-headers. If the response is the result of an If-Range request that used a weak validator, the response MUST NOT include other entity-headers; this prevents inconsistencies between cached entity-bodies and updated headers. Otherwise, the response MUST include all of the entity-headers that would have been returned with a 200 (OK) response to the same request.
A cache MUST NOT combine a 206 response with other previously cached content if the ETag or Last-Modified headers do not match exactly. A cache that does not support the Range and Content-Range headers MUST NOT cache 206 (Partial) responses.


207 Multi-Status (WebDAV) (RFC 4918)
    The message body that follows is an XML message and can contain a number of separate response codes, depending on how many sub-requests were made.

226 IM Used (RFC 3229)
    The server has fulfilled a GET request for the resource, and the response is a representation of the result of one or more instance-manipulations applied to the current instance.

3xx Redirection
The client must take additional action to complete the request.
This class of status code indicates that further action needs to be taken by the user agent in order to fulfil the request. The action required may be carried out by the user agent without interaction with the user if and only if the method used in the second request is GET or HEAD. A user agent should not automatically redirect a request more than five times, since such redirections usually indicate an infinite loop.

300 Multiple Choices
    Indicates multiple options for the resource that the client may follow. It, for instance, could be used to present different format options for video, list files with different extensions, or word sense disambiguation.

301 Moved Permanently
    This and all future requests should be directed to the given URI.
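
On the wire, a 301 response carries the new address in a Location header. An illustrative exchange (the URL is invented):

    HTTP/1.1 301 Moved Permanently
    Location: http://www.example.com/new-page.html
    Content-Type: text/html

Well-behaved clients then repeat the request against the Location URL.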

302 Found
    This is an example of industrial practice contradicting the standard. HTTP/1.0

specification (RFC 1945) required the client to perform a temporary redirect (the original describing phrase was "Moved Temporarily"), but popular browsers implemented 302 with the functionality of a 303 See Other. Therefore, HTTP/1.1 added status codes 303 and 307 to distinguish between the two behaviors. However, the majority of Web applications and frameworks still [as of?] use the 302 status code as if it were the 303.

303 See Other (since HTTP/1.1)
    The response to the request can be found under another URI using a GET method. When received in response to a POST (or PUT/DELETE), it should be assumed that the server has received the data and the redirect should be issued with a separate GET message.

304 Not Modified
    Indicates the resource has not been modified since last requested. Typically, the HTTP client provides a header like the If-Modified-Since header to provide a time against which to compare. Using this saves bandwidth and reprocessing on both the server and client, as only the header data must be sent and received in comparison to the entirety of the page being re-processed by the server, then sent again using more bandwidth of the server and client.

305 Use Proxy (since HTTP/1.1)
    The requested resource is only available through the proxy given in the response. Many HTTP clients (such as Mozilla and Internet Explorer) do not correctly handle responses with this status code, primarily for security reasons.

306 Switch Proxy
    No longer used.

307 Temporary Redirect (since HTTP/1.1)
    In this occasion, the request should be repeated with another URI, but future requests can still use the original URI. In contrast to 303, the request method should not be changed when reissuing the original request. For instance, a POST request must be repeated using another POST request.

4xx Client Error
The 4xx class of status code is intended for cases in which the client seems to have erred. Except when responding to a HEAD request, the server should include an entity containing an explanation of the error situation, and whether it is a temporary or permanent condition. These status codes are applicable to any request method. User agents should display any included entity to the user. These are typically the most common error codes encountered while online.

400 Bad Request
The request had bad syntax or was inherently impossible to satisfy.

401 Unauthorized
The request requires user authentication. Similar to 403 Forbidden, the page you were trying to access can not be loaded until you first log on with a valid user ID and password. If you have just logged on and received the 401 Unauthorized error, it means that the credentials you entered were invalid for some reason.

402 Payment Required
    Reserved for future use. The original intention was that this code might be used as part of some form of digital cash or micro payment scheme, but that has not happened, and this code is not usually used. As an example of its use, however, Apple's MobileMe service generates a 402 error ("httpStatusCode:402" in the Mac OS X Console log) if the MobileMe account is delinquent.

403 Forbidden
    The request was a legal request, but the server is refusing to respond to it. Unlike a 401 Unauthorized response, authenticating will make no difference.

404 Not Found
    The requested resource could not be found but may be available again in the future. Subsequent requests by the client are permissible.

405 Method Not Allowed
    A request was made of a resource using a request method not supported by that resource; for example, using GET on a form which requires data to be presented via POST, or using PUT on a read-only resource.

406 Not Acceptable
    The requested resource is only capable of generating content not acceptable according to the Accept headers sent in the request.

407 Proxy Authentication Required
The Web server (running the Web site) thinks that the HTTP data stream sent from the client (e.g. your Web browser or our CheckUpDown robot) was correct, but access to the URL resource requires the prior use of a proxy server that needs some authentication which has not been provided. This typically means you must log in (enter user ID and password) with the proxy server first.
A 407 error detected via a Web browser can often be resolved by navigating to the URL in a slightly different way e.g. accessing another URL for the proxy server first. Your ISP should be able to explain the role of the proxy server in their security setup and how you should use it.


408 Request Timeout
    The server timed out while waiting for the request: the client did not produce a complete request within the time the server was prepared to wait. In practice this often happens when servers are slow or file sizes are large, or when a user hits the stop button, closes the browser, or clicks a link before the page loads.

409 Conflict
    Indicates that the request could not be processed because of conflict in the request, such as an edit conflict.

410 Gone
    Indicates that the resource requested is no longer available and will not be available again. This should be used when a resource has been intentionally removed and the resource should be purged. Upon receiving a 410 status code, the client should not request the resource again in the future. Clients such as search engines should remove the resource from their indices. Most use cases do not require clients and search engines to purge the resource, and a "404 Not Found" may be used instead.

411 Length Required
    The request did not specify the length of its content, which is required by the requested resource.

412 Precondition Failed
    The server does not meet one of the preconditions that the requester put on the request.

413 Request Entity Too Large
    The request is larger than the server is willing or able to process.

414 Request-URI Too Long
    The URI provided was too long for the server to process.

415 Unsupported Media Type
    The request entity has a media type which the server or resource does not support. For example, the client uploads an image as image/svg+xml, but the server requires that images use a different format.

416 Requested Range Not Satisfiable
    The client has asked for a portion of the file, but the server cannot supply that portion. For example, if the client asked for a part of the file that lies beyond the end of the file.

417 Expectation Failed
    The server cannot meet the requirements of the Expect request-header field.

418 I'm a teapot
    This code was defined in 1998 as one of the traditional IETF April Fools' jokes, in RFC 2324, Hyper Text Coffee Pot Control Protocol, and is not expected to be implemented by actual HTTP servers.

422 Unprocessable Entity (WebDAV) (RFC 4918)
    The request was well-formed but was unable to be followed due to semantic errors.

423 Locked (WebDAV) (RFC 4918)
    The resource that is being accessed is locked.

424 Failed Dependency (WebDAV) (RFC 4918)
    The request failed due to failure of a previous request (e.g. a PROPPATCH).

425 Unordered Collection (RFC 3648)
    Defined in drafts of "WebDAV Advanced Collections Protocol", but not present in "Web Distributed Authoring and Versioning (WebDAV) Ordered Collections Protocol".

426 Upgrade Required (RFC 2817)
    The client should switch to a different protocol such as TLS/1.0.

444 No Response
    An Nginx HTTP server extension. The server returns no information to the client and closes the connection (useful as a deterrent for malware).

449 Retry With

    A Microsoft extension. The request should be retried after performing the appropriate action.

450 Blocked by Windows Parental Controls
    A Microsoft extension. This error is given when Windows Parental Controls are turned on and are blocking access to the given webpage.

499 Client Closed Request
    An Nginx HTTP server extension. This code is introduced to log the case when the connection is closed by client while HTTP server is processing its request, making server unable to send the HTTP header back.

5xx Server Error
The server failed to fulfill an apparently valid request.

Response status codes beginning with the digit "5" indicate cases in which the server is aware that it has encountered an error or is otherwise incapable of performing the request. Except when responding to a HEAD request, the server should include an entity containing an explanation of the error situation, and indicate whether it is a temporary or permanent condition. Likewise, user agents should display any included entity to the user. These response codes are applicable to any request method.

500 Internal Server Error
    A generic error: the server could not retrieve the document because of server-configuration problems or another unexpected condition. Contact the site administrator.

501 Not Implemented
    Web server doesn't support a requested feature.

502 Bad Gateway
    The server, acting as a gateway or proxy, received an invalid response from the upstream server. Often a symptom of server congestion, too many connections, or high traffic; keep trying until the page loads.

503 Service Unavailable
    The server is currently unable to handle the request, usually because it is overloaded (too busy) or down for maintenance. This is generally a temporary state.

504 Gateway Timeout
    The server was acting as a gateway or proxy and did not receive a timely response from the upstream server.

505 HTTP Version Not Supported
    The server does not support the HTTP protocol version used in the request.

506 Variant Also Negotiates (RFC 2295)
    Transparent content negotiation for the request results in a circular reference.

507 Insufficient Storage (WebDAV) (RFC 4918)
    The server is unable to store the representation needed to complete the request, for example because a storage quota or size limit on the destination has been exceeded.


509 Bandwidth Limit Exceeded (Apache bw/limited extension)
    This status code, while used by many servers, is not specified in any RFCs.

510 Not Extended (RFC 2774)
    Further extensions to the request are required for the server to fulfill it.

What is Keyword Density?

Keyword density is an SEO term that refers to the percentage of times a keyword or phrase appears on a web page, relative to the total number of words on the page.
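
As a worked example of the calculation (the numbers are invented):

    keyword density (%) = (keyword occurrences / total words on the page) x 100

A keyword used 12 times on a 500-word page therefore has a density of 12 / 500 x 100 = 2.4%, inside the 2.25%-3.0% range suggested in the SEO Initial Steps post earlier in this blog.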

Friday, July 29, 2011

SEO Abbreviations

A
    Anchor
ABMs
    Automated Bid Managers
ACS
    Access Control Server
ACSS
    Aural Cascading Style Sheets
AJ
    Ask Jeeves (a search engine)
AOL
    America Online (a search engine)
API
    Application Programming Interface
ASCII
    American Standard Code for Information Interchange
ASP
    Microsoft Active Server Pages
    Application Service Provider
ASPX
    Microsoft Active Server Page Framework
ATW
    AlltheWeb (a search engine)
AV
    AltaVista (a search engine)
B2B
    Business to Business
BL
    Backlink
BO
    Backlinks Obsession
BOW
    Best of the Web
BSE
    Bulk Solicited Email
CAPTCHA
    Completely Automated Public Turing Tests to Tell Computers and Humans Apart
CGI-BIN
    Common Gateway Interface
CMS
    Content Management System
CPA
    Cost Per Acquisition
    Cost Per Action
CPC
    Cost Per Click
CPL
    Cost Per Lead
CPM
    Cost Per Thousand
CPS
    Cost Per Sale
CSS
    Cascading Style Sheets
CTR
    click thru rate
DNS
    Domain Name System
DOS
    Disk Operating System
EPC
    Earnings Per Click
EXE
    Executable
FAQ
    Frequently Asked Question
FTP
    File Transfer Protocol
G
    Google
GAP
    Google Advertising Professionals
GB - GIGABYTE
    1,000,000,000 Bytes = 1,000 Megabytes
GIF
    Graphics Interchange Format
GOOGOL
    10^100 = 1 followed by 100 zeros
GUI
    Graphical User Interface
HTTP
    Hypertext Transfer Protocol   
HTTPS
    HyperText Transfer Protocol Secure   
IBL
    Inbound Link   
ICRA
    Internet Content Rating Association
ILQ
    Inbound link Quality
IM
    Instant Messaging
In-House SEO
    Any SEO working from home in their underwear or similar attire.
IP
    Internet Protocol
ISAPI
    Internet Server Application Program Interface
ISP
    Internet Service Provider
IT
    Information Technology
IYP
    Internet Yellow Pages
JPG
    Three letter file extension for Joint Photographic Experts Group
JS
    Javascript (file.js)
JSP
    Java Server Pages (file.jsp)
KB - KILOBYTE
    1,024 Bytes   
KBD
    Keyboard: Text to be entered by the user.   
KDA
    Keyword Density Analyzer
KEI
    Keyword Effectiveness Index (Wordtracker)
KW
    Keyword
LAMP
    Linux, Apache, MySQL, PHP / Perl / Python
LAN
    Local Area Network   
LPO
    Landing Page Optimization
LSA
    Latent Semantic Analysis
LSI
    Latent Semantic Indexing
MB - MEGABYTE
    1,000,000 Bytes   
META
    Generic Metadata: Information about information.
Metadata
    Data about Data
MIME
    Multipurpose Internet Mail Extensions
MMC
    Microsoft Management Console
MSP
    Managed Service Provider
MSSQL
    Microsoft Sequel Server
MySQL
    Popular Open Source Database
OBL
    Outbound Link   
ODP
    Open Directory Project (a directory)
ORM
    Online Reputation Management
OS
    Operating System
PDF
    Portable Document Format
PNG
    Portable Network Graphics
PPC
    Pay Per Click
PPV
    Pay Per Visitor
PR
    Google PageRank
PR0
    PageRank Zero
PSD
    PhotoShop Document
QWERTY (Pronounced KWER'TEE)
    English Language Keyboard
    Top Left Row of Keys
RAM
    Random Access Memory
REP
    Robots Exclusion Protocol
ROM
    Read Only Memory
RSS
    Really Simple Syndication
    Rich Site Summary
RTF
    Rich Text Format (file.rtf)
SAMP
    Sample: Program output, scripts, etc.   
SCRIPT
    Script Statements: The SCRIPT element places a script within a document.   
SE
    Search Engine
SEM
    Search Engine Marketeer
    Search Engine Marketer
    Search Engine Marketing
SEMPO
    Search Engine Marketing Professional Organization
SEO
    Search Engine Optimization
    Search Engine Optimizer
    Search Engine Overengineering
    Semantic Emotion Optimization
    Semantic Engagement Optimization
    Semantically Enhanced Optimization
    Sewage Enforcement Officer
SEP
    Search Engine Placement
    Search Engine Positioning
    Search Engine Promotion   
SERPs
    Search Engine Results Pages
SEs
    Search Engines
SES
    Search Engine Strategies (a conference)
SEU
    Search Engine Usability
SGML
    Standard Generalized Markup Language
SIT
    Stuffit Archive (file.sit) Macintosh File Compression Format
SMM
    Social Media Marketing   
SMO
    Social Media Optimization   
SMP
    Social Media Profile
    Social Media Profiling
SMTP
    Simple Mail Transfer Protocol
SOAP
    Simple Object Access Protocol   
SOC
    Source Ordered Content
SPAM
    Sites Positioned Above Me
    SPAM Food Products from Hormel Foods Corporation   
SQL
    Structured Query Language
SSI
    Server Side Includes (file.shtml)
STATS
    Statistics
SWF
    Shockwave Flash (file.swf)
SYSOP
    System Operator
T&C
    Terms and Conditions   
TCP
    Transmission Control Protocol
TIF
    Three letter file extension for Tagged Image File Format (file.tif)   
TIFF
    Tagged Image File Format (file.tiff)   
TXT
    Text File (file.txt)
UGC
    User Generated Content
URI
    Uniform Resource Indicator   
URL
    Uniform Resource Locator   
URN
    Uniform Resource Name   
VBS
    Visual Basic Script (file.vbs)
VEO
    Visitor Engagement Optimization   
W3
    World Wide Web   
W3C
    World Wide Web Consortium
WAI
    Web Accessibility Initiative
WAIS
    Wide Area Information System
WAN
    Wide Area Network   
WAP
    Wireless Application Protocol
WCAG
    Web Content Accessibility Guidelines
WebDAV
    Web-based Distributed Authoring and Versioning   
WECA
    Wireless Ethernet Compatibility Alliance   
WiFi™
    Wireless Fidelity (aka 802.11 Wireless Networking)   
WIMP
    Windows, Icons, Menus and Pointing Devices
WINDOWS NT
    Windows New Technology
WML
    Wireless Markup Language (file.wml)
WWW
    World Wide Web   
WYSINWOG
    What You See Is Not What Others Get
WYSIWYG (Pronounced Wizzy'Wig)
    What You See Is What You Get - Acronym associated with various HTML editors such as; FrontPage and Dreamweaver which are classified as WYSIWYG editors.
XAML
    Extensible Application Markup Language (Microsoft Vista)
XHTML
    Extensible Hypertext Markup Language   
XML
    Extensible Markup Language (file.xml)   
XMLP
    XML Protocol   
XMP
    Adobe Extensible Metadata Platform
XSL
    Extensible Stylesheet Language (file.xsl)
XSLT
    XSL Transformations (a language for transforming XML)   
Y!
    Yahoo! (a directory)
YB - YOTTABYTE
    1,000,000,000,000,000,000,000,000 Bytes = 1,024 Zettabytes
ZB - ZETTABYTE
    1,000,000,000,000,000,000,000 Bytes = 1,024 Exabytes   
ZIP
    Zipped File (file.zip) Windows File Compression Format

Thursday, July 28, 2011

Who is Matt Cutts?

Matt Cutts joined Google as a software engineer in January 2000, at a time when the company had fewer than 100 employees. He is currently the head of Google's web-spam team.

Before Google, he worked on a Ph.D. in computer graphics at the University of North Carolina at Chapel Hill. He has an M.S. from UNC-Chapel Hill, and B.S. degrees in both mathematics and computer science from the University of Kentucky.

He wrote the first version of SafeSearch, Google's family filter, and works in Google's Search Quality group, specializing in search engine optimization issues.

He is well known in the SEO community for enforcing the Google Webmaster Guidelines and cracking down on link spam.

Matt Cutts also advises the public on how to get better website visibility in Google as well as webmaster issues in general, and is generally an outspoken and public face of Google.

Wednesday, July 27, 2011

Social Profile Creation

Profile creation is one of the main techniques for building high volumes of inbound links to your website by utilizing the assorted social platforms available on the web. The approach is to register and create multiple profiles on social platforms, forums, discussion blogs, community websites, and other available sources.

Individually these links are relatively low value, but the accumulated effect of multiple profiles builds link juice very quickly, and they frequently produce a very quick movement in the search engine rankings as a result.

Article Sites with PageRank

(Each domain below is followed by its Google PageRank at the time of writing.)

ezinearticles.com 6
goarticles.com 6
articlesbase.com 5
articlecity.com 5
isnare.com 5
articledashboard.com 5
articlesnatch.com 5
ideamarketers.com 4
articlealley.com 4
a1articles.com 4
articlesalley.com 4
sooperarticles.com 4
articlecube.com 4
upublish.info 4
articles.simplysearch4it.com 4
articlesphere.com 4
webarticles.com 4
templatekit.com 4
articlegeek.com 4
saching.com 4
pubarticles.com 4
articlebiz.com 4
articleclick.com 4
articlerich.com 4
amazines.com 4
ezau.com 4
itsallaboutlinks.com 4
1888articles.com 4
submit-article.net 4
basearticles.com 4
articleworld.net 3
top7business.com 3
sponsordirectory.com 3
anyarticles.com 3
articledepot.co.uk 3
authorconnection.com 3
valuablecontent.com 3
articlefeeder.com 3
dime-co.com 3
reprintarticles.com 3
articlejoe.com 3
searcharticles.net 3
articledestination.com 3
article-content-king.com 3
abcarticledirectory.com 3
allbestarticles.com 3
artipot.com 3
goodinfohome.com 3
article-diary.com 3
articlesland.com 3
article-treasure.com 3
nicearticles.net 3
articlecompilation.com 2
articlebliss.com 2
articleblotter.com 2
articlesland.info 2
articleshaven.com 2
articles411.com 2
articlepoint.com 2
article-hut.com 2
iarticlebeach.com 2
articleintelligence.com 2
jodee.biz 2
articlekarma.com 2
nichevolumes.com 2
articlesupport.com 2
article-mania.com 2
articlecell.com 2
articlefield.com 2
articlicious.com 2
articlesubmitedge.com 2
rysite.com 2
articlesubmit.com 2
articlesale.com 2
articlefree4all.com 2
latestarticles.net 2
newarticles.us 2
pagequest.co.uk 2
articlelinksdirectory.com 2
article-submission-directory.com 2
articlestreet.com 2
homehighlight.org 2
24by7articles.com 2
articles-expert.com 1
freearticlesnow.com 1
articlecache.com 1
articlesclip.com 1
chainarticles.com 1
smashuparticles.com 1
olivearticles.com 1
steeparticles.com 1
favouritearticles.com 1
articleshouse.com 1
deeparticles.com 1
freesubmitarticles.com 1
thinkarticle.com 1
manualarticlesubmission.com 1
articles.topaix.com 1
adish.info 1
article-dashboard.com 1
articlecastle.com 1
articlemark.org 1
articlerealm.com 1
articlesubmit2.com 1
helpingarticles.com 1
nextarticles.com 1
problembgone.com 1
sciencenewsarticles.org 1
smartyarty.org.uk 1
stupidarticles.com 1
articleslocation.com 1
backlinkarticle.com 1
earticlesonline.com 0
e-articles.info 0
articlesocial.com 0
addnn.com 0
1articleworld.com 0
socializearticles.com 0
articlesunit.com 0
articlecarrier.com 0
bytearticles.com 0
markarticles.com 0
tuneinarticles.com 0
submittothis.com 0
articleritz.com 0
articlemotron.com 0
qwesz.com 0
purearticle.com 0
totheplanet.info 0
allentrepreneurinfo.com 0
articlehearty.com 0
blogruv.com 0
economicnewsarticles.org 0
eearticles.com 0
ganazat.com 0
articlemajoris.com 0
articlegarden.com 0
articlequeue.com 0
futurearticle.com 0
frenzyarticle.com 0
just4articles.com 0
saveuparticles.com 0
article4social.com 0
articlelinear.com 0
enhancedarticles.com 0
articleadda.com 0
article-coop.com 0
publish-your-articles.com 0
scrubbylane.com 0
articledate.com 0
good-article.com 0
articlesnacks.com 0
