November 9, 2009

SEO Tool test started

We have started to test our new SEO Tool. Feature list:
  • Analysis of more than 50 parameters
  • Page search by characteristics (e.g. keywords, HTTP status code, etc.)
  • Keyword planning
  • Page recrawl to check changes
  • Page comparison
  • SERP tracker
We're planning to add new functionality to help with link building as well.

We're looking for individuals, webmasters and SEO professionals to test functionalities. If you are interested, send me an e-mail and I'll tell you how you can get a free account.

July 8, 2009

Wyolyzer SEO tool started

I am here again, and I am starting to post again. The reason behind this small break was the development of our new SERP tracker and website optimization status analyzer. The official name of this tool is Wyolyzer, which is in beta now, but we are continuously working to make it the best SEO tool. Two main features are available now:
  • SERP tracker: you can track your site's keyword positions in major search engines such as Google, Yahoo and Bing. Localization is handled by selecting a country. You can use any type of characters.
  • EWSc calculator: calculating the SEO status of a site is important for all domain owners. We estimate the state of the site by computing a value from the first-page data.
You can try it if you register on the WyoSEO site.
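For the curious, this is what position tracking boils down to conceptually. Here is a toy Python sketch with made-up result URLs standing in for a fetched results page (this is not Wyolyzer's actual code):

```python
def serp_position(results, domain):
    """Return the 1-based position of the first result from `domain`, or None."""
    for pos, url in enumerate(results, 1):
        if domain in url:
            return pos
    return None

# Made-up result URLs standing in for a results page fetched for one keyword.
results = [
    "http://example.org/seo-tips",
    "http://www.wyoseo.com/",
    "http://other.net/tools",
]
print(serp_position(results, "wyoseo.com"))  # 2
```

A real tracker repeats this per keyword, per search engine and per day, and stores the positions so you can see the trend.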

May 14, 2009

SEO – ACRank by Majestic SEO

ACRank (A Citation Rank) is a very simple rank value created by Majestic SEO, a link-crawler company. The value depends on the number of referring short domains. What is a short domain? A short domain is a second-level domain, namely the shortest form of a domain. There are of course long domains as well: a long domain is a short domain with subdomains. OK, let's see an example: my blog's short domain is blogspot.com and its long domain is searchack.blogspot.com. (In the case of www, the short domain is e.g. wyoseo.com and the long domain is www.wyoseo.com.)

Back to ACRank. ACRank shows how important a particular page is by assigning an integer value from 0 (lowest) to 15 (highest) depending on the number of unique referring external short domains. The scale is almost logarithmic (to the base 2), but not exactly. See this table:

[Table: ACRank values by number of unique referring external short domains]

Notes:
  • You may notice that 1 domain is specified for both rank 1 and rank 2 - this is because Rank 1 will be given only if there is a single external backlink, whereas Rank 2 will be given if there is more than one external link but they all come from a single external domain. This is done to split off some of the more valuable Rank 1 pages that would otherwise be mixed up together. (Source)
  • The ACRank value does not depend on the ACRank values of the pages that link to the page, which means that the weight of backlinks is not taken into account.

ACRank was created by Majestic SEO to measure the value of websites. On Majestic SEO's page you can see how many sites with ACRank 1..15 refer to your website.
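Just to make the scale concrete, here is a minimal Python sketch of an ACRank-style value. It assumes a pure base-2 logarithmic scale, which the real table only approximates, and handles the rank 1/rank 2 special case from the notes above; it is not Majestic SEO's actual formula:

```python
import math

def acrank_like(unique_domains, single_link=False):
    """Toy ACRank-style value: integer 0..15 from unique referring domains.

    Assumes a pure base-2 logarithmic scale; the real table only approximates it.
    """
    if unique_domains == 0:
        return 0
    if unique_domains == 1:
        # Rank 1: one single external backlink; rank 2: several links, one domain.
        return 1 if single_link else 2
    return min(15, 2 + int(math.log2(unique_domains)))

print(acrank_like(1, single_link=True))  # 1
print(acrank_like(1))                    # 2
print(acrank_like(1000))                 # 11
```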


May 11, 2009

SEO - Alexa Rank

What is Alexa? The name comes from the Library of Alexandria, the largest repository of knowledge in the ancient world. Alexa Internet, Inc. is a California-based subsidiary of Amazon.com. It was founded in 1996 with the vision of improving internet search by tracking user decisions and using that data to aid future searches. In 1999, Alexa was acquired by Amazon.com and the company moved away from its original vision of providing an 'intelligent' search engine.

Alexa offers a toolbar and ranks sites based on tracking information from its users. Web users who install the Alexa Toolbar provide Alexa with statistical samples of where they surf, and from these samples Alexa computes the Alexa rank. Of course, it is not publicly known exactly how the computation works. But one thing is sure: the smaller, the better. A new site with about one visitor per week starts around the value 4,000,000, and the most visited site in the world has the value 1. That is Google (for now)… Yahoo holds traffic rank 2. Typically, a site in the top 100,000 enjoys quite heavy traffic.

What Alexa says about the ranking method:

A site's ranking is based on a combined measure of reach and pageviews. Reach is determined by the number of unique Alexa users who visit a site on a given day. Pageviews are the total number of Alexa user URL requests for a site. However, multiple requests for the same URL on the same day by the same user are counted as a single pageview. The site with the highest combination of users and pageviews is ranked #1.

If a site is identified as a personal home page or blog, its traffic ranking will have an asterisk (*) next to it. Personal pages are ranked on the same scale as a regular domain.
For more details you can check Alexa’s help page.
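To make reach and pageviews concrete, here is a toy Python sketch over a made-up one-day toolbar log. Since Alexa's real way of combining the two measures is not public, the geometric-mean score below is purely an assumption:

```python
from collections import defaultdict

# Toy one-day toolbar log: (user, site, url) events. All names are made up.
events = [
    ("u1", "example.com", "/"),
    ("u1", "example.com", "/"),        # same user, same URL: counts once
    ("u1", "example.com", "/about"),
    ("u2", "example.com", "/"),
    ("u2", "othersite.com", "/"),
]

reach = defaultdict(set)       # site -> unique users that day
pageviews = defaultdict(set)   # site -> unique (user, URL) pairs that day

for user, site, url in events:
    reach[site].add(user)
    pageviews[site].add((user, url))

# Combine reach and pageviews; Alexa's actual combination is not public,
# so a geometric mean is used here purely as an assumption.
scores = {s: (len(reach[s]) * len(pageviews[s])) ** 0.5 for s in reach}
for rank, site in enumerate(sorted(scores, key=scores.get, reverse=True), 1):
    print(rank, site, round(scores[site], 2))
```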

Some issues:
  • Alexa's traffic rankings are for top level domains only.
  • Alexa ranking is heavily skewed towards websites that have a large webmaster/tech audience. This is because webmasters and web-savvy users are much more likely to have the Alexa Toolbar installed than visitors who are unaware of Alexa.
  • It is easy to manipulate. Take a look: 20 quick ways to increase Alexa rank

If you know more details about Alexa's ranking algorithm, please share them with me!

May 8, 2009

SEO - Web site ranking: PageRank

Measuring your website's SEO quality is essential. But what kinds of tools and rankings are available? First, let's look at Google's PR:

PageRank was developed at Stanford University by Larry Page (hence the name PageRank) and Sergey Brin. The key concept of PageRank is that a web page is generally more important if many other web pages link to it. A page shares its PR with the pages it links to.

PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
where
  • PR(A) is the PageRank of page A,
  • PR(Ti) is the PageRank of pages Ti which link to page A,
  • C(Ti) is the number of outbound links on page Ti and
  • d is a damping factor which can be set between 0 and 1.
What is a damping factor?
The PageRank theory holds that even an imaginary surfer who is randomly clicking on links will eventually stop clicking. The probability, at any step, that the person will continue is the damping factor d. The damping factor is generally assumed to be set around 0.85.


The general, normalized form of the formula is:

PR(pi) = (1-d)/N + d * ( PR(pj)/L(pj) summed over all pj in M(pi) )

where p1, p2, ..., pN are the pages under consideration, M(pi) is the set of pages that link to pi, L(pj) is the number of outbound links on page pj, and N is the total number of pages.
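To see the iteration in action, here is a minimal Python sketch of this normalized formula on a tiny made-up link graph (three hypothetical pages, not real data):

```python
def pagerank(links, d=0.85, iterations=50):
    """Iterate PR(pi) = (1-d)/N + d * sum(PR(pj)/L(pj) for pj in M(pi))."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}  # uniform starting distribution
    for _ in range(iterations):
        pr = {
            p: (1 - d) / n
               + d * sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            for p in pages
        }
    return pr

# Tiny made-up graph: page -> set of pages it links to.
graph = {"A": {"B", "C"}, "B": {"C"}, "C": {"A"}}
print(pagerank(graph))  # C gathers rank from both A and B, then passes it on to A
```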


May 6, 2009

SEO - Getting Website Properties

How can we get the properties of our website? I'll show you some examples.

Getting the URL: I think it is easy, because everybody knows the name of his/her website :) If not, which site would you like to optimize? :)

Getting the IP address: it is a little bit tricky, but everyone can find out the IP address of a website. You can do it in your browser or in your operating system. The browser-based solution is the following: type into Google (I mean into a search engine :) the phrase “website IP address” and click on the first result. It is selfseo.com for me. Put your domain name into the search box, and you will get your reward!
You can use your operating system's built-in tools as well: open a command line (Windows) or a terminal (Linux, Mac) and type “ping www.wyoseo.com”, and you can catch the IP address of the site.
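If you prefer doing this programmatically, the same DNS lookup that ping performs first can be done in one line of Python:

```python
import socket

# Resolve a hostname to its IPv4 address - the same DNS lookup ping does first.
print(socket.gethostbyname("www.wyoseo.com"))  # e.g. 62.77.128.10
```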

If you want to see more information about a website, you will have to use a whois service. On the web, just search for the word “whois” and click on the first result. Type in the address and voilà :) Under Linux (Unix) you can use the whois command as well.
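The same whois command can be driven from a script; a minimal Python sketch (this assumes the whois client is installed on your machine):

```python
import subprocess

# Call the system whois client (assumes whois is installed on your machine).
result = subprocess.run(["whois", "wyoseo.com"], capture_output=True, text=True)
print(result.stdout)
```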

May 5, 2009

SEO: the most important parameters

Today I start writing about SEO. I have a list of the most important factors that matter for search engines, and thus for SEO. The core question is: what kind of information do we have about a website? What does a search engine robot see on our website?

Ok, here is my short list:
  1. Website parameters
  2. Website content
  3. Backlinks (links that refer to our site)

And these are the roots of all evil :) Let’s take a deeper look!

Website parameters (‘physical’):
  • website URL: e.g.: www.wyoseo.com
  • website IP: e.g.: 62.77.128.10
  • website age: 0y 16d (ehh :)

Website parameters (‘logical’): computed by different algorithms
  • PR, Alexa rank, etc.

Website content:
  • number of pages, page names
  • content of individual pages
  • keyword density (for a page, for the whole site; see the sketch after this list)
  • outgoing links
  • correlation (relationships among pages), e.g. content correlation
  • lexical analysis
  • content/code ratio
  • etc.
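Keyword density is the easiest of these to compute yourself. As a rough illustration, here is a minimal Python sketch that treats density as the keyword's share of all words on a page (real tools also handle phrases, stemming and markup stripping):

```python
import re

def keyword_density(text, keyword):
    """Keyword occurrences as a share of all words on the page."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return words.count(keyword.lower()) / len(words) if words else 0.0

page_text = "SEO tools help with SEO tasks and SEO reporting"
print(keyword_density(page_text, "seo"))  # 3 of 9 words -> 0.333...
```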

Backlinks:
  • from different domains
  • from different IPs
  • with different title and anchor text
  • from pages with different PR (or other values) and trustworthiness
  • linkfarms, black lists and paid links
  • one way links and link exchange sites
  • number of backlinks

Basically, that's all. I'd like to emphasize the word 'basically'. Why? Because almost all of these parameters have subsets of their own, and this is why search engines use roughly 100 (or more?) parameters to calculate SERPs.