Link metrics
Link Metrics is the collective term referring to the criteria for ranking search results. They reflect the importance of any given webpage on the world wide web, and include some measure of domain authority, relevancy and trust. It’s a way to measure websites not by how important their webmasters say they are, but by how authoritative, relevant and trusted other web users say they are.
Search engines like Google, Yahoo and Bing determine a page’s ranking based, among other factors, on an assessment of its inbound links. Each link from another website is considered a vote or endorsement from that website for your website. The more endorsements, the more authoritative, relevant and trusted your website must be, and thus the higher it appears on search engine results pages (SERPs).
PageRank
PageRank is an example of a link metric algorithm, invented by Google as a key marker of content quality. PageRank is typically a score from 1-10 for a page based on how many inbound links it has, and how strong those links are. The scale is logarithmic, meaning it’s roughly 10x harder to get from 2 to 3 than it is to get from 1 to 2.
Yahoo’s Trust Rank works in a similar way to combat spam, filtering webpages by the relative ranking of their backlinks. Commonly, search engines like these operate on the assumption that good sites tend to link to other good sites.
So how does this work? When a website links to your site, or when you link internally from one of your pages to another, the link passes ranking points. This is commonly called “link juice” or “link equity” transfer. It generally follows that the higher the PageRank, Trust Rank or similar score of a particular page, the more ‘link juice’ that page can pass to other pages through its links. Read more in our guide to link juice.
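As a rough illustration of the idea, here is a minimal Python sketch: a toy link graph in which each page’s score is built up from its inbound links and divided across its outbound links as ‘link juice’. The page names, damping factor and iteration count are illustrative assumptions, not Google’s actual implementation.

```python
# A minimal sketch of a PageRank-style score on a made-up four-page link graph.
# The graph, damping factor and iteration count are illustrative assumptions.

links = {
    "home": ["services", "blog"],
    "services": ["home"],
    "blog": ["home", "services"],
    "partner-site": ["home"],   # an external page endorsing "home"
}

damping = 0.85
pages = list(links)
rank = {page: 1.0 / len(pages) for page in pages}   # start with equal scores

for _ in range(20):  # iterate until the scores settle
    new_rank = {}
    for page in pages:
        # Each inbound link passes a share of the linking page's score:
        # its current score divided by its number of outgoing links ("link juice").
        inbound = sum(
            rank[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        new_rank[page] = (1 - damping) / len(pages) + damping * inbound
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Running the sketch, the page with the most (and strongest) inbound links ends up with the highest score, which is the behaviour the paragraph above describes.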
Your SEO agency will use tools to determine the strength of the page where a link to your site is placed. These include Moz’s MozRank and MozTrust and Majestic’s Citation Flow and Trust Flow, each with its own set of link metrics for determining authority, relevance and trust in a way that emulates the big engines. This enables your SEO agency to determine the best practices to follow.
Authority ranking factors
In much the same way as you would cite only verifiable sources in a news article or academic paper, online content should likewise only reference other content that is considered authoritative.
Perhaps the most important factor in link metrics is the authority of the site passing the link. Google endows the outgoing links from high-authority sites with much more ranking importance, or ‘link juice’, than it does the outgoing links from less authoritative sites. For example, a link from the BBC News website will pass on much more link juice than a link from your local greengrocer’s website. That’s because authority is largely a measure of popularity: how many links do other sites make to that website? How many mentions of the brand are there across the web? Pages with the most inbound links are considered the most authoritative.
At the same time, the sites with the most authority relative to your site are those that represent the same industry, niche or theme. This is because Google assumes that websites and businesses are more astute at spotting high-quality websites in their own field than they are at discerning high-quality websites in unrelated industries. Because of this, it considers endorsements (or links) within the same field to be authoritative links.
Google endows individual pages on the web with different metrics than it does the overall site to which they belong. For instance, a landing page or homepage will often have higher page metrics than a page that takes four clicks to reach from the home page. Still, when it comes to receiving links, it’s important to make page connections: search engines are placing more and more importance on the distribution of links throughout your website, rewarding websites that have links spread across their pages and penalising sites whose links are too heavily focused on the homepage. An authoritative website is one that is considered to feature quality content across the board.
Relevancy ranking factors
A search engine will analyse the following page elements to determine the relevancy of a link. These refer to both the website passing the link and the website receiving the link:
Domain to domain
What is the linking site about? Is it comparable to your site? At the most basic level, links are determined to be most relevant among sites that share similar content.
If the linking website is about plumbing, and the receiving website is also about plumbing, this will be regarded as a relevant link, receiving brownie points from the search engines for relevancy.
Domain to page
Does the theme of your website relate to the page to which you’re linking? If it’s completely irrelevant, odds are it won’t make contextual sense on the page, and could be considered manipulative.
It makes no sense, for example, to link from your plumbing website to a jobs website. At the same time, a domain considered irrelevant might feature a page containing information relevant to you. If, say, a link passes from your plumbing site to a page on a recruitment site that contains job listings in plumbing, it can be interpreted as relevant.
Page to page
Are you both talking about the same thing? Even if your domains seem a poor match, if another site has produced similar content on a given page, it may be considered relevant to your own page.
If the link-passing page focuses on electric boilers, and it is linking to a page on your site that is focused on electric boilers, this is considered a relevant link. Conversely, even if the overall linking website is about plumbing, if the linking page is about piping and the linked-to page is about boilers, the link is considered less relevant.
Link to page
Is the link on your page relevant to the content around it? Does it make contextual sense, or is it shoehorned in? The same applies to ‘links’ pages, on which webmasters place a list of links to external websites, sometimes headed ‘recommended businesses’ or similar.
If, on your plumbing website, you offer details of suppliers, associated businesses or even competitors, Google considers these links to be highly irrelevant, and finds that they tend to be either paid for or reciprocal. Either way, they carry very little link juice and can be seen as spammy activity.
Ultimately, the link should be relevant to the text around it, as awkward or irrelevant links can be identified and penalised by Google’s Penguin algorithm. Read more on this in anchor text, and for more on the location of links, see link building.
Your SEO agency can advise on the best internal linking practices, too. This is part of effective website navigation, or siloing, which helps clarify your main website themes.
Trust ranking factors
Trust is one of the most contentious factors in SEO. It is certainly an important factor, but its importance fluctuates regularly. Still, the general rules apply:
Google and the other engines maintain a large list of websites that they consider highly trustworthy, including the likes of Yahoo, BBC News and CNN. Outgoing links from these trusted sites pass far more ranking importance, or ‘link juice’, than links from less trusted sites.
Think of it like the six degrees of separation rule: everyone is connected somehow. If you are within one link of a highly trusted website (i.e. that website links to you), then you will also be deemed trustworthy. If you are within two links of a trusted website (i.e. linked to by a site that is itself linked to by a trusted site), then you are considered a little less trustworthy, but trustworthy nonetheless. The further down the ladder you are, the lower your trust ranking.
An example would be:
Yahoo > London.gov > Lambeth Borough > Disability Lambeth > usedcellphonesforsale.info (spam)
London.gov gains trustworthiness owing to its proximity to Yahoo, a ‘trustworthy’ site. But three links away from Yahoo is ‘Disability Lambeth’, which links to a spam site. Try to be as close to a highly trustworthy site as possible, and test the trustworthiness of that site by checking its outgoing links. If it links to any spammy sites, it’s probably not considered all that trustworthy.
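As a rough illustration of how trust decays with link distance, here is a minimal Python sketch using the site names from the chain above as a made-up link graph and an arbitrary decay factor. Real trust algorithms are far more involved; this only shows the “further from a trusted seed, less trust” pattern.

```python
# A minimal sketch of distance-based trust over a made-up link graph.
# The decay factor and graph are assumptions for illustration only.

from collections import deque

links = {
    "yahoo.com": ["london.gov"],
    "london.gov": ["lambeth-borough.gov"],
    "lambeth-borough.gov": ["disability-lambeth.org"],
    "disability-lambeth.org": ["usedcellphonesforsale.info"],  # spammy outlier
    "usedcellphonesforsale.info": [],
}

trusted_seeds = {"yahoo.com"}
decay = 0.5  # each extra hop halves the inherited trust (assumption)

trust = {site: 1.0 for site in trusted_seeds}
queue = deque(trusted_seeds)
while queue:
    site = queue.popleft()
    for target in links.get(site, []):
        inherited = trust[site] * decay
        # keep the best (shortest-path) trust score found so far
        if inherited > trust.get(target, 0.0):
            trust[target] = inherited
            queue.append(target)

for site, score in sorted(trust.items(), key=lambda kv: -kv[1]):
    print(f"{site}: {score:.3f}")
```

The output simply halves the trust score at every hop down the chain, mirroring the ladder described above.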
There’s a further, final consideration: co-citation. It refers to the similarity between two web pages inferred from a third-party page that mentions both of them together. If two sites are referenced on the same page, Google identifies a relation, or link, between those two websites based on that co-citation, which is then used as an important search engine ranking factor.
Say, for example, a blog post titled ‘How to maximise your ROI with digital design’ links to a study published by Princeton University on processing fluency, and also references our page on user experience. Through that association, each source cited in the article will be credited in relation to the rank-worthiness of the other.
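At its simplest, a co-citation signal can be thought of as counting how often two pages are cited together. The short Python sketch below uses made-up article names and URLs purely to show that counting, not any real ranking system.

```python
# A minimal sketch of co-citation counting over made-up citing pages.

from itertools import combinations
from collections import Counter

# Which pages each (hypothetical) article links out to
citations = {
    "blog-post-roi-digital-design": {"princeton.edu/processing-fluency",
                                     "example-agency.com/user-experience"},
    "another-article": {"princeton.edu/processing-fluency",
                        "example-agency.com/user-experience"},
    "unrelated-article": {"princeton.edu/processing-fluency"},
}

co_citations = Counter()
for cited_pages in citations.values():
    # every pair of pages cited together on the same page is one co-citation
    for a, b in combinations(sorted(cited_pages), 2):
        co_citations[(a, b)] += 1

for (a, b), count in co_citations.most_common():
    print(f"{a} <-> {b}: cited together on {count} page(s)")
```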
‘NoFollow’ attributes and robots.txt
Since the value of links became common knowledge, search engines (especially Google) have been fighting the tactics that allow people to “game” their way to the top of search engine results, including paid links, excessive reciprocal or three-way links, and questionable link baiting tactics. Google’s Penguin algorithm eats manipulative links for lunch.
Sometimes your SEO agency will recommend adding a ‘NoFollow’ attribute to an anchor tag or page, which tells search engines to “ignore this link” or “ignore all links on this page.” Any NoFollow-ed link is essentially worthless when it comes to SEO, as it doesn’t pass ranking power. But paid links, links that dilute your subject relevance, or links to untrustworthy pages can benefit from a ‘NoFollow’ tag to avoid a ranking penalty.
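To show what this looks like in practice, here is a small, self-contained Python sketch (standard library only) that scans a made-up HTML snippet and reports which links carry a NoFollow attribute. The markup and URLs are invented for illustration.

```python
# A minimal sketch: distinguishing followed from nofollow-ed links in HTML.
# The snippet and URLs below are made up for illustration.

from html.parser import HTMLParser

snippet = """
<p>Read our <a href="/guides/link-juice">guide to link juice</a> and this
<a href="https://sponsor.example.com" rel="nofollow sponsored">paid placement</a>.</p>
"""

class LinkAuditor(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").split()
        # A nofollow-ed link is ignored for ranking; others pass link equity.
        status = "ignored (nofollow)" if "nofollow" in rel else "passes link equity"
        print(f"{attrs.get('href')}: {status}")

LinkAuditor().feed(snippet)
```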
Other relevant directives include the ‘NoIndex’ robots meta tag and the ‘Disallow’ rule in robots.txt, which prevent the search engines from even looking at the page being linked to. It is very common for blogs and online forums to use robots tags and robots.txt on links to prevent search engines from thinking they are spam. However, unless you are using them intentionally to control your internal link juice distribution or to dictate the way your site is crawled, they will inhibit your page ranking.
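As a concrete example of the Disallow rule, the short Python sketch below uses the standard library’s robots.txt parser against made-up rules and URLs to show which pages a compliant crawler would skip.

```python
# A minimal sketch of how a Disallow rule keeps crawlers away from a page.
# The rules and URLs are made up for illustration.

from urllib.robotparser import RobotFileParser

robots_txt = """
User-agent: *
Disallow: /private/
Disallow: /links-page.html
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for url in ("https://example.com/services.html",
            "https://example.com/links-page.html",
            "https://example.com/private/notes.html"):
    allowed = parser.can_fetch("*", url)
    print(f"{url}: {'crawlable' if allowed else 'blocked by robots.txt'}")
```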