
Google Further Explains What the Part of “Core Algorithm” Means

Google integrated the Panda update into its core ranking algorithm last week, and the news created a lot of buzz in the SEO world. Everyone kept guessing at its impact on search rankings, as no clear answers were given about what it really meant. Now the guessing can stop: Google has released more information about the "core update."

 

Being part of the core algorithm means a signal is mature enough to work on its own and keep up with future trends and changes, without requiring a lot of manual monitoring. Google's Andrey Lipattsev answered many questions in a Q&A session, clearing up the confusion of SEOs and webmasters.

 

According to him, the core update doesn't mean entirely new functionality; it is simply an improved part of the ranking algorithm. He explained it with this analogy:

 

Think of the cars of the past, which had to be started manually with certain tools because they didn't come with built-in starters. Today, cars have built-in starters and no manual work is needed to start them. The overall idea is the same, but things have become more convenient.

 

Connecting this to Google Panda, he added that the basic concept behind Panda has not changed; it has been proven to work over time and is now part of the core algorithm. In other words, any update that reaches the stage of working entirely on its own, without requiring manual changes, simply becomes part of the core algorithm. Check Andrey Lipattsev's video for more details.

 

Webmasters and SEOs who use pure white hat link building tactics need not worry about this change. You can also contact the Medialinkers SEO team to handle your SEO campaign for you.


Google Dances Again, but Penguin Still not the Likely Cause

A lot of webmasters observed changes in the search engine results again this weekend. Last weekend's fluctuations were caused by a core ranking algorithm update, later confirmed by Google. This time, however, Google has not yet confirmed the reason for the fluctuations. The SEO forums and social media pages are receiving a lot of queries from SEOs and webmasters about the changes.

 

The same goes for the automated tracking tools, which are reporting large fluctuations through their interfaces. For their part, John Mueller and Gary Illyes have again answered everyone in the negative when asked about the Penguin update.

 

In fact, Gary Illyes's answer on Twitter was a plain "no" to the question "Is Google rolling out the Penguin update?" posted by many SEOs and website owners.

 


 

Right now, no one is clear about the cause of these major fluctuations. Whether they come from further changes to the core algorithm or from a separate update is something that will be revealed over time. For now, the signs pointing to a new update are still not enough to pin it on the mighty Penguin.

 

Here are some charts from the various tracking tools:

Images credit: Search Engine Roundtable

Stay tuned to the Medialinkers SEO Services blog for the latest updates.


Switching to HTTP/2 & What it Means for SEOs

According to Barry Schwartz, Google's John Mueller said that GoogleBot will support HTTP/2 by the end of this year or early next year. This matters because HTTP/2 provides a tremendous speed increase, making for an extremely fast user experience along with the other benefits covered below.

 

What Is HTTP/2?

 

HTTP/2 is the latest update made to the HTTP protocol by the Internet Engineering Task Force. It is the successor to HTTP/1.1, which was drafted in 1999. HTTP/2 is a much-needed refresh, as the web has changed a great deal over the years, and this update brings advancements in security, speed and efficiency.

The update was based largely on Google's own SPDY protocol, which is being deprecated in 2016. Many of SPDY's features have made their way into HTTP/2, improving data transmission while keeping backward compatibility, and SPDY served as a proof of concept for much of what HTTP/2 does.


 

Major Improvements in HTTP/2

 

Single Connection

Only one connection is used to load a site, and it remains open for as long as the site is open. This cuts down on the round trips needed to set up multiple TCP connections.

Multiplexing

Multiple requests are allowed at the same time, on the same connection. In HTTP/1.1, each transfer had to wait for earlier transfers to complete.

Server Push

The server can send additional resources to the client for future use, before the client asks for them.
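As a hedged illustration of how a push can be triggered: one common convention is a preload Link header on the main response, which HTTP/2-capable servers and CDNs may turn into a push of that resource (the stylesheet path is a placeholder):

    Link: </css/styles.css>; rel=preload; as=style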

Prioritization

Requests are assigned dependency levels, which the server can use to deliver the highest-priority resources more quickly.

Binary

HTTP/2 is a binary protocol, which makes it easier for servers to parse, more compact and less error-prone. No additional time is wasted translating information from text to binary, the computer's native language.

 

Header Compression

HTTP/2 uses HPACK header compression, reducing overhead. In HTTP/1.1, many headers are sent with exactly the same values on every request; HPACK avoids resending that redundant data.

There are plenty of demos out there (often using tiled images) where you can see the overall difference in action. As latency increases, the speed advantage of HTTP/2 becomes even more noticeable, which is great news for mobile users.

 

Who Supports HTTP/2?

 

You’ll also find that most major server software — such as Apache, NGINX, and IIS — already supports HTTP/2. Many of the major CDNs have also added HTTP/2 support, including MaxCDN and Akamai.
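As a rough sketch, enabling HTTP/2 on NGINX (1.9.5 or later) is a one-word change on the listen directive; the domain and certificate paths below are placeholders:

    server {
        listen 443 ssl http2;   # "http2" switches this listener to HTTP/2
        server_name example.com;
        ssl_certificate     /etc/ssl/example.com.crt;
        ssl_certificate_key /etc/ssl/example.com.key;
    }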

HTTP/2 is supported by over 75% of browsers in the US and 67.78% globally. There are a couple of caveats to these numbers: IE supports HTTP/2 only on Windows 10, and Chrome, Firefox and Opera support HTTP/2 only over HTTPS. You can check how this would affect your site's visitors in Google Analytics under Audience -> Technology -> Browser & OS, comparing the browsers your visitors use against the supported ones.

 

HTTPS with HTTP/2

 

While HTTP/2 itself supports both insecure and secure connections, Firefox and Chrome will only support HTTP/2 over HTTPS. In practice, this means sites that want HTTP/2 will have to be served over HTTPS.

New initiatives are lowering the barrier on the encryption side: Let's Encrypt, which went into public beta on Dec 3, 2015, provides free security certificates for sites and is a great step towards a more secure web.
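As an illustration of how low that barrier is, the Let's Encrypt client (now distributed as certbot) can obtain and install a certificate in one command; the domains are placeholders and the exact flags depend on your server setup:

    # Obtain a free certificate and configure NGINX to use it
    sudo certbot --nginx -d example.com -d www.example.com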

 

How Would It Improve the User Experience?

 

It would offer a much faster user experience. As time goes on, people will also learn the limits of the new protocol, and users should see increasing speeds on HTTP/2 connections.

 

What it Means for Developers

 

For developers, HTTP/2 means the following HTTP/1.1 workarounds are no longer necessary:

 

Domain Sharding

Files are loaded from multiple subdomains to establish more parallel connections. This increases parallel file transfers, but it adds connection overhead on the server; with HTTP/2's single multiplexed connection, sharding loses its purpose.

Image Sprites

Many small images are combined into one sprite file, which has to be loaded in full before any of the images are shown, and the large file ties up RAM.

Combining different files

JS and CSS files are combined to reduce page requests. This makes users wait for the whole combined file and also consumes additional RAM.

 

What it Means for SEOs

 

With GoogleBot adding support for HTTP/2, websites that support the protocol will likely see an additional ranking boost from speed. On top of that, with Chrome and Firefox only supporting HTTP/2 over HTTPS, many websites that have not yet upgraded to HTTPS may see a further boost in rankings when they do.

 

The challenge SEOs will face is the switch to HTTPS and its redirects: 302s used instead of 301s, additional hops and chains, and cleaning up old redirects all become important. Various items have to be cleaned up first, such as external links, internal links, duplication issues, mixed content, canonical tags, sitemaps and other tracking systems that have to be changed.
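As a sketch of the redirect side, a single site-wide rule avoids 302s and extra hops when forcing HTTPS (Apache mod_rewrite, assuming an .htaccess setup):

    RewriteEngine On
    # Only rewrite requests that arrived over plain HTTP
    RewriteCond %{HTTPS} off
    # Send them to the HTTPS version with a single permanent (301) hop
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]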

 

Another thing people don't realize is that referral data in the headers is dropped when a visitor moves from a secure site to a non-secure one. This means more traffic is attributed as direct when it should be attributed to the referring sites. HTTPS can also prevent some ads from being placed on sites.

As Google has officially made speed a ranking factor, it will be quite interesting to see whether HTTP/2 itself becomes a ranking factor, and how much additional weight is placed on the extra speed.

In short, switching from HTTP/1.1 to HTTP/2 is beneficial for users, developers, server admins, sales teams and everyone else involved in the web business. There is no real downside to upgrading: if a visitor's browser can't load the site over HTTP/2, it will simply load it the way it always has.

 

Also, according to Bill, a lot of ad networks don't support HTTPS, which suggests that Google is looking to gain more ad space in the market through this protocol.

Stay tuned to the medialinkers.info blog for more interesting inside news on SEO.


Seven On-Page SEO Tips to Pay Attention to

Every business wants an online presence, but building one on the internet is a huge challenge. Most of the time, the marketing and web development teams are not on the same page, and their differences in strategy and ideas lead to disaster. To make sure everything turns out fine, implement the following strategies.

Blocking the Staging Servers

The most common issue lies with the staging and development versions of a website. These environments are crucial for testing landing pages, but they are also the most vulnerable areas of the site: without access controls, search engines will crawl and index their pages, leading to duplicate content issues and a decline in the site's search rankings. To prevent this from happening, you can do the following:

Specify an allowed IP range and use a firewall to block search spiders.

Include a 'robots.txt' file at the development server's root, disallowing the links that shouldn't be crawled and indexed (a minimal sketch follows this list).

Finally, create a password-protected login page, preventing the spiders from accessing the content.
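Here is a minimal robots.txt sketch for a staging server, blocking all well-behaved crawlers. Note that robots.txt is a polite request rather than real access control, so combine it with the firewall or password protection described above:

    # Block all crawlers from the entire staging site
    User-agent: *
    Disallow: /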

Work on Redirects

Redirects are used to indicate the new locations of pages that have recently been deleted or moved. Without redirects you would lose all of that search traffic, which is what makes them common practice in the SEO world.

A 302 redirect is for temporary redirections and passes no link value to the target page. For a permanent move, use a 301 redirect instead (a sketch follows).
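As a rough illustration, here is what both kinds of redirect look like as Apache mod_alias rules; the file paths are placeholders:

    # Permanent move (passes link value):
    Redirect 301 /old-page.html /new-page.html
    # Temporary move:
    Redirect 302 /sale.html /holiday-sale.html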

The Redirect Hops

Multiple chained redirects require a series of extra server requests, which slows down page delivery. In situations where the chain exceeds five hops, the search engines give up on crawling it, which means the page would not be indexed regularly. Link value is also lost with each extra redirect.

Extensive redirect usage also hurts your server's performance. To counter this, you can use regex rules and wildcards to collapse many redirects into a few (see the sketch below).
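For example, a single regex rule can stand in for hundreds of one-off redirects; this Apache sketch assumes a hypothetical /blog/archive/ section moving to /archive/:

    # One rule redirects every URL under /blog/archive/ to /archive/
    RedirectMatch 301 ^/blog/archive/(.*)$ /archive/$1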

Canonicalization

Canonicalization is the method through which you prioritize a single web page as the authoritative source of its content. Duplicate content becomes a problem when the same content is found on multiple pages and its authoritative source isn't clear. For example, these URLs would cause duplicate content problems:

http://www.mysite.com

http://www.mysite.com/index.html

http://mysite.com

To manage duplicate URLs effectively, you can implement a 301 redirect that points each URL variant at the preferred, canonical URL.

Note also that URLs which render the same page content both with and without a trailing slash cause duplication. It's wise to standardize the URLs on one form and use a 301 rule to direct users away from the discarded version. Make sure the internal navigation points directly at the right URL, to maximize link value within the site.

Another way to do this is to use the rel=canonical tag on every variant, to indicate the authoritative source to the search engines (a sketch follows).
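As a sketch, the tag goes in the <head> of every duplicate variant and points at the preferred URL, reusing the example domain above:

    <!-- Tells search engines which URL is the authoritative source -->
    <link rel="canonical" href="http://www.mysite.com/" />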

Go Responsive

Leaving out mobile users and targeting only desktop users means missing massive search traffic from mobile devices. According to comScore, mobile internet users have now overtaken desktop internet users, and Google has taken the mobile platform seriously with its Mobilegeddon update.

Some people build a mobile version of their desktop site on a subdomain (m.mysite.com), but this is not only a costly plan, it also takes extra resources to manage two separate sites. It can also lead to duplicate content issues when the pages are not properly tagged, resulting in a Google penalty.

So instead of creating a separate mobile site, implement the responsive design technique; a mobile-first approach starts you off well, making the site attractive and functional at the same time (a minimal sketch follows).
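As a minimal sketch of the responsive approach, a viewport meta tag plus a media query is the starting point; the .column class is just an illustration:

    <!-- In the <head>: makes the layout respond to the device width -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    /* In the stylesheet: stack columns on small screens */
    @media (max-width: 600px) {
      .column { width: 100%; }
    }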

Speeding Up the Pages

Page speed is a key ingredient in Google's ranking algorithm. Pay close attention to page loading time, otherwise users will abandon your site. Not only is that bad for business, it may also attract the wrong sort of attention from Google.

Developers and designers must ensure the site is well optimized, using the following recommendations (a sketch follows the list):

  • Inline the styles for "above the fold" content
  • Avoid bloated code by moving CSS and JavaScript to external files
  • Minify the source code to remove white space
  • Use CSS sprites to reduce server requests
  • Enable asynchronous downloading and rendering of external JavaScript files
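As a brief sketch of the first and last recommendations, inline a small block of critical CSS and load external scripts asynchronously; the script path is a placeholder:

    <!-- Inline only the small, critical "above the fold" styles -->
    <style>
      body { margin: 0; font-family: sans-serif; }
    </style>
    <!-- "async" lets the HTML parser continue while the script downloads -->
    <script src="/js/app.js" async></script>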

Using the Right HTML elements

From an SEO perspective, the most important HTML elements include the title, headings, and alt attributes. Optimize them to help the search engines understand your web page.

The right heading structure also makes it easy for screen readers to decipher the content areas.

But always use the H1 tag just once on the entire page, and make it relevant to the content of the page (see the sketch below).
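A quick sketch of these elements on a hypothetical page (the names and paths are illustrative):

    <title>Seven On-Page SEO Tips | MySite</title>
    <h1>Seven On-Page SEO Tips to Pay Attention to</h1>   <!-- one H1 only -->
    <h2>Blocking the Staging Servers</h2>                  <!-- subsections -->
    <img src="/img/traffic.png" alt="Chart of organic traffic growth">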

Managing Errors

Also, make sure that pages which have been moved or deleted return a '404: Not Found' response code, so that Google drops them from the index.

Too often, developers return a '200: OK' response code for missing pages, which is problematic because those URLs get indexed by Google. These soft errors can also get your site penalized by Google Panda's quality assessment, so always pay attention to the broken links and missing pages on your site (a quick check is sketched below).
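A quick way to verify what a removed page actually returns is to fetch just its headers; the URL reuses the example domain above:

    # -I asks curl for the response headers only
    curl -I http://www.mysite.com/deleted-page.html
    # Expect: HTTP/1.1 404 Not Found   (a "200 OK" here is a soft 404)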

By following these practices, you are well on your way to running a successful SEO campaign for your clients' projects. For more details, you can consult the Medialinkers SEO Experts.
