Join us for a *Google News & Webmaster Central office hours hangout* 🙂
Just a few months ago, Google News launched the Publisher Center tool designed exclusively for you. It’s now the one-stop-shop for you to more quickly and effectively manage the information Google News has on record for your site. Learn what you can do with it and how to access it. Google News’ Community Manager, Stacie Chan, will also be able to answer your Publisher Center-specific questions.
This session is about Google News. Add your questions to this event page and vote on existing questions!
This is a Hangout on Air here on Google+. It works best with a webcam + headset. You can find out more about Hangouts and how to participate at https://sites.google.com/site/webmasterhelpforum/en/office-hours
* Publisher Center: https://partnerdash.google.com/partnerdash/d/news
* Help Center: https://support.google.com/news/publisher/?hl=en#topic=4359865
* Help Forum: https://productforums.google.com/forum/#!forum/news
* News-specific crawl errors HC page (Article too short, etc.): https://support.google.com/news/publisher/answer/93994
* Editors’ Picks info: https://support.google.com/news/publisher/answer/1407682?hl=en&ref_topic=4359926
* Editors’ Picks case study: http://gawkerdata.kinja.com/google-news-editors-picks-as-seo-our-initial-results-1681002004
* Google News Sitemap overview: https://support.google.com/news/publisher/answer/75717?hl=en&ref_topic=4359874
* Are reviews news? https://productforums.google.com/forum/#!topic/news/j_xcBHlXMP8
* Language/Country requirements: https://support.google.com/news/publisher/answer/2481306?hl=en
Join us for a special *Google Webmaster Central office hours hangout* with personal slots for you & your site.
You can sign up for a 4-minute slot for your personal questions at http://goo.gl/GcywKV . The first part of this hangout will be reserved for these questions; we’ll open it up to the public afterwards. This is your chance to ask that question about your website that’s been on your mind!
This is a Hangout on Air on Google+. It works best with a webcam + headset. You can find out more about Hangouts and how to participate at https://sites.google.com/site/webmasterhelpforum/en/office-hours
Feel free to drop by – we welcome webmasters of all levels!
If your site has been hit by Panda, it’s critical to quickly determine the root causes. The best way to attack the problem is a deep crawl of the website combined with a thorough audit through the lens of Panda. The result is a remediation plan covering the core site problems that need to be fixed sooner rather than later. Your SEO efforts should be planned only after a complete analysis.
And for larger sites, the remediation plan can be long and complex. The smart way to handle a big problem like this is to break it into smaller pieces as the evaluation goes on. You don’t want a plan with massive site changes that could upset your client, or worse, destroy the site entirely.
But just because problems have been identified and a remediation plan mapped out doesn’t mean all is well in the world of Panda. Sometimes major problems can’t be easily fixed. And if you can’t tackle low-quality content on a large site hit by Panda, you may have to get used to depressed rankings and low traffic levels.
One especially frustrating and serious problem when undertaking Panda remediation is the (sometimes) highly difficult Content Management System (CMS) “issue.” The term ‘CMS’ has a rather broad definition here, since some internal systems are not actually content management systems; they simply provide a basic mechanism for publishing information across a website. There’s a difference between that and a full-blown CMS. Regardless, the CMS in use can make Panda changes easy, or it can make them very hard. Each scenario is different, but again, it’s something that happens often.
When presenting the remediation plan to a large client team, there are usually people representing various parts of the company in the meeting. There could be people from marketing, development, sales, Information Technology, design, and even corporate executives involved. A main objective is to make sure everyone is on board when dealing with an issue as big as a Panda penalty (or any other Google penalty).
Sometimes, though, Information Technology and engineering, or the webmaster, has the job of injecting a note of caution about how readily the changes can be applied. This can be a difficult position. There’s a lot of traffic and a big pile of money at stake, and nobody wants to be the person who says the project can’t be done.
This video discusses the Panda penalty:
For instance, what if 200,000 pages of so-called “thin content” are a major cause of the Panda penalty? The pages have been identified and analyzed, including the directories involved, but the custom CMS won’t let you easily manage that content. Once the CMS limits are explained, there still needs to be some way to correct the situation, or the site is effectively lost.
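The identification step described above can be sketched in code. This is a minimal example built on assumptions: the 300-word threshold, the regex-based text extraction, and the URL-to-HTML mapping are all placeholders you would replace with your crawler’s real data and your own quality criteria.

```python
import re

THIN_THRESHOLD = 300  # assumed cutoff; tune for your site

def visible_word_count(html: str) -> int:
    """Rough word count after stripping script/style blocks and tags."""
    text = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"<[^>]+>", " ", text)
    return len(text.split())

def flag_thin_pages(pages: dict) -> list:
    """Return URLs whose visible text falls below the threshold.

    `pages` maps URL -> raw HTML, e.g. as collected by a crawl.
    """
    return [url for url, html in pages.items()
            if visible_word_count(html) < THIN_THRESHOLD]
```

The output of a pass like this is exactly the “gigantic list of URLs” the next section assumes you have in hand.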
So what types of CMS hurdles might you run into when trying to recover from Panda? Unfortunately, there are several, and many are caused by the site’s CMS. Here are three Panda-related problems that require remediation. They don’t cover every CMS obstacle to Panda recovery, but these three are common and should be addressed as soon as possible.
Once you’ve found the low-quality content and have a large list of URLs to remove from the site, you want to serve either 404 or 410 response codes for them. So you approach your development team and explain the situation. Unfortunately, with some content management systems, it’s not so easy to single out specific URLs for removal. And if you can’t remove those specific low-quality URLs, you may never escape the Panda filter. It’s a tough spot.
In many cases the CMS can only do a bulk 404 of pages, without regard to quality, so you would be throwing the baby out with the bathwater. Deleting a whole block of pages destroys high-quality content along with the low-quality content. That’s not just bad form; it defeats the purpose of what you’re trying to accomplish with your Panda remediation.
It’s also not uncommon to see CMS platforms that can only remove content from a certain date forward or backward. Again, you’d be destroying good content along with bad, which is counterproductive. The goal is to raise the percentage of high-quality content on your site, not to obliterate large sections that contain both high- and low-quality URLs.
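One workaround when the CMS can’t remove individual URLs is to intercept requests above the CMS, at the web server or application layer, and serve 410 Gone for everything on a removal list. The sketch below assumes a plain-text file of paths and leaves the actual framework wiring to you; 404 works too, but 410 can be treated as a stronger “gone for good” signal.

```python
def load_removed(path: str) -> set:
    """Read one URL path per line from a removal list file (name assumed)."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def status_for(request_path: str, removed: set) -> int:
    """Return 410 for removed pages, 200 otherwise.

    Hook this check into your server or middleware before the
    CMS ever sees the request, so CMS limitations don't matter.
    """
    return 410 if request_path in removed else 200
```

Because the check runs before the CMS, only the exact URLs on the list are affected, which avoids the bulk-404 problem described above.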
Similar to the situation above, if you need to noindex content (rather than remove it), your CMS must allow you to dynamically serve the meta robots tag. For instance, if you find 20,000 pages of content that is valuable for users to access but that you don’t want indexed by Google, you can serve a meta robots tag of “noindex, follow” on each page. The pages won’t be indexed, but the links on them will still be followed. Or you could use “noindex, nofollow”, where the pages aren’t indexed and the links aren’t followed either. It depends on your specific situation.
But once again, the CMS can make this hard to implement. There are often cases where, once a meta robots tag is baked into a page’s code, it’s difficult to change. Or multiple meta robots tags end up on a page in an attempt to noindex low-quality content. Beyond that, there are times when the meta robots tag isn’t even an option in the CMS, so you can’t use it even if you want to. Or, as with the previous issue, you can’t use it selectively: it’s a section- or class-level directive that would force you to noindex high-quality content along with low-quality content, a roadblock that leaves you dealing with pages individually.
The meta robots tag can be a powerful piece of code in SEO, but you need to be able to apply it correctly and selectively. If not, it can cause serious complications.
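If the CMS exposes any per-page template hook, the directive can be generated dynamically per page instead of hard-coded. A sketch, assuming a per-page `noindex` flag set by your quality audit and a `keep_links` flag for the follow/nofollow choice; both field names are made up for illustration.

```python
def robots_meta(page: dict) -> str:
    """Emit a per-page meta robots tag, or nothing for indexable pages.

    `page["noindex"]` is an assumed flag from your content audit;
    `page["keep_links"]` decides whether outbound links stay followed.
    """
    if not page.get("noindex"):
        return ""  # indexable pages need no tag at all
    directive = ("noindex, follow" if page.get("keep_links", True)
                 else "noindex, nofollow")
    return f'<meta name="robots" content="{directive}">'
```

Keeping the decision in one function also prevents the duplicate-tag problem mentioned above, since exactly one tag (or none) is emitted per page.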
Proper use of nofollow is standard on well-built websites, and nofollow problems are very common during Panda remediation. Once addressed, though, fixing them will greatly help with removing the penalty, and with the Penguin algorithm as well.
For instance, big affiliate sites often have an enormous followed-links problem. Affiliate links should be nofollowed and should not pass link equity or domain authority to the target sites (where there is a business relationship). But what if all, or most, of your affiliate links are followed? Say your site has one million pages indexed and many links to e-commerce sites. The best way to handle this is to nofollow all affiliate links across the content while leaving organic links intact (followed). That should be easy, right? Maybe.
What looks like a quick fix in the CMS administration panel can turn into a genuinely difficult situation. Some custom CMS platforms can only nofollow all links on a page, and that’s definitely not what you want to do. You want to selectively nofollow only the affiliate links.
In other scenarios, the CMS may only be able to nofollow links from a certain date forward, because an upgrade added the capability for selective nofollows. But what about the 200,000 pages indexed before that date? You don’t want to leave those as-is if they contain followed affiliate links. Again, a straightforward situation suddenly becomes a hurdle for site owners dealing with Panda.
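Selective nofollowing can be sketched as a rewrite pass keyed on each link’s host. The affiliate host list below is hypothetical, and a regex pass over raw HTML is fragile; a real implementation would run in the template layer or over a parsed DOM, but the selection logic is the same.

```python
import re
from urllib.parse import urlparse

AFFILIATE_HOSTS = {"partner.example.com"}  # hypothetical affiliate domains

def needs_nofollow(href: str) -> bool:
    """True only for links pointing at a known affiliate host."""
    return urlparse(href).netloc.lower() in AFFILIATE_HOSTS

def nofollow_affiliates(html: str) -> str:
    """Add rel="nofollow" to anchors targeting affiliate hosts,
    leaving organic links untouched."""
    def fix(match):
        tag, href = match.group(0), match.group(1)
        if needs_nofollow(href) and "nofollow" not in tag:
            return tag[:-1] + ' rel="nofollow">'
        return tag
    return re.sub(r'<a\s+[^>]*href="([^"]+)"[^>]*>', fix, html)
```

Because the decision is per-link rather than per-page or per-date, it sidesteps both CMS limitations described above.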
A final word: look at the entire CMS, then work out which links need to be adjusted and which don’t require edits.
How can you tell if your site is suffering from an algorithmic penalty, or you are simply being outgunned by better content?
To see manual actions on your site, visit the Manual Actions Viewer:
Learn about the Panda and Penguin algorithmic updates:
Have a question? Ask it in our Webmaster Help Forum: http://groups.google.com/a/googleproductforums.com/forum/#!forum/webmasters
Want your question to be answered on a video like this? Follow us on Twitter and look for an announcement when we take new questions: http://twitter.com/googlewmc
More videos: http://www.youtube.com/GoogleWebmasterHelp
Webmaster Central Blog: http://googlewebmastercentral.blogspot.com/
Webmaster Central: http://www.google.com/webmasters/