Many Internet marketers want to know what Google really thinks about SEO, PageRank (PR), spam pages, and the future of Google. This article summarizes the views of Matt Cutts, a senior Google engineer, on these issues.
Recently, Matt Cutts shared his understanding of SEO, PR, spam, and the future of the Google search engine.
First, Matt Cutts divided the evolution of Google's search engine into the following parts:
1. Knowledge graph
2. Voice search
3. Traditional search
4. Today’s Google Search Engine
5. Google's Deep Learning
He further explained that Google's so-called deep understanding means that the search engine connects the semantics of words with how those words are used, narrowing the gap between a search query and its results.
Deep understanding in traditional and voice search
Voice search has changed people's habits of using search engines: compared with traditional search it takes fewer steps, and the engine must still recognize user intent, just expressed in a different way.
Cutts showed an example of his own use of voice search. He asked, "What is the weather?" and, because Google inferred his location and knew his intent was to check the weather, it showed the conditions in Las Vegas, Nevada. Then he asked, "What's the weather in Mountain View?" and Google returned Mountain View's weather, recognizing the question as a follow-up to the previous one. Finally he asked, "What's the weather like in Mountain View this weekend?" and Google showed Mountain View's forecast for Saturday.
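The context carry-over Cutts demonstrated can be sketched as a toy resolver. This is purely illustrative: the `resolve` function and its rules are invented here and are in no way Google's implementation.

```python
# Toy model of contextual query resolution (illustrative only;
# not Google's actual implementation).

def resolve(query, context):
    """Fill in missing intent/location from the previous queries."""
    words = query.lower().rstrip("?").split()
    intent = context.get("intent")
    location = context.get("location")
    if "weather" in words:
        intent = "weather"
    if "in" in words:
        # Treat everything after "in" as a location.
        location = " ".join(words[words.index("in") + 1:]).title()
    context.update(intent=intent, location=location)
    return f"{intent} in {location}"

context = {"location": "Las Vegas"}  # inferred from the user's position
print(resolve("What is the weather?", context))          # weather in Las Vegas
print(resolve("What about in Mountain View?", context))  # weather in Mountain View
```

The second query never mentions "weather", yet it resolves correctly because the intent persists in the context, mirroring the follow-up behavior Cutts showed on stage.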
The Hummingbird algorithm, the Panda algorithm, and originality
Next, Cutts talked about the Hummingbird algorithm, because he felt that much of what blogs wrote about Hummingbird was actually irrelevant to website rankings. In fact, Hummingbird had been live for a month before anyone noticed. Hummingbird is mostly a core quality change; it is not what those bloggers say it is, and in practice it has little impact on SEO.
What SEOs should pay attention to is that Google is now considering softening the Panda algorithm: sites that fall into Panda's gray area but are genuinely high quality may be re-ranked.
Google is also considering giving more ranking weight to originality. In Google's search results we can already see originality becoming more and more important, and Cutts acknowledged that this will continue to be a direction of Google's development.
Google's search results on mobile
Next, he mentioned the role of smartphones and their impact on search results, an area every SEO should keep watching. Many current websites are not mobile-friendly, which lowers their rankings in mobile search results.
Smartphone search results rank websites mainly on the following points:
1. Phones cannot play Flash, so Google will not show Flash-only pages to them.
2. If a website uses a lot of Flash, seriously reconsider that use, or make sure the mobile version of the site does not depend on it.
3. If a website redirects mobile visitors to its homepage instead of the inner page they wanted, its ranking will drop.
4. If a website loads slowly on a smartphone, Google won't rank it well.
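Two of the signals above, Flash embeds and the lack of a mobile viewport, can be checked with a short script. This is a sketch using only Python's standard library; the `mobile_issues` helper and its heuristics are invented here for illustration.

```python
# Sketch: flag two mobile-unfriendliness signals in a page's HTML --
# Flash embeds and a missing viewport meta tag. Illustrative heuristics only.
from html.parser import HTMLParser

class MobileCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_viewport = False
        self.flash_tags = 0

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "viewport":
            self.has_viewport = True
        # Flash is typically embedded via <object>/<embed> tags.
        if tag in ("object", "embed") and "flash" in a.get("type", ""):
            self.flash_tags += 1

def mobile_issues(html):
    parser = MobileCheck()
    parser.feed(html)
    issues = []
    if not parser.has_viewport:
        issues.append("no viewport meta tag")
    if parser.flash_tags:
        issues.append(f"{parser.flash_tags} Flash embed(s)")
    return issues

page = '<html><body><embed type="application/x-shockwave-flash" src="banner.swf"></body></html>'
print(mobile_issues(page))  # ['no viewport meta tag', '1 Flash embed(s)']
```

A real audit would also need to measure load time on a mobile connection and check redirect behavior, which a static HTML scan cannot see.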
Cutts explained very clearly why mobile traffic has risen sharply, and warned that sites incompatible with smartphones will see the traffic Google sends them suffer. Webmasters must develop a mobile strategy now.
Penguin Algorithm, Google's webspam strategy and native advertising
Matt Cutts then turned to Google's webspam strategy. When the Penguin algorithm first launched, some people on black-hat forums boasted that their sites were unaffected. Webspam is the target of Google's anti-spam team, which penalizes sites riddled with spam. Cutts said he knew his team was doing a good job when some site operators threatened him personally.
His team will continue this work, and certain keywords are now treated as spam-targeted, including "payday loans", "car insurance", "mesothelioma" and some pornographic terms. Because these high-value keywords are very attractive to spam publishers, the team keeps updating its algorithms to protect those areas from spam.
Matt Cutts also talked about native advertising: Google will continue to penalize sites that fail to properly disclose paid advertising, and it has already begun taking action against native ads published online without labels. He emphasized that there is nothing wrong with native ads that are properly disclosed; it is the undisclosed ones Google will act against.
Spam sites remain on Google's radar; once found, Google takes immediate action.
Bad news for PageRank
Some bad news for PR believers. Google used to update Toolbar PageRank regularly, roughly every three months, and the values were displayed in the Google Toolbar so webmasters could see the PR of the sites they ran. Unfortunately, Google will no longer push PageRank updates to the Toolbar, so this year we no longer see fresh PR values for sites. Cutts could not say how the lack of updates affects PR's role in ranking, but it can be speculated that the visible PR value will no longer serve as a ranking reference.
Communication with Webmasters, Snippets, JavaScript, and Negative SEO
The Google team also continues to strengthen communication with webmasters. Google sees more and more malware and hacking attacks badly affecting websites, and most webmasters don't know what happened or how to fix their sites, so Google made new videos on the subject and added many examples to its guidelines, helping people understand why rankings drop and guiding webmasters through repairs.
Cutts stressed that it isn't just him doing this work: Google gives about 100 talks each year to communicate with webmasters, and the team also sets aside dedicated time for users to consult the search team.
Google's search engine is getting smarter and can now process JavaScript, which spammers often abuse. However, Cutts cautioned that even though Google can now handle JavaScript, that does not mean an entire website should be written in it.
Rich snippets can also improve how webpages appear, which can be observed among top-ranking sites. "Sites that rank higher get richer snippets, and those that rank lower get fewer," Matt said.
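Rich snippets are typically generated from structured markup such as schema.org JSON-LD embedded in the page. Here is a minimal sketch of emitting such markup; the `article_jsonld` helper and its example values are invented for illustration.

```python
# Sketch: generate a schema.org Article block as JSON-LD, the kind of
# structured markup rich snippets are commonly built from.
import json

def article_jsonld(headline, author, date_published):
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
    }
    return f'<script type="application/ld+json">{json.dumps(data)}</script>'

print(article_jsonld("Matt Cutts on SEO", "Jane Doe", "2013-10-22"))
```

The resulting `<script>` tag goes in the page's HTML; search engines that parse JSON-LD can then surface the headline, author, and date in enriched results.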
Matt also said that negative SEO is not as common as people think; on the contrary, the damage is usually self-inflicted. Someone once told Matt that competitors had bought paid links to their site in order to destroy it, but when he dug deeper he found the paid links had pointed at the site since 2010, before Google penalized it, at a time when competitors could not have been responsible.
The Google Search Engine of the Future: Mobile, Originality, and High-Quality Search Results
For the future of Google's search engine, Matt Cutts emphasized the importance of mobile sites. According to statistics, the share of YouTube traffic coming from mobile phones surged from 6% two years ago to 25% last year, and this year it has reached 40%. In some countries, YouTube traffic from phones far exceeds that from computers. Matt Cutts reiterated: "If your website is not compatible with mobile phones, then it needs to be fixed as soon as possible."
The Google team is also constantly improving the search engine so that it can understand the intent underlying a query and return the best results for it.
Originality is an area Google's search engine wants to strengthen. Google believes that identifying original content helps eliminate spam, and the team plans to use originality against spam. They found that when spammy content decreased by 15%, high-quality content surged.
The next step for the Google team is detecting hacker attacks. Here Cutts did not mean black-hat techniques but hackers in the general sense. Google also wants to make certain pornographic results, such as child pornography, impossible to find. "We're not going to let you find any relevant results on Google," Cutts said.
Cutts' current advice to webmasters is to strengthen mobile compatibility. One enhancement is Google's automatically generated form annotations, which make it easier for people to fill out forms on your website. Adding the annotations to a site is very simple, and the feature will roll out in the next few months.
The next generation of algorithms will focus on ad-heavy sites, especially those full of ads disguised as content. This is not new: such ads seriously degrade the user experience of Google search, and Google has already announced that its page layout algorithm targets them. But if the ads are rendered in JavaScript and exposed to the crawler, rather than hidden from it, the site will not be greatly affected.
Matt Cutts Q&A
Answering questions from the audience, Matt Cutts addressed links on press release sites. Google treats these as press release distribution sites and does not give their links much weight, he said. He emphasized that such links will not be penalized, because press release sites have real value for publicity and marketing, but the links will not pass PR.
There are also growing issues with infinite-scrolling sites, such as the way Twitter loads tweets as you scroll down. Cutts mentioned that Google is trying to handle infinite scrolling that other search engines cannot. He suggested that sites using infinite scrolling also provide static links, for example through a paginated structure, so that Google's crawlers don't have to wait for everything to load before crawling the page's content.
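The paginated fallback Cutts suggests can be sketched as follows. The URL scheme and the `page_links` helper are hypothetical, invented here to illustrate the idea of static, crawlable page URLs behind an infinite-scroll feed.

```python
# Sketch: emit rel=prev/next link tags for a paginated view of an
# infinite-scroll feed, so crawlers can reach every item via static URLs.
def page_links(base, page, last_page):
    links = []
    if page > 1:
        links.append(f'<link rel="prev" href="{base}?page={page - 1}">')
    if page < last_page:
        links.append(f'<link rel="next" href="{base}?page={page + 1}">')
    return links

print(page_links("https://example.com/feed", 2, 10))
```

Each paginated page serves its slice of items as plain HTML; the scrolling interface can still be layered on top with JavaScript for human visitors.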
Someone also asked about rich blog content and whether updating a blog regularly affects a site's ranking. Cutts used the Huffington Post as an example: it has many editors and publishes a great deal of news every day. His answer was that publishing content users actually want to read is the most appropriate approach.
Finally, Cutts said his team is paying close attention to the mix of organic and non-organic search results, and he hoped to receive feedback on that as soon as possible.