Google expels believers

How often does someone knock on your door and ask: “Hello, do you believe in neural networks?” Personally, at every presentation, the marketers of one IT company or another invite me to believe that neural networks, machine learning and other big data are at work in their product. Faith, as we know, is the opposite of knowledge. You can believe in things that cannot be observed or verified; knowledge is something we can observe and test. When marketers, extolling a product, talk not about specific functions the user can actually feel, but about abstract neural networks and machine learning, all the user can do is believe. When you get a page of search results, you are asked to believe that all this fancy machinery made those results better than anyone else's. There is no way to interact with the algorithm, no way to understand what it was guided by, so we simply believe that neural networks made it super intelligent. But the believers will soon be gone. For the first time in the industry, Google has made it possible for the user to see how the algorithm works, interact with it, and immediately get a response to their actions. No more faith.

At I/O 2016, Google revealed that it had been developing its own tensor processors for neural networks since 2013. A long road, but the result was worth it.

A few months ago, article recommendations from Google Now appeared in the standard Google mobile application, and a while later the app was completely overhauled. At first glance, nothing special: there are now three sections, the article feed, tips and history, and most of this functionality had been tested in Google Now for years. But after using the updated Google app for a while, I suddenly remembered what Sundar Pichai said a couple of years ago at Google I/O: that the company strives to personalize the experience of each user as much as possible, so that every person gets their own personal Google. What was then perceived as just another piece of presentation rhetoric turned out to be a long-term strategy, one that a couple of years later has been embodied in a finished product.

There are especially advanced users who fold the phone and SMS apps into a folder, put the camera on a hotkey or gesture, and pin YouTube and Gmail to the quick access panel instead. As you can see, I am old-fashioned.

We are all used to being told, at every presentation, about neural networks, machine learning and other big data. Translated from marketing-speak, this means: “Just believe that our algorithm is very intelligent.” Faith is required because in most products the user cannot see how the algorithms work; all that is left is to believe that the result handed to you was obtained in some super-intelligent way. Google is fundamentally changing this vicious practice of faith. In the Google mobile application you can now see exactly what the algorithm was guided by when it added something to your feed, and you can tune it for yourself. No more blind faith that the results sent down from above were formed intelligently. Now you can see and customize how it works.

Feed

The main reading recommendation feed is built entirely around your preferences. Based on statistics from your searches and your activity in Chrome and on YouTube, Google determines the topics that interest you, and when new content appears on the internet that the algorithm judges relevant to one of those topics, it adds it to your feed. The beauty of these recommendations is that you can always see exactly which topic a given recommendation belongs to. From there you can subscribe to the topic (you will then get recommendations on it far more often), exclude the topic from recommendations entirely, or keep the topic but exclude a specific source or a specific article.
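
To make the mechanics concrete, here is a minimal sketch in Python of how such a feed could combine implicit interest with explicit subscriptions and exclusions. This is not Google's actual algorithm; all the class names, weights and scoring below are my own assumptions, invented purely for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    # Implicit topic affinity inferred from searches, Chrome and YouTube activity.
    interest: dict = field(default_factory=dict)
    subscribed: set = field(default_factory=set)        # explicit "subscribe to topic"
    excluded_topics: set = field(default_factory=set)
    excluded_sources: set = field(default_factory=set)
    hidden_articles: set = field(default_factory=set)

@dataclass
class Article:
    article_id: str
    topic: str
    source: str

def score(article, user):
    """Return a ranking score, or None if the article must not be shown at all."""
    if (article.topic in user.excluded_topics
            or article.source in user.excluded_sources
            or article.article_id in user.hidden_articles):
        return None
    base = user.interest.get(article.topic, 0.0)
    # A subscription boosts a topic far more than implicit interest alone,
    # so subscribed topics show up much more often.
    return base + (1.0 if article.topic in user.subscribed else 0.0)

def build_feed(candidates, user, limit=20):
    scored = [(score(a, user), a) for a in candidates]
    ranked = sorted((pair for pair in scored if pair[0] is not None),
                    key=lambda pair: pair[0], reverse=True)
    return [a for _, a in ranked[:limit]]
```

In this toy model, excluding a topic or a source is a hard filter, while subscribing simply raises the rank, which matches the behaviour described above.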

According to my subjective estimate, about 60% of recommendations fall on topics I am consistently interested in or subscribed to, another 30% on new topics that Google thinks might interest me, and the remaining 10% are articles not tied to any topic.

Some readers probably remember the demotivators built around funny search suggestions, from which far-reaching conclusions were drawn about what search engines think of our life. I have a feeling the genre may be reborn!

High culture in the 'Culture' topic

This is our country))

'Local News' is on fire! The US Senate recently moved to Moscow in full force!

Everything is clear with the first group of recommendations. The algorithm analyzes your media consumption and tries to infer the topics that interest you. Moreover, it tries to formulate the topic very precisely. For example, I don't visit hardware sites often, but I watched several YouTube videos comparing processors from the blue and the red camp. After that, Google offered me several articles on the subject in the feed; the articles were all about much the same thing but were filed under different topics: “Central processor”, “Intel”, “AMD”. If I had subscribed to any of these topics, further recommendations would have become more targeted and more regular. But I didn't subscribe, and for a while the processor recommendations disappeared from the feed. Then YouTube recommended new videos from the hardware channels I had watched. I watched several of them and even liked them. From this Google concluded that the topic did interest me; it just hadn't guessed the right wording at first. And on the same day I again saw article recommendations on processors in the Google feed, but this time the topics were worded differently: “Intel Coffee Lake”, “Zen” (names of processor architectures). Since this time I again did not subscribe to any of the proposed topics, the recommendations disappeared once more, but I suspect Google will keep trying to narrow the search and pin down exactly which topic I want to subscribe to. Subscribing to topics is not strictly necessary: if you actively open articles on a topic and watch related videos on YouTube, the relevant recommendations will not disappear from your feed even without a subscription.

Hot news. You can subscribe, but I excluded this topic from my recommendations altogether.

AMD, such AMD

The topic I am actively interested in. Even if you don't subscribe to it, it will still be in the recommendations

Something new. I even subscribed.

Our everything. Every Russian should be subscribed to this topic.

Another passing fad. It flew by once and never appeared again.

The second group consists of recommendations for new topics. Most often this is either hot news, like the referendum in Catalonia, or articles from resources I visit regularly but filed under a topic that is new to me. If Google has guessed right with one of these recommendations, you should definitely subscribe to the topic: unlike the suggested topics from the first group, these do not come back often. There are also completely new recommendations, where both the topic and the source are new, but such recommendations are rare. Apparently, this is insurance against locking the user inside an “information bubble”.

The third and smallest group of recommendations consists of materials not tied to any topic. At first there were noticeably more of them, but over time they have settled at no more than 10%. I don't see any pattern in them. Most likely these are experimental recommendations for which the algorithm cannot yet identify a clear topic; if the user turns out to be interested, it will start offering clarifying recommendations.

There are four types of topics (here and throughout, the terminology is my own; a rough code sketch of this classification follows the list):

  • Global: Country, Culture, Popular, Business, Health, Science and Technology, and a whole bunch more. These topics have a blue circle logo with an upward-arrow graph. You cannot subscribe to them or exclude them, but they appear in reasonable amounts and only on significant news events. Their frequency depends directly on how actively you click on the corresponding recommendations, and perhaps even on whether you read the opened article to the end. For me, for example, Country and Business appear more often, while Sports has stopped appearing altogether because I ignored those recommendations.
  • Personal: topics formulated by the algorithm itself based on your preferences. They are narrow and specific: OnePlus 5, Pixel, Matilda (I don't know how that one got into my recommendations), Telegram, and so on.
  • Custom: topics the user formulates themselves in the Customize Articles section (Customize the Feed in another menu). Unfortunately, recommendations on these topics do not work yet. Apparently, Google still doesn't have enough big data for such specific tastes.
  • Askable: pre-formulated recommendation topics that start working only after the user sets the required parameters. For example, “Musicians”: here you need to specify the musicians, songs and bands you want Google to track for you. Likewise for stock quotes, actors, movies/TV series and sports. The “Sports” category has the most detailed settings: you can subscribe to specific teams and leagues, and subscribe separately to news and to highlights. Fans will definitely appreciate it.
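
For readers who think in code, here is the same four-way classification as a hypothetical data model. The names are my own, not any real Google API: global topics can be neither subscribed to nor excluded, while askable topics stay dormant until the user supplies their parameters.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class TopicKind(Enum):
    GLOBAL = auto()    # Country, Culture, Business... surfaces only on big news events
    PERSONAL = auto()  # formulated by the algorithm: "OnePlus 5", "Pixel", "Telegram"
    CUSTOM = auto()    # entered by the user in the Customize Articles section
    ASKABLE = auto()   # "Musicians", "Sports": needs a list of parameters first

@dataclass
class Topic:
    name: str
    kind: TopicKind
    parameters: list = field(default_factory=list)  # e.g. bands, teams, stock tickers

    @property
    def user_controllable(self):
        # Global topics can be neither subscribed to nor excluded.
        return self.kind is not TopicKind.GLOBAL

    @property
    def active(self):
        # Askable topics do nothing until the user fills in their parameters.
        return self.kind is not TopicKind.ASKABLE or bool(self.parameters)
```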

I have never googled any of them. BRB and Enjoykin come from my YouTube adventures. Where the rest came from, I don't know.

I didn't google these series either, but in Chrome I often opened pages on the LostFilm website where I read about them. Apparently, these recommendations came from my browser history.

A five-minute prank: in the 'Actors' section you can set up tracking of performers from the XXX industry

A great advantage of Google's approach is that the user can clearly see the mechanics of the algorithm and fine-tune it. As already mentioned, the user has four tools for adjusting the algorithm: subscribe to a topic, exclude a topic, exclude a source, exclude an article. That is on top of the conclusions the algorithm draws from user behavior on its own. Hiding an article is the most ambiguous signal. Often I have to use this tool one or two more times before the algorithm clearly understands what exactly it should not show me. For example, I saw in my recommendations an article on Meduza saying that an outstanding person had died. The news was filed under the topic “Bolshoi Theatre”; I don't want to exclude either the topic or the source, so I exclude the article. Unfortunately, unlike removing videos from YouTube recommendations, where you can answer clarifying questions about why you are removing the video, there are no such clarifying questions in the Google feed. So for a while the algorithm does not understand my motivation: do I think the article does not fit the topic? Do I dislike a particular person? Was there something in the headline I didn't like? In all, I had to exclude news about the deaths of the greats three or four times before they stopped appearing in my recommendations.
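
To show why hiding an article is the weakest of the four signals, here is a continuation of the earlier sketch (it reuses the hypothetical UserProfile and Article classes from above, and the numbers are again invented). Excluding a topic or a source acts as a hard filter, whereas hiding a single article only nudges the topic weight down, which is why the signal sometimes has to be repeated.

```python
def apply_feedback(user, action, article):
    """Interpret one of the four user tools as a change to the profile (toy model)."""
    if action == "subscribe_topic":
        user.subscribed.add(article.topic)
    elif action == "exclude_topic":
        user.excluded_topics.add(article.topic)      # hard filter: the topic is gone
    elif action == "exclude_source":
        user.excluded_sources.add(article.source)    # hard filter: the source is gone
    elif action == "hide_article":
        user.hidden_articles.add(article.article_id)
        # Ambiguous signal: was it the topic, the source, or just this one story?
        # Here it is only a mild down-weight, so similar articles may keep coming
        # until the user repeats the action a few times.
        user.interest[article.topic] = user.interest.get(article.topic, 0.0) - 0.2
```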

Content recommendations in the Google mobile app should in no way be compared with Yandex.Zen or Yandex.News; these are fundamentally different approaches. Yandex.Zen is an aggregator with a closed list of sources, where all you can do is add or remove a site from the collection. There is no work with topics at all. The whole tuning of the algorithm comes down to putting likes and dislikes on materials and believing that the supreme Yandex algorithm correctly interpreted your signal, understood exactly what you liked or disliked, and will take it into account in the future. Of course, you can believe in that, but when the workings of the algorithm are so opaque and the tools for tuning it are so abstract, I find it hard to believe.

Yandex.News, frankly, seems to have been made by TV people rather than IT people. No topics of personal interest, no personalization. The news picture of the world is formed once, for everyone. Everything is as in the Vremya program: a single news agenda for all citizens of the country. You can only choose the preferred source: do I want Channel One to tell me this news, or would Zvezda be better? This is such a non-internet approach that it is almost strange.

In this article, we've covered just one feature of the updated Google app. However, in addition to “what to read” recommendations, Google also has the ability to provide offline tips that can save you a lot of time and money. We will talk about this and other functions of the updated application in the second part.
