
How to solve the LCP issue on Core Web Vitals caused by Google AdSense ads?

 

The performance of a blog or website will eventually drop when it carries third-party ads such as Google AdSense and other digital advertising. To avoid that, a special method is needed to load the ads so that loading speed and website performance are maintained.


Generally, the method used to load ads without affecting the performance of the website/blog is to apply lazy loading to the ads (AdSense), so here I will give a fairly complete review of lazy load AdSense.


You can solve the LCP issue on Core Web Vitals caused by Google AdSense ads with Lazy Load AdSense.


Without Lazy Load AdSense, your LCP can be poor, often higher than 4 s. A good LCP is lower than 2.5 s.



What is Lazy Load Adsense?


Lazy Load AdSense is a lazy-loading (smart-loading) script for AdSense ads that reduces the impact of third-party code on the website and keeps website performance optimal.




There are several methods for smart loading of AdSense ads, including ad event listeners, event-based ad requests, and lazy loading; all three can be referred to as lazy load AdSense.



Why Should You Use Lazy Load Adsense?


Currently there is no better method than lazy load AdSense for dealing with the website performance issues caused by third-party ads.

There are many website/blog performance problems that lazy load AdSense can help with, including:
  • Third party webfont load
  • JavaScript execution time
  • Main-thread work
  • Impact of third-party code, and so on
That's why my title says that lazy load AdSense is the best solution for the performance of websites or blogs that run third-party ads.

Without further ado, I will share the lazy load AdSense script with all of you. There are three lazy load AdSense scripts that I will share; all of them are equally good for maintaining the performance of your website, and they also support Auto ads.



Instructions for Using Lazy Load Adsense


Using the lazy load AdSense script is very easy: you only need to install one of the lazy load AdSense script codes in the head or body of your website, then remove all AdSense JavaScript library tags from the website/blog template.

Change your AdSense script:


<script async src="https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script>

<!-- or a newer version of the JavaScript library -->
<script async data-ad-client="ca-pub-1234567890123456" src="https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script>

<!-- or the updated version -->
<script async src="https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js?client=ca-pub-4209467777280582" crossorigin="anonymous"></script>


To:


<script type="text/javascript">//<![CDATA[
var lazyloadads=false;window.addEventListener("scroll",function(){(0!=document.documentElement.scrollTop&&false===lazyloadads||0!=document.body.scrollTop&&false===lazyloadads)&&(!function(){var e=document.createElement("script");e.type="text/javascript",e.async=true,e.src="https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js?client=ca-pub-420946777728xxxx";var a=document.getElementsByTagName("script")[0];a.parentNode.insertBefore(e,a)}(),lazyloadads=true)},true);
//]]></script>

Replace the publisher ID (ca-pub-420946777728xxxx) with your own AdSense publisher code.



How Does the Lazy Load AdSense Script Work?


Lazy Load AdSense is completely different from lazy loading images, because lazy load AdSense is basically a scroll event listener: the AdSense JavaScript library in the lazy load script is only loaded and run when the user scrolls your page.

That means AdSense ads only appear once the user scrolls your blog or website; if there is no page-scrolling activity from the user, the ads are not loaded (not displayed).
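If you want to see what the minified script actually does, here is a readable, commented version of the same logic (the publisher ID is still a placeholder and must be replaced with your own):

<script type="text/javascript">
// Readable version of the scroll-triggered AdSense loader above.
var lazyloadads = false;
window.addEventListener("scroll", function () {
  // Run only on the first real scroll, and only once.
  if ((document.documentElement.scrollTop != 0 && lazyloadads === false) ||
      (document.body.scrollTop != 0 && lazyloadads === false)) {
    // Inject the AdSense library only now, when the user starts scrolling.
    var e = document.createElement("script");
    e.type = "text/javascript";
    e.async = true;
    e.src = "https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js?client=ca-pub-420946777728xxxx";
    var a = document.getElementsByTagName("script")[0];
    a.parentNode.insertBefore(e, a);
    lazyloadads = true;
  }
}, true);
</script>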



Lazy Load AdSense from the AdSense Policy Perspective


Google is quite open about implementing lazy loading on its ads (AdSense). See the Google Help Center article titled "Viewability best practices"; on that page, Google recommends the use of lazy load AdSense, also known as smart loading.





For detailed step-by-step instructions, you can watch the video below.


 



That means Google has no objection to the use of lazy load AdSense. However, according to Google, the best practice for implementing lazy loading of ads is to use the Google Publisher Tag (GPT).
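For context, a minimal GPT setup with lazy loading enabled looks roughly like the sketch below. The ad unit path, size and div ID are placeholders, and the margin values are only example settings for GPT's enableLazyLoad() option.

<script async src="https://securepubads.g.doubleclick.net/tag/js/gpt.js"></script>
<script>
  window.googletag = window.googletag || { cmd: [] };
  googletag.cmd.push(function () {
    // Define an ad slot (the ad unit path and div ID here are placeholders).
    googletag.defineSlot("/1234567/example-ad-unit", [300, 250], "div-gpt-ad-example")
             .addService(googletag.pubads());
    // Turn on GPT's built-in lazy loading; the margins are example values.
    googletag.pubads().enableLazyLoad({
      fetchMarginPercent: 500,   // fetch ads up to 5 viewports ahead
      renderMarginPercent: 200,  // render ads up to 2 viewports ahead
      mobileScaling: 2.0         // double the margins on mobile
    });
    googletag.enableServices();
  });
</script>

<!-- In the body, where the ad should appear -->
<div id="div-gpt-ad-example">
  <script>googletag.cmd.push(function () { googletag.display("div-gpt-ad-example"); });</script>
</div>

Note that GPT is the tag used with Google Ad Manager; sites that only run plain AdSense will usually stick with the scroll-listener approach above.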



Reference


https://www.cordialblogger.com/2021/04/lazy-load-adsense.html
https://nanokaryamandiri.com/

How to create and customize a robots.txt file on Blogger?

      


    Before creating a robots.txt file, it's a good idea to first know what it is. A robots.txt file tells search engine crawlers which URLs on your site they are allowed to access. It can be used to keep crawlers away from certain URLs, but it cannot keep a URL out of Google's index. To keep a URL out of the index, use the noindex directive on that URL instead.


Limitations of the robots.txt file

    If you want to create or edit a robots.txt file, you should first understand the limits of this URL-blocking method. Depending on your goals and situation, consider other mechanisms to ensure your URLs are not discoverable on the web.

  • The robots.txt rules may not be supported by every search engine. The instructions in a robots.txt file cannot force crawler behavior on your site; it is up to each crawler whether to comply with them. While Googlebot and other well-known web crawlers obey the instructions in a robots.txt file, other crawlers may not. Therefore, if you want to keep information safe from web crawlers, use other blocking methods, such as password-protecting private files on your server.
  • Different crawlers interpret the syntax in different ways. Although well-known web crawlers follow the rules in a robots.txt file, each crawler may interpret them differently. You should know the appropriate syntax for addressing different web crawlers, because some may not understand certain instructions.

  • Pages disallowed in robots.txt can still be indexed if they are linked from other sites. While Google will not crawl or index content blocked by robots.txt, it may still find and index a disallowed URL if it is linked from elsewhere on the web. As a result, the URL address and, possibly, other publicly available information such as anchor text in links to the page can still appear in Google search results. To prevent a URL from appearing in Google search results, password-protect the files on your server, use a noindex response header or meta tag (see the example after this list), or remove the page entirely.
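For reference, the noindex signal mentioned in the last point can be sent either as a meta tag in the page or as an HTTP response header; both forms are sketched below.

<!-- Meta tag placed in the <head> of the page you want kept out of the index -->
<meta name="robots" content="noindex">

Or as an HTTP response header sent by the server:

X-Robots-Tag: noindex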



How to create and customize robots.txt file on blogger

So how do you create or customize robots.txt in Blogger?


Here are the steps:


Step 1. Go to your Blogger admin dashboard.


Step 2. Select "Settings", scroll down to the "Crawlers and indexing" section, and find "Custom robots.txt" (turn on "Enable custom robots.txt" first if it is off).


Step 3. Fill in your robots.txt rules.


Step 4. Click Save.


The default behavior is that every user agent is allowed to crawl the entire site:


User-agent: *

Disallow:

# or

User-agent: *

Allow: /


# Example 1: Block only Googlebot

User-agent: Googlebot

Disallow: /


# Example 2: Block Googlebot and Adsbot

User-agent: Googlebot

User-agent: AdsBot-Google

Disallow: /


# Example 3: Block all but AdsBot crawlers

User-agent: *

Disallow: /


# Example 4: Block only Googlebot on folder "nogooglebot"

User-agent: Googlebot

Disallow: /nogooglebot/


# Example 5: Block Googlebot everywhere except the folder "nogooglebot"

User-agent: Googlebot

Disallow: /

Allow: /nogooglebot/
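For a Blogger blog specifically, a typical custom robots.txt looks like the example below (this mirrors Blogger's usual default; replace the sitemap URL with your own blog address):

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblog.blogspot.com/sitemap.xml

Here "Disallow: /search" keeps crawlers out of Blogger's label and search-result pages, while the Mediapartners-Google entry leaves the AdSense crawler unrestricted.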





    Okay, that's enough from me for now. If you have any questions, please leave them in the comments column. I apologize if anything is wrong. I hope this is helpful, and thank you.


"Look for someone who is willing to accept your situation, your family and your job. In fact, happiness is about being together and being grateful." 



Reference:

https://www.google.com





How to Create and Submit a Blogger Sitemap to Google Search Console?

What Is An XML Sitemap?

In simple terms, an XML sitemap is a list of a website's URLs. That's why it's called a sitemap: it maps out how the website is structured and what it includes. ("XML" stands for "Extensible Markup Language," a format for structuring information so that programs such as search engine crawlers can read it.)
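To make that concrete, a minimal XML sitemap looks roughly like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourblog.blogspot.com/2021/04/example-post.html</loc>
    <lastmod>2021-04-01</lastmod>
  </url>
</urlset>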







How to Create a Sitemap Page on Blogger?

 






The main function of a sitemap page like this is to make it easier for visitors to find the contents of our website, while the XML sitemap described above is the one you submit to Google Search Console.
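Blogger generates the XML sitemap automatically. Assuming a standard blogspot address, these are the sitemap URLs you would typically submit under "Sitemaps" in Google Search Console (replace the domain with your own):

https://yourblog.blogspot.com/sitemap.xml        (blog posts)
https://yourblog.blogspot.com/sitemap-pages.xml  (static pages)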
