Is it possible to make website always online?

























Well, I want to make my website always online.



One method: is it possible to cache a whole website using nginx proxy_pass for a long time? And how?
What I mean is: if the backend is down or not working, can the cache still serve my website to visitors?



Another method: a crawler that mirrors the site?



Or snapshot technology, like a search engine's cached pages?


























































































Tags: nginx, caching, proxypass
















      asked Nov 15 '18 at 3:18









      Peter Lee























          1 Answer
































          It is not actually possible for proxy_pass by itself to cache websites (unless the application on the other end keeps its own cache), as all it does is forward requests to another endpoint. "Crawlers" usually refers to search-engine bots that follow a website's links so that all of the site's information can be indexed by the search engine running the crawler.
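
          (Editorial note: while proxy_pass alone does not cache, nginx's proxy module can cache upstream responses with the proxy_cache directives, and proxy_cache_use_stale can serve an expired copy when the backend fails — which is what the question asks for. A minimal sketch; the cache path, zone name "mycache", sizes, and the upstream name "backend" are made-up examples:)

          # define a cache zone (path, name, and sizes are examples)
          proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=mycache:10m
                           max_size=1g inactive=7d;

          server {
              location / {
                  proxy_pass http://backend;   # hypothetical upstream
                  proxy_cache mycache;
                  proxy_cache_valid 200 10m;   # treat 200s as fresh for 10 minutes
                  # serve stale cached content if the backend errors or times out
                  proxy_cache_use_stale error timeout http_500 http_502 http_503 http_504;
              }
          }

          With this, visitors keep getting the last cached copy of a page while the backend is down, as long as the entry has not been evicted from the cache.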



          "Snapshots tech" works and are usually created by CDNs such as Cloudflare/Akamai, and is probably what you are looking for. CDNs are also used for many other things but I assume you are most interested in being able to show a copy of your website if it occasionally goes offline.



          There is also a third option: setting browser-caching headers in nginx. These instruct users' browsers to show a cached copy of your website and not refresh it until the cache expires. The downsides are that your users will not see the live copy of your site even when it is online, and that a user must have visited the page within the cache window for it to be in their cache at all.



          An example:



          location ~* \.(?:js|css|html)$ {
              expires 1d;                          # users' browsers cache these for a day
              add_header Pragma public;
              add_header Cache-Control "public";
          }































          • I only cache for one day, so if there is any update on my website, it will show up the next day.

            – Peter Lee
            Nov 16 '18 at 2:48











          • Yes, that works, but users will also have to have visited that page within the last day to have it in their cache.

            – Orphamiel
            Nov 16 '18 at 4:16











          • Yes, that's no problem.

            – Peter Lee
            Nov 16 '18 at 7:52











          • proxy_cache_key $host$uri$is_args$args; proxy_cache_min_uses 1; proxy_cache_valid 200 720m; proxy_cache_valid 500 502 503 1m; proxy_cache_valid 302 5m; proxy_cache_valid any 5m; proxy_hide_header X-Powered-By; proxy_hide_header Vary; proxy_hide_header Transfer-Encoding;

            – Peter Lee
            Nov 16 '18 at 8:25
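
          (Editorial note: the directives in the comment above only take effect inside a proxied location that has a cache zone attached. Assembled into context — the cache path, the zone name "sitecache", and the upstream name "backend" are assumptions, not from the comment — it would look roughly like:)

          proxy_cache_path /var/cache/nginx keys_zone=sitecache:10m max_size=1g;

          location / {
              proxy_pass http://backend;               # hypothetical upstream
              proxy_cache sitecache;
              proxy_cache_key $host$uri$is_args$args;
              proxy_cache_min_uses 1;
              proxy_cache_valid 200 720m;              # cache successful responses for 12 hours
              proxy_cache_valid 500 502 503 1m;        # cache errors only briefly
              proxy_cache_valid 302 5m;
              proxy_cache_valid any 5m;
              proxy_hide_header X-Powered-By;
              proxy_hide_header Vary;
              proxy_hide_header Transfer-Encoding;
          }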












          edited Nov 16 '18 at 4:15

























          answered Nov 15 '18 at 16:09









          Orphamiel
