robots-txt | Determine if a page may be crawled from robots.txt, robots meta tags | Sitemap library
kandi X-RAY | robots-txt Summary
Determine if a page may be crawled from robots.txt, robots meta tags and robot headers
Community Discussions
Trending Discussions on robots-txt
QUESTION
"gatsby develop" works well. However, an error occurs in 'gatsby build'
ANSWER
Answered 2022-Mar-30 at 05:45

Summarizing a lot: `gatsby develop` is interpreted by the browser, while `gatsby build` is compiled on the Node server (your machine or your deploy server), so the behavior of your code differs slightly, especially in anything related to global objects and SSR (Server-Side Rendering). The fact that your code works under `gatsby develop` means it works under certain specific conditions, not that it always works or has no errors; that can only be inferred if it also succeeds in `gatsby build`.

In your case, it seems that the `posts` data is `undefined` when using the memoized hook (`useMemo`), at least in the initial render.
Try using:
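The original snippet was not captured by the scrape; what follows is a minimal sketch of the usual fix, assuming `posts` comes from a hypothetical page query (the `useMemo` wrapper is elided so the example runs as plain Node):

```javascript
// Hypothetical helper: derive data from `posts`, which can be
// undefined on the first render. The nullish fallback keeps the
// memoized computation from crashing.
function derivePostTitles(posts) {
  // Fall back to an empty array while the data is still undefined
  return (posts ?? []).map((post) => post.title);
}

console.log(derivePostTitles(undefined));         // []
console.log(derivePostTitles([{ title: 'Hi' }])); // [ 'Hi' ]
```

Inside a component you would wrap the same expression in `useMemo`, keeping `posts` in the dependency array.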
QUESTION
ANSWER
Answered 2022-Feb-14 at 19:18

CSS modules in Gatsby v3 onwards need to be imported as ES modules, so your:
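The question's snippet was not captured by the scrape; a sketch of the change, assuming a hypothetical `mycomponent.module.css` file:

```jsx
// Gatsby v3+: import the CSS module as an ES module namespace...
import * as styles from './mycomponent.module.css';

// ...instead of the pre-v3 default import:
// import styles from './mycomponent.module.css';

const MyComponent = () => <div className={styles.wrapper} />;

export default MyComponent;
```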
QUESTION
I'm migrating my site from Gatsby 2 to version 4. It runs perfectly with `gatsby develop`. However, when I run `gatsby build`, I get the following error:
ANSWER
Answered 2022-Jan-30 at 18:28

After some debugging, the issue seems to be related to the Contentful source plugin (`gatsby-source-contentful`), according to some GitHub threads, and to its capability to create internal IDs for the `tag` node (`tags___NODE`).

Aside from waiting for a resolution, you can try updating the plugin to the latest version.
QUESTION
I'm making a Next.js app and using the `next-sitemap` plugin to generate the sitemap.xml and robots.txt files. All is fine, but sometimes Google Lighthouse gives me an error (see screenshot). My robots.txt file is at https://webnos.online/robots.txt.

I've found this error and a solution here, but running `await fetch(new URL('/robots.txt', location.href).href)` in the console returns the correct result (see screenshot), unlike the found solution. Other audit services didn't show me any errors with the robots.txt file. How can I fix this error, or can I ignore it?
ANSWER
Answered 2021-Dec-18 at 08:25

When I moved to Vercel hosting the problem was solved. Before that I used Netlify.
QUESTION
I am trying to build in my production environment (I'm using GitHub Actions to do the deploy), but the problem is that the Node version on GitHub Actions is not the same as my local one. Locally I have this version:
ANSWER
Answered 2021-Oct-01 at 04:43

Regarding "I don't know what the Node version is on GitHub Actions; I can't reproduce the error locally because the versions are not the same": you could use the `setup-node` action to make the version exactly the same as your local one:
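The answer's workflow snippet was not captured; a sketch of the step, assuming a hypothetical local version of 14.17.0 (substitute your own `node -v` output):

```yaml
# Hypothetical job excerpt: pin the Actions runner's Node version
# to the one used locally.
steps:
  - uses: actions/checkout@v2
  - uses: actions/setup-node@v2
    with:
      node-version: '14.17.0'   # match your local `node -v`
  - run: npm ci
  - run: npm run build
```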
QUESTION
I have run `gatsby clean` before `npm run develop`, but it has not made a difference.

My gatsby-node.js file has been looked at by others who are familiar with the Gatsby framework, and they're not sure what the problem is either. Here is my gatsby-node.js file:
ANSWER
Answered 2021-Sep-30 at 07:58

Try running `gatsby clean` first, and then try it again.
QUESTION
The site loses all functionality after building it. In develop mode everything works fine, but when I build the website it looks like all scripts are missing. Bootstrap components (carousel, dropdowns) are not responding, the Leaflet map and background image are not loading, and react-multi-carousel does not work. I don't see any errors in the browser console; of course I ran `gatsby clean` before building. I uploaded the project to Netlify. Below I am enclosing the package.json:
ANSWER
Answered 2021-Apr-13 at 20:59

There's not much to debug in the question; however, it seems that you are using some dependencies outside React's scope, which may potentially block React's hydration process, and that would explain the problem described. For example, using Leaflet instead of React Leaflet or (depending on its implementation) Bootstrap instead of React Bootstrap.

You should be able to change all React-unfriendly modules to React ones without too much effort, and that should fix your problems.

Keep in mind that if your project "works in develop and breaks in build", it doesn't mean that your project works or stops working; it just means that it is working under certain specific circumstances. Basically, and summarizing (to avoid extending the answer): `gatsby develop` uses the browser as an interpreter, where there are, among other things, global objects like `window` or `document`. However, `gatsby build` occurs on the Node server, where at build time there are no global objects because they have not even been created yet. That is the main reason why you may see different behavior between the two scenarios, but it doesn't mean that the project stopped working magically.

You can read further details in the Overview of the Gatsby Build Process.

Another option, linked with blocking React's hydration, is that some component may be blocking that process because of its own behavior. Be careful when using global objects like `window` or `document` (or when importing modules that use them); their use should always be wrapped in the following condition, as you can see in the docs:

When referencing window in a React component.
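The condition referenced here is the standard browser check from the Gatsby docs; a minimal sketch (the `getViewportWidth` helper is illustrative, not from the question):

```javascript
// Only touch `window` when it actually exists, i.e. in the browser.
// During `gatsby build` this code runs in Node, where `window` is
// undefined, so the guard prevents a build-time ReferenceError.
const isBrowser = typeof window !== 'undefined';

function getViewportWidth() {
  if (isBrowser) {
    return window.innerWidth;
  }
  // Fallback used during SSR/build
  return null;
}

console.log(getViewportWidth()); // prints null when run under Node
```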
QUESTION
I have a Gatsby site that has been running smoothly for 3 months online. As of Friday 24th July I have started to receive the below result and users only see a blank screen.
ANSWER
Answered 2020-Dec-14 at 05:47

What did you try so far? As @ksav pointed out, in this GitHub thread there are several ways to fix a similar issue:

- Removing `node_modules` and `.cache`, and installing again
- Removing `node_modules` and `.cache`, pinning Gatsby to `v2.23.3` (or upgrading to `^2.26.1`, where the bug is fixed), and installing again

It seems related to a static query loading bug that can't be reproduced in a fresh install. The final trial is to remove your `package-lock.json`/`yarn.lock` and generate it again.
QUESTION
ReadTheDocs auto-generates a `robots.txt` and `sitemap.xml` for projects. Each time I deploy a new minor version of my project (e.g. `4.1.10`), I hide previous minor versions (e.g. `4.1.9`). ReadTheDocs adds entries for all versions to `sitemap.xml`, but hidden versions are also added to `robots.txt`. The result is that sitemaps submitted to Google Search Console now produce "Submitted URL blocked by robots.txt" errors, since the previous sitemap entry is blocked by the newly generated `robots.txt`.

ReadTheDocs generates a sitemap URL for each version, so we have an entry like this for `4.1.9`, for example:
ANSWER
Answered 2020-Aug-25 at 15:38

After playing around with a few ideas, here is the solution I came up with. Since this question is asked frequently and often opened as a bug against ReadTheDocs on GitHub (which it's not; it just appears to be poorly supported and/or documented), I'll share my workaround here for others to find.

As mentioned above and in the docs, while ReadTheDocs allows you to override the auto-generated `robots.txt` and publish your own, you can't do the same with `sitemap.xml`. It's unclear why. Regardless, you can simply publish a different sitemap; I named mine `sitemap-index.xml`, then told my `robots.txt` to point to this custom sitemap.

For my custom `sitemap-index.xml`, I only put in the pages I care about rather than every generated version (since `stable` and `latest` are really what I want search engines to crawl, not versioned pages):
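The answer's file contents were not captured; a sketch of the `robots.txt` side of the workaround, with a placeholder domain (the real URL depends on your ReadTheDocs project slug):

```
# robots.txt published alongside the docs, overriding the generated one
User-agent: *
Allow: /

# Point crawlers at the hand-maintained sitemap instead of the
# auto-generated sitemap.xml
Sitemap: https://example-project.readthedocs.io/sitemap-index.xml
```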
QUESTION
I am using Gatsby. All was working fine recently until I ran `npm update`, as I wanted to ensure I was up to date. Since then I am getting white pages on navigation, with this error.

I believe this error is only occurring because the page is not loading, and it is not the root cause; I am looking to correct the root cause of the page not loading. Looking around, it seemed like this could be a service worker problem, so I removed the service worker as per the Gatsby guides, but no luck. The error only occurs when navigating to a page, for example:
ANSWER
Answered 2020-Jun-09 at 15:22

The smoking gun was a plugin I was using: "embla-carousel" and "embla-carousel-react". I rolled them back and the issues went away; I am raising an issue on GitHub for the team.
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported