kandi X-RAY | s3_website Summary
Manage an S3 website: sync, deliver via CloudFront, benefit from advanced S3 website features.
Trending Discussions on s3_website
I have created a pipeline in GitLab and I am using Docker as the gitlab-runner. I want to push a Jekyll website to an S3 website, and to do so I am using the s3_website gem. I have 4 stages defined in my pipeline: building the Jekyll site, creating artifacts using Gulp, executing tests on my Jekyll site, and then deploying.
All steps are working fine, but during deployment I'm getting the following error, and I could not figure out how to solve it....
Answered 2019-Jan-18 at 18:05
We ran into this same error on CircleCI. If I understand correctly, the s3_website gem wraps a Java .jar that's using JRuby 1.7, and something must have changed in one of the Docker images or Ruby gems that causes it to start inheriting the system's Ruby 2+ path. As a result, its JRuby 1.7 tries to load Ruby gems that only work in Ruby 2.0 and above, so it runs into errors.
As a workaround, instead of letting the s3_website gem invoke the .jar file itself, I tell the s3_website gem to only download the .jar file, then I manually invoke it:
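The actual commands were not captured on this page; a sketch of the described workaround follows. It assumes s3_website 3.x, where an `install` subcommand downloads the .jar without pushing, and assumes `s3.website.Push` is the jar's main class and that the jar lands somewhere findable on disk; verify both on your system.

```shell
# Download the .jar only, without letting the gem's Ruby wrapper run it.
s3_website install

# Locate the downloaded jar and invoke it directly, so its bundled
# JRuby 1.7 never inherits the system Ruby 2+ load path.
JAR=$(find / -name 's3_website*.jar' 2>/dev/null | head -n 1)
java -cp "$JAR" s3.website.Push
```

Because the jar is invoked outside the gem, the Ruby-2-only gems that caused the original load errors are never on JRuby's path.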
I am using the s3_website gem to push a React build to my bucket. Everything works great, but when I navigate to a custom URL and hit refresh, I get a 404 error because S3 tries to find that object in the bucket instead of executing the route.
I have the default config in the s3_website.yml with no redirect rules....
Answered 2018-Feb-18 at 21:06
In your bucket properties, under Error Document, use the same file as the Index Document, which should be your index.html.
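If you prefer scripting this over clicking through the S3 console, the same setting can be applied with the AWS CLI; the bucket name below is a placeholder.

```shell
# Point both the index document and the error document at index.html,
# so unknown paths fall through to the single-page app's router instead
# of surfacing S3's default 404 page.
# Replace my-bucket with your actual bucket name.
aws s3 website s3://my-bucket/ \
  --index-document index.html \
  --error-document index.html
```

Note that S3 still returns a 404 status code with the error document's body; if that matters for your app, handle it with CloudFront custom error responses instead.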
My stack looks like this:
- Amazon S3
- Amazon CloudFront
- Amazon Route53
I am deploying the generated Jekyll content to an S3 bucket using the s3_website gem. This gem also updates the associated CloudFront distribution. I configured the CloudFront distribution to use my certificate and added the staging URL as a CNAME to the distribution. Lastly, I added 2 aliases from Route 53 to point to the distribution, one for IPv4 and one for IPv6.
Everything seems to be working as expected, except for some bizarre behavior I don't understand. All the pages seem to be served via HTTPS, except for https://staging.jacopretorius.net/archive/. I thought this might be because the URL doesn't have an extension, but https://staging.jacopretorius.net/about/ is being served correctly. This is happening consistently in both Chrome and Safari.
When I dump the headers using curl, they seem to indicate that the one URL (the /about/ one) is always a CloudFront miss....
Answered 2017-Oct-22 at 20:51
The https://staging.jacopretorius.net/archive/ page is being served as secure when I load it. What makes you say it isn't being served securely, exactly? If you type in https:// and see the page content, then the page is being served over HTTPS; if you don't see the green lock icon, the browser just doesn't consider the page fully secure. Click the little (i) icon in the Chrome address bar to see what the actual issue is. The browser wouldn't be displaying any page content at all if it had an issue with the SSL certificate on that page.

I see a message in Chrome that not everything on the page is secure. I think that's because the search form on the page has an http://www.google.com/search target. Try changing that to https://www.google.com/search (and clear the CloudFront cache for /archive/ after making the change).
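Clearing the CloudFront cache for that one path can be done with the AWS CLI; the distribution ID below is a placeholder.

```shell
# Invalidate the cached copy of /archive/ so CloudFront re-fetches the
# page from S3 after the form target has been changed to HTTPS.
# E123EXAMPLE is a placeholder -- substitute your distribution's ID.
aws cloudfront create-invalidation \
  --distribution-id E123EXAMPLE \
  --paths "/archive/*"
```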
No vulnerabilities reported