codepipeline-nested-cfn | CloudFormation templates, CodeBuild build specification | AWS library

by aws-samples | Python | Version: Current | License: Non-SPDX

kandi X-RAY | codepipeline-nested-cfn Summary

codepipeline-nested-cfn is a Python library typically used in Cloud and AWS applications. codepipeline-nested-cfn has no bugs and no vulnerabilities, but it has low support. However, its build file is not available and it has a Non-SPDX license. You can download it from GitHub.
CloudFormation templates, CodeBuild build specification & Python scripts to perform unit tests of a nested CloudFormation template.

Support

codepipeline-nested-cfn has a low active ecosystem.
It has 221 star(s) with 225 fork(s). There are 47 watchers for this library.
It had no major release in the last 6 months.
There are 5 open issues and 1 has been closed. There are 2 open pull requests and 0 closed ones.
It has a neutral sentiment in the developer community.
The latest version of codepipeline-nested-cfn is current.

Quality

codepipeline-nested-cfn has 0 bugs and 0 code smells.

Security

codepipeline-nested-cfn has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
codepipeline-nested-cfn code analysis shows 0 unresolved vulnerabilities.
There are 0 security hotspots that need review.

License

codepipeline-nested-cfn has a Non-SPDX license.
Non-SPDX licenses can be open-source licenses that are not SPDX-compliant, or non-open-source licenses; review them closely before use.

Reuse

codepipeline-nested-cfn releases are not available. You will need to build from source code and install.
codepipeline-nested-cfn has no build file. You will need to create the build yourself to build the component from source.
Installation instructions are not available. Examples and code snippets are available.
codepipeline-nested-cfn saves you 35 person hours of effort in developing the same functionality from scratch.
It has 95 lines of code, 0 functions and 1 file.
It has low code complexity. Code complexity directly impacts maintainability of the code.

                                                                                  codepipeline-nested-cfn Key Features

                                                                                  CloudFormation templates, CodeBuild build specification & Python scripts to perform unit tests of a nested CloudFormation template.

                                                                                  codepipeline-nested-cfn Examples and Code Snippets

                                                                                  No Code Snippets are available at this moment for codepipeline-nested-cfn.
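That said, to give a feel for what the repository provides, a CodeBuild buildspec that validates and unit-tests a nested CloudFormation template might look something like the sketch below. This is illustrative only; the template path and test script name are hypothetical, not the repo's actual files:

version: 0.2
phases:
  install:
    runtime-versions:
      python: 3.9
  build:
    commands:
      # Syntax-check the parent template of the nested stack
      - aws cloudformation validate-template --template-body file://templates/master-stack.yaml
      # Run the Python unit tests (hypothetical script name)
      - python test_templates.py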
                                                                                  Community Discussions

                                                                                  Trending Discussions on AWS

Python/Docker ImportError: cannot import name 'json' from itsdangerous
Docker push to AWS ECR hangs immediately and times out
What is jsconfig.json
Error: While updating laravel 8 to 9. Script @php artisan package:discover --ansi handling the post-autoload-dump event returned with error code 1
Python Selenium AWS Lambda Change WebGL Vendor/Renderer For Undetectable Headless Scraper
AttributeError: Can't get attribute 'new_block' on <module 'pandas.core.internals.blocks'>
Terraform AWS Provider Error: Value for unconfigurable attribute. Can't configure a value for "acl": its value will be decided automatically
How can I get output from boto3 ecs execute_command?
AWS Graphql lambda query
'AmplifySignOut' is not exported from '@aws-amplify/ui-react'

                                                                                  QUESTION

                                                                                  Python/Docker ImportError: cannot import name 'json' from itsdangerous
                                                                                  Asked 2022-Mar-31 at 12:49

I am trying to get a Flask and Docker application to work, but when I run it with docker-compose up in my Visual Studio terminal it fails with ImportError: cannot import name 'json' from itsdangerous. I have looked for possible solutions, but there are not many yet. The only two I could find both suggest changing the installed versions of MarkupSafe and itsdangerous to a higher version: https://serverfault.com/questions/1094062/from-itsdangerous-import-json-as-json-importerror-cannot-import-name-json-fr and https://github.com/aws/aws-sam-cli/issues/3661. I have also tried creating a virtual environment named veganetworkscriptenv and installing the packages there, but that failed as well. I am currently using Flask 2.0.0 and Docker 5.0.0, and the error occurs on line eight of vegamain.py.

                                                                                  Here is the full ImportError that I get when I try and run the program:

                                                                                  veganetworkscript-backend-1  | Traceback (most recent call last):
veganetworkscript-backend-1  |   File "/app/vegamain.py", line 8, in <module>
veganetworkscript-backend-1  |     from flask import Flask
veganetworkscript-backend-1  |   File "/usr/local/lib/python3.9/site-packages/flask/__init__.py", line 19, in <module>
veganetworkscript-backend-1  |     from . import json
veganetworkscript-backend-1  |   File "/usr/local/lib/python3.9/site-packages/flask/json/__init__.py", line 15, in <module>
veganetworkscript-backend-1  |     from itsdangerous import json as _json
                                                                                  veganetworkscript-backend-1  | ImportError: cannot import name 'json' from 'itsdangerous' (/usr/local/lib/python3.9/site-packages/itsdangerous/__init__.py)
                                                                                  veganetworkscript-backend-1 exited with code 1
                                                                                  

                                                                                  Here are my requirements.txt, vegamain.py, Dockerfile, and docker-compose.yml files:

                                                                                  requirements.txt:

                                                                                  Flask==2.0.0
                                                                                  Flask-SQLAlchemy==2.4.4
                                                                                  SQLAlchemy==1.3.20
                                                                                  Flask-Migrate==2.5.3
                                                                                  Flask-Script==2.0.6
                                                                                  Flask-Cors==3.0.9
                                                                                  requests==2.25.0
                                                                                  mysqlclient==2.0.1
                                                                                  pika==1.1.0
                                                                                  wolframalpha==4.3.0
                                                                                  

                                                                                  vegamain.py:

                                                                                  # Veganetwork (C) TetraSystemSolutions 2022
                                                                                  # all rights are reserved.  
                                                                                  # 
                                                                                  # Author: Trevor R. Blanchard Feb-19-2022-Jul-30-2022
                                                                                  #
                                                                                  
                                                                                  # get our imports in order first
                                                                                  from flask import Flask # <-- error occurs here!!!
                                                                                  
                                                                                  # start the application through flask.
                                                                                  app = Flask(__name__)
                                                                                  
                                                                                  # if set to true will return only a "Hello World" string.
                                                                                  Debug = True
                                                                                  
                                                                                  # start a route to the index part of the app in flask.
                                                                                  @app.route('/')
                                                                                  def index():
                                                                                      if (Debug == True):
                                                                                          return 'Hello World!'
                                                                                      else:
                                                                                          pass
                                                                                  
                                                                                  # start the flask app here --->
                                                                                  if __name__ == '__main__':
                                                                                      app.run(debug=True, host='0.0.0.0') 
                                                                                  

                                                                                  Dockerfile:

                                                                                  FROM python:3.9
                                                                                  ENV PYTHONUNBUFFERED 1
                                                                                  WORKDIR /app
                                                                                  COPY requirements.txt /app/requirements.txt
                                                                                  RUN pip install -r requirements.txt
                                                                                  COPY . /app
                                                                                  

                                                                                  docker-compose.yml:

                                                                                  version: '3.8'
                                                                                  services:
                                                                                    backend:
                                                                                      build:
                                                                                        context: .
                                                                                        dockerfile: Dockerfile
                                                                                      command: 'python vegamain.py'
                                                                                      ports:
                                                                                        - 8004:5000
                                                                                      volumes:
                                                                                        - .:/app
                                                                                      depends_on:
                                                                                        - db
                                                                                  
                                                                                  #  queue:
                                                                                  #    build:
                                                                                  #      context: .
                                                                                  #      dockerfile: Dockerfile
                                                                                  #    command: 'python -u consumer.py'
                                                                                  #    depends_on:
                                                                                  #      - db
                                                                                  
                                                                                    db:
                                                                                      image: mysql:5.7.22
                                                                                      restart: always
                                                                                      environment:
                                                                                        MYSQL_DATABASE: admin
                                                                                        MYSQL_USER: root
                                                                                        MYSQL_PASSWORD: root
                                                                                        MYSQL_ROOT_PASSWORD: root
                                                                                      volumes:
                                                                                        - .dbdata:/var/lib/mysql
                                                                                      ports:
                                                                                        - 33069:3306
                                                                                  

                                                                                  How exactly can I fix this code? thank you!

                                                                                  ANSWER

                                                                                  Answered 2022-Feb-20 at 12:31

I was facing the same issue while running Docker containers with Flask.

I downgraded Flask to 1.1.4 and MarkupSafe to 2.0.1, which solved my issue.

                                                                                  Check this for reference.
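If you would rather stay on Flask 2.0.0 than downgrade, the other commonly reported fix is to pin the transitive dependencies that introduced the break. These pins are what worked for others on Flask 2.0.x; verify them against your own environment. In requirements.txt:

itsdangerous==2.0.1
MarkupSafe==2.0.1

Then rebuild the image (e.g. docker-compose build --no-cache) so the pinned versions are actually installed.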

                                                                                  Source https://stackoverflow.com/questions/71189819

                                                                                  QUESTION

                                                                                  Docker push to AWS ECR hangs immediately and times out
                                                                                  Asked 2022-Mar-30 at 07:53

I'm trying to push my first Docker image to ECR. I've followed the steps provided by AWS, and things seem to go smoothly until the final push, which immediately times out. Specifically, I pass my AWS ECR credentials to docker and get a "login succeeded" message. I then tag the image, which also works. Pushing to the ECR repo gives no error message, just the following:

                                                                                  The push refers to repository [xxxxxxxxxxx.dkr.ecr.ca-central-1.amazonaws.com/reponame]
                                                                                  714c1b96dd83: Retrying in 1 second 
                                                                                  d2cdc77dd068: Retrying in 1 second 
                                                                                  30aad807caf5: Retrying in 1 second 
                                                                                  0559774c4ea2: Retrying in 1 second 
                                                                                  285b8616682f: Retrying in 1 second 
                                                                                  4aeea0ec2b15: Waiting 
                                                                                  1b1312f842d8: Waiting 
                                                                                  c310009e0ef3: Waiting 
                                                                                  a48777e566d3: Waiting 
                                                                                  2a0c9f28029a: Waiting 
                                                                                  EOF
                                                                                  

                                                                                  It tries a bunch of times and then exits with no message. Any idea what's wrong?

                                                                                  ANSWER

                                                                                  Answered 2022-Jan-02 at 14:23

I figured out my issue. I wasn't using the correct credentials. My default credentials were for a personal AWS account, and I needed to add my work profile to my credentials.

EDIT
If you have multiple AWS profiles, you can specify the profile name at docker login as below (assuming you have already run aws configure --profile someprofile):

                                                                                  aws ecr get-login-password --region us-east-1 --profile someprofile | docker login ....
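
For reference, the full pipeline usually looks like the following; the account ID and region here are placeholders, substitute your own:

aws ecr get-login-password --region ca-central-1 --profile someprofile | docker login --username AWS --password-stdin 123456789012.dkr.ecr.ca-central-1.amazonaws.com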
                                                                                  

                                                                                  Source https://stackoverflow.com/questions/70452836

                                                                                  QUESTION

                                                                                  What is jsconfig.json
                                                                                  Asked 2022-Mar-29 at 17:49

If I search this question on the internet, I only get links to the VS Code website and some blogs that implement it.

I want to know: is jsconfig.json specific to VS Code, or to JavaScript/webpack?

What will happen if we deploy the application on AWS / Heroku, etc.? Do we have to make changes?

                                                                                  ANSWER

                                                                                  Answered 2021-Aug-06 at 04:10

                                                                                  This is definitely specific to VSCode.

                                                                                  The presence of jsconfig.json file in a directory indicates that the directory is the root of a JavaScript Project. The jsconfig.json file specifies the root files and the options for the features provided by the JavaScript language service.

                                                                                  Check more details here: https://code.visualstudio.com/docs/languages/jsconfig

You don't need this file when deploying to AWS/Heroku. You can exclude it from your commits if you are using a git repo, i.e., add jsconfig.json to your .gitignore; this makes your project IDE-independent.
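For illustration, a minimal jsconfig.json might look like this (the options shown are common examples, not requirements):

{
  "compilerOptions": {
    "module": "commonjs",
    "target": "es6",
    "baseUrl": "."
  },
  "exclude": ["node_modules"]
}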

                                                                                  Source https://stackoverflow.com/questions/68675994

                                                                                  QUESTION

                                                                                  Error: While updating laravel 8 to 9. Script @php artisan package:discover --ansi handling the post-autoload-dump event returned with error code 1
                                                                                  Asked 2022-Mar-29 at 06:51

Nothing to install, update or remove
Generating optimized autoload files
Class App\Helpers\Helper located in C:/wamp64/www/vuexylaravel/app\Helpers\helpers.php does not comply with psr-4 autoloading standard. Skipping.
> Illuminate\Foundation\ComposerScripts::postAutoloadDump
> @php artisan package:discover --ansi

                                                                                     Error 
                                                                                  
                                                                                    Undefined constant Illuminate\Http\Request::HEADER_X_FORWARDED_ALL
                                                                                    at C:\wamp64\www\vuexylaravel\vendor\fideloper\proxy\config\trustedproxy.php:48
                                                                                       44▕      * - 'HEADER_X_FORWARDED_AWS_ELB' (If you are using AWS Elastic Load Balancer)
                                                                                       45▕      *
                                                                                       46▕      * @link https://symfony.com/doc/current/deployment/proxies.html
                                                                                       47▕      */
                                                                                    ➜  48▕     'headers' => Illuminate\Http\Request::HEADER_X_FORWARDED_ALL,
                                                                                       49▕
                                                                                       50▕ ];
                                                                                       51▕
                                                                                  
                                                                                    1   C:\wamp64\www\vuexylaravel\vendor\laravel\framework\src\Illuminate\Support\ServiceProvider.php:138
                                                                                        require()
                                                                                  
                                                                                    2   C:\wamp64\www\vuexylaravel\vendor\fideloper\proxy\src\TrustedProxyServiceProvider.php:28
                                                                                        Illuminate\Support\ServiceProvider::mergeConfigFrom("C:\wamp64\www\vuexylaravel\vendor\fideloper\proxy\config\trustedproxy.php", "trustedproxy")
                                                                                  Script @php artisan package:discover --ansi handling the post-autoload-dump event returned with error code 1
                                                                                  

                                                                                  ANSWER

                                                                                  Answered 2022-Feb-13 at 17:35

                                                                                  If you are upgrading your Laravel 8 project to Laravel 9 by importing your existing application code into a totally new Laravel 9 application skeleton, you may need to update your application's "trusted proxy" middleware.

                                                                                  Within your app/Http/Middleware/TrustProxies.php file, update use Fideloper\Proxy\TrustProxies as Middleware to use Illuminate\Http\Middleware\TrustProxies as Middleware.

                                                                                  Next, within app/Http/Middleware/TrustProxies.php, you should update the $headers property definition:

                                                                                  // Before...

                                                                                  protected $headers = Request::HEADER_X_FORWARDED_ALL;

                                                                                  // After...

                                                                                  protected $headers =
                                                                                      Request::HEADER_X_FORWARDED_FOR |
                                                                                      Request::HEADER_X_FORWARDED_HOST |
                                                                                      Request::HEADER_X_FORWARDED_PORT |
                                                                                      Request::HEADER_X_FORWARDED_PROTO |
                                                                                      Request::HEADER_X_FORWARDED_AWS_ELB;
                                                                                  

                                                                                  then run composer update

                                                                                  Make sure you are using PHP 8.0
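Since Laravel 9 ships the trusted proxy middleware in the framework itself, you will likely also want to remove the old package once your middleware is updated (assuming nothing else in your app depends on it):

composer remove fideloper/proxy
composer update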

                                                                                  Source https://stackoverflow.com/questions/71103241

                                                                                  QUESTION

                                                                                  Python Selenium AWS Lambda Change WebGL Vendor/Renderer For Undetectable Headless Scraper
                                                                                  Asked 2022-Mar-21 at 20:19
                                                                                  Concept:

Using AWS Lambda functions with Python and Selenium, I want to create an undetectable headless Chrome scraper by passing a headless Chrome test. I check the undetectability of my headless scraper by opening up the test and taking a screenshot. I ran this test on a local IDE and on a Lambda server.

                                                                                  Implementation:

                                                                                  I will be using a python library called selenium-stealth and will follow their basic configuration:

                                                                                  stealth(driver,
                                                                                          languages=["en-US", "en"],
                                                                                          vendor="Google Inc.",
                                                                                          platform="Win32",
                                                                                          webgl_vendor="Intel Inc.",
                                                                                          renderer="Intel Iris OpenGL Engine",
                                                                                          fix_hairline=True,
                                                                                          )
                                                                                  

                                                                                  I implemented this configuration on a Local IDE as well as an AWS Lambda Server to compare the results.

                                                                                  Local IDE:

                                                                                  Found below are the test results running on a local IDE:

                                                                                  Lambda Server:

When I run this on a Lambda server, both the WebGL vendor and renderer are blank, as shown below:

                                                                                  I even tried to manually change the WebGL Vendor/Renderer using the following JavaScript command:

                                                                                  driver.execute_cdp_cmd('Page.addScriptToEvaluateOnNewDocument', {"source": "WebGLRenderingContext.prototype.getParameter = function(parameter) {if (parameter === 37445) {return 'VENDOR_INPUT';}if (parameter === 37446) {return 'RENDERER_INPUT';}return getParameter(parameter);};"})
                                                                                  

Then I thought maybe something was wrong with the parameter number, so I configured the command execution without the if statement, but the same thing happened: it worked on my local IDE but had no effect on an AWS Lambda server.

                                                                                  Simply Put:

Is it possible to set the WebGL vendor/renderer on AWS Lambda? From my efforts, it seems there is no way. I made sure to submit this issue on the selenium-stealth GitHub repository.

                                                                                  ANSWER

                                                                                  Answered 2021-Dec-18 at 02:01
                                                                                  WebGL

WebGL is a cross-platform, open web standard for a low-level 3D graphics API based on OpenGL ES, exposed to ECMAScript via the HTML5 Canvas element. WebGL at its core is a shader-based API using GLSL, with constructs that are semantically similar to those of the underlying OpenGL ES API. It follows the OpenGL ES specification, with some exceptions for memory-managed languages such as JavaScript. WebGL 1.0 exposes the OpenGL ES 2.0 feature set; WebGL 2.0 exposes the OpenGL ES 3.0 API.

Now, with the availability of selenium-stealth, building an undetectable scraper with a Selenium-driven, ChromeDriver-initiated google-chrome browsing context has become much easier.

                                                                                  selenium-stealth

selenium-stealth is a Python package that tries to prevent detection by making Python Selenium more stealthy. However, as of now selenium-stealth only supports Selenium Chrome.

                                                                                  • Code Block:

                                                                                  from selenium import webdriver
                                                                                  from selenium.webdriver.chrome.options import Options
                                                                                  from selenium.webdriver.chrome.service import Service
                                                                                  from selenium_stealth import stealth
                                                                                  
                                                                                  options = Options()
                                                                                  options.add_argument("start-maximized")
                                                                                  options.add_experimental_option("excludeSwitches", ["enable-automation"])
                                                                                  options.add_experimental_option('useAutomationExtension', False)
                                                                                  s = Service('C:\\BrowserDrivers\\chromedriver.exe')
                                                                                  driver = webdriver.Chrome(service=s, options=options)
                                                                                  
                                                                                  # Selenium Stealth settings
                                                                                  stealth(driver,
                                                                                        languages=["en-US", "en"],
                                                                                        vendor="Google Inc.",
                                                                                        platform="Win32",
                                                                                        webgl_vendor="Intel Inc.",
                                                                                        renderer="Intel Iris OpenGL Engine",
                                                                                        fix_hairline=True,
                                                                                    )
                                                                                  
                                                                                  driver.get("https://bot.sannysoft.com/")
                                                                                  
                                                                                • Browser Screenshot:

                                                                                • You can find a detailed relevant discussion in Can a website detect when you are using Selenium with chromedriver?

                                                                                  Changing WebGL Vendor/Renderer in AWS Lambda

AWS Lambda enables us to deliver compressed WebGL websites to end users. When requested webpage objects are compressed, the transfer size is reduced, leading to faster downloads, lower cloud storage fees, and lower data transfer fees. Improved load times also directly influence viewer experience and retention, which helps improve website conversion and discoverability. Using WebGL, websites are more immersive while still being accessible via a browser URL. With this technique, AWS Lambda automatically compresses the objects uploaded to S3.

                                                                                  Background on compression and WebGL

HTTP compression is a capability that can be built into web servers and web clients to improve transfer speed and bandwidth utilization. This capability is negotiated between the server and the client using an HTTP header which may indicate that a resource being transferred, cached, or otherwise referenced is compressed. On the server side, AWS Lambda supports the Content-Encoding header.

                                                                                  On the client-side, most browsers today support brotli and gzip compression through HTTP headers (Accept-Encoding: deflate, br, gzip) and can handle server response headers. This means browsers will automatically download and decompress content from a web server at the client-side, before rendering webpages to the viewer.

                                                                                  Conclusion

Due to this constraint, you may not be able to change the WebGL vendor/renderer in AWS Lambda; doing so could directly affect how webpages are rendered to viewers and turn into a UX bottleneck.

tl;dr

You can find a couple of relevant detailed discussions linked from the source below:

                                                                                  Source https://stackoverflow.com/questions/70265306

                                                                                  QUESTION

AttributeError: Can't get attribute 'new_block' on <module 'pandas.core.internals.blocks'>
                                                                                  Asked 2022-Feb-25 at 13:18

I was using PySpark on AWS EMR (4 r5.xlarge as 4 workers, each with one executor and 4 cores), and I got AttributeError: Can't get attribute 'new_block' on <module 'pandas.core.internals.blocks'>. Below is a snippet of the code that threw this error:

# Imports assumed by this snippet; SearchEngine comes from the uszipcode package,
# and an existing SparkSession `spark` and DataFrame `df` are assumed.
import sqlite3
import numpy as np
import pandas as pd
from pyspark.sql.functions import udf, col
from uszipcode import SearchEngine

search = SearchEngine(db_file_dir="/tmp/db")
                                                                                  conn = sqlite3.connect("/tmp/db/simple_db.sqlite")
                                                                                  pdf_ = pd.read_sql_query('''select  zipcode, lat, lng, 
                                                                                                          bounds_west, bounds_east, bounds_north, bounds_south from 
                                                                                                          simple_zipcode''',conn)
                                                                                  brd_pdf = spark.sparkContext.broadcast(pdf_) 
                                                                                  conn.close()
                                                                                  
                                                                                  
                                                                                  @udf('string')
                                                                                  def get_zip_b(lat, lng):
                                                                                      pdf = brd_pdf.value 
                                                                                      out = pdf[(np.array(pdf["bounds_north"]) >= lat) & 
                                                                                                (np.array(pdf["bounds_south"]) <= lat) & 
                                                                                                (np.array(pdf['bounds_west']) <= lng) & 
                                                                                                (np.array(pdf['bounds_east']) >= lng) ]
                                                                                      if len(out):
                                                                                          min_index = np.argmin( (np.array(out["lat"]) - lat)**2 + (np.array(out["lng"]) - lng)**2)
                                                                                          zip_ = str(out["zipcode"].iloc[min_index])
                                                                                      else:
                                                                                          zip_ = 'bad'
                                                                                      return zip_
                                                                                  
                                                                                  df = df.withColumn('zipcode', get_zip_b(col("latitude"),col("longitude")))
                                                                                  

                                                                                  Below is the traceback, where line 102, in get_zip_b refers to pdf = brd_pdf.value:

                                                                                  21/08/02 06:18:19 WARN TaskSetManager: Lost task 12.0 in stage 7.0 (TID 1814, ip-10-22-17-94.pclc0.merkle.local, executor 6): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
                                                                                    File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/worker.py", line 605, in main
                                                                                      process()
                                                                                    File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/worker.py", line 597, in process
                                                                                      serializer.dump_stream(out_iter, outfile)
                                                                                    File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/serializers.py", line 223, in dump_stream
                                                                                      self.serializer.dump_stream(self._batched(iterator), stream)
                                                                                    File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/serializers.py", line 141, in dump_stream
                                                                                      for obj in iterator:
                                                                                    File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/serializers.py", line 212, in _batched
                                                                                      for item in iterator:
                                                                                    File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/worker.py", line 450, in mapper
                                                                                      result = tuple(f(*[a[o] for o in arg_offsets]) for (arg_offsets, f) in udfs)
File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/worker.py", line 450, in <genexpr>
                                                                                      result = tuple(f(*[a[o] for o in arg_offsets]) for (arg_offsets, f) in udfs)
File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/worker.py", line 90, in <lambda>
                                                                                      return lambda *a: f(*a)
                                                                                    File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/util.py", line 121, in wrapper
                                                                                      return f(*args, **kwargs)
                                                                                    File "/mnt/var/lib/hadoop/steps/s-1IBFS0SYWA19Z/Mobile_ID_process_center.py", line 102, in get_zip_b
                                                                                    File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/broadcast.py", line 146, in value
                                                                                      self._value = self.load_from_path(self._path)
                                                                                    File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/broadcast.py", line 123, in load_from_path
                                                                                      return self.load(f)
                                                                                    File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/broadcast.py", line 129, in load
                                                                                      return pickle.load(file)
AttributeError: Can't get attribute 'new_block' on <module 'pandas.core.internals.blocks'>
                                                                                  

Some observations and thought process:

1. After doing some searching online, the AttributeError in PySpark seems to be caused by mismatched pandas versions between the driver and the workers.

2. But I ran the same code on two different datasets: one worked without any errors and the other didn't, which seems strange and nondeterministic, so it seems the errors may not be caused by mismatched pandas versions; otherwise, neither dataset would succeed.

3. I then ran the same code on the successful dataset again, but this time with a different Spark configuration: setting spark.driver.memory from 2048M to 4192m, and it threw the AttributeError.

4. In conclusion, I think the AttributeError has something to do with the driver, but I can't tell how they are related from the error message, or how to fix it: AttributeError: Can't get attribute 'new_block' on <module 'pandas.core.internals.blocks'>

                                                                                  ANSWER

                                                                                  Answered 2021-Aug-26 at 14:53

I had the same error with pandas 1.3.2 on the server and 1.2 on my client. Downgrading pandas to 1.2 solved the problem.
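If you want to confirm a driver/executor pandas mismatch before changing anything, a quick check along these lines (a sketch; it assumes a live SparkSession named spark) prints the pandas version on the driver and the version(s) seen by the workers:

import pandas as pd

# pandas version on the driver
print("driver:", pd.__version__)

def worker_pandas_version(_):
    # imported inside the function so the lookup happens on the executors
    import pandas
    return pandas.__version__

# collect the pandas version from each of 8 partitions (one task per partition)
versions = spark.sparkContext.parallelize(range(8), 8).map(worker_pandas_version).collect()
print("workers:", sorted(set(versions)))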

                                                                                  Source https://stackoverflow.com/questions/68625748

                                                                                  QUESTION

                                                                                  Terraform AWS Provider Error: Value for unconfigurable attribute. Can't configure a value for "acl": its value will be decided automatically
                                                                                  Asked 2022-Feb-15 at 13:50

Just today, whenever I run terraform apply, I see an error like this: Can't configure a value for "lifecycle_rule": its value will be decided automatically based on the result of applying this configuration.

                                                                                  It was working yesterday.

                                                                                  Following is the command I run: terraform init && terraform apply

                                                                                  Following is the list of initialized provider plugins:

                                                                                  - Finding latest version of hashicorp/archive...
                                                                                  - Finding latest version of hashicorp/aws...
                                                                                  - Finding latest version of hashicorp/null...
                                                                                  - Installing hashicorp/null v3.1.0...
                                                                                  - Installed hashicorp/null v3.1.0 (signed by HashiCorp)
                                                                                  - Installing hashicorp/archive v2.2.0...
                                                                                  - Installed hashicorp/archive v2.2.0 (signed by HashiCorp)
                                                                                  - Installing hashicorp/aws v4.0.0...
                                                                                  - Installed hashicorp/aws v4.0.0 (signed by HashiCorp)
                                                                                  

                                                                                  Following are the errors:

                                                                                  Acquiring state lock. This may take a few moments...
                                                                                  Releasing state lock. This may take a few moments...
                                                                                  ╷
                                                                                  │ Error: Value for unconfigurable attribute
                                                                                  │ 
                                                                                  │   with module.ssm-parameter-store-backup.aws_s3_bucket.this,
                                                                                  │   on .terraform/modules/ssm-parameter-store-backup/s3_backup.tf line 1, in resource "aws_s3_bucket" "this":
                                                                                  │    1: resource "aws_s3_bucket" "this" {
                                                                                  │ 
                                                                                  │ Can't configure a value for "lifecycle_rule": its value will be decided
                                                                                  │ automatically based on the result of applying this configuration.
                                                                                  ╵
                                                                                  ╷
                                                                                  │ Error: Value for unconfigurable attribute
                                                                                  │ 
                                                                                  │   with module.ssm-parameter-store-backup.aws_s3_bucket.this,
                                                                                  │   on .terraform/modules/ssm-parameter-store-backup/s3_backup.tf line 1, in resource "aws_s3_bucket" "this":
                                                                                  │    1: resource "aws_s3_bucket" "this" {
                                                                                  │ 
                                                                                  │ Can't configure a value for "server_side_encryption_configuration": its
                                                                                  │ value will be decided automatically based on the result of applying this
                                                                                  │ configuration.
                                                                                  ╵
                                                                                  ╷
                                                                                  │ Error: Value for unconfigurable attribute
                                                                                  │ 
                                                                                  │   with module.ssm-parameter-store-backup.aws_s3_bucket.this,
                                                                                  │   on .terraform/modules/ssm-parameter-store-backup/s3_backup.tf line 3, in resource "aws_s3_bucket" "this":
                                                                                  │    3:   acl    = "private"
                                                                                  │ 
                                                                                  │ Can't configure a value for "acl": its value will be decided automatically
                                                                                  │ based on the result of applying this configuration.
                                                                                  ╵
                                                                                  ERRO[0012] 1 error occurred:
                                                                                          * exit status 1
                                                                                  

                                                                                  My code is as follows:

                                                                                  resource "aws_s3_bucket" "this" {
                                                                                    bucket = "${var.project}-${var.environment}-ssm-parameter-store-backups-bucket"
                                                                                    acl    = "private"
                                                                                  
                                                                                    server_side_encryption_configuration {
                                                                                      rule {
                                                                                        apply_server_side_encryption_by_default {
                                                                                          kms_master_key_id = data.aws_kms_key.s3.arn
                                                                                          sse_algorithm     = "aws:kms"
                                                                                        }
                                                                                      }
                                                                                    }
                                                                                  
                                                                                    lifecycle_rule {
                                                                                      id      = "backups"
                                                                                      enabled = true
                                                                                  
                                                                                      prefix = "backups/"
                                                                                  
                                                                                      transition {
                                                                                        days          = 90
                                                                                        storage_class = "GLACIER_IR"
                                                                                      }
                                                                                  
                                                                                      transition {
                                                                                        days          = 180
                                                                                        storage_class = "DEEP_ARCHIVE"
                                                                                      }
                                                                                  
                                                                                      expiration {
                                                                                        days = 365
                                                                                      }
                                                                                    }
                                                                                  
                                                                                    tags = {
                                                                                      Name        = "${var.project}-${var.environment}-ssm-parameter-store-backups-bucket"
                                                                                      Environment = var.environment
                                                                                    }
                                                                                  }
                                                                                  

                                                                                  ANSWER

                                                                                  Answered 2022-Feb-15 at 13:49

The Terraform AWS Provider has been upgraded to version 4.0.0, which was published on 10 February 2022.

                                                                                  Major changes in the release include:

                                                                                  • Version 4.0.0 of the AWS Provider introduces significant changes to the aws_s3_bucket resource.
                                                                                  • Version 4.0.0 of the AWS Provider will be the last major version to support EC2-Classic resources as AWS plans to fully retire EC2-Classic Networking. See the AWS News Blog for additional details.
                                                                                  • Version 4.0.0 and 4.x.x versions of the AWS Provider will be the last versions compatible with Terraform 0.12-0.15.

The reason Terraform gives for this change is as follows: To help distribute the management of S3 bucket settings via independent resources, various arguments and attributes in the aws_s3_bucket resource have become read-only. Configurations dependent on these arguments should be updated to use the corresponding aws_s3_bucket_* resource, and once updated, the new aws_s3_bucket_* resources should be imported into Terraform state.

                                                                                  So, I updated my code accordingly by following the guide here: Terraform AWS Provider Version 4 Upgrade Guide | S3 Bucket Refactor

                                                                                  The new working code looks like this:

                                                                                  resource "aws_s3_bucket" "this" {
                                                                                    bucket = "${var.project}-${var.environment}-ssm-parameter-store-backups-bucket"
                                                                                  
                                                                                    tags = {
                                                                                      Name        = "${var.project}-${var.environment}-ssm-parameter-store-backups-bucket"
                                                                                      Environment = var.environment
                                                                                    }
                                                                                  }
                                                                                  
                                                                                  resource "aws_s3_bucket_acl" "this" {
                                                                                    bucket = aws_s3_bucket.this.id
                                                                                    acl    = "private"
                                                                                  }
                                                                                  
                                                                                  resource "aws_s3_bucket_server_side_encryption_configuration" "this" {
                                                                                    bucket = aws_s3_bucket.this.id
                                                                                  
                                                                                    rule {
                                                                                      apply_server_side_encryption_by_default {
                                                                                        kms_master_key_id = data.aws_kms_key.s3.arn
                                                                                        sse_algorithm     = "aws:kms"
                                                                                      }
                                                                                    }
                                                                                  }
                                                                                  
                                                                                  resource "aws_s3_bucket_lifecycle_configuration" "this" {
                                                                                    bucket = aws_s3_bucket.this.id
                                                                                  
                                                                                    rule {
                                                                                      id     = "backups"
                                                                                      status = "Enabled"
                                                                                  
                                                                                      filter {
                                                                                        prefix = "backups/"
                                                                                      }
                                                                                  
                                                                                      transition {
                                                                                        days          = 90
                                                                                        storage_class = "GLACIER_IR"
                                                                                      }
                                                                                  
                                                                                      transition {
                                                                                        days          = 180
                                                                                        storage_class = "DEEP_ARCHIVE"
                                                                                      }
                                                                                  
                                                                                      expiration {
                                                                                        days = 365
                                                                                      }
                                                                                    }
                                                                                  }
                                                                                  

If you don't want to upgrade the Terraform AWS Provider to version 4.0.0, you can pin the existing or an older version by specifying it explicitly in the code, as below:

                                                                                  terraform {
                                                                                    required_version = "~> 1.0.11"
                                                                                    required_providers {
                                                                                      aws  = "~> 3.73.0"
                                                                                    }
                                                                                  }
                                                                                  

                                                                                  Source https://stackoverflow.com/questions/71078462

                                                                                  QUESTION

                                                                                  How can I get output from boto3 ecs execute_command?
                                                                                  Asked 2022-Jan-13 at 19:35

I have an ECS task running on Fargate, on which I want to run a command via boto3 and get back its output. I can do so with the AWS CLI just fine.

                                                                                  ➜ aws ecs execute-command --cluster cluster1 \                                                                                   
                                                                                      --task abc \
                                                                                      --container container1 \
                                                                                      --interactive \
                                                                                      --command 'echo hi'    
                                                                                  
                                                                                  The Session Manager plugin was installed successfully. Use the AWS CLI to start a session.
                                                                                  
                                                                                  Starting session with SessionId: ecs-execute-command-0f913e47ae7801aeb
                                                                                  hi
                                                                                  
                                                                                  Exiting session with sessionId: ecs-execute-command-0f913e47ae7801aeb.
                                                                                  

But I cannot work out how to get the same output in boto3.

                                                                                  ecs = boto3.client("ecs")
                                                                                  ssm = boto3.client("ssm")
                                                                                  exec_resp = ecs.execute_command(
                                                                                      cluster=self.cluster,
                                                                                      task=self.task,
                                                                                      container=self.container,
                                                                                      interactive=True,
                                                                                      command="echo hi",
                                                                                  )
                                                                                  s_active = ssm.describe_sessions(
                                                                                      State="Active",
                                                                                      Filters=[
                                                                                          {
                                                                                              "key": "SessionId",
                                                                                              "value": exec_resp["session"]["sessionId"],
                                                                                          },
                                                                                      ],
                                                                                  )
                                                                                  # Here I get the document for the active session.
                                                                                  doc_active = ssm.get_document(Name=s_active["Sessions"][0]["DocumentName"])
                                                                                  # Now I wait for the session to finish.
                                                                                  s_history = {}
                                                                                  done = False
                                                                                  while not done:
                                                                                      s_history = ssm.describe_sessions(
                                                                                          State="History",
                                                                                          Filters=[
                                                                                              {
                                                                                                  "key": "SessionId",
                                                                                                  "value": exec_resp["session"]["sessionId"],
                                                                                              },
                                                                                          ],
                                                                                      )
                                                                                      done = len(s_history["Sessions"]) > 0
                                                                                  doc_history = ssm.get_document(Name=s_history["Sessions"][0]["DocumentName"])
                                                                                  

Now that the session has terminated, I get another document back, but there still doesn't seem to be any output anywhere. Has anybody gotten output from this? How?

For anybody arriving here seeking a similar solution: I have created a tool that makes this task simple, called interloper. It is mostly thanks to the excellent answer by Andrey.

                                                                                  ANSWER

                                                                                  Answered 2022-Jan-04 at 23:43

OK, basically by reading the SSM Session Manager plugin source code, I came up with the following simplified reimplementation that just grabs the command output (you need to pip install websocket-client construct):

                                                                                  import json
                                                                                  import uuid
                                                                                  
                                                                                  import boto3
                                                                                  import construct as c
                                                                                  import websocket
                                                                                  
                                                                                  ecs = boto3.client("ecs")
                                                                                  ssm = boto3.client("ssm")
# self.cluster / self.task / self.container came from the question's class
# context; plain values from the question's CLI example are used here.
exec_resp = ecs.execute_command(
    cluster="cluster1",
    task="abc",
    container="container1",
    interactive=True,
    command="ls -la /",
)
                                                                                  
                                                                                  session = exec_resp['session']
                                                                                  connection = websocket.create_connection(session['streamUrl'])
                                                                                  try:
                                                                                      init_payload = {
                                                                                          "MessageSchemaVersion": "1.0",
                                                                                          "RequestId": str(uuid.uuid4()),
                                                                                          "TokenValue": session['tokenValue']
                                                                                      }
                                                                                      connection.send(json.dumps(init_payload))
                                                                                  
                                                                                      AgentMessageHeader = c.Struct(
                                                                                          'HeaderLength' / c.Int32ub,
                                                                                          'MessageType' / c.PaddedString(32, 'ascii'),
                                                                                      )
                                                                                  
                                                                                      AgentMessagePayload = c.Struct(
                                                                                          'PayloadLength' / c.Int32ub,
                                                                                          'Payload' / c.PaddedString(c.this.PayloadLength, 'ascii')
                                                                                      )
                                                                                  
                                                                                      while True:
                                                                                          response = connection.recv()
                                                                                  
                                                                                          message = AgentMessageHeader.parse(response)
                                                                                  
                                                                                          if 'channel_closed' in message.MessageType:
                                                                                              raise Exception('Channel closed before command output was received')
                                                                                  
                                                                                          if 'output_stream_data' in message.MessageType:
                                                                                              break
                                                                                  
                                                                                  finally:
                                                                                      connection.close()
                                                                                  
                                                                                  payload_message = AgentMessagePayload.parse(response[message.HeaderLength:])
                                                                                  
                                                                                  print(payload_message.Payload)
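
As a side note (an addition, not part of the original answer): ecs.execute_command fails outright unless ECS Exec was enabled when the task was launched. A small pre-flight check, reusing the question's example identifiers, could look like this:

# DescribeTasks reports whether the task was started with
# --enable-execute-command via the enableExecuteCommand flag.
tasks = ecs.describe_tasks(cluster="cluster1", tasks=["abc"])["tasks"]
if not tasks or not tasks[0].get("enableExecuteCommand"):
    raise RuntimeError("ECS Exec is not enabled on this task")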
                                                                                  

                                                                                  Source https://stackoverflow.com/questions/70367030

                                                                                  QUESTION

                                                                                  AWS Graphql lambda query
                                                                                  Asked 2022-Jan-09 at 17:12

I am not using AWS AppSync for this app. I have created a GraphQL schema and written my own resolvers, with a separate Lambda function for each create and query operation. I used the DynamoDB single-table design and its global secondary indexes.

It was OK for me to create a Book item. In DynamoDB, the table looks as expected (table screenshot omitted).

I am having issues with the data returned by the GraphQL queries. After getting the Items from the DynamoDB table, I have to use a map function and then return the Items based on the GraphQL type. This doesn't feel like an efficient way to do it, and I don't know the best way to query the data. Also, I am getting null for both the author and authors queries.

                                                                                  This is my gitlab-branch.

                                                                                  This is my Graphql Schema

                                                                                  import { gql } from 'apollo-server-lambda';
                                                                                  
                                                                                  const typeDefs = gql`
                                                                                    enum Genre {
                                                                                      adventure
                                                                                      drama
                                                                                      scifi
                                                                                    }
                                                                                  
                                                                                    enum Authors {
                                                                                      AUTHOR
                                                                                    }
                                                                                  
                                                                                    # Root Query - all the queries supported by the schema
                                                                                  
                                                                                    type Query {
                                                                                      """
                                                                                      All Authors query
                                                                                      """
                                                                                      authors(author: Authors): [Author]
                                                                                      books(book: String): [Book]
                                                                                    }
                                                                                  
                                                                                    # Root Mutation - all the mutations supported by the schema
                                                                                    type Mutation {
                                                                                      createBook(input: CreateBook!): Book
                                                                                    }
                                                                                  
                                                                                    """
                                                                                    One Author can have many books
                                                                                    """
                                                                                    type Author {
                                                                                      id: ID!
                                                                                      authorName: String
                                                                                      book: [Book]!
                                                                                    }
                                                                                  
                                                                                    """
                                                                                    Book Schema
                                                                                    """
                                                                                    type Book {
                                                                                      id: ID!
                                                                                      name: String
                                                                                      price: String
                                                                                      publishingYear: String
                                                                                      publisher: String
                                                                                      author: [Author]
                                                                                      description: String
                                                                                      page: Int
                                                                                      genre: [Genre]
                                                                                    }
                                                                                  
                                                                                    input CreateBook {
                                                                                      name: String
                                                                                      price: String
                                                                                      publishingYear: String
                                                                                      publisher: String
                                                                                      author: [CreateAuthor]
                                                                                      description: String
                                                                                      page: Int
                                                                                      genre: [Genre]
                                                                                    }
                                                                                  
                                                                                    input CreateAuthor {
                                                                                      authorName: String!
                                                                                    }
                                                                                  `;
                                                                                  export default typeDefs;

This is how I create the Book item:

                                                                                  import AWS from 'aws-sdk';
                                                                                  import { v4 } from 'uuid';
                                                                                  import { CreateBook } from '../../generated/schema';
                                                                                  
                                                                                  async function createBook(_: unknown, { input }: { input: CreateBook }) {
                                                                                    const dynamoDb = new AWS.DynamoDB.DocumentClient();
                                                                                    const id = v4();
                                                                                  
                                                                                    const authorsName = 
                                                                                      input.author &&
                                                                                      input.author.map(function (item) {
                                                                                        return item['authorName'];
                                                                                      });
                                                                                  
                                                                                    const params = {
                                                                                      TableName: process.env.ITEM_TABLE ? process.env.ITEM_TABLE : '',
                                                                                      Item: {
                                                                                        PK: `AUTHOR`,
                                                                                        SK: `AUTHORS#${id}`,
                                                                                        GSI1PK: `BOOKS`,
                                                                                        GSI1SK: `BOOK#${input.name}`,
                                                                                        name: input.name,
                                                                                        author: authorsName,
                                                                                        price: input.price,
                                                                                        publishingYear: input.publishingYear,
                                                                                        publisher: input.publisher,
                                                                                        page: input.page,
                                                                                        description: input.description,
                                                                                        genre: input.genre,
                                                                                      },
                                                                                    };
                                                                                  
                                                                                    await dynamoDb.put(params).promise();
                                                                                  
                                                                                    return {
                                                                                      ...input,
                                                                                      id,
                                                                                    };
                                                                                  }
                                                                                  
                                                                                  export default createBook;

This is how I query all Books:

                                                                                  import AWS from 'aws-sdk';
                                                                                  
                                                                                  async function books(_: unknown, input: { book: string }) {
                                                                                    const dynamoDb = new AWS.DynamoDB.DocumentClient();
                                                                                  
                                                                                    const params = {
                                                                                      TableName: process.env.ITEM_TABLE ? process.env.ITEM_TABLE : '',
                                                                                      IndexName: 'GSI1',
                                                                                      KeyConditionExpression: 'GSI1PK = :hkey',
                                                                                      ExpressionAttributeValues: {
                                                                                        ':hkey': `${input.book}`,
                                                                                      },
                                                                                    };
                                                                                  
                                                                                    const { Items } = await dynamoDb.query(params).promise();
                                                                                  
  const allBooks = // NEED TO MAP OVER THE ITEMS, THEN RETURN THE DATA BASED ON THE GRAPHQL QUERIES
                                                                                      Items &&
                                                                                      Items.map((i) => {
                                                                                        const genre = i.genre.filter((i) => i);
                                                                                        return {
                                                                                          name: i.name,
                                                                                          author: i.author,
                                                                                          genre,
                                                                                        };
                                                                                      });
                                                                                  
                                                                                    return allBooks;
                                                                                  }
                                                                                  
                                                                                  export default books;

This is my authors query (console result screenshot omitted):

                                                                                  import AWS from 'aws-sdk';
                                                                                  import { Author, Authors } from '../../generated/schema';
                                                                                  
                                                                                  async function authors(
                                                                                    _: unknown,
                                                                                    input: { author: Authors }
): Promise<Author[]> {
                                                                                    const dynamoDb = new AWS.DynamoDB.DocumentClient();
                                                                                  
                                                                                    const params = {
                                                                                      TableName: process.env.ITEM_TABLE ? process.env.ITEM_TABLE : '',
                                                                                      KeyConditionExpression: 'PK = :hkey',
                                                                                      ExpressionAttributeValues: {
                                                                                        ':hkey': `${input.author}`,
                                                                                      },
                                                                                    };
                                                                                  
                                                                                    const { Items } = await dynamoDb.query(params).promise();
                                                                                  
  console.log({ Items }); // I can see the data, but don't know how to return it in the shape of the type below without using a map function
                                                                                  
                                                                                    // type Author {
                                                                                    //   id: ID!
                                                                                    //   authorName: String
                                                                                    //   book: [Book]!
                                                                                    // }
                                                                                  
  return Items; // returns null in the GraphQL playground
                                                                                  }
                                                                                  
                                                                                  export default authors;

                                                                                  Edit: current resolver map

                                                                                  // resolver map - src/resolvers/index.ts
                                                                                  const resolvers = {
                                                                                    Query: {
                                                                                      books,
                                                                                      authors,
                                                                                      author,
                                                                                      book,
                                                                                    },
                                                                                    Mutation: {
                                                                                      createBook,
                                                                                    },
                                                                                  };
                                                                                  

                                                                                  ANSWER

                                                                                  Answered 2022-Jan-09 at 17:06

                                                                                  TL;DR You are missing some resolvers. Your query resolvers are trying to do the job of the missing resolvers. Your resolvers must return data in the right shape.

                                                                                  In other words, your problems are with configuring Apollo Server's resolvers. Nothing Lambda-specific, as far as I can tell.

                                                                                  Write and register the missing resolvers.

GraphQL doesn't know how to "resolve" an author's books, for instance. Add an Author { books(parent) } entry to Apollo Server's resolver map. The corresponding resolver function should return a list of book objects (i.e. [Book]), as your schema requires. Apollo's docs have a similar example you can adapt.

                                                                                  Here's a refactored author query, commented with the resolvers that will be called:

query {
  author(id: "1") {       # Query { author } resolver
    authorName
    books {               # Author { books(parent) } resolver
      name
      authors {           # Book { authors(parent) } resolver
        id
      }
    }
  }
}
                                                                                  

Apollo Server uses the resolver map during query execution to decide what resolvers to call for a given query field. It's not a coincidence that the map looks like your schema. Resolver functions are called with parent, args, context and info arguments, which give your functions the context to fetch the right records from the data source.

// resolver map - passed to the Apollo Server constructor
const resolvers = {
  Query: {
    books,
    authors,
    author,
    book,
  },

  Author: {
    // parent is the author - the resolver should return a list of books
    books(parent) { return getAuthorBooks(parent); },
  },

  Book: {
    // parent is the book - the resolver should return a list of authors
    authors(parent) { return getBookAuthors(parent); },
  },
};
                                                                                  
                                                                                  Your query resolvers are trying to do too much work.

                                                                                  It's not the author query resolver's job to resolve all the child fields. Apollo Server will call multiple resolvers multiple times during query execution:

You can think of each field in a GraphQL query as a function or method of the previous type which returns the next type. In fact, this is exactly how GraphQL works. Each field on each type is backed by a function called the resolver which is provided by the GraphQL server developer. When a field is executed, the corresponding resolver is called to produce the next value.

Apollo Server calls this the resolver chain. The books(parent) resolver will be invoked with the Author as its parent argument. You can use the author's id to look up their books.
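For example, getAuthorBooks can be a simple filter over the data source. Here is a minimal sketch, assuming an in-memory array of book records (the array, the record shape and the names are hypothetical):

// minimal sketch - allBooks and its record shape are assumptions
const allBooks = [
  { id: 'b1', name: 'Book One', authorIds: ['1'] },
  { id: 'b2', name: 'Book Two', authorIds: ['1', '2'] },
];

function getAuthorBooks(author) {
  // the parent author's id is all the child resolver needs
  return allBooks.filter((book) => book.authorIds.includes(author.id));
}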

                                                                                  Your resolver return values must be consistent with the schema.

                                                                                  Make sure your resolvers are returning data in the shape required by the schema. Your author resolver is apparently returning a map {Items: [author-record]}, but your schema says it needs to be a list.
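If the data is in DynamoDB, the fix is to unwrap Items before returning. A sketch assuming the AWS SDK v2 DocumentClient (the table name and key names are assumptions, mirroring the PK/SK arguments in your query signature):

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

// Query { author } resolver - a sketch, not your actual implementation
async function author(_, { PK, SK }) {
  const result = await docClient
    .query({
      TableName: 'Authors',
      KeyConditionExpression: 'PK = :pk AND SK = :sk',
      ExpressionAttributeValues: { ':pk': PK, ':sk': SK },
    })
    .promise();
  // return the list itself, not the { Items: [...] } wrapper,
  // so the shape matches the schema's [Author]
  return result.Items;
}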

(If I were you, I would change the author query signature from author(PK: String, SK: String): [Author] to something more caller-friendly like author(id: ID): Author. Return an Object, not a List. Hide the DynamoDB implementation details in the resolver function. Apollo Server has an ID scalar type that is serialised as a String.)
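In schema terms, that suggestion would look something like this (a sketch; gql comes from the apollo-server package):

const { gql } = require('apollo-server');

const typeDefs = gql`
  type Query {
    # one author per id - the DynamoDB PK/SK stays inside the resolver
    author(id: ID): Author
  }
`;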

                                                                                  Source https://stackoverflow.com/questions/70577447

                                                                                  QUESTION

                                                                                  'AmplifySignOut' is not exported from '@aws-amplify/ui-react'
                                                                                  Asked 2021-Dec-19 at 14:09

I ran into this issue today, and it only started today. I ran the usual sequence of installs and pushes to build the app:

                                                                                  npx create-react-app exampleapp
                                                                                  npm start
                                                                                  amplify init
                                                                                  amplify add api
amplify push
                                                                                  npm install aws-amplify @aws-amplify/ui-react
                                                                                  amplify add auth
                                                                                  amplify push
                                                                                  

Made my changes to index.js and App.js as usual...

                                                                                  index.js:

                                                                                  import React from 'react';
                                                                                  import ReactDOM from 'react-dom';
                                                                                  import './index.css';
                                                                                  import App from './App';
                                                                                  import reportWebVitals from './reportWebVitals';
                                                                                  import Amplify from 'aws-amplify';
                                                                                  import aws_exports from './aws-exports'
                                                                                  
                                                                                  Amplify.configure(aws_exports);
                                                                                  
ReactDOM.render(
  <React.StrictMode>
    <App />
  </React.StrictMode>,
  document.getElementById('root')
);
                                                                                  
                                                                                  reportWebVitals();
                                                                                  

                                                                                  App.js:

                                                                                  import React from 'react';
                                                                                  import './App.css';
                                                                                  import { withAuthenticator, AmplifySignOut, Authenticator } from '@aws-amplify/ui-react';
                                                                                  import { Amplify, Auth } from 'aws-amplify';
                                                                                  import awsExports from './aws-exports';
                                                                                  
                                                                                  import awsconfig from './aws-exports';
                                                                                  
                                                                                  Amplify.configure(awsconfig);
                                                                                  Auth.configure(awsconfig);
                                                                                  
function App() {
  return (
    <div className="App">
      Help!
      <AmplifySignOut />
    </div>
  );
}
                                                                                  
                                                                                  export default withAuthenticator(App);
                                                                                  

                                                                                  If I add AmplifySignOut it throws the error: 'AmplifySignOut' is not exported from '@aws-amplify/ui-react'

                                                                                  If I remove AmplifySignOut, then the login appears but it has no formatting as per the Amazon Authentication style (orange button etc.).

I can add import '@aws-amplify/ui-react/styles.css'; and I get some styling back, but I really need things back to how they were working. Any help would be appreciated!

                                                                                  ANSWER

                                                                                  Answered 2021-Nov-20 at 19:28

I am following along with the Amplify tutorial and hit this roadblock as well. It looks like they just upgraded the React UI components from 1.2.5 to 2.0.0: https://github.com/aws-amplify/docs/pull/3793

Downgrading ui-react to 1.2.5 brings back AmplifySignOut and the other components used in the tutorials.

                                                                                  in package.json:

                                                                                  "dependencies": {
                                                                                      "@aws-amplify/ui-react": "^1.2.5",
                                                                                     ...
                                                                                  }
                                                                                  

                                                                                  Alternatively, you'll need to look into the version 2 docs to find suitable replacements: https://ui.docs.amplify.aws/components/authenticator
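For example, in version 2 the withAuthenticator higher-order component injects signOut and user props, so a plain button can replace the old AmplifySignOut component. A minimal sketch based on the v2 docs:

import React from 'react';
import { withAuthenticator } from '@aws-amplify/ui-react';
import '@aws-amplify/ui-react/styles.css';

// signOut and user are injected by withAuthenticator in v2
function App({ signOut, user }) {
  return (
    <div className="App">
      <p>Signed in as {user.username}</p>
      <button onClick={signOut}>Sign out</button>
    </div>
  );
}

export default withAuthenticator(App);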

                                                                                  Source https://stackoverflow.com/questions/70036160

Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.

                                                                                  Vulnerabilities

                                                                                  No vulnerabilities reported

                                                                                  Install codepipeline-nested-cfn

                                                                                  You can download it from GitHub.
You can use codepipeline-nested-cfn like any standard Python library. You will need a development environment with a Python distribution (including header files), a compiler, pip, and git installed. Make sure that your pip, setuptools, and wheel are up to date. When using pip, it is generally recommended to install packages in a virtual environment to avoid changes to the system Python.
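A minimal setup along those lines might look like this (the checkout directory simply mirrors the repository name):

git clone https://github.com/aws-samples/codepipeline-nested-cfn.git
cd codepipeline-nested-cfn
python -m venv .venv
source .venv/bin/activate
pip install --upgrade pip setuptools wheel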

                                                                                  Support

For any new features, suggestions, or bugs, create an issue on GitHub. If you have any questions, check and ask on Stack Overflow.
                                                                                  CLONE
                                                                                • HTTPS

                                                                                  https://github.com/aws-samples/codepipeline-nested-cfn.git

                                                                                • CLI

                                                                                  gh repo clone aws-samples/codepipeline-nested-cfn

• SSH

                                                                                  git@github.com:aws-samples/codepipeline-nested-cfn.git



Consider Popular AWS Libraries

• localstack by localstack
• og-aws by open-guides
• aws-cli by aws
• awesome-aws by donnemartin
• amplify-js by aws-amplify

Try Top Libraries by aws-samples

• aws-cdk-examples by aws-samples (Python)
• aws-serverless-workshops by aws-samples (JavaScript)
• aws-workshop-for-kubernetes by aws-samples (Shell)
• aws-serverless-airline-booking by aws-samples (JavaScript)

