serverless-app | Using AWS Lambda with API Gateway | AWS library
kandi X-RAY | serverless-app Summary
Using AWS Lambda with API Gateway and S3 for imagemagick tasks.
Community Discussions
Trending Discussions on serverless-app
QUESTION
I have created a Lambda function using the AWS SAM CLI, which is deployed as a container image. The problem is that the requirements are downloaded every time I make a small change to the code (app.py) and run sam build. The reason can be understood from the Dockerfile below.
Dockerfile
...ANSWER
Answered 2022-Mar-27 at 16:53

With the way Docker caching works, everything after your `COPY` statements is invalidated in the cache (assuming the copied files changed). The way dependencies are often retained in the cache is by adding only what is necessary to install the dependencies, installing them, and then adding your service code once the dependencies are installed. In the example below, `pip install` will only run again if requirements.txt changes.
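A minimal sketch of such a Dockerfile, assuming the AWS Lambda Python base image and a handler named `lambda_handler` in app.py (the Python version and handler name are assumptions, not from the original post):

```dockerfile
FROM public.ecr.aws/lambda/python:3.9

# Copy only the dependency manifest first, so this layer stays cached
# as long as requirements.txt itself is unchanged.
COPY requirements.txt ${LAMBDA_TASK_ROOT}/
RUN pip install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"

# Copy the service code last; editing app.py no longer invalidates
# the pip install layer above.
COPY app.py ${LAMBDA_TASK_ROOT}/

CMD ["app.lambda_handler"]
```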
QUESTION
I'm not seeing how an AWS Kinesis Firehose lambda can send update and delete requests to ElasticSearch (AWS OpenSearch service).
Elasticsearch document APIs provides for CRUD operations: https://www.elastic.co/guide/en/elasticsearch/reference/current/docs.html
The examples I've found deal with the Create case, but don't show how to do `delete` or `update` requests.
https://aws.amazon.com/blogs/big-data/ingest-streaming-data-into-amazon-elasticsearch-service-within-the-privacy-of-your-vpc-with-amazon-kinesis-data-firehose/
https://github.com/amazon-archives/serverless-app-examples/blob/master/python/kinesis-firehose-process-record-python/lambda_function.py
The output format in the examples does not show a way to specify `create`, `update`, or `delete` requests:
ANSWER
Answered 2022-Mar-03 at 04:20

Firehose uses a Lambda function to transform records before they are delivered to the destination, in your case OpenSearch (ES), so the function can only modify the structure of the data and can't influence CRUD actions. Firehose can only insert records into a specific index. If you need a simple option to remove records from an ES index after a certain period of time, have a look at the "Index rotation" option when specifying the destination for your Firehose stream.
If you want to use CRUD actions with ES and keep using Firehose, I would suggest sending records to an S3 bucket in raw format and then triggering a Lambda function on the object-upload event that performs a CRUD action depending on fields in your payload.
A good example of performing CRUD actions against ES from a Lambda function: https://github.com/chankh/ddb-elasticsearch/blob/master/src/lambda_function.py
This particular example is built to send data from DynamoDB Streams into ES, but it should be a good starting point for you.
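As a rough sketch of that pattern (the field names `action`, `id`, and `doc` here are hypothetical, not from the linked example): such a Lambda could map an action field in each record to the corresponding Elasticsearch document API request. Actually sending the request to an Amazon OpenSearch domain would additionally require SigV4 request signing, which is omitted here.

```python
import json

def es_request(index, record):
    """Map a hypothetical record {"action", "id", "doc"} to an
    Elasticsearch document API call as (method, path, body)."""
    doc_id = record["id"]
    action = record["action"]
    if action == "create":
        # Index (create/replace) the document under an explicit id.
        return ("PUT", f"/{index}/_doc/{doc_id}", json.dumps(record["doc"]))
    if action == "update":
        # Partial update via the _update endpoint.
        return ("POST", f"/{index}/_update/{doc_id}",
                json.dumps({"doc": record["doc"]}))
    if action == "delete":
        return ("DELETE", f"/{index}/_doc/{doc_id}", None)
    raise ValueError(f"unknown action: {action}")
```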
QUESTION
I have the following YAML file
template.yaml
...ANSWER
Answered 2022-Mar-02 at 19:17

Comments (and blank lines, which are treated as comments) in ruamel.yaml are currently (0.17) associated with a node that comes before them. In the case of a comment occurring inside a mapping, it is associated with the preceding key.
So what you need to do is reassociate the comment with the new key.
QUESTION
I am working with a simple AWS Lambda function:
...ANSWER
Answered 2022-Jan-23 at 06:44

Lambda standalone from console
The `event` that your Lambda function gets from the API and the one used when you run the function from the console are different. The `event` passed from the API to your function has a fixed, known format, but when you run the function from the console you are passing the `event` in an incorrect format, so everything breaks.
You have to ensure that the `event` structure you use when running the code in the console matches the `event` structure that the API produces.
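For reference, a minimal sketch of the shape of an API Gateway Lambda proxy event that could be pasted into the console test dialog (fields trimmed for brevity; real events carry many more, and the path and query values here are made up):

```json
{
  "resource": "/hello",
  "path": "/hello",
  "httpMethod": "GET",
  "headers": { "Accept": "application/json" },
  "queryStringParameters": { "name": "world" },
  "pathParameters": null,
  "body": null,
  "isBase64Encoded": false
}
```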
QUESTION
I built a simple HelloWorld API using a lambda function and APIGateway. I'm using Cloudformation.
The Lambda function runs fine when I run it using `aws lambda invoke`.
The API runs locally using `sam local start-api`.
But when I deploy it using `sam deploy` (after using `package`, of course), the API returns status code 500.
This is the log that I get when I try to test it.
...ANSWER
Answered 2022-Feb-14 at 23:52

Lambda proxy integrations should only use POST, not `GET`. So it should be:
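A sketch of the relevant CloudFormation fragment (the resource names `HelloApi`, `HelloResource`, and `HelloFunction` are hypothetical): the client-facing method may be GET, but the proxy integration itself must invoke the Lambda with POST.

```yaml
HelloMethod:
  Type: AWS::ApiGateway::Method
  Properties:
    RestApiId: !Ref HelloApi
    ResourceId: !Ref HelloResource
    HttpMethod: GET                 # client-facing method can be GET
    AuthorizationType: NONE
    Integration:
      Type: AWS_PROXY
      IntegrationHttpMethod: POST   # Lambda invocations must use POST
      Uri: !Sub arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${HelloFunction.Arn}/invocations
```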
QUESTION
I have a serverless function I'm building using AWS SAM within Visual Studio Code. The runtime I'm using is nodejs12.x, but I'm writing everything in TypeScript and then compiling it to JS into a `/dist` directory. That's the directory that all of my CloudFormation templates point to in order to find the handlers. For example, the right is the TS and the left is my compiled JS.
In the sidebar you can see the `/dist` directory where my JS files are placed after I run `tsc`, while a little further down are my template and TypeScript source.
My template then looks like this:
...ANSWER
Answered 2021-Aug-19 at 07:19

Yes, it can be done: what you need are source maps.
In my case I had my Lambdas compiled and bundled into a single `index.js` file under the `/dist` folder. I shipped this along with a single `.map` file, using webpack as the bundler.
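Since the question compiles with plain `tsc` rather than webpack, one way to get the `.map` files is a tsconfig.json along these lines (a minimal sketch, with only the options relevant here; the webpack equivalent would be `devtool: 'source-map'` in the bundler config):

```json
{
  "compilerOptions": {
    "target": "ES2019",
    "module": "commonjs",
    "outDir": "./dist",
    "sourceMap": true
  },
  "include": ["src/**/*"]
}
```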
QUESTION
I have an AWS SAM template which creates a Lambda function and a POST method in API Gateway. By default it uses Lambda proxy integration, and it works fine when I test it through the Postman tool, but when I use the API Gateway URL with my sandbox app it displays the following error.
...ANSWER
Answered 2022-Jan-11 at 17:16

The configuration for API creation in API Gateway in the AWS SAM template was not right: SAM deployment uses Lambda proxy integration by default, so the method response requires a few values that cannot be set automatically with the above configuration. Instead I used an OpenAPI specification in which I defined the REST API configuration, and it works fine without any manual intervention after deployment.
Following configuration is fine.
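A sketch of what such an inline OpenAPI definition can look like in a SAM template (the resource names `MyApi` and `MyFunction` and the `/items` path are hypothetical, not from the original answer):

```yaml
MyApi:
  Type: AWS::Serverless::Api
  Properties:
    StageName: prod
    DefinitionBody:
      openapi: "3.0.1"
      paths:
        /items:
          post:
            responses:
              "200":
                description: OK
            x-amazon-apigateway-integration:
              type: aws            # non-proxy integration
              httpMethod: POST
              uri: !Sub arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${MyFunction.Arn}/invocations
              responses:
                default:
                  statusCode: "200"
```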
QUESTION
I followed an AWS tutorial for setting up Lambda + API Gateway using a SAM template, but the event defined under the Lambda template creates a proxy integration. I followed this tutorial because I wanted to set up something similar for one of my projects, but I need a non-proxy integration for that specific use case: I have to return XML to the client, and this can only be done by modifying the integration response, which cannot be modified in proxy APIs. I searched a lot but couldn't find an answer. For now the template.yaml looks like this:
...ANSWER
Answered 2021-Sep-24 at 10:23

What you seek is the "mapping template" functionality of API Gateway. Unfortunately, there is no direct way to do it in AWS SAM.
But you can achieve this by leveraging the OpenAPI support inside AWS SAM, which supports a subset of the API Gateway extensions (the `x-amazon-apigateway-integration.requestTemplates` object).
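A sketch of such an integration entry inside the OpenAPI definition, showing both a request mapping template and an XML response template (the `MyFunction` name and the template bodies are illustrative assumptions):

```yaml
x-amazon-apigateway-integration:
  type: aws                        # "custom" (non-proxy) integration
  httpMethod: POST
  uri: !Sub arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${MyFunction.Arn}/invocations
  requestTemplates:
    # Reshape the incoming request before it reaches the Lambda.
    application/json: '{ "name": "$input.params(''name'')" }'
  responses:
    default:
      statusCode: "200"
      responseTemplates:
        # Render the Lambda's JSON output as XML for the client.
        application/xml: "<response>$input.path('$.message')</response>"
```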
QUESTION
In this documentation there's this snippet of a SAM template:
...ANSWER
Answered 2021-Dec-11 at 15:27

This code is just a snippet from a CloudFormation template. `CognitoUserPoolName` and `CognitoUserPoolClientName` are strings which should be specified by you. One way of doing this is by passing them in as parameters:
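A sketch of how that could look (the default values and the `UserPool` resource name are assumptions for illustration):

```yaml
Parameters:
  CognitoUserPoolName:
    Type: String
    Default: MyUserPool
  CognitoUserPoolClientName:
    Type: String
    Default: MyUserPoolClient

Resources:
  UserPool:
    Type: AWS::Cognito::UserPool
    Properties:
      UserPoolName: !Ref CognitoUserPoolName
```

The values can then be overridden at deploy time, e.g. with `sam deploy --parameter-overrides CognitoUserPoolName=prod-pool`.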
QUESTION
I'm trying to recreate something I did in AWS using Cognito User Pool and Identity Pool. The user was able to login and receive temporary tokens that allowed direct access to an s3 bucket. See here for more info on that. I would like my B2C users to be able to login to my SPA and list containers and blobs and get blobs. I've successfully implemented logging in using MSAL (@azure/msal-browser) with auth flow, but I cannot figure out how to provide access tokens for the storage account (or ANY azure resource for that matter). I've run around in circles in the documentation for days, so if you link a docs page, I'd appreciate some elaboration because I'm obviously not understanding something.
...ANSWER
Answered 2021-Nov-01 at 11:12

Accessing Storage is not supported with a token obtained using a B2C user flow or custom policy. Since you are not able to create a storage account in your Azure AD B2C tenant, you need to create the storage account in a regular Azure AD tenant, and add the user from your B2C AAD to that tenant as a guest in order to access Blob Storage.
For example: the email of my B2C user is bowman@testbowmanb2c.onmicrosoft.com.
And for data operations, the user needs this role:
For more details, refer to this SO thread.
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.