lambda-proxy | A simple AWS Lambda proxy to handle API Gateway requests | REST library
kandi X-RAY | lambda-proxy Summary
A simple AWS Lambda proxy to handle API Gateway requests
Top functions reviewed by kandi - BETA
- Setup documentation for Swagger
- Create OpenAPI response
- Get the parameters from the endpoint
- Add a new route
- Register a GET endpoint
- Checks if a given route matches the given method
- Extract arguments from the path
- Configures the logging
- Check if log is already configured
- Make an HTTP POST request
- Hosted host
- Extract the request path from the event
- Extract apigw stage from the event
- Start a local server
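Taken together, these functions suggest a decorator-style routing API. A rough usage sketch follows; the import path, class name and return convention here are assumptions inferred from the function list above, not the library's documented API, so check the project README for the real interface.

import json

from lambda_proxy.proxy import API  # assumed import path

app = API(name="demo")

@app.route("/hello/<name>", methods=["GET"], cors=True)  # "Add a new route" / "Register a GET endpoint"
def hello(name):
    # "Extract arguments from the path" supplies `name`
    return ("OK", "application/json", json.dumps({"hello": name}))

# The app object is assumed to act as the Lambda entry point, extracting the
# request path and API Gateway stage from the incoming event before dispatching.
def handler(event, context):
    return app(event, context)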
lambda-proxy Key Features
lambda-proxy Examples and Code Snippets
return function(event, context)
{
"resource": "/test",
"path": "/test",
"httpMethod": "POST",
#
# a bunch of data removed for length
#
"body": "{\n \"first_name\": \"John\",\n \"last_name\": \"Smith\",\n \"d_o_b\": \"1985-12-04\",\
return {
'statusCode': 200,
'body': json.dumps({'clicks': int(ddbResponse['Attributes']['clicks'])})
}
{
'statusCode': 200,
'body': int(response['Attributes']['clicks'])
}
Integration: lambda
response:
headers:
Content-Type: "'text/csv'"
Content-Disposition: "'attachment; filename=abc.csv'"
def clo(database, v):
    # return the entry in `database` closest to the supplied coordinates `v`
    return min(database, key=lambda p: dis(v['us'], v['ub'], p['us'], p['ub']))

coord = {'us': usera_in, 'ub': userb_in}
output = clo(List_r5.data, coord)
sta_output = output['NAME']
us_output = output['us']
ub_output = output['ub']
print(r2.content)
b'{"message": "Internal server error"}'
import json
def lambda_handler(event, context):
body = json.loads(event['body'])
print(body)
messag
{
  "resource": "Resource path",
  "path": "Path parameter",
  "httpMethod": "Incoming request's method name",
  "headers": {String containing incoming request headers},
  "multiValueHeaders": {List of strings containing incoming r
getNearestConvenios:
handler: src/controllers/convenio_controller.get_nearest_convenios/{parameter}
events:
- http:
path: convenios/nearest
method: get
cors: True
import json
def lambda_handler(event, context):
body = json.loads(event['body'])
return {
'statusCode': 200,
'body': json.dumps(body['data'])
}
Community Discussions
Trending Discussions on lambda-proxy
QUESTION
I am investigating the feasibility of using a Python Lambda to serve a throttled endpoint connecting to a DynamoDB database. There is basic token authentication, and the counts for the throttling are kept in a DynamoDB table with a TTL of a day. I call it throttling, but it's actually a daily count of requests. All the complicated parts work: if the user is unauthenticated I get the expected response {"message": "No token, or your token is invalid!"}, and if the daily count of requests is exhausted I also get the expected message {"message": "Daily Invocations Exhausted!"}.
However, when the authentication and throttle allow a response from the lambda_handler, I get a 502 {"message": "Internal server error"}.
When I look at the API Gateway test I see this log:
ANSWER
Answered 2021-Nov-29 at 15:56
Your lambda_handler function is decorated, and the associated decorator function is named lambda_limit_per_day. The return value of your Lambda function will therefore be the return value of the decorator's wrapper function.
At present, your wrapper function is not returning anything in the two paths where you handle 'not throttled' and 'within throttling limits' -- your code simply calls function(event, context) but discards the return value of those calls (to the decorated lambda_handler function). So the return value of the wrapper function, and hence of your decorated Lambda function, is implicitly None, and that's a malformed response, causing API Gateway to generate a 502 response.
Change those two instances of function(event, context) to return function(event, context).
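A minimal sketch of the fix, assuming the decorator looks roughly like the one below (the real token check and DynamoDB counting are omitted; the helper names are hypothetical):

import functools

def lambda_limit_per_day(function):
    @functools.wraps(function)
    def wrapper(event, context):
        if not token_is_valid(event):        # hypothetical helper
            return {"statusCode": 401,
                    "body": '{"message": "No token, or your token is invalid!"}'}
        if daily_quota_exhausted(event):      # hypothetical helper
            return {"statusCode": 429,
                    "body": '{"message": "Daily Invocations Exhausted!"}'}
        # The fix: return the decorated handler's response instead of discarding it
        return function(event, context)
    return wrapper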
QUESTION
I am trying to deploy a serverless REST API with NodeJS, AWS Lambda, API Gateway, RDS and PostgreSQL.
So far I've set up the PostgreSQL RDS successfully and before start writing the functions to handle the requests to the DB I thought it'd be a good idea to test a small function first locally to check if the requests are being handled correctly.
So in the root of the project, I installed serverless-offline:
npm install serverless-offline
It threw several warnings during installation of the type:
npm WARN deprecated @hapi/pez@4.1.2: This version has been deprecated and is no longer supported or maintained
(I'm sorry if that information is irrelevant, I'm quite new and don't know what is important and what is not.)
Then I configured my serverless.yml:
...ANSWER
Answered 2021-May-12 at 21:36
Ran into the same issue, but after switching to a package-lock.json file (identical package.json) from a previous project the issue was resolved. So I assume there is a dependency that's causing this issue, but I'm sorry to say I haven't been able to identify what that dependency is.
QUESTION
I am reading this AWS blog post to learn how to make my static website's form POST data to API Gateway and Lambda.
Most of it makes sense to me, but the Lambda code provided contains this unused variable:
...ANSWER
Answered 2021-Feb-11 at 15:58
The response object you are referring to needs to be returned to the client with the structure provided. The code you are referencing in the AWS article isn't sending any response back to the client, and that is why you don't see the response variable used anywhere. To complete the handler function, you would return that response variable, transforming the body property into whatever message you intend to send back to your client. Without this return structure, you will get a 502 error on the client.
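A minimal sketch of the idea, shown in Python for illustration (the names and the body content here are illustrative, not taken from the article):

import json

def lambda_handler(event, context):
    # ... process the form data posted by the static site ...
    response = {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"message": "Form received"}),
    }
    return response  # without this return, API Gateway answers with a 502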
QUESTION
Trying to send a cookie back after a login request on my hobby-project website. For some reason it works when running locally, i.e. on http://localhost:3000, but as soon as I push my API online and try to access it through my live website, I see no cookie under Application -> Cookies -> website (using Chrome). I have googled a lot and I believe I have checked off every CORS policy.
The Node.js code runs in AWS Lambda and is invoked through API Gateway. API Gateway is reached through a CloudFront distribution (if it matters).
In my Express backend I have logged my headers accordingly:
...ANSWER
Answered 2020-Dec-19 at 12:28
Turns out it wasn't a CORS issue. I had simply forgotten to forward cookies from my CloudFront distribution.
QUESTION
TL;DR: How do I send a short payload from an MQTT request through AWS IoT to an AWS Lambda that has an open connection via API Gateway to an Electron app running locally on Linux?
I have an ESP8266 with the following code as the init.js
This code successfully sends its message to AWS IoT, with a rule set to trigger a Lambda called sendmessage. Now this sendmessage Lambda is connected via WebSockets to an Electron app running locally on my Linux machine. I am able to send messages from the Electron app via WebSockets to the API Gateway wss URL. I followed this example here, which sets up all the WebSockets with API Gateway and AWS Lambdas (one being the sendmessage Lambda).
ANSWER
Answered 2020-Nov-17 at 04:59
It seems you are setting up one Lambda to handle two trigger sources: one is the IoT service, the other is the API Gateway WebSocket. Since you use one Lambda, you have to handle requests from both sources:
- While event.requestContext is available when the request is triggered from API Gateway, it is not available when the request is triggered from the IoT service (check the IoT event object here: https://docs.aws.amazon.com/lambda/latest/dg/services-iotevents.html). That is what the error you faced (Cannot read property 'domainName' of undefined) is about. You should either turn off the Lambda trigger from the IoT service or handle the request when it comes from the IoT service.
- I'm not sure about the forbidden error, but it looks like you sent an unstructured message to the API Gateway WebSocket; it should be connection.send(JSON.stringify({ action: "sendmessage", data: "hello world" })); instead of connection.send("hello world");
Edited based on the post update:
You wrote: "I know ws is there because if I console.log it, it returns a big object with a bunch of functions."
A Lambda function is not really a server; it is an instance of a Node environment (that's why it is called a FUNCTION). A Lambda function doesn't work the way a normal Node.js app does: its container (Node environment) is usually halted (frozen) whenever its job is done, so you cannot keep the container alive like a normal server. That's the reason why, although you can console.log the WebSocket object, you cannot keep it alive -- the Node.js container is already halted whenever you return a response.
Since you cannot use the WebSocket object to open a WS connection in Lambda, Amazon offers a way to do that via API Gateway. The way we work with API Gateway WebSockets is different from a normal server too; it is something like:
- User -> request to API Gateway to connect to websocket -> call Lambda 1 (onconnect function)
- User -> request to API Gateway to send message over Websocket -> call Lambda 2 (sendmessage function)
- User -> request to API Gateway to close connection -> call Lambda 3 (ondisconnect function)
The three routes above are configured in API Gateway (https://docs.aws.amazon.com/apigateway/latest/developerguide/apigateway-websocket-api-integrations.html). The logic of the three handlers onconnect, sendmessage and ondisconnect can live in one Lambda or in three separate Lambda functions, depending on the design; I checked your three Lambda functions and they look okay. A rough sketch of the single-Lambda approach follows.
I see that you want to use IoT, but I'm not sure why. You should test your WebSocket API first without anything related to IoT. It would be better if you could tell us what you want to achieve here, since IoT works more like a publish/subscribe messaging channel and I don't think it's necessary to use it here.
QUESTION
I have a React app that uploads a file to S3. When the user presses a button to extract text from the file, the app makes a GET request to API Gateway with the file name as a parameter. This triggers the Lambda function that extracts the text from the file on S3. But I am stuck on the API that needs to call the Lambda function.
I followed this tutorial from AWS: https://docs.aws.amazon.com/apigateway/latest/developerguide/integrating-api-with-aws-services-lambda.html#api-as-lambda-proxy-expose-get-method-with-query-strings-to-call-lambda-function
This is the response I get when I test the API call:
ANSWER
Answered 2020-Oct-26 at 08:06
The AWS Service integration type is for integrating API Gateway with any AWS service. Even though Lambda is an AWS service, there is a Lambda integration type specifically for integrating Lambda functions, and I think the Lambda integration type is the suitable one in this case.
You can pass the file name in the GET request as well (for example, as a query string parameter).
If you would like to use the AWS Service integration type, make sure to grant API Gateway permission to invoke the Lambda, for example via the Lambda function's resource-based policy (or an IAM role that API Gateway can assume).
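For example, that resource-based permission can be added with boto3; the function name and ARNs below are placeholders:

import boto3

lambda_client = boto3.client("lambda")
lambda_client.add_permission(
    FunctionName="extract-text",                 # placeholder function name
    StatementId="allow-apigw-invoke",
    Action="lambda:InvokeFunction",
    Principal="apigateway.amazonaws.com",
    # Restrict to a specific API/stage/method/path; placeholder ARN
    SourceArn="arn:aws:execute-api:us-east-1:123456789012:abcdef1234/*/GET/extract",
)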
QUESTION
I was trying to use the debugging function for Lambda (Python) in Visual Studio Code. I was following the instructions in the AWS docs, but I could not trigger the Python application in debug mode.
Please kindly see if you know the issue and whether I have set up anything incorrectly, thanks.
Observation
- Start application: it seems the application was not started on the debug port specified?
- Request call: the endpoint could not be reached and the Python application was not entered. If accessed through port 3000, the application completes successfully.
Setup performed
- Initialize the project and install ptvsd as instructed
- Enable ptvsd in the Python code
- Add the launch configuration
This is basically just the official helloworld sample for Python.
...ANSWER
Answered 2020-Oct-07 at 04:06
It seems I was editing the Python file at "python-debugging/hello_world/build", following the guideline in the doc (there is a step in the doc which asks you to copy the Python file to "python-debugging/hello_world/build").
But when you run "sam local start-api", it actually runs the Python file at the location specified by the CloudFormation template (template.yaml), which is "python-debugging/hello_world" (check the "CodeUri" property).
When I moved all the libraries to the same folder as the Python file, it worked.
So I suppose you have to make sure which Python (or Lambda) script you are running, and ensure the libraries are together with the Python script (if you are not using layers).
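For reference, the debug hook that this kind of SAM local debugging relies on looks roughly like the snippet below; the port 5890 is an assumption and must match the debug port in your launch configuration:

import ptvsd

# Listen on all interfaces so the container started by `sam local start-api` is reachable
ptvsd.enable_attach(address=("0.0.0.0", 5890), redirect_output=True)
ptvsd.wait_for_attach()  # block until VS Code attaches on the debug port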
Folder structure and entering debugging mode in Visual Studio Code (screenshots omitted):
Step 1: Invoke and start up the local API Gateway (server and client output).
In the IDE, open the "Run" perspective, select the launch config for this file ("SAM CLI Python Hello World") and start the debug.
Step 5: Step through the function and return the response (server and client output).
QUESTION
I'm trying to insert bulk data into a DynamoDB table, but not even a single item is getting inserted using the Lambda function written in TypeScript.
Here is my code:-
...ANSWER
Answered 2020-Sep-27 at 07:54
I'm not sure if this is the answer, but you can't use async/await inside forEach; use for...of instead.
QUESTION
I'm trying to return errors from my Lambda functions, but for all of the errors it just returns status 502 with the message Internal server error. Previously it was returning a CORS error for all types of returned errors. After adding 'Access-Control-Allow-Origin' : '*' in the API Gateway responses, I'm getting the 502 error. I've logged the thrown errors in the catch block and I can see the specific errors in CloudWatch. I've seen this question but it didn't help. Please note that instead of using a callback I'm using async/await. Also, I've tried with and without lambda-proxy integration, but the response is the same. Do I need to configure something else in the case of lambda-proxy?
ANSWER
Answered 2020-Jun-08 at 08:01
When using Lambda proxy integration with API Gateway, the response from Lambda is expected in a certain format:
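Roughly, a proxy-integration response must be a JSON object with a statusCode, optional headers, and a string body; anything else (including an unhandled exception or a missing return) produces the 502. A sketch in Python for illustration (the asker's code is Node.js, but the shape is the same; do_work is a hypothetical helper):

import json

def lambda_handler(event, context):
    try:
        result = do_work(event)                  # hypothetical helper
        return {
            "statusCode": 200,
            "headers": {"Access-Control-Allow-Origin": "*"},
            "body": json.dumps(result),          # body must be a string
        }
    except Exception as exc:
        # Return the error in the same shape instead of letting it propagate
        return {
            "statusCode": 500,
            "headers": {"Access-Control-Allow-Origin": "*"},
            "body": json.dumps({"error": str(exc)}),
        }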
QUESTION
I have a click action that sends a fetch request to an AWS Lambda function running a Google auth script. The Lambda returns an authorization URL to the fetch request, and with window.location I'm sent to Google to authorize. I authorize, and currently I have it send me back to the same Lambda. I can't just send it back to the Gatsby site, because Google needs the auth redirect URL to return a 200 status code, and I couldn't just create a page on my site like /auth for the redirect. So once I'm redirected to the original Lambda, the authorization code is appended to the URL. So far that works just fine.
What I am stuck on is the next step.
A) How do I redirect the user back to the Gatsby site?
B) Do I store that auth code (pulled from the URL params) in a database somewhere? I could use FaunaDB; I'm familiar with that and Lambda functions.
C) Should I be sending Google to a separate Lambda from the one I send the fetch request to? Will it really matter?
I am using a NodeJS Lambda
Here is my lambda
...ANSWER
Answered 2020-May-17 at 12:42
I got it working. I ended up creating a second Lambda, used as the callback URL for the Google auth, to push to the database, and then a 301 redirect to localhost for now.
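A minimal sketch of what that second, callback-handling Lambda can return through a proxy integration; the database write is omitted and the redirect target is a placeholder:

def auth_callback_handler(event, context):
    params = event.get("queryStringParameters") or {}
    auth_code = params.get("code")  # authorization code appended by Google
    # ... persist auth_code (e.g. in FaunaDB) before redirecting ...
    return {
        "statusCode": 301,
        "headers": {"Location": "http://localhost:8000/"},  # placeholder redirect target
        "body": "",
    }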
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported