http-trigger | Kubernetes CRD controller for http invocation | Function As A Service library
kandi X-RAY | http-trigger Summary
Kubernetes CRD controller for http invocation of Kubeless functions
Community Discussions
Trending Discussions on http-trigger
QUESTION
I've created an Azure function in C# to read an .xlsx spreadsheet (via ExcelDataReader) and output a formatted xml file (using XMLWriter). The HTTP-triggered function is working perfectly at the moment, but I've now been told that the disk paths I've been reading/writing my files to won't be available for much longer as our on premise data gateway is going to be abandoned, apparently. So, my function will now have to use blob storage for both input and output.
My processing starts in a Logic Apps workflow, all triggered as an email hits the inbox of a shared account. Any relevant .xlsx attachment is saved into blob storage with the current Logic App run-number used as the file body.
I've built a JSON-formatted binding record in the Logic App and passed this to the function in the hope I can pick it up in the declarative code for the bindings, e.g.
ANSWER
Answered 2021-Jun-01 at 05:00
As rene mentioned in the comments, you can add a class that contains a property named blobName and follow this sample to implement your requirement.
Apart from that, since you use Logic Apps, you can also use the Logic Apps blob storage connector to read or write the blob content.
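To make the binding-expression idea concrete: in a C# class library the expression usually lives in a [Blob(...)] attribute, while for script/JSON-configured functions it sits in function.json. A hedged sketch of the latter, where the container name, property name, and connection setting are placeholders rather than values from the question:

```json
{
  "name": "inputBlob",
  "type": "blob",
  "direction": "in",
  "path": "attachments/{blobName}",
  "connection": "AzureWebJobsStorage"
}
```

Here {blobName} is a binding expression resolved from the JSON payload the Logic App posts to the HTTP trigger, which is why the answer suggests a class with a matching blobName property.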
QUESTION
I have made an Azure Function (http-trigger) and deployed it to the portal using Visual Studio 2019.
The function works fine, and I now want to add a binding to my Cosmos DB. I navigate to my function and click on "Integration". Now I see the trigger, the function, and the input and output bindings.
I should be able to add a new input binding here, but I have no "Add" button. What am I doing wrong?
ANSWER
Answered 2021-May-28 at 15:53
You cannot add it in the Portal because you authored the Function in Visual Studio. Your screenshot shows that this is a pre-compiled Function (it is compiled on your machine and the DLLs are published), so you cannot alter it in the Portal because it's already compiled.
The only scenario where you can add or remove input or output bindings in the Portal is when the Function was created and authored (code written) in the Portal or in .csx files, meaning you can actually edit the source code in the Portal itself.
QUESTION
There are similar questions to this one on Stack Overflow, but none of them addresses my issue of using Terraform when an Azure Storage account is used to retain outputs.
Here is my scenario, which may sound familiar:
My terraform script provisions an Azure HTTP-triggered function with a function key. This terraform script also provisions a Web App that calls the mentioned HTTP-triggered function. The HTTP-triggered function's key is stored in the Web App's appsettings.json file so it can be included in the HTTP headers of calls to the HTTP-triggered function.
The following code snippet illustrates how the HTTP-triggered function is provisioned in terraform:
ANSWER
Answered 2021-May-28 at 06:07
Based on your requirement, you could try to output the variable with the following command:
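As a sketch of what such an output could look like, under assumed resource names (the data source and the example names are illustrative, not taken from the question):

```hcl
# Read the host keys of the deployed Function App.
data "azurerm_function_app_host_keys" "example" {
  name                = azurerm_function_app.example.name
  resource_group_name = azurerm_resource_group.example.name
}

# Expose the default function key so the Web App can consume it.
output "function_default_key" {
  value     = data.azurerm_function_app_host_keys.example.default_function_key
  sensitive = true
}
```

Marking the output as sensitive keeps the key out of plain-text CLI output while still storing it in the Terraform state.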
QUESTION
I'm having trouble with Puppeteer in Functions. I am running the exact same code path from two different functions: one HTTP-triggered and one scheduled.
The HTTP-triggered function works as intended, but the scheduled function times out with:
ANSWER
Answered 2021-May-16 at 12:06
The problem was solved by returning the promise from mainFetchAndStoreArticles, instead of using .then() and returning null.
QUESTION
I already read and tried this, this, and many other resources, without success.
I have a UWP app that calls an AAD-protected HTTP-triggered Azure Function. I created the two app registrations on the AAD section of the Azure portal. The API app registration specifies a scope and has an application ID URI of api://5e6b2b53-...
. On the “Authentication” blade, I set https://login.microsoftonline.com/common/oauth2/nativeclient
as the redirect URI. I already set the same value as the redirect URI of the UWP app registration (I don't know if it's correct). I also set the following on the Function app registration, as well as this redirect URI for it, though I don't understand if it's required:
The UWP app registration uses the right scope I defined on the Function app registration. Both the app registrations are multi-tenant. The code I use on the UWP app to call the protected Azure Function is:
ANSWER
Answered 2021-Apr-08 at 07:32
I've done some tests and hope they help; if I've misunderstood somewhere, please point it out.
First I created an HTTP trigger function; when I called GET https://xxx.azurewebsites.net/api/HttpTrigger1?name=asdfg, I got a response like hello asdfg.
Then I followed this doc to enable authentication via Azure AD. That means I created a new Azure AD app and exposed an API like this.
After this step, when I call the GET request, it asks me to sign in, and then I get the same response. Next I created another Azure AD app and added an API permission for the API I exposed just now. Via this application I can generate an access token with the scope of that API, and with this access token in the Authorization request header, I can access the GET request directly.
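From the client side, the pattern above boils down to sending the acquired access token as a Bearer header. A minimal Python sketch, assuming the token has already been obtained elsewhere (e.g. via MSAL); the URL and function names are illustrative:

```python
import urllib.request


def build_auth_headers(access_token: str) -> dict:
    """Attach an AAD access token as a Bearer credential."""
    return {"Authorization": f"Bearer {access_token}"}


def call_protected_function(url: str, access_token: str) -> str:
    """GET an AAD-protected function endpoint with the token attached."""
    req = urllib.request.Request(url, headers=build_auth_headers(access_token))
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()
```

Without the header, the App Service authentication layer returns 401 before the function code ever runs.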
QUESTION
I'd like to only allow requests to an HTTP-triggered Azure Function that include a well-known client-certificate.
I do not want requests forwarded to the Azure Function that are not "approved".
- Where is the trust store in Azure where I can store these well-known client public certificates?
- How do I point App Service to look at this trust store?
ANSWER
Answered 2021-Mar-24 at 21:52
Host your Azure Functions behind an APIM instance; then you can use APIM to manage your client certificates. You can use the Client certificates page in the Azure portal to upload your client certificates to the APIM resource and configure an APIM policy to only allow trusted clients.
For setting up APIM over your Azure Functions, see: https://docs.microsoft.com/en-us/learn/modules/build-serverless-api-with-functions-api-management/
For using client certificates to secure access to an API, see: https://docs.microsoft.com/en-us/learn/modules/control-authentication-with-apim/4-secure-access-client-certs
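For the "well-known certificate" requirement specifically, an inbound policy can compare the presented certificate's thumbprint against an expected value. A sketch modeled on the policy examples in the APIM documentation (the thumbprint is a placeholder):

```xml
<inbound>
    <choose>
        <when condition="@(context.Request.Certificate == null || context.Request.Certificate.Thumbprint != "EXPECTED-THUMBPRINT")">
            <return-response>
                <set-status code="403" reason="Invalid client certificate" />
            </return-response>
        </when>
    </choose>
</inbound>
```

Requests arriving without the expected certificate are rejected at the gateway and never reach the function.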
QUESTION
I have two Azure Functions in the same project in VS Code. They are written in Python 3.8. They are both HTTP-triggered with either a GET or POST request. They were made using the Azure Functions extension for VS Code.
Their authorization levels are both set to anonymous, meaning they should accept requests from anywhere.
They have identical HttpTrigger[1, 2]/function.json files:
ANSWER
Answered 2021-Mar-15 at 06:49
import logging

import azure.functions as func
import requests_async as requests


async def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    # Await the call to the second function instead of blocking the worker.
    response = await requests.get('http://localhost:7071/api/HttpTrigger2')
    return func.HttpResponse(
        "Test" + response.text,
        status_code=200
    )
QUESTION
There is a C# application under development that is part of a bigger backend application for processing some data. It obtains a token from Azure AD B2C and sends it to an HTTP-triggered function, where the token is validated by the following code:
ANSWER
Answered 2021-Feb-26 at 14:53
Obtaining a token for the AAD B2C tenant without UI is possible in two ways, and you should pick one depending on what exactly you want to achieve:
- user token - by using the Resource Owner Password Credentials flow - https://docs.microsoft.com/en-us/azure/active-directory-b2c/add-ropc-policy. This flow is deprecated, though, and is usually mentioned in the context of legacy applications
- server-side application token - by using the Client Credentials flow - this, on the other hand, requires using AAD-specific requests but with the AAD B2C tenant - https://docs.microsoft.com/en-us/azure/active-directory-b2c/application-types#daemonsserver-side-applications
I'm also not quite sure why you should use an id_token for that. If the application needs to authorize the request to the function with the token, then it should be an access token, regardless of how the token is retrieved (interactive UI or not).
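To make the ROPC variant concrete: the token request is a plain form POST against the B2C policy's token endpoint. A Python sketch that only builds the request, where the tenant, policy, and credential values are placeholders and the endpoint shape follows the B2C ROPC documentation:

```python
def build_ropc_token_request(tenant: str, policy: str, client_id: str,
                             username: str, password: str, scope: str):
    """Build the endpoint URL and form payload for a B2C ROPC token request."""
    url = (f"https://{tenant}.b2clogin.com/{tenant}.onmicrosoft.com/"
           f"{policy}/oauth2/v2.0/token")
    payload = {
        "grant_type": "password",  # Resource Owner Password Credentials
        "client_id": client_id,
        "username": username,
        "password": password,
        "scope": scope,            # include the API scope to get an access token
        "response_type": "token",
    }
    return url, payload
```

POSTing that payload returns a JSON body whose access_token field is what the call to the function should carry.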
QUESTION
We are using PubSub for queuing utilizing a push subscription pointing at an http-triggered cloud function. According to this documentation Cloud Run and App Engine will both authenticate requests from PubSub, cloud functions isn't listed. We have used other google services, like scheduler to invoke functions which require authentication, but have not had luck doing so with PubSub.
My question is: does Cloud Functions support authentication from Pub/Sub through a service account set on the subscription, or is the function required to read and validate the JWT itself for authentication?
ANSWER
Answered 2021-Feb-02 at 06:58
Pub/Sub supports service account authentication for "push" subscriptions.
To use a service account, just specify the endpoint of the cloud function, enable authentication, and add a service account to be used to send requests to the cloud function. Make sure that the service account has the appropriate permissions to access both Pub/Sub and Cloud Functions.
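Wired up with the gcloud CLI, such a push subscription might be created like this (topic, subscription, region, project, and service-account names are placeholders, and the service account also needs the Cloud Functions Invoker role on the function):

```shell
gcloud pubsub subscriptions create my-push-sub \
  --topic=my-topic \
  --push-endpoint=https://REGION-PROJECT_ID.cloudfunctions.net/my-function \
  --push-auth-service-account=pubsub-invoker@PROJECT_ID.iam.gserviceaccount.com
```

With this in place, Pub/Sub attaches an OIDC token to each push request, so the function itself does not have to parse the JWT.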
QUESTION
I'm running a blob-triggered function on Azure VM (Ubuntu 18.04) using Azure Function Core Tools.
What I want to do is to get information of blobs WITHOUT using a service endpoint.
In my VNet I have:
- VM1, which runs a function with Core Tools
- VM2, which is a DNS server and pokes VM1 with an HTTP request like the one below:
  curl -X POST http://{VM1's private IP}:7071/admin/functions/{my blob Function} -H "content-type:application/json" -d "{'input':'myContainer/myFolder/myBlob'}"
- Blob storage with a private endpoint
When I enabled a service endpoint Microsoft.Storage on my subnet, VM1 could run the blob-triggered function, could be poked by VM2, and got the information of the blob (the one fed in via curl).
However, once I delete the service endpoint, VM1 can't run the function and gets the following errors, having obviously failed to connect to the storage:
An unhandled exception has occurred. Host is shutting down.
Microsoft.WindowsAzure.Storage: This request is not authorized to perform this operation.
An unhandled exception has occurred. Host is shutting down.
Microsoft.WindowsAzure.Storage: The operation was canceled. System.Private.CoreLib: The operation was canceled.
Name resolution to a private IP of a storage is naturally fine, from both VM1 and VM2, as they are in the same subnet.
Is there any way to solve this, like adding a route to my route table?
Thank you in advance.
Edit #1
Other functions that don't use private endpoints, like HTTP-triggered functions, are not affected and are callable.
I guess the Core Tools runtime does not support Private Link, because if I want a function on Azure Functions (not Core Tools on a local machine) to connect to private endpoints, it is required to use a Premium plan or an App Service plan.
ANSWER
Answered 2020-Dec-16 at 08:45
In this case, when you enable a service endpoint Microsoft.Storage for the subnet, Azure adds a route to the public IP addresses for Storage services in the route table of this subnet. An Azure service endpoint provides a direct connection to the Azure service over Microsoft's backbone network infrastructure. Using service endpoints does not remove the public endpoint from Azure Storage accounts; it's just a redirection of traffic.
When you enable a private endpoint for the blob storage, the blob resources are accessible only via your virtual network, and the blob-triggered function communicates with the designated resources using a resource-specific private IP address. If you have removed the service endpoint from a subnet, there is no default route to the public IP of the storage resources via a next-hop virtual network service endpoint; the outbound traffic for Azure services by default goes over the Internet. Thus, other functions that don't use private endpoints, like HTTP-triggered functions, are not affected and are callable.
In conclusion, it does not seem that it's possible for Azure Functions Core Tools on the Azure VM to connect to private endpoints for blob storage WITHOUT using a service endpoint.
Hope this makes sense, for more information, you could read these blogs:
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported