Cloud-Tasks | A very simple Palm webOS client for Remember The Milk | Frontend Framework library
kandi X-RAY | Cloud-Tasks Summary
This software is protected by the GPL version 2. See file licence.txt for details of the licence. If you would like to work on this application then you'll probably need to know the following to get it, and the associated tests, working...
Trending Discussions on Cloud-Tasks
QUESTION
Received an import error after upgrading to the airflow2.0.2-python3.7 image. The package seems to be installed; I am not sure what is causing the issue or how to fix it. I tried uninstalling and reinstalling the packages, but that does not work either.
...ANSWER
Answered 2021-Apr-22 at 12:15
It's in fact a harmless bug in the definition of the google provider 2.2.0:
In provider.yaml:
airflow.providers.google.common.hooks.leveldb.LevelDBHook
should be:
airflow.providers.google.leveldb.hooks.LevelDBHook
This was fixed in https://github.com/apache/airflow/pull/15453 and will be available in the next version of the google provider.
QUESTION
I am trying to create a task with Google Cloud Tasks using the Python client google-cloud-tasks==2.1.0, but I am getting an exception that HttpRequest.url is required. I am setting a relative URL, which is the URL that handles the task in my app.
The queue exists and has been created using:
...ANSWER
Answered 2021-Feb-04 at 01:22
Your task can have one of two targets: App Engine or HTTP. For an HTTP target, a URL is required, as specified in the documentation on creating HTTP target tasks.
The URL must start with 'http://' or 'https://'. To fix the problem, update your http_request:
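A minimal sketch of a corrected http_request (assumes google-cloud-tasks 2.x; the project, location, queue, and handler URL below are placeholders, not values from the question). The helper enforces the absolute-URL rule that the error message is complaining about:

```python
# Sketch: build the http_request payload for a Cloud Tasks HTTP-target
# task. HttpRequest.url must be absolute; a relative path such as
# "/run_task" triggers the "HttpRequest.url is required" error.
def build_http_task(url, body=b""):
    if not url.startswith(("http://", "https://")):
        raise ValueError("HttpRequest.url must start with http:// or https://")
    return {
        "http_request": {
            "http_method": "POST",  # tasks_v2.HttpMethod.POST with the client
            "url": url,
            "headers": {"Content-Type": "application/json"},
            "body": body,
        }
    }

# With the real client (placeholder names, needs GCP credentials):
# from google.cloud import tasks_v2
# client = tasks_v2.CloudTasksClient()
# parent = client.queue_path("my-project", "us-central1", "my-queue")
# client.create_task(request={
#     "parent": parent,
#     "task": build_http_task("https://my-app.example.com/run_task"),
# })
```

The key point is that the task's target URL must be fully qualified; the queue itself carries no base URL for HTTP-target tasks.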
QUESTION
I am trying to use BigQuery in AI-Platform-Notebooks, but I am running into a ContextualVersionConflict. In this toy example, I am trying to pull two columns' worth of data from the BigQuery database entitled bgt_all, in the project job2vec.
...ANSWER
Answered 2021-Jan-05 at 10:19
In order to further contribute to the community, I am posting the answer based on my comment above.
Firstly, you should try to upgrade the packages using the command:
pip install --upgrade pandas-gbq 'google-cloud-bigquery[bqstorage,pandas]'
Then, instead of using the to_dataframe() method, you can use read_gbq(), which loads data from BigQuery using the environment's default project, as follows:
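A hedged sketch of that call (the project job2vec and table bgt_all come from the question; the column names are stand-ins for the real schema):

```python
# Sketch: build the two-column query, then hand it to pandas' read_gbq
# (backed by pandas-gbq) instead of the BigQuery client's to_dataframe().
def two_column_query(dataset_table, col_a, col_b):
    # Backtick-quote the table reference, as BigQuery standard SQL expects.
    return f"SELECT {col_a}, {col_b} FROM `{dataset_table}`"

# With pandas-gbq installed (needs GCP credentials to actually run):
# import pandas as pd
# df = pd.read_gbq(two_column_query("job2vec.bgt_all", "col_a", "col_b"),
#                  project_id="job2vec")
```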
QUESTION
On App Engine with automatic scaling on Python 2, request handlers had a maximum timeout of 60s for normal HTTP requests and 10 minutes for Taskqueue requests.
I can't find any information about Pub/Sub tasks. Do they also get the 10-minute timeout like Taskqueue/Cloud Tasks?
Additionally, it seems like Google is changing its docs, and in Python 3 all requests will have the 10-minute timeout: https://cloud.google.com/appengine/docs/standard/python3/how-instances-are-managed
But if you go to the docs for cron in Python 3, it says "An HTTP request invoked by cron can run for up to 60 seconds":
https://cloud.google.com/appengine/docs/standard/python3/scheduling-jobs-with-cron-yaml
ANSWER
Answered 2020-Jan-23 at 13:25
When you refer to a Pub/Sub task, I understand you mean having an App Engine service subscribe to a topic as a push subscriber.
According to the Cloud Pub/Sub documentation, what you would consider a “queue” is basically Pub/Sub dynamically adjusting the rate of push requests based on the rate at which it receives success responses.
With push subscriptions, Pub/Sub considers an HTTP success response as an acknowledgement that the worker received the message. However, keep in mind that the deadline to provide this response is initially determined by the ackDeadline of the subscription, which is 10 seconds by default, as mentioned in the Managing Subscriptions documentation.
According to the Receiving push messages documentation, if the App Engine subscriber does not reply with a success HTTP status code within the ackDeadline, Pub/Sub will retry delivery until the message expires after the subscription's message retention period.
Conveniently, you can set the ackDeadline of the push subscription to a maximum of 10 minutes, making it the same duration as the request deadline for Python 3 App Engine standard with automatic scaling.
Regarding your question about the difference for the cron triggered requests, it is indeed how it is designed, but I would not be able to tell you why it is like that.
Also, for more information about the difference between Pub/Sub and Cloud Tasks, you can refer to the official docs. Funnily enough, both the Cloud Tasks and the Pub/Sub docs contain a slightly different page discussing the difference between the two.
EDIT:
I decided to put the timeout deadline of Python2 apps to the test, and I confirmed that the limitation is indeed present even when receiving requests from a Pub/Sub push subscription.
I created 3 basic task handlers that waited 80, 120, and 610 seconds, respectively, before sending a 200 HTTP response. When publishing to the topic, I noticed the following:
- As expected the service waiting for more than 10 minutes failed to acknowledge all the requests.
- The service waiting for 120s was successful in a few requests but failed in most of them.
- Strangely enough the service waiting for 80s successfully acknowledged all the requests.
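A stdlib-only sketch of the kind of handlers described above (only the delays come from the experiment; the handler framework and route names are my assumptions):

```python
import time
from http.server import BaseHTTPRequestHandler

# Seconds each hypothetical handler sleeps before acknowledging with 200.
DELAYS = {"/wait80": 80, "/wait120": 120, "/wait610": 610}

class SlowAckHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        time.sleep(DELAYS.get(self.path, 0))  # simulate slow work
        self.send_response(200)  # Pub/Sub treats a 2xx response as an ack
        self.end_headers()
        self.wfile.write(b"ok")
```

Whether a handler's delay exceeds the runtime's request deadline (and the subscription's ackDeadline) determines whether Pub/Sub ever sees the ack, which is what the three timings probe.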
This makes me believe the deadline for Python 2 is not as hard a limit as the documentation says. However, I still believe it is ideal to keep the documented deadline in mind when developing apps, and to be aware that it will still be somewhat applied, even for Pub/Sub tasks.
As the runtime will most likely be deprecated soon, I don't think any documentation changes specifying the deadline for Pub/Sub tasks would be approved in time. With Python 3, there is no risk of App Engine terminating the request before the configured ackDeadline.
QUESTION
I have a Cloud Task where I largely copied the code from the Cloud Tasks documentation: https://cloud.google.com/tasks/docs/creating-http-target-tasks
The goal of the cloud function is to receive a dateTo and dateFrom date so it can loop over the period and create Cloud Tasks from it. I have not created the loop yet because I first want this issue solved.
The problem is that it does not push the body to the HTTP cloud function. The HTTP cloud function works when using cURL:
curl -X POST "posturl" -H "Content-Type:application/json" --data '{"date": "2019-12-01", "lastRun": false}'
I checked the method as mentioned here, and it is POST, so it should be fine: https://stackoverflow.com/a/56448479/2785289
Checking the interface, there is no payload. Using gcloud beta tasks describe ... there is no body or anything about a payload.
...ANSWER
Answered 2020-Jan-08 at 13:54
As per the Google docs, by default responseView is BASIC; not all information is retrieved by default because some data, such as payloads, might be desirable to return only when needed, because of its large size or because of the sensitivity of the data it contains.
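In other words, one way to see the payload is to request the task with the FULL response view. A minimal sketch of the REST form of that request (the task name below is a placeholder):

```python
# Sketch: tasks.get with responseView=FULL returns the body and headers
# that the default BASIC view omits.
def task_get_url(task_name):
    return (f"https://cloudtasks.googleapis.com/v2/{task_name}"
            "?responseView=FULL")

# url = task_get_url("projects/my-project/locations/us-central1/"
#                    "queues/my-queue/tasks/my-task")
# Fetch it with an authenticated HTTP client, or pass the FULL view to
# the Python client's get_task call.
```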
QUESTION
I'm trying to use Google Cloud Tasks, creating a task via the "Try this API" web feature or the Ruby Google Cloud SDK.
I can't get the payload to be delivered to a worker.
Sending the HTTP method as POST actually works, but it shows up as GET in the Cloud Tasks UI.
No payload or headers are sent to the worker or show up in the Cloud Tasks UI. I've tried Base64, JSON, and normal strings.
Example:
Request:
...ANSWER
Answered 2019-Jul-08 at 17:01
Thank you for this post. This is a bug in the existing Cloud Tasks UI, and we are in the process of fixing it.
In the meantime the correct HTTP method of the task can be determined by running the following command:
gcloud beta tasks describe
https://cloud.google.com/sdk/gcloud/reference/beta/tasks/describe
The above command will show the correct HTTP method for the task.
Answer from: Google Cloud Tasks always set HttpMethod to GET when using HttpRequest as payload_type
You can also use the get task method to get more information.
QUESTION
Context: We are using GAE with Python 3, so the GAE APIs package isn't available and we are using the google-cloud-* packages for interacting with GAE services,
i.e. google-cloud-tasks for push queues and google-cloud-datastore for Datastore.
Problem: There is no way to test things in a development environment, as the google-cloud-* packages act directly on production services.
i.e. if I push a task using google-cloud-tasks, it is pushed to the production queue; similarly, if I create or update an entity from the development environment, it updates the entity in the production Datastore.
Earlier, with the GAE APIs package, the local system had a local task queue and Datastore for development purposes.
I see this as a big and very common issue, and I wonder whether someone else has faced it and found a solution.
...ANSWER
Answered 2018-Dec-18 at 16:40
For Cloud Datastore you can follow the instructions at https://cloud.google.com/datastore/docs/tools/datastore-emulator to use the local emulator instead of your production Datastore database.
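As a hedged illustration (the host, port, and project ID below are assumptions, not values from the answer): with the emulator running locally, the Python Datastore client can be pointed at it via environment variables instead of production:

```python
import os

def point_at_datastore_emulator(host="localhost:8081", project="dev-project"):
    # google-cloud-datastore checks DATASTORE_EMULATOR_HOST and, when it is
    # set, sends requests to the local emulator instead of production.
    os.environ["DATASTORE_EMULATOR_HOST"] = host
    os.environ["DATASTORE_PROJECT_ID"] = project

# point_at_datastore_emulator()
# from google.cloud import datastore
# client = datastore.Client()  # now talks to the emulator
```

The emulator's own `env-init` command prints the exact variables for your local setup, so prefer those values over the placeholders here.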
As noted in https://cloud.google.com/tasks/docs/migrating, Cloud Tasks is not currently supported in an emulator.
QUESTION
We have set up CI for Google App Engine deployment, and it had been working fine until yesterday. We are having trouble deploying to Google App Engine; the error is shown below:
...ANSWER
Answered 2018-Oct-03 at 01:41
As suggested in the Google Issue Tracker:
Hi, when you try to deploy, were you overwriting an existing version? If yes, could you please try to deploy a version with a new name instead of overwriting it? It seems the error you observed is a known issue so this will be a workaround for the moment as the App Engine engineering team works on this issue.
I tried deploying to a new version and it worked; then I switched back to overwriting the existing version and it worked again. (This works for other applications as well.)
We are still not sure what caused this sudden issue, though.
P.S.: I also granted my CI service account the "Storage Legacy Bucket Owner" permission on the bucket.
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported