google-cloud-ruby | Google Cloud Client Library for Ruby | GCP library
kandi X-RAY | google-cloud-ruby Summary
kandi has reviewed google-cloud-ruby and discovered the below as its top functions:
- Returns the breakpoint.
- Adds a span.
- Waits for the callback.
- Starts the branch.
- Creates a new object.
- Checks that the table belongs to the dataset.
- Creates a new debugger object.
- Creates a new object for connecting to the subscription.
- Generates the default message for this commit.
google-cloud-ruby Key Features
google-cloud-ruby Examples and Code Snippets
Trending Discussions on google-cloud-ruby
QUESTION
I am using the package @google-cloud/text-to-speech in order to convert text to speech, using roughly this code:
import GoogleTextToSpeech, { SynthesizeSpeechRequest } from '@google-cloud/text-to-speech';
const fs = require('fs');
const path = require('path');

// ...

const request: SynthesizeSpeechRequest = {
  input: { ssml }, // ssml is a valid SSML string from 0 to about 2000 chars.
  voice: {
    languageCode: 'en-US',
    name: 'en-US-Wavenet-A',
  },
  audioConfig: {
    audioEncoding: 'LINEAR16',
    pitch: 0,
    speakingRate: 1.0,
  },
};

const clientConfig = JSON.parse(
  fs.readFileSync(
    path.join(
      require.resolve('@google-cloud/text-to-speech'),
      '..',
      'v1',
      'text_to_speech_client_config.json',
    ),
  ),
);

clientConfig.interfaces[
  'google.cloud.texttospeech.v1.TextToSpeech'
].methods.SynthesizeSpeech.timeout_millis = 3000000;

const client = new GoogleTextToSpeech.TextToSpeechClient(clientConfig);

// Performs the Text-to-Speech request
const [response] = await client.synthesizeSpeech(request);
// response gets processed further
This works for us, and has worked for us for a long time. Recently in testing, however, we noticed that we sometimes get an error when the text is close to the 2000-character limit; the error seems to occur because the audio response from Google is too large. The error looks like this:
Error: 8 RESOURCE_EXHAUSTED: Received message larger than max (5162312 vs. 4194304)
As far as we can tell, the problem is that under the hood, gRPC is configured not to receive a response above that 4 MB limit. According to this GitHub issue, this SO post, this GitHub issue, this SO post, and this GitHub thread, it seems it should be possible to configure this in clientConfig in our code to be -1 (unlimited) instead. The problem is that all of those posts are either for some other language or aren't specific to Cloud Text-to-Speech.
My question today is: how can I configure this value specifically in @google-cloud/text-to-speech? It seems like it's possible, but it also seems like I'm going to have to jump through several hoops (such as perhaps constructing a Channel, passing that to gRPC, then passing that gRPC instance to the TextToSpeechClient), and as far as I can tell, most of these gRPC configuration properties are not listed in the docs. For example, the interfaces['google.cloud.texttospeech.v1.TextToSpeech'].methods.SynthesizeSpeech.timeout_millis property is a value that we have successfully updated in the past, but it does not appear to be included in those docs among what can be passed to the TextToSpeechClient constructor.
Any help is appreciated, as well as links to helpful resources on where this kind of configuration is documented, if they exist!
Thanks!
ANSWER
Answered 2021-Dec-08 at 00:22
Max suggested contacting Google support or searching the Google Cloud forums.
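Since this page tracks the Ruby library, it is worth noting the analogous knob in the google-cloud-text_to_speech gem: generated Ruby clients expose a configure block, and gapic-common documents a channel_args setting for raw gRPC channel arguments. The sketch below is an assumption built on that and on gRPC core's grpc.max_receive_message_length argument (-1 meaning unlimited), not a verified recipe for this exact gem:

require "google/cloud/text_to_speech"

# Assumed: channel_args is forwarded to the underlying gRPC channel,
# and -1 is gRPC core's "no limit" sentinel for the receive size.
client = Google::Cloud::TextToSpeech.text_to_speech do |config|
  config.timeout = 3_000  # seconds; unlike the Node config, this is not in millis
  config.channel_args = { "grpc.max_receive_message_length" => -1 }
end

response = client.synthesize_speech(
  input:        { ssml: "<speak>Hello</speak>" },
  voice:        { language_code: "en-US", name: "en-US-Wavenet-A" },
  audio_config: { audio_encoding: :LINEAR16 }
)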
QUESTION
I have a Google Cloud function that I can invoke via the gcloud CLI using a service account with the necessary IAM permissions:
gcloud auth activate-service-account 'service-account-email' --key-file=google_key.json
gcloud functions call opt_manual --data '{some-json}'
This works just fine.
I'm trying to implement a similar call using the official Ruby SDK: https://github.com/googleapis/google-cloud-ruby/tree/main/google-cloud-functions-v1
name = "opt_manual"
data = '{some-json}'

client = ::Google::Cloud::Functions::V1::CloudFunctionsService::Client.new do |config|
  config.credentials = "google_key.json"
end

client.get_function ::Google::Cloud::Functions::V1::GetFunctionRequest.new(name: name)
# =>
# Permission denied on resource project opt_manual.. debug_error_string:{
# "created":"@1636730694.210272000",
# "description":"Error received from peer ipv4:142.251.36.202:443",
# "file":"src/core/lib/surface/call.cc",
# "file_line":1070,
# "grpc_message":"Permission denied on resource project opt_manual.",
# "grpc_status":7
# } (Google::Cloud::PermissionDeniedError)
The service account includes the following permissions:
- Cloud Functions Admin
- Cloud Functions Invoker
- Service Account User
- Workload Identity User
The Cloud Function's principals include the correct service account.
Despite all of that, I'm still getting PermissionDeniedError.
Maybe someone had a similar case and remembers how it could be fixed? Keep in mind that in the same project I access BigQuery and Cloud Storage through the official SDK with the same service account, without any problem.
ANSWER
Answered 2021-Nov-12 at 16:54
Can you replace the following with values and try it instead of opt_manual:
projects/{project}/locations/{location}/functions/opt_manual
Your Service Account likely has too many permissions. You should need only Cloud Functions Invoker (roles/cloudfunctions.invoker).
Explanation: the underlying method call is projects.locations.functions.get. Unfortunately, the Ruby API documentation for GetFunctionRequest doesn't explain this. APIs Explorer is the definitive tool for understanding Google's REST APIs.
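A minimal sketch of that suggestion, with placeholder project and location values (generated Ruby clients also accept keyword arguments in place of an explicit request object):

require "google/cloud/functions/v1"

client = ::Google::Cloud::Functions::V1::CloudFunctionsService::Client.new do |config|
  config.credentials = "google_key.json"
end

# Fully qualified resource name; "my-project" and "us-central1" are placeholders.
name = "projects/my-project/locations/us-central1/functions/opt_manual"

function = client.get_function name: name
puts function.name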
QUESTION
I have audio files located in a private GCS bucket. I want to serve these audio files for users to listen to.
I can't use Active Storage for this, as these files are created/deleted outside of my Rails application.
I could download the files using the google-cloud-storage gem, which would cover authentication and file download. But if I understand correctly, I can only serve files from the public directory? So would I need to download them to Rails.public_path?
Furthermore, I really don't want to manage these files after downloading them: caching, deleting them after some time, etc.
What would be the best way to achieve this?
ANSWER
Answered 2021-Sep-03 at 21:47
The best option in my opinion would be to use the google-cloud-storage gem, since both Google::Cloud::Storage::Bucket and Google::Cloud::Storage::File have the #signed_url method. This way you can find the relevant file(s) you need and create a temporary URL, then send the URL to the client, which will be in charge of downloading the file directly.
If you don't want the client to download the file directly from Google Cloud, you can download the file from GCS yourself and use #send_data or #send_file in the controller.
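A sketch of the signed-URL approach, with placeholder bucket and object names (File#signed_url takes the expiry in seconds):

require "google/cloud/storage"

storage = Google::Cloud::Storage.new  # uses GOOGLE_APPLICATION_CREDENTIALS by default

bucket = storage.bucket "my-audio-bucket"   # placeholder bucket name
file   = bucket.file "episodes/intro.mp3"   # placeholder object name

# Short-lived URL the browser fetches directly; no file ever lands on the
# Rails server, so there is nothing to cache or clean up afterwards.
url = file.signed_url method: "GET", expires: 300  # 5 minutes

# In a controller, something like: redirect_to url, allow_other_host: true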
QUESTION
I am using the google-cloud-bigquery gem in my Ruby on Rails application. I am able to execute a query on a dataset and do the following:
- Execute the query
- Create a destination table and store the results in it
- Store the final results in a file from the destination table
Now I want to set the expiration time for the destination table. I found a document on updating a table, but I cannot find a way to set the expiration time from Ruby.
Also, I am able to fetch the expires_at value from a table, which returns nil; I don't see a way to set it.
Kindly help.
ANSWER
Answered 2020-May-18 at 14:12
I'm no Ruby expert, but I also cannot find anything in the docs/API that allows you to set the expiration on a table. You can do it at the dataset level (here) or for partitions on the table (here). It looks like it's not exposed via the client library for some reason.
Another way of doing it is via DDL in SQL, e.g.:
ALTER TABLE mydataset.mytable
SET OPTIONS (
  expiration_timestamp = TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 7 DAY),
  description = "Table that expires seven days from now"
)
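Since DDL statements run like any other query, the same gem can apply that statement; a sketch, assuming the mydataset.mytable from the answer above already exists:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new

# DDL goes through the ordinary query method; there is no dedicated
# table-expiration setter in the gem itself.
bigquery.query <<~SQL
  ALTER TABLE mydataset.mytable
  SET OPTIONS (
    expiration_timestamp = TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  )
SQL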
QUESTION
I am trying to write a cell to Bigtable with microsecond timestamp granularity. The doc here says that I should be able to set the granularity to MICROS: https://cloud.google.com/bigtable/docs/reference/data/rpc/google.bigtable.v2#google.bigtable.v2.Mutation.SetCell
But if you look at the Java client, I don't see an option to set anything other than MILLIS: https://cloud.google.com/bigtable/docs/reference/admin/rpc/google.bigtable.admin.v2#google.bigtable.admin.v2.Table.TimestampGranularity
The same goes for the Ruby client: https://github.com/googleapis/google-cloud-ruby/blob/master/google-cloud-bigtable/lib/google/cloud/bigtable/instance.rb#L548
Does anyone know if it is possible to set the granularity to MICROS?
ANSWER
Answered 2020-Mar-04 at 07:46
As mentioned in the Google API documentation:
If unspecified at creation time, the value will be set to MILLIS
It seems that you need to set the granularity to MICROS when creating your Bigtable table; otherwise, it will default to milliseconds. Besides that, as mentioned in the Package google.bigtable.v2 documentation, there is the TimestampRangeFilterMicros function that you can use with values in micros; more information is in this Bigtable documentation here.
Let me know if the information helped you!
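For the Ruby client on this page, the practical consequence is that cell timestamps are supplied in microseconds but must match the table's granularity, which in practice is MILLIS; a sketch, with placeholder instance, table, and column names:

require "google/cloud/bigtable"

bigtable = Google::Cloud::Bigtable.new
table = bigtable.table "my-instance", "my-table"  # placeholders

# set_cell takes the timestamp in microseconds, but with MILLIS tables the
# value must be rounded to whole milliseconds or the mutation is rejected.
timestamp_micros = (Time.now.to_f * 1_000_000).round(-3)

entry = table.new_mutation_entry "user-1"
entry.set_cell "cf1", "played_at", "some value", timestamp: timestamp_micros
table.mutate_row entry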
Community discussions and code snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install google-cloud-ruby
On a UNIX-like operating system, using your system's package manager is easiest; however, the packaged Ruby version may not be the newest one. There is also an installer for Windows. Version managers help you switch between multiple Ruby versions on your system, while installers can be used to install a specific Ruby version or several versions. Please refer to ruby-lang.org for more information.
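With Ruby installed, the repository's gems are installed per service rather than as one monolithic gem; a sketch of a Gemfile covering the gems used in the discussions above (pick only what you need):

# Gemfile
source "https://rubygems.org"

gem "google-cloud-storage"    # GCS buckets, files, signed URLs
gem "google-cloud-bigquery"   # datasets, queries, DDL
gem "google-cloud-bigtable"   # Bigtable data and admin operations
gem "google-cloud-functions"  # Cloud Functions management API

After bundle install, each service is required individually, e.g. require "google/cloud/storage".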