pypackage | Ship virtualenvs as deb or rpm
kandi X-RAY | pypackage Summary
Create RPMs or DEBs from a requirement file.
Top functions reviewed by kandi - BETA
- Create a new environment
- Return a list of directories for the virtualenv
- Install activate
- Log a message to the console
- Write content to destination
- Get the default values
- Update the default configuration with the given defaults
- Gets a configuration section by name
- Return a generator of environment variables starting with prefix
- Make sure that the virtualenv exists
- Rewrite an egg link
- Make a relative path from source to destination
- Ensure pth and egg-link
- Return the level for the given integer
- Resolve the given executable
- Returns a list of directories for the virtualenv
- Expand default value
pypackage Key Features
pypackage Examples and Code Snippets
Community Discussions
Trending Discussions on pypackage
QUESTION
Hi, this is my first experience trying to deploy a Python app to the cloud using CF. I am having issues deploying my app; I sincerely appreciate it if anyone can help me or point me in the right direction to solve the issue.
The main problem is that the app I am trying to deploy is large due to a lot of Python dependencies. The size of my app directory is 200 KB. The first error I observed was: staging fails due to "Failed to upload payload for droplet". I think the reason is that when all Python dependencies are downloaded from the requirements.txt file and the droplet is finally created, its size is too large to upload. The droplet size is 982.3 MB.
The first solution I tried was vendoring the app, where I created a vendor directory containing all Python dependencies, but the size of the vendor directory was greater than 1 GB, which caused the upload size to exceed the 1 GB limit and led to a failure in uploading the app files.
The second solution I am working on is to upload all installed Python libraries to an object store (in my case an S3 bucket which is bound to my app) and then download the dependencies folder, called Pypackages, to the app's root directory /home/vcap/app, so I want /home/vcap/app/Pypackages to exist before my app starts on the cloud. But I couldn't do it successfully yet. I have included a Python script in my app directory which downloads files from the S3 bucket successfully. (I have put the correct absolute path for the download in the downloadS3.py script, i.e. /home/vcap/app/Pypackages.) I want to run this script using "python downloadS3.py" as a one-off task. First I tried the solution here: Can I have multiple commands run in a manifest.yml file? Although I can see the status of the task is SUCCEEDED via '$ cf tasks my-app-name', /home/vcap/app/Pypackages does not exist.
I also tried to run the one-off task with the steps below:
1-
$ cf push -c 'python downloadS3.py && sleep infinity' -i 1 --no-route
2-
$ cf push -c 'null'
I have printed the contents of /home/vcap/app from my app, i.e. when the app is started and I enter the URL in my browser (I don't know the right way to see the contents of the root directory). Anyway, the problem is that Pypackages is not downloaded to the correct root directory. I am not sure if I am running the one-off task in a wrong way or if there is a better solution to make my app work.
I appreciate any help!
ANSWER
Answered 2021-Apr-05 at 16:10
Diego Cells stage apps and upload the droplet to the blobstore via the Cloud Controller. The maximum file size that can be uploaded is configurable at Ops Manager > TAS for VMs > Application Developer Control > Maximum File Upload Size (MB); the default is 1024 MB. This seems to be causing the restriction; see if you can get it increased with your admin's help...
Tasks run in their own containers, so that is possibly not an option. I think the Python buildpack collects and installs the packages before creating the droplet, so I don't think copying packages directly into the /app directory will be of much help.
If you have data files, then you can use a .profile file and do some scripting to copy them from S3 or a server/NFS location into the /app directory. Something like:
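The script itself is not included in this excerpt. As a rough sketch of the idea, the .profile could invoke a small Python download script along these lines; the bucket name, key prefix, and destination path below are placeholders (not details from the original post), and credentials are assumed to come from the bound S3 service:

    # download_from_s3.py - sketch only; bucket, prefix, and destination are assumptions.
    import os
    import boto3

    BUCKET = "my-app-dependencies"         # placeholder bucket name
    PREFIX = "Pypackages/"                 # placeholder key prefix
    DEST = "/home/vcap/app/Pypackages"     # destination inside the app container

    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")

    for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            if key.endswith("/"):          # skip "directory" placeholder keys
                continue
            target = os.path.join(DEST, os.path.relpath(key, PREFIX))
            os.makedirs(os.path.dirname(target), exist_ok=True)
            s3.download_file(BUCKET, key, target)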
QUESTION
The problem
I have a directory structure for my project which follows the standard for Python packages, as it was created with this cookiecutter template: https://github.com/audreyr/cookiecutter-pypackage#quickstart
The directory structure is
...
ANSWER
Answered 2019-Apr-27 at 17:25
You should create a virtual environment and install the project in order for the test modules to correctly resolve import statements.
In the project root, i.e. the directory project_name which contains the subdirectory project_name and the subdirectory tests, create a setup.py (or pyproject.toml) file for the package metadata. See here for details about that part.
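The linked details are not reproduced in this excerpt; as a minimal sketch (the project_name values follow the layout described in the question, everything else is a placeholder), such a setup.py could look like:

    # setup.py - minimal sketch; name and version are placeholders.
    from setuptools import setup, find_packages

    setup(
        name="project_name",          # assumed from the question's layout
        version="0.1.0",              # placeholder version
        packages=find_packages(exclude=["tests", "tests.*"]),
    )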
From this same project root directory, which now contains the installer (setup.py), create and activate a venv and install your project:
QUESTION
I want to use tox to automate testing of my Python package. As of now, just locally. When running tox, the test passes, but then a UnicodeDecodeError is thrown. tox --version is 3.13.2.
The error message (full traceback below):
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe4 in position 70: invalid continuation byte
The tests succeed when running tox (verified with -vvvvvv), which I expect. The functions do not do anything and are just dummies at the moment (automatically created using cookiecutter-pypackage). I reduced the items in the envlist to just py37. Anaconda is in my PATH variable and no regular Python is installed. I tried using different Python versions by writing a .bat file (I am using Windows) as in the official tox documentation. This works identically to py37: the tests pass and the following is thrown.
I could not find anything in the tox documentation regarding UnicodeDecodeErrors.
Neither PowerShell nor my command line can execute export LANG=en_US.UTF-8 as suggested in this post. Setting setenv = LANG=en_US.UTF-8 in the tox.ini also did not change anything.
The traceback below is for the py37 environment, which gets called when adding skipdist = true to tox.ini. Leaving that out still returns the exact same error, with an almost identical traceback.
The error is thrown from the codecs.py file. Moving up in the traceback and looking into each file didn't help me, as I could not find out which file gets encoded, or anything else that could help. I have not posted the whole console output with the successful virtualenv creation, but something from it might help. The error is thrown at the envreport in the summary. tox-envreport is not installed, if that matters. When the sdist gets tested, it is in the GLOB sdist-make: section.
ANSWER
Answered 2019-Aug-25 at 20:44
Just a guess: place your application outside this folder: OneDrive - Universität zu Köln. The problem is in "Köln", I think.
This is a good traceback to file a tox bug report.
QUESTION
I've always understood that rule #1 of secrets is that you keep them out of public source control.
So, I was prepping to upload a new package to PyPI.
In .travis.yml I see:
...
ANSWER
Answered 2019-Aug-14 at 20:46
A repository's .travis.yml file can have "encrypted values", such as environment variables, notification settings, and deploy API keys. These encrypted values can be added by anyone, but are only readable by Travis CI.
This is what the secure: field name indicates. It's safe to include these encrypted values in your .travis.yml and safe to upload them to GitHub as well.
You can generate secure values by installing the travis gem and running it:
QUESTION
I have a project with an overarching namespace, with packages inside it. Here's the folder structure:
...
ANSWER
Answered 2018-Nov-26 at 19:30
You would add the 'old' names inside your new package by importing them into the top-level package.
Names imported as globals in pypackage/__init__.py are attributes on the pypackage package. Make use of that to give access to 'legacy' locations:
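The answer's code is not included in this excerpt. As a hedged sketch of the technique being described, with made-up module and object names rather than anything from the question:

    # pypackage/__init__.py - sketch only; "subpackage", "newmodule", "OldThing" and
    # "old_function" are hypothetical names. Importing them here makes them available
    # as pypackage.OldThing and pypackage.old_function, so code written against the
    # old flat layout keeps working.
    from pypackage.subpackage.newmodule import OldThing, old_function

    __all__ = ["OldThing", "old_function"]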
QUESTION
What can I put in our setup.py project configuration file to tell developers that the project is a private/commercial application/library?
Currently I set:
...
ANSWER
Answered 2017-May-16 at 13:58
Why not check out the setup.py files of big projects on GitHub?
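The answer only points to existing examples; one widely used convention (an assumption on my part, not something stated in the answer) is to declare a proprietary license and add a classifier that PyPI does not recognise, so an accidental upload is rejected:

    # setup.py excerpt - a common convention, not an official "private" flag.
    from setuptools import setup

    setup(
        name="mycompany-internal-lib",        # placeholder name
        version="1.0.0",
        license="Proprietary",
        classifiers=[
            "License :: Other/Proprietary License",
            # PyPI rejects uploads that use an unknown classifier, so this line
            # guards against publishing the package by accident.
            "Private :: Do Not Upload",
        ],
    )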
QUESTION
I have a gist (CSV format) which updates on a daily basis and contains n revisions. Each revision's data is different from the others.
I need to know the difference between each revision, so I used the gist API to retrieve the revisions, which can be saved as CSV.
My requirement:
- How can I download and save each URL's CSV, i.e. example.csv, with a different name?
- How can I get the difference between revisions?
I am stuck on how to download the file. I tried with the urllib and requests Python packages but I couldn't figure out where I am going wrong. Thanks.
...
ANSWER
Answered 2017-Feb-08 at 12:28
So you're over-complicating some things. Each of the objects returned by github3.py has the information you want.
I've taken your code below and modified it slightly. To summarize:
- I removed the usage of as_json() since there's no point in coercing the data to and from a string. If you wanted a dictionary, you could have used as_dict().
- Next, I used the gist's commit history and used that to find the file for each revision.
- Using the GistHistory object and the actual GistFile object, I construct your filename to save them, so they will look like 5c058121cc4f289773b7013208ca5c5b0d97ba33-example.csv.
- Finally, I use the GistFile object to actually retrieve the file content and save it to disk.
I hope this helps.
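The modified code from the answer is not reproduced in this excerpt. As a rough sketch of the same idea using the plain GitHub REST API via requests instead of github3.py (the gist id and filename are placeholders):

    # gist_revisions.py - sketch using the GitHub REST API, not the answer's github3.py code.
    import requests

    GIST_ID = "your_gist_id_here"    # placeholder
    FILENAME = "example.csv"         # placeholder
    API = "https://api.github.com/gists"

    # The gist resource includes a "history" list with one entry per revision.
    history = requests.get(f"{API}/{GIST_ID}").json()["history"]

    for revision in history:
        sha = revision["version"]
        # Fetch the gist as it looked at this revision and pick out the file.
        files = requests.get(f"{API}/{GIST_ID}/{sha}").json()["files"]
        content = files[FILENAME]["content"]   # note: very large files may be truncated
        # Save each revision under a unique name, e.g. "<sha>-example.csv".
        with open(f"{sha}-{FILENAME}", "w", encoding="utf-8") as fh:
            fh.write(content)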
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install pypackage
You can use pypackage like any standard Python library. You will need to make sure that you have a development environment consisting of a Python distribution including header files, a compiler, pip, and git installed. Make sure that your pip, setuptools, and wheel are up to date. When using pip it is generally recommended to install packages in a virtual environment to avoid changes to the system.