DosNa | Distributed Object Store Numpy Array
kandi X-RAY | DosNa Summary
DosNa is a Python library with no reported bugs or vulnerabilities, a permissive license, an available build file, and low community support. You can install it with 'pip install DosNa' or download it from GitHub or PyPI.
DosNa is a Python wrapper for distributing N-dimensional arrays over an Object Store server. Its main goal is to provide a simple, seamless interface for storing and managing N-dimensional datasets in a remote cloud. It is designed to be modular, defining a Connection -> Dataset -> DataChunk architecture, and supports multiple Backends and Engines that extend the base abstract model with different functionality. Each Backend represents a different Object Store type and wraps DosNa to connect to and interact with that Object Store. Engines, on the other hand, add local (or remote) multi-threading and multi-processing support. Engines act as clients to the selected Backend and parallelize (or enhance) some of the functions used to access the remote data. For example, the Ceph backend works by directly wrapping the Cluster and Pool/IOCtx APIs from Librados and adding two new classes, Dataset and DataChunk, that wrap and extend the behavior of a Librados Object (which is only accessible by looping over the Pool). DosNa is based on the Librados library and tries to mimic its Object Store model: dosna.Cluster and dosna.Pool are connection objects to the remote Object Store service and the Pools (or streams) within it. A dosna.Dataset automatically distributes an N-dimensional dataset across the cluster by partitioning the whole array into smaller chunks and storing them in a completely distributed fashion as dosna.DataChunk objects. Once a chunk_size is specified, data loaded into a dosna.Dataset is split into chunks of size chunk_size, and a separate remote dosna.DataChunk object is created for each chunk of the original data. These dosna.DataChunk objects are distributed across the Object Store, making dosna.Dataset a wrapper (acting as a lookup table) that distributes an N-dimensional dataset into many smaller N-dimensional chunks.
An existing dosna.Dataset can be used like an h5py.Dataset object or a NumPy array. That is, a dataset supports standard slicing, ds[:, :, :] (getter) and ds[:, :, :] = 5 (setter), and the dosna.Dataset object takes care behind the scenes of accessing all the dosna.DataChunk objects needed to reconstruct the requested slices and retrieve or update them accordingly.
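The chunking scheme described above can be pictured with a small, NumPy-only sketch. The helper below is purely illustrative (the function name and dict-of-chunks representation are assumptions for this example), not DosNa's actual implementation:

```python
import itertools
import numpy as np

def split_into_chunks(data, chunk_size):
    """Partition an N-dimensional array into blocks of at most chunk_size,
    keyed by chunk index -- the role dosna.DataChunk objects play remotely."""
    grid = [range(0, dim, step) for dim, step in zip(data.shape, chunk_size)]
    chunks = {}
    for origin in itertools.product(*grid):
        idx = tuple(o // s for o, s in zip(origin, chunk_size))
        slices = tuple(slice(o, o + s) for o, s in zip(origin, chunk_size))
        chunks[idx] = data[slices].copy()
    return chunks

data = np.arange(64).reshape(8, 8)
chunks = split_into_chunks(data, (4, 4))
# An 8x8 array with 4x4 chunks yields a 2x2 grid of chunks;
# a dataset-level slice is served by looking up the chunks it touches.
assert len(chunks) == 4
assert np.array_equal(chunks[(1, 0)], data[4:8, 0:4])
```

A real dosna.Dataset does this lookup transparently on every getter/setter call, fetching or updating only the chunks that intersect the requested slice.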
Support
DosNa has a low-activity ecosystem.
It has 8 stars, 4 forks, and 6 watchers.
It has had no major release in the last 12 months.
There is 1 open issue and 2 have been closed. On average, issues are closed in 22 days. There are no pull requests.
It has a neutral sentiment in the developer community.
The latest version of DosNa is 0.1.
Quality
DosNa has no bugs reported.
Security
DosNa has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
License
DosNa is licensed under the Apache-2.0 License. This license is Permissive.
Permissive licenses have the least restrictions, and you can use them in most projects.
Reuse
DosNa releases are not available; you will need to build and install from source.
A deployable package is available on PyPI.
A build file is available, so you can build the component from source.
Installation instructions, examples and code snippets are available.
Top functions reviewed by kandi - BETA
kandi has reviewed DosNa and identified the functions below as its top functions. This is intended to give you an instant insight into the functionality DosNa implements, and to help you decide if it suits your requirements.
- Perform a 2D convolution
- Show current engine status
- Get the current backend
- Get MPI command
- Create a new dataset
- Return the root of the dataset
- Check if the given dataset exists
- Lists the objects in rados
- Create a hdf5 dataset
- Convolve 1D image
- Parse command line arguments
- Create output dataset
- Create a new Dataset
- Get a dataset by name
- Render an image object
- Make an image
- Return a numpy array of data
- Create random data on disk
- Set data at the given values
- Returns a Dataset object
- Delete a dataset
- Get the data at the given slice
- Return a Sage Dataset object
- Set the data for this dataset
- Delete chunk with given index
- Return a CephDataset object
DosNa Key Features
No Key Features are available at this moment for DosNa.
DosNa Examples and Code Snippets
import dosna as dn
import numpy as np
data = np.random.randn(100, 100, 100)
con = dn.Connection('dosna-tutorial')
con.connect()
ds = con.create_dataset('data', data=data, chunk_size=(32,32,32))
print(ds[0, 0, :10])
# [ 0.38475526 -1.02690848 0.88
import dosna as dn
dn.use(backend='hdf5')  # One of ['ram', 'hdf5', 'ceph', 'sage']
dn.use(engine='mpi')    # One of ['cpu', 'joblib', 'mpi']
# or set both at once:
dn.use(backend='hdf5', engine='mpi')
# Rest of the script
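As a rough illustration of how a name-based backend/engine selection like dn.use could work, here is a generic plugin-registry sketch in plain Python. Every name in it (the registry dict, register_backend, RamBackend, get_backend) is hypothetical and does not reflect DosNa's real internals:

```python
# Minimal plugin-registry sketch: map backend names to classes and
# switch the active one by name, as a dn.use(backend=...) call might.
_BACKENDS = {}
_current = {"backend": "ram"}

def register_backend(name):
    """Decorator that records a backend class under a string name."""
    def wrap(cls):
        _BACKENDS[name] = cls
        return cls
    return wrap

@register_backend("ram")
class RamBackend:
    def connect(self, name):
        # A real backend would open a connection to an Object Store here.
        return f"in-memory connection to {name}"

def use(backend=None):
    """Select the active backend by name, rejecting unknown names."""
    if backend is not None:
        if backend not in _BACKENDS:
            raise ValueError(f"unknown backend: {backend}")
        _current["backend"] = backend

def get_backend():
    """Instantiate whichever backend is currently selected."""
    return _BACKENDS[_current["backend"]]()

use(backend="ram")
print(get_backend().connect("dosna-tutorial"))
```

The appeal of this pattern is that engines and backends can be swapped by string name at the top of a script without changing any of the dataset-manipulation code that follows.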
git clone https://github.com/DiamondLightSource/DosNa.git
cd DosNa && python setup.py install  # standard install
cd DosNa && python setup.py develop  # or: development (editable) install
Community Discussions
No Community Discussions are available at this moment for DosNa. Refer to the Stack Overflow page for discussions.
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install DosNa
There are no requirements other than NumPy to install and use DosNa with the default Backend and Engine. However, specific backends and engines require additional dependencies, listed as follows:
Library: numpy
Backends: ram: none; hdf5: h5py; ceph: librados; sage: pyclovis; s3: boto3
Engines: cpu: none; jl: joblib; mpi: mpi4py
Examples: convolutions.py: scipy
Support
For any new features, suggestions, and bugs, create an issue on GitHub.
If you have any questions, check and ask on the Stack Overflow community page.