cell_counting_v2 | Code for training cell counting networks (Keras + TensorFlow) | Machine Learning library
kandi X-RAY | cell_counting_v2 Summary
This repository includes the code for training cell counting applications (Keras + TensorFlow). The dataset can be downloaded here:

References:
[1] Microscopy Cell Counting with Fully Convolutional Regression Networks.
[2] U-Net: Convolutional Networks for Biomedical Image Segmentation.

To make training easier, I added Batch Normalization to all architectures (FCRN-A and a simple U-Net version). There are still minor differences from the original MatConvNet implementation; for instance, upsampling in Keras is implemented by repeating elements rather than by bilinear interpolation, so to mimic bilinear upsampling I used upsampling followed by convolution. More data augmentation also needs to be added. Nevertheless, I am able to get results similar to those reported in the paper.

All architectures follow the fully convolutional idea: each consists of a down-sampling path followed by an up-sampling path. In the first several layers, the structure resembles a canonical classification CNN, with convolution, ReLU, and max pooling applied repeatedly to the input image and feature maps. In the second half of the architecture, spatial resolution is recovered by up-sampling and convolution, eventually mapping the intermediate feature representation back to the original resolution. In the U-Net version, low-level feature representations are fused in during upsampling to compensate for the information loss caused by max pooling.

Only a very simple configuration is given here (64 kernels for all layers), not tuned for any particular dataset. Deep learning is developing extremely fast, and both papers were published two years ago, which is already quite "old". If you are interested in cell counting, feel free to build on this.
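To make the down-/up-sampling structure concrete, the following is a minimal Keras sketch of an FCRN-style network with Batch Normalization after each convolution and upsampling + convolution in place of bilinear upsampling. The layer counts, input shape, loss, and optimizer are illustrative assumptions, not the exact model defined in the repository's model.py.

from keras.layers import (Input, Conv2D, BatchNormalization, Activation,
                          MaxPooling2D, UpSampling2D)
from keras.models import Model

def conv_bn_relu(x, filters):
    # basic block used throughout: convolution -> batch norm -> ReLU
    x = Conv2D(filters, (3, 3), padding='same')(x)
    x = BatchNormalization()(x)
    return Activation('relu')(x)

def build_fcrn_like(input_shape=(256, 256, 1)):
    inp = Input(shape=input_shape)

    # down-sampling path: conv blocks interleaved with max pooling
    x = conv_bn_relu(inp, 64)
    x = MaxPooling2D((2, 2))(x)
    x = conv_bn_relu(x, 64)
    x = MaxPooling2D((2, 2))(x)
    x = conv_bn_relu(x, 64)

    # up-sampling path: repeat-upsampling followed by a convolution
    # approximates bilinear upsampling and recovers the input resolution
    x = UpSampling2D((2, 2))(x)
    x = conv_bn_relu(x, 64)
    x = UpSampling2D((2, 2))(x)
    x = conv_bn_relu(x, 64)

    # 1x1 convolution maps the features to a single-channel density map
    density = Conv2D(1, (1, 1), padding='same')(x)
    return Model(inputs=inp, outputs=density)

model = build_fcrn_like()
model.compile(optimizer='adam', loss='mse')

A U-Net-style variant would additionally concatenate the corresponding down-sampling feature maps onto the up-sampling path before each convolution block, which is the fusion of low-level features mentioned above.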
Community Discussions
Trending Discussions on cell_counting_v2
QUESTION
I am currently trying to make a research paper's source code work. It is supposed to detect cells in an image using deep learning. The source code is available here: https://github.com/WeidiXie/cell_counting_v2
I'm using Python 3.9.5 and Keras 2.5.0.
I've been having some issues getting this code to work properly, and I am getting errors that I'm not sure how to resolve. This happens when I run train.py:
ANSWER
Answered 2021-May-18 at 21:35
I think the problem is a Keras version mismatch. Convolution2D is a deprecated layer name; in Keras 2.5.0 it has been replaced by Conv2D, and the subsample argument has been replaced by strides. You need to either install an older version of Keras, such as 1.2.2, or modify the model.py code to make it compatible with the new Keras API.
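As an illustration (the filter count and kernel size here are hypothetical, not copied from model.py), a Keras 1-style layer such as

    Convolution2D(64, 3, 3, subsample=(1, 1), border_mode='same', activation='relu')

would be rewritten for Keras 2 as

    Conv2D(64, (3, 3), strides=(1, 1), padding='same', activation='relu')

Note that border_mode has also been renamed to padding in Keras 2.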
For more information you may check:
- Keras 1.2.2 (old) documentation: https://faroit.com/keras-docs/1.2.2/layers/convolutional/
- Keras 2.0.5 (new) documentation: https://faroit.com/keras-docs/2.0.5/layers/convolutional/
Community Discussions and Code Snippets include sources from the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install cell_counting_v2
You can use cell_counting_v2 like any standard Python library. You will need a development environment consisting of a Python distribution (including header files), a compiler, pip, and git. Make sure that pip, setuptools, and wheel are up to date. When using pip, it is generally recommended to install packages into a virtual environment to avoid changing the system-wide Python installation.
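A typical setup might look like the following. These commands are a sketch assuming a standard clone + virtual-environment workflow; the repository itself does not pin exact dependency versions.

# clone the repository and enter it
git clone https://github.com/WeidiXie/cell_counting_v2.git
cd cell_counting_v2

# create and activate an isolated virtual environment
python -m venv .venv
source .venv/bin/activate      # on Windows: .venv\Scripts\activate

# keep packaging tools current, then install the deep learning dependencies
pip install --upgrade pip setuptools wheel
pip install tensorflow keras   # or an older Keras (e.g. keras==1.2.2, with a matching old backend) if running the unmodified code, per the discussion above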