Popular New Releases in Serialization
protobuf: Protocol Buffers v3.20.1-rc1
flatbuffers: FlatBuffers release 2.0.0
protobuf.js: v6.11.2
protobuf: v1.5.2
kryo: kryo-5.3.0
Popular Libraries in Serialization
by protocolbuffers (c++), 53747 stars, NOASSERTION: Protocol Buffers - Google's data interchange format
by google (c++), 17768 stars, Apache-2.0: FlatBuffers: Memory Efficient Serialization Library
by capnproto (c++), 8826 stars, NOASSERTION: Cap'n Proto serialization/RPC system - core tools and C++ library
by protobufjs (javascript), 8139 stars, NOASSERTION: Protocol Buffers for JavaScript (& TypeScript).
by golang (go), 7869 stars, BSD-3-Clause: Go support for Google's protocol buffers
by marshmallow-code (python), 6044 stars, MIT: A lightweight library for converting complex objects to and from simple Python datatypes.
by EsotericSoftware (html), 5311 stars, BSD-3-Clause: Java binary serialization and cloning: fast, efficient, automatic
by serde-rs (rust), 5309 stars, NOASSERTION: Serialization framework for Rust
by gogo (go), 4628 stars, NOASSERTION: [Looking for new ownership] Protocol Buffers for Go with Gadgets
Trending New libraries in Serialization
by bytedance (go), 2214 stars, Apache-2.0: A blazingly fast JSON serializing & deserializing library
by rkyv (rust), 1136 stars, MIT: Zero-copy deserialization framework for Rust
by djkoloski (rust), 778 stars, MIT: Zero-copy deserialization framework for Rust
by apple (swift), 710 stars, Apache-2.0: Low-level atomic operations for Swift
by RainwayApp (csharp), 661 stars, Apache-2.0: An extremely simple, fast, efficient, cross-platform serialization format
by dropbox (rust), 567 stars, Apache-2.0: A protobuf code generation framework for the Rust language developed at Dropbox.
by sharksforarms (rust), 369 stars, NOASSERTION: Declarative binary reading and writing: bit-level, symmetric, serialization/deserialization
by richardartoul (go), 339 stars, MIT: Molecule is a Go library for parsing protobufs in an efficient and zero-allocation manner.
by only-cliches (rust), 240 stars, MIT: Flexible, Fast & Compact Serialization with RPC
Top Authors in Serialization
1. 20 Libraries, 1614
2. 9 Libraries, 230
3. 9 Libraries, 18789
4. 9 Libraries, 5668
5. 8 Libraries, 63
6. 6 Libraries, 33
7. 5 Libraries, 107
8. 5 Libraries, 81
9. 5 Libraries, 1275
10. 5 Libraries, 243
Trending Kits in Serialization
No Trending Kits are available at this moment for Serialization
Trending Discussions on Serialization
How to ignore empty list when serializing to json?
android datastore-preferences: Property delegate must have a 'getValue(Context, KProperty<*>)' method
Saving model on Tensorflow 2.7.0 with data augmentation layer
Forcing VS2022 to use 32 bit version of msbuild
.NET 6 XmlSerializer Pretty print
Signing payload in JS (Frontend) using EC and validating in Python
How reproducible / deterministic is Parquet format?
How to distribute a Kotlin CLI application?
SharpSerializer: Ignore attributes/properties from deserialization
Kafka integration tests in Gradle runs into GitHub Actions
QUESTION
How to ignore empty list when serializing to json?
Asked 2022-Mar-09 at 14:36

I am trying to figure out how to serialize to a JSON object and skip serializing properties whose values are empty lists. I am not using Newtonsoft Json.
using System.Text.Json;
using System.Text.Json.Serialization;
using AutoMapper;
I have an object with a property.
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
[JsonPropertyName("extension")]
public List<Extension> Extension { get; set; }
When I try to serialize this object using the following
var optionsJson = new JsonSerializerOptions
{
    WriteIndented = true,
    IgnoreNullValues = true,
    PropertyNameCaseInsensitive = true,
};

var json = JsonSerializer.Serialize(report, optionsJson);
It still gives me an empty array:
"extension": [],
Isn't there a way to keep it from serializing these empty lists? I would like to see extension gone. It should not be there at all. I need to do this because the gateway will respond with an error if I send:
"extension": null,
It must not be part of the object when serialized.
The reason why I do not want these empty lists is that the third-party gateway I am sending to objects to empty lists. Here is the gateway error:
"severity": "error", "code": "processing", "diagnostics": "Array cannot be empty - the property should not be present if it has no values", "location": [ "Bundle.entry[2].resource.extension", "Line 96, Col 23" ]
I'm trying to avoid doing some kind of nasty string replace on this.
ANSWER
Answered 2022-Mar-09 at 13:53

You can add a dummy property that is used during serialization to handle this.
- Add a new property with the same signature, but flag it with JsonPropertyNameAttribute to ensure it is serialized with the correct name, and also with JsonIgnoreAttribute so that it will not be serialized when it returns null.
- The original property you mark with JsonIgnore, unconditionally, so that it will never be serialized itself.
- This dummy property would return null (and thus be ignored) when the actual property contains an empty list; otherwise it would return that (non-empty) list.
- Writes to the dummy property just write to the actual property.
Something like this:
[JsonIgnore]
public List<Extension> Extensions { get; set; } = new();

[JsonPropertyName("extension")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public List<Extension> SerializationExtensions
{
    get => Extensions?.Count > 0 ? Extensions : null;
    set => Extensions = value ?? new();
}
QUESTION
android datastore-preferences: Property delegate must have a 'getValue(Context, KProperty<*>)' method
Asked 2022-Feb-28 at 12:19

I'm writing a Jetpack Compose Android app and I need to store some settings permanently.
I decided to use the androidx.datastore:datastore-preferences:1.0.0 library and have added it to my classpath.
According to the https://developer.android.com/topic/libraries/architecture/datastore description I have added this line of code to my Kotlin file at the top level:
val Context.prefsDataStore: DataStore<Preferences> by preferencesDataStore(name = "settings")
But I get a compile error:
e: ...SettingsViewModel.kt: (13, 50): Property delegate must have a 'getValue(Context, KProperty<*>)' method. None of the following functions is suitable:
public abstract operator fun getValue(thisRef: Context, property: KProperty<*>): DataStore<Preferences> defined in kotlin.properties.ReadOnlyProperty
How can I use the datastore-preferences?
My build.gradle file:
plugins {
    id 'com.android.application'
    id 'kotlin-android'
    id 'kotlin-kapt'
}

apply plugin: 'dagger.hilt.android.plugin'
apply plugin: 'kotlinx-serialization'

android {
    compileSdk 31

    defaultConfig {
        applicationId "hu.homedashboard.mobile"
        minSdk 22
        targetSdk 31
        versionCode 1
        versionName "1.0"

        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
        vectorDrawables {
            useSupportLibrary true
        }
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
    kotlinOptions {
        jvmTarget = '1.8'
        useIR = true
    }
    buildFeatures {
        compose true
    }
    composeOptions {
        kotlinCompilerExtensionVersion compose_version
        kotlinCompilerVersion "$kotlinVersion"
    }
    packagingOptions {
        resources {
            excludes += '/META-INF/{AL2.0,LGPL2.1}'
        }
    }
}

dependencies {

    implementation "androidx.activity:activity-compose:1.3.1"
    implementation "androidx.appcompat:appcompat:1.3.1"
    implementation "androidx.datastore:datastore-preferences:1.0.0"
    implementation "androidx.compose.material:material:$compose_version"
    implementation "androidx.compose.ui:ui-tooling-preview:$compose_version"
    implementation "androidx.compose.ui:ui:$compose_version"
    implementation "com.google.accompanist:accompanist-swiperefresh:0.20.3"
    implementation "androidx.core:core-ktx:1.6.0"
    implementation "androidx.lifecycle:lifecycle-livedata-ktx:$lifecycle_version"
    implementation "androidx.lifecycle:lifecycle-runtime-ktx:$lifecycle_version"
    implementation "androidx.lifecycle:lifecycle-runtime-ktx:2.3.1"
    implementation "androidx.lifecycle:lifecycle-viewmodel-compose:$lifecycle_version"
    implementation "androidx.lifecycle:lifecycle-viewmodel-ktx:$lifecycle_version"
    implementation "androidx.navigation:navigation-compose:2.4.0-alpha10"
    implementation "com.google.android.material:material:1.4.0"
    implementation "com.google.dagger:hilt-android:2.40.1"
    implementation "com.jakewharton.retrofit:retrofit2-kotlinx-serialization-converter:0.8.0"
    implementation "com.squareup.retrofit2:retrofit:2.9.0"
    implementation "org.jetbrains.kotlin:kotlin-stdlib:$kotlinVersion"
    implementation "org.jetbrains.kotlinx:kotlinx-serialization-json:1.3.0"
    implementation "org.ocpsoft.prettytime:prettytime:5.0.2.Final"
    implementation 'androidx.hilt:hilt-navigation-compose:1.0.0-alpha03'

    kapt "com.google.dagger:hilt-compiler:2.38.1"
    kapt "com.google.dagger:dagger-android-processor:2.40.1"
    kapt "com.google.guava:guava:31.0.1-android"

    api "com.google.guava:guava:31.0.1-android"

    testImplementation 'junit:junit:4.+'
    androidTestImplementation 'androidx.test.ext:junit:1.1.3'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.4.0'
    androidTestImplementation "androidx.compose.ui:ui-test-junit4:$compose_version"
    debugImplementation "androidx.compose.ui:ui-tooling:$compose_version"
}

kapt {
    correctErrorTypes true
    javacOptions {
        option("-Xmaxerrs", 500)
    }
}
ANSWER
Answered 2022-Jan-13 at 09:20

I got this error because of an incorrect import:
import java.util.prefs.Preferences
So fix it by
import androidx.datastore.preferences.core.Preferences
or
val Context.dataStore by preferencesDataStore(name = "settings")
QUESTION
Saving model on Tensorflow 2.7.0 with data augmentation layer
Asked 2022-Feb-04 at 17:25

I am getting an error when trying to save a model with data augmentation layers with Tensorflow version 2.7.0.
Here is the code of data augmentation:
input_shape_rgb = (img_height, img_width, 3)
data_augmentation_rgb = tf.keras.Sequential(
  [
    layers.RandomFlip("horizontal"),
    layers.RandomFlip("vertical"),
    layers.RandomRotation(0.5),
    layers.RandomZoom(0.5),
    layers.RandomContrast(0.5),
    RandomColorDistortion(name='random_contrast_brightness/none'),
  ]
)
Now I build my model like this:
# Build the model
input_shape = (img_height, img_width, 3)

model = Sequential([
  layers.Input(input_shape),
  data_augmentation_rgb,
  layers.Rescaling((1./255)),

  layers.Conv2D(16, kernel_size, padding=padding, activation='relu', strides=1,
                data_format='channels_last'),
  layers.MaxPooling2D(),
  layers.BatchNormalization(),

  layers.Conv2D(32, kernel_size, padding=padding, activation='relu'), # best 4
  layers.MaxPooling2D(),
  layers.BatchNormalization(),

  layers.Conv2D(64, kernel_size, padding=padding, activation='relu'), # best 3
  layers.MaxPooling2D(),
  layers.BatchNormalization(),

  layers.Conv2D(128, kernel_size, padding=padding, activation='relu'), # best 3
  layers.MaxPooling2D(),
  layers.BatchNormalization(),

  layers.Flatten(),
  layers.Dense(128, activation='relu'), # best 1
  layers.Dropout(0.1),
  layers.Dense(128, activation='relu'), # best 1
  layers.Dropout(0.1),
  layers.Dense(64, activation='relu'), # best 1
  layers.Dropout(0.1),
  layers.Dense(num_classes, activation='softmax')
])

model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=metrics)
model.summary()
Then after the training is done I just make:
model.save("./")
And I'm getting this error:
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-84-87d3f09f8bee> in <module>()
----> 1 model.save("./")

/usr/local/lib/python3.7/dist-packages/keras/utils/traceback_utils.py in error_handler(*args, **kwargs)
     65       except Exception as e:  # pylint: disable=broad-except
     66         filtered_tb = _process_traceback_frames(e.__traceback__)
---> 67         raise e.with_traceback(filtered_tb) from None
     68       finally:
     69         del filtered_tb

/usr/local/lib/python3.7/dist-packages/tensorflow/python/saved_model/function_serialization.py in serialize_concrete_function(concrete_function, node_ids, coder)
     66   except KeyError:
     67     raise KeyError(
---> 68       f"Failed to add concrete function '{concrete_function.name}' to object-"
     69       f"based SavedModel as it captures tensor {capture!r} which is unsupported"
     70       " or not reachable from root. "

KeyError: "Failed to add concrete function 'b'__inference_sequential_46_layer_call_fn_662953'' to object-based SavedModel as it captures tensor <tf.Tensor: shape=(), dtype=resource, value=<Resource Tensor>> which is unsupported or not reachable from root. One reason could be that a stateful object or a variable that the function depends on is not assigned to an attribute of the serialized trackable object (see SaveTest.test_captures_unreachable_variable)."
I inspected the reason for this error by changing the architecture of my model and found that it comes from the data_augmentation layer, since RandomFlip, RandomRotation and the others changed from layers.experimental.preprocessing.RandomFlip to layers.RandomFlip, but the error still appears.
ANSWER
Answered 2022-Feb-04 at 17:25

This seems to be a bug in Tensorflow 2.7 when using model.save combined with the parameter save_format="tf", which is set by default. The layers RandomFlip, RandomRotation, RandomZoom, and RandomContrast are causing the problems, since they are not serializable. Interestingly, the Rescaling layer can be saved without any problems. A workaround would be to simply save your model with the older Keras H5 format, model.save("test", save_format='h5'):
import tensorflow as tf
import numpy as np

class RandomColorDistortion(tf.keras.layers.Layer):
    def __init__(self, contrast_range=[0.5, 1.5],
                 brightness_delta=[-0.2, 0.2], **kwargs):
        super(RandomColorDistortion, self).__init__(**kwargs)
        self.contrast_range = contrast_range
        self.brightness_delta = brightness_delta

    def call(self, images, training=None):
        if not training:
            return images
        contrast = np.random.uniform(
            self.contrast_range[0], self.contrast_range[1])
        brightness = np.random.uniform(
            self.brightness_delta[0], self.brightness_delta[1])

        images = tf.image.adjust_contrast(images, contrast)
        images = tf.image.adjust_brightness(images, brightness)
        images = tf.clip_by_value(images, 0, 1)
        return images

    def get_config(self):
        config = super(RandomColorDistortion, self).get_config()
        config.update({"contrast_range": self.contrast_range, "brightness_delta": self.brightness_delta})
        return config

input_shape_rgb = (256, 256, 3)
data_augmentation_rgb = tf.keras.Sequential(
  [
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomFlip("vertical"),
    tf.keras.layers.RandomRotation(0.5),
    tf.keras.layers.RandomZoom(0.5),
    tf.keras.layers.RandomContrast(0.5),
    RandomColorDistortion(name='random_contrast_brightness/none'),
  ]
)
input_shape = (256, 256, 3)
padding = 'same'
kernel_size = 3
model = tf.keras.Sequential([
  tf.keras.layers.Input(input_shape),
  data_augmentation_rgb,
  tf.keras.layers.Rescaling((1./255)),
  tf.keras.layers.Conv2D(16, kernel_size, padding=padding, activation='relu', strides=1,
                         data_format='channels_last'),
  tf.keras.layers.MaxPooling2D(),
  tf.keras.layers.BatchNormalization(),

  tf.keras.layers.Conv2D(32, kernel_size, padding=padding, activation='relu'), # best 4
  tf.keras.layers.MaxPooling2D(),
  tf.keras.layers.BatchNormalization(),

  tf.keras.layers.Conv2D(64, kernel_size, padding=padding, activation='relu'), # best 3
  tf.keras.layers.MaxPooling2D(),
  tf.keras.layers.BatchNormalization(),

  tf.keras.layers.Conv2D(128, kernel_size, padding=padding, activation='relu'), # best 3
  tf.keras.layers.MaxPooling2D(),
  tf.keras.layers.BatchNormalization(),

  tf.keras.layers.Flatten(),
  tf.keras.layers.Dense(128, activation='relu'), # best 1
  tf.keras.layers.Dropout(0.1),
  tf.keras.layers.Dense(128, activation='relu'), # best 1
  tf.keras.layers.Dropout(0.1),
  tf.keras.layers.Dense(64, activation='relu'), # best 1
  tf.keras.layers.Dropout(0.1),
  tf.keras.layers.Dense(5, activation='softmax')
])

model.compile(loss='categorical_crossentropy', optimizer='adam')
model.summary()
model.save("test", save_format='h5')
Loading your model with your custom layer would look like this then:
model = tf.keras.models.load_model('test.h5', custom_objects={'RandomColorDistortion': RandomColorDistortion})
where RandomColorDistortion is the name of your custom layer.
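Beyond the H5 workaround above, a commonly used alternative (not part of this answer, only a sketch under assumed names, shapes, and paths) is to keep the random augmentation layers out of the saved model entirely and apply them in the tf.data input pipeline, so the default SavedModel export never has to serialize them. The dummy dataset, layer sizes, and file path below are illustrative assumptions:

import numpy as np
import tensorflow as tf

# Augmentation lives in the input pipeline, not in the model that gets saved.
data_augmentation_rgb = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.5),
])

def augment(images, labels):
    # training=True forces the random ops to run even outside of fit()
    return data_augmentation_rgb(images, training=True), labels

# Dummy data standing in for the real training set (assumption).
images = np.random.rand(8, 256, 256, 3).astype("float32")
labels = np.random.randint(0, 5, size=(8,))
train_ds = (tf.data.Dataset.from_tensor_slices((images, labels))
            .batch(4)
            .map(augment, num_parallel_calls=tf.data.AUTOTUNE))

# The model itself contains only serializable layers.
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1. / 255, input_shape=(256, 256, 3)),
    tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(5, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(train_ds, epochs=1)
model.save("augmentation_free_model")  # default SavedModel format now works

The trade-off is that augmentation is no longer part of the exported graph, which is usually fine since it is only wanted at training time.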
QUESTION
Forcing VS2022 to use 32 bit version of msbuild
Asked 2022-Feb-03 at 06:44

I'm currently investigating migrating our toolset from VS2013, 15, and 17 to just VS2022 in order to streamline and reduce the amount of software needed to build our solution. VS2022 has all the parts we need; however, I'm struggling to get the actual solution to build through VS itself.
The solution builds entirely fine if I use the accompanying 32 bit version of msbuild at the path C:\Program Files\Microsoft Visual Studio\2022\Community\Msbuild\Current\Bin\MSBuild.exe. However, the C:\Program Files\Microsoft Visual Studio\2022\Community\Msbuild\Current\Bin\amd64\MSBuild.exe version (which I'm guessing VS22 uses?) fails to build with the same errors as through VS22.
It's important to note our solution/projects are primarily based on .NET Framework 2.0 (unfortunately) and are set to the x86 platform.
I keep cycling back to the following error, SGEN : error : An attempt was made to load an assembly with an incorrect format, on various first and third-party DLLs. I've been trialling back and forth combinations of the following:
- changing projects between x86 and AnyCPU, but the error still crops up regardless of the project and its references' configurations
- turning off Generate serialization assembly on the projects, as I've seen somewhere that people have had success solving the error by doing this
- changing ToolsVersion from 4.0 (our current setting) to Current and other versions
I'm effectively looking for a way to force VS2022 to use the 32bit msbuild if that's at all possible as it builds the entire solution without making any changes at all.
I did also notice the following in VS output window:
Task attempted to find "sgen.exe" using the SdkToolsPath value "". Make sure the SdkToolsPath is set to the correct value and the tool exists in the correct processor specific location below it. (TaskId:109)
4> C:\Program Files (x86)\Microsoft SDKs\Windows\v10.0A\bin\NETFX 4.8 Tools\x64\sgen.exe
Would setting the SdkToolsPath to the 32 bit version solve my problem? I haven't actually been able to find where to do this to try it.
Thank you!
ANSWER
Answered 2022-Feb-03 at 06:44

What has worked for us is overriding the SGenToolPath project property manually in the .csproj files. Add the following:
<Target Name="SGENBeforeBuild" BeforeTargets="PrepareForBuild">
  <PropertyGroup>
    <!-- workaround for VS2022 x64 building serialization assemblies for x86 targets,
         see https://stackoverflow.com/questions/70770918/forcing-vs2022-to-use-32-bit-version-of-msbuild -->
    <SGenToolPath>$(TargetFrameworkSDKToolsDirectory)</SGenToolPath>
  </PropertyGroup>
</Target>
Interestingly, we had to use this workaround for some x86 targeting projects, not all of them.
In fact, you might (we did) run into a similar problem with a number of Visual Studio build tool steps. Here's an MSDN page that lists project properties for most Visual Studio (2022) build tools:
https://docs.microsoft.com/en-us/visualstudio/msbuild/common-msbuild-project-properties?view=vs-2022
QUESTION
.NET 6 XmlSerializer Pretty print
Asked 2022-Jan-05 at 21:52

I have this sample .NET 6 program printing out a serialised object to XML:
using System.Text;
using System.Xml.Serialization;

var serializer = new XmlSerializer(typeof(Order));

var order = new Order
{
    Address = new Address
    {
        FirstName = "Name"
    }
};

await using var memoryStream = new MemoryStream();
var streamWriter = new StreamWriter(memoryStream, Encoding.UTF8);
serializer.Serialize(streamWriter, order);

var result = Encoding.UTF8.GetString(memoryStream.ToArray());

Console.WriteLine(result);

public class Order
{
    public Address Address;
}

public class Address
{
    public string FirstName;
}
This results in this output:
<?xml version="1.0" encoding="utf-8"?><Order xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"><Address><FirstName>Name</FirstName></Address></Order>
In .NET 5 and .NET Core 3, similar code results in pretty printed XML like below. How can I format this XML in .NET 6?
<?xml version="1.0" encoding="utf-8"?>
<Order xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <Address>
    <FirstName>Name</FirstName>
  </Address>
</Order>
ANSWER
Answered 2022-Jan-05 at 21:52

To write indented XML you can use XmlTextWriter (instead of just StreamWriter) with Formatting set to Formatting.Indented:
await using var memoryStream = new MemoryStream();
XmlTextWriter streamWriter = new XmlTextWriter(memoryStream, Encoding.UTF8);
streamWriter.Formatting = Formatting.Indented;
serializer.Serialize(streamWriter, order);

var result = Encoding.UTF8.GetString(memoryStream.ToArray());
UPD: As @sveinungf wrote in the comment, using XmlWriter.Create is the recommended approach, so the code can look like this (also note that the Create method can accept a StringBuilder or a file name, which can be more convenient in some scenarios):
await using var memoryStream = new MemoryStream();
var streamWriter = XmlWriter.Create(memoryStream, new()
{
    Encoding = Encoding.UTF8,
    Indent = true
});
serializer.Serialize(streamWriter, order);
var result = Encoding.UTF8.GetString(memoryStream.ToArray());
QUESTION
Signing payload in JS (Frontend) using EC and validating in Python
Asked 2021-Dec-18 at 11:56

I have a Python backend that generates public/private keys, generates a payload, then needs to get that payload signed by the client (ReactJS or pure JS), which is later verified.
The implementation in Python looks like this:
Imports
import json
import uuid

from backend.config import STARTING_BALANCE
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.asymmetric.utils import (
    encode_dss_signature,
    decode_dss_signature
)
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.exceptions import InvalidSignature

from cryptography.hazmat.primitives.serialization import load_pem_private_key

import base64
import hashlib
Generate keys:
class User:
    def __init__(self):
        self.address = hashlib.sha1(str(str(uuid.uuid4())[0:8]).encode("UTF-8")).hexdigest()
        self.private_key = ec.generate_private_key(
            ec.SECP256K1(),
            default_backend()
        )

        self.private_key_return = self.private_key.private_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PrivateFormat.TraditionalOpenSSL,
            encryption_algorithm=serialization.NoEncryption()
        )

        self.public_key = self.private_key.public_key()

        self.serialize_public_key()

    def serialize_public_key(self):
        """
        Reset the public key to its serialized version.
        """
        self.public_key = self.public_key.public_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PublicFormat.SubjectPublicKeyInfo
        ).decode('utf-8')
Sign:
def sign(self, data):
    """
    Generate a signature based on the data using the local private key.
    """
    return decode_dss_signature(self.private_key.sign(
        json.dumps(data).encode('utf-8'),
        ec.ECDSA(hashes.SHA256())
    ))
Verify:
@staticmethod
def verify(public_key, data, signature):
    """
    Verify a signature based on the original public key and data.
    """
    deserialized_public_key = serialization.load_pem_public_key(
        public_key.encode('utf-8'),
        default_backend()
    )

    (r, s) = signature

    try:
        deserialized_public_key.verify(
            encode_dss_signature(r, s),
            json.dumps(data).encode('utf-8'),
            ec.ECDSA(hashes.SHA256())
        )
        return True
    except InvalidSignature:
        return False
What I need now is to load (or even generate) the PEM keys on the client side and then, upon request, sign a JSON payload that can later be verified by the Python backend.
I tried looking into the Web Cryptography API and CryptoJS but had no luck.
I'm okay using another algorithm that is more compatible, but at the very least I need the signing functionality fully working.
I also tried compiling Python to JS using Brython and Pyodide but both could not support all the required packages.
In simple terms, I am looking for the following:
Generate Payload (Python) -----> Sign Payload (JS) -----> Verify Signature (Python)
Any help/advice would be greatly appreciated.
ANSWER
Answered 2021-Dec-18 at 11:56
CryptoJS only supports symmetric encryption and therefore does not support ECDSA. WebCrypto supports ECDSA, but not secp256k1.
WebCrypto has the advantage that it is supported by all major browsers. Since you can use other curves according to your comment, I will describe a solution with a curve supported by WebCrypto.
Otherwise, sjcl would also be an alternative: a pure JavaScript library that supports ECDSA and, in particular, secp256k1.
WebCrypto is a low-level API that provides the functionality you need, such as key generation, key export and signing. Regarding ECDSA, WebCrypto supports the curves P-256 (aka secp256r1), P-384 (aka secp384r1) and P-521 (aka secp521r1). In the following I use P-256.
The following JavaScript code generates a key pair for P-256, exports the public key in X.509/SPKI format, DER encoded (so it can be sent to the Python side), and signs a message:
(async () => {

    // Generate key pair
    var keypair = await window.crypto.subtle.generateKey(
        {
            name: "ECDSA",
            namedCurve: "P-256", // secp256r1
        },
        false,
        ["sign", "verify"]
    );

    // Export public key in X.509/SPKI format, DER encoded
    var publicKey = await window.crypto.subtle.exportKey(
        "spki",
        keypair.publicKey
    );
    document.getElementById("pub").innerHTML = "Public key: " + ab2b64(publicKey);

    // Sign data
    var data = {
        "data_1":"The quick brown fox",
        "data_2":"jumps over the lazy dog"
    }
    var dataStr = JSON.stringify(data)
    var dataBuf = new TextEncoder().encode(dataStr).buffer
    var signature = await window.crypto.subtle.sign(
        {
            name: "ECDSA",
            hash: {name: "SHA-256"},
        },
        keypair.privateKey,
        dataBuf
    );
    document.getElementById("sig").innerHTML = "Signature: " + ab2b64(signature);

})();

// Helper
function ab2b64(arrayBuffer) {
    return window.btoa(String.fromCharCode.apply(null, new Uint8Array(arrayBuffer)));
}
<p style="font-family:'Courier New', monospace;" id="pub"></p>
<p style="font-family:'Courier New', monospace;" id="sig"></p>
A possible output is:
Public key: MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEWzC5lPNifcHNuKL+/jjhrtTi+9gAMbYui9Vv7TjtS7RCt8p6Y6zUmHVpGEowuVMuOSNxfpJYpnGExNT/eWhuwQ==
Signature: XRNTbkHK7H8XPEIJQhS6K6ncLPEuWWrkXLXiNWwv6ImnL2Dm5VHcazJ7QYQNOvWJmB2T3rconRkT0N4BDFapCQ==
On the Python side a successful verification would be possible with:
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.asymmetric.utils import encode_dss_signature
from cryptography.hazmat.primitives.serialization import load_der_public_key
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature
import base64
import json

publicKeyDer = base64.b64decode("MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEWzC5lPNifcHNuKL+/jjhrtTi+9gAMbYui9Vv7TjtS7RCt8p6Y6zUmHVpGEowuVMuOSNxfpJYpnGExNT/eWhuwQ==")
data = {
    "data_1":"The quick brown fox",
    "data_2":"jumps over the lazy dog"
}
signature = base64.b64decode("XRNTbkHK7H8XPEIJQhS6K6ncLPEuWWrkXLXiNWwv6ImnL2Dm5VHcazJ7QYQNOvWJmB2T3rconRkT0N4BDFapCQ==")

publicKey = load_der_public_key(publicKeyDer, default_backend())
r = int.from_bytes(signature[:32], byteorder='big')
s = int.from_bytes(signature[32:], byteorder='big')

try:
    publicKey.verify(
        encode_dss_signature(r, s),
        json.dumps(data, separators=(',', ':')).encode('utf-8'),
        ec.ECDSA(hashes.SHA256())
    )
    print("verification succeeded")
except InvalidSignature:
    print("verification failed")
Here, unlike the posted Python code, load_der_public_key() is used instead of load_pem_public_key().
Also, WebCrypto returns the signature in IEEE P1363 format, i.e. as r and s concatenated in a single ArrayBuffer (r|s), so both parts must be converted to integers to allow a format conversion to ASN.1/DER with encode_dss_signature().
Regarding JSON, the separators have to be redefined to the most compact representation so that json.dumps() produces exactly the bytes that JSON.stringify() signed on the JavaScript side (this depends on the settings used there).
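For illustration, a minimal sketch of the separator issue, using the example payload from above:
import json

data = {
    "data_1": "The quick brown fox",
    "data_2": "jumps over the lazy dog"
}

# Default separators (', ', ': ') insert spaces, so the bytes differ from
# what JSON.stringify(data) produced on the JavaScript side and the
# signature check fails.
print(json.dumps(data))

# Compact separators reproduce JSON.stringify(data) for this payload.
print(json.dumps(data, separators=(',', ':')))
If the JavaScript side canonicalizes its JSON differently (e.g. key ordering), both sides have to agree on that convention as well.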
QUESTION
How reproducible / deterministic is Parquet format?
Asked 2021-Dec-09 at 03:55
I'm seeking advice from people deeply familiar with the binary layout of Apache Parquet:
Having a data transformation F(a) = b where F is fully deterministic, and the same exact versions of the entire software stack (framework, arrow & parquet libraries) are used - how likely am I to get an identical binary representation of dataframe b on different hosts every time b is saved into Parquet?
In other words, how reproducible is Parquet at the binary level? When the data is logically the same, what can cause binary differences?
- Can there be some uninitialized memory in between values due to alignment?
- Assuming all serialization settings (compression, chunking, use of dictionaries, etc.) are the same, can the result still drift?
I'm working on a system for fully reproducible and deterministic data processing and computing dataset hashes to assert these guarantees.
My key goal has been to ensure that dataset b contains an identical set of records as dataset b' - this is of course very different from hashing a binary representation of Arrow/Parquet. Not wanting to deal with the reproducibility of storage formats, I've been computing logical data hashes in memory. This is slow but flexible, e.g. my hash stays the same even if records are re-ordered (which I consider an equivalent dataset).
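For illustration only, a minimal sketch of such an order-insensitive logical hash (this is not the arrow-digest algorithm mentioned in the update below; the per-record JSON encoding is just a simplifying assumption):
import hashlib
import json

def logical_hash(records):
    # Order-insensitive hash: hash each record individually and combine the
    # digests with modular addition, so record order does not matter.
    acc = 0
    for record in records:
        encoded = json.dumps(record, sort_keys=True, separators=(',', ':')).encode('utf-8')
        digest = hashlib.sha256(encoded).digest()
        acc = (acc + int.from_bytes(digest, 'big')) % (1 << 256)
    return acc.to_bytes(32, 'big').hex()

a = [{"id": 1, "x": "foo"}, {"id": 2, "x": "bar"}]
b = list(reversed(a))  # same records, different order
assert logical_hash(a) == logical_hash(b)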
But when thinking about integrating with IPFS and other content-addressable storages that rely on hashes of files, it would simplify the design a lot to have just one hash (physical) instead of two (logical + physical) - but this means I have to guarantee that Parquet files are reproducible.
Update
I decided to continue using logical hashing for now.
I've created a new Rust crate arrow-digest that implements the stable hashing for Arrow arrays and record batches and tries hard to hide the encoding-related differences. The crate's README describes the hashing algorithm if someone finds it useful and wants to implement it in another language.
I'll continue to expand the set of supported types as I'm integrating it into the decentralized data processing tool I'm working on.
In the long term, I'm not sure logical hashing is the best way forward - a subset of Parquet that makes some efficiency sacrifices just to make file layout deterministic might be a better choice for content-addressability.
ANSWER
Answered 2021-Dec-05 at 04:30
At least in arrow's implementation I would expect (but haven't verified) that the exact same input (including identical metadata), written in the same order and with the same configuration, yields deterministic output (we try not to leave uninitialized values, for security reasons), assuming the chosen compression algorithm also makes that guarantee. It is possible there is some hash-map iteration for metadata or elsewhere that might break this assumption.
As @Pace pointed out, I would not rely on this and recommend against relying on it. There is nothing in the spec that guarantees this, and since the writer version is persisted when writing a file, you are guaranteed a breakage if you ever decide to upgrade. Things will also break if additional metadata is added or removed (I believe in the past there have been some bug fixes for round-tripping data sets that would have caused non-determinism).
So in summary, this might or might not work today, but even if it does, I would expect it to be very brittle.
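One way to probe this empirically is a sketch like the following (assuming pyarrow is installed); matching digests on one machine say nothing about other hosts, other configurations, or future library versions:
import hashlib
import io

import pyarrow as pa
import pyarrow.parquet as pq

def parquet_digest(table):
    # Serialize the table to an in-memory Parquet file and hash the bytes.
    sink = io.BytesIO()
    pq.write_table(table, sink)
    return hashlib.sha256(sink.getvalue()).hexdigest()

table = pa.table({"id": [1, 2, 3], "value": ["a", "b", "c"]})

# Same input, same settings, same process: the digests typically match,
# but nothing in the Parquet spec guarantees this across versions or hosts.
print(parquet_digest(table))
print(parquet_digest(table))
Any change in library version, writer metadata, or compression settings can change the digest even though the logical contents are unchanged.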
QUESTION
How to distribute a Kotlin CLI application?
Asked 2021-Dec-02 at 22:07
I've built a small bot in Kotlin.
It's finished and I can run it from my developer tools. I am using the application plugin to attempt distribution, but I keep failing.
./gradlew run runs the bot as expected.
I was looking for something like ./gradlew installDist and then just running installationDir/bin/App (similar to Ktor apps). But it just exits successfully with no output when I should see a lot of logging output.
What am I doing wrong?
// gradle.build.kts
import org.jetbrains.kotlin.gradle.plugin.statistics.ReportStatisticsToElasticSearch.url
import org.jetbrains.kotlin.gradle.tasks.KotlinCompile

plugins {
    kotlin("jvm") version "1.5.31"
    kotlin("plugin.serialization") version "1.5.31"
    application
}

group = "me.nanospicer"
version = "1.0"

repositories {
    mavenCentral()
    maven {
        url = uri("https://jitpack.io")
    }
}

tasks.withType<KotlinCompile>() {
    kotlinOptions.jvmTarget = "1.8"
}

application {
    mainClass.set("MainKt")
}

val ktor_version = "1.6.5"
dependencies {
    implementation("ch.qos.logback:logback-classic:1.2.7")
    implementation("io.ktor:ktor-client-core:$ktor_version")
    implementation("io.ktor:ktor-client-cio:$ktor_version")
    implementation("io.ktor:ktor-client-serialization:$ktor_version")
    implementation("io.ktor:ktor-client-logging:$ktor_version")
    implementation("org.jetbrains.kotlinx:kotlinx-serialization-json:1.3.1")
}
ANSWER
Answered 2021-Dec-02 at 22:07
- Good news: The setup posted in my question works flawlessly.
- Bad news: My code didn't.
I had a file that didn't exist on the server, and part of the code was returning null, so the app would quit without any output because it wouldn't do anything due to the missing file.
QUESTION
SharpSerializer: Ignore attributes/properties from deserialization
Asked 2021-Nov-10 at 17:43
I am using SharpSerializer to serialize/deserialize objects.
I want the ability to ignore specific properties when deserializing.
SharpSerializer has an option to ignore properties by attribute or by classes and property name:
SharpSerializerSettings.AdvancedSettings.AttributesToIgnore
SharpSerializerSettings.AdvancedSettings.PropertiesToIgnore
but it seems that these settings are only used to ignore properties during serialization, not during deserialization (I tested with the GitHub source code and the NuGet package).
Am I correct?
Is there any way to ignore attributes/properties from deserialization?
P.S.
- I'm sure there are other great serialization libraries, but it will take a great amount of effort to change the code and all the existing serialized files.
- I opened an issue on the GitHub project, but the project does not seem to be active since 2018.
- The object with properties to ignore need not be the root object.
ANSWER
Answered 2021-Nov-10 at 17:43
You are correct that SharpSerializer does not implement ignoring of property values when deserializing. This can be verified from the reference source for ObjectFactory.fillProperties(object obj, IEnumerable<Property> properties):
private void fillProperties(object obj, IEnumerable<Property> properties)
{
    foreach (Property property in properties)
    {
        PropertyInfo propertyInfo = obj.GetType().GetProperty(property.Name);
        if (propertyInfo == null) continue;

        object value = CreateObject(property);
        if (value == null) continue;

        propertyInfo.SetValue(obj, value, _emptyObjectArray);
    }
}
This code unconditionally sets any property read from the serialization stream into the incoming object using reflection, without checking the list of ignored attributes or properties.
Thus the only way to ignore your desired properties would seem to be to create your own versions of XmlPropertyDeserializer or BinaryPropertyDeserializer that skip or filter the unwanted properties. The following is one possible implementation for XML. This implementation reads the properties from XML into a Property hierarchy as usual, then applies a filter action to remove properties corresponding to .NET properties with a custom attribute [SharpSerializerIgnoreForDeserialize] applied, then finally creates the object tree using the pruned Property hierarchy.
[System.AttributeUsage(System.AttributeTargets.Property, AllowMultiple = false, Inherited = true)]
public class SharpSerializerIgnoreForDeserializeAttribute : System.Attribute { }

public class PropertyDeserializerDecorator : IPropertyDeserializer
{
    readonly IPropertyDeserializer deserializer;
    public PropertyDeserializerDecorator(IPropertyDeserializer deserializer) => this.deserializer = deserializer ?? throw new ArgumentNullException();

    public virtual void Open(Stream stream) => deserializer.Open(stream);
    public virtual Property Deserialize() => deserializer.Deserialize();
    public virtual void Close() => deserializer.Close();
}

public class CustomPropertyDeserializer : PropertyDeserializerDecorator
{
    Action<Property> deserializePropertyAction;
    public CustomPropertyDeserializer(IPropertyDeserializer deserializer, Action<Property> deserializePropertyAction = default) : base(deserializer) => this.deserializePropertyAction = deserializePropertyAction;
    public override Property Deserialize()
    {
        var property = base.Deserialize();

        if (deserializePropertyAction != null)
            property.WalkProperties(p => deserializePropertyAction(p));

        return property;
    }
}

public static partial class SharpSerializerExtensions
{
    public static SharpSerializer Create(SharpSerializerXmlSettings settings, Action<Property> deserializePropertyAction = default)
    {
        // Adapted from https://github.com/polenter/SharpSerializer/blob/42f9a20b3934a7f2cece356cc8116a861cec0b91/SharpSerializer/SharpSerializer.cs#L139
        // By https://github.com/polenter
        var typeNameConverter = settings.AdvancedSettings.TypeNameConverter ??
                                new TypeNameConverter(
                                    settings.IncludeAssemblyVersionInTypeName,
                                    settings.IncludeCultureInTypeName,
                                    settings.IncludePublicKeyTokenInTypeName);
        // SimpleValueConverter
        var simpleValueConverter = settings.AdvancedSettings.SimpleValueConverter ?? new SimpleValueConverter(settings.Culture, typeNameConverter);
        // XmlWriterSettings
        var xmlWriterSettings = new XmlWriterSettings
        {
            Encoding = settings.Encoding,
            Indent = true,
            OmitXmlDeclaration = true,
        };
        // XmlReaderSettings
        var xmlReaderSettings = new XmlReaderSettings
        {
            IgnoreComments = true,
            IgnoreWhitespace = true,
        };

        // Create Serializer and Deserializer
        var reader = new DefaultXmlReader(typeNameConverter, simpleValueConverter, xmlReaderSettings);
        var writer = new DefaultXmlWriter(typeNameConverter, simpleValueConverter, xmlWriterSettings);

        var _serializer = new XmlPropertySerializer(writer);
        var _deserializer = new CustomPropertyDeserializer(new XmlPropertyDeserializer(reader), deserializePropertyAction);

        var serializer = new SharpSerializer(_serializer, _deserializer)
        {
            //InstanceCreator = settings.InstanceCreator ?? new DefaultInstanceCreator(), -- InstanceCreator not present in SharpSerializer 3.0.1
            RootName = settings.AdvancedSettings.RootName,
        };
        serializer.PropertyProvider.PropertiesToIgnore = settings.AdvancedSettings.PropertiesToIgnore;
        serializer.PropertyProvider.AttributesToIgnore = settings.AdvancedSettings.AttributesToIgnore;

        return serializer;
    }

    public static void WalkProperties(this Property property, Action<Property> action)
    {
        if (action == null || property == null)
            throw new ArgumentNullException();

        // Avoid cyclic dependencies.
        // Reference.IsProcessed is true only for the first reference of an object.
        bool skipProperty = property is ReferenceTargetProperty refTarget
            && refTarget.Reference != null
            && !refTarget.Reference.IsProcessed;

        if (skipProperty) return;

        action(property);

        switch (property.Art)
        {
            case PropertyArt.Collection:
                {
                    foreach (var item in ((CollectionProperty)property).Items)
                        item.WalkProperties(action);
                }
                break;
            case PropertyArt.Complex:
                {
                    foreach (var item in ((ComplexProperty)property).Properties)
                        item.WalkProperties(action);
                }
                break;
            case PropertyArt.Dictionary:
                {
                    foreach (var item in ((DictionaryProperty)property).Items)
                    {
                        item.Key.WalkProperties(action);
                        item.Value.WalkProperties(action);
                    }
                }
                break;
            case PropertyArt.MultiDimensionalArray:
                {
                    foreach (var item in ((MultiDimensionalArrayProperty)property).Items)
                        item.Value.WalkProperties(action);
                }
                break;
            case PropertyArt.Null:
            case PropertyArt.Simple:
            case PropertyArt.Reference:
                break;
            case PropertyArt.SingleDimensionalArray:
                {
                    foreach (var item in ((SingleDimensionalArrayProperty)property).Items)
                        item.WalkProperties(action);
                }
                break;
            default:
                throw new NotImplementedException(property.Art.ToString());
        }
    }

    public static void RemoveIgnoredChildProperties(Property p)
    {
        if (p.Art == PropertyArt.Complex)
        {
            var items = ((ComplexProperty)p).Properties;
            for (int i = items.Count - 1; i >= 0; i--)
            {
                if (p.Type.GetProperty(items[i].Name)?.IsDefined(typeof(SharpSerializerIgnoreForDeserializeAttribute), true) == true)
                {
                    items.RemoveAt(i);
                }
            }
        }
    }
}
Then, given the following models:
public class Root
{
    public List<Model> Models { get; set; } = new ();
}

public class Model
{
    public string Value { get; set; }

    [SharpSerializerIgnoreForDeserialize]
    public string IgnoreMe { get; set; }
}
You would deserialize using the customized XmlPropertyDeserializer as follows:
var settings = new SharpSerializerXmlSettings();
var customSerializer = SharpSerializerExtensions.Create(settings, SharpSerializerExtensions.RemoveIgnoredChildProperties);
var deserialized = (Root)customSerializer.Deserialize(stream);
If you need binary deserialization, use the following factory method to create the serializer instead:
public static partial class SharpSerializerExtensions
{
    public static SharpSerializer Create(SharpSerializerBinarySettings settings, Action<Property> deserializePropertyAction = default)
    {
        // Adapted from https://github.com/polenter/SharpSerializer/blob/42f9a20b3934a7f2cece356cc8116a861cec0b91/SharpSerializer/SharpSerializer.cs#L168
        // By https://github.com/polenter
        var typeNameConverter = settings.AdvancedSettings.TypeNameConverter ??
                                new TypeNameConverter(
                                    settings.IncludeAssemblyVersionInTypeName,
                                    settings.IncludeCultureInTypeName,
                                    settings.IncludePublicKeyTokenInTypeName);

        // Create Serializer and Deserializer
        Polenter.Serialization.Advanced.Binary.IBinaryReader reader;
        Polenter.Serialization.Advanced.Binary.IBinaryWriter writer;
        if (settings.Mode == BinarySerializationMode.Burst)
        {
            // Burst mode
            writer = new BurstBinaryWriter(typeNameConverter, settings.Encoding);
            reader = new BurstBinaryReader(typeNameConverter, settings.Encoding);
        }
        else
        {
            // Size optimized mode
            writer = new SizeOptimizedBinaryWriter(typeNameConverter, settings.Encoding);
            reader = new SizeOptimizedBinaryReader(typeNameConverter, settings.Encoding);
        }

        var _serializer = new BinaryPropertySerializer(writer);
        var _deserializer = new CustomPropertyDeserializer(new BinaryPropertyDeserializer(reader), deserializePropertyAction);

        var serializer = new SharpSerializer(_serializer, _deserializer)
        {
            //InstanceCreator = settings.InstanceCreator ?? new DefaultInstanceCreator(), -- InstanceCreator not present in SharpSerializer 3.0.1
            RootName = settings.AdvancedSettings.RootName,
        };
        serializer.PropertyProvider.PropertiesToIgnore = settings.AdvancedSettings.PropertiesToIgnore;
        serializer.PropertyProvider.AttributesToIgnore = settings.AdvancedSettings.AttributesToIgnore;

        return serializer;
    }
}
And do:
1SharpSerializerSettings.AdvancedSettings.AttributesToIgnore
2SharpSerializerSettings.AdvancedSettings.PropertiesToIgnore
3private void fillProperties(object obj, IEnumerable<Property> properties)
4{
5 foreach (Property property in properties)
6 {
7 PropertyInfo propertyInfo = obj.GetType().GetProperty(property.Name);
8 if (propertyInfo == null) continue;
9
10 object value = CreateObject(property);
11 if (value == null) continue;
12
13 propertyInfo.SetValue(obj, value, _emptyObjectArray);
14 }
15}
16[System.AttributeUsage(System.AttributeTargets.Property, AllowMultiple = false, Inherited = true)]
17public class SharpSerializerIgnoreForDeserializeAttribute : System.Attribute { }
18
19public class PropertyDeserializerDecorator : IPropertyDeserializer
20{
21 readonly IPropertyDeserializer deserializer;
22 public PropertyDeserializerDecorator(IPropertyDeserializer deserializer) => this.deserializer = deserializer ?? throw new ArgumentNullException();
23
24 public virtual void Open(Stream stream) => deserializer.Open(stream);
25 public virtual Property Deserialize() => deserializer.Deserialize();
26 public virtual void Close() => deserializer.Close();
27}
28
29public class CustomPropertyDeserializer : PropertyDeserializerDecorator
30{
31 Action<Property> deserializePropertyAction;
32 public CustomPropertyDeserializer(IPropertyDeserializer deserializer, Action<Property> deserializePropertyAction = default) : base(deserializer) => this.deserializePropertyAction = deserializePropertyAction;
33 public override Property Deserialize()
34 {
35 var property = base.Deserialize();
36
37 if (deserializePropertyAction != null)
38 property.WalkProperties(p => deserializePropertyAction(p));
39
40 return property;
41 }
42}
43
44public static partial class SharpSerializerExtensions
45{
46 public static SharpSerializer Create(SharpSerializerXmlSettings settings, Action<Property> deserializePropertyAction = default)
47 {
48 // Adapted from https://github.com/polenter/SharpSerializer/blob/42f9a20b3934a7f2cece356cc8116a861cec0b91/SharpSerializer/SharpSerializer.cs#L139
49 // By https://github.com/polenter
50 var typeNameConverter = settings.AdvancedSettings.TypeNameConverter ??
51 new TypeNameConverter(
52 settings.IncludeAssemblyVersionInTypeName,
53 settings.IncludeCultureInTypeName,
54 settings.IncludePublicKeyTokenInTypeName);
55 // SimpleValueConverter
56 var simpleValueConverter = settings.AdvancedSettings.SimpleValueConverter ?? new SimpleValueConverter(settings.Culture, typeNameConverter);
57 // XmlWriterSettings
58 var xmlWriterSettings = new XmlWriterSettings
59 {
60 Encoding = settings.Encoding,
61 Indent = true,
62 OmitXmlDeclaration = true,
63 };
64 // XmlReaderSettings
65 var xmlReaderSettings = new XmlReaderSettings
66 {
67 IgnoreComments = true,
68 IgnoreWhitespace = true,
69 };
70
71 // Create Serializer and Deserializer
72 var reader = new DefaultXmlReader(typeNameConverter, simpleValueConverter, xmlReaderSettings);
73 var writer = new DefaultXmlWriter(typeNameConverter, simpleValueConverter, xmlWriterSettings);
74
75 var _serializer = new XmlPropertySerializer(writer);
76 var _deserializer = new CustomPropertyDeserializer(new XmlPropertyDeserializer(reader), deserializePropertyAction);
77
78 var serializer = new SharpSerializer(_serializer, _deserializer)
79 {
80 //InstanceCreator = settings.InstanceCreator ?? new DefaultInstanceCreator(), -- InstanceCreator not present in SharpSerializer 3.0.1
81 RootName = settings.AdvancedSettings.RootName,
82 };
83 serializer.PropertyProvider.PropertiesToIgnore = settings.AdvancedSettings.PropertiesToIgnore;
84 serializer.PropertyProvider.AttributesToIgnore = settings.AdvancedSettings.AttributesToIgnore;
85
86 return serializer;
87 }
88
89 public static void WalkProperties(this Property property, Action<Property> action)
90 {
91 if (action == null || property == null)
92 throw new ArgumentNullException();
93
94 // Avoid cyclic dependencies.
95 // Reference.IsProcessed is true only for the first reference of an object.
96 bool skipProperty = property is ReferenceTargetProperty refTarget
97 && refTarget.Reference != null
98 && !refTarget.Reference.IsProcessed;
99
100 if (skipProperty) return;
101
102 action(property);
103
104 switch (property.Art)
105 {
106 case PropertyArt.Collection:
107 {
108 foreach (var item in ((CollectionProperty)property).Items)
109 item.WalkProperties(action);
110 }
111 break;
112 case PropertyArt.Complex:
113 {
114 foreach (var item in ((ComplexProperty)property).Properties)
115 item.WalkProperties(action);
116 }
117 break;
118 case PropertyArt.Dictionary:
119 {
120 foreach (var item in ((DictionaryProperty)property).Items)
121 {
122 item.Key.WalkProperties(action);
123 item.Value.WalkProperties(action);
124 }
125 }
126 break;
127 case PropertyArt.MultiDimensionalArray:
128 {
129 foreach (var item in ((MultiDimensionalArrayProperty )property).Items)
130 item.Value.WalkProperties(action);
131 }
132 break;
133 case PropertyArt.Null:
134 case PropertyArt.Simple:
135 case PropertyArt.Reference:
136 break;
137 case PropertyArt.SingleDimensionalArray:
138 {
139 foreach (var item in ((SingleDimensionalArrayProperty)property).Items)
140 item.WalkProperties(action);
141 }
142 break;
143 default:
144 throw new NotImplementedException(property.Art.ToString());
145 }
146 }
147
148 public static void RemoveIgnoredChildProperties(Property p)
149 {
150 if (p.Art == PropertyArt.Complex)
151 {
152 var items = ((ComplexProperty)p).Properties;
153 for (int i = items.Count - 1; i >= 0; i--)
154 {
155 if (p.Type.GetProperty(items[i].Name)?.IsDefined(typeof(SharpSerializerIgnoreForDeserializeAttribute), true) == true)
156 {
157 items.RemoveAt(i);
158 }
159 }
160 }
161 }
162}
163public class Root
164{
165 public List<Model> Models { get; set; } = new ();
166}
167
168public class Model
169{
170 public string Value { get; set; }
171
172 [SharpSerializerIgnoreForDeserialize]
173 public string IgnoreMe { get; set; }
174}
175var settings = new SharpSerializerXmlSettings();
176var customSerializer = SharpSerializerExtensions.Create(settings, SharpSerializerExtensions.RemoveIgnoredChildProperties);
177var deserialized = (Root)customSerializer.Deserialize(stream);
178public static partial class SharpSerializerExtensions
179{
180 public static SharpSerializer Create(SharpSerializerBinarySettings settings, Action<Property> deserializePropertyAction = default)
181 {
182 // Adapted from https://github.com/polenter/SharpSerializer/blob/42f9a20b3934a7f2cece356cc8116a861cec0b91/SharpSerializer/SharpSerializer.cs#L168
183 // By https://github.com/polenter
184 var typeNameConverter = settings.AdvancedSettings.TypeNameConverter ??
185 new TypeNameConverter(
186 settings.IncludeAssemblyVersionInTypeName,
187 settings.IncludeCultureInTypeName,
188 settings.IncludePublicKeyTokenInTypeName);
189
190 // Create Serializer and Deserializer
191 Polenter.Serialization.Advanced.Binary.IBinaryReader reader;
192 Polenter.Serialization.Advanced.Binary.IBinaryWriter writer;
193 if (settings.Mode == BinarySerializationMode.Burst)
194 {
195 // Burst mode
196 writer = new BurstBinaryWriter(typeNameConverter, settings.Encoding);
197 reader = new BurstBinaryReader(typeNameConverter, settings.Encoding);
198 }
199 else
200 {
201 // Size optimized mode
202 writer = new SizeOptimizedBinaryWriter(typeNameConverter, settings.Encoding);
203 reader = new SizeOptimizedBinaryReader(typeNameConverter, settings.Encoding);
204 }
205
206 var _serializer = new BinaryPropertySerializer(writer);
207 var _deserializer = new CustomPropertyDeserializer(new BinaryPropertyDeserializer(reader), deserializePropertyAction);
208
209 var serializer = new SharpSerializer(_serializer, _deserializer)
210 {
211 //InstanceCreator = settings.InstanceCreator ?? new DefaultInstanceCreator(), -- InstanceCreator not present in SharpSerializer 3.0.1
212 RootName = settings.AdvancedSettings.RootName,
213 };
214 serializer.PropertyProvider.PropertiesToIgnore = settings.AdvancedSettings.PropertiesToIgnore;
215 serializer.PropertyProvider.AttributesToIgnore = settings.AdvancedSettings.AttributesToIgnore;
216
217 return serializer;
218 }
219}
220var settings = new SharpSerializerBinarySettings();
221var customSerializer = SharpSerializerExtensions.Create(settings, SharpSerializerExtensions.RemoveIgnoredChildProperties);
222var deserialized = (Root)customSerializer.Deserialize(stream);
223
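For completeness, here is a minimal round-trip sketch of the binary variant. It assumes the Root and Model sample types above, the custom serializer created a few lines earlier, and SharpSerializer's public Serialize(object, Stream) and Deserialize(Stream) overloads; the sample values and console output are only illustrative.

using System;
using System.IO;

var settings = new SharpSerializerBinarySettings();
var customSerializer = SharpSerializerExtensions.Create(settings, SharpSerializerExtensions.RemoveIgnoredChildProperties);

var original = new Root();
original.Models.Add(new Model { Value = "keep me", IgnoreMe = "drop me" });

using (var stream = new MemoryStream())
{
    // Serialize with the same custom serializer, then rewind and deserialize.
    customSerializer.Serialize(original, stream);
    stream.Position = 0;

    var roundTripped = (Root)customSerializer.Deserialize(stream);

    // Value survives the round trip; IgnoreMe is expected to come back null,
    // because RemoveIgnoredChildProperties strips it from the property graph
    // before values are assigned during deserialization.
    Console.WriteLine($"{roundTripped.Models[0].Value} / {roundTripped.Models[0].IgnoreMe ?? "null"}");
}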
Notes:
- The SharpSerializerExtensions.Create() methods were modeled on SharpSerializer.initialize(SharpSerializerXmlSettings settings) and SharpSerializer.initialize(SharpSerializerBinarySettings settings) by Pawel Idzikowski.
- The version of SharpSerializer available on NuGet, 3.0.1, only includes commits through 10/8/2017. Submissions since then, which add the ability to use Autofac as the instance creator, are not available via NuGet. My code is based on the version available via NuGet, and thus does not initialize SharpSerializer.InstanceCreator, which was added in 2018. The project appears not to have been updated at all since then.
- SharpSerializer.Deserialize() deserializes to the type specified in the serialization stream rather than to a type specified by the caller. It thus appears vulnerable to the sort of type injection attacks described in Alvaro Muñoz & Oleksandr Mirosh's Black Hat paper https://www.blackhat.com/docs/us-17/thursday/us-17-Munoz-Friday-The-13th-JSON-Attacks-wp.pdf. For details see e.g. the TypeNameHandling caution in Newtonsoft Json.
- If you are willing to fork, modify and build SharpSerializer yourself, you might consider updating ObjectFactory.fillProperties(object obj, IEnumerable<Property> properties) to not set ignored properties; a rough sketch of that change follows below.
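To make that last note concrete, the change might look roughly like the sketch below. This is not SharpSerializer's actual source: the method body, the CanWrite check, and the CreateMemberValue helper are hypothetical placeholders built around the fillProperties signature quoted above, and they only show where the extra guard would go.

// Hypothetical sketch of a forked ObjectFactory.fillProperties that skips ignored members.
// CreateMemberValue stands in for whatever value-building logic the real method already uses.
private void fillProperties(object obj, IEnumerable<Property> properties)
{
    foreach (Property property in properties)
    {
        System.Reflection.PropertyInfo propertyInfo = obj.GetType().GetProperty(property.Name);
        if (propertyInfo == null || !propertyInfo.CanWrite)
            continue;

        // Extra guard: never assign members marked with the ignore-for-deserialize attribute.
        if (propertyInfo.IsDefined(typeof(SharpSerializerIgnoreForDeserializeAttribute), true))
            continue;

        object value = CreateMemberValue(property); // placeholder for the existing logic
        propertyInfo.SetValue(obj, value, null);
    }
}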
QUESTION
Kafka integration tests in Gradle runs into GitHub Actions
Asked 2021-Nov-03 at 19:11
We've been moving our applications from CircleCI to GitHub Actions in our company and we got stuck in a strange situation.
There has been no change to the project's code, but our Kafka integration tests started to fail on the GitHub Actions machines. Everything works fine in CircleCI and locally (macOS and Fedora Linux machines).
Both the CircleCI and GitHub Actions machines run Ubuntu (tested versions were 18.04 and 20.04). macOS was not tested in GitHub Actions as it doesn't have Docker available.
Here are the docker-compose and workflow files used by the build and integration tests:
- docker-compose.yml
version: '2.1'

services:
  postgres:
    container_name: listings-postgres
    image: postgres:10-alpine
    mem_limit: 500m
    networks:
      - listings-stack
    ports:
      - "5432:5432"
    environment:
      POSTGRES_DB: listings
      POSTGRES_PASSWORD: listings
      POSTGRES_USER: listings
      PGUSER: listings
    healthcheck:
      test: ["CMD", "pg_isready"]
      interval: 1s
      timeout: 3s
      retries: 30

  listings-zookeeper:
    container_name: listings-zookeeper
    image: confluentinc/cp-zookeeper:6.2.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    networks:
      - listings-stack
    ports:
      - "2181:2181"
    healthcheck:
      test: nc -z localhost 2181 || exit -1
      interval: 10s
      timeout: 5s
      retries: 10

  listings-kafka:
    container_name: listings-kafka
    image: confluentinc/cp-kafka:6.2.0
    depends_on:
      listings-zookeeper:
        condition: service_healthy
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://listings-kafka:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_ZOOKEEPER_CONNECT: listings-zookeeper:2181
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    networks:
      - listings-stack
    ports:
      - "29092:29092"
    healthcheck:
      test: kafka-topics --bootstrap-server 127.0.0.1:9092 --list
      interval: 10s
      timeout: 10s
      retries: 50

networks: {listings-stack: {}}
- build.yml
name: Build

on: [ pull_request ]

env:
  AWS_ACCESS_KEY_ID: ${{ secrets.TUNNEL_AWS_ACCESS_KEY_ID }}
  AWS_SECRET_ACCESS_KEY: ${{ secrets.TUNNEL_AWS_SECRET_ACCESS_KEY }}
  AWS_DEFAULT_REGION: 'us-east-1'
  CIRCLECI_KEY_TUNNEL: ${{ secrets.ID_RSA_CIRCLECI_TUNNEL }}

jobs:
  build:
    name: Listings-API Build
    runs-on: [ self-hosted, zap ]

    steps:
      - uses: actions/checkout@v2
        with:
          token: ${{ secrets.GH_OLXBR_PAT }}
          submodules: recursive
          path: ./repo
          fetch-depth: 0

      - name: Set up JDK 11
        uses: actions/setup-java@v2
        with:
          distribution: 'adopt'
          java-version: '11'
          architecture: x64
          cache: 'gradle'

      - name: Docker up
        working-directory: ./repo
        run: docker-compose up -d

      - name: Build with Gradle
        working-directory: ./repo
        run: ./gradlew build -Dhttps.protocols=TLSv1,TLSv1.1,TLSv1.2 -x integrationTest

      - name: Integration tests with Gradle
        working-directory: ./repo
        run: ./gradlew integrationTest -Dhttps.protocols=TLSv1,TLSv1.1,TLSv1.2

      - name: Sonarqube
        working-directory: ./repo
        env:
          GITHUB_TOKEN: ${{ secrets.GH_OLXBR_PAT }}
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
        run: ./gradlew sonarqube --info -Dhttps.protocols=TLSv1,TLSv1.1,TLSv1.2

      - name: Docker down
        if: always()
        working-directory: ./repo
        run: docker-compose down --remove-orphans

      - name: Cleanup Gradle Cache
        # Remove some files from the Gradle cache, so they aren't cached by GitHub Actions.
        # Restoring these files from a GitHub Actions cache might cause problems for future builds.
        run: |
          rm -f ${{ env.HOME }}/.gradle/caches/modules-2/modules-2.lock
          rm -f ${{ env.HOME }}/.gradle/caches/modules-2/gc.properties
The integration tests are written using the Spock framework, and these are the parts where the errors occur:
boolean compareRecordSend(String topicName, int expected) {
    def condition = new PollingConditions()
    condition.within(kafkaProperties.listener.pollTimeout.getSeconds() * 5) {
        assert expected == getRecordSendTotal(topicName)
    }
    return true
}

int getRecordSendTotal(String topicName) {
    kafkaTemplate.flush()
    return kafkaTemplate.metrics().find {
        it.key.name() == "record-send-total" && it.key.tags().get("topic") == topicName
    }?.value?.metricValue() ?: 0
}
The error we're getting is:
Condition not satisfied after 50.00 seconds and 496 attempts
    at spock.util.concurrent.PollingConditions.within(PollingConditions.java:185)
    at com.company.listings.KafkaAwareBaseSpec.compareRecordSend(KafkaAwareBaseSpec.groovy:31)
    at com.company.listings.application.worker.listener.notifier.ListingNotifierITSpec.should notify listings(ListingNotifierITSpec.groovy:44)

Caused by:
Condition not satisfied:

expected == getRecordSendTotal(topicName)
|        |  |                  |
10       |  0                  v4
         false
We've debugged the GitHub Actions machine (SSHing into it) and run things manually. The error still happens on the first run, but if the integration tests are run a second time (and on subsequent runs), everything works perfectly.
We've also tried to initialize all the necessary topics and send some messages to them preemptively, but the behavior was the same.
The questions we have are:
- Is there any known issue when running Kafka in Docker on an Ubuntu machine (the error also occurred on a co-worker's Ubuntu machine)?
- Any ideas on why this is happening?
Edit
- application.yml (Kafka-related configuration)
spring:
  kafka:
    bootstrap-servers: localhost:29092
    producer:
      batch-size: 262144
      buffer-memory: 536870912
      retries: 1
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.ByteArraySerializer
      acks: all
      properties:
        linger.ms: 0
ANSWER
Answered 2021-Nov-03 at 19:11
We identified a test-sequence dependency between the Kafka tests.
We updated our Gradle version to 7.3-rc-3, which has a more deterministic approach to test scanning. This update "solved" our problem while we prepare to fix the tests' dependencies.
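For anyone wanting to try the same release candidate, one way is to repoint the Gradle wrapper. The snippet below is a generic sketch of gradle/wrapper/gradle-wrapper.properties, not taken from the project above; the distributionUrl follows the standard services.gradle.org pattern, so verify it against the Gradle release notes before relying on it.

# Generated by: ./gradlew wrapper --gradle-version 7.3-rc-3
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-7.3-rc-3-bin.zip
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists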
Community Discussions contain sources that include Stack Exchange Network
Tutorials and Learning Resources in Serialization
Tutorials and Learning Resources are not available at this moment for Serialization