Materialize | converting images to materials for use in video games
Trending Discussions on Materialize
QUESTION
I have an Aurora Serverless instance which has data loaded across 3 tables (mixture of standard and jsonb data types). We currently use traditional views where some of the deeply nested elements are surfaced along with other columns for aggregations and such.
We have two materialized views that we'd like to send to Redshift. Both the Aurora Postgres and Redshift are in Glue Catalog and while I can see Postgres views as a selectable table, the crawler does not pick up the materialized views.
Currently exploring two options to get the data to redshift.
- Output to parquet and use copy to load
- Point the Materialized view to jdbc sink specifying redshift.
I'd appreciate recommendations on the most efficient approach if anyone has done a similar use case.
Questions:
- In option 1, would I be able to handle incremental loads?
- Is bookmarking supported for JDBC (Aurora Postgres) to JDBC (Redshift) transactions even if through Glue?
- Is there a better way (other than the options I am considering) to move the data from Aurora Postgres Serverless (10.14) to Redshift.
Thanks in advance for any guidance provided.
ANSWER
Answered 2021-Jun-15 at 13:51
Went with option 2. The Redshift COPY/load process writes CSV with a manifest to S3 in any case, so duplicating that is pointless.
Regarding the Questions:
N/A
Job bookmarking does work. There are some gotchas, though: ensure connections to both RDS and Redshift are present in the Glue PySpark job, that IAM self-referencing rules are in place, and identify a row that is unique (I chose the primary key of the underlying table as an additional column in my materialized view) to use as the bookmark.
Using the primary key of the core table may buy efficiencies in pruning materialized views during maintenance cycles. Just retrieve the latest bookmark from the CLI using
aws glue get-job-bookmark --job-name yourjobname
and then use that in the WHERE clause of the materialized view, as where id >= idinbookmark
conn = glueContext.extract_jdbc_conf("yourGlueCatalogdBConnection")
connection_options_source = { "url": conn['url'] + "/yourdB", "dbtable": "table in dB", "user": conn['user'], "password": conn['password'], "jobBookmarkKeys":["unique identifier from source table"], "jobBookmarkKeysSortOrder":"asc"}
datasource0 = glueContext.create_dynamic_frame.from_options(connection_type="postgresql", connection_options=connection_options_source, transformation_ctx="datasource0")
That's all, folks
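The bookmark-retrieval step above can also be scripted. A minimal sketch assuming boto3: the glue client's get_job_bookmark call is real, but the JSON layout of the bookmark payload and the lastId key are hypothetical placeholders, so inspect your own job's bookmark before relying on this parsing.

```python
import json

def bookmark_where_clause(response, column="id"):
    # Parse a get-job-bookmark response and build the WHERE clause for the
    # materialized view. The key names inside the JobBookmark payload are
    # assumptions, not a documented contract.
    entry = response["JobBookmarkEntry"]
    payload = json.loads(entry["JobBookmark"])
    last_id = payload["lastId"]  # hypothetical key name
    return f"where {column} >= {last_id}"

# With boto3 this would be driven by something like:
#   import boto3
#   response = boto3.client("glue").get_job_bookmark(JobName="yourjobname")
sample = {"JobBookmarkEntry": {"JobName": "yourjobname",
                               "JobBookmark": json.dumps({"lastId": 42})}}
print(bookmark_where_clause(sample))  # where id >= 42
```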
QUESTION
transform file/directory structure into 'tree' in vue json
I have an array of objects that looks like this:
[
{
"name": "Officer",
"isDirectory": true,
"path": "Officer/EventReport/SelfReport/110-04-02/DADF.pdf"
},
{
"name": "Officer",
"isDirectory": true,
"path": "Officer/EventReport/SelfReport/110-04-10/110010.pdf"
},
{
"name": "Officer",
"isDirectory": true,
"path": "Officer/S_Meeting/W_Meeting/110-5/Officer_from.docx"
},
{
"name": "Officer",
"isDirectory": true,
"path": "Officer/S_Meeting/W_Meeting/110-5/1620021359034.jpg"
},
{
"name": "Officer",
"isDirectory": true,
"path": "Officer/S_Meeting/W_Meeting/110-5/2021-05-18_092810.png"
}
]
There could be any number of arbitrary paths; this is the result of iterating through files and folders within a directory.
What I'm trying to do is determine the 'root' node of these. Ultimately, this will be stored in MongoDB, using a materialized path to determine its relationships.
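As a side note, the materialized-path relationships mentioned here can be derived from each path string: every prefix of the path is an ancestor. A small sketch (the function name is mine, not from the question):

```python
def ancestors(path):
    # Each proper prefix of the path is an ancestor in the
    # materialized-path pattern.
    parts = path.strip("/").split("/")
    return ["/".join(parts[:i]) for i in range(1, len(parts))]

print(ancestors("Officer/EventReport/SelfReport/110-04-02/DADF.pdf"))
# ['Officer', 'Officer/EventReport', 'Officer/EventReport/SelfReport',
#  'Officer/EventReport/SelfReport/110-04-02']
```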
Hopefully this expected result illustrates what I mean:
[
{
"name": "Officer", //part1
"isDirectory": true,
"items": [
{
"name": "EventReport", //part2
"isDirectory": true,
"items": [
{
"name": "SelfReport", //part3
"isDirectory": true,
"items": [
{
"name": "2020-110-04-02", //part4
"isDirectory": true,
"items": [
{
"name": "RCBS.pdf", // name
"isDirectory": false
}
]
},
{
"name": "2020-110-04-10", //part4
"isDirectory": true,
"items": [
{
"name": "1100_b.pdf", //name
"isDirectory": false
}
]
}
]
}
]
},
{
"name": "SecurityMeeting", // part2
"isDirectory": true,
"items": [
{
"name": "SecurityWorkMeeting", //part3
"isDirectory": true,
"items": [
{
"name": "2021-05-SecurityWorkMeeting", //part4
"isDirectory": true,
"items": [
{
"name": "Officer_Report.docx", //name
"isDirectory": false
},
{
"name": "16200.jpg", //name
"isDirectory": false
},
{
"name": "2021-05-18_2342.png", //name
"isDirectory": false
}
]
},
]
}
]
}
]
},
]
My method:
let arr = xhr.data.UploadFile;
let tree = {};
arr.forEach(item => {
  let tokens = item.path.replace(/^\/|\/$/g, "").split("/");
  let current = tree;
  for (let i = 0; i < tokens.length; i++) {
    // create/descend into the node for this path token
    current[tokens[i]] = current[tokens[i]] || {};
    current = current[tokens[i]];
  }
});

const parseNode = function(node) {
  return Object.keys(node).map(key => {
    if (Object.keys(node[key]).length === 0) {
      return {
        isDirectory: false,
        name: key,
      };
    }
    return {
      isDirectory: true,
      name: key,
      items: parseNode(node[key]),
    };
  });
};
let result = parseNode(tree);
console.log("RESULT", result);
Update:
I don't know why it passes through parseNode = function(node) without producing a result. Also, items should be an array, like this:
(the same expected tree structure as shown above)
ANSWER
Answered 2021-Jun-11 at 09:55
EDIT
Here is the full implementation, based upon my initial answer. I changed the forEach() into map() as it is more suitable in this case.
let arr = xhr.data.UploadFile;
let tree = {};
arr.forEach(item => {
  let tokens = item.path.replace(/^\/|\/$/g, "").split("/");
  let current = tree;
  for (let i = 0; i < tokens.length; i++) {
    // create/descend into the node for this path token
    current[tokens[i]] = current[tokens[i]] || {};
    current = current[tokens[i]];
  }
});

const parseNode = function(node) {
  return Object.keys(node).map(key => {
    if (Object.keys(node[key]).length === 0) {
      return {
        isDirectory: false,
        name: key,
      };
    }
    return {
      isDirectory: true,
      name: key,
      items: parseNode(node[key]),
    };
  });
};
let result = parseNode(tree);
console.log("RESULT", result);
ORIGINAL ANSWER
You could build a map by iterating on the array (I named it data):
let tree = {};
data.forEach(item => {
  let tokens = item.path.split("/");
  let current = tree;
  for (let i = 0; i < tokens.length; i++) {
    // create/descend into the node for this path token
    current[tokens[i]] = current[tokens[i]] || {};
    current = current[tokens[i]];
  }
});
Then, walk through your map to build your result array:
const parseNode = function(node) {
let res = [];
Object.keys(node).forEach(key => {
if (Object.keys(node[key]).length === 0) {
res.push({
isDirectory: false,
name: key,
});
} else {
res.push({
isDirectory: true,
name: key,
items: parseNode(node[key]),
});
}
});
return res;
};
let result = parseNode(tree);
QUESTION
Using the code below, I'm attempting to use an actor as a source and send messages of type Double to be processed via a sliding window.
The sliding window is defined as sliding(2, 2) to process each sequence of two values sent.
Sending the message:
actorRef.tell(10, ActorRef.noSender());
actorRef.tell(20, ActorRef.noSender());
actorRef.tell(30, ActorRef.noSender());
actorRef.tell(40, ActorRef.noSender());
actorRef.tell(20, ActorRef.noSender());
Should calculate the average as follows :
(10 + 20) / 2 = 15
(30 + 40) / 2 = 35
But the calculation does not appear to be invoked in below code.
Here I output the value :
movingAverage.runForeach(n -> {
if( n > 0){
System.out.println(n);
}
}, system);
src code:
import akka.Done;
import akka.actor.ActorRef;
import akka.stream.CompletionStrategy;
import akka.stream.OverflowStrategy;
import akka.stream.javadsl.Sink;
import akka.stream.javadsl.Source;
import java.util.Optional;
public class FilterThreshold {
public static void main(String[] args) {
final akka.actor.ActorSystem system = akka.actor.ActorSystem.create("Source");
final int bufferSize = 1;
final Source source =
Source.actorRef(
elem -> {
// complete stream immediately if we send it Done
if (elem == Done.done()) {
return Optional.of(CompletionStrategy.immediately());
} else {
return Optional.empty();
}
},
// never fail the stream because of a message
elem -> Optional.empty(),
bufferSize,
OverflowStrategy.dropHead());
ActorRef actorRef = source.to(Sink.foreach(System.out::println)).run(system);
actorRef.tell(10, ActorRef.noSender());
actorRef.tell(20, ActorRef.noSender());
actorRef.tell(30, ActorRef.noSender());
actorRef.tell(40, ActorRef.noSender());
actorRef.tell(20, ActorRef.noSender());
Source movingAverage = source
.sliding(2, 2)
.map(window -> (window.stream().mapToDouble(i -> i).sum()) / window.size());
movingAverage.runForeach(n -> {
if( n > 0){
System.out.println(n);
}
}, system);
}
}
I've edited the code from https://doc.akka.io/docs/akka/current/stream/operators/Source-or-Flow/sliding.html
How do I apply the sliding window function defined as movingAverage to the values sent via the Akka actor actorRef?
Update:
The method preMaterialize takes an actor system as a parameter.
Updating the code from:
final Pair> prematPair = source.preMaterialize();
to:
final Pair> prematPair = source.preMaterialize(system);
results in a compile time error:
Required type: Pair<…>
Provided: Pair<…>
Is there an alternative method I should use?
Updated code posted:
import akka.Done;
import akka.NotUsed;
import akka.actor.ActorRef;
import akka.japi.Pair;
import akka.stream.CompletionStrategy;
import akka.stream.OverflowStrategy;
import akka.stream.javadsl.Flow;
import akka.stream.javadsl.Source;
import java.util.Optional;
public class FilterThreshold {
public static void main(String[] args) {
final akka.actor.ActorSystem system = akka.actor.ActorSystem.create("Source");
final int bufferSize = 1;
final Source source =
Source.actorRef(
elem -> {
System.out.println("elem is "+elem);
// complete stream immediately if we send it Done
if (elem == Done.done()) {
return Optional.of(CompletionStrategy.immediately());
} else {
return Optional.empty();
}
},
// never fail the stream because of a message
elem -> Optional.empty(),
bufferSize,
OverflowStrategy.dropHead());
// source is as before
final Pair> prematPair = source.preMaterialize(system);
Flow movingAverageFlow =
Flow.of(Double.class)
.sliding(2, 2)
.map(window -> (window.stream().mapToDouble(i -> i).sum()) / window.size());
final Source prematSource = prematPair.second();
prematSource.via(movingAverageFlow).runForeach(n -> {
System.out.println("n is "+n);
if (n > 0) {
System.out.println(n);
}
}, system);
final ActorRef actorRef = prematPair.first();
actorRef.tell(10, ActorRef.noSender());
actorRef.tell(20, ActorRef.noSender());
actorRef.tell(20, ActorRef.noSender());
actorRef.tell(20, ActorRef.noSender());
actorRef.tell(20, ActorRef.noSender());
}
}
Update 2:
Using this code:
import akka.Done;
import akka.NotUsed;
import akka.actor.ActorRef;
import akka.japi.Pair;
import akka.stream.CompletionStrategy;
import akka.stream.OverflowStrategy;
import akka.stream.javadsl.Flow;
import akka.stream.javadsl.Source;
import java.util.Optional;
public class FilterThreshold {
public static void main(String[] args) {
final akka.actor.ActorSystem system = akka.actor.ActorSystem.create("Source");
final int bufferSize = 1;
final Source source =
Source.actorRef(
elem -> {
System.out.println("elem is "+elem);
// complete stream immediately if we send it Done
if (elem == Done.done()) {
return Optional.of(CompletionStrategy.immediately());
} else {
return Optional.empty();
}
},
// never fail the stream because of a message
elem -> Optional.empty(),
bufferSize,
OverflowStrategy.dropHead());
// source is as before
final Pair> prematPair = source.preMaterialize(system);
final ActorRef actorRef = prematPair.first();
final Source prematSource = prematPair.second();
Flow movingAverageFlow =
Flow.of(Double.class)
.sliding(2, 2)
.map(window -> (window.stream().mapToDouble(i -> i).sum()) / window.size());
prematSource.via(movingAverageFlow).runForeach(n -> {
System.out.println("n is "+n);
if (n > 0) {
System.out.println(n);
}
}, system);
actorRef.tell(10, ActorRef.noSender());
actorRef.tell(20, ActorRef.noSender());
actorRef.tell(20, ActorRef.noSender());
actorRef.tell(20, ActorRef.noSender());
actorRef.tell(20, ActorRef.noSender());
prematSource.run(system);
}
}
prints:
elem is 10
elem is 20
elem is 20
elem is 20
elem is 20
So it seems the messages are being sent correctly but the moving average is not being materialized.
Is using prematSource.run(system); not the correct way to materialize the value?
ANSWER
Answered 2021-Jun-14 at 11:39
The short answer is that your source is a recipe of sorts for materializing a Source, and each materialization ends up being a different source.
In your code, source.to(Sink.foreach(System.out::println)).run(system) is one stream, with the materialized actorRef being connected only to that stream, and
movingAverage.runForeach(n -> {
if( n > 0){
System.out.println(n);
}
}, system);
is a completely separate stream with a different materialized ActorRef (which ultimately gets thrown away, since runForeach materializes as a CompletionStage).
When dealing with Source.actorRef, it's often a good idea to prematerialize the source before running the stream:
import akka.NotUsed;
import akka.japi.Pair;
import akka.stream.javadsl.Flow;
// source is as before, assumed here to be a Source<Double, ActorRef>
// (the generic parameters were lost in the original post's formatting)
final Pair<ActorRef, Source<Double, NotUsed>> prematPair = source.preMaterialize(system);
final ActorRef actorRef = prematPair.first();
final Source<Double, NotUsed> prematSource = prematPair.second();
Flow<Double, Double, NotUsed> movingAverageFlow =
Flow.of(Double.class)
.sliding(2, 2)
.map(window -> (window.stream().mapToDouble(i -> i).sum()) / window.size());
prematSource.via(movingAverageFlow).runForeach(n -> {
if (n > 0) {
System.out.println(n);
}
}, system);
(Apologies, my Java is quite rusty)
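The grouping the question expects can be checked with plain arithmetic. A sketch in Python that mirrors sliding(2, 2) for complete windows (how Akka treats a trailing partial window is not modeled here):

```python
def windows(seq, n, step):
    # Complete windows of n elements, advancing by step; a trailing
    # partial window is dropped.
    return [seq[i:i + n] for i in range(0, len(seq) - n + 1, step)]

values = [10, 20, 30, 40, 20]
averages = [sum(w) / len(w) for w in windows(values, 2, 2)]
print(averages)  # [15.0, 35.0]
```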
QUESTION
- I would have expected the queryOutput to be materialized?
- Why is there an invalid attempt to call FieldCount when it already has the IEnumerable?
/// <summary>
/// Get all columns for a certain table
/// </summary>
public async Task<List<Tuple<string, string, int?>>> GetAllColumnsFromTableAsync(string tableName)
{
List<Tuple<string, string, int?>> result;
string query = "SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME= @tableName";
using (IDbConnection db = DbConnection)
{
IEnumerable<Tuple<string, string, int?>> queryOutput = await db.QueryAsync<string, string, int?, Tuple<string, string, int?>>
(query, Tuple.Create, new { tableName = tableName }, null, false, splitOn: "*");
result = queryOutput.ToList(); // System.InvalidOperationException: Invalid attempt to call FieldCount when reader is closed.
}
if (result is not null && result.Count > 0)
{
return result;
}
else
{
return default;
}
}
- Making this a non-async method works, so it MUST be the combination of "tuple" + "dapper" + "async".
ref: https://github.com/DapperLib/Dapper/issues/745
update:
When replacing the Tuple with named tuples, the query itself works.
public async Task<List<(string COLUMN_NAME, string DATA_TYPE, int? CHARACTER_MAXIMUM_LENGTH)>> GetAllColumnsFromTableAsync2(string tableName)
{
const string query = "SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME= @tableName";
using var connection = new SqlConnection(_connectionstring);
var output = await connection.QueryAsync<(string COLUMN_NAME, string DATA_TYPE, int? CHARACTER_MAXIMUM_LENGTH)>
(query, new { tableName = tableName });
return output.ToList();
}
I have a further problem but that is another question.
ANSWER
Answered 2021-Jun-12 at 20:53
Dapper is supposed to make your life easier, and that code looks complicated.
If you use a Tuple with named fields, you can just use Dapper's auto mapping to materialize. EG:
static class DbExtensions
{
public static async Task<List<(string COLUMN_NAME, string DATA_TYPE, int? CHARACTER_MAXIMUM_LENGTH)>> GetAllColumnsFromTableAsync(this IDbConnection db, string tableName, string schema = "dbo")
{
string query = "SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME= @tableName and TABLE_SCHEMA=@schema";
var queryOutput = await db.QueryAsync<(string COLUMN_NAME, string DATA_TYPE, int? CHARACTER_MAXIMUM_LENGTH)>(query, new { tableName = tableName, schema = schema });
return queryOutput.ToList();
}
}
Also note there's a bug in your query if you have multiple tables with the same name, so you should specify the schema.
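For comparison, the same "list the columns of a table" task can be done against Python's stdlib sqlite3 via PRAGMA table_info (SQLite has no INFORMATION_SCHEMA; this is only an analogy to the Dapper query above, not a translation of it):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE demo (id INTEGER PRIMARY KEY, name TEXT, note VARCHAR(50))")

# PRAGMA table_info yields (cid, name, type, notnull, dflt_value, pk) rows.
columns = [(row[1], row[2]) for row in conn.execute("PRAGMA table_info(demo)")]
print(columns)  # [('id', 'INTEGER'), ('name', 'TEXT'), ('note', 'VARCHAR(50)')]
```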
QUESTION
I'm trying to learn Recurrent Neural Networks (RNN) with Flux.jl in Julia by following along some tutorials, like Char RNN from the FluxML/model-zoo.
I managed to build and train a model containing some RNN cells, but am failing to evaluate the model after training.
Can someone point out what I'm missing for this code to evaluate a simple (untrained) RNN?
julia> using Flux
julia> simple_rnn = Flux.RNN(1, 1, (x -> x))
julia> simple_rnn.([1, 2, 3])
ERROR: MethodError: no method matching (::Flux.RNNCell{var"#1#2", Matrix{Float32}, Vector{Float32}, Matrix{Float32}})(::Matrix{Float32}, ::Int64)
Closest candidates are:
(::Flux.RNNCell{F, A, V, var"#s263"} where var"#s263"<:AbstractMatrix{T})(::Any, ::Union{AbstractMatrix{T}, AbstractVector{T}, Flux.OneHotArray}) where {F, A, V, T} at C:\Users\UserName\.julia\packages\Flux\6o4DQ\src\layers\recurrent.jl:83
Stacktrace:
[1] (::Flux.Recur{Flux.RNNCell{var"#1#2", Matrix{Float32}, Vector{Float32}, Matrix{Float32}}, Matrix{Float32}})(x::Int64)
@ Flux C:\Users\UserName\.julia\packages\Flux\6o4DQ\src\layers\recurrent.jl:34
[2] _broadcast_getindex_evalf
@ .\broadcast.jl:648 [inlined]
[3] _broadcast_getindex
@ .\broadcast.jl:621 [inlined]
[4] getindex
@ .\broadcast.jl:575 [inlined]
[5] copy
@ .\broadcast.jl:922 [inlined]
[6] materialize(bc::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1}, Nothing, Flux.Recur{Flux.RNNCell{var"#1#2", Matrix{Float32}, Vector{Float32}, Matrix{Float32}}, Matrix{Float32}}, Tuple{Vector{Int64}}})
@ Base.Broadcast .\broadcast.jl:883
[7] top-level scope
@ REPL[3]:1
[8] top-level scope
@ C:\Users\UserName\.julia\packages\CUDA\LTbUr\src\initialization.jl:81
I'm using Julia 1.6.1 on Windows 10.
ANSWER
Answered 2021-Jun-11 at 12:27
Turns out it's just a problem with the input type.
Doing something like this will work:
julia> v = Vector{Vector{Float32}}([[1], [2], [3]])
julia> simple_rnn.(v)
3-element Vector{Vector{Float32}}:
[9.731078]
[16.657223]
[28.398548]
I tried a lot of combinations until I found the working one. There is probably a way to automatically convert the input with some evaluation function.
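Shape issues aside, the reason an RNN must see a sequence of per-timestep inputs is that hidden state carries across calls. A minimal numeric sketch with fixed weights (Flux initializes its weights randomly, which is why the numbers in the answer above differ):

```python
def rnn_cell(x, h, w_x=1.0, w_h=1.0, b=0.0):
    # One step of a single-unit RNN with identity activation, loosely
    # mirroring Flux.RNN(1, 1, x -> x); the weights here are fixed
    # assumptions for illustration only.
    return w_x * x + w_h * h + b

h = 0.0
outputs = []
for x in [1.0, 2.0, 3.0]:  # one input per time step
    h = rnn_cell(x, h)
    outputs.append(h)
print(outputs)  # [1.0, 3.0, 6.0] -- the hidden state accumulates across steps
```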
QUESTION
When reading about CQRS it is often mentioned that the write model should not depend on any read model (assuming there is one write model and up to N read models). This makes a lot of sense, especially since read models usually only become eventually consistent with the write model. Also, we should be able to change or replace read models without breaking the write model.
However, read models might contain valuable information that is aggregated across many entities of the write model. These aggregations might even contain non-trivial business rules. One can easily imagine a business policy that evaluates a piece of information that a read model possesses, and in reaction to that changes one or many entities via the write model. But where should this policy be located/implemented? Isn't this critical business logic that tightly couples information coming from one particular read model with the write model?
When I want to implement said policy without coupling the write model to the read model, I can imagine the following strategy: Include a materialized view in the write model that gets updated synchronously whenever a relevant part of the involved entities changes (when using DDD, this could be done via domain events). However, this denormalizes the write model, and is effectively a special read model embedded in the write model itself.
I can imagine that DDD purists would say that such a policy should not exist, because it represents a business invariant/rule that encompasses multiple entities (a.k.a. aggregates). I could probably agree in theory, but in practice, I often encounter such requirements anyway.
Finally, my question is simply: How do you deal with requirements that change data in reaction to certain conditions whose evaluation requires a read model?
ANSWER
Answered 2021-Jun-07 at 01:20First, any write model which validates commands is a read model (because at some point validating a command requires a read), albeit one that is optimized for the purpose of validating commands. So I'm not sure where you're seeing that a write model shouldn't depend on a read model.
Second, a domain event is implicitly a command to the consumers of the event: "process/consider/incorporate this event", in which case a write model processor can subscribe to the events arising from a different write model: from the perspective of the subscribing write model, these are just commands.
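The "a domain event is implicitly a command to its consumers" idea can be sketched with a toy in-process event bus; all names here are illustrative, not from the answer:

```python
from collections import defaultdict

class EventBus:
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self.handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self.handlers[event_type]:
            handler(payload)

bus = EventBus()
commands = []

# A second write model treats another model's domain event as an
# incoming command ("process/consider/incorporate this event").
bus.subscribe("OrderPlaced", lambda e: commands.append(("ReserveStock", e["sku"])))
bus.publish("OrderPlaced", {"sku": "A-1"})
print(commands)  # [('ReserveStock', 'A-1')]
```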
QUESTION
I have a simple web app that reads data from a Google Sheet and displays it in HTML using Materialize CSS (1.0.0) and JS. I am new to this stack of tools. The HTML page has two containers, and the bottom one is populated (via an innerHTML assignment) based on the selection in the top container. The bottom container's content is simple card content in table format. I want to display an HTML select object with a list of values, so I created an Apps Script function and assign its output to innerHTML like this:
function generateCards(text_card){
document.getElementById("container-id").innerHTML += text_card;
}
function createAuthorCards(authorName) {
document.getElementById("container-id").innerHTML = "";
google.script.run.withSuccessHandler(generateCards).getCardDynamicText(authorName) ;
}
If I copy/paste the function output into the HTML it works (it goes through rendering when refreshing), but if I use innerHTML, the listbox is disabled while everything else seems OK. Are there any limitations on using innerHTML in a web app?
ANSWER
Answered 2021-Jun-09 at 22:03
I was using a Google web app. It looks like Google doesn't allow editing of dynamically created items, so I ended up using modals to make the selections and save them.
QUESTION
I have a basic CSS photo gallery that works pretty well on desktop devices, and also works well if I simulate a mobile view in Mozilla Responsive Design Mode or this CodePen, but it doesn't seem to work properly on real mobile devices.
:root {
--blue: #6483B2ff;
--dark-blue: #1d3557;
--light-blue: #e1e5f2;
--white: #ffffff;
--flag-blue: #0038B8;
--error: #660000;
}
#gallery {
margin: 3em 0;
display: -webkit-box;
display: -ms-flexbox;
display: flex;
-webkit-box-orient: vertical;
-webkit-box-direction: normal;
-ms-flex-direction: column;
flex-direction: column;
-webkit-box-pack: center;
-ms-flex-pack: center;
justify-content: center;
-webkit-box-align: center;
-ms-flex-align: center;
align-items: center;
}
#gallery .quote {
text-align: center;
}
#gallery .gallery-layout {
display: -webkit-box;
display: -ms-flexbox;
display: flex;
-webkit-box-orient: horizontal;
-webkit-box-direction: normal;
-ms-flex-direction: row;
flex-direction: row;
background: var(--light-blue);
border-radius: 15px;
width: 100%;
-webkit-box-pack: justify;
-ms-flex-pack: justify;
justify-content: space-between;
-webkit-box-align: center;
-ms-flex-align: center;
align-items: center;
padding: 0 20px;
}
#gallery .gallery-layout #gallery-items {
display: none;
}
#gallery .gallery-layout .frame {
display: -webkit-box;
display: -ms-flexbox;
display: flex;
position: relative;
}
#gallery .gallery-layout .frame .overlay {
display: -webkit-box;
display: -ms-flexbox;
display: flex;
-webkit-box-pack: center;
-ms-flex-pack: center;
justify-content: center;
-webkit-box-align: center;
-ms-flex-align: center;
align-items: center;
position: absolute;
width: 100%;
height: auto;
background: rgba(0, 0, 0, 0.5);
bottom: 0;
left: 0;
}
#gallery .gallery-layout .frame .overlay .overlay-text {
color: var(--white);
padding: 10px;
font-size: 1rem;
font-variant: small-caps;
letter-spacing: 1px;
text-shadow: -1px -1px 0 #000, 1px -1px 0 #000, -1px 1px 0 #000, 1px 1px 0 #000;
text-align: center;
}
#gallery .gallery-layout .prev-pic, #gallery .gallery-layout .next-pic {
display: -webkit-box;
display: -ms-flexbox;
display: flex;
-webkit-box-pack: center;
-ms-flex-pack: center;
justify-content: center;
-webkit-box-align: center;
-ms-flex-align: center;
align-items: center;
margin: 10px;
}
#gallery .gallery-layout .prev-pic i, #gallery .gallery-layout .next-pic i {
display: -webkit-box;
display: -ms-flexbox;
display: flex;
-webkit-box-align: center;
-ms-flex-align: center;
align-items: center;
-webkit-box-pack: center;
-ms-flex-pack: center;
justify-content: center;
color: var(--dark-blue);
width: 100%;
height: 150px;
padding: 5px;
}
#gallery .gallery-layout .prev-pic i:hover, #gallery .gallery-layout .next-pic i:hover {
border-radius: 5px;
background: var(--dark-blue);
color: var(--light-blue);
opacity: 0.7;
cursor: pointer;
}
#gallery .gallery-layout i.dl-1 {
display: -webkit-box;
display: -ms-flexbox;
display: flex;
}
#gallery .gallery-layout i.dl-2 {
display: none;
}
@media screen and (max-width: 800px) {
#gallery .gallery-layout {
-webkit-box-orient: vertical;
-webkit-box-direction: normal;
-ms-flex-direction: column;
flex-direction: column;
padding: 0 20px;
}
#gallery .gallery-layout .prev-pic, #gallery .gallery-layout .next-pic {
width: 100%;
max-width: 150px;
}
#gallery .gallery-layout i.dl-1 {
display: none;
}
#gallery .gallery-layout i.dl-2 {
display: -webkit-box;
display: -ms-flexbox;
display: flex;
height: 100%;
}
}
(The gallery HTML survives only as its text content: the page title "Document", the navigate_before / keyboard_arrow_up and navigate_next / keyboard_arrow_down icon names, and the image overlay text "Dummy description".)
ANSWER
Answered 2021-Jun-09 at 12:02
The flex children are set to stretch by default, so when your child (the img tag) is set to height: auto, it stretches. You can change that behavior with the align-items property.
#gallery .gallery-layout .frame {
display: -webkit-box;
display: -ms-flexbox;
display: flex;
position: relative;
-webkit-box-align: center;
-ms-flex-align: center;
align-items: center;
}
QUESTION
I want to prepend a new iterator to my existing iterator.
How can I prepend a new Iterator to an existing Iterator?
Why does my code cause my REPL to hang?
The following hangs indefinitely in my REPL:
scala> var i = Seq(1).iterator
var i: Iterator[Int] =
scala> i = Seq(2).iterator ++ i
// mutated i
scala> i.next()
val res0: Int = 2
scala> i.next()
. . .
Note that the following works, but this is an append not a prepend:
var i = Seq(1).iterator
i = i ++ Seq(2).iterator
i.next()
i.next()
This also works, but materializes the entire iterator which I cannot do:
var i = Seq(1).iterator
i = (Seq(2) ++ i.toSeq).iterator
i.next()
i.next()
Thanks!
ANSWER
Answered 2021-Jun-07 at 23:22
Simply speaking, it hangs because you have an infinite loop there. In the line where you think you are merging the iterators, you are actually referencing the new i (due to laziness) and not the initial value of i.
You can get over that by introducing a new var, as Luis Miguel pointed out:
var i = Seq(1).iterator
var j = i
i = Seq(2).iterator ++ j
i.next()
i.next()
so that should do the trick. Hope it helps and is clear enough.
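For contrast, Python's itertools.chain captures its arguments eagerly at the moment it is called, so the equivalent self-referencing prepend does not loop:

```python
import itertools

i = iter([1])
# The old value of i is captured here, before the name is rebound,
# so there is no self-reference.
i = itertools.chain(iter([2]), i)
print(list(i))  # [2, 1]
```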
QUESTION
(define sum 0)
(define (accum x)
(set! sum (+ x sum))
sum)
;1: (define seq (stream-map accum (stream-enumerate-interval 1 20)))
;2: (define y (stream-filter even? seq))
;3: (define z (stream-filter (lambda (x) (= (remainder x 5) 0))
; seq))
;4: (stream-ref y 7)
;5: (display-stream z)
Step 1: line ;1: evaluates to (cons-stream 1 (stream-map proc (stream-cdr s))). (Assume stream-cdr is evaluated only when we force the cdr of this stream.) sum is now 1.
Step 2: 1 is not even (it is also memoized, so it is not added again), hence it calls (stream-filter pred (stream-cdr stream)). This leads to evaluation of the cdr, materializing 2, which is even, so it should call (cons-stream 2 (stream-cdr stream)). According to this, the sum should be 1 + 2 = 3, but it is 6.
Can someone help with why the cdr's car is materialized before the current cdr is called?
ANSWER
Answered 2021-Jun-05 at 14:21
#lang r5rs
(define-syntax cons-stream
(syntax-rules ()
((_ h t) (cons h (lambda () t)))))
(define (stream-cdr s)
(if (and (not (pair? (cdr s)))
(not (null? (cdr s))))
(set-cdr! s ((cdr s))))
(cdr s))
we observe:
> sum
0
> (define seq (stream-map accum (stream-enumerate-interval 1 20)))
> sum
1
> seq
(mcons 1 #)
> (define y (stream-filter even? seq))
> sum
6
> seq
(mcons
1
(mcons
3
(mcons 6 #)))
> y
(mcons 6 #)
>
stream-filter needs to get to the first element of the stream it is constructing in order to construct it. A stream has its head element already forced (calculated), so it must already be present.
In the list of accumulated sums of the enumerated interval from 1 to 20, the first even number is 6:
1 = 1
1+2 = 3
1+2+3 = 6
...
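The same forcing behavior can be reproduced with Python's lazy map: finding the first even accumulated sum necessarily evaluates 1 and 3 on the way to 6, so the side-effecting total reaches 6 rather than 3. This is a sketch of the lazy-evaluation order only, not of the stream memoization:

```python
total = 0

def accum(x):
    # Side-effecting accumulator, like the Scheme accum above.
    global total
    total += x
    return total

seq = map(accum, range(1, 21))  # lazy, like a stream
first_even = next(v for v in seq if v % 2 == 0)
print(first_even, total)  # 6 6 -- 1 and 3 were forced on the way to 6
```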
Community Discussions, Code Snippets contain sources that include Stack Exchange Network