Explore all Cybersecurity open source software, libraries, packages, source code, cloud functions and APIs.

Cybersecurity is security as it is applied to information technology. This includes all technology that stores, manipulates, or moves data, such as computers, data networks, and all devices connected to or included in networks, such as routers and switches. All information technology devices and facilities need to be secured against intrusion, unauthorized use, and vandalism. Additionally, the users of information technology should be protected from theft of assets, extortion, identity theft, loss of privacy and confidentiality of personal information, malicious mischief, damage to equipment, business process compromise, and the general activity of cybercriminals.

Popular New Releases in Cybersecurity

  • Amass: v3.19.1
  • juice-shop: v13.2.2
  • pyWhat: 5.1.0 - New & Better regex ✨
  • juice-shop: v12.8.1
  • grr: GRR release 3.4.5.1

Popular Libraries in Cybersecurity

  • CheatSheetSeries by OWASP (Python, 19686 stars, NOASSERTION): The OWASP Cheat Sheet Series was created to provide a concise collection of high value information on specific application security topics.
  • Amass by OWASP (Go, 6903 stars, Apache-2.0): In-depth Attack Surface Mapping and Asset Discovery.
  • juice-shop by juice-shop (TypeScript, 6608 stars, MIT): OWASP Juice Shop: Probably the most modern and sophisticated insecure web application.
  • Reverse-Engineering by mytechnotalent (C, 5434 stars, Apache-2.0): A FREE comprehensive reverse engineering tutorial covering x86, x64, 32-bit ARM & 64-bit ARM architectures.
  • pyWhat by bee-san (Python, 5040 stars, MIT): 🐸 Identify anything. pyWhat easily lets you identify emails, IP addresses, and more. Feed it a .pcap file or some text and it'll tell you what it is! 🧙‍♀️
  • juice-shop by bkimminich (TypeScript, 4913 stars, MIT): OWASP Juice Shop: Probably the most modern and sophisticated insecure web application.
  • grr by google (Python, 4082 stars, Apache-2.0): GRR Rapid Response: remote live forensics for incident response.
  • MISP by MISP (PHP, 3701 stars, AGPL-3.0): MISP (core software) - Open Source Threat Intelligence and Sharing Platform.
  • Top10 by OWASP (HTML, 2809 stars, NOASSERTION): Official OWASP Top 10 Document Repository.

Trending New libraries in Cybersecurity

  • Reverse-Engineering by mytechnotalent (C, 5434 stars, Apache-2.0): A FREE comprehensive reverse engineering tutorial covering x86, x64, 32-bit ARM & 64-bit ARM architectures.
  • pyWhat by bee-san (Python, 5040 stars, MIT): 🐸 Identify anything. pyWhat easily lets you identify emails, IP addresses, and more. Feed it a .pcap file or some text and it'll tell you what it is! 🧙‍♀️
  • ScareCrow by optiv (Go, 1785 stars, MIT): ScareCrow - Payload creation framework designed around EDR bypass.
  • coreruleset by coreruleset (Python, 960 stars, Apache-2.0): OWASP ModSecurity Core Rule Set (Official Repository).
  • ThreatPursuit-VM by fireeye (PowerShell, 855 stars, NOASSERTION): Threat Pursuit Virtual Machine (VM): A fully customizable, open-sourced Windows-based distribution focused on threat intelligence analysis and hunting designed for intel and malware analysts as well as threat hunters to get up and running quickly.
  • whatfiles by spieglt (C, 807 stars, GPL-3.0): Log what files are accessed by any Linux process.
  • RegExp by zodiacon (C++, 737 stars, MIT): Registry Explorer - enhanced Registry editor/viewer.
  • CobaltStrikeScan by Apr4h (C#, 578 stars, MIT): Scan files or process memory for CobaltStrike beacons and parse their configuration.
  • Watcher by thalesgroup-cert (Python, 542 stars, AGPL-3.0): Watcher - Open Source Cybersecurity Threat Hunting Platform. Developed with Django & React JS.

Top Authors in Cybersecurity

  1. OWASP: 97 Libraries, 33309 stars
  2. zaproxy: 9 Libraries, 445 stars
  3. MISP: 8 Libraries, 4296 stars
  4. hasherezade: 6 Libraries, 3677 stars
  5. righettod: 6 Libraries, 32 stars
  6. google: 5 Libraries, 6201 stars
  7. usnistgov: 4 Libraries, 51 stars
  8. hrbrmstr: 4 Libraries, 34 stars
  9. NtRaiseHardError: 4 Libraries, 119 stars
  10. guilhermej: 4 Libraries, 9 stars


Trending Kits in Cybersecurity

Python digital forensics libraries are collections of Python modules, functions, and scripts that give forensic investigators the capabilities and tools to analyze digital evidence. They cover several areas of digital forensics, including memory forensics, malware analysis, and file system analysis.


These libraries offer tools for analyzing file systems and disk images, allowing investigators to examine directories, files, and other stored data. They provide memory analysis tools for examining a memory dump or a live system and extracting information about network connections, running processes, and other system state. They also include tools for analyzing binary files, so investigators can disassemble and analyze malware and other malicious code; tools for analyzing network traffic, so packets can be captured and examined for evidence of malicious activity or data exfiltration; tools for analyzing and decrypting encrypted communications and data; and tools for recovering deleted files and other data.
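As a small illustration of the kind of task these libraries build on, here is a minimal sketch (Python standard library only; the evidence directory and manifest file names are placeholder examples) that walks a directory of collected files and records SHA-256 hashes, a common first step for preserving evidence integrity:

import csv
import hashlib
from pathlib import Path

def hash_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest(evidence_dir: str, manifest_path: str) -> None:
    """Walk the evidence directory and write a CSV manifest of file hashes."""
    root = Path(evidence_dir)
    with open(manifest_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["path", "size_bytes", "sha256"])
        for item in sorted(root.rglob("*")):
            if item.is_file():
                writer.writerow([str(item), item.stat().st_size, hash_file(item)])

if __name__ == "__main__":
    # "evidence/" and "manifest.csv" are placeholder paths for this sketch.
    build_manifest("evidence", "manifest.csv")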


Here are the 7 best Python Digital Forensics Libraries handpicked for developers:

beagle:

  • Is an open source library that offers incident response and digital forensics tools. 
  • Is designed to help investigators automate common forensic tasks and analyze large data sets. 
  • Offers tools for analyzing disk images and file systems. 
  • Allows investigators to examine the system's directories, files, and other data.

Digital-Forensics-Guide:

  • Is a Python package that offers tools for incident response and digital forensics.  
  • Includes memory forensics, malware analysis, file system analysis, and network analysis. 
  • Includes notebooks and scripts demonstrating how to analyze disk images and file systems.
  • Offers various techniques and tools.
  • Offers tools for analyzing digital evidence and identifying potential indicators of compromise.

ThePhish: 

  • Is an automated phishing email analysis tool based on MISP, TheHive, and Cortex. 
  • Automates the entire analysis process starting from the extraction of the observables. 
  • Proceeds from the header and body of an email through to the elaboration of a final verdict in most cases. 
  • Allows the analyst to intervene in the analysis process and get further details. 

dfirtrack:

  • Is a web application designed for Digital Forensics and Incident Response teams.
  • It will help manage and track the progress of their investigations.
  • Offers a centralized platform for managing different investigations.
  • Supports investigation workflows such as case creation, updating, and closing.
  • Enables you to track and manage all digital evidence related to a particular case.
  • Track evidence like associated metadata and storage locations.

Cortex-Analyzers:

  • Offers a collection of analyzers for use with Cortex and TheHive platforms. 
  • Is a collaborative incident response platform for tracking and managing security incidents. 
  • Helps analyze file types, identify potential threats, and extract metadata. 
  • Helps analyze and identify malicious activity, detect data exfiltration, and analyze network traffic.

Forensic-Tools:

  • Used for parsing Firefox profile databases.
  • Can help extract cookies, Google searches, and history.
  • Includes analyzers for the Facebook app and Messenger, which are still new and under testing.
  • Can extract messages with links, contacts, time, and attachments.
  • Helps with profile pictures and links. 
  • Can extract account details, call logs, messages, and contacts with their full details. 

kobackupdec:

  • Is a Python library for decrypting backups.
  • Targets backups created by the KNOX security feature on Samsung devices.
  • Allows forensic investigators to extract data from encrypted backups.
  • Enables them to perform digital forensics analysis on the extracted data.
  • Uses a brute-force approach to decrypt the encrypted backup files.

The Career Path of a Cybersecurity Analyst 


Cybersecurity is at the forefront of the digital era, and its significance is growing as our world becomes more dependent on technology. Among the professionals responsible for protecting our digital world, cybersecurity analysts are the vigilant guardians. They play an essential role in the detection, mitigation, and prevention of cyber risks, making them invaluable resources for organizations in a variety of industries. 



If you’re looking to get into cybersecurity or just want to learn more about the field, this article will help you understand who a cybersecurity analyst is and why the role is so important. We will look at the educational requirements, the skills you need, and the career paths you can take if you want to join the ever-growing field of cybersecurity. 


The Role of a Cybersecurity Analyst 


What does a Cybersecurity analyst do?

A cybersecurity analyst is responsible for keeping an organization's computer systems, networks, and data safe from cyber threats and weaknesses. They keep an eye out for security breaches, evaluate potential risks, create security plans, and put measures in place to preserve the confidentiality, integrity, and availability of important information. 


The key responsibilities of a cybersecurity analyst include: 


Threat Detection and Analysis: 

One of a cybersecurity analyst’s primary responsibilities is to keep an eye on network traffic, system records, and security notifications. This constant vigilance allows them to spot unusual activity and potential security breaches in real-time. With the help of sophisticated tools and techniques, a cybersecurity analyst analyzes these threats to identify their source, method, and impact. 
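As a toy illustration of this kind of automated monitoring, the sketch below (Python standard library only; the log path and the OpenSSH-style "Failed password" log format are assumptions for the example) counts failed SSH logins per source IP and flags noisy sources:

import re
from collections import Counter

# Pattern for a typical OpenSSH "Failed password" line; real log formats vary.
FAILED_LOGIN = re.compile(r"Failed password for (?:invalid user )?\S+ from (\d+\.\d+\.\d+\.\d+)")

def flag_suspicious_sources(log_path: str, threshold: int = 10) -> dict:
    """Return source IPs whose failed-login count meets or exceeds the threshold."""
    counts = Counter()
    with open(log_path, errors="ignore") as log:
        for line in log:
            match = FAILED_LOGIN.search(line)
            if match:
                counts[match.group(1)] += 1
    return {ip: n for ip, n in counts.items() if n >= threshold}

if __name__ == "__main__":
    # "auth.log" is a placeholder path for this sketch.
    for ip, attempts in flag_suspicious_sources("auth.log").items():
        print(f"{ip}: {attempts} failed logins")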

Incident Response: 

Cybersecurity analysts play a critical role in the response to security incidents, such as data breaches and malware attacks. They investigate the root causes of the incident, determine the extent of the harm, and formulate a response strategy to reduce the immediate impact and prevent similar occurrences in the future. 


Vulnerability Assessment: 

Another important part of their work is proactive vulnerability assessment. Cybersecurity analysts carry out regular vulnerability assessments of an organization’s systems and applications, looking for weaknesses that attackers can exploit. They then work with other teams to fix those vulnerabilities before they become the target of an attack. 


Security Awareness Training: 

One of the most important roles of a cybersecurity analyst is to educate employees and other stakeholders on cybersecurity risks and the best ways to protect against them. This proactive work helps build a security culture within the company and reduces the chances of human mistakes that could lead to a breach. 


Security Policy Development and Enforcement: 

Cybersecurity analysts work on the development, implementation, and enforcement of security policies, processes, and best practices across an organization. They can also help ensure compliance with industry rules and regulations, such as GDPR or HIPAA. 


Skills Required to be a Cybersecurity Analyst 


As a cybersecurity analyst one is tasked with defending critical data, networks, and systems against cyber attacks. To excel in this vital role, individuals must cultivate a diverse skill set that spans technical expertise, analytical acumen, and a profound understanding of the evolving cybersecurity landscape. Some of the crucial skills include: 

Technical Proficiency: understanding various operating systems, network protocols, and security technologies is fundamental. Proficiency in areas such as firewall management, intrusion detection, and encryption is crucial. 


Threat Intelligence: staying informed about the latest threats and trends in the cybersecurity environment is important for defense. Analysts need to keep up-to-date with emerging threats and hacker techniques. 


Analytical Skills: the ability to analyze large volumes of data, identify patterns, and make informed decisions is important for threat detection and incident response. Analysts use data analysis tools and techniques to uncover hidden threats. 


Programming and Scripting: knowledge of programming languages like Python and scripting skills are valuable for automating routine tasks, conducting security assessments, and customizing security solutions. 
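For instance, one routine task an analyst might script is checking when TLS certificates expire across a set of hosts. A minimal sketch using only the Python standard library (the host list is illustrative, not from the article):

import socket
import ssl
from datetime import datetime, timezone

def cert_days_remaining(host: str, port: int = 443) -> int:
    """Connect to a host over TLS and return days until its certificate expires."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like 'Jun  1 12:00:00 2025 GMT'
    not_after = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    return (not_after.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)).days

if __name__ == "__main__":
    # Example hosts; replace with the hosts your organization actually monitors.
    for host in ["example.com", "example.org"]:
        print(f"{host}: certificate expires in {cert_days_remaining(host)} days")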


Risk Assessment: evaluating risks and prioritizing security measures based on potential impact is essential. Cybersecurity analysts must understand the organization’s business objectives and align security efforts accordingly. 


Communication Skills: effective communication is critical for reporting security incidents, collaborating with other departments, and conveying complex technical concepts to non-technical stakeholders. 


Job Opportunities


Cybersecurity analysts are in high demand as companies and organizations realize the importance of keeping their digital information safe and secure. 


Which industries require Cybersecurity Analysts? 

Cybersecurity plays a central role in almost every industry and sector, which means cybersecurity analyst jobs are plentiful and varied. Here are a few key industries that are always looking for cybersecurity talent: 


  • Finance and Banking: Financial institutions manage vast amounts of sensitive data, making them prime targets for cyberattacks. They require skilled analysts to safeguard customer financial information and maintain the integrity of their systems. 
  • Healthcare: Healthcare organizations maintain large volumes of digital patient records that must be protected to preserve patient privacy and ensure the security of medical data. 
  • Government and Defense: Government agencies and defense organizations require cybersecurity experts to protect national security interests, government data, and critical infrastructure. 
  • Retail and e-commerce: Online retailers handle vast amounts of customer data and payment information, making them targets for cyberattacks. They therefore need cybersecurity analysts to safeguard customer information and maintain trust. 
  • Technology Companies: Tech firms, including software developers, hardware manufacturers, and cloud service providers, need cybersecurity professionals to protect their products and services from security breaches. 


Job Titles for Cybersecurity Analysts

Cybersecurity analysts may go by various job titles, depending on the organization and specific responsibilities. Here are some common job titles associated with this role: 


  • Security Analyst: a general title for cybersecurity professionals who monitor security systems, investigate incidents, and implement security measures.
  • Threat Analyst: specializes in identifying and assessing cybersecurity threats, vulnerabilities, and risks. 
  • Incident Responder: an expert in handling security incidents, mitigating damage, and implementing measures to prevent future occurrences. 
  • Network Security Analyst: specifically focused on securing an organization’s network infrastructure. 
  • Compliance Analyst: ensures that an organization adheres to cybersecurity regulations, standards, and best practices. 


Career Growth and Advancement 


Cybersecurity analysts typically start in entry-level roles and work their way up to more specialized, senior positions. 


After learning how to detect threats, respond to incidents, and assess vulnerabilities, analysts can go on to work as a ‘security architect’, designing and implementing complex security plans. Another way to get into a senior role is as a ‘Penetration tester or ethical hacker’, tasked with proactively finding vulnerabilities by simulating attacks. If you want to move up to a leadership role, you can aim for a Security Manager or Director role, where you oversee security teams and strategies. 


The high point of a cybersecurity career usually comes when you become a CISO (Chief Information Security Officer), which means you're in charge of an organization's whole cybersecurity program and report directly to top execs. This career path not only provides you with professional growth, but also more responsibility and higher pay as cybersecurity is still a top priority for organizations around the world.


Global Demand for Cybersecurity Professionals 

Cybersecurity jobs are in high demand all over the world because of the ever-growing digital landscape and the constant threat of cyber attacks. 



Plus, digitalization keeps expanding and cyberattacks keep getting more sophisticated, which has created a huge gap between the number of qualified professionals available and the number needed. So, as long as digitalization keeps growing and cyber threats keep getting worse, there will be plenty of job opportunities and great pay for cybersecurity professionals.


Compensation

How much money does a cybersecurity analyst make?



It depends on a lot of things, like experience, where you work, and the company you work for. On average, though, a cybersecurity analyst in the US earns between $60,000 and $120,000 a year. If you're a senior analyst with a lot of experience and knowledge, you could make even more. Many companies also offer health insurance, retirement benefits, professional development programs, and bonuses to cybersecurity professionals.


In conclusion, cybersecurity analysts play a critical role in defending organizations and individuals against cyber threats such as data breaches and cyberattacks. They possess a wide range of competencies and are committed to continuous learning and to upholding the highest security standards. 


As the digital world continues to evolve, cybersecurity analysts have the opportunity to pursue a career that offers both financial security and intellectual stimulation.




Trending Discussions on Cybersecurity

Golang reads html tags (<>) from JSON string data as &lt and &gt which causes rendering issues in the browser

Python / BeautifulSoup return ids with indeed jobs

Specific argument causes argparse to parse arguments incorrectly

How do I adjust my tibble to get a grouped bar chart in ggplot2?

how to make a model fit the dataset in Keras?

How to change my css to make hyper link visible [ with minimum sample code ]?

Bootstrap overflow width when writting an article with many paragraphs

Faster way than nested for loops for custom conditions on multiple columns in two DataFrames

Find a hash function to malfunction insertion sort

component wont render when is useEffect() is ran once

QUESTION

Golang reads html tags (<>) from JSON string data as &lt and &gt which causes rendering issues in the browser

Asked 2022-Mar-19 at 18:45

I have a basic web server that renders blog posts from a database of JSON posts wherein the main paragraphs are built from a JSON string array. I was trying to find a way to easily encode new lines or line breaks and found a lot of difficulty with how the encoding for these values changes from JSON to GoLang and finally to my HTML webpage. When I tried to encode my JSON with newlines I found I had to encode them using \\n rather than just \n in order for them to actually appear on my page. One problem however was they simply appeared as text and not line breaks.

I then tried to research ways to replace the \n portions of the joined string array with <br> tags; however, I could not find any way to do this in Go, so I moved to trying to do so in JavaScript. This did not work either, despite deferring the script in the link from my HTML. This is that JavaScript:

var title = window.document.getElementById("title");
var timestamp = window.document.getElementById("timestamp");
var sitemap = window.document.getElementById("sitemap");
var main = window.document.getElementById("main");
var contact_form = window.document.getElementById("contact-form");
var content_info = window.document.getElementById("content-info");

var str = main.innerHTML;

function replaceNewlines() {
    // Replace the \n with <br>
    str = str.replace(/(?:\r\n|\r|\n)/g, "<br>");

    // Update the value of paragraph
    main.innerHTML = str;
}

Here is my HTML:

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Dynamic JSON Events</title>
    <link rel="stylesheet" href="/blogtemplate.css"></style>
</head>
<body>
    <section id="title">
        <h1 id="text-title">{{.Title}}</h1>
        <time id="timestamp">
            {{.Timestamp}}
        </time>
    </section>
    <nav role="navigation" id="site-nav">
        <ul id="sitemap">
        </ul>
    </nav>
    <main role="main" id="main">
        {{.ParsedMain}}
    </main>
    <footer role="contentinfo" id="footer">
        <form id="contact-form" role="form">
        <address>
            Contact me by <a id="my-email" href="mailto:antonhibl11@gmail.com" class="my-email">e-mail</a>
        </address>
        </form>
    </footer>
<script defer src="/blogtemplate.js">
</script>
</body>
</html>

I then finally turned to trying to hardcode <br> tags into my json data to discover that this simply renders as &lt and &gt when it finally reaches the browser. I am getting pretty frustrated with this process of encoding constantly causing me issues in creating newlines and line breaks. How can I easily include newlines where I want in my JSON string data?

Here is my Go script if it helps:

package main

import (
    "encoding/json"
    "html/template"
    "log"
    "net/http"
    "os"
    "regexp"
    "strings"
)

type BlogPost struct {
    Title      string   `json:"title"`
    Timestamp  string   `json:"timestamp"`
    Main       []string `json:"main"`
    ParsedMain string
}

// this did not seem to work when I tried to implement it below
var re = regexp.MustCompile(`\r\n|[\r\n\v\f\x{0085}\x{2028}\x{2029}]`)
func replaceRegexp(s string) string {
    return re.ReplaceAllString(s, "<br>\n")
}

var blogTemplate = template.Must(template.ParseFiles("./assets/docs/blogtemplate.html"))

func blogHandler(w http.ResponseWriter, r *http.Request) {
    blogstr := r.URL.Path[len("/blog/"):] + ".json"

    f, err := os.Open("db/" + blogstr)
    if err != nil {
        http.Error(w, err.Error(), http.StatusNotFound)
        return
    }
    defer f.Close()

    var post BlogPost
    if err := json.NewDecoder(f).Decode(&post); err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }

    post.ParsedMain = strings.Join(post.Main, "")

    // post.ParsedMain = replaceRegexp(post.ParsedMain)

    if err := blogTemplate.Execute(w, post); err != nil {
        log.Println(err)
    }
}

func teapotHandler(w http.ResponseWriter, r *http.Request) {
    w.WriteHeader(http.StatusTeapot)
    w.Write([]byte("<html><h1><a href='https://datatracker.ietf.org/doc/html/rfc2324/'>HTCPTP</h1><img src='https://external-content.duckduckgo.com/iu/?u=https%3A%2F%2Ftaooftea.com%2Fwp-content%2Fuploads%2F2015%2F12%2Fyixing-dark-brown-small.jpg&f=1&nofb=1' alt='Im a teapot'><html>"))
}

func faviconHandler(w http.ResponseWriter, r *http.Request) {
    http.ServeFile(w, r, "./assets/art/favicon.ico")
}

func main() {
    http.Handle("/", http.FileServer(http.Dir("/assets/docs")))
    http.HandleFunc("/blog/", blogHandler)
    http.HandleFunc("/favicon.ico", faviconHandler)
    http.HandleFunc("/teapot", teapotHandler)
    log.Fatal(http.ListenAndServe(":8080", nil))
}

Here is an example of my JSON data:

{
    "title" : "Finished My First Blog App",
    "timestamp": "Friday, March 18th, 11:39 AM",
    "main": [
        "It took me awhile to tidy everything up but I finally finished creating my first ",
        "blog app using Go along with JSON for my database. I plan on using this to document ",
        "my own thoughts and experiences as a programmer and cybersecurity researcher; things ",
        "like tutorials, thought-pieces, and journals on my own projects progress will be ",
        "posted here. I look forward to getting more used to writing and sharing my own story, ",
        "I think it will help me learn from doing and also hearing feedback from others.\\n\\n",
        "I utilized a handler function to dynamically read from my JSON database and template ",
        "data into my HTML template using the go html/template package as well as the encoding/json ",
        "to handling reading those objects. Next I had to make sure my CSS and JavaScript assets ",
        "would be served alongside this finished template in order for my styling to be output into ",
        "the browser. For this I used a FileServer function which allowed for me to serve linked ",
        "resources in my HTML boilerplate and have the server still locate blog resources dynamically. ",
        "Going forward I am looking to add better styling, more JavaScript elements to the page, and ",
        "more functionality to how my JSON data is encoded and parsed in order to create more complex ",
        "looking pages and blog posts."
    ]
}

I am just trying to find a way to easily include spaces between paragraphs in the long array of strings in my JSON. However, I have failed in Go, my JS doesn't ever seem to affect my webpage (this is not the only problem I have had with this; it does not seem to want to affect any page elements for some reason), and I cannot seem to hardcode <br> tags directly into my JSON as the browser interprets those as &lt;br&gt;&lt;br&gt;. Nothing I have tried has actually let me encode line breaks. What can I do here?

ANSWER

Answered 2022-Mar-19 at 06:43

You could try to loop over your array inside the template and generate a p tag for every element of the array. This way there is no need to edit your main array in go.

Template:

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Dynamic JSON Events</title>
    <link rel="stylesheet" href="/blogtemplate.css"></style>
</head>
<body>
    <section id="title">
        <h1 id="text-title">{{.Title}}</h1>
        <time id="timestamp">
            {{.Timestamp}}
        </time>
    </section>
    <nav role="navigation" id="site-nav">
        <ul id="sitemap">
        </ul>
    </nav>
    <main role="main" id="main">
        {{range $element := .Main}} <p>{{$element}}</p> {{end}}
    </main>
    <footer role="contentinfo" id="footer">
        <form id="contact-form" role="form">
        <address>
            Contact me by <a id="my-email" href="mailto:antonhibl11@gmail.com" class="my-email">e-mail</a>
        </address>
        </form>
    </footer>
<script defer src="/blogtemplate.js">
</script>
</body>
</html>

Source https://stackoverflow.com/questions/71535674

QUESTION

Python / BeautifulSoup return ids with indeed jobs

Asked 2022-Feb-19 at 20:51

I have a basic Indeed web scraper set up using BeautifulSoup with which I am able to return the job title and company of each job from the first page of the Indeed job search URL I am using:

def extract():
    headers = headers
    url = f'https://www.indeed.com/jobs?q=Network%20Architect&start=&vjk=e8bcf3fbe7498a5f'
    r = requests.get(url,headers)
    #return r.status_code
    soup = BeautifulSoup(r.content, 'html.parser')
    return soup

def transform(soup):
    for job in soup.select('.result'):
        title = job.select_one('.jobTitle').get_text(' ')
        company = job.find(class_='companyName').text
        print(f'title: {title}')
        print(f'company: {company}')


c = extract()
transform(c)

Output

title: new Network Architect
company: MetroSys
title: new Network Architect
company: Federal Working Group
title: new REMOTE Network Architect - CCIE
company: CyberCoders
title: new Network Architect SME
company: Emergere Technologies
title: Cybersecurity Apprentice
company: IBM
title: Network Engineer (NEW YORK) ONSITE ONLY NEED TO APPLY
company: QnA Tech
title: new Network Architect
company: EdgeCo Holdings
title: new Network Architect
company: JKL Technologies, Inc.
title: Network Architect
company: OTELCO
title: new Network Architect
company: Illinois Municipal Retirement Fund (IMRF)
title: new Network Architect, Google Enterprise Network
company: Google
title: new Network Infrastructure Lead Or Architect- Menlo Park CA -Ful...
company: Xforia Technologies
title: Network Architect
company: Fairfax County Public Schools
title: new Network Engineer
company: Labatt Food Service
title: new Network Architect (5056-3)
company: JND

Now, on Indeed it appears they have a unique ID for each job. I am trying to access this ID with each job so that I can use it later in an SQL database and avoid adding duplicate jobs. I am able to access the job IDs with the following code:

for tag in soup.find_all('a', class_ = 'result') :
    print(tag.get('id'))

Output:

job_a678f3bfc20cb753
job_eef3e4c10d979c1e
job_faedfdbadab2f19b
job_190a6b55b99c78f0
job_32d20498e8fbf692
job_aeaabb9af50f36d6
job_92432325a24212d0
job_819ce9d7ec6e5890
job_d979bf7daac01528
job_0879369d166a9b94
job_2d377bc2e5085ad7
job_bb8e5d0f651c072f
job_dcff58df466f1ecb
job_f70d55871eb1df3f
sj_54a09e5e34e08948

When I try to implement this with my working code I can access the IDs; however, they all get returned together instead of one at a time with the corresponding job, i.e. all of them repeated with each job posting (instead of 15 total I get 15x15). I have tried this way:

def transform(soup):
    for job in soup.select('.result'):
        title = job.select_one('.jobTitle').get_text(' ')
        company = job.find(class_='companyName').text
        tag = soup.find_all('a', class_='result')
        for x in tag:
            print(x.get('id'))
        print(f'title: {title}')
        print(f'company: {company}')

And this way:

def transform(soup):
    for job in soup.select('.result'):
        title = job.select_one('.jobTitle').get_text(' ')
        company = job.find(class_='companyName').text
        tag = soup.find_all('a', class_='result')
        for x in tag:
            print(x.get('id'))
            print(f'title: {title}')
            print(f'company: {company}')

The second way is the closest to my desired result; however, instead of getting 1 title, 1 company, and 1 ID per posting, adding up to 15 total job postings, I get every ID returned with each job posting, so 15x15.

The desired result is just to get it returned as:

title
company
ID
title
company
ID

ANSWER

Answered 2022-Feb-19 at 20:51

You still have the job element and extract information from it, so why not simply extract the id from it as well? job.get('id') should work for you:

def transform(soup):
    for job in soup.select('.result'):
        title = job.select_one('.jobTitle').get_text(' ')
        company = job.find(class_='companyName').text
        id = job.get('id')
        print(f'title: {title}')
        print(f'company: {company}')
        print(f'id: {id}')
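If you want to collect the scraped postings rather than just print them, a minimal sketch (an assumption, not part of the original answer) could reuse the same selectors and gather each job into a dict for later processing:

# A minimal sketch (not from the original answer): collect each posting into a
# dict so the id, title and company can be reused, e.g. written to CSV.
from bs4 import BeautifulSoup

def transform_to_list(soup: BeautifulSoup) -> list[dict]:
    jobs = []
    for job in soup.select('.result'):
        jobs.append({
            'id': job.get('id'),                                  # anchor id, e.g. job_a678f3bfc20cb753
            'title': job.select_one('.jobTitle').get_text(' '),
            'company': job.find(class_='companyName').text,
        })
    return jobs

# Usage with the existing extract() helper:
# for row in transform_to_list(extract()):
#     print(row)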

Source https://stackoverflow.com/questions/71189021

QUESTION

Specific argument causes argparse to parse arguments incorrectly

Asked 2021-Dec-27 at 21:25

I am using Python's argparse in a script that has so far worked perfectly. However, passing a specific filepath as an argument causes the parser to fail.

Here is my argparse setup:

import argparse
import sys

parser = argparse.ArgumentParser(prog="writeup_converter.py", description="Takes a folder of Obsidian markdown files and copies them across to a new location, automatically copying any attachments. Options available include converting to a new set of Markdown files, removing and adding prefixes to attachments, and converting for use on a website")

#positional arguments
parser.add_argument("source_folder", help="The folder of markdown files to copy from.")
parser.add_argument("source_attachments", help="The attachments folder in your Obsidian Vault that holds attachments in the notes.")
parser.add_argument("target_folder", help="The place to drop your converted markdown files")
parser.add_argument("target_attachments", help="The place to drop your converted attachments. Must be set as your attachments folder in Obsidian (or just drop them in the root of your vault if you hate yourself)")

#optional flags
parser.add_argument("-r", "--remove_prefix", help="Prefix to remove from all your attachment file paths.")
parser.add_argument("-v", "--verbose", action="store_true", help="Verbose mode. Gives details of which files are being copied. Disabled by default in case of large directories")
parser.add_argument("-w", "--website", help="Use website formatting when files are copied. Files combined into one markdown file with HTML elements, specify the name of this file after the flag")
parser.add_argument("-l", "--asset_rel_path", help="Relative path for site assets e.g. /assets/images/blogs/..., include this or full system path will be added to links")

print(sys.argv)
exit()

#parse arguments
args = parser.parse_args()

I've added the print and exit calls for debugging purposes. Previously, when I ran the program with this configuration, it worked well; however, this set of arguments produces a strange error:

PS D:\OneDrive\Documents\writeup-converter> python .\writeup_converter.py -v -r Cybersecurity "..\Personal-Vault\Cybersecurity\SESH\2021-22 Sessions\Shells Session Writeups\" "..\Personal-Vault\Attachments\" "..\Cybersecurity-Notes\Writeups\SESH\DVWA\" "..\Cybersecurity-Notes\Attachments\"
usage: writeup_converter.py [-h] [-r REMOVE_PREFIX] [-v] [-w WEBSITE] [-l ASSET_REL_PATH] source_folder source_attachments target_folder target_attachments
writeup_converter.py: error: the following arguments are required: source_attachments, target_folder, target_attachments

It does not seem to recognise the positional arguments, which are definitely present. I added the debugging statements above to see what the state of the arguments was according to Python:

PS D:\OneDrive\Documents\writeup-converter> python .\writeup_converter.py -v -r Cybersecurity "..\Personal-Vault\Cybersecurity\SESH\2021-22 Sessions\Shells Session Writeups\" "..\Personal-Vault\Attachments\" "..\Cybersecurity-Notes\Writeups\SESH\DVWA\" "..\Cybersecurity-Notes\Attachments\"
['.\\writeup_converter.py', '-v', '-r', 'Cybersecurity', '..\\Personal-Vault\\Cybersecurity\\SESH\\2021-22 Sessions\\Shells Session Writeups" ..\\Personal-Vault\\Attachments\\ ..\\Cybersecurity-Notes\\Writeups\\SESH\\DVWA\\ ..\\Cybersecurity-Notes\\Attachments\\']

As you can see, the four positional arguments have been combined into one. Experimenting further, I found that the first argument specifically causes this issue:

PS D:\OneDrive\Documents\writeup-converter> python .\writeup_converter.py a b c d
['.\\writeup_converter.py', 'a', 'b', 'c', 'd']
PS D:\OneDrive\Documents\writeup-converter> python .\writeup_converter.py "a b" b c d
['.\\writeup_converter.py', 'a b', 'b', 'c', 'd']
PS D:\OneDrive\Documents\writeup-converter> python .\writeup_converter.py "a\ b" b c d
['.\\writeup_converter.py', 'a\\ b', 'b', 'c', 'd']
PS D:\OneDrive\Documents\writeup-converter> python .\writeup_converter.py "a\ b" "b" c d
['.\\writeup_converter.py', 'a\\ b', 'b', 'c', 'd']
PS D:\OneDrive\Documents\writeup-converter> python .\writeup_converter.py "..\Personal-Vault\Cybersecurity\SESH\2021-22 Sessions\Shells Session Writeups\" "b" c d
['.\\writeup_converter.py', '..\\Personal-Vault\\Cybersecurity\\SESH\\2021-22 Sessions\\Shells Session Writeups" b c d']

As you can see, the arguments are parsed correctly until the string "..\Personal-Vault\Cybersecurity\SESH\2021-22 Sessions\Shells Session Writeups\" is used. I can't figure out a reason for this, so any ideas would be appreciated. This behaviour occurs in both Python and CMD.

ANSWER

Answered 2021-Dec-27 at 21:25

About ten seconds after posting this I realised the error, thanks to Stack Overflow's syntax highlighting: the trailing backslash in the path was escaping the closing quotation mark. Escaping the backslash (doubling it) causes argparse to behave correctly:

PS D:\OneDrive\Documents\writeup-converter> python .\writeup_converter.py -v -r Cybersecurity "..\Personal-Vault\Cybersecurity\SESH\2021-22 Sessions\Shells Session Writeups\\" ..\Personal-Vault\Attachments\ ..\Cybersecurity-Notes\Writeups\SESH\DVWA\ ..\Cybersecurity-Notes\Attachments\
['.\\writeup_converter.py', '-v', '-r', 'Cybersecurity', '..\\Personal-Vault\\Cybersecurity\\SESH\\2021-22 Sessions\\Shells Session Writeups\\', '..\\Personal-Vault\\Attachments\\', '..\\Cybersecurity-Notes\\Writeups\\SESH\\DVWA\\', '..\\Cybersecurity-Notes\\Attachments\\']
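As a defensive measure inside the script itself, a small sketch like the following (my assumption, not part of writeup_converter.py) could flag arguments that were merged because a trailing backslash escaped the closing quote:

# Hypothetical helper (not in the original script): a stray double quote inside
# an argument usually means PowerShell/CMD treated a trailing \" as an escaped
# quote and merged several paths into one argument.
import sys

def warn_on_merged_paths(argv: list[str]) -> None:
    for arg in argv[1:]:
        if '"' in arg:
            print(f'Suspicious argument (did a trailing \\ escape the closing quote?): {arg!r}')

warn_on_merged_paths(sys.argv)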

Source https://stackoverflow.com/questions/70500553

QUESTION

How do I adjust my tibble to get a grouped bar chart in ggplot2?

Asked 2021-Nov-22 at 04:25

I think the code itself I'm using for a grouped bar chart is roughly correct. However, my tibble doesn't have a way to call the three categories I need (Views, Interactions, and Comments). I have a conceptual issue in making this work.

This is what I'm trying to execute in ggplot2:

bp_vic <- ggplot(data, aes(x = Day, y = value, fill = category)) +
  geom_bar(position = 'dodge', stat = 'identity')
bp_vic

The value and fill mappings may be off. However, I think the main issue is not having a proper category column to refer to.

My tibble has six columns. The last three are what I'm trying to plot: integer counts for Views, Interactions, and Comments.

This is my script file and this is the CSV I'm generating my tibble from.

I have successfully executed this for individual columns only:

bp_v <- ggplot(data, aes(x = Day, y = Views)) + geom_col()
bp_v
dput(data)
structure(list(Day = c(-3L, -2L, -1L, 1L, 2L, 3L, 4L, 5L, 6L, 
7L, 8L, 9L, 10L, 11L, 12L, 13L, 14L, 15L, 16L, 17L, 18L, 19L, 
20L, 21L, 22L, 23L, 24L, 25L), Category = c("SpaceForce", "CyberSecurity", 
"Celebration", "Update", "Update", "SpaceNews", "Data", "USSFExplained", 
"USSFExplained", "USSFExplained", "USSFExplained", "USSFExplained", 
"USSFExplained", "USSFExplained", "Nostalgia", "Data", "Publishing", 
"SpaceForce", "Military", "SpaceNews", "Publishing", "Office", 
"Office", "Office", "Office", "Data", "Update", "Space"), Type = c("Share", 
"Photo_1", "Photo_5", "Text", "Text", "Text", "Photo_1", "Text", 
"Photo_1", "Photo_1", "Photo_1", "Text", "Text", "Text", "Photo_1", 
"Photo_1", "Text", "Text", "Text", "Photo_3", "Text", "Text", 
"Photo_3", "Text", "Text", "Photo_3", "Photo_1", "Photo_1"), 
    Views = c(26L, 99L, 7106L, 517L, 655L, 828L, 2183L, 911L, 
    467L, 247L, 299L, 245L, 674L, 668L, 721L, 1358L, 383L, 701L, 
    281L, 1339L, 770L, 373L, 482L, 386L, 166L, 454L, 366L, 318L
    ), Interactions = c(0L, 0L, 125L, 8L, 10L, 9L, 16L, 17L, 
    10L, 9L, 9L, 7L, 10L, 8L, 9L, 10L, 13L, 9L, 11L, 18L, 13L, 
    6L, 4L, 9L, 4L, 11L, 6L, 10L), Comments = c(0L, 0L, 35L, 
    4L, 12L, 11L, 7L, 10L, 9L, 1L, 2L, 4L, 8L, 5L, 2L, 11L, 10L, 
    13L, 0L, 19L, 9L, 5L, 4L, 4L, 0L, 8L, 5L, 6L)), class = "data.frame", row.names = c(NA, 
-28L))

ANSWER

Answered 2021-Nov-22 at 04:25

You want to use the tidyverse to put the data into a usable (and tidy) format before trying to plot it.

library(tidyverse)

data <-
  data %>% 
  tidyr::pivot_longer(
    cols = c(Views, Interactions, Comments),
    names_to = "Section",
    values_to = "values"
  )

New format

head(data)
# A tibble: 6 × 5
    Day Category      Type    Section      values
  <int> <chr>         <chr>   <chr>         <int>
1    -3 SpaceForce    Share   Views            26
2    -3 SpaceForce    Share   Interactions      0
3    -3 SpaceForce    Share   Comments          0
4    -2 CyberSecurity Photo_1 Views            99
5    -2 CyberSecurity Photo_1 Interactions      0
6    -2 CyberSecurity Photo_1 Comments          0

Then, you can plot the grouped bar chart.

ggplot(data, aes(fill = Section, y = values, x = Day)) +
  geom_bar(position = "dodge", stat = "identity")

Output (though most bars are difficult to see because of the one really high value).

Or you could just as easily plot by Category rather than by Day if needed, by using x = Category instead of x = Day.

If you would like to change the order of the categories, then you can make Category a factor, which you can do without changing the dataframe.

# Create order for the categories. If you want to do it by the number of views, then you can create the list from your dataframe.
level_order <- data %>%
  dplyr::filter(Section == "Views") %>%
  dplyr::arrange(desc(values)) %>%
  pull(Category) %>%
  unique()

# Then, set category as a factor and include the ordered categories.
ggplot(data, aes(fill = Section, y = values, x = factor(Category, level = level_order))) +
  geom_bar(position = "dodge", stat = "identity")


Source https://stackoverflow.com/questions/70051541

QUESTION

How to make a model fit the dataset in Keras?

Asked 2021-Nov-14 at 20:30

The idea is to make a program that can detect whether an attack happened or not.

I got stuck fitting the model.

Imported libraries:

from keras.models import Sequential
from keras.layers import Dense
from keras.layers import Flatten
from keras.layers.convolutional import Conv1D
from keras.layers.convolutional import MaxPooling1D
from keras.layers.embeddings import Embedding
from keras.preprocessing import sequence
import pandas as pd
Dataset Details:

https://www.unsw.adfa.edu.au/unsw-canberra-cyber/cybersecurity/ADFA-NB15-Datasets/bot_iot.php

https://ieee-dataport.org/documents/bot-iot-dataset

[picture of the dataset files]

As you can see in the attack column, I want the program to tell whether an attack happened or not.

This is the model:

model = Sequential()
model.add(Conv1D(128, 5, activation='relu'))
model.add(MaxPooling1D())
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(10,activation='relu'))
model.add(Dense(1,activation='sigmoid'))
model.add(Flatten())

And the model compile step:

model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

Model fitting part (here is my issue):

model.fit(train, test, epochs=50, batch_size=30)

Error:

ValueError: Data cardinality is ambiguous:
  x sizes: 2934817
  y sizes: 733705
Make sure all arrays contain the same number of samples.

From the error message it's clear the files do not have the same number of rows, so I tried to use only the test file and split it into two parts: the first part is columns 0 to 16, and the other is column 16.

x = test.iloc[:,0:16]
y = test.iloc[:,16]
model.fit(x, y, epochs=50, batch_size=30)

Error:

ValueError: Failed to convert a NumPy array to a Tensor (Unsupported object type int).

I have tried to convert everything to float, but it didn't work out; I still have the same problem.

ANSWER

Answered 2021-Nov-14 at 14:36

The first problem I'm finding is that when using .fit() you need to pass the x and y values, not the train and test sets, and that's why you are getting the error. Keras is trying to predict your full test dataset based on the train dataset, which of course makes no sense.

The second error suggests you are passing the right variables to the model (the last column being the target, defined as y, and the predictors defined as x); however, there seems to be an issue with how the data is formatted. Without access to the data it's hard to solve, but are all columns numerical? If so, as addressed here, this might do the trick:

import numpy as np

x = np.asarray(x).astype('float32')

If the data is not numeric across all entry points, then you might need to do some preprocessing to ensure it is fully numerical; some alternatives for encoding the non-numeric columns are worth looking into (a rough sketch follows below).

Once your dataset is all of numerical types, you should be able to use it to train the model without issues.
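Putting the pieces of the answer together, here is a minimal end-to-end sketch. Everything in it is an assumption for illustration: the file name is hypothetical, non-numeric columns are one-hot encoded with pandas, the Conv1D layer from the question is dropped (it would need 3D input), and binary_crossentropy replaces categorical_crossentropy to match the single sigmoid attack/no-attack output.

# Illustrative sketch only, not the original poster's final code.
import numpy as np
import pandas as pd
from keras.models import Sequential
from keras.layers import Dense

df = pd.read_csv('bot_iot_test.csv')                      # hypothetical file name
# One-hot encode any remaining string columns so everything is numeric.
df = pd.get_dummies(df, columns=list(df.select_dtypes(include='object').columns))

x = np.asarray(df.drop(columns=['attack'])).astype('float32')   # features
y = np.asarray(df['attack']).astype('float32')                  # binary attack label

model = Sequential([
    Dense(12, input_dim=x.shape[1], activation='relu'),
    Dense(10, activation='relu'),
    Dense(1, activation='sigmoid'),                       # attack / no attack
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(x, y, epochs=5, batch_size=30)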

Source https://stackoverflow.com/questions/69963841

QUESTION

How to change my css to make hyper link visible [ with minimum sample code ]?

Asked 2021-Oct-12 at 15:34

I have a site with CSS, but the hyperlinks are not visible [right side]. How do I change my HTML/CSS so that the hyperlinks are visible [like the left side in the following image]?


I've simplified my site to show the problem, and here is the minimal sample code:

1&lt;!DOCTYPE html&gt;
2&lt;html lang=&quot;en&quot;&gt;
3  &lt;head&gt;
4    &lt;meta charset=&quot;UTF-8&quot;&gt;
5    &lt;meta name=&quot;viewport&quot; content=&quot;width=device-width, initial-scale=1&quot;&gt;
6    &lt;title&gt;GATE Cyber Technology : Award Winning Innovation For Identity And Access Management&lt;/title&gt;
7    &lt;meta name=&quot;description&quot; content=&quot;GATE Cyber Technology LLC. INTERCEPTION-RESISTANT AUTHENTICATION AND ENCRYPTION SYSTEM AND METHOD. Introducing a breakthrough digital security innovation : Graphic Access Tabular Entry [ GATE ], an interception-resistant authentication and encryption system and method. With the GATE system you are not afraid that you are watched when you enter passwords, and you are not afraid that the password will be intercepted, the GATE innovative method is designed to be peek-resistant and interception-resistant. The GATE system and method will offer you better digital security. Identity and Access Management (IAM)&quot;&gt;
8    &lt;meta name=&quot;keywords&quot; content=&quot;GATE Cyber Technology LLC. INTERCEPTION-resistant AUTHENTICATION AND ENCRYPTION SYSTEM AND METHOD, Graphic Access Tabular Entry [ GATE ], GATE security, GATE authentication, GATE login, GATE user authentication, GATE password, GATE passcode, peek-resistant, online security, digital security, passwords, password protection, strong password, strong cybersecurity, strong user authentication, prevent password loss, prevent user credential loss, passcode, cyber security, pin, login, logon, digital access, online access, access control, online protection, digital protection, online defence, digital defence, message encryption, message decryption, signal encryption, signal decryption, overcome weakness of traditional password, the GATE system, award winning, better than fingerprinting, better than iris scanning, safer than keyfob, better than password manager, safer password entry, Identity and Access Management (IAM), GATE defeats wiretapping, keylogging, peeking, phishing and dictionary attack, no restrictions of traditional password's lowercase, uppercase, numbers and special characters requirements, easy to use&quot;&gt;
9    &lt;meta name=&quot;google-site-verification&quot; content=&quot;cXY5hsdt7XCjR_k96nha7Hn5uW4fw_1u6mc2LWDyAQ0&quot; /&gt;
10    &lt;link rel=&quot;shortcut icon&quot; href=&quot;favicon.ico&quot;&gt;
11
12    &lt;link media=&quot;all&quot; type=&quot;text/css&quot; rel=&quot;stylesheet&quot; href=&quot;https://cdn.ahrefs.com/assets/css/bootstrap.min.css&quot;&gt;
13    &lt;link media=&quot;all&quot; type=&quot;text/css&quot; rel=&quot;stylesheet&quot; href=&quot;https://fonts.googleapis.com/css?family=Lato:400,300,100italic,100,300italic,400italic,700,700italic,900,900italic&quot;&gt;
14    &lt;link media=&quot;screen&quot; type=&quot;text/css&quot; rel=&quot;stylesheet&quot; href=&quot;https://cdn.ahrefs.com/assets/css/home-responsive.css?20180815-001&quot;&gt;
15
16    &lt;link media=&quot;all&quot; type=&quot;text/css&quot; rel=&quot;stylesheet&quot; href=&quot;css/bootstrap.min.css&quot;&gt;
17    &lt;link media=&quot;all&quot; type=&quot;text/css&quot; rel=&quot;stylesheet&quot; href=&quot;css/css.css&quot;&gt;
18    &lt;link media=&quot;screen&quot; type=&quot;text/css&quot; rel=&quot;stylesheet&quot; href=&quot;css/home-responsive.css&quot;&gt;
19
20    &lt;meta property=&quot;og:image&quot; content=&quot;GATE_1.PNG&quot;&gt;
21
22    &lt;style&gt;
23      div.Intro
24      {
25        font-size: 100%;
26        text-align: left;
27      }
28
29      div.Table
30      {
31        font-size: 218%;
32        text-align: center;
33      }
34
35      a:hover { color:#ddeeff; }
36      a:visited { color:#E8E8E8 }
37     
38      tr a{ font-size: 18px;color:#aabbcc; }
39      tr a:hover { color:#ddeeff; }
40
41      .pic-container-1{display:block; position:relative; }
42      .pic-container-1 .pic-box{display:block;}
43      .pic-container-1 .pic-box img{display:block;}
44      .pic-container-1 .pic-hover{position:absolute; top:0px; left:104px; display:none;}
45      .pic-container-1:hover .pic-hover{display:block;}
46
47      .pic-container-2{display:block; position:relative; }
48      .pic-container-2 .pic-box{display:block;}
49      .pic-container-2 .pic-box img{display:block;}
50      .pic-container-2 .pic-hover{position:absolute; top:0px; left:18px; display:none;}
51      .pic-container-2:hover .pic-hover{display:block;}
52
53      .pic-container-3{display:block; position:relative; }
54      .pic-container-3 .pic-box{display:block;}
55      .pic-container-3 .pic-box img{display:block;}
56      .pic-container-3 .pic-hover{position:absolute; top:0px; left:20px; display:none;}
57      .pic-container-3:hover .pic-hover{display:block;}
58
59      .pic-container-4{display:block; position:relative; }
60      .pic-container-4 .pic-box{display:block;}
61      .pic-container-4 .pic-box img{display:block;}
62      .pic-container-4 .pic-hover{position:absolute; top:0px; left:18px; display:none;}
63      .pic-container-4:hover .pic-hover{display:block;}
64
65      #GATE_Frame_1 { width: 78%; height: auto; }
66      #GATE_Frame_2 { width: 98%; height: auto; }
67
68      #Balance { width: 80%; height: auto; }
69      
70      #Ted_Murphree_img { width: 36vw; height: auto; }
71      #Scott_Schober_img { width: 36vw; height: auto; }
72      #Cary_Pool_img { width: 36vw; height: auto; }
73      #Eduard_B_img { width: 36vw; height: auto; }
74      #Jonathan_Rosenoer_img { width: 36vw; height: auto; }
75
76      #Traditional_vs_GATE_1 { width: 96%; height: auto; }
77      #Traditional_vs_GATE_2 { width: 99.5%; height: auto; }
78
79      #modal
80      {
81        display: none;
82        position: fixed;
83        width: 100vw;
84        height: 100vh;
85        max-height: 100vh;
86        top: 0;
87        left: 0;
88        background: rgba(24, 24, 24, .6);
89        z-index: 999;
90      }
91      #modal .content
92      {
93        position: relative;
94        width: 55%;
95        height: 65vh;
96        margin: auto; /* allows horyzontal and vertical alignment as .content is in flex container */
97      }
98      #modal .content .yt-video
99      {
100        display: block;
101        width: 100%;
102        height: calc(100% - 45px);
103      }
104      #modal .content .title
105      {
106        box-sizing: border-box;
107        height: 45px;
108        line-height: 23px;
109        padding: 12px 4px;
110        margin: 0;
111        background: #007bff;
112        color: #fff;
113        text-align: center;
114        font-size: 26px;
115        max-width: 100%;
116        white-space: nowrap;
117        overflow: hidden;
118        text-overflow: ellipsis;
119      }
120      #modal .close
121      {
122        position: absolute;
123        top: 0;
124        right: 0;
125        width: 45px;
126        height: 45px;
127        line-height: 36px;
128        text-align: center;
129        border: 0;
130        font-weight: bold;
131        font-size: 38px;
132        color: #fff;
133        background: #366;
134        cursor: pointer;
135        transition: background .2s;
136      }
137      #modal .content .close .a { font-size:38px;color: #ffffff; }
138      #modal .close:hover, #modal .close:active { background: #ff0000; }
139      #modal.is-visible { display: flex; }
140
141      html, body, div, span, applet, object, iframe, h1, h2, h3, h4, h5, h6, p, blockquote, pre, a, abbr, acronym, address, big, cite, code, del, dfn, em, img, ins, kbd, q, s, samp, small, strike, strong, sub, sup, tt, var, b, u, i, center, dl, dt, dd, ol, ul, li,
142      fieldset, form, label, legend, table, caption, tbody, tfoot, thead, tr, th, td, article, aside, canvas, details, embed, figure, figcaption, footer, header, hgroup, menu, nav, output, ruby, section, summary, time, mark, audio, video
143      {
144        margin: 0;
145        padding: 0;
146        border: 0;
147        font-size: 100%;
148        font: inherit;
149        vertical-align: middle;
150      }
151
152      /* HTML5 display-role reset for older browsers */
153      article, aside, details, figcaption, figure, footer, header, hgroup, menu, nav, section { display: block; }
154      body { line-height: 1; }
155      // ol, ul { list-style: none; }
156      blockquote, q { quotes: none; }
157      blockquote:before, blockquote:after,
158      q:before, q:after
159      {
160        content: '';
161        content: none;
162      }
163      table
164      {
165        border-collapse: collapse;
166        border-spacing: 0;
167      }
168    &lt;/style&gt;
169  &lt;/head&gt;
170
171  &lt;body class=&quot;page__guest ahrefs page-home&quot;&gt;
172    &lt;div id=&quot;localizejs&quot;&gt;
173      &lt;div class=&quot;content&quot;&gt;
174        &lt;a id=&quot;Awards&quot;&gt;&lt;/a&gt;
175        &lt;div class=&quot;datas&quot;&gt;
176          &lt;div class=&quot;container center&quot;&gt;
177            &lt;Table Cellpadding=6&gt;
178              &lt;Tr&gt;
179                &lt;Td Align=Center&gt;&lt;Br&gt;
180                  &lt;Font Color=white&gt;&lt;A Href=http://bestech.ittn.com.cn/#/projectlist2021 target=_blank&gt;GATE has been selected&lt;/A&gt; to the &lt;A Href=&quot;2021_ZGC_Top_100_List_1.PNG&quot; target=_blank&gt;top 100&lt;/A&gt;,&lt;Br&gt; among more than 2800 technologies collected&lt;Br&gt; from all over the world at 2021 ZGC&lt;Br&gt;&lt;A Href=http://bestech.ittn.com.cn/#/home target=_blank&gt;International Technology Trade Conference&lt;/A&gt;.&lt;/Font&gt;
181                &lt;/Td&gt;
182              &lt;/Tr&gt;
183            &lt;/Table&gt;
184          &lt;/div&gt;
185        &lt;/div&gt;
186
187    &lt;!-- the modal div that will open when an anchor link is clicked to show the related video in an iframe. --&gt;
188
189    &lt;div id=&quot;modal&quot;&gt;
190      &lt;div class=&quot;content&quot;&gt;
191        &lt;div class=&quot;close&quot;&gt;&lt;a onclick = &quot;return close_iFrame();&quot;&gt;&amp;times;&lt;/a&gt;&lt;/div&gt;
192        &lt;h4 class=&quot;title&quot;&gt;.&lt;/h4&gt;
193        &lt;iframe class=&quot;yt-video&quot; allowfullscreen&gt;&lt;/iframe&gt;
194      &lt;/div&gt;
195    &lt;/div&gt;
196
197    &lt;script&gt;
198      var modal = document.getElementById('modal'),
199          closeBtn = modal.querySelector('.close'),
200          ytVideo = modal.querySelector('.content .yt-video'),
201          title = modal.querySelector('.content .title'),
202          anchors = document.querySelectorAll('a[data-target=&quot;modal&quot;]'),
203          l = anchors.length;
204
205      for (var i = 0; i &lt; l; i++)
206      {
207        anchors[i].addEventListener(&quot;click&quot;, function (e)
208        {
209          e.preventDefault();
210          title.textContent = this.dataset.videoTitle || 'No title';
211          ytVideo.src = this.href;
212          modal.classList.toggle('is-visible');
213          modal.focus();
214        });
215      }
216
217      modal.addEventListener(&quot;keydown&quot;, function (e)
218      {
219        if (e.keyCode == 27)
220        {
221          title.textContent = '';
222          ytVideo.src = '';
223          this.classList.toggle('is-visible');
224        }
225      });
226    &lt;/script&gt;
227
228    &lt;script type=&quot;text/javascript&quot;&gt;
229
230      function close_iFrame()
231      {
232        var modal = document.getElementById('modal'),
233            ytVideo = modal.querySelector('.content .yt-video');
234
235        ytVideo.src = '';
236        modal.classList.toggle('is-visible');
237
238        // Opera 8.0+
239        var isOpera = (!!window.opr &amp;&amp; !!opr.addons) || !!window.opera || navigator.userAgent.indexOf(' OPR/') &gt;= 0;
240
241        // Firefox 1.0+
242        var isFirefox = typeof InstallTrigger !== 'undefined';
243
244        // Safari 3.0+ &quot;[object HTMLElementConstructor]&quot; 
245        var isSafari = /constructor/i.test(window.HTMLElement) || (function (p) { return p.toString() === &quot;[object SafariRemoteNotification]&quot;; })(!window['safari'] || safari.pushNotification);
246
247        // Internet Explorer 6-11
248        var isIE = /*@cc_on!@*/false || !!document.documentMode;
249
250        // Edge 20+
251        var isEdge = !isIE &amp;&amp; !!window.StyleMedia;
252
253        // Chrome 1+
254        var isChrome = !!window.chrome &amp;&amp; !!window.chrome.webstore;
255
256        // Blink engine detection
257        var isBlink = (isChrome || isOpera) &amp;&amp; !!window.CSS;
258
259        var output = 'Detecting browsers by ducktyping :\n===========================\n';
260        output+='isChrome: '+isChrome+'\n';      // 57.8 % Market Share
261        output+='isSafari: '+isSafari+'\n';      // 14.0 %
262        output+='isFirefox: '+isFirefox+'\n';    // 6.0 %
263        output+='isIE: '+isIE+'\n';
264        output+='isEdge: '+isEdge+'\n';          // 5.9 %  IE + Edge
265        output+='isOpera: '+isOpera+'\n';        // 3.7 %
266        output+='isBlink: '+isBlink+'\n';
267
268//        alert(output+'[ history.length = '+history.length+' ]');
269
270        if (isChrome)                            // 57.8 % [ Will work correctly only after 3rd+ time of going to the #Videos section ]
271        {
272/*
273[1] No code : after 1st play, &quot;back&quot; plays sound
274              after 2nd play, &quot;back&quot; also plays sound, remembers history
275              after play 2 videos, 1 &quot;back&quot; plays last video, 2 &quot;back&quot; does nothing, 3 &quot;back&quot; plays 2nd last video
276              Seems to remember [ empty ] + [ video ]
277
278Memory pattern : Top [video_1] [ ] [video_2] ?
279*/
280
281          if (!sessionStorage.getItem(&quot;runOnce&quot;)) // 1st time : Remembers 1st video  // 2nd time : back to top after closing iFrame  // 3rd time+ : works correctly
282          {
283            // alert('runOnce');
284            window.history.replaceState({},&quot;Videos&quot;,&quot;#Videos&quot;);
285//            window.location.href='#Videos';
286//            history.go(0);
287            sessionStorage.setItem(&quot;runOnce&quot;,true);
288          }
289          else
290          {
291            window.history.replaceState({},&quot;Videos&quot;,&quot;#Videos&quot;);
292            history.go(-1);
293          }
294
295        }
296        else if (isSafari)                       // 14.0
297        {
298
299        }
300        else if (isFirefox)                      // 6.0 % [ Works correctly ]
301        {
302          history.go(-1);
303        }
304        else if (isIE)
305        {
306            window.history.replaceState({},&quot;Videos&quot;,&quot;#Videos&quot;);
307        }
308        else if (isEdge)                         // 5.9 %  IE + Edge
309        {
310            history.go(-1);
311        }
312        else if (isOpera)                        // 3.7 %
313        {
314            history.go(-1);
315        }
316        else if (isBlink)
317        {
318            history.go(-1);
319        }
320//alert( window.location.href );
321//        history.go(-1);
322//window.location.href = '#Videos';
323//history.replaceState({}, &quot;#Videos&quot;, &quot;#Videos&quot;);
324//alert( window.location.href );
325      }
326
327      window.onload = function()
328      {
329        // Opera 8.0+
330        var isOpera = (!!window.opr &amp;&amp; !!opr.addons) || !!window.opera || navigator.userAgent.indexOf(' OPR/') &gt;= 0;
331
332        // Firefox 1.0+
333        var isFirefox = typeof InstallTrigger !== 'undefined';
334
335        // Safari 3.0+ &quot;[object HTMLElementConstructor]&quot; 
336        var isSafari = /constructor/i.test(window.HTMLElement) || (function (p) { return p.toString() === &quot;[object SafariRemoteNotification]&quot;; })(!window['safari'] || safari.pushNotification);
337
338        // Internet Explorer 6-11
339        var isIE = /*@cc_on!@*/false || !!document.documentMode;
340
341        // Edge 20+
342        var isEdge = !isIE &amp;&amp; !!window.StyleMedia;
343
344        // Chrome 1+
345        var isChrome = !!window.chrome &amp;&amp; !!window.chrome.webstore;
346
347        // Blink engine detection
348        var isBlink = (isChrome || isOpera) &amp;&amp; !!window.CSS;
349
350        var output = 'Detecting browsers by ducktyping :\n===========================\n';
351        output+='isChrome: '+isChrome+'\n';      // 57.8 % Market Share
352        output+='isSafari: '+isSafari+'\n';      // 14.0 %
353        output+='isFirefox: '+isFirefox+'\n';    // 6.0 %
354        output+='isIE: '+isIE+'\n';
355        output+='isEdge: '+isEdge+'\n';          // 5.9 %  IE + Edge
356        output+='isOpera: '+isOpera+'\n';        // 3.7 %
357        output+='isBlink: '+isBlink+'\n';
358
359//        alert(output);
360
361        if (isIE) 
362        {
363//          alert(output);
364          var pichover=document.getElementsByClassName(&quot;pic-hover&quot;);
365          pichover[0].style.left=&quot;107px&quot;;
366          pichover[1].style.left=&quot;24px&quot;;
367          pichover[2].style.left=&quot;23px&quot;;
368          pichover[3].style.left=&quot;21px&quot;;
369       }
370     }
371
372
373    &lt;/script&gt;
374  &lt;/body&gt;
375&lt;/html&gt;
376

ANSWER

Answered 2021-Oct-11 at 16:04

but this is bad

a[href] { color: blue !important; text-decoration: underline !important; }

Source https://stackoverflow.com/questions/69529183

QUESTION

Bootstrap overflow width when writing an article with many paragraphs

Asked 2021-Oct-12 at 11:38

OVERVIEW

I am building a website to showcase the blockchain and cybersecurity projects I've worked on. So far, I've implemented two pages of my website using Bootstrap v5.1.3. I'm no front-end developer, but I still wanted to build something of my own.

Currently, I'm writing the description of one of my projects, and later I will add some images to it.

PROBLEM

I'm currently facing an issue where a horizontal scroll bar appears if I write too many paragraphs on the page, and I don't know how to make it disappear.

I'm trying to solve this so that all the paragraphs stay responsive and fit within the viewport width of the screen, without overflowing and creating the horizontal bar.

Check the image below for a better explanation.

IMAGE

[screenshot showing the page with a horizontal scroll bar at the bottom]

QUESTION

How can I solve this using only Bootstrap v5.1.3?

WEBPAGE CODE

1/* test.css */
2
3html, body {
4      margin: 0;
5      padding: 0;
6      width: 100%;
7      height: 100%;
8      font-family: Hack, monospace !important;
9      background-color: #0f0f0f;
10}
11
12body {
13      display: flex!important;
14}
15
16.wrapper {
17      background-color: #0f0f0f;
18}
19
20.text-center.h1 {
21      color: #F4364C !important;
22      font-size: 4vw !important;
23}
24
25.h6 {
26      color: #F4364C !important;
27      font-size: 1.25vw !important;
28      opacity: 0.5 !important;
29}

&lt;!DOCTYPE html&gt;
30&lt;html lang="en"&gt;
31&lt;head&gt;
32      &lt;meta charset="UTF-8"&gt;
33      &lt;meta name="viewport" content="width=device-width, initial-scale=1"&gt;
34      &lt;meta http-equiv="X-UA-Compatible" content="ie=edge"&gt;
35      &lt;meta name="author" content="Joshua"&gt;
36      
37      &lt;title&gt;Project 1 | XXX XXX&lt;/title&gt;
38      
39      &lt;!-- hack fonts --&gt;
40      &lt;link href='https://cdn.jsdelivr.net/npm/hack-font@3.3.0/build/web/hack.css' rel='stylesheet' &gt;
41
42      &lt;!-- stylesheet --&gt;
43      &lt;link href='test.css' rel='stylesheet'&gt;
44      
45      &lt;!-- bootstrap-5.1.3 --&gt;
46      &lt;link href="https://cdn.jsdelivr.net/npm/bootstrap@5.1.3/dist/css/bootstrap.min.css" rel="stylesheet" integrity="sha384-1BmE4kWBq78iYhFldvKuhfTAU6auU8tT94WrHftjDbrCEXSU1oBoqyl2QvZ6jIW3" crossorigin="anonymous"&gt;
47
48      &lt;!-- bootstrap-5.1.3 script bundle with popper --&gt;
49      &lt;script src="https://cdn.jsdelivr.net/npm/bootstrap@5.1.3/dist/js/bootstrap.bundle.min.js" integrity="sha384-ka7Sk0Gln4gmtz2MlQnikT1wXgYsOg+OMhuP+IlRH9sENBO0LRn5q+8nbTov4+1p" crossorigin="anonymous"&gt;&lt;/script&gt;
50
51&lt;/head&gt;
52&lt;body&gt;
53      &lt;div class="d-flex flex-column min-vh-100 min-vw-100 wrapper"&gt;
54            
55            &lt;!-- Project Title --&gt;
56            &lt;div class="container-fluid my-auto"&gt;
57                  &lt;p class="text-center h1"&gt;&lt;span&gt;Astronomy Star Registry&lt;/span&gt;&lt;/p&gt;
58            &lt;/div&gt;
59
60            &lt;div class="container-fluid my-auto"&gt;
61                  &lt;p class="h6"&gt;
62                        Lorem ipsum dolor sit amet, consectetur adipiscing elit. Quisque non nibh sit amet eros ullamcorper tincidunt. Curabitur sed imperdiet erat. In facilisis urna magna, ut mollis est posuere nec. Duis non neque vel libero dignissim dictum. Nullam scelerisque, sem porttitor dignissim blandit, enim felis condimentum enim, non cursus felis ex vel felis. Vestibulum ante ipsum primis in faucibus orci luctus et ultrices posuere cubilia curae; Nunc lectus odio, finibus nec porta non, varius pulvinar eros. Aenean eget vulputate lorem, sed mollis ipsum. In mollis iaculis sem, quis sodales metus sodales quis. In nec efficitur libero, quis pharetra turpis. Nunc a felis vestibulum lacus feugiat euismod. Integer id diam a arcu dictum imperdiet nec at libero. Aliquam lorem dui, faucibus non posuere vel, venenatis vel augue. Aenean lorem ex, eleifend ut dictum a, semper nec risus. Nunc varius erat tortor, vitae sagittis sem vehicula non.
63                  &lt;/p&gt;
64                  &lt;p class="h6"&gt;
65                        Aliquam erat volutpat. Pellentesque sagittis, nisi ac tempor lobortis, lacus neque posuere libero, vel maximus nibh dui non massa. Nulla at lectus vestibulum, tristique nisi at, vulputate ex. Vestibulum sit amet pharetra tortor. Sed felis nulla, finibus ut ipsum eget, pretium mollis quam. Proin urna metus, cursus non turpis vel, elementum blandit nulla. Nulla eu accumsan ipsum. Donec sodales tellus a turpis dapibus tincidunt. Praesent luctus vestibulum magna, ac feugiat metus ullamcorper eu. Mauris non elementum nunc, sed sagittis risus. Cras sed elit laoreet, faucibus ligula quis, tempus quam. Donec posuere eget eros eu pulvinar. Vestibulum justo augue, feugiat elementum erat sit amet, tempor porttitor urna. Integer malesuada mauris et ultricies sollicitudin.
66                  &lt;/p&gt;
67                  &lt;p class="h6"&gt;
68                        Cras hendrerit quis velit vel molestie. Proin ut velit metus. Sed semper et neque non rhoncus. Cras semper dui eget eros tempus, sed malesuada nisi dignissim. Aliquam ante dolor, ultricies quis varius at, pellentesque nec urna. Mauris sit amet commodo nulla, ac malesuada lacus. Proin bibendum quis quam vel volutpat. Ut pulvinar tincidunt vehicula.
69                  &lt;/p&gt;
70                  &lt;p class="h6"&gt;
71                        Phasellus sit amet vulputate neque, id mattis velit. Vivamus porttitor tellus ac est dictum lacinia. Aenean tincidunt tempus fringilla. Sed aliquam nibh ut turpis condimentum, eget malesuada nibh iaculis. Ut tincidunt at nisl vel tristique. Nam quam nunc, lacinia eget augue dictum, aliquet aliquam lectus. Aenean eleifend quam nec est tempus imperdiet.
72                  &lt;/p&gt;
73                  &lt;p class="h6"&gt;
74                        In nec leo at tellus bibendum blandit sodales at neque. Sed vel dolor in tellus lobortis imperdiet venenatis in lectus. Ut ex ex, bibendum in fringilla et, vestibulum id mauris. Nam eu lorem nisi. Donec vitae fermentum est. Quisque sodales imperdiet felis, viverra consectetur enim egestas a. Duis leo orci, malesuada nec dolor ac, efficitur consequat dui. Aliquam lobortis commodo viverra.
75                  &lt;/p&gt;
76                  &lt;p class="h6"&gt;
77                        Nunc vulputate ultricies metus in molestie. Mauris ultrices metus feugiat augue mollis ultrices. Quisque ac mattis enim, sed suscipit orci. Fusce eu enim tempor, bibendum ligula quis, faucibus ligula. Aenean nec iaculis tortor, eu suscipit sem. Proin in elit at lectus euismod lacinia. Quisque ac auctor felis, eget ultrices orci. Curabitur accumsan, massa dictum pellentesque feugiat, mauris velit tincidunt mi, ut porta nisl nibh id nisi. Nam non facilisis arcu. Aliquam eros est, elementum a leo sit amet, porttitor euismod ligula. Maecenas tellus massa, molestie ut ultrices at, finibus ac mauris.
78                  &lt;/p&gt;
79                  &lt;p class="h6"&gt;
80                        Aliquam congue faucibus libero. Aenean sed suscipit ipsum. Aenean varius eleifend metus in pulvinar. Ut dapibus condimentum vehicula. Sed dictum arcu nulla, eget semper turpis fermentum at. Nam congue pretium rutrum. Mauris sit amet mauris sagittis, pulvinar nunc et, posuere diam.
81                  &lt;/p&gt;
82                  &lt;p class="h6"&gt;
83                        Nunc tortor elit, interdum eget lacinia sed, tincidunt quis ex. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Vivamus et nunc eu nibh pulvinar eleifend. Pellentesque porttitor feugiat placerat. In at felis est. Etiam scelerisque velit pharetra, blandit erat non, mattis ex. Aenean congue tortor nec diam maximus, eget auctor nisi accumsan. Sed at dignissim sem, eu placerat tellus. Curabitur lobortis dui nec lorem gravida pellentesque. Duis sagittis, tortor sit amet dapibus finibus, nisi lacus maximus tellus, nec convallis orci velit non libero. Cras sodales, sem in sodales tincidunt, nisi magna facilisis felis, imperdiet elementum erat turpis a dui. Duis non felis pretium, viverra dui eget, condimentum erat. Nunc lobortis convallis felis, ac scelerisque sem cursus a. Ut in gravida tortor. Cras porttitor sapien sem. Aenean cursus erat et libero scelerisque placerat.
84                  &lt;/p&gt;
85                  &lt;p class="h6"&gt;
86                        Suspendisse potenti. Sed varius ipsum sem, imperdiet vehicula orci pharetra sit amet. Nulla facilisi. Integer faucibus sed tellus quis cursus. Donec lacinia varius ipsum, vitae bibendum justo pharetra vel. Nunc facilisis a dolor sit amet maximus. In nec leo iaculis, pharetra tortor ac, imperdiet arcu. Duis non rhoncus enim, vehicula tincidunt orci. Ut in augue at ante sagittis efficitur ac eget tortor. Nunc eget felis ac quam tempor volutpat. Phasellus id volutpat tortor. Sed cursus eros at interdum convallis. Morbi ullamcorper felis eget massa porttitor pulvinar sed vitae purus. Ut iaculis ante eget ipsum congue, ut efficitur diam condimentum. Etiam lobortis dolor est, sed fringilla diam placerat eget.
87                  &lt;/p&gt;
88                  &lt;p class="h6"&gt;
89                        Vivamus consectetur, nisi in dapibus vehicula, ipsum eros congue nunc, a posuere nisl mauris vitae sem. Sed interdum placerat commodo. Quisque id molestie sapien. Vestibulum vitae tempus ligula. Morbi eu molestie risus. Vivamus ac sapien tincidunt, hendrerit nibh ut, sagittis lacus. Maecenas pellentesque elementum libero non pretium. Proin in sodales massa. Praesent eu blandit libero. Interdum et malesuada fames ac ante ipsum primis in faucibus.
90                  &lt;/p&gt;
91            &lt;/div&gt;
92
93      &lt;/div&gt;
94&lt;/body&gt;
95&lt;/html&gt;

ANSWER

Answered 2021-Oct-12 at 11:38

Please remove the min-vw-100 class from your wrapper div. min-vw-100 sets min-width: 100vw, and 100vw includes the width of the vertical scrollbar, so once the page is long enough to scroll vertically the wrapper becomes wider than the visible area and creates the horizontal bar. min-vh-100 is enough to fill the screen:

<div class="d-flex flex-column min-vh-100 wrapper">

Source https://stackoverflow.com/questions/69539428

QUESTION

Faster way than nested for loops for custom conditions on multiple columns in two DataFrames

Asked 2021-Jul-16 at 20:35

I have two DataFrames as below:

1df1
2+------------+-------------------+-------------+
3| Name       | Topic             |   Date      |
4+------------+-------------------+-------------+
5|        ABC |  Data Science     | 2020-01-01  |
6|        DEF |  Machine Learning | 2021-03-06  |
7|        ABC |  Cybersecurity    | 2021-01-05  |
8|        BHL |  Cloud Computing  | 2020-11-09  |
9+------------+-------------------+-------------+
10
11It has around 50,000 rows
12

The second DataFrame has several columns, but I am interested in only the following three:

12df2
13+------------------------------------+------+-------------+
14| Description                        | Name | Created Date|
15+------------------------------------+------+-------------+
16| This is good Data Science project. |  XYZ | 2021-06-04  |
17| Cybersecurity is important.        |  BBB | 2021-02-03  |
18| I am Data Science Professional     |  ABC | 2021-02-08  |
19| Machine Learning is strategic.     |  DEF | 2021-03-01  |
20+------------------------------------+------+-------------+
21
22It has around 300,000 rows.
23

I want to find all the rows from df2 where:

For each unique (Name, Topic, Date) row in df1, find the rows in df2 where 'Name' matches, 'Created Date' falls within the six months after 'Date' from df1, and the 'Topic' appears in the 'Description'.

I have used two for loops to iterate over the rows of both DataFrames, as shown below. The problem is that with this many rows, iterating row by row is not a good approach. Can you please suggest a faster, more efficient way to do this? I also want to attach 'Topic' and 'Date' from df1 to each matching row of df2 (some kind of merge, but I am not sure how).

My code is as follows:

23import pandas as pd
24from dateutil.relativedelta import relativedelta
25
26df1 = df1.drop_duplicates()  # Drop duplicate entries
27
28df_final = pd.DataFrame()
29
30for index1, row1 in df1.iterrows():
31    future_date = row1['Date'] + relativedelta(months=6)
32    for index2, row2 in df2.iterrows():
33        if ((row1['Name'] == row2['Name']) and (row1['Date'] &lt; row2['Created Date'] &lt; future_date)
34            and (row1['Topic'] in row2['Description'])):
35            df_final = df_final.append(row2)
36        else:
37             continue
38
39

ANSWER

Answered 2021-Jul-16 at 20:35

Try these steps:

# drop duplicate rows in df1
df1 = df1.drop_duplicates()
# merge df2 with df1 on Name (this also attaches Topic and Date to every matching df2 row)
df2 = df2.merge(df1, how='inner', on='Name')
# keep only rows whose Created Date falls within the six months after Date
future_date = df2['Date'] + pd.DateOffset(months=6)
df2 = df2[(df2['Created Date'] > df2['Date']) & (df2['Created Date'] < future_date)]
# keep only rows whose Description contains the Topic
df2 = df2[df2.apply(lambda x: x['Topic'] in x['Description'], axis=1)]
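
For a self-contained check, here is a minimal runnable sketch of the same merge-and-filter idea, using the sample rows from the question as toy data (the column names are taken from the question; pd.DateOffset(months=6) stands in for relativedelta, and the variable names are mine):

import pandas as pd

# Toy versions of the two DataFrames from the question.
df1 = pd.DataFrame({
    "Name": ["ABC", "DEF", "ABC", "BHL"],
    "Topic": ["Data Science", "Machine Learning", "Cybersecurity", "Cloud Computing"],
    "Date": pd.to_datetime(["2020-01-01", "2021-03-06", "2021-01-05", "2020-11-09"]),
})
df2 = pd.DataFrame({
    "Description": [
        "This is good Data Science project.",
        "Cybersecurity is important.",
        "I am Data Science Professional",
        "Machine Learning is strategic.",
    ],
    "Name": ["XYZ", "BBB", "ABC", "DEF"],
    "Created Date": pd.to_datetime(["2021-06-04", "2021-02-03", "2021-02-08", "2021-03-01"]),
})

# Pair every df2 row with the matching df1 rows on Name, then keep only pairs
# where Created Date falls in the six months after Date and the Topic string
# occurs in the Description. The merge also attaches Topic and Date to df2.
merged = df2.merge(df1.drop_duplicates(), on="Name", how="inner")
in_window = (merged["Created Date"] > merged["Date"]) & (
    merged["Created Date"] < merged["Date"] + pd.DateOffset(months=6)
)
has_topic = merged.apply(lambda r: r["Topic"] in r["Description"], axis=1)
result = merged[in_window & has_topic]
print(result)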

Source https://stackoverflow.com/questions/68414014

QUESTION

Find a hash function that makes insertion sort malfunction

Asked 2021-May-25 at 02:24

Below is the original pseudocode for Insertion Sort:

1function INSERTIONSORT(A[0..n−1])
2for i←1 to n−1 do 
3    j←i−1 
4    while j≥0 and A[j+1]&lt;A[j] do 
5          SWAP(A[j+1],A[j]) 
6          j←j−1
7

A company is using Insertion Sort inside one of their products. You are a cybersecurity expert who was hired by this company to assess any security flaws in their code. After a few tries, you managed to attack their Insertion Sort code and modify it in the following way:

7function INSERTIONSORT(A[0..n−1])
8         for i←1 to n−1 do
9         j←i−1 
10         while j≥0 and HASH(A,j+1) &lt; HASH(A,j) do 
11               SWAP(A[j+1],A[j])
12               j←j−1
13

In other words, instead of indexing the array as A[j] and A[j+1] inside the "while" condition, you now have a hash function that takes the array and an index as arguments and returns an integer. Your job is to implement specific hash functions that will cause the algorithm to malfunction in different ways.

  • a) Implement a hash function that causes Insertion Sort to keep the original array unchanged. Explain why your solution works.
  • b) Implement a hash function that causes Insertion Sort to always run in the worst case complexity, even if the resulting array does not end up getting sorted. Explain why your solution works.
  • c) Implement a hash function that causes Insertion Sort to sort the array in reverse. Explain why your solution works.

I think (a) and (b) are hash(A,j)=j and hash(A,j)=-j, but I have no idea whether that is correct, and I have no clue for (c).

ANSWER

Answered 2021-May-25 at 02:24

Part a) Original array unchanged

With HASH(A, j) = j, the while condition compares HASH(A, j+1) < HASH(A, j), i.e. j+1 < j, which is never true, so the swap is never executed and the array is left exactly as it was.

13#include &lt;stdio.h&gt;
14
15int hash(int arr[], int i) {
16    return i;
17}
18
19void insertionSort(int arr[], int n) {
20    int i, j, temp;
21
22    for (i = 1 ; i &lt;= n - 1; i++)
23    {
24        j = i-1;
25            while ( j &gt;= 0 &amp;&amp; hash(arr, j+1) &lt; hash(arr, j))
26            {           
27                temp     = arr[j];
28                arr[j]   = arr[j+1];
29                arr[j+1] = temp;
30                j--;
31            }
32    }
33}
34
35int main()
36{
37    int i;
38    int arr[] = {5, 6, 7, 3, 2 , 9, 4};
39    int n = sizeof(arr)/sizeof(arr[0]);
40    insertionSort(arr, n);
41    printf("Original array unchanged:\n");
42    for (i = 0; i &lt;= n - 1; i++)
43    {
44        printf("%d\n", arr[i]);
45    }
46    return 0;
47}

Part b) Worst-case insertion sort

With HASH(A, j) = -j, the condition -(j+1) < -j is always true, so the inner while loop runs all the way down to j = 0 on every pass of the outer loop. That forces the maximum number of comparisons and swaps, Θ(n²), regardless of the values in the array.

#include &lt;stdio.h&gt;
48
49int hash(int arr[], int i) {
50    return -i;
51}
52
53void insertionSort(int arr[], int n) {
54    int i, j, temp;
55
56    for (i = 1 ; i &lt;= n - 1; i++)
57    {
58        j = i-1;
59            while ( j &gt;= 0 &amp;&amp; hash(arr, j+1) &lt; hash(arr, j))
60            {           
61                temp     = arr[j];
62                arr[j]   = arr[j+1];
63                arr[j+1] = temp;
64                j--;
65            }
66    }
67}
68
69int main()
70{
71    int i;
72    int arr[] = {5, 6, 7, 3, 2 , 9, 4};
73    int n = sizeof(arr)/sizeof(arr[0]);
74    insertionSort(arr, n);
75    printf("In worst case(number of swaps maximum)\n");
76    for (i = 0; i &lt;= n - 1; i++)
77    {
78        printf("%d\n", arr[i]);
79    }
80    return 0;
81}

Part c) Sorted in reverse order

With HASH(A, i) = -A[i], the condition -A[j+1] < -A[j] is equivalent to A[j+1] > A[j], so the comparison is simply inverted and the algorithm performs an ordinary insertion sort into descending order.

#include <stdio.h>

/* The hash is the negated value: larger elements get smaller hashes, so the
   hash-based insertion sort arranges the array in descending order. */
int hash(int arr[], int i) {
    return -arr[i];
}

void insertionSort(int arr[], int n) {
    int i, j, temp;

    for (i = 1; i <= n - 1; i++) {
        j = i - 1;
        while (j >= 0 && hash(arr, j + 1) < hash(arr, j)) {
            temp       = arr[j];
            arr[j]     = arr[j + 1];
            arr[j + 1] = temp;
            j--;
        }
    }
}

int main()
{
    int i;
    int arr[] = {5, 6, 7, 3, 2, 9, 4};
    int n = sizeof(arr) / sizeof(arr[0]);
    insertionSort(arr, n);
    printf("Sorted in reverse order:\n");
    for (i = 0; i <= n - 1; i++) {
        printf("%d\n", arr[i]);
    }
    return 0;
}

Source https://stackoverflow.com/questions/67644985

QUESTION

Component won't render when useEffect() is run only once

Asked 2021-May-17 at 11:34

So I have the following code, where I'm fetching data to be rendered in my component. However, if the useEffect is set to run once, it won't render the data inside the component, and having it constantly running is not sustainable.

import React, { useState, useEffect } from "react";
import Chart from "react-google-charts";


const Bottom5 = ({ company }) => {
    const [quiz, setQuiz] = useState('');
    const [dataPoints, setDatapoints] = useState([]);

    useEffect(() => {
            var resultData = [];
            fetch(`http://localhost:3001/company/dashboard/bottom5/${company}`)
            .then(function(response) {
                return response.json();
            })
            .then(function(data) {
                for (var i = 0; i < data.length; i++) {
                    resultData.push({
                        label: data[i].name,
                        y: data[i].sumCorrect
                    });
                }
               setDatapoints(resultData)
            });
    },[])


    return (
            <Chart style={{display:"inline-block"}}
                width={'500px'}
                height={'300px'}
                chartType="ColumnChart"
                loader={<div>Loading Chart</div>}
                data={[
                    ['Names', 'Result'],
                    ...dataPoints.map(d => [d.label, d.y])
                ]}
                options={{
                    title: 'CyberSecurity Bottom 5',
                    chartArea: { width: '50%' },
                    hAxis: {
                        title: 'Employees',
                        minValue: 0,
                    },
                    vAxis: {
                        title: 'Total Correct',
                    },
                }}
                // For tests
                rootProps={{ 'data-testid': '1' }}
            />
    )
}

export default Bottom5;

ANSWER

Answered 2021-Apr-30 at 16:19

There is an issue with updating the array state using hooks:

setDatapoints(resultData)      // reference is the same - not updating
setDatapoints([...resultData]) // do this <<-- creates a new array reference

The reference of the array does not change, so the hook does not trigger a re-render.
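Applied to the effect from the question, the fix is a one-line change: spread resultData into a new array before handing it to the state setter. The sketch below reuses the component's own names and endpoint and leaves everything else as posted; the comment on the dependency array is an additional suggestion, not part of the accepted answer.

useEffect(() => {
    var resultData = [];
    fetch(`http://localhost:3001/company/dashboard/bottom5/${company}`)
        .then((response) => response.json())
        .then((data) => {
            for (var i = 0; i < data.length; i++) {
                resultData.push({
                    label: data[i].name,
                    y: data[i].sumCorrect
                });
            }
            // Pass a brand-new array instance so React sees a changed reference
            setDatapoints([...resultData]);
        });
}, []); // consider adding `company` here if the prop can change after mount

With a fresh array instance on every update, the state comparison sees a new reference and the Chart re-renders with the fetched data points.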

Source https://stackoverflow.com/questions/67337162

Community Discussions contain sources that include Stack Exchange Network

Tutorials and Learning Resources in Cybersecurity

Tutorials and Learning Resources are not available at this moment for Cybersecurity
