Clara | A simple to use, composable, command line parser for C++ 11 and beyond | Development Tools library
Trending Discussions on Clara
QUESTION
I have two large-ish data frames I am trying to join.
In df1, I have state codes, county codes, state names (Alabama, Alaska, etc.), county names, and years from 2010 to 2020.
In df2, I have county names, state abbreviations (AL, AK), and data for the year 2010, which I am trying to merge into df1. The issue is that without specifying the state, simply merging df1 and df2 duplicates some of the data, because some counties in different states share the same name. Hence, I am trying to also join by state to prevent this, but one frame has state abbreviations and the other has full state names.
Is there any way I can convert the state names in df1 to abbreviations, or the abbreviations in df2 to full names? Please let me know! Thank you for the help.
Edit: dput(df2)
dput(birthdata2)
structure(list(County = c("Ada County", "Adams County", "Adams County",
"Aiken County", "Alachua County", "Alamance County", "Alameda County",
"Albany County", "Alexandria city", "Allegan County", "Allegheny County",
"Allen County", "Allen County", "Anchorage Borough", "Anderson County",
"Androscoggin County", "Anne Arundel County", "Anoka County",
"Arapahoe County", "Arlington County", "Ascension Parish", "Ashtabula County",
"Atlantic County", "Baldwin County", "Baltimore city", "Baltimore County",
"Barnstable County", "Bartow County", "Bay County", "Bay County",
"Beaufort County", "Beaver County", "Bell County", "Benton County",
"Benton County", "Bergen County", "Berkeley County", "Berkeley County",
"Berks County", "Berkshire County", "Bernalillo County", "Berrien County",
"Bexar County", "Bibb County", "Black Hawk County", "Blair County",
"Blount County", "Bonneville County", "Boone County", "Boone County",
"Bossier Parish", "Boulder County", "Brazoria County", "Brazos County",
"Brevard County", "Bristol County", "Bronx County", "Broome County",
"Broward County", "Brown County", "Brunswick County", "Bucks County",
"Buncombe County", "Burlington County", "Butler County", "Butler County",
"Butte County", "Cabarrus County", "Cache County", "Caddo Parish",
"Calcasieu Parish", "Calhoun County", "Calhoun County", "Cambria County",
"Camden County", "Cameron County", "Canadian County", "Canyon County",
"Cape May County", "Carroll County", "Carroll County", "Cass County",
"Catawba County", "Cecil County", "Centre County", "Champaign County",
"Charles County", "Charleston County", "Charlotte County", "Chatham County",
"Chautauqua County", "Cherokee County", "Chesapeake city", "Chester County",
"Chesterfield County", "Chittenden County", "Citrus County",
"Clackamas County", "Clark County", "Clark County", "Clark County",
"Clark County", "Clarke County", "Clay County", "Clay County",
"Clayton County", "Clermont County", "Cleveland County", "Cobb County",
"Cochise County", "Coconino County", "Collier County", "Collin County",
"Columbia County", "Columbiana County", "Comal County", "Comanche County",
"Contra Costa County", "Cook County", "Coweta County", "Cowlitz County",
"Craven County", "Cumberland County", "Cumberland County", "Cumberland County",
"Cumberland County", "Cuyahoga County", "Dakota County", "Dallas County",
"Dane County", "Dauphin County", "Davidson County", "Davidson County",
"Davis County", "DeKalb County", "DeKalb County", "Delaware County",
"Delaware County", "Delaware County", "Denton County", "Denver County",
"Deschutes County", "DeSoto County", "District of Columbia",
"Dona Ana County", "Dorchester County", "Douglas County", "Douglas County",
"Douglas County", "Douglas County", "Douglas County", "DuPage County",
"Durham County", "Dutchess County", "Duval County", "East Baton Rouge Parish",
"Eaton County", "Ector County", "El Dorado County", "El Paso County",
"El Paso County", "Elkhart County", "Ellis County", "Erie County",
"Erie County", "Escambia County", "Essex County", "Essex County",
"Etowah County", "Fairfax County", "Fairfield County", "Fairfield County",
"Faulkner County", "Fayette County", "Fayette County", "Fayette County",
"Florence County", "Fond du Lac County", "Forsyth County", "Forsyth County",
"Fort Bend County", "Franklin County", "Franklin County", "Franklin County",
"Frederick County", "Fresno County", "Fulton County", "Galveston County",
"Gaston County", "Genesee County", "Gloucester County", "Grayson County",
"Greene County", "Greene County", "Greenville County", "Gregg County",
"Guadalupe County", "Guilford County", "Gwinnett County", "Hall County",
"Hamilton County", "Hamilton County", "Hamilton County", "Hampden County",
"Hampshire County", "Hampton city", "Hardin County", "Harford County",
"Harnett County", "Harris County", "Harrison County", "Hartford County",
"Hawaii County", "Hays County", "Henderson County", "Hendricks County",
"Hennepin County", "Henrico County", "Henry County", "Hernando County",
"Hidalgo County", "Hillsborough County", "Hillsborough County",
"Hinds County", "Honolulu County", "Horry County", "Houston County",
"Houston County", "Howard County", "Hudson County", "Humboldt County",
"Hunterdon County", "Imperial County", "Indian River County",
"Ingham County", "Iredell County", "Jackson County", "Jackson County",
"Jackson County", "Jackson County", "Jasper County", "Jefferson County",
"Jefferson County", "Jefferson County", "Jefferson County", "Jefferson County",
"Jefferson County", "Jefferson Parish", "Johnson County", "Johnson County",
"Johnson County", "Johnson County", "Johnston County", "Kalamazoo County",
"Kanawha County", "Kane County", "Kankakee County", "Kaufman County",
"Kendall County", "Kennebec County", "Kenosha County", "Kent County",
"Kent County", "Kent County", "Kenton County", "Kern County",
"King County", "Kings County", "Kings County", "Kitsap County",
"Knox County", "Kootenai County", "La Crosse County", "La Porte County",
"Lackawanna County", "Lafayette Parish", "Lake County", "Lake County",
"Lake County", "Lake County", "Lancaster County", "Lancaster County",
"Lane County", "Larimer County", "LaSalle County", "Lebanon County",
"Lee County", "Lee County", "Lehigh County", "Leon County", "Lexington County",
"Licking County", "Linn County", "Linn County", "Litchfield County",
"Livingston County", "Livingston Parish", "Lorain County", "Los Angeles County",
"Loudoun County", "Lowndes County", "Lubbock County", "Lucas County",
"Luzerne County", "Lycoming County", "Macomb County", "Macon County",
"Madera County", "Madison County", "Madison County", "Madison County",
"Mahoning County", "Manatee County", "Marathon County", "Maricopa County",
"Marin County", "Marion County", "Marion County", "Marion County",
"Martin County", "Maui County", "McHenry County", "McLean County",
"McLennan County", "Mecklenburg County", "Medina County", "Merced County",
"Mercer County", "Mercer County", "Merrimack County", "Mesa County",
"Miami County", "Miami-Dade County", "Middlesex County", "Middlesex County",
"Middlesex County", "Midland County", "Milwaukee County", "Minnehaha County",
"Missoula County", "Mobile County", "Mohave County", "Monmouth County",
"Monroe County", "Monroe County", "Monroe County", "Monroe County",
"Monterey County", "Montgomery County", "Montgomery County",
"Montgomery County", "Montgomery County", "Montgomery County",
"Montgomery County", "Morgan County", "Morris County", "Multnomah County",
"Muscogee County", "Muskegon County", "Napa County", "Nassau County",
"Navajo County", "New Castle County", "New Hanover County", "New Haven County",
"New London County", "New York County", "Newport News city",
"Niagara County", "Norfolk city", "Norfolk County", "Northampton County",
"Nueces County", "Oakland County", "Ocean County", "Okaloosa County",
"Oklahoma County", "Olmsted County", "Oneida County", "Onondaga County",
"Onslow County", "Ontario County", "Orange County", "Orange County",
"Orange County", "Orange County", "Orleans Parish", "Osceola County",
"Oswego County", "Ottawa County", "Ouachita Parish", "Outagamie County",
"Palm Beach County", "Parker County", "Pasco County", "Passaic County",
"Paulding County", "Pennington County", "Penobscot County", "Peoria County",
"Philadelphia County", "Pickens County", "Pierce County", "Pima County",
"Pinal County", "Pinellas County", "Pitt County", "Placer County",
"Plymouth County", "Polk County", "Polk County", "Portage County",
"Porter County", "Portsmouth city", "Potter County", "Prince George's County",
"Prince William County", "Providence County", "Pueblo County",
"Pulaski County", "Queens County", "Racine County", "Ramsey County",
"Randall County", "Randolph County", "Rankin County", "Rapides Parish",
"Rensselaer County", "Richland County", "Richland County", "Richmond city",
"Richmond County", "Richmond County", "Riverside County", "Robeson County",
"Rock County", "Rock Island County", "Rockingham County", "Rockland County",
"Rowan County", "Rutherford County", "Sacramento County", "Saginaw County",
"Saline County", "Salt Lake County", "San Bernardino County",
"San Diego County", "San Francisco County", "San Joaquin County",
"San Juan County", "San Luis Obispo County", "San Mateo County",
"Sandoval County", "Sangamon County", "Santa Barbara County",
"Santa Clara County", "Santa Cruz County", "Santa Fe County",
"Santa Rosa County", "Sarasota County", "Saratoga County", "Sarpy County",
"Schenectady County", "Schuylkill County", "Scott County", "Scott County",
"Sebastian County", "Sedgwick County", "Seminole County", "Shasta County",
"Shawnee County", "Sheboygan County", "Shelby County", "Shelby County",
"Skagit County", "Smith County", "Snohomish County", "Solano County",
"Somerset County", "Sonoma County", "Spartanburg County", "Spokane County",
"Spotsylvania County", "St. Charles County", "St. Clair County",
"St. Clair County", "St. Johns County", "St. Joseph County",
"St. Lawrence County", "St. Louis city", "St. Louis County",
"St. Louis County", "St. Lucie County", "St. Mary's County",
"St. Tammany Parish", "Stafford County", "Stanislaus County",
"Stark County", "Stearns County", "Strafford County", "Suffolk County",
"Suffolk County", "Sullivan County", "Summit County", "Sumner County",
"Sumter County", "Sussex County", "Sussex County", "Tangipahoa Parish",
"Tarrant County", "Taylor County", "Tazewell County", "Terrebonne Parish",
"Thurston County", "Tippecanoe County", "Tolland County", "Tom Green County",
"Tompkins County", "Travis County", "Trumbull County", "Tulare County",
"Tulsa County", "Tuscaloosa County", "Ulster County", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Union County",
"Union County", "Utah County", "Vanderburgh County", "Ventura County",
"Vigo County", "Virginia Beach city", "Volusia County", "Wake County",
"Walworth County", "Warren County", "Warren County", "Warren County",
"Washington County", "Washington County", "Washington County",
"Washington County", "Washington County", "Washington County",
"Washington County", "Washington County", "Washington County",
"Washoe County", "Washtenaw County", "Waukesha County", "Wayne County",
"Wayne County", "Wayne County", "Webb County", "Weber County",
"Weld County", "Westchester County", "Westmoreland County", "Whatcom County",
"Whitfield County", "Wichita County", "Will County", "Williamson County",
"Williamson County", "Wilson County", "Windham County", "Winnebago County",
"Winnebago County", "Wood County", "Woodbury County", "Worcester County",
"Wright County", "Wyandotte County", "Yakima County", "Yavapai County",
"Yellowstone County", "Yolo County", "York County", "York County",
"York County", "Yuma County"), State = c(" ID", " CO", " PA",
" SC", " FL", " NC", " CA", " NY", " VA", " MI", " PA", " IN",
" OH", " AK", " SC", " ME", " MD", " MN", " CO", " VA", " LA",
" OH", " NJ", " AL", " MD", " MD", " MA", " GA", " FL", " MI",
" SC", " PA", " TX", " AR", " WA", " NJ", " SC", " WV", " PA",
" MA", " NM", " MI", " TX", " GA", " IA", " PA", " TN", " ID",
" KY", " MO", " LA", " CO", " TX", " TX", " FL", " MA", " NY",
" NY", " FL", " WI", " NC", " PA", " NC", " NJ", " OH", " PA",
" CA", " NC", " UT", " LA", " LA", " AL", " MI", " PA", " NJ",
" TX", " OK", " ID", " NJ", " GA", " MD", " ND", " NC", " MD",
" PA", " IL", " MD", " SC", " FL", " GA", " NY", " GA", " VA",
" PA", " VA", " VT", " FL", " OR", " IN", " NV", " OH", " WA",
" GA", " FL", " MO", " GA", " OH", " OK", " GA", " AZ", " AZ",
" FL", " TX", " GA", " OH", " TX", " OK", " CA", " IL", " GA",
" WA", " NC", " ME", " NC", " NJ", " PA", " OH", " MN", " TX",
" WI", " PA", " NC", " TN", " UT", " GA", " IL", " IN", " OH",
" PA", " TX", " CO", " OR", " MS", " DC", " NM", " SC", " CO",
" GA", " KS", " NE", " OR", " IL", " NC", " NY", " FL", " LA",
" MI", " TX", " CA", " CO", " TX", " IN", " TX", " NY", " PA",
" FL", " MA", " NJ", " AL", " VA", " CT", " OH", " AR", " GA",
" KY", " PA", " SC", " WI", " GA", " NC", " TX", " MO", " OH",
" PA", " MD", " CA", " GA", " TX", " NC", " MI", " NJ", " TX",
" MO", " OH", " SC", " TX", " TX", " NC", " GA", " GA", " IN",
" OH", " TN", " MA", " MA", " VA", " KY", " MD", " NC", " TX",
" MS", " CT", " HI", " TX", " NC", " IN", " MN", " VA", " GA",
" FL", " TX", " FL", " NH", " MS", " HI", " SC", " AL", " GA",
" MD", " NJ", " CA", " NJ", " CA", " FL", " MI", " NC", " MI",
" MO", " MS", " OR", " MO", " AL", " CO", " KY", " MO", " NY",
" TX", " LA", " IA", " IN", " KS", " TX", " NC", " MI", " WV",
" IL", " IL", " TX", " IL", " ME", " WI", " DE", " MI", " RI",
" KY", " CA", " WA", " CA", " NY", " WA", " TN", " ID", " WI",
" IN", " PA", " LA", " FL", " IL", " IN", " OH", " NE", " PA",
" OR", " CO", " IL", " PA", " AL", " FL", " PA", " FL", " SC",
" OH", " IA", " OR", " CT", " MI", " LA", " OH", " CA", " VA",
" GA", " TX", " OH", " PA", " PA", " MI", " IL", " CA", " AL",
" IL", " IN", " OH", " FL", " WI", " AZ", " CA", " FL", " IN",
" OR", " FL", " HI", " IL", " IL", " TX", " NC", " OH", " CA",
" NJ", " PA", " NH", " CO", " OH", " FL", " CT", " MA", " NJ",
" TX", " WI", " SD", " MT", " AL", " AZ", " NJ", " IN", " MI",
" NY", " PA", " CA", " AL", " MD", " OH", " PA", " TN", " TX",
" AL", " NJ", " OR", " GA", " MI", " CA", " NY", " AZ", " DE",
" NC", " CT", " CT", " NY", " VA", " NY", " VA", " MA", " PA",
" TX", " MI", " NJ", " FL", " OK", " MN", " NY", " NY", " NC",
" NY", " CA", " FL", " NC", " NY", " LA", " FL", " NY", " MI",
" LA", " WI", " FL", " TX", " FL", " NJ", " GA", " SD", " ME",
" IL", " PA", " SC", " WA", " AZ", " AZ", " FL", " NC", " CA",
" MA", " FL", " IA", " OH", " IN", " VA", " TX", " MD", " VA",
" RI", " CO", " AR", " NY", " WI", " MN", " TX", " NC", " MS",
" LA", " NY", " OH", " SC", " VA", " GA", " NY", " CA", " NC",
" WI", " IL", " NH", " NY", " NC", " TN", " CA", " MI", " AR",
" UT", " CA", " CA", " CA", " CA", " NM", " CA", " CA", " NM",
" IL", " CA", " CA", " CA", " NM", " FL", " FL", " NY", " NE",
" NY", " PA", " IA", " MN", " AR", " KS", " FL", " CA", " KS",
" WI", " AL", " TN", " WA", " TX", " WA", " CA", " NJ", " CA",
" SC", " WA", " VA", " MO", " IL", " MI", " FL", " IN", " NY",
" MO", " MN", " MO", " FL", " MD", " LA", " VA", " CA", " OH",
" MN", " NH", " MA", " NY", " TN", " OH", " TN", " SC", " DE",
" NJ", " LA", " TX", " TX", " IL", " LA", " WA", " IN", " CT",
" TX", " NY", " TX", " OH", " CA", " OK", " AL", " NY", " AK",
" AL", " AR", " AZ", " CA", " CO", " FL", " GA", " HI", " IA",
" ID", " IL", " IN", " KS", " KY", " LA", " MA", " MD", " ME",
" MI", " MN", " MO", " MS", " MT", " NC", " ND", " NE", " NH",
" NJ", " NM", " NV", " NY", " OH", " OK", " OR", " PA", " RI",
" SC", " SD", " TN", " TX", " UT", " VA", " VT", " WA", " WI",
" WV", " WY", " NC", " NJ", " UT", " IN", " CA", " IN", " VA",
" FL", " NC", " WI", " KY", " NJ", " OH", " AR", " MD", " MN",
" OR", " PA", " RI", " TN", " UT", " WI", " NV", " MI", " WI",
" MI", " NC", " OH", " TX", " UT", " CO", " NY", " PA", " WA",
" GA", " TX", " IL", " TN", " TX", " TN", " CT", " IL", " WI",
" OH", " IA", " MA", " MN", " KS", " WA", " AZ", " MT", " CA",
" ME", " PA", " SC", " AZ"), x = c(358, 549, NA, 149, 219, 178,
1367, 257, 194, 76, 1032, 432, 114, 278, 177, 69, 574, 265, 701,
208, NA, 77, 267, 176, 1044, 826, 107, NA, 179, 78, 193, 119,
538, 246, 162, 765, 202, NA, 409, 84, 778, 177, 2441, 321, 123,
101, 91, NA, NA, 173, NA, 228, 390, 192, 372, 472, 2190, 173,
1930, 209, NA, 469, 194, 379, 340, 112, 146, 225, NA, 573, 258,
103, 87, 119, 531, 622, NA, 187, 56, NA, 78, 141, 155, NA, 76,
198, 164, 419, 74, 378, 99, 240, 222, 371, 298, 95, 77, 243,
NA, 2217, 101, 340, 116, 170, 206, 437, 202, 228, 790, 140, 129,
248, 844, NA, 73, NA, 155, 851, 6598, NA, NA, NA, 164, 575, 219,
152, 1565, 319, 3336, 388, 335, 179, 825, 374, 1104, NA, 97,
131, 537, 621, 915, 83, 141, 934, 240, NA, 353, NA, NA, 680,
76, 789, 406, 187, 1171, 732, 72, 249, 115, 857, 1265, 235, 132,
788, 305, 412, 569, 1087, 93, 1060, 776, 87, NA, NA, 319, 137,
220, NA, NA, 477, 701, NA, 1676, 138, 237, 1266, 1467, 338, 248,
523, 253, 109, 264, 137, 501, 167, NA, 562, 863, 221, 256, 1043,
436, 451, 74, 177, NA, 202, NA, 6004, 295, 835, 180, NA, NA,
111, 1167, 352, 221, 110, 1284, 1484, 317, 553, 1181, 311, NA,
185, 259, 842, 76, 70, 182, 89, 257, 148, 156, 795, 159, 150,
109, 998, 486, 972, 179, 130, 368, 551, 114, 100, 463, 169, 167,
245, 182, 518, 99, NA, NA, 70, 155, 173, 654, 102, 199, 1010,
1602, 158, 3530, 189, 449, 105, 65, 123, 177, 289, 230, 644,
618, 147, 302, 517, 215, 257, 73, 97, 117, 488, 333, 267, 274,
157, 178, 79, 118, 108, NA, 231, 9714, 345, NA, 399, 562, 272,
73, 766, 118, 164, 446, 255, 112, 242, 262, 68, 3848, 141, 267,
1403, 279, 77, 144, 263, 155, 255, 1340, 87, 285, 403, 72, 93,
110, NA, 2862, 72, 1389, 853, 167, 1329, 200, NA, 694, 160, 472,
79, 99, 722, 145, 370, 348, 1026, 634, 612, 257, 474, 134, 355,
629, 348, 164, 82, 1137, NA, 631, 176, 833, 190, 1737, 266, 166,
379, 479, 281, 377, 1045, 530, 185, 1095, 115, 186, 429, 304,
76, 2459, 1373, 108, 345, 571, 301, 95, 183, 253, 157, 1238,
NA, 396, 590, NA, NA, 78, 198, 2523, 65, 633, 853, 328, 771,
197, 206, 369, 601, 504, 119, 74, 155, 175, 1243, 497, 602, 174,
537, 2557, 174, 516, 108, 131, 225, 156, 111, 109, 566, 336,
336, 466, 1989, 235, 165, 143, 170, 309, 150, 307, 1419, 202,
NA, 1323, 2224, 2895, 622, 746, 109, 121, 655, NA, 237, 325,
1648, 178, 135, 117, 213, 144, 147, 153, 100, 142, NA, 137, 688,
343, 104, 177, 59, 203, 1526, 64, 238, 536, 320, 279, 307, 390,
387, NA, 326, 320, 133, 106, 294, 70, 597, 118, 1005, 271, NA,
204, NA, 464, 343, 93, 81, 860, 1301, 151, 552, 129, 176, 201,
103, 202, 2341, 146, 92, 173, 149, 135, 68, 113, NA, 1250, 167,
508, 809, 283, 115, 369, 2516, 2223, 401, 599, 849, 1237, 5841,
57, 1503, 911, 1908, 2311, 1294, 3545, 2704, 51, 625, 268, 1538,
1605, 2599, 3450, 744, 3550, 464, 696, 201, 45, 1143, 297, 1472,
2483, 2155, 643, 1334, 61, 1797, 604, 2718, 5530, 868, 3426,
276, 944, 1483, 1696, 673, 202, 570, 752, 184, 686, 71, 485,
389, 1018, NA, NA, 53, 178, 232, 155, 157, 390, 163, 56, 106,
NA, 59, 448, 318, 284, 2572, 121, 100, 398, 328, 267, 965, 251,
103, NA, 150, 643, 125, 458, NA, 68, 321, 142, 75, 106, 653,
NA, 239, 292, 110, 141, 125, 113, 425, 256, 197), year = c(2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010
)), row.names = c(NA, -628L), class = c("tbl_df", "tbl", "data.frame"
))
ANSWER
Answered 2022-Apr-18 at 03:52
Here's one way you could turn state abbreviations into state names using R's built-in state.abb and state.name vectors:
a <- c('NJ', 'MA', 'FL')
state.name[sapply(a, \(x) which(x == state.abb))]
[1] "New Jersey" "Massachusetts" "Florida"
Applying this to birthdata2, we might create a column called state_name. But first we need to trim the leading whitespace from the State column:
birthdata2$State <- trimws(birthdata2$State)
birthdata2$state_name <- state.name[sapply(birthdata2$State, \(x) which(x == state.abb)[1])]
County State x year state_name
1 Ada County ID 358 2010 Idaho
2 Adams County CO 549 2010 Colorado
3 Adams County PA NA 2010 Pennsylvania
4 Aiken County SC 149 2010 South Carolina
5 Alachua County FL 219 2010 Florida
6 Alamance County NC 178 2010 North Carolina
7 Alameda County CA 1367 2010 California
8 Albany County NY 257 2010 New York
9 Alexandria city VA 194 2010 Virginia
10 Allegan County MI 76 2010 Michigan
# … with 618 more rows
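The two steps the answer performs (map abbreviations to full names, then join on both county and state so that same-named counties in different states do not collide) are not R-specific. A minimal Python sketch of the same idea, where the small abbrev_to_name dictionary and the three sample rows are illustrative stand-ins for R's built-in state.abb/state.name vectors and the real data:

```python
# Stand-in for R's state.abb -> state.name lookup, truncated to the example states.
abbrev_to_name = {"ID": "Idaho", "CO": "Colorado", "PA": "Pennsylvania"}

# A few rows mimicking birthdata2, including the leading-whitespace problem.
df2 = [
    {"County": "Ada County",   "State": " ID", "x": 358},
    {"County": "Adams County", "State": " CO", "x": 549},
    {"County": "Adams County", "State": " PA", "x": None},
]

# Trim the stray whitespace, then add the full state name
# (the equivalent of trimws() followed by the state.name lookup).
for row in df2:
    row["State"] = row["State"].strip()
    row["state_name"] = abbrev_to_name[row["State"]]

# Keying on (county, state_name) now disambiguates duplicate county names.
lookup = {(r["County"], r["state_name"]): r["x"] for r in df2}
print(lookup[("Adams County", "Colorado")])  # -> 549
```

With the composite key, the two "Adams County" rows stay distinct, which is exactly why the merge needs the state column as a second join key.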
QUESTION
I am trying to create an ASP.NET Core 2.1 Razor Pages website that will load two tables on the one page. Below I have defined my classes ItemsMasters and DealsMasters. ItemsMasters stores items of clothing and DealsMasters stores deals on those items. DealsMasters stores the relevant ItemID as DealItemID.
I'm trying to have the GetItem page show the deals table, but only the entries where DealItemID matches the ID of the item on the page.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using System.ComponentModel.DataAnnotations;
using Microsoft.AspNetCore.Http;
using System.ComponentModel.DataAnnotations.Schema;
using System.Web;
using System.Security.Cryptography;
using Microsoft.AspNetCore.Mvc;
namespace demo.Data
{
public class ItemsMasters
{
[Key]
[Display(Name = "Itemid")]
public int ItemID { get; set; }
[Required(ErrorMessage = "Please enter this items Name")]
[Display(Name = "ItemName")]
public string ItemName { get; set; }
[Required(ErrorMessage = "Please enter this items Description")]
[Display(Name = "ItemDesc")]
public string ItemDesc { get; set; }
[Required(ErrorMessage = "Please enter this items Colour")]
[Display(Name = "ItemColour")]
public string ItemColour { get; set; }
[Required(ErrorMessage = "Please enter this items Catergory")]
[Display(Name = "ItemCatergory")]
public string ItemCatergory { get; set; }
[Display(Name = "ItemImage")]
public string ItemImage { get; set; }
[NotMapped]
[Display(Name = "Image")]
public IFormFile Image { get; set; }
[Display(Name = "ItemBrand")]
public string ItemBrand { get; set; }
[Display(Name = "ItemDateTime")]
public DateTime ItemDateTime { get; set; }
[Display(Name = "ItemUserID")]
public string ItemUserID { get; set; }
}
}
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using System.ComponentModel.DataAnnotations;
using Microsoft.AspNetCore.Http;
using System.ComponentModel.DataAnnotations.Schema;
using System.Web;
using System.Security.Cryptography;
namespace demo.Data
{
public class DealsMasters
{
[Key]
[Display(Name = "DealID")]
public int DealID { get; set; }
[Display(Name = "DealItemID")]
public int DealItemID { get; set; }
[Required(ErrorMessage = "Please enter The name of the supplier")]
[Display(Name = "Supplier")]
public string DealSupplier { get; set; }
[Required(ErrorMessage = "Please enter the price of the supplier")]
[Display(Name = "Price")]
public Decimal DealPrice { get; set; }
[Required(ErrorMessage = "Please enter the link to the suppliers page")]
[Display(Name = "Link")]
public string DealLink { get; set; }
[Required(ErrorMessage = "Please enter if the supplier accents visa")]
[Display(Name = "Visa")]
public string DealVisa { get; set; }
[Required(ErrorMessage = "Please enter if the supplier accents clarna")]
[Display(Name = "Clara")]
public string DealClarna { get; set; }
[Display(Name = "DealUserID")]
public string DealUserID { get; set; }
}
}
@page
@model demo.Pages.GetItemModel
@{
ViewData["Title"] = "GetItem";
}
-
Go Back
@Html.DisplayFor(model => model.ItemsMasters.ItemName)
-
View deals for this item |
Edit This Item |
Create a deal for this item |
Delete This Item
-
Description
-
@Model.ItemsMasters.ItemDesc
-
Colour
-
@Html.DisplayFor(model => model.ItemsMasters.ItemColour)
-
Catergory
-
@Html.DisplayFor(model => model.ItemsMasters.ItemCatergory)
-
Brand
-
@Html.DisplayFor(model => model.ItemsMasters.ItemBrand)
Supplier
Price
Visa
Clarna
Link
@foreach (var item in Model.DealsMasters)
{
@Html.DisplayFor(modelItem => item.DealSupplier)
£@Html.DisplayFor(modelItem => item.DealPrice)
@Html.DisplayFor(modelItem => item.DealVisa)
@Html.DisplayFor(modelItem => item.DealClarna)
Edit This Deal |
Delete This Deal
}
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.RazorPages;
using Microsoft.EntityFrameworkCore;
using demo.Data;
using Microsoft.AspNetCore.Identity;
using Microsoft.AspNetCore.Identity.UI.Services;
using System.Security.Claims;
namespace demo.Pages
{
public class GetItemModel : PageModel
{
private readonly demo.Data.ApplicationDbContext _context;
private int? id;
public GetItemModel(demo.Data.ApplicationDbContext context)
{
_context = context;
}
public ItemsMasters ItemsMasters { get; set; }
public DealsMasters DealsMasters { get; set; }
public async Task<IActionResult> OnGetAsync(int? id)
{
if (id == null)
{
return NotFound();
}
ItemsMasters = await _context.ItemsMasters.FirstOrDefaultAsync(m => m.ItemID == id);
this.id = id;
DealsMasters = await _context.DealsMasters.FirstOrDefaultAsync(m => m.DealItemID == this.id);
if (ItemsMasters == null)
{
return NotFound();
}
if (DealsMasters == null)
{
return NotFound();
}
return Page();
}
}
}
ANSWER
Answered 2022-Apr-11 at 08:05
I'm trying to have the GetItem page show the deals table but only the entries where DealItemID matches the ID of the item on the page
It is because you use FirstOrDefaultAsync(m => m.DealItemID == this.id), which returns only the first matching deal rather than all of them. Change the DealsMasters property to a list and filter with Where instead:
public IList<DealsMasters> DealsMasters { get; set; }
DealsMasters = await _context.DealsMasters
    .Where(m => m.DealItemID == id)
    .ToListAsync();
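The underlying fix is independent of EF Core: a FirstOrDefault-style call yields a single row, while the page needs every deal whose foreign key matches the item. A minimal Python sketch of that difference, using hypothetical in-memory rows rather than the asker's database:

```python
# Stand-in for the DealsMasters table; DealItemID is the foreign key to the item.
deals = [
    {"DealID": 1, "DealItemID": 7, "DealSupplier": "Acme"},
    {"DealID": 2, "DealItemID": 7, "DealSupplier": "Globex"},
    {"DealID": 3, "DealItemID": 9, "DealSupplier": "Initech"},
]

item_id = 7

# FirstOrDefaultAsync(...) behaves like taking only the first match (or None):
first_match = next((d for d in deals if d["DealItemID"] == item_id), None)

# Where(...).ToListAsync() behaves like keeping every match:
all_matches = [d for d in deals if d["DealItemID"] == item_id]

print(first_match["DealSupplier"])               # -> Acme
print([d["DealSupplier"] for d in all_matches])  # -> ['Acme', 'Globex']
```

Only the second form can populate the deals table on the page: the first silently drops every deal after the first one for that item.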
QUESTION
So I'm making an API call with fetch, and I'm then trying to iterate over the response in order to display it as a list. My code so far is:
import React, { useState, useEffect } from "react";
import "./App.css";
function App() {
const [isLoading, setIsLoading] = useState(true);
const [trendingMovies, setTrendingMovies] = useState();
useEffect(() => {
fetch(
"https://api.themoviedb.org/3/trending/all/day?api_key=***"
)
.then((res) => res.json())
.then((results) => console.log(results))
.then((data) => setTrendingMovies(data))
.catch((error) => console.log(error))
.then(setIsLoading(false));
}, [trendingMovies]);
function Loading() {
return <p>Loading...</p>;
}
function DisplayTrendingMovies() {
return (
<>
Trending:
{console.log(trendingMovies)}
{trendingMovies &&
trendingMovies.map((movie) => (
<li>{movie.results.original_title}</li>
))}
</>
);
}
return <>{isLoading ? Loading() : DisplayTrendingMovies()}</>;
}
export default App;
And in fact, the JSON loads; a sample response is:
{"page": 1,
"results": [
{
"genre_ids": [28, 12, 878],
"original_language": "en",
"original_title": "Spider-Man: No Way Home",
"poster_path": "/1g0dhYtq4irTY1GPXvft6k4YLjm.jpg",
"video": false,
"vote_average": 8.2,
"vote_count": 9831,
"overview": "Peter Parker is unmasked and no longer able to separate his normal life from the high-stakes of being a super-hero. When he asks for help from Doctor Strange the stakes become even more dangerous, forcing him to discover what it truly means to be Spider-Man.",
"release_date": "2021-12-15",
"title": "Spider-Man: No Way Home",
"id": 634649,
"adult": false,
"backdrop_path": "/iQFcwSGbZXMkeyKrxbPnwnRo5fl.jpg",
"popularity": 9675.798,
"media_type": "movie"
},
{
"adult": false,
"backdrop_path": "/qswImgf57wBf2i8Fv6K2O2BdMoe.jpg",
"genre_ids": [53, 10749, 18],
"id": 619979,
"original_language": "en",
"original_title": "Deep Water",
"overview": "Vic and Melinda Van Allen are a couple in the small town of Little Wesley. Their loveless marriage is held together only by a precarious arrangement whereby, in order to avoid the messiness of divorce, Melinda is allowed to take any number of lovers as long as she does not desert her family.",
"poster_path": "/6yRMyWwjuhKg6IU66uiZIGhaSc8.jpg",
"release_date": "2022-03-18",
"title": "Deep Water",
"video": false,
"vote_average": 6.0,
"vote_count": 1,
"popularity": 111.471,
"media_type": "movie"
},
{
"genre_ids": [28, 53],
"original_language": "sv",
"original_title": "Svart krabba",
"poster_path": "/mcIYHZYwUbvhvUt8Lb5nENJ7AlX.jpg",
"id": 760868,
"vote_average": 0.0,
"overview": "To end an apocalyptic war and save her daughter, a reluctant soldier embarks on a desperate mission to cross a frozen sea carrying a top-secret cargo.",
"release_date": "2022-03-18",
"vote_count": 0,
"video": false,
"adult": false,
"backdrop_path": "/w8pSD1jVt2OzBCr8QQTTjk0mBHt.jpg",
"title": "Black Crab",
"popularity": 41.066,
"media_type": "movie"
},
{
"genre_ids": [16, 10751, 35, 14],
"original_language": "en",
"original_title": "Turning Red",
"poster_path": "/qsdjk9oAKSQMWs0Vt5Pyfh6O4GZ.jpg",
"video": false,
"vote_average": 7.4,
"vote_count": 565,
"overview": "Thirteen-year-old Mei is experiencing the awkwardness of being a teenager with a twist – when she gets too excited, she transforms into a giant red panda.",
"release_date": "2022-03-10",
"title": "Turning Red",
"id": 508947,
"adult": false,
"backdrop_path": "/iPhDToxFzREctUA0ZQiYZamXsMy.jpg",
"popularity": 7663.359,
"media_type": "movie"
},
{
"video": false,
"vote_average": 6.0,
"overview": "At an elite New England university built on the site of a Salem-era gallows hill, three black women strive to find their place. Navigating politics and privilege, they encounter increasingly terrifying manifestations of the school's haunted past… and present.",
"release_date": "2022-03-18",
"vote_count": 1,
"adult": false,
"backdrop_path": "/bW0zTTdvc9zafc6hDEG1pMDuMCW.jpg",
"title": "Master",
"genre_ids": [27, 53, 18, 9648],
"id": 680829,
"original_language": "en",
"original_title": "Master",
"poster_path": "/gxbwRHsQ2v6DQv28ttp7pIx7Utj.jpg",
"popularity": 37.859,
"media_type": "movie"
},
{
"id": 850018,
"poster_path": "/r3eGUCijDCNqwcP1Ri8AZyTbPzI.jpg",
"video": false,
"vote_average": 0.0,
"overview": "A man breaks into a tech billionaire's empty vacation home, but things go sideways when the arrogant mogul and his wife arrive for a last-minute getaway.",
"release_date": "2022-03-11",
"vote_count": 0,
"adult": false,
"backdrop_path": "/cc7Nmid20lcfNh1eucWdLKidi7s.jpg",
"title": "Windfall",
"genre_ids": [80, 18, 53],
"original_title": "Windfall",
"original_language": "en",
"popularity": 12.777,
"media_type": "movie"
},
{
"id": 691683,
"video": false,
"title": "Cheaper by the Dozen",
"overview": "This remake of the beloved classic follows the raucous exploits of a blended family of 12, the Bakers, as they navigate a hectic home life while simultaneously managing their family business.",
"release_date": "2022-03-18",
"vote_count": 1,
"adult": false,
"backdrop_path": "/4YOpxmGyQZFELwLB7JqcpYKnmKw.jpg",
"vote_average": 10.0,
"genre_ids": [35, 10751, 18],
"poster_path": "/qNRsouZh5zmhaE3n4QpLDXzy1gQ.jpg",
"original_language": "en",
"original_title": "Cheaper by the Dozen",
"popularity": 43.263,
"media_type": "movie"
},
{
"title": "The Adam Project",
"overview": "After accidentally crash-landing in 2022, time-traveling fighter pilot Adam Reed teams up with his 12-year-old self on a mission to save the future.",
"release_date": "2022-03-11",
"adult": false,
"backdrop_path": "/ewUqXnwiRLhgmGhuksOdLgh49Ch.jpg",
"genre_ids": [878, 12, 35],
"vote_count": 705,
"original_language": "en",
"original_title": "The Adam Project",
"poster_path": "/wFjboE0aFZNbVOF05fzrka9Fqyx.jpg",
"video": false,
"id": 696806,
"vote_average": 7.0,
"popularity": 2137.891,
"media_type": "movie"
},
{
"adult": false,
"backdrop_path": "/zQG1FYDqoWo2hYhE5GVZ1yrWSfh.jpg",
"genre_ids": [10751, 18],
"id": 921655,
"original_language": "en",
"original_title": "Rescued by Ruby",
"overview": "Chasing his dream to join an elite K-9 unit, a state trooper partners with a fellow underdog: clever but naughty shelter pup Ruby. Based on a true story.",
"poster_path": "/tPlJEodEn0SSV4avo8KSawtlTlN.jpg",
"release_date": "2022-03-17",
"title": "Rescued by Ruby",
"video": false,
"vote_average": 7.4,
"vote_count": 6,
"popularity": 47.237,
"media_type": "movie"
},
{
"overview": "In his second year of fighting crime, Batman uncovers corruption in Gotham City that connects to his own family while facing a serial killer known as the Riddler.",
"release_date": "2022-03-01",
"adult": false,
"backdrop_path": "/5P8SmMzSNYikXpxil6BYzJ16611.jpg",
"id": 414906,
"genre_ids": [80, 9648, 53],
"original_language": "en",
"original_title": "The Batman",
"poster_path": "/74xTEgt7R36Fpooo50r9T25onhq.jpg",
"vote_count": 1981,
"video": false,
"vote_average": 8.0,
"title": "The Batman",
"popularity": 2675.235,
"media_type": "movie"
},
{
"release_date": "2022-03-17",
"adult": false,
"backdrop_path": "/trtFAmf4IcndxSh5tIfLwxPyW67.jpg",
"genre_ids": [28, 53],
"vote_count": 4,
"original_language": "en",
"original_title": "Panama",
"poster_path": "/82I3tDsGDTMy7lHar84Gz0jUuyW.jpg",
"title": "Panama",
"video": false,
"vote_average": 6.0,
"id": 628878,
"overview": "An ex-marine is hired by a defense contractor to travel to Panama to complete an arms deal. In the process he becomes involved with the U.S. invasion of Panama, and learns an important lesson about the true nature of political power.",
"popularity": 40.113,
"media_type": "movie"
},
{
"backdrop_path": "/fUjATGfykF0EU57DhMDkIXqQlc5.jpg",
"first_air_date": "2022-03-17",
"genre_ids": [18, 10765],
"id": 113566,
"name": "DMZ",
"origin_country": ["US"],
"original_language": "en",
"original_name": "DMZ",
"overview": "In the near future after a bitter civil war leaves Manhattan a demilitarized zone (DMZ), destroyed and isolated from the rest of the world, fierce medic Alma Ortega sets out on a harrowing journey to find the son she lost in the evacuation of New York City at the onset of the conflict. Standing in her way are gangs, militias, demagogues and warlords, including Parco Delgado, the popular — and deadly — leader of one of the most powerful gangs in the DMZ.",
"poster_path": "/wnug9DhsenurS5dWCypjZSRFnH6.jpg",
"vote_average": 10.0,
"vote_count": 3,
"popularity": 66.357,
"media_type": "tv"
},
{
"backdrop_path": "/x8KTttToqfHC92JiD2Cs8WWkBG7.jpg",
"genre_ids": [18],
"original_language": "en",
"poster_path": "/rFlYeo84b5YtzNkN0IonN6ZCPic.jpg",
"first_air_date": "2022-03-17",
"vote_average": 0.0,
"original_name": "WeCrashed",
"origin_country": ["US"],
"vote_count": 0,
"overview": "Inspired by actual events — and the love story at the center of it all. WeWork grew from a single coworking space into a global brand worth $47 billion in under a decade. Then, in less than a year, its valuation dropped $40 billion. What happened?",
"id": 117821,
"name": "WeCrashed",
"popularity": 61.183,
"media_type": "tv"
},
{
"vote_average": 7.1,
"title": "Nightmare Alley",
"overview": "An ambitious carnival man with a talent for manipulating people with a few well-chosen words hooks up with a female psychiatrist who is even more dangerous than he is.",
"release_date": "2021-12-02",
"id": 597208,
"adult": false,
"backdrop_path": "/g0YNGpmlXsgHfhGnJz3c5uyzZ1B.jpg",
"genre_ids": [80, 18, 53],
"original_language": "en",
"original_title": "Nightmare Alley",
"poster_path": "/vfn1feL0V9HNSXuLLpaxAW8O6LO.jpg",
"vote_count": 1055,
"video": false,
"popularity": 1099.529,
"media_type": "movie"
},
{
"genre_ids": [18, 35],
"original_language": "en",
"original_title": "Dog",
"poster_path": "/zHQy4h36WwuCetKS7C3wcT1hkgA.jpg",
"id": 626735,
"vote_average": 7.6,
"overview": "An army ranger and his dog embark on a road trip along the Pacific Coast Highway to attend a friend's funeral.",
"release_date": "2022-02-17",
"vote_count": 110,
"video": false,
"adult": false,
"backdrop_path": "/8T7SS4jU7TmtqrUr45JgZASCTUP.jpg",
"title": "Dog",
"popularity": 200.167,
"media_type": "movie"
},
{
"adult": false,
"backdrop_path": "/zOVxbbt8BrrjWBJ0eO21S2adUvo.jpg",
"genre_ids": [35],
"id": 776328,
"original_language": "it",
"original_title": "Marilyn ha gli occhi neri",
"overview": "Clara and Diego, under the guidance of the psychiatrist of a day rehabilitation center for disturbed people who attend, decide to transform the treatment center into a restaurant, involving all their other companions.",
"poster_path": "/thHmfbg56EDMCTmjEuz6Xo5M8hV.jpg",
"release_date": "2021-10-14",
"title": "Marilyn's Eyes",
"video": false,
"vote_average": 6.9,
"vote_count": 136,
"popularity": 29.106,
"media_type": "movie"
},
{
"id": 93544,
"overview": "Two seasoned drug dealers return to the gritty street of London, but their pursuit of money and power is threatened by a young and ruthless hustler.",
"name": "Top Boy",
"vote_count": 14,
"vote_average": 8.9,
"backdrop_path": "/4CqfqazO2EBhN469XfIp7RSJZ7h.jpg",
"original_language": "en",
"genre_ids": [80, 18],
"first_air_date": "2019-09-13",
"original_name": "Top Boy",
"origin_country": [],
"poster_path": "/mGZpOEaLZTRzWeQMq5SZM5BbDZg.jpg",
"popularity": 66.767,
"media_type": "tv"
},
{
"backdrop_path": "/3vOh84wLx47L1vhNTUuf1Z0zv2i.jpg",
"genre_ids": [10765],
"original_language": "pl",
"poster_path": "/bBzkreVliFn18zFBjysanZnC3Yq.jpg",
"first_air_date": "2022-03-18",
"vote_average": 0.0,
"original_name": "Krakowskie potwory",
"origin_country": ["PL"],
"vote_count": 0,
"overview": "A young woman haunted by her past joins a mysterious professor and his group of gifted students who investigate paranormal activity — and fight demons.",
"id": 158396,
"name": "Cracow Monsters",
"popularity": 24.704,
"media_type": "tv"
},
{
"backdrop_path": "/ifUfE79O1raUwbaQRIB7XnFz5ZC.jpg",
"genre_ids": [27, 9648, 53],
"original_language": "en",
"original_title": "Scream",
"poster_path": "/kZNHR1upJKF3eTzdgl5V8s8a4C3.jpg",
"video": false,
"vote_average": 6.8,
"vote_count": 937,
"overview": "Twenty-five years after a streak of brutal murders shocked the quiet town of Woodsboro, a new killer has donned the Ghostface mask and begins targeting a group of teenagers to resurrect secrets from the town’s deadly past.",
"release_date": "2022-01-12",
"id": 646385,
"title": "Scream",
"adult": false,
"popularity": 1191.305,
"media_type": "movie"
},
{
"original_language": "en",
"original_title": "Loveland",
"poster_path": "/zVxFQG0rAFITjIygEMHFRLtR6JI.jpg",
"video": false,
"id": 659924,
"release_date": "2022-03-17",
"vote_count": 0,
"adult": false,
"backdrop_path": "/2QerJOQmmWcINcM1S1sNkDxjXkj.jpg",
"genre_ids": [878, 10749, 53],
"vote_average": 0.0,
"overview": "In an uncharted future, two hardened souls meet and confront each other with the things they have done and what they have become.",
"title": "Loveland",
"popularity": 32.544,
"media_type": "movie"
}
],
"total_pages": 1000,
"total_results": 20000
}
But I never get the iterated ul elements. I think this is somehow related to how I'm running my useEffect, because in the logs I get undefined before the JSON call completes, then the correct JSON (although it doesn't iterate/display as ul elements), and then another undefined.
Any pointers?
ANSWER
Answered 2022-Mar-24 at 19:52

I think the problem is with the way the Fetch API's promise chain is handled: .then((results) => console.log(results)) returns undefined, so the following .then receives data as undefined. Please try like below and let me know if it works!
import React, { useState, useEffect } from "react";
function App() {
const [isLoading, setIsLoading] = useState(true);
const [trendingMovies, setTrendingMovies] = useState();
useEffect(() => {
fetch(
"https://api.themoviedb.org/3/trending/all/day?api_key=***"
)
.then((res) => res.json())
.then((data) => setTrendingMovies(data.results))
.catch((error) => console.log(error))
.then(() => setIsLoading(false));
}, []);
function Loading() {
  return <p>Loading...</p>;
}
function DisplayTrendingMovies() {
  return (
    <>
      Trending:
      {console.log(trendingMovies)}
      <ul>
        {trendingMovies &&
          trendingMovies.map((movie) => (
            <li key={movie.id}>{movie.original_title}</li>
          ))}
      </ul>
    </>
  );
}
return <>{isLoading ? Loading() : DisplayTrendingMovies()}</>;
}
export default App;
QUESTION
For example, I have a dataframe df like this:
| Name | color | id | weight |
|------- |-------- |---- |-------- |
| john | blue | 67 | 70 |
| clara | yellow | - | 67 |
| diana | red | 89 | 56 |
Here the numeric columns like "id" and "weight" should contain only numeric values, but the second value of "id" is a '-'.
If I do df.dtypes, it returns:
| name   | object |
| color  | object |
| id     | object |
| weight | float  |
**How can I traverse the dataframe column-wise, check whether each column's dtype is object, and, if it is, determine whether it became an object because of a stray value like the '-' in "id"? If so, I want to raise a flag.**
ANSWER
Answered 2022-Mar-21 at 17:00

Zip the column names and dtypes together to get (name, dtype) tuples:
for col_name, col_type in zip(df.columns, df.dtypes):
if col_type == "object":
# do whatever here
pass
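To go one step further and raise a flag only for columns that look numeric apart from stray placeholders like '-', `pd.to_numeric` with `errors="coerce"` can be used inside the accepted loop. This is a sketch extending that answer; the DataFrame below just mirrors the question's example:

```python
import pandas as pd

df = pd.DataFrame({
    "Name": ["john", "clara", "diana"],
    "color": ["blue", "yellow", "red"],
    "id": ["67", "-", "89"],
    "weight": [70.0, 67.0, 56.0],
})

flagged = []
for col_name, col_type in zip(df.columns, df.dtypes):
    if col_type == "object":
        # Coerce to numeric: genuine numbers survive, junk like '-' becomes NaN.
        coerced = pd.to_numeric(df[col_name], errors="coerce")
        # Flag columns that are partly numeric but contain some bad values;
        # purely textual columns (Name, color) coerce entirely to NaN and are skipped.
        if coerced.notna().any() and coerced.isna().any():
            flagged.append(col_name)

print(flagged)  # ['id']
```

Non-object columns such as "weight" never enter the check, so only "id" is reported.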
QUESTION
In the second rule I would like to select from the vcf file containing bob, clara and tim, only the first genotype of dictionary (i.e. bob) in roder to get as output in the second rule bob.dn.vcf
. Is this possible in snakemake
?
d = {"FAM1": ["bob.bam", "clara.bam", "tim.bam"]}
FAMILIES = list(d)
rule all:
input:
expand some outputs
wildcard_constraints:
family = "|".join(FAMILIES)
rule somerulename:
input:
lambda w: d[w.family]
output:
vcf="{family}/{family}.vcf"
shell:
"""
some python command line which produces a single vcf file with bob, clara and tim
"""
rule somerulename:
input:
invcf="{family}/{family}.vcf"
params:
ref="someref.fasta"
output:
out="{family}/{bob}.dn.vcf"
shell:
"""
gatk --java-options "-Xms2G -Xmx2g -XX:ParallelGCThreads=2" SelectVariants -R {params.ref} -V {input.invcf} -O {output.out}
"""
ANSWER
Answered 2022-Feb-03 at 13:32

There are at least three options:
- explicitly specify output:
rule somerulename:
output:
out="FAM1/bob.dn.vcf"
- impose constraints on wildcard values:
rule somerulename:
output:
out="{family}/{bob}.dn.vcf"
wildcard_constraints:
family="FAM1",
bob="bob",
- control what is produced by specifying appropriate inputs to rule all:
rule all:
input: "FAM1/bob.dn.vcf", "FAM2/alice.dn.vcf"
QUESTION
I know there is a few questions on SO regarding the conversion of JSON file to a pandas df but nothing is working. Specifically, the JSON requests the current days information. I'm trying to return the tabular structure that corresponds with Data
but I'm only getting the first dict
object.
I'll list the current attempts and the resulting outputs below.
import requests
import pandas as pd
import json
get_session_url = "https://qships.tmr.qld.gov.au/webx/"
get_data_url = "https://qships.tmr.qld.gov.au/webx/services/wxdata.svc/GetDataX"
get_data_query = {
"token": None,
"reportCode": "MSQ-WEB-0001",
"dataSource": None,
"filterName": "Today",
"parameters": [{
"__type": "ParameterValueDTO:#WebX.Core.DTO",
"sName": "DOMAIN_ID",
"iValueType": 0,
"aoValues": [{"Value": -1}],
}],
"metaVersion": 0,
}
sess = requests.session()
sess.get(get_session_url).raise_for_status()
my_dict = sess.post(get_data_url, json = get_data_query).json()
print(my_dict)
Output:
{'d': {'__type': 'DataSetDTO:#WebX.Core.DTO', 'BuildVersion': '7.0.0.12590', 'ReportCode': 'MSQ-WEB-0001', 'Tables': [{'__type': 'DataTableDTO:#WebX.Core.DTO', 'BuildVersion': '7.0.0.12590', 'AsOfDate': '14:36 on Jan 19', 'Data': [[132378, 334489, 'EXT', 'NANA Z', 'BULK CARRIER', 229.2, 'LBH Australia Pty Ltd (Mackay)', '/Date(1642600800000+1000)/', '/Date(1642600800000+1000)/', 'SEA for HPS', 'Anch for HPS & DBCT', 'PLAN', 'Keelung (Chilung)', 'Kwangyang', None, 633086, 705], [132112, 333984, 'DEP', 'KRITI WARRIOR', 'BULK CARRIER', 234.98, 'Wilhelmsen Ships Service (Gladstone)', '/Date(1642600800000+1000)/', '/Date(1642608900000+1000)/', 'Fishermans Landing 1', 'SEA', 'CONF', 'Amrun', 'Amrun', '2201', 632395, 725], [132232, 334208, 'EXT', 'BLUE GRASS MARINER', 'TANKER', 183.06, 'Gulf Agency Company (Mackay)', '/Date(1642600860000+1000)/', '/Date(1642600860000+1000)/', 'SEA M', 'Anch for MKY', 'PLAN', 'Gladstone', 'Singapore', None, 633566, 705], [132654, 335076, 'EXT', 'SERIFOS WARRIOR', 'BULK CARRIER', 234.98, 'Wilhelmsen Ships Service (Gladstone)', '/Date(1642606200000+1000)/', '/Date(1642609800000+1000)/', 'SEA', 'Fairway Buoy Anchorage', 'PLAN', 'Amrun', 'Amrun', '2201', 632055, 705], [132030, 333847, 'ARR', 'MH GREEN', 'CONTAINER SHIP', 199.98, 'Inchcape Shipping Services (Queensland)', '/Date(1642610700000+1000)/', '/Date(1642623300000+1000)/', 'SEA', 'Fisherman Island No 8', 'SCHD', 'Yantian', 'Botany Bay', '11S/11N', 633005, 710], [131681, 333193, 'ARR', 'KM NAGOYA', 'BULK CARRIER', 234.98, 'Gulf Agency Company (Gladstone)', '/Date(1642611600000+1000)/', '/Date(1642618800000+1000)/', 'Fairway Buoy Anchorage', 'Clinton Coal 2', 'CONF', 'Fangcheng', 'Singapore', None, 633504, 725], [132781, 335341, 'ARR', 'MORNING CLARA', 'VEHICLES CARRIER', 199.9, 'Wilhelmsen Ships Service (Brisbane)', '/Date(1642611600000+1000)/', '/Date(1642626000000+1000)/', 'Drift Point Cartwright', 'Fisherman Island No 1', 'SCHD', 'Tianjin', 'Port Kembla', '2251', 633093, 710], 
[131971, 333736, 'DEP', 'MAPLE FORTITUDE', 'BULK CARRIER', 179.9, 'Inchcape Shipping Services (Queensland)', '/Date(1642615200000+1000)/', '/Date(1642621500000+1000)/', 'Townsville 09', 'SEA', 'SCHD', 'Lanshan', 'Auckland', '2101', 633738, 710], [131629, 333076, 'DEP', 'JP CORAL', 'BULK CARRIER', 228.0, 'Sturrock Grindrod Maritime (Gladstone)', '/Date(1642617000000+1000)/', '/Date(1642625100000+1000)/', 'Clinton Coal 2', 'SEA', 'CONF', 'Matsushima - Nagasaki', 'Matsuura - Nagasaki', '146', 631305, 725], [130504, 331071, 'ARR', 'KENNADI', 'BULK CARRIER', 199.9, 'LBH Australia Pty Ltd (Gladstone)', '/Date(1642617000000+1000)/', '/Date(1642626000000+1000)/', 'East Anchorage 9', 'Clinton Coal 4', 'CONF', 'Kwangyang', 'Kendari - Sulawesi', '37', 633759, 725], [131497, 332926, 'ARR', 'STAR VIRGINIA', 'BULK CARRIER', 229.0, 'Inchcape Shipping Services (Queensland)', '/Date(1642617900000+1000)/', '/Date(1642633200000+1000)/', 'Point Cartwright Anchorage', 'Fisherman Island Coal Berth', 'SCHD', 'Kitakyushu', 'Fukuyama - Hiroshima', '2', 632115, 710], [132459, 334657, 'ARR', 'NORD ANNAPOLIS', 'BULK CARRIER', 179.9, 'Monson Agencies Australia (Gladstone)', '/Date(1642617900000+1000)/', '/Date(1642625100000+1000)/', 'East Anchorage 11', 'Auckland Point 2', 'CONF', 'Portland', 'Chittagong', '26', 633752, 725], [132563, 334863, 'DEP', 'POSITIVE LEADER', 'VEHICLES CARRIER', 180.0, 'Monson Agencies Australia (Brisbane)', '/Date(1642622400000+1000)/', '/Date(1642635000000+1000)/', 'Fisherman Island No 1', 'SEA', 'SCHD', 'Townsville', 'Port Kembla', '090', 632525, 710], [132221, 334613, 'ARR', 'DANCEWOOD SW', 'BULK CARRIER', 170.7, 'Inchcape Shipping Services (Queensland)', '/Date(1642622400000+1000)/', '/Date(1642640400000+1000)/', 'Point Cartwright Anchorage', 'Pinkenba No 1', 'SCHD', 'Guam', 'Shibushi', '202201', 632332, 710], [132357, 334450, 'EXT', 'DOUBLE FANTASY', 'BULK CARRIER', 234.98, 'Monson Agencies Australia (Townsville & Abbot Point)', '/Date(1642622400000+1000)/', 
'/Date(1642622400000+1000)/', 'SEA', 'Abbot Point Anchorage', 'SCHD', 'Chiba', None, None, 631611, 710], [132431, 334598, 'DEP', 'INDUS PROSPERITY', 'BULK CARRIER', 229.2, 'Monson Agencies Australia (Townsville & Abbot Point)', '/Date(1642624200000+1000)/', '/Date(1642624200000+1000)/', 'Abott Point 2', 'SEA', 'SCHD', 'Chiba', 'Dung Quat', None, 627891, 710], [132465, 334672, 'DEP', 'KOTA LUMAYAN', 'CONTAINER SHIP', 260.502, 'Gulf Agency Company (Brisbane)', '/Date(1642626000000+1000)/', '/Date(1642639500000+1000)/', 'Fisherman Island No. 9', 'SEA', 'PLAN', 'Singapore', 'Sydney', '0147', 632026, 705], [132356, 334446, 'ARR', 'TRITON', 'BULK CARRIER', 225.0, 'Sturrock Grindrod Maritime (Mackay)', '/Date(1642626000000+1000)/', '/Date(1642632000000+1000)/', 'North Anchorage 22', 'HPS Berth 2', 'SCHD', 'Gunsan (ex Kunsan)', 'Singapore', '012022', 633638, 710], [132430, 334595, 'ARR', 'GOLDEN YOSA', 'TANKER', 144.03, 'Sturrock Grindrod Maritime (Brisbane)', '/Date(1642626000000+1000)/', '/Date(1642644000000+1000)/', 'SEA', 'Viva Energy', 'SCHD', 'Geelong', 'Townsville', '74(C1)', 628015, 710], [132631, 335048, 'DEP', 'MONDIAL SUN', 'BULK CARRIER', 229.0, 'Ben Line Agencies', '/Date(1642626000000+1000)/', '/Date(1642629600000+1000)/', 'Abbot Point 1', 'SEA', 'SCHD', 'Bahudopi', 'India', '018', 633700, 710], [132451, 334640, 'EXT', 'GOLDEN HACHI', 'TANKER', 126.8, 'Sturrock Grindrod Maritime (Brisbane)', '/Date(1642626000000+1000)/', '/Date(1642626000000+1000)/', 'SEA', 'Point Cartwright Anchorage', 'PLAN', 'Singapore', 'Botany Bay', '10', 632483, 705], [132442, 334622, 'DEP', 'FOREVER SW', 'BULK CARRIER', 189.99, 'Gulf Agency Company (Brisbane)', '/Date(1642626000000+1000)/', '/Date(1642643100000+1000)/', 'Fisherman Island Coal Berth', 'SEA', 'SCHD', 'Toledo/Cebu', 'Kushiro', '2A', 569051, 710], [132572, 334905, 'ARR', 'GREEK FRIENDSHIP', 'BULK CARRIER', 228.9, 'LBH Australia Pty Ltd (Mackay)', '/Date(1642627800000+1000)/', '/Date(1642627800000+1000)/', 'Abbot Point 
Anchorage 11', 'Abott Point 2', 'SCHD', 'Tianjin', 'Singapore', None, 633660, 710], [132262, 334259, 'DEP', 'ASTREA', 'BULK CARRIER', 228.99, 'Wave Shipping Pty Ltd', '/Date(1642627800000+1000)/', '/Date(1642627860000+1000)/', 'HPS Berth 1', 'SEA Paddock Departure', 'PLAN', 'Lianyungang', 'Singapore', '1', 633595, 705], [132510, 334762, 'DEP', 'BRILLIANT ADVANCE', 'BULK CARRIER', 228.99, 'Wilhelmsen Ships Service (Weipa)', '/Date(1642629600000+1000)/', '/Date(1642633200000+1000)/', 'Chith Export Facility', 'SEA', 'CONF', 'Laizhou', 'Gladstone', None, 631808, 725], [132170, 334112, 'ARR', 'LOWLANDS CRIMSON', 'BULK CARRIER', 234.96, 'Wilhelmsen Ships Service (Weipa)', '/Date(1642629600000+1000)/', '/Date(1642636800000+1000)/', 'Anchorage ^D', 'Chith Export Facility', 'CONF', 'Gladstone', 'China', None, 630787, 725], [132433, 334601, 'DEP', 'PT NORFOLK', 'GENERAL CARGO BARGE', 70.15, 'Pacific Tug (Aust) PTY LTD', '/Date(1642631400000+1000)/', '/Date(1642635000000+1000)/', 'Marina', 'Bundaberg Anchorage', 'CONF', None, None, None, 624749, 725], [132428, 334591, 'REM', 'PT KYTHIRA', 'TUG', 26.0, 'Pacific Tug (Aust) PTY LTD', '/Date(1642631400000+1000)/', '/Date(1642635000000+1000)/', 'Marina', 'Bundaberg Anchorage', 'CONF', None, 'Brisbane', None, 570086, 725], [131637, 333097, 'ARR', 'BALZANI', 'TANKER', 228.418, 'Monson Agencies Australia (Gladstone)', '/Date(1642632300000+1000)/', '/Date(1642642200000+1000)/', 'North Anchorage 7', 'Fishermans Landing 2', 'CONF', 'Yeosu (ex Yosu)', 'Port Kembla', '32106', 632359, 725], [132699, 335167, 'EXT', 'FEDERAL IMABARI', 'BULK CARRIER', 199.98, 'Monson Agencies Australia (Brisbane)', '/Date(1642633200000+1000)/', '/Date(1642633200000+1000)/', 'Skardon River Anchorage', 'SEA', 'CONF', None, None, None, 624678, 725], [132451, 335328, 'ARR', 'GOLDEN HACHI', 'TANKER', 126.8, 'Sturrock Grindrod Maritime (Brisbane)', '/Date(1642635000000+1000)/', '/Date(1642651200000+1000)/', 'Point Cartwright Anchorage', 'Ampol Lytton Products', 
'PLAN', 'Singapore', 'Botany Bay', '10', 632483, 705], [131897, 333604, 'DEP', 'PROTEUS', 'TANKER', 183.06, 'Gulf Agency Company (Mackay)', '/Date(1642635000000+1000)/', '/Date(1642635060000+1000)/', 'Mackay Berth 1', 'SEA MKY', 'SCHD', 'Gladstone', 'Townsville', None, 633592, 710], [132059, 333886, 'ARR', 'RTM WAKMATHA', 'BULK CARRIER', 236.0, 'Wilhelmsen Ships Service (Gladstone)', '/Date(1642635000000+1000)/', '/Date(1642644900000+1000)/', 'Fairway Buoy Anchorage', 'Fishermans Landing 1', 'CONF', 'Gove', 'Amrun', None, 633057, 725], [132024, 333833, 'ARR', 'MARIA PRINCESS', 'TANKER', 228.59, 'Gulf Agency Company (Brisbane)', '/Date(1642635000000+1000)/', '/Date(1642654800000+1000)/', 'Point Cartwright Anchorage', 'Fishermans Island Tanker Terminal', 'SCHD', 'Seria Brunei', None, None, 633606, 710], [132504, 334740, 'EXT', 'MAIRAKI', 'BULK CARRIER', 291.9, 'LBH Australia Pty Ltd (Gladstone)', '/Date(1642636800000+1000)/', '/Date(1642636800000+1000)/', 'SEA', 'Drift Gladstone', 'PLAN', 'Tianjin', None, '43', 633705, 705], [132029, 333846, 'DEP', 'MANTA NILGUN', 'GENERAL CARGO', 179.99, 'Monson Agencies Australia (Gladstone)', '/Date(1642637700000+1000)/', '/Date(1642644000000+1000)/', 'South Trees East', 'SEA', 'CONF', 'Port Moresby', 'Nakhodka', '202201', 632946, 725], [132001, 333781, 'ARR', 'NSU KEYSTONE', 'BULK CARRIER', 299.94, 'Inchcape Shipping Services (Queensland)', '/Date(1642638600000+1000)/', '/Date(1642644000000+1000)/', 'North Anchorage 19', 'DBCT Berth 1', 'SCHD', 'Yeosu (ex Yosu)', 'Kimitsu', '57', 633532, 710], [131382, 332650, 'EXT', 'AQUADIVA', 'BULK CARRIER', 292.0, 'Gulf Agency Company (Gladstone)', '/Date(1642639500000+1000)/', '/Date(1642643100000+1000)/', 'SEA', 'Fairway Buoy Anchorage', 'PLAN', 'Bayuquan', 'Abbot Point', None, 633453, 705], [132417, 334562, 'DEP', 'KMARIN KENAI', 'BULK CARRIER', 229.0, 'Monson Agencies Australia (Mackay)', '/Date(1642640400000+1000)/', '/Date(1642644000000+1000)/', 'DBCT Berth 1', 'SEA Paddock Departure', 
'SCHD', 'Yeosu (ex Yosu)', 'Sepetiba', None, 633645, 710], [132708, 335184, 'ARR', 'MSC ELA', 'CONTAINER SHIP', 294.06, 'Mediterranean Shipping Company', '/Date(1642641300000+1000)/', '/Date(1642654800000+1000)/', 'SEA', 'Fisherman Island No. 9', 'SCHD', 'Sydney', 'Shanghai', 'SE151R', 633718, 710], [132611, 335017, 'DEP', 'SSB 1803', 'BARGE CARRIER', 52.7, 'Sea Swift Pty Ltd', '/Date(1642644000000+1000)/', '/Date(1642647600000+1000)/', 'Hammond Island', 'SEA', 'CONF', None, None, None, 586569, 725], [132429, 334592, 'EXT', 'LEONORA VICTORY', 'TANKER', 183.2, 'Monson Agencies Australia (Gladstone)', '/Date(1642644000000+1000)/', '/Date(1642644000000+1000)/', 'SEA', 'Fairway Buoy Anchorage', 'PLAN', 'Balboa', 'Unknown Port', '32', 633737, 705], [132601, 335000, 'DEP', 'NORMAN RIVER', 'TUG', 24.45, 'Sea Swift Pty Ltd', '/Date(1642644000000+1000)/', '/Date(1642647600000+1000)/', 'Hammond Island', 'SEA', 'CONF', 'Cape Flattery', 'Cairns', None, 633691, 725], [132079, 335477, 'EXT', 'DEE4 LARCH', 'TANKER', 183.06, 'Inchcape Shipping Services (Queensland)', '/Date(1642646040000+1000)/', '/Date(1642646040000+1000)/', 'East Anchorage 6', 'SEA', 'PLAN', 'Etajima', 'Unknown Port', '1', 632184, 705], [132470, 334682, 'ARR', 'CASTILLO DE SANTISTEBAN', 'LIQUEFIED GAS TANKER', 299.9, 'Gulf Agency Company (Gladstone)', '/Date(1642646700000+1000)/', '/Date(1642658400000+1000)/', 'LNG Anchorage 2', 'Queensland Curtis LNG', 'CONF', 'Taiwan', 'Ningbo', None, 632133, 725], [132434, 334603, 'REM', 'PT NORFOLK', 'GENERAL CARGO BARGE', 70.15, 'Pacific Tug (Aust) PTY LTD', '/Date(1642647600000+1000)/', '/Date(1642662000000+1000)/', 'Shark Spit Anchorage', 'Queensport', 'SCHD', None, None, None, 624749, 710], [132538, 334816, 'EXT', 'WINCANTON', 'LIQUEFIED GAS TANKER', 119.95, 'Inchcape Shipping Services (Queensland)', '/Date(1642647600000+1000)/', '/Date(1642647600000+1000)/', 'SEA', 'Fairway Buoy Anchorage', 'PLAN', 'Newcastle', 'Newcastle', '264', 632386, 705], [132432, 334600, 'REM', 
'PT KYTHIRA', 'TUG', 26.0, 'Pacific Tug (Aust) PTY LTD', '/Date(1642647600000+1000)/', '/Date(1642662000000+1000)/', 'Shark Spit Anchorage', 'Queensport', 'SCHD', 'Bundaberg', None, None, 570086, 710], [131727, 333300, 'ARR', 'SEMIRAMIS', 'BULK CARRIER', 228.9, 'Sturrock Grindrod Maritime (Mackay)', '/Date(1642647660000+1000)/', '/Date(1642653060000+1000)/', 'South Anchorage 09', 'HPS Berth 1', 'PLAN', 'Jingtang (Tangshan)', 'Singapore', 'TP0264', 633516, 705], [132130, 335179, 'ARR', 'CHORUS', 'BULK CARRIER', 228.99, 'Monson Agencies Australia (Mackay)', '/Date(1642649400000+1000)/', None, 'North Anchorage 06', 'DBCT Berth 3', 'SCHD', 'Busan', 'Kakogawa', '80', 633558, 710], [132439, 334614, 'EXT', 'SM TIGER', 'BULK CARRIER', 292.0, 'LBH Australia Pty Ltd (Mackay)', '/Date(1642649400000+1000)/', '/Date(1642649400000+1000)/', 'SEA for HPS', 'Anch for HPS & DBCT', 'PLAN', 'Kwangyang', 'Pohang', '50', 633640, 705], [132795, 335381, 'ARR', 'ALBATROSS BAY', 'LANDING CRAFT', 64.0, 'Sea Swift Pty Ltd', '/Date(1642651200000+1000)/', '/Date(1642654800000+1000)/', 'SEA', 'Horn Island', 'CONF', 'Cairns', 'Seisia', 'AB 2203', 633274, 725], [132433, 335356, 'ARR', 'PT NORFOLK', 'GENERAL CARGO BARGE', 70.15, 'Pacific Tug (Aust) PTY LTD', '/Date(1642651200000+1000)/', '/Date(1642654800000+1000)/', 'Bundaberg Anchorage', 'Marina', 'CONF', None, None, None, 624749, 725], [132428, 335355, 'REM', 'PT KYTHIRA', 'TUG', 26.0, 'Pacific Tug (Aust) PTY LTD', '/Date(1642651200000+1000)/', '/Date(1642654800000+1000)/', 'Bundaberg Anchorage', 'Marina', 'CONF', None, 'Brisbane', None, 570086, 725], [132295, 334319, 'DEP', 'HOEGH KOBE', 'VEHICLES CARRIER', 199.1, 'Seaway Agencies Pty Ltd', '/Date(1642654800000+1000)/', '/Date(1642669200000+1000)/', 'Wagners', 'SEA', 'SCHD', 'Auckland', 'Port Kembla', '68', 631289, 710], [132291, 334306, 'ARR', 'LOCH MAREE', 'BULK CARRIER', 176.83, 'Wave Shipping Pty Ltd', '/Date(1642655700000+1000)/', '/Date(1642672800000+1000)/', 'Point Cartwright Anchorage', 
'Fisherman Island General Purpose Berth', 'SCHD', 'Fujairah', 'Lae', '9', 633744, 710], [132232, 334209, 'ARR', 'BLUE GRASS MARINER', 'TANKER', 183.06, 'Gulf Agency Company (Mackay)', '/Date(1642657200000+1000)/', '/Date(1642657260000+1000)/', 'Anch for MKY', 'Mackay Berth 1', 'SCHD', 'Gladstone', 'Singapore', None, 633566, 710], [132538, 334817, 'ARR', 'WINCANTON', 'LIQUEFIED GAS TANKER', 119.95, 'Inchcape Shipping Services (Queensland)', '/Date(1642657500000+1000)/', '/Date(1642668300000+1000)/', 'Fairway Buoy Anchorage', 'Fishermans Landing 5', 'CONF', 'Newcastle', 'Newcastle', '264', 632386, 725], [132473, 334686, 'EXT', 'GREAT CHEER', 'BULK CARRIER', 229.2, 'LBH Australia Pty Ltd (Mackay)', '/Date(1642658400000+1000)/', '/Date(1642658400000+1000)/', 'SEA for HPS', 'Anch for HPS & DBCT', 'PLAN', 'Kakogawa', 'Indonesia', '2201VC', 633677, 705], [132513, 334770, 'DEP', 'KAI YANG STAR', 'BULK CARRIER', 234.98, 'Wilhelmsen Ships Service (Weipa)', '/Date(1642659300000+1000)/', '/Date(1642666500000+1000)/', 'Lorim West', 'SEA', 'CONF', 'Dongjiakou', 'Qingdao', None, 633694, 725], [132575, 335351, 'EXT', 'IPSEA COLOSSUS', 'BULK CARRIER', 197.0, 'Monson Agencies Australia (Townsville & Abbot Point)', '/Date(1642662000000+1000)/', '/Date(1642662000000+1000)/', 'SEA', 'Abbot Point Anchorage', 'SCHD', 'Chittagong', None, None, 625240, 710], [132285, 334295, 'DEP', 'FW EXCURSIONIST', 'BULK CARRIER', 179.9, 'Wave Shipping Pty Ltd', '/Date(1642662000000+1000)/', '/Date(1642679100000+1000)/', 'Fisherman Island General Purpose Berth', 'SEA', 'SCHD', 'Busan', 'New Plymouth', '24', 633667, 710], [132364, 334463, 'EXT', 'JUPITER', 'BULK CARRIER', 225.0, 'LBH Australia Pty Ltd (Mackay)', '/Date(1642663800000+1000)/', '/Date(1642663800000+1000)/', 'SEA for DBCT', 'Anch for HPS & DBCT', 'PLAN', 'Rizhao', 'Singapore', '17', 633643, 705], [132781, 335342, 'DEP', 'MORNING CLARA', 'VEHICLES CARRIER', 199.9, 'Wilhelmsen Ships Service (Brisbane)', '/Date(1642665600000+1000)/', 
'/Date(1642680000000+1000)/', 'Fisherman Island No 1', 'SEA', 'SCHD', 'Tianjin', 'Port Kembla', '2251', 633093, 710], [131704, 333251, 'DEP', 'TANGGUH JAYA', 'LIQUEFIED GAS TANKER', 285.1, 'Wilhelmsen Ships Service (Gladstone)', '/Date(1642666500000+1000)/', '/Date(1642676400000+1000)/', 'Santos GLNG', 'SEA', 'CONF', 'Mexico', 'Incheon', None, 633458, 725], [130826, 331647, 'ARR', 'DL DAHLIA', 'BULK CARRIER', 229.0, 'Monson Agencies Australia (Gladstone)', '/Date(1642668300000+1000)/', '/Date(1642677300000+1000)/', 'Fairway Buoy Anchorage', 'Clinton Coal 1', 'CONF', 'Yeongheung', 'Tanjung Bin', '2712', 633296, 725], [132582, 334934, 'DEP', 'CORAL GEOGRAPHER', 'PASSENGER', 94.5, 'Coral Expeditions', '/Date(1642669200000+1000)/', '/Date(1642672800000+1000)/', 'C123', 'SEA', 'CONF', 'Cairns', 'Cairns', None, 633369, 725], [130422, 330911, 'DEP', 'NSU QUEST', 'BULK CARRIER', 299.94, 'Inchcape Shipping Services (Queensland)', '/Date(1642673700000+1000)/', '/Date(1642682700000+1000)/', 'Clinton Coal 3', 'SEA', 'CONF', 'Hay Point', 'Japan', '45', 632982, 725], [132795, 335383, 'REM', 'ALBATROSS BAY', 'LANDING CRAFT', 64.0, 'Sea Swift Pty Ltd', '/Date(1642674600000+1000)/', '/Date(1642676400000+1000)/', 'Horn Island', 'Main Jetty', 'CONF', 'Cairns', 'Seisia', 'AB 2203', 633274, 725], [132759, 335288, 'EXT', 'CMB PAUILLAC', 'BULK CARRIER', 235.0, 'Wilhelmsen Ships Service (Gladstone)', '/Date(1642675500000+1000)/', '/Date(1642679100000+1000)/', 'SEA', 'Fairway Buoy Anchorage', 'PLAN', 'Gove', 'Weipa', None, 633160, 705], [132430, 334596, 'DEP', 'GOLDEN YOSA', 'TANKER', 144.03, 'Sturrock Grindrod Maritime (Brisbane)', '/Date(1642676400000+1000)/', '/Date(1642692600000+1000)/', 'Viva Energy', 'SEA', 'SCHD', 'Geelong', 'Townsville', '74(C1)', 628015, 710], [132456, 334647, 'EXT', 'MISSY ENTERPRISE', 'GENERAL CARGO', 181.16, 'Wave Shipping Pty Ltd', '/Date(1642676400000+1000)/', '/Date(1642676460000+1000)/', 'SEA', 'Bundaberg Anchorage', 'PLAN', 'Singapore', 'Japan', '2', 
631532, 705], [132389, 335619, 'EXT', 'GLOVIS CHORUS', 'VEHICLES CARRIER', 199.99, 'Gulf Agency Company (Brisbane)', '/Date(1642680000000+1000)/', '/Date(1642680000000+1000)/', 'SEA', 'Point Cartwright Anchorage', 'PLAN', 'Port Kembla', 'Pyeongtaek ', '77A', 630944, 705], [132505, 334744, 'EXT', 'NSU CHALLENGER', 'BULK CARRIER', 299.95, 'Gulf Agency Company (Gladstone)', '/Date(1642680000000+1000)/', '/Date(1642683600000+1000)/', 'SEA', 'Fairway Buoy Anchorage', 'PLAN', 'Nagoya', 'Oita', None, 633706, 705], [132727, 335219, 'EXT', 'RTM DIAS', 'BULK CARRIER', 234.87, 'Wilhelmsen Ships Service (Weipa)', '/Date(1642680900000+1000)/', '/Date(1642680900000+1000)/', 'SEA', 'Weipa Anchorage', 'PLAN', 'Gladstone', 'China', None, 633623, 705], [132859, 335500, 'ARR', 'FOURCROY', 'LANDING CRAFT', 49.8, 'Sea Swift Pty Ltd', '/Date(1642685400000+1000)/', '/Date(1642686900000+1000)/', 'SEA', 'Horn Island Barge Ramp', 'CONF', 'Saibai Island', 'Weipa', None, 633180, 725]], 'IsCustomMetaData': False, 'MetaData': {'__type': 'DataTableMetaDTO:#WebX.Core.DTO', 'Columns': [{'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'HAlignment': 'haright', 'Name': 'VOYAGE_ID', 'SortIndex': -1, 'SortOrder': '', 'Sortable': True, 'Template': '', 'Title': 'Voyage Id', 'Visible': False, 'Width': '50px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'HAlignment': 'haright', 'Name': 'ID', 'SortIndex': -1, 'SortOrder': '', 'Sortable': True, 'Template': '', 'Title': 'Id', 'Visible': False, 'Width': '20px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'Name': 'JOB_TYPE_CODE', 'SortIndex': -1, 'SortOrder': '', 'Sortable': True, 'Template': '', 'Title': 'Job Type', 'Visible': True, 'Width': '71px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '"link": {"title":"Ship Info", "type":"dashboard", "target":"_popup", "code":"standard.vesselinfo", "params":[{"name":"VID","value":"[%VESSEL_ID%]"}]}', 'Name': 'VESSEL_NAME', 'SortIndex': -1, 'SortOrder': '', 'Sortable': 
True, 'Template': '', 'Title': 'Ship', 'Visible': True, 'Width': '94px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'Name': 'MSQ_SHIP_TYPE', 'SortIndex': -1, 'SortOrder': '', 'Sortable': True, 'Template': '', 'Title': 'Ship Type', 'Visible': True, 'Width': '115px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'HAlignment': 'haright', 'Name': 'LOA', 'SortIndex': -1, 'SortOrder': '', 'Sortable': True, 'Template': '', 'Title': 'LOA', 'Visible': True, 'Width': '95px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'Name': 'AGENCY_NAME', 'SortIndex': -1, 'SortOrder': '', 'Sortable': True, 'Template': '', 'Title': 'Agency', 'Visible': True, 'Width': '287px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'Name': 'START_TIME', 'SortIndex': -1, 'SortOrder': '', 'Sortable': True, 'Template': '', 'Title': 'Start Time', 'Visible': True, 'Width': '91px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'Name': 'END_TIME', 'SortIndex': -1, 'SortOrder': '', 'Sortable': True, 'Template': '', 'Title': 'End Time', 'Visible': True, 'Width': '91px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'Name': 'FROM_LOCATION_NAME', 'SortIndex': -1, 'SortOrder': '', 'Sortable': True, 'Template': '', 'Title': 'From Location', 'Visible': True, 'Width': '139px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'Name': 'TO_LOCATION_NAME', 'SortIndex': -1, 'SortOrder': '', 'Sortable': True, 'Template': '', 'Title': 'To Location', 'Visible': True, 'Width': '139px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'Name': 'STATUS_TYPE_CODE', 'SortIndex': -1, 'SortOrder': '', 'Sortable': True, 'Template': '', 'Title': 'Status', 'Visible': True, 'Width': '83px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'Name': 'LASTPORT_NAME', 'SortIndex': -1, 'SortOrder': '', 'Sortable': True, 'Template': '', 'Title': 'Last Port', 'Visible': True, 'Width': '114px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 
'Format': '', 'Name': 'NEXTPORT_NAME', 'SortIndex': -1, 'SortOrder': '', 'Sortable': True, 'Template': '', 'Title': 'Next Port', 'Visible': True, 'Width': '114px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'Name': 'VOYAGE_NUMBER', 'SortIndex': -1, 'SortOrder': '', 'Sortable': True, 'Template': '', 'Title': 'Voyage #', 'Visible': True, 'Width': '45px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'HAlignment': 'haright', 'Name': 'VESSEL_ID', 'SortIndex': -1, 'SortOrder': '', 'Template': '', 'Title': 'Vessel Id', 'Visible': False, 'Width': '64px'}, {'__type': 'ColumnMetaDTO:#WebX.Core.DTO', 'Format': '', 'HAlignment': 'haright', 'Name': 'STATUS_TYPE', 'SortIndex': -1, 'SortOrder': '', 'Template': '', 'Title': 'Status Type', 'Visible': False, 'Width': '64px'}], 'Script': 'var data = this.getData();\nvar $row = this.get$Row();\nvar $jobtype = this.get$Cell(\'JOB_TYPE\');\n\nvar $startTime = this.get$Cell(\'START_TIME\');\nvar $endTime = this.get$Cell(\'END_TIME\');\n\nif (data.JOB_TYPE == "Arrival")\n{\n $jobtype.css(\'color\', \'green\');\n}\nif (data.JOB_TYPE == "Departure")\n{\n $jobtype.css(\'color\', \'blue\');\n}\nif (data.JOB_TYPE == "Shift")\n{\n $jobtype.css(\'color\', \'#8B7500\');\n}\nif (data.JOB_TYPE == "External")\n{\n $jobtype.css(\'color\', \'grey\');\n}\n\nif (data.STATUS_TYPE >= 735 &&data.STATUS_TYPE < 750 )\n{\n $startTime.css(\'font-weight\', \'bold\');\n $endTime.css(\'font-weight\', \'bold\');\n $startTime.css(\'font-style\', \'italic\');\n $endTime.css(\'font-style\', \'italic\');\n}\n\n', 'TemplateRow': '', 'TemplateTable': '', 'Version': 0}, 'Name': 'DATA'}]}}
I've tried using pd.json_normalize both with and without record_path. Specifying record_path raises an error saying the column name can't be found.
print(pd.json_normalize(my_dict))
Output:
d.__type d.BuildVersion d.ReportCode d.Tables
0 DataSetDTO:#WebX.Core.DTO 7.0.0.12590 MSQ-WEB-0001 [{'__type': 'DataTableDTO:#WebX.Core.DTO', 'Bu...
print(pd.json_normalize(my_dict, record_path=['Data']))
Error:
File "/Users/kevin_o'connell/opt/anaconda3/lib/python3.8/site-packages/pandas/io/json/_normalize.py", line 243, in _pull_field
result = result[spec]
KeyError: 'Data'
I've also tried the following, but as the printout shows, I'm not getting back the tabular information associated with Data.
print(pd.concat({k: pd.DataFrame(v).T for k, v in my_dict.items()}, axis=0))
0
d __type DataSetDTO:#WebX.Core.DTO
BuildVersion 7.0.0.12590
ReportCode MSQ-WEB-0001
Tables {'__type': 'DataTableDTO:#WebX.Core.DTO', 'Bui...
This returns the desired info as an object, not a pandas DataFrame:
df = pd.json_normalize(my_dict['d'], 'Tables')
df = pd.DataFrame(df['Data'].T)
Out:
Data
0 [[132393, 334520, EXT, CESI BEIHAI, LIQUEFIED ...
Listing meta as a parameter:
df = pd.json_normalize(my_dict['d'], record_path = 'Tables', meta = ['Data'], errors = 'ignore')
raise ValueError(
ValueError: Conflicting metadata name Data, need distinguishing prefix
ANSWER
Answered 2022-Jan-20 at 03:23
record_path is the path to the record, so you should specify the full path:
df = pd.json_normalize(data, record_path=['d', 'Tables', 'Data'])
If you want to do it without record_path: the value of Data is a list of lists, so you can use pd.DataFrame directly:
df = pd.DataFrame(data['d']['Tables'][0]['Data'])
print(df)
0 1 2 3 4 5 ... 11 12 13 14 15 16
0 132378 334489 EXT NANA Z BULK CARRIER 229.20 ... PLAN Keelung (Chilung) Kwangyang None 633086 705
1 132112 333984 DEP KRITI WARRIOR BULK CARRIER 234.98 ... CONF Amrun Amrun 2201 632395 725
2 132232 334208 EXT BLUE GRASS MARINER TANKER 183.06 ... PLAN Gladstone Singapore None 633566 705
3 132654 335076 EXT SERIFOS WARRIOR BULK CARRIER 234.98 ... PLAN Amrun Amrun 2201 632055 705
4 132030 333847 ARR MH GREEN CONTAINER SHIP 199.98 ... SCHD Yantian Botany Bay 11S/11N 633005 710
.. ... ... ... ... ... ... ... ... ... ... ... ... ...
71 132456 334647 EXT MISSY ENTERPRISE GENERAL CARGO 181.16 ... PLAN Singapore Japan 2 631532 705
72 132389 335619 EXT GLOVIS CHORUS VEHICLES CARRIER 199.99 ... PLAN Port Kembla Pyeongtaek 77A 630944 705
73 132505 334744 EXT NSU CHALLENGER BULK CARRIER 299.95 ... PLAN Nagoya Oita None 633706 705
74 132727 335219 EXT RTM DIAS BULK CARRIER 234.87 ... PLAN Gladstone China None 633623 705
75 132859 335500 ARR FOURCROY LANDING CRAFT 49.80 ... CONF Saibai Island Weipa None 633180 725
[76 rows x 17 columns]
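Since the integer column labels 0..16 aren't very readable, and each entry in the payload's MetaData['Columns'] carries a 'Name' field, a follow-up step can attach real column names. A minimal sketch, with the payload shrunk to a stand-in dict of the same shape (the real one has 17 columns and 76 rows):

```python
import pandas as pd

# Stand-in for the real payload: same nesting, fewer columns and rows.
data = {
    "d": {
        "Tables": [{
            "Data": [
                [132378, "EXT", "NANA Z"],
                [132112, "DEP", "KRITI WARRIOR"],
            ],
            "MetaData": {
                "Columns": [
                    {"Name": "VOYAGE_ID", "Title": "Voyage Id"},
                    {"Name": "JOB_TYPE_CODE", "Title": "Job Type"},
                    {"Name": "VESSEL_NAME", "Title": "Ship"},
                ]
            },
        }]
    }
}

table = data["d"]["Tables"][0]
# Build the frame from Data and label it with the Name of each column.
df = pd.DataFrame(table["Data"],
                  columns=[c["Name"] for c in table["MetaData"]["Columns"]])
print(df)
```

The same column-list comprehension works against the full payload, since the structure under 'Tables' is identical.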
QUESTION
I am making a name card and the codes are below:
I want to align the last two spans, "STUDIO ICONIC" and "info@studioiconic.net", on the same line; however, the email span always appears below it, like the picture shows. Is it possible to adjust something to achieve that, or is there something I'm doing wrong? If possible I don't want to use grid or flexbox... Thanks.
ODEN QUEST
Creative Director
T +1 408 456 7890
M +1 408 456 8956
1234 Main Street
Santa Clara, CA 95126
STUDIO ICONIC
info@studioiconic.net
ANSWER
Answered 2022-Jan-19 at 19:27
The best way to do layout is flex. As for inline styles, it is highly recommended not to use them; use classes for your styling instead.
.wrapper{
display:flex;
}
.image-container{
margin-left: 50px;
margin-top: 50px;
width: 150px;
height: 150px;
border:2px solid gainsboro;
}
.image{
width:100%;
}
.right-side{
display:flex;
flex-direction:column;
margin-left:100px;
}
.info{
display:flex;
}
.s1{
padding-left: 65px;
}
.s2{
margin-left:120px;
}
ODEN QUEST
Creative Director
T +1 408 456 7890
M +1 408 456 8956
1234 Main Street
Santa Clara, CA 95126
STUDIO ICONIC
info@studioiconic.net
If you only want to patch the code you wrote, you should change the display and margin-left of your span (a span is display: inline by default):
ODEN QUEST
Creative Director
T +1 408 456 7890
M +1 408 456 8956
1234 Main Street
Santa Clara, CA 95126
STUDIO ICONIC
info@studioiconic.net
QUESTION
Let's say we have a simple dataframe that we want to pivot, like:
d = {"Equipment": ["Gym", "Gym", "Class", "Class", "Office", "Office"],
"Details": ["Barbell", "Ball", "Desk", "Desk", "Chair", "Lamp"],
"Recipient": ["Ben", "Ben", "Ben", "Clara", "Clara", "John"]
}
df = pd.DataFrame.from_dict(d)
ANSWER
Answered 2022-Jan-18 at 13:44
As @Neither mentioned, the answer is given by the pandas.crosstab function.
The following command produces the pivoted table:
df = pd.crosstab(index=df["Recipient"], columns=[df["Equipment"],df["Details"]])
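Putting the answer together with the sample frame from the question, the crosstab yields one row per recipient and a two-level column index of (Equipment, Details) counts:

```python
import pandas as pd

d = {"Equipment": ["Gym", "Gym", "Class", "Class", "Office", "Office"],
     "Details": ["Barbell", "Ball", "Desk", "Desk", "Chair", "Lamp"],
     "Recipient": ["Ben", "Ben", "Ben", "Clara", "Clara", "John"]}
df = pd.DataFrame.from_dict(d)

# Count how many of each (Equipment, Details) pair each recipient got.
ct = pd.crosstab(index=df["Recipient"],
                 columns=[df["Equipment"], df["Details"]])
print(ct)
```

Individual counts can then be read off with a tuple key, e.g. ct.loc["Ben", ("Gym", "Ball")].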
QUESTION
I have a list of objects of the class Person. Person has 2 attributes: String name, int points
My original list contains the following objects:
person1 = Person("Samuel", 5)
person2 = Person("Maria", 3)
person3 = Person("Samuel", 3)
person4 = Person("Maria", 6)
person5 = Person("Clara", 1)
I want to process my original list and get the following one:
person1 = Person("Samuel", 8)
person2 = Person("Maria", 9)
person3 = Person("Clara", 1)
So only objects with a unique name are allowed. If two or more objects have the same name, their points have to be summed.
Any idea how I can proceed?
ANSWER
Answered 2022-Jan-06 at 21:48
public static List<Person> mergeByName(List<Person> persons) {
return persons.stream().collect(Collectors.groupingBy(Person::getName,
Collectors.summingInt(Person::getPoints)))
.entrySet().stream()
.map(entry -> new Person(entry.getKey(), entry.getValue()))
.collect(Collectors.toList());
}
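For comparison only, the same merge-by-name can be sketched outside Java with a dictionary accumulator; the Person class below is a hypothetical stand-in for the one in the question:

```python
class Person:
    def __init__(self, name, points):
        self.name = name
        self.points = points

def merge_by_name(persons):
    # Sum points per name (dicts preserve first-seen order in Python 3.7+),
    # then rebuild one Person per unique name.
    totals = {}
    for p in persons:
        totals[p.name] = totals.get(p.name, 0) + p.points
    return [Person(name, pts) for name, pts in totals.items()]

people = [Person("Samuel", 5), Person("Maria", 3),
          Person("Samuel", 3), Person("Maria", 6), Person("Clara", 1)]
merged = merge_by_name(people)
print([(p.name, p.points) for p in merged])
# [('Samuel', 8), ('Maria', 9), ('Clara', 1)]
```

This mirrors the Java stream pipeline: the dict plays the role of Collectors.groupingBy with summingInt, and the final list comprehension plays the role of the map/collect step.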
QUESTION
I have this code to order a group of teams based on their scores, just like a soccer ranking. The code works fine when implemented like this (btw, NEQS is defined as 18):
int melhor_class(t_equipa *eqA, t_equipa *eqB)
{
if(eqA->pontos > eqB->pontos){
return 1;
} else if(eqA->pontos < eqB->pontos){
return 0;
} else if(eqA->golosM > eqB->golosM){
return 1;
} else if(eqA->golosM < eqB->golosM){
return 0;
} else if(eqA->golosS < eqB->golosS){
return 1;
} else if(eqA->golosS > eqB->golosS){
return 0;
} else {
return 1;
}
}
void ordenar_equipas(t_equipa *e)
{
for(int i = 0; i < NEQS - 1; i++)
{
for(int j = 0; j < NEQS - i - 1; j++)
{
if (melhor_class(&e[j],&e[j+1]) == 0)
{
// swapping part
t_equipa temp = e[j];
e[j] = e[j+1];
e[j+1] = temp;
}
}
}
}
And the output works fine, its sorted out:
P V E D M S
Gil Vicente 33 0 0 0 18 7
Benfica 32 0 0 0 10 10
Sporting 31 0 0 0 10 7
Porto 24 0 0 0 20 8
Arouca 0 0 0 0 0 0
Belenenses 0 0 0 0 0 0
Boavista 0 0 0 0 0 0
Braga 0 0 0 0 0 0
Estoril 0 0 0 0 0 0
Famalicao 0 0 0 0 0 0
Maritimo 0 0 0 0 0 0
Moreirense 0 0 0 0 0 0
Pacos Ferreira 0 0 0 0 0 0
Portimonense 0 0 0 0 0 0
Santa Clara 0 0 0 0 0 0
Tondela 0 0 0 0 0 0
Vitoria 0 0 0 0 0 0
Vizela 0 0 0 0 0 0
But when I put the swapping part of the code in a function, it just doesn't work:
void trocar_equipas(t_equipa *e, int p1, int p2)
{
t_equipa temp = e[p1];
e[p1] = e[p2];
e[p2] = temp;
}
void ordenar_equipas(t_equipa *e)
{
for(int i = 0; i < NEQS - 1; i++)
{
for(int j = 0; j < NEQS - i - 1; j++)
{
if (melhor_class(&e[j],&e[j+1]) == 0)
{
trocar_equipas(&e,j,j+1);
}
}
}
}
Output:
P V E D M S
Arouca 0 0 0 0 0 0
Belenenses 0 0 0 0 0 0
Benfica 32 0 0 0 10 10
Boavista 0 0 0 0 0 0
Braga 0 0 0 0 0 0
Estoril 0 0 0 0 0 0
Famalicao 0 0 0 0 0 0
Gil Vicente 33 0 0 0 18 7
Maritimo 0 0 0 0 0 0
Moreirense 0 0 0 0 0 0
Pacos Ferreira 0 0 0 0 0 0
Porto 24 0 0 0 20 8
Portimonense 0 0 0 0 0 0
Santa Clara 0 0 0 0 0 0
Sporting 31 0 0 0 10 7
Tondela 0 0 0 0 0 0
Vitoria 0 0 0 0 0 0
Vizela 0 0 0 0 0 0
I really need to put that swapping part in another function! I appreciate any kind of help! Thanks
ANSWER
Answered 2021-Dec-30 at 11:54
The function trocar_equipas already receives a t_equipa * as an argument, so pass e directly; writing trocar_equipas(&e, ...) passes a t_equipa ** (a pointer to the pointer), which is the wrong type and swaps the wrong memory:
void trocar_equipas(t_equipa *e, int p1, int p2)
{
t_equipa temp = e[p1];
e[p1] = e[p2];
e[p2] = temp;
}
void ordenar_equipas(t_equipa *e)
{
for(int i = 0; i < NEQS - 1; i++)
{
for(int j = 0; j < NEQS - i - 1; j++)
{
if (melhor_class(&e[j],&e[j+1]) == 0)
{
trocar_equipas(e,j,j+1);
}
}
}
}
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported