yolo | over-the-air installation and testing of mobile applications | iOS library
kandi X-RAY | yolo Summary
- Icon list
- Draw an icon
- Reducer to update state
- Refresh the new state
- Set previous value
- Tick callback
yolo Key Features
yolo Examples and Code Snippets
# last build for the berty app for iOS
curl -su :TOKEN "https://yolo.berty.io/api/build-list?project_id=https://github.com/berty/berty&artifact_kinds=1" | jq '.builds[0]'
{
  "id": "https://buildkite.com/berty/berty-open/builds/535",
  "created_at": "2020-04-29T15:06:33.796Z",
  "state": "Passed",
  "message": "feat: add multipeer connectivity Transport and add it in libp2p\n\nfeat: add multipeer connectivity for mobile devices\n\nfix: fix linux compilation failed\n\nchore: remove old references to BLE\n\nchore: remove xcode project directory\n\nchore: goimports passed\n\nfeat: pass functional logger to the mc transport\n\nchore: improve log message in the mc driver",
  "started_at": "2020-04-29T15:45:14Z",
  "finished_at": "2020-04-29T15:58:41Z",
  "branch": "D4ryl00:feat/multipeer-connectivity-integration",
  "driver": "Buildkite",
  "short_id": "535",
  "has_artifacts": [
    {
      "id": "buildkite_524ced1e072c6bb74e3bf9556854b339",
      "created_at": "2020-04-29T15:06:33.796Z",
      "file_size": "38838861",
      "local_path": "Berty-Yolo-08a8bb0dee9935ab14e62648c6969cd5dfd9f517.ipa",
      "download_url": "https://api.buildkite.com/v2/organizations/berty/pipelines/berty-open/builds/535/jobs/323605e5-72fd-4495-8198-615a68672148/artifacts/16bab990-66ee-4ed5-a9d4-db69704bc0fd/download",
      "mime_type": "application/octet-stream",
      "state": "Finished",
      "kind": "IPA",
      "driver": "Buildkite",
      "has_build_id": "https://buildkite.com/berty/berty-open/builds/535",
      "dl_artifact_signed_url": "/api/artifact-dl/buildkite_524ced1e072c6bb74e3bf9556854b339?sign=REDACTED",
      "plist_signed_url": "%2Fapi%2Fplist-gen%2Fbuildkite_524ced1e072c6bb74e3bf9556854b339.plist%3Fsign%3DREDACTED"
    }
  ],
  "has_commit_id": "08a8bb0dee9935ab14e62648c6969cd5dfd9f517",
  "has_project": {
    "id": "https://github.com/berty/berty",
    "created_at": "2018-07-16T05:21:19Z",
    "updated_at": "2020-04-29T13:13:05Z",
    "driver": "GitHub",
    "name": "berty",
    "description": "Berty is a secure peer-to-peer messaging app that works with or without internet access, cellular data or trust in the network",
    "has_owner": {
      "id": "https://github.com/berty",
      "name": "berty",
      "driver": "GitHub",
      "avatar_url": "https://avatars1.githubusercontent.com/u/22157871?v=4",
      "kind": "Organization"
    },
    "has_owner_id": "https://github.com/berty"
  },
  "has_project_id": "https://github.com/berty/berty",
  "has_mergerequest": {
    "id": "https://github.com/berty/berty/pull/1908",
    "created_at": "2020-04-23T08:19:28Z",
    "updated_at": "2020-04-30T09:54:59Z",
    "title": "WIP feat: add the multipeer connectivity transport",
    "message": "Add the multipeer connectivity transport of berty v1 to the master branch of berty\r\n* [x] add the transport + driver in an internal package\r\n* [ ] switch on/off that transport from the front",
    "driver": "GitHub",
    "branch": "D4ryl00:feat/multipeer-connectivity-integration",
    "state": "Opened",
    "commit_url": "https://github.com/berty/berty/commit/5face40d919f102d9d0f2b19061bae666f4b940a",
    "short_id": "1908",
    "has_project": {
      "id": "https://github.com/berty/berty",
      "created_at": "2018-07-16T05:21:19Z",
      "updated_at": "2020-04-29T13:13:05Z",
      "driver": "GitHub",
      "name": "berty",
      "description": "Berty is a secure peer-to-peer messaging app that works with or without internet access, cellular data or trust in the network",
      "has_owner_id": "https://github.com/berty"
    },
    "has_project_id": "https://github.com/berty/berty",
    "has_author": {
      "id": "https://github.com/D4ryl00",
      "name": "D4ryl00",
      "driver": "GitHub",
      "avatar_url": "https://avatars3.githubusercontent.com/u/13605410?v=4",
      "kind": "User"
    },
    "has_author_id": "https://github.com/D4ryl00",
    "has_commit_id": "08a8bb0dee9935ab14e62648c6969cd5dfd9f517"
  },
  "has_mergerequest_id": "https://github.com/berty/berty/pull/1908"
}
$ yolo -h
USAGE
  server [flags]

SUBCOMMANDS
  server        Start a Yolo Server
  dump-objects
  info

FLAGS
  -v false  increase log verbosity
$ yolo server -h
USAGE
  server

FLAGS
  -auth-salt ...             salt used to generate authentication tokens at the end of the URLs
  -basic-auth-password ...   if set, enables basic authentication
  -bintray-token ...         Bintray API Token
  -bintray-username ...      Bintray username
  -buildkite-token ...       BuildKite API Token
  -circleci-token ...        CircleCI API Token
  -cors-allowed-origins ...  CORS allowed origins (*.domain.tld)
  -db-path :temp:            DB Store path
  -dev-mode false            enable insecure helpers
  -github-token ...          GitHub API Token
  -grpc-bind :9000           gRPC bind address
  -http-bind :8000           HTTP bind address
  -max-builds 100            maximum builds to fetch from external services (pagination)
  -realm Yolo                authentication Realm
  -request-timeout 5s        request timeout
  -shutdown-timeout 6s       server shutdown timeout
  -with-cache false          enable API caching
Trending Discussions on yolo
QUESTION
I have two large-ish data frames I am trying to append...
In df1, I have state codes, county codes, state names (Alabama, Alaska, etc.), county names, and years from 2010:2020.
In df2, I have county names, state abbreviations (AL, AK), and data for the year 2010, which I am trying to merge into df1. The issue is that without specifying the state and simply merging df1 and df2, some of the data I am trying to get into df1 is duplicated because some counties in different states share a name. Hence, I am trying to also join by state to prevent this, but df1 has full state names while df2 has abbreviations.
Is there any way I can convert the state names in df1 to abbreviations, or the state abbreviations in df2 to full names? Please let me know! Thank you for the help.
Edit: dput(df2)
dput(birthdata2)
structure(list(County = c("Ada County", "Adams County", "Adams County",
"Aiken County", "Alachua County", "Alamance County", "Alameda County",
"Albany County", "Alexandria city", "Allegan County", "Allegheny County",
"Allen County", "Allen County", "Anchorage Borough", "Anderson County",
"Androscoggin County", "Anne Arundel County", "Anoka County",
"Arapahoe County", "Arlington County", "Ascension Parish", "Ashtabula County",
"Atlantic County", "Baldwin County", "Baltimore city", "Baltimore County",
"Barnstable County", "Bartow County", "Bay County", "Bay County",
"Beaufort County", "Beaver County", "Bell County", "Benton County",
"Benton County", "Bergen County", "Berkeley County", "Berkeley County",
"Berks County", "Berkshire County", "Bernalillo County", "Berrien County",
"Bexar County", "Bibb County", "Black Hawk County", "Blair County",
"Blount County", "Bonneville County", "Boone County", "Boone County",
"Bossier Parish", "Boulder County", "Brazoria County", "Brazos County",
"Brevard County", "Bristol County", "Bronx County", "Broome County",
"Broward County", "Brown County", "Brunswick County", "Bucks County",
"Buncombe County", "Burlington County", "Butler County", "Butler County",
"Butte County", "Cabarrus County", "Cache County", "Caddo Parish",
"Calcasieu Parish", "Calhoun County", "Calhoun County", "Cambria County",
"Camden County", "Cameron County", "Canadian County", "Canyon County",
"Cape May County", "Carroll County", "Carroll County", "Cass County",
"Catawba County", "Cecil County", "Centre County", "Champaign County",
"Charles County", "Charleston County", "Charlotte County", "Chatham County",
"Chautauqua County", "Cherokee County", "Chesapeake city", "Chester County",
"Chesterfield County", "Chittenden County", "Citrus County",
"Clackamas County", "Clark County", "Clark County", "Clark County",
"Clark County", "Clarke County", "Clay County", "Clay County",
"Clayton County", "Clermont County", "Cleveland County", "Cobb County",
"Cochise County", "Coconino County", "Collier County", "Collin County",
"Columbia County", "Columbiana County", "Comal County", "Comanche County",
"Contra Costa County", "Cook County", "Coweta County", "Cowlitz County",
"Craven County", "Cumberland County", "Cumberland County", "Cumberland County",
"Cumberland County", "Cuyahoga County", "Dakota County", "Dallas County",
"Dane County", "Dauphin County", "Davidson County", "Davidson County",
"Davis County", "DeKalb County", "DeKalb County", "Delaware County",
"Delaware County", "Delaware County", "Denton County", "Denver County",
"Deschutes County", "DeSoto County", "District of Columbia",
"Dona Ana County", "Dorchester County", "Douglas County", "Douglas County",
"Douglas County", "Douglas County", "Douglas County", "DuPage County",
"Durham County", "Dutchess County", "Duval County", "East Baton Rouge Parish",
"Eaton County", "Ector County", "El Dorado County", "El Paso County",
"El Paso County", "Elkhart County", "Ellis County", "Erie County",
"Erie County", "Escambia County", "Essex County", "Essex County",
"Etowah County", "Fairfax County", "Fairfield County", "Fairfield County",
"Faulkner County", "Fayette County", "Fayette County", "Fayette County",
"Florence County", "Fond du Lac County", "Forsyth County", "Forsyth County",
"Fort Bend County", "Franklin County", "Franklin County", "Franklin County",
"Frederick County", "Fresno County", "Fulton County", "Galveston County",
"Gaston County", "Genesee County", "Gloucester County", "Grayson County",
"Greene County", "Greene County", "Greenville County", "Gregg County",
"Guadalupe County", "Guilford County", "Gwinnett County", "Hall County",
"Hamilton County", "Hamilton County", "Hamilton County", "Hampden County",
"Hampshire County", "Hampton city", "Hardin County", "Harford County",
"Harnett County", "Harris County", "Harrison County", "Hartford County",
"Hawaii County", "Hays County", "Henderson County", "Hendricks County",
"Hennepin County", "Henrico County", "Henry County", "Hernando County",
"Hidalgo County", "Hillsborough County", "Hillsborough County",
"Hinds County", "Honolulu County", "Horry County", "Houston County",
"Houston County", "Howard County", "Hudson County", "Humboldt County",
"Hunterdon County", "Imperial County", "Indian River County",
"Ingham County", "Iredell County", "Jackson County", "Jackson County",
"Jackson County", "Jackson County", "Jasper County", "Jefferson County",
"Jefferson County", "Jefferson County", "Jefferson County", "Jefferson County",
"Jefferson County", "Jefferson Parish", "Johnson County", "Johnson County",
"Johnson County", "Johnson County", "Johnston County", "Kalamazoo County",
"Kanawha County", "Kane County", "Kankakee County", "Kaufman County",
"Kendall County", "Kennebec County", "Kenosha County", "Kent County",
"Kent County", "Kent County", "Kenton County", "Kern County",
"King County", "Kings County", "Kings County", "Kitsap County",
"Knox County", "Kootenai County", "La Crosse County", "La Porte County",
"Lackawanna County", "Lafayette Parish", "Lake County", "Lake County",
"Lake County", "Lake County", "Lancaster County", "Lancaster County",
"Lane County", "Larimer County", "LaSalle County", "Lebanon County",
"Lee County", "Lee County", "Lehigh County", "Leon County", "Lexington County",
"Licking County", "Linn County", "Linn County", "Litchfield County",
"Livingston County", "Livingston Parish", "Lorain County", "Los Angeles County",
"Loudoun County", "Lowndes County", "Lubbock County", "Lucas County",
"Luzerne County", "Lycoming County", "Macomb County", "Macon County",
"Madera County", "Madison County", "Madison County", "Madison County",
"Mahoning County", "Manatee County", "Marathon County", "Maricopa County",
"Marin County", "Marion County", "Marion County", "Marion County",
"Martin County", "Maui County", "McHenry County", "McLean County",
"McLennan County", "Mecklenburg County", "Medina County", "Merced County",
"Mercer County", "Mercer County", "Merrimack County", "Mesa County",
"Miami County", "Miami-Dade County", "Middlesex County", "Middlesex County",
"Middlesex County", "Midland County", "Milwaukee County", "Minnehaha County",
"Missoula County", "Mobile County", "Mohave County", "Monmouth County",
"Monroe County", "Monroe County", "Monroe County", "Monroe County",
"Monterey County", "Montgomery County", "Montgomery County",
"Montgomery County", "Montgomery County", "Montgomery County",
"Montgomery County", "Morgan County", "Morris County", "Multnomah County",
"Muscogee County", "Muskegon County", "Napa County", "Nassau County",
"Navajo County", "New Castle County", "New Hanover County", "New Haven County",
"New London County", "New York County", "Newport News city",
"Niagara County", "Norfolk city", "Norfolk County", "Northampton County",
"Nueces County", "Oakland County", "Ocean County", "Okaloosa County",
"Oklahoma County", "Olmsted County", "Oneida County", "Onondaga County",
"Onslow County", "Ontario County", "Orange County", "Orange County",
"Orange County", "Orange County", "Orleans Parish", "Osceola County",
"Oswego County", "Ottawa County", "Ouachita Parish", "Outagamie County",
"Palm Beach County", "Parker County", "Pasco County", "Passaic County",
"Paulding County", "Pennington County", "Penobscot County", "Peoria County",
"Philadelphia County", "Pickens County", "Pierce County", "Pima County",
"Pinal County", "Pinellas County", "Pitt County", "Placer County",
"Plymouth County", "Polk County", "Polk County", "Portage County",
"Porter County", "Portsmouth city", "Potter County", "Prince George's County",
"Prince William County", "Providence County", "Pueblo County",
"Pulaski County", "Queens County", "Racine County", "Ramsey County",
"Randall County", "Randolph County", "Rankin County", "Rapides Parish",
"Rensselaer County", "Richland County", "Richland County", "Richmond city",
"Richmond County", "Richmond County", "Riverside County", "Robeson County",
"Rock County", "Rock Island County", "Rockingham County", "Rockland County",
"Rowan County", "Rutherford County", "Sacramento County", "Saginaw County",
"Saline County", "Salt Lake County", "San Bernardino County",
"San Diego County", "San Francisco County", "San Joaquin County",
"San Juan County", "San Luis Obispo County", "San Mateo County",
"Sandoval County", "Sangamon County", "Santa Barbara County",
"Santa Clara County", "Santa Cruz County", "Santa Fe County",
"Santa Rosa County", "Sarasota County", "Saratoga County", "Sarpy County",
"Schenectady County", "Schuylkill County", "Scott County", "Scott County",
"Sebastian County", "Sedgwick County", "Seminole County", "Shasta County",
"Shawnee County", "Sheboygan County", "Shelby County", "Shelby County",
"Skagit County", "Smith County", "Snohomish County", "Solano County",
"Somerset County", "Sonoma County", "Spartanburg County", "Spokane County",
"Spotsylvania County", "St. Charles County", "St. Clair County",
"St. Clair County", "St. Johns County", "St. Joseph County",
"St. Lawrence County", "St. Louis city", "St. Louis County",
"St. Louis County", "St. Lucie County", "St. Mary's County",
"St. Tammany Parish", "Stafford County", "Stanislaus County",
"Stark County", "Stearns County", "Strafford County", "Suffolk County",
"Suffolk County", "Sullivan County", "Summit County", "Sumner County",
"Sumter County", "Sussex County", "Sussex County", "Tangipahoa Parish",
"Tarrant County", "Taylor County", "Tazewell County", "Terrebonne Parish",
"Thurston County", "Tippecanoe County", "Tolland County", "Tom Green County",
"Tompkins County", "Travis County", "Trumbull County", "Tulare County",
"Tulsa County", "Tuscaloosa County", "Ulster County", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Unidentified Counties",
"Unidentified Counties", "Unidentified Counties", "Union County",
"Union County", "Utah County", "Vanderburgh County", "Ventura County",
"Vigo County", "Virginia Beach city", "Volusia County", "Wake County",
"Walworth County", "Warren County", "Warren County", "Warren County",
"Washington County", "Washington County", "Washington County",
"Washington County", "Washington County", "Washington County",
"Washington County", "Washington County", "Washington County",
"Washoe County", "Washtenaw County", "Waukesha County", "Wayne County",
"Wayne County", "Wayne County", "Webb County", "Weber County",
"Weld County", "Westchester County", "Westmoreland County", "Whatcom County",
"Whitfield County", "Wichita County", "Will County", "Williamson County",
"Williamson County", "Wilson County", "Windham County", "Winnebago County",
"Winnebago County", "Wood County", "Woodbury County", "Worcester County",
"Wright County", "Wyandotte County", "Yakima County", "Yavapai County",
"Yellowstone County", "Yolo County", "York County", "York County",
"York County", "Yuma County"), State = c(" ID", " CO", " PA",
" SC", " FL", " NC", " CA", " NY", " VA", " MI", " PA", " IN",
" OH", " AK", " SC", " ME", " MD", " MN", " CO", " VA", " LA",
" OH", " NJ", " AL", " MD", " MD", " MA", " GA", " FL", " MI",
" SC", " PA", " TX", " AR", " WA", " NJ", " SC", " WV", " PA",
" MA", " NM", " MI", " TX", " GA", " IA", " PA", " TN", " ID",
" KY", " MO", " LA", " CO", " TX", " TX", " FL", " MA", " NY",
" NY", " FL", " WI", " NC", " PA", " NC", " NJ", " OH", " PA",
" CA", " NC", " UT", " LA", " LA", " AL", " MI", " PA", " NJ",
" TX", " OK", " ID", " NJ", " GA", " MD", " ND", " NC", " MD",
" PA", " IL", " MD", " SC", " FL", " GA", " NY", " GA", " VA",
" PA", " VA", " VT", " FL", " OR", " IN", " NV", " OH", " WA",
" GA", " FL", " MO", " GA", " OH", " OK", " GA", " AZ", " AZ",
" FL", " TX", " GA", " OH", " TX", " OK", " CA", " IL", " GA",
" WA", " NC", " ME", " NC", " NJ", " PA", " OH", " MN", " TX",
" WI", " PA", " NC", " TN", " UT", " GA", " IL", " IN", " OH",
" PA", " TX", " CO", " OR", " MS", " DC", " NM", " SC", " CO",
" GA", " KS", " NE", " OR", " IL", " NC", " NY", " FL", " LA",
" MI", " TX", " CA", " CO", " TX", " IN", " TX", " NY", " PA",
" FL", " MA", " NJ", " AL", " VA", " CT", " OH", " AR", " GA",
" KY", " PA", " SC", " WI", " GA", " NC", " TX", " MO", " OH",
" PA", " MD", " CA", " GA", " TX", " NC", " MI", " NJ", " TX",
" MO", " OH", " SC", " TX", " TX", " NC", " GA", " GA", " IN",
" OH", " TN", " MA", " MA", " VA", " KY", " MD", " NC", " TX",
" MS", " CT", " HI", " TX", " NC", " IN", " MN", " VA", " GA",
" FL", " TX", " FL", " NH", " MS", " HI", " SC", " AL", " GA",
" MD", " NJ", " CA", " NJ", " CA", " FL", " MI", " NC", " MI",
" MO", " MS", " OR", " MO", " AL", " CO", " KY", " MO", " NY",
" TX", " LA", " IA", " IN", " KS", " TX", " NC", " MI", " WV",
" IL", " IL", " TX", " IL", " ME", " WI", " DE", " MI", " RI",
" KY", " CA", " WA", " CA", " NY", " WA", " TN", " ID", " WI",
" IN", " PA", " LA", " FL", " IL", " IN", " OH", " NE", " PA",
" OR", " CO", " IL", " PA", " AL", " FL", " PA", " FL", " SC",
" OH", " IA", " OR", " CT", " MI", " LA", " OH", " CA", " VA",
" GA", " TX", " OH", " PA", " PA", " MI", " IL", " CA", " AL",
" IL", " IN", " OH", " FL", " WI", " AZ", " CA", " FL", " IN",
" OR", " FL", " HI", " IL", " IL", " TX", " NC", " OH", " CA",
" NJ", " PA", " NH", " CO", " OH", " FL", " CT", " MA", " NJ",
" TX", " WI", " SD", " MT", " AL", " AZ", " NJ", " IN", " MI",
" NY", " PA", " CA", " AL", " MD", " OH", " PA", " TN", " TX",
" AL", " NJ", " OR", " GA", " MI", " CA", " NY", " AZ", " DE",
" NC", " CT", " CT", " NY", " VA", " NY", " VA", " MA", " PA",
" TX", " MI", " NJ", " FL", " OK", " MN", " NY", " NY", " NC",
" NY", " CA", " FL", " NC", " NY", " LA", " FL", " NY", " MI",
" LA", " WI", " FL", " TX", " FL", " NJ", " GA", " SD", " ME",
" IL", " PA", " SC", " WA", " AZ", " AZ", " FL", " NC", " CA",
" MA", " FL", " IA", " OH", " IN", " VA", " TX", " MD", " VA",
" RI", " CO", " AR", " NY", " WI", " MN", " TX", " NC", " MS",
" LA", " NY", " OH", " SC", " VA", " GA", " NY", " CA", " NC",
" WI", " IL", " NH", " NY", " NC", " TN", " CA", " MI", " AR",
" UT", " CA", " CA", " CA", " CA", " NM", " CA", " CA", " NM",
" IL", " CA", " CA", " CA", " NM", " FL", " FL", " NY", " NE",
" NY", " PA", " IA", " MN", " AR", " KS", " FL", " CA", " KS",
" WI", " AL", " TN", " WA", " TX", " WA", " CA", " NJ", " CA",
" SC", " WA", " VA", " MO", " IL", " MI", " FL", " IN", " NY",
" MO", " MN", " MO", " FL", " MD", " LA", " VA", " CA", " OH",
" MN", " NH", " MA", " NY", " TN", " OH", " TN", " SC", " DE",
" NJ", " LA", " TX", " TX", " IL", " LA", " WA", " IN", " CT",
" TX", " NY", " TX", " OH", " CA", " OK", " AL", " NY", " AK",
" AL", " AR", " AZ", " CA", " CO", " FL", " GA", " HI", " IA",
" ID", " IL", " IN", " KS", " KY", " LA", " MA", " MD", " ME",
" MI", " MN", " MO", " MS", " MT", " NC", " ND", " NE", " NH",
" NJ", " NM", " NV", " NY", " OH", " OK", " OR", " PA", " RI",
" SC", " SD", " TN", " TX", " UT", " VA", " VT", " WA", " WI",
" WV", " WY", " NC", " NJ", " UT", " IN", " CA", " IN", " VA",
" FL", " NC", " WI", " KY", " NJ", " OH", " AR", " MD", " MN",
" OR", " PA", " RI", " TN", " UT", " WI", " NV", " MI", " WI",
" MI", " NC", " OH", " TX", " UT", " CO", " NY", " PA", " WA",
" GA", " TX", " IL", " TN", " TX", " TN", " CT", " IL", " WI",
" OH", " IA", " MA", " MN", " KS", " WA", " AZ", " MT", " CA",
" ME", " PA", " SC", " AZ"), x = c(358, 549, NA, 149, 219, 178,
1367, 257, 194, 76, 1032, 432, 114, 278, 177, 69, 574, 265, 701,
208, NA, 77, 267, 176, 1044, 826, 107, NA, 179, 78, 193, 119,
538, 246, 162, 765, 202, NA, 409, 84, 778, 177, 2441, 321, 123,
101, 91, NA, NA, 173, NA, 228, 390, 192, 372, 472, 2190, 173,
1930, 209, NA, 469, 194, 379, 340, 112, 146, 225, NA, 573, 258,
103, 87, 119, 531, 622, NA, 187, 56, NA, 78, 141, 155, NA, 76,
198, 164, 419, 74, 378, 99, 240, 222, 371, 298, 95, 77, 243,
NA, 2217, 101, 340, 116, 170, 206, 437, 202, 228, 790, 140, 129,
248, 844, NA, 73, NA, 155, 851, 6598, NA, NA, NA, 164, 575, 219,
152, 1565, 319, 3336, 388, 335, 179, 825, 374, 1104, NA, 97,
131, 537, 621, 915, 83, 141, 934, 240, NA, 353, NA, NA, 680,
76, 789, 406, 187, 1171, 732, 72, 249, 115, 857, 1265, 235, 132,
788, 305, 412, 569, 1087, 93, 1060, 776, 87, NA, NA, 319, 137,
220, NA, NA, 477, 701, NA, 1676, 138, 237, 1266, 1467, 338, 248,
523, 253, 109, 264, 137, 501, 167, NA, 562, 863, 221, 256, 1043,
436, 451, 74, 177, NA, 202, NA, 6004, 295, 835, 180, NA, NA,
111, 1167, 352, 221, 110, 1284, 1484, 317, 553, 1181, 311, NA,
185, 259, 842, 76, 70, 182, 89, 257, 148, 156, 795, 159, 150,
109, 998, 486, 972, 179, 130, 368, 551, 114, 100, 463, 169, 167,
245, 182, 518, 99, NA, NA, 70, 155, 173, 654, 102, 199, 1010,
1602, 158, 3530, 189, 449, 105, 65, 123, 177, 289, 230, 644,
618, 147, 302, 517, 215, 257, 73, 97, 117, 488, 333, 267, 274,
157, 178, 79, 118, 108, NA, 231, 9714, 345, NA, 399, 562, 272,
73, 766, 118, 164, 446, 255, 112, 242, 262, 68, 3848, 141, 267,
1403, 279, 77, 144, 263, 155, 255, 1340, 87, 285, 403, 72, 93,
110, NA, 2862, 72, 1389, 853, 167, 1329, 200, NA, 694, 160, 472,
79, 99, 722, 145, 370, 348, 1026, 634, 612, 257, 474, 134, 355,
629, 348, 164, 82, 1137, NA, 631, 176, 833, 190, 1737, 266, 166,
379, 479, 281, 377, 1045, 530, 185, 1095, 115, 186, 429, 304,
76, 2459, 1373, 108, 345, 571, 301, 95, 183, 253, 157, 1238,
NA, 396, 590, NA, NA, 78, 198, 2523, 65, 633, 853, 328, 771,
197, 206, 369, 601, 504, 119, 74, 155, 175, 1243, 497, 602, 174,
537, 2557, 174, 516, 108, 131, 225, 156, 111, 109, 566, 336,
336, 466, 1989, 235, 165, 143, 170, 309, 150, 307, 1419, 202,
NA, 1323, 2224, 2895, 622, 746, 109, 121, 655, NA, 237, 325,
1648, 178, 135, 117, 213, 144, 147, 153, 100, 142, NA, 137, 688,
343, 104, 177, 59, 203, 1526, 64, 238, 536, 320, 279, 307, 390,
387, NA, 326, 320, 133, 106, 294, 70, 597, 118, 1005, 271, NA,
204, NA, 464, 343, 93, 81, 860, 1301, 151, 552, 129, 176, 201,
103, 202, 2341, 146, 92, 173, 149, 135, 68, 113, NA, 1250, 167,
508, 809, 283, 115, 369, 2516, 2223, 401, 599, 849, 1237, 5841,
57, 1503, 911, 1908, 2311, 1294, 3545, 2704, 51, 625, 268, 1538,
1605, 2599, 3450, 744, 3550, 464, 696, 201, 45, 1143, 297, 1472,
2483, 2155, 643, 1334, 61, 1797, 604, 2718, 5530, 868, 3426,
276, 944, 1483, 1696, 673, 202, 570, 752, 184, 686, 71, 485,
389, 1018, NA, NA, 53, 178, 232, 155, 157, 390, 163, 56, 106,
NA, 59, 448, 318, 284, 2572, 121, 100, 398, 328, 267, 965, 251,
103, NA, 150, 643, 125, 458, NA, 68, 321, 142, 75, 106, 653,
NA, 239, 292, 110, 141, 125, 113, 425, 256, 197), year = c(2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010,
2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2010
)), row.names = c(NA, -628L), class = c("tbl_df", "tbl", "data.frame"
))
ANSWER
Answered 2022-Apr-18 at 03:52
Here's one way you could turn state abbreviations into state names using R's built-in state vectors:
a <- c('NJ', 'MA', 'FL')
state.name[sapply(a, \(x) which(x == state.abb))]
[1] "New Jersey" "Massachusetts" "Florida"
Applying this to birthdata2, we might create a column called state_name. But first we need to trim whitespace from the State column:
birthdata2$State <- trimws(birthdata2$State)
birthdata2$state_name <- state.name[sapply(birthdata2$State, \(x) which(x == state.abb)[1])]
County State x year state_name
1 Ada County ID 358 2010 Idaho
2 Adams County CO 549 2010 Colorado
3 Adams County PA NA 2010 Pennsylvania
4 Aiken County SC 149 2010 South Carolina
5 Alachua County FL 219 2010 Florida
6 Alamance County NC 178 2010 North Carolina
7 Alameda County CA 1367 2010 California
8 Albany County NY 257 2010 New York
9 Alexandria city VA 194 2010 Virginia
10 Allegan County MI 76 2010 Michigan
# … with 618 more rows
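Outside R, the same abbreviation-to-name lookup is just a dictionary join. A minimal Python sketch (my illustration, not part of the original answer; the ABBREV_TO_NAME map is truncated to a few states here and would need all 50 plus DC for real data):

```python
# Hypothetical, truncated abbreviation -> full-name map; extend to all
# 50 states plus DC for real data.
ABBREV_TO_NAME = {
    "ID": "Idaho",
    "CO": "Colorado",
    "PA": "Pennsylvania",
    "SC": "South Carolina",
    "FL": "Florida",
}

# A couple of rows mimicking birthdata2, including the stray leading
# space in the State column.
rows = [
    {"County": "Ada County", "State": " ID", "x": 358, "year": 2010},
    {"County": "Adams County", "State": " CO", "x": 549, "year": 2010},
]

for row in rows:
    abbrev = row["State"].strip()                   # trim whitespace first
    row["state_name"] = ABBREV_TO_NAME.get(abbrev)  # None for unknown codes

print([r["state_name"] for r in rows])  # ['Idaho', 'Colorado']
```

As in the R answer, trimming the whitespace before the lookup is what makes the join keys line up.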
QUESTION
Gonna ask a newbie question...
If I have finished training my model, or I am using a pre-trained model like YOLO, and I want to put the model on a robot that has 6 GB of VRAM, do I need to worry about the batch size at all?
I am trying to find out whether models like YOLO will fit on my GPU. Thank you
ANSWER
Answered 2022-Apr-11 at 17:09Generally speaking, batch size can be adjusted at any time without creating a problem. Each element of a batch is independent, but they are fed through the network together for efficiency reasons.
Note that batch size does affect training quality, as the gradients from larger batches will average out and have less variance. But that is irrelevant when doing inference (actually using the model).
You also asked what prevents you from deploying a huge model on a small GPU, and the answer is simply performance. It is entirely possible to load part of a large model onto the GPU, run that part, load the next part, run it, and so on. You would need to balance the batch size against the model-part size, because with a batch size of 1 the continual copying of model parameters will probably make this slower than running the whole model on the CPU.
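As a rough first check (my back-of-the-envelope sketch, not part of the original answer), you can lower-bound the VRAM a model needs from its parameter count and numeric precision; activations and framework overhead, which grow with batch size, come on top:

```python
def weights_vram_gib(num_params: int, bytes_per_param: int = 4) -> float:
    """GiB needed just to hold the weights (FP32 = 4 bytes per parameter).

    This is a lower bound: activations, workspace buffers, and framework
    overhead add to it, and those scale with batch size.
    """
    return num_params * bytes_per_param / 1024**3

# YOLOv3 has on the order of 62 million parameters (approximate figure).
yolo_params = 62_000_000
print(f"FP32 weights: {weights_vram_gib(yolo_params):.2f} GiB")     # ~0.23 GiB
print(f"FP16 weights: {weights_vram_gib(yolo_params, 2):.2f} GiB")  # ~0.12 GiB
```

So the weights of a YOLO-sized detector fit comfortably in 6 GB; in practice it is the activations at your chosen input resolution and batch size that decide whether inference fits, and dropping the batch size to 1 is the usual first lever.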
QUESTION
What does the code mean, and how to invoke it?
function (YOLO)
YOLO + 1
end
Quoted from here.
Thanks
ANSWER
Answered 2022-Mar-05 at 03:43
It's an anonymous function.
The usual way to use them is to either assign it to a variable, which would become the function's name:
julia> y = function (YOLO)
YOLO + 1
end
#43 (generic function with 1 method)
julia> y(4)
5
or pass the function itself directly as an argument to a different function (though for that, the shorter YOLO -> YOLO + 1 or the do ... end syntaxes are usually used).
Another way to invoke it is to just immediately call it:
julia> (function (YOLO)
YOLO + 1
end)(43)
44
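For comparison (my addition, not part of the original answer), Python's lambda supports the same two patterns — binding an anonymous function to a name, and invoking one immediately:

```python
# Bind an anonymous function to a name, like `y = function (YOLO) ... end`.
y = lambda yolo: yolo + 1
print(y(4))  # 5

# Immediately invoke an anonymous function,
# like `(function (YOLO) ... end)(43)`.
print((lambda yolo: yolo + 1)(43))  # 44
```

Unlike Julia's function ... end blocks, a Python lambda is limited to a single expression; multi-statement anonymous functions require a named def.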
QUESTION
I have created a map of multiple different data types: s64, f64, arrays, images, etc. To do so, I used a map of type std::map<…> database;. I want to store it and reload it from the filesystem. But I heard that maps can't be stored in a one-liner. So I tried to store the data part std::unique_ptr<…> test; of one pair first:
friend class boost::serialization::access;
template <class Archive>
void serialize(Archive& ar, unsigned int version) {
ar & test;
}
The program crashes at ar & test, throwing an exception: "unregistered class - derived class not registered or exported". What's the issue? I don't understand it.
Here is the minimal code: (deleted)
As 463035818_is_not_a_number pointed out, my snippet is not working. I recreated it, and got a lot further I think. But as soon as I insert the load-from-file function, it does not compile anymore, saying: error C2280: "std::pair::pair(const std::pair &)": attempting to reference a deleted function
#include <boost/archive/text_iarchive.hpp>
#include <boost/archive/text_oarchive.hpp>
#include <boost/serialization/map.hpp>
#include <boost/serialization/unique_ptr.hpp>
#include <fstream>
#include <iostream>
#include <map>
#include <memory>
class MapEntryInterface {
public:
MapEntryInterface(std::string type_str) : type(type_str) {}
std::string type;
friend class boost::serialization::access;
template <class Archive>
void serialize(Archive& ar, unsigned int version) {
if (type == "s64") {
MapEntryS64* cast = (MapEntryS64*)this;
cast->serialize(ar, version);
}
if (type == "f64") {
MapEntryF64* cast = (MapEntryF64*)this;
cast->serialize(ar, version);
}
}
};
class MapEntryS64 : public MapEntryInterface {
public:
MapEntryS64(int init_val, const std::string& type_str)
: data(init_val), MapEntryInterface(type_str)
{}
uint64_t data;
friend class boost::serialization::access;
template <class Archive>
void serialize(Archive& ar, unsigned int version) {
ar & type;
ar & data;
}
};
class MapEntryF64 : public MapEntryInterface {
public:
MapEntryF64(double init_val, const std::string& type_str)
: data(init_val), MapEntryInterface(type_str)
{}
double data;
friend class boost::serialization::access;
template <class Archive>
void serialize(Archive& ar, unsigned int version) {
ar & type;
ar & data;
}
};
class MapDataBase {
public:
MapDataBase()
//: test(std::unique_ptr<MapEntryInterface>(new MapEntryS64(381, "s64")))
{
database["key1"] = std::unique_ptr<MapEntryInterface>(new MapEntryS64(381, "s64"));
database["key2"] = std::unique_ptr<MapEntryInterface>(new MapEntryF64(3.124512, "f64"));
};
bool SaveToFile() {
std::ofstream ofs("boost_export.dat");
if (ofs.is_open()) {
boost::archive::text_oarchive oa(ofs);
oa & *this;
return true;
}
return false;
}
bool loadFromFile() {
std::ifstream ifs("boost_export.dat");
if (ifs.is_open())
{
try
{
boost::archive::text_iarchive ia(ifs);
ia & *this;
//std::string yolo;
//ia >> yolo;
//ia >> bam;
}
catch (std::exception& ex)
{
std::cout << ex.what() << std::endl;
return false;
}
}
return true;
}
private:
std::map<std::string, std::unique_ptr<MapEntryInterface>> database;
//std::unique_ptr<MapEntryInterface> test;
friend class boost::serialization::access;
template <class Archive>
void serialize(Archive& ar, unsigned int version) {
ar & database;
}
};
void main() {
MapDataBase tmp;
tmp.SaveToFile();
MapDataBase tmp2;
//tmp2.loadFromFile();
}
ANSWER
Answered 2022-Feb-28 at 15:10
I spent a large amount of time making things self-contained. Among the many changes:
- you should not have the base class serializing the derived (that's classic OOP); instead Boost expects derived classes to serialize their base_object<> (allowing static polymorphism and, incidentally, type registration)
- of course, the base class should serialize its data members (type)
- the base class SHOULD have a virtual destructor (otherwise deleting through unique_ptr's destructor will be unspecified)
Up till here:
class MapEntryInterface {
public:
virtual ~MapEntryInterface() = default;
protected:
MapEntryInterface(std::string type_str) : type_(type_str) {}
std::string type_;
friend class boost::serialization::access;
template <class Ar> void serialize(Ar& ar, unsigned) { ar& type_; }
};
using EntryPtr = std::unique_ptr<MapEntryInterface>;
The base/member initializer lists are in misleading order; initialization happens in declaration order anyway.
The type_str is a code smell: the whole idea of OOP virtualization is not to have switching on types everywhere. I went half-way for you by at least defaulting the values, but you could probably do without it entirely. After all, the type is the type.
Now add the base_object serialization:
template <class Ar> void serialize(Ar& ar, unsigned)
{
ar& boost::serialization::base_object<MapEntryInterface>(*this);
ar& data_;
}
MapDatabase benefits from several simplifications:
- never use new or delete
- checking the streams beforehand is redundant since you already handle exceptions
- since loadFromFile has no useful way of handling the exception, re-throw, or just let it escape, also allowing you to make MapDatabase the return type instead of bool
- saveToFile and loadFromFile should probably take a filename parameter
- not shown: arguably saveToFile and loadFromFile need not be part of MapDatabase
At this point, adding a little bit of code to print database contents:
#include <boost/archive/text_iarchive.hpp>
#include <boost/archive/text_oarchive.hpp>
#include <boost/serialization/map.hpp>
#include <boost/serialization/unique_ptr.hpp>
#include <fstream>
#include <iomanip>
#include <iostream>
#include <map>
class MapEntryInterface {
public:
virtual ~MapEntryInterface() = default;
virtual void print(std::ostream&) const = 0;
protected:
MapEntryInterface(std::string type_str) : type_(type_str) {}
std::string type_;
friend class boost::serialization::access;
template <class Ar> void serialize(Ar& ar, unsigned) { ar& type_; }
friend std::ostream& operator<<(std::ostream& os, MapEntryInterface const& e) {
e.print(os);
return os;
}
};
using EntryPtr = std::unique_ptr<MapEntryInterface>;
class MapEntryS64 : public MapEntryInterface {
public:
MapEntryS64(int init_val = 0, const std::string& type_str = "s64")
: MapEntryInterface(type_str)
, data_(init_val)
{
}
private:
uint64_t data_;
friend class boost::serialization::access;
template <class Ar> void serialize(Ar& ar, unsigned) {
ar& boost::serialization::base_object<MapEntryInterface>(*this);
ar& data_;
}
virtual void print(std::ostream& os) const override {
os << "S64(" << data_ << ", " << std::quoted(type_) << ")";
}
};
class MapEntryF64 : public MapEntryInterface {
public:
MapEntryF64(double init_val = 0, const std::string& type_str = "f64")
: MapEntryInterface(type_str)
, data_(init_val)
{
}
private:
double data_;
friend class boost::serialization::access;
template <class Ar> void serialize(Ar& ar, unsigned)
{
ar& boost::serialization::base_object<MapEntryInterface>(*this);
ar& data_;
}
virtual void print(std::ostream& os) const override {
os << "F64(" << data_ << ", " << std::quoted(type_) << ")";
}
};
class MapDatabase {
public:
using Map = std::map<std::string, EntryPtr>;
MapDatabase() {
database_.emplace("key1", std::make_unique<MapEntryS64>(381));
database_.emplace("key2", std::make_unique<MapEntryF64>(3.124512));
};
bool SaveToFile(std::string const& filename) const
{
try {
std::ofstream ofs(filename, std::ios::binary);
boost::archive::text_oarchive oa(ofs);
oa << *this;
return true;
} catch (std::exception& ex) {
std::cout << ex.what() << std::endl;
return false;
}
}
static MapDatabase loadFromFile(std::string const& filename)
{
MapDatabase db;
std::ifstream ifs(filename, std::ios::binary);
boost::archive::text_iarchive ia(ifs);
ia >> db;
return db;
}
friend std::ostream& operator<<(std::ostream& os, MapDatabase const& mdb)
{
for (auto const& [k, b] : mdb.database_)
if (b) os << std::quoted(k) << " -> " << *b << "\n";
else os << std::quoted(k) << " -> NULL\n";
return os;
}
private:
Map database_;
friend class boost::serialization::access;
template <class Ar> void serialize(Ar& ar, unsigned) { ar& database_; }
};
int main() {
{
MapDatabase tmp;
std::cout << "------ tmp:\n" << tmp << "\n";
if (not tmp.SaveToFile("boost_export.dat"))
return 1;
}
MapDatabase roundtrip = MapDatabase::loadFromFile("boost_export.dat");
std::cout << "------ roundtrip:\n" << roundtrip << "\n";
}
Prints
------ tmp:
"key1" -> S64(381, "s64")
"key2" -> F64(3.12451, "f64")
unregistered class - derived class not registered or exported
That is an "unregistered class" runtime error.
The message says it all: at time of deserialization, there is no information about the types that can be deserialized. Add that:
#include <boost/serialization/export.hpp>
BOOST_CLASS_EXPORT(MapEntryF64)
BOOST_CLASS_EXPORT(MapEntryS64)
Now it prints
------ tmp:
"key1" -> S64(381, "s64")
"key2" -> F64(3.12451, "f64")
------ roundtrip:
"key1" -> S64(381, "s64")
"key2" -> F64(3.12451, "f64")
In the case of separate translation units, split the class export macros as per the documentation. E.g. the class export macro ends up doing
#define BOOST_CLASS_EXPORT_GUID(T, K) \
BOOST_CLASS_EXPORT_KEY2(T, K) \
BOOST_CLASS_EXPORT_IMPLEMENT(T) \
/**/
So, naively (taking another half-hour to split it all up sensibly):
File test.cpp
#include "MapDatabase.h"
#include <iostream>
int main() {
{
MapDatabase tmp;
std::cout << "------ tmp:\n" << tmp << "\n";
if (not tmp.SaveToFile("boost_export.dat"))
return 1;
}
MapDatabase roundtrip = MapDatabase::loadFromFile("boost_export.dat");
std::cout << "------ roundtrip:\n" << roundtrip << "\n";
}
File MapDatabase.h
#pragma once
#include "MapEntryS64.h"
#include "MapEntryF64.h"
#include <boost/serialization/map.hpp>
#include <map>
class MapDatabase {
public:
using Map = std::map<std::string, EntryPtr>;
MapDatabase();
bool SaveToFile(std::string const& filename) const;
static MapDatabase loadFromFile(std::string const& filename);
friend std::ostream& operator<<(std::ostream&, MapDatabase const&);
private:
Map database_;
friend class boost::serialization::access;
template <class Ar> void serialize(Ar& ar, unsigned) { ar& database_; }
};
File MapDatabase.cpp
#include "MapDatabase.h"
#include <boost/archive/text_iarchive.hpp>
#include <boost/archive/text_oarchive.hpp>
#include <fstream>
#include <iomanip>
#include <iostream>
MapDatabase::MapDatabase() {
database_.emplace("key1", std::make_unique<MapEntryS64>(381));
database_.emplace("key2", std::make_unique<MapEntryF64>(3.124512));
}
bool MapDatabase::SaveToFile(std::string const& filename) const
{
try {
std::ofstream ofs(filename, std::ios::binary);
boost::archive::text_oarchive oa(ofs);
oa << *this;
return true;
} catch (std::exception& ex) {
std::cout << ex.what() << std::endl;
return false;
}
}
MapDatabase MapDatabase::loadFromFile(std::string const& filename)
{
MapDatabase db;
std::ifstream ifs(filename, std::ios::binary);
boost::archive::text_iarchive ia(ifs);
ia >> db;
return db;
}
std::ostream& operator<<(std::ostream& os, MapDatabase const& mdb)
{
for (auto const& [k, b] : mdb.database_)
if (b) os << std::quoted(k) << " -> " << *b << "\n";
else os << std::quoted(k) << " -> NULL\n";
return os;
}
File MapEntryF64.h
#pragma once
#include "MapEntryInterface.h"
#include <cstdint>
class MapEntryF64 : public MapEntryInterface {
public:
MapEntryF64(int init_val = 0, const std::string& type_str = "f64");
private:
uint64_t data_;
friend class boost::serialization::access;
template <class Ar> void serialize(Ar& ar, unsigned) {
ar& boost::serialization::base_object<MapEntryInterface>(*this);
ar& data_;
}
virtual void print(std::ostream& os) const override;
};
#include <boost/serialization/export.hpp>
BOOST_CLASS_EXPORT_KEY(MapEntryF64)
File MapEntryInterface.h
#pragma once
#include <boost/serialization/access.hpp>
#include <iosfwd>
#include <memory>
#include <string>
class MapEntryInterface {
public:
virtual ~MapEntryInterface() = default;
virtual void print(std::ostream&) const = 0;
protected:
MapEntryInterface(std::string type_str);
std::string type_;
friend class boost::serialization::access;
template <class Ar> void serialize(Ar& ar, unsigned) { ar& type_; }
friend std::ostream& operator<<(std::ostream&, MapEntryInterface const&);
};
using EntryPtr = std::unique_ptr<MapEntryInterface>;
File MapEntryS64.h
#pragma once
#include "MapEntryInterface.h"
#include <cstdint>
class MapEntryS64 : public MapEntryInterface {
public:
MapEntryS64(int init_val = 0, const std::string& type_str = "s64");
private:
uint64_t data_;
friend class boost::serialization::access;
template <class Ar> void serialize(Ar& ar, unsigned) {
ar& boost::serialization::base_object<MapEntryInterface>(*this);
ar& data_;
}
virtual void print(std::ostream& os) const override;
};
#include <boost/serialization/export.hpp>
BOOST_CLASS_EXPORT_KEY(MapEntryS64)
File MapEntryF64.cpp
#include "MapEntryF64.h"
#include <iomanip>
#include <ostream>
MapEntryF64::MapEntryF64(int init_val, const std::string& type_str)
: MapEntryInterface(type_str)
, data_(init_val)
{
}
void MapEntryF64::print(std::ostream& os) const {
os << "F64(" << data_ << ", " << std::quoted(type_) << ")";
}
BOOST_CLASS_EXPORT_IMPLEMENT(MapEntryF64)
File MapEntryInterface.cpp
#include "MapEntryInterface.h"
MapEntryInterface::MapEntryInterface(std::string type_str) : type_(type_str) {}
std::ostream& operator<<(std::ostream& os, MapEntryInterface const& e)
{
e.print(os);
return os;
}
File MapEntryS64.cpp
#include "MapEntryS64.h"
#include <iomanip>
#include <ostream>
MapEntryS64::MapEntryS64(int init_val, const std::string& type_str)
: MapEntryInterface(type_str)
, data_(init_val)
{
}
void MapEntryS64::print(std::ostream& os) const {
os << "S64(" << data_ << ", " << std::quoted(type_) << ")";
}
BOOST_CLASS_EXPORT_IMPLEMENT(MapEntryS64)
Prints: Live On Wandbox
------ tmp:
"key1" -> S64(381, "s64")
"key2" -> F64(3, "f64")
unregistered class - derived class not registered or exported
The same runtime error again? Not at all unexpected. Just need to read the documentation closely:
BOOST_CLASS_EXPORT
in the same source module that includes any of the archive class headers will instantiate code required to serialize polymorphic pointers of the indicated type to all those archive classes. If no archive class headers are included, then no code will be instantiated.
So, adding the includes:
#include <boost/archive/text_iarchive.hpp>
#include <boost/archive/text_oarchive.hpp>
BOOST_CLASS_EXPORT_IMPLEMENT(MapEntryF64)
(and the same for MapEntryS64):
------ tmp:
"key1" -> S64(381, "s64")
"key2" -> F64(3, "f64")
------ roundtrip:
"key1" -> S64(381, "s64")
"key2" -> F64(3, "f64")
I prefer simplicity. I would probably replace all of the above with:
File MapDatabase.h
#pragma once
#include <boost/variant.hpp>
#include <iosfwd>
#include <map>
#include <string_view>
namespace Database {
struct Nil { void serialize(auto&, unsigned) {} };
using S64 = uint64_t;
using F64 = double;
using Entry = boost::variant<Nil, S64, F64>;
using Map = std::map<std::string, Entry>;
std::string_view typeOf(Entry const&);
void SaveToFile(std::string const& filename, Map const& m);
[[nodiscard]] Map loadFromFile(std::string const& filename);
std::ostream& operator<<(std::ostream&, Nil);
std::ostream& operator<<(std::ostream&, Map const&);
} // namespace Database
File MapDatabase.cpp
#include "MapDatabase.h"
#include <boost/archive/text_iarchive.hpp>
#include <boost/archive/text_oarchive.hpp>
#include <boost/serialization/map.hpp>
#include <boost/serialization/variant.hpp>
#include <array>
#include <cassert>
#include <fstream>
#include <iomanip>
namespace Database {
std::string_view typeOf(Entry const& e) {
assert(e.which() < 3);
return std::array{"Nil", "S64", "F64"}[e.which()];
}
void SaveToFile(std::string const& filename, Map const& m)
{
std::ofstream ofs(filename, std::ios::binary);
boost::archive::text_oarchive oa(ofs);
oa << m;
}
Map loadFromFile(std::string const& filename)
{
Map db;
std::ifstream ifs(filename, std::ios::binary);
boost::archive::text_iarchive ia(ifs);
ia >> db;
return db;
}
std::ostream& operator<<(std::ostream& os, Nil) { return os << "NULL"; }
std::ostream& operator<<(std::ostream& os, Map const& m)
{
for (auto const& [k, v] : m)
os << typeOf(v) << "\t" << std::quoted(k) << " -> " << v << "\n";
return os;
}
} // namespace Database
File test.cpp
#include "MapDatabase.h"
#include <iostream>
int main() {
SaveToFile("boost_export.dat",
Database::Map{
{"key1", 381ul},
{"key3", {}},
{"key2", 3.124512},
});
std::cout << Database::loadFromFile("boost_export.dat");
}
Printing Live On Wandbox
S64 "key1" -> 381
F64 "key2" -> 3.12451
Nil "key3" -> NULL
QUESTION
You'll need this notebook to reproduce the error; it downloads the files below and runs the exact same code following the description.
labels.csv: each row contains x0, y0, x1, y1 text coordinates, and other columns not affecting the outcome.
yolo-train-0.tfrecord: contains 90% of the examples found in labels.csv. Each example contains all labels/rows corresponding to the image in the example.
I'm experiencing a recurring error that happens when iterating over a tfrecord dataset. After 2000-4000 iterations that successfully read batches from the dataset, I get the following error:
iteration: 3240 2022-02-14 04:25:15.376625: W tensorflow/core/framework/op_kernel.cc:1745] OP_REQUIRES failed at scatter_nd_op.cc:219 : INVALID_ARGUMENT: indices[189] = [6, 30, 38, 0] does not index into shape [8,38,38,3,6]
Traceback (most recent call last):
File "/usr/local/lib/python3.7/dist-packages/tensorflow/python/data/ops/iterator_ops.py", line 800, in __next__
return self._next_internal()
File "/usr/local/lib/python3.7/dist-packages/tensorflow/python/data/ops/iterator_ops.py", line 786, in _next_internal
output_shapes=self._flat_output_shapes)
File "/usr/local/lib/python3.7/dist-packages/tensorflow/python/ops/gen_dataset_ops.py", line 2845, in iterator_get_next
_ops.raise_from_not_ok_status(e, name)
File "/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/ops.py", line 7107, in raise_from_not_ok_status
raise core._status_to_exception(e) from None # pylint: disable=protected-access
tensorflow.python.framework.errors_impl.InvalidArgumentError: indices[189] = [6, 30, 38, 0] does not index into shape [8,38,38,3,6]
[[{{function_node __inference_transform_targets_for_output_1051}}{{node TensorScatterUpdate}}]] [Op:IteratorGetNext]
It is near impossible to tell which exact inputs are causing the issue, thanks to tensorflow's brilliant graph execution. I tried using pdb, tf.print statements, and many other desperate measures trying to identify which examples in labels.csv cause the problem and need to be excluded, and nothing looks particularly suspicious.
Here's what the notebook runs, which eventually results in the error mentioned.
import numpy as np
import pandas as pd
import tensorflow as tf
def transform_images(x, image_shape):
x = tf.image.resize(x, image_shape)
return x / 255
@tf.function
def transform_targets_for_output(y_true, grid_size, anchor_indices):
n = tf.shape(y_true)[0]
y_true_out = tf.zeros((n, grid_size, grid_size, tf.shape(anchor_indices)[0], 6))
anchor_indices = tf.cast(anchor_indices, tf.int32)
indexes = tf.TensorArray(tf.int32, 1, dynamic_size=True)
updates = tf.TensorArray(tf.float32, 1, dynamic_size=True)
idx = 0
for i in tf.range(n):
for j in tf.range(tf.shape(y_true)[1]):
if tf.equal(y_true[i][j][2], 0):
continue
anchor_eq = tf.equal(anchor_indices, tf.cast(y_true[i][j][5], tf.int32))
if tf.reduce_any(anchor_eq):
box = y_true[i][j][0:4]
box_xy = (y_true[i][j][0:2] + y_true[i][j][2:4]) / 2
anchor_idx = tf.cast(tf.where(anchor_eq), tf.int32)
grid_xy = tf.cast(box_xy // (1 / grid_size), tf.int32)
indexes = indexes.write(
idx, [i, grid_xy[1], grid_xy[0], anchor_idx[0][0]]
)
updates = updates.write(
idx, [box[0], box[1], box[2], box[3], 1, y_true[i][j][4]]
)
idx += 1
return tf.tensor_scatter_nd_update(y_true_out, indexes.stack(), updates.stack())
def transform_targets(y, anchors, anchor_masks, size):
y_outs = []
grid_size = size // 32
anchors = tf.cast(anchors, tf.float32)
anchor_area = anchors[..., 0] * anchors[..., 1]
box_wh = y[..., 2:4] - y[..., 0:2]
box_wh = tf.tile(tf.expand_dims(box_wh, -2), (1, 1, tf.shape(anchors)[0], 1))
box_area = box_wh[..., 0] * box_wh[..., 1]
intersection = tf.minimum(box_wh[..., 0], anchors[..., 0]) * tf.minimum(
box_wh[..., 1], anchors[..., 1]
)
iou = intersection / (box_area + anchor_area - intersection)
anchor_idx = tf.cast(tf.argmax(iou, axis=-1), tf.float32)
anchor_idx = tf.expand_dims(anchor_idx, axis=-1)
y = tf.concat([y, anchor_idx], axis=-1)
for anchor_indices in anchor_masks:
y_outs.append(transform_targets_for_output(y, grid_size, anchor_indices))
grid_size *= 2
return tuple(y_outs)
def read_example(
example,
feature_map,
class_table,
max_boxes,
image_shape,
):
features = tf.io.parse_single_example(example, feature_map)
image = tf.image.decode_png(features['image'], channels=3)
image = tf.image.resize(image, image_shape)
object_name = tf.sparse.to_dense(features['object_name'])
label = tf.cast(class_table.lookup(object_name), tf.float32)
label = tf.stack(
[tf.sparse.to_dense(features[feature]) for feature in ['x0', 'y0', 'x1', 'y1']]
+ [label],
1,
)
padding = [[0, max_boxes - tf.shape(label)[0]], [0, 0]]
label = tf.pad(label, padding)
return image, label
def read_tfrecord(
fp,
classes_file,
image_shape,
max_boxes,
shuffle_buffer_size,
batch_size,
anchors,
masks,
classes_delimiter='\n',
):
text_initializer = tf.lookup.TextFileInitializer(
classes_file, tf.string, 0, tf.int64, -1, delimiter=classes_delimiter
)
class_table = tf.lookup.StaticHashTable(text_initializer, -1)
files = tf.data.Dataset.list_files(fp)
dataset = files.flat_map(tf.data.TFRecordDataset)
feature_map = {
'image': tf.io.FixedLenFeature([], tf.string),
'x0': tf.io.VarLenFeature(tf.float32),
'y0': tf.io.VarLenFeature(tf.float32),
'x1': tf.io.VarLenFeature(tf.float32),
'y1': tf.io.VarLenFeature(tf.float32),
'object_name': tf.io.VarLenFeature(tf.string),
'object_index': tf.io.VarLenFeature(tf.int64),
}
return (
dataset.map(
lambda x: read_example(x, feature_map, class_table, max_boxes, image_shape),
tf.data.experimental.AUTOTUNE,
)
.batch(batch_size)
.shuffle(shuffle_buffer_size)
.map(
lambda x, y: (
transform_images(x, image_shape),
transform_targets(y, anchors, masks, image_shape[0]),
)
)
.prefetch(tf.data.experimental.AUTOTUNE)
)
if __name__ == '__main__':
input_shape = (608, 608, 3)
labels = pd.read_csv('labels.csv')
classes_file = 'classes.txt'
max_boxes = max([g[1].shape[0] for g in labels.groupby('image')])
shuffle_buffer_size = 256
batch_size = 8
anchors = np.array(
[
(10, 13),
(16, 30),
(33, 23),
(30, 61),
(62, 45),
(59, 119),
(116, 90),
(156, 198),
(373, 326),
]
) / np.array(input_shape[:-1])
masks = np.array([[6, 7, 8], [3, 4, 5], [0, 1, 2]])
train_dataset = read_tfrecord(
'/content/yolo-train-0.tfrecord',
classes_file,
input_shape[:-1],
max_boxes,
shuffle_buffer_size,
batch_size,
anchors,
masks,
)
for i, _ in enumerate(train_dataset, 1): # There should be around 11000 iterations
print(f'\riteration: {i}', end='')
Is there a way to filter out the problematic examples?
I tried the following using try and except blocks, and it doesn't work: the specified exception is still raised despite adding the following to create_tfrecord:
dataset = iter(dataset)
while True:
try:
yield next(dataset)
except InvalidArgumentError:
pass
ANSWER
Answered 2022-Feb-14 at 16:16
Wrapping the transform_targets_for_output method with a try-except-raise clause and applying tf.data.experimental.ignore_errors to the dataset seems to actually work:
def transform_targets_for_output(y_true, grid_size, anchor_indices):
try:
n = tf.shape(y_true)[0]
y_true_out = tf.zeros((n, grid_size, grid_size, tf.shape(anchor_indices)[0], 6))
anchor_indices = tf.cast(anchor_indices, tf.int32)
indexes = tf.TensorArray(tf.int32, 1, dynamic_size=True)
updates = tf.TensorArray(tf.float32, 1, dynamic_size=True)
idx = 0
for i in tf.range(n):
for j in tf.range(tf.shape(y_true)[1]):
if tf.equal(y_true[i][j][2], 0):
continue
anchor_eq = tf.equal(anchor_indices, tf.cast(y_true[i][j][5], tf.int32))
if tf.reduce_any(anchor_eq):
box = y_true[i][j][0:4]
box_xy = (y_true[i][j][0:2] + y_true[i][j][2:4]) / 2
anchor_idx = tf.cast(tf.where(anchor_eq), tf.int32)
grid_xy = tf.cast(box_xy // (1 / grid_size), tf.int32)
indexes = indexes.write(
idx, [i, grid_xy[1], grid_xy[0], anchor_idx[0][0]]
)
updates = updates.write(
idx, [box[0], box[1], box[2], box[3], 1, y_true[i][j][4]]
)
idx += 1
return tf.tensor_scatter_nd_update(y_true_out, indexes.stack(), updates.stack())
except tf.errors.InvalidArgumentError:
raise
Using a batch size of 8, I was able to iterate through the dataset successfully:
train_dataset = train_dataset.apply(tf.data.experimental.ignore_errors())
for i, _ in enumerate(train_dataset, 1): # There should be around 11000 iterations
print(f'\riteration: {i}', end='')
iteration: 11244
QUESTION
I am taking an uploaded image from the user and then sending it to a YOLO model which then returns me an image.
I want to store that returned image in my Local Directory and then display it on the user interface.
This is the Code of views.py
that takes in an image and sends it to the Yolo Model,
def predictImage(request):
# print(request)
# print(request.POST.dict())
fileObj = request.FILES['filePath']
fs = FileSystemStorage()
filePathName = fs.save(fileObj.name, fileObj)
filePathName = fs.url(filePathName)
testimage = '.'+filePathName
# img = image.load_img(testimage, target_size=(img_height, img_width))
img = detect_image(testimage)
filePathName = fs.save(fileObj.name + "_result", img) # -> HERE IS THE ERROR
filePathName = fs.url(filePathName)
This is the function of the YOLO model that uses OpenCV to read the image (the image is sent as an argument to the function) and then returns that image,
import numpy as np
import cv2
def detect_image(img_path):
confidenceThreshold = 0.5
NMSThreshold = 0.3
modelConfiguration = 'cfg/yolov3.cfg'
modelWeights = 'yolov3.weights'
labelsPath = 'coco.names'
labels = open(labelsPath).read().strip().split('\n')
np.random.seed(10)
COLORS = np.random.randint(0, 255, size=(len(labels), 3), dtype="uint8")
net = cv2.dnn.readNetFromDarknet(modelConfiguration, modelWeights)
image = cv2.imread(img_path)
(H, W) = image.shape[:2]
#Determine output layer names
layerName = net.getLayerNames()
layerName = [layerName[i - 1] for i in net.getUnconnectedOutLayers()]
blob = cv2.dnn.blobFromImage(image, 1 / 255.0, (416, 416), swapRB = True, crop = False)
net.setInput(blob)
layersOutputs = net.forward(layerName)
boxes = []
confidences = []
classIDs = []
for output in layersOutputs:
for detection in output:
scores = detection[5:]
classID = np.argmax(scores)
confidence = scores[classID]
if confidence > confidenceThreshold:
box = detection[0:4] * np.array([W, H, W, H])
(centerX, centerY, width, height) = box.astype('int')
x = int(centerX - (width/2))
y = int(centerY - (height/2))
boxes.append([x, y, int(width), int(height)])
confidences.append(float(confidence))
classIDs.append(classID)
#Apply Non Maxima Suppression
detectionNMS = cv2.dnn.NMSBoxes(boxes, confidences, confidenceThreshold, NMSThreshold)
if(len(detectionNMS) > 0):
for i in detectionNMS.flatten():
(x, y) = (boxes[i][0], boxes[i][1])
(w, h) = (boxes[i][2], boxes[i][3])
color = [int(c) for c in COLORS[classIDs[i]]]
cv2.rectangle(image, (x, y), (x + w, y + h), color, 2)
text = '{}: {:.4f}'.format(labels[classIDs[i]], confidences[i])
cv2.putText(image, text, (x, y - 5), cv2.FONT_HERSHEY_SIMPLEX, 0.5, color, 2)
return image
#cv2.imshow('Image', image)
#cv2.waitKey(0)
On this line,
filePathName = fs.save(fileObj.name + "_result", img)
I am getting this following error,
'numpy.ndarray' object has no attribute 'read'
I am not sure how I can resolve this. I tried searching for how to store an OpenCV-modified file using FileSystemStorage but found nothing of help. Can anyone help me regarding this?
ANSWER
Answered 2022-Feb-13 at 16:57
You can use the imwrite function of the cv2 library to store your files in the local directory.
In your case, simply do this:
img = detect_image(testimage)
cv2.imwrite(fileObj.name+"_result.jpg", img=img)
QUESTION
There have already been some similar style questions asked before (1, 2). However, none have mentioned the new Yolov5 style annotations.
Is there a simple function that takes in a normalized Yolov5
bounding box like:-
test = [0.436523 0.535156 0.587891 0.484375]
def some_function(test):
...
return pascal_coords
And returns it in Pascal-VOC format?
I have tried plenty of online scripts - like https://dbuscombe-usgs.github.io/MLMONDAYS/blog/2020/08/17/blog-post and https://blog.roboflow.com/how-to-convert-annotations-from-pascal-voc-to-yolo-darknet/
But they aim for full dataset conversion, including the xml, and some don't accept normalized boxes.
This is the format:
Yolov5 [x-center, y-center, width, height] (normalized)
|---> Converted to <-----|
Pascal VOC [x-top-left, y-top-left, x-bottom-right, y-bottom-right]
I simply want the converted bounding box :) TIA!
ANSWER
Answered 2021-Nov-17 at 18:39
There is no direct way to convert the normalized Yolo format to another format like Pascal VOC, because you need to know the size of the image to do the conversion. (Just like you need to know the size of the image to convert it to the normalized yolo format in the first place.) You will also want to provide some mapping to convert the class numbers to class names.
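For a single box the conversion itself is only a few lines once the image size is known. A minimal sketch (the 640x480 image size below is a hypothetical example, not taken from the question):

```python
def yolo_to_voc(box, img_width, img_height):
    """Convert a normalized YOLO box [x_center, y_center, width, height]
    to Pascal VOC pixel coordinates [x_min, y_min, x_max, y_max]."""
    x_c, y_c, w, h = box
    x_min = (x_c - w / 2) * img_width
    y_min = (y_c - h / 2) * img_height
    x_max = (x_c + w / 2) * img_width
    y_max = (y_c + h / 2) * img_height
    return [round(x_min), round(y_min), round(x_max), round(y_max)]

test = [0.436523, 0.535156, 0.587891, 0.484375]
print(yolo_to_voc(test, 640, 480))  # [91, 141, 467, 373]
```

The key step is converting the (center, size) pair to corner coordinates before scaling by the image dimensions.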
I am working on a Python package to simplify these kinds of conversions called PyLabel. I have a sample notebook that will convert Yolo v5 annotations to VOC format here https://github.com/pylabel-project/samples/blob/main/yolo2voc.ipynb. You can open it in Colab using this link.
The core code will be something like this:
from pylabel import importer
dataset = importer.ImportYoloV5(path=path_to_annotations, path_to_images=path_to_images)
dataset.export.ExportToVoc(output_path=output_path)
Hope that helps. Feel free to contact me if you have feedback or need assistance.
QUESTION
Yolov5 doesn't support segmentation labels and I need to convert it into the correct format.
How would you convert this to yolo format?
"boundingPoly": {
"normalizedVertices": [{
"x": 0.026169369
}, {
"x": 0.99525446
}, {
"x": 0.99525446,
"y": 0.688811
}, {
"x": 0.026169369,
"y": 0.688811
}]
}
The yolo format looks like this
0 0.588196 0.474138 0.823607 0.441645
ANSWER
Answered 2021-Dec-21 at 00:34
After our back and forth in the comments I have enough info to answer your question. This is output from the Google Vision API. The normalizedVertices are similar to the YOLO format, because they are "normalized", meaning the coordinates are scaled between 0 and 1 as opposed to being pixels from 1 to n. Still, you need to do some transformation to put it into the YOLO format. In the YOLO format, the X and Y values in the 2nd and 3rd columns refer to the center of the bounding box, as opposed to one of the corners.
Here is a code snippet that will convert the sample at https://ghostbin.com/hOoaz/raw into the following string in YOLO format: '0 0.5080664305 0.5624289849999999 0.9786587390000001 0.56914843'
#Sample annotation output
json_annotation = """
[
{
"mid": "/m/01bjv",
"name": "Bus",
"score": 0.9459266,
"boundingPoly": {
"normalizedVertices": [
{
"x": 0.018737061,
"y": 0.27785477
},
{
"x": 0.9973958,
"y": 0.27785477
},
{
"x": 0.9973958,
"y": 0.8470032
},
{
"x": 0.018737061,
"y": 0.8470032
}
]
}
}
]
"""
import json
json_object = json.loads(json_annotation, strict=False)
#Map all class names to class id
class_dict = {"Bus": 0}
#Get class id for this record
class_id = class_dict[json_object[0]["name"]]
#Get the max and min values from segmented polygon points
normalizedVertices = json_object[0]["boundingPoly"]["normalizedVertices"]
max_x = max([v['x'] for v in normalizedVertices])
max_y = max([v['y'] for v in normalizedVertices])
min_x = min([v['x'] for v in normalizedVertices])
min_y = min([v['y'] for v in normalizedVertices])
width = max_x - min_x
height = max_y - min_y
center_x = min_x + (width/2)
center_y = min_y + (height/2)
yolo_row = str(f"{class_id} {center_x} {center_y} {width} {height}")
print(yolo_row)
If you are trying to train a YOLO model, there are a few more steps you will need to do: you need to set up the images and annotations in a particular folder structure. But this should help you convert your annotations.
QUESTION
I am trying to run this GitHub repo found at this link: https://github.com/HowieMa/DeepSORT_YOLOv5_Pytorch, after installing the requirements via pip install -r requirements.txt. I am running this in a Python 3.8 virtual environment, on a DJI Manifold 2-G which runs on an NVIDIA Jetson TX2.
The following is the terminal output.
$ python main.py --cam 0 --display
Namespace(agnostic_nms=False, augment=False, cam=0, classes=[0], conf_thres=0.5, config_deepsort='./configs/deep_sort.yaml', device='', display=True, display_height=600, display_width=800, fourcc='mp4v', frame_interval=2, img_size=640, input_path='input_480.mp4', iou_thres=0.5, save_path='output/', save_txt='output/predict/', weights='yolov5/weights/yolov5s.pt')
Initialize DeepSORT & YOLO-V5
Using CPU
Using webcam 0
Traceback (most recent call last):
File "main.py", line 259, in
with VideoTracker(args) as vdo_trk:
File "main.py", line 53, in __init__
cfg.merge_from_file(args.config_deepsort)
File "/home/dji/Desktop/targetTrackers/howieMa/DeepSORT_YOLOv5_Pytorch/utils_ds/parser.py", line 23, in merge_from_file
self.update(yaml.load(fo.read()))
TypeError: load() missing 1 required positional argument: 'Loader'
I have found some suggestions on GitHub, such as in "TypeError: load() missing 1 required positional argument: 'Loader' in Google Colab", which suggest changing yaml.load to yaml.safe_load.
This is the code block to modify:
class YamlParser(edict):
    """
    This is yaml parser based on EasyDict.
    """
    def __init__(self, cfg_dict=None, config_file=None):
        if cfg_dict is None:
            cfg_dict = {}
        if config_file is not None:
            assert(os.path.isfile(config_file))
            with open(config_file, 'r') as fo:
                cfg_dict.update(yaml.load(fo.read()))
        super(YamlParser, self).__init__(cfg_dict)

    def merge_from_file(self, config_file):
        with open(config_file, 'r') as fo:
            self.update(yaml.load(fo.read()))

    def merge_from_dict(self, config_dict):
        self.update(config_dict)
However, changing yaml.load to yaml.safe_load leads to this error instead:
$ python main.py --cam 0 --display
Namespace(agnostic_nms=False, augment=False, cam=0, classes=[0], conf_thres=0.5, config_deepsort='./configs/deep_sort.yaml', device='', display=True, display_height=600, display_width=800, fourcc='mp4v', frame_interval=2, img_size=640, input_path='input_480.mp4', iou_thres=0.5, save_path='output/', save_txt='output/predict/', weights='yolov5/weights/yolov5s.pt')
Initialize DeepSORT & YOLO-V5
Using CPU
Using webcam 0
Done..
Camera ...
Done. Create output file output/results.mp4
Illegal instruction (core dumped)
Has anyone encountered anything similar? Thank you!
ANSWER
Answered 2021-Nov-11 at 05:39

Try this:
yaml.load(fo.read(), Loader=yaml.FullLoader)
It seems that pyyaml>=5.1 requires a Loader argument.
QUESTION
I wanted to know the number of vehicles in the picture using YOLOv5. However, the result of the model was different from detect.py.
0. img
1. model_result
# Model
model = torch.hub.load('ultralytics/yolov5', 'yolov5s') # or yolov5m, yolov5l, yolov5x, custom
# Images
img = r'D:\code\YOLO\dataset\img\public02.png' # or file, Path, PIL, OpenCV, numpy, list (raw string avoids backslash escapes)
# Inference
results = model(img)
# Results
results.print() # or .show(), .save(), .crop(), .pandas(), etc.
result -> (no detections)
2. detect.py
from IPython.display import Image
import os
val_img_path = r'D:\code\YOLO\dataset\img\public02.png'
!python detect.py --img 416 --conf 0.25 --source "{val_img_path}"
result ->
I know that if I don't specify the weights option in detect.py, the default yolov5s model is used. But Result 1 differs from Result 2 using the same model.
ANSWER
Answered 2021-Oct-24 at 21:48

It seems it is an image pre-processing issue. Indeed, with your example, the model loaded from torch hub gives a different output than detect.py. Looking at the source code of detect.py, I see that there is some good image pre-processing. From the model hub, I really don't know what's happening to the input. From the model hub, the resulting image is this:
With their pre-processing, this is basically the image that you are feeding into the model. I would not expect any detections from this, to be honest.
But then I tried doing the pre-processing myself (noted as well in their tutorial):
import torch
import cv2
# Model
model = torch.hub.load('ultralytics/yolov5', 'yolov5s') # or yolov5m, yolov5l, yolov5x, custom
# Image
imgPath = '/content/9X9FP.png'
img = cv2.imread(imgPath)[..., ::-1] # Pre-processing OpenCV image (BGR to RGB)
# Inference
results = model(img)
# Results
results.save()
And it all works fine:
So for a quick and easy answer, I would just do the pre-processing myself; it's just a simple one-line extra step. Good luck!
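The `[..., ::-1]` slice in the answer above simply reverses the last (channel) axis, turning OpenCV's BGR pixel order into the RGB order the hub model expects. A standalone NumPy illustration of that one-line step, with no torch or image file required:

```python
import numpy as np

# A 1x1 "image" holding a single pure-blue pixel in OpenCV's BGR order.
bgr = np.array([[[255, 0, 0]]], dtype=np.uint8)

# Reversing the last axis swaps the B and R channels -> RGB order.
rgb = bgr[..., ::-1]

print(rgb[0, 0].tolist())  # [0, 0, 255]: same pixel, now RGB
```

Feeding BGR data to a model trained on RGB images shifts every color channel, which is enough to suppress detections entirely, as seen in the question.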
Community Discussions, Code Snippets contain sources that include Stack Exchange Network