Human Generated Data

Title

Impromptu Hoedown on the Third Floor of a Horse Barn

Date

1948

People

Artist: Wayne Miller, American, 1918-2013

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.699

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 99.5
Person 98.5
Leisure Activities 96.5
Musical Instrument 95.8
Musician 95.8
Apparel 88.8
Clothing 88.8
Music Band 88.8
Guitar 86.5
Person 86.4
Performer 73.9
Guitarist 71.6
Footwear 67.3
Boot 65.9
Person 64.9
Shoe 62.5
Shoe 61.2
Fiddle 58.7
Violin 58.7
Viola 58.7
Shoe 51.1

Imagga
created on 2022-01-09

musical instrument 42.4
wind instrument 33.5
man 28.2
accordion 22.8
male 21.3
people 21.2
adult 19.7
brass 18.4
keyboard instrument 18.3
person 17
kin 16.8
dress 15.3
clothing 14.8
danger 14.5
traditional 14.1
city 14.1
old 13.9
men 13.7
protection 12.7
soldier 11.7
portrait 11.6
military 11.6
mask 11.2
statue 10.6
bride 10.5
couple 10.4
uniform 9.9
fashion 9.8
war 9.6
urban 9.6
building 9.6
gun 9.5
women 9.5
trombone 9.3
dirty 9
rifle 9
groom 9
suit 9
outdoors 8.9
celebration 8.8
protective 8.8
weapon 8.7
scene 8.6
culture 8.5
two 8.5
black 8.4
dark 8.3
street 8.3
wedding 8.3
looking 8
day 7.8
sculpture 7.8
destruction 7.8
architecture 7.8
accident 7.8
world 7.8
gas 7.7
musician 7.6
elegance 7.6
art 7.5
tradition 7.4
bass 7.4
safety 7.4
historic 7.3
holiday 7.2
family 7.1
love 7.1
happiness 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 98.9
musical instrument 93.5
clothing 92.6
text 92.2
concert 79.6
man 76.7
black and white 63.5
dance 62.5
drum 60.3
people 56.2
old 42.3

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 53.9%
Calm 93.6%
Happy 1.9%
Surprised 1.9%
Angry 0.8%
Sad 0.6%
Confused 0.6%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Happy 97.5%
Surprised 1.1%
Confused 0.5%
Fear 0.3%
Disgusted 0.2%
Angry 0.2%
Calm 0.2%
Sad 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Guitar 86.5%
Shoe 62.5%

Captions

Microsoft

a group of people standing in front of a building 92.5%
a group of people standing next to a building 91.7%
a group of people standing outside of a building 91.5%