Human Generated Data

Title

Untitled (USO show, Long Binh Post, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.285.2

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Human 99.7
Person 99.7
Person 99.3
Person 99.3
Person 99
Person 98.6
Person 97.6
Person 97.3
Person 94.3
Person 93.7
Clothing 91.3
Apparel 91.3
Person 86.2
Person 85.2
Mannequin 84.8
Person 65.7
Advertisement 65.6
Poster 63
Person 62.7
People 61.8
Collage 60.5
Room 58.1
Indoors 58.1
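
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition DetectLabels API. A minimal sketch of producing such a list with boto3 follows; the file name image.jpg and the MinConfidence threshold are illustrative assumptions, not part of the original record.

    import boto3

    # Sketch: request labels for the photograph from AWS Rekognition.
    # "image.jpg" is a placeholder path for the scanned photograph.
    client = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # the weakest tag above scores about 58
        )

    # Print "Label Confidence" pairs, matching the format of the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')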

Clarifai
created on 2019-08-09

people 99.9
group together 99.3
group 99.2
many 97.7
adult 97.5
several 95.5
man 95.4
woman 94
wear 93.4
administration 92.6
child 89
leader 87.9
five 85.6
music 84.9
military 84.1
recreation 82.9
outfit 80.5
crowd 80.3
four 79.5
war 76.1

Imagga
created on 2019-08-09

musical instrument 57.8
wind instrument 51.1
accordion 32
keyboard instrument 25.6
man 22.8
people 22.3
brass 20.1
male 19.1
stage 17.3
person 17.3
men 16.3
dress 16.2
sax 15.7
adult 15.1
cornet 14.9
group 13.7
clothing 13
fashion 12.8
business 12.1
free-reed instrument 11.5
couple 11.3
old 11.1
black 10.8
scene 10.4
portrait 10.3
clothes 10.3
women 10.3
room 10
religion 9.9
art 9.8
platform 9.3
city 9.1
groom 9.1
businessman 8.8
happy 8.8
bride 8.6
wall 8.5
face 8.5
outfit 8.5
religious 8.4
human 8.2
family 8
harmonica 7.9
love 7.9
holiday 7.9
hands 7.8
life 7.8
musician 7.6
hand 7.6
catholic 7.6
traditional 7.5
vintage 7.4
style 7.4
church 7.4
tradition 7.4
wedding 7.4
new 7.3
lifestyle 7.2
building 7.1
work 7.1
indoors 7

Google
created on 2019-08-09

Photograph 95.4
Standing 83.6
History 68.8
Team 64.1
Black-and-white 56.4
Vintage clothing 54.7
Crew 52.6
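
The Google tags follow the same label/score pattern as Cloud Vision label detection. A minimal sketch with the google-cloud-vision Python client (version 2+ assumed; image.jpg is again a placeholder path):

    from google.cloud import vision

    # Sketch: label detection with the Google Cloud Vision client library.
    client = vision.ImageAnnotatorClient()

    with open("image.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Scores are returned on a 0-1 scale; convert to percentages like the list above.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")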

Microsoft
created on 2019-08-09

person 98.7
clothing 93.7
statue 83.9
text 81.2
man 74.4
group 70.8
black and white 63.1

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Female, 50.3%
Confused 45.1%
Angry 48.7%
Disgusted 45.1%
Fear 45.1%
Calm 49.2%
Surprised 45.3%
Happy 46.2%
Sad 45.4%

AWS Rekognition

Age 10-20
Gender Female, 50.4%
Happy 46%
Disgusted 45.1%
Surprised 45.2%
Angry 45.5%
Fear 45.4%
Calm 46.4%
Sad 51.3%
Confused 45.2%

AWS Rekognition

Age 50-68
Gender Male, 53.8%
Disgusted 45.1%
Calm 50.3%
Sad 48.6%
Fear 45.1%
Happy 45.1%
Angry 45.4%
Surprised 45.2%
Confused 45.2%

AWS Rekognition

Age 23-37
Gender Male, 52.4%
Angry 46.1%
Confused 45.2%
Happy 45.1%
Disgusted 45%
Calm 48.3%
Fear 45.2%
Sad 49.9%
Surprised 45.1%

AWS Rekognition

Age 50-68
Gender Male, 51.6%
Sad 45.7%
Surprised 45.2%
Happy 46.2%
Angry 45.4%
Disgusted 45.5%
Fear 45.1%
Calm 51.6%
Confused 45.4%
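
Each face block above (age range, gender, and per-emotion confidences) matches the output shape of the AWS Rekognition DetectFaces API when all attributes are requested. A minimal sketch, again assuming a local placeholder file image.jpg:

    import boto3

    # Sketch: per-face age, gender, and emotion estimates from AWS Rekognition.
    client = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')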

Feature analysis

Amazon

Person 99.7%