Human Generated Data

Title

Untitled (USO show, Long Binh Post, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American (1945-1984)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.284.3

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Human 98.9
Person 98.9
Person 98.8
Person 98.5
Person 98.4
Person 98
Indoors 97.3
Interior Design 97.3
Person 95.9
Person 95.7
Clothing 94.8
Apparel 94.8
Person 92.4
Person 91.1
Person 90
Room 89.8
Person 88.9
Person 87.5
Person 86.9
Person 85.7
Person 83.2
Coat 75.4
Overcoat 75.4
Poster 71.6
Collage 71.6
Advertisement 71.6
Crowd 69.6
Person 65.7
Suit 62.6
Audience 57.5
Mannequin 55.2
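
The Amazon tags above have the shape of an AWS Rekognition DetectLabels response (a label name with a confidence score). A minimal sketch of how such labels could be requested with boto3 follows; the filename photo.jpg and the confidence threshold of 55 are illustrative assumptions, not values recorded on this page.

# Minimal sketch: image labels from AWS Rekognition via boto3.
# Assumes AWS credentials are configured and the photograph is
# available locally as "photo.jpg" (a hypothetical filename).
import boto3

def detect_labels(path="photo.jpg", min_confidence=55.0):
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,  # threshold is an assumption
    )
    # Each label carries a name and a confidence score, as listed above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')

if __name__ == "__main__":
    detect_labels()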

Clarifai
created on 2019-08-09

people 100
group 99.6
group together 99.3
many 98.9
adult 98.1
wear 97.1
child 95.9
woman 95.9
several 95.4
man 94.4
leader 92.4
outfit 91.4
administration 91
military 89
war 87.8
recreation 87.6
music 87.1
actor 86.5
five 80.5
veil 78.8

Imagga
created on 2019-08-09

old 25.8
people 17.3
religion 17
statue 16.3
sculpture 15.8
person 15.5
art 15.3
architecture 14.8
cemetery 14.7
vintage 13.2
world 13.2
history 12.5
man 12.1
ancient 12.1
city 11.6
male 11.3
religious 11.2
stone 11.1
building 10.6
black 10.5
men 10.3
monument 10.3
cathedral 9.6
god 9.6
antique 9.5
scene 9.5
grunge 9.4
light 9.4
church 9.2
dress 9
dirty 9
catholic 8.8
instrument 8.4
monk 8.3
kin 8.2
group 8.1
guillotine 8
face 7.8
device 7.7
war 7.7
mystery 7.7
texture 7.6
musical instrument 7.5
silhouette 7.4
tourism 7.4
symbol 7.4
detail 7.2
clothing 7.1
spectator 7.1
portrait 7.1
travel 7

Google
created on 2019-08-09

Photograph 95.3
Standing 84.8
History 68.8
Monochrome 54.4
Family 52.5

Microsoft
created on 2019-08-09

clothing 97.4
person 96.4
outdoor 91
man 88.2
text 81.8
black and white 71.1
woman 52.8
posing 37.1

Face analysis

Amazon

AWS Rekognition

Age 33-49
Gender Female, 50.7%
Fear 45.4%
Angry 45%
Happy 45%
Disgusted 45%
Calm 45.7%
Confused 45%
Sad 53.9%
Surprised 45%

AWS Rekognition

Age 3-9
Gender Female, 50.9%
Calm 52.6%
Happy 45.4%
Fear 45%
Surprised 45.1%
Confused 45.1%
Disgusted 45%
Angry 45.1%
Sad 46.6%

AWS Rekognition

Age 28-44
Gender Female, 50.6%
Happy 45.1%
Angry 45.1%
Sad 54%
Disgusted 45%
Confused 45%
Calm 45.1%
Fear 45.7%
Surprised 45%

AWS Rekognition

Age 27-43
Gender Female, 50.8%
Happy 45%
Angry 45.1%
Disgusted 45%
Fear 45.1%
Sad 54%
Surprised 45%
Calm 45.4%
Confused 45.3%

AWS Rekognition

Age 37-55
Gender Male, 54.2%
Disgusted 45%
Sad 47.4%
Fear 45.1%
Happy 45.1%
Surprised 45.1%
Confused 45.1%
Calm 51.5%
Angry 45.8%

AWS Rekognition

Age 50-68
Gender Male, 52.9%
Angry 45.1%
Calm 45.4%
Sad 45.2%
Confused 53.9%
Disgusted 45%
Happy 45.1%
Fear 45.2%
Surprised 45.1%

AWS Rekognition

Age 48-66
Gender Male, 52.9%
Angry 45.1%
Calm 50.8%
Sad 48.9%
Confused 45.1%
Disgusted 45%
Happy 45.1%
Fear 45%
Surprised 45%

AWS Rekognition

Age 26-42
Gender Female, 51%
Angry 45.1%
Confused 45.1%
Happy 50.4%
Disgusted 45.1%
Calm 48.6%
Fear 45.1%
Sad 45.2%
Surprised 45.4%
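
The face analysis entries above follow the shape of an AWS Rekognition DetectFaces response: an estimated age range, a gender estimate, and per-emotion confidences for each detected face. A minimal sketch, again assuming configured boto3 credentials and a local file photo.jpg (hypothetical):

# Minimal sketch: per-face age, gender, and emotion estimates from
# AWS Rekognition DetectFaces. The filename is a placeholder.
import boto3

def detect_faces(path="photo.jpg"):
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, and emotion attributes
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')

if __name__ == "__main__":
    detect_faces()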

Feature analysis

Amazon

Person 98.9%

Text analysis

Google

The own Rown
The
own
Rown
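
The text analysis results above match the output pattern of Google Cloud Vision text detection, where the first annotation is the full detected string and the remaining annotations are individual words. A minimal sketch, assuming the google-cloud-vision client library, application default credentials, and a local file photo.jpg (hypothetical):

# Minimal sketch: OCR text detection with Google Cloud Vision.
from google.cloud import vision

def detect_text(path="photo.jpg"):
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        content = f.read()
    image = vision.Image(content=content)
    response = client.text_detection(image=image)
    # First annotation is the full text; the rest are individual words.
    for annotation in response.text_annotations:
        print(annotation.description)

if __name__ == "__main__":
    detect_text()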