Human Generated Data

Title

Untitled (USO show, Long Binh Post, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American (1945-1984)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.284.5

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Human 99.7
Person 99.7
Person 99.7
Person 98.9
Apparel 98.9
Clothing 98.9
Person 98.4
Person 98.3
Person 97.8
Person 87.1
Person 81.6
Person 80.9
Footwear 73.6
Shoe 73.6
Helmet 69.6
Coat 69
Overcoat 69
Fashion 67.1
Person 63.6
People 59.9
Mammal 59.3
Animal 59.3
Canine 59.3
Suit 58.3
Female 56.5
Cloak 55.5
Sleeve 55.1
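
The tag lists in this section pair a label with a confidence score (for example, "Person 99.7"). For the Amazon tags, output of this shape is what Rekognition's label detection returns; the following is only a minimal sketch of such a call, assuming boto3 with configured AWS credentials and a hypothetical local scan named photo.jpg, not the museum's actual pipeline.

import boto3

# Minimal sketch: run Rekognition label detection on a local image file.
# Assumes AWS credentials are configured; "photo.jpg" is a hypothetical filename.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,       # cap on the number of labels returned
    MinConfidence=55,   # drop labels below roughly the cutoff seen in the list above
)

# Print each label with its confidence score, e.g. "Person 99.7".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")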

Clarifai
created on 2019-08-09

people 100
group together 99.7
group 99.4
many 98.8
several 98
adult 97.9
wear 96.6
administration 95.3
child 94.5
man 93.9
woman 93.8
five 93.3
war 92.1
military 91
leader 90.9
outfit 89.1
four 88.9
music 83.5
vehicle 83.5
recreation 82.5

Imagga
created on 2019-08-09

silhouette 27.3
kin 25.8
man 25.5
people 25.1
musical instrument 24.6
marimba 22.2
percussion instrument 20
sunset 18.9
person 18.5
male 17.7
black 15.1
beach 14.3
adult 13.9
water 13.3
wind instrument 13.1
sea 12.5
ocean 11.6
sky 11.5
group 11.3
travel 11.3
men 11.2
women 11.1
love 11
business 10.9
brass 10.9
family 10.7
together 10.5
couple 10.4
summer 10.3
life 10.3
clothing 9.9
dusk 9.5
happiness 9.4
evening 9.3
leisure 9.1
tourism 9.1
sun 8.9
businessman 8.8
boy 8.7
scene 8.7
dark 8.3
fashion 8.3
human 8.2
window 8.2
calm 8.2
outdoors 8.2
tourist 8.2
dress 8.1
room 8
lifestyle 7.9
world 7.8
outdoor 7.6
hand 7.6
relax 7.6
shore 7.4
sax 7.4
vacation 7.4
light 7.3
holiday 7.2
art 7.2
romantic 7.1
day 7.1

Google
created on 2019-08-09

Photograph 96.1
Standing 88.6
Snapshot 83.9
Team 70.5
Crew 69.3
History 67.6
Vintage clothing 57
Family 56.1

Microsoft
created on 2019-08-09

person 97
text 93.9
clothing 90.2
black and white 77.8
man 77.4
standing 75.2

Face analysis

Amazon

AWS Rekognition

Age 13-25
Gender Male, 54.7%
Angry 45%
Confused 45%
Calm 46.9%
Surprised 45%
Happy 45%
Disgusted 45%
Fear 45%
Sad 53%

AWS Rekognition

Age 33-49
Gender Male, 54.3%
Sad 53.3%
Happy 45%
Fear 46.7%
Calm 45%
Surprised 45%
Angry 45%
Disgusted 45%
Confused 45%

AWS Rekognition

Age 23-35
Gender Male, 54.1%
Sad 54.4%
Surprised 45%
Happy 45%
Angry 45.2%
Disgusted 45%
Fear 45.3%
Calm 45.1%
Confused 45%

AWS Rekognition

Age 55-73
Gender Male, 53.9%
Angry 45.9%
Fear 49%
Surprised 45.2%
Disgusted 45.1%
Calm 46.5%
Sad 48.1%
Confused 45.3%
Happy 45%
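
Each face record above lists an estimated age range, a gender guess with its confidence, and a confidence value per emotion. A minimal sketch of how Rekognition returns data in this shape, again assuming configured AWS credentials and a hypothetical photo.jpg rather than the museum's actual pipeline:

import boto3

# Minimal sketch of Rekognition face analysis.
# Assumes AWS credentials are configured; "photo.jpg" is a hypothetical filename.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

faces = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back as a list of {Type, Confidence} entries.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")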

Feature analysis

Amazon

Person 99.7%
Shoe 73.6%
Helmet 69.6%

Text analysis

Google

The
The
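
The text analysis lists the words Google's OCR detected in the image (here, "The", twice). A minimal sketch of that kind of call using the google-cloud-vision client library, assuming application default credentials and the same hypothetical photo.jpg, not the museum's actual pipeline:

from google.cloud import vision

# Minimal sketch of Google Cloud Vision text detection.
# Assumes application default credentials; "photo.jpg" is a hypothetical filename.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)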