Human Generated Data

Title

Untitled (USO show, Long Binh Post, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.289.4

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Person 99.5
Human 99.5
Person 98.9
Person 98.3
Person 98.2
Person 98.1
Person 97.5
Person 96.7
Apparel 96.7
Clothing 96.7
Person 94.5
Person 93.3
Person 92.1
Person 88.3
Person 84.9
Person 80.9
Person 79.8
Person 77.5
Person 77.5
People 72.1
Person 67.1
Hat 66.9
Person 62.4
Person 59.2
Crowd 55.8
Person 53.3

Clarifai
created on 2019-08-09

people 99.9
group 99
group together 98.7
many 98.3
adult 96.3
man 94.6
administration 93.4
leader 90.9
several 90.8
child 89.9
woman 89.6
wear 89.3
crowd 85.1
music 77.5
war 73.5
education 71
military 70.9
vehicle 66.9
recreation 66.8
four 65.9

Imagga
created on 2019-08-09

man 21.5
people 20.1
person 20.1
male 16.4
human 15.7
men 14.6
business 14.6
adult 13.9
musical instrument 13.4
city 13.3
clothing 12.9
world 12.8
black 12.6
urban 12.2
women 11.9
happiness 11.7
room 11.4
negative 11.3
portrait 11
happy 10
life 9.8
businessman 9.7
shop 9.7
chemical 9.6
wind instrument 9.6
bag 9.5
wall 9.4
film 9.4
fashion 9
dress 9
suit 9
couple 8.7
bride 8.6
street 8.3
worker 8.1
group 8.1
work 7.9
commuter 7.9
love 7.9
instrument 7.8
scene 7.8
chemistry 7.7
old 7.7
wedding 7.4
professional 7.3
color 7.2
sexy 7.2
mask 7.2
team 7.2
accordion 7.1
hair 7.1
interior 7.1
working 7.1
medical 7.1

Google
created on 2019-08-09

Photograph 96.1
Social group 90
Standing 87
Snapshot 84.9
Team 84.6
Black-and-white 68.3
Room 65.7
Crew 63
Photography 62.4
Monochrome 60.1
History 57.6
Family 51
Style 51
Crowd 50.7

Microsoft
created on 2019-08-09

person 98.4
text 94.8
wedding dress 94.2
dress 88.9
clothing 86.2
bride 80.4
group 79.4
woman 76.9
dance 54.6

Face analysis

Amazon

AWS Rekognition

Age 44-62
Gender Male, 52.1%
Disgusted 45%
Happy 45.4%
Sad 46.6%
Calm 52.2%
Fear 45.1%
Angry 45.4%
Confused 45.1%
Surprised 45.2%

AWS Rekognition

Age 16-28
Gender Female, 53.4%
Happy 45.3%
Angry 51.6%
Disgusted 45.2%
Calm 46.5%
Confused 45.3%
Sad 45.6%
Surprised 45.3%
Fear 45.3%

AWS Rekognition

Age 20-32
Gender Male, 51.6%
Fear 45%
Sad 45.2%
Disgusted 45.1%
Happy 49.9%
Surprised 45.2%
Angry 45.2%
Calm 49.3%
Confused 45.1%

AWS Rekognition

Age 32-48
Gender Female, 54.5%
Calm 45.1%
Confused 45.1%
Happy 46.2%
Surprised 45.1%
Angry 53.4%
Fear 45.1%
Disgusted 45%
Sad 45%

AWS Rekognition

Age 47-65
Gender Male, 52.5%
Confused 45.8%
Calm 46.7%
Surprised 45.3%
Angry 48.8%
Sad 45.9%
Disgusted 46.6%
Happy 45.7%
Fear 45.2%

AWS Rekognition

Age 15-27
Gender Male, 53.6%
Calm 45.1%
Confused 45.2%
Sad 45.4%
Fear 45.6%
Disgusted 45.1%
Surprised 46.5%
Angry 51.9%
Happy 45.3%

AWS Rekognition

Age 18-30
Gender Male, 50.3%
Happy 45.1%
Disgusted 45%
Confused 45%
Sad 45.1%
Fear 45%
Surprised 45%
Angry 54.7%
Calm 45%

Feature analysis

Amazon

Person 99.5%

Text analysis

Amazon

Band
Band cr
cr

Google

The Band of Ren
The
Band
of
Ren