Human Generated Data

Title

Untitled (studio portrait of women in band uniforms and hats)

Date

c. 1934, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5825

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Military 99.5
Military Uniform 99.4
Person 99.4
Human 99.4
Person 99.3
Person 99
Troop 98.8
Army 98.8
People 98.8
Armored 98.8
Person 98.7
Person 98.1
Person 98
Helmet 97.3
Apparel 97.3
Clothing 97.3
Helmet 92.8
Helmet 91.9
Person 91.4
Helmet 91.3
Person 90.8
Soldier 90.3
Helmet 90.3
Helmet 90
Helmet 89
Person 88.5
Person 88.2
Helmet 86.7
Officer 83.7
Person 80.2
Helmet 77.4
Person 76.7
Person 75.3
Person 72.1
Person 71.8
Helmet 71.5
Person 65.2
Person 65.2
Crowd 59
Helmet 56.6
Shorts 56
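
The Amazon tags above follow the shape of Amazon Rekognition's label-detection output: a label name plus a 0-100 confidence score. Below is a minimal sketch of how such a list could be produced with boto3; the S3 bucket, object key, and region are placeholders, not the actual storage location of this image.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical S3 location for the scanned print.
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photographs/4.2002.5825.jpg"}},
    MaxLabels=50,
    MinConfidence=50,
)

# Each label carries a name and a confidence score (0-100), as in the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")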

Clarifai
created on 2019-11-16

people 99.9
group together 99.3
adult 98.9
wear 97.6
many 97.3
uniform 96.4
outfit 95.9
woman 95.8
portrait 95.1
man 95
group 93.1
one 90.5
crowd 85.4
veil 84.9
athlete 84.8
school 83
competition 82.3
music 79.5
military 76.3
education 76.3
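
The Clarifai concepts above pair a concept name with a confidence value. Below is a rough sketch against Clarifai's v2 REST predict endpoint as it was offered around 2019; the API key, model identifier, and image URL are placeholders, and the current Clarifai API may differ.

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"   # placeholder
MODEL_ID = "GENERAL_MODEL_ID"       # placeholder for Clarifai's general recognition model
IMAGE_URL = "https://example.org/4.2002.5825.jpg"  # placeholder image URL

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Concepts come back with a name and a 0-1 value; the list above shows them as percentages.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")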

Imagga
created on 2019-11-16

people 24
black 19.4
silhouette 19
man 16.9
person 16.8
group 16.1
cockpit 15.1
sexy 14.4
crowd 14.4
fashion 13.6
adult 13
team 12.5
device 11.9
dance 11.5
competition 11
control panel 10.6
body 10.4
business 10.3
sport 10.2
support 9.9
pretty 9.8
human 9.7
technology 9.6
style 9.6
design 9.6
women 9.5
model 9.3
athlete 9.3
male 9.2
city 9.1
modern 9.1
symbol 8.7
skill 8.7
equipment 8.6
performance 8.6
party 8.6
men 8.6
bright 8.6
3d 8.5
legs 8.5
dancer 8.4
attractive 8.4
lights 8.3
training 8.3
digital 8.1
posing 8
hair 7.9
shiny 7.9
urban 7.9
world 7.9
cheering 7.8
face 7.8
audience 7.8
motion 7.7
grunge 7.7
erotic 7.5
flag 7.5
vivid 7.4
headstock 7.4
event 7.4
music 7.3
pose 7.2
businessman 7.1
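
The Imagga tags above match the response shape of Imagga's v2 tagging endpoint, which returns English tag names with 0-100 confidence scores. Below is a hedged sketch using HTTP basic auth; the API key, secret, and image URL are placeholders.

import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder
IMAGE_URL = "https://example.org/4.2002.5825.jpg"  # placeholder image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry pairs an English tag name with a confidence score, as in the list above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")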

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 99.5
clothing 94.6
person 92.2
black and white 80.5
footwear 76.5
posing 71.9
smile 57.8
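
The Microsoft tags above correspond to the Azure Computer Vision "Tag Image" operation, which returns tag names with 0-1 confidence values. Below is a sketch against the v2.0 REST endpoint that was current when these tags were generated; the endpoint, key, and local filename are placeholders.

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

# Hypothetical local filename for the scanned print.
with open("4.2002.5825.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/tag",
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

# Tags carry a 0-1 confidence; the list above reports them as percentages.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")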

Color Analysis

Face analysis

AWS Rekognition

Age 14-26
Gender Male, 53.3%
Disgusted 45%
Surprised 45.1%
Happy 45.1%
Confused 45%
Sad 45.1%
Fear 45%
Angry 45.1%
Calm 54.6%

AWS Rekognition

Age 18-30
Gender Female, 54.4%
Sad 45.7%
Surprised 45.3%
Angry 45.3%
Calm 46.1%
Happy 49.4%
Fear 46.4%
Disgusted 46.6%
Confused 45.3%

AWS Rekognition

Age 12-22
Gender Female, 53.5%
Disgusted 45%
Confused 45%
Sad 45.1%
Calm 54.7%
Angry 45.1%
Fear 45%
Surprised 45%
Happy 45.1%

AWS Rekognition

Age 20-32
Gender Female, 51.5%
Confused 45%
Disgusted 45%
Surprised 45%
Calm 52.8%
Angry 45.1%
Sad 47%
Happy 45%
Fear 45%

AWS Rekognition

Age 10-20
Gender Male, 51.5%
Calm 54.4%
Surprised 45%
Happy 45.1%
Sad 45.3%
Confused 45%
Angry 45.1%
Fear 45%
Disgusted 45.1%

AWS Rekognition

Age 22-34
Gender Female, 53.7%
Surprised 45%
Angry 45.1%
Happy 45%
Fear 45.1%
Calm 45.3%
Disgusted 45.1%
Sad 54.3%
Confused 45%

AWS Rekognition

Age 18-30
Gender Female, 53.1%
Calm 54.4%
Sad 45.1%
Happy 45.1%
Angry 45.1%
Surprised 45.1%
Confused 45%
Disgusted 45.1%
Fear 45%

AWS Rekognition

Age 15-27
Gender Female, 51.7%
Happy 45%
Fear 45%
Sad 45.4%
Confused 45%
Disgusted 45%
Angry 45.1%
Calm 54.5%
Surprised 45%

AWS Rekognition

Age 21-33
Gender Female, 54.5%
Angry 45.1%
Happy 51.3%
Disgusted 45.1%
Sad 45.3%
Calm 48%
Surprised 45.1%
Confused 45.1%
Fear 45.1%

AWS Rekognition

Age 25-39
Gender Female, 54.9%
Angry 45.1%
Confused 45.1%
Disgusted 45.4%
Happy 46.9%
Sad 45.4%
Fear 45.1%
Calm 51.9%
Surprised 45.1%

AWS Rekognition

Age 13-23
Gender Male, 54%
Happy 45.2%
Fear 45.1%
Disgusted 45.1%
Angry 45.4%
Surprised 45.1%
Sad 45.7%
Confused 45.1%
Calm 53.1%

AWS Rekognition

Age 15-27
Gender Female, 54.1%
Calm 54.4%
Fear 45%
Angry 45.1%
Disgusted 45%
Sad 45.3%
Happy 45.1%
Surprised 45%
Confused 45%

AWS Rekognition

Age 16-28
Gender Female, 53.4%
Fear 48.4%
Happy 47.9%
Calm 45.4%
Sad 46.9%
Angry 45.3%
Surprised 45.8%
Disgusted 45.2%
Confused 45.2%

AWS Rekognition

Age 22-34
Gender Female, 54.7%
Fear 45.1%
Surprised 45.2%
Angry 45.4%
Calm 49.6%
Sad 45.7%
Happy 48.3%
Disgusted 45.4%
Confused 45.2%

AWS Rekognition

Age 13-25
Gender Male, 53.2%
Calm 50.6%
Angry 45.9%
Fear 45.3%
Happy 46.8%
Sad 45.5%
Confused 45.2%
Disgusted 45.5%
Surprised 45.2%

AWS Rekognition

Age 9-19
Gender Male, 54.3%
Angry 45.1%
Sad 45.5%
Confused 45.1%
Disgusted 45.2%
Surprised 45.3%
Happy 46.1%
Fear 45.6%
Calm 52.1%

AWS Rekognition

Age 6-16
Gender Female, 54.4%
Angry 46.3%
Surprised 45.3%
Sad 45.9%
Fear 45.5%
Calm 50.9%
Happy 45.4%
Confused 45.2%
Disgusted 45.4%
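
Each AWS Rekognition block above (an age range, a gender guess with confidence, and a confidence per emotion category) matches the per-face output of Rekognition's DetectFaces operation when all facial attributes are requested. Below is a minimal boto3 sketch; the S3 location is a placeholder.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical S3 location for the scanned print.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photographs/4.2002.5825.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # One confidence value per emotion category, as in the blocks above.
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")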

Microsoft Cognitive Services

Age 25
Gender Female

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 19
Gender Female

Microsoft Cognitive Services

Age 20
Gender Female

Microsoft Cognitive Services

Age 21
Gender Male

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 19
Gender Female

Microsoft Cognitive Services

Age 19
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 12
Gender Female

Microsoft Cognitive Services

Age 24
Gender Female

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 21
Gender Female

Microsoft Cognitive Services

Age 14
Gender Female

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 23
Gender Male
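
The Microsoft Cognitive Services entries above (a single age estimate and a gender label per detected face) correspond to the Azure Face API detect operation with the age and gender attributes, as offered in 2019; Microsoft has since restricted these attributes. Below is a hedged sketch against the v1.0 REST endpoint; the endpoint, key, and local filename are placeholders.

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

# Hypothetical local filename for the scanned print.
with open("4.2002.5825.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

# One entry per detected face, mirroring the Age/Gender pairs listed above.
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")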

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely
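
The Google Vision blocks above report per-face likelihood categories (surprise, anger, sorrow, joy, headwear, blur) rather than numeric scores. Below is a minimal sketch with the google-cloud-vision client library (v2 or later); the local filename is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local filename for the scanned print.
with open("4.2002.5825.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values map to the labels shown above ("Very unlikely", "Likely", ...).
likelihood_name = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print("Surprise", likelihood_name[face.surprise_likelihood])
    print("Anger", likelihood_name[face.anger_likelihood])
    print("Sorrow", likelihood_name[face.sorrow_likelihood])
    print("Joy", likelihood_name[face.joy_likelihood])
    print("Headwear", likelihood_name[face.headwear_likelihood])
    print("Blurred", likelihood_name[face.blurred_likelihood])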

Feature analysis

Amazon

Person 99.4%
Helmet 97.3%

Categories

Imagga

paintings art 95.9%
interior objects 2.7%

Text analysis

Amazon

dfpde