Human Generated Data

Title

Untitled (four women in matching flowered dresses seated in yard)

Date

c. 1945, printed later

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6783

Machine Generated Data

Tags (label and confidence score, 0–100)

Amazon
created on 2019-11-16

Advertisement 99.9
Collage 99.9
Human 99.1
Person 99.1
Person 99
Person 97.4
Face 97.4
Person 97.1
Head 95.5
Person 93.3
Person 85.3
Smile 84.9
People 80
Apparel 78
Clothing 78
Outdoors 75.6
Person 72.7
Photography 72.3
Photo 72.3
Person 70.9
Portrait 69.6
Crowd 66.5
Nature 63.6
Urban 61.5
Person 61.4
Female 61.2
Person 60.8
Girl 58.8
Man 58.2
Laughing 57.3
Poster 57
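
The Amazon tags above are the kind of output AWS Rekognition's label detection returns. A minimal sketch in Python with boto3 follows; the file name and confidence threshold are assumptions for illustration, not part of the museum record.

# Hedged sketch: label detection with AWS Rekognition (boto3).
import boto3

def detect_labels(image_path, min_confidence=50.0):
    """Return (label, confidence) pairs for a local image file."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    # "photo.jpg" is a placeholder file name.
    for name, confidence in detect_labels("photo.jpg"):
        print(f"{name} {confidence:.1f}")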

Clarifai
created on 2019-11-16

people 99.9
adult 98.8
group 98.7
man 97.7
woman 97.5
wear 95.7
portrait 95.4
two 95.3
facial expression 91.8
many 91.5
actress 91.3
child 91
leader 89.7
three 88.7
offspring 87.8
movie 85.7
group together 85.6
four 83.2
music 81.6
outfit 81.6
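
The Clarifai concepts above come from its general prediction model. A hedged sketch against the Clarifai v2 REST API follows; the API key, model ID, and image URL are placeholders, and the exact model the museum pipeline used is not stated in the record.

# Hedged sketch: concept prediction via the Clarifai v2 REST API.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"            # placeholder
MODEL_ID = "general-image-recognition"       # assumed public general model ID
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))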

Imagga
created on 2019-11-16

billboard 34.2
signboard 27.7
structure 24
statue 22.4
background 21.4
old 19.5
screen 19.1
art 18.5
man 18.1
sculpture 15.6
book jacket 15.1
black 15
ancient 14.7
person 14.7
male 14.2
vintage 14.1
people 13.9
jacket 12.7
antique 12.1
display 12.1
web site 11.8
religion 11.7
portrait 11.6
symbol 10.8
marble 10.8
history 10.7
film 10.3
monument 10.3
negative 9.9
stone 9.4
cemetery 9.4
dirty 9
sky 8.9
wrapping 8.9
soldier 8.8
electronic device 8.6
face 8.5
world 8.4
mask 7.8
adult 7.8
death 7.7
grunge 7.7
god 7.7
memorial 7.6
decoration 7.5
historical 7.5
dark 7.5
religious 7.5
travel 7
architecture 7
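
The Imagga tags above correspond to its image-tagging endpoint. A hedged sketch follows; the key, secret, and image URL are placeholders.

# Hedged sketch: tagging via the Imagga v2 REST API.
import requests

API_KEY = "YOUR_IMAGGA_KEY"                  # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"            # placeholder
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
for item in response.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))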

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 96.1
clothing 93.6
person 93.6
smile 91.9
human face 90.3
man 85.4
woman 81.1
gallery 74.1
black and white 68
wedding dress 65.5
old 62.9
posing 56.8
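
The Microsoft tags above resemble the output of Azure's Computer Vision tagging operation. A hedged sketch with the Python SDK follows; the endpoint, key, and image URL are placeholders, and the exact Microsoft service version behind this record is not stated.

# Hedged sketch: image tagging with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                             # placeholder
IMAGE_URL = "https://example.org/photo.jpg"                        # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))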

Color Analysis

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 99.4%
Calm 2.7%
Sad 0.1%
Confused 0.1%
Happy 96.8%
Angry 0.1%
Disgusted 0.1%
Fear 0%
Surprised 0.1%

AWS Rekognition

Age 21-33
Gender Female, 92.3%
Disgusted 0.1%
Surprised 0.1%
Happy 99.2%
Confused 0.1%
Sad 0.1%
Fear 0%
Angry 0.1%
Calm 0.2%

AWS Rekognition

Age 22-34
Gender Female, 75.2%
Sad 0.1%
Disgusted 0.3%
Happy 97.7%
Confused 0.3%
Fear 0.1%
Angry 0.3%
Calm 1.1%
Surprised 0.2%

AWS Rekognition

Age 22-34
Gender Female, 98.5%
Confused 1.7%
Surprised 0.8%
Disgusted 3.6%
Happy 85.1%
Calm 6.5%
Sad 0.7%
Fear 0.3%
Angry 1.3%
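
The four AWS Rekognition blocks above are per-face results from face detection with full attributes (age range, gender, emotions). A minimal sketch with boto3 follows; the file name is a placeholder.

# Minimal sketch: face analysis with AWS Rekognition (detect_faces, all attributes).
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # placeholder file name
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")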

Microsoft Cognitive Services

Age 27
Gender Female

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 27
Gender Female
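
The Microsoft age and gender estimates above are the kind of attributes the Azure Face API can return. A hedged sketch follows; the endpoint, key, and image URL are placeholders, and this may not match the exact Microsoft service the museum pipeline called.

# Hedged sketch: age/gender attributes with the Azure Face SDK.
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_FACE_KEY"                                        # placeholder
IMAGE_URL = "https://example.org/photo.jpg"                        # placeholder

client = FaceClient(ENDPOINT, CognitiveServicesCredentials(KEY))
faces = client.face.detect_with_url(
    url=IMAGE_URL,
    return_face_attributes=["age", "gender"],
)
for face in faces:
    attrs = face.face_attributes
    print(f"Age {attrs.age:.0f}")
    print(f"Gender {attrs.gender}")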

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
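
The Google Vision blocks above are per-face likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) from face detection. A minimal sketch with the google-cloud-vision client follows; the file name is a placeholder.

# Minimal sketch: face detection likelihoods with Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)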

Feature analysis

Amazon

Person 99.1%
Poster 57%

Text analysis

Amazon

AL3AYS
WT13

Google

SAFETY FILM SAFETY
SAFETY
FILM
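
The detected strings above (including the film-edge markings picked up by Google) are text-detection output. A minimal sketch of AWS Rekognition text detection follows; the file name is a placeholder, and Google's equivalent would be the Vision API's text_detection method.

# Minimal sketch: text detection with AWS Rekognition.
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # placeholder file name
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")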