Human Generated Data

Title

Untitled (Duncombe Class of 1932)

Date

1932

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1119

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Text 98.7
Person 97.9
Human 97.9
Person 97.8
Person 97.7
Person 97.4
Person 97.3
Person 97.1
Person 97
Person 96
Person 95.5
Head 93.1
Face 90.8
Word 83.3
Glasses 72.9
Accessories 72.9
Accessory 72.9
Number 71
Symbol 71
Female 63.7
Alphabet 62.6
Photo Booth 62.1
Portrait 62.1
Photography 62.1
Photo 62.1
Handwriting 58.8

Imagga
created on 2022-01-23

drawing 43.1
representation 40.5
diagram 27.4
silhouette 27.3
cartoon 26.8
mug shot 25.2
black 24.8
design 20.8
photograph 19.6
art 19.3
icon 18.2
symbol 17.5
set 16.1
gymnasium 15.3
sketch 14.8
graphic 14.6
man 14.1
sign 13.6
creation 13.5
people 13.4
pick 13.4
male 12.8
love 12.6
athletic facility 12.2
facility 12.1
style 11.9
person 11.2
grunge 11.1
mustache 10.8
device 10.4
bodybuilder 10.2
color 10
human 9.8
haircut 9.7
logo 9.6
cute 9.3
head 9.2
artwork 9.2
shape 9.2
retro 9
horror 8.7
animal 8.5
element 8.3
valentine 8.2
game 8
holiday 7.9
smile 7.8
boy 7.8
face 7.8
scary 7.7
four 7.7
happy 7.5
frame 7.5
moon 7.4
business 7.3
decoration 7.2
night 7.1

Google
created on 2022-01-23

Forehead 98.5
Nose 98.4
Chin 96.8
Eyebrow 95.7
Outerwear 95.1
Hairstyle 95.1
Product 90.8
Jaw 88.4
Font 85.3
Social group 82.8
Beauty 75.3
Circle 73
Suit 73
History 70.3
Event 69.5
Team 67
Facial hair 66.2
Stock photography 64.9
Collection 63
Eyelash 61.4

Microsoft
created on 2022-01-23

text 99.9
book 99.5
human face 99.3
person 97
woman 95.4
drawing 83
smile 81.2
sketch 77.4
man 76.8
clothing 73.3
different 65.4
bunch 44.6
several 15.4

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 23-33
Gender Female, 100%
Happy 98.8%
Confused 0.4%
Surprised 0.2%
Calm 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%
Sad 0.1%

AWS Rekognition

Age 21-29
Gender Female, 100%
Calm 70.9%
Happy 16%
Fear 4.8%
Surprised 2.1%
Confused 2%
Disgusted 1.6%
Sad 1.4%
Angry 1.1%

AWS Rekognition

Age 16-24
Gender Female, 100%
Happy 98.9%
Surprised 0.3%
Confused 0.3%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%
Calm 0.1%
Sad 0%

AWS Rekognition

Age 19-27
Gender Male, 100%
Happy 92%
Calm 7%
Confused 0.3%
Surprised 0.2%
Angry 0.2%
Sad 0.2%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 19-27
Gender Female, 99.3%
Calm 98.4%
Confused 0.5%
Happy 0.5%
Sad 0.2%
Angry 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 21-29
Gender Female, 100%
Happy 99.5%
Calm 0.1%
Surprised 0.1%
Confused 0.1%
Angry 0.1%
Fear 0.1%
Sad 0%
Disgusted 0%

AWS Rekognition

Age 23-33
Gender Male, 93.6%
Calm 99.2%
Angry 0.2%
Sad 0.2%
Confused 0.1%
Happy 0.1%
Surprised 0.1%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 14-22
Gender Male, 99.8%
Calm 64.6%
Confused 23.3%
Sad 4.4%
Surprised 1.9%
Disgusted 1.6%
Fear 1.4%
Happy 1.4%
Angry 1.3%

AWS Rekognition

Age 16-22
Gender Male, 99.5%
Calm 97.5%
Happy 1.2%
Sad 0.4%
Confused 0.3%
Angry 0.3%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

Microsoft Cognitive Services

Age 28
Gender Male

Microsoft Cognitive Services

Age 29
Gender Female

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 36
Gender Female

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 28
Gender Male

Microsoft Cognitive Services

Age 45
Gender Female

Microsoft Cognitive Services

Age 23
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.9%
Glasses 72.9%

Captions

Microsoft

a group of people looking at a book 25.8%
a close up of a book 25.7%
a group of people standing next to a book 25.6%

Text analysis

Amazon

ANNAS
DUNCOMBE
CLASS1932

Google

DUNCOMBE CLASS 1932
CLASS
DUNCOMBE
1932