Human Generated Data

Title

Untitled (older man and woman seated with two young girls at sides)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12902

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.6
Human 99.6
Couch 98.5
Furniture 98.5
Person 98.3
Person 97.6
Accessories 97.4
Tie 97.4
Accessory 97.4
Clothing 95.2
Apparel 95.2
People 92
Overcoat 82.3
Coat 82.3
Suit 82.3
Family 80.7
Person 69.5
Portrait 63.5
Photo 63.5
Photography 63.5
Face 63.5
Shorts 62.5
Sitting 60.1
Finger 58.6
Kid 57.2
Child 57.2
Female 56.5
Tuxedo 55.9
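
These scores are consistent with Amazon Rekognition's DetectLabels operation, which returns label names with confidences on a 0-100 scale. A minimal sketch with boto3 is below; the file name and confidence threshold are illustrative assumptions, and AWS credentials are taken from the environment as usual.

```python
import boto3

# Sketch: image labeling with Amazon Rekognition DetectLabels.
# "photo.jpg" and MinConfidence=55 are illustrative assumptions.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))  # e.g. "Person 99.6"
```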

Clarifai
created on 2019-11-16

people 99.9
group 99.4
portrait 97.7
adult 97.6
man 97.1
two 96.9
wear 96.7
child 96.5
facial expression 95.1
outfit 95.1
actor 95
three 94.6
leader 94.5
music 94.4
group together 93.8
family 93.7
offspring 92.3
four 91.8
woman 91
administration 90.4
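
These concepts are consistent with Clarifai's general prediction model, which returns concept names with values in [0, 1], scaled here to percentages. A hedged sketch against Clarifai's v2 REST predict endpoint; the API key and image URL are placeholders, and the model ID is assumed to be the commonly published ID of the public "general" model.

```python
import requests

# Sketch: concept prediction with Clarifai's v2 REST API. The key and
# image URL are placeholders; the model ID is assumed to be the public
# "general" model.
GENERAL_MODEL = "aaa03c23b3724a16a56b629203edc62c"
resp = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL}/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))  # e.g. "people 99.9"
```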

Imagga
created on 2019-11-16

man 32.9
person 31.5
male 27.7
people 26.8
black 21.2
adult 18.4
world 17.5
dark 16.7
couple 15.7
portrait 15.5
human 15
love 14.2
suit 14
attractive 13.3
one 12.7
kin 12.4
lifestyle 12.3
performer 12.1
sexy 12
men 11.2
planner 10.9
clothing 10.8
silhouette 10.8
fashion 10.5
boy 10.4
style 10.4
model 10.1
romantic 9.8
body 9.6
happy 9.4
business 9.1
pose 9.1
comedian 9
posing 8.9
hair 8.7
guy 8.6
sitting 8.6
tie 8.5
youth 8.5
relax 8.4
room 8.2
group 8.1
office 8
family 8
night 8
smiling 8
brass 8
women 7.9
together 7.9
happiness 7.8
face 7.8
pretty 7.7
two 7.6
hand 7.6
passion 7.5
relationship 7.5
dress 7.2
home 7.2
mask 7.2
modern 7
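
Imagga's tags match the shape of its /v2/tags endpoint, which returns English tag names with confidence percentages. A minimal sketch using HTTP basic auth; the key/secret pair and image URL are placeholders.

```python
import requests

# Sketch: auto-tagging with Imagga's v2 REST API.
# Credentials and image URL are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))  # e.g. "man 32.9"
```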

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 99.5
clothing 99
human face 98.2
smile 97.8
wall 97.5
baby 94.2
person 89.8
indoor 86.8
toddler 85.2
woman 77.5
man 72.9
black and white 71.2
boy 59
posing 51.1
old 41.4
picture frame 7.4
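
Microsoft's tags match the Azure Computer Vision tag endpoint, which returns names with confidences in [0, 1], scaled here to percentages. A sketch against the v2.0 REST API that was current when these tags were created; the endpoint region and subscription key are placeholders.

```python
import requests

# Sketch: image tagging with Azure Computer Vision (v2.0 REST API).
# Endpoint and subscription key are placeholders.
endpoint = "https://YOUR_REGION.api.cognitive.microsoft.com"
with open("photo.jpg", "rb") as f:
    resp = requests.post(
        endpoint + "/vision/v2.0/tag",
        headers={
            "Ocp-Apim-Subscription-Key": "YOUR_KEY",
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))  # e.g. "text 99.5"
```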

Face analysis

AWS Rekognition

Age 1-7
Gender Female, 98.3%
Angry 3.3%
Surprised 1%
Disgusted 1%
Confused 1.7%
Fear 0.7%
Calm 65.4%
Happy 21.6%
Sad 5.3%

AWS Rekognition

Age 39-57
Gender Male, 98.4%
Surprised 0.2%
Calm 5%
Fear 0.1%
Disgusted 0.3%
Happy 93.7%
Angry 0.1%
Sad 0.3%
Confused 0.2%

AWS Rekognition

Age 33-49
Gender Female, 97.5%
Angry 1.2%
Sad 1.4%
Confused 1.6%
Surprised 1%
Happy 2.5%
Disgusted 0.5%
Calm 91.6%
Fear 0.2%

AWS Rekognition

Age 8-18
Gender Male, 65.7%
Fear 7.1%
Angry 1.7%
Calm 36.3%
Surprised 0.7%
Happy 1.3%
Confused 0.7%
Sad 52.1%
Disgusted 0.1%
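
The four AWS Rekognition entries above have the shape of DetectFaces output with Attributes=["ALL"]: an estimated age range, a gender call with confidence, and a confidence score per emotion for each detected face. A minimal boto3 sketch, with "photo.jpg" as an illustrative file name:

```python
import boto3

# Sketch: per-face age range, gender, and emotions with Amazon Rekognition
# DetectFaces. "photo.jpg" is an illustrative file name.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```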

Microsoft Cognitive Services

Age 56
Gender Male

Microsoft Cognitive Services

Age 46
Gender Female

Microsoft Cognitive Services

Age 9
Gender Male

Microsoft Cognitive Services

Age 4
Gender Female
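
The single age and gender estimates above match the Azure Face API's detect operation with returnFaceAttributes=age,gender as it behaved in 2019 (Microsoft has since retired these attributes). A sketch with placeholder endpoint, key, and image URL:

```python
import requests

# Sketch: age/gender estimation with the Azure Face API (as of 2019; these
# attributes were later retired). Endpoint, key, and URL are placeholders.
endpoint = "https://YOUR_REGION.api.cognitive.microsoft.com"
resp = requests.post(
    endpoint + "/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}, Gender {attrs['gender'].capitalize()}")
```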

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
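
The Google entries are likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) from the Vision API's face detection, one block per detected face. A sketch with the google-cloud-vision client library; "photo.jpg" is an illustrative file name and credentials come from the environment.

```python
from google.cloud import vision

# Sketch: face detection with the Google Cloud Vision client library.
# "photo.jpg" is illustrative; GOOGLE_APPLICATION_CREDENTIALS supplies auth.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each field is a Likelihood enum, e.g. Likelihood.VERY_UNLIKELY.
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, value.name)
```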

Feature analysis

Amazon

Person 99.6%
Tie 97.4%
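
The "Feature analysis" pair looks like the subset of DetectLabels labels that carry bounding-box Instances (Person and Tie are among the labels Rekognition localizes). A hedged sketch filtering the same kind of response shown earlier:

```python
import boto3

# Sketch: keep only DetectLabels labels that have localized bounding-box
# instances, which appears to be what "Feature analysis" lists.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})
for label in response["Labels"]:
    if label.get("Instances"):
        print(f"{label['Name']} {label['Confidence']:.1f}%")  # e.g. "Person 99.6%"
```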

Text analysis

Google

22
22
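
The detected text "22" is listed twice because Google Vision's text detection returns the full detected text as the first annotation, followed by each individual word. A sketch with the same client library; "photo.jpg" is an illustrative file name.

```python
from google.cloud import vision

# Sketch: OCR with the Google Cloud Vision client library.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    # The first annotation is the full text; the rest are individual words,
    # which is why "22" appears twice in the listing above.
    print(annotation.description)
```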