Human-Generated Data

Title

Untitled (studio portrait of family with two young children)

Date

1925-1945

People

Artist: Martin Schweig, American, 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10198

Machine-Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.5
Human 99.5
Person 99.5
Person 98.8
Tie 98.7
Accessories 98.7
Accessory 98.7
Suit 96.9
Coat 96.9
Clothing 96.9
Overcoat 96.9
Apparel 96.9
Person 96.2
Female 86.8
People 86.4
Girl 77.7
Face 71.9
Hair 71.1
Suit 67.1
Family 62.9
Sleeve 62.7
Person 61.3
Sailor Suit 60.8
Kid 58.4
Child 58.4
Woman 58.3
Dress 58.1
Officer 56.5
Military 56.5
Military Uniform 56.5
Teen 55.5
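
The labels above are typical output from Amazon Rekognition's label-detection API: a flat list of concept names, each with a confidence score. A minimal sketch of how such tags could be produced with boto3 follows; the client setup and the file name portrait.jpg are placeholder assumptions, not part of this record.

import boto3

# Minimal sketch: label detection with Amazon Rekognition via boto3.
# "portrait.jpg" is a placeholder file name, not part of this record.
rekognition = boto3.client("rekognition")

with open("portrait.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,
    )

# Each label carries a name and a confidence score, as in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')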

Clarifai
created on 2023-10-26

people 100
portrait 99.6
group 99.6
three 99.4
two 98.7
nostalgia 98.2
child 98
adult 97.8
retro 97.7
four 96.9
wear 96.4
sibling 96
facial expression 94.2
centennial 93.6
son 93.3
offspring 93.1
administration 92.1
outfit 91.7
actor 91.7
man 90.9

Imagga
created on 2022-01-22

man 41
businessman 38.9
business 35.3
smiling 34.8
male 34.5
people 34.1
happy 32.6
bow tie 32.5
suit 32.3
group 32.3
corporate 31
businesswoman 30
office 29.9
team 29.6
men 27.5
executive 26.6
professional 26.5
adult 25.4
teamwork 25.1
necktie 25
meeting 24.5
couple 24.4
businesspeople 22.8
kin 22.8
person 22.6
success 22.6
standing 21.8
smile 21.4
partnership 21.2
portrait 20.7
work 20.4
brother 20.3
sibling 20.1
happiness 18.8
together 18.4
women 18.2
colleagues 17.5
confident 17.3
job 16.8
attractive 16.8
successful 16.5
boss 16.3
businessmen 15.6
partner 15.5
cheerful 15.5
working 15
manager 14.9
family 14.2
two 13.6
30s 13.5
indoors 13.2
boy 13.1
lifestyle 13
garment 13
father 12.7
staff 12.7
leadership 12.5
building 12.2
camera 12
looking 12
partners 11.7
confidence 11.5
formal 11.5
workplace 11.4
tie 11.4
face 11.4
sitting 11.2
company 11.2
color 11.1
love 11.1
husband 10.9
coworkers 10.8
handsome 10.7
corporation 10.6
clothing 10.6
child 10.6
diversity 10.6
education 10.4
expression 10.2
parent 10.1
holding 9.9
mother 9.9
businesswomen 9.8
colleague 9.8
home 9.6
employee 9.6
dad 9.5
wife 9.5
buddy 9.5
smart 9.4
black 9.3
three 9.3
communication 9.2
worker 9.2
20s 9.2
indoor 9.1
married 8.6
career 8.5
guy 8.4
human 8.3
diverse 7.8
40s 7.8
elegant 7.7
pretty 7.7
talking 7.6
females 7.6
adults 7.6
friends 7.5
children 7.3

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 99.8
wall 99.6
clothing 98
human face 97.6
text 96.8
smile 96.5
posing 95.9
indoor 92.3
standing 82.1

Color analysis

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 100%
Calm 99.3%
Confused 0.2%
Surprised 0.2%
Happy 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%
Sad 0%

AWS Rekognition

Age 2-8
Gender Male, 100%
Happy 98%
Calm 0.7%
Confused 0.5%
Angry 0.2%
Surprised 0.2%
Sad 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 27-37
Gender Female, 100%
Calm 95.2%
Sad 1.5%
Surprised 0.8%
Confused 0.8%
Happy 0.6%
Disgusted 0.4%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 2-10
Gender Female, 100%
Calm 85.9%
Confused 5.1%
Surprised 2.9%
Happy 2.1%
Angry 1.7%
Sad 0.9%
Fear 0.8%
Disgusted 0.6%
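
The four AWS Rekognition entries above pair an estimated age range and gender with a confidence score for each emotion. A minimal sketch of how such per-face estimates could be obtained with boto3 follows; the file name is a placeholder assumption.

import boto3

# Minimal sketch: face analysis with Amazon Rekognition via boto3.
# Attributes=["ALL"] requests age range, gender, and emotion estimates
# like those listed above. "portrait.jpg" is a placeholder file name.
rekognition = boto3.client("rekognition")

with open("portrait.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    # Emotions come back as a list of {Type, Confidence} entries.
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f'Age {age["Low"]}-{age["High"]}, '
          f'Gender {gender["Value"]} {gender["Confidence"]:.0f}%, '
          f'{top["Type"].title()} {top["Confidence"]:.1f}%')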

Microsoft Cognitive Services

Age 37
Gender Male

Microsoft Cognitive Services

Age 36
Gender Female

Microsoft Cognitive Services

Age 7
Gender Male

Microsoft Cognitive Services

Age 11
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
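
The Google Vision entries above report each attribute as a likelihood bucket (Very unlikely through Very likely) rather than a percentage. A minimal sketch using the google-cloud-vision client library follows; the file name is a placeholder assumption.

from google.cloud import vision

# Minimal sketch: face detection with the Google Cloud Vision client library.
# Likelihood enums correspond to the "Very unlikely" / "Very likely" labels
# shown above. "portrait.jpg" is a placeholder file name.
client = vision.ImageAnnotatorClient()

with open("portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Joy:", vision.Likelihood(face.joy_likelihood).name)
    print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
    print("Anger:", vision.Likelihood(face.anger_likelihood).name)
    print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
    print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)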

Feature analysis

Amazon

Person 99.5%
Tie 98.7%
Suit 96.9%

Categories

Imagga

people portraits 99.9%

Text analysis

Amazon

WITHIN
no
BNED
WITHIN the no
and
and should
should
PROPERTY
Apr
the

Google

NED wITHIN
NED
wITHIN
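
The tokens above are raw machine text (OCR) detections from the photograph itself rather than descriptive tags, which is why they are fragmentary. A minimal sketch of how such detections could be produced with Amazon Rekognition via boto3 follows; the file name is a placeholder assumption.

import boto3

# Minimal sketch: text detection (OCR) with Amazon Rekognition via boto3.
# "portrait.jpg" is a placeholder file name, not part of this record.
rekognition = boto3.client("rekognition")

with open("portrait.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Detections come back at both LINE and WORD granularity; faint or
# partial markings often yield short fragments like those listed above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          round(detection["Confidence"], 1))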