Human Generated Data

Title

Untitled (family with three young children and baby posed seated on living room chair)

Date

1949

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9300

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 98.7
Person 97.5
Person 97.4
Person 96.7
Clothing 93.9
Apparel 93.9
Person 92.6
Furniture 92
Chair 86
Face 84.7
Person 84.1
Indoors 82.2
Suit 80.8
Overcoat 80.8
Coat 80.8
People 76.3
Room 75.9
Table 73.2
Collage 72.3
Advertisement 72.3
Poster 72.3
Living Room 67.3
Portrait 64.4
Photography 64.4
Photo 64.4
Female 64.2
Kid 61.1
Child 61.1
Text 58.8
Crowd 57.2
Curtain 56.8
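
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition DetectLabels API. A minimal sketch follows, assuming the boto3 client and a placeholder file name (family-portrait.jpg) rather than the museum's actual pipeline or image file:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("family-portrait.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,       # illustrative values, not the settings used for the list above
    MinConfidence=55,
)

# Each label has a name and a confidence score from 0 to 100,
# matching entries such as "Person 99.6" and "Chair 86" above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```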

Clarifai
created on 2023-10-27

people 99.8
group 99.6
child 99
woman 95.9
education 95.3
man 94.3
adult 93.2
son 92.8
monochrome 92.4
group together 91.5
chair 91.3
sit 90.7
music 90.7
boy 90.4
family 90.3
leader 89.6
teacher 88.7
three 87.5
nostalgia 86.3
room 85.6

Imagga
created on 2022-01-23

person 36.3
man 34.3
male 31.9
people 30.7
adult 25.5
room 23.7
teacher 23.7
home 21.5
grandfather 19
sitting 18
professional 17.8
classroom 17.3
men 17.2
family 16.9
senior 16.9
smiling 16.6
education 16.4
student 15.8
chair 15.4
patient 15.4
group 15.3
musical instrument 15.2
lifestyle 15.2
happy 15
indoors 14.9
holding 14.8
table 14.7
work 14.2
kin 14
women 13.4
instrument 13.2
smile 12.8
child 12.6
happiness 12.5
class 12.5
medical 12.4
portrait 12.3
school 12.2
office 12.1
worker 11.8
teaching 11.7
businessman 11.5
couple 11.3
boy 11.3
mother 10.9
team 10.7
wind instrument 10.7
kid 10.6
interior 10.6
working 10.6
cheerful 10.6
clinic 10.5
human 10.5
old 10.4
business 10.3
mature 10.2
teamwork 10.2
nurse 10.1
board 9.9
equipment 9.9
handsome 9.8
job 9.7
together 9.6
studying 9.6
salon 9.4
youth 9.4
casual 9.3
brass 9.1
modern 9.1
health 9
hospital 8.8
medicine 8.8
couch 8.7
test 8.7
husband 8.6
two 8.5
doctor 8.5
music 8.3
indoor 8.2
lesson 7.8
father 7.8
educator 7.8
laboratory 7.7
executive 7.7
stringed instrument 7.7
exam 7.7
elderly 7.7
hand 7.6
desk 7.6
togetherness 7.5
meeting 7.5
manager 7.4
vintage 7.4
technology 7.4
style 7.4
coffee 7.4
care 7.4
phone 7.4
retro 7.4
occupation 7.3
confident 7.3
looking 7.2
to 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 95.6
person 90.4
clothing 85.3
smile 73.1
old 70.6
human face 61.9
christmas tree 52
posing 40.2

Color Analysis

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 99%
Happy 61.2%
Calm 28.8%
Surprised 7.4%
Sad 1.2%
Confused 0.7%
Disgusted 0.3%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 6-14
Gender Male, 98.9%
Happy 86.9%
Surprised 5.7%
Angry 2%
Calm 1.5%
Confused 1.3%
Disgusted 1.1%
Sad 1%
Fear 0.6%

AWS Rekognition

Age 27-37
Gender Male, 99.8%
Happy 79%
Surprised 14.5%
Calm 2.5%
Confused 1.7%
Disgusted 1.2%
Sad 0.7%
Fear 0.3%
Angry 0.2%

AWS Rekognition

Age 40-48
Gender Male, 99.9%
Surprised 54.9%
Happy 38.5%
Confused 3.5%
Disgusted 1.4%
Sad 0.7%
Angry 0.4%
Fear 0.3%
Calm 0.2%

AWS Rekognition

Age 23-33
Gender Male, 94.2%
Calm 39.2%
Fear 31.1%
Happy 19.6%
Surprised 4.5%
Angry 3.4%
Disgusted 1%
Sad 0.8%
Confused 0.4%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 61.4%
Surprised 27%
Angry 4.5%
Happy 2.1%
Fear 1.7%
Confused 1.3%
Disgusted 1.2%
Sad 0.8%
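
The age range, gender, and emotion percentages in the AWS Rekognition blocks above correspond to fields of the DetectFaces response. A minimal sketch, assuming boto3 and the same placeholder image file:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("family-portrait.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {'Low': 25, 'High': 35}
    gender = face["Gender"]     # e.g. {'Value': 'Male', 'Confidence': 99.0}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort by confidence to match the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```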

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
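
The Google Vision rows above report per-face likelihood ratings rather than percentages. A minimal sketch using the google-cloud-vision client library, again with a placeholder image file:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("family-portrait.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum: VERY_UNLIKELY, UNLIKELY, POSSIBLE,
    # LIKELY, or VERY_LIKELY, as in the "Very unlikely" rows above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```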

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

KODAR-SEA
14885
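
The strings above are machine-read text detections. A minimal sketch of the kind of call that produces them, assuming the AWS Rekognition DetectText API via boto3 and the same placeholder image file:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("family-portrait.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Keep line-level detections; each line is also returned word by word.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```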