Human Generated Data

Title

Untitled (three children at low table)

Date

c. 1960

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17233

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.4
Human 99.4
Person 98.9
Person 98.8
Person 98.3
Room 91.5
Indoors 91.5
Furniture 89
Clothing 88.9
Apparel 88.9
Table 82.6
Classroom 82.5
School 82.5
Chair 73.9
Portrait 62.1
Face 62.1
Photography 62.1
Photo 62.1
People 60.4
Shoe 57
Footwear 57
Clinic 56.8
Plant 56.3
Shorts 56
Living Room 56
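Tag lists like the Amazon set above are the kind of output produced by a label-detection API such as AWS Rekognition's DetectLabels. As a minimal sketch, the helper below formats a DetectLabels-style response dict into "Name confidence" lines; the sample response is hypothetical, not this photograph's actual API output:

```python
def format_labels(response, min_confidence=55.0):
    """Render a Rekognition-style DetectLabels response as 'Name confidence' lines.

    `response` mirrors the shape returned by boto3's
    rekognition.detect_labels(): {"Labels": [{"Name": ..., "Confidence": ...}]}.
    """
    lines = []
    for label in response.get("Labels", []):
        if label["Confidence"] >= min_confidence:
            # One decimal place, trailing .0 dropped, matching the lists above.
            lines.append(f"{label['Name']} {round(label['Confidence'], 1):g}")
    return lines

# Hypothetical sample response (illustrative values only).
sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.4},
        {"Name": "Table", "Confidence": 82.6},
        {"Name": "Plant", "Confidence": 40.0},  # below threshold, filtered out
    ]
}
print(format_labels(sample))  # → ['Person 99.4', 'Table 82.6']
```

In a real pipeline the response would come from `boto3.client("rekognition").detect_labels(Image=...)`; only the formatting step is shown here.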

Clarifai
created on 2023-10-29

people 99.9
child 99.3
group 98.2
three 97.4
group together 97.1
adult 97.1
two 95.4
four 94.8
furniture 94
family 93.2
monochrome 92.8
education 92.1
boy 92.1
man 91.8
room 91.7
woman 90.7
several 90
teacher 89
offspring 88.6
chair 87.5

Imagga
created on 2022-02-26

man 43.7
hospital 41
male 35.6
patient 35.2
person 31.7
nurse 29.2
people 27.3
adult 25.4
room 23.5
indoors 22.8
medical 22.1
senior 21.6
smiling 20.3
doctor 19.7
home 19.1
professional 19
men 18.9
health 18.1
sitting 16.3
case 16.1
clinic 15.9
mature 15.8
sick person 15.4
worker 15.2
happy 15
classroom 15
working 15
occupation 14.7
work 14.3
holding 14
couple 13.9
table 13.8
40s 13.6
women 13.4
surgeon 13.4
color 13.3
family 13.3
job 13.3
together 13.1
elderly 12.4
businessman 12.4
medicine 12.3
lifestyle 12.3
standing 12.2
business 12.1
kitchen 11.8
casual clothing 11.7
two people 11.7
portrait 11.6
care 11.5
talking 11.4
hairdresser 11.4
happiness 11
office 10.9
team 10.7
surgery 10.7
interior 10.6
cheerful 10.6
exam 10.5
illness 10.5
chair 9.9
teacher 9.7
food 9.7
adults 9.5
two 9.3
20s 9.2
modern 9.1
operation 8.9
50s 8.8
boy 8.7
mid adult 8.7
30s 8.7
retirement 8.6
day 8.6
uniform 8.6
smile 8.5
meeting 8.5
casual 8.5
old 8.4
teamwork 8.3
looking 8
to 8
child 7.9
education 7.8
discussion 7.8
colleagues 7.8
retired 7.8
angle 7.7
house 7.5
machine 7.4
equipment 7.3
group 7.3

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 96.1
text 94.4
sink 88.4
bathroom 77.3
table 77.2
black and white 73.3
clothing 68
furniture 66.7
human face 64.3

Color Analysis

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 99.6%
Calm 83%
Sad 9.7%
Confused 3.2%
Happy 2%
Disgusted 1%
Angry 0.4%
Fear 0.3%
Surprised 0.3%

AWS Rekognition

Age 20-28
Gender Male, 90.5%
Happy 55.3%
Calm 35.7%
Surprised 5.7%
Confused 0.9%
Angry 0.8%
Sad 0.6%
Fear 0.5%
Disgusted 0.4%

AWS Rekognition

Age 39-47
Gender Male, 53.2%
Calm 91.2%
Sad 6.4%
Happy 0.9%
Disgusted 0.5%
Angry 0.3%
Surprised 0.3%
Confused 0.2%
Fear 0.2%
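The per-face blocks above follow the shape of AWS Rekognition's DetectFaces output: an age range, a gender estimate with confidence, and a list of emotions sorted by confidence. A small sketch of turning one FaceDetail-style record into those summary lines (the sample record is hypothetical, not this photograph's actual data):

```python
def format_face(face_detail):
    """Render a Rekognition-style FaceDetail as summary lines.

    `face_detail` mirrors one entry of boto3's
    rekognition.detect_faces(Attributes=["ALL"]) response: an AgeRange,
    a Gender with confidence, and a list of Emotions with confidences.
    """
    lines = [
        f"Age {face_detail['AgeRange']['Low']}-{face_detail['AgeRange']['High']}",
        f"Gender {face_detail['Gender']['Value']}, "
        f"{face_detail['Gender']['Confidence']:g}%",
    ]
    # Emotions listed most-confident first, as in the blocks above.
    for emotion in sorted(face_detail["Emotions"], key=lambda e: -e["Confidence"]):
        lines.append(f"{emotion['Type'].capitalize()} {emotion['Confidence']:g}%")
    return lines

# Hypothetical FaceDetail record (illustrative values only).
face = {
    "AgeRange": {"Low": 47, "High": 53},
    "Gender": {"Value": "Male", "Confidence": 99.6},
    "Emotions": [
        {"Type": "SAD", "Confidence": 9.7},
        {"Type": "CALM", "Confidence": 83.0},
    ],
}
print(format_face(face))
# → ['Age 47-53', 'Gender Male, 99.6%', 'Calm 83%', 'Sad 9.7%']
```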

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.4%
Person 98.9%
Person 98.8%
Person 98.3%
Shoe 57%

Categories

Imagga

paintings art 96.8%
people portraits 2.2%

Text analysis

Amazon

KODAK-A--1TW

Google

MJI7--YT3A°2--AGO
MJI7--YT3A°2--AGO