Human Generated Data

Title

Untitled (man, woman, and baby by window)

Date

1962

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16500

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.6
Human 99.6
Person 99.6
Person 99.5
Clothing 99.1
Apparel 99.1
Face 78.6
Icing 77.5
Food 77.5
Dessert 77.5
Cake 77.5
Cream 77.5
Creme 77.5
Hat 73.9
Door 69.5
Home Decor 64.3
Bonnet 59.7
Plant 57.1
Porch 56

Clarifai
created on 2023-10-28

people 99.7
monochrome 98.4
family 98.3
group 97.9
woman 97.7
man 97.7
group together 94.5
indoors 94.4
adult 93.9
child 93.6
room 92
three 91.1
two 90.4
four 87
furniture 86.7
sit 86.5
education 83.3
window 80.5
couple 80.4
elderly 77.8

Imagga
created on 2022-02-11

man 41
male 31.2
people 30.1
person 29.1
adult 28.3
senior 23.4
business 21.9
office 21.3
businessman 21.2
professional 20.5
happy 20
room 19.5
indoors 19.3
elderly 18.2
looking 17.6
home 17.5
teacher 17.4
sitting 17.2
men 17.2
working 16.8
worker 16.1
computer 15.2
smiling 15.2
work 14.9
portrait 14.9
mature 14.9
executive 14.9
patient 14.8
couple 14.8
blackboard 14.4
old 13.9
team 13.4
meeting 13.2
casual 12.7
happiness 12.5
education 12.1
smile 12.1
teamwork 12
occupation 11.9
indoor 11.9
laptop 11.8
job 11.5
group 11.3
doctor 11.3
communication 10.9
monitor 10.8
specialist 10.8
medical 10.6
retirement 10.6
case 10.5
desk 10.4
technology 10.4
corporate 10.3
day 10.2
camera 10.2
lifestyle 10.1
back 10.1
suit 9.9
retired 9.7
educator 9.6
businesspeople 9.5
women 9.5
hospital 9.3
horizontal 9.2
alone 9.1
modern 9.1
holding 9.1
student 9.1
health 9
negative 9
black 9
handsome 8.9
family 8.9
world 8.8
screen 8.6
bright 8.6
career 8.5
adults 8.5
house 8.4
leisure 8.3
successful 8.2
care 8.2
one 8.2
classroom 8
to 8
nurse 8
interior 8
hair 7.9
together 7.9
gray hair 7.9
face 7.8
table 7.8
older 7.8
30s 7.7
film 7.7
husband 7.6
two 7.6
clothing 7.6
coat 7.5
surgeon 7.5
manager 7.4
sick person 7.4
clinic 7.4
equipment 7.4
design 7.3
success 7.2
school 7.2
love 7.1
window 7.1
medicine 7

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

window 98.9
person 97.6
clothing 96.3
man 94.4
human face 90
text 75.4
posing 38.3
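
The tag lists above are label/confidence pairs returned by each provider's image-labeling endpoint (the numbers are confidence scores on a roughly 0-100 scale). As a minimal, hedged sketch of how such a list is produced, assuming configured AWS credentials and a local copy of the photograph saved as image.jpg (a hypothetical file name), AWS Rekognition's detect_labels call returns data in essentially this shape:

import boto3

# Hypothetical local copy of the photograph.
with open("image.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},  # image sent inline rather than from S3
    MaxLabels=20,                  # cap the number of labels returned
    MinConfidence=55,              # drop low-confidence labels
)

# Print "Label confidence" pairs, matching the format of the Amazon list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")

The other providers (Clarifai, Imagga, Google Vision, Microsoft Azure) expose comparable tagging endpoints; only the AWS call is sketched here.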

Color Analysis

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 93.1%
Calm 98.2%
Surprised 1.5%
Sad 0.2%
Confused 0%
Angry 0%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 27-37
Gender Female, 83.6%
Surprised 61.8%
Calm 22.7%
Happy 5.2%
Angry 4.5%
Confused 2.6%
Fear 1.3%
Disgusted 1.1%
Sad 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
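
The per-face estimates above (age range, gender, and emotion percentages from AWS Rekognition; likelihood buckets from Google Vision) come from face-detection endpoints. A minimal sketch of the AWS side, assuming the same hypothetical image.jpg and configured credentials:

import boto3

with open("image.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion confidences are reported per emotion and need not sum to 100%.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")

Google Vision reports coarse likelihood buckets (Very unlikely through Very likely) rather than percentages, which is why its rows above read differently.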

Feature analysis

Amazon

Person
Person 99.6%
Person 99.6%
Person 99.5%

Text analysis

Amazon

5
2
3