Human Generated Data

Title

Untitled (two children on floor in room)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17701

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.8
Human 99.8
Person 98
Play 97.3
Apparel 83.8
Clothing 83.8
Boy 81.5
Floor 78.9
Kid 78.5
Child 78.5
Baby 73.9
Face 72.2
Shorts 70.4
Indoors 67.2
Portrait 64.9
Photography 64.9
Photo 64.9
Head 64.8
Pottery 58.4
Potted Plant 58.4
Jar 58.4
Vase 58.4
Plant 58.4
Door 57.4
Girl 56.5
Female 56.5
Room 55.4

Imagga
created on 2022-02-26

laptop 46.1
man 43.7
people 39.1
male 38.4
business 37.1
businesswoman 35.5
office 35.3
adult 35.2
person 32.6
happy 32
computer 31.6
meeting 30.2
businessman 30
group 28.2
work 26.7
corporate 26.6
table 25.4
professional 25.2
team 25.1
executive 24
smiling 23.9
home 23.1
room 22.9
teamwork 22.3
job 21.2
indoors 21.1
desk 21
couple 20.9
sitting 20.6
businesspeople 19.9
education 19
men 18.9
working 18.6
communication 18.5
smile 17.8
classroom 17.8
manager 17.7
looking 17.6
teacher 17.5
women 17.4
indoor 17.3
confident 17.3
portrait 16.8
notebook 16.7
conference 16.6
worker 16.1
together 15.8
modern 15.4
casual 15.3
lifestyle 15.2
director 14.9
technology 14.8
attractive 14.7
discussion 14.6
success 14.5
successful 13.7
face 13.5
interior 13.3
suit 12.6
happiness 12.5
workplace 12.4
cheerful 12.2
student 11.9
talking 11.4
senior 11.2
company 11.2
mature 11.2
entrepreneur 11
horizontal 10.9
world 10.7
colleagues 10.7
new 10.5
presentation 10.2
coffee 10.2
handsome 9.8
conversation 9.7
elderly 9.6
paper 9.6
career 9.5
clothing 9.4
contemporary 9.4
expression 9.4
two 9.3
call 9
life 9
seminar 8.8
educator 8.8
2 8.7
corporation 8.7
busy 8.7
boss 8.6
husband 8.6
college 8.5
chair 8.5
friends 8.5
hall 8.4
house 8.4
color 8.3
training 8.3
holding 8.3
family 8
debate 7.9
bright 7.9
good mood 7.8
standing 7.8
businessperson 7.8
grandfather 7.8
partner 7.7
mid adult 7.7
leader 7.7
pretty 7.7
30s 7.7
talk 7.7
old 7.7
learning 7.5
keyboard 7.5
one 7.5
study 7.5
blackboard 7.5
glasses 7.4
phone 7.4
20s 7.3
child 7.2
school 7.1
to 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

toddler 94.8
text 94.6
baby 84.5
child 82.7
clothing 81.2
person 80.4
boy 75.9
human face 75.4
black and white 70.7

Face analysis

Amazon

Google

AWS Rekognition

Age 28-38
Gender Male, 97.8%
Happy 61.8%
Surprised 32.7%
Calm 3.1%
Fear 0.9%
Sad 0.7%
Disgusted 0.4%
Angry 0.2%
Confused 0.2%

AWS Rekognition

Age 21-29
Gender Male, 99.9%
Calm 79.2%
Surprised 14.4%
Fear 3.9%
Happy 1.3%
Disgusted 0.6%
Sad 0.2%
Angry 0.2%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a girl posing for a photo 54.1%
a person sitting in a box 53.5%
a child posing for the camera 43.8%

Text analysis

Amazon

m
t
ODVR
VT27A2
or