Human Generated Data

Title

Untitled (girl on bed with doll)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17483

Machine Generated Data

Tags (label, confidence in %)

Amazon
created on 2022-02-26

Furniture 99.9
Bedroom 99
Room 99
Indoors 99
Chair 98.6
Person 95.7
Human 95.7
Bed 95.7
Clothing 89.8
Apparel 89.8
Female 88.3
Face 87.7
Blonde 86.3
Girl 86.3
Kid 86.3
Woman 86.3
Teen 86.3
Child 86.3
Pillow 83.2
Cushion 83.2
Person 77.6
Dog 74.2
Mammal 74.2
Animal 74.2
Canine 74.2
Pet 74.2
Table 71.9
Dorm Room 70.4
Portrait 67.2
Photography 67.2
Photo 67.2
Blackboard 58.9
Person 56.3
Housing 55.3
Building 55.3
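
The label-and-confidence pairs above have the shape of output from the AWS Rekognition DetectLabels API. The following is a minimal illustrative sketch, not the museum's actual pipeline: it assumes boto3 is installed with AWS credentials configured, and the file name girl_on_bed_with_doll.jpg is a hypothetical local copy of the photograph.

# Minimal sketch: produce label tags like those listed above with AWS Rekognition.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the image
with open("girl_on_bed_with_doll.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55.0,  # the weakest tags above sit around 55%
)

# Print "label confidence" pairs, e.g. "Furniture 99.9"
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")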

Clarifai
created on 2023-10-29

people 99.8
monochrome 98.1
two 97.8
child 96.1
woman 96
adult 95
room 94.7
group 93.9
actress 93.5
man 93.3
family 92.8
indoors 92
furniture 91.3
portrait 89.4
wedding 87.7
girl 86.5
three 85.9
music 84.3
mirror 84
bed 81.6

Imagga
created on 2022-02-26

man 32.2
people 31.8
person 28.5
adult 27.5
male 25.6
brass 24.2
wind instrument 22.9
room 21.3
indoors 21.1
home 20.7
sitting 20.6
happiness 20.4
business 20
office 18.9
couple 18.3
women 17.4
businessman 16.8
men 16.3
cheerful 16.2
smiling 15.9
musical instrument 15.7
table 15.7
happy 15.7
group 15.3
meeting 15.1
cornet 15
sax 14.9
smile 14.2
lifestyle 13.7
indoor 13.7
portrait 13.6
team 13.4
work 13.3
professional 12.2
love 11.8
world 11.8
worker 11.7
family 11.6
interior 11.5
together 11.4
businesspeople 11.4
executive 11.3
senior 11.2
laptop 11.1
holding 10.7
computer 10.4
looking 10.4
two 10.2
color 10
businesswoman 10
suit 9.9
modern 9.8
groom 9.8
job 9.7
fun 9.7
chair 9.7
couch 9.7
corporate 9.4
salon 9.4
confident 9.1
life 9
black 9
handsome 8.9
classroom 8.9
teacher 8.7
education 8.7
bride 8.6
desk 8.6
casual 8.5
communication 8.4
mature 8.4
house 8.4
teamwork 8.3
successful 8.2
alone 8.2
one 8.2
child 8.1
success 8
working 7.9
face 7.8
pretty 7.7
married 7.7
old 7.7
reading 7.6
adults 7.6
career 7.6
females 7.6
togetherness 7.5
music 7.3
playing 7.3
new 7.3
dress 7.2
sexy 7.2
paper 7.2

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 91
indoor 86.4
window 86.2
clothing 85.9
person 79.3
black and white 73.8

Color Analysis

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 94.4%
Calm 26.2%
Disgusted 18.7%
Confused 16.6%
Angry 15.7%
Surprised 9.9%
Sad 5.8%
Happy 4.3%
Fear 2.8%

AWS Rekognition

Age 28-38
Gender Female, 60.9%
Surprised 94.1%
Fear 5.3%
Happy 0.2%
Calm 0.2%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%
Sad 0%
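
The two blocks above match the structure of an AWS Rekognition DetectFaces response: one entry per detected face, each with an estimated age range, a gender estimate, and per-emotion confidences. A minimal illustrative sketch under the same assumptions as the earlier example (configured boto3 credentials; hypothetical file name):

# Minimal sketch: face attributes like the AWS Rekognition blocks above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("girl_on_bed_with_doll.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned unsorted; sort by confidence to match the listing above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")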

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
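
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal illustrative sketch using the google-cloud-vision client library (application credentials assumed; the file name is again hypothetical):

# Minimal sketch: likelihood buckets like the Google Vision block above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("girl_on_bed_with_doll.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY .. VERY_LIKELY)
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)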

Feature analysis

Amazon

Person 95.7%
Person 77.6%
Person 56.3%
Dog 74.2%

Categories