Human Generated Data

Title

Untitled (little boy sitting on chair)

Date

1962

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16495

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Furniture 100
Crib 98.2
Person 97.1
Human 97.1
Cradle 90.7
Clothing 87.1
Apparel 87.1
Hat 64.6
Portrait 64.3
Photography 64.3
Photo 64.3
Face 64.3
Smile 57.9
Chair 57.9
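
The label/confidence pairs above are typical output from Amazon Rekognition's DetectLabels API. The following is a minimal sketch of how such tags might be produced, not the museum's actual pipeline; the file name and region are placeholder assumptions.

# Sketch: label detection with Amazon Rekognition via boto3.
# Assumes configured AWS credentials; "photograph.jpg" is a placeholder file name.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

# Print tags in the same "Label Confidence" form used above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')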

Clarifai
created on 2023-10-28

people 99.6
child 98.8
one 98.6
girl 94.8
family 94.8
wear 94.7
lid 94.5
adult 93.6
dress 92.9
woman 91.4
baby 90.9
wedding 90
portrait 90
veil 89.6
chair 88.7
beautiful 87.8
bed 87.6
indoors 87
fun 85.9
monochrome 85.7

Imagga
created on 2022-02-11

home 24.8
baby bed 24.1
furniture 22.9
adult 22
people 21.7
shower cap 20.8
person 19.4
cap 17.1
indoors 16.7
clothing 16.3
room 15
man 14.8
interior 14.1
portrait 13.6
headdress 13.2
light 12.7
love 12.6
attractive 12.6
furnishing 12.6
house 12.5
lifestyle 12.3
bassinet 12.2
fashion 12
happy 11.9
dress 11.7
bride 11.5
male 11.4
sitting 11.2
elegance 10.9
human 10.5
luxury 10.3
harp 10.3
casual 10.2
smiling 10.1
wedding 10.1
relaxation 10
smile 10
student 10
face 9.9
negative 9.8
modern 9.8
black 9.6
hair 9.5
crib 9.5
happiness 9.4
working 8.8
domestic 8.8
work 8.7
bedroom 8.7
women 8.7
architecture 8.6
men 8.6
book 8.4
device 8.2
indoor 8.2
worker 8
looking 8
decoration 8
cute 7.9
film 7.8
child 7.8
old 7.7
statue 7.6
life 7.6
hand 7.6
holding 7.4
style 7.4
art 7.3
cheerful 7.3
design 7.3
holiday 7.2
glass 7.1
romantic 7.1
medical 7.1

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 94.4
human face 92
person 77.3
old 64.9
clothing 62.2
posing 41.9
net 18.1

Color Analysis

Face analysis

AWS Rekognition

Age 7-17
Gender Female, 80.7%
Happy 67.7%
Surprised 24.4%
Calm 3.6%
Fear 1.8%
Angry 0.8%
Disgusted 0.8%
Sad 0.7%
Confused 0.2%
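
The age range, gender, and emotion scores above match the shape of Amazon Rekognition's DetectFaces output. A minimal sketch under that assumption (placeholder file name and region):

# Sketch: face attribute estimation with Amazon Rekognition DetectFaces via boto3.
# Assumes configured AWS credentials; "photograph.jpg" is a placeholder file name.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')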

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.1%
Hat 64.6%

Categories

Imagga

paintings art 89.3%
interior objects 7.3%
food drinks 3.2%

Text analysis

Amazon

14
3
KODA-IW
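
The short fragments above resemble word-level detections from Amazon Rekognition's DetectText API. A minimal sketch under that assumption (placeholder file name and region):

# Sketch: OCR-style text detection with Amazon Rekognition DetectText via boto3.
# Assumes configured AWS credentials; "photograph.jpg" is a placeholder file name.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# WORD-level detections correspond to the short fragments listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])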

Google

14 MJIA- -YT A°2- -XAGON
14
MJIA-
-YT
A°2-
-XAGON