Human Generated Data

Title

Untitled (girl on rocking horse)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16446

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.4
Human 99.4
Clothing 96.3
Apparel 96.3
Furniture 90.3
Female 89.2
Blonde 86.6
Teen 86.6
Girl 86.6
Kid 86.6
Woman 86.6
Child 86.6
Shelf 85
Shorts 77.3
Face 65.5
Shop 64.8
Indoors 63.9
Cupboard 63.4
Closet 63.4
Living Room 63.1
Room 63.1
Shoe 60.6
Footwear 60.6
Play 58.5
Shoe 57.4
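
The Amazon tags above are object and scene labels from AWS Rekognition, each with a confidence score from 0 to 100. A minimal sketch of how such labels can be requested with boto3 follows; the bucket and object names are placeholders, not the museum's actual storage.

```python
import boto3

# Placeholder client and image location; any Rekognition-supported region works.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example-image.jpg"}},
    MaxLabels=50,
    MinConfidence=50.0,
)

# Each label carries a name and a 0-100 confidence, the same shape as the
# "Person 99.4", "Clothing 96.3" values listed above. Labels such as Person
# and Shoe may also carry per-instance bounding boxes in label["Instances"],
# which is likely where the repeated Shoe entries come from.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```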

Clarifai
created on 2023-10-28

people 99.9
monochrome 99.5
one 97.6
adult 96.9
wear 96.9
child 95.3
recreation 93.8
woman 93
two 91
portrait 90.7
music 85.1
sports equipment 85
retro 80
indoors 78.1
furniture 77.8
actress 77.3
outfit 76.8
facial expression 76.7
baseball 76.6
adolescent 76.2
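
The Clarifai tags above are concepts from a general image-recognition model, scored from 0 to 1 and shown here as percentages. A heavily hedged sketch of a v2 REST request follows; the API key, image URL, and model identifier are placeholders, and exact routing and auth depend on the Clarifai account and app setup.

```python
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"            # placeholder credential
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder image

# Assumed model slug for Clarifai's general recognition model; adjust to the
# model available in your app.
MODEL = "general-image-recognition"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

# Concepts come back with a name and a 0-1 value, e.g. "people 0.999".
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```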

Imagga
created on 2022-02-11

portrait 27.8
adult 25.9
person 25
sexy 24.9
fashion 23.4
people 22.9
shop 22.7
attractive 21
pretty 20.3
model 18.7
black 18
lady 17.8
refrigerator 17.3
dress 17.2
clothing 16.1
blond 15.7
face 15.6
elegance 15.1
house 15
happy 15
white goods 14.8
home 14.4
hair 14.3
brassiere 13.7
lifestyle 13.7
mercantile establishment 13.6
sensual 13.6
cute 13.6
window 13.2
home appliance 12.9
one 12.7
look 12.3
luxury 12
appliance 11.8
posing 11.5
newspaper 11.4
standing 11.3
elegant 11.1
sensuality 10.9
gorgeous 10.9
interior 10.6
indoors 10.5
style 10.4
device 10.2
skin 10.2
head 10.1
makeup 10.1
product 9.9
human 9.7
looking 9.6
body 9.6
women 9.5
consumer goods 9.4
woman's clothing 9.4
undergarment 9.4
man 9.4
smiling 9.4
smile 9.3
covering 9.3
garment 9.2
place of business 9.1
old 9.1
healthy 8.8
life 8.8
vogue 8.7
happiness 8.6
male 8.5
casual 8.5
building 8.4
film 8.4
furniture 8.3
indoor 8.2
make 8.2
shoe shop 8.1
lovely 8
bathroom 8
urban 7.9
brunette 7.8
eyes 7.7
modern 7.7
expression 7.7
grunge 7.7
negative 7.7
city 7.5
vintage 7.4
lips 7.4
retro 7.4
nice 7.3
alone 7.3
stylish 7.2
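
The Imagga tags above come from its tagging endpoint, which scores each tag from 0 to 100; the "paintings art" and "interior objects" categories further down likely come from Imagga's separate categorization endpoint. A minimal sketch against the v2 REST API follows; the key, secret, and image URL are placeholders.

```python
import requests

IMAGGA_KEY = "YOUR_API_KEY"                  # placeholder credential
IMAGGA_SECRET = "YOUR_API_SECRET"            # placeholder credential
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder image

# The tagging endpoint uses HTTP Basic auth with the key/secret pair.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each entry has an English tag name and a 0-100 confidence, matching the
# "portrait 27.8", "adult 25.9" style of values above.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```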

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 98.8
clothing 97.4
person 96
footwear 92.6
indoor 90.8
window 89.6
human face 78.2
open 72.2
baseball 63.2
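
The Microsoft tags above are image tags with 0-1 confidence values, shown here as percentages. A hedged sketch of an Azure Computer Vision v3.2 analyze request follows; the endpoint, subscription key, and image URL are placeholders for an Azure resource.

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder
IMAGE_URL = "https://example.org/photo.jpg"                     # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

# Tags come back with a name and a 0-1 confidence, e.g. "clothing 0.974".
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```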

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-33
Gender Male, 90.8%
Calm 60%
Happy 34.4%
Sad 1.7%
Disgusted 0.9%
Confused 0.9%
Surprised 0.8%
Angry 0.7%
Fear 0.6%

AWS Rekognition

Age 23-31
Gender Male, 76.5%
Calm 96.1%
Sad 2.3%
Happy 0.4%
Confused 0.4%
Angry 0.3%
Disgusted 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 19-27
Gender Female, 81.8%
Calm 66.4%
Happy 28.2%
Sad 2%
Angry 1%
Fear 0.8%
Disgusted 0.7%
Confused 0.6%
Surprised 0.4%

AWS Rekognition

Age 22-30
Gender Male, 68.4%
Calm 91.4%
Sad 3.1%
Happy 2.9%
Confused 0.8%
Disgusted 0.6%
Fear 0.4%
Angry 0.4%
Surprised 0.4%

AWS Rekognition

Age 21-29
Gender Male, 83.8%
Calm 91.6%
Sad 6.8%
Confused 0.5%
Happy 0.4%
Disgusted 0.2%
Angry 0.2%
Surprised 0.1%
Fear 0.1%
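
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with its confidence, and a set of emotion scores. A minimal boto3 sketch that returns output of this shape follows; the bucket and object names are placeholders.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example-image.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, and other attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are reported as confidences over a fixed label set
    # (CALM, HAPPY, SAD, ...), printed here from strongest to weakest.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```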

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
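
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the values above read "Very unlikely". A minimal sketch with the google-cloud-vision client (assuming version 2.x or later) follows; the image URI is a placeholder and credentials are assumed to be configured in the environment.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/photo.jpg")  # placeholder
)

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum value, e.g. VERY_UNLIKELY.
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```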

Feature analysis

Amazon

Person
Shoe
Person 99.4%
Shoe 60.6%
Shoe 57.4%

Categories

Imagga

paintings art 76.2%
interior objects 22.9%

Text analysis

Amazon

36
a
CH
KODA-ITW
X
OF
H
HR
g
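
The fragments above are raw OCR output. A minimal sketch of the AWS Rekognition text-detection call that produces this kind of listing follows; the bucket and object names are placeholders.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example-image.jpg"}},
)

# Rekognition returns both LINE and WORD detections; printing only the WORD
# entries gives a flat list of fragments like the one above.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")
```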

Google

MJI7-- YT37A°2- - XAGOX 36 2 2
MJI7--
YT37A°2-
-
XAGOX
36
2
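
Google's text detection returns one annotation for the full recognized string followed by the individual fragments, which matches the listing above. A minimal sketch with the google-cloud-vision client; the image URI is a placeholder and credentials are assumed to be configured in the environment.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/photo.jpg")  # placeholder
)

response = client.text_detection(image=image)
annotations = response.text_annotations
if annotations:
    print(annotations[0].description)  # full detected string
    for fragment in annotations[1:]:   # individual fragments
        print(fragment.description)
```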