Human Generated Data

Title

Untitled (boy holding fake gun)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16398

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Human 99.7
Person 99.7
Guitar 97.8
Musical Instrument 97.8
Leisure Activities 97.8
Clothing 94
Sleeve 94
Apparel 94
Animal 91.9
Bird 91.9
Home Decor 76
Long Sleeve 67.7
Face 67.3
Photo 67.1
Photography 67.1
Indoors 64.9
Portrait 64.2
Finger 63.6
Floor 62.5
Flooring 59.6
Text 57.2
Chair 56.2
Furniture 56.2
Pants 55.6

Imagga
created on 2022-02-11

person 27.6
people 27.3
portrait 25.9
fashion 25.6
home 23.1
attractive 22.4
adult 21.6
lady 20.3
indoors 20.2
sexy 20.1
pretty 19.6
model 19.4
dress 19
body 18.4
face 17.8
house 17.5
happy 16.9
cute 16.5
elegance 15.9
hair 15.8
human 15.7
clothing 14.5
posing 14.2
style 14.1
smiling 13.7
one 13.4
happiness 13.3
standing 13
vertical 12.2
blond 12
makeup 11.9
casual 11.9
sensual 11.8
joy 11.7
child 11.7
interior 11.5
art 11.4
looking 11.2
elegant 11.1
lifestyle 10.8
man 10.8
room 10.7
hairstyle 10.5
costume 10.4
alone 10
stylish 9.9
fitness 9.9
fun 9.7
apartment 9.6
moving 9.5
women 9.5
fashionable 9.5
love 9.5
healthy 9.4
expression 9.4
male 9.3
smile 9.3
exercise 9.1
make 9.1
box 9.1
active 9
look 8.8
vogue 8.7
jeans 8.6
studio 8.4
hand 8.4
health 8.3
girls 8.2
outfit 8.1
cheerful 8.1
work 7.8
party 7.7
film 7.7
luxury 7.7
modern 7.7
old 7.7
energy 7.6
mature 7.4
holding 7.4
20s 7.3
black 7.3
color 7.2
negative 7.2
fresh 7.2
romantic 7.1
lovely 7.1

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 97.4
man 95.7
clothing 94.4
person 91.6
black and white 87.7

Face analysis
AWS Rekognition

Age 45-53
Gender Female, 96.5%
Calm 54.5%
Sad 27.3%
Happy 11.3%
Disgusted 2.7%
Confused 1.2%
Fear 1%
Angry 1%
Surprised 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Guitar 97.8%
Bird 91.9%

Captions

Microsoft

a man standing in a room 76.1%
a man that is standing in a room 72.9%
a young man standing in a room 55.6%

Text analysis

Amazon

3

Google

MJI7--YT 37A°2 -- XAGOX
MJI7--YT
--
37A°2
XAGOX